WorldWideScience

Sample records for luminex analytical platform

  1. Evaluation of LABType® SSO HLA Typing using the Luminex Platform: Cord Blood Registry Typing for the Korean Population.

    Science.gov (United States)

    Roh, Eun-Youn; Song, Eun-Young; Chang, Jee-Young; Yoon, Jong-Hyun; Shin, Sue

    2016-08-01

    The performance of a new intermediate-resolution method using a PCR-Luminex platform and LABType® SSO A, B, DRB1 kits as an HLA typing method for the cord blood (CB) registry of the Korean population was investigated. A total of 1,413 cord blood units (CBUs) were enrolled - 1,382 from Koreans and 31 from non-Koreans or mixed-ancestry individuals. HLA-A, -B, and -DRB1 typing was performed using the LABType® SSO typing kits. HLA typing with a DNA-based method and 2-digit results are mandatory for the public CB bank in Korea according to the "CB Act." The proportions of ambiguous results in the 2-digit assignment were 14.6% (206/1,413) among all subjects and 14.8% (205/1,382) among the Korean donors. At the 2-digit resolution, 3 HLA-A types (69 CBUs), 31 HLA-B types (124 CBUs), and 3 HLA-DRB1 types (13 CBUs) showed ambiguous results. The 'most probable type' could be assigned to the ambiguous results based on the reported Korean HLA allele frequencies. The most probable results were 100% consistent with the types confirmed by the HD kits (DRB1) and additional PCR-SBT or PCR-SSP tests (A and B). Luminex technology is more automated and less labor intensive than the conventional SSO typing method, and its results are less affected by differences between inspectors. Although it is not satisfactory as a sole confirmatory test and cannot replace the PCR-SBT test, the combination of Luminex technology with LABType® SSO kits and population frequency data provides a suitable typing platform that can be used as a qualifying test for CB registries.
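
    A minimal illustration of the 'most probable type' assignment described above, in Python: given an ambiguous set of candidate 2-digit types, pick the one with the highest reported population frequency. This is not the authors' code; the allele names and frequency values are placeholders.

      # Illustrative sketch (not the authors' code): resolve an ambiguous 2-digit
      # HLA call by choosing the candidate with the highest population frequency.
      KOREAN_FREQ = {"A*02": 0.28, "A*24": 0.21, "A*33": 0.15}  # placeholder frequencies

      def most_probable_type(candidates, freq=KOREAN_FREQ):
          """Return the candidate 2-digit type with the highest reported frequency."""
          return max(candidates, key=lambda allele: freq.get(allele, 0.0))

      # An ambiguous SSO result that could be A*02 or A*33 resolves to A*02.
      print(most_probable_type(["A*02", "A*33"]))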

  2. The impact of pre-analytical variables on the stability of neurofilament proteins in CSF, determined by a novel validated SinglePlex Luminex assay and ELISA.

    Science.gov (United States)

    Koel-Simmelink, Marleen J A; Vennegoor, Anke; Killestein, Joep; Blankenstein, Marinus A; Norgren, Niklas; Korth, Carsten; Teunissen, Charlotte E

    2014-01-15

    Neurofilament (Nf) proteins have been shown to be promising biomarkers for monitoring and predicting disease progression in various neurological diseases. The aim of this study was to evaluate the effects of pre-analytical variables on the concentrations of neurofilament heavy (NfH) and neurofilament light (NfL) proteins. For NfH, a newly developed and validated in-house SinglePlex Luminex assay was used; ELISA was used to analyze NfL. For the NfL ELISA, the intra- and inter-assay variations were 1.5% and 16.7%, respectively. The analytical performance of the NfH SinglePlex Luminex assay was good in terms of sensitivity (6.6 pg/mL), recovery in cerebrospinal fluid (CSF) (between 90 and 104%), linearity (from 6.6 to 1250 pg/mL), and inter- and intra-assay variation (<8%). Concentrations of both NfL and NfH were not negatively affected by blood contamination, repeated freeze-thaw cycles (up to 4), delayed processing (up to 24 hours), or long-term storage at -20°C, 4°C, and room temperature. A decrease in the concentration of both neurofilament proteins was observed during storage for up to 21 days at 37°C, which became significant by day 5. The newly developed NfH SinglePlex Luminex assay has good sensitivity and is robust. Moreover, both NfH and NfL are stable under the most prevalent pre-analytical variations. Copyright © 2013 Elsevier B.V. All rights reserved.
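
    For readers unfamiliar with the intra-/inter-assay variation figures quoted above, the sketch below shows the usual %CV arithmetic (standard deviation of replicate measurements divided by their mean). The replicate values are invented for illustration and are not the study's data.

      # %CV sketch: coefficient of variation of replicate measurements (made-up values).
      from statistics import mean, stdev

      def percent_cv(replicates):
          """CV = standard deviation / mean * 100."""
          return stdev(replicates) / mean(replicates) * 100.0

      intra_run = [101.0, 99.5, 100.8]        # same sample, same run (pg/mL)
      inter_run = [98.0, 112.0, 95.0, 104.0]  # same sample, different runs (pg/mL)
      print(f"intra-assay CV: {percent_cv(intra_run):.1f}%")
      print(f"inter-assay CV: {percent_cv(inter_run):.1f}%")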

  3. A survey on platforms for big data analytics.

    Science.gov (United States)

    Singh, Dilpreet; Reddy, Chandan K

    The primary purpose of this paper is to provide an in-depth analysis of the different platforms available for performing big data analytics. The paper surveys the hardware platforms available for big data analytics and assesses the advantages and drawbacks of each based on metrics such as scalability, data I/O rate, fault tolerance, real-time processing, supported data size, and iterative task support. In addition to the hardware, the software frameworks used within each of these platforms are described in detail along with their strengths and drawbacks. The critical characteristics described here can help readers make an informed decision about the right choice of platform for their computational needs. Using a star-ratings table, a rigorous qualitative comparison between the platforms is also presented for each of the six characteristics that are critical for big data analytics algorithms. To provide more insight into the effectiveness of each platform in the context of big data analytics, implementation-level details of the widely used k-means clustering algorithm on various platforms are also described in the form of pseudocode.
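
    The survey presents k-means pseudocode for each platform; as a platform-neutral reference point, a plain-Python rendition of the standard Lloyd's algorithm (assignment step, then centroid update) is sketched below. It is illustrative only and not taken from the paper.

      # Plain-Python Lloyd's algorithm for k-means (illustrative, platform-neutral).
      import random

      def kmeans(points, k, iterations=50):
          centroids = random.sample(points, k)          # initial centroids drawn from the data
          for _ in range(iterations):
              clusters = [[] for _ in range(k)]
              for p in points:                          # assignment step
                  nearest = min(range(k),
                                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])))
                  clusters[nearest].append(p)
              for i, cluster in enumerate(clusters):    # update step
                  if cluster:
                      centroids[i] = tuple(sum(dim) / len(cluster) for dim in zip(*cluster))
          return centroids

      print(kmeans([(1.0, 1.0), (1.2, 0.8), (8.0, 8.0), (7.5, 8.2)], k=2))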

  4. Luminex® xMAP® technology is an effective strategy for high-definition human leukocyte antigen typing of cord blood units prior to listing.

    Science.gov (United States)

    Guarene, Marco; Badulli, Carla; Cremaschi, Anna L; Sbarsi, Ilaria; Cacciatore, Rosalia; Tinelli, Carmine; Pasi, Annamaria; Bergamaschi, Paola; Perotti, Cesare G

    2018-05-01

    Allele-level donor-recipient matching at the HLA-A, HLA-B, HLA-C and HLA-DRB1 loci affects the outcome after cord blood transplantation for hematologic malignancies and modifies the strategy of donor selection. High-definition typing of both class I and class II HLA loci at the time of listing is a way to improve the attractiveness of cord blood bank inventories, reducing the time for donor search and procurement and simplifying donor choice, in particular for patients of non-European heritage. In 2014, Luminex® xMAP® technology was introduced in our laboratory practice and applied to cord blood unit typing. In this study, we evaluated the impact of this strategy in comparison with the platform in use until 2013, which relied on LiPA reverse polymerase chain reaction-sequence-specific oligonucleotide (revPCR-SSO) typing plus polymerase chain reaction-sequence-specific primer (PCR-SSP) typing. In 2014, the time for testing was shorter (141 vs 181 days on average), the number of test repetitions was lower (in particular for the HLA-A locus, p = 0.026), and the cost was lower (240.7 vs 395.6 euros per unit on average) compared to 2013, demonstrating that Luminex xMAP technology is superior to the previous approach. The Luminex xMAP platform has useful application in cord blood banking programs to achieve high-definition HLA typing of cord blood units at the time of banking in a quick, accurate, and cost-effective manner.

  5. ATLAS Analytics and Machine Learning Platforms

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Legger, Federica; Gardner, Robert

    2018-01-01

    In 2015 ATLAS Distributed Computing started to migrate its monitoring systems away from Oracle DB and decided to adopt new big data platforms that are open source, horizontally scalable, and offer the flexibility of NoSQL systems. Three years later, the full software stack is in place, the system is considered in production and operating at near maximum capacity (in terms of storage capacity and tightly coupled analysis capability). The new model provides several tools for fast and easy-to-deploy monitoring and accounting. The main advantages are: ample ways to do complex analytics studies (using technologies such as Java, Pig, Spark, Python, Jupyter), flexibility in reorganizing data flows, and near-real-time and inline processing. The analytics studies improve our understanding of different computing systems and their interplay, thus enabling whole-system debugging and optimization. In addition, the platform provides services to alarm or warn on anomalous conditions, and several services closing feedback l...

  6. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists face the challenge of gaining profound insight into the deepest biological functions from big biological data, which in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are needed, together with efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of how they have been mapped onto various computing platforms. After that, we present a case study comparing the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.
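
    As a concrete reminder of the alignment case study mentioned above, the sketch below computes a global alignment score with the classic dynamic-programming recurrence (Needleman-Wunsch style). The scoring values are arbitrary and the code is not from the paper.

      # Global alignment score via dynamic programming (Needleman-Wunsch style);
      # match/mismatch/gap scores are arbitrary illustrations.
      def align_score(a, b, match=1, mismatch=-1, gap=-2):
          rows, cols = len(a) + 1, len(b) + 1
          dp = [[0] * cols for _ in range(rows)]
          for i in range(1, rows):
              dp[i][0] = i * gap                        # gaps along the first column
          for j in range(1, cols):
              dp[0][j] = j * gap                        # gaps along the first row
          for i in range(1, rows):
              for j in range(1, cols):
                  diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
          return dp[-1][-1]

      print(align_score("GATTACA", "GCATGCT"))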

  7. Simultaneous Detection of Five Pathogens from Cerebrospinal Fluid Specimens Using Luminex Technology

    Directory of Open Access Journals (Sweden)

    Linfu Zhou

    2016-02-01

    Early diagnosis and treatment are crucial for the outcome of central nervous system (CNS) infections. In this study, we developed a multiplex PCR-Luminex assay for the simultaneous detection of five major pathogens that frequently cause CNS infections: Mycobacterium tuberculosis, Cryptococcus neoformans, Streptococcus pneumoniae, and herpes simplex virus types 1 and 2. Through the hybridization reaction between multiplex PCR-amplified targets and oligonucleotide "anti-TAG" sequences, we found that the PCR-Luminex assay could detect as few as 10(1)-10(2) copies of synthetic pathogen DNAs. Furthermore, 163 cerebrospinal fluid (CSF) specimens from patients with suspected CNS infections were used to evaluate the efficiency of this multiplex PCR-Luminex method. Compared with the Ziehl-Neelsen stain, the assay showed high diagnostic accuracy for tuberculosis meningitis (sensitivity 90.7%, specificity 99.1%). For cryptococcal meningitis, the sensitivity and specificity were 92% and 97.1%, respectively, compared with the May-Grünwald-Giemsa (MGG) stain. For herpes simplex virus type 1 and 2 encephalitis, the sensitivities were 80.8% and 100%, and the specificities were 94.2% and 99%, respectively, compared with enzyme-linked immunosorbent assay (ELISA). Taken together, this multiplex PCR-Luminex assay showed potential for the simultaneous detection of five pathogens and may be a promising supplement to conventional methods for diagnosing CNS infections.
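
    The diagnostic-accuracy figures above follow the standard contingency-table arithmetic; a small sketch of that calculation is given below. The counts used are invented for illustration and do not reproduce the study's data.

      # Sensitivity/specificity from a 2x2 contingency table (invented counts).
      def sensitivity_specificity(tp, fn, tn, fp):
          sensitivity = tp / (tp + fn) * 100.0          # true positives among diseased
          specificity = tn / (tn + fp) * 100.0          # true negatives among non-diseased
          return sensitivity, specificity

      sens, spec = sensitivity_specificity(tp=39, fn=4, tn=109, fp=1)
      print(f"sensitivity {sens:.1f}%, specificity {spec:.1f}%")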

  8. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS Distributed Computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide-area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  9. A platform for real-time online health analytics during spaceflight

    Science.gov (United States)

    McGregor, Carolyn

    Monitoring the health and wellbeing of astronauts during spaceflight is an important aspect of any manned mission. To date, monitoring has been based on a sequential set of discontinuous samplings of physiological data to support initial studies on aspects such as weightlessness and its impact on the cardiovascular system, and to perform proactive monitoring of health status. Both the research performed and real-time monitoring have been hampered by the lack of a platform enabling a more continuous approach to real-time monitoring. While any spaceflight is monitored heavily by Mission Control, an important requirement within the context of any spaceflight setting, and in particular where there are extended periods without communication with Mission Control, is the ability for the mission to operate in an autonomous manner. This paper presents a platform to enable real-time astronaut monitoring for prognostics and health management within space medicine using online health analytics. The platform is based on extending previous online health analytics research known as the Artemis and Artemis Cloud platforms, which have demonstrated their relevance for multi-patient, multi-diagnosis and multi-stream temporal analysis in real-time for clinical management and research within neonatal intensive care. Artemis and Artemis Cloud source data from a range of medical devices capable of transmitting their signals via wired or wireless connectivity and hence are well suited to processing real-time data acquired from astronauts. A key benefit of this platform is its ability to monitor astronauts' health and wellbeing onboard the mission as well as enabling the astronaut's physiological data, and other clinical data, to be sent to the platform components at Mission Control whenever that communication is available. As a result, researchers at Mission Control would be able to simulate, deploy and tailor predictive analytics and diagnostics during the same spaceflight for

  10. Analytical laboratory and mobile sampling platform

    International Nuclear Information System (INIS)

    Stetzenbach, K.; Smiecinski, A.

    1996-01-01

    This is the final report for the Analytical Laboratory and Mobile Sampling Platform project. This report contains only major findings and conclusions resulting from this project. Detailed reports of all activities performed for this project were provided to the Project Office every quarter since the beginning of the project. This report contains water chemistry data for samples collected in the Nevada section of Death Valley National Park (Triangle Area Springs), Nevada Test Site springs, Pahranagat Valley springs, Nevada Test Site wells, Spring Mountain springs and Crater Flat and Amargosa Valley wells

  11. Conversion of a Capture ELISA to a Luminex xMAP Assay using a Multiplex Antibody Screening Method

    Science.gov (United States)

    Baker, Harold N.; Murphy, Robin; Lopez, Erica; Garcia, Carlos

    2012-01-01

    The enzyme-linked immunosorbent assay (ELISA) has long been the primary tool for detection of analytes of interest in biological samples for both life science research and clinical diagnostics. However, ELISA has limitations. It is typically performed in a 96-well microplate, and the wells are coated with capture antibody, requiring a relatively large amount of sample to capture an antigen of interest. The large surface area of the wells and the hydrophobic binding of capture antibody can also lead to non-specific binding and increased background. Additionally, most ELISAs rely upon enzyme-mediated amplification of signal in order to achieve reasonable sensitivity. Such amplification is not always linear and can thus skew results. In the past 15 years, a new technology has emerged that offers the benefits of the ELISA, but also enables higher throughput, increased flexibility, reduced sample volume, and lower cost, with a similar workflow [1,2]. Luminex xMAP Technology is a microsphere (bead) array platform enabling both monoplex and multiplex assays that can be applied to both protein and nucleic acid applications [3-5]. The beads have the capture antibody covalently immobilized on a smaller surface area, requiring less capture antibody and smaller sample volumes compared to ELISA, and non-specific binding is significantly reduced. Smaller sample volumes are important when working with limiting samples such as cerebrospinal fluid, synovial fluid, etc. [6]. Multiplexing the assay further reduces sample volume requirements, enabling multiple results from a single sample. Recent improvements by Luminex include: the new MAGPIX system, a smaller, less expensive, easier-to-use analyzer; Low-Concentration Magnetic MagPlex Microspheres, which eliminate the need for expensive filter plates and come in a working concentration better suited for assay development and low-throughput applications; and the xMAP Antibody Coupling (AbC) Kit, which includes a protocol, reagents, and

  12. Case study: IBM Watson Analytics cloud platform as Analytics-as-a-Service system for heart failure early detection

    OpenAIRE

    Guidi, Gabriele; Miniati, Roberto; Mazzola, Matteo; Iadanza, Ernesto

    2016-01-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detect...

  13. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce the patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, true signal-to-noise properties are mainly determined by image quality in the projections. We are developing an analytical simulation platform describing projections in order to investigate how the quantum-interaction physics in each component of a CT system affects image quality in the projections. This simulator will be very useful for economical design improvement or optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of development of the simulation platform with an emphasis on the theoretical framework describing the generation of projection data. We have prepared the analytical simulation platform describing projections in computed tomography systems. The remaining work before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated by Monte Carlo simulations; we will build up energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method in this study is simple and can be used conveniently in a lab environment
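
    As a toy illustration of the kind of projection model such a simulator is built on, the sketch below computes parallel-beam line integrals through an attenuation map and applies Beer-Lambert attenuation with Poisson (quantum) noise. The geometry, pixel size and photon fluence are assumptions, not the authors' cascaded signal-transfer model.

      # Toy parallel-beam projection: line integrals through an attenuation map,
      # Beer-Lambert attenuation, Poisson noise. Parameters are assumptions.
      import numpy as np

      def simulate_projection(mu_map, pixel_size_cm=0.1, photons_in=1e5, seed=0):
          rng = np.random.default_rng(seed)
          line_integrals = mu_map.sum(axis=0) * pixel_size_cm     # sum of mu along each ray
          expected_counts = photons_in * np.exp(-line_integrals)  # attenuated intensity
          return rng.poisson(expected_counts)                     # quantum noise

      phantom = np.zeros((64, 64))
      phantom[20:44, 20:44] = 0.2                                 # uniform square, mu in 1/cm
      print(simulate_projection(phantom)[:8])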

  14. Development of a sensitive Luminex xMAP-based microsphere immunoassay for specific detection of Iris yellow spot virus.

    Science.gov (United States)

    Yu, Cui; Yang, Cuiyun; Song, Shaoyi; Yu, Zixiang; Zhou, Xueping; Wu, Jianxiang

    2018-04-04

    Iris yellow spot virus (IYSV) is an Orthotospovirus that infects most Allium species. Very few approaches for specific detection of IYSV in infected plants are available to date. We report the development of a highly sensitive Luminex xMAP-based microsphere immunoassay (MIA) for specific detection of IYSV. The nucleocapsid (N) gene of IYSV was cloned and expressed in Escherichia coli to produce the His-tagged recombinant N protein. A panel of monoclonal antibodies (MAbs) against IYSV was generated by immunizing mice with the recombinant N protein. Five specific MAbs (16D9, 11C6, 7F4, 12C10, and 14H12) were identified and used for developing the Luminex xMAP-based MIA systems along with a polyclonal antibody against IYSV. Comparative analyses of their sensitivity and specificity in detecting IYSV from infected tobacco leaves identified 7F4 as the best-performing MAb in the MIA. We then optimized the working conditions of the Luminex xMAP-based MIA for specific detection of IYSV from infected tobacco leaves by using an appropriate blocking buffer, a proper concentration of biotin-labeled antibodies, and a suitable ratio between the antibodies and the streptavidin R-phycoerythrin (SA-RPE). Under the optimized conditions, the Luminex xMAP-based MIA was able to specifically detect IYSV with much higher sensitivity than conventional enzyme-linked immunosorbent assay (ELISA). Importantly, the Luminex xMAP-based MIA is time-saving, and the whole procedure can be completed within 2.5 h. We generated five specific MAbs against IYSV and developed the Luminex xMAP-based MIA method for specific detection of IYSV in plants. This assay provides a sensitive, highly specific, easy-to-perform and likely cost-effective approach for IYSV detection in infected plants, implicating the potential broad usefulness of MIA in plant virus diagnosis.

  15. Multiplex detection of plant pathogens through the luminex magplex bead system

    NARCIS (Netherlands)

    Vlugt, van der R.A.A.; Raaij, van H.M.G.; Weerdt, de M.; Bergervoet, J.H.W.

    2015-01-01

    Here we describe a versatile multiplex method for both the serological and molecular detection of plant pathogens. The Luminex MagPlex bead system uses small paramagnetic microspheres (“beads”), either coated with specific antibodies or oligonucleotides, which capture respectively viruses and/or

  16. Big Data Analytics Platforms analyze from startups to traditional database players

    Directory of Open Access Journals (Sweden)

    Ionut TARANU

    2015-07-01

    Big data analytics enables organizations to analyze a mix of structured, semi-structured and unstructured data in search of valuable business information and insights. The analytical findings can lead to more effective marketing, new revenue opportunities, better customer service, improved operational efficiency, competitive advantages over rival organizations and other business benefits. With so many emerging trends around big data and analytics, IT organizations need to create conditions that will allow analysts and data scientists to experiment. "You need a way to evaluate, prototype and eventually integrate some of these technologies into the business," says Chris Curran [1]. In this paper we review 10 top big data analytics platforms and compare their key features.

  17. Using fuzzy analytical hierarchy process (AHP) to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of a project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, etc. - in short, the web development platform chosen. In addition, choosing the right platform is a multi-criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem. We try to tap the real-life modeling potential of fuzzy logic and conjugate it with the commonly used and powerful AHP modeling method.
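
    To make the AHP step concrete, the sketch below derives criterion weights from a pairwise comparison matrix with the geometric-mean method. Note this is the crisp (non-fuzzy) variant and the judgment matrix is hypothetical; the paper's fuzzy extension replaces the crisp entries with fuzzy numbers.

      # Crisp AHP: criterion weights from a pairwise comparison matrix
      # using the geometric-mean method (judgments are hypothetical).
      import numpy as np

      def ahp_weights(pairwise):
          gm = np.prod(pairwise, axis=1) ** (1.0 / pairwise.shape[0])  # row geometric means
          return gm / gm.sum()                                         # normalized weights

      # criteria: language support, tooling, operating-system fit
      matrix = np.array([[1.0, 3.0, 5.0],
                         [1 / 3, 1.0, 2.0],
                         [1 / 5, 1 / 2, 1.0]])
      print(ahp_weights(matrix))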

  18. A big data geospatial analytics platform - Physical Analytics Integrated Repository and Services (PAIRS)

    Science.gov (United States)

    Hamann, H.; Jimenez Marianno, F.; Klein, L.; Albrecht, C.; Freitag, M.; Hinds, N.; Lu, S.

    2015-12-01

    A major challenge in leveraging big geospatial data sets is the ability to quickly integrate multiple data sources into physical and statistical models and to run these models in real time. A geospatial data platform called Physical Analytics Integrated Repository and Services (PAIRS) has been developed on top of an open-source hardware and software stack to manage terabytes of data. A new data interpolation and re-gridding scheme is implemented in which any geospatial data layer can be associated with a set of global grids whose resolution doubles for consecutive layers. Each pixel on the PAIRS grid has an index that is a combination of location and time stamp. The indexing allows quick access to data sets that are part of the global data layers, retrieving only the data of interest. PAIRS takes advantage of a parallel processing framework (Hadoop) in a cloud environment to ingest, curate, and analyze the data sets while being very robust and stable. The data are stored in a distributed NoSQL database (HBase) across multiple servers; data upload and retrieval are parallelized, with the original analytics task broken up into smaller areas/volumes, analyzed independently, and then reassembled for the original geographical area. The differentiating aspect of PAIRS is the ability to accelerate model development across large geographical regions and spatial resolutions ranging from 0.1 m up to hundreds of kilometers. System performance is benchmarked on real-time automated data ingestion and retrieval of MODIS and Landsat data layers. The data layers are curated for sensor error, verified for correctness, and analyzed statistically to detect local anomalies. Multi-layer queries enable PAIRS to filter different data
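
    A hypothetical sketch of the space-time indexing idea described above: quantize latitude/longitude onto a grid whose resolution doubles with each level and combine the cell with a time index into a composite key. The field layout and resolution levels are assumptions, not the actual PAIRS schema.

      # Hypothetical composite space-time key: grid cell (resolution doubles per level)
      # plus a time index. Not the actual PAIRS schema.
      def grid_key(lat, lon, time_index, level=10):
          cells = 2 ** level                                        # 2^level cells per axis
          row = min(int((lat + 90.0) / 180.0 * cells), cells - 1)
          col = min(int((lon + 180.0) / 360.0 * cells), cells - 1)
          return (level, row, col, time_index)                      # composite index tuple

      print(grid_key(40.7, -73.9, time_index=20150601))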

  19. Platform for Automated Real-Time High Performance Analytics on Medical Image Data.

    Science.gov (United States)

    Allen, William J; Gabr, Refaat E; Tefera, Getaneh B; Pednekar, Amol S; Vaughn, Matthew W; Narayana, Ponnada A

    2018-03-01

    Biomedical data are quickly growing in volume and in variety, providing clinicians an opportunity for better clinical decision support. Here, we demonstrate a robust platform that uses software automation and high performance computing (HPC) resources to achieve real-time analytics of clinical data, specifically magnetic resonance imaging (MRI) data. We used the Agave application programming interface to facilitate communication, data transfer, and job control between an MRI scanner and an off-site HPC resource. In this use case, Agave executed the graphical pipeline tool GRAphical Pipeline Environment (GRAPE) to perform automated, real-time, quantitative analysis of MRI scans. Same-session image processing will open the door for adaptive scanning and real-time quality control, potentially accelerating the discovery of pathologies and minimizing patient callbacks. We envision this platform can be adapted to other medical instruments, HPC resources, and analytics tools.

  20. Information Management Platform for Data Analytics and Aggregation (IMPALA) System Design Document

    Science.gov (United States)

    Carnell, Andrew; Akinyelu, Akinyele

    2016-01-01

    The System Design Document (SDD) tracks the design activities that are performed to guide the integration, installation, verification, and acceptance testing of the IMPALA Platform. The inputs to the design document are derived from the activities recorded in Tasks 1 through 6 of the Statement of Work (SOW), with the proposed technical solution being the completion of Phase 1-A. With the documentation of the architecture of the IMPALA Platform and the installation steps taken, the SDD will be a living document, capturing the details about capability enhancements and system improvements to the IMPALA Platform that support users in developing accurate and precise analytical models. The IMPALA Platform infrastructure team, data architecture team, system integration team, security management team, project manager, NASA data scientists and users are the intended audience of this document. The IMPALA Platform is an assembly of commercial-off-the-shelf (COTS) products installed on an Apache Hadoop platform. User interface details for the COTS products are sourced from the COTS tool vendors' documentation. The SDD is a focused explanation of the inputs, design steps, and projected outcomes of every design activity for the IMPALA Platform through installation and validation.

  1. The Analytic Information Warehouse (AIW): a platform for analytics using electronic health record data.

    Science.gov (United States)

    Post, Andrew R; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H

    2013-06-01

    To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in 5 years of data from our institution's clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. Computer-operated analytical platform for the determination of nutrients in hydroponic systems.

    Science.gov (United States)

    Rius-Ruiz, F Xavier; Andrade, Francisco J; Riu, Jordi; Rius, F Xavier

    2014-03-15

    Hydroponics is a water, energy, space, and cost efficient system for growing plants in constrained spaces or land exhausted areas. Precise control of hydroponic nutrients is essential for growing healthy plants and producing high yields. In this article we report for the first time on a new computer-operated analytical platform which can be readily used for the determination of essential nutrients in hydroponic growing systems. The liquid-handling system uses inexpensive components (i.e., peristaltic pump and solenoid valves), which are discretely computer-operated to automatically condition, calibrate and clean a multi-probe of solid-contact ion-selective electrodes (ISEs). These ISEs, which are based on carbon nanotubes, offer high portability, robustness and easy maintenance and storage. With this new computer-operated analytical platform we performed automatic measurements of K(+), Ca(2+), NO3(-) and Cl(-) during tomato plants growth in order to assure optimal nutritional uptake and tomato production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. The Analytic Information Warehouse (AIW): a Platform for Analytics using Electronic Health Record Data

    Science.gov (United States)

    Post, Andrew R.; Kurc, Tahsin; Cholleti, Sharath; Gao, Jingjing; Lin, Xia; Bornstein, William; Cantrell, Dedra; Levine, David; Hohmann, Sam; Saltz, Joel H.

    2013-01-01

    Objective To create an analytics platform for specifying and detecting clinical phenotypes and other derived variables in electronic health record (EHR) data for quality improvement investigations. Materials and Methods We have developed an architecture for an Analytic Information Warehouse (AIW). It supports transforming data represented in different physical schemas into a common data model, specifying derived variables in terms of the common model to enable their reuse, computing derived variables while enforcing invariants and ensuring correctness and consistency of data transformations, long-term curation of derived data, and export of derived data into standard analysis tools. It includes software that implements these features and a computing environment that enables secure high-performance access to and processing of large datasets extracted from EHRs. Results We have implemented and deployed the architecture in production locally. The software is available as open source. We have used it as part of hospital operations in a project to reduce rates of hospital readmission within 30 days. The project examined the association of over 100 derived variables representing disease and co-morbidity phenotypes with readmissions in five years of data from our institution’s clinical data warehouse and the UHC Clinical Database (CDB). The CDB contains administrative data from over 200 hospitals that are in academic medical centers or affiliated with such centers. Discussion and Conclusion A widely available platform for managing and detecting phenotypes in EHR data could accelerate the use of such data in quality improvement and comparative effectiveness studies. PMID:23402960

  4. Case Study: IBM Watson Analytics Cloud Platform as Analytics-as-a-Service System for Heart Failure Early Detection

    Directory of Open Access Journals (Sweden)

    Gabriele Guidi

    2016-07-01

    In recent years, progress in technology and the increasing availability of fast connections have produced a migration of functionalities in Information Technology services from static servers to distributed technologies. This article describes the main tools available on the market to perform Analytics as a Service (AaaS) using a cloud platform. It also describes a use case of IBM Watson Analytics, a cloud system for data analytics, applied to the following research scope: detecting the presence or absence of Heart Failure disease using nothing more than the electrocardiographic signal, in particular through the analysis of Heart Rate Variability. The obtained results are comparable with those from the literature in terms of accuracy and predictive power. Advantages and drawbacks of cloud versus static approaches are discussed in the last sections.

  5. AI based HealthCare Platform for Real Time, Predictive and Prescriptive Analytics using Reactive Programming

    Science.gov (United States)

    Kaur, Jagreet; Singh Mann, Kulwinder, Dr.

    2018-01-01

    AI in healthcare is needed to bring real, actionable and individualized insights in real time for patients and doctors to support treatment decisions. We need a patient-centred platform for integrating EHR data, patient data, prescriptions, monitoring, clinical research and data. This paper proposes a generic architecture for enabling an AI-based healthcare analytics platform by using open-source technologies: Apache Beam, Apache Flink, Apache Spark, Apache NiFi, Kafka, Tachyon, GlusterFS, and the NoSQL stores Elasticsearch and Cassandra. This paper will show the importance of applying AI-based predictive and prescriptive analytics techniques in the health sector. The system will be able to extract useful knowledge that helps in decision making and medical monitoring in real time through intelligent process analysis and big data processing.

  6. VAP/VAT: video analytics platform and test bed for testing and deploying video analytics

    Science.gov (United States)

    Gorodnichy, Dmitry O.; Dubrofsky, Elan

    2010-04-01

    Deploying Video Analytics (VA) in operational environments is extremely challenging. This paper presents a methodological approach developed by the Video Surveillance and Biometrics Section (VSB) of the Science and Engineering Directorate (S&E) of the Canada Border Services Agency (CBSA) to resolve these problems. A three-phase approach to enable VA deployment within an operational agency is presented, and the Video Analytics Platform and Testbed (VAP/VAT) developed by the VSB section is introduced. In addition to allowing the integration of third-party and in-house built VA codes into an existing video surveillance infrastructure, VAP/VAT also allows the agency to conduct an unbiased performance evaluation of the cameras and VA software available on the market. VAP/VAT consists of two components: EventCapture, which serves to automatically detect a "Visual Event", and EventBrowser, which serves to display and peruse the "Visual Details" captured at the "Visual Event". To deal with open-architecture as well as closed-architecture cameras, two video-feed capture mechanisms have been developed within the EventCapture component: IPCamCapture and ScreenCapture.

  7. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

    Over the years, computation has become a fundamental part of scientific practice in several research fields, going far beyond the boundaries of the natural sciences. Data mining, machine learning, simulations and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas, from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both the natural and social sciences, we describe the concept and the features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, within it, about the challenges of computational legal science.

  8. Analytical interference of HBOC-201 (Hemopure, a synthetic hemoglobin-based oxygen carrier) on four common clinical chemistry platforms.

    Science.gov (United States)

    Korte, Erik A; Pozzi, Nicole; Wardrip, Nina; Ayyoubi, M Tayyeb; Jortani, Saeed A

    2018-07-01

    There are 13 million blood transfusions each year in the US. Limitations in the donor pool, storage capabilities, mass casualties, access in remote locations and reactivity of donors all limit the availability of transfusable blood products to patients. HBOC-201 (Hemopure®) is a second-generation glutaraldehyde polymer of bovine hemoglobin which can serve as an "oxygen bridge" to maintain oxygen-carrying capacity while transfusion products are unavailable. Hemopure offers the advantages of extended shelf life, ambient storage, and limited reactive potential, but its extracellular location can also cause significant interference in modern laboratory analyzers, similar to severe hemolysis. Observed error in 26 commonly measured analytes was determined on 4 different analytical platforms in plasma from a patient therapeutically transfused with Hemopure as well as in donor blood spiked with Hemopure at a level equivalent to the therapeutic loading dose (10% v/v). Significant negative error ratios >50% of the total allowable error (>0.5tAE) were reported in 23/104 assays (22.1%), positive bias of >0.5tAE in 26/104 assays (25.0%), and acceptable bias (error ratio between -0.5tAE and 0.5tAE) in 44/104 assays (42.3%). Analysis failed in the presence of Hemopure in 11/104 (10.6%). Observed error is further subdivided by platform, wavelength, dilution and reaction method. Administration of Hemopure (or other hemoglobin-based oxygen carriers) presents a challenge to laboratorians tasked with analyzing patient specimens. We provide laboratorians with a reference to evaluate patient samples, select optimal analytical platforms for specific analytes, and predict possible bias beyond the 4 analytical platforms included in this study. Copyright © 2018 Elsevier B.V. All rights reserved.
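
    The bias classification quoted above (error ratio relative to total allowable error, flagged when it exceeds ±0.5 tAE) reduces to simple arithmetic; a sketch follows. The numbers in the example are illustrative, not the study's measurements.

      # Error ratio relative to total allowable error (tAE); numbers are illustrative.
      def classify_bias(measured, baseline, total_allowable_error):
          ratio = (measured - baseline) / total_allowable_error
          if ratio > 0.5:
              return ratio, "significant positive bias"
          if ratio < -0.5:
              return ratio, "significant negative bias"
          return ratio, "acceptable"

      print(classify_bias(measured=3.4, baseline=4.1, total_allowable_error=1.0))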

  9. Detection of Neisseria meningitidis in cerebrospinal fluid using a multiplex PCR and the Luminex detection technology

    DEFF Research Database (Denmark)

    Møller, Jens Kjølseth

    2012-01-01

    pathogens most frequently found in the cerebrospinal fluid of patients. The Luminex suspension array system uniquely combines flow cytometry, microspheres, laser technology, digital signal processing, and traditional chemistry. In this method, the reaction is carried out in one vessel, in which distinctly...

  10. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily enhanced with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  11. PARAMO: a PARAllel predictive MOdeling platform for healthcare analytic research using electronic health records.

    Science.gov (United States)

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R; Stewart, Walter F; Malin, Bradley; Sun, Jimeng

    2014-04-01

    Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: (1) cohort construction, (2) feature construction, (3) cross-validation, (4) feature selection, and (5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which (1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, (2) schedules the tasks in a topological ordering of the graph, and (3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000-patient data set in 3 hours in parallel, compared to 9 days if running sequentially. This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines
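
    A minimal sketch of the pipeline idea behind PARAMO: build a task dependency graph, take tasks in topological order, and run independent tasks concurrently. A local thread pool stands in for the paper's Map-Reduce execution, and the task names are made up.

      # Task dependency graph -> topological order -> parallel execution
      # (thread pool as a stand-in for Map-Reduce; task names are made up).
      from concurrent.futures import ThreadPoolExecutor
      from graphlib import TopologicalSorter

      tasks = {
          "cohort": [], "features": ["cohort"], "cv_split": ["features"],
          "feature_selection": ["cv_split"], "classify": ["feature_selection"],
      }

      def run(name):
          print(f"running {name}")

      sorter = TopologicalSorter(tasks)
      sorter.prepare()
      with ThreadPoolExecutor(max_workers=4) as pool:
          while sorter.is_active():
              ready = list(sorter.get_ready())    # tasks whose prerequisites are done
              list(pool.map(run, ready))          # run independent tasks concurrently
              for name in ready:
                  sorter.done(name)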

  12. Development of a bead-based Luminex assay using lipopolysaccharide specific monoclonal antibodies to detect biological threats from Brucella species.

    Science.gov (United States)

    Silbereisen, Angelika; Tamborrini, Marco; Wittwer, Matthias; Schürch, Nadia; Pluschke, Gerd

    2015-10-05

    Brucella, a Gram-negative bacterium, is classified as a potential bioterrorism agent mainly due to the low dose needed to cause infection and the ability to transmit the bacteria via aerosols. Goats/sheep, cattle, pigs, dogs, sheep and rodents are infected by B. melitensis, B. abortus, B. suis, B. canis, B. ovis and B. neotomae, respectively, the six classical Brucella species. Most human cases are caused by B. melitensis and B. abortus. Our aim was to specifically detect Brucellae with 'smooth' lipopolysaccharide (LPS) using a highly sensitive monoclonal antibody (mAb) based immunological assay. To complement molecular detection systems for potential bioterror agents, as required by international biodefense regulations, sets of mAbs were generated by B cell hybridoma technology and used to develop immunological assays. The combination of mAbs most suitable for an antigen capture assay format was identified and an immunoassay using the Luminex xMAP technology was developed. MAbs specific for the LPS O-antigen of Brucella spp. were generated by immunising mice with inactivated B. melitensis or B. abortus cells. Most mAbs recognised both B. melitensis and B. abortus and antigen binding was not impeded by inactivation of the bacterial cells by γ irradiation, formalin or heat treatment, a step required to analyse the samples immunologically under biosafety level two conditions. The Luminex assay recognised all tested Brucella species with 'smooth' LPS with detection limits of 2×10(2) to 8×10(4) cells per mL, depending on the species tested. Milk samples spiked with Brucella spp. cells were identified successfully using the Luminex assay. In addition, the bead-based immunoassay was integrated into a multiplex format, allowing for simultaneous, rapid and specific detection of Brucella spp., Bacillus anthracis, Francisella tularensis and Yersinia pestis within a single sample. Overall, the robust Luminex assay should allow detection of Brucella spp. in both natural

  13. Development and validation of a Luminex assay for detection of a predictive biomarker for PROSTVAC-VF therapy

    Science.gov (United States)

    Lucas, Julie L.; Tacheny, Erin A.; Ferris, Allison; Galusha, Michelle; Srivastava, Apurva K.; Ganguly, Aniruddha; Williams, P. Mickey; Sachs, Michael C.; Thurin, Magdalena; Tricoli, James V.; Ricker, Winnie; Gildersleeve, Jeffrey C.

    2017-01-01

    Cancer therapies can provide substantially improved survival in some patients while other seemingly similar patients receive little or no benefit. Strategies to identify patients likely to respond well to a given therapy could significantly improve health care outcomes by maximizing clinical benefits while reducing toxicities and adverse effects. Using a glycan microarray assay, we recently reported that pretreatment serum levels of IgM specific to blood group A trisaccharide (BG-Atri) correlate positively with overall survival of cancer patients on PROSTVAC-VF therapy. The results suggested anti-BG-Atri IgM measured prior to treatment could serve as a biomarker for identifying patients likely to benefit from PROSTVAC-VF. For continued development and clinical application of serum IgM specific to BG-Atri as a predictive biomarker, a clinical assay was needed. In this study, we developed and validated a Luminex-based clinical assay for measuring serum IgM specific to BG-Atri. IgM levels were measured with the Luminex assay and compared to levels measured using the microarray for 126 healthy individuals and 77 prostate cancer patients. This assay provided reproducible and consistent results with low %CVs, and tolerance ranges were established for the assay. IgM levels measured using the Luminex assay were found to be highly correlated to the microarray results with R values of 0.93–0.95. This assay is a Laboratory Developed Test (LDT) and is suitable for evaluating thousands of serum samples in CLIA certified laboratories that have validated the assay. In addition, the study demonstrates that discoveries made using neoglycoprotein-based microarrays can be readily migrated to a clinical assay. PMID:28771597

  14. Rapid detection and semi-quantification of IgG-accessible Staphylococcus aureus surface-associated antigens using a multiplex competitive Luminex assay

    NARCIS (Netherlands)

    Hansenova Manaskova, S.; Bikker, F.J.; Veerman, E.C.I.; van Belkum, A.; van Wamel, W.J.B.

    2013-01-01

    The surface characterization of Staphylococcus aureus is currently labor intensive and time consuming. Therefore, we developed a novel method for the rapid yet comprehensive characterization of S. aureus cell-surface-associated proteins and carbohydrates, based on a competitive Luminex assay. In

  15. SAW-Based Phononic Crystal Microfluidic Sensor-Microscale Realization of Velocimetry Approaches for Integrated Analytical Platform Applications.

    Science.gov (United States)

    Oseev, Aleksandr; Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V; Hirsch, Soeren

    2017-09-23

    The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept.

  16. Standardization of a cytometric p24-capture bead-assay for the detection of main HIV-1 subtypes.

    Science.gov (United States)

    Merbah, Mélanie; Onkar, Sayali; Grivel, Jean-Charles; Vanpouille, Christophe; Biancotto, Angélique; Bonar, Lydia; Sanders-Buell, Eric; Kijak, Gustavo; Michael, Nelson; Robb, Merlin; Kim, Jerome H; Tovanabutra, Sodsai; Chenine, Agnès-Laurence

    2016-04-01

    The prevailing method to assess HIV-1 replication and infectivity is to measure the production of p24 Gag protein by enzyme-linked immunosorbent assay (ELISA). Since fluorescent bead-based technologies offer a broader dynamic range and higher sensitivity, this study describes a p24-capture Luminex assay capable of detecting HIV-1 subtypes A-D and circulating recombinant forms (CRF) CRF01_AE and CRF02_AG, which together are responsible for over 90% of HIV-1 infections worldwide. The success of the assay lies in the identification and selection of a cross-reactive capture antibody (clone 183-H12-5C). Fifty-six isolates belonging to six HIV-1 subtypes and CRFs were successfully detected with p-values below 0.021 and limits of detection ranging from 3.7 to 3 × 10(4) pg/ml. The intra- and inter-assay variation gave coefficients of variation below 6% and 14%, respectively. The 183-bead Luminex assay also displayed higher sensitivity, 91% and 98%, compared to a commercial p24 ELISA and a previously described Luminex assay. The p24 concentrations measured by the 183-bead Luminex assay showed a significant correlation (R=0.92). The assay leverages the advantages of the Luminex platform, which include smaller sample volume and simultaneous detection of up to 500 analytes in a single sample, and delivers a valuable tool for the field. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. The performance of Luminex ARIES® Flu A/B & RSV and Cepheid Xpert® Flu/RSV XC for the detection of influenza A, influenza B, and respiratory syncytial virus in prospective patient samples.

    Science.gov (United States)

    McMullen, Phillip; Boonlayangoor, Sue; Charnot-Katsikas, Angella; Beavis, Kathleen G; Tesic, Vera

    2017-10-01

    The demand for rapid, accurate viral testing has increased the number of assays available for the detection of viral pathogens. One of the newest FDA-cleared platforms is the Luminex ARIES® Flu A/B & RSV, a fully automated, real-time PCR-based assay used for detection of influenza A, influenza B, and respiratory syncytial virus (RSV). We sought to compare the performance of the Luminex ARIES® Flu A/B & RSV assay to the Cepheid Xpert® Flu/RSV XC assay for rapid Flu and RSV testing. A series of consecutive nasopharyngeal specimens received in the clinical microbiology laboratory during peak influenza season at a major academic center in Chicago, IL, were prospectively tested side by side using both the ARIES® Flu A/B & RSV and Xpert® Flu/RSV XC assays. Discrepant results were tested on the BioFire FilmArray® Respiratory Panel for resolution. A total of 143 consecutive nasopharyngeal specimens, obtained from patients ranging from six months to ninety-three years of age, were received between January 1st, 2017 and March 21st, 2017. There was 96.6% agreement between the two assays for detection of influenza A, 100% agreement for detection of influenza B and RSV, and 98.9% agreement for negative results. The Xpert® Flu/RSV XC performed with an average turn-around time of approximately 60 minutes, compared to approximately 120 minutes for the ARIES® Flu A/B & RSV. Both assays were equally easy to perform, with a similar amount of hands-on technologist time for each platform. Overall, these results indicate that both tests are comparable in terms of result agreement and technical ease of use. The Xpert® Flu/RSV XC assay did produce results with a shorter turn-around time, approximately 60 minutes quicker than the ARIES® Flu A/B & RSV. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Google analytics integrations

    CERN Document Server

    Waisberg, Daniel

    2015-01-01

    A roadmap for turning Google Analytics into a centralized marketing analysis platform With Google Analytics Integrations, expert author Daniel Waisberg shows you how to gain a more meaningful, complete view of customers that can drive growth opportunities. This in-depth guide shows not only how to use Google Analytics, but also how to turn this powerful data collection and analysis tool into a central marketing analysis platform for your company. Taking a hands-on approach, this resource explores the integration and analysis of a host of common data sources, including Google AdWords, AdSens

  19. Comparison of ophthalmic sponges and extraction buffers for quantifying cytokine profiles in tears using Luminex technology.

    Science.gov (United States)

    Inic-Kanada, Aleksandra; Nussbaumer, Andrea; Montanaro, Jacqueline; Belij, Sandra; Schlacher, Simone; Stein, Elisabeth; Bintner, Nora; Merio, Margarethe; Zlabinger, Gerhard J; Barisani-Asenbauer, Talin

    2012-01-01

    Evaluating cytokine profiles in tears could shed light on the pathogenesis of various ocular surface diseases. When collecting tears with the methods currently available, it is often not possible to avoid the tear reflex, which may give a different cytokine profile compared to basal tears. More importantly, tear collection with glass capillaries, the most widely used method for taking samples and the best method for avoiding tear reflex, is impractical for remote area field studies because it is tedious and time-consuming for health workers, who cannot collect tears from a large number of patients with this method in one day. Furthermore, this method is uncomfortable for anxious patients and children. Thus, tears are frequently collected using ophthalmic sponges. These sponges have the advantage that they are well tolerated by the patient, especially children, and enable standardization of the tear collection volume. The aim of this study was to compare various ophthalmic sponges and extraction buffers to optimize the tear collection method for field studies for subsequent quantification of cytokines in tears using the Luminex technology. Three ophthalmic sponges, Merocel, Pro-ophta, and Weck-Cel, were tested. Sponges were presoaked with 25 cytokines/chemokines of known concentrations and eluted with seven different extraction buffers (EX1-EX7). To assess possible interference in the assay from the sponges, two standard curves were prepared in parallel: 1) cytokines of known concentrations with the extraction buffers and 2) cytokines of known concentrations loaded onto the sponges with the extraction buffers. Subsequently, a clinical assessment of the chosen sponge-buffer combination was performed with tears collected from four healthy subjects using 1) aspiration and 2) sponges. To quantify cytokine/chemokine recovery and the concentration in the tears, a 25-plex Cytokine Panel and the Luminex xMap were used. This platform enables simultaneous measurement of

  20. Comparison of ophthalmic sponges and extraction buffers for quantifying cytokine profiles in tears using Luminex technology

    Science.gov (United States)

    Inic-Kanada, Aleksandra; Nussbaumer, Andrea; Montanaro, Jacqueline; Belij, Sandra; Schlacher, Simone; Stein, Elisabeth; Bintner, Nora; Merio, Margarethe; Zlabinger, Gerhard J.

    2012-01-01

    Purpose Evaluating cytokine profiles in tears could shed light on the pathogenesis of various ocular surface diseases. When collecting tears with the methods currently available, it is often not possible to avoid the tear reflex, which may give a different cytokine profile compared to basal tears. More importantly, tear collection with glass capillaries, the most widely used method for taking samples and the best method for avoiding tear reflex, is impractical for remote area field studies because it is tedious and time-consuming for health workers, who cannot collect tears from a large number of patients with this method in one day. Furthermore, this method is uncomfortable for anxious patients and children. Thus, tears are frequently collected using ophthalmic sponges. These sponges have the advantage that they are well tolerated by the patient, especially children, and enable standardization of the tear collection volume. The aim of this study was to compare various ophthalmic sponges and extraction buffers to optimize the tear collection method for field studies for subsequent quantification of cytokines in tears using the Luminex technology. Methods Three ophthalmic sponges, Merocel, Pro-ophta, and Weck-Cel, were tested. Sponges were presoaked with 25 cytokines/chemokines of known concentrations and eluted with seven different extraction buffers (EX1–EX7). To assess possible interference in the assay from the sponges, two standard curves were prepared in parallel: 1) cytokines of known concentrations with the extraction buffers and 2) cytokines of known concentrations loaded onto the sponges with the extraction buffers. Subsequently, a clinical assessment of the chosen sponge-buffer combination was performed with tears collected from four healthy subjects using 1) aspiration and 2) sponges. To quantify cytokine/chemokine recovery and the concentration in the tears, a 25-plex Cytokine Panel and the Luminex xMap were used. This platform enables simultaneous

  1. Service Quality of Online Shopping Platforms: A Case-Based Empirical and Analytical Study

    Directory of Open Access Journals (Sweden)

    Tsan-Ming Choi

    2013-01-01

    Full Text Available Customer service is crucially important for online shopping platforms (OSPs) such as eBay and Taobao. Based on the well-established service quality instruments and the scenario of the specific case on Taobao, this paper focuses on exploring the service quality of an OSP, with the aim of revealing customer perceptions of the service quality associated with the provided functions and investigating their impacts on customer loyalty. Through an empirical study, this paper finds that the “fulfillment and responsiveness” function is significantly related to customer loyalty. A further analytical study is conducted to reveal that the optimal service level on the “fulfillment and responsiveness” function for the risk-averse OSP uniquely exists. Moreover, the analytical results prove that (i) if the customer loyalty is more positively correlated to the service level, it will lead to a larger optimal service level, and (ii) the optimal service level is independent of the profit target, the source of uncertainty, and the risk preference of the OSP.

  2. RTEMP: Exploring an end-to-end, agnostic platform for multidisciplinary real-time analytics in the space physics community and beyond

    Science.gov (United States)

    Chaddock, D.; Donovan, E.; Spanswick, E.; Jackel, B. J.

    2014-12-01

    Large-scale, real-time, sensor-driven analytics are a highly effective set of tools in many research environments; however, the barrier to entry is high and the learning curve is steep. These systems need to operate efficiently from end to end, with the key aspects being data transmission, acquisition, management and organization, and retrieval. When building a generic multidisciplinary platform, acquisition and data management need to be designed with scalability and flexibility as the primary focus. Additionally, in order to leverage current sensor web technologies, the integration of common sensor data standards (i.e., SensorML and SWE services) should be supported. Perhaps most importantly, researchers should be able to get started and integrate the platform into their set of research tools as easily and quickly as possible. The largest issue with current platforms is that the sensor data must be formed and described using the previously mentioned standards. As useful as these standards are for organizing data, they are cumbersome to adopt, often restrictive, and are required to be geospatially driven. Our solution, RTEMP (Real-time Environment Monitoring Platform), is a real-time analytics platform with over ten years and an estimated two million dollars of investment. It has been developed for our continuously expanding requirements of operating and building remote sensors and supporting equipment for space physics research. A key benefit of our approach is RTEMP's ability to manage agnostic data. This allows data that flows through the system to be structured in any way that best addresses the needs of the sensor operators and data users, enabling extensive flexibility and streamlined development and research. Here we begin with an overview of RTEMP and how it is structured. Additionally, we will showcase the ways that we are using RTEMP and how it is being adopted by researchers in an increasingly broad range of other research fields. We will lay out a

  3. Effect of implementing anti-HLA antibody detection by Luminex in the kidney transplant program in Chile.

    Science.gov (United States)

    Elgueta, S; Fuentes, C; López, M; Hernández, J; Arenas, A; Jiménez, M; Gajardo, J G; Rodríguez, H; Labraña, C

    2011-11-01

    The development of new highly sensitive, specific technologies to detect HLA antibodies has allowed a better definition of the profile of non-permitted antigens for patients awaiting kidney transplantation. The use of calculated or virtual panel reactive antibodies (CPRA or vPRA) seeks to improve the prediction of positive crossmatches (XM), but increases the proportion of sensitized patients on the waiting list. In 2008-2009, we implemented detection of antibodies using Luminex technology and applied vPRA since 2009. The objective of this study was to evaluate the impact of these innovations in detecting patient sensitization on kidney transplant waiting lists for deceased donors and among transplanted patients. We analyzed the waiting list for 2007 through 2009 and the first semester of 2010, including the patients transplanted in these periods and the XM with deceased donors. We observed an increase in the mean peak PRA of transplanted patients from 7.2% in 2007 to 17.1% in 2010 (P = .001), and in the proportion of patients transplanted with a peak PRA > 50% from 2.8% in 2007 to 15.7% in 2010 (P = .0001), with no increase in the proportion of this population on the waiting lists. There was a concurrent decrease in positive XM among patients with a peak PRA > 50%. The use of vPRA and Luminex permitted a greater number of transplants of patients with peak PRA > 50% and was a good predictor of a positive XM. Published by Elsevier Inc.
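
    The "calculated or virtual PRA" mentioned in this record is, at its core, the share of a reference donor panel carrying at least one antigen listed as unacceptable for a candidate. The sketch below shows that calculation under stated assumptions; the donor panel, antigen lists, and function name are hypothetical and do not reflect the Chilean registry data.

```python
# Hedged sketch: virtual/calculated PRA as the percentage of panel donors
# expressing any unacceptable antigen. All data below are invented.
def virtual_pra(unacceptable, donor_panel):
    """Percent of panel donors expressing at least one unacceptable antigen."""
    unacceptable = set(unacceptable)
    hits = sum(1 for donor in donor_panel if unacceptable & set(donor))
    return 100.0 * hits / len(donor_panel)

# Hypothetical reference panel: each donor is a set of HLA-A/B/DR antigens.
panel = [
    {"A1", "A2", "B8", "B44", "DR3", "DR4"},
    {"A2", "A24", "B7", "B35", "DR1", "DR15"},
    {"A3", "A11", "B7", "B51", "DR11", "DR13"},
    {"A1", "A3", "B8", "B60", "DR3", "DR13"},
]

print(virtual_pra(["B8", "DR15"], panel))  # 75.0 -> three of four donors carry B8 or DR15
```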

  4. A Mobile-based Platform for Big Load Profiles Data Analytics in Non-Advanced Metering Infrastructures

    Directory of Open Access Journals (Sweden)

    Moussa Sherin

    2016-01-01

    Full Text Available With the rapid increase in electricity demand around the world due to industrialization and urbanization, precise knowledge of consumers' consumption patterns has become a valuable asset for electricity providers, given the current competitive electricity market. This would allow them to provide satisfactory services in times of load peaks and to control fraud and abuse cases. Despite this crucial necessity, such knowledge is currently very hard to obtain in many developing countries, since smart meters or advanced metering infrastructures (AMIs) are not yet deployed there to monitor and report energy usage. Communication and information technologies, however, have emerged widely in these nations, allowing the enormous spread of smart devices among the population. In this paper, we present mobile-based BLPDA, a novel platform for big data analytics of consumers' load profiles (LPs) in the absence of AMIs. The proposed platform utilizes mobile computing to collect consumers' consumption data, build their LPs, and analyze the aggregated usage data, allowing electricity providers to have a better basis for an enhanced decision-making process. The experimental results emphasize the effectiveness of our platform as an adequate alternative to AMIs in developing countries with minimal cost.

  5. Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.

    Science.gov (United States)

    Chen, Tiffany J; Kotecha, Nikesh

    2014-01-01

    Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.

  6. A DUAL PLATFORM FOR SELECTIVE ANALYTE ENRICHMENT AND IONIZATION IN MASS SPECTROMETRY USING APTAMER-CONJUGATED GRAPHENE OXIDE

    OpenAIRE

    Gulbakan, Basri; Yasun, Emir; Shukoor, M. Ibrahim; Zhu, Zhi; You, Mingxu; Tan, Xiaohong; Sanchez, Hernan; Powell, David H.; Dai, Hongjie; Tan, Weihong

    2010-01-01

    This study demonstrates the use of aptamer-conjugated graphene oxide as an affinity extraction and detection platform for analytes from complex biological media. We have shown that cocaine and adenosine can be selectively enriched from plasma samples and that direct mass spectrometric readout can be obtained without a matrix and with greatly improved signal-to-noise ratios. The aptamer conjugated graphene oxide has clear advantages in target enrichment and in generating highly efficient ioniz...

  7. Towards a Web-Enabled Geovisualization and Analytics Platform for the Energy and Water Nexus

    Science.gov (United States)

    Sanyal, J.; Chandola, V.; Sorokine, A.; Allen, M.; Berres, A.; Pang, H.; Karthik, R.; Nugent, P.; McManamay, R.; Stewart, R.; Bhaduri, B. L.

    2017-12-01

    Interactive data analytics are playing an increasingly vital role in the generation of new, critical insights regarding the complex dynamics of the energy/water nexus (EWN) and its interactions with climate variability and change. Integration of impacts, adaptation, and vulnerability (IAV) science with emerging, and increasingly critical, data science capabilities offers a promising potential to meet the needs of the EWN community. To enable the exploration of pertinent research questions, a web-based geospatial visualization platform is being built that integrates a data analysis toolbox with advanced data fusion and data visualization capabilities to create a knowledge discovery framework for the EWN. The system, when fully built out, will offer several geospatial visualization capabilities, including statistical visual analytics, clustering, principal-component analysis, dynamic time warping, uncertainty visualization, and the exploration of data provenance, and will support machine-learning-driven discovery to render diverse types of geospatial data and facilitate interactive analysis. Key components in the system architecture include NASA's WebWorldWind, the Globus toolkit, PostgreSQL, as well as other custom-built software modules.

  8. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    Directory of Open Access Journals (Sweden)

    Dillon Chrimes

    2017-01-01

    Full Text Available Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective to establish an interactive BDA platform with simulated patient data using open-source software technologies was achieved by construction of a platform framework with Hadoop Distributed File System (HDFS) using HBase (key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, to complete MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Found inconsistencies of MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Hospital system based on patient-centric data was challenging in using HBase, whereby not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.

  9. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services.

    Science.gov (United States)

    Chrimes, Dillon; Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective to establish an interactive BDA platform with simulated patient data using open-source software technologies was achieved by construction of a platform framework with Hadoop Distributed File System (HDFS) using HBase (key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, to complete MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Found inconsistencies of MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Hospital system based on patient-centric data was challenging in using HBase, whereby not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services.

  10. Using Distributed Data over HBase in Big Data Analytics Platform for Clinical Services

    Science.gov (United States)

    Zamani, Hamid

    2017-01-01

    Big data analytics (BDA) is important to reduce healthcare costs. However, there are many challenges of data aggregation, maintenance, integration, translation, analysis, and security/privacy. The study objective to establish an interactive BDA platform with simulated patient data using open-source software technologies was achieved by construction of a platform framework with Hadoop Distributed File System (HDFS) using HBase (key-value NoSQL database). Distributed data structures were generated from benchmarked hospital-specific metadata of nine billion patient records. At optimized iteration, HDFS ingestion of HFiles to HBase store files revealed sustained availability over hundreds of iterations; however, to complete MapReduce to HBase required a week (for 10 TB) and a month for three billion (30 TB) indexed patient records, respectively. Found inconsistencies of MapReduce limited the capacity to generate and replicate data efficiently. Apache Spark and Drill showed high performance with high usability for technical support but poor usability for clinical services. Hospital system based on patient-centric data was challenging in using HBase, whereby not all data profiles were fully integrated with the complex patient-to-hospital relationships. However, we recommend using HBase to achieve secured patient data while querying entire hospital volumes in a simplified clinical event model across clinical services. PMID:29375652
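
    For readers unfamiliar with the key-value access pattern the HBase-based clinical platform above relies on, the sketch below shows one conventional way to scan a patient-event table through the HBase Thrift gateway with the Python happybase client. The host, table name, row-key layout, and column qualifiers are invented; the cited study does not publish its schema, so this is only an illustration of the access pattern.

```python
# Hedged sketch: prefix scan over a hypothetical patient-event table in HBase
# via the Thrift gateway, using the happybase client library.
import happybase

connection = happybase.Connection("hbase-thrift.example.org", port=9090)
table = connection.table("patient_events")

# Row keys assumed to be "<patient_id>#<event_timestamp>" so that one patient's
# events are stored contiguously and can be read with a single prefix scan.
for row_key, columns in table.scan(row_prefix=b"P0004217#", limit=100):
    encounter = columns.get(b"cf:encounter_type", b"").decode()
    diagnosis = columns.get(b"cf:diagnosis_code", b"").decode()
    print(row_key.decode(), encounter, diagnosis)

connection.close()
```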

  11. Simultaneous analysis of cerebrospinal fluid biomarkers using microsphere-based xMAP multiplex technology for early detection of Alzheimer's disease.

    Science.gov (United States)

    Kang, Ju-Hee; Vanderstichele, Hugo; Trojanowski, John Q; Shaw, Leslie M

    2012-04-01

    The xMAP-Luminex multiplex platform for measurement of Alzheimer's disease (AD) cerebrospinal fluid (CSF) biomarkers using Innogenetics AlzBio3 immunoassay reagents that are for research use only has been shown to be an effective tool for early detection of an AD-like biomarker signature based on concentrations of CSF Aβ(1-42), t-tau and p-tau(181). Among the several advantages of the xMAP-Luminex platform for AD CSF biomarkers are: a wide dynamic range of ready-to-use calibrators, time savings for the simultaneous analyses of three biomarkers in one analytical run, reduction of human error, potential of reduced cost of reagents, and a modest reduction of sample volume as compared to conventional enzyme-linked immunosorbent assay (ELISA) methodology. Recent clinical studies support the use of CSF Aβ(1-42), t-tau and p-tau(181) measurement using the xMAP-Luminex platform for the early detection of AD pathology in cognitively normal individuals, and for prediction of progression to AD dementia in subjects with mild cognitive impairment (MCI). Studies that have shown the prediction of risk for progression to AD dementia by MCI patients provide the basis for the use of CSF Aβ(1-42), t-tau and p-tau(181) testing to assign risk for progression in patients enrolled in therapeutic trials. Furthermore, emerging study data suggest that these pathologic changes occur in cognitively normal subjects 20 or more years before the onset of clinically detectable memory changes, thus providing an objective measurement for use in the assessment of treatment effects in primary treatment trials. However, numerous previous ELISA and Luminex-based multiplex studies reported a wide range of absolute values of CSF Aβ(1-42), t-tau and p-tau(181) indicative of substantial inter-laboratory variability as well as varying degrees of intra-laboratory imprecision. In order to address these issues a recent inter-laboratory investigation that included a common set of CSF pool aliquots from

  12. Solution of direct kinematic problem for Stewart-Gough platform with the use of analytical equation of plane

    Directory of Open Access Journals (Sweden)

    A. L. Lapikov

    2014-01-01

    Full Text Available The paper concerns the solution of the direct kinematic problem for the Stewart-Gough platform of type 6-3. The article presents a detailed analysis of methods for solving the direct kinematic problem for platform mechanisms based on parallel structures, and estimates the complexity of the solution for parallel-kinematics mechanisms in comparison with classical manipulators characterized by an open kinematic chain. A method for solving this problem is suggested. It consists in establishing the functional dependence of the Cartesian coordinates and orientation of the moving platform centre on the values of the generalized coordinates of the manipulator, which, in the case of platform manipulators, are the lengths of the extensible arms connecting the foundation and the moving platform. The method is constructed in such a way that the solution of the direct kinematic problem reduces to solving the analytical equation of the plane in which the moving platform is situated. The equation of the required plane is built from three points, which in this case are the attachment points of the moving platform joints. To determine the values of the joint coordinates, a system of nine nonlinear equations is generated; note that all equations of the system share the same type of nonlinearity. The physical meaning of all nine equations of the system is the Euclidean distance between points of the manipulator. The location and orientation of the moving platform is represented as a homogeneous transformation matrix, whose translation and rotation components can be defined through the required plane. The obtained theoretical results are intended for use in a decision support system for the comprehensive study of multi-sectional manipulators with parallel kinematics, to describe the geometrically similar 3D-prototype of the
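
    The nine-equation formulation summarized in this record (six leg-length distances plus three fixed platform-side distances, all Euclidean) can be sketched numerically. The snippet below is a minimal illustration, not the authors' method: the base geometry, platform pose, and initial guess are invented, and a general-purpose nonlinear solver stands in for the paper's plane-equation construction.

```python
# Hedged sketch: direct kinematics of a 6-3 Stewart-Gough platform as nine
# squared-distance equations in the nine coordinates of the three platform
# joint points, solved with a generic root finder. Geometry is hypothetical.
import numpy as np
from scipy.optimize import fsolve

# Hypothetical base anchors (pairs j = 2i, 2i+1 attach to platform joint i).
base = np.array([
    [ 1.0,  0.2, 0.0], [ 0.9,  0.5, 0.0],
    [-0.3,  0.9, 0.0], [-0.6,  0.7, 0.0],
    [-0.6, -0.7, 0.0], [-0.3, -0.9, 0.0],
])

# A known pose used only to generate consistent leg lengths for the demo.
true_joints = np.array([
    [ 0.5,  0.1, 1.0],
    [-0.2,  0.5, 1.0],
    [-0.2, -0.5, 1.0],
])
legs = np.array([np.linalg.norm(true_joints[j // 2] - base[j]) for j in range(6)])
sides = np.array([np.linalg.norm(true_joints[i] - true_joints[(i + 1) % 3])
                  for i in range(3)])

def equations(x):
    p = x.reshape(3, 3)  # the three unknown platform joint points
    res = [np.dot(p[j // 2] - base[j], p[j // 2] - base[j]) - legs[j] ** 2
           for j in range(6)]                              # six leg lengths
    res += [np.dot(p[i] - p[(i + 1) % 3], p[i] - p[(i + 1) % 3]) - sides[i] ** 2
            for i in range(3)]                             # three platform sides
    return res

guess = np.array([[0.4, 0.0, 0.9], [-0.1, 0.4, 0.9], [-0.1, -0.4, 0.9]]).ravel()
solution = fsolve(equations, guess).reshape(3, 3)
print(np.round(solution, 4))  # a pose consistent with the generated leg lengths
```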

  13. Simultaneous detection of IgG antibodies associated with viral hemorrhagic fever by a multiplexed Luminex-based immunoassay.

    Science.gov (United States)

    Wu, Wei; Zhang, Shuo; Qu, Jing; Zhang, Quanfu; Li, Chuan; Li, Jiandong; Jin, Cong; Liang, Mifang; Li, Dexin

    2014-07-17

    Viral hemorrhagic fevers (VHFs) are worldwide diseases caused by several kinds of viruses. With the emergence of new viruses, advanced diagnostic methods are urgently needed for identification of VHFs. Based on Luminex xMAP technology, a rapid, sensitive, multi-pathogen and high-throughput method which can simultaneously detect hemorrhagic fever virus (HFV)-specific IgG antibodies was developed. Recombinant antigens of nine HFVs, including Hantaan virus (HTNV), Seoul virus (SEOV), Puumala virus (PUUV), Andes virus (ANDV), Sin Nombre virus (SNV), Crimean-Congo hemorrhagic fever virus (CCHFV), Rift Valley fever virus (RVFV), severe fever with thrombocytopenia syndrome bunyavirus (SFTSV) and dengue virus (DENV), were produced and purified from a prokaryotic expression system, and the influence of the coupling amount was investigated. Cross-reactions among antigens and their rabbit immune sera were evaluated. Serum samples collected from 51 laboratory-confirmed hemorrhagic fever with renal syndrome (HFRS) patients, 43 confirmed SFTS patients and 88 healthy donors were analyzed. Results showed that the recombinant nucleocapsid proteins of the five viruses belonging to the genus Hantavirus had serological cross-reactivity with their corresponding rabbit immune sera, but no apparent cross-reactivity with the immune sera of the other four viruses. Evaluation of this new method with clinical serum samples showed 98.04% diagnostic sensitivity for HFRS and 90.70% for SFTS detection, and the specificity ranged from 66.67% to 100.00%. This multiplexed Luminex-based immunoassay, established here for the first time, provides a potentially reliable diagnostic tool for IgG antibody detection of VHFs. Copyright © 2014 Elsevier B.V. All rights reserved.

  14. Evaluation of multiplex assay platforms for detection of influenza hemagglutinin subtype specific antibody responses.

    Science.gov (United States)

    Li, Zhu-Nan; Weber, Kimberly M; Limmer, Rebecca A; Horne, Bobbi J; Stevens, James; Schwerzmann, Joy; Wrammert, Jens; McCausland, Megan; Phipps, Andrew J; Hancock, Kathy; Jernigan, Daniel B; Levine, Min; Katz, Jacqueline M; Miller, Joseph D

    2017-05-01

    Influenza hemagglutination inhibition (HI) and virus microneutralization (MN) assays are widely used for seroprevalence studies. However, these assays have limited field portability and are difficult to fully automate for high throughput laboratory testing. To address these issues, three multiplex influenza subtype-specific antibody detection assays were developed using recombinant hemagglutinin antigens in combination with Chembio, Luminex ® , and ForteBio ® platforms. Assay sensitivity, specificity, and subtype cross-reactivity were evaluated using a panel of well characterized human sera. Compared to the traditional HI, assay sensitivity ranged from 87% to 92% and assay specificity in sera collected from unexposed persons ranged from 65% to 100% across the platforms. High assay specificity (86-100%) for A(H5N1) rHA was achieved for sera from persons exposed or unexposed to heterosubtypic influenza HAs. In contrast, assay specificity for A(H1N1)pdm09 rHA using sera collected from A/Vietnam/1204/2004 (H5N1) vaccinees in 2008 was low (22-30%) in all platforms. Although cross-reactivity against rHA subtype proteins was observed in each assay platform, the correct subtype-specific responses were identified 78%-94% of the time when paired samples were available for analysis. These results show that high throughput and portable multiplex assays that incorporate rHA can be used to identify influenza subtype-specific infections. Published by Elsevier B.V.
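
    The sensitivity and specificity figures in this record reduce to simple 2x2-table arithmetic against the HI reference. The sketch below shows that calculation; the counts are invented for illustration and are not the study's data.

```python
# Hedged sketch: sensitivity and specificity of a multiplex antibody assay
# relative to an HI reference, from a 2x2 table of hypothetical counts.
def sens_spec(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of HI-positive sera called positive
    specificity = tn / (tn + fp)   # fraction of HI-negative sera called negative
    return sensitivity, specificity

# Hypothetical counts for one platform and one rHA antigen.
sens, spec = sens_spec(tp=46, fn=4, tn=39, fp=11)
print(f"sensitivity {sens:.0%}, specificity {spec:.0%}")  # 92%, 78%
```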

  15. Detection of enteropathogens associated with travelers’ diarrhea using a multiplex Luminex-based assay performed on stool samples smeared on Whatman FTA Elute cards

    Science.gov (United States)

    Lalani, Tahaniyat; Tisdale, Michele D; Maguire, Jason D; Wongsrichanalai, Chansuda; Riddle, Mark S; Tribble, David R

    2015-01-01

    We evaluated the limits of detection (LoD) for an 11-plex PCR-Luminex assay performed on Whatman FTA Elute cards smeared with stool containing pathogens associated with travelers’ diarrhea. LoDs ranged between 10(2) and 10(5) CFU, PFU, or cysts/g for most pathogens except Cryptosporidium. Campylobacter and norovirus LoDs increased with prolonged storage of cards. PMID:26072151

  16. Development and performance assessment of a luminex xMAP® direct hybridization assay for the detection and identification of indoor air fungal contamination.

    Science.gov (United States)

    Libert, Xavier; Packeu, Ann; Bureau, Fabrice; Roosens, Nancy H; De Keersmaecker, Sigrid C J

    2017-01-01

    Considered a public health problem, indoor fungal contamination is generally monitored using classical protocols based on culturing. However, this culture dependency could influence the representativeness of the fungal population detected in an analyzed sample, as this population includes a dead and uncultivable fraction. Moreover, culture-based protocols are often time-consuming. In this context, molecular tools are a powerful alternative, especially those allowing multiplexing. In this study, a Luminex xMAP® assay was developed for the simultaneous detection of 10 fungal species that are most frequently found in indoor air and that may cause health problems. This xMAP® assay was found to be sensitive, i.e., its limit of detection ranges between 0.01 and 0.05 ng of gDNA. The assay was subsequently tested with environmental air samples which were also analyzed with a classical protocol. All the species identified with the classical method were also detected with the xMAP® assay, however in a shorter time frame. These results demonstrate that the Luminex xMAP® fungal assay developed in this study could contribute to the improvement of public health, and specifically to the treatment of indoor fungal contamination.

  17. Development and performance assessment of a luminex xMAP® direct hybridization assay for the detection and identification of indoor air fungal contamination.

    Directory of Open Access Journals (Sweden)

    Xavier Libert

    Full Text Available Considered a public health problem, indoor fungal contamination is generally monitored using classical protocols based on culturing. However, this culture dependency could influence the representativeness of the fungal population detected in an analyzed sample, as this population includes a dead and uncultivable fraction. Moreover, culture-based protocols are often time-consuming. In this context, molecular tools are a powerful alternative, especially those allowing multiplexing. In this study, a Luminex xMAP® assay was developed for the simultaneous detection of 10 fungal species that are most frequently found in indoor air and that may cause health problems. This xMAP® assay was found to be sensitive, i.e., its limit of detection ranges between 0.01 and 0.05 ng of gDNA. The assay was subsequently tested with environmental air samples which were also analyzed with a classical protocol. All the species identified with the classical method were also detected with the xMAP® assay, however in a shorter time frame. These results demonstrate that the Luminex xMAP® fungal assay developed in this study could contribute to the improvement of public health, and specifically to the treatment of indoor fungal contamination.

  18. Real-time analytics techniques to analyze and visualize streaming data

    CERN Document Server

    Ellis, Byron

    2014-01-01

    Construct a robust end-to-end solution for analyzing and visualizing streaming data Real-time analytics is the hottest topic in data analytics today. In Real-Time Analytics: Techniques to Analyze and Visualize Streaming Data, expert Byron Ellis teaches data analysts technologies to build an effective real-time analytics platform. This platform can then be used to make sense of the constantly changing data that is beginning to outpace traditional batch-based analysis platforms. The author is among a very few leading experts in the field. He has a prestigious background in research, development,

  19. eTRIKS platform: Conception and operation of a highly scalable cloud-based platform for translational research and applications development.

    Science.gov (United States)

    Bussery, Justin; Denis, Leslie-Alexandre; Guillon, Benjamin; Liu, Pengfeï; Marchetti, Gino; Rahal, Ghita

    2018-04-01

    We describe the genesis, design and evolution of a computing platform designed and built to improve the success rate of biomedical translational research. The eTRIKS project platform was developed to securely host heterogeneous types of data and to provide an optimal environment for running tranSMART analytical applications. Many types of data can now be hosted, including multi-OMICS data, preclinical laboratory data and clinical information, including longitudinal data sets. During the last two years, the platform has matured into a robust translational research knowledge management system that is able to host other data mining applications and support the development of new analytical tools. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Analysis of Luminex-based algorithms to define unacceptable HLA antibodies in CDC-crossmatch negative kidney transplant recipients.

    Science.gov (United States)

    Zecher, Daniel; Bach, Christian; Preiss, Adrian; Staudner, Christoph; Utpatel, Kirsten; Evert, Matthias; Jung, Bettina; Bergler, Tobias; Böger, Carsten A; Spriewald, Bernd M; Banas, Bernhard

    2018-02-20

    HLA-specific antibodies detected by solid phase assays are increasingly used to define unacceptable HLA antigen mismatches (UAM) prior to renal transplantation. The accuracy of this approach is unclear. Day-of-transplant sera from 211 CDC-crossmatch-negative patients were retrospectively analyzed for donor-specific anti-HLA antibodies (DSA) using Luminex technology. HLA were defined as UAM if DSA had mean fluorescence intensity (MFI) above (I) 3000 (for patients retransplanted and those with DSA against HLA class I and II) or 5000 (all other patients); (II) 5000 for HLA-A, -B and -DR and 10,000 for HLA-DQ; or (III) 10,000 (all HLA). We then studied the accuracy of these algorithms to identify patients with antibody-mediated rejection (AMR) and graft loss. UAM were also determined in 256 transplant candidates and virtual panel-reactive antibody (vPRA) levels calculated. At transplantation, 67/211 patients had DSA. Of these, 31 (algorithm I), 24 (II) and 17 (III) had UAM. 9 (I and II) and 8 (III) of 11 early AMR episodes and 7 (I), 6 (II) and 5 (III) of 9 graft losses occurred in UAM-positive patients during 4.9 years of follow-up. Algorithms (I) and (II) identified patients with persistently lower GFR even in the absence of overt AMR. 23-33% of waiting list patients had UAM, with median vPRA of 69.2-79.1%. Algorithms (I) and (II) had comparable efficacy but were superior to (III) in identifying at-risk patients at an acceptable false positive rate. However, Luminex-defined UAM significantly restrict the donor pool of affected patients, which might prolong waiting time.
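
    The three MFI cut-off algorithms in this record are simple enough to state as code. The sketch below applies them to a hypothetical list of donor-specific antibodies; the field names, example antigens, and MFI values are invented, and only the thresholds come from the abstract.

```python
# Hedged sketch: the three UAM cut-off algorithms described above, applied to
# a hypothetical DSA list. Data structures are illustrative, not the study's.
def uam_algorithm_1(dsa_list, retransplant, has_class_1_and_2_dsa):
    cutoff = 3000 if (retransplant or has_class_1_and_2_dsa) else 5000
    return [d["antigen"] for d in dsa_list if d["mfi"] > cutoff]

def uam_algorithm_2(dsa_list):
    # 5000 for HLA-A, -B, -DR; 10,000 for HLA-DQ
    return [d["antigen"] for d in dsa_list
            if d["mfi"] > (10000 if d["locus"] == "DQ" else 5000)]

def uam_algorithm_3(dsa_list):
    return [d["antigen"] for d in dsa_list if d["mfi"] > 10000]

dsa = [
    {"antigen": "A2",   "locus": "A",  "mfi": 4200},
    {"antigen": "DQ7",  "locus": "DQ", "mfi": 8600},
    {"antigen": "DR11", "locus": "DR", "mfi": 12500},
]
print(uam_algorithm_1(dsa, retransplant=True, has_class_1_and_2_dsa=False))
# ['A2', 'DQ7', 'DR11']
print(uam_algorithm_2(dsa))  # ['DR11']
print(uam_algorithm_3(dsa))  # ['DR11']
```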

  1. A platform analytical quality by design (AQbD) approach for multiple UHPLC-UV and UHPLC-MS methods development for protein analysis.

    Science.gov (United States)

    Kochling, Jianmei; Wu, Wei; Hua, Yimin; Guan, Qian; Castaneda-Merced, Juan

    2016-06-05

    A platform analytical quality by design (AQbD) approach for methods development is presented in this paper. This approach is not limited to method development following the same logical AQbD process; it is also exploited across a range of applications in methods development with commonality in equipment and procedures. As demonstrated by the development process of three methods, the systematic approach offers a thorough understanding of each method's scientific strength. The knowledge gained from the UHPLC-UV peptide mapping method can be easily transferred to the UHPLC-MS oxidation method and the UHPLC-UV C-terminal heterogeneity method for the same protein. In addition, the platform AQbD method development strategy ensures that method robustness is built in during development. In early phases, a good method can generate reliable data for product development, allowing confident decision making. Methods generated following the AQbD approach have great potential for avoiding extensive post-approval analytical method changes, while in the commercial phase, high quality data ensure timely data release, reduced regulatory risk, and lowered lab operational cost. Moreover, the large, reliable database and knowledge gained during AQbD method development provide strong justifications during regulatory filing for the selection of important parameters or parameter change needs for method validation, and help to justify the removal of unnecessary tests from product specifications. Copyright © 2016 Elsevier B.V. All rights reserved.

  2. Miniaturized Analytical Platforms From Nanoparticle Components: Studies in the Construction, Characterization, and High-Throughput Usage of These Novel Architectures

    Energy Technology Data Exchange (ETDEWEB)

    Pris, Andrew David [Iowa State Univ., Ames, IA (United States)

    2003-01-01

    The scientific community has recently experienced an overall effort to reduce the physical size of many experimental components to the nanometer size range. This size regime is unique, as its characteristics involve aspects of pure physics, biology, and chemistry. One extensively studied example of a nanometer-sized experimental component, which acts as a junction between these three principal scientific disciplines, is deoxyribonucleic acid (DNA) or ribonucleic acid (RNA). These biopolymers not only contain the biological genetic guide to code for the production of life-sustaining materials, but are also being probed by physicists as a means to create electrical circuits, and furthermore serve as controllable architectural and sensor motifs in the chemical disciplines. Possibly the most common nano-sized components shared among these sciences are nanoparticles composed of a variety of materials. The cross-discipline employment of nanoparticles is evident from the vast amount of literature that has been produced by each of the individual communities within the last decade. Along these cross-discipline lines, this dissertation examines the use of several different types of nanoparticles with a wide array of surface chemistries to understand their adsorption properties and to construct unique miniaturized analytical and immunoassay platforms. This introduction will act as a literature review to provide key information regarding the synthesis and surface chemistries of several types of nanoparticles. This material will set the stage for a discussion of assembling ordered arrays of nanoparticles into functional platforms, architectures, and sensors. The introduction will also include a short explanation of the atomic force microscope that is used throughout the thesis to characterize the nanoparticle-based structures. Following the Introduction, four research chapters are presented as separate manuscripts. Chapter 1 examines the self-assembly of polymeric nanoparticles

  3. A dual platform for selective analyte enrichment and ionization in mass spectrometry using aptamer-conjugated graphene oxide.

    Science.gov (United States)

    Gulbakan, Basri; Yasun, Emir; Shukoor, M Ibrahim; Zhu, Zhi; You, Mingxu; Tan, Xiaohong; Sanchez, Hernan; Powell, David H; Dai, Hongjie; Tan, Weihong

    2010-12-15

    This study demonstrates the use of aptamer-conjugated graphene oxide as an affinity extraction and detection platform for analytes from complex biological media. We have shown that cocaine and adenosine can be selectively enriched from plasma samples and that direct mass spectrometric readouts can be obtained without a matrix and with greatly improved signal-to-noise ratios. Aptamer-conjugated graphene oxide has clear advantages in target enrichment and in generating highly efficient ionization of target molecules for mass spectrometry. These results demonstrate the utility of the approach for analysis of small molecules in real biological samples.

  4. Detection of enteropathogens associated with travelers' diarrhea using a multiplex Luminex-based assay performed on stool samples smeared on Whatman FTA Elute cards.

    Science.gov (United States)

    Lalani, Tahaniyat; Tisdale, Michele D; Maguire, Jason D; Wongsrichanalai, Chansuda; Riddle, Mark S; Tribble, David R

    2015-09-01

    We evaluated the limits of detection (LoD) for an 11-plex PCR-Luminex assay performed on Whatman(™) FTA Elute cards smeared with stool containing pathogens associated with travelers' diarrhea. LoDs ranged from 10(2) to 10(5) CFU, PFU, or cysts/g for most pathogens except Cryptosporidium. Campylobacter and norovirus LoDs increased with prolonged storage of cards. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. TrajAnalytics: An Open-Source, Web-Based Visual Analytics Software of Urban Trajectory Data

    OpenAIRE

    Zhao, Ye

    2018-01-01

    We developed a software system named TrajAnalytics, which explicitly supports interactive visual analytics of the emerging trajectory data. It offers data management capabilities and supports various data queries by leveraging web-based computing platforms. It allows users to visually conduct queries and make sense of massive trajectory data.

  6. A Class of Generalized Gough-Stewart Platforms Used for Effectively Obtaining Dynamic Isotropy – An Analytical Study

    Directory of Open Access Journals (Sweden)

    Afzali-Far Behrouz

    2015-01-01

    Full Text Available In this paper, we propose a class of Generalized Gough-Stewart Platforms (GGSPs) used, as a novel approach, to eliminate the classical isotropic constraint of GSPs (hexapods). GGSPs are based on the standard GSP architecture with additional rotations of the three strut-pairs. Despite the architectural generalization introduced in GGSPs, they do not require much more effort to fabricate. This is due to the fact that all the struts (actuators) can be chosen identical, similar to standard GSPs. We analytically show how effectively the classical isotropic constraint is removed and that sufficient simplicity is still retained. Furthermore, this paper gives an intuitive understanding of dynamic isotropy in GGSPs as well as GSPs.

  7. Operational Efficiencies and Simulated Performance of Big Data Analytics Platform over Billions of Patient Records of a Hospital System

    Directory of Open Access Journals (Sweden)

    Dillon Chrimes

    2017-01-01

    Full Text Available Big Data Analytics (BDA) is important for utilizing data from hospital systems to reduce healthcare costs. BDA enables queries of large volumes of patient data in an interactively dynamic way for healthcare. The study objective was to establish a high-performance, interactive BDA platform for a hospital system. A Hadoop/MapReduce framework was established at the University of Victoria (UVic) with Compute Canada/WestGrid to form a Healthcare BDA (HBDA) platform with HBase (a NoSQL database) using hospital-specific metadata and file ingestion. Patient data profiles and clinical workflow were derived from the Vancouver Island Health Authority (VIHA), Victoria, BC, Canada. The proof-of-concept implementation tested patient data representative of the entire provincial hospital system. We cross-referenced all data profiles and metadata with real patient data used in clinical reporting. Query performance was tested with Apache tools in Hadoop’s ecosystem. At optimized iteration, Hadoop Distributed File System (HDFS) ingestion required three seconds, but HBase required four to twelve hours to complete the Reducer of MapReduce. HBase bulkloads took a week for one billion (10 TB) and over two months for three billion (30 TB) records. Simple and complex queries returned results in about two seconds for one and three billion records, respectively. Apache Drill outperformed Apache Spark; however, it was restricted to running more simplified queries, with poor usability for healthcare. Jupyter on Spark offered high performance and customization to run all queries simultaneously, with high usability. The BDA platform with HBase distributed over Hadoop performed successfully; however, some inconsistencies of MapReduce limited operational efficiencies. The importance of Hadoop/MapReduce for the platform's performance is discussed.
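
    The "Jupyter on Spark" queries described in this record are, in form, ordinary SQL over distributed patient-event tables. The sketch below shows a query of that shape against a tiny simulated DataFrame; the column names and values are invented and do not reproduce the study's metadata model.

```python
# Hedged sketch: an interactive Spark SQL query over simulated patient events,
# illustrating the query style only; schema and data are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hbda-demo").getOrCreate()

events = spark.createDataFrame(
    [
        ("P001", "2016-03-02", "ED", "J18.9"),
        ("P001", "2016-03-05", "Inpatient", "J18.9"),
        ("P002", "2016-03-02", "ED", "I21.3"),
    ],
    ["patient_id", "admit_date", "encounter_type", "diagnosis_code"],
)
events.createOrReplaceTempView("patient_events")

# Count encounters per diagnosis code: the shape of a simple clinical query.
spark.sql("""
    SELECT diagnosis_code, COUNT(*) AS encounters
    FROM patient_events
    GROUP BY diagnosis_code
    ORDER BY encounters DESC
""").show()

spark.stop()
```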

  8. Learning Analytics across a Statewide System

    Science.gov (United States)

    Buyarski, Catherine; Murray, Jim; Torstrick, Rebecca

    2017-01-01

    This chapter explores lessons learned from two different learning analytics efforts at a large, public, multicampus university--one internally developed and one vended platform. It raises questions about how to best use analytics to support students while keeping students responsible for their own learning and success.

  9. Getting started with Greenplum for big data analytics

    CERN Document Server

    Gollapudi, Sunila

    2013-01-01

    Standard tutorial-based approach. "Getting Started with Greenplum for Big Data Analytics" is great for data scientists and data analysts with a basic knowledge of Data Warehousing and Business Intelligence platforms who are new to Big Data and who are looking to get a good grounding in how to use the Greenplum Platform. It's assumed that you will have some experience with database design and programming as well as be familiar with analytics tools like R and Weka.

  10. Big Data Analytics with Datalog Queries on Spark.

    Science.gov (United States)

    Shkapsky, Alexander; Yang, Mohan; Interlandi, Matteo; Chiu, Hsuan; Condie, Tyson; Zaniolo, Carlo

    2016-01-01

    There is great interest in exploiting the opportunity provided by cloud computing platforms for large-scale analytics. Among these platforms, Apache Spark is growing in popularity for machine learning and graph analytics. Developing efficient complex analytics in Spark requires deep understanding of both the algorithm at hand and the Spark API or subsystem APIs (e.g., Spark SQL, GraphX). Our BigDatalog system addresses the problem by providing concise declarative specification of complex queries amenable to efficient evaluation. Towards this goal, we propose compilation and optimization techniques that tackle the important problem of efficiently supporting recursion in Spark. We perform an experimental comparison with other state-of-the-art large-scale Datalog systems and verify the efficacy of our techniques and effectiveness of Spark in supporting Datalog-based analytics.
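
    BigDatalog itself is the authors' system and is not reproduced here; to see why native recursion support matters, the sketch below evaluates a recursive reachability query (transitive closure) the hand-written way in plain PySpark, iterating a join until a fixpoint is reached, which is exactly the loop a single recursive Datalog rule such as reach(X,Z) :- reach(X,Y), edge(Y,Z) would replace.

```python
# Hedged sketch: hand-written fixpoint iteration for a recursive query in
# plain PySpark, standing in for what a recursive Datalog rule expresses
# declaratively. Edge data are invented for illustration.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tc-demo").getOrCreate()
sc = spark.sparkContext

edges = sc.parallelize([(1, 2), (2, 3), (3, 4)])   # edge(X, Y) facts
closure = edges                                     # reach(X, Y) :- edge(X, Y).

while True:
    # reach(X, Z) :- reach(X, Y), edge(Y, Z): join on the shared variable Y.
    new_paths = (closure.map(lambda p: (p[1], p[0]))  # key reach pairs by Y
                        .join(edges)                  # (Y, (X, Z))
                        .map(lambda kv: (kv[1][0], kv[1][1])))
    updated = closure.union(new_paths).distinct().cache()
    if updated.count() == closure.count():
        break                                         # fixpoint reached
    closure = updated

print(sorted(closure.collect()))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]

spark.stop()
```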

  11. Opportunity and Challenges for Migrating Big Data Analytics in Cloud

    Science.gov (United States)

    Amitkumar Manekar, S.; Pradeepini, G., Dr.

    2017-08-01

    Big Data Analytics is a big term nowadays. With ever more demanding and scalable data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. The trend towards “big data-as-a-service” is now talked about everywhere. On one hand, cloud-based big data analytics directly tackles ongoing issues of scale, speed, and cost; on the other hand, researchers are still working to solve security and other real-time problems of big data migration to cloud-based platforms. This article is specially focused on finding possible ways to migrate big data to the cloud. Technology that supports coherent data migration and makes big data analytics possible on cloud platforms is in strong demand for a new era of growth. This article also gives information about available technologies and techniques for the migration of big data to the cloud.

  12. Towards a Market Entry Framework for Digital Payment Platforms

    DEFF Research Database (Denmark)

    Kazan, Erol; Damsgaard, Jan

    2016-01-01

    This study presents a framework to understand and explain the design and configuration of digital payment platforms and how these platforms create conditions for market entries. By embracing the theoretical lens of platform envelopment, we employed a multiple and comparative case study in a European setting, using our framework as an analytical lens to assess market-entry conditions. We found that digital payment platforms have acquired market entry capabilities, which is achieved through strategic platform design (i.e., platform development and service distribution) and technology design (i.e., issuing evolutionary and revolutionary payment instruments). The studied cases reveal that digital platforms leverage payment services as a means to bridge and converge core and adjacent platform markets. In so doing, platform envelopment strengthens firms’ market position in their respective...

  13. Functionalization and Characterization of Nanomaterial Gated Field-Effect Transistor-Based Biosensors and the Design of a Multi-Analyte Implantable Biosensing Platform

    Science.gov (United States)

    Croce, Robert A., Jr.

    Advances in semiconductor research and complementary-metal-oxide semiconductor fabrication allow for the design and implementation of miniaturized metabolic monitoring systems, as well as advanced biosensor design. The first part of this dissertation will focus on the design and fabrication of nanomaterial (single-walled carbon nanotube and quantum dot) gated field-effect transistors configured as protein sensors. These novel device structures have been functionalized with single-stranded DNA aptamers, and have shown sensor operation towards the protein thrombin. Such advanced transistor-based sensing schemes present considerable advantages over traditional sensing methodologies in view of their miniaturization, low cost, and facile fabrication, paving the way for the ultimate realization of a multi-analyte lab-on-chip. The second part of this dissertation focuses on the design and fabrication of a needle-implantable glucose sensing platform which is based solely on photovoltaic powering and optical communication. By employing these powering and communication schemes, this design negates the need for bulky on-chip RF-based transmitters and batteries in an effort to attain the extreme miniaturization required for needle-implantable/extractable applications. A complete single-sensor system coupled with a miniaturized amperometric glucose sensor has been demonstrated, establishing the viability of this technology. Furthermore, an optical selection scheme of multiple potentiostats for four different analytes (glucose, lactate, O2 and CO2) as well as the optical transmission of sensor data has been designed for multi-analyte applications. The last part of this dissertation will focus on the development of a computational model for the amperometric glucose sensors employed in the aforementioned implantable platform. This model has been applied to single-layer single-enzyme systems, as well as multi-layer (single enzyme) systems utilizing glucose flux limiting layer-by-layer assembled

  14. Proteome Analysis of Subsarcolemmal Cardiomyocyte Mitochondria: A Comparison of Different Analytical Platforms

    Directory of Open Access Journals (Sweden)

    Francesco Giorgianni

    2014-05-01

    Full Text Available Mitochondria are complex organelles that play critical roles in diverse aspects of cellular function. Heart disease and a number of other pathologies are associated with perturbations in the molecular machinery of the mitochondria. Therefore, comprehensive, unbiased examination of the mitochondrial proteome represents a powerful approach toward system-level insights into disease mechanisms. A crucial aspect in proteomics studies is design of bioanalytical strategies that maximize coverage of the complex repertoire of mitochondrial proteins. In this study, we evaluated the performance of gel-based and gel-free multidimensional platforms for profiling of the proteome in subsarcolemmal mitochondria harvested from rat heart. We compared three different multidimensional proteome fractionation platforms: polymeric reversed-phase liquid chromatography at high pH (PLRP), sodium dodecyl sulfate-polyacrylamide gel electrophoresis (SDS-PAGE), and isoelectric focusing (IEF) separations, combined with liquid chromatography-mass spectrometry/mass spectrometry (LC-MS/MS) and bioinformatics for protein identification. Across all three platforms, a total of 1043 proteins were identified. Among the three bioanalytical strategies, SDS-PAGE followed by LC-MS/MS provided the best coverage of the mitochondrial proteome. With this platform, 890 proteins with diverse physicochemical characteristics were identified; the mitochondrial protein panel encompassed proteins with various functional roles including bioenergetics, protein import, and mitochondrial fusion. Taken together, results of this study provide a large-scale view of the proteome in subsarcolemmal mitochondria from the rat heart, and aid in the selection of optimal bioanalytical platforms for differential protein expression profiling of mitochondria in health and disease.

  15. Cloud-Based Software Platform for Smart Meter Data Management

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts

    of the so-called big data possible. This can improve energy management, e.g., help utility companies to forecast energy loads and improve services, and help households to manage energy usage and save money. In this regard, the proposed paper focuses on building an innovative software platform for smart...... their knowledge; a scalable data analytics platform for data mining over big data sets for energy demand forecasting and consumption discovery; data as a service for other applications using smart meter data; and a portal for visualizing data analytics results. The design will incorporate hybrid clouds, including Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), which are suitable for on-demand provisioning, massive scaling, and manageability. Besides, the design will impose extensibility, efficiency, and high availability on the system. The paper will evaluate the system comprehensively...

  16. Rheem: Enabling Multi-Platform Task Execution

    KAUST Repository

    Agrawal, Divy; Kruse, Sebastian; Ouzzani, Mourad; Papotti, Paolo; Quiane-Ruiz, Jorge-Arnulfo; Tang, Nan; Zaki, Mohammed J.; Ba, Lamine; Berti-Equille, Laure; Chawla, Sanjay; Elmagarmid, Ahmed; Hammady, Hossam; Idris, Yasser; Kaoudi, Zoi; Khayyat, Zuhair

    2016-01-01

    Many emerging applications, from domains such as healthcare and oil & gas, require several data processing systems for complex analytics. This demo paper showcases Rheem, a framework that provides multi-platform task execution for such applications. It features a three-layer data processing abstraction and a new query optimization approach for multi-platform settings. We will demonstrate the strengths of Rheem by using real-world scenarios from three different applications, namely, machine learning, data cleaning, and data fusion. © 2016 ACM.

  17. Rheem: Enabling Multi-Platform Task Execution

    KAUST Repository

    Agrawal, Divy

    2016-06-16

    Many emerging applications, from domains such as healthcare and oil & gas, require several data processing systems for complex analytics. This demo paper showcases Rheem, a framework that provides multi-platform task execution for such applications. It features a three-layer data processing abstraction and a new query optimization approach for multi-platform settings. We will demonstrate the strengths of Rheem by using real-world scenarios from three different applications, namely, machine learning, data cleaning, and data fusion. © 2016 ACM.

  18. Development of a swine specific 9-plex Luminex cytokine assay and assessment of immunity after porcine reproductive and respiratory syndrome virus (PRRSV) vaccination: Elevated serum IL-12 levels are not predictive of protect

    Science.gov (United States)

    A Luminex multiplex swine cytokine assay was developed to measure 9 cytokines simultaneously in pig serum and tested in a porcine reproductive and respiratory syndrome virus (PRRSV) vaccine/challenge study. This assay detects innate (IL-1β, IL-6, IL-8, IFNα, TNFα); regulatory (IL-10), Th1 (IL-12, I...

  19. Towards the Smart World. Smart Platform: Infrastructure and Analytics

    CSIR Research Space (South Africa)

    Velthausz, D

    2012-10-01

    Full Text Available In this presentation the author outlines the 'smart world' concept and how technology (smart infrastructure, analytics) can foster smarter cities, smarter regions and a smarter world....

  20. Data analytics in the ATLAS Distributed Computing

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2015-01-01

    The ATLAS Data analytics effort is focused on creating systems which provide the ATLAS ADC with new capabilities for understanding distributed systems and overall operational performance. These capabilities include: warehousing information from multiple systems (the production and distributed analysis system - PanDA, the distributed data management system - Rucio, the file transfer system, various monitoring services, etc.); providing a platform to execute arbitrary data mining and machine learning algorithms over aggregated data; satisfying a variety of use cases for different user roles; and hosting new third party analytics services on a scalable compute platform. We describe the implemented system where: data sources are existing RDBMS (Oracle) and Flume collectors; a Hadoop cluster is used to store the data; native Hadoop and Apache Pig scripts are used for data aggregation; and R for in-depth analytics. Part of the data is indexed in ElasticSearch so both simpler investigations and complex dashboards can be made ...
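
    To make the aggregation step above concrete, the sketch below groups job-monitoring records by site and computes completion statistics, which is the kind of summary the warehoused data supports before deeper analysis in R. The field names are assumptions for illustration only and do not reflect the actual PanDA or Rucio schemas.

```python
# Hypothetical sketch of the aggregation step described above: summarizing
# job-monitoring events per site before deeper analysis. Field names are
# assumptions, not the actual PanDA/Rucio schemas.
from collections import defaultdict

def summarize_jobs_by_site(events):
    """events: iterable of dicts like {"site": str, "status": str, "wall_seconds": float}."""
    totals = defaultdict(lambda: {"jobs": 0, "failed": 0, "wall_seconds": 0.0})
    for ev in events:
        t = totals[ev["site"]]
        t["jobs"] += 1
        t["failed"] += 1 if ev["status"] == "failed" else 0
        t["wall_seconds"] += ev["wall_seconds"]
    return {
        site: {
            "jobs": t["jobs"],
            "failure_rate": t["failed"] / t["jobs"],
            "mean_wall_seconds": t["wall_seconds"] / t["jobs"],
        }
        for site, t in totals.items()
    }

# Example usage
events = [
    {"site": "SITE_A", "status": "finished", "wall_seconds": 3600},
    {"site": "SITE_A", "status": "failed", "wall_seconds": 120},
    {"site": "SITE_B", "status": "finished", "wall_seconds": 5400},
]
print(summarize_jobs_by_site(events))
```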

  1. Simultaneous Genomic Detection of Multiple Enteric Bacterial and Viral Pathogens, Including Sars-CoV and RVFV

    National Research Council Canada - National Science Library

    Payne, S; Peters, C. J. (Clarence James), 1940; Makino, S; Oliver, K; Weiss, C; Kornguth, S; Carruthers, L; Chin, R

    2004-01-01

    ...) associated with the SARS-associated coronavirus (SARS-CoV) and Rift Valley Fever Virus (RVFV) has been developed. This system is based upon the Luminex xMAP" System, a multiplexed assay platform that combines high sample throughput...

  2. Performance of commercial platforms for rapid genotyping of polymorphisms affecting warfarin dose.

    Science.gov (United States)

    King, Cristi R; Porche-Sorbet, Rhonda M; Gage, Brian F; Ridker, Paul M; Renaud, Yannick; Phillips, Michael S; Eby, Charles

    2008-06-01

    Initiation of warfarin therapy is associated with bleeding owing to its narrow therapeutic window and unpredictable therapeutic dose. Pharmacogenetic-based dosing algorithms can improve accuracy of initial warfarin dosing but require rapid genotyping for cytochrome P-450 2C9 (CYP2C9) *2 and *3 single nucleotide polymorphisms (SNPs) and a vitamin K epoxide reductase (VKORC1) SNP. We evaluated 4 commercial systems: INFINITI analyzer (AutoGenomics, Carlsbad, CA), Invader assay (Third Wave Technologies, Madison, WI), Tag-It Mutation Detection assay (Luminex Molecular Diagnostics, formerly Tm Bioscience, Toronto, Canada), and Pyrosequencing (Biotage, Uppsala, Sweden). We genotyped 112 DNA samples and resolved any discrepancies with bidirectional sequencing. The INFINITI analyzer was 100% accurate for all SNPs and required 8 hours. Invader and Tag-It were 100% accurate for CYP2C9 SNPs, 99% accurate for VKORC1 -1639/3673 SNP, and required 3 hours and 8 hours, respectively. Pyrosequencing was 99% accurate for CYP2C9 *2, 100% accurate for CYP2C9 *3, and 100% accurate for VKORC1 and required 4 hours. Current commercial platforms provide accurate and rapid genotypes for pharmacogenetic dosing during initiation of warfarin therapy.
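
    The dosing algorithms that such genotyping platforms feed typically scale a baseline dose by genotype-specific factors for the CYP2C9 *2/*3 alleles and the VKORC1 -1639 variant. The sketch below shows that structure only; the baseline dose and multipliers are invented placeholders for exposition and are not a published or validated dosing algorithm.

```python
# Illustrative sketch of a genotype-guided initial warfarin dose estimate based
# on CYP2C9 and VKORC1 results. The baseline and multipliers are invented
# placeholders for exposition; this is NOT a validated dosing algorithm.
BASE_WEEKLY_DOSE_MG = 35.0  # assumed population-average starting point

CYP2C9_FACTOR = {           # reduced-function alleles lower the dose
    "*1/*1": 1.00, "*1/*2": 0.85, "*1/*3": 0.70,
    "*2/*2": 0.65, "*2/*3": 0.50, "*3/*3": 0.35,
}
VKORC1_1639_FACTOR = {      # -1639 G>A genotype (warfarin sensitivity)
    "GG": 1.15, "GA": 1.00, "AA": 0.75,
}

def estimated_weekly_dose(cyp2c9: str, vkorc1: str) -> float:
    """Scale the baseline weekly dose by the two genotype-specific factors."""
    return BASE_WEEKLY_DOSE_MG * CYP2C9_FACTOR[cyp2c9] * VKORC1_1639_FACTOR[vkorc1]

# Example: a *1/*3 + AA carrier would start markedly lower than a *1/*1 + GG carrier.
print(round(estimated_weekly_dose("*1/*3", "AA"), 1))  # ~18.4 mg/week
print(round(estimated_weekly_dose("*1/*1", "GG"), 1))  # ~40.2 mg/week
```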

  3. Learning analytics approach of EMMA project

    NARCIS (Netherlands)

    Tammets, Kairit; Brouns, Francis

    2014-01-01

    The EMMA project provides a MOOC platform to aggregate and deliver massive open online courses (MOOCs) in multiple languages from a variety of European universities. Learning analytics play an important role in MOOCs to support the individual needs of the learner.

  4. Analytical characterization using surface-enhanced Raman scattering (SERS) and microfluidic sampling

    International Nuclear Information System (INIS)

    Wang, Chao; Yu, Chenxu

    2015-01-01

    With the rapid development of analytical techniques, it has become much easier to detect chemical and biological analytes, even at very low detection limits. In recent years, techniques based on vibrational spectroscopy, such as surface enhanced Raman spectroscopy (SERS), have been developed for non-destructive detection of pathogenic microorganisms. SERS is a highly sensitive analytical tool that can be used to characterize chemical and biological analytes interacting with SERS-active substrates. However, it has always been a challenge to obtain consistent and reproducible SERS spectroscopic results at complicated experimental conditions. Microfluidics, a tool for highly precise manipulation of small volume liquid samples, can be used to overcome the major drawbacks of SERS-based techniques. High reproducibility of SERS measurement could be obtained in continuous flow generated inside microfluidic devices. This article provides a thorough review of the principles, concepts and methods of SERS-microfluidic platforms, and the applications of such platforms in trace analysis of chemical and biological analytes. (topical review)

  5. Instrument platforms for nano liquid chromatography

    Czech Academy of Sciences Publication Activity Database

    Šesták, Jozef; Moravcová, Dana; Kahle, Vladislav

    2015-01-01

    Vol. 1421, NOV (2015), pp. 2-17 ISSN 0021-9673 R&D Projects: GA MV VG20112015021 Institutional support: RVO:68081715 Keywords: nano liquid chromatography * splitless gradient generation * nano LC platforms Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.926, year: 2015 http://hdl.handle.net/11104/0250900

  6. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    Science.gov (United States)

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  7. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility of the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library "arc4nix" to bridge this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on the remote server and other functions to run on the native Python environment. It uses functional programming and meta-programming to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
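
    The client-server pattern described above can be illustrated with a short sketch: the client assembles the geoprocessing call as a text snippet and submits it to a remote worker that has the GIS stack installed, then reads back the result. The endpoint URL and JSON payload format below are hypothetical and are not the actual arc4nix interface.

```python
# Sketch of the client/server pattern described above: the client composes a small
# Python snippet that calls the GIS library on a remote worker and retrieves the
# result. The endpoint URL and JSON payload format are hypothetical, not the
# actual arc4nix interface.
import json
import urllib.request

SERVER_URL = "http://gis-server.example:8000/run"  # assumed endpoint

def run_remote(code: str, timeout: float = 600.0) -> dict:
    """POST a code snippet to the remote GIS worker and return its JSON reply."""
    payload = json.dumps({"code": code}).encode("utf-8")
    req = urllib.request.Request(
        SERVER_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))

# The geoprocessing call is assembled as text, so the heavy GIS stack only needs
# to exist on the server side.
snippet = 'arcpy.Buffer_analysis("tracks.shp", "tracks_buffer.shp", "10 Kilometers")'
print(run_remote(snippet).get("status"))
```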

  8. Identification of Brucella genus and eight Brucella species by Luminex bead-based suspension array.

    Science.gov (United States)

    Lusk Pfefer, Tina S; Timme, Ruth; Kase, Julie A

    2018-04-01

    Globally, unpasteurized milk products are vehicles for the transmission of brucellosis, a zoonosis responsible for cases of foodborne illness in the United States and elsewhere. Existing PCR assays to detect Brucella species are restricted by the resolution of band sizes on a gel or the number of fluorescent channels in a single real-time system. The Luminex bead-based suspension array is performed in a 96-well plate allowing for high throughput screening of up to 100 targets in one sample with easily discernible results. We have developed an array using the Bio-Plex 200 to differentiate the most common Brucella species: B. abortus, B. melitensis, B. suis, B. suis bv5, B. canis, B. ovis, B. pinnipedia, and B. neotomae, as well as Brucella genus. All probes showed high specificity, with no cross-reaction with non-Brucella strains. We could detect pure DNA from B. abortus, B. melitensis, and genus-level Brucella at concentrations of ≤5 fg/μL. Pure DNA from all other species tested positive at concentrations well below 500 fg/μL and we positively identified B. neotomae in six artificially contaminated cheese and milk products. An intra-laboratory verification further demonstrated the assay's accuracy and robustness in the rapid screening (3-4 h including PCR) of DNA. Published by Elsevier Ltd.

  9. A multi-analyte biosensor for the simultaneous label-free detection of pathogens and biomarkers in point-of-need animal testing.

    Science.gov (United States)

    Ewald, Melanie; Fechner, Peter; Gauglitz, Günter

    2015-05-01

    For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. This platform is the first that does not use imaging techniques but is nevertheless able to perform multi-analyte measurements. It is designed to be portable and cost-effective and therefore allows for point-of-need testing or on-site field-testing with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems like a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies) as well as a panel of real samples (animal sera). The comparison of the working range and limit of detection shows no loss of performance when transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.

  10. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale.

    CERN Document Server

    Magnoni, L; Cordeiro, C; Georgiou, M; Andreeva, J; Khan, A; Smith, D R

    2015-01-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, to process and to serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group, at the CERN IT department, which uses a variety of technologies, each one targeting specific aspects of big-scale distributed data processing (commonly referred to as the lambda-architecture approach). Results of data processing on Hadoop for WLCG data activities mon...

  11. Digital platforms: an analytical framework for identifying and evaluating policy options

    NARCIS (Netherlands)

    van Eijk, N.; Fahy, R.; van Til, H.; Nooren, P.; Stokking, H.; Gelevert, H.

    2015-01-01

    At the request of the Ministry of Economic Affairs, a project consortium of TNO, Ecorys and IViR have developed a framework to analyse policy questions regarding ‘digital platforms’. This framework enables the government to take advantage of the opportunities these platforms offer and to appreciate

  12. Reconfigurable microfluidic platform in ice

    OpenAIRE

    Varejka, M.

    2008-01-01

    Microfluidic devices are popular tools in the biotechnology industry where they provide smaller reagent requirements, high speed of analysis and the possibility for automation. The aim of the project is to make a flexible, biocompatible microfluidic platform adapted to different specific applications, mainly analytical and separation applications, whose parameters and configuration can be changed multiple times by changing the corresponding computer programme. The current project has been sup...

  13. Graphene-Plasmonic Hybrid Platform for Label-Free SERS Biomedical Detection

    Science.gov (United States)

    Wang, Pu

    Surface Enhanced Raman Scattering (SERS) has attracted explosive interest for the wealth of vibrational information it provides with minimally invasive effects on the target analyte. Nanotechnology, especially in the form of noble metal nanoparticles, exhibits unique electromagnetic and chemical characteristics that are exploited to realize ultra-sensitive SERS detection in chemical and biological analysis. Graphene, an atom-thick carbon monolayer, exhibits superior chemical stability and bio-compatibility. A combination of SERS-active metal nanostructures and graphene will create various synergies in SERS. The main objective of this research was to exploit the applications of the graphene-Au tip hybrid platform in SERS. The hybrid platform consists of a periodic Au nano-pyramid substrate to provide reproducible plasmonic enhancement, and the superimposed monolayer graphene sheet, serving as a "built-in" Raman marker. Extensive theoretical and experimental studies were conducted to determine the potential of the hybrid platform as a SERS substrate. Results from both Finite-Difference Time-Domain (FDTD) numerical simulation and Raman scattering of graphene suggested that the hybrid platform boosted a high density of hotspots yielding 1000 times SERS enhancement of graphene bands. Ultra-high sensitivity of the hybrid platform was demonstrated with bio-molecules including dyes, proteins and neurotransmitters. Dopamine and serotonin can be detected and distinguished at 10⁻⁹ M concentration in the presence of human body fluid. Single molecule detection was obtained using a bi-analyte technique. Graphene supported a vibration-mode-dependent SERS chemical enhancement of ~10 to the analyte. Quantitative evaluation of hotspots was presented using spatially resolved Raman mapping of graphene SERS enhancement. Graphene plays a crucial role in quantifying SERS hotspots and paves the path for defining a SERS EF that could be universally applied to various SERS systems. A reproducible and statistically

  14. Design challenges in nanoparticle-based platforms: Implications for targeted drug delivery systems

    Science.gov (United States)

    Mullen, Douglas Gurnett

    Characterization and control of heterogeneous distributions of nanoparticle-ligand components are major design challenges for nanoparticle-based platforms. This dissertation begins with an examination of a poly(amidoamine) (PAMAM) dendrimer-based targeted delivery platform. A folic acid targeted modular platform was developed to target human epithelial cancer cells. Although active targeting was observed in vitro, active targeting was not found in vivo using a mouse tumor model. A major flaw of this platform design was that it did not provide for characterization or control of the component distribution. Motivated by the problems experienced with the modular design, the actual composition of nanoparticle-ligand distributions was examined using a model dendrimer-ligand system. High Pressure Liquid Chromatography (HPLC) resolved the distribution of components in samples with mean ligand/dendrimer ratios ranging from 0.4 to 13. A peak fitting analysis enabled the quantification of the component distribution. Quantified distributions were found to be significantly more heterogeneous than commonly expected and standard analytical parameters, namely the mean ligand/nanoparticle ratio, failed to adequately represent the component heterogeneity. The distribution of components was also found to be sensitive to particle modifications that preceded the ligand conjugation. With the knowledge gained from this detailed distribution analysis, a new platform design was developed to provide a system with dramatically improved control over the number of components and with improved batch reproducibility. Using semi-preparative HPLC, individual dendrimer-ligand components were isolated. The isolated dendrimers with precise numbers of ligands were characterized by NMR and analytical HPLC. In total, nine different dendrimer-ligand components were obtained with degrees of purity ≥80%. This system has the potential to serve as a platform to which a precise number of functional molecules
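
    One way to see why a mean ligand/nanoparticle ratio can mask substantial heterogeneity is to model conjugation as a random (Poisson-like) process, as in the sketch below. This is a generic statistical illustration of the point, not the distribution actually measured by HPLC in the work above.

```python
# Generic illustration of why a mean ligand/particle ratio hides heterogeneity:
# if conjugation were Poisson-distributed with a mean of 5 ligands per particle,
# many particles would still carry far fewer or far more ligands. This models
# the general point, not the HPLC-measured distribution itself.
from math import exp, factorial

def poisson_pmf(k: int, mean: float) -> float:
    return exp(-mean) * mean ** k / factorial(k)

mean_ligands = 5.0
for k in range(11):
    p = poisson_pmf(k, mean_ligands)
    print(f"{k:2d} ligands: {p:.3f} {'#' * int(round(100 * p))}")

print(f"fraction of particles with zero ligands: {poisson_pmf(0, mean_ligands):.4f}")
```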

  15. Emerging technology and architecture for big-data analytics

    CERN Document Server

    Chang, Chip; Yu, Hao

    2017-01-01

    This book describes the current state of the art in big-data analytics, from a technology and hardware architecture perspective. The presentation is designed to be accessible to a broad audience, with general knowledge of hardware design and some interest in big-data analytics. Coverage includes emerging technology and devices for data-analytics, circuit design for data-analytics, and architecture and algorithms to support data-analytics. Readers will benefit from the realistic context used by the authors, which demonstrates what works, what doesn’t work, and what are the fundamental problems, solutions, upcoming challenges and opportunities. Provides a single-source reference to hardware architectures for big-data analytics; Covers various levels of big-data analytics hardware design abstraction and flow, from device, to circuits and systems; Demonstrates how non-volatile memory (NVM) based hardware platforms can be a viable solution to existing challenges in hardware architecture for big-data analytics.

  16. A Novel Platform for Evaluating the Environmental Impacts on Bacterial Cellulose Production.

    Science.gov (United States)

    Basu, Anindya; Vadanan, Sundaravadanam Vishnu; Lim, Sierin

    2018-04-10

    Bacterial cellulose (BC) is a biocompatible material with versatile applications. However, its large-scale production is challenged by the limited biological knowledge of the bacteria. The advent of synthetic biology has led the way to the development of BC-producing microbes as a novel chassis. Hence, investigation of optimal growth conditions for BC production and understanding of the fundamental biological processes are imperative. In this study, we report a novel analytical platform that can be used for studying the biology and optimizing growth conditions of cellulose-producing bacteria. The platform is based on the surface growth pattern of the organism and allows us to confirm that cellulose fibrils produced by the bacteria play a pivotal role in their chemotaxis. The platform efficiently determines the impacts of different growth conditions on cellulose production and is translatable to static culture conditions. The analytical platform provides a means for fundamental biological studies of bacterial chemotaxis as well as a systematic approach towards the rational design and development of scalable bioprocessing strategies for industrial production of bacterial cellulose.

  17. A large-scale superhydrophobic surface-enhanced Raman scattering (SERS) platform fabricated via capillary force lithography and assembly of Ag nanocubes for ultratrace molecular sensing.

    Science.gov (United States)

    Tan, Joel Ming Rui; Ruan, Justina Jiexin; Lee, Hiang Kwee; Phang, In Yee; Ling, Xing Yi

    2014-12-28

    An analytical platform with an ultratrace detection limit in the atto-molar (aM) concentration range is vital for forensic, industrial and environmental sectors that handle scarce/highly toxic samples. Superhydrophobic surface-enhanced Raman scattering (SERS) platforms serve as ideal platforms to enhance detection sensitivity by reducing the random spreading of aqueous solution. However, the fabrication of superhydrophobic SERS platforms is generally limited by the use of sophisticated and expensive protocols and/or suffers from structural and signal inconsistency. Herein, we demonstrate a high-throughput fabrication of a stable and uniform superhydrophobic SERS platform for ultratrace molecular sensing. Large-area box-like micropatterns of the polymeric surface are first fabricated using capillary force lithography (CFL). Subsequently, plasmonic properties are incorporated into the patterned surfaces by decorating with Ag nanocubes using the Langmuir-Schaefer technique. To create a stable superhydrophobic SERS platform, an additional 25 nm Ag film is coated over the Ag nanocube-decorated patterned template followed by chemical functionalization with perfluorodecanethiol. Our resulting superhydrophobic SERS platform demonstrates excellent water-repellency with a static contact angle of 165° ± 9° and a consequent analyte concentration factor of 59-fold, as compared to its hydrophilic counterpart. By combining the analyte concentration effect of superhydrophobic surfaces with the intense electromagnetic "hot spots" of Ag nanocubes, our superhydrophobic SERS platform achieves an ultra-low detection limit of 10⁻¹⁷ M (10 aM) for rhodamine 6G using just 4 μL of analyte solution, corresponding to an analytical SERS enhancement factor of 10¹³. Our fabrication protocol demonstrates a simple, cost- and time-effective approach for the large-scale fabrication of a superhydrophobic SERS platform for ultratrace molecular detection.
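
    The analytical enhancement factor quoted above is commonly defined as AEF = (I_SERS / c_SERS) / (I_ref / c_ref), comparing the SERS signal per unit concentration against a normal-Raman reference. The short sketch below evaluates that ratio; the intensity values are made-up examples, and only the formula and the order of magnitude correspond to the figure cited.

```python
# The analytical SERS enhancement factor, AEF = (I_SERS / c_SERS) / (I_ref / c_ref).
# Intensities below are made-up example values; only the formula and the order of
# magnitude correspond to the figure quoted in the abstract.
def analytical_ef(i_sers: float, c_sers: float, i_ref: float, c_ref: float) -> float:
    return (i_sers / c_sers) / (i_ref / c_ref)

# A SERS signal comparable to the normal-Raman reference, obtained at a
# concentration 13 orders of magnitude lower, corresponds to AEF ~ 1e13.
print(f"{analytical_ef(i_sers=1.0, c_sers=1e-17, i_ref=1.0, c_ref=1e-4):.1e}")  # 1.0e+13
```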

  18. An analytical platform for mass spectrometry-based identification and chemical analysis of RNA in ribonucleoprotein complexes.

    Science.gov (United States)

    Taoka, Masato; Yamauchi, Yoshio; Nobe, Yuko; Masaki, Shunpei; Nakayama, Hiroshi; Ishikawa, Hideaki; Takahashi, Nobuhiro; Isobe, Toshiaki

    2009-11-01

    We describe here a mass spectrometry (MS)-based analytical platform of RNA, which combines direct nano-flow reversed-phase liquid chromatography (RPLC) on a spray tip column and a high-resolution LTQ-Orbitrap mass spectrometer. Operating RPLC under a very low flow rate with volatile solvents and MS in the negative mode, we could estimate highly accurate mass values sufficient to predict the nucleotide composition of an approximately 21-nucleotide small interfering RNA, detect post-transcriptional modifications in yeast tRNA, and perform collision-induced dissociation/tandem MS-based structural analysis of nucleolytic fragments of RNA at a sub-femtomole level. Importantly, the method allowed the identification and chemical analysis of small RNAs in ribonucleoprotein (RNP) complex, such as the pre-spliceosomal RNP complex, which was pulled down from cultured cells with a tagged protein cofactor as bait. We have recently developed a unique genome-oriented database search engine, Ariadne, which allows tandem MS-based identification of RNAs in biological samples. Thus, the method presented here has broad potential for automated analysis of RNA; it complements conventional molecular biology-based techniques and is particularly suited for simultaneous analysis of the composition, structure, interaction, and dynamics of RNA and protein components in various cellular RNP complexes.

  19. Teachable, high-content analytics for live-cell, phase contrast movies.

    Science.gov (United States)

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the performance of the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for the image), and validated by independent review. The validation data show that the standard recipes' performance is comparable with the validated truth with low variation. The data validate that the CL-Quant standard recipes can provide robust results without customization for live-cell assays in broad cell types and laboratory settings.

  20. Climate News Across Media Platforms

    DEFF Research Database (Denmark)

    Eskjær, Mikkel Fugl

    2015-01-01

    In a changing media landscape marked by technological, institutional and cultural convergence, comparative and cross-media content analysis represents a valuable analytical tool in mapping the diverse channels of climate change communication. This paper presents a comparative study of climate change news on five different media platforms: newspapers, television, radio, web-news and mobile news. It investigates the themes and actors represented in public climate change communication as well as the diverse possibilities of participating in public debates and information sharing. By combining quantitative and qualitative content analysis the paper documents and explores the extent and character of climate change news across different media platforms. The study aims at contributing to the on-going assessment of how news media are addressing climate change at a time when old and new media...

  1. Architectural Considerations for Highly Scalable Computing to Support On-demand Video Analytics

    Science.gov (United States)

    2017-04-19

    research were used to implement a distributed on-demand video analytics system that was prototyped for the use of forensics investigators in law enforcement. The system was tested in the wild using video files as well as a commercial Video Management System supporting more than 100 surveillance... Keywords: on-demand video intelligence; intelligent video system; video analytics platform

  2. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed: the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  3. Analytical investigations of the earthquake resistance of the support base of an oil-gas platform

    Energy Technology Data Exchange (ETDEWEB)

    Glagovskii, V. B.; Kassirova, N. A.; Turchina, O. A.; Finagenov, O. M.; Tsirukhin, N. A. [JSC 'VNIIG im. B. E. Vedeneeva' (Russian Federation)]

    2012-01-15

    In designing stationary oil-gas recovery platforms on the continental shelf, the need arises to compute the estimated strength of their support base during seismic events. This paper is devoted to this estimation. The paper examines a structure consisting of the superstructure of an oil-gas platform and its gravity-type base. It is possible to install earthquake-insulating supports between them. Calculations performed for the design earthquake indicated that the design of the gravity base can resist a seismic effect without special additional measures. During the maximum design earthquake, moreover, significant stresses may develop in the zone of base where the columns are connected to the upper slab of the caisson. In that case, the earthquake insulation considered for the top of the platform becomes critical.

  4. Analytical investigations of the earthquake resistance of the support base of an oil-gas platform

    International Nuclear Information System (INIS)

    Glagovskii, V. B.; Kassirova, N. A.; Turchina, O. A.; Finagenov, O. M.; Tsirukhin, N. A.

    2012-01-01

    In designing stationary oil-gas recovery platforms on the continental shelf, the need arises to compute the estimated strength of their support base during seismic events. This paper is devoted to this estimation. The paper examines a structure consisting of the superstructure of an oil-gas platform and its gravity-type base. It is possible to install earthquake-insulating supports between them. Calculations performed for the design earthquake indicated that the design of the gravity base can resist a seismic effect without special additional measures. During the maximum design earthquake, moreover, significant stresses may develop in the zone of base where the columns are connected to the upper slab of the caisson. In that case, the earthquake insulation considered for the top of the platform becomes critical.

  5. Rigid multipodal platforms for metal surfaces

    Directory of Open Access Journals (Sweden)

    Michal Valášek

    2016-03-01

    Full Text Available In this review, the recent progress in molecular platforms that form rigid and well-defined contact to a metal surface is discussed. Most of the presented examples have at least three anchoring units in order to control the spatial arrangement of the protruding molecular subunit. Another interesting feature is the lateral orientation of these foot structures, which, depending on the particular application, is as important as the spatial arrangement of the molecules. The numerous approaches towards assembling and organizing functional molecules into specific architectures on metal substrates are reviewed here. Particular attention is paid to variations of both the core structures and the anchoring groups. Furthermore, the analytical methods enabling the investigation of individual molecules as well as monomolecular layers of ordered platform structures are summarized. The presented multipodal platforms bearing several anchoring groups form considerably more stable molecule–metal contacts than the corresponding monopodal analogues, exhibit an enlarged separation of the functional molecules due to the increased footprint, and restrict tilting of the functional termini with respect to the metal surface. These platforms are thus ideally suited to tune important properties of the molecule–metal interface. On a single-molecule level, several of these platforms enable control over the arrangement of the protruding rod-type molecular structures (e.g., molecular wires, switches, rotors, sensors) with respect to the surface of the substrate.

  6. Marketing analytics for Free-to-Play Games

    OpenAIRE

    Kuokka, Ari

    2013-01-01

    This thesis deals with free-to-play marketing analytics in the light of mobile iOS games. Other platforms will also be discussed, as well as mobile marketing aspects such as user acquisition, big data and metrics. The case company is a Finnish game startup which is about to release their first game, The Supernauts. The objective of this thesis was to research what kind of analytics and metrics are needed in the marketing of free-to-play games, as well as to examine what the best practices are...

  7. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  8. Ultrasensitive microchip based on smart microgel for real-time online detection of trace threat analytes.

    Science.gov (United States)

    Lin, Shuo; Wang, Wei; Ju, Xiao-Jie; Xie, Rui; Liu, Zhuang; Yu, Hai-Rong; Zhang, Chuan; Chu, Liang-Yin

    2016-02-23

    Real-time online detection of trace threat analytes is critical for global sustainability, whereas the key challenge is how to efficiently convert and amplify analyte signals into simple readouts. Here we report an ultrasensitive microfluidic platform incorporating a smart microgel for real-time online detection of trace threat analytes. The microgel can swell in response to a specific stimulus in flowing solution, resulting in efficient conversion of the stimulus signal into a significantly amplified signal of flow-rate change; thus highly sensitive, fast, and selective detection can be achieved. We demonstrate this by incorporating an ion-recognizable microgel for detecting trace Pb²⁺, and connecting our platform with pipelines of tap water and wastewater for real-time online Pb²⁺ detection to achieve timely pollution warning and termination. This work provides a generalizable platform for incorporating myriad stimuli-responsive microgels to achieve ever-better performance for real-time online detection of various trace threat molecules, and may expand the scope of applications of detection techniques.

  9. Designing Game Analytics For A City-Builder Game

    OpenAIRE

    Korppoo, Karoliina

    2015-01-01

    The video game industry continues to grow. Competition is tough as games become more and more popular and easier for the users to get, thanks to digital distribution and social media platforms that support games. Thanks to the readily available internet connections and games using them, data of player behaviour can be acquired. This is where game analytics come in. What sort of player actions provide meaningful information that can be used to iterate the game? Typically game analytics is appl...

  10. Supporting Building Portfolio Investment and Policy Decision Making through an Integrated Building Utility Data Platform

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, Azizan [Carnegie Mellon Univ., Pittsburgh, PA (United States); Lasternas, Bertrand [Carnegie Mellon Univ., Pittsburgh, PA (United States); Alschuler, Elena [US DOE; View Inc; Loftness, Vivian [Carnegie Mellon Univ., Pittsburgh, PA (United States); Wang, Haopeng [Carnegie Mellon Univ., Pittsburgh, PA (United States); Mo, Yunjeong [Carnegie Mellon Univ., Pittsburgh, PA (United States); Wang, Ting [Carnegie Mellon Univ., Pittsburgh, PA (United States); Zhang, Chenlu [Carnegie Mellon Univ., Pittsburgh, PA (United States); Sharma, Shilpi [Carnegie Mellon; Stevens, Ivana [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    2016-03-18

    The American Recovery and Reinvestment Act stimulus funding of 2009 for smart grid projects resulted in the tripling of smart meter deployment. In 2012, the Green Button initiative provided utility customers with access to their real-time energy usage. The availability of finely granular data provides an enormous potential for energy data analytics and energy benchmarking. The sheer volume of time-series utility data from a large number of buildings also poses challenges in data collection, quality control, and database management for rigorous and meaningful analyses. In this paper, we will describe a building portfolio-level data analytics tool for operational optimization, business investment and policy assessment using utility data at 15-minute to monthly intervals. The analytics tool is developed on top of the U.S. Department of Energy's Standard Energy Efficiency Data (SEED) platform, an open source software application that manages energy performance data of large groups of buildings. To support the significantly large volume of granular interval data, we integrated a parallel time-series database with the existing relational database. The time-series database improves on the current utility data input, focusing on real-time data collection, storage, analytics and data quality control. The fully integrated data platform supports APIs for utility app development by third-party software developers. These apps will provide actionable intelligence for building owners and facilities managers. Unlike a commercial system, this platform is an open source platform funded by the U.S. Government, accessible to the public, researchers and other developers, to support initiatives in reducing building energy consumption.

  11. Comparison of pre-processing methods for multiplex bead-based immunoassays.

    Science.gov (United States)

    Rausch, Tanja K; Schillert, Arne; Ziegler, Andreas; Lüking, Angelika; Zucht, Hans-Dieter; Schulz-Knappe, Peter

    2016-08-11

    High throughput protein expression studies can be performed using bead-based protein immunoassays, such as the Luminex® xMAP® technology. Technical variability is inherent to these experiments and may lead to systematic bias and reduced power. To reduce technical variability, data pre-processing is performed. However, no recommendations exist for the pre-processing of Luminex® xMAP® data. We compared 37 different data pre-processing combinations of transformation and normalization methods in 42 samples on 384 analytes obtained from a multiplex immunoassay based on the Luminex® xMAP® technology. We evaluated the performance of each pre-processing approach with 6 different performance criteria. Three performance criteria were plots. All plots were evaluated by 15 independent and blinded readers. Four different combinations of transformation and normalization methods performed well as pre-processing procedure for this bead-based protein immunoassay. The following combinations of transformation and normalization were suitable for pre-processing Luminex® xMAP® data in this study: weighted Box-Cox followed by quantile or robust spline normalization (rsn), asinh transformation followed by loess normalization and Box-Cox followed by rsn.
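
    As an illustration of the kind of pre-processing combinations compared above, the sketch below applies an asinh transformation followed by quantile normalization to an analyte-by-sample intensity matrix. The matrix orientation and the use of NumPy are assumptions for exposition; the study compared 37 such combinations, and this is a minimal implementation of only one of them (ties are handled crudely).

```python
# Minimal sketch of one pre-processing combination: asinh transformation of raw
# MFI values followed by quantile normalization across samples. The matrix
# orientation (analytes x samples) is an assumption; ties are handled crudely.
import numpy as np

def asinh_transform(mfi: np.ndarray) -> np.ndarray:
    """Variance-stabilizing transform for bead-array intensities."""
    return np.arcsinh(mfi)

def quantile_normalize(x: np.ndarray) -> np.ndarray:
    """Force every column (sample) to share the same empirical distribution."""
    ranks = np.argsort(np.argsort(x, axis=0), axis=0)   # per-sample rank of each value
    reference = np.sort(x, axis=0).mean(axis=1)          # mean of sorted values across samples
    return reference[ranks]

# Example: 384 analytes measured in 42 samples (the shape used in the study design)
rng = np.random.default_rng(0)
raw = rng.lognormal(mean=6.0, sigma=1.0, size=(384, 42))
normalized = quantile_normalize(asinh_transform(raw))
print(normalized.shape)
print(normalized.mean(axis=0)[:3])  # columns now share the same distribution (equal means)
```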

  12. Big Data Analytics in Chemical Engineering.

    Science.gov (United States)

    Chiang, Leo; Lu, Bo; Castillo, Ivan

    2017-06-07

    Big data analytics is the journey to turn data into insights for more informed business and operational decisions. As the chemical engineering community is collecting more data (volume) from different sources (variety), this journey becomes more challenging in terms of using the right data and the right tools (analytics) to make the right decisions in real time (velocity). This article highlights recent big data advancements in five industries, including chemicals, energy, semiconductors, pharmaceuticals, and food, and then discusses technical, platform, and culture challenges. To reach the next milestone in multiplying successes to the enterprise level, government, academia, and industry need to collaboratively focus on workforce development and innovation.

  13. Big Data Analytics in Medicine and Healthcare.

    Science.gov (United States)

    Ristevski, Blagoj; Chen, Ming

    2018-05-10

    This paper surveys big data, highlighting big data analytics in medicine and healthcare. The big data characteristics (value, volume, velocity, variety, veracity and variability) are described. Big data analytics in medicine and healthcare covers integration and analysis of a large amount of complex heterogeneous data such as various -omics data (genomics, epigenomics, transcriptomics, proteomics, metabolomics, interactomics, pharmacogenomics, diseasomics), biomedical data and electronic health records data. We underline the challenging issues of big data privacy and security. Regarding big data characteristics, some directions for using suitable and promising open-source distributed data processing software platforms are given.

  14. Establishment of a Molecular Serotyping Scheme and a Multiplexed Luminex-Based Array for Enterobacter aerogenes.

    Science.gov (United States)

    Guo, Xi; Wang, Min; Wang, Lu; Wang, Yao; Chen, Tingting; Wu, Pan; Chen, Min; Liu, Bin; Feng, Lu

    2018-01-01

    Serotyping based on surface polysaccharide antigens is important for the clinical detection and epidemiological surveillance of pathogens. Polysaccharide gene clusters (PSgcs) are typically responsible for the diversity of bacterial surface polysaccharides. Through whole-genome sequencing and analysis, eight putative PSgc types were identified in 23 Enterobacter aerogenes strains from several geographic areas, allowing us to present the first molecular serotyping system for E. aerogenes. A conventional antigenic scheme was also established and correlated well with the molecular serotyping system that was based on PSgc genetic variation, indicating that PSgc-based molecular typing and immunological serology provide equally valid results. Further, a multiplex Luminex-based array was developed, and a double-blind test was conducted with 97 clinical specimens from Shanghai, China, to validate our array. The results of these analyses indicated that strains containing PSgc4 and PSgc7 comprised the predominant groups. We then examined 86 publicly available E. aerogenes strain genomes and identified an additional seven novel PSgc types, with PSgc10 being the most abundant type. In total, our study identified 15 PSgc types in E. aerogenes, providing the basis for a molecular serotyping scheme. From these results, differing epidemic patterns were identified between strains that were predominant in different regions. Our study highlights the feasibility and reliability of a serotyping system based on PSgc diversity, and for the first time, presents a molecular serotyping system, as well as an antigenic scheme for E. aerogenes, providing the basis for molecular diagnostics and epidemiological surveillance of this important emerging pathogen.

  15. Genomics Portals: integrative web-platform for mining genomics data.

    Science.gov (United States)

    Shinde, Kaustubh; Phatak, Mukta; Freudenberg, Johannes M; Chen, Jing; Li, Qian; Joshi, Vineet K; Hu, Zhen; Ghosh, Krishnendu; Meller, Jaroslaw; Medvedovic, Mario

    2010-01-13

    A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc), and the integration with an extensive knowledge base that can be used in such analysis. The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  16. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  17. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is based not only on technical factors such as the capabilities of the programming languages but also on the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  18. Use of Tethered Enzymes as a Platform Technology for Rapid Analyte Detection.

    Directory of Open Access Journals (Sweden)

    Roy Cohen

    Full Text Available Rapid diagnosis for time-sensitive illnesses such as stroke, cardiac arrest, and septic shock is essential for successful treatment. Much attention has therefore focused on new strategies for rapid and objective diagnosis, such as Point-of-Care Tests (PoCT) for blood biomarkers. Here we use a biomimicry-based approach to demonstrate a new diagnostic platform, based on enzymes tethered to nanoparticles (NPs). As proof of principle, we use oriented immobilization of pyruvate kinase (PK) and luciferase (Luc) on silica NPs to achieve rapid and sensitive detection of neuron-specific enolase (NSE), a clinically relevant biomarker for multiple diseases ranging from acute brain injuries to lung cancer. We hypothesize that an approach capitalizing on the speed and catalytic nature of enzymatic reactions would enable fast and sensitive biomarker detection, suitable for PoCT devices. We performed in-vitro, animal model, and human subject studies. First, the efficiency of coupled enzyme activities when tethered to NPs versus when in solution was tested, demonstrating a highly sensitive and rapid detection of physiological and pathological concentrations of NSE. Next, in rat stroke models the enzyme-based assay was able in minutes to show a statistically significant increase in NSE levels in samples taken 1 hour before and 0, 1, 3 and 6 hours after occlusion of the distal middle cerebral artery. Finally, using the tethered enzyme assay for detection of NSE in samples from 20 geriatric human patients, we show that our data match well (r = 0.815) with the current gold standard for biomarker detection, ELISA, with a major difference being that we achieve detection in 10 minutes as opposed to the several hours required for traditional ELISA. Oriented enzyme immobilization conferred more efficient coupled activity, and thus higher assay sensitivity, than non-tethered enzymes. Together, our findings provide proof of concept for using oriented immobilization of active
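
    In a coupled PK/luciferase readout like the one above, luminescence rises with NSE activity, so quantification reduces to fitting a calibration curve on NSE standards and inverting it for unknown samples. The sketch below shows that step with invented numbers; it illustrates the principle, not the calibration used in the cited study.

```python
# Hypothetical sketch of turning a coupled PK + luciferase luminescence readout
# into an NSE concentration: fit a linear calibration on standards, then invert
# it for unknowns. All numbers are invented for illustration.
import numpy as np

std_conc_ng_ml = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # NSE standards
std_lum = np.array([120.0, 610.0, 1150.0, 2230.0, 4410.0])   # luminescence counts

slope, intercept = np.polyfit(std_conc_ng_ml, std_lum, deg=1)  # linear calibration

def nse_from_luminescence(counts: float) -> float:
    """Invert the calibration: concentration = (signal - intercept) / slope."""
    return (counts - intercept) / slope

print(round(nse_from_luminescence(1700.0), 1))  # ~15.0 ng/mL with these made-up standards
```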

  19. OLED-based biosensing platform with ZnO nanoparticles for enzyme immobilization

    Science.gov (United States)

    Cai, Yuankun; Shinar, Ruth; Shinar, Joseph

    2009-08-01

    Organic light-emitting diode (OLED)-based sensing platforms are attractive for photoluminescence (PL)-based monitoring of a variety of analytes. Among the promising OLED attributes for sensing applications is the thin and flexible size and design of the OLED pixel array that is used for PL excitation. To generate a compact, field-deployable sensor, other major sensor components, such as the sensing probe and the photodetector, in addition to the thin excitation source, should be compact. To this end, the OLED-based sensing platform was tested with composite thin biosensing films, where oxidase enzymes were immobilized on ZnO nanoparticles, rather than dissolved in solution, to generate a more compact device. The analytes tested, glucose, cholesterol, and lactate, were monitored by following their oxidation reactions in the presence of oxygen and their respective oxidase enzymes. During such reactions, oxygen is consumed and its residual concentration, which is determined by the initial concentration of the above-mentioned analytes, is monitored. The sensors utilized the oxygen-sensitive dye Pt octaethylporphyrin, embedded in polystyrene. The enzymes were sandwiched between two thin ZnO layers, an approach that was found to improve the stability of the sensing probes.

  20. Understanding the promises and premises of online health platforms

    Directory of Open Access Journals (Sweden)

    José Van Dijck

    2016-06-01

    Full Text Available This article investigates the claims and complexities involved in the platform-based economics of health and fitness apps. We examine a double-edged logic inscribed in these platforms, promising to offer personal solutions to medical problems while also contributing to the public good. On the one hand, online platforms serve as personalized data-driven services to their customers. On the other hand, they allegedly serve public interests, such as medical research or health education. In doing so, many apps employ a diffuse discourse, hinging on terms like "sharing," "open," and "reuse" when they talk about data extraction and distribution. The analytical approach we adopt in this article is situated at the nexus of science and technology studies, political economy, and the sociology of health and illness. The analysis concentrates on two aspects: datafication (the use and reuse of data) and commodification (a platform's deployment of governance and business models). We apply these analytical categories to three specific platforms: 23andMe, PatientsLikeMe, and Parkinson mPower. The last section will connect these individual examples to the wider implications of health apps' data flows, governance policies, and business models. Regulatory bodies commonly focus on the (medical) safety and security of apps, but pay scarce attention to health apps' techno-economic governance. Who owns user-generated health data and who gets to benefit? We argue that it is important to reflect on the societal implications of health data markets. Governments have the duty to provide conceptual clarity in the grand narrative of transforming health care and health research.

  1. pH-Triggered Molecular Alignment for Reproducible SERS Detection via an AuNP/Nanocellulose Platform

    Science.gov (United States)

    Wei, Haoran; Vikesland, Peter J.

    2015-12-01

    The low affinity of neutral and hydrophobic molecules towards noble metal surfaces hinders their detection by surface-enhanced Raman spectroscopy (SERS). Herein, we present a method to enhance gold nanoparticle (AuNP) surface affinity by lowering the suspension pH below the analyte pKa. We developed an AuNP/bacterial cellulose (BC) nanocomposite platform and applied it to two common pollutants, carbamazepine (CBZ) and atrazine (ATZ) with pKa values of 2.3 and 1.7, respectively. Simple mixing of the analytes with AuNP/BC at pH < pKa resulted in consistent electrostatic alignment of the CBZ and ATZ molecules across the nanocomposite and highly reproducible SERS spectra. Limits of detection of 3 nM and 11 nM for CBZ and ATZ, respectively, were attained. Tests with additional analytes (melamine, 2,4-dichloroaniline, 4-chloroaniline, 3-bromoaniline, and 3-nitroaniline) further illustrate that the AuNP/BC platform provides reproducible analyte detection and quantification while avoiding the uncontrolled aggregation and flocculation of AuNPs that often hinder low pH detection.
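
    The pH-pKa reasoning above can be made explicit with the Henderson-Hasselbalch relation: the fraction of a weakly basic analyte in its protonated (cationic) form is 1 / (1 + 10^(pH - pKa)), so dropping the suspension pH below the pKa drives the molecules toward the charged state that aligns them electrostatically on the AuNP surface. The short sketch below evaluates that fraction for the two analytes; it is a generic calculation, not data from the study.

```python
# Generic Henderson-Hasselbalch calculation of the protonated fraction of a
# weakly basic analyte at a given pH; illustrates the pH < pKa condition used
# above, not data from the study.
def fraction_protonated(ph: float, pka: float) -> float:
    """Fraction of molecules carrying the proton (cationic form)."""
    return 1.0 / (1.0 + 10.0 ** (ph - pka))

for analyte, pka in [("carbamazepine", 2.3), ("atrazine", 1.7)]:
    for ph in (7.0, pka, pka - 1.0):
        frac = fraction_protonated(ph, pka)
        print(f"{analyte:13s} at pH {ph:4.1f}: {frac:6.1%} protonated")
```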

  2. Multi-function microfluidic platform for sensor integration.

    Science.gov (United States)

    Fernandes, Ana C; Semenova, Daria; Panjan, Peter; Sesay, Adama M; Gernaey, Krist V; Krühne, Ulrich

    2018-03-06

    The limited availability of metabolite-specific sensors for continuous sampling and monitoring is one of the main bottlenecks contributing to failures in bioprocess development. Furthermore, only a limited number of approaches exist to connect currently available measurement systems with high throughput reactor units. This is especially relevant in the biocatalyst screening and characterization stage of process development. In this work, a strategy for sensor integration in microfluidic platforms is demonstrated, to address the need for rapid, cost-effective and high-throughput screening in bioprocesses. This platform is compatible with different sensor formats by enabling their replacement and was built in order to be highly flexible and thus suitable for a wide range of applications. Moreover, this re-usable platform can easily be connected to analytical equipment, such as HPLC, laboratory scale reactors or other microfluidic chips through the use of standardized fittings. In addition, the developed platform includes a two-sensor system interspersed with a mixing channel, which allows the detection of samples that might be outside the first sensor's range of detection, through dilution of the sample solution up to 10 times. In order to highlight the features of the proposed platform, inline monitoring of glucose levels is presented and discussed. Glucose was chosen due to its importance in biotechnology as a relevant substrate. The platform demonstrated continuous measurement of substrate solutions for up to 12 h. Furthermore, the influence of the fluid velocity on substrate diffusion was observed, indicating the need for in-flow calibration to achieve a good quantitative output. Copyright © 2018 Elsevier B.V. All rights reserved.
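
    A minimal sketch (an illustrative assumption, not taken from the paper) of the dilution arithmetic behind the two-sensor design: when the sample stream merges with buffer in the mixing channel, the analyte is diluted in proportion to the volumetric flow rates, assuming complete mixing.

```python
def diluted_concentration(c_sample: float, q_sample: float, q_buffer: float) -> float:
    """Analyte concentration after complete mixing of two co-flowing streams."""
    return c_sample * q_sample / (q_sample + q_buffer)

# Example: 50 mM glucose diluted 10-fold by running buffer at 9x the sample flow rate.
print(diluted_concentration(c_sample=50.0, q_sample=1.0, q_buffer=9.0))  # -> 5.0 (mM)
```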

  3. Visual Analytics for MOOC Data.

    Science.gov (United States)

    Qu, Huamin; Chen, Qing

    2015-01-01

    With the rise of massive open online courses (MOOCs), tens of millions of learners can now enroll in more than 1,000 courses via MOOC platforms such as Coursera and edX. As a result, a huge amount of data has been collected. Compared with traditional education records, the data from MOOCs has much finer granularity and also contains new pieces of information. It is the first time in history that such comprehensive data related to learning behavior has become available for analysis. What roles can visual analytics play in this MOOC movement? The authors survey the current practice and argue that MOOCs provide an opportunity for visualization researchers and that visual analytics systems for MOOCs can benefit a range of end users such as course instructors, education researchers, students, university administrators, and MOOC providers.

  4. Method and platform standardization in MRM-based quantitative plasma proteomics.

    Science.gov (United States)

    Percy, Andrew J; Chambers, Andrew G; Yang, Juncong; Jackson, Angela M; Domanski, Dominik; Burkhart, Julia; Sickmann, Albert; Borchers, Christoph H

    2013-12-16

    There exists a growing demand in the proteomics community to standardize experimental methods and liquid chromatography-mass spectrometry (LC/MS) platforms in order to enable the acquisition of more precise and accurate quantitative data. This necessity is heightened by the evolving trend of verifying and validating candidate disease biomarkers in complex biofluids, such as blood plasma, through targeted multiple reaction monitoring (MRM)-based approaches with stable isotope-labeled standards (SIS). Considering the lack of performance standards for quantitative plasma proteomics, we previously developed two reference kits to evaluate the MRM with SIS peptide approach using undepleted and non-enriched human plasma. The first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). Here, these kits have been refined for practical use and then evaluated through intra- and inter-laboratory testing on 6 common LC/MS platforms. For an identical panel of 22 plasma proteins, similar concentrations were determined, regardless of the kit, instrument platform, and laboratory of analysis. These results demonstrate the value of the kit and reinforce the utility of standardized methods and protocols. The proteomics community needs standardized experimental protocols and quality control methods in order to improve the reproducibility of MS-based quantitative data. This need is heightened by the evolving trend for MRM-based validation of proposed disease biomarkers in complex biofluids such as blood plasma. We have developed two kits to assist in the inter- and intra-laboratory quality control of MRM experiments: the first kit tests the effectiveness of the LC/MRM-MS platform (kit #1), while the second evaluates the performance of an entire analytical workflow (kit #2). In this paper, we report the use of these kits in intra- and inter-laboratory testing on 6 common LC/MS platforms. This

  5. Laboratory Tests of Multiplex Detection of PCR Amplicons Using the Luminex 100 Flow Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Venkateswaran, K.S.; Nasarabadi, S.; Langlois, R.G.

    2000-05-05

    Lawrence Livermore National Laboratory (LLNL) demonstrated the power of flow cytometry in detecting biological agent simulants at JFT III. LLNL pioneered the development of the advanced nucleic acid analyzer (ANAA) for portable real-time identification. Recent advances in flow cytometry provide a means for multiplexed nucleic acid detection and immunoassays of pathogenic microorganisms. We are presently developing multiplexed immunoassays for the simultaneous detection of different simulants. Our goal is to build an integrated instrument for both nucleic acid analysis and immunodetection. In this study we evaluated the Luminex LX 100 for concurrent identification of more than one PCR-amplified product. The ANAA has real-time TaqMan fluorescent detection capability for rapid identification of field samples; however, its multiplexing ability is limited by the combination of available fluorescent labels. Hence, integrating the ANAA with flow cytometry can combine the rapidity of ANAA amplification with the multiplex capability of flow cytometry. Multiplexed flow cytometric analysis is made possible using a set of fluorescent latex microspheres that are individually identified by their red and infrared fluorescence. A green fluorochrome is used as the assay signal. Methods were developed for the identification of specific nucleic acid sequences from Bacillus globigii (Bg), Bacillus thuringiensis (Bt), and Erwinia herbicola (Eh). Detection sensitivity using different reporter fluorochromes was tested with the LX 100, and different assay formats were evaluated for their suitability for rapid testing. A blind laboratory trial was carried out December 22-27, 1999 to evaluate bead assays for multiplex identification of Bg and Bt PCR products. This report summarizes the assay development, fluorochrome comparisons, and the results of the blind trial conducted at LLNL for the laboratory evaluation of the LX 100 flow analyzer.

  6. SciCloud: A Scientific Cloud and Management Platform for Smart City Data

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Nielsen, Per Sieverts; Heller, Alfred

    2017-01-01

    private scientific cloud, SciCloud, to tackle these grand challenges. SciCloud provides on-demand computing resource provisions, a scalable data management platform and an in-place data analytics environment to support the scientific research using smart city data....

  7. Smart Meter Data Analytics: Systems, Algorithms and Benchmarking

    DEFF Research Database (Denmark)

    Liu, Xiufeng; Golab, Lukasz; Golab, Wojciech

    2016-01-01

    Smart electricity meters have been replacing conventional meters worldwide, enabling automated collection of fine-grained (e.g., every 15 minutes or hourly) consumption data. A variety of smart meter analytics algorithms and applications have been proposed, mainly in the smart grid literature. … off-line feature extraction and model building as well as a framework for on-line anomaly detection that we propose. Second, since obtaining real smart meter data is difficult due to privacy issues, we present an algorithm for generating large realistic data sets from a small seed of real data. Third, we implement the proposed benchmark using five representative platforms: a traditional numeric computing platform (Matlab), a relational DBMS with a built-in machine learning toolkit (PostgreSQL/MADlib), a main-memory column store (“System C”), and two distributed data processing platforms (Hive and Spark/Spark Streaming).
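
    As a small illustration of the kind of off-line feature extraction applied to such fine-grained meter data (a sketch with synthetic numbers, not the benchmark's code), daily totals and an average 24-hour load profile can be derived from hourly readings:

```python
import numpy as np

# Hypothetical hourly consumption (kWh) for one household over 7 days.
rng = np.random.default_rng(0)
hourly = rng.gamma(shape=2.0, scale=0.4, size=7 * 24)

readings = hourly.reshape(7, 24)            # rows: days, columns: hour of day
daily_total = readings.sum(axis=1)          # one consumption value per day
avg_profile = readings.mean(axis=0)         # average load for each hour of the day
peak_hour = int(np.argmax(avg_profile))     # hour with the highest average load

print("daily totals (kWh):", np.round(daily_total, 2))
print("peak consumption hour of day:", peak_hour)
```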

  8. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  9. SM4AM: A Semantic Metamodel for Analytical Metadata

    DEFF Research Database (Denmark)

    Varga, Jovan; Romero, Oscar; Pedersen, Torben Bach

    2014-01-01

    Next generation BI systems emerge as platforms where traditional BI tools meet semi-structured and unstructured data coming from the Web. In these settings, the user-centric orientation represents a key characteristic for the acceptance and wide usage by numerous and diverse end users in their data....... We present SM4AM, a Semantic Metamodel for Analytical Metadata created as an RDF formalization of the Analytical Metadata artifacts needed for user assistance exploitation purposes in next generation BI systems. We consider the Linked Data initiative and its relevance for user assistance...

  10. Achieving Cost Reduction Through Data Analytics.

    Science.gov (United States)

    Rocchio, Betty Jo

    2016-10-01

    The reimbursement structure of the US health care system is shifting from a volume-based system to a value-based system. Adopting a comprehensive data analytics platform has become important to health care facilities, in part to navigate this shift. Hospitals generate plenty of data, but actionable analytics are necessary to help personnel interpret and apply data to improve practice. Perioperative services is an important revenue-generating department for hospitals, and each perioperative service line requires a tailored approach to be successful in managing outcomes and controlling costs. Perioperative leaders need to prepare to use data analytics to reduce variation in supplies, labor, and overhead. Mercy, based in Chesterfield, Missouri, adopted a perioperative dashboard that helped perioperative leaders collaborate with surgeons and perioperative staff members to organize and analyze health care data, which ultimately resulted in significant cost savings. Copyright © 2016 AORN, Inc. Published by Elsevier Inc. All rights reserved.

  11. Versatile electrophoresis-based self-test platform.

    Science.gov (United States)

    Guijt, Rosanne M

    2015-03-01

    Lab on a Chip technology offers the possibility to extract chemical information from a complex sample in a simple, automated way without the need for a laboratory setting. In the health care sector, this chemical information could be used as a diagnostic tool, for example to inform dosing. In this issue, the research underpinning a family of electrophoresis-based point-of-care devices for self-testing of ionic analytes in various sample matrices is described [Electrophoresis 2015, 36, 712-721.]. Hardware, software, and methodological changes made to improve the overall analytical performance in terms of accuracy, precision, detection limit, and reliability are discussed. In addition to the main focus on lithium monitoring, new applications are included, such as the use of the platform for veterinary purposes and for sodium and creatinine measurements. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Recent Progress in Optical Biosensors Based on Smartphone Platforms

    Science.gov (United States)

    Geng, Zhaoxin; Zhang, Xiong; Fan, Zhiyuan; Lv, Xiaoqing; Su, Yue; Chen, Hongda

    2017-01-01

    With a rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which have pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms. PMID:29068375

  13. Recent Progress in Optical Biosensors Based on Smartphone Platforms.

    Science.gov (United States)

    Geng, Zhaoxin; Zhang, Xiong; Fan, Zhiyuan; Lv, Xiaoqing; Su, Yue; Chen, Hongda

    2017-10-25

    With a rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which have pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms.

  14. Use of Tethered Enzymes as a Platform Technology for Rapid Analyte Detection

    Science.gov (United States)

    Cohen, Roy; Lata, James P.; Lee, Yurim; Hernández, Jean C. Cruz; Nishimura, Nozomi; Schaffer, Chris B.; Mukai, Chinatsu; Nelson, Jacquelyn L.; Brangman, Sharon A.; Agrawal, Yash; Travis, Alexander J.

    2015-01-01

    Background Rapid diagnosis for time-sensitive illnesses such as stroke, cardiac arrest, and septic shock is essential for successful treatment. Much attention has therefore focused on new strategies for rapid and objective diagnosis, such as Point-of-Care Tests (PoCT) for blood biomarkers. Here we use a biomimicry-based approach to demonstrate a new diagnostic platform, based on enzymes tethered to nanoparticles (NPs). As proof of principle, we use oriented immobilization of pyruvate kinase (PK) and luciferase (Luc) on silica NPs to achieve rapid and sensitive detection of neuron-specific enolase (NSE), a clinically relevant biomarker for multiple diseases ranging from acute brain injuries to lung cancer. We hypothesize that an approach capitalizing on the speed and catalytic nature of enzymatic reactions would enable fast and sensitive biomarker detection, suitable for PoCT devices. Methods and findings We performed in-vitro, animal model, and human subject studies. First, the efficiency of coupled enzyme activities when tethered to NPs versus when in solution was tested, demonstrating a highly sensitive and rapid detection of physiological and pathological concentrations of NSE. Next, in rat stroke models the enzyme-based assay was able in minutes to show a statistically significant increase in NSE levels in samples taken 1 hour before and 0, 1, 3 and 6 hours after occlusion of the distal middle cerebral artery. Finally, using the tethered enzyme assay for detection of NSE in samples from 20 geriatric human patients, we show that our data match well (r = 0.815) with the current gold standard for biomarker detection, ELISA—with a major difference being that we achieve detection in 10 minutes as opposed to the several hours required for traditional ELISA. Conclusions Oriented enzyme immobilization conferred more efficient coupled activity, and thus higher assay sensitivity, than non-tethered enzymes. Together, our findings provide proof of concept for using

  15. Genomics Portals: integrative web-platform for mining genomics data

    Directory of Open Access Journals (Sweden)

    Ghosh Krishnendu

    2010-01-01

    Full Text Available Abstract Background A large amount of experimental data generated by modern high-throughput technologies is available through various public repositories. Our knowledge about molecular interaction networks, functional biological pathways and transcriptional regulatory modules is rapidly expanding, and is being organized in lists of functionally related genes. Jointly, these two sources of information hold a tremendous potential for gaining new insights into functioning of living systems. Results Genomics Portals platform integrates access to an extensive knowledge base and a large database of human, mouse, and rat genomics data with basic analytical visualization tools. It provides the context for analyzing and interpreting new experimental data and the tool for effective mining of a large number of publicly available genomics datasets stored in the back-end databases. The uniqueness of this platform lies in the volume and the diversity of genomics data that can be accessed and analyzed (gene expression, ChIP-chip, ChIP-seq, epigenomics, computationally predicted binding sites, etc.), and the integration with an extensive knowledge base that can be used in such analysis. Conclusion The integrated access to primary genomics data, functional knowledge and analytical tools makes Genomics Portals platform a unique tool for interpreting results of new genomics experiments and for mining the vast amount of data stored in the Genomics Portals backend databases. Genomics Portals can be accessed and used freely at http://GenomicsPortals.org.

  16. The BTWorld use case for big data analytics : Description, MapReduce logical workflow, and empirical evaluation

    NARCIS (Netherlands)

    Hegeman, T.; Ghit, B.; Capota, M.; Hidders, A.J.H.; Epema, D.H.J.; Iosup, A.

    2013-01-01

    The commoditization of big data analytics, that is, the deployment, tuning, and future development of big data processing platforms such as MapReduce, relies on a thorough understanding of relevant use cases and workloads. In this work we propose BTWorld, a use case for time-based big data analytics

  17. Evaluation of a new eLearning platform for distance teaching of microsurgery.

    Science.gov (United States)

    Messaoudi, T; Bodin, F; Hidalgo Diaz, J J; Ichihara, S; Fikry, T; Lacreuse, I; Liverneaux, P; Facca, S

    2015-06-01

    Online learning (or eLearning) is in constant evolution in medicine. An analytical survey of the websites of eight academic societies and medical schools was carried out. These sites were evaluated against parameters that define the quality of an eLearning website, as well as the shareable content object reference model (SCORM) technical standards. All studied platforms were maintained by a webmaster and regularly updated. Only two platforms had teleconference opportunities, five had courses in PDF format, and four allowed online testing. Based on SCORM standards, only four platforms allowed direct access without a password. The content of all platforms was adaptable, interoperable and reusable. But their sustainability was difficult to assess. In parallel, we developed the first eLearning platform to be used as part of a university diploma in microsurgery in France. The platform was evaluated by students enrolled this diploma program. A satisfaction survey and platform evaluation showed that students were generally satisfied and had used the platform for microsurgery education, especially the seven students living abroad. ELearning for microsurgery allows the content to be continuously updated, makes for fewer classroom visits, provides easy remote access, and especially better training time management and cost savings in terms of travel and accommodations. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  18. A Short Survey on the State of the Art in Architectures and Platforms for Large Scale Data Analysis and Knowledge Discovery from Data

    Energy Technology Data Exchange (ETDEWEB)

    Begoli, Edmon [ORNL

    2012-01-01

    Intended as a survey for practicing architects and researchers seeking an overview of the state-of-the-art architectures for data analysis, this paper provides an overview of the emerging data management and analytic platforms, including parallel databases, Hadoop-based systems, High Performance Computing (HPC) platforms, and platforms popularly referred to as NoSQL platforms. Platforms are presented based on their relevance, the analysis they support, and the data organization model they support.

  19. Spark - a modern approach for distributed analytics

    CERN Multimedia

    CERN. Geneva; Kothuri, Prasanth

    2016-01-01

    The Hadoop ecosystem is the leading open-source platform for distributed storage and processing of big data. It is a very popular system for implementing data warehouses and data lakes. Spark has also emerged as one of the leading engines for data analytics. The Hadoop platform is available at CERN as a central service provided by the IT department. By attending the session, a participant will acquire knowledge of the essential concepts needed to benefit from the parallel data processing offered by the Spark framework. The session is structured around practical examples and tutorials. Main topics: architecture overview (work distribution, concepts of a worker and a driver); computing concepts of transformations and actions; data processing APIs (RDD, DataFrame, and SparkSQL).
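
    A minimal, illustrative PySpark sketch of the concepts listed above (lazy transformations vs. actions, and the DataFrame/Spark SQL API); the local session and file names are assumptions for the example, not part of the CERN service description:

```python
from pyspark.sql import SparkSession

# Local session for illustration; on a cluster the builder options would differ.
spark = SparkSession.builder.appName("spark-intro-demo").getOrCreate()

# RDD API: transformations (textFile, filter) are lazy; the action (count) triggers the job.
lines = spark.sparkContext.textFile("events.log")        # hypothetical input file
errors = lines.filter(lambda line: "ERROR" in line)      # transformation, not yet executed
print("error lines:", errors.count())                    # action, runs the computation

# DataFrame / Spark SQL API on structured data.
df = spark.read.json("events.json")                      # hypothetical input file
df.createOrReplaceTempView("events")
spark.sql("SELECT level, COUNT(*) AS n FROM events GROUP BY level").show()

spark.stop()
```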

  20. A Supramolecular Sensing Platform for Phosphate Anions and an Anthrax Biomarker in a Microfluidic Device

    Directory of Open Access Journals (Sweden)

    Jurriaan Huskens

    2011-10-01

    Full Text Available A supramolecular platform based on self-assembled monolayers (SAMs) has been implemented in a microfluidic device. The system has been applied for the sensing of two different analyte types: biologically relevant phosphate anions and aromatic carboxylic acids, which are important for anthrax detection. A Eu(III)-EDTA complex was bound to β-cyclodextrin monolayers via orthogonal supramolecular host-guest interactions. The self-assembly of the Eu(III)-EDTA conjugate and naphthalene β-diketone as an antenna resulted in the formation of a highly luminescent lanthanide complex on the microchannel surface. Detection of different phosphate anions and aromatic carboxylic acids was demonstrated by monitoring the decrease in red emission following displacement of the antenna by the analyte. Among these analytes, adenosine triphosphate (ATP) and pyrophosphate, as well as dipicolinic acid (DPA), which is a biomarker for anthrax, showed a strong response. Parallel fabrication of five sensing SAMs in a single multichannel chip was performed, as a first demonstration of phosphate and carboxylic acid screening in a multiplexed format that allows a general detection platform for both analyte systems in a single test run with µM and nM detection sensitivity for ATP and DPA, respectively.

  1. A versatile electrophoresis-based self-test platform.

    Science.gov (United States)

    Staal, Steven; Ungerer, Mathijn; Floris, Arjan; Ten Brinke, Hans-Willem; Helmhout, Roy; Tellegen, Marian; Janssen, Kjeld; Karstens, Erik; van Arragon, Charlotte; Lenk, Stefan; Staijen, Erik; Bartholomew, Jody; Krabbe, Hans; Movig, Kris; Dubský, Pavel; van den Berg, Albert; Eijkel, Jan

    2015-03-01

    This paper reports on recent research creating a family of electrophoresis-based point-of-care devices for the determination of a wide range of ionic analytes in various sample matrices. These devices are based on a first version for the point-of-care measurement of Li(+), reported in 2010 by Floris et al. (Lab Chip 2010, 10, 1799-1806). With respect to this device, significant improvements in accuracy, precision, detection limit, and reliability have been obtained, especially through the use of multiple injections of one sample on a single chip and integrated data analysis. Internal and external validation by clinical laboratories for the determination of analytes in real patients by a self-test is reported. For Li(+) in blood, better precision than the standard clinical determination of Li(+) was achieved. For Na(+) in human urine, the method was found to be within the clinical acceptability limits. In a veterinary application, Ca(2+) and Mg(2+) were determined in bovine blood by means of the same chip, but using a different platform. Finally, promising preliminary results are reported with the Medimate platform for the determination of creatinine in whole blood and quantification of both cations and anions through replicate measurements on the same sample with the same chip. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Recent Progress in Optical Biosensors Based on Smartphone Platforms

    Directory of Open Access Journals (Sweden)

    Zhaoxin Geng

    2017-10-01

    Full Text Available With a rapid improvement of smartphone hardware and software, especially complementary metal oxide semiconductor (CMOS) cameras, many optical biosensors based on smartphone platforms have been presented, which have pushed the development of point-of-care testing (POCT). Imaging-based and spectrometry-based detection techniques have been widely explored via different approaches. Combined with the smartphone, imaging-based and spectrometry-based methods are currently used to investigate a wide range of molecular properties in chemical and biological science for biosensing and diagnostics. Imaging techniques based on smartphone-based microscopes are utilized to capture microscale analytes, while spectrometry-based techniques are used to probe reactions or changes of molecules. Here, we critically review the most recent progress in imaging-based and spectrometry-based smartphone-integrated platforms that have been developed for chemical experiments and biological diagnosis. We focus on the analytical performance and the complexity of implementation of the platforms.

  3. A Versatile Phenotyping System and Analytics Platform Reveals Diverse Temporal Responses to Water Availability in Setaria.

    Science.gov (United States)

    Fahlgren, Noah; Feldman, Maximilian; Gehan, Malia A; Wilson, Melinda S; Shyu, Christine; Bryant, Douglas W; Hill, Steven T; McEntee, Colton J; Warnasooriya, Sankalpi N; Kumar, Indrajit; Ficor, Tracy; Turnipseed, Stephanie; Gilbert, Kerrigan B; Brutnell, Thomas P; Carrington, James C; Mockler, Todd C; Baxter, Ivan

    2015-10-05

    Phenotyping has become the rate-limiting step in using large-scale genomic data to understand and improve agricultural crops. Here, the Bellwether Phenotyping Platform for controlled-environment plant growth and automated multimodal phenotyping is described. The system has capacity for 1140 plants, which pass daily through stations to record fluorescence, near-infrared, and visible images. Plant Computer Vision (PlantCV) was developed as open-source, hardware platform-independent software for quantitative image analysis. In a 4-week experiment, wild Setaria viridis and domesticated Setaria italica had fundamentally different temporal responses to water availability. While both lines produced similar levels of biomass under limited water conditions, Setaria viridis maintained the same water-use efficiency under water replete conditions, while Setaria italica shifted to less efficient growth. Overall, the Bellwether Phenotyping Platform and PlantCV software detected significant effects of genotype and environment on height, biomass, water-use efficiency, color, plant architecture, and tissue water status traits. All ∼ 79,000 images acquired during the course of the experiment are publicly available. Copyright © 2015 The Author. Published by Elsevier Inc. All rights reserved.

  4. Assessment of the real-time PCR and different digital PCR platforms for DNA quantification.

    Science.gov (United States)

    Pavšič, Jernej; Žel, Jana; Milavec, Mojca

    2016-01-01

    Digital PCR (dPCR) is beginning to supersede real-time PCR (qPCR) for quantification of nucleic acids in many different applications. Several analytical properties of the two most commonly used dPCR platforms, namely the QX100 system (Bio-Rad) and the 12.765 array of the Biomark system (Fluidigm), have already been evaluated and compared with those of qPCR. However, to the best of our knowledge, direct comparison between the three of these platforms using the same DNA material has not been done, and the 37 K array on the Biomark system has also not been evaluated in terms of linearity, analytical sensitivity and limit of quantification. Here, a first assessment of qPCR, the QX100 system and both arrays of the Biomark system was performed with plasmid and genomic DNA from human cytomegalovirus. With use of PCR components that alter the efficiency of qPCR, each dPCR platform demonstrated consistent copy-number estimations, which indicates the high resilience of dPCR. Two approaches, one considering the total reaction volume and the other considering the effective reaction size, were used to assess linearity, analytical sensitivity and variability. When the total reaction volume was considered, the best performance was observed with qPCR, followed by the QX100 system and the Biomark system. In contrast, when the effective reaction size was considered, all three platforms showed almost equal limits of detection and variability. Although dPCR might not always be more appropriate than qPCR for quantification of low copy numbers, dPCR is a suitable method for robust and reproducible quantification of viral DNA, and a promising technology for the higher-order reference measurement method.
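
    For context (a textbook relation rather than anything specific to the compared instruments), dPCR copy-number estimates follow from Poisson statistics on the fraction of negative partitions; a minimal sketch with hypothetical numbers:

```python
import math

def dpcr_copies_per_ul(n_positive: int, n_total: int, partition_volume_nl: float) -> float:
    """Estimate target copies per microlitre of reaction from partition counts."""
    n_negative = n_total - n_positive
    if n_negative == 0:
        raise ValueError("All partitions positive: above the quantifiable range.")
    lam = -math.log(n_negative / n_total)   # mean copies per partition (Poisson)
    return lam / partition_volume_nl * 1000.0

# Hypothetical example assuming a droplet volume of ~0.85 nL (QX100-like partitions).
print(round(dpcr_copies_per_ul(n_positive=4500, n_total=15000, partition_volume_nl=0.85), 1))
```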

  5. Micro-optics for microfluidic analytical applications.

    Science.gov (United States)

    Yang, Hui; Gijs, Martin A M

    2018-02-19

    This critical review summarizes the developments in the integration of micro-optical elements with microfluidic platforms for facilitating detection and automation of bio-analytical applications. Micro-optical elements, made by a variety of microfabrication techniques, advantageously contribute to the performance of an analytical system, especially when the latter has microfluidic features. Indeed the easy integration of optical control and detection modules with microfluidic technology helps to bridge the gap between the macroscopic world and chip-based analysis, paving the way for automated and high-throughput applications. In our review, we start the discussion with an introduction of microfluidic systems and micro-optical components, as well as aspects of their integration. We continue with a detailed description of different microfluidic and micro-optics technologies and their applications, with an emphasis on the realization of optical waveguides and microlenses. The review continues with specific sections highlighting the advantages of integrated micro-optical components in microfluidic systems for tackling a variety of analytical problems, like cytometry, nucleic acid and protein detection, cell biology, and chemical analysis applications.

  6. Using a digital marketing platform for the promotion of an internet-based health encyclopedia in Saudi Arabia.

    Science.gov (United States)

    Al Ateeq, Asma; Al Moamary, Eman; Daghestani, Tahani; Al Muallem, Yahya; Al Dogether, Majed; Alsughayr, Abdulrahman; Altuwaijri, Majid; Househ, Mowafa

    2015-01-01

    The objective of this paper is to investigate the experiences of using a digital marketing platform to promote the use of an internet based health encyclopedia in Saudi Arabia. Key informant interviews, meeting documentation, and Google Analytics were the data collection sources used in the study. Findings show that using a digital marketing platform led to a significant increase in the number of visitors to the health encyclopedia. The results demonstrate that digital marketing platforms are effective tools to be used for promoting internet based health education interventions. Future work will examine long-term educational impacts and costs in using digital marketing platforms to promote online healthcare sites in Saudi Arabia.

  7. Analytical detection techniques for droplet microfluidics—A review

    International Nuclear Information System (INIS)

    Zhu, Ying; Fang, Qun

    2013-01-01

    Highlights: •This is the first review paper focused on the analytical techniques for droplet-based microfluidics. •We summarized the analytical methods used in droplet-based microfluidic systems. •We discussed the advantages and disadvantages of each method through its application. •We also discuss the future development direction of analytical methods for droplet-based microfluidic systems. -- Abstract: In the last decade, droplet-based microfluidics has undergone rapid progress in the fields of single-cell analysis, digital PCR, protein crystallization and high-throughput screening. It has been proved to be a promising platform for performing chemical and biological experiments with ultra-small volumes (picoliter to nanoliter) and ultra-high throughput. The ability to analyze droplet content qualitatively and quantitatively is playing an increasing role in the development and application of droplet-based microfluidic systems. In this review, we summarized the analytical detection techniques used in droplet systems and discussed the advantages and disadvantages of each technique through its application. The analytical techniques mentioned in this paper include bright-field microscopy, fluorescence microscopy, laser induced fluorescence, Raman spectroscopy, electrochemistry, capillary electrophoresis, mass spectrometry, nuclear magnetic resonance spectroscopy, absorption detection, chemiluminescence, and sample pretreatment techniques. The importance of analytical detection techniques in enabling new applications is highlighted. We also discuss the future development direction of analytical detection techniques for droplet-based microfluidic systems.

  8. CISP: Simulation Platform for Collective Instabilities in the BRing of HIAF project

    Science.gov (United States)

    Liu, J.; Yang, J. C.; Xia, J. W.; Yin, D. Y.; Shen, G. D.; Li, P.; Zhao, H.; Ruan, S.; Wu, B.

    2018-02-01

    To simulate collective instabilities during the complicated beam manipulation in the BRing (Booster Ring) of HIAF (High Intensity heavy-ion Accelerator Facility) or other high-intensity accelerators, a code named CISP (Simulation Platform for Collective Instabilities) has been designed and constructed at China's IMP (Institute of Modern Physics). The CISP is a scalable multi-macroparticle simulation platform that can perform longitudinal and transverse tracking when chromaticity, space-charge effects, nonlinear magnets and wakes are included. Owing to its well-structured object-oriented design, the CISP is also a base platform used to develop many other applications (such as feedback). Several simulations completed with the CISP in this paper agree very well with analytical results, which shows that the CISP is fully functional and a powerful platform for further collective-instability research in the BRing or other accelerators. In the future, the CISP can also be extended easily into a physics control system for HIAF or other facilities.

  9. Analytical workflow for rapid screening and purification of bioactives from venom proteomes

    NARCIS (Netherlands)

    Otvos, R.A.; Heus, F.A.M.; Vonk, F.J.; Halff, J.; Bruynzeel, B.; Paliukhovich, I.; Smit, A.B.; Niessen, W.M.A.; Kool, J.

    2013-01-01

    Animal venoms are important sources for finding new pharmaceutical lead molecules. We used an analytical platform for initial rapid screening and identification of bioactive compounds from these venoms followed by fast and straightforward LC-MS only guided purification to obtain bioactives for

  10. Comparative Assessment of Anti-HLA Antibodies Using Two Commercially Available Luminex-Based Assays.

    Science.gov (United States)

    Clerkin, Kevin J; See, Sarah B; Farr, Maryjane A; Restaino, Susan W; Serban, Geo; Latif, Farhana; Li, Lingzhi; Colombo, Paolo C; Vlad, George; Ray, Bryan; Vasilescu, Elena R; Zorn, Emmanuel

    2017-11-01

    Allospecific anti-HLA antibodies (Abs) are associated with rejection of solid organ grafts. The 2 main kits to detect anti-HLA Ab in patient serum are commercialized by Immucor and One Lambda/ThermoFisher. We sought to compare the performance of both platforms. Background-adjusted mean fluorescence intensity (MFI) values were used from both platforms to compare sera collected from 125 pretransplant and posttransplant heart and lung transplant recipients. Most HLA class I (94.5%) and HLA class II (89%) Abs with moderate to high MFI titer (≥4000) were detected by both assays. A modest correlation was observed between MFI values obtained from the 2 assays for both class I (r = 0.3, r² = 0.09, P < 0.0001) and class II Ab (r = 0.707, r² = 0.5, P < 0.0001). Both assays detected anti-class I and II Ab that the other did not; however, no specific HLA allele was detected preferentially by either of the 2 assays. For a limited number of discrepant sera, dilution resulted in comparable reactivity profiles between the 2 platforms. Immucor and One Lambda/ThermoFisher assays have a similar, albeit nonidentical, ability to detect anti-HLA Ab. Although the correlation between the assays was present, significant variances exist, some of which can be explained by a dilution-sensitive "prozone" effect.
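
    An illustrative sketch (with made-up numbers, not study data) of the kind of correlation analysis reported above, comparing background-adjusted MFI values for the same sera on two platforms:

```python
import numpy as np
from scipy import stats

# Hypothetical paired MFI values for the same antibody specificities on two assays.
mfi_platform_a = np.array([1200.0, 4300.0, 8500.0, 15000.0, 2200.0, 600.0])
mfi_platform_b = np.array([900.0, 5100.0, 7800.0, 13200.0, 3100.0, 450.0])

r, p = stats.pearsonr(mfi_platform_a, mfi_platform_b)
print(f"r = {r:.3f}, r^2 = {r**2:.3f}, p = {p:.4f}")
```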

  11. Twitter 101 and beyond: introduction to social media platforms available to practicing hematologist/oncologists.

    Science.gov (United States)

    Thompson, Michael A; Ahlstrom, Jenny; Dizon, Don S; Gad, Yash; Matthews, Greg; Luks, Howard J; Schorr, Andrew

    2017-10-01

    Social media utilizes specific media platforms to allow increased interactivity between participants. These platforms serve diverse groups and purposes including participation from patients, family caregivers, research scientists, physicians, and pharmaceutical companies. Utilization of these information outlets has increased with integration at conferences and between conferences with the use of hashtags and "chats". In the realm of the "e-Patient" it is key to not underestimate your audience. Highly technical information is just as useful as a basic post. With growing use, social media analytics help track the volume and impact of content. Additionally, platforms are leveraging each other for uses, including Twitter, blogs, web radio, and recorded video and images. We explore information on social media resources and applications from varying perspectives. While these platforms will evolve over time, or disappear with new platforms taking their place, it is apparent they are now a part of the everyday experience of oncology communication. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. A Versatile Integrated Ambient Ionization Source Platform

    Science.gov (United States)

    Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei

    2018-04-01

    The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines a flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and the laser desorption (LD) technique. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.

  13. Mobile Technology: Binding Social and Cloud into a New Enterprise Applications Platform

    Directory of Open Access Journals (Sweden)

    Luminita HURBEAN

    2013-01-01

    Full Text Available Nowadays, the IT industry is revolving around the build-out and adoption of a new platform, characterized by mobility, cloud-based application and service delivery, and value-generating overlays of social business and pervasive analytics. The paper explores the convergence of mobile, cloud, and social, as well as the effects on the enterprise and the emergence of the new enterprise application platforms. In the beginning we set the stage, showing the expansion of mobile, cloud, and social in the business information system, as found in the literature. We then look over the IT trends, especially the consumerization of IT, as reasons and basis for information systems' embrace of mobile. Afterwards, we present a mobility roadmap for the enterprise and illustrate the reconfiguration of the enterprise application platform.

  14. Ontology-Based Platform for Conceptual Guided Dataset Analysis

    KAUST Repository

    Rodriguez-Garcia, Miguel Angel

    2016-05-31

    Organizations nowadays must handle a huge amount of both internal and external data from structured, semi-structured, and unstructured sources. This constitutes a major challenge (and also an opportunity) for current Business Intelligence solutions. The complexity and effort required to analyse such a plethora of data imply considerable execution times. Besides, the large number of data analysis methods and techniques prevents domain experts (laymen from an IT-assisted analytics perspective) from fully exploiting their potential, while technology experts lack the business background to ask the proper questions. In this work, we present a semantically-boosted platform for assisting layman users in (i) extracting a relevant subdataset from all the data, and (ii) selecting the data analysis technique(s) best suited for scrutinising that subdataset. The outcome is better answers in significantly less time. The platform has been evaluated in the music domain with promising results.

  15. Development of a NanoBioAnalytical platform for "on-chip" qualification and quantification of platelet-derived microparticles.

    Science.gov (United States)

    Obeid, Sameh; Ceroi, Adam; Mourey, Guillaume; Saas, Philippe; Elie-Caille, Celine; Boireau, Wilfrid

    2017-07-15

    Blood microparticles (MPs) are small membrane vesicles (50–1000 nm) derived from different cell types. They are known to play important roles in various biological processes and are also recognized as potential biomarkers of various health disorders. Different methods are currently used for the detection and characterization of MPs, but none of these methods is capable of quantifying and qualifying total MPs at the same time; hence, there is a need to develop a new approach for simultaneous detection, characterization and quantification of microparticles. Here we show the potential of the surface plasmon resonance (SPR) method coupled to atomic force microscopy (AFM) to quantify and qualify platelet-derived microparticles (PMPs) across the whole nano- to micrometer scale. The different subpopulations of microparticles could be determined via their capture onto the surface using specific ligands. In order to verify the correlation between the capture level and the microparticle concentration in solution, two calibration standards were used: Virus-Like Particles (VLPs) and synthetic beads with a mean diameter of 53 nm and 920 nm, respectively. The AFM analysis of the biochip surface allowed metrological analysis of captured PMPs and revealed that more than 95% of PMPs were smaller than 300 nm. Our results suggest that our NanoBioAnalytical platform, combining SPR and AFM, is a suitable method for sensitive, reproducible, label-free characterization and quantification of MPs over a wide concentration range (≈10⁷ to 10¹² particles/mL; with a limit of detection (LOD) in the lowest ng/µL range), which matches their typical concentrations in blood. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. A differential mobility spectrometry/mass spectrometry platform for the rapid detection and quantitation of DNA adduct dG-ABP.

    Science.gov (United States)

    Kafle, Amol; Klaene, Joshua; Hall, Adam B; Glick, James; Coy, Stephen L; Vouros, Paul

    2013-07-15

    There is continued interest in exploring new analytical technologies for the detection and quantitation of DNA adducts, biomarkers which provide direct evidence of exposure and genetic damage in cells. With the goal of reducing clean-up steps and improving sample throughput, a Differential Mobility Spectrometry/Mass Spectrometry (DMS/MS) platform has been introduced for adduct analysis. A DMS/MS platform has been utilized for the analysis of dG-ABP, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl (4-ABP). After optimization of the DMS parameters, each sample was analyzed in just 30 s following a simple protein precipitation step of the digested DNA. A detection limit of one modification in 10^6 nucleosides has been achieved using only 2 µg of DNA. A brief comparison (quantitative and qualitative) with liquid chromatography/mass spectrometry is also presented highlighting the advantages of using the DMS/MS method as a high-throughput platform. The data presented demonstrate the successful application of a DMS/MS/MS platform for the rapid quantitation of DNA adducts using, as a model analyte, the deoxyguanosine adduct of the bladder carcinogen 4-aminobiphenyl. Copyright © 2013 John Wiley & Sons, Ltd.

  17. Titanium Dioxide Nanoparticles (TiO₂) Quenching Based Aptasensing Platform: Application to Ochratoxin A Detection.

    Science.gov (United States)

    Sharma, Atul; Hayat, Akhtar; Mishra, Rupesh K; Catanante, Gaëlle; Bhand, Sunil; Marty, Jean Louis

    2015-09-22

    We demonstrate for the first time the development of a titanium dioxide nanoparticle (TiO₂) quenching-based aptasensing platform for the detection of target molecules. TiO₂ quenches the fluorescence of a FAM-labeled (fluorescein-labeled) aptamer upon non-covalent adsorption of the fluorescently labeled aptamer on the TiO₂ surface. When OTA interacts with the aptamer, it induces aptamer G-quadruplex complex formation, which weakens the interaction between the FAM-labeled aptamer and TiO₂, resulting in fluorescence recovery. As a proof of concept, the assay was employed for the detection of Ochratoxin A (OTA). Under optimized experimental conditions, the obtained limit of detection (LOD) was 1.5 nM, with good linearity in the range 1.5 nM to 1.0 µM for OTA. The results showed the high selectivity of the assay towards OTA, without interference from the structurally similar analogue Ochratoxin B (OTB). The developed aptamer assay was evaluated for the detection of OTA in a beer sample, and recoveries were recorded in the range of 94.30%-99.20%. The analytical figures of merit of the developed aptasensing platform confirmed its applicability to real-sample analysis. This is, however, a generic aptasensing platform that can be extended to the detection of other toxins or target analytes.

  18. Web-based Visual Analytics for Extreme Scale Climate Science

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Evans, Katherine J [ORNL; Harney, John F [ORNL; Jewell, Brian C [ORNL; Shipman, Galen M [ORNL; Smith, Brian E [ORNL; Thornton, Peter E [ORNL; Williams, Dean N. [Lawrence Livermore National Laboratory (LLNL)

    2014-01-01

    In this paper, we introduce a Web-based visual analytics framework for democratizing advanced visualization and analysis capabilities pertinent to large-scale earth system simulations. We address significant limitations of present climate data analysis tools such as tightly coupled dependencies, inefficient data movements, complex user interfaces, and static visualizations. Our Web-based visual analytics framework removes critical barriers to the widespread accessibility and adoption of advanced scientific techniques. Using distributed connections to back-end diagnostics, we minimize data movements and leverage HPC platforms. We also mitigate system dependency issues by employing a RESTful interface. Our framework embraces the visual analytics paradigm via new visual navigation techniques for hierarchical parameter spaces, multi-scale representations, and interactive spatio-temporal data mining methods that retain details. Although generalizable to other science domains, the current work focuses on improving exploratory analysis of large-scale Community Land Model (CLM) and Community Atmosphere Model (CAM) simulations.

  19. Evaluation of analytical performance of a new high-sensitivity immunoassay for cardiac troponin I.

    Science.gov (United States)

    Masotti, Silvia; Prontera, Concetta; Musetti, Veronica; Storti, Simona; Ndreu, Rudina; Zucchelli, Gian Carlo; Passino, Claudio; Clerico, Aldo

    2018-02-23

    The study aim was to evaluate and compare the analytical performance of the new chemiluminescent immunoassay for cardiac troponin I (cTnI), called Access hs-cTnI and run on the DxI platform, with that of the Access AccuTnI+3 method and the high-sensitivity (hs) cTnI method for the ARCHITECT platform. The limits of blank (LoB), detection (LoD) and quantitation (LoQ) at 10% and 20% CV were evaluated according to international standardized protocols. For the evaluation of analytical performance and the comparison of cTnI results, both heparinized plasma samples, collected from healthy subjects and patients with cardiac diseases, and quality control samples distributed in external quality assessment programs were used. The LoB, LoD, and LoQ at 20% and 10% CV of the Access hs-cTnI method were 0.6, 1.3, 2.1 and 5.3 ng/L, respectively. The Access hs-cTnI method showed analytical performance significantly better than that of the Access AccuTnI+3 method and results similar to those of the hs ARCHITECT cTnI method. Moreover, the cTnI concentrations measured with the Access hs-cTnI method showed close linear regressions with both the Access AccuTnI+3 and ARCHITECT hs-cTnI methods, although there were systematic differences between these methods. There was no difference between cTnI values measured by Access hs-cTnI in heparinized plasma and serum samples, whereas there was a significant difference between cTnI values measured in EDTA and heparin plasma samples, respectively. Access hs-cTnI has analytical sensitivity parameters significantly improved compared with the Access AccuTnI+3 method and similar to those of the high-sensitivity method on the ARCHITECT platform.
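
    For reference, LoB and LoD are commonly derived from blank and low-concentration replicates using the parametric, CLSI EP17-style estimates sketched below; this is an assumption about the general approach, not a reproduction of the study's exact protocol, and the replicate values are hypothetical.

```python
import statistics

def limit_of_blank(blank_results):
    """LoB = mean of blank replicates + 1.645 * SD of blank replicates."""
    return statistics.mean(blank_results) + 1.645 * statistics.stdev(blank_results)

def limit_of_detection(lob, low_sample_results):
    """LoD = LoB + 1.645 * SD of a low-concentration sample."""
    return lob + 1.645 * statistics.stdev(low_sample_results)

# Hypothetical cTnI replicates in ng/L (not data from the study).
blanks = [0.2, 0.4, 0.3, 0.5, 0.1, 0.3]
low_samples = [1.1, 1.4, 0.9, 1.6, 1.2, 1.3]

lob = limit_of_blank(blanks)
print(f"LoB = {lob:.2f} ng/L, LoD = {limit_of_detection(lob, low_samples):.2f} ng/L")
```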

  20. A GPU code for analytic continuation through a sampling method

    Directory of Open Access Journals (Sweden)

    Johan Nordström

    2016-01-01

    Full Text Available Here we present a code for performing analytic continuation of fermionic Green's functions and self-energies, as well as bosonic susceptibilities, on a graphics processing unit (GPU). The code is based on the sampling method introduced by Mishchenko et al. (2000), and is written for the widely used CUDA platform from NVidia. Detailed scaling tests are presented, for two different GPUs, in order to highlight the advantages of this code with respect to standard CPU computations. Finally, as an example of possible applications, we provide the analytic continuation of model Gaussian functions, as well as more realistic test cases from many-body physics.
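
    The underlying inverse problem that such sampling codes address, stated here in one common convention for fermions (the abstract itself does not spell it out, and sign conventions vary), is the recovery of a non-negative spectral function A(ω) from imaginary-time data:

```latex
G(\tau) \;=\; \int_{-\infty}^{\infty} \mathrm{d}\omega \, \frac{e^{-\tau\omega}}{1 + e^{-\beta\omega}} \, A(\omega),
\qquad 0 \le \tau < \beta ,
```

    a numerically ill-posed inversion, which is why stochastic sampling over candidate spectra is employed.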

  1. Development and assessment of molecular diagnostic tests for 15 enteropathogens causing childhood diarrhoea: a multicentre study.

    Science.gov (United States)

    Liu, Jie; Kabir, Furqan; Manneh, Jainaba; Lertsethtakarn, Paphavee; Begum, Sharmin; Gratz, Jean; Becker, Steve M; Operario, Darwin J; Taniuchi, Mami; Janaki, Lalitha; Platts-Mills, James A; Haverstick, Doris M; Kabir, Mamun; Sobuz, Shihab U; Nakjarung, Kaewkanya; Sakpaisal, Pimmada; Silapong, Sasikorn; Bodhidatta, Ladaporn; Qureshi, Shahida; Kalam, Adil; Saidi, Queen; Swai, Ndealilia; Mujaga, Buliga; Maro, Athanasia; Kwambana, Brenda; Dione, Michel; Antonio, Martin; Kibiki, Gibson; Mason, Carl J; Haque, Rashidul; Iqbal, Najeeha; Zaidi, Anita K M; Houpt, Eric R

    2014-08-01

    Childhood diarrhoea can be caused by many pathogens that are difficult to assay in the laboratory. Molecular diagnostic techniques provide a uniform method to detect and quantify candidate enteropathogens. We aimed to develop and assess molecular tests for identification of enteropathogens and their association with disease. We developed and assessed molecular diagnostic tests for 15 enteropathogens across three platforms (PCR-Luminex, multiplex real-time PCR, and TaqMan array card) at five laboratories worldwide. We judged the analytical and clinical performance of these molecular techniques against comparator methods (bacterial culture, ELISA, and PCR) using 867 diarrhoeal and 619 non-diarrhoeal stool specimens. We also measured molecular quantities of pathogens to predict the association with diarrhoea, by univariate logistic regression analysis. The molecular tests showed very good analytical and clinical performance at all five laboratories. Comparator methods had limited sensitivity compared with the molecular techniques (20-85% depending on the target) but good specificity (median 97·3%, IQR 96·5-98·9; mean 95·2%, SD 9·1). Positive samples by comparator methods usually had higher molecular quantities of pathogens than did negative samples, across almost all platforms and for most pathogens. Molecular diagnostic tests can be implemented successfully and with fidelity across laboratories around the world. In the case of diarrhoea, these techniques can detect pathogens with high sensitivity and ascribe diarrhoeal associations based on quantification, including in mixed infections, providing rich and unprecedented measurements of infectious causes. Bill & Melinda Gates Foundation Next Generation Molecular Diagnostics Project. Copyright © 2014 Elsevier Ltd. All rights reserved.
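
    A small illustrative sketch (hypothetical numbers, not study data) of the univariate logistic regression step described above, relating a pathogen's molecular quantity to case/control status:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical per-specimen pathogen quantities and diarrhoea status (1 = case, 0 = control).
quantity = np.array([0.0, 0.5, 1.2, 2.5, 3.0, 3.8, 4.5, 5.2, 0.1, 0.8, 1.5, 4.9])
is_case = np.array([0, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 1])

X = sm.add_constant(quantity)                  # intercept plus a single predictor
model = sm.Logit(is_case, X).fit(disp=False)   # univariate logistic regression
odds_ratio = float(np.exp(model.params[1]))    # odds ratio per unit increase in quantity

print(model.summary())
print("odds ratio per unit quantity:", round(odds_ratio, 2))
```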

  2. A LabVIEW®-based software for the control of the AUTORAD platform. A fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis

    International Nuclear Information System (INIS)

    Barbesi, Donato; Vilas, Victor Vicente; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Heras, Laura Aldave de las

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste. (author)

  3. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
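
    The command/confirmation pattern over USB/RS232 described above can be illustrated with a short serial-link sketch in Python (pyserial). The port name, baud rate and the command vocabulary ("TRIG DET1", "OK"/"ERR") are purely hypothetical and are not the actual AUTORAD protocol.

```python
import serial  # pyserial

# Send an ASCII command to an Arduino-style controller and wait for a reply.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as link:
    link.write(b"TRIG DET1\n")          # request: trigger detector 1 (illustrative command)
    reply = link.readline().decode(errors="replace").strip()
    if reply.startswith("OK"):
        print("command acknowledged:", reply)
    else:
        print("controller reported a problem:", reply or "<no response>")
```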

  4. Platform capitalism: The intermediation and capitalization of digital economic circulation

    Directory of Open Access Journals (Sweden)

    Paul Langley

    2017-10-01

    Full Text Available A new form of digital economic circulation has emerged, wherein ideas, knowledge, labour and use rights for otherwise idle assets move between geographically distributed but connected and interactive online communities. Such circulation is apparent across a number of digital economic ecologies, including social media, online marketplaces, crowdsourcing, crowdfunding and other manifestations of the so-called ‘sharing economy’. Prevailing accounts deploy concepts such as ‘co-production’, ‘prosumption’ and ‘peer-to-peer’ to explain digital economic circulation as networked exchange relations characterised by their disintermediated, collaborative and democratising qualities. Building from the neologism of platform capitalism, we place ‘the platform’ – understood as a distinct mode of socio-technical intermediary and business arrangement that is incorporated into wider processes of capitalisation – at the centre of the critical analysis of digital economic circulation. To create multi-sided markets and coordinate network effects, platforms enrol users through a participatory economic culture and mobilise code and data analytics to compose immanent infrastructures. Platform intermediation is also nested in the ex-post construction of a replicable business model. Prioritising rapid up-scaling and extracting revenues from circulations and associated data trails, the model performs the structure of venture capital investment which capitalises on the potential of platforms to realise monopoly rents.

  5. BlockSci: Design and applications of a blockchain analysis platform

    OpenAIRE

    Kalodner, Harry; Goldfeder, Steven; Chator, Alishah; Möser, Malte; Narayanan, Arvind

    2017-01-01

    Analysis of blockchain data is useful for both scientific research and commercial applications. We present BlockSci, an open-source software platform for blockchain analysis. BlockSci is versatile in its support for different blockchains and analysis tasks. It incorporates an in-memory, analytical (rather than transactional) database, making it several hundred times faster than existing tools. We describe BlockSci's design and present four analyses that illustrate its capabilities. This is a ...

  6. Market implementation of the MVA platform for pre-pandemic and pandemic influenza vaccines: A quantitative key opinion leader analysis.

    Science.gov (United States)

    Ramezanpour, Bahar; Pronker, Esther S; Kreijtz, Joost H C M; Osterhaus, Albert D M E; Claassen, E

    2015-08-20

    A quantitative method is presented to rank strengths, weaknesses, opportunities, and threats (SWOT) of modified vaccinia virus Ankara (MVA) as a platform for pre-pandemic and pandemic influenza vaccines. The analytic hierarchy process (AHP) was applied to achieve pairwise comparisons among SWOT factors in order to prioritize them. Key opinion leaders (KOLs) in the influenza vaccine field were interviewed to collect a unique dataset to evaluate the market potential of this platform. The purpose of this study, to evaluate the commercial potential of the MVA platform for the development of novel-generation pandemic influenza vaccines, is accomplished by using a combined SWOT and AHP analytic method. Application of the SWOT-AHP model indicates that the platform's strengths are considered more important by KOLs than its weaknesses, opportunities, and threats. In particular, the inherent immunogenicity of MVA without the requirement of an adjuvant is the most important factor increasing the commercial attractiveness of this platform. Concerns regarding vector vaccines and anti-vector immunity are considered its most important weakness, which might lower the public health value of this platform. Furthermore, the results emphasize the equally important roles that the threats to and opportunities of this platform play. This study further highlights unmet needs in the influenza vaccine market, which could be addressed by the implementation of the MVA platform. Broad use of MVA in clinical trials shows great promise for this vector as a vaccine platform for pre-pandemic and pandemic influenza and for threats posed by other respiratory viruses. Moreover, the results of the clinical trials suggest that MVA is particularly attractive for the development of vaccines against pathogens for which no vaccines, or only insufficiently effective ones, are available. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
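
    The AHP step the abstract relies on (pairwise comparisons turned into priority weights) can be sketched in a few lines of NumPy. The comparison matrix below is invented for illustration and is not the KOL data from the study; the priority vector is taken as the principal eigenvector and checked with Saaty's consistency ratio.

```python
import numpy as np

# Reciprocal pairwise-comparison matrix for the four SWOT groups (illustrative values).
labels = ["Strengths", "Weaknesses", "Opportunities", "Threats"]
M = np.array([
    [1,   3,   2,   2],
    [1/3, 1,   1/2, 1/2],
    [1/2, 2,   1,   1],
    [1/2, 2,   1,   1],
])

eigvals, eigvecs = np.linalg.eig(M)
k = np.argmax(eigvals.real)          # principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalised priority vector

lam_max = eigvals.real[k]
n = M.shape[0]
CI = (lam_max - n) / (n - 1)
CR = CI / 0.90                        # Saaty random index for n = 4
for name, weight in zip(labels, w):
    print(f"{name:14s} {weight:.3f}")
print(f"consistency ratio = {CR:.3f} (acceptable if < 0.1)")
```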

  7. Payment Platform

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper...... applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulate a multilevel conceptual...

  8. Liquid chromatography-mass spectrometry platform for both small neurotransmitters and neuropeptides in blood, with automatic and robust solid phase extraction

    Science.gov (United States)

    Johnsen, Elin; Leknes, Siri; Wilson, Steven Ray; Lundanes, Elsa

    2015-03-01

    Neurons communicate via chemical signals called neurotransmitters (NTs). The numerous identified NTs can have very different physicochemical properties (solubility, charge, size etc.), so quantification of the various NT classes traditionally requires several analytical platforms/methodologies. We here report that a diverse range of NTs, e.g. the peptides oxytocin and vasopressin, the monoamines adrenaline and serotonin, and the amino acid GABA, can be simultaneously identified/measured in small samples, using an analytical platform based on liquid chromatography and high-resolution mass spectrometry (LC-MS). The automated platform is cost-efficient as manual sample preparation steps and one-time-use equipment are kept to a minimum. Zwitter-ionic HILIC stationary phases were used for both on-line solid phase extraction (SPE) and liquid chromatography (capillary format, cLC). This approach enabled compounds from all NT classes to elute in small volumes, producing sharp and symmetric signals and allowing precise quantification of small samples, demonstrated with whole blood (100 microliters per sample). An additional robustness-enhancing feature is automatic filtration/filter back-flushing (AFFL), allowing hundreds of samples to be analyzed without any parts needing replacement. The platform can be installed by simple modification of a conventional LC-MS system.

  9. Microfluidic platform for efficient Nanodisc assembly, membrane protein incorporation, and purification.

    Science.gov (United States)

    Wade, James H; Jones, Joshua D; Lenov, Ivan L; Riordan, Colleen M; Sligar, Stephen G; Bailey, Ryan C

    2017-08-22

    The characterization of integral membrane proteins presents numerous analytical challenges on account of their poor activity under non-native conditions, limited solubility in aqueous solutions, and low expression in most cell culture systems. Nanodiscs are synthetic model membrane constructs that offer many advantages for studying membrane protein function by offering a native-like phospholipid bilayer environment. The successful incorporation of membrane proteins within Nanodiscs requires experimental optimization of conditions. Standard protocols for Nanodisc formation can require large amounts of time and input material, limiting the facile screening of formation conditions. Capitalizing on the miniaturization and efficient mass transport inherent to microfluidics, we have developed a microfluidic platform for efficient Nanodisc assembly and purification, and demonstrated the ability to incorporate functional membrane proteins into the resulting Nanodiscs. In addition to working with reduced sample volumes, this platform simplifies membrane protein incorporation from a multi-stage protocol requiring several hours or days into a single platform that outputs purified Nanodiscs in less than one hour. To demonstrate the utility of this platform, we incorporated Cytochrome P450 into Nanodiscs of variable size and lipid composition, and present spectroscopic evidence for the functional active site of the membrane protein. This platform is a promising new tool for membrane protein biology and biochemistry that enables tremendous versatility for optimizing the incorporation of membrane proteins using microfluidic gradients to screen across diverse formation conditions.

  10. Titanium Dioxide Nanoparticles (TiO2) Quenching-Based Aptasensing Platform: Application to Ochratoxin A Detection

    Directory of Open Access Journals (Sweden)

    Atul Sharma

    2015-09-01

    Full Text Available We demonstrate, for the first time, the development of a titanium dioxide nanoparticle (TiO2) quenching-based aptasensing platform for the detection of target molecules. TiO2 quenches the fluorescence of a FAM-labeled (fluorescein-labeled) aptamer upon non-covalent adsorption of the fluorescently labeled aptamer on the TiO2 surface. When OTA interacts with the aptamer, it induces formation of an aptamer G-quadruplex complex, which weakens the interaction between the FAM-labeled aptamer and TiO2, resulting in fluorescence recovery. As a proof of concept, the assay was employed for detection of Ochratoxin A (OTA). Under optimized experimental conditions, the obtained limit of detection (LOD) was 1.5 nM, with good linearity in the range 1.5 nM to 1.0 µM for OTA. The obtained results showed the high selectivity of the assay towards OTA, without interference from the structurally similar analogue Ochratoxin B (OTB). The developed aptamer assay was evaluated for detection of OTA in beer samples, and recoveries were recorded in the range from 94.30%–99.20%. The analytical figures of merit of the developed aptasensing platform confirmed its applicability to real sample analysis. However, this is a generic aptasensing platform and can be extended for detection of other toxins or target analytes.
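
    As a hedged illustration of how a fluorescence-recovery calibration like the one above is typically evaluated, the sketch below fits the linear range and estimates the limit of detection as 3·σ(blank)/slope. The concentrations, signals and blank standard deviation are invented and do not reproduce the paper's data.

```python
import numpy as np

# Illustrative OTA calibration: recovered fluorescence versus concentration.
conc = np.array([0.0015, 0.01, 0.05, 0.1, 0.5, 1.0])        # OTA, micromolar
signal = np.array([3.1, 12.8, 60.5, 118.0, 602.0, 1190.0])   # fluorescence (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)               # linear calibration fit
blank_sd = 1.2                                                # assumed std. dev. of blanks
lod = 3 * blank_sd / slope                                    # classic 3-sigma LOD estimate
print(f"slope = {slope:.1f} a.u. per uM, LOD ~ {lod * 1000:.2f} nM")
```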

  11. An automated robotic platform for rapid profiling oligosaccharide analysis of monoclonal antibodies directly from cell culture.

    Science.gov (United States)

    Doherty, Margaret; Bones, Jonathan; McLoughlin, Niaobh; Telford, Jayne E; Harmon, Bryan; DeFelippis, Michael R; Rudd, Pauline M

    2013-11-01

    Oligosaccharides attached to Asn297 in each of the CH2 domains of monoclonal antibodies play an important role in antibody effector functions by modulating the affinity of interaction with Fc receptors displayed on cells of the innate immune system. Rapid, detailed, and quantitative N-glycan analysis is required at all stages of bioprocess development to ensure the safety and efficacy of the therapeutic. The high sample numbers generated during quality by design (QbD) and process analytical technology (PAT) create a demand for high-performance, high-throughput analytical technologies for comprehensive oligosaccharide analysis. We have developed an automated 96-well plate-based sample preparation platform for high-throughput N-glycan analysis using a liquid handling robotic system. Complete process automation includes monoclonal antibody (mAb) purification directly from bioreactor media, glycan release, fluorescent labeling, purification, and subsequent ultra-performance liquid chromatography (UPLC) analysis. The entire sample preparation and commencement of analysis is achieved within a 5-h timeframe. The automated sample preparation platform can easily be interfaced with other downstream analytical technologies, including mass spectrometry (MS) and capillary electrophoresis (CE), for rapid characterization of oligosaccharides present on therapeutic antibodies. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Helios: Understanding Solar Evolution Through Text Analytics

    Energy Technology Data Exchange (ETDEWEB)

    Randazzese, Lucien [SRI International, Menlo Park, CA (United States)

    2016-12-02

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.
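
    A toy sketch of the kind of term-prominence analysis described above: score the salience of terms in small per-year corpora with TF-IDF and track how a term's weight changes over time. The mini-corpus, the years and the tracked term are fabricated for illustration and have no relation to the actual Helios data ingest.

```python
from sklearn.feature_extraction.text import TfidfVectorizer

# One tiny "document" per year, standing in for a year's worth of abstracts.
corpus_by_year = {
    2005: "cadmium telluride thin film deposition improves module efficiency",
    2010: "dye sensitized solar cells with ruthenium dyes reach record efficiency",
    2015: "multi junction solar cells and perovskite layers push conversion efficiency",
}

docs = list(corpus_by_year.values())
vec = TfidfVectorizer()
tfidf = vec.fit_transform(docs)
vocab = list(vec.get_feature_names_out())

term = "efficiency"
col = vocab.index(term)
for year, row in zip(corpus_by_year, tfidf.toarray()):
    print(year, f"{term}: {row[col]:.3f}")
```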

  13. New analytical methods for quality control of St. John's wort

    International Nuclear Information System (INIS)

    Huck-Pezzei, V.

    2013-01-01

    In the present work, a novel analytical platform is introduced, which enables both analysis and quality control of St. John's wort extracts and tissue. The synergistic combination of separation techniques (including thin-layer chromatography (TLC) and high-performance liquid chromatography (HPLC)) with mass spectrometry (MS) and vibrational spectroscopy is demonstrated to gain deeper insight into the composition of the ingredients. TLC was successfully employed to identify some unknown ingredients present in samples of Chinese provenance. The novel HPLC method described here allowed a clear differentiation between European and Chinese samples on the one hand; on the other hand, this method could successfully be employed for the semi-preparative isolation of the unknown ingredient. Matrix-free laser desorption ionization time-of-flight mass spectrometry (mf-LDI-TOF/MS) using a specially designed titanium oxide layer was employed to identify the structure of the substance. The analytical knowledge generated so far was used to establish an infrared spectroscopic model allowing both quantitative analysis of ingredients and differentiation between European and Chinese provenance. Finally, infrared imaging spectroscopy was conducted to obtain a highly resolved picture of the distribution of ingredients. The analytical platform established can be used for fast and non-destructive quantitation and quality control to identify adulteration, which is of interest according to the Deutsche Arzneimittel Codex (DAC), even for the phytopharmaceutical industry. (author) [de

  14. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale

    International Nuclear Information System (INIS)

    Magnoni, L; Cordeiro, C; Georgiou, M; Andreeva, J; Suthakar, U; Khan, A; Smith, D R

    2015-01-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks, to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, process and serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group at the CERN IT department, which uses a variety of technologies, each one targeting specific aspects of big-scale distributed data processing (commonly referred to as the lambda-architecture approach). Results of data processing on Hadoop for WLCG data activities monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file formats (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda-architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof-of-concept implementation, based on Apache Spark and Esper, for the real-time part, which compensates for batch-processing latency and automates the detection of problems and failures. (paper)

  15. Monitoring WLCG with lambda-architecture: a new scalable data store and analytics platform for monitoring at petabyte scale.

    Science.gov (United States)

    Magnoni, L.; Suthakar, U.; Cordeiro, C.; Georgiou, M.; Andreeva, J.; Khan, A.; Smith, D. R.

    2015-12-01

    Monitoring the WLCG infrastructure requires the gathering and analysis of a high volume of heterogeneous data (e.g. data transfers, job monitoring, site tests) coming from different services and experiment-specific frameworks, to provide a uniform and flexible interface for scientists and sites. The current architecture, where relational database systems are used to store, process and serve monitoring data, has limitations in coping with the foreseen increase in the volume (e.g. higher LHC luminosity) and the variety (e.g. new data-transfer protocols and new resource types, such as cloud computing) of WLCG monitoring events. This paper presents a new scalable data store and analytics platform designed by the Support for Distributed Computing (SDC) group at the CERN IT department, which uses a variety of technologies, each one targeting specific aspects of big-scale distributed data processing (commonly referred to as the lambda-architecture approach). Results of data processing on Hadoop for WLCG data activities monitoring are presented, showing how the new architecture can easily analyze hundreds of millions of transfer logs in a few minutes. Moreover, a comparison of data partitioning, compression and file formats (e.g. CSV, Avro) is presented, with particular attention given to how the file structure impacts the overall MapReduce performance. In conclusion, the evolution of the current implementation, which focuses on data storage and batch processing, towards a complete lambda-architecture is discussed, with consideration of candidate technology for the serving layer (e.g. Elasticsearch) and a description of a proof-of-concept implementation, based on Apache Spark and Esper, for the real-time part, which compensates for batch-processing latency and automates the detection of problems and failures.
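
    The batch-layer idea of aggregating very large numbers of transfer logs can be sketched with PySpark as follows. The HDFS path and the column names (src_site, dst_site, bytes, duration_s) are assumptions for illustration and are not the actual WLCG monitoring schema.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Roll up transfer-log records by source/destination site on a Spark cluster.
spark = SparkSession.builder.appName("transfer-log-rollup").getOrCreate()

logs = spark.read.csv("hdfs:///monitoring/transfers/*.csv", header=True, inferSchema=True)

summary = (
    logs.groupBy("src_site", "dst_site")
        .agg(
            F.count("*").alias("n_transfers"),
            F.sum("bytes").alias("total_bytes"),
            F.avg(F.col("bytes") / F.col("duration_s")).alias("mean_rate_Bps"),
        )
        .orderBy(F.desc("total_bytes"))
)
summary.show(20, truncate=False)
```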

  16. Evaluating the enhancement and improvement of China's technology and financial services platform innovation strategy.

    Science.gov (United States)

    Wu, Ching-Sung; Hu, Kuang-Hua; Chen, Fu-Hsiang

    2016-01-01

    The development of the high-tech industry has been prosperous around the world in past decades, and technology and finance have become among the most significant issues of the information era. High-tech firms are a major force behind a country's economic development, but their development requires substantial capital and is often hampered by financing difficulties; thus, how to evaluate and establish an appropriate technology and financial services platform innovation strategy has become one of the most critical and difficult issues. Moreover, how the intertwined financial environment can be optimized so that high-tech firms' financing problems can be resolved has seldom been addressed. This research therefore aims to establish a technology and financial services platform innovation strategy improvement model, based on a hybrid MADM model, which addresses the main causal factors and amended priorities in order to strengthen ongoing planning. A DEMATEL technique based on the Analytic Network Process, together with a modified VIKOR method, is proposed for selecting and re-configuring the aspired technology and financial services platform. An empirical study based on China's technology and financial services platform innovation strategy is provided to verify the effectiveness of the proposed methodology. Based on expert interviews, improvement of the technology and financial services platform innovation strategy should be made in the following order: credit guarantee platform (C), followed by the credit rating platform (B), and then the investment and finance platform (A).

  17. An integrated platform for gas-diffusion separation and electrochemical determination of ethanol on fermentation broths

    Energy Technology Data Exchange (ETDEWEB)

    Giordano, Gabriela Furlan [Microfabrication Laboratory, Brazilian Nanotechnology National Laboratory (LNNano), Brazilian Center for Research in Energy and Materials (CNPEM), Campinas, SP 13083-970 (Brazil); Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil); National Institute of Science and Technology of Bioanalytics, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil); Vieira, Luis Carlos Silveira; Gobbi, Angelo Luiz [Microfabrication Laboratory, Brazilian Nanotechnology National Laboratory (LNNano), Brazilian Center for Research in Energy and Materials (CNPEM), Campinas, SP 13083-970 (Brazil); Lima, Renato Sousa [Microfabrication Laboratory, Brazilian Nanotechnology National Laboratory (LNNano), Brazilian Center for Research in Energy and Materials (CNPEM), Campinas, SP 13083-970 (Brazil); Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil); National Institute of Science and Technology of Bioanalytics, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil); Kubota, Lauro Tatsuo, E-mail: kubota@iqm.unicamp.br [Department of Analytical Chemistry, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil); National Institute of Science and Technology of Bioanalytics, Institute of Chemistry – UNICAMP, Campinas, SP 13083-970 (Brazil)

    2015-05-22

    Highlights: • An integrated platform was developed to determine ethanol in fermentation broths. • The designed system integrates gas-diffusion separation with voltammetric detection. • The detector relied on a Ni(OH)2-modified electrode stabilized by Co2+ and Cd2+ insertion. • Separation was achieved with a PTFE membrane separating the sample from the electrolyte (receptor). • Despite the sample complexity, accurate tests were achieved by direct interpolation. - Abstract: An integrated platform was developed for point-of-use determination of ethanol in sugar cane fermentation broths. Such analysis is important because excess ethanol alters the alcoholic fermentation step and thereby reduces fuel production efficiency. The custom-designed platform integrates gas-diffusion separation with voltammetric detection in a single analysis module. The detector relied on a Ni(OH)2-modified electrode, which was stabilized by uniformly depositing cobalt and cadmium hydroxides, as shown by XPS measurements. These tests were in accordance with the hypothesis that the Ni(OH)2 structure is stabilized by insertion of Co2+ and Cd2+ ions. The separation step, in turn, was based on a hydrophobic PTFE membrane, which separates the sample from the receptor solution (electrolyte) in which the electrodes were placed. The limit of detection and the analytical sensitivity were estimated to be 0.2% v/v and 2.90 μA %(v/v)−1, respectively. Samples of fermentation broth were analyzed both by the standard addition method and by direct interpolation on an analytical curve prepared in saline medium. In the latter case, the saline solution had an ionic strength similar to that of the samples, intended to overcome the colligative (tonometric) effect of the samples on the analyte concentration data by attributing the reduction in the quantity of diffused ethanol vapor mainly to the electrolyte. The analytical-curve approach provided rapid, simple and accurate analyses.
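
    The standard addition method mentioned above can be illustrated with a short calculation: the sample is spiked with known amounts of ethanol, the response is fitted linearly, and the original concentration is recovered as intercept/slope (the magnitude of the x-intercept). The numbers below are invented for illustration.

```python
import numpy as np

# Spiked ethanol levels and the corresponding (hypothetical) voltammetric responses.
added = np.array([0.0, 2.0, 4.0, 6.0])         # % v/v ethanol added
current = np.array([14.5, 43.2, 72.1, 100.8])   # peak current, microamps

slope, intercept = np.polyfit(added, current, 1)
c_sample = intercept / slope                     # % v/v in the unspiked sample
print(f"estimated ethanol content: {c_sample:.2f} % v/v")
```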

  18. Product Platform Performance

    DEFF Research Database (Denmark)

    Munk, Lone

    The aim of this research is to improve understanding of platform-based product development by studying platform performance in relation to internal effects in companies. Platform-based product development makes it possible to deliver product variety and at the same time reduce the needed resources...... engaging in platform-based product development. Similarly platform assessment criteria lack empirical verification regarding relevance and sufficiency. The thesis focuses on • the process of identifying and estimating internal effects, • verification of performance of product platforms, (i...... experienced representatives from the different life systems phase systems of the platform products. The effects are estimated and modeled within different scenarios, taking into account financial and real option aspects. The model illustrates and supports estimation and quantification of internal platform...

  19. Microfluidic Platform for Enzyme-Linked and Magnetic Particle-Based Immunoassay

    Directory of Open Access Journals (Sweden)

    Dorota G. Pijanowska

    2013-06-01

    Full Text Available This article presents the design and testing of a microfluidic platform for immunoassay. The method is based on a sandwich ELISA, whereby the primary antibody is immobilized on nitrocellulose and, subsequently, magnetic beads are used as a label to detect the analyte. The chip takes approximately 2 h and 15 min to complete the assay. A Hall effect sensor fabricated in 0.35-μm BioMEMS TSMC technology (Taiwan Semiconductor Manufacturing Company Bio-Micro-Electro-Mechanical Systems) was used to sense the magnetic field from the beads. Furthermore, fluorescence detection and absorbance measurements from the chip demonstrate successful immunoassay on the chip. In addition, the investigation also covers Hall effect simulations, mechanical modeling of the bead–protein complex, testing of the microfluidic platform with magnetic beads averaging 10 nm, and measurements with an inductor-based system.

  20. Evaluating the E-Learning Platform from the Perspective of Knowledge Management: The AHP Approach

    Directory of Open Access Journals (Sweden)

    I-Chin Wu

    2013-06-01

    Full Text Available A growing number of higher education institutions have adopted asynchronous and synchronous Web-based learning platforms to improve students’ learning efficiency and increase learning satisfaction in the past decade. Unlike traditional face-to-face learning methods, e-learning platforms allow teachers to communicate with students and discuss course content anytime or anywhere. In addition, the teaching material can be reused via the e-learning platforms. To understand how students use e-learning platforms and what the implications are, we conducted an empirical study of the iCAN e-learning platform, which has been widely used in Fu-Jen Catholic University since 2005. We use the Analytic Hierarchy Process (AHP), a well-known multi-criteria evaluation approach, to compare five practices, i.e. the functions of the iCAN teaching platform. We adopted a brainstorming approach to design a questionnaire to measure learners’ perception of the e-learning platform based on the theory of the knowledge-transforming process in knowledge management. Accordingly, the model considers functioning and objectivity in terms of the following three attributes of learning effectiveness: individual learning, group sharing and learning performance. Twelve criteria with twelve evaluation items were used to investigate the effectiveness of the five practices. We also evaluated the strengths and weaknesses of the functions based on the types of courses in the iCAN platform. We expect that the empirical evaluation results will provide teachers with suggestions and guidelines for using the e-learning platform effectively to facilitate their teaching activities and promote students’ learning efficiency and satisfaction.

  1. IBM Watson Analytics: Automating Visualization, Descriptive, and Predictive Statistics.

    Science.gov (United States)

    Hoyt, Robert Eugene; Snider, Dallas; Thompson, Carla; Mantravadi, Sarita

    2016-10-11

    We live in an era of explosive data generation that will continue to grow and involve all industries. One of the results of this explosion is the need for newer and more efficient data analytics procedures. Traditionally, data analytics required a substantial background in statistics and computer science. In 2015, International Business Machines Corporation (IBM) released the IBM Watson Analytics (IBMWA) software that delivered advanced statistical procedures based on the Statistical Package for the Social Sciences (SPSS). The latest entry of Watson Analytics into the field of analytical software products provides users with enhanced functions that are not available in many existing programs. For example, Watson Analytics automatically analyzes datasets, examines data quality, and determines the optimal statistical approach. Users can request exploratory, predictive, and visual analytics. Using natural language processing (NLP), users are able to submit additional questions for analyses in a quick response format. This analytical package is available free to academic institutions (faculty and students) that plan to use the tools for noncommercial purposes. The objective of this study was to report the features of IBMWA and discuss how this software subjectively and objectively compares to other data mining programs. The salient features of the IBMWA program were examined and compared with other common analytical platforms, using validated health datasets. Using a validated dataset, IBMWA delivered similar predictions compared with several commercial and open source data mining software applications. The visual analytics generated by IBMWA were similar to results from programs such as Microsoft Excel and Tableau Software. In addition, assistance with data preprocessing and data exploration was an inherent component of the IBMWA application. Sensitivity and specificity were not included in the IBMWA predictive analytics results, nor were odds ratios, confidence intervals, or a confusion matrix.
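
    For readers unfamiliar with the metrics the abstract notes are missing from the IBMWA output, the sketch below computes sensitivity, specificity and an odds ratio with a 95% confidence interval (Woolf log method) from a 2x2 confusion matrix. The counts are hypothetical.

```python
import math

# Hypothetical 2x2 confusion matrix counts.
tp, fn, fp, tn = 80, 20, 10, 90

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
odds_ratio = (tp * tn) / (fp * fn)

# Woolf (log) method for the 95% confidence interval of the odds ratio.
se_log_or = math.sqrt(1 / tp + 1 / fn + 1 / fp + 1 / tn)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"sensitivity = {sensitivity:.2f}, specificity = {specificity:.2f}")
print(f"odds ratio = {odds_ratio:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```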

  2. Automated Ground-Water Sampling and Analysis of Hexavalent Chromium using a “Universal” Sampling/Analytical System

    Directory of Open Access Journals (Sweden)

    Richard J. Venedam

    2005-02-01

    Full Text Available The capabilities of a “universal platform” for the deployment of analytical sensors in the field for long-term monitoring of environmental contaminants were expanded in this investigation. The platform was previously used to monitor trichloroethene in monitoring wells and at groundwater treatment systems (1,2). The platform was interfaced with chromium (VI) and conductivity analytical systems to monitor shallow wells installed adjacent to the Columbia River at the 100-D Area of the Hanford Site, Washington. A groundwater plume of hexavalent chromium is discharging into the Columbia River through the gravel beds used by spawning salmon. The sampling/analytical platform was deployed for the purpose of collecting data on subsurface hexavalent chromium concentrations at more frequent intervals than was possible with the previous sampling and analysis methods employed at the Site.

  3. The Platformization of the Web: Making Web Data Platform Ready

    NARCIS (Netherlands)

    Helmond, A.

    2015-01-01

    In this article, I inquire into Facebook’s development as a platform by situating it within the transformation of social network sites into social media platforms. I explore this shift with a historical perspective on, what I refer to as, platformization, or the rise of the platform as the dominant

  4. OpenHealth Platform for Interactive Contextualization of Population Health Open Data

    OpenAIRE

    Almeida, Jonas S; Hajagos, Janos; Crnosija, Ivan; Kurc, Tahsin; Saltz, Mary; Saltz, Joel

    2015-01-01

    The financial incentives for data science applications leading to improved health outcomes, such as DSRIP (bit.ly/dsrip), are well-aligned with the broad adoption of Open Data by State and Federal agencies. This creates entirely novel opportunities for analytical applications that make exclusive use of the pervasive Web Computing platform. The framework described here explores this new avenue to contextualize Health data in a manner that relies exclusively on the native JavaScript interpreter...

  5. Review on microfluidic paper-based analytical devices towards commercialisation.

    Science.gov (United States)

    Akyazi, Tugce; Basabe-Desmonts, Lourdes; Benito-Lopez, Fernando

    2018-02-25

    Paper-based analytical devices introduce an innovative platform technology for fluid handling and analysis, with a wide range of applications, promoting low cost, ease of fabrication/operation and equipment independence. This review gives a general overview of the fabrication techniques reported to date, revealing and discussing their weak points as well as the newest approaches intended to overcome current mass-production limitations and thereby enable commercialisation. Moreover, this review especially aims to highlight novel technologies appearing in the literature for the effective handling and control of fluids. The lack of flow control is the main problem of paper-based analytical devices; it creates obstacles for marketing and slows down the transition of paper devices from the laboratory into the consumers' hands. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems, have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide-area data-access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data, so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  7. Extending Climate Analytics-as-a-Service to the Earth System Grid Federation

    Science.gov (United States)

    Tamkin, G.; Schnase, J. L.; Duffy, D.; McInerney, M.; Nadeau, D.; Li, J.; Strong, S.; Thompson, J. H.

    2015-12-01

    We are building three extensions to prior-funded work on climate analytics-as-a-service that will benefit the Earth System Grid Federation (ESGF) as it addresses the Big Data challenges of future climate research: (1) We are creating a cloud-based, high-performance Virtual Real-Time Analytics Testbed supporting a select set of climate variables from six major reanalysis data sets. This near real-time capability will enable advanced technologies like the Cloudera Impala-based Structured Query Language (SQL) query capabilities and Hadoop-based MapReduce analytics over native NetCDF files while providing a platform for community experimentation with emerging analytic technologies. (2) We are building a full-featured Reanalysis Ensemble Service comprising monthly means data from six reanalysis data sets. The service will provide a basic set of commonly used operations over the reanalysis collections. The operations will be made accessible through NASA's climate data analytics Web services and our client-side Climate Data Services (CDS) API. (3) We are establishing an Open Geospatial Consortium (OGC) WPS-compliant Web service interface to our climate data analytics service that will enable greater interoperability with next-generation ESGF capabilities. The CDS API will be extended to accommodate the new WPS Web service endpoints as well as ESGF's Web service endpoints. These activities address some of the most important technical challenges for server-side analytics and support the research community's requirements for improved interoperability and improved access to reanalysis data.
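
    A hedged sketch of the kind of server-side reanalysis operation described above (a monthly climatology computed directly over NetCDF files) is shown below using xarray. The file pattern and the variable name "t2m" are assumptions; any reanalysis collection with a time dimension would be handled the same way.

```python
import xarray as xr

# Open a collection of reanalysis NetCDF files as one dataset along the time axis.
ds = xr.open_mfdataset("reanalysis/*.nc", combine="by_coords")

# Monthly climatology of 2-metre temperature (variable name assumed).
monthly_climatology = ds["t2m"].groupby("time.month").mean("time")
print(monthly_climatology)
```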

  8. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate throughout the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets are stored in this system in a write once/read many file system, such as Landsat, MODIS, MERRA, and NGA. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is being stored within a Hadoop Distributed File System (HDFS) enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations to dramatically speed up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.

  9. Dataset of aqueous humor cytokine profile in HIV patients with Cytomegalovirus (CMV) retinitis

    Directory of Open Access Journals (Sweden)

    Jayant Venkatramani Iyer

    2016-09-01

    Full Text Available The data show the aqueous humor cytokine profiling results acquired in a small cohort of 17 HIV patients clinically diagnosed with Cytomegalovirus (CMV) retinitis, obtained on the FlexMAP 3D (Luminex®) platform using the Milliplex Human Cytokine® kit. Aqueous humor samples were collected from these patients at different time points (pre-treatment and at 4-weekly intervals) through the 12-week course of intravitreal ganciclovir treatment, and 41 cytokine levels were analyzed at each time point. CMV DNA viral load was assessed in 8 patients at different time points throughout the course of ganciclovir treatment. The data described herein are related to the research article entitled “Aqueous humor immune factors and cytomegalovirus (CMV) levels in CMV retinitis through treatment - The CRIGSS study” (Iyer et al., 2016) [1]. Cytokine levels were analyzed against the different time points, which indicate the response to the given treatment, and against the CMV viral load. Keywords: Cytokines, CMV retinitis, Dataset, HIV, Luminex bead assay
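
    A minimal sketch of how a longitudinal Luminex cytokine table like the one described here can be summarised per time point is given below in pandas. The column names, cytokines and values are invented; the real dataset has 41 analytes and up to 17 patients.

```python
import pandas as pd

# Toy long-format table: one row per patient and treatment week.
df = pd.DataFrame({
    "patient": ["P01", "P01", "P01", "P02", "P02", "P02"],
    "week":    [0, 4, 8, 0, 4, 8],
    "IL-6":    [310.0, 120.0, 45.0, 510.0, 230.0, 80.0],    # pg/mL
    "IP-10":   [980.0, 400.0, 150.0, 1200.0, 600.0, 210.0],
})

# Median level of each cytokine at each treatment time point.
summary = df.groupby("week")[["IL-6", "IP-10"]].median()
print(summary)

# Per-patient fold change of IL-6 from baseline (week 0) to week 8.
baseline = df[df.week == 0].set_index("patient")["IL-6"]
week8 = df[df.week == 8].set_index("patient")["IL-6"]
print((week8 / baseline).rename("IL-6 fold change vs baseline"))
```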

  10. Renewable energy systems: A societal and technological platform

    Energy Technology Data Exchange (ETDEWEB)

    Polatidis, Heracles; Haralambopoulos, Dias A. [University of the Aegean, Mytilene (Greece). Department of Environment

    2007-02-15

    Today, the analysis of renewable energy places the emphasis on technological and economic attributes, with social and environmental impact assessment providing a rather static, narrow frame of analysis. The participation and response of social actors and other stakeholders is usually of a traditional type, with consultation documents and public meetings, collection of complaints and suggestion schemes. This often encourages parochialism and an over-concentration on relatively trivial issues. It is, therefore, imperative to establish a new participatory planning platform to incorporate the wider socio-economic aspects of renewable energy systems and to provide for an operational analytical decomposition of them. In this work the issue of decomposition analysis is clarified, and a new agenda for the societal and technological decomposition analysis of renewable energy systems is developed. A case study is presented to demonstrate the relevance of the established platform for integrated (renewable) energy systems planning. Innovative aspects comprise the simultaneous inclusion of decision analysis and social acceptance methods and tools, in concert with the related public participation techniques. (author)

  11. An Integrated Web-Based 3d Modeling and Visualization Platform to Support Sustainable Cities

    Science.gov (United States)

    Amirebrahimi, S.; Rajabifard, A.

    2012-07-01

    Sustainable development is seen as the key to preserving the sustainability of cities in the face of ongoing population growth and its negative impacts. This is complex and requires holistic and multidisciplinary decision making. A variety of stakeholders with different backgrounds also need to be considered and involved. Numerous web-based modeling and visualization tools have been designed and developed to support this process. There have been some success stories; however, the majority failed to deliver a comprehensive platform supporting the different aspects of sustainable development. In this work, in the context of SDI and Land Administration, the CSDILA Platform - a 3D visualization and modeling platform - was proposed, which can be used to model and visualize different dimensions to facilitate the achievement of sustainability, in particular in the urban context. The methodology involved the design of a generic framework for the development of an analytical and visualization tool over the web. The CSDILA Platform was then implemented with a number of technologies based on the guidelines provided by the framework. The platform has a modular structure and uses a Service-Oriented Architecture (SOA). It is capable of managing spatial objects in a 4D data store and can flexibly incorporate a variety of developed models using the platform's API. Development scenarios can be modeled and tested using the analysis and modeling component of the platform, and the results are visualized in a seamless 3D environment. The platform was further tested using a number of scenarios and showed promising results and potential to serve a wider need. In this paper, the design process of the generic framework, the implementation of the CSDILA Platform and the technologies used, as well as findings and future research directions, will be presented and discussed.

  12. Centrifugal micro-fluidic platform for radiochemistry: Potentialities for the chemical analysis of nuclear spent fuels

    International Nuclear Information System (INIS)

    Bruchet, Anthony; Mariet, Clarisse; Taniga, Velan; Descroix, Stephanie; Malaquin, Laurent; Goutelard, Florence

    2013-01-01

    The use of a centrifugal micro-fluidic platform is reported for the first time as an alternative to classical chromatographic procedures for radiochemistry. The original design of the micro-fluidic platform was conceived to hasten and simplify the prototyping process through the use of a circular platform integrating four rectangular microchips made of thermoplastic. The microchips, dedicated to anion-exchange chromatographic separations, integrate a localized monolithic stationary phase as well as injection and collection reservoirs. The results presented here were obtained with a simplified simulated nuclear spent fuel sample composed of non-radioactive isotopes of europium and uranium, in the proportions usually found in uranium oxide spent nuclear fuel. While keeping the analytical results consistent with the conventional procedure (extraction yield for europium of ∼97%), the use of the centrifugal micro-fluidic platform allowed the volume of liquid needed to be reduced by a factor of ∼250. Thanks to their unique 'easy-to-use' features, centrifugal micro-fluidic platforms are promising candidates for the down-scaling of chromatographic separations of radioactive samples (automation, multiplexing, easy integration in a glove-box environment and low cost of maintenance). (authors)

  13. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly common in recent years, especially in the development of mobile apps, although it has also been used consistently over time in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. Since this generic definition covers a wide range of meanings, for the purposes of this paper we narrow it and use the following working definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.

  14. Platform Performance and Challenges - using Platforms in Lego Company

    DEFF Research Database (Denmark)

    Munk, Lone; Mortensen, Niels Henrik

    2009-01-01

    needs focus on the incentive of using the platform. This problem lacks attention in the literature, as well as industry, where assessment criteria do not cover this aspect. Therefore, we recommend including user incentive in platform assessment criteria to address these challenges. Concrete solution elements...... ensuring user incentive in platforms is an object for future research...

  15. Development of a Luminex Bead Based Assay for Diagnosis of Toxocariasis Using Recombinant Antigens Tc-CTL-1 and Tc-TES-26.

    Directory of Open Access Journals (Sweden)

    John P Anderson

    Full Text Available The clinical spectrum of human disease caused by the roundworms Toxocara canis and Toxocara cati ranges from visceral and ocular larva migrans to covert toxocariasis. The parasite is not typically recovered in affected tissues, so detection of parasite-specific antibodies is usually necessary for establishing a diagnosis. The most reliable immunodiagnostic methods use the Toxocara excretory-secretory antigens (TES-Ag) in ELISA formats to detect Toxocara-specific antibodies. To eliminate the need for native parasite materials, we identified and purified immunodiagnostic antigens using 2D gel electrophoresis followed by electrospray ionization mass spectrometry. Three predominant immunoreactive proteins were found in the TES; all three had been previously described in the literature: Tc-CTL-1, Tc-TES-26, and Tc-MUC-3. We generated Escherichia coli-expressed recombinant proteins for evaluation in Luminex-based immunoassays. We were unable to produce a functional assay with the Tc-MUC-3 recombinant protein. Tc-CTL-1 and Tc-TES-26 were successfully coupled and tested using defined serum batteries. The use of both proteins together generated better results than the use of either protein individually. The sensitivity and specificity of the assay for detecting visceral larva migrans using Tc-CTL-1 plus Tc-TES-26 were 99% and 94%, respectively; the sensitivity for detecting ocular larva migrans was 64%. The combined performance of the new assay was superior to that of the currently available EIA and could potentially be employed to replace current assays that rely on native TES-Ag.
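
    An illustrative sketch of the "either antigen positive" read-out implied above: a sample is scored positive when its median fluorescence intensity (MFI) exceeds a cutoff for Tc-CTL-1 or Tc-TES-26, and sensitivity and specificity are computed against the reference status. The MFI distributions and cutoffs are fabricated examples, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(7)
n_pos, n_neg = 100, 100
truth = np.array([1] * n_pos + [0] * n_neg)          # 1 = reference positive serum

# Simulated bead MFI values for the two recombinant antigens (cases then controls).
mfi_ctl1 = np.concatenate([rng.normal(3000, 800, n_pos), rng.normal(400, 150, n_neg)])
mfi_tes26 = np.concatenate([rng.normal(2200, 900, n_pos), rng.normal(350, 120, n_neg)])
cutoff_ctl1, cutoff_tes26 = 900.0, 800.0              # assumed assay cutoffs

# A sample is called positive if either antigen exceeds its cutoff.
called_pos = (mfi_ctl1 > cutoff_ctl1) | (mfi_tes26 > cutoff_tes26)
sensitivity = called_pos[truth == 1].mean()
specificity = (~called_pos[truth == 0]).mean()
print(f"sensitivity = {sensitivity:.2%}, specificity = {specificity:.2%}")
```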

  16. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    This research paper presents an initial attempt to introduce and explain the emergence of new phenomenon, which we refer to as platform constellations. Functioning as highly modular systems, the platform constellations are collections of highly connected platforms which co-exist in parallel and a......’ acquisition and users’ engagement rates as well as unlock new sources of value creation and diversify revenue streams....

  17. A Multimodal Deep Log-Based User Experience (UX) Platform for UX Evaluation.

    Science.gov (United States)

    Hussain, Jamil; Khan, Wajahat Ali; Hur, Taeho; Bilal, Hafiz Syed Muhammad; Bang, Jaehun; Hassan, Anees Ul; Afzal, Muhammad; Lee, Sungyoung

    2018-05-18

    The user experience (UX) is an emerging field in user research and design, and the development of UX evaluation methods presents a challenge for both researchers and practitioners. Different UX evaluation methods have been developed to extract accurate UX data. Among UX evaluation methods, the mixed-method approach of triangulation has gained importance. It provides more accurate and precise information about the user while interacting with the product. However, this approach requires skilled UX researchers and developers to integrate multiple devices, synchronize them, analyze the data, and ultimately produce an informed decision. In this paper, a method and system for measuring the overall UX over time using a triangulation method are proposed. The proposed platform incorporates observational and physiological measurements in addition to traditional ones. The platform reduces the subjective bias and validates the user's perceptions, which are measured by different sensors through objectification of the subjective nature of the user in the UX assessment. The platform additionally offers plug-and-play support for different devices and powerful analytics for obtaining insight on the UX in terms of multiple participants.

  18. Paper-based electrochemical sensing platform with integral battery and electrochromic read-out.

    Science.gov (United States)

    Liu, Hong; Crooks, Richard M

    2012-03-06

    We report a battery-powered, microelectrochemical sensing platform that reports its output using an electrochromic display. The platform is fabricated based on paper fluidics and uses a Prussian blue spot electrodeposited on an indium-doped tin oxide thin film as the electrochromic indicator. The integrated metal/air battery powers both the electrochemical sensor and the electrochromic read-out, which are in electrical contact via a paper reservoir. The sample activates the battery and the presence of analyte in the sample initiates the color change of the Prussian blue spot. The entire system is assembled on the lab bench, without the need for cleanroom facilities. The applicability of the device to point-of-care sensing is demonstrated by qualitative detection of 0.1 mM glucose and H2O2 in artificial urine samples.

  19. Continuous Platform Development

    DEFF Research Database (Denmark)

    Nielsen, Ole Fiil

    low risks and investments but also with relatively fuzzy results. When looking for new platform projects, it is important to make sure that the company and market is ready for the introduction of platforms, and to make sure that people from marketing and sales, product development, and downstream......, but continuous product family evolution challenges this strategy. The concept of continuous platform development is based on the fact that platform development should not be a one-time experience but rather an ongoing process of developing new platforms and updating existing ones, so that product family...

  20. 3D Printed Paper-Based Microfluidic Analytical Devices

    Directory of Open Access Journals (Sweden)

    Yong He

    2016-06-01

    Full Text Available As a pump-free and lightweight analytical tool, paper-based microfluidic analytical devices (μPADs) attract more and more interest. If the flow speed of a μPAD can be programmed, analytical sequences can be designed, making such devices more attractive. This report presents a novel μPAD, driven by the capillary force of cellulose powder and printed by a desktop three-dimensional (3D) printer, which has some promising features, such as easy fabrication and programmable flow speed. First, a suitably sized substrate with open microchannels on its surface is printed. Next, the surface of the substrate is covered with a thin layer of polydimethylsiloxane (PDMS) to seal the micro gaps caused by 3D printing. Then, the microchannels are filled with a mixture of cellulose powder and deionized water in an appropriate proportion. After drying in an oven at 60 °C for 30 min, the device is ready for use. Because different channel depths can easily be printed, the capillary flow speed of cellulose powder in the microchannels can be programmed. A series of microfluidic analytical experiments, including quantitative analysis of nitrite ion and fabrication of a T-sensor, were used to demonstrate its capability. As the desktop 3D printer (D3DP) is very cheap and accessible, this device can be rapidly printed in the field at low cost and has promising potential in point-of-care (POC) systems or as a lightweight platform for analytical chemistry.

  1. Automated processing, extraction and detection of herpes simplex virus types 1 and 2: A comparative evaluation of three commercial platforms using clinical specimens.

    Science.gov (United States)

    Binnicker, Matthew J; Espy, Mark J; Duresko, Brian; Irish, Cole; Mandrekar, Jay

    2017-04-01

    Recently, automated platforms have been developed that can perform processing, extraction and testing for herpes simplex virus (HSV) nucleic acid on a single instrument. In this study, we compared three commercially available systems, Aptima®/Panther (Hologic, San Diego, CA), ARIES® (Luminex Corporation, Austin, TX), and cobas® 4800 (Roche Molecular Systems Inc, Pleasanton, CA), for the qualitative detection of HSV-1/2 in clinical samples. Two hundred seventy-seven specimens (genital [n=193], dermal [n=84]) were submitted for routine HSV-1/2 real-time PCR by a laboratory-developed test. Following routine testing, samples were also tested by the Aptima, ARIES, and cobas HSV-1/2 assays per the manufacturers' recommendations. Results were compared to a "consensus standard" defined as the result obtained from ≥3 of the 4 assays. Following testing of 277 specimens, the cobas and ARIES assays demonstrated a sensitivity of 100% for HSV-1 (61/61) and HSV-2 (55/55). The Aptima assays showed a sensitivity of 91.8% (56/61) for HSV-1 and 90.9% (50/55) for HSV-2. Percent specificities for HSV-1 were 96.2% (202/210) by cobas, 99.5% (209/210) by ARIES and 100% (236/236) by Aptima. For HSV-2, the specificities were 98.1% (211/215) by cobas, 99.5% (215/216) by ARIES and 100% (216/216) by Aptima. The turnaround time for testing 24 samples was 2.5 h by the cobas 4800, 3.1 h by Aptima/Panther, and 3.9 h by ARIES. The three commercial systems can perform all current functions on a single platform, thereby improving workflow and potentially reducing errors associated with manual processing of samples. Copyright © 2017 Elsevier B.V. All rights reserved.
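
    As a quick plausibility check, the reported percentages follow directly from the counts quoted in the abstract. The short Python sketch below simply recomputes a few of them; the counts are taken verbatim from the record, nothing else is assumed.

```python
# Sensitivity and specificity recomputed from the counts reported in the abstract.
def rate(numerator, denominator):
    """Return a percentage rounded to one decimal place."""
    return round(100.0 * numerator / denominator, 1)

# HSV-1 sensitivity against the consensus standard
print(rate(61, 61))    # cobas and ARIES: 100.0
print(rate(56, 61))    # Aptima: 91.8

# HSV-1 specificity
print(rate(202, 210))  # cobas: 96.2
print(rate(209, 210))  # ARIES: 99.5
print(rate(236, 236))  # Aptima: 100.0
```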

  2. Mass Spectrometry Based Lipidomics: An Overview of Technological Platforms

    Science.gov (United States)

    Köfeler, Harald C.; Fauland, Alexander; Rechberger, Gerald N.; Trötzmüller, Martin

    2012-01-01

    One decade after the genomic and the proteomic life science revolution, new ‘omics’ fields are emerging. The metabolome encompasses the entirety of small molecules—most often end products of a catalytic process regulated by genes and proteins—with the lipidome being its fat-soluble subdivision. Within recent years, lipids have become regarded not only as energy storage compounds but also as interactive players in various cellular regulation cycles and thus attract rising interest in the bio-medical community. The field of lipidomics is, on one hand, fuelled by analytical technology advances, particularly mass spectrometry and chromatography, but on the other hand new biological questions also drive analytical technology developments. Compared to fairly standardized genomic or proteomic high-throughput protocols, the high degree of molecular heterogeneity adds a special analytical challenge to lipidomic analysis. In this review, we will take a closer look at various mass spectrometric platforms for lipidomic analysis. We will focus on the advantages and limitations of various experimental setups like ‘shotgun lipidomics’, liquid chromatography-mass spectrometry (LC-MS) and matrix assisted laser desorption ionization-time of flight (MALDI-TOF) based approaches. We will also examine available software packages for data analysis, which nowadays is in fact the rate-limiting step for most ‘omics’ workflows. PMID:24957366

  3. Mass Spectrometry Based Lipidomics: An Overview of Technological Platforms

    Directory of Open Access Journals (Sweden)

    Harald C. Köfeler

    2012-01-01

    One decade after the genomic and the proteomic life science revolution, new ‘omics’ fields are emerging. The metabolome encompasses the entirety of small molecules—most often end products of a catalytic process regulated by genes and proteins—with the lipidome being its fat-soluble subdivision. Within recent years, lipids have become regarded not only as energy storage compounds but also as interactive players in various cellular regulation cycles and thus attract rising interest in the bio-medical community. The field of lipidomics is, on one hand, fuelled by analytical technology advances, particularly mass spectrometry and chromatography, but on the other hand new biological questions also drive analytical technology developments. Compared to fairly standardized genomic or proteomic high-throughput protocols, the high degree of molecular heterogeneity adds a special analytical challenge to lipidomic analysis. In this review, we will take a closer look at various mass spectrometric platforms for lipidomic analysis. We will focus on the advantages and limitations of various experimental setups like ‘shotgun lipidomics’, liquid chromatography-mass spectrometry (LC-MS) and matrix assisted laser desorption ionization-time of flight (MALDI-TOF) based approaches. We will also examine available software packages for data analysis, which nowadays is in fact the rate-limiting step for most ‘omics’ workflows.

  4. Droplet-based Biosensing for Lab-on-a-Chip, Open Microfluidics Platforms

    Directory of Open Access Journals (Sweden)

    Piyush Dak

    2016-04-01

    Low-cost, portable sensors can transform health care by bringing easily available diagnostic devices to low- and middle-income populations, particularly in developing countries. Sample preparation, analyte handling and labeling are primary cost concerns for traditional lab-based diagnostic systems. Lab-on-a-chip (LoC) platforms based on droplet-based microfluidics promise to integrate and automate these complex and expensive laboratory procedures onto a single chip; the cost will be further reduced if label-free biosensors can be integrated onto the LoC platforms. Here, we review some recent developments of label-free, droplet-based biosensors, compatible with “open” digital microfluidic systems. These low-cost droplet-based biosensors overcome some of the fundamental limitations of the classical sensors, enabling timely diagnosis. We identify the key challenges that must be addressed to make these sensors commercially viable and summarize a number of promising research directions.

  5. Internet Of Things And Analytics

    Directory of Open Access Journals (Sweden)

    Harshini G

    2017-08-01

    The Internet of Things (IoT) encompasses many aspects of life, from connecting homes and cities to connecting cars and roads to devices that examine individuals' behavior and make use of the collected data. The IoT aims to make lives better, more secure and more enjoyable. IoT solutions will promote a cleaner environment and improve people's health with preventative care mechanisms and the constant supervision of elderly family members. The IoT provides a platform for communication between objects, where objects can organize and manage themselves. In this paper an attempt is made to understand how analytics can be applied to IoT data. Various statistical and data mining techniques will help to derive knowledge out of the huge data collected by IoT devices.

  6. The Earth Data Analytic Services (EDAS) Framework

    Science.gov (United States)

    Maxwell, T. P.; Duffy, D.

    2017-12-01

    Faced with unprecedented growth in earth data volume and demand, NASA has developed the Earth Data Analytic Services (EDAS) framework, a high performance big data analytics framework built on Apache Spark. This framework enables scientists to execute data processing workflows combining common analysis operations close to the massive data stores at NASA. The data is accessed in standard (NetCDF, HDF, etc.) formats in a POSIX file system and processed using vetted earth data analysis tools (ESMF, CDAT, NCO, etc.). EDAS utilizes a dynamic caching architecture, a custom distributed array framework, and a streaming parallel in-memory workflow for efficiently processing huge datasets within limited memory spaces with interactive response times. EDAS services are accessed via a WPS API being developed in collaboration with the ESGF Compute Working Team to support server-side analytics for ESGF. The API can be accessed using direct web service calls, a Python script, a Unix-like shell client, or a JavaScript-based web application. New analytic operations can be developed in Python, Java, or Scala (with support for other languages planned). Client packages in Python, Java/Scala, or JavaScript contain everything needed to build and submit EDAS requests. The EDAS architecture brings together the tools, data storage, and high-performance computing required for timely analysis of large-scale data sets, where the data resides, to ultimately produce societal benefits. It is currently deployed at NASA in support of the Collaborative REAnalysis Technical Environment (CREATE) project, which centralizes numerous global reanalysis datasets onto a single advanced data analytics platform. This service enables decision makers to compare multiple reanalysis datasets and investigate trends, variability, and anomalies in earth system dynamics around the globe.
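
    Since the record notes that EDAS is exposed through a WPS API reachable by direct web service calls or a Python script, a minimal sketch of such a call is given below. The endpoint URL, process identifier, and data-input string are illustrative placeholders, not EDAS's documented interface; a WPS 1.0.0-style Execute request is assumed.

```python
import requests

# Hypothetical WPS endpoint and process identifier -- illustrative only, not the real EDAS URL.
EDAS_ENDPOINT = "https://edas.example.nasa.gov/wps"

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "spark.average",                                  # placeholder analytic operation
    "datainputs": "variable=tas;domain=global;period=1980-2010",    # placeholder inputs
}

# Submit the request and print the beginning of the server's XML response.
response = requests.get(EDAS_ENDPOINT, params=params, timeout=60)
response.raise_for_status()
print(response.text[:500])
```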

  7. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control.

    Science.gov (United States)

    Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob

    2017-02-08

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant's intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms.
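
    The core idea, an analytical base model plus a learned model of its residual error, can be sketched in a few lines. The toy one-joint dynamics, the unmodeled Coulomb friction term, and the choice of regressor below are illustrative assumptions, not the models or robots used in the paper.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy "plant": one joint with inertia, viscous damping, and unmodeled Coulomb friction.
I, b, coulomb = 1.2, 0.4, 0.6
q_dot = rng.uniform(-2, 2, 2000)
q_ddot = rng.uniform(-5, 5, 2000)
tau_true = I * q_ddot + b * q_dot + coulomb * np.sign(q_dot)

# Analytical model: known inertia and damping only (friction stays unmodeled).
tau_analytical = I * q_ddot + b * q_dot

# Learn an error model on the residual between the plant and the analytical model.
X = np.column_stack([q_dot, q_ddot])
error_model = GradientBoostingRegressor().fit(X, tau_true - tau_analytical)

# Hybrid feed-forward prediction = analytical model + learned error model.
tau_hybrid = tau_analytical + error_model.predict(X)
print("analytical RMSE:", np.sqrt(np.mean((tau_true - tau_analytical) ** 2)))
print("hybrid RMSE:    ", np.sqrt(np.mean((tau_true - tau_hybrid) ** 2)))
```

    In this toy setting the residuals are evaluated on the training data only; a proper study would hold out test trajectories, as the paper does for its robot platforms.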

  8. Multiplex Immunoassay Profiling of Hormones Involved in Metabolic Regulation.

    Science.gov (United States)

    Stephen, Laurie; Guest, Paul C

    2018-01-01

    Multiplex immunoassays are used for rapid profiling of biomarker proteins and small molecules in biological fluids. The advantages over single immunoassays include lower sample consumption, cost, and labor. This chapter details a protocol to develop a 5-plex assay for glucagon-like peptide 1, growth hormone, insulin, leptin, and thyroid-stimulating hormone on the Luminex® platform. Results from the analysis of insulin in normal control subjects are presented, given the important role of this hormone in nutritional programming diseases.
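
    Luminex readouts (median fluorescence intensity, MFI) are conventionally converted to concentrations through a four-parameter logistic standard curve. The sketch below illustrates that step with made-up insulin standards; the actual standards, curve model, and parameter values in the protocol may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """4-parameter logistic: a = zero-dose asymptote, d = infinite-dose asymptote,
    c = mid-point (EC50), b = slope factor."""
    return d + (a - d) / (1.0 + (x / c) ** b)

# Hypothetical insulin standards: concentration (pg/mL) vs. median fluorescence intensity.
conc = np.array([10.0, 30.0, 100.0, 300.0, 1000.0, 3000.0])
mfi = np.array([55.0, 140.0, 430.0, 1100.0, 2300.0, 3100.0])

popt, _ = curve_fit(four_pl, conc, mfi, p0=[30.0, 1.0, 300.0, 3500.0])
a, b, c, d = popt

# Invert the fitted curve to estimate the concentration of an unknown sample from its MFI.
mfi_unknown = 800.0
conc_unknown = c * ((a - d) / (mfi_unknown - d) - 1.0) ** (1.0 / b)
print(f"estimated concentration: {conc_unknown:.1f} pg/mL")
```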

  9. New Antifouling Platform Characterized by Single-Molecule Imaging

    Science.gov (United States)

    2015-01-01

    Antifouling surfaces have been widely studied for their importance in medical devices and industry. Antifouling surfaces, mostly achieved with methoxy-poly(ethylene glycol) (mPEG), have shown biomolecular adsorption of less than 1 ng/cm2, as measured by surface analytical tools such as surface plasmon resonance (SPR) spectroscopy, quartz crystal microbalance (QCM), or optical waveguide lightmode (OWL) spectroscopy. Herein, we utilize a single-molecule imaging technique (i.e., an ultimate resolution) to study the antifouling properties of functionalized surfaces. We found that about 600 immunoglobulin G (IgG) molecules are adsorbed. This result corresponds to ∼5 pg/cm2 adsorption, which is far below the detection limit of the conventional tools. Furthermore, we developed a new antifouling platform that exhibits improved antifouling performance, with only 78 IgG molecules adsorbed (∼0.5 pg/cm2). The antifouling platform consists of a 1 nm TiO2 thin layer on which a peptidomimetic antifouling polymer (PMAP) is robustly anchored. The unprecedented antifouling performance can potentially revolutionize a variety of research fields such as single-molecule imaging, medical devices, biosensors, and others. PMID:24503420

  10. New antifouling platform characterized by single-molecule imaging.

    Science.gov (United States)

    Ryu, Ji Young; Song, In Taek; Lau, K H Aaron; Messersmith, Phillip B; Yoon, Tae-Young; Lee, Haeshin

    2014-03-12

    Antifouling surfaces have been widely studied for their importance in medical devices and industry. Antifouling surfaces, mostly achieved with methoxy-poly(ethylene glycol) (mPEG), have shown biomolecular adsorption of less than 1 ng/cm(2), as measured by surface analytical tools such as surface plasmon resonance (SPR) spectroscopy, quartz crystal microbalance (QCM), or optical waveguide lightmode (OWL) spectroscopy. Herein, we utilize a single-molecule imaging technique (i.e., an ultimate resolution) to study the antifouling properties of functionalized surfaces. We found that about 600 immunoglobulin G (IgG) molecules are adsorbed. This result corresponds to ∼5 pg/cm(2) adsorption, which is far below the detection limit of the conventional tools. Furthermore, we developed a new antifouling platform that exhibits improved antifouling performance, with only 78 IgG molecules adsorbed (∼0.5 pg/cm(2)). The antifouling platform consists of a 1 nm TiO2 thin layer on which a peptidomimetic antifouling polymer (PMAP) is robustly anchored. The unprecedented antifouling performance can potentially revolutionize a variety of research fields such as single-molecule imaging, medical devices, biosensors, and others.

  11. Mobile platform security

    CERN Document Server

    Asokan, N; Dmitrienko, Alexandra

    2013-01-01

    Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, also for malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrat...

  12. Engineering of Surface Chemistry for Enhanced Sensitivity in Nanoporous Interferometric Sensing Platforms.

    Science.gov (United States)

    Law, Cheryl Suwen; Sylvia, Georgina M; Nemati, Madieh; Yu, Jingxian; Losic, Dusan; Abell, Andrew D; Santos, Abel

    2017-03-15

    We explore new approaches to engineering the surface chemistry of interferometric sensing platforms based on nanoporous anodic alumina (NAA) and reflectometric interference spectroscopy (RIfS). Two surface engineering strategies are presented, namely (i) selective chemical functionalization of the inner surface of NAA pores with amine-terminated thiol molecules and (ii) selective chemical functionalization of the top surface of NAA with dithiol molecules. The strong molecular interaction of Au3+ ions with thiol-containing functional molecules of alkane chain or peptide character provides a model sensing system with which to assess the sensitivity of these NAA platforms by both molecular feature and surface engineering. Changes in the effective optical thickness of the functionalized NAA photonic films (i.e., sensing principle), in response to gold ions, are monitored in real-time by RIfS. 6-Amino-1-hexanethiol (inner surface) and 1,6-hexanedithiol (top surface), the most sensitive functional molecules from approaches (i) and (ii), respectively, were combined into a third sensing strategy whereby the NAA platforms are functionalized on both the top and inner surfaces concurrently. Engineering of the surface according to this approach resulted in an additive enhancement in sensitivity of up to 5-fold compared to previously reported systems. This study advances the rational engineering of surface chemistry for interferometric sensing on nanoporous platforms with potential applications for real-time monitoring of multiple analytes in dynamic environments.
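
    The RIfS "sensing principle" mentioned above tracks changes in effective optical thickness (EOT, approximately 2nL) extracted from the interference spectrum, commonly via a Fourier transform of the reflectance sampled in wavenumber. Below is a minimal sketch with synthetic numbers; the refractive index, pore depth, and wavelength window are illustrative assumptions, not values from the study.

```python
import numpy as np

# Illustrative optical parameters for a nanoporous film (not values from the study).
n_eff, depth_nm = 1.7, 5000.0
eot_true = 2.0 * n_eff * depth_nm          # effective optical thickness, 2*n*L, in nm

# Reflectance interference spectrum sampled uniformly in wavenumber k = 1/lambda (400-900 nm).
k = np.linspace(1.0 / 900.0, 1.0 / 400.0, 4096)
reflectance = 0.5 + 0.4 * np.cos(2.0 * np.pi * eot_true * k)

# The dominant oscillation frequency of R(k) equals the EOT (in nm).
amplitude = np.abs(np.fft.rfft(reflectance - reflectance.mean()))
freqs = np.fft.rfftfreq(k.size, d=k[1] - k[0])
eot_estimate = freqs[np.argmax(amplitude)]

# The raw estimate is quantized to the FFT bin width; zero-padding or peak
# interpolation sharpens it in practice.
print(f"true EOT = {eot_true:.0f} nm, FFT estimate = {eot_estimate:.0f} nm")
```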

  13. Hierarchically synergistic integration of Social Media Analytics/Social CRM with Business Intelligence and with the Geographic Information System

    OpenAIRE

    Круковський, Ігор Анатолійович; Хомів, Богдан Арсенович; Гаврилюк, Всеволод Леонідович

    2014-01-01

    The relevance of integrating Social Media Analytics/Social CRM with Decision Support Systems based on Business Intelligence 2.0 (DSS/BI 2.0) and with the Geographic Information System is presented. On the basis of their integration, a new type of DSS is proposed: Social Media Spatial DSS/BI. A variant of realizing this system on the Social Media Analytics software platform of the SemanticForce Company, which has its own semantic analyzer, Blueberry, is shown. The suitability...

  14. Footprints of Fascination: Digital Traces of Public Engagement with Particle Physics on CERN's Social Media Platforms

    Science.gov (United States)

    Baram-Tsabari, Ayelet

    2016-01-01

    Although the scientific community increasingly recognizes that its communication with the public may shape civic engagement with science, few studies have characterized how this communication occurs online. Social media plays a growing role in this engagement, yet it is not known if or how different platforms support different types of engagement. This study sets out to explore how users engage with science communication items on different platforms of social media, and what characteristics of the items tend to attract large numbers of user interactions. Here, user interactions with almost identical items on five of CERN's social media platforms were quantitatively compared over an eight-week period, including likes, comments, shares, click-throughs, and time spent on CERN's site. The most popular items were qualitatively analyzed for content features. Findings indicate that as audience size of a social media platform grows, the total rate of engagement with content tends to grow as well. However, per user, engagement tends to decline with audience size. Across all platforms, similar topics tend to consistently receive high engagement. In particular, awe-inspiring imagery tends to frequently attract high engagement across platforms, independent of newsworthiness. To our knowledge, this study provides the first cross-platform characterization of public engagement with science on social media. Findings, although focused on particle physics, have a multidisciplinary nature; they may serve to benchmark social media analytics for assessing science communication activities in various domains. Evidence-based suggestions for practitioners are also offered. PMID:27232498

  15. Footprints of Fascination: Digital Traces of Public Engagement with Particle Physics on CERN's Social Media Platforms.

    Science.gov (United States)

    Kahle, Kate; Sharon, Aviv J; Baram-Tsabari, Ayelet

    2016-01-01

    Although the scientific community increasingly recognizes that its communication with the public may shape civic engagement with science, few studies have characterized how this communication occurs online. Social media plays a growing role in this engagement, yet it is not known if or how different platforms support different types of engagement. This study sets out to explore how users engage with science communication items on different platforms of social media, and what characteristics of the items tend to attract large numbers of user interactions. Here, user interactions with almost identical items on five of CERN's social media platforms were quantitatively compared over an eight-week period, including likes, comments, shares, click-throughs, and time spent on CERN's site. The most popular items were qualitatively analyzed for content features. Findings indicate that as audience size of a social media platform grows, the total rate of engagement with content tends to grow as well. However, per user, engagement tends to decline with audience size. Across all platforms, similar topics tend to consistently receive high engagement. In particular, awe-inspiring imagery tends to frequently attract high engagement across platforms, independent of newsworthiness. To our knowledge, this study provides the first cross-platform characterization of public engagement with science on social media. Findings, although focused on particle physics, have a multidisciplinary nature; they may serve to benchmark social media analytics for assessing science communication activities in various domains. Evidence-based suggestions for practitioners are also offered.

  16. Robotics-assisted mass spectrometry assay platform enabled by open-source electronics.

    Science.gov (United States)

    Chiu, Shih-Hao; Urban, Pawel L

    2015-02-15

    Mass spectrometry (MS) is an important analytical technique with numerous applications in clinical analysis, biochemistry, environmental analysis, geology and physics. Its success builds on the ability of MS to determine the molecular weights of analytes and elucidate their structures. However, sample handling prior to MS requires a great deal of attention and labor. In this work we aimed to automate sample processing for MS so that analyses could be conducted without much supervision by experienced analysts. The goal of this study was to develop a robotics- and information technology-oriented platform that could control the whole analysis process, including sample delivery, reaction-based assay, data acquisition, and interaction with the analyst. The proposed platform incorporates a robotic arm for handling sample vials delivered to the laboratory, and several auxiliary devices which facilitate and secure the analysis process. They include: a multi-relay board, infrared sensors, photo-interrupters, gyroscopes, force sensors, a fingerprint scanner, a barcode scanner, a touch screen panel, and an internet interface. The control of all the building blocks is achieved through open-source electronics (Arduino) and custom-written programs in the C language. The advantages of the proposed system include: low cost, simplicity, small size, as well as facile automation of sample delivery and processing without the intervention of the analyst. It is envisaged that this simple robotic system may be the forerunner of automated laboratories dedicated to mass spectrometric analysis of biological samples. Copyright © 2014 Elsevier B.V. All rights reserved.
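
    Arduino-based peripherals of the kind listed above are typically driven from a host computer over a serial link. A minimal host-side sketch using pyserial is shown below; the port name, baud rate, and command strings are hypothetical, not the protocol used by the authors' platform.

```python
import serial  # pyserial (pip install pyserial)

# Hypothetical serial settings and command protocol -- illustrative only.
with serial.Serial("/dev/ttyACM0", baudrate=9600, timeout=2) as board:
    board.write(b"RELAY 3 ON\n")      # e.g., switch a relay that opens a sample valve
    ack = board.readline().decode().strip()
    print("board replied:", ack)

    board.write(b"READ IR 1\n")       # e.g., poll an infrared sensor for vial presence
    print("sensor value:", board.readline().decode().strip())
```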

  17. Platform-based production development

    DEFF Research Database (Denmark)

    Bossen, Jacob; Brunoe, Thomas Ditlev; Nielsen, Kjeld

    2015-01-01

    Platforms as a means for applying modular thinking in product development is relatively well studied, but platforms in the production system has until now not been given much attention. With the emerging concept of platform-based co-development the importance of production platforms is though...

  18. Omnidirectional holonomic platforms

    International Nuclear Information System (INIS)

    Pin, F.G.; Killough, S.M.

    1994-01-01

    This paper presents the concepts for a new family of wheeled platforms which feature full omnidirectionality with simultaneous and independently controlled rotational and translational motion capabilities. The authors first present the orthogonal-wheels concept and the two major wheel assemblies on which these platforms are based. They then describe how a combination of these assemblies with appropriate control can be used to generate an omnidirectional capability for mobile robot platforms. The design and control of two prototype platforms are then presented and their respective characteristics with respect to rotational and translational motion control are discussed

  19. Platform decommissioning costs

    International Nuclear Information System (INIS)

    Rodger, David

    1998-01-01

    There are over 6500 platforms worldwide contributing to the offshore oil and gas production industry. In the North Sea there are around 500 platforms in place. There are many factors to be considered in planning for platform decommissioning and the evaluation of options for removal and disposal. The environmental impact, technical feasibility, safety and cost factors all have to be considered. This presentation considers what information is available about the overall decommissioning costs for the North Sea and the costs of different removal and disposal options for individual platforms. 2 figs., 1 tab

  20. The Prospect of Internet of Things and Big Data Analytics in Transportation System

    Science.gov (United States)

    Noori Hussein, Waleed; Kamarudin, L. M.; Hussain, Haider N.; Zakaria, A.; Badlishah Ahmed, R.; Zahri, N. A. H.

    2018-05-01

    The Internet of Things (IoT), an emerging technology that describes how data, people and interconnected physical objects act on communicated information, and big data analytics have been adopted by diverse domains for varying purposes. Manufacturing, agriculture, banking, oil and gas, healthcare, retail, hospitality, and food services are a few of the sectors that have adopted and massively utilized IoT and big data analytics. The transportation industry is also an early adopter, with significant attendant effects on its processes of tracking shipments, freight monitoring, and transparent warehousing. This is recorded in countries like England, Singapore, Portugal, and Germany, while Malaysia is currently assessing the potentials and researching a purpose-driven adoption and implementation. This paper, based on a review of related literature, presents a summary of the inherent prospects of adopting IoT and big data analytics in the Malaysian transportation system. An efficient and safe port environment, predictive maintenance and remote management, and a boundary-less software platform and connected ecosystem, among others, are the inherent benefits of IoT and big data analytics for the Malaysian transportation system.

  1. Product Platform Replacements

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2012-01-01

    . To shed light on this unexplored and growing managerial concern, the purpose of this explorative study is to identify operational challenges to management when product platforms are replaced. Design/methodology/approach – The study uses a longitudinal field-study approach. Two companies, Gamma and Omega...... replacement was chosen in each company. Findings – The study shows that platform replacements primarily challenge managers' existing knowledge about platform architectures. A distinction can be made between “width” and “height” in platform replacements, and it is crucial that managers observe this in order...... to challenge their existing knowledge about platform architectures. Issues on technologies, architectures, components and processes as well as on segments, applications and functions are identified. Practical implications – Practical implications are summarized and discussed in relation to a framework...

  2. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    for customisation of products. In many companies these changes in the business environment have created a controversy between the need for a wide variety of products offered to the marketplace and a desire to reduce variation within the company in order to increase efficiency. Many companies use the concept...... other. These groups can be varied and combined to form different product variants without increasing the internal variety in the company. Based on the Theory of Domains, the concept of encapsulation in the organ domain is introduced, and organs are formulated as platform elements. Included......This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation...

  3. The mid-IR silicon photonics sensor platform (Conference Presentation)

    Science.gov (United States)

    Kimerling, Lionel; Hu, Juejun; Agarwal, Anuradha M.

    2017-02-01

    Advances in integrated silicon photonics are enabling highly connected sensor networks that offer sensitivity, selectivity and pattern recognition. Cost, performance and the evolution path of the so-called 'Internet of Things' will gate the proliferation of these networks. The wavelength spectral range of 3-8 μm, commonly known as the mid-IR, is critical to specificity for sensors that identify materials by detection of local vibrational modes, reflectivity and thermal emission. For ubiquitous sensing applications in this regime, the sensors must move from premium to commodity-level manufacturing volumes and cost. Scaling performance/cost is critically dependent on establishing a minimum set of platform attributes for point, wearable, and physical sensing. Optical sensors are ideal for non-invasive applications. Optical sensor device physics involves evanescent or intra-cavity structures applied to concentration, interrogation and photo-catalysis functions. The ultimate utility of a platform depends on sample delivery/presentation modalities; system reset, recalibration and maintenance capabilities; and sensitivity and selectivity performance. The attributes and performance of a unified Glass-on-Silicon platform have shown good prospects for heterogeneous integration of materials and devices using a low-cost process flow. Integrated, single-mode, silicon photonic platforms offer significant performance and cost advantages, but they require discovery and qualification of new materials and process integration schemes for the mid-IR. Waveguide-integrated light sources based on rare earth dopants and Ge-pumped frequency combs have promise. Optical resonators and waveguide spirals can enhance sensitivity. PbTe materials are among the best choices for a standard, waveguide-integrated photodetector. Chalcogenide glasses are capable of transmitting mid-IR signals with high transparency. Integrated sensor case studies of i) high sensitivity analyte detection in...

  4. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control †

    Directory of Open Access Journals (Sweden)

    René Felix Reinhart

    2017-02-01

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant’s intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms.

  5. Hybrid Analytical and Data-Driven Modeling for Feed-Forward Robot Control †

    Science.gov (United States)

    Reinhart, René Felix; Shareef, Zeeshan; Steil, Jochen Jakob

    2017-01-01

    Feed-forward model-based control relies on models of the controlled plant, e.g., in robotics on accurate knowledge of manipulator kinematics or dynamics. However, mechanical and analytical models do not capture all aspects of a plant’s intrinsic properties and there remain unmodeled dynamics due to varying parameters, unmodeled friction or soft materials. In this context, machine learning is an alternative suitable technique to extract non-linear plant models from data. However, fully data-based models suffer from inaccuracies as well and are inefficient if they include learning of well known analytical models. This paper thus argues that feed-forward control based on hybrid models comprising an analytical model and a learned error model can significantly improve modeling accuracy. Hybrid modeling here serves the purpose to combine the best of the two modeling worlds. The hybrid modeling methodology is described and the approach is demonstrated for two typical problems in robotics, i.e., inverse kinematics control and computed torque control. The former is performed for a redundant soft robot and the latter for a rigid industrial robot with redundant degrees of freedom, where a complete analytical model is not available for any of the platforms. PMID:28208697

  6. Introducing Platform Interactions Model for Studying Multi-Sided Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina; Damsgaard, Jan

    2018-01-01

    Multi-Sided Platforms (MSPs) function as socio-technical entities that facilitate direct interactions between the various constituencies affiliated with them through developing and managing an IT architecture. In this paper, we aim to explain the nature of platform interactions as a key characteristic of...

  7. Real-time cellular exometabolome analysis with a microfluidic-mass spectrometry platform.

    Directory of Open Access Journals (Sweden)

    Christina C Marasco

    To address the challenges of tracking the multitude of signaling molecules and metabolites that is the basis of biological complexity, we describe a strategy to expand the analytical techniques for dynamic systems biology. Using microfluidics, online desalting, and mass spectrometry technologies, we constructed and validated a platform well suited for sampling the cellular microenvironment with high temporal resolution. Our platform achieves success in: automated cellular stimulation and microenvironment control; reduced non-specific adsorption to polydimethylsiloxane due to surface passivation; real-time online sample collection; near real-time sample preparation for salt removal; and real-time online mass spectrometry. When compared against the benchmark of "in-culture" experiments combined with ultraperformance liquid chromatography-electrospray ionization-ion mobility-mass spectrometry (UPLC-ESI-IM-MS), our platform alleviates the volume challenge issues caused by dilution of autocrine and paracrine signaling and dramatically reduces sample preparation and data collection time, while reducing undesirable external influence from various manual methods of manipulating cells and media (e.g., cell centrifugation). To validate this system biologically, we focused on cellular responses of Jurkat T cells to microenvironmental stimuli. Application of these stimuli, in conjunction with the cell's metabolic processes, results in changes in consumption of nutrients and secretion of biomolecules (collectively, the exometabolome), which enable communication with other cells or tissues and elimination of waste. Naïve and experienced T-cell metabolism of cocaine is used as an exemplary system to confirm the platform's capability, highlight its potential for metabolite discovery applications, and explore immunological memory of T-cell drug exposure. Our platform proved capable of detecting metabolomic variations between naïve and experienced Jurkat T cells...

  8. Comparison of microarray platforms for measuring differential microRNA expression in paired normal/cancer colon tissues.

    Directory of Open Access Journals (Sweden)

    Maurizio Callari

    BACKGROUND: Microarray technology applied to microRNA (miRNA) profiling is a promising tool in many research fields; nevertheless, independent studies characterizing the same pathology have often reported poorly overlapping results. miRNA analysis methods have only recently been systematically compared, but only in a few cases using clinical samples. METHODOLOGY/PRINCIPAL FINDINGS: We investigated the inter-platform reproducibility of four miRNA microarray platforms (Agilent, Exiqon, Illumina, and Miltenyi), comparing nine paired tumor/normal colon tissues. The most concordant and selected discordant miRNAs were further studied by quantitative RT-PCR. Globally, a poor overlap among differentially expressed miRNAs identified by each platform was found. Nevertheless, for eight miRNAs high agreement in differential expression among the four platforms and comparability to qRT-PCR was observed. Furthermore, most of the miRNA sets identified by each platform are coherently enriched in data from the other platforms, and the great majority of colon cancer associated miRNA sets derived from the literature were validated in our data, independently of the platform. Computational integration of miRNA and gene expression profiles suggested that anti-correlated predicted target genes of differentially expressed miRNAs are commonly enriched in cancer-related pathways and in genes involved in glycolysis and nutrient transport. CONCLUSIONS: Technical and analytical challenges in measuring miRNAs still remain and further research is required in order to increase consistency between different microarray-based methodologies. However, a better inter-platform agreement was found by looking at miRNA sets instead of single miRNAs and through a miRNA-gene expression integration approach.
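
    One simple way to quantify the "poor overlap among differentially expressed miRNAs" described above is a pairwise Jaccard index over the platform-specific hit lists. The sketch below uses made-up miRNA identifiers purely for illustration; it is not the study's data or analysis code.

```python
from itertools import combinations

# Hypothetical differentially expressed miRNA lists per platform (illustrative only).
hits = {
    "Agilent":  {"miR-21", "miR-145", "miR-31", "miR-378"},
    "Exiqon":   {"miR-21", "miR-145", "miR-135b", "miR-31"},
    "Illumina": {"miR-21", "miR-92a", "miR-145"},
    "Miltenyi": {"miR-21", "miR-31", "miR-182"},
}

# Pairwise overlap between platforms, expressed as a Jaccard index.
for a, b in combinations(hits, 2):
    jaccard = len(hits[a] & hits[b]) / len(hits[a] | hits[b])
    print(f"{a} vs {b}: Jaccard = {jaccard:.2f}")

# miRNAs called by all four platforms (the high-agreement core).
print("concordant across all platforms:", set.intersection(*hits.values()))
```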

  9. An analytical approach to managing complex process problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramstad, Kari; Andersen, Espen; Rohde, Hans Christian; Tydal, Trine

    2006-03-15

    The oil companies are continuously investing time and money to ensure optimum regularity on their production facilities. High regularity increases profitability, reduces the workload on the offshore organisation and, most importantly, reduces discharge to air and sea. There are a number of mechanisms and tools available for achieving high regularity. Most of these are related to maintenance, system integrity, well operations and process conditions. However, all of these tools will only be effective if quick and proper analyses of fluids and deposits are carried out. In fact, analytical backup is a powerful tool used to maintain optimised oil production, and should as such be given high priority. The present Operator (Hydro Oil and Energy) and the Chemical Supplier (MI Production Chemicals) have developed a cooperation to ensure that analytical backup is provided efficiently to the offshore installations. The Operator's Research and Development (R and D) departments and the Chemical Supplier have complementary specialties in both personnel and equipment, and this is utilized to give the best possible service when required by production technologists or operations. In order for the Operator's Research departments, Health, Safety and Environment (HSE) departments and Operations to approve analytical work performed by the Chemical Supplier, a number of analytical tests are carried out following procedures agreed by both companies. In the present paper, three field case examples of analytical cooperation for managing process problems are presented: 1) deposition in a complex platform processing system; 2) contaminated production chemicals; 3) improved monitoring of scale inhibitor, suspended solids and ions. In each case the Research Centre, Operations and the Chemical Supplier have worked closely together to achieve fast solutions and Best Practice. (author)

  10. Fog Computing: An Overview of Big IoT Data Analytics

    Directory of Open Access Journals (Sweden)

    Muhammad Rizwan Anawar

    2018-01-01

    A huge amount of data generated by the Internet of Things (IoT) is growing exponentially based on nonstop operational states. IoT devices are generating an avalanche of information that is disruptive for predictable data processing and analytics functionality, which was handled well by the cloud before the explosive growth of the IoT. The fog computing structure confronts those disruptions, with powerful functionality that complements the cloud framework, based on the deployment of micro clouds (fog nodes) at the proximity edge of data sources. Big IoT data analytics on the fog computing structure in particular is at an emerging phase and requires extensive research to produce more proficient knowledge and smart decisions. This survey summarizes the fog challenges and opportunities in the context of big IoT data analytics on fog networking. In addition, it emphasizes that the key characteristics of some proposed research works make fog computing a suitable platform for new proliferating IoT devices, services, and applications. The most significant fog applications (e.g., health care monitoring, smart cities, connected vehicles, and smart grid) are discussed here to create a well-organized green computing paradigm to support the next generation of IoT applications.

  11. Development of a Sensitive and Specific Serological Assay Based on Luminex Technology for Detection of Antibodies to Zaire Ebola Virus.

    Science.gov (United States)

    Ayouba, Ahidjo; Touré, Abdoulaye; Butel, Christelle; Keita, Alpha Kabinet; Binetruy, Florian; Sow, Mamadou S; Foulongne, Vincent; Delaporte, Eric; Peeters, Martine

    2017-01-01

    The recent Zaire Ebola virus (EBOV) outbreak in West Africa illustrates clearly the need for additional studies with humans and animals to elucidate the ecology of Ebola viruses (EBVs). In this study, we developed a serological assay based on the Luminex technology. Nine recombinant proteins representing different viral regions (nucleoprotein [NP], 40-kDa viral protein [VP40], and glycoprotein [GP]) from four of the five EBV lineages were used. Samples from 94 survivors of the EBOV outbreak in Guinea and negative samples from 108 patients in France were used to calculate test performance for EBOV detection and cross-reaction with other Ebola virus lineages. For EBOV antibody detection, sensitivities of 95.7%, 96.8%, and 92.5% and specificities of 94.4%, 95.4%, and 96.3% for NP, GP, and VP40, respectively, were observed. All EBOV-negative samples that presented a reaction, except for one, interacted with a single antigen, whereas almost all samples from EBOV survivors were simultaneously reactive with NP and GP (90/94) or with NP, GP, and VP40 (87/94). Considering as positive for past EBOV infection only samples that reacted with EBOV NP and GP, sensitivity was 95.7% and specificity increased to 99.1%. Comparing results with commercial EBOV NP and GP enzyme-linked immunosorbent assays (ELISAs; Alpha Diagnostic, San Antonio, TX), lower sensitivity (92.5%) and high specificity (100%) were observed with the same positivity criteria. Samples from EBOV survivors cross-reacted with GP from Sudan Ebola virus (GP-SUDV) (81.9%), GP from Bundibugyo Ebola virus (GP-BDBV) (51.1%), GP from Reston Ebola virus (GP-RESTV) (9.6%), VP40-SUDV (76.6%), and VP40-BDBV (38.3%). Overall, we developed a sensitive and specific high-throughput serological assay, and defined an algorithm, for epidemiological surveys with humans. Copyright © 2016 American Society for Microbiology.
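
    The positivity rule described above (reactive with both EBOV NP and GP) and the reported sensitivity follow directly from simple counts. The minimal sketch below encodes that rule and recomputes the sensitivity from the 90/94 figure quoted in the abstract; the per-antigen reactivity flags are hypothetical inputs, and the exact control denominator behind the 99.1% specificity is not restated here.

```python
def past_ebov_infection(reactive_np: bool, reactive_gp: bool) -> bool:
    """Positivity rule from the study: require reactivity with both EBOV NP and GP."""
    return reactive_np and reactive_gp

# Sensitivity of the combined NP+GP rule, from the counts quoted above (90/94 survivors).
sensitivity = 100.0 * 90 / 94
print(f"sensitivity = {sensitivity:.1f}%")   # 95.7%, matching the reported figure
```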

  12. Employing socially driven techniques for framing, contextualization, and collaboration in complex analytical threads

    Science.gov (United States)

    Wollocko, Arthur; Danczyk, Jennifer; Farry, Michael; Jenkins, Michael; Voshell, Martin

    2015-05-01

    The proliferation of sensor technologies continues to impact Intelligence Analysis (IA) work domains. A historical procurement focus on sensor platform development and acquisition has resulted in increasingly advanced collection systems; however, such systems often demonstrate classic data overload conditions by placing increased burdens on already overtaxed human operators and analysts. Support technologies and improved interfaces have begun to emerge to ease that burden, but these often focus on single modalities or sensor platforms rather than underlying operator and analyst support needs, resulting in systems that do not adequately leverage operators' natural attentional competencies, unique skills, and training. One particular reason why emerging support tools often fail is the gap between military applications and their functions, and the functions and capabilities afforded by cutting-edge technology employed daily by modern knowledge workers who are increasingly "digitally native." With the entry of Generation Y into these workplaces, "net generation" analysts, who are familiar with socially driven platforms that excel at giving users insight into large data sets while keeping cognitive burdens at a minimum, are creating opportunities for enhanced workflows. By using these ubiquitous platforms, net generation analysts have trained skills in discovering new information socially, tracking trends among affinity groups, and disseminating information. However, these functions are currently under-supported by existing tools. In this paper, we describe how socially driven techniques can be contextualized to frame complex analytical threads throughout the IA process. This paper focuses specifically on collaborative support technology development efforts for a team of operators and analysts. Our work focuses on under-supported functions in current working environments, and identifies opportunities to improve a team's ability to discover new information and...

  13. MyDiabetesMyWay: An Evolving National Data Driven Diabetes Self-Management Platform.

    Science.gov (United States)

    Wake, Deborah J; He, Jinzhang; Czesak, Anna Maria; Mughal, Fezan; Cunningham, Scott G

    2016-09-01

    MyDiabetesMyWay (MDMW) is an award-winning national electronic personal health record and self-management platform for diabetes patients in Scotland. This platform links multiple national institutional and patient-recorded data sources to provide a unique resource for patient care and self-management. This review considers the current evidence for online interventions in diabetes and discusses these in the context of current and ongoing developments for MDMW. Evaluation of MDMW through patient-reported outcomes demonstrates a positive impact on self-management. User feedback has highlighted barriers to uptake and has guided platform evolution from an education resource website to an electronic personal health record now encompassing remote monitoring, communication tools and personalized education links. Challenges in delivering digital interventions for long-term conditions include integrating data between institutional and personally recorded sources to perform big data analytics and facilitating technology use in those with disabilities, low digital literacy, low socioeconomic status and in minority groups. The potential for technology-supported health improvement is great, but awareness and adoption by health workers and patients remains a significant barrier. © 2016 Diabetes Technology Society.

  14. Global Simulation of Bioenergy Crop Productivity: Analytical framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Nair, S. Surendran [University of Tennessee, Knoxville (UTK)]; Nichols, Jeff A. [ORNL, Cyber Sciences]; Post, Wilfred M [ORNL]; Wang, Dali [ORNL]; Wullschleger, Stan D [ORNL]; Kline, Keith L [ORNL]; Wei, Yaxing [ORNL]; Singh, Nagendra [ORNL]; Kang, Shujiang [ORNL]

    2014-01-01

    Contemporary global assessments of the deployment potential and sustainability aspects of biofuel crops lack quantitative details. This paper describes an analytical framework capable of meeting the challenges associated with global-scale agro-ecosystem modeling. We designed a modeling platform for bioenergy crops consisting of five major components: (i) standardized global natural resources and management data sets, (ii) global simulation units and management scenarios, (iii) model calibration and validation, (iv) high-performance computing (HPC) modeling, and (v) simulation output processing and analysis. A case study with the HPC-Environmental Policy Integrated Climate model (HPC-EPIC), simulating a perennial bioenergy crop, switchgrass (Panicum virgatum L.), together with a global biomass feedstock analysis on grassland, demonstrates the application of this platform. The results illustrate the biomass feedstock variability of switchgrass and provide insights into how the modeling platform can be expanded to better assess sustainable production criteria and other biomass crops. Feedstock potentials on global grasslands and within different countries are also shown. Future efforts involve developing databases of productivity, implementing global simulations for other bioenergy crops (e.g. miscanthus, energycane and agave), and assessing environmental impacts under various management regimes. We anticipate that this platform will provide an exemplary tool and assessment data for international communities to conduct global analyses of biofuel biomass feedstocks and sustainability.

  15. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    Science.gov (United States)

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which need to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between analytical results. To address this situation, a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handling chromatographic data files. The approach can be extended with functionality such as baseline detection; peak detection, integration and identification; comparison of mass spectra; and internationalization of the application. Additionally, filters can be applied to the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, because the Rich Client Platform it builds on is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions; they can be published under open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.
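
    OpenChrom itself is a Java/Eclipse application, so the snippet below is not its API; it is only a language-neutral illustration (here in Python with SciPy) of the kind of baseline removal and peak detection the record describes, using a synthetic chromatogram.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic total-ion chromatogram: three Gaussian peaks on a drifting baseline plus noise.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 20.0, 2000)                       # retention time (min)
baseline = 50.0 + 3.0 * t                              # slow instrumental drift
peaks_true = sum(a * np.exp(-((t - m) / w) ** 2)
                 for a, m, w in [(400, 5.0, 0.15), (250, 9.5, 0.2), (600, 14.0, 0.18)])
signal = baseline + peaks_true + rng.normal(0, 5, t.size)

# Crude baseline estimate (a rolling minimum or asymmetric least squares would be better);
# subtract it before peak detection.
corrected = signal - np.polyval(np.polyfit(t, signal, 1), t)

# Detect peaks and report their retention times and apex intensities.
idx, props = find_peaks(corrected, height=50, prominence=50)
for i in idx:
    print(f"peak at {t[i]:.2f} min, apex intensity {corrected[i]:.0f}")
```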

  16. Curriculum Mapping with Academic Analytics in Medical and Healthcare Education.

    Science.gov (United States)

    Komenda, Martin; Víta, Martin; Vaitsis, Christos; Schwarz, Daniel; Pokorná, Andrea; Zary, Nabil; Dušek, Ladislav

    2015-01-01

    No universal solution, based on an approved pedagogical approach, exists to parametrically describe, effectively manage, and clearly visualize a higher education institution's curriculum, including tools for unveiling relationships inside curricular datasets. We aim to solve the issue of medical curriculum mapping to improve understanding of the complex structure and content of medical education programs. Our effort is based on the long-term development and implementation of an original web-based platform, which supports an outcomes-based approach to medical and healthcare education and is suitable for repeated updates and adaptation to curriculum innovations. We adopted data exploration and visualization approaches in the context of medical curriculum innovations in the higher education institutions domain. We have developed a robust platform, covering detailed formal metadata specifications down to the level of learning units, interconnections, and learning outcomes, in accordance with Bloom's taxonomy and with direct links to a particular biomedical nomenclature. Furthermore, we used selected modeling techniques and data mining methods to generate academic analytics reports from medical curriculum mapping datasets. We present a solution that allows users to effectively optimize a curriculum structure that is described with appropriate metadata, such as course attributes, learning units and outcomes, a standardized vocabulary nomenclature, and a tree structure of essential terms. We present a case study implementation that includes effective support for the curriculum reengineering efforts of academics through a comprehensive overview of the General Medicine study program. Moreover, we introduce a deep content analysis of a dataset captured with the use of the curriculum mapping platform; this may assist in detecting potentially problematic areas, and hence may help to construct a comprehensive overview for the subsequent global in-depth medical curriculum...

  17. ADMS Evaluation Platform

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  18. A wearable fingernail chemical sensing platform: pH sensing at your fingertips.

    Science.gov (United States)

    Kim, Jayoung; Cho, Thomas N; Valdés-Ramírez, Gabriela; Wang, Joseph

    2016-04-01

    This article demonstrates an example of a wearable chemical sensor based on a fingernail platform. Fingernails represent an attractive wearable platform, merging beauty products with chemical sensing, to enable monitoring of our surrounding environment. The new colorimetric pH fingernail sensor relies on coating artificial nails with a recognition layer consisting of pH indicators entrapped in a polyvinyl chloride (PVC) matrix. Such color-changing fingernails offer a fast and reversible response to pH changes, repeated use, and an intense color change easily detected with the naked eye. The PVC matrix prevents the indicator molecules from leaching out of the fingernail sensor during such repeated use. The narrow working pH range of a single pH indicator has been addressed by multiplexing three different pH indicators: bromothymol blue (pH 6.0-7.6), bromocresol green (pH 3.8-5.4), and cresol red (pH 7.2-8.8), as demonstrated for analyses of real-life samples of acidic, neutral, and basic character. The new concept of an optical wearable chemical sensor on fingernail platforms can be expanded towards diverse analytes for various applications through judicious design of the recognition layer. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Analytical dose modeling for preclinical proton irradiation of millimetric targets.

    Science.gov (United States)

    Vanstalle, Marie; Constanzo, Julie; Karakaya, Yusuf; Finck, Christian; Rousseau, Marc; Brasse, David

    2018-01-01

    Due to the considerable development of proton radiotherapy, several proton platforms have emerged to irradiate small animals in order to study the biological effectiveness of proton radiation. A dedicated analytical treatment planning tool was developed in this study to accurately calculate the delivered dose given the specific constraints imposed by the small dimensions of the irradiated areas. The treatment planning system (TPS) developed in this study is based on an analytical formulation of the Bragg peak and uses experimental range values of protons. The method was validated after comparison with experimental data from the literature and then compared to Monte Carlo simulations conducted using Geant4. Three examples of treatment planning, performed with phantoms made of water targets and bone-slab insert, were generated with the analytical formulation and Geant4. Each treatment planning was evaluated using dose-volume histograms and gamma index maps. We demonstrate the value of the analytical function for mouse irradiation, which requires a targeting accuracy of 0.1 mm. Using the appropriate database, the analytical modeling limits the errors caused by misestimating the stopping power. For example, 99% of a 1-mm tumor irradiated with a 24-MeV beam receives the prescribed dose. The analytical dose deviations from the prescribed dose remain within the dose tolerances stated by report 62 of the International Commission on Radiation Units and Measurements for all tested configurations. In addition, the gamma index maps show that the highly constrained targeting accuracy of 0.1 mm for mouse irradiation leads to a significant disagreement between Geant4 and the reference. This simulated treatment planning is nevertheless compatible with a targeting accuracy exceeding 0.2 mm, corresponding to rat and rabbit irradiations. Good dose accuracy for millimetric tumors is achieved with the analytical calculation used in this work. These volume sizes are typical in mouse
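
    For orientation, one common closed-form ingredient of such analytical dose engines is a range-energy rule like Bragg-Kleeman; the Python sketch below uses it to produce a crude, straggling-free depth-dose curve and is our illustrative assumption, not the formulation used in the paper.

        import numpy as np

        # Bragg-Kleeman rule for protons in water: R[cm] ~= ALPHA * E[MeV]**P
        ALPHA, P = 0.0022, 1.77

        def proton_range_cm(energy_mev):
            return ALPHA * energy_mev ** P

        def depth_dose(energy_mev, z_cm):
            """Crude analytical Bragg curve: dose ~ d(residual energy)/dz, no straggling."""
            r = proton_range_cm(energy_mev)
            residual = np.clip(r - z_cm, 1e-6, None)          # residual range at depth z
            dose = np.where(z_cm < r, residual ** (1.0 / P - 1.0), 0.0)
            return dose / dose.max()                          # normalise to the peak

        z = np.linspace(0.0, 0.8, 200)                        # depths in cm
        d = depth_dose(24.0, z)                               # 24 MeV beam
        print(round(proton_range_cm(24.0), 2))                # ~0.61 cm range in water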

  20. Validation of tumor protein marker quantification by two independent automated immunofluorescence image analysis platforms

    Science.gov (United States)

    Peck, Amy R; Girondo, Melanie A; Liu, Chengbao; Kovatich, Albert J; Hooke, Jeffrey A; Shriver, Craig D; Hu, Hai; Mitchell, Edith P; Freydin, Boris; Hyslop, Terry; Chervoneva, Inna; Rui, Hallgeir

    2016-01-01

    Protein marker levels in formalin-fixed, paraffin-embedded tissue sections traditionally have been assayed by chromogenic immunohistochemistry and evaluated visually by pathologists. Pathologist scoring of chromogen staining intensity is subjective and generates low-resolution ordinal or nominal data rather than continuous data. Emerging digital pathology platforms now allow quantification of chromogen or fluorescence signals by computer-assisted image analysis, providing continuous immunohistochemistry values. Fluorescence immunohistochemistry offers greater dynamic signal range than chromogen immunohistochemistry, and combined with image analysis holds the promise of enhanced sensitivity and analytic resolution, and consequently more robust quantification. However, commercial fluorescence scanners and image analysis software differ in features and capabilities, and claims of objective quantitative immunohistochemistry are difficult to validate as pathologist scoring is subjective and there is no accepted gold standard. Here we provide the first side-by-side validation of two technologically distinct commercial fluorescence immunohistochemistry analysis platforms. We document highly consistent results by (1) concordance analysis of fluorescence immunohistochemistry values and (2) agreement in outcome predictions both for objective, data-driven cutpoint dichotomization with Kaplan–Meier analyses and for employment of continuous marker values to compute receiver-operating curves. The two platforms examined rely on distinct fluorescence immunohistochemistry imaging hardware, microscopy vs line scanning, and functionally distinct image analysis software. Fluorescence immunohistochemistry values for nuclear-localized and tyrosine-phosphorylated Stat5a/b computed by each platform on a cohort of 323 breast cancer cases revealed high concordance after linear calibration, a finding confirmed on an independent 382 case cohort, with concordance correlation coefficients >0
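
    As an illustration only (the record does not spell out its exact procedure), agreement between two platforms after linear calibration can be summarized with Lin's concordance correlation coefficient; a short Python sketch with simulated data:

        import numpy as np

        def lin_ccc(x, y):
            """Lin's concordance correlation coefficient between two measurement series."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            sxy = np.cov(x, y, ddof=0)[0, 1]
            return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # Toy example: platform B readings are a scaled/shifted version of platform A plus noise
        rng = np.random.default_rng(0)
        a = rng.lognormal(mean=2.0, sigma=0.5, size=323)
        b = 1.8 * a + 5.0 + rng.normal(0.0, 0.5, size=a.size)

        raw_ccc = lin_ccc(a, b)                              # low: systematic scale/offset difference
        slope, intercept = np.polyfit(b, a, 1)               # linear calibration of B onto A's scale
        calibrated_ccc = lin_ccc(a, slope * b + intercept)   # close to 1 after calibration
        print(round(raw_ccc, 3), round(calibrated_ccc, 3))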

  1. Analytical Cost Metrics : Days of Future Past

    Energy Technology Data Exchange (ETDEWEB)

    Prajapati, Nirmal [Colorado State Univ., Fort Collins, CO (United States); Rajopadhye, Sanjay [Colorado State Univ., Fort Collins, CO (United States); Djidjev, Hristo Nikolov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-20

    As we move towards the exascale era, the new architectures must be capable of running the massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, image/signal processing to computational science and bioinformatics. With Moore’s law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: “how to solve massive-scale computational problems in the most time/power/energy efficient manner?”

  2. Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud

    Science.gov (United States)

    Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.

    2017-12-01

    We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to use Spark to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time averaged map, and correlation map.
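
    A bare-bones sketch of one of the named design patterns, an area-averaged time series computed over spatial tiles with PySpark, might look as follows; the tile layout and field names are our assumptions rather than the NEXUS API.

        from pyspark import SparkContext

        sc = SparkContext(appName="area-averaged-time-series")

        # Each tile: (timestamp, mean value in tile, number of valid cells) for one spatial chunk
        tiles = sc.parallelize([
            ("2017-01-01", 14.2, 900), ("2017-01-01", 15.1, 400),
            ("2017-01-02", 14.8, 900), ("2017-01-02", 15.4, 400),
        ])

        series = (tiles
                  .map(lambda t: (t[0], (t[1] * t[2], t[2])))             # (time, (weighted sum, weight))
                  .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))   # combine tiles per time step
                  .mapValues(lambda s: s[0] / s[1])                       # area-weighted mean
                  .sortByKey())

        print(series.collect())   # [('2017-01-01', 14.47...), ('2017-01-02', 14.98...)]
        sc.stop()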

  3. A Multi-Usable Cloud Service Platform: A Case Study on Improved Development Pace and Efficiency

    Directory of Open Access Journals (Sweden)

    John Lindström

    2018-02-01

    Full Text Available The case study, spanning three contexts, concerns a multi-usable cloud service platform for big data collection and analytics and how its development pace and efficiency have been improved by 50–75% by using the Arrowhead framework and changing development processes/practices. Furthermore, additional results captured during the case study are related to technology, competencies and skills, organization, management, infrastructure, and service and support. A conclusion is that when offering a complex solution such as an Industrial Product-Service System, comprising sensors, hardware, communications, software, cloud service platform, etc., it is necessary that the technology, business model, business setup, and organization all go hand in hand during the development and later operation, as all ‘components’ are required for a successful result.

  4. Integrated Rapid-Diagnostic-Test Reader Platform on a Cellphone

    Science.gov (United States)

    Mudanyali, Onur; Dimitrov, Stoyan; Sikora, Uzair; Padmanabhan, Swati; Navruz, Isa; Ozcan, Aydogan

    2012-01-01

    We demonstrate a cellphone based Rapid-Diagnostic-Test (RDT) reader platform that can work with various lateral flow immuno-chromatographic assays and similar tests to sense the presence of a target analyte in a sample. This compact and cost-effective digital RDT reader, weighing only ~65 grams, mechanically attaches to the existing camera unit of a cellphone, where various types of RDTs can be inserted to be imaged in reflection or transmission modes under light-emitting-diode (LED) based illumination. Captured raw images of these tests are then digitally processed (within less than 0.2 sec/image) through a smart application running on the cellphone for validation of the RDT as well as for automated reading of its diagnostic result. The same smart application running on the cellphone then transmits the resulting data, together with the RDT images and other related information (e.g., demographic data) to a central server, which presents the diagnostic results on a world-map through geo-tagging. This dynamic spatio-temporal map of various RDT results can then be viewed and shared using internet browsers or through the same cellphone application. We tested this platform using malaria, tuberculosis (TB) as well as HIV RDTs by installing it on both Android based smart-phones as well as an iPhone. Providing real-time spatio-temporal statistics for the prevalence of various infectious diseases, this smart RDT reader platform running on cellphones might assist health-care professionals and policy makers to track emerging epidemics worldwide and help epidemic preparedness. PMID:22596243
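
    To make the automated-reading step concrete, a toy line-detection routine over a cropped test-strip image can be sketched in a few lines of Python; this is a generic illustration under our own assumptions, not the reader's actual image-processing pipeline.

        import numpy as np

        def read_strip(strip_gray, smooth=3, min_depth=12.0):
            """Toy lateral-flow reader: collapse the strip to a 1-D profile and find dark-line dips.

            Not the algorithm from the record; a minimal illustration only.
            """
            profile = strip_gray.mean(axis=0)                    # column-wise mean intensity
            profile = np.convolve(profile, np.ones(smooth) / smooth, mode="same")
            depth = np.median(profile) - profile                 # dark lines become positive dips
            lines = [i for i in range(1, len(depth) - 1)
                     if depth[i] > min_depth and depth[i] > depth[i - 1] and depth[i] >= depth[i + 1]]
            return sorted(lines, key=lambda i: -depth[i])        # darkest line (usually control) first

        # Synthetic demo: a bright membrane with a control line at column 40 and a test line at 120
        strip = np.full((20, 200), 200.0)
        strip[:, 38:43] -= np.array([20.0, 60.0, 80.0, 60.0, 20.0])
        strip[:, 118:123] -= np.array([10.0, 35.0, 50.0, 35.0, 10.0])
        print(read_strip(strip))                                 # [40, 120]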

  5. Optimizing Hadoop Performance for Big Data Analytics in Smart Grid

    Directory of Open Access Journals (Sweden)

    Mukhtaj Khan

    2017-01-01

    Full Text Available The rapid deployment of Phasor Measurement Units (PMUs) in power systems globally is leading to Big Data challenges. New high performance computing techniques are now required to process an ever-increasing volume of data from PMUs. To that end, the Hadoop framework, an open source implementation of the MapReduce computing model, is gaining momentum for Big Data analytics in smart grid applications. However, Hadoop has over 190 configuration parameters, which can have a significant impact on the performance of the Hadoop framework. This paper presents an Enhanced Parallel Detrended Fluctuation Analysis (EPDFA) algorithm for scalable analytics on massive volumes of PMU data. The novel EPDFA algorithm builds on an enhanced Hadoop platform whose configuration parameters are optimized by Gene Expression Programming. Experimental results show that the EPDFA is 29 times faster than the sequential DFA in processing PMU data and 1.87 times faster than a parallel DFA, which utilizes the default Hadoop configuration settings.
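
    For reference, the sequential DFA that the EPDFA parallelises can be sketched in a few lines of Python; this is the generic textbook algorithm, not the authors' implementation.

        import numpy as np

        def dfa_exponent(signal, scales=(16, 32, 64, 128, 256)):
            """Detrended fluctuation analysis: slope of log F(s) versus log s."""
            y = np.cumsum(signal - np.mean(signal))              # integrated, mean-removed profile
            fluctuations = []
            for s in scales:
                n_windows = len(y) // s
                f2 = []
                for w in range(n_windows):
                    seg = y[w * s:(w + 1) * s]
                    x = np.arange(s)
                    trend = np.polyval(np.polyfit(x, seg, 1), x)  # local linear detrending
                    f2.append(np.mean((seg - trend) ** 2))
                fluctuations.append(np.sqrt(np.mean(f2)))
            slope, _ = np.polyfit(np.log(scales), np.log(fluctuations), 1)
            return slope

        rng = np.random.default_rng(1)
        print(round(dfa_exponent(rng.normal(size=4096)), 2))      # ~0.5 for white noise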

  6. Data Platforms and Cities

    DEFF Research Database (Denmark)

    Blok, Anders; Courmont, Antoine; Hoyng, Rolien

    2017-01-01

    This section offers a series of joint reflections on (open) data platforms from a variety of cases, from cycling, traffic and mapping to activism, environment and data brokering. Data platforms play a key role in contemporary urban governance. Linked to open data initiatives, such platforms are of...

  7. Mobile Platforms and Development Environments

    CERN Document Server

    Helal, Sumi; Li, Wengdong

    2012-01-01

    Mobile platform development has lately become a technological war zone with extremely dynamic and fluid movement, especially in the smart phone and tablet market space. This Synthesis lecture is a guide to the latest developments of the key mobile platforms that are shaping the mobile platform industry. The book covers the three currently dominant native platforms -- iOS, Android and Windows Phone -- along with the device-agnostic HTML5 mobile web platform. The lecture also covers location-based services (LBS) which can be considered as a platform in its own right. The lecture utilizes a sampl

  8. Polystyrene Core-Silica Shell Particles with Defined Nanoarchitectures as a Versatile Platform for Suspension Array Technology.

    Science.gov (United States)

    Sarma, Dominik; Gawlitza, Kornelia; Rurack, Knut

    2016-04-19

    The need for rapid and high-throughput screening in analytical laboratories has led to significant growth in interest in suspension array technologies (SATs), especially with regard to cytometric assays targeting a low to medium number of analytes. Such SAT or bead-based assays rely on spherical objects that constitute the analytical platform. Usually, functionalized polymer or silica (SiO2) microbeads are used, each of which has distinct advantages and drawbacks. In this paper, we present a straightforward synthetic route to highly monodisperse SiO2-coated polystyrene core-shell (CS) beads for SAT with controllable architectures from smooth to raspberry- and multilayer-like shells by varying the molecular weight of poly(vinylpyrrolidone) (PVP), which was used as the stabilizer of the cores. The combination of an organic polymer core and a structurally controlled inorganic SiO2 shell in one hybrid particle holds great promise for flexible next-generation design of the spherical platform. The particles were characterized by electron microscopy (SEM, T-SEM, and TEM), thermogravimetry, flow cytometry, and nitrogen adsorption/desorption, offering comprehensive information on the composition, size, structure, and surface area. All particles show ideal cytometric detection patterns and facile handling due to the hybrid structure. The beads are endowed with straightforward modification possibilities through the defined SiO2 shells. We successfully implemented the particles in fluorometric SAT model assays, illustrating the benefits of tailored surface area which is readily available for small-molecule anchoring. Very promising assay performance was shown for DNA hybridization assays with quantification limits down to 8 fmol.

  9. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  10. ITS Platform North Denmark

    DEFF Research Database (Denmark)

    Lahrmann, Harry; Agerholm, Niels; Juhl, Jens

    2012-01-01

    This paper presents the project entitled “ITS Platform North Denmark” which is used as a test platform for Intelligent Transportation System (ITS) solutions. The platform consists of a newly developed GNSS/GPRS On Board Unit (OBU) to be installed in 500 cars, a backend server and a specially...

  11. Platform development supported by gaming

    DEFF Research Database (Denmark)

    Mikkola, Juliana Hsuan; Hansen, Poul H. Kyvsgård

    2007-01-01

    The challenge of implementing industrial platforms in practice can be described as a configuration problem caused by a high number of variables, which often have contradictory influences on the total performance of the firm. Consequently, the specific platform decisions become extremely complex......, possibly increasing the strategic risks for the firm. This paper reports preliminary findings on the platform management process at LEGO, a Danish toy company. Specifically, we report the process of applying games combined with simulations and workshops in platform development. We also propose a framework...

  12. Baking Gender Into Social Media Design: How Platforms Shape Categories for Users and Advertisers

    Directory of Open Access Journals (Sweden)

    Rena Bivens

    2016-10-01

    Full Text Available In recent years, several popular social media platforms have launched freeform custom gender fields. This decision reconstitutes gender categories beyond an oppressive binary only permitting “males” and “females.” In this work, we uncover many different user-facing gender category design strategies within the social media ecosystem, ranging from custom gender options (on Facebook, Google+, and Pinterest) to the absence of gender fields entirely (on Twitter and LinkedIn). To explore how gender is baked into platform design, this article investigates the 10 most popular English-speaking social media platforms by performing recorded walkthroughs from two different subject positions: (1) a new user registering an account, and (2) a new advertiser creating an ad. We explore several different spaces in social media software where designers commonly program gender—sign-up pages, profile pages, and advertising portals—to consider (1) how gender is made durable through social media design, and (2) the shifting composition of the category of gender within the social media ecosystem more broadly. Through this investigation, we question how these categorizations attribute meaning to gender as they materialize in different software spaces, along with the recursive implications for society. Ultimately, our analysis reveals how social media platforms act as intermediaries within the larger ecosystem of advertising and web analytics companies. We argue that this intermediary role entrusts social media platforms with a considerable degree of control over the generation of broader categorization systems, which can be wielded to shape the perceived needs and desires of both users and advertising clients.

  13. The Effect of Electronic Word of Mouth on Sales: A Meta-Analytic Review of Platform, Product, and Metric Factors

    NARCIS (Netherlands)

    Babic, A.; Sotgiu, Francesca; de Valck, K.; Bijmolt, T.H.A.

    2016-01-01

    The increasing amount of electronic word of mouth (eWOM) has significantly affected the way consumers make purchase decisions. Empirical studies have established an effect of eWOM on sales but disagree on which online platforms, products, and eWOM metrics moderate this effect. The authors conduct a

  14. DATA ANALYSIS BY SQL-MAPREDUCE PLATFORM

    Directory of Open Access Journals (Sweden)

    A. A. A. Dergachev

    2014-01-01

    Full Text Available The paper deals with problems related to the use of relational database management systems (RDBMS), mainly in the analysis of large data content, including data analysis based on web services in the Internet. A solution to these problems can be represented as a web-oriented distributed data analysis system with a service request processor as its executive kernel. The functions of such a system are similar to those of a relational DBMS, only realized through web services. The service request processor is responsible for planning data analysis web service calls and executing them. The efficiency of such a web-oriented system depends on the efficiency of the web service call plan and its program implementation, whose basic element is the storage facility for the analyzed data, a relational DBMS. The main attention is given to extending the functionality of relational DBMS for the analysis of large data content, in particular to assessing the prospects of implementing data analysis web services on the basis of the SQL/MapReduce platform. To this end, an analytical task typical of data analysis in various social networks and web portals, based on the analysis of users' attendance data, was chosen as the application-oriented part. In the practical part of this research, the algorithm for planning web service calls was implemented to solve the application-oriented task. The efficiency of the SQL/MapReduce platform is confirmed by experimental results that show that data analysis web services can be applied effectively.
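
    The flavour of such an attendance analysis, written here as a plain Python map/reduce pipeline rather than in an actual SQL/MapReduce dialect, could be sketched as follows (purely illustrative data and field names):

        from functools import reduce

        # Toy attendance log: (user_id, date) pairs, as might be extracted from a web portal
        log = [("u1", "2014-03-01"), ("u2", "2014-03-01"), ("u1", "2014-03-02"),
               ("u1", "2014-03-02"), ("u3", "2014-03-02")]

        # "Map" phase: emit (date, 1) for every distinct (user, date) visit
        mapped = [(date, 1) for (user, date) in set(log)]

        # "Reduce" phase: sum counts per date, giving daily unique visitors
        def reducer(acc, pair):
            date, count = pair
            acc[date] = acc.get(date, 0) + count
            return acc

        daily_visitors = reduce(reducer, mapped, {})
        print(sorted(daily_visitors.items()))   # [('2014-03-01', 2), ('2014-03-02', 2)]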

  15. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with different manufacturer component models and is constructed in modularized form for easy expandability. A more robust peak identification method with improved stability has been achieved by applying additional smoothing to the calculated slope before peak identification. For element identification, an improved main-lines analysis method, which detects all elements present at the spectral peaks to avoid omitting elements without strong spectral lines, is applied to element identification in the tested LIBS samples. This method also increases the identification speed. In this paper, actual applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and conduct filtering, peak identification and qualitative analysis, etc. on spectral data. (paper)
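
    The peak-identification idea described above (smooth the computed slope before locating peaks) can be illustrated with a short numpy sketch; this is our paraphrase of the idea, not the system's C# implementation.

        import numpy as np

        def find_peaks_by_smoothed_slope(spectrum, smooth=5, min_height=0.0):
            """Locate peaks as positive-to-negative zero crossings of a smoothed first derivative."""
            slope = np.diff(spectrum)
            slope = np.convolve(slope, np.ones(smooth) / smooth, mode="same")   # extra smoothing on the slope
            crossings = np.where((slope[:-1] > 0) & (slope[1:] <= 0))[0] + 1
            return [i for i in crossings if spectrum[i] > min_height]

        # Synthetic two-line spectrum on a noisy baseline
        x = np.linspace(0, 1, 500)
        spectrum = (np.exp(-((x - 0.3) / 0.01) ** 2) + 0.6 * np.exp(-((x - 0.7) / 0.01) ** 2)
                    + 0.02 * np.random.default_rng(2).normal(size=x.size))
        print(find_peaks_by_smoothed_slope(spectrum, min_height=0.3))   # peaks near indices 150 and 350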

  16. Many-core graph analytics using accelerated sparse linear algebra routines

    Science.gov (United States)

    Kozacik, Stephen; Paolini, Aaron L.; Fox, Paul; Kelmelis, Eric

    2016-05-01

    Graph analytics is a key component in identifying emerging trends and threats in many real-world applications. Large-scale graph analytics frameworks provide a convenient and highly scalable platform for developing algorithms to analyze large datasets. Although conceptually scalable, these techniques exhibit poor performance on modern computational hardware. Another model of graph computation has emerged that promises improved performance and scalability by using abstract linear algebra operations as the basis for graph analysis as laid out by the GraphBLAS standard. By using sparse linear algebra as the basis, existing highly efficient algorithms can be adapted to perform computations on the graph. This approach, however, is often less intuitive to graph analytics experts, who are accustomed to vertex-centric APIs such as Giraph, GraphX, and Tinkerpop. We are developing an implementation of the high-level operations supported by these APIs in terms of linear algebra operations. This implementation is backed by many-core implementations of the fundamental GraphBLAS operations required, and offers the advantages of both the intuitive programming model of a vertex-centric API and the performance of a sparse linear algebra implementation. This technology can reduce the number of nodes required, as well as the run-time for a graph analysis problem, enabling customers to perform more complex analysis with less hardware at lower cost. All of this can be accomplished without the requirement for the customer to make any changes to their analytics code, thanks to the compatibility with existing graph APIs.
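
    The core idea, recasting a vertex-centric traversal as sparse linear algebra, can be seen in a small Python/scipy sketch in which breadth-first search becomes repeated sparse matrix-vector products; this is a generic illustration, not tied to any particular GraphBLAS implementation.

        import numpy as np
        from scipy.sparse import csr_matrix

        # Adjacency matrix of a small directed graph: edges 0->1, 0->2, 1->3, 2->3, 3->4
        rows, cols = [0, 0, 1, 2, 3], [1, 2, 3, 3, 4]
        a = csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(5, 5))

        def bfs_levels(adj, source):
            """BFS as repeated sparse matrix-vector products (the linear-algebra formulation)."""
            n = adj.shape[0]
            levels = np.full(n, -1)
            frontier = np.zeros(n)
            frontier[source] = 1.0
            level = 0
            while frontier.any():
                levels[frontier > 0] = level
                frontier = adj.T.dot(frontier)        # expand one hop along out-edges
                frontier[levels >= 0] = 0.0           # mask vertices already visited
                level += 1
            return levels

        print(bfs_levels(a, 0))    # [0 1 1 2 3]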

  17. Analytical performance of 17 general chemistry analytes across countries and across manufacturers in the INPUtS project of EQA organizers in Italy, the Netherlands, Portugal, United Kingdom and Spain.

    Science.gov (United States)

    Weykamp, Cas; Secchiero, Sandra; Plebani, Mario; Thelen, Marc; Cobbaert, Christa; Thomas, Annette; Jassam, Nuthar; Barth, Julian H; Perich, Carmen; Ricós, Carmen; Faria, Ana Paula

    2017-02-01

    Optimum patient care in relation to laboratory medicine is achieved when results of laboratory tests are equivalent, irrespective of the analytical platform used or the country where the laboratory is located. Standardization and harmonization minimize differences and the success of efforts to achieve this can be monitored with international category 1 external quality assessment (EQA) programs. An EQA project with commutable samples, targeted with reference measurement procedures (RMPs), was organized by EQA institutes in Italy, the Netherlands, Portugal, UK, and Spain. Results of 17 general chemistry analytes were evaluated across countries and across manufacturers according to performance specifications derived from biological variation (BV). For K, uric acid, glucose, cholesterol and high-density lipoprotein (HDL) cholesterol, the minimum performance specification was met in all countries and by all manufacturers. For Na, Cl, and Ca, the minimum performance specifications were met by none of the countries and manufacturers. For enzymes, the situation was complicated, as standardization of results of enzymes toward RMPs was still not achieved in 20% of the laboratories and questionable in the remaining 80%. The overall performance of the measurement of 17 general chemistry analytes in European medical laboratories met the minimum performance specifications. In this general picture, there were no significant differences per country and no significant differences per manufacturer. There were major differences between the analytes. There were six analytes for which the minimum quality specifications were not met and manufacturers should improve their performance for these analytes. Standardization of results of enzymes requires ongoing efforts.
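
    For context, performance specifications derived from biological variation are conventionally computed from within- and between-subject CVs; the small Python sketch below shows the usual Fraser-style desirable and minimum limits as an illustration, not the project's exact criteria, and the CV values used are illustrative.

        import math

        def bv_specs(cv_within, cv_between, level="minimum"):
            """Allowable imprecision, bias and total error from biological variation (all in %)."""
            k_i, k_b = {"desirable": (0.50, 0.250), "minimum": (0.75, 0.375)}[level]
            imprecision = k_i * cv_within
            bias = k_b * math.sqrt(cv_within ** 2 + cv_between ** 2)
            total_error = 1.65 * imprecision + bias
            return imprecision, bias, total_error

        # e.g. sodium: very tight biological variation, hence a hard-to-meet specification
        cv_i, cv_g = 0.6, 0.7          # illustrative within- and between-subject CVs (%)
        print([round(v, 2) for v in bv_specs(cv_i, cv_g, "minimum")])   # [0.45, 0.35, 1.09]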

  18. Platform Expansion Design as Strategic Choice

    DEFF Research Database (Denmark)

    Staykova, Kalina S.; Damsgaard, Jan

    2016-01-01

    In this paper, we address how the strategic choice of platform expansion design impacts the subse-quent platform strategy. We identify two distinct approaches to platform expansion – platform bun-dling and platform constellations, which currently co-exist. The purpose of this paper is to outline...

  19. A Typology of Multi-sided Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2015-01-01

    In this paper we address how the composition of a platform impacts the platform’s business model. By platform’s business model we mean platform features, platform architecture and platform governance. To this end, we construct the Platform Business Model Framework. We apply the framework to three...

  20. Fe nanoparticle tailored poly(N-methyl pyrrole) nanowire matrix: a CHEMFET study from the perspective of discrimination among electron donating analytes

    International Nuclear Information System (INIS)

    Datta, K; Rushi, A; Shirsat, M; Mulchandani, A; Ghosh, P

    2015-01-01

    Back-gated chemically sensitive field effect transistor (CHEMFET) platforms have been developed with electrochemically synthesized poly(N-methyl pyrrole) nanowires by a templateless route. The nanowire matrix has been tailored with Fe nanoparticles to probe their effect in enhancing the sensing capabilities of the nanowire platform, and further to see if the incorporation of Fe nanoparticles is helpful to enhance the screening capability of the sensor among electron-donating analytes. A noticeable difference in the sensing behaviour of the CHEMFET sensor was observed when it was exposed to three different analytes—ammonia, phosphine and carbon monoxide. FET transfer characteristics were instrumental in the corroboration of the experimental validations. The observations have been rationalized considering the simultaneous modulation of the work functions of Fe and polymeric material. The real time behaviour of the sensor shows that the sensor platform is readily capable of sensing the validated analytes at a ppb level of concentration with good response and recovery behaviour. The best response could be observed for ammonia with an Fe nanoparticle tailored polymeric matrix, with a sensitivity of ∼31.58% and excellent linearity (R² = 0.985) in a concentration window of 0.05 ppm to 1 ppm. (paper)

  1. An enhanced computational platform for investigating the roles of regulatory RNA and for identifying functional RNA motifs

    OpenAIRE

    Chang, Tzu-Hao; Huang, Hsi-Yuan; Hsu, Justin Bo-Kai; Weng, Shun-Long; Horng, Jorng-Tzong; Huang, Hsien-Da

    2013-01-01

    Background Functional RNA molecules participate in numerous biological processes, ranging from gene regulation to protein synthesis. Analysis of functional RNA motifs and elements in RNA sequences can yield useful information for deciphering RNA regulatory mechanisms. Our previous work, RegRNA, is widely used in the identification of regulatory motifs, and this work extends it by incorporating more comprehensive and updated data sources and analytical approaches into a new platform. Methods ...

  2. Dynamic Gaming Platform (DGP)

    Science.gov (United States)

    2009-04-01

    Final report (Jul 07 – Mar 09) prepared by Lockheed Martin Corporation on the Dynamic Gaming Platform (DGP). Abbreviations listed in the report documentation: CMU, Carnegie Mellon University; DGP, Dynamic Gaming Platform; GA, Genetic Algorithm; IARPA, Intelligence Advanced Research Projects Activity; LM ATL, Lockheed Martin Advanced Technology Laboratories; PAINT, ProActive INTelligence.

  3. EURESCOM Services Platform

    NARCIS (Netherlands)

    Nieuwenhuis, Lambertus Johannes Maria; van Halteren, Aart

    1999-01-01

    This paper presents the results of the EURESCOM Project 715. In February 1999, a large team of researchers from six European public network operators completed a two-year period of cooperative experiments on a TINA-based environment, called the EURESCOM Services Platform (ESP). This platform

  4. Global Simulation of Bioenergy Crop Productivity: Analytical Framework and Case Study for Switchgrass

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shujiang [ORNL; Kline, Keith L [ORNL; Nair, S. Surendran [University of Tennessee, Knoxville (UTK); Nichols, Dr Jeff A [ORNL; Post, Wilfred M [ORNL; Brandt, Craig C [ORNL; Wullschleger, Stan D [ORNL; Wei, Yaxing [ORNL; Singh, Nagendra [ORNL

    2013-01-01

    A global energy crop productivity model that provides geospatially explicit quantitative details on biomass potential and factors affecting sustainability would be useful, but does not exist now. This study describes a modeling platform capable of meeting many challenges associated with global-scale agro-ecosystem modeling. We designed an analytical framework for bioenergy crops consisting of six major components: (i) standardized natural resources datasets, (ii) global field-trial data and crop management practices, (iii) simulation units and management scenarios, (iv) model calibration and validation, (v) high-performance computing (HPC) simulation, and (vi) simulation output processing and analysis. The HPC-Environmental Policy Integrated Climate (HPC-EPIC) model simulated a perennial bioenergy crop, switchgrass (Panicum virgatum L.), estimating feedstock production potentials and effects across the globe. This modeling platform can assess soil C sequestration, net greenhouse gas (GHG) emissions, nonpoint source pollution (e.g., nutrient and pesticide loss), and energy exchange with the atmosphere. It can be expanded to include additional bioenergy crops (e.g., miscanthus, energy cane, and agave) and food crops under different management scenarios. The platform and switchgrass field-trial dataset are available to support global analysis of biomass feedstock production potential and corresponding metrics of sustainability.

  5. Graphene prepared by one-pot solvent exfoliation as a highly sensitive platform for electrochemical sensing

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Can; Cheng, Qin [Key Laboratory for Large-Format Battery Materials and System, Ministry of Education, School of Chemistry and Chemical Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Wu, Kangbing, E-mail: kbwu@hust.edu.cn [Key Laboratory for Large-Format Battery Materials and System, Ministry of Education, School of Chemistry and Chemical Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China); Wu, Gang [Materials Physics and Applications Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Li, Qing, E-mail: qing_li_2@brown.edu [Materials Physics and Applications Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2014-05-01

    Highlights: • Graphene was prepared by one-step solvent exfoliation as a superior electrode material. • Compared with RGO, the prepared graphene exhibited stronger signal enhancement. • A widespread and highly-sensitive electrochemical sensing platform was constructed. - Abstract: Graphene was easily obtained via one-step ultrasonic exfoliation of graphite powder in N-methyl-2-pyrrolidone. Scanning electron microscopy, transmission electron microscopy, Raman and particle size measurements indicated that the exfoliation efficiency and the amount of produced graphene increased with ultrasonic time. The electrochemical properties and analytical applications of the resulting graphene were systematically studied. Compared with the predominantly-used reduced graphene oxides, the graphene obtained by one-step solvent exfoliation greatly enhanced the oxidation signals of various analytes, such as ascorbic acid (AA), dopamine (DA), uric acid (UA), xanthine (XA), hypoxanthine (HXA), bisphenol A (BPA), ponceau 4R, and sunset yellow. The detection limits of AA, DA, UA, XA, HXA, BPA, ponceau 4R, and sunset yellow were evaluated to be 0.8 μM, 7.5 nM, 2.5 nM, 4 nM, 10 nM, 20 nM, 2 nM, and 1 nM, which are much lower than previously reported values. Thus, the graphene prepared via the solvent exfoliation strategy displays strong signal amplification ability and holds great promise in constructing a universal and sensitive electrochemical sensing platform.

  6. Towards a Disruptive Digital Platform Model

    DEFF Research Database (Denmark)

    Kazan, Erol

    that digital platforms leverage on three strategic design elements (i.e., business, architecture, and technology design) to create supportive conditions for facilitating disruption. To shed light on disruptive digital platforms, I opted for payment platforms as my empirical context and unit of analysis......Digital platforms are layered modular information technology architectures that support disruption. Digital platforms are particularly disruptive, as they facilitate the quick release of digital innovations that may replace established innovations. Yet, despite their support for disruption, we have...... not fully understood how such digital platforms can be strategically designed and configured to facilitate disruption. To that end, this thesis endeavors to unravel disruptive digital platforms from the supply perspective that are grounded on strategic digital platform design elements. I suggest...

  7. Groundwater Assessment Platform

    OpenAIRE

    Podgorski, Joel; Berg, Michael

    2018-01-01

    The Groundwater Assessment Platform is a free, interactive online GIS platform for the mapping, sharing and statistical modeling of groundwater quality data. The modeling allows users to take advantage of publicly available global datasets of various environmental parameters to produce prediction maps of their contaminant of interest.

  8. Preparing for a Product Platform

    DEFF Research Database (Denmark)

    Fiil-Nielsen, Ole; Munk, Lone; Mortensen, Niels Henrik

    2005-01-01

    on commonalities and similarities in the product family, and variance should be based on customer demands. To relate these terms and to improve the basis on which decisions are made, we need a way of visualizing the hierarchy of the product family as well as the commonality and variance. This visualization method...... of the platform or ensuring that the platform can meet future demands will be very useful in the preparation process of a platform synthesis as well as in the updating or reengineering of an existing product development platform.......Experience in the industry as well as recent related scientific publications show the benefits of product development platforms. Companies use platforms to develop not a single but multiple products (i.e. a product family) simultaneously. When these product development projects are coordinated...

  9. Use of IT platform in determination of efficiency of mining machines

    Science.gov (United States)

    Brodny, Jarosław; Tutak, Magdalena

    2018-01-01

    Determining the effective use of mining machines is of great significance for mining enterprises. The high costs of purchasing and leasing such machines mean that these enterprises strive to make the best use of the technical potential they possess. However, the specifics of mining production mean that this process does not always proceed without interference. Practical experience shows that determining an objective measure of machine utilization in a mining enterprise is not simple. The paper presents a proposed solution to this problem, based on an IT platform and the overall equipment effectiveness (OEE) model. The OEE model evaluates a machine in terms of its availability, performance, and product quality, and constitutes a quantitative tool of the TPM strategy. Adapted to the specifics of the mining branch, the OEE model, together with data acquired from the industrial automation system, made it possible to determine the partial indicators and the overall efficiency of the tested machines. The study covered a set of machines directly used in the coal extraction process: a longwall shearer, an armoured face conveyor, and a beam stage loader. The obtained results clearly indicate that the degree of machine utilization in mining enterprises is unsatisfactory. The use of IT platforms will significantly facilitate the registration, archiving, and analytical processing of the acquired data. The paper presents a methodology for determining the partial indices and the total OEE, together with a practical example of its application to the investigated set of machines, and also characterizes the IT platform in terms of its construction, functions, and application.
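
    As a quick illustration of the OEE model mentioned above (not the paper's exact parameterisation), overall equipment effectiveness is the product of availability, performance, and quality rates; a minimal Python sketch with invented shift figures:

        def oee(planned_time, downtime, ideal_cycle_time, total_count, good_count):
            """Overall Equipment Effectiveness = Availability x Performance x Quality."""
            run_time = planned_time - downtime
            availability = run_time / planned_time
            performance = (ideal_cycle_time * total_count) / run_time
            quality = good_count / total_count
            return availability * performance * quality

        # Toy shift for a longwall shearer: 480 min planned, 150 min of stoppages,
        # an ideal rate equivalent to 0.5 min per tonne, 520 t cut of which 500 t on-spec
        print(round(oee(480, 150, 0.5, 520, 500), 2))   # ~0.52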

  10. Flexible continuous manufacturing platforms for solid dispersion formulations

    Science.gov (United States)

    Karry-Rivera, Krizia Marie

    In 2013, 16,000 people died in the US from overdoses of prescription drugs and synthetic narcotics. As of that same year, 90% of new molecular entities in the pharmaceutical drug pipeline were classified as poorly water-soluble. The work in this dissertation aims to design, develop and validate platforms that solubilize weak acids and can potentially deter drug abuse. These platforms are based on processing solid dispersions via solvent-casting and hot-melt extrusion methods to produce oral transmucosal films and melt tablets. To develop these platforms, nanocrystalline suspensions and glassy solutions were solvent-cast in the form of films after physicochemical characterizations of drug-excipient interactions and design of experiment approaches. A second-order model was fitted to the emulsion diffusion process to predict average nanoparticle size and for process optimization. To further validate the manufacturing flexibility of the formulations, glassy solutions were also extruded and molded into tablets. This process included a systematic quality-by-design (QbD) approach that served to identify the factors affecting the critical quality attributes (CQAs) of the melt tablets. These products, due to their novelty, lack discriminatory performance tests that serve as predictors of their compliance and stability. Consequently, Process Analytical Technology (PAT) tools were integrated into the continuous manufacturing platform for films. Near-infrared (NIR) spectroscopy, including chemical imaging, combined with deconvolution algorithms were utilized for a holistic assessment of the effect of formulation and process variables on the product's CQAs. Biorelevant dissolution protocols were then established to improve the in-vivo/in-vitro correlation of the oral transmucosal films. In conclusion, the work in this dissertation supports the delivery of poorly water-soluble drugs in products that may deter abuse. Drug nanocrystals ensured high bioavailability, while glassy

  11. The Logic of Digital Platform Disruption

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    Digital platforms are disruptive IT artifacts, because they facilitate the quick release of innovative platform derivatives from third parties (e.g., apps). This study endeavours to unravel the disruptive potential, caused by distinct designs and configurations of digital platforms on market...... environments. We postulate that the disruptive potential of digital platforms is determined by the degree of alignment among the business, technology and platform profiles. Furthermore, we argue that the design and configuration of the aforementioned three elements dictates the extent to which open innovation...... is permitted. To shed light on the disruptive potential of digital platforms, we opted for payment platforms as our unit of analysis. Through interviews with experts and payment providers, we seek to gain an in-depth appreciation of how contemporary digital payment platforms are designed and configured...

  12. A universal and label-free impedimetric biosensing platform for discrimination of single nucleotide substitutions in long nucleic acid strands.

    Science.gov (United States)

    Mills, Dawn M; Martin, Christopher P; Armas, Stephanie M; Calvo-Marzal, Percy; Kolpashchikov, Dmitry M; Chumbimuni-Torres, Karin Y

    2018-06-30

    We report a label-free universal biosensing platform for highly selective detection of long nucleic acid strands. The sensor consists of an electrode-immobilized universal stem-loop (USL) probe and two adaptor strands that form a 4J structure in the presence of a specific DNA/RNA analyte. The sensor was characterized by electrochemical impedance spectroscopy (EIS) using the K₃[Fe(CN)₆]/K₄[Fe(CN)₆] redox couple in solution. An increase in charge transfer resistance (R_CT) was observed upon 4J structure formation, the value of which depends on the analyte length. Cyclic voltammetry (CV) was used to further characterize the sensor and monitor the electrochemical reaction in conjunction with thickness measurements of the mixed DNA monolayer obtained using spectroscopic ellipsometry. In addition, the electron transfer was calculated at the electrode/electrolyte interface using a rotating disk electrode. Limits of detection in the femtomolar range were achieved for nucleic acid targets of different lengths (22 nt, 60 nt, 200 nt). The sensor produced only a background signal in the presence of single-base-mismatched analytes, even at a hundred-fold excess in concentration. This label-free and highly selective biosensing platform is versatile and can be used for universal detection of nucleic acids of varied lengths, which could revolutionize point-of-care diagnostics for applications such as bacterial or cancer screening. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. THE DESIGN OF A HIGH PERFORMANCE EARTH IMAGERY AND RASTER DATA MANAGEMENT AND PROCESSING PLATFORM

    Directory of Open Access Journals (Sweden)

    Q. Xie

    2016-06-01

    Full Text Available This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC. Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  14. The Design of a High Performance Earth Imagery and Raster Data Management and Processing Platform

    Science.gov (United States)

    Xie, Qingyun

    2016-06-01

    This paper summarizes the general requirements and specific characteristics of both geospatial raster database management system and raster data processing platform from a domain-specific perspective as well as from a computing point of view. It also discusses the need of tight integration between the database system and the processing system. These requirements resulted in Oracle Spatial GeoRaster, a global scale and high performance earth imagery and raster data management and processing platform. The rationale, design, implementation, and benefits of Oracle Spatial GeoRaster are described. Basically, as a database management system, GeoRaster defines an integrated raster data model, supports image compression, data manipulation, general and spatial indices, content and context based queries and updates, versioning, concurrency, security, replication, standby, backup and recovery, multitenancy, and ETL. It provides high scalability using computer and storage clustering. As a raster data processing platform, GeoRaster provides basic operations, image processing, raster analytics, and data distribution featuring high performance computing (HPC). Specifically, HPC features include locality computing, concurrent processing, parallel processing, and in-memory computing. In addition, the APIs and the plug-in architecture are discussed.

  15. Transactional Network Platform: Applications

    Energy Technology Data Exchange (ETDEWEB)

    Katipamula, Srinivas; Lutes, Robert G.; Ngo, Hung; Underhill, Ronald M.

    2013-10-31

    In FY13, Pacific Northwest National Laboratory (PNNL), with funding from the Department of Energy’s (DOE’s) Building Technologies Office (BTO), designed, prototyped and tested a transactional network platform to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). Initially, in FY13, the concept demonstrated transactions between packaged rooftop air conditioners and heat pump units (RTUs) and the electric grid using applications or "agents" that reside on the platform, on the equipment, on a local building controller or in the Cloud. The transactional network project is a multi-lab effort with Oak Ridge National Laboratory (ORNL) and Lawrence Berkeley National Laboratory (LBNL) also contributing to the effort. PNNL coordinated the project and also was responsible for the development of the transactional network (TN) platform and three different applications associated with RTUs. This document describes two applications or "agents" in detail, and also summarizes the platform. The TN platform details are described in another companion document.

  16. Vertical Relationships within Platform Marketplaces

    Directory of Open Access Journals (Sweden)

    Mark J. Tremblay

    2016-07-01

    Full Text Available In two-sided markets a platform allows consumers and sellers to interact by creating sub-markets within the platform marketplace. For example, Amazon has sub-markets for all of the different product categories available on its site, and smartphones have sub-markets for different types of applications (gaming apps, weather apps, map apps, ridesharing apps, etc.). The network benefits between consumers and sellers depend on the mode of competition within the sub-markets: more competition between sellers lowers product prices, increases the surplus consumers receive from a sub-market, and makes platform membership more desirable for consumers. However, more competition also lowers profits for a seller, which makes platform membership less desirable for a seller and reduces seller entry and the number of sub-markets available on the platform marketplace. This dynamic between seller competition within a sub-market and agents’ network benefits leads to platform pricing strategies, participation decisions by consumers and sellers, and welfare results that depend on the mode of competition. Thus, the sub-market structure is important when investigating platform marketplaces.

  17. moocRP: Enabling Open Learning Analytics with an Open Source Platform for Data Distribution, Analysis, and Visualization

    Science.gov (United States)

    Pardos, Zachary A.; Whyte, Anthony; Kao, Kevin

    2016-01-01

    In this paper, we address issues of transparency, modularity, and privacy with the introduction of an open source, web-based data repository and analysis tool tailored to the Massive Open Online Course community. The tool integrates data request/authorization and distribution workflow features as well as provides a simple analytics module upload…

  18. The vacuum platform

    Science.gov (United States)

    McNab, A.

    2017-10-01

    This paper describes GridPP’s Vacuum Platform for managing virtual machines (VMs), which has been used to run production workloads for WLCG and other HEP experiments. The platform provides a uniform interface between VMs and the sites they run at, whether the site is organised as an Infrastructure-as-a-Service cloud system such as OpenStack, or an Infrastructure-as-a-Client system such as Vac. The paper describes our experience in using this platform, in developing and operating VM lifecycle managers Vac and Vcycle, and in interacting with VMs provided by LHCb, ATLAS, ALICE, CMS, and the GridPP DIRAC service to run production workloads.

  19. Tectonic resemblance of the Indian Platform, Pakistan with the Moesian Platform, Romania and strategy for exploration of hydrocarbons

    International Nuclear Information System (INIS)

    Memon, A.D.

    1994-01-01

    There is a remarkable tectonic resemblance between the Indian Platform (Pakistan) and the Moesian Platform (Romania). Viewed in a global tectonic perspective, the Moesian and Indian Plates have played important roles in the Alpine-Himalayan Orogeny; the Moesian and Indian Platforms are extensions of these respective plates. A characteristic feature of both platforms is block faulting, which has affected not only the general tectonic framework but has also played an important role in oil accumulation. The main producing rocks in the Moesian Platform are Jurassic sandstones and Cretaceous limestones, while in the Indian Platform Cretaceous sandstones are important reservoirs. The average geothermal gradient in the Indian Platform is 2.45 °C/100 m, with higher gradients in the central gas-producing region. Geothermal gradients in the Moesian Platform average 3 °C/100 m, with higher gradients in the northern part. Some of the producing structures in both platforms are remarkably similar; traps associated with normal faults are very important. Extensive exploration carried out in the Moesian Platform makes it a very important oil-producing region of Romania. After the discovery of oil in lower Sindh, serious exploration is being carried out in the Indian Platform. The paper deals with the similarities between these two important platforms. In the light of the studies of the Moesian Platform, strategies for the exploration of oil and gas in the Indian Platform are suggested. (author)

  20. Utilizing platforms in industrialized construction

    DEFF Research Database (Denmark)

    Bonev, Martin; Wörösch, Michael; Hvam, Lars

    2015-01-01

    platform strategies, this research highlights key aspects of adapting platform-based developed theory to industrialised construction. Building projects use different layers of product, process and logistics platforms to form the right cost–value ratio for the target market application, while modelling...

  1. Stratospheric Platforms for Monitoring Purposes

    International Nuclear Information System (INIS)

    Konigorski, D.; Gratzel, U.; Obersteiner, M.; Schneidereit, M.

    2010-01-01

    Stratospheric platforms are emerging systems based on challenging technology. The goal is to create a platform, payload, and mission design able to complement satellite services on a local scale. Applications are close to traditional satellite business in telecommunication, navigation, science, and earth observation and include, for example, mobile telecommunications, navigation augmentation, atmospheric research, or border control. Stratospheric platforms could potentially support monitoring activities related to safeguards, e.g. by imagery of surfaces, operational conditions of nuclear facilities, and search for undeclared nuclear activities. Stratospheric platforms are intended to be flown in an altitude band between 16 and 30 km, above 16-20 km to take advantage of usually lower winds facilitating station keeping, below 30 km to limit the challenges to achieve a reasonable payload at acceptable platform sizes. Stratospheric platforms could substitute for satellites, which are expensive and lack upgrade capabilities for new equipment. Furthermore, they have practically unlimited time over an area of interest. It is intended to keep the platforms operational and maintenance-free on a 24/7 basis with an average deployment time of 3 years. Geostationary satellites lack resolution. Potential customers like Armed Forces, National Agencies and commercial customers have indicated interest in the use of stratospheric platforms. Governmental entities are looking for cheaper alternatives to communications and surveillance satellites and stratospheric platforms could offer the following potential advantages: Lower operational cost than a satellite or UAV (Unmanned Aerial Vehicle) constellation (fleet required); Faster deployment than a satellite constellation; Repositioning capability and ability to loiter as required; Persistent long-term real-time services over a fairly large regional spot; Surge capability: Able to extend capability (either monitoring or communications

  2. Detection of mercury(II) ions using colorimetric gold nanoparticles on paper-based analytical devices.

    Science.gov (United States)

    Chen, Guan-Hua; Chen, Wei-Yu; Yen, Yu-Chun; Wang, Chia-Wei; Chang, Huan-Tsung; Chen, Chien-Fu

    2014-07-15

    An on-field colorimetric sensing strategy employing gold nanoparticles (AuNPs) and a paper-based analytical platform was investigated for mercury ion (Hg(2+)) detection at water sources. By utilizing thymine-Hg(2+)-thymine (T-Hg(2+)-T) coordination chemistry, label-free detection oligonucleotide sequences were attached to unmodified gold nanoparticles to provide rapid mercury ion sensing without complicated and time-consuming thiolated or other costly labeled probe preparation processes. Not only is this strategy's sensing mechanism specific toward Hg(2+), rather than other metal ions, but also the conformational change in the detection oligonucleotide sequences introduces different degrees of AuNP aggregation that causes the color of AuNPs to exhibit a mixture variance. To eliminate the use of sophisticated equipment and minimize the power requirement for data analysis and transmission, the color variance of multiple detection results were transferred and concentrated on cellulose-based paper analytical devices, and the data were subsequently transmitted for the readout and storage of results using cloud computing via a smartphone. As a result, a detection limit of 50 nM for Hg(2+) spiked pond and river water could be achieved. Furthermore, multiple tests could be performed simultaneously with a 40 min turnaround time. These results suggest that the proposed platform possesses the capability for sensitive and high-throughput on-site mercury pollution monitoring in resource-constrained settings.

  3. Adoption of Mobile Payment Platforms

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    Numerous mobile payment solutions, which rely on new disruptive technologies, have been launched on the payment market in recent years. But despite the growing number of mobile payment apps, very few solutions have turned to be successful as the majority of them fail to gain a critical mass...... of users. In this paper, we investigate successful platform adoption strategies by using the Reach and Range Framework for Multi-Sided Platforms as a strategic tool to which mobile payment providers can adhere in order to tackle some of the main challenges they face throughout the evolution...... of their platforms. The analysis indicates that successful mobile payment solutions tend to be launched as one-sided platforms and then gradually be expanded into being two-sided. Our study showcases that the success of mobile payment platforms lies with the ability of the platform to balance the reach (number...

  4. The Dynamics of Digital Platform Innovation

    DEFF Research Database (Denmark)

    Eaton, Ben

    2016-01-01

    Curated platforms provide an architectural basis for third parties to develop platform complements and for platform owners to control their implementation as a form of open innovation. The refusal to implement complements as innovations can cause tension between platform owners and developers. Th...

  5. Cross-platform learning: on the nature of children's learning from multiple media platforms.

    Science.gov (United States)

    Fisch, Shalom M

    2013-01-01

    It is increasingly common for an educational media project to span several media platforms (e.g., TV, Web, hands-on materials), assuming that the benefits of learning from multiple media extend beyond those gained from one medium alone. Yet research typically has investigated learning from a single medium in isolation. This paper reviews several recent studies to explore cross-platform learning (i.e., learning from combined use of multiple media platforms) and how such learning compares to learning from one medium. The paper discusses unique benefits of cross-platform learning, a theoretical mechanism to explain how these benefits might arise, and questions for future research in this emerging field. Copyright © 2013 Wiley Periodicals, Inc., A Wiley Company.

  6. Collaborative visual analytics of radio surveys in the Big Data era

    Science.gov (United States)

    Vohl, Dany; Fluke, Christopher J.; Hassan, Amr H.; Barnes, David G.; Kilborn, Virginia A.

    2017-06-01

    Radio survey datasets comprise an increasing number of individual observations stored as sets of multidimensional data. In large survey projects, astronomers commonly face limitations regarding: 1) interactive visual analytics of sufficiently large subsets of data; 2) synchronous and asynchronous collaboration; and 3) documentation of the discovery workflow. To support collaborative data inquiry, we present encube, a large-scale comparative visual analytics framework. encube can utilise advanced visualization environments such as the CAVE2 (a hybrid 2D and 3D virtual reality environment powered with a 100 Tflop/s GPU-based supercomputer and 84 million pixels) for collaborative analysis of large subsets of data from radio surveys. It can also run on standard desktops, providing a capable visual analytics experience across the display ecology. encube is composed of four primary units enabling compute-intensive processing, advanced visualisation, dynamic interaction, parallel data query, along with data management. Its modularity will make it simple to incorporate astronomical analysis packages and Virtual Observatory capabilities developed within our community. We discuss how encube builds a bridge between high-end display systems (such as CAVE2) and the classical desktop, preserving all traces of the work completed on either platform - allowing the research process to continue wherever you are.

  7. The European Photovoltaic Technology Platform

    International Nuclear Information System (INIS)

    Nowak, S.; Aulich, H.; Bal, J.L.; Dimmler, B.; Garnier, A.; Jongerden, G.; Luther, J.; Luque, A.; Milner, A.; Nelson, D.; Pataki, I.; Pearsall, N.; Perezagua, E.; Pietruszko, S.; Rehak, J.; Schellekens, E.; Shanker, A.; Silvestrini, G.; Sinke, W.; Willemsen, H.

    2006-05-01

    The European Photovoltaic Technology Platform is one of the European Technology Platforms, a new instrument proposed by the European Commission. European Technology Platforms (ETPs) are a mechanism to bring together all interested stakeholders to develop a long-term vision to address a specific challenge, create a coherent, dynamic strategy to achieve that vision and steer the implementation of an action plan to deliver agreed programmes of activities and optimise the benefits for all parties. The European Photovoltaic Technology Platform has recently been established to define, support and accompany the implementation of a coherent and comprehensive strategic plan for photovoltaics. The platform will mobilise all stakeholders sharing a long-term European vision for PV, helping to ensure that Europe maintains and improves its industrial position. The platform will realise a European Strategic Research Agenda for PV for the next decade(s). Guided by a Steering Committee of 20 high level decision-makers representing all relevant European PV Stakeholders, the European PV Technology Platform comprises 4 Working Groups dealing with the subjects policy and instruments; market deployment; science, technology and applications as well as developing countries and is supported by a secretariat

  8. Dental Implant Surrounding Marginal Bone Level Evaluation: Platform Switching versus Platform Matching—One-Year Retrospective Study

    Directory of Open Access Journals (Sweden)

    Eisner Salamanca

    2017-01-01

    The benefits and feasibility of platform switching have been discussed in several studies, reporting less crestal bone loss in platform-switched implants than in platform-matched implants. Objective. The aim of the present study was to observe the changes in vertical and horizontal marginal bone levels in platform-switched and platform-matched dental implants. Materials and Methods. 51 patients received 60 dental implants in the present study over a 1-year period. Measurement was performed between the implant shoulder and the most apical and horizontal marginal defect by periapical radiographs to examine the changes of peri-implant alveolar bone before and 12 months after prosthodontic restoration delivery. Results. These marginal bone measurements showed a bone gain of 0.23±0.58 mm in the vertical gap and 0.22±0.53 mm in the horizontal gap of platform matching, while in platform switching a bone gain of 0.93±1 mm (P<0.05) in the vertical gap and 0.50±0.56 mm in the horizontal gap was found. The average vertical gap reduction from the baseline until 12 months was 0.92±1.11 mm in platform switching and 0.29±0.85 mm in platform matching (P<0.05). Conclusions. Within the limitations of the present study, platform switching seemed to be more effective for a better peri-implant alveolar bone vertical and horizontal gap reduction at 1 year.
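
    As a rough arithmetic check of the reported difference in vertical gap reduction, the sketch below computes Welch's t statistic from the summary statistics quoted above. The equal 30/30 split of the 60 implants is an assumption made here for illustration; the actual group sizes are not stated in this record.

```python
# Illustrative only: Welch's t statistic from the reported summary statistics.
# Group sizes are assumed equal (30 implants each); the record reports 60 implants in
# total but not the split between platform-switched and platform-matched groups.
import math

def welch_t(m1, s1, n1, m2, s2, n2):
    se = math.sqrt(s1**2 / n1 + s2**2 / n2)   # standard error of the mean difference
    return (m1 - m2) / se

t = welch_t(0.92, 1.11, 30, 0.29, 0.85, 30)
print(f"Welch's t for vertical gap reduction: {t:.2f}")  # |t| > ~2 is consistent with P < 0.05
```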

  9. Validation of the three web quality dimensions of a minimally invasive surgery e-learning platform.

    Science.gov (United States)

    Ortega-Morán, Juan Francisco; Pagador, J Blas; Sánchez-Peralta, Luisa Fernanda; Sánchez-González, Patricia; Noguera, José; Burgos, Daniel; Gómez, Enrique J; Sánchez-Margallo, Francisco M

    2017-11-01

    E-learning web environments, including the new TELMA platform, are increasingly being used to provide cognitive training in minimally invasive surgery (MIS) to surgeons. A complete validation of this MIS e-learning platform has been performed to determine whether it complies with the three web quality dimensions: usability, content and functionality. Twenty-one surgeons participated in the validation trials. They performed a set of tasks in the TELMA platform, where an e-MIS validity approach was followed. Subjective (questionnaires and checklists) and objective (web analytics) metrics were analysed to achieve the complete validation of usability, content and functionality. The TELMA platform allowed access to didactic content with easy and intuitive navigation. Surgeons performed all tasks with a close-to-ideal number of clicks and amount of time. They considered the design of the website to be consistent (95.24%), organised (90.48%) and attractive (85.71%). Moreover, they gave the content a high score (4.06 out of 5) and considered it adequate for teaching purposes. The surgeons scored the professional language and content (4.35), logo (4.24) and recommendations (4.20) the highest. Regarding functionality, the TELMA platform received an acceptance of 95.24% for navigation and 90.48% for interactivity. According to the study, it seems that TELMA had an attractive design, innovative content and interactive navigation, which are three key features of an e-learning platform. TELMA successfully met the three criteria necessary for consideration as a website of quality by achieving more than 70% agreement regarding all usability, content and functionality items validated; this constitutes a preliminary requirement for an effective e-learning platform. However, the content completeness, authoring tool and registration process required improvement. Finally, the e-MIS validity methodology used to measure the three dimensions of web quality in this work can be applied to other

  10. The Educational Platform: Constructing Conceptual Frameworks.

    Science.gov (United States)

    Peca, Kathy; Isham, Mark

    2001-01-01

    The education faculty at Eastern New Mexico University used educational platforms as a means of developing the unit's conceptual framework. Faculty members developed personal platforms, then synthesized them into one curricular area platform. The resultant unit educational platform became the basis for the unit's conceptual framework, which…

  11. Following User Pathways: Cross Platform and Mixed Methods Analysis in Social Media Studies

    DEFF Research Database (Denmark)

    Hall, Margeret; Mazarakis, Athanasios; Peters, Isabella

    2016-01-01

    is the mixed method approach (e.g. qualitative and quantitative methods) in order to better understand how users and society interacts online. The workshop 'Following User Pathways' brings together a community of researchers and professionals to address methodological, analytical, conceptual, and technological......Social media and the resulting tidal wave of available data have changed the ways and methods researchers analyze communities at scale. But the full potential for social scientists (and others) is not yet achieved. Despite the popularity of social media analysis in the past decade, few researchers...... challenges and opportunities of cross-platform, mixed method analysis in social media ecosystems....

  12. National Community Solar Platform

    Energy Technology Data Exchange (ETDEWEB)

    Rupert, Bart [Clean Energy Collective, Louisville, CO (United States)

    2016-06-30

    This project was created to provide a National Community Solar Platform (NCSP) portal known as Community Solar Hub, that is available to any entity or individual who wants to develop community solar. This has been done by providing a comprehensive portal to make CEC’s solutions, and other proven community solar solutions, externally available for everyone to access – making the process easy through proven platforms to protect subscribers, developers and utilities. The successful completion of this project provides these tools via a web platform and integration APIs, a wide spectrum of community solar projects included in the platform, multiple groups of customers (utilities, EPCs, and advocates) using the platform to develop community solar, and open access to anyone interested in community solar. CEC’s Incubator project includes web-based informational resources, integrated systems for project information and billing systems, and engagement with customers and users by community solar experts. The combined effort externalizes much of Clean Energy Collective’s industry-leading expertise, allowing third parties to develop community solar without duplicating expensive start-up efforts. The availability of this platform creates community solar projects that are cheaper to build and cheaper to participate in, furthering the goals of DOE’s SunShot Initiative. Final SF 425 Final SF 428 Final DOE F 2050.11 Final Report Narrative

  13. Flexible experimental FPGA based platform

    DEFF Research Database (Denmark)

    Andersen, Karsten Holm; Nymand, Morten

    2016-01-01

    This paper presents an experimental flexible Field Programmable Gate Array (FPGA) based platform for testing and verifying digital controlled dc-dc converters. The platform supports different types of control strategies, dc-dc converter topologies and switching frequencies. The controller platform...... interface supporting configuration and reading of setup parameters, controller status and the acquisition memory in a simple way. The FPGA based platform, provides an easy way within education or research to use different digital control strategies and different converter topologies controlled by an FPGA...
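
    The record does not detail the control laws themselves; as an illustration of the kind of digital control strategy such an FPGA platform might run, the sketch below implements a generic discrete PI voltage-loop update in Python. The gains, sampling period, and duty-cycle limits are placeholder values, not the authors' design.

```python
# Illustrative sketch (not the authors' design): a discrete PI voltage-loop update of the
# kind commonly used in FPGA-based digital control of dc-dc converters.
def make_pi_controller(kp, ki, ts, d_min=0.0, d_max=0.95):
    integral = 0.0
    def update(v_ref, v_meas):
        nonlocal integral
        error = v_ref - v_meas
        integral += ki * ts * error
        # Clamp the integrator and the duty cycle (simple anti-windup)
        integral = min(max(integral, d_min), d_max)
        duty = min(max(kp * error + integral, d_min), d_max)
        return duty
    return update

pi = make_pi_controller(kp=0.05, ki=200.0, ts=1 / 100e3)  # example gains, 100 kHz update rate
print(pi(v_ref=5.0, v_meas=4.8))                          # duty cycle command for this sample
```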

  14. Analytical approach to determine vertical dynamics of a semi-trailer truck from the point of view of goods protection

    Science.gov (United States)

    Pidl, Renáta

    2018-01-01

    The overwhelming majority of intercontinental long-haul transportations of goods are usually carried out on road by semi-trailer trucks. Vibration has a major effect regarding the safety of the transport, the load and the transported goods. This paper deals with the logistics goals from the point of view of vibration and summarizes the methods to predict or measure the vibration load in order to design a proper system. From these methods, the focus of this paper is on the computer simulation of the vibration. An analytical method is presented to calculate the vertical dynamics of a semi-trailer truck containing general viscous damping and exposed to harmonic base excitation. For the purpose of a better understanding, the method will be presented through a simplified four degrees-of-freedom (DOF) half-vehicle model, which neglects the stiffness and damping of the tires, thus the four degrees-of-freedom are the vertical and angular displacements of the truck and the trailer. From the vertical and angular accelerations of the trailer, the vertical acceleration of each point of the platform of the trailer can easily be determined, from which the forces acting on the transported goods are given. As a result of this paper the response of the full platform-load-packaging system to any kind of vehicle, any kind of load and any kind of road condition can be analyzed. The peak acceleration of any point on the platform can be determined by the presented analytical method.
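
    A minimal numerical sketch of the computation described follows: the steady-state response of a viscously damped multi-degree-of-freedom system to harmonic excitation, obtained by solving (K - w^2 M + i w C) X = F for the complex amplitude vector X. The matrices, excitation frequency, and force vector are illustrative placeholders, not the paper's half-vehicle model.

```python
# Minimal sketch of a steady-state harmonic response of a damped multi-DOF system:
# solve (K - w^2 M + i w C) X = F, then convert displacement amplitudes to accelerations.
# All numerical values below are illustrative placeholders, not the paper's model.
import numpy as np

M = np.diag([7000.0, 9000.0, 12000.0, 30000.0])      # masses / inertias (kg, kg m^2)
K = np.array([[ 8e5, -2e5,   0.0,   0.0],
              [-2e5,  9e5, -3e5,    0.0],
              [ 0.0, -3e5,  1.1e6, -4e5],
              [ 0.0,  0.0, -4e5,   1.6e6]])           # stiffness matrix (N/m)
C = 0.02 * K                                          # proportional viscous damping

omega = 2 * np.pi * 3.0                               # 3 Hz harmonic excitation
F = np.array([0.0, 0.0, 2e4, 0.0])                    # force amplitude vector (N)

X = np.linalg.solve(K - omega**2 * M + 1j * omega * C, F)   # complex displacement amplitudes
acc = -omega**2 * X                                          # steady-state accelerations
print(np.abs(acc))                                           # amplitude at each DOF
```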

  15. Lactulose:Mannitol Diagnostic Test by HPLC and LC-MSMS Platforms: Considerations for Field Studies of Intestinal Barrier Function and Environmental Enteropathy

    Science.gov (United States)

    Lee, Gwenyth O.; Kosek, Peter; Lima, Aldo A.M.; Singh, Ravinder; Yori, Pablo P.; Olortegui, Maribel P.; Lamsam, Jesse L.; Oliveira, Domingos B.; Guerrant, Richard L.; Kosek, Margaret

    2014-01-01

    Objectives: The lactulose:mannitol (L:M) diagnostic test is frequently used in field studies of environmental enteropathy (EE); however, heterogeneity in test administration and disaccharide measurement has limited the comparison of results between studies and populations. We aim to assess the agreement of L:M measurements between high-performance liquid chromatography with pulsed amperometric detection (HPLC-PAD) and liquid chromatography-tandem mass spectrometry (LC-MSMS) platforms. Methods: The L:M test was administered in a cohort of Peruvian infants considered at risk for EE. A total of 100 samples were tested for lactulose and mannitol at 3 independent laboratories: 1 running an HPLC-PAD platform and 2 running LC-MSMS platforms. Agreement between the platforms was estimated. Results: The Spearman correlation between the 2 LC-MSMS platforms was high (ρ ≥ 0.89) for mannitol, lactulose, and the L:M ratio. The correlation between the HPLC-PAD platform and LC-MSMS platform was ρ = 0.95 for mannitol, ρ = 0.70 for lactulose, and ρ = 0.43 for the L:M ratio. In addition, the HPLC-PAD platform overestimated the lowest disaccharide concentrations to the greatest degree. Conclusions: Given the large analyte concentration range, the improved accuracy of LC-MSMS has important consequences for the assessment of lactulose and mannitol following oral administration in populations at risk for EE. We recommend that researchers wishing to implement a dual-sugar test as part of a study of EE use an LC-MSMS platform to optimize the accuracy of results and increase comparability between studies. PMID:24941958
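
    For reference, the agreement statistic quoted above (Spearman's rho) can be computed on paired platform measurements as shown below; the data are synthetic and only illustrate the calculation, not the study's results.

```python
# Sketch of the agreement statistic reported (Spearman's rho) applied to paired
# measurements of the same samples on two platforms; the data here are synthetic.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
platform_a = rng.lognormal(mean=2.0, sigma=0.8, size=100)     # e.g. mannitol on platform A
platform_b = platform_a * rng.normal(1.0, 0.1, size=100)      # platform B with small noise

rho, p = spearmanr(platform_a, platform_b)
print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")
```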

  16. Pentaho Business Analytics: a Business Intelligence Open Source Alternative

    Directory of Open Access Journals (Sweden)

    Diana TÂRNĂVEANU

    2012-10-01

    Most organizations strive to obtain fast, interactive and insightful analytics in order to underpin the most effective and profitable decisions. They need to incorporate huge amounts of data in order to run analyses based on queries and reports with collaborative capabilities. The large variety of Business Intelligence solutions on the market makes it very difficult for organizations to select one and to evaluate the impact of the selected solution on the organization. Hence the need for a strategy to help organizations choose the best solution for investment. In the past, the Business Intelligence (BI) market was dominated by closed source and commercial tools, but in recent years open source solutions have developed everywhere. An Open Source Business Intelligence solution can be an option due to time-sensitive, sprawling requirements and tightening budgets. This paper presents a practical solution implemented in a suite of Open Source Business Intelligence products called Pentaho Business Analytics, which provides data integration, OLAP services, reporting, dashboarding, data mining and ETL capabilities. The study conducted in this paper suggests that the open source phenomenon could become a valid alternative to commercial platforms within the BI context.

  17. Evaluation of Optical Detection Platforms for Multiplexed Detection of Proteins and the Need for Point-of-Care Biosensors for Clinical Use

    Directory of Open Access Journals (Sweden)

    Samantha Spindel

    2014-11-01

    This review investigates optical sensor platforms for protein multiplexing, the ability to analyze multiple analytes simultaneously. Multiplexing is becoming increasingly important for clinical needs because disease and therapeutic response often involve the interplay between a variety of complex biological networks encompassing multiple, rather than single, proteins. Multiplexing is generally achieved through one of two routes, either through spatial separation on a surface (different wells or spots) or with the use of unique identifiers/labels (such as spectral separation—different colored dyes, or unique beads—size or color). The strengths and weaknesses of conventional platforms such as immunoassays and new platforms involving protein arrays and lab-on-a-chip technology, including commercially-available devices, are discussed. Three major public health concerns are identified whereby detecting medically-relevant markers using Point-of-Care (POC) multiplex assays could potentially allow for a more efficient diagnosis and treatment of diseases.

  18. Smart City Platform Development for an Automated Waste Collection System

    Directory of Open Access Journals (Sweden)

    Cicerone Laurentiu Popa

    2017-11-01

    Nowadays, governments and companies are looking for solutions to increase the collection level of various waste types by using new technologies and devices such as smart sensors, Internet of Things (IoT), cloud platforms etc. In order to fulfil this need, this paper presents solutions provided by a research project involving the design, development and implementation of fully automated waste collection systems with an increased usage degree, productivity and storage capacity. The paper will focus on the main results of this research project in turning the automated waste collection system into a smart system so that it can be easily integrated in any smart city infrastructure. For this purpose, the Internet of Things platform for the automated waste collection system provided by the project will allow real time monitoring and communication with central systems. Details about each module are sent to the central systems: various modules' statuses (working, blocked, needs repairs or maintenance, etc.); equipment status; storage systems status (allowing full reports for all waste types); the amount of waste for each module, allowing optimal discharging; route optimization for waste discharging etc. To do that, we describe here an IoT cloud solution integrating device connection, data processing, analytics and management.
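
    As an illustration of the kind of status message such a module might push to the central IoT platform, the sketch below assembles a JSON payload in Python. All field names and values are hypothetical and are not taken from the project.

```python
# Illustrative only: the shape of a status message a waste-collection module might publish
# to the IoT platform. All field names and values are hypothetical, not from the project.
import json
from datetime import datetime, timezone

status = {
    "module_id": "module-017",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "state": "working",                     # e.g. working / blocked / needs_maintenance
    "storage": {"plastic_pct": 72, "glass_pct": 35, "paper_pct": 88},
    "equipment_ok": True,
}

payload = json.dumps(status)
print(payload)   # this string would be sent to the central system (e.g. over MQTT or HTTP)
```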

  19. AIE-doped poly(ionic liquid) photonic spheres: a single sphere-based customizable sensing platform for the discrimination of multi-analytes.

    Science.gov (United States)

    Zhang, Wanlin; Gao, Ning; Cui, Jiecheng; Wang, Chen; Wang, Shiqiang; Zhang, Guanxin; Dong, Xiaobiao; Zhang, Deqing; Li, Guangtao

    2017-09-01

    By simultaneously exploiting the unique properties of ionic liquids and aggregation-induced emission (AIE) luminogens, as well as photonic structures, a novel customizable sensing system for multi-analytes was developed based on a single AIE-doped poly(ionic liquid) photonic sphere. It was found that due to the extraordinary multiple intermolecular interactions involved in the ionic liquid units, one single sphere could differentially interact with broader classes of analytes, thus generating response patterns with remarkable diversity. Moreover, the optical properties of both the AIE luminogen and photonic structure integrated in the poly(ionic liquid) sphere provide multidimensional signal channels for transducing the involved recognition process in a complementary manner and the acquisition of abundant and sufficient sensing information could be easily achieved on only one sphere sensor element. More importantly, the sensing performance of our poly(ionic liquid) photonic sphere is designable and customizable through a simple ion-exchange reaction and target-oriented multi-analyte sensing can be conveniently realized using a selective receptor species, such as counterions, showing great flexibility and extendibility. The power of our single sphere-based customizable sensing system was exemplified by the successful on-demand detection and discrimination of four multi-analyte challenge systems: all 20 natural amino acids, nine important phosphate derivatives, ten metal ions and three pairs of enantiomers. To further demonstrate the potential of our spheres for real-life application, 20 amino acids in human urine and their 26 unprecedented complex mixtures were also discriminated between by the single sphere-based array.
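
    The abstract does not state which chemometric method was used to discriminate the response patterns; as a generic illustration of how multichannel sensor responses can be classified, the sketch below applies linear discriminant analysis to synthetic data.

```python
# Generic pattern-recognition sketch (synthetic data): discriminating analytes from
# multichannel sensor responses with linear discriminant analysis. This only illustrates
# the idea; it is not the chemometric procedure used in the study.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
n_channels = 3                                      # e.g. AIE emission, reflection shift, intensity
centers = rng.normal(0, 1, size=(4, n_channels))    # 4 hypothetical analytes

X = np.vstack([c + 0.15 * rng.normal(size=(20, n_channels)) for c in centers])
y = np.repeat(np.arange(4), 20)                     # class label for each response pattern

lda = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", lda.score(X, y))
```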

  20. Changes in intramuscular cytokine levels during masseter inflammation in male and female rats

    OpenAIRE

    Niu, Katelyn Y.; Ro, Jin Y.

    2010-01-01

    The present study was conducted to examine cytokine profiles in the masseter muscle before and after complete Freund’s adjuvant (CFA)-induced inflammation and possible sex differences in the cytokine levels. Age matched male and female Sprague Dawley rats were injected with CFA in the mid-region of the masseter muscle. Muscle tissue surrounding the injection site was extracted 6 hrs, 1, 3 and 7 days after the injection to measure TNF-α, IL-1β, IL-6 and IL-4 levels with Luminex multi-analyte p...

  1. Product Platform Screening at LEGO

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik; Steen Jensen, Thomas; Nielsen, Ole Fiil

    2012-01-01

    Product platforms offer great benefits to companies developing new products in highly competitive markets. Literature describes how a single platform can be designed from a technical point of view, but rarely mentions how the process begins. How do companies identify possible platform candidates...... after a few changes had been applied to the initial process layout. This case study shows how companies must focus on a limited selection of simultaneous projects in order to keep focus. Primary stakeholders must be involved from the very beginning, and short presentations of the platform concepts...

  2. Reusable platform concepts

    International Nuclear Information System (INIS)

    Gudmestad, O.T.; Sparby, B.K.; Stead, B.L.

    1993-01-01

    There is an increasing need to reduce the costs of offshore production facilities in order to make the development of offshore fields profitable. For small fields with a short production time, there is in particular a need to investigate ways to reduce costs. The idea of platform reuse is particularly attractive for such fields. This paper will review reusable platform concepts and discuss their range of application. Particular emphasis will be placed on technical limitations. Traditional concepts such as jackups and floating production facilities will be discussed, but major attention will be given to newly developed ideas for the reuse of steel jackets and concrete structures. It will be shown how the operator of several fields can obtain considerable savings by applying such reusable platform concepts.

  3. Disentangling Competition Among Platform Driven Strategic Groups

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric

    2015-01-01

    In platform-driven markets, competitive advantage is derived from superior platform design and configurations. For this reason, platform owners strive to create unique and inimitable platform configurals to maintain and extend their competitiveness within network economies. To disentangle firm...... competition within platform-driven markets, we opted for the UK mobile payment market as our empirical setting. By embracing the theoretical lens of strategic groups and digital platforms, this study supplements prior research by deriving a taxonomy of platform-driven strategic groups that is grounded...

  4. Platform development: implications for portfolio management

    DEFF Research Database (Denmark)

    Hsuan, Juliana; Hansen, Poul H. Kyvsgård

    2007-01-01

    " The challenge of implementing industrial platforms in practice can be described as a configuration problem caused by a considerable number of variables, which often have contradictory influences on the total performance of the firm. Consequently, the specific platform decisions become extremely...... complex, possibly increasing the strategic risks for the firm. This paper reports preliminary findings on platform management process at LEGO, a Danish toy company. Specifically, we report the process of applying games combined with simulations and workshops in the platform development. We also propose...... a framework, based on the portfolio management thinking to evaluate the degree of modularity embedded in a given platform and to which extent it is aligned with other platforms."...

  5. DOE's SciDAC Visualization and Analytics Center for Enabling Technologies -- Strategy for Petascale Visual Data Analysis Success

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, E Wes; Johnson, Chris; Aragon, Cecilia; Rubel, Oliver; Weber, Gunther; Pascucci, Valerio; Childs, Hank; Bremer, Peer-Timo; Whitlock, Brad; Ahern, Sean; Meredith, Jeremey; Ostrouchov, George; Joy, Ken; Hamann, Bernd; Garth, Christoph; Cole, Martin; Hansen, Charles; Parker, Steven; Sanderson, Allen; Silva, Claudio; Tricoche, Xavier

    2007-10-01

    The focus of this article is on how one group of researchers, the DOE SciDAC Visualization and Analytics Center for Enabling Technologies (VACET), is tackling the daunting task of enabling knowledge discovery through visualization and analytics on some of the world's largest and most complex datasets and on some of the world's largest computational platforms. As a Center for Enabling Technology, VACET's mission is the creation of usable, production-quality visualization and knowledge discovery software infrastructure that runs on large, parallel computer systems at DOE's Open Computing facilities and that provides solutions to challenging visual data exploration and knowledge discovery needs of modern science, particularly the DOE science community.

  6. Stratifying the Development of Product Platforms

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2013-01-01

    companies develop platforms for different aims, purposes, and product scopes. Following on from this, the requirements for platform development resources, the ways of organizing platform development, and the implications for management styles have not been explored and are presumably varying. To start...... influencing the project length, requirements for platform development resources, principles for organizing, and implications for management styles....

  7. Text Stream Trend Analysis using Multiscale Visual Analytics with Applications to Social Media Systems

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A [ORNL; Beaver, Justin M [ORNL; BogenII, Paul L. [Google Inc.; Drouhard, Margaret MEG G [ORNL; Pyle, Joshua M [ORNL

    2015-01-01

    In this paper, we introduce a new visual analytics system, called Matisse, that allows exploration of global trends in textual information streams with specific application to social media platforms. Despite the potential for real-time situational awareness using these services, interactive analysis of such semi-structured textual information is a challenge due to the high-throughput and high-velocity properties. Matisse addresses these challenges through the following contributions: (1) robust stream data management, (2) automated sentiment/emotion analytics, (3) inferential temporal, geospatial, and term-frequency visualizations, and (4) a flexible drill-down interaction scheme that progresses from macroscale to microscale views. In addition to describing these contributions, our work-in-progress paper concludes with a practical case study focused on the analysis of Twitter 1% sample stream information captured during the week of the Boston Marathon bombings.

  8. The COMET Sleep Research Platform.

    Science.gov (United States)

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments-positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  9. DCMS: A data analytics and management system for molecular simulation.

    Science.gov (United States)

    Kumar, Anand; Grupcev, Vladimir; Berrada, Meryem; Fogarty, Joseph C; Tu, Yi-Cheng; Zhu, Xingquan; Pandit, Sagar A; Xia, Yuni

    Molecular Simulation (MS) is a powerful tool for studying physical/chemical features of large systems and has seen applications in many scientific and engineering domains. During the simulation process, the experiments generate a very large number of atoms and aim to observe their spatial and temporal relationships for scientific analysis. The sheer data volumes and their intensive interactions impose significant challenges for data access, management, and analysis. To date, existing MS software systems fall short on storage and handling of MS data, mainly because of the lack of a platform to support applications that involve intensive data access and analytical processing. In this paper, we present the database-centric molecular simulation (DCMS) system our team developed in the past few years. The main idea behind DCMS is to store MS data in a relational database management system (DBMS) to take advantage of the declarative query interface (i.e., SQL), data access methods, query processing, and optimization mechanisms of modern DBMSs. A unique challenge is to handle the analytical queries that are often compute-intensive. For that, we developed novel indexing and query processing strategies (including algorithms running on modern co-processors) as integrated components of the DBMS. As a result, researchers can upload and analyze their data using efficient functions implemented inside the DBMS. Index structures are generated to store analysis results that may be interesting to other users, so that the results are readily available without duplicating the analysis. We have developed a prototype of DCMS based on the PostgreSQL system, and experiments using real MS data and workloads show that DCMS significantly outperforms existing MS software systems. We also used it as a platform to test other data management issues such as security and compression.
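
    A toy sketch of the database-centric idea follows: atom coordinates are stored in a relational table and an analysis is expressed as a SQL query. SQLite is used here only to keep the example self-contained (the DCMS prototype itself is built on PostgreSQL), and the schema is a hypothetical simplification.

```python
# Toy sketch of the database-centric idea: store simulation frames in a relational table
# and express an analysis as SQL. SQLite is used for self-containment; the schema is
# a hypothetical simplification, not the DCMS schema.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE atoms (frame INT, atom_id INT, x REAL, y REAL, z REAL)")
con.executemany(
    "INSERT INTO atoms VALUES (?, ?, ?, ?, ?)",
    [(0, 1, 0.0, 0.0, 0.0), (0, 2, 1.2, 0.1, -0.3),
     (1, 1, 0.1, 0.0, 0.0), (1, 2, 1.1, 0.2, -0.2)],
)

# Example analytical query: atoms inside a spatial box in a given frame
rows = con.execute(
    "SELECT atom_id, x, y, z FROM atoms "
    "WHERE frame = 0 AND x BETWEEN 0 AND 1 AND y BETWEEN -1 AND 1"
).fetchall()
print(rows)
```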

  10. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  11. Identification of platform levels

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik

    2005-01-01

    reduction, ability to launch a wider product portfolio without increasing resources and reduction of complexity within the whole company. To support the multiple product development process, platform based product development has in many companies such as Philips, VW, Ford etc. proven to be a very effective...... product development in one step and therefore the objective of this paper is to identify levels of platform based product development. The structure of this paper is as follows. First the applied terminology for platforms will be briefly explained and then characteristics between single and multi product...... development will be examined. Based on the identification of the above characteristics five platform levels are described. The research presented in this paper is a result of MSc, Ph.D projects at the Technical University of Denmark and consultancy projects within the organisation of Institute of Product...

  12. Fabrication and Optimization of Bilayered Nanoporous Anodic Alumina Structures as Multi-Point Interferometric Sensing Platform

    Science.gov (United States)

    Nemati, Mahdieh; Santos, Abel

    2018-01-01

    Herein, we present an innovative strategy for optimizing hierarchical structures of nanoporous anodic alumina (NAA) to advance their optical sensing performance toward multi-analyte biosensing. This approach is based on the fabrication of multilayered NAA and the formation of a differential effective medium within their structure by controlling three fabrication parameters (i.e., anodization steps, anodization time, and pore widening time). The rationale of the proposed concept is that interferometric bilayered NAA (BL-NAA), which features two layers of different pore diameters, can provide distinct reflectometric interference spectroscopy (RIfS) signatures for each layer within the NAA structure and can therefore potentially be used for multi-point biosensing. This paper presents the structural fabrication of layered NAA structures, and the optimization and evaluation of their RIfS optical sensing performance through changes in the effective optical thickness (EOT) using quercetin as a model molecule. The bilayered or funnel-like NAA structures were designed with the aim of characterizing the sensitivity of both layers to quercetin molecules using RIfS and exploring the potential of these photonic structures, featuring different pore diameters, for simultaneous size-exclusion and multi-analyte optical biosensing. The sensing performance of the prepared NAA platforms was examined by real-time screening of binding reactions between human serum albumin (HSA)-modified NAA (i.e., sensing element) and quercetin (i.e., analyte). BL-NAAs display a complex optical interference spectrum, which can be resolved by fast Fourier transform (FFT) to monitor the EOT changes, where three distinctive peaks were revealed corresponding to the top, bottom, and total layer within the BL-NAA structures. The spectral shifts of these three characteristic peaks were used as sensing signals to monitor the binding events in each NAA pore in real-time upon exposure to different concentrations of
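
    The FFT step described above can be illustrated with a short numerical sketch: a synthetic interference spectrum is resampled against inverse wavelength and Fourier-transformed to locate the effective optical thickness (EOT) peak. The layer value and spectral range are illustrative and do not correspond to the BL-NAA structures of the study.

```python
# Sketch of the RIfS analysis step described above: Fourier-transforming an interference
# spectrum sampled against inverse wavelength to locate the effective optical thickness
# (EOT) peak. The spectrum is synthetic and the EOT value is illustrative only.
import numpy as np

eot_true = 12000.0                                   # effective optical thickness 2*n*L (nm)
wl = np.linspace(400, 900, 4096)                     # wavelength axis (nm)
k = 1.0 / wl                                         # inverse wavelength (1/nm), decreasing
spectrum = 1.0 + 0.5 * np.cos(2 * np.pi * eot_true * k)

# Resample onto an evenly spaced inverse-wavelength grid before the FFT
k_even = np.linspace(k.min(), k.max(), k.size)
s_even = np.interp(k_even, k[::-1], spectrum[::-1])

n_pad = 8 * k_even.size                              # zero-pad for a finer EOT axis
amp = np.abs(np.fft.rfft(s_even - s_even.mean(), n=n_pad))
eot_axis = np.fft.rfftfreq(n_pad, d=k_even[1] - k_even[0])   # axis in nm
print(f"EOT peak near {eot_axis[amp.argmax()]:.0f} nm (true value {eot_true:.0f} nm)")
```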

  13. A robust and versatile mass spectrometry platform for comprehensive assessment of the thiol redox metabolome

    Directory of Open Access Journals (Sweden)

    T.R. Sutton

    2018-06-01

    Several diseases are associated with perturbations in redox signaling and aberrant hydrogen sulfide metabolism, and numerous analytical methods exist for the measurement of the sulfur-containing species affected. However, uncertainty remains about their concentrations and speciation in cells/biofluids, perhaps in part due to differences in sample processing and detection principles. Using ultrahigh-performance liquid chromatography in combination with electrospray-ionization tandem mass spectrometry we here outline a specific and sensitive platform for the simultaneous measurement of 12 analytes, including total and free thiols, their disulfides and sulfide in complex biological matrices such as blood, saliva and urine. Total assay run time is < 10 min, enabling high-throughput analysis. Enhanced sensitivity and avoidance of artifactual thiol oxidation is achieved by taking advantage of the rapid reaction of sulfhydryl groups with N-ethylmaleimide. We optimized the analytical procedure for detection and separation conditions, linearity and precision including three stable isotope labelled standards. Its versatility for future more comprehensive coverage of the thiol redox metabolome was demonstrated by implementing additional analytes such as methanethiol, N-acetylcysteine, and coenzyme A. Apparent plasma sulfide concentrations were found to vary substantially with sample pretreatment and the nature of the alkylating agent. In addition to protein binding in the form of mixed disulfides (S-thiolation), a significant fraction of aminothiols and sulfide appears to also be non-covalently associated with proteins. Methodological accuracy was tested by comparing the plasma redox status of 10 healthy human volunteers to a well-established protocol optimized for reduced/oxidized glutathione. In a proof-of-principle study a deeper analysis of the thiol redox metabolome including free reduced/oxidized as well as bound thiols and sulfide was performed

  14. Turbine engine airfoil and platform assembly

    Science.gov (United States)

    Campbell, Christian X [Oviedo, FL; James, Allister W [Chuluota, FL; Morrison, Jay A [Oviedo, FL

    2012-07-31

    A turbine airfoil (22A) is formed by a first process using a first material. A platform (30A) is formed by a second process using a second material that may be different from the first material. The platform (30A) is assembled around a shank (23A) of the airfoil. One or more pins (36A) extend from the platform into holes (28) in the shank (23A). The platform may be formed in two portions (32A, 34A) and placed around the shank, enclosing it. The two platform portions may be bonded to each other. Alternately, the platform (30B) may be cast around the shank (23B) using a metal alloy with better castability than that of the blade and shank, which may be specialized for thermal tolerance. The pins (36A-36D) or holes for them do not extend to an outer surface (31) of the platform, avoiding stress concentrations.

  15. Paper Capillary Enables Effective Sampling for Microfluidic Paper Analytical Devices.

    Science.gov (United States)

    Shangguan, Jin-Wen; Liu, Yu; Wang, Sha; Hou, Yun-Xuan; Xu, Bi-Yi; Xu, Jing-Juan; Chen, Hong-Yuan

    2018-06-06

    Paper capillaries are introduced to enable effective sampling on microfluidic paper analytical devices. By coupling the macroscale capillary force of the paper capillary with the microscale capillary forces of native paper, fluid transport can be flexibly tailored with proper design. Subsequently, a hybrid-fluid-mode paper capillary device was proposed, which enables fast and reliable sampling in an arrayed form, with less surface adsorption and bias for different components. The resulting device thus supports high-throughput, quantitative, and repeatable assays entirely by hand operation. With all these merits, multiplex analysis of ions, proteins, and microbes has been realized on this platform, paving the way to higher-level analysis on μPADs.

  16. The Innovative Capabilities Of Digital Payment Platforms

    DEFF Research Database (Denmark)

    Kazan, Erol

    2015-01-01

    This study presents a model for studying the innovative capabilities of digital payment platforms in regards to open innovation integration and commercialization. We perceive digital platforms as layered modular IT artifacts, where platform governance and the configuration of platform layers impact...... the support for open innovation. The proposed model has been employed in a comparative case study between two digital payment platforms: Apple Pay and Google Wallet. The findings suggest that digital payment platforms make use of boundary resources to be highly integrative or integratable, which supports...... the intended conjoint commercialization efforts. Furthermore, the architectural design of digital platforms impacts the access to commercialization, resulting to an exclusion or inclusion strategy in accessing value opportunities. Our findings contribute to the open innovation and digital platform literature...

  17. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    Science.gov (United States)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high throughput wide format video also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops and mobile computers. In addition to rapid roam, zoom and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results and apply geospatial visual analytic tools on the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms are available to assist the analyst and increase human effectiveness.
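
    As a generic illustration of the tiled-pyramid caching idea (not KOLAM's actual data structure), the sketch below keys tiles by (frame, level, row, col) and evicts the least recently used tile when the cache is full.

```python
# Toy sketch of a spatiotemporal tiled-pyramid cache keyed by (frame, level, row, col);
# this is only an illustration of the general idea, not KOLAM's implementation.
from collections import OrderedDict

class TileCache:
    def __init__(self, capacity=1024):
        self.capacity = capacity
        self._tiles = OrderedDict()

    def get(self, frame, level, row, col, loader):
        key = (frame, level, row, col)
        if key in self._tiles:
            self._tiles.move_to_end(key)           # mark as most recently used
            return self._tiles[key]
        tile = loader(*key)                         # fetch/decode the tile on a miss
        self._tiles[key] = tile
        if len(self._tiles) > self.capacity:
            self._tiles.popitem(last=False)         # evict the least recently used tile
        return tile

cache = TileCache(capacity=4)
fake_loader = lambda frame, level, row, col: f"tile {frame}/{level}/{row}/{col}"
print(cache.get(0, 3, 10, 12, fake_loader))
```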

  18. PR-PR: cross-platform laboratory automation system.

    Science.gov (United States)

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  19. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high-frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed decision-making method. The decision-making process will be described by a formal grammar. As there are many CEP solutions, we take the following characteristics under consideration: processing in real time, the possibility of processing high-volume data from multiple sources, platform independence, the ability to integrate with a user solution, and an open license. First we discuss existing CEP tools and their specific uses in practice. Then we describe the design of a method for formalizing the business rules used for decision making. Afterwards, we focus on the two platforms which seem to be the best fit for integrating our solution and list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated decision-support method.

  20. Analytical Performance of Four Polymerase Chain Reaction (PCR and Real Time PCR (qPCR Assays for the Detection of Six Leishmania Species DNA in Colombia

    Directory of Open Access Journals (Sweden)

    Cielo M. León

    2017-10-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real time PCR (qPCR) including a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but did not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10^1 and 1 × 10^-1 equivalent parasites/mL, respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America.

  1. Analytical Performance of Four Polymerase Chain Reaction (PCR) and Real Time PCR (qPCR) Assays for the Detection of Six Leishmania Species DNA in Colombia

    Science.gov (United States)

    León, Cielo M.; Muñoz, Marina; Hernández, Carolina; Ayala, Martha S.; Flórez, Carolina; Teherán, Aníbal; Cubides, Juan R.; Ramírez, Juan D.

    2017-01-01

    Leishmaniasis comprises a spectrum of parasitic diseases caused by protozoans of the genus Leishmania. Molecular tools have been widely employed for the detection of Leishmania due to their high sensitivity and specificity. However, the analytical performance of molecular platforms such as PCR and real time PCR (qPCR) including a wide variety of molecular markers has never been evaluated. Herein, the aim was to evaluate the analytical performance of 4 PCR-based assays (designed on four different targets) and applied on conventional and real-time PCR platforms. We evaluated the analytical performance of conventional PCR and real time PCR, determining exclusivity and inclusivity, Anticipated Reportable Range (ARR), limit of detection (LoD) and accuracy using primers directed to kDNA, HSP70, 18S and ITS-1 targets. We observed that the kDNA was the most sensitive but did not meet the criterion of exclusivity. The HSP70 presented a higher LoD in conventional PCR and qPCR in comparison with the other markers (1 × 10^1 and 1 × 10^-1 equivalent parasites/mL respectively) and had a higher coefficient of variation in qPCR. No statistically significant differences were found between the days of the test with the four molecular markers. The present study revealed that the 18S marker presented the best performance in terms of analytical sensitivity and specificity for the qPCR in the species tested (species circulating in Colombia). Therefore, we recommend exploring the analytical and diagnostic performance in future studies using a broader number of species across America. PMID:29046670

  2. Cretaceous tropical carbonate platform changes used as paleoclimatic and paleoceanic indicators: the three lower Cretaceous platform crises

    Science.gov (United States)

    Arnaud-Vanneau, A.; Vrielynck, B.

    2009-04-01

    Carbonate platform sediments are of biogenic origin. Most commonly, the bioclasts are fragments of shells and skeletons. The bioclastic composition of a limestone may reflect the nature of the biota inhabiting the area, and a carbonate platform can be regarded as a living factory that reflects the prevailing ecological factors. The rate of carbonate production is highest in the tropics, in oligotrophic environments, and in the photic zone, and it varies greatly with temperature and nutrient input. Three types of biotic carbonate platform can be distinguished. The highest carbonate production is linked to the oligotrophic carbonate platform, characterized by assemblages with hermatypic corals. This type of platform develops in shallow marine environments with nutrient-poor water in warm tropical seas. Less efficient carbonate platform production is related to mesotrophic environments in cooler and/or deeper water, associated with nutrient flux and, sometimes, detrital input. The biota includes red algae, solitary corals and branching ahermatypic corals, common bryozoans, crinoids and echinoids. The least productive carbonate platform is the eutrophic muddy platform, where the mud is due to intense bacterial activity, probably related to a strong nutrient flux. All changes in carbonate platform type can be related to climatic and oceanic changes. Three platform crises occurred during Lower Cretaceous time. They are followed by important turnovers of microfauna (large benthic foraminifers) and microflora (marine algae). They start with the demise of the previous oligotrophic platform and continue with oceanic perturbations, the expression of which was the widespread deposition of organic-rich sediments, well expressed during the Late Aptian/Albian and at the Cenomanian-Turonian boundary, and the replacement of previous oligotrophic platforms by mesotrophic to eutrophic platforms. The first crisis occurred during Valanginian and Hauterivian

  3. Helicopter flight simulation motion platform requirements

    Science.gov (United States)

    Schroeder, Jeffery Allyn

    Flight simulators attempt to reproduce in-flight pilot-vehicle behavior on the ground. This reproduction is challenging for helicopter simulators, as the pilot is often inextricably dependent on external cues for pilot-vehicle stabilization. One important simulator cue is platform motion; however, its required fidelity is unknown. To determine the required motion fidelity, several unique experiments were performed. A large displacement motion platform was used that allowed pilots to fly tasks with matched motion and visual cues. Then, the platform motion was modified to give cues varying from full motion to no motion. Several key results were found. First, lateral and vertical translational platform cues had significant effects on fidelity. Their presence improved performance and reduced pilot workload. Second, yaw and roll rotational platform cues were not as important as the translational platform cues. In particular, the yaw rotational motion platform cue did not appear at all useful in improving performance or reducing workload. Third, when the lateral translational platform cue was combined with visual yaw rotational cues, pilots believed the platform was rotating when it was not. Thus, simulator systems can be made more efficient by proper combination of platform and visual cues. Fourth, motion fidelity specifications were revised that now provide simulator users with a better prediction of motion fidelity based upon the frequency responses of their motion control laws. Fifth, vertical platform motion affected pilot estimates of steady-state altitude during altitude repositionings. This refutes the view that pilots estimate altitude and altitude rate in simulation solely from visual cues. Finally, the combined results led to a general method for configuring helicopter motion systems and for developing simulator tasks that more likely represent actual flight. The overall results can serve as a guide to future simulator designers and to today's operators.

  4. SOMFlow: Guided Exploratory Cluster Analysis with Self-Organizing Maps and Analytic Provenance.

    Science.gov (United States)

    Sacha, Dominik; Kraus, Matthias; Bernard, Jurgen; Behrisch, Michael; Schreck, Tobias; Asano, Yuki; Keim, Daniel A

    2018-01-01

    Clustering is a core building block for data analysis, aiming to extract otherwise hidden structures and relations from raw datasets, such as particular groups that can be effectively related, compared, and interpreted. A plethora of visual-interactive cluster analysis techniques has been proposed to date, however, arriving at useful clusterings often requires several rounds of user interactions to fine-tune the data preprocessing and algorithms. We present a multi-stage Visual Analytics (VA) approach for iterative cluster refinement together with an implementation (SOMFlow) that uses Self-Organizing Maps (SOM) to analyze time series data. It supports exploration by offering the analyst a visual platform to analyze intermediate results, adapt the underlying computations, iteratively partition the data, and to reflect previous analytical activities. The history of previous decisions is explicitly visualized within a flow graph, allowing to compare earlier cluster refinements and to explore relations. We further leverage quality and interestingness measures to guide the analyst in the discovery of useful patterns, relations, and data partitions. We conducted two pair analytics experiments together with a subject matter expert in speech intonation research to demonstrate that the approach is effective for interactive data analysis, supporting enhanced understanding of clustering results as well as the interactive process itself.
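
    For readers unfamiliar with the underlying technique, the sketch below gives a minimal NumPy implementation of the SOM training update that such a system builds on; the feature vectors are synthetic and this is not the SOMFlow implementation.

```python
# Minimal NumPy illustration of the Self-Organizing Map training step that SOM-based
# analysis builds on (not the SOMFlow implementation); the input vectors are synthetic.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=(200, 8))                 # 200 items, 8 features each
grid_w, grid_h = 6, 6
weights = rng.normal(size=(grid_w * grid_h, 8))  # one weight vector per map node
coords = np.array([(i, j) for i in range(grid_w) for j in range(grid_h)], dtype=float)

for t, x in enumerate(data.repeat(5, axis=0)):   # a few passes over the data
    lr = 0.5 * np.exp(-t / 500)                  # decaying learning rate
    sigma = 2.0 * np.exp(-t / 500)               # decaying neighbourhood radius
    bmu = np.argmin(((weights - x) ** 2).sum(axis=1))          # best-matching unit
    d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))[:, None]  # neighbourhood function on the grid
    weights += lr * h * (x - weights)            # pull the neighbourhood toward the sample

print("first map node weights:", np.round(weights[0], 2))
```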

  5. Distributed Fracturing Affecting the Isolated Carbonate Platforms, the Latemar Platform (Dolomites, North Italy).

    NARCIS (Netherlands)

    Boro, H.; Bertotti, G.V.; Hardebol, N.J.

    2012-01-01

    Isolated carbonate platforms are highly heterogeneous bodies and are typically composed of laterally juxtaposed first order domains with different sedimentological composition and organization, i.e. a well-stratified platform interior, a massive margin and a slope with steeply dipping and poorly

  6. Analysis of human plasma metabolites across different liquid chromatography/mass spectrometry platforms: Cross-platform transferable chemical signatures.

    Science.gov (United States)

    Telu, Kelly H; Yan, Xinjian; Wallace, William E; Stein, Stephen E; Simón-Manso, Yamil

    2016-03-15

    The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different liquid chromatography/mass spectrometry (LC/MS) platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC/MS platforms using reversed-phase chromatography and different chromatographic scales (conventional HPLC, UHPLC and nanoLC). The data processing and the metabolite differential analysis were carried out using publicly available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). Repeatability and intermediate precision, expressed as relative standard deviation (RSD), showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (HPLC, UHPLC or nanoLC). A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.
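
    The within-platform repeatability and the cross-platform overlap of molecular features described above can be illustrated with a short sketch; the replicate peak areas and feature identifiers below are hypothetical, and the RSD and overlap calculations are only one plausible way of expressing the reported comparisons.

```python
import numpy as np

def rsd(values):
    """Relative standard deviation (%) of replicate peak areas."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Within-platform repeatability of one metabolite's peak area (made-up replicates).
replicates_hplc = [1.02e6, 0.98e6, 1.05e6, 1.01e6]
print(f"within-platform RSD: {rsd(replicates_hplc):.1f}%")

# Cross-platform comparison: overlap of detected molecular features
# (hypothetical identifiers, e.g. m/z @ retention-time pairs).
features_uhplc = {"180.063@1.2", "132.077@0.9", "204.123@3.4"}
features_nanolc = {"180.063@1.2", "204.123@3.4", "89.024@0.5"}
common = features_uhplc & features_nanolc
union = features_uhplc | features_nanolc
print(f"common features: {len(common)} ({100 * len(common) / len(union):.0f}% of union)")
```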

  7. Flexible Description and Adaptive Processing of Earth Observation Data through the BigEarth Platform

    Science.gov (United States)

    Gorgan, Dorian; Bacu, Victor; Stefanut, Teodor; Nandra, Cosmin; Mihon, Danut

    2016-04-01

    Earth Observation data repositories, which grow periodically by several terabytes, are becoming a critical issue for organizations. Managing the storage capacity of such big datasets, access policies, data protection, searching, and complex processing entails high costs and calls for efficient solutions that balance the cost and value of data. Data create value only when they are used, and data protection has to be oriented toward allowing innovation, which often depends on creative people achieving unexpectedly valuable results in a flexible and adaptive manner. Users need to describe and experiment with complex algorithms through analytics in order to extract value from the data. Analytics uses descriptive and predictive models to gain valuable knowledge and information from data analysis. Possible solutions for advanced processing of big Earth Observation data are offered by HPC platforms such as the cloud. As platforms become more complex and heterogeneous, developing applications becomes even harder, and the efficient mapping of these applications to a suitable and optimal platform, working on huge distributed data repositories, is challenging and complex as well, even with specialized software services. From the user's point of view, an optimal environment gives acceptable execution times, offers a high level of usability by hiding the complexity of the computing infrastructure, and supports open accessibility and control over application entities and functionality. The BigEarth platform [1] supports the entire flow of flexible description of processing by basic operators and adaptive execution over cloud infrastructure [2]. The basic modules of the pipeline, such as the KEOPS [3] set of basic operators, the WorDeL language [4], the Planner for sequential and parallel processing, and the Executor through virtual machines, are detailed as the main components of the BigEarth platform [5]. The presentation exemplifies the development

  8. Social media and the social sciences: How researchers employ Big Data analytics

    Directory of Open Access Journals (Sweden)

    Mylynn Felt

    2016-04-01

    Full Text Available Social media posts are full of potential for data mining and analysis. Recognizing this potential, platform providers increasingly restrict free access to such data. This shift provides new challenges for social scientists and other non-profit researchers who seek to analyze public posts with a purpose of better understanding human interaction and improving the human condition. This paper seeks to outline some of the recent changes in social media data analysis, with a focus on Twitter, specifically. Using Twitter data from a 24-hour period following The Sisters in Spirit Candlelight Vigil, sponsored by the Native Women’s Association of Canada, this article compares three free-use Twitter application programming interfaces for capturing tweets and enabling analysis. Although recent Twitter data restrictions limit free access to tweets, there are many dynamic options for social scientists to choose from in the capture and analysis of Twitter and other social media platform data. This paper calls for critical social media data analytics combined with traditional, qualitative methods to address the developing ‘data gold rush.’

  9. Morphological indicators of growth stages in carbonates platform evolution: comparison between present-day and Miocene platforms of Northern Borneo, Malaysia.

    Science.gov (United States)

    Pierson, B.; Menier, D.; Ting, K. K.; Chalabi, A.

    2012-04-01

    Satellite images of present-day reefs and carbonate platforms of the Celebes Sea, east of Sabah, Malaysia, exhibit large-scale features indicative of the recent evolution of the platforms. These include: (1) multiple, sub-parallel reef rims at the windward margin, suggestive of back-stepping of the platform margin; (2) contraction of the platform, possibly as a result of recent sea level fluctuations; (3) colonization of the internal lagoons by polygonal reef structures and (4) fragmentation of the platforms and creation of deep channels separating platforms that used to be part of a single entity. These features are analogous to what has been observed on seismic attribute maps of Miocene carbonate platforms of Sarawak. An analysis of several growth stages of a large Miocene platform, referred to as the Megaplatform, shows that the platform evolved as a function of syn-depositional tectonic movements and sea level fluctuations that resulted in back-stepping of the margin, illustrated by multiple reef rims, contraction of the platform, the development of polygonal structures currently interpreted as karstic in origin, and fragmentation of the Megaplatform into three sub-entities separated by deep channels, which preceded the final demise of the whole platform. Comparing similar features on present-day platforms and Miocene platforms leads to a better understanding of the growth history of Miocene platforms and to a refined predictability of reservoir and non-reservoir facies distribution.

  10. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research.

    Science.gov (United States)

    Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka

    2016-05-05

    A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.

  11. GenoSets: visual analytic methods for comparative genomics.

    Directory of Open Access Journals (Sweden)

    Aurora A Cain

    Full Text Available Many important questions in biology are, fundamentally, comparative, and this extends to our analysis of a growing number of sequenced genomes. Existing genomic analysis tools are often organized around literal views of genomes as linear strings. Even when information is highly condensed, these views grow cumbersome as larger numbers of genomes are added. Data aggregation and summarization methods from the field of visual analytics can provide abstracted comparative views, suitable for sifting large multi-genome datasets to identify critical similarities and differences. We introduce a software system for visual analysis of comparative genomics data. The system automates the process of data integration, and provides the analysis platform to identify and explore features of interest within these large datasets. GenoSets borrows techniques from business intelligence and visual analytics to provide a rich interface of interactive visualizations supported by a multi-dimensional data warehouse. In GenoSets, visual analytic approaches are used to enable querying based on orthology, functional assignment, and taxonomic or user-defined groupings of genomes. GenoSets links this information together with coordinated, interactive visualizations for both detailed and high-level categorical analysis of summarized data. GenoSets has been designed to simplify the exploration of multiple genome datasets and to facilitate reasoning about genomic comparisons. Case examples are included showing the use of this system in the analysis of 12 Brucella genomes. GenoSets software and the case study dataset are freely available at http://genosets.uncc.edu. We demonstrate that the integration of genomic data using a coordinated multiple view approach can simplify the exploration of large comparative genomic data sets, and facilitate reasoning about comparisons and features of interest.

  12. Towards a Framework of Digital Platform Disruption

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    2014-01-01

    Digital platforms are disruptive information technology (IT) artifacts that erode conventional business logic associated with traditional market structures. This paper presents a framework for examining the disruptive potential of digital platforms, whereby we postulate that the strategic interplay of governance regimes and platform layers is deterministic of whether disruptive derivatives are permitted to flourish. This framework has been employed in a comparative case study between centralized (i.e., PayPal) and decentralized (i.e., Coinkite) digital payment platforms to illustrate its applicability. Digital platforms purposely decouple platform layers to foster open innovation and accelerate market disruption. This paper therefore represents a first concrete step aimed at unravelling the disruptive potential of digital platforms.

  13. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Science.gov (United States)

    Najat, Dereen

    2017-01-01

    Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes at Sulaimani
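
    As a rough illustration of the proportional Z test mentioned above, the sketch below compares hypothetical sample-rejection counts between two labs; the counts are invented and the pooled-variance formulation is a standard textbook choice, not necessarily the exact procedure used in the study.

```python
from math import sqrt
from scipy.stats import norm

def two_proportion_ztest(x1, n1, x2, n2):
    """Two-sided Z test for equality of two proportions (pooled variance),
    as one might use to compare sample-rejection rates between two labs."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * norm.sf(abs(z))
    return z, p_value

# Hypothetical counts: 55 hemolyzed samples of 600 in lab A vs 30 of 550 in lab B.
z, p = two_proportion_ztest(55, 600, 30, 550)
print(f"z = {z:.2f}, p = {p:.4f}")
```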

  14. Prevalence of Pre-Analytical Errors in Clinical Chemistry Diagnostic Labs in Sulaimani City of Iraqi Kurdistan.

    Directory of Open Access Journals (Sweden)

    Dereen Najat

    Full Text Available Laboratory testing is roughly divided into three phases: a pre-analytical phase, an analytical phase and a post-analytical phase. Most analytical errors have been attributed to the analytical phase. However, recent studies have shown that up to 70% of analytical errors reflect the pre-analytical phase. The pre-analytical phase comprises all processes from the time a laboratory request is made by a physician until the specimen is analyzed at the lab. Generally, the pre-analytical phase includes patient preparation, specimen transportation, specimen collection and storage. In the present study, we report the first comprehensive assessment of the frequency and types of pre-analytical errors at the Sulaimani diagnostic labs in Iraqi Kurdistan. Over 2 months, 5500 venous blood samples were observed in 10 public diagnostic labs of Sulaimani City. The percentages of rejected samples and types of sample inappropriateness were evaluated. The percentage of each of the following pre-analytical errors was recorded: delay in sample transportation, clotted samples, expired reagents, hemolyzed samples, samples not on ice, incorrect sample identification, insufficient sample, tube broken in centrifuge, request procedure errors, sample mix-ups, communication conflicts, misinterpreted orders, lipemic samples, contaminated samples and missed physician's request orders. The difference between the relative frequencies of errors observed in the hospitals considered was tested using a proportional Z test. In particular, the survey aimed to discover whether analytical errors were recorded and examine the types of platforms used in the selected diagnostic labs. The analysis showed a high prevalence of improper sample handling during the pre-analytical phase. The percentage of inappropriate samples was as high as 39%. The major reasons for rejection were hemolyzed samples (9%), incorrect sample identification (8%) and clotted samples (6%). Most quality control schemes

  15. Product Platform Development in Industrial Networks

    DEFF Research Database (Denmark)

    Karlsson, Christer; Skold, Martin

    2011-01-01

    The article examines the strategic issues involved in the deployment of product platform development in an industrial network. The move entails identifying the types and characteristics of generically different product platform strategies and clarifying strategic motives and differences. The number of platforms and product brands serve as the key dimensions when distinguishing the different strategies. Each strategy has its own challenges and raises various issues to deal with.

  16. The universal modular platform

    International Nuclear Information System (INIS)

    North, R.B.

    1995-01-01

    A new and patented design for offshore wellhead platforms has been developed to meet a 'fast track' requirement for increased offshore production from field locations not yet identified. The new design uses modular construction to allow for radical changes in the water depth of the final location and assembly-line efficiency in fabrication. By utilizing high strength steels and structural support from the well conductors, the new design accommodates all planned production requirements on a support structure significantly lighter and less expensive than the conventional design it replaces. Twenty-two platforms based on the new design were ready for installation within 18 months of the project start. Installation of the new platforms began in 1992 for drilling support and 1993 for production support. The new design has become the Company standard for all future production platforms. Large savings in construction costs have been realized through its light weight, flexibility in both positioning and water depth, and its modular construction.

  17. Microfluidic paper-based analytical devices for potential use in quantitative and direct detection of disease biomarkers in clinical analysis.

    Science.gov (United States)

    Lim, Wei Yin; Goh, Boon Tong; Khor, Sook Mei

    2017-08-15

    Clinicians, working in the health-care diagnostic systems of developing countries, currently face the challenges of rising costs, increased number of patient visits, and limited resources. A significant trend is using low-cost substrates to develop microfluidic devices for diagnostic purposes. Various fabrication techniques, materials, and detection methods have been explored to develop these devices. Microfluidic paper-based analytical devices (μPADs) have gained attention for sensing multiplex analytes, confirming diagnostic test results, rapid sample analysis, and reducing the volume of samples and analytical reagents. μPADs, which can provide accurate and reliable direct measurement without sample pretreatment, can reduce patient medical burden and yield rapid test results, aiding physicians in choosing appropriate treatment. The objectives of this review are to provide an overview of the strategies used for developing paper-based sensors with enhanced analytical performances and to discuss the current challenges, limitations, advantages, disadvantages, and future prospects of paper-based microfluidic platforms in clinical diagnostics. μPADs, with validated and justified analytical performances, can potentially improve the quality of life by providing inexpensive, rapid, portable, biodegradable, and reliable diagnostics. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Random uncertainty of photometric determination of hemolysis index on the Abbott Architect c16000 platform.

    Science.gov (United States)

    Aloisio, Elena; Carnevale, Assunta; Pasqualetti, Sara; Birindelli, Sarah; Dolci, Alberto; Panteghini, Mauro

    2018-01-16

    Automatic photometric determination of the hemolysis index (HI) on serum and plasma samples is central to detecting potential interference from in vitro hemolysis on laboratory tests. When HI is above an established cut-off for interference, results may suffer from a significant bias and undermine the clinical reliability of the test. Despite its undeniable importance for patient safety, the analytical performance of HI estimation is not usually checked in laboratories. Here we evaluated for the first time the random source of measurement uncertainty of HI determination on the two Abbott Architect c16000 platforms in use in our laboratory. From January 2016 to September 2017, we collected data from daily photometric determination of HI on a fresh-frozen serum pool with a predetermined HI value of ~100 (corresponding to ~1 g/L of free hemoglobin). Monthly and cumulative CVs were calculated. During 21 months, 442 and 451 measurements were performed on the two platforms, respectively. Monthly CVs ranged from 0.7% to 2.7% on c16000-1 and from 0.8% to 2.5% on c16000-2, with a between-platform cumulative CV of 1.82% (corresponding to an expanded uncertainty of 3.64%). Mean HI values on the two platforms were only slightly biased (101.3 vs. 103.1, 1.76%), but, owing to the high precision of the measurements, this difference was statistically significant (p<0.0001). Even though no quality specifications are available to date, our study shows that the HI measurement on the Architect c16000 platform has good reproducibility, which could be considered in establishing the state of the art of the measurement. Copyright © 2018 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
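
    A hedged sketch of how cumulative CVs and an expanded uncertainty (coverage factor k = 2) could be computed from repeated HI readings follows; the readings are hypothetical, and pooling both platforms into a single cumulative CV is an assumption rather than necessarily the authors' exact calculation.

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variation (%) of repeated hemolysis-index readings."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical daily HI readings of the frozen serum pool on two platforms.
hi_platform_1 = np.array([100.8, 101.5, 99.9, 102.0, 101.2, 100.6])
hi_platform_2 = np.array([103.5, 102.8, 103.9, 102.2, 103.0, 103.4])

pooled = np.concatenate([hi_platform_1, hi_platform_2])
cumulative_cv = cv_percent(pooled)
expanded_uncertainty = 2 * cumulative_cv  # coverage factor k = 2

print(f"platform 1 CV: {cv_percent(hi_platform_1):.2f}%")
print(f"platform 2 CV: {cv_percent(hi_platform_2):.2f}%")
print(f"between-platform cumulative CV: {cumulative_cv:.2f}%  "
      f"(expanded uncertainty ~ {expanded_uncertainty:.2f}%)")
```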

  19. Study of the Analytical Conditions for the Determination of Cadmium in Coal Fly Ashes by GFAAS with evaluation of several matrix modifiers

    International Nuclear Information System (INIS)

    Rucandio, M.I.; Petit, M.D.

    1998-01-01

    A new method for the determination of cadmium in coal fly ash samples by Graphite Furnace Atomic Absorption Spectrometry (GFAAS) has been developed. Analytical conditions and different instrumental parameters have been optimized. In a first step, several types of matrix modifiers were tested and a mixture of 2% NH4H2PO4 with 0.4% Mg(NO3)2 in 0.5 N HNO3 was selected, since it provides the highest sensitivity. In a second step, several conditions were optimized using the selected modifier, such as ashing and atomization temperatures and heating rate. The influence of the use of a L'vov platform on the analytical and background signals has been studied, showing a significant decrease in the background signal, while the net absorbance remained similar to that obtained in the absence of the platform. Using the optimal conditions, the direct method with standard samples provides cadmium concentrations consistent with those obtained using the standard addition method. (Author) 18 refs
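
    The standard addition method mentioned above extrapolates the absorbance-versus-added-concentration line to its x-intercept; the sketch below illustrates that calculation with invented absorbance values, not data from the study.

```python
import numpy as np

# Absorbance measured after spiking equal sample aliquots with known amounts
# of Cd (hypothetical values; concentrations of added Cd in ng/mL).
added_cd = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
absorbance = np.array([0.110, 0.162, 0.215, 0.266, 0.320])

slope, intercept = np.polyfit(added_cd, absorbance, 1)
# Extrapolating to zero absorbance: the magnitude of the x-intercept is the
# Cd concentration in the (diluted) sample aliquot.
cd_in_sample = intercept / slope
print(f"estimated Cd in sample: {cd_in_sample:.2f} ng/mL")
```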

  20. MAXIMIZING SOCIAL VALUE IN THE HOTEL ONLINE ENVIRONMENT USING AN ANALYTIC HIERARCHY PROCESS

    Directory of Open Access Journals (Sweden)

    Carmen Păunescu

    2018-03-01

    Full Text Available The paper analyses the possibilities that hoteliers have to create and maximize the social value of their online platforms, in terms of their functionality and usage, in order to improve sales and increase hotels' performance. It also discusses the opportunities that hotel managers can take to improve the hotel online decision-making strategy to convert visitors into actual customers more effectively. Although social value creation of online platforms has been well researched in the specialized literature, recent research has not examined the ways the online social value can be maximized and put into effective commercial use. The paper reviews the dimensions and characteristics of the hotel online environment by integrating literature analysis and field research practices. It employs the analytic hierarchy process method to analyse key elements of the hotel online environment that can serve as a focal point for value creation. The literature review and field research conducted pinpoint three possibilities of creating online social value: (a) building online trust, (b) ensuring high quality of the online service, and (c) providing an effective online communication experience. The results give a deeper understanding of the areas of the hotel online environment where social value can be obtained. They demonstrate the applicability of the analytic hierarchy process method for the evaluation and selection of strategies for online social value creation. At the same time, the paper provides new valuable insights to hoteliers, which might support their decisions to improve the business by proactively incorporating strategies for online social value maximization.
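
    For context, the analytic hierarchy process derives criterion weights from a pairwise comparison matrix via its principal eigenvector and checks judgment consistency. The sketch below uses hypothetical judgments for the three value-creation options named above and standard Saaty random-index values; it is not the authors' actual matrix or weights.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from an AHP pairwise
    comparison matrix (principal eigenvector method)."""
    A = np.asarray(pairwise, dtype=float)
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)          # consistency index
    ri = {3: 0.58, 4: 0.90, 5: 1.12}.get(n, 1.0)  # Saaty random index
    return w, ci / ri

# Hypothetical pairwise judgments for three criteria:
# online trust vs service quality vs communication experience.
matrix = [[1,   3,   5],
          [1/3, 1,   2],
          [1/5, 1/2, 1]]
weights, cr = ahp_weights(matrix)
print("weights:", np.round(weights, 3), "consistency ratio:", round(cr, 3))
```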

  1. Product competitiveness analysis for e-commerce platform of special agricultural products

    Science.gov (United States)

    Wan, Fucheng; Ma, Ning; Yang, Dongwei; Xiong, Zhangyuan

    2017-09-01

    On the basis of analyzing the factors that influence the product competitiveness of an e-commerce platform for special agricultural products and the characteristics of methods for analyzing the competitiveness of special agricultural products, price, sales volume, postage-included service, store reputation, popularity, etc. were selected in this paper as the dimensions for analyzing the competitiveness of the agricultural products, and principal component factor analysis was taken as the competitiveness analysis method. Specifically, a web crawler was adopted to capture the information of various special agricultural products on the e-commerce platform chi.taobao.com. The original data captured thereby were preprocessed, and a MySQL database was adopted to establish the information library for the special agricultural products. Principal component factor analysis was then adopted to establish the analysis model for the competitiveness of the special agricultural products, and SPSS was used in the principal component factor analysis process to obtain the competitiveness evaluation factor system (support degree factor, price factor, service factor and evaluation factor) of the special agricultural products. Finally, the linear regression method was adopted to establish the competitiveness index equation of the special agricultural products for estimating their competitiveness.
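
    A minimal sketch of this kind of principal component scoring follows; the product records, feature columns, and the variance-weighted composite index are illustrative assumptions rather than the paper's exact model or data.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

# Hypothetical product records scraped from the platform:
# columns = price, monthly sales, store reputation, review score, favorites.
X = np.array([
    [19.9,  850, 4.8, 4.6, 1200],
    [25.5,  430, 4.5, 4.4,  610],
    [12.0, 1900, 4.9, 4.8, 3300],
    [31.0,  120, 4.2, 4.1,  150],
    [22.4,  640, 4.7, 4.5,  980],
])

Z = StandardScaler().fit_transform(X)        # put dimensions on a common scale
pca = PCA(n_components=2).fit(Z)
scores = pca.transform(Z)

# Weight each component by its explained variance to form a single
# competitiveness index (one common convention; not necessarily the paper's).
weights = pca.explained_variance_ratio_ / pca.explained_variance_ratio_.sum()
competitiveness = scores @ weights
print("component loadings:\n", np.round(pca.components_, 2))
print("competitiveness index:", np.round(competitiveness, 2))
```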

  2. Using learning analytics to evaluate a video-based lecture series.

    Science.gov (United States)

    Lau, K H Vincent; Farooque, Pue; Leydon, Gary; Schwartz, Michael L; Sadler, R Mark; Moeller, Jeremy J

    2018-01-01

    The video-based lecture (VBL), an important component of the flipped classroom (FC) and massive open online course (MOOC) approaches to medical education, has primarily been evaluated through direct learner feedback. Evaluation may be enhanced through learner analytics (LA) - analysis of quantitative audience usage data generated by video-sharing platforms. We applied LA to an experimental series of ten VBLs on electroencephalography (EEG) interpretation, uploaded to YouTube in the model of a publicly accessible MOOC. Trends in view count, total percentage of video viewed, and audience retention (AR; the percentage of viewers watching at a time point compared to the initial total) were examined. The pattern of average AR decline was characterized using regression analysis, revealing a uniform linear decline in viewership for each video, with no evidence of an optimal VBL length. Segments with transient increases in AR corresponded to those focused on core concepts, indicative of content requiring more detailed evaluation. We propose a model for applying LA at four levels: global, series, video, and feedback. LA may be a useful tool in evaluating a VBL series. Our proposed model combines analytics data and learner self-report for comprehensive evaluation.
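
    The uniform linear decline in audience retention can be characterized with a simple regression, as sketched below; the retention values are hypothetical, and the above-trend heuristic for flagging re-watched segments is an assumption, not the authors' method.

```python
import numpy as np

# Hypothetical audience-retention curve exported from a video platform:
# percentage of initial viewers still watching at each minute of a lecture.
minutes = np.arange(0, 16)
retention = np.array([100, 96, 91, 88, 83, 80, 76, 73, 69, 66,
                      62, 59, 55, 52, 49, 45], dtype=float)

slope, intercept = np.polyfit(minutes, retention, 1)
residuals = retention - (slope * minutes + intercept)
r_squared = 1 - residuals.var() / retention.var()

print(f"average decline: {abs(slope):.1f} percentage points per minute "
      f"(R^2 = {r_squared:.3f})")

# Minutes sitting clearly above the fitted line would flag segments that
# viewers re-watch, e.g. core concepts worth expanding on.
rewatched = minutes[residuals > residuals.std()]
print("minutes above trend:", rewatched)
```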

  3. Platform construction and extraction mechanism study of magnetic mixed hemimicelles solid-phase extraction

    Science.gov (United States)

    Xiao, Deli; Zhang, Chan; He, Jia; Zeng, Rong; Chen, Rong; He, Hua

    2016-12-01

    A simple, accurate and high-throughput pretreatment method would facilitate large-scale studies of trace analysis in complex samples. Magnetic mixed hemimicelles solid-phase extraction has the potential to become a key pretreatment method in biological, environmental and clinical research. However, a lack of experimental predictability and an unclear extraction mechanism limit the development of this promising method. Herein, this work aims to establish theory-based experimental designs for the extraction of trace analytes from complex samples using magnetic mixed hemimicelles solid-phase extraction. We selected three categories and six sub-types of compounds for a systematic comparative study of the extraction mechanism, and comprehensively illustrated for the first time the roles of the different forces involved (hydrophobic interactions, π-π stacking interactions, hydrogen-bonding interactions, and electrostatic interactions). Application guidelines for supporting materials, surfactants and sample matrices are also summarized. The extraction mechanism and platform established in this study make it promising for predictable and efficient, theory-guided pretreatment of trace analytes from environmental, biological and clinical samples.

  4. Lab-on-paper micro- and nano-analytical devices: Fabrication, modification, detection and emerging applications

    International Nuclear Information System (INIS)

    Xu, Yuanhong; Liu, Mengli; Kong, Na; Liu, Jingquan

    2016-01-01

    Paper-based chips (PB-chips; also referred to as lab-on-paper chips) use patterned paper as a substrate in a lab-on-a-chip platform. They represent an outstanding technique for the fabrication of analytical devices for multiplex analyte assays. Typical features include low cost, portability, disposability and small sample consumption. This review (with 211 refs.) gives a comprehensive and critical insight into current trends in terms of materials and techniques for use in fabrication, modification and detection. Following an introduction into the principles of PB-chips, we discuss features of using paper in lab-on-a-chip devices and the proper choice of paper. We then discuss the versatile methods known for fabrication of PB-chips (ranging from photolithography, plasma treatment, inkjet etching and plotting to printing, including flexographic printing). The modification of PB-chips with micro- and nano-materials possessing superior optical or electronic properties is then reviewed, and the final section covers detection techniques (such as colorimetry, electrochemistry, electrochemiluminescence and chemiluminescence) along with specific (bio)analytical examples. A conclusion and outlook section discusses the challenges and future prospects in this field. (author)

  5. New offshore platform in the Mexican Gulf

    Energy Technology Data Exchange (ETDEWEB)

    Beisel, T.

    1982-04-01

    After a construction period of only 10 months, the second steel offshore platform was recently completed in the Mexican Gulf. The pattern for this structure was the Cognac platform. The erection of the new platform, called the 'Cerveza' platform, is described in the article.

  6. Sinking offshore platform. Nedsenkbar fralandsplatform

    Energy Technology Data Exchange (ETDEWEB)

    Einstabland, T.B.; Olsen, O.

    1988-12-19

    The invention deals with a sinking offshore platform of the gravitational type designed to be installed on the sea bed at great depths. The platform consists of at least three inclined pillars placed on a foundation unit. At their upper ends, the pillars are connected to a tower structure by means of a rigid construction. The tower supports the platform deck. The rigid construction comprises a centre-positioned cylinder connected to the foundation. 11 figs.

  7. Understanding Platform-Based Digital Currencies

    OpenAIRE

    Ben Fung; Hanna Halaburda

    2014-01-01

    Given technological advances and the widespread use of the Internet, various digital currencies have emerged. In most cases, Internet platforms such as Facebook and Amazon restrict the functionality of their digital currencies to enhance the business model and maximize their profits. While platform-based digital currencies could increase the efficiency of retail payments, they could also raise some important policy issues if they were to become widely used outside of the platform. Thus, it is...

  8. Analytical chemistry instrumentation

    International Nuclear Information System (INIS)

    Laing, W.R.

    1986-01-01

    In nine sections, 48 chapters cover: 1) analytical chemistry and the environment; 2) environmental radiochemistry; 3) automated instrumentation; 4) advances in analytical mass spectrometry; 5) Fourier transform spectroscopy; 6) analytical chemistry of plutonium; 7) nuclear analytical chemistry; 8) chemometrics; and 9) nuclear fuel technology.

  9. Evolutionary space platform concept study. Volume 2, part B: Manned space platform concepts

    Science.gov (United States)

    1982-01-01

    Logical, cost-effective steps in the evolution of manned space platforms are investigated and assessed. Tasks included the analysis of requirements for a manned space platform, identifying alternative concepts, performing system analysis and definition of the concepts, comparing the concepts and performing programmatic analysis for a reference concept.

  10. Polymer-based platform for microfluidic systems

    Science.gov (United States)

    Benett, William [Livermore, CA; Krulevitch, Peter [Pleasanton, CA; Maghribi, Mariam [Livermore, CA; Hamilton, Julie [Tracy, CA; Rose, Klint [Boston, MA; Wang, Amy W [Oakland, CA

    2009-10-13

    A method of forming a polymer-based microfluidic system platform using network building blocks selected from a set of interconnectable network building blocks, such as wire, pins, blocks, and interconnects. The selected building blocks are interconnectably assembled and fixedly positioned in precise positions in a mold cavity of a mold frame to construct a three-dimensional model construction of a microfluidic flow path network preferably having meso-scale dimensions. A hardenable liquid, such as poly (dimethylsiloxane) is then introduced into the mold cavity and hardened to form a platform structure as well as to mold the microfluidic flow path network having channels, reservoirs and ports. Pre-fabricated elbows, T's and other joints are used to interconnect various building block elements together. After hardening the liquid the building blocks are removed from the platform structure to make available the channels, cavities and ports within the platform structure. Microdevices may be embedded within the cast polymer-based platform, or bonded to the platform structure subsequent to molding, to create an integrated microfluidic system. In this manner, the new microfluidic platform is versatile and capable of quickly generating prototype systems, and could easily be adapted to a manufacturing setting.

  11. Metal-Organic Framework Thin Film Coated Optical Fiber Sensors: A Novel Waveguide-Based Chemical Sensing Platform.

    Science.gov (United States)

    Kim, Ki-Joong; Lu, Ping; Culp, Jeffrey T; Ohodnicki, Paul R

    2018-02-23

    Integration of optical fiber with sensitive thin films offers great potential for the realization of novel chemical sensing platforms. In this study, we present a simple design strategy and high performance of nanoporous metal-organic framework (MOF) based optical gas sensors, which enable detection of a wide range of concentrations of small molecules based upon extremely small differences in refractive indices as a function of analyte adsorption within the MOF framework. Thin and compact MOF films can be uniformly formed and tightly bound on the surface of etched optical fiber through a simple solution method, which is critical for the manufacturability of MOF-based sensor devices. The resulting sensors show high sensitivity/selectivity to CO2 gas relative to other small gases (H2, N2, O2, and CO) with rapid response, enabled by the optical fiber platform, which results in an amplification of the inherent optical absorption present within the MOF-based sensing layer as the effective refractive index increases with the adsorption of gases.

  12. SenStick: Comprehensive Sensing Platform with an Ultra Tiny All-In-One Sensor Board for IoT Research

    Directory of Open Access Journals (Sweden)

    Yugo Nakamura

    2017-01-01

    Full Text Available We propose a comprehensive sensing platform called SenStick, which is composed of hardware (an ultra tiny all-in-one sensor board), software (iOS, Android, and PC), and 3D case data. The platform aims to allow all researchers to start IoT research, such as activity recognition and context estimation, easily and efficiently. The most important contribution is the hardware that we have designed. Various sensors often used for research are embedded in an ultra tiny board with a size of 50 mm (W) × 10 mm (H) × 5 mm (D) and a weight of around 3 g including a battery. Concretely, the following sensors are embedded on this board: acceleration, gyro, magnetic, light, UV, temperature, humidity, and pressure. In addition, this board has BLE (Bluetooth Low Energy) connectivity and supports a rechargeable battery. Using a 110 mAh battery, it can run for more than 15 hours. The key difference from other similar boards is that our board has a large flash memory for logging all the data without a smartphone. By using SenStick, users can collect various data easily and focus on IoT data analytics. In this paper, we introduce the SenStick platform and some case studies. Through a user study, we confirmed the usefulness of our proposed platform.

  13. Offshore Minerals Management Platforms for the Gulf of Mexico (GOM), Geographic NAD83, MMS (2006) [platforms_mms_2006

    Data.gov (United States)

    Louisiana Geographic Information Center — Offshore Minerals Management Platforms for the Gulf of Mexico (GOM). Identifies the location of platforms in GOM. All platforms existing in the database are included.

  14. Wireless implantable electronic platform for chronic fluorescent-based biosensors.

    Science.gov (United States)

    Valdastri, Pietro; Susilo, Ekawahyu; Förster, Thilo; Strohhöfer, Christof; Menciassi, Arianna; Dario, Paolo

    2011-06-01

    The development of a long-term wireless implantable biosensor based on fluorescence intensity measurement poses a number of technical challenges, ranging from biocompatibility to sensor stability over time. One of these challenges is the design of a power efficient and miniaturized electronics, enabling the biosensor to move from bench testing to long term validation, up to its final application in human beings. In this spirit, we present a wireless programmable electronic platform for implantable chronic monitoring of fluorescent-based autonomous biosensors. This system is able to achieve extremely low power operation with bidirectional telemetry, based on the IEEE802.15.4-2003 protocol, thus enabling over three-year battery lifetime and wireless networking of multiple sensors. During the performance of single fluorescent-based sensor measurements, the circuit drives a laser diode, for sensor excitation, and acquires the amplified signals from four different photodetectors. In vitro functionality was preliminarily tested for both glucose and calcium monitoring, simply by changing the analyte-binding protein of the biosensor. Electronics performance was assessed in terms of timing, power consumption, tissue exposure to electromagnetic fields, and in vivo wireless connectivity. The final goal of the presented platform is to be integrated in a complete system for blood glucose level monitoring that may be implanted for at least one year under the skin of diabetic patients. Results reported in this paper may be applied to a wide variety of biosensors based on fluorescence intensity measurement.

  15. Inkjet-printed point-of-care immunoassay on a nanoscale polymer brush enables subpicomolar detection of analytes in blood

    Science.gov (United States)

    Joh, Daniel Y.; Hucknall, Angus M.; Wei, Qingshan; Mason, Kelly A.; Lund, Margaret L.; Fontes, Cassio M.; Hill, Ryan T.; Blair, Rebecca; Zimmers, Zackary; Achar, Rohan K.; Tseng, Derek; Gordan, Raluca; Freemark, Michael; Ozcan, Aydogan; Chilkoti, Ashutosh

    2017-08-01

    The ELISA is the mainstay for sensitive and quantitative detection of protein analytes. Despite its utility, ELISA is time-consuming, resource-intensive, and infrastructure-dependent, limiting its availability in resource-limited regions. Here, we describe a self-contained immunoassay platform (the “D4 assay”) that converts the sandwich immunoassay into a point-of-care test (POCT). The D4 assay is fabricated by inkjet printing assay reagents as microarrays on nanoscale polymer brushes on glass chips, so that all reagents are “on-chip,” and these chips show durable storage stability without cold storage. The D4 assay can interrogate multiple analytes from a drop of blood, is compatible with a smartphone detector, and displays analytical figures of merit that are comparable to standard laboratory-based ELISA in whole blood. These attributes of the D4 POCT have the potential to democratize access to high-performance immunoassays in resource-limited settings without sacrificing their performance.

  16. Platforms.

    Science.gov (United States)

    Josko, Deborah

    2014-01-01

    The advent of DNA sequencing technologies and the various applications that can be performed will have a dramatic effect on medicine and healthcare in the near future. There are several DNA sequencing platforms available on the market for research and clinical use. Based on the medical laboratory scientist or researcher's needs, and taking into consideration laboratory space and budget, one can choose which platform will be beneficial to their institution and their patient population. Although some of the instrument costs seem high, diagnosing a patient quickly and accurately will save hospitals money with fewer hospital stays and targeted treatment based on an individual's genetic make-up. By determining the type of disease an individual has, based on the mutations present, or having the ability to prescribe the appropriate antimicrobials based on knowledge of the organism's resistance patterns, the clinician will be better able to treat and diagnose a patient, which ultimately will improve patient outcomes and prognosis.

  17. Twitter data analytics

    CERN Document Server

    Bruns, Axel; Lewandowski, Dirk

    2014-01-01

    It might still sound strange to dedicate an entire ebook exclusively to a single Internet platform. But it is not the company Twitter, Inc. that is the focus; this ebook is not about a platform and its features and services. It is about its users and the ways in which they interact with one another via the platform, about the situations that motivate people to share their thoughts publicly, using Twitter as a means to reach out to one another. And it is about the digital traces people leave behind when interacting with Twitter, and most of all about the ways in which these traces - as a new ty

  18. Towards a Framework of Digital Platform Competition

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    2016-01-01

    between monopolistic (i.e., Pingit) and federated (i.e., Paym) mobile payment platforms to illustrate its applicability and yield principles on the nature and impact of competition among platform-driven ubiquitous systems. Preliminary findings indicate that monopolistic mobile digital platforms attempt to create unique configurals to obtain monopolistic power by tightly coupling platform layers, which are difficult to replicate. Conversely, federated digital platforms compete by dispersing the service layer to harness the collective resources from individual firms. Furthermore, the interaction

  19. Assembly procedure for Shot Loading Platform

    International Nuclear Information System (INIS)

    Routh, R.D.

    1995-01-01

    This supporting document describes the assembly procedure for the Shot Loading Platform. The Shot Loading Platform is used by multiple equipment removal projects to load shielding shot in the annular spaces of the equipment storage containers. The platform height is adjustable to accommodate different sizes of storage containers and transport assemblies

  20. Let's Talk... Analytics

    Science.gov (United States)

    Oblinger, Diana G.

    2012-01-01

    Talk about analytics seems to be everywhere. Everyone is talking about analytics. Yet even with all the talk, many in higher education have questions about--and objections to--using analytics in colleges and universities. In this article, the author explores the use of analytics in, and all around, higher education. (Contains 1 note.)

  1. Intrant ELISA: A Novel Approach to Fabrication of Electrospun Fiber Mat-Assisted Biosensor Platforms and Their Integration within Standard Analytical Well Plates

    Directory of Open Access Journals (Sweden)

    Samira Hosseini

    2016-11-01

    Full Text Available A combination of far-field electrospinning (FFES) and free-radical polymerization has been used to fabricate coated electrospun polymer fiber mats as a new type of biosensor platform. Poly(3-hydroxybutyrate-co-3-hydroxyvalerate) (PHBV) electrospun fibers were dip-coated with different compositions of poly(methyl methacrylate-co-methacrylic acid) (poly(MMA-co-MAA)). This synergistic approach utilizes the large specific surface area of PHBV fibers and co-polymer coatings that feature an optimum concentration of surface carboxyl (–COOH) groups. The platform surface morphology, porosity and tunable hydrophobicity enhance biomolecular interactions via a plurality of molecular forces. These customized fiber mats have been integrated into a newly designed 96-well plate called an "intrant enzyme-linked immunosorbent assay" or i-ELISA. i-ELISA allows a colorimetric sandwich assay to be carried out without any modifications or additional steps in the ELISA methodology. By introducing the fiber mats into the fabrication of i-ELISA via extensions on the lid, we address some of the limitations of previous designs while demonstrating an enhanced signal intensity up to 12 times higher than that of conventional assays. With improved sensitivity, specificity and accuracy in the detection of dengue virus, i-ELISA has proven to be a reliable platform for biomolecular recognition. The proposed fiber mat-assisted well plate in this study holds great potential as a universal approach for the integration of different types of fiber mats with pre-designed specific properties in order to enhance the detection sensitivity of the assay.

  2. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  3. The Ophidia Stack: Toward Large Scale, Big Data Analytics Experiments for Climate Change

    Science.gov (United States)

    Fiore, S.; Williams, D. N.; D'Anca, A.; Nassisi, P.; Aloisio, G.

    2015-12-01

    The Ophidia project is a research effort on big data analytics facing scientific data analysis challenges in multiple domains (e.g. climate change). It provides a "datacube-oriented" framework responsible for atomically processing and manipulating scientific datasets, by providing a common way to run distributive tasks on large set of data fragments (chunks). Ophidia provides declarative, server-side, and parallel data analysis, jointly with an internal storage model able to efficiently deal with multidimensional data and a hierarchical data organization to manage large data volumes. The project relies on a strong background on high performance database management and On-Line Analytical Processing (OLAP) systems to manage large scientific datasets. The Ophidia analytics platform provides several data operators to manipulate datacubes (about 50), and array-based primitives (more than 100) to perform data analysis on large scientific data arrays. To address interoperability, Ophidia provides multiple server interfaces (e.g. OGC-WPS). From a client standpoint, a Python interface enables the exploitation of the framework into Python-based eco-systems/applications (e.g. IPython) and the straightforward adoption of a strong set of related libraries (e.g. SciPy, NumPy). The talk will highlight a key feature of the Ophidia framework stack: the "Analytics Workflow Management System" (AWfMS). The Ophidia AWfMS coordinates, orchestrates, optimises and monitors the execution of multiple scientific data analytics and visualization tasks, thus supporting "complex analytics experiments". Some real use cases related to the CMIP5 experiment will be discussed. In particular, with regard to the "Climate models intercomparison data analysis" case study proposed in the EU H2020 INDIGO-DataCloud project, workflows related to (i) anomalies, (ii) trend, and (iii) climate change signal analysis will be presented. Such workflows will be distributed across multiple sites - according to the
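
    The anomaly and trend workflows mentioned above can be illustrated, in spirit only, with plain NumPy; the sketch below uses a synthetic single-grid-cell temperature series and does not use Ophidia's operators or its Python interface, which operate on distributed datacubes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic monthly temperature series (30 years) standing in for one grid cell
# of a CMIP5-like dataset; the real workflows operate on distributed datacubes.
years, months = 30, 12
seasonal = 10 * np.sin(2 * np.pi * np.arange(months) / months)
data = (15 + seasonal[None, :] + 0.03 * np.arange(years)[:, None]
        + rng.normal(0, 0.5, (years, months)))

climatology = data.mean(axis=0)           # long-term mean for each month
anomalies = data - climatology            # monthly anomalies
annual_mean = data.mean(axis=1)
trend_per_decade = 10 * np.polyfit(np.arange(years), annual_mean, 1)[0]

print(f"warming trend: {trend_per_decade:.2f} degrees per decade")
print(f"largest positive anomaly: {anomalies.max():.2f} degrees")
```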

  4. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  5. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  6. Rapid bead-based immunoassay for measurement of mannose-binding lectin

    DEFF Research Database (Denmark)

    Bay, J T; Garred, P

    2009-01-01

    While various assays have been developed, more automated platforms for MBL analysis are urgently needed. To pursue this, we set out to develop a flexible bead-based MBL immunoassay. Serum was obtained from 98 healthy individuals and 50 patients investigated for possible immunodeficiencies. We used the Luminex xMAP bead array technology. The intra- and inter-assay coefficients of variation were found to be 7.88% and 5.70%, respectively. A close correlation between the new assay and a reference MBL measurement ELISA was found (rho 0.9381, P < 0.0001). The new bead-based assay was less sensitive to interfering anti-murine antibodies in the blood samples than when the same antibodies were used in the reference polystyrene-based ELISA. The new assay could be performed in 3 h with less than 25 microl of serum required for each sample. These results show that MBL can be measured readily using a bead-based platform, which may form an efficient basis for a multiplex approach to measure different
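
    A hedged sketch of the two quantities reported above, method comparison by Spearman correlation and assay imprecision as a coefficient of variation, is shown below with invented MBL concentrations; it is not the study's data or analysis code.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical MBL concentrations (ng/mL) measured on the same sera with the
# new bead-based assay and the reference ELISA.
bead_assay = np.array([120, 540, 2300, 60, 980, 1500, 310, 4100])
reference_elisa = np.array([140, 510, 2450, 75, 1020, 1430, 290, 3900])

rho, p_value = spearmanr(bead_assay, reference_elisa)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.4f}")

# Intra-assay imprecision from replicate measurements of one control serum.
replicates = np.array([870, 905, 860, 890, 915])
cv = 100 * replicates.std(ddof=1) / replicates.mean()
print(f"intra-assay CV = {cv:.1f}%")
```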

  7. Platform decisions supported by gaming

    DEFF Research Database (Denmark)

    Hansen, Poul H. Kyvsgård; Mikkola, Juliana Hsuan

    2007-01-01

    Platform is an ambiguous multidisciplinary concept. The philosophy behind it is easy to communicate and makes intuitive sense. However, the ease of communication overshadows the high complexity encountered when the concept is implemented. The practical industrial platform implementation challenge can be described as a configuration problem with a high number of variables. These variables are different in nature; they have contradictory influences on the total performance, and their importance changes over time. Consequently, the specific platform decisions become highly complex and the consequences

  8. Digital platforms as enablers for digital transformation

    DEFF Research Database (Denmark)

    Hossain, Mokter; Lassen, Astrid Heidemann

    Digital platforms offer new ways for organizations to collaborate with the external environment for ideas, technologies, and knowledge. They provide new possibilities and competence but they also bring new challenges for organizations. Understanding the role of these platforms in digital transformation is crucial. This study aims at exploring how organizations are driven towards transformation in various ways to embrace digital platforms for ideas, technologies, and knowledge. It shows the opportunities and challenges digital platforms bring in organizations. It also highlights underlying mechanisms and potential outcomes of various digital platforms. The contribution of the submission is valuable for scholars to understand and further explore this area. It provides insight for practitioners to capture value through digital platforms and accelerate the pace of organizations’ digital transformation.

  9. [Orange Platform].

    Science.gov (United States)

    Toba, Kenji

    2017-07-01

    The Organized Registration for the Assessment of dementia on Nationwide General consortium toward Effective treatment in Japan (ORANGE platform) is a recently established nationwide clinical registry for dementia. This platform consists of multiple registries of patients with dementia stratified by the following clinical stages: preclinical, mild cognitive impairment, early-stage, and advanced-stage dementia. Patients will be examined in a super-longitudinal fashion, and their lifestyle, social background, genetic risk factors, and required care process will be assessed. This project is also notable because the care registry includes information on the successful, comprehensive management of patients with dementia. Therefore, this multicenter prospective cohort study will contribute participants to all clinical trials for Alzheimer's disease as well as improve the understanding of individuals with dementia.

  10. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2011-01-01

    The Windows Azure Platform has rapidly established itself as one of the most sophisticated cloud computing platforms available. With Microsoft working to continually update their product and keep it at the cutting edge, the future looks bright - if you have the skills to harness it. In particular, new features such as remote desktop access, dynamic content caching and secure content delivery using SSL make the latest version of Azure a more powerful solution than ever before. It's widely agreed that cloud computing has produced a paradigm shift in traditional architectural concepts by providin

  11. Fabrication and Optimization of Bilayered Nanoporous Anodic Alumina Structures as Multi-Point Interferometric Sensing Platform

    Directory of Open Access Journals (Sweden)

    Mahdieh Nemati

    2018-02-01

    Herein, we present an innovative strategy for optimizing hierarchical structures of nanoporous anodic alumina (NAA) to advance their optical sensing performance toward multi-analyte biosensing. This approach is based on the fabrication of multilayered NAA and the formation of differential effective medium of their structure by controlling three fabrication parameters (i.e., anodization steps, anodization time, and pore widening time). The rationale of the proposed concept is that interferometric bilayered NAA (BL-NAA), which features two layers of different pore diameters, can provide distinct reflectometric interference spectroscopy (RIfS) signatures for each layer within the NAA structure and can therefore potentially be used for multi-point biosensing. This paper presents the structural fabrication of layered NAA structures, and the optimization and evaluation of their RIfS optical sensing performance through changes in the effective optical thickness (EOT) using quercetin as a model molecule. The bilayered or funnel-like NAA structures were designed with the aim of characterizing the sensitivity of both layers of quercetin molecules using RIfS and exploring the potential of these photonic structures, featuring different pore diameters, for simultaneous size-exclusion and multi-analyte optical biosensing. The sensing performance of the prepared NAA platforms was examined by real-time screening of binding reactions between human serum albumin (HSA)-modified NAA (i.e., sensing element) and quercetin (i.e., analyte). BL-NAAs display a complex optical interference spectrum, which can be resolved by fast Fourier transform (FFT) to monitor the EOT changes, where three distinctive peaks were revealed corresponding to the top, bottom, and total layer within the BL-NAA structures. The spectral shifts of these three characteristic peaks were used as sensing signals to monitor the binding events in each NAA pore in real-time upon exposure to different

  12. Radiographic inspection on offshore platforms

    International Nuclear Information System (INIS)

    Soares, Sergio Damasceno; Sperandio, Augusto Gasparoni

    1994-01-01

    One of the great challenges for non-destructive inspection is on offshore platforms, where safety is a critical issue. Inspection by gammagraphy is practically forbidden on the platform deck due to problems of personnel safety and radiological protection. Ir-192 sources are used and the risk of an accident with loss of the radioisotope must be considered. It is unfeasible to use gammagraphy, because in case of an accident the rapid evacuation from the platform would be impossible. This problem does not occur when X-ray equipment is used as the radiation source. The limited practicality and portability of X-ray equipment have prevented its use as a replacement for gammagraphy. This paper presents preliminary tests to verify the viability of radiographic tests with constant potential on offshore platforms. (author). 2 refs., 1 fig., 2 tabs, 3 photos

  13. Peer-to-Peer Service Sharing Platforms

    DEFF Research Database (Denmark)

    Andersson, Magnus; Hjalmarsson, Anders; Avital, Michel

    2013-01-01

    The sharing economy has been growing continuously in the last decade thanks to the proliferation of internet-based platforms that allow people to disintermediate the traditional commercial channels and to share excess resources and trade with one another effectively at a reasonably low transaction...... cost. Whereas early peer-to-peer platforms were designed to enable file sharing and goods trading, we recently witness the emergence of a new breed of peer-to-peer platforms that are designed for ordinary service sharing. Ordinary services entail intangible provisions and are defined as an economic...... activity that generates immaterial benefits and does not result in ownership of material goods. Based on a structured analysis of 41 internet-based rideshare platforms, we explore and layout the unique characteristics of peer-to-peer service sharing platforms based on three distinct temporal patterns...

  14. Towards A Research Agenda on Digital Platform Disruption

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    Digital platforms are disruptive IT artifacts, because they facilitate the quick release of innovative platform derivatives from third parties. This study endeavors to unravel the disruptive potential, caused by distinct designs and configurations of digital platforms on market environments. We...... postulate that the disruptive potential of digital platforms is determined by the degree of alignment among the business, technology and platform profiles. Furthermore, we argue that the design and configuration of the aforementioned three elements dictates the extent to which open innovation is permitted....... To shed light on the disruptive potential of digital platforms, we opted for digital payment platforms as our unit of analysis. Through interviews with experts and payment providers, we seek to gain an in-depth appreciation of how contemporary digital payment platforms are designed and configured...

  15. Experimental platform utilising melting curve technology for detection of mutations in Mycobacterium tuberculosis isolates.

    Science.gov (United States)

    Broda, Agnieszka; Nikolayevskyy, Vlad; Casali, Nicki; Khan, Huma; Bowker, Richard; Blackwell, Gemma; Patel, Bhakti; Hume, James; Hussain, Waqar; Drobniewski, Francis

    2018-04-20

    Tuberculosis (TB) remains one of the most deadly infections with approximately a quarter of cases not being identified and/or treated mainly due to a lack of resources. Rapid detection of TB or drug-resistant TB enables timely adequate treatment and is a cornerstone of effective TB management. We evaluated the analytical performance of a single-tube assay for multidrug-resistant TB (MDR-TB) on an experimental platform utilising RT-PCR and melting curve analysis that could potentially be operated as a point-of-care (PoC) test in resource-constrained settings with a high burden of TB. Firstly, we developed and evaluated the prototype MDR-TB assay using specimens extracted from well-characterised TB isolates with a variety of distinct rifampicin and isoniazid resistance conferring mutations and nontuberculous Mycobacteria (NTM) strains. Secondly, we validated the experimental platform using 98 clinical sputum samples from pulmonary TB patients collected in high MDR-TB settings. The sensitivity of the platform for TB detection in clinical specimens was 75% for smear-negative and 92.6% for smear-positive sputum samples. The sensitivity of detection for rifampicin and isoniazid resistance was 88.9 and 96.0% and specificity was 87.5 and 100%, respectively. Observed limitations in sensitivity and specificity could be resolved by adjusting the sample preparation methodology and melting curve recognition algorithm. Overall technology could be considered a promising PoC methodology especially in resource-constrained settings based on its combined accuracy, convenience, simplicity, speed, and cost characteristics.

  16. Visual Analytics of Complex Genomics Data to Guide Effective Treatment Decisions

    Directory of Open Access Journals (Sweden)

    Quang Vinh Nguyen

    2016-09-01

    In cancer biology, genomics represents a big data problem that needs accurate visual data processing and analytics. The human genome is very complex with thousands of genes that contain the information about the individual patients and the biological mechanisms of their disease. Therefore, when building a framework for personalised treatment, the complexity of the genome must be captured in meaningful and actionable ways. This paper presents a novel visual analytics framework that enables effective analysis of large and complex genomics data. By providing interactive visualisations from the overview of the entire patient cohort to the detail view of individual genes, our work potentially guides effective treatment decisions for childhood cancer patients. The framework consists of multiple components enabling the complete analytics supporting personalised medicines, including similarity space construction, automated analysis, visualisation, gene-to-gene comparison and user-centric interaction and exploration based on feature selection. In addition to the traditional way to visualise data, we utilise the Unity3D platform for developing a smooth and interactive visual presentation of the information. This aims to provide better rendering, image quality, ergonomics and user experience to non-specialists or young users who are familiar with 3D gaming environments and interfaces. We illustrate the effectiveness of our approach through case studies with datasets from childhood cancers, B-cell Acute Lymphoblastic Leukaemia (ALL) and Rhabdomyosarcoma (RMS) patients, on how to guide the effective treatment decision in the cohort.

  17. The Definitive Guide to NetBeans Platform

    CERN Document Server

    Bock, Heiko

    2009-01-01

    The Definitive Guide to NetBeans(t) Platform is a thorough and definitive introduction to the NetBeans Platform, covering all its major APIs in detail, with relevant code examples used throughout. The original German book on which this title is based was well received. The NetBeans Platform Community has put together this English translation, which author Heiko Bock updated to cover the latest NetBeans Platform 6.5 APIs. With an introduction by known NetBeans Platform experts Jaroslav Tulach, Tim Boudreau, and Geertjan Wielenga, this is the most up-to-date book on this topic at the moment. All

  18. Modelling a flows in supply chain with analytical models: Case of a chemical industry

    Science.gov (United States)

    Benhida, Khalid; Azougagh, Yassine; Elfezazi, Said

    2016-02-01

    This study is concerned with the modelling of logistics flows in a supply chain composed of production sites and a logistics platform. The contribution of this research is to develop an analytical model (an integrated linear programming model), based on a case study of a real company operating in the phosphate field, considering various constraints in this supply chain to resolve planning problems for better decision-making. The objective of this model is to determine and define the optimal quantities of different products to route to and from the various entities in the supply chain studied.
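
    As a rough illustration of the kind of integrated linear programming model described above, the sketch below routes two hypothetical products either directly or via a logistics platform at minimum cost. All product names, costs, demands and the capacity figure are invented for the example, and the pulp solver is only one possible choice; the paper's actual model and data are not reproduced here.

    ```python
    # Hypothetical data: route two products either directly or via the platform
    # at minimum cost, subject to demand and platform capacity. Requires `pulp`.
    import pulp

    products = ["P1", "P2"]
    demand = {"P1": 120, "P2": 80}                 # tonnes demanded downstream (assumed)
    cost_direct = {"P1": 9.0, "P2": 7.5}           # cost per tonne, site -> customer (assumed)
    cost_via_platform = {"P1": 6.0, "P2": 6.5}     # cost per tonne, site -> platform -> customer (assumed)
    platform_capacity = 150                        # tonnes the platform can handle (assumed)

    model = pulp.LpProblem("flow_planning", pulp.LpMinimize)
    direct = pulp.LpVariable.dicts("direct", products, lowBound=0)
    via = pulp.LpVariable.dicts("via_platform", products, lowBound=0)

    model += pulp.lpSum(cost_direct[p] * direct[p] + cost_via_platform[p] * via[p]
                        for p in products)                       # total routing cost
    for p in products:
        model += direct[p] + via[p] == demand[p]                 # satisfy demand
    model += pulp.lpSum(via[p] for p in products) <= platform_capacity

    model.solve(pulp.PULP_CBC_CMD(msg=False))
    for p in products:
        print(p, "direct:", direct[p].value(), "via platform:", via[p].value())
    ```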

  19. Leveraging big data using a novel clinical database and analytic platform based on 323,145 individuals with and without of Diabetes

    Directory of Open Access Journals (Sweden)

    Anjana Ranjit Mohan

    2017-12-01

    Large volumes of biomedical data are being produced every day. Leveraging such a voluminous amount of patient data using data science approaches helps to uncover hidden patterns, unknown correlations, and other insights of the disease. Integration of diverse genomic data with comprehensive electronic health records (EHRs) exhibits challenges, but essentially, they provide a feasible opportunity to better understand the underlying diseases and treatment patterns and develop an efficient and effective approach to identify biomarkers for diagnosis and improve therapy. Here, we describe a big data solution for diabetes, providing an efficient and responsive scientific discovery platform for researchers. Clinical phenotype data was collected from the EHR of a tertiary care diabetes centre across 20 locations in India. It encompasses >20 million data points on 323,145 patients registered over 25 years. The biomedical data includes a diverse collection of information such as well-characterized clinical phenotypes, biochemical investigations, drug prescriptions, genotype mapping, micro- and macrovascular complications of diabetes, and pedigree charts. Additionally, more than 3 million genomic variants from high-throughput next generation sequencing (NGS) were analyzed, annotated and integrated into the respective patient phenotype data. Statistical methods such as the t-test and ANOVA were used to describe the significance of clinical variables between subject groups. Time-based visualization methods for showing temporal patterns in key clinical variables such as fasting glucose, HbA1c, albuminuria, etc. are provided. It provides a novel clinical database of information on 323,145 patients with type 2 diabetes (n=294,371), type 1 diabetes (n=1,945), gestational diabetes (n=645), prediabetes (n=7,363), patients with miscellaneous forms of diabetes including monogenic forms of diabetes (n=4,601) and normal glucose tolerance (n=12,579). It has about 144,926 diabetes patients (44.8% of total
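
    The abstract mentions t-tests and ANOVA for comparing clinical variables between subject groups. The snippet below is a generic illustration of that comparison with SciPy on synthetic HbA1c-like values; the group means, sizes and variances are assumptions, not registry data.

    ```python
    # Illustrative group comparison of an HbA1c-like variable; values are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    hba1c_t2dm = rng.normal(8.1, 1.2, 500)   # hypothetical type 2 diabetes group
    hba1c_pre = rng.normal(6.1, 0.5, 150)    # hypothetical prediabetes group
    hba1c_ngt = rng.normal(5.4, 0.4, 200)    # hypothetical normal glucose tolerance group

    t_stat, p_two = stats.ttest_ind(hba1c_t2dm, hba1c_ngt, equal_var=False)
    f_stat, p_anova = stats.f_oneway(hba1c_t2dm, hba1c_pre, hba1c_ngt)
    print(f"t-test (T2DM vs NGT) p = {p_two:.3g}; one-way ANOVA p = {p_anova:.3g}")
    ```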

  20. Matrix Factorizations at Scale: a Comparison of Scientific Data Analytics in Spark and C+MPI Using Three Case Studies

    Energy Technology Data Exchange (ETDEWEB)

    Gittens, Alex; Devarakonda, Aditya; Racah, Evan; Ringenburg, Michael; Gerhardt, Lisa; Kottalam, Jey; Liu, Jialin; Maschhoff, Kristyn; Canon, Shane; Chhugani, Jatin; Sharma, Pramod; Yang, Jiyan; Demmel, James; Harrell, Jim; Krishnamurthy, Venkat; Mahoney, Michael; Prabhat, Mr

    2016-05-12

    We explore the trade-offs of performing linear algebra using Apache Spark, compared to traditional C and MPI implementations on HPC platforms. Spark is designed for data analytics on cluster computing platforms with access to local disks and is optimized for data-parallel tasks. We examine three widely-used and important matrix factorizations: NMF (for physical plausibility), PCA (for its ubiquity) and CX (for data interpretability). We apply these methods to 1.6TB particle physics, 2.2TB and 16TB climate modeling and 1.1TB bioimaging data. The data matrices are tall-and-skinny, which enables the algorithms to map conveniently into Spark’s data-parallel model. We perform scaling experiments on up to 1600 Cray XC40 nodes, describe the sources of slowdowns, and provide tuning guidance to obtain high performance.
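
    For readers unfamiliar with how a tall-and-skinny factorization maps onto Spark's data-parallel model, the following minimal PySpark sketch computes a PCA on a small synthetic row matrix. The matrix size and local-mode session are assumptions for illustration; the paper's NMF and CX implementations are not reproduced.

    ```python
    # Minimal PySpark sketch (synthetic data, local mode): PCA on a tall-and-skinny
    # RowMatrix, the shape that maps naturally onto Spark's data-parallel model.
    from pyspark.sql import SparkSession
    from pyspark.mllib.linalg import Vectors
    from pyspark.mllib.linalg.distributed import RowMatrix
    import numpy as np

    spark = SparkSession.builder.master("local[*]").appName("tall-skinny-pca").getOrCreate()
    rows = spark.sparkContext.parallelize(
        [Vectors.dense(np.random.rand(10)) for _ in range(10_000)]  # 10,000 x 10 matrix
    )
    mat = RowMatrix(rows)
    pcs = mat.computePrincipalComponents(3)   # local 10 x 3 matrix of the top-3 components
    print(pcs.numRows, pcs.numCols)
    spark.stop()
    ```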

  1. On the Power of Randomization in Big Data Analytics

    DEFF Research Database (Denmark)

    Pham, Ninh Dang

    We are experiencing a big data deluge, a result of not only the internetization and computerization of our society, but also the fast development of affordable and powerful data collection and storage devices. The explosively growing data, in both size and form, has posed a fundamental challenge...... for big data analytics. That is, how to efficiently handle and analyze such big data in order to bridge the gap between data and information. In a wide range of application domains, data are represented as high-dimensional vectors in the Euclidean space in order to benefit from computationally advanced...... techniques from numerical linear algebra. There have been growing demands for the computational efficiency and scalability of such techniques, requiring not only novel platform system architectures but also efficient and effective algorithms to address fast-paced big data needs. In the thesis we will tackle...

  2. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD) based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists on the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).

  3. A practice scaffolding interactive platform

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe

    2009-01-01

    A Practice Scaffolding Interactive Platform (PracSIP) is a social learning platform which supports students in collaborative project based learning by simulating a professional practice. A PracSIP puts the core tools of the simulated practice at the students' disposal, it organizes collaboration...

  4. Development of a Modular Robotic Platform

    Directory of Open Access Journals (Sweden)

    Claudiu Ioan Cirebea

    2014-12-01

    In this paper a modular robotic platform is presented for students' and researchers' laboratory work, based on the Matlab-Simulink and dSpace real-time control platform. The goal of this combination is to stimulate experimentation with real-time hardware and software in courses where mobile robotics is adopted as a motivating platform to introduce mechatronics competencies. Its many possibilities for modification and extension make experiments very easy. We used, for example, an omnidirectional mobile robot configuration with three Swedish wheels, whose kinematic model was simulated using Simulink. For real-time control of the robot, the developed model has been implemented on the dSpace DS1103 platform.

  5. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Science.gov (United States)

    Patou, François; AlZahra’a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E.

    2016-01-01

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods. PMID:27598208

  6. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.

    Science.gov (United States)

    Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E

    2016-09-03

    The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  7. Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications

    Directory of Open Access Journals (Sweden)

    François Patou

    2016-09-01

    Full Text Available The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.

  8. Platform economy in Denmark – precarious employment?

    DEFF Research Database (Denmark)

    Rasmussen, Stine; Madsen, Per Kongshøj

    This paper takes a labour market perspective on the emerging concept of the 'sharing economy' or 'platform economy', which we use as a more appropriate term for the phenomenon. Platform economy is in the article understood as those business models that have emerged since the millennium, where digital platforms serve as the link between persons wanting to make use of certain activities, services etc. and those owning them, and we only have an interest in the work-related platforms. That means platforms where paid work is offered and demanded. International examples of this new phenomenon...... limited. Nevertheless the labour offered through the platforms has a precarious character, for instance in terms of lower wages and poorer rights and protection compared to the labour at the traditional, offline labour market. One important issue here is also the confusion as to whether the worker......

  9. Software Development Process Improvement in Datacom Platform

    OpenAIRE

    Trabelsi, Walid

    2008-01-01

    Master's thesis in Information and Communication Technology, 2008, Universitetet i Agder, Grimstad. Ericsson Mobile Platform (EMP) is responsible for the development of a software platform and, to some extent, also for related hardware parts. EMP is developing the data communication parts of the platform, which are used by EMP customers. The platform development is done in large development programs, and each program spans quite a long time period. However, as we see eve...

  10. Increasing Impact of Coursework Through Deep Analytics

    Science.gov (United States)

    Horodyskyj, L.; Schonstein, D.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2014-12-01

    Over the past few years, ASU has developed the online astrobiology lab course Habitable Worlds, which has been offered to over 1,500 students over seven semesters. The course is offered through Smart Sparrow's intelligent tutoring system, which records student answers, time on question, simulation setups, and additional data that we refer to as "analytics". As the development of the course has stabilized, we have been able to devote more time to analyzing these data, extracting patterns of student behavior and how they have changed as the course has developed. During the most recent two semesters, pre- and post-tests of content knowledge related to the greenhouse effect were administered to assess changes in students' knowledge. The results of the Fall 2013 content assessment and an analysis of each step of every activity using the course platform analytics were used to identify problematic concepts and lesson elements, which were redesigned for the following semester. We observed a statistically significant improvement from pre to post instruction in Spring 2014. Preliminary results seem to indicate that several interactive activities, which replaced written/spoken content, contributed to this positive outcome. Our study demonstrates the benefit of deep analytics for thorough analysis of student results and quick iteration, allowing for significantly improved exercises to be redeployed quickly. The misconceptions that students have and retain depend on the individual student, although certain patterns do emerge in the class as a whole. These patterns can be seen in student discussion board behavior, the types of answers they submit, and the patterns of mistakes they make. By interrogating this wealth of data, we seek to identify the patterns that outstanding, struggling, and failing students display and how early in the class these patterns can be detected. If these patterns can be identified and detected early in the semester, instructors can intervene earlier
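
    A paired pre/post comparison of the kind reported above can be illustrated with a paired t-test; the sketch below uses made-up per-student scores and is not the course's actual assessment data.

    ```python
    # Illustrative paired pre/post comparison; the scores are fabricated.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    pre = rng.normal(55, 12, 100)            # hypothetical pre-test scores (percent)
    post = pre + rng.normal(8, 6, 100)       # hypothetical post-test scores after instruction

    t_stat, p_value = stats.ttest_rel(post, pre)
    print(f"mean gain = {np.mean(post - pre):.1f} points, paired t-test p = {p_value:.3g}")
    ```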

  11. Wireless sensor platform

    Science.gov (United States)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.

  12. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2010-01-01

    The Azure Services Platform is a brand-new cloud-computing technology from Microsoft. It is composed of four core components-Windows Azure, .NET Services, SQL Services, and Live Services-each with a unique role in the functioning of your cloud service. It is the goal of this book to show you how to use these components, both separately and together, to build flawless cloud services. At its heart Windows Azure Platform is a down-to-earth, code-centric book. This book aims to show you precisely how the components are employed and to demonstrate the techniques and best practices you need to know

  13. Demand Heterogeneity and the Adoption of Platform Complements

    NARCIS (Netherlands)

    G.J. Rietveld (Joost); J.P. Eggers

    2016-01-01

    textabstractThis paper offers a demand-based theory of how platform maturity affects the adoption of platform complements. We argue that differences between early and late adopters of the platform include willingness to pay for the platform-and-complement bundle, risk preferences, preference for

  14. Analytic trigonometry

    CERN Document Server

    Bruce, William J; Maxwell, E A; Sneddon, I N

    1963-01-01

    Analytic Trigonometry details the fundamental concepts and underlying principle of analytic geometry. The title aims to address the shortcomings in the instruction of trigonometry by considering basic theories of learning and pedagogy. The text first covers the essential elements from elementary algebra, plane geometry, and analytic geometry. Next, the selection tackles the trigonometric functions of angles in general, basic identities, and solutions of equations. The text also deals with the trigonometric functions of real numbers. The fifth chapter details the inverse trigonometric functions

  15. Evaluation of data discretization methods to derive platform independent isoform expression signatures for multi-class tumor subtyping.

    Science.gov (United States)

    Jung, Segun; Bi, Yingtao; Davuluri, Ramana V

    2015-01-01

    Many supervised learning algorithms have been applied in deriving gene signatures for patient stratification from gene expression data. However, transferring the multi-gene signatures from one analytical platform to another without loss of classification accuracy is a major challenge. Here, we compared three unsupervised data discretization methods--Equal-width binning, Equal-frequency binning, and k-means clustering--in accurately classifying the four known subtypes of glioblastoma multiforme (GBM) when the classification algorithms were trained on the isoform-level gene expression profiles from exon-array platform and tested on the corresponding profiles from RNA-seq data. We applied an integrated machine learning framework that involves three sequential steps; feature selection, data discretization, and classification. For models trained and tested on exon-array data, the addition of data discretization step led to robust and accurate predictive models with fewer number of variables in the final models. For models trained on exon-array data and tested on RNA-seq data, the addition of data discretization step dramatically improved the classification accuracies with Equal-frequency binning showing the highest improvement with more than 90% accuracies for all the models with features chosen by Random Forest based feature selection. Overall, SVM classifier coupled with Equal-frequency binning achieved the best accuracy (> 95%). Without data discretization, however, only 73.6% accuracy was achieved at most. The classification algorithms, trained and tested on data from the same platform, yielded similar accuracies in predicting the four GBM subgroups. However, when dealing with cross-platform data, from exon-array to RNA-seq, the classifiers yielded stable models with highest classification accuracies on data transformed by Equal frequency binning. The approach presented here is generally applicable to other cancer types for classification and identification of
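
    The combination highlighted above, equal-frequency binning followed by an SVM, can be sketched with scikit-learn's KBinsDiscretizer using the "quantile" strategy. The data below are synthetic stand-ins for expression profiles, so the reported accuracy is meaningless; only the discretize-then-classify pipeline is illustrative.

    ```python
    # Sketch of the discretize-then-classify pipeline on synthetic stand-ins for
    # expression data (labels are random, so the accuracy itself is meaningless).
    import numpy as np
    from sklearn.preprocessing import KBinsDiscretizer
    from sklearn.svm import SVC
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.lognormal(mean=2.0, sigma=1.0, size=(300, 50))   # mock isoform expression values
    y = rng.integers(0, 4, size=300)                          # four mock subtypes

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    binner = KBinsDiscretizer(n_bins=4, encode="ordinal", strategy="quantile")
    X_train_b = binner.fit_transform(X_train)   # equal-frequency bin edges learned on training data
    X_test_b = binner.transform(X_test)         # the same edges applied to the held-out data

    clf = SVC(kernel="linear").fit(X_train_b, y_train)
    print("accuracy:", clf.score(X_test_b, y_test))
    ```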

  16. Enabling IoT ecosystems through platform interoperability

    OpenAIRE

    Bröring, Arne; Schmid, Stefan; Schindhelm, Corina-Kim; Khelil, Abdelmajid; Kabisch, Sebastian; Kramer, Denis; Le Phuoc, Danh; Mitic, Jelena; Anicic, Darko; Teniente López, Ernest

    2017-01-01

    Today, the Internet of Things (IoT) comprises vertically oriented platforms for things. Developers who want to use them need to negotiate access individually and adapt to the platform-specific API and information models. Having to perform these actions for each platform often outweighs the possible gains from adapting applications to multiple platforms. This fragmentation of the IoT and the missing interoperability result in high entry barriers for developers and prevent the emergence of broa...

  17. An Analytic Model for the Success Rate of a Robotic Actuator System in Hitting Random Targets.

    Science.gov (United States)

    Bradley, Stuart

    2015-11-20

    Autonomous robotic systems are increasingly being used in a wide range of applications such as precision agriculture, medicine, and the military. These systems have common features which often include an action by an "actuator" interacting with a target. While simulations and measurements exist for the success rate of hitting targets by some systems, there is a dearth of analytic models which can give insight into, and guidance on the optimization of, new robotic systems. The present paper develops a simple model for estimation of the success rate for hitting random targets from a moving platform. The model has two main dimensionless parameters: the ratio of actuator spacing to target diameter; and the ratio of platform distance moved (between actuator "firings") to the target diameter. It is found that regions of parameter space having specified high success are described by simple equations, providing guidance on design. The role of a "cost function" is introduced which, when minimized, provides optimization of design, operating, and risk mitigation costs.
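
    The paper's closed-form model is not reproduced in the abstract, so the sketch below is only a Monte Carlo illustration of the scenario it describes: firing points on a grid defined by the actuator spacing s and the platform advance d, and circular targets of diameter D placed at random. The geometry and parameter values are assumptions, not the paper's analytic result.

    ```python
    # Monte Carlo illustration only. Firing points lie on a grid with spacing `s`
    # (across the platform) by `d` (platform advance between firings); a target of
    # diameter `D` counts as hit if some firing point falls inside it.
    import numpy as np

    def hit_rate(s, d, D, n_targets=100_000, seed=0):
        rng = np.random.default_rng(seed)
        # By periodicity of the grid it suffices to sample target centres in one cell.
        cx = rng.uniform(0, s, n_targets)
        cy = rng.uniform(0, d, n_targets)
        dx = np.minimum(cx, s - cx)           # distance to nearest firing point, across
        dy = np.minimum(cy, d - cy)           # distance to nearest firing point, along
        return np.mean(np.hypot(dx, dy) <= D / 2)

    # Sweep the two dimensionless ratios named in the abstract: s/D and d/D.
    for ratio in (0.5, 1.0, 2.0):
        print(f"s/D = d/D = {ratio}: estimated success rate = {hit_rate(ratio, ratio, 1.0):.2f}")
    ```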

  18. Demand forecasting and information platform in tourism

    Directory of Open Access Journals (Sweden)

    Li Yue

    2017-05-01

    Information asymmetry and the bullwhip effect have been serious problems in the tourism supply chain. Based on platform theory, this paper established a mathematical model to explore the inner mechanism of a platform’s influence on stakeholders’ ability to forecast demand in tourism. Results showed that the variance of stakeholders’ demand predictions with a platform was smaller than the variance without a platform, which meant that a platform would improve predictions of demand for stakeholders. The higher information-processing ability of the platform also had other effects on demand forecasting. Research on the inner logic of the platform’s influence on stakeholders has important theoretical and realistic value. This area is worthy of further study.

  19. Analytic geometry

    CERN Document Server

    Burdette, A C

    1971-01-01

    Analytic Geometry covers several fundamental aspects of analytic geometry needed for advanced subjects, including calculus.This book is composed of 12 chapters that review the principles, concepts, and analytic proofs of geometric theorems, families of lines, the normal equation of the line, and related matters. Other chapters highlight the application of graphing, foci, directrices, eccentricity, and conic-related topics. The remaining chapters deal with the concept polar and rectangular coordinates, surfaces and curves, and planes.This book will prove useful to undergraduate trigonometric st

  20. IFSA: a microfluidic chip-platform for frit-based immunoassay protocols

    Science.gov (United States)

    Hlawatsch, Nadine; Bangert, Michael; Miethe, Peter; Becker, Holger; Gärtner, Claudia

    2013-03-01

    Point-of-care diagnostics (POC) is one of the key application fields for lab-on-a-chip devices. While in recent years much of the work has concentrated on integrating complex molecular diagnostic assays onto a microfluidic device, there is a need to also put comparatively simple immunoassay-type protocols on a microfluidic platform. In this paper, we present the development of a microfluidic cartridge using an immunofiltration approach. In this method, the sandwich immunoassay takes place in a porous frit on which the antibodies have been immobilized. The device is designed to be able to handle three samples in parallel and up to four analytical targets per sample. In order to meet the critical cost targets for the diagnostic market, the microfluidic chip has been designed and manufactured with high-volume manufacturing technologies in mind. Validation experiments show sensitivities comparable to those of conventional immunofiltration kits.

  1. The VISPA Internet Platform for Students

    Science.gov (United States)

    Asseldonk, D. v.; Erdmann, M.; Fischer, R.; Glaser, C.; Müller, G.; Quast, T.; Rieger, M.; Urban, M.

    2016-04-01

    The VISPA internet platform enables users to remotely run Python scripts and view resulting plots or inspect their output data. With a standard web browser as the only user requirement on the client-side, the system becomes suitable for blended learning approaches for university physics students. VISPA was used in two consecutive years each by approx. 100 third year physics students at the RWTH Aachen University for their homework assignments. For example, in one exercise students gained a deeper understanding of Einstein's mass-energy relation by analyzing experimental data of electron-positron pairs revealing J/Ψ and Z particles. Because the students were free to choose their working hours, only a few users accessed the platform simultaneously. The positive feedback from students and the stability of the platform led to further development of the concept. This year, students accessed the platform in parallel while they analyzed the data recorded by demonstrated experiments live in the lecture hall. The platform is based on experience in the development of professional analysis tools. It combines core technologies from previous projects: an object-oriented C++ library, a modular data-driven analysis flow, and visual analysis steering. We present the platform and discuss its benefits in the context of teaching based on surveys that are conducted each semester.
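
    The exercise mentioned above, revealing the J/Ψ and Z particles from electron-positron pairs, boils down to computing the pair's invariant mass from four-momenta. The sketch below shows that calculation on made-up four-vectors; real assignments would read the provided experimental data instead.

    ```python
    # Invariant mass of an e+e- pair from four-momenta (E, px, py, pz) in GeV;
    # the two four-vectors below are fabricated so the pair lands near the Z mass.
    import numpy as np

    def invariant_mass(p1, p2):
        E, px, py, pz = (p1[i] + p2[i] for i in range(4))
        return np.sqrt(max(E**2 - (px**2 + py**2 + pz**2), 0.0))

    electron = (45.6, 12.3, -40.1, 17.5)    # hypothetical e- four-momentum
    positron = (45.8, -12.0, 40.3, -17.2)   # hypothetical e+ four-momentum
    print(f"m(e+e-) ~ {invariant_mass(electron, positron):.1f} GeV")
    ```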

  2. GeoChronos: An On-line Collaborative Platform for Earth Observation Scientists

    Science.gov (United States)

    Gamon, J. A.; Kiddle, C.; Curry, R.; Markatchev, N.; Zonta-Pastorello, G., Jr.; Rivard, B.; Sanchez-Azofeifa, G. A.; Simmonds, R.; Tan, T.

    2009-12-01

    Recent advances in cyberinfrastructure are offering new solutions to the growing challenges of managing and sharing large data volumes. Web 2.0 and social networking technologies provide the means for scientists to collaborate and share information more effectively. Cloud computing technologies can provide scientists with transparent and on-demand access to applications served over the Internet in a dynamic and scalable manner. Semantic Web technologies allow for data to be linked together in a manner understandable by machines, enabling greater automation. Combining all of these technologies together can enable the creation of very powerful platforms. GeoChronos (http://geochronos.org/), part of a CANARIE Network Enabled Platforms project, is an online collaborative platform that incorporates these technologies to enable members of the earth observation science community to share data and scientific applications and to collaborate more effectively. The GeoChronos portal is built on an open source social networking platform called Elgg. Elgg provides a full set of social networking functionalities similar to Facebook including blogs, tags, media/document sharing, wikis, friends/contacts, groups, discussions, message boards, calendars, status, activity feeds and more. An underlying cloud computing infrastructure enables scientists to access dynamically provisioned applications via the portal for visualizing and analyzing data. Users are able to access and run the applications from any computer that has a Web browser and Internet connectivity and do not need to manage and maintain the applications themselves. Semantic Web technologies, such as the Resource Description Framework (RDF), are being employed for relating and linking together spectral, satellite, meteorological and other data. Social networking functionality plays an integral part in facilitating the sharing of data and applications. Examples of recent GeoChronos users during the early testing phase have
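
    To make the Semantic Web aspect concrete, the sketch below links a dataset to some metadata as RDF triples using rdflib. The namespace, URIs and properties are invented for the example and are not GeoChronos' actual vocabulary.

    ```python
    # Invented vocabulary: link a spectral dataset to metadata and to a related
    # meteorological dataset as RDF triples.
    from rdflib import Graph, Literal, Namespace, URIRef
    from rdflib.namespace import DCTERMS, RDF

    EX = Namespace("http://example.org/geo#")
    g = Graph()
    dataset = URIRef("http://example.org/data/spectra-2009-07")

    g.add((dataset, RDF.type, EX.SpectralDataset))
    g.add((dataset, DCTERMS.description, Literal("Field spectrometer acquisition, site A")))
    g.add((dataset, EX.linkedTo, URIRef("http://example.org/data/meteorology-2009-07")))

    for s, p, o in g:
        print(s, p, o)
    ```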

  3. Analytical synthetic methods of solution of neutron transport equation with diffusion theory approaches energy multigroup

    International Nuclear Information System (INIS)

    Moraes, Pedro Gabriel B.; Leite, Michel C.A.; Barros, Ricardo C.

    2013-01-01

    In this work we developed software to model one-dimensional neutron transport problems in the multigroup energy formulation and to generate results in tables and graphs. The numerical method we use to solve the neutron diffusion problem is analytic, thus eliminating the truncation errors that appear in classical numerical methods, e.g., the method of finite differences. This analytical numerical method increases computational efficiency, since no refined spatial discretization is necessary: for any spatial discretization grid used, the numerical result generated at the same point of the domain remains unchanged, apart from the rounding errors of finite computational arithmetic. We chose to develop the computational application on the MatLab platform; the numerical computation and program interface is simple and easy to use. We consider it important to model this neutron transport problem with a fixed source in the context of shielding calculations of radiation that protect the biosphere, which could be sensitive to ionizing radiation

  4. Understanding Business Analytics

    Science.gov (United States)

    2015-01-05

    analytics have been used in organizations for a variety of reasons for quite some time; ranging from the simple (generating and understanding business analytics...process. How well these two components are orchestrated will determine the level of success an organization has in

  5. Development of a multiplex Luminex assay for detecting swine antibodies to structural and nonstructural proteins of foot-and-mouth disease virus in Taiwan.

    Science.gov (United States)

    Chen, Tsu-Han; Lee, Fan; Lin, Yeou-Liang; Pan, Chu-Hsiang; Shih, Chia-Ni; Tseng, Chun-Hsien; Tsai, Hsiang-Jung

    2016-04-01

    Foot-and-mouth disease (FMD) and swine vesicular disease (SVD) are serious vesicular diseases that have devastated swine populations throughout the world. The aim of this study was to develop a multianalyte profiling (xMAP) Luminex assay for the differential detection of antibodies to the FMD virus of structural proteins (SP) and nonstructural proteins (NSP). After the xMAP was optimized, it detected antibodies to SP-VP1 and NSP-3ABC of the FMD virus in a single serum sample. These tests were also compared with 3ABC polypeptide blocking enzyme-linked immunosorbent assay (ELISA) and virus neutralization test (VNT) methods for the differential diagnosis and assessment of immune status, respectively. To detect SP antibodies in 661 sera from infected naïve pigs and vaccinated pigs, the diagnostic sensitivity (DSn) and diagnostic specificity (DSp) of the xMAP were 90.0-98.7% and 93.0-96.5%, respectively. To detect NSP antibodies, the DSn was 90% and the DSp ranged from 93.3% to 99.1%. The xMAP can detect the immune response to SP and NSP as early as 4 days postinfection and 8 days postinfection, respectively. Furthermore, the SP and NSP antibodies in all 15 vaccinated but unprotected pigs were detected by xMAP. A comparison of SP and NSP antibodies detected in the sera of the infected samples indicated that the results from the xMAP had a high positive correlation with results from the VNT and a 3ABC polypeptide blocking ELISA assay. However, simultaneous quantitation detected that xMAP had no relationship with the VNT. Furthermore, the specificity was 93.3-94.9% with 3ABC polypeptide blocking ELISA for the FMDV-NSP antibody. The results indicated that xMAP has the potential to detect antibodies to FMDV-SP-VP1 and NSP-3ABC and to distinguish FMDV-infected pigs from pigs infected with the swine vesicular disease virus. Copyright © 2014. Published by Elsevier B.V.
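
    Diagnostic sensitivity (DSn) and specificity (DSp) figures such as those reported above follow from a simple 2x2 comparison of assay results against the reference infection/vaccination status. The sketch below shows the calculation on hypothetical counts, not the study's data.

    ```python
    # Hypothetical counts: sensitivity and specificity from a 2x2 comparison
    # of assay results against the reference status.
    def dsn_dsp(tp, fn, tn, fp):
        dsn = tp / (tp + fn)   # proportion of truly positive sera that test positive
        dsp = tn / (tn + fp)   # proportion of truly negative sera that test negative
        return dsn, dsp

    dsn, dsp = dsn_dsp(tp=178, fn=12, tn=441, fp=30)
    print(f"DSn = {dsn:.1%}, DSp = {dsp:.1%}")
    ```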

  6. Launching platforms for user-generated content

    OpenAIRE

    Batista, Guilherme Luís Caroço

    2015-01-01

    Field lab: Entrepreneurial and innovative ventures. This paper intends to discuss and absorb the Best Practices employed by successful User-Generated Content (UGC) platforms and constitute a guide on how to launch a platform without having a cyclical lack of content and users. Research shows that companies have resorted to integration with mature UGC platforms, and providing content by themselves, in an initial state. I conclude that integration possibilities should be explore...

  7. tranSMART: An Open Source and Community-Driven Informatics and Data Sharing Platform for Clinical and Translational Research.

    Science.gov (United States)

    Athey, Brian D; Braxenthaler, Michael; Haas, Magali; Guo, Yike

    2013-01-01

    tranSMART is an emerging global open source public private partnership community developing a comprehensive informatics-based analysis and data-sharing cloud platform for clinical and translational research. The tranSMART consortium includes pharmaceutical and other companies, not-for-profits, academic entities, patient advocacy groups, and government stakeholders. The tranSMART value proposition relies on the concept that the global community of users, developers, and stakeholders are the best source of innovation for applications and for useful data. Continued development and use of the tranSMART platform will create a means to enable "pre-competitive" data sharing broadly, saving money and potentially accelerating research translation to cures. Significant transformative effects of tranSMART include 1) allowing all of its user community to benefit from experts globally, 2) capturing the best of innovation in analytic tools, 3) a growing 'big data' resource, 4) convergent standards, and 5) new informatics-enabled translational science in the pharma, academic, and not-for-profit sectors.

  8. Integrated Spintronic Platforms for Biomolecular Recognition Detection

    Science.gov (United States)

    Martins, V. C.; Cardoso, F. A.; Loureiro, J.; Mercier, M.; Germano, J.; Cardoso, S.; Ferreira, R.; Fonseca, L. P.; Sousa, L.; Piedade, M. S.; Freitas, P. P.

    2008-06-01

    This paper covers recent developments in magnetoresistive-based biochip platforms fabricated at INESC-MN, and their application to the detection and quantification of pathogenic waterborne microorganisms in water samples for human consumption. Such platforms are intended to respond to the increasing concern related to microbially contaminated water sources. The presented results concern the development of biologically active DNA chips and protein chips and the demonstration of the detection capability of the present platforms. Two platforms are described, one including spintronic sensors only (spin-valve based or magnetic tunnel junction based), and the other, a fully scalable platform where each probe site consists of an MTJ in series with a thin film diode (TFD). Two microfluidic systems are described, for cell separation and concentration, and finally, the read-out and control integrated electronics are described, allowing the realization of bioassays with a portable point-of-care unit.

  9. Analyticity without Differentiability

    Science.gov (United States)

    Kirillova, Evgenia; Spindler, Karlheinz

    2008-01-01

    In this article we derive all salient properties of analytic functions, including the analytic version of the inverse function theorem, using only the most elementary convergence properties of series. Not even the notion of differentiability is required to do so. Instead, analytical arguments are replaced by combinatorial arguments exhibiting…

  10. Microneedle Platforms for Cell Analysis

    KAUST Repository

    Kavaldzhiev, Mincho

    2017-11-01

    Micro-needle platforms are the core components of many recent drug delivery and gene-editing techniques, which allow for intracellular access, controlled cell membrane stress or mechanical trapping of the nucleus. This dissertation work is devoted to the development of micro-needle platforms that offer customized fabrication and new capabilities for enhanced cell analyses. The highest degree of geometrical flexibility is achieved with 3D printed micro-needles, which enable optimizing the topographical stress environment for cells and cell populations of any size. A fabrication process for 3D-printed micro-needles has been developed as well as a metal coating technique based on standard sputter deposition. This extends the functionalities of the platforms by electrical as well as magnetic features. The micro-needles have been tested on human colon cancer cells (HCT116), showing a high degree of biocompatibility of the platform. Moreover, the capabilities of the 3D-printed micro-needles have been explored for drug delivery via the well-established electroporation technique, by coating the micro-needles with gold. Antibodies and fluorescent dyes have been delivered to HCT116 cells and human embryonic kidney cells with a very high transfection rate up to 90%. In addition, the 3D-printed electroporation platform enables delivery of molecules to suspended cells or adherent cells, with or without electroporation buffer solution, and at ultra-low voltages of 2V. In order to provide a micro-needle platform that exploits existing methods for mass fabrication a custom designed template-based process has been developed. It has been used for the production of gold, iron, nickel and poly-pyrrole micro-needles on silicon and glass substrates. A novel delivery method is introduced that activates the micro-needles by electromagnetic induction, which enables to wirelessly gain intracellular access. The method has been successfully tested on HCT116 cells in culture, where a time

  11. A survey of IoT cloud platforms

    Directory of Open Access Journals (Sweden)

    Partha Pratim Ray

    2016-12-01

    Internet of Things (IoT) envisages the overall merging of several "things" while utilizing the internet as the backbone of the communication system to establish a smart interaction between people and surrounding objects. Cloud, being the crucial component of IoT, provides valuable application-specific services in many application domains. A number of IoT cloud providers are currently emerging into the market to leverage suitable and specific IoT-based services. In spite of the huge possible involvement of these IoT clouds, no standard cum comparative analytical study has been found across the literature databases. This article surveys popular IoT cloud platforms in light of solving several service domains such as application development, device management, system management, heterogeneity management, data management, tools for analysis, deployment, monitoring, visualization, and research. A comparison is presented for the overall dissemination of IoT clouds according to their applicability. Further, a few challenges are also described that researchers should take on in the near future. Ultimately, the goal of this article is to provide detailed knowledge about the existing IoT cloud service providers and their pros and cons in concrete form.

  12. Disentangling Competition Among Platform Driven Strategic Groups

    DEFF Research Database (Denmark)

    Kazan, Erol; Tan, Chee-Wee; Lim, Eric T. K.

    2018-01-01

    for the mobile payment market in the United Kingdom as our empirical setting. By conceptualizing digital platforms as layered modular architectures and embracing the theoretical lens of strategic groups, this study supplements prior research by deriving a taxonomy of platform profiles that is grounded...... delivery architecture. The preceding attributes of value creation architecture and value delivery architecture aided us in identifying six profiles associated with mobile payment platforms, which in turn led us to advance three competitive strategies that could be pursued by digital platforms in network...

  13. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  14. Diamond nanoparticles as a way to improve electron transfer in sol–gel L-lactate biosensing platforms

    Energy Technology Data Exchange (ETDEWEB)

    Briones, M.; Casero, E. [Departamento de Química Analítica y Análisis Instrumental, Facultad de Ciencias, c/Francisco Tomás y Valiente, No7, Campus de Excelencia de la Universidad Autónoma de Madrid, 28049 Madrid (Spain); Vázquez, L. [Instituto de Ciencia de Materiales de Madrid (CSIC), c/Sor Juana Inés de la Cruz No3, Campus de Excelencia de la Universidad Autónoma de Madrid, 28049 Madrid (Spain); Pariente, F.; Lorenzo, E. [Departamento de Química Analítica y Análisis Instrumental, Facultad de Ciencias, c/Francisco Tomás y Valiente, No7, Campus de Excelencia de la Universidad Autónoma de Madrid, 28049 Madrid (Spain); Petit-Domínguez, M.D., E-mail: mdolores.petit@uam.es [Departamento de Química Analítica y Análisis Instrumental, Facultad de Ciencias, c/Francisco Tomás y Valiente, No7, Campus de Excelencia de la Universidad Autónoma de Madrid, 28049 Madrid (Spain)

    2016-02-18

    In the present work, we have included for the first time diamond nanoparticles (DNPs) in a sol–gel matrix derived from (3-mercaptopropyl)-trimethoxysilane (MPTS) in order to improve electron transfer in a lactate oxidase (LOx) based electrochemical biosensing platform. Firstly, an exhaustive AFM study, including topographical, surface potential (KFM) and capacitance gradient (CG) measurements, of each step involved in the biosensing platform development was performed. The platform is based on gold electrodes (Au) modified with the sol–gel matrix (Au/MPTS) in which diamond nanoparticles (Au/MPTS/DNPs) and lactate oxidase (Au/MPTS/DNPs/LOx) have been included. For the sake of comparison, we have also characterized a gold electrode directly modified with DNPs (Au/DNPs). Secondly, the electrochemical behavior of a redox mediator (hydroxymethyl-ferrocene, HMF) was evaluated at the platforms mentioned above. The response of Au/MPTS/DNPs/LOx towards lactate was obtained. A linear concentration range from 0.053 mM to 1.6 mM, a sensitivity of 2.6 μA mM⁻¹ and a detection limit of 16 μM were obtained. These analytical properties are comparable to those of other biosensors, with the added advantages that DNPs are inexpensive, environment-friendly and easy-to-handle nanomaterials. Finally, the developed biosensor was applied for lactate determination in wine samples. - Highlights: • We have included for the first time diamond nanoparticles (DNPs) in a sol–gel matrix for developing lactate biosensors. • DNPs facilitate electron-transfer within the sol–gel network in electrochemical biosensors. • Lactate biosensors show good sensitivity, detection limit, reproducibility and stability.
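
    With the reported sensitivity of 2.6 μA mM⁻¹, a measured steady-state current can be converted back to a lactate concentration, provided it falls inside the linear range. The sketch below assumes a zero intercept and an arbitrary example current, purely for illustration.

    ```python
    # Reported calibration: 2.6 uA per mM; intercept assumed zero for illustration.
    SENSITIVITY_UA_PER_MM = 2.6
    LINEAR_RANGE_MM = (0.053, 1.6)

    def lactate_mM(current_uA, blank_uA=0.0):
        conc = (current_uA - blank_uA) / SENSITIVITY_UA_PER_MM
        within_range = LINEAR_RANGE_MM[0] <= conc <= LINEAR_RANGE_MM[1]
        return conc, within_range

    conc, ok = lactate_mM(current_uA=1.3)   # example current, not a measured value
    print(f"~{conc:.2f} mM lactate (inside linear range: {ok})")
    ```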

  15. A Low-Cost, Simplified Platform of Interchangeable, Ambient Ionization Sources for Rapid, Forensic Evidence Screening on Portable Mass Spectrometric Instrumentation

    Directory of Open Access Journals (Sweden)

    Patrick W. Fedick

    2018-03-01

    Full Text Available Portable mass spectrometers (MS) are becoming more prevalent due to improved instrumentation, commercialization, and the robustness of new ionization methodologies. To increase utility towards diverse field-based applications, there is an inherent need for rugged ionization source platforms that are simple, yet robust towards analytical scenarios that may arise. Ambient ionization methodologies have evolved to target specific real-world problems and fulfill requirements of the analysis at hand. Ambient ionization techniques continue to advance towards higher performance, with specific sources showing variable proficiency depending on application area. To realize the full potential and applicability of ambient ionization methods, a selection of sources may be more prudent, showing a need for a low-cost, flexible ionization source platform. This manuscript describes a centralized system that was developed for portable MS systems that incorporates modular, rapidly-interchangeable ionization sources comprised of low-cost, commercially-available parts. Herein, design considerations are reported for a suite of ambient ionization sources that can be crafted with minimal machining or customization. Representative spectral data are included to demonstrate applicability towards field processing of forensic evidence. While this platform is demonstrated on portable instrumentation, retrofitting to lab-scale MS systems is anticipated.

  16. All-soft, battery-free, and wireless chemical sensing platform based on liquid metal for liquid- and gas-phase VOC detection.

    Science.gov (United States)

    Kim, Min-Gu; Alrowais, Hommood; Kim, Choongsoon; Yeon, Pyungwoo; Ghovanloo, Maysam; Brand, Oliver

    2017-06-27

    Lightweight, flexible, stretchable, and wireless sensing platforms have gained significant attention for personal healthcare and environmental monitoring applications. This paper introduces an all-soft (flexible and stretchable), battery-free, and wireless chemical microsystem using gallium-based liquid metal (eutectic gallium-indium alloy, EGaIn) and poly(dimethylsiloxane) (PDMS), fabricated using an advanced liquid metal thin-line patterning technique based on soft lithography. Considering its flexible, stretchable, and lightweight characteristics, the proposed sensing platform is well suited for wearable sensing applications either on the skin or on clothing. Using the microfluidic sensing platform, detection of liquid-phase and gas-phase volatile organic compounds (VOC) is demonstrated using the same design, which gives an opportunity to have the sensor operate under different working conditions and environments. In the case of liquid-phase chemical sensing, the wireless sensing performance and microfluidic capacitance tunability for different dielectric liquids are evaluated using analytical, numerical, and experimental approaches. In the case of gas-phase chemical sensing, PDMS is used both as a substrate and a sensing material. The gas sensing performance is evaluated and compared to a silicon-based, solid-state gas sensor with a PDMS sensing film.

  17. Analysis of offshore platforms lifting with fixed pile structure type (fixed platform) based on ASD89

    Science.gov (United States)

    Sugianto, Agus; Indriani, Andi Marini

    2017-11-01

    The GTS (Gathering Testing Satellite) platform is an offshore platform of the fixed pile structure type (fixed platform) that supports petroleum exploitation. After fabrication, the platform is moved onto barges and then shipped to the installation site. The moving process is generally done by pulling or pushing, according to the construction design determined during planning. However, when lifting equipment (cranes) is available in the work area, the move can instead be done by lifting, so that the operation can be completed more quickly. This study analyzes moving the GTS platform by lifting, which differs from the way such platforms are usually handled; the key question is what structural reinforcement is required so that the platform can be lifted safely. Working stresses arising from the lifting operation were analyzed and checked against the AISC code using the SAP2000 structural analysis program. The results showed that, in its existing condition, the platform cannot be moved by lifting because the stress ratio exceeds the maximum allowable value of 0.950 (AISC-ASD89). Overstress occurs in members 295 and 324, with stress ratios of 0.97 and 0.95, so structural reinforcement is required. Applying box plates to both members reduces the stress ratios to 0.78 for member 295 and 0.77 for member 324. These results indicate that, with this reinforcement, the construction qualifies to be moved by lifting.
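
    For context (a hedged sketch: the exact interaction equation governing each member check depends on the load case and is not given in the abstract), the "stress ratio" in an AISC-ASD89 check is a unity ratio of combined axial and bending demand to the allowable stresses, of the general form

    $$UR = \frac{f_a}{F_a} + \frac{f_{bx}}{F_{bx}} + \frac{f_{by}}{F_{by}} \le 1.0,$$

    where f and F denote computed and allowable stresses for axial (a) and bending (bx, by) actions. With the project limit set at 0.950, members 295 and 324 at ratios of 0.97 and 0.95 exceed the criterion before reinforcement, while the box-plate reinforcement brings them down to 0.78 and 0.77.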

  18. 21 CFR 890.3940 - Wheelchair platform scale.

    Science.gov (United States)

    2010-04-01

    21 CFR 890.3940 — Wheelchair platform scale. (a) Identification. A wheelchair platform scale is a device with a base designed to...

  19. Assessments of high-consequence platforms: Issues and applications

    International Nuclear Information System (INIS)

    Digre, K.A.; Craig, M.J.K.

    1994-01-01

    An API task group has developed a process for the assessment of existing platforms to determine their fitness for purpose. This has been released as a draft supplement to API RP 2A-WSD, 20th edition. Details and the background of this work are described in a companion paper. The assessment of a platform's fitness for purpose involves firstly a definition of the platform's exposure; and secondly, an evaluation of the platform's predicted performance relative to the assessment criteria associated with that exposure. This paper deals with platforms in the high exposure category. That is, platforms whose potential failure consequences, in terms of potential life loss and environmental damage, are significant. The criteria for placement of a platform in a high exposure category are explained, as are the performance criteria demanded of these high exposure platforms. In the companion paper, the metocean assessment process and associated API-developed acceptance criteria are highlighted. This paper addresses primarily ice and seismic loading assessments and associated API-developed criteria, which are based on over thirty years of successful offshore operation and field experience, as well as extrapolation of land-based performance criteria. Three West Coast, USA production platforms are used for illustration

  20. Experience using EPICS on PC platforms

    International Nuclear Information System (INIS)

    Hill, J.O.; Kasemire, K.U.

    1997-03-01

    The Experimental Physics and Industrial Control System (EPICS) has been widely adopted in the accelerator community. Although EPICS is available on many platforms, the majority of implementations have used UNIX workstations as clients, and VME- or VXI-based processors for distributed input output controllers. Recently, a significant portion of EPICS has been ported to personal computer (PC) hardware platforms running Microsoft's operating systems, and also Wind River System's real time vxWorks operating system. This development should significantly reduce the cost of deploying EPICS systems, and the prospect of using EPICS together with the many high quality commercial components available for PC platforms is also encouraging. A hybrid system using both PC and traditional platforms is currently being implemented at LANL for LEDA, the low energy demonstration accelerator under construction as part of the Accelerator Production of Tritium (APT) project. To illustrate these developments the authors compare their recent experience deploying a PC-based EPICS system with experience deploying similar systems based on traditional (UNIX-hosted) EPICS hardware and software platforms

  1. Near Real-time Scientific Data Analysis and Visualization with the ArcGIS Platform

    Science.gov (United States)

    Shrestha, S. R.; Viswambharan, V.; Doshi, A.

    2017-12-01

    Scientific multidimensional data are generated from a variety of sources and platforms. These datasets are mostly produced by earth observation and/or modeling systems. Agencies like NASA, NOAA, USGS, and ESA produce large volumes of near real-time observation, forecast, and historical data that drive fundamental research and its applications in larger aspects of humanity, from basic decision making to disaster response. A common big data challenge for organizations working with multidimensional scientific data and imagery collections is the time and resources required to manage and process such large volumes and varieties of data. The challenge of adopting data-driven real-time visualization and analysis, as well as the need to share these large datasets, workflows, and information products with wider and more diverse communities, brings an opportunity to use the ArcGIS platform to handle such demand. In recent years, a significant effort has been put into expanding the capabilities of ArcGIS to support multidimensional scientific data across the platform. New capabilities in ArcGIS to support scientific data management, processing, and analysis, as well as creating information products from large volumes of data using the image server technology, are becoming widely used in earth science and across other domains. We will discuss and share the challenges associated with big data by the geospatial science community and how we have addressed these challenges in the ArcGIS platform. We will share a few use cases, such as NOAA High-Resolution Rapid Refresh (HRRR) data, that demonstrate how we access large collections of near real-time data (that are stored on-premise or on the cloud), disseminate them dynamically, process and analyze them on-the-fly, and serve them to a variety of geospatial applications. We will also share how on-the-fly processing using raster functions capabilities can be extended to create persisted data and information products using raster analytics.

  2. An Exploratory Study to Assess Analytical and Logical Thinking Skills of the Software Practitioners using a Gamification Perspective

    Directory of Open Access Journals (Sweden)

    Şahin KAYALI

    2016-12-01

    Full Text Available The link between analytical and logical thinking skills and the success of software practitioners has attracted increasing attention in the last decade. Several studies report that the ability to think logically, which underpins strong reasoning, is a requirement for improving software development skills. Analytical thinking is likewise a vital part of software development, for example when dividing a task into elemental parts according to basic rules and principles. Using the basic essence of gamification, this study proposes a mobile testing platform for assessing the analytical and logical thinking skills of software practitioners as well as computer engineering students. The assessment questions were taken from the literature and transformed into a gamified tool based on the software requirements. A focus group study was conducted to capture the requirements. Using the Delphi method, these requirements were discussed by a group of experts to reach a multidisciplinary understanding, with a moderate level of agreement achieved. In light of these requirements, an assessment tool was developed and tested on both software practitioners from industry and senior computer engineering students. Our results suggest that individuals who exhibit skills in analytical and logical thinking are also more inclined to be successful in software development.

  3. Current status and future perspectives on molecular and serological methods in diagnostic mycology.

    Science.gov (United States)

    Lau, Anna; Chen, Sharon; Sleiman, Sue; Sorrell, Tania

    2009-11-01

    Invasive fungal infections are an important cause of infectious morbidity. Nonculture-based methods are increasingly used for rapid, accurate diagnosis to improve patient outcomes. New and existing DNA amplification platforms have high sensitivity and specificity for direct detection and identification of fungi in clinical specimens. Since laboratories are increasingly reliant on DNA sequencing for fungal identification, measures to improve sequence interpretation should support validation of reference isolates and quality control in public gene repositories. Novel technologies (e.g., isothermal and PNA FISH methods), platforms enabling high-throughput analyses (e.g., DNA microarrays and Luminex xMAP) and/or commercial PCR assays warrant further evaluation for routine diagnostic use. Notwithstanding the advantages of molecular tests, serological assays remain clinically useful for patient management. The serum Aspergillus galactomannan test has been incorporated into diagnostic algorithms of invasive aspergillosis. Both the galactomannan and the serum beta-D-glucan test have value for diagnosing infection and monitoring therapeutic response.

  4. Genesis and Evolution of Digital Payment Platforms

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    Payment transactions through the use of physical coins, bank notes or credit cards have for centuries been the standard formats of exchanging money. Recently, online and mobile digital payment platforms have entered the stage as contenders to this position and possibly could penetrate societies...... thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper...... applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulates a multilevel conceptual......

  5. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  6. Quantitative high throughput analytics to support polysaccharide production process development.

    Science.gov (United States)

    Noyes, Aaron; Godavarti, Ranga; Titchener-Hooker, Nigel; Coffman, Jonathan; Mukhopadhyay, Tarit

    2014-05-19

    The rapid development of purification processes for polysaccharide vaccines is constrained by a lack of analytical tools: current technologies for the measurement of polysaccharide recovery and process-related impurity clearance are complex, time-consuming, and generally not amenable to high throughput process development (HTPD). HTPD is envisioned to be central to the improvement of existing polysaccharide manufacturing processes through the identification of critical process parameters that potentially impact the quality attributes of the vaccine and to the development of de novo processes for clinical candidates, across the spectrum of downstream processing. The availability of a fast and automated analytics platform will expand the scope, robustness, and evolution of Design of Experiment (DOE) studies. This paper details recent advances in improving the speed, throughput, and success of in-process analytics at the micro-scale. Two methods, based on modifications of existing procedures, are described for the rapid measurement of polysaccharide titre in microplates without the need for heating steps. A simplification of a commercial endotoxin assay is also described that features a single measurement at room temperature. These assays, along with existing assays for protein and nucleic acids, are qualified for deployment in the high throughput screening of polysaccharide feedstreams. Assay accuracy, precision, robustness, interference, and ease of use are assessed and described. In combination, these assays are capable of measuring the product concentration and impurity profile of a microplate of 96 samples in less than one day. This body of work relies on the evaluation of a combination of commercially available and clinically relevant polysaccharides to ensure maximum versatility and reactivity of the final assay suite. Together, these advancements reduce overall process time by up to 30-fold and significantly reduce sample volume over current practices. The

  7. Advances in the development of the Mexican platform for analysis and design of nuclear reactors: AZTLAN Platform

    International Nuclear Information System (INIS)

    Gomez T, A. M.; Puente E, F.; Del Valle G, E.; Francois L, J. L.; Espinosa P, G.

    2017-09-01

    The AZTLAN platform project: development of a Mexican platform for the analysis and design of nuclear reactors, financed by the SENER-CONACYT Energy Sustainability Fund, was approved in early 2014 and formally began at the end of that year. It is a national project led by the Instituto Nacional de Investigaciones Nucleares (ININ), with the collaboration of the Instituto Politecnico Nacional (IPN), the Universidad Autonoma Metropolitana (UAM) and the Universidad Nacional Autonoma de Mexico (UNAM) as part of the development team, and with the participation of the Laguna Verde Nuclear Power Plant, the National Commission of Nuclear Safety and Safeguards, the Ministry of Energy and the Karlsruhe Institute of Technology (KIT, Germany) as part of the user group. The general objective of the project is to modernize, improve and integrate the neutronic, thermo-hydraulic and thermo-mechanical codes developed in Mexican institutions into an integrated platform, developed and maintained by Mexican experts for the benefit of Mexican institutions. Two years into the project, important steps have been taken that have consolidated the platform. The main results of these first two years have been presented in different national and international forums. In this congress, some of the most recent results that have been implemented in the platform codes are shown in more detail. The current status of the platform, from a more executive viewpoint, is summarized in this paper. (Author)

  8. Determination of High-affinity Antibody-antigen Binding Kinetics Using Four Biosensor Platforms.

    Science.gov (United States)

    Yang, Danlin; Singh, Ajit; Wu, Helen; Kroe-Barrett, Rachel

    2017-04-17

    Label-free optical biosensors are powerful tools in drug discovery for the characterization of biomolecular interactions. In this study, we describe the use of four routinely used biosensor platforms in our laboratory to evaluate the binding affinity and kinetics of ten high-affinity monoclonal antibodies (mAbs) against human proprotein convertase subtilisin kexin type 9 (PCSK9). While both Biacore T100 and ProteOn XPR36 are derived from the well-established Surface Plasmon Resonance (SPR) technology, the former has four flow cells connected by serial flow configuration, whereas the latter presents 36 reaction spots in parallel through an improvised 6 x 6 crisscross microfluidic channel configuration. The IBIS MX96 also operates based on the SPR sensor technology, with an additional imaging feature that provides detection in spatial orientation. This detection technique coupled with the Continuous Flow Microspotter (CFM) expands the throughput significantly by enabling multiplex array printing and detection of 96 reaction spots simultaneously. In contrast, the Octet RED384 is based on the BioLayer Interferometry (BLI) optical principle, with fiber-optic probes acting as the biosensor to detect interference pattern changes upon binding interactions at the tip surface. Unlike the SPR-based platforms, the BLI system does not rely on continuous flow fluidics; instead, the sensor tips collect readings while they are immersed in analyte solutions of a 384-well microplate during orbital agitation. Each of these biosensor platforms has its own advantages and disadvantages. To provide a direct comparison of these instruments' ability to provide quality kinetic data, the described protocols illustrate experiments that use the same assay format and the same high-quality reagents to characterize antibody-antigen kinetics that fit the simple 1:1 molecular interaction model.
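
    For reference, the "simple 1:1 molecular interaction model" fitted on all four platforms is the standard Langmuir binding scheme (written here in generic biosensor notation, not in any vendor's specific software parameterization):

    $$\frac{dR}{dt} = k_a\,C\,(R_{\max} - R) - k_d\,R,\qquad K_D = \frac{k_d}{k_a},$$

    where R is the biosensor response, R_max the maximum binding capacity of the surface, C the analyte concentration, and k_a and k_d the association and dissociation rate constants, whose ratio gives the equilibrium dissociation constant K_D.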

  9. Reverse engineering of the robot base platform

    International Nuclear Information System (INIS)

    Anwar A Rahman; Azizul Rahman A Aziz; Mohd Arif Hamzah; Muhd Nor Atan; Fadil Ismail; Rosli Darmawan

    2009-01-01

    The robot base platform used to place the robotic arm version 2 was imported through a local company. The robot base platform is used as a reference for reverse engineering development of a smaller-sized robot. The paper will discuss the reverse engineering design process and the parameters involved in the development of the robot base platform. (Author)

  10. Using decommissioned offshore oil/gas platforms for nuclear/RO desalination: the ONDP (Offshore Nuclear Desalination Platform)

    International Nuclear Information System (INIS)

    Nagar, Ankesh

    2010-01-01

    Oil platforms are manmade concrete and steel giant structures standing on the ocean floor, weighing anywhere between 10,000 and 150,000 tonnes or more, designed to withstand the harsh forces of nature and having an average life of 70 years. With petroleum reserves declining over the next 30 years, hundreds of platforms will be scheduled for decommissioning. This issue is a hot topic as oil companies tussle with environmentalists and state lawmakers over their future. The cash-strapped oil companies have a legal obligation to remove each rig entirely, returning the ocean floor to its original condition. Lean times in the oil industry mean a tight cash flow. Safely removing massive structures from deep waters and shipping the pile to shore for reuse and recycling presents a technological challenge for operators. Some conceptual reuse applications that have been investigated are the conversion of offshore structures into fish farms, prisons, military outposts, hotels, bases for search and rescue operations, or centers for waste processing and disposal. Decommissioning oil and gas installations is exorbitantly expensive. On average, removing a complete platform, with or without its pipeline, under a 'clean sea' approach costs $15 million to $6 billion depending on location. Global warming has adversely affected the world climate, and water levels in the ground and in reservoirs have decreased drastically. In the future there will be a need for more and more water all over the world. Desalination based on fossil-fuel energy is expensive and not eco-friendly, and so is dismantling an oil platform along with its pipeline. Oil platforms are located far from population centres, and have sufficient tank capacity and pipeline infrastructure to store and pump water to shore. When found economically unviable for their original purpose, these mammoth structures can, with modifications, be fitted with two or more small or medium sized nuclear reactors such as the KLT-40S, with the required modules to desalinate water and cogenerate electricity, which can be sent to

  11. An analytical drain current model for symmetric double-gate MOSFETs

    Science.gov (United States)

    Yu, Fei; Huang, Gongyi; Lin, Wei; Xu, Chuanzhong

    2018-04-01

    An analytical surface-potential-based drain current model of symmetric double-gate (sDG) MOSFETs is described as a SPICE-compatible model in this paper. The continuous surface and central potentials from the accumulation to the strong inversion regions are solved from the 1-D Poisson's equation in sDG MOSFETs. Furthermore, the drain current is derived from the charge sheet model as a function of the surface potential. Over a wide range of terminal voltages, doping concentrations, and device geometries, the surface potential calculation scheme and drain current model are verified by solving the 1-D Poisson's equation based on the least square method and using the Silvaco Atlas simulation results and experimental data, respectively. Such a model can be adopted as a useful platform to develop circuit simulators and provides a clear understanding of sDG MOSFET device physics.
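
    To make the underlying relations concrete (a sketch under stated assumptions: the generic undoped-body Poisson equation and charge-sheet current integral shown here are the textbook forms, and the paper's specific formulation may differ in detail), the model rests on equations of the type

    $$\frac{d^2\psi}{dx^2} = \frac{q\,n_i}{\varepsilon_{\mathrm{Si}}}\exp\!\left(\frac{\psi - V_{ch}}{V_T}\right),\qquad I_{DS} = \mu\,\frac{W}{L}\int_{0}^{V_{DS}} Q_{inv}(V_{ch})\,dV_{ch},$$

    where ψ is the electrostatic potential across the silicon film, V_ch the channel quasi-Fermi potential, V_T = kT/q the thermal voltage, and Q_inv the inversion charge density obtained from the surface and central potentials.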

  12. 'Good' platform-political reasons for 'bad' platform-data. Zur sozio-technischen Geschichte der Plattformaktivitäten Fav, Retweet und Like

    NARCIS (Netherlands)

    Paßmann, J.; Gerlitz, C.

    2014-01-01

    In this article, we explore the relation between platform activities and their usage practices. Taking departure from predefined activities offered by social media platforms, this paper inquires into what may happen if platform features cater to opposing user practices. The paper investigates

  13. Functional-analytical capabilities of GIS technology in the study of water use risks

    International Nuclear Information System (INIS)

    Nevidimova, O G; Yankovich, E P; Yankovich, K S

    2015-01-01

    Regional security aspects of economic activities are of great importance for legal regulation in environmental management. This has become a critical issue due to climate change, especially in regions where severe climate conditions have a great impact on almost all types of natural resource use. A detailed analysis of the climate and hydrological situation in Tomsk Oblast with respect to water use risks was carried out. Based on techniques developed by the authors, an informational and analytical database was created using the ArcGIS software platform, combining statistical (quantitative) and spatial characteristics of natural hazards and socio-economic factors. This system was employed to perform areal zoning according to the degree of water use risk involved.

  14. VOLTTRON Lite: Integration Platform for the Transactional Network

    Energy Technology Data Exchange (ETDEWEB)

    Haack, Jereme N.; Katipamula, Srinivas; Akyol, Bora A.; Lutes, Robert G.

    2013-10-31

    In FY13, Pacific Northwest National Laboratory (PNNL) with funding from the Department of Energy’s (DOE’s) Building Technologies Office (BTO) designed, prototyped and tested a transactional network platform. The platform is intended to support energy, operational and financial transactions between any networked entities (equipment, organizations, buildings, grid, etc.). Initially, in FY13, the concept demonstrated transactions between packaged rooftop units (RTUs) and the electric grid using applications or “agents” that reside on the platform, on the equipment, on local building controller or in the Cloud. This document describes the core of the transactional network platform, the Volttron Lite™ software and associated services hosted on the platform. Future enhancements are also discussed. The appendix of the document provides examples of how to use the various services hosted on the platform.

  15. Business and IT Capabilities for Cloud Platform Success

    DEFF Research Database (Denmark)

    Hahn, Christopher; Huntgeburth, Jan; Winkler, Till J.

    2016-01-01

    The growing proliferation of cloud platform ecosystems demands a deeper understanding of the capabilities that help existing and emerging platform providers to be successful by creating and appropriating value. This multiple case study of four cloud platform providers (three large, one SME......) instantiates Rai and Tang’s (2014) framework of dyadic IT and network IT capabilities for a cloud platform context and extends it by exploring previously undertheorized cloud platform business capabilities. We further build on this extended framework by employing a configurational perspective to elucidate...... the complementary role of the three proposed business capabilities (incentives and rules, ecosystem marketing and sales, partner development and support) for relevant value creation and appropriation mechanisms. In addition to providing a capability framework catered to the cloud platform context, our findings...

  16. In-vitro nanodiagnostic platform through nanoparticles and DNA-RNA nanotechnology.

    Science.gov (United States)

    Chan, Ki; Ng, Tzi Bun

    2015-04-01

    Nanocomposites containing nanoparticles or nanostructured domains exhibit an even higher degree of material complexity that leads to an extremely high variability of nanostructured materials. This review introduces analytical concepts and techniques for nanomaterials and derives recommendations for a qualified selection of characterization techniques for specific types of samples, focusing on the characterization of nanoparticles and their agglomerates or aggregates. In addition, DNA nanotechnology and the more recent newcomer RNA nanotechnology have nearly attained an advanced status among nanotechnology researchers; therefore, the core features, potential, and significant challenges of DNA nanotechnology are also highlighted as a new discipline. Moreover, nanobiochips made from nanomaterials are rapidly emerging as a new paradigm in the area of large-scale biochemical analysis. The use of nanoscale components enables higher precision in diagnostics while considerably reducing the cost of the platform, which leads this review to explore the use of nanoparticles, nanomaterials, and other bionanotechnologies for their application to in-vitro nanodiagnostics.

  17. LOOS: an extensible platform for the structural analysis of simulations.

    Science.gov (United States)

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++ and makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.

  18. The Monarch Initiative: an integrative data and analytic platform connecting phenotypes to genotypes across species

    International Nuclear Information System (INIS)

    Mungall, Christopher J.; McMurry, Julie A.; Köhler, Sebastian; Balhoff, James P.; Borromeo, Charles

    2016-01-01

    The correlation of phenotypic outcomes with genetic variation and environmental factors is a core pursuit in biology and biomedicine. Numerous challenges impede our progress: patient phenotypes may not match known diseases, candidate variants may be in genes that have not been characterized, model organisms may not recapitulate human or veterinary diseases, filling evolutionary gaps is difficult, and many resources must be queried to find potentially significant genotype-phenotype associations. Nonhuman organisms have proven instrumental in revealing biological mechanisms. Advanced informatics tools can identify phenotypically relevant disease models in research and diagnostic contexts. Large-scale integration of model organism and clinical research data can provide a breadth of knowledge not available from individual sources and can provide contextualization of data back to these sources. The Monarch Initiative (monarchinitiative.org) is a collaborative, open science effort that aims to semantically integrate genotype-phenotype data from many species and sources in order to support precision medicine, disease modeling, and mechanistic exploration. Our integrated knowledge graph, analytic tools, and web services enable diverse users to explore relationships between phenotypes and genotypes across species.

  19. Hierarchical DSE for multi-ASIP platforms

    DEFF Research Database (Denmark)

    Micconi, Laura; Corvino, Rosilde; Gangadharan, Deepak

    2013-01-01

    This work proposes a hierarchical Design Space Exploration (DSE) for the design of multi-processor platforms targeted to specific applications with strict timing and area constraints. In particular, it considers platforms integrating multiple Application Specific Instruction Set Processors (ASIPs...

  20. Smartphone Analytics: Mobilizing the Lab into the Cloud for Omic-Scale Analyses.

    Science.gov (United States)

    Montenegro-Burke, J Rafael; Phommavongsay, Thiery; Aisporna, Aries E; Huan, Tao; Rinehart, Duane; Forsberg, Erica; Poole, Farris L; Thorgersen, Michael P; Adams, Michael W W; Krantz, Gregory; Fields, Matthew W; Northen, Trent R; Robbins, Paul D; Niedernhofer, Laura J; Lairson, Luke; Benton, H Paul; Siuzdak, Gary

    2016-10-04

    Active data screening is an integral part of many scientific activities, and mobile technologies have greatly facilitated this process by minimizing the reliance on large hardware instrumentation. In order to keep pace with the rapidly growing field of metabolomics and the heavy workload of data processing, we designed the first remote metabolomic data screening platform for mobile devices. Two mobile applications (apps), XCMS Mobile and METLIN Mobile, facilitate access to XCMS and METLIN, which are the most important components in the computer-based XCMS Online platforms. These mobile apps allow for the visualization and analysis of metabolic data throughout the entire analytical process. Specifically, XCMS Mobile and METLIN Mobile provide the capabilities for remote monitoring of data processing, real time notifications for the data processing, visualization and interactive analysis of processed data (e.g., cloud plots, principal component analysis, box-plots, extracted ion chromatograms, and hierarchical cluster analysis), and database searching for metabolite identification. These apps, available on Apple iOS and Google Android operating systems, allow for the migration of metabolomic research onto mobile devices for better accessibility beyond direct instrument operation. The utility of XCMS Mobile and METLIN Mobile functionalities was developed and is demonstrated here through the metabolomic LC-MS analyses of stem cells, colon cancer, aging, and bacterial metabolism.

  1. PROBA-V Mission Exploitation Platform

    Directory of Open Access Journals (Sweden)

    Erwin Goor

    2016-07-01

    Full Text Available As an extension of the PROBA-Vegetation (PROBA-V) user segment, the European Space Agency (ESA), de Vlaamse Instelling voor Technologisch Onderzoek (VITO), and partners TRASYS and Spacebel developed an operational Mission Exploitation Platform (MEP) to drastically improve the exploitation of the PROBA-V Earth Observation (EO) data archive, the archive from the historical SPOT-VEGETATION mission, and derived products by researchers, service providers, and thematic users. The analysis of the time series of data (petabyte range) is addressed, as well as the large-scale on-demand processing of the complete archive, including near real-time data. The platform consists of a private cloud environment, a Hadoop-based processing environment and a data manager. Several applications are released to the users, e.g., a full resolution viewing service, a time series viewer, pre-defined on-demand processing chains, and virtual machines with powerful tools and access to the data. After an initial release in January 2016, a research platform was deployed gradually, allowing users to design, debug, and test applications on the platform. From the PROBA-V MEP, access to, e.g., Sentinel-2 and Sentinel-3 data will be addressed as well.

  2. Development of an Analytic Nodal Diffusion Solver in Multi-groups for 3D Reactor Cores with Rectangular or Hexagonal Assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Lozano, Juan Andres; Aragones, Jose Maria; Garcia-Herranz, Nuria [Universidad Politecnica de Madrid, 28006 Jose Gutierrez Abascal 2, Madrid (Spain)

    2008-07-01

    More accurate modelling of the physical phenomena involved in present and future nuclear reactors requires a multi-scale and multi-physics approach. This challenge can be accomplished by the coupling of best-estimate core-physics, thermal-hydraulics and multi-physics solvers. In order to make that coupling viable, the current trends in reactor simulations are toward the development of a new generation of tools based on user-friendly, modular, easily linkable, faster and more accurate codes to be integrated in common platforms. These premises are at the origin of the NURESIM Integrated Project within the 6th European Framework Programme, which is envisaged to provide the initial step towards a Common European Standard Software Platform for nuclear reactor simulations. In the frame of this project and to reach the above-mentioned goals, a 3-D multigroup nodal solver for neutron diffusion calculations called ANDES (Analytic Nodal Diffusion Equation Solver) has been developed and tested in depth in this thesis. ANDES solves the steady-state and time-dependent neutron diffusion equation in three dimensions and any number of energy groups, utilizing the Analytic Coarse-Mesh Finite-Difference (ACMFD) scheme to yield the nodal coupling equations. It can be applied to both Cartesian and triangular-Z geometries, so that simulations of LWR as well as VVER, HTR and fast reactors can be performed. The solver has been implemented in a fully encapsulated way, enabling it as a module to be readily integrated in other codes and platforms. In fact, it can be used either as a stand-alone nodal code or as a solver to accelerate the convergence of whole-core pin-by-pin code systems. Verification of performance has shown that ANDES is a code with a high-order formulation suitable for realistic whole-core nodal simulations. In this paper, the methodology developed and involved in ANDES is presented. (authors)
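
    For orientation (standard notation, not taken verbatim from the ANDES documentation), the steady-state multigroup diffusion equation that such a nodal solver discretizes with the ACMFD scheme reads

    $$-\nabla\!\cdot\!\left(D_g\nabla\phi_g\right) + \Sigma_{r,g}\,\phi_g = \sum_{g'\neq g}\Sigma_{s,g'\to g}\,\phi_{g'} + \frac{\chi_g}{k_{\mathrm{eff}}}\sum_{g'=1}^{G}\nu\Sigma_{f,g'}\,\phi_{g'},\qquad g = 1,\dots,G,$$

    where φ_g is the scalar flux in energy group g, D_g and Σ_r,g the diffusion coefficient and removal cross section, Σ_s,g'→g the scattering transfer, χ_g the fission spectrum, νΣ_f,g' the fission production term, and k_eff the multiplication-factor eigenvalue.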

  3. Cloud Based Earth Observation Data Exploitation Platforms

    Science.gov (United States)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years the data produced daily by several private and public Earth Observation (EO) satellites have reached the order of tens of terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, and (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) multi-cloud data discovery, (ii) multi-cloud data management and access and (iii) multi-cloud application deployment. This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland

  4. Croatian Analytical Terminology

    Directory of Open Access Journals (Sweden)

    Kastelan-Macan, M.

    2008-04-01

    Full Text Available Results of analytical research are necessary in all human activities. They are inevitable in making decisions in environmental chemistry, agriculture, forestry, veterinary medicine, the pharmaceutical industry, and biochemistry. Without analytical measurements the quality of materials and products cannot be assessed, so that analytical chemistry is an essential part of technical sciences and disciplines. The language of Croatian science, and analytical chemistry within it, was one of the goals of our predecessors. Due to the political situation, they did not succeed entirely, but for the scientists in independent Croatia this is a duty, because language is one of the most important features of the Croatian identity. The awareness of the need to introduce Croatian terminology was systematically developed in the second half of the 19th century, along with the founding of scientific societies and the wish of scientists to write their scientific works in Croatian, so that the results of their research might be applied in the economy. Many authors of textbooks from the 19th and the first half of the 20th century contributed to Croatian analytical terminology (F. Rački, B. Šulek, P. Žulić, G. Pexidr, J. Domac, G. Janeček, F. Bubanović, V. Njegovan and others). M. Deželić published the first systematic chemical terminology in 1940, adjusted to the IUPAC recommendations. In the second half of the 20th century textbooks in classic analytical chemistry were written by V. Marjanović-Krajovan, M. Gyiketta-Ogrizek, S. Žilić and others. I. Filipović wrote the General and Inorganic Chemistry textbook and the Laboratory Handbook (in collaboration with P. Sabioncello) and contributed greatly to establishing the terminology in instrumental analytical methods. The sources of Croatian nomenclature in modern analytical chemistry today are translated textbooks by Skoog, West and Holler, as well as by Günzler and Gremlich, and original textbooks by S. Turina, Z.

  5. Integrated chemical sensor array platform based on a light emitting diode, xerogel-derived sensor elements, and high-speed pin printing

    International Nuclear Information System (INIS)

    Cho, Eun Jeong; Bright, Frank V.

    2002-01-01

    We report a new, solid-state, integrated optical array sensor platform. By using pin printing technology in concert with sol-gel processing methods, we form discrete xerogel-based microsensor elements that are on the order of 100 μm in diameter and 1 μm thick directly on the face of a light emitting diode (LED). The LED serves as the light source to excite chemically responsive luminophores sequestered within the doped xerogel microsensors, and the analyte-dependent emission from within the doped xerogel is detected with a charge coupled device (CCD). We overcome the problem of background illumination from the LED reaching the CCD, and the associated biasing that results, by coating the LED first with a thin layer of blue paint. The thin paint layer serves as an optical filter, removing the LED's red-edge spectral tail. The problem of the spatially dependent fluence across the LED face is solved entirely by performing ratiometric measurements. We illustrate the performance of the new sensor scheme by forming an array of 100 discrete O2-responsive sensing elements on the face of a single LED. The combination of pin printing with an integrated sensor and light source platform results in a rapid method of forming (∼1 s per sensor element) reusable sensor arrays. The entire sensor array can be calibrated using just one sensor element. Array-to-array reproducibility is <8%. Arrays can be formed using single or multiple pins with indistinguishable analytical performance.
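
    As background (an illustrative assumption: the abstract does not state the calibration law, but oxygen-quenched luminophores of this type are conventionally described by Stern-Volmer behavior), the ratiometric readout can be related to oxygen level via

    $$\frac{I_0}{I} = 1 + K_{SV}\,[\mathrm{O_2}],$$

    where I_0 and I are the emission intensities in the absence and presence of oxygen and K_SV is the Stern-Volmer constant; because both intensities are read from the same sensor element, the spatially varying LED fluence cancels out of the ratio, which is why a single element suffices to calibrate the whole array.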

  6. Available: motorised platform

    CERN Multimedia

    The COMPASS collaboration

    2014-01-01

    The COMPASS collaboration would like to offer to a new owner the following useful and fully operational piece of equipment, which is due to be replaced with better adapted equipment. Please contact Erwin Bielert (erwin.bielert@cern.ch or 160539) for further information. Motorized platform (FOR FREE): Fabricated by ACL (Alfredo Cardoso & Cia Ltd) in Portugal. The model number is MeXs 5-30. Specifications: 5 m wide, 1 m deep, adjustable height (1.5 m if folded). Maximum working floor height: 4 m. Conforms to CERN regulations, number LV158. Type LD500, capacity 500 kg and weight 2000 kg. If no interested party is found before December 2014, the platform will be thrown away.

  7. MASTERS OF ANALYTICAL TRADECRAFT: CERTIFYING THE STANDARDS AND ANALYTIC RIGOR OF INTELLIGENCE PRODUCTS

    Science.gov (United States)

    2016-04-01

    ...establishing unit level certified Masters of Analytic Tradecraft (MAT) analysts to be trained and entrusted to evaluate and rate the standards and ...cues) ideally should meet or exceed effective rigor (based on analytical process). To accomplish this, decision makers should not be left to their

  8. University technology platform of anticipatory learning

    Directory of Open Access Journals (Sweden)

    Leonid Davidovich Gitelman

    2016-03-01

    Full Text Available Innovative development sets large-scale and challenging tasks that need to be addressed under conditions of incomplete knowledge and require the coordination and integration of numerous expert structures, which are scattered around the world and have different statuses and competencies. One of the mechanisms for integrating the partners' intellectual and financial resources is provided by technology platforms. The article discusses the nature and functions of technology platforms and analyzes the experience of their application in different countries, with a special emphasis on universities. The article gives an overview of the various interpretations of the technology platform concept. It also describes the development and implementation of the technology platform at the Ural Federal University (research and education centre 'ENGEC'), which was targeted at organizing anticipatory learning in the sphere of energy engineering and high-tech industries; its mechanism and role in improving different university activities and processes are shown. This platform is based on the original methodology 'Integrated System of Consulting, Training, and Transformation' (ISCT), which includes authentic methods and technologies used in the educational process. A significant advantage of this methodology is that it can be applied in university education as well as in corporate training integrated with innovative activities.

  9. Implementation of Online Veterinary Hospital on Cloud Platform.

    Science.gov (United States)

    Chen, Tzer-Shyong; Chen, Tzer-Long; Chung, Yu-Fang; Huang, Yao-Min; Chen, Tao-Chieh; Wang, Huihui; Wei, Wei

    2016-06-01

    Pet markets involve great commercial possibilities, which have boosted the thriving development of veterinary hospital businesses. The sector faces intensive competition and a diversified channel environment. Information technology is integrated to develop the veterinary hospital cloud service platform. The platform contains not only pet medical services but also veterinary hospital management and services. In this study, QR Code and cloud technology are applied to establish the veterinary hospital cloud service platform for pet search by labeling a pet's identification with a QR Code. This technology can break the restriction on veterinary hospital inspection in different areas and allows veterinary hospitals to receive medical records and information through the exclusive QR Code for more effective inspection. As an interactive platform, the veterinary hospital cloud service platform allows pet owners to gain knowledge of pet diseases and healthcare. Moreover, pet owners can enquire and communicate with veterinarians through the platform. Also, veterinary hospitals can periodically send reminders of relevant points and introduce exclusive marketing information via the platform, promoting service items and establishing individualized marketing. Consequently, veterinary hospitals can increase profits through information sharing and create the best solution in such a competitive veterinary market with industry alliances.

  10. Metal-organic frameworks for analytical chemistry: from sample collection to chromatographic separation.

    Science.gov (United States)

    Gu, Zhi-Yuan; Yang, Cheng-Xiong; Chang, Na; Yan, Xiu-Ping

    2012-05-15

    -coated capillaries for high-resolution gas chromatography (GC). We have explored a dynamic coating approach to fabricate a MOF-coated capillary for the GC separation of important raw chemicals and persistent organic pollutants with high resolution and excellent selectivity. We have combined a MOF-coated fiber for solid-phase microextraction with a MOF-coated capillary for GC separation, which provides an effective MOF-based tandem molecular sieve platform for selective microextraction and high-resolution GC separation of target analytes in complex samples. Microsized MOFs with good solvent stability are attractive stationary phases for high-performance liquid chromatography (HPLC). These materials have shown high resolution and good selectivity and reproducibility in both the normal-phase HPLC separation of fullerenes and substituted aromatics on MIL-101 packed columns and position isomers on a MIL-53(Al) packed column and the reversed-phase HPLC separation of a wide range of analytes from nonpolar to polar and acidic to basic solutes. Despite the above achievements, further exploration of MOFs in analytical chemistry is needed. Especially, analytical application-oriented engineering of MOFs is imperative for specific applications.

  11. The development of an open platform to test ITS solutions

    DEFF Research Database (Denmark)

    Lahrmann, Harry; Agerholm, Niels; Juhl, Jens

    2013-01-01

    This paper presents the ITS Platform Northern Denmark, which is an open platform to test ITS solutions. The platform consists of a newly developed GNSS/GPRS On-Board Unit installed in nearly 500 cars, a backend server and a specially designed digital road map for ITS applications. The platform is open for third-party applications. The paper presents the platform's potential and explains a series of test applications developed on the platform. Moreover, a number of new projects planned for the ITS Platform are introduced.

  12. Droplet microfluidic platform for cell electrofusion

    NARCIS (Netherlands)

    Schoeman, R.M.

    2015-01-01

    In this thesis, a lab-on-a-chip platform is described which is capable of electrofusing cells in a picoliter droplet. The platform consists of a glass part containing recessed platinum electrodes, plasma-bonded to a PDMS slab containing microchannels. First, the two cell populations are introduced

  13. Snap: an integrated SNP annotation platform

    DEFF Research Database (Denmark)

    Li, Shengting; Ma, Lijia; Li, Heng

    2007-01-01

    Snap (Single Nucleotide Polymorphism Annotation Platform) is a server designed to comprehensively analyze single genes and relationships between genes based on SNPs in the human genome. The aim of the platform is to facilitate the study of SNP finding and analysis within the framework of medical...

  14. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203

  15. Cyclic platform dolomites and platform-to-basin transition of Jefferson Formation (Frasnian), southwest Montana and east-central Idaho

    Energy Technology Data Exchange (ETDEWEB)

    Dorobek, S.L.

    1987-08-01

    The Jefferson Formation (Frasnian) in southwestern Montana consists of cyclic sequences of shallow marine platformal dolomites that grade westward into slope/basinal facies in east-central Idaho. Regional sedimentologic characteristics of slope facies in Idaho indicate that the Jefferson platform resembled a distally steepened ramp. Slope facies consist of slope laminites with local small-scale slumps and slope breccias. Shallow-water platform-derived clasts are lacking in the slope breccias. Individual shallowing-upward platform cycles are 25 m to < 1 m thick and consist of, in descending order: local solution-collapse breccia caps; cryptalgal dolomudstone; rare ooid dolograinstone; thin-bedded Amphipora dolowackestone; coarsely crystalline dolostones with abundant lenticular to domal stromatoporoids; and basal thin-bedded, fine-grained, shale dolostones with closely spaced hardgrounds that grade upward into burrow-homogenized, irregularly bedded dolostones.

  16. SREQP: A Solar Radiation Extraction and Query Platform for the Production and Consumption of Linked Data from Weather Stations Sensors

    Directory of Open Access Journals (Sweden)

    José Luis Sánchez-Cervantes

    2016-01-01

    Full Text Available Nowadays, solar radiation information is provided by sensors installed at different geographic locations and by the platforms of meteorological agencies. However, common formats such as PDF files and HTML documents used to provide solar radiation information do not offer semantics in their content, and they may pose problems for integrating and fusing data from multiple resources. One of the challenges of the sensor Web is the unification of data from multiple sources, since such unified information facilitates interoperability with other sensor Web systems. This research proposes the SREQP (Solar Radiation Extraction and Query Platform) architecture to extract solar radiation data from multiple external sources and merge them on a single, unified platform. SREQP makes use of Linked Data to generate a set of triples containing information about the extracted data, which allows final users to query the data through a SPARQL endpoint. The conceptual model was developed using known vocabularies, such as SSN and WGS84. Moreover, an Analytic Hierarchy Process was carried out for the evaluation of SREQP in order to identify and evaluate the main features of Linked-Sensor-Data and sensor Web systems. Results from the evaluation indicated that SREQP contained most of the features considered essential in Linked-Sensor-Data and sensor Web systems.
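
    As a rough illustration of the Linked Data step described above, the following sketch (written with rdflib, not the actual SREQP code) publishes a hypothetical solar-radiation observation as triples and queries it the way a SPARQL endpoint would. The vocabulary terms, namespace URIs, and sensor identifiers are assumptions made for the example.

```python
# Sketch: express one solar-radiation reading as RDF triples and query it with SPARQL.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, XSD

SSN = Namespace("http://www.w3.org/ns/ssn/")    # Semantic Sensor Network vocabulary (illustrative use)
EX = Namespace("http://example.org/sensors/")   # hypothetical local namespace

g = Graph()
sensor = EX["pyranometer-01"]
obs = EX["observation-2016-01-01T12-00"]

g.add((sensor, RDF.type, SSN.Sensor))
g.add((obs, RDF.type, SSN.Observation))
g.add((obs, SSN.observedBy, sensor))
g.add((obs, EX.solarRadiation, Literal(815.2, datatype=XSD.float)))  # W/m2, example value

# Query the graph the same way a SPARQL endpoint would serve external users.
results = g.query("""
    PREFIX ex: <http://example.org/sensors/>
    SELECT ?obs ?value WHERE { ?obs ex:solarRadiation ?value . }
""")
for row in results:
    print(row.obs, float(row.value))
```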

  17. Detection of trace explosives on relevant substrates using a mobile platform for photothermal infrared imaging spectroscopy (PT-IRIS)

    Science.gov (United States)

    Kendziora, Christopher A.; Furstenberg, Robert; Papantonakis, Michael; Nguyen, Viet; Byers, Jeff; McGill, R. Andrew

    2015-05-01

    This manuscript describes the results of recent tests regarding standoff detection of trace explosives on relevant substrates using a mobile platform. We are developing a technology for detection based on photo-thermal infrared (IR) imaging spectroscopy (PT-IRIS). This approach leverages one or more microfabricated IR quantum cascade lasers, tuned to strong absorption bands in the analytes and directed to illuminate an area on a surface of interest. An IR focal plane array is used to image the surface thermal emission upon laser illumination. The PT-IRIS signal is processed as a hyperspectral image cube comprised of spatial, spectral and temporal dimensions as vectors within a detection algorithm. Increased sensitivity to explosives and selectivity between different analyte types is achieved by narrow bandpass IR filters in the collection path. We have previously demonstrated the technique at several meters of stand-off distance indoors and in field tests, while operating the lasers below the infrared eye-safe intensity limit (100 mW/cm2). Sensitivity to explosive traces as small as a single 10 μm diameter particle (~1 ng) has been demonstrated. Analytes tested here include RDX, TNT, ammonium nitrate and sucrose. The substrates tested in this current work include metal, plastics, glass and painted car panels.
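
    A hedged sketch of the hyperspectral-cube processing mentioned above follows: each pixel spectrum is compared against a reference analyte spectrum and high-similarity pixels are flagged. The reference spectrum, band count, and threshold are invented for illustration and do not represent the PT-IRIS detection algorithm itself.

```python
# Sketch: flag pixels whose spectra resemble a reference analyte spectrum.
import numpy as np

rng = np.random.default_rng(0)
rows, cols, bands = 64, 64, 8          # spatial and spectral dimensions (illustrative)
cube = rng.normal(0.0, 0.05, size=(rows, cols, bands))

# Hypothetical analyte signature and a synthetic "particle" injected at one pixel.
reference = np.array([0.1, 0.3, 0.9, 0.7, 0.2, 0.1, 0.05, 0.02])
cube[20, 31] += reference

def spectral_match(cube, reference):
    """Cosine similarity between every pixel spectrum and the reference spectrum."""
    norms = np.linalg.norm(cube, axis=-1) * np.linalg.norm(reference)
    return (cube @ reference) / np.maximum(norms, 1e-12)

score = spectral_match(cube, reference)
detections = np.argwhere(score > 0.95)  # threshold chosen arbitrarily for the sketch
print("candidate analyte pixels:", detections)
```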

  18. Gas turbine bucket with impingement cooled platform

    Science.gov (United States)

    Jones, Raphael Durand

    2002-01-01

    In a turbine bucket having an airfoil portion and a root portion, with a substantially planar platform at an interface between the airfoil portion and root portion, a platform cooling arrangement including at least one bore in the root portion and at least one impingement cooling tube seated in the bore, the tube extending beyond the bore with an outlet in close proximity to a targeted area on an underside of the platform.

  19. Oil and Gas Producing Platforms in the Gulf of Mexico, Geographic NAD83, MMS (1998)[platforms_MMS_1998

    Data.gov (United States)

    Louisiana Geographic Information Center — This is a point data set for the location of over 4300 MMS administered platform structures used for oil and gas production in the Gulf of Mexico. Groups of platform...

  20. High throughput in vivo protease inhibitor selection platform

    DEFF Research Database (Denmark)

    2017-01-01

    The invention relates to a recombinant microbial cell comprising a selection platform for screening for a protease inhibitor, wherein the platform comprises transgenes encoding a protease having selective peptide bond cleavage activity at a recognition site amino acid sequence; and transgenes...... platform for screening for a protease inhibitor....

  1. German crowd-investing platforms: Literature review and survey

    Directory of Open Access Journals (Sweden)

    David Grundy

    2016-12-01

    Full Text Available This article presents a comprehensive overview of the current German crowd-investing market, drawing on a data-set of 31 crowd-investing platforms and the analysis of 265 completed projects. While the crowd-investing market still represents only a niche in the German venture capital market, there is potential for an increase in both market volume and average project investment. The market share is distributed among a few crowd-investing platforms, with high entry barriers for new platforms, although platforms that specialise in certain sectors have managed to enter the market successfully. German crowd-investing platforms are found to promote mainly internet-based enterprises (36%), followed by projects in real estate (24%) and green projects (19%), with a median amount raised of 100,000 euros.

  2. Evaluation of E-learning Platforms: a Case Study

    Directory of Open Access Journals (Sweden)

    Cristina POP

    2012-01-01

    Full Text Available In the recent past, a great number of e-learning platforms have been introduced on the market, showing different characteristics and services. These platforms can be evaluated using multiple criteria and methods. This paper proposes a list of selected quality criteria for describing, characterizing and selecting an e-learning platform. These criteria were designed based on e-learning standards. I also propose a mathematical model to determine the probability that a student uses an e-learning platform based on the factors (criteria) that determine the quality of the platform and the socio-demographic variables of the student. The case study presented is an application of the model, and the input data, intermediate calculations and final results were processed using SAS (Statistical Analysis Software).
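
    The abstract does not state the functional form of the proposed model; as a purely illustrative sketch, a logistic model is one common way to relate usage probability to quality criteria and socio-demographic variables. The feature names and coefficients below are invented and are not taken from the paper.

```python
# Toy logistic model: P(student uses the platform | features). Coefficients are hypothetical.
import math

def usage_probability(quality_score, age, prior_elearning_experience):
    """Return a probability in (0, 1) from a linear score passed through a logistic function."""
    z = -2.0 + 1.5 * quality_score + 0.02 * age + 0.8 * prior_elearning_experience
    return 1.0 / (1.0 + math.exp(-z))

print(usage_probability(quality_score=0.9, age=21, prior_elearning_experience=1))
```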

  3. Making Sense of Video Analytics: Lessons Learned from Clickstream Interactions, Attitudes, and Learning Outcome in a Video-Assisted Course

    Directory of Open Access Journals (Sweden)

    Michail N. Giannakos

    2015-02-01

    Full Text Available Online video lectures have been considered an instructional media for various pedagogic approaches, such as the flipped classroom and open online courses. In comparison to other instructional media, online video affords the opportunity for recording student clickstream patterns within a video lecture. Video analytics within lecture videos may provide insights into student learning performance and inform the improvement of video-assisted teaching tactics. Nevertheless, video analytics are not accessible to learning stakeholders, such as researchers and educators, mainly because online video platforms do not broadly share the interactions of the users with their systems. For this purpose, we have designed an open-access video analytics system for use in a video-assisted course. In this paper, we present a longitudinal study, which provides valuable insights through the lens of the collected video analytics. In particular, we found that there is a relationship between video navigation (repeated views and the level of cognition/thinking required for a specific video segment. Our results indicated that learning performance progress was slightly improved and stabilized after the third week of the video-assisted course. We also found that attitudes regarding easiness, usability, usefulness, and acceptance of this type of course remained at the same levels throughout the course. Finally, we triangulate analytics from diverse sources, discuss them, and provide the lessons learned for further development and refinement of video-assisted courses and practices.

  4. An analytical drain current model for symmetric double-gate MOSFETs

    Directory of Open Access Journals (Sweden)

    Fei Yu

    2018-04-01

    Full Text Available An analytical surface-potential-based drain current model of symmetric double-gate (sDG) MOSFETs is described as a SPICE-compatible model in this paper. The continuous surface and central potentials from the accumulation to the strong inversion regions are solved from the 1-D Poisson’s equation in sDG MOSFETs. Furthermore, the drain current is derived from the charge sheet model as a function of the surface potential. Over a wide range of terminal voltages, doping concentrations, and device geometries, the surface potential calculation scheme and drain current model are verified by solving the 1-D Poisson’s equation based on the least squares method and using Silvaco Atlas simulation results and experimental data, respectively. Such a model can be adopted as a useful platform to develop circuit simulators and provide a clear understanding of sDG MOSFET device physics.
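
    The numerical step described above, solving an implicit surface-potential equation so that the drain current can then be written as a function of that potential, can be sketched as follows. The equation used here is the generic textbook bulk-MOSFET form, not the sDG model of the paper, and all parameter values are arbitrary.

```python
# Sketch: solve an implicit surface-potential equation with a bracketed root finder.
import numpy as np
from scipy.optimize import brentq

VT = 0.0259     # thermal voltage kT/q at 300 K [V]
GAMMA = 0.4     # body-effect coefficient [V^0.5] (arbitrary)
PHI_F = 0.40    # Fermi potential [V] (arbitrary)
VFB = -0.9      # flat-band voltage [V] (arbitrary)

def implicit_eq(psi_s, vg):
    """f(psi_s) = 0 defines the surface potential for a given gate voltage."""
    charge_term = GAMMA * np.sqrt(VT * np.exp(-psi_s / VT) + psi_s - VT
                                  + np.exp(-2 * PHI_F / VT)
                                  * (VT * np.exp(psi_s / VT) - psi_s - VT))
    return vg - VFB - psi_s - charge_term

def surface_potential(vg):
    """Solve the implicit equation by bracketed root finding."""
    return brentq(implicit_eq, 1e-6, 2.0, args=(vg,))

for vg in (0.5, 1.0, 1.5):
    print(f"Vg = {vg:.1f} V  ->  psi_s = {surface_potential(vg):.3f} V")
```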

  5. Optimization of a nanotechnology based antimicrobial platform for food safety applications using Engineered Water Nanostructures (EWNS)

    Science.gov (United States)

    Pyrgiotakis, Georgios; Vedantam, Pallavi; Cirenza, Caroline; McDevitt, James; Eleftheriadou, Mary; Leonard, Stephen S.; Demokritou, Philip

    2016-02-01

    A chemical-free, nanotechnology-based antimicrobial platform using Engineered Water Nanostructures (EWNS) was recently developed. EWNS have a high surface charge, are loaded with reactive oxygen species (ROS), and can interact with and inactivate an array of microorganisms, including foodborne pathogens. Here, it was demonstrated that their properties during synthesis can be fine-tuned and optimized to further enhance their antimicrobial potential. A lab-based EWNS platform was developed to enable fine-tuning of EWNS properties by modifying synthesis parameters. Characterization of EWNS properties (charge, size and ROS content) was performed using state-of-the-art analytical methods. Further, their microbial inactivation potential was evaluated with food-related microorganisms such as Escherichia coli, Salmonella enterica, Listeria innocua, Mycobacterium parafortuitum, and Saccharomyces cerevisiae inoculated onto the surface of organic grape tomatoes. The results presented here indicate that EWNS properties can be fine-tuned during synthesis, resulting in a multifold increase of the inactivation efficacy. More specifically, the surface charge quadrupled and the ROS content increased. Microbial removal rates were microorganism dependent and ranged from 1.0 to 3.8 logs after 45 min of exposure to an EWNS aerosol dose of 40,000 particles/cm3.
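
    For readers unfamiliar with the "log reduction" figures quoted above, the relationship to surviving colony counts is straightforward; the counts in this sketch are invented purely for illustration.

```python
# Sketch: how a log-reduction figure relates to initial and surviving colony counts.
import math

def log_reduction(cfu_before, cfu_after):
    """Log10 reduction between initial and surviving colony-forming units."""
    return math.log10(cfu_before / cfu_after)

print(log_reduction(1_000_000, 100_000))  # 1.0 log  (90% inactivation)
print(log_reduction(1_000_000, 158))      # ~3.8 logs (99.98% inactivation)
```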

  6. Optimization of a nanotechnology based antimicrobial platform for food safety applications using Engineered Water Nanostructures (EWNS)

    Science.gov (United States)

    Pyrgiotakis, Georgios; Vedantam, Pallavi; Cirenza, Caroline; McDevitt, James; Eleftheriadou, Mary; Leonard, Stephen S.; Demokritou, Philip

    2016-01-01

    A chemical-free, nanotechnology-based antimicrobial platform using Engineered Water Nanostructures (EWNS) was recently developed. EWNS have a high surface charge, are loaded with reactive oxygen species (ROS), and can interact with and inactivate an array of microorganisms, including foodborne pathogens. Here, it was demonstrated that their properties during synthesis can be fine-tuned and optimized to further enhance their antimicrobial potential. A lab-based EWNS platform was developed to enable fine-tuning of EWNS properties by modifying synthesis parameters. Characterization of EWNS properties (charge, size and ROS content) was performed using state-of-the-art analytical methods. Further, their microbial inactivation potential was evaluated with food-related microorganisms such as Escherichia coli, Salmonella enterica, Listeria innocua, Mycobacterium parafortuitum, and Saccharomyces cerevisiae inoculated onto the surface of organic grape tomatoes. The results presented here indicate that EWNS properties can be fine-tuned during synthesis, resulting in a multifold increase of the inactivation efficacy. More specifically, the surface charge quadrupled and the ROS content increased. Microbial removal rates were microorganism dependent and ranged from 1.0 to 3.8 logs after 45 min of exposure to an EWNS aerosol dose of 40,000 particles/cm3. PMID:26875817

  7. Optimization of a nanotechnology based antimicrobial platform for food safety applications using Engineered Water Nanostructures (EWNS).

    Science.gov (United States)

    Pyrgiotakis, Georgios; Vedantam, Pallavi; Cirenza, Caroline; McDevitt, James; Eleftheriadou, Mary; Leonard, Stephen S; Demokritou, Philip

    2016-02-15

    A chemical-free, nanotechnology-based antimicrobial platform using Engineered Water Nanostructures (EWNS) was recently developed. EWNS have a high surface charge, are loaded with reactive oxygen species (ROS), and can interact with and inactivate an array of microorganisms, including foodborne pathogens. Here, it was demonstrated that their properties during synthesis can be fine-tuned and optimized to further enhance their antimicrobial potential. A lab-based EWNS platform was developed to enable fine-tuning of EWNS properties by modifying synthesis parameters. Characterization of EWNS properties (charge, size and ROS content) was performed using state-of-the-art analytical methods. Further, their microbial inactivation potential was evaluated with food-related microorganisms such as Escherichia coli, Salmonella enterica, Listeria innocua, Mycobacterium parafortuitum, and Saccharomyces cerevisiae inoculated onto the surface of organic grape tomatoes. The results presented here indicate that EWNS properties can be fine-tuned during synthesis, resulting in a multifold increase of the inactivation efficacy. More specifically, the surface charge quadrupled and the ROS content increased. Microbial removal rates were microorganism dependent and ranged from 1.0 to 3.8 logs after 45 min of exposure to an EWNS aerosol dose of 40,000 particles/cm3.

  8. Analytical Performances of Human Immunodeficiency Virus Type 1 RNA-Based Amplix® Real-Time PCR Platform for HIV-1 RNA Quantification

    Directory of Open Access Journals (Sweden)

    Christian Diamant Mossoro-Kpinde

    2016-01-01

    Full Text Available Objectives. We evaluated the performance of the Amplix real-time PCR platform developed by Biosynex (Strasbourg, France), which combines an automated extraction station (Amplix station 16 Dx) and real-time PCR (Amplix NG), for quantifying plasma HIV-1 RNA with lyophilized HIV-1 RNA-based Amplix reagents targeting gag and LTR, using samples from HIV-1-infected adults from the Central African Republic. Results. The Amplix real-time PCR assay showed a low limit of detection (28 copies/mL) across a wide dynamic range (1.4–10 log copies/mL), 100% sensitivity and 99% specificity, high reproducibility, and accuracy with a mean bias < 5%. The assay showed excellent correlation and 95.3% concordance with the reference HIV-1 RNA load assay (Roche), with a mean absolute bias of +0.097 log copies/mL by Bland-Altman analysis. The assay was able to detect and quantify the most prevalent HIV-1 subtype strains and the majority of non-B subtypes, CRFs of HIV-1 group M, and HIV-1 groups N and O circulating in Central Africa. The Amplix assay showed 100% sensitivity and 99.6% specificity in diagnosing virological failure in clinical samples from antiretroviral drug-experienced patients. Conclusions. The HIV-1 RNA-based Amplix real-time PCR platform constitutes a sensitive and reliable system for clinical monitoring of HIV-1 RNA load in HIV-1-infected children and adults, particularly adapted to intermediate laboratory facilities in sub-Saharan Africa.
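
    A short sketch of the Bland-Altman style comparison mentioned above: the mean bias is the average of the per-sample differences between the two assays' log10 viral loads. The paired values below are fabricated examples, not data from the study.

```python
# Sketch: Bland-Altman mean bias and 95% limits of agreement for paired assay results.
import numpy as np

amplix = np.array([3.10, 4.25, 5.00, 2.75, 6.15])  # log10 copies/mL (hypothetical)
roche  = np.array([3.00, 4.10, 4.90, 2.70, 6.00])  # reference assay (hypothetical)

diff = amplix - roche
mean_bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)

print(f"mean bias = {mean_bias:+.3f} log10 copies/mL")
print(f"95% limits of agreement = {mean_bias - half_width:+.3f} to {mean_bias + half_width:+.3f}")
```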

  9. Platforms for Persistent Communications, Surveillance and Reconnaissance

    National Research Council Canada - National Science Library

    Campbell, William; Vehlow, Chuck; Wartell, Mike; Adler, Allen; Swan, Pete; Wynn, Bob; Beriwaln, Madhu; Collier, Darrell; Gallagher, Herb; Glaser, Gary; Scalera, Steve; Puthoff, Jolene

    2008-01-01

    ...). The Army Science Board investigated capabilities of platforms deployed in space, near space, and at lower altitudes and assessed tradeoffs among benefits, weaknesses, costs, and logistics burdens associated with platform types...

  10. Hemocompatible ɛ-polylysine-heparin microparticles: A platform for detecting triglycerides in whole blood.

    Science.gov (United States)

    Xu, Tingting; Chi, Bo; Chu, Meilin; Zhang, Qicheng; Zhan, Shuyue; Shi, Rongjia; Xu, Hong; Mao, Chun

    2018-01-15

    Triglycerides are a clinically important marker for atherosclerosis, heart disease and hypertension. Here, a platform for detecting triglycerides directly in whole blood was developed based on hemocompatible ɛ-polylysine-heparin microparticles. The ɛ-polylysine-heparin microparticles were characterized by Fourier transform infrared (FT-IR) spectra, transmission electron microscopy (TEM) and ζ-potential measurements. Moreover, the blood compatibility of the ɛ-polylysine-heparin microparticles was characterized by in vitro coagulation tests, hemolysis assays and whole blood adhesion tests. Given the uniform particle size, good dispersibility and moderate long-term anticoagulation capability of the microparticles, a Lipase-(ɛ-polylysine-heparin)-glassy carbon electrode (GCE) was constructed to detect triglycerides. The proposed biosensor had good electrocatalytic activity towards triglycerides, with a sensitivity of 0.40 μA mg⁻¹ dL cm⁻² and a detection limit of 4.67 mg dL⁻¹ (S/N = 3). Meanwhile, the Lipase-(ɛ-polylysine-heparin)-GCE electrode had strong anti-interference ability as well as a long shelf-life. Moreover, for the detection of triglycerides directly in whole blood, the detection limit was as low as 5.18 mg dL⁻¹. The newly constructed platform is suitable for detecting triglycerides directly in whole blood, which provides a new analytical system for clinical diagnosis. Copyright © 2017 Elsevier B.V. All rights reserved.
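
    The S/N = 3 convention cited above is commonly read as LOD = 3 × (blank noise) / (calibration slope). The sketch below uses a blank-noise value and electrode area chosen only to reproduce the order of magnitude reported, not the study's measurements.

```python
# Sketch: detection limit from the 3-sigma (S/N = 3) convention.
blank_noise = 0.62e-6   # standard deviation of blank current [A] (hypothetical)
sensitivity = 0.40e-6   # calibration slope [A per (mg/dL) per cm^2], from the abstract
electrode_area = 1.0    # cm^2 (assumed)

lod = 3 * blank_noise / (sensitivity * electrode_area)
print(f"estimated detection limit ~ {lod:.2f} mg/dL")
```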

  11. Urban search mobile platform modeling in hindered access conditions

    Science.gov (United States)

    Barankova, I. I.; Mikhailova, U. V.; Kalugina, O. B.; Barankov, V. V.

    2018-05-01

    The article explores control system simulation and the design of an experimental model of a rescue robot mobile platform. The functional interface, a structural-functional diagram of the mobile platform control unit, and a functional control scheme for the mobile platform of the rescue robot were modeled. The task of designing a mobile platform for urban search in hindered-access conditions is realized through the use of a mechanical base with a chassis and crawler drive, a warning device, human heat sensors and a microcontroller based on the Arduino platform.

  12. An Investigation of Digital Payment Platform Designs

    DEFF Research Database (Denmark)

    Kazan, Erol; Damsgaard, Jan

    2014-01-01

    This paper focuses on the triumphal march of mobile phones, which are currently annexing music players, navigation devices, and cameras as separate physical objects. The next target is set on payment. Through synthesizing available literature, we construct a framework for studying digital payment...... platforms that combines platform, technology and business design aspects. The framework is applied to conduct a comparative case study of digital payment platforms. Four types of market actors are considered: banks, mobile network operators, merchants, and startups, which are incumbents and disrupters....... By hosting third-party services, payment instruments are evolving from single-purpose to multi-functional ones. Our research extends existing payment literature from the MSP perspective to emphasize certain digital payment platform components, which impact strategies and complementary products....

  13. Fresh water generators onboard a floating platform

    International Nuclear Information System (INIS)

    Tewari, P.K.; Verma, R.K.; Misra, B.M.; Sadhulkan, H.K.

    1997-01-01

    A dependable supply of fresh water is essential for any ocean going vessel. The operating and maintenance personnel on offshore platforms and marine structures also require a constant and regular supply of fresh water to meet their essential daily needs. A seawater thermal desalination unit onboard delivers good quality fresh water from seawater. The desalination units developed by Bhabha Atomic Research Centre (BARC) suitable for ocean going vessels and offshore platforms have been discussed. Design considerations of such units with reference to floating platforms and corrosive environments have been presented. The feasibility of coupling a low temperature vacuum evaporation (LTVE) desalination plant suitable for an onboard floating platform to a PHWR nuclear power plant has also been discussed. (author). 1 ref., 3 figs, 2 tabs

  14. Business analytics a practitioner's guide

    CERN Document Server

    Saxena, Rahul

    2013-01-01

    This book provides a guide to businesses on how to use analytics to help drive from ideas to execution. Analytics used in this way provides "full lifecycle support" for business and helps during all stages of management decision-making and execution. The framework presented in the book enables the effective interplay of business, analytics, and information technology (business intelligence) both to leverage analytics for competitive advantage and to embed the use of business analytics into the business culture. It lays out an approach for analytics, describes the processes used, and provides gu

  15. Carbonate platform growth and demise offshore Central Vietnam

    DEFF Research Database (Denmark)

    Fyhn, Michael B.W.; Boldreel, Lars Ole; Nielsen, Lars H.

    2013-01-01

    Fault Zone, the Tuy Hoa Carbonate Platform fringes the continental margin between Da Nang and Nha Trang. Here, platform growth initiated during the Early Miocene and continued until Middle Miocene time when regional uplift led to subaerial exposure, termination of platform growth and karstification...... continues on isolated platforms hosting the Paracel Islands farther seawards. The onset of widespread carbonate deposition largely reflects the Early Miocene transgression of the area linked with early post-rift subsidence and the opening of the South China Sea. The mid-Neogene shift in carbonate deposition...

  16. eAnalytics: Dynamic Web-based Analytics for the Energy Industry

    Directory of Open Access Journals (Sweden)

    Paul Govan

    2016-11-01

    Full Text Available eAnalytics is a web application built on top of R that provides dynamic data analytics to energy industry stakeholders. The application allows users to dynamically manipulate chart data and style through the Shiny package’s reactive framework. eAnalytics currently supports a number of features including interactive datatables, dynamic charting capabilities, and the ability to save, download, or export information for further use. Going forward, the goal for this project is that it will serve as a research hub for discovering new relationships in the data. The application is illustrated with a simple tutorial of the user interface design.

  17. FOREIGN EXPERIENCE OF USING CLOUD SERVICES FOR THE INFORMATION-ANALYTICAL SUPPORT OF THE ORGANIZATION OF INTERNATIONAL COOPERATION OF UNIVERSITIES

    Directory of Open Access Journals (Sweden)

    Kravchenko A.

    2017-12-01

    Full Text Available Foreign experience of using cloud services for the information-analytical support of the organization of international cooperation of universities is presented in the article. The best practices of using cloud services as new analytical tools and platforms for solving complex problems of optimizing the management of the scientific and international activities of universities are analyzed. The architecture of the cloud computing environment is analyzed as a system consisting of four blocks: hardware, infrastructure, platforms, and applications, together with a cloud taxonomy for supporting the organization of the scientific, academic and international activities of a university and a taxonomy of the main cloud technologies supporting those activities. The activities of the leading universities of the world for 2016-2017 are monitored, and the expert results of Quacquarelli Symonds specialists are presented according to the QS World University Rankings. The evaluation was carried out based on more than 50 different indicators, such as: academic reputation; employer reputation; faculty/student ratio; citations per faculty member; international faculty ratio; international student ratio; assessment of the quality and productivity of the university's research; number of citations; graduate university awards; assessment of teaching quality; employment opportunities; internationalization, which includes statistical indicators on the number of foreign students studying at the university, the number of exchange students, and the number of international partnership agreements with other universities; accessibility; the possibility of distance learning; social responsibility; innovation; art and culture; inclusiveness, etc.

  18. How Export and Import Platforms Drive Industry Upgrading

    DEFF Research Database (Denmark)

    Ishida, Masami; Machikita, Tomohiro; Ueki, Yasushi

    2013-01-01

    to extend the geographic scope of their foreign platforms if they run both exporting and importing; (2) firm size and the R&D-to-sales ratio play a role in foreign platforms in ASEAN, Europe, and the USA, but have no effect on foreign platforms in East Asia; (3) emerging multinationals do not achieve...

  19. Plasmonic Paper as a Novel Chem/Bio Detection Platform

    Science.gov (United States)

    Tian, Limei

    The time varying electric field of electromagnetic (EM) radiation causes oscillation of conduction electrons of metal nanoparticles. The resonance of such oscillation, termed localized surface plasmon resonance (LSPR), falls into the visible spectral region for noble metals such as gold, silver and copper. LSPR of metal nanostructures is sensitive to numerous factors such as composition, size, shape, dielectric properties of surrounding medium, and proximity to other nanostructures (plasmon coupling). The sensitivity of LSPR to the refractive index of surrounding medium renders it an attractive platform for chemical and biological sensing. When the excitation light is in resonance with the plasmon frequency of the metal nanoparticle, it radiates a characteristic dipolar radiation causing a characteristic spatial distribution in which certain areas show higher EM field intensity, which is manifested as electromagnetic field enhancement. Surface enhanced Raman scattering (SERS) involves dramatic enhancement of the intensity of the Raman scattering from the analyte adsorbed on or in proximity to a nanostructured metal surface exhibiting such strong EM field enhancement. Both LSPR and SERS have been widely investigated for highly sensitive and label-free chemical & biological sensors. Most of the SERS/LSPR sensors demonstrated so far rely on rigid planar substrates (e.g., glass, silicon) owing to the well-established lithographic approaches, which are routinely employed for either fabrication or assembly of plasmonic nanotransducers. In many cases, their rigid nature results in low conformal contact with the sample and hence poor sample collection efficiency. We hypothesized that paper substrates are an excellent alternative to conventional rigid substrates to significantly improve the (multi-)functionality of LSPR/SERS substrates, dramatically simplify the fabrication procedures and lower the cost. The choice of paper substrates for the implementation of SERS

  20. Microfluidic Diatomite Analytical Devices for Illicit Drug Sensing with ppb-Level Sensitivity.

    Science.gov (United States)

    Kong, Xianming; Chong, Xinyuan; Squire, Kenny; Wang, Alan X

    2018-04-15

    The escalating research interest in porous media microfluidics, such as microfluidic paper-based analytical devices, has fostered a new spectrum of biomedical devices for point-of-care (POC) diagnosis and biosensing. In this paper, we report microfluidic diatomite analytical devices (μDADs), which consist of highly porous photonic crystal biosilica channels, as an innovative lab-on-a-chip platform to detect illicit drugs. The μDADs in this work are fabricated by spin-coating and tape-stripping diatomaceous earth on regular glass slides, with channel cross sections of 400 × 30 µm². As their most unique feature, our μDADs can simultaneously perform on-chip chromatography to separate small molecules from complex biofluidic samples and acquire the surface-enhanced Raman scattering spectra of the target chemicals with high specificity. Owing to the ultra-small dimensions of the diatomite microfluidic channels and the photonic crystal effect from the fossilized diatom frustules, we demonstrate unprecedented sensitivity down to the part-per-billion (ppb) level when detecting pyrene (1 ppb) from a mixed sample with a Raman dye and cocaine (10 ppb) from human plasma. This pioneering work proves the exclusive advantage of μDADs as emerging microfluidic devices for chemical and biomedical sensing, especially for POC drug screening.

  1. A Cross-Platform Tactile Capabilities Interface for Humanoid Robots

    Directory of Open Access Journals (Sweden)

    Jie eMa

    2016-04-01

    Full Text Available This article presents the core elements of a cross-platform tactile capabilities interface (TCI) for humanoid arms. The aim of the interface is to reduce the cost of developing humanoid robot capabilities by supporting reuse through cross-platform deployment. The article presents a comparative analysis of existing robot middleware frameworks, as well as the technical details of the TCI framework that builds on the existing YARP platform. The TCI framework currently includes robot arm actuators with robot skin sensors. It presents such hardware in a platform-independent manner, making it possible to write robot control software that can be executed on different robots through the TCI framework. The TCI framework supports multiple humanoid platforms, and this article also presents a case study of a cross-platform implementation of a set of tactile protective withdrawal reflexes that have been realised on both the Nao and iCub humanoid robot platforms using the same high-level source code.

  2. Circular Bioassay Platforms for Applications in Microwave-Accelerated Techniques.

    Science.gov (United States)

    Mohammed, Muzaffer; Clement, Travis C; Aslan, Kadir

    2014-12-02

    In this paper, we present the design of four different circular bioassay platforms suitable for homogeneous microwave heating, using theoretical calculations (i.e., COMSOL™ multiphysics software). The circular bioassay platforms are constructed from poly(methyl methacrylate) (PMMA) for optical transparency between 400-800 nm, have multiple-sample capacity (12, 16, 19 and 21 wells) and are modified with silver nanoparticle films (SNFs) for use in microwave-accelerated bioassays (MABs). In addition, a small monomode microwave cavity, which can be operated with an external microwave generator (100 W), was developed for use with the bioassay platforms in MABs. Our design parameters for the circular bioassay platforms and monomode microwave cavity during microwave heating were: (i) temperature profiles, (ii) electric field distributions, (iii) location of the circular bioassay platforms inside the microwave cavity, and (iv) design and number of wells on the circular bioassay platforms. We have also carried out additional simulations to assess the use of circular bioassay platforms in a conventional kitchen microwave oven (e.g., 900 W). Our results show that the location of the circular bioassay platforms in the microwave cavity was predicted to have a significant effect on the homogeneous heating of these platforms. The 21-well circular bioassay platform design in our monomode microwave cavity was predicted to offer a homogeneous heating pattern, where the inter-well temperature was between 23.72 and 24.13°C and the intra-well temperature difference was less than 0.21°C for 60 seconds of microwave heating, which was also verified experimentally.

  3. Applied architecture patterns on the Microsoft platform

    CERN Document Server

    Dovgal, Andre; Noriskin, Gregor

    2014-01-01

    Presented in a scenario-driven tutorial way, we lead you through fictitious example problems and present you with the best solutions. This book is intended for architects, developers, and managers who need to improve their knowledge of the Microsoft application platform. This book will appeal to anyone, especially consultants, who want to get up to speed on selecting the most appropriate platform for a particular problem. A good understanding of the general Windows platform and development technologies would be helpful.

  4. Proof-of-Concept Prototyping for Observis Platform

    OpenAIRE

    Ekimov, Victor

    2012-01-01

    Observis Oy is a start-up company that first appeared in January 2011. The company is building a measurement platform that is open and easy to connect to. It helps measurement device suppliers, system and service providers, and analysis services to find and combine each other's products to create more value for the end customers. Observis Oy intends to develop a platform for integration with other services in order to provide management functionality in the environmental field of business. Platform i...

  5. Production of recombinant proteins GST L1, E6 and E7 tag HPV 16 ...

    African Journals Online (AJOL)

    STORAGESEVER

    2009-02-04

    Feb 4, 2009 ... targeting the viral oncoproteins E6 and E7 are markers for HPV-associated ... Luminex XYP plate handler, Luminex SD sheath fluid delivery system, a Pentium 4 .... expression mediated by a potato virus X derived vector of the E7 protein .... inflammation, and antioxidant nutrients – assessing their roles as.

  6. Characterization of dilation-analytic operators

    Energy Technology Data Exchange (ETDEWEB)

    Balslev, E; Grossmann, A; Paul, T

    1986-01-01

    Dilation analytic vectors and operators are characterized in a new representation of quantum mechanical states through functions analytic on the upper half-plane. In this space H₀-bounded operators are integral operators and criteria for dilation analyticity are given in terms of analytic continuation outside of the half-plane for functions and for kernels. A sufficient condition is given for an integral operator in momentum space to be dilation-analytic.

  7. Unsupervised detection of salt marsh platforms: a topographic method

    Science.gov (United States)

    Goodwin, Guillaume C. H.; Mudd, Simon M.; Clubb, Fiona J.

    2018-03-01

    Salt marshes filter pollutants, protect coastlines against storm surges, and sequester carbon, yet are under threat from sea level rise and anthropogenic modification. The sustained existence of the salt marsh ecosystem depends on the topographic evolution of marsh platforms. Quantifying marsh platform topography is vital for improving the management of these valuable landscapes. The determination of platform boundaries currently relies on supervised classification methods requiring near-infrared data to detect vegetation, or demands labour-intensive field surveys and digitisation. We propose a novel, unsupervised method to reproducibly isolate salt marsh scarps and platforms from a digital elevation model (DEM), referred to as Topographic Identification of Platforms (TIP). Field observations and numerical models show that salt marshes mature into subhorizontal platforms delineated by subvertical scarps. Based on this premise, we identify scarps as lines of local maxima on a slope raster, then fill landmasses from the scarps upward, thus isolating mature marsh platforms. We test the TIP method using lidar-derived DEMs from six salt marshes in England with varying tidal ranges and geometries, for which topographic platforms were manually isolated from tidal flats. Agreement between manual and unsupervised classification exceeds 94 % for DEM resolutions of 1 m, with all but one site maintaining an accuracy superior to 90 % for resolutions up to 3 m. For resolutions of 1 m, platforms detected with the TIP method are comparable in surface area to digitised platforms and have similar elevation distributions. We also find that our method allows for the accurate detection of local block failures as small as 3 times the DEM resolution. Detailed inspection reveals that although tidal creeks were digitised as part of the marsh platform, unsupervised classification categorises them as part of the tidal flat, causing an increase in false negatives and overall platform
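
    A loose sketch of the TIP idea on a synthetic DEM follows: compute a slope raster, flag high-slope cells as candidate scarps, then treat everything at or above the scarp top as platform. This simplifies the published method, and the toy DEM and thresholds are invented.

```python
# Sketch: scarp detection from a slope raster and an "upward fill" to label the platform.
import numpy as np

# Synthetic 1-m DEM: a low tidal flat, a steep scarp, and a high platform.
x = np.linspace(0, 50, 51)
dem = np.tile(np.where(x < 25, 0.5, 2.0), (51, 1))
dem[:, 23:27] = np.linspace(0.5, 2.0, 4)          # the scarp itself

# Slope magnitude from finite differences (cell size 1 m).
gy, gx = np.gradient(dem)
slope = np.hypot(gx, gy)

# Candidate scarp cells: slope above a chosen threshold (here the 95th percentile).
scarp = slope > np.percentile(slope, 95)

# "Fill upward": everything at or above the scarp-top elevation counts as platform.
platform = dem >= dem[scarp].max() - 0.05
print("platform cells:", int(platform.sum()), "of", dem.size)
```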

  8. Unsupervised detection of salt marsh platforms: a topographic method

    Directory of Open Access Journals (Sweden)

    G. C. H. Goodwin

    2018-03-01

    Full Text Available Salt marshes filter pollutants, protect coastlines against storm surges, and sequester carbon, yet are under threat from sea level rise and anthropogenic modification. The sustained existence of the salt marsh ecosystem depends on the topographic evolution of marsh platforms. Quantifying marsh platform topography is vital for improving the management of these valuable landscapes. The determination of platform boundaries currently relies on supervised classification methods requiring near-infrared data to detect vegetation, or demands labour-intensive field surveys and digitisation. We propose a novel, unsupervised method to reproducibly isolate salt marsh scarps and platforms from a digital elevation model (DEM, referred to as Topographic Identification of Platforms (TIP. Field observations and numerical models show that salt marshes mature into subhorizontal platforms delineated by subvertical scarps. Based on this premise, we identify scarps as lines of local maxima on a slope raster, then fill landmasses from the scarps upward, thus isolating mature marsh platforms. We test the TIP method using lidar-derived DEMs from six salt marshes in England with varying tidal ranges and geometries, for which topographic platforms were manually isolated from tidal flats. Agreement between manual and unsupervised classification exceeds 94 % for DEM resolutions of 1 m, with all but one site maintaining an accuracy superior to 90 % for resolutions up to 3 m. For resolutions of 1 m, platforms detected with the TIP method are comparable in surface area to digitised platforms and have similar elevation distributions. We also find that our method allows for the accurate detection of local block failures as small as 3 times the DEM resolution. Detailed inspection reveals that although tidal creeks were digitised as part of the marsh platform, unsupervised classification categorises them as part of the tidal flat, causing an increase in false negatives

  9. Measurement of baseline and orientation between distributed aerospace platforms.

    Science.gov (United States)

    Wang, Wen-Qin

    2013-01-01

    Distributed platforms play an important role in aerospace remote sensing, radar navigation, and wireless communication applications. However, besides the requirement of highly accurate time and frequency synchronization for coherent signal processing, the baseline between the transmitting platform and the receiving platform and the orientation of the platforms towards each other must be measured in real time during data recording. In this paper, we propose an improved pulsed duplex microwave ranging approach, which allows determining the spatial baseline and orientation between distributed aerospace platforms using the proposed high-precision time-interval estimation method. This approach is novel in the sense that it cancels the effect of oscillator frequency synchronization errors due to the separate oscillators used in the platforms. Several performance specifications are also discussed. The effectiveness of the approach is verified by simulation results.
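
    The cancellation of clock offsets in two-way ("duplex") ranging can be seen in a few lines; the numbers below are illustrative only and do not reproduce the paper's estimator.

```python
# Sketch: averaging the two one-way intervals cancels the unknown clock offset.
C = 299_792_458.0        # speed of light [m/s]

true_baseline = 1250.0   # metres (what we want to recover)
clock_offset = 3.4e-6    # seconds; platform B's clock runs ahead of A's

tof = true_baseline / C                  # one-way time of flight
t_ab = tof + clock_offset                # interval measured at B for the A -> B pulse
t_ba = tof - clock_offset                # interval measured at A for the B -> A pulse

baseline = C * (t_ab + t_ba) / 2.0       # offset cancels in the average
print(f"recovered baseline = {baseline:.3f} m")
```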

  10. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) or the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact involved in each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples, and in polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
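
    A minimal sketch of the colour-coding idea described above: each stage of an analytical procedure receives a low/medium/high impact rating rendered green/yellow/red. The stages and ratings below are made up, and GAPI itself presents this information as a five-pentagram pictogram rather than a table.

```python
# Sketch: map per-stage environmental-impact ratings to the green/yellow/red scale.
IMPACT_COLOUR = {"low": "green", "medium": "yellow", "high": "red"}

procedure = {
    "sample collection":  "medium",
    "sample transport":   "low",
    "sample preparation": "high",
    "reagents/solvents":  "medium",
    "instrumentation":    "low",
}

for stage, impact in procedure.items():
    print(f"{stage:18s} -> {IMPACT_COLOUR[impact]}")
```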

  11. Combined sensing platform for advanced diagnostics in exhaled mouse breath

    Science.gov (United States)

    Fortes, Paula R.; Wilk, Andreas; Seichter, Felicia; Cajlakovic, Merima; Koestler, Stefan; Ribitsch, Volker; Wachter, Ulrich; Vogt, Josef; Radermacher, Peter; Carter, Chance; Raimundo, Ivo M.; Mizaikoff, Boris

    2013-03-01

    Breath analysis is an attractive non-invasive strategy for early disease recognition or diagnosis, and for therapeutic progression monitoring, as quantitative compositional analysis of breath can be related to biomarker panels provided by a specific physiological condition invoked by e.g., pulmonary diseases, lung cancer, breast cancer, and others. As exhaled breath contains comprehensive information on e.g., the metabolic state, and since in particular volatile organic constituents (VOCs) in exhaled breath may be indicative of certain disease states, analytical techniques for advanced breath diagnostics should be capable of sufficient molecular discrimination and quantification of constituents at ppm-ppb - or even lower - concentration levels. While individual analytical techniques such as e.g., mid-infrared spectroscopy may provide access to a range of relevant molecules, some IR-inactive constituents require the combination of IR sensing schemes with orthogonal analytical tools for extended molecular coverage. Combining mid-infrared hollow waveguides (HWGs) with luminescence sensors (LS) appears particularly attractive, as these complementary analytical techniques allow to simultaneously analyze total CO2 (via luminescence), the 12CO2/13CO2 tracer-to-tracee (TTR) ratio (via IR), selected VOCs (via IR) and O2 (via luminescence) in exhaled breath, yet, establishing a single diagnostic platform as both sensors simultaneously interact with the same breath sample volume. In the present study, we take advantage of a particularly compact (shoebox-size) FTIR spectrometer combined with novel substrate-integrated hollow waveguide (iHWG) recently developed by our research team, and miniaturized fiberoptic luminescence sensors for establishing a multi-constituent breath analysis tool that is ideally compatible with mouse intensive care stations (MICU). Given the low tidal volume and flow of exhaled mouse breath, the TTR is usually determined after sample collection via gas

  12. SRL online Analytical Development

    International Nuclear Information System (INIS)

    Jenkins, C.W.

    1991-01-01

    The Savannah River Site is operated by the Westinghouse Savannah River Co. for the Department of Energy to produce special nuclear materials for defense. R&D support for site programs is provided by the Savannah River Laboratory, which I represent. The site is known primarily for its nuclear reactors, but actually three fourths of the efforts at the site are devoted to fuel/target fabrication, fuel/target reprocessing, and waste management. All of these operations rely heavily on chemical processes. The site is therefore a large chemical plant. There are then many potential applications for process analytical chemistry at SRS. The Savannah River Laboratory (SRL) has an Analytical Development Section of roughly 65 personnel that perform analyses for R&D efforts at the lab, act as backup to the site Analytical Laboratories Department and develop analytical methods and instruments. I manage a subgroup of the Analytical Development Section called the Process Control & Analyzer Development Group. The prime mission of this group is to develop online/at-line analytical systems for site applications

  13. Analytical mechanics

    CERN Document Server

    Lemos, Nivaldo A

    2018-01-01

    Analytical mechanics is the foundation of many areas of theoretical physics including quantum theory and statistical mechanics, and has wide-ranging applications in engineering and celestial mechanics. This introduction to the basic principles and methods of analytical mechanics covers Lagrangian and Hamiltonian dynamics, rigid bodies, small oscillations, canonical transformations and Hamilton–Jacobi theory. This fully up-to-date textbook includes detailed mathematical appendices and addresses a number of advanced topics, some of them of a geometric or topological character. These include Bertrand's theorem, proof that action is least, spontaneous symmetry breakdown, constrained Hamiltonian systems, non-integrability criteria, KAM theory, classical field theory, Lyapunov functions, geometric phases and Poisson manifolds. Providing worked examples, end-of-chapter problems, and discussion of ongoing research in the field, it is suitable for advanced undergraduate students and graduate students studying analyt...

  14. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC clusters, and QCDOC

  15. The Creative Platform

    DEFF Research Database (Denmark)

    Byrge, Christian; Hansen, Søren

    whether you consider third-grade teaching, human-resource development, or radical new thinking in product development in a company. The Creative Platform was developed at Aalborg University through a series of research-and-development activities in collaboration with educational institutions and private...

  16. The Construction of Platform Imperialism in the Globalization Era

    Directory of Open Access Journals (Sweden)

    Dal Yong Jin

    2013-01-01

    Full Text Available In the early 21st century, platforms, known as digital media intermediaries, have greatly influenced people’s daily lives. Due to the importance of platforms for the digital economy and culture, including intellectual property and participatory culture, several countries have developed their own social network sites and Web portals. Nonetheless, a handful of Western countries, primarily the U.S., have dominated the global platform market and society. This paper aims to historicize the concept of imperialism in the globalized 21st century. It investigates whether the recent growth of American-based platforms has resulted in a change to the fundamental idea of the imperialism thesis by analyzing the evolutionary nature of imperialism towards platform imperialism. It then addresses whether we are experiencing a new notion of imperialism by mapping out several core characteristics that define platform imperialism, including the swift growth and global dominance of SNSs and smartphones. It pays close attention to the capitalization of platforms and their global expansion, including the major role of intellectual property rights as the most significant form of capital accumulation in the digital age. It eventually endeavors to make a contribution to the platform imperialism discourse as a form of new imperialism, focusing on the nexus of great powers.

  17. Impacts from Partial Removal of Decommissioned Oil and Gas Platforms on Fish Biomass and Production on the Remaining Platform Structure and Surrounding Shell Mounds.

    Science.gov (United States)

    Claisse, Jeremy T; Pondella, Daniel J; Love, Milton; Zahn, Laurel A; Williams, Chelsea M; Bull, Ann S

    2015-01-01

    When oil and gas platforms become obsolete they go through a decommissioning process. This may include partial removal (from the surface to 26 m depth) or complete removal of the platform structure. While complete removal would likely eliminate most of the existing fish biomass and associated secondary production, we find that the potential impacts of partial removal would likely be limited on all but one platform off the coast of California. On average 80% of fish biomass and 86% of secondary fish production would be retained after partial removal, with above 90% retention expected for both metrics on many platforms. Partial removal would likely result in the loss of fish biomass and production for species typically found residing in the shallow portions of the platform structure. However, these fishes generally represent a small proportion of the fishes associated with these platforms. More characteristic of platform fauna are the primarily deeper-dwelling rockfishes (genus Sebastes). "Shell mounds" are biogenic reefs that surround some of these platforms resulting from an accumulation of mollusk shells that have fallen from the shallow areas of the platforms mostly above the depth of partial removal. We found that shell mounds are moderately productive fish habitats, similar to or greater than natural rocky reefs in the region at comparable depths. The complexity and areal extent of these biogenic habitats, and the associated fish biomass and production, will likely be reduced after either partial or complete platform removal. Habitat augmentation by placing the partially removed platform superstructure or some other additional habitat enrichment material (e.g., rock boulders) on the seafloor adjacent to the base of partially removed platforms provides additional options to enhance fish production, potentially mitigating reductions in shell mound habitat.

  18. Impacts from Partial Removal of Decommissioned Oil and Gas Platforms on Fish Biomass and Production on the Remaining Platform Structure and Surrounding Shell Mounds.

    Directory of Open Access Journals (Sweden)

    Jeremy T Claisse

    Full Text Available When oil and gas platforms become obsolete they go through a decommissioning process. This may include partial removal (from the surface to 26 m depth) or complete removal of the platform structure. While complete removal would likely eliminate most of the existing fish biomass and associated secondary production, we find that the potential impacts of partial removal would likely be limited on all but one platform off the coast of California. On average 80% of fish biomass and 86% of secondary fish production would be retained after partial removal, with above 90% retention expected for both metrics on many platforms. Partial removal would likely result in the loss of fish biomass and production for species typically found residing in the shallow portions of the platform structure. However, these fishes generally represent a small proportion of the fishes associated with these platforms. More characteristic of platform fauna are the primarily deeper-dwelling rockfishes (genus Sebastes). "Shell mounds" are biogenic reefs that surround some of these platforms resulting from an accumulation of mollusk shells that have fallen from the shallow areas of the platforms mostly above the depth of partial removal. We found that shell mounds are moderately productive fish habitats, similar to or greater than natural rocky reefs in the region at comparable depths. The complexity and areal extent of these biogenic habitats, and the associated fish biomass and production, will likely be reduced after either partial or complete platform removal. Habitat augmentation by placing the partially removed platform superstructure or some other additional habitat enrichment material (e.g., rock boulders) on the seafloor adjacent to the base of partially removed platforms provides additional options to enhance fish production, potentially mitigating reductions in shell mound habitat.

  19. A novel rotating experimental platform in a superconducting magnet.

    Science.gov (United States)

    Chen, Da; Cao, Hui-Ling; Ye, Ya-Jing; Dong, Chen; Liu, Yong-Ming; Shang, Peng; Yin, Da-Chuan

    2016-08-01

    This paper introduces a novel platform designed to be used in a strong static magnetic field (in a superconducting magnet). The platform is a sample holder that rotates in the strong magnetic field. Any samples placed in the platform will rotate due to the rotation of the sample holder. With this platform, a number of experiments such as material processing, culture of biological systems, chemical reactions, or other processes can be carried out. In this report, we present some preliminary experiments (protein crystallization, cell culture, and seed germination) conducted using this platform. The experimental results showed that the platform can affect the processes, indicating that it provides a novel environment that has not been investigated before and that the effects of such an environment on many different physical, chemical, or biological processes can be potentially useful for applications in many fields.

  20. Embedded Linux platform for data acquisition systems

    International Nuclear Information System (INIS)

    Patel, Jigneshkumar J.; Reddy, Nagaraj; Kumari, Praveena; Rajpal, Rachana; Pujara, Harshad; Jha, R.; Kalappurakkal, Praveen

    2014-01-01

    Highlights: • The design and development of a data acquisition system on an FPGA-based reconfigurable hardware platform. • Embedded Linux configuration and compilation for FPGA-based systems. • Hardware logic IP core and Linux device driver development for an external peripheral, to interface it with the FPGA-based system. - Abstract: This scalable hardware–software system is designed and developed to explore emerging open standards for the data acquisition requirements of Tokamak experiments. To address the future need for a scalable data acquisition and control system for fusion experiments, we have explored the capability of a software platform using an open-source embedded Linux operating system on a programmable hardware platform such as an FPGA. The idea was to identify a platform that is customizable, flexible and scalable enough to support the data acquisition system requirements. To do this, we selected an FPGA-based reconfigurable and scalable hardware platform, with an embedded Linux operating system for flexibility in software development and a Gigabit Ethernet interface for high-speed data transactions. The proposed hardware–software platform using an FPGA and embedded Linux OS offers a single-chip solution with a processor and peripherals such as an ADC interface controller, Gigabit Ethernet controller and memory controller, amongst others. The embedded Linux platform for data acquisition is implemented and tested on a Virtex-5 FXT FPGA ML507, which has a PowerPC 440 (PPC440) [2] hard block on the FPGA. For this work, we used the Linux kernel version 2.6.34 with BSP support for the ML507 platform. It is downloaded from the Xilinx [1] GIT server. The cross-compiler tool chain is created using the Buildroot scripts. The Linux kernel and root file system are configured and compiled using the cross-tools to support the hardware platform. The analog-to-digital converter (ADC) IO module is designed and interfaced with the ML507 through Xilinx