WorldWideScience

Sample records for platform positioning computer

  1. Platform computing

    CERN Multimedia

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  2. Platform Architecture for Decentralized Positioning Systems

    Directory of Open Access Journals (Sweden)

    Zakaria Kasmi

    2017-04-01

    A platform architecture for positioning systems is essential for the realization of a flexible localization system that interacts with other systems and supports various positioning technologies and algorithms. Decentralized processing of a position pushes application-level knowledge into the mobile station and avoids communication with a central unit such as a server or a base station. In addition, computing the position on low-cost, resource-constrained devices is challenging because of their limited computing power, storage capacity, and power supply. We therefore propose a platform architecture that enables the design of a system with reusable components, extensibility (e.g., with other positioning technologies), and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collection or preprocessing on top of an operating system. The platform architecture is designed, implemented, and evaluated on the basis of two positioning systems: a field-strength system and a time-of-arrival-based positioning system.
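
    As an illustration of the kind of on-device computation this record describes, the sketch below computes a time-of-arrival (ToA) position fix via linearized least squares, using only the Python standard library so it could plausibly run on a resource-constrained device. The anchor layout, constants, and function names are illustrative assumptions, not the authors' implementation.

```python
import math

C = 299_792_458.0  # assumed propagation speed (m/s); illustrative constant

def toa_position(anchors, toas):
    """Estimate (x, y) from known anchor coordinates and ToA measurements.

    Linearizes the range equations r_i^2 = (x - x_i)^2 + (y - y_i)^2 by
    subtracting the first one, then solves the 2x2 normal equations
    directly, so no matrix library is required.
    """
    ranges = [C * t for t in toas]
    (x0, y0), r0 = anchors[0], ranges[0]
    rows, rhs = [], []
    for (xi, yi), ri in zip(anchors[1:], ranges[1:]):
        rows.append((2.0 * (xi - x0), 2.0 * (yi - y0)))
        rhs.append(xi * xi + yi * yi - x0 * x0 - y0 * y0 + r0 * r0 - ri * ri)
    # Normal equations (A^T A) p = A^T b for the 2-vector p = (x, y)
    s11 = sum(a * a for a, _ in rows)
    s12 = sum(a * b for a, b in rows)
    s22 = sum(b * b for _, b in rows)
    t1 = sum(a * v for (a, _), v in zip(rows, rhs))
    t2 = sum(b * v for (_, b), v in zip(rows, rhs))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)

# Toy check: three anchors, ToAs generated for a tag actually at (2, 1)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
toas = [math.hypot(2 - x, 1 - y) / C for x, y in anchors]
print(toa_position(anchors, toas))  # ~ (2.0, 1.0)
```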

  3. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Wegner, P.; Wettig, T.

    2003-09-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC. (orig.)

  4. Benchmarking computer platforms for lattice QCD applications

    International Nuclear Information System (INIS)

    Hasenbusch, M.; Jansen, K.; Pleiter, D.; Stueben, H.; Wegner, P.; Wettig, T.; Wittig, H.

    2004-01-01

    We define a benchmark suite for lattice QCD and report on benchmark results from several computer platforms. The platforms considered are apeNEXT, CRAY T3E, Hitachi SR8000, IBM p690, PC-Clusters, and QCDOC.

  5. Competitive Positioning of Complementors on Digital Platforms

    DEFF Research Database (Denmark)

    Wessel, Michael; Thies, Ferdinand; Benlian, Alexander

    2017-01-01

    ...markets. With increasing numbers of products and services offered via platforms, signals such as popularity and reputation have become critical market mechanisms that affect the decision-making processes of end users. In this paper, we examine the positioning strategies of new hosts on Airbnb, a platform focused on accommodation sharing, to understand how they attempt to cope with the inherent lack of credible quality signals as they join the platform. By analyzing close to 47,000 listings, we find that new hosts follow a cost-leadership strategy rather than trying to differentiate their offerings...

  6. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  7. Architectural analysis for wirelessly powered computing platforms

    NARCIS (Netherlands)

    Kapoor, A.; Pineda de Gyvez, J.

    2013-01-01

    We present a design framework for wirelessly powered generic computing platforms that takes into account various system parameters in response to a time-varying energy source. These parameters are the charging profile of the energy source, computing speed (fclk), digital supply voltage (VDD), energy...

  8. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

    Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  9. Traffic information computing platform for big data

    International Nuclear Information System (INIS)

    Duan, Zongtao; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun

    2014-01-01

    The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the connotation and technological characteristics of big data and traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this type of traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  10. A mobile and portable trusted computing platform

    Directory of Open Access Journals (Sweden)

    Nepal Surya

    2011-01-01

    The mechanism of establishing trust in a computing platform is tightly coupled with the characteristics of a specific machine. This limits the portability and mobility of trust demanded by many emerging applications that go beyond organizational boundaries. To address this problem, we propose a mobile and portable trusted computing platform in the form of a USB device. First, we describe the design and implementation of the hardware and software architectures of the device. We then demonstrate the capabilities of the proposed device by developing a trusted application.

  11. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which tend to be based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open-source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework, as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  12. A signal strength priority based position estimation for mobile platforms

    Science.gov (United States)

    Kalgikar, Bhargav; Akopian, David; Chen, Philip

    2010-01-01

    Global Positioning System (GPS) products help to navigate while driving, hiking, boating, and flying. GPS uses a combination of orbiting satellites to determine position coordinates. This works well in most outdoor areas, but the satellite signals are not strong enough to penetrate inside most indoor environments. As a result, a new strain of indoor positioning technologies that make use of 802.11 wireless LANs (WLAN) is beginning to appear on the market. In WLAN positioning, the system either monitors propagation delays between wireless access points and wireless device users to apply trilateration techniques, or it maintains a database of location-specific signal fingerprints, which is used to identify the most likely match between incoming signal data and those previously surveyed and saved in the database. In this paper we investigate the issue of deploying WLAN positioning software on mobile platforms with typically limited computational resources. We suggest a novel received-signal-strength rank-order-based location estimation system that reduces computational loads while maintaining robust performance. The proposed system's performance is compared to conventional approaches.
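
    To make the rank-order idea concrete, here is a hedged sketch (not the authors' code) of fingerprint matching in which access points are compared by their signal-strength rank rather than by raw dBm values; the radio map, penalty constant, and names below are made up for illustration.

```python
def rank_vector(rss):
    """Map an RSS reading {ap_id: dBm} to {ap_id: rank}, strongest first."""
    ordered = sorted(rss, key=rss.get, reverse=True)
    return {ap: i for i, ap in enumerate(ordered)}

def rank_distance(ranks_a, ranks_b, penalty=10):
    """Spearman-footrule-style distance; unseen APs incur a fixed penalty."""
    aps = set(ranks_a) | set(ranks_b)
    return sum(abs(ranks_a.get(ap, penalty) - ranks_b.get(ap, penalty))
               for ap in aps)

def locate(reading, fingerprints):
    """Return the surveyed location whose fingerprint best matches `reading`."""
    ranks = rank_vector(reading)
    return min(fingerprints,
               key=lambda loc: rank_distance(ranks, rank_vector(fingerprints[loc])))

# Toy radio map: two surveyed locations, three access points (values in dBm)
fingerprints = {
    "room_101": {"ap1": -40, "ap2": -65, "ap3": -80},
    "room_102": {"ap1": -75, "ap2": -50, "ap3": -60},
}
print(locate({"ap1": -42, "ap2": -70, "ap3": -78}, fingerprints))  # room_101
```

    Comparing integer ranks instead of floating-point distances is one way such a scheme can reduce load on a weak mobile CPU.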

  13. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

    Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  14. Smart SOA platforms in cloud computing architectures

    CERN Document Server

    Exposito , Ernesto

    2014-01-01

    This book is intended to introduce the principles of the Event-Driven and Service-Oriented Architecture (SOA 2.0) and its role in the new interconnected world based on the cloud computing architecture paradigm. In this new context, the concept of "service" is widely applied to the hardware and software resources available in the new generation of the Internet. The authors focus on how current and future SOA technologies provide the basis for the smart management of the service model provided by the Platform as a Service (PaaS) layer.

  15. Bioinformatics on the Cloud Computing Platform Azure

    Science.gov (United States)

    Shanahan, Hugh P.; Owen, Anne M.; Harrison, Andrew P.

    2014-01-01

    We discuss the applicability of the Microsoft cloud computing platform, Azure, for bioinformatics. We focus on the usability of the resource rather than its performance. We provide an example of how R can be used on Azure to analyse a large amount of microarray expression data deposited at the public database ArrayExpress. We provide a walk-through to demonstrate explicitly how Azure can be used to perform these analyses in Appendix S1 and we offer a comparison with a local computation. We note that the use of the Platform as a Service (PaaS) offering of Azure can represent a steep learning curve for bioinformatics developers who will usually have a Linux and scripting language background. On the other hand, the presence of an additional set of libraries makes it easier to deploy software in a parallel (scalable) fashion and explicitly manage such a production run with only a few hundred lines of code, most of which can be incorporated from a template. We propose that this environment is best suited for running stable bioinformatics software by users not involved with its development. PMID:25050811

  16. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

    Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster, and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosted cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity for the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties. We...

  17. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  18. Navigation and Positioning System Using High Altitude Platforms Systems (HAPS)

    Science.gov (United States)

    Tsujii, Toshiaki; Harigae, Masatoshi; Harada, Masashi

    Recently, some countries have begun conducting feasibility studies and R&D projects on High Altitude Platform Systems (HAPS). Japan has been investigating the use of an airship system that will function as a stratospheric platform for applications such as environmental monitoring, communications and broadcasting. If pseudolites were mounted on the airships, their GPS-like signals would be stable augmentations that would improve the accuracy, availability, and integrity of GPS-based positioning systems. Moreover, a sufficient number of HAPS could function as a positioning system independent of GPS. In this paper, a system design of the HAPS-based positioning system and its positioning error analyses are described.
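
    For context, the measurement model underlying such GPS-like positioning is the standard pseudorange equation; the notation below is a generic textbook formulation, not necessarily the paper's.

```latex
% Pseudorange from platform i at known position (x_i, y_i, z_i) to a receiver
% at unknown (x, y, z), with receiver clock bias \delta t and noise \varepsilon_i.
\rho_i = \sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2} + c\,\delta t + \varepsilon_i,
\qquad i = 1, \dots, n .
```

    With four unknowns (three coordinates plus the receiver clock bias), at least four visible platforms are needed; the system is typically solved by iterative linearization, and the platform geometry determines the dilution of precision that a positioning error analysis would quantify.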

  19. Multi-platform Integrated Positioning and Attitude Determination using GNSS

    NARCIS (Netherlands)

    Buist, P.J.

    2013-01-01

    There is a trend in spacecraft engineering toward distributed systems in which a number of smaller spacecraft work as a larger satellite. However, in order to make the small satellites work together as a single large platform, the precise relative positions (baseline) and orientations (attitude) of the...

  20. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the amount of biological data is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in the life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed, as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study comparing the efficiency of different computing platforms for handling the classical biological sequence alignment problem. Finally, we discuss the open issues in big biological data analytics.

  1. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    ...the user. Commercial software packages, such as MATLAB, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
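
    As a minimal illustration of the byte-ordering problem described above (the record layout here is hypothetical, not the authors' format), Python's struct module can pin an explicit endianness so a binary file reads identically on big- and little-endian hosts:

```python
import struct

def write_samples(path, samples):
    """Write a count followed by IEEE-754 float32 samples, little-endian ('<')."""
    with open(path, "wb") as f:
        f.write(struct.pack("<I", len(samples)))
        f.write(struct.pack("<%df" % len(samples), *samples))

def read_samples(path):
    """Read the block back; identical on any host because '<' fixes the order."""
    with open(path, "rb") as f:
        (n,) = struct.unpack("<I", f.read(4))
        return list(struct.unpack("<%df" % n, f.read(4 * n)))

write_samples("trace.bin", [0.1, -2.5, 3.25])
print(read_samples("trace.bin"))  # values come back rounded to float32 precision
```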

  2. Platform-independent method for computer aided schematic drawings

    Science.gov (United States)

    Vell, Jeffrey L [Slingerlands, NY; Siganporia, Darius M [Clifton Park, NY; Levy, Arthur J [Fort Lauderdale, FL

    2012-02-14

    A CAD/CAM method is disclosed for a computer system to capture and interchange schematic drawing and associated design information. The schematic drawing and design information are stored in an extensible, platform-independent format.

  3. Trusted computing platforms TPM2.0 in context

    CERN Document Server

    Proudler, Graeme; Dalton, Chris

    2015-01-01

    In this book the authors first describe the background of trusted platforms and trusted computing and speculate about the future. They then describe the technical features and architectures of trusted platforms from several different perspectives, finally explaining second-generation TPMs, including a technical description intended to supplement the Trusted Computing Group's TPM2 specifications. The intended audience is IT managers and engineers and graduate students in information security.

  4. ALICE Connex : Mobile Volunteer Computing and Edutainment Platform

    CERN Document Server

    Chalumporn, Gantaphon

    2016-01-01

    Mobile devices are very powerful and continue to be developed rapidly. They have functions that are used in everyday life, and one of their main roles is as entertainment or gaming platforms. Many technologies are now accepted and adopted to improve the potential of education. Edutainment combines entertainment and education media to make use of the benefits of both. In this work, we introduce the design of an edutainment platform which is part of a mobile volunteer computing and edutainment platform called 'ALICE Connex' for ALICE at CERN. The edutainment platform focuses on delivering enjoyment and education, while promoting ALICE and the volunteer computing platform to the general public. The design in this work describes the functionality to build effective edutainment with real-time multiplayer interaction on round-based gameplay, while integrating seamless edutainment with basic particle physics content through game mechanics and item design. For the assessment method we will observe the enjoyment o...

  5. Micromagnetics on high-performance workstation and mobile computational platforms

    Science.gov (United States)

    Fu, S.; Chang, R.; Couture, S.; Menarini, M.; Escobar, M. A.; Kuteifan, M.; Lubarda, M.; Gabay, D.; Lomakin, V.

    2015-05-01

    The feasibility of using high-performance desktop and embedded mobile computational platforms is presented, including multi-core Intel central processing units, Nvidia desktop graphics processing units, and the Nvidia Jetson TK1 platform. The FastMag finite-element-method-based micromagnetic simulator is used as a testbed, showing high efficiency on all the platforms. Optimization aspects of improving the performance of the mobile systems are discussed. The high performance, low cost, low power consumption, and rapid performance increase of embedded mobile systems make them a promising candidate for micromagnetic simulations. Such architectures can be used as standalone systems or can be built into low-power computing clusters.

  6. Study on the application of mobile internet cloud computing platform

    Science.gov (United States)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which is in essence the substitution and exchange of resource service models that meets users' needs for different resources after multiple adjustments. Cloud computing offers advantages in many respects: it reduces the difficulty of operating the system and makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in the operation process. The popularization of computer technology has driven people to create digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform, allowing users to access the necessary information resources at any time. Cloud computing, in turn, distributes computations across a large number of distributed computers and hence implements the connected service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can therefore be used to analyze the key technologies of cloud computing.

  7. Infrastructure and Management Model for a Cloud Computing-Based Server Platform

    Directory of Open Access Journals (Sweden)

    Mulki Indana Zulfa

    2017-11-01

    Cloud computing is a new technology that is still growing very rapidly. This technology makes the Internet the main medium for the remote management of data and applications. Cloud computing allows users to run an application without having to think about the infrastructure and its platforms. Other technical aspects, such as memory, storage, and backup and restore, can be handled very easily. This research models the infrastructure and management of a computing platform in the computer network of the Faculty of Engineering, University of Jenderal Soedirman. The first stage in this research is a literature study, identifying implementation models from previous research. The results are then combined with a new approach to existing resources and implemented directly on the existing server network. The results show that the implementation of cloud computing technology is able to replace the existing network platform.

  8. Enhancing Trusted Cloud Computing Platform for Infrastructure as a Service

    Directory of Open Access Journals (Sweden)

    KIM, H.

    2017-02-01

    The characteristics of cloud computing, including on-demand self-service, resource pooling, and rapid elasticity, have made it grow in popularity. However, security concerns still obstruct widespread adoption of cloud computing in industry. In particular, security risks related to virtual machines make cloud users worry about exposure of their private data in IaaS environments. In this paper, we propose an enhanced trusted cloud computing platform to provide confidentiality and integrity of the user's data and computation. The presented platform provides secure and efficient virtual machine management protocols, not only to protect against eavesdropping and tampering during transfer but also to guarantee that the virtual machine is hosted only on trusted cloud nodes, protecting against inside attackers. The protocols utilize both symmetric-key and public-key operations together with an efficient node authentication model; hence both the computational cost of cryptographic operations and the number of communication steps are significantly reduced. As a result, simulation shows that the performance of the proposed platform is approximately double that of previous platforms. The proposed platform eliminates these concerns by providing confidentiality and integrity of users' private data with better performance, and thus contributes to wider industry adoption of cloud computing.

  9. Energy Consumption Management of Virtual Cloud Computing Platform

    Science.gov (United States)

    Li, Lin

    2017-11-01

    To advance research on energy consumption management for virtual cloud computing platforms, the energy consumption of virtual machines and of the cloud platform itself must be understood more deeply; only then can the problems facing energy consumption management be solved. The key problem is data centers with high energy consumption, so new scientific techniques are greatly needed. Virtualization technology and cloud computing have become powerful tools in real life, work, and production because of their strengths and many advantages, and they are now developing rapidly, with very high resource utilization rates. The presence of virtualization and cloud computing technologies is therefore essential in the constantly developing information age. This paper summarizes, explains, and further analyzes the energy consumption management questions of the virtual cloud computing platform. It ultimately gives a clearer understanding of energy consumption management of virtual cloud computing platforms and brings more help to various aspects of people's lives and work.

  10. Cloud computing for comparative genomics with windows azure platform.

    Science.gov (United States)

    Kim, Insik; Jung, Jae-Yoon; Deluca, Todd F; Nelson, Tristan H; Wall, Dennis P

    2012-01-01

    Cloud computing services have emerged as a cost-effective alternative to cluster systems as the number of genomes and the computational power required to analyze them have increased in recent years. Here we introduce the Microsoft Azure platform with detailed execution steps and a cost comparison with Amazon Web Services.

  11. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using the prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  12. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is based on a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using the prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As other examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  13. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  14. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also stressed the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Mathlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. Also, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and effectively robust multi-discipline simulations for the decades to come. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future.

  15. Resilient workflows for computational mechanics platforms

    Science.gov (United States)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also stressed the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Mathlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. Also, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and effectively robust multi-discipline simulations for the decades to come [28]. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight-tests and certification of aircraft in the future [23, 24, 29].

  16. Determination of UAV position using high accuracy navigation platform

    Directory of Open Access Journals (Sweden)

    Ireneusz Kubicki

    2016-07-01

    The choice of navigation system for a mini UAV is very important for its application and operation, particularly when a synthetic aperture radar installed on board requires highly precise information about the object's position. The presented exemplary solution of such a system draws attention to the possible problems associated with the use of appropriate technology, sensors, and devices, or with a complete navigation system. The position and spatial-orientation errors of the measurement platform influence the obtained SAR imaging. Both turbulence and maneuvers performed during flight change the position of the airborne object, resulting in deterioration or loss of SAR images. Consequently, it is necessary to reduce or eliminate the impact of sensor errors on the UAV position accuracy. Compromise solutions must be sought between newer, better technologies and software-based corrections. Keywords: navigation systems, unmanned aerial vehicles, sensor integration
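
    One common low-cost way to reduce the impact of sensor errors, shown here purely as a generic illustration and not as the paper's method, is a complementary filter that fuses a drifting high-rate gyro with a noisy but drift-free accelerometer tilt estimate; all constants below are assumptions.

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend integrated gyro rate with an absolute accelerometer angle.

    alpha close to 1 trusts the gyro over short horizons, while the
    (1 - alpha) accelerometer term slowly pulls out the gyro's drift.
    """
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy update loop: (gyro rate in rad/s, accelerometer tilt in rad) per sample
angle = 0.0
for gyro, accel in [(0.5, 0.02), (0.5, 0.03), (0.4, 0.05)]:
    angle = complementary_filter(angle, gyro, accel, dt=0.01)
print(angle)
```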

  17. Regional Platform on Personal Computer Electronic Waste in Latin ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Regional Platform on Personal Computer Electronic Waste in Latin America and the Caribbean. This project aims to identify environmentally responsible and sustainable solutions to the problem of e-waste.

  18. Computational Platform About Amazon Web Services (Aws Distributed Rendering

    Directory of Open Access Journals (Sweden)

    Gabriel Rojas-Albarracín

    2017-09-01

    Today's media landscape has created a dynamic in which people require ever higher image quality in different media formats (games, movies, animations). Higher definition usually requires processing larger images, which brings the need for increased computing power. This paper presents a case study of the implementation of a low-cost platform on the Amazon cloud for parallel processing of images and animation.

  19. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    Science.gov (United States)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

    Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.

  20. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    Science.gov (United States)

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  1. +Cloud: An Agent-Based Cloud Computing Platform

    OpenAIRE

    González, Roberto; Hernández de la Iglesia, Daniel; de la Prieta Pintado, Fernando; Gil González, Ana Belén

    2017-01-01

    Cloud computing is revolutionizing the services provided through the Internet, and is continually adapting itself in order to maintain the quality of its services. This study presents the platform +Cloud, which proposes a cloud environment for storing information and files by following the cloud paradigm. This study also presents Warehouse 3.0, a cloud-based application that has been developed to validate the services provided by +Cloud.

  2. Artificial and Computational Intelligence for Games on Mobile Platforms

    OpenAIRE

    Congdon, Clare Bates; Hingston, Philip; Kendall, Graham

    2013-01-01

    In this chapter, we consider the possibilities of creating new and innovative games that are targeted for mobile devices, such as smart phones and tablets, and that showcase AI (Artificial Intelligence) and CI (Computational Intelligence) approaches. Such games might take advantage of the sensors and facilities that are not available on other platforms, or might simply rely on the "app culture" to facilitate getting the games into users' hands. While these games might be profitable in themsel...

  3. Development of a Very Dense Liquid Cooled Compute Platform

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.

  4. Fundamentals of power integrity for computer platforms and systems

    CERN Document Server

    DiBene, Joseph T

    2014-01-01

    An all-encompassing text that focuses on the fundamentals of power integrity. Power integrity is the study of power distribution from the source to the load and the system-level issues that can occur across it. For computer systems, these issues can range from inside the silicon to across the board and may egress into other parts of the platform, including thermal, EMI, and mechanical. With a focus on computer systems and silicon-level power delivery, this book sheds light on the fundamentals of power integrity, utilizing the author's extensive background in the power integrity industry and un...

  5. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
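
    The computational independence the authors exploit is easy to see in code. Below is a minimal permutation-test sketch with made-up data: because each iteration depends only on its seed, the `pool.map` call can be redirected to a cluster queue or to cloud workers without changing the statistics.

```python
import random
from multiprocessing import Pool

GROUP_A = [2.9, 3.1, 3.4, 3.0, 3.6]  # hypothetical measurements
GROUP_B = [3.5, 3.8, 3.6, 4.0, 3.7]

def mean(xs):
    return sum(xs) / len(xs)

def one_permutation(seed):
    """Recompute the group-mean difference after one random relabeling."""
    rng = random.Random(seed)
    pooled = GROUP_A + GROUP_B
    rng.shuffle(pooled)
    return mean(pooled[:len(GROUP_A)]) - mean(pooled[len(GROUP_A):])

if __name__ == "__main__":
    observed = mean(GROUP_A) - mean(GROUP_B)
    with Pool() as pool:  # iterations are independent, so they parallelize
        null = pool.map(one_permutation, range(10_000))
    p = sum(abs(d) >= abs(observed) for d in null) / len(null)
    print(f"observed={observed:.3f}  p={p:.4f}")
```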

  6. Performance of scientific computing platforms with MCNP4B

    International Nuclear Information System (INIS)

    McLaughlin, H.E.; Hendricks, J.S.

    1998-01-01

    Several computing platforms were evaluated with the MCNP4B Monte Carlo radiation transport code. The DEC AlphaStation 500/500 was the fastest to run MCNP4B. Compared to the HP 9000-735, the fastest platform 4 yr ago, the AlphaStation is 335% faster, the HP C180 is 133% faster, the SGI Origin 2000 is 82% faster, the Cray T94/4128 is 1% faster, the IBM RS/6000-590 is 93% as fast, the DEC 3000/600 is 81% as fast, the Sun Sparc20 is 57% as fast, the Cray YMP 8/8128 is 57% as fast, the Sun Sparc5 is 33% as fast, and the Sun Sparc2 is 13% as fast. All results presented are reproducible and allow for comparison to computer platforms not included in this study. Timing studies are seen to be very problem dependent. The performance gains resulting from advances in software were also investigated. Various compilers and operating systems were seen to have a modest impact on performance, whereas hardware improvements have resulted in a factor of 4 improvement. MCNP4B also ran approximately as fast as MCNP4A.

  7. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  8. Atomdroid: a computational chemistry tool for mobile platforms.

    Science.gov (United States)

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
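
    To give a flavor of the kind of force-field sampling such a program performs (this is an illustrative toy in reduced units, not Atomdroid's implementation), here is a Metropolis Monte Carlo walk on a Lennard-Jones dimer:

```python
import math
import random

def lj_energy(r, epsilon=1.0, sigma=1.0):
    """12-6 Lennard-Jones pair energy in reduced units."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

def metropolis(r=1.5, beta=5.0, step=0.05, n_steps=20_000, seed=0):
    """Sample the dimer separation r at inverse temperature beta."""
    rng = random.Random(seed)
    e = lj_energy(r)
    for _ in range(n_steps):
        r_new = max(0.5, r + rng.uniform(-step, step))
        e_new = lj_energy(r_new)
        # Accept downhill moves always, uphill moves with Boltzmann probability
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            r, e = r_new, e_new
    return r

print(metropolis())  # hovers near the LJ minimum at 2**(1/6) * sigma ~ 1.12
```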

  9. Overview of Parallel Platforms for Common High Performance Computing

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2012-04-01

    The paper deals with various parallel platforms used for high-performance computing in the signal processing domain. More precisely, methods exploiting multicore central processing units, such as the message passing interface and OpenMP, are taken into account. The properties of the programming methods are experimentally verified using a fast Fourier transform and a discrete cosine transform, and they are compared with the possibilities of MATLAB's built-in functions and Texas Instruments digital signal processors with very-long-instruction-word architectures. New FFT and DCT implementations were proposed and tested. The implementation phase was compared with CPU-based computing methods and with the possibilities of the Texas Instruments digital signal processing library on C6747 floating-point DSPs. An optimal combination of computing methods in the signal processing domain is proposed, together with new, fast routine implementations.
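
    As an example of the kind of fast routine being benchmarked, the sketch below computes an unnormalized DCT-II from a single same-length FFT using Makhoul's reordering; NumPy is assumed, and this is a generic textbook trick rather than the implementation proposed in the paper.

```python
import numpy as np

def dct2_via_fft(x):
    """Unnormalized DCT-II of a 1-D real array via one complex FFT.

    Reorders the input (even-indexed samples, then odd-indexed samples
    reversed), applies an FFT, and rotates each bin by a half-sample phase.
    The result should match scipy's dct(x, type=2) with no normalization.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    v = np.concatenate((x[::2], x[1::2][::-1]))
    phase = np.exp(-1j * np.pi * np.arange(n) / (2 * n))
    return 2.0 * np.real(phase * np.fft.fft(v))

print(dct2_via_fft([1.0, 2.0]))  # [ 6.         -1.41421356]
```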

  10. Analytical simulation platform describing projections in computed tomography systems

    International Nuclear Information System (INIS)

    Youn, Hanbean; Kim, Ho Kyung

    2013-01-01

    To reduce patient dose, several approaches, such as spectral imaging using photon-counting detectors and statistical image reconstruction, are being considered. Although image-reconstruction algorithms may significantly enhance image quality in reconstructed images at low dose, the true signal-to-noise properties are mainly determined by the image quality of the projections. We are developing an analytical simulation platform describing projections in order to investigate how the quantum-interaction physics in each component of a CT system affects image quality in the projections. This simulator will be very useful for improved design or optimization of CT systems as well as for the development of novel image-reconstruction algorithms. In this study, we present the progress of the development of the simulation platform, with an emphasis on the theoretical framework describing the generation of projection data. The remaining work before the meeting includes the following: each stage in the cascaded signal-transfer model for obtaining projections will be validated against Monte Carlo simulations; we will build energy-dependent scatter and pixel-crosstalk kernels and show their effects on image quality in projections and reconstructed images; and we will investigate the effects of projections obtained under various imaging conditions and system (or detector) operation parameters on reconstructed images. It is challenging to include the interaction physics of photon-counting detectors in the simulation platform. Detailed descriptions of the simulator will be presented, with discussions of its performance and limitations as well as Monte Carlo validations. Computational cost will also be addressed in detail. The proposed method is simple and can be used conveniently in a lab environment.
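
    For orientation, the core quantity such a projection simulator must reproduce is the spectral Beer-Lambert line integral; this is a standard CT forward model in generic notation, not the authors' formulation.

```latex
% Mean detector signal for ray L(u): incident spectrum \Phi_0(E), attenuation
% coefficient \mu(\mathbf{r}, E), and energy-weighted detector response \eta(E).
\bar{q}(u) = \int \Phi_0(E)\,
  \exp\!\left(-\int_{L(u)} \mu(\mathbf{r}, E)\,\mathrm{d}\ell\right)
  E\,\eta(E)\,\mathrm{d}E .
```

    A cascaded signal-transfer model of the kind mentioned above would then act stage by stage on this mean signal and its noise.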

  11. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  12. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  13. Essential Means for Urban Computing : Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    NARCIS (Netherlands)

    Nourian, P.; Martinez-Ortiz, Carlos; Arroyo Ohori, G.A.K.

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages,

  14. The potential benefits of photonics in the computing platform

    Science.gov (United States)

    Bautista, Jerry

    2005-03-01

    The increase in computational requirements for real-time image processing, complex computational fluid dynamics, very-large-scale data mining in the health industry/Internet, and predictive models for financial markets is driving computer architects to consider new paradigms that rely upon very-high-speed interconnects within and between computing elements. Further challenges result from reduced power requirements, reduced transmission latency, and greater interconnect density. Optical interconnects may solve many of these problems, with the added benefit of extended reach. In addition, photonic interconnects provide relative EMI immunity, which is becoming an increasing issue with greater dependence on wireless connectivity. However, to be truly functional, the optical interconnect mesh should be able to support arbitration, addressing, etc. completely in the optical domain, with a BER that is more stringent than "traditional" communication requirements. Outlined are challenges in the advanced computing environment, some possible optical architectures and relevant platform technologies, and a rough sizing of these opportunities, which are quite large relative to the more "traditional" optical markets.

  15. Application research of cloud computing in emergency system platform of nuclear accidents

    International Nuclear Information System (INIS)

    Zhang Yan; Yue Huiguo; Lin Quanyi; Yue Feng

    2013-01-01

This paper describes the key technologies behind the concept of cloud computing, its service types and implementation methods. In view of the upgrade requirements of the nuclear accident emergency system platform, the paper also proposes an application design for a private cloud computing platform, and analyzes the safety of the cloud platform and the characteristics of cloud disaster recovery. (authors)

  16. Can Nuclear Installations and Research Centres Adopt Cloud Computing Platform?

    International Nuclear Information System (INIS)

    Pichan, A.; Lazarescu, M.; Soh, S.T.

    2015-01-01

Cloud Computing is arguably one of the most recent and highly significant advances in information technology today. It produces transformative changes in the history of computing and presents many promising technological and economic opportunities. The pay-per-use model, the computing power, abundance of storage, skilled resources, fault tolerance and the economy of scale it offers provide significant advantages to enterprises to adopt the cloud platform for their business needs. However, customers, especially those dealing with national security, high-end scientific research institutions, and critical national infrastructure service providers (like power, water), remain very reluctant to move their business systems to the cloud. One of the main concerns is the question of information security in the cloud and the threat of the unknown. Cloud Service Providers (CSP) indirectly encourage this perception by not letting their customers see what is behind their virtual curtain. Jurisdiction (information assets being stored elsewhere), data duplication, multi-tenancy, virtualisation and the decentralized nature of data processing are the default characteristics of cloud computing. Therefore, the traditional approach of enforcing and implementing security controls remains a big challenge and largely depends upon the service provider. The other big challenge and open issue is the ability to perform digital forensic investigations in the cloud in case of security breaches. Traditional approaches to evidence collection and recovery are no longer practical, as they rely on unrestricted access to the relevant systems and user data, something that is not available in the cloud model. This continues to fuel insecurity among cloud customers. In this paper we analyze the cyber security and digital forensics challenges, issues and opportunities for nuclear facilities to adopt cloud computing. We also discuss the due diligence process and applicable industry best practices which shall be

  17. Robust balancing and position control of a single spherical wheeled mobile platform

    OpenAIRE

    Yavuz, Fırat; Yavuz, Firat; Ünel, Mustafa; Unel, Mustafa

    2016-01-01

Self-balancing mobile platforms with a single spherical wheel, generally called ballbots, are a suitable example of underactuated systems. Balancing control of a ballbot platform, which aims to maintain the upright orientation by rejecting external disturbances, is important during station keeping or trajectory tracking. In this paper, acceleration-based balancing and position control of a single spherical wheeled mobile platform that has three single-row omniwheel drive m...

  18. WLAN Positioning Methods and Supporting Learning Technologies for Mobile Platforms

    Science.gov (United States)

    Melkonyan, Arsen

    2013-01-01

    Location technologies constitute an essential component of systems design for autonomous operations and control. The Global Positioning System (GPS) works well in outdoor areas, but the satellite signals are not strong enough to penetrate inside most indoor environments. As a result, a new strain of indoor positioning technologies that make use of…

  19. Los Alamos radiation transport code system on desktop computing platforms

    International Nuclear Information System (INIS)

    Briesmeister, J.F.; Brinkley, F.W.; Clark, B.A.; West, J.T.

    1990-01-01

The Los Alamos Radiation Transport Code System (LARTCS) consists of state-of-the-art Monte Carlo and discrete ordinates transport codes and data libraries. These codes were originally developed many years ago and have undergone continual improvement. With a large initial effort and continued vigilance, the codes are easily portable from one type of hardware to another. The performance of scientific workstations (SWS) has evolved to the point that such platforms can be used routinely to perform sophisticated radiation transport calculations. As personal computer (PC) performance approaches that of the SWS, the hardware options for desktop radiation transport calculations expand considerably. The current status of the radiation transport codes within the LARTCS is described: MCNP, SABRINA, LAHET, ONEDANT, TWODANT, TWOHEX, and ONELD. Specifically, the authors discuss hardware systems on which the codes run and present code performance comparisons for various machines.

  20. Cloud Computing Platform for an Online Model Library System

    Directory of Open Access Journals (Sweden)

    Mingang Chen

    2013-01-01

Full Text Available The rapid development of the digital content industry calls for online model libraries. To achieve efficiency, a good user experience, and reliability in the model library, this paper designs a Web 3D model library system based on a cloud computing platform. Taking into account complex models, which cause difficulties in real-time 3D interaction, we adopt model simplification and size-adaptive adjustment methods to make interaction with the system more efficient. Meanwhile, a cloud-based architecture is developed to ensure the reliability and scalability of the system. The 3D model library system is intended to be accessible by online users with a good interactive experience. The feasibility of the solution has been tested by experiments.

  1. Mapping flow distortion on oceanographic platforms using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    N. O'Sullivan

    2013-10-01

Full Text Available Wind speed measurements over the ocean on ships or buoys are affected by flow distortion from the platform and by the anemometer itself. This can lead to errors in direct measurements and the derived parametrisations. Here we use computational fluid dynamics (CFD) to simulate the errors in wind speed measurements caused by flow distortion on the RV Celtic Explorer. Numerical measurements were obtained from the finite-volume CFD code OpenFOAM, which was used to simulate the velocity fields. This was done over a range of orientations in the test domain from −60 to +60° in increments of 10°. The simulation was also set up for a range of velocities, ranging from 5 to 25 m s−1 in increments of 0.5 m s−1. The numerical analysis showed close agreement to experimental measurements.
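    The orientation and velocity sweep described above lends itself to scripted case generation. The following is a minimal sketch of such a sweep, assuming a hypothetical case-directory layout; the study's actual OpenFOAM configuration is not given in the record.

```python
# Sketch of the parameter sweep described above: one CFD case per
# (orientation, speed) pair. Directory layout and file names are
# illustrative, not taken from the study.
import itertools
import pathlib

orientations = range(-60, 61, 10)              # degrees
speeds = [5 + 0.5 * i for i in range(41)]      # 5 .. 25 m/s in 0.5 steps

for angle, speed in itertools.product(orientations, speeds):
    case = pathlib.Path(f"cases/dir{angle:+04d}_U{speed:04.1f}")
    case.mkdir(parents=True, exist_ok=True)
    # Write the inlet condition consumed by the (hypothetical) solver setup.
    (case / "inlet.cfg").write_text(f"angle_deg {angle}\nU_mag {speed}\n")
```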

  2. Quantum Heterogeneous Computing for Satellite Positioning Optimization

    Science.gov (United States)

    Bass, G.; Kumar, V.; Dulny, J., III

    2016-12-01

    Hard optimization problems occur in many fields of academic study and practical situations. We present results in which quantum heterogeneous computing is used to solve a real-world optimization problem: satellite positioning. Optimization problems like this can scale very rapidly with problem size, and become unsolvable with traditional brute-force methods. Typically, such problems have been approximately solved with heuristic approaches; however, these methods can take a long time to calculate and are not guaranteed to find optimal solutions. Quantum computing offers the possibility of producing significant speed-up and improved solution quality. There are now commercially available quantum annealing (QA) devices that are designed to solve difficult optimization problems. These devices have 1000+ quantum bits, but they have significant hardware size and connectivity limitations. We present a novel heterogeneous computing stack that combines QA and classical machine learning and allows the use of QA on problems larger than the quantum hardware could solve in isolation. We begin by analyzing the satellite positioning problem with a heuristic solver, the genetic algorithm. The classical computer's comparatively large available memory can explore the full problem space and converge to a solution relatively close to the true optimum. The QA device can then evolve directly to the optimal solution within this more limited space. Preliminary experiments, using the Quantum Monte Carlo (QMC) algorithm to simulate QA hardware, have produced promising results. Working with problem instances with known global minima, we find a solution within 8% in a matter of seconds, and within 5% in a few minutes. Future studies include replacing QMC with commercially available quantum hardware and exploring more problem sets and model parameters. Our results have important implications for how heterogeneous quantum computing can be used to solve difficult optimization problems in any
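    The two-stage strategy described above (a classical heuristic converges near the optimum, a second solver then refines within the reduced space) can be illustrated with a toy sketch. Here a simple genetic algorithm narrows a bit-string objective, and an exhaustive search over pairs of bit flips stands in for the quantum-annealing refinement stage, which the study simulates with QMC; the objective and all parameters are illustrative.

```python
# Toy two-stage optimization: a genetic algorithm finds a near-optimal
# bit string, then exhaustive search over pairs of bit flips stands in
# for the quantum-annealing refinement stage. Objective is illustrative.
import itertools
import random

N = 24
TARGET = [random.randint(0, 1) for _ in range(N)]

def fitness(x):
    return -sum(a != b for a, b in zip(x, TARGET))  # 0 is optimal

def ga(pop_size=60, gens=80):
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = random.sample(parents, 2)
            cut = random.randrange(N)
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:              # occasional mutation
                child[random.randrange(N)] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

def refine(x):
    """Exhaustively try all two-bit flips around the GA solution."""
    best = x
    for i, j in itertools.combinations(range(N), 2):
        y = x[:]
        y[i] ^= 1
        y[j] ^= 1
        if fitness(y) > fitness(best):
            best = y
    return best

coarse = ga()
print("GA stage:", fitness(coarse), "refined:", fitness(refine(coarse)))
```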

  3. A platform independent communication library for distributed computing

    NARCIS (Netherlands)

    Groen, D.; Rieder, S.; Grosso, P.; de Laat, C.; Portegies Zwart, S.

    2010-01-01

    We present MPWide, a platform independent communication library for performing message passing between supercomputers. Our library couples several local MPI applications through a long distance network using, for example, optical links. The implementation is deliberately kept light-weight, platform
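    For illustration only, the coupling idea (passing sized messages between codes over a long-distance TCP link) can be sketched as follows. This is not the MPWide API; hosts, ports and framing are assumptions.

```python
# Minimal sketch of coupling two applications over a long-distance TCP
# link, illustrating the message-passing idea behind MPWide. This is NOT
# the MPWide API; host/port values and the framing are placeholders.
import socket

def send_msg(host: str, port: int, payload: bytes) -> None:
    with socket.create_connection((host, port)) as s:
        # 8-byte length prefix followed by the payload.
        s.sendall(len(payload).to_bytes(8, "big") + payload)

def recv_msg(port: int) -> bytes:
    with socket.create_server(("", port)) as srv:
        conn, _ = srv.accept()
        with conn:
            size = int.from_bytes(conn.recv(8), "big")
            buf = b""
            while len(buf) < size:
                buf += conn.recv(min(65536, size - len(buf)))
            return buf
```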

  4. Analysis and experiments of a novel and compact 3-DOF precision positioning platform

    International Nuclear Information System (INIS)

    Huang, Hu; Zhao, Hongwei; Fan, Zunqiang; Zhang, Hui; Ma, Zhichao; Yang, Zhaojun

    2013-01-01

A novel 3-DOF precision positioning platform with dimensions of 48 mm × 50 mm × 35 mm was designed by integrating piezo actuators and flexure hinges. The platform has a compact structure, but it can perform high-precision positioning in three axes. The dynamic model of the platform in a single direction was established. The stiffness of the flexure hinges and the modal characteristics of the flexure hinge mechanism were analyzed by the finite element method. Output displacements of the platform along the three axes were forecast via stiffness analysis. The output performance of the platform in the x and y axes with open-loop control, as well as the z axis with closed-loop control, was tested and discussed. The preliminary application of the platform in the field of nanoindentation indicates that the designed platform works well during nanoindentation tests, and the closed-loop control ensures linear displacement output. With suitable control, the platform has the potential to realize different positioning functions under various working conditions.

  5. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet and use them to run large-scale environmental simulations and models, serving the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPU). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
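    The server-side work queue described above (a relational database managing task distribution to volunteer nodes) might look like the following sketch, with sqlite3 standing in for the relational database; the framework's actual schema is not given in the record.

```python
# Sketch of the server-side work queue for a volunteer-computing setup:
# a relational table of simulation tiles handed out to browser clients.
# Schema and tile parameters are illustrative, not the actual framework's.
import sqlite3

db = sqlite3.connect("queue.db")
db.execute("""CREATE TABLE IF NOT EXISTS tasks (
    id INTEGER PRIMARY KEY, tile TEXT, status TEXT DEFAULT 'pending',
    result TEXT)""")

def enqueue(tiles):
    db.executemany("INSERT INTO tasks (tile) VALUES (?)", [(t,) for t in tiles])
    db.commit()

def checkout():
    """Hand the next pending tile to a volunteer node."""
    row = db.execute(
        "SELECT id, tile FROM tasks WHERE status='pending' LIMIT 1").fetchone()
    if row:
        db.execute("UPDATE tasks SET status='running' WHERE id=?", (row[0],))
        db.commit()
    return row

def complete(task_id, result):
    db.execute("UPDATE tasks SET status='done', result=? WHERE id=?",
               (result, task_id))
    db.commit()

enqueue([f"basin_{i}" for i in range(4)])   # hypothetical watershed tiles
print(checkout())
```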

  6. A wireless computational platform for distributed computing based traffic monitoring involving mixed Eulerian-Lagrangian sensing

    KAUST Repository

    Jiang, Jiming

    2013-06-01

This paper presents a new wireless platform designed for an integrated traffic monitoring system based on combined Lagrangian (mobile) and Eulerian (fixed) sensing. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM compliant radio module, and can be interfaced with fixed traffic sensors or receive data from vehicle transponders. The platform is specially designed and optimized to be integrated into a solar-powered wireless sensor network in which traffic flow maps are computed by the nodes directly using distributed computing. An MPPT circuit is proposed to increase the power output of the attached solar panel. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio-monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. An ongoing implementation is briefly discussed and compared with existing platforms used in wireless sensor networks.
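    The MPPT circuit mentioned above typically implements a hill-climbing scheme; a perturb-and-observe sketch with a toy panel model is shown below. The node's actual controller is analog/embedded and not described in the record.

```python
# Perturb-and-observe MPPT sketch: nudge the operating voltage and keep
# the direction that increased power. The panel model is a toy stand-in;
# the node's actual MPPT circuit is not described in the record.
def panel_power(v):            # toy P-V curve peaking near 17 V
    return max(0.0, v * (3.0 - 0.12 * (v - 17.0) ** 2 / 17.0))

def perturb_and_observe(v=12.0, step=0.2, iters=100):
    p_prev = panel_power(v)
    direction = 1.0
    for _ in range(iters):
        v += direction * step
        p = panel_power(v)
        if p < p_prev:          # power dropped: reverse the perturbation
            direction = -direction
        p_prev = p
    return v, p_prev

print(perturb_and_observe())   # settles near the maximum power point
```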

  7. Model-driven product line engineering for mapping parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir

    2016-01-01

    Mapping parallel algorithms to parallel computing platforms requires several activities such as the analysis of the parallel algorithm, the definition of the logical configuration of the platform, the mapping of the algorithm to the logical configuration platform and the implementation of the

  8. PerPos: A Platform Providing Cloud Services for Pervasive Positioning

    DEFF Research Database (Denmark)

    Blunck, Henrik; Godsk, Torben; Grønbæk, Kaj

    2010-01-01

    -based building model manager that allows users to manage building models stored in the PerPos cloud for annotation, logging, and navigation purposes. A core service in the PerPos platform is sensor fusion for positioning that makes it seamless and efficient to combine a rich set of position sensors to obtain...
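    The sensor-fusion service highlighted above can be illustrated with a generic variance-weighted combination of two position estimates, assuming Gaussian errors; this is not PerPos's actual algorithm.

```python
# Variance-weighted fusion of two independent 2-D position estimates
# (e.g., Wi-Fi and dead reckoning), assuming Gaussian errors. This is a
# generic illustration, not the PerPos platform's actual fusion service.
def fuse(p1, var1, p2, var2):
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = tuple((w1 * a + w2 * b) / (w1 + w2) for a, b in zip(p1, p2))
    fused_var = 1.0 / (w1 + w2)   # fused estimate is tighter than either
    return fused, fused_var

wifi_fix = (12.4, 7.9)   # metres, variance 4 m^2 (hypothetical)
pdr_fix = (11.8, 8.6)    # metres, variance 1 m^2 (hypothetical)
print(fuse(wifi_fix, 4.0, pdr_fix, 1.0))
```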

  9. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  10. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    Directory of Open Access Journals (Sweden)

    Pirouz Nourian

    2018-03-01

    Full Text Available This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential and applicability in urban planning and urban data analytics. This review is not only based on the technical factors such as capabilities of the programming languages but also the ease of developing and sharing complex data processing workflows. The arena of web-based computing platforms is currently under rapid development and is too volatile to be predictable; therefore, in this article we focus on the specification of the requirements and potentials from an urban planning point of view rather than speculating about the fate of computing platforms or programming languages. The article presents a list of promising computing technologies, a technical specification of the essential data models and operators for geo-spatial data processing, and mathematical models for an ideal urban computing platform.

  11. Augmentation of Quasi-Zenith Satellite Positioning System Using High Altitude Platforms Systems (HAPS)

    Science.gov (United States)

    Tsujii, Toshiaki; Harigae, Masatoshi

Recently, some feasibility studies on a regional positioning system using quasi-zenith satellites and geostationary satellites have been conducted in Japan. However, the geometry of this system seems to be unsatisfactory in terms of the positioning accuracy in the north-south direction. In this paper, a satellite positioning system augmented by High Altitude Platform Systems (HAPS) is proposed, since the flexibility of the HAPS location is effective in improving the geometry of the satellite positioning system. The improved positioning performance of the augmented system is also demonstrated.
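    The geometry argument can be made concrete with dilution of precision (DOP), which is computed from the unit line-of-sight vectors to the transmitters. The sketch below uses made-up quasi-zenith geometry and shows how adding a low-elevation HAPS to the north improves the north-south component.

```python
# Dilution-of-precision sketch: DOP follows from the unit line-of-sight
# vectors to the transmitters, so a HAPS at a well-chosen azimuth and
# elevation directly strengthens the weak axis. Geometry is made up.
import math
import numpy as np

def los(az_deg, el_deg):
    az, el = math.radians(az_deg), math.radians(el_deg)
    # East-North-Up unit vector plus a clock-bias column.
    return [math.cos(el) * math.sin(az), math.cos(el) * math.cos(az),
            math.sin(el), 1.0]

def dop_diag(geometry):
    g = np.array([los(a, e) for a, e in geometry])
    q = np.linalg.inv(g.T @ g)       # covariance shape matrix
    return np.sqrt(np.diag(q))       # (east, north, up, time) DOPs

quasi_zenith = [(0, 80), (120, 45), (240, 45), (180, 60)]
print("without HAPS:", dop_diag(quasi_zenith))
print("with HAPS:   ", dop_diag(quasi_zenith + [(0, 20)]))
```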

  12. Design and Analysis of a Compact Precision Positioning Platform Integrating Strain Gauges and the Piezoactuator

    Directory of Open Access Journals (Sweden)

    Shunguang Wan

    2012-07-01

Full Text Available Miniaturized precision positioning platforms are needed for in situ nanomechanical test applications. This paper proposes a compact precision positioning platform integrating strain gauges and a piezoactuator. The effects of the geometric parameters of two parallel plates on the von Mises stress distribution, as well as the static and dynamic characteristics of the platform, were studied by the finite element method. Results of the calibration experiment indicate that the strain gauge sensor has good linearity and its sensitivity is about 0.0468 mV/μm. A closed-loop control system was established to solve the problem of nonlinearity of the platform. Experimental results demonstrate that for the displacement control process, both the displacement-increasing portion and the decreasing portion have good linearity, verifying that the control system is effective. The developed platform has a compact structure but can realize displacement measurement with the embedded strain gauges, which is useful for the closed-loop control and structural miniaturization of piezo devices. It has potential applications in nanoindentation and nanoscratch tests, especially in the field of in situ nanomechanical testing, which requires compact structures.

  13. Integration of the TNXYZ computer program inside the platform Salome

    International Nuclear Information System (INIS)

    Chaparro V, F. J.

    2014-01-01

The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool into the graphical simulation platform Salome. The TNXYZ code proposes a numerical solution of the neutron transport equation, in several energy groups, in steady state and three-dimensional geometry. In order to discretize the variables of the transport equation, the code uses the method of discrete ordinates for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing and simulating mechanical models, mainly focused on industry, and, unlike other software, it can form a complete scheme of pre- and post-processing of information and integrate and control an external source code. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 1990s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results over the process sequence, the original structure of the program underwent a modularization process, i.e., the main program was divided into sections where the code performs major operations. This procedure is controlled by the YACS module of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration; the verification consisted in comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated and controlled by the Salome platform. (Author)

  14. Simulating next-generation Cyber-physical computing platforms

    OpenAIRE

    Burgio, Paolo; Álvarez Martínez, Carlos; Ayguadé Parra, Eduard; Filgueras Izquierdo, Antonio; Jiménez González, Daniel; Martorell Bofill, Xavier; Navarro, Nacho; Giorgi, Roberto

    2015-01-01

In specific domains, such as cyber-physical systems, platforms are quickly evolving to include multiple (many-) cores and programmable logic in a single system-on-chip, while including interfaces to commodity sensors/actuators. Programmable logic (e.g., FPGA) allows for greater flexibility and dependability. However, the task of extracting the performance/watt potential of heterogeneous many-cores is often demanded at the application level, and this h...

  15. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both the private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project address these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduce HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  16. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    Science.gov (United States)

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

The objective of this study is to propose a cloud-computing-based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record the patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of collected data. We also describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nurses and other health professional researchers located in differing geographic locations with a cost-effective, flexible, secure and privacy-preserving research environment.

  17. GPUs: An Emerging Platform for General-Purpose Computation

    Science.gov (United States)

    2007-08-01


  18. Platforms.

    Science.gov (United States)

    Josko, Deborah

    2014-01-01

The advent of DNA sequencing technologies and the various applications that can be performed will have a dramatic effect on medicine and healthcare in the near future. There are several DNA sequencing platforms available on the market for research and clinical use. Based on the medical laboratory scientist's or researcher's needs, and taking into consideration laboratory space and budget, one can choose which platform will be beneficial to their institution and their patient population. Although some of the instrument costs seem high, diagnosing a patient quickly and accurately will save hospitals money, with fewer hospital stays and targeted treatment based on an individual's genetic make-up. By determining the type of disease an individual has based on the mutations present, or having the ability to prescribe the appropriate antimicrobials based on knowledge of the organism's resistance patterns, the clinician will be better able to treat and diagnose a patient, which ultimately will improve patient outcomes and prognosis.

  19. Future Computing Platforms for Science in a Power Constrained Era

    International Nuclear Information System (INIS)

    Abdurachmanov, David; Eulisse, Giulio; Elmer, Peter; Knight, Robert

    2015-01-01

    Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. We evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG). (paper)

  20. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL; Hill, Jason J [ORNL; Thach, Kevin G [ORNL; Podhorszki, Norbert [ORNL; Klasky, Scott A [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL

    2011-01-01

This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in architecting and administering large-scale Lustre deployments as well as HPSS archival systems. Additionally, as these systems are architected, deployed, and expanded over time, reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  1. ACCURACY ANALYSIS OF A LOW-COST PLATFORM FOR POSITIONING AND NAVIGATION

    Directory of Open Access Journals (Sweden)

    S. Hofmann

    2012-07-01

Full Text Available This paper presents an accuracy analysis of a platform based on low-cost components for landmark-based navigation, intended for research and teaching purposes. The proposed platform includes a LEGO MINDSTORMS NXT 2.0 kit, an Android-based smartphone, and a compact Hokuyo URG-04LX laser scanner. The robot is used in a small indoor environment where GNSS is not available. Therefore, a landmark map was produced in advance, with the landmark positions provided to the robot. All steps of the procedure to set up the platform are shown. The main focus of this paper is the reachable positioning accuracy, which was analyzed in this type of scenario depending on the accuracy of the reference landmarks and the directional and distance measuring accuracy of the laser scanner. Several experiments were carried out, demonstrating the practically achievable positioning accuracy. To evaluate the accuracy, ground truth was acquired using a total station. These results are compared to the theoretically achievable accuracies and the laser scanner's characteristics.
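    The positioning step itself (a least-squares fix from ranges to known landmarks) can be sketched as follows, with made-up landmark coordinates and ranges; the real system also uses the scan directions.

```python
# Least-squares position fix from ranges to known landmarks, the core of
# the landmark-based localization described above. Coordinates and ranges
# are made-up values consistent with a true position of (1.5, 1.0) m.
import numpy as np

landmarks = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])  # known map (m)
ranges = np.array([1.80, 2.69, 2.50])                       # measured (m)

# Linearize by subtracting the first range equation from the others:
# 2(x_i - x_0) x + 2(y_i - y_0) y = r_0^2 - r_i^2 + |p_i|^2 - |p_0|^2
A = 2.0 * (landmarks[1:] - landmarks[0])
b = (ranges[0] ** 2 - ranges[1:] ** 2
     + np.sum(landmarks[1:] ** 2, axis=1) - np.sum(landmarks[0] ** 2))
pos, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated position:", pos)   # approx. [1.5, 1.0]
```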

  2. Platformation: Cloud Computing Tools at the Service of Social Change

    Directory of Open Access Journals (Sweden)

    Anil Patel

    2012-07-01

    Full Text Available The following article establishes some context and definitions for what is termed the “sharing imperative” – a movement or tendency towards sharing information online and in real time that has rapidly transformed several industries. As internet-enabled devices proliferate to all corners of the globe, ways of working and accessing information have changed. Users now expect to be able to access the products, services, and information that they want from anywhere, at any time, on any device. This article addresses how the nonprofit sector might respond to those demands by embracing the sharing imperative. It suggests that how well an organization shares has become one of the most pressing governance questions a nonprofit organization must tackle. Finally, the article introduces Platformation, a project whereby tools that enable better inter and intra-organizational sharing are tested for scalability, affordability, interoperability, and security, all with a non-profit lens.

  3. Scalability of DL_POLY on High Performance Computing Platform

    CSIR Research Space (South Africa)

    Mabakane, Mabule S

    2017-12-01

    Full Text Available stream_source_info Mabakanea_19979_2017.pdf.txt stream_content_type text/plain stream_size 33716 Content-Encoding UTF-8 stream_name Mabakanea_19979_2017.pdf.txt Content-Type text/plain; charset=UTF-8 SACJ 29(3) December... when using many processors within the compute nodes of the supercomputer. The type of the processors of compute nodes and their memory also play an important role in the overall performance of the parallel application running on a supercomputer. DL...

  4. Open source acceleration of wave optics simulations on energy efficient high-performance computing platforms

    Science.gov (United States)

    Beck, Jeffrey; Bos, Jeremy P.

    2017-05-01

We compare several modifications to the open-source wave optics package, WavePy, intended to improve execution time. Specifically, we compare the relative performance of the Intel MKL, a CPU-based OpenCV distribution, and a GPU-based version. Performance is compared between distributions both on the same compute platform and between a fully-featured computing workstation and the NVIDIA Jetson TX1 platform. Comparisons are drawn in terms of both execution time and power consumption. We have found that substituting the Fast Fourier Transform operation from OpenCV provides a marked improvement on all platforms. In addition, we show that embedded platforms offer some possibility for extensive improvement in terms of efficiency compared to a fully featured workstation.
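    The benchmarking approach (time the same hot operation, here the 2-D FFT used in wave-optics propagation, across backends and problem sizes) can be sketched as below, with NumPy standing in for the compared MKL/OpenCV/GPU backends.

```python
# Sketch of the benchmarking approach: time the same 2-D FFT (the hot
# operation in wave-optics propagation) for several grid sizes. NumPy is
# a stand-in; the study compared MKL, OpenCV and GPU implementations.
import time
import numpy as np

def time_fft(n, reps=20):
    field = np.random.rand(n, n) + 1j * np.random.rand(n, n)
    t0 = time.perf_counter()
    for _ in range(reps):
        np.fft.fft2(field)
    return (time.perf_counter() - t0) / reps

for n in (256, 512, 1024):
    print(f"{n:>5} x {n:<5}: {time_fft(n) * 1e3:.2f} ms per FFT")
```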

  5. The development of a computational platform to design and simulate on-board hydrogen storage systems

    DEFF Research Database (Denmark)

    Mazzucco, Andrea; Rokni, Masoud

    2017-01-01

    A computational platform is developed in the Modelica® language within the Dymola™ environment to provide a tool for the design and performance comparison of on-board hydrogen storage systems. The platform has been coupled with an open source library for hydrogen fueling stations to investigate...... the vehicular tank within the frame of a complete refueling system. The two technologies that are integrated in the platform are solid-state hydrogen storage in the form of metal hydrides and compressed gas systems. In this work the computational platform is used to compare the storage performance of two tank...... to a storage capacity four times larger than a tube-in-tube solution of the same size. The volumetric and gravimetric densities of the shell and tube are 2.46% and 1.25% respectively. The dehydriding ability of this solution is proven to withstand intense discharging conditions....

  6. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Kozacik, Stephen [EM Photonics, Inc., Newark, DE (United States)

    2017-05-15

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  7. My4Sight: A Human Computation Platform for Improving Flu Predictions

    OpenAIRE

    Akupatni, Vivek Bharath

    2015-01-01

While many human computation (human-in-the-loop) systems exist in the field of Artificial Intelligence (AI) to solve problems that can't be solved by computers alone, comparatively fewer platforms exist for collecting human knowledge and for evaluating various techniques for harnessing human insights to improve forecasting models for infectious diseases such as Influenza and Ebola. In this thesis, we present the design and implementation of My4Sight, a human computation system develope...

  8. Network architecture test-beds as platforms for ubiquitous computing.

    Science.gov (United States)

    Roscoe, Timothy

    2008-10-28

    Distributed systems research, and in particular ubiquitous computing, has traditionally assumed the Internet as a basic underlying communications substrate. Recently, however, the networking research community has come to question the fundamental design or 'architecture' of the Internet. This has been led by two observations: first, that the Internet as it stands is now almost impossible to evolve to support new functionality; and second, that modern applications of all kinds now use the Internet rather differently, and frequently implement their own 'overlay' networks above it to work around its perceived deficiencies. In this paper, I discuss recent academic projects to allow disruptive change to the Internet architecture, and also outline a radically different view of networking for ubiquitous computing that such proposals might facilitate.

  9. Investigation into Mobile Learning Framework in Cloud Computing Platform

    OpenAIRE

    Wei, Guo; Joan, Lu

    2014-01-01

Cloud computing infrastructure is increasingly used for distributed applications. Mobile learning applications deployed in the cloud are a new research direction. The applications require specific development approaches for effective and reliable communication. This paper proposes an interdisciplinary approach for the design and development of mobile applications in the cloud. The approach includes a front service toolkit and a backend service toolkit. The front servi...

  10. BCILAB: a platform for brain-computer interface development

    Science.gov (United States)

    Kothe, Christian Andreas; Makeig, Scott

    2013-10-01

    Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.

  11. Scalability of DL_POLY on High Performance Computing Platform

    Directory of Open Access Journals (Sweden)

    Mabule Samuel Mabakane

    2017-12-01

Full Text Available This paper presents a case study on the scalability of several versions of the molecular dynamics code DL_POLY performed on South Africa's Centre for High Performance Computing e1350 IBM Linux cluster, Sun system and Lengau supercomputers. Within this study, different problem sizes were designed and the same chosen systems were employed in order to test the performance of DL_POLY using weak and strong scalability. It was found that the speed-up results for the small systems were better than those for large systems on both the Ethernet and InfiniBand networks. However, simulations of large systems in DL_POLY performed well using the InfiniBand network on the Lengau cluster as compared to the e1350 and Sun supercomputers.

  12. Determining position inside building via laser rangefinder and handheld computer

    Science.gov (United States)

    Ramsey, Jr James L. [Albuquerque, NM; Finley, Patrick [Albuquerque, NM; Melton, Brad [Albuquerque, NM

    2010-01-12

    An apparatus, computer software, and a method of determining position inside a building comprising selecting on a PDA at least two walls of a room in a digitized map of a building or a portion of a building, pointing and firing a laser rangefinder at corresponding physical walls, transmitting collected range information to the PDA, and computing on the PDA a position of the laser rangefinder within the room.
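    The geometry behind the claim can be sketched: a perpendicular range to a wall constrains the shooter to a line parallel to that wall, and two non-parallel walls pin the position to an intersection. The wall equations below are hypothetical stand-ins for the digitized map.

```python
# Geometry sketch for the method above: a range to a wall constrains the
# shooter to a line parallel to that wall; two non-parallel walls pin the
# position down to an intersection. Wall equations are hypothetical.
import numpy as np

def locate(walls, ranges):
    """walls: list of (nx, ny, d) with unit normal n and line n.p = d.
    A measured perpendicular range r puts the shooter on n.p = d - r
    (taking the normal to point into the room)."""
    n = np.array([w[:2] for w in walls], dtype=float)
    rhs = np.array([w[2] - r for w, r in zip(walls, ranges)])
    return np.linalg.solve(n, rhs)

# Room from a (hypothetical) digitized floor plan: north wall y = 8,
# east wall x = 10, with inward-pointing normals.
walls = [(0.0, 1.0, 8.0), (1.0, 0.0, 10.0)]
print(locate(walls, ranges=[3.0, 4.5]))   # -> [5.5, 5.0]
```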

  13. The Study of Pallet Pooling Information Platform Based on Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jia-bin Li

    2018-01-01

Full Text Available Effective implementation of a pallet pooling system needs a strong information platform to support it. Through an analysis of existing pallet pooling information platforms (PPIP), the paper points out that existing studies of PPIP are mainly based on traditional IT infrastructures and technologies, which have software, hardware, resource utilization, and process restrictions. Because the advantages of cloud computing technology, like strong computing power, high flexibility, and low cost, meet the requirements of the PPIP well, this paper gives a two-part PPIP architecture based on cloud computing: the user client and the cloud services. The cloud services include three layers: IaaS, PaaS, and SaaS. The method of how to deploy PPIP based on cloud computing is proposed finally.

  14. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    Science.gov (United States)

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  15. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    Science.gov (United States)

    2015-07-14

computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a... security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX... platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, implemented with the OpenVPN Virtual Private Network software.

  16. Positive technology: a free mobile platform for the self-management of psychological stress.

    Science.gov (United States)

    Gaggioli, Andrea; Cipresso, Pietro; Serino, Silvia; Campanaro, Danilo Marco; Pallavicini, Federica; Wiederhold, Brenda K; Riva, Giuseppe

    2014-01-01

We describe the main features and preliminary evaluation of Positive Technology, a free mobile platform for the self-management of psychological stress (http://positiveapp.info/). The mobile platform features three main components: (i) guided relaxation, which provides the user with the opportunity to browse a gallery of relaxation music and video-narrative resources for reducing stress; (ii) 3D biofeedback, which helps the user learn to control his/her responses by visualizing variations of heart rate in an engaging 3D environment; and (iii) stress tracking, through the recording of heart rate and self-reports. We evaluated the Positive Technology app in an online trial involving 32 participants, of whom 7 used the application in combination with the wrist sensor. Overall, feedback from users was satisfactory, and the analysis of data collected online indicated the capability of the app for reducing perceived stress levels. A future goal is to improve the usability of the application and include more advanced stress monitoring features based on the analysis of heart rate variability indexes.

  17. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    Science.gov (United States)

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  18. ClusterCAD: a computational platform for type I modular polyketide synthase design

    DEFF Research Database (Denmark)

    Eng, Clara H.; Backman, Tyler W. H.; Bailey, Constance B.

    2018-01-01

    barrier to the design of active variants, and identifying strategies to reliably construct functional PKS chimeras remains an active area of research. In this work, we formalize a paradigm for the design of PKS chimeras and introduce ClusterCAD as a computational platform to streamline and simplify...

  19. The Relationship between Chief Information Officer Transformational Leadership and Computing Platform Operating Systems

    Science.gov (United States)

    Anderson, George W.

    2010-01-01

    The purpose of this study was to relate the strength of Chief Information Officer (CIO) transformational leadership behaviors to 1 of 5 computing platform operating systems (OSs) that may be selected for a firm's Enterprise Resource Planning (ERP) business system. Research shows executive leader behaviors may promote innovation through the use of…

  20. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

Highlights: • Application of the PSO algorithm can improve the accuracy of the baseline model. • The M&V cloud platform automatically calculates energy performance. • The M&V cloud platform can be applied to all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • The M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, the M&V period typically lasts several months or up to a year; the failure to immediately detect abnormal energy performance not only decreases energy savings but also results in the inability to make timely corrections and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculations of the energy performance and access to long-term, cumulative information about the energy performance are provided via a feature that allows direct uploads of the energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested in a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real
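    The core M&V calculation (fit a baseline model on pre-retrofit data, then report avoided energy as baseline prediction minus measured use) is sketched below with ordinary least squares and made-up data; the platform itself tunes its baseline with particle swarm optimization, which is not reproduced here.

```python
# Core M&V calculation sketched with ordinary least squares: fit a
# baseline energy model on pre-retrofit data, then report avoided energy
# as baseline prediction minus measured use. Data and drivers are
# hypothetical; the platform tunes its baseline with PSO, not shown here.
import numpy as np

# Hypothetical drivers: outdoor temperature (C) and production load (%).
X_pre = np.array([[18, 60], [22, 65], [26, 70], [30, 80], [33, 85]], float)
y_pre = np.array([410, 455, 500, 560, 600], float)        # kWh/day

A = np.hstack([X_pre, np.ones((len(X_pre), 1))])          # add intercept
coef, *_ = np.linalg.lstsq(A, y_pre, rcond=None)

# Post-retrofit period: same drivers, lower measured consumption.
X_post = np.array([[24, 68], [31, 82]], float)
y_post = np.array([430, 505], float)
baseline = np.hstack([X_post, np.ones((2, 1))]) @ coef
print("avoided energy (kWh/day):", baseline - y_post)
```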

  1. Ex Machina: Analytical platforms, Law and the Challenges of Computational Legal Science

    Directory of Open Access Journals (Sweden)

    Nicola Lettieri

    2018-04-01

Full Text Available Over the years, computation has become a fundamental part of scientific practice in several research fields, reaching far beyond the boundaries of the natural sciences. Data mining, machine learning, simulations and other computational methods lie today at the heart of the scientific endeavour in a growing number of social research areas, from anthropology to economics. In this scenario, an increasingly important role is played by analytical platforms: integrated environments allowing researchers to experiment with cutting-edge data-driven and computation-intensive analyses. The paper discusses the appearance of such tools in the emerging field of computational legal science. After a general introduction to the impact of computational methods on both the natural and social sciences, we describe the concept and the features of an analytical platform exploring innovative cross-methodological approaches to the academic and investigative study of crime. Stemming from an ongoing project involving researchers from law, computer science and bioinformatics, the initiative is presented and discussed as an opportunity to raise a debate about the future of legal scholarship and, within it, about the challenges of computational legal science.

  2. Positive Wigner functions render classical simulation of quantum computation efficient.

    Science.gov (United States)

    Mari, A; Eisert, J

    2012-12-07

    We show that quantum circuits where the initial state and all the following quantum operations can be represented by positive Wigner functions can be classically efficiently simulated. This is true both for continuous-variable as well as discrete variable systems in odd prime dimensions, two cases which will be treated on entirely the same footing. Noting the fact that Clifford and Gaussian operations preserve the positivity of the Wigner function, our result generalizes the Gottesman-Knill theorem. Our algorithm provides a way of sampling from the output distribution of a computation or a simulation, including the efficient sampling from an approximate output distribution in the case of sampling imperfections for initial states, gates, or measurements. In this sense, this work highlights the role of the positive Wigner function as separating classically efficiently simulable systems from those that are potentially universal for quantum computing and simulation, and it emphasizes the role of negativity of the Wigner function as a computational resource.
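    For reference, the continuous-variable Wigner function whose pointwise non-negativity the result exploits is the standard one; when it is non-negative everywhere it can be read as a phase-space probability density, which is what permits efficient classical sampling:

```latex
% Continuous-variable Wigner function of a state \rho; when it is
% non-negative everywhere it is a genuine probability density on phase
% space, which is what permits efficient classical sampling.
W_\rho(x,p) \;=\; \frac{1}{\pi\hbar}\int_{-\infty}^{\infty}
  \langle x+y\,|\,\rho\,|\,x-y\rangle\, e^{-2ipy/\hbar}\, \mathrm{d}y,
\qquad W_\rho(x,p) \ge 0 \ \ \forall (x,p).
```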

  3. A wireless computational platform for distributed computing based traffic monitoring involving mixed Eulerian-Lagrangian sensing

    KAUST Repository

    Jiang, Jiming; Claudel, Christian G.

    2013-01-01

    .4GHz 802.15.4 ISM compliant radio module, and can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. The platform is specially designed and optimized to be integrated in a solar-powered wireless sensor network in which

  4. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    Science.gov (United States)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality and capabilities to run climate and meteorological models, process large geophysical datasets and support relevant analysis. It also supports joint software development by distributed research groups, and the organization of thematic education for students and post-graduate students. In particular, the platform software developed includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the WRF and «Planet Simulator» models integrated into the platform, as well as preprocessing and visualization of modeling results, are also provided. All functions of the platform are accessible to a user through a web portal using a common graphical web browser, in the form of an interactive graphical user interface which provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction) and visualizing results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary researches. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through

  5. Scalable and massively parallel Monte Carlo photon transport simulations for heterogeneous computing platforms.

    Science.gov (United States)

    Yu, Leiming; Nina-Paravecino, Fanny; Kaeli, David; Fang, Qianqian

    2018-01-01

We present a highly scalable Monte Carlo (MC) three-dimensional photon transport simulation platform designed for heterogeneous computing systems. Through the development of a massively parallel MC algorithm using the Open Computing Language framework, this research extends our existing graphics processing unit (GPU)-accelerated MC technique to a highly scalable vendor-independent heterogeneous computing environment, achieving significantly improved performance and software portability. A number of parallel computing techniques are investigated to achieve portable performance over a wide range of computing hardware. Furthermore, multiple thread-level and device-level load-balancing strategies are developed to obtain efficient simulations using multiple central processing units and GPUs.
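    The kernel that such codes parallelize over millions of photons is compact; below is a scalar toy sketch of exponential free-path sampling with absorption weighting in a 1-D homogeneous medium, with made-up optical coefficients. It is not the paper's OpenCL implementation.

```python
# Scalar sketch of the Monte Carlo photon-transport kernel that such GPU
# codes parallelize over millions of photons: sample an exponential free
# path, deposit the absorbed weight, continue, repeat. Medium properties
# are illustrative; scattering directions are omitted in this 1-D toy.
import math
import random

MU_A, MU_S = 0.1, 10.0           # absorption/scattering coeffs (1/mm)
MU_T = MU_A + MU_S

def one_photon(max_steps=1000):
    pos, w, absorbed = 0.0, 1.0, 0.0
    for _ in range(max_steps):
        step = -math.log(1.0 - random.random()) / MU_T  # free path length
        pos += step                                     # 1-D toy geometry
        absorbed += w * MU_A / MU_T                     # deposit absorption
        w *= MU_S / MU_T
        if w < 1e-4:                                    # terminate faint photon
            break
    return absorbed

print(sum(one_photon() for _ in range(1000)) / 1000)    # ~1 (all absorbed)
```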

  6. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications such as data storage, computing processes, document sharing and even management information system services as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds as well as their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  7. Mandibular condyle position in cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Hyoung Joo; Kim, Gyu Tae; Choi, Yong Suk; Hwang, Eui Hwan [Kyung Hee Univ. School of Dentistry, Seoul (Korea, Republic of)

    2006-06-15

To radiographically evaluate the position of the mandibular condyle within the articular fossa in an asymptomatic population using cone beam computed tomography. Cone beam computed tomography of 60 temporomandibular joints was performed on 15 males and 15 females with no history of any temporomandibular disorders or any other orthodontic or prosthodontic treatments. The position of the mandibular condyle within the articular fossa at centric occlusion was evaluated. Statistical evaluation was done using SPSS. In the sagittal views, the mandibular condyle was located laterally within the articular fossa at the central section. The mandibular condyles on the right and left sides showed an asymmetric positional relationship at the medial, central, and lateral sections. The mandibular condyle within the articular fossa in an asymptomatic population was observed in a non-concentric position in the sagittal and coronal views.

  8. UrbanWeb: a Platform for Mobile Context-aware Social Computing

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Grønbæk, Kaj

    2010-01-01

UrbanWeb is a novel Web-based context-aware hypermedia platform. It provides essential mechanisms for mobile social computing applications: the framework implements context as an extension to Web 2.0 tagging and provides developers with an easy-to-use platform for mobile context-aware applications. Services can be statically or dynamically defined in the user's context, data can be pre-cached for data-intensive mobile applications, and shared state supports synchronization between running applications such as games. The paper discusses how UrbanWeb acquires cues about the user's context from sensors in mobile phones, ranging from GPS data to 2D barcodes and manual entry of context information, as well as how to utilize this context in applications. The experiences show that the UrbanWeb platform efficiently supports a rich variety of urban computing applications in different...

  9. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    Science.gov (United States)

    2014-04-01

...multicore PDSP platforms. The GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture (CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C. The multicore PDSP capabilities currently in TDIF are oriented...

  10. nuMap: a web platform for accurate prediction of nucleosome positioning.

    Science.gov (United States)

    Alharbi, Bader A; Alshammari, Thamir H; Felton, Nathan L; Zhurkin, Victor B; Cui, Feng

    2014-10-01

Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. The nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site. Copyright © 2014 The Authors. Production and hosting by Elsevier Ltd. All rights reserved.

  11. nuMap: A Web Platform for Accurate Prediction of Nucleosome Positioning

    Directory of Open Access Journals (Sweden)

    Bader A. Alharbi

    2014-10-01

Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options such as schemes and parameters for threading calculation and provides multiple layout formats. The nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, description of the methodology and examples are available at the site.

  12. A Security Monitoring Method Based on Autonomic Computing for the Cloud Platform

    Directory of Open Access Journals (Sweden)

    Jingjie Zhang

    2018-01-01

With the continuous development of cloud computing, cloud security has become one of its most important issues. For example, data stored in the cloud platform may be attacked, and its security is difficult to guarantee. Therefore, we must pay close attention to how data stored in the cloud is protected. To protect data, monitoring is a necessary process. Based on autonomic computing, we develop a cloud data monitoring system on the cloud platform, monitoring the data for anomalies in each cycle and analyzing its security according to the monitored results. The feasibility of the scheme is verified through simulation. The results show that the proposed method can adapt to dynamic changes of the cloud platform load and can accurately evaluate the degree to which data is abnormal. Meanwhile, by adjusting the monitoring frequency automatically, it improves the accuracy and timeliness of monitoring. Furthermore, it reduces the monitoring cost of the system in normal operation.
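
    The automatic adjustment of monitoring frequency described above can be sketched as a simple feedback rule: shorten the interval when the data looks abnormal, lengthen it when readings stay normal. The thresholds, bounds and scaling factors below are invented for illustration, not taken from the paper.

```python
# Hypothetical sketch of an auto-adjusted monitoring cycle: monitor more
# often when data looks abnormal, back off when it is stable.

def next_interval(current, anomaly_score,
                  low=0.2, high=0.8, min_s=5.0, max_s=300.0):
    """Return the next monitoring interval in seconds."""
    if anomaly_score > high:        # likely abnormal: monitor more often
        current /= 2.0
    elif anomaly_score < low:       # stable: back off to save cost
        current *= 1.5
    return max(min_s, min(max_s, current))

interval = 60.0
for score in [0.1, 0.1, 0.9, 0.95, 0.3, 0.05]:
    interval = next_interval(interval, score)
    print(f"score={score:.2f} -> next check in {interval:.1f}s")
```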

  13. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225336; The ATLAS collaboration; Gardner, Robert; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  14. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

Twenty-first-century geoscience faces the challenges of Big Data, spikes in computing requirements (e.g., when a natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution providing core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate scientific research and discoveries. This presentation uses GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing and integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform-as-a-service (PaaS) packages. Based on these common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information.

  15. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming; Claudel, Christian

    2017-01-01

This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM-compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low-bandwidth, high-computational-performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring and Kalman-filter-based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  16. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming

    2017-02-02

This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM-compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low-bandwidth, high-computational-performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring and Kalman-filter-based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.
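
    One of the workloads named above is Kalman-filter-based vehicle trajectory estimation. The sketch below is a generic one-dimensional constant-velocity Kalman filter, not the authors' implementation; all noise parameters and measurements are illustrative.

```python
# Generic 1-D constant-velocity Kalman filter, as a sketch of the
# "Kalman-filter-based vehicle trajectory estimation" use case above.
import numpy as np

def kalman_track(positions, dt=1.0, q=0.1, r=4.0):
    x = np.array([positions[0], 0.0])          # state: [position, velocity]
    P = np.eye(2) * 10.0                       # state covariance
    F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity model
    H = np.array([[1.0, 0.0]])                 # we observe position only
    Q = q * np.eye(2)                          # process noise
    R = np.array([[r]])                        # measurement noise
    estimates = []
    for z in positions:
        x = F @ x                              # predict
        P = F @ P @ F.T + Q
        y = z - (H @ x)[0]                     # innovation
        S = H @ P @ H.T + R
        K = (P @ H.T) / S[0, 0]                # Kalman gain (2x1)
        x = x + K[:, 0] * y                    # update state
        P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0])
    return estimates

print(kalman_track([0.0, 1.2, 1.9, 3.1, 4.2, 4.8]))
```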

  17. Contributing to global computing platform: gliding, tunneling standard services and high energy physics application

    International Nuclear Information System (INIS)

    Lodygensky, O.

    2006-09-01

Centralized computers have been replaced by 'client/server' distributed architectures, which are in turn in competition with new distributed systems known as 'peer to peer'. These new technologies are widely spread, and trading, industry and the research world have understood the new goals involved and are investing massively in these new technologies, named 'grid'. One of these fields is computing, which is the subject of the work presented here. At the Paris Orsay University, a synergy emerged between the Computing Science Laboratory (LRI) and the Linear Accelerator Laboratory (LAL) on grid infrastructure, opening new fields of investigation for the former and new high-performance computing perspectives for the latter. The work presented here is the result of this multi-disciplinary collaboration. It is based on XtremWeb, the LRI global computing platform. We first introduce a state of the art of large-scale distributed systems, their principles and their service-based architecture. We then introduce XtremWeb and detail the modifications and improvements we had to specify and implement to achieve our goals. We present two different studies: first, interconnecting grids in order to generalize resource sharing; and second, enabling the use of legacy services on such platforms. We finally explain how a research community, such as the high-energy cosmic radiation detection community, can gain access to these services, and detail Monte Carlo and data analysis processes over the grids. (author)

  18. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    International Nuclear Information System (INIS)

    Ballestrero, S; Lee, C J; Batraneanu, S M; Scannicchio, D A; Brasolin, F; Contescu, C; Girolamo, A Di; Astigarraga, M E Pozo; Twomey, M S; Zaytsev, A

    2014-01-01

With the LHC collider at CERN currently going through the period of Long Shutdown 1 there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. OpenStack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.

  19. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    Science.gov (United States)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.

    2014-06-01

With the LHC collider at CERN currently going through the period of Long Shutdown 1 there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. OpenStack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.

  20. The TeraShake Computational Platform for Large-Scale Earthquake Simulations

    Science.gov (United States)

    Cui, Yifeng; Olsen, Kim; Chourasia, Amit; Moore, Reagan; Maechling, Philip; Jordan, Thomas

Geoscientific and computer science researchers with the Southern California Earthquake Center (SCEC) are conducting a large-scale, physics-based, computationally demanding earthquake system science research program with the goal of developing predictive models of earthquake processes. The computational demands of this program continue to increase rapidly as these researchers seek to perform physics-based numerical simulations of earthquake processes for larger and larger domains. To meet the needs of this research program, a multiple-institution team coordinated by SCEC has integrated several scientific codes into a numerical modeling-based research tool we call the TeraShake computational platform (TSCP). A central component in the TSCP is a highly scalable earthquake wave propagation simulation program called the TeraShake anelastic wave propagation (TS-AWP) code. In this chapter, we describe how we extended an existing, stand-alone, well-validated, finite-difference, anelastic wave propagation modeling code into the highly scalable and widely used TS-AWP and then integrated this code into the TeraShake computational platform that provides end-to-end (initialization to analysis) research capabilities. We also describe the techniques used to enhance the TS-AWP parallel performance on TeraGrid supercomputers, as well as the TeraShake simulation phases, including input preparation, run time, data archive management, and visualization. As a result of our efforts to improve its parallel efficiency, the TS-AWP has now shown highly efficient strong scaling on over 40K processors on IBM's BlueGene/L Watson computer. In addition, the TSCP has developed into a computational system that is useful to many members of the SCEC community for performing large-scale earthquake simulations.

  1. Improved Stewart platform state estimation using inertial and actuator position measurements

    NARCIS (Netherlands)

    MiletoviC, I.; Pool, D.M.; Stroosma, O.; van Paassen, M.M.; Chu, Q.

    2017-01-01

    Accurate and reliable estimation of the kinematic state of a six degrees-of-freedom Stewart platform is a problem of interest in various engineering disciplines. Particularly so in the area of flight simulation, where the Stewart platform is in widespread use for the generation of motion similar

  2. OSM POI ANALYZER: A PLATFORM FOR ASSESSING POSITION OF POIs IN OPENSTREETMAP

    Directory of Open Access Journals (Sweden)

    A. Kashian

    2017-09-01

In recent years, increased participation in Volunteered Geographical Information (VGI) projects has provided enough data coverage for most places around the world for ordinary mapping and navigation purposes; however, the positional credibility of contributed data becomes more and more important for building long-term trust in VGI data. Today, it is hard to draw a definite traditional boundary between authoritative map producers and public map consumers, and we observe that more and more volunteers are joining crowdsourcing activities for collecting geodata, which might result in higher rates of man-made mistakes in open map projects such as OpenStreetMap. While there are some methods for monitoring the accuracy and consistency of the created data, there is still a lack of advanced systems to automatically discover misplaced objects on the map. One feature type which is contributed daily to OSM is the Point of Interest (POI). In order to understand how likely it is that a newly added POI represents a genuine real-world feature, a scientific means of calculating the probability of such a POI existing at that specific position is needed. This paper reports on a new analytic tool which dives into OSM data and finds co-existence patterns between one specific POI and its surrounding objects such as roads, parks and buildings. The platform uses a distance-based classification technique to find relationships among objects and tries to identify the high-frequency association patterns within each category of objects. Using such a method, a probabilistic score is generated for each newly added POI, and low-scored POIs can be highlighted for a manual check by editors. The same scoring method can be used for existing registered POIs to check whether they are located correctly. As a sample study, this paper reports on the evaluation of 800 pre-registered ATMs in Paris with associated scores to understand how outliers and fake entries could be detected.
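
    The distance-based co-existence scoring described above can be sketched as follows: the score of a candidate POI is derived from how often the feature categories observed nearby co-occur with that POI type in trusted data. The frequency table, radius and coordinates below are invented for illustration, not the paper's learned patterns.

```python
# Hypothetical sketch of distance-based co-existence scoring: score a
# candidate POI by the learned co-occurrence frequencies of the feature
# categories found within a radius of it.
from math import hypot

# P(category found within 100 m of a genuine ATM), learned elsewhere.
ATM_NEIGHBOR_FREQ = {"bank": 0.9, "road": 0.95, "park": 0.15}

def coexistence_score(poi_xy, features, radius=100.0):
    """Average the learned frequencies of the categories observed
    within `radius` metres of the candidate POI."""
    seen = {cat for cat, (x, y) in features
            if hypot(x - poi_xy[0], y - poi_xy[1]) <= radius}
    if not seen:
        return 0.0
    return sum(ATM_NEIGHBOR_FREQ.get(cat, 0.0) for cat in seen) / len(seen)

features = [("bank", (10.0, 5.0)), ("road", (0.0, 3.0)), ("park", (900.0, 0.0))]
print(coexistence_score((0.0, 0.0), features))  # high score: bank and road nearby
```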

  3. [The Key Technology Study on Cloud Computing Platform for ECG Monitoring Based on Regional Internet of Things].

    Science.gov (United States)

    Yang, Shu; Qiu, Yuyan; Shi, Bo

    2016-09-01

This paper explores methods of building a regional internet-of-things ECG monitoring network, focusing on the implementation of an ECG monitoring center based on a cloud computing platform. It analyzes the implementation principles of automatic identification of arrhythmia types. It also studies the system architecture and key techniques of the cloud computing platform, including server load balancing technology, reliable storage of massive small files and the implementation of a quick search function.

  4. Computer-operated analytical platform for the determination of nutrients in hydroponic systems.

    Science.gov (United States)

    Rius-Ruiz, F Xavier; Andrade, Francisco J; Riu, Jordi; Rius, F Xavier

    2014-03-15

Hydroponics is a water-, energy-, space-, and cost-efficient system for growing plants in constrained spaces or land-exhausted areas. Precise control of hydroponic nutrients is essential for growing healthy plants and producing high yields. In this article we report for the first time on a new computer-operated analytical platform which can be readily used for the determination of essential nutrients in hydroponic growing systems. The liquid-handling system uses inexpensive components (i.e., a peristaltic pump and solenoid valves), which are discretely computer-operated to automatically condition, calibrate and clean a multi-probe of solid-contact ion-selective electrodes (ISEs). These ISEs, which are based on carbon nanotubes, offer high portability, robustness and easy maintenance and storage. With this new computer-operated analytical platform we performed automatic measurements of K(+), Ca(2+), NO3(-) and Cl(-) during tomato plant growth in order to assure optimal nutritional uptake and tomato production. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Design and Performance of the Virtualization Platform for Offline computing on the ATLAS TDAQ Farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Brasolin, F; Contescu, C; Di Girolamo, A; Lee, C J; Pozo Astigarraga, M E; Scannicchio, D A; Twomey, M S; Zaytsev, A

    2013-01-01

With the LHC collider at CERN currently going through the period of Long Shutdown 1 (LS1) there is a remarkable opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is particularly suitable for running Monte Carlo production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of all the stages of the Sim@P1 project, dedicated to the design and deployment of a virtualized platform running on the ATLAS TDAQ computing resources and to its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to avoid interference with TDAQ's usage of the farm and to guarantee the security and the usability of the ATLAS private network; OpenStack has been chosen to provide a cloud management layer. The approaches to organizing support for the sustained operation of...

  6. Homemade Buckeye-Pi: A Learning Many-Node Platform for High-Performance Parallel Computing

    Science.gov (United States)

    Amooie, M. A.; Moortgat, J.

    2017-12-01

We report on the "Buckeye-Pi" cluster, the supercomputer developed in The Ohio State University School of Earth Sciences from 128 inexpensive Raspberry Pi (RPi) 3 Model B single-board computers. Each RPi is equipped with a fast quad-core 1.2 GHz ARMv8 64-bit processor, 1 GB of RAM, and a 32 GB microSD card for local storage. The cluster therefore has a total of 128 GB of RAM distributed over the individual nodes, a flash capacity of 4 TB, and 512 processor cores, while benefiting from low power consumption, easy portability, and low total cost. The cluster uses the Message Passing Interface protocol to manage the communications between nodes. These features render our platform the most powerful RPi supercomputer to date and suitable for educational applications in high-performance computing (HPC) and the handling of large datasets. In particular, we use the Buckeye-Pi to implement optimized parallel codes in our in-house simulator for subsurface media flows, with the goal of achieving a massively parallelized scalable code. We present benchmarking results for the computational performance across various numbers of RPi nodes. We believe our project could inspire scientists and students to consider the proposed unconventional cluster architecture as a mainstream and feasible learning platform for challenging engineering and scientific problems.
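
    The cluster's use of the Message Passing Interface can be illustrated with a minimal mpi4py sketch of the pattern such codes follow: each rank computes a partial result and rank 0 reduces the total. This is a generic example, assuming mpi4py is installed; it is not the group's simulator code.

```python
# Minimal mpi4py sketch of the message-passing style such a cluster
# runs: each node computes a partial sum, rank 0 gathers the total.
# Launch with e.g. `mpirun -n 4 python partial_sum.py`.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank sums its strided slice of 0..9,999,999.
n = 10_000_000
partial = sum(range(rank, n, size))

total = comm.reduce(partial, op=MPI.SUM, root=0)
if rank == 0:
    print(f"{size} ranks computed total {total}")
```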

  7. Design and Performance of the Virtualization Platform for Offline computing on the ATLAS TDAQ Farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Brasolin, F; Contescu, C; Di Girolamo, A; Lee, C J; Pozo Astigarraga, M E; Scannicchio, D A; Twomey, M S; Zaytsev, A

    2014-01-01

With the LHC collider at CERN currently going through the period of Long Shutdown 1 (LS1) there is a remarkable opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is particularly suitable for running Monte Carlo production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of all the stages of the Sim@P1 project, dedicated to the design and deployment of a virtualized platform running on the ATLAS TDAQ computing resources and to its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to avoid interference with TDAQ's usage of the farm and to guarantee the security and the usability of the ATLAS private network; OpenStack has been chosen to provide a cloud management layer. The approaches to organizing support for the sustained operation of...

  8. Processing-Efficient Distributed Adaptive RLS Filtering for Computationally Constrained Platforms

    Directory of Open Access Journals (Sweden)

    Noor M. Khan

    2017-01-01

In this paper, a novel processing-efficient architecture of a group of inexpensive and computationally limited small platforms is proposed for a parallelly distributed adaptive signal processing (PDASP) operation. The proposed architecture runs computationally expensive procedures, such as the complex adaptive recursive least squares (RLS) algorithm, cooperatively. The proposed PDASP architecture operates properly even if perfect time alignment among the participating platforms is not available. An RLS algorithm with the application of MIMO channel estimation is deployed on the proposed architecture. The complexity and processing time of the PDASP scheme with the MIMO RLS algorithm are compared with those of the sequentially operated MIMO RLS algorithm and the linear Kalman filter. It is observed that the PDASP scheme exhibits much lower computational complexity than the sequential MIMO RLS algorithm as well as the Kalman filter. Moreover, the proposed architecture reduces processing time by 95.83% and 82.29% compared to the sequentially operated Kalman filter and MIMO RLS algorithm, respectively, for low Doppler rates. Likewise, for high Doppler rates, the proposed architecture achieves 94.12% and 77.28% reductions in processing time compared to the Kalman and RLS algorithms, respectively.
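
    For concreteness, the textbook sequential form of the RLS algorithm that the PDASP architecture distributes is sketched below; the paper's parallel decomposition across platforms is not shown, and the forgetting factor and test data are illustrative.

```python
# Standard sequential recursive least squares (RLS), shown to make the
# algorithm being distributed above concrete; not the paper's parallel
# PDASP decomposition.
import numpy as np

def rls(xs, ds, lam=0.99, delta=100.0):
    """Estimate weights w minimising the exponentially weighted
    squared error between d and w @ x."""
    n = xs.shape[1]
    w = np.zeros(n)
    P = np.eye(n) * delta
    for x, d in zip(xs, ds):
        k = P @ x / (lam + x @ P @ x)     # gain vector
        w = w + k * (d - w @ x)           # a-priori error correction
        P = (P - np.outer(k, x @ P)) / lam
    return w

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])
X = rng.normal(size=(500, 3))
d = X @ true_w + 0.01 * rng.normal(size=500)
print(rls(X, d))   # should approach true_w
```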

  9. An Application-Based Performance Evaluation of NASAs Nebula Cloud Computing Platform

    Science.gov (United States)

    Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak

    2012-01-01

The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing as it promises high potential. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, called Nebula, hosted at Ames Research Center. This work represents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than that on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.

  10. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

The knowledge of measurement uncertainty is of great importance in conformance testing in production: the tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within the tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described, along with methods that take into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical...
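
    The simulation idea can be sketched as a Monte Carlo propagation of assumed error sources through a virtual measurement, with the spread of the repeated results taken as an uncertainty estimate. The error model below is invented for illustration and is far simpler than a full virtual CT chain.

```python
# Hypothetical sketch: propagate assumed error sources through many
# virtual measurements and report the spread as a standard uncertainty.
# The error magnitudes are invented for illustration.
import numpy as np

def simulate_diameter(n_runs=10_000, true_d=10.000, rng=None):
    rng = rng or np.random.default_rng(1)
    scale_err = rng.normal(1.0, 5e-4, n_runs)     # voxel-scaling error
    noise_err = rng.normal(0.0, 2e-3, n_runs)     # image noise on the surface
    drift_err = rng.uniform(-1e-3, 1e-3, n_runs)  # thermal drift
    measured = true_d * scale_err + noise_err + drift_err
    return measured.mean(), measured.std(ddof=1)

mean_d, u = simulate_diameter()
print(f"simulated diameter {mean_d:.4f} mm, standard uncertainty {u:.4f} mm")
```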

  11. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
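
    The synaptic integration the abstract describes, a synaptic time constant producing a gradual injection of charge, can be sketched in discrete time with a leaky integrate-and-fire unit driven by an exponentially decaying synaptic current. This is a minimal software illustration, not the SRM hardware pipeline; all constants are invented.

```python
# Minimal discrete-time sketch: an input spike injects charge gradually
# through a synaptic current with time constant tau_s, driving a leaky
# integrate-and-fire unit. Constants are illustrative.
import math

def simulate(spike_times, t_end=100.0, dt=0.1,
             tau_m=10.0, tau_s=5.0, w=1.5, v_th=1.0):
    decay_m = math.exp(-dt / tau_m)     # membrane leak per step
    decay_s = math.exp(-dt / tau_s)     # synaptic current decay per step
    v = i_syn = 0.0
    out_spikes = []
    spikes = set(round(t / dt) for t in spike_times)
    for step in range(int(t_end / dt)):
        if step in spikes:
            i_syn += w                  # spike arrival boosts the current
        i_syn *= decay_s
        v = v * decay_m + i_syn * dt    # gradual charge injection
        if v >= v_th:                   # threshold crossing: output spike
            out_spikes.append(step * dt)
            v = 0.0                     # reset
    return out_spikes

print(simulate([5.0, 7.0, 9.0, 40.0]))
```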

  12. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.

  13. Beyond computer literacy: supporting youth's positive development through technology.

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  14. The “Chimera”: An Off-The-Shelf CPU/GPGPU/FPGA Hybrid Computing Platform

    Directory of Open Access Journals (Sweden)

    Ra Inta

    2012-01-01

The nature of modern astronomy means that a number of interesting problems exhibit a substantial computational bound, and this situation is gradually worsening. Scientists, increasingly fighting for valuable resources on conventional high-performance computing (HPC) facilities, often with a limited customizable user environment, are increasingly looking to hardware acceleration solutions. We describe here a heterogeneous CPU/GPGPU/FPGA desktop computing system (the “Chimera”), built with commercial off-the-shelf components. We show that this platform may be a viable alternative solution to many common computationally bound problems found in astronomy, though not without significant challenges. The most significant bottleneck in pipelines involving real data is most likely to be the interconnect (in this case the PCI Express bus residing on the CPU motherboard). Finally, we speculate on the merits of our Chimera system within the entire landscape of parallel computing, through the analysis of representative problems from UC Berkeley’s “Thirteen Dwarves.”

  15. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, in which a new reliability assessment architecture covering four aspects, with both quantitative data and qualitative knowledge, is established. The assessment architecture describes the complex, dynamic cloud computing environment more objectively than traditional methods. In addition, the ER rule, which performs well on multiple-attribute decision-making problems, is employed to integrate the different types of attributes in the assessment architecture, yielding more accurate assessment results. The assessment results of a case study on an actual cloud computing platform verify the effectiveness and the advantages of the proposed method.

  16. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.
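
    The pseudodiagonalization kernel mentioned above chains Jacobi transformations; a single such rotation, which zeroes one off-diagonal element of a symmetric matrix, is sketched below in plain NumPy as a building-block illustration. It is not the GPU kernel itself.

```python
# One Jacobi rotation: zero the (p, q) off-diagonal element of a
# symmetric matrix. GPU pseudodiagonalization chains many of these.
import numpy as np

def jacobi_rotate(A, p, q):
    """Return A' = J.T @ A @ J with A'[p, q] == 0 (A symmetric)."""
    if A[p, q] == 0.0:
        return A
    theta = 0.5 * np.arctan2(2.0 * A[p, q], A[q, q] - A[p, p])
    c, s = np.cos(theta), np.sin(theta)
    J = np.eye(A.shape[0])
    J[p, p] = J[q, q] = c
    J[p, q], J[q, p] = s, -s
    return J.T @ A @ J

A = np.array([[4.0, 1.0, 0.5],
              [1.0, 3.0, 0.2],
              [0.5, 0.2, 2.0]])
print(np.round(jacobi_rotate(A, 0, 1), 6))   # element (0, 1) is now zero
```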

  17. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    Science.gov (United States)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server while other functions run in the native Python environment. It uses functional programming and meta-programming to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in Cloud Computing and HPC environments with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arcpy scales linearly in a distributed environment. Arc4nix is open-source software.
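
    The decoupled pattern described above, generating Python code for the geospatial calculation and executing it remotely, can be sketched as follows. The "server" here is a local function standing in for the remote side, and run_tool is a hypothetical stand-in for the GIS backend; a real deployment would add transport, authentication and sandboxing.

```python
# Hypothetical sketch of the decoupled client-server pattern: the client
# builds a small script with the actual call, the server executes it and
# returns the result.

def build_script(tool: str, params: dict) -> str:
    # Meta-programming step: generate the code to run remotely.
    args = ", ".join(f"{k}={v!r}" for k, v in params.items())
    return f"result = run_tool({tool!r}, {args})"

def server_execute(script: str) -> object:
    def run_tool(name, **kwargs):          # stand-in for the GIS backend
        return {"tool": name, "params": kwargs, "status": "ok"}
    scope = {"run_tool": run_tool}
    exec(script, scope)                    # untrusted in real life: sandbox!
    return scope["result"]

script = build_script("Buffer_analysis", {"distance": "500 Meters"})
print(server_execute(script))
```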

  18. University Students' Use of Computers and Mobile Devices for Learning and Their Reading Speed on Different Platforms

    Science.gov (United States)

    Mpofu, Bongeka

    2016-01-01

    This research was aimed at the investigation of mobile device and computer use at a higher learning institution. The goal was to determine the current use of computers and mobile devices for learning and the students' reading speed on different platforms. The research was contextualised in a sample of students at the University of South Africa.…

  19. A computational platform for modeling and simulation of pipeline georeferencing systems

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, A.G.; Pellanda, P.C.; Gois, J.A. [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Roquette, P.; Pinto, M.; Durao, R. [Instituto de Pesquisas da Marinha (IPqM), Rio de Janeiro, RJ (Brazil); Silva, M.S.V.; Martins, W.F.; Camillo, L.M.; Sacsa, R.P.; Madeira, B. [Ministerio de Ciencia e Tecnologia (CT-PETRO2006MCT), Brasilia, DF (Brazil). Financiadora de Estudos e Projetos (FINEP). Plano Nacional de Ciencia e Tecnologia do Setor Petroleo e Gas Natural

    2009-07-01

This work presents a computational platform for the modeling and simulation of pipeline georeferencing systems, which was developed based on typical pipeline characteristics, on the dynamical modeling of the Pipeline Inspection Gauge (PIG) and on the analysis and implementation of an inertial navigation algorithm. The software environment for PIG trajectory simulation and navigation allows the user, through a friendly interface, to carry out evaluation tests of the inertial navigation system under different scenarios. Therefore, it is possible to define the required specifications of the pipeline georeferencing system components, such as: the required precision of the inertial sensors, the characteristics of the navigation auxiliary system (GPS-surveyed control points, odometers, etc.), the pipeline construction information to be considered in order to improve the trajectory estimation precision, and the signal processing techniques most suitable for the treatment of inertial sensor data. The simulation results are analyzed through the evaluation of several performance metrics usually considered in inertial navigation applications, and 2D and 3D plots of the trajectory estimation error and of the recovered trajectory in the three coordinates are made available to the user. This paper presents the simulation platform and its constituent modules and defines their functional characteristics and interrelationships. (author)

  20. High-Throughput Computing on High-Performance Platforms: A Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Oleynik, D [University of Texas at Arlington; Panitkin, S [Brookhaven National Laboratory (BNL); Matteo, Turilli [Rutgers University; Angius, Alessio [Rutgers University; Oral, H Sarp [ORNL; De, K [University of Texas at Arlington; Klimentov, A [Brookhaven National Laboratory (BNL); Wells, Jack C. [ORNL; Jha, S [Rutgers University

    2017-10-01

The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next-generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons on how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  1. Virtual network computing: cross-platform remote display and collaboration software.

    Science.gov (United States)

    Konerding, D E

    1999-04-01

VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC server can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  2. Phonon-based scalable platform for chip-scale quantum computing

    Directory of Open Access Journals (Sweden)

    Charles M. Reinke

    2016-12-01

We present a scalable phonon-based quantum computer on a phononic crystal platform. Practical schemes involve selective placement of a single acceptor atom in the peak of the strain field in a high-Q phononic crystal cavity that enables coupling of the phonon modes to the energy levels of the atom. We show theoretical optimization of the cavity design and coupling waveguide, along with estimated performance figures of the coupled system. A qubit can be created by entangling a phonon at the resonance frequency of the cavity with the atom states. Qubits based on this half-sound, half-matter quasi-particle, called a phoniton, may outcompete other quantum architectures in terms of combined emission rate, coherence lifetime, and fabrication demands.

  3. CoreFlow: A computational platform for integration, analysis and modeling of complex biological data

    DEFF Research Database (Denmark)

    Pasculescu, Adrian; Schoof, Erwin; Creixell, Pau

    2014-01-01

A major challenge in mass spectrometry and other large-scale applications is how to handle, integrate, and model the data that is produced. Given the speed at which technology advances and the need to keep pace with biological experiments, we designed a computational platform, CoreFlow, which provides programmers with a framework to manage data in real-time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts ... between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-sourced software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion...

  4. Design Patterns for Sparse-Matrix Computations on Hybrid CPU/GPU Platforms

    Directory of Open Access Journals (Sweden)

    Valeria Cardellini

    2014-01-01

We apply object-oriented software design patterns to develop code for scientific software involving sparse matrices. Design patterns arise when multiple independent developments produce similar designs which converge onto a generic solution. We demonstrate how to use design patterns to implement an interface for sparse matrix computations on NVIDIA GPUs starting from PSBLAS, an existing sparse matrix library, and from existing sets of GPU kernels for sparse matrices. We also compare the throughput of the PSBLAS sparse matrix–vector multiplication on two platforms exploiting the GPU with that obtained by a CPU-only PSBLAS implementation. Our experiments exhibit encouraging results regarding the comparison between CPU and GPU executions in double precision, obtaining a speedup of up to 35.35 on NVIDIA GTX 285 with respect to AMD Athlon 7750, and up to 10.15 on NVIDIA Tesla C2050 with respect to Intel Xeon X5650.
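
    The kernel whose CPU/GPU throughput is compared above is sparse matrix-vector multiplication; a plain reference version over the compressed sparse row (CSR) format is sketched below for clarity. It is not PSBLAS code and is written for readability, not speed.

```python
# Reference CSR sparse matrix-vector multiply (y = A @ x); the GPU
# kernels being benchmarked implement far more elaborate versions.
import numpy as np

def csr_spmv(indptr, indices, data, x):
    """y = A @ x with A stored in compressed sparse row format."""
    y = np.zeros(len(indptr) - 1)
    for row in range(len(y)):
        start, end = indptr[row], indptr[row + 1]
        y[row] = np.dot(data[start:end], x[indices[start:end]])
    return y

# 3x3 example matrix: [[4, 0, 1], [0, 2, 0], [1, 0, 3]]
indptr  = np.array([0, 2, 3, 5])
indices = np.array([0, 2, 1, 0, 2])
data    = np.array([4.0, 1.0, 2.0, 1.0, 3.0])
print(csr_spmv(indptr, indices, data, np.array([1.0, 1.0, 1.0])))  # [5. 2. 4.]
```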

  5. ENVIRONMENT: a computational platform to stochastically simulate reacting and self-reproducing lipid compartments

    Science.gov (United States)

    Mavelli, Fabio; Ruiz-Mirazo, Kepa

    2010-09-01

'ENVIRONMENT' is a computational platform that has been developed over the last few years with the aim of stochastically simulating the dynamics and stability of chemically reacting protocellular systems. Here we present and describe some of its main features, showing how the stochastic kinetics approach can be applied to study the time evolution of reaction networks in heterogeneous conditions, particularly when supramolecular lipid structures (micelles, vesicles, etc.) coexist with aqueous domains. These conditions are of special relevance for understanding the origins of cellular, self-reproducing compartments, in the context of prebiotic chemistry and evolution. We contrast our simulation results with real lab experiments, with the aim of bringing together theoretical and experimental research on protocell and minimal artificial cell systems.
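
    A standard building block of the stochastic kinetics approach mentioned above is Gillespie's direct method; a minimal sketch follows, with an invented two-reaction system (lipid uptake and release) standing in for the platform's far richer reaction networks.

```python
# Minimal Gillespie direct-method sketch; the two-reaction lipid system
# is invented for illustration.
import random

def gillespie(n_lipid=100, k_in=1.0, k_out=0.05, t_end=50.0, seed=42):
    rng = random.Random(seed)
    t, history = 0.0, []
    while t < t_end:
        a_in, a_out = k_in, k_out * n_lipid       # reaction propensities
        a_total = a_in + a_out
        t += rng.expovariate(a_total)             # time to the next event
        if rng.random() * a_total < a_in:
            n_lipid += 1                          # uptake into the membrane
        else:
            n_lipid -= 1                          # release to the solution
        history.append((t, n_lipid))
    return history

print(gillespie()[-3:])   # drifts toward the steady state k_in / k_out = 20
```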

  6. Migration of the Almaraz NPP integrated operation management system to a new computer platform

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    1996-01-01

In all power plants it becomes necessary, with the passage of time, to migrate the initial operation management systems to adapt them to current technologies. This is a good opportunity to improve the inclusion of data in the corporate database and to standardize the system interfaces and operation, whilst maintaining data and system operability. This article describes Almaraz's experience in migrating its Integrated Operation Management System to an advanced computer platform based on open systems (UNIX), a communications network (ETHERNET) and a database (ORACLE). To this end, clear objectives and strict standards were established to facilitate the work. The most noteworthy results obtained are: better quality and structure of the information in the corporate database; a standardised user interface in all applications; joint migration of the applications for Maintenance, Components and Spare Parts, Warehouses and Purchases; integration of new applications into the system; and introduction of the navigator, which allows movement around the database using all available applications. (Author)

  7. PID Controllers Design Applied to Positioning of Ball on the Stewart Platform

    Directory of Open Access Journals (Sweden)

    Koszewnik Andrzej

    2014-12-01

The paper presents the design and practical implementation of PID controllers for a Stewart platform. The platform uses a resistive touch panel as a sensor and servo motors as actuators. The complete control system stabilizing the ball on the platform is realized with an Arduino microcontroller and the Matlab/Simulink software. Two processes required to acquire measurement signals from the touch panel in two perpendicular directions, X and Y, are discussed. The first process is the calibration of the touch panel, and the second is the filtering of measurement signals with a low-pass Butterworth filter. The obtained signals are used to design the ball stabilization algorithm by decoupling the global system into two local subsystems. The algorithm is implemented in a soft real-time system. The parameters of both PID controllers (PIDx and PIDy) are tuned by the trial-and-error method and implemented in the microcontroller. Finally, the complete control system is tested at the laboratory stand.
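
    The discrete PID law behind controllers such as PIDx and PIDy is sketched below with a toy one-axis plant; the gains and plant model are illustrative, not the paper's tuned values.

```python
# Textbook discrete PID controller of the kind tuned per axis above;
# gains and the toy plant are illustrative.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy one-axis "ball position" plant: the control nudges the position.
pid_x = PID(kp=0.8, ki=0.1, kd=0.3, dt=0.02)
position = 5.0                       # ball starts 5 cm off-centre
for _ in range(500):
    position += 0.02 * pid_x.update(0.0, position)
print(f"final x offset: {position:.4f} cm")   # settles near 0
```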

  8. CPSS: a computational platform for the analysis of small RNA deep sequencing data.

    Science.gov (United States)

    Zhang, Yuanwei; Xu, Bo; Yang, Yifan; Ban, Rongjun; Zhang, Huan; Jiang, Xiaohua; Cooke, Howard J; Xue, Yu; Shi, Qinghua

    2012-07-15

Next generation sequencing (NGS) techniques have been widely used to document the small ribonucleic acids (RNAs) implicated in a variety of biological, physiological and pathological processes. An integrated computational tool is needed for handling and analysing the enormous datasets from the small RNA deep sequencing approach. Herein, we present a novel web server, CPSS (a computational platform for the analysis of small RNA deep sequencing data), designed to completely annotate and functionally analyse microRNAs (miRNAs) from NGS data on one platform with a single data submission. Small RNA NGS data can be submitted to this server, with analysis results returned in two parts: (i) annotation analysis, which provides the most comprehensive analysis of the small RNA transcriptome, including length distribution and genome mapping of sequencing reads, small RNA quantification, prediction of novel miRNAs, identification of differentially expressed miRNAs, piwi-interacting RNAs and other non-coding small RNAs between paired samples, and detection of miRNA editing and modifications; and (ii) functional analysis, including prediction of miRNA target genes by multiple tools, enrichment of gene ontology terms, signalling pathway involvement and protein-protein interaction analysis for the predicted genes. CPSS, a ready-to-use web server that integrates most functions of currently available bioinformatics tools, provides all the information wanted by the majority of users from small RNA deep sequencing datasets. CPSS is implemented in PHP/PERL+MySQL+R and can be freely accessed at http://mcg.ustc.edu.cn/db/cpss/index.html or http://mcg.ustc.edu.cn/sdap1/cpss/index.html.

  9. The design of an m-Health monitoring system based on a cloud computing platform

    Science.gov (United States)

    Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi

    2017-01-01

    Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.

  10. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    Science.gov (United States)

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have focused solely on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots.

  11. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    Science.gov (United States)

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, some of the challenges of computational biology and bioinformatics education are inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning as well as research. This has lowered the morale of many promising undergraduates, postgraduates and researchers, discouraging them from aspiring to undertake future study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of the users. This tool also has the capability of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  12. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
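
    At the core of any such stochastic simulation workflow is Gillespie's stochastic simulation algorithm (SSA). A minimal well-mixed Python sketch for a single decay reaction, a toy stand-in for the spatial reaction-diffusion models PyURDME targets (names and parameters are illustrative):

```python
import math
import random

def gillespie_decay(n0, k, t_end):
    """SSA for the single reaction A -> 0 with rate constant k.
    Returns the jump times and copy numbers of species A."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end and n > 0:
        a = k * n                                   # total propensity
        t += -math.log(1.0 - random.random()) / a   # exponential waiting time
        n -= 1                                      # fire the decay reaction
        times.append(t)
        counts.append(n)
    return times, counts

# times, counts = gillespie_decay(n0=100, k=0.1, t_end=50.0)
```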

  13. Validation study of a computer-based open surgical trainer: SimPraxis(®) simulation platform.

    Science.gov (United States)

    Tran, Linh N; Gupta, Priyanka; Poniatowski, Lauren H; Alanee, Shaheen; Dall'era, Marc A; Sweet, Robert M

    2013-01-01

    Technological advances have dramatically changed medical education, particularly in the era of work-hour restrictions, which increasingly highlights a need for novel methods to teach surgical skills. The purpose of this study was to evaluate the validity of a novel, computer-based, interactive, cognitive simulator for training surgeons to perform pelvic lymph node dissection (PLND). Eight prostate cancer experts evaluated the content of the simulator. Contextual aspects of the simulator were rated on a five-point Likert scale. The experts and nine first-year residents completed a simulated PLND. Time and deviations were logged, and the results were compared between experts and novices using the Mann-Whitney test. Before training, 88% of the experts felt that a validated simulator would be useful for PLND training. After testing, 100% of the experts felt that it would be more useful than standard video training. Eighty-eight percent stated that they would like to see the simulator in the curriculum of residency programs and 56% thought it would be useful for accreditation purposes. The experts felt that the simulator aided in overall understanding, training indications, concepts and steps of the procedure, training how to use an assistant, and enhanced the knowledge of anatomy. Median performance times taken by experts and interns to complete a PLND procedure on the simulator were 12.62 and 23.97 minutes, respectively. Median deviation from the incorporated procedure pathway for experts was 24.5 and was 89 for novices. We describe an interactive, computer-based simulator designed to assist in mastery of the cognitive steps of an open surgical procedure. This platform is intuitive and flexible, and could be applied to any stepwise medical procedure. Overall, experts outperformed novices in their performance on the trainer. Experts agreed that the content was acceptable, accurate, and representative.

  14. Cross-Platform Learning Media Development of Software Installation on Computer Engineering and Networking Expertise Package

    Directory of Open Access Journals (Sweden)

    Afis Pratama

    2018-03-01

    Software installation is one of the important lessons that must be mastered by students of the computer and network engineering expertise package. However, students often lack attention and concentration during the teaching and learning process in this subject, a problem that demands an immediate solution. This research draws on continually advancing technology, which can be used as a tool to support learning activities. Currently, all grade 10 students at a public vocational high school (SMK 8 Semarang, Indonesia) already own a gadget, either a smartphone or a laptop, and the intensity of usage is high. Based on this phenomenon, this research aims to create a cross-platform learning medium for software installation, one that is practical and can be carried easily on a smartphone and a laptop running different operating systems. This medium is thus expected to improve the learning outcomes, understanding and enthusiasm of students in the software installation lesson.

  15. Optimization of a Lattice Boltzmann Computation on State-of-the-Art Multicore Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2009-04-10

    We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon E5345 (Clovertown), AMD Opteron 2214 (Santa Rosa), AMD Opteron 2356 (Barcelona), Sun T5140 T2+ (Victoria Falls), as well as a QS20 IBM Cell Blade. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 15x improvement compared with the original code at a given concurrency. Additionally, we present detailed analysis of each optimization, which reveal surprising hardware bottlenecks and software challenges for future multicore systems and applications.
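
    The search-based tuning strategy itself is easy to sketch: generate code variants over a small parameter grid, time each on a representative workload, and keep the fastest. A schematic Python version (kernel_factory, param_grid and workload are illustrative names, not part of the LBMHD code):

```python
import itertools
import time

def autotune(kernel_factory, param_grid, workload):
    """Exhaustively search a parameter grid and keep the fastest variant,
    mimicking search-based performance tuning of a computational kernel."""
    best_params, best_time = None, float("inf")
    keys = list(param_grid)
    for values in itertools.product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        kernel = kernel_factory(**params)      # generate one code variant
        start = time.perf_counter()
        kernel(workload)                       # run it on the workload
        elapsed = time.perf_counter() - start
        if elapsed < best_time:
            best_params, best_time = params, elapsed
    return best_params, best_time

# best, t = autotune(make_lbm_kernel, {"block_x": [8, 16, 32],
#                                      "unroll": [1, 2, 4]}, lattice)
```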

  16. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    Science.gov (United States)

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
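
    As one representative building block of such a statistical portfolio, here is a compact Python sketch of Benjamini-Hochberg FDR control, a standard multiple-hypothesis procedure of the kind Perseus offers (this is not Perseus code):

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return a boolean list marking which hypotheses are rejected
    under Benjamini-Hochberg control of the false discovery rate."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Find the largest rank k with p_(k) <= (k/m) * alpha
    threshold_rank = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank / m * alpha:
            threshold_rank = rank
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= threshold_rank:
            rejected[i] = True
    return rejected

# print(benjamini_hochberg([0.001, 0.02, 0.04, 0.3, 0.9]))
```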

  17. A novel tripod-driven platform for in-situ positioning of samples and electrical probes in a TEM

    International Nuclear Information System (INIS)

    Medford, B D; Rogers, B L; Laird, D; Berdunov, N; Beton, P H; Lockwood, A J; Gnanavel, T; Guan, W; Wang, J; Moebus, G; Inkson, B J

    2010-01-01

    We present a design for a novel coarse positioning system based on a tilting platform which is positioned using linear slip/stick motors. The design differs from common arrangements of stacked x, y, and z motors, and also ball-mounted slip/stick motors, by allowing easy access along the central axis of the microscope holder. The drive motors are highly compact and co-linear and may be easily incorporated in an off-axis configuration, leaving a central cylindrical region with an approximate diameter of 3 mm which is available to accommodate screened electrical wiring and optical fibres. We show that the tripod can be used to manoeuvre two metallic tips towards each other in-situ in a TEM in nanometre-scale lateral steps.

  18. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    Science.gov (United States)

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole-atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. The simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves, and that tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi-scale mechanisms of clinical atrial arrhythmias such as AF.
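
    The wave-propagation behaviour at the heart of such models can be conveyed with a far simpler excitable-medium toy. Below is a minimal 1D FitzHugh-Nagumo cable in Python, a deliberately crude stand-in for the biophysically detailed atrial models above (all parameters are illustrative):

```python
import numpy as np

def fitzhugh_nagumo_cable(nx=200, nt=5000, dt=0.05, dx=1.0, D=1.0,
                          a=0.1, eps=0.01, b=0.5):
    """Explicit integration of a 1D FitzHugh-Nagumo cable: a stimulus at
    the left end launches an excitation wave that propagates rightward."""
    v = np.zeros(nx)   # fast (voltage-like) variable
    w = np.zeros(nx)   # slow recovery variable
    v[:5] = 1.0        # stimulate the left end of the cable
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx**2  # diffusion term
        v += dt * (D * lap + v * (v - a) * (1 - v) - w)
        w += dt * eps * (v - b * w)
    return v, w

# v, w = fitzhugh_nagumo_cable()
```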

  19. CoreFlow: a computational platform for integration, analysis and modeling of complex biological data.

    Science.gov (United States)

    Pasculescu, Adrian; Schoof, Erwin M; Creixell, Pau; Zheng, Yong; Olhovsky, Marina; Tian, Ruijun; So, Jonathan; Vanderlaan, Rachel D; Pawson, Tony; Linding, Rune; Colwill, Karen

    2014-04-04

    A major challenge in mass spectrometry and other large-scale applications is how to handle, integrate, and model the data that is produced. Given the speed at which technology advances and the need to keep pace with biological experiments, we designed a computational platform, CoreFlow, which provides programmers with a framework to manage data in real time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts into project-specific pipelines, tracks interdependencies between related tasks, and enables the generation of summary reports as well as publication-quality images. As a result, the gap between experimental and computational components of a typical large-scale biology project is reduced, decreasing the time between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-sourced software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion, and modeling of multiple/selected reaction monitoring (MRM/SRM) results. CoreFlow was purposely designed as an environment for programmers to rapidly perform data analysis. These analyses are assembled into project-specific workflows that are readily shared with biologists to guide the next stages of experimentation. Its simple yet powerful interface provides a structure where scripts can be written and tested virtually simultaneously to shorten the life cycle of code development for a particular task. The scripts are exposed at every step so that a user can quickly see the relationships between the data, the assumptions that have been made, and the manipulations that have been performed. Since the scripts use commonly available programming languages, they can easily be modified and reused.
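
    Tracking interdependencies between related tasks, as described above, amounts to running a dependency graph in topological order. A minimal Python sketch with the standard library (Python 3.9+; the pipeline tasks are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# in the spirit of CoreFlow's interdependent analysis scripts.
pipeline = {
    "load_raw":      set(),
    "correct_silac": {"load_raw"},
    "normalize":     {"correct_silac"},
    "model_mrm":     {"normalize"},
    "report":        {"model_mrm", "normalize"},
}

for task in TopologicalSorter(pipeline).static_order():
    print("running", task)   # a real system would dispatch R/Python/Perl here
```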

  20. Positional quality assessment of orthophotos obtained from sensors onboard multi-rotor UAV platforms.

    Science.gov (United States)

    Mesas-Carrascosa, Francisco Javier; Rumbao, Inmaculada Clavero; Berrocal, Juan Alberto Barrera; Porras, Alfonso García-Ferrer

    2014-11-26

    In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart.
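
    The spatial quality tests used by NMAs rest on simple statistics such as the planimetric RMSE between orthophoto coordinates and surveyed check points. A small Python sketch (the sample coordinates are invented):

```python
import math

def planimetric_rmse(measured, reference):
    """RMSE between orthophoto coordinates and surveyed check points,
    the core statistic in positional quality assessment."""
    assert len(measured) == len(reference)
    sq = [(xm - xr) ** 2 + (ym - yr) ** 2
          for (xm, ym), (xr, yr) in zip(measured, reference)]
    return math.sqrt(sum(sq) / len(sq))

# rmse = planimetric_rmse([(10.2, 5.1), (20.1, 7.9)],
#                         [(10.0, 5.0), (20.0, 8.0)])
```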

  1. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  2. Cpu/gpu Computing for AN Implicit Multi-Block Compressible Navier-Stokes Solver on Heterogeneous Platform

    Science.gov (United States)

    Deng, Liang; Bai, Hanli; Wang, Fang; Xu, Qingxin

    2016-06-01

    CPU/GPU computing allows scientists to tremendously accelerate their numerical codes. In this paper, we port and optimize a double precision alternating direction implicit (ADI) solver for three-dimensional compressible Navier-Stokes equations from our in-house Computational Fluid Dynamics (CFD) software on a heterogeneous platform. First, we implement a full GPU version of the ADI solver to remove a lot of redundant data transfers between CPU and GPU, and then design two fine-grain schemes, namely “one-thread-one-point” and “one-thread-one-line”, to maximize the performance. Second, we present a dual-level parallelization scheme using the CPU/GPU collaborative model to exploit the computational resources of both multi-core CPUs and many-core GPUs within the heterogeneous platform. Finally, considering the fact that memory on a single node becomes inadequate when the simulation size grows, we present a tri-level hybrid programming pattern, MPI-OpenMP-CUDA, that merges fine-grain parallelism using OpenMP and CUDA threads with coarse-grain parallelism using MPI for inter-node communication. We also propose a strategy to overlap the computation with communication using the advanced features of CUDA and MPI programming. We obtain a speedup of 6.0 for the ADI solver on one Tesla M2050 GPU in contrast to two Xeon X5670 CPUs. Scalability tests show that our implementation can offer significant performance improvement on heterogeneous platforms.
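
    Each implicit sweep of an ADI scheme reduces to many independent tridiagonal solves, which is what makes the method amenable to the fine-grain GPU mappings described above. For reference, a serial Python version of that tridiagonal kernel (the Thomas algorithm; an illustration, not the authors' code):

```python
def thomas(a, b, c, d):
    """Solve a tridiagonal system: a is the sub-diagonal, b the main
    diagonal, c the super-diagonal, d the right-hand side.
    a[0] and c[-1] are unused."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# x = thomas([0, -1, -1], [2, 2, 2], [-1, -1, 0], [1, 0, 1])
```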

  3. Performance Assessment of a Custom, Portable, and Low-Cost Brain-Computer Interface Platform.

    Science.gov (United States)

    McCrimmon, Colin M; Fu, Jonathan Lee; Wang, Ming; Lopes, Lucas Silva; Wang, Po T; Karimi-Bidhendi, Alireza; Liu, Charles Y; Heydari, Payam; Nenadic, Zoran; Do, An Hong

    2017-10-01

    Conventional brain-computer interfaces (BCIs) are often expensive, complex to operate, and lack portability, which confines their use to laboratory settings. Portable, inexpensive BCIs can mitigate these problems, but it remains unclear whether their low-cost design compromises their performance. Therefore, we developed a portable, low-cost BCI and compared its performance to that of a conventional BCI. The BCI was assembled by integrating a custom electroencephalogram (EEG) amplifier with an open-source microcontroller and a touchscreen. The function of the amplifier was first validated against a commercial bioamplifier, followed by a head-to-head comparison between the custom BCI (using four EEG channels) and a conventional 32-channel BCI. Specifically, five able-bodied subjects were cued to alternate between hand opening/closing and remaining motionless while the BCI decoded their movement state in real time and provided visual feedback through a light emitting diode. Subjects repeated the above task for a total of 10 trials, and were unaware of which system was being used. The performance in each trial was defined as the temporal correlation between the cues and the decoded states. The EEG data simultaneously acquired with the custom and commercial amplifiers were visually similar and highly correlated ( ρ = 0.79). The decoding performances of the custom and conventional BCIs averaged across trials and subjects were 0.70 ± 0.12 and 0.68 ± 0.10, respectively, and were not significantly different. The performance of our portable, low-cost BCI is comparable to that of the conventional BCIs. Platforms, such as the one developed here, are suitable for BCI applications outside of a laboratory.
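
    The per-trial performance measure described above, the temporal correlation between the cue sequence and the decoded state sequence, is a plain Pearson correlation. In Python (the example sequences are invented):

```python
def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# cues   = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]   # 1 = "open/close hand" epoch
# states = [0, 0, 1, 1, 0, 0, 0, 1, 1, 0]   # decoder output per epoch
# print(pearson(cues, states))
```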

  4. Validation study of a computer-based open surgical trainer: SimPraxis® simulation platform

    Directory of Open Access Journals (Sweden)

    Tran LN

    2013-03-01

    … Conclusion: We describe an interactive, computer-based simulator designed to assist in mastery of the cognitive steps of an open surgical procedure. This platform is intuitive and flexible, and could be applied to any stepwise medical procedure. Overall, experts outperformed novices in their performance on the trainer. Experts agreed that the content was acceptable, accurate, and representative. Keywords: simulation, surgical education, training, simulator, video

  5. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    Science.gov (United States)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as

  6. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

    We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design for such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context-information modeling and processing, establishing relationships between contexts and interface-design knowledge through adaptive knowledge reasoning, and implementing the visualization of the adaptive interface with the aid of interface-tools technology.

  7. An Analysis of Impact Factors for Positioning Performance in WLAN Fingerprinting Systems Using Ishikawa Diagrams and a Simulation Platform

    Directory of Open Access Journals (Sweden)

    Keqiang Liu

    2017-01-01

    Many factors influence positioning performance in WLAN RSSI fingerprinting systems, and summarizing these factors is an important but challenging job. Moreover, impact analysis of nonalgorithm factors is significant for system application and quality control, yet little research has been conducted on it. This paper analyzes and summarizes the potential impact factors by using an Ishikawa diagram covering radio signal transmitting, propagating, receiving, and processing. A simulation platform was developed to facilitate the analysis experiments, and the paper classifies the potential factors into controllable, uncontrollable, nuisance, and held-constant factors with simulation feasibility in mind. Five nonalgorithm controllable factors, namely AP density, AP distribution, radio signal propagation attenuation factor, radio signal propagation noise, and RP density, were taken into consideration, and the one-factor-at-a-time (OFAT) analysis method was adopted in the experiments. The positioning result was obtained using both deterministic and probabilistic algorithms, and the error is presented as RMSE and CDF. The results indicate that high AP density, signal propagation attenuation factor, and RP density, together with a low signal propagation noise level, are favorable to better performance, while AP distribution shows no particular impact pattern on the positioning error. Overall, this work contributes to the quality control of WLAN fingerprinting solutions.
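
    For context on the deterministic algorithm mentioned above: in its simplest form, fingerprinting positioning averages the coordinates of the k reference points whose stored RSSI vectors lie closest to the observed vector in signal space. A self-contained Python sketch (the fingerprint data are invented):

```python
import math

def knn_position(observed, fingerprints, k=3):
    """Deterministic WLAN fingerprinting: average the coordinates of the
    k reference points (RPs) whose stored RSSI vectors are closest in
    Euclidean signal distance to the observed RSSI vector."""
    def dist(rssi):
        return math.sqrt(sum((o - r) ** 2 for o, r in zip(observed, rssi)))
    nearest = sorted(fingerprints, key=lambda fp: dist(fp["rssi"]))[:k]
    x = sum(fp["x"] for fp in nearest) / k
    y = sum(fp["y"] for fp in nearest) / k
    return x, y

# fingerprints = [{"x": 0, "y": 0, "rssi": [-40, -70]},
#                 {"x": 5, "y": 0, "rssi": [-55, -60]},
#                 {"x": 0, "y": 5, "rssi": [-70, -45]}]
# print(knn_position([-50, -62], fingerprints, k=2))
```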

  8. Development of a Cloud Computing-Based Pier Type Port Structure Stability Evaluation Platform Using Fiber Bragg Grating Sensors.

    Science.gov (United States)

    Jo, Byung Wan; Jo, Jun Ho; Khan, Rana Muhammad Asad; Kim, Jung Hoon; Lee, Yun Sung

    2018-05-23

    Structural health monitoring is a topic of great interest in port structures due to the ageing of structures and the limitations of existing evaluation methods. This paper presents a cloud computing-based stability evaluation platform for a pier type port structure using Fiber Bragg Grating (FBG) sensors, in a system consisting of an FBG strain sensor, FBG displacement gauge, FBG angle meter, gateway, and cloud computing-based web server. The sensors were installed on core components of the structure and measurements were taken to evaluate it. The measured values were transmitted to the web server via the gateway, where all data were analyzed and visualized to evaluate the structure based on the safety evaluation index (SEI). The stability evaluation platform enables efficient monitoring of pier type port structures, which can be carried out easily anytime and anywhere by converging new technologies such as cloud computing and FBG sensors. In addition, the platform has been successfully implemented at “Maryang Harbor”, situated in Maryang-Meyon of Korea, to test its durability.

  9. Computational and Statistical Aspects of Determining Ship’s Position

    Directory of Open Access Journals (Sweden)

    Drapella Antoni

    2017-12-01

    In its mathematical essence, the task of determining a ship's position coordinates is to minimize an appropriately defined goal function. This paper proposes using the method of conjugate gradients for this purpose. The reason is that the calculations may be performed within a few seconds, because Microsoft and Apache have implemented the conjugate gradient method as a tool called the Solver and embedded this tool in their widely offered and popular spreadsheets, namely Excel and OpenOffice Calc, respectively. Further in this paper it is shown how to precisely assess the errors of the ship's position coordinates with the Monte Carlo method, employing the Solver.
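
    A minimal rendering of that idea in Python rather than a spreadsheet: a hypothetical fix from measured ranges to three known landmarks, minimizing the sum of squared range residuals with SciPy's conjugate-gradient method (the coordinates and ranges are invented):

```python
import numpy as np
from scipy.optimize import minimize

landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
ranges = np.array([7.07, 7.07, 7.07])   # measured distances to each landmark

def goal(p):
    """Sum of squared range residuals at candidate position p."""
    d = np.linalg.norm(landmarks - p, axis=1)
    return np.sum((d - ranges) ** 2)

fix = minimize(goal, x0=np.array([1.0, 1.0]), method="CG")
print(fix.x)   # converges near (5, 5) for this synthetic geometry
```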

  10. Bridging Computational Genetics and Vectorcardiography: A Robust Platform for the Early Detection of Heart Disease

    Science.gov (United States)

    Sridhar, S.

    2017-12-01

    By 2030, it is predicted that over 14 million people will die of heart disease annually, many of whom will discover their risk when it is too late to seek effective treatment or pursue lifestyle changes. In this research study, I sought to design a robust computational platform to gauge a patient's risk for cardiac diseases (CDs) based on demographics, genotype, and cardiac action potentials through machine learning, statistical analysis, and vectorcardiography. By analyzing previously published data, I discovered that certain polymorphisms in the ACE and MTHFR genes contribute significantly to CD risk. The deletion allele of the ACE insertion/deletion polymorphism increases ACE serum levels, promoting CD phenotypes. A point mutation in the MTHFR gene curbs the metabolism of folic acid, giving rise to CD phenotypes. I analyzed over 9000 British Medical Journal and American Heart Association patients to determine the CD risk associated with each ACE and MTHFR genotype. In the vectorcardiography phase of my study, I investigated trends in the maximal vectors of the QRS loop of the cardiac wave. Using a database with both normal and diseased vectorcardiographic action potentials, I plotted the maximal vectors on a 3D RAS coordinate plane to analyze their magnitude and direction. From the ACE datasets, I discovered that female patients over 45 and of Indian descent with two ACE deletion alleles exhibited the highest CD risk. Using this spectrum, I successfully constructed a neural network with an accuracy score of 0.867 that predicts CD risk based on ACE genotype, gender, region, and age. Investigation of the MTHFR genome showed that those with a homozygous mutated gene had a significantly higher CD risk. In my vectorcardiography study, I found that healthy QRS vectors pointed predominantly to the right-anterior region of the coordinate plane and exhibited short, consistent magnitudes. On the other hand, diseased vectors pointed to the left-posterior region and

  11. High positive computed tomography yields in the emergency ...

    African Journals Online (AJOL)

    when the diagnosis is uncertain.[1,2] It is therefore ... Methods. This was a retrospective record review of all patients who received CT ... period. Primary outcomes were to establish CT scan usage and positive yield rates. ... scans performed in the hospital. ... considered for surgical intervention may have a negative scan and.

  12. Position-based quantum cryptography and catalytic computation

    NARCIS (Netherlands)

    Speelman, F.

    2016-01-01

    In this thesis, we present several results along two different lines of research. The first part concerns the study of position-based quantum cryptography, a topic in quantum cryptography. By combining quantum mechanics with special relativity theory, new cryptographic tasks can be developed that

  13. Contribution to global computation infrastructure: inter-platform delegation, integration of standard services and application to high-energy physics

    International Nuclear Information System (INIS)

    Lodygensky, Oleg

    2006-01-01

    The generalization and implementation of current information resources, particularly large storage capacities and networks, allow new methods of work and ways of entertainment to be conceived. Centralized, stand-alone, monolithic computing stations have gradually been replaced by distributed, client-tailored architectures, which in turn are challenged by the new distributed systems called 'peer-to-peer' systems. This migration is no longer the realm of specialists: users of more modest skills have adopted these techniques for e-mail, commercial information and exchanging various sorts of files on a peer-to-peer basis. Trade, industry and research alike profit largely from the new 'grid' technique for handling information at a global scale. The present work concerns the use of the grid for computation. A synergy was created at Paris-Sud University, Orsay, between the Information Research Laboratory (LRI) and the Linear Accelerator Laboratory (LAL), in order to foster work on a grid infrastructure of high research interest for LRI that offers new working methods for LAL. The results of the work developed within this interdisciplinary collaboration are based on XtremWeb, the research and production platform for global computation elaborated at LRI. First, the current status of large-scale distributed systems is presented, along with their basic principles and user-oriented architecture. XtremWeb is then described, focusing on the modifications made to both architecture and implementation so that the platform fulfils its requirements optimally. Studies with the platform are then presented, allowing a generalization of inter-grid resources and the development of a user-oriented grid adapted to special services. Finally, the operation modes, the problems to solve and the advantages of this new platform are presented for the high-energy physics research community, the most demanding of users.

  14. A novel tablet computer platform for advanced language mapping during awake craniotomy procedures.

    Science.gov (United States)

    Morrison, Melanie A; Tam, Fred; Garavaglia, Marco M; Golestanirad, Laleh; Hare, Gregory M T; Cusimano, Michael D; Schweizer, Tom A; Das, Sunit; Graham, Simon J

    2016-04-01

    A computerized platform has been developed to enhance behavioral testing during intraoperative language mapping in awake craniotomy procedures. The system is uniquely compatible with the environmental demands of both the operating room and preoperative functional MRI (fMRI), thus providing standardized testing toward improving spatial agreement between the 2 brain mapping techniques. Details of the platform architecture, its advantages over traditional testing methods, and its use for language mapping are described. Four illustrative cases demonstrate the efficacy of using the testing platform to administer sophisticated language paradigms, and the spatial agreement between intraoperative mapping and preoperative fMRI results. The testing platform substantially improved the ability of the surgeon to detect and characterize language deficits. Use of a written word generation task to assess language production helped confirm areas of speech apraxia and speech arrest that were inadequately characterized or missed with the use of traditional paradigms, respectively. Preoperative fMRI of the analogous writing task was also assistive, displaying excellent spatial agreement with intraoperative mapping in all 4 cases. Sole use of traditional testing paradigms can be limiting during awake craniotomy procedures. Comprehensive assessment of language function will require additional use of more sophisticated and ecologically valid testing paradigms. The platform presented here provides a means to do so.

  15. Computer modelling of position-sensitive scintillator detectors

    International Nuclear Information System (INIS)

    Schelten, J.; Kurz, R.; Kernforschungsanlage Juelich G.m.b.H.

    1983-01-01

    The essential properties of a two-dimensional PSD consisting of 7 x 7 circular PMs of diameter D = 68 mm, optically coupled to a glass block disperser of thickness H, and of a thin glass scintillator which is optically decoupled from the disperser, are analyzed by computer simulation of the detector geometry, which determines the light distribution on rows and columns of PMs for a neutron capture event, and of the electronic signal handling, which leads to the response function Q(x,y). The computer simulations were performed in order to investigate geometrical variations, such as PMs with a square photo-cathode, a hexagonal arrangement, the effect of the disperser thickness and of conical condensers in front of the PMs, and edge effects due to the finite size of the disperser. The linearity of the detector can be optimised by adjusting three smoothing parameters S, S' and S''. These parameters can be introduced if the signal processing, which determines a neutron event, is based on a coarse selection of three PM columns and three rows followed by a weighted pulse-height division for a final determination of the x and y coordinates. This paper briefly describes the simulations and presents the calculated results, which refer closely to the two-dimensional PSD being built in the Laboratory. (author)
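
    The weighted pulse-height division mentioned above reduces to a pulse-height-weighted centroid over the selected PM rows or columns. A one-function Python sketch (the example values are invented):

```python
def centroid_position(pulse_heights, positions):
    """Weighted pulse-height division: the event coordinate is the
    pulse-height-weighted centroid of the selected PM rows/columns."""
    total = sum(pulse_heights)
    return sum(q * x for q, x in zip(pulse_heights, positions)) / total

# Three selected PM columns centred at x = -68, 0, +68 mm:
# x_event = centroid_position([0.2, 1.0, 0.3], [-68.0, 0.0, 68.0])
```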

  16. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed

  17. Computational Fluid Dynamic Analysis of a Floating Offshore Wind Turbine Experiencing Platform Pitching Motion

    Directory of Open Access Journals (Sweden)

    Thanhtoan Tran

    2014-08-01

    The objective of this study is to illustrate the unsteady aerodynamic effects of a floating offshore wind turbine experiencing the prescribed pitching motion of a supporting floating platform as a sine function. The three-dimensional, unsteady Reynolds-Averaged Navier-Stokes equations with the shear-stress transport (SST) k-ω turbulence model were applied. Moreover, an overset grid approach was used to model the rigid-body motion of a wind turbine blade. The current simulation results are compared to various approaches from previous studies. The unsteady aerodynamic loads of the blade were demonstrated to change drastically with respect to the frequency and amplitude of platform motion.

  18. Computed tomography with thermal neutrons and gaseous position sensitive detector

    International Nuclear Information System (INIS)

    Souza, Maria Ines Silvani

    2001-12-01

    A third-generation tomographic system using a parallel thermal neutron beam and a gaseous position sensitive detector has been developed in three discrete phases. In the first, X-ray tomographic images of several objects were obtained using a position sensitive detector designed and constructed for this purpose. The second phase involved converting that detector for thermal neutron detection by using materials capable of converting neutrons into detectable charged particles, and then testing its performance in a tomographic system by evaluating the quality of images of several test objects containing materials applicable in the engineering field. Highly enriched 3He, replacing the argon-methane otherwise used as the filling gas for X-ray detection, as well as a gadolinium foil, were utilized as converters. Besides pure enriched 3He, its mixture with argon-methane, and later with propane, was also tested in order to evaluate the detector efficiency and resolution. After each gas change, the overall performance of the tomographic system using the modified detector was analyzed through measurements of the related parameters. This was done by analyzing the images produced by test objects containing several materials having well-known attenuation coefficients for both thermal neutrons and X-rays. In order to compare the performance of the position sensitive detector as modified to detect thermal neutrons with that of a conventional BF3 detector, additional tomographic scans were conducted using the latter. The results were compared in terms of advantages, handicaps and complementary aspects for the different kinds of radiation and materials. (author)

  19. HySDeP: a computational platform for on-board hydrogen storage systems – hybrid high-pressure solid-state and gaseous storage

    DEFF Research Database (Denmark)

    Mazzucco, Andrea; Rokni, Masoud

    2016-01-01

    A computational platform is developed in the Modelica® language within the DymolaTM environment to provide a tool for the design and performance comparison of on-board hydrogen storage systems. The platform has been coupled with an open source library for hydrogen fueling stations to investigate...

  20. Computer-assisted upper extremity training using interactive biking exercise (iBikE) platform.

    Science.gov (United States)

    Jeong, In Cheol; Finkelstein, Joseph

    2012-01-01

    Upper extremity exercise training has been shown to improve clinical outcomes in different chronic health conditions. Arm-operated bicycles are frequently used to facilitate upper extremity training; however, effective use of these devices in patients' homes is hampered by the lack of remote connectivity with the clinical rehabilitation team, the inability to monitor exercise progress in real time using simple graphical representation, and the absence of an alert system to prevent exertion levels from exceeding those approved by the clinical rehabilitation team. We developed an interactive biking exercise (iBikE) platform aimed at addressing these limitations. The platform uses a miniature wireless 3-axis accelerometer mounted on the patient's wrist that transmits the cycling acceleration data to a laptop. The laptop screen presents an exercise dashboard to the patient in real time, allowing easy graphical visualization of exercise progress and presentation of exercise parameters in relation to prescribed targets. The iBikE platform is programmed to alert the patient when exercise intensity exceeds the levels recommended by the patient's care provider. The iBikE platform has been tested in 7 healthy volunteers (age range: 26-50 years) and shown to reliably reflect exercise progress and to generate alerts at pre-set levels. Implementation of remote connectivity with the patient rehabilitation team is warranted for future extension and evaluation efforts.

  1. VibroCV: a computer vision-based vibroarthrography platform with possible application to Juvenile Idiopathic Arthritis.

    Science.gov (United States)

    Wiens, Andrew D; Prahalad, Sampath; Inan, Omer T

    2016-08-01

    Vibroarthrography, a method for interpreting the sounds emitted by a knee during movement, has been studied for several joint disorders since 1902. However, to our knowledge, the usefulness of this method for management of Juvenile Idiopathic Arthritis (JIA) has not been investigated. To study joint sounds as a possible new biomarker for pediatric cases of JIA we designed and built VibroCV, a platform to capture vibroarthrograms from four accelerometers; electromyograms (EMG) and inertial measurements from four wireless EMG modules; and joint angles from two Sony Eye cameras and six light-emitting diodes with commercially-available off-the-shelf parts and computer vision via OpenCV. This article explains the design of this turn-key platform in detail, and provides a sample recording captured from a pediatric subject.

  2. Coupled sensor/platform control design for low-level chemical detection with position-adaptive micro-UAVs

    Science.gov (United States)

    Goodwin, Thomas; Carr, Ryan; Mitra, Atindra K.; Selmic, Rastko R.

    2009-05-01

    We discuss the development of Position-Adaptive Sensors [1] for purposes of detecting embedded chemical substances in challenging environments. This concept is a generalization of patented Position-Adaptive Radar Concepts developed at AFRL for challenging conditions such as urban environments. For purposes of investigating the detection of chemical substances using multiple MAV (Micro-UAV) platforms, we have designed and implemented an experimental testbed with sample structures such as wooden carts that contain controlled leakage points. Under this general concept, some members of a MAV swarm can serve as external position-adaptive "transmitters" by blowing air over the cart, and other members of the swarm can serve as external position-adaptive "receivers" equipped with chemical or biological (chem/bio) sensors that function as "electronic noses". The objective can be defined as improving the particle count of chem/bio concentrations that impinge on a MAV-based position-adaptive sensor surrounding a chemical repository, such as a cart, via the development of intelligent position-adaptive control algorithms. The overall effect is to improve the detection and false-alarm statistics of the overall system. Within the major sections of this paper, we discuss a number of different aspects of developing our initial MAV-Based Sensor Testbed. This testbed includes blowers to simulate position-adaptive excitations and a MAV from Draganfly Innovations Inc. with stable design modifications to accommodate our chem/bio sensor boom design. We include details with respect to several critical phases of the development effort including development of the wireless sensor network and experimental apparatus, development of the stable sensor boom for the MAV, integration of chem/bio sensors and sensor node onto the MAV and boom, development of position-adaptive control algorithms and initial tests at IDCAST (Institute for the Development and Commercialization of Advanced Sensor Technology).

  3. A Platform of Constructivist Learning in Practice: Computer Literacy Integrated into Elementary School

    Directory of Open Access Journals (Sweden)

    Ivan Garcia

    2010-06-01

    In Mexico, the conventional teaching approach, when applied specifically to elementary school, seems to fall short of attaining the overall quality objective. The main manifestation of this problem is that teachers cannot be sure their students really understand the dynamic nature of concepts and mechanisms from an early age, particularly in elementary school. This paper presents a pedagogical/technological platform, based on constructivist ideas, as a means of making the learning process in elementary school more efficient and interesting. The constructivist platform presented here uses graphical simulators developed for Web 2.0 as a support tool, creating a teaching and learning environment in which practical experiments can be undertaken as each topic is introduced and explained.

  4. Online Model Evaluation in a Large-Scale Computational Advertising Platform

    OpenAIRE

    Shariat, Shahriar; Orten, Burkay; Dasdan, Ali

    2015-01-01

    Online media provides opportunities for marketers through which they can deliver effective brand messages to a wide range of audiences. Advertising technology platforms enable advertisers to reach their target audience by delivering ad impressions to online users in real time. In order to identify the best marketing message for a user and to purchase impressions at the right price, we rely heavily on bid prediction and optimization models. Even though the bid prediction models are well studied…

  5. Computer-implemented method and apparatus for autonomous position determination using magnetic field data

    Science.gov (United States)

    Ketchum, Eleanor A. (Inventor)

    2000-01-01

    A computer-implemented method and apparatus for determining the position of a vehicle to within 100 km, autonomously, from magnetic field measurements and attitude data, without a priori knowledge of position. An inverted dipole pair of two possible position solutions for each measurement of magnetic field data is deterministically calculated by a program-controlled processor solving the inverted first-order spherical harmonic representation of the geomagnetic field for two unit position vectors 180 degrees apart and a vehicle distance from the center of the earth. Correction schemes such as successive substitution and a Newton-Raphson method are applied to each dipole. The two position solutions for each measurement are saved separately. Velocity vectors for the position solutions are calculated so that a total energy difference for each of the two resultant position paths is computed. The position path with the smaller absolute total energy difference is chosen as the true position path of the vehicle.
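
    A generic sketch of the Newton-Raphson correction step, with a finite-difference Jacobian, of the kind applied above to refine each candidate dipole solution (an illustration for a square system, not the patented implementation):

```python
import numpy as np

def newton_raphson(f, x0, tol=1e-10, max_iter=50, h=1e-6):
    """Iteratively solve f(x) = 0 for a square system using Newton's
    method with a forward-difference Jacobian approximation."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = np.asarray(f(x))
        J = np.empty((fx.size, x.size))
        for j in range(x.size):               # build Jacobian column by column
            dx = np.zeros_like(x)
            dx[j] = h
            J[:, j] = (np.asarray(f(x + dx)) - fx) / h
        step = np.linalg.solve(J, -fx)
        x = x + step
        if np.linalg.norm(step) < tol:
            break
    return x

# Example: solve x^2 - 2 = 0, converging to sqrt(2)
# print(newton_raphson(lambda x: np.array([x[0]**2 - 2]), [1.0]))
```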

  6. An Efficient Neural-Network-Based Microseismic Monitoring Platform for Hydraulic Fracture on an Edge Computing Architecture.

    Science.gov (United States)

    Zhang, Xiaopu; Lin, Jun; Chen, Zubin; Sun, Feng; Zhu, Xi; Fang, Gengfa

    2018-06-05

    Microseismic monitoring is one of the most critical technologies for hydraulic fracturing in oil and gas production. To detect events in an accurate and efficient way, there are two major challenges. One challenge is how to achieve high accuracy due to a poor signal-to-noise ratio (SNR). The other one is concerned with real-time data transmission. Taking these challenges into consideration, an edge-computing-based platform, namely Edge-to-Center LearnReduce, is presented in this work. The platform consists of a data center with many edge components. At the data center, a neural network model combined with convolutional neural network (CNN) and long short-term memory (LSTM) is designed and this model is trained by using previously obtained data. Once the model is fully trained, it is sent to edge components for events detection and data reduction. At each edge component, a probabilistic inference is added to the neural network model to improve its accuracy. Finally, the reduced data is delivered to the data center. Based on experiment results, a high detection accuracy (over 96%) with less transmitted data (about 90%) was achieved by using the proposed approach on a microseismic monitoring system. These results show that the platform can simultaneously improve the accuracy and efficiency of microseismic monitoring.
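
    For flavor, a sketch of what a combined CNN+LSTM detector can look like in code, written here with PyTorch; the layer sizes are illustrative placeholders, not the published architecture:

```python
import torch
import torch.nn as nn

class CnnLstmDetector(nn.Module):
    """Toy CNN+LSTM event detector: convolutions extract local waveform
    features, the LSTM models their temporal evolution."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(4),
        )
        self.lstm = nn.LSTM(input_size=16, hidden_size=32, batch_first=True)
        self.head = nn.Linear(32, n_classes)

    def forward(self, x):             # x: (batch, 1, samples)
        z = self.conv(x)              # (batch, 16, samples / 4)
        z = z.transpose(1, 2)         # (batch, time, features) for the LSTM
        out, _ = self.lstm(z)
        return self.head(out[:, -1])  # classify from the last time step

# logits = CnnLstmDetector()(torch.randn(8, 1, 1024))
```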

  7. An Efficient Neural-Network-Based Microseismic Monitoring Platform for Hydraulic Fracture on an Edge Computing Architecture

    Directory of Open Access Journals (Sweden)

    Xiaopu Zhang

    2018-06-01

    Microseismic monitoring is one of the most critical technologies for hydraulic fracturing in oil and gas production. To detect events in an accurate and efficient way, there are two major challenges. One challenge is how to achieve high accuracy due to a poor signal-to-noise ratio (SNR). The other one is concerned with real-time data transmission. Taking these challenges into consideration, an edge-computing-based platform, namely Edge-to-Center LearnReduce, is presented in this work. The platform consists of a data center with many edge components. At the data center, a neural network model combined with convolutional neural network (CNN) and long short-term memory (LSTM) is designed and this model is trained by using previously obtained data. Once the model is fully trained, it is sent to edge components for events detection and data reduction. At each edge component, a probabilistic inference is added to the neural network model to improve its accuracy. Finally, the reduced data is delivered to the data center. Based on experiment results, a high detection accuracy (over 96%) with less transmitted data (about 90%) was achieved by using the proposed approach on a microseismic monitoring system. These results show that the platform can simultaneously improve the accuracy and efficiency of microseismic monitoring.

  8. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Embedded real-time vision applications are being rapidly deployed in a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is considered a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, in an attempt to provide both system- and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained data flow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented and functional programming, nor does it offer runtime type-checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstrations include static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. The code project and linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.

  9. SoC-Based Edge Computing Gateway in the Context of the Internet of Multimedia Things: Experimental Platform

    Directory of Open Access Journals (Sweden)

    Maher Jridi

    2018-01-01

    This paper presents algorithm/architecture and hardware/software co-designs for implementing a digital edge computing layer on a Zynq platform in the context of the Internet of Multimedia Things (IoMT). Traditional cloud computing is no longer suitable for applications that require image processing, due to cloud latency and privacy concerns. With edge computing, data are processed, analyzed, and encrypted very close to the device, which makes it possible to secure data and act rapidly on connected things. The proposed edge computing system is composed of a reconfigurable module to simultaneously compress and encrypt multiple images, along with wireless image transmission and display functionalities. A lightweight implementation of the proposed design is obtained by approximate computing of the discrete cosine transform (DCT) and by using a simple chaotic generator, which greatly enhances the encryption efficiency. The deployed solution includes four configurations based on HW/SW partitioning in order to handle the compromise between execution time, area, and energy consumption. It was found with the experimental setup that by moving more components to hardware execution, a timing speedup of more than nine times could be achieved with a negligible amount of energy consumption. The power efficiency was then enhanced by a factor of 7.7.
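
    The simple-chaotic-generator idea can be illustrated with a logistic-map keystream XORed against the compressed bytes. A toy Python sketch in the spirit of such lightweight schemes (not the paper's actual cipher; the seed is arbitrary and the construction is for illustration, not security):

```python
def logistic_keystream(x0, r=3.99, n=16):
    """Derive a byte keystream from the logistic map x <- r*x*(1-x),
    which behaves chaotically for r near 4 and x0 in (0, 1)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(int(x * 256) & 0xFF)   # quantize the state to one byte
    return out

def xor_encrypt(data, x0):
    """XOR the data against the chaotic keystream; XOR is its own inverse."""
    ks = logistic_keystream(x0, n=len(data))
    return bytes(b ^ k for b, k in zip(data, ks))

# ct = xor_encrypt(b"compressed image", x0=0.61803)
# pt = xor_encrypt(ct, x0=0.61803)   # decrypts back to the original bytes
```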

  10. Significance of buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation on the level of the midbuccal mucosa

    NARCIS (Netherlands)

    Zuiderveld, Elise G; den Hartog, Laurens; Vissink, Arjan; Raghoebar, Gerry M; Meijer, Henny J A

    2014-01-01

    This study assessed whether buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation affects the level of the midbuccal mucosa (MBM). Ninety patients with a single-tooth implant in the esthetic zone were included. The level of the MBM was measured on photographs

  11. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda; Yokota, Rio; Keyes, David E.

    2016-01-01

    We observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time.

  12. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB of major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale, high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows for earth system and ecosystem modelling, weather research, and satellite and other observational data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of available data and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  13. Analysis and Research on Spatial Data Storage Model Based on Cloud Computing Platform

    Science.gov (United States)

    Hu, Yong

    2017-12-01

    In this paper, the data processing and storage characteristics of cloud computing are analyzed and studied. On this basis, a cloud computing data storage model based on a BP neural network is proposed. The model selects a server cluster according to the attributes of the data to be stored, yielding a spatial data storage model with load balancing that is shown to be feasible and to offer practical advantages.
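
    A toy reading of that idea is sketched below, under the assumption that a small trained network scores candidate server clusters from data attributes and the best-matching, least-loaded cluster wins; the weights and attribute names are invented for illustration and are not values from the paper.

```python
# Minimal sketch: attribute-driven server-cluster selection with load
# balancing. WEIGHTS stands in for a trained BP network's parameters.
import math

WEIGHTS = {                      # attribute -> per-cluster weights (assumed)
    "size_gb":   [0.2, -0.1, 0.0],
    "is_raster": [1.0,  0.1, -0.5],
    "hotness":   [-0.3, 0.8,  0.2],
}

def select_cluster(attrs: dict, load: list) -> int:
    scores = [sum(WEIGHTS[k][c] * v for k, v in attrs.items()) - load[c]
              for c in range(3)]                  # penalize loaded clusters
    exp = [math.exp(s) for s in scores]
    probs = [e / sum(exp) for e in exp]           # softmax output layer
    return max(range(3), key=probs.__getitem__)   # store on best cluster

print(select_cluster({"size_gb": 4.0, "is_raster": 1.0, "hotness": 0.2},
                     load=[0.9, 0.1, 0.5]))
```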

  14. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00066086; The ATLAS collaboration; Caballero, Jose; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, the Edison Cray XC30 supercomputer, backfill at Tier-2 and Tier-3 sites, opportunistic resources at the Open Science Grid (OSG), and the ATLAS High Level Trigger farm between data-taking periods. Because of specific aspects of opportunistic resources, such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  15. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, the Edison Cray XC30 supercomputer, backfill at Tier-2 and Tier-3 sites, opportunistic resources at the Open Science Grid, and the ATLAS High Level Trigger farm between data-taking periods. Because of specific aspects of opportunistic resources, such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  16. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Science.gov (United States)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods that support data analysis, and 3) the progress so far in harmonising the underlying data collections for future interdisciplinary research across these large-volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale, high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially

  17. CertiCloud and JShadObf. Towards Integrity and Software Protection in Cloud Computing Platforms

    OpenAIRE

    Bertholon, Benoit

    2013-01-01

    A simple concept that has emerged from the notion of heterogeneous distributed computing is that of Cloud Computing (CC), where customers do not own any part of the infrastructure; they simply use the available services and pay for what they use. This approach is often viewed as the next ICT revolution, similar to the birth of the Web or of e-commerce. Indeed, since its advent in the middle of the 2000s, the CC paradigm has aroused enthusiasm and interest from industry and the private sector...

  18. A Framework for Collaborative and Convenient Learning on Cloud Computing Platforms

    Science.gov (United States)

    Sharma, Deepika; Kumar, Vikas

    2017-01-01

    Learning gains depth through collaborative work that brings greater engagement and enjoyment. Technology can enhance collaboration with a higher level of convenience, and cloud computing can facilitate this in a cost-effective and scalable manner. However, to deploy a successful online learning environment, elementary components of learning pedagogy must be…

  19. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    Science.gov (United States)

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  20. Supporting Multi-agent Coordination and Computational Collective Intelligence in Enterprise 2.0 Platform

    Directory of Open Access Journals (Sweden)

    Seddik Reguieg

    2017-12-01

    Full Text Available In this paper, we propose a novel approach utilizing a professional social network (Pro Social Network) and a new coordination protocol (CordiNet). Our motivation is to convince small and medium enterprise managers that modern organizations have chosen Enterprise 2.0 tools because the latter have demonstrated remarkable innovation as well as successful collaboration and collective intelligence. The particularity of our work is that it allows employers to share diagnosis and fault-repair procedures on the basis of modeling agents. Each enterprise is represented by a container of agents, ensuring secure and confidential information exchange among its own employees, while a central main container connects all the enterprises' containers for social information exchange. An enterprise's container consists of a Checker Enterprise Agent (ChEA), a Coordinator Enterprise Agent (CoEA) and a Search Enterprise Agent (SeEA), whereas the central main container comprises its own agents, such as a Selection Agent (SA) and a Supervisor Agent (SuA). The JADE platform is used to allow agents to communicate and collaborate, and the FIPA-ACL performatives have been extended for this purpose. We conduct some experiments to demonstrate the feasibility of our approach.
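
    A plain-Python caricature of this container topology follows; the actual system runs on JADE in Java, and the performative and container names below are invented, not CordiNet's.

```python
# Sketch: enterprise containers exchanging FIPA-ACL-style messages through a
# central main container. Names and performatives are illustrative only.
from dataclasses import dataclass

@dataclass
class AclMessage:
    performative: str      # e.g. "CALL-FOR-REPAIR" (an assumed extension)
    sender: str
    receiver: str
    content: str

class Container:
    def __init__(self, name):
        self.name, self.inbox = name, []

class MainContainer:
    """Central container: connects enterprise containers for social exchange."""
    def __init__(self):
        self.containers = {}

    def register(self, container):
        self.containers[container.name] = container

    def route(self, msg: AclMessage):
        self.containers[msg.receiver].inbox.append(msg)

hub = MainContainer()
for name in ("EnterpriseA", "EnterpriseB"):
    hub.register(Container(name))
hub.route(AclMessage("CALL-FOR-REPAIR", "EnterpriseA", "EnterpriseB",
                     "fault code F42: can you share a repair procedure?"))
print(hub.containers["EnterpriseB"].inbox[0])
```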

  1. A computational platform for MALDI-TOF mass spectrometry data: application to serum and plasma samples.

    Science.gov (United States)

    Mantini, Dante; Petrucci, Francesca; Pieragostino, Damiana; Del Boccio, Piero; Sacchetta, Paolo; Candiano, Giovanni; Ghiggeri, Gian Marco; Lugaresi, Alessandra; Federici, Giorgio; Di Ilio, Carmine; Urbani, Andrea

    2010-01-03

    Mass spectrometry (MS) is becoming the gold standard for biomarker discovery. Several MS-based bioinformatics methods have been proposed for this application, but the divergence of the findings by different research groups on the same MS data suggests that the definition of a reliable method has not been achieved yet. In this work, we propose an integrated software platform, MASCAP, intended for comparative biomarker detection from MALDI-TOF MS data. MASCAP integrates denoising and feature extraction algorithms, which have already shown to provide consistent peaks across mass spectra; furthermore, it relies on statistical analysis and graphical tools to compare the results between groups. The effectiveness in mass spectrum processing is demonstrated using MALDI-TOF data, as well as SELDI-TOF data. The usefulness in detecting potential protein biomarkers is shown comparing MALDI-TOF mass spectra collected from serum and plasma samples belonging to the same clinical population. The analysis approach implemented in MASCAP may simplify biomarker detection, by assisting the recognition of proteomic expression signatures of the disease. A MATLAB implementation of the software and the data used for its validation are available at http://www.unich.it/proteomica/bioinf. (c) 2009 Elsevier B.V. All rights reserved.
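
    For flavor, here is a hedged sketch of the denoise-then-detect stage such pipelines typically apply to spectra, using a moving-average filter and SciPy's find_peaks; the window size, baseline estimate, and prominence threshold are assumed example values, not MASCAP's algorithms.

```python
# Toy MALDI-TOF peak detection: smooth, subtract a crude baseline, pick peaks.
import numpy as np
from scipy.signal import find_peaks

def detect_peaks(mz: np.ndarray, intensity: np.ndarray, window: int = 11):
    kernel = np.ones(window) / window
    smooth = np.convolve(intensity, kernel, mode="same")   # moving-average denoise
    baseline = np.percentile(smooth, 25)                   # crude baseline estimate
    idx, _ = find_peaks(smooth - baseline, prominence=smooth.std())
    return mz[idx], smooth[idx]

# Synthetic spectrum: one Gaussian peak near m/z 1042 plus noise.
mz = np.linspace(1000, 1100, 2000)
spectrum = (np.exp(-(mz - 1042) ** 2 / 0.5) * 50
            + np.random.default_rng(0).normal(0, 0.5, mz.size))
peaks_mz, peaks_h = detect_peaks(mz, spectrum)
print(peaks_mz)
```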

  2. Cell illustrator 4.0: a computational platform for systems biology.

    Science.gov (United States)

    Nagasaki, Masao; Saito, Ayumu; Jeong, Euna; Li, Chen; Kojima, Kaname; Ikeda, Emi; Miyano, Satoru

    2011-01-01

    Cell Illustrator is a software platform for Systems Biology that uses the concept of the Petri net for modeling and simulating biopathways. It is intended for biological scientists working at the bench. The latest version, Cell Illustrator 4.0, uses Java Web Start technology and is enhanced with new capabilities, including: automatic graph grid layout algorithms using ontology information; tools using Cell System Markup Language (CSML) 3.0 and Cell System Ontology 3.0; a parameter search module; a high-performance simulation module; a CSML database management system; conversion from CSML models to programming languages (FORTRAN, C, C++, Java, Python and Perl); import from SBML, CellML, and BioPAX; and export to SVG and HTML. Cell Illustrator employs an extension of the hybrid Petri net in an object-oriented style, so that biopathway models can include objects such as DNA sequence, molecular density, 3D localization information, transcription with frame-shift, translation with codon table, as well as biochemical reactions.
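
    To make the Petri-net idea concrete, here is a toy discrete Petri net in Python: places hold token counts (molecule amounts) and a transition fires when its input places are sufficiently marked. Cell Illustrator's hybrid, object-oriented nets are far richer; the reaction names below are invented.

```python
# Minimal discrete Petri net in the spirit of biopathway modeling.
class PetriNet:
    def __init__(self, marking):
        self.marking = dict(marking)             # place -> token count
        self.transitions = []                    # (name, consume, produce)

    def add_transition(self, name, consume, produce):
        self.transitions.append((name, consume, produce))

    def step(self):
        fired = []
        for name, consume, produce in self.transitions:
            if all(self.marking[p] >= n for p, n in consume.items()):
                for p, n in consume.items():     # remove input tokens
                    self.marking[p] -= n
                for p, n in produce.items():     # add output tokens
                    self.marking[p] = self.marking.get(p, 0) + n
                fired.append(name)
        return fired

net = PetriNet({"gene": 1, "mRNA": 0, "protein": 0})
net.add_transition("transcription", {"gene": 1}, {"gene": 1, "mRNA": 1})
net.add_transition("translation", {"mRNA": 1}, {"mRNA": 1, "protein": 1})
for _ in range(3):
    print(net.step(), net.marking)
```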

  3. The Implementation of Computer Platform for Foundries Cooperating in a Supply Chain

    Directory of Open Access Journals (Sweden)

    Wilk-Kołodziejczyk D.

    2014-08-01

    Full Text Available This article presents a practical solution in the form of an implementation of an agent-based platform for the management of contracts in a network of foundries. The described implementation is a continuation of earlier scientific work in the field of design and theoretical system specification for cooperating companies [1]. The implementation addresses the key design assumptions - the system is implemented using multi-agent technology, which offers the possibility of decentralised, distributed processing of the specified contracts and tenders. The implemented system enables the joint management of orders for a network of small and medium-sized metallurgical plants, while providing them with greater competitiveness and the ability to carry out large procurements. The article presents the functional aspects of the system - the user interface and the principles of operation of the individual agents that represent businesses seeking potential suppliers or recipients of services and products. Additionally, the system is equipped with a bi-directional agent for translating standards based on ontologies, which aims to automate the decision-making process during tender specification and response to requests.

  4. Positioning graphical objects on computer screens: a three-phase model.

    Science.gov (United States)

    Pastel, Robert

    2011-02-01

    This experiment identifies and models phases in the positioning of graphical objects (called cursors in this article) on computer displays. The human-computer interaction community has traditionally used Fitts' law to model selection in graphical user interfaces, whereas human factors experiments have found the single-component Fitts' law inadequate to model the positioning of real objects. Participants (N=145) repeatedly positioned variably sized square cursors within variably sized rectangular targets using computer mice. The times for the cursor to just touch the target, for the cursor to enter the target, and for participants to indicate positioning completion were observed. The positioning tolerances were varied from very precise and difficult to imprecise and easy. The time for the cursor to touch the target was proportional to the initial cursor-target distance. The time for the cursor to completely enter the target after touching was proportional to the logarithm of the cursor size divided by the target tolerance. The time for participants to indicate positioning after entering was inversely proportional to the tolerance. A three-phase model defined by regions (distant, proximate, and inside the target) was proposed and could model the positioning tasks. The three-phase model provides a framework for ergonomists to evaluate new positioning techniques and can explain their deficiencies. The model provides a means to analyze tasks and enhance interaction during positioning.
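
    The three proportionalities can be written as a simple additive time model. The sketch below does so with invented coefficients (the paper reports proportionalities, not these constants), and compares a precise task against an easy one.

```python
# A numerical reading of the three-phase positioning model; all a*/b*
# coefficients are assumed for illustration.
import math

def positioning_time(distance, cursor, tolerance,
                     a1=0.1, b1=0.004,      # distant phase: ~ distance
                     a2=0.2, b2=0.3,        # proximate phase: ~ log(cursor/tolerance)
                     a3=0.1, b3=0.5):       # inside phase: ~ 1/tolerance
    t_touch  = a1 + b1 * distance                          # cursor reaches target
    t_enter  = a2 + b2 * math.log2(cursor / tolerance + 1) # cursor fully enters
    t_verify = a3 + b3 / tolerance                         # user confirms placement
    return t_touch + t_enter + t_verify

# Precise task (tolerance 2 px) vs. easy task (tolerance 20 px):
print(positioning_time(300, cursor=40, tolerance=2))
print(positioning_time(300, cursor=40, tolerance=20))
```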

  5. DCA++: A case for science driven application development for leadership computing platforms

    Energy Technology Data Exchange (ETDEWEB)

    Summers, Michael S; Alvarez, Gonzalo; Meredith, Jeremy; Maier, Thomas A [Computer Science and Mathematics Division, Oak Ridge National Laboratory, P. O. Box 2008, Mail Stop 6164, Oak Ridge, TN 37831 (United States); Schulthess, Thomas C, E-mail: schulthess@cscs.c [Swiss National Supercomputer Center and Institute for Theoretical Physics, ETH Zurich, CSCS MAN E 133, Galeria 2, CH-9628 Manno (Switzerland)

    2009-07-01

    The DCA++ code was one of the early science applications that ran on Jaguar at the National Center for Computational Sciences, and the first application code to sustain a petaflop/s under production conditions on a general-purpose supercomputer. The code implements a quantum cluster method with a Quantum Monte Carlo kernel to solve the 2D Hubbard model for high-temperature superconductivity. It is implemented in C++, making heavy use of the generic programming model. In this paper, we discuss how this code was developed, reaching scalability and high efficiency on the world's fastest supercomputer in only a few years. We show how the use of generic concepts combined with systematic refactoring of codes is a better strategy for computational sciences than a comprehensive upfront design.

  6. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    International Nuclear Information System (INIS)

    Broekema, P.C.; Nieuwpoort, R.V. van; Bal, H.E.

    2015-01-01

    The Square Kilometre Array is a next-generation radio telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload.

  7. DCA++: A case for science driven application development for leadership computing platforms

    International Nuclear Information System (INIS)

    Summers, Michael S; Alvarez, Gonzalo; Meredith, Jeremy; Maier, Thomas A; Schulthess, Thomas C

    2009-01-01

    The DCA++ code was one of the early science applications that ran on Jaguar at the National Center for Computational Sciences, and the first application code to sustain a petaflop/s under production conditions on a general-purpose supercomputer. The code implements a quantum cluster method with a Quantum Monte Carlo kernel to solve the 2D Hubbard model for high-temperature superconductivity. It is implemented in C++, making heavy use of the generic programming model. In this paper, we discuss how this code was developed, reaching scalability and high efficiency on the world's fastest supercomputer in only a few years. We show how the use of generic concepts combined with systematic refactoring of codes is a better strategy for computational sciences than a comprehensive upfront design.

  8. Reconfigurable Computing Platforms and Target System Architectures for Automatic HW/SW Compilation

    OpenAIRE

    Lange, Holger

    2011-01-01

    Embedded systems have found their way into all areas of technology and everyday life, from transport systems, facility management, and health care to hand-held computers and cell phones, as well as television sets and electric cookers. Modern fabrication techniques enable the integration of such complex, sophisticated systems on a single chip (System-on-Chip, SoC). In many cases, high processing power is required within predetermined, often limited energy budgets. To adjust the processing power even more...

  9. Embedded Platforms for Computer Vision-based Advanced Driver Assistance Systems: a Survey

    OpenAIRE

    Velez, Gorka; Otaegui, Oihana

    2015-01-01

    Computer Vision, either alone or combined with other technologies such as radar or Lidar, is one of the key technologies used in Advanced Driver Assistance Systems (ADAS). Its role in understanding and analysing the driving scene is of great importance, as can be seen from the number of ADAS applications that use this technology. However, porting a vision algorithm to an embedded automotive system is still very challenging, as a trade-off must be found between several design requisites. Further...

  10. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  11. Development of Student Information Management System based on Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    Ibrahim A. ALAMERI

    2017-10-01

    Full Text Available The management and provision of information about the educational process is an essential part of the effective administration of institutes of higher education. In this paper, the requirements of a reliable student management system are analyzed, a use-case model of the student information management system is formed, and the architecture of the application is designed and implemented. Modern approaches were used to develop and deploy a reliable online application specifically for cloud computing environments.

  12. Mental vision: a computer graphics platform for virtual reality, science and education

    OpenAIRE

    Peternier, Achille

    2009-01-01

    Despite the wide range of computer graphics frameworks and solutions available for virtual reality, it is still difficult to find one that simultaneously fits the many constraints of research and educational contexts. Advanced functionality and user-friendliness, rendering speed and portability, or scalability and image quality are opposing characteristics rarely found in a single approach. Furthermore, the use of virtual-reality-specific devices like CAVEs or wearable systems i...

  13. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.; Schneider, J.; Hansen, A.; Lee, M.; Turney, S. G.; Faulkner-Jones, B. E.; Hecht, J. L.; Najarian, R.; Yee, E.; Lichtman, J. W.; Pfister, H.

    2013-01-01

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.

  14. Affordable mobile robotic platforms for teaching computer science at African universities

    OpenAIRE

    Gyebi, Ernest; Hanheide, Marc; Cielniak, Grzegorz

    2015-01-01

    Educational robotics can play a key role in addressing some of the challenges faced by higher education in Africa. One of the major obstacles preventing a wider adoption of initiatives involving educational robotics in this part of the world is the lack of robots that are affordable for African institutions. In this paper, we present a survey and analysis of currently available affordable mobile robots and their suitability for teaching computer science at African universities. To this end, w...

  15. BUILDING A COMPLETE FREE AND OPEN SOURCE GIS INFRASTRUCTURE FOR HYDROLOGICAL COMPUTING AND DATA PUBLICATION USING GIS.LAB AND GISQUICK PLATFORMS

    Directory of Open Access Journals (Sweden)

    M. Landa

    2017-07-01

    Full Text Available Building a complete free and open source GIS computing and data publication platform can be a relatively easy task. This paper describes an automated deployment of such a platform using two open source software projects – GIS.lab and Gisquick. GIS.lab (http://web.gislab.io) is a project for rapid deployment of a complete, centrally managed and horizontally scalable GIS infrastructure in the local area network, data center or cloud. It provides a comprehensive set of free geospatial software seamlessly integrated into one easy-to-use system. A platform for GIS computing (in our case demonstrated on hydrological data processing) requires core components such as a geoprocessing server, a map server, and a computation engine, e.g. GRASS GIS, SAGA, or other similar GIS software. All these components can be rapidly and automatically deployed by the GIS.lab platform. In our demonstrated solution, PyWPS is used for serving WPS processes built on top of the GRASS GIS computation platform. GIS.lab can be easily extended by other components running in Docker containers, an approach demonstrated here by the seamless integration of Gisquick. Gisquick (http://gisquick.org) is an open source platform for publishing geospatial data in the sense of rapid sharing of QGIS projects on the web. The platform consists of a QGIS plugin, a Django-based server application, QGIS server, and web/mobile clients. This paper shows how to easily deploy a complete open source GIS infrastructure supporting all required operations, from data preparation on the desktop, through data sharing, to geospatial computation as a service. It also covers data publication, both in the sense of OGC Web Services and, importantly, as interactive web mapping applications.
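
    For flavor, a minimal WPS process in the PyWPS 4 style mentioned above; the class layout follows the public PyWPS documentation as we understand it, while the process identifier and the toy computation are our own invention. Deployment would wrap such processes in a pywps Service behind a WSGI server.

```python
# Sketch of a PyWPS 4 process (API assumed from PyWPS docs; toy hydrology).
from pywps import Process, LiteralInput, LiteralOutput

class RunoffIndex(Process):
    def __init__(self):
        inputs = [LiteralInput('rainfall_mm', 'Rainfall (mm)', data_type='float'),
                  LiteralInput('coefficient', 'Runoff coefficient', data_type='float')]
        outputs = [LiteralOutput('runoff_mm', 'Estimated runoff (mm)',
                                 data_type='float')]
        super(RunoffIndex, self).__init__(
            self._handler,
            identifier='runoff_index',
            title='Toy rational-method runoff estimate',
            inputs=inputs,
            outputs=outputs)

    def _handler(self, request, response):
        rain = request.inputs['rainfall_mm'][0].data
        coef = request.inputs['coefficient'][0].data
        response.outputs['runoff_mm'].data = coef * rain   # Q = C * P
        return response
```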

  16. Control and management unit for a computation platform at the PANDA experiment

    Energy Technology Data Exchange (ETDEWEB)

    Galuska, Martin; Gessler, Thomas; Kuehn, Wolfgang; Lang, Johannes; Lange, Jens Soeren; Liang, Yutie; Liu, Ming; Spruck, Bjoern; Wang, Qiang [II. Physikalisches Institut, Justus-Liebig-Universitaet Giessen (Germany)

    2010-07-01

    The FAIR facility will provide high-intensity antiproton and heavy-ion beams for the PANDA and HADES experiments, leading to very high reaction rates. PANDA is expected to run at 10-20 MHz with a raw data output rate of up to 200 GB/s. A sophisticated data acquisition system is needed in order to select physically relevant events online. For this purpose a network of interconnected compute nodes can be used. Each compute node can be programmed to run various algorithms, such as online particle track recognition for high-level triggering. An ATCA communication shelf provides power, cooling and high-speed interconnections to up to 14 nodes. A single shelf manager supervises and regulates the power distribution and temperature inside the shelf. The shelf manager relies on a local control chip on each node to relay sensor read-outs, provide hardware addresses and power requirements, etc. An IPM controller based on an Atmel microcontroller was designed for this purpose, and a prototype was produced. The necessary software is being developed to allow local communication with the components of the compute node and remote communication with the shelf manager, conforming to the ATCA specification.

  17. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processors, on practically every system), has long been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message-passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with increasing core counts per shared-memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view of improving PTP: we use a set of scientific applications, each with a variety of challenges, both to drive improvements to the applications themselves and to understand shortcomings in Eclipse PTP from an application developer's perspective, building the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher-quality performance tool integration, and with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  18. LWAs computational platform for e-consultation using mobile devices: cases from developing nations.

    Science.gov (United States)

    Olajubu, Emmanuel Ajayi; Odukoya, Oluwatoyin Helen; Akinboro, Solomon Adegbenro

    2014-01-01

    Mobile devices have been improving standards of living in developing nations by providing timely and accurate information anywhere and anytime through wireless media. A shortage of experts in medical fields is evident throughout the world but is more pronounced in developing nations. This study therefore proposes a telemedicine platform for the vulnerable areas of developing nations: the interior regions with little or no medical facilities, whose inhabitants are highly susceptible to sickness and disease. The framework uses mobile devices that can run LightWeight Agents (LWAs) to send consultation requests from the vulnerable interior to a remote medical expert in an urban city; the feedback is conveyed to the requester through the same medium. The system architecture, which comprises AgenRoller, the LWAs, the front-end (mobile devices) and the back-end (the medical server), is presented, along with the algorithm for the software component of the architecture (AgenRoller). The system is modeled as an M/M/1/c queuing system and simulated using SimEvents from the MATLAB Simulink environment. The simulation results show the average queue length, the number of entities in the queue, and the number of entities departing from the system, which together indicate the rate of information processing in the system. A full-scale development of this system, properly implemented, will help extend the few medical facilities available in the urban cities of developing nations to the interior, thereby reducing the number of casualties in the vulnerable areas of the developing world, especially in Sub-Saharan Africa.
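
    The M/M/1/c queueing model can also be reproduced outside SimEvents. Below is a toy discrete-event simulation of such a system in Python; the arrival rate, service rate, and capacity are assumed example parameters (for these values, roughly 10% of requests should be blocked).

```python
# Toy M/M/1/c simulation: Poisson arrivals, exponential service, one server,
# at most c requests in the system; excess arrivals are lost.
import random

def mm1c(lam=1.0, mu=1.2, c=5, n_arrivals=10000, seed=1):
    rng = random.Random(seed)
    t = 0.0
    departures = []            # scheduled departure times of jobs in the system
    served = blocked = 0
    for _ in range(n_arrivals):
        t += rng.expovariate(lam)                    # next request arrives
        departures = [d for d in departures if d > t]  # purge finished jobs
        if len(departures) >= c:                     # system full: request lost
            blocked += 1
            continue
        start = max(t, departures[-1] if departures else t)  # FIFO single server
        departures.append(start + rng.expovariate(mu))
        served += 1
    return served, blocked

served, blocked = mm1c()
print(f"served={served} blocked={blocked} loss={blocked/(served+blocked):.3f}")
```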

  19. An FDTD-based computer simulation platform for shock wave propagation in electrohydraulic lithotripsy.

    Science.gov (United States)

    Yılmaz, Bülent; Çiftçi, Emre

    2013-06-01

    Extracorporeal Shock Wave Lithotripsy (ESWL) is based on disintegration of the kidney stone by delivering high-energy shock waves that are created outside the body and transmitted through the skin and body tissues. Nowadays, high-energy shock waves are also used in orthopedic operations and are being investigated for use in the treatment of myocardial infarction and cancer. Because of these new application areas, novel lithotriptor designs are needed for different kinds of treatment strategies. In this study, our aim was to develop a versatile computer simulation environment which would give device designers working on the various medical applications that use the shock wave principle a substantial amount of flexibility when testing the effects of new parameters such as reflector size, material properties of the medium, water temperature, and different clinical scenarios. For this purpose, we created a finite-difference time-domain (FDTD)-based computational model in which most of the physical system parameters were defined as inputs and/or as variables in the simulations. We constructed a realistic computational model of a commercial electrohydraulic lithotriptor and optimized our simulation program using results that were obtained by the manufacturer in an experimental setup. We then compared the simulation results with the results from an experimental setup in which the oxygen level in water was varied. Finally, we studied the effects of changing input parameters such as ellipsoid size and material, temperature change in the wave propagation media, and shock wave source point misalignment. The simulation results were consistent with the experimental results and with the expected effects of variation in the physical parameters of the system. The results of this study encourage further investigation and provide adequate evidence that numerical modeling of a shock wave therapy system is feasible and can provide a practical means to test novel ideas in new device design procedures.
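
    As a flavor of the FDTD machinery underlying such a platform, here is a minimal 1-D acoustic staggered-grid kernel in Python. The paper's simulator is far more elaborate (reflector geometry, temperature-dependent media, nonlinearity), and all grid and pulse parameters below are assumed.

```python
# 1-D acoustic FDTD sketch: pressure p and particle velocity v on a
# staggered grid, leapfrogged in time, with a Gaussian source pulse.
import numpy as np

c0, dx = 1500.0, 1e-4                 # speed of sound in water (m/s), cell size (m)
dt = 0.5 * dx / c0                    # CFL-stable time step
nx, nt = 400, 600
p = np.zeros(nx)                      # pressure field
v = np.zeros(nx + 1)                  # staggered particle-velocity field
rho = 1000.0                          # water density (kg/m^3)

for n in range(nt):
    v[1:-1] -= dt / (rho * dx) * (p[1:] - p[:-1])        # update velocity
    p       -= dt * rho * c0**2 / dx * (v[1:] - v[:-1])  # update pressure
    p[50]   += np.exp(-((n * dt - 5e-7) / 2e-7) ** 2)    # injected source pulse

print(f"peak pressure after {nt} steps: {p.max():.3f}")
```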

  20. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programming model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns make it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels is beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
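
    A stripped-down sketch of a latency-bandwidth (alpha-beta) communication model of this kind is shown below, summing per-level message costs over an octree. The per-level message counts here are a simplified stand-in, not the paper's topology-aware expressions.

```python
# Toy alpha-beta communication model for an FMM octree:
# T = sum over levels of m_level * (alpha + beta * message_size).
def fmm_comm_time(levels, procs, alpha=1e-6, beta=1e-10, msg_bytes=8 * 4096):
    total = 0.0
    for lvl in range(2, levels + 1):
        boxes = 8 ** lvl                         # octree boxes at this level
        neighbors = min(26, procs - 1)           # halo-exchange partners (assumed)
        msgs = neighbors * max(boxes // procs, 1)
        total += msgs * (alpha + beta * msg_bytes)
    return total

print(f"predicted comm time: {fmm_comm_time(levels=6, procs=4096):.4f} s")
```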

  1. Gyrokinetic particle-in-cell simulations of plasma microturbulence on advanced computing platforms

    International Nuclear Information System (INIS)

    Ethier, S; Tang, W M; Lin, Z

    2005-01-01

    Since its introduction in the early 1980s, the gyrokinetic particle-in-cell (PIC) method has been very successfully applied to the exploration of many important kinetic stability issues in magnetically confined plasmas. Its self-consistent treatment of charged particles and the associated electromagnetic fluctuations makes this method appropriate for studying enhanced transport driven by plasma turbulence. Advances in algorithms and computer hardware have led to the development of a parallel, global, gyrokinetic code in full toroidal geometry, the gyrokinetic toroidal code (GTC), developed at the Princeton Plasma Physics Laboratory. It has proven to be an invaluable tool to study key effects of low-frequency microturbulence in fusion plasmas. As a high-performance computing application code, its flexible mixed-model parallel algorithm has allowed GTC to scale to over a thousand processors, which are routinely used for simulations. Improvements are continuously being made. As the US ramps up its support for the International Thermonuclear Experimental Reactor (ITER), the need for understanding the impact of turbulent transport in burning plasma fusion devices is of utmost importance. Accordingly, the GTC code is at the forefront of the set of numerical tools being used to assess and predict the performance of ITER on critical issues such as the efficiency of energy confinement in reactors.

  2. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    International Nuclear Information System (INIS)

    Rampino, S.; Monari, A.; Rossi, E.; Evangelisti, S.; Laganà, A.

    2012-01-01

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS assembled on the European Grid allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: the grid-based GEMS simulator accurately models small chemical systems; the Q5Cost and D5Cost file formats provide interoperability in the workflow; benchmark runs on H + H2 highlight the Grid empowering; calculated k(T)'s for O + O2 and N + N2 fall within the error bars of the experiment. Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid-based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H2, N + N2 and O + O2 gas phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.

  3. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints.

    Science.gov (United States)

    Sako, Shunji; Sugiura, Hiromichi; Tanoue, Hironori; Kojima, Makoto; Kono, Mitsunobu; Inaba, Ryoichi

    2014-08-01

    This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale, VAS). Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants performed the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when operating a computer.

  4. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale – VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants performed the task in the DP than in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  5. Work flow management systems applied in nuclear power plants management system to a new computer platform

    International Nuclear Information System (INIS)

    Rodriguez Lorite, M.; Martin Lopez-Suevos, C.

    1996-01-01

    Activities performed in most companies are based on the flow of information between their different departments and personnel. Most of this information is on paper (delivery notes, invoices, reports, etc.); the percentage transmitted electronically (electronic transactions, spreadsheets, word processor files, etc.) is usually low. Work flow management systems aim to control and speed up this flow. This article presents a prototype that applies work flow management systems to a specific area: the basic life cycle of a purchase order in a nuclear power plant, which requires the involvement of various computer applications: purchase order management, warehouse management, accounting, etc. Once implemented, work flow management systems allow optimisation of the execution of the different tasks included in the managed life cycles and provide parameters to control work cycles if necessary, allowing their temporary or definitive modification. (Author)
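
    A purchase-order life cycle like the one described is naturally expressed as a small state machine. A sketch follows, with states and events assumed for illustration, not taken from the plant's actual procedure.

```python
# Toy purchase-order life cycle as a (state, event) -> state table.
TRANSITIONS = {
    ("draft",    "submit"):  "ordered",
    ("ordered",  "deliver"): "received",    # warehouse management step
    ("received", "invoice"): "invoiced",    # accounting step
    ("invoiced", "pay"):     "closed",
}

def advance(state: str, event: str) -> str:
    try:
        return TRANSITIONS[(state, event)]
    except KeyError:
        raise ValueError(f"event '{event}' not allowed in state '{state}'")

state = "draft"
for event in ("submit", "deliver", "invoice", "pay"):
    state = advance(state, event)
    print(event, "->", state)
```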

  6. Gait Analysis Using Computer Vision Based on Cloud Platform and Mobile Device

    Directory of Open Access Journals (Sweden)

    Mario Nieto-Hidalgo

    2018-01-01

    Full Text Available Frailty and senility are syndromes that affect elderly people. The ageing process involves a decline of cognitive and motor functions which often affects the quality of life of elderly people. Some studies have linked this deterioration of cognitive and motor function to gait patterns; gait analysis can thus be a powerful tool to assess frailty and senility syndromes. In this paper, we propose a vision-based gait analysis approach performed on a smartphone with cloud computing assistance. Gait sequences recorded by a smartphone camera are processed by the smartphone itself to obtain spatiotemporal features. These features are uploaded onto the cloud, where they are analysed and compared to a stored database to render a diagnosis. The feature extraction method presented works with both frontal and sagittal gait sequences, although the sagittal view provides better classification, with an accuracy of 95%.
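
    The on-device spatiotemporal-feature stage can be illustrated with a few lines of Python: given heel-strike timestamps (which the vision pipeline would extract from video; the values below are invented), stride time and cadence follow directly.

```python
# Sketch: stride time and cadence from heel-strike timestamps of one foot.
def gait_features(heel_strikes_s):
    strides = [b - a for a, b in zip(heel_strikes_s, heel_strikes_s[1:])]
    mean_stride = sum(strides) / len(strides)     # seconds per stride
    cadence = 60.0 / mean_stride                  # strides per minute
    return mean_stride, cadence

strikes = [0.00, 1.08, 2.15, 3.25, 4.31]          # right-heel strikes (seconds)
stride, cadence = gait_features(strikes)
print(f"stride {stride:.2f} s, cadence {cadence:.1f} strides/min")
```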

  7. Parallel Computation of RCS of Electrically Large Platform with Coatings Modeled with NURBS Surfaces

    Directory of Open Access Journals (Sweden)

    Ying Yan

    2012-01-01

    Full Text Available The significance of the Radar Cross Section (RCS) in military applications makes its prediction an important problem. This paper uses large-scale parallel Physical Optics (PO) to realize the fast computation of RCS for electrically large targets, which are modeled by Non-Uniform Rational B-Spline (NURBS) surfaces and coated with dielectric materials. Some numerical examples are presented to validate the method. In addition, 1024 CPUs were used at the Shanghai Supercomputer Center (SSC) to simulate a model with a maximum electrical size of 1966.7 λ, for the first time in China. The results show that the method greatly speeds up the calculation and is capable of solving real-life RCS prediction problems.

  8. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform.

    Science.gov (United States)

    Giulioni, Massimiliano; Lagorce, Xavier; Galluppi, Francesco; Benosman, Ryad B

    2016-01-01

    Estimating the speed and direction of moving objects is a crucial component of agents behaving in a dynamic world. Biological organisms perform this task by means of the neural connections originating from their retinal ganglion cells. In artificial systems the optic flow is usually extracted by comparing the activity of two or more frames captured with a vision sensor. Designing artificial motion flow detectors that are as fast, robust, and efficient as the ones found in biological systems is, however, a challenging task. Inspired by the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina, we introduce an architecture for robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission, and the extraction of the optical flow is based on the time lag in the activation of nearby retinal neurons. Mimicking ganglion cells, our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. Here we describe the architectural aspects, discuss the latency, scalability, and robustness properties, and demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene. This work shows how the precise time of spike emission used as a computational basis, biological inspiration, and neuromorphic systems can be used together for solving specific tasks.
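
    The time-lag principle can be stated in a few lines: direction follows the sign of the lag between events at two neighboring receptors, and speed follows its inverse. A software caricature (receptor spacing and timestamps invented):

```python
# Barlow-Levick-style readout: direction and speed from the time lag
# between event timestamps at two adjacent "photoreceptors".
def direction_and_speed(t_left, t_right, pixel_pitch=1.0):
    """t_left/t_right: event timestamps (s); pixel_pitch: receptor spacing."""
    lag = t_right - t_left
    if lag == 0:
        return None, float("inf")         # simultaneous: no preferred direction
    direction = "rightward" if lag > 0 else "leftward"
    speed = pixel_pitch / abs(lag)        # pixels per second
    return direction, speed

# An edge sweeping right crosses the left receptor 4 ms before the right one:
print(direction_and_speed(t_left=0.100, t_right=0.104))
```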

  9. Farm Management Support on Cloud Computing Platform: A System for Cropland Monitoring Using Multi-Source Remotely Sensed Data

    Science.gov (United States)

    Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.

    2015-12-01

    Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people don't have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform, which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high performance computing and storing capabilities of GEE, a cloud-computing based system for near real-time crop land monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al.[1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. The seasonality of MODIS time series data, for example, the start date of the growing season, length of growing season, and NDVI peak at a field-level are obtained for evaluating the crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information of near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at crop-field level online. This unique utilization of GEE in
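
    For flavor, a hedged sketch of this workflow in the Earth Engine Python API: sampling a MODIS NDVI collection over a field geometry to obtain a seasonal profile. The dataset ID, dates, and location are example values, and the API calls follow the public Earth Engine documentation as we understand them, not this system's actual code.

```python
# Sketch: per-field MODIS NDVI time series with the Earth Engine Python API.
import ee
ee.Initialize()                                  # requires prior authentication

field = ee.Geometry.Point([-112.5, 49.7])        # a notional crop-field location
ndvi = (ee.ImageCollection('MODIS/006/MOD13Q1')  # 16-day, 250 m MODIS NDVI (assumed ID)
        .filterDate('2015-04-01', '2015-10-31')
        .select('NDVI'))

def sample(img):
    # Mean NDVI over the field for one composite date.
    val = img.reduceRegion(ee.Reducer.mean(), field, 250).get('NDVI')
    return ee.Feature(None, {'date': img.date().format('YYYY-MM-dd'),
                             'ndvi': val})

profile = ee.FeatureCollection(ndvi.map(sample)).getInfo()
for f in profile['features']:
    print(f['properties'])                        # the seasonal NDVI profile
```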

  10. An experimental platform for triaxial high-pressure/high-temperature testing of rocks using computed tomography

    Science.gov (United States)

    Glatz, Guenther; Lapene, Alexandre; Castanier, Louis M.; Kovscek, Anthony R.

    2018-04-01

    A conventional high-pressure/high-temperature experimental apparatus for combined geomechanical and flow-through testing of rocks is not X-ray compatible. Additionally, current X-ray transparent systems for computed tomography (CT) of cm-sized samples are limited to design temperatures below 180 °C. We describe a novel, high-temperature (>400 °C), high-pressure (>2000 psi/>13.8 MPa confining, >10 000 psi/>68.9 MPa vertical load) triaxial core holder suitable for X-ray CT scanning. The new triaxial system permits time-lapse imaging to capture the role of effective stress on fluid distribution and porous medium mechanics. System capabilities are demonstrated using ultimate compressive strength (UCS) tests of Castlegate sandstone. In this case, flooding the porous medium with a radio-opaque gas such as krypton before and after the UCS test improves the discrimination of rock features such as fractures. The results of high-temperature tests are also presented. A Uintah Basin sample of immature oil shale is heated from room temperature to 459 °C under uniaxial compression. The sample contains kerogen that pyrolyzes as temperature rises, releasing hydrocarbons. Imaging reveals the formation of stress bands as well as the evolution and connectivity of the fracture network within the sample as a function of time.

  11. A Computational/Experimental Platform for Investigating Three-Dimensional Puzzle Solving of Comminuted Articular Fractures

    Science.gov (United States)

    Thomas, Thaddeus P.; Anderson, Donald D.; Willis, Andrew R.; Liu, Pengcheng; Frank, Matthew C.; Marsh, J. Lawrence; Brown, Thomas D.

    2011-01-01

    Reconstructing highly comminuted articular fractures poses a difficult surgical challenge, akin to solving a complicated three-dimensional (3D) puzzle. Pre-operative planning using CT is critically important, given the desirability of less invasive surgical approaches. The goal of this work is to advance 3D puzzle solving methods toward use as a pre-operative tool for reconstructing these complex fractures. Methodology for generating typical fragmentation/dispersal patterns was developed. Five identical replicas of human distal tibia anatomy were machined from blocks of high-density polyetherurethane foam (a bone fragmentation surrogate) and were fractured using an instrumented drop tower. Pre- and post-fracture geometries were obtained using laser scans and CT. A semi-automatic virtual reconstruction computer program aligned fragment native (non-fracture) surfaces to a pre-fracture template. The tibias were precisely reconstructed, with alignment accuracies ranging from 0.03-0.4 mm. This novel technology has the potential to significantly enhance surgical techniques for reconstructing comminuted intra-articular fractures, as illustrated for a representative clinical case. PMID:20924863
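    The core of such a virtual reconstruction step, aligning a fragment's native surface to the pre-fracture template, can be illustrated with the standard SVD-based (Kabsch) rigid registration, assuming point correspondences are already known; this is a generic sketch, not the program used in the study:

        import numpy as np

        def kabsch_align(P, Q):
            """Rigid (R, t) minimizing ||(P @ R.T + t) - Q|| for paired Nx3 point sets."""
            cP, cQ = P.mean(axis=0), Q.mean(axis=0)
            H = (P - cP).T @ (Q - cQ)               # 3x3 cross-covariance
            U, S, Vt = np.linalg.svd(H)
            d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
            R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
            return R, cQ - R @ cP

        # Toy check: recover a known rotation and translation.
        rng = np.random.default_rng(0)
        P = rng.normal(size=(100, 3))
        a = 0.3
        R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                           [np.sin(a),  np.cos(a), 0.0],
                           [0.0, 0.0, 1.0]])
        Q = P @ R_true.T + np.array([1.0, 2.0, 3.0])
        R, t = kabsch_align(P, Q)
        print(np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1))))  # ~0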

  12. Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform

    Directory of Open Access Journals (Sweden)

    Massimiliano Giulioni

    2016-02-01

    We demonstrate robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission and follows the basic theoretical principles presented in Benosman et al. (2014): the extraction of the optical flow is based on the time lag in the activation of nearby retinal neurons. The same basic principle is embedded in the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina. Mimicking those cells, our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. We built a 3x3 test grid of independent detectors, each observing a different portion of the scene, so that our final output is a spike train encoding a 3x3 optical flow vector field. In this work we focus on the architectural aspects, and we demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene.
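    The time-lag principle the detectors implement can be stated compactly in software: the delay between threshold crossings of two neighboring receptors gives speed, and its sign gives direction. A digital toy model (illustrative only; the actual system is analog spiking hardware):

        import numpy as np

        def crossing_time(signal, t, threshold=0.5):
            """First time the contrast signal exceeds the threshold (None if never)."""
            i = int(np.argmax(signal > threshold))
            return t[i] if signal[i] > threshold else None

        t = np.linspace(0.0, 10.0, 1000)        # ms
        speed, pitch = 2.0, 1.0                 # true speed (px/ms), receptor spacing (px)
        edge = lambda t0: 1.0 / (1.0 + np.exp(-(t - t0) * 20.0))  # moving bright edge
        r1, r2 = edge(3.0), edge(3.0 + pitch / speed)

        dt = crossing_time(r2, t) - crossing_time(r1, t)
        print(f"estimated speed: {pitch / dt:.2f} px/ms; the sign of dt gives direction")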

  13. AAE and AAOMR Joint Position Statement: Use of Cone Beam Computed Tomography in Endodontics 2015 Update.

    Science.gov (United States)

    2015-10-01

    The following statement was prepared by the Special Committee to Revise the Joint American Association of Endodontists/American Academy of Oral and Maxillofacial Radiology Position on Cone Beam Computed Tomography, and approved by the AAE Board of Directors and AAOMR Executive Council in May 2015. AAE members may reprint this position statement for distribution to patients or referring dentists. Copyright © 2015 American Academy of Oral and Maxillofacial Radiology and American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  14. Study on computer-aided control system design platform of 10MW high temperature gas-cooled test reactor

    International Nuclear Information System (INIS)

    Feng Yan; Shi Lei; Sun Yuliang; Luo Shaojie

    2004-01-01

    The 10 MW high temperature gas-cooled test reactor (HTR-10) is the first modular pebble bed reactor built in China, which needs to be researched with respect to engineering design, control study, safety analysis and operator training. An integrated system for simulation, control design and online assistance of the HTR-10 (HTRSIMU) has been developed by the Institute of Nuclear Energy Technology (INET) of Tsinghua University. The HTRSIMU system is based on a high-speed local area network, on which a computer-aided control system design platform (CDP) is developed and combined with the simulating subsystem in order to provide a visualized and convenient tool for HTR-10 control system design. The CDP has a friendly man-machine interface and good expansibility, and integrates eighteen types of control items. These control items are divided into two types: linear and non-linear. The linear control items include Proportion, Integral, Differential, Inertial, Lead-lag, Oscillation, Pure-lag, Common, PID and Fuzzy, while the non-linear control items include Saturation, Subsection, Insensitive, Backlash, Relay, Insensi-Relay, Sluggish-Relay and Insens-Slug. The CDP provides a visualized platform for control system modeling, and the control loop system can be automatically generated and graphically simulated. Users can conveniently design control loops, modify control parameters, study control methods, and analyze control results just by clicking mouse buttons. This kind of control system design method provides a powerful tool and a good reference for actual system operation of the HTR-10. A control scheme is also given and studied to demonstrate the functions of the CDP in this article. (author)
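    As an illustration of the simplest of the listed linear control items, a textbook discrete PID loop follows; this is a generic sketch, not the CDP implementation, and the gains, time step and plant model are arbitrary:

        class PID:
            """Discrete PID: u = Kp*e + Ki*integral(e) + Kd*de/dt."""
            def __init__(self, kp, ki, kd, dt):
                self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
                self.integral, self.prev_error = 0.0, 0.0

            def step(self, setpoint, measured):
                error = setpoint - measured
                self.integral += error * self.dt
                derivative = (error - self.prev_error) / self.dt
                self.prev_error = error
                return self.kp * error + self.ki * self.integral + self.kd * derivative

        # Drive a first-order plant (time constant 5 s) toward a setpoint of 1.0.
        pid, y, dt = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.1), 0.0, 0.1
        for _ in range(200):
            u = pid.step(1.0, y)
            y += (u - y) / 5.0 * dt      # dy/dt = (u - y) / tau
        print(f"plant output after 20 s: {y:.3f}")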

  15. Coupled in silico platform: Computational fluid dynamics (CFD) and physiologically-based pharmacokinetic (PBPK) modelling.

    Science.gov (United States)

    Vulović, Aleksandra; Šušteršič, Tijana; Cvijić, Sandra; Ibrić, Svetlana; Filipović, Nenad

    2018-02-15

    One of the critical components of respiratory drug delivery is the manner in which the inhaled aerosol is deposited in respiratory tract compartments. Depending on formulation properties, device characteristics and breathing pattern, only a certain fraction of the dose will reach the target site in the lungs, while the rest of the drug will deposit in the inhalation device or in the mouth-throat region. The aim of this study was to link Computational fluid dynamics (CFD) with physiologically-based pharmacokinetic (PBPK) modelling in order to predict the aerosolization of different dry powder formulations and estimate the concomitant in vivo deposition and absorption of amiloride hydrochloride. Drug physicochemical properties were experimentally determined and used as inputs for the CFD simulations of particle flow in the generated 3D geometric model of the Aerolizer® dry powder inhaler (DPI). CFD simulations were used to simulate air flow through the Aerolizer® inhaler, and the Discrete Phase Method (DPM) was used to simulate aerosol particle deposition within the fluid domain. The simulated values for the percent emitted dose were comparable to the values obtained using the Andersen cascade impactor (ACI). However, CFD predictions indicated that the aerosolized DPI formulations have a smaller particle size and narrower size distribution than assumed based on ACI measurements. Comparison with literature in vivo data revealed that the constructed drug-specific PBPK model was able to capture the amiloride absorption pattern following oral and inhalation administration. The PBPK simulation results, based on the CFD-generated particle distribution data as input, illustrated the influence of formulation properties on the expected drug plasma concentration profiles. The model also predicted the influence of potential changes in physiological parameters on the extent of inhaled amiloride absorption. Overall, this study demonstrated the potential of the combined CFD-PBPK approach to model inhaled drug

  16. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    Science.gov (United States)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a 'triggerless' readout scheme, where all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. The architecture of such a computing farm, which can process this amount of data as efficiently as possible, is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector, more and more FPGA compute accelerators are used to improve the compute performance and reduce the power consumption (e.g. in the Microsoft Catapult project and the Bing search engine). For the LHCb upgrade, the usage of an experimental FPGA-accelerated computing platform in the Event Building or in the Event Filter farm is therefore also being considered and tested. This platform from Intel hosts a general CPU and a high performance FPGA linked via a high speed link, which for this platform is a QPI link. On the FPGA an accelerator is implemented. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a compute-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was ported to the Intel Xeon/FPGA platform with OpenCL. The implementation work and the performance will be compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel
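    The physics behind the ported algorithm is compact: a particle with velocity beta in a radiator of refractive index n emits Cherenkov light at cos(theta_c) = 1/(n*beta). A small numeric illustration (the refractive index is an assumed textbook value, and this is not the LHCb RICH code):

        import math

        def cherenkov_angle(n, beta):
            """Cherenkov angle (rad) from cos(theta_c) = 1/(n*beta); NaN below threshold."""
            cos_t = 1.0 / (n * beta)
            return math.acos(cos_t) if cos_t <= 1.0 else float('nan')

        n = 1.0014  # roughly a C4F10 gas radiator (assumed value)
        for beta in (0.9990, 0.9995, 1.0):
            print(f"beta = {beta:.4f} -> theta_c = {1e3 * cherenkov_angle(n, beta):.1f} mrad")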

  17. An algorithm to compute the square root of 3x3 positive definite matrix

    International Nuclear Information System (INIS)

    Franca, L.P.

    1988-06-01

    An efficient closed form to compute the square root of a 3 x 3 positive definite matrix is presented. The derivation employs the Cayley-Hamilton theorem, avoiding calculation of eigenvectors. We show that evaluation of one eigenvalue of the square root matrix is needed and cannot be circumvented. The algorithm is robust and efficient. (author) [pt
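    The record's closed form is not reproduced here, but the quantity it computes is easy to cross-check numerically via an eigendecomposition; this is a generic sketch, not the paper's Cayley-Hamilton formula:

        import numpy as np

        def sqrtm_spd(A):
            """Principal square root of a symmetric positive definite matrix."""
            w, V = np.linalg.eigh(A)             # A = V diag(w) V^T with w > 0
            return V @ np.diag(np.sqrt(w)) @ V.T

        rng = np.random.default_rng(1)
        B = rng.normal(size=(3, 3))
        A = B @ B.T + 3.0 * np.eye(3)            # 3x3 SPD by construction
        S = sqrtm_spd(A)
        print(np.allclose(S @ S, A))             # True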

  18. A simple algorithm for computing positively weighted straight skeletons of monotone polygons☆

    Science.gov (United States)

    Biedl, Therese; Held, Martin; Huber, Stefan; Kaaser, Dominik; Palfrader, Peter

    2015-01-01

    We study the characteristics of straight skeletons of monotone polygonal chains and use them to devise an algorithm for computing positively weighted straight skeletons of monotone polygons. Our algorithm runs in O(n log n) time and O(n) space, where n denotes the number of vertices of the polygon. PMID:25648376

  19. A simple algorithm for computing positively weighted straight skeletons of monotone polygons.

    Science.gov (United States)

    Biedl, Therese; Held, Martin; Huber, Stefan; Kaaser, Dominik; Palfrader, Peter

    2015-02-01

    We study the characteristics of straight skeletons of monotone polygonal chains and use them to devise an algorithm for computing positively weighted straight skeletons of monotone polygons. Our algorithm runs in O(n log n) time and O(n) space, where n denotes the number of vertices of the polygon.

  20. Beyond 1984: The Positive and Negative Potential of Computer Supported School Focused Information Systems.

    Science.gov (United States)

    Klein, Susan S.

    Although educators' use of computers to track student and school information with the attendant positive and negative outcomes is still in an early stage of development, accessible data from such systems could improve the objective rationality of educational and instructional decision-making as long as no one places unwarranted credibility in the…

  1. A computational cognitive model for political positioning and reactions in web media

    NARCIS (Netherlands)

    Fernandes de Mello Araujo, E.; Klein, Michel

    2017-01-01

    This paper presents a computational cognitive model about political positioning inspired on recent insights from neuroscience and psychology. We describe a model that takes into consideration the individual structures of the brain and the environmental influences that may interfere on how a

  2. Analysis of interfractional variations in pancreatic position based on four-dimensional computed tomography

    International Nuclear Information System (INIS)

    Shiinoki, Takehiro; Itoh, Akio; Shibuya, Keiko; Nakamura, Mitsuhiro; Nakamura, Akira; Matsuo, Yukinori; Sawada, Akira; Mizowaki, Takashi; Hiraoka, Masahiro

    2010-01-01

    The purpose of this study was to assess inter-fractional variations in pancreatic position using four-dimensional computed tomography (4D-CT) and to identify a suitable respiratory phase for breath-holding. The variations in respiratory motion range during the treatment course and the inter-fractional variations in pancreatic position were not negligible; however, our study suggested that breath-holding at end-exhalation with some coaching techniques might be considered one of the non-invasive approaches to achieve higher positional reproducibility of pancreatic tumors. (author)

  3. An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64.

    Science.gov (United States)

    Winkler, Robert

    2015-01-01

    In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to arrive at the final results. These operations are often difficult to reproduce because the computing platforms used are too specific. This effect, known as 'workflow decay', can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes, e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS) Taverna. We illustrate useful combinations of the tools with practical examples: (1) a workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2) cluster analysis and Data Mining in targeted Metabolomics, and (3) raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present its application for finding co-occurring peptides, which can be used for targeted proteomics, the discovery of alternative biomarkers and protein-protein interactions. Data Mining derived models

  4. Computer-assisted teaching of skin flap surgery: validation of a mobile platform software for medical students.

    Science.gov (United States)

    de Sena, David P; Fabricio, Daniela D; Lopes, Maria Helena I; da Silva, Vinicius D

    2013-01-01

    The purpose of this study was to develop and validate a multimedia software application for mobile platforms to assist in the teaching and learning process of design and construction of a skin flap. Traditional training in surgery is based on learning by doing. Initially, the use of cadavers and animal models appeared to be a valid alternative for training. However, many conflicts with these training models prompted progression to synthetic and virtual reality models. Fifty volunteer fifth- and sixth-year medical students completed a pretest and were randomly allocated into two groups of 25 students each. The control group was exposed for 5 minutes to a standard text-based print article, while the test group used multimedia software describing how to fashion a rhomboid flap. Each group then performed a cutaneous flap on a training bench model while being evaluated by three blinded BSPS (Brazilian Society of Plastic Surgery) board-certified surgeons using the OSATS (Objective Structured Assessment of Technical Skill) protocol and answered a post-test. The text-based group was then tested again using the software. The computer-assisted learning (CAL) group had superior performance as confirmed by checklist scores (p<0.002), overall global assessment (p = 0.017) and post-test results (p<0.001). All participants ranked the multimedia method as the best study tool. CAL learners exhibited better subjective and objective performance when fashioning rhomboid flaps as compared to those taught with standard print material. These findings indicate that students preferred to learn using the multimedia method.

  5. Computer graphics of SEM images facilitate recognition of chromosome position in isolated human metaphase plates.

    Science.gov (United States)

    Hodge, L D; Barrett, J M; Welter, D A

    1995-04-01

    There is general agreement that at the time of mitosis chromosomes occupy precise positions and that these positions likely affect subsequent nuclear function in interphase. However, before such ideas can be investigated in human cells, it is necessary to first determine the precise position of each chromosome with regard to its neighbors. It has occurred to us that stereo images of isolated metaphase plates, produced by scanning electron microscopy, could form the basis whereby these positions could be ascertained. In this paper we describe a computer graphic technique that permits us to keep track of individual chromosomes in a metaphase plate and to compare chromosome positions in different metaphase plates. Moreover, the computer graphics provide permanent, easily manipulated, rapid recall of stored chromosome profiles. These advantages are demonstrated by a comparison of the relative positions of group A-specific and groups D- and G-specific chromosomes to the full complement of chromosomes in metaphase plates isolated from a nearly triploid human-derived cell (HeLa S3) to a hypo-diploid human fetal lung cell.

  6. Accuracy of a computer-assisted planning and placement system for anatomical femoral tunnel positioning in anterior cruciate ligament reconstruction

    NARCIS (Netherlands)

    Luites, J.W.H.; Wymenga, A.B.; Blankevoort, L.; Eygendaal, D.; Verdonschot, Nicolaas Jacobus Joseph

    2014-01-01

    Background Femoral tunnel positioning is a difficult, but important factor in successful anterior cruciate ligament (ACL) reconstruction. Computer navigation can improve the anatomical planning procedure besides the tunnel placement procedure. Methods The accuracy of the computer-assisted femoral

  7. Vertical Wave Impacts on Offshore Wind Turbine Inspection Platforms

    DEFF Research Database (Denmark)

    Bredmose, Henrik; Jacobsen, Niels Gjøl

    2011-01-01

    Breaking wave impacts on a monopile at 20 m depth are computed with a VOF (Volume Of Fluid) method. The impacting waves are generated by the second-order focused wave group technique, to obtain waves that break at the position of the monopile. The subsequent impact from the vertical run-up flow on a horizontal inspection platform is computed for five different platform levels. The computational results show details of monopile impact such as slamming pressures from the overturning wave front and the formation of run-up flow. The results show that vertical platform impacts can occur at 20 m water depth. The dependence of the vertical platform load on the platform level is discussed. Attention is given to the significant downward force that occurs after the upward force associated with the vertical impact. The effect of the numerical resolution on the results is assessed. The position of wave overturning is found

  8. Sketch for a model of four epistemological positions toward computer game play

    DEFF Research Database (Denmark)

    Leino, Olli

    2008-01-01

    The paper attempts to sketch out four distinct epistemological positions toward the player, who is understood as derived from play and game. To map out the problem field, two equally challenged positions toward computer game play are observed, emerging from inadequate treatment of the differences... of playing a game is seen as independent of what goes on in the player's mind (actually, the player might not even be the true subject of the game). Similar polarities are postulated regarding a game; from an exclusive viewpoint "game" is a signifying shorthand for objects, which, when observed from

  9. Incidental lung cancers and positive computed tomography images in people living with HIV

    DEFF Research Database (Denmark)

    Ronit, Andreas; Kristensen, Thomas; Klitbo, Ditte M.

    2017-01-01

    Objective: Lung cancer screening with low-dose computed tomography (LDCT) of high-risk groups in the general population is recommended by several authorities. This may not be feasible in people living with HIV (PLWHIV) due to a higher prevalence of nodules. We therefore assessed the prevalence of positive computed tomography (CT) images and lung cancers in PLWHIV. Design: The Copenhagen comorbidity in HIV infection (COCOMO) study is an observational, longitudinal cohort study. Single-round LDCT was performed with subsequent clinical follow-up (NCT02382822). Method: Outcomes included histology... LDCT was performed in 901 patients, including 113 at high risk for lung cancer. A positive image was found in 28 (3.1% of the entire cohort and 9.7% of the high-risk group). Nine patients (all in the high-risk group) had invasive procedures undertaken with no serious adverse events. Lung cancer (stages IA, IIA, and IIIA...

  10. Computer-aided position planning of miniplates to treat facial bone defects.

    Directory of Open Access Journals (Sweden)

    Jan Egger

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted to the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models to the surgeon's desired position in a 3D computer model. This is done with respect to the surrounding anatomical structures, always with the possibility of adjusting both the direction and the position of the osteosynthesis material used. Using the proposed software, surgeons are able to pre-plan the form and morphology of the final implant with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the format commonly used for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method yields osteosynthesis materials adapted to the surrounding anatomy and requires only a minimal amount of money and time.

  11. Computer-aided position planning of miniplates to treat facial bone defects

    Science.gov (United States)

    Wallner, Jürgen; Gall, Markus; Chen, Xiaojun; Schwenzer-Zimmerer, Katja; Reinbacher, Knut; Schmalstieg, Dieter

    2017-01-01

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted to the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models to the surgeon's desired position in a 3D computer model. This is done with respect to the surrounding anatomical structures, always with the possibility of adjusting both the direction and the position of the osteosynthesis material used. Using the proposed software, surgeons are able to pre-plan the form and morphology of the final implant with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the format commonly used for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method yields osteosynthesis materials adapted to the surrounding anatomy and requires only a minimal amount of money and time. PMID:28817607

  12. False-Positive Rate Determination of Protein Target Discovery using a Covalent Modification- and Mass Spectrometry-Based Proteomics Platform

    Science.gov (United States)

    Strickland, Erin C.; Geer, M. Ariel; Hong, Jiyong; Fitzgerald, Michael C.

    2014-01-01

    Detection and quantitation of protein-ligand binding interactions is important in many areas of biological research. Stability of proteins from rates of oxidation (SPROX) is an energetics-based technique for identifying the protein targets of ligands in complex biological mixtures. Knowing the false-positive rate of protein target discovery in proteome-wide SPROX experiments is important for the correct interpretation of results. Reported here are the results of a control SPROX experiment in which chemical denaturation data were obtained on the proteins in two samples that originated from the same yeast lysate, as would be done in a typical SPROX experiment except that one sample would be spiked with the test ligand. False-positive rates of 1.2-2.2% were determined, arising from random error in the analysis of the isobaric mass tag (e.g., iTRAQ®) reporter ions used for peptide quantitation. Our results also suggest that technical replicates can be used to effectively eliminate such false positives that result from this random error, as is demonstrated in a SPROX experiment to identify yeast protein targets of the drug manassantin A. The impact of ion purity in the tandem mass spectral analyses and of background oxidation on the false-positive rate of protein target discovery using SPROX is also discussed.

  13. Computer-assisted teaching of skin flap surgery: validation of a mobile platform software for medical students.

    Directory of Open Access Journals (Sweden)

    David P de Sena

    The purpose of this study was to develop and validate a multimedia software application for mobile platforms to assist in the teaching and learning process of design and construction of a skin flap. Traditional training in surgery is based on learning by doing. Initially, the use of cadavers and animal models appeared to be a valid alternative for training. However, many conflicts with these training models prompted progression to synthetic and virtual reality models. Fifty volunteer fifth- and sixth-year medical students completed a pretest and were randomly allocated into two groups of 25 students each. The control group was exposed for 5 minutes to a standard text-based print article, while the test group used multimedia software describing how to fashion a rhomboid flap. Each group then performed a cutaneous flap on a training bench model while being evaluated by three blinded BSPS (Brazilian Society of Plastic Surgery) board-certified surgeons using the OSATS (Objective Structured Assessment of Technical Skill) protocol and answered a post-test. The text-based group was then tested again using the software. The computer-assisted learning (CAL) group had superior performance as confirmed by checklist scores (p<0.002), overall global assessment (p = 0.017) and post-test results (p<0.001). All participants ranked the multimedia method as the best study tool. CAL learners exhibited better subjective and objective performance when fashioning rhomboid flaps as compared to those taught with standard print material. These findings indicate that students preferred to learn using the multimedia method.

  14. The COMET Sleep Research Platform.

    Science.gov (United States)

    Nichols, Deborah A; DeSalvo, Steven; Miller, Richard A; Jónsson, Darrell; Griffin, Kara S; Hyde, Pamela R; Walsh, James K; Kushida, Clete A

    2014-01-01

    The Comparative Outcomes Management with Electronic Data Technology (COMET) platform is extensible and designed for facilitating multicenter electronic clinical research. Our research goals were the following: (1) to conduct a comparative effectiveness trial (CET) for two obstructive sleep apnea treatments: positive airway pressure versus oral appliance therapy; and (2) to establish a new electronic network infrastructure that would support this study and other clinical research studies. The COMET platform was created to satisfy the needs of CET with a focus on creating a platform that provides comprehensive toolsets, multisite collaboration, and end-to-end data management. The platform also provides medical researchers the ability to visualize and interpret data using business intelligence (BI) tools. COMET is a research platform that is scalable and extensible, and which, in a future version, can accommodate big data sets and enable efficient and effective research across multiple studies and medical specialties. The COMET platform components were designed for an eventual move to a cloud computing infrastructure that enhances sustainability, overall cost effectiveness, and return on investment.

  15. An Analysis Of Methods For Sharing An Electronic Platform Of Public Administration Services Using Cloud Computing And Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    Maciej Hamiga

    2012-01-01

    This paper presents a case study on how to design and implement a public administration services platform, using the SOA paradigm and the cloud model, for sharing among citizens belonging to particular districts and provinces, providing tight integration with an existing ePUAP system. The basic requirements, architecture and implementation of the platform are all discussed. Practical evaluation of the solution is elaborated using a real-case scenario of Business Process Management related activities.

  16. Integrating medicinal chemistry, organic/combinatorial chemistry, and computational chemistry for the discovery of selective estrogen receptor modulators with Forecaster, a novel platform for drug discovery.

    Science.gov (United States)

    Therrien, Eric; Englebienne, Pablo; Arrowsmith, Andrew G; Mendoza-Sanchez, Rodrigo; Corbeil, Christopher R; Weill, Nathanael; Campagna-Slater, Valérie; Moitessier, Nicolas

    2012-01-23

    As part of a large medicinal chemistry program, we wish to develop novel selective estrogen receptor modulators (SERMs) as potential breast cancer treatments using a combination of experimental and computational approaches. However, one of the remaining difficulties nowadays is to fully integrate computational (i.e., virtual, theoretical) and medicinal (i.e., experimental, intuitive) chemistry to take advantage of the full potential of both. For this purpose, we have developed a Web-based platform, Forecaster, and a number of programs (e.g., Prepare, React, Select) with the aim of combining computational chemistry and medicinal chemistry expertise to facilitate drug discovery and development, and more specifically to integrate synthesis into computer-aided drug design. In our quest for potent SERMs, this platform was used to build virtual combinatorial libraries, filter and extract a highly diverse library from the NCI database, and dock them to the estrogen receptor (ER), with all of these steps being fully automated by computational chemists for use by medicinal chemists. As a result, virtual screening of a diverse library seeded with active compounds followed by a search for analogs yielded an enrichment factor of 129, with 98% of the seeded active compounds recovered, while the screening of a designed virtual combinatorial library including known actives yielded an area under the receiver operating characteristic curve (AU-ROC) of 0.78. The lead optimization proved less successful, further demonstrating the challenge of simulating structure-activity relationship studies.
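    The headline screening metric is straightforward to compute from a ranked hit list; a minimal sketch of the usual enrichment-factor definition (toy data, not the Forecaster results):

        def enrichment_factor(ranked_is_active, fraction=0.01):
            """EF = (hit rate in the top fraction) / (hit rate in the whole library)."""
            n = len(ranked_is_active)
            top = ranked_is_active[: max(1, int(n * fraction))]
            return (sum(top) / len(top)) / (sum(ranked_is_active) / n)

        # Toy ranked library of 10,000 compounds with 10 actives, most ranked early.
        ranked = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0] + [1] * 5 + [0] * 9985
        print(f"EF(1%) = {enrichment_factor(ranked):.0f}")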

  17. The NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform to Support the Analysis of Petascale Environmental Data Collections

    Science.gov (United States)

    Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.

    2014-12-01

    The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within a HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large, valuable data assets and harmonising the data collections, new opportunities have arisen, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that

  18. Development of customized positioning guides using computer-aided design and manufacturing technology for orthognathic surgery.

    Science.gov (United States)

    Lin, Hsiu-Hsia; Chang, Hsin-Wen; Lo, Lun-Jou

    2015-12-01

    The purpose of this study was to devise a method for producing customized positioning guides for translating virtual plans to actual orthognathic surgery, and to evaluate the feasibility and validity of the devised method. Patients requiring two-jaw orthognathic surgery were enrolled and consented before the operation. Two types of positioning guides were designed and fabricated using computer-aided design and manufacturing technology: one guide was used for the LeFort I osteotomy, and the other for positioning the maxillomandibular complex. The guides were fixed to the medial side of the maxilla. For validation, the simulation images and postoperative cone beam computed tomography images were superimposed using surface registration to quantify the difference between the images. The data were presented as root-mean-square difference (RMSD) values. Both types of guides provided an ideal fit and maximal contact to the maxillary surface, facilitating their accurate placement in clinical applications. The validation results indicated that RMSD values between the images ranged from 0.18 to 0.33 mm in the maxilla and from 0.99 to 1.56 mm in the mandible. The patients were followed up for 6 months or more, and all of them were satisfied with the results. The proposed customized positioning guides are practical and reliable for translating virtual plans to actual surgery. Furthermore, these guides improved the efficiency and outcome of surgery. This approach is uncomplicated in design, cost-effective in fabrication, and particularly convenient to use.

  19. Toward an Optimal Position for IVC Filters: Computational Modeling of the Impact of Renal Vein Inflow

    Energy Technology Data Exchange (ETDEWEB)

    Wang, S L; Singer, M A

    2009-07-13

    The purpose of this report is to evaluate the hemodynamic effects of renal vein inflow and filter position on unoccluded and partially occluded IVC filters using three-dimensional computational fluid dynamics. Three-dimensional models of the TrapEase and Gunther Celect IVC filters, spherical thrombi, and an IVC with renal veins were constructed. The hemodynamics of steady-state flow were examined for unoccluded and partially occluded TrapEase and Gunther Celect IVC filters in varying proximity to the renal veins. Flow past the unoccluded filters demonstrated minimal disruption. Natural regions of stagnant/recirculating flow in the IVC are observed superior to the bilateral renal vein inflows, and high flow velocities and elevated shear stresses are observed in the vicinity of renal inflow. Spherical thrombi induce stagnant and/or recirculating flow downstream of the thrombus. Placement of the TrapEase filter in the suprarenal vein position resulted in a large area of low shear stress/stagnant flow within the filter just downstream of thrombus trapped in the upstream trapping position. Filter position with respect to renal vein inflow influences the hemodynamics of filter trapping. Placement of the TrapEase filter in a suprarenal location may be thrombogenic, with redundant areas of stagnant/recirculating flow and low shear stress along the caval wall due to the upstream trapping position and the naturally occurring region of stagnant flow from the renal veins. Infrarenal placement of IVC filters in a near juxtarenal position, with the downstream cone near the renal vein inflow, likely confers increased levels of mechanical lysis of trapped thrombi due to increased shear stress from renal vein inflow.

  20. Improving computer-aided detection assistance in breast cancer screening by removal of obviously false-positive findings

    NARCIS (Netherlands)

    Mordang, Jan-Jurre; Gubern-Merida, Albert; Bria, Alessandro; Tortorella, Francesco; den Heeten, Gerard; Karssemeijer, Nico

    2017-01-01

    Purpose: Computer-aided detection (CADe) systems for mammography screening still mark many false positives. This can cause radiologists to lose confidence in CADe, especially when many false positives are obviously not suspicious to them. In this study, we focus on obvious false positives generated

  1. Improving computer-aided detection assistance in breast cancer screening by removal of obviously false-positive findings

    NARCIS (Netherlands)

    Mordang, J.J.; Gubern Merida, A.; Bria, A.; Tortorella, F.; Heeten, G.; Karssemeijer, N.

    2017-01-01

    PURPOSE: Computer-aided detection (CADe) systems for mammography screening still mark many false positives. This can cause radiologists to lose confidence in CADe, especially when many false positives are obviously not suspicious to them. In this study, we focus on obvious false positives generated

  2. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Cross-platform is a concept that has become increasingly common in recent years, especially in the development of mobile apps, but it has also been used consistently over time in the development of conventional desktop applications. The notion of cross-platform software (multi-platform or platform-independent) refers to a software application that can run on more than one operating system or computing architecture. Thus, a cross-platform application can operate independently of the software or hardware platform on which it is executed. Since such a generic definition admits a wide range of meanings, for the purposes of this paper we narrow it and use the following functional definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.

  3. A computational framework for investigating the positional stability of aortic endografts.

    Science.gov (United States)

    Prasad, Anamika; Xiao, Nan; Gong, Xiao-Yan; Zarins, Christopher K; Figueroa, C Alberto

    2013-10-01

    Endovascular aneurysm repair (EVAR) techniques (Greenhalgh in N Engl J Med 362(20):1863-1871, 2010) have revolutionized the treatment of thoracic and abdominal aortic aneurysm disease, greatly reducing the perioperative mortality and morbidity associated with open surgical repair techniques. However, EVAR is not free of important complications, such as late device migration, endoleak formation and fracture of device components, that may result in adverse events such as aneurysm enlargement, need for long-term imaging surveillance and secondary interventions, or even death. These complications result from the device's inability to withstand the hemodynamics of blood flow and to keep its originally intended post-operative position over time. Understanding the in vivo biomechanical working environment experienced by endografts is a critical factor in improving their long-term performance. To date, no study has investigated the mechanics of contact between device and aorta in a three-dimensional setting. In this work, we developed a comprehensive Computational Solid Mechanics and Computational Fluid Dynamics framework to investigate the mechanics of endograft positional stability. The main building blocks of this framework are: (1) three-dimensional non-planar aortic and stent-graft geometrical models, (2) realistic multi-material constitutive laws for aorta, stent, and graft, (3) physiological values for blood flow and pressure, and (4) a frictional model to describe the contact between the endograft and the aorta. We introduce a new metric for numerical quantification of the positional stability of the endograft. Lastly, in the results section, we test the framework by investigating the impact of several factors that are clinically known to affect endograft stability.

  4. Computer-assisted preoperative simulation for positioning of plate fixation in Lefort I osteotomy: A case report

    Directory of Open Access Journals (Sweden)

    Hideyuki Suenaga

    2016-06-01

    Computed tomography images are used for three-dimensional planning in orthognathic surgery. This facilitates the actual surgery by simulating the surgical scenario. We performed a computer-assisted virtual orthognathic surgical procedure using optically scanned three-dimensional (3D) data and real computed tomography data on a personal computer. It facilitated maxillary bone movement and positioning as well as the temporary fixation and positioning of the titanium plate. The simulation made the procedure easy: we could perform the actual surgery precisely and forecast the post-surgery outcome. This simulation method promises great potential in orthognathic surgery to help surgeons plan and perform operative procedures more precisely.

  5. A User-Centered Mobile Cloud Computing Platform for Improving Knowledge Management in Small-to-Medium Enterprises in the Chilean Construction Industry

    Directory of Open Access Journals (Sweden)

    Daniela Núñez

    2018-03-01

    Knowledge management (KM) is a key element for the development of small-to-medium enterprises (SMEs) in the construction industry. This is particularly relevant in Chile, where this industry is composed almost entirely of SMEs. Although various KM system proposals can be found in the literature, they are not suitable for SMEs, due to usability problems, budget constraints, and time and connectivity issues. Mobile Cloud Computing (MCC) systems offer several advantages to construction SMEs, but they have not yet been exploited to address KM needs. Therefore, this research is aimed at the development of a MCC-based KM platform to manage lessons learned in different construction projects of SMEs, through an iterative and user-centered methodology. Usability and quality evaluations of the proposed platform show that MCC is a feasible and attractive option to address the KM issues in SMEs of the Chilean construction industry, since it is possible to consider both technical and usability requirements.

  6. Computational Studies of Positive and Negative Streamers in Bubbles Suspended in Distilled Water

    KAUST Repository

    Sharma, Ashish

    2017-01-05

    We perform computational studies of nanosecond streamers generated in helium bubbles immersed in distilled water under high pressure conditions. The model takes into account the presence of water vapor in the gas bubble for an accurate description of the chemical kinetics of the discharge. We apply positive and negative trigger voltages much higher than the breakdown voltage and study the dynamic characteristics of the resulting discharge. We observe that, for high positive trigger voltages, the streamer moves along the surface of the gas bubble during the initial stages of the discharge. We also find a considerable difference in the evolution of the streamer discharge for positive and negative trigger voltages with more uniform volumetric distribution of species in the streamer channel for negative trigger voltages due to formation of multiple streamers. We also observe that the presence of water vapor does not influence the breakdown voltage of the discharge but greatly affects the composition of dominant species in the trail of the streamer channel.

  7. Excessive lateral patellar translation on axial computed tomography indicates positive patellar J sign.

    Science.gov (United States)

    Xue, Zhe; Song, Guan-Yang; Liu, Xin; Zhang, Hui; Wu, Guan; Qian, Yi; Feng, Hua

    2018-03-20

    The purpose of the study was to quantify the patellar J sign using traditional computed tomography (CT) scans. Fifty-three patients (fifty-three knees) who suffered from recurrent patellar instability were included and analyzed. The patellar J sign was evaluated pre-operatively during active knee flexion and extension. It was defined as positive when there was obvious lateral patellar translation, and negative when there was not. The CT scans were performed in all patients with full knee extension, and the parameters including bisect offset index (BOI), patellar-trochlear-groove (PTG) distance, and patellar lateral tilt angle (PLTA) were measured on the axial slices. All three parameters were compared between the J sign-positive group (study group) and the J sign-negative group (control group). In addition, the optimal thresholds of the three CT scan parameters for predicting the positive patellar J sign were determined with receiver operating characteristic (ROC) curves, and the diagnostic values were assessed by the area under the curve (AUC). Among the fifty-three patients (fifty-three knees), thirty-seven (70%) showed obvious lateral patellar translation and were defined as positive J sign (study group); the remaining sixteen (30%) who showed no lateral translation were defined as negative J sign (control group). The mean values of the three CT parameters in the study group were all significantly larger than in the control group, including BOI (121 ± 28% vs 88 ± 12%, P = 0.038) and PTG distance (5.2 ± 6.6 mm vs -4.4 ± 5.2 mm). The AUC of the BOI for predicting the positive patellar J sign was 97.5% (sensitivity = 83.3%, specificity = 87.5%). In this study, the prevalence of positive patellar J sign was 70%. The BOI measured from axial CT scans of the knee joint can be used as an appropriate predictor to differentiate the positive J sign from the negative J sign, highlighting that excessive lateral patellar translation on axial CT scan indicates positive patellar J sign.
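    The ROC-based threshold selection the study describes is commonly done by maximizing Youden's J; a sketch with synthetic data drawn from the group statistics quoted above (means, spreads and group sizes from the record, everything else illustrative):

        import numpy as np
        from sklearn.metrics import roc_curve, roc_auc_score

        rng = np.random.default_rng(2)
        # Synthetic BOI (%) for 16 J-sign-negative and 37 J-sign-positive knees.
        boi = np.concatenate([rng.normal(88, 12, 16), rng.normal(121, 28, 37)])
        y = np.concatenate([np.zeros(16), np.ones(37)])

        fpr, tpr, thr = roc_curve(y, boi)
        best = np.argmax(tpr - fpr)              # Youden's J = sens + spec - 1
        print(f"AUC = {roc_auc_score(y, boi):.3f}, cutoff = {thr[best]:.0f}% "
              f"(sens {tpr[best]:.2f}, spec {1 - fpr[best]:.2f})")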

  8. Efficient Use of Automatic Exposure Control Systems in Computed Tomography Requires Correct Patient Positioning

    Energy Technology Data Exchange (ETDEWEB)

    Gudjonsdottir, J.; Jonsdottir, B. (Roentgen Domus Medica, Reykjavik (Iceland)); Svensson, J.R.; Campling, S. (Faculty of Health and Social Care, Anglia Ruskin Univ., Cambridge (United Kingdom)); Brennan, P.C. (Diagnostic Imaging, Biological Imaging Research, UCD School of Medicine and Medical Science, Univ. College Dublin, Belfield, Dublin (Ireland))

    2009-11-15

    Background: Image quality and radiation dose to the patient are important factors in computed tomography (CT). To provide constant image quality, tube current modulation (TCM) performed by automatic exposure control (AEC) adjusts the tube current to the patient's size and shape. Purpose: To evaluate the effects of patient centering on tube current-time product (mAs) and image noise. Material and Methods: An oval-shaped acrylic phantom was scanned in various off-center positions, at 30-mm intervals within a 500-mm field of view, using three different CT scanners. Acquisition parameters were similar to routine abdomen examinations at each site. The mAs was recorded and noise measured in the images. The correlation of mAs and noise with position was calculated using Pearson correlation. Results: In all three scanners, the mAs delivered by the AEC changed with the y-position of the phantom (P<0.001), with correlation values of 0.98 for scanners A and B and -0.98 for scanner C. With x-position, mAs changes were 4.9% or less. In off-center y-positions, compared with the iso-center, the mAs varied by up to +70%, -34%, and +56% in scanners A, B, and C, respectively. For scanners A and B, noise in two regions of interest in the lower part of the phantom decreased with elevation, with correlation factors from -0.95 to -0.86 (P<0.02). In the x-direction, significant noise relationships (P<0.005) were only seen in scanner A. Conclusion: This study demonstrates that patient centering markedly affects the efficacy of AEC function and that tube current changes vary between scanners. The tube position when acquiring the scout projection radiograph is decisive for the direction of the mAs change. Off-center patient positions cause errors in tube current modulation that can outweigh the dose reduction gained by AEC use, and image quality is affected.

  9. Accurate 3D Positioning for a Mobile Platform in Non-Line-of-Sight Scenarios Based on IMU/Magnetometer Sensor Fusion.

    Science.gov (United States)

    Hellmers, Hendrik; Kasmi, Zakaria; Norrdine, Abdelmoumen; Eichhorn, Andreas

    2018-01-04

    In recent years, a variety of real-time applications benefit from services provided by localization systems due to the advent of sensing and communication technologies. Since the Global Navigation Satellite System (GNSS) enables localization only outside buildings, applications for indoor positioning and navigation use alternative technologies. Ultra Wide Band (UWB) signals, Wireless Local Area Network (WLAN), ultrasonic or infrared are common examples. However, these technologies suffer from fading and multipath effects caused by objects and materials in the building. In contrast, magnetic fields are able to pass through obstacles without significant propagation errors, i.e., in Non-Line-of-Sight (NLoS) scenarios. The aim of this work is to propose a novel indoor positioning system based on artificially generated magnetic fields in combination with Inertial Measurement Units (IMUs). In order to reach better coverage, multiple coils are used as reference points. A basic algorithm for three-dimensional applications is demonstrated as well as evaluated in this article. The established system is then realized by a sensor fusion principle as well as a kinematic motion model on the basis of a Kalman filter. Furthermore, a pressure sensor is used in combination with an adaptive filtering method to reliably estimate the platform's altitude.
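    In its simplest form, the fusion idea reduces to a linear Kalman filter over a kinematic state; a 1D constant-velocity toy version for the altitude channel (all noise levels are assumed, and the published system is considerably more elaborate):

        import numpy as np

        dt = 0.1
        F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
        H = np.array([[1.0, 0.0]])              # only altitude is measured
        Q = 1e-3 * np.eye(2)                    # process noise (assumed)
        R = np.array([[0.5]])                   # pressure-sensor variance (assumed)

        x, P = np.zeros(2), np.eye(2)           # state [altitude, climb rate]
        rng, truth = np.random.default_rng(3), 0.0
        for _ in range(100):
            truth += 0.2 * dt                   # platform climbs at 0.2 m/s
            z = truth + rng.normal(0.0, 0.7)    # noisy pressure-derived altitude
            x, P = F @ x, F @ P @ F.T + Q       # predict
            K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
            x = x + (K @ (np.atleast_1d(z) - H @ x)).ravel()
            P = (np.eye(2) - K @ H) @ P         # update
        print(f"estimated altitude {x[0]:.2f} m vs true {truth:.2f} m")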

  10. Contribution of computed tomography (CT) in affections of the lung parenchyma in HIV positive patients

    International Nuclear Information System (INIS)

    Neuwirth, J.; Stankova, M.; Spala, J.; Strof, J.

    1996-01-01

    CT findings in HIV-positive patients with respiratory complaints were analyzed. The predominant morphological type of change is a 'ground-glass' increase in density. Minimal changes of the lung parenchyma were recorded on high resolution computed tomography (HRCT) even in patients with a negative or doubtful finding on plain chest radiographs. The range of affections seen on HRCT scans was also wider than on plain scans. The morphological changes on HRCT scans alone, however, are not an adequate basis for differentiating the various infectious agents in inflammatory changes of the lung parenchyma, and frequently mixed infections are involved. When clinical symptoms are considered at the same time, it is frequently possible to considerably reduce the number of possible pathogenic organisms and to start treatment. (author) 4 figs., 11 refs

  11. An enhanced computational platform for investigating the roles of regulatory RNA and for identifying functional RNA motifs

    OpenAIRE

    Chang, Tzu-Hao; Huang, Hsi-Yuan; Hsu, Justin Bo-Kai; Weng, Shun-Long; Horng, Jorng-Tzong; Huang, Hsien-Da

    2013-01-01

    Background Functional RNA molecules participate in numerous biological processes, ranging from gene regulation to protein synthesis. Analysis of functional RNA motifs and elements in RNA sequences can obtain useful information for deciphering RNA regulatory mechanisms. Our previous work, RegRNA, is widely used in the identification of regulatory motifs, and this work extends it by incorporating more comprehensive and updated data sources and analytical approaches into a new platform. Methods ...

  12. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Directory of Open Access Journals (Sweden)

    Javier Eduardo Diaz Zamboni

    2017-01-01

    The precise knowledge of the point spread function is central for any imaging system characterization. In fluorescence microscopy, point spread function (PSF) determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters that describe image formation on the microscope to experimental data. In order to contribute to this subject, a comparative study of three parameter estimation methods is reported, namely: I-divergence minimization (MIDIV), maximum likelihood (ML) and non-linear least squares (LSQR). They were applied to the estimation of the point source position on the optical axis, using a physical model. The methods' performance was evaluated under different conditions and noise levels using synthetic images and considering success percentage, iteration number, computation time, accuracy and precision. The main results showed that the axial position estimation requires a high SNR to achieve an acceptable success level, and a higher one still to come close to the estimation error lower bound. ML achieved a higher success percentage at lower SNR compared to MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods achieved the error lower bound, but only with data belonging to the optical axis and at high SNR. Extrinsic noise sources worsened the success percentage, but no difference was found between noise sources for the same method for all methods studied.
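    All three estimators fit a parametric image-formation model to data; the least-squares variant can be sketched with a Gaussian axial profile standing in for the physical PSF model (entirely illustrative values):

        import numpy as np
        from scipy.optimize import curve_fit

        def axial_profile(z, z0, sigma, amp, bg):
            # Stand-in for the physical PSF model: Gaussian axial intensity.
            return amp * np.exp(-0.5 * ((z - z0) / sigma) ** 2) + bg

        rng = np.random.default_rng(4)
        z = np.linspace(-3.0, 3.0, 61)                       # focal positions (um)
        data = axial_profile(z, 0.4, 0.8, 100.0, 10.0) + rng.normal(0.0, 2.0, z.size)

        p0 = (0.0, 1.0, data.max(), data.min())              # crude initial guess
        popt, pcov = curve_fit(axial_profile, z, data, p0=p0)
        print(f"estimated z0 = {popt[0]:.3f} +/- {np.sqrt(pcov[0, 0]):.3f} um")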

  13. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    Science.gov (United States)

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-01-01

    We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW then to LabVIEW for Arduino in the curriculum "Computer-Assisted Instrumentation in the Design of Physics Laboratories" brings…

  14. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    Science.gov (United States)

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays, the growing need for highly qualified computer science educators in modern educational environments is commonplace. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  15. Positive ventriculography and computer assisted tomography of the skull in the evaluation of megalocephaly in newborns and infants

    International Nuclear Information System (INIS)

    Kellermann, K.; Heuser, L.; Bliesener, J.A.

    1979-01-01

    To clarify the indications and limits of computer assisted tomography and positive ventriculography, the results of both methods were compared with each other retrospectively in 55 macrocephalic newborns and infants. It could be shown that positive ventriculography, even if combined with coronal views, was superseded by computer assisted tomography to only a certain extent. To complete the appropriate diagnosis preoperatively, positive ventriculography was necessary most often in patients with paranatal disturbances who had undergone intensive care treatment. The non-invasive method of computer assisted tomography often yielded sufficient information on the localisation, size, and nature of the underlying pathologic process. Both methods turned out to be complementary when topographical and functional interrelations of cranial cysts were to be analysed, or when anatomical details had to be demonstrated, especially structures of the midline and of the posterior fossa, which could not be visualized by computer assisted tomography alone. (orig.) [de

  16. The influence of computer games on positive socialization of delinquent teenagers

    OpenAIRE

    Mosteikaitė, Ernesta

    2016-01-01

    Computers and the Internet are rapidly spreading through everyday society, and these two components have had a strong influence on teenagers’ daily lives. A closer look shows that a computer can be found in almost every family with minors and in educational institutions, which is the main reason why children use computers almost every day. Due to this, computers gain increasing importance in teenagers’ daily lives; naturally, a question arises about the impact of the computer on children’s behaviour and s...

  17. Validation of the Gate simulation platform in single photon emission computed tomography and application to the development of a complete 3-dimensional reconstruction algorithm

    International Nuclear Information System (INIS)

    Lazaro, D.

    2003-10-01

    Monte Carlo simulations are currently considered in nuclear medical imaging as a powerful tool to design and optimize detection systems, and also to assess reconstruction algorithms and correction methods for degrading physical effects. Among the many simulators available, none is considered a standard in nuclear medical imaging: this fact motivated the development of a new generic Monte Carlo simulation platform (GATE), based on GEANT4 and dedicated to SPECT/PET (single photon emission computed tomography / positron emission tomography) applications. During this thesis we participated in the development of the GATE platform within an international collaboration. GATE was validated in SPECT by modeling two gamma cameras with different geometries, one dedicated to small-animal imaging and the other used in a clinical context (Philips AXIS), and by comparing the results obtained with GATE simulations against experimental data. The simulation results accurately reproduce the measured performance of both gamma cameras. The GATE platform was then used to develop a new 3-dimensional reconstruction method: F3DMC (fully 3-dimensional Monte Carlo), which consists of computing with Monte Carlo simulation the transition matrix used in an iterative reconstruction algorithm (in this case, ML-EM), including within the transition matrix the main physical effects degrading the image formation process. The results obtained with the F3DMC method were compared to those obtained with three other more conventional methods (FBP, MLEM, MLEMC) for different phantoms. The results of this study show that F3DMC improves the reconstruction efficiency, the spatial resolution and the signal-to-noise ratio with a satisfactory quantification of the images. These results should be confirmed by clinical experiments, and they open the door to a unified reconstruction method, which could be applied not only in SPECT but also in PET. (author)
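
    As a concrete illustration of the ML-EM update at the heart of F3DMC, the sketch below runs the iteration with a small random matrix standing in for the Monte Carlo-estimated transition matrix; the sizes and count scaling are hypothetical, and a real matrix would encode the simulated detector response, attenuation and scatter.

      # ML-EM iterations with a toy system matrix A standing in for the
      # Monte Carlo-estimated transition matrix of F3DMC (hypothetical sizes).
      import numpy as np

      rng = np.random.default_rng(1)
      n_bins, n_voxels = 64, 32
      A = rng.random((n_bins, n_voxels))     # voxel -> projection-bin probabilities
      x_true = rng.random(n_voxels)
      y = rng.poisson(A @ x_true * 50.0)     # noisy measured projections

      x = np.ones(n_voxels)                  # uniform initial image estimate
      sensitivity = A.sum(axis=0)            # A^T 1, per-voxel normalisation
      for _ in range(50):
          expected = A @ x                   # forward-project current estimate
          ratio = y / np.maximum(expected, 1e-12)
          x *= (A.T @ ratio) / sensitivity   # multiplicative ML-EM update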

  18. An Evaluation of the FIDAP Computational Fluid Dynamics Code for the Calculation of Hydrodynamic Forces on Underwater Platforms

    National Research Council Canada - National Science Library

    Jones, D

    2003-01-01

    ..., spheres, flat plates, and wing profiles. The degree to which FIDAP accurately reproduces known experimental data on these shapes is described and the applicability of other Computational Fluid Dynamics packages is discussed. (13 tables, 2 figures, 38 refs.)

  19. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    Science.gov (United States)

    Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...

  20. Evaluation of condylar positions in patients with temporomandibular disorders: A cone-beam computed tomography study

    Energy Technology Data Exchange (ETDEWEB)

    Imanimoghaddam, Mahrokh; Mahdavi, Pirooze; Bagherpour, Ali; Darijani, Mansoreh; Ebrahimnejad, Hamed [Dept. of Oral and Maxillofacial Radiology, Oral and Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of); Madani, Azam Sadat [Dept. of Oral and Maxillofacial Radiology, Oral and Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2016-06-15

    This study was performed to compare the condylar position in patients with temporomandibular joint disorders (TMDs) and a normal group by using cone-beam computed tomography (CBCT). In the TMD group, 25 patients (5 men and 20 women) were randomly selected among the ones suffering from TMD according to the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD). The control group consisted of 25 patients (8 men and 17 women) with normal temporomandibular joints (TMJs) who were referred to the radiology department in order to undergo CBCT scanning for implant treatment in the posterior maxilla. Linear measurements from the superior, anterior, and posterior joint spaces between the condyle and glenoid fossa were made through defined landmarks in the sagittal view. The inclination of articular eminence was also determined. The mean anterior joint space was 2.3 mm in the normal group and 2.8 mm in the TMD group. The results showed that there was a significant correlation between the superior and posterior joint spaces in both the normal and TMD groups, but it was only in the TMD group that the correlation coefficient among the dimensions of anterior and superior spaces was significant. There was a significant correlation between the inclination of articular eminence and the size of the superior and posterior spaces in the normal group. The average dimension of the anterior joint space was different between the two groups. CBCT could be considered a useful diagnostic imaging modality for TMD patients.

  1. Computational design and characterization of a temperature-sensitive plasmid replicon for gram positive thermophiles

    Directory of Open Access Journals (Sweden)

    Olson Daniel G

    2012-05-01

    Full Text Available Abstract Background Temperature-sensitive (Ts) plasmids are useful tools for genetic engineering, but there are currently none compatible with the gram-positive, thermophilic, obligate anaerobe Clostridium thermocellum. Traditional mutagenesis techniques yield Ts mutants at a low frequency, and therefore require the development of high-throughput screening protocols, which are also not available for this organism. Recently there has been progress in the development of computer algorithms which can predict Ts mutations. Most plasmids currently used for genetic modification of C. thermocellum are based on the replicon of plasmid pNW33N, which replicates using the RepB replication protein. To address this problem, we set out to create a Ts plasmid by mutating the gene coding for the RepB replication protein using an algorithm designed by Varadarajan et al. (1996) for predicting Ts mutants based on the amino-acid sequence of the protein. Results A library of 34 mutant plasmids was designed, synthesized and screened, resulting in 6 mutants which exhibited a Ts phenotype. Of these 6, the one with the most temperature-sensitive phenotype (M166A) was compared with the original plasmid. It exhibited lower stability at 48°C and was completely unable to replicate at 55°C. Conclusions The plasmid described in this work could be useful in future efforts to genetically engineer C. thermocellum, and the method used to generate this plasmid may be useful for others trying to make Ts plasmids.

  2. Evaluation of condylar positions in patients with temporomandibular disorders: A cone-beam computed tomography study

    International Nuclear Information System (INIS)

    Imanimoghaddam, Mahrokh; Mahdavi, Pirooze; Bagherpour, Ali; Darijani, Mansoreh; Ebrahimnejad, Hamed; Madani, Azam Sadat

    2016-01-01

    This study was performed to compare the condylar position in patients with temporomandibular joint disorders (TMDs) and a normal group by using cone-beam computed tomography (CBCT). In the TMD group, 25 patients (5 men and 20 women) were randomly selected among the ones suffering from TMD according to the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD). The control group consisted of 25 patients (8 men and 17 women) with normal temporomandibular joints (TMJs) who were referred to the radiology department in order to undergo CBCT scanning for implant treatment in the posterior maxilla. Linear measurements from the superior, anterior, and posterior joint spaces between the condyle and glenoid fossa were made through defined landmarks in the sagittal view. The inclination of articular eminence was also determined. The mean anterior joint space was 2.3 mm in the normal group and 2.8 mm in the TMD group. The results showed that there was a significant correlation between the superior and posterior joint spaces in both the normal and TMD groups, but it was only in the TMD group that the correlation coefficient among the dimensions of anterior and superior spaces was significant. There was a significant correlation between the inclination of articular eminence and the size of the superior and posterior spaces in the normal group. The average dimension of the anterior joint space was different between the two groups. CBCT could be considered a useful diagnostic imaging modality for TMD patients.

  3. USA Hire Testing Platform

    Data.gov (United States)

    Office of Personnel Management — The USA Hire Testing Platform delivers tests used in hiring for positions in the Federal Government. To safeguard the integrity of the hiring processes and ensure...

  4. Stabilisation problem in biaxial platform

    Directory of Open Access Journals (Sweden)

    Lindner Tymoteusz

    2016-12-01

    Full Text Available The article describes an investigation of the rolling-ball stabilization problem on a biaxial platform. The aim of the control system proposed here is to stabilize a ball moving on a plane at the equilibrium point. The authors proposed a control algorithm based on cascade PID and compared it with another control method. The article shows the results of the accuracy of ball stabilization and the influence of the applied filter on the signal waveform. The application used to detect the ball position, measured by a digital camera, was written using EmguCV, a cross-platform .NET wrapper for the OpenCV image processing library. The authors used a bipolar stepper motor with a dedicated electronic controller. Data between the computer and the designed controller are sent using the RS232 standard. The control stand is based on an ATmega-series microcontroller.

  5. Stabilisation problem in biaxial platform

    Science.gov (United States)

    Lindner, Tymoteusz; Rybarczyk, Dominik; Wyrwał, Daniel

    2016-12-01

    The article describes an investigation of the rolling-ball stabilization problem on a biaxial platform. The aim of the control system proposed here is to stabilize a ball moving on a plane at the equilibrium point. The authors proposed a control algorithm based on cascade PID and compared it with another control method. The article shows the results of the accuracy of ball stabilization and the influence of the applied filter on the signal waveform. The application used to detect the ball position, measured by a digital camera, was written using EmguCV, a cross-platform .NET wrapper for the OpenCV image processing library. The authors used a bipolar stepper motor with a dedicated electronic controller. Data between the computer and the designed controller are sent using the RS232 standard. The control stand is based on an ATmega-series microcontroller.
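
    The cascade structure described in both records can be sketched as two nested PID loops: an outer loop on the measured ball position that outputs a platform-tilt setpoint, and an inner loop that tracks that tilt. The sketch below uses hypothetical gains and a crude plant model purely to show the wiring, not the authors' tuned controller.

      # Cascade PID for ball-on-platform (one axis): outer loop turns ball-position
      # error into a tilt setpoint; inner loop drives the tilt toward it.
      class PID:
          def __init__(self, kp, ki, kd, dt):
              self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
              self.integral, self.prev_error = 0.0, None

          def step(self, error):
              self.integral += error * self.dt
              derivative = 0.0 if self.prev_error is None else \
                  (error - self.prev_error) / self.dt
              self.prev_error = error
              return self.kp * error + self.ki * self.integral + self.kd * derivative

      dt = 0.01
      outer = PID(kp=1.2, ki=0.0, kd=0.8, dt=dt)    # ball position -> tilt setpoint (rad)
      inner = PID(kp=20.0, ki=5.0, kd=0.5, dt=dt)   # tilt error -> tilt-rate command

      g, ball_pos, ball_vel, tilt = 9.81, 0.10, 0.0, 0.0   # ball starts 10 cm off centre
      for _ in range(2000):
          tilt_setpoint = outer.step(0.0 - ball_pos)
          tilt += inner.step(tilt_setpoint - tilt) * dt    # crude first-order actuator
          ball_vel += g * tilt * dt                        # positive tilt pushes ball +x
          ball_pos += ball_vel * dt
      print(ball_pos)                                      # settles near zero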

  6. The Effect of In-Service Training of Computer Science Teachers on Scratch Programming Language Skills Using an Electronic Learning Platform on Programming Skills and the Attitudes towards Teaching Programming

    Science.gov (United States)

    Alkaria, Ahmed; Alhassan, Riyadh

    2017-01-01

    This study was conducted to examine the effect of in-service training of computer science teachers in Scratch language using an electronic learning platform on acquiring programming skills and attitudes towards teaching programming. The sample of this study consisted of 40 middle school computer science teachers. They were assigned into two…

  7. Use of personal computers for Gothic arch tracing: analysis and evaluation of horizontal mandibular positions with edentulous prosthesis.

    Science.gov (United States)

    Watanabe, Y

    1999-11-01

    Determining mandibular position for an edentulous patient raises the question of whether to emphasize centric relation or muscular position. This challenge results from the lack of a convenient procedure for quantifying the horizontal mandibular position, which can be determined by a variety of methods. This study analyzed and evaluated the horizontal mandibular positions produced by different guidance systems. Twenty-six edentulous subjects with no clinical evidence of abnormality of temporomandibular disorder were selected. Horizontal position data for the mandible obtained by gothic arch tracing was loaded into a personal computer by setting the sensor portion of a digitizer into the oral cavity to serve as a miniature lightweight tracing board. By connecting this with a digitizer control circuit set in an extraoral location, each mandibular position was displayed in a distinguishable manner on a computer display in real time, then recorded and analyzed. The gothic arch apex and tapping point varied, depending on body position. In the supine position, the gothic arch apex and the tapping point were close to the mandibular position determined by bilateral manipulation. This system provides effective data concerning mandibular positions for fabrication of dentures.

  8. Label-Free Platform for MicroRNA Detection Based on the Fluorescence Quenching of Positively Charged Gold Nanoparticles to Silver Nanoclusters.

    Science.gov (United States)

    Miao, Xiangmin; Cheng, Zhiyuan; Ma, Haiyan; Li, Zongbing; Xue, Ning; Wang, Po

    2018-01-16

    A novel strategy was developed for microRNA-155 (miRNA-155) detection based on the fluorescence quenching of positively charged gold nanoparticles [(+)AuNPs] to Ag nanoclusters (AgNCs). In the designed system, DNA-stabilized Ag nanoclusters (DNA/AgNCs) were introduced as fluorescent probes, and DNA-RNA heteroduplexes were formed upon the addition of target miRNA-155. Meanwhile, the (+)AuNPs could be electrostatically adsorbed on the negatively charged single-stranded DNA (ssDNA) or DNA-RNA heteroduplexes to quench the fluorescence signal. In the presence of duplex-specific nuclease (DSN), DNA-RNA heteroduplexes became a substrate for the enzymatic hydrolysis of the DNA strand to yield a fluorescence signal due to the diffusion of AgNCs away from (+)AuNPs. Under the optimal conditions, (+)AuNPs displayed very high quenching efficiency to AgNCs, which paved the way for ultrasensitive detection with a low detection limit of 33.4 fM. In particular, the present strategy demonstrated excellent specificity and selectivity toward the detection of target miRNA against control miRNAs, including mutated miRNA-155, miRNA-21, miRNA-141, let-7a, and miRNA-182. Moreover, the practical application value of the system was confirmed by the evaluation of the expression levels of miRNA-155 in clinical serum samples with satisfactory results, suggesting that the proposed sensing platform is promising for applications in disease diagnosis as well as the fundamental research of biochemistry.

  9. Study on High Performance of MPI-Based Parallel FDTD from WorkStation to Super Computer Platform

    Directory of Open Access Journals (Sweden)

    Z. L. He

    2012-01-01

    Full Text Available The parallel FDTD method is applied to analyze electromagnetic problems of electrically large targets on a supercomputer. It is well known that the more processors are used, the less computing time is consumed. Nevertheless, with the same number of processors, computing efficiency is affected by the scheme of the MPI virtual topology. The influence of different virtual topology schemes on the parallel performance of parallel FDTD is therefore studied in detail, and general rules are presented on how to obtain the highest efficiency of the parallel FDTD algorithm by optimizing the MPI virtual topology. To show the validity of the presented method, several numerical results are given in the latter part. Various comparisons are made and some useful conclusions are summarized.
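
    The MPI virtual topology the record refers to can be set up as a Cartesian communicator whose dimensions determine how the FDTD grid is split among processes. A minimal mpi4py sketch follows; the grid split and neighbour lookup shown here are exactly the choices whose scheme affects parallel efficiency. The script name in the comment is illustrative.

      # Cartesian MPI virtual topology for a 2-D domain-decomposed grid
      # (run with e.g. mpiexec -n 12 python topo.py).
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      dims = MPI.Compute_dims(comm.Get_size(), [0, 0])   # e.g. 12 ranks -> [4, 3]
      cart = comm.Create_cart(dims, periods=[False, False], reorder=True)
      coords = cart.Get_coords(cart.Get_rank())

      # Neighbours for halo exchange of field components along each axis
      left, right = cart.Shift(direction=0, disp=1)
      down, up = cart.Shift(direction=1, disp=1)
      print(f"rank {cart.Get_rank()} at {coords}: x-neighbours {left}, {right}")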

  10. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code portability on different architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVidia GPUs and the Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
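
    A minimal pyopencl sketch of the portability idea behind SeisCL is shown below: the same kernel source is compiled and run on every available device (CPU, GPU or accelerator). The kernel is a toy centred first difference, not the viscoelastic solver itself, and is offered only as an illustration.

      # One kernel source, many devices: the essence of OpenCL portability.
      import numpy as np
      import pyopencl as cl

      SRC = """
      __kernel void diff1(__global const float *f, __global float *out, const float dx) {
          int i = get_global_id(0);
          int n = get_global_size(0);
          if (i > 0 && i < n - 1)
              out[i] = (f[i + 1] - f[i - 1]) / (2.0f * dx);
      }
      """

      x = np.linspace(0, 2 * np.pi, 1024).astype(np.float32)
      f = np.sin(x)
      for platform in cl.get_platforms():          # every OpenCL driver on the machine
          for device in platform.get_devices():    # CPUs, GPUs, accelerators alike
              ctx = cl.Context([device])
              queue = cl.CommandQueue(ctx)
              mf = cl.mem_flags
              f_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=f)
              out = np.zeros_like(f)
              out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, out.nbytes)
              prog = cl.Program(ctx, SRC).build()  # same source, device-specific binary
              prog.diff1(queue, f.shape, None, f_buf, out_buf, np.float32(x[1] - x[0]))
              cl.enqueue_copy(queue, out, out_buf)
              # interior points should match the analytic derivative cos(x)
              print(device.name, np.max(np.abs(out[1:-1] - np.cos(x)[1:-1])))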

  11. Payment Platform

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper...... applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulate a multilevel conceptual...

  12. Computing platform to aid in decision making on energy management projects of the ELETROBRAS; Plataforma computacional para auxilio na tomada de decisao em projetos de gestao energetica da ELETROBRAS

    Energy Technology Data Exchange (ETDEWEB)

    Assis, T.B.; Rosa, R.B.V.; Pinto, D.P.; Casagrande, C.G. [Universidade Federal de Juiz de Fora, MG (Brazil). Lab. de Eficiencia Energetica], Emails: tbassis@yahoo.com.br, tatobrasil@yahoo.com.br, casagrandejf@yahoo.com.br, danilo.pinto@ufjf.edu.br; Martins, C.C.; Cantarino, M. [Centrais Eletricas Brasileiras S.A. (ELETROBRAS), Rio de Janeiro, RJ (Brazil). Div. de Eficiencia Energetica em Edificacoes], Emails: cmartin@eletrobras.com, marcelo.cantarino@eletrobras.com

    2009-07-01

    A new tool developed by the Energy Efficiency Laboratory (LEENER) of the Federal University of Juiz de Fora (UFJF) is presented: the SP³ platform, the Planning System for Public Buildings. This platform, when completed, will help Centrais Eletricas Brasileiras S.A. (ELETROBRAS) meet the demand for energy efficiency projects for public buildings, standardizing data in order to accelerate the approval process and the monitoring of a larger number of projects. This article discusses the stages of the platform's development, the management methodology used, and the goals and outcomes discussed with the members of PROCEL working on this project.

  13. Proposed Use of the NASA Ames Nebula Cloud Computing Platform for Numerical Weather Prediction and the Distribution of High Resolution Satellite Imagery

    Science.gov (United States)

    Limaye, Ashutosh S.; Molthan, Andrew L.; Srikishen, Jayanthi

    2010-01-01

    The development of the Nebula Cloud Computing Platform at NASA Ames Research Center provides an open-source solution for the deployment of scalable computing and storage capabilities relevant to the execution of real-time weather forecasts and the distribution of high resolution satellite data to the operational weather community. Two projects at Marshall Space Flight Center may benefit from use of the Nebula system. The NASA Short-term Prediction Research and Transition (SPoRT) Center facilitates the use of unique NASA satellite data and research capabilities in the operational weather community by providing datasets relevant to numerical weather prediction, and satellite data sets useful in weather analysis. SERVIR provides satellite data products for decision support, emphasizing environmental threats such as wildfires, floods, landslides, and other hazards, with interests in numerical weather prediction in support of disaster response. The Weather Research and Forecast (WRF) model Environmental Modeling System (WRF-EMS) has been configured for Nebula cloud computing use via the creation of a disk image and deployment of repeated instances. Given the available infrastructure within Nebula and the "infrastructure as a service" concept, the system appears well-suited for the rapid deployment of additional forecast models over different domains, in response to real-time research applications or disaster response. Future investigations into Nebula capabilities will focus on the development of a web mapping server and load balancing configuration to support the distribution of high resolution satellite data sets to users within the National Weather Service and international partners of SERVIR.

  14. Computer simulation in initial teacher education: a bridge across the faculty/practice divide or simply a better viewing platform?

    OpenAIRE

    Lowe, Graham

    2011-01-01

    This thesis reports on a mixed methods research project into the emerging area of computer simulation in Initial Teacher Education (ITE). Some areas where simulation has become a staple of initial or ongoing education and training, i.e. in health care and military applications, are examined to provide a context. The research explores the attitudes of a group of ITE students towards the use of a recently developed simulation tool and in particular considers the question of whether they view co...

  15. Molecular simulation workflows as parallel algorithms: the execution engine of Copernicus, a distributed high-performance computing platform.

    Science.gov (United States)

    Pronk, Sander; Pouya, Iman; Lundborg, Magnus; Rotskoff, Grant; Wesén, Björn; Kasson, Peter M; Lindahl, Erik

    2015-06-09

    Computational chemistry and other simulation fields are critically dependent on computing resources, but few problems scale efficiently to the hundreds of thousands of processors available in current supercomputers, particularly for molecular dynamics. This has turned into a bottleneck as new hardware generations primarily provide more processing units rather than making individual units much faster, which simulation applications are addressing by increasingly focusing on sampling with algorithms such as free-energy perturbation, Markov state modeling, metadynamics, or milestoning. All these rely on combining results from multiple simulations into a single observation. They are potentially powerful approaches that aim to predict experimental observables directly, but this comes at the expense of added complexity in selecting sampling strategies and keeping track of dozens to thousands of simulations and their dependencies. Here, we describe how the distributed execution framework Copernicus allows the expression of such algorithms in generic workflows: dataflow programs. Because dataflow algorithms explicitly state the dependencies of each constituent part, algorithms only need to be described on a conceptual level, after which the execution is maximally parallel. The fully automated execution facilitates the optimization of these algorithms with adaptive sampling, where undersampled regions are automatically detected and targeted without user intervention. We show how several such algorithms can be formulated for computational chemistry problems, and how they are executed efficiently with many loosely coupled simulations using either distributed or parallel resources with Copernicus.
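
    A toy version of the dataflow idea is sketched below, under the assumption that stand-in functions replace real simulations: tasks with no mutual dependencies run in parallel, and an aggregation step fires once all of its declared inputs resolve. Copernicus does this across distributed resources; the sketch uses a local thread pool.

      # Dataflow in miniature: independent "runs" execute in parallel; the
      # combining step depends on all of them and therefore runs last.
      from concurrent.futures import ThreadPoolExecutor

      def simulate(seed):
          """Stand-in for one MD trajectory; returns a scalar 'sample'."""
          x = seed
          for _ in range(10_000):
              x = (x * 9301 + 49297) % 233280   # toy linear congruential walk
          return x / 233280.0

      def combine(samples):
          """Stand-in for the analysis that merges runs into one observable."""
          return sum(samples) / len(samples)

      with ThreadPoolExecutor() as pool:
          runs = [pool.submit(simulate, seed) for seed in range(8)]
          observable = combine([r.result() for r in runs])
      print(observable)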

  16. Parallel statistical image reconstruction for cone-beam x-ray CT on a shared memory computation platform

    International Nuclear Information System (INIS)

    Kole, J S; Beekman, F J

    2005-01-01

    Statistical reconstruction methods offer possibilities of improving image quality as compared to analytical methods, but current reconstruction times prohibit routine clinical applications. To reduce reconstruction times, we have parallelized a statistical reconstruction algorithm for cone-beam x-ray CT, the ordered subset convex algorithm (OSC), and evaluated it on a shared memory computer. Two different parallelization strategies were developed: one that employs parallelism by computing the work for all projections within a subset in parallel, and one that divides the total volume into parts and processes the work for each sub-volume in parallel. Both methods are used to reconstruct a three-dimensional mathematical phantom on two different grid densities. The reconstructed images are binary identical to the result of the serial (non-parallelized) algorithm. The speed-up factor equals approximately 30 when using 32 to 40 processors, and scales almost linearly with the number of CPUs for both methods. The huge reduction in computation time allows us to apply statistical reconstruction to clinically relevant studies for the first time.

  17. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

    Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory and storage. The cloud provides unlimited computation power, memory, storage and, especially, opportunities for collaboration. Cloud-enabled robots are divided into two categories: standalone and networked robots. This article surveys cloud robotics platforms and standalone and networked robotic work such as grasping, simultaneous localization and mapping (SLAM) and monitoring.

  18. The impact of reorienting cone-beam computed tomographic images in varied head positions on the coordinates of anatomical landmarks

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Hun; Jeong, Ho Gul; Hwang, Jae Joon; Lee, Jung Hee; Han, Sang Sun [Dept. of Oral and Maxillofacial Radiology, Yonsei University, College of Dentistry, Seoul (Korea, Republic of)

    2016-06-15

    The aim of this study was to compare the coordinates of anatomical landmarks on cone-beam computed tomographic (CBCT) images in varied head positions before and after reorientation using image analysis software. CBCT images were taken in a normal position and four varied head positions using a dry skull marked with 3 points where gutta percha was fixed. In each of the five radiographic images, reference points were set, 20 anatomical landmarks were identified, and each set of coordinates was calculated. Coordinates in the images from the normally positioned head were compared with those in the images obtained from varied head positions using statistical methods. Post-reorientation coordinates calculated using a three-dimensional image analysis program were also compared to the reference coordinates. In the original images, statistically significant differences were found between coordinates in the normal-position and varied-position images. However, post-reorientation, no statistically significant differences were found between coordinates in the normal-position and varied-position images. The changes in head position impacted the coordinates of the anatomical landmarks in three-dimensional images. However, reorientation using image analysis software allowed accurate superimposition onto the reference positions.

  19. The Study of Multifunction External Fixator Based on Stewart Platform

    Directory of Open Access Journals (Sweden)

    Guo Yue

    2015-01-01

    Full Text Available The article develops a model of bone deformities that allows the 6-DOF parallel mechanism to be widely applied to the correction of deformities. The direct kinematic solution of the platform gives the posture of the motion platform. The malformation can be measured by X-ray, and the final posture of the motion platform can then be found through a spatial coordinate transformation. For the inverse kinematic solution of the platform, the paper gives a fast computational procedure that drives the six actuators to realize the required motion. For computer-assisted fracture reduction, we produced an application interface.
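
    The inverse kinematic solution mentioned above reduces to a closed-form computation: for a desired platform position p and orientation R, each actuator length is the distance between its fixed base attachment and its rotated, translated platform attachment. The sketch below uses hypothetical hexapod geometry, not the fixator's actual dimensions.

      # Stewart platform inverse kinematics: leg i length = |p + R b_i - a_i|
      # for base anchors a_i and platform anchors b_i (geometry in mm, illustrative).
      import numpy as np

      def rotation_z(theta):
          c, s = np.cos(theta), np.sin(theta)
          return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

      angles = np.deg2rad(np.arange(0, 360, 60))
      base = np.stack([80 * np.cos(angles), 80 * np.sin(angles), np.zeros(6)], axis=1)
      plat = np.stack([50 * np.cos(angles + 0.3), 50 * np.sin(angles + 0.3),
                       np.zeros(6)], axis=1)

      def leg_lengths(p, R):
          """Actuator lengths for platform position p and orientation R."""
          return np.linalg.norm(p + plat @ R.T - base, axis=1)

      print(leg_lengths(np.array([0.0, 0.0, 120.0]), rotation_z(np.deg2rad(5))))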

  20. BioWires: Conductive DNA Nanowires in a Computationally-Optimized, Synthetic Biological Platform for Nanoelectronic Fabrication

    Science.gov (United States)

    Vecchioni, Simon; Toomey, Emily; Capece, Mark C.; Rothschild, Lynn; Wind, Shalom

    2017-01-01

    DNA is an ideal template for a biological nanowire: it has a linear structure several atoms thick; it possesses addressable nucleobase geometry that can be precisely defined; and it is massively scalable into branched networks. Until now, the drawback of DNA as a conducting nanowire has been, simply put, its low conductance. To address this deficiency, we extensively characterize a chemical variant of canonical DNA that exploits the affinity of natural cytosine bases for silver ions. We successfully construct chains of single silver ions inside double-stranded DNA, confirm the basic dC-Ag+-dC bond geometry and kinetics, and show length-tunability dependent on mismatch distribution, ion availability and enzyme activity. An analysis of the absorbance spectra of natural DNA and silver-binding, poly-cytosine DNA demonstrates the heightened thermostability of the ion chain and its resistance to aqueous stresses such as precipitation, dialysis and forced reduction. These chemically critical traits lend themselves to an increase in electrical conductivity of over an order of magnitude for 11-base silver-paired duplexes over natural strands when assayed by STM break junction. We further construct and implement a genetic pathway in the E. coli bacterium for the biosynthesis of highly ionizable DNA sequences. Toward future circuits, we construct a model of transcription network architectures to determine the most efficient and robust connectivity for cell-based fabrication, and we perform sequence optimization with a genetic algorithm to identify oligonucleotides robust to changes in the base-pairing energy landscape. We propose that this system will serve as a synthetic biological fabrication platform for more complex DNA nanotechnology and nanoelectronics with applications to deep space and low resource environments.

  1. Efficiency Analysis of the access method with the cascading Bloom filter to the data warehouse on the parallel computing platform

    Science.gov (United States)

    Grigoriev, Yu A.; Proletarskaya, V. A.; Ermakov, E. Yu; Ermakov, O. Yu

    2017-10-01

    A new method using a cascading Bloom filter (CBF) was developed for executing SQL queries in the Apache Spark parallel computing environment. It includes the representation of the original query in the form of several subqueries, the development of a connection graph and the transformation of subqueries, the definition of the connections where it is necessary to use Bloom filters, and the representation of the graph in terms of Spark. Using query Q3 of the TPC-H benchmark as an example, full-scale experiments were carried out, which confirmed the effectiveness of the developed method.
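
    The role of a Bloom filter in such a plan can be sketched as follows: build a compact filter over the join keys produced by one subquery, then use it to discard non-matching rows of the other subquery before the expensive join. In Spark the filter would be broadcast to the workers; this toy version runs locally, and the filter sizes and sample rows are hypothetical.

      # A simple Bloom filter used to pre-filter join candidates.
      import hashlib

      class BloomFilter:
          def __init__(self, n_bits=1 << 16, n_hashes=4):
              self.n_bits, self.n_hashes = n_bits, n_hashes
              self.bits = bytearray(n_bits // 8)

          def _positions(self, key):
              for i in range(self.n_hashes):
                  h = hashlib.sha256(f"{i}:{key}".encode()).digest()
                  yield int.from_bytes(h[:8], "big") % self.n_bits

          def add(self, key):
              for pos in self._positions(key):
                  self.bits[pos // 8] |= 1 << (pos % 8)

          def might_contain(self, key):
              return all(self.bits[pos // 8] & (1 << (pos % 8))
                         for pos in self._positions(key))

      orders = [(1, "1995-03-15"), (7, "1995-02-01")]        # (custkey, orderdate)
      bf = BloomFilter()
      for custkey, _ in orders:
          bf.add(custkey)

      customers = [(1, "BUILDING"), (2, "AUTOMOBILE"), (7, "BUILDING")]
      survivors = [c for c in customers if bf.might_contain(c[0])]  # cheap pre-filter
      print(survivors)   # rows that may join; false positives possible, no false negatives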

  2. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    Science.gov (United States)

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  3. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    This research paper presents an initial attempt to introduce and explain the emergence of new phenomenon, which we refer to as platform constellations. Functioning as highly modular systems, the platform constellations are collections of highly connected platforms which co-exist in parallel and a......’ acquisition and users’ engagement rates as well as unlock new sources of value creation and diversify revenue streams....

  4. Computational Studies of Positive and Negative Streamers in Bubbles Suspended in Distilled Water

    KAUST Repository

    Sharma, Ashish; Levko, Dmitry; Raja, Laxminarayan L.

    2017-01-01

    We perform computational studies of nanosecond streamers generated in helium bubbles immersed in distilled water under high pressure conditions. The model takes into account the presence of water vapor in the gas bubble for an accurate description

  5. Final technical position on documentation of computer codes for high-level waste management

    International Nuclear Information System (INIS)

    Silling, S.A.

    1983-06-01

    Guidance is given for the content of documentation of computer codes which are used in support of a license application for high-level waste disposal. The guidelines cover theoretical basis, programming, and instructions for use of the code

  6. Diagnosing coronary artery disease after a positive coronary computed tomography angiography

    DEFF Research Database (Denmark)

    Nissen, L; Winther, S; Westra, J

    2018-01-01

    Aims: Perfusion scans after coronary computed tomography angiography (CCTA) in patients with suspected coronary artery disease (CAD) may reduce unnecessary invasive coronary angiographies (ICAs). However, the diagnostic accuracy of perfusion scans after primary CCTA is unknown. The aim...

  7. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Symons, Christopher T [ORNL

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  8. Associating Drugs, Targets and Clinical Outcomes into an Integrated Network Affords a New Platform for Computer-Aided Drug Repurposing

    DEFF Research Database (Denmark)

    Oprea, Tudor; Nielsen, Sonny Kim; Ursu, Oleg

    2011-01-01

    benefit from an integrated, semantic-web compliant computer-aided drug repurposing (CADR) effort, one that would enable deep data mining of associations between approved drugs (D), targets (T), clinical outcomes (CO) and SE. We report preliminary results from text mining and multivariate statistics, based...... on 7684 approved drug labels, ADL (Dailymed) via text mining. From the ADL corresponding to 988 unique drugs, the "adverse reactions" section was mapped onto 174 SE, then clustered via principal component analysis into a 5 x 5 self-organizing map that was integrated into a Cytoscape network of SE......Finding new uses for old drugs is a strategy embraced by the pharmaceutical industry, with increasing participation from the academic sector. Drug repurposing efforts focus on identifying novel modes of action, but not in a systematic manner. With intensive data mining and curation, we aim to apply...

  9. Wireless sensor platform

    Science.gov (United States)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.
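
    The spread-spectrum encoding step can be illustrated with a direct-sequence toy example, in which each data bit is multiplied by a shared pseudo-noise chip sequence and recovered by correlation at the receiver; the chip length and noise level below are assumptions for illustration, not the patented design.

      # Direct-sequence spread spectrum in miniature: spread bits with a PN
      # sequence, add channel noise, recover bits by correlation.
      import numpy as np

      rng = np.random.default_rng(42)
      chips = rng.choice([-1, 1], size=16)            # shared PN sequence (16 chips/bit)

      def spread(bits):
          symbols = np.where(np.asarray(bits) == 1, 1, -1)
          return np.concatenate([b * chips for b in symbols])

      def despread(signal):
          frames = signal.reshape(-1, chips.size)
          return (frames @ chips > 0).astype(int)     # correlate, then threshold

      tx = spread([1, 0, 1, 1])
      noisy = tx + rng.normal(0, 0.8, tx.shape)       # channel noise
      print(despread(noisy))                          # -> [1 0 1 1] with high probability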

  10. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2011-01-01

    The Windows Azure Platform has rapidly established itself as one of the most sophisticated cloud computing platforms available. With Microsoft working to continually update their product and keep it at the cutting edge, the future looks bright - if you have the skills to harness it. In particular, new features such as remote desktop access, dynamic content caching and secure content delivery using SSL make the latest version of Azure a more powerful solution than ever before. It's widely agreed that cloud computing has produced a paradigm shift in traditional architectural concepts by providin

  11. The hackable city : Citymaking in a platform society

    NARCIS (Netherlands)

    de Waal, Martijn; de Lange, Michiel; Bouw, Matthijs

    2017-01-01

    Can computer hacking have positive parallels in the shaping of the built environment? The Hackable City research project was set up with this question in mind, to investigate the potential of digital platforms to open up the citymaking process. Its cofounders Martijn de Waal, Michiel de Lange and

  12. The BioFragment Database (BFDb): An open-data platform for computational chemistry analysis of noncovalent interactions

    Science.gov (United States)

    Burns, Lori A.; Faver, John C.; Zheng, Zheng; Marshall, Michael S.; Smith, Daniel G. A.; Vanommeslaeghe, Kenno; MacKerell, Alexander D.; Merz, Kenneth M.; Sherrill, C. David

    2017-10-01

    Accurate potential energy models are necessary for reliable atomistic simulations of chemical phenomena. In the realm of biomolecular modeling, large systems like proteins comprise very many noncovalent interactions (NCIs) that can contribute to the protein's stability and structure. This work presents two high-quality chemical databases of common fragment interactions in biomolecular systems as extracted from high-resolution Protein DataBank crystal structures: 3380 sidechain-sidechain interactions and 100 backbone-backbone interactions that inaugurate the BioFragment Database (BFDb). Absolute interaction energies are generated with a computationally tractable explicitly correlated coupled cluster with perturbative triples [CCSD(T)-F12] "silver standard" (0.05 kcal/mol average error) for NCI that demands only a fraction of the cost of the conventional "gold standard," CCSD(T) at the complete basis set limit. By sampling extensively from biological environments, BFDb spans the natural diversity of protein NCI motifs and orientations. In addition to supplying a thorough assessment for lower scaling force-field (2), semi-empirical (3), density functional (244), and wavefunction (45) methods (comprising >1M interaction energies), BFDb provides interactive tools for running and manipulating the resulting large datasets and offers a valuable resource for potential energy model development and validation.

  13. GENESIS 1.1: A hybrid-parallel molecular dynamics simulator with enhanced sampling algorithms on multiple computational platforms.

    Science.gov (United States)

    Kobayashi, Chigusa; Jung, Jaewoon; Matsunaga, Yasuhiro; Mori, Takaharu; Ando, Tadashi; Tamura, Koichi; Kamiya, Motoshi; Sugita, Yuji

    2017-09-30

    GENeralized-Ensemble SImulation System (GENESIS) is a software package for molecular dynamics (MD) simulation of biological systems. It is designed to extend limitations in system size and accessible time scale by adopting highly parallelized schemes and enhanced conformational sampling algorithms. In this new version, GENESIS 1.1, new functions and advanced algorithms have been added. The all-atom and coarse-grained potential energy functions used in AMBER and GROMACS packages now become available in addition to CHARMM energy functions. The performance of MD simulations has been greatly improved by further optimization, multiple time-step integration, and hybrid (CPU + GPU) computing. The string method and replica-exchange umbrella sampling with flexible collective variable choice are used for finding the minimum free-energy pathway and obtaining free-energy profiles for conformational changes of a macromolecule. These new features increase the usefulness and power of GENESIS for modeling and simulation in biological research. © 2017 Wiley Periodicals, Inc.

  14. A mangrove forest map of China in 2015: Analysis of time series Landsat 7/8 and Sentinel-1A imagery in Google Earth Engine cloud computing platform

    Science.gov (United States)

    Chen, Bangqian; Xiao, Xiangming; Li, Xiangping; Pan, Lianghao; Doughty, Russell; Ma, Jun; Dong, Jinwei; Qin, Yuanwei; Zhao, Bin; Wu, Zhixiang; Sun, Rui; Lan, Guoyu; Xie, Guishui; Clinton, Nicholas; Giri, Chandra

    2017-09-01

    Due to rapid losses of mangrove forests caused by anthropogenic disturbances and climate change, accurate and contemporary maps of mangrove forests are needed to understand how mangrove ecosystems are changing and establish plans for sustainable management. In this study, a new classification algorithm was developed using the biophysical characteristics of mangrove forests in China. More specifically, these forests were mapped by identifying: (1) greenness, canopy coverage, and tidal inundation from time series Landsat data, and (2) elevation, slope, and an intersection-with-sea criterion. The annual mean Normalized Difference Vegetation Index (NDVI) was found to be a key variable in determining the classification thresholds of greenness, canopy coverage, and tidal inundation of mangrove forests, which are greatly affected by tide dynamics. In addition, the integration of the Sentinel-1A VH band and the modified Normalized Difference Water Index (mNDWI) shows great potential in identifying yearlong tidal and fresh water bodies, which is related to mangrove forests. This algorithm was developed using 6 typical Regions of Interest (ROIs) as algorithm training and was run on the Google Earth Engine (GEE) cloud computing platform to process 1941 Landsat images (25 Path/Row) and 586 Sentinel-1A images circa 2015. The resultant mangrove forest map of China at 30 m spatial resolution has overall, user's, and producer's accuracies greater than 95% when validated with ground reference data. In 2015, China's mangrove forests had a total area of 20,303 ha, about 92% of which was in the Guangxi Zhuang Autonomous Region, Guangdong, and Hainan Provinces. This study has demonstrated the potential of using the GEE platform and time series Landsat and Sentinel-1A SAR images to identify and map mangrove forests along the coastal zones. The resultant mangrove forest maps are likely to be useful for the sustainable management and ecological assessments of mangrove forests in China.
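
    The annual-mean-NDVI step of the algorithm might look roughly like the Earth Engine Python sketch below; the collection ID, band names, test point and threshold value are illustrative assumptions, not the paper's calibrated values.

      # Annual mean NDVI from Landsat 8 surface reflectance in Earth Engine.
      import ee

      ee.Initialize()   # requires prior Earth Engine authentication

      def add_ndvi(image):
          # Landsat 8 SR: B5 = near-infrared, B4 = red
          return image.addBands(
              image.normalizedDifference(["B5", "B4"]).rename("NDVI"))

      l8 = (ee.ImageCollection("LANDSAT/LC08/C01/T1_SR")
            .filterDate("2015-01-01", "2016-01-01")
            .filterBounds(ee.Geometry.Point(109.76, 21.47))  # point on the Guangxi coast
            .map(add_ndvi))

      annual_mean_ndvi = l8.select("NDVI").mean()
      evergreen_mask = annual_mean_ndvi.gt(0.5)   # hypothetical greenness threshold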

  15. Computer-Aided Detection in Digital Mammography: False-Positive Marks and Their Reproducibility in Negative Mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min; Seong, Min Hyun

    2009-01-01

    Background: There are relatively few studies reporting the frequency of false-positive computer-aided detection (CAD) marks and their reproducibility in normal cases. Purpose: To evaluate retrospectively the false-positive mark rate of a CAD system and the reproducibility of false-positive marks in two sets of negative digital mammograms. Material and Methods: Two sets of negative digital mammograms were obtained in 360 women (mean age 57 years, range 30-76 years) with an approximate interval of 1 year (mean time 343.7 days), and a CAD system was applied. False-positive CAD marks and the reproducibility were determined. Results: Of the 360 patients, 252 (70.0%) and 240 (66.7%) patients had 1-7 CAD marks on the initial and second mammograms, respectively. The false-positive CAD mark rate was 1.5 (1.1 for masses and 0.4 for calcifications) and 1.4 (1.0 for masses and 0.4 for calcifications) per examination in the initial and second mammograms, respectively. The reproducibility of the false-positive CAD marks was 12.0% for both mass (81/680) and microcalcification (33/278) marks. Conclusion: False-positive CAD marks were seen in approximately 70% of normal cases. However, the reproducibility was very low. Radiologists must be familiar with the findings of false-positive CAD marks, since they are very common and can increase the recall rate in screening

  16. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gough, Sean T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-05

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  17. Positive intraluminal bowel contrast on computed tomography following oral ingestion of Kayexelate

    International Nuclear Information System (INIS)

    Zissin, R.; Stackievicz, R.; Osadchy, A.; Gayer, G.

    2008-01-01

    Our study presents the computed tomography (CT) manifestations of orally ingested Kayexelate (a powdered form of sodium polystyrene sulphonate) used to treat hyperkalemia. Five patients in whom Kayexelate appeared as high-attenuating intraluminal enteric content, similar to oral contrast material or leakage of intravascular contrast, are reported. Radiologists should be familiar with its appearance as it may mimic oral or vascular contrast within the gastrointestinal tract, a finding that may lead to diagnostic error or misinterpretation. (author)

  18. Gastrointestinal diseases in HIV-positive patients: ultrasonography and computed tomography in a study of 85 patients

    International Nuclear Information System (INIS)

    Garcia, S.; Yague, D.; Garcia, C.; Villalon, M.; Pascual, A.; Artigas, J.M.

    1998-01-01

    Gastrointestinal diseases constitute the second most common group of conditions affecting HIV-positive patients, after respiratory diseases. Gastrointestinal involvement may even be the first sign of the disease, a fact which demonstrates the importance of its proper assessment. To demonstrate the utility of computed tomography and ultrasound in the study of gastrointestinal and hepatobiliary diseases in the HIV-positive patient, we review a series of 85 HIV-positive patients presenting gastrointestinal symptomatology who underwent ultrasonography and/or computed tomography. The definitive diagnosis was achieved in all the patients by microbiological or histopathological means. In our series, 36.4% of the patients had presented systemic TB, 23.52% CMV infection, 17.64% Cryptosporidium infection and 17.64% MAI infection. Much lower incidences were found for Mycobacterium xenopi, M. kansasii and Leishmania infection. The presence of lymphoma was confirmed in 7.05% of the patients and Kaposi's sarcoma in 0.95%. In these patients, the most common finding on imaging studies is lymph node involvement, followed by diffuse hepatosplenomegaly. Imaging techniques, especially ultrasonography and computed tomography, are useful in these patients: although they do not provide the diagnosis, they do contribute data of prognostic and therapeutic importance. (Author) 11 refs.

  19. Analysis of the root position of the maxillary incisors in the alveolar bone using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Yun Hoa; Cho, Bong Hee [Dept. of Oral and Maxillofacial Radiology, School of Dentistry, Pusan National University, Yangsan (Korea, Republic of); Hwang, Jae Jun [Dept. of Oral and Maxillofacial Radiology, Yonsei University College of Dentistry, Seoul (Korea, Republic of)

    2017-09-15

    The purpose of this study was to measure the buccal bone thickness and angulation of the maxillary incisors and to analyze the correlation between these parameters and the root position in the alveolar bone using cone-beam computed tomography (CBCT). CBCT images of 398 maxillary central and lateral incisors from 199 patients were retrospectively reviewed. The root position in the alveolar bone was classified as buccal, middle, or palatal, and the buccal type was further classified into subtypes I, II, and III. In addition, the buccolingual inclination of the tooth and buccal bone thickness were evaluated. A majority of the maxillary incisors were positioned more buccally within the alveolar bone, and only 2 lateral incisors (0.5%) were positioned more palatally. The angulation of buccal subtype III was the greatest and that of the middle type was the lowest. Most of the maxillary incisors exhibited a thin facial bone wall, and the lateral incisors had a significantly thinner buccal bone than the central incisors. The buccal bone of buccal subtypes II and III was significantly thinner than that of buccal subtype I. A majority of the maxillary incisor roots were positioned close to the buccal cortical plate and had a thin buccal bone wall. Significant relationships were observed between the root position in the alveolar bone, the angulation of the tooth in the alveolar bone, and buccal bone thickness. CBCT analyses of the buccal bone and sagittal root position are recommended for the selection of the appropriate treatment approach.

  20. Differences in abdominal organ movement between supine and prone positions measured using four-dimensional computed tomography

    International Nuclear Information System (INIS)

    Kim, Young Seok; Park, Sung Ho; Ahn, Seung Do; Lee, Jeong Eun; Choi, Eun Kyung; Lee, Sang-wook; Shin, Seong Soo; Yoon, Sang Min; Kim, Jong Hoon

    2007-01-01

    Background and purpose: To analyze the differences in intrafractional organ movement throughout the breathing cycles between the supine and prone positions using four-dimensional computed tomography (4D CT). Materials and methods: We performed 4D CT on nine volunteers in the supine and prone positions, with each examinee asked to breathe normally during scanning. The movement of abdominal organs in the cranio-caudal (CC), anterior-posterior (AP) and right-left (RL) directions was quantified by contouring on each phase between inspiration and expiration. Results: The mean intrafractional motions of the hepatic dome, lower tip, pancreatic head and tail, both kidneys, spleen, and celiac axis in the supine/prone position were 17.3/13.0, 14.4/11.0, 12.8/8.9, 13.0/10.0, 14.3/12.1, 12.3/12.6, 11.7/12.6 and 2.2/1.8 mm, respectively. Intrafractional movements of the liver dome and pancreatic head were reduced significantly in the prone position. The CC directional excursions were major determinants of the 3D displacements of the abdominal organs. Alteration from the supine to the prone position did not change the amount of intrafractional movements of kidneys, spleen, and celiac axis. Conclusion: There was a significant reduction in the movements of the liver and pancreas during the prone position, especially in the CC direction, suggesting possible advantage of radiotherapy to these organs in this position

  1. HPC - Platforms Penta Chart

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, Angelina Michelle [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-08

    Strategy, planning, acquiring: very large-scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL and connected to scalable storage via large-scale storage networking, assuring correct and secure operations. Management and utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  2. Motivating Students through Positive Learning Experiences: A Comparison of Three Learning Designs for Computer Programming Courses

    Science.gov (United States)

    Lykke, Marianne; Coto, Mayela; Jantzen, Christian; Mora, Sonia; Vandel, Niels

    2015-01-01

    Based on the assumption that wellbeing, positive emotions and engagement influence motivation for learning, the aim of this paper is to provide insight into students' emotional responses to and engagement in different learning designs. By comparing students' reports on the experiential qualities of three different learning designs, their…

  3. Computational Model of a Positive BDNF Feedback Loop in Hippocampal Neurons Following Inhibitory Avoidance Training

    Science.gov (United States)

    Zhang, Yili; Smolen, Paul; Alberini, Cristina M.; Baxter, Douglas A.; Byrne, John H.

    2016-01-01

    Inhibitory avoidance (IA) training in rodents initiates a molecular cascade within hippocampal neurons. This cascade contributes to the transition of short- to long-term memory (i.e., consolidation). Here, a differential equation-based model was developed to describe a positive feedback loop within this molecular cascade. The feedback loop begins…
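
    The published equations are not reproduced in this record, but the idea it names, a differential-equation positive feedback loop that can latch into a persistent "consolidated" state after a transient training input, can be sketched compactly. The species, Hill form, and rate constants below are illustrative assumptions, not the model of Zhang et al.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative two-variable positive feedback loop (not the published model):
# a growth factor drives kinase activation, and active kinase drives further
# synthesis of the growth factor through a steep Hill term, giving bistability.
def feedback(t, y, k_syn=1.0, k_act=1.0, k_deg=0.5, K=1.0, n=4, basal=0.05):
    bdnf, kinase = y
    dbdnf = k_syn * kinase**n / (K**n + kinase**n) - k_deg * bdnf + basal
    dkinase = k_act * bdnf - k_deg * kinase
    return [dbdnf, dkinase]

# A weak vs a strong "training" pulse, mimicked by the initial level: the loop
# either relaxes back (no consolidation) or latches on high (consolidation).
for y0 in ([0.2, 0.0], [1.5, 0.0]):
    sol = solve_ivp(feedback, (0.0, 50.0), y0)
    print(y0, "->", sol.y[:, -1].round(2))
```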

  4. Simple technique to achieve a natural position of the head for cone beam computed tomography

    NARCIS (Netherlands)

    Damstra, Janalt; Fourie, Zacharias; Ren, Yijin

    We developed a modified laser level technique to record the natural position of the head in all three planes of space. This is a simple method for use with three-dimensional images and may be valuable in routine craniofacial assessment.

  5. Radiological and micro-computed tomography analysis of the bone at dental implants inserted 2, 3 and 4 mm apart in a minipig model with platform switching incorporated.

    Science.gov (United States)

    Elian, Nicolas; Bloom, Mitchell; Dard, Michel; Cho, Sang-Choon; Trushkowsky, Richard D; Tarnow, Dennis

    2014-02-01

    The purpose of this study was to assess the effect of inter-implant distance on interproximal bone utilizing platform switching. Analysis of interproximal bone usually depends on traditional two-dimensional radiographic assessment. Although the reliability of current techniques has increased, they cannot track bone level changes over time and in three dimensions. Micro-CT provides three-dimensional imaging that can be used in conjunction with traditional two-dimensional radiographic techniques. This study was performed on 24 female minipigs. Twelve animals received three implants with an inter-implant distance of 3 mm on one side of the mandible and another three implants on the contra-lateral side, where the implants were placed 2 mm apart, creating a split-mouth design. The other twelve animals received three implants with an inter-implant distance of 3 mm on one side of the mandible and another three implants on the contra-lateral side, where the implants were placed 4 mm apart, likewise creating a split-mouth design. The quantitative evaluation was performed comparatively on radiographs taken at t = 0 (immediately after implantation) and at t = 8 weeks (after termination). The samples were scanned by micro-computed tomography (μCT) to quantify the first bone-to-implant contact (fBIC) and bone volume/total volume (BV/TV). Mixed model regressions using the nonparametric Brunner-Langer method were used to determine the effect of inter-implant distance on the measured outcomes. The change in bone level was determined using radiography; its mean was 0.05 mm for an inter-implant distance of 3 mm and 0.00 mm for a distance of 2 mm (P = 0.7268), and 0.18 mm for the comparison of the 3 mm and 4 mm inter-implant distances (P = 0.9500). Micro-computed tomography showed that the fBIC was always located above the reference: 0.27 and 0.20 mm for the comparison of 2-3 mm (P = 0.4622) and 0.49 and 0.34 mm for the inter-implant distances of 3 and 4 mm (P

  6. Iterative algorithms for computing the feedback Nash equilibrium point for positive systems

    Science.gov (United States)

    Ivanov, I.; Imsland, Lars; Bogdanova, B.

    2017-03-01

    The paper studies N-player linear quadratic differential games on an infinite time horizon with deterministic feedback information structure. It introduces two iterative methods (the Newton method as well as its accelerated modification) in order to compute the stabilising solution of a set of generalised algebraic Riccati equations. The latter is related to the Nash equilibrium point of the considered game model. Moreover, we derive the sufficient conditions for convergence of the proposed methods. Finally, we discuss two numerical examples so as to illustrate the performance of both of the algorithms.
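
    The record names Newton-type iterations for a set of generalised algebraic Riccati equations. As a minimal, self-contained illustration, the sketch below applies the classic Newton-Kleinman iteration to the one-player special case, a single continuous-time algebraic Riccati equation, where each Newton step reduces to a Lyapunov solve; the test matrices are invented, and the coupled N-player equations of the paper are not reproduced here.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, solve

# Newton-Kleinman iteration for A'X + XA - XBR^{-1}B'X + Q = 0, the
# one-player special case of the coupled Riccati systems arising in
# LQ differential games.
def newton_riccati(A, B, Q, R, K0, tol=1e-10, max_iter=50):
    K = K0  # K0 must stabilize A - B @ K0
    for _ in range(max_iter):
        Acl = A - B @ K
        # Lyapunov step: Acl' X + X Acl = -(Q + K' R K)
        X = solve_continuous_lyapunov(Acl.T, -(Q + K.T @ R @ K))
        K_new = solve(R, B.T @ X)
        if np.linalg.norm(K_new - K) < tol:
            return X, K_new
        K = K_new
    return X, K

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # already stable, so K0 = 0 works
B = np.array([[0.0], [1.0]])
Q, R = np.eye(2), np.eye(1)
X, K = newton_riccati(A, B, Q, R, np.zeros((1, 2)))
residual = A.T @ X + X @ A - X @ B @ solve(R, B.T @ X) + Q
print(np.linalg.norm(residual))  # ~0 at convergence
```

    Each iteration is a linear solve and convergence is quadratic near the solution, which is why Newton schemes (and accelerated modifications of them) are attractive for this class of problem.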

  7. Position emission tomography with or without computed tomography in the primary staging of Hodgkin's lymphoma

    DEFF Research Database (Denmark)

    Hutchings, Martin; Loft, Annika; Hansen, Mads

    2006-01-01

    BACKGROUND AND OBJECTIVES: In order to receive the most appropriate therapy, patients with Hodgkin's lymphoma (HL) must be accurately stratified into different prognostic staging groups. Computed tomography (CT) plays a pivotal role in conventional staging. The aim of the present study … was to investigate the value of positron emission tomography using 2-[18F]fluoro-2-deoxy-D-glucose (FDG-PET) and combined FDG-PET/CT for the staging of HL patients, and the impact on the choice of treatment. DESIGN AND METHODS: Ninety-nine consecutive, prospectively included patients had FDG-PET and CT…

  8. The universal modular platform

    International Nuclear Information System (INIS)

    North, R.B.

    1995-01-01

    A new and patented design for offshore wellhead platforms has been developed to meet a 'fast track' requirement for increased offshore production from field locations not yet identified. The new design uses modular construction to allow for radical changes in the water depth of the final location, and assembly-line efficiency in fabrication. By utilizing high-strength steels and structural support from the well conductors, the new design accommodates all planned production requirements on a support structure significantly lighter and less expensive than the conventional design it replaces. Twenty-two platforms based on the new design were ready for installation within 18 months of the project start. Installation of the new platforms began in 1992 for drilling support and in 1993 for production support. The new design has become the Company standard for all future production platforms. Large savings in construction costs have been realized through its light weight, its flexibility in both positioning and water depth, and its modular construction

  9. Secondary markets for transmission rights in the North West European Market. Position Paper of the North West European Market Parties Platform

    International Nuclear Information System (INIS)

    Van Haaster, G.

    2006-06-01

    The most important way to acquire cross-border transmission rights in the North West European electricity market is through explicit auctions. However, market-driven flexibility, and therefore efficiency, can be further enhanced. One way to do this is to introduce a secondary market for transmission rights. In this paper the North West European Market Parties Platform (NWE MPP) proposes a model developed and preferred by the market parties. The paper is intended as a converging contribution to the congestion-management discussions in the North West European region

  10. Development of computation model on the GoldSim platform for the radionuclide transport in the geosphere with the time-dependent parameters

    International Nuclear Information System (INIS)

    Koo, Shigeru; Inagaki, Manabu

    2010-06-01

    In the high-level radioactive waste (HLW) disposal system, numerical evaluation of radionuclide transport with time-dependent parameters is necessary to evaluate various scenarios. In the H12 report, the numerical calculation codes MESHNOTE and TIGER were used for the evaluation of some natural phenomena scenarios that had to handle time-dependent parameters. In the future, the need to handle time-dependent parameters is expected to increase, and more efficient calculation and improved quality control of input/output parameters will be required. To meet this requirement, a radionuclide transport model has been developed on the GoldSim platform. GoldSim is a general-purpose simulation software package that was also used for the computational modeling of the Yucca Mountain Project. The conceptual model, the mathematical model and the verification of the GoldSim model are described in this report. In the future, the model described in this report can be extended to a perturbation scenario analysis model and to other conceptual models. (author)
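
    GoldSim itself is proprietary, but the modelling requirement the report addresses, transport equations whose coefficients vary in time, can be illustrated with a generic two-compartment sketch. The nuclide, rates, and sinusoidal time dependence below are hypothetical placeholders, not values from the report.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Generic illustration (not GoldSim): two-compartment radionuclide transport
# where the near-field-to-geosphere transfer rate is time dependent, the
# situation the report's model is built to handle.
LAMBDA = np.log(2) / 3.0e5        # decay constant, 1/yr (hypothetical nuclide)

def k(t):                          # time-dependent transfer rate, 1/yr
    return 1e-4 * (1.0 + 0.5 * np.sin(2 * np.pi * t / 1e5))

def rhs(t, y):
    near, geo = y
    return [-(LAMBDA + k(t)) * near, k(t) * near - LAMBDA * geo]

sol = solve_ivp(rhs, (0.0, 1e6), [1.0, 0.0], rtol=1e-8)
print(sol.y[:, -1])                # compartment inventories after 1 Myr
```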

  11. Comparison of Water, Mannitol and Positive Oral Contrast for Evaluation of Bowel By Computed Tomography

    Directory of Open Access Journals (Sweden)

    Padhmanaban Elamparidhi

    2017-10-01

    Introduction: The small bowel remains a challenging anatomical site. Imaging approaches like CT enterography help in diagnosing nonspecific clinical presentations, and imaging aids appropriate management. Bowel evaluation by CT therefore requires an oral contrast agent for diagnosing bowel pathology. Thus, a quantitative and qualitative analysis of three oral contrast agents (water, mannitol and positive contrast) was done to identify the ideal intraluminal contrast agent. Aim: To assess the performance of mannitol as an endoluminal contrast agent compared with water and positive contrast in the evaluation of the bowel, to compare the distention of the bowel with the different oral contrasts, and to assess the usefulness of bowel distension in the assessment of the mural enhancement pattern of the bowel. Materials and Methods: A comparative observational study was performed on 75 patients divided into three groups of 25 patients each. Patients in each group were given 1500 ml of oral contrast: Group I received mannitol, Group II water, and Group III positive contrast. Bowel distention at various levels and mural enhancement of the bowel were assessed. The Chi-square test was used as the test of significance for qualitative data; ANOVA (Analysis of Variance) was the test of significance for quantitative data. Results: Bowel distention was excellent with mannitol compared to water and positive contrast. Wall enhancement and mural pattern were better appreciated with mannitol than with the other two contrast agents. Conclusion: Adequate bowel evaluation by CT requires an oral contrast agent which causes maximal bowel distention and uniform intraluminal attenuation, increases contrast between intraluminal content and bowel wall, and produces no artifacts or adverse effects. Mannitol has all the above characteristics and can be used as an ideal neutral oral contrast agent.

  12. New Directions for Hardware-assisted Trusted Computing Policies (Position Paper)

    Science.gov (United States)

    Bratus, Sergey; Locasto, Michael E.; Ramaswamy, Ashwin; Smith, Sean W.

    The basic technological building blocks of the TCG architecture seem to be stabilizing. As a result, we believe that the focus of the Trusted Computing (TC) discipline must naturally shift from the design and implementation of the hardware root of trust (and the subsequent trust chain) to the higher-level application policies. Such policies must build on these primitives to express new sets of security goals. We highlight the relationship between enforcing these types of policies and debugging, since both activities establish the link between expected and actual application behavior. We argue that this new class of policies better fits developers' mental models of expected application behaviors, and we suggest a hardware design direction for enabling the efficient interpretation of such policies.

  13. Computers, coders, and voters: Comparing automated methods for estimating party positions

    DEFF Research Database (Denmark)

    Hjorth, F.; Klemmensen, R.; Hobolt, S.

    2015-01-01

    Assigning political actors positions in ideological space is a task of key importance to political scientists. In this paper we compare estimates obtained using the automated Wordscores and Wordfish techniques, along with estimates from voters and the Comparative Manifesto Project (CMP), against … texts and a more ideologically charged vocabulary in order to produce estimates comparable to Wordscores. The paper contributes to the literature on automated content analysis by providing a comprehensive test of convergent validation, in terms of both number of cases analyzed and number of validation…
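
    For readers unfamiliar with the techniques being compared, a minimal sketch of the Wordscores scoring step (Laver, Benoit and Garry) is given below; the toy reference texts and positions are invented for illustration, and the published technique's final rescaling of raw virgin-text scores is omitted.

```python
from collections import Counter

# Minimal Wordscores sketch: a word's score is the average of the reference
# positions weighted by how strongly the word points to each reference text;
# a virgin text's raw score is the frequency-weighted mean of its word scores.
def wordscores(ref_texts, ref_positions):
    freqs = [Counter(t.split()) for t in ref_texts]
    rel = [{w: c / sum(f.values()) for w, c in f.items()} for f in freqs]
    vocab = set().union(*(f.keys() for f in freqs))
    scores = {}
    for w in vocab:
        fw = [r.get(w, 0.0) for r in rel]
        total = sum(fw)
        scores[w] = sum(p / total * a for p, a in zip(fw, ref_positions))
    return scores

def score_text(text, scores):
    counts = Counter(w for w in text.split() if w in scores)
    n = sum(counts.values())
    return sum(c / n * scores[w] for w, c in counts.items())

S = wordscores(["tax cut market", "welfare state spending"], [-1.0, 1.0])
print(score_text("market tax welfare", S))  # raw score between -1 and 1
```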

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data-taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since the last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made in site readiness. It also reviewed the policy under which Tier-2 sites are associated with Physics Groups. Such associations are decided twice per ye...

  15. Simulation of personalised haemodynamics by various mounting positions of a prosthetic valve using computational fluid dynamics.

    Science.gov (United States)

    Bongert, Markus; Geller, Marius; Pennekamp, Werner; Nicolas, Volkmar

    2018-03-03

    Diseases of the cardiovascular system account for nearly 42% of all deaths in the European Union. In Germany, approximately 12,000 patients receive surgical replacement of the aortic valve due to heart valve disease alone each year. A three-dimensional (3D) numerical model based on patient-specific anatomy derived from four-dimensional (4D) magnetic resonance imaging (MRI) data was developed to investigate preoperatively the flow-induced impact of the mounting position of aortic prosthetic valves, in order to select the best orientation for individual patients. A systematic steady-state analysis of blood flow for different rotational mounting positions of the valve is only possible using a virtual patient model. A maximum velocity of 1 m/s was used as the inlet boundary condition, because the opening angle of the valve is at its largest at this velocity. For a comparative serial examination, it is important to define standardised general conditions so that no influence other than the rotated implantation of the prosthetic aortic valve is introduced. In this study, a uniform velocity profile at the inlet of the aortic valve and the real aortic anatomy were therefore used for all simulations. An iterative process, with the weighted parameters flow resistance (1), shear stress (2) and velocity (3), was used to determine the best rotated orientation. Blood flow was optimal at a 45° rotation from the standard implantation orientation, which also offers supply to the coronary arteries.

  16. An automatic colour-based computer vision algorithm for tracking the position of piglets

    Energy Technology Data Exchange (ETDEWEB)

    Navarro-Jover, J. M.; Alcaniz-Raya, M.; Gomez, V.; Balasch, S.; Moreno, J. R.; Grau-Colomer, V.; Torres, A.

    2009-07-01

    Artificial vision is a powerful observation tool for research in the field of livestock production. Based on the search for and recognition of colour spots in images, a digital image processing system was developed that detects the position of piglets in a farrowing pen. To this end, 24,000 images were captured over five takes (days), with a five-second interval between every other image. The nine piglets in a litter were each marked on their backs and sides with a different coloured spray paint, the colours chosen to be well separated in RGB space. The programme requires the user to introduce the colour patterns to be found, and the output is an ASCII file with the positions (column X, line Y) of each of these marks within the image analysed. This information may be extremely useful for further applications in the study of animal behaviour and welfare parameters (huddling, activity, suckling, etc.). The software initially segments the image in the RGB colour space to separate the colour marks from the rest of the image, and then recognises the colour patterns using another colour space [B/(R+G+B), (G-R), (B-G)] more suitable for this purpose. This additional colour space was obtained by testing different colour combinations derived from R, G and B. The statistical evaluation of the programme's performance revealed an overall piglet detection rate of 72.5%, with 89.1% of this total correctly detected. (Author) 33 refs.
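
    The two-stage colour pipeline described above (RGB segmentation, then pattern matching in the [B/(R+G+B), (G-R), (B-G)] space) can be sketched compactly. The pattern vector, distance threshold, normalisation by 255, and centroid output below are assumptions for illustration, not the published calibration.

```python
import numpy as np

# Map pixels from RGB into the [B/(R+G+B), G-R, B-G] space described above,
# then report the centroid (column X, line Y) of pixels near a given pattern.
def to_mark_space(rgb):
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    chroma = b / np.maximum(r + g + b, 1e-6)  # B/(R+G+B)
    return np.stack([chroma, (g - r) / 255.0, (b - g) / 255.0], axis=-1)

def find_mark(rgb, pattern, threshold=0.15):
    feat = to_mark_space(rgb)
    dist = np.linalg.norm(feat - np.asarray(pattern), axis=-1)
    ys, xs = np.nonzero(dist < threshold)
    if xs.size == 0:
        return None  # mark not visible in this frame
    return xs.mean(), ys.mean()  # (column X, line Y)

frame = np.zeros((120, 160, 3), np.uint8)
frame[40:50, 60:70] = (20, 200, 40)   # a synthetic green paint spot
print(find_mark(frame, pattern=to_mark_space(frame[45:46, 65:66])[0, 0]))
```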

  17. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2010-01-01

    The Azure Services Platform is a brand-new cloud-computing technology from Microsoft. It is composed of four core components-Windows Azure, .NET Services, SQL Services, and Live Services-each with a unique role in the functioning of your cloud service. It is the goal of this book to show you how to use these components, both separately and together, to build flawless cloud services. At its heart Windows Azure Platform is a down-to-earth, code-centric book. This book aims to show you precisely how the components are employed and to demonstrate the techniques and best practices you need to know

  18. Position of document holder and work related risk factors for neck pain among computer users: a narrative review.

    Science.gov (United States)

    Ambusam, S; Baharudin, O; Roslizawati, N; Leonard, J

    2015-01-01

    A document holder is used as a remedy for occupational neck pain among computer users. An understanding of the effects of the document holder, along with other work-related risk factors in the computer workstation, requires attention. This article reviews current knowledge on the optimal location of the document holder in computer use and the associated work-related factors that may contribute to neck pain. A literature search covering articles published from January 1990 to January 2014 was conducted in both the Science Direct and PubMed databases. Medical Subject Headings (MeSH) keywords for the search were neck muscle OR head posture OR muscle tension OR muscle activity OR work related disorders OR neck pain AND/OR document location OR document holder OR source document OR copy screen holder. A document holder placed lateral to the screen was most preferred to reduce neck discomfort among occupational typists; a document placed flat on the desk surface without a holder was least preferred. Head posture deviation and muscle activity increase when the document is placed flat on the surface compared to when it is placed on a document holder. Work-related factors such as static posture, repetitive movement, prolonged sitting and awkward positions were risk factors for chronic neck pain. This review highlights the optimal location of the document holder for computer users to reduce neck pain, and emphasizes the importance of work-related risk factors for neck pain in occupational typists.

  19. Surgical positioning of orthodontic mini-implants with guides fabricated on models replicated with cone-beam computed tomography.

    Science.gov (United States)

    Kim, Seong-Hun; Choi, Yong-Suk; Hwang, Eui-Hwan; Chung, Kyu-Rhim; Kook, Yoon-Ah; Nelson, Gerald

    2007-04-01

    This article illustrates a new surgical guide system that uses cone-beam computed tomography (CBCT) images to replicate dental models; surgical guides for the proper positioning of orthodontic mini-implants were fabricated on the replicas, and the guides were used for precise placement. The indications, efficacy, and possible complications of this method are discussed. Patients who were planning to have orthodontic mini-implant treatment were recruited for this study. A CBCT system (PSR 9000N, Asahi Roentgen, Kyoto, Japan) was used to acquire virtual slices of the posterior maxilla that were 0.1 to 0.15 mm thick. Color 3-dimensional rapid prototyping was used to differentiate teeth, alveolus, and maxillary sinus wall. A surgical guide for the mini-implant was fabricated on the replica model. Proper positioning for mini-implants on the posterior maxilla was determined by viewing the CBCT images. The surgical guide was placed on the clinical site, and it allowed precise pilot drilling and accurate placement of the mini-implant. CBCT imaging allows remarkably lower radiation doses and thinner acquisition slices compared with medical computed tomography. Virtually reproduced replica models enable precise planning for mini-implant positions in anatomically complex sites.

  20. The description of condyle position in disc displacement with reduction using Cone Beam Computed Tomography 3D radiographic analysis

    Directory of Open Access Journals (Sweden)

    Liana Rahmayani

    2009-07-01

    One of the most common temporomandibular joint disorders is disc displacement with reduction. Disc displacement that alters the condyle position can be evaluated using radiography. Cone Beam Computed Tomography (CBCT-3D) is a radiographic modality able to capture the condyle position from many directions. This research aimed to examine the condyle position in patients with symptoms of disc displacement with reduction. The study was conducted on 11 patients with symptoms of disc displacement with reduction and 3 patients without such symptoms as the comparison group. Each subject underwent radiographic imaging with CBCT-3D, followed by measurement of the joint space distance in the sagittal and coronal directions. The results were analyzed using the t-test, which showed a statistically significant difference (α = 0.05) between patients with symptoms of disc displacement with reduction and patients without symptoms, in both sagittal and coronal views. In conclusion, condyle position differs between patients with disc displacement with reduction and patients without symptoms, indicating a condylar displacement that alters the joint space distance in the sagittal and coronal directions.

  1. ESCADT: a FORTRAN code for computing the positions and areas of x-ray photoelectron spectral peaks

    International Nuclear Information System (INIS)

    Cox, L.E.

    1979-09-01

    Program ESCADT uses least-squares-derived convoluting numbers to smooth and differentiate x-ray photoelectron spectra. Peak maxima are located by finding zero crossings of the first derivative and refined using a cubic polynomial fitting procedure. Background points are located using the product of the absolute value of the first derivative and the smoothed ordinate value. Peak areas, using both linear and scattered-electron backgrounds, are computed. Spectra are corrected for changes in instrument sensitivity and energy calibration with gold-standard data retrieved from a disk file. Five determinations of the gold 4f peak positions yielded standard deviations of 0.011 and 0.031 eV for the 4f7/2 and 4f5/2 peaks, respectively. The relative standard deviation for the computed areas was 0.85%
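
    "Least-squares-derived convoluting numbers" is the original name for what are now called Savitzky-Golay coefficients, so the peak-location scheme can be sketched directly with SciPy. The window size, the baseline cut, and the synthetic doublet (placed at the nominal gold 4f positions) are illustrative assumptions, not ESCADT's actual parameters.

```python
import numpy as np
from scipy.signal import savgol_filter

# Savitzky-Golay convolution smooths the spectrum and forms its first
# derivative; maxima sit at negative-going zero crossings of the derivative,
# each refined with a local cubic polynomial fit, as in ESCADT.
def find_peaks(energy, counts, window=11, order=3):
    smooth = savgol_filter(counts, window, order)
    deriv = savgol_filter(counts, window, order, deriv=1)
    peaks = []
    for i in np.nonzero((deriv[:-1] > 0) & (deriv[1:] <= 0))[0]:
        if smooth[i] < 0.05 * smooth.max():
            continue  # ignore zero crossings from baseline noise
        lo, hi = max(i - 3, 0), min(i + 4, len(energy))
        c = np.polyfit(energy[lo:hi], smooth[lo:hi], 3)
        roots = np.roots(np.polyder(c))          # stationary points of the cubic
        roots = roots[np.isreal(roots)].real
        ok = roots[(roots >= energy[lo]) & (roots <= energy[hi - 1])]
        if ok.size:
            peaks.append(ok[np.argmax(np.polyval(c, ok))])
    return peaks

x = np.linspace(80.0, 95.0, 301)
y = 500 * np.exp(-((x - 84.0) / 0.5) ** 2) + 300 * np.exp(-((x - 87.7) / 0.5) ** 2)
print(find_peaks(x, y + np.random.default_rng(0).normal(0, 5, x.size)))
```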

  2. [The P300 based brain-computer interface: effect of stimulus position in a stimulus train].

    Science.gov (United States)

    Ganin, I P; Shishkin, S L; Kochetova, A G; Kaplan, A Ia

    2012-01-01

    The P300 brain-computer interface (BCI) is currently the most efficient BCI. This interface is based on detection of the P300 wave of the brain potentials evoked when a symbol related to the intended input is highlighted. To increase the operation speed of the P300 BCI, the number of stimulus repetitions must be reduced. This reduction increases the relative contribution to input-symbol detection of the reaction to the first target stimulus. It is known that the event-related potentials (ERP) to the first stimulus presentations can differ from the ERP to stimuli presented later. In particular, the amplitude of responses to the first stimulus presentations is often increased, which is beneficial for their recognition by the BCI. However, this effect had not been studied within the BCI framework. The current study examined the ERP obtained from healthy participants (n = 14) in the standard P300 BCI paradigm using 10 trials, as well as in a modified P300 BCI with stimuli presented on moving objects in triple-trial (n = 6) and single-trial (n = 6) stimulation modes. Increased ERP amplitude was observed in response to the first target stimuli in both conditions, as well as in the single-trial mode compared with the triple-trial mode. We discuss the prospects of using the specific features of the ERP to first stimuli and the single-trial ERP for optimizing high-speed modes in P300 BCIs.

  3. Understanding the connection between epigenetic DNA methylation and nucleosome positioning from computer simulations.

    Directory of Open Access Journals (Sweden)

    Guillem Portella

    Cytosine methylation is one of the most important epigenetic marks that regulate the process of gene expression. Here, we have examined the effect of epigenetic DNA methylation on nucleosomal stability using molecular dynamics simulations and elastic deformation models. We found that methylation of CpG steps destabilizes nucleosomes, especially when these are placed in sites where the DNA minor groove faces the histone core. The larger stiffness of methylated CpG steps is a crucial factor behind the decrease in nucleosome stability. Methylation changes the positioning and phasing of the nucleosomal DNA, altering the accessibility of DNA to regulatory proteins, and accordingly gene functionality. Our theoretical calculations highlight a simple physical-based explanation on the foundations of epigenetic signaling.

  4. Submersible offshore platform. Nedsenkbar fralandsplatform

    Energy Technology Data Exchange (ETDEWEB)

    Einstabland, T.B.; Olsen, O.

    1988-12-19

    The invention concerns a submersible offshore platform of the gravity type, designed for installation on the sea bed at great depths. The platform consists of at least three inclined pillars placed on a foundation unit. At their upper ends the pillars are connected to a tower structure by means of a rigid construction. The tower supports the platform deck. The rigid construction comprises a centrally positioned cylinder connected to the foundation. 11 figs.

  5. Computer aided analysis of additional chromosome aberrations in Philadelphia chromosome positive acute lymphoblastic leukaemia using a simplified computer readable cytogenetic notation

    Directory of Open Access Journals (Sweden)

    Mohr Brigitte

    2003-01-01

    Abstract Background The analysis of complex cytogenetic databases of distinct leukaemia entities may help to detect rare recurring chromosome aberrations, minimal common regions of gains and losses, and also hot spots of genomic rearrangements. The patterns of the karyotype alterations may provide insights into the genetic pathways of disease progression. Results We developed a simplified computer-readable cytogenetic notation (SCCN) by which chromosome findings are normalised at a resolution of 400 bands. Lost or gained chromosomes or chromosome segments are specified in detail, and ranges of chromosome breakpoint assignments are recorded. Software modules were written to summarise the recorded chromosome changes with regard to the respective chromosome involvement. To assess the degree of karyotype alteration, the ploidy levels and the numbers of numerical and structural changes were recorded separately and summarised in a complex karyotype aberration score (CKAS). The SCCN and CKAS were used to analyse the extent and spectrum of additional chromosome aberrations in 94 patients with Philadelphia chromosome-positive (Ph-positive) acute lymphoblastic leukemia (ALL) and secondary chromosome anomalies. Dosage changes of chromosomal material represented 92.1% of all additional events. Recurring regions of chromosome losses were identified. Structural rearrangements affecting (peri)centromeric chromosome regions were recorded in 24.6% of the cases. Conclusions SCCN and CKAS provide unifying elements between karyotypes and computer-processable data formats. They proved to be useful in the investigation of additional chromosome aberrations in Ph-positive ALL, and may represent a step towards full automation of the analysis of large and complex karyotype databases.
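
    The SCCN encoding and the CKAS weighting are defined in the paper and not reproduced in this record. Purely to illustrate the idea of counting numerical and structural changes separately before combining them into a score, here is a hypothetical sketch over ISCN-like karyotype strings; the token format, regular expression, and unweighted sum are all assumptions.

```python
import re

# Hypothetical CKAS-style summary: numerical changes are gained/lost
# chromosomes (+/- tokens); structural changes are rearrangement tokens.
STRUCTURAL = re.compile(r"^(t|del|inv|der|dup|ins|i)\(")

def aberration_score(karyotype):
    tokens = [tok.strip() for tok in karyotype.split(",")[2:]]  # skip count + sex
    numerical = sum(tok.startswith(("+", "-")) for tok in tokens)
    structural = sum(bool(STRUCTURAL.match(tok)) for tok in tokens)
    return {"numerical": numerical, "structural": structural,
            "score": numerical + structural}

print(aberration_score("47,XX,t(9;22)(q34;q11),+8,del(7)(p12)"))
```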

  6. Single-Photon Computed Tomography With Large Position-Sensitive Phototubes*

    Science.gov (United States)

    Feldmann, John; Ranck, Amoreena; Saunders, Robert S.; Welsh, Robert E.; Bradley, Eric L.; Saha, Margaret S.; Kross, Brian; Majewski, Stan; Popov, Vladimir; Weisenberger, Andrew G.; Wojcik, Randolph

    2000-10-01

    Position-sensitive photomultiplier tubes (PSPMTs) coupled to pixelated CsI(Tl) scintillators have been used with parallel-hole collimators to view the metabolism in small animals of radiopharmaceuticals tagged with ^125I. We report here our preliminary results analyzed using a tomography program^1 written in the IDL programming language. The PSPMTs are mounted on a rotating gantry so as to view the subject animal from any azimuth. Preliminary results to test the tomography algorithm have been obtained by placing a variety of plastic mouse-brain phantoms (loaded with Na^125I) in front of one of the detectors and rotating the phantom in steps through 360 degrees. Results of this simulation taken with a variety of collimator hole sizes will be compared and discussed. Extensions of this technique to the use of very small PSPMTs (Hamamatsu M-64), which are capable of a very close approach to those parts of the animal of greatest interest, will be described. *Supported in part by The Department of Energy, The National Science Foundation, The American Diabetes Association, The Howard Hughes Foundation and The Jeffress Trust. 1. Tomography algorithm kindly provided by Dr. S. Meikle of The Royal Prince Albert Hospital, Sydney, Australia

  7. Prevalence of the subclinical sinus disease in HIV positive patients evaluated by the computed tomography versus a control population

    International Nuclear Information System (INIS)

    Senneville, E.; Valette, M.; Ajana, F.; Gerard, Y.; Alfandari, S.; Chidiac, C.; Mouton, Y.

    1997-01-01

    To determine the prevalence of subclinical sinus disease in patients infected with the human immunodeficiency virus (HIV), cerebral computed tomography (CCT) scans done at the Tourcoing hospital over an 18-month period in 139 HIV-positive patients and 140 control patients without evidence of active sinus disease were reviewed retrospectively. CCTs were evaluated independently by two physicians who were blinded to the clinical data. Sinus abnormalities were more common in the HIV-positive patients (20/139, 14.4%) than in the controls (8/140, 5.7%) (p=0.016). Mucosal thickening was the most common abnormality in both groups. CD4+ cell counts were not correlated with the radiographic abnormalities studied. These radiographic data suggest that subclinical chronic sinusitis, independent of the degree of immune deficiency, may be more common in HIV-positive than in HIV-negative subjects. (author)

  8. Patient Position Verification and Corrective Evaluation Using Cone Beam Computed Tomography (CBCT) in Intensity modulated Radiation Therapy

    International Nuclear Information System (INIS)

    Do, Gyeong Min; Jeong, Deok Yang; Kim, Young Bum

    2009-01-01

    Cone beam computed tomography (CBCT) using an on-board imager (OBI) can check movement and setup error in patient position and target volume by comparison with the computed simulation treatment image in real time during patient treatment. This study therefore aimed to check changes and movement in patient position and target volume using CBCT in IMRT, to calculate the difference from the treatment plan, to correct the position using an automated match system, to test the accuracy of position correction using an electronic portal imaging device (EPID), and to examine the usefulness of CBCT in IMRT and the accuracy of the automatic match system. The subjects were 3 head-and-neck patients and 1 pelvis patient sampled from the IMRT patients treated in our hospital. In order to investigate movement of the treatment position and the resultant displacement of the irradiated volume, we acquired CBCT using the OBI mounted on the linear accelerator. Before each IMRT treatment, we acquired CBCT and checked the difference from the treatment plan, coordinate by coordinate, by comparing it with the CT simulation image. We then made corrections through the 3D/3D automatic match system to match the treatment plan, and verified and evaluated the result using the electronic portal imaging device. When CBCT was compared with the CT simulation image before treatment, the average difference by coordinate in the head and neck was 0.99 mm vertically, 1.14 mm longitudinally, 4.91 mm laterally, and 1.07 degrees in the rotational direction, showing modest differences by region. In testing after correction, when the image from the electronic portal imaging device was compared with the DRR image, it was found that correction had been made accurately, with error less than 0.5 mm. By comparing a CBCT image before treatment with a 3D image reconstructed into a volume, instead of a 2D image, for the patient's setup error and changes in the position of the organs and the target, we could measure and

  9. One positive impact of health care reform to physicians: the computer-based patient record.

    Science.gov (United States)

    England, S P

    1993-11-01

    The health care industry is an information-dependent business that will require a new generation of health information systems if successful health care reform is to occur. We critically need integrated clinical management information systems to support the physician and related clinicians at the direct care level, which in turn will have linkages with secondary users of health information such as health payors, regulators, and researchers. The economic dependence of the health care industry on the CPR cannot be underestimated, says Jeffrey Ritter. He sees the U.S. health industry as about to enter a bold new age where our records are electronic, our computers are interconnected, and our money is nothing but pulses running across the telephone lines. Hence the United States is now in an age of electronic commerce. Clinical systems reform must begin with the community-based patient chart, which is located in the physician's office, the hospital, and other related health care provider offices. A community-based CPR and CPR system that integrates all providers within a managed care network is the most logical step, since all health information begins with the creation of a patient record. Once a community-based CPR system is in place, the physician and his or her clinical associates will have a common patient record to which all direct providers have access to input and record patient information. Once a community-level CPR system is in place with a community provider network, each physician will have available health information and data-processing capability that will finally provide real savings in professional time and effort. Lost patient charts will no longer be a problem. Data input and storage of health information would occur electronically via transcribed text, voice, and document imaging. All electronic clinical information, voice, and graphics could be recalled at any time and transmitted to any terminal location within the health provider network. Hence

  10. Satellite Remote Sensing of Cropland Characteristics in 30m Resolution: The First North American Continental-Scale Classification on High Performance Computing Platforms

    Science.gov (United States)

    Massey, Richard

    Cropland characteristics and accurate maps of their spatial distribution are required to develop strategies for global food security through continental-scale assessments and agricultural land use policies. North America is the major producer and exporter of coarse grains, wheat, and other crops. While cropland characteristics such as crop types are available at the country scale in North America, continental-scale cropland products at a sufficiently fine resolution, such as 30 m, are lacking. In addition, automated, open, and rapid methods to map cropland characteristics over large areas without the need for ground samples are needed on efficient high-performance computing platforms for timely and long-term cropland monitoring. In this study, I developed novel, automated, and open methods to map cropland extent, crop intensity, and crop types across the North American continent using large remote sensing datasets on high-performance computing platforms. First, a novel method was developed to fuse pixel-based classification of continental-scale Landsat data, using the Random Forest algorithm available on the Google Earth Engine cloud computing platform, with an object-based classification approach, recursive hierarchical segmentation (RHSeg), to map cropland extent at the continental scale. Using this fusion method, a continental-scale cropland extent map for North America at 30 m spatial resolution for the nominal year 2010 was produced. In this map, the total cropland area for North America was estimated at 275.2 million hectares (Mha). The map was assessed for accuracy using randomly distributed samples derived from the United States Department of Agriculture (USDA) cropland data layer (CDL), the Agriculture and Agri-Food Canada (AAFC) annual crop inventory (ACI), Servicio de Informacion Agroalimentaria y Pesquera (SIAP) data on Mexico's agricultural boundaries, and photo-interpretation of high-resolution imagery. The overall accuracies of the map are 93.4% with a
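
    The pixel-based half of the workflow, a Random Forest classifier trained on Landsat data in Earth Engine, can be sketched with the current Earth Engine Python API (an assumption; the study predates parts of it). The collection ID, band selection, training-table asset, and class property below are placeholders, and the RHSeg object-based fusion step is not shown.

```python
import ee  # Google Earth Engine Python API; assumes an authenticated session
ee.Initialize()

# Annual Landsat 5 surface-reflectance composite for the nominal year 2010.
composite = (ee.ImageCollection("LANDSAT/LT05/C02/T1_L2")
             .filterDate("2010-01-01", "2010-12-31")
             .median()
             .select(["SR_B1", "SR_B2", "SR_B3", "SR_B4", "SR_B5", "SR_B7"]))

# Sample the composite at labeled points (hypothetical asset and property).
training = composite.sampleRegions(
    collection=ee.FeatureCollection("users/example/cropland_training"),
    properties=["cropland"],   # 1 = cropland, 0 = non-cropland
    scale=30)

classifier = ee.Classifier.smileRandomForest(numberOfTrees=100).train(
    features=training, classProperty="cropland",
    inputProperties=composite.bandNames())

cropland_map = composite.classify(classifier)  # 30 m cropland extent layer
```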

  11. Verification of computed tomographic estimates of cochlear implant array position: a micro-CT and histologic analysis.

    Science.gov (United States)

    Teymouri, Jessica; Hullar, Timothy E; Holden, Timothy A; Chole, Richard A

    2011-08-01

    To determine the efficacy of clinical computed tomographic (CT) imaging to verify postoperative electrode array placement in cochlear implant (CI) patients. Nine fresh cadaver heads underwent clinical CT scanning, followed by bilateral CI insertion and postoperative clinical CT scanning. Temporal bones were removed, trimmed, and scanned using micro-CT. Specimens were then dehydrated, embedded in either methyl methacrylate or LR White resin, and sectioned with a diamond wafering saw. Histology sections were examined by 3 blinded observers to determine the position of individual electrodes relative to soft tissue structures within the cochlea. Electrodes were judged to be within the scala tympani, scala vestibuli, or in an intermediate position between scalae. The position of the array could be estimated accurately from clinical CT scans in all specimens using micro-CT and histology as a criterion standard. Verification using micro-CT yielded 97% agreement, and histologic analysis revealed 95% agreement with clinical CT results. A composite, 3-dimensional image derived from a patient's preoperative and postoperative CT images using a clinical scanner accurately estimates the position of the electrode array as determined by micro-CT imaging and histologic analyses. Information obtained using the CT method provides valuable insight into numerous variables of interest to patient performance such as surgical technique, array design, and processor programming and troubleshooting.

  12. Computational methods and modeling. 1. Sampling a Position Uniformly in a Trilinear Hexahedral Volume

    International Nuclear Information System (INIS)

    Urbatsch, Todd J.; Evans, Thomas M.; Hughes, H. Grady

    2001-01-01

    Monte Carlo particle transport plays an important role in some multi-physics simulations. These simulations, which may additionally involve deterministic calculations, typically use a hexahedral or tetrahedral mesh. Trilinear hexahedrons are attractive for physics calculations because faces between cells are uniquely defined, distance-to-boundary calculations are deterministic, and hexahedral meshes tend to require fewer cells than tetrahedral meshes. We discuss one aspect of Monte Carlo transport: sampling a position in a tri-linear hexahedron, which is made up of eight control points, or nodes, and six bilinear faces, where each face is defined by four non-coplanar nodes in three-dimensional Cartesian space. We derive, code, and verify the exact sampling method and propose an approximation to it. Our proposed approximate method uses about one-third the memory and can be twice as fast as the exact sampling method, but we find that its inaccuracy limits its use to well-behaved hexahedrons. Daunted by the expense of the exact method, we propose an alternate approximate sampling method. First, calculate beforehand an approximate volume for each corner of the hexahedron by taking one-eighth of the volume of an imaginary parallelepiped defined by the corner node and the three nodes to which it is directly connected. For the sampling, assume separability in the parameters, and sample each parameter, in turn, from a linear pdf defined by the sum of the four corner volumes at each limit (-1 and 1) of the parameter. This method ignores the quadratic portion of the pdf, but it requires less storage, has simpler sampling, and needs no extra, on-the-fly calculations. We simplify verification by designing tests that consist of one or more cells that entirely fill a unit cube. Uniformly sampling complicated cells that fill a unit cube will result in uniformly sampling the unit cube. Unit cubes are easily analyzed. The first problem has four wedges (or tents, or A frames) whose
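
    The alternate approximate scheme is concrete enough to sketch: per-corner volumes estimated as one-eighth of the parallelepiped spanned by the three edges at each corner, then each natural coordinate drawn independently from a linear pdf whose endpoint weights are the summed corner volumes on each face. The node/sign convention on [-1, 1]^3 below is the usual trilinear one, assumed here.

```python
import numpy as np

SIGNS = np.array([[i, j, k] for k in (-1, 1) for j in (-1, 1) for i in (-1, 1)])

def corner_volumes(nodes):
    # 1/8 of the parallelepiped spanned by the three edges at each corner.
    vols = np.empty(8)
    for c, s in enumerate(SIGNS):
        edges = []
        for axis in range(3):
            t = s.copy(); t[axis] *= -1          # neighbor across this edge
            n = np.flatnonzero((SIGNS == t).all(axis=1))[0]
            edges.append(nodes[n] - nodes[c])
        vols[c] = abs(np.linalg.det(np.array(edges))) / 8.0
    return vols

def sample_linear(w_minus, w_plus, rng):
    # Density proportional to w_minus at t = -1 and w_plus at t = +1.
    if rng.random() < w_minus / (w_minus + w_plus):
        return 1.0 - 2.0 * np.sqrt(rng.random())
    return 2.0 * np.sqrt(rng.random()) - 1.0

def sample_position(nodes, rng=np.random.default_rng()):
    vols = corner_volumes(nodes)
    uvw = [sample_linear(vols[SIGNS[:, a] < 0].sum(),
                         vols[SIGNS[:, a] > 0].sum(), rng) for a in range(3)]
    shape = np.prod(1.0 + SIGNS * np.array(uvw), axis=1) / 8.0  # trilinear
    return shape @ nodes

cube = (SIGNS + 1.0) / 2.0          # unit cube: every corner volume is 1/8
print(sample_position(cube))        # exact here, approximate in general
```

    For the unit cube all corner volumes are equal, the linear pdfs reduce to uniform ones, and the scheme is exact, which matches the verification strategy of filling a unit cube with cells and checking uniformity.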

  13. Identifying behaviors that generate positive interactions between museums and people on a social media platform: An analysis of 27 science museums on Twitter

    Science.gov (United States)

    Baker, Stacy Christine

    The aim of this study was to provide a detailed examination of how science museums use Twitter and suggest changes these museums should make to improve their current approach on this social media platform. Previous studies have identified the types of content museums are creating on social media, but none have quantitatively investigated the specific types of content most likely to generate interaction and engagement with a social media audience. A total of 5,278 tweets from 27 science museums were analyzed to determine what type of tweet yields the greatest impact measured in retweets and favorites. 1,453 of those tweets were selected for additional qualitative analysis. The results indicate that tweets with educational content, links, and hashtags lead to the greatest number of retweets and favorites. The results also indicate that the majority of tweets posted by museums do not generate interaction and engagement with a social media audience. A model for existing museums to improve their use of Twitter was created using the results of this study.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: Number of events per month, for 2012] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  15. The Milan Project: A New Method for High-Assurance and High-Performance Computing on Large-Scale Distributed Platforms

    National Research Council Canada - National Science Library

    Kedem, Zvi

    2000-01-01

    ...: Calypso, Chime, and Charlotte, which enable applications developed for ideal, shared-memory parallel machines to execute on distributed platforms that are subject to failures, slowdowns, and changing resource availability...

  16. Perceived Internet health literacy of HIV-positive people through the provision of a computer and Internet health education intervention.

    Science.gov (United States)

    Robinson, Christie; Graham, Joy

    2010-12-01

    The objective of this study was to assess the perceived Internet health literacy of HIV-positive people before and after an Internet health information educational intervention. We developed a 50-min educational intervention on basic computer skills and online health information evaluation. We administered a demographic survey and a validated health literacy survey (eHEALS) at baseline, immediately after the class, and 3 months later. Changes in scores between the surveys were analysed. Eighteen HIV-positive participants were included in the final analysis. Before the intervention, most respondents' assessment of their ability to access Internet health information was unfavourable. Post-intervention, the majority of respondents agreed or strongly agreed that they were able to access and identify Internet health information resources. The increase in self-assessed skill level was statistically significant for all eight eHEALS items. A brief Internet health information educational intervention for HIV-positive people with low baseline perceived Internet health literacy significantly improves confidence in finding and using Internet health information resources. Studies with larger numbers of participants should be undertaken to determine if brief interventions improve self-care, patient outcomes and use of emergency services. © 2010 The authors. Health Information and Libraries Journal © 2010 Health Libraries Group.

  17. Multiplayer computer games as youth's leisure phenomenon

    OpenAIRE

    HADERKOVÁ, Barbora

    2016-01-01

    The thesis is dedicated to multiplayer computer games as a leisure phenomenon of today's youth. The theoretical part focuses on the history of computer games, multiplayer computer games and their types, gaming platforms, the community of multiplayer game players, and the potential negatives and positives that follow from playing this type of game. The practical part contains a qualitative survey using interviews with multiplayer computer game players aged 15 to 26 years from the city of České Bud...

  18. [Orange Platform].

    Science.gov (United States)

    Toba, Kenji

    2017-07-01

    The Organized Registration for the Assessment of dementia on Nationwide General consortium toward Effective treatment in Japan (ORANGE platform) is a recently established nationwide clinical registry for dementia. This platform consists of multiple registries of patients with dementia stratified by the following clinical stages: preclinical, mild cognitive impairment, early-stage, and advanced-stage dementia. Patients will be examined in a super-longitudinal fashion, and their lifestyle, social background, genetic risk factors, and required care process will be assessed. This project is also notable because the care registry includes information on the successful, comprehensive management of patients with dementia. Therefore, this multicenter prospective cohort study will contribute participants to all clinical trials for Alzheimer's disease as well as improve the understanding of individuals with dementia.

  19. Arthroscopic Latarjet Techniques: Graft and Fixation Positioning Assessed With 2-Dimensional Computed Tomography Is Not Equivalent With Standard Open Technique.

    Science.gov (United States)

    Neyton, Lionel; Barth, Johannes; Nourissat, Geoffroy; Métais, Pierre; Boileau, Pascal; Walch, Gilles; Lafosse, Laurent

    2018-05-19

    To analyze graft and fixation (screw and EndoButton) positioning after the arthroscopic Latarjet technique with 2-dimensional computed tomography (CT) and to compare it with the open technique. We performed a retrospective multicenter study (March 2013 to June 2014). The inclusion criteria included patients with recurrent anterior instability treated with the Latarjet procedure. The exclusion criterion was the absence of a postoperative CT scan. The positions of the hardware, the positions of the grafts in the axial and sagittal planes, and the dispersion of values (variability) were compared. The study included 208 patients (79 treated with the open technique, 87 treated with the arthroscopic Latarjet technique with screw fixation [arthro-screw], and 42 treated with the arthroscopic Latarjet technique with EndoButton fixation [arthro-EndoButton]). The angulation of the superior screws differed between the open and arthro-screw groups (10.3° ± 0.7° vs 16.9° ± 1.0°), as did that of the inferior screws (P = .003). In the axial plane (level of the equator), the arthroscopic techniques resulted in more lateral positions (arthro-screw, 1.5 ± 0.3 mm lateral) than the open technique (0.9 ± 0.2 mm medial). At the level of 25% of the glenoid height, the arthroscopic techniques again resulted in more lateral positions (arthro-screw, 0.3 ± 0.3 mm lateral) than the open technique (1.0 ± 0.2 mm medial). Higher variability was observed in the arthro-screw group. In the sagittal plane, the arthro-screw technique resulted in higher positions (55% ± 3% of the graft below the equator) and the arthro-EndoButton technique in lower positions (82% ± 3%) than the open technique (71% ± 2%). Variability was not different. This study shows that, on 2-dimensional CT assessment, the positions of the fixation devices and of the bone graft with the arthroscopic techniques are statistically significantly different from those with the open technique. In the sagittal plane, the arthro-screw technique provides the highest

  20. An analysis of true- and false-positive results of vocal fold uptake in positron emission tomography-computed tomography imaging.

    Science.gov (United States)

    Seymour, N; Burkill, G; Harries, M

    2018-03-01

    Positron emission tomography-computed tomography with fluorine-18 fluorodeoxy-D-glucose has a major role in the investigation of head and neck cancers. Fluorine-18 fluorodeoxy-D-glucose is not a tumour-specific tracer and can also accumulate in benign pathology. Therefore, positron emission tomography-computed tomography scan interpretation difficulties are common in the head and neck, which can produce false-positive results. This study aimed to investigate patients detected as having abnormal vocal fold uptake on fluorine-18 fluorodeoxy-D-glucose positron emission tomography-computed tomography. Positron emission tomography-computed tomography scans were identified over a 15-month period where reports contained evidence of unilateral vocal fold uptake or vocal fold pathology. Patients' notes and laryngoscopy results were analysed. Forty-six patients were identified as having abnormal vocal fold uptake on positron emission tomography-computed tomography. Twenty-three patients underwent positron emission tomography-computed tomography and flexible laryngoscopy: 61 per cent of patients had true-positive positron emission tomography-computed tomography scans and 39 per cent had false-positive scan results. Most patients referred to ENT for abnormal findings on positron emission tomography-computed tomography scans had true-positive findings. Asymmetrical fluorine-18 fluorodeoxy-D-glucose uptake should raise suspicion of vocal fold pathology, accepting a false-positive rate of approximately 40 per cent.

  1. Towards gaze-controlled platform games

    DEFF Research Database (Denmark)

    Muñoz, Jorge; Yannakakis, Georgios N.; Mulvey, Fiona

    2011-01-01

    This paper introduces the concept of using gaze as a sole modality for fully controlling player characters of fast-paced action computer games. A user experiment is devised to collect gaze and gameplay data from subjects playing a version of the popular Super Mario Bros platform game. The initial … analysis shows that there is a rather limited grid around Mario where the efficient player focuses her attention the most while playing the game. The useful grid, as we name it, projects the amount of meaningful visual information a designer should use towards creating successful player character … controllers with the use of artificial intelligence for a platform game like Super Mario. Information about the eyes' position on the screen and the state of the game is utilized as input to an artificial neural network, which is trained to approximate which keyboard action is to be performed at each game…
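
    The learning setup described, gaze position plus game state in, keyboard action out, can be sketched with a small classifier. The features, pseudo-labels, and network size below are synthetic placeholders, not the paper's data or architecture.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

# Toy stand-in for the gaze-to-action mapping: a small neural network maps
# (gaze_x, gaze_y, mario_vx, nearest_gap_dx) to one of four pseudo-actions.
rng = np.random.default_rng(0)
X = rng.random((500, 4))
y = (X[:, 0] > X[:, 2]).astype(int) + 2 * (X[:, 1] < 0.3)  # synthetic labels

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000,
                    random_state=0).fit(X, y)
print(clf.predict(X[:5]))  # action ids, e.g. 0=left, 1=right, 2/3=jump variants
```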

  2. Control of horizontal plasma position by feedforward-feedback system with digital computer in JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, K.; Sakurai, K.; Itoh, S.; Matsuura, K.; Tanahashi, S.

    1980-01-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by solving the equilibrium equation for a thin resistive shell, in a large-aspect-ratio approximation, every 1.39 ms with a digital computer. The iron core effect is also taken into account in a simple form in the equation. The required strength of the vertical field is determined by a control demand composed of a ''feedback'' term with proportional-integral-derivative correction (PID controller) and a ''feedforward'' term proportional to the plasma current. The experimental results agree satisfactorily with the analysis of the control system. With this control system, the horizontal displacement has been suppressed to within 1 cm throughout the discharge for a plasma of 15 cm radius with high density and low q(a) value, obtained by the second current rise and strong gas puffing. (author)
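
    The control law is spelled out in the abstract: the vertical-field demand is a feedforward term proportional to plasma current plus a PID correction on the measured horizontal displacement, recomputed every 1.39 ms. A minimal discrete-time sketch follows; the class name, gain values, and units are placeholders, not JIPP T-II machine parameters.

```python
# Feedforward-feedback vertical-field demand, evaluated once per control cycle.
DT = 1.39e-3  # control cycle, s (from the abstract)

class PositionController:
    def __init__(self, kff, kp, ki, kd):
        self.kff, self.kp, self.ki, self.kd = kff, kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, plasma_current, displacement, setpoint=0.0):
        error = setpoint - displacement        # want zero horizontal shift
        self.integral += error * DT
        derivative = (error - self.prev_error) / DT
        self.prev_error = error
        feedback = (self.kp * error + self.ki * self.integral
                    + self.kd * derivative)
        return self.kff * plasma_current + feedback   # vertical-field demand

ctrl = PositionController(kff=1e-6, kp=0.5, ki=10.0, kd=1e-3)
print(ctrl.update(plasma_current=150e3, displacement=0.008))  # demand (a.u.)
```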

  3. Benign thyroid and neck lesions mimicking malignancy with false positive findings on positron emission tomography-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Ye Ri; Kim, Shin Young; Lee, Sang Mi [Soonchunhyang University Cheonan Hospital, Cheonan (Korea, Republic of); Lee, Deuk Young [Dept. of Surgery, Younsei Angelot Women' s Clinic, Cheonan (Korea, Republic of)

    2017-02-15

    The increasing use of positron emission tomography-computed tomography (PET/CT) has led to the frequent detection of incidental thyroid and neck lesions with increased 18F-fluorodeoxyglucose (FDG) uptake. Although lesions with increased FDG uptake are commonly assumed to be malignant, benign lesions may also exhibit increased uptake. The purpose of this pictorial essay is to demonstrate that benign thyroid and neck lesions can produce false-positive findings on PET/CT, and to identify various difficulties in interpretation. It is crucial to be aware that differentiating between benign and malignant lesions is difficult in a considerable proportion of cases when relying only on PET/CT findings. Correlation of PET/CT findings with additional imaging modalities is essential to avoid misdiagnosis.

  4. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery

    Directory of Open Access Journals (Sweden)

    Hideyuki Suenaga

    2016-01-01

    Conclusion: The implementation of the computer-assisted preoperative simulation for the positioning and fixation of plate in 2-stage orthognathic procedure using distraction technique and mandibular setback surgery yielded good results.

  5. Cloud Based Applications and Platforms (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  6. Interfractional Position Variation of Pancreatic Tumors Quantified Using Intratumoral Fiducial Markers and Daily Cone Beam Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Horst, Astrid van der, E-mail: a.vanderhorst@amc.uva.nl [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Wognum, Silvia; Dávila Fajardo, Raquel; Jong, Rianne de [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Hooft, Jeanin E. van; Fockens, Paul [Department of Gastroenterology and Hepatology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Tienhoven, Geertjan van; Bel, Arjan [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands)

    2013-09-01

    Purpose: The aim of this study was to quantify interfractional pancreatic position variation using fiducial markers visible on daily cone beam computed tomography (CBCT) scans. In addition, we analyzed possible migration of the markers to investigate their suitability for tumor localization. Methods and Materials: For 13 pancreatic cancer patients with implanted Visicoil markers, CBCT scans were obtained before 17 to 25 fractions (300 CBCTs in total). Image registration with the reference CT was used to determine the displacement of the 2 to 3 markers relative to bony anatomy and to each other. We analyzed the distance between marker pairs as a function of time to identify marker registration error (SD of linear fit residuals) and possible marker migration. For each patient, we determined the mean displacement of markers relative to the reference CT (systematic position error) and the spread in displacements (random position error). From this, we calculated the group systematic error, Σ, and group random error, σ. Results: Marker pair distances showed slight trends with time (range, −0.14 to 0.14 mm/day), possibly due to tissue deformation, but no shifts that would indicate marker migration. The mean SD of the fit residuals was 0.8 mm. We found large interfractional position variations, with a 3-dimensional vector displacement of >10 mm for 116 of 300 (39%) fractions. The spread in displacement varied significantly (P<.01) between patients, from a vector range of 9.1 mm to one of 24.6 mm. For the patient group, Σ was 3.8, 6.6, and 3.5 mm; and σ was 3.6, 4.7, and 2.5 mm, in left–right, superior–inferior, and anterior–posterior directions, respectively. Conclusions: We found large systematic displacements of the fiducial markers relative to bony anatomy, in addition to wide distributions of displacement. These results for interfractional position variation confirm the potential benefit of using fiducial markers rather than bony anatomy for daily online tumor localization.
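
    The group statistics above follow the standard convention for setup errors: Σ is the standard deviation of the per-patient mean displacements and σ is the root-mean-square of the per-patient standard deviations. A minimal sketch of that computation (the patient data below are invented for illustration):

    ```python
    import numpy as np

    # Sketch of the group systematic error (Sigma) and group random error
    # (sigma) used above, following the standard setup-error convention:
    #   Sigma = SD of the per-patient mean displacements,
    #   sigma = root-mean-square of the per-patient SDs.
    # The per-fraction displacements below (mm, one axis) are invented.

    def group_errors(per_patient_displacements):
        means = np.array([d.mean() for d in per_patient_displacements])
        sds = np.array([d.std(ddof=1) for d in per_patient_displacements])
        Sigma = means.std(ddof=1)               # group systematic error
        sigma = np.sqrt(np.mean(sds ** 2))      # group random error
        return Sigma, sigma

    patients = [
        np.array([2.1, 3.5, 1.8, 4.0, 2.9]),
        np.array([-1.0, 0.5, -2.2, 0.1, -0.7]),
        np.array([6.3, 8.0, 5.1, 7.4, 6.8]),
    ]
    print(group_errors(patients))
    ```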

  7. Feature-guided analysis for reduction of false positives in CAD of polyps for computed tomographic colonography

    International Nuclear Information System (INIS)

    Naeppi, Janne; Yoshida, Hiroyuki

    2003-01-01

    We evaluated the effect of our novel technique of feature-guided analysis of polyps on the reduction of false-positive (FP) findings generated by our computer-aided diagnosis (CAD) scheme for the detection of polyps from computed tomography colonographic data sets. The detection performance obtained by use of feature-guided analysis in the segmentation and feature analysis of polyp candidates was compared with that obtained by use of our previously employed fuzzy clustering technique. We also evaluated the effect of a feature called modified gradient concentration (MGC) on the detection performance. A total of 144 data sets, representing prone and supine views of 72 patients that included 14 patients with 21 colorectal polyps 5-25 mm in diameter, were used in the evaluation. At a 100% by-patient (95% by-polyp) detection sensitivity, the FP rate of our CAD scheme with feature-guided analysis based on round-robin evaluation was 1.3 (1.5) FP detections per patient. This corresponds to a 70-75 % reduction in the number of FPs obtained by use of fuzzy clustering at the same sensitivity levels. Application of the MGC feature instead of our previously used gradient concentration feature did not improve the detection result. The results indicate that feature-guided analysis is useful for achieving high sensitivity and a low FP rate in our CAD scheme

  8. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with resource platform selection trade-offs and only limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require a fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to perform a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
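
    A toy version of such a blackbox model can score platforms from nothing more than the workflow length, width, and data size; all platform parameters and cost coefficients below are hypothetical:

    ```python
    # Hypothetical toy version of the blackbox cost-benefit model: platforms
    # are scored using only the workflow's high-level length (mean task
    # runtime), width (parallelism), and data size. All platform parameters
    # and cost coefficients are invented for illustration.

    PLATFORMS = {
        # name: (cores, $ per core-hour, $ per GB transferred)
        "desktop":       (4,    0.00, 0.00),
        "local_cluster": (64,   0.00, 0.00),
        "hpc_center":    (1024, 0.00, 0.00),
        "cloud":         (256,  0.09, 0.12),
    }

    def estimate(platform, length_h, width, data_gb):
        """Return (estimated runtime in hours, estimated cost in dollars)."""
        cores, core_cost, gb_cost = PLATFORMS[platform]
        waves = -(-width // cores)         # ceil(width / cores): task batches
        runtime_h = waves * length_h       # ignores queue delays
        cost = length_h * width * core_cost + data_gb * gb_cost
        return runtime_h, cost

    # Example workflow: 500 parallel 2-hour tasks moving 50 GB of data.
    for name in PLATFORMS:
        print(name, estimate(name, length_h=2.0, width=500, data_gb=50.0))
    ```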

  9. Control of horizontal plasma position by feedforward-feedback system with digital computer in the JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, Kazuo; Sakurai, Keiichi; Itoh, Satoshi; Matsuura, Kiyokata; Tanashi, Shugo

    1980-01-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by calculating, every 1.39 ms with a digital computer, the equilibrium equation of a large-aspect-ratio tokamak plasma surrounded by a thin resistive shell with a skin time of 5.2 ms. The iron core effect is also taken into account by a simple term in the equation. The required strength of the vertical field is determined by a control demand composed of two parts: a ''feedback'' term expressed by the deviation of the plasma position from the desired one with proportional-integral-derivative correction (PID controller), and a ''feedforward'' term proportional to the plasma current. The experimental results in a quasi-constant phase of plasma current are in good agreement with the stability analysis of the control system using the so-called Bode diagram, which is calculated on the assumption that the plasma current is independent of time. With this control system, the horizontal plasma displacement has been suppressed to within 1 cm from the initiation of the discharge to its termination in the high-density and low-q(a) plasma of 15 cm radius obtained by both strong gas puffing and a second current rise. (author)

  10. Control of horizontal plasma position by feedforward-feedback system with digital computer in the JIPP T-II tokamak

    International Nuclear Information System (INIS)

    Toi, K.; Itoh, S.; Sakurai, K.; Matsuura, K.; Tanahashi, S.

    1980-02-01

    In the resistive shell tokamak JIPP T-II, control of the horizontal plasma position is successfully carried out by calculating, every 1.39 msec with a digital computer, the equilibrium equation of a large-aspect-ratio tokamak plasma surrounded by a thin resistive shell with a skin time of 5.2 msec. The iron core effect is also taken into account by a simple term in the equation. The required strength of the vertical field is determined by a control demand composed of two parts: a ''feedback'' term expressed by the deviation of the plasma position from the desired one with proportional-integral-derivative correction (PID controller), and a ''feedforward'' term proportional to the plasma current. The experimental results are in good agreement with the stability analysis of the control system using the so-called Bode diagram. With this control system, the horizontal displacement has been suppressed to within 1 cm from the initiation of the discharge to its termination in the high-density and low-q(a) plasma of 15 cm radius obtained by both strong gas puffing and a second current rise. (author)

  11. Three-dimensional computer graphics-based ankle morphometry with computerized tomography for total ankle replacement design and positioning.

    Science.gov (United States)

    Kuo, Chien-Chung; Lu, Hsuan-Lun; Leardini, Alberto; Lu, Tung-Wu; Kuo, Mei-Ying; Hsu, Horng-Chaung

    2014-05-01

    Morphometry of the bones of the ankle joint is important for the design of joint replacements and their surgical implantations. However, very little three-dimensional (3D) data are available and not a single study has addressed the Chinese population. Fifty-eight fresh frozen Chinese cadaveric ankle specimens, 26 females, and 32 males, were CT-scanned in the neutral position and their 3D computer graphics-based models were reconstructed. The 3D morphology of the distal tibia/fibula segment and the full talus was analyzed by measuring 31 parameters, defining the relevant dimensions, areas, and volumes from the models. The measurements were compared statistically between sexes and with previously reported data from Caucasian subjects. The results showed that, within a general similarity of ankle morphology between the current Chinese and previous Caucasian subjects groups, there were significant differences in 9 out of the 31 parameters analyzed. From a quantitative comparison with available prostheses designed for the Caucasian population, few of these designs have both tibial and talar components suitable in dimension for the Chinese population. The current data will be helpful for the sizing, design, and surgical positioning of ankle replacements and for surgical instruments, especially for the Chinese population. Copyright © 2013 Wiley Periodicals, Inc.

  12. Positional relationship between the maxillary sinus floor and the apex of the maxillary first molar using cone beam computed tomography

    International Nuclear Information System (INIS)

    Kim, Kyung Hwa; Koh, Kwang Joon

    2008-01-01

    To assess the positional relationship between the maxillary sinus floor and the apex of the maxillary first molar using cone beam computed tomography (CBCT). CBCT scans from 127 subjects were analysed. A total of 134 maxillary first molars were classified according to their vertical and horizontal positional relationship to the maxillary sinus floor, and the distance between the maxillary sinus floor and the maxillary first molar was measured. In the vertical relationship between the maxillary sinus floor and the apex of the maxillary first molar, type III (the root projecting laterally into the sinus cavity but with its apex outside the sinus boundaries) was predominant between 10 and 19 years of age, and type I (the root apex not in contact with the cortical borders of the sinus) was predominant (P<0.05) between 20 and 72 years of age. For the palatal root, the maxillary sinus floor was located more often at the apex (78.2%) than at the furcation (21.3%). The distance from the root apex to the maxillary sinus floor in type I cases increased with age (P<0.05). Type M (the maxillary sinus floor located between the buccal and the palatal roots) was the most common (72.4%) horizontal relationship between the maxillary sinus floor and the apex of the maxillary first molar. CBCT can provide high-quality images of the maxillary sinus floor and the root apex of the maxillary first molar.

  13. Cone-beam computed tomography in children with cochlear implants: The effect of electrode array position on ECAP.

    Science.gov (United States)

    Lathuillière, Marine; Merklen, Fanny; Piron, Jean-Pierre; Sicard, Marielle; Villemus, Françoise; Menjot de Champfleur, Nicolas; Venail, Frédéric; Uziel, Alain; Mondain, Michel

    2017-01-01

    To assess the feasibility of using cone-beam computed tomography (CBCT) in young children with cochlear implants (CIs) and study the effect of intracochlear position on electrophysiological and behavioral measurements. A total of 40 children with either unilateral or bilateral cochlear implants were prospectively included in the study. Electrode placement and insertion angles were studied in 55 Cochlear® implants (16 straight arrays and 39 perimodiolar arrays), using either CBCT or X-ray imaging. CBCT or X-ray imaging were scheduled when the children were leaving the recovery room. We recorded intraoperative and postoperative neural response telemetry threshold (T-NRT) values, intraoperative and postoperative electrode impedance values, as well as behavioral T (threshold) and C (comfort) levels on electrodes 1, 5, 10, 15 and 20. CBCT imaging was feasible without any sedation in 24 children (60%). Accidental scala vestibuli insertion was observed in 3 out of 24 implants as assessed by CBCT. The mean insertion angle was 339.7°±35.8°. The use of a perimodiolar array led to higher angles of insertion, lower postoperative T-NRT, as well as decreased behavioral T and C levels. We found no significant effect of either electrode array position or angle of insertion on electrophysiological data. CBCT appears to be a reliable tool for anatomical assessment of young children with CIs. Intracochlear position had no significant effect on the electrically evoked compound action potential (ECAP) threshold. Our CBCT protocol must be improved to increase the rate of successful investigations. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Pollution sources for indoor PM2.5 at the platform in subway station using a positive matrix factorization and an instrumental neutron activation analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lim, Jong Myoung; Moon, Jong Hwa; Chung, Yong Sam [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Lee, Jin Hong [Chungnam National University, Daejeon (Korea, Republic of)

    2010-05-15

    Airborne particulate matter, especially the PM2.5 fraction (aerodynamic equivalent diameter, AED, less than 2.5 µm), has become important because of its potential for deposition in the human respiratory system together with many harmful trace metals (such as As, Cd, Cr, Cu, Mn, Pb, Se, and Zn). As most people spend more than 80% of their time indoors, indoor air quality (IAQ) can exert a considerable impact on the inhalation of toxic substances. Therefore, assessment of the absolute concentration levels and elemental composition of PM in an indoor environment such as a subway station can be used as a practical barometer of IAQ. Contaminants originating from indoor pollution sources as well as various outdoor sources accumulate easily in the indoor environment, unlike outdoors. In particular, since natural ventilation is nearly impossible in a subway station, pollution can be worsened as contaminants are constantly generated and circulated inside the station by the repetitive passage of subway trains. In this study, a total of 60 PM2.5 samples were collected over 4 seasonal campaigns in 2009 with a low-volume air sampler at one subway station in Daejeon, Korea. We measured up to 25 elements in PM2.5 using instrumental neutron activation analysis (INAA) and X-ray fluorescence (XRF). Inorganic ion species (SO₄²⁻, NO₃⁻, NH₄⁺) were also determined by ion chromatography (IC). The sources in the indoor/outdoor environment were then identified and the contributions of each source were quantified by positive matrix factorization (PMF).
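
    PMF factorizes the samples-by-species concentration matrix into nonnegative source contributions and source profiles. Real PMF tools also weight residuals by measurement uncertainty; as a rough unweighted stand-in, plain nonnegative matrix factorization illustrates the decomposition:

    ```python
    import numpy as np
    from sklearn.decomposition import NMF

    # Rough illustration of the source-apportionment idea behind PMF: factor
    # the (samples x species) concentration matrix X into nonnegative source
    # contributions G and source profiles F, with X ~ G @ F. Real PMF (e.g.
    # the EPA PMF tool) additionally weights residuals by measurement
    # uncertainty; the plain NMF below is an unweighted stand-in, and the
    # data are random placeholders.

    rng = np.random.default_rng(0)
    X = rng.random((60, 25))           # 60 PM2.5 samples, 25 measured species

    model = NMF(n_components=4, init="nndsvd", max_iter=500, random_state=0)
    G = model.fit_transform(X)         # source contributions per sample
    F = model.components_              # source profiles (species signatures)

    # Relative contribution of each resolved source to the total mass
    totals = (G * F.sum(axis=1)).sum(axis=0)
    print(np.round(100 * totals / totals.sum(), 1))
    ```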

  15. Pollution sources for indoor PM2.5 at the platform in subway station using a positive matrix factorization and an instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Lim, Jong Myoung; Moon, Jong Hwa; Chung, Yong Sam; Lee, Jin Hong

    2010-01-01

    Airborne particulate matter, especially the PM2.5 fraction (aerodynamic equivalent diameter, AED, less than 2.5 µm), has become important because of its potential for deposition in the human respiratory system together with many harmful trace metals (such as As, Cd, Cr, Cu, Mn, Pb, Se, and Zn). As most people spend more than 80% of their time indoors, indoor air quality (IAQ) can exert a considerable impact on the inhalation of toxic substances. Therefore, assessment of the absolute concentration levels and elemental composition of PM in an indoor environment such as a subway station can be used as a practical barometer of IAQ. Contaminants originating from indoor pollution sources as well as various outdoor sources accumulate easily in the indoor environment, unlike outdoors. In particular, since natural ventilation is nearly impossible in a subway station, pollution can be worsened as contaminants are constantly generated and circulated inside the station by the repetitive passage of subway trains. In this study, a total of 60 PM2.5 samples were collected over 4 seasonal campaigns in 2009 with a low-volume air sampler at one subway station in Daejeon, Korea. We measured up to 25 elements in PM2.5 using instrumental neutron activation analysis (INAA) and X-ray fluorescence (XRF). Inorganic ion species (SO₄²⁻, NO₃⁻, NH₄⁺) were also determined by ion chromatography (IC). The sources in the indoor/outdoor environment were then identified and the contributions of each source were quantified by positive matrix factorization (PMF).

  16. Kinematics of an in-parallel actuated manipulator based on the Stewart platform mechanism

    Science.gov (United States)

    Williams, Robert L., II

    1992-01-01

    This paper presents kinematic equations and solutions for an in-parallel actuated robotic mechanism based on Stewart's platform. These equations are required for inverse position and resolved rate (inverse velocity) platform control. NASA LaRC has a Vehicle Emulator System (VES) platform designed by MIT which is based on Stewart's platform. The inverse position solution is straightforward and computationally inexpensive. Given the desired position and orientation of the moving platform with respect to the base, the lengths of the prismatic leg actuators are calculated. The forward position solution is more complicated and theoretically has 16 solutions. The position and orientation of the moving platform with respect to the base is calculated given the leg actuator lengths. Two methods are pursued in this paper to solve this problem. The resolved rate (inverse velocity) solution is derived. Given the desired Cartesian velocity of the end-effector, the required leg actuator rates are calculated. The Newton-Raphson Jacobian matrix resulting from the second forward position kinematics solution is a modified inverse Jacobian matrix. Examples and simulations are given for the VES.
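
    The inverse position solution mentioned above reduces to vector geometry: each leg length is the norm of the vector from its base attachment point to its platform attachment point expressed in the base frame. A minimal sketch (the attachment-point layout below is an illustrative hexagonal arrangement, not the VES geometry):

    ```python
    import numpy as np

    # Minimal sketch of the inverse position solution: given the pose of the
    # moving platform, each prismatic leg length is the distance between its
    # platform attachment point (expressed in the base frame) and its base
    # attachment point. The hexagonal layout below is illustrative only.

    def leg_lengths(p, R, base_pts, plat_pts):
        """p: platform origin in base frame (3,); R: 3x3 rotation matrix;
        base_pts, plat_pts: (6, 3) attachment points in their own frames."""
        tips = p + plat_pts @ R.T            # platform points in base frame
        return np.linalg.norm(tips - base_pts, axis=1)

    ang_b = np.deg2rad(np.arange(6) * 60.0)          # base attachment angles
    ang_p = np.deg2rad(np.arange(6) * 60.0 + 30.0)   # platform angles, offset
    base_pts = np.column_stack(
        [2.0 * np.cos(ang_b), 2.0 * np.sin(ang_b), np.zeros(6)])
    plat_pts = np.column_stack(
        [1.0 * np.cos(ang_p), 1.0 * np.sin(ang_p), np.zeros(6)])

    pose_p = np.array([0.0, 0.0, 1.5])   # desired platform position
    print(leg_lengths(pose_p, np.eye(3), base_pts, plat_pts))
    ```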

  17. Integration of the TNXYZ computer program inside the Salome platform

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.

    2014-07-01

    The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool into the graphical simulation platform Salome. The TNXYZ code provides a numerical solution of the neutron transport equation for several energy groups, in steady state and three-dimensional geometry. To discretize the variables of the transport equation, the code uses the method of discrete ordinates for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing, and simulating mechanical models, mainly aimed at industry; unlike other software, it can integrate and control an external source code so as to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 1990s using a Fortran 77 compiler; for this reason the code was adapted to the characteristics of current Fortran compilers. In addition, with the intention of extracting partial results along the process sequence, the original structure of the program underwent a modularization process, i.e. the main program was divided into sections where the code performs major operations. This procedure is controlled by the information module (YACS) of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration; the verification consisted of comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated, and controlled by the Salome platform. (Author)

  18. Observation of positional relation between mandibular third molars and the mandibular canal on limited cone beam computed tomography

    International Nuclear Information System (INIS)

    Hashizume, Atsuko; Nakagawa, Yoichi; Ishii, Hisako; Kobayashi, Kaoru

    2004-01-01

    We describe the preoperative use of limited cone beam computed tomography (CT) with a dental CT scanner for the assessment of mandibular third molars before extraction. Cone beam CT provides 42.7-mm-high and 30-mm-wide rectangular solid images, with a resolution of less than 0.2 mm. The positional relationship between the mandibular third molars and the mandibular canal was examined by dental CT. Sixty-eight lower third molars of 62 patients whose teeth were superimposed on the mandibular canal on periapical or panoramic radiographs were studied. Dental CT scans clearly demonstrated the positional relationship between the mandibular canal and the teeth. The mandibular canal was located buccally to the roots of 16 teeth, lingually to the roots of 27 teeth, inferiorly to the roots of 23 teeth, and between the roots of 2 teeth. The presence of bone between the mandibular canal and the teeth was not noted in 7 of 16 buccal cases, 24 of 27 lingual cases, and 10 of 23 inferior cases on dental CT scans, suggesting that the canal was in contact with the teeth. Fifty-nine of the 68 mandibular third molars were surgically removed, and postoperative transient hypoesthesia occurred in 4 patients. Dental CT scans showed no bone between the mandibular canal and the teeth in all 4 patients. Hypoesthesia was not related to the bucco-lingual location of the mandibular canal or to the extent of bone loss between the canal and the teeth. However, hypoesthesia did not occur in patients with bone between the mandibular canal and the teeth. Thus, information on the distance between the canal and teeth on dental CT scans was useful for predicting the risk of inferior alveolar nerve damage. Because of its high resolution and low radiation dose, cone beam CT was useful for examination before mandibular third molar surgery. (author)

  19. Hip joint centre position estimation using a dual unscented Kalman filter for computer-assisted orthopaedic surgery.

    Science.gov (United States)

    Beretta, Elisa; De Momi, Elena; Camomilla, Valentina; Cereatti, Andrea; Cappozzo, Aurelio; Ferrigno, Giancarlo

    2014-09-01

    In computer-assisted knee surgery, the accuracy of the localization of the femur centre of rotation relative to the hip-bone (hip joint centre) is affected by the unavoidable and untracked pelvic movements because only the femoral pose is acquired during passive pivoting manoeuvres. We present a dual unscented Kalman filter algorithm that allows the estimation of the hip joint centre also using as input the position of a pelvic reference point that can be acquired with a skin marker placed on the hip, without increasing the invasiveness of the surgical procedure. A comparative assessment of the algorithm was carried out using data provided by in vitro experiments mimicking in vivo surgical conditions. Soft tissue artefacts were simulated and superimposed onto the position of a pelvic landmark. Femoral pivoting made of a sequence of star-like quasi-planar movements followed by a circumduction was performed. The dual unscented Kalman filter method proved to be less sensitive to pelvic displacements, which were shown to be larger during the manoeuvres in which the femur was more adducted. Comparable accuracy between all the analysed methods resulted for hip joint centre displacements smaller than 1 mm (error: 2.2 ± [0.2; 0.3] mm, median ± [inter-quartile range 25%; inter-quartile range 75%]) and between 1 and 6 mm (error: 4.8 ± [0.5; 0.8] mm) during planar movements. When the hip joint centre displacement exceeded 6 mm, the dual unscented Kalman filter proved to be more accurate than the other methods by 30% during multi-planar movements (error: 5.2 ± [1.2; 1] mm). © IMechE 2014.
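
    The filter-based estimators discussed above are typically benchmarked against the classical functional method, which fits a sphere to the trajectory of a femoral point pivoting about the hip: the fitted centre is the hip joint centre estimate. A minimal linear least-squares sphere fit under that assumption (an illustrative baseline, not the paper's dual unscented Kalman filter):

    ```python
    import numpy as np

    # Illustrative baseline, not the paper's dual unscented Kalman filter:
    # the classical "functional" hip joint centre estimate fits a sphere to
    # points on the femur tracked while pivoting, since they stay at a fixed
    # distance from the centre of rotation. Linear least-squares sphere fit.

    def fit_sphere(points):
        """points: (N, 3) marker positions. Returns (centre, radius)."""
        A = np.c_[2.0 * points, np.ones(len(points))]
        b = (points ** 2).sum(axis=1)
        sol, *_ = np.linalg.lstsq(A, b, rcond=None)
        centre = sol[:3]
        radius = np.sqrt(sol[3] + centre @ centre)
        return centre, radius

    # Synthetic pivoting data: points on a sphere of radius 0.45 m about a
    # known centre, with small measurement noise.
    rng = np.random.default_rng(1)
    directions = rng.normal(size=(200, 3))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    true_centre = np.array([0.1, -0.05, 0.9])
    pts = true_centre + 0.45 * directions + rng.normal(scale=1e-3, size=(200, 3))
    print(fit_sphere(pts))
    ```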

  20. Computer-assisted orthognathic surgery: waferless maxillary positioning, versatility, and accuracy of an image-guided visualisation display.

    Science.gov (United States)

    Zinser, Max J; Mischkowski, Robert A; Dreiseidler, Timo; Thamm, Oliver C; Rothamel, Daniel; Zöller, Joachim E

    2013-12-01

    There may well be a shift towards 3-dimensional orthognathic surgery when virtual surgical planning can be applied clinically. We present a computer-assisted protocol that uses surgical navigation supplemented by an interactive image-guided visualisation display (IGVD) to transfer virtual maxillary planning precisely. The aim of this study was to analyse its accuracy and versatility in vivo. The protocol consists of maxillofacial imaging, diagnosis, planning of virtual treatment, and intraoperative surgical transfer using the IGV display. The advantage of the interactive IGV display is that the virtually planned maxilla and its real position can be completely superimposed during operation through a video graphics array (VGA) camera, thereby augmenting the surgeon's 3-dimensional perception. Sixteen adult class III patients were treated by bimaxillary osteotomy. Seven hard tissue variables were chosen to compare (ΔT1-T0) the virtual maxillary planning (T0) with the postoperative result (T1) using 3-dimensional cephalometry. Clinically acceptable precision was achieved for the surgical transfer of the virtual maxillary planning, supporting the accuracy and versatility of the approach for orthognathic planning. Copyright © 2013 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.

  1. Positron emission tomography/computed tomography surveillance in patients with Hodgkin lymphoma in first remission has a low positive predictive value and high costs.

    Science.gov (United States)

    El-Galaly, Tarec Christoffer; Mylam, Karen Juul; Brown, Peter; Specht, Lena; Christiansen, Ilse; Munksgaard, Lars; Johnsen, Hans Erik; Loft, Annika; Bukh, Anne; Iyer, Victor; Nielsen, Anne Lerberg; Hutchings, Martin

    2012-06-01

    The value of performing post-therapy routine surveillance imaging in patients with Hodgkin lymphoma is controversial. This study evaluates the utility of positron emission tomography/computed tomography using 2-[18F]fluoro-2-deoxyglucose for this purpose and in situations with suspected lymphoma relapse. We conducted a multicenter retrospective study. Patients with newly diagnosed Hodgkin lymphoma achieving at least a partial remission on first-line therapy were eligible if they received positron emission tomography/computed tomography surveillance during follow-up. Two types of imaging surveillance were analyzed: "routine" when patients showed no signs of relapse at referral to positron emission tomography/computed tomography, and "clinically indicated" when recurrence was suspected. A total of 211 routine and 88 clinically indicated positron emission tomography/computed tomography studies were performed in 161 patients. In ten of 22 patients with recurrence of Hodgkin lymphoma, routine imaging surveillance was the primary tool for the diagnosis of the relapse. Extranodal disease, interim positron emission tomography-positive lesions and positron emission tomography activity at response evaluation were all associated with a positron emission tomography/computed tomography-diagnosed preclinical relapse. The true positive rates of routine and clinically indicated imaging were 5% and 13%, respectively (P = 0.02). The overall positive predictive value and negative predictive value of positron emission tomography/computed tomography were 28% and 100%, respectively. The estimated cost per routine-imaging-diagnosed relapse was US$ 50,778. A negative positron emission tomography/computed tomography reliably rules out a relapse. The high false positive rate is, however, an important limitation, and a confirmatory biopsy is mandatory for the diagnosis of a relapse. With no proven survival benefit for patients with a pre-clinically diagnosed relapse, the high costs and low positive predictive value argue against routine surveillance imaging.
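
    The predictive values quoted above follow directly from the confusion-matrix definitions; a small worked check (the counts below are illustrative placeholders chosen to be consistent with the reported rates, not the study's raw data):

    ```python
    # Worked check of the predictive-value definitions used above.
    # The counts are illustrative placeholders, not the study's raw data.

    def ppv(tp, fp):
        """Positive predictive value: TP / (TP + FP)."""
        return tp / (tp + fp)

    def npv(tn, fn):
        """Negative predictive value: TN / (TN + FN)."""
        return tn / (tn + fn)

    # Example: 21 true positives among 75 positive scans gives PPV = 28%,
    # and no relapses among scans read as negative gives NPV = 100%.
    print(f"PPV = {ppv(21, 54):.0%}")   # 21 / 75 = 28%
    print(f"NPV = {npv(224, 0):.0%}")   # no false negatives -> 100%
    ```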

  2. Evaluation of the Positional Uncertainty of a Liver Tumor using 4-Dimensional Computed Tomography and Gated Orthogonal Kilovolt Setup Images

    International Nuclear Information System (INIS)

    Ju, Sang Gyu; Hong, Chae Seon; Park, Hee Chul; Ahn, Jong Ho; Shin, Eun Hyuk; Shin, Jung Suk; Kim, Jin Sung; Han, Young Yih; Lim, Do Hoon; Choi, Doo Ho

    2010-01-01

    In order to evaluate the positional uncertainty of internal organs during radiation therapy for liver cancer, we measured differences in inter- and intra-fractional variation of the tumor position and tidal amplitude using 4-dimensional computed tomography (4DCT) images and gated orthogonal kilovolt (KV) setup images taken at every treatment using the on-board imaging (OBI) and real-time position management (RPM) systems. Twenty consecutive patients who underwent 3-dimensional (3D) conformal radiation therapy for liver cancer participated in this study. All patients received a 4DCT simulation with an RT16 scanner and an RPM system. Lipiodol deposited near the target volume after transarterial chemoembolization, or the diaphragm, was chosen as a surrogate for evaluating the position difference of internal organs. Two reference orthogonal (anterior and lateral) digitally reconstructed radiograph (DRR) images were generated using CT image sets at 0% and 50% of the respiratory phase. The maximum tidal amplitude of the surrogate was measured from 3D conformal treatment planning. After setting the patient up with laser markings on the skin, orthogonal gated setup images at 50% of the respiratory phase were acquired at each treatment session with OBI and registered on the reference DRR images by setting each beam center. Online inter-fractional variation was determined with the surrogate. After adjusting the patient setup error, orthogonal setup images at 0% and 50% of the respiratory phase were obtained and the tidal amplitude of the surrogate was measured. The measured tidal amplitude was compared with data from 4DCT. For evaluation of intra-fractional variation, an orthogonal gated setup image at 50% of the respiratory phase was promptly acquired after treatment and compared with the same image taken just before treatment. In addition, a statistical analysis for the quantitative evaluation was performed. Medians of inter

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  4. The Prodiguer Messaging Platform

    Science.gov (United States)

    Denvil, S.; Greenslade, M. A.; Carenton, N.; Levavasseur, G.; Raciazek, J.

    2015-12-01

    CONVERGENCE is a French multi-partner national project designed to gather HPC and informatics expertise to innovate in the context of running French global climate models with differing grids and at differing resolutions. Efficient and reliable execution of these models and the management and dissemination of model output are some of the complexities that CONVERGENCE aims to resolve. At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of French High Performance Computing (HPC) environments. The IPSL's simulation execution runtime libIGCM (library for the IPSL Global Climate Modeling group) has recently been enhanced so as to support hitherto impossible real-time use cases such as simulation monitoring, data publication, metrics collection, simulation control, and visualizations. At the core of this enhancement is Prodiguer: an AMQP (Advanced Message Queuing Protocol) based, event-driven, asynchronous distributed messaging platform. libIGCM now dispatches copious amounts of information, in the form of messages, to the platform for remote processing by Prodiguer software agents at IPSL servers in Paris. Such processing takes several forms: persisting message content to databases; launching rollback jobs upon simulation failure; notifying downstream applications; and automating visualization pipelines. We will describe and/or demonstrate the platform's technical implementation, its inherent ease of scalability, its adaptiveness with respect to supervising simulations, and a web portal receiving simulation notifications in real time.
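
    Prodiguer's core pattern is a producer publishing events to an AMQP broker for asynchronous processing by remote agents. A minimal sketch of such a publisher using the pika client; the host, exchange name, routing key, and message fields are illustrative assumptions, not Prodiguer's actual schema:

    ```python
    import json
    import pika

    # Minimal sketch of publishing a simulation event to an AMQP broker, in
    # the spirit of libIGCM dispatching messages towards Prodiguer. Host,
    # exchange, routing key, and message layout are illustrative assumptions.

    connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
    channel = connection.channel()
    channel.exchange_declare(exchange="simulation.events", exchange_type="topic")

    event = {
        "simulation_id": "sim-0042",
        "event": "monitoring.heartbeat",
        "payload": {"step": 1200, "wall_time_s": 5341.2},
    }
    channel.basic_publish(
        exchange="simulation.events",
        routing_key=event["event"],
        body=json.dumps(event).encode(),
    )
    connection.close()
    ```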

  5. Computer-aided mass detection in mammography: False positive reduction via gray-scale invariant ranklet texture features

    International Nuclear Information System (INIS)

    Masotti, Matteo; Lanconelli, Nico; Campanini, Renato

    2009-01-01

    In this work, gray-scale invariant ranklet texture features are proposed for false positive reduction (FPR) in computer-aided detection (CAD) of breast masses. Two main considerations are at the basis of this proposal. First, false positive (FP) marks surviving our previous CAD system seem to be characterized by specific texture properties that can be used to discriminate them from masses. Second, our previous CAD system achieves invariance to linear/nonlinear monotonic gray-scale transformations by encoding regions of interest into ranklet images through the ranklet transform, an image transformation similar to the wavelet transform, yet dealing with pixels' ranks rather than with their gray-scale values. Therefore, the new FPR approach proposed herein defines a set of texture features which are calculated directly from the ranklet images corresponding to the regions of interest surviving our previous CAD system, hence, ranklet texture features; then, a support vector machine (SVM) classifier is used for discrimination. As a result of this approach, texture-based information is used to discriminate FP marks surviving our previous CAD system; at the same time, invariance to linear/nonlinear monotonic gray-scale transformations of the new CAD system is guaranteed, as ranklet texture features are calculated from ranklet images that have this property themselves by construction. To emphasize the gray-scale invariance of both the previous and new CAD systems, training and testing are carried out without any in-between parameters' adjustment on mammograms having different gray-scale dynamics; in particular, training is carried out on analog digitized mammograms taken from a publicly available digital database, whereas testing is performed on full-field digital mammograms taken from an in-house database. Free-response receiver operating characteristic (FROC) curve analysis of the two CAD systems demonstrates that the new approach achieves a higher reduction of FP marks
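
    The pipeline described above, rank-transform a region of interest so the features are invariant to monotonic gray-scale changes and then classify with an SVM, can be sketched as follows. The simple rank-based statistics below are stand-ins for the paper's ranklet texture features, and the data are random placeholders:

    ```python
    import numpy as np
    from scipy.stats import rankdata
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Sketch of the false-positive-reduction step: compute texture features
    # from a rank-transformed region of interest (invariant to monotonic
    # gray-scale changes by construction), then let an SVM separate masses
    # from false-positive marks. Features and data are invented stand-ins.

    def rank_texture_features(roi):
        """roi: 2-D gray-scale patch. Returns statistics of the pixel ranks."""
        ranks = rankdata(roi, method="average").reshape(roi.shape) / roi.size
        g_row, g_col = np.gradient(ranks)        # rank-image gradients
        return np.array(
            [ranks.std(), np.abs(g_row).mean(), np.abs(g_col).mean()])

    rng = np.random.default_rng(0)
    rois = rng.random((100, 32, 32))             # fake regions of interest
    labels = rng.integers(0, 2, size=100)        # 1 = mass, 0 = false positive

    X = np.array([rank_texture_features(r) for r in rois])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
    clf.fit(X, labels)
    print(f"training accuracy: {clf.score(X, labels):.2f}")
    ```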

  6. Cone Beam Computed Tomography-based Evaluation of the Anterior Teeth Position Changes obtained by Passive Self-ligating Brackets.

    Science.gov (United States)

    Rhoden, Fernando K; Maltagliati, Liliana Á; de Castro Ferreira Conti, Ana C; Almeida-Pedrin, Renata R; Filho, Leopoldino C; de Almeida Cardoso, Maurício

    2016-08-01

    The objective of this study was to evaluate the anterior teeth position changes obtained by passive self-ligating brackets using cone beam computed tomography (CBCT). Twenty patients with a mean age of 16.5 years, class I malocclusion, constricted maxillary arch, and teeth crowding above 5 mm were enrolled in this study, and treated by passive orthodontic self-ligating brackets. A sequence of stainless steel thermoset wire was implemented with ending wire of 0.019" × 0.025". The CBCT and dental casts were obtained prior to the installation of orthodontic appliances (T1), and 30 days after rectangular steel wire 0.019" × 0.025" installation (T2). The measurements in CBCT were performed with the Anatomage software, and the dental casts were evaluated with a digital caliper rule with an accuracy of 0.01 mm. The CBCT data demonstrated mean buccal inclination of the upper and lower central incisors ranging from 6.55° to 7.24° respectively. The upper and lower lateral incisors ranged from 4.90° to 8.72° respectively. The lower canines showed an average increase of 3.88° in the buccal inclination and 1.96 mm in the transverse intercuspal distance. The upper canines showed a negative inclination with mean average of -0.36°, and an average increase of 0.82 mm in the transverse distance, with negative correlation with the initial crowding. Treatment with passive self-ligating brackets without obtaining spaces increases buccal inclination of the upper and lower incisors with no correlation with the amount of initial teeth crowding. The intercanine distance tends to a small increase showing different inclinations between the arches. When taking into account the self-ligating brackets, the amount of initial dental crowding is not a limitation factor that could increase the buccal inclination of the anterior teeth.

  7. Observation of Interfractional Variations in Lung Tumor Position Using Respiratory Gated and Ungated Megavoltage Cone-Beam Computed Tomography

    International Nuclear Information System (INIS)

    Chang, Jenghwa; Mageras, Gig S.; Yorke, Ellen; De Arruda, Fernando; Sillanpaa, Jussi; Rosenzweig, Kenneth E.; Hertanto, Agung; Pham, Hai; Seppi, Edward; Pevsner, Alex; Ling, C. Clifton; Amols, Howard

    2007-01-01

    Purpose: To evaluate the use of megavoltage cone-beam computed tomography (MV CBCT) to measure interfractional variation in lung tumor position. Methods and Materials: Eight non-small-cell lung cancer patients participated in the study, 4 with respiratory gating and 4 without. All patients underwent MV CBCT scanning at weekly intervals. Contoured planning CT and MV CBCT images were spatially registered based on vertebral anatomy, and displacements of the tumor centroid determined. Setup error was assessed by comparing weekly portal orthogonal radiographs with digitally reconstructed radiographs generated from planning CT images. Hypothesis testing was performed to test the statistical significance of the volume difference, centroid displacement, and setup uncertainty. Results: The vertebral bodies and soft tissue portions of tumor within lung were visible on the MV CBCT scans. Statistically significant systematic volume decrease over the course of treatment was observed for 1 patient. The average centroid displacement between simulation CT and MV CBCT scans were 2.5 mm, -2.0 mm, and -1.5 mm with standard deviations of 2.7 mm, 2.7 mm, and 2.6 mm in the right-left, anterior-posterior and superior-inferior directions. The mean setup errors were smaller than the centroid shifts, while the standard deviations were comparable. In most cases, the gross tumor volume (GTV) defined on the MV CBCT was located on average at least 5 mm inside a 10 mm expansion of the GTV defined on the planning CT scan. Conclusions: The MV CBCT technique can be used to image lung tumors and may prove valuable for image-guided radiotherapy. Our conclusions must be verified in view of the small patient number

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar in ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity discussing the impact and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  11. Computer Security in Working with the Moodle Platform

    Directory of Open Access Journals (Sweden)

    María-José Luisa Romero-Moreno

    2010-12-01

    This paper presents the security aspects of the Moodle platform. It is well known that virtual learning systems pervade the academic world (virtual campuses) and, increasingly, that of companies (continuing training). eLearning platforms present themselves as suitable tools in these contexts. We focus on free software and, within it, on the platform that today constitutes a true reference point in the field of virtual training. It seems essential, however, that teachers and tutors can be confident that their files are properly protected. We analyse how to take advantage of the tool's security levels and how to configure it to obtain the expected results.

  12. Essentials of cloud computing

    CERN Document Server

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact

  13. A computational systems biology software platform for multiscale modeling and simulation: Integrating whole-body physiology, disease biology, and molecular reaction networks

    Directory of Open Access Journals (Sweden)

    Thomas eEissing

    2011-02-01

    Today, in silico studies and trial simulations already complement experimental approaches in pharmaceutical R&D and have become indispensable tools for decision making and communication with regulatory agencies. While biology is multiscale by nature, project work and software tools usually focus on isolated aspects of drug action, such as pharmacokinetics at the organism scale or pharmacodynamic interaction at the molecular level. We present a modeling and simulation software platform consisting of PK-Sim® and MoBi® capable of building and simulating models that integrate across biological scales. A prototypical multiscale model for the progression of a pancreatic tumor and its response to pharmacotherapy is constructed, and virtual patients are treated with a prodrug activated by hepatic metabolization. Tumor growth is driven by signal transduction leading to cell cycle transition and proliferation. Free tumor concentrations of the active metabolite inhibit Raf kinase in the signaling cascade and thereby cell cycle progression. In a virtual clinical study, the individual therapeutic outcome of the chemotherapeutic intervention is simulated for a large population with heterogeneous genomic background. The platform thereby allows efficient model building and integration of biological knowledge and prior data from all biological scales. Experimental in vitro model systems can be linked with observations in animal experiments and clinical trials. The interplay between patients, diseases, and drugs, and topics of high clinical relevance such as the role of pharmacogenomics and drug-drug or drug-metabolite interactions, can be addressed using this mechanistic, insight-driven multiscale modeling approach.
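
    As a toy illustration of the kind of cross-scale coupling described above (prodrug pharmacokinetics feeding a tumor growth model), consider a three-state ODE caricature; all compartments, rate constants, and doses are invented for illustration and bear no relation to actual PK-Sim/MoBi models:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Toy caricature of the cross-scale coupling described above: a prodrug
    # is metabolized to an active compound whose free concentration inhibits
    # tumor proliferation. All rate constants and doses are invented.

    def model(t, y, dose_rate):
        prodrug, metabolite, tumor = y
        k_met, k_elim = 0.8, 0.3          # 1/h: metabolization, elimination
        growth, kill, ec50 = 0.02, 0.05, 0.5
        inhibition = metabolite / (ec50 + metabolite)   # saturating effect
        d_prodrug = dose_rate - k_met * prodrug
        d_metabolite = k_met * prodrug - k_elim * metabolite
        d_tumor = (growth * tumor * (1.0 - inhibition)
                   - kill * tumor * inhibition)
        return [d_prodrug, d_metabolite, d_tumor]

    sol = solve_ivp(model, (0.0, 240.0), y0=[0.0, 0.0, 1.0], args=(0.1,))
    print(f"relative tumor size after 10 days: {sol.y[2, -1]:.3f}")
    ```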

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. [Figure 3: Number of events per month (data)] In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  15. The European Photovoltaic Technology Platform

    International Nuclear Information System (INIS)

    Nowak, S.; Aulich, H.; Bal, J.L.; Dimmler, B.; Garnier, A.; Jongerden, G.; Luther, J.; Luque, A.; Milner, A.; Nelson, D.; Pataki, I.; Pearsall, N.; Perezagua, E.; Pietruszko, S.; Rehak, J.; Schellekens, E.; Shanker, A.; Silvestrini, G.; Sinke, W.; Willemsen, H.

    2006-05-01

    The European Photovoltaic Technology Platform is one of the European Technology Platforms, a new instrument proposed by the European Commission. European Technology Platforms (ETPs) are a mechanism to bring together all interested stakeholders to develop a long-term vision to address a specific challenge, create a coherent, dynamic strategy to achieve that vision, and steer the implementation of an action plan to deliver agreed programmes of activities and optimise the benefits for all parties. The European Photovoltaic Technology Platform has recently been established to define, support, and accompany the implementation of a coherent and comprehensive strategic plan for photovoltaics. The platform will mobilise all stakeholders sharing a long-term European vision for PV, helping to ensure that Europe maintains and improves its industrial position. The platform will realise a European Strategic Research Agenda for PV for the next decade(s). Guided by a Steering Committee of 20 high-level decision-makers representing all relevant European PV stakeholders, the European PV Technology Platform comprises four Working Groups dealing with policy and instruments; market deployment; science, technology and applications; and developing countries, and is supported by a secretariat

  16. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery.

    Science.gov (United States)

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-01-01

    Computer-assisted preoperative simulation is employed to plan and interact with 3D images during orthognathic procedures. It is useful for positioning and fixation of the maxilla by a plate. We report a case of maxillary retrusion due to bilateral cleft lip and palate, in which a 2-stage orthognathic procedure (maxillary advancement by distraction technique and mandibular setback surgery) was performed following computer-assisted preoperative simulation planning to achieve the positioning and fixation of the plate. A high accuracy was achieved in the present case. A 21-year-old male patient presented to our department with a complaint of maxillary retrusion following bilateral cleft lip and palate. Computer-assisted preoperative simulation with a 2-stage orthognathic procedure using distraction technique and mandibular setback surgery was planned. The preoperative planning of the procedure resulted in good aesthetic outcomes. The error of the maxillary position was less than 1 mm. The implementation of the computer-assisted preoperative simulation for the positioning and fixation of the plate in a 2-stage orthognathic procedure using distraction technique and mandibular setback surgery yielded good results. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  17. Positive and Negative Aspects of the IWB and Tablet Computers in the First Grade of Primary School: A Multiple-Perspective Approach

    Science.gov (United States)

    Fekonja-Peklaj, Urška; Marjanovic-Umek, Ljubica

    2015-01-01

    The aim of this qualitative study was to evaluate the positive and negative aspects of the interactive whiteboard (IWB) and tablet computers use in the first grade of primary school from the perspectives of three groups of evaluators, namely the teachers, the pupils and an independent observer. The sample included three first grade classes with…

  18. Assessment of the effective dose in supine, prone, and oblique positions in the maxillofacial region using a novel combined extremity and maxillofacial cone beam computed tomography scanner

    NARCIS (Netherlands)

    Koivisto, J.; Wolff, J.; Järnstedt, J.; Dastidar, P.; Kortesniemi, M.

    2014-01-01

    Objective The objectives of this study were to assess the organ and effective doses (International Commission on Radiological Protection [ICRP] 103 standard) resulting from supine, prone, and oblique phantom positions in the maxillofacial region using a novel cone beam computed tomography (CBCT)

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  20. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  1. Parallel application of plasma equilibrium fitting based on inhomogeneous platforms

    International Nuclear Information System (INIS)

    Liao Min; Zhang Jinhua; Chen Liaoyuan; Li Yongge; Pan Wei; Pan Li

    2008-01-01

    An online analysis and online display platform based on the equilibrium-fitting code EFIT is introduced in this paper. The application realizes large data transfers between inhomogeneous platforms through a communication mechanism built on sockets. Equilibrium-fitting reconstruction is completed in approximately one minute, using a finite state machine to coordinate the management node and several compute nodes of a cluster for the parallel computation; this satisfies the requirement of online display during the discharge interval. An effective communication model between inhomogeneous platforms is provided, which transports the computing results from the Linux platform to the Windows platform for online analysis and display. (authors)
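
    A minimal sketch of the cross-platform result transfer described above, assuming a TCP socket carrying a length-prefixed, little-endian binary block from the Linux compute cluster to the Windows display host (function names and framing are illustrative, not the authors' protocol):

        import socket
        import struct
        import numpy as np

        def _recv_exact(conn, nbytes):
            """Read exactly nbytes from a socket (TCP may deliver in pieces)."""
            buf = b""
            while len(buf) < nbytes:
                chunk = conn.recv(nbytes - len(buf))
                if not chunk:
                    raise ConnectionError("sender closed early")
                buf += chunk
            return buf

        def send_result(host, port, result):
            # Fix the byte order (little-endian float64) so the Linux sender and
            # the Windows display host agree on the wire format.
            payload = np.asarray(result, dtype="<f8").tobytes()
            with socket.create_connection((host, port)) as sock:
                sock.sendall(struct.pack("<I", len(payload)) + payload)

        def recv_result(port):
            # One-shot blocking receiver for the display side (any OS).
            with socket.create_server(("", port)) as server:
                conn, _ = server.accept()
                with conn:
                    (nbytes,) = struct.unpack("<I", _recv_exact(conn, 4))
                    return np.frombuffer(_recv_exact(conn, nbytes), dtype="<f8")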

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  8. Survey on the Priority Factors Influencing IT Outsourcing in the Platform of Cloud Computing in Semnan Province Universities by Fuzzy DEMATEL Technique

    Directory of Open Access Journals (Sweden)

    Mir Saeed Shafaee Tonekaboni

    2015-06-01

    Full Text Available Today, Information Technology Outsourcing (ITO) has developed and caused organizations to become more flexible and dynamic than they were before. The ever-increasing development of ICT is one of the main reasons for the promotion of outsourcing, since many organizations are not able to adapt their hardware and software to the fast-paced development of technology. Cloud Computing is considered one of the newest paradigms in ITO. Due to its flexible nature, this paradigm has been able to protect organizations against extreme changes of IT during recent years. Furthermore, by better understanding their needs and prioritizing them, organizations can experience more successful outsourcing in the context of Cloud Computing. The Service Measurement Index (SMI) can be a considerable help in identifying an organization's needs in using Cloud Computing. This article suggests that organizations use the Fuzzy DEMATEL technique to prioritize their needs. In this research, which is conducted as a case study, all the universities in Semnan Province are examined. The results show that the most important criteria for outsourcing in the context of Cloud Computing are Compliance, Operability and Contracting Experience, in that order. Moreover, the model identified Security Management, Ownership and Contracting Experience as the most influential criteria and Learnability, Maintainability and Recoverability as the most affected ones.
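
    The crisp core of the DEMATEL arithmetic that such a prioritization rests on can be sketched in a few lines; in the fuzzy variant the survey answers are triangular fuzzy numbers that are defuzzified before this step (the 3-criterion influence matrix below is invented purely for illustration):

        import numpy as np

        def dematel(direct):
            """Crisp DEMATEL core: prominence (R+C) and relation (R-C) per criterion."""
            # Normalize so that no row or column sum exceeds 1.
            s = max(direct.sum(axis=1).max(), direct.sum(axis=0).max())
            n = direct / s
            # Total-relation matrix T = N (I - N)^-1 accumulates indirect influence.
            t = n @ np.linalg.inv(np.eye(len(direct)) - n)
            r, c = t.sum(axis=1), t.sum(axis=0)
            return r + c, r - c

        # Invented 3-criterion influence survey (0 = none .. 4 = very high).
        d = np.array([[0, 3, 2],
                      [1, 0, 4],
                      [2, 1, 0]], dtype=float)
        prominence, relation = dematel(d)
        print(prominence, relation)  # relation > 0 marks a "cause" criterion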

  9. What it takes to understand and cure a living system: computational systems biology and a systems biology-driven pharmacokinetics-pharmacodynamics platform

    NARCIS (Netherlands)

    Swat, Maciej; Kiełbasa, Szymon M.; Polak, Sebastian; Olivier, Brett; Bruggeman, Frank J.; Tulloch, Mark Quinton; Snoep, Jacky L.; Verhoeven, Arthur J.; Westerhoff, Hans V.

    2011-01-01

    The utility of model repositories is discussed in the context of systems biology (SB). It is shown how such repositories, and in particular their live versions, can be used for computational SB: we calculate the robustness of the yeast glycolytic network with respect to perturbations of one of its

  10. Platform treatment

    International Nuclear Information System (INIS)

    Jammazi, Rochdi; M'henni, Wafa

    2008-01-01

    Our project consists of designing a test platform for radioactive treatment in the research unit of the CNSTN. First, a complete functional analysis is carried out; it includes a comparative study between our work and a design already established in a preparatory project and highlights the failures revealed by the old design. Second, a metallurgical study, which must lead to the choice of materials, is carried out in turn. Third, the design, with all details and drawings, is produced using the CATIA software. This design is accompanied by the dimensioning of the various components and the calculation of the various stress states to which the parts of the system are subjected: buckling and bending. Fourth, an automation study is carried out; it covers the automation of the system using the SIMATIC STEP 7 software and the choice of the various position and velocity sensors. Finally, we present the maintenance and safety instructions for the system and for the operator working in this research unit. (Author)

  11. Review: Joachim R. Höflich (2003. Mensch, Computer und Kommunikation. Theoretische Verortungen und empirische Befunde [Man, Computer, Communication. Theoretical Positions and Empirical Findings

    Directory of Open Access Journals (Sweden)

    Jan Schmidt

    2004-05-01

    Full Text Available Joachim R. HÖFLICH presents a theory of the institutionalization of computer-mediated communication that centers on the user and his/her expectations. "Computer frames", consisting of rules and routines for the appropriate use of a medium and its applications as a tool for information, public discussion or interpersonal communication, structure the single usage episodes as well as the users' expectations. Drawing on a variety of data on the development of the Newspaper-Mailbox "Augsburg Newsline" in the Mid-Nineties, HÖFLICH demonstrates the usefulness of his conceptual framework for empirical analysis. His book is, therefore, a valuable contribution to the field of online research in social and communication science alike. URN: urn:nbn:de:0114-fqs040297

  12. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    Energy Technology Data Exchange (ETDEWEB)

    Matenine, D; Cote, G; Mascolo-Fortin, J [Universite Laval, Quebec, QC (Canada); Goussard, Y [Ecole Polytechnique de Montreal, Montreal, QC (Canada); Despres, P [Universite Laval, Quebec, QC (Canada); Departement de radio-oncologie and Centre de recherche du CHU de Quebec, Quebec, QC (Canada)

    2016-06-15

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons’ trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon’s algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically, for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66 % of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, in the order of 1 %. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger
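
    The on-demand character of this ray tracing is easiest to see in two dimensions; a minimal CPU sketch of a Siddon-style traversal follows (the paper's GPU version with symmetry reuse is far more elaborate; names are illustrative):

        import numpy as np

        def siddon_row(p0, p1, n, spacing):
            """One system-matrix row, computed on demand: path lengths of the ray
            p0 -> p1 through an n x n pixel grid with the given pixel spacing."""
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            d = p1 - p0
            alphas = [0.0, 1.0]  # ray parametrized as p0 + a*d, a in [0, 1]
            for axis in range(2):
                if d[axis] != 0.0:
                    planes = spacing * np.arange(n + 1)      # grid-line coordinates
                    a = (planes - p0[axis]) / d[axis]
                    alphas.extend(a[(a > 0.0) & (a < 1.0)])  # crossings inside the ray
            alphas = np.unique(alphas)
            ray_len = np.hypot(*d)
            lengths = {}
            for a0, a1 in zip(alphas[:-1], alphas[1:]):
                mid = p0 + 0.5 * (a0 + a1) * d               # midpoint picks the pixel
                i, j = (mid // spacing).astype(int)
                if 0 <= i < n and 0 <= j < n:
                    lengths[(i, j)] = lengths.get((i, j), 0.0) + (a1 - a0) * ray_len
            return lengths  # sparse row: {pixel index: intersection length}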

  13. SU-D-206-02: Evaluation of Partial Storage of the System Matrix for Cone Beam Computed Tomography Using a GPU Platform

    International Nuclear Information System (INIS)

    Matenine, D; Cote, G; Mascolo-Fortin, J; Goussard, Y; Despres, P

    2016-01-01

    Purpose: Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersections between the photons’ trajectories and the object, also called ray-tracing or system matrix computation. This work evaluates different ways to store the system matrix, aiming to reconstruct dense image grids in reasonable time. Methods: We propose an optimized implementation of Siddon’s algorithm using graphics processing units (GPUs) with a novel data storage scheme. The algorithm computes a part of the system matrix on demand, typically, for one projection angle. The proposed method was enhanced with accelerating options: storage of larger subsets of the system matrix, systematic reuse of data via geometric symmetries, an arithmetic-rich parallel code and code configuration via machine learning. It was tested on geometries mimicking a cone beam CT acquisition of a human head. To realistically assess the execution time, the ray-tracing routines were integrated into a regularized Poisson-based reconstruction algorithm. The proposed scheme was also compared to a different approach, where the system matrix is fully pre-computed and loaded at reconstruction time. Results: Fast ray-tracing of realistic acquisition geometries, which often lack spatial symmetry properties, was enabled via the proposed method. Ray-tracing interleaved with projection and backprojection operations required significant additional time. In most cases, ray-tracing was shown to use about 66 % of the total reconstruction time. In absolute terms, tracing times varied from 3.6 s to 7.5 min, depending on the problem size. The presence of geometrical symmetries allowed for non-negligible ray-tracing and reconstruction time reduction. Arithmetic-rich parallel code and machine learning permitted a modest reconstruction time reduction, in the order of 1 %. Conclusion: Partial system matrix storage permitted the reconstruction of higher 3D image grid sizes and larger

  14. Is there a computable upper bound for the height of a solution of a Diophantine equation with a unique solution in positive integers?

    Directory of Open Access Journals (Sweden)

    Tyszka Apoloniusz

    2017-03-01

    Full Text Available Let Bn = {xi · xj = xk : i, j, k ∈ {1, . . . , n}} ∪ {xi + 1 = xk : i, k ∈ {1, . . . , n}} denote the system of equations in the variables x1, . . . , xn. For a positive integer n, let ξ(n) denote the smallest positive integer b such that for each system of equations S ⊆ Bn with a unique solution in positive integers x1, . . . , xn, this solution belongs to [1, b]^n. Let g(1) = 1, and let g(n + 1) = 2^(2^(g(n))) for every positive integer n. We conjecture that ξ(n) ≤ g(2n) for every positive integer n. We prove: (1) the function ξ : N \ {0} → N \ {0} is computable in the limit; (2) if a function f : N \ {0} → N \ {0} has a single-fold Diophantine representation, then there exists a positive integer m such that f(n) < ξ(n) for every n ≥ m; (3) the conjecture implies that there exists an algorithm which takes as input a Diophantine equation D(x1, . . . , xp) = 0 and returns a positive integer d with the following property: for all positive integers a1, . . . , ap, if the tuple (a1, . . . , ap) solely solves the equation D(x1, . . . , xp) = 0 in positive integers, then a1, . . . , ap ≤ d; (4) the conjecture implies that if a set M ⊆ N has a single-fold Diophantine representation, then M is computable; (5) for every integer n > 9, the inequality ξ(n) < (2^(2^(n−5)) − 1)^(2^(n−5)) + 1 implies that 2^(2^(n−5)) + 1 is composite.
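
    For intuition only, ξ can be probed by brute force for very small n; the sketch below can certify uniqueness only inside a finite search box, so it merely suggests values such as ξ(2), and the doubly exponential conjectured growth makes anything beyond tiny n hopeless:

        from itertools import combinations, product

        def equations(n):
            # All members of B_n: x_i * x_j = x_k and x_i + 1 = x_k (0-based indices).
            muls = [("mul", i, j, k) for i in range(n) for j in range(i, n) for k in range(n)]
            sucs = [("suc", i, i, k) for i in range(n) for k in range(n)]
            return muls + sucs

        def holds(eq, x):
            kind, i, j, k = eq
            return x[i] * x[j] == x[k] if kind == "mul" else x[i] + 1 == x[k]

        def max_unique_height(n, box):
            # Max height over systems S with a *box-unique* solution; true xi(n)
            # demands uniqueness over all positive integers, which no finite
            # search can certify.
            best = 0
            eqs = equations(n)
            for r in range(1, len(eqs) + 1):
                for system in combinations(eqs, r):
                    sols = [x for x in product(range(1, box + 1), repeat=n)
                            if all(holds(e, x) for e in system)]
                    if len(sols) == 1:
                        best = max(best, max(sols[0]))
            return best

        print(max_unique_height(2, 8))  # -> 2 within this box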

  15. Platform pricing in matching markets

    NARCIS (Netherlands)

    Goos, M.; van Cayseele, P.; Willekens, B.

    2011-01-01

    This paper develops a simple model of monopoly platform pricing accounting for two pertinent features of matching markets. 1) The trading process is characterized by search and matching frictions implying limits to positive cross-side network effects and the presence of own-side congestion.

  16. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  17. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  20. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. Galileo positioning technology

    CERN Document Server

    Lohan, Elena; Sand, Stephan; Hurskainen, Heikki

    2015-01-01

    This book covers multi-band Galileo receivers (especially E1-E5 bands of Galileo) and addresses all receiver building blocks, from the antenna and front end, through details of the baseband receiver processing blocks, up to the navigation processing, including the Galileo message structure and Position, Velocity, Time (PVT) computation. Moreover, hybridization solutions with communications systems for improved localization are discussed and an open-source GNSS receiver platform (available for download) developed at Tampere University of Technology (TUT) is addressed in detail. • Takes a holistic approach to GALILEO and related systems, such as EGNOS and hybrid solutions on mobile phones; • Provides an invaluable reference to Binary Offset Carrier modulations and related families, which are some of the trademarks of GALILEO; • Includes a detailed survey of GALILEO receiver research in Europe and existing software-defined radio (SDR) GALILEO receiver implementations; • Addresses the multiple challen...
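
    The PVT computation such receivers end with is the classic least-squares fix from pseudoranges; a generic sketch of that step (standard GNSS math, not code from the book; at least four satellites are assumed):

        import numpy as np

        def solve_pvt(sat_pos, pseudoranges, iters=10):
            """Gauss-Newton PVT fix: receiver position (m) and clock bias (m).

            Model: rho_i = ||sat_i - p|| + b. Needs >= 4 satellites; sat_pos is
            an (m, 3) array of satellite ECEF positions."""
            x = np.zeros(4)  # [px, py, pz, b], starting from the Earth's centre
            for _ in range(iters):
                diff = sat_pos - x[:3]
                ranges = np.linalg.norm(diff, axis=1)
                residuals = pseudoranges - (ranges + x[3])
                # Jacobian rows: [-unit vector towards the satellite, 1]
                H = np.hstack([-diff / ranges[:, None], np.ones((len(diff), 1))])
                dx, *_ = np.linalg.lstsq(H, residuals, rcond=None)
                x += dx
                if np.linalg.norm(dx) < 1e-4:
                    break
            return x[:3], x[3]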

  3. Reaction of formaldehyde at the ortho- and para-positions of phenol: exploration of mechanisms using computational chemistry.

    Science.gov (United States)

    Anthony H. Conner; Melissa S. Reeves

    2001-01-01

    Computational chemistry methods can be used to explore the theoretical chemistry behind reactive systems, to compare the relative chemical reactivity of different systems, and, by extension, to predict the reactivity of new systems. Ongoing research has focused on the reactivity of a wide variety of phenolic compounds with formaldehyde using semi-empirical and ab...

  4. Cone Beam Computed Tomographic Analyses of the Position and Course of the Mandibular Canal: Relevance to the Sagittal Split Ramus Osteotomy

    Directory of Open Access Journals (Sweden)

    Ahmet Ercan Sekerci

    2014-01-01

    Full Text Available Purpose. The aim of this study was to document the position and course of the mandibular canal through the region of the mandibular angle and body in dental patients, using cone beam computed tomographic imaging. Methods. The position and course of the mandibular canal from the region of the third molar to the first molar were measured at five specific locations in the same plane: at three different positions just between the first and second molars; between the second and third molars; and just distal to the third molar. Results. The study sample was composed of 500 hemimandibles from 250 dental patients with a mean age of 26.32. Significant differences were found between genders, distances, and positions. B decreased significantly from the anterior positions to the posterior positions in both females and males. The mean values of S and CB increased significantly from the posterior positions to the anterior positions in both females and males. Conclusion. Because the sagittal split ramus osteotomy is a technically difficult procedure, we hope that the findings of the present study will help the surgeon in choosing the safest surgical technique for the treatment of mandibular deformities.

  5. Development of a computer-assisted system for model-based condylar position analysis (E-CPM).

    Science.gov (United States)

    Ahlers, M O; Jakstat, H

    2009-01-01

    Condylar position analysis is a measuring method for the three-dimensional quantitative acquisition of the position of the mandible in different conditions or at different points in time. Originally, the measurement was done based on a model, using special mechanical condylar position measuring instruments, and on a research scale with mechanical-electronic measuring instruments. Today, as an alternative, it is possible to take measurements with electronic measuring instruments applied directly to the patient. The computerization of imaging has also facilitated condylar position measurement by means of three-dimensional data records obtained by imaging examination methods, which has been used in connection with the simulation and quantification of surgical operation results. However, the comparative measurement of the condylar position at different points in time has so far not been possible to the required degree. An electronic measuring instrument, allowing acquisition of the condylar position in clinical routine and facilitating calibration with measurements from later examinations by data storage and use of precise equalizing systems, was therefore designed by the present authors. This measuring instrument was implemented on the basis of already existing components from the Reference CPM and Cadiax Compact articulator and registration systems (Gamma Dental, Klosterneuburg, Austria) as well as the matching CMD3D evaluation software (dentaConcept, Hamburg).
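
    The core of a model-based comparison of condylar positions between two points in time is a rigid landmark fit; a minimal Kabsch-style sketch (illustrative only, not the E-CPM implementation), assuming both landmark sets are expressed in the same articulator reference frame:

        import numpy as np

        def rigid_displacement(ref, later):
            """Kabsch fit of two (N, 3) landmark sets: later ~ R @ ref + t.

            Returns the rotation R, translation t and per-landmark residuals,
            i.e. the rigid displacement of the condyle between two records."""
            ca, cb = ref.mean(axis=0), later.mean(axis=0)
            H = (ref - ca).T @ (later - cb)
            U, _, Vt = np.linalg.svd(H)
            sign = np.sign(np.linalg.det(Vt.T @ U.T))
            R = Vt.T @ np.diag([1.0, 1.0, sign]) @ U.T  # avoid a reflection
            t = cb - R @ ca
            residual = np.linalg.norm(ref @ R.T + t - later, axis=1)
            return R, t, residual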

  6. A Study on the Effectiveness of the Manufacture of Compensator and Setup Position for Total Body Irradiation Using Computed Tomography-simulator's Images

    International Nuclear Information System (INIS)

    Lee, Woo Suk; Kim, Dae Sup; Park, Seong Ho; Yun, In Ha; Back, Geum Mun; Kim, Jeong Man

    2005-01-01

    A computed tomography-simulator was used for body measurement and for the compensator manufacturing process in TBI positioning, and its usefulness was evaluated. The patient was imaged in the lying position with the computed tomography-simulator. The images were transmitted to Somavision, and the body measurement points were measured on the images; skin was measured directly, and the images were used for measurements of the lungs. The thickness of the compensator was decided from the values measured on the images, and the position of the compensator was decided and confirmed on the images as well. Finally, doses were measured with TLDs in the treatment room. For body thickness at the measurement points, differences of 1-2 cm were found between direct measurement and image measurement, and differences of 3-4 cm for body length. The first drawing of the compensator could also be produced from the images. Dose measurements using TLDs were 92-98% for the head, neck, axilla and chest (including the lungs), and 102-109% for the abdomen, pelvis, inguinal and feet regions. The use of computed tomography-simulator images was useful for TBI positioning. Differences at the body thickness measurement points were not significant, while length measurements were obtained more reliably. The manufacture of various compensators that take body density into account is feasible if the images are used; the compensator could be positioned exactly and produced easily once its shape was determined. Overall positioning time in the treatment room could be shortened by 15 minutes, and compensator manufacturing time was reduced by about 15 minutes.

  7. Image findings of a false positive radioactive iodine-131 uptake mimicking metastasis in pulmonary aspergillosis identified on single photon emission computed tomography-computed tomography

    Directory of Open Access Journals (Sweden)

    Kamaleshwaran Koramadai Karuppusamy

    2015-01-01

    Full Text Available High doses of iodine-131 are commonly used in patients with differentiated thyroid cancer after total or subtotal thyroidectomy, in order to ablate the remaining cancer or normal thyroid tissue. Many different false-positive scans can occur in the absence of residual thyroid tissue or metastases. The authors present a case of abnormal uptake of radioactive iodine in an aspergilloma, potentially masquerading as pulmonary metastases.

  8. Implementation of double-C-arm synchronous real-time X-ray positioning system computer aided for aspiration biopsy of small lung lesion

    International Nuclear Information System (INIS)

    Zhu Hong; Wang Dong; Ye Yukun; Zhou Yuan; Lu Jianfeng; Yang Jingyu; Wang Lining

    2007-01-01

    Objective: To evaluate the feasibility of a new type of real-time three-dimensional X-ray positioning system for aspiration biopsy of small lung lesions. Methods: Using X-ray imaging and X-ray collimator technologies combined with a double-C-arm X-ray machine, two different synchronous real-time images were obtained in the vertical and horizontal planes. Then, with computer image processing and computer vision techniques, dynamic tracking of the 3D position of a pulmonary lesion and of the needle during aspiration, as well as of their relative position, was established. Results: There was no interference where the two perpendicular imaging X-ray beams met; synchronous real-time acquisition of the two images and tracking of the lung lesion and the needle could be completed during free respiration. The average positioning error of the system was about 0.5 mm, the largest positioning error was about 1.0 mm, and the real-time display rate was 5 frames/sec. Conclusions: The establishment of a new type of double-C-arm synchronous real-time X-ray positioning system is feasible. It allows fast and accurate aspiration biopsy of small lung lesions. (authors)
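
    The geometric idea of fusing two perpendicular views can be sketched under idealized assumptions; the toy model below treats both beams as parallel-ray projections with co-registered axes, whereas the real system must intersect calibrated perspective rays:

        import numpy as np

        def locate_3d(ap_xy, lat_yz):
            """Fuse one vertical-beam (AP) and one horizontal-beam (lateral) view.

            ap_xy:  (x, y) of the target in the AP image.
            lat_yz: (y, z) of the same target in the lateral image.
            Parallel rays and co-registered axes are assumed; the shared y axis
            is seen twice, and the disagreement approximates the tracking error."""
            x, y_ap = ap_xy
            y_lat, z = lat_yz
            point = np.array([x, 0.5 * (y_ap + y_lat), z])
            return point, abs(y_ap - y_lat)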

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which is added to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  10. Retroperitoneal Endometriosis: A Possible Cause of False Positive Finding at 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography

    International Nuclear Information System (INIS)

    Maffione, Anna Margherita; Panzavolta, Riccardo; Lisato, Laura Camilla; Ballotta, Maria; D'Isanto, Mariangela Zanforlini; Rubello, Domenico

    2015-01-01

    Endometriosis is a frequent and clinically relevant problem in young women. Laparoscopy is still the gold standard for the diagnosis of endometriosis, but frequently both morphologic and functional imaging techniques are involved in the diagnostic course before achieving a conclusive diagnosis. We present a case of a patient affected by infiltrating retroperitoneal endometriosis falsely interpreted as a malignant mass by contrast-enhanced magnetic resonance imaging and 18F-fluorodeoxyglucose positron emission tomography/computed tomography

  11. Computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors

    OpenAIRE

    Hirano, Akihiro; Nakayama, Kenji

    2008-01-01

    This paper presents a computationally efficient implementation of sparse-tap FIR adaptive filters with tap-position control on Intel IA-32 processors with single-instruction multiple-data (SIMD) capability. In order to overcome the random-order memory access which prevents vectorization, block-based processing and a re-ordering buffer are introduced. A dynamic register allocation and the use of memory-to-register operations help maximize the loop-unrolling level. Up to 66 percent speedup ...
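
    A sketch of the underlying sparse-tap adaptive filter helps locate where vectorization is hard: the gather buf[pos] below is exactly the random-order access the paper reorganizes, while the tap-position control rule here is a stand-in heuristic, not the authors' algorithm:

        import numpy as np

        def sparse_nlms(x, d, total_taps=256, active=32, mu=0.5, eps=1e-6):
            """Sparse-tap NLMS: adapt only `active` of `total_taps` delays."""
            rng = np.random.default_rng(0)
            pos = np.sort(rng.choice(total_taps, size=active, replace=False))
            w = np.zeros(active)
            buf = np.zeros(total_taps)
            y = np.zeros(len(x))
            for n in range(len(x)):
                buf = np.roll(buf, 1)
                buf[0] = x[n]
                u = buf[pos]                      # gather: the random-order access at issue
                y[n] = w @ u
                e = d[n] - y[n]
                w += mu * e * u / (u @ u + eps)   # normalized LMS step
                if n % 100 == 99:                 # crude tap-position control
                    weakest = int(np.argmin(np.abs(w)))
                    free = np.setdiff1d(np.arange(total_taps), pos)
                    pos[weakest], w[weakest] = rng.choice(free), 0.0
                    order = np.argsort(pos)
                    pos, w = pos[order], w[order]
            return y, pos, w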

  12. A study of the variation of colonic positioning in the pararenal space as shown by computed tomography

    International Nuclear Information System (INIS)

    Prassopoulos, P.; Gourtsoyiannis, N.; Cavouras, D.; Pantelidis, N.

    1990-01-01

    In a review of 1708 consecutive CT examinations of the abdomen, the position of the ascending and descending colon in relation to the posterior and lateral edge of the kidney was studied. It was found that part of the colon was positioned posterior or posterolateral to the kidney's edge in percentages that varied between 14.2% and 0.9% in the different sex groups at the levels of the upper, mid- and lower poles of the right and left kidney. It is concluded that this anatomical variation should be known if colon perforation is to be avoided during percutaneous nephrostomy or biopsy. (author). 15 refs.; 4 figs.; 2 tabs

  13. Genomics With Cloud Computing

    OpenAIRE

    Sukhamrit Kaur; Sandeep Kaur

    2015-01-01

    Abstract Genomics is the study of the genome, which provides large amounts of data requiring large storage and computation power. These issues are solved by cloud computing, which provides various cloud platforms for genomics. These platforms provide many services to users, like easy access to data, easy sharing and transfer, storage in hundreds of terabytes, and more computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Various features of cloud computin...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  15. Recurrent proliferating trichilemmal tumor with malignant change on F-18 fluorodeoxyglucose positron emission tomography/computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Eun Ha; Kim, Eun Ha; Kim, Young Jun; Yoo, Seol Bong; Nam, Kyung Hwa [Presbyterian Medical Center, Seonam University College of Medicine, Jeonju (Korea, Republic of)

    2016-06-15

    F-18 fluorodeoxyglucose (FDG) positron emission tomography/computed tomography scan has been used for the diagnosis, assessment of treatment response, and follow-up of various neoplasms. Proliferating trichilemmal cyst or tumor (PTT) is a rare neoplasm, originating from the outer root sheath of a hair follicle. Because this tumor has unpredictable biological and clinical behavior, long-term clinical follow-up is necessary to detect metastasis or recurrence. We report a case of recurrent malignant PTT on the scalp that showed increased FDG uptake.

  16. Design and Implementation of Cloud Platform for Intelligent Logistics in the Trend of Intellectualization

    Institute of Scientific and Technical Information of China (English)

    Mengke Yang; Movahedipour Mahmood; Xiaoguang Zhou; Salam Shafaq; Latif Zahid

    2017-01-01

    Intellectualization has become a new trend for the telecom industry, driven by intelligent technology including cloud computing, big data, and the Internet of Things. In order to satisfy the service demand of intelligent logistics, this paper designed an intelligent logistics platform containing the main applications such as e-commerce, self-service transceivers, big data analysis, path location and distribution optimization. The intelligent logistics service platform has been built based on cloud computing to collect, store and handle multi-source heterogeneous mass data from sensors, RFID electronic tags, vehicle terminals and APPs, so that open-access cloud services including distribution, positioning, navigation, scheduling and other data services can be provided for logistics distribution applications. The architecture of the intelligent logistics cloud platform, containing a software layer (SaaS), a platform layer (PaaS) and infrastructure (IaaS), has then been constructed in accordance with the core technologies of high-concurrency processing, heterogeneous terminal data access, encapsulation and data mining. Intelligent logistics can therefore be carried out in this cloud-service mode, accelerating the construction of a symbiotic win-win logistics ecological system and the benign development of the ICT industry in the trend of intellectualization in China.

  17. How does arm positioning of polytraumatized patients in the initial computed tomography (CT) affect image quality and diagnostic accuracy?

    International Nuclear Information System (INIS)

    Kahn, Johannes; Grupp, Ulrich; Maurer, Martin

    2014-01-01

    Purpose: To evaluate the influence of different arm positions on abdominal image quality during initial whole-body CT (WBCT) in polytraumatized patients and to assess the risk of missing potentially life-threatening injuries due to arm artifacts. Materials and methods: Between July 2011 and February 2013, WBCT scans of 203 patients with arms in the abdominal area during initial WBCT were analyzed. Six different arms-down positions were defined: patients with both (group A)/one arm(s) (group B) down alongside the torso, patients with both (group C)/one arm(s) (group D) crossed in front of the upper abdomen, patients with both (group E)/one arm(s) (group F) crossed in front of the pelvic area. A group of 203 patients with elevated arms beside the head served as a control group. Two observers jointly evaluated image quality of different organ regions using a 4-point scale system. Follow-up examinations (CT scans and/or ultrasound) were analyzed to identify findings missed during initial WBCT due to reduced image quality. Results: Image quality for most of the organ regions analyzed was found to be significantly different among all groups (p < 0.05). Image quality was most severely degraded in group A, followed by groups E and C. Positioning with one arm up resulted in significantly better image quality than both arms down (p < 0.05). Overall, arms-up positioning showed significantly better image quality than arms-down positions (p < 0.05). In one case, a liver hemorrhage missed in the initial WBCT because of arm artifacts was revealed by follow-up CT. Conclusion: In WBCT, arms-down positioning significantly degrades abdominal image quality, and artifacts might even conceal potentially life-threatening injuries. If the patient's status does not allow elevation of both arms, image quality can benefit from raising at least one arm. Otherwise, arms should be placed in front of the upper abdomen instead of alongside the torso

  18. Reconfigurable microfluidic platform in ice

    OpenAIRE

    Varejka, M.

    2008-01-01

    Microfluidic devices are popular tools in the biotechnology industry, where they provide smaller reagent requirements, high speed of analysis and the possibility for automation. The aim of the project is to make a flexible, biocompatible microfluidic platform adapted to different specific applications, mainly analytical and separation applications, whose parameters and configuration can be changed multiple times by changing the corresponding computer programme. The current project has been sup...

  19. Influence of modifications in the positioning of phantoms in the Monte Carlo computational simulation of the prostate brachytherapy

    International Nuclear Information System (INIS)

    Barbosa, Antonio Konrado de Santana; Vieira, Jose Wilson; Costa, Kleber Souza Silva; Lima, Fernando Roberto de Andrade

    2011-01-01

    Radiotherapy simulation procedures using Monte Carlo methods have been shown to be increasingly important to the improvement of cancer fighting strategies. Within this context, brachytherapy is one of the most used methods to ensure better life quality when compared to other therapeutic modalities. These procedures are planned with the use of sectional exams with the patient in the lying position. However, it is known that alteration of body posture after the procedure has an influence on the localization of many organs. This study aimed to identify and measure the influence of such alterations in MC brachytherapy simulations. To do so, prostate brachytherapy with the Iodine-125 radionuclide was chosen as the model. Simulations were carried out with 10^8 events using the EGSnrc code associated with the MASH phantom in orthostatic and supine positions. Significant alterations were found, especially regarding the bladder, small intestine and testicles. (author)

  20. A computational model clarifies the roles of positive and negative feedback loops in the Drosophila circadian clock

    International Nuclear Information System (INIS)

    Wang Junwei; Zhou Tianshou

    2010-01-01

    Previous studies showed that a single negative feedback structure should be sufficient for robust circadian oscillations. It is thus pertinent to ask why current cellular clock models almost universally have interlocked negative feedback loop (NFL) and positive feedback loop (PFL). Here, we propose a molecular model that reflects the essential features of the Drosophila circadian clock to clarify the different roles of negative and positive feedback loops. In agreement with experimental observations, the model can simulate circadian oscillations in constant darkness, entrainment by light-dark cycles, as well as phenotypes of per01 and clkJrk mutants. Moreover, sustained oscillations persist when the PFL is removed, implying the crucial role of NFL for rhythm generation. Through parameter sensitivity analysis, it is revealed that incorporation of PFL increases the robustness of the system to regulatory processes in PFL itself. Such reduced models can aid understanding of the design principles of circadian clocks in Drosophila and other organisms with complex transcriptional feedback structures.
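
    The claim that the negative loop alone suffices for rhythms is the classical Goodwin result; a minimal single-NFL oscillator sketch (a generic caricature with illustrative parameters, not the paper's Drosophila model):

        import numpy as np
        from scipy.integrate import solve_ivp

        def goodwin(t, y, hill=12, k=1.0, deg=0.2):
            # mRNA M -> protein P -> nuclear repressor R, which shuts M off.
            # A steep Hill exponent (classically > 8 for this 3-stage loop)
            # lets the single negative loop oscillate on its own.
            m, p, r = y
            dm = k / (1.0 + r**hill) - deg * m
            dp = m - deg * p
            dr = p - deg * r
            return [dm, dp, dr]

        sol = solve_ivp(goodwin, (0.0, 300.0), [0.1, 0.1, 0.1], max_step=0.5)
        # sol.y[0] settles into sustained oscillations: a free-running rhythm
        # from the NFL alone, with no positive loop in the model.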

  1. Three dimensional analysis of impacted maxillary third molars: A cone-beam computed tomographic study of the position and depth of impaction

    Energy Technology Data Exchange (ETDEWEB)

    De Andrade, Priscila Ferreira; Silva, Jesca Neftali Nogueira; Sotto-Maior, Bruno Sales; Devito, Karina Lopes; Assis, Neuza Maria Souza Picorelli [Faculty of Dentistry, Federal University of Juiz de Fora, Juiz de Fora (Brazil); Ribeiro, CleideGisele [Faculty of Medical and Health Sciences - SUPREMA, Juiz de Fora (Brazil)

    2017-09-15

    The classification of impacted maxillary third molars (IMTMs) facilitates interdisciplinary communication and helps estimate the degree of surgical difficulty. Thus, this study aimed to develop a 3-dimensional classification of the position and depth of impaction of IMTMs and to estimate their prevalence with respect to gender and age. This cross-sectional retrospective study analyzed images in sagittal and coronal cone-beam computed tomography (CBCT) sections of 300 maxillary third molars. The proposed classification was based on 3 criteria: buccolingual position (buccal, lingual, or central), mesial-distal position (mesioangular, vertical, or distoangular), and depth of impaction (low, medium, or high). CBCT images of IMTMs were classified, and the associations of the classifications with gender and age were examined using analysis of variance with the Scheffé post-hoc test. To determine the associations among the 3 classifications, the chi-square test was used (P<.05). No significant association of the classifications with gender was observed. Age showed a significant relationship with depth of impaction (P=.0001) and mesial-distal position (P=.005). The most common positions were buccal (n=222), vertical (n=184), and low (n=124). Significant associations among the 3 tested classifications were observed. CBCT enabled the evaluation of IMTMs in a 3-dimensional format, and we developed a proposal for a new classification of the position and depth of impaction of IMTMs.

  2. 3D computational mechanics elucidate the evolutionary implications of orbit position and size diversity of early amphibians.

    Directory of Open Access Journals (Sweden)

    Jordi Marcé-Nogué

    Full Text Available For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls is used as a case study because its orbits position and size vary hugely within the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models producing new FEA results, allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check if the orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades as lydekkerinids, trematosaurs, capitosaurs or metoposaurs.

  3. 3D Computational Mechanics Elucidate the Evolutionary Implications of Orbit Position and Size Diversity of Early Amphibians

    Science.gov (United States)

    Marcé-Nogué, Jordi; Fortuny, Josep; De Esteban-Trivigno, Soledad; Sánchez, Montserrat; Gil, Lluís; Galobart, Àngel

    2015-01-01

    For the first time in vertebrate palaeontology, the potential of joining Finite Element Analysis (FEA) and Parametrical Analysis (PA) is used to shed new light on two different cranial parameters from the orbits to evaluate their biomechanical role and evolutionary patterns. The early tetrapod group of Stereospondyls, one of the largest groups of Temnospondyls is used as a case study because its orbits position and size vary hugely within the members of this group. An adult skull of Edingerella madagascariensis was analysed using two different cases of boundary and loading conditions in order to quantify stress and deformation response under a bilateral bite and during skull raising. Firstly, the variation of the original geometry of its orbits was introduced in the models producing new FEA results, allowing the exploration of the ecomorphology, feeding strategy and evolutionary patterns of these top predators. Secondly, the quantitative results were analysed in order to check if the orbit size and position were correlated with different stress patterns. These results revealed that in most of the cases the stress distribution is not affected by changes in the size and position of the orbit. This finding supports the high mechanical plasticity of this group during the Triassic period. The absence of mechanical constraints regarding the orbit probably promoted the ecomorphological diversity acknowledged for this group, as well as its ecological niche differentiation in the terrestrial Triassic ecosystems in clades as lydekkerinids, trematosaurs, capitosaurs or metoposaurs. PMID:26107295

  4. Product Platform Performance

    DEFF Research Database (Denmark)

    Munk, Lone

    The aim of this research is to improve understanding of platform-based product development by studying platform performance in relation to internal effects in companies. Platform-based product development makes it possible to deliver product variety and at the same time reduce the needed resources...... engaging in platform-based product development. Similarly platform assessment criteria lack empirical verification regarding relevance and sufficiency. The thesis focuses on • the process of identifying and estimating internal effects, • verification of performance of product platforms, (i...... experienced representatives from the different life systems phase systems of the platform products. The effects are estimated and modeled within different scenarios, taking into account financial and real option aspects. The model illustrates and supports estimation and quantification of internal platform...

  5. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism.

    Science.gov (United States)

    Mih, Nathan; Brunk, Elizabeth; Bordbar, Aarash; Palsson, Bernhard O

    2016-07-01

    Progress in systems medicine brings promise to addressing patient heterogeneity and individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes which range in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof-of-concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. Using all-atom molecular dynamic simulations enables the sampling of long timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein's structure due to a mutation influences protein binding affinity to metabolites and/or drug molecules, and inflicts large-scale changes in metabolism.

  6. A Multi-scale Computational Platform to Mechanistically Assess the Effect of Genetic Variation on Drug Responses in Human Erythrocyte Metabolism.

    Directory of Open Access Journals (Sweden)

    Nathan Mih

    2016-07-01

    Full Text Available Progress in systems medicine brings promise to addressing patient heterogeneity and individualized therapies. Recently, genome-scale models of metabolism have been shown to provide insight into the mechanistic link between drug therapies and systems-level off-target effects while being expanded to explicitly include the three-dimensional structure of proteins. The integration of these molecular-level details, such as the physical, structural, and dynamical properties of proteins, notably expands the computational description of biochemical network-level properties and the possibility of understanding and predicting whole cell phenotypes. In this study, we present a multi-scale modeling framework that describes biological processes which range in scale from atomistic details to an entire metabolic network. Using this approach, we can understand how genetic variation, which impacts the structure and reactivity of a protein, influences both native and drug-induced metabolic states. As a proof-of-concept, we study three enzymes (catechol-O-methyltransferase, glucose-6-phosphate dehydrogenase, and glyceraldehyde-3-phosphate dehydrogenase) and their respective genetic variants which have clinically relevant associations. Using all-atom molecular dynamic simulations enables the sampling of long timescale conformational dynamics of the proteins (and their mutant variants) in complex with their respective native metabolites or drug molecules. We find that changes in a protein's structure due to a mutation influences protein binding affinity to metabolites and/or drug molecules, and inflicts large-scale changes in metabolism.

  7. Registration of planar bioluminescence to magnetic resonance and x-ray computed tomography images as a platform for the development of bioluminescence tomography reconstruction algorithms.

    Science.gov (United States)

    Beattie, Bradley J; Klose, Alexander D; Le, Carl H; Longo, Valerie A; Dobrenkov, Konstantine; Vider, Jelena; Koutcher, Jason A; Blasberg, Ronald G

    2009-01-01

    The procedures we propose make possible the mapping of two-dimensional (2-D) bioluminescence image (BLI) data onto a skin surface derived from a three-dimensional (3-D) anatomical modality [magnetic resonance (MR) or computed tomography (CT)] dataset. This mapping allows anatomical information to be incorporated into bioluminescence tomography (BLT) reconstruction procedures and, when applied using sources visible to both optical and anatomical modalities, can be used to evaluate the accuracy of those reconstructions. Our procedures, based on immobilization of the animal and a priori determined fixed projective transforms, should be more robust and accurate than previously described efforts, which rely on a poorly constrained retrospectively determined warping of the 3-D anatomical information. Experiments conducted to measure the accuracy of the proposed registration procedure found it to have a mean error of 0.36 ± 0.23 mm. Additional experiments highlight some of the confounds that are often overlooked in the BLT reconstruction process, and for two of these confounds, simple corrections are proposed.
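    The core operation of such a mapping, applying an a priori determined fixed projective transform to 3-D surface points and scoring the residual against observed 2-D image points, can be sketched as follows. The 3x4 matrix and the point sets are invented placeholders, not the calibration from the paper.

    ```python
    import numpy as np

    def project(P, pts3d):
        """Apply a 3x4 projective transform to Nx3 points; return Nx2 pixels."""
        homog = np.hstack([pts3d, np.ones((len(pts3d), 1))])
        proj = homog @ P.T
        return proj[:, :2] / proj[:, 2:3]

    # Illustrative fixed transform (would be determined once per camera setup).
    P = np.array([[800.0, 0.0, 320.0, 0.0],
                  [0.0, 800.0, 240.0, 0.0],
                  [0.0, 0.0, 1.0, 10.0]])

    surface = np.array([[0.1, 0.2, 5.0], [0.3, -0.1, 6.0], [-0.2, 0.0, 5.5]])
    observed = project(P, surface) + 0.3          # pretend measurement offset
    err = np.linalg.norm(project(P, surface) - observed, axis=1)
    print(f"mean registration error: {err.mean():.2f} +/- {err.std():.2f} px")
    ```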

  8. Mobile platform security

    CERN Document Server

    Asokan, N; Dmitrienko, Alexandra

    2013-01-01

    Recently, mobile security has garnered considerable interest in both the research community and industry due to the popularity of smartphones. The current smartphone platforms are open systems that allow application development, also for malicious parties. To protect the mobile device, its user, and other mobile ecosystem stakeholders such as network operators, application execution is controlled by a platform security architecture. This book explores how such mobile platform security architectures work. We present a generic model for mobile platform security architectures: the model illustrat

  9. Data Platforms and Cities

    DEFF Research Database (Denmark)

    Blok, Anders; Courmont, Antoine; Hoyng, Rolien

    2017-01-01

    This section offers a series of joint reflections on (open) data platform from a variety of cases, from cycling, traffic and mapping to activism, environment and data brokering. Data platforms play a key role in contemporary urban governance. Linked to open data initiatives, such platforms are of...

  10. Dynamic Gaming Platform (DGP)

    Science.gov (United States)

    2009-04-01

    Final report on the Dynamic Gaming Platform (DGP), Lockheed Martin Corporation, April 2009; period covered July 2007 - March 2009. Abbreviations: CMU, Carnegie Mellon University; DGP, Dynamic Gaming Platform; GA, Genetic Algorithm; IARPA, Intelligence Advanced Research Projects Activity; LM ATL, Lockheed Martin Advanced Technology Laboratories; PAINT, ProActive INTelligence.

  11. ITS Platform North Denmark

    DEFF Research Database (Denmark)

    Lahrmann, Harry; Agerholm, Niels; Juhl, Jens

    2012-01-01

    This paper presents the project entitled “ITS Platform North Denmark” which is used as a test platform for Intelligent Transportation System (ITS) solutions. The platform consists of a newly developed GNSS/GPRS On Board Unit (OBU) to be installed in 500 cars, a backend server and a specially...

  12. Optimisation and validation of a 3D reconstruction algorithm for single photon emission computed tomography by means of GATE simulation platform

    International Nuclear Information System (INIS)

    El Bitar, Ziad

    2006-12-01

    Although time consuming, Monte-Carlo simulations remain an efficient tool for assessing correction methods for degrading physical effects in medical imaging. We have optimized and validated a reconstruction method named F3DMC (Fully 3D Monte Carlo), in which the physical effects degrading the image formation process are modelled using Monte-Carlo methods and integrated within the system matrix. We used the Monte-Carlo simulation toolbox GATE. We validated GATE in SPECT by modelling the gamma-camera (Philips AXIS) used in clinical routine. Techniques of thresholding, filtering by principal component analysis and targeted reconstruction (functional regions, hybrid regions) were used in order to improve the precision of the system matrix and to reduce the number of simulated photons as well as the computation time required. The EGEE Grid infrastructure was used to deploy the GATE simulations in order to reduce their computation time. Results obtained with F3DMC were compared with the reconstruction methods FBP, ML-EM and MLEMC for a simulated phantom, and with the OSEM-C method for a real phantom. The results show that the F3DMC method and its variants improve the restoration of activity ratios and the signal-to-noise ratio. Through the use of the EGEE grid, a significant speed-up factor of about 300 was obtained. These results should be confirmed by studies on complex phantoms and patients, and open the door to a unified reconstruction method that could be used both in SPECT and in PET. (author)
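    The ML-EM algorithm that the record benchmarks against has a compact multiplicative update; a minimal sketch on a toy system matrix is given below. In F3DMC the system matrix A would itself be estimated by Monte-Carlo simulation rather than invented, as it is here.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """ML-EM image reconstruction: A is the system matrix
        (projections x voxels), y the measured projection data."""
        x = np.ones(A.shape[1])                 # flat initial estimate
        sens = A.sum(axis=0)                    # sensitivity image, A^T 1
        for _ in range(n_iter):
            forward = A @ x
            ratio = y / np.maximum(forward, 1e-12)
            x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
        return x

    rng = np.random.default_rng(0)
    A = rng.random((40, 16))                    # toy system matrix
    truth = rng.random(16)
    y = A @ truth                               # noiseless projections
    x = mlem(A, y, n_iter=200)
    print(f"max |x - truth| after 200 iterations: {np.abs(x - truth).max():.3f}")
    ```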

  13. A computational model clarifies the roles of positive and negative feedback loops in the Drosophila circadian clock

    Energy Technology Data Exchange (ETDEWEB)

    Wang Junwei, E-mail: wangjunweilj@yahoo.com.c [Cisco School of Informatics, Guangdong University of Foreign Studies, Guangzhou 510006 (China); Zhou Tianshou [School of Mathematics and Computational Science, Sun Yat-Sen University, Guangzhou 510275 (China)

    2010-06-14

    Previous studies showed that a single negative feedback structure should be sufficient for robust circadian oscillations. It is thus pertinent to ask why current cellular clock models almost universally have interlocked negative feedback loop (NFL) and positive feedback loop (PFL). Here, we propose a molecular model that reflects the essential features of the Drosophila circadian clock to clarify the different roles of negative and positive feedback loops. In agreement with experimental observations, the model can simulate circadian oscillations in constant darkness, entrainment by light-dark cycles, as well as phenotypes of per^01 and clk^Jrk mutants. Moreover, sustained oscillations persist when the PFL is removed, implying the crucial role of NFL for rhythm generation. Through parameter sensitivity analysis, it is revealed that incorporation of PFL increases the robustness of the system to regulatory processes in PFL itself. Such reduced models can aid understanding of the design principles of circadian clocks in Drosophila and other organisms with complex transcriptional feedback structures.
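    That a single negative feedback loop can already sustain oscillations is easy to demonstrate with a Goodwin-type three-variable model; the sketch below uses plain Euler integration and invented parameters, not the rate constants of the Drosophila model in the record.

    ```python
    import numpy as np

    def goodwin(t_end=200.0, dt=0.01, n=10):
        """Euler integration of a Goodwin-type negative feedback loop:
        mRNA -> protein -> repressor, repressor inhibits transcription."""
        m, p, r = 0.1, 0.1, 0.1
        traj = []
        for _ in range(int(t_end / dt)):
            dm = 1.0 / (1.0 + r**n) - 0.1 * m   # repressed transcription
            dp = 0.5 * m - 0.1 * p              # translation
            dr = 0.5 * p - 0.1 * r              # repressor maturation
            m, p, r = m + dt * dm, p + dt * dp, r + dt * dr
            traj.append(m)
        return np.array(traj)

    mrna = goodwin()
    tail = mrna[5000:]                          # discard the initial transient
    print(f"mRNA oscillates between {tail.min():.2f} and {tail.max():.2f}")
    ```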

  14. Linear equations and rap battles: how students in a wired classroom utilized the computer as a resource to coordinate personal and mathematical positional identities in hybrid spaces

    Science.gov (United States)

    Langer-Osuna, Jennifer

    2015-03-01

    This paper draws on the constructs of hybridity, figured worlds, and cultural capital to examine how a group of African-American students in a technology-driven, project-based algebra classroom utilized the computer as a resource to coordinate personal and mathematical positional identities during group work. Analyses of several vignettes of small group dynamics highlight how hybridity was established as the students engaged in multiple on-task and off-task computer-based activities, each of which drew on different lived experiences and forms of cultural capital. The paper ends with a discussion on how classrooms that make use of student-led collaborative work, and where students are afforded autonomy, have the potential to support the academic engagement of students from historically marginalized communities.

  15. Continuous Platform Development

    DEFF Research Database (Denmark)

    Nielsen, Ole Fiil

    low risks and investments but also with relatively fuzzy results. When looking for new platform projects, it is important to make sure that the company and market is ready for the introduction of platforms, and to make sure that people from marketing and sales, product development, and downstream......, but continuous product family evolution challenges this strategy. The concept of continuous platform development is based on the fact that platform development should not be a one-time experience but rather an ongoing process of developing new platforms and updating existing ones, so that product family...

  16. Comparison of Magnetic Resonance Imaging and Computed Tomography for Breast Target Volume Delineation in Prone and Supine Positions

    Energy Technology Data Exchange (ETDEWEB)

    Pogson, Elise M. [Centre for Medical Radiation Physics, Faculty of Engineering and Information Sciences, University of Wollongong, Wollongong (Australia); Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); Ingham Institute for Applied Medical Research, Liverpool (Australia); Delaney, Geoff P. [Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); Ingham Institute for Applied Medical Research, Liverpool (Australia); South Western Sydney Clinical School, University of New South Wales, Sydney (Australia); School of Medicine, University of Western Sydney, Sydney (Australia); Ahern, Verity [Crown Princess Mary Cancer Care Centre, Westmead Hospital, Westmead (Australia); Boxer, Miriam M. [Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); South Western Sydney Clinical School, University of New South Wales, Sydney (Australia); Chan, Christine [Department of Radiology, Liverpool Hospital, Liverpool (Australia); David, Steven [Peter MacCallum Cancer Centre, Melbourne (Australia); Dimigen, Marion [Department of Radiology, Liverpool Hospital, Liverpool (Australia); Harvey, Jennifer A. [School of Medicine, University of Queensland, Herston (Australia); Princess Alexandra Hospital, Woolloongabba (Australia); Koh, Eng-Siew [Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); Ingham Institute for Applied Medical Research, Liverpool (Australia); South Western Sydney Clinical School, University of New South Wales, Sydney (Australia); Lim, Karen [Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); South Western Sydney Clinical School, University of New South Wales, Sydney (Australia); Papadatos, George [Liverpool and Macarthur Cancer Therapy Centres, Liverpool (Australia); and others

    2016-11-15

    Purpose: To determine whether T2-weighted MRI improves seroma cavity (SC) and whole breast (WB) interobserver conformity for radiation therapy purposes, compared with the gold standard of CT, both in the prone and supine positions. Methods and Materials: Eleven observers (2 radiologists and 9 radiation oncologists) delineated SC and WB clinical target volumes (CTVs) on T2-weighted MRI and CT supine and prone scans (4 scans per patient) for 33 patient datasets. Individual observer's volumes were compared using the Dice similarity coefficient, volume overlap index, center of mass shift, and Hausdorff distances. An average cavity visualization score was also determined. Results: Imaging modality did not affect interobserver variation for WB CTVs. Prone WB CTVs were larger in volume and more conformal than supine CTVs (on both MRI and CT). Seroma cavity volumes were larger on CT than on MRI. Seroma cavity volumes proved to be comparable in interobserver conformity in both modalities (volume overlap index of 0.57 (95% confidence interval (CI) 0.54-0.60) for CT supine and 0.52 (95% CI 0.48-0.56) for MRI supine, 0.56 (95% CI 0.53-0.59) for CT prone and 0.55 (95% CI 0.51-0.59) for MRI prone); however, after registering modalities together, the intermodality variation (Dice similarity coefficient of 0.41 (95% CI 0.36-0.46) for supine and 0.38 (0.34-0.42) for prone) was larger than the interobserver variability for SC, despite the location typically remaining constant. Conclusions: Magnetic resonance imaging interobserver variation was comparable to CT for the WB CTV and SC delineation, in both prone and supine positions. Although the cavity visualization score and interobserver concordance were not significantly higher for MRI than for CT, the SCs were smaller on MRI, potentially owing to clearer SC definition, especially on T2-weighted MR images.
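    Two of the conformity metrics used here, the Dice similarity coefficient and the center-of-mass shift, are straightforward to compute from binary delineation masks; a small sketch with synthetic masks follows (the array shapes and voxel spacing are illustrative only).

    ```python
    import numpy as np

    def dice(a, b):
        """Dice similarity coefficient between two binary masks."""
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    def com_shift(a, b, spacing=(1.0, 1.0, 1.0)):
        """Distance (in mm) between the centers of mass of two masks."""
        ca = np.array(np.nonzero(a)).mean(axis=1)
        cb = np.array(np.nonzero(b)).mean(axis=1)
        return np.linalg.norm((ca - cb) * np.asarray(spacing))

    a = np.zeros((20, 20, 20)); a[5:12, 5:12, 5:12] = 1   # observer 1's CTV
    b = np.zeros((20, 20, 20)); b[6:13, 6:13, 6:13] = 1   # observer 2's CTV
    print(f"DSC={dice(a, b):.2f}, COM shift={com_shift(a, b):.2f} mm")
    ```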

  17. Making Spatial Statistics Service Accessible On Cloud Platform

    OpenAIRE

    Mu, X.; Wu, J.; Li, T; Zhong, Y.; Gao, X.

    2014-01-01

    Web services can bring together applications running on diverse platforms; users can access and share various data, information and models more effectively and conveniently from a web service platform. Cloud computing emerges as a paradigm of Internet computing in which dynamic, scalable and often virtualized resources are provided as services. With the rampant growth of massive data and the restrictions of the network, traditional web service platforms have some prominent problems existi...

  18. Designing platform independent mobile apps and services

    CERN Document Server

    Heckman, Rocky

    2016-01-01

    This book explains how to help create an innovative and future proof architecture for mobile apps by introducing practical approaches to increase the value and flexibility of their service layers and reduce their delivery time. Designing Platform Independent Mobile Apps and Services begins by describing the mobile computing landscape and previous attempts at cross platform development. Platform independent mobile technologies and development strategies are described in chapters two and three. Communication protocols, details of a recommended five layer architecture, service layers, and the data abstraction layer are also introduced in these chapters. Cross platform languages and multi-client development tools for the User Interface (UI) layer, as well as message processing patterns and message routing of the Service Interface (SI) layer, are explained in chapters four and five. Ways to design the service layer for mobile computing, using Command Query Responsibility Segregation (CQRS) and the Data Abstraction La...

  19. Effect of body position on respiratory system volumes in anesthetized red-tailed hawks (Buteo jamaicensis) as measured via computed tomography.

    Science.gov (United States)

    Malka, Shachar; Hawkins, Michelle G; Jones, James H; Pascoe, Peter J; Kass, Philip H; Wisner, Erik R

    2009-09-01

    To determine the effects of body position on lung and air-sac volumes in anesthetized and spontaneously breathing red-tailed hawks (Buteo jamaicensis). 6 adult red-tailed hawks (sex unknown). A crossover study design was used for quantitative estimation of lung and air-sac volumes in anesthetized hawks in 3 body positions: dorsal, right lateral, and sternal recumbency. Lung volume, lung density, and air-sac volume were calculated from helical computed tomographic (CT) images by use of software designed for volumetric analysis of CT data. Effects of body position were compared by use of repeated-measures ANOVA and a paired Student t test. Results for all pairs of body positions were significantly different from each other. Mean ± SD lung density was lowest when hawks were in sternal recumbency (-677 ± 28 CT units), followed by right lateral (-647 ± 23 CT units) and dorsal (-630 ± 19 CT units) recumbency. Mean lung volume was largest in sternal recumbency (28.6 ± 1.5 mL), followed by right lateral (27.6 ± 1.7 mL) and dorsal (27.0 ± 1.5 mL) recumbency. Mean partial air-sac volume was largest in sternal recumbency (27.0 ± 19.3 mL), followed by right lateral (21.9 ± 16.1 mL) and dorsal (19.3 ± 16.9 mL) recumbency. In anesthetized red-tailed hawks, positioning in sternal recumbency resulted in the greatest lung and air-sac volumes and lowest lung density, compared with positioning in right lateral and dorsal recumbency. Additional studies are necessary to determine the physiologic effects of body position on the avian respiratory system.
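    The volumetric quantities reported here come down to thresholding a CT volume and counting voxels; a minimal sketch is below. The HU window, voxel size and the synthetic "scan" are assumptions for illustration, not the study's segmentation protocol.

    ```python
    import numpy as np

    def lung_metrics(ct_hu, voxel_mm3, lo=-1000, hi=-300):
        """Segment aerated lung by a HU window (assumed thresholds) and
        return (volume in mL, mean density in HU)."""
        mask = (ct_hu >= lo) & (ct_hu <= hi)
        volume_ml = mask.sum() * voxel_mm3 / 1000.0
        return volume_ml, float(ct_hu[mask].mean())

    rng = np.random.default_rng(1)
    ct = rng.normal(-650, 80, size=(64, 64, 64))   # fake "lung-like" volume
    vol, dens = lung_metrics(ct, voxel_mm3=0.5)
    print(f"lung volume={vol:.1f} mL, mean density={dens:.0f} HU")
    ```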

  20. Influence of basis images and skull position on evaluation of cortical bone thickness in cone beam computed tomography.

    Science.gov (United States)

    Nascimento, Monikelly do Carmo Chagas; Boscolo, Solange Maria de Almeida; Haiter-Neto, Francisco; Santos, Emanuela Carla Dos; Lambrichts, Ivo; Pauwels, Ruben; Jacobs, Reinhilde

    2017-06-01

    The aim of this study was to assess the influence of the number of basis images and the orientation of the skull on the evaluation of cortical alveolar bone in cone beam computed tomography (CBCT). Eleven skulls with a total of 59 anterior teeth were selected. CBCT images were acquired by using 4 protocols, by varying the rotation of the tube-detector arm and the orientation of the skull (protocol 1: 360°/0°; protocol 2: 180°/0°; protocol 3: 180°/90°; protocol 4: 180°/180°). Observers evaluated cortical bone as absent, thin, or thick. Direct observation of the skulls was used as the gold standard. Intra- and interobserver agreement, as well as agreement of scoring between the 3 bone thickness classifications, were calculated by using the κ statistic. The Wilcoxon signed-rank test was used to compare the 4 protocols. For lingual cortical bone, protocol 1 showed no statistical difference from the gold standard. Higher reliability was found in protocol 3 for absent (κ = 0.80) and thin (κ = 0.47) cortices, whereas for thick cortical bone, protocol 2 was more consistent (κ = 0.60). In buccal cortical bone, protocol 1 obtained the highest agreement for absent cortices (κ = 0.61), whereas protocol 4 was better for thin cortical plates (κ = 0.38) and protocol 2 for thick cortical plates (κ = 0.40). No consistent effect of the number of basis images or head orientation on the visual detection of alveolar bone was detected, except for lingual cortical bone, for which full-rotation scanning showed improved visualization. Copyright © 2017 Elsevier Inc. All rights reserved.
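    Agreement scores of the kind reported here (absent/thin/thick is an ordinal three-category scale) are commonly summarized with weighted Cohen's kappa. A compact sketch with linear weights and made-up ratings follows; the record does not state which weighting scheme was used, so the linear choice is an assumption.

    ```python
    import numpy as np

    def weighted_kappa(r1, r2, n_cat):
        """Cohen's kappa with linear disagreement weights for two raters'
        ordinal scores (e.g. 0=absent, 1=thin, 2=thick cortical bone)."""
        obs = np.zeros((n_cat, n_cat))
        for a, b in zip(r1, r2):
            obs[a, b] += 1
        obs /= obs.sum()
        expected = np.outer(obs.sum(axis=1), obs.sum(axis=0))
        i, j = np.indices((n_cat, n_cat))
        w = np.abs(i - j) / (n_cat - 1)          # linear weights
        return 1 - (w * obs).sum() / (w * expected).sum()

    r1 = [0, 1, 2, 2, 1, 0, 2, 1, 1, 0]          # made-up observer scores
    r2 = [0, 1, 2, 1, 1, 0, 2, 2, 1, 0]
    print(f"weighted kappa = {weighted_kappa(r1, r2, 3):.2f}")
    ```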

  1. Cone-beam computed tomographic illustration of an unusual position of keratocystic odontogenic tumor mimicking a dentigerous cyst: A case report

    Directory of Open Access Journals (Sweden)

    Deepankar Misra

    2014-01-01

    Full Text Available Cone-beam computed tomography (CBCT) is an advanced imaging modality, with its application in all branches of dentistry. Of all the imaging modalities available, CBCT, with minimum required exposure, provides the best image quality and helps in arriving at a correct diagnosis and in treatment planning. An odontogenic keratocyst, reclassified as a keratocystic odontogenic tumor (KCOT), has an aggressive behavior, is prone to recur, and thus, has been classified as a tumor. Here, we discuss a rare case of a keratocystic odontogenic tumor occurring in the maxilla, with an ectopic tooth position mimicking a dentigerous cyst.

  2. Cloud Computing Fundamentals

    Science.gov (United States)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  3. Influence of heart rhythm, breathing and arm position during computed tomography scanning on the registration accuracy of electro anatomical map (EAM) images, left atrium three-dimensional computed tomography angiography images, and fluoroscopy time during ablation to treat atrial fibrillation

    International Nuclear Information System (INIS)

    Chono, Taiki; Shimoshige, Shinya; Yoshikawa, Kenta; Mizonobe, Kazuhusa; Ogura, Keishi

    2013-01-01

    In CARTOMERGE for treatment of atrial fibrillation (AF) by ablation, integrating electro-anatomical map (EAM) and left atrium three-dimensional computed tomography angiography (3D-CTA) images simplifies identification of the ablation points and allows the procedure to be carried out more rapidly. However, the influence that heart rhythm, breathing and arm position during CT scanning have on registration accuracy and fluoroscopy time is not clear. The aim was to clarify the influence of heart rhythm, breathing and arm position during CT scanning on registration accuracy and fluoroscopy time. Each study subject was CT-scanned during both sinus rhythm (SR) and AF. We evaluated the registration accuracy of images reconstructed at different phases of the cardiac cycle, and assessed the registration accuracy and fluoroscopy time of images obtained during inspiratory breath-hold, expiratory breath-hold and up and down positions of the arm. Although the registration accuracy of the EAM image and left atrium 3D-CTA image showed a significant difference during SR, no significant difference was seen during AF. Expiratory breath-hold and the down position of the arm resulted in the highest registration accuracy and the shortest fluoroscopy time. However, arm position had no significant effect on registration accuracy. Heart rhythm and breathing during CT scanning have a significant effect on the registration accuracy of EAM images, left atrium 3D-CTA images, and fluoroscopy time. (author)
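    Point-set registration of the kind performed between an EAM and a CT-derived anatomy is, at its core, a least-squares rigid alignment. A minimal sketch of the Kabsch algorithm on synthetic points is shown below; it is a generic illustration, not the registration method of the CARTOMERGE software.

    ```python
    import numpy as np

    def kabsch(P, Q):
        """Best-fit rotation R and translation t mapping point set P onto Q
        (least-squares rigid registration), plus the residual RMS error."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cq - R @ cp
        rms = np.sqrt(((P @ R.T + t - Q) ** 2).sum(axis=1).mean())
        return R, t, rms

    rng = np.random.default_rng(2)
    P = rng.random((30, 3)) * 50.0               # e.g. EAM points (mm)
    theta = np.deg2rad(5.0)
    Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
    Q = P @ Rz.T + np.array([1.9, -0.7, 0.9])    # "CT" points, moved rigidly
    R, t, rms = kabsch(P, Q)
    print(f"recovered translation {np.round(t, 2)}, residual {rms:.2e} mm")
    ```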

  4. Influence of Respiratory Gating, Image Filtering, and Animal Positioning on High-Resolution Electrocardiography-Gated Murine Cardiac Single-Photon Emission Computed Tomography

    Directory of Open Access Journals (Sweden)

    Chao Wu

    2015-01-01

    Full Text Available Cardiac parameters obtained from single-photon emission computed tomographic (SPECT) images can be affected by respiratory motion, image filtering, and animal positioning. We investigated the influence of these factors on ultra-high-resolution murine myocardial perfusion SPECT. Five mice were injected with technetium-99m (99mTc)-tetrofosmin, and each was scanned in supine and prone positions in a U-SPECT-II scanner with respiratory and electrocardiographic (ECG) gating. ECG-gated SPECT images were created without applying respiratory motion correction or with two different respiratory motion correction strategies. The images were filtered with a range of three-dimensional Gaussian kernels, after which end-diastolic volumes (EDVs), end-systolic volumes (ESVs), and left ventricular ejection fractions were calculated. No significant differences in the measured cardiac parameters were detected when any strategy to reduce or correct for respiratory motion was applied, whereas big differences (> 5%) in EDV and ESV were found with regard to different positioning of animals. A linear relationship (p < .001) was found between the EDV or ESV and the kernel size of the Gaussian filter. In short, respiratory gating did not significantly affect the cardiac parameters of mice obtained with ultra-high-resolution SPECT, whereas the position of the animals and the image filters should be the same in a comparative study with multiple scans to avoid systematic differences in measured cardiac parameters.
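    The dependence of a measured volume on the smoothing kernel is easy to reproduce: blur a synthetic "ventricle" and count voxels above a fixed threshold. The phantom, threshold and kernel sizes below are all invented for the sketch; it only illustrates why the same filter must be used across scans that are to be compared.

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    # Toy "left ventricle": a solid bright sphere in a 3D image.
    grid = np.indices((64, 64, 64))
    radius = np.sqrt(((grid - 32) ** 2).sum(axis=0))
    img = (radius < 10).astype(float)

    for sigma in (0.0, 1.0, 2.0, 3.0):
        smooth = gaussian_filter(img, sigma) if sigma > 0 else img
        voxels = int((smooth > 0.7).sum())   # fixed-threshold volume surrogate
        print(f"sigma={sigma:.1f}  segmented voxels: {voxels}")
    ```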

  5. Comparative yield of positive brain Computed Tomography after implementing the NICE or SIGN head injury guidelines in two equivalent urban populations

    International Nuclear Information System (INIS)

    Summerfield, R.; Macduff, R.; Davis, R.; Sambrook, M.; Britton, I.

    2011-01-01

    Aims: To compare the yield of positive computed tomography (CT) brain examinations after the implementation of the National Institute for Clinical Excellence (NICE) or the Scottish Intercollegiate Guidance Network (SIGN) guidelines, in comparable urban populations in two teaching hospitals in England and Scotland. Materials and methods: Four hundred consecutive patients presenting at each location following a head injury who underwent a CT examination of the head according to the locally implemented guidelines were compared. Similar matched populations were compared for indication and yield. Yield was measured according to (1) positive CT findings of the sequelae of trauma and (2) intervention required with anaesthetic or intensive care unit (ICU) support, or neurosurgery. Results: The mean ages of patients at the English and Scottish centres were 49.9 and 49.2 years, respectively. Sex distribution was 64.1% male and 66.4% male respectively. Comparative yield was 23.8 and 26.5% for positive brain scans, 3 and 2.75% for anaesthetic support, and 3.75 and 2.5% for neurosurgical intervention. Glasgow Coma Score (GCS) <13 (NICE) and GCS ≤12 and radiological or clinical evidence of skull fracture (SIGN) demonstrated the greatest statistical association with a positive CT examination. Conclusion: In a teaching hospital setting, there is no significant difference in the yield between the NICE and SIGN guidelines. Both meet the SIGN standard of >10% yield of positive scans. The choice of guideline to follow should be at the discretion of the local institution. The indications GCS <13 and clinical or radiological evidence of a skull fracture are highly predictive of intracranial pathology, and their presence should be an absolute indicator for fast-tracking the management of the patient.

  6. Towards a Market Entry Framework for Digital Payment Platforms

    DEFF Research Database (Denmark)

    Kazan, Erol; Damsgaard, Jan

    2016-01-01

    This study presents a framework to understand and explain the design and configuration of digital payment platforms and how these platforms create conditions for market entries. By embracing the theoretical lens of platform envelopment, we employed a multiple and comparative-case study...... in a European setting by using our framework as an analytical lens to assess market-entry conditions. We found that digital payment platforms have acquired market entry capabilities, which is achieved through strategic platform design (i.e., platform development and service distribution) and technology design...... (i.e., issuing evolutionary and revolutionary payment instruments). The studied cases reveal that digital platforms leverage payment services as a mean to bridge and converge core and adjacent platform markets. In so doing, platform envelopment strengthens firms’ market position in their respective...

  7. Genesis and Evolution of Digital Payment Platforms

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

    Payment transactions through the use of physical coins, bank notes or credit cards have for centuries been the standard formats of exchanging money. Recently online and mobile digital payment platforms have entered the stage as contenders to this position and could possibly penetrate societies thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms potentially will substitute other payment platforms just like the credit card negated the check. On this basis this paper formulates a multilevel conceptual...

  8. ADMS Evaluation Platform

    Energy Technology Data Exchange (ETDEWEB)

    2018-01-23

    Deploying an ADMS or looking to optimize its value? NREL offers a low-cost, low-risk evaluation platform for assessing ADMS performance. The National Renewable Energy Laboratory (NREL) has developed a vendor-neutral advanced distribution management system (ADMS) evaluation platform and is expanding its capabilities. The platform uses actual grid-scale hardware, large-scale distribution system models, and advanced visualization to simulate real-world conditions for the most accurate ADMS evaluation and experimentation.

  9. Comparative yield of positive brain Computed Tomography after implementing the NICE or SIGN head injury guidelines in two equivalent urban populations

    Energy Technology Data Exchange (ETDEWEB)

    Summerfield, R., E-mail: ruth.summerfield@uhns.nhs.u [Medical Imaging, University Hospital of North Staffordshire, City General Hospital, Stoke-on-Trent, Staffordshire ST4 6QG (United Kingdom); Macduff, R. [Glasgow Royal Infirmary, 84 Castle Street, Glasgow G4 0SF (United Kingdom); Davis, R. [Medical Imaging, University Hospital of North Staffordshire, City General Hospital, Stoke-on-Trent, Staffordshire ST4 6QG (United Kingdom); Sambrook, M. [Glasgow Royal Infirmary, 84 Castle Street, Glasgow G4 0SF (United Kingdom); Britton, I. [Medical Imaging, University Hospital of North Staffordshire, City General Hospital, Stoke-on-Trent, Staffordshire ST4 6QG (United Kingdom)

    2011-04-15

    Aims: To compare the yield of positive computed tomography (CT) brain examinations after the implementation of the National Institute for Clinical Excellence (NICE) or the Scottish Intercollegiate Guidance Network (SIGN) guidelines, in comparable urban populations in two teaching hospitals in England and Scotland. Materials and methods: Four hundred consecutive patients presenting at each location following a head injury who underwent a CT examination of the head according to the locally implemented guidelines were compared. Similar matched populations were compared for indication and yield. Yield was measured according to (1) positive CT findings of the sequelae of trauma and (2) intervention required with anaesthetic or intensive care unit (ICU) support, or neurosurgery. Results: The mean ages of patients at the English and Scottish centres were 49.9 and 49.2 years, respectively. Sex distribution was 64.1% male and 66.4% male respectively. Comparative yield was 23.8 and 26.5% for positive brain scans, 3 and 2.75% for anaesthetic support, and 3.75 and 2.5% for neurosurgical intervention. Glasgow Coma Score (GCS) <13 (NICE) and GCS ≤12 and radiological or clinical evidence of skull fracture (SIGN) demonstrated the greatest statistical association with a positive CT examination. Conclusion: In a teaching hospital setting, there is no significant difference in the yield between the NICE and SIGN guidelines. Both meet the SIGN standard of >10% yield of positive scans. The choice of guideline to follow should be at the discretion of the local institution. The indications GCS <13 and clinical or radiological evidence of a skull fracture are highly predictive of intracranial pathology, and their presence should be an absolute indicator for fast-tracking the management of the patient.

  10. Platform development supported by gaming

    DEFF Research Database (Denmark)

    Mikkola, Juliana Hsuan; Hansen, Poul H. Kyvsgård

    2007-01-01

    The challenge of implementing industrial platforms in practice can be described as a configuration problem caused by a high number of variables, which often have contradictory influences on the total performance of the firm. Consequently, the specific platform decisions become extremely complex......, possibly increasing the strategic risks for the firm. This paper reports preliminary findings on the platform management process at LEGO, a Danish toy company. Specifically, we report the process of applying games combined with simulations and workshops in platform development. We also propose a framework...

  11. Omnidirectional holonomic platforms

    International Nuclear Information System (INIS)

    Pin, F.G.; Killough, S.M.

    1994-01-01

    This paper presents the concepts for a new family of wheeled platforms which feature full omnidirectionality with simultaneous and independently controlled rotational and translational motion capabilities. The authors first present the orthogonal-wheels concept and the two major wheel assemblies on which these platforms are based. They then describe how a combination of these assemblies with appropriate control can be used to generate an omnidirectional capability for mobile robot platforms. The design and control of two prototype platforms are then presented and their respective characteristics with respect to rotational and translational motion control are discussed
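    The defining property of such platforms, independently commanded translation and rotation, is visible directly in the inverse kinematics. The sketch below uses a generic three-omniwheel layout with invented geometry; it illustrates the principle rather than the specific orthogonal-wheel assemblies of the paper.

    ```python
    import numpy as np

    def wheel_speeds(vx, vy, omega, R=0.2, wheel_angles=(0.0, 120.0, 240.0)):
        """Inverse kinematics of a generic three-omniwheel platform: linear
        wheel speeds realizing body velocity (vx, vy) plus spin rate omega.
        R is the wheel-to-center distance; wheels sit at the given angles."""
        speeds = []
        for a in np.deg2rad(wheel_angles):
            drive = np.array([-np.sin(a), np.cos(a)])   # wheel drive direction
            speeds.append(float(np.dot(drive, (vx, vy))) + R * omega)
        return np.array(speeds)

    print(wheel_speeds(0.5, 0.0, 0.0))   # pure translation, no net rotation
    print(wheel_speeds(0.0, 0.0, 1.0))   # pure rotation in place
    ```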

  12. Platform decommissioning costs

    International Nuclear Information System (INIS)

    Rodger, David

    1998-01-01

    There are over 6500 platforms worldwide contributing to the offshore oil and gas production industry. In the North Sea there are around 500 platforms in place. There are many factors to be considered in planning for platform decommissioning and the evaluation of options for removal and disposal. The environmental impact, technical feasibility, safety and cost factors all have to be considered. This presentation considers what information is available about the overall decommissioning costs for the North Sea and the costs of different removal and disposal options for individual platforms. 2 figs., 1 tab

  13. Platform for High-Assurance Cloud Computing

    Science.gov (United States)

    2016-06-01

    [Only bibliography fragments of this report survive extraction, including: Zhiyuan Teo, Vera Kutsenko, Ken Birman, and Robbert van Renesse, "IronStack: performance, stability and..."; Douglas E. Comer and David L. Stevens, "Internetworking with TCP/IP, Vol. III: Client-Server Programming and Applications, Linux/Posix..."; Ittay Eyal, Adam Efe Gencer, Emin Gun Sirer, and Robbert van Renesse, "Bitcoin-NG: A Scalable Blockchain Protocol".]

  14. The Effect of Round Window vs Cochleostomy Surgical Approaches on Cochlear Implant Electrode Position: A Flat-Panel Computed Tomography Study.

    Science.gov (United States)

    Jiam, Nicole T; Jiradejvong, Patpong; Pearl, Monica S; Limb, Charles J

    2016-09-01

    The round window insertion (RWI) and cochleostomy approaches are the 2 most common surgical techniques used in cochlear implantation (CI). However, there is no consensus on which approach is ideal for electrode array insertion, in part because visualization of intracochlear electrode position is challenging, so postoperative assessment of intracochlear electrode contact is lacking. To measure and compare electrode array position between RWI and cochleostomy approaches for CI insertion. Retrospective case-comparison study of 17 CI users with Med-El standard-length electrode arrays who underwent flat-panel computed tomography scans after CI surgery at a tertiary referral center. The data was analyzed in October 2015. Flat-panel computed tomography scans were collected between January 1 and August 31, 2013, for 22 electrode arrays. The surgical technique was identified by a combination of operative notes and imaging. Eight cochleae underwent RWI and 14 cochleae underwent cochleostomy approaches anterior and inferior to the round window. Interscalar electrode position and electrode centroid distance to the osseous spiral lamina, lateral bony wall, and central axis of the modiolus. Nine participants were men, and 8, women; the mean age was 54.4 (range, 21-64) years. Electrode position was significantly closer to cochlear neural elements with RWI than cochleostomy approaches. Between the 2 surgical approaches, the RWI technique produced shorter distances between the electrode and the modiolus (mean difference, -0.33 [95% CI, -0.29 to -0.39] mm in the apical electrode; -1.42 [95% CI, -1.24 to -1.57] mm in the basal electrode). This difference, which was most prominent in the first third and latter third of the basal turn, decreased after the basal turn. The RWI approach was associated with an increased likelihood of perimodiolar placement. Opting to use RWI over cochleostomy approaches in CI candidates may position electrodes closer to cochlear neural substrates and

  15. An Open-Source Based ITS Platform

    DEFF Research Database (Denmark)

    Andersen, Ove; Krogh, Benjamin Bjerre; Torp, Kristian

    2013-01-01

    In this paper, a complete platform used to compute travel times from GPS data is described. Two approaches to computing travel time are proposed: one based on points and one based on trips. Overall, both approaches give reasonable results compared to existing manually estimated travel times. However, the trip-based approach requires more, and higher-quality, GPS data than the point-based approach. The platform has been implemented entirely using open-source software. The main conclusion is that large quantities of GPS data can be managed with a limited budget, and that GPS data is a good source for estimating travel times, if enough data is available....
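    A point-based travel-time estimate reduces to summing great-circle distances between consecutive GPS fixes and dividing by the elapsed time. The sketch below does exactly that; the coordinates and timestamps are invented, and a real pipeline would add map-matching and outlier filtering first.

    ```python
    import math

    def haversine_m(p, q):
        """Great-circle distance in meters between two (lat, lon) pairs."""
        lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
        a = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2)
             * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * math.asin(math.sqrt(a))

    # (timestamp_s, lat, lon) fixes along a road segment -- illustrative only.
    fixes = [(0, 57.0480, 9.9190), (10, 57.0489, 9.9205), (20, 57.0498, 9.9221)]
    dist = sum(haversine_m(a[1:], b[1:]) for a, b in zip(fixes, fixes[1:]))
    time = fixes[-1][0] - fixes[0][0]
    print(f"segment travel time {time}s, mean speed {3.6 * dist / time:.1f} km/h")
    ```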

  16. The evolution of cloud computing how to plan for change

    CERN Document Server

    Longbottom, Clive

    2017-01-01

    Cloud computing has been positioned as today's ideal IT platform. This book looks at what cloud promises and how it's likely to evolve in the future. Readers will be able to ensure that decisions made now will hold them in good stead in the future and will gain an understanding of how cloud can deliver the best outcome for their organisations.

  17. Polymer-based platform for microfluidic systems

    Science.gov (United States)

    Benett, William [Livermore, CA; Krulevitch, Peter [Pleasanton, CA; Maghribi, Mariam [Livermore, CA; Hamilton, Julie [Tracy, CA; Rose, Klint [Boston, MA; Wang, Amy W [Oakland, CA

    2009-10-13

    A method of forming a polymer-based microfluidic system platform using network building blocks selected from a set of interconnectable network building blocks, such as wire, pins, blocks, and interconnects. The selected building blocks are interconnectably assembled and fixedly positioned in precise positions in a mold cavity of a mold frame to construct a three-dimensional model construction of a microfluidic flow path network preferably having meso-scale dimensions. A hardenable liquid, such as poly (dimethylsiloxane) is then introduced into the mold cavity and hardened to form a platform structure as well as to mold the microfluidic flow path network having channels, reservoirs and ports. Pre-fabricated elbows, T's and other joints are used to interconnect various building block elements together. After hardening the liquid the building blocks are removed from the platform structure to make available the channels, cavities and ports within the platform structure. Microdevices may be embedded within the cast polymer-based platform, or bonded to the platform structure subsequent to molding, to create an integrated microfluidic system. In this manner, the new microfluidic platform is versatile and capable of quickly generating prototype systems, and could easily be adapted to a manufacturing setting.

  18. Groundwater Assessment Platform

    OpenAIRE

    Podgorski, Joel; Berg, Michael

    2018-01-01

    The Groundwater Assessment Platform is a free, interactive online GIS platform for the mapping, sharing and statistical modeling of groundwater quality data. The modeling allows users to take advantage of publicly available global datasets of various environmental parameters to produce prediction maps of their contaminant of interest.

  19. EURESCOM Services Platform

    NARCIS (Netherlands)

    Nieuwenhuis, Lambertus Johannes Maria; van Halteren, Aart

    1999-01-01

    This paper presents the results of the EURESCOM Project 715. In February 1999, a large team of researchers from six European public network operators completed a two year period of cooperative experiments on a TINA-based environment, called the EURESCOM Services Platform (ESP). This platform

  20. Whole-body computed tomography in trauma patients: optimization of the patient scanning position significantly shortens examination time while maintaining diagnostic image quality

    Directory of Open Access Journals (Sweden)

    Hickethier T

    2018-05-01

    Full Text Available Background: The study was conducted to compare examination time and artifact vulnerability of whole-body computed tomographies (wbCTs) for trauma patients using conventional or optimized patient positioning. Patients and methods: Examination time was measured in 100 patients scanned with a conventional protocol (Group A: arms positioned alongside the body for head and neck imaging and over the head for trunk imaging) and 100 patients scanned with an optimized protocol (Group B: arms flexed on a chest pillow without repositioning). Additionally, the influence of the two scanning protocols on image quality in the most relevant body regions was assessed by two blinded readers. Results: Total wbCT duration was about 35% or 3:46 min shorter in B than in A. Artifacts in the aorta (27 vs 6%), liver (40 vs 8%) and spleen (27 vs 5%) occurred significantly more often in B than in A. No incident of non-diagnostic image quality was reported, and no significant differences were found for the lungs and spine. Conclusion: An optimized wbCT positioning protocol for trauma patients allows a significant reduction of examination time while still maintaining diagnostic image quality. Keywords: CT scan, polytrauma, acute care, time requirement, positioning

  1. Genomics With Cloud Computing

    Directory of Open Access Journals (Sweden)

    Sukhamrit Kaur

    2015-04-01

    Full Text Available Abstract: Genomics is the study of the genome, which produces large amounts of data and therefore requires large storage and computational power. These needs are met by cloud computing, which provides various cloud platforms for genomics. Such platforms offer many services to users, such as easy access to data, easy sharing and transfer, storage of hundreds of terabytes, and greater computational power. Some cloud platforms are Google Genomics, DNAnexus and Globus Genomics. Features that cloud computing brings to genomics include easy access to and sharing of data, data security, and lower cost of resources, but there are still some demerits, such as the long time needed to transfer data and limited network bandwidth.

  2. Computer Workstations: Good Working Positions

    Science.gov (United States)


  3. The Use of Cone Beam Computed Tomography (CBCT to Determine Supernumerary and Impacted Teeth Position in Pediatric Patients: A Case Report

    Directory of Open Access Journals (Sweden)

    Hossein Nematolahi

    2013-03-01

    Full Text Available A case of a compound odontoma which caused delayed eruption of the right maxillary central incisor in a ten-year-old girl is presented with clinical and radiographic findings. The patient presented with a complaint of a hard painless swelling in the right anterior region of the maxilla and absence of the right maxillary central incisor. After clinical examination, periapical and occlusal radiographs of the mentioned region were taken. The impacted maxillary right central incisor appeared malformed on the intraoral radiographs. After taking a cone beam computed tomography (CBCT) scan, a compound odontoma associated with the labial aspect of the impacted maxillary right central incisor was diagnosed and then removed by simple local excision under local anesthesia. The removal of the odontoma was followed by forced eruption of the impacted central incisor. After three months the tooth returned to its original position.

  4. False-positive 18F-fluorodeoxyglucose positron emission tomography/computed tomography in a patient with metallic implants following chondrosarcoma resection.

    Science.gov (United States)

    Zhou, P U; Tang, Jinliang; Zhang, Dong; Li, Guanghui

    2016-05-01

    Positron emission tomography (PET) with fluorine-18-labeled fluorodeoxyglucose (18F-FDG) has been used for the staging and evaluation of recurrence in cancer patients. We herein report a false-positive result of an 18F-FDG PET/computed tomography (CT) scan in a patient following chondrosarcoma resection and metallic implanting. A 35-year-old male patient with chondrosarcoma of the left iliac bone underwent radical resection, metal brace implanting and radiotherapy. A high uptake of 18F-FDG was observed in the metallic implants and adjacent tissue during PET/CT scanning in the 5th year of follow-up. Tissue biopsy and follow-up examination identified no tumor recurrence or infection at these sites, suggesting that the results of 18F-FDG PET/CT must be interpreted with caution in cancer patients with metallic implants.

  5. Rejection Positivity Predicts Trial-to-Trial Reaction Times in an Auditory Selective Attention Task: A Computational Analysis of Inhibitory Control

    Directory of Open Access Journals (Sweden)

    Sufen Chen

    2014-08-01

    Full Text Available A series of computer simulations using variants of a formal model of attention (Melara & Algom, 2003) probed the role of rejection positivity (RP), a slow-wave electroencephalographic (EEG) component, in the inhibitory control of distraction. Behavioral and EEG data were recorded as participants performed auditory selective attention tasks. Simulations that modulated processes of distractor inhibition accounted well for reaction-time (RT) performance, whereas those that modulated target excitation did not. A model that incorporated RP from actual EEG recordings in estimating distractor inhibition was superior in predicting changes in RT as a function of distractor salience across conditions. A model that additionally incorporated momentary fluctuations in EEG as the source of trial-to-trial variation in performance precisely predicted individual RTs within each condition. The results lend support to the linking proposition that RP controls the speed of responding to targets through the inhibitory control of distractors.

  6. Product Platform Modeling

    DEFF Research Database (Denmark)

    Pedersen, Rasmus

    for customisation of products. In many companies these changes in the business environment have created a controversy between the need for a wide variety of products offered to the marketplace and a desire to reduce variation within the company in order to increase efficiency. Many companies use the concept...... other. These groups can be varied and combined to form different product variants without increasing the internal variety in the company. Based on the Theory of Domains, the concept of encapsulation in the organ domain is introduced, and organs are formulated as platform elements. Included......This PhD thesis has the title Product Platform Modelling. The thesis is about product platforms and visual product platform modelling. Product platforms have gained an increasing attention in industry and academia in the past decade. The reasons are many, yet the increasing globalisation...

  7. Product Platform Replacements

    DEFF Research Database (Denmark)

    Sköld, Martin; Karlsson, Christer

    2012-01-01

    . To shed light on this unexplored and growing managerial concern, the purpose of this explorative study is to identify operational challenges to management when product platforms are replaced. Design/methodology/approach – The study uses a longitudinal field-study approach. Two companies, Gamma and Omega...... replacement was chosen in each company. Findings – The study shows that platform replacements primarily challenge managers' existing knowledge about platform architectures. A distinction can be made between “width” and “height” in platform replacements, and it is crucial that managers observe this in order...... to challenge their existing knowledge about platform architectures. Issues on technologies, architectures, components and processes as well as on segments, applications and functions are identified. Practical implications – Practical implications are summarized and discussed in relation to a framework...

  8. Mobile Prototyping Platforms for Remote Engineering Applications

    Directory of Open Access Journals (Sweden)

    Karsten Henke

    2009-08-01

    Full Text Available This paper describes a low-cost mobile communication platform as a universal rapid-prototyping system, which is based on the Quadrocopter concept. At the Integrated Hardware and Software Systems Group at the Ilmenau University of Technology these mobile platforms are used to motivate bachelor and master students to study Computer Engineering sciences. This could be done by increasing their interest in technical issues, using this platform as integral part of a new ad-hoc lab to demonstrate different aspects in the area of Mobile Communication as well as universal rapid prototyping nodes to investigate different mechanisms for self-organized mobile communication systems within the International Graduate School on Mobile Communications. Beside the three fields of application, the paper describes the current architecture concept of the mobile prototyping platform as well as the chosen control mechanism and the assigned sensor systems to fulfill all the required tasks.

  9. Linking computers for science

    CERN Multimedia

    2005-01-01

    After the success of SETI@home, many other scientists have found computer power donated by the public to be a valuable resource - and sometimes the only possibility to achieve their goals. In July, representatives of several “public resource computing” projects came to CERN to discuss technical issues and R&D activities on the common computing platform they are using, BOINC. This photograph shows the LHC@home screen-saver which uses the BOINC platform: the dots represent protons and the position of the status bar indicates the progress of the calculations. This summer, CERN hosted the first “pangalactic workshop” on BOINC (Berkeley Open Infrastructure for Network Computing). BOINC is modelled on SETI@home, which millions of people have downloaded to help search for signs of extraterrestrial intelligence in radio-astronomical data. BOINC provides a general-purpose framework for scientists to adapt their software to, so that the public can install and run it. An important part of BOINC is managing the...

  10. Acetabular component positioning in total hip arthroplasty with and without a computer-assisted system: a prospective, randomized and controlled study.

    Science.gov (United States)

    Gurgel, Henrique M C; Croci, Alberto T; Cabrita, Henrique A B A; Vicente, José Ricardo N; Leonhardt, Marcos C; Rodrigues, João Carlos

    2014-01-01

    In a study of the acetabular component in total hip arthroplasty, 20 hips were operated on using imageless navigation and 20 hips were operated on using the conventional method. The position of the acetabular component was evaluated with computed tomography, measuring the operative anteversion and the operative inclination and determining the cases inside Lewinnek's safe zone. The results were similar in all the analyses: a mean anteversion of 17.4° in the navigated group and 14.5° in the control group (P=.215); a mean inclination of 41.7° and 42.2° (P=.633); a mean deviation from the desired anteversion (15°) of 5.5° and 6.6° (P=.429); a mean deviation from the desired inclination of 3° and 3.2° (P=.783); and location inside the safe zone of 90% and 80% (P=.661). Tomographic analyses of acetabular component position were similar whether the procedure used imageless navigation or was performed conventionally. Published by Elsevier Inc.
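    Given a cup axis extracted from CT, the angles themselves are simple trigonometry. The sketch below uses the radiographic definitions (per Murray's nomenclature) under an assumed patient coordinate convention; the study reports operative angles, which are defined slightly differently, so this illustrates the geometry rather than the paper's exact computation.

    ```python
    import numpy as np

    def cup_angles(axis):
        """Radiographic inclination and anteversion of an acetabular cup from
        its face-normal vector, assuming patient coordinates x=lateral,
        y=anterior, z=cranial (conventions assumed, not from the paper)."""
        n = np.asarray(axis, dtype=float)
        n = n / np.linalg.norm(n)
        anteversion = np.degrees(np.arcsin(abs(n[1])))    # tilt out of coronal plane
        inclination = np.degrees(np.arctan2(abs(n[0]), abs(n[2])))  # coronal projection
        return inclination, anteversion

    inc, av = cup_angles([0.66, 0.26, 0.70])   # illustrative measured cup axis
    print(f"inclination = {inc:.1f} deg, anteversion = {av:.1f} deg")
    ```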

  11. Prostate positioning using cone-beam computer tomography based on manual soft-tissue registration. Interobserver agreement between radiation oncologists and therapists

    Energy Technology Data Exchange (ETDEWEB)

    Jereczek-Fossa, B.A.; Pobbiati, C.; Fanti, P. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); University of Milan, Milan (Italy); Santoro, L. [European Institute of Oncology, Department of Epidemiology and Biostatistics, Milan (Italy); Fodor, C.; Zerini, D. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); Vigorito, S. [European Institute of Oncology, Department of Medical Physics, Milan (Italy); Baroni, G. [Politecnico di Milano, Department of Electronics Information and Bioengineering, Milan (Italy); De Cobelli, O. [European Institute of Oncology, Department of Urology, Milan (Italy); University of Milan, Milan (Italy); Orecchia, R. [European Institute of Oncology, Department of Radiation Oncology, Milan (Italy); National Center for Oncological Hadrontherapy (CNAO) Foundation, Pavia (Italy); University of Milan, Milan (Italy)

    2014-01-15

    To check the interobserver agreement between radiation oncologists and therapists (RTT) using an on- and off-line cone-beam computer tomography (CBCT) protocol for setup verification in the radiotherapy of prostate cancer. The CBCT data from six prostate cancer patients treated with hypofractionated intensity-modulated radiotherapy (IMRT) were independently reviewed off-line by four observers (one radiation oncologist, one junior and two senior RTTs) and benchmarked with on-line CBCT positioning performed by a radiation oncologist immediately prior to treatment. CBCT positioning was based on manual soft-tissue registration. Agreement between observers was evaluated using weighted Cohen's kappa statistics. In total, 152 CBCT-based prostate positioning procedures were reviewed by each observer. The mean (± standard deviation) of the differences between off- and on-line CBCT-simCT registration translations along the three directions (antero-posterior, latero-lateral and cranio-caudal) and rotation around the antero-posterior axis were - 0.7 (3.6) mm, 1.9 (2.7) mm, 0.9 (3.6) mm and - 1.8 (5.0) degrees, respectively. Satisfactory interobserver agreement was found, being substantial (weighted kappa > 0.6) in 10 of 16 comparisons and moderate (0.41-0.60) in the remaining six comparisons. CBCT interpretation performed by RTTs is comparable to that of radiation oncologists. Our study might be helpful in the quality assurance of radiotherapy and the optimization of competencies. Further investigation should include larger sample sizes, a greater number of observers and validated methodology in order to assess interobserver variability and its impact on high-precision prostate cancer IGRT. In the future, it should enable the wider implementation of complex and evolving radiotherapy technologies. (orig.)

  12. Prognostic value of 18F-fluorodeoxyglucose positron emission tomography, computed tomography and magnetic resonance imaging in oral cavity squamous cell carcinoma with pathologically positive neck lymph node

    International Nuclear Information System (INIS)

    Jwa, Eun Jin; Lee, Sang Wook; Kim, Jae Seung

    2012-01-01

    To evaluate the prognostic value of preoperative neck lymph node (LN) assessment with 18F-fluorodeoxyglucose positron emission tomography (18F-FDG PET), computed tomography (CT), and magnetic resonance imaging (MRI) in oral cavity squamous cell carcinoma (OSCC) patients with pathologically positive LN. In total, 47 OSCC patients with pathologically positive LN were retrospectively reviewed with preoperative 18F-FDG PET and CT/MRI. All patients underwent surgical resection, neck dissection, and postoperative adjuvant radiotherapy and/or chemotherapy between March 2002 and October 2010. Histologic correlation was performed for the findings of 18F-FDG PET and CT/MRI. Thirty-six (76.6%) of the 47 cases were correctly diagnosed with neck LN metastasis by 18F-FDG PET and 32 (68.1%) of the 47 cases were correctly diagnosed by CT/MRI. Follow-up ranged from 20 to 114 months (median, 56 months). Clinically negative nodal status evaluated by 18F-FDG PET or CT/MRI showed a trend toward better clinical outcomes in terms of overall survival, disease-free survival, local recurrence-free survival, regional nodal recurrence-free survival, and distant metastasis-free survival rates, although the trends were not statistically significant. However, neck node standardized uptake value (SUVmax) had no impact on clinical outcomes. Notably, SUVmax showed a significant correlation with tumor size in the LN (r² = 0.62). PET and CT/MRI status of the LN also correlated significantly with the size of the intranodal tumor deposit (r² = 0.37 and r² = 0.48, respectively). 18F-FDG PET and CT/MRI at the neck LNs might improve risk stratification in OSCC patients with pathologically positive neck LN in this study, even without a significant prognostic value of SUVmax.
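
    To make the reported correlations concrete, the sketch below computes a Pearson correlation and its r² between SUVmax and intranodal tumor size with SciPy. The paired values are invented for illustration and are not the study's measurements.

    ```python
    # Illustrative sketch: correlating SUVmax with intranodal tumor size.
    # The paired values are invented, not the study's data.
    import numpy as np
    from scipy import stats

    suv_max = np.array([2.1, 3.4, 4.0, 5.2, 6.8, 7.5, 9.1, 10.3])
    tumor_size_mm = np.array([6, 9, 8, 14, 17, 16, 22, 25])

    r, p = stats.pearsonr(suv_max, tumor_size_mm)
    print(f"r = {r:.2f}, r^2 = {r**2:.2f}, P = {p:.4f}")
    ```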

  13. The vacuum platform

    Science.gov (United States)

    McNab, A.

    2017-10-01

    This paper describes GridPP’s Vacuum Platform for managing virtual machines (VMs), which has been used to run production workloads for WLCG and other HEP experiments. The platform provides a uniform interface between VMs and the sites they run at, whether the site is organised as an Infrastructure-as-a-Service cloud system such as OpenStack, or an Infrastructure-as-a-Client system such as Vac. The paper describes our experience in using this platform, in developing and operating VM lifecycle managers Vac and Vcycle, and in interacting with VMs provided by LHCb, ATLAS, ALICE, CMS, and the GridPP DIRAC service to run production workloads.
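
    The lifecycle management the paper describes can be pictured as a reconciliation loop: each cycle, the manager reaps finished virtual machines and tops the running set back up to a target count. The sketch below is a hypothetical, heavily simplified Python rendering of that pattern; the VirtualMachine class and its methods are invented, and this is not Vac's or Vcycle's actual code or API.

    ```python
    # Hypothetical sketch of a Vac/Vcycle-style reconciliation loop.
    # Real lifecycle managers query a hypervisor or cloud API instead
    # of the invented VirtualMachine class used here.
    import time
    import uuid

    class VirtualMachine:
        def __init__(self, experiment: str):
            self.id = uuid.uuid4().hex[:8]
            self.experiment = experiment
            self.started = time.time()

        def finished(self) -> bool:
            # Stand-in for asking the hypervisor whether the VM has exited.
            return time.time() - self.started > 3600

    def reconcile(running, target: int, experiment: str):
        """One cycle: reap finished VMs, then top up to the target count."""
        running = [vm for vm in running if not vm.finished()]
        while len(running) < target:
            running.append(VirtualMachine(experiment))
        return running

    vms = reconcile([], target=4, experiment="lhcb")
    print([vm.id for vm in vms])
    ```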

  14. Extracting Synthetic Multi-Cluster Platform Configurations from Grid'5000 for Driving Simulation Experiments

    OpenAIRE

    Suter, Frédéric; Casanova, Henri

    2007-01-01

    This report presents a collection of synthetic but realistic distributed computing platform configurations. These configurations are intended for simulation experiments in the study of parallel applications on multi-cluster platforms.
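
    Such synthetic configurations can be generated programmatically by sampling cluster parameters from plausible distributions. The sketch below is a hypothetical illustration of the idea; the field names and distributions are invented and do not reflect how the report actually derives its configurations from Grid'5000 measurements.

    ```python
    # Hypothetical sketch: generating a synthetic multi-cluster platform
    # description. Fields and distributions are illustrative only.
    import random

    random.seed(42)

    def synthetic_platform(n_clusters: int):
        clusters = []
        for i in range(n_clusters):
            clusters.append({
                "id": f"cluster-{i}",
                "nodes": random.choice([32, 64, 128, 256]),
                "cpu_gflops": round(random.uniform(2.0, 12.0), 1),
                "link_bw_gbps": random.choice([1, 10]),
                "latency_us": round(random.uniform(20.0, 100.0), 1),
            })
        return clusters

    for cluster in synthetic_platform(3):
        print(cluster)
    ```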

  15. Dynamics of dump truck entrance onto the hoist platform of a mine inclined elevator

    Energy Technology Data Exchange (ETDEWEB)

    Nosyrev, B.A.; Popov, Yu.V.; Mukhutdinov, Sh.D. (Sverdlovskii Gornyi Institut (USSR))

    1989-01-01

    Analyzes the feasibility of transporting heavy-duty dump trucks along slopes on special platforms in coal surface mines. The platforms are hoisted by winches. Theoretical problems associated with hoisting a loaded platform upwards are analyzed, along with problems of truck travel in the platform area, exact truck positioning, and the mechanical vibrations of the platform caused by truck movement. Vibrations of the platform carrying a loaded truck and their amplitudes are analyzed, and five states of the system are evaluated. Methods for preventing excessive vibrations through optimized platform design and the use of flexible elements are assessed, and an optimum truck speed for entering the platform is recommended.
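
    The vibration problem sketched in this abstract is commonly modeled as a damped single-degree-of-freedom oscillator driven by the moving truck load; the generic textbook formulation below is given only as an assumed illustration, not the authors' actual model.

    ```latex
    % Generic damped oscillator model of the loaded platform (illustrative):
    % m = platform-plus-truck mass, c = damping of the flexible elements,
    % k = effective support/rope stiffness, F(t) = force from truck motion.
    m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) = F(t),
    \qquad
    \omega_0 = \sqrt{k/m}, \quad \zeta = \frac{c}{2\sqrt{km}}
    ```

    In this picture, choosing the truck's entry speed so that the excitation stays away from the natural frequency ω0, or raising the damping ratio ζ with flexible elements, is what suppresses the excessive vibrations the abstract discusses.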

  16. Small animal radiotherapy research platforms

    Energy Technology Data Exchange (ETDEWEB)

    Verhaegen, Frank; Granton, Patrick [Department of Radiation Oncology (MAASTRO), GROW-School for Oncology and Developmental Biology, Maastricht University Medical Center, Maastricht 6201 BN (Netherlands); Tryggestad, Erik, E-mail: frank.verhaegen@maastro.nl [Department of Radiation Oncology and Molecular Radiation Sciences, The Johns Hopkins University School of Medicine, Baltimore, MD 21231 (United States)

    2011-06-21

    Advances in conformal radiation therapy and advancements in pre-clinical radiotherapy research have recently stimulated the development of precise micro-irradiators for small animals such as mice and rats. These devices are often kilovolt x-ray radiation sources combined with high-resolution CT imaging equipment for image guidance, as the latter allows precise and accurate beam positioning. This is similar to modern human radiotherapy practice. These devices are considered a major step forward compared to the current standard of animal experimentation in cancer radiobiology research. The availability of this novel equipment enables a wide variety of pre-clinical experiments on the synergy of radiation with other therapies, complex radiation schemes, sub-target boost studies, hypofractionated radiotherapy, contrast-enhanced radiotherapy and studies of relative biological effectiveness, to name just a few examples. In this review we discuss the required irradiation and imaging capabilities of small animal radiation research platforms. We describe the need for improved small animal radiotherapy research and highlight pioneering efforts, some of which led recently to commercially available prototypes. From this, it will be clear that much further development is still needed, on both the irradiation side and imaging side. We discuss at length the need for improved treatment planning tools for small animal platforms, and the current lack of a standard therein. Finally, we mention some recent experimental work using the early animal radiation research platforms, and the potential they offer for advancing radiobiology research. (topical review)

  17. Small animal radiotherapy research platforms

    Science.gov (United States)

    Verhaegen, Frank; Granton, Patrick; Tryggestad, Erik

    2011-06-01

    Advances in conformal radiation therapy and advancements in pre-clinical radiotherapy research have recently stimulated the development of precise micro-irradiators for small animals such as mice and rats. These devices are often kilovolt x-ray radiation sources combined with high-resolution CT imaging equipment for image guidance, as the latter allows precise and accurate beam positioning. This is similar to modern human radiotherapy practice. These devices are considered a major step forward compared to the current standard of animal experimentation in cancer radiobiology research. The availability of this novel equipment enables a wide variety of pre-clinical experiments on the synergy of radiation with other therapies, complex radiation schemes, sub-target boost studies, hypofractionated radiotherapy, contrast-enhanced radiotherapy and studies of relative biological effectiveness, to name just a few examples. In this review we discuss the required irradiation and imaging capabilities of small animal radiation research platforms. We describe the need for improved small animal radiotherapy research and highlight pioneering efforts, some of which led recently to commercially available prototypes. From this, it will be clear that much further development is still needed, on both the irradiation side and imaging side. We discuss at length the need for improved treatment planning tools for small animal platforms, and the current lack of a standard therein. Finally, we mention some recent experimental work using the early animal radiation research platforms, and the potential they offer for advancing radiobiology research.

  18. Small animal radiotherapy research platforms

    International Nuclear Information System (INIS)

    Verhaegen, Frank; Granton, Patrick; Tryggestad, Erik

    2011-01-01

    Advances in conformal radiation therapy and advancements in pre-clinical radiotherapy research have recently stimulated the development of precise micro-irradiators for small animals such as mice and rats. These devices are often kilovolt x-ray radiation sources combined with high-resolution CT imaging equipment for image guidance, as the latter allows precise and accurate beam positioning. This is similar to modern human radiotherapy practice. These devices are considered a major step forward compared to the current standard of animal experimentation in cancer radiobiology research. The availability of this novel equipment enables a wide variety of pre-clinical experiments on the synergy of radiation with other therapies, complex radiation schemes, sub-target boost studies, hypofractionated radiotherapy, contrast-enhanced radiotherapy and studies of relative biological effectiveness, to name just a few examples. In this review we discuss the required irradiation and imaging capabilities of small animal radiation research platforms. We describe the need for improved small animal radiotherapy research and highlight pioneering efforts, some of which led recently to commercially available prototypes. From this, it will be clear that much further development is still needed, on both the irradiation side and imaging side. We discuss at length the need for improved treatment planning tools for small animal platforms, and the current lack of a standard therein. Finally, we mention some recent experimental work using the early animal radiation research platforms, and the potential they offer for advancing radiobiology research. (topical review)

  19. Identification of platform levels

    DEFF Research Database (Denmark)

    Mortensen, Niels Henrik

    2005-01-01

    reduction, ability to launch a wider product portfolio without increasing resources and reduction of complexity within the whole company. To support the multiple product development process, platform based product development has in many companies such as Philips, VW, Ford etc. proven to be a very effective...... product development in one step and therefore the objective of this paper is to identify levels of platform based product development. The structure of this paper is as follows. First the applied terminology for platforms will be briefly explained and then characteristics between single and multi product...... development will be examined. Based on the identification of the above characteristics five platform levels are described. The research presented in this paper is a result of MSc, Ph.D projects at the Technical University of Denmark and consultancy projects within the organisation of Institute of Product...

  20. Paper based electronics platform

    KAUST Repository

    Nassar, Joanna Mohammad; Sevilla, Galo Andres Torres; Hussain, Muhammad Mustafa

    2017-01-01

    A flexible, non-functionalized, low-cost paper-based electronic system platform fabricated from common paper, including paper-based sensors, together with methods of producing such paper-based sensors and methods of sensing using them.