WorldWideScience

Sample records for platform positioning computer

  1. Platform computing

    CERN Multimedia

    2002-01-01

"Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page).

  2. Platform computing powers enterprise grid

    CERN Multimedia

    2002-01-01

Platform Computing today announced that the Stanford Linear Accelerator Center (SLAC) is using Platform LSF 5 to carry out groundbreaking research into the origins of the universe. Platform LSF 5 will deliver the mammoth computing power that SLAC's linear accelerator needs to process the data associated with intense high-energy physics research (1 page).

  3. Platform Architecture for Decentralized Positioning Systems

    Directory of Open Access Journals (Sweden)

    Zakaria Kasmi

    2017-04-01

A platform architecture for positioning systems is essential for the realization of a flexible localization system which interacts with other systems and supports various positioning technologies and algorithms. Decentralized processing of a position pushes application-level knowledge into the mobile station and avoids communication with a central unit such as a server or base station. In addition, calculating the position on low-cost, resource-constrained devices is challenging due to their limited computing and storage capacity as well as power supply. We therefore propose a platform architecture that enables the design of a system with reusability of components, extensibility (e.g., with other positioning technologies), and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collection or preprocessing on top of an operating system. The platform architecture is designed, implemented, and evaluated on the basis of two positioning systems: a field-strength-based and a time-of-arrival-based positioning system.

  4. Platform Architecture for Decentralized Positioning Systems.

    Science.gov (United States)

    Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg

    2017-04-26

A platform architecture for positioning systems is essential for the realization of a flexible localization system which interacts with other systems and supports various positioning technologies and algorithms. Decentralized processing of a position pushes application-level knowledge into the mobile station and avoids communication with a central unit such as a server or base station. In addition, calculating the position on low-cost, resource-constrained devices is challenging due to their limited computing and storage capacity as well as power supply. We therefore propose a platform architecture that enables the design of a system with reusability of components, extensibility (e.g., with other positioning technologies), and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collection or preprocessing on top of an operating system. The platform architecture is designed, implemented, and evaluated on the basis of two positioning systems: a field-strength-based and a time-of-arrival-based positioning system.

  5. Efficient High Performance Computing on Heterogeneous Platforms

    NARCIS (Netherlands)

    Shen, J.

    2015-01-01

Heterogeneous platforms are mixes of different processing units in a compute node (e.g., CPUs+GPUs, CPU+MICs) or in a chip package (e.g., APUs). This type of platform keeps gaining popularity in various computer systems, ranging from supercomputers to mobile devices. In this context, improving their ...

  6. Efficient High Performance Computing on Heterogeneous Platforms

    NARCIS (Netherlands)

    Shen, J.

    2015-01-01

Heterogeneous platforms are mixes of different processing units in a compute node (e.g., CPUs+GPUs, CPU+MICs) or in a chip package (e.g., APUs). This type of platform keeps gaining popularity in various computer systems, ranging from supercomputers to mobile devices. In this context, improving their ef...

  7. Computing platforms for software-defined radio

    CERN Document Server

    Nurmi, Jari; Isoaho, Jouni; Garzia, Fabio

    2017-01-01

    This book addresses Software-Defined Radio (SDR) baseband processing from the computer architecture point of view, providing a detailed exploration of different computing platforms by classifying different approaches, highlighting the common features related to SDR requirements and by showing pros and cons of the proposed solutions. Coverage includes architectures exploiting parallelism by extending single-processor environment (such as VLIW, SIMD, TTA approaches), multi-core platforms distributing the computation to either a homogeneous array or a set of specialized heterogeneous processors, and architectures exploiting fine-grained, coarse-grained, or hybrid reconfigurability. Describes a computer engineering approach to SDR baseband processing hardware; Discusses implementation of numerous compute-intensive signal processing algorithms on single and multicore platforms; Enables deep understanding of optimization techniques related to power and energy consumption of multicore platforms using several basic a...

  8. Traffic information computing platform for big data

    Energy Technology Data Exchange (ETDEWEB)

Duan, Zongtao, E-mail: ztduan@chd.edu.cn; Li, Ying; Zheng, Xibin; Liu, Yan; Dai, Jiting; Kang, Jun [Chang'an University School of Information Engineering, Xi'an, China and Shaanxi Engineering and Technical Research Center for Road and Traffic Detection, Xi'an (China)]

    2014-10-06

The big data environment creates the data conditions for improving the quality of traffic information services. The goal of this article is to construct a traffic information computing platform for the big data environment. Through an in-depth analysis of the characteristics of big data and of traffic information services, a distributed traffic atomic information computing platform architecture is proposed. Under the big data environment, this traffic atomic information computing architecture helps to guarantee traffic safety and efficient operation, and more intelligent and personalized traffic information services can be provided to traffic information users.

  9. A mobile and portable trusted computing platform

    Directory of Open Access Journals (Sweden)

    Nepal Surya

    2011-01-01

The mechanism of establishing trust in a computing platform is tightly coupled with the characteristics of a specific machine. This limits the portability and mobility of trust demanded by many emerging applications that go beyond organizational boundaries. To address this problem, we propose a mobile and portable trusted computing platform in the form of a USB device. First, we describe the design and implementation of the hardware and software architectures of the device. We then demonstrate the capabilities of the proposed device by developing a trusted application.

  10. A signal strength priority based position estimation for mobile platforms

    Science.gov (United States)

    Kalgikar, Bhargav; Akopian, David; Chen, Philip

    2010-01-01

Global Positioning System (GPS) products help users navigate while driving, hiking, boating, and flying. GPS uses a combination of orbiting satellites to determine position coordinates. This works well in most outdoor areas, but the satellite signals are not strong enough to penetrate inside most indoor environments. As a result, a new strain of indoor positioning technologies that make use of 802.11 wireless LANs (WLAN) is beginning to appear on the market. In WLAN positioning, the system either monitors propagation delays between wireless access points and wireless device users, applying trilateration techniques, or it maintains a database of location-specific signal fingerprints, which is used to identify the most likely match of incoming signal data with fingerprints previously surveyed and saved in the database. In this paper we investigate the issue of deploying WLAN positioning software on mobile platforms with typically limited computational resources. We suggest a novel received-signal-strength rank-order-based location estimation system that reduces computational load while maintaining robust performance. The proposed system's performance is compared to conventional approaches.
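
The rank-order idea in this abstract can be sketched in a few lines: access points are ranked by received signal strength, and matching compares rankings rather than raw values, which is cheap to compute and tolerant of per-device RSS offsets. The sketch below is illustrative only; all names and data are hypothetical, not taken from the paper.

```python
# Illustrative sketch of rank-order RSS fingerprint matching
# (names and data are hypothetical, not from the paper).

def rank_aps(rss):
    """Map each access point id to its rank (0 = strongest) by RSS."""
    ordered = sorted(rss, key=rss.get, reverse=True)
    return {ap: i for i, ap in enumerate(ordered)}

def footrule_distance(ranks_a, ranks_b):
    """Sum of absolute rank differences over APs seen in both scans."""
    common = ranks_a.keys() & ranks_b.keys()
    return sum(abs(ranks_a[ap] - ranks_b[ap]) for ap in common)

def locate(measured_rss, fingerprint_db):
    """Return the surveyed location whose AP rank order best matches."""
    measured = rank_aps(measured_rss)
    return min(fingerprint_db,
               key=lambda loc: footrule_distance(measured,
                                                 rank_aps(fingerprint_db[loc])))

# Toy fingerprint database: location -> {AP id: RSS in dBm}
db = {
    "lobby":  {"ap1": -40, "ap2": -60, "ap3": -75},
    "office": {"ap1": -70, "ap2": -45, "ap3": -55},
}
print(locate({"ap1": -65, "ap2": -50, "ap3": -58}, db))  # office
```

Because only the ordering of the APs is used, a constant gain offset between the survey device and the user's device leaves the estimate unchanged, which is the robustness argument the abstract makes.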

  11. Smart SOA platforms in cloud computing architectures

    CERN Document Server

Exposito, Ernesto

    2014-01-01

    This book is intended to introduce the principles of the Event-Driven and Service-Oriented Architecture (SOA 2.0) and its role in the new interconnected world based on the cloud computing architecture paradigm. In this new context, the concept of "service" is widely applied to the hardware and software resources available in the new generation of the Internet. The authors focus on how current and future SOA technologies provide the basis for the smart management of the service model provided by the Platform as a Service (PaaS) layer.

  12. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.

  13. The Influence of Computer Training Platform on Subsequent Computer Preferences.

    Science.gov (United States)

    Pardamean, Bens; Slovaceks, Simeon

    1995-01-01

    Reports a study that examined the impact of an introductory college computer course on users' subsequent preferences in their choice of computer (IBM versus Macintosh). Surveys found a strong positive relationship between the type of computer students used in the course and their later use and purchasing preferences. (SM)

  14. A Computing Platform for Parallel Sparse Matrix Computations

    Science.gov (United States)

    2016-01-05

This grant enabled the purchase of an Intel multiprocessor consisting of eight multicore nodes interconnected via InfiniBand. Each node contains 24 cores. This parallel computing platform has been used by my research group in the early stages of developing large... Two classes of parallel solvers have been developed. The first is a family of parallel sparse...

  15. Research on the positioning problem in HIFU surgery platform application

    Institute of Scientific and Technical Information of China (English)

    XIANG Lin-qing; GAO Xue-guan; XU Jian-bo; MA Pei-sun

    2006-01-01

To describe the positioning process of the High Intensity Focused Ultrasound (HIFU) surgery platform in tumor treatment, the shape and location of the target tumor in the workspace of the platform are represented in simplified form by a Positioning Volume Ellipsoid. A nearest-neighbor search is then used to find, within a set of patient body-surface points determined by the motion parameters of the platform, the point closest to the center of the simplified ellipsoid tumor model. From the query result, the goal configuration and an intermediate configuration of the positioning path are determined for positioning motion planning. Three new criteria, based on the change in distance between the Positioning Volume Ellipsoid and the Ultrasound Focus Ellipsoid, are proposed to evaluate the result of the whole positioning procedure.
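
The nearest-neighbor query described in this abstract reduces, in its simplest form, to a brute-force search for the candidate body-surface point closest to the ellipsoid center. A minimal illustrative sketch; all coordinates and names are hypothetical, not from the paper:

```python
# Brute-force nearest-neighbor search in 3-D Euclidean distance
# (hypothetical data; the paper's actual point sets are not reproduced here).
import math

def nearest_neighbor(center, surface_points):
    """Return the surface point closest to the given center."""
    return min(surface_points, key=lambda p: math.dist(p, center))

tumor_center = (10.0, 5.0, 3.0)   # center of the Positioning Volume Ellipsoid
candidates = [(12.0, 5.0, 3.0), (10.0, 9.0, 3.0), (10.5, 5.0, 3.5)]
print(nearest_neighbor(tumor_center, candidates))  # (10.5, 5.0, 3.5)
```

For larger point sets a k-d tree would replace the linear scan, but the brute-force form shows the query itself.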

  16. GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data

    Science.gov (United States)

    Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.

    2016-12-01

Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for storing, computing on, and analyzing data. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster, and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosted cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools for other domains with spatial properties.

  17. Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

Big Data technologies have proven to be very useful for the storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide-area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data, so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of machine learning tools like Spark, Jupyter, R, S...

  18. Multi-platform Integrated Positioning and Attitude Determination using GNSS

    NARCIS (Netherlands)

    Buist, P.J.

    2013-01-01

There is a trend in spacecraft engineering toward distributed systems in which a number of smaller spacecraft work as a larger satellite. However, in order to make the small satellites work together as a single large platform, the precise relative positions (baseline) and orientations (attitude) of the e

  19. Computing Platforms for Big Biological Data Analytics: Perspectives and Challenges.

    Science.gov (United States)

    Yin, Zekun; Lan, Haidong; Tan, Guangming; Lu, Mian; Vasilakos, Athanasios V; Liu, Weiguo

    2017-01-01

    The last decade has witnessed an explosion in the amount of available biological sequence data, due to the rapid progress of high-throughput sequencing projects. However, the biological data amount is becoming so great that traditional data analysis platforms and methods can no longer meet the need to rapidly perform data analysis tasks in life sciences. As a result, both biologists and computer scientists are facing the challenge of gaining a profound insight into the deepest biological functions from big biological data. This in turn requires massive computational resources. Therefore, high performance computing (HPC) platforms are highly needed as well as efficient and scalable algorithms that can take advantage of these platforms. In this paper, we survey the state-of-the-art HPC platforms for big biological data analytics. We first list the characteristics of big biological data and popular computing platforms. Then we provide a taxonomy of different biological data analysis applications and a survey of the way they have been mapped onto various computing platforms. After that, we present a case study to compare the efficiency of different computing platforms for handling the classical biological sequence alignment problem. At last we discuss the open issues in big biological data analytics.

  20. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
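
The byte-ordering problem this abstract mentions can be shown in a small sketch (Python's struct module stands in here purely for illustration; the authors' actual tooling is Fortran-based): the same 32-bit float has reversed byte layouts on big-endian and little-endian machines, so a portable data format must pin the byte order explicitly.

```python
# Endianness sketch: one float, two byte layouts.
import struct

value = 1.5
big    = struct.pack(">f", value)   # big-endian (network order)
little = struct.pack("<f", value)   # little-endian (e.g., x86)

print(big.hex(), little.hex())      # 3fc00000 0000c03f

# A reader that always states the byte order decodes correctly on any
# host, regardless of which platform wrote the file:
assert struct.unpack(">f", big)[0] == struct.unpack("<f", little)[0] == 1.5
```

Writing files in a fixed, declared byte order (and converting on read) is the usual resolution of the mismatch the authors describe.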

  1. Platforms for Building and Deploying Applications for Cloud Computing

    CERN Document Server

    Buyya, Rajkumar

    2011-01-01

Cloud computing is rapidly emerging as a new paradigm for delivering IT services as utility-oriented services on a subscription basis. The rapid development of applications and their deployment in Cloud computing environments in an efficient manner is a complex task. In this article, we give a brief introduction to Cloud computing technology and Platform as a Service, examine the offerings in this category, and provide the basis for helping readers understand basic application platform opportunities in the Cloud offered by technologies such as Microsoft Azure, Sales Force, Google App, and Aneka. We demonstrate that Manjrasoft Aneka is a Cloud Application Platform (CAP) leveraging these concepts and allowing easy development of Cloud-ready applications on a Private/Public/Hybrid Cloud. Aneka CAP offers facilities for quickly developing Cloud applications and a modular platform where additional services can be easily integrated to extend the system capabilities, thus keeping pace with the rapidly ev...

  2. Ultra-mobile rugged computing platform design considerations

    Science.gov (United States)

    Garcia, Ray; Wright-Johnson, Mark; Daniels, Reginald

    2011-06-01

State-of-the-art mobile computing is designed to withstand variable rugged environments. Specific platforms include mobile phones, GPS devices, tablets, netbooks, and laptops that are used by the general public and, increasingly, by dismounted military users.

  3. Trusted computing platforms TPM2.0 in context

    CERN Document Server

    Proudler, Graeme; Dalton, Chris

    2015-01-01

    In this book the authors first describe the background of trusted platforms and trusted computing and speculate about the future. They then describe the technical features and architectures of trusted platforms from several different perspectives, finally explaining second-generation TPMs, including a technical description intended to supplement the Trusted Computing Group's TPM2 specifications. The intended audience is IT managers and engineers and graduate students in information security.

  4. ALICE Connex : Mobile Volunteer Computing and Edutainment Platform

    CERN Document Server

    Chalumporn, Gantaphon

    2016-01-01

Mobile devices are powerful and continue to develop rapidly. They have functions that are used in everyday life, and one of their main roles is as entertainment devices or gaming platforms. Many technologies are now accepted and adopted to improve the potential of education. Edutainment combines entertainment and education media to make use of the benefits of both. In this work, we introduce the design of an edutainment platform which is part of a mobile volunteer computing and edutainment platform called 'ALICE Connex' for ALICE at CERN. The edutainment platform focuses on delivering enjoyment and education while promoting ALICE and the volunteer computing platform to the general public. The design in this work describes the functionality to build effective edutainment with real-time multiplayer interaction in round-based gameplay, while integrating seamless edutainment with basic particle physics content through game mechanics and item design. For the assessment method we will observe the enjoyment o...

  5. Study on the application of mobile internet cloud computing platform

    Science.gov (United States)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

The innovative development of computer technology promotes the application of the cloud computing platform, which is in essence a substitution and exchange of resource service models that meets users' needs for different resources after adjustments in multiple aspects. Cloud computing has advantages in many respects: it reduces the difficulty of operating the system and also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology drives people to create digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance inquiry and search platform that allows users to access the necessary information resources at any time. Cloud computing, however, distributes computations across a large number of distributed computers and hence implements a connected service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can thus be used to analyze the key technologies of cloud computing.

  6. A high-throughput bioinformatics distributed computing platform

    OpenAIRE

    Keane, Thomas M; Page, Andrew J.; McInerney, James O; Naughton, Thomas J.

    2005-01-01

In the past few years the demand for high-performance computing has greatly increased in the area of bioinformatics. The huge increase in the size of many genomic databases means that many common tasks in bioinformatics cannot be completed in a reasonable amount of time on a single processor. Recently, distributed computing has emerged as an inexpensive alternative to dedicated parallel computing. We have developed a general-purpose distributed computing platform ...

  7. Enhancing Trusted Cloud Computing Platform for Infrastructure as a Service

    Directory of Open Access Journals (Sweden)

    KIM, H.

    2017-02-01

The characteristics of cloud computing, including on-demand self-service, resource pooling, and rapid elasticity, have made it grow in popularity. However, security concerns still obstruct widespread adoption of cloud computing in industry. In particular, security risks related to virtual machines make cloud users worry about exposure of their private data in IaaS environments. In this paper, we propose an enhanced trusted cloud computing platform to provide confidentiality and integrity of the user's data and computation. The presented platform provides secure and efficient virtual machine management protocols, not only to protect against eavesdropping and tampering during transfer but also to guarantee that a virtual machine is hosted only on trusted cloud nodes, defending against inside attackers. The protocols combine symmetric-key and public-key operations with an efficient node authentication model, so both the computational cost of cryptographic operations and the number of communication steps are significantly reduced. As a result, simulation shows that the performance of the proposed platform is approximately double that of previous platforms. The proposed platform thus addresses cloud users' concerns by providing confidentiality and integrity of their private data with better performance, and contributes to wider industry adoption of cloud computing.

  8. Evaluation of Global Positioning System Data for Offshore Platform Deformation

    Directory of Open Access Journals (Sweden)

    Abdul N. Matori

    2011-01-01

Problem statement: Reservoir compaction and shallow gas migration may cause an offshore platform to experience deformation which, if excessive, will affect its structural integrity. Approach: Hence it is crucial to monitor and quantify the magnitude of the deformation, especially if it is not uniform throughout the platform structure. However, since most offshore platforms are a few hundred kilometers from shore, precise monitoring of their deformation is limited to very few sophisticated instruments, GPS technology being one of them, albeit requiring specialized GPS data processing techniques such as Long Baseline Relative Positioning. Results: Using this technique and employing GPS data observed on one of PETRONAS' own platforms, Pulai, the deformation magnitude was determined under various options such as the number of reference stations used, their configuration, and their geographic location. This study presents initial deformation processing results using the scientific software GAMIT/GLOBK and their analysis using postfit nrms and chi-squared statistics. The results indicated that over a period of two months there was a displacement as large as 0.0094 m with a standard deviation of 0.0106 m. However, a congruency test using Student's t-distribution at the 95% confidence level indicated that this displacement is insignificant. Analysis of the output with postfit nrms also indicated that the data were of good quality, the processing procedure was correct, and the output for each processing epoch is internally and externally consistent. Conclusion/Recommendations: It can be concluded that, with a correct data processing strategy, GPS data can be used to determine deformation magnitude, which could consequently be used as input to assess the structural integrity of an offshore platform.
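
The congruency test in this abstract can be reproduced arithmetically from the quoted numbers. The sketch below assumes a large-sample two-tailed critical value of 1.96, since the abstract does not state the degrees of freedom:

```python
# Significance test of the reported displacement against zero
# (values from the abstract; critical value 1.96 is a large-sample
# normal-approximation assumption, not stated in the abstract).
displacement = 0.0094   # m
sigma        = 0.0106   # m

t_stat = abs(displacement) / sigma
critical = 1.96          # two-tailed, 95% confidence, large samples

significant = t_stat > critical
print(round(t_stat, 3), significant)  # 0.887 False
```

The test statistic (about 0.89) falls well inside the critical region bound, which matches the abstract's conclusion that the displacement is statistically insignificant.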

  9. A Scientific Cloud Computing Platform for Condensed Matter Physics

    Science.gov (United States)

    Jorissen, K.; Johnson, W.; Vila, F. D.; Rehr, J. J.

    2013-03-01

    Scientific Cloud Computing (SCC) makes possible calculations with high performance computational tools, without the need to purchase or maintain sophisticated hardware and software. We have recently developed an interface dubbed SC2IT that controls on-demand virtual Linux clusters within the Amazon EC2 cloud platform. Using this interface we have developed a more advanced, user-friendly SCC Platform configured especially for condensed matter calculations. This platform contains a GUI, based on a new Java version of SC2IT, that permits calculations of various materials properties. The cloud platform includes Virtual Machines preconfigured for parallel calculations and several precompiled and optimized materials science codes for electronic structure and x-ray and electron spectroscopy. Consequently this SCC makes state-of-the-art condensed matter calculations easy to access for general users. Proof-of-principle performance benchmarks show excellent parallelization and communication performance. Supported by NSF grant OCI-1048052

  10. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

Matsubara, Kiyoshi; Itai, Kumi; Nishikawa, Nobutaka; Kato, Akifumi [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan)]; Koike, Hideaki [Advance Soft Corporation (Japan)]

    2003-07-01

The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large, complicated application software and simulate complex phenomena using networking and parallel computing. The integrated platform, which is designed for the PSE in the Japanese national project Frontier Simulation Software for Industrial Science, supports the entire range of problem-solving activity, from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex systems considering properties of human beings'. Moreover, this system will provide the best solution for developing large and complicated software and for simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform using the prototype are scheduled for 2003. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  11. Numeric computation and statistical data analysis on the Java platform

    CERN Document Server

    Chekanov, Sergei V

    2016-01-01

    Numerical computation, knowledge discovery and statistical data analysis integrated with powerful 2D and 3D graphics for visualization are the key topics of this book. The Python code examples powered by the Java platform can easily be transformed to other programming languages, such as Java, Groovy, Ruby and BeanShell. This book equips the reader with a computational platform which, unlike other statistical programs, is not limited by a single programming language. The author focuses on practical programming aspects and covers a broad range of topics, from basic introduction to the Python language on the Java platform (Jython), to descriptive statistics, symbolic calculations, neural networks, non-linear regression analysis and many other data-mining topics. He discusses how to find regularities in real-world data, how to classify data, and how to process data for knowledge discoveries. The code snippets are so short that they easily fit into single pages. Numeric Computation and Statistical Data Analysis ...

  12. Cellular computational platform and neurally inspired elements thereof

    Energy Technology Data Exchange (ETDEWEB)

    Okandan, Murat

    2016-11-22

    A cellular computational platform is disclosed that includes a multiplicity of functionally identical, repeating computational hardware units that are interconnected electrically and optically. Each computational hardware unit includes a reprogrammable local memory and has interconnections to other such units that have reconfigurable weights. Each computational hardware unit is configured to transmit signals into the network for broadcast in a protocol-less manner to other such units in the network, and to respond to protocol-less broadcast messages that it receives from the network. Each computational hardware unit is further configured to reprogram the local memory in response to incoming electrical and/or optical signals.

  13. Computational Chemistry Data Management Platform Based on the Semantic Web.

    Science.gov (United States)

    Wang, Bing; Dobosh, Paul A; Chalk, Stuart; Sopek, Mirek; Ostlund, Neil S

    2017-01-12

    This paper presents a formal data publishing platform for computational chemistry using semantic web technologies. This platform encapsulates computational chemistry data from a variety of packages in an Extensible Markup Language (XML) file called CSX (Common Standard for eXchange). On the basis of a Gainesville Core (GC) ontology for computational chemistry, a CSX XML file is converted into the JavaScript Object Notation for Linked Data (JSON-LD) format using an XML Stylesheet Language Transformation (XSLT) file. Ultimately the JSON-LD file is converted to subject-predicate-object triples in a Turtle (TTL) file and published on the web portal. By leveraging semantic web technologies, we are able to place computational chemistry data onto web portals as a component of a Giant Global Graph (GGG) such that computer agents, as well as individual chemists, can access the data.
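The final stage of the pipeline described above, converting JSON-LD content into subject-predicate-object Turtle triples, can be illustrated with a minimal sketch. Everything below is hypothetical: the record, its URIs, and the `jsonld_to_turtle` helper are invented for illustration, and a real converter (such as the XSLT-based pipeline in the paper) must also handle `@context` expansion, datatypes, and nested nodes.

```python
import json

def jsonld_to_turtle(doc: str) -> str:
    """Naively emit Turtle triples from a flat JSON-LD-style document.

    Illustrative sketch only: every non-"@" key becomes a predicate,
    URI-looking values become IRIs, everything else becomes a literal.
    """
    data = json.loads(doc)
    subject = f"<{data['@id']}>"
    lines = []
    for key, value in data.items():
        if key.startswith("@"):
            continue  # skip JSON-LD keywords such as @id
        obj = f"<{value}>" if str(value).startswith("http") else json.dumps(str(value))
        lines.append(f"{subject} <{key}> {obj} .")
    return "\n".join(lines)

# Hypothetical record loosely modelled on a CSX-derived entry.
doc = json.dumps({
    "@id": "http://example.org/calc/42",
    "http://example.org/gc#method": "B3LYP",
    "http://example.org/gc#basisSet": "6-31G*",
})
print(jsonld_to_turtle(doc))
```

Each printed line is one triple, ready to be loaded into a triple store or published on a web portal as the paper describes.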

  14. Computer-Controlled, Motorized Positioning System

    Science.gov (United States)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1994-01-01

    Computer-controlled, motorized positioning system developed for use in robotic manipulation of samples in custom-built secondary-ion mass spectrometry (SIMS) system. Positions sample repeatably and accurately, even during analysis, in three linear orthogonal coordinates and one angular coordinate, under manual local control, microprocessor-based local control, or remote control by computer via general-purpose interface bus (GPIB).

  15. On the performances of computer vision algorithms on mobile platforms

    Science.gov (United States)

    Battiato, S.; Farinella, G. M.; Messina, E.; Puglisi, G.; Ravì, D.; Capra, A.; Tomaselli, V.

    2012-01-01

    Computer Vision enables mobile devices to extract the meaning of the observed scene from the information acquired with the onboard sensor cameras. Nowadays, there is a growing interest in Computer Vision algorithms able to work on mobile platforms (e.g., phone cameras, point-and-shoot cameras, etc.). Indeed, bringing Computer Vision capabilities to mobile devices opens new opportunities in different application contexts. The implementation of vision algorithms on mobile devices is still a challenging task, since these devices have poor image sensors and optics as well as limited processing power. In this paper we have considered different algorithms covering classic Computer Vision tasks: keypoint extraction, face detection, and image segmentation. Several tests have been done to compare the performances of the involved mobile platforms: Nokia N900, LG Optimus One, and Samsung Galaxy SII.

  16. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    Science.gov (United States)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

    Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.
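The paper above concerns quantum homomorphic encryption, but the defining property it relies on, performing computation on ciphertexts without ever decrypting them, can be demonstrated classically. The sketch below uses textbook (unpadded) RSA, which is multiplicatively homomorphic; the tiny parameters are the standard textbook example and are utterly insecure, chosen only so the arithmetic is visible. This is an illustration of the homomorphic property in general, not of the quantum protocol in the paper.

```python
# Toy illustration of homomorphic encryption: textbook (unpadded) RSA
# is multiplicatively homomorphic. Insecure demo parameters only.
p, q = 61, 53
n = p * q              # modulus 3233
e, d = 17, 2753        # public / private exponents for phi(n) = 3120

def enc(m: int) -> int:
    return pow(m, e, n)

def dec(c: int) -> int:
    return pow(c, d, n)

a, b = 5, 7
# The "server" multiplies ciphertexts without ever decrypting them...
c_prod = (enc(a) * enc(b)) % n
# ...yet the key owner recovers the product of the plaintexts.
print(dec(c_prod))  # -> 35
```

The multiplication happens entirely in the encrypted domain, which is exactly the guarantee that makes homomorphic schemes attractive for cloud computing.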

  17. Determination of UAV position using high accuracy navigation platform

    Directory of Open Access Journals (Sweden)

    Ireneusz Kubicki

    2016-07-01

    Full Text Available The choice of navigation system for a mini UAV is very important because of its application and exploitation, particularly when a synthetic aperture radar installed on it requires highly precise information about the object's position. The presented exemplary solution of such a system draws attention to the possible problems associated with the use of appropriate technology, sensors, and devices, or with a complete navigation system. The position and spatial orientation errors of the measurement platform influence the obtained SAR imaging. Both turbulence and maneuvers performed during flight cause changes in the position of the airborne object, resulting in deterioration or loss of images from SAR. Consequently, it is necessary to perform operations for reducing or eliminating the impact of sensor errors on the UAV position accuracy, looking for compromise solutions between newer, better hardware technologies and improvements in software. Keywords: navigation systems, unmanned aerial vehicles, sensors integration

  18. A Security Kernel Architecture Based Trusted Computing Platform

    Institute of Scientific and Technical Information of China (English)

    CHEN You-lei; SHEN Chang-xiang

    2005-01-01

    A security kernel architecture built on a trusted computing platform, in light of current thinking about trusted computing, is presented. According to this architecture, a new security module, the TCB (Trusted Computing Base), is added to the operating system kernel, and two operation interface modes are provided for the sake of self-protection. The security kernel is divided into two parts, and the trusted mechanism is separated from the security functionality. The TCB module implements trusted mechanisms such as measurement and attestation, while the other components of the security kernel provide security functionality based on these mechanisms. This architecture takes full advantage of the functions provided by the trusted platform and clearly defines the security perimeter of the TCB so as to assure self-security from an architectural perspective. We also present a functional description of the TCB and discuss its strengths and limitations compared with other related research.

  19. Aneka: A Software Platform for .NET-based Cloud Computing

    OpenAIRE

    Vecchiola, Christian; Chu, Xingchen; Buyya, Rajkumar

    2009-01-01

    Aneka is a platform for deploying Clouds and developing applications on top of them. It provides a runtime environment and a set of APIs that allow developers to build .NET applications that leverage their computation on either public or private clouds. One of the key features of Aneka is the ability to support multiple programming models, which are ways of expressing the execution logic of applications by using specific abstractions. This is accomplished by creating a customizable and extensible service-oriented runtime environment represented by a collection of software containers connected together.

  20. Development of a Very Dense Liquid Cooled Compute Platform

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.

  1. Fundamentals of power integrity for computer platforms and systems

    CERN Document Server

    DiBene, Joseph T

    2014-01-01

    An all-encompassing text that focuses on the fundamentals of power integrity. Power integrity is the study of power distribution from the source to the load and the system-level issues that can occur across it. For computer systems, these issues can range from inside the silicon to across the board and may egress into other parts of the platform, including thermal, EMI, and mechanical. With a focus on computer systems and silicon-level power delivery, this book sheds light on the fundamentals of power integrity, utilizing the author's extensive background in the power integrity industry ...

  2. Application of microarray analysis on computer cluster and cloud platforms.

    Science.gov (United States)

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized due to the computational independence of the resampling or permutation iterations, which has induced many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. at Amazon Web Services. In this article we analyze whether a selection of statistical projects, recently implemented at our department, can be efficiently realized on these cloud resources. Moreover, we illustrate an opportunity to combine computer cluster and cloud resources. In order to compare the efficiency of computer cluster and cloud implementations and their respective parallelizations we use microarray analysis procedures and compare their runtimes on the different platforms. Amazon Web Services provide various instance types which meet the particular needs of the different statistical projects we analyzed in this paper. Moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources. It is important to mention, however, that workflows can change substantially as a result of a shift from computer cluster to cloud computing.
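As the abstract notes, permutation iterations are computationally independent, which makes them trivially parallelizable across cluster or cloud nodes. The minimal sketch below splits a two-sample permutation test into independent chunks and runs them on a thread pool standing in for remote workers; the data, chunk counts, and the `perm_chunk` helper are all hypothetical illustrations, not the authors' code.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def perm_chunk(xy, n_x, observed, n_iter, seed):
    """Count permuted mean differences at least as extreme as the observed one.

    Each chunk is independent of the others, which is exactly what makes
    resampling methods easy to spread over cluster or cloud nodes.
    """
    rng = random.Random(seed)
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(xy)
        diff = sum(xy[:n_x]) / n_x - sum(xy[n_x:]) / (len(xy) - n_x)
        if abs(diff) >= abs(observed):
            extreme += 1
    return extreme

# Hypothetical two-group data (e.g. expression values for one gene).
x = [4.1, 3.8, 5.0, 4.6, 4.9]
y = [3.1, 2.9, 3.5, 3.3, 2.8]
obs = sum(x) / len(x) - sum(y) / len(y)

chunks, iters = 4, 250
with ThreadPoolExecutor(max_workers=chunks) as pool:
    counts = pool.map(perm_chunk,
                      [(x + y)[:] for _ in range(chunks)],  # private copy per worker
                      [len(x)] * chunks, [obs] * chunks,
                      [iters] * chunks, range(chunks))
p_value = sum(counts) / (chunks * iters)
print(p_value)
```

Replacing the thread pool with cluster jobs or cloud instances changes only where each chunk runs, not the statistics, which is the point the article makes about portability between platforms.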

  3. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  5. Atomdroid: a computational chemistry tool for mobile platforms.

    Science.gov (United States)

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
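Among the routines the abstract mentions are Monte Carlo simulations. The sketch below shows the Metropolis accept/reject loop that such simulations are built on, applied to a toy one-dimensional harmonic potential; it is a generic illustration of the method, not Atomdroid's force-field code, and the function names and parameters are invented for the example.

```python
import math
import random

def metropolis(energy, x0, steps, step_size, beta, seed=0):
    """Minimal Metropolis Monte Carlo walk over a 1-D potential.

    Force-field MC codes apply the same accept/reject rule, just over
    full molecular coordinates instead of a single scalar.
    """
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    accepted = 0
    for _ in range(steps):
        x_new = x + rng.uniform(-step_size, step_size)
        e_new = energy(x_new)
        # Always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-beta * dE).
        if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
            x, e = x_new, e_new
            accepted += 1
    return x, accepted / steps

harmonic = lambda x: 0.5 * x * x   # toy potential, not a real force field
x_final, acc_rate = metropolis(harmonic, 3.0, 5000, 0.5, beta=1.0)
print(round(acc_rate, 2))
```

Because each step costs only one energy evaluation, loops like this remain feasible even on the resource-constrained hand-held devices the paper benchmarks.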

  6. Overview of Parallel Platforms for Common High Performance Computing

    Directory of Open Access Journals (Sweden)

    T. Fryza

    2012-04-01

    Full Text Available The paper deals with various parallel platforms used for high performance computing in the signal processing domain. More precisely, the methods exploiting multicore central processing units, such as the message passing interface and OpenMP, are taken into account. The properties of the programming methods are experimentally demonstrated in the application of a fast Fourier transform and a discrete cosine transform, and they are compared with the possibilities of MATLAB's built-in functions and Texas Instruments digital signal processors with very long instruction word architectures. New FFT and DCT implementations were proposed and tested. The implementation phase was compared with CPU-based computing methods and with the possibilities of the Texas Instruments digital signal processing library on C6747 floating-point DSPs. The optimal combination of computing methods in the signal processing domain and the implementation of new, fast routines are proposed as well.
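The FFT routines benchmarked above all rest on the same radix-2 Cooley-Tukey recursion, sketched below in plain Python with a direct O(n^2) DFT as a correctness check. This illustrates the algorithm only; the paper's implementations target C on multicore CPUs and TI VLIW DSPs, not Python.

```python
import cmath

def fft(x):
    """Recursive radix-2 Cooley-Tukey FFT (len(x) must be a power of two)."""
    n = len(x)
    if n == 1:
        return list(x)
    even = fft(x[0::2])
    odd = fft(x[1::2])
    # Combine half-size transforms with the twiddle factors e^{-2*pi*i*k/n}.
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])

def dft(x):
    """Direct O(n^2) DFT, used here only to check the FFT result."""
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

signal = [1.0, 2.0, 0.0, -1.0, 1.5, 0.5, -0.5, 2.5]
assert all(abs(a - b) < 1e-9 for a, b in zip(fft(signal), dft(signal)))
```

The recursion reduces the O(n^2) direct sum to O(n log n), which is the speedup the paper's optimized routines exploit on each platform.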

  7. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  8. Precise Point Positioning for TAI Computation

    Directory of Open Access Journals (Sweden)

    Gérard Petit

    2008-01-01

    Full Text Available We discuss the use of some new time transfer techniques for computing TAI time links. Precise point positioning (PPP uses GPS dual frequency carrier phase and code measurements to compute the link between a local clock and a reference time scale with the precision of the carrier phase and the accuracy of the code. The time link between any two stations can then be computed by a simple difference. We show that this technique is well adapted and has better short-term stability than other techniques used in TAI. We present a method of combining PPP and two-way time transfer that takes advantage of the qualities of each technique, and shows that it would bring significant improvement to TAI links.
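The "simple difference" step described above can be made concrete: PPP gives each laboratory clock's offset against a common reference time scale, so the link between any two stations is one subtraction. The station names and offset values below are hypothetical illustrations.

```python
def time_link(offsets, a, b):
    """Clock difference a - b from per-station PPP offsets.

    Each offset is (clk_station - ref) against the same reference
    time scale, so the reference cancels in the difference:
        (clk_a - ref) - (clk_b - ref) = clk_a - clk_b
    """
    return offsets[a] - offsets[b]

# Hypothetical PPP offsets in nanoseconds against a common reference.
offsets_ns = {"LAB_A": 12.4, "LAB_B": -3.1, "LAB_C": 7.8}
print(time_link(offsets_ns, "LAB_A", "LAB_B"))  # -> 15.5
```

Because the reference cancels exactly, the link quality is set by the PPP solutions themselves, which is why the short-term stability advantage of the carrier phase carries over to the computed TAI links.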

  9. Cloud Computing Platform for an Online Model Library System

    Directory of Open Access Journals (Sweden)

    Mingang Chen

    2013-01-01

    Full Text Available The rapid development of the digital content industry calls for online model libraries. For the efficiency, user experience, and reliability merits of the model library, this paper designs a Web 3D model library system based on a cloud computing platform. Taking into account complex models, which cause difficulties in real-time 3D interaction, we adopt model simplification and size-adaptive adjustment methods to make interaction with the system more efficient. Meanwhile, a cloud-based architecture is developed to ensure the reliability and scalability of the system. The 3D model library system is intended to be accessible by online users with a good interactive experience. The feasibility of the solution has been tested by experiments.

  10. Aneka: A Software Platform for .NET-based Cloud Computing

    CERN Document Server

    Vecchiola, Christian; Buyya, Rajkumar

    2009-01-01

    Aneka is a platform for deploying Clouds and developing applications on top of them. It provides a runtime environment and a set of APIs that allow developers to build .NET applications that leverage their computation on either public or private clouds. One of the key features of Aneka is the ability to support multiple programming models, which are ways of expressing the execution logic of applications by using specific abstractions. This is accomplished by creating a customizable and extensible service-oriented runtime environment represented by a collection of software containers connected together. By leveraging this architecture, advanced services including resource reservation, persistence, storage management, security, and performance monitoring have been implemented. On top of this infrastructure, different programming models can be plugged in to provide support for different scenarios, as demonstrated by the engineering, life science, and industry applications.

  11. Mapping flow distortion on oceanographic platforms using computational fluid dynamics

    Directory of Open Access Journals (Sweden)

    N. O'Sullivan

    2013-10-01

    Full Text Available Wind speed measurements over the ocean on ships or buoys are affected by flow distortion from the platform and by the anemometer itself. This can lead to errors in direct measurements and the derived parametrisations. Here we use computational fluid dynamics (CFD) to simulate the errors in wind speed measurements caused by flow distortion on the RV Celtic Explorer. Numerical measurements were obtained from the finite-volume CFD code OpenFOAM, which was used to simulate the velocity fields. This was done over a range of orientations in the test domain from −60 to +60° in increments of 10°. The simulation was also set up for a range of velocities, ranging from 5 to 25 m s−1 in increments of 0.5 m s−1. The numerical analysis showed close agreement with experimental measurements.

  12. Merkle Tree Digital Signature and Trusted Computing Platform

    Institute of Scientific and Technical Information of China (English)

    WANG Xiaofei; HONG Fan; TANG Xueming; CUI Guohua

    2006-01-01

    Lack of efficiency in the initial key generation process is a serious shortcoming of the Merkle tree signature scheme with a large number of possible signatures. Based on two kinds of Merkle trees, a new tree-type signature scheme is constructed, and it is provably existentially unforgeable under adaptive chosen message attack. By decentralizing the initial key generation process of the original scheme within the signature process, a large Merkle tree with 6.87×10^10 possible signatures can be initialized in 590 milliseconds. Storing some small Merkle trees on hard disk and in memory can speed up the Merkle tree signature scheme. Merkle tree signature schemes are fit for trusted computing platforms in most scenarios.
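The Merkle tree underlying such signature schemes can be sketched in a few lines: leaves are hashed, then adjacent nodes are hashed pairwise up to a single root, which is all the verifier needs to store. This is a generic sketch under simplified assumptions (SHA-256, duplicate-last-node padding for odd levels); the paper's scheme fixes its own leaf encoding and tree shapes.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Compute a Merkle root by pairwise hashing up the tree.

    Signature schemes authenticate many one-time keys by publishing
    only this root; leaf encoding and odd-node handling vary by scheme
    and are simplified here.
    """
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:            # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

root = merkle_root([b"key0", b"key1", b"key2", b"key3"])
print(root.hex())
```

Changing any single leaf changes the root, which is what lets one short public value authenticate a large set of one-time signing keys.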

  13. Positive computer-generated exercise electrocardiogram.

    Science.gov (United States)

    MacKenzie, Ross

    2006-01-01

    The use of computerized averaging of the electrocardiogram (ECG) during stress testing has facilitated the removal of motion artifacts and baseline shifts. However, this process can introduce errors, which may not be appreciated by medical directors. Such errors can lead to significant ST depression in the absence of coronary artery disease. Such false-positive tests may lead to anxiety in the applicant, delays in accepting the application and unnecessary additional testing. This case study illustrates a common pitfall associated with using only a computer-generated exercise ECG for risk assessment of a life insurance applicant.

  14. WLAN Positioning Methods and Supporting Learning Technologies for Mobile Platforms

    Science.gov (United States)

    Melkonyan, Arsen

    2013-01-01

    Location technologies constitute an essential component of systems design for autonomous operations and control. The Global Positioning System (GPS) works well in outdoor areas, but the satellite signals are not strong enough to penetrate inside most indoor environments. As a result, a new strain of indoor positioning technologies that make use of…

  15. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that utilize the computing power of the millions of computers on the Internet, and use them towards running large-scale environmental simulations and models to serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computing resources to run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
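The key idea above, splitting a simulation into small, independently computable pieces and farming them out to volunteer nodes, can be sketched with a plain worker pool. The cell model, runoff coefficient, and `simulate_cell` helper below are all invented for illustration; the actual framework ships such units to volunteers' browsers as JavaScript, with a database-backed queue in place of the pool.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate_cell(cell):
    """Stand-in for one small unit of a hydrologic model run.

    In the platform described above each such unit would be shipped to
    a volunteer's browser; here a thread pool plays the volunteers.
    """
    rainfall_mm, area_km2 = cell
    return rainfall_mm * area_km2 * 0.7   # toy runoff estimate, hypothetical coefficient

# Hypothetical watershed split into small, independently computable cells.
cells = [(10.0, 1.2), (12.5, 0.8), (8.0, 2.0), (15.0, 0.5)]
with ThreadPoolExecutor(max_workers=4) as pool:
    runoff = list(pool.map(simulate_cell, cells))
total = sum(runoff)
print(round(total, 2))
```

Because the cells share no state, results can arrive in any order and stragglers can simply be re-queued, the property that makes volunteer networks with unreliable nodes usable at all.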

  16. Matrix element method for high performance computing platforms

    Science.gov (United States)

    Grasseau, G.; Chamont, D.; Beaudette, F.; Bianchini, L.; Davignon, O.; Mastrolorenzo, L.; Ochando, C.; Paganini, P.; Strebler, T.

    2015-12-01

    A lot of effort has been devoted by the ATLAS and CMS teams to improve the quality of LHC event analysis with the Matrix Element Method (MEM). Up to now, very few implementations have tried to face up to the huge computing resources required by this method. We propose here a highly parallel version, combining MPI and OpenCL, which makes MEM exploitation reachable for the whole CMS datasets at a moderate cost. In the article, we describe the status of two software projects under development, one focused on physics and one focused on computing. We also showcase their preliminary performance obtained with classical multi-core processors, CUDA accelerators, and MIC co-processors. This lets us extrapolate that, with the help of 6 high-end accelerators, we should be able to reprocess the whole LHC run 1 within 10 days, and that we have a satisfying metric for the upcoming run 2. The future work will consist in finalizing a single merged system including all the physics and all the parallelism infrastructure, thus optimizing the implementation for the best hardware platforms.

  17. A platform independent communication library for distributed computing

    NARCIS (Netherlands)

    Groen, D.; Rieder, S.; Grosso, P.; de Laat, C.; Portegies Zwart, S.

    2010-01-01

    We present MPWide, a platform independent communication library for performing message passing between supercomputers. Our library couples several local MPI applications through a long distance network using, for example, optical links. The implementation is deliberately kept light-weight and platform independent.

  18. A wireless computational platform for distributed computing based traffic monitoring involving mixed Eulerian-Lagrangian sensing

    KAUST Repository

    Jiang, Jiming

    2013-06-01

    This paper presents a new wireless platform designed for an integrated traffic monitoring system based on combined Lagrangian (mobile) and Eulerian (fixed) sensing. The sensor platform is built around a 32-bit ARM Cortex M4 micro-controller and a 2.4 GHz 802.15.4 ISM compliant radio module, and can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. The platform is specially designed and optimized to be integrated in a solar-powered wireless sensor network in which traffic flow maps are computed by the nodes directly using distributed computing. An MPPT circuit is proposed to increase the power output of the attached solar panel. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. An ongoing implementation is briefly discussed, and compared with existing platforms used in wireless sensor networks. © 2013 IEEE.

  19. Drug voyager: a computational platform for exploring unintended drug action.

    Science.gov (United States)

    Oh, Min; Ahn, Jaegyoon; Lee, Taekeon; Jang, Giup; Park, Chihyun; Yoon, Youngmi

    2017-02-28

    The dominant paradigm in understanding drug action focuses on the intended therapeutic effects and frequent adverse reactions. However, this approach may limit opportunities to grasp unintended drug actions, which can open up channels to repurpose existing drugs and identify rare adverse drug reactions. Advances in systems biology can be exploited to comprehensively understand pharmacodynamic actions, although proper frameworks to represent drug actions are still lacking. We suggest a novel platform to construct a drug-specific pathway in which a molecular-level mechanism of action is formulated based on pharmacologic, pharmacogenomic, transcriptomic, and phenotypic data related to drug response (http://databio.gachon.ac.kr/tools/). In this platform, an adoption of three conceptual levels imitating drug perturbation allows these pathways to be realistically rendered in comparison to those of other models. Furthermore, we propose a new method that exploits functional features of the drug-specific pathways to predict new indications as well as adverse reactions. For therapeutic uses, our predictions significantly overlapped with clinical trials and an up-to-date drug-disease association database. Also, our method outperforms existing methods with regard to classification of active compounds for cancers. For adverse reactions, our predictions were significantly enriched in an independent database derived from the Food and Drug Administration (FDA) Adverse Event Reporting System and meaningfully cover an Adverse Reaction Database provided by Health Canada. Lastly, we discuss several predictions for both therapeutic indications and side-effects through the published literature. Our study addresses how we can computationally represent drug-signaling pathways to understand unintended drug actions and to facilitate drug discovery and screening.

  20. Model-driven product line engineering for mapping parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir

    2016-01-01

    Mapping parallel algorithms to parallel computing platforms requires several activities such as the analysis of the parallel algorithm, the definition of the logical configuration of the platform, the mapping of the algorithm to the logical configuration platform and the implementation of the sou

  1. A statistically-augmented computational platform for evaluating meniscal function.

    Science.gov (United States)

    Guo, Hongqiang; Santner, Thomas J; Chen, Tony; Wang, Hongsheng; Brial, Caroline; Gilbert, Susannah L; Koff, Matthew F; Lerner, Amy L; Maher, Suzanne A

    2015-06-01

    Meniscal implants have been developed in an attempt to provide pain relief and prevent pathological degeneration of articular cartilage. However, as yet there has been no systematic and comprehensive analysis of the effects of the meniscal design variables on meniscal function across a wide patient population, and there are no clear design criteria to ensure the functional performance of candidate meniscal implants. Our aim was to develop a statistically-augmented, experimentally-validated, computational platform to assess the effect of meniscal properties and patient variables on knee joint contact mechanics during the activity of walking. Our analysis used Finite Element Models (FEMs) that represented the geometry, kinematics as based on simulated gait and contact mechanics of three laboratory tested human cadaveric knees. The FEMs were subsequently programmed to represent prescribed meniscal variables (circumferential and radial/axial moduli-Ecm, Erm, stiffness of the meniscal attachments-Slpma, Slamp) and patient variables (varus/valgus alignment-VVA, and articular cartilage modulus-Ec). The contact mechanics data generated from the FEM runs were used as training data to a statistical interpolator which estimated joint contact data for untested configurations of input variables. Our data suggested that while Ecm and Erm of a meniscus are critical in determining knee joint mechanics in early and late stance (peak 1 and peak 3 of the gait cycle), for some knees that have greater laxity in the mid-stance phase of gait, the stiffness of the articular cartilage, Ec, can influence force distribution across the tibial plateau. We found that the medial meniscus plays a dominant load-carrying role in the early stance phase and less so in late stance, while the lateral meniscus distributes load throughout gait. Joint contact mechanics in the medial compartment are more sensitive to Ecm than those in the lateral compartment. Finally, throughout stance, varus

  2. A statistically-augmented computational platform for evaluating meniscal function

    Science.gov (United States)

    Guo, Hongqiang; Santner, Thomas J.; Chen, Tony; Wang, Hongsheng; Brial, Caroline; Gilbert, Susannah L.; Koff, Matthew F.; Lerner, Amy L.; Maher, Suzanne A.

    2015-01-01

    Meniscal implants have been developed in an attempt to provide pain relief and prevent pathological degeneration of articular cartilage. However, as yet there has been no systematic and comprehensive analysis of the effects of the meniscal design variables on meniscal function across a wide patient population, and there are no clear design criteria to ensure the functional performance of candidate meniscal implants. Our aim was to develop a statistically-augmented, experimentally-validated, computational platform to assess the effect of meniscal properties and patient variables on knee joint contact mechanics during the activity of walking. Our analysis used Finite Element Models (FEMs) that represented the geometry, kinematics as based on simulated gait and contact mechanics of three laboratory tested human cadaveric knees. The FEMs were subsequently programmed to represent prescribed meniscal variables (circumferential and radial/axial moduli - Ecm, Erm, stiffness of the meniscal attachments - Slpma, Slamp) and patient variables (varus/valgus alignment - VVA, and articular cartilage modulus - Ec). The contact mechanics data generated from the FEM runs were used as training data to a statistical interpolator which estimated joint contact data for untested configurations of input variables. Our data suggested that while Ecm and Erm of a meniscus are critical in determining knee joint mechanics in early and late stance (peak 1 and peak 3 of the gait cycle), for some knees that have greater laxity in the mid-stance phase of gait, the stiffness of the articular cartilage, Ec, can influence force distribution across the tibial plateau. We found that the medial meniscus plays a dominant load-carrying role in the early stance phase and less so in late stance, while the lateral meniscus distributes load throughout gait. Joint contact mechanics in the medial compartment are more sensitive to Ecm than those in the lateral compartment. Finally, throughout stance, varus

  3. Internet Based General Computer Simulation Platform for Distributed Multi-Robotic System

    Institute of Scientific and Technical Information of China (English)

    迟艳玲; 张斌; 王硕; 谭民

    2002-01-01

    A general computer simulation platform is designed for the purpose of carrying out experiments on the Distributed Multi-Robotic System. The simulation platform is based on the Internet and possesses generality, validity, real-time display and support for algorithm development. In addition, the platform is equipped with a recording and replay module, so a simulation experiment can be reviewed at any time. By now, a few algorithms have been developed on the simulation platform.

  4. Computer program to generate attitude error equations for a gimballed platform

    Science.gov (United States)

    Hall, W. A., Jr.; Morris, T. D.; Rone, K. Y.

    1972-01-01

    Computer program for solving attitude error equations related to gimballed platform is described. Program generates matrix elements of attitude error equations when initial matrices and trigonometric identities have been defined. Program is written for IBM 360 computer.

  5. An Early Evaluation and Comparison of Three Private Cloud Computing Software Platforms

    Institute of Scientific and Technical Information of China (English)

    Farrukh Nadeem; Rizwan Qaiser

    2015-01-01

    Cloud computing, after its success as a commercial infrastructure, is now emerging as a private infrastructure. The software platforms available to build private cloud computing infrastructure vary in their performance for management of cloud resources as well as in utilization of local physical resources. Organizations and individuals looking forward to reaping the benefits of private cloud computing need to understand which software platform would provide the most efficient services and optimum utilization of cloud resources for their target applications. In this paper, we present our initial study on performance evaluation and comparison of three cloud computing software platforms from the perspective of common cloud users who intend to build their private clouds. We compare the performance of the selected software platforms in several respects, describing their suitability for applications from different domains. Our results highlight the critical parameters for performance evaluation of a software platform and the best software platform for different application domains.

  6. Design and Implementation of Online Experimental Platform for Computer Networks Course

    Institute of Scientific and Technical Information of China (English)

    WANG Ben; ZHANG Tao

    2012-01-01

    Practical training is very important for students learning computer networks, but building a real laboratory is constrained and expensive. In this paper, we present an online experimental platform for the computer networks course based on the Dynamips simulator. Instructors and students can access the platform through the IE browser to manage and carry out router experiments. On the basis of deployment and testing, the platform is effective and flexible.

  7. Cloud Computing as Evolution of Distributed Computing – A Case Study for SlapOS Distributed Cloud Computing Platform

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2013-01-01

    Full Text Available The cloud computing paradigm has been defined from several points of view, the main two directions being either as an evolution of the grid and distributed computing paradigm, or, on the contrary, as a disruptive revolution in the classical paradigms of operating systems, network layers and web applications. This paper presents a distributed cloud computing platform called SlapOS, which unifies technologies and communication protocols into a new technology model for offering any application as a service. Both cloud and distributed computing can be efficient methods for optimizing resources that are aggregated from a grid of standard PCs hosted in homes, offices and small data centers. The paper fills a gap in the existing distributed computing literature by providing a distributed cloud computing model which can be applied for deploying various applications.

  8. Position measurement/tracking comparison of the instrumentation in a droplet-actuated-robotic platform.

    Science.gov (United States)

    Casier, Renaud; Lenders, Cyrille; Lhernould, Marion Sausse; Gauthier, Michaël; Lambert, Pierre

    2013-05-07

    This paper reports our work on developing a surface tension actuated micro-robotic platform supported by three bubbles (liquid environment) or droplets (gaseous environment). The actuation principle relies on the force developed by surface tension below a millimeter, which benefits from scaling laws, and is used to actuate this new type of compliant robot. By separately controlling the pressure inside each bubble, three degrees of freedom can be actuated. We investigated three sensing solutions to measure the platform attitude in real-time (z-position of each droplet, leading to the knowledge of the z position and Θx and Θy tilts of the platform). The comparison between optical, resistive, and capacitive measurement principles is hereafter reported. The optical technique uses SFH-9201 components. The resistive technique involves measuring the electrical resistance of a path flowing through two droplets and the platform. This innovative technique for sensing table position combines three pairs of resistances, from which the resistance in each drop can be deduced, thus determining the platform position. The third solution is a more usual high frequency (~200 MHz) capacitive measurement. The resistive method has been proven reliable and is simple to implement. This work opens perspectives toward an interesting sensing solution for micro-robotic platforms.
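The attitude read-out described in this record, from three per-droplet heights to the platform's heave and two tilts, is a plane fit. A minimal numeric sketch (hypothetical support geometry, not the paper's dimensions):

```python
import math

def platform_attitude(supports, z):
    """Fit the plane z = z0 + a*x + b*y through three droplet supports.

    supports -- [(x, y), ...]: known in-plane coordinates of the droplets
    z        -- [z1, z2, z3]: measured height of each droplet
    Returns (z0, theta_x, theta_y): heave and the two tilt angles in radians
    (theta_x from the y-slope b, theta_y from the x-slope a; exact signs
    depend on the chosen rotation convention).
    """
    (x1, y1), (x2, y2), (x3, y3) = supports
    # Eliminate z0 and solve the remaining 2x2 system by Cramer's rule.
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    a = ((z[1] - z[0]) * (y3 - y1) - (z[2] - z[0]) * (y2 - y1)) / det
    b = ((x2 - x1) * (z[2] - z[0]) - (x3 - x1) * (z[1] - z[0])) / det
    z0 = z[0] - a * x1 - b * y1
    return z0, math.atan(b), math.atan(a)

# Equilateral support triangle with 1 mm circumradius (hypothetical geometry).
sup = [(1.0, 0.0), (-0.5, math.sqrt(3) / 2), (-0.5, -math.sqrt(3) / 2)]
print(platform_attitude(sup, [0.1, 0.1, 0.1]))  # pure heave: both tilts zero
```

Any of the three sensing principles compared in the record (optical, resistive, capacitive) can feed this fit, since each ultimately reports one height per droplet.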

  9. Position Measurement/Tracking Comparison of the Instrumentation in a Droplet-Actuated-Robotic Platform

    Directory of Open Access Journals (Sweden)

    Pierre Lambert

    2013-05-01

    Full Text Available This paper reports our work on developing a surface tension actuated micro-robotic platform supported by three bubbles (liquid environment) or droplets (gaseous environment). The actuation principle relies on the force developed by surface tension below a millimeter, which benefits from scaling laws, and is used to actuate this new type of compliant robot. By separately controlling the pressure inside each bubble, three degrees of freedom can be actuated. We investigated three sensing solutions to measure the platform attitude in real-time (z-position of each droplet, leading to the knowledge of the z position and Θx and Θy tilts of the platform). The comparison between optical, resistive, and capacitive measurement principles is hereafter reported. The optical technique uses SFH-9201 components. The resistive technique involves measuring the electrical resistance of a path flowing through two droplets and the platform. This innovative technique for sensing table position combines three pairs of resistances, from which the resistance in each drop can be deduced, thus determining the platform position. The third solution is a more usual high frequency (~200 MHz) capacitive measurement. The resistive method has been proven reliable and is simple to implement. This work opens perspectives toward an interesting sensing solution for micro-robotic platforms.

  10. Track-position and vibration control simulation for strut of the Stewart platform

    Institute of Scientific and Technical Information of China (English)

    Zhao-dong XU; Chen-hui WENG

    2013-01-01

    Vibrations inherently generated by on-board disturbance sources degrade the performance of the instruments in an on-orbit spacecraft, which have stringent accuracy requirements. The Stewart platform enables both track-positioning and vibration control. The strut of the Stewart platform is designed as a piezoelectric (PZT) element in series with a voice coil motor (VCM) element and a viscoelastic element. The track-positioning system uses a VCM as the main positioning control driver and a PZT as the positioning compensator. The vibration control system uses the characteristics of struts including active and passive control elements to attenuate the vibration. Simulation results indicate that the Stewart platform with the designed struts has good performance in tracking and vibration attenuation with different interference waves.
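The division of labor described above, a VCM as main driver with a PZT as compensator, is a classic coarse/fine arrangement. A toy discrete-time illustration (the rate and stroke limits are made-up numbers, not the paper's strut parameters):

```python
def dual_stage_track(ref, vcm_rate=0.5, pzt_stroke=0.2):
    """Toy dual-stage positioner: a rate-limited coarse actuator (VCM)
    plus a stroke-limited fine compensator (PZT). Hypothetical limits."""
    vcm = 0.0
    out = []
    for r in ref:
        # Coarse stage: slew toward the reference, limited rate per step.
        step = max(-vcm_rate, min(vcm_rate, r - vcm))
        vcm += step
        # Fine stage: cancel the residual error within its stroke limit.
        pzt = max(-pzt_stroke, min(pzt_stroke, r - vcm))
        out.append(vcm + pzt)
    return out

ref = [1.0] * 6  # step reference
print(dual_stage_track(ref))  # [0.7, 1.0, 1.0, 1.0, 1.0, 1.0]
```

The fine stage shrinks the tracking error while the coarse stage is still slewing, which is the qualitative benefit the record attributes to the PZT compensator.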

  11. PerPos: A Platform Providing Cloud Services for Pervasive Positioning

    DEFF Research Database (Denmark)

    Blunck, Henrik; Godsk, Torben; Grønbæk, Kaj

    2010-01-01

    -based building model manager that allows users to manage building models stored in the PerPos cloud for annotation, logging, and navigation purposes. A core service in the PerPos platform is sensor fusion for positioning that makes it seamless and efficient to combine a rich set of position sensors to obtain...

  12. OpenRS-Cloud:A remote sensing image processing platform based on cloud computing environment

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    This paper explores the use of cloud computing for remote sensing image processing. The main contribution of our work is to develop a remote sensing image processing platform based on cloud computing technology (OpenRS-Cloud). This paper focuses on enabling methodical investigations into the development pattern, computational model, data management and service model exploring this novel distributed computing model. The experimental InSAR processing flow is implemented to verify the efficiency and feasibility of the OpenRS-Cloud platform. The results show that cloud computing is well suited for computationally-intensive and data-intensive remote sensing services.

  13. ComcuteJS: A Web Browser Based Platform For Large-Scale Computations

    Directory of Open Access Journals (Sweden)

    Roman Debski

    2013-01-01

    Full Text Available The paper presents a new, cost effective, volunteer computing based platform. It utilizes volunteers' web browsers as computational nodes. The computational tasks are delegated to the browsers and executed in the background (independently of any user interface scripts) making use of the HTML5 web workers technology. The capabilities of the platform have been proved by experiments performed in a wide range of numbers of computational nodes (1-400).
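The delegate-and-gather cycle of such a volunteer platform can be sketched in Python, with a thread pool standing in for the browser-based web workers (an analogy only; the actual platform runs JavaScript in HTML5 web workers):

```python
from concurrent.futures import ThreadPoolExecutor

def run_distributed(task, chunks, workers=4):
    """Delegate independent chunks to worker 'nodes' and gather results,
    mirroring the delegate/execute/collect cycle of a volunteer platform."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(task, chunks))

# Example task: partial sums of squares, merged by the master.
chunks = [range(0, 250), range(250, 500), range(500, 750), range(750, 1000)]
partials = run_distributed(lambda c: sum(i * i for i in c), chunks)
total = sum(partials)
print(total == sum(i * i for i in range(1000)))  # True
```

The key property exploited by both the sketch and the platform is that chunks are independent, so nodes can join or leave between chunks without coordination.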

  14. Meta-instrument: high speed positioning and tracking platform for near-field optical imaging microscopes

    CERN Document Server

    Bijster, R J F; Spierdijk, J P F; Dekker, A; Klop, W A; Kramer, G F IJ; Cheng, L K; Hagen, R A J; Sadeghian, H

    2016-01-01

    High resolution and high throughput imaging are typically mutually exclusive. The meta-instrument pairs high resolution optical concepts such as nano-antennas, superoscillatory lenses and hyperlenses with a miniaturized opto-mechatronic platform for precise and high speed positioning of the optical elements at lens-to-sample separations that are measured in tens of nanometers. Such a platform is a necessary development for bringing near-field optical imaging techniques to their industrial application. Towards this purpose, we present two designs and proof-of-principle instruments that are aimed at realizing sub-nanometer positional precision with a 100 kHz bandwidth.

  15. The development of a computational platform to design and simulate on-board hydrogen storage systems

    DEFF Research Database (Denmark)

    Mazzucco, Andrea; Rokni, Masoud

    2017-01-01

    the vehicular tank within the frame of a complete refueling system. The two technologies that are integrated in the platform are solid-state hydrogen storage in the form of metal hydrides and compressed gas systems. In this work the computational platform is used to compare the storage performance of two tank...

  16. Gravity field error analysis - Applications of Global Positioning System receivers and gradiometers on low orbiting platforms

    Science.gov (United States)

    Schrama, Ernst J. O.

    1991-11-01

    The concept of a Global Positioning System (GPS) receiver as a tracking facility and a gradiometer as a separate instrument on a low-orbiting platform offers a unique tool to map the earth's gravitational field with unprecedented accuracies. The former technique allows determination of the spacecraft's ephemeris at any epoch to within 3-10 cm; the latter permits the measurement of the tensor of second order derivatives of the gravity field to within 0.01 to 0.0001 Eotvos units, depending on the type of gradiometer. First, a variety of error sources in gradiometry is described, with emphasis placed on the rotational problem, pursuing both a static and a dynamic approach. Next, an analytical technique is described and applied for an error analysis of gravity field parameters from gradiometer and GPS observation types. Results are discussed for various configurations proposed on Topex/Poseidon, Gravity Probe-B, and Aristoteles, indicating that GPS-only solutions may be computed up to degree and order 35, 55, and 85, respectively, whereas a combined GPS/gradiometer experiment on Aristoteles may result in an acceptable solution up to degree and order 240.
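The quoted maximum degrees translate directly into problem size: a field complete to degree and order N carries 2n + 1 coefficients per degree n, so the unknowns for the solutions above can be tallied as a quick back-of-the-envelope check (a sketch, not a computation from the paper):

```python
def num_sh_coefficients(max_degree, min_degree=2):
    """Number of spherical-harmonic coefficients (C_nm and S_nm combined)
    in a gravity field complete to max_degree: 2n + 1 per degree n.
    Degrees 0 and 1 are conventionally fixed by total mass and geocenter."""
    return sum(2 * n + 1 for n in range(min_degree, max_degree + 1))

# Unknowns to estimate for the solution sizes quoted in the abstract.
for deg in (35, 55, 240):
    print(deg, num_sh_coefficients(deg))  # 1292, 3132 and 58077 coefficients
```

The jump from degree 85 (GPS-only) to degree 240 (GPS plus gradiometer) is thus roughly an eightfold increase in estimated parameters, which is why the gradiometer data are essential at short wavelengths.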

  17. Resilient and Robust High Performance Computing Platforms for Scientific Computing Integrity

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Yier [Univ. of Central Florida, Orlando, FL (United States)

    2017-07-14

    As technology advances, computer systems are subject to increasingly sophisticated cyber-attacks that compromise both their security and integrity. High performance computing platforms used in commercial and scientific applications involving sensitive, or even classified data, are frequently targeted by powerful adversaries. This situation is made worse by a lack of fundamental security solutions that both perform efficiently and are effective at preventing threats. Current security solutions fail to address the threat landscape and ensure the integrity of sensitive data. As challenges rise, both private and public sectors will require robust technologies to protect their computing infrastructure. The research outcomes from this project try to address all these challenges. For example, we present LAZARUS, a novel technique to harden kernel Address Space Layout Randomization (KASLR) against paging-based side-channel attacks. In particular, our scheme allows for fine-grained protection of the virtual memory mappings that implement the randomization. We demonstrate the effectiveness of our approach by hardening a recent Linux kernel with LAZARUS, mitigating all of the previously presented side-channel attacks on KASLR. Our extensive evaluation shows that LAZARUS incurs only 0.943% overhead for standard benchmarks, and is therefore highly practical. We also introduced HA2lloc, a hardware-assisted allocator that is capable of leveraging an extended memory management unit to detect memory errors in the heap. We also perform testing using HA2lloc in a simulation environment and find that the approach is capable of preventing common memory vulnerabilities.

  18. A cloud computing based platform for sleep behavior and chronic diseases collaborative research.

    Science.gov (United States)

    Kuo, Mu-Hsing; Borycki, Elizabeth; Kushniruk, Andre; Huang, Yueh-Min; Hung, Shu-Hui

    2014-01-01

    The objective of this study is to propose a Cloud Computing based platform for sleep behavior and chronic disease collaborative research. The platform consists of two main components: (1) a sensing bed sheet with textile sensors to automatically record patient's sleep behaviors and vital signs, and (2) a service-oriented cloud computing architecture (SOCCA) that provides a data repository and allows for sharing and analysis of collected data. Also, we describe our systematic approach to implementing the SOCCA. We believe that the new cloud-based platform can provide nurse and other health professional researchers located in differing geographic locations with a cost effective, flexible, secure and privacy-preserved research environment.

  19. Model Predictive Controller Design for the Dynamic Positioning System of a Semi-submersible Platform

    Institute of Scientific and Technical Information of China (English)

    Hongli Chen; Lei Wan; Fang Wang; Guocheng Zhang

    2012-01-01

    This paper researches how to apply the advanced control technology of model predictive control (MPC) to the design of the dynamic positioning system (DPS) of a semi-submersible platform. First, a linear low-frequency motion model with three degrees of freedom was established in the context of a semi-submersible platform. Second, a model predictive controller was designed based on a model which took the constraints of the system into account. Third, simulation was carried out to demonstrate the feasibility of the controller. The results show that the model predictive controller has good performance and is good at dealing with the constraints of the system.
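The receding-horizon idea behind such a controller, optimizing over a constrained input set and applying only the first input, can be sketched with a brute-force toy for a single degree of freedom (made-up model and weights, not the paper's three-DOF platform model, and an enumeration instead of the QP a real MPC would solve):

```python
from itertools import product

def mpc_step(x, v, target, horizon=4, controls=(-1.0, 0.0, 1.0), dt=0.2):
    """One receding-horizon step: enumerate admissible input sequences,
    score the predicted trajectory, and return only the first input."""
    best_cost, best_u = float("inf"), 0.0
    for seq in product(controls, repeat=horizon):
        px, pv, cost = x, v, 0.0
        for u in seq:  # predict with a toy double-integrator surge model
            pv += u * dt
            px += pv * dt
            cost += (px - target) ** 2 + 0.01 * u ** 2
        if cost < best_cost:
            best_cost, best_u = cost, seq[0]
    return best_u

# Closed loop: drive one positioning axis from rest toward target 1.0.
x, v = 0.0, 0.0
for _ in range(30):
    u = mpc_step(x, v, 1.0)
    v += u * 0.2
    x += v * 0.2
print(round(x, 3))
```

Because the input set itself encodes the actuator constraint, the controller never commands an infeasible thrust, which is the property the record highlights for constrained dynamic positioning.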

  20. Oak Ridge Leadership Computing Facility Position Paper

    Energy Technology Data Exchange (ETDEWEB)

    Oral, H Sarp [ORNL; Hill, Jason J [ORNL; Thach, Kevin G [ORNL; Podhorszki, Norbert [ORNL; Klasky, Scott A [ORNL; Rogers, James H [ORNL; Shipman, Galen M [ORNL

    2011-01-01

    This paper discusses the business, administration, reliability, and usability aspects of storage systems at the Oak Ridge Leadership Computing Facility (OLCF). The OLCF has developed key competencies in architecting and administration of large-scale Lustre deployments as well as HPSS archival systems. Additionally as these systems are architected, deployed, and expanded over time reliability and availability factors are a primary driver. This paper focuses on the implementation of the Spider parallel Lustre file system as well as the implementation of the HPSS archive at the OLCF.

  1. Future Computing Platforms for Science in a Power Constrained Era

    Science.gov (United States)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert

    2015-12-01

    Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. We evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG).
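The performance-per-watt selection criterion described above amounts to a simple ratio ranking. A sketch with hypothetical placeholder figures (not measurements from the survey):

```python
def rank_by_perf_per_watt(results):
    """Rank platforms by benchmark score per watt (higher is better)."""
    return sorted(results, key=lambda r: r[1] / r[2], reverse=True)

# (platform, benchmark score in events/s, average power draw in W) --
# all three rows are invented numbers for illustration only.
candidates = [("x86-64", 1000.0, 150.0), ("ARMv8", 300.0, 30.0),
              ("GPU", 4000.0, 250.0)]
for name, score, watts in rank_by_perf_per_watt(candidates):
    print(name, round(score / watts, 1))
```

The point of the metric is visible even in this toy: a platform with the lowest absolute throughput can still win once power draw is in the denominator.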

  2. Future Computing Platforms for Science in a Power Constrained Era

    CERN Document Server

    Abdurachmanov, David; Eulisse, Giulio; Knight, Robert

    2015-01-01

    Power consumption will be a key constraint on the future growth of Distributed High Throughput Computing (DHTC) as used by High Energy Physics (HEP). This makes performance-per-watt a crucial metric for selecting cost-efficient computing solutions. For this paper, we have done a wide survey of current and emerging architectures becoming available on the market including x86-64 variants, ARMv7 32-bit, ARMv8 64-bit, Many-Core and GPU solutions, as well as newer System-on-Chip (SoC) solutions. We compare performance and energy efficiency using an evolving set of standardized HEP-related benchmarks and power measurement techniques we have been developing. We evaluate the potential for use of such computing solutions in the context of DHTC systems, such as the Worldwide LHC Computing Grid (WLCG).

  3. Thermal Balancing Policy for Multiprocessor Stream Computing Platforms

    OpenAIRE

    Mulas, Fabrizio; Atienza, David; Acquaviva, Andrea; Carta, Salvatore; Benini, Luca; Micheli, Giovanni De

    2009-01-01

    Die-temperature control to avoid hotspots is increasingly critical in Multiprocessor System-on-Chip (MPSoCs) for stream computing. In this context, thermal balancing policies based on task migration are a promising approach to re-distribute power dissipation and even out temperature gradients. Since stream computing applications require strict quality of service and timing constraints, the real-time performance impact of thermal balancing policies must be carefully evaluated. In this pa...

  4. Platformation: Cloud Computing Tools at the Service of Social Change

    Directory of Open Access Journals (Sweden)

    Anil Patel

    2012-07-01

    Full Text Available The following article establishes some context and definitions for what is termed the “sharing imperative” – a movement or tendency towards sharing information online and in real time that has rapidly transformed several industries. As internet-enabled devices proliferate to all corners of the globe, ways of working and accessing information have changed. Users now expect to be able to access the products, services, and information that they want from anywhere, at any time, on any device. This article addresses how the nonprofit sector might respond to those demands by embracing the sharing imperative. It suggests that how well an organization shares has become one of the most pressing governance questions a nonprofit organization must tackle. Finally, the article introduces Platformation, a project whereby tools that enable better inter and intra-organizational sharing are tested for scalability, affordability, interoperability, and security, all with a non-profit lens.

  5. Smooth integral sliding mode controller for the position control of Stewart platform.

    Science.gov (United States)

    Kumar P, Ramesh; Chalanga, Asif; Bandyopadhyay, B

    2015-09-01

    This paper proposes the application of a new algorithm for the position control of a Stewart platform. The conventional integral sliding mode controller is a combination of nominal control and discontinuous feedback control, hence the overall control is discontinuous in nature. The discontinuity in the feedback control is undesirable for practical applications due to chattering, which causes the wear and tear of the mechanical actuators. In this paper the existing integral sliding mode control law for systems with matched disturbances is modified by replacing the discontinuous part by a continuous modified twisting control. This proposed controller is continuous in nature due to the combination of two continuous controls. The desired position of the platform has been achieved using the proposed controller even in the presence of matched disturbances. The effectiveness of the proposed controller has been proved with the simulation results.
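The contrast drawn above, a discontinuous switching term versus a continuous twisting-style term, can be illustrated on a toy first-order sliding variable (hypothetical gains and disturbance, not the paper's Stewart-platform model or its exact control law):

```python
import math

def simulate(control, steps=2000, dt=0.005):
    """Integrate the sliding variable s' = u + d(t) under a bounded,
    slowly varying matched disturbance d; return s and u histories."""
    s, w = 0.5, 0.0
    S, U = [], []
    for k in range(steps):
        d = 0.8 * math.sin(0.2 * k * dt)
        u, w = control(s, w, dt)
        s += dt * (u + d)
        S.append(s)
        U.append(u)
    return S, U

def switching(s, w, dt):
    """Classic discontinuous term: u = -K * sign(s), which chatters."""
    return -2.0 * (1.0 if s > 0 else -1.0), w

def twisting_like(s, w, dt):
    """Continuous twisting-style term: the sign acts only through an
    integral, so the applied control itself is continuous."""
    sgn = 1.0 if s > 0 else -1.0
    w += dt * (-3.0 * sgn)
    return -2.0 * math.sqrt(abs(s)) * sgn + w, w

S1, U1 = simulate(switching)
S2, U2 = simulate(twisting_like)
chatter = sum(U1[i] * U1[i - 1] < 0 for i in range(1, len(U1)))
smooth = max(abs(U2[i] - U2[i - 1]) for i in range(1, len(U2)))
print(chatter, round(smooth, 3), round(abs(S2[-1]), 3))
```

The switching law flips sign hundreds of times once the sliding band is reached, while the continuous law's step-to-step control changes stay small, which is the wear-and-tear argument the record makes for replacing the discontinuous part.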

  6. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    Science.gov (United States)

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, Multiple Sclerosis as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage and future direction.

  7. Accuracy Analysis of a Low-Cost Platform for Positioning and Navigation

    Science.gov (United States)

    Hofmann, S.; Kuntzsch, C.; Schulze, M. J.; Eggert, D.; Sester, M.

    2012-07-01

    This paper presents an accuracy analysis of a platform based on low-cost components for landmark-based navigation intended for research and teaching purposes. The proposed platform includes a LEGO MINDSTORMS NXT 2.0 kit, an Android-based Smartphone as well as a compact laser scanner Hokuyo URG-04LX. The robot is used in a small indoor environment, where GNSS is not available. Therefore, a landmark map was produced in advance, with the landmark positions provided to the robot. All steps of the procedure to set up the platform are shown. The main focus of this paper is the reachable positioning accuracy, which was analyzed in this type of scenario depending on the accuracy of the reference landmarks and the directional and distance measuring accuracy of the laser scanner. Several experiments were carried out, demonstrating the practically achievable positioning accuracy. To evaluate the accuracy, ground truth was acquired using a total station. These results are compared to the theoretically achievable accuracies and the laser scanner's characteristics.
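A minimal version of such a landmark-based fix: ranges and robot-relative bearings to two landmarks of known map position determine the planar pose. The sketch below is a noise-free, hypothetical construction, not the platform's actual estimator:

```python
import math

def localize(l1, l2, r1, r2, b1, b2):
    """Planar pose fix from ranges r_i and robot-relative bearings b_i to
    two landmarks l1, l2 with known map coordinates (noise-free sketch)."""
    dx, dy = l2[0] - l1[0], l2[1] - l1[1]
    d = math.hypot(dx, dy)
    a = (r1 ** 2 - r2 ** 2 + d ** 2) / (2 * d)   # along-baseline offset
    h = math.sqrt(max(r1 ** 2 - a ** 2, 0.0))    # off-baseline offset
    mx, my = l1[0] + a * dx / d, l1[1] + a * dy / d
    for sign in (1.0, -1.0):                     # the two circle intersections
        x, y = mx - sign * h * dy / d, my + sign * h * dx / d
        theta = math.atan2(l1[1] - y, l1[0] - x) - b1
        # Keep the solution whose predicted bearing to l2 matches b2.
        pred = math.atan2(l2[1] - y, l2[0] - x) - theta
        if abs(math.remainder(pred - b2, 2 * math.pi)) < 1e-3:
            return x, y, math.remainder(theta, 2 * math.pi)
    return None

# Robot at (1, 1), heading 0: landmark (4, 1) is 3 m dead ahead (bearing 0),
# landmark (1, 5) is 4 m away at bearing +90 degrees.
print(localize((4.0, 1.0), (1.0, 5.0), 3.0, 4.0, 0.0, math.pi / 2))
```

With noisy ranges and bearings, as studied in the record, the same geometry is typically solved by least squares over more than two landmarks, and the achievable accuracy is governed by the scanner's range and angular errors.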

  8. The Sugar Learning Platform: Affordances for Computational Thinking

    Directory of Open Access Journals (Sweden)

    Walter Bender

    2017-01-01

    Full Text Available Ten years after the launch of the Sugar Learning Platform, we reflect on the specific tools and educational affordances promoted by this platform to engage students in computational thinking, with the overall goal of achieving mastery. These tools include multiple multimedia programming environments as well as various mechanisms for debugging, collaboration, expression, and reflection. Our selection of tools is informed by a thorough review of the pioneering work of Seymour Papert, Marvin Minsky, and Cynthia Solomon, who were the first to integrate multimedia computing into primary schools at the end of the 1960s, with the goal of engaging students in the mastery of many of the heuristic concepts and algorithms associated with computational thinking. In this article, we analyze many examples of how these tools have been used by teachers and students alike. In addition, we describe the role that Open Source Software plays in providing the scaffolding for deep personal expression through programming and in highlighting the personal responsibility, sense of community, and boundless expectations of Sugar users who become developers.

  9. Determining position inside building via laser rangefinder and handheld computer

    Science.gov (United States)

    Ramsey, Jr. James L.; Finley, Patrick; Melton, Brad

    2010-01-12

    An apparatus, computer software, and a method of determining position inside a building comprising selecting on a PDA at least two walls of a room in a digitized map of a building or a portion of a building, pointing and firing a laser rangefinder at corresponding physical walls, transmitting collected range information to the PDA, and computing on the PDA a position of the laser rangefinder within the room.
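The two-wall fix described in this record reduces to a small linear solve once each laser range is interpreted as a distance to a known wall line. A sketch under the simplifying assumption that the laser is fired along each wall's inward normal (the patented method need not make this assumption):

```python
def position_from_walls(w1, w2, r1, r2):
    """Intersect two range-to-wall constraints n . p = c + r, where each
    wall is (nx, ny, c): unit inward normal (nx, ny) and wall line
    n . p = c from the digitized building map. Toy 2D counterpart of the
    PDA-plus-rangefinder fix; walls must not be parallel."""
    (n1x, n1y, c1), (n2x, n2y, c2) = w1, w2
    b1, b2 = c1 + r1, c2 + r2
    det = n1x * n2y - n1y * n2x
    x = (b1 * n2y - b2 * n1y) / det
    y = (n1x * b2 - n2x * b1) / det
    return x, y

# Room corner at the origin: wall x=0 (inward normal +x) and wall y=0 (+y).
print(position_from_walls((1, 0, 0), (0, 1, 0), 2.5, 1.5))  # (2.5, 1.5)
```

Choosing two walls that are close to perpendicular keeps the determinant large, which is why a corner of the room gives the best-conditioned fix.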

  10. Social Computing as Next-Gen Learning Paradigm: A Platform and Applications

    Science.gov (United States)

    Margherita, Alessandro; Taurino, Cesare; Del Vecchio, Pasquale

    As a field at the intersection between computer science and people behavior, social computing can contribute significantly in the endeavor of innovating how individuals and groups interact for learning and working purposes. In particular, the generation of Internet applications tagged as web 2.0 provides an opportunity to create new “environments” where people can exchange knowledge and experience, create new knowledge and learn together. This chapter illustrates the design and application of a prototypal platform which embeds tools such as blog, wiki, folksonomy and RSS in a unique web-based system. This platform has been developed to support a case-based and project-driven learning strategy for the development of business and technology management competencies in undergraduate and graduate education programs. A set of illustrative scenarios are described to show how a learning community can be promoted, created, and sustained through the technological platform.

  11. Supporting Real-Time Computer Vision Workloads using OpenVX on Multicore+GPU Platforms

    Science.gov (United States)

    2015-05-01

    Glenn A. Elliott, Kecheng Yang, and James H. Anderson; Department of Computer Science, University of North Carolina at Chapel Hill. Abstract: In the automotive industry, there is currently great interest in vision-based sensing through cameras... enabling workloads specified using OpenVX to be supported in a predictable way.

  12. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    Science.gov (United States)

    Mather, Richard

    2015-01-01

    This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses "Ceebot," an animated and immersive game-like development environment. Multivariate…

  13. The Relationship between Chief Information Officer Transformational Leadership and Computing Platform Operating Systems

    Science.gov (United States)

    Anderson, George W.

    2010-01-01

    The purpose of this study was to relate the strength of Chief Information Officer (CIO) transformational leadership behaviors to 1 of 5 computing platform operating systems (OSs) that may be selected for a firm's Enterprise Resource Planning (ERP) business system. Research shows executive leader behaviors may promote innovation through the use of…

  14. UrbanWeb: a Platform for Mobile Context-aware Social Computing

    DEFF Research Database (Denmark)

    Hansen, Frank Allan; Grønbæk, Kaj

    2010-01-01

    UrbanWeb is a novel Web-based context-aware hypermedia platform. It provides essential mechanisms for mobile social computing applications: the framework implements context as an extension to Web 2.0 tagging and provides developers with an easy-to-use platform for mobile context-aware appli...

  15. Comparison of different computer platforms for running the Versatile Advection Code

    NARCIS (Netherlands)

    Toth, G.; Keppens, R.; Sloot, P.; Bubak, M.; Hertzberger, B.

    1998-01-01

    The Versatile Advection Code is a general tool for solving hydrodynamical and magnetohydrodynamical problems arising in astrophysics. We compare the performance of the code on different computer platforms, including workstations and vector and parallel supercomputers. Good parallel scaling can be a

  17. CBRAIN: A web-based, distributed computing platform for collaborative neuroimaging research

    Directory of Open Access Journals (Sweden)

    Tarek eSherif

    2014-05-01

    Full Text Available The Canadian Brain Imaging Research Platform (CBRAIN is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN’s flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC centers in Canada, one in Korea, one in Germany and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource-interoperability in a transparent manner for the end-user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson’s and Alzheimer’s diseases, Multiple Sclerosis as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage and future directions.

  18. Identifying D-positive donors using a second automated testing platform.

    Science.gov (United States)

    Goldman, M; Resz, I; Cote, J; Ochoa, G; Angus, N

    2013-01-01

    Because of the variability of D expression, one method may be inadequate to correctly classify donors with variant RHD alleles. We evaluated the use of a solid-phase automated platform (ImmucorGamma Galileo) to confirm D- test results obtained on first-time donors on the Beckman Coulter PK7300 automated microplate test system. Samples with discordant results were analyzed by serologic tube methods, RHD genotyping using the BLOODchip platform (Progenika) and, if necessary, sequencing. We estimated the number of cases of alloimmunization in women younger than 50 years likely to be prevented by the addition of Galileo testing. From May 2011 to May 2012, 910,220 donor samples were tested; 15,441 were first-time donors with concordant D- results. Five donors tested D- on the PK7300 and weak D+ on the Galileo; one was found to be a false positive on further testing. On manual testing, the other four donors had positive indirect antiglobulin test results with one to three of the antisera used and were C+. On BLOODchip testing, two donors were classified as D+, and two were assigned a "no call". D variants included weak D type 67, weak D type 9, and two novel variants. Approximately 10 percent of D- units are transfused to women younger than 50 years. Assuming an alloimmunization rate of 30 percent, use of the Galileo would prevent approximately one alloimmunization every 5 to 6 years in this patient group. We conclude that the yield of preventing alloimmunization in this population by adding a second automated serologic testing platform is very low.
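The yield estimate at the end of the abstract is simple arithmetic. A hedged back-of-envelope version with the round numbers quoted (the abstract's own 5-to-6-year figure presumably rests on slightly different inputs; this sketch only lands in the same ballpark):

```python
# Illustrative inputs taken from the abstract; all are assumptions.
true_variant_donors_per_year = 4  # variant-D donors caught only by the second platform
frac_units_to_young_women = 0.10  # ~10% of D- units go to women under 50
alloimmunization_rate = 0.30      # assumed rate after exposure to a D+ unit

cases_prevented_per_year = (true_variant_donors_per_year
                            * frac_units_to_young_women
                            * alloimmunization_rate)
years_per_prevented_case = 1 / cases_prevented_per_year  # ~8.3 with these inputs
```

Whatever the exact inputs, the product of a handful of donors per year with two small fractions yields well under one prevented case per year, which is the abstract's point about low yield.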

  19. Discussion on Cloud Computing Platforms

    Institute of Scientific and Technical Information of China (English)

    石美峰

    2014-01-01

    This paper analyzes the drawbacks of the prevailing "dedicated machine for a dedicated purpose" and "stovepipe" patterns in current enterprise computer application system construction. It introduces the characteristics and value of the IaaS cloud computing platform, namely resource virtualization, on-demand computing, dynamic deployment and flexible scaling, and describes the platform's models, composition and construction methods. It proposes that enterprises should adopt a cloud computing platform as the resource architecture pattern for their data centers.

  20. Measuring and tuning energy efficiency on large scale high performance computing platforms.

    Energy Technology Data Exchange (ETDEWEB)

    Laros, James H., III

    2011-08-01

    Recognition of the importance of power in the field of High Performance Computing, whether it be as an obstacle, expense or design consideration, has never been greater and more pervasive. While research has been conducted on many related aspects, there is a stark absence of work focused on large scale High Performance Computing. Part of the reason is the lack of measurement capability currently available on small or large platforms. Typically, research is conducted using coarse methods of measurement such as inserting a power meter between the power source and the platform, or fine grained measurements using custom instrumented boards (with obvious limitations in scale). To collect the measurements necessary to analyze real scientific computing applications at large scale, an in-situ measurement capability must exist on a large scale capability class platform. In response to this challenge, we exploit the unique power measurement capabilities of the Cray XT architecture to gain an understanding of power use and the effects of tuning. We apply these capabilities at the operating system level by deterministically halting cores when idle. At the application level, we gain an understanding of the power requirements of a range of important DOE/NNSA production scientific computing applications running at large scale (thousands of nodes), while simultaneously collecting current and voltage measurements on the hosting nodes. We examine the effects of both CPU and network bandwidth tuning and demonstrate energy savings opportunities of up to 39% with little or no impact on run-time performance. Capturing scale effects in our experimental results was key. Our results provide strong evidence that next generation large-scale platforms should not only approach CPU frequency scaling differently, but could also benefit from the capability to tune other platform components, such as the network, to achieve energy efficient performance.

  1. Mandibular condyle position in cone beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Hyoung Joo; Kim, Gyu Tae; Choi, Yong Suk; Hwang, Eui Hwan [Kyung Hee Univ. School of Dentistry, Seoul (Korea, Republic of)

    2006-06-15

    To evaluate the position of the mandibular condyle within the articular fossa in an asymptomatic population radiographically, using cone beam computed tomography. Cone beam computed tomography of 60 temporomandibular joints was performed on 15 males and 15 females with no history of temporomandibular disorders or of orthodontic or prosthodontic treatment. The position of the mandibular condyle within the articular fossa at centric occlusion was evaluated. Statistical evaluation was done using SPSS. In the sagittal views, the mandibular condyle was laterally located within the articular fossa at the central section. The right and left mandibular condyles showed an asymmetric positional relationship at the medial, central and lateral sections. The mandibular condyle within the articular fossa in this asymptomatic population was observed in a non-concentric position in the sagittal and coronal views.

  2. Information-computational platform for collaborative multidisciplinary investigations of regional climatic changes and their impacts

    Science.gov (United States)

    Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara

    2013-04-01

    Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, one needs a modern information-computational infrastructure supporting integrated studies in the field of environmental sciences. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern web 2.0 approach, GIS functionality and capabilities to run climate and meteorological models, process large geophysical datasets and support relevant analysis. It also supports joint software development by distributed research groups, and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the integrated WRF and «Planet Simulator» models, along with preprocessing and visualization of modeling results, are also provided. All functions of the platform are accessible to users through a web portal using a common graphical web browser, in the form of an interactive graphical user interface which provides, in particular, capabilities for selecting a geographical region of interest (pan and zoom), manipulating data layers (order, enable/disable, feature extraction) and visualizing results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary research efforts. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through

  3. SU-C-304-04: A Compact Modular Computational Platform for Automated On-Board Imager Quality Assurance

    Energy Technology Data Exchange (ETDEWEB)

    Dolly, S [Washington University School of Medicine, Saint Louis, MO (United States); University of Missouri, Columbia, MO (United States); Cai, B; Chen, H; Anastasio, M; Sun, B; Yaddanapudi, S; Noel, C; Goddu, S; Mutic, S; Li, H [Washington University School of Medicine, Saint Louis, MO (United States); Tan, J [UTSouthwestern Medical Center, Dallas, TX (United States)

    2015-06-15

    Purpose: Traditionally, the assessment of X-ray tube output and detector positioning accuracy of on-board imagers (OBI) has been performed manually and subjectively with rulers and dosimeters, and typically takes hours to complete. In this study, we have designed a compact modular computational platform to automatically analyze OBI images acquired with in-house designed phantoms as an efficient and robust surrogate. Methods: The platform was developed as an integrated and automated image analysis-based platform using MATLAB for easy modification and maintenance. Given a set of images acquired with the in-house designed phantoms, the X-ray output accuracy was examined via cross-validation of the uniqueness and integration minimization of important image quality assessment metrics, while machine geometric and positioning accuracy were validated by utilizing pattern-recognition based image analysis techniques. Results: The platform input was a set of images of an in-house designed phantom. The total processing time is about 1–2 minutes. Based on the data acquired from three Varian Truebeam machines over the course of 3 months, the designed test validation strategy achieved higher accuracy than traditional methods. The kVp output accuracy can be verified within +/−2 kVp, the exposure accuracy within 2%, and exposure linearity with a coefficient of variation (CV) of 0.1. Sub-millimeter position accuracy was achieved for the lateral and longitudinal positioning tests, while vertical positioning accuracy within +/−2 mm was achieved. Conclusion: This new platform delivers to the radiotherapy field an automated, efficient, and stable image analysis-based procedure, for the first time, acting as a surrogate for traditional tests for LINAC OBI systems. It has great potential to facilitate OBI quality assurance (QA) with the assistance of advanced image processing techniques. In addition, it provides flexible integration of additional tests for expediting other OBI
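The tolerance checks quoted in the results can be captured in a few lines. A hedged sketch in which only the thresholds come from the abstract; the function name and the measurement inputs (errors already extracted from the phantom images) are hypothetical:

```python
def obi_qa_report(kvp_error, exposure_error_pct, exposure_cv,
                  lat_error_mm, lng_error_mm, vrt_error_mm):
    """Return pass/fail flags for one OBI QA run.

    Thresholds follow the abstract: +/-2 kVp output, 2% exposure accuracy,
    exposure linearity CV of 0.1, sub-millimeter lateral/longitudinal
    positioning, and +/-2 mm vertical positioning.
    """
    return {
        "kvp": abs(kvp_error) <= 2.0,
        "exposure": abs(exposure_error_pct) <= 2.0,
        "linearity": exposure_cv <= 0.1,
        "lateral": abs(lat_error_mm) <= 1.0,
        "longitudinal": abs(lng_error_mm) <= 1.0,
        "vertical": abs(vrt_error_mm) <= 2.0,
    }
```

In the platform described, the hard part is the image analysis that produces these error values; the report stage itself reduces to threshold comparisons like these.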

  4. Cloud Computing for the Grid: GridControl: A Software Platform to Support the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    None

    2012-02-08

    GENI Project: Cornell University is creating a new software platform for grid operators called GridControl that will utilize cloud computing to more efficiently control the grid. In a cloud computing system, there are minimal hardware and software demands on users. The user can tap into a network of computers that is housed elsewhere (the cloud) and the network runs computer applications for the user. The user only needs interface software to access all of the cloud’s data resources, which can be as simple as a web browser. Cloud computing can reduce costs, facilitate innovation through sharing, empower users, and improve the overall reliability of a dispersed system. Cornell’s GridControl will focus on 4 elements: delivering the state of the grid to users quickly and reliably; building networked, scalable grid-control software; tailoring services to emerging smart grid uses; and simulating smart grid behavior under various conditions.

  5. Applications integration in a hybrid cloud computing environment: modelling and platform

    Science.gov (United States)

    Li, Qing; Wang, Ze-yuan; Li, Wei-hua; Li, Jun; Wang, Cheng; Du, Rui-yang

    2013-08-01

    With the development of application service providers and cloud computing, more and more small- and medium-sized business enterprises use software services and even infrastructure services provided by professional information service companies to replace all or part of their information systems (ISs). These information service companies provide applications, such as data storage, computing processes, document sharing and even management information system services, as public resources to support the business process management of their customers. However, no cloud computing service vendor can satisfy the full functional IS requirements of an enterprise. As a result, enterprises often have to simultaneously use systems distributed in different clouds and their intra-enterprise ISs. Thus, this article presents a framework to integrate applications deployed in public clouds and intra-enterprise ISs. A run-time platform is developed, and a cross-computing-environment process modelling technique is also developed to improve the feasibility of ISs under hybrid cloud computing environments.

  6. A new embedded solution of hyperspectral data processing platform: the embedded GPU computer

    Science.gov (United States)

    Zhang, Lei; Gao, Jiao Bo; Hu, Yu; Sun, Ke Feng; Wang, Ying Hui; Cheng, Juan; Sun, Dan Dan; Li, Yu

    2016-10-01

    In the development of hyper-spectral imaging spectrometers, how to process the huge amount of image data is a difficult problem for all researchers: the data rate is on the order of several hundred megabytes per second. Traditional embedded hyper-spectral data processing platforms such as DSPs and FPGAs have their own drawbacks. With the development of GPUs, parallel computing on the GPU is increasingly applied to large-scale data processing. In this paper, we propose a new embedded solution for the hyper-spectral data processing platform, based on an embedded GPU computer. We also give a detailed discussion of how to acquire and process hyper-spectral data on the embedded GPU computer. We use C++ AMP technology to control the GPU and schedule the parallel computing. Experimental results show that hyper-spectral data processing on the embedded GPU computer is considerably faster than on an ordinary computer. Our research has significant meaning for the engineering application of hyper-spectral imaging spectrometers.

  7. Open-Phylo: a customizable crowd-computing platform for multiple sequence alignment.

    Science.gov (United States)

    Kwak, Daniel; Kam, Alfred; Becerra, David; Zhou, Qikuan; Hops, Adam; Zarour, Eleyine; Kam, Arthur; Sarmenta, Luis; Blanchette, Mathieu; Waldispühl, Jérôme

    2013-01-01

    Citizen science games such as Galaxy Zoo, Foldit, and Phylo aim to harness the intelligence and processing power generated by crowds of online gamers to solve scientific problems. However, the selection of the data to be analyzed through these games is under the exclusive control of the game designers, and so are the results produced by gamers. Here, we introduce Open-Phylo, a freely accessible crowd-computing platform that enables any scientist to enter our system and use crowds of gamers to assist computer programs in solving one of the most fundamental problems in genomics: the multiple sequence alignment problem.

  8. Getting the Most from Distributed Resources With an Analytics Platform for ATLAS Computing Services

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration; Bryant, Lincoln

    2016-01-01

    To meet a sharply increasing demand for computing resources for LHC Run 2, ATLAS distributed computing systems reach far and wide to gather CPU resources and storage capacity to execute an evolving ecosystem of production and analysis workflow tools. Indeed more than a hundred computing sites from the Worldwide LHC Computing Grid, plus many “opportunistic” facilities at HPC centers, universities, national laboratories, and public clouds, combine to meet these requirements. These resources have characteristics (such as local queuing availability, proximity to data sources and target destinations, network latency and bandwidth capacity, etc.) affecting the overall processing efficiency and throughput. To quantitatively understand and in some instances predict behavior, we have developed a platform to aggregate, index (for user queries), and analyze the more important information streams affecting performance. These data streams come from the ATLAS production system (PanDA), the distributed data management s...

  9. Psychometric assessment and behavioral experiments using a free virtual reality platform and computational science.

    Science.gov (United States)

    Cipresso, Pietro; Serino, Silvia; Riva, Giuseppe

    2016-03-19

    Virtual Reality has been used extensively in a wide range of psychological experiments. In this study, we introduce NeuroVirtual 3D, a platform that clinicians can use free of charge. The platform relies on NeuroVR software, which we extended to support experiments. The software is available free of charge to researchers and clinical practitioners, who can also use a large number of virtual environments and objects already developed. The platform has been designed to connect to virtually every device ever produced by means of Virtual-Reality Peripheral Network (VRPN) protocols; a number of these have already been included and tested in the platform. Among the available devices, the Microsoft Kinect low-cost sensor has already been configured for navigation through the virtual environments and for triggering specific actions (sounds, videos, images, and the like) when a specific gesture is recognized, e.g., a step forward or an arm raised. A task for neglect assessment and a task for spatial abilities assessment have already been implemented within the platform. Moreover, NeuroVirtual 3D integrates a TCP/IP-based module (bridge) to collect data from virtually any existing biosensor (Thought Technology, Zephyr and StarStim devices have already been included in the platform), and it is able to record any psychophysiological signal during any experiment, along with computed indices, in real time. NeuroVirtual 3D records external and internal data (e.g., coordinates, key presses, timestamps) with millisecond precision, representing de facto the most advanced freely available technology for experimental psychology using virtual environments, without the need to write code.

  10. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    Science.gov (United States)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is applied on fine grids, its discretization leads to a huge computational problem, which implies that a model such as DEM must be run on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, we present comparison results from running this model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), distributed-memory parallel computers (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), shared-memory parallel computers (SGI Origin, SUN, etc.) and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.). The main idea in the parallel version of DEM is a domain-partitioning approach. We discuss the effective use of the caches and hierarchical memories of modern computers, as well as the performance, speed-ups and efficiency achieved. The parallel code of DEM, created using the standard MPI library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are presented briefly.
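The domain-partitioning idea behind the parallel DEM code can be sketched in a few lines: split the grid into nearly equal contiguous strips, one per process, each of which then advances its own sub-domain and exchanges halo rows with its neighbours. A minimal illustration of the partitioning step only (the function name is hypothetical; the real code uses the MPI library):

```python
def partition_rows(n_rows, n_procs):
    """Split n_rows grid rows into n_procs nearly equal contiguous strips.

    Returns a list of half-open (start, end) row ranges, one per process;
    strip sizes differ by at most one row so the load stays balanced.
    """
    base, extra = divmod(n_rows, n_procs)
    ranges, start = [], 0
    for p in range(n_procs):
        size = base + (1 if p < extra else 0)  # first `extra` strips get one more row
        ranges.append((start, start + size))
        start += size
    return ranges
```

With such a decomposition, each process touches only its own strip plus one row of ghost cells per neighbour, which is what makes the approach scale on distributed-memory machines.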

  11. Cloud Computing for Geosciences--GeoCloud for standardized geospatial service platforms (Invited)

    Science.gov (United States)

    Nebert, D. D.; Huang, Q.; Yang, C.

    2013-12-01

    The 21st century geoscience faces challenges of Big Data, spike computing requirements (e.g., when natural disaster happens), and sharing resources through cyberinfrastructure across different organizations (Yang et al., 2011). With flexibility and cost-efficiency of computing resources a primary concern, cloud computing emerges as a promising solution to provide core capabilities to address these challenges. Many governmental and federal agencies are adopting cloud technologies to cut costs and to make federal IT operations more efficient (Huang et al., 2010). However, it is still difficult for geoscientists to take advantage of the benefits of cloud computing to facilitate the scientific research and discoveries. This presentation reports using GeoCloud to illustrate the process and strategies used in building a common platform for geoscience communities to enable the sharing, integration of geospatial data, information and knowledge across different domains. GeoCloud is an annual incubator project coordinated by the Federal Geographic Data Committee (FGDC) in collaboration with the U.S. General Services Administration (GSA) and the Department of Health and Human Services. It is designed as a staging environment to test and document the deployment of a common GeoCloud community platform that can be implemented by multiple agencies. With these standardized virtual geospatial servers, a variety of government geospatial applications can be quickly migrated to the cloud. In order to achieve this objective, multiple projects are nominated each year by federal agencies as existing public-facing geospatial data services. From the initial candidate projects, a set of common operating system and software requirements was identified as the baseline for platform as a service (PaaS) packages. Based on these developed common platform packages, each project deploys and monitors its web application, develops best practices, and documents cost and performance information. 

  12. nuMap: A Web Platform for Accurate Prediction of Nucleosome Positioning

    Institute of Scientific and Technical Information of China (English)

    Bader A Alharbi; Thamir H Alshammari; Nathan L Felton; Victor B Zhurkin; Feng Cui

    2014-01-01

    Nucleosome positioning is critical for gene expression and of major biological interest. The high cost of experimentally mapping nucleosomal arrangement signifies the need for computational approaches to predict nucleosome positions at high resolution. Here, we present a web-based application to fulfill this need by implementing two models, the YR and W/S schemes, for the translational and rotational positioning of nucleosomes, respectively. Our methods are based on sequence-dependent anisotropic bending that dictates how DNA is wrapped around a histone octamer. This application allows users to specify a number of options, such as schemes and parameters for the threading calculation, and provides multiple layout formats. nuMap is implemented in Java/Perl/MySQL and is freely available for public use at http://numap.rit.edu. The user manual, implementation notes, a description of the methodology and examples are available at the site.

  13. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming

    2017-02-02

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM-compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. A radio monitoring circuit is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring and Kalman-filter-based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.
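As an illustration of the last application mentioned, here is a minimal 1-D constant-velocity Kalman filter of the kind used for vehicle trajectory estimation. The scalar state model, noise values and function name are illustrative assumptions, not the platform's actual implementation:

```python
def kalman_1d(measurements, dt=1.0, q=0.01, r=1.0):
    """Track (position, velocity) from noisy position measurements.

    dt: time step, q: process noise variance, r: measurement noise variance.
    """
    x, v = measurements[0], 0.0           # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]          # state covariance
    estimates = []
    for z in measurements:
        # Predict step (constant-velocity motion model).
        x += dt * v
        p00 = P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        # Update step with the position measurement z.
        s = p00 + r                       # innovation variance
        k0, k1 = p00 / s, p10 / s         # Kalman gains
        y = z - x                         # innovation (measurement residual)
        x += k0 * y
        v += k1 * y
        P = [[(1 - k0) * p00, (1 - k0) * p01],
             [p10 - k1 * p00, p11 - k1 * p01]]
        estimates.append((x, v))
    return estimates
```

Fed exact constant-velocity readings, the velocity estimate converges toward the true speed within a handful of steps, which is why this kind of filter suits low-rate transponder data on a small microcontroller.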

  14. Computing Hulls And Centerpoints In Positive Definite Space

    CERN Document Server

    Fletcher, P Thomas; Phillips, Jeff M; Venkatasubramanian, Suresh

    2009-01-01

    In this paper, we present algorithms for computing approximate hulls and centerpoints for collections of matrices in positive definite space. There are many applications where the data under consideration, rather than being points in a Euclidean space, are positive definite (p.d.) matrices. These applications include diffusion tensor imaging in the brain, elasticity analysis in mechanical engineering, and the theory of kernel maps in machine learning. Our work centers around the notion of a horoball: the limit of a ball fixed at one point whose radius goes to infinity. Horoballs possess many (though not all) of the properties of halfspaces; in particular, they lack a strong separation theorem where two horoballs can completely partition the space. In spite of this, we show that we can compute an approximate "horoball hull" that strictly contains the actual convex hull. This approximate hull also preserves geodesic extents, which is a result of independent value: an immediate corollary is that we can approxima...
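To make the setting concrete: the simplest instance of positive definite space is the 1x1 case, the positive reals, where the affine-invariant geodesic distance and geodesic curve have closed forms. This is a hedged illustration only; the paper's horoball constructions live in the general n x n analogue of this geometry:

```python
import math

def spd_distance_1x1(a, b):
    """Affine-invariant distance between positive reals: |log(a) - log(b)|."""
    return abs(math.log(a) - math.log(b))

def spd_geodesic_1x1(a, b, t):
    """Point at parameter t in [0, 1] on the geodesic from a to b: a^(1-t) * b^t.

    The geodesic midpoint is the geometric mean sqrt(a * b), not the
    Euclidean average; this is what makes the space non-flat.
    """
    return a ** (1 - t) * b ** t
```

Even in this toy case one can see why Euclidean intuitions (such as strong separation by halfspaces) need care: "balls" around a point are multiplicative intervals, not symmetric additive ones.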

  15. Design and performance of the virtualization platform for offline computing on the ATLAS TDAQ Farm

    Science.gov (United States)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Twomey, M. S.; Zaytsev, A.

    2014-06-01

    With the LHC collider at CERN currently going through the period of Long Shutdown 1 there is an opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is suitable for running Monte Carlo (MC) production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of the design and deployment of a virtualized platform running on this computing resource and of its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to guarantee the security and the usability of the ATLAS private network, and to minimize interference with TDAQ's usage of the farm. Openstack has been chosen to provide a cloud management layer. The experience gained in the last 3.5 months shows that the use of the TDAQ farm for the MC simulation contributes to the ATLAS data processing at the level of a large Tier-1 WLCG site, despite the opportunistic nature of the underlying computing resources being used.

  16. A Reconfigurable Cryogenic Platform for the Classical Control of Scalable Quantum Computers

    CERN Document Server

    Homulle, Harald; Patra, Bishnu; Ferrari, Giorgio; Prati, Enrico; Sebastiano, Fabio; Charbon, Edoardo

    2016-01-01

    Recent advances in solid-state qubit technology are paving the way to fault-tolerant quantum computing systems. However, qubit technology is limited by qubit coherence time and by the complexity of coupling the quantum system with a classical electronic infrastructure. We propose an infrastructure for reading and controlling qubits that is implemented on a field-programmable gate array (FPGA). The FPGA platform supports functionality required by several qubit technologies and can operate physically close to the qubits over a temperature range from 4 K to 300 K. Extensive characterization of the platform over this temperature range revealed that all major components (such as LUTs, MMCMs, PLLs, BRAMs, and IDELAY2 blocks) operate correctly and that the logic speed is very stable. This stability is confirmed by operating an integrated ADC with relatively stable performance over temperature.

  17. High Performance Power Spectrum Analysis Using a FPGA Based Reconfigurable Computing Platform

    CERN Document Server

    Abhyankar, Yogindra; Agarwal, Yogesh; Subrahmanya, C R; Prasad, Peeyush; 10.1109/RECONF.2006.307786

    2011-01-01

    Power-spectrum analysis is an important tool that provides critical information about a signal, with applications ranging from communication systems to DNA sequencing. Interference present on a transmitted signal may arise from a natural cause or be superimposed deliberately; in the latter case, its early detection and analysis become important. In such situations, with only a small observation window, a quick look at the power spectrum can reveal a great deal of information, including the frequency and source of the interference. In this paper, we present our design of an FPGA-based reconfigurable platform for high-performance power-spectrum analysis. It allows real-time data acquisition and processing of samples of the incoming signal within a small time frame. The processing consists of computing the power, and its average and peak, over a set of input values. The platform sustains simultaneous data streams on each of its four input channels.
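    The per-frame computation described here (power per bin, plus its average and peak over many frames) can be sketched in software as follows; the frame length and the test signal are illustrative, not taken from the paper:

```python
import numpy as np

def power_spectrum_stats(samples, fft_len=256):
    """Split the stream into FFT-length frames, compute per-bin power,
    and return the average and peak power spectra over all frames."""
    n_frames = len(samples) // fft_len
    frames = np.reshape(samples[: n_frames * fft_len], (n_frames, fft_len))
    spectra = np.abs(np.fft.rfft(frames, axis=1)) ** 2  # per-frame power
    return spectra.mean(axis=0), spectra.max(axis=0)

# A tone at bin 32 plus noise: interference stands out as a clear peak.
rng = np.random.default_rng(0)
t = np.arange(4096)
signal = np.sin(2 * np.pi * 32 / 256 * t) + 0.1 * rng.standard_normal(4096)
avg, peak = power_spectrum_stats(signal, fft_len=256)
```

    Averaging suppresses the noise floor while the peak spectrum preserves transient interference, which is why the platform tracks both.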

  18. Managing the computational chemistry big data problem: the ioChem-BD platform.

    Science.gov (United States)

    Álvarez-Moreno, M; de Graaf, C; López, N; Maseras, F; Poblet, J M; Bo, C

    2015-01-26

    We present the ioChem-BD platform ( www.iochem-bd.org ), a multiheaded tool for managing large volumes of quantum chemistry results from a diverse group of common simulation packages. The platform has an extensible structure. Its key modules (i) upload output files from common computational chemistry packages, (ii) extract meaningful data from the results, and (iii) generate summaries in user-friendly formats. Heavy use is made of the Chemical Markup Language (CML) in the intermediate files used by ioChem-BD. From these, using XSL techniques, we manipulate and transform the chemical data sets to meet researchers' needs in the form of HTML5 reports, supporting information, and other research media.

  19. A computer simulation platform for the estimation of measurement uncertainties in dimensional X-ray computed tomography

    DEFF Research Database (Denmark)

    Hiller, Jochen; Reindl, Leonard M

    2012-01-01

    The knowledge of measurement uncertainty is of great importance in conformance testing in production. The tolerance limit for production must be reduced by the amount of measurement uncertainty to ensure that the parts are in fact within tolerance. Over the last 5 years, industrial X-ray computed tomography (CT) has become an important technology for dimensional quality control. In this paper a computer simulation platform is presented which is able to investigate error sources in dimensional CT measurements. The typical workflow in industrial CT metrology is described, along with methods taking into account the main error sources for the measurement. This method has the potential to deal with all kinds of systematic and random errors that influence a dimensional CT measurement. A case study demonstrates the practical application of the VCT simulator using numerically generated CT data and statistical...

  20. Computer-operated analytical platform for the determination of nutrients in hydroponic systems.

    Science.gov (United States)

    Rius-Ruiz, F Xavier; Andrade, Francisco J; Riu, Jordi; Rius, F Xavier

    2014-03-15

    Hydroponics is a water-, energy-, space-, and cost-efficient system for growing plants in constrained spaces or land-exhausted areas. Precise control of hydroponic nutrients is essential for growing healthy plants and producing high yields. In this article we report for the first time a new computer-operated analytical platform which can be readily used for the determination of essential nutrients in hydroponic growing systems. The liquid-handling system uses inexpensive components (i.e., a peristaltic pump and solenoid valves), which are discretely computer-operated to automatically condition, calibrate and clean a multi-probe of solid-contact ion-selective electrodes (ISEs). These ISEs, which are based on carbon nanotubes, offer high portability, robustness and easy maintenance and storage. With this new computer-operated analytical platform we performed automatic measurements of K(+), Ca(2+), NO3(-) and Cl(-) during tomato plant growth in order to ensure optimal nutritional uptake and tomato production.
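    The abstract does not give the calibration routine, but ISE calibration conventionally fits the Nernst equation E = E0 + S·log10(c), with a slope near 59.2 mV/decade for a monovalent ion at 25 °C. The sketch below uses hypothetical K+ calibration standards and potentials to illustrate that standard procedure:

```python
import numpy as np

def calibrate_ise(concentrations, potentials_mv):
    """Least-squares fit of E = E0 + S * log10(c); returns (E0, S).
    A Nernstian monovalent-cation electrode has S near +59.2 mV/decade."""
    S, E0 = np.polyfit(np.log10(concentrations), potentials_mv, 1)
    return E0, S

def concentration_from_potential(E_mv, E0, S):
    """Invert the calibration line to recover a concentration."""
    return 10 ** ((E_mv - E0) / S)

# Hypothetical K+ standards (mol/L) and measured potentials (mV).
stds = np.array([1e-4, 1e-3, 1e-2])
emf = np.array([-36.8, 22.4, 81.6])  # ~59.2 mV per decade
E0, S = calibrate_ise(stds, emf)
```

    In an automated platform like the one described, this fit would be rerun on each calibration cycle before unknown samples are measured.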

  1. An Application-Based Performance Evaluation of NASA's Nebula Cloud Computing Platform

    Science.gov (United States)

    Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak

    2012-01-01

    The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing because of its high potential. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, Nebula, hosted at Ames Research Center. This work represents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula's performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.

  2. Design and Performance of the Virtualization Platform for Offline computing on the ATLAS TDAQ Farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Brasolin, F; Contescu, C; Di Girolamo, A; Lee, C J; Pozo Astigarraga, M E; Scannicchio, D A; Twomey, M S; Zaytsev, A

    2013-01-01

    With the LHC collider at CERN currently going through the period of Long Shutdown 1 (LS1), there is a remarkable opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is particularly suitable for running Monte Carlo production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of all the stages of the Sim@P1 project, dedicated to the design and deployment of a virtualized platform running on the ATLAS TDAQ computing resources and its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to avoid interference with TDAQ usage of the farm and to guarantee the security and the usability of the ATLAS private network; Openstack has been chosen to provide a cloud management layer. The approaches to organizing support for the sustained operation of...

  3. Design and Performance of the Virtualization Platform for Offline computing on the ATLAS TDAQ Farm

    CERN Document Server

    Ballestrero, S; The ATLAS collaboration; Brasolin, F; Contescu, C; Di Girolamo, A; Lee, C J; Pozo Astigarraga, M E; Scannicchio, D A; Twomey, M S; Zaytsev, A

    2014-01-01

    With the LHC collider at CERN currently going through the period of Long Shutdown 1 (LS1), there is a remarkable opportunity to use the computing resources of the experiments' large trigger farms for other data processing activities. In the case of the ATLAS experiment, the TDAQ farm, consisting of more than 1500 compute nodes, is particularly suitable for running Monte Carlo production jobs that are mostly CPU and not I/O bound. This contribution gives a thorough review of all the stages of the Sim@P1 project, dedicated to the design and deployment of a virtualized platform running on the ATLAS TDAQ computing resources and its use to run large groups of CernVM based virtual machines operating as a single CERN-P1 WLCG site. This platform has been designed to avoid interference with TDAQ usage of the farm and to guarantee the security and the usability of the ATLAS private network; Openstack has been chosen to provide a cloud management layer. The approaches to organizing support for the sustained operation of...

  4. Development of a computer model to predict platform station keeping requirements in the Gulf of Mexico using remote sensing data

    Science.gov (United States)

    Barber, Bryan; Kahn, Laura; Wong, David

    1990-01-01

    Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.

  5. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.

  6. The “Chimera”: An Off-The-Shelf CPU/GPGPU/FPGA Hybrid Computing Platform

    Directory of Open Access Journals (Sweden)

    Ra Inta

    2012-01-01

    The nature of modern astronomy means that a number of interesting problems exhibit a substantial computational bound, and this situation is gradually worsening. Scientists, increasingly fighting for valuable resources on conventional high-performance computing (HPC) facilities—often with a limited customizable user environment—are turning to hardware acceleration solutions. We describe here a heterogeneous CPU/GPGPU/FPGA desktop computing system (the “Chimera”), built with commercial off-the-shelf components. We show that this platform may be a viable alternative for many common computationally bound problems found in astronomy, though not without significant challenges. The most significant bottleneck in pipelines involving real data is most likely to be the interconnect (in this case the PCI Express bus residing on the CPU motherboard). Finally, we speculate on the merits of our Chimera system within the entire landscape of parallel computing, through the analysis of representative problems from UC Berkeley’s “Thirteen Dwarves.”

  7. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    Science.gov (United States)

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.

  8. Lilith: A Java framework for the development of scalable tools for high performance distributed computing platforms

    Energy Technology Data Exchange (ETDEWEB)

    Evensky, D.A.; Gentile, A.C.; Armstrong, R.C.

    1998-03-19

    Increasingly, high performance computing constitutes the use of very large heterogeneous clusters of machines. The use and maintenance of such clusters are subject to the complexities of communicating between the machines in a time-efficient and secure manner. Lilith is a general purpose tool that provides highly scalable, secure, and easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. Lilith is written in Java, taking advantage of Java's unique features of loading and distributing code dynamically, its platform independence, its thread support, and its provision of graphical components to facilitate easy-to-use resultant tools. The authors describe the use of Lilith in a tool developed for the maintenance of the large distributed cluster at their institution and present details of the Lilith architecture and user API for the general user development of scalable tools.

  9. Gridifying phylogeny and medical applications on the volunteer computing platform XtremWeb-CH.

    Science.gov (United States)

    Abdennadher, Nabil; Evéquoz, Claude; Billat, Cédric

    2008-01-01

    XtremWeb-CH (XWCH) is a volunteer computing middleware that makes it easy for scientists and industry to deploy and execute parallel and distributed applications on a public-resource computing infrastructure. XWCH supports various high performance applications, including those with large storage and communication requirements. Two high performance applications were ported and deployed on an XWCH platform. The first is the Phylip package of programs for inferring phylogenies (evolutionary trees). It is the most widely distributed phylogeny package and has been used to build the largest number of published trees. Some modules of Phylip are CPU-intensive; their sequential versions cannot be applied to a large number of sequences. The second application ported to XWCH is a medical application used to generate temporal dynamic neuronal maps. The application, named NeuroWeb, is used to better understand the connectivity and activity of neurons. NeuroWeb is a data- and CPU-intensive application. This paper describes the different components of an XWCH platform and the lessons learned from gridifying Phylip and NeuroWeb. It also details the new features and extensions being added to XWCH in order to support new types of applications.

  10. The Construction of a Web-Based Learning Platform from the Perspective of Computer Support for Collaborative Design

    Directory of Open Access Journals (Sweden)

    Cheng Mei

    2012-04-01

    The purpose of this study is to construct a web-based learning platform for Computer Support for Collaborative Design (CSCD) based on theories related to a constructivist learning environment model, mind mapping, and computer-supported collaborative learning. The platform conforms to the needs of design students and provides effective tools for interaction and collaborative learning by integrating mind-mapping tools into a learning environment that utilizes CSCD, a computer-assisted support system that can support and enhance group collaboration. The establishment of the CSCD learning platform represents a significant advance over the fixed functions and existing models of current online learning platforms, and the authors describe it as the only learning platform that focuses on learners in design departments. The platform offers user-friendly functions and innovative technology, and in terms of funding, technical ability, human resources, organizational strategies, and risk analysis and evaluations, it is considered worthy of expansion and implementation.

  11. University Students' Use of Computers and Mobile Devices for Learning and Their Reading Speed on Different Platforms

    Science.gov (United States)

    Mpofu, Bongeka

    2016-01-01

    This research was aimed at the investigation of mobile device and computer use at a higher learning institution. The goal was to determine the current use of computers and mobile devices for learning and the students' reading speed on different platforms. The research was contextualised in a sample of students at the University of South Africa.…

  12. Computed torque control of an under-actuated service robot platform modeled by natural coordinates

    Science.gov (United States)

    Zelei, Ambrus; Kovács, László L.; Stépán, Gábor

    2011-05-01

    The paper investigates the motion planning of a suspended service robot platform equipped with ducted fan actuators. The platform consists of an RRT robot and a cable-suspended swinging actuator that form a subsequent parallel kinematic chain. In spite of the complementary ducted fan actuators, the system is under-actuated. The method of computed torques is applied to control the motion of the robot. Under-actuated systems have fewer control inputs than degrees of freedom. We assume that the investigated under-actuated system has as many desired outputs as inputs. Although the inverse dynamical calculation leads to the solution of a system of differential-algebraic equations (DAE), the desired control inputs can be determined uniquely by the method of computed torques. We use natural (Cartesian) coordinates to describe the configuration of the robot, while a set of algebraic equations represents the geometric constraints. In this modeling approach the mathematical model of the dynamical system itself is also a DAE. The paper discusses the inverse dynamics problem of the complex hybrid robotic system. The results include the desired actuator forces as well as the nominal coordinates corresponding to the desired motion of the carried payload. The method of computed torque control with a PD controller is applied to under-actuated systems described by natural coordinates, while the inverse dynamics is solved via the backward Euler discretization of the DAE system, for which a general formalism is proposed. The results are compared with the closed-form results obtained with simplified models of the system. Numerical simulation and experiments demonstrate the applicability of the presented concepts.
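    For orientation, the classical computed-torque law for a fully actuated system can be sketched as follows; the paper's under-actuated DAE formulation is considerably more involved, and the mass matrix, bias term, and gains below are hypothetical:

```python
import numpy as np

def computed_torque(M, h, q, qd, q_des, qd_des, qdd_des, Kp, Kd):
    """Computed-torque (inverse dynamics) control law:
    tau = M(q) (qdd_des + Kd (qd_des - qd) + Kp (q_des - q)) + h(q, qd),
    which makes the tracking error obey e'' + Kd e' + Kp e = 0."""
    e = q_des - q
    ed = qd_des - qd
    v = qdd_des + Kd @ ed + Kp @ e
    return M @ v + h

# Hypothetical 2-DoF example with a constant mass matrix.
M = np.diag([2.0, 1.5])
h = np.array([0.0, -9.81 * 1.5])  # e.g. a gravity term
Kp = np.diag([100.0, 100.0])
Kd = np.diag([20.0, 20.0])

# At zero tracking error the commanded torque reduces to the bias term h.
q = qd = q_des = qd_des = qdd_des = np.zeros(2)
tau = computed_torque(M, h, q, qd, q_des, qd_des, qdd_des, Kp, Kd)
```

    Feedback linearization of this kind is what the paper extends, via natural coordinates and backward Euler discretization, to the under-actuated constrained case.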

  13. Computationally inexpensive approach for pitch control of offshore wind turbine on barge floating platform.

    Science.gov (United States)

    Zuo, Shan; Song, Y D; Wang, Lei; Song, Qing-wang

    2013-01-01

    Offshore floating wind turbines (OFWTs) have gained increasing attention during the past decade because of high-quality offshore wind power and a complex load environment. The control system is a tradeoff between power tracking and fatigue load reduction in the above-rated wind speed region. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the "NREL offshore 5 MW baseline wind turbine" mounted on a barge platform run in FAST and Matlab/Simulink, operating in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads, and platform vibration than conventional individual blade pitch control.

  14. Computationally Inexpensive Approach for Pitch Control of Offshore Wind Turbine on Barge Floating Platform

    Directory of Open Access Journals (Sweden)

    Shan Zuo

    2013-01-01

    Offshore floating wind turbines (OFWTs) have gained increasing attention during the past decade because of high-quality offshore wind power and a complex load environment. The control system is a tradeoff between power tracking and fatigue load reduction in the above-rated wind speed region. To address the external disturbances and uncertain system parameters of OFWTs due to the proximity to load centers and strong wave coupling, this paper proposes a computationally inexpensive robust adaptive control approach with memory-based compensation for blade pitch control. The method is tested and compared with a baseline controller and a conventional individual blade pitch controller, with the “NREL offshore 5 MW baseline wind turbine” mounted on a barge platform run in FAST and Matlab/Simulink, operating in the above-rated condition. It is shown that the advanced control approach is not only robust to complex wind and wave disturbances but also adaptive to varying and uncertain system parameters. The simulation results demonstrate that the proposed method performs better in reducing power fluctuations, fatigue loads, and platform vibration than conventional individual blade pitch control.

  15. A computational platform for modeling and simulation of pipeline georeferencing systems

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, A.G.; Pellanda, P.C.; Gois, J.A. [Instituto Militar de Engenharia (IME), Rio de Janeiro, RJ (Brazil); Roquette, P.; Pinto, M.; Durao, R. [Instituto de Pesquisas da Marinha (IPqM), Rio de Janeiro, RJ (Brazil); Silva, M.S.V.; Martins, W.F.; Camillo, L.M.; Sacsa, R.P.; Madeira, B. [Ministerio de Ciencia e Tecnologia (CT-PETRO2006MCT), Brasilia, DF (Brazil). Financiadora de Estudos e Projetos (FINEP). Plano Nacional de Ciencia e Tecnologia do Setor Petroleo e Gas Natural

    2009-07-01

    This work presents a computational platform for modeling and simulation of pipeline georeferencing systems, developed based on typical pipeline characteristics, on the dynamical modeling of the Pipeline Inspection Gauge (PIG), and on the analysis and implementation of an inertial navigation algorithm. The software environment for PIG trajectory simulation and navigation allows the user, through a friendly interface, to carry out evaluation tests of the inertial navigation system under different scenarios. It is therefore possible to define the required specifications of the pipeline georeferencing system components, such as: the required precision of the inertial sensors, the characteristics of the navigation auxiliary system (GPS-surveyed control points, odometers, etc.), the pipeline construction information to be considered in order to improve the trajectory estimation precision, and the signal processing techniques most suitable for the treatment of inertial sensor data. The simulation results are analyzed through the evaluation of several performance metrics usually considered in inertial navigation applications, and 2D and 3D plots of the trajectory estimation error and of the recovered trajectory in the three coordinates are made available to the user. This paper presents the simulation platform and its constituent modules and defines their functional characteristics and interrelationships. (author)

  16. Web-Based Parallel Monte Carlo Simulation Platform for Financial Computation

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    Using Java, Java-enabled Web, and object-oriented programming technologies, a framework is designed to quickly organize a multicomputer system on an Intranet for parallel Monte Carlo simulation. The high-performance computing environment is embedded in a Web server so it can be accessed easily. Adaptive parallelism and an eager scheduling algorithm are used to realize load balancing, parallel processing, and system fault tolerance. Independent-sequence pseudo-random number generator schemes maintain the validity of the parallel simulation. Using three kinds of stock option pricing models as test cases, near-ideal speedup and accurate pricing results were obtained on the test bed. As a Web service, a high-performance financial derivative security pricing platform is now set up for training and study. The framework can also be used to develop other SPMD (single program, multiple data) applications. Robustness remains a major problem for further research.
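    The independent-sequence idea can be illustrated compactly: each worker prices with its own statistically independent random stream, and the partial estimates are averaged. The sketch below (my own illustration in Python, not the Java framework described here) prices a European call under Black-Scholes dynamics, spawning independent sub-streams as a parallel deployment would:

```python
import numpy as np

def mc_euro_call(S0, K, r, sigma, T, n_paths, seed_seq):
    """Monte Carlo price of a European call under Black-Scholes dynamics."""
    rng = np.random.default_rng(seed_seq)
    z = rng.standard_normal(n_paths)
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return float(np.exp(-r * T) * np.maximum(ST - K, 0.0).mean())

# One independent sub-stream per worker, as independent-sequence PRNG
# schemes require; here the "workers" run sequentially for clarity.
root = np.random.SeedSequence(42)
children = root.spawn(4)
estimates = [mc_euro_call(100.0, 100.0, 0.05, 0.2, 1.0, 50_000, s)
             for s in children]
price = float(np.mean(estimates))
```

    Because the sub-streams never overlap, the averaged estimate behaves exactly like one large simulation, which is what makes the eager-scheduling, fault-tolerant distribution described above statistically safe.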

  17. Phonon-based scalable platform for chip-scale quantum computing

    Science.gov (United States)

    Reinke, Charles M.; El-Kady, Ihab

    2016-12-01

    We present a scalable phonon-based quantum computer on a phononic crystal platform. Practical schemes involve selective placement of a single acceptor atom in the peak of the strain field in a high-Q phononic crystal cavity that enables coupling of the phonon modes to the energy levels of the atom. We show theoretical optimization of the cavity design and coupling waveguide, along with estimated performance figures of the coupled system. A qubit can be created by entangling a phonon at the resonance frequency of the cavity with the atom states. Qubits based on this half-sound, half-matter quasi-particle, called a phoniton, may outcompete other quantum architectures in terms of combined emission rate, coherence lifetime, and fabrication demands.

  18. Design Patterns for Sparse-Matrix Computations on Hybrid CPU/GPU Platforms

    Directory of Open Access Journals (Sweden)

    Valeria Cardellini

    2014-01-01

    We apply object-oriented software design patterns to develop code for scientific software involving sparse matrices. Design patterns arise when multiple independent developments produce similar designs which converge onto a generic solution. We demonstrate how to use design patterns to implement an interface for sparse matrix computations on NVIDIA GPUs starting from PSBLAS, an existing sparse matrix library, and from existing sets of GPU kernels for sparse matrices. We also compare the throughput of the PSBLAS sparse matrix–vector multiplication on two platforms exploiting the GPU with that obtained by a CPU-only PSBLAS implementation. Our experiments show encouraging results for the comparison between CPU and GPU executions in double precision, with a speedup of up to 35.35 on an NVIDIA GTX 285 with respect to an AMD Athlon 7750, and up to 10.15 on an NVIDIA Tesla C2050 with respect to an Intel Xeon X5650.
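    The kernel being benchmarked is sparse matrix-vector multiplication. A minimal Compressed Sparse Row (CSR) version, illustrative only and not the PSBLAS implementation, looks like this:

```python
import numpy as np

def csr_spmv(values, col_idx, row_ptr, x):
    """y = A @ x for a matrix stored in Compressed Sparse Row format:
    row i's nonzeros live in values[row_ptr[i]:row_ptr[i+1]]."""
    n_rows = len(row_ptr) - 1
    y = np.zeros(n_rows)
    for i in range(n_rows):
        start, end = row_ptr[i], row_ptr[i + 1]
        y[i] = np.dot(values[start:end], x[col_idx[start:end]])
    return y

# 3x3 example:  [[4, 0, 9],
#                [0, 7, 0],
#                [0, 0, 5]]
values = np.array([4.0, 9.0, 7.0, 5.0])
col_idx = np.array([0, 2, 1, 2])
row_ptr = np.array([0, 2, 3, 4])
y = csr_spmv(values, col_idx, row_ptr, np.array([1.0, 2.0, 3.0]))
```

    The irregular, indirect memory accesses visible in the inner loop are precisely what makes this kernel hard to accelerate, and why GPU storage formats and kernels differ from their CPU counterparts.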

  19. Integration of biostatistics and pharmacometrics computing platforms for efficient and reproducible PK/PD analysis: a case study.

    Science.gov (United States)

    Ou, Ying C; Lo, Arthur; Lee, Brian; Liu, Phillip; Kimura, Karen; Eary, Charisse; Hopkins, Alan

    2013-11-01

    Results of pharmacometric analyses influence high-level decisions such as clinical trial design, drug approval, and labeling. Key challenges for timely delivery of pharmacometric analyses are the data assembly process and tracking and documenting the modeling process and results. Since clinical efficacy and safety data typically reside in the biostatistics computing area, an integrated computing platform for pharmacometric and biostatistical analyses would be ideal. A case study is presented integrating a pharmacometric modeling platform into an existing statistical computing environment (SCE). The feasibility and specific configurations of running common PK/PD programs such as NONMEM and R inside of the SCE are provided. The case study provides an example of an integrated repository that facilitates efficient data assembly for pharmacometrics analyses. The proposed platform encourages a good pharmacometrics working practice to maintain transparency, traceability, and reproducibility of PK/PD models and associated data in supporting drug development and regulatory decisions.

  20. Multilayer Cloud Computing Platform Architecture Model

    Institute of Scientific and Technical Information of China (English)

    张治国; 殷克功; 贠永刚; 方佳怡

    2014-01-01

    This paper presents research on multilayer enterprise cloud computing platforms in three parts. First, it reviews four current cloud computing models; second, it proposes an architecture for a multilayer enterprise cloud computing platform; third, it compares the computing speed of commonly used enterprise cloud computing platforms with that of the multilayer platform. Finally, the advantages and open problems of the multilayer cloud computing platform are analyzed.

  1. Advanced LIGO Two-Stage Twelve-Axis Vibration Isolation and Positioning Platform. Part 1: Design and Production Overview

    CERN Document Server

    Matichard, Fabrice; Mason, Kenneth; Mittleman, Richard; Abbott, Benjamin; Abbott, Samuel; Allwine, Eric; Barnum, Samuel; Birch, Jeremy; Biscans, Sebastien; Clark, Daniel; Coyne, Dennis; DeBra, Dan; DeRosa, Ryan; Foley, Stephany; Fritschel, Peter; Giaime, Joseph A; Gray, Corey; Grabeel, Gregory; Hanson, Joe; Hillard, Michael; Kissel, Jeffrey; Kucharczyk, Christopher; Le Roux, Adrien; Lhuillier, Vincent; Macinnis, Myron; O'Reilly, Brian; Ottaway, David; Paris, Hugo; Puma, Michael; Radkins, Hugh; Ramet, Celine; Robinson, Mitchell; Ruet, Laurent; Sareen, Pradeep; Shoemaker, David; Stein, Andy; Thomas, Jeremy; Vargas, Michael; Warner, Jimmy

    2014-01-01

    New generations of gravitational-wave detectors require unprecedented levels of vibration isolation. This paper presents the final design of the vibration isolation and positioning platform used in Advanced LIGO to support the interferometers' core optics. This five-ton, two-and-a-half-meter-wide system operates in ultra-high vacuum. It features two stages of isolation mounted in series. The stages are imbricated to reduce the overall height. Each stage provides isolation in all directions of translation and rotation. The system is instrumented with a unique combination of low-noise relative and inertial sensors. The active control provides isolation from 0.1 Hz to 30 Hz. It brings the platform motion down to 10^(-11) m/Hz^(0.5) at 1 Hz. Active and passive isolation combine to bring the platform motion below 10^(-12) m/Hz^(0.5) at 10 Hz. The passive isolation lowers the motion below 10^(-13) m/Hz^(0.5) at 100 Hz. The paper describes how the platform has been engineered not only to meet the isolation requirements, but a...

  2. PID Controllers Design Applied to Positioning of Ball on the Stewart Platform

    Directory of Open Access Journals (Sweden)

    Koszewnik Andrzej

    2014-12-01

    Full Text Available The paper presents the design and practical implementation of PID controllers for a Stewart platform. The platform uses a resistive touch panel as a sensor and servo motors as actuators. The complete control system stabilizing the ball on the platform is realized with an Arduino microcontroller and the Matlab/Simulink software. Two processes required to acquire measurement signals from the touch panel in the two perpendicular directions X and Y are discussed. The first is the calibration of the touch panel; the second is the filtering of the measurement signals with a low-pass Butterworth filter. The obtained signals are used to design the ball-stabilization algorithm by decoupling the global system into two local subsystems. The algorithm is implemented in a soft real-time system. The parameters of both PID controllers (PIDx and PIDy) are tuned by the trial-and-error method and implemented in the microcontroller. Finally, the complete control system is tested at the laboratory stand.
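The decoupled PIDx/PIDy stabilization described above can be sketched as a discrete PID update per axis. The gains, sample time, and filter coefficient below are hypothetical illustrations, not the values tuned in the paper, and a first-order low-pass stands in for the Butterworth filter:

```python
class PID:
    """Discrete PID controller with a low-pass filtered derivative term."""
    def __init__(self, kp, ki, kd, dt, alpha=0.9):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.alpha = alpha        # derivative filter coefficient (hypothetical)
        self.integral = 0.0
        self.prev_error = 0.0
        self.d_filt = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += error * self.dt
        d_raw = (error - self.prev_error) / self.dt
        # a first-order low-pass stands in for the paper's Butterworth filter
        self.d_filt = self.alpha * self.d_filt + (1.0 - self.alpha) * d_raw
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * self.d_filt

# one controller per decoupled axis, as in the PIDx / PIDy design
pid_x = PID(kp=2.0, ki=0.5, kd=0.8, dt=0.02)
pid_y = PID(kp=2.0, ki=0.5, kd=0.8, dt=0.02)
tilt_x = pid_x.update(setpoint=0.0, measurement=0.05)  # ball 5 cm off-center
```

With the ball displaced in +X, the controller commands a corrective tilt of the opposite sign, which the servo actuators would then apply to the platform.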

  3. CaPSID: A bioinformatics platform for computational pathogen sequence identification in human genomes and transcriptomes

    Directory of Open Access Journals (Sweden)

    Borozan Ivan

    2012-08-01

    Full Text Available Abstract Background It is now well established that nearly 20% of human cancers are caused by infectious agents, and the list of human oncogenic pathogens will grow in the future for a variety of cancer types. Whole tumor transcriptome and genome sequencing by next-generation sequencing technologies presents an unparalleled opportunity for pathogen detection and discovery in human tissues but requires development of new genome-wide bioinformatics tools. Results Here we present CaPSID (Computational Pathogen Sequence IDentification, a comprehensive bioinformatics platform for identifying, querying and visualizing both exogenous and endogenous pathogen nucleotide sequences in tumor genomes and transcriptomes. CaPSID includes a scalable, high performance database for data storage and a web application that integrates the genome browser JBrowse. CaPSID also provides useful metrics for sequence analysis of pre-aligned BAM files, such as gene and genome coverage, and is optimized to run efficiently on multiprocessor computers with low memory usage. Conclusions To demonstrate the usefulness and efficiency of CaPSID, we carried out a comprehensive analysis of both a simulated dataset and transcriptome samples from ovarian cancer. CaPSID correctly identified all of the human and pathogen sequences in the simulated dataset, while in the ovarian dataset CaPSID’s predictions were successfully validated in vitro.
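The gene and genome coverage metrics mentioned above reduce to interval arithmetic over read alignments. A minimal sketch (not CaPSID's actual implementation) that merges aligned intervals and reports the fraction of a reference covered:

```python
def genome_coverage(read_intervals, genome_length):
    """Fraction of a reference covered by at least one aligned read.
    read_intervals: (start, end) half-open coordinates, as in BAM records."""
    merged = []
    for start, end in sorted(read_intervals):
        if merged and start <= merged[-1][1]:
            # overlapping or abutting read: extend the current block
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    covered = sum(end - start for start, end in merged)
    return covered / genome_length

# three reads on a 1000 bp reference, two of them overlapping
reads = [(0, 100), (50, 150), (400, 500)]
fraction = genome_coverage(reads, 1000)
```

Sorting plus a single merge pass keeps the computation linear after the sort, which matters for the low-memory, multiprocessor setting the abstract describes.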

  4. A reconfigurable computing platform for plume tracking with mobile sensor networks

    Science.gov (United States)

    Kim, Byung Hwa; D'Souza, Colin; Voyles, Richard M.; Hesch, Joel; Roumeliotis, Stergios I.

    2006-05-01

    Much work has been undertaken recently toward the development of low-power, high-performance sensor networks. There are many static remote sensing applications for which this is appropriate. The focus of this development effort is applications that require higher performance computation, but still involve severe constraints on power and other resources. Toward that end, we are developing a reconfigurable computing platform for miniature robotic and human-deployed sensor systems composed of several mobile nodes. The system provides static and dynamic reconfigurability for both software and hardware by the combination of CPU (central processing unit) and FPGA (field-programmable gate array) allowing on-the-fly reprogrammability. Static reconfigurability of the hardware manifests itself in the form of a "morphing bus" architecture that permits the modular connection of various sensors with no bus interface logic. Dynamic hardware reconfigurability provides for the reallocation of hardware resources at run-time as the mobile, resource-constrained nodes encounter unknown environmental conditions that render various sensors ineffective. This computing platform will be described in the context of work on chemical/biological/radiological plume tracking using a distributed team of mobile sensors. The objective for a dispersed team of ground and/or aerial autonomous vehicles (or hand-carried sensors) is to acquire measurements of the concentration of the chemical agent from optimal locations and estimate its source and spread. This requires appropriate distribution, coordination and communication within the team members across a potentially unknown environment. The key problem is to determine the parameters of the distribution of the harmful agent so as to use these values for determining its source and predicting its spread. The accuracy and convergence rate of this estimation process depend not only on the number and accuracy of the sensor measurements but also on their

  5. Multivariate Gradient Analysis for Evaluating and Visualizing a Learning System Platform for Computer Programming

    Directory of Open Access Journals (Sweden)

    Richard Mather

    2015-02-01

    Full Text Available This paper explores the application of canonical gradient analysis to evaluate and visualize student performance and acceptance of a learning system platform. The subject of evaluation is a first year BSc module for computer programming. This uses ‘Ceebot’, an animated and immersive game-like development environment. Multivariate ordination approaches are widely used in ecology to explore species distribution along environmental gradients. Environmental factors are represented here by three ‘assessment’ gradients; one for the overall module mark and two independent tests of programming knowledge and skill. Response data included Likert expressions for behavioral, acceptance and opinion traits. Behavioral characteristics (such as attendance, collaboration and independent study were regarded to be indicative of learning activity. Acceptance and opinion factors (such as perceived enjoyment and effectiveness of Ceebot were treated as expressions of motivation to engage with the learning environment. Ordination diagrams and summary statistics for canonical analyses suggested that logbook grades (the basis for module assessment and code understanding were weakly correlated. Thus strong module performance was not a reliable predictor of programming ability. The three assessment indices were correlated with behaviors of independent study and peer collaboration, but were only weakly associated with attendance. Results were useful for informing teaching practice and suggested: (1 realigning assessments to more fully capture code-level skills (important in the workplace; (2 re-evaluating attendance-based elements of module design; and (3 the overall merit of multivariate canonical gradient approaches for evaluating and visualizing the effectiveness of a learning system platform.
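Canonical gradient analysis is far richer than a single statistic, but the correlation claims above (e.g., logbook grades weakly correlated with code understanding) rest on pairwise correlation as a building block. A sketch with hypothetical per-student data, not the study's dataset:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# hypothetical logbook grades vs. code-understanding test scores
logbook = [72.0, 85.0, 90.0, 65.0, 78.0, 88.0]
test_scores = [60.0, 62.0, 75.0, 58.0, 80.0, 64.0]
r = pearson(logbook, test_scores)
```

A value of r near zero for data like this would echo the paper's finding that strong module performance does not reliably predict programming ability.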

  6. Kilovoltage Rotational External Beam Radiotherapy on a Breast Computed Tomography Platform: A Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    Prionas, Nicolas D.; McKenney, Sarah E. [Department of Radiology, University of California, Davis, Medical Center, Sacramento, California (United States); Stern, Robin L. [Department of Radiation Oncology, University of California, Davis, Medical Center, Sacramento, California (United States); Boone, John M., E-mail: jmboone@ucdavis.edu [Department of Radiology, University of California, Davis, Medical Center, Sacramento, California (United States)

    2012-10-01

    Purpose: To demonstrate the feasibility of a dedicated breast computed tomography (bCT) platform to deliver rotational kilovoltage (kV) external beam radiotherapy (RT) for partial breast irradiation, whole breast irradiation, and dose painting. Methods and Materials: Rotational kV external beam RT using the geometry of a prototype bCT platform was evaluated using a Monte Carlo simulator. A point source emitting 178 keV photons (approximating a 320-kVp spectrum with 4-mm copper filtration) was rotated around a 14-cm voxelized polyethylene disk (0.1 cm tall) or cylinder (9 cm tall) to simulate primary and primary-plus-scattered photon interactions, respectively. Simulations were also performed using voxelized bCT patient images. Beam collimation was varied in the x-y plane (1-14 cm) and in the z-direction (0.1-10 cm). Dose painting for multiple foci, line, and ring distributions was demonstrated using multiple rotations with varying beam collimation. Simulations using the scanner's native hardware (120 kVp filtered by 0.2-mm copper) were validated experimentally. Results: As the x-y collimator was narrowed, the two-dimensional dose profiles shifted from a cupped profile with a high edge dose to an increasingly peaked central dose distribution with a sharp dose falloff. Using a 1-cm beam, the cylinder edge dose was <7% of the dose deposition at the cylinder center. Simulations using 120-kVp X-rays showed distributions similar to the experimental measurements. A homogeneous dose distribution (<2.5% dose fluctuation) with a 20% decrease in dose deposition at the cylinder edge (i.e., skin sparing) was demonstrated by weighted summation of four dose profiles using different collimation widths. Simulations using patient bCT images demonstrated the potential for treatment planning and image-guided RT. Conclusions: Rotational kV external beam RT for partial breast irradiation, dose painting, and whole breast irradiation with skin sparing is feasible on a bCT platform.
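The cupped wide-beam profile reported above follows from rotational geometry plus attenuation. A toy primary-photon-only sketch (parallel-beam geometry rather than the paper's point source, with a hypothetical attenuation coefficient) reproduces the high-edge-dose behaviour:

```python
import math

def rotational_dose_profile(radius_cm, mu_per_cm=0.13, n_angles=360, n_radii=30):
    """Toy dose profile across a disk irradiated from all directions.
    For each radius, average exp(-mu * depth) over beam angles, where
    depth is the path length from the disk surface to the point.
    mu_per_cm is a hypothetical attenuation coefficient, not the paper's."""
    profile = []
    for i in range(n_radii):
        r = radius_cm * i / (n_radii - 1)
        total = 0.0
        for k in range(n_angles):
            theta = 2.0 * math.pi * k / n_angles
            x = r * math.cos(theta)
            y = r * math.sin(theta)
            # beam travels along +x; entry point is the disk edge at this y
            depth = x + math.sqrt(max(0.0, radius_cm ** 2 - y ** 2))
            total += math.exp(-mu_per_cm * depth)
        profile.append(total / n_angles)
    return profile

profile = rotational_dose_profile(7.0)  # 14-cm disk, as in the abstract
```

For an uncollimated beam the averaged attenuation makes the edge dose exceed the center dose, i.e., the "cupped" profile; narrowing the collimator (not modeled here) concentrates dose centrally instead.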

  7. MACBenAbim: A Multi-platform Mobile Application for searching keyterms in Computational Biology and Bioinformatics.

    Science.gov (United States)

    Oluwagbemi, Olugbenga O; Adewumi, Adewole; Esuruoso, Abimbola

    2012-01-01

    Computational biology and bioinformatics are gradually gaining ground in Africa and other developing nations of the world. However, in these countries, the challenges facing computational biology and bioinformatics education include inadequate infrastructure and a lack of readily available complementary and motivational tools to support learning and research. This has discouraged many promising undergraduates, postgraduates and researchers from aspiring to further study in these fields. In this paper, we developed and described MACBenAbim (Multi-platform Mobile Application for Computational Biology and Bioinformatics), a flexible, user-friendly tool to search for, define and describe the meanings of keyterms in computational biology and bioinformatics, thus expanding the frontiers of knowledge of its users. The tool is also capable of visualizing results in a mobile multi-platform context. MACBenAbim is available from the authors for non-commercial purposes.

  8. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    Science.gov (United States)

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group that exists to establish a global network of geomagnetic observatories. The group establishes data standards and standard operating procedures for members and prospective members, and has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all collected data must have a timestamp accuracy within ±10 milliseconds of the top-of-the-second in Coordinated Universal Time. The U.S. Geological Survey Geomagnetism Program has therefore designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. The tests measure the time shifts introduced by individual components within the data collection system, as well as the time shift introduced by the Personal Computer Data Collection Platform as a whole. Additional testing designed for Intermagnet will be used to further validate such measurements. Current measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
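Checking measured lags against the one-second standard's ±10 ms tolerance is a simple summary computation; a sketch, with illustrative samples drawn from the lag ranges quoted in the abstract:

```python
def lag_compliance(lags_ms, tolerance_ms=10.0):
    """Summarize measured channel lags against Intermagnet's one-second
    standard, which allows +/-10 ms of timestamp error."""
    worst = max(abs(lag) for lag in lags_ms)
    return {
        "min_ms": min(lags_ms),
        "max_ms": max(lags_ms),
        "worst_abs_ms": worst,
        "compliant": worst <= tolerance_ms,
    }

# illustrative samples spanning the lag ranges quoted in the abstract
z_channel = lag_compliance([5.0, 12.3, 19.9])
h_channel = lag_compliance([13.0, 20.1, 25.8])
```

On these numbers both channels fail the ±10 ms criterion, which is why characterizing and correcting the platform's internal delay matters for the new standard.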

  9. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2017-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments. PMID:28190948
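The Monte Carlo workflows described above are built from stochastic simulation runs. As a minimal illustration of the core algorithm (a well-mixed Gillespie SSA for a single decay reaction, not PyURDME's spatial reaction-diffusion solver):

```python
import random

def gillespie_decay(n0, k, t_end, seed=1):
    """Minimal Gillespie SSA for the decay reaction A -> 0 at rate k.
    Spatial packages such as PyURDME extend this core event loop to
    reaction-diffusion on unstructured meshes; this sketch shows only
    the well-mixed case."""
    rng = random.Random(seed)
    t, n = 0.0, n0
    while n > 0:
        propensity = k * n
        t += rng.expovariate(propensity)   # exponential waiting time
        if t > t_end:
            break
        n -= 1                             # fire one decay event
    return n

# one stochastic trajectory; the expected mean is n0 * exp(-k * t_end)
remaining = gillespie_decay(n0=1000, k=1.0, t_end=1.0)
```

Because each trajectory is independent, ensembles of such runs parallelize trivially, which is exactly the workload MOLNs distributes across cloud workers.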

  10. Optimization of a Lattice Boltzmann Computation on State-of-the-Art Multicore Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Samuel; Carter, Jonathan; Oliker, Leonid; Shalf, John; Yelick, Katherine

    2009-04-10

    We present an auto-tuning approach to optimize application performance on emerging multicore architectures. The methodology extends the idea of search-based performance optimizations, popular in linear algebra and FFT libraries, to application-specific computational kernels. Our work applies this strategy to a lattice Boltzmann application (LBMHD) that historically has made poor use of scalar microprocessors due to its complex data structures and memory access patterns. We explore one of the broadest sets of multicore architectures in the HPC literature, including the Intel Xeon E5345 (Clovertown), AMD Opteron 2214 (Santa Rosa), AMD Opteron 2356 (Barcelona), Sun T5140 T2+ (Victoria Falls), as well as a QS20 IBM Cell Blade. Rather than hand-tuning LBMHD for each system, we develop a code generator that allows us to identify a highly optimized version for each platform, while amortizing the human programming effort. Results show that our auto-tuned LBMHD application achieves up to a 15x improvement compared with the original code at a given concurrency. Additionally, we present detailed analysis of each optimization, which reveal surprising hardware bottlenecks and software challenges for future multicore systems and applications.
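The search-based strategy above can be reduced to its essence: generate candidate kernels, time each on the target platform, and keep the fastest. A toy sketch over interchangeable summation kernels (stand-ins for the paper's generated LBMHD variants):

```python
import time

def time_once(fn, data):
    start = time.perf_counter()
    fn(data)
    return time.perf_counter() - start

def autotune(variants, data, trials=3):
    """Search-based tuning in miniature: time each candidate kernel on
    this machine and keep the fastest, taking the best of several trials
    to reduce timing noise."""
    best_name, best_time = None, float("inf")
    for name, fn in variants.items():
        elapsed = min(time_once(fn, data) for _ in range(trials))
        if elapsed < best_time:
            best_name, best_time = name, elapsed
    return best_name

def sum_loop(xs):
    total = 0.0
    for x in xs:
        total += x
    return total

# two interchangeable "kernels" standing in for generated code variants
variants = {"python_loop": sum_loop, "builtin_sum": sum}
winner = autotune(variants, list(range(100_000)))
```

A real auto-tuner searches a much larger space (blocking factors, vectorization, prefetch), but the select-by-measurement loop is the same, and it is what lets one code base adapt to Clovertown, Barcelona, Victoria Falls, and Cell alike.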

  11. GBM Volumetry using the 3D Slicer Medical Image Computing Platform

    Science.gov (United States)

    Egger, Jan; Kapur, Tina; Fedorov, Andriy; Pieper, Steve; Miller, James V.; Veeraraghavan, Harini; Freisleben, Bernd; Golby, Alexandra J.; Nimsky, Christopher; Kikinis, Ron

    2013-01-01

    Volumetric change in glioblastoma multiforme (GBM) over time is a critical factor in treatment decisions. Typically, the tumor volume is computed on a slice-by-slice basis using MRI scans obtained at regular intervals. (3D)Slicer – a free platform for biomedical research – provides an alternative to this manual slice-by-slice segmentation process, which is significantly faster and requires less user interaction. In this study, 4 physicians segmented GBMs in 10 patients, once using the competitive region-growing based GrowCut segmentation module of Slicer, and once purely by drawing boundaries completely manually on a slice-by-slice basis. Furthermore, we provide a variability analysis for three physicians for 12 GBMs. The time required for GrowCut segmentation was on an average 61% of the time required for a pure manual segmentation. A comparison of Slicer-based segmentation with manual slice-by-slice segmentation resulted in a Dice Similarity Coefficient of 88.43 ± 5.23% and a Hausdorff Distance of 2.32 ± 5.23 mm. PMID:23455483
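The Dice Similarity Coefficient used above to compare GrowCut and manual segmentations is defined as 2|A∩B| / (|A| + |B|). A sketch over voxel-coordinate sets with toy masks (hypothetical data, not the study's segmentations):

```python
def dice_coefficient(seg_a, seg_b):
    """Dice Similarity Coefficient 2|A n B| / (|A| + |B|) between two
    segmentations given as sets of voxel coordinates."""
    if not seg_a and not seg_b:
        return 1.0
    overlap = len(seg_a & seg_b)
    return 2.0 * overlap / (len(seg_a) + len(seg_b))

# toy slab-shaped "tumor" masks from two raters (hypothetical data)
manual = {(x, y, z) for x in range(10) for y in range(10) for z in range(2)}
growcut = {(x, y, z) for x in range(1, 10) for y in range(10) for z in range(2)}
dsc = dice_coefficient(manual, growcut)
```

A DSC of 1.0 means identical masks; the study's 88.43% indicates substantial but imperfect agreement between the semi-automatic and manual segmentations.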

  12. Depositional "cyclicity" on carbonate platforms: Real-world limits on computer-model output

    Energy Technology Data Exchange (ETDEWEB)

    Boss, S.K.; Neumann, A.C. (Univ. of North Carolina, Chapel Hill, NC (United States)); Rasmussen, K.A. (Northern Virginia Community Coll., Annandale, VA (United States))

    1994-03-01

    Computer models which attempt to define interactions among dynamic parameters believed to influence the development of "cyclic" carbonate platform sequences have been popularized over the past few years. These models typically utilize vectors for subsidence (constant) and cyclical (sinusoidal) eustatic sea-level to create accommodation space which is filled by sedimentation (depth-dependent rates) following an appropriate lag time (a non-depositional episode during initial platform flooding). Since these models are intended to reflect general principles of cyclic carbonate deposition, it is instructive to test their predictive utility by comparing typical model outputs with an actively evolving depositional cycle on a modern carbonate platform where rates of subsidence, eustatic sea-level and sediment accumulation are known. Holocene carbonate deposits across northern Great Bahama Bank provide such an ideal test platform for model-data comparisons. On Great Bahama Bank, formation of accommodation space depends on eustatic sea-level rise because tectonic subsidence is very slow. Contrary to typical model input parameters, however, the rate of formation of accommodation space varies irregularly across the bank-top because irregular bank-top topography (produced by subaerial erosion and karstification) results in differential flooding of the platform surface. Results of this comparison indicate that typical computer-model input variables (subsidence, sea-level, sedimentation, lag-time) and output depositional geometries are poorly correlated with real depositional patterns across Great Bahama Bank. Since other modern carbonate platforms and ancient carbonate sequences display similarly complex stratigraphies, it is suggested that present computer-modeling results have little predictive value for stratigraphic interpretation.

  13. Positioning Continuing Education Computer Programs for the Corporate Market.

    Science.gov (United States)

    Tilney, Ceil

    1993-01-01

    Summarizes the findings of the market assessment phase of Bellevue Community College's evaluation of its continuing education computer training program. Indicates that marketing efforts must stress program quality and software training to help overcome strong antiacademic client sentiment. (MGB)

  14. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    Science.gov (United States)

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Primarily, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate the AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Primarily, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves; and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi

  15. CoreFlow: a computational platform for integration, analysis and modeling of complex biological data.

    Science.gov (United States)

    Pasculescu, Adrian; Schoof, Erwin M; Creixell, Pau; Zheng, Yong; Olhovsky, Marina; Tian, Ruijun; So, Jonathan; Vanderlaan, Rachel D; Pawson, Tony; Linding, Rune; Colwill, Karen

    2014-04-04

    A major challenge in mass spectrometry and other large-scale applications is how to handle, integrate, and model the data that is produced. Given the speed at which technology advances and the need to keep pace with biological experiments, we designed a computational platform, CoreFlow, which provides programmers with a framework to manage data in real-time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts into project-specific pipelines, tracks interdependencies between related tasks, and enables the generation of summary reports as well as publication-quality images. As a result, the gap between experimental and computational components of a typical large-scale biology project is reduced, decreasing the time between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-sourced software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion, and modeling of multiple/selected reaction monitoring (MRM/SRM) results. CoreFlow was purposely designed as an environment for programmers to rapidly perform data analysis. These analyses are assembled into project-specific workflows that are readily shared with biologists to guide the next stages of experimentation. Its simple yet powerful interface provides a structure where scripts can be written and tested virtually simultaneously to shorten the life cycle of code development for a particular task. The scripts are exposed at every step so that a user can quickly see the relationships between the data, the assumptions that have been made, and the manipulations that have been performed. Since the scripts use commonly available programming languages, they can easily be
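CoreFlow's organization of scripts into pipelines with tracked interdependencies is, at its core, dependency-ordered task execution. A generic sketch of that idea (not CoreFlow's actual API; task and dependency names are hypothetical):

```python
def run_pipeline(tasks, deps):
    """Run tasks respecting declared dependencies, in the spirit of
    project-specific analysis pipelines. tasks: name -> callable;
    deps: name -> list of prerequisite task names."""
    done, order = set(), []

    def visit(name, stack=()):
        if name in done:
            return
        if name in stack:
            raise ValueError(f"cyclic dependency at {name}")
        for dep in deps.get(name, []):
            visit(dep, stack + (name,))
        tasks[name]()          # run only after all prerequisites
        done.add(name)
        order.append(name)

    for name in tasks:
        visit(name)
    return order

log = []
tasks = {
    "report":    lambda: log.append("report"),
    "normalize": lambda: log.append("normalize"),
    "load":      lambda: log.append("load"),
}
deps = {"report": ["normalize"], "normalize": ["load"]}
order = run_pipeline(tasks, deps)
```

Tracking the dependency graph explicitly is what lets a platform rerun only the downstream steps when an upstream script or dataset changes.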

  16. Cpu/gpu Computing for AN Implicit Multi-Block Compressible Navier-Stokes Solver on Heterogeneous Platform

    Science.gov (United States)

    Deng, Liang; Bai, Hanli; Wang, Fang; Xu, Qingxin

    2016-06-01

    CPU/GPU computing allows scientists to tremendously accelerate their numerical codes. In this paper, we port and optimize a double precision alternating direction implicit (ADI) solver for three-dimensional compressible Navier-Stokes equations from our in-house Computational Fluid Dynamics (CFD) software on heterogeneous platform. First, we implement a full GPU version of the ADI solver to remove a lot of redundant data transfers between CPU and GPU, and then design two fine-grain schemes, namely “one-thread-one-point” and “one-thread-one-line”, to maximize the performance. Second, we present a dual-level parallelization scheme using the CPU/GPU collaborative model to exploit the computational resources of both multi-core CPUs and many-core GPUs within the heterogeneous platform. Finally, considering the fact that memory on a single node becomes inadequate when the simulation size grows, we present a tri-level hybrid programming pattern MPI-OpenMP-CUDA that merges fine-grain parallelism using OpenMP and CUDA threads with coarse-grain parallelism using MPI for inter-node communication. We also propose a strategy to overlap the computation with communication using the advanced features of CUDA and MPI programming. We obtain speedups of 6.0 for the ADI solver on one Tesla M2050 GPU in contrast to two Xeon X5670 CPUs. Scalability tests show that our implementation can offer significant performance improvement on heterogeneous platform.
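Each sweep of an ADI scheme reduces to many independent tridiagonal solves, which is what makes the "one-thread-one-line" mapping natural. A serial sketch of the per-line kernel (the Thomas algorithm, not the paper's CUDA code; the example system is hypothetical):

```python
def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system (a: sub-, b: main, c: super-diagonal,
    d: right-hand side). Each ADI sweep applies one such solve to every
    grid line; GPU versions run many of these lines concurrently."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# hypothetical 1-D model line: -x[i-1] + 4 x[i] - x[i+1] = 2
n = 5
x = thomas_solve([0.0] + [-1.0] * (n - 1),   # sub-diagonal (a[0] unused)
                 [4.0] * n,                  # main diagonal
                 [-1.0] * (n - 1) + [0.0],   # super-diagonal (c[n-1] unused)
                 [2.0] * n)
```

Because each grid line's solve is independent, a 3-D sweep exposes one solve per line, which the paper's fine-grain schemes distribute across GPU threads.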

  17. Curriculum modules, software laboratories, and an inexpensive hardware platform for teaching computational methods to undergraduate computer science students

    Science.gov (United States)

    Peck, Charles Franklin

    Computational methods are increasingly important to 21st century research and education; bioinformatics and climate change are just two examples of this trend. In this context computer scientists play an important role, facilitating the development and use of the methods and tools used to support computationally-based approaches. The undergraduate curriculum in computer science is one place where computational tools and methods can be introduced to facilitate the development of appropriately prepared computer scientists. To facilitate the evolution of the pedagogy, this dissertation identifies, develops, and organizes curriculum materials, software laboratories, and the reference design for an inexpensive portable cluster computer, all of which are specifically designed to support the teaching of computational methods to undergraduate computer science students. Keywords. computational science, computational thinking, computer science, undergraduate curriculum.

  18. Beyond Computer Literacy: Supporting Youth's Positive Development through Technology

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for…

  20. Research on a Video Conferencing Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    杜磊; 许博; 阚媛; 王晓卓; 马文彬

    2013-01-01

    To support the development and application of video conferencing in a cloud computing environment, this paper proposes a cloud video conferencing platform framework comprising Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS), analyzes the service functions and conference workflow of cloud video conferencing, and builds a cloud video conferencing platform system. Applying the platform facilitates real-time communication of information and, while reducing enterprise development and maintenance costs, improves enterprise decision-making capability.

  1. Computational Analysis of Perfect-Information Position Auctions

    OpenAIRE

    Thompson, David R. M; Leyton-Brown, Kevin

    2014-01-01

    After experimentation with other designs, the major search engines converged on the weighted, generalized second-price auction (wGSP) for selling keyword advertisements. Notably, this convergence occurred before position auctions were well understood (or, indeed, widely studied) theoretically. While much progress has been made since, theoretical analysis is still not able to settle the question of why search engines found wGSP preferable to other position auctions. We approach this question i...

  2. Validation study of a computer-based open surgical trainer: SimPraxis® simulation platform

    Directory of Open Access Journals (Sweden)

    Tran LN

    2013-03-01

    Conclusion: We describe an interactive, computer-based simulator designed to assist in mastery of the cognitive steps of an open surgical procedure. This platform is intuitive and flexible, and could be applied to any stepwise medical procedure. Overall, experts outperformed novices in their performance on the trainer. Experts agreed that the content was acceptable, accurate, and representative. Keywords: simulation, surgical education, training, simulator, video

  3. Accelerating hyper-spectral data processing on the multi-CPU and multi-GPU heterogeneous computing platform

    Science.gov (United States)

    Zhang, Lei; Gao, Jiao Bo; Hu, Yu; Wang, Ying Hui; Sun, Ke Feng; Cheng, Juan; Sun, Dan Dan; Li, Yu

    2017-02-01

    In research on hyper-spectral imaging spectrometers, how to process the huge amount of image data is a difficult problem for all researchers: the data rate is on the order of several hundred megabytes per second, and parallel computing is the only practical way to handle it. With the development of multi-core CPUs and GPUs, parallel computing on them is increasingly applied to large-scale data processing. In this paper, we propose a new parallel computing solution for hyper-spectral data processing based on a multi-CPU and multi-GPU heterogeneous computing platform. We use OpenMP to control the multi-core CPUs and CUDA to schedule the parallel computation on the multiple GPUs. Experimental results show that hyper-spectral data processing on the multi-CPU and multi-GPU heterogeneous platform is markedly faster than the traditional serial algorithm run on a single-core CPU. This research is significant for the engineering application of the windowing Fourier transform imaging spectrometer.
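The CPU side of such a band-wise decomposition can be sketched with a thread pool splitting spectral bands across workers (a stand-in for the paper's OpenMP/CUDA code; the per-band kernel here is a trivial mean):

```python
from concurrent.futures import ThreadPoolExecutor

def band_mean(band):
    """Per-band reduction standing in for a real spectral kernel."""
    return sum(band) / len(band)

def process_cube(cube, workers=4):
    """Process each spectral band of a hyper-spectral cube in parallel.
    A thread pool stands in here for the OpenMP threads (and CUDA
    streams) used in the paper; the band-wise decomposition is the same."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(band_mean, cube))

# toy cube: 8 spectral bands of 16 "pixels" each
cube = [[float(b)] * 16 for b in range(8)]
means = process_cube(cube)
```

Splitting work along the band axis keeps each worker's memory access contiguous, which is the same locality argument that motivates the paper's fine-grain GPU mapping.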

  4. A Conceptual Architecture for Adaptive Human-Computer Interface of a PT Operation Platform Based on Context-Awareness

    Directory of Open Access Journals (Sweden)

    Qing Xue

    2014-01-01

Full Text Available We present a conceptual architecture for an adaptive human-computer interface of a PT operation platform based on context-awareness. This architecture will form the basis of the design of such an interface. This paper describes the components, key technologies, and working principles of the architecture. The critical content covers context-information modeling and processing, establishing relationships between contexts and interface-design knowledge through adaptive knowledge reasoning, and implementing visualization of the adaptive interface with the aid of interface tools technology.

  5. Positional quality assessment of orthophotos obtained from sensors onboard multi-rotor UAV platforms.

    Science.gov (United States)

    Mesas-Carrascosa, Francisco Javier; Rumbao, Inmaculada Clavero; Berrocal, Juan Alberto Barrera; Porras, Alfonso García-Ferrer

    2014-11-26

    In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart.
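The spatial quality tests referred to here typically reduce to computing the RMSE of independent checkpoints measured on the orthophoto against surveyed reference coordinates; the US NSSDA, for example, reports horizontal accuracy at 95% confidence as 1.7308 × RMSE_r for circular error. A minimal sketch of that computation (assuming 2-D checkpoints in a common metric coordinate system; this is a generic NSSDA-style check, not the specific test suite the authors used):

```python
import numpy as np

def horizontal_rmse(measured, reference):
    """Radial RMSE of checkpoints measured on the orthophoto vs. surveyed
    reference coordinates (both as (n, 2) arrays in the same metric units)."""
    d = np.asarray(measured, float) - np.asarray(reference, float)
    return float(np.sqrt(np.mean(np.sum(d ** 2, axis=1))))

def nssda_accuracy(rmse_r):
    # NSSDA horizontal accuracy at 95% confidence, assuming RMSE_x ~ RMSE_y.
    return 1.7308 * rmse_r
```

An agency would compare `nssda_accuracy(horizontal_rmse(...))` against the tolerance for the target map scale.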

  6. Positional Quality Assessment of Orthophotos Obtained from Sensors Onboard Multi-Rotor UAV Platforms

    Directory of Open Access Journals (Sweden)

    Francisco Javier Mesas-Carrascosa

    2014-11-01

Full Text Available In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart.

  7. Positional Quality Assessment of Orthophotos Obtained from Sensors Onboard Multi-Rotor UAV Platforms

    Science.gov (United States)

    Mesas-Carrascosa, Francisco Javier; Rumbao, Inmaculada Clavero; Berrocal, Juan Alberto Barrera; Porras, Alfonso García-Ferrer

    2014-01-01

    In this study we explored the positional quality of orthophotos obtained by an unmanned aerial vehicle (UAV). A multi-rotor UAV was used to obtain images using a vertically mounted digital camera. The flight was processed taking into account the photogrammetry workflow: perform the aerial triangulation, generate a digital surface model, orthorectify individual images and finally obtain a mosaic image or final orthophoto. The UAV orthophotos were assessed with various spatial quality tests used by national mapping agencies (NMAs). Results showed that the orthophotos satisfactorily passed the spatial quality tests and are therefore a useful tool for NMAs in their production flowchart. PMID:25587877

  8. Integration of a network aware traffic generation device into a computer network emulation platform

    CSIR Research Space (South Africa)

    Von Solms, S

    2014-07-01

    Full Text Available aware traffic into the network emulation platform. Traffic generators are often systems that replay captured traffic packet-by-packet or generate traffic according to a specified model or preconfigured sequence. Many of these traffic generators can...

  9. Positioning Identity in Computer-Mediated Discourse among ESOL Learners

    Science.gov (United States)

    Fong, Carlton J.; Lin, Shengjie; Engle, Randi A.

    2016-01-01

    The present study explores a linguistic mechanism in which the identity of English for speakers of other languages (ESOL) learners can be influenced online. Analyzing the discourse of ESOL chat room participants and how they uptake positioning statements through online conversations, we present two vignettes that illustrate the kind of discourse…

  11. Position-based quantum cryptography and catalytic computation

    NARCIS (Netherlands)

    Speelman, F.

    2016-01-01

    In this thesis, we present several results along two different lines of research. The first part concerns the study of position-based quantum cryptography, a topic in quantum cryptography. By combining quantum mechanics with special relativity theory, new cryptographic tasks can be developed that us

  12. Non-GPS full position and angular orientation onboard sensors for moving and stationary platforms

    Science.gov (United States)

    Dhadwal, Harbans S.; Rastegar, Jahangir; Feng, Dake; Kwok, Philip; Pereira, Carlos M.

    2016-05-01

Angular orientation of both mobile and stationary objects continues to be an ongoing topic of interest for guidance and control as well as for non-GPS based solutions for geolocation of assets in any environment. Currently available sensors, which include inertial devices such as accelerometers and gyros; magnetometers; surface mounted antennas; radars; GPS; and optical line of sight devices, do not provide an acceptable solution for many applications, particularly for gun-fired munitions and for all-weather and all-environment scenarios. A robust onboard full angular orientation sensor solution, based on a scanning polarized reference source and a polarized geometrical cavity orientation sensor, is presented. The full position of the object, in the reference source coordinate system, is determined by combining range data obtained using established time-of-flight techniques, with the angular orientation information.

  13. How Novel Algorithms and Access to High Performance Computing Platforms are Enabling Scientific Progress in Atomic and Molecular Physics

    Science.gov (United States)

    Schneider, Barry I.

    2016-10-01

Over the past 40 years there has been remarkable progress in the quantitative treatment of complex many-body problems in atomic and molecular physics (AMP). This has happened as a consequence of the development of new and powerful numerical methods, translating these algorithms into practical software and the associated evolution of powerful computing platforms ranging from desktops to high performance computational instruments capable of massively parallel computation. We are taking the opportunity afforded by this CCP2015 to review computational progress in scattering theory and the interaction of strong electromagnetic fields with atomic and molecular systems from the early 1960s to the present to show how these advances have revealed a remarkable array of interesting and in many cases unexpected features. The article is by no means complete and certainly reflects the views and experiences of the author.

  14. The Mechanism about Key and Credential on Trusted Computing Platform and the Application Study

    Institute of Scientific and Technical Information of China (English)

    SHEN Zhidong; ZHANG Huanguo; ZHANG Miao; YAN Fei; ZHANG Liqiang

    2006-01-01

Trusted Computing technology has been developing quickly in recent years. This technology manages to improve computer security and achieve a trusted computing environment. The core of trusted computing technology is cryptology. In this paper, we analyze the key and credential mechanisms, which are two basic aspects of the cryptology application of trusted computing. We give an example application to illustrate that the TPM-enabled key and credential mechanism can improve the security of a computer system.

  15. An Analysis of Impact Factors for Positioning Performance in WLAN Fingerprinting Systems Using Ishikawa Diagrams and a Simulation Platform

    Directory of Open Access Journals (Sweden)

    Keqiang Liu

    2017-01-01

Full Text Available Many factors influence the positioning performance of WLAN RSSI fingerprinting systems, and summarizing these factors is an important but challenging job. Moreover, impact analysis of non-algorithm factors is significant for system application and quality control, but little research has been conducted. This paper analyzes and summarizes the potential impact factors using an Ishikawa diagram covering radio signal transmitting, propagating, receiving, and processing. A simulation platform was developed to facilitate the analysis experiment, and the paper classifies the potential factors into controllable, uncontrollable, nuisance, and held-constant factors considering simulation feasibility. It takes five non-algorithm controllable factors into consideration, including AP density, AP distribution, radio signal propagation attenuation factor, radio signal propagation noise, and RP density, and adopts the OFAT analysis method in the experiment. The positioning result was obtained using deterministic and probabilistic algorithms, and the error is presented by RMSE and CDF. The results indicate that high AP density, signal propagation attenuation factor, and RP density, together with a low signal propagation noise level, are favorable to better performance, while AP distribution has no particular impact pattern on the positioning error. Overall, this paper makes a potentially significant contribution to the quality control of WLAN fingerprinting solutions.
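The deterministic algorithm family mentioned in the abstract is usually nearest-neighbour matching in RSSI space: each reference point (RP) stores a fingerprint vector of AP signal strengths, and a query is located at the RP whose fingerprint is closest. A minimal sketch, together with the RMSE error metric the paper reports (the toy fingerprints and positions below are made-up values):

```python
import numpy as np

def locate_nn(fingerprints, positions, rssi):
    """Deterministic nearest-neighbour fingerprinting: return the position of
    the reference point whose stored RSSI vector (dBm per AP) is closest in
    Euclidean distance to the observed RSSI vector."""
    fp = np.asarray(fingerprints, float)
    d = np.linalg.norm(fp - np.asarray(rssi, float), axis=1)
    return np.asarray(positions, float)[np.argmin(d)]

def rmse(estimates, truths):
    """Root-mean-square positioning error over a set of test points."""
    e = np.asarray(estimates, float) - np.asarray(truths, float)
    return float(np.sqrt(np.mean(np.sum(e ** 2, axis=1))))
```

Example: with RPs at (0, 0) and (5, 0) whose fingerprints are (-40, -70) and (-70, -40) dBm, an observation of (-42, -68) dBm matches the first RP. The probabilistic alternative would replace the distance with a likelihood over an RSSI noise model.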

  16. Preliminary results of real-time PPP-RTK positioning algorithm development for moving platforms and its performance validation

    Science.gov (United States)

    Won, Jihye; Park, Kwan-Dong

    2015-04-01

Real-time PPP-RTK positioning algorithms were developed for the purpose of obtaining precise coordinates of moving platforms. In this implementation, corrections for the satellite orbit and satellite clock were taken from the IGS-RTS products, while the ionospheric delay was removed through the ionosphere-free combination and the tropospheric delay was either handled using the Global Pressure and Temperature (GPT) model or estimated as a stochastic parameter. To improve the convergence speed, all the available GPS and GLONASS measurements were used and the Extended Kalman Filter parameters were optimized. To validate our algorithms, we collected GPS and GLONASS data from a geodetic-quality receiver installed on the roof of a moving vehicle in an open-sky environment and used IGS final products of satellite orbits and clock offsets. The horizontal positioning error fell below 10 cm within 5 minutes, and the error stayed below 10 cm even after the vehicle started moving. When the IGS-RTS product and the GPT model were used instead of the IGS precise product, the positioning accuracy of the moving vehicle was maintained at better than 20 cm once convergence was achieved at around 6 minutes.
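The convergence behaviour described, error shrinking over a few minutes as the filter settles, is the standard Kalman pattern: the gain starts large and decays as the state covariance shrinks. A toy scalar filter illustrates the mechanics (an illustrative stand-in only, not the authors' multi-constellation EKF; the `r` and `q` tuning values are made up):

```python
def kalman_1d(measurements, r=0.01, q=1e-5):
    """Toy scalar Kalman filter estimating a nearly static coordinate from
    noisy fixes. r is the measurement noise variance and q the process noise
    variance -- the kind of parameters tuned to trade convergence speed
    against responsiveness to platform motion."""
    x, p = measurements[0], 1.0  # initial state and (pessimistic) variance
    history = []
    for z in measurements:
        p += q                   # predict: variance grows by process noise
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with the innovation
        p *= (1.0 - k)           # posterior variance shrinks
        history.append(x)
    return history
```

Raising `q` makes the filter track a moving platform more aggressively at the cost of noisier estimates, which is one axis of the parameter optimization the abstract refers to.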

  17. Computer-controlled positive displacement pump for physiological flow simulation.

    Science.gov (United States)

    Holdsworth, D W; Rickey, D W; Drangova, M; Miller, D J; Fenster, A

    1991-11-01

A computer-controlled pump was designed for use both in the study of vascular haemodynamics and in the calibration of clinical devices that measure blood flow. The novel design of this pump incorporates two rack-mounted pistons driven into opposing cylinders by a micro-stepping motor. This approach allows the production of nearly uninterrupted steady flow, as well as a variety of pulsatile waveforms, including waveforms with reverse flow. The capabilities of this pump to produce steady flow from 0.1 to 60 ml s-1, as well as sinusoidal flow and physiological flow, such as that found in the common femoral and common carotid arteries, are demonstrated. Cycle-to-cycle reproducibility is very good, with an average variation of 0.1 ml s-1 over thousands of cycles.
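The control principle behind such a positive displacement pump is simple: volumetric flow Q equals piston area A times piston velocity v, and the micro-stepping motor's step rate sets v. A sketch of the conversion from a commanded waveform to a step rate (the piston area, steps-per-cm, and waveform parameters below are hypothetical, not the published device's specifications):

```python
import math

def step_rate(flow_ml_s, piston_area_cm2=5.0, steps_per_cm=2000):
    """Convert a target volumetric flow (ml/s) into a motor step rate
    (steps/s). Since 1 ml = 1 cm^3, piston velocity v = Q / A, and the
    step rate is v times the motor's steps per cm of piston travel.
    A negative result means the motor must reverse (reverse flow)."""
    v_cm_s = flow_ml_s / piston_area_cm2
    return v_cm_s * steps_per_cm

def sinusoidal_flow(t, mean=10.0, amp=5.0, period=1.0):
    """Sample a sinusoidal flow waveform at time t (s); amp > mean would
    produce the reverse-flow portions mentioned in the abstract."""
    return mean + amp * math.sin(2.0 * math.pi * t / period)
```

A physiological waveform would simply replace `sinusoidal_flow` with a tabulated femoral or carotid flow profile sampled at the controller's update rate.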

  18. A novel tablet computer platform for advanced language mapping during awake craniotomy procedures.

    Science.gov (United States)

    Morrison, Melanie A; Tam, Fred; Garavaglia, Marco M; Golestanirad, Laleh; Hare, Gregory M T; Cusimano, Michael D; Schweizer, Tom A; Das, Sunit; Graham, Simon J

    2016-04-01

    A computerized platform has been developed to enhance behavioral testing during intraoperative language mapping in awake craniotomy procedures. The system is uniquely compatible with the environmental demands of both the operating room and preoperative functional MRI (fMRI), thus providing standardized testing toward improving spatial agreement between the 2 brain mapping techniques. Details of the platform architecture, its advantages over traditional testing methods, and its use for language mapping are described. Four illustrative cases demonstrate the efficacy of using the testing platform to administer sophisticated language paradigms, and the spatial agreement between intraoperative mapping and preoperative fMRI results. The testing platform substantially improved the ability of the surgeon to detect and characterize language deficits. Use of a written word generation task to assess language production helped confirm areas of speech apraxia and speech arrest that were inadequately characterized or missed with the use of traditional paradigms, respectively. Preoperative fMRI of the analogous writing task was also assistive, displaying excellent spatial agreement with intraoperative mapping in all 4 cases. Sole use of traditional testing paradigms can be limiting during awake craniotomy procedures. Comprehensive assessment of language function will require additional use of more sophisticated and ecologically valid testing paradigms. The platform presented here provides a means to do so.

  19. Evaluating the Use of Commercial West Nile Virus Antigens as Positive Controls in the Rapid Analyte Measurement Platform West Nile Virus Assay.

    Science.gov (United States)

    Burkhalter, Kristen L; Savage, Harry M

    2015-12-01

    We evaluated the utility of 2 types of commercially available antigens as positive controls in the Rapid Analyte Measurement Platform (RAMP®) West Nile virus (WNV) assay. Purified recombinant WNV envelope antigens and whole killed virus antigens produced positive RAMP results and either type would be useful as a positive control. Killed virus antigens provide operational and economic advantages and we recommend their use over purified recombinant antigens. We also offer practical applications for RAMP positive controls and recommendations for preparing them.

  20. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2017-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed

  1. Systematic approach for deriving feasible mappings of parallel algorithms to parallel computing platforms

    NARCIS (Netherlands)

    Arkin, Ethem; Tekinerdogan, Bedir; Imre, Kayhan M.

    2016-01-01

    The need for high-performance computing together with the increasing trend from single processor to parallel computer architectures has leveraged the adoption of parallel computing. To benefit from parallel computing power, usually parallel algorithms are defined that can be mapped and executed o

  2. StatoilHydro chooses Platform LSF and EnginFrame to implement an enterprise-wide computing grid

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

This article described a single enterprise-wide computing grid developed by Canadian company Platform Computing Inc. for StatoilHydro. The system used LSF workload management software with multi-cluster capability and a web portal for job submissions. The system gave Statoil engineers access to all computing resources in the division and allowed computing resources to be managed centrally. The system was designed to accommodate Statoil's extensive 3-D modelling programs. The company initially used separate platforms for all 4 of its Norway locations. However, the disparity in the relative number of units in each location caused problems in terms of differences in engineering process consistency and reservoir simulation accuracy. The new network was established in under 20 days, and will allow the oil company to dynamically grow the size of its grid by harvesting unused CPU cycles from any of the 185 user workstations across the division that may not be running at full capacity. It was concluded that when StatoilHydro opens a new location, additional clusters can be added to the grid.

  3. Computational Fluid Dynamic Analysis of a Floating Offshore Wind Turbine Experiencing Platform Pitching Motion

    Directory of Open Access Journals (Sweden)

    Thanhtoan Tran

    2014-08-01

    Full Text Available The objective of this study is to illustrate the unsteady aerodynamic effects of a floating offshore wind turbine experiencing the prescribed pitching motion of a supporting floating platform as a sine function. The three-dimensional, unsteady Reynolds Averaged Navier-Stokes equations with the shear-stress transport (SST k-ω turbulence model were applied. Moreover, an overset grid approach was used to model the rigid body motion of a wind turbine blade. The current simulation results are compared to various approaches from previous studies. The unsteady aerodynamic loads of the blade were demonstrated to change drastically with respect to the frequency and amplitude of platform motion.

  4. HySDeP: a computational platform for on-board hydrogen storage systems – hybrid high-pressure solid-state and gaseous storage

    DEFF Research Database (Denmark)

    Mazzucco, Andrea; Rokni, Masoud

    2016-01-01

A computational platform is developed in the Modelica® language within the Dymola™ environment to provide a tool for the design and performance comparison of on-board hydrogen storage systems. The platform has been coupled with an open source library for hydrogen fueling stations to investigate...

  5. Hardware platforms for MEMS gyroscope tuning based on evolutionary computation using open-loop and closed -loop frequency response

    Science.gov (United States)

    Keymeulen, Didier; Ferguson, Michael I.; Fink, Wolfgang; Oks, Boris; Peay, Chris; Terrile, Richard; Cheng, Yen; Kim, Dennis; MacDonald, Eric; Foor, David

    2005-01-01

    We propose a tuning method for MEMS gyroscopes based on evolutionary computation to efficiently increase the sensitivity of MEMS gyroscopes through tuning. The tuning method was tested for the second generation JPL/Boeing Post-resonator MEMS gyroscope using the measurement of the frequency response of the MEMS device in open-loop operation. We also report on the development of a hardware platform for integrated tuning and closed loop operation of MEMS gyroscopes. The control of this device is implemented through a digital design on a Field Programmable Gate Array (FPGA). The hardware platform easily transitions to an embedded solution that allows for the miniaturization of the system to a single chip.

  6. VibroCV: a computer vision-based vibroarthrography platform with possible application to Juvenile Idiopathic Arthritis.

    Science.gov (United States)

    Wiens, Andrew D; Prahalad, Sampath; Inan, Omer T

    2016-08-01

    Vibroarthrography, a method for interpreting the sounds emitted by a knee during movement, has been studied for several joint disorders since 1902. However, to our knowledge, the usefulness of this method for management of Juvenile Idiopathic Arthritis (JIA) has not been investigated. To study joint sounds as a possible new biomarker for pediatric cases of JIA we designed and built VibroCV, a platform to capture vibroarthrograms from four accelerometers; electromyograms (EMG) and inertial measurements from four wireless EMG modules; and joint angles from two Sony Eye cameras and six light-emitting diodes with commercially-available off-the-shelf parts and computer vision via OpenCV. This article explains the design of this turn-key platform in detail, and provides a sample recording captured from a pediatric subject.

  7. Quantamatrix Multiplexed Assay Platform system for direct detection of bacteria and antibiotic resistance determinants in positive blood culture bottles.

    Science.gov (United States)

    Wang, H Y; Uh, Y; Kim, S; Lee, H

    2017-05-01

    Rapid and accurate identification of the causative pathogens of bloodstream infections (BSIs) is crucial for initiating appropriate antimicrobial therapy, which decreases the related morbidity and mortality rates. The aim of this study was to evaluate the usefulness of a newly developed multiplexed, bead-based bioassay system, the Quantamatrix Multiplexed Assay Platform (QMAP) system, obtained directly from blood culture bottles, to simultaneously detect the presence of bacteria and identify the genes for antibiotic resistance. The QMAP system was used to evaluate 619 blood culture bottles from patients with BSIs and to compare the results of conventional culture methods. Using conventional bacterial cultures as the reference standard, the sensitivity, specificity, positive predictive value, and negative predictive value of the QMAP system for detection of bacterial pathogens in positive blood culture (PBC) samples were 99.8% (n=592, 95% CI 0.9852-1.000, p system for identification of the genes for antibiotic resistance were 99.4% (n=158, 95% CI 0.9617-0.9999, p system takes about 3 hr, while culture methods can take 48-72 hr. Therefore, analysis using the QMAP system is rapid and reliable for characterizing causative pathogens in BSIs. Copyright © 2016 European Society of Clinical Microbiology and Infectious Diseases. Published by Elsevier Ltd. All rights reserved.
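The figures of merit quoted here (sensitivity, specificity, PPV, NPV against the culture reference standard) all follow directly from a 2×2 confusion table. A minimal helper for reproducing such numbers from raw counts (the counts in the test are made-up illustrative values, not the study's data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard diagnostic-accuracy metrics from a 2x2 confusion table,
    with the reference method (here, conventional culture) as ground truth.
    tp/fp/fn/tn = true/false positives and negatives."""
    return {
        "sensitivity": tp / (tp + fn),  # fraction of true positives detected
        "specificity": tn / (tn + fp),  # fraction of true negatives detected
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }
```

Confidence intervals like those reported in the abstract would then be computed on each proportion, e.g. with a Clopper-Pearson exact interval.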

  8. New insights into two distinct nucleosome distributions: comparison of cross-platform positioning datasets in the yeast genome

    Directory of Open Access Journals (Sweden)

    Deng Yangyang

    2010-01-01

Full Text Available Abstract Background Recently, a number of high-resolution genome-wide maps of nucleosome locations in S. cerevisiae have been derived experimentally. However, nucleosome positions are determined in vivo by the combined effects of numerous factors. Consequently, nucleosomes are not simple static units, which may explain the discrepancies in reported nucleosome positions as measured by different experiments. In order to more accurately depict the genome-wide nucleosome distribution, we integrated multiple nucleosome positioning datasets using a multi-angle analysis strategy. Results To evaluate the contribution of chromatin structure to transcription, we used the vast amount of available analyzed nucleosome data. Analysis of this data allowed for the comprehensive identification of the connections between promoter nucleosome positioning patterns and various transcription-dependent properties. Further, we characterised the function of nucleosome destabilisation in the context of transcription regulation. Our results indicate that genes with similar nucleosome occupancy patterns share general transcription attributes. We identified the local regulatory correlation (LRC regions for two distinct types of nucleosomes and we assessed their regulatory properties. We also estimated the nucleosome reproducibility and measurement accuracy for high-confidence transcripts. We found that by maintaining a distance of ~13 bp between the upstream border of the +1 nucleosome and the transcription start sites (TSSs, the stable +1 nucleosome may form a barrier against the accessibility of the TSS and shape an optimum chromatin conformation for gene regulation. An in-depth analysis of nucleosome positioning in normally growing and heat-shocked cells suggested that the extent and patterns of nucleosome sliding are associated with gene activation. Conclusions Our results, which combine different types of data, suggest that cross-platform information, including

  9. Computer-implemented method and apparatus for autonomous position determination using magnetic field data

    Science.gov (United States)

    Ketchum, Eleanor A. (Inventor)

    2000-01-01

A computer-implemented method and apparatus for determining the position of a vehicle to within 100 km autonomously from magnetic field measurements and attitude data, without a priori knowledge of position. An inverted dipole solution of two possible position solutions for each measurement of magnetic field data is deterministically calculated by a program-controlled processor solving the inverted first-order spherical harmonic representation of the geomagnetic field for two unit position vectors 180 degrees apart and a vehicle distance from the center of the earth. Correction schemes such as successive substitution and the Newton-Raphson method are applied to each dipole. The two position solutions for each measurement are saved separately. Velocity vectors for the position solutions are calculated so that a total energy difference for each of the two resultant position paths is computed. The position path with the smaller absolute total energy difference is chosen as the true position path of the vehicle.
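To make the dipole-inversion step concrete: in a first-order (pure dipole) model the field magnitude is B(r, λ) = B₀ (Rₑ/r)³ √(1 + 3 sin²λ), so given a measured |B| and magnetic latitude λ one can solve for the radial distance r. A sketch of the Newton-Raphson correction named in the abstract, applied to this scalar radial problem (a simplified illustration, not the patent's full spherical-harmonic inversion; B₀ and Rₑ are standard dipole-model constants):

```python
import math

B0 = 3.12e-5   # T, mean equatorial surface field in the first-order dipole model
RE = 6371.2e3  # m, Earth reference radius

def dipole_b(r, mlat):
    """First-order dipole field magnitude at radius r (m), magnetic latitude mlat (rad)."""
    return B0 * (RE / r) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(mlat) ** 2)

def radius_newton(b_meas, mlat, r0=7.0e6, tol=1e-6, max_iter=50):
    """Newton-Raphson on f(r) = dipole_b(r, mlat) - b_meas.
    Since B ~ r^-3, the analytic derivative is df/dr = -3 * B(r) / r."""
    r = r0
    for _ in range(max_iter):
        f = dipole_b(r, mlat) - b_meas
        step = f / (-3.0 * dipole_b(r, mlat) / r)
        r -= step
        if abs(step) < tol:
            break
    return r
```

In the full method the same kind of correction refines each of the two antipodal candidate solutions before the energy-difference test selects between them.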

  10. Payment Platform

    DEFF Research Database (Denmark)

    Hjelholt, Morten; Damsgaard, Jan

    2012-01-01

Payment transactions through the use of physical coins, bank notes or credit cards have for centuries been the standard formats of exchanging money. Recently, online and mobile digital payment platforms have entered the stage as contenders to this position and could possibly penetrate societies thoroughly and substitute current payment standards in the decades to come. This paper portrays how digital payment platforms evolve in socio-technical niches and how various technological platforms aim for institutional attention in their attempt to challenge earlier platforms and standards. The paper applies a co-evolutionary multilevel perspective to model the interplay and processes between technology and society wherein digital payment platforms will potentially substitute other payment platforms just as the credit card negated the check. On this basis, this paper formulates a multilevel conceptual...

  11. Significance of buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation on the level of the midbuccal mucosa.

    Science.gov (United States)

    Zuiderveld, Elise G; den Hartog, Laurens; Vissink, Arjan; Raghoebar, Gerry M; Meijer, Henny J A

    2014-01-01

This study assessed whether buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation affect the level of the midbuccal mucosa (MBM). Ninety patients with a single-tooth implant in the esthetic zone were included. The level of the MBM was measured on photographs taken 1 year after crown placement. The factors analyzed explained only 22% of the level of the MBM. The more buccally an implant was placed, the more apically the MBM was positioned. A comparable phenomenon was observed in cases with a thick biotype and cases that underwent pre-implant bone augmentation. Platform switching did not affect the level of the MBM.

  12. Coupled sensor/platform control design for low-level chemical detection with position-adaptive micro-UAVs

    Science.gov (United States)

    Goodwin, Thomas; Carr, Ryan; Mitra, Atindra K.; Selmic, Rastko R.

    2009-05-01

We discuss the development of Position-Adaptive Sensors [1] for purposes of detecting embedded chemical substances in challenging environments. This concept is a generalization of patented Position-Adaptive Radar Concepts developed at AFRL for challenging conditions such as urban environments. For purposes of investigating the detection of chemical substances using multiple MAV (Micro-UAV) platforms, we have designed and implemented an experimental testbed with sample structures, such as wooden carts, that contain controlled leakage points. Under this general concept, some members of a MAV swarm can serve as external position-adaptive "transmitters" by blowing air over the cart, and some members can serve as external position-adaptive "receivers" that are equipped with chemical or biological (chem/bio) sensors that function as "electronic noses". The objective can be defined as improving the particle count of chem/bio concentrations that impinge on a MAV-based position-adaptive sensor surrounding a chemical repository, such as a cart, via the development of intelligent position-adaptive control algorithms. The overall effect is to improve the detection and false-alarm statistics of the overall system. Within the major sections of this paper, we discuss a number of different aspects of developing our initial MAV-based sensor testbed. This testbed includes blowers to simulate position-adaptive excitations and a MAV from Draganfly Innovations Inc. with stable design modifications to accommodate our chem/bio sensor boom design. We include details with respect to several critical phases of the development effort, including development of the wireless sensor network and experimental apparatus, development of the stable sensor boom for the MAV, integration of chem/bio sensors and sensor node onto the MAV and boom, development of position-adaptive control algorithms and initial tests at IDCAST (Institute for the Development and

  13. Finding Needle in a Million Metrics: Anomaly Detection in a Large-scale Computational Advertising Platform

    OpenAIRE

    Zhou, Bowen; Shariat, Shahriar

    2016-01-01

Online media offers marketers opportunities to deliver brand messages to a large audience. Advertising technology platforms enable advertisers to find the proper group of audiences and deliver ad impressions to them in real time. The recent growth of real-time bidding has posed a significant challenge for monitoring such a complicated system. With so many components, we need a reliable system that detects possible changes in the system and alerts the engineering team. In this pa...

  14. A computational platform for considering the effects of aerodynamic and seismic load combination for utility scale horizontal axis wind turbines

    Science.gov (United States)

    Asareh, Mohammad-Amin; Prowell, Ian; Volz, Jeffery; Schonberg, William

    2016-03-01

The wide deployment of wind turbines in locations with high seismic hazard has led engineers to take into account a more comprehensive seismic design of such structures. Turbine-specific guidelines usually use simplified methods and make many assumptions when combining seismic demand with the other operational loads affecting the design of these structures. As turbines increase in size and capacity, the interaction between seismic loads and aerodynamic loads becomes even more important. In response to the need for a computational tool that can perform coupled simulations of wind and seismic loads, a seismic module was developed for the FAST code and is described in this research. This platform allows engineers working in this industry to directly consider the interaction between seismic and other environmental loads for turbines. This paper details the practical application and theory of this platform and provides examples of the use of its different capabilities. The platform is then used to show a suitable earthquake and operational load combination with implicit consideration of aerodynamic damping by estimating appropriate load factors.

  15. OpenVX-based Python Framework for real-time cross platform acceleration of embedded computer vision applications

    Directory of Open Access Journals (Sweden)

    Ori Heimlich

    2016-11-01

    Full Text Available Embedded real-time vision applications are being rapidly deployed in a large realm of consumer electronics, ranging from automotive safety to surveillance systems. However, the relatively limited computational power of embedded platforms is considered a bottleneck for many vision applications, necessitating optimization. OpenVX is a standardized interface, released in late 2014, that attempts to provide both system- and kernel-level optimization to vision applications. With OpenVX, vision processing is modeled with coarse-grained data flow graphs, which can be optimized and accelerated by the platform implementer. Current full implementations of OpenVX are given in the programming language C, which does not support advanced programming paradigms such as object-oriented and functional programming, nor does it offer runtime type-checking. Here we present a Python-based full implementation of OpenVX, which eliminates much of the discrepancy between the object-oriented paradigm used by many modern applications and the native C implementations. Our open-source implementation can be used for rapid development of OpenVX applications on embedded platforms. Demonstrations include static and real-time image acquisition and processing using a Raspberry Pi and a GoPro camera. Code is given as supplementary information. Code project and linked deployable virtual machine are located on GitHub: https://github.com/NBEL-lab/PythonOpenVX.
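The coarse-grained data-flow-graph model that OpenVX standardizes can be illustrated with a minimal sketch. The `Graph` class and toy kernels below are invented for illustration only; they are not the PythonOpenVX API.

```python
# Minimal coarse-grained dataflow graph in the OpenVX style. The Graph class
# and toy kernels are invented for this sketch, not the PythonOpenVX API.

class Graph:
    def __init__(self):
        self.nodes = []  # (kernel, input names, output name), in wiring order

    def add(self, kernel, inputs, output):
        self.nodes.append((kernel, inputs, output))

    def run(self, **feeds):
        """Execute nodes in insertion order; a real implementation would
        verify, optimize, and accelerate the graph per platform."""
        data = dict(feeds)
        for kernel, inputs, output in self.nodes:
            data[output] = kernel(*(data[name] for name in inputs))
        return data

# Toy "kernels" standing in for vision primitives.
def to_gray(img):
    return [[sum(px) / 3 for px in row] for row in img]

def threshold(img, t=0.5):
    return [[1 if v > t else 0 for v in row] for row in img]

g = Graph()
g.add(to_gray, ["frame"], "gray")
g.add(threshold, ["gray"], "mask")
out = g.run(frame=[[(0.9, 0.9, 0.9), (0.1, 0.1, 0.1)]])
print(out["mask"])  # [[1, 0]]
```

Because the graph is declared before it runs, a platform implementer can fuse, reorder, or offload kernels without changing application code, which is the optimization opportunity the record describes.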

  16. Advanced LIGO Two-Stage Twelve-Axis Vibration Isolation and Positioning Platform. Part 2: Experimental Investigation and Tests Results

    CERN Document Server

    Matichard, Fabrice; Mason, Kenneth; Mittleman, Richard; Abbott, Benjamin; Abbott, Samuel; Allwine, Eric; Barnum, Samuel; Birch, Jeremy; Biscans, Sebastien; Clark, Daniel; Coyne, Dennis; DeBra, Dan; DeRosa, Ryan; Foley, Stephany; Fritschel, Peter; Giaime, Joseph A; Gray, Corey; Grabeel, Gregory; Hanson, Joe; Hillard, Michael; Kissel, Jeffrey; Kucharczyk, Christopher; Roux, Adrien Le; Lhuillier, Vincent; Macinnis, Myron; O'Reilly, Brian; Ottaway, David; Paris, Hugo; Puma, Michael; Radkins, Hugh; Ramet, Celine; Robinson, Mitchell; Ruet, Laurent; Sareen, Pradeep; Shoemaker, David; Stein, Andy; Thomas, Jeremy; Vargas, Michael; Warner, Jimmy

    2014-01-01

    This paper presents the results of the past seven years of experimental investigation and testing done on the two-stage twelve-axis vibration isolation platform for the Advanced LIGO gravitational-wave observatories. This five-ton, two-and-a-half-meter-wide system supports more than 1000 kg of very sensitive equipment. It provides positioning capability and seismic isolation in all directions of translation and rotation. To meet the very stringent requirements of Advanced LIGO, the system must provide more than three orders of magnitude of isolation over a very large bandwidth. It must bring the motion below 10^(-11) m/(Hz)^0.5 at 1 Hz and 10^(-12) m/(Hz)^0.5 at 10 Hz. A prototype of this system was built in 2006. It was extensively tested and analyzed during the following two years. This paper shows how the experimental results obtained with the prototype were used to engineer the final design. It highlights how the engineering solutions implemented not only improved the isolation performance but also greatl...

  17. Computing solutions of the modified Bessel differential equation for imaginary orders and positive arguments

    NARCIS (Netherlands)

    Gil, A.; Segura, J.; Temme, N.M.

    2004-01-01

    We describe a variety of methods to compute the functions Kia(x), Lia(x) and their derivatives for real a and positive x. These functions are numerically satisfactory independent solutions of the differential equation x^2 w'' + x w' + (a^2 - x^2) w = 0.
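For reference, this is the standard modified Bessel equation evaluated at purely imaginary order; substituting $\nu = ia$ into the standard form makes the connection explicit:

```latex
% Modified Bessel equation of order \nu:
\[ x^2 w'' + x w' - (x^2 + \nu^2)\,w = 0. \]
% With purely imaginary order \nu = ia (so \nu^2 = -a^2) this becomes
\[ x^2 w'' + x w' + (a^2 - x^2)\,w = 0, \]
% for which K_{ia}(x) is real-valued for real a and x > 0.
```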

  18. Industrial Computer Reliability Management Platform

    Institute of Scientific and Technical Information of China (English)

    李春霞; 唐怀斌; 贺孝珍; 刘兴莉; 隆萍

    2012-01-01

    The characteristic quantities of industrial computer reliability are discussed. From the aspects of product design, development, production, and management, technologies, methods, and a management system are put forward to ensure that industrial computer reliability is achieved and continuously improved. Views on establishing an enterprise reliability management platform are also presented.

  19. Construction of a Micro-lecture Teaching Resource Platform Based on Mobile Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    朱静宜

    2015-01-01

    Mobile cloud computing is a delivery and usage mode of information resource services in which mobile terminals obtain the required infrastructure, platform, software, or applications through a mobile network in an on-demand, scalable way. With efficient data storage and computing power, it has a positive effect on the construction of micro-lecture teaching resource platforms. Against the background of current micro-lecture teaching resource platform construction, and considering the characteristics of mobile cloud computing and micro-lectures, this paper analyzes the overall architecture of the teaching resource platform and describes its construction.

  20. Cyborg systems as platforms for computer-vision algorithm-development for astrobiology

    Science.gov (United States)

    McGuire, Patrick Charles; Rodríguez Manfredi, José Antonio; Martínez, Eduardo Sebastián; Gómez Elvira, Javier; Díaz Martínez, Enrique; Ormö, Jens; Neuffer, Kai; Giaquinta, Antonino; Camps Martínez, Fernando; Lepinette Malvitte, Alain; Pérez Mercader, Juan; Ritter, Helge; Oesker, Markus; Ontrup, Jörg; Walter, Jörg

    2004-03-01

    Employing the allegorical imagery from the film "The Matrix", we motivate and discuss our "Cyborg Astrobiologist" research program. In this research program, we are using a wearable computer and video camcorder in order to test and train a computer-vision system to be a field-geologist and field-astrobiologist.

  1. Optimization of beam angles for intensity modulated radiation therapy treatment planning using genetic algorithm on a distributed computing platform.

    Science.gov (United States)

    Nazareth, Daryl P; Brunner, Stephen; Jones, Matthew D; Malhotra, Harish K; Bakhtiari, Mohammad

    2009-07-01

    Planning intensity modulated radiation therapy (IMRT) treatment involves selection of several angle parameters as well as specification of structures and constraints employed in the optimization process. Including these parameters in the combinatorial search space vastly increases the computational burden, and therefore the parameter selection is normally performed manually by a clinician, based on clinical experience. We have investigated the use of a genetic algorithm (GA) and distributed-computing platform to optimize the gantry angle parameters and provide insight into additional structures, which may be necessary, in the dose optimization process to produce optimal IMRT treatment plans. For an IMRT prostate patient, we produced the first generation of 40 samples, each of five gantry angles, by selecting from a uniform random distribution, subject to certain adjacency and opposition constraints. Dose optimization was performed by distributing the 40-plan workload over several machines running a commercial treatment planning system. A score was assigned to each resulting plan, based on how well it satisfied clinically-relevant constraints. The second generation of 40 samples was produced by combining the highest-scoring samples using techniques of crossover and mutation. The process was repeated until the sixth generation, and the results compared with a clinical (equally-spaced) gantry angle configuration. In the sixth generation, 34 of the 40 GA samples achieved better scores than the clinical plan, with the best plan showing an improvement of 84%. Moreover, the resulting configuration of beam angles tended to cluster toward the patient's sides, indicating where the inclusion of additional structures in the dose optimization process may avoid dose hot spots. Additional parameter selection in IMRT leads to a large-scale computational problem. 
We have demonstrated that the GA combined with a distributed-computing platform can be applied to optimize gantry angle
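The generation loop this record describes (uniform random initial gantry-angle samples subject to adjacency constraints, scoring, then crossover and mutation of the highest-scoring samples) can be sketched as follows. The scoring function, adjacency threshold, and elitist selection here are invented stand-ins, not the paper's clinical dose-optimization objective.

```python
import random

NUM_ANGLES = 5         # gantry angles per plan
POP_SIZE = 40          # samples per generation
GENERATIONS = 6
MIN_SEPARATION = 20.0  # assumed adjacency constraint, in degrees

def random_plan():
    """Draw sorted angles uniformly, re-sampling until the gaps are legal."""
    while True:
        plan = sorted(random.uniform(0, 360) for _ in range(NUM_ANGLES))
        if all(b - a >= MIN_SEPARATION for a, b in zip(plan, plan[1:])):
            return plan

def score(plan):
    """Stand-in objective (higher is better): reward lateral beams."""
    return -sum(min(abs(a - 90), abs(a - 270)) for a in plan)

def crossover(p1, p2):
    """Combine two parent plans angle-by-angle."""
    return sorted(random.choice(pair) for pair in zip(p1, p2))

def mutate(plan, rate=0.2, step=15.0):
    """Perturb each angle with probability `rate`."""
    return sorted((a + random.uniform(-step, step)) % 360
                  if random.random() < rate else a for a in plan)

def evolve():
    population = [random_plan() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=score, reverse=True)
        elite = population[:POP_SIZE // 4]   # highest-scoring samples survive
        children = [mutate(crossover(*random.sample(elite, 2)))
                    for _ in range(POP_SIZE - len(elite))]
        population = elite + children
    return max(population, key=score)

best = evolve()
print(best)
```

In the actual study, each generation's 40 dose optimizations were farmed out to several machines running the treatment planning system; here the scoring is trivial, so the loop runs serially.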

  2. Significance of buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation on the level of the midbuccal mucosa

    NARCIS (Netherlands)

    Zuiderveld, Elise G; den Hartog, Laurens; Vissink, Arjan; Raghoebar, Gerry M; Meijer, Henny J A

    2014-01-01

    This study assessed whether buccopalatal implant position, biotype, platform switching, and pre-implant bone augmentation affect the level of the midbuccal mucosa (MBM). Ninety patients with a single-tooth implant in the esthetic zone were included. The level of the MBM was measured on photographs

  3. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    Science.gov (United States)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enables accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  4. Research on a Model Service Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    孙向军; 陆勤夫

    2011-01-01

    Starting from the practical needs of military model management and reuse, this paper analyzes the advantages of cloud computing in service delivery, proposes a model service platform based on cloud computing, and analyzes the key technologies for realizing such a platform.

  5. AAE and AAOMR Joint Position Statement: Use of Cone Beam Computed Tomography in Endodontics 2015 Update.

    Science.gov (United States)

    2015-10-01

    The following statement was prepared by the Special Committee to Revise the Joint American Association of Endodontists/American Academy of Oral and Maxillofacial Radiology Position on Cone Beam Computed Tomography, and approved by the AAE Board of Directors and AAOMR Executive Council in May 2015. AAE members may reprint this position statement for distribution to patients or referring dentists.

  6. Sketch for a model of four epistemological positions toward computer game play

    DEFF Research Database (Denmark)

    Leino, Olli

    2008-01-01

    The paper attempts to sketch out four distinct epistemological positions toward the player, who is understood as derived from play and game. To map out the problem field, two equally challenged positions toward computer game play are observed, emerging from inadequate treatment of the differences...... an external viewpoint, appear as fulfilling set criteria, while from an inclusive viewpoint, every object which affords being played is counted as a game. These polarities are combined on a two-dimensional plane in order to arrive at four epistemological positions toward computer game play, which...

  7. Analysis of offshore jacket platform

    Digital Repository Service at National Institute of Oceanography (India)

    Harish, N.; Mandal, S.; Shanthala, B.; Rao, S.

    -Rohman, M., ‘Multi-loop feedback control of offshore steel jacket platforms’, Computers & Structures 70, 1999, 185–202. 11.Yamamoto, I., Terada, Y., and Yokokura, K., ‘An application of a position keeping control system to floating offshore platform...

  8. NCI's High Performance Computing (HPC) and High Performance Data (HPD) Computing Platform for Environmental and Earth System Data Science

    Science.gov (United States)

    Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley

    2015-04-01

    The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially

  9. Evaluation of three-dimensional position change of the condylar head after orthognathic surgery using computer-aided design/computer-aided manufacturing-made condyle positioning jig.

    Science.gov (United States)

    Kim, Hyung-Mo; Baek, Seung-Hak; Kim, Tae-Yun; Choi, Jin-Young

    2014-11-01

    This study was performed to evaluate the efficacy of computer-aided design/computer-aided manufacturing (CAD/CAM)-made condyle positioning jig in orthognathic surgery. The sample consisted of 40 mandibular condyles of 20 patients with class III malocclusion who underwent bilateral sagittal split ramus osteotomy with semirigid fixation (6 men and 14 women; mean age, 25 y; mean amount of mandibular setback, 5.8 mm). Exclusion criteria were patients who needed surgical correction of the frontal ramal inclination and had signs and symptoms of the temporomandibular disorder before surgery. Three-dimensional computed tomograms were taken 1 month before the surgery (T1) and 1 day after the surgery (T2). The condylar position was evaluated at the T1 and T2 stages on the axial, frontal, and sagittal aspects in the three-dimensional coordinates. The linear change of the posterior border of the proximal segment of the ramus between T1 and T2 was also evaluated in 30 condyles (15 patients), with the exception of 10 condyles of 5 patients who received mandibular angle reduction surgery. There was no significant difference in the condylar position in the frontal and sagittal aspects (P > 0.05). Although there was a significant difference in the condylar position in the axial aspect (P < 0.01), the amount of difference was less than 1 mm and 1 degree; it can be considered clinically nonsignificant. In the linear change of the posterior border of the proximal segment of the ramus, the mean change was 1.4 mm and 60% of the samples showed a minimal change of less than 1 mm. The results of this study suggest that CAD/CAM-made condyle positioning jig is easy to install and reliable to use in orthognathic surgery.

  10. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    Science.gov (United States)

    Benjamin, D.; Caballero, J.; Ernst, M.; Guan, W.; Hover, J.; Lesny, D.; Maeno, T.; Nilsson, P.; Tsulaia, V.; van Gemmeren, P.; Vaniachine, A.; Wang, F.; Wenaus, T.; ATLAS Collaboration

    2016-10-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  11. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00066086; The ATLAS collaboration; Caballero, Jose; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, Edison Cray XC30 supercomputer, backfill at Tier 2 and Tier 3 sites, opportunistic resources at the Open Science Grid (OSG), and ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  12. Scaling up ATLAS Event Service to production levels on opportunistic computing platforms

    CERN Document Server

    Benjamin, Douglas; The ATLAS collaboration; Ernst, Michael; Guan, Wen; Hover, John; Lesny, David; Maeno, Tadashi; Nilsson, Paul; Tsulaia, Vakhtang; van Gemmeren, Peter; Vaniachine, Alexandre; Wang, Fuquan; Wenaus, Torre

    2016-01-01

    Continued growth in public cloud and HPC resources is on track to exceed the dedicated resources available for ATLAS on the WLCG. Examples of such platforms are Amazon AWS EC2 Spot Instances, the Edison Cray XC30 supercomputer, backfill at Tier-2 and Tier-3 sites, opportunistic resources at the Open Science Grid, and the ATLAS High Level Trigger farm between the data taking periods. Because of specific aspects of opportunistic resources such as preemptive job scheduling and data I/O, their efficient usage requires workflow innovations provided by the ATLAS Event Service. Thanks to the finer granularity of the Event Service data processing workflow, the opportunistic resources are used more efficiently. We report on our progress in scaling opportunistic resource usage to double-digit levels in ATLAS production.

  13. Process Simulation of Complex Biological Pathways in Physical Reactive Space and Reformulated for Massively Parallel Computing Platforms.

    Science.gov (United States)

    Ganesan, Narayan; Li, Jie; Sharma, Vishakha; Jiang, Hanyu; Compagnoni, Adriana

    2016-01-01

    Biological systems encompass complexity that far surpasses many artificial systems. Modeling and simulation of large and complex biochemical pathways is a computationally intensive challenge. Traditional tools, such as ordinary differential equations, partial differential equations, stochastic master equations, and Gillespie type methods, are all limited either by their modeling fidelity or computational efficiency or both. In this work, we present a scalable computational framework based on modeling biochemical reactions in explicit 3D space, that is suitable for studying the behavior of large and complex biological pathways. The framework is designed to exploit parallelism and scalability offered by commodity massively parallel processors such as the graphics processing units (GPUs) and other parallel computing platforms. The reaction modeling in 3D space is aimed at enhancing the realism of the model compared to traditional modeling tools and framework. We introduce the Parallel Select algorithm that is key to breaking the sequential bottleneck limiting the performance of most other tools designed to study biochemical interactions. The algorithm is designed to be computationally tractable, handle hundreds of interacting chemical species and millions of independent agents by considering all-particle interactions within the system. We also present an implementation of the framework on the popular graphics processing units and apply it to the simulation study of JAK-STAT Signal Transduction Pathway. The computational framework will offer a deeper insight into various biological processes within the cell and help us observe key events as they unfold in space and time. This will advance the current state-of-the-art in simulation study of large scale biological systems and also enable the realistic simulation study of macro-biological cultures, where inter-cellular interactions are prevalent.

  14. Computational and experimental platform for understanding and optimizing water flux and salt rejection in nanoporous membranes.

    Energy Technology Data Exchange (ETDEWEB)

    Rempe, Susan B.

    2010-09-01

    Affordable clean water is both a global and a national security issue as lack of it can cause death, disease, and international tension. Furthermore, efficient water filtration reduces the demand for energy, another national issue. The best current solution to clean water lies in reverse osmosis (RO) membranes that remove salts from water with applied pressure, but widely used polymeric membrane technology is energy intensive and produces water depleted in useful electrolytes. Furthermore, incremental improvements, based on engineering solutions rather than new materials, have yielded only modest gains in performance over the last 25 years. We have pursued a creative and innovative new approach to membrane design and development for cheap desalination membranes by approaching the problem at the molecular level of pore design. Our inspiration comes from natural biological channels, which permit faster water transport than current reverse osmosis membranes and selectively pass healthy ions. Aiming for an order-of-magnitude improvement over mature polymer technology carries significant inherent risks. The success of our fundamental research effort lies in our exploiting, extending, and integrating recent advances by our team in theory, modeling, nano-fabrication and platform development. A combined theoretical and experimental platform has been developed to understand the interplay between water flux and ion rejection in precisely-defined nano-channels. Our innovative functionalization of solid state nanoporous membranes with organic protein-mimetic polymers achieves 3-fold improvement in water flux over commercial RO membranes and has yielded a pending patent and industrial interest. Our success has generated useful contributions to energy storage, nanoscience, and membrane technology research and development important for national health and prosperity.

  15. An Evaluation of an OSGi-based Residential Pervasive Computing Platform

    DEFF Research Database (Denmark)

    Hansen, Klaus Marius; Larsen, Simon Bo; Pagter, Jakob Illeborg;

    2004-01-01

    Residential applications, including home control, alarm systems, and monitoring services, are an area in which pervasive computing systems are currently emerging. One problem facing technology and service providers is getting a view on and analysis of technological and commercial problems and opportunities. As a step towards that, we present an analysis and evaluation of a widely-used setup for residential pervasive computing applications, viz., a setup based on a residential gateway with an Open Services Gateway Initiative (OSGi) implementation. The analysis is anchored in use, through scenarios...

  16. Interesting Spatio-Temporal Region Discovery Computations Over Gpu and Mapreduce Platforms

    Science.gov (United States)

    McDermott, M.; Prasad, S. K.; Shekhar, S.; Zhou, X.

    2015-07-01

    Discovery of interesting paths and regions in spatio-temporal data sets is important to many fields, such as the earth and atmospheric sciences, GIS, public safety and public health, both as a goal in itself and as a preliminary step in a larger series of computations. This discovery is usually an exhaustive procedure that quickly becomes extremely time-consuming to perform using traditional paradigms and hardware, and, given the rapidly growing sizes of today's data sets, it is quickly outpacing the speed at which computational capacity is growing. In our previous work (Prasad et al., 2013a) we achieved a 50-fold speedup over the sequential implementation using a single GPU. We were able to achieve near-linear speedup over this result on interesting path discovery by using Apache Hadoop to distribute the workload across multiple GPU nodes. Leveraging the parallel architecture of GPUs, we were able to drastically reduce the computation time of a 3-dimensional spatio-temporal interest-region search on a single tile of normalized difference vegetation index for Saudi Arabia. We further saw an almost linear speedup in compute performance by distributing this workload across several GPUs with a simple MapReduce model. This increases the speed of processing 10-fold over the comparable sequential implementation while simultaneously increasing the amount of data being processed 384-fold. This allowed us to process the entirety of the selected data set instead of a constrained window.
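The map-reduce decomposition described above (each node scoring its own tile, with a reduce step keeping the global best) can be sketched in miniature. The tile data and the 2x2 window score below are invented stand-ins for the NDVI tiles and the 3-D interest-region kernel in the paper.

```python
# Hypothetical NDVI-like tiles: each tile is a small grid of index values.
tiles = {
    "tile_a": [[0.1, 0.2, 0.1], [0.2, 0.9, 0.8], [0.1, 0.8, 0.9]],
    "tile_b": [[0.5, 0.4, 0.1], [0.4, 0.3, 0.1], [0.1, 0.1, 0.1]],
}

def map_tile(item):
    """Map step (one GPU node per tile in the paper): score every 2x2 window
    in this tile and emit the tile's best window."""
    name, grid = item
    best = None
    for r in range(len(grid) - 1):
        for c in range(len(grid[0]) - 1):
            s = grid[r][c] + grid[r][c + 1] + grid[r + 1][c] + grid[r + 1][c + 1]
            if best is None or s > best[0]:
                best = (s, name, (r, c))
    return best

def reduce_results(partials):
    """Reduce step: keep the globally highest-scoring region."""
    return max(partials)

winner = reduce_results(map(map_tile, tiles.items()))
print(winner)  # highest-scoring 2x2 region across all tiles
```

The map calls are independent, so swapping the built-in `map` for a Hadoop job (or one GPU kernel launch per tile, as in the paper) changes only the execution layer, not the algorithm.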

  17. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    Science.gov (United States)

    2014-04-01

    e.g., see [Bhattacharyya 2013]). Through their connections to computation graphs [Karp 1966] and Kahn process networks [Kahn 1974, Lee 1995...parallel programming. In Proceedings of the IFIP Congress, 1974. [Karp 1966] R. M. Karp and R. E. Miller. Properties of a model for parallel

  18. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    Science.gov (United States)

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  19. A Framework for Collaborative and Convenient Learning on Cloud Computing Platforms

    Science.gov (United States)

    Sharma, Deepika; Kumar, Vikas

    2017-01-01

    The depth of learning resides in collaborative work with more engagement and fun. Technology can enhance collaboration with a higher level of convenience and cloud computing can facilitate this in a cost effective and scalable manner. However, to deploy a successful online learning environment, elementary components of learning pedagogy must be…

  20. WELCOME – innovative integrated care platform using wearable sensing and smart cloud computing for COPD patients with comorbidities.

    Science.gov (United States)

    Chouvarda, Ioanna; Philip, Nada Y; Natsiavas, Pantelis; Kilintzis, Vasilis; Sobnath, Drishty; Kayyali, Reem; Henriques, Jorge; Paiva, Rui Pedro; Raptopoulos, Andreas; Chételat, Olivier; Maglaveras, Nicos

    2014-01-01

    We propose WELCOME, an innovative integrated care platform using wearable sensors and smart cloud computing for Chronic Obstructive Pulmonary Disease (COPD) patients with comorbidities. WELCOME aims to bring about a change in the reactive nature of the management of chronic diseases and their comorbidities, in particular through the development of a patient-centred and proactive approach to COPD management. The aim of WELCOME is to support healthcare services in the early detection of complications (potentially reducing hospitalisations) and the prevention and mitigation of comorbidities (heart failure, diabetes, anxiety and depression). The system incorporates a patient hub, which interacts with the patient via a light vest that includes a large number of non-invasive chest sensors for monitoring various relevant parameters. In addition, interactive applications to monitor and manage diabetes, anxiety and lifestyle issues will be provided to the patient. Informal carers will also be supported in dealing with their patients. The WELCOME smart cloud platform, in turn, is the heart of the proposed system, where all medical records and monitoring data are managed and processed via the decision support system. Healthcare professionals will be able to securely access the WELCOME applications to monitor and manage patients' conditions and respond to alerts on a personalized level.

  1. ArchSim: A System-Level Parallel Simulation Platform for the Architecture Design of High Performance Computer

    Institute of Scientific and Technical Information of China (English)

    Yong-Qin Huang; Hong-Liang Li; Xiang-Hui Xie; Lei Qian; Zi-Yu Hao; Feng Guo; Kun Zhang

    2009-01-01

    A high performance computer (HPC) is a complex, huge system whose architecture design meets increasing difficulties and risks. Traditional methods, such as theoretical analysis, component-level simulation and sequential simulation, are not applicable to system-level simulations of HPC systems. Even parallel simulation using large-scale parallel machines has many difficulties in scalability, reliability, generality, as well as efficiency. According to the current needs of HPC architecture design, this paper proposes a system-level parallel simulation platform: ArchSim. We first introduce the architecture of the ArchSim simulation platform, which is composed of a global server (GS), local server agents (LSA) and entities. Secondly, we emphasize some key techniques of ArchSim, including the synchronization protocol, the communication mechanism and the distributed checkpointing/restart mechanism. We then make a synthesized test of some main performance indices of ArchSim with the phold benchmark and analyze the extra overhead generated by ArchSim. Finally, based on ArchSim, we construct a parallel event-driven interconnection network simulator and a system-level simulator for a small-scale HPC system with 256 processors. The results of the performance test and HPC system simulations demonstrate that ArchSim can achieve a high speedup ratio and high scalability on the parallel host machine and support system-level simulations for the architecture design of HPC systems.
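The synchronization protocol is the core of any parallel event-driven simulator like ArchSim. The record does not spell out ArchSim's actual protocol, so the following is only a minimal sketch of the classic conservative (null-message/lookahead) idea it builds on; the class and method names are illustrative, not ArchSim's API. Each logical process may safely handle events only up to the minimum of its peers' clocks plus their declared lookahead.

```python
import heapq

class LogicalProcess:
    """One simulation entity with its own event queue and local clock."""
    def __init__(self, name, lookahead):
        self.name = name
        self.lookahead = lookahead   # minimum delay on any event this LP sends
        self.queue = []              # min-heap of (timestamp, payload)
        self.clock = 0.0

    def schedule(self, ts, payload):
        heapq.heappush(self.queue, (ts, payload))

    def safe_time(self, peers):
        # Conservative bound: no peer can send an event earlier than its
        # current clock plus its lookahead (the null-message guarantee).
        return min(p.clock + p.lookahead for p in peers)

    def run_until(self, bound):
        # Process every queued event with timestamp <= bound, advancing the clock.
        processed = []
        while self.queue and self.queue[0][0] <= bound:
            ts, payload = heapq.heappop(self.queue)
            self.clock = ts
            processed.append(payload)
        return processed
```

With two such processes, each alternately computes its safe bound from the other and drains its queue up to that bound, so no event is ever processed out of timestamp order.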

  2. Design of a Logistics Public Information Platform under Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    杨从亚; 徐海峰

    2013-01-01

    In this paper, drawing on the advantages of cloud computing technology, we propose an approach to the design of a logistics public information platform. Starting from an analysis of the platform's basic requirements, we determine the basic positioning of the platform and then, from the cloud computing perspective, design the functions and modules of the platform, laying a foundation for system development and application.

  3. A positive/negative ion-switching, targeted mass spectrometry-based metabolomics platform for bodily fluids, cells, and fresh and fixed tissue.

    Science.gov (United States)

    Yuan, Min; Breitkopf, Susanne B; Yang, Xuemei; Asara, John M

    2012-04-12

    The revival of interest in cancer cell metabolism in recent years has prompted the need for quantitative analytical platforms for studying metabolites from in vivo sources. We implemented a quantitative polar metabolomics profiling platform using selected reaction monitoring with a 5500 QTRAP hybrid triple quadrupole mass spectrometer that covers all major metabolic pathways. The platform uses hydrophilic interaction liquid chromatography with positive/negative ion switching to analyze 258 metabolites (289 Q1/Q3 transitions) from a single 15-min liquid chromatography-mass spectrometry acquisition with a 3-ms dwell time and a 1.55-s duty cycle time. Previous platforms use more than one experiment to profile this number of metabolites from different ionization modes. The platform is compatible with polar metabolites from any biological source, including fresh tissues, cancer cells, bodily fluids and formalin-fixed paraffin-embedded tumor tissue. Relative quantification can be achieved without using internal standards, and integrated peak areas based on total ion current can be used for statistical analyses and pathway analyses across biological sample conditions. The procedure takes ∼12 h from metabolite extraction to peak integration for a data set containing 15 total samples (∼6 h for a single sample).
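The cycle-time figures in this record can be checked with simple arithmetic: 289 transitions at a 3-ms dwell account for most of the 1.55-s duty cycle, and the cycle time in turn fixes how many data points land on a chromatographic peak. A quick sketch (the 30-s peak width below is a hypothetical value, not from the record):

```python
transitions = 289        # Q1/Q3 transitions acquired per cycle
dwell_s = 0.003          # 3-ms dwell time per transition
cycle_s = 1.55           # reported duty cycle time

dwell_total = transitions * dwell_s   # time spent acquiring: 0.867 s
overhead = cycle_s - dwell_total      # switching/pause overhead per cycle

# Sampling density across a hypothetical 30-s-wide chromatographic peak
peak_width_s = 30.0
points_per_peak = peak_width_s / cycle_s
```

This kind of estimate shows why a single ion-switching run can cover all 289 transitions while still sampling each peak densely enough for quantification.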

  4. An Implementation of real-time phased array radar fundamental functions on DSP-focused, high performance embedded computing platform

    Science.gov (United States)

    Yu, Xining; Zhang, Yan; Patel, Ankit; Zahrai, Allen; Weber, Mark

    2016-05-01

    This paper investigates the feasibility of real-time, multiple-channel processing in a digital phased array system backend design, with a focus on high-performance embedded computing (HPEC) platforms constructed from general-purpose digital signal processors (DSP). Serial RapidIO (SRIO) is used as the inter-chip backend connection protocol to support inter-core communications and parallelism. Performance benchmarks were obtained based on an SRIO system chassis and an emulated configuration similar to a field-scale demonstrator of the Multi-functional Phased Array Radar (MPAR). An interesting aspect of this work is the comparison between "raw", low-level DSP processing and emerging tools that systematically take advantage of parallelism and multi-core capability, such as OpenCL and OpenMP. Comparisons with other backend HPEC solutions, such as FPGA and GPU, are also provided through analysis and experiments.

  5. Cell illustrator 4.0: a computational platform for systems biology.

    Science.gov (United States)

    Nagasaki, Masao; Saito, Ayumu; Jeong, Euna; Li, Chen; Kojima, Kaname; Ikeda, Emi; Miyano, Satoru

    2011-01-01

    Cell Illustrator is a software platform for Systems Biology that uses the concept of the Petri net for modeling and simulating biopathways. It is intended for biological scientists working at the bench. The latest version, Cell Illustrator 4.0, uses Java Web Start technology and is enhanced with new capabilities, including: automatic graph grid layout algorithms using ontology information; tools using Cell System Markup Language (CSML) 3.0 and Cell System Ontology 3.0; a parameter search module; a high-performance simulation module; a CSML database management system; conversion from CSML models to programming languages (FORTRAN, C, C++, Java, Python and Perl); import from SBML, CellML, and BioPAX; and export to SVG and HTML. Cell Illustrator employs an extension of the hybrid Petri net in an object-oriented style so that biopathway models can include objects such as DNA sequence, molecular density, 3D localization information, transcription with frame-shift, translation with codon table, as well as biochemical reactions.

  6. A computer model for the prediction of sensitivity in SPR sensing platforms

    Science.gov (United States)

    Izquierdo, Kristel D.; Salazar, Arnoldo; Losoya-Leal, Adrian; Martinez-Chapa, Sergio O.

    2015-03-01

    Surface Plasmon Resonance (SPR) is a wave phenomenon occurring at a metal-dielectric interface. An SPR-based biosensor operates by monitoring changes in the refractive index close to the interface that are produced in response to the interaction between the analyte and the receptors immobilized on the metal's surface. The performance of these sensors depends on many parameters, including channel geometry, material properties and parameters related to the chemical interaction between the analyte and immobilized receptors. This paper presents an integrated model that predicts the sensitivity of an SPR-based sensing platform under the Kretschmann configuration. The model uses the analytical solution of the differential equations that describe the analyte-bioreceptor interaction to correlate changes in analyte concentration to changes in refractive index at the sensing surface. These results are then connected with COMSOL simulations that relate changes in refractive index to changes in the SPR reflectivity curves. The resultant relations are integrated and the model is evaluated under different scenarios. This model will aid in the optimization of assay parameters prior to experimentation for maximum sensitivity, saving both time and expensive chemical reagents during the experimental phase.
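The analyte-bioreceptor interaction described here is usually modeled as 1:1 Langmuir binding, whose association phase has a closed-form solution that such sensitivity models can evaluate directly. A minimal sketch (the rate constants and concentration in the usage line are hypothetical, chosen only for illustration):

```python
import math

def binding_response(t, C, ka, kd, Rmax):
    """Analytical 1:1 Langmuir association curve.

    t    -- time (s)
    C    -- analyte concentration (M)
    ka   -- association rate constant (1/(M*s))
    kd   -- dissociation rate constant (1/s)
    Rmax -- response at full receptor occupancy
    """
    kobs = ka * C + kd                  # observed exponential rate
    Req = Rmax * ka * C / kobs          # equilibrium response at this C
    return Req * (1.0 - math.exp(-kobs * t))

# Hypothetical example: ka=1e5 1/(M*s), kd=1e-3 1/s, C=10 nM, Rmax=100
r = binding_response(60.0, 1e-8, 1e5, 1e-3, 100.0)
```

Sweeping `C` and differentiating the equilibrium response with respect to concentration is one way to estimate the sensitivity that the record's integrated model feeds into the COMSOL reflectivity stage.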

  7. The position of a standard optical computer mouse affects cardiorespiratory responses during the operation of a computer under time constraints

    Directory of Open Access Journals (Sweden)

    Shunji Sako

    2014-08-01

    Full Text Available Objectives: This study investigated the association between task-induced stress and fatigue by examining the cardiovascular responses of subjects using different mouse positions while operating a computer under time constraints. Material and Methods: Sixteen young, healthy men participated in the study, which examined the use of optical mouse devices affixed to laptop computers. Two mouse positions were investigated: (1) the distal position (DP), in which the subjects place their forearms on the desk accompanied by the abduction and flexion of their shoulder joints, and (2) the proximal position (PP), in which the subjects place only their wrists on the desk without using an armrest. The subjects continued each task for 16 min. We assessed differences in several characteristics according to mouse position, including expired gas values, autonomic nerve activities (based on cardiorespiratory responses), operating efficiencies (based on word counts), and fatigue levels (based on the visual analog scale, VAS). Results: Oxygen consumption (VO2), the ratio of inspiration time to respiration time (Ti/Ttotal), respiratory rate (RR), minute ventilation (VE), and the ratio of expiration to inspiration (Te/Ti) were significantly lower when the participants were performing the task in the DP than those obtained in the PP. Tidal volume (VT), carbon dioxide output rates (VCO2/VE), and oxygen extraction fractions (VO2/VE) were significantly higher for the DP than they were for the PP. No significant difference in VAS was observed between the positions; however, as the task progressed, autonomic nerve activities were lower and operating efficiencies were significantly higher for the DP than they were for the PP. Conclusions: Our results suggest that the DP has fewer effects on cardiorespiratory functions, causes lower levels of sympathetic nerve activity and mental stress, and produces a higher total workload than the PP. This suggests that the DP is preferable to the PP when

  8. A computationally efficient approach for hidden-Markov model-augmented fingerprint-based positioning

    Science.gov (United States)

    Roth, John; Tummala, Murali; McEachen, John

    2016-09-01

    This paper presents a computationally efficient approach for mobile subscriber position estimation in wireless networks. A method of data scaling assisted by timing adjust is introduced in fingerprint-based location estimation under a framework which allows for minimising computational cost. The proposed method maintains a comparable level of accuracy to the traditional case where no data scaling is used and is evaluated in a simulated environment under varying channel conditions. The proposed scheme is studied when it is augmented by a hidden-Markov model to match the internal parameters to the prevailing channel conditions, thus minimising computational cost while maximising accuracy. Furthermore, the timing adjust quantity, available in modern wireless signalling messages, is shown to be able to further reduce computational cost and increase accuracy when available. The results may be seen as a significant step towards integrating advanced position-based modelling with power-sensitive mobile devices.
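The fingerprint-matching core that such schemes scale down can be illustrated with a toy k-nearest-neighbour estimator; this is a generic sketch of fingerprint positioning, not the paper's specific scaled or HMM-augmented method, and the function name and reference values are invented for the example.

```python
import math

def knn_position(fingerprints, observed, k=2):
    """Estimate a position from a signal-strength fingerprint database.

    fingerprints -- list of ((x, y), [rssi, ...]) reference records
    observed     -- measured [rssi, ...] vector at the unknown location
    """
    def dist(a, b):
        # Euclidean distance in signal space, the usual fingerprint metric
        return math.sqrt(sum((u - v) ** 2 for u, v in zip(a, b)))

    nearest = sorted(fingerprints, key=lambda fp: dist(fp[1], observed))[:k]
    # Centroid of the k closest reference points as the position estimate
    xs = [p[0][0] for p in nearest]
    ys = [p[0][1] for p in nearest]
    return (sum(xs) / k, sum(ys) / k)
```

The cost the paper targets is exactly the distance loop here: every observed vector is compared against every stored fingerprint, so scaling the data or pruning candidates (e.g. by timing adjust) reduces work roughly linearly in the database size.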

  9. The Square Kilometre Array Science Data Processor. Preliminary compute platform design

    Science.gov (United States)

    Broekema, P. C.; van Nieuwpoort, R. V.; Bal, H. E.

    2015-07-01

    The Square Kilometre Array is a next-generation radio-telescope, to be built in South Africa and Western Australia. It is currently in its detailed design phase, with procurement and construction scheduled to start in 2017. The SKA Science Data Processor is the high-performance computing element of the instrument, responsible for producing science-ready data. This is a major IT project, with the Science Data Processor expected to challenge the computing state of the art even in 2020. In this paper we introduce the preliminary Science Data Processor design and the principles that guide the design process, as well as the constraints on the design. We introduce a highly scalable and flexible system architecture capable of handling the SDP workload.

  10. Large-scale integrated super-computing platform for next generation virtual drug discovery.

    Science.gov (United States)

    Mitchell, Wayne; Matsumoto, Shunji

    2011-08-01

    Traditional drug discovery starts by experimentally screening chemical libraries to find hit compounds that bind to protein targets, modulating their activity. Subsequent rounds of iterative chemical derivatization and rescreening are conducted to enhance the potency, selectivity, and pharmacological properties of hit compounds. Although computational docking of ligands to targets has been used to augment the empirical discovery process, its historical effectiveness has been limited because of the poor correlation of ligand dock scores and experimentally determined binding constants. Recent progress in super-computing, coupled to theoretical insights, allows the calculation of the Gibbs free energy, and therefore accurate binding constants, for unusually large ligand-receptor systems. This advance extends the potential of virtual drug discovery. A specific embodiment of the technology, integrating de novo, abstract fragment-based drug design, sophisticated molecular simulation, and the ability to calculate thermodynamic binding constants with unprecedented accuracy, is discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.
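The connection between a computed Gibbs free energy and a binding constant is the standard thermodynamic relation ΔG = RT ln Kd. A quick sketch of that conversion (the -9.6 kcal/mol input in the usage line is an illustrative value, not from the record):

```python
import math

R = 0.0019872   # gas constant, kcal/(mol*K)
T = 298.15      # standard temperature, K

def kd_from_dg(dg_kcal_per_mol):
    """Dissociation constant (M) from a binding free energy via dG = RT ln Kd.

    A more negative dG means tighter binding and a smaller Kd.
    """
    return math.exp(dg_kcal_per_mol / (R * T))

# Hypothetical example: dG = -9.6 kcal/mol corresponds to Kd on the order of 100 nM
kd = kd_from_dg(-9.6)
```

This is why dock-score/affinity correlation matters: an error of ~1.4 kcal/mol in ΔG shifts the predicted Kd by an order of magnitude.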

  11. DCA++: A case for science driven application development for leadership computing platforms

    Energy Technology Data Exchange (ETDEWEB)

    Summers, Michael S; Alvarez, Gonzalo; Meredith, Jeremy; Maier, Thomas A [Computer Science and Mathematics Division, Oak Ridge National Laboratory, P. O. Box 2008, Mail Stop 6164, Oak Ridge, TN 37831 (United States); Schulthess, Thomas C, E-mail: schulthess@cscs.c [Swiss National Supercomputer Center and Institute for Theoretical Physics, ETH Zurich, CSCS MAN E 133, Galeria 2, CH-9628 Manno (Switzerland)

    2009-07-01

    The DCA++ code was one of the early science applications that ran on Jaguar at the National Center for Computational Sciences, and the first application code to sustain a petaflop/s under production conditions on a general-purpose supercomputer. The code implements a quantum cluster method with a Quantum Monte Carlo kernel to solve the 2D Hubbard model for high-temperature superconductivity. It is implemented in C++, making heavy use of the generic programming model. In this paper, we discuss how this code was developed, reaching scalability and high efficiency on the world's fastest supercomputer in only a few years. We show how the use of generic concepts combined with systematic refactoring of codes is a better strategy for computational sciences than a comprehensive upfront design.

  12. Of red planets and indigo computers: Mars database visualization as an example of platform downsizing

    Science.gov (United States)

    Kaiser, M. K.; Montegut, M. J.

    1997-01-01

    The last decade has witnessed tremendous advancements in the computer hardware and software used to perform scientific visualization. In this paper, we consider how the visualization of a particular data set, the digital terrain model derived from the Viking orbiter imagery, has been realized in four distinct projects over this period. These examples serve to demonstrate how the vast improvements in computational performance both decrease the cost of such visualization efforts and permit an increasing level of interactivity. We then consider how even today's graphical systems require the visualization designer to make intelligent choices and tradeoffs in database rendering. Finally, we discuss how insights gleaned from an understanding of human visual perception can guide these design decisions, and suggest new options for visualization hardware and software.

  13. A Collaborative Digital Pathology System for Multi-Touch Mobile and Desktop Computing Platforms

    KAUST Repository

    Jeong, W.

    2013-06-13

    Collaborative slide image viewing systems are becoming increasingly important in pathology applications such as telepathology and E-learning. Despite rapid advances in computing and imaging technology, current digital pathology systems have limited performance with respect to remote viewing of whole slide images on desktop or mobile computing devices. In this paper we present a novel digital pathology client-server system that supports collaborative viewing of multi-plane whole slide images over standard networks using multi-touch-enabled clients. Our system is built upon a standard HTTP web server and a MySQL database to allow multiple clients to exchange image and metadata concurrently. We introduce a domain-specific image-stack compression method that leverages real-time hardware decoding on mobile devices. It adaptively encodes image stacks in a decorrelated colour space to achieve extremely low bitrates (0.8 bpp) with very low loss of image quality. We evaluate the image quality of our compression method and the performance of our system for diagnosis with an in-depth user study. © 2013 The Eurographics Association and John Wiley & Sons Ltd.
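A bitrate quoted in bits per pixel translates directly into transfer size, which is what makes 0.8 bpp viable over standard networks. A back-of-the-envelope sketch (the tile dimensions and plane count are hypothetical, chosen only to make the arithmetic concrete):

```python
def compressed_bytes(width, height, planes, bits_per_pixel):
    """Approximate compressed size of a multi-plane image stack.

    width, height   -- pixel dimensions of one focal plane
    planes          -- number of focal planes in the stack
    bits_per_pixel  -- achieved compression bitrate (e.g. 0.8 bpp)
    """
    return width * height * planes * bits_per_pixel / 8.0

# A hypothetical 1024x1024 stack of 5 focal planes at the reported 0.8 bpp
size = compressed_bytes(1024, 1024, 5, 0.8)   # 512 KiB for the whole stack
```

At 24 bpp uncompressed, the same stack would be 30x larger, which is the gap the decorrelated-colour-space encoding is closing.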

  14. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers (and, with the advent of ubiquitous multicore processors, on practically every system), has been accomplished with basic software tools: typically, command-line compilers, debuggers, and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP and MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP: we are using a set of scientific applications, each with its own challenges, both to drive further improvements to the applications themselves and to understand shortcomings in Eclipse PTP from an application developer's perspective, which drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  15. Communication Interface of a Dots and Boxes Battle Platform in Computer Game

    Institute of Scientific and Technical Information of China (English)

    张利群; 曹杨; 李厦

    2016-01-01

    To address the deficiencies of traditional computer game play, a dots and boxes battle platform for computer games was built. The battle platform and the game-program clients are placed in a network environment; all information is transmitted through the communication interface module of the battle platform, which realizes the information exchange between the platform and the game programs, thereby controlling both sides of the game and enabling fully automatic play. Test results show that the design of this communication interface is safe and reliable and provides good communication performance.
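A communication interface between a battle platform and networked game clients needs an unambiguous message framing so that moves and control messages survive TCP's stream semantics. The record does not give the platform's actual wire format, so the following is only a sketch of one common choice (length-prefixed JSON frames); the message fields are invented for the example.

```python
import json
import struct

def encode_message(msg_type, payload):
    """Frame a platform message as: 4-byte big-endian length + JSON body."""
    body = json.dumps({"type": msg_type, "payload": payload}).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(frame):
    """Parse one frame produced by encode_message back into (type, payload)."""
    (length,) = struct.unpack(">I", frame[:4])
    body = json.loads(frame[4:4 + length].decode("utf-8"))
    return body["type"], body["payload"]

# A hypothetical move message: the platform relays a chosen edge to the opponent
frame = encode_message("move", {"edge": [2, 3]})
```

The explicit length prefix lets the receiving side read exactly one message at a time, which is what makes automatic, turn-by-turn relaying between the two game programs reliable.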

  16. Impact of Cloud Computing Platform Based on Several Software Engineering Paradigm

    Directory of Open Access Journals (Sweden)

    Ms. Monika Kherajani

    2011-09-01

    Full Text Available A Cloud is a type of parallel and distributed system consisting of a collection of interconnected and virtualised computers that are dynamically provisioned and presented as one or more unified computing resources based on service-level agreements established through negotiation between the service provider and consumers. [UNI_MELB_08] Traditional business applications have always been very complicated and expensive. The amount and variety of hardware and software required to run them are daunting. You need a whole team of experts to install, configure, test, run, secure, and update them. When you multiply this effort across dozens or hundreds of apps, it’s easy to see why even the biggest companies with the best IT departments aren’t getting the apps they need. Small and mid-sized businesses don’t stand a chance. In today’s era of cloud computing, you eliminate those headaches because you’re not managing hardware and software; that’s the responsibility of an experienced vendor like salesforce.com. The shared infrastructure means it works like a utility: you only pay for what you need, upgrades are automatic, and scaling up or down is easy. In this paper we analyze several aspects and the impact on cloud computing of software engineering parameters such as design, modularity, and testing.

  17. Comparative analysis of clinical-scale IFN-gamma positive T-cell enrichment using partially and fully integrated platforms

    Directory of Open Access Journals (Sweden)

    Christoph Priesner

    2016-09-01

    Full Text Available Background and aims. The infusion of enriched CMV-specific donor T-cells appears to be a suitable alternative for the treatment of drug-resistant CMV reactivation or de novo infection after both solid organ and hematopoietic stem cell transplantation. Antiviral lymphocytes can be selected from apheresis products using the CliniMACS Cytokine-Capture-System® either with the well-established CliniMACS® Plus (Plus) device or with its more versatile successor, the CliniMACS Prodigy® (Prodigy). Methods. Manufacturing of CMV-specific T-cells was carried out with the Prodigy and Plus in parallel, starting with 0.8-1×10^9 leukocytes collected by lymphapheresis (n=3) and using the MACS GMP PepTivator® HCMVpp65 for antigenic re-stimulation. Target and non-target cells were quantified by a newly developed single-platform assessment and gating strategy using positive (CD3/CD4/CD8/CD45/IFN-gamma), negative (CD14/CD19/CD56), and dead cell (7-AAD) discriminators. Results. Both devices produced largely similar results for target cell viabilities: 37.2%-52.2% (Prodigy) vs. 51.1%-62.1% (Plus) CD45+/7-AAD- cells. Absolute numbers of isolated target cells were 0.1-3.8×10^6 viable IFN-gamma+ CD3+ cells. The corresponding proportions of IFN-gamma+ CD3+ cells ranged between 19.2% and 95.1% among total CD3+ cells and represented recoveries of 41.9%-87.6%. Within two parallel processes predominantly IFN-gamma+ CD3+CD8+ cytotoxic T-cells were enriched, compared to one process that yielded a higher amount of IFN-gamma+ CD3+CD4+ helper T lymphocytes. T-cell purity was higher for the Prodigy's products, which displayed a lower content of contaminating IFN-gamma- T-cells (3.6%-20.8%) compared to the Plus products (19.9%-80.0%). Conclusions. The manufacturing process on the Prodigy saved both process and hands-on time due to its higher process integration and ability for unattended operation. Although the usage of both instruments yielded comparable results, the lower content of residual IFN
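The purity and recovery figures reported above follow from two simple ratios over the gated cell counts. A minimal sketch of that bookkeeping (the cell numbers in the usage line are hypothetical, not taken from the study):

```python
def purity_and_recovery(target_out, total_out, target_in):
    """Percent purity of the enriched product and percent recovery of input target cells.

    target_out -- viable target cells (e.g. IFN-gamma+ CD3+) in the product
    total_out  -- all viable cells in the product
    target_in  -- target cells present in the starting apheresis material
    """
    purity = 100.0 * target_out / total_out
    recovery = 100.0 * target_out / target_in
    return purity, recovery

# Hypothetical run: 1.9e6 target cells out of 2.0e6 product cells,
# starting from 3.8e6 target cells in the apheresis product
p, r = purity_and_recovery(1.9e6, 2.0e6, 3.8e6)
```

Purity and recovery move independently, which is why the record reports both: a process can enrich to high purity while still losing a large fraction of the input target cells.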

  18. Research on a General Evaluation System for Cloud Computing Platforms

    Institute of Scientific and Technical Information of China (English)

    桂媛

    2012-01-01

    A cloud computing platform is an integration of many cloud computing products and the hardware that runs them, involving a large number of cloud-computing-related technologies and standards. This paper studies a general evaluation system for cloud computing platforms based on the characteristics of cloud computing. It first summarizes an indicator system for measuring cloud computing platforms at the IaaS, PaaS, and SaaS levels, then describes benchmark evaluation methods covering function, performance, security, and reliability, and finally proposes an evaluation model. The general evaluation system enables standardized management of cloud computing platforms in both breadth and depth and provides enterprises with a reference for assessing them.
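An evaluation model of the kind proposed here typically aggregates per-dimension benchmark scores (function, performance, security, reliability) into one figure with configurable weights. The record does not specify its aggregation formula, so the following is only a generic weighted-sum sketch; the dimension names and weights are illustrative.

```python
def weighted_score(scores, weights):
    """Aggregate per-dimension scores (each 0-100) with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(s * w for s, w in zip(scores, weights))

# Hypothetical platform scored on function, performance, security, reliability
dims = ["function", "performance", "security", "reliability"]
overall = weighted_score([80, 90, 70, 60], [0.4, 0.3, 0.2, 0.1])
```

Changing the weight vector lets an enterprise emphasize, say, security over raw performance without re-running any benchmarks.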

  19. A performance model for the communication in fast multipole methods on high-performance computing platforms

    KAUST Repository

    Ibeid, Huda

    2016-03-04

    Exascale systems are predicted to have approximately 1 billion cores, assuming gigahertz cores. Limitations on affordable network topologies for distributed memory systems of such massive scale bring new challenges to the currently dominant parallel programming model. Currently, there are many efforts to evaluate the hardware and software bottlenecks of exascale designs. It is therefore of interest to model application performance and to understand what changes need to be made to ensure extrapolated scalability. The fast multipole method (FMM) was originally developed for accelerating N-body problems in astrophysics and molecular dynamics but has recently been extended to a wider range of problems. Its high arithmetic intensity combined with its linear complexity and asynchronous communication patterns make it a promising algorithm for exascale systems. In this paper, we discuss the challenges for FMM on current parallel computers and future exascale architectures, with a focus on internode communication. We focus on the communication part only; the efficiency of the computational kernels is beyond the scope of the present study. We develop a performance model that considers the communication patterns of the FMM and observe a good match between our model and the actual communication time on four high-performance computing (HPC) systems, when latency, bandwidth, network topology, and multicore penalties are all taken into account. To our knowledge, this is the first formal characterization of internode communication in FMM that validates the model against actual measurements of communication time. The ultimate communication model is predictive in an absolute sense; however, on complex systems, this objective is often out of reach or of a difficulty out of proportion to its benefit when there exists a simpler model that is inexpensive and sufficient to guide coding decisions leading to improved scaling. The current model provides such guidance.
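The building block of communication models like this one is the postal (alpha-beta) cost: a per-message latency term plus a bandwidth term. The paper's full model additionally accounts for topology and multicore penalties, so the sketch below is only the latency-bandwidth core; all numeric values in the usage line are hypothetical.

```python
def comm_time(n_messages, total_bytes, latency_s, bandwidth_Bps):
    """Postal (alpha-beta) model: T = alpha * messages + bytes / beta.

    n_messages    -- number of point-to-point messages in the phase
    total_bytes   -- total payload moved in the phase
    latency_s     -- per-message latency alpha (s)
    bandwidth_Bps -- sustained link bandwidth beta (bytes/s)
    """
    return n_messages * latency_s + total_bytes / bandwidth_Bps

# Hypothetical M2L exchange: 1000 messages, 1 GB total, 1 us latency, 1 GB/s links
t = comm_time(1000, 1e9, 1e-6, 1e9)
```

Even this stripped-down form exposes the trade-off the FMM exploits: aggregating many small multipole exchanges into fewer, larger messages shrinks the alpha term, which dominates at scale.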

  20. Development of a high performance parallel computing platform and its use in the study of nanostructures: Clusters, sheets and tubes

    Science.gov (United States)

    Gowtham, S.

    Small clusters of gallium oxide, a technologically important high-temperature ceramic, together with the interaction of nucleic acid bases with graphene and small-diameter carbon nanotubes, are the focus of the first principles calculations in this work. A high performance parallel computing platform was also developed to perform these calculations at Michigan Tech. The first principles calculations are based on density functional theory, employing either the local density or gradient-corrected approximation together with plane wave and Gaussian basis sets. Bulk Ga2O3 is known to be a very good candidate for fabricating electronic devices that operate at high temperatures. To explore the properties of Ga2O3 at the nanoscale, we have performed a systematic theoretical study of small polyatomic gallium oxide clusters. The calculated results find that all lowest-energy isomers of Ga_mO_n clusters are dominated by Ga-O bonds over metal-metal or oxygen-oxygen bonds. Analysis of atomic charges suggests the clusters to be highly ionic, similar to the case of bulk Ga2O3. In the study of the sequential oxidation of these clusters starting from Ga3O, it is found that the most stable isomers display up to four different backbones of constituent atoms. Furthermore, the predicted configuration of the ground state of Ga2O was recently confirmed by the experimental results of Neumark's group. Guided by the computational demands of the gallium oxide cluster study, the performance-related challenge of producing a high performance computing platform has been addressed. Several engineering aspects were thoroughly studied during the design, development and implementation of the high performance parallel computing platform, RAMA, at Michigan Tech. In an attempt to stay true to the principles of the Beowulf revolution, the RAMA cluster was extensively customized to make it easy to understand and use, for administrators as well as end-users. Following the results of benchmark

  1. CT colonography with computer-aided detection: recognizing the causes of false-positive reader results.

    Science.gov (United States)

    Trilisky, Igor; Wroblewski, Kristen; Vannier, Michael W; Horne, John M; Dachman, Abraham H

    2014-01-01

    Computed tomography (CT) colonography is a screening modality used to detect colonic polyps before they progress to colorectal cancer. Computer-aided detection (CAD) is designed to decrease errors of detection by finding and displaying polyp candidates for evaluation by the reader. CT colonography CAD false-positive results are common and have numerous causes. The relative frequency of CAD false-positive results and their effect on reader performance, on the basis of a 19-reader, 100-case trial, shows that the vast majority of CAD false-positive results were dismissed by readers. Many CAD false-positive results are easily disregarded, including those that result from coarse mucosa, reconstruction, peristalsis, motion, streak artifacts, diverticulum, rectal tubes, and lipomas. CAD false-positive results caused by haustral folds, extracolonic candidates, and diminutive lesions (<6 mm) are more difficult to dismiss and can lead to reader false-positive results. Nondismissable CAD soft-tissue polyp candidates larger than 6 mm are another common cause of reader false-positive results that may lead to further evaluation with follow-up CT colonography or optical colonoscopy. Strategies for correctly evaluating CAD polyp candidates are important to avoid pitfalls from common sources of CAD false-positive results. ©RSNA, 2014.

  2. Exploring Effectiveness of Computer-Aided Planning in Implant Positioning for a Single Immediate Implant Placement.

    Science.gov (United States)

    Edelmann, Alexander R; Hosseini, Bashir; Byrd, Warren C; Preisser, John S; Tyndall, Donald A; Nguyen, Tung; Bencharit, Sompop

    2016-06-01

    The value of computer-aided implant planning using cone-beam computerized tomography (CBCT) for single immediate implants was explored. Eighteen patients requiring extraction of a tooth followed by a single immediate implant were enrolled. Small volume preoperative CBCT scans were used to plan the position of the implant. A taper screwed-type implant was immediately placed into a fresh socket using only the final 1 or 2 drills for osteotomy. Postoperative CBCTs were used for the analysis of actual implant placement positioning. Measurements of the planned and the actual implant position were made with respect to their position relative to the adjacent teeth. Mesio-distal displacements and the facial-lingual deviation of the implant from the planned position were determined. Changes in the angulation of the planned and actual implant position in relation to the clinical crown were also measured. To statistically summarize the results, box plots and 95% CIs for means of paired differences were used. The analysis showed no statistical difference between the planned position and final implant placement position in any measurement. The CBCT scans coupled with the computer-aided implant planning program along with a final 1-to-2 drill protocol may improve the accuracy of single immediate implant placement for taper screwed-type implants.
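
The statistical summary described in this record, a 95% CI for the mean of paired differences between planned and actual implant positions, can be sketched in a few lines. This is an illustrative example only: the measurements are invented, and a normal quantile (1.96) stands in for the t-quantile so the sketch stays stdlib-only.

```python
# Hedged sketch: paired-difference mean and 95% CI (normal approximation).
# The planned/actual mesio-distal positions (mm) below are invented data.
from statistics import mean, stdev
from math import sqrt

def paired_diff_ci(planned, actual, z=1.96):
    """Return (mean difference, (lo, hi)) for paired measurements."""
    diffs = [a - p for p, a in zip(planned, actual)]
    m = mean(diffs)
    se = stdev(diffs) / sqrt(len(diffs))
    return m, (m - z * se, m + z * se)

planned = [3.1, 2.8, 3.4, 3.0, 2.9, 3.2]
actual  = [3.0, 2.9, 3.5, 3.1, 2.8, 3.3]
md, (lo, hi) = paired_diff_ci(planned, actual)
print(md, lo, hi)
```

A confidence interval that contains zero is consistent with the study's finding of no statistical difference between the planned and the final implant positions.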

  3. Tempest: Accelerated MS/MS Database Search Software for Heterogeneous Computing Platforms.

    Science.gov (United States)

    Adamo, Mark E; Gerber, Scott A

    2016-09-07

    MS/MS database search algorithms derive a set of candidate peptide sequences from an in silico digest of a protein sequence database and compute theoretical fragmentation patterns to match these candidates against observed MS/MS spectra. The original Tempest publication described these operations mapped to a CPU-GPU model, in which the CPU (central processing unit) generates peptide candidates that are asynchronously sent to a discrete GPU (graphics processing unit) to be scored against experimental spectra in parallel. The current version of Tempest expands this model, incorporating OpenCL to offer seamless parallelization across multicore CPUs, GPUs, integrated graphics chips, and general-purpose coprocessors. Three protocols describe how to configure and run a Tempest search, including discussion of how to leverage Tempest's unique feature set to produce optimal results. © 2016 by John Wiley & Sons, Inc.
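
The CPU-generates/device-scores split described in this record can be sketched conceptually in stdlib Python, with a thread pool standing in for the OpenCL device queue. Everything here is invented for illustration: the fake "peak" model and the shared-peak-count score are not Tempest's actual kernels.

```python
# Conceptual sketch of the candidate-generation / parallel-scoring split.
# A worker pool plays the role of the GPU/OpenCL device; the peptides,
# peak model and score are hypothetical stand-ins, not Tempest's API.
from concurrent.futures import ThreadPoolExecutor

def theoretical_peaks(peptide):
    # Stand-in for fragment-ion prediction: one fake "peak" per residue.
    return {ord(aa) % 50 for aa in peptide}

def score(peptide, observed_peaks):
    # Shared-peak-count score: predicted peaks that match the spectrum.
    return len(theoretical_peaks(peptide) & observed_peaks)

candidates = ["PEPTIDE", "PROTEIN", "SAMPLER", "KINASES"]
observed = theoretical_peaks("PEPTIDE")  # pretend observed spectrum

with ThreadPoolExecutor(max_workers=4) as pool:
    scores = dict(zip(candidates, pool.map(score, candidates,
                                           [observed] * len(candidates))))
best = max(scores, key=scores.get)
print(best, scores[best])
```

The real engine streams batches of candidates to the device asynchronously; the pattern of a producer feeding a parallel scorer is the same.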

  4. HPCC Methodologies for Structural Design and Analysis on Parallel and Distributed Computing Platforms

    Science.gov (United States)

    Farhat, Charbel

    1998-01-01

    In this grant, we have proposed a three-year research effort focused on developing High Performance Computation and Communication (HPCC) methodologies for structural analysis on parallel processors and clusters of workstations, with emphasis on reducing the structural design cycle time. Besides consolidating and further improving the FETI solver technology to address plate and shell structures, we have proposed to tackle the following design related issues: (a) parallel coupling and assembly of independently designed and analyzed three-dimensional substructures with non-matching interfaces, (b) fast and smart parallel re-analysis of a given structure after it has undergone design modifications, (c) parallel evaluation of sensitivity operators (derivatives) for design optimization, and (d) fast parallel analysis of mildly nonlinear structures. While our proposal was accepted, support was provided only for one year.

  5. GridPACK™ : A Framework for Developing Power Grid Simulations on High-Performance Computing Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, Bruce J.; Perkins, William A.; Chen, Yousu; Jin, Shuangshuang; Callahan, David; Glass, Kevin A.; Diao, Ruisheng; Rice, Mark J.; Elbert, Stephen T.; Vallem, Mallikarjuna R.; Huang, Zhenyu

    2016-05-01

    This paper describes the GridPACK™ framework, which is designed to help power grid engineers develop modeling software capable of running on high performance computers. The framework makes extensive use of software templates to provide high level functionality while at the same time allowing developers the freedom to express whatever models and algorithms they are using. GridPACK™ contains modules for setting up distributed power grid networks, assigning buses and branches with arbitrary behaviors to the network, creating distributed matrices and vectors and using parallel linear and non-linear solvers to solve algebraic equations. It also provides mappers to create matrices and vectors based on properties of the network and functionality to support IO and to mana
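
GridPACK's mapper concept, building matrices from the properties of buses and branches, can be illustrated with a minimal Python sketch. This is not GridPACK's API (GridPACK is C++ and distributes the matrix across processors); the branch admittances below are invented numbers, and shunts are omitted.

```python
# Hypothetical sketch of a bus/branch-to-matrix mapper: walk the branch
# list once and accumulate an admittance-style matrix, stored sparsely
# as a dict keyed by (row, col).
def assemble_matrix(n_bus, branches):
    """branches: list of (from_bus, to_bus, admittance). Returns dict Y."""
    Y = {}
    def add(i, j, val):
        Y[(i, j)] = Y.get((i, j), 0.0) + val
    for i, j, y in branches:
        add(i, i, y)      # diagonal contributions
        add(j, j, y)
        add(i, j, -y)     # off-diagonal couplings
        add(j, i, -y)
    return Y

branches = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 0.5)]
Y = assemble_matrix(3, branches)
# With no shunt elements, each row of the matrix sums to zero.
row0 = sum(Y.get((0, j), 0.0) for j in range(3))
print(Y[(0, 0)], row0)
```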

  6. INCA: a computational platform for isotopically non-stationary metabolic flux analysis.

    Science.gov (United States)

    Young, Jamey D

    2014-05-01

    13C flux analysis studies have become an essential component of metabolic engineering research. The scope of these studies has gradually expanded to include both isotopically steady-state and transient labeling experiments, the latter of which are uniquely applicable to photosynthetic organisms and slow-to-label mammalian cell cultures. Isotopomer network compartmental analysis (INCA) is the first publicly available software package that can perform both steady-state metabolic flux analysis and isotopically non-stationary metabolic flux analysis. The software provides a framework for comprehensive analysis of metabolic networks using mass balances and elementary metabolite unit balances. The generation of balance equations and their computational solution is completely automated and can be performed on networks of arbitrary complexity.
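
The automated generation of mass balance equations that INCA performs can be illustrated with a toy network. The network and flux values below are invented, and the sketch covers only total mass balances; INCA additionally generates isotopomer/elementary metabolite unit (EMU) balances, which are omitted here.

```python
# Toy sketch: build one steady-state mass balance per internal metabolite
# from a stoichiometric description, then check a candidate flux vector.
def balances(reactions, metabolites):
    """reactions: {name: {metabolite: coefficient}} (negative = consumed)."""
    eqs = {}
    for m in metabolites:
        eqs[m] = {r: coeffs.get(m, 0.0) for r, coeffs in reactions.items()}
    return eqs

reactions = {
    "v1": {"A": -1, "B": +1},   # A -> B
    "v2": {"B": -1, "C": +1},   # B -> C
    "v3": {"B": -1, "D": +1},   # B -> D
}
eqs = balances(reactions, ["B"])           # B is the only internal metabolite
flux = {"v1": 10.0, "v2": 6.0, "v3": 4.0}  # invented flux distribution
residual = sum(c * flux[r] for r, c in eqs["B"].items())
print(residual)  # zero residual means production of B equals consumption
```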

  7. Analysis of Child Computer Interaction in Edutainment and Simulation Games Application on Android Platform in Indonesia

    Directory of Open Access Journals (Sweden)

    Setia Wirawan

    2013-08-01

    Full Text Available Child Computer Interaction (CCI has become a challenge in utilizing the technology as education media. The increasing number of children, who use advanced gadgets in Indonesia such as smartphones and tablet PCs, provides a new space for developing interactive educational game application for kids. Indonesia is a country with the biggest number of Android-based game application downloaders in the world of service providers. Modeling serious game that has been chosen to deliver the concept is Edutainment and Simulation Games. This paper will analyze and review the application of the two concepts of the game, using data on the ranking from one of the top application service providers in analytic applications. The game application developers are expected to understand CCI and develop applications concepts that suit the needs of children in Indonesia. This will create market opportunities in the Indonesian game industry in the future.

  8. CoreFlow: A computational platform for integration, analysis and modeling of complex biological data

    DEFF Research Database (Denmark)

    Pasculescu, Adrian; Schoof, Erwin; Creixell, Pau

    2014-01-01

    CoreFlow provides programmers with a framework to manage data in real time. It allows users to upload data into a relational database (MySQL), and to create custom scripts in high-level languages such as R, Python, or Perl for processing, correcting and modeling this data. CoreFlow organizes these scripts into project-specific pipelines, tracks interdependencies between related tasks, and enables the generation of summary reports as well as publication-quality images. As a result, the gap between experimental and computational components of a typical large-scale biology project is reduced, decreasing the time between data generation, analysis and manuscript writing. CoreFlow is being released to the scientific community as an open-sourced software package complete with proteomics-specific examples, which include corrections for incomplete isotopic labeling of peptides (SILAC) or arginine-to-proline conversion.
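
The pipeline organization and interdependency tracking described above amount to running tasks in topological order. A minimal stdlib sketch follows; the task names are invented and this is not CoreFlow's API (CoreFlow stores scripts and their dependencies in MySQL).

```python
# Hypothetical sketch: scripts declare which tasks they depend on, and the
# pipeline derives a valid execution order (Python 3.9+ graphlib).
from graphlib import TopologicalSorter

pipeline = {
    "normalize": {"load_raw"},
    "silac_fix": {"normalize"},   # e.g. correct incomplete SILAC labeling
    "model":     {"silac_fix"},
    "report":    {"model"},
    "load_raw":  set(),
}
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

Because the dependencies form a chain here, the order is unique; with a branching pipeline, any topological order is acceptable.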

  9. Farm Management Support on Cloud Computing Platform: A System for Cropland Monitoring Using Multi-Source Remotely Sensed Data

    Science.gov (United States)

    Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.

    2015-12-01

    Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people do not have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it is an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high-performance computing and storage capabilities of GEE, a cloud-computing-based system for near real-time cropland monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al. [1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. The seasonality of the MODIS time series data, for example the start date of the growing season, the length of the growing season, and the NDVI peak at field level, is obtained for evaluating crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information on near real-time crop-growth conditions through the visualization of MODIS NDVI time series and the comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at the crop-field level online. 
This unique utilization of GEE in
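
The field-level seasonality metrics mentioned above (start of season, season length, peak NDVI) can be sketched with the common amplitude-threshold approach that tools like TIMESAT implement. The NDVI series below is invented, and real pipelines smooth the series before extracting metrics.

```python
# Hedged sketch: extract start/end of season as the first/last composite
# whose NDVI exceeds a fraction of the seasonal amplitude (assumed 20%).
def seasonality(ndvi, frac=0.2):
    """Return (start_idx, end_idx, length, peak) for one seasonal cycle."""
    lo, hi = min(ndvi), max(ndvi)
    thresh = lo + frac * (hi - lo)
    start = next(i for i, v in enumerate(ndvi) if v >= thresh)
    end = len(ndvi) - 1 - next(i for i, v in enumerate(reversed(ndvi))
                               if v >= thresh)
    return start, end, end - start, hi

# Invented 16-day composite NDVI values over one growing season.
ndvi = [0.15, 0.18, 0.25, 0.45, 0.65, 0.80, 0.75, 0.55, 0.35, 0.20, 0.16]
start, end, length, peak = seasonality(ndvi)
print(start, end, length, peak)
```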

  10. A priori modeling of chemical reactions on computational grid platforms: Workflows and data models

    Energy Technology Data Exchange (ETDEWEB)

    Rampino, S., E-mail: ser_ram@dyn.unipg.it [Dipartimento di Chimica, Universita degli Studi di Perugia, Via Elce di Sotto 8, 06123 Perugia (Italy); Monari, A. [SRSMC-Equipe de Chimie et Biochimie Theoriques, Nancy-Universite et CNRS, Bp70239 Boulevard des Aiguilettes, 54506 Vandoeuvre-les-Nancy Cedex (France); Rossi, E. [CINECA, Via Manganelli 6/3, 40033 Casalecchio di Reno, Bologna (Italy); Evangelisti, S. [Laboratoire de Chimie et de Physique Quantiques, Universite Paul Sabatier Toulouse III et CNRS, 118 Route de Narbonne, 31062 Toulouse Cedex 4 (France); Lagana, A. [Dipartimento di Chimica, Universita degli Studi di Perugia, Via Elce di Sotto 8, 06123 Perugia (Italy)

    2012-04-04

    Graphical abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS, assembled on the European Grid, allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Highlights: • The grid-based GEMS simulator accurately models small chemical systems. • Q5Cost and D5Cost file formats provide interoperability in the workflow. • Benchmark runs on H + H2 highlight the Grid empowering. • O + O2 and N + N2 calculated k(T)'s fall within the error bars of the experiment. - Abstract: The quantum framework of the Grid Empowered Molecular Simulator GEMS has been assembled on the segment of the European Grid devoted to the Computational Chemistry Virtual Organization. The related grid-based workflow allows the ab initio evaluation of the dynamics of small systems starting from the calculation of the electronic properties. Interoperability between computational codes across the different stages of the workflow was made possible by the use of the common data formats Q5Cost and D5Cost. Illustrative benchmark runs have been performed on the prototype H + H2, N + N2 and O + O2 gas-phase exchange reactions, and thermal rate coefficients have been calculated for the last two. Results are discussed in terms of the modeling of the interaction, and the advantages of using the Grid are highlighted.

  11. Vertical Wave Impacts on Offshore Wind Turbine Inspection Platforms

    OpenAIRE

    Bredmose, Henrik; Jacobsen, Niels Gjøl

    2011-01-01

    Breaking wave impacts on a monopile at 20 m depth are computed with a VOF (Volume Of Fluid) method. The impacting waves are generated by the second-order focused wave group technique, to obtain waves that break at the position of the monopile. The subsequent impact from the vertical run-up flow on a horizontal inspection platform is computed for five different platform levels. The computational results show details of monopile impact such as slamming pressures from the overturning wave front ...

  12. A fully parallel, high precision, N-body code running on hybrid computing platforms

    CERN Document Server

    Capuzzo-Dolcetta, R; Punzo, D

    2012-01-01

    We present a new implementation of the numerical integration of the classical gravitational N-body problem based on a high-order Hermite integration scheme with block time steps and a direct evaluation of the particle-particle forces. The main innovation of this code (called HiGPUs) is its full parallelization, exploiting both OpenMP and MPI for the multicore Central Processing Units as well as either Compute Unified Device Architecture (CUDA) or OpenCL for the hosted Graphics Processing Units. We tested both performance and accuracy of the code using up to 256 GPUs in the supercomputer IBM iDataPlex DX360M3 Linux Infiniband Cluster provided by the Italian supercomputing consortium CINECA, for values of N up to 8 million. We were able to follow the evolution of a system of 8 million bodies for a few crossing times, a task previously unreached by direct summation codes. The code is freely available to the scientific community.
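
The direct particle-particle force evaluation that HiGPUs offloads to GPUs is the O(N²) pairwise sum below. This pure-Python sketch shows the dominant computation only, not the Hermite predictor-corrector or block time step machinery; the masses, positions and the softening length eps are invented.

```python
# Hedged sketch: direct-summation gravitational accelerations with a
# softening term eps to avoid the r -> 0 singularity.
from math import sqrt

def accelerations(pos, mass, G=1.0, eps=1e-3):
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps
            f = G * mass[j] / (r2 * sqrt(r2))
            for k in range(3):
                acc[i][k] += f * dx[k]
    return acc

pos = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]]
mass = [1.0, 1.0]
a = accelerations(pos, mass)
# Equal masses: the accelerations are equal and opposite.
print(a[0][0], a[1][0])
```

On a GPU, the inner loop over j is what each thread computes for its own particle i, which is why the method parallelizes so well.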

  13. An efficient numerical algorithm for computing densely distributed positive interior transmission eigenvalues

    Science.gov (United States)

    Li, Tiexiang; Huang, Tsung-Ming; Lin, Wen-Wei; Wang, Jenn-Nan

    2017-03-01

    We propose an efficient eigensolver for computing densely distributed spectra of the two-dimensional transmission eigenvalue problem (TEP), which is derived from Maxwell’s equations with Tellegen media and the transverse magnetic mode. The governing equations, when discretized by the standard piecewise linear finite element method, give rise to a large-scale quadratic eigenvalue problem (QEP). Our numerical simulations show that half of the positive eigenvalues of the QEP are densely distributed in some interval near the origin. The quadratic Jacobi-Davidson method with a so-called non-equivalence deflation technique is proposed to compute the dense spectrum of the QEP. Extensive numerical simulations show that our proposed method converges efficiently, even when more than 5000 desired eigenpairs must be computed. Numerical results also illustrate that the computed eigenvalue curves can be approximated by nonlinear functions, which can be applied to estimate the denseness of the eigenvalues for the TEP.
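
The QEP structure can be seen in a one-degree-of-freedom toy case, where (lam^2 M + lam C + K) x = 0 collapses to a scalar quadratic solved in closed form. The coefficients below are invented; the paper's large sparse matrices require iterative methods such as the quadratic Jacobi-Davidson method instead.

```python
# Toy illustration of a quadratic eigenvalue problem: for scalars M, C, K,
# the eigenvalues are the roots of lam^2*M + lam*C + K = 0.
import cmath

def scalar_qep(M, C, K):
    """Roots of lam^2*M + lam*C + K = 0 (invented coefficients)."""
    disc = cmath.sqrt(C * C - 4 * M * K)
    return (-C + disc) / (2 * M), (-C - disc) / (2 * M)

l1, l2 = scalar_qep(1.0, 0.0, -4.0)   # lam^2 - 4 = 0  ->  lam = +/- 2
print(l1, l2)
```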

  14. Methods, Computational Platform, Verification, and Application of Earthquake-Soil-Structure-Interaction Modeling and Simulation

    Science.gov (United States)

    Tafazzoli, Nima

    Seismic response of soil-structure systems has attracted significant attention for a long time, which is understandable given the size and complexity of such systems. Three important aspects of earthquake-soil-structure-interaction (ESSI) modeling are the consistent tracking of input seismic energy and of the energy dissipation mechanisms within the system, the numerical techniques used to simulate ESSI dynamics, and the influence of uncertainty on ESSI simulations. This dissertation is a contribution to the development of one such tool, called the ESSI Simulator. Extensive work has been done on a verified and validated suite for the ESSI Simulator; verification and validation are important for high-fidelity numerical predictions of the behavior of complex systems. The simulator uses the finite element method as a numerical tool to obtain solutions for a large class of engineering problems such as liquefaction, earthquake-soil-structure interaction, site effects, piles, pile groups, probabilistic plasticity, stochastic elastic-plastic FEM, and detailed large-scale parallel models. The response of full three-dimensional soil-structure-interaction simulations of complex structures is evaluated under 3D wave propagation. The Domain Reduction Method is used to apply the forces in a two-step procedure for dynamic analysis, with the goal of reducing the large computational domain. The issue of damping of the waves at the boundary of the finite element models is studied using different damping patterns, applied to the layer of elements outside the Domain Reduction Method zone in order to absorb the residual waves coming out of the boundary layer due to structural excitation. An extensive parametric study is done on the dynamic soil-structure interaction of a complex system, and results for different cases of soil strength and foundation embedment are compared. A set of constitutive models that are highly efficient in terms of computational time is developed and implemented in the ESSI Simulator

  15. Design and Implementation of Library Consortium Cloud Computing Services Platform - the Practice of Jilin Province Library Consortium Cloud Computing Services Platform Construction

    Institute of Scientific and Technical Information of China (English)

    刘万国; 隋会民; 张静鹏

    2012-01-01

    Based on the practice of cloud computing service platform construction by the Jilin Province Library Consortium, this article elaborates the construction goals and the composition of the consortium's cloud computing service platform. It analyzes the technical routes and functions of the cloud service sub-platforms for knowledge discovery and delivery, book and periodical management, digital asset management, cloud storage, and learning and research, discusses the underlying technical architecture of the consortium cloud computing service platform, and elaborates its social significance.

  16. Incidental lung cancers and positive computed tomography images in people living with HIV

    DEFF Research Database (Denmark)

    Ronit, Andreas; Kristensen, Thomas; Klitbo, Ditte M

    2017-01-01

    OBJECTIVE: Lung cancer screening with low-dose computed tomography (LDCT) of high-risk groups in the general population is recommended by several authorities. This may not be feasible in people living with HIV (PLWHIV) due to a higher prevalence of nodules. We therefore assessed the prevalence of positive computed tomography (CT) images and lung cancers in PLWHIV. DESIGN: The Copenhagen comorbidity in HIV infection (COCOMO) study is an observational, longitudinal cohort study. Single-round LDCT was performed with subsequent clinical follow-up (NCT02382822). METHOD: Outcomes included histology...

  17. An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64.

    Science.gov (United States)

    Winkler, Robert

    2015-01-01

    In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to come to the final results. These operations are often difficult to reproduce because of overly specific computing platforms. This effect, known as 'workflow decay', can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS) taverna. We illustrate the combination of the tools with practical examples: (1) A workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2) Cluster analysis and Data Mining in targeted Metabolomics, and (3) Raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present its application for finding co-occurring peptides, which can be used for target proteomics, the discovery of alternative biomarkers and protein-protein interactions. Data Mining derived models
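
The Association Analysis idea described above, finding peptide pairs that co-occur across sample sets, can be sketched with simple support/confidence counting. The sample data are invented, and MASSyPup64 itself uses R packages (via Rattle) rather than this hand-rolled version.

```python
# Hedged sketch: co-occurrence "rules" for peptides across sample sets,
# reported as (support, confidence) for pairs above a support threshold.
from itertools import combinations

samples = [
    {"pepA", "pepB", "pepC"},
    {"pepA", "pepB"},
    {"pepB", "pepC"},
    {"pepA", "pepB", "pepD"},
]

def cooccurrence(samples, min_support=0.5):
    n = len(samples)
    items = sorted(set().union(*samples))
    rules = {}
    for x, y in combinations(items, 2):
        support = sum(1 for s in samples if x in s and y in s) / n
        if support >= min_support:
            conf = support / (sum(1 for s in samples if x in s) / n)
            rules[(x, y)] = (support, conf)
    return rules

rules = cooccurrence(samples)
print(rules)
```

Pairs with high support and confidence are candidates for co-regulated peptides or interacting proteins, as the abstract suggests.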

  18. An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64

    Directory of Open Access Journals (Sweden)

    Robert Winkler

    2015-11-01

    Full Text Available In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to come to the final results. These operations are often difficult to reproduce, because of too specific computing platforms. This effect, known as ‘workflow decay’, can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides a user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS taverna. We explain the useful combination of the tools by practical examples: (1 A workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2 Cluster analysis and Data Mining in targeted Metabolomics, and (3 Raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present its application for finding co-occurring peptides, which can be used for target proteomics, the discovery of alternative biomarkers and protein–protein interactions. Data Mining

  20. Versatile, low-cost, computer-controlled, sample positioning system for vacuum applications

    Science.gov (United States)

    Vargas-Aburto, Carlos; Liff, Dale R.

    1991-01-01

    A versatile, low-cost, easy-to-implement, microprocessor-based motorized positioning system (MPS) suitable for accurate sample manipulation in a Secondary Ion Mass Spectrometry (SIMS) system, and for other ultra-high vacuum (UHV) applications, was designed and built at NASA LeRC. The system can be operated manually or under computer control. In the latter case, local as well as remote operation is possible via the IEEE-488 bus. The position of the sample can be controlled in three orthogonal linear coordinates and one angular coordinate.
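
The control model described above, a stage with three linear axes and one angular axis that accepts both local and remote commands, can be sketched as follows. This is an illustrative sketch, not the NASA LeRC firmware: the axis names, travel limits and command syntax are all invented, with the command string standing in for one arriving over an IEEE-488 (GPIB) link.

```python
# Hedged sketch: a four-axis stage with software travel limits and a
# minimal parser for remote commands like "MOVE x 12.5" (invented syntax).
class Stage:
    LIMITS = {"x": (0.0, 50.0), "y": (0.0, 50.0),
              "z": (0.0, 25.0), "theta": (0.0, 360.0)}  # mm / degrees

    def __init__(self):
        self.pos = {axis: 0.0 for axis in self.LIMITS}

    def move(self, axis, value):
        lo, hi = self.LIMITS[axis]
        if not lo <= value <= hi:
            raise ValueError(f"{axis}={value} outside {lo}..{hi}")
        self.pos[axis] = value

    def remote(self, command):
        """Parse and execute one remote command string."""
        op, axis, value = command.split()
        if op.upper() == "MOVE":
            self.move(axis, float(value))
        return self.pos[axis]

stage = Stage()
stage.remote("MOVE x 12.5")
stage.remote("MOVE theta 90")
print(stage.pos)
```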

  1. Assessment of optimal condylar position with cone-beam computed tomography in south Indian female population

    OpenAIRE

    Manjula, W. S.; Faizal Tajir; R.V. Murali; Kishore Kumar, S; Mohammed Nizam

    2015-01-01

    Aim: The purpose of this study was to investigate the condyle-fossa relationship in clinically asymptomatic, orthodontically untreated south Indian female volunteers by cone-beam computed tomography (CBCT). Materials and Methods: The study population consisted of 13 clinically symptom-free and orthodontically untreated Angle's Class I female subjects with a mean age of 18 years (range, 17-20 years). The normal disc position of the 13 subjects was confirmed by history, clinic...

  2. Pigmented villonodular synovitis mimics metastases on fluorine 18 fluorodeoxyglucose positron emission tomography-computed tomography.

    Science.gov (United States)

    Elumogo, Comfort O; Kochenderfer, James N; Civelek, A Cahid; Bluemke, David A

    2016-04-01

    Pigmented villonodular synovitis (PVNS) is a benign joint disease best characterized on magnetic resonance imaging (MRI). The role of fluorine 18 fluorodeoxyglucose ((18)F-FDG) positron emission tomography-computed tomography (PET-CT) in its diagnosis or characterization remains unclear. PVNS appears as a focal FDG-avid lesion, which can masquerade as a metastatic lesion on PET-CT. We present a case of PVNS found on surveillance imaging of a lymphoma patient.

  3. Direct coronal computed tomography of the lumbar spine: A new technical approach in supine position

    Energy Technology Data Exchange (ETDEWEB)

    Schnyder, P.; Uske, A.; Mansouri, B.

    1986-11-01

    Computed tomography (CT) was carried out on 46 subjects with L5-S1 disk herniation. All the patients had an L5-S1 angle equal to or greater than 40 degrees. Coronal sections of the disk were obtained with a rostral angulation of the gantry, having placed the lumbar spine in a hyperlordotic position. Results are discussed and compared with those obtained from para-axial transverse sections and multidirectional reformatted images.

  4. Effect of horizontal position of the computer keyboard on upper extremity posture and muscular load during computer work.

    Science.gov (United States)

    Kotani, K; Barrero, L H; Lee, D L; Dennerlein, J T

    2007-09-01

    The distance of the keyboard from the edge of a work surface has been associated with hand and arm pain; however, the variation in postural and muscular effects with the horizontal position has not been explicitly explored in previous studies. It was hypothesized that the wrist approaches a more neutral posture as the keyboard distance from the edge of the table increases. In a laboratory setting, 20 adults completed computer tasks using four workstation configurations: with the keyboard at the edge of the work surface (NEAR), 8 cm from the edge, and 15 cm from the edge, the latter condition also with a pad that raised the work surface proximal to the keyboard (FWP). Electrogoniometers and an electromagnetic motion analysis system measured wrist and upper arm postures, and surface electromyography measured muscle activity of two forearm and two shoulder muscles. Wrist ulnar deviation decreased by 50% (4 degrees) as the keyboard position moved away from the user. Without a pad, wrist extension increased by 20% (4 degrees) as the keyboard moved away, but when the pad was added, wrist extension did not differ from that in the NEAR configuration. Median values of wrist extensor muscle activity decreased by 4% of maximum voluntary contraction for the farthest position with a pad (FWP). The upper arm followed suit: flexion increased while abduction and internal rotation decreased as the keyboard was positioned farther from the edge of the table. In order to achieve neutral postures of the upper extremity, the keyboard position in the horizontal plane has an important role and needs to be considered within the context of workstation designs and interventions.

  5. Research on Cloud Computing Platform under Smart Grid

    Institute of Scientific and Technical Information of China (English)

    赵瑞锋; 卢建刚

    2013-01-01

    The unified strong smart grid is the trend of future power grid development, and information technology is an important support for its construction and development. Cloud computing provides new technical means for the massive data processing, analysis, storage, and management required by the strong smart grid and for its computing platform. In order to better research and use cloud computing to build the information platform of the smart grid, this paper first gives an overview of cloud computing, summarizes the requirements the smart grid places on an information platform and the applications of cloud computing in the smart grid, and then, in combination with existing research, discusses and analyzes new research directions for cloud computing platforms in the smart grid.

  6. Research and Implementation of a Digital Experimental Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    张欣; 刘镇; 王海峰

    2012-01-01

    With the development of information technology and the upgrading of traditional experiment systems, improving the resource utilization and service capability of experimental platforms has become a research hot spot. Cloud computing provides strong support for an open digital experimental platform, but effectively applying cloud computing technology to build a standardized, open digital experimental platform with interactive ability and high resource utilization still faces technical challenges. Combining the technical characteristics of cloud computing with the specific requirements of a digital experimental platform, the three-layer architecture of a Digital Experimental Platform Based on Cloud Computing (DEP2C) is designed.

  7. "Happiness Inventors": Informing Positive Computing Technologies Through Participatory Design With Children.

    Science.gov (United States)

    Yarosh, Svetlana; Schueller, Stephen Matthew

    2017-01-17

    Positive psychological interventions for children have typically focused on direct adaptations of interventions developed for adults. As the community moves toward designing positive computing technologies to support child well-being, it is important to use a more participatory process that directly engages children's voices. Our objectives were, through a participatory design study, to understand children's interpretations of positive psychology concepts, as well as their perspectives on technologies that are best suited to enhance their engagement with practice of well-being skills. We addressed these questions through a content analysis of 434 design ideas, 51 sketches, and 8 prototypes and videos, which emerged from a 14-session cooperative inquiry study with 12 child "happiness inventors." The study was part of a summer learning camp held at the children's middle school, which focused on teaching the invention process, teaching well-being skills drawn from positive psychology and related areas (gratitude, mindfulness, and problem solving), and iterating design ideas for technologies to support these skills. The children's ideas and prototypes revealed specific facets of how they interpreted gratitude (as thanking, being positive, and doing good things), mindfulness (as externally representing thought and emotions, controlling those thoughts and emotions, getting through unpleasant things, and avoiding forgetting something), and problem solving (as preventing bad decisions, seeking alternative solutions, and not dwelling on unproductive thoughts). This process also revealed that children emphasized particular technologies in their solutions. While desktop or laptop solutions were notably lacking, other ideas were roughly evenly distributed between mobile apps and embodied computing technologies (toys, wearables, etc). We also report on desired functionalities and approaches to engagement in the children's ideas, such as a notable emphasis on representing and

  8. “Happiness Inventors”: Informing Positive Computing Technologies Through Participatory Design With Children

    Science.gov (United States)

    Schueller, Stephen Matthew

    2017-01-01

    Background Positive psychological interventions for children have typically focused on direct adaptations of interventions developed for adults. As the community moves toward designing positive computing technologies to support child well-being, it is important to use a more participatory process that directly engages children’s voices. Objective Our objectives were, through a participatory design study, to understand children’s interpretations of positive psychology concepts, as well as their perspectives on technologies that are best suited to enhance their engagement with practice of well-being skills. Methods We addressed these questions through a content analysis of 434 design ideas, 51 sketches, and 8 prototypes and videos, which emerged from a 14-session cooperative inquiry study with 12 child “happiness inventors.” The study was part of a summer learning camp held at the children’s middle school, which focused on teaching the invention process, teaching well-being skills drawn from positive psychology and related areas (gratitude, mindfulness, and problem solving), and iterating design ideas for technologies to support these skills. Results The children’s ideas and prototypes revealed specific facets of how they interpreted gratitude (as thanking, being positive, and doing good things), mindfulness (as externally representing thought and emotions, controlling those thoughts and emotions, getting through unpleasant things, and avoiding forgetting something), and problem solving (as preventing bad decisions, seeking alternative solutions, and not dwelling on unproductive thoughts). This process also revealed that children emphasized particular technologies in their solutions. While desktop or laptop solutions were notably lacking, other ideas were roughly evenly distributed between mobile apps and embodied computing technologies (toys, wearables, etc). We also report on desired functionalities and approaches to engagement in the children’s ideas

  9. Survey of Open Source Software for Building Cloud Computing Platforms

    Institute of Scientific and Technical Information of China (English)

    林利; 石文昌

    2012-01-01

    The emergence of open source software for cloud computing facilitates the building of cloud computing platforms, but it also makes choosing the appropriate packages from this pool of software a challenge. To determine how to build a cloud computing platform, research on the existing open source software for building such platforms is necessary. This paper surveys the development of this kind of open source software and analyzes the software architectures from the perspective of the service models they provide. Through comparison and analysis of representative packages, effective ways are proposed for developers to choose appropriate software to build a cloud computing platform that meets specific needs.

  10. Computer-assisted teaching of skin flap surgery: validation of a mobile platform software for medical students.

    Directory of Open Access Journals (Sweden)

    David P de Sena

    The purpose of this study was to develop and validate a multimedia software application for mobile platforms to assist in the teaching and learning process of design and construction of a skin flap. Traditional training in surgery is based on learning by doing. Initially, the use of cadavers and animal models appeared to be a valid alternative for training. However, many conflicts with these training models prompted progression to synthetic and virtual reality models. Fifty volunteer fifth- and sixth-year medical students completed a pretest and were randomly allocated into two groups of 25 students each. The control group was exposed for 5 minutes to a standard text-based print article, while the test group used multimedia software describing how to fashion a rhomboid flap. Each group then performed a cutaneous flap on a training bench model while being evaluated by three blinded BSPS (Brazilian Society of Plastic Surgery) board-certified surgeons using the OSATS (Objective Structured Assessment of Technical Skill) protocol, and answered a post-test. The text-based group was then tested again using the software. The computer-assisted learning (CAL) group had superior performance as confirmed by checklist scores (p<0.002), overall global assessment (p = 0.017) and post-test results (p<0.001). All participants ranked the multimedia method as the best study tool. CAL learners exhibited better subjective and objective performance when fashioning rhomboid flaps as compared to those taught with standard print material. These findings indicate that students preferred to learn using the multimedia method.

  11. A pilot study evaluating use of a computer-assisted neurorehabilitation platform for upper-extremity stroke assessment

    Directory of Open Access Journals (Sweden)

    Feng Xin

    2009-05-01

    Background: There is a need to develop cost-effective, sensitive stroke assessment instruments. One approach is examining kinematic measures derived from goal-directed tasks, which can potentially be sensitive to the subtle changes in the stroke rehabilitation process. This paper presents the findings from a pilot study that uses a computer-assisted neurorehabilitation platform, interfaced with a conventional force-reflecting joystick, to examine the assessment capability of the system by various types of goal-directed tasks. Methods: Both stroke subjects with hemiparesis and able-bodied subjects used the force-reflecting joystick to complete a suite of goal-directed tasks under various task settings. Kinematic metrics, developed for specific types of goal-directed tasks, were used to assess various aspects of upper-extremity motor performance across subjects. Results: A number of metrics based on kinematic performance were able to differentiate subjects with different impairment levels, with metrics associated with accuracy, steadiness and speed consistency showing the best capability. Significant differences were also shown on these metrics between various force field settings. Conclusion: The results support the potential of using UniTherapy software with a conventional joystick system as an upper-extremity assessment instrument. We demonstrated the ability of using various types of goal-directed tasks to distinguish between subjects with different impairment levels. In addition, we were able to show that different force fields have a significant effect on performance across subjects with different impairment levels in the trajectory tracking task. These results provide motivation for studies with a larger sample size that can more completely span the impairment space, and can use the insights presented here to refine considerations of various task settings so as to generalize and extend our conclusions.

  12. False-Positive Rate Determination of Protein Target Discovery using a Covalent Modification- and Mass Spectrometry-Based Proteomics Platform

    Science.gov (United States)

    Strickland, Erin C.; Geer, M. Ariel; Hong, Jiyong; Fitzgerald, Michael C.

    2014-01-01

    Detection and quantitation of protein-ligand binding interactions is important in many areas of biological research. Stability of Proteins from Rates of Oxidation (SPROX) is an energetics-based technique for identifying the protein targets of ligands in complex biological mixtures. Knowing the false-positive rate of protein target discovery in proteome-wide SPROX experiments is important for the correct interpretation of results. Reported here are the results of a control SPROX experiment in which chemical denaturation data are obtained on the proteins in two samples that originated from the same yeast lysate, as would be done in a typical SPROX experiment except that one sample would be spiked with the test ligand. False-positive rates of 1.2-2.2% and manassantin A. The impact of ion purity in the tandem mass spectral analyses and of background oxidation on the false-positive rate of protein target discovery using SPROX is also discussed.

  13. False Positive Rate Determination of Protein Target Discovery using a Covalent Modification- and Mass Spectrometry-Based Proteomics Platform

    Science.gov (United States)

    Strickland, Erin C.; Geer, M. Ariel; Hong, Jiyong; Fitzgerald, Michael C.

    2013-01-01

    Detection and quantitation of protein-ligand binding interactions is important in many areas of biological research. The Stability of Proteins from Rates of Oxidation (SPROX) technique is an energetics-based technique for identifying the protein targets of ligands in complex biological mixtures. Knowing the false positive rate of protein target discovery in proteome-wide SPROX experiments is important for the correct interpretation of results. Reported here are the results of a control SPROX experiment in which chemical denaturation data is obtained on the proteins in two samples that originated from the same yeast lysate, as would be done in a typical SPROX experiment except that one sample would be spiked with the test ligand. False positive rates of 1.2–2.2% and manassantin A. The impact of ion purity in the tandem mass spectral analyses and of background oxidation on the false positive rate of protein target discovery using SPROX is also discussed. PMID:24114261

  14. Computer methods for automating preoperative dental implant planning: implant positioning and size assignment.

    Science.gov (United States)

    Galanis, Christos C; Sfantsikopoulos, Michael M; Koidis, Petros T; Kafantaris, Nikolaos M; Mpikos, Pavlos G

    2007-04-01

    The paper presents computer-aided methods that allocate a dental implant and suggest its size, during the pre-operative planning stage, in conformance with introduced optimization criteria and established clinical requirements. Based on computed tomography data of the jaw and prosthesis anatomy, single tooth cases are planned for the best-suited implant insertion at a user-defined region. An optimum implantation axis line is produced and cylindrical implants of various candidate sizes are then automatically positioned, while their occlusal end is leveled to bone ridge, and evaluated. Radial safety margins are used for the assessment of the implant safety distance from neighboring anatomical structures and bone quantity and quality are estimated and taken into consideration. A case study demonstrates the concept and allows for its discussion.

  15. Computational time reduction for sequential batch solutions in GNSS precise point positioning technique

    Science.gov (United States)

    Martín Furones, Angel; Anquela Julián, Ana Belén; Dimas-Pages, Alejandro; Cos-Gayón, Fernando

    2017-08-01

    Precise point positioning (PPP) is a well-established Global Navigation Satellite System (GNSS) technique that only requires information from the receiver (or rover) to obtain high-precision position coordinates. This is a very interesting and promising technique because it eliminates the need for a reference station near the rover receiver, or a network of reference stations, thus reducing the cost of a GNSS survey. From a computational perspective, there are two ways to solve the system of observation equations produced by static PPP: either in a single step (so-called batch adjustment) or with a sequential adjustment/filter. The results of each should be the same if both are well implemented. However, if a sequential solution (that is, not only the final coordinates, but also those estimated at previous GNSS epochs) is needed, as for convergence studies, finding a batch solution becomes a very time-consuming task owing to the matrix inversions that accumulate with each consecutive epoch. This is not a problem for the filter solution, which uses information computed in the previous epoch to solve the current epoch. Filter implementations therefore need extra consideration of user dynamics and parameter state variations between observation epochs, with appropriate stochastic updates of parameter variances from epoch to epoch. These filtering considerations are not needed in batch adjustment, which makes it attractive. The main objective of this research is to significantly reduce the computation time required to obtain sequential results using batch adjustment. The new method implemented in the adjustment process led to a mean reduction in computational time of 45%.
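The cost contrast the abstract describes can be illustrated with a toy sequential least-squares adjustment: by accumulating the normal equations, each epoch's solution reuses the sums from all previous epochs instead of reprocessing every past observation. This is a generic sketch with an assumed two-parameter observation model, not the paper's method; `sequential_solutions` and `solve2` are hypothetical names.

```python
def solve2(N, b):
    """Solve the 2x2 symmetric normal-equation system N x = b by Cramer's rule."""
    det = N[0][0] * N[1][1] - N[0][1] * N[1][0]
    return [(b[0] * N[1][1] - b[1] * N[0][1]) / det,
            (N[0][0] * b[1] - N[1][0] * b[0]) / det]

def sequential_solutions(epochs):
    """For each epoch, return the least-squares solution using all observations
    up to and including that epoch. The normal equations are accumulated
    incrementally, so past observations are never reprocessed; only the small
    solve is repeated each epoch.

    epochs: iterable of ((a0, a1), y) pairs, one observation row per epoch."""
    N = [[0.0, 0.0], [0.0, 0.0]]  # accumulated A^T A
    b = [0.0, 0.0]                # accumulated A^T y
    out = []
    for (a0, a1), y in epochs:
        N[0][0] += a0 * a0; N[0][1] += a0 * a1
        N[1][0] += a1 * a0; N[1][1] += a1 * a1
        b[0] += a0 * y;     b[1] += a1 * y
        det = N[0][0] * N[1][1] - N[0][1] * N[1][0]
        # System is singular until two independent epochs have been seen.
        out.append(solve2(N, b) if abs(det) > 1e-12 else None)
    return out
```

A full batch adjustment would instead rebuild and solve the whole system at every epoch, which is what makes sequential batch solutions expensive as the number of epochs grows.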

  16. Research on Cloud Computing Platform Virtual Machine Migration Energy Consumption

    Institute of Scientific and Technical Information of China (English)

    陈俊; 申田静

    2016-01-01

    This paper studies the energy consumption of virtual machine migration on cloud computing platforms during the early, middle, and late phases of the IPv4/IPv6 transition. The CPU frequency and CPU usage of the test nodes were monitored, and the monitoring data were used to derive their mathematical relation, yielding a mathematical model of the measured energy consumption of the cloud computing platform. Virtual machine migration energy consumption experiments were then conducted on isomorphic IPv4/IPv6 cloud computing platforms; the two platforms' energy-consumption models were evaluated and their mean was taken as the final migration energy consumption. This study lays the foundation for further energy consumption optimization.

  17. A New Position-Based Fast Radix-2 Algorithm for Computing the DHT

    Science.gov (United States)

    Shah, Gautam A.; Rathore, Tejmal S.

    The radix-2 decimation-in-time fast Hartley transform algorithm for computing the DHT was introduced by Bracewell, and a set of fast algorithms was further developed by Sorenson et al. A new position-based fast radix-2 decimation-in-time algorithm is proposed that requires fewer multiplications than that of Sorenson. It exploits the characteristics of the DHT matrix and introduces multiplying structures in the signal flow-diagram (SFD), and it exhibits an SFD with similar butterflies in each stage. The operation count for the proposed algorithm is determined, and the algorithm is verified by implementing the program in C.
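The transform these radix-2 algorithms accelerate is defined by H[k] = sum_n x[n]·cas(2πnk/N), with cas(θ) = cos(θ) + sin(θ). As a reference point, here is the direct O(N²) definition that any fast implementation must reproduce; this is a sketch in Python rather than the authors' C, and it is not their position-based factorization.

```python
import math

def dht(x):
    """Direct O(N^2) discrete Hartley transform:
    H[k] = sum_n x[n] * cas(2*pi*n*k/N), cas(t) = cos(t) + sin(t)."""
    N = len(x)
    return [sum(x[n] * (math.cos(2 * math.pi * n * k / N) +
                        math.sin(2 * math.pi * n * k / N))
                for n in range(N))
            for k in range(N)]
```

A handy property for checking any fast DHT against this reference: the DHT is (up to a factor of N) its own inverse, so applying `dht` twice returns N times the input.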

  18. Military Logistics Platform Design Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    赵锐; 马乐; 刘言; 姚金飞; 朱祖礼; 钟榜

    2013-01-01

    After studying the theory of cloud computing and considering the current development of military logistics, this paper designs the overall architecture of a military logistics platform based on cloud computing, puts forward the platform's technology architecture and function architecture, and describes the specific services provided by the platform. Finally, it offers an outlook on the practical application of cloud computing technology in the military logistics system.

  19. Effect of Head Position on Facial Soft Tissue Depth Measurements Obtained Using Computed Tomography.

    Science.gov (United States)

    Caple, Jodi M; Stephan, Carl N; Gregory, Laura S; MacGregor, Donna M

    2016-01-01

    Facial soft tissue depth (FSTD) studies employing clinical computed tomography (CT) data frequently rely on depth measurements from raw 2D orthoslices. However, the position of each patient's head was not standardized in this method, potentially decreasing measurement reliability and accuracy. This study measured FSTDs along the original orthoslice plane and compared these measurements to those standardized by the Frankfurt horizontal (FH). Subadult cranial CT scans (n = 115) were used to measure FSTDs at 18 landmarks. Significant differences were observed between the methods at eight of these landmarks (p < 0.05), demonstrating that high-quality data are not generated simply by employing modern imaging modalities such as CT. Proper technique is crucial to useful results, and maintaining control over head position during FSTD data collection is important. This is easily and most readily achieved in CT techniques by rotating the head to the FH plane after constructing a 3D rendering of the data.

  20. An Analysis Of Methods For Sharing An Electronic Platform Of Public Administration Services Using Cloud Computing And Service Oriented Architecture

    Directory of Open Access Journals (Sweden)

    Maciej Hamiga

    2012-01-01

    This paper presents a case study on how to design and implement a public administration services platform, using the SOA paradigm and cloud model for sharing among citizens belonging to particular districts and provinces, providing tight integration with an existing ePUAP system. The basic requirements, architecture and implementation of the platform are all discussed. Practical evaluation of the solution is elaborated using a real-case scenario of the Business Process Management related activities.

  1. Vertical Wave Impacts on Offshore Wind Turbine Inspection Platforms

    DEFF Research Database (Denmark)

    Bredmose, Henrik; Jacobsen, Niels Gjøl

    2011-01-01

    The vertical wave impact load on a horizontal inspection platform is computed for five different platform levels. The computational results show details of monopile impact such as slamming pressures from the overturning wave front and the formation of run-up flow, and show that vertical platform impacts can occur at 20 m water depth. The dependence of the vertical platform load on the platform level is discussed. Attention is given to the significant downward force that occurs after the upward force associated with the vertical impact. The effect of the numerical resolution on the results is assessed; the position of wave overturning is found to be influenced by the grid resolution. For the lowest platform levels, the vertical impact is found to contribute to the peak values of in-line force and overturning moment.

  2. Integrating medicinal chemistry, organic/combinatorial chemistry, and computational chemistry for the discovery of selective estrogen receptor modulators with Forecaster, a novel platform for drug discovery.

    Science.gov (United States)

    Therrien, Eric; Englebienne, Pablo; Arrowsmith, Andrew G; Mendoza-Sanchez, Rodrigo; Corbeil, Christopher R; Weill, Nathanael; Campagna-Slater, Valérie; Moitessier, Nicolas

    2012-01-23

    As part of a large medicinal chemistry program, we wish to develop novel selective estrogen receptor modulators (SERMs) as potential breast cancer treatments using a combination of experimental and computational approaches. However, one of the remaining difficulties nowadays is to fully integrate computational (i.e., virtual, theoretical) and medicinal (i.e., experimental, intuitive) chemistry to take advantage of the full potential of both. For this purpose, we have developed a Web-based platform, Forecaster, and a number of programs (e.g., Prepare, React, Select) with the aim of combining computational chemistry and medicinal chemistry expertise to facilitate drug discovery and development and more specifically to integrate synthesis into computer-aided drug design. In our quest for potent SERMs, this platform was used to build virtual combinatorial libraries, filter and extract a highly diverse library from the NCI database, and dock them to the estrogen receptor (ER), with all of these steps being fully automated by computational chemists for use by medicinal chemists. As a result, virtual screening of a diverse library seeded with active compounds followed by a search for analogs yielded an enrichment factor of 129, with 98% of the seeded active compounds recovered, while the screening of a designed virtual combinatorial library including known actives yielded an area under the receiver operating characteristic (AU-ROC) of 0.78. The lead optimization proved less successful, further demonstrating the challenge to simulate structure activity relationship studies.
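The enrichment factor quoted above is a standard early-recognition metric for virtual screening: the hit rate among the top-ranked fraction of the library divided by the hit rate of the whole library. A minimal sketch of the generic formula (not the Forecaster implementation; the function name and parameters are illustrative):

```python
def enrichment_factor(scores, is_active, top_frac=0.01):
    """Early-recognition enrichment factor for a ranked virtual screen.

    scores: docking/ranking scores, higher = predicted more active.
    is_active: parallel booleans marking the known actives.
    Returns (hit rate in the top fraction) / (hit rate in the whole library)."""
    ranked = sorted(zip(scores, is_active), key=lambda p: p[0], reverse=True)
    n_top = max(1, int(len(ranked) * top_frac))
    hits_top = sum(active for _, active in ranked[:n_top])
    hits_all = sum(is_active)
    return (hits_top / n_top) / (hits_all / len(ranked))
```

For example, a library of 100 compounds with 10 actives all ranked in the top 10% gives the maximum possible enrichment factor of 10 at that cutoff; an EF of 129, as reported above, requires a much rarer active class concentrated at the very top of the ranking.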

  3. Role of cranial computed tomography in human immunodeficiency virus-positive patients with generalised seizures

    Directory of Open Access Journals (Sweden)

    Chris van Zyl

    2016-03-01

    Background: Emergency neuroimaging of human immunodeficiency virus (HIV)-positive patients with generalised new onset seizures (NOS) and a normal post-ictal neurological examination remains controversial, with the general impression being that emergency imaging is necessary because immunosuppression may blur clinical indicators of acute intracranial pathology. The objectives of our study were to establish whether cranial computed tomography (CT) affects the emergency management of HIV-positive patients with generalised NOS and a normal post-ictal neurological examination. Method: We conducted a prospective descriptive observational study. Consecutive HIV-positive patients of 18 years and older, who presented to the Kimberley Hospital Complex's Emergency Department within 24 hours of their first generalised seizures and who had undergone normal post-ictal neurological examinations, were included. Emergency CT results as well as CD4-count levels were evaluated. Results: A total of 25 HIV-positive patients were included in the study. The results of cranial CT brought about a change in emergency care management in 12% of patients, all of them with CD4 counts below 200 cells/mm3. Conclusion: We suggest that emergency cranial CT be performed on all HIV-positive patients presenting with generalised NOS and a normal post-ictal neurological examination, particularly if the CD4 count is below 200 cells/mm3. Keywords: HIV; Seizures; CT Brain

  4. Effects of Electrode Position on Spatiotemporal Auditory Nerve Fiber Responses: A 3D Computational Model Study

    Directory of Open Access Journals (Sweden)

    Soojin Kang

    2015-01-01

    A cochlear implant (CI) is an auditory prosthesis that enables hearing by providing electrical stimuli through an electrode array. It has been previously established that the electrode position can influence CI performance. Thus, electrode position should be considered in order to achieve better CI results. This paper describes how the electrode position influences the auditory nerve fiber (ANF) response to either a single pulse or low-rate (250 pulses/s) and high-rate (5,000 pulses/s) pulse trains using a computational model. The field potential in the cochlea was calculated using a three-dimensional finite-element model, and the ANF response was simulated using a biophysical ANF model. The effects were evaluated in terms of the dynamic range, stochasticity, and spike excitation pattern. The relative spread, threshold, jitter, and initiated node were analyzed for the single-pulse response; and the dynamic range, threshold, initiated node, and interspike interval were analyzed for responses to pulse-train stimuli. Electrode position was found to significantly affect the spatiotemporal pattern of the ANF response, and this effect was significantly dependent on the stimulus rate. We believe that these modeling results can provide guidance regarding perimodiolar and lateral insertion of CIs in clinical settings and help understand CI performance.

  5. Positioning Standardized Acupuncture Points on the Whole Body Based on X-Ray Computed Tomography Images.

    Science.gov (United States)

    Kim, Jungdae; Kang, Dae-In

    2014-02-01

    Objective: The goal of this research was to position all 361 standardized acupuncture points on the entire human body based on a 3-dimensional (3D) virtual body. Materials and Methods: Digital data from a healthy Korean male with a normal body shape were obtained in the form of cross-sectional images generated by X-ray computed tomography (CT), and 3D models of the bones and the skin's surface were created through image-processing steps. Results: The reference points, or landmarks, were positioned based on the standard descriptions of the acupoints, and formulae for the proportionalities between the acupoints and the reference points were presented. About 37% of the 361 standardized acupoints were automatically linked with the reference points; the reference points themselves accounted for 11% of the 361 acupoints; and the remaining acupoints (52%) were positioned point-by-point using the OpenGL 3D graphics libraries. Based on the projective 2D descriptions of the standard acupuncture points, a volumetric 3D acupoint model, extracted from the X-ray CT images, was developed. Conclusions: This modality for positioning acupoints may modernize acupuncture research and enable acupuncture treatments to be more personalized.
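Proportionality formulae of the kind mentioned above typically place a point a fixed fraction of the way along the segment between two landmarks, the segment being divided into a whole number of classical proportional ("cun") units. A hedged sketch of that interpolation, with hypothetical names and 3D landmark coordinates; the paper's actual formulae are not reproduced here.

```python
def proportional_point(p_start, p_end, units_from_start, total_units):
    """Linearly interpolate a point on the segment between two 3D landmarks.

    The segment is treated as total_units equal proportional units, and the
    returned point lies units_from_start units away from p_start."""
    t = units_from_start / total_units
    return tuple(a + t * (b - a) for a, b in zip(p_start, p_end))
```

For instance, a point 4 units along a 12-unit landmark-to-landmark segment sits one third of the way from the starting landmark.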

  6. The NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform to Support the Analysis of Petascale Environmental Data Collections

    Science.gov (United States)

    Evans, B. J. K.; Pugh, T.; Wyborn, L. A.; Porter, D.; Allen, C.; Smillie, J.; Antony, J.; Trenham, C.; Evans, B. J.; Beckett, D.; Erwin, T.; King, E.; Hodge, J.; Woodcock, R.; Fraser, R.; Lescinsky, D. T.

    2014-12-01

    The National Computational Infrastructure (NCI) has co-located a priority set of national data assets within an HPC research platform. This powerful in-situ computational platform has been created to help serve and analyse the massive amounts of data across the spectrum of environmental collections - in particular the climate, observational data and geoscientific domains. This paper examines the infrastructure, innovation and opportunity for this significant research platform. NCI currently manages nationally significant data collections (10+ PB) categorised as 1) earth system sciences, climate and weather model data assets and products, 2) earth and marine observations and products, 3) geosciences, 4) terrestrial ecosystem, 5) water management and hydrology, and 6) astronomy, social science and biosciences. The data is largely sourced from the NCI partners (who include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. By co-locating these large valuable data assets, new opportunities have arisen by harmonising the data collections, making a powerful transdisciplinary research platform. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale and high-bandwidth Lustre filesystems. New scientific software, cloud-scale techniques, server-side visualisation and data services have been harnessed and integrated into the platform, so that analysis is performed seamlessly across the traditional boundaries of the underlying data domains. Characterisation of the techniques along with performance profiling ensures scalability of each software component, all of which can either be enhanced or replaced through future improvements. A Development-to-Operations (DevOps) framework has also been implemented to manage the scale of the software complexity alone. This ensures that

  7. Toward an Optimal Position for IVC Filters: Computational Modeling of the Impact of Renal Vein Inflow

    Energy Technology Data Exchange (ETDEWEB)

    Wang, S L; Singer, M A

    2009-07-13

    The purpose of this report is to evaluate the hemodynamic effects of renal vein inflow and filter position on unoccluded and partially occluded IVC filters using three-dimensional computational fluid dynamics. Three-dimensional models of the TrapEase and Gunther Celect IVC filters, spherical thrombi, and an IVC with renal veins were constructed. Hemodynamics of steady-state flow was examined for unoccluded and partially occluded TrapEase and Gunther Celect IVC filters in varying proximity to the renal veins. Flow past the unoccluded filters demonstrated minimal disruption. Natural regions of stagnant/recirculating flow in the IVC are observed superior to the bilateral renal vein inflows, and high flow velocities and elevated shear stresses are observed in the vicinity of renal inflow. Spherical thrombi induce stagnant and/or recirculating flow downstream of the thrombus. Placement of the TrapEase filter in the suprarenal vein position resulted in a large area of low shear stress/stagnant flow within the filter just downstream of thrombus trapped in the upstream trapping position. Filter position with respect to renal vein inflow influences the hemodynamics of filter trapping. Placement of the TrapEase filter in a suprarenal location may be thrombogenic with redundant areas of stagnant/recirculating flow and low shear stress along the caval wall due to the upstream trapping position and the naturally occurring region of stagnant flow from the renal veins. Infrarenal vein placement of IVC filters in a near juxtarenal position with the downstream cone near the renal vein inflow likely confers increased levels of mechanical lysis of trapped thrombi due to increased shear stress from renal vein inflow.

  8. Computer-assisted preoperative simulation for positioning of plate fixation in Lefort I osteotomy: A case report

    Directory of Open Access Journals (Sweden)

    Hideyuki Suenaga

    2016-06-01

    Full Text Available Computed tomography images are used for three-dimensional planning in orthognathic surgery, which facilitates the actual surgery by simulating the surgical scenario. We performed a computer-assisted virtual orthognathic surgical procedure using optically scanned three-dimensional (3D) data and real computed tomography data on a personal computer. It assisted with movement and positioning of the maxillary bone and with temporary fixation and positioning of the titanium plates. Simulating the surgical procedure in advance made the operation straightforward, allowed the actual surgery to be performed precisely, and enabled the postsurgical outcome to be forecast. This simulation method promises great potential in orthognathic surgery to help surgeons plan and perform operative procedures more precisely.

  9. Computer-assisted preoperative simulation for positioning of plate fixation in Lefort I osteotomy: A case report.

    Science.gov (United States)

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-06-01

    Computed tomography images are used for three-dimensional planning in orthognathic surgery, which facilitates the actual surgery by simulating the surgical scenario. We performed a computer-assisted virtual orthognathic surgical procedure using optically scanned three-dimensional (3D) data and real computed tomography data on a personal computer. It assisted with movement and positioning of the maxillary bone and with temporary fixation and positioning of the titanium plates. Simulating the surgical procedure in advance made the operation straightforward, allowed the actual surgery to be performed precisely, and enabled the postsurgical outcome to be forecast. This simulation method promises great potential in orthognathic surgery to help surgeons plan and perform operative procedures more precisely.

  10. An exploratory evaluation of Take Control: A novel computer-delivered behavioral platform for placebo-controlled pharmacotherapy trials for alcohol use disorder.

    Science.gov (United States)

    Devine, Eric G; Ryan, Megan L; Falk, Daniel E; Fertig, Joanne B; Litten, Raye Z

    2016-09-01

    Placebo-controlled pharmacotherapy trials for alcohol use disorder (AUD) require an active behavioral platform to avoid putting participants at risk for untreated AUD and to better assess the effectiveness of the medication. Therapist-delivered platforms (TDP) can be costly and present a risk to study design because of the variability in therapist fidelity. Take Control is a novel computer-delivered behavioral platform developed for use in pharmacotherapy trials sponsored by the National Institute on Alcohol Abuse and Alcoholism Clinical Investigations Group (NCIG). This behavioral platform was developed with the goal of reducing trial implementation costs and limiting potential bias introduced by therapists providing TDP. This exploratory study is the first to compare Take Control with TDP on measures related to placebo response rate, medication adherence, and participant retention. Data were drawn from the placebo arms of four multisite, double-blind, randomized controlled trials (RCT) for AUD conducted by NCIG from 2007 to 2015. Data were compared from subjects receiving TDP (n=156) in two RCTs and Take Control (n=155) in another two RCTs. Placebo response rate, as represented by weekly percentage of heavy drinking days, was similar between groups. Subjects who received Take Control had a higher rate of medication adherence than those who received TDP. Subject retention was not significantly different between groups. The findings suggest that Take Control is comparable to TDP on measures of retention, medication adherence, and placebo response. Additional research is needed to evaluate Take Control directly against TDPs in a randomized trial.

  11. Computational Studies of Positive and Negative Streamers in Bubbles Suspended in Distilled Water

    KAUST Repository

    Sharma, Ashish

    2017-01-05

    We perform computational studies of nanosecond streamers generated in helium bubbles immersed in distilled water under high pressure conditions. The model takes into account the presence of water vapor in the gas bubble for an accurate description of the chemical kinetics of the discharge. We apply positive and negative trigger voltages much higher than the breakdown voltage and study the dynamic characteristics of the resulting discharge. We observe that, for high positive trigger voltages, the streamer moves along the surface of the gas bubble during the initial stages of the discharge. We also find a considerable difference in the evolution of the streamer discharge for positive and negative trigger voltages with more uniform volumetric distribution of species in the streamer channel for negative trigger voltages due to formation of multiple streamers. We also observe that the presence of water vapor does not influence the breakdown voltage of the discharge but greatly affects the composition of dominant species in the trail of the streamer channel.

  12. MPI + OpenCL implementation of a phase-field method incorporating CALPHAD description of Gibbs energies on heterogeneous computing platforms

    Science.gov (United States)

    Gerald Tennyson, P.; G. M., Karthik; Phanikumar, G.

    2015-01-01

    The phase-field method uses a non-conserved order parameter to define the phase state of a system and is a versatile method for moving boundary problems. It is a method of choice for simulating microstructure evolution in materials engineering. Solving the phase-field evolution equations avoids explicit tracking of interfaces and is often implemented on a structured grid, capturing microstructure evolution in a simple and elegant manner. Restrictions on the grid size needed to accurately capture interface curvature effects lead to a large number of grid points in the computational domain and render realistic 3D simulations computationally intensive. However, the availability of powerful heterogeneous computing platforms and superclusters makes it possible to perform large-scale phase-field simulations efficiently. This paper discusses a portable implementation that extends simulations across multiple CPUs using MPI and includes GPUs using OpenCL. The solution scheme adopts an isotropic stencil that avoids grid-induced anisotropy. The use of separate OpenCL kernels for problem-specific portions of the code ensures that the approach can be extended to different problems. Performance analysis of the parallel strategies used in the study illustrates the massively parallel computing possibilities for phase-field simulations across heterogeneous platforms.
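The explicit stencil update at the core of such a solver can be sketched in a few lines. Below is a minimal single-process NumPy sketch of an Allen-Cahn step with an isotropic 9-point Laplacian; the MPI domain decomposition and OpenCL kernels of the paper are omitted, and all coefficients and parameters are illustrative assumptions:

```python
import numpy as np

def laplacian_9pt(phi, dx):
    """Isotropic 9-point Laplacian (reduces grid-induced anisotropy
    relative to the standard 5-point stencil), periodic boundaries."""
    c = np.roll
    ortho = c(phi, 1, 0) + c(phi, -1, 0) + c(phi, 1, 1) + c(phi, -1, 1)
    diag = (c(c(phi, 1, 0), 1, 1) + c(c(phi, 1, 0), -1, 1)
            + c(c(phi, -1, 0), 1, 1) + c(c(phi, -1, 0), -1, 1))
    return (4.0 * ortho + diag - 20.0 * phi) / (6.0 * dx * dx)

def allen_cahn_step(phi, dt, dx, eps=1.0):
    """One explicit Euler step of d(phi)/dt = eps^2*lap(phi) - f'(phi),
    with the double-well derivative f'(phi) = phi^3 - phi."""
    return phi + dt * (eps**2 * laplacian_9pt(phi, dx) - (phi**3 - phi))

# Relax a small random field; phi drifts toward the wells at +/-1
rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((64, 64))
for _ in range(100):
    phi = allen_cahn_step(phi, dt=1e-3, dx=0.5)
print(float(phi.min()), float(phi.max()))
```

In the paper's setting this per-node update would run inside an OpenCL kernel, with MPI exchanging halo rows between subdomains.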

  13. Cloud Computing Science and Technology Information Service System Platform Design

    Institute of Scientific and Technical Information of China (English)

    刘捡平; 黄勇; 周西柳

    2012-01-01

    Cloud computing is an emerging approach to shared infrastructure. This paper uses cloud computing technology to integrate existing science and technology service platforms. Building on a summary of the advantages of cloud computing services, it proposes a cloud model for an integrated science and technology information service platform, covering the service model and the overall architecture, and presents a concrete application of cloud virtualization technology.

  14. Research and Implementation of High Performance Networking Platform over Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    史佩昌; 王怀民; 蒋杰; 卢凯

    2009-01-01

    Cloud computing provides three types of services: Infrastructure as a Service, Platform as a Service, and Software as a Service. Many cloud instances use high performance computing nodes to construct their infrastructure, but the traditional usage model of high performance computers restricts the development of Platform as a Service in the cloud. This paper designs and implements NPCC, a high-performance-computer-based networking platform for cloud computing, as an exploratory study of the problems in making high performance computing environments support PaaS. NPCC adopts high performance virtual zone (HPVZ) technology and a multi-objective cooperative parallel workload scheduling strategy, among other techniques, replacing the traditional shared usage model of high performance computers and providing users with an easy-to-use, general-purpose, secure, customizable, and graphical networked platform environment for cloud computing.

  15. Efficient use of automatic exposure control systems in computed tomography requires correct patient positioning.

    Science.gov (United States)

    Gudjonsdottir, J; Svensson, J R; Campling, S; Brennan, P C; Jonsdottir, B

    2009-11-01

    Image quality and radiation dose to the patient are important factors in computed tomography (CT). To provide constant image quality, tube current modulation (TCM) performed by automatic exposure control (AEC) adjusts the tube current to the patient's size and shape. The aim was to evaluate the effects of patient centering on tube current-time product (mAs) and image noise. An oval-shaped acrylic phantom was scanned in various off-center positions, at 30-mm intervals within a 500-mm field of view, using three different CT scanners. Acquisition parameters were similar to those of routine abdomen examinations at each site. The mAs was recorded and noise was measured in the images. The correlation of mAs and noise with position was calculated using Pearson correlation. In all three scanners, the mAs delivered by the AEC changed with the y-position of the phantom (P<0.001), with correlation values of 0.98 for scanners A and B and -0.98 for scanner C. With x-position, mAs changes were 4.9% or less. In off-center y-positions, compared with the iso-center, the mAs varied by up to +70%, -34%, and +56% in scanners A, B, and C, respectively. For scanners A and B, noise in two regions of interest in the lower part of the phantom decreased with elevation, with correlation factors from -0.95 to -0.86 (P<0.02). In the x-direction, significant noise relationships (P<0.005) were seen only in scanner A. This study demonstrates that patient centering markedly affects the efficacy of AEC function and that tube current changes vary between scanners. The tube position when acquiring the scout projection radiograph determines the direction of the mAs change. Off-center patient positions cause errors in tube current modulation that can outweigh the dose reduction gained by AEC use, and image quality is affected.
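The centering effect can be illustrated with a toy projection model: the AEC sizes the patient from the scout radiograph, and moving the patient toward the tube magnifies the projection, inflating the size estimate and hence the modulated mAs. This is a hypothetical pinhole-geometry sketch, not any vendor's algorithm; all distances and the square-law exponent are assumptions:

```python
def apparent_width(true_width_mm, source_iso_mm=600.0,
                   source_det_mm=1100.0, offset_toward_tube_mm=0.0):
    """Projected width of a patient on the scout radiograph under a
    simple pinhole-projection model: moving the patient toward the tube
    shortens the source-to-object distance and magnifies the projection."""
    source_obj = source_iso_mm - offset_toward_tube_mm
    return true_width_mm * source_det_mm / source_obj

def relative_mas(true_width_mm, offset_mm, exponent=2.0):
    """Hypothetical size-based TCM: mAs scales as (apparent width)^exponent,
    normalised to the centred case."""
    w0 = apparent_width(true_width_mm)
    w = apparent_width(true_width_mm, offset_toward_tube_mm=offset_mm)
    return (w / w0) ** exponent

# Centred vs. 60 mm toward the tube: the same phantom projects ~11% wider,
# so a square-law modulation would raise mAs by roughly a quarter.
print(round(relative_mas(300.0, 60.0), 3))  # → 1.235
```

The magnitudes are invented, but the mechanism matches the paper's finding that the scout tube position determines the direction of the mAs error.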

  16. An Implementation of Real-Time Phased Array Radar Fundamental Functions on a DSP-Focused, High-Performance, Embedded Computing Platform

    Directory of Open Access Journals (Sweden)

    Xining Yu

    2016-09-01

    Full Text Available This paper investigates the feasibility of a backend design for a real-time, multiple-channel digital phased array system, particularly for high-performance embedded computing platforms built from general-purpose digital signal processors. First, we obtained a lab-scale backend performance benchmark by simulating beamforming, pulse compression, and Doppler filtering on a Micro Telecom Computing Architecture (MTCA) chassis using the Serial RapidIO protocol for backplane communication. Next, a field-scale demonstrator of a multifunctional phased array radar was emulated using a similar configuration. Finally, the performance of a barebones design is compared with that of emerging tools that systematically exploit parallelism and multicore capabilities, including the Open Computing Language (OpenCL).
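Of the three functions benchmarked, pulse compression is the easiest to sketch: a matched filter correlates the received signal with the conjugated, time-reversed transmit pulse. A minimal NumPy illustration (the waveform parameters are assumptions, not the paper's):

```python
import numpy as np

fs = 10e6   # sample rate, Hz
T = 20e-6   # pulse width, s
B = 2e6     # chirp bandwidth, Hz
t = np.arange(int(T * fs)) / fs
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # linear FM transmit pulse

# Synthetic echo: the pulse delayed by 50 samples in a longer window, plus noise
rng = np.random.default_rng(1)
rx = np.zeros(1024, dtype=complex)
rx[50:50 + chirp.size] += chirp
rx += 0.1 * (rng.standard_normal(1024) + 1j * rng.standard_normal(1024))

# Matched filter = convolution with the conjugated, time-reversed pulse
mf = np.convolve(rx, np.conj(chirp[::-1]), mode="valid")
delay = int(np.argmax(np.abs(mf)))
print(delay)  # → 50 (peak at the true delay)
```

On the DSP backend this correlation would be done in the frequency domain per channel; the principle is the same.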

  17. Image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo

    Science.gov (United States)

    Kohigashi, Satoru; Nakamae, Koji; Fujioka, Hiromu

    2005-04-01

    We developed an image-based computer-assisted diagnosis system for benign paroxysmal positional vertigo (BPPV) that consists of a balance control system simulator, a 3D eye movement simulator, and a method for extracting the nystagmus response directly from an eye movement image sequence. In the system, the causes and conditions of BPPV are estimated by searching a database for the record matching the nystagmus response extracted from the observed eye image sequence of the patient. The database contains the nystagmus responses for simulated eye movement sequences. The eye movement velocity is obtained using the balance control system simulator, which allows BPPV to be simulated under various conditions such as canalithiasis, cupulolithiasis, the number of otoconia, otoconium size, and so on. The eye movement image sequence is then displayed on the CRT by the 3D eye movement simulator. The nystagmus responses are extracted from the image sequence by the proposed method and stored in the database. To enhance diagnostic accuracy, the nystagmus response for a newly simulated sequence is matched with that for the observed sequence, and the causes and conditions of BPPV are estimated from the matched simulation conditions. We applied the system to two real eye movement image sequences from patients with BPPV to show its validity.
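The record-matching step can be viewed as a nearest-neighbour search over simulated nystagmus velocity traces. A hedged sketch, with an invented database schema and synthetic exponentially decaying traces standing in for the simulator output:

```python
import numpy as np

def match_record(observed, database):
    """Return the key of the simulated nystagmus response closest to the
    observed trace (sum of squared differences).  `database` maps
    simulation conditions, e.g. ('canalithiasis', n_otoconia), to
    velocity traces; names and keys are illustrative, not the paper's."""
    best_key, best_cost = None, np.inf
    for key, trace in database.items():
        cost = float(np.sum((observed - trace) ** 2))
        if cost < best_cost:
            best_key, best_cost = key, cost
    return best_key

t = np.linspace(0.0, 10.0, 200)
db = {
    ("canalithiasis", 5): 30 * np.exp(-t / 2) * np.sin(2 * np.pi * t),
    ("canalithiasis", 20): 60 * np.exp(-t / 2) * np.sin(2 * np.pi * t),
    ("cupulolithiasis", 5): 25 * np.exp(-t / 8) * np.sin(2 * np.pi * t),
}
rng = np.random.default_rng(4)
observed = db[("canalithiasis", 20)] + 2.0 * rng.standard_normal(t.size)
print(match_record(observed, db))
```

The system's extra refinement step, simulating new sequences near the best match, amounts to repeating this search over a locally densified candidate set.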

  18. Computer-generated holograms at arbitrary positions using multi-view images

    Science.gov (United States)

    Ohsawa, Yusuke; Sakamoto, Yuji

    2012-03-01

    Computer-generated holograms (CGHs), which are generated by simulating the recording process of a hologram in a computer, are regarded as an ideal three-dimensional (3D) display technology. However, CGHs require precise 3D model data of existing objects, which is difficult to create. To solve this problem, there has been much research on generating CGHs using multi-view images (MVIs). MVIs make it possible to generate CGHs from real objects in natural light. A method using ordinary digital cameras produced high-resolution reconstructed images without any special devices, but it requires capturing a huge number of images, or using a huge number of cameras, to ensure a sufficiently continuous motion parallax. This is simply not realistic for the construction of 3D display applications. In this paper, we describe a method of generating voxel models from captured images and then using the MVIs obtained from the models to generate CGHs. We generate voxel models by SFS, determine voxel values using the captured images, and render the voxel models into MVIs. This method enables us to place holograms at arbitrary positions within the range in which MVIs are generated correctly. We can also obtain a sufficiently continuous motion parallax by generating MVIs from the voxel models despite capturing only a small number of images. Results of optical experiments demonstrated the effectiveness of the proposed method.
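The voxel-model stage can be sketched as silhouette intersection (space carving): a voxel survives only if it projects inside every captured silhouette. The sketch below assumes orthographic views along the grid axes rather than the calibrated perspective cameras a real system would use:

```python
import numpy as np

def carve(silhouettes):
    """Shape-from-silhouette on a cubic grid with orthographic views.

    `silhouettes` maps a view axis (0, 1, or 2) to a 2D boolean mask;
    a voxel is kept only if its projection lies inside every mask.
    """
    n = next(iter(silhouettes.values())).shape[0]
    occ = np.ones((n, n, n), dtype=bool)
    for axis, sil in silhouettes.items():
        proj = np.expand_dims(sil, axis)  # broadcast mask along the view axis
        occ &= proj
    return occ

# Toy object: three identical circular silhouettes carve a rounded solid
n = 32
yy, xx = np.mgrid[0:n, 0:n]
circle = (yy - n / 2) ** 2 + (xx - n / 2) ** 2 <= (n / 3) ** 2
voxels = carve({0: circle, 1: circle, 2: circle})
print(int(voxels.sum()))
```

After carving, each surviving voxel would be coloured from the captured images and the model rendered into as many intermediate views as the hologram needs.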

  19. Cross-Platform Technologies

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2017-04-01

    Full Text Available Cross-platform is a concept that has become increasingly common in recent years, especially in the development of mobile apps, but it has also long applied to the development of conventional desktop applications. The notion of cross-platform (multi-platform or platform-independent) software refers to a software application that can run on more than one operating system or computing architecture; a cross-platform application can thus operate independently of the software or hardware platform on which it is executed. Because this generic definition admits a wide range of meanings, for the purposes of this paper we narrow it and use the following functional definition: a cross-platform application is a software application that can run on more than one operating system (desktop or mobile) in an identical or similar way.

  20. Comparing Neuromorphic Solutions in Action: Implementing a Bio-Inspired Solution to a Benchmark Classification Task on Three Parallel-Computing Platforms.

    Science.gov (United States)

    Diamond, Alan; Nowotny, Thomas; Schmuker, Michael

    2015-01-01

    Neuromorphic computing employs models of neuronal circuits to solve computing problems. Neuromorphic hardware systems are now becoming more widely available and "neuromorphic algorithms" are being developed. As they are maturing toward deployment in general research environments, it becomes important to assess and compare them in the context of the applications they are meant to solve. This should encompass not just task performance, but also ease of implementation, speed of processing, scalability, and power efficiency. Here, we report our practical experience of implementing a bio-inspired, spiking network for multivariate classification on three different platforms: the hybrid digital/analog Spikey system, the digital spike-based SpiNNaker system, and GeNN, a meta-compiler for parallel GPU hardware. We assess performance using a standard hand-written digit classification task. We found that whilst a different implementation approach was required for each platform, classification performances remained in line. This suggests that all three implementations were able to exercise the model's ability to solve the task rather than exposing inherent platform limits, although differences emerged when capacity was approached. With respect to execution speed and power consumption, we found that for each platform a large fraction of the computing time was spent outside of the neuromorphic device, on the host machine. Time was spent in a range of combinations of preparing the model, encoding suitable input spiking data, shifting data, and decoding spike-encoded results. This is also where a large proportion of the total power was consumed, most markedly for the SpiNNaker and Spikey systems. We conclude that the simulation efficiency advantage of the assessed specialized hardware systems is easily lost in excessive host-device communication, or non-neuronal parts of the computation. These results emphasize the need to optimize the host-device communication architecture for

  1. Comparing neuromorphic solutions in action: implementing a bio-inspired solution to a benchmark classification task on three parallel-computing platforms

    Directory of Open Access Journals (Sweden)

    Alan eDiamond

    2016-01-01

    Full Text Available Neuromorphic computing employs models of neuronal circuits to solve computing problems. Neuromorphic hardware systems are now becoming more widely available and neuromorphic algorithms are being developed. As they are maturing towards deployment in general research environments, it becomes important to assess and compare them in the context of the applications they are meant to solve. This should encompass not just task performance, but also ease of implementation, speed of processing, scalability and power efficiency. Here, we report our practical experience of implementing a bio-inspired, spiking network for multivariate classification on three different platforms: the hybrid digital/analogue Spikey system, the digital spike-based SpiNNaker system, and GeNN, a meta-compiler for parallel GPU hardware. We assess performance using a standard hand-written digit classification task. We found that whilst a different implementation approach was required for each platform, classification performances remained in line. This suggests that all three implementations were able to exercise the model's ability to solve the task rather than exposing inherent platform limits, although differences emerged when capacity was approached. With respect to execution speed and power consumption, we found that for each platform a large fraction of the computing time was spent outside of the neuromorphic device, on the host machine. Time was spent in a range of combinations of preparing the model, encoding suitable input spiking data, shifting data and decoding spike-encoded results. This is also where a large proportion of the total power was consumed, most markedly for the SpiNNaker and Spikey systems. We conclude that the simulation efficiency advantage of the assessed specialized hardware systems is easily lost in excessive host-device communication, or non-neuronal parts of the computation. These results emphasize the need to optimize the host-device communication

  2. Airflow behavior changes in upper airway caused by different head and neck positions: Comparison by computational fluid dynamics.

    Science.gov (United States)

    Wei, Wei; Huang, Shi-Wei; Chen, Lian-Hua; Qi, Yang; Qiu, Yi-Min; Li, Shi-Tong

    2017-02-08

    The feasibility of computational fluid dynamics (CFD) for evaluating airflow characteristics in different head and neck positions has not been established. This study compared the changes in volume and airflow behavior of the upper airway by CFD simulation to predict the influence of anatomical and physiological airway changes due to different head-neck positions on mechanical ventilation. One awake volunteer with no risk of difficult airway underwent computed tomography in the neutral position, the extension position (both head and neck extended), and the sniffing position (head extended and neck flexed). Three-dimensional models of the upper airway were reconstructed. The total volume (V) and narrowest area (Amin) of the airway models were measured. CFD simulation with a Spalart-Allmaras model was performed to characterize airflow behavior in the neutral, extension, and sniffing positions during closed-mouth and open-mouth ventilation. V was smallest in the neutral position; Amin in the sniffing position was nearly 3.0 times that in the neutral position and 1.7 times that in the extension position. The pressure drop and velocity increase were more pronounced in the neutral position than in the sniffing or extension positions at the same airflow rate. In the sniffing position, pressure differences decreased and velocity remained almost constant. Recirculating airflow was generated near the subglottic region in the neutral and extension positions. The sniffing position improves airway patency by increasing airway volume and decreasing airway resistance, suggesting that it may be the optimal choice for mask ventilation.

  3. Estimation Methods of the Point Spread Function Axial Position: A Comparative Computational Study

    Directory of Open Access Journals (Sweden)

    Javier Eduardo Diaz Zamboni

    2017-01-01

    Full Text Available Precise knowledge of the point spread function is central to the characterization of any imaging system. In fluorescence microscopy, point spread function (PSF) determination has become a common and obligatory task for each new experimental device, mainly due to its strong dependence on acquisition conditions. During the last decade, algorithms have been developed for the precise calculation of the PSF, which fit model parameters describing image formation on the microscope to experimental data. As a contribution to this subject, a comparative study of three parameter estimation methods is reported, namely I-divergence minimization (MIDIV), maximum likelihood (ML), and non-linear least squares (LSQR). They were applied to the estimation of the point source position on the optical axis, using a physical model. The methods' performance was evaluated under different conditions and noise levels using synthetic images, considering success percentage, iteration count, computation time, accuracy, and precision. The main results showed that axial position estimation requires a high SNR to achieve an acceptable success level, and a higher SNR still to approach the estimation-error lower bound. ML achieved a higher success percentage at lower SNR than MIDIV and LSQR with an intrinsic noise source. Only the ML and MIDIV methods reached the error lower bound, and only with data on the optical axis and at high SNR. Extrinsic noise sources worsened the success percentage, but for each of the methods studied no difference was found between noise sources.
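As a minimal illustration of the least-squares variant, the sketch below estimates the axial position of an assumed Gaussian axial profile (a stand-in for the physical PSF model used in the paper) by grid search over candidate positions; all parameters and noise levels are invented:

```python
import numpy as np

def axial_model(z, z0, sigma=1.2, amp=1.0):
    """Assumed Gaussian axial PSF profile (illustrative, not the
    diffraction model fitted in the paper)."""
    return amp * np.exp(-0.5 * ((z - z0) / sigma) ** 2)

def fit_z0_lsq(z, data, candidates):
    """Least-squares axial-position estimate: pick the candidate z0
    minimising the residual sum of squares."""
    rss = [np.sum((data - axial_model(z, c)) ** 2) for c in candidates]
    return float(candidates[int(np.argmin(rss))])

rng = np.random.default_rng(2)
z = np.linspace(-5.0, 5.0, 201)
true_z0 = 0.7
noisy = axial_model(z, true_z0) + 0.02 * rng.standard_normal(z.size)
candidates = np.linspace(-2.0, 2.0, 401)   # 0.01 spacing
z0_hat = fit_z0_lsq(z, noisy, candidates)
print(z0_hat)
```

ML and MIDIV replace the squared-error cost with a Poisson likelihood and an I-divergence, respectively; the search structure is unchanged.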

  4. Test and Analysis of IO Performance on High Performance Computing Platform

    Institute of Scientific and Technical Information of China (English)

    李亮; 聂瑞华

    2011-01-01

    This paper measures the IO throughput of the NFS and Lustre file systems in a large-scale parallel computing environment on a high performance computing platform. Based on the results, the IO bottleneck of the experimental platform is identified and an improvement scheme is proposed. The paper then tests how factors such as the local cache, the transfer size used by parallel applications during IO, and the size of the files accessed by parallel programs affect distributed file system performance. Based on the experimental results, it offers recommendations on deploying parallel software so as to use the local cache effectively, and on choosing appropriate transfer and file sizes when writing parallel programs to improve IO performance.

  5. A survey of clustering algorithms based on cloud computing platform

    Institute of Scientific and Technical Information of China (English)

    张锦杏; 缪裕青; 邱良佩; 文益民

    2013-01-01

    To discover useful information from massive data more efficiently, data mining based on cloud computing has emerged and attracted wide attention. Cloud computing is an emerging technology with powerful capacities for data collection, storage, and computation, and it offers great opportunities for data mining. As one of the main fields of data mining, clustering analysis is widely applied in many domains. This paper first briefly introduces cloud computing technologies and their current state, then analyzes and categorizes the existing clustering algorithms for cloud computing environments, lists some major challenges of clustering on cloud computing platforms, and gives a prospect for the field.
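Clustering algorithms are typically ported to such platforms by recasting each iteration as a map step (assign points to centroids) and a reduce step (recompute centroid means). A single-machine Python sketch of one MapReduce-style k-means iteration, with illustrative data:

```python
import numpy as np

def kmeans_map(points, centroids):
    """Map step: emit (nearest-centroid index, (point, 1)) pairs."""
    out = []
    for p in points:
        j = int(np.argmin(np.linalg.norm(centroids - p, axis=1)))
        out.append((j, (p, 1)))
    return out

def kmeans_reduce(pairs, k, old_centroids):
    """Reduce step: sum points per key and recompute centroid means."""
    sums = np.zeros_like(old_centroids)
    counts = np.zeros(k, dtype=int)
    for j, (p, c) in pairs:
        sums[j] += p
        counts[j] += c
    new = old_centroids.copy()
    nonzero = counts > 0
    new[nonzero] = sums[nonzero] / counts[nonzero][:, None]
    return new

rng = np.random.default_rng(3)
data = np.vstack([rng.normal(0, 0.2, (50, 2)), rng.normal(3, 0.2, (50, 2))])
cents = np.array([[0.5, 0.5], [2.5, 2.5]])
for _ in range(5):
    # on a real cluster, map tasks run per data split and combiners pre-aggregate
    cents = kmeans_reduce(kmeans_map(data, cents), 2, cents)
print(cents)
```

The per-key (sum, count) pairs are exactly what a combiner would pre-aggregate on each node before the shuffle.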

  6. Research and Implementation of Cloud Computing Platform Based on Hadoop

    Institute of Scientific and Technical Information of China (English)

    范素娟; 田军锋

    2016-01-01

    With the development of network technology, the volume of online data is growing exponentially and becoming ever larger. Faced with this growth, traditional methods for processing massive data suffer from many shortcomings, such as low efficiency, and a new technical approach is needed. Cloud computing, an emerging computational model and a form of distributed computing, was proposed to address these problems. Hadoop, an open-source distributed platform, is one of the most popular cloud computing platform implementations and is used to process massive data efficiently. To improve the efficiency of massive data processing, this paper briefly analyzes the concept of cloud computing and the workflow of Hadoop's main components, then introduces in detail the configuration method and implementation process of a cloud computing platform based on Hadoop, and summarizes the typical problems encountered while building the platform. Finally, experiments show that the platform can effectively complete distributed data processing tasks.
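The MapReduce programs such a platform runs follow a fixed map/shuffle/reduce pattern, which can be imitated in plain Python without a Hadoop installation. A word-count sketch (the canonical Hadoop example):

```python
from collections import defaultdict
from itertools import chain

def mapper(line):
    """Map: emit (word, 1) for every word in an input line."""
    return [(w.lower(), 1) for w in line.split()]

def shuffle(pairs):
    """Group values by key, as Hadoop's shuffle/sort phase does."""
    groups = defaultdict(list)
    for k, v in pairs:
        groups[k].append(v)
    return groups

def reducer(key, values):
    """Reduce: sum the counts for one word."""
    return key, sum(values)

lines = ["hadoop runs MapReduce jobs", "MapReduce jobs scale out"]
pairs = chain.from_iterable(mapper(l) for l in lines)
counts = dict(reducer(k, v) for k, v in shuffle(pairs).items())
print(counts["mapreduce"], counts["jobs"])  # → 2 2
```

On the actual platform, the mapper and reducer run as distributed tasks over HDFS splits; the shuffle is performed by the framework.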

  7. Design and Implementation of a Cloud Platform Based on Hadoop

    Institute of Scientific and Technical Information of China (English)

    秦东霞; 韦家骥; 齐迎春

    2016-01-01

    Hadoop is a free, reliable, efficient, and scalable open-source cloud platform that provides a software framework for processing big data on distributed clusters. Based on Hadoop, this paper introduces the VMware virtual machine, JDK, CentOS, and Hadoop technologies in detail, and builds a virtual cloud platform in a pseudo-distributed environment. After testing, the system runs MapReduce-based distributed programs correctly. The paper also discusses in detail issues including user permissions, path configuration, and use of the SSH service, providing a basis for Hadoop-based cloud platform research and application development.

  8. Embedded Trusted Computing Platform Based on TPM

    Institute of Scientific and Technical Information of China (English)

    王博; 李波; 高振铁; 陈磊

    2011-01-01

    To address the many security problems in current embedded applications, an embedded platform design based on a TPM (Trusted Platform Module) security chip is presented. Based on an analysis of the development of trusted computing technology, and considering the structural characteristics of embedded platforms and the common interfaces of TPM chips, the TPM is attached via the I2C bus. Finally, TPM operation methods are discussed, and TPM drivers for both the Bootloader and Linux kernel environments are designed.

  9. Computer-assisted design of butterfly bileaflet valves for the mitral position.

    Science.gov (United States)

    McQueen, D M; Peskin, C S

    1985-01-01

    This paper describes the application of computer testing to a design study of butterfly bileaflet mitral prostheses having flat or curved leaflets. The curvature is in the plane normal to the pivot axes and is such that the convex sides of the leaflets face each other when the valve is open. The design parameters considered are the curvature of the leaflets and the location of the pivot points. In this study, stagnation is assessed by computing the smallest value (over the three openings of the valve) of the peak velocity, and hemodynamic performance is judged by a benefit/cost ratio: the net stroke volume divided by the mean transvalvular pressure difference. Unlike the case of a pivoting single-disc valve, the inclusion of a constraint on the maximum angle of opening of the leaflets is found to be essential for adequate, competent performance. Results are presented with both 85 degrees and 90 degrees constraints, since best performance is achieved with the opening-angle constraint in this range. Asymmetry of leaflet motion which is observed with flat leaflets in the mitral position is reduced with modest leaflet curvature. Leaflet curvature also ameliorates central orifice stagnation, which is observed with flat leaflets. Curvature of the valve produces the following improvements in comparison with the best flat valve when the opening-angle constraint is 85 degrees: a 38% increase in the minimum peak velocity and a 16% increase in the hemodynamic benefit/cost ratio. With a 90 degrees constraint the corresponding improvements are 34% and 20%, respectively.
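The hemodynamic benefit/cost ratio used in the study is net stroke volume divided by mean transvalvular pressure difference. A sketch of that bookkeeping with a synthetic flow and pressure waveform (the waveform numbers are illustrative, not the paper's):

```python
import numpy as np

def benefit_cost(t, flow_ml_per_s, dp_mmHg):
    """Benefit/cost ratio as defined in the study: net stroke volume
    (flow integrated over the beat, forward minus regurgitant) divided
    by the mean transvalvular pressure difference during forward flow."""
    dt = np.diff(t)
    net_stroke_ml = float(np.sum(0.5 * (flow_ml_per_s[1:] + flow_ml_per_s[:-1]) * dt))
    forward = flow_ml_per_s > 0
    mean_dp = float(np.mean(dp_mmHg[forward]))
    return net_stroke_ml / mean_dp

# Synthetic 0.8 s beat: a 0.5 s forward-flow phase, then a small leak
t = np.linspace(0.0, 0.8, 801)
flow = np.where(t < 0.5, 200.0 * np.sin(np.pi * t / 0.5), -10.0)
dp = np.where(t < 0.5, 4.0 + 2.0 * np.sin(np.pi * t / 0.5), 1.0)
print(benefit_cost(t, flow, dp))  # ml of net stroke volume per mmHg
```

In the design study this scalar was recomputed for each candidate leaflet curvature and pivot location, so waveform extraction only has to be written once.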

  10. Evaluation of Two Computational Techniques of Calculating Multipath Using Global Positioning System Carrier Phase Measurements

    Science.gov (United States)

    Gomez, Susan F.; Hood, Laura; Panneton, Robert J.; Saunders, Penny E.; Adkins, Antha; Hwu, Shian U.; Lu, Ba P.

    1996-01-01

    Two computational techniques are used to calculate differential phase errors on Global Positioning System (GPS) carrier phase measurements due to certain multipath-producing objects. The first is a rigorous computational electromagnetics technique, the Geometric Theory of Diffraction (GTD); the other is a simple ray tracing method (DECAT). The GTD technique has been used successfully to predict microwave propagation characteristics by taking into account the dominant multipath components due to reflections and diffractions from scattering structures. The ray tracing technique only solves for reflected signals. The results from the two techniques are compared to GPS differential carrier phase measurements taken on the ground using a GPS receiver in the presence of typical International Space Station (ISS) interference structures. The calculations produced using the GTD code matched the measured results better than those of the ray tracing technique. The agreement was good, demonstrating that the phase errors due to multipath can be modeled and characterized using the GTD technique, and characterized to a lesser fidelity using the DECAT technique. However, some discrepancies were observed. Most of the discrepancies occurred at lower elevations and were due either to phase center deviations of the antenna, the background multipath environment, or the receiver itself. Selected measured and predicted differential carrier phase error results are presented and compared. Results indicate that reflections and diffractions caused by the multipath producers located near the GPS antennas can produce phase shifts of greater than 10 mm, and as high as 95 mm. It should be noted that the field test configuration was meant to simulate typical ISS structures, but the two environments are not identical. The GTD and DECAT techniques have been used to calculate phase errors due to multipath on the ISS configuration to quantify the expected attitude determination errors.
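    The order of magnitude of such multipath phase errors can be illustrated with a far simpler model than either code discussed in the abstract. The sketch below is an illustrative assumption, not the GTD or DECAT implementation: it combines a flat-earth ground-bounce geometry with the standard single-reflection composite-signal model for carrier-phase multipath.

    ```python
    import math

    # GPS L1 carrier wavelength in metres: c / 1575.42 MHz (~0.1903 m)
    C = 299_792_458.0
    LAMBDA_L1 = C / 1_575.42e6

    def ground_reflection_excess(antenna_height_m, elevation_deg):
        """Extra path length of a ground-bounced ray (flat-earth model)."""
        return 2.0 * antenna_height_m * math.sin(math.radians(elevation_deg))

    def carrier_phase_error_mm(extra_path_m, alpha=0.5):
        """Carrier-phase error (mm) caused by one reflected ray.

        alpha is the amplitude of the reflection relative to the direct
        signal; psi is its carrier-phase offset.  Composite-signal model:
            dphi = atan2(alpha*sin(psi), 1 + alpha*cos(psi))
        """
        psi = 2.0 * math.pi * (extra_path_m % LAMBDA_L1) / LAMBDA_L1
        dphi = math.atan2(alpha * math.sin(psi), 1.0 + alpha * math.cos(psi))
        return dphi / (2.0 * math.pi) * LAMBDA_L1 * 1000.0
    ```

    In this model a single reflection can shift the carrier phase by at most a quarter wavelength, about 48 mm on L1, so a differential measurement between two antennas can approach the 95 mm figure reported in the abstract.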

  11. A study on the position of condylar head on computed tomogram

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Bok; Kim, Jae Duck [Dept. of Oral Radiology, Division of Dentistry, Chosun University, Kwangju (Korea, Republic of)

    1987-11-15

    The author obtained computed tomograms around the condylar head from 10 normal subjects and 5 patients having clicking sound or limitation of mouth opening by using a Hitachi-W 500, and then performed an axial analysis at 18 mm interincisal opening. Transcranial and submentovertex views were taken in addition to the computed tomographic views. The obtained results were as follows: 1. The median angle of the long axis of the condylar head was 17 degrees in centric occlusion, and the angles of the long axes of both condylar heads were reduced symmetrically at 18 mm interincisal opening in the normal group; however, in the patient group, the affected condylar head showed a greater change in angle at 18 mm interincisal opening. 2. In the patient group, the condylar head of the affected side was located superiorly to that of the normal side in centric occlusion, and the discrepancy in condylar height increased after 18 mm interincisal opening. 3. The distances from the medial pole of the condylar head to the triangular fossa of the temporal bone were equal on the right and left sides in the normal group; however, in the patient group, the distance on the affected side was wider than that of the opposite side in centric occlusion and became narrower than the opposite side at 18 mm interincisal opening. 4. The distances of the posterior joint space were equal on the right and left sides. The distance at the lateral third of the condylar head in centric occlusion was similar to that on the transcranial view in the normal group. 5. The distances of the posterior joint space were narrower in the patient group than in the normal group. 6. In conclusion, the affected condylar head of the patients showed postero-latero-superior displacement in centric occlusion and a larger range of rotational movement at 18 mm interincisal opening.

  12. Scapular position after the open Latarjet procedure: results of a computed tomography scan study.

    Science.gov (United States)

    Cerciello, Simone; Edwards, T Bradley; Cerciello, Giuliano; Walch, Gilles

    2015-02-01

    The aim of this study was to investigate, through a computed tomography (CT) scan analysis, the effects of the Latarjet procedure on scapular position in an axial plane. Twenty healthy young male subjects (mean age, 22 years; range, 18-27 years) were enrolled as a control group. Twenty young male patients (mean age, 23 years; range, 17-30 years) with recurrent anterior shoulder dislocation were enrolled as the study group. CT cuts at a proper level allowed the identification of an α angle, which defined the tilt of the scapula relative to the anterior-posterior axis. In the control population, the α angles on the right and left shoulders were 48° (44°-52°) and 48° (44°-54°), respectively. In the study group, the preoperative α angles at the affected and healthy shoulders were 49° (46°-52°) and 49° (44°-52°), respectively. At day 45, the corresponding angles were 45° (40°-50°) and 49° (46°-52°). At 6 months, the average α angle of the shoulder operated on was 52° (46°-58°). The α angle value was restored in 5 cases, increased in 9 cases (mean, 8°), and decreased in 6 cases (mean, 3°). A general symmetry of scapular position was observed during CT scan analysis. This balance was lost initially after the Latarjet procedure, with a decrease of the α angle and scapular protraction. Six months after surgery, a small trend toward scapular retraction was conversely observed; however, the data were not statistically significant. Copyright © 2015 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.

  13. Computational fluid dynamics vs. inverse dynamics methods to determine passive drag in two breaststroke glide positions.

    Science.gov (United States)

    Costa, L; Mantha, V R; Silva, A J; Fernandes, R J; Marinho, D A; Vilas-Boas, J P; Machado, L; Rouboa, A

    2015-07-16

    Computational fluid dynamics (CFD) plays an important role to quantify, understand and "observe" the water movements around the human body and its effects on drag (D). We aimed to investigate the flow effects around the swimmer and to compare the drag and drag coefficient (CD) values obtained from experiments (using cable velocimetry in a swimming pool) with those of CFD simulations for the two ventral gliding positions assumed during the breaststroke underwater cycle (with shoulders flexed and upper limbs extended above the head-GP1; with shoulders in neutral position and upper limbs extended along the trunk-GP2). Six well-trained breaststroke male swimmers (with reasonable homogeneity of body characteristics) participated in the experimental tests; afterwards a 3D swimmer model was created to fit within the limits of the sample body size profile. The standard k-ε turbulent model was used to simulate the fluid flow around the swimmer model. Velocity ranged from 1.30 to 1.70 m/s for GP1 and 1.10 to 1.50 m/s for GP2. Values found for GP1 and GP2 were lower for CFD than experimental ones. Nevertheless, both CFD and experimental drag/drag coefficient values displayed a tendency to jointly increase/decrease with velocity, except for GP2 CD where CFD and experimental values display opposite tendencies. Results suggest that CFD values obtained by single model approaches should be considered with caution due to small body shape and dimension differences to real swimmers. For better accuracy of CFD studies, realistic individual 3D models of swimmers are required, and specific kinematics respected.
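    The drag coefficient compared above follows from the standard drag equation, D = 0.5 ρ A v² C_D. A minimal sketch (the numeric values are hypothetical, not the paper's data):

    ```python
    RHO_WATER = 1000.0  # density of fresh water, kg/m^3

    def drag_coefficient(drag_n, velocity_ms, frontal_area_m2, rho=RHO_WATER):
        """C_D from measured passive drag, inverting D = 0.5*rho*A*v^2*C_D."""
        return 2.0 * drag_n / (rho * frontal_area_m2 * velocity_ms ** 2)

    # Hypothetical example: 50 N of passive drag at 1.5 m/s
    # with a 0.1 m^2 frontal area gives C_D ~ 0.44.
    cd = drag_coefficient(50.0, 1.5, 0.1)
    ```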

  14. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    Science.gov (United States)

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-01-01

    We propose an integrated curriculum to establish essential abilities of computer programming for the freshmen of a physics department. The implementation of the graphical-based interfaces from Scratch to LabVIEW then to LabVIEW for Arduino in the curriculum "Computer-Assisted Instrumentation in the Design of Physics Laboratories" brings…

  15. Towards a Versatile Tele-Education Platform for Computer Science Educators Based on the Greek School Network

    Science.gov (United States)

    Paraskevas, Michael; Zarouchas, Thomas; Angelopoulos, Panagiotis; Perikos, Isidoros

    2013-01-01

    Nowadays the growing need for highly qualified computer science educators in modern educational environments is commonplace. This study examines the potential use of the Greek School Network (GSN) to provide a robust and comprehensive e-training course for computer science educators in order to efficiently exploit advanced IT services and establish a…

  17. Evaluation of condylar positions in patients with temporomandibular disorders: A cone-beam computed tomography study

    Energy Technology Data Exchange (ETDEWEB)

    Imanimoghaddam, Mahrokh; Mahdavi, Pirooze; Bagherpour, Ali; Darijani, Mansoreh; Ebrahimnejad, Hamed [Dept. of Oral and Maxillofacial Radiology, Oral and Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of); Madani, Azam Sadat [Dept. of Oral and Maxillofacial Radiology, Oral and Maxillofacial Diseases Research Center, School of Dentistry, Mashhad University of Medical Sciences, Mashhad (Iran, Islamic Republic of)

    2016-06-15

    This study was performed to compare the condylar position in patients with temporomandibular joint disorders (TMDs) and a normal group by using cone-beam computed tomography (CBCT). In the TMD group, 25 patients (5 men and 20 women) were randomly selected among those suffering from TMD according to the Research Diagnostic Criteria for Temporomandibular Disorders (RDC/TMD). The control group consisted of 25 patients (8 men and 17 women) with normal temporomandibular joints (TMJs) who were referred to the radiology department in order to undergo CBCT scanning for implant treatment in the posterior maxilla. Linear measurements of the superior, anterior, and posterior joint spaces between the condyle and the glenoid fossa were made through defined landmarks in the sagittal view. The inclination of the articular eminence was also determined. The mean anterior joint space was 2.3 mm in the normal group and 2.8 mm in the TMD group. The results showed that there was a significant correlation between the superior and posterior joint spaces in both the normal and TMD groups, but only in the TMD group was the correlation between the dimensions of the anterior and superior spaces significant. There was a significant correlation between the inclination of the articular eminence and the size of the superior and posterior spaces in the normal group. The average dimension of the anterior joint space differed between the two groups. CBCT could be considered a useful diagnostic imaging modality for TMD patients.

  18. Computational design and characterization of a temperature-sensitive plasmid replicon for gram positive thermophiles

    Directory of Open Access Journals (Sweden)

    Olson Daniel G

    2012-05-01

    Full Text Available Abstract Background Temperature-sensitive (Ts) plasmids are useful tools for genetic engineering, but there are currently none compatible with the gram-positive, thermophilic, obligate anaerobe Clostridium thermocellum. Traditional mutagenesis techniques yield Ts mutants at a low frequency and therefore require the development of high-throughput screening protocols, which are also not available for this organism. Recently there has been progress in the development of computer algorithms which can predict Ts mutations. Most plasmids currently used for genetic modification of C. thermocellum are based on the replicon of plasmid pNW33N, which replicates using the RepB replication protein. To address this problem, we set out to create a Ts plasmid by mutating the gene coding for the RepB replication protein using an algorithm designed by Varadarajan et al. (1996) for predicting Ts mutants based on the amino-acid sequence of the protein. Results A library of 34 mutant plasmids was designed, synthesized and screened, resulting in 6 mutants which exhibited a Ts phenotype. Of these 6, the one with the most temperature-sensitive phenotype (M166A) was compared with the original plasmid. It exhibited lower stability at 48°C and was completely unable to replicate at 55°C. Conclusions The plasmid described in this work could be useful in future efforts to genetically engineer C. thermocellum, and the method used to generate this plasmid may be useful for others trying to make Ts plasmids.

  19. Construction of a Vocabulary Knowledge Acquisition and Semantic Computing Platform

    Institute of Scientific and Technical Information of China (English)

    刘兴林

    2013-01-01

    This paper designs and implements VKASCP, a vocabulary knowledge acquisition and semantic computing platform, together with the basic function modules required for natural language processing. The main function modules include compound-word recognition, compound-word part-of-speech tagging and word segmentation correction, keyword extraction, vocabulary semantic computing, automatic summarization based on thematic term sets, and text similarity computing based on thematic term sets. VKASCP integrates a text corpus and a vocabulary knowledge base, provides a good research platform for vocabulary knowledge acquisition and semantic computing, and lays a solid foundation for building a lexical semantic knowledge base in the future.

  20. Positive lymphoscintigraphy (ILS) and negative computed tomography for metastatic penile cancer.

    Science.gov (United States)

    Bantis, Athanasios; Sountoulides, Petros; Kalaitzis, Christos; Boussios, Nikolaos; Giannakopoulos, Stelios; Zissimopoulos, Athanasios

    2011-01-01

    Penile carcinoma usually occurs in men older than 40 years, with an incidence in western communities of 0.5 to 1.6 per 100,000 men per year; in developing countries the rate is much higher. Extensive lymph node dissection of lymphatic inguinal metastases made evident by inguinal lymphoscintigraphy (ILS) improves overall survival. A 75-year-old male with penile squamous cell carcinoma, stage pT2N0M0, of less than 2 cm diameter, with tumor invasion of the corpora of the penis, underwent partial penectomy with a 2-cm disease-free margin. Three months after the operation, computed tomography (CT) was negative for local recurrence or distant metastases. A dynamic ILS was performed after local anaesthesia and intradermal injection of 80 MBq of (99m)Tc-nanocolloid at the lower edge of the left and right inguinal ducts. The lymphatic chain and a hot spot suggestive of a first draining lymph node appeared after 15 min in the right inguinal region, in the second zone according to Daseler mapping. The left inguinal area was negative for a sentinel node (SN). In view of this finding an exploratory laparotomy was performed, and pathology showed that this lymph node, probably an SN, was infiltrated by the squamous cell carcinoma. The patient was upstaged to T2N1M0 and scheduled to receive adjuvant chemotherapy with two courses of cisplatin and 5-fluorouracil. T1 and T2 tumours of 2 cm diameter, T3 tumours, and T4 tumours are treated with glans amputation and/or partial or total penile amputation. Imaging with magnetic resonance imaging (MRI) or computed tomography (CT) does not always give accurate staging information, because positive findings are usually found only in patients with clinically palpable, enlarged inguinal lymph nodes. CT and MRI have low sensitivity for identifying occult metastases, because their criteria for malignant involvement are based mainly on the size of the lesions. The main pitfall of these diagnostic modalities is due to occult

  1. Assessment of optimal condylar position with cone-beam computed tomography in south Indian female population

    Directory of Open Access Journals (Sweden)

    W S Manjula

    2015-01-01

    Full Text Available Aim: The purpose of this study was to investigate the condyle-fossa relationship in clinically asymptomatic, orthodontically untreated south Indian female volunteers by cone-beam computed tomography (CBCT). Materials and Methods: The study population consisted of 13 clinically symptom-free and orthodontically untreated Angle's Class I female subjects with a mean age of 18 years (range, 17-20 years). The normal disc position of the 13 subjects was confirmed by history, clinical examination and magnetic resonance imaging. Then, images of the temporomandibular joints (TMJs) of the subjects were taken using CBCT to evaluate the optimal condylar position. Posterior joint space (PS), superior joint space (SS) and anterior joint space (AS) were measured, and the values were subjected to statistical analysis. Mean PS, SS and AS of the right- and left-side TMJs were calculated. Paired-samples t-tests were used for each measurement to evaluate the average differences between the right and left sides for each element of the sample. Results: The mean values of PS, SS and AS of the right-side TMJs were 2.1385, 2.2769 and 1.7615, respectively. The mean values of PS, SS and AS of the left-side TMJs were 2.1385, 2.5308 and 1.8538, respectively. Statistical analysis with the t-test indicated no significant differences in the AS, SS, or PS values between the right and left sides. The mean PS, SS, and AS measurements were 2.1 mm (standard deviation [SD] ± 0.65 mm), 2.4 mm (SD ± 0.58 mm), and 1.8 mm (SD ± 0.52 mm), respectively. The ratios of SS and PS to AS, with AS set to 1.0, were 1.3 and 1.2, respectively. Conclusion: These data from optimal joints might serve as norms for the clinical assessment of condylar position obtained by CBCT.

  2. Construction of a Logistics Information Platform Based on Cloud Computing Technology

    Institute of Scientific and Technical Information of China (English)

    黄华

    2016-01-01

    Cloud computing technology plays a core role in the construction of logistics information platforms; it is the product of combining Internet technology with logistics construction and occupies an important position in the development of today's logistics industry. This article analyzes the construction of a logistics information platform based on cloud computing technology, and is intended to provide a reference for promoting the informatization of logistics.

  3. Extreme working position of the huge flexible Stewart platform

    Institute of Scientific and Technical Information of China (English)

    孙欣; 段宝岩

    2001-01-01

    A concept of the Huge Flexible Stewart Platform is presented for the suspended-cable structure of the feed system of large spherical radio telescopes. The issue of slack flexible cables is emphasized and a criterion for detecting slackness is given, on the basis of which a general algorithm is developed to determine the maximum working angle of the feed at its working position by utilizing the non-linear static equilibrium equations. The analysis of the feed's extreme working position and maximum working angle for the large spherical radio telescope not only makes stable control possible, but also provides the necessary parameters for the design of a high-accuracy large spherical radio telescope.

  4. Cloud computing-assisted instruction platform based on Zoho Wiki

    Institute of Scientific and Technical Information of China (English)

    蒋宁; 杨姝; 杨雪华

    2012-01-01

    This paper introduces the definition and advantages of cloud computing technology and surveys the popular cloud computing platforms. Cloud computing facilitates the innovation of educational informatization: cloud computing-assisted instruction (CCAI) uses the education "cloud services" supported by cloud computing and can improve the quality of instruction efficiently and simply. The existing problems of CCAI platforms are pointed out, and a design concept for collaborative learning on a CCAI platform is put forward. To meet the demands of instructional practice, a CCAI platform that supports collaborative learning was selected for this study. Taking the laboratory sessions of the "Operating System" course as an example, the factors behind effective collaborative learning are analyzed in detail and the collaborative learning process is designed, with the discussion focused on the application of a CCAI platform based on Zoho Wiki, on which students can improve their effective collaborative learning. The limitations of CCAI platforms are also pointed out, including the need for adequately guaranteed network resources and scientific design by the course teachers.

  5. The cloud services innovation platform- enabling service-based environmental modelling using infrastructure-as-a-service cloud computing

    Science.gov (United States)

    Service oriented architectures allow modelling engines to be hosted over the Internet abstracting physical hardware configuration and software deployments from model users. Many existing environmental models are deployed as desktop applications running on user's personal computers (PCs). Migration ...

  6. On the construction of an experiment platform for Hadoop-based cloud computing

    Institute of Scientific and Technical Information of China (English)

    张岩; 郭松; 赵国海

    2013-01-01

    Hadoop is a free, open-source cloud platform: a framework that allows the distributed processing of large data sets across clusters of computers using simple programming models. It is a reliable, efficient, scalable cloud platform, well suited to simulation tests in a laboratory environment. In this paper, with the help of software such as the VMware virtual machine, Linux (Ubuntu), Hadoop and the Java JDK, the process of building a virtual cloud platform based on Hadoop in a stand-alone environment is described in detail and illustrated with a concrete example. The installation, configuration and testing of the virtual machine, Java and Hadoop are described step by step. The experiment environment was completed, and issues that must be attended to during the building process are pointed out, such as user rights, path configuration and the use of the SSH service. This experimental platform provides a basis for the development of system middleware and application services.

  7. Computer-aided drug screening platform and its application

    Institute of Scientific and Technical Information of China (English)

    宋新蕊; 李达; 陈洁; 赵勇

    2014-01-01

    Lead compound discovery is one of the most important steps in innovative drug research and development. Facing the large numbers of small-molecule compounds whose functions are not yet clear, we have established a computer-aided drug screening platform (www.vslead.com) for rapid lead compound discovery and effective reduction of drug development costs. The platform adopts a distributed architecture and integrates AutoDock Vina with a number of small-molecule libraries. It features data security, load balancing of computation and storage, and real-time monitoring. Using the platform for lead compound screening, targeted active small-molecule compounds were found in a short time and confirmed by wet-lab experiments, with a high hit rate, greatly shortening the lead compound discovery cycle. The platform has very good practicability and extensibility.

  8. Efficient and accurate P-value computation for Position Weight Matrices

    Directory of Open Access Journals (Sweden)

    Varré Jean-Stéphane

    2007-12-01

    Full Text Available Abstract Background Position Weight Matrices (PWMs) are probabilistic representations of signals in sequences. They are widely used to model approximate patterns in DNA or in protein sequences. Using PWMs requires, as a prerequisite, knowing the statistical significance of a word according to its score. This is done by defining the P-value of a score: the probability that the background model achieves a score larger than or equal to the observed value. This gives rise to the following problem: given a P-value, find the corresponding score threshold. Existing methods rely on dynamic programming or probability generating functions. For many examples of PWMs, they fail to give accurate results in a reasonable amount of time. Results The contribution of this paper is twofold. First, we study the theoretical complexity of the problem and prove that it is NP-hard. Then, we describe a novel algorithm that solves the P-value problem efficiently. The main idea is to use a series of discretized score distributions that improves the final result step by step until some convergence criterion is met. Moreover, the algorithm is capable of calculating the exact P-value without any error, even for matrices with non-integer coefficient values. The same approach is also used to devise an accurate algorithm for the reverse problem: finding the P-value for a given score. Both methods are implemented in a software called TFM-PVALUE, which is freely available. Conclusion We have tested TFM-PVALUE on a large set of PWMs representing transcription factor binding sites. Experimental results show that it achieves better performance in terms of computational time and precision than existing tools.
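    The brute-force score-distribution computation that TFM-PVALUE improves upon can be written as a short dynamic program. The sketch below is illustrative only (it omits the paper's discretisation and iterative refinement): it convolves the per-column score distributions under an i.i.d. background model and sums the tail to obtain a P-value.

    ```python
    from collections import defaultdict

    def score_distribution(pwm, background):
        """Exact distribution of PWM scores under an i.i.d. background.

        pwm: list of {base: score} dicts, one per matrix column.
        background: {base: probability}.  Returns {score: probability}.
        Feasible only for short matrices; TFM-PVALUE adds score
        discretisation with stepwise refinement to scale up.
        """
        dist = {0.0: 1.0}
        for column in pwm:
            nxt = defaultdict(float)
            for s, p in dist.items():
                for base, w in column.items():
                    nxt[round(s + w, 6)] += p * background[base]
            dist = dict(nxt)
        return dist

    def p_value(pwm, background, threshold):
        """P(score >= threshold) under the background model."""
        dist = score_distribution(pwm, background)
        return sum(p for s, p in dist.items() if s >= threshold)
    ```

    For a matrix of length m this can enumerate up to 4^m distinct scores, which is exactly why the discretised, iteratively refined distributions described in the abstract are needed for realistic PWMs.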

  9. Monitoring system for digital video forensics based on a cloud computing platform

    Institute of Scientific and Technical Information of China (English)

    彭召意; 周玉; 文志强

    2011-01-01

    In digital video forensics, facing the shortcomings of multiple cameras working in a non-cooperative mode and the problems of massive video data and computationally complex evidence processing, this paper presents a monitoring system for digital video forensics based on cloud computing. In this scheme, the cameras work in a cooperative manner and the video data of the monitoring system are stored in the cloud computing system; the cloud platform provides the video monitoring services that terminal users need, as well as the complex computations, such as target identification and target tracking, required during evidence collection. The system can take full advantage of the virtual storage and virtual computing capabilities of the cloud platform to improve the collaborative working ability of the on-site cameras, to increase the efficiency and accuracy of video forensics, and to improve monitoring flexibility and convenience for all kinds of terminal users.

  10. Design of a massive video conversion platform based on cloud computing

    Institute of Scientific and Technical Information of China (English)

    李英壮; 刘曌; 李先毅; 于广辉

    2012-01-01

    As the convergence of telecommunications, broadcasting and Internet networks ("triple play") accelerates, video conversion services exhibit distinctive characteristics: massive data volumes, the coexistence of multiple platforms, and diverse coding standards. Cloud computing, as a business computing model, is low-cost and efficient, supports virtualization, and is highly scalable and versatile. From the perspective of video conversion services, and based on cloud computing and video conversion technology, this article presents a design for a massive cloud-based video conversion platform, describes its architecture in detail, and explains the specific functions and workflows of its three layers: the cloud control layer, the cluster control layer, and the node control layer.

  11. Construction of a Computer Classroom Teaching Environment Based on a Cloud Platform

    Institute of Scientific and Technical Information of China (English)

    吴建善

    2013-01-01

    Computer classroom teaching is an important form of college and university education, but the existing computer classroom environment and mode seriously restrict the effective development of teaching activities. It is therefore necessary to build a new type of computer classroom teaching environment that reduces the construction and maintenance cost of computer classrooms and the workload of their management staff while improving teaching effectiveness. Through cloud desktop technology, system resources, teaching resources and the office environment can be placed on a cloud platform so that teachers and students can access desktops and resources through a variety of terminals. This can effectively improve the temporal-spatial, equipment, interpersonal, informational, organizational and emotional environments of computer classroom teaching, thereby promoting teaching.

  12. Stabilisation problem in biaxial platform

    Directory of Open Access Journals (Sweden)

    Lindner Tymoteusz

    2016-12-01

    Full Text Available The article describes an investigation of the rolling-ball stabilization problem on a biaxial platform. The aim of the proposed control system is to stabilize a ball moving on a plane at the equilibrium point. The authors propose a control algorithm based on cascade PID and compare it with another control method. The article shows the accuracy of ball stabilization and the influence of the applied filter on the signal waveform. The application used to detect the ball position, measured by a digital camera, was written using EmguCV, a cross-platform .NET wrapper for the OpenCV image processing library. The authors used a bipolar stepper motor with a dedicated electronic controller. Data between the computer and the designed controller are exchanged using the RS232 standard. The control stand is based on an ATmega-series microcontroller.

  13. Stabilisation problem in biaxial platform

    Science.gov (United States)

    Lindner, Tymoteusz; Rybarczyk, Dominik; Wyrwał, Daniel

    2016-12-01

    The article describes an investigation of the rolling-ball stabilization problem on a biaxial platform. The aim of the proposed control system is to stabilize a ball moving on a plane at its equilibrium point. The authors propose a control algorithm based on cascaded PID controllers and compare it with another control method. The article presents results on the accuracy of the ball stabilization and the influence of the applied filter on the signal waveform. The application used to detect the ball position, measured by a digital camera, was written using EmguCV, a cross-platform .NET wrapper for the OpenCV image processing library. The authors used a bipolar stepper motor with a dedicated electronic controller. Data are exchanged between the computer and the designed controller using the RS232 standard. The control stand is based on an ATmega-series microcontroller.

  14. Adaptive discrete cosine transform-based image compression method on a heterogeneous system platform using Open Computing Language

    Science.gov (United States)

    Alqudami, Nasser; Kim, Shin-Dug

    2014-11-01

    Discrete cosine transform (DCT) is one of the major operations in image compression standards and it requires intensive and complex computations. Recent computer systems and handheld devices are equipped with high computing capability devices such as a general-purpose graphics processing unit (GPGPU) in addition to the traditional multicore CPU. We develop an optimized parallel implementation of the forward DCT algorithm for JPEG image compression using the recently proposed Open Computing Language (OpenCL). This OpenCL parallel implementation combines a multicore CPU and a GPGPU in a single solution to perform DCT computations efficiently by applying optimization techniques that enhance kernel execution time and data movement. Separate optimized OpenCL kernels were developed (CPU-based and GPU-based) based on appropriate device-specific optimization factors, such as thread mapping, thread granularity, vector-based memory access, and the given workload. The performance of the DCT is evaluated in a heterogeneous environment, and our OpenCL parallel implementation speeds up the execution of the DCT by factors of 3.68 and 5.58 for different image sizes and formats, depending on workload allocation and data transfer mechanisms. The obtained speedup indicates the scalability of the DCT performance.
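    For reference, the per-block forward 2-D DCT that such kernels parallelize can be written compactly in NumPy. This is a sketch of the standard orthonormal JPEG transform, not the paper's OpenCL code; the per-block independence in the inner loops is exactly what GPU kernels exploit.

```python
import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix C, so a block transforms as C @ block @ C.T
    c = np.zeros((n, n))
    for k in range(n):
        for m in range(n):
            alpha = np.sqrt(1.0 / n) if k == 0 else np.sqrt(2.0 / n)
            c[k, m] = alpha * np.cos(np.pi * (2 * m + 1) * k / (2 * n))
    return c

def forward_dct_blocks(image):
    # Apply the 2-D DCT independently to each 8x8 block; each block is an
    # independent unit of work, which is what makes the transform parallel.
    c = dct_matrix()
    h, w = image.shape
    out = np.empty_like(image, dtype=float)
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            block = image[i:i+8, j:j+8] - 128.0  # JPEG level shift
            out[i:i+8, j:j+8] = c @ block @ c.T
    return out
```

    A flat block produces only a DC coefficient: a constant 8x8 block of value 130 yields out[0, 0] = 16.0 (8 times the level-shifted value 2) with all other coefficients zero.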

  15. Development of a Computer-Assisted Instrumentation Curriculum for Physics Students: Using LabVIEW and Arduino Platform

    Science.gov (United States)

    Kuan, Wen-Hsuan; Tseng, Chi-Hung; Chen, Sufen; Wong, Ching-Chang

    2016-06-01

    We propose an integrated curriculum to establish essential computer programming abilities for freshmen in a physics department. The implementation of graphical-based interfaces, from Scratch to LabVIEW and then to LabVIEW for Arduino, in the curriculum `Computer-Assisted Instrumentation in the Design of Physics Laboratories' brings rigorous algorithm and syntax protocols together with imagination, communication, scientific applications and experimental innovation. The effectiveness of the curriculum was evaluated via statistical analysis of questionnaires, interview responses, the increase in the number of students majoring in physics, and performance in a competition. The results provide quantitative support that the curriculum removed the huge barriers to programming that occur in text-based environments, helped students gain knowledge of programming and instrumentation, and increased the students' confidence and motivation to learn physics and computer languages.

  16. Time-domain seismic modeling in viscoelastic media for full waveform inversion on heterogeneous computing platforms with OpenCL

    Science.gov (United States)

    Fabien-Ouellet, Gabriel; Gloaguen, Erwan; Giroux, Bernard

    2017-03-01

    Full Waveform Inversion (FWI) aims at recovering the elastic parameters of the Earth by matching recordings of the ground motion with the direct solution of the wave equation. Modeling the wave propagation for realistic scenarios is computationally intensive, which limits the applicability of FWI. The current hardware evolution brings increasing parallel computing power that can speed up the computations in FWI. However, to take advantage of the diversity of parallel architectures presently available, new programming approaches are required. In this work, we explore the use of OpenCL to develop a portable code that can take advantage of the many parallel processor architectures now available. We present a program called SeisCL for 2D and 3D viscoelastic FWI in the time domain. The code computes the forward and adjoint wavefields using finite differences and outputs the gradient of the misfit function given by the adjoint state method. To demonstrate the code's portability across architectures, the performance of SeisCL is tested on three different devices: Intel CPUs, NVIDIA GPUs and Intel Xeon Phi. Results show that the use of GPUs with OpenCL can speed up the computations by nearly two orders of magnitude over a single-threaded application on the CPU. Although OpenCL allows code portability, we show that some device-specific optimization is still required to get the best performance out of a specific architecture. Using OpenCL in conjunction with MPI allows the domain decomposition of large models on several devices located on different nodes of a cluster. For large enough models, the speedup of the domain decomposition varies quasi-linearly with the number of devices. Finally, we investigate two different approaches to compute the gradient by the adjoint state method and show the significant advantages of using OpenCL for FWI.
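    As a minimal illustration of the time-domain finite-difference kernel at the heart of such codes, here is a 1-D second-order acoustic sketch. SeisCL itself solves the 2-D/3-D viscoelastic equations on GPUs; the grid, velocity, and source parameters below are illustrative assumptions.

```python
import numpy as np

def propagate(nx=300, nt=300, dx=5.0, dt=0.001, v=2000.0):
    # 1-D acoustic wave equation, second-order in time and space.
    # CFL number v*dt/dx = 0.4 < 1, so the explicit scheme is stable.
    p_prev = np.zeros(nx)
    p = np.zeros(nx)
    src = nx // 2
    r2 = (v * dt / dx) ** 2
    for it in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = p[:-2] - 2.0 * p[1:-1] + p[2:]   # discrete Laplacian
        p_next = 2.0 * p - p_prev + r2 * lap         # leapfrog time step
        # 25 Hz Ricker wavelet injected at the central grid point
        a = (np.pi * 25.0 * (it * dt - 0.04)) ** 2
        p_next[src] += (1.0 - 2.0 * a) * np.exp(-a)
        p_prev, p = p, p_next
    return p
```

    Because the source sits at the center and the scheme is symmetric, the two outgoing wavefronts are mirror images of each other, a handy sanity check for any such kernel.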

  17. USA Hire Testing Platform

    Data.gov (United States)

    Office of Personnel Management — The USA Hire Testing Platform delivers tests used in hiring for positions in the Federal Government. To safeguard the integrity of the hiring processes and ensure...

  18. Validation of a Computational Platform for the Analysis of the Physiologic Mechanisms of a Human Experimental Model of Hemorrhage

    Science.gov (United States)

    2009-12-01

    ...physiologic functioning of a virtual subject is a special adaptation of an established computer model of human physiology (the Guyton/Coleman/Summers model)...

  19. Security Mechanism of Workflow on the Cloud Computing Platform of the Digital Ocean

    Institute of Scientific and Technical Information of China (English)

    阮进勇; 徐凌宇; 丁广太

    2015-01-01

    In recent years, cloud computing technology has attracted much attention across a wide range of applications. Along with its development, cloud computing faces enormous security challenges. The workflow management system is an important part of the "Cloud Computing Platform of the Digital Ocean of China", and resource authorization and user identity authentication run through every process of the workflow management system. This paper studies the security problems in user management and in customizing compound-model workflows on the digital ocean cloud platform. To address the security problems that arise when users customize service flows with public and private resources, and to support user identity authentication, a workflow security mechanism combining dual-factor authentication via mobile phone and email is proposed.

  20. Computing platform to aid in decision making on energy management projects of ELETROBRAS

    Energy Technology Data Exchange (ETDEWEB)

    Assis, T.B.; Rosa, R.B.V.; Pinto, D.P.; Casagrande, C.G. [Universidade Federal de Juiz de Fora, MG (Brazil). Lab. de Eficiencia Energetica], Emails: tbassis@yahoo.com.br, tatobrasil@yahoo.com.br, casagrandejf@yahoo.com.br, danilo.pinto@ufjf.edu.br; Martins, C.C.; Cantarino, M. [Centrais Eletricas Brasileiras S.A. (ELETROBRAS), Rio de Janeiro, RJ (Brazil). Div. de Eficiencia Energetica em Edificacoes], Emails: cmartin@eletrobras.com, marcelo.cantarino@eletrobras.com

    2009-07-01

    A new tool developed by the Laboratory of Energy Efficiency (LEENER) of the Federal University of Juiz de Fora (UFJF) is presented: the SP{sup 3} platform - Planning System of the Public Buildings. When completed, this platform will help Centrais Eletricas Brasileiras S.A. (ELETROBRAS) meet the demand for energy-efficiency projects in public buildings, standardizing data in order to accelerate the approval and monitoring process for a larger number of projects. This article discusses the stages of the platform's development, the management methodology used, and the goals and outcomes examined with the members of PROCEL working on this project.

  1. Associating Drugs, Targets and Clinical Outcomes into an Integrated Network Affords a New Platform for Computer-Aided Drug Repurposing

    DEFF Research Database (Denmark)

    Oprea, Tudor; Nielsen, Sonny Kim; Ursu, Oleg

    2011-01-01

    benefit from an integrated, semantic-web compliant computer-aided drug repurposing (CADR) effort, one that would enable deep data mining of associations between approved drugs (D), targets (T), clinical outcomes (CO) and SE. We report preliminary results from text mining and multivariate statistics, based...

  2. The impact of reorienting cone-beam computed tomographic images in varied head positions on the coordinates of anatomical landmarks

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Hun; Jeong, Ho Gul; Hwang, Jae Joon; Lee, Jung Hee; Han, Sang Sun [Dept. of Oral and Maxillofacial Radiology, Yonsei University, College of Dentistry, Seoul (Korea, Republic of)

    2016-06-15

    The aim of this study was to compare the coordinates of anatomical landmarks on cone-beam computed tomographic (CBCT) images in varied head positions before and after reorientation using image analysis software. CBCT images were taken in a normal position and four varied head positions using a dry skull marked with 3 points where gutta percha was fixed. In each of the five radiographic images, reference points were set, 20 anatomical landmarks were identified, and each set of coordinates was calculated. Coordinates in the images from the normally positioned head were compared with those in the images obtained from varied head positions using statistical methods. Post-reorientation coordinates calculated using a three-dimensional image analysis program were also compared to the reference coordinates. In the original images, statistically significant differences were found between coordinates in the normal-position and varied-position images. However, post-reorientation, no statistically significant differences were found between coordinates in the normal-position and varied-position images. The changes in head position impacted the coordinates of the anatomical landmarks in three-dimensional images. However, reorientation using image analysis software allowed accurate superimposition onto the reference positions.
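    The reorientation step can be illustrated with a small linear-algebra sketch: a varied head position is modeled as a rigid rotation plus translation of the landmark coordinates, and reorientation applies the inverse transform. The angles and landmarks below are made up for illustration; the study itself used dedicated three-dimensional image analysis software.

```python
import numpy as np

def rotation_matrix(rx, ry, rz):
    # Rotations about the x, y and z axes (radians), composed as Rz @ Ry @ Rx.
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def vary_head_position(landmarks, R, t):
    # Landmarks are row vectors; a varied position rotates and shifts them.
    return landmarks @ R.T + t

def reorient(varied, R, t):
    # Undo the rigid transform: subtract the shift, rotate back with R^-1 = R^T.
    return (varied - t) @ R
```

    After reorientation the coordinates coincide with the reference position, mirroring the study's finding of no significant post-reorientation differences.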

  3. The impact of reorienting cone-beam computed tomographic images in varied head positions on the coordinates of anatomical landmarks

    Science.gov (United States)

    Kim, Jae Hun; Hwang, Jae Joon; Lee, Jung-Hee

    2016-01-01

    Purpose The aim of this study was to compare the coordinates of anatomical landmarks on cone-beam computed tomographic (CBCT) images in varied head positions before and after reorientation using image analysis software. Materials and Methods CBCT images were taken in a normal position and four varied head positions using a dry skull marked with 3 points where gutta percha was fixed. In each of the five radiographic images, reference points were set, 20 anatomical landmarks were identified, and each set of coordinates was calculated. Coordinates in the images from the normally positioned head were compared with those in the images obtained from varied head positions using statistical methods. Post-reorientation coordinates calculated using a three-dimensional image analysis program were also compared to the reference coordinates. Results In the original images, statistically significant differences were found between coordinates in the normal-position and varied-position images. However, post-reorientation, no statistically significant differences were found between coordinates in the normal-position and varied-position images. Conclusion The changes in head position impacted the coordinates of the anatomical landmarks in three-dimensional images. However, reorientation using image analysis software allowed accurate superimposition onto the reference positions. PMID:27358821

  4. AG-NGS: a powerful and user-friendly computing application for the semi-automated preparation of next-generation sequencing libraries using open liquid handling platforms.

    Science.gov (United States)

    Callejas, Sergio; Álvarez, Rebeca; Benguria, Alberto; Dopazo, Ana

    2014-01-01

    Next-generation sequencing (NGS) is becoming one of the most widely used technologies in the field of genomics. Library preparation is one of the most critical, hands-on, and time-consuming steps in the NGS workflow. Each library must be prepared in an independent well, increasing the number of hours required for a sequencing run and the risk of human-introduced error. Automation of library preparation is the best option to avoid these problems. With this in mind, we have developed automatic genomics NGS (AG-NGS), a computing application that allows an open liquid handling platform to be transformed into a library preparation station without losing the potential of an open platform. Implementation of AG-NGS does not require programming experience, and the application has also been designed to minimize implementation costs. Automated library preparation with AG-NGS generated high-quality libraries from different samples, demonstrating its efficiency, and all quality control parameters fell within the range of optimal values.

  5. Design of an IoT Operation Platform in a Cloud Computing Environment

    Institute of Scientific and Technical Information of China (English)

    马飞; 李丽; 王炼; 黄新

    2013-01-01

    Cloud computing, as a high-performance computing technology, can satisfy the large-scale, massive information-processing requirements of the Internet of Things (IoT), can adapt to the large load variations of IoT computing resources, and can provide computing capability to the IoT as a service. On this basis, an IoT operation platform based on cloud computing is constructed and the overall architecture of the system is presented, including the cloud infrastructure, the IoT cloud platform, and IoT cloud applications, providing a useful reference for the design of operating models for future IoT platforms.

  6. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    This research paper presents an initial attempt to introduce and explain the emergence of a new phenomenon, which we refer to as platform constellations. Functioning as highly modular systems, platform constellations are collections of highly connected platforms which co-exist in parallel and as such allow us to study platforms not only as separate entities, but also to investigate the relationship between several platforms offered and governed by one and the same platform provider. By investigating two case studies of indigenous platform constellations formed around the hugely popular instant messaging apps KakaoTalk and LINE, we are able to gain valuable insights about the nature of these new constructions and to capture and synthesize their main characteristics in a framework. Our results show that platform constellations possess unique innovative capabilities, which can improve users...

  7. Research on time domain simulation of dynamic positioning for a deep water semi-submersible platform

    Institute of Scientific and Technical Information of China (English)

    王磊; 孙攀; 王亮

    2011-01-01

    Based on time-domain low-frequency equations of horizontal motion, and considering all low-frequency environmental loads and the dynamic positioning model, a time-domain simulation system for the power consumption of a semi-submersible platform was established. This paper studies how the power consumption of the platform's dynamic positioning system varies. Under the premise of meeting the positioning accuracy requirements, the limit water depth that the dynamic positioning system can withstand was calculated, and how the positioning accuracy changes with the power consumption of the dynamic positioning system was also discussed, in order to provide guidance for engineering practice.
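    A rough illustration of such a time-domain simulation is a one-degree-of-freedom (surge) toy model: a PID law holds the platform against a steady low-frequency load, and thruster energy is accumulated with an assumed |F|^(3/2) propeller power law. All masses, gains, loads, and the power coefficient below are invented for illustration and are not the paper's values.

```python
def simulate_dp(duration=600.0, dt=0.1):
    # Toy 1-DOF surge model of dynamic positioning under a steady load.
    m = 5.0e7       # platform mass plus added mass [kg], illustrative
    d = 1.0e6       # linear damping coefficient [N*s/m]
    f_env = 8.0e5   # steady low-frequency environmental load [N]
    kp, ki, kd = 2.0e5, 2.0e3, 8.0e6   # PID gains on the surge offset [m]
    x, v, integ, energy = 0.0, 0.0, 0.0, 0.0
    for _ in range(int(duration / dt)):
        integ += x * dt
        thrust = -(kp * x + ki * integ + kd * v)
        accel = (f_env + thrust - d * v) / m
        v += accel * dt
        x += v * dt
        # Assumed propeller law: power grows with |thrust|^(3/2).
        energy += 1.0e-4 * abs(thrust) ** 1.5 * dt
    return x, thrust, energy
```

    After the transient dies out, the integral term makes the thrust balance the environmental load and the platform is held near the set point; the accumulated energy is the quantity whose dependence on environment and water depth the paper studies.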

  8. Effect of Nasal Obstruction on Continuous Positive Airway Pressure Treatment: Computational Fluid Dynamics Analyses.

    Directory of Open Access Journals (Sweden)

    Tadashi Wakayama

    Full Text Available Nasal obstruction is a common problem in continuous positive airway pressure (CPAP) therapy for obstructive sleep apnea and limits treatment compliance. The purpose of this study is to model the effects of nasal obstruction on airflow parameters under CPAP using computational fluid dynamics (CFD), and to clarify quantitatively the relation between airflow velocity and the pressure loss coefficient in subjects with and without nasal obstruction. We conducted an observational cross-sectional study of 16 Japanese adult subjects, of whom 9 had nasal obstruction and 7 did not (control group). Three-dimensional reconstructed models of the nasal cavity and nasopharynx with a CPAP mask fitted to the nostrils were created from each subject's CT scans. The digital models were meshed with tetrahedral cells and stereolithography formats were created. CPAP airflow simulations were conducted using CFD software. Airflow streamlines and velocity contours in the nasal cavities and nasopharynx were compared between groups. Simulation models were confirmed to agree with actual measurements of nasal flow rate and with pressure and flow rate in the CPAP machine. Under 10 cmH2O CPAP, the average maximum airflow velocity during inspiration was 17.6 ± 5.6 m/s in the nasal obstruction group but only 11.8 ± 1.4 m/s in the control group. The average pressure drop in the nasopharynx relative to inlet static pressure was 2.44 ± 1.41 cmH2O in the nasal obstruction group but only 1.17 ± 0.29 cmH2O in the control group. The nasal obstruction and control groups were clearly separated by a velocity threshold of 13.5 m/s and a pressure loss coefficient threshold of approximately 10.0. In contrast, there was no significant difference in expiratory pressure in the nasopharynx between the groups. This is the first CFD analysis of the effect of nasal obstruction on CPAP treatment. A strong correlation between the inspiratory pressure loss coefficient and maximum airflow velocity was found.
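    A pressure loss coefficient of the kind referred to above is, under one common definition, the static pressure drop normalized by dynamic pressure. The sketch below assumes K = Δp / (½ρv²) with standard air density and a cmH2O-to-pascal conversion; the paper's exact reference velocity and normalization may differ, so the function is illustrative only.

```python
RHO_AIR = 1.2          # air density [kg/m^3], illustrative standard value
CMH2O_TO_PA = 98.0665  # pascals per 1 cmH2O

def loss_coefficient(dp_cmh2o, v_ref):
    # Dimensionless pressure loss coefficient K = dp / (0.5 * rho * v^2),
    # with dp given in cmH2O and the reference velocity in m/s.
    dp_pa = dp_cmh2o * CMH2O_TO_PA
    return dp_pa / (0.5 * RHO_AIR * v_ref ** 2)
```

    For example, a 1 cmH2O drop at a 10 m/s reference velocity gives K = 98.0665 / 60 ≈ 1.63 under these assumptions.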

  9. Proposed Use of the NASA Ames Nebula Cloud Computing Platform for Numerical Weather Prediction and the Distribution of High Resolution Satellite Imagery

    Science.gov (United States)

    Limaye, Ashutosh S.; Molthan, Andrew L.; Srikishen, Jayanthi

    2010-01-01

    The development of the Nebula Cloud Computing Platform at NASA Ames Research Center provides an open-source solution for the deployment of scalable computing and storage capabilities relevant to the execution of real-time weather forecasts and the distribution of high resolution satellite data to the operational weather community. Two projects at Marshall Space Flight Center may benefit from use of the Nebula system. The NASA Short-term Prediction Research and Transition (SPoRT) Center facilitates the use of unique NASA satellite data and research capabilities in the operational weather community by providing datasets relevant to numerical weather prediction, and satellite data sets useful in weather analysis. SERVIR provides satellite data products for decision support, emphasizing environmental threats such as wildfires, floods, landslides, and other hazards, with interests in numerical weather prediction in support of disaster response. The Weather Research and Forecast (WRF) model Environmental Modeling System (WRF-EMS) has been configured for Nebula cloud computing use via the creation of a disk image and deployment of repeated instances. Given the available infrastructure within Nebula and the "infrastructure as a service" concept, the system appears well-suited for the rapid deployment of additional forecast models over different domains, in response to real-time research applications or disaster response. Future investigations into Nebula capabilities will focus on the development of a web mapping server and load balancing configuration to support the distribution of high resolution satellite data sets to users within the National Weather Service and international partners of SERVIR.

  10. Fast polyenergetic forward projection for image formation using OpenCL on a heterogeneous parallel computing platform.

    Science.gov (United States)

    Zhou, Lili; Clifford Chao, K S; Chang, Jenghwa

    2012-11-01

    Simulated projection images of digital phantoms constructed from CT scans have been widely used for clinical and research applications, but their quality and computation speed are not optimal for real-time comparison with the radiography acquired with an x-ray source of different energies. In this paper, the authors performed polyenergetic forward projections using Open Computing Language (OpenCL) in a parallel computing ecosystem consisting of a CPU and a general-purpose graphics processing unit (GPGPU) for fast and realistic image formation. The proposed polyenergetic forward projection uses a lookup table containing the NIST-published mass attenuation coefficients (μ/ρ) for different tissue types and photon energies ranging from 1 keV to 20 MeV. The CT images of the sites of interest are first segmented into different tissue types based on the CT numbers and converted to a three-dimensional attenuation phantom by linking each voxel to the corresponding tissue type in the lookup table. The x-ray source can be a radioisotope or an x-ray generator with a known spectrum described as weight w(n) for energy bin E(n). The Siddon method is used to compute the x-ray transmission line integral for E(n), and the x-ray fluence is the weighted sum of the exponential of the line integral for all energy bins, with added Poisson noise. To validate this method, a digital head and neck phantom constructed from the CT scan of a Rando head phantom was segmented into three regions (air, gray/white matter, and bone) for calculating the polyenergetic projection images for the Mohan 4 MV energy spectrum. To accelerate the calculation, the authors partitioned the workloads using task parallelism and data parallelism and scheduled them in a parallel computing ecosystem consisting of a CPU and a GPGPU (NVIDIA Tesla C2050) using OpenCL only. The authors explored the task-overlapping strategy and the sequential method for generating the first and subsequent DRRs. A dispatcher was designed to drive
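    The per-ray fluence model described above (exponential of the line integral per energy bin, summed with spectrum weights, plus Poisson noise) can be sketched as follows. The incident photon count and fixed random seed are illustrative assumptions; the line integrals would come from a ray tracer such as Siddon's algorithm.

```python
import numpy as np

def polyenergetic_fluence(line_integrals, weights, incident=1.0e5, rng=None):
    # line_integrals[n]: line integral of mu(E_n) along the ray,
    #                    e.g. computed with Siddon's algorithm.
    # weights[n]:        spectrum weight w(E_n) for energy bin E(n).
    # `incident` photons per ray is an illustrative assumption.
    expected = incident * np.sum(weights * np.exp(-np.asarray(line_integrals)))
    rng = np.random.default_rng(0) if rng is None else rng
    return rng.poisson(expected)  # Poisson-distributed detected count
```

    An unattenuated ray (zero line integrals) returns a noisy count near the incident fluence, while any attenuation strictly lowers the expected count.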

  11. An email-based platform for accessing High Performance Computing resources (Acceso a Recursos de Cómputo de Alto Rendimiento Mediante Correo Electrónico)

    Directory of Open Access Journals (Sweden)

    Suilan Estévez Velarde

    2014-04-01

    Full Text Available High-performance computing is a necessity for research involving large volumes of data. The growing demand for such results has driven several research centers to put high-performance computing resources into operation. In Cuba there is no definitive solution that gives all research centers the computing resources they need to develop their projects. This work proposes the use of a computer cluster at Griffith University through an email-based interface. This solution provides access to high-performance computing resources without the need for high connectivity. As a case study, the results obtained in a large-scale global optimization project developed at the University of Havana are analyzed. For experiments lasting one month (on a standard computer), the results show that using the high-performance resource makes it possible to achieve a relative performance increase of more than 1300%. Abstract: Research with large volumes of data usually requires access to high-performance computing. The increasing demand for this kind of research has led many institutions to develop their own computer clusters. However, in Cuba there is no definitive solution for the high-performance computing requirements of institutions such as the University of Havana. The expense of building a computer cluster prevents many institutions from having their own, while low connectivity limits the use of international high-performance computing services. This research presents an alternative solution based on the development of an email-based platform for accessing a computer cluster at Griffith University in Australia. This new communication interface has been successfully used in a large-scale optimization research project at the University of Havana.
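    An email-based submission interface of the kind described can be sketched with the standard library: the client encodes a job script as a message attachment, and a poller on the cluster side parses it back out. The addresses and the subject-line convention below are invented for illustration; the abstract does not specify the platform's actual protocol.

```python
# Sketch of an email job gateway. Addresses and the "JOB <name>" subject
# convention are hypothetical, not the platform's documented protocol.
from email.message import EmailMessage
from email import message_from_bytes

def build_job_message(job_name, script_text):
    msg = EmailMessage()
    msg["From"] = "researcher@uh.example"       # hypothetical address
    msg["To"] = "hpc-gateway@griffith.example"  # hypothetical address
    msg["Subject"] = f"JOB {job_name}"
    msg.set_content("Job submission via email gateway.")
    msg.add_attachment(script_text.encode(), maintype="text",
                       subtype="x-shellscript", filename=f"{job_name}.sh")
    return msg

def extract_job(raw_bytes):
    # What the cluster-side poller would do with a fetched message.
    msg = message_from_bytes(raw_bytes)
    name = msg["Subject"].removeprefix("JOB ").strip()
    for part in msg.walk():
        if part.get_filename():
            return name, part.get_payload(decode=True).decode()
    return name, None
```

    In a real deployment the poller would fetch messages over IMAP or POP3 and mail the results back; the round-trip through MIME shown here is the part that makes the scheme work over low-connectivity links.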

  12. Construction of a Software Testing Platform on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    曹丽; 姜毅; 甘春梅; 张一弛; 陈桂强

    2012-01-01

    This paper studies a solution for constructing a cloud computing platform for software testing. It first proposes the hierarchical model and the system architecture, and then discusses approaches to building a cloud computing platform for software testing using open-source software, covering both IaaS and PaaS. In implementing these approaches, the authors first list the software and hardware specifications of the IaaS platform, then describe the installation and configuration of OpenStack and illustrate the configuration of virtual testing servers. The authors also present the deployment of the PaaS layer, which includes the design and implementation of a software testing project management tool on cloud computing and the method for calling IaaS resources. Finally, a running example of the system is given.

  13. Molecular simulation workflows as parallel algorithms: the execution engine of Copernicus, a distributed high-performance computing platform.

    Science.gov (United States)

    Pronk, Sander; Pouya, Iman; Lundborg, Magnus; Rotskoff, Grant; Wesén, Björn; Kasson, Peter M; Lindahl, Erik

    2015-06-09

    Computational chemistry and other simulation fields are critically dependent on computing resources, but few problems scale efficiently to the hundreds of thousands of processors available in current supercomputers-particularly for molecular dynamics. This has turned into a bottleneck as new hardware generations primarily provide more processing units rather than making individual units much faster, which simulation applications are addressing by increasingly focusing on sampling with algorithms such as free-energy perturbation, Markov state modeling, metadynamics, or milestoning. All these rely on combining results from multiple simulations into a single observation. They are potentially powerful approaches that aim to predict experimental observables directly, but this comes at the expense of added complexity in selecting sampling strategies and keeping track of dozens to thousands of simulations and their dependencies. Here, we describe how the distributed execution framework Copernicus allows the expression of such algorithms in generic workflows: dataflow programs. Because dataflow algorithms explicitly state dependencies of each constituent part, algorithms only need to be described on conceptual level, after which the execution is maximally parallel. The fully automated execution facilitates the optimization of these algorithms with adaptive sampling, where undersampled regions are automatically detected and targeted without user intervention. We show how several such algorithms can be formulated for computational chemistry problems, and how they are executed efficiently with many loosely coupled simulations using either distributed or parallel resources with Copernicus.
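    The dataflow idea (tasks declare their inputs explicitly, so everything whose inputs are ready is mutually independent and can run concurrently) can be illustrated with Python's standard-library topological sorter. This is a toy sketch of the concept, not Copernicus's actual API or scheduler.

```python
from graphlib import TopologicalSorter

def run_workflow(tasks, deps):
    # tasks: name -> callable taking the results of its dependencies
    # deps:  name -> set of dependency names (the dataflow graph)
    ts = TopologicalSorter(deps)
    ts.prepare()
    results = {}
    while ts.is_active():
        ready = ts.get_ready()  # mutually independent: could run in parallel
        for name in ready:
            args = [results[d] for d in sorted(deps.get(name, ()))]
            results[name] = tasks[name](*args)
            ts.done(name)
    return results
```

    For example, two independent "simulation" tasks can feed a combining "analysis" task, the pattern the abstract describes for free-energy or Markov-state workflows; a real engine would dispatch each ready batch to distributed workers instead of a loop.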

  14. Position Paper: Applying Machine Learning to Software Analysis to Achieve Trusted, Repeatable Scientific Computing

    Energy Technology Data Exchange (ETDEWEB)

    Prowell, Stacy J [ORNL; Symons, Christopher T [ORNL

    2015-01-01

    Producing trusted results from high-performance codes is essential for policy and has significant economic impact. We propose combining rigorous analytical methods with machine learning techniques to achieve the goal of repeatable, trustworthy scientific computing.

  15. ToxEvaluator: an integrated computational platform to aid the interpretation of toxicology study-related findings.

    Science.gov (United States)

    Pelletier, D; Wiegers, T C; Enayetallah, A; Kibbey, C; Gosink, M; Koza-Taylor, P; Mattingly, C J; Lawton, M

    2016-01-01

    Attempts are frequently made to investigate adverse findings from preclinical toxicology studies in order to better understand underlying toxicity mechanisms. These efforts often begin with limited information, including a description of the adverse finding, knowledge of the structure of the chemical associated with its cause and the intended pharmacological target. ToxEvaluator was developed jointly by Pfizer and the Comparative Toxicogenomics Database (http://ctdbase.org) team at North Carolina State University as an in silico platform to facilitate interpretation of toxicity findings in light of prior knowledge. Through the integration of a diverse set of in silico tools that leverage a number of public and proprietary databases, ToxEvaluator streamlines the process of aggregating and interrogating diverse sources of information. The user enters compound and target identifiers, and selects adverse event descriptors from a safety lexicon and mapped MeSH disease terms. ToxEvaluator provides a summary report with multiple distinct areas organized according to what target or structural aspects have been linked to the adverse finding, including primary pharmacology, structurally similar proprietary compounds, structurally similar public domain compounds, predicted secondary (i.e. off-target) pharmacology and known secondary pharmacology. Similar proprietary compounds and their associated in vivo toxicity findings are reported, along with a link to relevant supporting documents. For similar public domain compounds and interacting targets, ToxEvaluator integrates relationships curated in Comparative Toxicogenomics Database, returning all direct and inferred linkages between them. As an example of its utility, we demonstrate how ToxEvaluator rapidly identified direct (primary pharmacology) and indirect (secondary pharmacology) linkages between cerivastatin and myopathy.

  16. BioWires: Conductive DNA Nanowires in a Computationally-Optimized, Synthetic Biological Platform for Nanoelectronic Fabrication

    Science.gov (United States)

    Vecchioni, Simon; Toomey, Emily; Capece, Mark C.; Rothschild, Lynn; Wind, Shalom

    2017-01-01

    DNA is an ideal template for a biological nanowire: it has a linear structure several atoms thick, it possesses addressable nucleobase geometry that can be precisely defined, and it is massively scalable into branched networks. Until now, the drawback of DNA as a conducting nanowire has been, simply put, its low conductance. To address this deficiency, we extensively characterize a chemical variant of canonical DNA that exploits the affinity of natural cytosine bases for silver ions. We successfully construct chains of single silver ions inside double-stranded DNA, confirm the basic dC-Ag+-dC bond geometry and kinetics, and show length tunability dependent on mismatch distribution, ion availability and enzyme activity. An analysis of the absorbance spectra of natural DNA and silver-binding, poly-cytosine DNA demonstrates the heightened thermostability of the ion chain and its resistance to aqueous stresses such as precipitation, dialysis and forced reduction. These chemically critical traits lend themselves to an increase in electrical conductivity of over an order of magnitude for 11-base silver-paired duplexes over natural strands when assayed by STM break junction. We further construct and implement a genetic pathway in the E. coli bacterium for the biosynthesis of highly ionizable DNA sequences. Toward future circuits, we construct a model of transcription network architectures to determine the most efficient and robust connectivity for cell-based fabrication, and we perform sequence optimization with a genetic algorithm to identify oligonucleotides robust to changes in the base-pairing energy landscape. We propose that this system will serve as a synthetic biological fabrication platform for more complex DNA nanotechnology and nanoelectronics with applications to deep space and low-resource environments.
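
    The sequence-optimization step mentioned above can be illustrated with a minimal genetic algorithm. The fitness function below is a toy stand-in (an assumption, not the authors' scoring): it rewards cytosine content, favorable for dC-Ag+-dC pairing, while penalizing homopolymer runs of four or more bases.

    ```python
    import random

    BASES = "ACGT"

    def fitness(seq):
        # Toy objective (an assumption, not the paper's scoring function):
        # reward cytosine fraction, penalize each homopolymer run of length 4+.
        c_frac = seq.count("C") / len(seq)
        penalty = sum(1 for i in range(3, len(seq))
                      if seq[i] == seq[i - 1] == seq[i - 2] == seq[i - 3])
        return c_frac - 0.1 * penalty

    def mutate(seq, rate=0.05):
        # Point-mutate each base independently with the given probability.
        return "".join(random.choice(BASES) if random.random() < rate else b
                       for b in seq)

    def crossover(a, b):
        # Single-point crossover between two parent sequences.
        cut = random.randrange(1, len(a))
        return a[:cut] + b[cut:]

    def evolve(length=11, pop_size=50, generations=100, seed=0):
        random.seed(seed)
        pop = ["".join(random.choice(BASES) for _ in range(length))
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness, reverse=True)
            elite = pop[: pop_size // 5]          # keep the top 20% unchanged
            children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                        for _ in range(pop_size - len(elite))]
            pop = elite + children
        return max(pop, key=fitness)

    best = evolve()
    ```

    Because the elite are carried over unchanged, the best fitness in the population is non-decreasing across generations.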

  17. Cloud Robotics Platforms

    Directory of Open Access Journals (Sweden)

    Busra Koken

    2015-01-01

    Full Text Available Cloud robotics is a rapidly evolving field that allows robots to offload computation-intensive and storage-intensive jobs into the cloud. Robots are limited in terms of computational capacity, memory and storage. The cloud provides virtually unlimited computation power, memory, storage and, above all, opportunities for collaboration. Cloud-enabled robots are divided into two categories: standalone and networked robots. This article surveys cloud robotics platforms and both standalone and networked robotic applications such as grasping, simultaneous localization and mapping (SLAM) and monitoring.

  18. Fragment-based docking: development of the CHARMMing Web user interface as a platform for computer-aided drug design.

    Science.gov (United States)

    Pevzner, Yuri; Frugier, Emilie; Schalk, Vinushka; Caflisch, Amedeo; Woodcock, H Lee

    2014-09-22

    Web-based user interfaces to scientific applications are important tools that allow researchers to utilize a broad range of software packages with just an Internet connection and a browser. One such interface, CHARMMing (CHARMM interface and graphics), facilitates access to the powerful and widely used molecular software package CHARMM. CHARMMing incorporates tasks such as molecular structure analysis, dynamics, multiscale modeling, and other techniques commonly used by computational life scientists. We have extended CHARMMing's capabilities to include a fragment-based docking protocol that allows users to perform molecular docking and virtual screening calculations either directly via the CHARMMing Web server or on computing resources using the self-contained job scripts generated via the Web interface. The docking protocol was evaluated by performing a series of "re-dockings" with direct comparison to top commercial docking software. Results of this evaluation showed that CHARMMing's docking implementation is comparable to many widely used software packages and validates the use of the new CHARMM generalized force field for docking and virtual screening.

  19. Computing highly correlated positions using mutual information and graph theory for G protein-coupled receptors.

    Directory of Open Access Journals (Sweden)

    Sarosh N Fatakia

    Full Text Available G protein-coupled receptors (GPCRs) are a superfamily of seven-transmembrane-spanning proteins involved in a wide array of physiological functions and are the most common targets of pharmaceuticals. This study aims to identify a cohort, or clique, of positions that share high mutual information. Using a multiple sequence alignment of the transmembrane (TM) domains, we calculated the mutual information between all inter-TM pairs of aligned positions and ranked the pairs by mutual information. A mutual information graph was constructed with vertices corresponding to TM positions; edges were drawn between vertices whenever the mutual information exceeded a threshold of statistical significance. Positions with high degree (i.e., with significant mutual information with a large number of other positions) were found to line a well-defined inter-TM ligand-binding cavity for class A as well as class C GPCRs. Although the natural ligands of class C receptors bind to their extracellular N-terminal domains, the possibility of modulating their activity through ligands that bind to their helical bundle has been reported. Such positions were not found for class B GPCRs, in agreement with the observation that there are no known ligands that bind within their TM helical bundle. All identified key positions formed a clique within the MI graph of interest. For a subset of class A receptors we also considered the alignment of a portion of the second extracellular loop, and found that the two positions adjacent to the conserved Cys that bridges the loop with TM3 qualified as key positions. Our algorithm may be useful for localizing topologically conserved regions in other protein families.
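
    The core procedure, computing mutual information between aligned columns and drawing edges above a threshold, can be sketched as follows. This is a minimal illustration: the threshold here is a plain parameter, whereas the study derives it from a test of statistical significance.

    ```python
    import math
    from collections import Counter
    from itertools import combinations

    def mutual_information(col_i, col_j):
        """MI (in bits) between two aligned columns (one residue per sequence)."""
        n = len(col_i)
        pi, pj = Counter(col_i), Counter(col_j)
        pij = Counter(zip(col_i, col_j))
        mi = 0.0
        for (a, b), c in pij.items():
            p_ab = c / n
            mi += p_ab * math.log2(p_ab / ((pi[a] / n) * (pj[b] / n)))
        return mi

    def mi_graph(alignment, threshold):
        """Edges between column indices whose MI exceeds the threshold,
        plus the degree of each column in the resulting graph."""
        cols = list(zip(*alignment))      # transpose: sequences -> columns
        edges = [(i, j) for i, j in combinations(range(len(cols)), 2)
                 if mutual_information(cols[i], cols[j]) > threshold]
        degree = Counter()
        for i, j in edges:
            degree[i] += 1
            degree[j] += 1
        return edges, degree

    # Toy alignment: columns 0 and 2 co-vary perfectly, column 1 is constant.
    aln = ["ARD", "ARD", "KRE", "KRE", "ARD", "KRE"]
    edges, degree = mi_graph(aln, threshold=0.5)
    ```

    With this toy alignment, only the perfectly co-varying pair (columns 0 and 2) exceeds the threshold; the invariant column carries zero mutual information with every other column.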

  20. Construction of a Cloud Platform for Campus Teaching Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    张妍; 王瑞刚

    2013-01-01

    With the rapid development of education informatization and the Internet, the construction of education informatization on campus has become a focus of many industries. To adapt to this development and improve the quality and efficiency of student learning, this paper combines the characteristics of cloud computing with the current status of campus informatization construction in China and discusses a new model for building a campus education cloud platform based on cloud computing.

  1. Application of Cloud Computing in an Intelligent Medical Integration Platform

    Institute of Scientific and Technical Information of China (English)

    王琳华

    2014-01-01

    Digital hospital information integration requires fusing the medical information of a variety of intelligent devices and providing professional medical services oriented to regional healthcare and its users. Using cloud computing technology, data aggregation is realized through server virtualization; this approach is efficient, easy to maintain and easy to manage, and comprehensively improves information security. Cloud computing can solve problems of the intelligent medical integration platform such as high operation and maintenance costs and large information security risks, enabling mobile healthcare and better promoting the development of intelligent healthcare.

  2. Research on a Video Surveillance Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    李敬

    2014-01-01

    With the maturation of grid computing and virtualization technology, and the demand of today's social networking sites for high-performance, highly reliable and highly scalable servers, cloud computing has emerged. The smart-city integrated service platform based on cloud computing takes surveillance video as its foundation and integrates a variety of smart-city applications such as smart community, smart home, smart transportation and smart city management. Addressing the bottlenecks of current video surveillance platforms, such as overloaded streaming media servers, weak disaster recovery capability and poor scalability, this paper proposes a solution based on the popular open-source distributed framework Hadoop, covering distributed video storage, format transcoding and mitigation of single-node risks.

  3. Design and Analysis of a Cloud Computing Platform Based on LCSAM

    Institute of Scientific and Technical Information of China (English)

    王金强

    2016-01-01

    Countries currently lack a unified, standard solution and control measures for the security risks present in cloud environments. Building on existing methods and models for assessing cloud security risks, this paper proposes a whole-lifecycle cloud security risk analysis model (LCSAM), analyzes the potential security risks of cloud computing in detail, constructs a security strategy model for the PaaS layer, and puts forward a Kerberos-based trust-as-a-service framework. By comprehensively applying key cloud computing technologies and combining various open-source software packages, a cloud computing platform was built that can analyze a specific environment rapidly and economically and can be used for cloud attack, cloud defense and cloud testing, providing an experimental environment for verifying LCSAM. Finally, typical applications were deployed on the platform and cloud security risks were simulated, verifying the LCSAM model.

  4. Cone-Beam Computed Tomographic Assessment of Mandibular Condylar Position in Patients with Temporomandibular Joint Dysfunction and in Healthy Subjects

    Directory of Open Access Journals (Sweden)

    Maryam Paknahad

    2015-01-01

    Full Text Available Statement of the Problem. The clinical significance of condyle-fossa relationships in the temporomandibular joint is a matter of controversy. Different studies have evaluated whether the position of the condyle is a predictor of the presence of temporomandibular disorder. Purpose. The purpose of the present study was to investigate the condylar position according to gender in patients with temporomandibular disorder (TMD and healthy controls using cone-beam computed tomography. Materials and Methods. CBCT of sixty temporomandibular joints in thirty patients with TMD and sixty joints of thirty subjects without TMJ disorder was evaluated in this study. The condylar position was assessed on the CBCT images. The data were analyzed using Pearson chi-square test. Results. No statistically significant differences were found regarding the condylar position between symptomatic and asymptomatic groups. Posterior condylar position was more frequently observed in women and anterior condylar position was more prevalent in men in the symptomatic group. However, no significant differences in condylar position were found in asymptomatic subjects according to gender. Conclusion. This study showed no apparent association between condylar positioning and clinical findings in TMD patients.

  5. Cone-Beam Computed Tomographic Assessment of Mandibular Condylar Position in Patients with Temporomandibular Joint Dysfunction and in Healthy Subjects.

    Science.gov (United States)

    Paknahad, Maryam; Shahidi, Shoaleh; Iranpour, Shiva; Mirhadi, Sabah; Paknahad, Majid

    2015-01-01

    Statement of the Problem. The clinical significance of condyle-fossa relationships in the temporomandibular joint is a matter of controversy. Different studies have evaluated whether the position of the condyle is a predictor of the presence of temporomandibular disorder. Purpose. The purpose of the present study was to investigate the condylar position according to gender in patients with temporomandibular disorder (TMD) and healthy controls using cone-beam computed tomography. Materials and Methods. CBCT of sixty temporomandibular joints in thirty patients with TMD and sixty joints of thirty subjects without TMJ disorder was evaluated in this study. The condylar position was assessed on the CBCT images. The data were analyzed using Pearson chi-square test. Results. No statistically significant differences were found regarding the condylar position between symptomatic and asymptomatic groups. Posterior condylar position was more frequently observed in women and anterior condylar position was more prevalent in men in the symptomatic group. However, no significant differences in condylar position were found in asymptomatic subjects according to gender. Conclusion. This study showed no apparent association between condylar positioning and clinical findings in TMD patients.

  6. Positron emission tomography with or without computed tomography in the primary staging of Hodgkin's lymphoma

    DEFF Research Database (Denmark)

    Hutchings, Martin; Jakobsen, Annika Loft; Hansen, Mads;

    2006-01-01

    In order to receive the most appropriate therapy, patients with Hodgkin's lymphoma (HL) must be accurately stratified into different prognostic staging groups. Computed tomography (CT) plays a pivotal role in the conventional staging. The aim of the present study was to investigate the value...

  7. African American Faculty Women Experiences of Underrepresentation in Computer Technology Positions in Higher Education

    Science.gov (United States)

    King, Dolores

    2013-01-01

    African American women are underrepresented in computer technology disciplines in institutions of higher education throughout the United States. Although equitable gender representation is progressing in most fields, much less information is available on why institutions are still lagging in workforce diversity, a problem which can be lessened by…

  8. Analysis of the effect of swimmer's head position on swimming performance using computational fluid dynamics.

    Science.gov (United States)

    Zaïdi, H; Taïar, R; Fohanno, S; Polidori, G

    2008-01-01

    The aim of this numerical work is to analyze the effect of the position of the swimmer's head on the hydrodynamic performances in swimming. In this initial study, the problem was modeled as two-dimensional and in a steady hydrodynamic state. The geometry is generated by the CAD software CATIA and the numerical simulation is carried out with the CFD code Fluent. The standard k-epsilon turbulence model is used with a specific wall law. Three positions of the head were studied, for Reynolds numbers around 10^6. The numerical results revealed that the position of the head had a noticeable effect on the hydrodynamic performances, strongly modifying the wake around the swimmer. The analysis of these results made it possible to propose an optimal position of the head of a swimmer in underwater swimming.

  9. Design and Implementation of a Cloud Computing Platform for Mobile Police Applications

    Institute of Scientific and Technical Information of China (English)

    肖薇; 计春雷

    2013-01-01

    To address the huge volume of internal data at the Ministry of Public Security and the complex, inefficient image retrieval operations performed by mobile police officers, a cloud-computing-based solution is proposed. Through efficient network access and remote invocation, the system interacts with a Spark cloud computing service platform, effectively solving the problem of image retrieval over massive data. By introducing image feature indexing, retrieval efficiency is improved, thereby enhancing the operational capability of mobile policing.

  10. The precise computation of geoid undulation differences with comparison to results obtained from the global positioning system

    Science.gov (United States)

    Engelis, T.; Rapp, R. H.; Tscherning, C. C.

    1984-01-01

    Ellipsoidal height differences have been determined for 13 station pairs in the central Ohio region using measurements made with the Global Positioning System. This information was used to compute geoid undulation differences based on known orthometric heights. These differences were compared to gravimetrically-computed undulations (using a Stokes integration procedure, and least squares collocation having an internal r.m.s. agreement of plus or minus 1 cm in undulation differences). The two sets of undulation differences have an r.m.s. discrepancy of plus or minus 5 cm while the average station separation is of the order of 14 km. This good agreement suggests that gravimetric data can be used to compute accurate geoid undulation differences that can be used to convert ellipsoidal height differences obtained from GPS to orthometric height differences.
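
    The conversion mentioned in the last sentence is the standard geodetic relation H = h − N (orthometric height equals ellipsoidal height minus geoid undulation), which applies equally to differences between stations:

    ```python
    def orthometric_diff(dh_ellipsoidal_m, dN_geoid_m):
        """dH = dh - dN: convert a GPS ellipsoidal height difference to an
        orthometric height difference using a geoid undulation difference.
        The numeric values below are illustrative, not from the study."""
        return dh_ellipsoidal_m - dN_geoid_m

    # e.g. a 12.345 m ellipsoidal difference with a 0.050 m undulation difference
    dH = orthometric_diff(12.345, 0.050)
    ```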

  11. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2011-01-01

    The Windows Azure Platform has rapidly established itself as one of the most sophisticated cloud computing platforms available. With Microsoft working to continually update their product and keep it at the cutting edge, the future looks bright - if you have the skills to harness it. In particular, new features such as remote desktop access, dynamic content caching and secure content delivery using SSL make the latest version of Azure a more powerful solution than ever before. It's widely agreed that cloud computing has produced a paradigm shift in traditional architectural concepts by providin

  12. Wireless sensor platform

    Energy Technology Data Exchange (ETDEWEB)

    Joshi, Pooran C.; Killough, Stephen M.; Kuruganti, Phani Teja

    2017-08-08

    A wireless sensor platform and methods of manufacture are provided. The platform involves providing a plurality of wireless sensors, where each of the sensors is fabricated on flexible substrates using printing techniques and low temperature curing. Each of the sensors can include planar sensor elements and planar antennas defined using the printing and curing. Further, each of the sensors can include a communications system configured to encode the data from the sensors into a spread spectrum code sequence that is transmitted to a central computer(s) for use in monitoring an area associated with the sensors.

  13. Research on a Scheduling Model for an IaaS Public Cloud Platform

    Institute of Scientific and Technical Information of China (English)

    岳冬利; 刘海涛; 孙傲冰

    2011-01-01

    A service model for an IaaS public cloud is created and, based on queueing theory, the service mode, queue length and configuration of the scheduling server are optimized. A scheduling model based on demand vectors is proposed, which filters the available host machines according to the match between the demand and the available resources. If no host machine meeting the demand can be found at first, the scheduling algorithm combines with virtual machine migration to reallocate physical resources, guaranteeing maximal resource utilization and availability of the whole platform. The feasibility of the algorithm is verified on our own IaaS public cloud computing platform.
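
    The demand-vector filtering step can be sketched as follows. The resource names and the leftover-based ranking are illustrative assumptions, not the paper's exact matching rule:

    ```python
    def filter_hosts(demand, hosts):
        """Return the names of hosts whose free resources cover every component
        of the demand vector, ranked by closeness of fit (smallest total
        leftover first). Resource keys (cpu, ram_gb, disk_gb) are illustrative."""
        feasible = []
        for name, free in hosts.items():
            if all(free.get(k, 0) >= v for k, v in demand.items()):
                leftover = sum(free[k] - v for k, v in demand.items())
                feasible.append((leftover, name))
        return [name for _, name in sorted(feasible)]

    demand = {"cpu": 4, "ram_gb": 8, "disk_gb": 100}
    hosts = {
        "h1": {"cpu": 8, "ram_gb": 16, "disk_gb": 500},
        "h2": {"cpu": 4, "ram_gb": 8,  "disk_gb": 120},   # tightest fit
        "h3": {"cpu": 2, "ram_gb": 32, "disk_gb": 500},   # too few CPUs
    }
    ranked = filter_hosts(demand, hosts)
    ```

    An empty result from `filter_hosts` would be the trigger for the migration-based reallocation the abstract describes.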

  14. Comparative analysis between mandibular positions in centric relation and maximum intercuspation by cone beam computed tomography (CONE-BEAM)

    OpenAIRE

    Ferreira,Amanda de Freitas; Henriques,João César Guimarães; Almeida,Guilherme de Araújo; Machado,Asbel Rodrigues; Machado, Naila Aparecida de Godoi; Fernandes Neto,Alfredo Júlio

    2009-01-01

    This research consisted of a quantitative assessment, and aimed to measure the possible discrepancies between the maxillomandibular positions in centric relation (CR) and maximum intercuspation (MI), using cone-beam volumetric computed tomography (cone beam method). The sample of the study consisted of 10 asymptomatic young adult patients divided into two types of standard occlusion: normal occlusion and Angle Class I occlusion. In order to obtain the centric relation, a JIG device and mandi...

  15. Associating Drugs, Targets and Clinical Outcomes into an Integrated Network Affords a New Platform for Computer-Aided Drug Repurposing.

    Science.gov (United States)

    Oprea, Tudor I; Nielsen, Sonny Kim; Ursu, Oleg; Yang, Jeremy J; Taboureau, Olivier; Mathias, Stephen L; Kouskoumvekaki, Irene; Sklar, Larry A; Bologa, Cristian G

    2011-03-14

    Finding new uses for old drugs is a strategy embraced by the pharmaceutical industry, with increasing participation from the academic sector. Drug repurposing efforts focus on identifying novel modes of action, but not in a systematic manner. With intensive data mining and curation, we aim to apply bio- and cheminformatics tools using the DRUGS database, containing 3,837 unique small molecules annotated on 1,750 proteins. These are likely to serve as drug targets and antitargets (i.e., associated with side effects, SE). The academic community, the pharmaceutical sector and clinicians alike could benefit from an integrated, semantic-web compliant computer-aided drug repurposing (CADR) effort, one that would enable deep data mining of associations between approved drugs (D), targets (T), clinical outcomes (CO) and SE. We report preliminary results from text mining and multivariate statistics, based on 7,684 approved drug labels, ADL (Dailymed) via text mining. From the ADL corresponding to 988 unique drugs, the "adverse reactions" section was mapped onto 174 SE, then clustered via principal component analysis into a 5x5 self-organizing map that was integrated into a Cytoscape network of SE-D-T-CO. This type of data can be used to streamline drug repurposing and may result in novel insights that can lead to the identification of novel drug actions.

  16. Design of a Logistics Public Information Platform Based on Cloud Computing Architecture

    Institute of Scientific and Technical Information of China (English)

    李姝宁

    2012-01-01

    To further study the specific characteristics of a logistics public information platform operating under cloud computing, this paper starts from the overall cloud computing architecture, constructs a cloud architecture for the logistics public information platform and describes its working principle, and finally discusses the future development trend of such platforms.

  17. Research on Ultrasonic Location of Partial Discharge in Transformers Based on an Improved Multi-Platform Positioning Principle

    Institute of Scientific and Technical Information of China (English)

    李燕青; 魏方园; 王飞龙

    2014-01-01

    To address the large errors of traditional transformer partial discharge (PD) location methods, a location method based on an improved multi-platform positioning principle is proposed. Since the weight of a bearing line in the position calculation is inversely proportional to the distance between the direction-finding platform and the PD source, a function is defined as the sum, over all bearing lines, of the product of each line's weight and its distance to a point in space; the point minimizing this function is taken as the position of the PD source. In simulation experiments, the curve relating the average peak value of the ultrasonic signals received by the different sensors in the array to the distance between the PD source and the direction-finding platform is measured. Once the average peak value is obtained, the distance between the platform and the PD source can be read from this curve, and the spatial coordinates of the source can then be calculated using the improved multi-platform positioning method.
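
    The weighted bearing-line minimization can be sketched with a closed-form weighted least-squares solution. Note the hedge: this sketch minimizes weighted *squared* point-to-line distances to obtain a linear system, whereas the paper minimizes a weighted sum of distances, so it illustrates the idea rather than reproducing the authors' exact algorithm:

    ```python
    import numpy as np

    def locate_source(anchors, directions, weights):
        """Weighted least-squares point closest to a set of 3-D bearing lines.
        Line i passes through anchors[i] with direction directions[i]; the
        weights mirror the paper's down-weighting of distant platforms."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for a, d, w in zip(anchors, directions, weights):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)   # projector onto the line's normal plane
            A += w * P
            b += w * P @ a
        return np.linalg.solve(A, b)

    # Two bearing lines that intersect at (1, 2, 3):
    anchors = [np.array([0.0, 2.0, 3.0]), np.array([1.0, 0.0, 3.0])]
    directions = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
    src = locate_source(anchors, directions, weights=[1.0, 1.0])
    ```

    Setting each weight to the reciprocal of the platform-to-source distance estimated from the peak-value curve recovers the weighting scheme the abstract describes.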

  18. Positive intraluminal bowel contrast on computed tomography following oral ingestion of Kayexelate

    Energy Technology Data Exchange (ETDEWEB)

    Zissin, R.; Stackievicz, R.; Osadchy, A. [Tel-Aviv Univ., Dept. of Diagnostic Imaging Meir Medical Center, Kfar-Saba, affiliated to the Sackler School of Medicine, Tel-Aviv (Israel)], E-mail: zisinrivka@clalit.org.il; Gayer, G. [Tel-Aviv Univ., Dept. of Diagnostic Imaging Assaf Harofe Medical Center, Zrifin, affiliated to the Sackler School of Medicine, Tel-Aviv (Israel)

    2008-12-15

    Our study presents the computed tomography (CT) manifestations of orally ingested Kayexelate (a powdered form of sodium polystyrene sulphonate) used to treat hyperkalemia. Five patients in whom Kayexelate appeared as high-attenuating intraluminal enteric content, similar to oral contrast material or leakage of intravascular contrast, are reported. Radiologists should be familiar with its appearance, as it may mimic oral or vascular contrast within the gastrointestinal tract, a finding that may lead to a diagnostic error or misinterpretation. (author)

  19. Ladder attachment platform

    Science.gov (United States)

    Swygert,; Richard, W [Springfield, SC

    2012-08-28

    A ladder attachment platform is provided that includes a base for attachment to a ladder that has first and second side rails and a plurality of rungs that extend between them in a lateral direction. Also included is a user platform, carried by the base, for a user to stand on. The user platform may be positioned with respect to the ladder so that it is not located between a first plane that extends through the first side rail and is perpendicular to the lateral direction and a second plane that extends through the second side rail and is perpendicular to the lateral direction.

  20. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Bryan Scott [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Gough, Sean T. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-05

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  1. Design of a Platform for Parameter Computation of a Transformer Model Based on VC and ANSYS

    Institute of Scientific and Technical Information of China (English)

    王雪

    2011-01-01

    A secondary development of ANSYS is carried out, and a platform for computing the parameters of a transformer model is designed using the APDL language provided by ANSYS together with VC.

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  3. Simple technique to achieve a natural position of the head for cone beam computed tomography

    NARCIS (Netherlands)

    Damstra, Janalt; Fourie, Zacharias; Ren, Yijin

    2010-01-01

    We developed a modified laser level technique to record the natural position of the head in all three planes of space. This is a simple method for use with three-dimensional images and may be valuable in routine craniofacial assessment.

  4. Motivating Students through Positive Learning Experiences: A Comparison of Three Learning Designs for Computer Programming Courses

    Science.gov (United States)

    Lykke, Marianne; Coto, Mayela; Jantzen, Christian; Mora, Sonia; Vandel, Niels

    2015-01-01

    Based on the assumption that wellbeing, positive emotions and engagement influence motivation for learning, the aim of this paper is to provide insight into students' emotional responses to and engagement in different learning designs. By comparing students' reports on the experiential qualities of three different learning designs, their…

  5. Positron emission tomography with or without computed tomography in the primary staging of Hodgkin's lymphoma

    DEFF Research Database (Denmark)

    Hutchings, Martin; Loft, Annika; Hansen, Mads

    2006-01-01

    BACKGROUND AND OBJECTIVES: In order to receive the most appropriate therapy, patients with Hodgkin's lymphoma (HL) must be accurately stratified into different prognostic staging groups. Computed tomography (CT) plays a pivotal role in the conventional staging. The aim of the present study...... standard limits the reliability of accuracy calculations. RESULTS: FDG-PET would have upstaged 19% of patients and downstaged 5% of patients, leading to a different treatment in 9% of patients. For FDG-PET/CT, the corresponding figures are 17%, 5%, and 7%. In nodal regions, the sensitivity of FDG...

  6. Enhanced Authentication Mechanisms for Desktop Platform and Smart Phones

    Directory of Open Access Journals (Sweden)

    Dina EL Menshawy

    2012-10-01

    Full Text Available With hundreds of millions of people using computers and mobile devices all over the globe, these devices have an established position in modern society. Nevertheless, most of them rely on weak authentication techniques with passwords and PINs, which can be easily hacked. Stronger identification is therefore needed to ensure data security and privacy. In this paper, we explain the application of biometrics to computer and mobile platforms, and examine the possibility of using keystroke and mouse dynamics for computer authentication. Finally, we propose an authentication scheme for smart phones that shows positive results.
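Keystroke-dynamics features of the kind examined in this record are usually built from dwell and flight times. A minimal feature-extraction sketch, with made-up event timestamps (the event format and function name are illustrative, not from the paper):

```python
def keystroke_features(events):
    # events: list of (key, press_time_ms, release_time_ms) in typing order.
    # Dwell time = how long each key is held down; flight time = gap between
    # releasing one key and pressing the next. These two series are the
    # classic raw features for keystroke-dynamics authentication.
    dwell = [rel - press for _, press, rel in events]
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

sample = [("p", 0, 90), ("a", 150, 230), ("s", 300, 370)]
print(keystroke_features(sample))  # → ([90, 80, 70], [60, 70])
```

A verification scheme would compare these feature vectors against a stored profile (e.g. by distance thresholding); that comparison step is beyond this sketch.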

  7. Energy Equations for Computation of Parabolic-Trough Collector Efficiency Using Solar Position Coordinates

    Directory of Open Access Journals (Sweden)

    I. S. Sintali

    2014-10-01

    Full Text Available This paper presents the development of energy equations for computation of the efficiency of a Parabolic-Trough Collector (PTC) using solar coordinates. The energy equations include the universal time (UT), day (n), month (M), year (Y), ΔT, and the longitude and latitude in radians. The heliocentric longitude (H), geocentric global coordinates and local topocentric sun coordinates were considered in the modelling equations. The thermal efficiency of the PTC accounts for both the direct and reflected solar energy incident on the glass cover, as well as the thermal properties of the collector and the total energy losses in the system. The developed energy equations can be used to predict the performance (efficiency) of any PTC using the meteorological and radiative data of any particular location.
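As an illustration of the kind of solar-position input such energy equations depend on, the sketch below computes the cosine of the solar zenith angle from day-of-year, latitude, and solar time. It uses Cooper's textbook approximation for the declination, not the paper's own heliocentric/topocentric equations:

```python
import math

def declination_deg(n: int) -> float:
    # Cooper's approximation for solar declination (degrees) on day-of-year n
    return 23.45 * math.sin(math.radians(360.0 * (284 + n) / 365.0))

def hour_angle_deg(solar_time_h: float) -> float:
    # The sun moves 15 degrees per hour relative to solar noon
    return 15.0 * (solar_time_h - 12.0)

def cos_zenith(lat_deg: float, n: int, solar_time_h: float) -> float:
    lat = math.radians(lat_deg)
    dec = math.radians(declination_deg(n))
    omega = math.radians(hour_angle_deg(solar_time_h))
    return (math.sin(lat) * math.sin(dec)
            + math.cos(lat) * math.cos(dec) * math.cos(omega))

# Solar noon at the equator near the March equinox: sun nearly overhead
print(round(cos_zenith(0.0, 81, 12.0), 3))  # → 1.0
```

The incidence angle on the trough aperture, and hence the collector efficiency, would be derived from such coordinates together with the tracking geometry.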

  8. Implementation of Computer Vision on ARM9+Linux Platform

    Institute of Scientific and Technical Information of China (English)

    马智; 叶林; 葛俊锋

    2011-01-01

    This paper introduces a method for implementing computer vision in an embedded system, in the context of face detection on an ARM9 + Linux platform. The system is based on the OpenCV library, captures images through V4L, and implements image display and the man-machine interface with a ported MiniGUI. These issues are described in detail, and a simple method to improve the system's real-time performance is also proposed.

  9. A computational platform for robotized fluorescence microscopy (II): DNA damage, replication, checkpoint activation, and cell cycle progression by high-content high-resolution multiparameter image-cytometry.

    Science.gov (United States)

    Furia, Laura; Pelicci, Pier Giuseppe; Faretta, Mario

    2013-04-01

    Dissection of complex molecular-networks in rare cell populations is limited by current technologies that do not allow simultaneous quantification, high-resolution localization, and statistically robust analysis of multiple parameters. We have developed a novel computational platform (Automated Microscopy for Image CytOmetry, A.M.I.CO) for quantitative image-analysis of data from confocal or widefield robotized microscopes. We have applied this image-cytometry technology to the study of checkpoint activation in response to spontaneous DNA damage in nontransformed mammary cells. Cell-cycle profile and active DNA-replication were correlated to (i) Ki67, to monitor proliferation; (ii) phosphorylated histone H2AX (γH2AX) and 53BP1, as markers of DNA-damage response (DDR); and (iii) p53 and p21, as checkpoint-activation markers. Our data suggest the existence of cell-cycle modulated mechanisms involving different functions of γH2AX and 53BP1 in DDR, and of p53 and p21 in checkpoint activation and quiescence regulation during the cell-cycle. Quantitative analysis, event selection, and physical relocalization have then been employed to correlate protein expression at the population level with interactions between molecules, measured with Proximity Ligation Analysis, with unprecedented statistical relevance. Copyright © 2013 International Society for Advancement of Cytometry.

  10. Acceptability and use of a virtual support group for HIV-positive youth in Khayelitsha, Cape Town using the MXit social networking platform.

    Science.gov (United States)

    Henwood, Ruth; Patten, Gabriela; Barnett, Whitney; Hwang, Bella; Metcalf, Carol; Hacking, Damian; Wilkinson, Lynne

    2016-07-01

    Médecins Sans Frontières supports human immunodeficiency virus (HIV)-infected youth, aged 12-25 years, at a clinic in Khayelitsha, South Africa. Patients are enrolled in youth clubs and provided with a virtual chat room, using the cell-phone-based social networking platform MXit, to support members between monthly/bimonthly club meetings. The acceptability and uptake of MXit were assessed. MXit was facilitated by lay counsellors, was password protected, and participants could enter and leave at will. Club members were asked to complete self-administered questionnaires and participate in two focus-group discussions. In total, 60 club members completed the questionnaire, and 12 participated in the focus groups. Fifty-eight percent were aged 23-25 years, 63% were female and 83% had a cell phone. Sixty percent had used MXit before, with 38% having used it in the past month. Sixty-five percent were aware of the chat room and 39% knew how to access it. Thirty-four percent used the chat room at least once, 20% had visited the chat room in the past month, and 29% had used MXit to have private conversations with other club members. Fifty-seven percent used the chat room to get advice, and 84% of all respondents felt that offering a service outside the youth club meetings was important and would like to see one continue. The cost of using social media platforms was an issue for some, as was the need for anonymity. Preference for other platforms, logistical obstacles, or loss of interest contributed to non-use. Reported usage of the MXit chat room was low, but participants indicated acceptance of the programme and a desire to interact with their peers through social media. Suggestions to improve the platform included accessible chat histories, using more popular platforms such as Facebook or WhatsApp, and having topical discussions where pertinent information for youth is provided.

  11. HPC - Platforms Penta Chart

    Energy Technology Data Exchange (ETDEWEB)

    Trujillo, Angelina Michelle [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-10-08

    Strategy, Planning, Acquiring: very large scale computing platforms come and go, and planning for immensely scalable machines often precedes actual procurement by 3 years; procurement can take another year or more. Integration: after acquisition, machines must be integrated into the computing environments at LANL, connected to scalable storage via large-scale storage networking, and assured of correct and secure operation. Management and Utilization: ongoing operations, maintenance, and troubleshooting of the hardware and systems software at massive scale are required.

  12. Study on Application of Open Source IaaS Platform in Complex Computer Experiment Environment

    Institute of Scientific and Technical Information of China (English)

    徐化祥; 陈林; 陈浩

    2015-01-01

    This paper presents a compatibility testing method for open source IaaS platforms based on multi-source information resource scheduling, applied to a computer experiment environment. A multi-source information resource scheduling algorithm is designed and cloud-computing data clustering analysis is performed on the open source IaaS cloud platform, improving compatibility across the differing features of open source IaaS platforms, enhancing the compatibility of the experimental platform, and optimizing data processing and analysis performance. Simulation results show that, with this method of data and information processing on an open source IaaS cloud platform, multi-source information resource scheduling performance and compatibility are good, and the method can be effectively applied in complex computer data processing experiment environments.

  13. Position of document holder and work related risk factors for neck pain among computer users: a narrative review.

    Science.gov (United States)

    Ambusam, S; Baharudin, O; Roslizawati, N; Leonard, J

    2015-01-01

    A document holder is used as a remedy for occupational neck pain among computer users. The effects of the document holder, together with other work-related risk factors in computer workstations, deserve attention. This article reviews current knowledge on the optimal location of the document holder in computer use and the associated work-related factors that may contribute to neck pain. A literature search was conducted for articles published from January 1990 to January 2014 in both the Science Direct and PubMed databases. The Medical Subject Headings (MeSH) keywords for the search were: neck muscle OR head posture OR muscle tension OR muscle activity OR work related disorders OR neck pain AND/OR document location OR document holder OR source document OR copy screen holder. A document holder placed lateral to the screen was most preferred for reducing neck discomfort among occupational typists; a document without a holder, placed flat on the work surface, was least preferred. Head flexion and muscle activity increase when the document is placed flat on the surface rather than on a document holder. Work-related factors such as static posture, repetitive movement, prolonged sitting and awkward positions were risk factors for chronic neck pain. This review highlights the optimal location of the document holder for computer users to reduce neck pain, and emphasizes the importance of work-related risk factors for neck pain in occupational typists in clinical management.

  14. A computational study of flow past three unequal sized square cylinders at different positions

    Science.gov (United States)

    Islam, Shams-ul; Shigri, Sehrish Hassan; Ying, Zhou Chao; Akbar, Tanvir; Majeed, Danish

    2017-03-01

    The flow past three unequal sized side-by-side square cylinders placed in different vertical configurations is investigated numerically using the lattice Boltzmann method for the Reynolds number Re = 160 and different values of the gap spacing between the cylinders, g (ranging between 0.5 and 5). The present study is devoted to systematic investigation of the effects of cylinder position on the flow patterns. The reported results reveal that the flow patterns change significantly with the variation of cylinder configuration. Depending on the cylinder positions, we observed chaotic, base bleed, binary vortex street, modulated synchronized, in-phase vortex shedding, antiphase vortex shedding, and in-antiphase vortex shedding flow patterns. The characteristics of the flow patterns are discussed with the aid of time history analysis of drag and lift coefficients, power spectra analysis of lift coefficients and vorticity contour visualization. The study also includes a detailed discussion of the aerodynamic forces, such as the mean drag coefficient, Strouhal number and root-mean-square values of drag and lift coefficients. Our results show that the flow patterns behind three unequal cylinders are distinctly different from the flow past equisized square cylinders placed side-by-side. In the chaotic flow pattern the secondary cylinder interaction frequency plays an important role, especially at the second, third and fourth configurations for all gap spacings. At larger gap spacings for the first and sixth configurations, the primary vortex shedding frequency plays a dominant role and the jet effect between the cylinders almost diminishes.
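The power-spectra analysis of lift coefficients mentioned above amounts to finding the dominant shedding frequency, from which the Strouhal number follows as St = f·d/U. A brute-force DFT sketch over a synthetic lift trace (the signal, time step, and unit cylinder size/velocity are made up for the example):

```python
import math

def dominant_frequency(signal, dt):
    # Brute-force DFT magnitude scan; returns the frequency of the largest
    # non-zero-frequency peak (sufficient for a clean periodic lift signal).
    n = len(signal)
    best_k, best_mag = 1, 0.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        im = sum(s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        mag = re * re + im * im
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k / (n * dt)

# Synthetic lift-coefficient trace with nondimensional shedding frequency 0.16
dt, n = 0.25, 512
cl = [math.sin(2 * math.pi * 0.16 * i * dt) for i in range(n)]
f = dominant_frequency(cl, dt)
st = f * 1.0 / 1.0  # St = f*d/U with unit cylinder size and free-stream velocity
print(round(st, 2))  # → 0.16
```

In a real post-processing pipeline an FFT library would replace the O(n²) scan; the structure of the analysis is the same.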

  15. An automatic colour-based computer vision algorithm for tracking the position of piglets

    Energy Technology Data Exchange (ETDEWEB)

    Navarro-Jover, J. M.; Alcaniz-Raya, M.; Gomez, V.; Balasch, S.; Moreno, J. R.; Grau-Colomer, V.; Torres, A.

    2009-07-01

    Artificial vision is a powerful observation tool for research in the field of livestock production. Based on the search and recognition of colour spots in images, a digital image processing system was developed that permits detection of the position of piglets in a farrowing pen. To this end, 24,000 images were captured over five takes (days), with a five-second interval between every other image. The nine piglets in a litter were marked on their backs and sides with spray paints of different colours, chosen to lie at a considerable distance from one another in RGB space. The programme requires the user to introduce the colour patterns to be found, and the output is an ASCII file with the positions (column X, line Y) of each of these marks within the analysed image. This information may be extremely useful for further applications in the study of animal behaviour and welfare parameters (huddling, activity, suckling, etc.). The software initially segments the image in the RGB colour space to separate the colour marks from the rest of the image, and then recognises the colour patterns using another colour space [B/(R+G+B), (G-R), (B-G)] more suitable for this purpose; this additional colour space was obtained by testing different colour combinations derived from R, G and B. The statistical evaluation of the programme's performance revealed an overall 72.5% piglet detection rate, with 89.1% of this total correctly detected. (Author) 33 refs.
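The colour-space transformation described above is simple to reproduce. The sketch below maps an RGB pixel into [B/(R+G+B), (G-R), (B-G)]; the guard against division by zero for a pure-black pixel is our addition, not from the paper:

```python
def tracking_space(r: float, g: float, b: float):
    # Map an RGB pixel into the colour space used for mark recognition:
    # [B/(R+G+B), (G-R), (B-G)]. The chromatic ratio is undefined for
    # pure black (R+G+B == 0), so guard the division.
    total = r + g + b
    ratio = b / total if total else 0.0
    return (ratio, g - r, b - g)

print(tracking_space(0, 0, 255))  # a saturated blue mark → (1.0, 0, 255)
```

Segmentation would then threshold these three components against each stored colour pattern, which is less sensitive to brightness changes than thresholding raw RGB.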

  16. Correlation between hyoid bone position and airway dimensions in Chinese adolescents by cone beam computed tomography analysis.

    Science.gov (United States)

    Jiang, Y-Y

    2016-07-01

    This study aimed to investigate the correlation between upper airway dimensions and hyoid bone position in Chinese adolescents based on cone beam computed tomography (CBCT) images. CBCT images from a total of 254 study subjects were included. The upper airway and hyoid bone parameters were measured by Materialise's interactive medical image control system (MIMICS) v.16.01 (Materialise, Leuven, Belgium). The airway dimensions were evaluated in terms of volume, cross-sectional area (CSA), mean CSA, length, anteroposterior dimension of the cross-section (AP), lateral dimension of the cross-section (LAT), and LAT/AP ratio. The hyoid bone position was evaluated using eight linear parameters and two angular parameters. Facial characteristics were evaluated using three linear parameters and three angular parameters. Most hyoid bone position parameters (especially the distance between the hyoid bone and hard palate) were significantly associated with most airway dimension parameters. Significant correlations were also observed between the different facial characteristic parameters and hyoid bone position parameters. Most airway dimension parameters showed significant correlations with linear facial parameters, but they displayed significant correlations with only a few angular facial parameters. These findings provide an understanding of the static relationship between the hyoid bone position and airway dimensions, which may serve as a reference for surgeons before orthodontic or orthognathic surgery.
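Correlation studies of this kind typically report Pearson coefficients between paired measurements. A minimal stdlib implementation, with hypothetical hyoid-palate distances and airway volumes for illustration (not the study's data):

```python
import math

def pearson(x, y):
    # Sample Pearson correlation coefficient between two equal-length series.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical paired measurements: hyoid-palate distance (mm) vs airway volume (cm^3)
dist = [55, 60, 62, 68, 71]
vol = [18, 21, 22, 26, 28]
print(round(pearson(dist, vol), 3))
```

A strongly positive coefficient (close to 1) would correspond to the kind of significant association the abstract reports between the hyoid-palate distance and airway dimensions.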

  17. Peripheral quantitative computed tomography in children and adolescents: the 2007 ISCD Pediatric Official Positions.

    Science.gov (United States)

    Zemel, Babette; Bass, Shona; Binkley, Teresa; Ducher, Gaele; Macdonald, Heather; McKay, Heather; Moyer-Mileur, Laurie; Shepherd, John; Specker, Bonny; Ward, Kate; Hans, Didier

    2008-01-01

    Peripheral quantitative computed tomography (pQCT) has mainly been used as a research tool in children. To evaluate the clinical utility of pQCT and formulate recommendations for its use in children, the International Society of Clinical Densitometry (ISCD) convened a task force to review the literature and propose areas of consensus and future research. The types of pQCT technology available, the clinical application of pQCT for bone health assessment in children, the important elements to be included in a pQCT report, and quality control monitoring techniques were evaluated. The review revealed a lack of standardization of pQCT techniques, and a paucity of data regarding differences between pQCT manufacturers, models and software versions and their impact in pediatric assessment. Measurement sites varied across studies. Adequate reference data, a critical element for interpretation of pQCT results, were entirely lacking, although some comparative data on healthy children were available. The elements of the pQCT clinical report and quality control procedures are similar to those recommended for dual-energy X-ray absorptiometry. Future research is needed to establish evidence-based criteria for the selection of the measurement site, scan acquisition and analysis parameters, and outcome measures. Reference data that sufficiently characterize the normal range of variability in the population also need to be established.

  18. Understanding the connection between epigenetic DNA methylation and nucleosome positioning from computer simulations.

    Directory of Open Access Journals (Sweden)

    Guillem Portella

    Full Text Available Cytosine methylation is one of the most important epigenetic marks that regulate the process of gene expression. Here, we have examined the effect of epigenetic DNA methylation on nucleosomal stability using molecular dynamics simulations and elastic deformation models. We found that methylation of CpG steps destabilizes nucleosomes, especially when these are placed in sites where the DNA minor groove faces the histone core. The larger stiffness of methylated CpG steps is a crucial factor behind the decrease in nucleosome stability. Methylation changes the positioning and phasing of the nucleosomal DNA, altering the accessibility of DNA to regulatory proteins, and accordingly gene functionality. Our theoretical calculations highlight a simple physics-based explanation for the foundations of epigenetic signaling.

  19. Research and Design of Advanced Computing Platform Based on VPX Bus

    Institute of Scientific and Technical Information of China (English)

    陈志列; 陈超; 刘志永; 李琴

    2012-01-01

    Nowadays, with ever higher requirements for high-performance, high-bandwidth and harsh-environment-resistant computing platforms, an advanced computing platform based on the versatile protocol switch (VPX) bus is proposed. This paper introduces the development history of the VPX bus, indicates the critical issues that an advanced computing platform based on the VPX bus needs to solve, presents the bus architecture, discusses the design in detail, and reports long-duration tests under different environments. The test results indicate that the platform offers high performance, high bandwidth, excellent stability, strong anti-interference capability and easy maintenance; it is therefore an industrial computing platform appropriate for severe environments.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  1. Computer aided analysis of additional chromosome aberrations in Philadelphia chromosome positive acute lymphoblastic leukaemia using a simplified computer readable cytogenetic notation

    Directory of Open Access Journals (Sweden)

    Mohr Brigitte

    2003-01-01

    Full Text Available Abstract Background The analysis of complex cytogenetic databases of distinct leukaemia entities may help to detect rare recurring chromosome aberrations, minimal common regions of gains and losses, and also hot spots of genomic rearrangements. The patterns of the karyotype alterations may provide insights into the genetic pathways of disease progression. Results We developed a simplified computer readable cytogenetic notation (SCCN) by which chromosome findings are normalised at a resolution of 400 bands. Lost or gained chromosomes or chromosome segments are specified in detail, and ranges of chromosome breakpoint assignments are recorded. Software modules were written to summarise the recorded chromosome changes with regard to the respective chromosome involvement. To assess the degree of karyotype alterations, the ploidy levels and the numbers of numerical and structural changes were recorded separately and summarised in a complex karyotype aberration score (CKAS). The SCCN and CKAS were used to analyse the extent and the spectrum of additional chromosome aberrations in 94 patients with Philadelphia chromosome positive (Ph-positive) acute lymphoblastic leukaemia (ALL) and secondary chromosome anomalies. Dosage changes of chromosomal material represented 92.1% of all additional events. Recurring regions of chromosome losses were identified. Structural rearrangements affecting (peri)centromeric chromosome regions were recorded in 24.6% of the cases. Conclusions SCCN and CKAS provide unifying elements between karyotypes and computer processable data formats. They proved to be useful in the investigation of additional chromosome aberrations in Ph-positive ALL, and may represent a step towards full automation of the analysis of large and complex karyotype databases.
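The CKAS summarises ploidy levels together with counts of numerical and structural changes, but the abstract does not give its exact weighting. The sketch below therefore uses purely illustrative weights (weighting ploidy shifts above single-chromosome events) to show the shape of such a score, not the paper's actual definition:

```python
def ckas(ploidy_shift: int, numerical: int, structural: int) -> int:
    # Hypothetical complex karyotype aberration score: a ploidy-level shift
    # is weighted more heavily than a single numerical or structural change.
    # Illustrative weights only; the paper's actual CKAS may differ.
    return 3 * abs(ploidy_shift) + numerical + structural

# A Ph-positive ALL karyotype with one extra chromosome and two structural events
print(ckas(ploidy_shift=0, numerical=1, structural=2))  # → 3
```

Such a single scalar lets large karyotype databases be sorted and compared by overall degree of alteration before any region-level analysis.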

  2. Arthroscopic Latarjet procedure: is optimal positioning of the bone block and screws possible? A prospective computed tomography scan analysis.

    Science.gov (United States)

    Kany, Jean; Flamand, Olivier; Grimberg, Jean; Guinand, Régis; Croutzet, Pierre; Amaravathi, Rajkumar; Sekaran, Padmanaban

    2016-01-01

    We hypothesized that the arthroscopic Latarjet procedure could be performed with accurate bone block positioning and screw fixation with a similar rate of complications to the open Latarjet procedure. In this prospective study, 105 shoulders (104 patients) underwent the arthroscopic Latarjet procedure performed by the same senior surgeon. The day after surgery, an independent surgeon examiner performed a multiplanar bidimensional computed tomography scan analysis. We also evaluated our learning curve by comparing 2 chronologic periods (30 procedures performed in each period), separated by an interval during which 45 procedures were performed. Of the 105 shoulders included in the study, 95 (90.5%) (94 patients) were evaluated. The coracoid graft was accurately positioned relative to the equator of the glenoid surface in 87 of 95 shoulders (91.5%). Accurate bone-block positioning on the axial view with "circle" evaluation was obtained for 77 of 95 shoulders (81%). This procedure was performed in a lateralized position in 7 of 95 shoulders (7.3%) and in a medialized position in 11 shoulders (11.6%). The mean screw angulation with the glenoid surface was 21°. One patient had transient axillary nerve palsy. Of the initial 104 patients, 3 (2.8%) underwent revision. The analysis of our results indicated that the screw-glenoid surface angle significantly predicted the accuracy of the bone-block positioning (P = .001). Our learning curve estimates showed that, compared with our initial period, the average surgical time decreased, and the risk of lateralization showed a statistically significant decrease during the last period (P = .006). This study showed that accurate positioning of the bone block onto the anterior aspect of the glenoid is possible, safe, and reproducible with the arthroscopic Latarjet procedure without additional complications compared with open surgery. Copyright © 2016 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc

  3. A reference station-based GNSS computing mode to support unified precise point positioning and real-time kinematic services

    Science.gov (United States)

    Feng, Yanming; Gu, Shengfeng; Shi, Chuang; Rizos, Chris

    2013-11-01

    Currently, the GNSS computing modes are of two classes: network-based data processing and user receiver-based processing. A GNSS reference receiver station essentially contributes raw measurement data in either the RINEX file format or as real-time data streams in the RTCM format. Very little computation is carried out by the reference station. The existing network-based processing modes, regardless of whether they are executed in real-time or post-processed modes, are centralised or sequential. This paper describes a distributed GNSS computing framework that incorporates three GNSS modes: reference station-based, user receiver-based and network-based data processing. Raw data streams from each GNSS reference receiver station are processed in a distributed manner, i.e., either at the station itself or at a hosting data server/processor, to generate station-based solutions, or reference receiver-specific parameters. These may include precise receiver clock, zenith tropospheric delay, differential code biases, ambiguity parameters, ionospheric delays, as well as line-of-sight information such as azimuth and elevation angles. Covariance information for estimated parameters may also be optionally provided. In such a mode the nearby precise point positioning (PPP) or real-time kinematic (RTK) users can directly use the corrections from all or some of the stations for real-time precise positioning via a data server. At the user receiver, PPP and RTK techniques are unified under the same observation models, and the distinction is how the user receiver software deals with corrections from the reference station solutions and the ambiguity estimation in the observation equations. Numerical tests demonstrate good convergence behaviour for differential code bias and ambiguity estimates derived individually with single reference stations. With station-based solutions from three reference stations within distances of 22-103 km the user receiver positioning results, with various
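As a toy illustration of consuming station-based corrections at the user receiver, the sketch below applies satellite/receiver clock terms and a mapped zenith tropospheric delay to a raw pseudorange. The sign conventions, values, and the reduction to four terms are illustrative assumptions; the paper's observation models carry many more parameters (code biases, ionosphere, ambiguities):

```python
def corrected_pseudorange(raw_pr_m, sat_clock_m, station_clock_m, ztd_m, mapping):
    # Apply reference-station-derived corrections to a raw pseudorange (metres):
    # satellite and receiver clock terms plus a zenith tropospheric delay
    # scaled by an elevation-dependent mapping function. Illustrative only.
    return raw_pr_m + sat_clock_m - station_clock_m - ztd_m * mapping

print(round(corrected_pseudorange(22_000_105.0, 3.0, 1.5, 2.4, 1.25), 1))
```

In the distributed framework described above, each of these correction terms would arrive from a nearby station's solution rather than being estimated from scratch at the user receiver.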

  4. "Mixed Positioning" Control Strategy Research Based on Android Platform for Distributed Embedded Software

    Institute of Scientific and Technical Information of China (English)

    代敏; 张晶

    2012-01-01

    The design and implementation of functional modules for distributed embedded software is a current research focus. Android, a platform for distributed mobile intelligent terminals, integrates a map module and a rich API library and can effectively support the implementation of positioning functions for distributed embedded environments. Two positioning techniques are common: simulator positioning and map positioning. Simulator positioning is not real positioning in a strict sense, and map positioning can only locate the user's own device. To overcome the shortcomings of both, an SMS-based "mixed positioning" method is proposed, developed on the Android platform, which combines the idea of simulator positioning with the presentation of map positioning. "Mixed positioning" extends self-positioning to the positioning of others and is an effective improvement of the positioning control strategy.

  5. Patient Position Verification and Corrective Evaluation Using Cone Beam Computed Tomography (CBCT) in Intensity modulated Radiation Therapy

    Energy Technology Data Exchange (ETDEWEB)

    Do, Gyeong Min; Jeong, Deok Yang; Kim, Young Bum [Dept. of Radiation Oncology, Korea University Guro Hospital, Seoul (Korea, Republic of)

    2009-09-15

    Cone beam computed tomography (CBCT) using an on-board imager (OBI) can check movement and setup error in patient position and target volume by comparison with the computer simulation treatment image in real time during patient treatment. This study therefore aimed to check changes and movement in patient position and target volume using CBCT in IMRT, calculate differences from the treatment plan, correct the position using an automated match system, test the accuracy of the position correction using an electronic portal imaging device (EPID), and examine the usefulness of CBCT in IMRT and the accuracy of the automatic match system. The subjects of this study were 3 head-and-neck patients and 1 pelvis patient sampled from IMRT patients treated in our hospital. In order to investigate the movement of the treatment position and the resultant displacement of the irradiated volume, we took CBCT using the OBI mounted on the linear accelerator. Before each IMRT treatment, we took CBCT and checked differences from the treatment plan, coordinate by coordinate, by comparing it with the CT simulation image. We then made corrections through the automatic 3D/3D match system to match the treatment plan, and verified and evaluated using the electronic portal imaging device. When CBCT was compared with the CT simulation image before treatment, the average difference by coordinate in the head and neck was 0.99 mm vertically, 1.14 mm longitudinally, 4.91 mm laterally, and 1.07 degrees in the rotational direction, showing somewhat insignificant differences by part. In testing after correction, when the image from the electronic portal imaging device was compared with the DRR image, it was found that correction had been made accurately, with error less than 0.5 mm. By comparing a CBCT image before treatment with a 3D image reconstructed into a volume, instead of a 2D image, for the patient's setup error and changes in the position of the organs and the target, we could measure and
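The per-axis differences reported above combine into a single 3D translational setup-error vector. A minimal sketch (the example offsets mirror the mean head-and-neck differences quoted in the abstract; the function and coordinate ordering are illustrative):

```python
def setup_error(planned, measured):
    # 3D translational setup error (mm) between the planned isocentre and
    # the position measured on CBCT; returns the per-axis shift and its
    # Euclidean magnitude, which an automatic match system would correct.
    shift = [m - p for p, m in zip(planned, measured)]
    magnitude = sum(d * d for d in shift) ** 0.5
    return shift, magnitude

# Vertical / longitudinal / lateral offsets similar in size to those reported
shift, magnitude = setup_error((0.0, 0.0, 0.0), (0.99, 1.14, 4.91))
print([round(d, 2) for d in shift], round(magnitude, 2))
```

Rotational error (the 1.07 degrees quoted above) needs a rigid-body 3D/3D registration rather than this simple vector difference.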

  6. Toward correcting drift in target position during radiotherapy via computer-controlled couch adjustments on a programmable Linac

    Energy Technology Data Exchange (ETDEWEB)

    McNamara, Joseph E.; Regmi, Rajesh; Michael Lovelock, D.; Yorke, Ellen D.; Mageras, Gig S. [Department of Medical Physics, Memorial Sloan-Kettering Cancer Center, New York, New York 10065 (United States); Goodman, Karyn A.; Rimner, Andreas [Department of Radiation Oncology, Memorial Sloan-Kettering Cancer Center, New York, New York 10065 (United States); Mostafavi, Hassan [Ginzton Technology Center, Varian Medical Systems, Palo Alto, California 94304 (United States)

    2013-05-15

    Purpose: Real-time tracking of respiratory target motion during radiation therapy is technically challenging, owing to rapid and possibly irregular breathing variations. The authors report on a method to predict and correct respiration-averaged drift in target position by means of couch adjustments on an accelerator equipped with such capability. Methods: Dose delivery is broken up into a sequence of 10 s field segments, each followed by a couch adjustment based on analysis of breathing motion from an external monitor as a surrogate of internal target motion. Signal averaging over three respiratory cycles yields a baseline representing target drift. A Kalman filter predicts the baseline position 5 s in advance, for determination of the couch correction. The method's feasibility is tested with a motion phantom programmed according to previously recorded patient signals. Computed couch corrections are preprogrammed into a research mode of an accelerator capable of computer-controlled couch translations synchronized with the motion phantom. The method's performance is evaluated with five cases recorded during hypofractionated treatment and five from respiration-correlated CT simulation, using a root-mean-squared deviation (RMSD) of the baseline from the treatment planned position. Results: RMSD is reduced in all 10 cases, from a mean of 4.9 mm (range 2.7-9.4 mm) before correction to 1.7 mm (range 0.7-2.3 mm) after correction. Treatment time is increased ~5% relative to that for no corrections. Conclusions: This work illustrates the potential for reduction in baseline respiratory drift with periodic adjustments in couch position during treatment. Future treatment machine capabilities will enable the use of 'on-the-fly' couch adjustments during treatment.
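The baseline predictor described above can be sketched as a scalar constant-velocity Kalman filter that extrapolates the respiration-averaged position 5 s ahead. The tuning constants and the synthetic 0.1 mm/s drift below are assumptions for illustration, not the authors' values:

```python
class BaselineKalman:
    """Constant-velocity Kalman filter for the respiration-averaged baseline.
    A minimal sketch; the noise tuning (q, r) is assumed, not the paper's."""

    def __init__(self, dt=1.0, q=0.01, r=0.25):
        self.dt, self.q, self.r = dt, q, r
        self.x = [0.0, 0.0]                  # state: [position, velocity]
        self.p = [[1.0, 0.0], [0.0, 1.0]]    # state covariance

    def update(self, z):
        dt, q, r = self.dt, self.q, self.r
        # Predict: x' = F x, P' = F P F^T + Q, with F = [[1, dt], [0, 1]]
        x0 = self.x[0] + dt * self.x[1]
        x1 = self.x[1]
        p = self.p
        p00 = p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + q
        p01 = p[0][1] + dt * p[1][1]
        p10 = p[1][0] + dt * p[1][1]
        p11 = p[1][1] + q
        # Update with position-only measurement z (H = [1, 0])
        s = p00 + r
        k0, k1 = p00 / s, p10 / s
        y = z - x0
        self.x = [x0 + k0 * y, x1 + k1 * y]
        self.p = [[(1 - k0) * p00, (1 - k0) * p01],
                  [p10 - k1 * p00, p11 - k1 * p01]]

    def predict_ahead(self, horizon):
        # Extrapolate the baseline position `horizon` seconds into the future
        return self.x[0] + horizon * self.x[1]

kf = BaselineKalman()
for t in range(30):              # synthetic baseline drifting at 0.1 mm/s
    kf.update(0.1 * t)
print(round(kf.predict_ahead(5.0), 1))
```

After the filter converges on the drift rate, the 5 s look-ahead gives the couch-correction target before each field segment.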

  7. [Lens platform].

    Science.gov (United States)

    Łukaszewska-Smyk, Agnieszka; Kałuzny, Józef

    2010-01-01

    The lens platform defines lens structure and lens material. The evolution of lenses comprises changes in their shape, angulation of the haptics, and the transition from three-piece to one-piece lenses. Lenses fall into two categories: rigid (PMMA) and soft (silicone, acrylic, Collamer). The main lens materials are polymers (hydrophilic and hydrophobic). The lens platform affects biocompatibility, bioadhesion, stability of the lens in the capsule, the degree of PCO development, and sensitivity to laser damage.

  8. Platform contents

    OpenAIRE

    Renault, Régis

    2014-01-01

    A monopoly platform hosts advertisers who compete on a market for horizontally differentiated products. These products may be either mass market products that appeal broadly to the entire consumer population or niche products that are tailored to the tastes of some particular group. Consumers search sequentially through ads incurring a surfing cost of moving to the next ad. They may click on an ad at some cost, which provides all relevant information and the opportunity to buy. The platform c...

  9. The vertical monitor position for presbyopic computer users with progressive lenses: how to reach clear vision and comfortable head posture.

    Science.gov (United States)

    Weidling, Patrick; Jaschinski, Wolfgang

    2015-01-01

    When presbyopic employees are wearing general-purpose progressive lenses, they have clear vision only with a lower gaze inclination to the computer monitor, given the head assumes a comfortable inclination. Therefore, in the present intervention field study the monitor position was lowered, also with the aim to reduce musculoskeletal symptoms. A comparison group comprised users of lenses that do not restrict the field of clear vision. The lower monitor positions led the participants to lower their head inclination, which was linearly associated with a significant reduction in musculoskeletal symptoms. However, for progressive lenses a lower head inclination means a lower zone of clear vision, so that clear vision of the complete monitor was not achieved; rather, the monitor would have needed to be placed even lower. The procedures of this study may be useful for optimising the individual monitor position depending on the comfortable head and gaze inclination and the vertical zone of clear vision of progressive lenses. For users of general-purpose progressive lenses, it is suggested that low monitor positions allow for clear vision at the monitor and for a physiologically favourable head inclination. Employees may improve their workplace using a flyer providing ergonomic-optometric information.

  10. Computational fluid dynamics analysis of drag and convective heat transfer of individual body segments for different cyclist positions.

    Science.gov (United States)

    Defraeye, Thijs; Blocken, Bert; Koninckx, Erwin; Hespel, Peter; Carmeliet, Jan

    2011-06-03

    This study aims at investigating drag and convective heat transfer for cyclists at a high spatial resolution. Such an increased spatial resolution, when combined with flow-field data, can increase insight in drag reduction mechanisms and in the thermo-physiological response of cyclists related to heat stress and hygrothermal performance of clothing. Computational fluid dynamics (steady Reynolds-averaged Navier-Stokes) is used to evaluate the drag and convective heat transfer of 19 body segments of a cyclist for three different cyclist positions. The influence of wind speed on the drag is analysed, indicating a pronounced Reynolds number dependency on the drag, where more streamlined positions show a dependency up to higher Reynolds numbers. The drag and convective heat transfer coefficient (CHTC) of the body segments and the entire cyclist are compared for all positions at racing speeds, showing high drag values for the head, legs and arms and high CHTCs for the legs, arms, hands and feet. The drag areas of individual body segments differ markedly for different cyclist positions whereas the convective heat losses of the body segments are found to be less sensitive to the position. CHTC-wind speed correlations are derived, in which the power-law exponent does not differ significantly for the individual body segments for all positions, where an average value of 0.84 is found. Similar CFD studies can be performed to assess drag and CHTCs at a higher spatial resolution for applications in other sport disciplines, bicycle equipment design or to assess convective moisture transfer.
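The CHTC-wind speed correlations reported here take a power-law form, CHTC = a·U^b, with an average exponent of b = 0.84 across body segments. A hedged sketch of how such an exponent can be recovered by ordinary least squares in log-log space follows; the data are synthetic, generated with the reported exponent and an arbitrary assumed coefficient a = 8.0.

```python
import math

def fit_power_law(speeds, chtcs):
    """Fit CHTC = a * U**b by linear regression on log-transformed data."""
    xs = [math.log(u) for u in speeds]
    ys = [math.log(h) for h in chtcs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Slope of the log-log regression line is the power-law exponent b.
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic samples at racing-relevant wind speeds (m/s); illustrative only.
speeds = [3.0, 5.0, 8.0, 12.0, 16.0]
chtcs = [8.0 * u ** 0.84 for u in speeds]
a, b = fit_power_law(speeds, chtcs)
```

With noise-free synthetic data the fit recovers the generating exponent exactly; with measured CHTC data the same regression gives the per-segment exponents the study compares.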

  11. Modeling and computational analysis of the hemodynamic effects of crossing the limbs in an aortic endograft ("ballerina" position).

    Science.gov (United States)

    Georgakarakos, Efstratios; Xenakis, Antonios; Manopoulos, Christos; Georgiadis, George S; Tsangaris, Sokrates; Lazarides, Miltos K

    2012-08-01

    To evaluate the displacement forces acting on an aortic endograft when the iliac limbs are crossed ("ballerina" position). An endograft model was computationally reconstructed based on data from a patient whose infrarenal aortic aneurysm had an endovascular stent-graft implanted with the iliac limbs crossed. Computational fluid dynamics analysis determined the maximum displacement force on the endograft and separately on the bifurcation and iliac limbs. Its analogue model was reconstructed for comparison, assuming the neck, main body, and total length constant but considering the iliac limbs to be deployed in the usual bifurcated mode. Calculations were repeated after developing "idealized" models of both the bifurcated and crossed-limbs endografts with straight main bodies and no neck angulation or curved iliac segments. The vector of the total force was directed anterocaudal for both the typical bifurcated and the crossed-limbs configurations, with the forces in the latter slightly reduced and the vertical component accounting for most of the force in both configurations. Idealized crossed-limbs and bifurcated configurations differed only in the force on the iliac limbs, but this difference disappeared in the realistic models. Crossing of the iliac limbs can slightly affect the direction of the displacement forces. Although this configuration can exert larger forces on the limbs than in the bifurcated mode, this effect can be blunted by concomitant modifications in the geometry of the main body and other parts of the endograft, making its hemodynamic behavior resemble that of a typically positioned endograft.

  12. Windows Azure Platform

    CERN Document Server

    Redkar, Tejaswi

    2010-01-01

    The Azure Services Platform is a brand-new cloud-computing technology from Microsoft. It is composed of four core components-Windows Azure, .NET Services, SQL Services, and Live Services-each with a unique role in the functioning of your cloud service. It is the goal of this book to show you how to use these components, both separately and together, to build flawless cloud services. At its heart, Windows Azure Platform is a down-to-earth, code-centric book. This book aims to show you precisely how the components are employed and to demonstrate the techniques and best practices you need to know.

  13. Using Ipsilateral Motor Signals in the Unaffected Cerebral Hemisphere as a Signal Platform for Brain Computer Interfaces in Hemiplegic Stroke Survivors

    Science.gov (United States)

    Bundy, David T.; Wronkiewicz, Mark; Sharma, Mohit; Moran, Daniel W.; Corbetta, Maurizio; Leuthardt, Eric C.

    2012-01-01

    Objective Brain computer interface (BCI) systems have emerged as a method to restore function and enhance communication in motor impaired patients. To date, this has been primarily applied to patients who have a compromised motor outflow due to spinal cord dysfunction, but an intact and functioning cerebral cortex. The cortical physiology associated with movement of the contralateral limb has typically been the signal substrate that has been used as a control signal. While this is an ideal control platform in patients with an intact motor cortex, these signals are lost after a hemispheric stroke. Thus, a different control signal is needed that could provide control capability for a patient with a hemiparetic limb. Previous studies have shown that there is a distinct cortical physiology associated with ipsilateral, or same sided, limb movements. Thus far, it was unknown whether stroke survivors could intentionally and effectively modulate this ipsilateral motor activity from their unaffected hemisphere. Therefore, this study seeks to evaluate whether stroke survivors could effectively utilize ipsilateral motor activity from their unaffected hemisphere to achieve this BCI control. Approach To investigate this possibility, electroencephalographic (EEG) signals were recorded from four chronic hemispheric stroke patients as they performed (or attempted to perform) real and imagined hand tasks using either their affected or unaffected hand. Following performance of the screening task, the ability of patients to utilize a BCI system was investigated during on-line control of a 1-dimensional control task. Main Results Significant ipsilateral motor signals (associated with movement intentions of the affected hand) in the unaffected hemisphere, which were found to be distinct from rest and contralateral signals, were identified and subsequently used for a simple online BCI control task. We demonstrate here for the first time that EEG signals from the unaffected hemisphere

  14. Using ipsilateral motor signals in the unaffected cerebral hemisphere as a signal platform for brain-computer interfaces in hemiplegic stroke survivors

    Science.gov (United States)

    Bundy, David T.; Wronkiewicz, Mark; Sharma, Mohit; Moran, Daniel W.; Corbetta, Maurizio; Leuthardt, Eric C.

    2012-06-01

    Brain-computer interface (BCI) systems have emerged as a method to restore function and enhance communication in motor impaired patients. To date, this has been applied primarily to patients who have a compromised motor outflow due to spinal cord dysfunction, but an intact and functioning cerebral cortex. The cortical physiology associated with movement of the contralateral limb has typically been the signal substrate that has been used as a control signal. While this is an ideal control platform in patients with an intact motor cortex, these signals are lost after a hemispheric stroke. Thus, a different control signal is needed that could provide control capability for a patient with a hemiparetic limb. Previous studies have shown that there is a distinct cortical physiology associated with ipsilateral, or same-sided, limb movements. Thus far, it was unknown whether stroke survivors could intentionally and effectively modulate this ipsilateral motor activity from their unaffected hemisphere. Therefore, this study seeks to evaluate whether stroke survivors could effectively utilize ipsilateral motor activity from their unaffected hemisphere to achieve this BCI control. To investigate this possibility, electroencephalographic (EEG) signals were recorded from four chronic hemispheric stroke patients as they performed (or attempted to perform) real and imagined hand tasks using either their affected or unaffected hand. Following performance of the screening task, the ability of patients to utilize a BCI system was investigated during on-line control of a one-dimensional control task. Significant ipsilateral motor signals (associated with movement intentions of the affected hand) in the unaffected hemisphere, which were found to be distinct from rest and contralateral signals, were identified and subsequently used for a simple online BCI control task. We demonstrate here for the first time that EEG signals from the unaffected hemisphere, associated with overt and

  15. Integrated Monitoring Technology of Dynamic Positioning in Semi-submersible Drilling Platform

    Institute of Scientific and Technical Information of China (English)

    高文; 陈红卫

    2011-01-01

    To address the problems of integration capability, operability, and extensibility in current offshore drilling information management platforms, the integrated monitoring technology for dynamic positioning of a semi-submersible drilling platform is studied. The composition of the platform's dynamic positioning system and its information integration technology are introduced, and the network architecture and software architecture of the monitoring system are designed. A client designed with OPC technology solves the problem of heterogeneous networks and achieves integrated monitoring of the drilling platform's dynamic positioning. Finally, a PLC is used to simulate the data environment of the pods and diesel generators in order to test the OPC client. The test results show that the client can establish connections with different servers and read and write data from different systems and devices.

  16. Comparative Evaluation of Osseointegrated Dental Implants Based on Platform-Switching Concept: Influence of Diameter, Length, Thread Shape, and In-Bone Positioning Depth on Stress-Based Performance

    Directory of Open Access Journals (Sweden)

    Giuseppe Vairo

    2013-01-01

    This study aimed to investigate the influence of implant design (in terms of diameter, length, and thread shape), in-bone positioning depth, and bone posthealing crestal morphology on load transfer mechanisms of osseointegrated dental implants based on the platform-switching concept. In order to perform an effective multiparametric comparative analysis, 11 implants differing in dimensions and in thread features were analyzed by a linearly elastic 3-dimensional finite element approach, under a static load. Implant models were integrated with the detailed model of a maxillary premolar bone segment. Different implant in-bone positioning levels were modeled, considering also different posthealing crestal bone morphologies. Bone overloading risk was quantified by introducing proper local stress measures, highlighting that implant diameter is a more effective design parameter than implant length, and that thread shape and thread details can significantly affect stresses at peri-implant bone, especially for short implants. Numerical simulations revealed that the optimal in-bone positioning depth results from the balance of 2 counteracting effects: cratering phenomena and bone apposition induced by the platform-switching configuration. The proposed results contribute to identifying the mutual influence of a number of factors affecting the bone-implant load transfer mechanisms, furnishing useful insights and indications for choosing and/or designing threaded osseointegrated implants.

  17. Prediction of positron emission tomography/computed tomography (PET/CT) positivity in patients with high-risk primary melanoma

    Science.gov (United States)

    Danielsen, Maria; Kjaer, Andreas; Wu, Max; Martineau, Lea; Nosrati, Mehdi; Leong, Stanley PL; Sagebiel, Richard W; Miller, James R III; Kashani-Sabet, Mohammed

    2016-01-01

    Positron emission tomography/computed tomography (PET/CT) is an important tool to identify occult melanoma metastasis. To date, it is controversial which patients with primary cutaneous melanoma should have staging PET/CT. In this retrospective analysis of more than 800 consecutive patients with cutaneous melanoma, we sought to identify factors predictive of PET/CT positivity in the setting of newly-diagnosed high-risk primary melanoma to determine those patients most appropriate to undergo a PET/CT scan as part of their diagnostic work up. 167 patients with newly-diagnosed high-risk primary cutaneous melanoma underwent a PET/CT scan performed as part of their initial staging. Clinical and histologic factors were evaluated as possible predictors of melanoma metastasis identified on PET/CT scanning using both univariate and multivariate logistic regression. In all, 32 patients (19.2%) had a positive PET/CT finding of metastatic melanoma. In more than half of these patients (56.3%), PET/CT scanning identified disease that was not detectable on clinical examination. Mitotic rate, tumor thickness, lymphadenopathy, and bleeding were significantly predictive of PET/CT positivity. A combinatorial index constructed from these factors revealed a significant association between number of high-risk factors observed and prevalence of PET/CT positivity, which increased from 5.8% (with the presence of 0-2 factors) to 100.0%, when all four factors were present. These results indicate that combining clinical and histologic prognostic factors enables the identification of patients with a higher likelihood of a positive PET/CT scan.

  18. Design and Implementation of Learning-Guided Teaching on a Cloud-Computing-Based Moodle Platform

    Institute of Scientific and Technical Information of China (English)

    赵莉; 李君茹

    2016-01-01

    Using the various online collaboration services provided by cloud computing, an online collaboration platform based on Moodle is constructed and a learning-guided teaching design is developed, creating a personalized, open, and interactive teaching environment. The approach is applied in learning-guided teaching practice for the "Multimedia Technology" course on the cloud-computing-based Moodle platform.

  19. Application of computer-extracted breast tissue texture features in predicting false-positive recalls from screening mammography

    Science.gov (United States)

    Ray, Shonket; Choi, Jae Y.; Keller, Brad M.; Chen, Jinbo; Conant, Emily F.; Kontos, Despina

    2014-03-01

    Mammographic texture features have been shown to have value in breast cancer risk assessment. Previous models have also been developed that use computer-extracted mammographic features of breast tissue complexity to predict the risk of false-positive (FP) recall from breast cancer screening with digital mammography. This work details a novel locally adaptive parenchymal texture analysis algorithm that identifies and extracts mammographic features of local parenchymal tissue complexity potentially relevant for false-positive biopsy prediction. This algorithm has two important aspects: (1) the adaptive nature of automatically determining an optimal number of region-of-interests (ROIs) in the image and each ROI's corresponding size based on the parenchymal tissue distribution over the whole breast region and (2) characterizing both the local and global mammographic appearances of the parenchymal tissue that could provide more discriminative information for FP biopsy risk prediction. Preliminary results show that this locally adaptive texture analysis algorithm, in conjunction with logistic regression, can predict the likelihood of false-positive biopsy with an ROC performance value of AUC = 0.92. The clinical implications of using prediction models incorporating these texture features may include the future development of better tools and guidelines regarding personalized breast cancer screening recommendations. Further studies are warranted to prospectively validate our findings in larger screening populations and evaluate their clinical utility.
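The discrimination the record reports is summarized as ROC AUC. As a minimal illustration of that metric (not the authors' texture model), AUC equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case, which the rank-based (Mann-Whitney) form below computes over hypothetical classifier scores:

```python
def roc_auc(labels, scores):
    """AUC = P(score of a positive > score of a negative); ties count 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

For large datasets a sorted-rank implementation is preferable, but the pairwise form makes the probabilistic interpretation of AUC = 0.92 explicit.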

  20. ITS Platform

    DEFF Research Database (Denmark)

    Tøfting, Svend; Lahrmann, Harry; Agerholm, Niels

    2014-01-01

    Aalborg University and two local companies have over the past four years developed and tested an ITS Platform, which can be used for communication with cars and for providing a number of services to the drivers. The purpose has been to perform a technological test of the possible use of a hidden ...... not have to be very intelligent. This is gradually taken over by applications on smart phones. The ITS Platform with 425 test drivers is now completely developed and can be used for technological testing of e.g. payment systems.

  1. Research on a Cloud Computing Security Authentication Platform Based on Digital Certificates

    Institute of Scientific and Technical Information of China (English)

    徐祺

    2013-01-01

    A cloud computing security authentication platform based on digital certificates is studied in view of the security problems currently facing cloud computing. First, the security status of cloud computing is introduced, and the application of digital certificates in China following the promulgation and implementation of the Electronic Signature Law is analyzed. Addressing four categories of security problems in cloud computing, and drawing on practical digital-certificate application scenarios, a digital-certificate-based solution for cloud computing security authentication is designed, and the encrypted communication process between clients and the cloud service platform using digital certificates is analyzed. The scheme provides a model and a reference for building cloud computing security authentication platforms based on digital certificates.

  2. Research and Implementation of System Software for a General-Purpose Embedded Computer Platform Based on the MPC8548E

    Institute of Scientific and Technical Information of China (English)

    李文光

    2014-01-01

    With the continuous development of embedded computer platforms in China, research on embedded system software has also deepened; the general-purpose embedded computer platform is an important carrier for embedded software development. As the bridge between embedded computer hardware and application programs, the system software must establish a well-performing embedded environment and provide favorable computing conditions for the system. The emergence of an embedded computer platform based on the MPC8548E enables the development and application of embedded system software. This paper presents an in-depth analysis and study of the implementation of system software for a general-purpose embedded computer platform based on the MPC8548E.

  3. ITS Platform

    DEFF Research Database (Denmark)

    Tøfting, Svend; Lahrmann, Harry; Agerholm, Niels

    2014-01-01

    Aalborg University and two local companies have over the past four years developed and tested an ITS Platform, which can be used for communication with cars and for providing a number of services to the drivers. The purpose has been to perform a technological test of the possible use of a hidden ...... not have to be very intelligent. This is gradually taken over by applications on smart phones. The ITS Platform with 425 test drivers is now completely developed and can be used for technological testing of e.g. payment systems....

  4. Is the two-dimensional computed tomography scan analysis reliable for coracoid graft positioning in Latarjet procedures?

    Science.gov (United States)

    Barth, Johannes; Neyton, Lionel; Métais, Pierre; Panisset, Jean-Claude; Baverel, Laurent; Walch, Gilles; Lafosse, Laurent

    2017-08-01

    The aim of the study was to develop a computed tomography (CT)-based measurement protocol for coracoid graft (CG) placement in both axial and sagittal planes after a Latarjet procedure and to test its intraobserver and interobserver reliability. Fifteen postoperative CT scans were included to assess the intraobserver and interobserver reproducibility of a standardized protocol among 3 senior and 3 junior shoulder surgeons. The evaluation sequence included CG positioning, its contact area with the glenoid, and the angle of its screws in the axial plane. The percentage of CG positioned under the glenoid equator was also analyzed in the sagittal plane. The intraobserver and interobserver agreement was measured by the intraclass correlation coefficient (ICC), and the values were interpreted according to the Landis and Koch classification. The ICC was substantial to almost perfect for intraobserver agreement and fair to almost perfect for interobserver agreement in measuring the angle of screws in the axial plane. The intraobserver agreement was slight to almost perfect and the interobserver agreement slight to substantial regarding CG positioning in the same plane. The intraobserver agreement and interobserver agreement were both fair to almost perfect concerning the contact area. The ICC was moderate to almost perfect for intraobserver agreement and slight to almost perfect for interobserver agreement in analyzing the percentage of CG under the glenoid equator. The variability of ICC values observed implies that caution should be taken in interpreting results regarding the CG position on 2-dimensional CT scans. This discrepancy is mainly explained by the difficulty in orienting the glenoid in the sagittal plane before any other parameter is measured. Copyright © 2017 Journal of Shoulder and Elbow Surgery Board of Trustees. Published by Elsevier Inc. All rights reserved.
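The agreement values in this record are interpreted against the Landis and Koch benchmark bands. As a small illustration (the band labels are the standard Landis and Koch (1977) categories; the function itself is not from the paper), mapping an ICC value to its verbal label can be sketched as:

```python
def landis_koch(icc):
    """Map an agreement coefficient (ICC or kappa) to its Landis-Koch label."""
    if icc < 0.0:
        return "poor"
    bands = [(0.20, "slight"), (0.40, "fair"), (0.60, "moderate"),
             (0.80, "substantial"), (1.00, "almost perfect")]
    for upper, label in bands:
        if icc <= upper:
            return label
    return "almost perfect"  # values above 1.0 should not occur
```

So a range reported as "moderate to almost perfect" corresponds to ICC values spanning roughly 0.41 to 1.00.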

  5. Target registration and target positioning errors in computer-assisted neurosurgery: proposal for a standardized reporting of error assessment.

    Science.gov (United States)

    Widmann, Gerlig; Stoffner, Rudolf; Sieb, Michael; Bale, Reto

    2009-12-01

    Assessment of errors is essential in the development, testing, and clinical application of computer-assisted neurosurgery. Our aim was to provide a comprehensive overview of the different methods to assess target registration error (TRE) and target positioning error (TPE) and to develop a proposal for a standardized reporting of error assessment. A PubMed search of phantom, cadaver, and clinical studies on TRE and TPE was performed. Reporting standards have been defined according to (a) study design and evaluation methods and (b) specifications of the navigation technology. The proposed standardized reporting includes (a) study design (controlled, non-controlled), study type (non-anthropomorphic phantom, anthropomorphic phantom, cadaver, patient), target design, error type and subtypes, space of TPE measurement, statistics, and (b) image modality, scan parameters, tracking technology, registration procedure and targeting technique. Adoption of the proposed standardized reporting may help in the understanding and comparability of different accuracy reports. Copyright (c) 2009 John Wiley & Sons, Ltd.
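Both error types the proposal covers are, in practice, often summarized as a root-mean-square Euclidean distance between achieved and planned 3-D target positions. The sketch below shows that common summary statistic; it is a generic illustration, not part of the proposed reporting protocol, and the coordinates are made up.

```python
import math

def rms_error(measured, planned):
    """RMS Euclidean distance between paired 3-D points (e.g. TRE or TPE)."""
    sq_dists = [sum((m - p) ** 2 for m, p in zip(mp, pp))
                for mp, pp in zip(measured, planned)]
    return math.sqrt(sum(sq_dists) / len(sq_dists))

# Two hypothetical targets: measured vs. planned positions (mm).
tre = rms_error([(1.0, 0.0, 0.0), (0.0, 2.0, 0.0)],
                [(0.0, 0.0, 0.0), (0.0, 0.0, 0.0)])
```

Reporting whether such a value was computed per target, per procedure, or pooled across specimens is exactly the kind of detail the proposed standard asks authors to state.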

  6. Benign thyroid and neck lesions mimicking malignancy with false positive findings on positron emission tomography-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Ye Ri; Kim, Shin Young; Lee, Sang Mi [Soonchunhyang University Cheonan Hospital, Cheonan (Korea, Republic of); Lee, Deuk Young [Dept. of Surgery, Younsei Angelot Women's Clinic, Cheonan (Korea, Republic of)

    2017-02-15

    The increasing use of positron emission tomography-computed tomography (PET/CT) has led to the frequent detection of incidental thyroid and neck lesions with increased 18F-fluorodeoxyglucose (FDG) uptake. Although lesions with increased FDG uptake are commonly assumed to be malignant, benign lesions may also exhibit increased uptake. The purpose of this pictorial essay is to demonstrate that benign thyroid and neck lesions can produce false-positive findings on PET/CT, and to identify various difficulties in interpretation. It is crucial to be aware that differentiating between benign and malignant lesions is difficult in a considerable proportion of cases, when relying only on PET/CT findings. Correlation of PET/CT findings with additional imaging modalities is essential to avoid misdiagnosis.

  7. A Forward Position Solution Method for a 5-DOF Swing Platform and Its MATLAB Implementation

    Institute of Scientific and Technical Information of China (English)

    李辉; 宋诗

    2013-01-01

    A practical algorithm for the forward position solution of an asymmetric 5-DOF swing platform is studied. The problem of solving the set of nonlinear equations that describes the 5-DOF swing platform is transformed into a function optimization problem, which is solved by combining particle swarm optimization (PSO) with Newton iteration; the MATLAB PSO toolbox is used to implement the forward position solution of the platform. Comparison of experimental results shows that the combined PSO-Newton algorithm achieves higher accuracy than PSO alone while also meeting real-time requirements.
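The hybrid strategy this record describes — a global PSO search that seeds a local Newton refinement of the platform's nonlinear constraint equations — can be sketched as below. The two-equation system `f` is a toy stand-in for the real 5-DOF kinematic equations, and all parameter values (swarm size, inertia, step sizes) are illustrative assumptions; the paper itself works in MATLAB with its PSO toolbox.

```python
import random

def residual(f, x):
    """Sum of squared residuals of the nonlinear system f(x) = 0."""
    return sum(v * v for v in f(x))

def pso(f, bounds, n=30, iters=60, seed=1):
    """Coarse global search minimizing the squared residual over `bounds`."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    best = [p[:] for p in pos]
    best_val = [residual(f, p) for p in pos]
    g = best[min(range(n), key=lambda i: best_val[i])][:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (best[i][d] - pos[i][d])
                             + 1.5 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = residual(f, pos[i])
            if val < best_val[i]:
                best_val[i], best[i] = val, pos[i][:]
                if val < residual(f, g):
                    g = pos[i][:]
    return g

def newton(f, x, iters=20, h=1e-6):
    """Refine a 2-variable root with Newton steps (finite-difference Jacobian)."""
    x = x[:]
    for _ in range(iters):
        fx = f(x)
        J = [[0.0, 0.0], [0.0, 0.0]]
        for d in range(2):
            xp = x[:]
            xp[d] += h
            fp = f(xp)
            for r in range(2):
                J[r][d] = (fp[r] - fx[r]) / h
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        # Solve J dx = -fx by Cramer's rule.
        dx0 = -(J[1][1] * fx[0] - J[0][1] * fx[1]) / det
        dx1 = -(J[0][0] * fx[1] - J[1][0] * fx[0]) / det
        x = [x[0] + dx0, x[1] + dx1]
        if max(abs(v) for v in f(x)) < 1e-12:
            break
    return x

# Toy stand-in for the platform's kinematic constraint equations.
def f(x):
    return [x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]]

seed_guess = pso(f, [(-3.0, 3.0), (-3.0, 3.0)])
root = newton(f, seed_guess)
```

The division of labor mirrors the paper's finding: PSO alone stalls at moderate residuals, while the Newton polish converges quadratically from the PSO seed, giving both accuracy and speed.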

  8. Identifying behaviors that generate positive interactions between museums and people on a social media platform: An analysis of 27 science museums on Twitter

    Science.gov (United States)

    Baker, Stacy Christine

    The aim of this study was to provide a detailed examination of how science museums use Twitter and suggest changes these museums should make to improve their current approach on this social media platform. Previous studies have identified the types of content museums are creating on social media, but none have quantitatively investigated the specific types of content most likely to generate interaction and engagement with a social media audience. A total of 5,278 tweets from 27 science museums were analyzed to determine what type of tweet yields the greatest impact measured in retweets and favorites. 1,453 of those tweets were selected for additional qualitative analysis. The results indicate that tweets with educational content, links, and hashtags lead to the greatest number of retweets and favorites. The results also indicate that the majority of tweets posted by museums do not generate interaction and engagement with a social media audience. A model for existing museums to improve their use of Twitter was created using the results of this study.

  9. Comparative repeatability of guide-pin axis positioning in computer-assisted and manual femoral head resurfacing arthroplasty.

    Science.gov (United States)

    Hodgson, A; Helmy, N; Masri, B A; Greidanus, N V; Inkpen, K B; Duncan, C P; Garbuz, D S; Anglin, C

    2007-10-01

    The orientation of the femoral component in hip resurfacing arthroplasty affects the likelihood of loosening and fracture. Computer-assisted surgery has been shown to improve significantly the surgeon's ability to achieve a desired position and orientation; nevertheless, both bias and variability in positioning remain and can potentially be improved. The authors recently developed a computer-assisted surgical (CAS) technique to guide the placement of the pin used in femoral head resurfacing arthroplasty and showed that it produced significantly less variation than a typical manual technique in varus/valgus placement relative to a preoperatively determined surgical plan while taking a comparable amount of time. In the present study, the repeatability of both the CAS and manual techniques is evaluated in order to estimate the relative contributions to overall variability of surgical technique (CAS versus manual), surgeon experience (novice versus experienced), and other sources of variability (e.g. across specimens and across surgeons). This will enable further improvements in the accuracy of CAS techniques. Three residents/fellows new to femoral head resurfacing and three experienced hip arthroplasty surgeons performed 20-30 repetitions of each of the CAS and manual techniques on at least one of four cadaveric femur specimens. The CAS system had markedly better repeatability (1.2 degrees) in varus/valgus placement relative to the manual technique (2.8 degrees), slightly worse repeatability in version (4.4 degrees versus 3.2 degrees), markedly better repeatability in mid-neck placement (0.7 mm versus 2.5 mm), no significant dependence on surgeon skill level (in contrast to the manual technique), and took significantly less time (50 s versus 123 s). Proposed improvements to the version measurement process showed potential for reducing the standard deviation by almost two thirds. This study supports the use of CAS for femoral head resurfacing as it is quicker than the

  10. Interfractional Position Variation of Pancreatic Tumors Quantified Using Intratumoral Fiducial Markers and Daily Cone Beam Computed Tomography

    Energy Technology Data Exchange (ETDEWEB)

    Horst, Astrid van der, E-mail: a.vanderhorst@amc.uva.nl [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Wognum, Silvia; Dávila Fajardo, Raquel; Jong, Rianne de [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Hooft, Jeanin E. van; Fockens, Paul [Department of Gastroenterology and Hepatology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands); Tienhoven, Geertjan van; Bel, Arjan [Department of Radiation Oncology, Academic Medical Center, University of Amsterdam, Amsterdam (Netherlands)

    2013-09-01

    Purpose: The aim of this study was to quantify interfractional pancreatic position variation using fiducial markers visible on daily cone beam computed tomography (CBCT) scans. In addition, we analyzed possible migration of the markers to investigate their suitability for tumor localization. Methods and Materials: For 13 pancreatic cancer patients with implanted Visicoil markers, CBCT scans were obtained before 17 to 25 fractions (300 CBCTs in total). Image registration with the reference CT was used to determine the displacement of the 2 to 3 markers relative to bony anatomy and to each other. We analyzed the distance between marker pairs as a function of time to identify marker registration error (SD of linear fit residuals) and possible marker migration. For each patient, we determined the mean displacement of markers relative to the reference CT (systematic position error) and the spread in displacements (random position error). From this, we calculated the group systematic error, Σ, and group random error, σ. Results: Marker pair distances showed slight trends with time (range, −0.14 to 0.14 mm/day), possibly due to tissue deformation, but no shifts that would indicate marker migration. The mean SD of the fit residuals was 0.8 mm. We found large interfractional position variations, with a 3-dimensional vector displacement of >10 mm for 116 of 300 (39%) fractions. The spread in displacement varied significantly (P<.01) between patients, from a vector range of 9.1 mm to one of 24.6 mm. For the patient group, Σ was 3.8, 6.6, and 3.5 mm; and σ was 3.6, 4.7, and 2.5 mm, in left–right, superior–inferior, and anterior–posterior directions, respectively. Conclusions: We found large systematic displacements of the fiducial markers relative to bony anatomy, in addition to wide distributions of displacement. These results for interfractional position variation confirm the potential benefit of using fiducial markers rather than bony anatomy for daily online
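The group statistics described in this abstract follow the usual decomposition: the group systematic error Σ is the SD of the per-patient mean displacements, and the group random error σ is the root-mean-square of the per-patient SDs. A minimal sketch for one axis, using made-up displacement values rather than the study's data:

```python
from math import sqrt

def group_errors(per_patient_displacements):
    """Group systematic error Sigma (SD of the per-patient mean displacements)
    and group random error sigma (RMS of the per-patient SDs), for one axis."""
    means, variances = [], []
    for shifts in per_patient_displacements:
        n = len(shifts)
        mean = sum(shifts) / n
        means.append(mean)
        variances.append(sum((x - mean) ** 2 for x in shifts) / (n - 1))
    grand_mean = sum(means) / len(means)
    Sigma = sqrt(sum((m - grand_mean) ** 2 for m in means) / (len(means) - 1))
    sigma = sqrt(sum(variances) / len(variances))
    return Sigma, sigma

# Hypothetical left-right marker displacements (mm) for three patients.
patients = [[2.1, 3.0, 2.4, 2.8], [-1.5, -0.9, -2.2, -1.1], [5.0, 6.2, 4.4, 5.9]]
Sigma, sigma = group_errors(patients)
print(round(Sigma, 2), round(sigma, 2))
```

A large Σ relative to σ, as the study reports, indicates consistent per-patient offsets that an online marker-based correction can remove.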

  11. Towards gaze-controlled platform games

    DEFF Research Database (Denmark)

    Muñoz, Jorge; Yannakakis, Georgios N.; Mulvey, Fiona

    2011-01-01

    This paper introduces the concept of using gaze as a sole modality for fully controlling player characters of fast-paced action computer games. A user experiment is devised to collect gaze and gameplay data from subjects playing a version of the popular Super Mario Bros platform game. The initial...... analysis shows that there is a rather limited grid around Mario where the efficient player focuses her attention the most while playing the game. The useful grid, as we name it, projects the amount of meaningful visual information a designer should use towards creating successful player character...... controllers with the use of artificial intelligence for a platform game like Super Mario. Information about the eyes' position on the screen and the state of the game are utilized as inputs of an artificial neural network, which is trained to approximate which keyboard action is to be performed at each game...
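The mapping described in the last sentence, from gaze position and game state to a keyboard action, can be sketched as one forward pass of a small feed-forward network. The feature layout, action set, and randomly initialized weights below are illustrative placeholders, not the paper's trained model:

```python
import math, random

ACTIONS = ["left", "right", "jump", "run", "idle"]  # illustrative action set

def choose_action(features, w1, b1, w2, b2):
    """One forward pass of a small feed-forward network: inputs would be the
    normalized gaze position plus game-state features; the output is the
    keyboard action with the highest score."""
    hidden = [math.tanh(sum(w * x for w, x in zip(row, features)) + b)
              for row, b in zip(w1, b1)]
    scores = [sum(w * h for w, h in zip(row, hidden)) + b
              for row, b in zip(w2, b2)]
    return ACTIONS[max(range(len(scores)), key=scores.__getitem__)]

# Randomly initialized placeholder weights; the paper's network is trained to
# reproduce the keyboard actions recorded during play.
random.seed(0)
n_in, n_hidden = 4, 6
w1 = [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_hidden)]
b1 = [0.0] * n_hidden
w2 = [[random.uniform(-1, 1) for _ in range(n_hidden)] for _ in range(len(ACTIONS))]
b2 = [0.0] * len(ACTIONS)

# Features: gaze x, gaze y (screen-normalized), Mario's speed, on-ground flag.
action = choose_action([0.8, 0.5, 0.3, 1.0], w1, b1, w2, b2)
print(action)
```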

  12. Task scheduler of workflows in a computing resource sharing platform

    Institute of Scientific and Technical Information of China (English)

    周智刚

    2011-01-01

    A new scalable scheduler for task workflows with time constraints in a computing resource sharing platform is proposed and described. It is built upon a tree-based P2P overlay that supports efficient and fast aggregation of resource availability information. A two-layered architecture with local and global schedulers is also presented: local scheduling defines policies at the execution-node level, while global scheduling matches workflow tasks with suitable execution nodes. A local scheduler in each node provides its available time intervals to the distributed global scheduler, which summarizes them in the aggregation process. Deadline constraints and the correct timing of tasks in workflows are guaranteed by a suitable distributed management of the availability time intervals of resources. Simulation results show fast response times and low overhead in a system with hundreds of nodes, both for the fork-join model and for equation-solver-like applications.
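The matching step the global scheduler performs, fitting a task into a node's aggregated free intervals before its deadline, can be sketched as follows; the interval representation and node names are illustrative assumptions, not the paper's protocol:

```python
def merge_intervals(intervals):
    """Coalesce one node's free-time intervals (start, end) into a sorted,
    non-overlapping list, as a local scheduler might report them upward."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def match_task(task_len, deadline, now, node_intervals):
    """Global-scheduler step: return the first node (and start time) whose
    aggregated availability can fit the task before its deadline."""
    for node, intervals in node_intervals.items():
        for start, end in merge_intervals(intervals):
            begin = max(start, now)
            if begin + task_len <= min(end, deadline):
                return node, begin
    return None

# Two nodes with hypothetical availability windows (time units).
nodes = {"n1": [(0, 3), (10, 20)], "n2": [(2, 8), (7, 12)]}
assignment = match_task(task_len=4, deadline=11, now=1, node_intervals=nodes)
print(assignment)
```

In the paper's design this matching would run over availability summaries aggregated up the P2P tree rather than a flat dictionary.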

  13. On the Idea and Plan of Constructing a Multi-level Teaching Platform for Computer Education

    Institute of Scientific and Technical Information of China (English)

    张汗斌

    2013-01-01

    In the full-time college computer education system, multi-level computer education curricula are implemented in the various majors according to the competence requirements of each profession. The teaching platform is constructed primarily on the basis of students' computer competence and a range of educational factors, satisfying students' practical needs through the various computer technology courses, so that it can adapt to the current market demand for talent. This paper analyzes this platform and explores the principles and methods of computer education, in order to arrive at a construction plan for a multi-level computer education teaching platform.

  14. Research on a public security platform cloud computing architecture based on the Internet of Things

    Institute of Scientific and Technical Information of China (English)

    白蛟; 全春来; 郭镇

    2011-01-01

    Internet of Things technology is introduced into the field of public safety, and the applications of distributed computing, virtualized storage, and cloud computing are discussed. To overcome the shortcomings of existing public security platforms, a five-layer Internet-of-Things public safety platform architecture is designed, and the features and applied technologies of each layer are described, providing a new approach for the future construction of a police Internet of Things. On the basis of this architecture, a cloud-computing-based data support platform is proposed to achieve business data sharing and security: it provides virtualized data storage and management while offering high-performance computing power and dynamic expansion of storage equipment, improving the security and computing capability of the Internet of Things application.

  15. Three-dimensional computer graphics-based ankle morphometry with computerized tomography for total ankle replacement design and positioning.

    Science.gov (United States)

    Kuo, Chien-Chung; Lu, Hsuan-Lun; Leardini, Alberto; Lu, Tung-Wu; Kuo, Mei-Ying; Hsu, Horng-Chaung

    2014-05-01

    Morphometry of the bones of the ankle joint is important for the design of joint replacements and their surgical implantations. However, very little three-dimensional (3D) data are available and not a single study has addressed the Chinese population. Fifty-eight fresh frozen Chinese cadaveric ankle specimens, 26 females, and 32 males, were CT-scanned in the neutral position and their 3D computer graphics-based models were reconstructed. The 3D morphology of the distal tibia/fibula segment and the full talus was analyzed by measuring 31 parameters, defining the relevant dimensions, areas, and volumes from the models. The measurements were compared statistically between sexes and with previously reported data from Caucasian subjects. The results showed that, within a general similarity of ankle morphology between the current Chinese and previous Caucasian subjects groups, there were significant differences in 9 out of the 31 parameters analyzed. From a quantitative comparison with available prostheses designed for the Caucasian population, few of these designs have both tibial and talar components suitable in dimension for the Chinese population. The current data will be helpful for the sizing, design, and surgical positioning of ankle replacements and for surgical instruments, especially for the Chinese population.

  16. Comparative analysis between mandibular positions in centric relation and maximum intercuspation by cone beam computed tomography (CONE-BEAM).

    Science.gov (United States)

    Ferreira, Amanda de Freitas; Henriques, João César Guimarães; Almeida, Guilherme Araújo; Machado, Asbel Rodrigues; Machado, Naila Aparecida de Godoi; Fernandes Neto, Alfredo Júlio

    2009-01-01

    This research consisted of a quantitative assessment and aimed to measure the possible discrepancies between the maxillomandibular positions for centric relation (CR) and maximum intercuspation (MI), using volumetric cone beam computed tomography (cone beam method). The sample of the study consisted of 10 asymptomatic young adult patients divided into two types of standard occlusion: normal occlusion and Angle Class I occlusion. In order to obtain the centric relation, a JIG device and mandible manipulation were used to deprogram the habitual conditions of the jaw. The evaluations were conducted in both frontal and lateral tomographic images, showing the condyle/articular fossa relation. The images were processed in the software included in the NewTom 3G device (QR NNT software version 2.00), and 8 tomographic images were obtained per patient, four laterally and four frontally exhibiting the TMJs (in CR and MI, on both sides, right and left). By means of tools included in another software program, linear and angular measurements were performed and statistically analyzed by Student's t-test. According to the methodology and the analysis performed in asymptomatic patients, it was not possible to detect statistically significant differences between the positions of centric relation and maximum intercuspation. However, the resources of cone beam tomography are of extreme relevance to the completion of further studies that use heterogeneous groups of samples in order to compare the results.

  17. Comparative analysis between mandibular positions in centric relation and maximum intercuspation by cone beam computed tomography (CONE-BEAM

    Directory of Open Access Journals (Sweden)

    Amanda de Freitas Ferreira

    2009-01-01

    Full Text Available This research consisted of a quantitative assessment and aimed to measure the possible discrepancies between the maxillomandibular positions for centric relation (CR) and maximum intercuspation (MI), using volumetric cone beam computed tomography (cone beam method). The sample of the study consisted of 10 asymptomatic young adult patients divided into two types of standard occlusion: normal occlusion and Angle Class I occlusion. In order to obtain the centric relation, a JIG device and mandible manipulation were used to deprogram the habitual conditions of the jaw. The evaluations were conducted in both frontal and lateral tomographic images, showing the condyle/articular fossa relation. The images were processed in the software included in the NewTom 3G device (QR NNT software version 2.00), and 8 tomographic images were obtained per patient, four laterally and four frontally exhibiting the TMJs (in CR and MI, on both sides, right and left). By means of tools included in another software program, linear and angular measurements were performed and statistically analyzed by Student's t-test. According to the methodology and the analysis performed in asymptomatic patients, it was not possible to detect statistically significant differences between the positions of centric relation and maximum intercuspation. However, the resources of cone beam tomography are of extreme relevance to the completion of further studies that use heterogeneous groups of samples in order to compare the results.

  18. Development and application of a computing platform for the rock mass quality classification Q system

    Institute of Scientific and Technical Information of China (English)

    龚剑; 欧阳治华

    2011-01-01

    The Q system is one of the most widely used methods of rock mass quality classification. In order to judge rock mass stability accurately in real time, improve work efficiency, and informatize Q-system rock mass classification, a computing platform for the Q system was developed using Python as the development language together with Web and database technologies. The development principles and operating procedures of the platform are introduced in detail: the platform combines the Q system with a Web interface and presents the Q-system chart and support measures through Web pages. Practical mine engineering applications show that the platform is simple to operate and can judge the stability of the rock mass and propose treatment schemes in real time.
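The Q value such a platform computes is Barton's standard six-parameter product, Q = (RQD/Jn)·(Jr/Ja)·(Jw/SRF). A minimal sketch in Python (the platform's own development language), with an abbreviated version of the standard verbal rating bands; the parameter values in the example are illustrative:

```python
def q_value(rqd, jn, jr, ja, jw, srf):
    """Barton's rock mass quality index Q = (RQD/Jn) * (Jr/Ja) * (Jw/SRF):
    the three quotients describe block size, inter-block shear strength,
    and active stress, respectively."""
    return (rqd / jn) * (jr / ja) * (jw / srf)

def q_rating(q):
    """Abbreviated verbal rating bands of the Q classification."""
    for upper, label in [(0.1, "extremely poor"), (1, "very poor"),
                         (4, "poor"), (10, "fair"), (40, "good")]:
        if q < upper:
            return label
    return "very good or better"

q = q_value(rqd=75, jn=9, jr=1.5, ja=1.0, jw=1.0, srf=2.5)
print(round(q, 2), q_rating(q))  # a jointed rock mass of fair quality
```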

  19. Research on a new generation of enterprise information architecture and cloud computing platform

    Institute of Scientific and Technical Information of China (English)

    徐茹枝; 周凡雅; 耿啸风

    2012-01-01

    As enterprise informatization deepens, large enterprises face multi-dimensional transformation pressure from management support, technical architecture, and operations management. They therefore need a new generation of enterprise information architecture that promotes business process reengineering and integration and, by synchronizing IT with the business, improves execution. This paper proposes such a new-generation enterprise information architecture, with emphasis on the cloud platform. It studies the mechanisms of the resource pool and the cloud management platform, analyzes the functions and business scenarios of the cloud platform, uses the creation of a data-collection program instance and a service-provisioning scenario as examples to explain how the cloud platform changes operating processes and the operation and maintenance model, and finally gives a verification test example from the production environment of the cloud platform system.

  20. Cloud Based Applications and Platforms (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Brodt-Giles, D.

    2014-05-15

    Presentation to the Cloud Computing East 2014 Conference, where we are highlighting our cloud computing strategy, describing the platforms on the cloud (including Smartgrid.gov), and defining our process for implementing cloud based applications.

  1. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
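A blackbox comparison of the kind described, driven only by workflow length, width, and data size, might be sketched as a back-of-the-envelope cost model. The per-platform parameters below (slot counts, speeds, queue delays, transfer rates, prices) are invented placeholders, not measurements from the paper:

```python
# Invented per-platform parameters: parallel slots, relative speed,
# queue delay (h), transfer rate (GB/h), and price ($ per task-hour).
PLATFORMS = {
    "desktop": dict(slots=4,   speed=1.0, queue=0.0, xfer=1000, price=0.0),
    "cluster": dict(slots=64,  speed=1.2, queue=2.0, xfer=500,  price=0.0),
    "cloud":   dict(slots=256, speed=0.9, queue=0.1, xfer=50,   price=0.1),
}

def estimate(platform, length, width, data_gb):
    """Blackbox estimate from three numbers only: length ~ hours per task,
    width ~ number of parallel tasks, data_gb ~ data to stage in."""
    p = PLATFORMS[platform]
    waves = -(-width // p["slots"])  # ceil(width / slots): parallel batches
    hours = p["queue"] + data_gb / p["xfer"] + waves * length / p["speed"]
    dollars = p["price"] * length * width / p["speed"]
    return hours, dollars

for name in PLATFORMS:
    hours, dollars = estimate(name, length=2.0, width=100, data_gb=20)
    print(f"{name}: {hours:.1f} h, ${dollars:.2f}")
```

Even this crude model captures the trade-off the paper exploits: a wide workflow favors the cloud's elastic slots, while a data-heavy one is penalized by its slower ingress.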

  2. Year 2 Report: Protein Function Prediction Platform

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, C E

    2012-04-27

    Upon completion of our second year of development in a 3-year development cycle, we have completed a prototype protein structure-function annotation and function prediction system: Protein Function Prediction (PFP) platform (v.0.5). We have met our milestones for Years 1 and 2 and are positioned to continue development in completion of our original statement of work, or a reasonable modification thereof, in service to DTRA Programs involved in diagnostics and medical countermeasures research and development. The PFP platform is a multi-scale computational modeling system for protein structure-function annotation and function prediction. As of this writing, PFP is the only existing fully automated, high-throughput, multi-scale modeling, whole-proteome annotation platform, and represents a significant advance in the field of genome annotation (Fig. 1). PFP modules perform protein functional annotations at the sequence, systems biology, protein structure, and atomistic levels of biological complexity (Fig. 2). Because these approaches provide orthogonal means of characterizing proteins and suggesting protein function, PFP processing maximizes the protein functional information that can currently be gained by computational means. Comprehensive annotation of pathogen genomes is essential for bio-defense applications in pathogen characterization, threat assessment, and medical countermeasure design and development in that it can short-cut the time and effort required to select and characterize protein biomarkers.

  3. Contributing to global computing platform: gliding, tunneling standard services and high energy physics application; Contribution aux infrastructures de calcul global: delegation inter plates-formes, integration de services standards et application a la physique des hautes energies

    Energy Technology Data Exchange (ETDEWEB)

    Lodygensky, O

    2006-09-15

    Centralized computers have been replaced by 'client/server' distributed architectures, which are in turn in competition with new distributed systems known as 'peer to peer'. These new technologies are widely spread, and trading, industry, and the research world have understood the goals involved and are investing massively in these new technologies, named 'grid'. One of these fields is computing, which is the subject of the work presented here. At the Paris Orsay University, a synergy emerged between the Computing Science Laboratory (LRI) and the Linear Accelerator Laboratory (LAL) on grid infrastructure, opening new fields of investigation for the first and new high-performance computing perspectives for the other. The work presented here is the result of this multi-disciplinary collaboration. It is based on XtremWeb, the LRI global computing platform. We first introduce a state of the art of large-scale distributed systems, their principles, and their service-based architecture. We then introduce XtremWeb and detail the modifications and improvements we had to specify and implement to achieve our goals. We present two different studies: first, interconnecting grids in order to generalize resource sharing; and second, making legacy services usable on such platforms. We finally explain how a research community like the high-energy cosmic radiation detection community can gain access to these services, and detail Monte Carlo and data analysis processes over the grids. (author)

  4. Design and Implementation of an Urban Geographic Information Public Service Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    张桂芬

    2012-01-01

    This paper introduces the concept of cloud computing, analyzes the architecture of the traditional urban geographic information public service platform, and discusses the architecture design, key technologies, and implementation results of a platform based on cloud computing. It concludes that the application of cloud computing improves the service capability and efficiency of the basic geographic information public service platform, broadens the application of geographic information, and will further accelerate the development of the geographic information industry.

  5. Platform Constellations

    DEFF Research Database (Denmark)

    Staykova, Kalina Stefanova; Damsgaard, Jan

    2016-01-01

    messaging apps KakaoTalk and LINE, we are able to gain valuable insights about the nature of these new constructions and to capture and synthesize their main characteristics in a framework. Our results show that platform constellations possess unique innovative capabilities, which can improve users......’ acquisition and users’ engagement rates as well as unlock new sources of value creation and diversify revenue streams....

  6. Research on an Electronic Government Common Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    殷波; 赵昕; 冯伟斌; 王志军; 房秉毅

    2015-01-01

    Because of its high availability, high security, and low cost, cloud computing is widely used in many fields of society, and its distinctive model is changing how services are delivered and how service processes run. Applying cloud computing to traditional e-government work can solve problems such as insufficient information exchange and resource sharing, improve the working efficiency of government, enhance the government's public service capacity, strengthen data storage and sharing, and effectively raise the government's level of informatization. This paper studies the positioning and goals of the e-government cloud as well as its overall architecture, in order to describe the current state of e-government cloud applications.

  7. Research on Construction Techniques for a Collaborative Simulation Computing Platform for Target Electromagnetic Characteristics

    Institute of Scientific and Technical Information of China (English)

    刘民; 朱兴国; 刘姜玲; 黄飞龙

    2015-01-01

    To overcome the computational bottleneck of determining the electromagnetic characteristics of electrically large targets with complex structure, the idea of electromagnetic grid computing is put forward, together with the technical content, architecture, and design of an electromagnetic computing grid. By integrating various electromagnetic algorithms and using next-generation Internet and collaborative computing technologies, large numbers of computing, model, algorithm, and network resources of different types and uses from universities and research institutes are seamlessly connected and aggregated, and computing tasks are distributed, scheduled, and managed. This builds a collaborative electromagnetic simulation computing platform, forming a distributed electromagnetic computing network and a cloud computing service system in which simulation tasks are completed through the collaborative work of multiple computing nodes, realizing the collaborative use of widely distributed resources. The platform offers scalability, high availability, high computational efficiency, effective scheduling, and powerful computing capability. Finally, conclusions and prospects for the next steps of the work are given.

  8. Computational modeling of drug distribution in the posterior segment of the eye: effects of device variables and positions.

    Science.gov (United States)

    Jooybar, Elaheh; Abdekhodaie, Mohammad J; Farhadi, Fatolla; Cheng, Yu-Ling

    2014-09-01

    A computational model was developed to simulate drug distribution in the posterior segment of the eye after intravitreal injection and ocular implantation. The effects of important factors in intravitreal injection such as injection time, needle gauge and needle angle on the ocular drug distribution were studied. Also, the influences of the position and the type of implant on the concentration profile in the posterior segment were investigated. Computational Fluid Dynamics (CFD) calculations were conducted to describe the 3D convective-diffusive transport. The geometrical model was constructed based on the human eye dimensions. To simulate intravitreal injection, unlike previous studies which considered the initial shape of the injected drug solution as a sphere or cylinder, a more accurate shape was obtained by the level-set method in COMSOL. The results showed that in intravitreal injection the drug concentration profile and its maximum value depended on the injection time, needle gauge and penetration angle of the needle. Considering the actual shape of the injected solution was found necessary to obtain the real concentration profile. In implant insertion, the vitreous cavity received more drug after intraocular implantation, but this method was more invasive compared with periocular delivery. Locating the implant in posterior or anterior regions had a significant effect on local drug concentrations. Also, the shape of the implant influenced the concentration profile inside the eye. The presented model is useful for optimizing the administration variables to ensure optimum therapeutic benefits. Predicting and quantifying different factors helps to reduce the possibility of tissue toxicity and to improve the treatment efficiency.

  9. Computed tomography assessment of temporomandibular joint position and dimensions in patients with class II division 1 and division 2 malocclusions

    Science.gov (United States)

    Ciger, Semra

    2017-01-01

    Background This study aimed to investigate and compare the positions and dimensions of the temporomandibular joint and its components, respectively, in patients with Class II division 1 and division 2 malocclusions. Material and Methods Computed tomography images of 14 patients with Class II division 1 and 14 patients with Class II division 2 malocclusion were included, with a mean age of 11.4 ± 1.2 years. The following temporomandibular joint measurements were made with the OsiriX medical imaging software program. From the sagittal images, the anterior, superior, and posterior joint spaces and the mandibular fossa depths were measured. From the axial images, the greatest anteroposterior and mediolateral diameters of the mandibular condyles, angles between the long axis of the mandibular condyle and midsagittal plane, and vertical distances from the geometric centers of the condyles to midsagittal plane were measured. The independent samples t-test was used for comparing the measurements between the two sides and between the Class II division 1 and 2 groups. Results No statistically significant differences were observed between the right and left temporomandibular joints; therefore, the data were pooled. There were statistically significant differences between the Class II division 1 and 2 groups with regard to mandibular fossa depth and anterior joint space measurements. Conclusions In Class II patients, the right and left temporomandibular joints were symmetrical. In the Class II division 1 group, the anterior joint space was wider than that in the Class II division 2 group, and the mandibular fossa was deeper and wider in the Class II division 1 group. Key words: Temporomandibular joint, Class II malocclusion, Cone beam computed tomography. PMID:28298985

  10. Assessing joint space and condylar position in the people with normal function of temporomandibular joint with cone-beam computed tomography

    OpenAIRE

    Zahra Dalili; Nasim Khaki; Seyed Javad Kia; Fatemeh Salamat

    2012-01-01

    Background: The optimal position of the condyle in the glenoid fossa is a fundamental question in dentistry. There is no quantitative standard for the optimal position of the mandibular condyle in the glenoid fossa in our population. The purpose of this study is to assess the position of the condyle with cone beam computed tomography (CBCT) images in patients with normal function of the temporomandibular joint (TMJ). Materials and Methods: In this cross-sectional study, CBCT images of 40 class I skeleta...

  11. Application of WiFi Technology in a Shopping Mall Staff Positioning System on the Android Platform

    Institute of Scientific and Technical Information of China (English)

    裴文莲; 詹林

    2013-01-01

    Thanks to mature GPS technology, outdoor positioning accuracy is already satisfactory, but with the growing number of large buildings, indoor positioning applications are becoming increasingly important. Android phones provide free network access such as WiFi and GPRS, so users can connect to the network easily. Based on a comparison of the strengths and weaknesses of different positioning technologies, this paper applies WiFi positioning technology to smartphones on the Android platform, studies a staff positioning service for large department stores, and implements an LBS application system in which the client and server communicate with each other.
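A common WiFi indoor positioning scheme on such platforms is RSSI fingerprinting; the record does not state which WiFi method this particular system uses, so the following nearest-neighbour sketch, with a made-up fingerprint database, is only illustrative:

```python
from math import sqrt

# Made-up offline fingerprint database: reference point (m) -> mean RSSI (dBm)
# per access point, surveyed once during a calibration walk.
FINGERPRINTS = {
    (0, 0):  {"ap1": -40, "ap2": -70, "ap3": -80},
    (10, 0): {"ap1": -70, "ap2": -45, "ap3": -75},
    (5, 8):  {"ap1": -65, "ap2": -60, "ap3": -50},
}

def locate(scan, db=FINGERPRINTS, missing=-100):
    """Nearest-neighbour fingerprinting: return the reference point whose
    stored RSSI vector is closest (Euclidean distance in dB) to the scan."""
    aps = sorted({ap for sig in db.values() for ap in sig})
    def distance(sig):
        return sqrt(sum((scan.get(ap, missing) - sig.get(ap, missing)) ** 2
                        for ap in aps))
    return min(db, key=lambda point: distance(db[point]))

# A live scan taken near the (10, 0) reference point.
fix = locate({"ap1": -68, "ap2": -47, "ap3": -77})
print(fix)
```

On Android the live scan would come from the platform's WiFi scan results; averaging several scans per estimate reduces the effect of RSSI fluctuation.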

  12. Development of a control system for a pair of robotic platforms. Thesis

    Science.gov (United States)

    Cosentino, James L.

    1990-01-01

    This thesis is a discussion of the development of a control system for a pair of three-degree-of-freedom robotic platforms. The platform system and its computer support are described at both the hardware and the software levels, including a section detailing diagnostics which can be run in the event of specific system errors. The design and implementation of a PID controller for the platforms based on experimentally-determined system dynamics is described. Finally, the tracking performance of each joint controller is examined. A path planner is used to map a smooth trajectory between starting and destination positions, and the controller's tracking ability is observed along this path.
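A PID joint controller tracking a planner's smooth trajectory, as described above, can be sketched as follows; the gains, the unit-inertia joint model, and the cubic profile are illustrative assumptions, not the thesis's experimentally tuned values:

```python
class PID:
    """Textbook discrete PID controller for one platform joint."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def cubic_profile(t, goal=1.0, T=1.0):
    """Smooth planner output: rest-to-rest cubic from 0 to goal over T seconds."""
    s = min(t / T, 1.0)
    return goal * (3 * s**2 - 2 * s**3)

# Track the trajectory on a crude unit-inertia joint (torque -> acceleration).
dt = 0.001
pid = PID(kp=120.0, ki=5.0, kd=18.0, dt=dt)
pos = vel = 0.0
for i in range(2000):  # 2 s of simulation; the trajectory finishes at t = 1 s
    torque = pid.update(cubic_profile(i * dt), pos)
    vel += torque * dt
    pos += vel * dt
print(round(pos, 3))
```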

  13. Design of a Multifunctional Platform Emulating Computer Peripherals Based on an Android Smartphone

    Institute of Scientific and Technical Information of China (English)

    罗圆; 刘世鑫; 瞿绍军

    2014-01-01

    With the rapid development of computing and communication technology, we are gradually entering the age of ubiquitous computing. The smart space is a concrete and concentrated expression of the essential features of pervasive computing, within which a smartphone can obtain enhanced, personalized services. This paper implements a multifunctional platform, based on an Android smartphone, that emulates computer peripherals: the phone and the computer are connected via the TCP/IP protocol, allowing users to control the computer remotely from the phone within a certain distance.
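
    The phone-to-computer link over TCP/IP can be sketched as a small request/reply command protocol (a minimal sketch with a made-up command set; in the real system the client side would run on the Android phone):

```python
import socket
import threading

# Hypothetical command set for the peripheral-emulation protocol.
COMMANDS = {"PING": "PONG", "MOUSE_LEFT": "OK", "VOLUME_UP": "OK"}

def serve_one_request(srv):
    """Accept a single connection, answer one command, then return."""
    conn, _ = srv.accept()
    with conn:
        cmd = conn.recv(64).decode().strip()
        conn.sendall(COMMANDS.get(cmd, "UNKNOWN").encode())

srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # ephemeral port for the demo
srv.listen(1)
threading.Thread(target=serve_one_request, args=(srv,), daemon=True).start()

# The "phone" side: connect, send a command, read the reply.
cli = socket.create_connection(srv.getsockname())
cli.sendall(b"PING")
reply = cli.recv(64).decode()
cli.close()
srv.close()
print(reply)  # → PONG
```

    A production version would add authentication and a framed message format rather than fixed-size reads.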

  14. Kinematics of an in-parallel actuated manipulator based on the Stewart platform mechanism

    Science.gov (United States)

    Williams, Robert L., II

    1992-01-01

    This paper presents kinematic equations and solutions for an in-parallel actuated robotic mechanism based on Stewart's platform. These equations are required for inverse position and resolved rate (inverse velocity) platform control. NASA LaRC has a Vehicle Emulator System (VES) platform designed by MIT which is based on Stewart's platform. The inverse position solution is straightforward and computationally inexpensive. Given the desired position and orientation of the moving platform with respect to the base, the lengths of the prismatic leg actuators are calculated. The forward position solution is more complicated and theoretically has 16 solutions. The position and orientation of the moving platform with respect to the base is calculated given the leg actuator lengths. Two methods are pursued in this paper to solve this problem. The resolved rate (inverse velocity) solution is derived. Given the desired Cartesian velocity of the end-effector, the required leg actuator rates are calculated. The Newton-Raphson Jacobian matrix resulting from the second forward position kinematics solution is a modified inverse Jacobian matrix. Examples and simulations are given for the VES.
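
    The inverse position solution described, leg lengths from the desired platform pose, can be sketched as follows (a generic Stewart-platform sketch with illustrative attachment geometry, not the VES's actual dimensions):

```python
import numpy as np

def leg_lengths(base_pts, plat_pts, position, rpy):
    """Inverse position solution for a Stewart platform: given the pose of
    the moving platform, return the six prismatic leg lengths.

    base_pts, plat_pts: 6x3 attachment points in the base / platform frames.
    position: platform-frame origin expressed in the base frame.
    rpy: (roll, pitch, yaw) of the platform in radians.
    """
    r, p, y = rpy
    Rx = np.array([[1, 0, 0], [0, np.cos(r), -np.sin(r)], [0, np.sin(r), np.cos(r)]])
    Ry = np.array([[np.cos(p), 0, np.sin(p)], [0, 1, 0], [-np.sin(p), 0, np.cos(p)]])
    Rz = np.array([[np.cos(y), -np.sin(y), 0], [np.sin(y), np.cos(y), 0], [0, 0, 1]])
    R = Rz @ Ry @ Rx
    # Leg vector: base attachment -> rotated-and-translated platform attachment.
    legs = (R @ plat_pts.T).T + position - base_pts
    return np.linalg.norm(legs, axis=1)

# Illustrative geometry: matching unit hexagons; at a pure 1 m vertical
# offset every leg has length exactly 1.
ang = np.deg2rad(np.arange(0, 360, 60))
base = np.c_[np.cos(ang), np.sin(ang), np.zeros(6)]
plat = base.copy()
L = leg_lengths(base, plat, np.array([0.0, 0.0, 1.0]), (0.0, 0.0, 0.0))
```

    The forward problem, recovering the pose from the six lengths, has no such closed form, which is why the paper resorts to iterative (Newton-Raphson) and polynomial methods.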

  15. Hip joint centre position estimation using a dual unscented Kalman filter for computer-assisted orthopaedic surgery.

    Science.gov (United States)

    Beretta, Elisa; De Momi, Elena; Camomilla, Valentina; Cereatti, Andrea; Cappozzo, Aurelio; Ferrigno, Giancarlo

    2014-09-01

    In computer-assisted knee surgery, the accuracy of the localization of the femur centre of rotation relative to the hip-bone (hip joint centre) is affected by the unavoidable and untracked pelvic movements because only the femoral pose is acquired during passive pivoting manoeuvres. We present a dual unscented Kalman filter algorithm that estimates the hip joint centre using, as an additional input, the position of a pelvic reference point that can be acquired with a skin marker placed on the hip, without increasing the invasiveness of the surgical procedure. A comparative assessment of the algorithm was carried out using data provided by in vitro experiments mimicking in vivo surgical conditions. Soft tissue artefacts were simulated and superimposed onto the position of a pelvic landmark. Femoral pivoting, consisting of a sequence of star-like quasi-planar movements followed by a circumduction, was performed. The dual unscented Kalman filter method proved to be less sensitive to pelvic displacements, which were shown to be larger during the manoeuvres in which the femur was more adducted. Comparable accuracy between all the analysed methods resulted for hip joint centre displacements smaller than 1 mm (error: 2.2 ± [0.2; 0.3] mm, median ± [inter-quartile range 25%; inter-quartile range 75%]) and between 1 and 6 mm (error: 4.8 ± [0.5; 0.8] mm) during planar movements. When the hip joint centre displacement exceeded 6 mm, the dual unscented Kalman filter proved to be more accurate than the other methods by 30% during multi-planar movements (error: 5.2 ± [1.2; 1] mm).
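
    As a point of comparison for such methods, the classical functional approach estimates the hip joint centre by least-squares fitting a sphere to the trajectory of a femoral point recorded during pivoting. A minimal algebraic sphere fit (a baseline sketch, not the paper's dual unscented Kalman filter) can be written as:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit: returns (centre, radius).

    Solves |p|^2 = 2 p.c + (r^2 - |c|^2) as a linear system in the
    centre c and the auxiliary scalar b = r^2 - |c|^2.
    """
    A = np.c_[2.0 * points, np.ones(len(points))]
    y = np.sum(points ** 2, axis=1)
    sol, *_ = np.linalg.lstsq(A, y, rcond=None)
    centre = sol[:3]
    radius = np.sqrt(sol[3] + centre @ centre)
    return centre, radius

# Synthetic noise-free pivoting: femoral point on a 0.30 m sphere
# around a hip joint centre at (0.10, 0.20, 0.00) m.
rng = np.random.default_rng(0)
dirs = rng.normal(size=(200, 3))
dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
pts = np.array([0.10, 0.20, 0.00]) + 0.30 * dirs
centre, radius = fit_sphere(pts)
```

    This baseline assumes a fixed pelvis; the paper's contribution is precisely to relax that assumption by tracking pelvic motion in the filter state.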

  16. An Investigation of a Computer Training Company's Migration to a New Distance Learning Platform and the Implementation of an Online Professional Development Program

    Science.gov (United States)

    Rudd, Denis; Bernadowski, Carianne

    2015-01-01

    The purpose of the study was to determine if the Training Partner Program was successful in preparing trainers to use a new distance learning platform. Results indicate the program was a success in improving self-efficacy, engagement, and collaboration among trainers. Additionally, characteristics of online trainers are identified. Online learning…

  18. Research on the Construction of a Logistics Park Information Service Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    陈翠萍; 黄章树; 李宝玉

    2015-01-01

    The construction of a logistics park information service platform is of great significance for integrating logistics resources, improving the level of logistics informatization, raising logistics operating efficiency and reducing logistics costs. This paper first introduces the concept of cloud computing and analyses the requirements for and positioning of a logistics park information service platform; it then constructs the platform based on cloud computing and describes the function of each module and the key technologies of the platform's construction. This can serve as a reference for enterprises and institutions engaged in research on logistics park informatization in China.

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  20. Integration of the TNXYZ computer program inside the Salome platform

    Energy Technology Data Exchange (ETDEWEB)

    Chaparro V, F. J.

    2014-07-01

    The present work shows the procedure carried out to integrate the TNXYZ code as a calculation tool into the Salome graphical simulation platform. The TNXYZ code provides a numerical solution of the neutron transport equation in several energy groups, in steady state and three-dimensional geometry. To discretize the variables of the transport equation, the code uses the discrete ordinates method for the angular variable and a nodal method for the spatial dependence. The Salome platform is a graphical environment designed for building, editing and simulating mechanical models, mainly focused on industry; unlike other software, it can integrate and control an external source code so as to form a complete scheme of pre- and post-processing of information. Before the integration into the Salome platform, the TNXYZ code was upgraded. TNXYZ was programmed in the 1990s with a Fortran 77 compiler, so the code was adapted to the characteristics of current Fortran compilers; in addition, with the intention of extracting partial results along the process sequence, the original structure of the program underwent a modularization process, i.e. the main program was divided into sections where the code performs major operations. This procedure is controlled by the YACS module of the Salome platform, and it could be useful for a subsequent coupling with thermal-hydraulics codes. Finally, with the help of the Monte Carlo code Serpent, several study cases were defined in order to check the integration; the verification consisted in comparing the results obtained with the code executed stand-alone against those obtained after it was modernized, integrated and controlled by the Salome platform. (Author)

  1. Synthesis, photophysical and computational studies of two lophine derivatives with electron-rich substituents in the 2-position

    Science.gov (United States)

    Hamada, Terianne; Le, Tammy; Voegtle, Matthew J.; Doyle, Bryan; Rimby, Jarred; Isovitsch, Ralph

    2017-02-01

    An exploration of the photophysical properties of two lophine derivatives with electron-donating groups in the 2-position began with the preparation of compounds 1 and 2 via one-pot reactions in good yields, 83% and 74%, respectively. The absorption spectra of 1 and 2 had bands at approximately 300 nm (ε ≈ 25,000-34,000 M⁻¹ cm⁻¹), while that of 2 had an additional band at 348 nm (ε = 35,600 M⁻¹ cm⁻¹). These absorptions were assigned to π→π* transitions. Excitation into the absorption band of 1 at approximately 300 nm produced emission at 387 nm, while excitation into either of the absorption bands of 2 produced emission at 406 nm. Of the two compounds, 2 had the higher quantum yield. The emission spectra of compounds 1 and 2 were slightly blue-shifted at 77 K. Excited state lifetimes for 1 and 2 were short (indicating that the observed emission was fluorescence) at room temperature and 77 K, ranging from 1.1 to 1.8 ns. Computational studies of both compounds 1 and 2 were performed to better understand how their structures relate to their photophysical properties.

  2. Computer-assisted orthognathic surgery: waferless maxillary positioning, versatility, and accuracy of an image-guided visualisation display.

    Science.gov (United States)

    Zinser, Max J; Mischkowski, Robert A; Dreiseidler, Timo; Thamm, Oliver C; Rothamel, Daniel; Zöller, Joachim E

    2013-12-01

    There may well be a shift towards 3-dimensional orthognathic surgery when virtual surgical planning can be applied clinically. We present a computer-assisted protocol that uses surgical navigation supplemented by an interactive image-guided visualisation display (IGVD) to transfer virtual maxillary planning precisely. The aim of this study was to analyse its accuracy and versatility in vivo. The protocol consists of maxillofacial imaging, diagnosis, planning of virtual treatment, and intraoperative surgical transfer using an IGV display. The advantage of the interactive IGV display is that the virtually planned maxilla and its real position can be completely superimposed during operation through a video graphics array (VGA) camera, thereby augmenting the surgeon's 3-dimensional perception. Sixteen adult class III patients were treated by bimaxillary osteotomy. Seven hard tissue variables were chosen to compare (ΔT1-T0) the virtual maxillary planning (T0) with the postoperative result (T1) using 3-dimensional cephalometry. Clinically acceptable precision for the surgical planning transfer of the maxilla was achieved…

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar in ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity discussing the impact and for addressing issues and solutions to the main challenges facing CMS computing. The lack of manpower is particul...

  4. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  5. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  6. Design of the Architecture of a Digital Library Service Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    崔忠伟; 左羽; 韦萍萍; 熊伟程

    2014-01-01

    Based on an analysis of the functional requirements of a digital library, this paper proposes a design for a digital library service platform architecture based on cloud computing. The platform has a six-tier structure and can provide storage services for resource sharing as well as computing services for computationally intensive jobs such as gene sequencing. The implementation technologies of the architecture are introduced, with which the cloud service platform can be rapidly established.

  7. Application of a Network Interactive Platform in Computer Assembly and Maintenance Courses

    Institute of Scientific and Technical Information of China (English)

    齐秀国

    2011-01-01

    The network interactive platform for vocational computer majors is a means of online education. It provides vocational students with a self-directed learning environment and a platform for practising computer assembly and maintenance; it makes teaching and learning in computer-related vocational majors better conform to cognitive principles; it raises students' interest in learning; and it delivers up-to-date, cutting-edge knowledge.

  8. Research and Implementation of a Cloud-Computing-Based Animation Rendering Experimental Platform

    Institute of Scientific and Technical Information of China (English)

    廖宏建; 杨玉宝; 唐连章; 卫建安

    2012-01-01

    The hardware resources required for animation rendering are a bottleneck for animation teaching and research in universities. Cloud computing offers high-performance computing, mass storage and intelligent deployment, and a cloud renderer built on a cloud computing platform provides a solution to this bottleneck. Rendering nodes created through virtualization, rendering task management, cloud storage and service interfaces form the core of the cloud rendering experimental platform. Through a remote desktop or a web self-service system, users can complete the rendering workflow of submitting tasks, reserving resources, setting parameters, checking scenes and viewing results. Practice has shown that the platform is advanced, economical and easy to manage.

  9. Research on the Building of an Electronic Government Public Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    吕小刚

    2016-01-01

    As our government enters the stage of intelligent government construction, the government informatization model has shifted from being government-led to a demand-oriented, problem-oriented model centred on public needs. Under the new situation, the construction of a government cloud platform better integrates hardware and service resources and achieves efficient scheduling of resources that closely matches public demand. By explaining the relevant concepts of cloud computing and building on the principles of innovation, coordination, greenness, openness and sharing, this paper analyzes the significance of building an e-government public platform based on cloud computing and explores strategies for its construction.

  10. A Highly Reliable Cloud Computing Platform and Its Application in Smarter Forestry

    Institute of Scientific and Technical Information of China (English)

    刘亚秋; 景维鹏; 井云凌

    2011-01-01

    "Smarter forestry" is the wise management of forest and ecological resources that uses all available monitoring technology to achieve a full range perception and processing of forest resources,while cloud computing as a new computing model will greatly enhance the decision-making level of smarter forestry.Based on the analysis of connotation and characteristics of clouding computing platform and its relationship with grid computing and distributed computing,the paper put forward and constructed the high reliability cloud computing platform for smarter forestry.This platform based forestry information to combine smarter forestry and natural control theory with cloud computing system closely.The smarter forestry architecture was built with the universal perception and ubiquitous access as the foundation,the highly reliable cloud computing as the means and intelligent decision analysis as the target.The trusted computing suitable for smarter forestry and natural control was proposed,covering data mining technology for very large databases,automatic storage and management,large-scale news communication,reliable resources scheduling,new calculation system.Finally,the basic application of highly reliable cloud computing platform to smarter forestry was discussed.%"智慧林业"就是使用一切可以实现的监测技术对森林资源进行全方位的感知、处理,实现森林及生态资源的智慧管理。云计算作为新型计算模式将极大提高智慧林业决策水平。文中在分析云计算平台的内涵特征及其与网格计算和分布式计算关系的基础上,提出并构建了高可靠智慧林业云计算平台。该平台基于林业信息化思想,将智慧林业及自然控制理论与云计算体系紧密结合,构建了以普适感知与泛在接入为基础、高可靠云计算为手段、智能决策分析为目标的智慧林业体系结构。提出了适于智慧林业与自然控制的海量数据挖掘、自动存储管

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, not only by using opportunistic resources like the San Diego Supercomputer Center which was accessible, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. JUSTINA: A platform for autonomous land vehicle research and development

    Science.gov (United States)

    Juhala, Matti

    A platform for autonomous land vehicle research and development was developed. The platform makes it possible to conduct tests in natural surroundings. It is based on a small car, which was modified and equipped for computer control. The car is equipped with a controller area network, which makes it possible to use parallel processing. The computer system and the modifications made are described. Programs for basic image processing were written and tested in a simple line-following application. The programs allow different image processing tasks to be performed on a desktop computer using either a recorded video signal or digitized images of natural scenes. Special attention was paid to the camera calibration procedure. A simple vehicle model was created for hardware-in-the-loop testing of image processing and vehicle control algorithms. The model uses artificially produced road images. The vehicle is equipped with a differential Global Positioning System (GPS) navigation system, which was tested in an open-field application. The platform is modular and can be easily adapted for different research purposes in autonomous land vehicle development.

  13. MobiNet: a pedagogic platform for Computer Science, Maths and Physics (How to make students love Maths by programming video games)

    OpenAIRE

    Lefebvre, Sylvain; Neyret, Fabrice; Hornus, Samuel; Thollot, Joëlle

    2004-01-01

    International audience; We developed the MobiNet (free) platform and tutorial sessions (tested on 16 batches of high school students) with the aim of offering students a new way of learning and understanding academic scientific subjects. Our approach consists of letting the students manipulate mathematical and physical notions as tools in order to solve concrete tasks, such as a solar system simulation or a video game. This makes students formalize a real-world problem and experiment by trial-and-error…

  14. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television as many long feared, but the computer-the ubiquitous portal of work and personal lives. At this point, the computer is almost so common we don't notice it in our view. It's difficult to envision that not that long ago it was a gigantic, room-sized structure only to be accessed by a few inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little-more noted than a toaster. These dramati

  15. Design and Implementation of a Core Curriculum Network Platform for Computer Science Based on C#

    Institute of Scientific and Technical Information of China (English)

    姬涛

    2014-01-01

    The core curriculum network platform for computer science is designed with a B/S architecture, using C# as the development language together with JavaScript and DIV+CSS for front-end page development. Overall development is carried out on the Microsoft Visual Studio platform, with SQL Server as the back-end database. Beyond implementing the basic functions, the platform's interface is clear, attractive and simple, making it convenient to use and highly practical.

  16. The Application of a Digital Resource Platform in Computer Course Teaching in Secondary Vocational Schools

    Institute of Scientific and Technical Information of China (English)

    董自上

    2016-01-01

    With the advent of the "Internet Plus" era, digitalized educational resources for vocational education and the platforms built on them have been further improved. This paper analyzes the application of current digital resource platforms in computer course teaching in secondary vocational schools, explores the platforms' effects on teaching and learning, and concludes with related suggestions for their use.

  17. The Application of a Mobile Community Health Care Service Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    陈平平; 谭定英; 刘慧玲

    2012-01-01

    To help solve residents' difficulties in obtaining medical care and improve the service level of community medical institutions, this paper proposes a community health service platform based on cloud computing and mobile technology. The platform is simple, intuitive and easy to operate, and creates personalized community health services for residents and families. At the same time, through the platform, community medical institutions and departments can reduce duplicated investment in informatization and lower development costs.

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. Evaluation of Condylar Position after Orthognathic Surgery for Treatment of Class II Vertical Maxillary Excess and Mandibular Deficiency by Using Cone-Beam Computed Tomography

    OpenAIRE

    Reza Tabrizi; Shoaleh Shahidi; Dept. of Oral and Maxillofacial Radiology, Biomaterials Research Center, Shiraz University of Medical Sciences, Shiraz, Iran.; Hamidreza Arabion

    2016-01-01

    Statement of the Problem: In orthognathic surgeries, proper condylar position is one of the most important factors in postoperative stability. Knowing the condylar movement after orthognathic surgery can help preventing postoperative instabilities. Purpose: The aim of this study was to evaluate the condylar positional changes after Le Fort I maxillary superior repositioning along with mandibular advancement by using cone beam computed tomography (CBCT). Materials and Method: This cross...

  1. Research and Implementation of a WiFi Positioning System Based on the Android Platform

    Institute of Scientific and Technical Information of China (English)

    吴雨; 杨力; 王梦茹; 孔港港

    2016-01-01

    With people's growing demand for indoor location services, indoor Wi-Fi positioning based on the Android platform has become a research hotspot. Using Wi-Fi wireless sensing combined with an Android smartphone, this paper develops an indoor positioning system based on a weighted K-nearest-neighbour location-fingerprint algorithm and carries out experiments in the field. The results show that the system achieves real-time positioning with good accuracy, stable to within 3 m. To further improve Wi-Fi positioning accuracy, future work should optimize the algorithm and improve signal stability.
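
    The weighted K-nearest-neighbour fingerprint matching described above can be sketched as follows (a generic sketch with a made-up three-point fingerprint database and inverse-distance weights; the paper's exact weighting scheme may differ):

```python
import math

def wknn_locate(fingerprints, observed, k=3):
    """Weighted K-nearest-neighbour Wi-Fi fingerprint positioning.

    fingerprints: list of ((x, y), {ap_id: rssi}) reference records.
    observed: {ap_id: rssi} scan taken at the unknown location.
    """
    def rssi_dist(ref):
        shared = set(ref) & set(observed)
        return math.sqrt(sum((ref[ap] - observed[ap]) ** 2 for ap in shared))

    nearest = sorted(fingerprints, key=lambda fp: rssi_dist(fp[1]))[:k]
    weights = [1.0 / (rssi_dist(fp[1]) + 1e-6) for fp in nearest]
    total = sum(weights)
    x = sum(w * fp[0][0] for w, fp in zip(weights, nearest)) / total
    y = sum(w * fp[0][1] for w, fp in zip(weights, nearest)) / total
    return x, y

# Hypothetical survey points (metres) with their recorded RSSI fingerprints.
db = [((0.0, 0.0), {"ap1": -40, "ap2": -70}),
      ((0.0, 5.0), {"ap1": -55, "ap2": -60}),
      ((5.0, 0.0), {"ap1": -70, "ap2": -40})]
x, y = wknn_locate(db, {"ap1": -41, "ap2": -69}, k=2)  # scan taken near (0, 0)
```

    The estimate is pulled toward the reference point whose fingerprint is closest in RSSI space; averaging the scan over several samples per position is a common way to improve the reported stability.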

  2. Research on the Design Scheme of an Electronic Business Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    顾海燕

    2015-01-01

    This paper introduces the basic concepts of cloud computing and, to address the problems facing the Yantai apple industry, proposes a scheme for building a cloud-based e-commerce platform for Yantai apples, together with its detailed design ideas.

  3. Structural Design and Implementation of a University Cloud-Computing Platform Based on Virtualization Technology

    Institute of Scientific and Technical Information of China (English)

    周浩

    2015-01-01

    In view of the current development of cloud computing technology, this paper analyses cloud computing and virtualization technology and proposes a deployment scheme for a university private-cloud platform based on server virtualization, which can serve as a reference for the deployment of cloud-computing data centres at other colleges and universities.

  4. Research on an Information Management Platform for Coal Enterprises Based on Cloud Computing Technology

    Institute of Scientific and Technical Information of China (English)

    欧莹元

    2014-01-01

    This paper describes the problems coal enterprises face in management informatization and proposes a solution based on cloud computing technology. It introduces the design of a cloud-based coal-enterprise management platform, presents its functional structure, and details the functional modules of seven subsystems: production management, coal preparation management, logistics and distribution, customer management, production scheduling, data analysis, and the cloud computing centre. The platform improves the production efficiency of coal enterprises and is more stable, more efficient, and more scalable than traditional management platforms, while greatly reducing enterprises' investment in informatization and saving management costs.

  5. Position and Workspace Analysis of a 3-DOF Parallel Stable Platform Mechanism

    Institute of Scientific and Technical Information of China (English)

    常兴; 刘安心; 房立丰; 杨廷力

    2012-01-01

    Based on the theory of structure type synthesis of parallel robot mechanisms, a spatial 3-DOF parallel mechanism capable of one translation and two rotations is proposed. Using the single-open-chain (SOC) unit and position-and-orientation-characteristic (POC) matrix theory, the mechanism's topological structure and position are analysed, and its forward and inverse position-solution equations are established. Decoupling and singular-configuration analyses are then given, followed by a workspace analysis. The mechanism is easy to control and to plan trajectories for, with good application prospects in ship stabilization platforms and similar fields.
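
    An inverse position solution of the kind the abstract mentions can be sketched for a generic one-translation, two-rotation (1T2R) platform. This is a hedged illustration, not the paper's formulation: the limb geometry, anchor layout, and pose parameterization (rotations alpha about x and beta about y, heave z) are all assumptions.

    ```python
    import math

    def rot_xy(alpha, beta):
        """Rotation matrix R = Ry(beta) @ Rx(alpha), angles in radians."""
        ca, sa = math.cos(alpha), math.sin(alpha)
        cb, sb = math.cos(beta), math.sin(beta)
        return [
            [cb, sb * sa, sb * ca],
            [0.0, ca, -sa],
            [-sb, cb * sa, cb * ca],
        ]

    def inverse_position(base_pts, plat_pts, alpha, beta, z):
        """Inverse position solution: actuator length of each limb for the
        pose (alpha, beta, z) of a simplified 1T2R parallel platform."""
        R = rot_xy(alpha, beta)
        lengths = []
        for b, p in zip(base_pts, plat_pts):
            # Platform anchor expressed in the base frame: R @ p + (0, 0, z)
            w = [sum(R[i][j] * p[j] for j in range(3)) for i in range(3)]
            w[2] += z
            lengths.append(math.dist(w, b))
        return lengths

    # Illustrative symmetric geometry: three limb anchors spaced 120 degrees
    # apart on circles of radius 1 (base) and 0.5 (platform).
    angles = (0.0, 2 * math.pi / 3, 4 * math.pi / 3)
    base = [(math.cos(a), math.sin(a), 0.0) for a in angles]
    plat = [(0.5 * math.cos(a), 0.5 * math.sin(a), 0.0) for a in angles]

    # At zero tilt and heave z = 1, every limb has length sqrt(0.25 + 1).
    print(inverse_position(base, plat, 0.0, 0.0, 1.0))
    ```

    The inverse solution is closed-form (one distance computation per limb), which is what makes such mechanisms convenient for control and trajectory planning; the forward solution generally requires solving the coupled nonlinear system numerically.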

  6. Multifocal sparganosis mimicking lymphoma involvement: Multimodal imaging findings of ultrasonography, CT, MRI, and positron emission tomography-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Heo, So Young; Park, Ji Yeon; Park, Noh Hyuck; Park, Chan Sub; Kim, Tae Jung [Myongji Hospital, Seonam University College of Medicine, Goyang (Korea, Republic of); Yi, Seong Yoon [Div. of Hematology-Oncology, Dept. of Internal Medicine, Inje University Ilsan Paik Hospital, Goyang (Korea, Republic of); Jun, Hyun Jung [Div. of Hematology-Oncology, Dept. of Internal Medicine, Seoul Medical Center, Seoul (Korea, Republic of)

    2016-01-15

    Sparganosis is a rare parasitic disease caused by the migrating plerocercoid larva of Spirometra species tapeworms. The most frequent clinical manifestation is a subcutaneous nodule resembling a neoplasm. In this study, we present multimodal findings of ultrasonography, computed tomography, magnetic resonance imaging, positron emission tomography-computed tomography, and follow-up imaging of multifocal sparganosis mimicking lymphoma involvement in a patient with lymphoma.

  7. Research and Design of a Campus Network Intelligent Learning Platform Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    洪燕璇

    2015-01-01

    With the rapid development of cloud computing technology, it has been widely adopted in campus information systems. To raise the level of automation, intelligence, and informatization in teaching management, this paper proposes using cloud computing technology to design a campus-network intelligent learning platform that virtualizes teaching software and hardware resources. Teachers and students obtain cloud accounts through the platform, multiple users can log in and be managed concurrently, and teaching resources can be used with support for resumable transfers and recording of course progress, so as to improve learners' initiative and enthusiasm and raise teaching quality.

  8. Design of a Supply and Demand Platform for the Equipment Manufacturing Industry Based on Cloud Computing

    Institute of Scientific and Technical Information of China (English)

    赵红; 苏剑峰

    2015-01-01

    With the rapid development of information technology, China's equipment manufacturing enterprises need to accelerate informatization, upgrade their technology, and integrate resources. As an emerging data-processing mode, cloud computing can strengthen information sharing between equipment manufacturing enterprises. Based on an analysis of the industry's requirements for products, technology, equipment, and services, a cloud-based platform architecture is proposed. The platform's functional modules are described, and a business-process data model based on cloud computing technology is designed, providing a reference for realizing the system goals and further meeting customer demand.

  9. Exploration of Computer Language Practice Teaching Based on a Virtual Experiment Platform

    Institute of Scientific and Technical Information of China (English)

    徐浙君; 俞淑燕

    2012-01-01

    The quality of computer-language teaching has long been a common concern of teachers and students. Addressing the shortcomings of current computer-language practice teaching, and taking C-language practice teaching as an example, this paper builds an integrated virtual experiment platform combining learning, experiments, evaluation, and testing. Based on this platform, the author proposes several teaching means and methods that have improved the quality of the practice-teaching component.

  10. Research on the Construction of a "Computer Aided Design" Platform Course for Design Majors

    Institute of Scientific and Technical Information of China (English)

    梁燕

    2014-01-01

    This paper offers a new approach to reforming the "Computer Aided Design" course for design majors. Building a "Computer Aided Design" platform course for design majors is a complex, systematic project; after systematic research and practice, a design plan for its construction was developed. The platform-course construction provides useful reference and inspiration for exploring professional course construction for design majors at ordinary universities.

  11. La seguridad informática en el trabajo con la plataforma Moodle (Computer Security in Working with the Moodle Platform)

    Directory of Open Access Journals (Sweden)

    Romero-Moreno, María-José Luisa

    2010-12-01

    Full Text Available. Abstract: This paper presents the security aspects of the Moodle platform. It is well known that virtual training systems pervade the academic world (virtual campuses) and, increasingly, companies (continuing training), and eLearning platforms present themselves as suitable tools in these contexts. We focus on free software, and in particular on the platform that today is a true reference point in the area of virtual training. It seems essential, however, that teachers and tutors can be confident that their files are properly protected. We analyse how to take advantage of the tool's security levels and how to configure it to obtain the expected results.

  13. 3D Position and Velocity Vector Computations of Objects Jettisoned from the International Space Station Using Close-Range Photogrammetry Approach

    Science.gov (United States)

    Papanyan, Valeri; Oshle, Edward; Adamo, Daniel

    2008-01-01

    Measurement of a jettisoned object's departure trajectory and velocity vector in the International Space Station (ISS) reference frame is vitally important for prompt evaluation of the object's imminent orbit. We report on the first successful application of photogrammetric analysis of ISS imagery for the prompt computation of a jettisoned object's position and velocity vectors. As examples of post-EVA analyses, we present the Floating Potential Probe (FPP) and Russian "Orlan" space-suit jettisons, as well as the near-real-time (provided several hours after separation) computations of the Video Stanchion Support Assembly Flight Support Assembly (VSSA-FSA) and Early Ammonia Servicer (EAS) jettisons during the US astronauts' space-walk. Standard close-range photogrammetry analysis was used during this EVA to analyze two on-board camera image sequences down-linked from the ISS. In this approach, the ISS camera orientations were computed from the known coordinates of several reference points on the ISS hardware; the position of the jettisoned object in each time-frame was then computed from its image in each frame of the video clips. In another, "quick-look" approach used in near-real time, camera orientation was computed from camera position (from the ISS CAD model) and operational data (pan and tilt), and the location of the jettisoned object was calculated for only several frames of the two synchronized movies. Keywords: photogrammetry, International Space Station, jettisons, image analysis.
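
    Once per-frame 3-D positions have been triangulated, the final step described above (recovering a departure velocity vector) reduces to fitting a straight line through the tracked positions over the short observation window. The sketch below is illustrative, not the actual ISS pipeline: it assumes the motion is well approximated as linear over the clip and uses a per-axis least-squares fit; the synthetic track data are invented for the example.

    ```python
    # Hedged sketch: recover a departure velocity vector from per-frame 3-D
    # positions by fitting r(t) = r0 + v*t to each axis with least squares.
    def fit_velocity(times, positions):
        """Return (r0, v) as 3-tuples from timestamps and (x, y, z) fixes."""
        n = len(times)
        t_mean = sum(times) / n
        denom = sum((t - t_mean) ** 2 for t in times)
        r0, v = [], []
        for axis in range(3):
            p = [pos[axis] for pos in positions]
            p_mean = sum(p) / n
            # Ordinary least-squares slope and intercept for this axis
            slope = sum((t - t_mean) * (pi - p_mean)
                        for t, pi in zip(times, p)) / denom
            v.append(slope)
            r0.append(p_mean - slope * t_mean)
        return tuple(r0), tuple(v)

    # Synthetic track: object drifting at (0.1, -0.05, 0.02) m/s from (1, 2, 3) m
    times = [0.0, 1.0, 2.0, 3.0, 4.0]
    track = [(1 + 0.1 * t, 2 - 0.05 * t, 3 + 0.02 * t) for t in times]
    r0, v = fit_velocity(times, track)
    print(r0, v)
    ```

    Fitting over all frames, rather than differencing two of them, averages down the per-frame triangulation noise, which matters when the departure speed is small relative to the position uncertainty.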

  14. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  18. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  20. Influence of respiratory gating, image filtering, and animal positioning on high-resolution electrocardiography-gated murine cardiac single-photon emission computed tomography

    NARCIS (Netherlands)

    Wu, Chao; Vaissier, Pieter E. B.; Vastenhouw, Brendan; de Jong, Johan R.; Slart, Riemer H. J. A.; Beekman, Freek J.

    2015-01-01

    Cardiac parameters obtained from single-photon emission computed tomographic (SPECT) images can be affected by respiratory motion, image filtering, and animal positioning. We investigated the influence of these factors on ultra-high-resolution murine myocardial perfusion SPECT. Five mice were inject