WorldWideScience

Sample records for biomedical computing center

  1. Computational intelligence in biomedical imaging

    CERN Document Server

    2014-01-01

    This book provides a comprehensive overview of state-of-the-art computational intelligence research and technologies in biomedical imaging, with emphasis on biomedical decision making. Biomedical imaging offers useful information on patients’ medical conditions and clues to the causes of their symptoms and diseases. It also produces large numbers of images that physicians must interpret, so computer aids are in demand and have become indispensable to physicians’ decision making. This book discusses major technical advancements and research findings in the field of computational intelligence in biomedical imaging, for example, computational intelligence in computer-aided diagnosis for breast cancer, prostate cancer, and brain disease, in lung function analysis, and in radiation therapy. The book examines technologies and studies that have reached the practical level, and those that are rapidly becoming available in hospital clinical practice, such as computational inte...

  2. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30. Final BCTIC library listings were distributed to members of the active mailing list. The library listing also includes the names and addresses of program authors and contributors so that users may have continued support for their programs. The BCTIC library list is attached.

  3. University of Vermont Center for Biomedical Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bernstein, Dr. Ira [University of Vermont and State Agricultural College

    2013-08-02

    This grant was awarded in support of Phase 2 of the University of Vermont Center for Biomedical Imaging. Phase 2 outlined several specific aims, including: the development of expertise in MRI and fMRI imaging and their applications; the acquisition of peer-reviewed extramural funding in support of the Center; and the development of a Core Imaging Advisory Board, fee structure, and protocol review and approval process.

  4. Computer vision for biomedical image applications. Proceedings

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yanxi [Carnegie Mellon Univ., Pittsburgh, PA (United States). School of Computer Science, The Robotics Institute; Jiang, Tianzi [Chinese Academy of Sciences, Beijing (China). National Lab. of Pattern Recognition, Inst. of Automation; Zhang, Changshui (eds.) [Tsinghua Univ., Beijing, BJ (China). Dept. of Automation

    2005-07-01

    This book constitutes the refereed proceedings of the First International Workshop on Computer Vision for Biomedical Image Applications: Current Techniques and Future Trends, CVBIA 2005, held in Beijing, China, in October 2005 within the scope of ICCV 2005. (orig.)

  5. The Lister Hill National Center for Biomedical Communications.

    Science.gov (United States)

    Smith, K A

    1994-09-01

    On August 3, 1968, a Joint Resolution of Congress established the program and construction of the Lister Hill National Center for Biomedical Communications. The facility, dedicated in 1980, contains the latest in computer and communications technologies. The history, program requirements, construction management, and general planning are discussed, including technical issues regarding cabling, systems functions, the heating, ventilation, and air conditioning (HVAC) system, fire suppression, and research and development laboratories, among others.

  6. Applications of computational intelligence in biomedical technology

    CERN Document Server

    Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena

    2016-01-01

    This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of biomedical and medical informatics, ranging from theoretical considerations to practical applications. Various aspects of development methods and algorithms in biomedical and medical informatics are discussed, as well as algorithms for medical image processing and modeling methods. Individual contributions also cover medical decision-making support, estimation of treatment risks, reliability of medical systems, problems of practical clinical applications, and many other topics. This book is intended for scientists interested in problems of biomedical technologies, for researchers and academic staff, for all those dealing with biomedical and medical informatics, and for PhD students. Useful information is also offered to IT companies, developers of equipment and/or software for medicine, and medical professionals.

  7. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  8. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  11. Biomedical cloud computing with Amazon Web Services.

    Science.gov (United States)

    Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J

    2011-08-01

    In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, while remaining general enough to extrapolate to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project, not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.
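
    As a concrete illustration of the single-instance usage pattern described above, the sketch below uses the boto3 library (a modern AWS SDK that postdates this 2011 article) to launch one EC2 instance of the m2.2xlarge type cited in the abstract. The AMI ID and key pair are placeholders, and the cost arithmetic simply reproduces the abstract's ~48 h / ~$48 figure; this is a hedged sketch, not the authors' pipeline.

```python
# Hedged sketch: launch a single EC2 instance for a read-mapping job using
# boto3 (a modern AWS SDK; the 2011 article predates it). ImageId and
# KeyName are placeholders, not values from the paper.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-XXXXXXXX",     # placeholder AMI with an aligner preinstalled
    InstanceType="m2.2xlarge",  # instance type cited in the abstract
    KeyName="my-keypair",       # placeholder SSH key pair
    MinCount=1,
    MaxCount=1,
)
instance_id = response["Instances"][0]["InstanceId"]
print("Launched mapping instance:", instance_id)

# Back-of-envelope check of the abstract's cost figure:
# an assumed on-demand rate of about $1/hour for 48 hours gives roughly $48.
hours, rate_per_hour = 48, 1.0
print("Estimated compute cost: $%.0f" % (hours * rate_per_hour))
```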

  12. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  13. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  14. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  15. Cloud computing applications for biomedical science: A perspective.

    Science.gov (United States)

    Navale, Vivek; Bourne, Philip E

    2018-06-01

    Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research.

  16. Cloud computing: a new business paradigm for biomedical information sharing.

    Science.gov (United States)

    Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti

    2010-04-01

    We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.

  17. Multiscale computer modeling in biomechanics and biomedical engineering

    CERN Document Server

    2013-01-01

    This book reviews the state-of-the-art in multiscale computer modeling, in terms of both accomplishments and challenges. The information in the book is particularly useful for biomedical engineers, medical physicists and researchers in systems biology, mathematical biology, micro-biomechanics and biomaterials who are interested in how to bridge between traditional biomedical engineering work at the organ and tissue scales, and the newer arenas of cellular and molecular bioengineering.

  18. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    Science.gov (United States)

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and suboptimal results. Although written from the perspective of Clinical and Translational Science Award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  19. Java and its future in biomedical computing.

    Science.gov (United States)

    Rodgers, R P

    1996-01-01

    Java, a new object-oriented computing language related to C++, is receiving considerable attention due to its use in creating network-sharable, platform-independent software modules (known as "applets") that can be used with the World Wide Web. The Web has rapidly become the most commonly used information-retrieval tool associated with the global computer network known as the Internet, and Java has the potential to further accelerate the Web's application to medical problems. Java's potentially wide acceptance due to its Web association and its own technical merits also suggests that it may become a popular language for non-Web-based, object-oriented computing. PMID:8880677

  20. Computer Center: Software Review.

    Science.gov (United States)

    Duhrkopf, Richard, Ed.; Belshe, John F., Ed.

    1988-01-01

    Reviews a software package, "Mitosis-Meiosis," available for Apple II or IBM computers with colorgraphics capabilities. Describes the documentation, presentation and flexibility of the program. Rates the program based on graphics and usability in a biology classroom. (CW)

  1. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-the-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease.   The book will be of interest to researchers, PhD students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, and experimental analysis.

  2. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  3. Computational Phase Imaging for Biomedical Applications

    Science.gov (United States)

    Nguyen, Tan Huu

    When a sample is illuminated by an imaging field, its fingerprints are left on the amplitude and the phase of the emerging wave. Capturing the information of the wavefront grants us a deeper understanding of the optical properties of the sample, and of the light-matter interaction. While the amplitude information has been intensively studied, the use of the phase information has been less common. Because all detectors are sensitive to intensity, not phase, wavefront measurements are significantly more challenging. Deploying optical interferometry to measure phase through phase-intensity conversion, quantitative phase imaging (QPI) has recently gained tremendous success in material and life sciences. The first topic of this dissertation describes our effort to develop a new QPI setup, named transmission Spatial Light Interference Microscopy (tSLIM), that uses twisted nematic liquid-crystal (TNLC) modulators. Compared to the established SLIM technique, tSLIM is much less expensive to build than its predecessor while maintaining significant performance. The established SLIM system uses parallel-aligned liquid-crystal (PANLC) modulators; tSLIM has a slightly smaller signal-to-noise ratio (SNR) and a more complicated model for the image formation. However, such complexity is well addressed by computing. Most importantly, tSLIM uses TNLC modulators that are popular in display LCDs. Therefore, the total cost of the system is significantly reduced. Alongside developing new imaging modalities, we also improved current QPI imaging systems. In practice, an incident field to the sample is rarely perfectly spatially coherent, i.e., a plane wave. It is generally partially coherent; i.e., it comprises many incoherent plane waves coming from multiple directions. This illumination yields artifacts in the phase measurement results, e.g., halo and phase underestimation. One solution is using a very bright source, e.g., a laser, which can be spatially filtered very well. However, the
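
    The phase-intensity conversion mentioned above can be illustrated with the classic four-step phase-shifting algorithm: four intensity images are recorded while the phase offset between the interfering fields is stepped by pi/2, and the phase map follows from an arctangent of intensity differences. This is a minimal numpy sketch of the general principle, not the dissertation's tSLIM pipeline; the synthetic phase map and modulation values are assumptions for illustration.

```python
# Hedged sketch: four-step phase-shifting retrieval, the textbook principle
# behind QPI's phase-intensity conversion (not the tSLIM pipeline itself).
import numpy as np

# Synthetic ground-truth phase map (radians) standing in for a sample.
y, x = np.mgrid[-64:64, -64:64]
true_phase = 1.5 * np.exp(-(x**2 + y**2) / 800.0)

A, B = 1.0, 0.8  # assumed background and modulation amplitudes
# Record four interferograms while stepping the phase offset by pi/2.
I0, I1, I2, I3 = (A + B * np.cos(true_phase + k * np.pi / 2) for k in range(4))

# I3 - I1 = 2B*sin(phi) and I0 - I2 = 2B*cos(phi), so:
recovered = np.arctan2(I3 - I1, I0 - I2)
print("max reconstruction error (rad):", np.abs(recovered - true_phase).max())
```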

  4. 1st International Conference on Computational and Experimental Biomedical Sciences

    CERN Document Server

    Jorge, RM

    2015-01-01

    This book contains the full papers presented at ICCEBS 2013 – the 1st International Conference on Computational and Experimental Biomedical Sciences, which was organized in the Azores in October 2013. The included papers present and discuss new trends in those fields, using several methods and techniques, including active shape models, constitutive models, isogeometric elements, genetic algorithms, level sets, material models, neural networks, optimization, and the finite element method, in order to address more efficiently different and timely applications involving biofluids, computer simulation, computational biomechanics, image-based diagnosis, image processing and analysis, image segmentation, image registration, scaffolds, simulation, and surgical planning. The main audience for this book consists of researchers, PhD students, and graduate students with multidisciplinary interests related to the areas of artificial intelligence, bioengineering, biology, biomechanics, computational fluid dynamics, comput...

  5. Computational Biomechanics: Theoretical Background and Biological/Biomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  6. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
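
    To make the hypergraph-based data model concrete, the toy sketch below shows how a single hyperedge can relate more than two entities at once (a multi-lateral relationship), which is awkward to express as rows in a traditional DBMS. This is an illustrative sketch, not the BioIntelligence Framework's actual data model or query language; the entity and edge names are hypothetical.

```python
# Hedged sketch: a toy hypergraph data model in which one hyperedge relates
# any number of entities at once. Entity and edge names are hypothetical.
from collections import defaultdict

class Hypergraph:
    def __init__(self):
        self.edges = {}                    # edge id -> set of node ids
        self.incidence = defaultdict(set)  # node id -> set of edge ids

    def add_edge(self, edge_id, nodes):
        self.edges[edge_id] = set(nodes)
        for node in nodes:
            self.incidence[node].add(edge_id)

    def related(self, node):
        """All entities sharing at least one hyperedge with `node`."""
        out = set()
        for edge_id in self.incidence[node]:
            out |= self.edges[edge_id]
        out.discard(node)
        return out

hg = Hypergraph()
# One multi-lateral relationship: patient, variant, and drug linked together.
hg.add_edge("obs1", ["patient:42", "variant:BRAF_V600E", "drug:vemurafenib"])
hg.add_edge("obs2", ["patient:42", "diagnosis:melanoma"])
print(hg.related("patient:42"))
```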

  7. Center for Advanced Computational Technology

    Science.gov (United States)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as a pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design, prototyping and operation of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  8. [Computers in biomedical research: I. Analysis of bioelectrical signals].

    Science.gov (United States)

    Vivaldi, E A; Maldonado, P

    2001-08-01

    A personal computer equipped with an analog-to-digital conversion card can acquire, store and display signals of biomedical interest. These signals can additionally be submitted to ad hoc software for analysis and diagnosis. Data acquisition is based on sampling a signal at a given rate and amplitude resolution. The automation of signal processing involves syntactic aspects (data transduction, conditioning and reduction) and semantic aspects (feature extraction to describe and characterize the signal, and diagnostic classification). The analytical approach that underlies computer programming allows for the successful resolution of apparently complex tasks. Two basic principles are involved: the definition of simple fundamental functions that are then iterated, and the modular subdivision of tasks. These two principles are illustrated, respectively, by presenting the algorithm that detects relevant elements for the analysis of a polysomnogram, and the task flow in systems that automate electrocardiographic reports.
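
    The syntactic/semantic split described above can be sketched in a few lines: a quantization step stands in for the syntactic stage (transduction and conditioning), and a simple iterated threshold detector stands in for the semantic stage (feature extraction). This is an assumed, minimal illustration, not the paper's algorithms; the sampling rate and resolution are typical values, not the authors'.

```python
# Hedged sketch: digitize a signal at a fixed rate and resolution (syntactic
# stage), then apply a simple iterated detector (semantic stage). Parameter
# values are typical assumptions, not taken from the paper.
import numpy as np

FS = 250          # sampling rate in Hz
RESOLUTION = 12   # A/D amplitude resolution in bits

def quantize(signal, bits=RESOLUTION, vmax=1.0):
    """Map continuous amplitudes onto 2**bits discrete levels."""
    return np.round(np.clip(signal, -vmax, vmax) / vmax * (2 ** bits // 2))

def detect_events(samples, threshold):
    """Flag rising-edge crossings of `threshold`: one simple test, iterated."""
    above = samples > threshold
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1

t = np.arange(0, 10, 1 / FS)                       # 10 s of signal
raw = np.sin(2 * np.pi * 1.5 * t) + 0.1 * np.random.randn(t.size)
events = detect_events(quantize(raw), threshold=1500)
print(f"{events.size} events detected in {t[-1]:.1f} s")
```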

  9. The National Center for Biomedical Ontology: Advancing Biomedicine through Structured Organization of Scientific Knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Rubin, Daniel L.; Lewis, Suzanna E.; Mungall, Chris J.; Misra, Sima; Westerfield, Monte; Ashburner, Michael; Sim, Ida; Chute, Christopher G.; Solbrig, Harold; Storey, Margaret-Anne; Smith, Barry; Day-Richter, John; Noy, Natalya F.; Musen, Mark A.

    2006-01-23

    The National Center for Biomedical Ontology (http://bioontology.org) is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists funded by the NIH Roadmap to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are: (1) to help unify the divergent and isolated efforts in ontology development by promoting high-quality open-source, standards-based tools to create, manage, and use ontologies, (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data, (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs), and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. The Center is working toward these objectives by providing tools to develop ontologies and to annotate experimental data, and by developing resources to integrate and relate existing ontologies as well as by creating repositories of biomedical data that are annotated using those ontologies. The Center is providing training workshops in ontology design, development, and usage, and is also pursuing research in ontology evaluation, quality, and the use of ontologies to promote scientific discovery. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, and hypothesis generation and testing, and advancing the understanding of human disease.

  10. Acquisition and manipulation of computed tomography images of the maxillofacial region for biomedical prototyping

    International Nuclear Information System (INIS)

    Meurer, Maria Ines; Silva, Jorge Vicente Lopes da; Santa Barbara, Ailton; Nobre, Luiz Felipe; Oliveira, Marilia Gerhardt de; Silva, Daniela Nascimento

    2008-01-01

    Biomedical prototyping has resulted from a merger of rapid prototyping and diagnostic imaging technologies. However, this process is complex, given the necessary interaction between the biomedical sciences and engineering. Good results are highly dependent on the acquisition of computed tomography images and their subsequent manipulation by means of specific software. The present study describes the experience of a multidisciplinary group of researchers in the acquisition and manipulation of computed tomography images of the maxillofacial region, aiming at biomedical prototyping for surgical purposes. (author)

  11. Biomedical data integration in computational drug design and bioinformatics.

    Science.gov (United States)

    Seoane, Jose A; Aguiar-Pulido, Vanessa; Munteanu, Cristian R; Rivero, Daniel; Rabunal, Juan R; Dorado, Julian; Pazos, Alejandro

    2013-03-01

    In recent years, in the post-genomic era, more and more data is being generated by biological high-throughput technologies, such as proteomics and transcriptomics. This omics data can be very useful, but the real challenge is to analyze all this data, as a whole, after integrating it. Biomedical data integration enables making queries to different, heterogeneous and distributed biomedical data sources. Data integration solutions can be very useful not only in the context of drug design, but also in biomedical information retrieval, clinical diagnosis, systems biology, etc. In this review, we analyze the most common approaches to biomedical data integration, such as federated databases, data warehousing, multi-agent systems and semantic technology, as well as the solutions developed using these approaches in the past few years.

  12. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services

  13. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  14. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics

    Directory of Open Access Journals (Sweden)

    Joyeeta Dutta-Moscato

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (currently Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical

  15. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics.

    Science.gov (United States)

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T; Becich, Michael J

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  16. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    Science.gov (United States)

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  17. Private Data Analytics on Biomedical Sensing Data via Distributed Computation.

    Science.gov (United States)

    Gong, Yanmin; Fang, Yuguang; Guo, Yuanxiong

    2016-01-01

    Advances in biomedical sensors and mobile communication technologies have fostered the rapid growth of mobile health (mHealth) applications in the past years. Users generate a high volume of biomedical data during health monitoring, which can be used by the mHealth server for training predictive models for disease diagnosis and treatment. However, the biomedical sensing data raise serious privacy concerns because they reveal sensitive information such as health status and lifestyles of the sensed subjects. This paper proposes and experimentally studies a scheme that keeps the training samples private while enabling accurate construction of predictive models. We specifically consider logistic regression models which are widely used for predicting dichotomous outcomes in healthcare, and decompose the logistic regression problem into small subproblems over two types of distributed sensing data, i.e., horizontally partitioned data and vertically partitioned data. The subproblems are solved using individual private data, and thus mHealth users can keep their private data locally and only upload (encrypted) intermediate results to the mHealth server for model training. Experimental results based on real datasets show that our scheme is highly efficient and scalable to a large number of mHealth users.
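
    The decomposition over horizontally partitioned data described above can be sketched as follows: each user computes a logistic regression gradient on their own samples, and only these intermediate results are aggregated by the server. This is a minimal numpy sketch of the general idea; the paper's scheme additionally encrypts the intermediate results and also handles vertically partitioned data, both of which are omitted here.

```python
# Hedged sketch: logistic regression over horizontally partitioned data.
# Each user computes a gradient on local samples; the server only sees the
# aggregated intermediates, never the raw sensing data.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_gradient(w, X_local, y_local):
    """Computed on-device from one user's private (X, y)."""
    err = sigmoid(X_local @ w) - y_local
    return X_local.T @ err / len(y_local)

rng = np.random.default_rng(0)
d, n_users = 5, 3
# Each tuple is one user's private data, kept local in a real deployment.
data = [(rng.normal(size=(40, d)), rng.integers(0, 2, 40))
        for _ in range(n_users)]

w = np.zeros(d)
for _ in range(200):                                    # server training loop
    grads = [local_gradient(w, X, y) for X, y in data]  # computed locally
    w -= 0.1 * np.mean(grads, axis=0)                   # server aggregates
print("trained weights:", np.round(w, 3))
```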

  18. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  19. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute of Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized into the High Energy Accelerator Research Organization, with the aim of further developing the wide field of accelerator science based on high energy accelerators. Within this Research Organization, the Applied Research Laboratory comprises four Centers that support research activities common to the Organization and carry out related research and development (R and D), integrating the present four centers and their related sections in Tanashi. The expected support covers not only general assistance but also the preparation, and the R and D, of the systems required to promote the research and its future plans. Computer technology is essential to the development of the research and is shared across the various research programs of the Organization. In response to such expectations, the new Computing Research Center is required to promote its duties by working and cooperating with researchers, over a range from R and D on data analysis for various experiments to computational physics driven by powerful computing capacity such as supercomputers. This report describes the work and present state of the Data Processing Center of KEK in the first chapter and of the computer room of INS in the second chapter, as well as future problems for the Computing Research Center. (G.K.)

  20. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    Science.gov (United States)

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
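
    A minimal example of the non-intrusive propagation methods reviewed above is plain Monte Carlo: sample the uncertain inputs, run the deterministic solver as a black box, and read statistics off the outputs. In the sketch below a closed-form beam deflection stands in for a finite element solver; the distributions and parameter values are assumptions for illustration only.

```python
# Hedged sketch: non-intrusive Monte Carlo uncertainty propagation around a
# black-box solver. A closed-form beam deflection stands in for an FEM run;
# all distributions and parameter values are illustrative assumptions.
import numpy as np

def solver(E, load):
    """Stand-in model: midspan deflection = load * L**3 / (48 * E * I)."""
    L, I = 1.0, 8.3e-6  # fixed geometry: length (m), second moment (m^4)
    return load * L**3 / (48.0 * E * I)

rng = np.random.default_rng(42)
N = 10_000
E_samples = rng.normal(2.0e11, 1.0e10, N)    # Young's modulus (Pa), ~5% scatter
load_samples = rng.normal(1000.0, 100.0, N)  # applied load (N), ~10% scatter

deflections = solver(E_samples, load_samples)  # one solver call per sample
print(f"mean deflection {deflections.mean():.3e} m, "
      f"std {deflections.std():.3e} m")
```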

  1. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    Science.gov (United States)

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
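
    As a sketch of how an application might call the NCBO Web services described above, the snippet below queries the BioPortal search service using only the Python standard library. The data.bioontology.org endpoint, the apikey parameter, and the shape of the JSON response reflect the service as documented in later NCBO materials and may differ from the interface current at the time of this abstract; YOUR_API_KEY is a placeholder.

```python
# Hedged sketch: query the BioPortal search service with the standard
# library. Endpoint, apikey parameter, and response shape follow later NCBO
# documentation; YOUR_API_KEY is a placeholder for a BioPortal account key.
import json
import urllib.parse
import urllib.request

API_KEY = "YOUR_API_KEY"
query = urllib.parse.urlencode({"q": "melanoma", "apikey": API_KEY})
url = "https://data.bioontology.org/search?" + query

with urllib.request.urlopen(url) as resp:
    results = json.load(resp)

# Print the first few matching terms and their ontology term IRIs.
for hit in results.get("collection", [])[:5]:
    print(hit.get("prefLabel"), "-", hit.get("@id"))
```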

  2. Strom Thurmond Biomedical Research Center at the Medical University of South Carolina, Charleston, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Department of Energy (DOE) has prepared an Environmental Assessment (EA) evaluating the proposed construction and operation of the Strom Thurmond Biomedical Research Center (Center) at the Medical University of South Carolina (MUSC), Charleston, SC. The DOE is evaluating a grant proposal to authorize the MUSC to construct, equip and operate the lower two floors of the proposed nine-story Center as an expansion of on-going clinical research and out-patient diagnostic activities of the Cardiology Division of the existing Gazes Cardiac Research Institute. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the NEPA. Therefore, the preparation of an Environmental Impact Statement is not required.

  3. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health

    Science.gov (United States)

    Tromberg, Bruce J.; Anderson, R. Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W.; Parrish, John A.; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  4. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  5. AHPCRC - Army High Performance Computing Research Center

    Science.gov (United States)

    2010-01-01

    Of particular interest is the ability of a distributed jamming network (DJN) to jam signals in all or part of a sensor or communications net...

  6. Eleven quick tips for architecting biomedical informatics workflows with cloud computing

    Science.gov (United States)

    Moore, Jason H.

    2018-01-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world’s largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction. PMID:29596416

  7. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    Directory of Open Access Journals (Sweden)

    Brian S Cole

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.
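
    One recurring theme of these tips is the paradigm of abstraction: treating storage, compute, and deployment as services rather than machines. As a minimal illustration (not taken from the article), the sketch below stages a results file in object storage using boto3, the AWS SDK for Python; the bucket and key names are hypothetical.

    ```python
    # A minimal sketch of cloud storage abstraction: publish a results file to
    # object storage instead of local disk. Bucket and key names are hypothetical.
    import boto3

    def publish_results(local_path: str, bucket: str, key: str) -> str:
        """Upload a results file to S3 and return its object URL."""
        s3 = boto3.client("s3")
        # Encryption, versioning, and lifecycle rules live on the bucket itself,
        # so the workflow code stays free of storage-management details.
        s3.upload_file(local_path, bucket, key)
        return f"s3://{bucket}/{key}"

    if __name__ == "__main__":
        print(publish_results("variants.vcf.gz", "my-lab-results",
                              "runs/2018-03/variants.vcf.gz"))
    ```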

  8. The Brazilian research and teaching center in biomedicine and aerospace biomedical engineering.

    Science.gov (United States)

    Russomano, T; Falcao, P F; Dalmarco, G; Martinelli, L; Cardoso, R; Santos, M A; Sparenberg, A

    2008-08-01

    The recent engagement of Brazil in the construction and utilization of the International Space Station has motivated several Brazilian research institutions and universities to establish study centers related to space sciences. The Pontificia Universidade Catolica do Rio Grande do Sul (PUCRS) is no exception. In 1993 the University initiated the first degree course in South America training students to operate commercial aircraft (the School of Aeronautical Sciences). A further step was the decision to build the first Brazilian laboratory dedicated to experiments in ground-based microgravity simulation. Established in 1998 and initially located in the Instituto de Pesquisas Cientificas e Tecnologicas (IPCT), the Microgravity Laboratory was supported by the Schools of Medicine, Aeronautical Sciences and Electrical Engineering/Biomedical Engineering. At the end of 2006, the Microgravity Laboratory became a Center and was transferred to the School of Engineering. The principal activities of the Microgravity Centre are the development of research projects related to human physiology before, during and after ground-based microgravity simulation and parabolic flights, to aviation medicine in the 21st century, and to aerospace biomedical engineering. The history of Brazilian, and indeed worldwide, space science unquestionably passes through PUCRS. Over time, the pioneering spirit of our University in the aerospace area has become undeniable, thanks to the professionals, students, technicians and staff who have worked or are still working in the Center of Microgravity, a group of faculty and students that excels in its technical-scientific qualifications.

  9. Tsinghua-Johns Hopkins Joint Center for Biomedical Engineering Research: scientific and cultural exchange in undergraduate engineering.

    Science.gov (United States)

    Wisneski, Andrew D; Huang, Lixia; Hong, Bo; Wang, Xiaoqin

    2011-01-01

    A model for an international undergraduate biomedical engineering research exchange program is outlined. In 2008, the Johns Hopkins University, in collaboration with Tsinghua University in Beijing, China, established the Tsinghua-Johns Hopkins Joint Center for Biomedical Engineering Research. Undergraduate biomedical engineering students from both universities are offered the opportunity to participate in research at the overseas institution. Programs such as these provide not only research experience for undergraduates but valuable cultural exchange and enrichment as well. Currently, strict course scheduling and rigorous curricula in most biomedical engineering programs may present obstacles for students wishing to partake in study-abroad opportunities. Universities are encouraged to foster abroad opportunities for undergraduate engineering students, for which this particular program can serve as a model.

  10. A Computer Learning Center for Environmental Sciences

    Science.gov (United States)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened to students at Brown University. MacMillan Hall housed the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines whose members have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center, which is devoted to transferring university research to the private sector.

  11. A Ten-Year Assessment of a Biomedical Engineering Summer Research Internship within a Comprehensive Cancer Center

    Science.gov (United States)

    Wright, A. S.; Wu, X.; Frye, C. A.; Mathur, A. B.; Patrick, C. W., Jr.

    2007-01-01

    A Biomedical Engineering Internship Program conducted within a Comprehensive Cancer Center over a 10 year period was assessed and evaluated. Although this is a non-traditional location for an internship, it is an ideal site for a multidisciplinary training program for science, technology, engineering, and mathematics (STEM) students. We made a…

  12. The Computational Physics Program of the national MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs

  13. Patterns of biomedical science production in a sub-Saharan research center

    Directory of Open Access Journals (Sweden)

    Agnandji Selidji T

    2012-03-01

    Background: Research activities in sub-Saharan Africa may be limited to delegated tasks due to the strong control exercised by Western collaborators, which could lead to scientific production of little value in terms of its impact on social and economic innovation in less developed areas. However, the current context of international biomedical research, including the development of public-private partnerships and research institutions in Africa, suggests that scientific activities are growing in sub-Saharan Africa. This study aims to describe the patterns of clinical research activities at a sub-Saharan biomedical research center. Methods: In-depth interviews were conducted with a core group of researchers at the Medical Research Unit (MRU) of the Albert Schweitzer Hospital in Lambaréné, Gabon, from June 2009 to February 2010. The interview sessions covered the scientific activities running at the MRU as well as the implementation of ethical and regulatory standards. Results: The framework of clinical research includes transnational studies and research initiated locally. In transnational collaborations, a sub-Saharan research institution may be limited to producing confirmatory and late-stage data with little impact on economic and social innovation. However, ethical and regulatory guidelines are being implemented with local contexts taken into consideration. Similarly, the scientific content of studies designed by researchers at the MRU, if local needs are taken into account, may contribute to scientific production with long-term value for social and economic innovation in sub-Saharan Africa. Conclusion: Further research questions and methods in social sciences should comprehensively address the construction of scientific content within the social, economic and cultural contexts surrounding research activities.

  14. Patient identity management for secondary use of biomedical research data in a distributed computing environment.

    Science.gov (United States)

    Nitzlnader, Michael; Schreier, Günter

    2014-01-01

    Dealing with data from different source domains is of increasing importance in today's large-scale biomedical research endeavours. Within the European Network for Cancer research in Children and Adolescents (ENCCA), a solution for sharing such data for secondary use will be established. In this paper the solution arising from the aims of the ENCCA project and regulatory requirements concerning data protection and privacy is presented. Since the details of secondary biomedical dataset utilisation are often not known in advance, data protection regulations are met with an identity management concept that facilitates context-specific pseudonymisation and a way of aggregating data via a hidden reference table later on. Phonetic hashing is proposed to prevent duplicate patient registration, and re-identification of patients is possible only via a trusted third party. Finally, the solution architecture allows for implementation in a distributed computing environment, including cloud-based elements.
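
    A minimal sketch of the two mechanisms described above, assuming a keyed hash for context-specific pseudonymisation and a simplified classic Soundex as one plausible phonetic hashing choice. This is illustrative only, not the ENCCA implementation; all identifiers and the key are hypothetical.

    ```python
    # Sketch: (1) context-specific pseudonyms via a keyed hash, so the same
    # patient gets unlinkable pseudonyms in different source domains;
    # (2) phonetic hashing (simplified Soundex) for duplicate detection
    # without storing the name itself.
    import hashlib
    import hmac

    def pseudonym(patient_id: str, context: str, secret_key: bytes) -> str:
        """Keyed hash of (context, patient_id); unlinkable across contexts without the key."""
        msg = f"{context}:{patient_id}".encode()
        return hmac.new(secret_key, msg, hashlib.sha256).hexdigest()[:16]

    def soundex(name: str) -> str:
        """Simplified 4-character Soundex code (H/W treated like vowels)."""
        codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
                 **dict.fromkeys("DT", "3"), "L": "4",
                 **dict.fromkeys("MN", "5"), "R": "6"}
        name = "".join(c for c in name.upper() if c.isalpha())
        if not name:
            return "0000"
        out, prev = name[0], codes.get(name[0], "")
        for c in name[1:]:
            code = codes.get(c, "")
            if code and code != prev:
                out += code
            prev = code
        return (out + "000")[:4]

    key = b"held-by-trusted-third-party"          # hypothetical secret
    print(pseudonym("patient-0042", "registry-A", key))  # differs from...
    print(pseudonym("patient-0042", "registry-B", key))  # ...this pseudonym
    print(soundex("Meyer"), soundex("Maier"))            # both 'M600': possible duplicate
    ```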

  15. Effective use of latent semantic indexing and computational linguistics in biological and biomedical applications.

    Science.gov (United States)

    Chen, Hongyu; Martin, Bronwen; Daimon, Caitlin M; Maudsley, Stuart

    2013-01-01

    Text mining is rapidly becoming an essential technique for the annotation and analysis of large biological data sets. Biomedical literature currently increases at a rate of several thousand papers per week, making automated information retrieval methods the only feasible method of managing this expanding corpus. With the increasing prevalence of open-access journals and constant growth of publicly-available repositories of biomedical literature, literature mining has become much more effective with respect to the extraction of biomedically-relevant data. In recent years, text mining of popular databases such as MEDLINE has evolved from basic term-searches to more sophisticated natural language processing techniques, indexing and retrieval methods, structural analysis and integration of literature with associated metadata. In this review, we will focus on Latent Semantic Indexing (LSI), a computational linguistics technique increasingly used for a variety of biological purposes. It is noted for its ability to consistently outperform benchmark Boolean text searches and co-occurrence models at information retrieval and its power to extract indirect relationships within a data set. LSI has been used successfully to formulate new hypotheses, generate novel connections from existing data, and validate empirical data.
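
    As a minimal sketch of how LSI surfaces indirect relationships, the snippet below factorizes TF-IDF vectors with truncated SVD using scikit-learn and compares documents in the latent space. The toy "abstracts" are hypothetical stand-ins for MEDLINE records.

    ```python
    # LSI sketch: TF-IDF vectors reduced by truncated SVD, then cosine
    # similarity in the latent topic space.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    abstracts = [
        "insulin receptor signalling in hepatic glucose metabolism",
        "glucose uptake is regulated by insulin in liver cells",
        "laser ablation of cardiac tissue for arrhythmia treatment",
    ]

    tfidf = TfidfVectorizer(stop_words="english").fit_transform(abstracts)
    lsi = TruncatedSVD(n_components=2, random_state=0)  # 2 latent dimensions for a toy corpus
    doc_vectors = lsi.fit_transform(tfidf)

    # Documents 0 and 1 share little exact phrasing but should land close
    # together in LSI space -- the kind of indirect association a Boolean
    # search would miss.
    print(cosine_similarity(doc_vectors))
    ```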

  16. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). An OSG stack is installed for the NOvA experiment. Other groups of users work directly with the local batch system. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated in the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources with the standard ATLAS tools in the same way as the local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated mostly to users from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on TORQUE with a custom scheduler. The clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of the MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic with a capacity of more than 12000 cores in total.

  17. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined

  18. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined.

  19. The NIH-NIAID Schistosomiasis Resource Center at the Biomedical Research Institute: Molecular Redux.

    Directory of Open Access Journals (Sweden)

    James J Cody

    2016-10-01

    Schistosomiasis remains a health burden in many parts of the world. The complex life cycle of Schistosoma parasites and the economic and societal conditions present in endemic areas make the prospect of eradication unlikely in the foreseeable future. Continued and vigorous research efforts must therefore be directed at this disease, particularly since only a single World Health Organization (WHO)-approved drug is available for treatment. The National Institutes of Health (NIH)-National Institute of Allergy and Infectious Diseases (NIAID) Schistosomiasis Resource Center (SRC) at the Biomedical Research Institute provides investigators with the critical raw materials needed to carry out this important research. The SRC makes available, free of charge (including international shipping costs), not only infected host organisms but also a wide array of molecular reagents derived from all life stages of each of the three main human schistosome parasites. As the field of schistosomiasis research rapidly advances, it is likely to become increasingly reliant on omics, transgenics, epigenetics, and microbiome-related research approaches. The SRC has monitored, and will continue to monitor and contribute to, advances in the field in order to support these research efforts with an expanding array of molecular reagents. In addition to providing investigators with source materials, the SRC has expanded its educational mission by offering a molecular techniques training course and has recently organized an international schistosomiasis-focused meeting. This review provides an overview of the materials and services that are available at the SRC for schistosomiasis researchers, with a focus on updates that have occurred since the original overview in 2008.

  20. Ranking Iranian biomedical research centers according to H-variants (G, M, A, R) in Scopus and Web of Science.

    Science.gov (United States)

    Mahmudi, Zoleikha; Tahamtan, Iman; Sedghi, Shahram; Roudbari, Masoud

    2015-01-01

    We conducted a comprehensive bibliometric analysis to calculate the H, G, M, A and R indicators for all Iranian biomedical research centers (IBRCs) from the output of ISI Web of Science (WoS) and Scopus between 1991 and 2010, and compared the research performance of the centers according to these indicators. This was a cross-sectional, descriptive-analytical study of 104 Iranian biomedical research centers, conducted between August and September 2011. We collected our data through Scopus and WoS. Pearson correlation coefficients between the scientometric indicators were calculated using SPSS, version 16. The mean values of all indicators were higher in Scopus than in WoS. The Drug Applied Research Center of Tabriz University of Medical Sciences had the highest number of publications in both the WoS and Scopus databases. This research center, along with the Royan Institute, received the highest number of citations in Scopus and WoS, respectively. The highest correlation was seen between G and R in both WoS (.998) and Scopus (.990). Furthermore, the highest overlap among the 10 top IBRCs was between G and H in WoS (100%) and between G-R (90%) and H-R (90%) in Scopus. Research centers affiliated with the top-ranked Iranian medical universities obtained better positions with respect to the studied scientometric indicators. All of the aforementioned indicators are relevant to bibliometric ranking studies, as they capture different attributes of scientific output and citation impact.
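
    For readers unfamiliar with the indicators being compared, the sketch below computes the H and G indices from a list of per-paper citation counts; the numbers are hypothetical, and the other variants (M, A, R) are derived similarly from the citation distribution.

    ```python
    # h-index: the largest h such that h papers each have >= h citations.
    # g-index: the largest g such that the top g papers together have >= g^2 citations.
    def h_index(citations):
        cites = sorted(citations, reverse=True)
        # The condition c >= rank fails monotonically once it first fails.
        return sum(1 for rank, c in enumerate(cites, start=1) if c >= rank)

    def g_index(citations):
        cites = sorted(citations, reverse=True)
        total, g = 0, 0
        for rank, c in enumerate(cites, start=1):
            total += c
            if total >= rank * rank:
                g = rank
        return g

    papers = [25, 8, 5, 3, 3, 1, 0]           # hypothetical citation counts for one center
    print(h_index(papers), g_index(papers))   # -> 3 6
    ```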

  1. Analysis of uncertainty and variability in finite element computational models for biomedical engineering: characterization and propagation

    Directory of Open Access Journals (Sweden)

    Nerea Mangado

    2016-11-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article reviews the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, the pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
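
    As a concrete instance of the non-intrusive propagation family the review covers, the sketch below runs plain Monte Carlo through a black-box model; a closed-form cantilever deflection stands in for the finite element solver, and all parameter distributions are hypothetical.

    ```python
    # Non-intrusive Monte Carlo uncertainty propagation: sample uncertain
    # inputs, evaluate the model as a black box, summarize the output spread.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 10_000

    # Uncertain inputs: Young's modulus E [Pa] and load P [N] (illustrative spreads).
    E = rng.normal(17e9, 1.5e9, N)
    P = rng.normal(500.0, 50.0, N)
    L, I = 0.05, 2.0e-9   # fixed geometry: length [m], second moment of area [m^4]

    def model(E, P):
        """Black-box stand-in for the FE solve: tip deflection of a cantilever."""
        return P * L**3 / (3.0 * E * I)

    d = model(E, P)
    print(f"mean deflection {d.mean():.3e} m, std {d.std():.3e} m, "
          f"95th percentile {np.quantile(d, 0.95):.3e} m")
    ```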

  2. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The Nasa Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  3. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each year, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).

  4. Molecular imaging in biomedical research: the Molecular Imaging Unit of the National Cancer Research Center

    International Nuclear Information System (INIS)

    Perez Bruzon, J.; Mulero Anhiorte, F.

    2010-01-01

    This article has two basic objectives. Firstly, it briefly reviews the most important imaging techniques used in biomedical research, indicating the most significant aspects of their application in the preclinical stage. Secondly, it presents a practical application of these techniques in a pure biomedical research centre (not associated with a clinical facility). Practical aspects such as the organisation, equipment, working rules and shielding of the Spanish National Cancer Research Centre (CNIO) Imaging Unit are described. This is a pioneering facility in the application of these techniques in research centres without any dependence on, or direct relationship with, hospital Nuclear Medicine services. (Author) 7 refs.

  5. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  6. Visualization of biomedical image data and irradiation planning using a parallel computing system

    International Nuclear Information System (INIS)

    Lehrig, R.

    1991-01-01

    The contribution explains the development of a novel, low-cost workstation for the processing of biomedical tomographic data sequences. The workstation was to allow both graphical display of the data and implementation of modelling software for irradiation planning, especially for calculation of dose distributions on the basis of the measured tomogram data. The system developed according to these criteria is a parallel computing system which performs secondary, two-dimensional image reconstructions irrespective of the imaging direction of the original tomographic scans. Three-dimensional image reconstructions can be generated from any direction of view, with random selection of sections of the scanned object. (orig./MM) With 69 figs., 2 tabs

  7. Biomedical Informatics Research and Education at the EuroMISE Center

    Czech Academy of Sciences Publication Activity Database

    Zvárová, Jana

    2006-01-01

    Roč. 45, Suppl. (2006), s. 166-173 ISSN 0026-1270 Grant - others:Evropské sociální fondy CZ04307/42011/0013 Institutional research plan: CEZ:AV0Z10300504 Keywords : biomedical informatics * research * education * healthcare * information society Subject RIV: BJ - Thermodynamics Impact factor: 1.684, year: 2006

  8. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  9. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  10. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2001-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook

  11. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  12. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are ''data intensive'' because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  13. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  14. Senior Computational Scientist | Center for Cancer Research

    Science.gov (United States)

    The Basic Science Program (BSP) pursues independent, multidisciplinary research in basic and applied molecular biology, immunology, retrovirology, cancer biology, and human genetics. Research efforts and support are an integral part of the Center for Cancer Research (CCR) at the Frederick National Laboratory for Cancer Research (FNLCR). The Cancer & Inflammation Program (CIP),

  15. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the computational physics group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck modeling, and efficient numerical and programming algorithms. References are included

  16. DOORS to the semantic web and grid with a PORTAL for biomedical computing.

    Science.gov (United States)

    Taswell, Carl

    2008-03-01

    The semantic web remains in the early stages of development. It has not yet achieved the goals envisioned by its founders as a pervasive web of distributed knowledge and intelligence. Success will be attained when a dynamic synergism can be created between people and a sufficient number of infrastructure systems and tools for the semantic web in analogy with those for the original web. The domain name system (DNS), web browsers, and the benefits of publishing web pages motivated many people to register domain names and publish web sites on the original web. An analogous resource label system, semantic search applications, and the benefits of collaborative semantic networks will motivate people to register resource labels and publish resource descriptions on the semantic web. The Domain Ontology Oriented Resource System (DOORS) and Problem Oriented Registry of Tags and Labels (PORTAL) are proposed as infrastructure systems for resource metadata within a paradigm that can serve as a bridge between the original web and the semantic web. The Internet Registry Information Service (IRIS) registers [corrected] domain names while DNS publishes domain addresses with mapping of names to addresses for the original web. Analogously, PORTAL registers resource labels and tags while DOORS publishes resource locations and descriptions with mapping of labels to locations for the semantic web. BioPORT is proposed as a prototype PORTAL registry specific for the problem domain of biomedical computing.
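
    A toy sketch of the DNS analogy drawn in the abstract: a PORTAL-like registry maps a resource label to an identifier, and a DOORS-like directory resolves that identifier to locations plus a description, just as registrars register domain names and DNS maps names to addresses. All names, identifiers, and URLs here are hypothetical, not actual PORTAL/DOORS syntax.

    ```python
    # Two-step resolution, mirroring registrar -> DNS: registry gives the
    # identifier for a label, directory gives locations and a description.
    PORTAL = {"brain-mri-atlas": "doors:org.example:0001"}        # label -> identifier
    DOORS = {"doors:org.example:0001": {
        "locations": ["https://data.example.org/atlas/v2"],
        "description": "T1-weighted adult brain MRI atlas (hypothetical entry)",
    }}

    def resolve(label: str) -> dict:
        """Look up the identifier in the registry, then the record in the directory."""
        return DOORS[PORTAL[label]]

    print(resolve("brain-mri-atlas")["locations"])
    ```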

  17. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share the computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The solution implemented was tested thoroughly within the computing environment of the KEDR detector experiment, which is being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  18. Computational-physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1982-02-01

    The computational physics group is involved in several areas of fusion research. One main area is the application of multidimensional Fokker-Planck, transport and combined Fokker-Planck/transport codes to both toroidal and mirror devices. Another major area is the investigation of linear and nonlinear resistive magnetohydrodynamics in two and three dimensions, with applications to all types of fusion devices. The MHD work is often coupled with the task of numerically generating equilibria which model experimental devices. In addition to these computational physics studies, investigations of more efficient numerical algorithms are being carried out

  19. Statistical modeling of biomedical corpora: mining the Caenorhabditis Genetic Center Bibliography for genes related to life span

    Directory of Open Access Journals (Sweden)

    Jordan MI

    2006-05-01

    Background: The statistical modeling of biomedical corpora could yield integrated, coarse-to-fine views of biological phenomena that complement discoveries made from analysis of molecular sequence and profiling data. Here, the potential of such modeling is demonstrated by examining the 5,225 free-text items in the Caenorhabditis Genetic Center (CGC) Bibliography using techniques from statistical information retrieval. Items in the CGC biomedical text corpus were modeled using the Latent Dirichlet Allocation (LDA) model. LDA is a hierarchical Bayesian model which represents a document as a random mixture over latent topics; each topic is characterized by a distribution over words. Results: An LDA model estimated from CGC items had better predictive performance than two standard models (unigram and mixture of unigrams) trained using the same data. To illustrate the practical utility of LDA models of biomedical corpora, a trained CGC LDA model was used for a retrospective study of nematode genes known to be associated with life span modification. Corpus-, document-, and word-level LDA parameters were combined with terms from the Gene Ontology to enhance the explanatory value of the CGC LDA model, and to suggest additional candidates for age-related genes. A novel, pairwise document similarity measure based on the posterior distribution on the topic simplex was formulated and used to search the CGC database for "homologs" of a "query" document discussing the life span-modifying clk-2 gene. Inspection of these document homologs enabled and facilitated the production of hypotheses about the function and role of clk-2. Conclusion: Like other graphical models for genetic, genomic and other types of biological data, LDA provides a method for extracting unanticipated insights and generating predictions amenable to subsequent experimental validation.
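
    A minimal sketch of the modeling pipeline, assuming scikit-learn's LDA and Jensen-Shannon distance as a symmetric stand-in for the paper's similarity measure on the topic simplex; the three toy "documents" and the topic count are illustrative only.

    ```python
    # Fit LDA on a toy corpus, then compare documents by Jensen-Shannon
    # distance between their posterior topic mixtures.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation
    from scipy.spatial.distance import jensenshannon

    docs = [
        "clk-2 mutants show extended life span and slow development",
        "daf-2 insulin signalling mutation doubles adult life span",
        "touch neuron differentiation requires mec-3 expression",
    ]

    counts = CountVectorizer().fit_transform(docs)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
    theta = lda.transform(counts)   # per-document topic mixtures (points on the simplex)

    # Documents 0 and 1 (both about life span genes) should sit closer on the
    # topic simplex than either does to document 2.
    print(jensenshannon(theta[0], theta[1]), jensenshannon(theta[0], theta[2]))
    ```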

  20. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. You will also learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  1. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The computational physics group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A third is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers

  2. Review of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine" by Edward H. Shortliffe and James J. Cimino

    OpenAIRE

    Clifford Gari D

    2006-01-01

    This article is an invited review of the third edition of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine", one of thirty-six volumes in Springer's 'Health Informatics Series', edited by E. Shortliffe and J. Cimino. This book spans most of the current methods and issues in health informatics, ranging through subjects as varied as data acquisition and storage, standards, natural language processing, imaging, electronic health records, decision support, te...

  3. Biomedical Engineering

    CERN Document Server

    Suh, Sang C; Tanik, Murat M

    2011-01-01

    Biomedical Engineering: Health Care Systems, Technology and Techniques is an edited volume with contributions from world experts. It provides readers with unique contributions related to current research and future healthcare systems. Practitioners and researchers focused on computer science, bioinformatics, engineering and medicine will find this book a valuable reference.

  4. Thermoelectric applications as related to biomedical engineering for NASA Johnson Space Center

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, C D

    1997-07-01

    This paper presents current NASA biomedical developments and applications using thermoelectrics. The discussion includes future technology enhancements that would be most beneficial to the application of thermoelectric technology. A great deal of thermoelectric work has focused on electronic cooling. As with all technological developments within NASA, if the application cannot be related to the average consumer, the technology will not be mass-produced and widely available to the public (a key consideration for research and development expenditures and for thermoelectric companies). Included are discussions of thermoelectric applications to cool astronauts during launch and reentry. The earth-based applications, or spin-offs, include such innovations as cooling for tank crews and race car drivers, cooling infants with high temperatures, and preventing hair loss during chemotherapy. In order to preserve the scientific value of metabolic samples during long-term space missions, cooling is required to enable scientific studies. Results of one such study should provide a better understanding of osteoporosis and may lead to a possible cure for the disease. In the space environment, noise has to be kept to a minimum. In long-term space applications such as the International Space Station, thermoelectric technology provides the acoustic relief and the reliability required for food and scientific refrigerator/freezers. Applications and future needs are discussed as NASA moves closer to a continued space presence aboard Mir, the International Space Station, and Lunar-Mars exploration.

  5. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  6. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  7. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  8. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  9. Applied Computational Fluid Dynamics at NASA Ames Research Center

    Science.gov (United States)

    Holst, Terry L.; Kwak, Dochan (Technical Monitor)

    1994-01-01

    The field of Computational Fluid Dynamics (CFD) has advanced to the point where it can now be used for many applications in fluid mechanics research and aerospace vehicle design. A few applications being explored at NASA Ames Research Center will be presented and discussed. The examples presented will range in speed from hypersonic to low speed incompressible flow applications. Most of the results will be from numerical solutions of the Navier-Stokes or Euler equations in three space dimensions for general geometry applications. Computational results will be used to highlight the presentation as appropriate. Advances in computational facilities including those associated with NASA's CAS (Computational Aerosciences) Project of the Federal HPCC (High Performance Computing and Communications) Program will be discussed. Finally, opportunities for future research will be presented and discussed. All material will be taken from non-sensitive, previously-published and widely-disseminated work.

  10. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  11. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient software. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics.

  12. Automated segmentation of synchrotron radiation micro-computed tomography biomedical images using Graph Cuts and neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga de Moura Meneses, Anderson, E-mail: ameneses@ieee.org [Radiological Sciences Laboratory, Rio de Janeiro State University, Rua Sao Francisco Xavier 524, CEP 20550-900, RJ (Brazil); Giusti, Alessandro [IDSIA (Dalle Molle Institute for Artificial Intelligence), University of Lugano (Switzerland); Pereira de Almeida, Andre; Parreira Nogueira, Liebert; Braz, Delson [Nuclear Engineering Program, Federal University of Rio de Janeiro, RJ (Brazil); Cely Barroso, Regina [Laboratory of Applied Physics on Biomedical Sciences, Physics Department, Rio de Janeiro State University, RJ (Brazil); Almeida, Carlos Eduardo de [Radiological Sciences Laboratory, Rio de Janeiro State University, Rua Sao Francisco Xavier 524, CEP 20550-900, RJ (Brazil)

    2011-12-21

    Synchrotron Radiation (SR) X-ray micro-Computed Tomography (μCT) enables magnified images to be used as a non-invasive and non-destructive technique with a high space resolution for the qualitative and quantitative analyses of biomedical samples. The research on applications of segmentation algorithms to SR-μCT is an open problem, due to the interesting and well-known characteristics of SR images for visualization, such as the high resolution and the phase contrast effect. In this article, we describe and assess the application of the Energy Minimization via Graph Cuts (EMvGC) algorithm for the segmentation of SR-μCT biomedical images acquired at the Synchrotron Radiation for MEdical Physics (SYRMEP) beam line at the Elettra Laboratory (Trieste, Italy). We also propose a method using EMvGC with Artificial Neural Networks (EMANNs) for correcting misclassifications due to intensity variation of phase contrast, which are important effects and sometimes indispensable in certain biomedical applications, although they impair the segmentation provided by conventional techniques. Results demonstrate considerable success in the segmentation of SR-μCT biomedical images, with average Dice Similarity Coefficient 99.88% for bony tissue in Wistar Rats rib samples (EMvGC), as well as 98.95% and 98.02% for scans of Rhodnius prolixus insect samples (Chagas's disease vector) with EMANNs, in relation to manual segmentation. The techniques EMvGC and EMANNs cope with the task of performing segmentation in images with the intensity variation due to phase contrast effects, presenting a superior performance in comparison to conventional segmentation techniques based on thresholding and linear/nonlinear image filtering, which is also discussed in the present article.
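
    A minimal sketch of binary segmentation by energy minimization via graph cuts, in the spirit of the EMvGC step described above but not the authors' code, using the PyMaxflow library; the synthetic image, intensity hypotheses, and smoothness weight are hypothetical.

    ```python
    # Binary segmentation as a min-cut problem: unary terms from two intensity
    # hypotheses, pairwise grid terms encouraging smooth labels.
    import numpy as np
    import maxflow

    img = np.random.rand(64, 64)           # stand-in for a SR-muCT slice, values in [0, 1]
    mu_bg, mu_fg, smoothness = 0.3, 0.7, 0.25

    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)
    g.add_grid_edges(nodes, smoothness)    # pairwise term: 4-connected grid, uniform weight
    # Terminal (t-link) capacities: squared distance to each intensity hypothesis.
    g.add_grid_tedges(nodes, (img - mu_fg) ** 2, (img - mu_bg) ** 2)

    g.maxflow()                            # solve the min-cut / max-flow problem
    labels = g.get_grid_segments(nodes)    # boolean label map derived from the cut
    print("fraction labeled True:", labels.mean())
    ```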

  13. Automated segmentation of synchrotron radiation micro-computed tomography biomedical images using Graph Cuts and neural networks

    International Nuclear Information System (INIS)

    Alvarenga de Moura Meneses, Anderson; Giusti, Alessandro; Pereira de Almeida, André; Parreira Nogueira, Liebert; Braz, Delson; Cely Barroso, Regina; Almeida, Carlos Eduardo de

    2011-01-01

    Synchrotron Radiation (SR) X-ray micro-Computed Tomography (μCT) enables magnified images to be used as a non-invasive and non-destructive technique with high spatial resolution for the qualitative and quantitative analyses of biomedical samples. The research on applications of segmentation algorithms to SR-μCT is an open problem, due to the interesting and well-known characteristics of SR images for visualization, such as the high resolution and the phase contrast effect. In this article, we describe and assess the application of the Energy Minimization via Graph Cuts (EMvGC) algorithm for the segmentation of SR-μCT biomedical images acquired at the Synchrotron Radiation for MEdical Physics (SYRMEP) beam line at the Elettra Laboratory (Trieste, Italy). We also propose a method using EMvGC with Artificial Neural Networks (EMANNs) for correcting misclassifications due to the intensity variation of phase contrast, which is an important and sometimes indispensable effect in certain biomedical applications, although it impairs the segmentation provided by conventional techniques. Results demonstrate considerable success in the segmentation of SR-μCT biomedical images, with an average Dice Similarity Coefficient of 99.88% for bony tissue in Wistar rat rib samples (EMvGC), as well as 98.95% and 98.02% for scans of Rhodnius prolixus insect samples (the Chagas disease vector) with EMANNs, relative to manual segmentation. The EMvGC and EMANN techniques cope with the task of performing segmentation in images with intensity variation due to phase contrast effects, presenting superior performance in comparison to conventional segmentation techniques based on thresholding and linear/nonlinear image filtering, which is also discussed in the present article.

  14. Dendritic silica particles with center-radial pore channels: promising platforms for catalysis and biomedical applications.

    Science.gov (United States)

    Du, Xin; Qiao, Shi Zhang

    2015-01-27

    Dendritic silica micro-/nanoparticles with center-radial pore structures, a newly created kind of porous material, have attracted considerable attention owing to their unique open three-dimensional dendritic superstructures with large pore channels and highly accessible internal surface areas compared with conventional mesoporous silica nanoparticles (MSNs). They are very promising platforms for a variety of applications in catalysis and nanomedicine. In this review, their unique structural characteristics and properties are first analyzed; then novel and interesting synthesis methods, together with the possible formation mechanisms, are summarized to offer materials scientists inspiration for the preparation of this kind of dendritic particle. Subsequently, a few examples of interesting applications are presented, mainly in catalysis and biomedicine, and in other important fields such as sacrificial templating and functional coatings. The review concludes with an outlook on the prospects and challenges in terms of their controlled synthesis and potential applications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternative strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  16. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  17. The role of real-time in biomedical science: a meta-analysis on computational complexity, delay and speedup.

    Science.gov (United States)

    Faust, Oliver; Yu, Wenwei; Rajendra Acharya, U

    2015-03-01

    The concept of real-time is very important, as it concerns the realizability of computer-based health care systems. In this paper we review biomedical real-time systems with a meta-analysis on computational complexity (CC), delay (Δ) and speedup (Sp). During the review we found that, in the majority of papers, the term real-time is part of the thesis, indicating that a proposed system or algorithm is practical. However, these papers were not considered for detailed scrutiny. Our detailed analysis focused on papers which support their claim of achieving real-time with a discussion of CC or Sp. These papers were analyzed in terms of processing system used, application area (AA), CC, Δ, Sp, implementation/algorithm (I/A) and competition. The results show that the ideas of parallel processing and algorithm delay were only recently introduced and that journal papers focus more on algorithm (A) development than on implementation (I). Most authors compete on big O notation (O) and processing time (PT). Based on these results, we adopt the position that the concept of real-time will continue to play an important role in biomedical systems design. We predict that parallel processing considerations, such as Sp and algorithm scaling, will become more important. Copyright © 2015 Elsevier Ltd. All rights reserved.
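    The two headline quantities of this meta-analysis are easy to operationalize. A small sketch, using our own notation rather than the paper's, of speedup and a soft real-time throughput check:

```python
def speedup(t_serial_s, t_parallel_s):
    """Sp = T_serial / T_parallel, the paper's speedup measure."""
    return t_serial_s / t_parallel_s

def keeps_up(processing_time_s, chunk_duration_s):
    """A pipeline is (soft) real-time in the throughput sense when each
    chunk of signal is processed faster than it arrives, so delay (Δ)
    cannot accumulate."""
    return processing_time_s <= chunk_duration_s

# e.g. a 2 s biosignal epoch processed in 0.4 s keeps up, with Sp = 5
print(keeps_up(0.4, 2.0), speedup(2.0, 0.4))
```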

  18. The NIRA computer program package (photonuclear data center). Final report

    International Nuclear Information System (INIS)

    Vander Molen, H.J.; Gerstenberg, H.M.

    1976-02-01

    The Photonuclear Data Center's NIRA library of programs, executable from mass storage on the National Bureau of Standards' central computer facility, is described. Detailed instructions are given (with examples) for the use of the library to analyze, evaluate, synthesize, and produce for publication camera-ready tabular and graphical presentations of digital photonuclear reaction cross-section data. NIRA is the acronym for Nuclear Information Research Associate

  19. Fundamental of biomedical engineering

    CERN Document Server

    Sawhney, GS

    2007-01-01

    About the Book: A well-set-out textbook that explains the fundamentals of biomedical engineering in the areas of biomechanics, biofluid flow, biomaterials, bioinstrumentation and the use of computing in biomedical engineering. All these subjects form a basic part of an engineer's education. The text is admirably suited to meet the needs of students of mechanical engineering opting for the elective of Biomedical Engineering. Coverage of bioinstrumentation, biomaterials and computing for biomedical engineers can meet the needs of the students of Electronic & Communication, Electronic & Instrumenta

  20. Introduction to biomedical engineering

    CERN Document Server

    Enderle, John D; Blanchard, Susan M

    2005-01-01

    Under the direction of John Enderle, Susan Blanchard and Joe Bronzino, leaders in the field have contributed chapters on the most relevant subjects for biomedical engineering students. These chapters coincide with courses offered in all biomedical engineering programs so that it can be used at different levels for a variety of courses of this evolving field. Introduction to Biomedical Engineering, Second Edition provides a historical perspective of the major developments in the biomedical field. Also contained within are the fundamental principles underlying biomedical engineering design, analysis, and modeling procedures. The numerous examples, drill problems and exercises are used to reinforce concepts and develop problem-solving skills making this book an invaluable tool for all biomedical students and engineers. New to this edition: Computational Biology, Medical Imaging, Genomics and Bioinformatics. * 60% update from first edition to reflect the developing field of biomedical engineering * New chapters o...

  1. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
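    The kind of local-versus-cloud comparison mentioned above reduces to a per-core-hour total-cost-of-ownership estimate. A toy model follows; every input is a hypothetical placeholder, and none of the figures are BNL's or Amazon's:

```python
def local_cost_per_core_hour(capex_usd, lifetime_years, avg_power_kw, pue,
                             usd_per_kwh, staff_usd_per_year, cores,
                             utilization):
    """Toy TCO model for a local data center, per *delivered* core-hour.
    All inputs are hypothetical placeholders, not measured figures."""
    hours = lifetime_years * 8760
    energy_usd = avg_power_kw * pue * hours * usd_per_kwh
    total_usd = capex_usd + energy_usd + staff_usd_per_year * lifetime_years
    return total_usd / (cores * hours * utilization)

# compare against an on-demand cloud price per core-hour (also a placeholder)
local = local_cost_per_core_hour(3e6, 5, 200, 1.4, 0.10, 5e5, 10_000, 0.85)
cloud = 0.05
print(f"local ~ ${local:.3f}/core-h vs cloud ${cloud:.3f}/core-h")
```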

  2. Secure data exchange between intelligent devices and computing centers

    Science.gov (United States)

    Naqvi, Syed; Riguidel, Michel

    2005-03-01

    The advent of reliable spontaneous networking technologies (commonly known as wireless ad-hoc networks) has raised the stakes for the conception of computing-intensive environments using intelligent devices as their interface with the external world. These smart devices are used as data gateways for the computing units. They are employed in highly volatile environments where the secure exchange of data between the devices and their computing centers is of paramount importance. Moreover, their mission-critical applications require dependable measures against attacks such as denial of service (DoS), eavesdropping, and masquerading. In this paper, we propose a mechanism to assure reliable data exchange between an intelligent environment composed of smart devices and distributed computing units collectively called a 'computational grid'. The notion of an infosphere is used to define a digital space made up of persistent and volatile assets in an often indefinite geographical space. We study different infospheres and present general evolutions and issues in the security of such technology-rich and intelligent environments. It is beyond any doubt that these environments will face a proliferation of users, applications, networked devices, and interactions on a scale never experienced before, so it is better to build in the ability to deal with these systems uniformly. As a solution, we propose the concept of virtualization of security services. We address the difficult problems of implementation and maintenance of trust on the one hand, and of security management in heterogeneous infrastructure on the other.

  3. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has recently been installed. The system will be of great help to the exciting physics analyses of the coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is the Dell PowerEdge 1955, which contains two Intel dual-core Xeon (WoodCrest) CPUs running at 3 GHz, and a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured as RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with a redundant configuration in a cost-effective way. The next major upgrade will take place in thre...

  4. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  5. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  6. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  7. Building a Prototype of LHC Analysis Oriented Computing Centers

    Science.gov (United States)

    Bagliesi, G.; Boccali, T.; Della Ricca, G.; Donvito, G.; Paganoni, M.

    2012-12-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized for chaotic end-user analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  8. Building a Prototype of LHC Analysis Oriented Computing Centers

    International Nuclear Information System (INIS)

    Bagliesi, G; Boccali, T; Della Ricca, G; Donvito, G; Paganoni, M

    2012-01-01

    A Consortium between four LHC Computing Centers (Bari, Milano, Pisa and Trieste) was formed in 2010 to prototype analysis-oriented facilities for CMS data analysis, profiting from a grant from the Italian Ministry of Research. The Consortium aims to realize an ad-hoc infrastructure to ease the analysis activities on the huge data set collected at the LHC Collider. While “Tier2” Computing Centres, specialized in organized processing tasks like Monte Carlo simulation, are nowadays a well-established concept with years of running experience, sites specialized for chaotic end-user analysis activities do not yet have a de facto standard implementation. In our effort, we focus on all the aspects that can make the analysis tasks easier for a physics user not expert in computing. On the storage side, we are experimenting with storage techniques allowing remote data access and with storage optimization for the typical analysis access patterns. On the networking side, we are studying the differences between flat and tiered LAN architectures, also using virtual partitioning of the same physical network for the different use patterns. Finally, on the user side, we are developing tools and instruments to allow for exhaustive monitoring of their processes at the site, and for an efficient support system in case of problems. We report on the results of the tests executed on the different subsystems and give a description of the layout of the infrastructure in place at the sites participating in the consortium.

  9. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    Science.gov (United States)

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer process of this technology has hitherto been an intuitive process without in-depth analysis. Some difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies which have achieved a certain stability. Grid and Cloud solutions are technologies which are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions by the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects face in the late funding periods, and show leveraging steps that can help in overcoming the "vale of tears".

  10. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  11. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  12. Data-Driven Approaches for Computation in Intelligent Biomedical Devices: A Case Study of EEG Monitoring for Chronic Seizure Detection

    Directory of Open Access Journals (Sweden)

    Naveen Verma

    2011-04-01

    Full Text Available Intelligent biomedical devices are systems able to detect specific physiological processes in patients so that particular responses can be generated. This closed-loop capability can have enormous clinical value when we consider the unprecedented modalities that are beginning to emerge for sensing and stimulating patient physiology. Both delivering therapy (e.g., deep-brain stimulation, vagus nerve stimulation) and treating impairments (e.g., neural prostheses) require computational devices that can make clinically relevant inferences, especially using minimally-intrusive patient signals. The key to such devices is algorithms that are based on data-driven signal modeling as well as hardware structures that are specialized to them. This paper discusses the primary application-domain challenges that must be overcome and analyzes the most promising methods for this that are emerging. We then look at how these methods are being incorporated in ultra-low-energy computational platforms and systems. The case study is a seizure-detection SoC that includes instrumentation and computation blocks in support of a system that exploits patient-specific modeling to achieve accurate performance for chronic detection. The SoC samples each EEG channel at a rate of 600 Hz and performs processing to derive signal features on every two-second epoch, consuming 9 μJ/epoch/channel. Signal feature extraction reduces the data rate by a factor of over 40×, permitting wireless communication from the patient’s head while reducing the total power on the head by 14×.
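    The arithmetic in this abstract (600 Hz sampling, 2 s epochs, a >40× data-rate reduction through feature extraction) can be made concrete with a toy band-energy feature extractor; this is an illustrative sketch, not the SoC's actual feature set:

```python
import numpy as np

FS = 600        # sampling rate (Hz), as in the abstract
EPOCH_S = 2     # epoch length (s) -> 1200 samples/epoch/channel
N_BANDS = 30    # 1200 samples -> 30 band energies: a 40x reduction

def epoch_band_energies(channel):
    """Split one EEG channel into 2 s epochs and reduce each epoch to
    N_BANDS spectral band energies (a common seizure-feature family)."""
    n = FS * EPOCH_S
    epochs = channel[: len(channel) // n * n].reshape(-1, n)
    power = np.abs(np.fft.rfft(epochs, axis=1)) ** 2
    bands = np.array_split(power, N_BANDS, axis=1)
    return np.stack([b.sum(axis=1) for b in bands], axis=1)

x = np.random.randn(FS * 60)          # one minute of synthetic signal
feats = epoch_band_energies(x)        # shape: (30 epochs, 30 features)
print(feats.shape, 1200 / N_BANDS)    # 40.0x fewer values per epoch
```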

  13. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector sytems and micro-processor based systems the book allows to compare performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook to assess the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  14. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
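    For a feel of the per-sample workload being streamed, here is a toy batch version of the classic derivative-square-integrate QRS detection scheme (not the authors' streams implementation):

```python
import numpy as np

def detect_qrs(ecg, fs):
    """Toy QRS detector: differentiate, square, integrate over ~150 ms,
    threshold, then enforce a 250 ms refractory period between beats."""
    energy = np.diff(ecg) ** 2
    win = max(1, int(0.15 * fs))
    integrated = np.convolve(energy, np.ones(win) / win, mode="same")
    above = integrated > 0.5 * integrated.max()    # crude fixed threshold
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    beats, last = [], -fs
    for s in onsets:
        if s - last >= int(0.25 * fs):             # refractory period
            beats.append(int(s))
            last = s
    return beats
```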

  15. Computational modeling of Fontan physiology: at the crossroads of pediatric cardiology and biomedical engineering.

    Science.gov (United States)

    Slesnick, Timothy C; Yoganathan, Ajit P

    2014-08-01

    The Fontan operation has evolved over the last four and a half decades and is now widely applied to patients with various forms of "single ventricle" congenital heart disease. Survival has greatly improved since the early years, but long-term morbidity and mortality continue to occur. Modeling of Fontan geometries, both in vitro and using computational fluid dynamics, has been instrumental in designing novel changes to the Fontan operation, including the application of staged surgical procedures leading to a total cavopulmonary anastomosis, and the lateral tunnel, extracardiac conduit, and most recently bifurcated Y-graft modifications. In this review, the history of modeling of Fontan physiologies, current state-of-the-art methodologies, and future directions are explored. The application of these techniques to cardiac magnetic resonance imaging to construct patient-specific anatomies offers the possibility of individualized surgical planning to optimize hemodynamics, including minimizing power loss, balancing hepatic factor distribution, and ultimately improving patient outcomes.

  16. Parallel Processing and Bio-inspired Computing for Biomedical Image Registration

    Directory of Open Access Journals (Sweden)

    Silviu Ioan Bejinariu

    2014-07-01

    Full Text Available Image Registration (IR) is an optimization problem computing the optimal parameters of a geometric transform used to overlay one or more source images onto a given model by maximizing a similarity measure. In this paper the use of bio-inspired optimization algorithms in image registration is analyzed. Results obtained by means of three different algorithms are compared: the Bacterial Foraging Optimization Algorithm (BFOA), the Genetic Algorithm (GA) and the Clonal Selection Algorithm (CSA). Depending on the image type, the registration may be area-based, which is slow but more precise, or feature-based, which is faster. In this paper a feature-based approach built on the Scale Invariant Feature Transform (SIFT) is proposed. Finally, results obtained using sequential and parallel implementations on multi-core systems for area-based and feature-based image registration are compared.
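    For the feature-based path, a minimal sketch of SIFT matching plus a robust transform fit, assuming OpenCV (≥ 4.4, where SIFT lives in the main module) is available; the paper's BFOA/GA/CSA optimizers are not reproduced here:

```python
import cv2
import numpy as np

def register_feature_based(src, dst):
    """Overlay 8-bit grayscale src onto dst: SIFT keypoints, ratio-test
    matching, then a robust partial-affine estimate with RANSAC."""
    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(src, None)
    k2, d2 = sift.detectAndCompute(dst, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]
    p1 = np.float32([k1[m.queryIdx].pt for m in good])
    p2 = np.float32([k2[m.trainIdx].pt for m in good])
    M, _ = cv2.estimateAffinePartial2D(p1, p2, method=cv2.RANSAC)
    h, w = dst.shape[:2]
    return cv2.warpAffine(src, M, (w, h))
```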

  17. Hybrid brain-computer interface for biomedical cyber-physical system application using wireless embedded EEG systems.

    Science.gov (United States)

    Chai, Rifai; Naik, Ganesh R; Ling, Sai Ho; Nguyen, Hung T

    2017-01-07

    One of the key challenges of the biomedical cyber-physical system is to combine cognitive neuroscience with the integration of physical systems to assist people with disabilities. Electroencephalography (EEG) has been explored as a non-invasive method of providing assistive technology by using brain electrical signals. This paper presents a unique prototype of a hybrid brain-computer interface (BCI) which performs combined classification of mental tasks, steady-state visual evoked potentials (SSVEP) and eyes-closed detection using only two EEG channels. In addition, a microcontroller-based, head-mounted, battery-operated wireless EEG sensor combined with a separate embedded system is used to enhance portability, convenience and cost-effectiveness. The experiment was conducted with five healthy participants and five patients with tetraplegia. Overall, the results show comparable classification accuracies between healthy subjects and tetraplegia patients. For the offline artificial neural network classification in the target group of patients with tetraplegia, the hybrid BCI system combines three mental tasks, three SSVEP frequencies and eyes closed, with an average classification accuracy of 74% and an average information transfer rate (ITR) of 27 bits/min. For real-time testing of the intentional signal in patients with tetraplegia, the average detection success rate is 70% and the detection time varies from 2 to 4 s.
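    The quoted ITR can be reproduced with Wolpaw's standard formula, assuming the seven-way choice implied by three mental tasks, three SSVEP frequencies and eyes closed:

```python
import math

def wolpaw_itr(n_classes, accuracy, decisions_per_min):
    """Information transfer rate (bits/min) via Wolpaw's formula."""
    n, p = n_classes, accuracy
    bits = math.log2(n)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * decisions_per_min

# 7 classes at 74% accuracy gives ~1.31 bits/decision; ~20 decisions/min
# (one every ~3 s, matching the 2-4 s detection times) yields ~27 bits/min
print(wolpaw_itr(7, 0.74, 20.5))
```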

  18. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units, the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During 1974-1975 a modern industrial computer system was set up. This system, built with T 2000/20 hardware from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of the two CPUs (hardware and software) enables automatic switchover of the system to either the first or the second CPU. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodic listing, and specific calculations concerning the process (balances, etc.), and at a later stage, automatic control of certain units of the process [fr

  19. Three-dimensional biomedical imaging

    International Nuclear Information System (INIS)

    Robb, R.A.

    1985-01-01

    Scientists in biomedical imaging provide researchers, physicians, and academicians with an understanding of the fundamental theories and practical applications of three-dimensional biomedical imaging methodologies. Succinct descriptions of each imaging modality are supported by numerous diagrams and illustrations which clarify important concepts and demonstrate system performance in a variety of applications. Comparisons of the different functional attributes, relative advantages and limitations, complementary capabilities, and future directions of three-dimensional biomedical imaging modalities are given. Volume I: Introductions to Three-Dimensional Biomedical Imaging. Photoelectronic-Digital Imaging for Diagnostic Radiology. X-Ray Computed Tomography - Basic Principles. X-Ray Computed Tomography - Implementation and Applications. X-Ray Computed Tomography: Advanced Systems and Applications in Biomedical Research and Diagnosis. Volume II: Single Photon Emission Computed Tomography. Positron Emission Tomography (PET). Computerized Ultrasound Tomography. Fundamentals of NMR Imaging. Display of Multi-Dimensional Biomedical Image Information. Summary and Prognostications

  20. Computer literacy and E-learning perception in Cameroon: the case of Yaounde Faculty of Medicine and Biomedical Sciences

    Science.gov (United States)

    2013-01-01

    Background Health science education faces numerous challenges: assimilation of knowledge, management of increasing numbers of learners or changes in educational models and methodologies. With the emergence of e-learning, the use of information and communication technologies (ICT) and the Internet to improve teaching and learning in health science training institutions has become a crucial issue for low and middle income countries, including sub-Saharan Africa. In this perspective, the Faculty of Medicine and Biomedical Sciences (FMBS) of Yaoundé has played a pioneering role in Cameroon in making significant efforts to improve students’ and lecturers’ access to computers and to the Internet on its campus. The objective is to investigate how computer literacy and the perception of e-learning and its potential could contribute to the learning and teaching process within the FMBS academic community. Method A cross-sectional survey was carried out among students, residents and lecturers. The data was gathered through a written questionnaire distributed at the FMBS campus and analysed with routine statistical software. Results 307 participants answered the questionnaire: 218 students, 57 residents and 32 lecturers. Results show that most students, residents and lecturers have access to computers and the Internet, although students’ access is mainly at home for computers and at cyber cafés for the Internet. Most of the participants have a fairly good mastery of ICT. However, some basic rules of good practice concerning the use of ICT in the health domain were still not well known. Google is the most frequently used engine to retrieve health literature for all participants; only 7% of students and 16% of residents have heard about Medical Subject Headings (MeSH). The potential of e-learning in the improvement of teaching and learning remains insufficiently exploited. About two-thirds of the students are not familiar with the concept of e-learning. 84% of students and 58% of

  1. Computer literacy and E-learning perception in Cameroon: the case of Yaounde Faculty of Medicine and Biomedical Sciences.

    Science.gov (United States)

    Bediang, Georges; Stoll, Beat; Geissbuhler, Antoine; Klohn, Axel M; Stuckelberger, Astrid; Nko'o, Samuel; Chastonay, Philippe

    2013-04-19

    Health science education faces numerous challenges: assimilation of knowledge, management of increasing numbers of learners or changes in educational models and methodologies. With the emergence of e-learning, the use of information and communication technologies (ICT) and the Internet to improve teaching and learning in health science training institutions has become a crucial issue for low and middle income countries, including sub-Saharan Africa. In this perspective, the Faculty of Medicine and Biomedical Sciences (FMBS) of Yaoundé has played a pioneering role in Cameroon in making significant efforts to improve students' and lecturers' access to computers and to the Internet on its campus. The objective is to investigate how computer literacy and the perception of e-learning and its potential could contribute to the learning and teaching process within the FMBS academic community. A cross-sectional survey was carried out among students, residents and lecturers. The data was gathered through a written questionnaire distributed at the FMBS campus and analysed with routine statistical software. 307 participants answered the questionnaire: 218 students, 57 residents and 32 lecturers. Results show that most students, residents and lecturers have access to computers and the Internet, although students' access is mainly at home for computers and at cyber cafés for the Internet. Most of the participants have a fairly good mastery of ICT. However, some basic rules of good practice concerning the use of ICT in the health domain were still not well known. Google is the most frequently used engine to retrieve health literature for all participants; only 7% of students and 16% of residents have heard about Medical Subject Headings (MeSH). The potential of e-learning in the improvement of teaching and learning remains insufficiently exploited. About two-thirds of the students are not familiar with the concept of e-learning. 84% of students and 58% of residents had never had access to

  2. Building the biomedical data science workforce.

    Science.gov (United States)

    Dunn, Michelle C; Bourne, Philip E

    2017-07-01

    This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers.

  3. Translational Bioinformatics and Clinical Research (Biomedical) Informatics.

    Science.gov (United States)

    Sirintrapun, S Joseph; Zehir, Ahmet; Syed, Aijazuddin; Gao, JianJiong; Schultz, Nikolaus; Cheng, Donavan T

    2015-06-01

    Translational bioinformatics and clinical research (biomedical) informatics are the primary domains related to informatics activities that support translational research. Translational bioinformatics focuses on computational techniques in genetics, molecular biology, and systems biology. Clinical research (biomedical) informatics involves the use of informatics in discovery and management of new knowledge relating to health and disease. This article details 3 projects that are hybrid applications of translational bioinformatics and clinical research (biomedical) informatics: The Cancer Genome Atlas, the cBioPortal for Cancer Genomics, and the Memorial Sloan Kettering Cancer Center clinical variants and results database, all designed to facilitate insights into cancer biology and clinical/therapeutic correlations. Copyright © 2015 Elsevier Inc. All rights reserved.

  4. M-center growth in alkali halides: computer simulation

    International Nuclear Information System (INIS)

    Aguilar, M.; Jaque, F.; Agullo-Lopez, F.

    1983-01-01

    The heterogeneous interstitial nucleation model previously proposed to explain F-center growth curves in irradiated alkali halides has been extended to account for M-center kinetics. The interstitials produced during the primary irradiation event are assumed to be trapped at impurities and interstitial clusters or to recombine with F and M centers. For M-center formation two cases have been considered: (a) diffusion and aggregation of F centers, and (b) statistical generation and pairing of F centers. Process (b) is the only one consistent with the quadratic relationship between M- and F-center concentrations. However, to account for the F/M ratios experimentally observed, as well as for the role of dose rate, a modified statistical model involving the random creation and association of F⁺-F pairs has been shown to be adequate. (author)
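    Why statistical pairing yields the quadratic M-F relationship can be illustrated with a toy Monte Carlo: drop color centers at random lattice sites and count occupied nearest-neighbour pairs as M centers (an illustrative sketch, not the authors' model):

```python
import numpy as np

rng = np.random.default_rng(1)
L = 4000                                    # side of a square 2D lattice

def f_and_m_counts(n_defects):
    """Drop n_defects centers at random sites; adjacent pairs act as M."""
    occ = np.zeros((L, L), dtype=bool)
    occ[rng.integers(0, L, n_defects), rng.integers(0, L, n_defects)] = True
    # occupied nearest-neighbour pairs (periodic lattice, two bond directions)
    m = (occ & np.roll(occ, 1, 0)).sum() + (occ & np.roll(occ, 1, 1)).sum()
    f = occ.sum() - 2 * m                   # remaining isolated centers ~ F
    return f, m

doses = [50_000, 100_000, 200_000, 400_000]
f, m = np.array([f_and_m_counts(n) for n in doses]).T
slope = np.polyfit(np.log(f), np.log(m), 1)[0]
print(f"log-log slope = {slope:.2f} (close to 2, i.e. M proportional to F^2)")
```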

  5. Biomedical signal analysis

    CERN Document Server

    Rangayyan, Rangaraj M

    2015-01-01

    The book assists the reader in developing techniques for the analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications, along with 800 mathematical expressions and equations. Practical questions, problems and laboratory exercises are included, as are fractals and chaos theory with biomedical applications.

  6. [Biomedical informatics].

    Science.gov (United States)

    Capurro, Daniel; Soto, Mauricio; Vivent, Macarena; Lopetegui, Marcelo; Herskovic, Jorge R

    2011-12-01

    Biomedical informatics is a new discipline that arose from the need to incorporate information technologies into the generation, storage, distribution and analysis of information in the domain of the biomedical sciences. The discipline comprises basic biomedical informatics and public health informatics. The development of the discipline in Chile has been modest and most projects have originated from the interest of individual people or institutions, without systematic and coordinated national development. Considering the unique features of the health care system of our country, research in the area of biomedical informatics is becoming an imperative.

  7. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  8. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly, the computational aspects are focused on three- (and higher-) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is 'communication', which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  9. Biomedical Engineering | Classification | College of Engineering & Applied

    Science.gov (United States)


  10. Science gateways for biomedical big data analysis

    NARCIS (Netherlands)

    Shahand, S.

    2015-01-01

    Biomedical researchers are facing data deluge challenges such as dealing with large volumes of complex heterogeneous data and with complex, computationally demanding data processing methods. This scale and complexity of biomedical research requires multi-disciplinary collaboration between scientists

  11. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information.

    Science.gov (United States)

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur

    2013-03-01

    Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult for third-party software developers to collect, manage and process all of these entries in one place without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow the development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. The dedicated search GenBank system makes use of NCBI Web services and the package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can follow one of three exploration paths: simple data searching based on a specified user query, advanced data searching based on a specified user query, and advanced data exploration with the use of macros. search GenBank orchestrates calls to the particular tools available through the NCBI Web service, providing the requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. search GenBank extends standard capabilities of the
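    The eUtils orchestration described here chains plain HTTP calls; a minimal ESearch → EFetch sketch against the public NCBI endpoints (search GenBank's macro engine is, of course, far more elaborate):

```python
import requests

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def search_then_fetch(term, db="nucleotide", n=3):
    """ESearch for record IDs matching `term`, then EFetch the records."""
    ids = requests.get(f"{EUTILS}/esearch.fcgi", params={
        "db": db, "term": term, "retmax": n, "retmode": "json",
    }).json()["esearchresult"]["idlist"]
    if not ids:
        return ""
    return requests.get(f"{EUTILS}/efetch.fcgi", params={
        "db": db, "id": ",".join(ids), "rettype": "gb", "retmode": "text",
    }).text

print(search_then_fetch("Rhodnius prolixus[Organism]")[:500])
```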

  12. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer-assisted virtual environment, or CAVE™, that allows scientists and engineers to literally walk into their data...

  13. Diamond NV centers for quantum computing and quantum networks

    NARCIS (Netherlands)

    Childress, L.; Hanson, R.

    2013-01-01

    The exotic features of quantum mechanics have the potential to revolutionize information technologies. Using superposition and entanglement, a quantum processor could efficiently tackle problems inaccessible to current-day computers. Nonlocal correlations may be exploited for intrinsically secure

  14. Ethics in psychosocial and biomedical research – A training experience at the Interdisciplinary Center for Bioethics (CIEB) of the University of Chile1

    Science.gov (United States)

    Lolas, Fernando; Rodriguez, Eduardo

    2012-01-01

    This paper reviews the experience of training Latin American professionals and scientists in the ethics of biomedical and psychosocial research at the Interdisciplinary Center for Studies in Bioethics (CIEB) of the University of Chile, aided by a grant from the Fogarty International Center (FIC) of the National Institutes of Health from 2002 to 2011. In these 10 years of experience, 50 trainees have completed a 12-month training program combining on-line and in-person teaching and learning activities, with further support for maintaining contact via webmail and personal meetings. The network formed by faculty and former trainees has published extensively on issues relevant to the continent and has been instrumental in promoting new master-level courses at different universities, drafting regulations and norms, and promoting the use of bioethical discourse in health care and research. Evaluation meetings have shown that while most trainees benefited from the experience and contributed substantially to developments at their home institutions and countries, some degree of structuring of the demand for qualified personnel is needed in order to better utilize the human resources created by the program. Publications and other deliverables of trainees and faculty are presented. PMID:22754084

  15. [Standardization of the terminology of the academic medical centers and biomedical research centers, in the English language, for journal article sending].

    Science.gov (United States)

    Hochman, Bernardo; Locali, Rafael Fagionato; Oliveira Filho, Renato Santos de; Oliveira, Ricardo Leão de; Goldenberg, Saul; Ferreira, Lydia Masako

    2006-01-01

    To suggest a standardization, in the English language, of the formatting of citations of research centers. From the three most recent publications of each of the first 20 journals available in the Brazilian Portal of Scientific Information - Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) with the biggest impact factor during 2004, according to the ISI Web of Knowledge Journal Citation Reports database for the biennium 2004-2005, the formats of citations of research centers were extracted. An analogy to the institutional hierarchy of the Federal University of Sao Paulo (UNIFESP) was carried out, and the most frequent formats in the English language were adopted as the standard suggested for citing research centers when submitting articles. The citation "Departamento" was standardized as "Department of ..." (where "..." is the English name of the department); "Programa de Pós-Graduação" as "... Program"; "Disciplina" as "Division of ..."; "Órgãos, Grupos e Associações" as "... Group"; "Setor" as "Section of ..."; "Centro" as "Center for ..."; "Unidade" as "... Unit"; "Instituto" as "Institute of ..."; "Laboratório" as "Laboratory of ..."; and "Grupo" as "Group of ...".

  16. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes
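    Provisioning against such an SLA is classically answered with an M/M/c waiting-probability calculation; a sketch using the Erlang C formula (the paper's multi-class model is more refined):

```python
import math

def erlang_c(c, a):
    """Probability that an arriving request must wait in an M/M/c queue,
    with offered load a = arrival_rate / service_rate (stable only if a < c)."""
    if a >= c:
        return 1.0
    x = a**c / math.factorial(c) * c / (c - a)
    return x / (sum(a**k / math.factorial(k) for k in range(c)) + x)

def servers_for_sla(a, max_wait_prob):
    """Smallest number of servers keeping P(wait) below the SLA target."""
    c = max(1, math.ceil(a))
    while erlang_c(c, a) > max_wait_prob:
        c += 1
    return c

print(servers_for_sla(a=80.0, max_wait_prob=0.05))   # e.g. 80 Erlangs of load
```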

  17. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  18. Conception of a computer for the nuclear medical department of the Augsburg hospital center

    International Nuclear Information System (INIS)

    Graf, G.; Heidenreich, P.

    1984-01-01

    A computer system based on the Siemens R30 process computer has been employed at the Institute of Nuclear Medicine of the Augsburg Hospital Center since early 1981. This system, including the development and testing of organ-specific evaluation programs, was used as a basis for the conception of the new computer system for the department of nuclear medicine of the Augsburg Hospital Center. The computer system was extended and installed according to this conception when the new 1400-bed hospital was opened in the 3rd phase of construction in autumn 1982. (orig.) [de

  19. Center for Computer Security newsletter. Volume 2, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-05-01

    The Fifth Computer Security Group Conference was held November 16 to 18, 1982, at the Knoxville Hilton in Knoxville, Tennessee. Attending were 183 people, representing the Department of Energy, DOE contractors, other government agencies, and vendor organizations. Included here are abridgements of most of the papers presented in Knoxville. Fewer than half a dozen speakers failed to furnish either abstracts or full-text papers of their Knoxville presentations.

  20. Computer-aided dispatch--traffic management center field operational test : state of Utah final report

    Science.gov (United States)

    2006-07-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch Traffic Management Center Integration Field Operations Test in the State of Utah. The document discusses evaluation findings in the followin...

  1. Computer-aided dispatch--traffic management center field operational test : Washington State final report

    Science.gov (United States)

    2006-05-01

    This document provides the final report for the evaluation of the USDOT-sponsored Computer-Aided Dispatch - Traffic Management Center Integration Field Operations Test in the State of Washington. The document discusses evaluation findings in the foll...

  2. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  3. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  4. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) has been adopted in almost all health care settings, including the primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  5. Biomedical photonics handbook biomedical diagnostics

    CERN Document Server

    Vo-Dinh, Tuan

    2014-01-01

    Shaped by Quantum Theory, Technology, and the Genomics Revolution. The integration of photonics, electronics, biomaterials, and nanotechnology holds great promise for the future of medicine. This topic has recently experienced an explosive growth due to the noninvasive or minimally invasive nature and the cost-effectiveness of photonic modalities in medical diagnostics and therapy. The second edition of the Biomedical Photonics Handbook presents fundamental developments as well as important applications of biomedical photonics of interest to scientists, engineers, manufacturers, teachers, students…

  6. Evaluating efforts to diversify the biomedical workforce: the role and function of the Coordination and Evaluation Center of the Diversity Program Consortium.

    Science.gov (United States)

    McCreath, Heather E; Norris, Keith C; Calderόn, Nancy E; Purnell, Dawn L; Maccalla, Nicole M G; Seeman, Teresa E

    2017-01-01

    The National Institutes of Health (NIH)-funded Diversity Program Consortium (DPC) includes a Coordination and Evaluation Center (CEC) to conduct a longitudinal evaluation of the two signature, national NIH initiatives - the Building Infrastructure Leading to Diversity (BUILD) and the National Research Mentoring Network (NRMN) programs - designed to promote diversity in the NIH-funded biomedical, behavioral, clinical, and social sciences research workforce. Evaluation is central to understanding the impact of the consortium activities. This article reviews the role and function of the CEC and the collaborative processes and achievements critical to establishing empirical evidence regarding the efficacy of federally-funded, quasi-experimental interventions across multiple sites. The integrated DPC evaluation is particularly significant because it is a collaboratively developed Consortium Wide Evaluation Plan and the first hypothesis-driven, large-scale systemic national longitudinal evaluation of training programs in the history of NIH/National Institute of General Medical Sciences. To guide the longitudinal evaluation, the CEC-led literature review defined key indicators at critical training and career transition points - or Hallmarks of Success. The multidimensional, comprehensive evaluation of the impact of the DPC framed by these Hallmarks is described. This evaluation uses both established and newly developed common measures across sites, and rigorous quasi-experimental designs within novel multi-methods (qualitative and quantitative). The CEC also promotes shared learning among Consortium partners through working groups and provides technical assistance to support high-quality internal process and outcome evaluation for each program. Finally, the CEC is responsible for developing high-impact dissemination channels for best practices to inform peer institutions, NIH, and other key national and international stakeholders. A strong longitudinal evaluation across

  7. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a subsequent upsurge in data center carbon emissions, cost incurred, unethical waste management, depletion of natural resources and high energy utilization. This raises the issue of sustainability attainment in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT based industries utilize data centers to provide services to staff, practitioners and end users. But it is a known fact that enterprise servers utilize huge quantities of energy and incur other expenditures in cooling operations, and it is difficult to address the needs of accuracy and efficiency in data centers while encouraging greener application practices alongside cost reduction. Thus this research study focuses on the practical application of Green computing in data centers, which house servers, and presents the Green computing life cycle strategies and best practices for better management of data centers in IT based industries. Data was collected through a questionnaire from 133 respondents in industries that currently operate their in-house data centers. The analysed data was used to verify the Green computing life cycle strategies presented in this study. Findings from the data show that each of the life cycle strategies is significant in assisting IT based industries to apply Green computing practices in their data centers. This study would be of interest to knowledge and data management practitioners as well as environmental managers and academicians in deploying Green data centers in their organizations.

  8. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

    The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  9. Leveraging the national cyberinfrastructure for biomedical research.

    Science.gov (United States)

    LeDuc, Richard; Vaughn, Matthew; Fonner, John M; Sullivan, Michael; Williams, James G; Blood, Philip D; Taylor, James; Barnett, William

    2014-01-01

    In the USA, the national cyberinfrastructure refers to a system of research supercomputers and other IT facilities and the high-speed networks that connect them. These resources have been heavily leveraged by scientists in disciplines such as high energy physics, astronomy, and climatology, but until recently they have been little used by biomedical researchers. We suggest that many of the 'Big Data' challenges facing the medical informatics community can be efficiently handled using national-scale cyberinfrastructure. Resources such as the Extreme Science and Engineering Discovery Environment, the Open Science Grid, and Internet2 provide economical and proven infrastructures for Big Data challenges, but these resources can be difficult to approach. Specialized web portals, support centers, and virtual organizations can be constructed on these resources to meet defined computational challenges, specifically for genomics. We provide examples of how this has been done in basic biology as an illustration for the biomedical informatics community.

  10. Computer science, biology and biomedical informatics academy: outcomes from 5 years of immersing high-school students into informatics research

    Directory of Open Access Journals (Sweden)

    Andrew J King

    2017-01-01

    Full Text Available The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.

  11. Computer Science, Biology and Biomedical Informatics academy: Outcomes from 5 years of Immersing High-school Students into Informatics Research.

    Science.gov (United States)

    King, Andrew J; Fisher, Arielle M; Becich, Michael J; Boone, David N

    2017-01-01

    The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.

  12. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  13. Biomedical nanotechnology.

    Science.gov (United States)

    Hurst, Sarah J

    2011-01-01

    This chapter summarizes the roles of nanomaterials in biomedical applications, focusing on those highlighted in this volume. A brief history of nanoscience and technology and a general introduction to the field are presented. Then, the chemical and physical properties of nanostructures that make them ideal for use in biomedical applications are highlighted. Examples of common applications, including sensing, imaging, and therapeutics, are given. Finally, the challenges associated with translating this field from the research laboratory to the clinic setting, in terms of the larger societal implications, are discussed.

  14. The effective use of virtualization for selection of data centers in a cloud computing environment

    Science.gov (United States)

    Kumar, B. Santhosh; Parthiban, Latha

    2018-04-01

    Data centers are facilities that consist of networks of remote servers used to store, access and process data. Cloud computing is a technology in which users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers need to employ virtualization so that multiple tasks can be executed simultaneously. In this paper we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of a service provider.
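
    As a rough illustration of the selection rule just described, the sketch below routes a task to the data center whose virtual machines report the lowest summed energy. It is a minimal reconstruction under stated assumptions, not the authors' implementation; the class names, helper functions, and energy figures are hypothetical.

    ```python
    # Minimal reconstruction (hypothetical names and figures) of the rule:
    # route a submitted task to the data center whose virtual machines
    # report the lowest total energy.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Server:
        vm_energies: List[float]          # estimated energy per VM on this server

        def energy(self) -> float:
            return sum(self.vm_energies)  # server energy = sum over its VMs

    @dataclass
    class DataCenter:
        name: str
        servers: List[Server] = field(default_factory=list)

        def total_energy(self) -> float:
            # Data-center energy is the summation of individual server energies.
            return sum(s.energy() for s in self.servers)

    def select_data_center(centers: List[DataCenter]) -> DataCenter:
        # Route the task to the least-energy data center.
        return min(centers, key=DataCenter.total_energy)

    dc1 = DataCenter("dc-east", [Server([3.2, 1.1]), Server([2.0])])
    dc2 = DataCenter("dc-west", [Server([1.5]), Server([0.9, 0.7])])
    print(select_data_center([dc1, dc2]).name)    # -> dc-west
    ```

    In a real scheduler the per-VM energy estimates would be refreshed continuously; here they are fixed inputs so that the routing rule stays visible.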

  15. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    Science.gov (United States)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel; ATLAS Collaboration

    2011-12-01

    GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center are presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computing and personnel resources.

  16. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    … area and the researchers working on these projects. Also inside: news from the AHPCRC consortium partners at Morgan State University and the NASA … Computing Research Center is provided by the supercomputing and research facilities at Stanford University and at the NASA Ames Research Center … at the atomic and molecular level, he said. He noted that "every general would like to have" a Star Trek-like holodeck, where holographic avatars could …

  17. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

    In this article the problem of supporting scientific projects throughout their lifecycle in a computer center is considered from every aspect of support. The Configuration Management system plays a connecting role in the processes related to the provision and support of a computer center's services. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  18. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

    In this article the problem of supporting scientific projects throughout their lifecycle in a computer center is considered from every aspect of support. The Configuration Management system plays a connecting role in the processes related to the provision and support of a computer center's services. In view of the strong integration of IT infrastructure components through the use of virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  19. Biomedical ontologies: toward scientific debate.

    Science.gov (United States)

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  20. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

    The Technical Data Management Center (TDMC), which collects, packages, analyzes, and distributes information, computing technology, and data related to meteorological and other environmental transport work, is located at the Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described

  1. BIG: a Grid Portal for Biomedical Data and Images

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-06-01

    Full Text Available Modern management of biomedical systems involves the use of many distributed resources, such as high performance computational resources to analyze biomedical data, mass storage systems to store them, medical instruments (microscopes, tomographs, etc.), and advanced visualization and rendering tools. Grids offer the computational power, security and availability needed by such novel applications. This paper presents BIG (Biomedical Imaging Grid), a Web-based Grid portal for the management of biomedical information (data and images) in a distributed environment. BIG is an interactive environment that deals with complex user requests regarding the acquisition of biomedical data and the "processing" and "delivering" of biomedical images, using the power and security of Computational Grids.

  2. CENTER CONDITIONS AND CYCLICITY FOR A FAMILY OF CUBIC SYSTEMS: COMPUTER ALGEBRA APPROACH.

    Science.gov (United States)

    Ferčec, Brigita; Mahdi, Adam

    2013-01-01

    Using methods of computational algebra we obtain an upper bound for the cyclicity of a family of cubic systems. We overcame the problem of nonradicality of the associated Bautin ideal by moving from the ring of polynomials to a coordinate ring. Finally, we determine the number of limit cycles bifurcating from each component of the center variety.
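
    For readers unfamiliar with this machinery, the toy sketch below shows its general flavor: compute a Groebner basis of a small polynomial ideal and read off the components of its variety. The polynomials g1 and g2 are hypothetical stand-ins, not the focus quantities of the cubic family studied in the paper, and SymPy is assumed to be available.

    ```python
    # Toy illustration of the computational-algebra workflow: a Groebner
    # basis exposes the components of a variety. g1 and g2 are hypothetical
    # stand-ins for focus quantities of a planar system.
    from sympy import groebner, symbols

    a, b = symbols('a b')
    g1 = a*b - b      # = b*(a - 1)
    g2 = a**2 - 1     # = (a - 1)*(a + 1)
    G = groebner([g1, g2], a, b, order='lex')
    print(G.exprs)
    # The common zeros split into the line a = 1 (b free) and the point
    # (a, b) = (-1, 0); center-variety computations decompose Bautin-type
    # ideals in the same spirit.
    ```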

  3. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  4. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

    Full Text Available This paper presents two methods for accurately computing the centers of periodic regions. One method applies to the general M-sets with integer index numbers; the other applies to the general M-sets with negative integer index numbers. Both methods improve the precision of the computation by transforming the polynomial equations that determine the centers of the periodic regions. We primarily discuss the general M-sets with negative integer index and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. By applying Newton's method to the transformed polynomial equations that determine the periodic regions' centers, we obtain the centers' coordinates with at least 48 significant digits after the decimal point in both the real and imaginary parts. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k = 3, 4, 5, 6) for the index numbers α = −25, −24, …, −1, all of which have high numerical accuracy.
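
    The sketch below applies the same Newton-iteration idea to the ordinary quadratic Mandelbrot set (index number 2) rather than the generalized M-sets treated in the paper: a center of a period-k region is a root of P_k(c) = f_c^k(0), where f_c(z) = z^2 + c, and P_k together with its derivative dP_k/dc is evaluated by recurrence inside each Newton step. It assumes the mpmath library for multiprecision arithmetic.

    ```python
    # Sketch for the standard quadratic M-set, not the generalized M-sets of
    # the paper: a period-k center solves P_k(c) = f_c^k(0) = 0 with
    # f_c(z) = z**2 + c. Newton's method is applied to P_k, with P_k and
    # dP_k/dc computed by recurrence.
    from mpmath import mp, mpc

    mp.dps = 60  # ~60 significant digits of working precision

    def newton_center(c0, k, iters=100):
        c = mpc(c0)
        for _ in range(iters):
            z, dz = mpc(0), mpc(0)   # z = f_c^n(0), dz = dz/dc
            for _ in range(k):
                dz = 2 * z * dz + 1
                z = z * z + c
            c = c - z / dz           # Newton step on P_k(c) = 0
        return c

    # Period-3 example: converges to the real center c = -1.7548776662...
    print(newton_center(-1.75, 3))
    ```

    Raising mp.dps and the iteration count pushes the root to arbitrarily many digits, which is the point of pairing a transformed polynomial with Newton's method.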

  5. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1973-01-01

    Advances in Biomedical Engineering, Volume 2, is a collection of papers that discusses the basic sciences, the applied sciences of engineering, the medical sciences, and the delivery of health services. One paper discusses the models of adrenal cortical control, including the secretion and metabolism of cortisol (the controlled process), as well as the initiation and modulation of secretion of ACTH (the controller). Another paper discusses hospital computer systems: application problems, objective evaluation of technology, and multiple pathways for future hospital computer applications. The pos

  6. Building the biomedical data science workforce.

    Directory of Open Access Journals (Sweden)

    Michelle C Dunn

    2017-07-01

    Full Text Available This article describes efforts at the National Institutes of Health (NIH from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers.

  7. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

    The discussions and plans on all scientific, advisory, and political levels to realize an even larger "European Supercomputer" in Germany, where the hardware costs alone will be hundreds of millions of Euro – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructur...

  8. The feasibility study on 3-dimensional fluorescent x-ray computed tomography using the pinhole effect for biomedical applications.

    Science.gov (United States)

    Sunaguchi, Naoki; Yuasa, Tetsuya; Hyodo, Kazuyuki; Zeniya, Tsutomu

    2013-01-01

    We propose a 3-dimensional fluorescent x-ray computed tomography (CT) system based on a pinhole collimator, aimed at providing molecular imaging with quantifiable measures and sub-millimeter spatial resolution. In this study, we demonstrate the feasibility of this concept and investigate imaging properties such as spatial resolution, contrast resolution and quantifiable measures by imaging physical phantoms using a preliminary imaging system with monochromatic synchrotron x rays constructed at the BLNE-7A experimental line at KEK, Japan.

  9. Use of computers and Internet among people with severe mental illnesses at peer support centers.

    Science.gov (United States)

    Brunette, Mary F; Aschbrenner, Kelly A; Ferron, Joelle C; Ustinich, Lee; Kelly, Michael; Grinley, Thomas

    2017-12-01

    Peer support centers are an ideal setting where people with severe mental illnesses can access the Internet via computers for online health education, peer support, and behavioral treatments. The purpose of this study was to assess computer use and Internet access in peer support agencies. A peer-assisted survey assessed the frequency with which consumers in all 13 New Hampshire peer support centers (n = 702) used computers to access Internet resources. During the 30-day survey period, 200 of the 702 peer support consumers (28%) responded to the survey. More than 3 quarters (78.5%) of respondents had gone online to seek information in the past year. About half (49%) of respondents were interested in learning about online forums that would provide information and peer support for mental health issues. Peer support centers may be a useful venue for Web-based approaches to education, peer support, and intervention. Future research should assess facilitators and barriers to use of Web-based resources among people with severe mental illness in peer support centers. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  10. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.
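
    As a flavor of what such on-demand provisioning looks like against an OpenStack-based site, the sketch below boots a single worker VM with the openstacksdk client. It is a generic example under assumed names (the cloud entry "hpc-cloud" and the image, flavor, and network identifiers are hypothetical), not the institute's production code.

    ```python
    # Generic openstacksdk sketch: boot one worker VM on demand. The cloud,
    # image, flavor, and network names are hypothetical placeholders.
    import openstack

    conn = openstack.connect(cloud="hpc-cloud")   # entry from clouds.yaml

    image = conn.compute.find_image("worker-image")
    flavor = conn.compute.find_flavor("m1.large")
    network = conn.network.find_network("private")

    server = conn.compute.create_server(
        name="batch-worker-01",
        image_id=image.id,
        flavor_id=flavor.id,
        networks=[{"uuid": network.id}],
    )
    server = conn.compute.wait_for_server(server)
    print(server.status)  # ACTIVE once the VM can join the batch pool
    ```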

  11. The Role of the Radiation Safety Information Computational Center (RSICC) in Knowledge Management

    International Nuclear Information System (INIS)

    Valentine, T.

    2016-01-01

    Full text: The Radiation Safety Information Computational Center (RSICC) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 packages that have been provided by contributors from various agencies. RSICC’s customers obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to help ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programmes both domestically and internationally, as the majority of RSICC’s customers are students attending U.S. universities. RSICC also supports and promotes workshops and seminars in nuclear science and technology to further the use and/or development of computational tools and data. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC’s activities, services, and systems that support knowledge management and education and training in the nuclear field. (author)

  12. Current state and future direction of computer systems at NASA Langley Research Center

    Science.gov (United States)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

    Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  13. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

    The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  14. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  15. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research work horse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  16. [Projects to accelerate the practical use of innovative medical devices to collaborate with TWIns, Center for Advanced Biomedical Sciences, Waseda University and School of Engineering, The University of Tokyo].

    Science.gov (United States)

    Niimi, Shingo; Umezu, Mitsuo; Iseki, Hiroshi; Harada, Hiroshi; Kasanuki, Noboru; Mitsuishi, Mamoru; Kitamori, Takehiko; Tei, Yuichi; Nakaoka, Ryusuke; Haishima, Yuji

    2014-01-01

    The Division of Medical Devices has been conducting projects to accelerate the practical use of innovative medical devices in collaboration with TWIns, the Center for Advanced Biomedical Sciences of Waseda University, and the School of Engineering of The University of Tokyo. TWIns has been working toward the establishment of preclinical evaluation methods based on "Engineering Based Medicine" and has established the Regulatory Science Institute for Medical Devices. The School of Engineering of The University of Tokyo has been working toward the establishment of assessment methodologies for innovative minimally invasive therapeutic devices, materials, and nanobio diagnostic devices. This report reviews the exchanges of personnel, the implementation systems, and the research progress of these projects.

  17. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

    Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
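
    For orientation, the sketch below implements the plain coarse-graining MSE baseline that the paper improves upon: the series is averaged within non-overlapping windows at each scale and the sample entropy of each coarse-grained series is computed. It is an illustrative baseline only, not the authors' state-space LMSE estimator.

    ```python
    # Baseline multiscale entropy: coarse-grain, then sample entropy.
    # Illustrative only; not the state-space LMSE method of the paper.
    import numpy as np

    def coarse_grain(x, scale):
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r_factor=0.2):
        r = r_factor * np.std(x)

        def matches(mm):
            # template pairs within Chebyshev distance r
            t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
            d = np.max(np.abs(t[:, None, :] - t[None, :, :]), axis=2)
            return (np.count_nonzero(d <= r) - len(t)) / 2  # drop self-matches

        B, A = matches(m), matches(m + 1)
        return -np.log(A / B) if A > 0 and B > 0 else np.inf

    def multiscale_entropy(x, scales=range(1, 11)):
        return [sample_entropy(coarse_grain(x, s)) for s in scales]

    rng = np.random.default_rng(0)
    print(multiscale_entropy(rng.standard_normal(1000)))  # falls with scale for white noise
    ```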

  18. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  19. Healthy Eating and Harambee: curriculum development for a culturally-centered bio-medically oriented nutrition education program to reach African American women of childbearing age.

    Science.gov (United States)

    Kannan, Srimathi; Sparks, Arlene V; Webster, J DeWitt; Krishnakumar, Ambika; Lumeng, Julie

    2010-07-01

    The purpose was to develop, implement and evaluate a peer-led nutrition curriculum Healthy Eating and Harambee that addresses established objectives of maternal and infant health and to shift the stage for African American women of childbearing age in Genesee County toward healthier dietary patterns using a socio-cultural and biomedical orientation. The PEN-3 model, which frames culture in the context of health promotion interventions, was integrated with the Transtheoretical Model to guide this 13-week pre-test/post-test curriculum. Materials developed included soul food plate visuals, a micronutrient availability worksheet, a fruit stand, and gardening kits. Learning activities included affirmations, stories, case-scenarios, point-of-purchase product recognition, church health teams, and community health fairs. We investigated health-promoting dietary behaviors (consumption of more fruits and vegetables (F&V), serving more F&V to their families, and moderating dietary sodium and fat intakes), and biomedical behaviors (self-monitoring blood pressure and exercising) across five stages of change. Session attendance and program satisfaction were assessed. N = 102 women participated (mean age = 27.5 years). A majority (77%) reported adopting at least one healthy eating behavior (moderating sodium, serving more F&V to their families), 23% adopted at least two such behaviors (reading food labels for sodium; using culinary herbs/spices; serving more F&V to their families), and 45% adopted both dietary (moderating sodium; eating more fruits) and biomedical behaviors. Participants and facilitators favorably evaluated the curriculum and suggested improvements. A multi-conceptual approach coupled with cultural and biomedical tailoring has potential to promote young African American women's movement to more advanced stages of change and improve self-efficacy for fruit and vegetable intake, dietary sodium moderation, and self-monitoring blood pressure and physical activity.

  20. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

    Full Text Available The aim of this study is to present an approach to introducing pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included. At the same time, the topic is among the most motivating tasks due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform for a constructivist learning process, enabling learners' experimentation with the provided programming models, developing learners' competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing programming models and the message passing interface (MPI) and OpenMP parallelization tools have been chosen for implementation.
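
    The course itself uses C with MPI and OpenMP; purely as a compact illustration of the underlying model, the sketch below realizes a two-phase software pipeline as a multiphase queueing system, with each phase serviced by its own process and jobs flowing through queue-connected stages concurrently.

    ```python
    # Two-stage pipeline modeled as a multiphase queueing system: each phase
    # is a process servicing its input queue. Conceptual sketch only; the
    # course described above implements such models in C with MPI/OpenMP.
    from multiprocessing import Process, Queue

    SENTINEL = None

    def inc(x):
        return x + 1

    def dbl(x):
        return x * 2

    def stage(fn, q_in, q_out):
        # Service jobs from the input queue, forward results downstream.
        while True:
            item = q_in.get()
            if item is SENTINEL:
                q_out.put(SENTINEL)  # propagate shutdown
                break
            q_out.put(fn(item))

    if __name__ == "__main__":
        q0, q1, q2 = Queue(), Queue(), Queue()
        workers = [Process(target=stage, args=(inc, q0, q1)),
                   Process(target=stage, args=(dbl, q1, q2))]
        for w in workers:
            w.start()
        for job in range(5):
            q0.put(job)               # jobs arrive at the first phase
        q0.put(SENTINEL)
        results = []
        while (r := q2.get()) is not SENTINEL:
            results.append(r)
        print(results)                # [2, 4, 6, 8, 10]
        for w in workers:
            w.join()
    ```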

  1. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  2. The psychology of computer displays in the modern mission control center

    Science.gov (United States)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  3. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Kirk, Bernadette Lugue [ORNL

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries.

  4. Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  5. Environmental/Biomedical Terminology Index

    Energy Technology Data Exchange (ETDEWEB)

    Huffstetler, J.K.; Dailey, N.S.; Rickert, L.W.; Chilton, B.D.

    1976-12-01

    The Information Center Complex (ICC), a centrally administered group of information centers, provides information support to environmental and biomedical research groups and others within and outside Oak Ridge National Laboratory. In-house data base building and development of specialized document collections are important elements of the ongoing activities of these centers. ICC groups must be concerned with language which will adequately classify and insure retrievability of document records. Language control problems are compounded when the complexity of modern scientific problem solving demands an interdisciplinary approach. Although there are several word lists, indexes, and thesauri specific to various scientific disciplines usually grouped as Environmental Sciences, no single generally recognized authority can be used as a guide to the terminology of all environmental science. If biomedical terminology for the description of research on environmental effects is also needed, the problem becomes even more complex. The building of a word list which can be used as a general guide to the environmental/biomedical sciences has been a continuing activity of the Information Center Complex. This activity resulted in the publication of the Environmental Biomedical Terminology Index (EBTI).

  6. Environmental/Biomedical Terminology Index

    International Nuclear Information System (INIS)

    Huffstetler, J.K.; Dailey, N.S.; Rickert, L.W.; Chilton, B.D.

    1976-12-01

    The Information Center Complex (ICC), a centrally administered group of information centers, provides information support to environmental and biomedical research groups and others within and outside Oak Ridge National Laboratory. In-house data base building and development of specialized document collections are important elements of the ongoing activities of these centers. ICC groups must be concerned with language which will adequately classify and insure retrievability of document records. Language control problems are compounded when the complexity of modern scientific problem solving demands an interdisciplinary approach. Although there are several word lists, indexes, and thesauri specific to various scientific disciplines usually grouped as Environmental Sciences, no single generally recognized authority can be used as a guide to the terminology of all environmental science. If biomedical terminology for the description of research on environmental effects is also needed, the problem becomes even more complex. The building of a word list which can be used as a general guide to the environmental/biomedical sciences has been a continuing activity of the Information Center Complex. This activity resulted in the publication of the Environmental Biomedical Terminology Index

  7. Characterization of the porous structures of the green body and sintered biomedical titanium scaffolds with micro-computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Arifvianto, B., E-mail: b.arifvianto@tudelft.nl; Leeflang, M.A.; Zhou, J.

    2016-11-15

    The present research was aimed at gaining an understanding of the changes in porous structure from the green body, through water leaching and sintering, to the titanium scaffold. Micro-computed tomography (micro-CT) was performed to generate 3D models of titanium scaffold preforms containing carbamide space-holding particles and of sintered scaffolds containing macro- and micro-pores. The porosity values and structural parameters were determined by means of image analysis. The results showed that the porosity values, macro-pore sizes, connectivity densities and specific surface areas of the titanium scaffolds sintered at 1200 °C for 3 h did not significantly deviate from those of the green structures with various volume fractions of the space holder. Titanium scaffolds with a maximum specific surface area could be produced with an addition of 60–65 vol% carbamide particles to the matrix powder. The connectivity of pores inside the scaffold increased with rising volume fraction of the space holder. The shrinkage of the scaffolds prepared with > 50 vol% carbamide space holder, occurring during sintering, was caused by the reductions of macro-pore sizes and micro-pore sizes as well as the thickness of struts. In conclusion, the final porous structural characteristics of titanium scaffolds could be estimated from those of the green body. - Highlights: •Porous structures of the green body and sintered titanium scaffolds were studied. •Porous structures of both samples were quantitatively characterized with micro-CT. •Porous structures of scaffolds could be controlled from the green body. •Shrinkage mechanisms of titanium scaffolds during sintering were established.
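
    At their simplest, the structural parameters reported above reduce to voxel counting on the segmented micro-CT volume. The sketch below shows that idea for porosity and for a crude face-counting estimate of specific surface area; it is an illustrative reconstruction on a synthetic array, not the authors' image-analysis pipeline.

    ```python
    # Voxel-counting sketch on a segmented volume (True = pore, False = metal).
    # Illustrative only; not the authors' micro-CT analysis pipeline.
    import numpy as np

    def porosity(volume):
        return volume.mean()  # fraction of pore voxels

    def specific_surface_area(volume, voxel_size):
        # Count pore/solid voxel-face transitions along each axis as a crude
        # surface estimate, then normalize by the solid volume.
        faces = sum(np.count_nonzero(np.diff(volume.astype(np.int8), axis=ax))
                    for ax in range(3))
        surface = faces * voxel_size ** 2
        solid_volume = np.count_nonzero(~volume) * voxel_size ** 3
        return surface / solid_volume  # e.g., mm^2 per mm^3

    rng = np.random.default_rng(1)
    vol = rng.random((64, 64, 64)) < 0.6   # synthetic volume, ~60% porosity
    print(porosity(vol), specific_surface_area(vol, voxel_size=0.01))
    ```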

  8. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

    NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the Langley Science Directorate needs to be evaluated by integrating it with real-world operational needs across NASA and assessing the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications have been demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization and project level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  9. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

    Full Text Available Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses (21% response rate) to this survey. Residents/fellows were more likely than medical students to use tablet computers several times daily (32% vs. 20%, p = 0.035). The most commonly reported uses were accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point-of-care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on residents.
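
    A sketch of the kind of two-proportion comparison behind figures such as "32% vs. 20%, p = 0.035". The counts are reconstructed approximately from the reported percentages and sample sizes, and the study may have used a different test, so treat the output as illustrative only.

```python
# Two-proportion z-test on approximate counts from the survey.
from statsmodels.stats.proportion import proportions_ztest

residents_daily, n_residents = 21, 66  # ~32% of 66 residents/fellows
students_daily, n_students = 15, 76    # ~20% of 76 medical students
stat, p = proportions_ztest([residents_daily, students_daily],
                            [n_residents, n_students])
print(f"z = {stat:.2f}, p = {p:.3f}")
```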

  10. Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.

    Science.gov (United States)

    Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L

    2016-10-01

    The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. Implications for Rehabilitation Adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research. Patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.

  11. Computer Vision Syndrome among Call Center Employees at Telecommunication Company in Bandung

    Directory of Open Access Journals (Sweden)

    Ghea Nursyifa

    2016-06-01

    Full Text Available Background: The occurrence of Computer Vision Syndrome (CVS) at the workplace has increased over recent decades due to the prolonged use of computers. Knowledge of CVS is necessary in order to develop an awareness of how to prevent and alleviate it. The objective of this study was to assess the knowledge of CVS among call center employees and to explore the CVS symptom most frequently experienced by the workers. Methods: A descriptive cross-sectional study was conducted during the period of September to November 2014 at a telecommunication company in Bandung using a questionnaire consisting of 30 questions. Out of the 30 questions/statements, 15 statements were about knowledge of CVS and the other 15 questions were about the occurrence of CVS and its symptoms. In this study 125 call center employees participated as respondents, selected by consecutive sampling. The level of knowledge was divided into 3 categories: good (76–100%), fair (56–75%) and poor (<56%). The collected data were presented in frequency tabulations. Results: Overall, 74.4% of the respondents had poor knowledge of CVS. The symptom most frequently experienced by the respondents was asthenopia. Conclusions: CVS occurs in call center employees with various symptoms and signs. This situation is not supported by good knowledge of the syndrome, which can hamper prevention programs.
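
    The scoring rubric above maps the share of the 15 knowledge statements answered correctly onto three bands; a minimal sketch, assuming one point per correct statement:

```python
# Classify a knowledge score into the good/fair/poor bands described above.
def knowledge_level(correct: int, total: int = 15) -> str:
    pct = 100 * correct / total
    if pct >= 76:
        return "good"
    if pct >= 56:
        return "fair"
    return "poor"

print(knowledge_level(8))  # 8/15 = 53% -> 'poor'
```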

  12. Biomedical technology

    CERN Document Server

    Wriggers, Peter

    2015-01-01

    In recent years, computational methods have led to new approaches that can be applied within medical practice. Based on the tremendous advances in medical imaging and high-performance computing, virtual testing is able to help in medical decision processes or implant designs. Current challenges in medicine and engineering are related to the application of computational methods to clinical medicine and the study of biological systems at different scales. Additionally, manufacturers will be able to use computational tools and methods to predict the performance of their medical devices in virtual patients. Physical and animal testing procedures could be reduced by virtual prototyping of medical devices, where simulations can enhance the performance of alternative device designs for a range of virtual patients. This will lead to a refinement of designs and to safer products. This book summarizes different aspects of approaches to enhance function, production, initialization and complications of different types o...

  13. Modeling Remote I/O versus Staging Tradeoff in Multi-Data Center Computing

    International Nuclear Information System (INIS)

    Suslu, Ibrahim H

    2014-01-01

    In multi-data center computing, data to be processed is not always local to the computation. This is a major challenge especially for data-intensive Cloud computing applications, since large amounts of data would need to be either moved to the local sites (staging) or accessed remotely over the network (remote I/O). Cloud application developers generally choose between staging and remote I/O intuitively, without making any systematic comparison specific to their application's data access patterns, since there is no generic model available that they can use. In this paper, we propose a generic model for Cloud application developers which helps them choose the most appropriate data access mechanism for their specific application workloads. We define the parameters that potentially affect the end-to-end performance of multi-data center Cloud applications which need to access large datasets over the network. To test and validate our models, we implemented a series of synthetic benchmark applications to simulate the most common data access patterns encountered in Cloud applications. We show that our model provides promising results in different settings with different parameters, such as network bandwidth, server and client capabilities, and data access ratio.
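
    A toy version of the staging-versus-remote-I/O tradeoff described above. The linear cost model and all parameter values are assumptions for illustration; the paper's actual model is richer.

```python
# Estimate end-to-end time for the two data access mechanisms.
def staging_time(dataset_gb, bandwidth_gbps, compute_s):
    transfer_s = dataset_gb * 8 / bandwidth_gbps  # move the whole dataset first
    return transfer_s + compute_s

def remote_io_time(dataset_gb, bandwidth_gbps, compute_s,
                   access_ratio, per_request_latency_s, n_requests):
    # Only the accessed fraction crosses the network, but every request pays latency.
    transfer_s = dataset_gb * access_ratio * 8 / bandwidth_gbps
    return transfer_s + n_requests * per_request_latency_s + compute_s

args = dict(dataset_gb=500, bandwidth_gbps=1.0, compute_s=3600)
print("staging  :", staging_time(**args))
print("remote IO:", remote_io_time(access_ratio=0.2, per_request_latency_s=0.05,
                                   n_requests=10_000, **args))
```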

  14. Computed tomography-guided core-needle biopsy of lung lesions: an oncology center experience

    Energy Technology Data Exchange (ETDEWEB)

    Guimaraes, Marcos Duarte; Fonte, Alexandre Calabria da; Chojniak, Rubens, E-mail: marcosduarte@yahoo.com.b [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Radiology and Imaging Diagnosis; Andrade, Marcony Queiroz de [Hospital Alianca, Salvador, BA (Brazil); Gross, Jefferson Luiz [Hospital A.C. Camargo, Sao Paulo, SP (Brazil). Dept. of Chest Surgery

    2011-03-15

    Objective: The present study is aimed at describing the experience of an oncology center with computed tomography-guided core-needle biopsy of pulmonary lesions. Materials and Methods: Retrospective analysis of 97 computed tomography-guided core-needle biopsies of pulmonary lesions performed between 1996 and 2004 in a Brazilian reference oncology center (Hospital do Cancer - A.C. Camargo). Information regarding material appropriateness and the specific diagnoses was collected and analyzed. Results: Among the 97 lung biopsies, 94 (96.9%) supplied appropriate specimens for histological analysis, with 71 (73.2%) cases diagnosed as malignant lesions and 23 (23.7%) diagnosed as benign lesions. Specimens were inappropriate for analysis in three cases. A specific diagnosis was obtained in 83 (85.6%) cases, with high rates for both malignant lesions (63 cases; 88.7%) and benign lesions (20 cases; 86.7%). As regards complications, a total of 12 cases were observed: 7 (7.2%) cases of hematoma, 3 (3.1%) cases of pneumothorax and 2 (2.1%) cases of hemoptysis. Conclusion: Computed tomography-guided core-needle biopsy of lung lesions demonstrated high rates of material appropriateness and diagnostic specificity, and low rates of complications in the present study. (author)

  15. Biomedical engineering and nanotechnology

    International Nuclear Information System (INIS)

    Pawar, S.H.; Khyalappa, R.J.; Yakhmi, J.V.

    2009-01-01

    This book is predominantly a compilation of papers presented at the conference, which focused on developments in biomedical materials, biomedical devices and instrumentation, biomedical effects of electromagnetic radiation, electrotherapy, radiotherapy, biosensors, biotechnology, bioengineering, tissue engineering, clinical engineering and surgical planning, medical imaging, hospital system management, biomedical education, biomedical industry and society, bioinformatics, structured nanomaterials for biomedical applications, nano-composites, nano-medicine, synthesis of nanomaterials, and nano science and technology development. The papers presented herein contain scientific substance relevant to researchers in the fields of biomedicine, biomedical engineering, materials science and nanotechnology. Papers relevant to INIS are indexed separately

  16. Threat and vulnerability analysis and conceptual design of countermeasures for a computer center under construction

    International Nuclear Information System (INIS)

    Rozen, A.; Musacchio, J.M.

    1988-01-01

    This project involved the assessment of a new computer center to be used as the main national data processing facility of a large European bank. This building serves as the principal facility in the country, with all other branches utilizing the data processing center. As such, the building is a crucial target which may attract terrorist attacks. Threat and vulnerability assessments were performed as a basis to define an overall, fully integrated security system of passive and active countermeasures for the facility. After separately assessing the range of threats and vulnerabilities, a combined matrix of threats and vulnerabilities was used to identify the crucial combinations. A set of architectural-structural passive measures was added to the active components of the security system

  17. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered obsolete and were removed from the collection in the audit process, and their CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art.

  18. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    Science.gov (United States)

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  19. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered obsolete and were removed from the collection in the audit process, and their CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state-of-the-art

  20. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  1. The Radiation Safety Information Computational Center (RSICC): A Resource for Nuclear Science Applications

    International Nuclear Information System (INIS)

    Kirk, Bernadette Lugue

    2009-01-01

    The Radiation Safety Information Computational Center (RSICC) has been in existence since 1963. RSICC collects, organizes, evaluates and disseminates technical information (software and nuclear data) involving the transport of neutral and charged particle radiation, and shielding and protection from the radiation associated with: nuclear weapons and materials, fission and fusion reactors, outer space, accelerators, medical facilities, and nuclear waste management. RSICC serves over 12,000 scientists and engineers from about 100 countries. An important activity of RSICC is its participation in international efforts on computational and experimental benchmarks. An example is the Shielding Integral Benchmarks Archival Database (SINBAD), which includes shielding benchmarks for fission, fusion and accelerators. RSICC is funded by the United States Department of Energy, Department of Homeland Security and Nuclear Regulatory Commission.

  2. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  3. The Erasmus Computing Grid - Building a Super-Computer Virtually for Free at the Erasmus Medical Center and the Hogeschool Rotterdam

    NARCIS (Netherlands)

    T.A. Knoch (Tobias); L.V. de Zeeuw (Luc)

    2006-01-01

    The Set-Up of the 20 Teraflop Erasmus Computing Grid: To meet the enormous computational needs of life-science research as well as clinical diagnostics and treatment, the Hogeschool Rotterdam and the Erasmus Medical Center are currently setting up one of the largest desktop computing grids.

  4. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  5. Computer-aided dispatch--traffic management center field operational test final test plans : state of Utah

    Science.gov (United States)

    2004-01-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : state of Utah". This document defines the objective, approach, an...

  6. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 Center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly by the search for a more flexible licensing model and the desire to avoid vendor lock-in. We performed a technology-tracking exercise and, among many possible solutions, chose to evaluate Grid Engine as an alternative because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: many production software components (most notably accounting and monitoring) rely on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too great. We will provide guidelines to help gauge how difficult this operation can be and how long the change may take.

  7. An Audit on the Appropriateness of Coronary Computed Tomography Angiography Referrals in a Tertiary Cardiac Center.

    Science.gov (United States)

    Alderazi, Ahmed Ali; Lynch, Mary

    2017-01-01

    In response to growing concerns regarding the overuse of coronary computed tomography angiography (CCTA) in the clinical setting, multiple societies, including the American College of Cardiology Foundation, have jointly published revised criteria regarding the appropriate use of this imaging modality. However, previous research indicates significant discrepancies in the rate of adherence to these guidelines. To assess the appropriateness of CCTA referrals in a tertiary cardiac center in Bahrain, this retrospective clinical audit examined the records of patients referred for CCTA between April 1, 2015 and December 31, 2015 at the Mohammed bin Khalifa Cardiac Center. Using information from medical records, each case was audited against the guidelines to categorize it as appropriate, inappropriate, or uncertain. Of the 234 records examined, 176 (75.2%) were appropriate, 47 (20.1%) were uncertain, and 11 (4.7%) were inappropriate. In total, 74.4% of all referrals were to investigate coronary artery disease (CAD). The most common indication deemed appropriate was the detection of CAD in the setting of suspected ischemic equivalent in patients with an intermediate pretest probability of CAD (65.9%). Most referrals deemed inappropriate were requested to detect CAD in asymptomatic patients at low or intermediate risk of CAD (63.6%). This audit demonstrates a relatively low rate of inappropriate CCTA referrals, indicating the appropriate and efficient use of this resource at the Mohammed bin Khalifa Cardiac Center. Agreement on and reclassification of "uncertain" cases by guideline authorities would facilitate a deeper understanding of referral appropriateness.

  8. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the fears of embracing cloud computing, which stretch across dimensions such as leaders' fear of change and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  9. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
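
    A sketch of the statistical comparison reported above, one-way ANOVA followed by Tukey's test at α = 0.05, run on synthetic transportation values; the numbers are made up, not the study's measurements.

```python
# One-way ANOVA plus Tukey's HSD across the four instrumentation groups.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(1)
groups = {g: rng.normal(0.0, 0.07, 10) for g in ("HR", "LB", "PT", "TF")}

F, p = f_oneway(*groups.values())
print(f"ANOVA: F = {F:.2f}, p = {p:.3f}")

values = np.concatenate(list(groups.values()))
labels = np.repeat(list(groups.keys()), 10)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```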

  10. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Directory of Open Access Journals (Sweden)

    André PAGLIOSA

    2015-01-01

    Full Text Available Abstract: The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape.

  11. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    International Nuclear Information System (INIS)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio

    2015-01-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)

  12. Pain, Work-related Characteristics, and Psychosocial Factors among Computer Workers at a University Center.

    Science.gov (United States)

    Mainenti, Míriam Raquel Meira; Felicio, Lilian Ramiro; Rodrigues, Erika de Carvalho; Ribeiro da Silva, Dalila Terrinha; Vigário Dos Santos, Patrícia

    2014-04-01

    [Purpose] Complaint of pain is common in computer workers, encouraging the investigation of pain-related workplace factors. This study investigated the relationship among work-related characteristics, psychosocial factors, and pain among computer workers from a university center. [Subjects and Methods] Fifteen subjects (median age, 32.0 years; interquartile range, 26.8-34.5 years) underwent measurement of bioelectrical impedance; photogrammetry; workplace measurements; and pain complaint, quality of life, and motivation questionnaires. [Results] The low back was the most prevalent region of complaint (76.9%). The number of body regions for which subjects complained of pain was greater in the group without rest breaks, which also presented higher prevalences of neck (62.5%) and low back (100%) pain. Associations were also observed between neck complaint and quality of life; neck complaint and head protrusion; wrist complaint and shoulder angle; and use of a chair back and thoracic pain. [Conclusion] Complaint of pain was associated with no short rest breaks, no use of a chair back, poor quality of life, high head protrusion, and large shoulder angle while using the computer mouse.

  13. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R² = 0.99; P < 0.001) across computation methods. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
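
    A sketch of the agreement analysis described above, linear regression plus Bland-Altman bias and limits of agreement between the two programs' CPEI values; the paired values are synthetic stand-ins, not the study's data.

```python
# Regression and Bland-Altman agreement between two CPEI computation methods.
import numpy as np

rng = np.random.default_rng(2)
cpei_original = rng.normal(20, 5, 93)               # original program
cpei_auto = cpei_original + rng.normal(0, 0.5, 93)  # automated algorithm

slope, intercept = np.polyfit(cpei_original, cpei_auto, 1)
r2 = np.corrcoef(cpei_original, cpei_auto)[0, 1] ** 2

diff = cpei_auto - cpei_original
bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)    # bias and limits of agreement
print(f"slope = {slope:.3f}, R^2 = {r2:.3f}, bias = {bias:.2f} +/- {loa:.2f}")
```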

  14. New developments in delivering public access to data from the National Center for Computational Toxicology at the EPA

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology integrate advances in biology, chemistry, and computer science to examine the toxicity of chemicals and help prioritize chemicals for further research based on potential human health risks. The goal of this researc...

  15. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), has been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called the Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviews the achievements obtained so far in the R and D of grid computing technology. (T. Tanaka)

  16. Vanderbilt University Institute of Imaging Science Center for Computational Imaging XNAT: A multimodal data archive and processing environment.

    Science.gov (United States)

    Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A

    2016-01-01

    The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database, built on XNAT, housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining portable batch system (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
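
    A generic sketch of the middleware pattern described above: walk the sessions in an image archive and submit one batch job per session to a PBS-style grid. The directory layout, the process_scan command and the qsub script are hypothetical illustrations, not the DAX API.

```python
# Submit one PBS job per scan found in a (hypothetical) archive layout.
import subprocess
from pathlib import Path

SCRIPT = """#!/bin/bash
#PBS -N {name}
#PBS -l walltime=02:00:00
process_scan --input {scan} --output {out}
"""

def submit_all(archive: Path, out_dir: Path) -> None:
    for scan in sorted(archive.glob("*/scan.nii.gz")):
        name = scan.parent.name
        script = out_dir / f"{name}.pbs"
        script.write_text(SCRIPT.format(name=name, scan=scan, out=out_dir / name))
        subprocess.run(["qsub", str(script)], check=True)  # enqueue on the grid

# Example (requires a PBS system): submit_all(Path("/data/xnat_archive"), Path("/scratch/jobs"))
```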

  17. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  18. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  19. Signal and image analysis for biomedical and life sciences

    CERN Document Server

    Sun, Changming; Pham, Tuan D; Vallotton, Pascal; Wang, Dadong

    2014-01-01

    With an emphasis on applications of computational models for solving modern challenging problems in biomedical and life sciences, this book aims to bring collections of articles from biologists, medical/biomedical and health science researchers together with computational scientists to focus on problems at the frontier of biomedical and life sciences. The goals of this book are to build interactions of scientists across several disciplines and to help industrial users apply advanced computational techniques for solving practical biomedical and life science problems. This book is for users in t

  20. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  1. A hypothesis on the formation of the primary ossification centers in the membranous neurocranium: a mathematical and computational model.

    Science.gov (United States)

    Garzón-Alvarado, Diego A

    2013-01-21

    This article develops a model of the appearance and location of the primary centers of ossification in the calvaria. The model uses a system of reaction-diffusion equations for two molecules (BMP and Noggin) whose behavior is of the activator-substrate type and whose solution produces Turing patterns, which represent the primary ossification centers. Additionally, the model includes the level of cell maturation as a function of the location of mesenchymal cells, so that mature cells can become osteoblasts through the action of BMP2. With this model, one can obtain two frontal primary centers, two parietal centers, and one, two or more occipital centers. The locations of these centers in the simplified computational model are highly consistent with those found at the embryonic level. Copyright © 2012 Elsevier Ltd. All rights reserved.
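
    A minimal activator-substrate reaction-diffusion sketch of the kind the model builds on, here the classic Gray-Scott system rather than the paper's BMP/Noggin equations; all parameters are standard illustrative values, not fitted to calvarial development.

```python
# Gray-Scott activator-substrate system producing Turing-like spot patterns.
import numpy as np

n, steps = 128, 10000
Du, Dv, f, k, dt = 0.16, 0.08, 0.035, 0.065, 1.0
u = np.ones((n, n))    # substrate
v = np.zeros((n, n))   # activator
u[54:74, 54:74], v[54:74, 54:74] = 0.50, 0.25  # seed a central perturbation

def lap(z):
    # 5-point Laplacian with periodic boundaries.
    return (np.roll(z, 1, 0) + np.roll(z, -1, 0) +
            np.roll(z, 1, 1) + np.roll(z, -1, 1) - 4 * z)

for _ in range(steps):
    uvv = u * v * v  # activator autocatalysis consuming substrate
    u += dt * (Du * lap(u) - uvv + f * (1 - u))
    v += dt * (Dv * lap(v) + uvv - (f + k) * v)

# High-activator spots play the role of presumptive ossification centers.
print("spot fraction:", round(float((v > 0.2).mean()), 3))
```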

  2. Figure mining for biomedical research.

    Science.gov (United States)

    Rodriguez-Esteban, Raul; Iossifov, Ivan

    2009-08-15

    Figures from biomedical articles contain valuable information difficult to reach without specialized tools. Currently, there is no search engine that can retrieve specific figure types. This study describes a retrieval method that takes advantage of principles in image understanding, text mining and optical character recognition (OCR) to retrieve figure types defined conceptually. A search engine was developed to retrieve tables and figure types to aid computational and experimental research. http://iossifovlab.cshl.edu/figurome/.

  3. Optimizing biomedical science learning in a veterinary curriculum: a review.

    Science.gov (United States)

    Warren, Amy L; Donnon, Tyrone

    2013-01-01

    As veterinary medical curricula evolve, the time dedicated to biomedical science teaching, as well as the role of biomedical science knowledge in veterinary education, has been scrutinized. Aside from being mandated by accrediting bodies, biomedical science knowledge plays an important role in developing clinical, diagnostic, and therapeutic reasoning skills in the application of clinical skills, in supporting evidence-based veterinary practice and life-long learning, and in advancing biomedical knowledge and comparative medicine. With an increasing volume and fast pace of change in biomedical knowledge, as well as increased demands on curricular time, there has been pressure to make biomedical science education efficient and relevant for veterinary medicine. This has led to a shift in biomedical education from fact-based, teacher-centered and discipline-based teaching to applicable, student-centered, integrated teaching. This movement is supported by adult learning theories and is thought to enhance students' transference of biomedical science into their clinical practice. The importance of biomedical science in veterinary education and the theories of biomedical science learning are discussed in this article. In addition, we explore current advances in biomedical teaching methodologies that aim to maximize knowledge retention and application for clinical veterinary training and practice.

  4. Building and evaluating an informatics tool to facilitate analysis of a biomedical literature search service in an academic medical center library.

    Science.gov (United States)

    Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha

    2013-01-01

    This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.

  5. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    Full Text Available This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely
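
    A compact sketch of the two classification ingredients named above, common spatial patterns (CSP) followed by Fisher's linear discriminant analysis, reduced to a binary toy problem on synthetic "EEG" epochs; the data, channel count and filter choices are placeholders, not the study's pipeline.

```python
# CSP filters via a generalized eigenproblem, then LDA on log-variance features.
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def csp_filters(X1, X2, n_pairs=2):
    def avg_cov(X):  # normalized spatial covariance averaged over trials
        return np.mean([t @ t.T / np.trace(t @ t.T) for t in X], axis=0)
    C1, C2 = avg_cov(X1), avg_cov(X2)
    vals, vecs = eigh(C1, C1 + C2)  # generalized eigendecomposition
    order = np.argsort(vals)
    picks = np.r_[order[:n_pairs], order[-n_pairs:]]
    return vecs[:, picks].T         # (2 * n_pairs, channels)

def features(W, X):
    return np.array([np.log(np.var(W @ t, axis=1)) for t in X])

rng = np.random.default_rng(4)
X1 = rng.normal(size=(40, 16, 250))        # class 1: trials x channels x samples
X2 = 1.2 * rng.normal(size=(40, 16, 250))  # class 2 with different variance

W = csp_filters(X1, X2)
X = np.vstack([features(W, X1), features(W, X2)])
y = np.r_[np.zeros(40), np.ones(40)]
print("training accuracy:", LinearDiscriminantAnalysis().fit(X, y).score(X, y))
```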

  6. Rethinking Human-Centered Computing: Finding the Customer and Negotiated Interactions at the Airport

    Science.gov (United States)

    Wales, Roxana; O'Neill, John; Mirmalek, Zara

    2003-01-01

    The breakdown in the air transportation system over the past several years raises an interesting question for researchers: How can we help improve the reliability of airline operations? In offering some answers to this question, we make a statement about Human-Centered Computing (HCC). First we offer the definition that HCC is a multi-disciplinary research and design methodology focused on supporting humans as they use technology, by including cognitive and social systems, computational tools and the physical environment in the analysis of organizational systems. We suggest that a key element in understanding organizational systems is that there are external cognitive and social systems (customers) as well as internal cognitive and social systems (employees) and that they interact dynamically to impact the organization and its work. The design of human-centered intelligent systems must take this outside-inside dynamic into account. In the past, the design of intelligent systems has focused on supporting the work and improvisation requirements of employees but has often assumed that customer requirements are implicitly satisfied by employee requirements. Taking a customer-centric perspective provides a different lens for understanding this outside-inside dynamic, the work of the organization and the requirements of both customers and employees. In this article we will: 1) Demonstrate how the use of ethnographic methods revealed the important outside-inside dynamic in an airline, specifically the consequential relationship between external customer requirements and perspectives and internal organizational processes and perspectives as they came together in a changing environment; 2) Describe how taking a customer-centric perspective identifies places where the impact of the outside-inside dynamic is most critical and requires technology that can be adaptive; 3) Define and discuss the place of negotiated interactions in airline operations, identifying how these

  7. Energy-Efficient Management of Data Center Resources for Cloud Computing: A Vision, Architectural Elements, and Open Challenges

    OpenAIRE

    Buyya, Rajkumar; Beloglazov, Anton; Abawajy, Jemal

    2010-01-01

    Cloud computing is offering utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables hosting of pervasive applications from consumer, scientific, and business domains. However, data centers hosting Cloud applications consume huge amounts of energy, contributing to high operational costs and carbon footprints to the environment. Therefore, we need Green Cloud computing solutions that can not only save energy for the environment but also reduce operational cos...

  8. Networked Biomedical System for Ubiquitous Health Monitoring

    Directory of Open Access Journals (Sweden)

    Arjan Durresi

    2008-01-01

    Full Text Available We propose a distributed system that enables global and ubiquitous health monitoring of patients. The biomedical data will be collected by wearable health diagnostic devices, which will include various types of sensors, and will be transmitted to the corresponding Health Monitoring Centers. The permanent medical data of patients will be kept in the corresponding Home Databases, while the measured biomedical data will be sent to the Visitor Health Monitor Center and Visitor Database that serve the area of the patient's present location. By combining the measured biomedical data and the permanent medical data, Health Medical Centers will be able to coordinate the needed actions and help the local medical teams to quickly make the best decisions, which could be crucial for the patient's health and could reduce the cost of the health service.

  9. Astigmatic single photon emission computed tomography imaging with a displaced center of rotation

    International Nuclear Information System (INIS)

    Wang, H.; Smith, M.F.; Stone, C.D.; Jaszczak, R.J.

    1998-01-01

    A filtered backprojection algorithm is developed for single photon emission computed tomography (SPECT) imaging with an astigmatic collimator having a displaced center of rotation. The astigmatic collimator has two perpendicular focal lines, one that is parallel to the axis of rotation of the gamma camera and one that is perpendicular to this axis. Using SPECT simulations of projection data from a hot rod phantom and point source arrays, it is found that a lack of incorporation of the mechanical shift in the reconstruction algorithm causes errors and artifacts in reconstructed SPECT images. The collimator and acquisition parameters in the astigmatic reconstruction formula, which include focal lengths, radius of rotation, and mechanical shifts, are often partly unknown and can be determined using the projections of a point source at various projection angles. The accurate determination of these parameters by a least squares fitting technique using projection data from numerically simulated SPECT acquisitions is studied. These studies show that the accuracy of parameter determination is improved as the distance between the point source and the axis of rotation of the gamma camera is increased. The focal length to the focal line perpendicular to the axis of rotation is determined more accurately than the focal length to the focal line parallel to this axis. copyright 1998 American Association of Physicists in Medicine
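
    A sketch of the parameter-determination step described above: the transaxial position of a point source traces a sinusoid in projection angle, offset by the mechanical shift of the center of rotation, so the geometry can be recovered by least squares. The parallel-beam simplification and all values are illustrative, not the paper's astigmatic geometry.

```python
# Fit source radius, phase, and center-of-rotation shift from point-source projections.
import numpy as np
from scipy.optimize import curve_fit

def detector_pos(theta, r, phi, shift):
    return r * np.sin(theta + phi) + shift  # sinusoid offset by mechanical shift

theta = np.linspace(0, 2 * np.pi, 120, endpoint=False)
true = dict(r=48.0, phi=0.7, shift=3.2)  # mm; a larger r improves the fit, as noted above
rng = np.random.default_rng(5)
measured = detector_pos(theta, **true) + rng.normal(0, 0.3, theta.size)

popt, _ = curve_fit(detector_pos, theta, measured, p0=(40.0, 0.0, 0.0))
print("fitted r, phi, shift:", np.round(popt, 2))
```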

  10. Establishment of computed tomography reference dose levels in Onassis Cardiac Surgery Center

    International Nuclear Information System (INIS)

    Tsapaki, V.; Kyrozi, E.; Syrigou, T.; Mastorakou, I.; Kottou, S.

    2001-01-01

    The purpose of the study was to apply European Commission (EC) Reference Dose Levels (RDL) to Computed Tomography (CT) examinations at the Onassis Cardiac Surgery Center (OCSC). These are the weighted CT Dose Index (CTDIw) for a single slice and the Dose-Length Product (DLP) for a complete examination. During the period 1998-1999, the total number of CT examinations, every type of CT examination, patient-related data and the technical parameters of the examinations were recorded. The most frequent examinations, namely head, chest, abdomen and pelvis, were chosen for investigation. CTDI measurements were performed and CTDIw and DLP were calculated. Third-quartile values of CTDIw were 43 mGy for head, 8 mGy for chest, and 22 mGy for abdomen and pelvis examinations. Third-quartile values of DLP were 740 mGy cm for head, 370 mGy cm for chest, 490 mGy cm for abdomen and 420 mGy cm for pelvis examinations. The results confirm that OCSC successfully follows the proposed RDL for head, chest, abdomen and pelvis examinations in terms of radiation dose. (author)
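
    A minimal sketch of how such reference levels are derived: the third quartile of the per-examination dose distribution from a survey. The values below are synthetic, not the OCSC data.

```python
# Third-quartile reference dose level from a synthetic DLP survey.
import numpy as np

rng = np.random.default_rng(6)
dlp_head = rng.normal(650, 80, 120)  # mGy cm, one entry per head examination
rdl = np.percentile(dlp_head, 75)    # third quartile
print(f"head DLP reference level: {rdl:.0f} mGy cm")
```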

  11. Biomedical engineering principles

    CERN Document Server

    Ritter, Arthur B; Valdevit, Antonio; Ascione, Alfred N

    2011-01-01

    Introduction: Modeling of Physiological Processes; Cell Physiology and Transport; Principles and Biomedical Applications of Hemodynamics; A Systems Approach to Physiology; The Cardiovascular System; Biomedical Signal Processing; Signal Acquisition and Processing; Techniques for Physiological Signal Processing; Examples of Physiological Signal Processing; Principles of Biomechanics; Practical Applications of Biomechanics; Biomaterials; Principles of Biomedical Capstone Design; Unmet Clinical Needs; Entrepreneurship: Reasons why Most Good Designs Never Get to Market; An Engineering Solution in Search of a Biomedical Problem

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October, a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. VI Latin American Congress on Biomedical Engineering

    CERN Document Server

    Hadad, Alejandro

    2015-01-01

    This volume presents the proceedings of CLAIB 2014, held in Paraná, Entre Ríos, Argentina, 29-31 October 2014. The proceedings, presented by the Regional Council of Biomedical Engineering for Latin America (CORAL), offer research findings, experiences and collaborative activities between institutions and universities developing Bioengineering, Biomedical Engineering and related sciences. The conferences of the Latin American Congress of Biomedical Engineering are sponsored by the International Federation for Medical and Biological Engineering (IFMBE), the Society for Engineering in Biology and Medicine (EMBS) and the Pan American Health Organization (PAHO), among other organizations and international agencies, and bring together scientists, academics and biomedical engineers in Latin America and other continents in an environment conducive to exchange and professional growth. The Topics include: - Bioinformatics and Computational Biology - Bioinstrumentation; Sensors, Micro and Nano Technologies - Biomaterials, Tissu...

  14. Annual report of R and D activities in Center for Promotion of Computational Science and Engineering and Center for Computational Science and e-Systems from April 1, 2005 to March 31, 2006

    International Nuclear Information System (INIS)

    2007-03-01

    This report provides an overview of research and development activities in the Center for Computational Science and Engineering (CCSE), JAERI, in the former half of the fiscal year 2005 (April 1, 2005 - Sep. 30, 2005) and those in the Center for Computational Science and e-Systems (CCSE), JAEA, in the latter half of the fiscal year 2005 (Oct. 1, 2005 - March 31, 2006). In the former half term, the activities were performed by 5 research groups: the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group in CCSE. At the beginning of the latter half term, these 5 groups were integrated into two offices, the Simulation Technology Research and Development Office and the Computer Science Research and Development Office, at the moment of the unification of JNC (Japan Nuclear Cycle Development Institute) and JAERI (Japan Atomic Energy Research Institute), and the latter-half activities were operated by the two offices. A big project, the ITBL (Information Technology Based Laboratory) project, and fundamental computational research for atomic energy plants were performed mainly by two groups, the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy, in the former half term and by their integrated office, the Computer Science Research and Development Office, in the latter half, respectively. The main result was the verification of structural analysis for a real plant executable on the Grid environment, which received an Honorable Mention in the Analytic Challenge at the Supercomputing 2005 (SC05) conference. The materials science and bioinformatics in the atomic energy research field were carried out by three groups, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics

  15. Biomedical applications engineering tasks

    Science.gov (United States)

    Laenger, C. J., Sr.

    1976-01-01

    The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.

  16. Mediator infrastructure for information integration and semantic data integration environment for biomedical research.

    Science.gov (United States)

    Grethe, Jeffrey S; Ross, Edward; Little, David; Sanders, Brian; Gupta, Amarnath; Astakhov, Vadim

    2009-01-01

    This paper presents current progress in the development of the semantic data integration environment, which is part of the Biomedical Informatics Research Network (BIRN; http://www.nbirn.net) project. BIRN is sponsored by the National Center for Research Resources (NCRR), a component of the National Institutes of Health (NIH). A goal is the development of a cyberinfrastructure for biomedical research that supports advanced data acquisition, data storage, data management, data integration, data mining, data visualization, and other computing and information processing services over the Internet. Each participating institution maintains storage of its experimental or computationally derived data. A mediator-based data integration system performs semantic integration over the databases to enable researchers to perform analyses based on larger and broader datasets than would be available from any single institution's data. This paper describes a recent revision of the system architecture, implementation, and capabilities of the semantically based data integration environment for BIRN.
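    The mediator pattern described here can be sketched in a few lines: a shared query is fanned out to per-site adapters, each of which maps its local schema onto common terms before the results are merged. Everything below (names, fields, the toy sites) is hypothetical and only illustrates the pattern, not BIRN's actual implementation.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class SourceAdapter:
    """Hypothetical wrapper around one institution's database."""
    name: str
    query_fn: Callable[[str], List[dict]]   # site-local query
    to_shared: Callable[[dict], dict]       # local-to-shared schema mapping

def mediated_query(adapters: List[SourceAdapter], concept: str) -> List[dict]:
    """Fan the query out to every source and merge the results under the
    shared schema -- the core mediator operation."""
    merged = []
    for adapter in adapters:
        for record in adapter.query_fn(concept):
            merged.append({"source": adapter.name, **adapter.to_shared(record)})
    return merged

# Toy usage with two in-memory "sites" holding differently shaped records
site_a = SourceAdapter("site_a", lambda c: [{"subj": 1, "dx": c}],
                       lambda r: {"subject_id": r["subj"], "diagnosis": r["dx"]})
site_b = SourceAdapter("site_b", lambda c: [{"pid": 9, "code": c}],
                       lambda r: {"subject_id": r["pid"], "diagnosis": r["code"]})
print(mediated_query([site_a, site_b], "epilepsy"))
```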

  17. Biomedical signal and image processing.

    Science.gov (United States)

    Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro

    2011-01-01

    Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are more completely dealt with in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first one more oriented to physiological issues and how to model them and the second one more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. Just to give simple examples, topics such as brain-computer interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at the building of advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring and others do require an intelligent fusion of modeling and signal processing competences that are certainly peculiar to our discipline of BME.

  18. User-centered design in brain-computer interfaces-a case study.

    Science.gov (United States)

    Schreuder, Martijn; Riccio, Angela; Risetti, Monica; Dähne, Sven; Ramsay, Andrew; Williamson, John; Mattia, Donatella; Tangermann, Michael

    2013-10-01

    The array of available brain-computer interface (BCI) paradigms has continued to grow, and so has the corresponding set of machine learning methods which are at the core of BCI systems. The latter have evolved to provide more robust data analysis solutions, and as a consequence the proportion of healthy BCI users who can use a BCI successfully is growing. With this development the chances have increased that the needs and abilities of specific patients, the end-users, can be covered by an existing BCI approach. However, most end-users who have experienced the use of a BCI system at all have encountered a single paradigm only. This paradigm is typically the one that is being tested in the study that the end-user happens to be enrolled in, along with other end-users. Though this corresponds to the preferred study arrangement for basic research, it does not ensure that the end-user experiences a working BCI. In this study, a different approach was taken; that of a user-centered design. It is the prevailing process in traditional assistive technology. Given an individual user with a particular clinical profile, several available BCI approaches are tested and - if necessary - adapted to him/her until a suitable BCI system is found. Described is the case of a 48-year-old woman who suffered from an ischemic brain stem stroke, leading to a severe motor- and communication deficit. She was enrolled in studies with two different BCI systems before a suitable system was found. The first was an auditory event-related potential (ERP) paradigm and the second a visual ERP paradigm, both of which are established in literature. The auditory paradigm did not work successfully, despite favorable preconditions. The visual paradigm worked flawlessly, as found over several sessions. This discrepancy in performance can possibly be explained by the user's clinical deficit in several key neuropsychological indicators, such as attention and working memory. While the auditory paradigm relies

  19. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  20. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 Code has migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the CRAY-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration, and summarize some of the acceptance test results which have shown that the migrated code is functioning correctly in the new environment

  1. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, NIC/JSC already installed in May 2009 the first phase of the GCS HPC Tier-0 resources, an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe alread...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  3. Biomedical applications of polymers

    CERN Document Server

    Gebelein, C G

    1991-01-01

    The biomedical applications of polymers span an extremely wide spectrum of uses, including artificial organs, skin and soft tissue replacements, orthopaedic applications, dental applications, and controlled release of medications. No single, short review can possibly cover all these items in detail, and dozens of books and hundreds of reviews exist on biomedical polymers. Only a few relatively recent examples will be cited here; additional reviews are listed under most of the major topics in this book. We will consider each of the major classifications of biomedical polymers to some extent, inclu

  4. Handbook of biomedical optics

    CERN Document Server

    Boas, David A

    2011-01-01

    Biomedical optics holds tremendous promise to deliver effective, safe, non- or minimally invasive diagnostics and targeted, customizable therapeutics. Handbook of Biomedical Optics provides an in-depth treatment of the field, including coverage of applications for biomedical research, diagnosis, and therapy. It introduces the theory and fundamentals of each subject, ensuring accessibility to a wide multidisciplinary readership. It also offers a view of the state of the art and discusses advantages and disadvantages of various techniques. Organized into six sections, this handbook: Contains intr

  5. Biomedical Engineering Desk Reference

    CERN Document Server

    Ratner, Buddy D; Schoen, Frederick J; Lemons, Jack E; Dyro, Joseph; Martinsen, Orjan G; Kyle, Richard; Preim, Bernhard; Bartz, Dirk; Grimnes, Sverre; Vallero, Daniel; Semmlow, John; Murray, W Bosseau; Perez, Reinaldo; Bankman, Isaac; Dunn, Stanley; Ikada, Yoshito; Moghe, Prabhas V; Constantinides, Alkis

    2009-01-01

    A one-stop desk reference for biomedical engineers involved in this ever-expanding and very fast-moving area; this is a book that will not gather dust on the shelf. It brings together the essential professional reference content from leading international contributors in the biomedical engineering field. Material covers a broad range of topics including: Biomechanics and Biomaterials; Tissue Engineering; and Biosignal Processing. * A hard-working desk reference providing all the essential material needed by biomedical and clinical engineers on a day-to-day basis. * Fundamentals, key techniques,

  6. Powering biomedical devices

    CERN Document Server

    Romero, Edwar

    2013-01-01

    From exoskeletons to neural implants, biomedical devices are no less than life-changing. Compact and constant power sources are necessary to keep these devices running efficiently. Edwar Romero's Powering Biomedical Devices reviews the background, current technologies, and possible future developments of these power sources, examining not only the types of biomedical power sources available (macro, mini, MEMS, and nano), but also what they power (such as prostheses, insulin pumps, and muscular and neural stimulators), and how they work (covering batteries, biofluids, kinetic and ther

  7. The Internet and Computer User Profile: a questionnaire for determining intervention targets in occupational therapy at mental health vocational centers.

    Science.gov (United States)

    Regev, Sivan; Hadas-Lidor, Noami; Rosenberg, Limor

    2016-08-01

    In this study, the assessment tool "Internet and Computer User Profile" questionnaire (ICUP) is presented and validated. It was developed in order to gather information for setting intervention goals to meet current demands. Sixty-eight subjects aged 23-68 participated in the study. The study group (n = 28) was sampled from two vocational centers. The control group consisted of 40 participants from the general population, sampled by convenience based on the demographics of the study group. Subjects from both groups answered the ICUP questionnaire. Subjects of the study group answered the General Self-Efficacy (GSE) questionnaire and performed the Assessment of Computer Task Performance (ACTP) test in order to examine the convergent validity of the ICUP. Twenty subjects from both groups retook the ICUP questionnaire in order to obtain test-retest results. Differences between groups were tested using multiple analysis of variance (MANOVA) tests. Pearson and Spearman's tests were used for calculating correlations. Cronbach's alpha coefficient and the kappa equivalent were used to assess internal consistency. The results indicate that the questionnaire is valid and reliable. They emphasize that the layout of the ICUP items facilitates a comprehensive examination of the client's perception regarding his or her participation in computer and internet activities. Implications for Rehabilitation The assessment tool "Internet and Computer User Profile" (ICUP) questionnaire is a novel assessment tool that evaluates operative use and individual perception of computer activities. The questionnaire is valid and reliable for use with participants of vocational centers dealing with mental illness. It is essential to facilitate access to computers for people with mental illnesses, seeing that they express similar interest in computers and internet as people from the general population of the same age. Early intervention will be particularly effective for young

  8. Computer modeling with randomized-controlled trial data informs the development of person-centered aged care homes.

    Science.gov (United States)

    Chenoweth, Lynn; Vickland, Victor; Stein-Parbury, Jane; Jeon, Yun-Hee; Kenny, Patricia; Brodaty, Henry

    2015-10-01

    To answer questions on the essential components (services, operations and resources) of a person-centered aged care home (iHome) using computer simulation. iHome was developed with AnyLogic software using extant study data obtained from 60 Australian aged care homes, 900+ clients and 700+ aged care staff. Bayesian analysis of simulated trial data will determine the influence of different iHome characteristics on care service quality and client outcomes. Interim results: A person-centered aged care home (socio-cultural context) and care/lifestyle services (interactional environment) can produce positive outcomes for aged care clients (subjective experiences) in the simulated environment. Further testing will define essential characteristics of a person-centered care home.
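    As a toy analogue of the planned Bayesian analysis of simulated trial data, the sketch below simulates binary client outcomes under two home configurations and computes beta-binomial posteriors. The outcome probabilities and sample size are invented, and this is a generic illustration of the analysis style, not the iHome/AnyLogic model itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical simulated trial: positive-outcome probability per configuration
p_true = {"person_centered": 0.62, "conventional": 0.48}
n = 300  # simulated clients per arm

for arm, p in p_true.items():
    successes = rng.binomial(1, p, n).sum()
    # Conjugate beta-binomial update with a flat Beta(1, 1) prior
    posterior = rng.beta(1 + successes, 1 + n - successes, 10_000)
    lo, hi = np.quantile(posterior, [0.025, 0.975])
    print(f"{arm}: posterior mean {posterior.mean():.3f}, "
          f"95% CI [{lo:.3f}, {hi:.3f}]")
```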

  9. Advances in biomedical engineering and biotechnology during 2013-2014.

    Science.gov (United States)

    Liu, Feng; Wang, Ying; Burkhart, Timothy A; González Penedo, Manuel Francisco; Ma, Shaodong

    2014-01-01

    The 3rd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2014), held in Beijing from the 25th to the 28th of September 2014, is an annual conference that intends to provide an opportunity for researchers and practitioners around the world to present the most recent advances and future challenges in the fields of biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, amongst others. The papers published in this issue are selected from this conference, which witnesses the advances in biomedical engineering and biotechnology during 2013-2014.

  10. Spectrum of tablet computer use by medical students and residents at an academic medical center.

    Science.gov (United States)

    Robinson, Robert

    2015-01-01

    Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%). Both groups also used tablet computers to read e-Books and to study for board exams. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on resident physicians. Further study is needed to better understand how tablet computers and other mobile devices may assist in medical education and patient care.
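    The student-resident contrasts reported here are simple comparisons of proportions; a chi-square test on a 2x2 table reproduces this kind of analysis. The counts below are back-calculated from the reported percentages and response counts (41% of 66 residents vs. 21% of 76 students for EMR access), so treat them as approximate.

```python
from scipy.stats import chi2_contingency

# Approximate 2x2 table for EMR access, reconstructed from the abstract
#            used EMR   did not
table = [[27, 39],    # residents / fellows
         [16, 60]]    # medical students

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")   # p close to 0.010
```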

  11. NDE in biomedical engineering

    International Nuclear Information System (INIS)

    Bhagwat, Aditya; Kumar, Pradeep

    2015-01-01

    Biomedical Engineering (BME) is an interdisciplinary field, marking the conjunction of Medical and Engineering disciplines. It combines the design and problem solving skills of engineering with medical and biological sciences to advance health care treatment, including diagnosis, monitoring, and therapy

  12. Status of Research in Biomedical Engineering 1968.

    Science.gov (United States)

    National Inst. of General Medical Sciences (NIH), Bethesda, MD.

    This status report is divided into eight sections. The first four represent the classical engineering or building aspects of bioengineering and deal with biomedical instrumentation, prosthetics, man-machine systems and computer and information systems. The next three sections are related to the scientific, intellectual and academic influence of…

  13. Computer-Aided Diagnosis of Breast Cancer: A Multi-Center Demonstrator

    National Research Council Canada - National Science Library

    Floyd, Carey

    2000-01-01

    ... The focus has been to gather data from multiple sites in order to verify whether the artificial neural network computer aid to the diagnosis of breast cancer can be translated between locations...

  14. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    Science.gov (United States)

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last
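    A minimal numpy sketch of the attention-sum idea behind such cloze-style readers is shown below: each context token is scored against the query, and attention mass is pooled per candidate entity. This is a generic illustration of the technique, not the paper's architecture; the embeddings and entity ids are synthetic.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def cloze_answer(context_vecs, query_vec, token_entity_ids):
    """Attention-sum reading: dot-product attention over context tokens,
    then pooling of attention mass per candidate entity id."""
    attn = softmax(context_vecs @ query_vec)
    candidates = np.unique(token_entity_ids)
    scores = {int(c): float(attn[token_entity_ids == c].sum()) for c in candidates}
    return max(scores, key=scores.get), scores

# Tiny synthetic example: six context tokens in a 4-d embedding space
rng = np.random.default_rng(0)
context = rng.normal(size=(6, 4))
query = context[2] + 0.1 * rng.normal(size=4)   # query resembles token 2
entity_ids = np.array([7, 8, 9, 8, 7, 9])       # entity id of each token
print(cloze_answer(context, query, entity_ids))
```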

  15. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  16. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    International Nuclear Information System (INIS)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens; Universidade Federal do Espirito Santo

    2017-01-01

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  17. Camera systems in human motion analysis for biomedical applications

    Science.gov (United States)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

    Human Motion Analysis (HMA) systems have been one of the major interests among researchers in the fields of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to their wide and promising biomedical applications, namely bio-instrumentation for human-computer interfacing, surveillance systems for monitoring human behaviour, and biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA and its taxonomy, including camera types, camera calibration and camera configuration. The review focuses on evaluating the camera system considerations of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.

  18. An information technology emphasis in biomedical informatics education.

    Science.gov (United States)

    Kane, Michael D; Brewer, Jeffrey L

    2007-02-01

    Unprecedented growth in the interdisciplinary domain of biomedical informatics reflects the recent advancements in genomic sequence availability, high-content biotechnology screening systems, as well as the expectations of computational biology to command a leading role in drug discovery and disease characterization. These forces have moved much of life sciences research almost completely into the computational domain. Importantly, educational training in biomedical informatics has been limited to students enrolled in life sciences curricula, yet many of the skills needed to succeed in biomedical informatics involve or augment training in information technology curricula. This manuscript describes the methods and rationale for training students enrolled in information technology curricula in the field of biomedical informatics. The approach augments the existing information technology curriculum, provides training on specific subjects in biomedical informatics not emphasized in the bioinformatics courses offered in life science programs, and does not require prerequisite courses in the life sciences.

  19. Measurements and predictions of the air distribution systems in high compute density (Internet) data centers

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jinkyun [HIMEC (Hanil Mechanical Electrical Consultants) Ltd., Seoul 150-103 (Korea); Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea); Lim, Taesub; Kim, Byungseon Sean [Department of Architectural Engineering, Yonsei University, Seoul 120-749 (Korea)

    2009-10-15

    When equipment power density increases, a critical goal of a data center cooling system is to separate the equipment exhaust air from the equipment intake air in order to prevent the IT server from overheating. Cooling systems for data centers are primarily differentiated according to the way they distribute air. The six combinations of flooded and locally ducted air distribution make up the vast majority of all installations, except fully ducted air distribution methods. Once the air distribution system (ADS) is selected, there are other elements that must be integrated into the system design. In this research, the design parameters and IT environmental aspects of the cooling system were studied with a high heat density data center. CFD simulation analysis was carried out in order to compare the heat removal efficiencies of various air distribution systems. The IT environment of an actual operating data center is measured to validate a model for predicting the effect of different air distribution systems. A method for planning and design of the appropriate air distribution system is described. IT professionals versed in precision air distribution mechanisms, components, and configurations can work more effectively with mechanical engineers to ensure the specification and design of optimized cooling solutions. (author)
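    The sizing logic behind such air distribution studies reduces, at first order, to a sensible heat balance, Q = m_dot * cp * dT. The helper below computes the supply airflow a given IT load requires for an assumed air-side temperature rise; the air properties and the example load are illustrative values, not figures from the paper.

```python
def required_airflow_m3s(it_load_kw: float, delta_t_k: float,
                         rho: float = 1.2, cp: float = 1005.0) -> float:
    """Supply airflow (m^3/s) needed to remove a sensible IT heat load,
    from Q = m_dot * cp * dT with m_dot = rho * V_dot."""
    return it_load_kw * 1000.0 / (rho * cp * delta_t_k)

# Example: a 10 kW rack with a 12 K supply-to-return temperature rise
flow = required_airflow_m3s(10.0, 12.0)
print(f"{flow:.2f} m^3/s (~{flow * 2118.9:.0f} CFM) per rack")
```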

  20. Risk factors for computer visual syndrome (CVS) among operators of two call centers in São Paulo, Brazil.

    Science.gov (United States)

    Sa, Eduardo Costa; Ferreira Junior, Mario; Rocha, Lys Esther

    2012-01-01

    The aims of this study were to investigate work conditions, to estimate the prevalence, and to describe risk factors associated with Computer Vision Syndrome among operators of two call centers in São Paulo (n = 476). The methods included a quantitative cross-sectional observational study and an ergonomic work analysis, using work observation, interviews and questionnaires. The case definition was the presence of one or more specific ocular symptoms answered as always, often or sometimes. The multiple logistic regression model was built using the stepwise forward likelihood method, retaining the variables with significance levels below 5% (p < 0.05). The most frequently reported visual complaint was blurred vision (43.5%). The prevalence of Computer Vision Syndrome was 54.6%. Associations verified were: being female (OR 2.6, 95% CI 1.6 to 4.1), lack of recognition at work (OR 1.4, 95% CI 1.1 to 1.8), organization of work in the call center (OR 1.4, 95% CI 1.1 to 1.7) and high demand at work (OR 1.1, 95% CI 1.0 to 1.3). Organizational and psychosocial factors at work should be included in prevention programs for visual syndrome among call center operators.
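    The odds ratios quoted above come from a fitted logistic model. The sketch below shows the generic computation (OR = exp(beta), with confidence intervals obtained by exponentiating the coefficient interval) on synthetic data, since the study's dataset is not available; the simulated effect sizes are loosely matched to the reported ORs.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 476
female = rng.binomial(1, 0.6, n)
high_demand = rng.binomial(1, 0.5, n)
# Hypothetical true effects roughly in line with the reported odds ratios
logit = -0.6 + np.log(2.6) * female + np.log(1.1) * high_demand
cvs = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

X = sm.add_constant(np.column_stack([female, high_demand]))
fit = sm.Logit(cvs, X).fit(disp=0)
print("odds ratios:", np.exp(fit.params))       # OR = exp(beta)
print("95% CI:\n", np.exp(fit.conf_int()))      # exponentiated CI bounds
```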

  1. Ressource-Sharing on the Tera-Flop Scale for the Biomedical Research and Care Sector - The Erasmus Computing Grid and MediGRID

    NARCIS (Netherlands)

    T.A. Knoch (Tobias)

    2008-01-01

    Today, advances in scientific research as well as clinical diagnostics and treatment are inevitably connected with information solutions concerning computation power and information storage. The needs for information technology are enormous and are in many cases the limiting factor

  2. Assessing the practice of biomedical ontology evaluation: Gaps and opportunities.

    Science.gov (United States)

    Amith, Muhammad; He, Zhe; Bian, Jiang; Lossio-Ventura, Juan Antonio; Tao, Cui

    2018-04-01

    With the proliferation of heterogeneous health care data in the last three decades, biomedical ontologies and controlled biomedical terminologies play an increasingly important role in knowledge representation and management, data integration, natural language processing, as well as decision support for health information systems and biomedical research. Biomedical ontologies and controlled terminologies are intended to assure interoperability. Nevertheless, the quality of biomedical ontologies has hindered their applicability and subsequent adoption in real-world applications. Ontology evaluation is an integral part of ontology development and maintenance. In the biomedicine domain, ontology evaluation is often conducted by third parties as a quality assurance (or auditing) effort that focuses on identifying modeling errors and inconsistencies. In this work, we first organized four categorical schemes of ontology evaluation methods in the existing literature to create an integrated taxonomy. Further, to understand the ontology evaluation practice in the biomedicine domain, we reviewed a sample of 200 ontologies from the National Center for Biomedical Ontology (NCBO) BioPortal, the largest repository for biomedical ontologies, and observed that only 15 of these ontologies have documented evaluation in their corresponding inception papers. We then surveyed the recent quality assurance approaches for biomedical ontologies and their use. We also mapped these quality assurance approaches to the ontology evaluation criteria. It is our anticipation that ontology evaluation and quality assurance approaches will be more widely adopted in the development life cycle of biomedical ontologies. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo; Oliveira, Paulo Marcio Campos de; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila, E-mail: pridili@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Silva, Teogenes Augusto da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-01-15

    Objective: to evaluate the level of ambient radiation in a PET/CT center. Materials and methods: previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: in none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/ year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion: in the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. (author)
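    The comparison with the area limits is a simple scaling of the 32-day TLD readings to a yearly figure; a one-line helper makes the arithmetic explicit. The example reading is invented for illustration.

```python
def annualized_dose_msv(reading_msv: float, period_days: float) -> float:
    """Scale a dose accumulated over a monitoring period to a yearly
    estimate for comparison with area dose limits."""
    return reading_msv * 365.0 / period_days

# Hypothetical TLD reading of 0.030 mSv accumulated over the 32-day period
print(f"{annualized_dose_msv(0.030, 32):.2f} mSv/year "
      "(limits: 0.5 free area, 5 controlled area)")
```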

  4. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  5. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins.

    Science.gov (United States)

    Casanova, J; Wang, Z-Y; Plenio, M B

    2016-09-23

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  6. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Science.gov (United States)

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results In none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  7. 15th International Conference on Biomedical Engineering

    CERN Document Server

    2014-01-01

    This volume presents the proceedings of the 15th ICMBE held from 4th to 7th December 2013, Singapore. Biomedical engineering is applied in most aspects of our healthcare ecosystem. From electronic health records to diagnostic tools to therapeutic, rehabilitative and regenerative treatments, the work of biomedical engineers is evident. Biomedical engineers work at the intersection of engineering, life sciences and healthcare. They use principles from applied science, including mechanical, electrical, chemical and computer engineering, together with physical sciences, including physics, chemistry and mathematics, and apply them to biology and medicine. Applying such concepts to the human body involves much the same concepts that go into building and programming a machine. The goal is to better understand, replace or fix a target system to ultimately improve the quality of healthcare. With this understanding, the conference proceedings offer a single platform for individuals and organisations working i...

  8. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  9. Simbody: multibody dynamics for biomedical research.

    Science.gov (United States)

    Sherman, Michael A; Seth, Ajay; Delp, Scott L

    Multibody software designed for mechanical engineering has been successfully employed in biomedical research for many years. For real time operation some biomedical researchers have also adapted game physics engines. However, these tools were built for other purposes and do not fully address the needs of biomedical researchers using them to analyze the dynamics of biological structures and make clinically meaningful recommendations. We are addressing this problem through the development of an open source, extensible, high performance toolkit including a multibody mechanics library aimed at the needs of biomedical researchers. The resulting code, Simbody, supports research in a variety of fields including neuromuscular, prosthetic, and biomolecular simulation, and related research such as biologically-inspired design and control of humanoid robots and avatars. Simbody is the dynamics engine behind OpenSim, a widely used biomechanics simulation application. This article reviews issues that arise uniquely in biomedical research, and reports on the architecture, theory, and computational methods Simbody uses to address them. By addressing these needs explicitly Simbody provides a better match to the needs of researchers than can be obtained by adaptation of mechanical engineering or gaming codes. Simbody is a community resource, free for any purpose. We encourage wide adoption and invite contributions to the code base at https://simtk.org/home/simbody.
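    For a flavor of the forward-dynamics computation such an engine performs, the sketch below integrates the simplest articulated system, a single pin-jointed pendulum, with scipy. Simbody itself is a C++ library, so this Python fragment only mirrors the kind of computation involved, not Simbody's API.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pendulum(t, y, g=9.81, length=1.0):
    """Equations of motion for a pin-jointed pendulum, the smallest
    multibody system: theta'' = -(g / L) * sin(theta)."""
    theta, omega = y
    return [omega, -(g / length) * np.sin(theta)]

# Integrate 10 s of motion from a 0.5 rad initial deflection at rest
sol = solve_ivp(pendulum, (0.0, 10.0), [0.5, 0.0], max_step=0.01)
print(f"theta(10 s) = {sol.y[0, -1]:.4f} rad")
```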

  10. Biomedical engineering for health research and development.

    Science.gov (United States)

    Zhang, X-Y

    2015-01-01

    Biomedical engineering is a new area of research in medicine and biology, providing new concepts and designs for the diagnosis, treatment and prevention of various diseases. There are several types of biomedical engineering, such as tissue, genetic, neural and stem cell engineering, as well as chemical and clinical engineering for health care. Many electronic and magnetic methods and instruments are used in biomedical engineering, such as Computed Tomography (CT) scans, Magnetic Resonance Imaging (MRI) scans, Electroencephalography (EEG) and ultrasound, alongside regenerative medicine, stem cell cultures, and preparations of artificial cells and organs, such as the pancreas, urinary bladder, liver cells, and foreskin fibroblast cells. The principle of tissue engineering is described with the various types of cells used for tissue engineering purposes. The use of several medical devices and bionics is described, along with scaffolds, cell and tissue cultures, and the various materials used for biomedical engineering. The use of biomedical engineering methods is very important for human health and for the research and development of treatments for disease. Bioreactors and preparations of artificial cells, tissues and organs are described here.

  11. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  12. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    International Nuclear Information System (INIS)

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. Three visualization techniques applied, post-processing, tracking, and steering, are described, as well as the post-processing software packages used, PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis software Toolkit). Using post-processing methods a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that the high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  13. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1976-01-01

    Advances in Biomedical Engineering, Volume 6, is a collection of papers that discusses the role of integrated electronics in medical systems and the usage of biological mathematical models in biological systems. Other papers deal with the health care systems, the problems and methods of approach toward rehabilitation, as well as the future of biomedical engineering. One paper discusses the use of system identification as it applies to biological systems to estimate the values of a number of parameters (for example, resistance, diffusion coefficients) by indirect means. More particularly, the i

  14. Biomedical enhancements as justice.

    Science.gov (United States)

    Nam, Jeesoo

    2015-02-01

    Biomedical enhancements, the applications of medical technology to make better those who are neither ill nor deficient, have made great strides in the past few decades. Using Amartya Sen's capability approach as my framework, I argue in this article that far from being simply permissible, we have a prima facie moral obligation to use these new developments for the end goal of promoting social justice. In terms of both range and magnitude, the use of biomedical enhancements will mark a radical advance in how we compensate the most disadvantaged members of society. © 2013 John Wiley & Sons Ltd.

  15. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1976-01-01

    Advances in Biomedical Engineering, Volume 5, is a collection of papers that deals with application of the principles and practices of engineering to basic and applied biomedical research, development, and the delivery of health care. The papers also describe breakthroughs in health improvements, as well as basic research that have been accomplished through clinical applications. One paper examines engineering principles and practices that can be applied in developing therapeutic systems by a controlled delivery system in drug dosage. Another paper examines the physiological and materials vari

  16. A canonical perturbation method for computing the guiding-center motion in magnetized axisymmetric plasma columns

    International Nuclear Information System (INIS)

    Gratreau, P.

    1987-01-01

    The motion of charged particles in a magnetized plasma column, such as that of a magnetic mirror trap or a tokamak, is determined in the framework of canonical perturbation theory through a method of variation of constants which preserves energy conservation and symmetry invariance. The choice of a frame of coordinates close to that of the magnetic coordinates allows a relatively precise determination of the guiding-center motion with a low-order approximation in the adiabatic parameter. A Hamiltonian formulation of the equations of motion is obtained.
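    For orientation, the quantities such a guiding-center theory tracks at lowest order are standard. The relations below (the adiabatic invariant and the leading drifts) are textbook background supplied here for context, not the paper's specific perturbation result.

```latex
\begin{align}
  \mu &= \frac{m v_\perp^{2}}{2B}
        && \text{(adiabatic invariant)} \\
  \mathbf{v}_{E} &= \frac{\mathbf{E}\times\mathbf{B}}{B^{2}}
        && \text{(E cross B drift)} \\
  \mathbf{v}_{\nabla B} &= \frac{m v_\perp^{2}}{2qB^{3}}\,
        \mathbf{B}\times\nabla B
        && \text{(grad-B drift)} \\
  \mathbf{v}_{\kappa} &= \frac{m v_\parallel^{2}}{qB}\,
        \hat{\mathbf{b}}\times\boldsymbol{\kappa},
        \qquad \boldsymbol{\kappa} = (\hat{\mathbf{b}}\cdot\nabla)\hat{\mathbf{b}}
        && \text{(curvature drift)}
\end{align}
```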

  17. Human factors in computing systems: focus on patient-centered health communication at the ACM SIGCHI conference.

    Science.gov (United States)

    Wilcox, Lauren; Patel, Rupa; Chen, Yunan; Shachak, Aviv

    2013-12-01

    Health Information Technologies, such as electronic health records (EHR) and secure messaging, have already transformed interactions among patients and clinicians. In addition, technologies supporting asynchronous communication outside of clinical encounters, such as email, SMS, and patient portals, are being increasingly used for follow-up, education, and data reporting. Meanwhile, patients are increasingly adopting personal tools to track various aspects of health status and therapeutic progress, wishing to review these data with clinicians during consultations. These issues have drawn increasing interest from the human-computer interaction (HCI) community, with special focus on critical challenges in patient-centered interactions and design opportunities that can address these challenges. We saw this community presenting and interacting at ACM SIGCHI 2013, the Conference on Human Factors in Computing Systems (also known as CHI), held April 27 to May 2, 2013, at the Palais des Congrès de Paris in France. CHI 2013 featured many formal avenues to pursue patient-centered health communication: a well-attended workshop, tracks of original research, and a lively panel discussion. In this report, we highlight these events and the main themes we identified. We hope that it will help bring the health care communication and the HCI communities closer together. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    International Nuclear Information System (INIS)

    Bhattacharjee, Amitava

    2016-01-01

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  19. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  20. Biomedical Engineering in Modern Society

    Science.gov (United States)

    Attinger, E. O.

    1971-01-01

    Considers definition of biomedical engineering (BME) and how biomedical engineers should be trained. State of the art descriptions of BME and BME education are followed by a brief look at the future of BME. (TS)

  1. Biomedical Image Registration

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 8th International Workshop on Biomedical Image Registration, WBIR 2018, held in Leiden, The Netherlands, in June 2018. The 11 full and poster papers included in this volume were carefully reviewed and selected from 17 submitted papers. The pap...

  2. Biomedical Data Mining

    NARCIS (Netherlands)

    Peek, N.; Combi, C.; Tucker, A.

    2009-01-01

    Objective: To introduce the special topic of Methods of Information in Medicine on data mining in biomedicine, with selected papers from two workshops on Intelligent Data Analysis in bioMedicine (IDAMAP) held in Verona (2006) and Amsterdam (2007). Methods: Defining the field of biomedical data

  3. Careers in biomedical engineering.

    Science.gov (United States)

    Madrid, R E; Rotger, V I; Herrera, M C

    2010-01-01

    Although biomedical engineering was started in Argentina about 35 years ago, it has had a sustained growth for the last 25 years in human resources, with the emergence of new undergraduate and postgraduate careers, as well as in research, knowledge, technological development, and health care.

  4. Anatomy for Biomedical Engineers

    Science.gov (United States)

    Carmichael, Stephen W.; Robb, Richard A.

    2008-01-01

    There is a perceived need for anatomy instruction for graduate students enrolled in a biomedical engineering program. This appeared especially important for students interested in and using medical images. These students typically did not have a strong background in biology. The authors arranged for students to dissect regions of the body that…

  5. Biomedical research applications

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    The biomedical research Panel believes that the Calutron facility at Oak Ridge is a national and international resource of immense scientific value and of fundamental importance to continued biomedical research. This resource is essential to the development of new isotope uses in biology and medicine. It should therefore be nurtured by adequate support and operated in a way that optimizes its services to the scientific and technological community. The Panel sees a continuing need for a reliable supply of a wide variety of enriched stable isotopes. The past and present utilization of stable isotopes in biomedical research is documented in Appendix 7. Future requirements for stable isotopes are impossible to document, however, because of the unpredictability of research itself. Nonetheless we expect the demand for isotopes to increase in parallel with the continuing expansion of biomedical research as a whole. There are a number of promising research projects at the present time, and these are expected to lead to an increase in production requirements. The Panel also believes that a high degree of priority should be given to replacing the supplies of the 65 isotopes (out of the 224 previously available enriched isotopes) no longer available from ORNL

  6. Computations on the massively parallel processor at the Goddard Space Flight Center

    Science.gov (United States)

    Strong, James P.

    1991-01-01

    Described are four significant algorithms implemented on the massively parallel processor (MPP) at the Goddard Space Flight Center. Two are in the area of image analysis. Of the other two, one is a mathematical simulation experiment and the other deals with the efficient transfer of data between distantly separated processors in the MPP array. The first algorithm presented is the automatic determination of elevations from stereo pairs. The second algorithm solves mathematical logistic equations capable of producing both ordered and chaotic (or random) solutions. This work can potentially lead to the simulation of artificial life processes. The third algorithm is the automatic segmentation of images into reasonable regions based on some similarity criterion, while the fourth is an implementation of a bitonic sort of data which significantly overcomes the nearest neighbor interconnection constraints on the MPP for transferring data between distant processors.
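
    The logistic equation mentioned above lends itself to a compact illustration. The sketch below is a minimal serial Python version of the mathematics only (the MPP implementation ran across thousands of processors); the parameter values chosen for the ordered and chaotic regimes are standard textbook choices, not taken from the report.

```python
# Logistic map x_{n+1} = r * x_n * (1 - x_n): ordered solutions for
# moderate r, chaotic ones as r approaches 4.
def logistic_trajectory(r, x0=0.5, steps=200):
    """Iterate the logistic map and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

for r in (2.8, 3.5, 3.9):  # fixed-point, periodic, chaotic regimes
    tail = logistic_trajectory(r)[-4:]
    print(f"r = {r}: last iterates {[round(x, 4) for x in tail]}")
```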

  7. Computer simulations of low energy displacement cascades in a face centered cubic lattice

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Bourquin, R.D.

    1976-09-01

    Computer simulations of atomic motion in a copper lattice following the production of primary knock-on atoms (PKAs) with energies from 25 to 200 eV are discussed. In this study, a mixed Moliere-Englert pair potential is used to model the copper lattice. The computer code COMENT, which employs the dynamical method, is used to analyze the motion of up to 6000 atoms per time step during cascade evolution. The atoms are specified as initially at rest on the sites of an ideal lattice. A matrix of 12 PKA directions and 6 PKA energies is investigated. Displacement thresholds in the [110] and [100] directions are calculated to be approximately 17 and 20 eV, respectively. A table showing the stability of isolated Frenkel pairs with different vacancy and interstitial orientations and separations is presented. The numbers of Frenkel pairs and atomic replacements are tabulated as a function of PKA direction for each energy. For PKA energies of 25, 50, 75, 100, 150, and 200 eV, the average numbers of Frenkel pairs per PKA are 0.4, 0.6, 1.0, 1.2, 1.4, and 2.2, and the average numbers of replacements per PKA are 2.4, 4.0, 3.3, 4.9, 9.3, and 15.8, respectively.

  8. Computation of Electromagnetic Fields Scattered From Dielectric Objects of Uncertain Shapes Using MLMC Center for Uncertainty

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    Simulators capable of computing scattered fields from objects of uncertain shapes are highly useful in electromagnetics and photonics, where device designs are typically subject to fabrication tolerances. Knowledge of statistical variations in scattered fields is useful in ensuring error-free functioning of devices. Oftentimes such simulators use a Monte Carlo (MC) scheme to sample the random domain, where the variables parameterize the uncertainties in the geometry. At each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver is executed to compute the scattered fields. However, to obtain accurate statistics of the scattered fields, the number of MC samples has to be large. This significantly increases the total execution time. In this work, to address this challenge, the Multilevel MC (MLMC) scheme is used together with a (deterministic) surface integral equation solver. The MLMC achieves a higher efficiency by “balancing” the statistical errors due to sampling of the random domain and the numerical errors due to discretization of the geometry at each of these samples. Error balancing results in a smaller number of samples requiring coarser discretizations. Consequently, total execution time is significantly shortened.
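
    The telescoping-sum structure of the MLMC estimator described above can be sketched compactly. In the toy Python sketch below, `solver` is a hypothetical stand-in for the deterministic electromagnetic solver, with a discretization error that halves at each level; the sample counts are illustrative, whereas the real scheme chooses them from estimated variances and costs.

```python
import numpy as np

rng = np.random.default_rng(0)

def solver(xi, level):
    """Hypothetical stand-in for the deterministic EM solver: a toy
    quantity of interest whose discretization error decays as 2**-level."""
    return np.sin(xi) + 2.0 ** (-level) * np.cos(3.0 * xi)

def mlmc_estimate(samples_per_level):
    """E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]: many cheap samples on
    coarse levels, few expensive samples on fine levels."""
    total = 0.0
    for level, n in enumerate(samples_per_level):
        xi = rng.uniform(0.0, np.pi, size=n)  # random shape parameter
        if level == 0:
            diffs = solver(xi, 0)
        else:
            diffs = solver(xi, level) - solver(xi, level - 1)
        total += diffs.mean()
    return total

print(mlmc_estimate([4000, 1000, 250, 60, 15]))
```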

  9. Biomedical Research Institute, Biomedical Research Foundation of Northwest Louisiana, Shreveport, Louisiana

    International Nuclear Information System (INIS)

    1992-01-01

    Department of Energy (DOE) has prepared an Environmental Assessment (EA), DOE/EA-0789, evaluating the environmental impacts of construction and operation of a Biomedical Research Institute (BRI) at the Louisiana State University (LSU) Medical Center, Shreveport, Louisiana. The purpose of the BRI is to accelerate the development of biomedical research in cardiovascular disease, molecular biology, and neurobiology. Based on the analyses in the EA, DOE has determined that the proposed action does not constitute a major Federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required

  10. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. This planet-size data brings serious challenges to storage and computing technologies. Cloud computing is an alternative to crack the nut because it gives concurrent consideration to storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications will facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  11. Brain-computer interface signal processing at the Wadsworth Center: mu and sensorimotor beta rhythms.

    Science.gov (United States)

    McFarland, Dennis J; Krusienski, Dean J; Wolpaw, Jonathan R

    2006-01-01

    The Wadsworth brain-computer interface (BCI), based on mu and beta sensorimotor rhythms, uses one- and two-dimensional cursor movement tasks and relies on user training. This is a real-time closed-loop system. Signal processing consists of channel selection, spatial filtering, and spectral analysis. Feature translation uses a regression approach and normalization. Adaptation occurs at several points in this process on the basis of different criteria and methods. It can use either feedforward (e.g., estimating the signal mean for normalization) or feedback control (e.g., estimating feature weights for the prediction equation). We view this process as the interaction between a dynamic user and a dynamic system that coadapt over time. Understanding the dynamics of this interaction and optimizing its performance represent a major challenge for BCI research.
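
    A drastically simplified sketch of that signal chain (band-power feature extraction followed by a normalized, regression-based translation) follows; the sampling rate, the 8-12 Hz mu band, and the synthetic trials are illustrative assumptions, not the Wadsworth system's actual parameters.

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of one channel in the [lo, hi] Hz band via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].sum()

fs = 250  # sampling rate in Hz (illustrative)
rng = np.random.default_rng(1)
# Synthetic session: 100 one-second trials of a spatially filtered
# channel, plus the target cursor velocity for each trial.
trials = rng.standard_normal((100, fs))
targets = rng.uniform(-1.0, 1.0, size=100)

# Feature: log power in the mu band (8-12 Hz), normalized.
feats = np.log([band_power(t, fs, 8.0, 12.0) for t in trials])
feats = (feats - feats.mean()) / feats.std()

# Translation: least-squares regression from feature to cursor velocity;
# in the real system the weights adapt continuously during use.
w, b = np.polyfit(feats, targets, deg=1)
print("predicted velocity, first trial:", w * feats[0] + b)
```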

  12. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term "code package" is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a "code package" consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data), and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package.

  13. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center. [Radiation transport codes

    Energy Technology Data Exchange (ETDEWEB)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term "code package" is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist-user to solve technical problems in the area for which the material was designed. In general, a "code package" consists of written material (reports, instructions, flow charts, listings of data, and other useful material) and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data), and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, any available auxiliary routines are also included. The abstract format was chosen to give a potential code user several criteria for deciding whether or not he wishes to request the code package. (RWR)

  14. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  15. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Science.gov (United States)

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.
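
    The mean unsigned error quoted above is a simple statistic; the snippet below shows the computation on invented placeholder bond lengths (not the paper's data).

```python
# Mean unsigned error (MUE) between semiempirical (PM6) and DFT
# reference Ru-ligand bond lengths. Values are placeholders only.
dft = [2.056, 2.061, 2.030, 1.982]  # reference distances (angstroms)
pm6 = [2.101, 2.049, 2.071, 1.999]  # semiempirical distances (angstroms)

mue = sum(abs(a - b) for a, b in zip(pm6, dft)) / len(dft)
print(f"MUE = {mue:.3f} angstroms")
```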

  16. Biomedical Wireless Ambulatory Crew Monitor

    Science.gov (United States)

    Chmiel, Alan; Humphreys, Brad

    2009-01-01

    A compact, ambulatory biometric data acquisition system has been developed for space and commercial terrestrial use. BioWATCH (Biomedical Wireless and Ambulatory Telemetry for Crew Health) acquires signals from biomedical sensors using acquisition modules attached to a common data and power bus. Several slots allow the user to configure the unit by inserting sensor-specific modules. The data are then sent in real time from the unit over any commercially implemented wireless network, including 802.11b/g, WCDMA, and 3G. This system has a distributed computing hierarchy and a common data controller on each sensor module. This allows for the modularity of the device along with the tailored ability to control the cards using a relatively small master processor. The distributed nature of this system affords modularity, size, and power consumption that better the current state of the art in medical ambulatory data acquisition. A new company was created to market this technology.

  17. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    Science.gov (United States)

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  18. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2003 to March 31, 2004

    International Nuclear Information System (INIS)

    2005-08-01

    Major research and development activities of the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, have focused on the ITBL (IT-Based Laboratory) project, computational material science, and quantum bioinformatics. This report provides an overview of the research and development activities in CCSE in the fiscal year 2003 (April 1, 2003 - March 31, 2004). (author)

  19. Biomedical sensor design using analog compressed sensing

    Science.gov (United States)

    Balouchestani, Mohammadreza; Krishnan, Sridhar

    2015-05-01

    The main drawback of current healthcare systems is the location-specific nature of the system due to the use of fixed/wired biomedical sensors. Since biomedical sensors are usually driven by a battery, power consumption is the most important factor determining the life of a biomedical sensor. They are also restricted by size, cost, and transmission capacity. Therefore, it is important to reduce the load of sampling by merging the sampling and compression steps to reduce the storage usage, transmission times, and power consumption in order to expand the current healthcare systems to Wireless Healthcare Systems (WHSs). In this work, we present an implementation of a low-power biomedical sensor using an analog Compressed Sensing (CS) framework for sparse biomedical signals that addresses both the energy and telemetry bandwidth constraints of wearable and wireless Body-Area Networks (BANs). This architecture enables continuous data acquisition and compression of biomedical signals that are suitable for a variety of diagnostic and treatment purposes. At the transmitter side, an analog-CS framework is applied at the sensing step, before the Analog to Digital Converter (ADC), in order to generate the compressed version of the input analog bio-signal. At the receiver side, a reconstruction algorithm based on the Restricted Isometry Property (RIP) condition is applied in order to reconstruct the original bio-signals from the compressed bio-signals with high probability and sufficient accuracy. We examine the proposed algorithm with healthy and neuropathy surface Electromyography (sEMG) signals. The proposed algorithm achieves an Average Recognition Rate (ARR) of 93% and a reconstruction accuracy of 98.9%. In addition, the proposed architecture reduces total computation time from 32 to 11.5 seconds at a sampling rate of 29% of the Nyquist rate, with Percentage Residual Difference (PRD) = 26% and Root Mean Squared Error (RMSE) = 3%.
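
    A minimal digital sketch of the compressed-sensing pipeline underlying such a design is given below: random sub-Nyquist linear measurements followed by sparse recovery. The ISTA solver is a generic stand-in chosen for brevity; the paper's analog front end and its RIP-based reconstruction differ in detail, and all sizes here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m, k = 256, 77, 5  # signal length, measurements (~30% of n), sparsity

# Synthetic sparse bio-signal (sparsity in the identity basis assumed).
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)  # random sensing matrix
y = Phi @ x                                     # compressed measurements

# ISTA: iterative soft-thresholding for min 0.5*||y - Phi z||^2 + lam*||z||_1
z = np.zeros(n)
step = 1.0 / np.linalg.norm(Phi, 2) ** 2  # 1 / Lipschitz constant
lam = 0.01
for _ in range(500):
    grad = Phi.T @ (Phi @ z - y)
    u = z - step * grad
    z = np.sign(u) * np.maximum(np.abs(u) - step * lam, 0.0)

print("relative reconstruction error:",
      np.linalg.norm(z - x) / np.linalg.norm(x))
```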

  20. Computed Tomography Features of Benign and Malignant Calcified Thyroid Nodules: A Single-Center Study.

    Science.gov (United States)

    Kim, Donghyun; Kim, Dong Wook; Heo, Young Jin; Baek, Jin Wook; Lee, Yoo Jin; Park, Young Mi; Baek, Hye Jin; Jung, Soo Jin

    No previous studies have investigated thyroid calcification on computed tomography (CT) quantitatively by using Hounsfield unit (HU) values. This study aimed to analyze quantitative HU values of thyroid calcification on preoperative neck CT and to assess the characteristics of benign and malignant calcified thyroid nodules (CTNs). Two hundred twenty patients who underwent neck CT before thyroid surgery from January 2015 to June 2016 were included. On soft-tissue window CT images, CTNs with calcified components of 3 mm or larger in minimum diameter were included in this study. The HU values and types of CTNs were determined and analyzed. Of 61 CTNs in 49 patients, there were 42 malignant nodules and 19 benign nodules. The mean largest diameter of the calcified component was 5.3 (2.5) mm (range, 3.1-17.1 mm). A statistically significant difference was observed in the HU values of calcified portions between benign and malignant CTNs, whereas there was no significant difference in patient age or sex or in the size, location, or type of each CTN. Of the 8 CTNs with pure calcification, 3 exhibited a honeycomb pattern on bone window CT images, and these 3 CTNs were all diagnosed as papillary thyroid carcinoma on histopathological examination. Hounsfield unit values of CTNs may be helpful for differentiating malignancy from benignity.
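
    The group comparison reported above reduces to a two-sample test on HU values; a sketch with invented placeholder values (not the study's measurements):

```python
# Two-sample comparison of Hounsfield unit (HU) values of calcified
# portions in benign vs. malignant nodules. Data are placeholders.
from scipy import stats

benign_hu = [620, 540, 710, 480, 655, 590]
malignant_hu = [810, 920, 760, 1005, 870, 940]

t, p = stats.ttest_ind(benign_hu, malignant_hu, equal_var=False)
print(f"Welch t = {t:.2f}, p = {p:.4f}")
```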

  1. Biomedical signals, imaging, and informatics

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Signals, Imaging, and Informatics, the third volume of the handbook, presents material from respected scientists with diverse backgrounds in biosignal processing, medical imaging, infrared imaging, and medical informatics.More than three dozen specific topics are examined, including biomedical s

  2. Research evaluation support services in biomedical libraries.

    Science.gov (United States)

    Gutzman, Karen Elizabeth; Bales, Michael E; Belter, Christopher W; Chambers, Thane; Chan, Liza; Holmes, Kristi L; Lu, Ya-Ling; Palmer, Lisa A; Reznik-Zellen, Rebecca C; Sarli, Cathy C; Suiter, Amy M; Wheeler, Terrie R

    2018-01-01

    The paper provides a review of current practices related to evaluation support services reported by seven biomedical and research libraries. A group of seven libraries from the United States and Canada described their experiences with establishing evaluation support services at their libraries. A questionnaire was distributed among the libraries to elicit information as to program development, service and staffing models, campus partnerships, training, products such as tools and reports, and resources used for evaluation support services. The libraries also reported interesting projects, lessons learned, and future plans. The seven libraries profiled in this paper report a variety of service models in providing evaluation support services to meet the needs of campus stakeholders. The service models range from research center cores, partnerships with research groups, and library programs with staff dedicated to evaluation support services. A variety of products and services were described such as an automated tool to develop rank-based metrics, consultation on appropriate metrics to use for evaluation, customized publication and citation reports, resource guides, classes and training, and others. Implementing these services has allowed the libraries to expand their roles on campus and to contribute more directly to the research missions of their institutions. Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.

  3. EVALUATION OF PROPTOSIS BY USING COMPUTED TOMOGRAPHY IN A TERTIARY CARE CENTER, BURLA, SAMBALPUR, ODISHA

    Directory of Open Access Journals (Sweden)

    Vikas Agrawal

    2017-07-01

    BACKGROUND: Proptosis is defined as the abnormal anterior protrusion of the globe beyond the orbital margins. It is an important clinical manifestation of various orbital as well as systemic disorders, with aetiologies ranging from infection to malignant tumours, among which space-occupying lesions within the orbit are the most important. MATERIALS AND METHODS: A total of 32 patients referred from various departments, mainly ophthalmology and medicine, with history and clinical features suggestive of proptosis were evaluated in our department; after proper history taking and clinical examination, computed tomography (CT) was performed. RESULTS: The age of the patients ranged from 1-55 years. Associated chief complaints, in decreasing order, were pain/headache, restricted eye movement, diminished vision, and diplopia. Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). Trauma, vascular lesions, and congenital conditions were infrequent causes. In children, the common causes of proptosis were retinoblastoma (35.71%) and orbital cellulitis (28.57%); in adults, the common causes were thyroid ophthalmopathy (22.22%), trauma (16.66%), and pseudotumour (16.66%). CONCLUSION: Mass lesions (46.87%) were the most common cause of proptosis, followed by inflammatory lesions (37.5%). CT scanning should be the chief investigation in the evaluation of lesions causing proptosis; it is the most useful modality for detecting, characterising, and determining the extent of the disease process. The overall accuracy of CT in the diagnosis of proptosis is 96.87%.

  4. Ethical Issues of Artificial Biomedical Applications

    OpenAIRE

    Alexiou , Athanasios; Psixa , Maria; Vlamos , Panagiotis

    2011-01-01

    Part 12: Medical Applications of ANN and Ethics of AI. While the plethora of artificial biomedical applications is enriched and combined with the possibilities of artificial intelligence, bioinformatics and nanotechnology, the variability in the ideological use of such concepts is associated with bioethical issues and several legal aspects. The convergence of bioethics and computer ethics attempts to illustrate and approach problems occurring by the fusion of human a...

  5. Optical Polarization in Biomedical Applications

    CERN Document Server

    Tuchin, Valery V; Zimnyakov, Dmitry A

    2006-01-01

    Optical Polarization in Biomedical Applications introduces key developments in optical polarization methods for quantitative studies of tissues, while presenting the theory of polarization transfer in a random medium as a basis for the quantitative description of polarized light interaction with tissues. This theory uses the modified transfer equation for Stokes parameters and predicts the polarization structure of multiple scattered optical fields. The backscattering polarization matrices (Jones matrix and Mueller matrix) important for noninvasive medical diagnostic are introduced. The text also describes a number of diagnostic techniques such as CW polarization imaging and spectroscopy, polarization microscopy and cytometry. As a new tool for medical diagnosis, optical coherent polarization tomography is analyzed. The monograph also covers a range of biomedical applications, among them cataract and glaucoma diagnostics, glucose sensing, and the detection of bacteria.

  6. A Semantic Web management model for integrative biomedical informatics.

    Directory of Open Access Journals (Sweden)

    Helena F Deus

    2008-08-01

    Data, data everywhere. The diversity and magnitude of the data generated in the Life Sciences defies automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high-throughput molecular data. The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical Knowledge Engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available with open source at www.s3db.org, was developed, and its proposed design has been made publicly available as an open source instrument for shared, distributed data management. The Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development, we can expect that both general-purpose productivity software and domain-specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis.
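
    For readers unfamiliar with the RDF triples on which models like S3DB are built, here is a minimal sketch using the rdflib library; the namespace and property names are hypothetical and do not reflect S3DB's actual vocabulary.

```python
from rdflib import Graph, Literal, Namespace

# Hypothetical namespace and terms, for illustration only; the actual
# S3DB deployment at www.s3db.org defines its own vocabulary.
EX = Namespace("http://example.org/biomed/")

g = Graph()
sample = EX["sample42"]
g.add((sample, EX.belongsToProject, EX["lungCancerSPORE"]))
g.add((sample, EX.hasAssayValue, Literal(3.14)))
g.add((sample, EX.accessPermission, Literal("restricted")))

# Each statement is a (subject, predicate, object) triple; distributed
# authorities can each host their own graph and link across them.
for s, p, o in g:
    print(s, p, o)
```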

  7. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    Directory of Open Access Journals (Sweden)

    Milan Vukićević

    2014-01-01

    Rapid growth and storage of biomedical data has enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other side, analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand with open source big data technologies for analysis of biomedical data.

  8. Computer-Based Training in Eating and Nutrition Facilitates Person-Centered Hospital Care: A Group Concept Mapping Study.

    Science.gov (United States)

    Westergren, Albert; Edfors, Ellinor; Norberg, Erika; Stubbendorff, Anna; Hedin, Gita; Wetterstrand, Martin; Rosas, Scott R; Hagell, Peter

    2018-04-01

    Studies have shown that computer-based training in eating and nutrition for hospital nursing staff increased the likelihood that patients at risk of undernutrition would receive nutritional interventions. This article seeks to provide understanding, from the perspective of nursing staff, of conceptually important areas for computer-based nutritional training and their relative importance to nutritional care following completion of the training. Group concept mapping, an integrated qualitative and quantitative methodology, was used to conceptualize important factors relating to the training experiences through four focus groups (n = 43), statement sorting (n = 38), and importance rating (n = 32), followed by multidimensional scaling and cluster analysis. Sorting of 38 statements yielded four clusters (number of statements in parentheses): personal competence and development (10), practice-close care development (10), patient safety (9), and awareness about the nutrition care process (9). The first and second clusters represented "the learning organization," and the third and fourth represented "quality improvement." These findings provide a conceptual basis for understanding the importance of training in eating and nutrition, which contributes to a learning organization and quality improvement, and can be linked to and facilitate person-centered nutritional care and patient safety.
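
    The quantitative core of group concept mapping (multidimensional scaling of a statement co-sorting similarity matrix, followed by cluster analysis) can be sketched as below; the 6-statement similarity matrix is invented for illustration.

```python
# Sketch: embed a co-sorting similarity matrix with MDS, then cluster.
import numpy as np
from sklearn.manifold import MDS
from sklearn.cluster import KMeans

# similarity = fraction of participants who sorted two statements together
sim = np.array([
    [1.0, 0.8, 0.7, 0.1, 0.2, 0.1],
    [0.8, 1.0, 0.6, 0.2, 0.1, 0.2],
    [0.7, 0.6, 1.0, 0.1, 0.1, 0.1],
    [0.1, 0.2, 0.1, 1.0, 0.7, 0.8],
    [0.2, 0.1, 0.1, 0.7, 1.0, 0.6],
    [0.1, 0.2, 0.1, 0.8, 0.6, 1.0],
])
dist = 1.0 - sim  # dissimilarity for the MDS embedding

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dist)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coords)
print(labels)  # statements grouped into candidate clusters
```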

  9. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  10. Building a biomedical ontology recommender web service

    Directory of Open Access Journals (Sweden)

    Jonquet Clement

    2010-06-01

    Background: Researchers in biomedical informatics use ontologies and terminologies to annotate their data in order to facilitate data integration and translational discoveries. As the use of ontologies for annotation of biomedical datasets has risen, a common challenge is to identify ontologies that are best suited to annotating specific datasets. The number and variety of biomedical ontologies is large, and it is cumbersome for a researcher to figure out which ontology to use. Methods: We present the Biomedical Ontology Recommender web service. The system uses textual metadata or a set of keywords describing a domain of interest and suggests appropriate ontologies for annotating or representing the data. The service makes a decision based on three criteria. The first is coverage, or the ontologies that provide the most terms covering the input text. The second is connectivity, or the ontologies that are most often mapped to by other ontologies. The third is size, or the number of concepts in the ontologies. The service scores the ontologies as a function of scores of the annotations created using the National Center for Biomedical Ontology (NCBO) Annotator web service. We used all the ontologies from the UMLS Metathesaurus and the NCBO BioPortal. Results: We compare and contrast our Recommender by an exhaustive functional comparison to previously published efforts. We evaluate and discuss the results of several recommendation heuristics in the context of three real-world use cases. The best recommendation heuristics, rated 'very relevant' by expert evaluators, are the ones based on the coverage and connectivity criteria. The Recommender service (alpha version) is available to the community and is embedded into BioPortal.
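
    As a rough illustration of combining the three criteria, the toy ranking below scores ontologies by coverage, connectivity, and size; the weights and ontology statistics are invented, and the actual service derives its scores from the NCBO Annotator rather than a fixed formula.

```python
# Toy ontology recommender: rank by weighted coverage, connectivity, size.
ontologies = {
    # name: (matched_terms, inbound_mappings, n_concepts) -- invented
    "OntologyA": (35, 120, 40000),
    "OntologyB": (28, 300, 9000),
    "OntologyC": (35, 40, 350000),
}

def score(matched, inbound, size, w=(0.6, 0.3, 0.1)):
    """Weighted sum of the three criteria, each crudely rescaled."""
    return w[0] * matched + w[1] * inbound / 10.0 + w[2] * size / 1e4

ranked = sorted(ontologies.items(),
                key=lambda kv: score(*kv[1]), reverse=True)
for name, stats in ranked:
    print(name, round(score(*stats), 1))
```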

  11. Opal web services for biomedical applications.

    Science.gov (United States)

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.

  12. MOLIERE: Automatic Biomedical Hypothesis Generation System.

    Science.gov (United States)

    Sybrandt, Justin; Shtutman, Michael; Safro, Ilya

    2017-08-01

    Hypothesis generation is becoming a crucial time-saving technique which allows biomedical researchers to quickly discover implicit connections between important concepts. Typically, these systems operate on domain-specific fractions of public medical data. MOLIERE, in contrast, utilizes information from over 24.5 million documents. At the heart of our approach lies a multi-modal and multi-relational network of biomedical objects extracted from several heterogeneous datasets from the National Center for Biotechnology Information (NCBI). These objects include but are not limited to scientific papers, keywords, genes, proteins, diseases, and diagnoses. We model hypotheses using Latent Dirichlet Allocation applied on abstracts found near shortest paths discovered within this network, and demonstrate the effectiveness of MOLIERE by performing hypothesis generation on historical data. Our network, implementation, and resulting data are all publicly available for the broad scientific community.
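
    A toy rendition of the described pipeline (a shortest path through a heterogeneous concept network, then a topic model over documents attached to the path) is sketched below with networkx and scikit-learn as stand-ins; the miniature graph and two-document corpus are invented, and MOLIERE's network and LDA configuration are vastly larger.

```python
# Sketch: shortest path between two concepts, then LDA on path documents.
import networkx as nx
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

G = nx.Graph()
G.add_edges_from([("geneX", "paper1"), ("paper1", "proteinY"),
                  ("proteinY", "paper2"), ("paper2", "diseaseZ")])
docs = {"paper1": "geneX regulates proteinY in neurons",
        "paper2": "proteinY level is altered in diseaseZ patients"}

path = nx.shortest_path(G, "geneX", "diseaseZ")
abstracts = [docs[n] for n in path if n in docs]

X = CountVectorizer().fit_transform(abstracts)
lda = LatentDirichletAllocation(n_components=1, random_state=0).fit(X)
print(path)             # concept chain supporting the hypothesis
print(lda.components_)  # term weights of the induced topic
```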

  13. Biomedical Requirements for High Productivity Computing Systems

    Science.gov (United States)

    2005-04-01

    ...differences in heart muscle structure between normal and brittle-boned mice suffering from osteogenesis imperfecta (OI) because of a deficiency in the protein ... reached. In a typical comparative modeling exercise one would use a heuristic algorithm to determine possible sequences of interest, then the Smith ... example exercise, require a description of the cellular events that create demands for oxygen. Having cellular level equations together with ...

  14. Biomedical applications of batteries

    Energy Technology Data Exchange (ETDEWEB)

    Latham, Roger [Faculty of Health and Life Sciences, De Montfort University, The Gateway, Leicester, LE1 9BH (United Kingdom); Linford, Roger [The Research Office, De Montfort University, The Gateway, Leicester, LE1 9BH (United Kingdom); Schlindwein, Walkiria [School of Pharmacy, De Montfort University, The Gateway, Leicester, LE1 9BH (United Kingdom)

    2004-08-31

    An overview is presented of the many ways in which batteries and battery materials are used in medicine and in biomedical studies. These include the use of batteries as power sources for motorised wheelchairs, surgical tools, cardiac pacemakers and defibrillators, dynamic prostheses, sensors and monitors for physiological parameters, neurostimulators, devices for pain relief, and iontophoretic, electroporative and related devices for drug administration. The various types of battery and fuel cell used for this wide range of applications will be considered, together with the potential harmful side effects, including accidental ingestion of batteries and the explosive nature of some of the early cardiac pacemaker battery systems.

  15. Statistics in biomedical research

    Directory of Open Access Journals (Sweden)

    González-Manteiga, Wenceslao

    2007-06-01

    The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.

  16. Biomedical signals and systems

    CERN Document Server

    Tranquillo, Joseph V

    2013-01-01

    Biomedical Signals and Systems is meant to accompany a one-semester undergraduate signals and systems course. It may also serve as a quick-start for graduate students or faculty interested in how signals and systems techniques can be applied to living systems. The biological nature of the examples allows for systems thinking to be applied to electrical, mechanical, fluid, chemical, thermal and even optical systems. Each chapter focuses on a topic from classic signals and systems theory: System block diagrams, mathematical models, transforms, stability, feedback, system response, control, time

  17. Biomedical photonics handbook

    CERN Document Server

    Vo-Dinh, Tuan

    2003-01-01

    1. Biomedical Photonics: A Revolution at the Interface of Science and Technology, T. Vo-Dinh
    PHOTONICS AND TISSUE OPTICS
    2. Optical Properties of Tissues, J. Mobley and T. Vo-Dinh
    3. Light-Tissue Interactions, V.V. Tuchin
    4. Theoretical Models and Algorithms in Optical Diffusion Tomography, S.J. Norton and T. Vo-Dinh
    PHOTONIC DEVICES
    5. Laser Light in Biomedicine and the Life Sciences: From the Present to the Future, V.S. Letokhov
    6. Basic Instrumentation in Photonics, T. Vo-Dinh
    7. Optical Fibers and Waveguides for Medical Applications, I. Gannot and ...

  18. Radiochemicals in biomedical research

    International Nuclear Information System (INIS)

    Evans, E.A.; Oldham, K.G.

    1988-01-01

    This volume describes the role of radiochemicals in biomedical research, as tracers in the development of new drugs, their interaction and function with receptor proteins, with the kinetics of binding of hormone - receptor interactions, and their use in cancer research and clinical oncology. The book also aims to identify future trends in this research, the main objective of which is to provide information leading to improvements in the quality of life, and to give readers a basic understanding of the development of new drugs, how they function in relation to receptor proteins and lead to a better understanding of the diagnosis and treatment of cancers. (author)

  19. IEEE International Symposium on Biomedical Imaging.

    Science.gov (United States)

    2017-01-01

    The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative of the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials and a scientific program composed of plenary talks, invited special sessions, challenges, and oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest, including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.

  20. SCELib2: the new revision of SCELib, the parallel computational library of molecular properties in the single center approach

    Science.gov (United States)

    Sanna, N.; Morelli, G.

    2004-09-01

    In this paper we present the new version of the SCELib program (CPC Catalogue identifier ADMG), a full numerical implementation of the Single Center Expansion (SCE) method. The physics involved is that of producing the SCE description of molecular electronic densities, of molecular electrostatic potentials, and of molecular perturbed potentials due to a point negative or positive charge. This new revision of the program has been optimized to run in serial as well as in parallel execution mode, to support a larger set of molecular symmetries, and to permit the restart of long-lasting calculations. To measure the performance of this new release, a comparative study has been carried out on the most powerful computing architectures in serial and parallel runs. The results of the calculations reported in this paper refer to real-case medium to large molecular systems, and they are reported in full detail to best benchmark the parallel architectures the new SCELib code will run on.
    Program summary:
    Title of program: SCELib2
    Catalogue identifier: ADGU
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADGU
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Reference to previous versions: Comput. Phys. Commun. 128 (2) (2000) 139 (CPC catalogue identifier: ADMG)
    Does the new version supersede the original program?: Yes
    Computer for which the program is designed and others on which it has been tested: HP ES45 and rx2600, SUN ES4500, IBM SP, and any single-CPU workstation based on Alpha, SPARC, POWER, Itanium2, and X86 processors
    Installations: CASPUR, local
    Operating systems under which the program has been tested: HP Tru64 V5.X, SUNOS V5.8, IBM AIX V5.X, Linux RedHat V8.0
    Programming language used: C
    Memory required to execute with typical data: 10 Mwords; up to 2000 Mwords depending on the molecular system and runtime parameters
    No. of bits in a word: 64
    No. of processors used: 1 to 32
    Has the code been vectorized or parallelized?: Yes

  1. Delivering an Informational Hub for Data at the National Center for Computational Toxicology (ACS Spring Meeting) 7 of 7

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  2. Investigating Impact Metrics for Performance for the US EPA National Center for Computational Toxicology (ACS Fall meeting)

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) Computational Toxicology Program integrates advances in biology, chemistry, and computer science to help prioritize chemicals for further research based on potential human health risks. This work involves computational and data drive...

  3. Big biomedical data as the key resource for discovery science.

    Science.gov (United States)

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Evolving technologies drive the new roles of Biomedical Engineering.

    Science.gov (United States)

    Frisch, P H; St Germain, J; Lui, W

    2008-01-01

    Rapidly changing technology, coupled with the financial impact of organized health care, has required hospital Biomedical Engineering organizations to augment their traditional operational and business models to increase their role in developing enhanced clinical applications utilizing new and evolving technologies. The deployment of these technology-based applications has required Biomedical Engineering organizations to re-organize to optimize the manner in which they provide and manage services. Memorial Sloan-Kettering Cancer Center has implemented a strategy to explore evolving technologies, integrating them into enhanced clinical applications while optimally utilizing the expertise of the traditional Biomedical Engineering component (Clinical Engineering) to provide expanded support in technology/equipment management, device repair, preventive maintenance, and integration with legacy clinical systems. Specifically, Biomedical Engineering is an integral component of the Medical Physics Department, which provides comprehensive and integrated support to the Center in advanced physical, technical, and engineering technology. This organizational structure emphasizes the integration and collaboration between a spectrum of technical expertise for clinical support and equipment management roles. The high cost of clinical equipment purchases coupled with the increasing cost of service has driven equipment management responsibilities to include significant business and financial aspects to provide a cost-effective service model. This case study details the dynamics of these expanded roles, future initiatives, and benefits for Biomedical Engineering and Memorial Sloan-Kettering Cancer Center.

  5. Spectroscopic and computational study of a nonheme iron nitrosyl center in a biosynthetic model of nitric oxide reductase.

    Science.gov (United States)

    Chakraborty, Saumen; Reed, Julian; Ross, Matthew; Nilges, Mark J; Petrik, Igor D; Ghosh, Soumya; Hammes-Schiffer, Sharon; Sage, J Timothy; Zhang, Yong; Schulz, Charles E; Lu, Yi

    2014-02-24

    A major barrier to understanding the mechanism of nitric oxide reductases (NORs) is the lack of a selective probe of NO binding to the nonheme FeB center. By replacing the heme in a biosynthetic model of NORs, which structurally and functionally mimics NORs, with isostructural ZnPP, the electronic structure and functional properties of the FeB nitrosyl complex were probed. This approach allowed observation of the first S=3/2 nonheme {FeNO}(7) complex in a protein-based model system of NOR. Detailed spectroscopic and computational studies show that the electronic state of the {FeNO}(7) complex is best described as a high-spin ferrous iron (S=2) antiferromagnetically coupled to an NO radical (S=1/2) [Fe(2+)-NO(.)]. The radical nature of the FeB-bound NO would facilitate N-N bond formation by radical coupling with the heme-bound NO. This finding, therefore, supports the proposed trans mechanism of NO reduction by NORs. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
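
    The reported total spin follows from standard angular-momentum coupling arithmetic: antiferromagnetic alignment of the high-spin ferrous ion and the NO radical gives

```latex
S_{\mathrm{tot}} = \left| S_{\mathrm{Fe}} - S_{\mathrm{NO}} \right|
                 = \left| 2 - \tfrac{1}{2} \right| = \tfrac{3}{2},
```

    consistent with the observed S=3/2 assignment of the {FeNO}(7) complex.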

  6. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around year 2003, because of its limited use in DOE production applications and the immature hardware support for an efficient implementation. Yet in recent years, it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standard organization to make sure OpenMP evolves in a direction aligned with DOE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  7. Handbook on advanced design and manufacturing technologies for biomedical devices

    CERN Document Server

    2013-01-01

    The last decades have seen remarkable advances in computer-aided design, engineering and manufacturing technologies, multi-variable simulation tools, medical imaging, biomimetic design, rapid prototyping, micro and nanomanufacturing methods and information management resources, all of which provide new horizons for the Biomedical Engineering fields and the Medical Device Industry. Handbook on Advanced Design and Manufacturing Technologies for Biomedical Devices covers such topics in depth, with an applied perspective and providing several case studies that help to analyze and understand the key factors of the different stages linked to the development of a novel biomedical device, from the conceptual and design steps, to the prototyping and industrialization phases. Main research challenges and future potentials are also discussed, taking into account relevant social demands and a growing market already exceeding billions of dollars. In time, advanced biomedical devices will decisively change methods and resu...

  8. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2006 to March 31, 2007

    International Nuclear Information System (INIS)

    2008-03-01

    This report provides an overview of the research and development activities of the Center for Computational Science and e-Systems (CCSE), JAEA, in fiscal year 2006 (April 1, 2006 - March 31, 2007). These research and development activities have been performed by the Simulation Technology Research and Development Office and the Computer Science Research and Development Office. The primary results are the development of simulation techniques for a virtual earthquake testbed, an intelligent infrastructure for atomic energy research, computational biology methods to predict the DNA repair function of proteins, and material models for a neutron detection device, crack propagation, and gas bubble formation in nuclear fuel. (author)

  9. Professional Identification for Biomedical Engineers

    Science.gov (United States)

    Long, Francis M.

    1973-01-01

    Discusses four methods of professional identification in biomedical engineering including registration, certification, accreditation, and possible membership qualification of the societies. Indicates that the destiny of the biomedical engineer may be under the control of a new profession, neither the medical nor the engineering. (CC)

  10. Egyptian Journal of Biomedical Sciences

    African Journals Online (AJOL)

    The Egyptian Journal of Biomedical Sciences publishes research in all aspects of the biomedical sciences. Both basic and clinical research papers are welcomed.

  11. African Journal of Biomedical Research

    African Journals Online (AJOL)

    The African Journal of Biomedical Research was founded in 1998 as a joint project ... of the journal led to the formation of a group (Biomedical Communications Group, ...

  12. Mathematics and physics of emerging biomedical imaging

    International Nuclear Information System (INIS)

    1996-01-01

    Although the mathematical sciences were used in a general way for image processing, they were of little importance in biomedical work until the development in the 1970s of computed tomography (CT) for the imaging of x-rays and isotope emission tomography. In the 1980s, MRI eclipsed the other modalities in many ways as the most informative medical imaging methodology. Besides these well-established techniques, computer-based mathematical methods are being explored in applications to other well-known methods, such as ultrasound and electroencephalography, as well as new techniques of optical imaging, impedance tomography, and magnetic source imaging. It is worth pointing out that, while the final images of many of these techniques bear many similarities to each other, the technologies involved in each are completely different and the parameters represented in the images are very different in character as well as in medical usefulness. In each case, rather different mathematical or statistical models are used, with different equations. One common thread is the paradigm of reconstruction from indirect measurements--this is the unifying theme of this report. The imaging methods used in biomedical applications that this report discusses include: (1) x-ray projection imaging; (2) x-ray computed tomography (CT); (3) magnetic resonance imaging (MRI) and magnetic resonance spectroscopy; (4) single photon emission computed tomography (SPECT); (5) positron emission tomography (PET); (6) ultrasonics; (7) electrical source imaging (ESI); (8) electrical impedance tomography (EIT); (9) magnetic source imaging (MSI); and (10) medical optical imaging
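
    The unifying paradigm of reconstruction from indirect measurements can be made concrete with a deliberately tiny linear example; the Python sketch below (illustrative only) recovers a 2x2 "image" from its row and column sums, the same inverse-problem structure that CT solves at vastly larger scale:

    ```python
    import numpy as np

    # Measurement matrix: each row sums a subset of the 4 unknown pixels
    # (the two row sums and two column sums of a 2x2 image).
    A = np.array([[1, 1, 0, 0],   # sum of image row 1
                  [0, 0, 1, 1],   # sum of image row 2
                  [1, 0, 1, 0],   # sum of image column 1
                  [0, 1, 0, 1]])  # sum of image column 2
    x_true = np.array([1.0, 2.0, 3.0, 4.0])
    b = A @ x_true                # the "indirect measurements"

    # A is rank-deficient (rank 3), so least squares returns the minimum-norm
    # reconstruction; for these particular numbers it happens to be exact.
    x_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(x_hat.round(2))         # [1. 2. 3. 4.]
    ```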

  13. The combinatorics computation for Casimir operators of the symplectic Lie algebra and the application for determining the center of the enveloping algebra of a semidirect product

    International Nuclear Information System (INIS)

    Le Van Hop.

    1989-12-01

    A combinatorial computation is used to describe the Casimir operators of the symplectic Lie algebra. This result is applied to determine the center of the enveloping algebra of the semidirect product of the Heisenberg Lie algebra and the symplectic Lie algebra. (author). 10 refs

  14. Biomedical Science Technologists in Lagos Universities: Meeting ...

    African Journals Online (AJOL)

    Biomedical Science Technologists in Lagos Universities: Meeting Modern Standards ... like to see in biomedical science in Nigeria; 5) their knowledge of ten state-of-the-art ... KEY WORDS: biomedical science, state-of-the-art, technical staff ...

  15. Journal of Biomedical Investigation: Editorial Policies

    African Journals Online (AJOL)

    The focus of the Journal of Biomedical Investigation is to promote interdisciplinary research across all biomedical sciences. It publishes ... Business editor – Sam Meludu.

  16. Biomedical informatics and translational medicine

    Directory of Open Access Journals (Sweden)

    Sarkar Indra

    2010-02-01

    Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.

  17. Customization of biomedical terminologies.

    Science.gov (United States)

    Homo, Julien; Dupuch, Laëtitia; Benbrahim, Allel; Grabar, Natalia; Dupuch, Marie

    2012-01-01

    Within the biomedical area, over one hundred terminologies exist and are merged in the Unified Medical Language System Metathesaurus, which contains over 1 million concepts. When such huge terminological resources are available, users must deal with them, and specifically with the irrelevant parts of these terminologies. We propose to exploit seed terms and semantic distance algorithms in order to customize the terminologies and to delimit within them a semantically homogeneous space. An evaluation performed by a medical expert indicates that the proposed approach is relevant for the customization of terminologies and that the extracted terms are mostly relevant to the seeds. It also indicates that different algorithms provide similar or identical results within a given terminology; the differences are due to the terminologies exploited. Special attention must be paid to defining the optimal association between the semantic similarity algorithms and the thresholds specific to a given terminology.
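
    As an illustration of the seed-plus-distance idea (a minimal sketch over an invented toy hierarchy, not the authors' algorithms or thresholds), one can keep only the terms lying within a path-length threshold of a seed term:

    ```python
    import networkx as nx

    # Toy is-a hierarchy standing in for a real terminology (hypothetical data).
    g = nx.Graph([
        ("disease", "cardiovascular disease"),
        ("cardiovascular disease", "myocardial infarction"),
        ("cardiovascular disease", "hypertension"),
        ("disease", "infectious disease"),
        ("infectious disease", "influenza"),
    ])

    def customize(graph, seeds, max_distance):
        """Keep only terms within a path-length threshold of any seed term."""
        keep = set()
        for seed in seeds:
            reachable = nx.single_source_shortest_path_length(
                graph, seed, cutoff=max_distance)
            keep.update(reachable)
        return keep

    # Terms within distance 1 of the seed (set print order may vary):
    print(customize(g, {"cardiovascular disease"}, max_distance=1))
    # {'disease', 'cardiovascular disease', 'myocardial infarction', 'hypertension'}
    ```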

  18. Biomedical applications of nanotechnology.

    Science.gov (United States)

    Ramos, Ana P; Cruz, Marcos A E; Tovani, Camila B; Ciancaglini, Pietro

    2017-04-01

    The ability to investigate substances at the molecular level has boosted the search for materials with outstanding properties for use in medicine. The application of these novel materials has generated the new research field of nanobiotechnology, which plays a central role in disease diagnosis, drug design and delivery, and implants. In this review, we provide an overview of the use of metallic and metal oxide nanoparticles, carbon-nanotubes, liposomes, and nanopatterned flat surfaces for specific biomedical applications. The chemical and physical properties of the surface of these materials allow their use in diagnosis, biosensing and bioimaging devices, drug delivery systems, and bone substitute implants. The toxicology of these particles is also discussed in the light of a new field referred to as nanotoxicology that studies the surface effects emerging from nanostructured materials.

  19. 78 FR 4419 - Center for Scientific Review; Notice of Closed Meetings

    Science.gov (United States)

    2013-01-22

    ...: Center for Scientific Review Special Emphasis Panel, Biomedical Imaging and Engineering Area Review. Date... . Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict: Nanotechnology...

  20. Figure text extraction in biomedical literature.

    Directory of Open Access Journals (Sweden)

    Daehyun Kim

    2011-01-01

    Figures are ubiquitous in biomedical full-text articles, and they represent important biomedical knowledge. However, the sheer volume of biomedical publications has made it necessary to develop computational approaches for accessing figures. Therefore, we are developing the Biomedical Figure Search engine (http://figuresearch.askHERMES.org) to allow bioscientists to access figures efficiently. Since text frequently appears in figures, automatically extracting such text may assist the task of mining information from figures. Little research, however, has been conducted exploring text extraction from biomedical figures. We first evaluated an off-the-shelf Optical Character Recognition (OCR) tool on its ability to extract text from figures appearing in biomedical full-text articles. We then developed a Figure Text Extraction Tool (FigTExT) to improve the performance of the OCR tool for figure text extraction through the use of three innovative components: image preprocessing, character recognition, and text correction. We first developed image preprocessing to enhance image quality and to improve text localization. Then we adapted the off-the-shelf OCR tool to the improved text localization for character recognition. Finally, we developed and evaluated a novel text correction framework by taking advantage of figure-specific lexicons. The evaluation on 382 figures (9,643 figure texts in total) randomly selected from PubMed Central full-text articles shows that FigTExT performed with 84% precision, 98% recall, and 90% F1-score for text localization and with 62.5% precision, 51.0% recall and 56.2% F1-score for figure text extraction. When limiting figure texts to those judged by domain experts to be important content, FigTExT performed with 87.3% precision, 68.8% recall, and 77% F1-score. FigTExT significantly improved the performance of the off-the-shelf OCR tool we used, which on its own performed with 36.6% precision, 19.3% recall, and 25.3% F1-score for
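
    The reported F1-scores are the harmonic mean of precision and recall; the short check below (added for clarity) reproduces the reported numbers:

    ```python
    def f1(precision, recall):
        """Harmonic mean of precision and recall."""
        return 2 * precision * recall / (precision + recall)

    print(round(f1(0.84, 0.98), 3))   # 0.905 -> the reported 90% for text localization
    print(round(f1(0.625, 0.51), 3))  # 0.562 -> the reported 56.2% for text extraction
    ```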

  1. Pleural effusion biomarkers and computed tomography findings in diagnosing malignant pleural mesothelioma: A retrospective study in a single center

    Science.gov (United States)

    Kataoka, Yuki; Ikegaki, Shunkichi; Saito, Emiko; Matsumoto, Hirotaka; Kaku, Sawako; Shimada, Masatoshi; Hirabayashi, Masataka

    2017-01-01

    In this study, we aimed to examine the clinical value of the pleural effusion (PE) biomarkers soluble mesothelin-related peptide (SMRP), cytokeratin 19 fragment (CYFRA 21-1) and carcinoembryonic antigen (CEA), and the utility of combining chest computed tomography (CT) findings with these biomarkers, in diagnosing malignant pleural mesothelioma (MPM). We conducted a retrospective cohort study in a single center. Consecutive patients with undiagnosed pleural effusions who underwent PE analysis between September 2014 and August 2016 were reviewed. This study included 240 patients (32 with MPM and 208 non-MPM). SMRP and the CYFRA 21-1/CEA ratio had a sensitivity and specificity for diagnosing MPM of 56.3% and 86.5%, and 87.5% and 74.0%, respectively. Using receiver operating characteristic (ROC) curve analysis of the ability of these markers to distinguish MPM from all other PE causes, the area under the ROC curve (AUC) for SMRP and the CYFRA 21-1/CEA ratio was 0.804 and 0.874, respectively. The sensitivity and specificity of SMRP combined with the CYFRA 21-1/CEA ratio were 93.8% and 64.9%, respectively. The sensitivity of the combination of SMRP, the CYFRA 21-1/CEA ratio, and the presence of Leung's criteria (a chest CT finding that is suggestive of malignant pleural disease) was 93.8%. In conclusion, the combined PE biomarkers had a high sensitivity for diagnosing MPM, although the addition of chest CT findings did not improve the sensitivity of SMRP combined with the CYFRA 21-1/CEA ratio. The combination of these biomarkers helped to rule out MPM effectively among patients at high risk of MPM, and would be especially valuable for elderly, frail patients who have difficulty undergoing invasive procedures such as thoracoscopy. PMID:28968445
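
    As a rough consistency check (assuming, for illustration only, that the two tests are independent and combined in parallel, i.e. positive if either test is positive), the reported combined performance can be approximated from the individual values:

    ```python
    # Reported individual values for the two PE biomarkers.
    se_smrp, sp_smrp = 0.563, 0.865    # SMRP sensitivity / specificity
    se_ratio, sp_ratio = 0.875, 0.740  # CYFRA 21-1/CEA ratio sensitivity / specificity

    # Parallel ("either positive") combination under an independence assumption:
    se_comb = 1 - (1 - se_smrp) * (1 - se_ratio)  # sensitivity rises
    sp_comb = sp_smrp * sp_ratio                  # specificity falls
    print(f"sensitivity ~ {se_comb:.1%}, specificity ~ {sp_comb:.1%}")
    # sensitivity ~ 94.5%, specificity ~ 64.0% (reported: 93.8% and 64.9%)
    ```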

  2. Citizen Science for Mining the Biomedical Literature

    Directory of Open Access Journals (Sweden)

    Ginger Tsueng

    2016-12-01

    Biomedical literature represents one of the largest and fastest growing collections of unstructured biomedical knowledge. Finding critical information buried in the literature can be challenging. To extract information from free-flowing text, researchers need to: 1. identify the entities in the text (named entity recognition), 2. apply a standardized vocabulary to these entities (normalization), and 3. identify how entities in the text are related to one another (relationship extraction). Researchers have primarily approached these information extraction tasks through manual expert curation and computational methods. We have previously demonstrated that named entity recognition (NER) tasks can be crowdsourced to a group of non-experts via the paid microtask platform Amazon Mechanical Turk (AMT), and can dramatically reduce the cost and increase the throughput of biocuration efforts. However, given the size of the biomedical literature, even information extraction via paid microtask platforms is not scalable. With our web-based application Mark2Cure (http://mark2cure.org), we demonstrate that NER tasks also can be performed by volunteer citizen scientists with high accuracy. We apply metrics from the Zooniverse Matrices of Citizen Science Success and provide the results here to serve as a basis of comparison for other citizen science projects. Further, we discuss design considerations, issues, and the application of analytics for successfully moving a crowdsourcing workflow from a paid microtask platform to a citizen science platform. To our knowledge, this study is the first application of citizen science to a natural language processing task.
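
    The first two pipeline steps (recognition and normalization) can be illustrated with a deliberately tiny dictionary-based sketch; the lexicon here is invented, and real systems, Mark2Cure included, use far richer methods:

    ```python
    # Hypothetical surface-form -> concept-ID lexicon (normalization table).
    LEXICON = {
        "heart attack": "MI",
        "myocardial infarction": "MI",
        "flu": "INFLUENZA",
    }

    def recognize_and_normalize(text):
        """Return (surface form, normalized concept) pairs found in the text."""
        lowered = text.lower()
        return [(surface, concept)
                for surface, concept in LEXICON.items()
                if surface in lowered]

    print(recognize_and_normalize("Aspirin after a heart attack reduces mortality."))
    # [('heart attack', 'MI')]
    ```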

  3. The use of AMS to the biomedical sciences

    International Nuclear Information System (INIS)

    Vogel, J.S.

    1991-04-01

    The Center for Accelerator Mass Spectrometry (AMS) began making AMS measurements in 1989. Biomedical experiments were originally limited by sample preparation techniques, but we expect the number of biomedical samples to increase five-fold. While many of the detailed techniques for making biomedical measurements resemble those used in other fields, biological tracer experiments differ substantially from the observational approaches of earth science investigators. The role of xenobiotics in initiating mutations in cells is of particular interest. One measure of the damage caused to the genetic material is obtained by counting the number of adducts formed by a chemical agent at a given dose. AMS allows direct measurement of the number of adducts through stoichiometric quantification of the 14C label attached to the DNA after exposure to a labelled carcinogen. Other isotopes of interest include tritium, 36Cl, 79Se, 41Ca, 26Al and 129I. Our experiments with low-dose environmental carcinogens reflect the protocols which will become a common part of biomedical AMS. In biomedical experiments, the researcher defines the carbon to be analyzed through dissection and/or chemical purification; thus the sample is "merely" combusted and graphitized at the AMS facility. However, since biomedical samples can have a 14C range of five orders of magnitude, preparation of graphite required construction of a special manifold to prevent cross-contamination. Additionally, a strain of 14C-depleted C57BL/6 mice is being developed to further reduce background in biomedical experiments. AMS has a bright and diverse future in radioisotope tracing. Such work requires a dedicated amalgamation of AMS scientists and biomedical researchers who will redesign experimental protocols to maximize the AMS technique and minimize the danger of catastrophic contamination. 18 refs., 4 figs., 1 tab

  4. Cardiovascular system simulation in biomedical engineering education.

    Science.gov (United States)

    Rideout, V. C.

    1972-01-01

    Describes the use of complex cardiovascular system models, in conjunction with a large hybrid computer, in biomedical engineering courses. A cardiovascular blood pressure-flow model, driving a compartment model for the study of dye transport, was set up on the computer for use as a laboratory exercise by students who did not have the computer experience or skill to easily set up such a simulation, which involves some 27 differential equations running at 'real time' rate. The students were given detailed instructions regarding the model, and were then able to study effects such as those due to septal and valve defects upon the pressure, flow, and dye dilution curves. The success of this experiment in the use of involved models in engineering courses suggests that this type of laboratory exercise might also be considered for use in physiology courses as an adjunct to animal experiments.
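
    A drastically reduced sketch of this kind of simulation (two compartments instead of 27 equations; the flow and volume values are invented for illustration) can be run today with an ordinary ODE solver:

    ```python
    from scipy.integrate import solve_ivp

    Q, V1, V2 = 5.0, 1.0, 3.0  # hypothetical flow (L/min) and compartment volumes (L)

    def dye_transport(t, c):
        """Two-compartment dye exchange driven by a common flow Q."""
        c1, c2 = c
        return [Q / V1 * (c2 - c1),
                Q / V2 * (c1 - c2)]

    # All dye starts in compartment 1; concentrations equilibrate toward
    # total dye / total volume = (1.0 * 1 L) / (1 L + 3 L) = 0.25.
    sol = solve_ivp(dye_transport, (0.0, 5.0), [1.0, 0.0])
    print(sol.y[:, -1])  # both components approach 0.25
    ```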

  5. All India Seminar on Biomedical Engineering 2012

    CERN Document Server

    Bhatele, Mukta

    2013-01-01

    This book is a collection of articles presented by researchers and practitioners, including engineers, biologists, health professionals and informatics/computer scientists, interested in both theoretical advances and applications of information systems, artificial intelligence, signal processing, electronics and other engineering tools in areas related to biology and medicine in the All India Seminar on Biomedical Engineering 2012 (AISOBE 2012), organized by The Institution of Engineers (India), Jabalpur Local Centre, Jabalpur, India during November 3-4, 2012. The content of the book is useful to doctors, engineers, researchers and academicians as well as industry professionals.

  6. 76 FR 21091 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2011-04-14

    ...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...), as amended, (Pub. L. 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the...

  7. New roles & responsibilities of hospital biomedical engineering.

    Science.gov (United States)

    Frisch, P H; Stone, B; Booth, P; Lui, W

    2014-01-01

    Over the last decade the changing healthcare environment has required hospitals, and specifically Biomedical Engineering, to critically evaluate, optimize and adapt their operations. The focus is now on new technologies, changes to the environment of care, support requirements and financial constraints. Memorial Sloan Kettering Cancer Center (MSKCC), an NIH-designated comprehensive cancer center, has been transitioning to an increasingly outpatient care environment. This transition is driving an increase in patient acuity, coupled with the need for added urgency of support and faster response times. New technologies, regulatory requirements and financial constraints have impacted operating budgets and, in some cases, resulted in a reduction in staffing. Specific initiatives, such as the Joint Commission's National Patient Safety Goals and the requirements for an electronic medical record, meaningful use and ICD10, have caused institutions to reevaluate their operations and processes, including requiring Biomedical Engineering to manage new technologies, integrations and changes in the electromagnetic environment while optimizing operational workflow and resource utilization. This paper addresses the new and expanding responsibilities and approach of Biomedical Engineering organizations, specifically at MSKCC; it is suggested that our experience may be a template for other organizations facing similar problems. Increasing support is necessary for Medical Software and Medical Device Data Systems in the evolving wireless environment, including RTLS and RFID. It will be necessary to evaluate the potential impact of the growing electromagnetic environment on connectivity, resulting in the need for dynamic and interactive testing, and to meet the growing demand to establish new operational synergies with Information Technology operations and other operational groups within the institution, such as nursing, facilities management, central supply, and the user departments.

  8. An optimal big data workflow for biomedical image analysis

    Directory of Open Access Journals (Sweden)

    Aurelle Tchagna Kouanou

    Background and objective: In the medical field, data volume is increasingly growing, and traditional methods cannot manage it efficiently. In biomedical computation, the continuous challenges are the management, analysis, and storage of biomedical data. Nowadays, big data technology plays a significant role in the management, organization, and analysis of data, using machine learning and artificial intelligence techniques. It also allows quick access to data using NoSQL databases. Thus, big data technologies include new frameworks to process medical data, in particular biomedical images. It has become very important to develop methods and/or architectures based on big data technologies for the complete processing of biomedical image data. Method: This paper describes big data analytics for biomedical images, shows examples reported in the literature, briefly discusses new methods used in processing, and offers conclusions. We argue for adapting and extending related work methods in the field of big data software, using the Hadoop and Spark frameworks. These provide an optimal and efficient architecture for biomedical image analysis. This paper thus gives a broad overview of big data analytics to automate biomedical image diagnosis. A workflow with optimal methods and algorithms for each step is proposed. Results: Two architectures for image classification are suggested. We use the Hadoop framework to design the first, and the Spark framework for the second. The proposed Spark architecture allows us to develop appropriate and efficient methods to leverage a large number of images for classification, which can be customized with respect to each other. Conclusions: The proposed architectures are more complete, easier to use, and adaptable in all of the steps, from conception onward. The obtained Spark architecture is the most complete, because it facilitates the implementation of algorithms with its embedded libraries. Keywords: Biomedical images, Big
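
    To make the Spark option concrete, here is a minimal, hypothetical PySpark sketch of a classification step (the placeholder two-element vectors stand in for extracted image features; this is not the authors' pipeline):

    ```python
    from pyspark.sql import SparkSession
    from pyspark.ml.classification import LogisticRegression
    from pyspark.ml.linalg import Vectors

    spark = SparkSession.builder.appName("biomedical-image-demo").getOrCreate()

    # Each row stands for one image reduced to a feature vector (invented data).
    train = spark.createDataFrame([
        (Vectors.dense([0.1, 0.9]), 0.0),
        (Vectors.dense([0.8, 0.2]), 1.0),
    ], ["features", "label"])

    model = LogisticRegression(maxIter=10).fit(train)
    model.transform(train).select("features", "prediction").show()
    spark.stop()
    ```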

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. Smart nanomaterials for biomedics.

    Science.gov (United States)

    Choi, Soonmo; Tripathi, Anuj; Singh, Deepti

    2014-10-01

    Nanotechnology has become important in various disciplines of technology and science. It has proven to be a potential candidate for various applications ranging from biosensors to the delivery of genes and therapeutic agents to tissue engineering. Scaffolds for every application can be tailor made to have the appropriate physicochemical properties that will influence the in vivo system in the desired way. For highly sensitive and precise detection of specific signals or pathogenic markers, or for sensing the levels of particular analytes, fabricating target-specific nanomaterials can be very useful. Multi-functional nano-devices can be fabricated using different approaches to achieve multi-directional patterning in a scaffold with the ability to alter topographical cues at scale of less than or equal to 100 nm. Smart nanomaterials are made to understand the surrounding environment and act accordingly by either protecting the drug in hostile conditions or releasing the "payload" at the intended intracellular target site. All of this is achieved by exploiting polymers for their functional groups or incorporating conducting materials into a natural biopolymer to obtain a "smart material" that can be used for detection of circulating tumor cells, detection of differences in the body analytes, or repair of damaged tissue by acting as a cell culture scaffold. Nanotechnology has changed the nature of diagnosis and treatment in the biomedical field, and this review aims to bring together the most recent advances in smart nanomaterials.

  13. Zirconia in biomedical applications.

    Science.gov (United States)

    Chen, Yen-Wei; Moussi, Joelle; Drury, Jeanie L; Wataha, John C

    2016-10-01

    The use of zirconia in medicine and dentistry has rapidly expanded over the past decade, driven by its advantageous physical, biological, esthetic, and corrosion properties. Zirconia orthopedic hip replacements have shown superior wear-resistance over other systems; however, risk of catastrophic fracture remains a concern. In dentistry, zirconia has been widely adopted for endosseous implants, implant abutments, and all-ceramic crowns. Because of an increasing demand for esthetically pleasing dental restorations, zirconia-based ceramic restorations have become one of the dominant restorative choices. Areas covered: This review provides an updated overview of the applications of zirconia in medicine and dentistry with a focus on dental applications. The MEDLINE electronic database (via PubMed) was searched, and relevant original and review articles from 2010 to 2016 were included. Expert commentary: Recent data suggest that zirconia performs favorably in both orthopedic and dental applications, but quality long-term clinical data remain scarce. Concerns about the effects of wear, crystalline degradation, crack propagation, and catastrophic fracture are still debated. The future of zirconia in biomedical applications will depend on the generation of these data to resolve concerns.

  14. Bio-medical CMOS ICs

    CERN Document Server

    Yoo, Hoi-Jun

    2011-01-01

    This book is based on a graduate course entitled Ubiquitous Healthcare Circuits and Systems, given by one of the editors. It includes an introduction to and overview of biomedical ICs and provides information on current research trends.

  15. Functionalized carbon nanotubes: biomedical applications

    Science.gov (United States)

    Vardharajula, Sandhya; Ali, Sk Z; Tiwari, Pooja M; Eroğlu, Erdal; Vig, Komal; Dennis, Vida A; Singh, Shree R

    2012-01-01

    Carbon nanotubes (CNTs) are emerging as novel nanomaterials for various biomedical applications. CNTs can be used to deliver a variety of therapeutic agents, including biomolecules, to the target disease sites. In addition, their unparalleled optical and electrical properties make them excellent candidates for bioimaging and other biomedical applications. However, the high cytotoxicity of CNTs limits their use in humans and many biological systems. The biocompatibility and low cytotoxicity of CNTs are attributed to size, dose, duration, testing systems, and surface functionalization. The functionalization of CNTs improves their solubility and biocompatibility and alters their cellular interaction pathways, resulting in much-reduced cytotoxic effects. Functionalized CNTs are promising novel materials for a variety of biomedical applications. These potential applications are particularly enhanced by their ability to penetrate biological membranes with relatively low cytotoxicity. This review is directed towards the overview of CNTs and their functionalization for biomedical applications with minimal cytotoxicity. PMID:23091380

  16. Molecular Biomedical Imaging Laboratory (MBIL)

    Data.gov (United States)

    Federal Laboratory Consortium — The Molecular Biomedical Imaging Laboratory (MBIL) is adjacent to, and has access to, the Department of Radiology and Imaging Sciences clinical imaging facilities. MBIL...

  17. New Directions for Biomedical Engineering

    Science.gov (United States)

    Plonsey, Robert

    1973-01-01

    Discusses the definition of "biomedical engineering" and the development of educational programs in the field. Includes detailed descriptions of the roles of bioengineers, medical engineers, and chemical engineers. (CC)

  18. Summer Biomedical Engineering Institute 1972

    Science.gov (United States)

    Deloatch, E. M.

    1973-01-01

    The five problems studied for biomedical applications of NASA technology are reported. The studies reported are: design modification of electrophoretic equipment, operating room environment control, hematological viscometry, handling system for iridium, and indirect blood pressure measuring device.

  19. New frontiers in biomedical science and engineering during 2014-2015.

    Science.gov (United States)

    Liu, Feng; Lee, Dong-Hoon; Lagoa, Ricardo; Kumar, Sandeep

    2015-01-01

    The International Conference on Biomedical Engineering and Biotechnology (ICBEB) is an international meeting held once a year. The fourth International Conference on Biomedical Engineering and Biotechnology (ICBEB2015) will be held in Shanghai, China, during August 18th-21st, 2015. This annual conference intends to provide an opportunity for researchers and practitioners at home and abroad to present the most recent frontiers and future challenges in the fields of biomedical science, biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, etc. The papers published in this issue were selected from this conference and witness the advances in biomedical engineering and biotechnology during 2014-2015.

  20. Hydroxyapatite coatings for biomedical applications

    CERN Document Server

    Zhang, Sam

    2013-01-01

    Hydroxyapatite coatings are of great importance in the biological and biomedical coatings fields, especially in the current era of nanotechnology and bioapplications. With a bonelike structure that promotes osseointegration, hydroxyapatite coating can be applied to otherwise bioinactive implants to make their surface bioactive, thus achieving faster healing and recovery. In addition to applications in orthopedic and dental implants, this coating can also be used in drug delivery. Hydroxyapatite Coatings for Biomedical Applications explores developments in the processing and property characteri

  1. John Glenn Biomedical Engineering Consortium

    Science.gov (United States)

    Nall, Marsha

    2004-01-01

    The John Glenn Biomedical Engineering Consortium is an inter-institutional research and technology development program, beginning with ten projects in FY02, that is aimed at applying GRC expertise in fluid physics and sensor development, together with local biomedical expertise, to mitigate the risks of space flight on the health, safety, and performance of astronauts. It is anticipated that several new technologies will be developed that are applicable to both medical needs in space and on earth.

  2. Essential numerical computer methods

    CERN Document Server

    Johnson, Michael L

    2010-01-01

    The use of computers and computational methods has become ubiquitous in biological and biomedical research. During the last two decades most basic algorithms have not changed, but what has is the huge increase in computer speed and ease of use, along with the corresponding orders-of-magnitude decrease in cost. A general perception exists that the only applications of computers and computer methods in biological and biomedical research are either basic statistical analysis or the searching of DNA sequence databases. While these are important applications, they only scratch the surface of the current and potential applications of computers and computer methods in biomedical research. The various chapters within this volume include a wide variety of applications that extend far beyond this limited perception. As part of the Reliable Lab Solutions series, Essential Numerical Computer Methods brings together chapters from volumes 210, 240, 321, 383, 384, 454, and 467 of Methods in Enzymology. These chapters provide ...

  3. Bayes' theorem: A paradigm research tool in biomedical sciences ...

    African Journals Online (AJOL)

    One of the most interesting applications of the results of probability theory involves estimating unknown probabilities and making decisions on the basis of new (sample) information. Biomedical scientists often use Bayesian decision theory for the purpose of computing diagnostic values such as sensitivity and specificity ...
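
    A worked example of the Bayesian computation alluded to above: the positive predictive value, i.e. the posterior probability of disease given a positive test, follows from sensitivity, specificity, and prevalence (the numbers below are illustrative):

    ```python
    def ppv(sensitivity, specificity, prevalence):
        """Posterior P(disease | positive test) via Bayes' theorem."""
        true_pos = sensitivity * prevalence
        false_pos = (1 - specificity) * (1 - prevalence)
        return true_pos / (true_pos + false_pos)

    # A good test applied to a rare condition still yields mostly false positives:
    print(round(ppv(0.95, 0.90, 0.01), 3))  # 0.088
    ```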

  4. Pathophysiologic mechanisms of biomedical nanomaterials

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Liming, E-mail: wangliming@ihep.ac.cn; Chen, Chunying, E-mail: chenchy@nanoctr.cn

    2016-05-15

    Nanomaterials (NMs) have been widely used in biomedical fields, daily consumer products, and even the food industry. It is crucial to understand the safety and biomedical efficacy of NMs. In this review, we summarize recent progress on the physiological and pathological effects of NMs at several levels: the protein-nano interface, NM-subcellular structures, and cell–cell interactions. We focus on the detailed information of nano-bio interactions, especially protein adsorption, intracellular trafficking, biological barriers, and signaling pathways, as well as the associated mechanisms mediated by nanomaterials. We also introduce related analytical methods that are meaningful and helpful for biomedical effect studies in the future. We believe that knowledge about the pathophysiologic effects of NMs is not only significant for the rational design of medical NMs but also helps predict their safety and further improve their applications in the future. - Highlights: • Rapid protein adsorption onto nanomaterials that affects biomedical effects • Nanomaterials and their interaction with biological membranes, intracellular trafficking and specific cellular effects • Nanomaterials and their interaction with biological barriers • The signaling pathways mediated by nanomaterials and related biomedical effects • Novel techniques for studying translocation and biomedical effects of NMs.

  6. Development of an Instrument to Measure Health Center (HC) Personnel's Computer Use, Knowledge and Functionality Demand for HC Computerized Information System in Thailand

    OpenAIRE

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about the socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel’s computer use, basic IT knowledge a...

  7. RPCs in biomedical applications

    Science.gov (United States)

    Belli, G.; De Vecchi, C.; Giroletti, E.; Guida, R.; Musitelli, G.; Nardò, R.; Necchi, M. M.; Pagano, D.; Ratti, S. P.; Sani, G.; Vicini, A.; Vitulo, P.; Viviani, C.

    2006-08-01

    We are studying possible applications of Resistive Plate Chambers (RPCs) in the biomedical domain, such as Positron Emission Tomography (PET). The use of RPCs in PET can provide several improvements over the usual scintillation-based detectors. The most striking features are the extremely good spatial and time resolutions, which can be as low as 50 μm and 25 ps respectively, compared to the much higher intrinsic limits in bulk detectors. Much effort has been made to investigate suitable materials to make RPCs sensitive to 511 keV photons. For this reason, we are studying different types of coating employing high-Z materials with proper electrical resistivity. Later investigations explored the possibility of coating glass electrodes by means of serigraphy techniques, employing oxide-based mixtures with a high density of high-Z materials; the efficiency is strongly dependent on the coating thickness and reaches a maximum for a characteristic value that is a function of the compound (usually a few hundred microns). The most promising mixtures seem to be PbO, Bi2O3 and Tl2O. Preliminary gamma efficiency measurements for a Multigap RPC prototype (MRPC) are presented, as well as simulations using a GEANT4-based framework. The MRPC has 5 gas gaps; their spacings are kept by 0.3 mm diameter nylon fishing line, and the electrodes are made of thin glasses (1 mm for the outer electrodes, 0.15-0.4 mm for the inner ones). The detector is enclosed in a metallic gas-tight box, filled with a C2H2F4 92.5%, SF6 2.5%, C4H10 5% mixture. Different gas mixtures with increasing SF6 percentage are being studied, and results of efficiency as a function of the new mixtures will be presented.

  8. Experimental and Computational Instrumentation for Rotorcraft Noise and Vibration Control Research at the Penn State Rotorcraft Center

    National Research Council Canada - National Science Library

    Smith, Edward

    2001-01-01

    A team of faculty at the Penn State Rotorcraft Center of Excellence has integrated five new facilities into a broad range of research and educational programs focused on rotorcraft noise and vibration control...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  11. 77 FR 33547 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare and Medicaid...

    Science.gov (United States)

    2012-06-06

    ...: Social Security Administration (SSA). ACTION: Notice of a new computer matching program that will expire... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0015] Privacy Act of 1974, as Amended...

  12. 78 FR 69926 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA/Centers for Medicare & Medicaid...

    Science.gov (United States)

    2013-11-21

    ...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L 100-503), amended the... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0059] Privacy Act of 1974, as Amended...

  13. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2004 to March 31, 2005

    International Nuclear Information System (INIS)

    2005-09-01

    This report provides an overview of research and development activities in the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, in the fiscal year 2004 (April 1, 2004 - March 31, 2005). The activities have been performed by the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group in CCSE. The ITBL (Information Technology Based Laboratory) project is performed mainly by the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy. In the mid-term evaluation of the ITBL project conducted by MEXT, the achievement of the ITBL infrastructure software developed by JAERI was recognized as outstanding at the 13th Information Science and Technology Committee of the Subdivision on R and D Planning and Evaluation of the Council for Science and Technology on April 26th, 2004. (author)

  14. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2009 to March 31, 2010

    International Nuclear Information System (INIS)

    2011-10-01

    This report provides an overview of the research and development (R and D) activities in the Center for Computational Science and e-Systems (CCSE) of the Japan Atomic Energy Agency (JAEA) during the fiscal year 2009 (April 1, 2009 - March 31, 2010). The work was accomplished by the Simulation Technology R and D Office and the Computer Science R and D Office in CCSE. The activities include research on a secure computational infrastructure for use in atomic energy research, based on grid technology; seismic response analysis for the structures of nuclear power plants; materials science; and quantum bioinformatics. The materials science research includes large-scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, and large-scale quantum simulations of superconductors for the design of new devices and a fundamental understanding of superconductivity. The quantum bioinformatics research focuses on the development of technology for large-scale atomic simulations of proteins. (author)

  15. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2007 to March 31, 2009

    International Nuclear Information System (INIS)

    2010-01-01

    This report provides an overview of research and development activities in Center for Computational Science and e-Systems (CCSE), JAEA, during the fiscal years 2007 and 2008 (Apr 1, 2007 - March 31, 2009). These research and development activities have been performed by the Simulation Technology R and D Office and Computer Science R and D Office. These activities include development of secure computational infrastructure for atomic energy research based on the grid technology, large scale seismic analysis of an entire nuclear reactor structure, large scale fluid dynamics simulation of J-PARC mercury target, large scale plasma simulation for nuclear fusion reactor, large scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, large scale quantum simulations of superconductor for the design of new devices and fundamental understanding of superconductivity, development of protein database for the identification of radiation-resistance gene, and large scale atomic simulation of proteins. (author)

  16. Biomedical cyclotron facility

    International Nuclear Information System (INIS)

    MacDonald, N.S.; Birdsall, R.; Takahaski, J.; McConnel, L.; Wood, R.; Wakakuwa, S.

    1976-01-01

    During the fifth year of operation the mechanical performance of the cyclotron and accessory equipment was excellent. Major items put into operation were a small computer system interfaced with Ge-Li gamma spectrometer and a pneumatic-tube system for fast delivery of short-lived radionuclides. A table is presented listing the radionuclides produced

  17. Session Introduction: Challenges of Pattern Recognition in Biomedical Data.

    Science.gov (United States)

    Verma, Shefali Setia; Verma, Anurag; Basile, Anna Okula; Bishop, Marta-Byrska; Darabos, Christian

    2018-01-01

    The analysis of large biomedical data often presents various challenges related not just to the size of the data, but also to data quality issues such as heterogeneity, multidimensionality, noisiness, and incompleteness. The data-intensive nature of computational genomics problems in biomedical informatics warrants the development and use of massive computer infrastructure and advanced software tools and platforms, including but not limited to cloud computing. Our session aims to address these challenges in handling big data for designing a study, performing analysis, and interpreting the outcomes of these analyses. These challenges have been prevalent in many studies, including those that focus on the identification of novel genetic variant-phenotype associations using data from sources like Electronic Health Records (EHRs) or multi-omic data. One of the biggest challenges is the imperfect nature of biomedical data, where considerable noise and sparseness are observed. In our session, we will present research articles that can help in identifying innovative ways to recognize and overcome newly arising challenges associated with pattern recognition in biomedical data.

  18. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  19. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing Readiness Challenge (CCRC’08), the CMS computing team achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness of the Tier0, processing a massive number of very large files at a high writing speed to tape. Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested in remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed, and by mid-year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2s are associated with Physics Groups. Such associations are decided twice per ye...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  5. The biomedical discourse relation bank

    Directory of Open Access Journals (Sweden)

    Joshi Aravind

    2011-05-01

    Background: Identification of discourse relations, such as causal and contrastive relations, between situations mentioned in text is an important task for biomedical text-mining. A biomedical text corpus annotated with discourse relations would be very useful for developing and evaluating methods for biomedical discourse processing. However, little effort has been made to develop such an annotated resource. Results: We have developed the Biomedical Discourse Relation Bank (BioDRB), in which we have annotated explicit and implicit discourse relations in 24 open-access full-text biomedical articles from the GENIA corpus. Guidelines for the annotation were adapted from the Penn Discourse TreeBank (PDTB), which has discourse relations annotated over open-domain news articles. We introduced new conventions and modifications to the sense classification. We report reliable inter-annotator agreement of over 80% for all sub-tasks. Experiments for identifying the sense of explicit discourse connectives show the connective itself to be a highly reliable indicator for coarse sense classification (accuracy 90.9% and F1 score 0.89). These results are comparable to results obtained with the same classifier on the PDTB data. With more refined sense classification, there is degradation in performance (accuracy 69.2% and F1 score 0.28), mainly due to sparsity in the data. The size of the corpus was found to be sufficient for identifying the sense of explicit connectives, with classifier performance stabilizing at about 1900 training instances. Finally, the classifier performs poorly when trained on PDTB and tested on BioDRB (accuracy 54.5% and F1 score 0.57). Conclusion: Our work shows that discourse relations can be reliably annotated in biomedical text. Coarse sense disambiguation of explicit connectives can be done with high reliability by using just the connective as a feature, but more refined sense classification requires either richer features or more
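
    As an illustration of the coarse-sense result above, the sketch below trains a classifier that uses only the connective string as its feature. This is a minimal sketch in the spirit of the experiment, not the authors' pipeline; the training pairs and sense labels are hypothetical stand-ins for BioDRB annotations.

        # Sketch: coarse discourse-sense classification from the connective alone.
        # The (connective, sense) pairs below are hypothetical, not BioDRB data.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        train = [
            ("because", "CONTINGENCY"), ("since", "CONTINGENCY"),
            ("but", "COMPARISON"), ("however", "COMPARISON"),
            ("then", "TEMPORAL"), ("afterwards", "TEMPORAL"),
            ("moreover", "EXPANSION"), ("in addition", "EXPANSION"),
        ]
        connectives, senses = zip(*train)

        # Bag-of-words over the connective itself is the entire feature set.
        model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression())
        model.fit(connectives, senses)
        print(model.predict(["however", "because"]))  # ['COMPARISON' 'CONTINGENCY']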

  6. [Master course in biomedical engineering].

    Science.gov (United States)

    Jobbágy, Akos; Benyó, Zoltán; Monos, Emil

    2009-11-22

    The Bologna Declaration aims at harmonizing the European higher education structure. In accordance with the Declaration, biomedical engineering will be offered as a master (MSc) course in Hungary as well, from 2009. Since 1995 a biomedical engineering course has been run in cooperation between three universities: Semmelweis University, the Budapest Veterinary University, and the Budapest University of Technology and Economics. One of the latter's faculties, the Faculty of Electrical Engineering and Informatics, has been responsible for the course. Students could start their biomedical engineering studies, usually in parallel with their first degree course, after they had collected at least 180 ECTS credits. Consequently, the biomedical engineering course could be considered a master course even before the Bologna Declaration. Students had to collect 130 ECTS credits during the six-semester course. This is equivalent to four semesters of full-time study, because during the first three semesters the curriculum required only one third of the usual ECTS credits. The paper gives a survey of the new biomedical engineering master course, briefly summing up the subjects in the curriculum.

  7. Autonomous Micro-Modular Mobile Data Center Cloud Computing Study for Modeling, Simulation, Information Processing and Cyber-Security Viability

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing environments offer opportunities for malicious users to penetrate security layers and damage, destroy or steal data. This ability can be exploited to...

  8. A robust approach to extract biomedical events from literature.

    Science.gov (United States)

    Bui, Quoc-Chinh; Sloot, Peter M A

    2012-10-15

    The abundance of biomedical literature has attracted significant interest in novel methods to automatically extract biomedical relations from the literature. Until recently, most research was focused on extracting binary relations such as protein-protein interactions and drug-disease relations. However, these binary relations cannot fully represent the original biomedical data. Therefore, there is a need for methods that can extract fine-grained and complex relations known as biomedical events. In this article we propose a novel method to extract biomedical events from text. Our method consists of two phases. In the first phase, training data are mapped into structured representations. Based on that, templates are used to extract rules automatically. In the second phase, extraction methods are developed to process the obtained rules. When evaluated against the Genia event extraction abstract and full-text test datasets (Task 1), we obtain results with F-scores of 52.34 and 53.34, respectively, which are comparable to the state-of-the-art systems. Furthermore, our system achieves superior performance in terms of computational efficiency. Our source code is available for academic use at http://dl.dropbox.com/u/10256952/BioEvent.zip.
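
    The two-phase design described above lends itself to a compact illustration. The sketch below hand-codes a tiny trigger lexicon and a nearest-protein template; the published system learns its rules from training data, so the lexicon, sentence and protein list here are purely hypothetical.

        # Sketch: rule/template-based biomedical event extraction.
        # TRIGGERS and the example sentence are hypothetical illustrations.
        import re

        TRIGGERS = {
            "phosphorylation": "Phosphorylation",
            "expression": "Gene_expression",
            "binding": "Binding",
        }

        def extract_events(sentence, proteins):
            events = []
            for trigger, event_type in TRIGGERS.items():
                for m in re.finditer(r"\b%s\b" % trigger, sentence, re.IGNORECASE):
                    # Template: the protein mention closest to the trigger is the theme.
                    theme = min(proteins, key=lambda p: abs(sentence.find(p) - m.start()))
                    events.append({"type": event_type, "trigger": trigger, "theme": theme})
            return events

        sentence = "Phosphorylation of TRAF2 inhibits binding to the CD40 domain."
        print(extract_events(sentence, ["TRAF2", "CD40"]))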

  9. Innovations in Biomedical Engineering 2016

    CERN Document Server

    Tkacz, Ewaryst; Paszenda, Zbigniew; Piętka, Ewa

    2017-01-01

    This book presents the proceedings of the “Innovations in Biomedical Engineering IBE’2016” Conference held on October 16–18, 2016 in Poland, discussing recent research on innovations in biomedical engineering. The past decade has seen the dynamic development of more and more sophisticated technologies, including biotechnologies, and more general technologies applied in the area of life sciences. As such the book covers the broadest possible spectrum of subjects related to biomedical engineering innovations. Divided into four parts, it presents state-of-the-art achievements in: • engineering of biomaterials, • modelling and simulations in biomechanics, • informatics in medicine • signal analysis The book helps bridge the gap between technological and methodological engineering achievements on the one hand and clinical requirements in the three major areas diagnosis, therapy and rehabilitation on the other.

  10. Modeling and control in the biomedical sciences

    CERN Document Server

    Banks, H T

    1975-01-01

    These notes are based on (i) a series of lectures that I gave at the 14th Biennial Seminar of the Canadian Mathematical Congress, held at the University of Western Ontario, August 12-24, 1973, and (ii) some of my lectures in a modeling course that I have co-taught in the Division of Bio-Medical Sciences at Brown during the past several years. An earlier version of these notes appeared in the Center for Dynamical Systems Lecture Notes series (CDS LN 73-1, November 1973). In this revised and extended version of those earlier notes I have incorporated a number of changes, based both on classroom experience and on my research efforts with several colleagues during the intervening period. The narrow viewpoint of the present notes (use of optimization and control theory in biomedical problems) reflects more the scope of the CMC lectures given in August 1973 than the scope of my own interests. Indeed, my real interests have included the modeling process itself as well as the contributions made by investigators who e...

  11. Contamination control training for biomedical facilities

    International Nuclear Information System (INIS)

    Trinoskey, P.A.

    1994-10-01

    In 1991, a contamination control course was developed for the Biology and Biotechnology Research Program (BBRP) at the Lawrence Livermore National Laboratory (LLNL). This course was based on the developer's experience in Radiation Safety at the University of Utah and the University of Kansas Medical Center. The course has been well received at LLNL because it addresses issues that are important to individuals handling small quantities of radioactive materials, a group of users that is often overlooked. They are typically very well educated and are expected to "know" what they should do, yet many of these individuals are not initially comfortable working with radioactive materials. They appreciate the opportunity to be introduced to contamination control techniques and to discuss issues they may have. In addition, the authors benefit from the experience that researchers bring from other facilities. The training course will address the specific radiological training requirements for chemists, biologists, and medical researchers who use small amounts of dispersible radionuclides in tabletop experiments and will not be exposed to other radiation sources. The training will include: the potential hazards of typical radionuclides, contamination control procedures, and guidance for developing and including site-specific information. The training course will eliminate the need for Radiological Worker II training for the bio-medical researchers who are its target audience.

  12. Biomedical Use of Aerospace Personal Cooling Garments

    Science.gov (United States)

    Webbon, Bruce W.; Montgomery, Leslie D.; Callaway, Robert K.

    1994-01-01

    Personal thermoregulatory systems are required during extravehicular activity (EVA) to remove the metabolic heat generated by the suited astronaut. The Extravehicular and Protective Systems (STE) Branch of NASA Ames Research Center has developed advanced concepts for liquid cooling garments for both industrial and biomedical applications over the past 25 years. Examples of this work include: (1) liquid-cooled helmets for helicopter pilots and race car drivers; (2) vests for fire and mine rescue personnel; (3) bras to increase the definition of tumors during thermography; (4) lower-body garments for young women with erythromelalgia; and (5) whole-body garments used by patients with multiple sclerosis (MS). The benefits of the biomedical application of artificial thermoregulation received national attention through two recent events: (1) the liquid-cooled garment technology was inducted into the United States Space Foundation's Space Technology Hall of Fame (1993); and (2) NASA signed a joint Memorandum of Understanding with the Multiple Sclerosis Association (1994) to share this technology for use in MS patient treatment. The STE Branch is currently pursuing a program to refine thermoregulatory design in light of recent technology developments that might be applicable for use by several medical patient populations. Projects have been initiated to apply thermoregulatory technology to the treatment and/or rehabilitation of patients with spinal cord injuries, multiple sclerosis and migraine headaches, and to help prevent the loss of hair during chemotherapy.

  13. Advances in biomedical dosimetry

    International Nuclear Information System (INIS)

    1981-01-01

    Full text: Radiation dosimetry, the accurate determination of the absorbed dose within an irradiated body or a piece of material, is a prerequisite for all applications of ionizing radiation. This has been known since the very first radiation applications in medicine and biology, and increasing efforts are being made by radiation researchers to develop more reliable, effective and safe instruments, and to further improve dosimetric accuracy for all types of radiation used. Development of new techniques and instrumentation was particularly fast in the field of both medical diagnostic and therapeutic radiology. Thus, in Paris in October the IAEA held the latest symposium in its continuing series on dosimetry in medicine and biology. The last one was held in Vienna in 1975. High-quality dosimetry is obviously of great importance for human health, whether the objectives lie in the prevention and control of risks associated with the nuclear industry, in medical uses of radioactive substances or X-ray beams for diagnostic purposes, or in the application of photon, electron or neutron beams in radiotherapy. The symposium dealt with the following subjects: General aspects of dosimetry; Special physical and biomedical aspects; Determination of absorbed dose; Standardization and calibration of dosimetric systems; and Development of dosimetric systems. The forty or so papers presented and the discussions that followed them brought out a certain number of dominant themes, among which three deserve particular mention. - The recent generalization of the International System of Units having prompted a fundamental reassessment of the dosimetric quantities to be considered in calibrating measuring instruments, various proposals were advanced by the representatives of national metrology laboratories to replace the quantity 'exposure' (SI unit = coulomb/kg) by 'Kerma' or 'absorbed dose' (unit joule/kg, the special name of which is 'gray'), this latter being closer to the practical

  14. Finding the Cell Center by a Balance of Dynein and Myosin Pulling and Microtubule Pushing: A Computational Study

    Science.gov (United States)

    Zhu, Jie; Burakov, Anton; Rodionov, Vladimir

    2010-01-01

    The centrosome position in many types of interphase cells is actively maintained in the cell center. Our previous work indicated that the centrosome is kept at the center by pulling force generated by dynein and actin flow produced by myosin contraction and that an unidentified factor that depends on microtubule dynamics destabilizes position of the centrosome. Here, we use modeling to simulate the centrosome positioning based on the idea that the balance of three forces—dyneins pulling along microtubule length, myosin-powered centripetal drag, and microtubules pushing on organelles—is responsible for the centrosome displacement. By comparing numerical predictions with centrosome behavior in wild-type and perturbed interphase cells, we rule out several plausible hypotheses about the nature of the microtubule-based force. We conclude that strong dynein- and weaker myosin-generated forces pull the microtubules inward competing with microtubule plus-ends pushing the microtubule aster outward and that the balance of these forces positions the centrosome at the cell center. The model also predicts that kinesin action could be another outward-pushing force. Simulations demonstrate that the force-balance centering mechanism is robust yet versatile. We use the experimental observations to reverse engineer the characteristic forces and centrosome mobility. PMID:20980619
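
    A one-dimensional caricature of the described force balance can make the centering condition explicit. The linear force laws and stiffness symbols below are illustrative assumptions; the paper's actual model is a detailed simulation of discrete microtubules.

        % Sketch only: linearized 1-D balance on the centrosome,
        % x = displacement from the cell centre (assumed force laws).
        F(x) = -\left(k_{\mathrm{dynein}} + k_{\mathrm{myosin}}\right)x + k_{\mathrm{push}}\,x

    In this caricature, the center x = 0 is a stable fixed point whenever k_dynein + k_myosin > k_push, echoing the conclusion that inward pulling must dominate outward pushing for robust centering.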

  15. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Use of rotary Nickel-Titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but these instruments have undergone dramatic modifications in order to achieve improved shaping ability. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare the canal centering ability and transportation of the Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed with NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using the t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in any section. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained the original canal geometry.
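
    The record does not spell out its formulas, but canal transportation and centering ratio in CBCT studies are commonly computed from pre- and post-instrumentation dentine thicknesses (Gambill-style definitions). The sketch below uses those common definitions with hypothetical measurements; the paper's exact procedure may differ.

        # Sketch: commonly used CBCT transportation / centering-ratio formulas.
        # a1, b1: pre-instrumentation distances from canal wall to the two root
        # surfaces; a2, b2: the same distances after preparation (hypothetical).

        def transportation(a1, a2, b1, b2):
            # 0 means no transportation; the sign gives the deviation direction.
            return (a1 - a2) - (b1 - b2)

        def centering_ratio(a1, a2, b1, b2):
            # 1 means a perfectly centered preparation (min shift over max shift).
            da, db = a1 - a2, b1 - b2
            return 1.0 if max(da, db) == 0 else min(da, db) / max(da, db)

        print(transportation(1.10, 0.85, 0.95, 0.80))   # 0.10 mm
        print(centering_ratio(1.10, 0.85, 0.95, 0.80))  # 0.60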

  16. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  17. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08 (CCRC'08). CCRC'08 is being run by the WLCG collaboration in two phases, involving the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives, as well as network links, are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the role of a Computing Run Coordinator is being defined, together with regular computing shifts that monitor the services and infrastructure and interface to the data operations tasks. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  20. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format; and the samples were run through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completed and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months.   Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  2. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  3. 4th International Conference on Biomedical Engineering in Vietnam

    CERN Document Server

    Toan, Nguyen; Khoa, Truong; Phuong, Tran; Development of Biomedical Engineering

    2013-01-01

    This volume presents the proceedings of the Fourth International Conference on the Development of Biomedical Engineering in Vietnam, which was held in Ho Chi Minh City as a mega-conference. It is kicked off by the Regenerative Medicine Conference with the theme “BUILDING A FACE USING A REGENERATIVE MEDICINE APPROACH”, endorsed mainly by the Tissue Engineering and Regenerative Medicine International Society (TERMIS). It is followed by the Computational Medicine Conference, endorsed mainly by the Computational Surgery International Network (COSINE) and the Computational Molecular Medicine of the German National Funding Agency; and the General Biomedical Engineering Conference, endorsed mainly by the International Federation for Medical and Biological Engineering (IFMBE). It featured the contributions of 435 scientists from 30 countries, including: Australia, Austria, Belgium, Canada, China, Finland, France, Germany, Hungary, India, Iran, Italy, Japan, Jordan, Korea, Malaysia, Netherlands, Pakistan, Poland, Ru...

  4. Biomedical Imaging Principles and Applications

    CERN Document Server

    Salzer, Reiner

    2012-01-01

    This book presents and describes imaging technologies that can be used to study chemical processes and structural interactions in dynamic systems, principally in biomedical systems. The imaging technologies, largely biomedical imaging technologies such as MRT, fluorescence mapping, Raman mapping, nanoESCA and CARS microscopy, have been selected according to their application range and to the chemical information content of their data. These technologies allow for the analysis and evaluation of delicate biological samples, which must not be disturbed during the process. Ultimately, this may me

  5. Biomedical applications of magnetic particles

    CERN Document Server

    Mefford, Thompson

    2018-01-01

    Magnetic particles are increasingly being used in a wide variety of biomedical applications. Written by a team of internationally respected experts, this book provides an up-to-date authoritative reference for scientists and engineers. The first section presents the fundamentals of the field by explaining the theory of magnetism, describing techniques to synthesize magnetic particles, and detailing methods to characterize magnetic particles. The second section describes biomedical applications, including chemical sensors and cellular actuators, and diagnostic applications such as drug delivery, hyperthermia cancer treatment, and magnetic resonance imaging contrast.

  6. About TTC | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    The TTC facilitates licensing and co-development partnerships between biomedical industry, academia, and government agencies and the research laboratories of the NCI and nine other institutes and centers of NIH.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also installed at CERN, adding to the GlideInWMS factory in the US. There is new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  8. Localization and Tracking of Implantable Biomedical Sensors

    Directory of Open Access Journals (Sweden)

    Ilknur Umay

    2017-03-01

    Implantable sensor systems are effective tools for biomedical diagnosis, visualization and treatment of various health conditions, attracting the interest of researchers as well as healthcare practitioners. These systems efficiently and conveniently provide essential data on the body part being diagnosed, such as gastrointestinal parameter values (temperature, pH, pressure), blood glucose and pressure levels, and electrocardiogram data. Such data are first transmitted from the implantable sensor units to an external receiver node or network, and then to a central monitoring and control (computer) unit for analysis, diagnosis and/or treatment. Implantable sensor units are typically in the form of mobile microrobotic capsules or implanted stationary (body-fixed) units. In particular, capsule-based systems have attracted significant research interest recently, with a variety of applications, including endoscopy, microsurgery, drug delivery and biopsy. In such implantable sensor systems, one of the most challenging problems is the accurate localization and tracking of the microrobotic sensor unit (e.g., a robotic capsule) inside the human body. This article presents a literature review of the existing localization and tracking techniques for robotic implantable sensor systems, together with their merits, limitations and possible solutions for the proposed localization methods. The article also provides a brief discussion of the connection and cooperation of such techniques with wearable biomedical sensor systems.
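
    Among the localization techniques such reviews cover, received-signal-strength (RSSI) ranging plus trilateration is one of the simplest. The sketch below converts RSSI to range with a log-distance path-loss model and solves a linearized least-squares problem; the anchor positions, path-loss exponent and RSSI readings are all hypothetical.

        # Sketch: RSSI-based trilateration of an in-body sensor (all values assumed).
        import numpy as np

        def rssi_to_range(rssi_dbm, rssi_at_1m=-45.0, path_loss_exp=4.0):
            # Log-distance path-loss model; the exponent for body tissue is assumed.
            return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exp))

        def trilaterate(anchors, ranges):
            # Subtract the first sphere equation from the rest to linearize.
            a0, r0 = anchors[0], ranges[0]
            A = 2 * (anchors[1:] - a0)
            b = r0**2 - ranges[1:]**2 + np.sum(anchors[1:]**2 - a0**2, axis=1)
            pos, *_ = np.linalg.lstsq(A, b, rcond=None)
            return pos

        anchors = np.array([[0.0, 0.0, 0.0], [0.4, 0.0, 0.0],
                            [0.0, 0.4, 0.0], [0.0, 0.0, 0.3]])  # metres
        ranges = np.array([rssi_to_range(r) for r in (-60.0, -58.0, -62.0, -55.0)])
        print(trilaterate(anchors, ranges))  # estimated (x, y, z) in metres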

  9. Canal transportation and centering ability of protaper and self-adjusting file system in long oval canals: An ex-vivo cone-beam computed tomography analysis.

    Science.gov (United States)

    Shah, Dipali Yogesh; Wadekar, Swati Ishwara; Dadpe, Ashwini Manish; Jadhav, Ganesh Ranganath; Choudhary, Lalit Jayant; Kalra, Dheeraj Deepak

    2017-01-01

    The purpose of this study was to compare and evaluate the shaping ability of the ProTaper (PT) and Self-Adjusting File (SAF) systems using cone-beam computed tomography (CBCT) to assess their performance in oval-shaped root canals. Sixty-two mandibular premolars with single oval canals were divided into two experimental groups (n = 31) according to the systems used: Group I - PT and Group II - SAF. Canals were evaluated before and after instrumentation using CBCT to assess centering ratio and canal transportation at three levels. Data were statistically analyzed using one-way analysis of variance, post hoc Tukey's test, and the t-test. The SAF showed better centering ability and less canal transportation than the PT only in the buccolingual plane at the 6 and 9 mm levels. The shaping ability of the PT was best in the apical third in both planes. The SAF had significantly better centering and less canal transportation in the buccolingual as compared to the mesiodistal plane at the middle and coronal levels. The SAF produced significantly less transportation and remained more centered than the PT at the middle and coronal levels in the buccolingual plane of oval canals. In the mesiodistal plane, the performance of the two systems was comparable.

  10. National Biomedical Tracer Facility. Project definition study

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, R.

    1995-02-14

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H⁻, H⁺, and D⁺). The proposed NBTF facility includes an 80 MeV, 1 mA H⁻ cyclotron that will produce proton-induced (neutron-deficient) research isotopes.

  11. Poland: biomedical ethics in a socialist state.

    Science.gov (United States)

    Szawarski, Zbigniew

    1987-06-01

    In one of a Hastings Center Report series of four country reports, a professor of ethics discusses the Polish approach to ethical issues in health care. Szawarski begins by outlining five factors that influence the practice of medicine in Poland: a socialist form of government, the influence of the Roman Catholic Church, an ongoing economic crisis, the legacy of the Nazi death camps, and a lack of formal instruction in biomedical ethics. He then discusses three current ethical concerns of physicians, patients, and the public: regulation of physician conduct, abortion, and in vitro fertilization. There is little formal public debate of the issues, however, and physicians seem committed to upholding traditional medical codes of ethics without analyzing underlying moral principles and justifications.

  12. National Biomedical Tracer Facility. Project definition study

    International Nuclear Information System (INIS)

    Schafer, R.

    1995-01-01

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H⁻, H⁺, and D⁺). The proposed NBTF facility includes an 80 MeV, 1 mA H⁻ cyclotron that will produce proton-induced (neutron-deficient) research isotopes.

  13. Pricing the Services of the Computer Center at the Catholic University of Louvain. Program on Institutional Management in Higher Education.

    Science.gov (United States)

    Hecquet, Ignace; And Others

    Principles are outlined that are used as a basis for the system of pricing the services of the Computer Centre. The system illustrates the use of a management method to secure better utilization of university resources. Departments decide how to use the appropriations granted to them and establish a system of internal prices that reflect the cost…

  14. Full text clustering and relationship network analysis of biomedical publications.

    Directory of Open Access Journals (Sweden)

    Renchu Guan

    Rapid developments in the biomedical sciences have increased the demand for automatic clustering of biomedical publications. In contrast to current approaches to text clustering, which focus exclusively on the contents of abstracts, a novel method is proposed for clustering and analysis of complete biomedical article texts. To reduce dimensionality, the Cosine Coefficient is used on a sub-space of only two vectors, instead of computing the Euclidean distance within the space of all vectors. Then a strategy and algorithm are introduced for Semi-supervised Affinity Propagation (SSAP) to improve analysis efficiency, using biomedical journal names as an evaluation background. Experimental results show that by avoiding high-dimensional sparse matrix computations, SSAP outperforms conventional k-means methods and improves upon the standard Affinity Propagation algorithm. In constructing a directed relationship network and distribution matrix for the clustering results, it can be noted that overlaps in scope and interests among BioMed publications can be easily identified, providing a valuable analytical tool for editors, authors and readers.

  15. Full text clustering and relationship network analysis of biomedical publications.

    Science.gov (United States)

    Guan, Renchu; Yang, Chen; Marchese, Maurizio; Liang, Yanchun; Shi, Xiaohu

    2014-01-01

    Rapid developments in the biomedical sciences have increased the demand for automatic clustering of biomedical publications. In contrast to current approaches to text clustering, which focus exclusively on the contents of abstracts, a novel method is proposed for clustering and analysis of complete biomedical article texts. To reduce dimensionality, Cosine Coefficient is used on a sub-space of only two vectors, instead of computing the Euclidean distance within the space of all vectors. Then a strategy and algorithm is introduced for Semi-supervised Affinity Propagation (SSAP) to improve analysis efficiency, using biomedical journal names as an evaluation background. Experimental results show that by avoiding high-dimensional sparse matrix computations, SSAP outperforms conventional k-means methods and improves upon the standard Affinity Propagation algorithm. In constructing a directed relationship network and distribution matrix for the clustering results, it can be noted that overlaps in scope and interests among BioMed publications can be easily identified, providing a valuable analytical tool for editors, authors and readers.
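
    A minimal version of the pipeline the two records above describe is sketched below: tf-idf vectors, cosine similarities, and Affinity Propagation on the precomputed affinity matrix. The semi-supervision from journal names that distinguishes SSAP is omitted, and the document snippets are hypothetical.

        # Sketch: Affinity Propagation over cosine similarities (no semi-supervision).
        from sklearn.cluster import AffinityPropagation
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        docs = [  # hypothetical article snippets
            "protein binding affinity in tumour cells",
            "kinase phosphorylation signalling cascade",
            "randomised clinical trial of hypertension drugs",
            "blood pressure outcomes in a drug trial",
        ]

        tfidf = TfidfVectorizer().fit_transform(docs)
        similarity = cosine_similarity(tfidf)  # affinities in [0, 1]

        ap = AffinityPropagation(affinity="precomputed", random_state=0)
        print(ap.fit_predict(similarity))  # e.g. [0 0 1 1]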

  16. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  17. Biomedical nanomaterials from design to implementation

    CERN Document Server

    Webster, Thomas

    2016-01-01

    Biomedical Nanomaterials brings together the engineering applications and challenges of using nanostructured surfaces and nanomaterials in healthcare in a single source. Each chapter covers important and new information in the biomedical applications of nanomaterials.

  18. Archives: Journal of Medical and Biomedical Sciences

    African Journals Online (AJOL)


  19. Archives: Journal of Medicine and Biomedical Research

    African Journals Online (AJOL)


  20. Development of an instrument to measure health center (HC) personnel's computer use, knowledge and functionality demand for HC computerized information system in Thailand.

    Science.gov (United States)

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about the socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thai health administration anticipates using health IT to support the recently implemented national universal health care system. However, knowledge of the socio-technical aspects of health IT has not been studied at the national level in Thailand. A survey instrument measuring Thai health center (HC) personnel's computer use, basic IT knowledge and HC computerized information system functionality needs was developed. The instrument shows acceptable test-retest reliability and reasonable internal consistency of the measures. A future nation-wide demonstration study will benefit from this work.
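
    The reported internal consistency is conventionally quantified with Cronbach's alpha; the sketch below computes it from a respondents-by-items score matrix. The Likert-style responses are hypothetical, not the Thai HC survey data.

        # Sketch: Cronbach's alpha for internal consistency (hypothetical data).
        import numpy as np

        def cronbach_alpha(scores):
            # rows = respondents, columns = questionnaire items
            scores = np.asarray(scores, dtype=float)
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        responses = [[4, 5, 4, 3], [3, 4, 4, 3], [5, 5, 4, 4],
                     [2, 3, 3, 2], [4, 4, 5, 4]]
        print(round(cronbach_alpha(responses), 3))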

  1. Integrated Biomaterials for Biomedical Technology

    CERN Document Server

    Ramalingam, Murugan; Ramakrishna, Seeram; Kobayashi, Hisatoshi

    2012-01-01

    This cutting-edge book provides all the important aspects dealing with the basic science involved in materials in biomedical technology, especially structure and properties, techniques and technological innovations in material processing and characterization, as well as the applications. The volume consists of 12 chapters written by acknowledged experts of the biomaterials field and covers a wide range of topics and applications.

  2. Biomedical Engineering Education in Perspective

    Science.gov (United States)

    Gowen, Richard J.

    1973-01-01

    Discusses recent developments in the health care industry and their impact on the future of biomedical engineering education. Indicates that a more thorough understanding of the complex functions of the living organism can be acquired through the application of engineering techniques to problems of life sciences. (CC)

  3. Statistics in three biomedical journals

    Czech Academy of Sciences Publication Activity Database

    Pilčík, Tomáš

    2003-01-01

    Roč. 52, č. 1 (2003), s. 39-43 ISSN 0862-8408 R&D Projects: GA ČR GA310/03/1381 Grant - others:Howard Hughes Medical Institute(US) HHMI55000323 Institutional research plan: CEZ:AV0Z5052915 Keywords : statistics * usage * biomedical journals Subject RIV: EC - Immunology Impact factor: 0.939, year: 2003

  4. Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)

    Energy Technology Data Exchange (ETDEWEB)

    Turteltaub, K. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bench, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Buchholz, B. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Enright, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kulp, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCartt, A. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Malfatti, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ognibene, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Loots, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stewart, B. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-08

    The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high-sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC-AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes, and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: (1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates; specifically, develop and validate methods for macromolecule labeling, characterization and quantitation; (2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies; specifically, demonstrate robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; and (3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory

  5. Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)

    Energy Technology Data Exchange (ETDEWEB)

    Tuerteltaub, K. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bench, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Buchholz, B. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Enright, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kulp, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Loots, G. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCartt, A. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Malfatti, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ognibene, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stewart, B. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-21

    The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high-sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC-AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes, and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: (1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates; specifically, develop and validate methods for macromolecule labeling, characterization and quantitation; (2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies; specifically, demonstrate robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; and (3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a four-fold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  7. An enhanced approach for biomedical image restoration using image fusion techniques

    Science.gov (United States)

    Karam, Ghada Sabah; Abbas, Fatma Ismail; Abood, Ziad M.; Kadhim, Kadhim K.; Karam, Nada S.

    2018-05-01

    Biomedical images are generally noisy and slightly blurred due to the physical mechanisms of the acquisition process, so common degradations in biomedical images are noise and poor contrast. The idea of biomedical image enhancement is to improve the quality of the image for early diagnosis. In this paper we use Wavelet Transformation to remove Gaussian noise from biomedical images: a Positron Emission Tomography (PET) image and a Radiography (Radio) image, in different color spaces (RGB, HSV, YCbCr), and we perform the fusion of the denoised images resulting from the above denoising techniques using an image-addition method. Then quantitative performance metrics such as signal-to-noise ratio (SNR), peak signal-to-noise ratio (PSNR) and Mean Square Error (MSE) are computed; these statistical measurements help in the assessment of fidelity and image quality. The results showed that our approach can be applied across these image types and color spaces for biomedical images.
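
    The denoise-then-score pipeline described above can be sketched with PyWavelets and the standard MSE/PSNR definitions. The synthetic image, wavelet choice and threshold below are assumptions, not the paper's settings.

        # Sketch: wavelet soft-threshold denoising plus PSNR scoring (assumed params).
        import numpy as np
        import pywt

        def wavelet_denoise(img, wavelet="db4", level=2, threshold=10.0):
            coeffs = pywt.wavedec2(img, wavelet, level=level)
            # Soft-threshold the detail coefficients; keep the approximation band.
            denoised = [coeffs[0]] + [
                tuple(pywt.threshold(d, threshold, mode="soft") for d in detail)
                for detail in coeffs[1:]
            ]
            return pywt.waverec2(denoised, wavelet)

        def psnr(clean, test, max_val=255.0):
            mse = np.mean((clean.astype(float) - test.astype(float)) ** 2)
            return 10 * np.log10(max_val**2 / mse)

        rng = np.random.default_rng(0)
        clean = np.tile(np.linspace(0, 255, 128), (128, 1))   # synthetic gradient
        noisy = clean + rng.normal(0, 20, clean.shape)        # Gaussian noise
        restored = wavelet_denoise(noisy)[:128, :128]
        print(f"noisy {psnr(clean, noisy):.1f} dB -> restored {psnr(clean, restored):.1f} dB")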

  8. Design reality gap issues within an ICT4D project:an assessment of Jigawa State Community Computer Center

    OpenAIRE

    Kanya, Rislana Abdulazeez; Good, Alice

    2013-01-01

    This paper evaluates the Jigawa State Government Community Computer Centre project using the design-reality gap framework. The purpose was to analyse the shortfall between design expectations and implementation realities, in order to establish the current situation of the project, and furthermore to analyse whether it would meet the key stakeholders' expectations. The majority of government ICT projects are classified as either failures or partial failures. Our research will underpin a case st...

  9. Peer interaction in mixed age groups: a study in the computer area of an early childhood education center in Portugal

    OpenAIRE

    Figueiredo, Maria Pacheco; Figueiredo, Ana Cláudia Nogueira de; Rego, Belmiro

    2015-01-01

    The study was developed as a teacher-research project during initial teacher education (Masters Degree of Early Childhood and Primary Education) in Portugal. It analysed the interactions between children aged 3 to 6 years during use of the computer as a free-choice activity, comparing situations between peers of the same age with situations between peers of different ages. The focus of the analysis was collaborative interactions. This was a qualitative study. Child...

  10. Research evaluation support services in biomedical libraries

    Directory of Open Access Journals (Sweden)

    Karen Elizabeth Gutzman

    2018-01-01

    Conclusions: Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.

  11. Molecular modeling and computational simulation of the photosystem-II reaction center to address isoproturon resistance in Phalaris minor.

    Science.gov (United States)

    Singh, Durg Vijay; Agarwal, Shikha; Kesharwani, Rajesh Kumar; Misra, Krishna

    2012-08-01

    Isoproturon is the only herbicide that can control Phalaris minor, a competitive weed of wheat that developed resistance in 1992. Resistance against isoproturon was reported to be due to a mutation in the psbA gene, which encodes the isoproturon-binding D1 protein. Previously in our laboratory, a triazole derivative of isoproturon (TDI) was synthesized and found to be active against both susceptible and resistant biotypes at 0.5 kg/ha, but showed poor specificity. In the present study, the susceptible D1(S) and resistant D1(R) proteins, together with the D2 protein, of the PS-II reaction center of P. minor have been modeled and simulated, selecting the crystal structure of PS-II from Thermosynechococcus elongatus (2AXT.pdb) as template. Loop regions were refined, and the complete reaction center D1/D2 was simulated with GROMACS in a lipid (1-palmitoyl-2-oleoylglycero-3-phosphoglycerol, POPG) environment along with ligands and cofactor. Both S and R models were energy minimized using steepest descent, equilibrated with isotropic pressure coupling and temperature coupling using a Berendsen protocol, and subjected to 1,000 ps of MD simulation. As a result of the MD simulation, the best model obtained in the lipid environment had five chlorophylls, two plastoquinones, two pheophytins and a bicarbonate ion, along with the cofactor Fe and the oxygen-evolving center (OEC). The triazole derivative of isoproturon was used as the lead molecule for docking. The best-scoring conformation of TDI was chosen for receptor-based de novo ligand design. In silico designed molecules were screened and, as a result, only those molecules that showed higher docking and binding energies in comparison to isoproturon and its triazole derivative were proposed for synthesis, in order to obtain more potent, non-resistant and more selective TDI analogs.

  12. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg model uses legacy NCAR graphics (even including emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  13. Biomedical Simulation Models of Human Auditory Processes

    Science.gov (United States)

    Bicak, Mehmet M. A.

    2012-01-01

    Detailed acoustic engineering models were developed to explore the noise propagation mechanisms associated with noise attenuation and the transmission paths created when using hearing protectors such as earplugs and headsets in high-noise environments. Biomedical finite element (FE) models are developed based on volumetric computed tomography scan data, which provide explicit external ear, ear canal, middle ear ossicular bone and cochlea geometry. Results from these studies have enabled a greater understanding of hearing-protector-to-flesh dynamics, as well as prioritization of noise propagation mechanisms. Prioritization of noise mechanisms can form an essential framework for the exploration of new design principles and methods in both earplug and earcup applications. These models are currently being used in the development of a novel hearing protection evaluation system that can provide experimentally correlated psychoacoustic noise attenuation. Moreover, these FE models can be used to simulate the effects of blast-related impulse noise on human auditory mechanisms and brain tissue.

  14. Biomedical informatics discovering knowledge in big data

    CERN Document Server

    Holzinger, Andreas

    2014-01-01

    This book provides a broad overview of the topic of bioinformatics (medical informatics + biological information) with a focus on data, information and knowledge. From data acquisition and storage to visualization, privacy, regulatory, and other practical and theoretical topics, the author touches on several fundamental aspects of the innovative interface between the medical and computational domains that forms biomedical informatics. Each chapter starts by providing a useful inventory of definitions and commonly used acronyms for each topic, and throughout the text the reader finds several real-world examples, methodologies, and ideas that complement the technical and theoretical background. A new section called "key problems" has also been added at the beginning of each chapter, where the author discusses possible traps and unsolvable or major problems. This new edition includes new sections at the end of each chapter, called "future outlook and research avenues," providing pointers to future challenges.

  15. Comparing the performance of biomedical clustering methods

    DEFF Research Database (Denmark)

    Wiwie, Christian; Baumbach, Jan; Röttger, Richard

    2015-01-01

    Identifying groups of similar objects is a popular first step in biomedical data analysis, but it is error-prone and impossible to perform manually. Many computational methods have been developed to tackle this problem. Here we assessed 13 well-known methods using 24 data sets ranging from gene expression to protein domains. Performance was judged on the basis of 13 common cluster validity indices. We developed a clustering analysis platform, ClustEval (http://clusteval.mpi-inf.mpg.de), to promote streamlined evaluation, comparison and reproducibility of clustering results in the future. This allowed us to objectively evaluate the performance of all tools on all data sets with up to 1,000 different parameter sets each, resulting in a total of more than 4 million calculated cluster validity indices. We observed that there was no universal best performer, but on the basis of this wide...
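
    The evaluation loop behind such a comparison is straightforward to sketch: fit several clustering methods to the same data and score each partition with a validity index. The sketch below uses scikit-learn and the silhouette index as a single stand-in for the 13 indices used in the study; it illustrates the approach only and is not the ClustEval platform.

```python
# Compare clustering methods on one data set via a validity index
# (silhouette). ClustEval runs many methods/indices over many data sets.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans, AgglomerativeClustering, DBSCAN
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

methods = {
    "k-means": KMeans(n_clusters=4, n_init=10, random_state=0),
    "hierarchical": AgglomerativeClustering(n_clusters=4),
    "DBSCAN": DBSCAN(eps=1.0),
}

for name, model in methods.items():
    labels = model.fit_predict(X)
    # Silhouette needs at least two clusters; guard degenerate partitions.
    score = silhouette_score(X, labels) if len(set(labels)) > 1 else float("nan")
    print(f"{name:12s} silhouette = {score:.3f}")
```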

  16. Contribution for labelling study of cellular and molecular structures of biomedical interest with technetium 99

    International Nuclear Information System (INIS)

    Rebello, L.H.; Piotkwosky, M.C.; Pereira, J.A.A.; Boasquevisque, E.M.; Silva, J.R.M.; Reis, R.J.N.; Pires, E.T.; Bernardo-Filho, M.

    1992-01-01

    The methodologies for labelling bacteria, planaria and cercariae from the schistosomiasis life cycle, as well as oxamniquine, with technetium-99m, developed at the Biomedical Center of Rio de Janeiro University and at the Research Center of the National Institute of Cancer, are presented. (C.G.C.)

  17. Comparison of canal transportation and centering ability of hand Protaper files and rotary Protaper files by using micro computed tomography

    OpenAIRE

    Amit Gandhi; Taru Gandhi

    2011-01-01

    Introduction and objective: The aim of the present study was to compare root canal preparation with rotary ProTaper files and hand ProTaper files to find a better instrumentation technique for maintaining root canal geometry, with the aid of computed tomography. Material and methods: Twenty curved root canals with at least 10 degrees of curvature were divided into 2 groups of 10 teeth each. In group I the canals were prepared with hand ProTaper files and in group II the canals were prepared wit...

  18. Root Canal Transportation and Centering Ability of Nickel-Titanium Rotary Instruments in Mandibular Premolars Assessed Using Cone-Beam Computed Tomography.

    Science.gov (United States)

    Mamede-Neto, Iussif; Borges, Alvaro Henrique; Guedes, Orlando Aguirre; de Oliveira, Durvalino; Pedro, Fábio Luis Miranda; Estrela, Carlos

    2017-01-01

    The aim of this study was to evaluate, using cone-beam computed tomography (CBCT), transportation and centralization of different nickel-titanium (NiTi) rotary instruments. One hundred and twenty-eight mandibular premolars were selected and instrumented using the following brands of NiTi files: WaveOne, WaveOne Gold, Reciproc, ProTaper Next, ProTaper Gold, Mtwo, BioRaCe and RaCe. CBCT imaging was performed before and after root canal preparation to obtain measurements of mesial and distal dentin walls and calculations of root canal transportation and centralization. A normal distribution of data was confirmed by the Kolmogorov-Smirnov and Levene tests, and results were assessed using the Kruskal-Wallis test. Statistical significance was set at 5%. ProTaper Gold produced the lowest canal transportation values, and RaCe, the highest. ProTaper Gold files also showed the highest values for centering ability, whereas BioRaCe showed the lowest. No significant differences were found across the different instruments in terms of canal transportation and centering ability (P > 0.05). Based on the methodology employed, all instruments used for root canal preparation of mandibular premolars performed similarly with regard to canal transportation and centering ability.
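
    For readers unfamiliar with how these two quantities are computed, a common formulation (following Gambill and colleagues) derives both from the mesial and distal dentin thicknesses measured before and after preparation at each cross-section. The sketch below assumes that convention; the variable names are ours, not the paper's.

```python
# Canal transportation and centering ratio from pre/post dentin thickness,
# assuming the commonly cited Gambill-style formulation.
# m1, d1: mesial/distal wall thickness before preparation (mm)
# m2, d2: mesial/distal wall thickness after preparation (mm)

def transportation(m1, m2, d1, d2):
    """0 = no transportation; the sign indicates the direction of deviation."""
    return (m1 - m2) - (d1 - d2)

def centering_ratio(m1, m2, d1, d2):
    """1 = perfectly centered preparation; computed as smaller/larger removal."""
    removal_mesial, removal_distal = m1 - m2, d1 - d2
    larger = max(removal_mesial, removal_distal)
    return 1.0 if larger == 0 else min(removal_mesial, removal_distal) / larger

print(f"{transportation(1.20, 0.95, 1.10, 0.90):.3f} mm")  # 0.050 mm
print(f"{centering_ratio(1.20, 0.95, 1.10, 0.90):.2f}")    # 0.80
```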

  19. Comparison of canal transportation and centering ability of twisted files, Pathfile-ProTaper system, and stainless steel hand K-files by using computed tomography.

    Science.gov (United States)

    Gergi, Richard; Rjeily, Joe Abou; Sader, Joseph; Naaman, Alfred

    2010-05-01

    The purpose of this study was to compare canal transportation and centering ability of 2 rotary nickel-titanium (NiTi) systems (Twisted Files [TF] and Pathfile-ProTaper [PP]) with conventional stainless steel K-files. Ninety root canals with severe curvature and short radius were selected. Canals were divided randomly into 3 groups of 30 each. After preparation with TF, PP, and stainless steel files, the amount of transportation that occurred was assessed by using computed tomography. Three sections from apical, mid-root, and coronal levels of the canal were recorded. Amount of transportation and centering ability were assessed. The 3 groups were statistically compared with analysis of variance and Tukey honestly significant difference test. Less transportation and better centering ability occurred with TF rotary instruments (P < .0001). K-files showed the highest transportation followed by PP system. PP system showed significant transportation when compared with TF (P < .0001). The TF system was found to be the best for all variables measured in this study. Copyright (c) 2010 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  20. Branding the bio/biomedical engineering degree.

    Science.gov (United States)

    Voigt, Herbert F

    2011-01-01

    The future challenges to medical and biological engineering, sometimes referred to as biomedical engineering or simply bioengineering, are many. Some of these are identifiable now and others will emerge from time to time as new technologies are introduced and harnessed. There is a fundamental issue regarding "Branding the bio/biomedical engineering degree" that requires a common understanding of what is meant by a B.S. degree in Biomedical Engineering, Bioengineering, or Biological Engineering. In this paper we address some of the issues involved in branding the Bio/Biomedical Engineering degree, with the aim of clarifying the Bio/Biomedical Engineering brand.

  1. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide the participants state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  2. Software for biomedical engineering signal processing laboratory experiments.

    Science.gov (United States)

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990s we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. There are many digital filtering and pattern recognition algorithms used in processing biomedical signals. In general, students have very limited opportunity for hands-on access to the mechanisms of digital signal processing. In a typical course, the filters are designed non-interactively, which provides the student with little understanding of the design constraints of such filters or of their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new version, based on MATLAB, works with signals, including the filter design tool that is the programming interface between UW DigiScope and processing algorithms.
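
    The filter-design exercises described here can be reproduced with a few lines of open tooling; the sketch below uses SciPy (not UW DigiScope or MATLAB) to design a low-pass Butterworth filter and apply it to a synthetic noisy signal, with an illustrative sampling rate and cutoff.

```python
# Stand-in for the kind of interactive filtering exercise described above:
# design a low-pass Butterworth filter and apply zero-phase filtering.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500.0                                        # sampling rate (Hz), illustrative
t = np.arange(0, 2.0, 1 / fs)
clean = np.sin(2 * np.pi * 5 * t)                 # 5 Hz physiological component
noisy = clean + 0.5 * np.sin(2 * np.pi * 60 * t)  # 60 Hz mains interference

b, a = butter(N=4, Wn=20.0, btype="low", fs=fs)   # 4th order, 20 Hz cutoff
filtered = filtfilt(b, a, noisy)                  # zero-phase (no lag)

rmse = np.sqrt(np.mean((filtered - clean) ** 2))
print(f"RMS error after filtering: {rmse:.4f}")
```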

  3. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization.
    • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation.
    • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets.
    • Biomedical visualization is a vast field, with select subtopics addressed from scanning methodologies to structural applications to biological applications.
    • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms.
    Scientific Visualization will be useful to practitioners of scientific visualization, ...

  4. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, it might help to initiate their contact to the modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  5. Comparison of concept recognizers for building the Open Biomedical Annotator

    Directory of Open Access Journals (Sweden)

    Rubin Daniel

    2009-09-01

    Full Text Available Abstract The National Center for Biomedical Ontology (NCBO) is developing a system for automated, ontology-based access to online biomedical resources (Shah NH, et al.: Ontology-driven indexing of public datasets for translational bioinformatics. BMC Bioinformatics 2009, 10(Suppl 2):S1). The system's indexing workflow processes the text metadata of diverse resources such as datasets from GEO and ArrayExpress to annotate and index them with concepts from appropriate ontologies. This indexing requires the use of a concept-recognition tool to identify ontology concepts in the resource's textual metadata. In this paper, we present a comparison of two concept recognizers – NLM's MetaMap and the University of Michigan's Mgrep. We utilize a number of data sources and dictionaries to evaluate the concept recognizers in terms of precision, recall, speed of execution, scalability and customizability. Our evaluations demonstrate that Mgrep has a clear edge over MetaMap for large-scale service-oriented applications. Based on our analysis we also suggest areas of potential improvement for Mgrep. We have subsequently used Mgrep to build the Open Biomedical Annotator service. The Annotator service has access to a large dictionary of biomedical terms derived from the Unified Medical Language System (UMLS) and NCBO ontologies. The Annotator also leverages the hierarchical structure of the ontologies and their mappings to expand annotations. The Annotator service is available to the community as a REST Web service for creating ontology-based annotations of their data.
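
    The core metrics in such an evaluation reduce to set arithmetic over the concept annotations each tool produces against a gold standard. A minimal sketch with hypothetical UMLS-style concept identifiers:

```python
# Precision/recall over annotation sets, as used to compare concept
# recognizers. The concept IDs below are hypothetical examples.
def precision_recall(predicted, gold):
    tp = len(predicted & gold)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(gold) if gold else 0.0
    return precision, recall

gold = {"C0006826", "C0025202", "C0027651"}                  # expert annotations
tool_a = {"C0006826", "C0027651"}                            # high precision
tool_b = {"C0006826", "C0025202", "C0027651", "C0999999"}    # high recall

for name, pred in [("tool_a", tool_a), ("tool_b", tool_b)]:
    p, r = precision_recall(pred, gold)
    print(f"{name}: precision={p:.2f} recall={r:.2f}")
```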

  6. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximate four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use were measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post-intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working ≥20 hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.
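
    For context, the ERI ratio in this literature is conventionally computed as effort divided by reward, scaled by a correction factor for the unequal number of effort and reward items (Siegrist's formulation); values above 1 indicate imbalance. A minimal sketch under that assumption, with made-up scores:

```python
# Effort-reward imbalance ratio, assuming Siegrist's standard formulation:
# ERI = e / (r * c), with c = n_effort_items / n_reward_items.
def eri_ratio(effort_sum, reward_sum, n_effort_items, n_reward_items):
    c = n_effort_items / n_reward_items
    return effort_sum / (reward_sum * c)

# Hypothetical questionnaire scores: 6 effort items, 11 reward items.
ratio = eri_ratio(effort_sum=18, reward_sum=30, n_effort_items=6, n_reward_items=11)
print(f"ERI = {ratio:.2f} ({'imbalance' if ratio > 1 else 'no imbalance'})")
```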

  7. The next generation of similarity measures that fully explore the semantics in biomedical ontologies.

    Science.gov (United States)

    Couto, Francisco M; Pinto, H Sofia

    2013-10-01

    There is a prominent trend to augment and improve the formality of biomedical ontologies, shown, for example, by the current effort to add description logic axioms such as disjointness. One of the key ontology applications that can take advantage of this effort is conceptual (functional) similarity measurement. The presence of description logic axioms in biomedical ontologies makes the current structural or extensional approaches weaker and further away from providing sound semantics-based similarity measures. Although beneficial in small ontologies, the exploration of description logic axioms by semantics-based similarity measures is computationally expensive. This limitation is critical for biomedical ontologies, which normally contain thousands of concepts. Thus, in the process of gaining their rightful place, biomedical functional similarity measures have to find how this rich and powerful knowledge can be fully explored while keeping computational costs feasible. This manuscript aims at promoting and guiding the development of compelling tools that deliver what the biomedical community will require in the near future: a next generation of biomedical similarity measures that efficiently and fully explore the semantics present in biomedical ontologies.
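
    To make the contrast concrete, the classical structural/extensional approach the authors want to move beyond can be as simple as information-content similarity: two concepts are scored by the information content of their most informative common ancestor. A toy sketch over a hypothetical ontology fragment:

```python
# Toy Resnik-style similarity: sim(a, b) = max IC over common ancestors,
# with IC(c) = -log p(c). The ontology fragment below is hypothetical.
import math

parents = {
    "heart_disease": {"cardiovascular_disease"},
    "arrhythmia": {"cardiovascular_disease"},
    "cardiovascular_disease": {"disease"},
    "disease": set(),
}
prob = {"disease": 1.0, "cardiovascular_disease": 0.2,
        "heart_disease": 0.05, "arrhythmia": 0.04}  # annotation frequencies

def ancestors(concept):
    found = {concept}
    for parent in parents.get(concept, ()):
        found |= ancestors(parent)
    return found

def resnik(a, b):
    common = ancestors(a) & ancestors(b)
    return max(-math.log(prob[c]) for c in common) if common else 0.0

print(f"{resnik('heart_disease', 'arrhythmia'):.2f}")  # IC of shared ancestor
```

    Note that such a measure ignores description logic axioms like disjointness entirely, which is precisely the shortcoming the paper highlights.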

  8. New biomedical applications of radiocarbon

    International Nuclear Information System (INIS)

    Davis, J.C.

    1990-12-01

    The potential of accelerator mass spectrometry (AMS) and radiocarbon in biomedical applications is being investigated by Lawrence Livermore National Laboratory (LLNL). A measurement of the dose-response curve for DNA damage caused by a carcinogen in mouse liver cells was an initial experiment. This demonstrated the sensitivity and utility of AMS for detecting radiocarbon tags and led to numerous follow-on experiments. The initial experiment and follow-on experiments are discussed in this report. 12 refs., 4 figs. (SM)

  9. Gold Nanocages for Biomedical Applications**

    OpenAIRE

    Skrabalak, Sara E.; Chen, Jingyi; Au, Leslie; Lu, Xianmao; Li, Xingde; Xia, Younan

    2007-01-01

    Nanostructured materials provide a promising platform for early cancer detection and treatment. Here we highlight recent advances in the synthesis and use of Au nanocages for such biomedical applications. Gold nanocages represent a novel class of nanostructures, which can be prepared via a remarkably simple route based on the galvanic replacement reaction between Ag nanocubes and HAuCl4. The Au nanocages have a tunable surface plasmon resonance peak that extends into the near-infrared, where ...

  10. Biomedical devices and their applications

    CERN Document Server

    2004-01-01

    This volume introduces readers to the basic concepts and recent advances in the field of biomedical devices. The text gives a detailed account of novel developments in drug delivery, protein electrophoresis, estrogen mimicking methods and medical devices. It also provides the necessary theoretical background as well as describing a wide range of practical applications. The level and style make this book accessible not only to scientific and medical researchers but also to graduate students.

  11. The Ontology for Biomedical Investigations.

    Science.gov (United States)

    Bandrowski, Anita; Brinkman, Ryan; Brochhausen, Mathias; Brush, Matthew H; Bug, Bill; Chibucos, Marcus C; Clancy, Kevin; Courtot, Mélanie; Derom, Dirk; Dumontier, Michel; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Gibson, Frank; Gonzalez-Beltran, Alejandra; Haendel, Melissa A; He, Yongqun; Heiskanen, Mervi; Hernandez-Boussard, Tina; Jensen, Mark; Lin, Yu; Lister, Allyson L; Lord, Phillip; Malone, James; Manduchi, Elisabetta; McGee, Monnie; Morrison, Norman; Overton, James A; Parkinson, Helen; Peters, Bjoern; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H; Schober, Daniel; Smith, Barry; Soldatova, Larisa N; Stoeckert, Christian J; Taylor, Chris F; Torniai, Carlo; Turner, Jessica A; Vita, Randi; Whetzel, Patricia L; Zheng, Jie

    2016-01-01

    The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being addressed

  12. [Cluster analysis in biomedical researches].

    Science.gov (United States)

    Akopov, A S; Moskovtsev, A A; Dolenko, S A; Savina, G D

    2013-01-01

    Cluster analysis is one of the most popular methods for the analysis of multi-parameter data. Cluster analysis reveals the internal structure of the data, grouping separate observations by their degree of similarity. The review provides definitions of the basic concepts of cluster analysis and discusses the most popular clustering algorithms: k-means, hierarchical algorithms, and Kohonen network algorithms. Examples of the use of these algorithms in biomedical research are given.
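
    Of the algorithms listed, k-means is the easiest to state: alternate between assigning each observation to its nearest centroid and recomputing the centroids, until assignments stop changing. A compact NumPy sketch of Lloyd's algorithm:

```python
# Minimal k-means (Lloyd's algorithm) in NumPy, for illustration.
import numpy as np

def kmeans(X, k, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # Assignment step: nearest centroid by Euclidean distance.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: mean of each cluster (keep old centroid if empty).
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centroids[j] for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, size=(50, 2)) for m in (0.0, 3.0)])
labels, centroids = kmeans(X, k=2)
print(centroids.round(2))  # roughly [[0, 0], [3, 3]]
```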

  13. Design and analysis of a tendon-based computed tomography-compatible robot with remote center of motion for lung biopsy.

    Science.gov (United States)

    Yang, Yunpeng; Jiang, Shan; Yang, Zhiyong; Yuan, Wei; Dou, Huaisu; Wang, Wei; Zhang, Daguang; Bian, Yuan

    2017-04-01

    Nowadays, biopsy is a decisive method of lung cancer diagnosis, whereas lung biopsy is time-consuming, complex and inaccurate. Therefore, a computed tomography-compatible robot for rapid and precise lung biopsy is developed in this article. According to the actual operation process, the robot is divided into two modules: a 4-degree-of-freedom position module for locating the puncture point, suitable for almost all patient positions, and a 3-degree-of-freedom tendon-based orientation module with remote center of motion, which is compact and computed tomography-compatible, to orient and insert the needle automatically inside the computed tomography bore. The workspace of the robot surrounds the patient's thorax, and the needle tip forms a cone under the patient's skin. A new error model of the robot based on screw theory is proposed in view of structure error and actuation error, which are regarded as screw motions. Simulation is carried out to verify the precision of the error model, contrasted with compensation via inverse kinematics. The results of an insertion experiment on a specific phantom prove the feasibility of the robot, with a mean error of 1.373 mm in a laboratory environment, which is accurate enough to replace manual operation.
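
    The screw-motion idea at the heart of the error model can be illustrated with the exponential map that turns a twist into a rigid transform: the nominal joint motion and a small error are each screws, and their composition moves the needle tip. The sketch below is a generic illustration with made-up twists, not the authors' calibrated model.

```python
# Screw-theory sketch: compose a nominal joint screw with a small error
# screw and measure the resulting needle-tip deviation. Values are made up.
import numpy as np

def hat(w):
    """3-vector -> skew-symmetric matrix."""
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

def exp_twist(w, v, theta):
    """Exponential map of a unit twist (w, v) of magnitude theta -> SE(3)."""
    W = hat(w)
    R = np.eye(3) + np.sin(theta) * W + (1 - np.cos(theta)) * W @ W  # Rodrigues
    G = np.eye(3) * theta + (1 - np.cos(theta)) * W + (theta - np.sin(theta)) * W @ W
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, G @ v
    return T

tip_home = np.array([0.0, 0.0, 0.2, 1.0])              # tip 0.2 m along z
w_nom, v_nom = np.array([0.0, 0.0, 1.0]), np.zeros(3)  # revolute joint screw
T_nom = exp_twist(w_nom, v_nom, np.pi / 6)

# Small structural/actuation error, itself modeled as a screw motion:
T_err = exp_twist(np.array([0.0, 1.0, 0.0]), np.array([0.001, 0.0, 0.0]), 0.01)

deviation = (T_err @ T_nom @ tip_home - T_nom @ tip_home)[:3]
print(f"tip deviation: {np.linalg.norm(deviation) * 1000:.2f} mm")
```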

  14. Comparison of canal transportation and centering ability of rotary protaper, one shape system and wave one system using cone beam computed tomography: An in vitro study

    Science.gov (United States)

    Tambe, Varsha Harshal; Nagmode, Pradnya Sunil; Abraham, Sathish; Patait, Mahendra; Lahoti, Pratik Vinod; Jaju, Neha

    2014-01-01

    Aim: The aim of the present study was to compare the canal transportation and centering ability of Rotary ProTaper, One Shape and Wave One systems using cone beam computed tomography (CBCT) in curved root canals to find a better instrumentation technique for maintaining root canal geometry. Materials and Methods: A total of 30 freshly extracted premolars having curved root canals with at least 10 degrees of curvature were divided into three groups of 10 teeth each. All teeth were scanned by CBCT to determine the root canal shape before instrumentation. In Group 1, the canals were prepared with Rotary ProTaper files, in Group 2 the canals were prepared with One Shape files and in Group 3 canals were prepared with Wave One files. After preparation, a post-instrumentation scan was performed. Pre-instrumentation and post-instrumentation images, obtained at three levels (3 mm apical, 3 mm coronal and 8 mm above the apical foramen), were compared using CBCT software. Amount of transportation and centering ability were assessed. The three groups were statistically compared with analysis of variance and the Tukey honestly significant difference test. Results: All instruments maintained the original canal curvature, with significant differences between the different files. Data suggested that Wave One files presented the best outcomes for both the variables evaluated. Wave One files caused less transportation and remained better centered in the canal than One Shape and Rotary ProTaper files. Conclusion: The canal preparation with Wave One files showed less transportation and better centering ability than One Shape and ProTaper. PMID:25506145

  15. Biomedical applications of nanodiamond (Review)

    Science.gov (United States)

    Turcheniuk, K.; Mochalin, Vadym N.

    2017-06-01

    The interest in nanodiamond applications in biology and medicine is on the rise over recent years. This is due to the unique combination of properties that nanodiamond provides. Small size (∼5 nm), low cost, scalable production, negligible toxicity, chemical inertness of diamond core and rich chemistry of nanodiamond surface, as well as bright and robust fluorescence resistant to photobleaching are the distinct parameters that render nanodiamond superior to any other nanomaterial when it comes to biomedical applications. The most exciting recent results have been related to the use of nanodiamonds for drug delivery and diagnostics—two components of a quickly growing area of biomedical research dubbed theranostics. However, nanodiamond offers much more in addition: it can be used to produce biodegradable bone surgery devices, tissue engineering scaffolds, kill drug resistant microbes, help us to fight viruses, and deliver genetic material into cell nucleus. All these exciting opportunities require an in-depth understanding of nanodiamond. This review covers the recent progress as well as general trends in biomedical applications of nanodiamond, and underlines the importance of purification, characterization, and rational modification of this nanomaterial when designing nanodiamond based theranostic platforms.

  16. Magnetic nanoparticles for biomedical applications

    International Nuclear Information System (INIS)

    Krustev, P.; Ruskov, T.

    2007-01-01

    In this paper we describe different biomedical applications of magnetic nanoparticles. Over the past decade, a number of biomedical applications have begun to emerge for magnetic nanoparticles of differing sizes, shapes, and compositions. Areas under investigation include targeted drug delivery, ultra-sensitive disease detection, gene therapy, high-throughput genetic screening, biochemical sensing, and rapid toxicity cleansing. Magnetic nanoparticles exhibit ferromagnetic or superparamagnetic behavior, magnetizing strongly under an applied field. In the second case (superparamagnetic nanoparticles), there is no permanent magnetism once the field is removed. Superparamagnetic nanoparticles are highly attractive as in vivo probes or in vitro tools to extract information on biochemical systems. The optical properties of magnetic metal nanoparticles are spectacular and have therefore promoted a great deal of excitement during the last few decades. Many applications, such as MRI and hyperthermia, rely on the use of iron oxide particles. Moreover, magnetic nanoparticles conjugated with antibodies have been applied to hyperthermia and have enabled tumor-specific contrast enhancement in MRI. Other promising biomedical applications involve treating tumor cells with magnetic nanoparticles under X-ray ionizing radiation, employing the nanoparticles as a complementary radiation source inside the tumor. (authors)

  17. Superhydrophobic Materials for Biomedical Applications

    Science.gov (United States)

    Colson, Yolonda L.; Grinstaff, Mark W.

    2016-01-01

    Superhydrophobic surfaces are actively studied across a wide range of applications and industries, and are now finding increased use in the biomedical arena as substrates to control protein adsorption, cellular interaction, and bacterial growth, as well as platforms for drug delivery devices and diagnostic tools. The commonality in the design of these materials is the creation of a stable or metastable air state at the material surface, which lends itself to a number of unique properties. These activities are catalyzing the development of new materials, applications, and fabrication techniques, as well as collaborations across materials science, chemistry, engineering, and medicine, given the interdisciplinary nature of this work. The review begins with a discussion of superhydrophobicity and then explores in depth the biomedical applications that utilize superhydrophobicity, including material selection characteristics, in vitro performance, and in vivo performance. General trends are offered for each application, in addition to discussion of conflicting data in the literature, and the review concludes with the authors' future perspectives on the utility of superhydrophobic surfaces for biomedical applications. PMID:27449946

  18. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study

    Science.gov (United States)

    Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-01-01

    Background: The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. Aim: The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Materials and Methods: Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. Results: It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. Conclusion: It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal

  19. Comparative Analysis of Canal Centering Ability of Different Single File Systems Using Cone Beam Computed Tomography- An In-Vitro Study.

    Science.gov (United States)

    Agarwal, Rolly S; Agarwal, Jatin; Jain, Pradeep; Chandra, Anil

    2015-05-01

    The ability of an endodontic instrument to remain centered in the root canal system is one of the most important characteristics influencing the clinical performance of a particular file system. Thus, it is important to assess the canal centering ability of newly introduced single file systems before they can be considered a viable replacement for full-sequence rotary file systems. The aim of the study was to compare the canal transportation, centering ability, and time taken for preparation of curved root canals after instrumentation with the single file systems One Shape and Wave One, using cone-beam computed tomography (CBCT). Sixty mesiobuccal canals of mandibular molars with an angle of curvature ranging from 20° to 35° were divided into three groups of 20 samples each: ProTaper PT (group I), the full-sequence rotary control group; OneShape OS (group II), single file continuous rotation; and WaveOne WO (group III), single file reciprocating motion. Pre-instrumentation and post-instrumentation three-dimensional CBCT images were obtained from root cross-sections at 3 mm, 6 mm and 9 mm from the apex. Scanned images were then assessed to determine canal transportation and centering ability. The data collected were evaluated using one-way analysis of variance (ANOVA) with Tukey's honestly significant difference test. It was observed that there were no differences in the magnitude of transportation between the rotary instruments (p > 0.05) at both 3 mm and 6 mm from the apex. At 9 mm from the apex, Group I PT showed significantly higher mean canal transportation and lower centering ability (0.19±0.08 and 0.39±0.16), as compared to Group II OS (0.12±0.07 and 0.54±0.24) and Group III WO (0.13±0.06 and 0.55±0.18), while the differences between OS and WO were not statistically significant. It was concluded that there was minor difference between the tested groups. Single file systems demonstrated average canal transportation and centering ability comparable to full sequence

  20. Environmental Modeling Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...

  1. Biotechnology development for biomedical applications.

    Energy Technology Data Exchange (ETDEWEB)

    Kuehl, Michael; Brozik, Susan Marie; Rogers, David Michael; Rempe, Susan L.; Abhyankar, Vinay V.; Hatch, Anson V.; Dirk, Shawn M.; Hedberg-Dirk, Elizabeth (University of New Mexico, Albuquerque, NM); Sukharev, Sergei (University of Maryland, College Park, MD); Anishken, Andriy (University of Maryland, College Park, MD); Cicotte, Kirsten; De Sapio, Vincent; Buerger, Stephen P.; Mai, Junyu

    2010-11-01

    Sandia's scientific and engineering expertise in the fields of computational biology, high-performance prosthetic limbs, biodetection, and bioinformatics has been applied to specific problems at the forefront of cancer research. Molecular modeling was employed to design stable mutations of the enzyme L-asparaginase with improved selectivity for asparagine over other amino acids, with the potential for improved cancer chemotherapy. New electrospun polymer composites with improved electrical conductivity and mechanical compliance have been demonstrated, with the promise of direct interfacing between the peripheral nervous system and the control electronics of advanced prosthetics. The capture of rare circulating tumor cells has been demonstrated on a microfluidic chip produced with a versatile fabrication process capable of integration with existing lab-on-a-chip and biosensor technology. In addition, software tools have been developed to increase the calculation speed of clustered heat maps for the display of relationships in large arrays of protein data. All these projects were carried out in collaboration with researchers at the University of Texas M. D. Anderson Cancer Center in Houston, TX.

  2. A User-Centered Mobile Cloud Computing Platform for Improving Knowledge Management in Small-to-Medium Enterprises in the Chilean Construction Industry

    Directory of Open Access Journals (Sweden)

    Daniela Núñez

    2018-03-01

    Full Text Available Knowledge management (KM) is a key element for the development of small-to-medium enterprises (SMEs) in the construction industry. This is particularly relevant in Chile, where this industry is composed almost entirely of SMEs. Although various KM system proposals can be found in the literature, they are not suitable for SMEs, due to usability problems, budget constraints, and time and connectivity issues. Mobile Cloud Computing (MCC) systems offer several advantages to construction SMEs, but they have not yet been exploited to address KM needs. Therefore, this research is aimed at the development of an MCC-based KM platform to manage lessons learned in different construction projects of SMEs, through an iterative and user-centered methodology. Usability and quality evaluations of the proposed platform show that MCC is a feasible and attractive option to address the KM issues in SMEs of the Chilean construction industry, since it is possible to consider both technical and usability requirements.

  3. A novel biomedical image indexing and retrieval system via deep preference learning.

    Science.gov (United States)

    Pang, Shuchao; Orgun, Mehmet A; Yu, Zhezhou

    2018-05-01

    ...state-of-the-art techniques in indexing biomedical images. We propose a novel and automated indexing system based on deep preference learning to characterize biomedical images for developing computer aided diagnosis (CAD) systems in healthcare. Our proposed system shows an outstanding indexing ability and high efficiency for biomedical image retrieval applications, and it can be used to collect and annotate high-resolution images in a biomedical database for further biomedical image research and applications. Copyright © 2018 Elsevier B.V. All rights reserved.
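
    "Preference learning" here generally means training on pairwise judgments, i.e., that image i should rank above image j, optimized through a loss on score differences. A tiny NumPy sketch of a pairwise ranking objective with a linear scorer (illustrative only; the paper's system uses a deep network, which this does not reproduce):

```python
# Pairwise preference (ranking) sketch: learn scores so that preferred
# items outrank the others. Linear scorer and synthetic data for brevity.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                # feature vectors for 100 images
true_w = rng.normal(size=8)                  # hidden "ground truth" scorer
pairs = [(i, i + 50) if X[i] @ true_w > X[i + 50] @ true_w else (i + 50, i)
         for i in range(50)]                 # (preferred, other) pairs

w, lr, margin = np.zeros(8), 0.1, 1.0
for _ in range(50):                          # perceptron-style ranking updates
    for i, j in pairs:
        if (X[i] - X[j]) @ w < margin:       # margin violated -> update
            w += lr * (X[i] - X[j])

violations = sum((X[i] - X[j]) @ w <= 0 for i, j in pairs)
print(f"misordered pairs after training: {violations}/{len(pairs)}")
```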

  4. Spinopelvic dissociation: multidetector computed tomographic evaluation of fracture patterns and associated injuries at a single level 1 trauma center.

    Science.gov (United States)

    Gupta, Pushpender; Barnwell, Jonathan C; Lenchik, Leon; Wuertzer, Scott D; Miller, Anna N

    2016-06-01

    The objective of the present study is to evaluate multidetector computed tomographic (MDCT) fracture patterns and associated injuries in patients with spinopelvic dissociation (SPD). Our institutional trauma registry database was reviewed from Jan. 1, 2006, to Sept. 30, 2012, specifically evaluating patients with sacral fractures. MDCT scans of patients with sacral fractures were reviewed to determine the presence of SPD. SPD cases were characterized into the following fracture patterns: U-shaped, Y-shaped, T-shaped, H-shaped, and burst. The following MDCT features were recorded: level of the horizontal fracture, location of vertical fracture, kyphosis between major fracture fragments, displacement of fracture fragment, narrowing of central spinal canal, narrowing of neural foramina, and extension into sacroiliac joints. Quantitative evaluation of the sacral fractures was performed in accordance with the consensus statement by the Spine Trauma Study Group. Medical records were reviewed to determine associated pelvic and non-pelvic fractures, bladder and bowel injuries, nerve injuries, and type of surgical intervention. Twenty-one patients had SPD, of whom 13 were men and eight were women. Mean age was 41.8 years (range 18.8 to 87.7). Five fractures (24 %) were U-shaped, six (29 %) H-shaped, four (19 %) Y-shaped, and six (29 %) burst. Nine patients (43 %) had central canal narrowing, and 19 (90 %) had neural foramina narrowing. Eleven patients (52 %) had kyphotic angulation between major fracture fragments, and seven patients (33 %) had either anterior (24 %) or posterior (10 %) displacement of the proximal fracture fragment. Fourteen patients (67 %) had associated pelvic fractures, and 20 (95 %) had associated non-pelvic fractures. Two patients (10 %) had associated urethral injuries, and one (5 %) had an associated colon injury. Seven patients (33 %) had associated nerve injuries. Six patients (29 %) had surgical fixation while 15 (71 %) were

  5. Computed tomography-guided needle aspiration and biopsy of pulmonary lesions - A single-center experience in 1000 patients

    Energy Technology Data Exchange (ETDEWEB)

    Poulou, Loukia S.; Tsagouli, Paraskevi; Thanos, Loukas [Dept. of Medical Imaging and Interventional Radiology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)], e-mail: ploukia@hotmail.com; Ziakas, Panayiotis D. [Program of Outcomes Research, Div. of Infectious Diseases, Warren Alpert Medical School, Brown Univ., RI, and Div. of Infectious Diseases, Rhode Island Hospital, Rhode Island (United States); Politi, Dimitra [Dept. of Cytopathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece); Trigidou, Rodoula [Dept. of Pathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)

    2013-07-15

    Background: Computed tomography (CT)-guided fine needle aspiration (FNA) and biopsies are well-established, minimally invasive diagnostic tools for pulmonary lesions. Purpose: To analyze retrospectively the results of 1000 consecutive lung CT-guided FNA and/or core needle biopsies (CNB), the main outcome measures being diagnostic yield and complication rates. Material and Methods: Patients considered eligible were those referred to our department for lung lesions. The choice of FNA, CNB, or both was based upon the radiologist's judgment. Diagnostic yield was defined as the probability of having a definite result by cytology/histology. Results: The study included 733 male patients and 267 female patients, with a mean (SD) age of 66.4 (11.4) years. The mean (SD) lesion size was 3.7 (2.4) cm in maximal diameter. Six hundred and forty-one (64%) patients underwent an FNA procedure, 245 (25%) a CNB, and 114 (11%) had been subjected to both. The diagnostic yield was 960/994 (96.6%); this decreased significantly with the use of CNB only (odds ratio [OR] 0.32; 95% CI 0.12-0.88; P = 0.03), while it increased with lesion size (OR 1.35; 95% CI 1.03-1.79; P = 0.03 per cm increase). In 506 patients (52.7%), a malignant process was diagnosed by cytopathology/histology. The complication rate reached 97/1000 (9.7%); complications included: hemorrhage, 62 (6.2%); pneumothorax, 28 (2.8%); hemorrhage and pneumothorax, 5 (0.5%); and hemoptysis, 2 (0.2%). It was not significantly affected by the type of procedure or localization of the lesion. The overall risk for complications was three times higher for lesions <4 cm (OR 3.26; 95% CI 1.96-5.42; P < 0.001). Conclusion: CT-guided lung biopsy has a high diagnostic yield using FNA, CNB, or both; the CNB procedure alone will not suffice. Complication rates were acceptable and correlated inversely with lesion size, not localization or type of procedure.

  6. Computer methods in biomechanics and biomedical engineering - Supplement 1: papers from the 32nd congress of the Société de Biomécanique, Lyon, 28-29 August

    OpenAIRE

    CHEZE, L; DUMAS, R; NICOLLE, S; MIDDLETON, J; JACOBS, CR

    2007-01-01

    Subjects: Bioinformatics; Biomaterials; Biomaterials & Medical Devices; Biomechanics; Biomechanics & Human Movement Science; Breast Cancer; Cardiovascular Imaging; Computational Mechanics; Dentistry; Diagnostic Imaging; Ergonomics; Ergonomics & Human Factors; Mechanics: Fluid Dynamics; Mechanical Engineering: Fluid Dynamics; Mathematical Biology; Mechanical Engineering: Mechanical Engineering Design; Design: Mechanical Engineering Design; Mechanical Engineering...

  7. Projected Applications of a ``Climate in a Box'' Computing System at the NASA Short-term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, G.; Molthan, A.; Zavodsky, B.; Case, J.; Lafontaine, F.

    2010-12-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to “Climate in a Box” systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the “Climate in a Box” system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA’s Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the “Climate in a Box” system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed

  8. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high resolution, near real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo

  9. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  10. Text mining patents for biomedical knowledge.

    Science.gov (United States)

    Rodriguez-Esteban, Raul; Bundschus, Markus

    2016-06-01

    Biomedical text mining of scientific knowledge bases, such as Medline, has received much attention in recent years. Given that text mining is able to automatically extract biomedical facts that revolve around entities such as genes, proteins, and drugs, from unstructured text sources, it is seen as a major enabler to foster biomedical research and drug discovery. In contrast to the biomedical literature, research into the mining of biomedical patents has not reached the same level of maturity. Here, we review existing work and highlight the associated technical challenges that emerge from automatically extracting facts from patents. We conclude by outlining potential future directions in this domain that could help drive biomedical research and drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. A qualitative study adopting a user-centered approach to design and validate a brain computer interface for cognitive rehabilitation for people with brain injury.

    Science.gov (United States)

    Martin, Suzanne; Armstrong, Elaine; Thomson, Eileen; Vargiu, Eloisa; Solà, Marc; Dauwalder, Stefan; Miralles, Felip; Daly Lynn, Jean

    2017-07-14

    Cognitive rehabilitation is established as a core intervention within rehabilitation programs following a traumatic brain injury (TBI). Digitally enabled assistive technologies offer opportunities for clinicians to increase remote access to rehabilitation, supporting transition into the home. Brain Computer Interface (BCI) systems can harness the residual abilities of individuals with limited function to gain control over computers through their brain waves. This paper presents an online cognitive rehabilitation application developed with therapists to work remotely with people who have TBI and who will use BCI at home to engage in the therapy. A qualitative research study was completed with community-dwelling people post brain injury (end users) and a cohort of therapists involved in cognitive rehabilitation. A user-centered approach was taken over three phases of the development, design and feasibility testing of this cognitive rehabilitation application, which included two tasks (Find-a-Category and a Memory Card task). The therapist could remotely prescribe activity with different levels of difficulty. The service user had a home interface which would present the therapy activities. This novel work was achieved by an international consortium of academics, business partners and service users.

  12. Tritium AMS for biomedical applications

    International Nuclear Information System (INIS)

    Roberts, M.L.; Velsko, C.; Turteltaub, K.W.

    1993-08-01

    We are developing ³H-AMS to measure ³H activity of mg-sized biological samples. LLNL has already successfully applied ¹⁴C AMS to a variety of problems in the area of biomedical research. Development of ³H AMS would greatly complement these studies. The ability to perform ³H AMS measurements at sensitivities equivalent to those obtained for ¹⁴C will allow us to perform experiments using compounds that are not readily available in ¹⁴C-tagged form. A ³H capability would also allow us to perform unique double-labeling experiments in which we learn the fate, distribution, and metabolism of separate fractions of biological compounds.

  13. Luminescent nanodiamonds for biomedical applications.

    Science.gov (United States)

    Say, Jana M; van Vreden, Caryn; Reilly, David J; Brown, Louise J; Rabeau, James R; King, Nicholas J C

    2011-12-01

    In recent years, nanodiamonds have emerged from primarily an industrial and mechanical applications base to potentially underpinning sophisticated new technologies in biomedical and quantum science. Nanodiamonds are relatively inexpensive, biocompatible, easy to surface functionalise and optically stable. This combination of physical properties is ideally suited to biological applications, including intracellular labelling and tracking, extracellular drug delivery and adsorptive detection of bioactive molecules. Here we describe some of the methods and challenges for processing nanodiamond materials, detection schemes and some of the leading applications currently under investigation.

  14. Thermoresponsive Polymers for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Theoni K. Georgiou

    2011-08-01

    Thermoresponsive polymers are a class of “smart” materials that have the ability to respond to a change in temperature, a property that makes them useful in a wide range of applications and consequently attracts much scientific interest. This review focuses mainly on studies published over the last 10 years on the synthesis and use of thermoresponsive polymers for biomedical applications, including drug delivery, tissue engineering and gene delivery. A summary of the main applications is given following the different studies on thermoresponsive polymers, which are categorized based on their three-dimensional structure: hydrogels, interpenetrating networks, micelles, crosslinked micelles, polymersomes, films and particles.

  15. Biomedical signal and image processing

    CERN Document Server

    Najarian, Kayvan

    2012-01-01

    INTRODUCTION TO DIGITAL SIGNAL AND IMAGE PROCESSING. Signals and Biomedical Signal Processing: Introduction and Overview; What Is a "Signal"?; Analog, Discrete, and Digital Signals; Processing and Transformation of Signals; Signal Processing for Feature Extraction; Some Characteristics of Digital Images; Summary; Problems. Fourier Transform: Introduction and Overview; One-Dimensional Continuous Fourier Transform; Sampling and Nyquist Rate; One-Dimensional Discrete Fourier Transform; Two-Dimensional Discrete Fourier Transform; Filter Design; Summary; Problems. Image Filtering, Enhancement, and Restoration: Introduction and Overview
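
    The sampling and Nyquist-rate chapters listed above invite a quick demonstration. The sketch below is my own illustration, not taken from the book: it samples a 5 Hz sine at two rates and uses the FFT to show that sampling below twice the signal frequency aliases the tone.

        # Demonstrate the Nyquist criterion: a 5 Hz tone needs > 10 samples/s.
        import numpy as np

        def dominant_frequency(fs, f0=5.0, duration=2.0):
            """Sample an f0-Hz sine at rate fs; return the spectral peak in Hz."""
            t = np.arange(0.0, duration, 1.0 / fs)
            x = np.sin(2.0 * np.pi * f0 * t)
            spectrum = np.abs(np.fft.rfft(x))
            freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
            return freqs[np.argmax(spectrum)]

        print(dominant_frequency(fs=50.0))  # ~5.0 Hz: fs > 2*f0, tone recovered
        print(dominant_frequency(fs=8.0))   # ~3.0 Hz: fs < 2*f0, aliased estimate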

  16. An introduction to biomedical instrumentation

    CERN Document Server

    Dewhurst, D J

    1976-01-01

    An Introduction to Biomedical Instrumentation presents a course of study and applications covering the basic principles of medical and biological instrumentation, as well as the typical features of its design and construction. The book aims to aid not only the cognitive domain of readers but also their psychomotor domain. Aside from the seminar topics provided, which are divided into 27 chapters, the book complements these topics with practical applications of the discussions. Figures and mathematical formulas are also given. Major topics discussed include the construction, handli

  17. Review of Biomedical Image Processing

    Directory of Open Access Journals (Sweden)

    Ciaccio Edward J

    2011-11-01

    This article is a review of the book 'Biomedical Image Processing' by Thomas M. Deserno, published by Springer-Verlag. Salient information that will be useful for deciding whether the book is relevant to topics of interest to the reader, and whether it might be suitable as a course textbook, is presented in the review. This includes information about the book details, a summary, the suitability of the text in course and research work, the framework of the book, its specific content, and conclusions.

  18. Introduction to biomedical engineering technology

    CERN Document Server

    Street, Laurence J

    2011-01-01

    Introduction; History of Medical Devices; The Role of Biomedical Engineering Technologists in Health Care; Characteristics of Human Anatomy and Physiology That Relate to Medical Devices; Summary; Questions. Diagnostic Devices, Part One: Physiological Monitoring Systems; The Heart; Summary; Questions. Diagnostic Devices, Part Two: Circulatory System and Blood; Respiratory System; Nervous System; Summary; Questions. Diagnostic Devices, Part Three: Digestive System; Sensory Organs; Reproduction; Skin, Bone, Muscle, Miscellaneous; Chapter Summary; Questions. Diagnostic Imaging: Introduction; X-Rays; Magnetic Resonance Imaging Scanners; Positron Emissio

  19. Understanding the Structure-Function Relationships of Dendrimers in Environmental and Biomedical Applications

    Science.gov (United States)

    Wang, Bo

    We are living in an era wherein nanoparticles (NPs) are widely applied in our lives. Dendrimers are special polymeric NPs with unique physicochemical properties, which have been intensely explored for a variety of applications. Current studies on dendrimers are bottlenecked by an insufficient molecular-level understanding of their structure and dynamic behavior. With primarily computational approaches supplemented by many other experimental techniques, this dissertation aims to establish structure-function relationships of dendrimers in environmental and biomedical applications. More specifically, it thoroughly investigates the interactions between dendrimers and different molecular species, including carbon-based NPs, metal-based NPs, and proteins/peptides. Those results not only provide profound knowledge for evaluating the impacts of dendrimers on environmental and biological systems but also facilitate the design of next-generation functional polymeric nanomaterials. The dissertation is organized as follows. Chapter 1 provides an overview of current progress in dendrimer studies, where the methodology of Discrete Molecular Dynamics (DMD), my major research tool, is also introduced. Two directions of utilizing dendrimers are discussed in the following chapters. Chapter 2 focuses on environmental applications of dendrimers, where two back-to-back studies are presented. I start by describing some interesting observations from experiments, i.e., that dendrimers dispersed model oil molecules. Then, I reveal through computational modeling why the surface chemistries of dendrimers lead to different remediation efficiencies. Finally, I demonstrate different scenarios of dendrimer-small molecule association. Chapter 3 centers on dendrimers in biomedical applications, covering two subtopics. In the first topic, we discuss dendrimers as surfactants that modulate the interactions between proteins and NPs. Some fundamental concepts regarding NPs

  20. The Biomedical Resource Ontology (BRO) to enable resource discovery in clinical and translational research.

    Science.gov (United States)

    Tenenbaum, Jessica D; Whetzel, Patricia L; Anderson, Kent; Borromeo, Charles D; Dinov, Ivo D; Gabriel, Davera; Kirschner, Beth; Mirel, Barbara; Morris, Tim; Noy, Natasha; Nyulas, Csongor; Rubenson, David; Saxman, Paul R; Singh, Harpreet; Whelan, Nancy; Wright, Zach; Athey, Brian D; Becich, Michael J; Ginsburg, Geoffrey S; Musen, Mark A; Smith, Kevin A; Tarantal, Alice F; Rubin, Daniel L; Lyster, Peter

    2011-02-01

    The biomedical research community relies on a diverse set of resources, both within their own institutions and at other research centers. In addition, an increasing number of shared electronic resources have been developed. Without effective means to locate and query these resources, it is challenging, if not impossible, for investigators to be aware of the myriad resources available, or to effectively perform resource discovery when the need arises. In this paper, we describe the development and use of the Biomedical Resource Ontology (BRO) to enable semantic annotation and discovery of biomedical resources. We also describe the Resource Discovery System (RDS) which is a federated, inter-institutional pilot project that uses the BRO to facilitate resource discovery on the Internet. Through the RDS framework and its associated Biositemaps infrastructure, the BRO facilitates semantic search and discovery of biomedical resources, breaking down barriers and streamlining scientific research that will improve human health. Copyright © 2010 Elsevier Inc. All rights reserved.
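
    The kind of semantic resource discovery described above can be miniaturized to show the mechanics: a query for a term also matches resources annotated with any of its descendants in the ontology's is-a hierarchy. The terms, resources, and flat parent map below are invented for illustration and are not drawn from the actual BRO.

        # Toy ontology-backed resource search (hypothetical terms, not the real BRO).
        PARENT = {  # child term -> parent term (is-a hierarchy)
            "Instrument": "Resource",
            "Microscope": "Instrument",
            "ConfocalMicroscope": "Microscope",
            "Software": "Resource",
        }

        RESOURCES = {  # resource name -> annotating term
            "Core Imaging Facility": "ConfocalMicroscope",
            "Image Analysis Toolkit": "Software",
        }

        def is_a(term, ancestor):
            """True if term equals ancestor or descends from it."""
            while term is not None:
                if term == ancestor:
                    return True
                term = PARENT.get(term)
            return False

        def search(query_term):
            return [name for name, annot in RESOURCES.items() if is_a(annot, query_term)]

        print(search("Microscope"))  # ['Core Imaging Facility'], via the hierarchy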

  1. Basics of biomedical ultrasound for engineers

    CERN Document Server

    Azhari, Haim

    2010-01-01

    "Basics of Biomedical Ultrasound for Engineers is a structured textbook for university engineering courses in biomedical ultrasound and for researchers in the field. This book offers a tool for building a solid understanding of biomedical ultrasound, and leads the novice through the field in a step-by-step manner. The book begins with the most basic definitions of waves, proceeds to ultrasounds in fluids, and then delves into solid ultrasounds, the most complicated kind of ultrasound. It encompasses a wide range of topics within biomedical ultrasound, from conceptual definitions of waves to the intricacies of focusing devices, transducers, and acoustic fields"--Provided by publisher.

  2. Mathematics and physics of emerging biomedical imaging

    National Research Council Canada - National Science Library

    Committee on the Mathematics and Physics of Emerging Dynamic Biomedical Imaging, National Research Council

    .... Incorporating input from dozens of biomedical researchers who described what they perceived as key open problems of imaging that are amenable to attack by mathematical scientists and physicists...

  3. Frontiers in biomedical engineering and biotechnology.

    Science.gov (United States)

    Liu, Feng; Goodarzi, Ali; Wang, Haifeng; Stasiak, Joanna; Sun, Jianbo; Zhou, Yu

    2014-01-01

    The 2nd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2013), held in Wuhan on 11–13 October 2013, is an annual conference that aims to provide an opportunity for international and national researchers and practitioners to present the most recent advances and future challenges in the fields of biomedical information, biomedical engineering, and biotechnology. The papers published in this issue were selected from this conference; they represent the frontier of biomedical engineering and biotechnology, a field that has particularly helped to improve the level of clinical diagnosis in medical work.

  4. Telemedicine optoelectronic biomedical data processing system

    Science.gov (United States)

    Prosolovska, Vita V.

    2010-08-01

    The telemedicine optoelectronic biomedical data processing system was created to share medical information in support of health oversight and a timely, rapid response to crises. The system comprises the following main blocks: a bioprocessor, an analog-to-digital converter for biomedical images, an optoelectronic module for image processing, an optoelectronic module for parallel recording and storage of biomedical images, and a matrix screen for displaying biomedical images. The rated temporal characteristics of the blocks are defined by the particular triggering optoelectronic couple in the analog-to-digital converters and by the imaging time of the matrix screen. The element base for the hardware implementation of the developed matrix screen is integrated optoelectronic couples produced by selective epitaxy.

  5. Publications in biomedical and environmental sciences programs, 1980

    International Nuclear Information System (INIS)

    Pfuderer, H.A.; Moody, J.B.

    1981-07-01

    This bibliography contains 690 references to articles in journals, books, and reports published in the subject area of biomedical and environmental sciences during 1980. There are 529 references to articles published in journals and books and 161 references to reports. Staff members in the Biomedical and Environmental Sciences divisions have other publications not included in this bibliography; for example, theses, book reviews, abstracts published in journals or symposia proceedings, pending journal publications and reports such as monthly and bimonthly progress reports, contractor reports, and reports for internal distribution. This document is sorted by the division, and then alphabetically by author. The sorting by divisions separates the references by subject area in a simple way. The divisions represented in the order that they appear in the bibliography are Analytical Chemistry, Biology, Chemical Technology, Information R and D, Health and Safety Research, Energy, Environmental Sciences, and Computer Sciences

  6. Publications in biomedical and environmental sciences programs, 1980

    Energy Technology Data Exchange (ETDEWEB)

    Pfuderer, H.A.; Moody, J.B.

    1981-07-01

    This bibliography contains 690 references to articles in journals, books, and reports published in the subject area of biomedical and environmental sciences during 1980. There are 529 references to articles published in journals and books and 161 references to reports. Staff members in the Biomedical and Environmental Sciences divisions have other publications not included in this bibliography; for example, theses, book reviews, abstracts published in journals or symposia proceedings, pending journal publications and reports such as monthly and bimonthly progress reports, contractor reports, and reports for internal distribution. This document is sorted by the division, and then alphabetically by author. The sorting by divisions separates the references by subject area in a simple way. The divisions represented in the order that they appear in the bibliography are Analytical Chemistry, Biology, Chemical Technology, Information R and D, Health and Safety Research, Energy, Environmental Sciences, and Computer Sciences.

  7. Bioethical Principles of Biomedical Research Involving Animals

    Directory of Open Access Journals (Sweden)

    Bakir Mehić

    2011-08-01

    animals for research, testing, or training in different countries. In the few that have done so, the measures adopted vary widely: on the one hand, legally enforceable detailed regulations with licensing of experimenters and their premises together with an official inspectorate; on the other, entirely voluntary self-regulation by the biomedical community, with lay participation. Many variations are possible between these extremes, one intermediate situation being a legal requirement that experiments or other procedures involving the use of animals should be subject to the approval of ethical committees of specified composition. The International Guiding Principles are the product of the collaboration of a representative sample of the international biomedical community, including experts of the World Health Organization, and of consultations with responsible animal welfare groups. The International Guiding Principles have already gained a considerable measure of acceptance internationally. European Medical Research Councils (EMRC), an international association that includes all the West European medical research councils, fully endorsed the Guiding Principles in 1984. Here we present the basic bioethical principles for using animals in biomedical research [3]: methods such as mathematical models, computer simulation and in vitro biological systems should be used wherever appropriate; animal experiments should be undertaken only after due consideration of their relevance for human or animal health and the advancement of biological knowledge; the animals selected for an experiment should be of an appropriate species and quality, and the minimum number required to obtain scientifically valid results; investigators and other personnel should never fail to treat animals as sentient, and should regard their proper care and use and the avoidance or minimization of discomfort, distress, or pain as ethical imperatives; procedures with animals that may cause more than momentary or minimal

  8. Biomedical information retrieval across languages.

    Science.gov (United States)

    Daumke, Philipp; Markó, Kornél; Poprat, Michael; Schulz, Stefan; Klar, Rüdiger

    2007-06-01

    This work presents a new dictionary-based approach to biomedical cross-language information retrieval (CLIR) that addresses many of the general and domain-specific challenges in current CLIR research. Our method is based on a multilingual lexicon that was generated partly manually and partly automatically, and currently covers six European languages. It contains morphologically meaningful word fragments, termed subwords. Using subwords instead of entire words significantly reduces the number of lexical entries necessary to sufficiently cover a specific language and domain. Mediation between queries and documents is based on these subwords as well as on lists of word-n-grams that are generated from large monolingual corpora and constitute possible translation units. The translations are then sent to a standard Internet search engine. This process makes our approach an effective tool for searching the biomedical content of the World Wide Web in different languages. We evaluate this approach using the OHSUMED corpus, a large medical document collection, within a cross-language retrieval setting.
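
    To make the subword idea concrete, here is a minimal sketch of dictionary-based query translation through language-independent subword identifiers. The lexicon entries and the greedy segmentation are my own invented illustration, not the paper's actual resources.

        # Subword-based query translation, sketched after the approach above.
        SUBWORDS = {  # surface subword -> language-independent concept id
            "leuk": "C_WHITE", "haem": "C_BLOOD", "hem": "C_BLOOD",
            "zyt": "C_CELL", "cyt": "C_CELL",
        }
        ENGLISH = {  # concept id -> preferred English subword
            "C_WHITE": "leuk", "C_BLOOD": "hem", "C_CELL": "cyt",
        }

        def to_concepts(term):
            """Greedy left-to-right segmentation of a term into concept ids."""
            concepts, i = [], 0
            while i < len(term):
                for j in range(len(term), i, -1):  # longest match first
                    if term[i:j] in SUBWORDS:
                        concepts.append(SUBWORDS[term[i:j]])
                        i = j
                        break
                else:
                    i += 1  # character not covered by the lexicon; skip it
            return concepts

        def translate(term):
            return "".join(ENGLISH[c] for c in to_concepts(term))

        print(translate("leukozyten"))  # 'leukcyt': the stem shared with 'leukocyte'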

  9. Biomedical applications of control engineering

    CERN Document Server

    Hacısalihzade, Selim S

    2013-01-01

    Biomedical Applications of Control Engineering is a lucidly written textbook for graduate control engineering and biomedical engineering students, as well as for medical practitioners who want to get acquainted with quantitative methods. It is based on decades of experience both in control engineering and clinical practice. The book begins by reviewing basic concepts of system theory and the modeling process. It then goes on to discuss control engineering application areas: different models for the human operator; dosage and timing optimization in oral drug administration; measuring symptoms of and optimal dopaminergic therapy in Parkinson's disease; measurement and control of blood glucose levels, both naturally and by means of external controllers, in diabetes; and control of depth of anaesthesia using inhalational anaesthetic agents like sevoflurane, with both fuzzy and state feedback controllers....

  10. Reviewing Manuscripts for Biomedical Journals

    Science.gov (United States)

    Garmel, Gus M

    2010-01-01

    Writing for publication is a complex task. For many professionals, producing a well-executed manuscript conveying one's research, ideas, or educational wisdom is challenging. Authors have varying emotions related to the process of writing for scientific publication. Although not studied, a relationship between an author's enjoyment of the writing process and the product's outcome is highly likely. As with any skill, practice generally results in improvements. Literature focused on preparing manuscripts for publication and the art of reviewing submissions exists. Most journals guard their reviewers' anonymity with respect to the manuscript review process. This is meant to protect them from direct or indirect author demands, which may occur during the review process or in the future. It is generally accepted that author identities are masked in the peer-review process. However, the concept of anonymity for reviewers has been debated recently; many editors consider it problematic that reviewers are not held accountable to the public for their decisions. The review process is often arduous and underappreciated, one reason why biomedical journals acknowledge editors and frequently recognize reviewers who donate their time and expertise in the name of science. This article describes essential elements of a submitted manuscript, with the hopes of improving scientific writing. It also discusses the review process within the biomedical literature, the importance of reviewers to the scientific process, responsibilities of reviewers, and qualities of a good review and reviewer. In addition, it includes useful insights to individuals who read and interpret the medical literature. PMID:20740129

  11. e-Science platform for translational biomedical imaging research: running, statistics, and analysis

    Science.gov (United States)

    Wang, Tusheng; Yang, Yuanyuan; Zhang, Kai; Wang, Mingqing; Zhao, Jun; Xu, Lisa; Zhang, Jianguo

    2015-03-01

    In order to enable medical researchers, clinical physicians, and biomedical engineers from multiple disciplines to work together in a secured, efficient, and transparent cooperative environment, we designed an e-Science platform for biomedical imaging research and applications across multiple academic institutions and hospitals in Shanghai, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. In the past two years, we have implemented a biomedical imaging chain covering communication, storage, cooperation, and computing on this e-Science platform. In this presentation, we report the operating status of this system in supporting biomedical imaging research, and we analyze and discuss its results in supporting multidisciplinary collaboration across multiple institutions.

  12. Interactive Processing and Visualization of Image Data for Biomedical and Life Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staadt, Oliver G.; Natarajan, Vijay; Weber, Gunther H.; Wiley, David F.; Hamann, Bernd

    2007-02-01

    Background: Applications in biomedical science and life science produce large data sets using increasingly powerful imaging devices and computer simulations. It is becoming increasingly difficult for scientists to explore and analyze these data using traditional tools. Interactive data processing and visualization tools can help scientists overcome these limitations. Results: We show that new data processing tools and visualization systems can be used successfully in biomedical and life science applications. We present an adaptive high-resolution display system suitable for biomedical image data, algorithms for analyzing and visualizing protein surfaces and retinal optical coherence tomography data, and visualization tools for 3D gene expression data. Conclusion: We demonstrated that interactive processing and visualization methods and systems can support scientists in a variety of biomedical and life science application areas concerned with massive data analysis.

  13. Light Ion Biomedical Research Accelerator LIBRA

    International Nuclear Information System (INIS)

    Gough, R.A.

    1987-01-01

    LIBRA is a concept to place a light-ion, charged-particle facility in a hospital environment, and to dedicate it to applications in biology and medicine. There are two aspects of the program envisaged for LIBRA: a basic research effort coupled with a program in clinical applications of accelerated charged particles. The operational environment to be provided for LIBRA is one in which both of these components can coexist and flourish, and one that will promote the transfer of technology and knowledge from one to the other. In order to further investigate the prospects for a Light Ion Biomedical Research Accelerator (LIBRA), discussions are underway with the Merritt Peralta Medical Center (MPMC) in Oakland, CA, and the University of California at San Francisco (UCSF). In this paper, a brief discussion of the technical requirements for such a facility is given, together with an outline of the accelerator technology required. While still in a preliminary stage, it is possible nevertheless to develop an adequate working description of the type, size, performance and cost of the accelerator facilities required to meet the preliminary goals for LIBRA

  14. The Light Ion Biomedical Research Accelerator (LIBRA)

    International Nuclear Information System (INIS)

    Gough, R.A.

    1987-03-01

    LIBRA is a concept to place a light-ion, charged-particle facility in a hospital environment, and to dedicate it to applications in biology and medicine. There are two aspects of the program envisaged for LIBRA: a basic research effort coupled with a program in clinical applications of accelerated charged particles. The operational environment to be provided for LIBRA is one in which both of these components can coexist and flourish, and one that will promote the transfer of technology and knowledge from one to the other. In order to further investigate the prospects for a Light Ion Biomedical Research Accelerator (LIBRA), discussions are underway with the Merritt Peralta Medical Center (MPMC) in Oakland, California, and the University of California at San Francisco (UCSF). In this paper, a brief discussion of the technical requirements for such a facility is given, together with an outline of the accelerator technology required. While still in a preliminary stage, it is possible nevertheless to develop an adequate working description of the type, size, performance and cost of the accelerator facilities required to meet the preliminary goals for LIBRA

  15. SCELib3.0: The new revision of SCELib, the parallel computational library of molecular properties in the Single Center Approach

    Science.gov (United States)

    Sanna, N.; Baccarelli, I.; Morelli, G.

    2009-12-01

    SCELib is a computer program which implements the Single Center Expansion (SCE) method to describe molecular electronic densities and the interaction potentials between a charged projectile (electron or positron) and a target molecular system. The first version (CPC Catalog identifier ADMG_v1_0) was submitted to the CPC Program Library in 2000, and version 2.0 (ADMG_v2_0) was submitted in 2004. We here announce the new release 3.0, which adds features to the previous versions aimed at significantly enhancing its capability to deal with larger molecular systems. SCELib 3.0 allows ab initio effective core potential (ECP) calculations of the molecular wavefunctions to be used in the SCE method, in addition to the standard all-electron description of the molecule. The list of supported architectures has been updated, the code has been ported to platforms based on accelerating coprocessors such as NVIDIA GPGPUs, and the new parallel model adopted runs efficiently on mixed many-core computing systems.
    Program summary:
    Program title: SCELib3.0
    Catalogue identifier: ADMG_v3_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMG_v3_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 2 018 862
    No. of bytes in distributed program, including test data, etc.: 4 955 014
    Distribution format: tar.gz
    Programming language: C
    Compilers used: xlc V8.x, Intel C V10.x, Portland Group V7.x, nvcc V2.x
    Computer: all SMP platforms based on AIX, Linux and SUNOS operating systems over SPARC, POWER, Intel Itanium2, X86, em64t and Opteron processors
    Operating system: SUNOS, IBM AIX, Linux RedHat (Enterprise), Linux SuSE (SLES)
    Has the code been vectorized or parallelized?: Yes, 1 to 32 (CPU or GPU) used
    RAM: Up to 32 GB depending on the molecular
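
    For orientation, the single center expansion that gives the library its name represents a molecular function as a radial sum over spherical harmonics about one common origin. In its standard general form (my paraphrase of the SCE method, not an excerpt from the paper):

        \[
        F(r,\theta,\phi) = \sum_{l=0}^{l_{\max}} \sum_{m=-l}^{l} \frac{f_{lm}(r)}{r}\, Y_{lm}(\theta,\phi),
        \qquad
        f_{lm}(r) = r \int Y_{lm}^{*}(\theta,\phi)\, F(r,\theta,\phi)\, \mathrm{d}\Omega,
        \]

    so the angular behavior is carried entirely by the spherical harmonics and only the radial coefficients f_lm(r) need to be stored and manipulated numerically.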

  16. Declining trend in the use of repeat computed tomography for trauma patients admitted to a level I trauma center for traffic-related injuries

    Energy Technology Data Exchange (ETDEWEB)

    Psoter, Kevin J., E-mail: kevinp2@u.washington.edu [Department of Epidemiology, University of Washington, Box 357236, Seattle, WA 98195 (United States); Roudsari, Bahman S., E-mail: roudsari@u.washington.edu [Department of Radiology, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Graves, Janessa M., E-mail: janessa@u.washington.edu [Department of Pediatrics, Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Mack, Christopher, E-mail: cdmack@uw.edu [Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Jarvik, Jeffrey G., E-mail: jarvikj@u.washington.edu [Department of Radiology and Department of Neurological Surgery, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States)

    2013-06-15

    Objective: To evaluate the trend in utilization of repeat (i.e. ≥2) computed tomography (CT) and to compare utilization patterns across body regions for trauma patients admitted to a level I trauma center for traffic-related injuries (TRI). Materials and Methods: We linked the Harborview Medical Center trauma registry (1996–2010) to the billing department data. We extracted the following variables: type and frequency of CTs performed, age, gender, race/ethnicity, insurance status, injury mechanism and severity, length of hospitalization, intensive care unit (ICU) admission and final disposition. TRIs were defined as motor vehicle collisions, motorcycle, bicycle and pedestrian-related injuries. Logistic regression was used to evaluate the association between utilization of different body region repeat (i.e. ≥2) CTs and year of admission, adjusting for patient and injury-related characteristics that could influence utilization patterns. Results: A total of 28,431 patients were admitted for TRIs over the study period and 9499 (33%) received repeat CTs. From 1996 to 2010, the proportion of patients receiving repeat CTs decreased by 33%. Relative to 2000 and adjusting for other covariates, patients with TRIs admitted in 2010 had significantly lower odds of undergoing repeat head (OR = 0.61; 95% CI: 0.49–0.76), pelvis (OR = 0.37; 95% CI: 0.27–0.52), cervical spine (OR = 0.23; 95% CI: 0.12–0.43), and maxillofacial CTs (OR = 0.24; 95% CI: 0.10–0.57). However, they had higher odds of receiving repeat thoracic CTs (OR = 1.86; 95% CI: 1.02–3.38). Conclusion: A significant decrease in the utilization of repeat CTs was observed in trauma patients presenting with traffic-related injuries over a 15-year period.
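
    The adjusted odds ratios quoted above are exponentiated logistic-regression coefficients, with 95% confidence limits from exp(beta ± 1.96·SE). The sketch below shows the mechanics on synthetic data; the variable names and effect sizes are invented, not taken from the study.

        # Adjusted OR and 95% CI from a logistic model (synthetic data only).
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 2000
        year_2010 = rng.integers(0, 2, n)      # admission-year indicator
        severity = rng.normal(0.0, 1.0, n)     # stand-in injury-severity covariate
        logit = -0.5 - 0.5 * year_2010 + 0.8 * severity
        repeat_ct = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = sm.add_constant(np.column_stack([year_2010, severity]))
        fit = sm.Logit(repeat_ct.astype(float), X).fit(disp=False)

        beta, se = fit.params[1], fit.bse[1]   # coefficient for year_2010
        print(f"adjusted OR = {np.exp(beta):.2f}, 95% CI: "
              f"{np.exp(beta - 1.96 * se):.2f}-{np.exp(beta + 1.96 * se):.2f}")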

  17. Temporal trends in compliance with appropriateness criteria for stress single-photon emission computed tomography sestamibi studies in an academic medical center.

    Science.gov (United States)

    Gibbons, Raymond J; Askew, J Wells; Hodge, David; Miller, Todd D

    2010-03-01

    The purpose of this study was to apply published appropriateness criteria for single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) in a single academic medical center to determine if the percentage of inappropriate studies was changing over time. In a previous study, we applied the American College of Cardiology Foundation/American Society of Nuclear Cardiology (ASNC) appropriateness criteria for stress SPECT MPI and reported that 14% of stress SPECT studies were performed for inappropriate reasons. Using similar methodology, we retrospectively examined 284 patients who underwent stress SPECT MPI in October 2006 and compared the findings to the previous cohort of 284 patients who underwent stress SPECT MPI in May 2005. The indications for testing in the 2 cohorts were very similar. The overall level of agreement in characterizing categories of appropriateness between 2 experienced cardiovascular nurse abstractors was good (kappa = 0.68), which represented an improvement from our previous study (kappa = 0.56). There was a significant change between May 2005 and October 2006 in the overall classification of categories for appropriateness (P = .024 by chi-square statistic). There were modest, but insignificant, increases in the number of patients who were unclassified (15% in the current study vs 11% previously), appropriate (66% vs 64%), and uncertain (12% vs 11%). Only 7% of the studies in the current study were inappropriate, which represented a significant (P = .004) decrease from the 14% reported in the 2005 cohort. In the absence of any specific intervention, there was a significant change in the overall classification of SPECT appropriateness in an academic medical center over 17 months. The only significant difference in individual categories was a decrease in inappropriate studies. Additional measurements over time will be required to determine if this trend is sustainable or generalizable.
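
    The inter-abstractor agreement reported above is Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and p_e the agreement expected by chance. A small worked example with made-up category assignments:

        # Cohen's kappa from two raters' category labels (made-up data).
        from collections import Counter

        rater_a = ["appropriate", "appropriate", "uncertain",
                   "inappropriate", "appropriate", "uncertain"]
        rater_b = ["appropriate", "uncertain", "uncertain",
                   "inappropriate", "appropriate", "appropriate"]

        n = len(rater_a)
        p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n  # observed
        ca, cb = Counter(rater_a), Counter(rater_b)
        p_e = sum(ca[k] * cb[k] for k in ca) / n**2              # by chance
        print(f"kappa = {(p_o - p_e) / (1 - p_e):.2f}")          # 0.45 here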

  18. Archives of Medical and Biomedical Research

    African Journals Online (AJOL)

    Archives of Medical and Biomedical Research is the official journal of the International Association of Medical and Biomedical Researchers (IAMBR) and the Society for Free Radical Research Africa (SFRR-Africa). It is an internationally peer reviewed, open access and multidisciplinary journal aimed at publishing original ...

  19. A new educational program on biomedical engineering

    NARCIS (Netherlands)

    van Alste, Jan A.

    2000-01-01

    At the University of Twente, together with the Free University of Amsterdam, a new educational program in Biomedical Engineering will be developed. The academic program, with a five-year duration, will start in September 2001. After a general, broad education in Biomedical Engineering in the first three

  20. Sierra Leone Journal of Biomedical Research

    African Journals Online (AJOL)

    The Sierra Leone Journal of Biomedical Research publishes papers in all fields of Medicine and Allied Health Sciences including Basic Medical Sciences, Clinical Sciences, Dental Sciences, Behavioural Sciences, Biomedical Engineering, Molecular Biology, Pharmaceutical Sciences, Biotechnology in relation to Medicine, ...