WorldWideScience

Sample records for biomedical computing center

  1. Computational intelligence in biomedical imaging

    CERN Document Server

    2014-01-01

    This book provides a comprehensive overview of state-of-the-art computational intelligence research and technologies in biomedical imaging, with emphasis on biomedical decision making. Biomedical imaging offers useful information on patients’ medical conditions and clues to the causes of their symptoms and diseases. It also produces a large number of images that physicians must interpret, so computer aids have become indispensable to physicians’ decision making. This book discusses major technical advancements and research findings in the field of computational intelligence in biomedical imaging, for example, computational intelligence in computer-aided diagnosis for breast cancer, prostate cancer, and brain disease, in lung function analysis, and in radiation therapy. The book examines technologies and studies that have reached the practical level, and those that are rapidly becoming available in clinical practice in hospitals, such as computational inte...

  2. Biomedical Computing Technology Information Center (BCTIC): Final progress report, March 1, 1986-September 30, 1986

    International Nuclear Information System (INIS)

    1987-01-01

    During this time, BCTIC packaged and disseminated computing technology and honored all requests made before September 1, 1986. The final month of operation was devoted to completing code requests, returning submitted codes, and sending out notices of BCTIC's termination of services on September 30th. Final BCTIC library listings were distributed to members of the active mailing list. Also included in the library listing are the names and addresses of program authors and contributors so that users may have continued support for their programs. The BCTIC library list is attached.

  3. University of Vermont Center for Biomedical Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bernstein, Dr. Ira [University of Vermont and State Agricultural College]

    2013-08-02

    This grant was awarded in support of Phase 2 of the University of Vermont Center for Biomedical Imaging. Phase 2 outlined several specific aims, including: the development of expertise in MRI and fMRI imaging and their applications; the acquisition of peer-reviewed extramural funding in support of the Center; and the development of a Core Imaging Advisory Board, fee structure, and protocol review and approval process.

  4. Establishment of the Center for Biomedical Technology Innovation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-12-15

    The report discussed the following topics: (1) Orthopedic Devices; (2) Hybrid Vector and Method Resulting in Protein Overproduction by Eukaryotic Cells; (3) Surgical Simulator; (4) CBTI (Center for Biomedical Technology Innovation) as an Incubator for Start-up Companies; (5) Voice-activated, computer-assisted surgical robotics; (6) Through-transmission ultrasonic 3-D holography for diagnostic imaging; (7) CBTI's Scibermed™ Virtual Institute (SVI); and (8) Laser Oxygenation Tomography.

  5. Applications of computational intelligence in biomedical technology

    CERN Document Server

    Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena

    2016-01-01

    This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of Biomedical and Medical Informatics, ranging from theoretical considerations to practical applications. Various aspects of development methods and algorithms in Biomedical and Medical Informatics are discussed, including algorithms for medical image processing and modeling methods. Individual contributions also cover medical decision making support, estimation of the risks of treatments, reliability of medical systems, problems of practical clinical applications, and many other topics. This book is intended for scientists interested in problems of Biomedical Technologies, for researchers and academic staff, and for all those dealing with Biomedical and Medical Informatics, as well as PhD students. Useful information is also offered to IT companies, developers of equipment and/or software for medicine, and medical professionals.

  6. Advanced computational approaches to biomedical engineering

    CERN Document Server

    Saha, Punam K; Basu, Subhadip

    2014-01-01

    There has been rapid growth in biomedical engineering in recent decades, given advancements in medical imaging and physiological modelling and sensing systems, coupled with immense growth in computational and network technology, analytic approaches, visualization and virtual-reality, man-machine interaction and automation. Biomedical engineering involves applying engineering principles to the medical and biological sciences and it comprises several topics including biomedicine, medical imaging, physiological modelling and sensing, instrumentation, real-time systems, automation and control, sig

  7. Biomedical computing facility interface design plan

    Science.gov (United States)

    Puckett, R. D.

    1971-01-01

    The results are presented of a design study performed to establish overall system interface requirements for the Biomedical Laboratories Division's Sigma-3 computer system. Emphasis has been placed upon the definition of an overall implementation plan and associated schedule to meet both near-term and long-range requirements within the constraints of available resources.

  8. Biomedical Visual Computing: Case Studies and Challenges

    KAUST Repository

    Johnson, Christopher

    2012-01-01

    Advances in computational geometric modeling, imaging, and simulation let researchers build and test models of increasing complexity, generating unprecedented amounts of data. As recent research in biomedical applications illustrates, visualization will be critical in making this vast amount of data usable; it's also fundamental to understanding models of complex phenomena. © 2012 IEEE.

  9. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT,J.

    2004-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security.

  10. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2005-11-01

    The Brookhaven Computational Science Center brings together researchers in biology, chemistry, physics, and medicine with applied mathematicians and computer scientists to exploit the remarkable opportunities for scientific discovery which have been enabled by modern computers. These opportunities are especially great in computational biology and nanoscience, but extend throughout science and technology and include, for example, nuclear and high energy physics, astrophysics, materials and chemical science, sustainable energy, environment, and homeland security. To achieve our goals we have established a close alliance with applied mathematicians and computer scientists at Stony Brook and Columbia Universities.

  11. Biomedical cloud computing with Amazon Web Services.

    Directory of Open Access Journals (Sweden)

    Vincent A Fusaro

    2011-08-01

    In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.

  12. Biomedical cloud computing with Amazon Web Services.

    Science.gov (United States)

    Fusaro, Vincent A; Patil, Prasad; Gafni, Erik; Wall, Dennis P; Tonellato, Peter J

    2011-08-01

    In this overview to biomedical computing in the cloud, we discussed two primary ways to use the cloud (a single instance or cluster), provided a detailed example using NGS mapping, and highlighted the associated costs. While many users new to the cloud may assume that entry is as straightforward as uploading an application and selecting an instance type and storage options, we illustrated that there is substantial up-front effort required before an application can make full use of the cloud's vast resources. Our intention was to provide a set of best practices and to illustrate how those apply to a typical application pipeline for biomedical informatics, but also general enough for extrapolation to other types of computational problems. Our mapping example was intended to illustrate how to develop a scalable project and not to compare and contrast alignment algorithms for read mapping and genome assembly. Indeed, with a newer aligner such as Bowtie, it is possible to map the entire African genome using one m2.2xlarge instance in 48 hours for a total cost of approximately $48 in computation time. In our example, we were not concerned with data transfer rates, which are heavily influenced by the amount of available bandwidth, connection latency, and network availability. When transferring large amounts of data to the cloud, bandwidth limitations can be a major bottleneck, and in some cases it is more efficient to simply mail a storage device containing the data to AWS (http://aws.amazon.com/importexport/). More information about cloud computing, detailed cost analysis, and security can be found in references.
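
    The cost figure quoted above follows directly from the instance-hours involved. As a minimal sketch, the estimate can be reproduced as below; the roughly $1/hour on-demand rate is an assumption back-derived from the abstract's $48-for-48-hours figure, not current AWS pricing.

    ```python
    # Illustrative back-of-the-envelope cost estimate for the mapping run
    # described above: one m2.2xlarge instance for 48 hours at roughly $1/hour.
    # The hourly rate is inferred from the abstract's ~$48 total and is an
    # assumption, not current AWS pricing.

    def estimate_compute_cost(hours: float, hourly_rate_usd: float,
                              n_instances: int = 1) -> float:
        """Return the on-demand compute cost for a simple single-tier run."""
        return hours * hourly_rate_usd * n_instances

    if __name__ == "__main__":
        cost = estimate_compute_cost(hours=48, hourly_rate_usd=1.00, n_instances=1)
        print(f"Estimated compute cost: ${cost:.2f}")  # -> Estimated compute cost: $48.00
    ```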

  13. COMPUTATIONAL SCIENCE CENTER

    Energy Technology Data Exchange (ETDEWEB)

    DAVENPORT, J.

    2006-11-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to

  14. COMPUTATIONAL SCIENCE CENTER

    International Nuclear Information System (INIS)

    DAVENPORT, J.

    2006-01-01

    Computational Science is an integral component of Brookhaven's multi-science mission, and is a reflection of the increased role of computation across all of science. Brookhaven currently has major efforts in data storage and analysis for the Relativistic Heavy Ion Collider (RHIC) and the ATLAS detector at CERN, and in quantum chromodynamics. The Laboratory is host for the QCDOC machines (quantum chromodynamics on a chip), 10 teraflop/s computers which boast 12,288 processors each. There are two here, one for the RIKEN/BNL Research Center and the other supported by DOE for the US Lattice Gauge Community and other scientific users. A 100 teraflop/s supercomputer will be installed at Brookhaven in the coming year, managed jointly by Brookhaven and Stony Brook, and funded by a grant from New York State. This machine will be used for computational science across Brookhaven's entire research program, and also by researchers at Stony Brook and across New York State. With Stony Brook, Brookhaven has formed the New York Center for Computational Science (NYCCS) as a focal point for interdisciplinary computational science, which is closely linked to Brookhaven's Computational Science Center (CSC). The CSC has established a strong program in computational science, with an emphasis on nanoscale electronic structure and molecular dynamics, accelerator design, computational fluid dynamics, medical imaging, parallel computing and numerical algorithms. We have been an active participant in DOE's SciDAC program (Scientific Discovery through Advanced Computing). We are also planning a major expansion in computational biology in keeping with Laboratory initiatives. Additional laboratory initiatives with a dependence on a high level of computation include the development of hydrodynamics models for the interpretation of RHIC data, computational models for the atmospheric transport of aerosols, and models for combustion and for energy utilization. The CSC was formed to bring together

  15. Cloud computing: a new business paradigm for biomedical information sharing.

    Science.gov (United States)

    Rosenthal, Arnon; Mork, Peter; Li, Maya Hao; Stanford, Jean; Koester, David; Reynolds, Patti

    2010-04-01

    We examine how the biomedical informatics (BMI) community, especially consortia that share data and applications, can take advantage of a new resource called "cloud computing". Clouds generally offer resources on demand. In most clouds, charges are pay per use, based on large farms of inexpensive, dedicated servers, sometimes supporting parallel computing. Substantial economies of scale potentially yield costs much lower than dedicated laboratory systems or even institutional data centers. Overall, even with conservative assumptions, for applications that are not I/O intensive and do not demand a fully mature environment, the numbers suggested that clouds can sometimes provide major improvements, and should be seriously considered for BMI. Methodologically, it was very advantageous to formulate analyses in terms of component technologies; focusing on these specifics enabled us to bypass the cacophony of alternative definitions (e.g., exactly what does a cloud include) and to analyze alternatives that employ some of the component technologies (e.g., an institution's data center). Relative analyses were another great simplifier. Rather than listing the absolute strengths and weaknesses of cloud-based systems (e.g., for security or data preservation), we focus on the changes from a particular starting point, e.g., individual lab systems. We often find a rough parity (in principle), but one needs to examine individual acquisitions--is a loosely managed lab moving to a well managed cloud, or a tightly managed hospital data center moving to a poorly safeguarded cloud? 2009 Elsevier Inc. All rights reserved.

  16. Structural biology computing: Lessons for the biomedical research sciences.

    Science.gov (United States)

    Morin, Andrew; Sliz, Piotr

    2013-11-01

    The field of structural biology, whose aim is to elucidate the molecular and atomic structures of biological macromolecules, has long been at the forefront of biomedical sciences in adopting and developing computational research methods. Operating at the intersection between biophysics, biochemistry, and molecular biology, structural biology's growth into a foundational framework on which many concepts and findings of molecular biology are interpreted has depended largely on parallel advancements in computational tools and techniques. Without these computing advances, modern structural biology would likely have remained an exclusive pursuit practiced by few, and not become the widely practiced, foundational field it is today. As other areas of biomedical research increasingly embrace research computing techniques, the successes, failures and lessons of structural biology computing can serve as a useful guide to progress in other biomedically related research fields. Copyright © 2013 Wiley Periodicals, Inc.

  17. Multiscale computer modeling in biomechanics and biomedical engineering

    CERN Document Server

    2013-01-01

    This book reviews the state-of-the-art in multiscale computer modeling, in terms of both accomplishments and challenges. The information in the book is particularly useful for biomedical engineers, medical physicists and researchers in systems biology, mathematical biology, micro-biomechanics and biomaterials who are interested in how to bridge between traditional biomedical engineering work at the organ and tissue scales, and the newer arenas of cellular and molecular bioengineering.

  18. Synergies and Distinctions between Computational Disciplines in Biomedical Research: Perspective from the Clinical and Translational Science Award Programs

    Science.gov (United States)

    Bernstam, Elmer V.; Hersh, William R.; Johnson, Stephen B.; Chute, Christopher G.; Nguyen, Hien; Sim, Ida; Nahm, Meredith; Weiner, Mark; Miller, Perry; DiLaura, Robert P.; Overcash, Marc; Lehmann, Harold P.; Eichmann, David; Athey, Brian D.; Scheuermann, Richard H.; Anderson, Nick; Starren, Justin B.; Harris, Paul A.; Smith, Jack W.; Barbour, Ed; Silverstein, Jonathan C.; Krusch, David A.; Nagarajan, Rakesh; Becich, Michael J.

    2010-01-01

    Clinical and translational research increasingly requires computation. Projects may involve multiple computationally-oriented groups including information technology (IT) professionals, computer scientists and biomedical informaticians. However, many biomedical researchers are not aware of the distinctions among these complementary groups, leading to confusion, delays and sub-optimal results. Although written from the perspective of clinical and translational science award (CTSA) programs within academic medical centers, the paper addresses issues that extend beyond clinical and translational research. The authors describe the complementary but distinct roles of operational IT, research IT, computer science and biomedical informatics using a clinical data warehouse as a running example. In general, IT professionals focus on technology. The authors distinguish between two types of IT groups within academic medical centers: central or administrative IT (supporting the administrative computing needs of large organizations) and research IT (supporting the computing needs of researchers). Computer scientists focus on general issues of computation such as designing faster computers or more efficient algorithms, rather than specific applications. In contrast, informaticians are concerned with data, information and knowledge. Biomedical informaticians draw on a variety of tools, including but not limited to computers, to solve information problems in health care and biomedicine. The paper concludes with recommendations regarding administrative structures that can help to maximize the benefit of computation to biomedical research within academic health centers. PMID:19550198

  19. Transportation Research & Analysis Computing Center

    Data.gov (United States)

    Federal Laboratory Consortium — The technical objectives of the TRACC project included the establishment of a high performance computing center for use by USDOT research teams, including those from...

  20. Biomedical Imaging and Computational Modeling in Biomechanics

    CERN Document Server

    Iacoviello, Daniela

    2013-01-01

    This book collects the state-of-the-art and new trends in image analysis and biomechanics. It covers a wide field of scientific and cultural topics, ranging from remodeling of bone tissue under mechanical stimulus up to optimizing the performance of sports equipment, through patient-specific modeling in orthopedics, microtomography and its application in oral and implant research, computational modeling in the field of hip prostheses, image-based model development and analysis of the human knee joint, kinematics of the hip joint, micro-scale analysis of compositional and mechanical properties of dentin, automated techniques for cervical cell image analysis, and biomedical imaging and computational modeling in cardiovascular disease. The book will be of interest to researchers, Ph.D. students, and graduate students with multidisciplinary interests related to image analysis and understanding, medical imaging, biomechanics, simulation and modeling, experimental analysis.

  1. 1st International Conference on Computational and Experimental Biomedical Sciences

    CERN Document Server

    Jorge, RM

    2015-01-01

    This book contains the full papers presented at ICCEBS 2013 – the 1st International Conference on Computational and Experimental Biomedical Sciences, which was organized in Azores, in October 2013. The included papers present and discuss new trends in those fields, using several methods and techniques, including active shape models, constitutive models, isogeometric elements, genetic algorithms, level sets, material models, neural networks, optimization, and the finite element method, in order to address more efficiently different and timely applications involving biofluids, computer simulation, computational biomechanics, image based diagnosis, image processing and analysis, image segmentation, image registration, scaffolds, simulation, and surgical planning. The main audience for this book consists of researchers, Ph.D students, and graduate students with multidisciplinary interests related to the areas of artificial intelligence, bioengineering, biology, biomechanics, computational fluid dynamics, comput...

  2. Computational Biomechanics: Theoretical Background and Biological/Biomedical Problems

    CERN Document Server

    Tanaka, Masao; Nakamura, Masanori

    2012-01-01

    Rapid developments have taken place in biological/biomedical measurement and imaging technologies as well as in computer analysis and information technologies. The increase in data obtained with such technologies invites the reader into a virtual world that represents realistic biological tissue or organ structures in digital form and allows for simulation and what is called “in silico medicine.” This volume is the third in a textbook series and covers both the basics of continuum mechanics of biosolids and biofluids and the theoretical core of computational methods for continuum mechanics analyses. Several biomechanics problems are provided for better understanding of computational modeling and analysis. Topics include the mechanics of solid and fluid bodies, fundamental characteristics of biosolids and biofluids, computational methods in biomechanics analysis/simulation, practical problems in orthopedic biomechanics, dental biomechanics, ophthalmic biomechanics, cardiovascular biomechanics, hemodynamics...

  3. [Biomedical waste management in the Regional Hospital Center of Ziguinchor].

    Science.gov (United States)

    Ndiaye, Papa; Fall, Cheikh; Diedhiou, Abdoulaye; Tal-Dial, Anta; Diedhiou, Oumar

    2003-01-01

    To make the hospital environment healthier for those it serves, the management of biomedical waste (BMW) was studied in the Ziguinchor Regional Hospital Center (RHC) in Senegal from 1 March through 15 March, 2000. The RHC incinerator had stopped operating in 1993. Problems in BMW management were observed at all levels. Neither identification nor sorting took place during collection. Waste bins were exposed everywhere. Workers carried waste bins on their backs or heads rather than using rolling tables. BMW ended up in a shallow open pit where it was periodically burned. Workers collected, stored, and transported BMW without any type of protection (gloves, boots, masks, aprons, etc.). The principal determinants of this poor management appear to be inadequate funding and training for the cleaning staff, the staff's failure to realize the dangers, and their use of non-standardized practices, due to the absence of BMW policies. BMW management at Ziguinchor RHC must be corrected. Protective equipment must be used systematically. Similarly, standardized practices must be applied to the decontamination of used objects, the identification and sorting at the source, the recovery and recycling of all objects with any remaining value, and the correct storage of BMW. This waste must be transported under high security from its place of storage to its final disposal site. Deep burial has been selected as the most feasible method of disposal under current conditions. A year-long program has been proposed toward this end. Strategies include training, information, motivation, equipment, supervision and evaluation. The budget to implement this program is CFA 5,423,454 francs, distributed between training (22%), equipment (40%), and construction of the pit and follow-up (38%). The tasks are distributed between a public health doctor, department supervisors, and cleaning staff. The follow-up will include three quarterly inspections, at 3, 6, and 9 months, and an evaluation at the end

  4. Lensfree Computational Microscopy Tools and their Biomedical Applications

    Science.gov (United States)

    Sencan, Ikbal

    Conventional microscopy has been a revolutionary tool for biomedical applications since its invention several centuries ago. The ability to non-destructively observe very fine details of biological objects in real time has made it possible to answer many important questions about their structures and functions. Unfortunately, most of these advanced microscopes are complex, bulky, expensive, and/or hard to operate, so they have not reached beyond the walls of well-equipped laboratories. Recent improvements in optoelectronic components and computational methods allow imaging systems to be created that better fulfill the specific needs of clinical or research-related biomedical applications. In this respect, lensfree computational microscopy aims to replace bulky and expensive optical components with compact and cost-effective alternatives through the use of computation, which can be particularly useful for lab-on-a-chip platforms as well as imaging applications in low-resource settings. Several high-throughput on-chip platforms have been built with this approach for applications including, but not limited to, cytometry, micro-array imaging, rare cell analysis, telemedicine, and water quality screening. The lack of optical complexity in these lensfree on-chip imaging platforms is compensated by computational techniques, which are used for various purposes in coherent, incoherent, and fluorescent on-chip imaging platforms, e.g., improving spatial resolution, undoing light diffraction without lenses, localizing objects in a large volume, and retrieving the phase or the color/spectral content of objects. For instance, pixel super-resolution approaches based on source shifting are used in lensfree imaging platforms to prevent undersampling, Bayer pattern, and aliasing artifacts. Another method, iterative phase retrieval, is used to compensate for the lack of lenses by undoing the diffraction and removing the twin-image noise of in-line holograms.
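
    The iterative phase retrieval mentioned above is, at its core, an alternating-projection scheme between the sensor (hologram) plane and the object plane. The following is a minimal Gerchberg–Saxton-style sketch of that idea, assuming measured amplitudes are available in both planes and using a plain Fourier transform as the propagation step; a real lensfree holography pipeline would instead use free-space (e.g., angular-spectrum) propagation over the sample-to-sensor distance and additional constraints to suppress the twin image.

    ```python
    import numpy as np

    def gerchberg_saxton(measured_amp_obj, measured_amp_diff, n_iter=100, seed=0):
        """Minimal Gerchberg-Saxton-style iterative phase retrieval.

        Alternates between the object plane and the diffraction (Fourier) plane,
        enforcing the measured amplitude in each plane while keeping the current
        phase estimate. Returns the recovered object-plane phase.
        """
        rng = np.random.default_rng(seed)
        phase = rng.uniform(0.0, 2.0 * np.pi, measured_amp_obj.shape)  # random start
        field = measured_amp_obj * np.exp(1j * phase)
        for _ in range(n_iter):
            far = np.fft.fft2(field)                                   # to diffraction plane
            far = measured_amp_diff * np.exp(1j * np.angle(far))       # enforce measured amplitude
            field = np.fft.ifft2(far)                                  # back to object plane
            field = measured_amp_obj * np.exp(1j * np.angle(field))    # enforce object amplitude
        return np.angle(field)
    ```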

  5. The National Center for Biomedical Ontology: Advancing Biomedicinethrough Structured Organization of Scientific Knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Rubin, Daniel L.; Lewis, Suzanna E.; Mungall, Chris J.; Misra, Sima; Westerfield, Monte; Ashburner, Michael; Sim, Ida; Chute, Christopher G.; Solbrig, Harold; Storey, Margaret-Anne; Smith, Barry; Day-Richter, John; Noy, Natalya F.; Musen, Mark A.

    2006-01-23

    The National Center for Biomedical Ontology (http://bioontology.org) is a consortium that comprises leading informaticians, biologists, clinicians, and ontologists funded by the NIH Roadmap to develop innovative technology and methods that allow scientists to record, manage, and disseminate biomedical information and knowledge in machine-processable form. The goals of the Center are: (1) to help unify the divergent and isolated efforts in ontology development by promoting high-quality open-source, standards-based tools to create, manage, and use ontologies, (2) to create new software tools so that scientists can use ontologies to annotate and analyze biomedical data, (3) to provide a national resource for the ongoing evaluation, integration, and evolution of biomedical ontologies and associated tools and theories in the context of driving biomedical projects (DBPs), and (4) to disseminate the tools and resources of the Center and to identify, evaluate, and communicate best practices of ontology development to the biomedical community. The Center is working toward these objectives by providing tools to develop ontologies and to annotate experimental data, and by developing resources to integrate and relate existing ontologies as well as by creating repositories of biomedical data that are annotated using those ontologies. The Center is providing training workshops in ontology design, development, and usage, and is also pursuing research in ontology evaluation, quality, and use of ontologies to promote scientific discovery. Through the research activities within the Center, collaborations with the DBPs, and interactions with the biomedical community, our goal is to help scientists to work more effectively in the e-science paradigm, enhancing experiment design, experiment execution, data analysis, information synthesis, hypothesis generation and testing, and understanding of human disease.

  6. Acquisition and manipulation of computed tomography images of the maxillofacial region for biomedical prototyping

    International Nuclear Information System (INIS)

    Meurer, Maria Ines; Silva, Jorge Vicente Lopes da; Santa Barbara, Ailton; Nobre, Luiz Felipe; Oliveira, Marilia Gerhardt de; Silva, Daniela Nascimento

    2008-01-01

    Biomedical prototyping has resulted from a merger of rapid prototyping and imaging diagnosis technologies. However, this process is complex, considering the necessity of interaction between biomedical sciences and engineering. Good results are highly dependent on the acquisition of computed tomography images and their subsequent manipulation by means of specific software. The present study describes the experience of a multidisciplinary group of researchers in the acquisition and manipulation of computed tomography images of the maxillofacial region aiming at biomedical prototyping for surgical purposes. (author)

  7. The center for causal discovery of biomedical knowledge from big data.

    Science.gov (United States)

    Cooper, Gregory F; Bahar, Ivet; Becich, Michael J; Benos, Panayiotis V; Berg, Jeremy; Espino, Jeremy U; Glymour, Clark; Jacobson, Rebecca Crowley; Kienholz, Michelle; Lee, Adrian V; Lu, Xinghua; Scheines, Richard

    2015-11-01

    The Big Data to Knowledge (BD2K) Center for Causal Discovery is developing and disseminating an integrated set of open source tools that support causal modeling and discovery of biomedical knowledge from large and complex biomedical datasets. The Center integrates teams of biomedical and data scientists focused on the refinement of existing and the development of new constraint-based and Bayesian algorithms based on causal Bayesian networks, the optimization of software for efficient operation in a supercomputing environment, and the testing of algorithms and software developed using real data from 3 representative driving biomedical projects: cancer driver mutations, lung disease, and the functional connectome of the human brain. Associated training activities provide both biomedical and data scientists with the knowledge and skills needed to apply and extend these tools. Collaborative activities with the BD2K Consortium further advance causal discovery tools and integrate tools and resources developed by other centers. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
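
    Constraint-based causal discovery of the kind mentioned above is built from repeated conditional-independence tests that decide which edges of a causal Bayesian network can be removed. The sketch below shows one such primitive, a Gaussian partial-correlation test with Fisher's z-transform; it is an illustrative assumption about the kind of building block used, not the Center's actual software, which implements far richer algorithms at supercomputing scale.

    ```python
    import numpy as np
    from scipy import stats

    def partial_corr_ci_test(data, i, j, cond_set, alpha=0.05):
        """Gaussian conditional-independence test via partial correlation.

        Tests X_i independent of X_j given X_cond using Fisher's z-transform.
        Constraint-based searches (PC-style algorithms) call a test like this
        repeatedly to decide which edges of the causal graph to delete.
        """
        idx = [i, j] + list(cond_set)
        corr = np.corrcoef(data[:, idx], rowvar=False)
        prec = np.linalg.pinv(corr)                                   # precision matrix
        r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])            # partial correlation
        n, k = data.shape[0], len(cond_set)
        z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - k - 3)      # Fisher z statistic
        p_value = 2 * (1 - stats.norm.cdf(abs(z)))
        return p_value > alpha                                        # True -> independent at level alpha
    ```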

  8. Biomedical data integration in computational drug design and bioinformatics.

    Science.gov (United States)

    Seoane, Jose A; Aguiar-Pulido, Vanessa; Munteanu, Cristian R; Rivero, Daniel; Rabunal, Juan R; Dorado, Julian; Pazos, Alejandro

    2013-03-01

    In recent years, in the post-genomic era, more and more data is being generated by biological high-throughput technologies, such as proteomics and transcriptomics. This omics data can be very useful, but the real challenge is to analyze all this data, as a whole, after integrating it. Biomedical data integration enables making queries to different, heterogeneous and distributed biomedical data sources. Data integration solutions can be very useful not only in the context of drug design, but also in biomedical information retrieval, clinical diagnosis, systems biology, etc. In this review, we analyze the most common approaches to biomedical data integration, such as federated databases, data warehousing, multi-agent systems and semantic technology, as well as the solutions developed using these approaches in the past few years.

  9. Center for computer security: Computer Security Group conference. Summary

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-06-01

    Topics covered include: computer security management; detection and prevention of computer misuse; certification and accreditation; protection of computer security, perspective from a program office; risk analysis; secure accreditation systems; data base security; implementing R and D; key notarization system; DOD computer security center; the Sandia experience; inspector general's report; and backup and contingency planning. (GHT)

  10. Government Cloud Computing Policies: Potential Opportunities for Advancing Military Biomedical Research.

    Science.gov (United States)

    Lebeda, Frank J; Zalatoris, Jeffrey J; Scheerer, Julia B

    2018-02-07

    This position paper summarizes the development and the present status of Department of Defense (DoD) and other government policies and guidances regarding cloud computing services. Due to the heterogeneous and growing biomedical big datasets, cloud computing services offer an opportunity to mitigate the associated storage and analysis requirements. Having on-demand network access to a shared pool of flexible computing resources creates a consolidated system that should reduce potential duplications of effort in military biomedical research. Interactive, online literature searches were performed with Google, at the Defense Technical Information Center, and at two National Institutes of Health research portfolio information sites. References cited within some of the collected documents also served as literature resources. We gathered, selected, and reviewed DoD and other government cloud computing policies and guidances published from 2009 to 2017. These policies were intended to consolidate computer resources within the government and reduce costs by decreasing the number of federal data centers and by migrating electronic data to cloud systems. Initial White House Office of Management and Budget information technology guidelines were developed for cloud usage, followed by policies and other documents from the DoD, the Defense Health Agency, and the Armed Services. Security standards from the National Institute of Standards and Technology, the Government Services Administration, the DoD, and the Army were also developed. Government Services Administration and DoD Inspectors General monitored cloud usage by the DoD. A 2016 Government Accountability Office report characterized cloud computing as being economical, flexible and fast. A congressionally mandated independent study reported that the DoD was active in offering a wide selection of commercial cloud services in addition to its milCloud system. Our findings from the Department of Health and Human Services

  11. Eleven quick tips for architecting biomedical informatics workflows with cloud computing.

    Science.gov (United States)

    Cole, Brian S; Moore, Jason H

    2018-03-01

    Cloud computing has revolutionized the development and operations of hardware and software across diverse technological arenas, yet academic biomedical research has lagged behind despite the numerous and weighty advantages that cloud computing offers. Biomedical researchers who embrace cloud computing can reap rewards in cost reduction, decreased development and maintenance workload, increased reproducibility, ease of sharing data and software, enhanced security, horizontal and vertical scalability, high availability, a thriving technology partner ecosystem, and much more. Despite these advantages that cloud-based workflows offer, the majority of scientific software developed in academia does not utilize cloud computing and must be migrated to the cloud by the user. In this article, we present 11 quick tips for architecting biomedical informatics workflows on compute clouds, distilling knowledge gained from experience developing, operating, maintaining, and distributing software and virtualized appliances on the world's largest cloud. Researchers who follow these tips stand to benefit immediately by migrating their workflows to cloud computing and embracing the paradigm of abstraction.

  12. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics.

    Science.gov (United States)

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T; Becich, Michael J

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  13. Creating a pipeline of talent for informatics: STEM initiative for high school students in computer science, biology, and biomedical informatics

    Science.gov (United States)

    Dutta-Moscato, Joyeeta; Gopalakrishnan, Vanathi; Lotze, Michael T.; Becich, Michael J.

    2014-01-01

    This editorial provides insights into how informatics can attract highly trained students by involving them in science, technology, engineering, and math (STEM) training at the high school level and continuing to provide mentorship and research opportunities through the formative years of their education. Our central premise is that the trajectory necessary to be expert in the emergent fields in front of them requires acceleration at an early time point. Both pathology (and biomedical) informatics are new disciplines which would benefit from involvement by students at an early stage of their education. In 2009, Michael T Lotze MD, Kirsten Livesey (then a medical student, now a medical resident at University of Pittsburgh Medical Center (UPMC)), Richard Hersheberger, PhD (Currently, Dean at Roswell Park), and Megan Seippel, MS (the administrator) launched the University of Pittsburgh Cancer Institute (UPCI) Summer Academy to bring high school students for an 8 week summer academy focused on Cancer Biology. Initially, pathology and biomedical informatics were involved only in the classroom component of the UPCI Summer Academy. In 2011, due to popular interest, an informatics track called Computer Science, Biology and Biomedical Informatics (CoSBBI) was launched. CoSBBI currently acts as a feeder program for the undergraduate degree program in bioinformatics at the University of Pittsburgh, which is a joint degree offered by the Departments of Biology and Computer Science. We believe training in bioinformatics is the best foundation for students interested in future careers in pathology informatics or biomedical informatics. We describe our approach to the recruitment, training and research mentoring of high school students to create a pipeline of exceptionally well-trained applicants for both the disciplines of pathology informatics and biomedical informatics. We emphasize here how mentoring of high school students in pathology informatics and biomedical informatics

  14. Private Data Analytics on Biomedical Sensing Data via Distributed Computation.

    Science.gov (United States)

    Gong, Yanmin; Fang, Yuguang; Guo, Yuanxiong

    2016-01-01

    Advances in biomedical sensors and mobile communication technologies have fostered the rapid growth of mobile health (mHealth) applications in the past years. Users generate a high volume of biomedical data during health monitoring, which can be used by the mHealth server for training predictive models for disease diagnosis and treatment. However, the biomedical sensing data raise serious privacy concerns because they reveal sensitive information such as health status and lifestyles of the sensed subjects. This paper proposes and experimentally studies a scheme that keeps the training samples private while enabling accurate construction of predictive models. We specifically consider logistic regression models which are widely used for predicting dichotomous outcomes in healthcare, and decompose the logistic regression problem into small subproblems over two types of distributed sensing data, i.e., horizontally partitioned data and vertically partitioned data. The subproblems are solved using individual private data, and thus mHealth users can keep their private data locally and only upload (encrypted) intermediate results to the mHealth server for model training. Experimental results based on real datasets show that our scheme is highly efficient and scalable to a large number of mHealth users.
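
    For the horizontally partitioned case described above, the decomposition can be illustrated very simply: each user computes a logistic-regression gradient on its own local samples and shares only that intermediate result with the mHealth server, which aggregates and updates the model. The sketch below is a simplified assumption of that idea (plain gradient descent, and the encryption of intermediate results used in the paper is omitted).

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def local_gradient(X_local, y_local, w):
        """Gradient of the logistic loss computed on one user's local data only."""
        preds = sigmoid(X_local @ w)
        return X_local.T @ (preds - y_local)

    def federated_logistic_regression(partitions, dim, lr=0.1, n_rounds=200):
        """Sketch of model training over horizontally partitioned sensing data.

        `partitions` is a list of (X, y) pairs, one per user/device. Raw data
        never leave the device; only per-round gradients are shared (the paper
        additionally encrypts these intermediate results).
        """
        w = np.zeros(dim)
        n_total = sum(len(y) for _, y in partitions)
        for _ in range(n_rounds):
            grad = sum(local_gradient(X, y, w) for X, y in partitions) / n_total
            w -= lr * grad
        return w
    ```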

  15. Engineering computations at the national magnetic fusion energy computer center

    International Nuclear Information System (INIS)

    Murty, S.

    1983-01-01

    The National Magnetic Fusion Energy Computer Center (NMFECC) was established by the U.S. Department of Energy's Division of Magnetic Fusion Energy (MFE). The NMFECC headquarters is located at Lawrence Livermore National Laboratory. Its purpose is to apply large-scale computational technology and computing techniques to the problems of controlled thermonuclear research. In addition to providing cost effective computing services, the NMFECC also maintains a large collection of computer codes in mathematics, physics, and engineering that is shared by the entire MFE research community. This review provides a broad perspective of the NMFECC, and a list of available codes at the NMFECC for engineering computations is given

  16. Computational Center for Studies of Microturbulence

    International Nuclear Information System (INIS)

    William Dorland

    2006-01-01

    The Maryland Computational Center for Studies of Microturbulence (CCSM) was one component of a larger, multi-institutional Plasma Microturbulence Project, funded through what eventually became DOE's Scientific Discovery Through Advanced Computing Program. The primary focus of research in CCSM was to develop, deploy, maintain, and utilize kinetic simulation techniques, especially the gyrokinetic code called GS2

  17. Activity report of Computing Research Center

    Energy Technology Data Exchange (ETDEWEB)

    1997-07-01

    In April 1997, the National Laboratory for High Energy Physics (KEK), the Institute for Nuclear Study, University of Tokyo (INS), and the Meson Science Laboratory, Faculty of Science, University of Tokyo were reorganized and began operating as the High Energy Accelerator Research Organization, aiming at further development of a wide field of accelerator science using high energy accelerators. Within this Research Organization, the Applied Research Laboratory is composed of four Centers, formed by integrating the four existing centers and their related sections in Tanashi, to support research activities common to the whole Organization and to carry out related research and development (R and D). This support is expected to include not only general assistance but also the preparation and R and D of the systems required for the promotion and future plans of the research. Computer technology is essential to the development of this research and can be shared across the various research programs of the Organization. In response to these expectations, the new Computing Research Center is required to carry out its duties by working and cooperating with researchers on everything from R and D on data analysis of various experiments to computational physics driven by powerful computing capacity such as supercomputers. The first chapter of this report describes the work and present state of the Data Processing Center of KEK, the second chapter covers the computer room of INS, and future problems for the Computing Research Center are also discussed. (G.K.)

  18. Analysis of Uncertainty and Variability in Finite Element Computational Models for Biomedical Engineering: Characterization and Propagation.

    Science.gov (United States)

    Mangado, Nerea; Piella, Gemma; Noailly, Jérôme; Pons-Prats, Jordi; Ballester, Miguel Ángel González

    2016-01-01

    Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known, and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
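
    Non-intrusive propagation methods of the kind reviewed above treat the finite element solver as a black box: uncertain inputs are sampled, the deterministic model is run unchanged for each sample, and the output statistics are collected. The Monte Carlo sketch below illustrates the idea with a toy closed-form model standing in for an FE solve; the parameter values (a lognormally distributed, cortical-bone-like elastic modulus) are illustrative assumptions, not data from the article.

    ```python
    import numpy as np

    def tip_displacement(E, load=1000.0, length=0.1, area=1e-4):
        """Toy stand-in for an FE solve: axial tip displacement of a bar, u = F*L/(E*A)."""
        return load * length / (E * area)

    def monte_carlo_propagation(n_samples=10_000, seed=0):
        """Non-intrusive uncertainty propagation by plain Monte Carlo sampling.

        Sample the uncertain input (an elastic modulus with biological
        variability), evaluate the deterministic model for each sample, and
        summarize the output distribution. A real biomedical FE model would
        replace `tip_displacement`.
        """
        rng = np.random.default_rng(seed)
        E_samples = rng.lognormal(mean=np.log(17e9), sigma=0.15, size=n_samples)
        u = tip_displacement(E_samples)
        return u.mean(), u.std()

    if __name__ == "__main__":
        mean_u, std_u = monte_carlo_propagation()
        print(f"tip displacement: mean={mean_u:.3e} m, std={std_u:.3e} m")
    ```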

  19. BioPortal: enhanced functionality via new Web services from the National Center for Biomedical Ontology to access and use ontologies in software applications.

    Science.gov (United States)

    Whetzel, Patricia L; Noy, Natalya F; Shah, Nigam H; Alexander, Paul R; Nyulas, Csongor; Tudorache, Tania; Musen, Mark A

    2011-07-01

    The National Center for Biomedical Ontology (NCBO) is one of the National Centers for Biomedical Computing funded under the NIH Roadmap Initiative. Contributing to the national computing infrastructure, NCBO has developed BioPortal, a web portal that provides access to a library of biomedical ontologies and terminologies (http://bioportal.bioontology.org) via the NCBO Web services. BioPortal enables community participation in the evaluation and evolution of ontology content by providing features to add mappings between terms, to add comments linked to specific ontology terms and to provide ontology reviews. The NCBO Web services (http://www.bioontology.org/wiki/index.php/NCBO_REST_services) enable this functionality and provide a uniform mechanism to access ontologies from a variety of knowledge representation formats, such as Web Ontology Language (OWL) and Open Biological and Biomedical Ontologies (OBO) format. The Web services provide multi-layered access to the ontology content, from getting all terms in an ontology to retrieving metadata about a term. Users can easily incorporate the NCBO Web services into software applications to generate semantically aware applications and to facilitate structured data collection.
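
    Incorporating the NCBO Web services into an application amounts to issuing REST calls and parsing the returned ontology content. The sketch below is a minimal illustration using Python's requests library; the data.bioontology.org endpoint, the apikey parameter, and the acronym field in the JSON response are assumptions for illustration, and the NCBO documentation linked above should be consulted for the exact services described in the article.

    ```python
    # Minimal sketch of calling a BioPortal-style REST service. Endpoint path,
    # `apikey` parameter, and JSON field names are assumed for illustration;
    # see the NCBO Web services documentation for the actual API.
    import requests

    BASE_URL = "https://data.bioontology.org"   # assumed REST endpoint
    API_KEY = "YOUR_NCBO_API_KEY"               # placeholder; obtain from BioPortal

    def list_ontologies():
        """Retrieve the library of ontologies exposed through the Web services."""
        resp = requests.get(f"{BASE_URL}/ontologies", params={"apikey": API_KEY})
        resp.raise_for_status()
        return [entry.get("acronym") for entry in resp.json()]

    if __name__ == "__main__":
        print(list_ontologies()[:10])           # e.g., the first ten ontology acronyms
    ```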

  20. Strom Thurmond Biomedical Research Center at the Medical University of South Carolina, Charleston, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    1994-02-01

    The Department of Energy (DOE) has prepared an Environmental Assessment (EA) evaluating the proposed construction and operation of the Strom Thurmond Biomedical Research Center (Center) at the Medical University of South Carolina (MUSC), Charleston, SC. The DOE is evaluating a grant proposal to authorize the MUSC to construct, equip and operate the lower two floors of the proposed nine-story Center as an expansion of on-going clinical research and out-patient diagnostic activities of the Cardiology Division of the existing Gazes Cardiac Research Institute. Based on the analysis in the EA, the DOE has determined that the proposed action does not constitute a major federal action significantly affecting the quality of the human environment within the meaning of the NEPA. Therefore, the preparation of an Environmental Impact Statement is not required.

  1. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health

    Science.gov (United States)

    Tromberg, Bruce J.; Anderson, R. Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W.; Parrish, John A.; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  2. Biomedical optics centers: forty years of multidisciplinary clinical translation for improving human health.

    Science.gov (United States)

    Tromberg, Bruce J; Anderson, R Rox; Birngruber, Reginald; Brinkmann, Ralf; Berns, Michael W; Parrish, John A; Apiou-Sbirlea, Gabriela

    2016-12-01

    Despite widespread government and public interest, there are significant barriers to translating basic science discoveries into clinical practice. Biophotonics and biomedical optics technologies can be used to overcome many of these hurdles, due, in part, to offering new portable, bedside, and accessible devices. The current JBO special issue highlights promising activities and examples of translational biophotonics from leading laboratories around the world. We identify common essential features of successful clinical translation by examining the origins and activities of three major international academic affiliated centers with beginnings traceable to the mid-late 1970s: The Wellman Center for Photomedicine (Mass General Hospital, USA), the Beckman Laser Institute and Medical Clinic (University of California, Irvine, USA), and the Medical Laser Center Lübeck at the University of Lübeck, Germany. Major factors driving the success of these programs include visionary founders and leadership, multidisciplinary research and training activities in light-based therapies and diagnostics, diverse funding portfolios, and a thriving entrepreneurial culture that tolerates risk. We provide a brief review of how these three programs emerged and highlight critical phases and lessons learned. Based on these observations, we identify pathways for encouraging the growth and formation of similar programs in order to more rapidly and effectively expand the impact of biophotonics and biomedical optics on human health.

  3. The Brazilian research and teaching center in biomedicine and aerospace biomedical engineering.

    Science.gov (United States)

    Russomano, T; Falcao, P F; Dalmarco, G; Martinelli, L; Cardoso, R; Santos, M A; Sparenberg, A

    2008-08-01

    The recent engagement of Brazil in the construction and utilization of the International Space Station has motivated several Brazilian research institutions and universities to establish study centers related to Space Sciences. The Pontificia Universidade Catolica do Rio Grande do Sul (PUCRS) is no exception. In 1993 the University initiated the first degree course in South America training students to operate commercial aircraft (the School of Aeronautical Sciences). A further step was the decision to build the first Brazilian laboratory dedicated to the conduct of experiments in ground-based microgravity simulation. Established in 1998, the Microgravity Laboratory, which was located in the Instituto de Pesquisas Cientificas e Tecnologicas (IPCT), was supported by the Schools of Medicine, Aeronautical Sciences and Electrical Engineering/Biomedical Engineering. At the end of 2006, the Microgravity Laboratory became a Center and was transferred to the School of Engineering. The principal activities of the Microgravity Centre are the development of research projects related to human physiology before, during and after ground-based microgravity simulation and parabolic flights, to aviation medicine in the 21st century and to aerospace biomedical engineering. The history of Brazilian, and arguably worldwide, space science passes through PUCRS. Over time, the pioneering spirit of the University in the aerospace area has become undeniable, thanks to the professionals, students, technicians and staff who have worked or are still working in the Microgravity Centre and who excel in their technical and scientific qualifications.

  4. Tsinghua-Johns Hopkins Joint Center for Biomedical Engineering Research: scientific and cultural exchange in undergraduate engineering.

    Science.gov (United States)

    Wisneski, Andrew D; Huang, Lixia; Hong, Bo; Wang, Xiaoqin

    2011-01-01

    A model for an international undergraduate biomedical engineering research exchange program is outlined. In 2008, the Johns Hopkins University in collaboration with Tsinghua University in Beijing, China established the Tsinghua-Johns Hopkins Joint Center for Biomedical Engineering Research. Undergraduate biomedical engineering students from both universities are offered the opportunity to participate in research at the overseas institution. Programs such as these will not only provide research experiences for undergraduates but valuable cultural exchange and enrichment as well. Currently, strict course scheduling and rigorous curricula in most biomedical engineering programs may present obstacles for students to partake in study abroad opportunities. Universities are encouraged to harbor abroad opportunities for undergraduate engineering students, for which this particular program can serve as a model.

  5. The Computational Physics Program of the national MFE Computer Center

    Energy Technology Data Exchange (ETDEWEB)

    Mirin, A.A.

    1989-01-01

    Since June 1974, the MFE Computer Center has been engaged in a significant computational physics effort. The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group has been involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to tokamaks and compact toroids. A third area is the investigation of kinetic instabilities using a 3-D particle code; this work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence have been under examination, with the hope of being able to explain anomalous transport. Also, we are collaborating in an international effort to evaluate fully three-dimensional linear stability of toroidal devices. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers. A summary of these programs is included in this paper. 6 tabs.

  6. A Computer Learning Center for Environmental Sciences

    Science.gov (United States)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of researchers from many disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draws upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  7. [Nondestructive imaging of elements distribution in biomedical samples by X-ray fluorescence computed tomography].

    Science.gov (United States)

    Yang, Qun; Deng, Biao; Lü, Wei-Wei; Du, Guo-Hao; Yan, Fu-Hua; Xiao, Ti-Qiao; Xu, Hong-Jie

    2011-10-01

    X-ray fluorescence computed tomography is a stimulated-emission tomography that allows nondestructive reconstruction of the element distribution in a sample, which is important for biomedical investigations. Owing to the high flux density and easy energy tunability of highly collimated synchrotron X-rays, it is possible to apply X-ray fluorescence CT to biomedical samples. In the present paper, an X-ray fluorescence CT system established at the Shanghai Synchrotron Radiation Facility for investigating trace element distributions inside biomedical samples is reported. By optimizing the experimental setup, the spatial resolution was improved and the data acquisition process was noticeably sped up. The maximum-likelihood expectation-maximization algorithm was introduced for the image reconstruction, which remarkably improved the imaging accuracy of element distributions. The developed system was verified with a test sample and a medical sample, respectively. The results showed that the distribution of the elements of interest could be imaged correctly, and a spatial resolution of 150 μm was achieved. In conclusion, the developed system can be applied to research on large biomedical samples with respect to imaging accuracy, spatial resolution and data collection time.
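    The maximum-likelihood expectation-maximization (MLEM) reconstruction mentioned above can be sketched in a few lines. The following Python fragment is an illustrative sketch only, assuming a hypothetical precomputed system matrix A that maps pixel concentrations to measured fluorescence counts; it is not the authors' implementation.

        import numpy as np

        def mlem(A, y, n_iter=50, eps=1e-12):
            """MLEM reconstruction for emission-type tomography.

            A : (n_measurements, n_pixels) system matrix (assumed precomputed
                from the scan geometry); y : measured fluorescence counts.
            """
            x = np.ones(A.shape[1])                  # start from a uniform image
            sens = A.T @ np.ones(A.shape[0]) + eps   # sensitivity image
            for _ in range(n_iter):
                proj = A @ x + eps                   # forward-project current estimate
                x *= (A.T @ (y / proj)) / sens       # multiplicative MLEM update
            return x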

  8. A Ten-Year Assessment of a Biomedical Engineering Summer Research Internship within a Comprehensive Cancer Center

    Science.gov (United States)

    Wright, A. S.; Wu, X.; Frye, C. A.; Mathur, A. B.; Patrick, C. W., Jr.

    2007-01-01

    A Biomedical Engineering Internship Program conducted within a Comprehensive Cancer Center over a 10 year period was assessed and evaluated. Although this is a non-traditional location for an internship, it is an ideal site for a multidisciplinary training program for science, technology, engineering, and mathematics (STEM) students. We made a…

  9. Patterns of biomedical science production in a sub-Saharan research center

    Directory of Open Access Journals (Sweden)

    Agnandji Selidji T

    2012-03-01

    Full Text Available Abstract. Background: Research activities in sub-Saharan Africa may be limited to delegated tasks due to the strong control from Western collaborators, which could lead to scientific production of little value in terms of its impact on social and economic innovation in less developed areas. However, the current contexts of international biomedical research, including the development of public-private partnerships and research institutions in Africa, suggest that scientific activities are growing in sub-Saharan Africa. This study aims to describe the patterns of clinical research activities at a sub-Saharan biomedical research center. Methods: In-depth interviews were conducted with a core group of researchers at the Medical Research Unit (MRU) of the Albert Schweitzer Hospital from June 2009 to February 2010 in Lambaréné, Gabon. Scientific activities running at the MRU as well as the implementation of ethical and regulatory standards were covered by the interview sessions. Results: The framework of clinical research includes transnational studies and research initiated locally. In transnational collaborations, a sub-Saharan research institution may be limited to producing confirmatory and late-stage data with little impact on economic and social innovation. However, ethical and regulatory guidelines are being implemented taking into consideration the local contexts. Similarly, the scientific content of studies designed by researchers at the MRU, if local needs are taken into account, may potentially contribute to a scientific production with long-term value for social and economic innovation in sub-Saharan Africa. Conclusion: Further research questions and methods in social sciences should comprehensively address the construction of scientific content within the social, economic and cultural contexts surrounding research activities.

  10. A multipurpose computing center with distributed resources

    Science.gov (United States)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

    The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects Pierre Auger Observatory (PAO) and Cherenkov Telescope Array (CTA). An OSG stack is installed for the NOvA experiment. Other groups of users use the local batch system directly. Storage capacity is distributed over several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated at the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources with the standard ATLAS tools in the same way as the local storage, without noticing this geographical distribution. The computing clusters LUNA and EXMAG, dedicated mostly to users from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on Torque with a custom scheduler. The clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12000 cores in total.

  11. Effective use of latent semantic indexing and computational linguistics in biological and biomedical applications.

    Science.gov (United States)

    Chen, Hongyu; Martin, Bronwen; Daimon, Caitlin M; Maudsley, Stuart

    2013-01-01

    Text mining is rapidly becoming an essential technique for the annotation and analysis of large biological data sets. Biomedical literature currently increases at a rate of several thousand papers per week, making automated information retrieval methods the only feasible method of managing this expanding corpus. With the increasing prevalence of open-access journals and constant growth of publicly-available repositories of biomedical literature, literature mining has become much more effective with respect to the extraction of biomedically-relevant data. In recent years, text mining of popular databases such as MEDLINE has evolved from basic term-searches to more sophisticated natural language processing techniques, indexing and retrieval methods, structural analysis and integration of literature with associated metadata. In this review, we will focus on Latent Semantic Indexing (LSI), a computational linguistics technique increasingly used for a variety of biological purposes. It is noted for its ability to consistently outperform benchmark Boolean text searches and co-occurrence models at information retrieval and its power to extract indirect relationships within a data set. LSI has been used successfully to formulate new hypotheses, generate novel connections from existing data, and validate empirical data.
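    As a rough illustration of how LSI is typically applied to such a corpus, the sketch below projects a few toy documents into a low-rank latent space with scikit-learn; the documents, component count, and library choice are assumptions for illustration, not taken from the review.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.decomposition import TruncatedSVD
        from sklearn.metrics.pairwise import cosine_similarity

        # Toy stand-ins for biomedical abstracts (hypothetical).
        docs = [
            "insulin signaling regulates longevity and stress resistance",
            "daf-16 mutants show extended life span in C. elegans",
            "graph cuts segment micro-CT images of bone tissue",
        ]

        X = TfidfVectorizer(stop_words="english").fit_transform(docs)

        # LSI is essentially a truncated SVD of the term-document matrix.
        lsi = TruncatedSVD(n_components=2, random_state=0)
        doc_vecs = lsi.fit_transform(X)

        # Similarity in the latent space can expose indirect relationships
        # that plain Boolean or co-occurrence matching would miss.
        print(cosine_similarity(doc_vecs))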

  12. Effective use of Latent Semantic Indexing and Computational Linguistics in Biological and Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Hongyu eChen

    2013-01-01

    Full Text Available Text mining is rapidly becoming an essential technique for the annotation and analysis of large biological data sets. Biomedical literature currently increases at a rate of several thousand papers per week, making automated information retrieval methods the only feasible method of managing this expanding corpus. With the increasing prevalence of open-access journals and constant growth of publicly-available repositories of biomedical literature, literature mining has become much more effective with respect to the extraction of biomedically-relevant data. In recent years, text mining of popular databases such as MEDLINE has evolved from basic term-searches to more sophisticated natural language processing techniques, indexing and retrieval methods, structural analysis and integration of literature with associated metadata. In this review, we will focus on Latent Semantic Indexing (LSI), a computational linguistics technique increasingly used for a variety of biological purposes. It is noted for its ability to consistently outperform benchmark Boolean text searches and co-occurrence models at information retrieval and its power to extract indirect relationships within a data set. LSI has been used successfully to formulate new hypotheses, generate novel connections from existing data, and validate empirical data.

  13. Credibility Assessment of Deterministic Computational Models and Simulations for Space Biomedical Research and Operations

    Science.gov (United States)

    Mulugeta, Lealem; Walton, Marlei; Nelson, Emily; Myers, Jerry

    2015-01-01

    Human missions beyond low Earth orbit to destinations such as Mars and asteroids will expose astronauts to novel operational conditions that may pose health risks which are currently not well understood and perhaps unanticipated. In addition, there are limited clinical and research data to inform development and implementation of health risk countermeasures for these missions. Consequently, NASA's Digital Astronaut Project (DAP) is working to develop and implement computational models and simulations (M&S) to help predict and assess spaceflight health and performance risks, and enhance countermeasure development. In order to effectively accomplish these goals, the DAP evaluates its models and simulations via a rigorous verification, validation and credibility assessment process to ensure that the computational tools are sufficiently reliable both to inform research intended to mitigate potential risk and to guide countermeasure development. In doing so, DAP works closely with end-users, such as space life science researchers, to establish appropriate M&S credibility thresholds. We will present and demonstrate the process the DAP uses to vet computational M&S for space biomedical analysis using real M&S examples. We will also provide recommendations on how the larger space biomedical community can employ these concepts to enhance the credibility of their M&S codes.

  14. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined.

  15. Survey of biomedical and environmental data bases, models, and integrated computer systems at Argonne National Laboratory

    International Nuclear Information System (INIS)

    Murarka, I.P.; Bodeau, D.J.; Scott, J.M.; Huebner, R.H.

    1978-08-01

    This document contains an inventory (index) of information resources pertaining to biomedical and environmental projects at Argonne National Laboratory--the information resources include a data base, model, or integrated computer system. Entries are categorized as models, numeric data bases, bibliographic data bases, or integrated hardware/software systems. Descriptions of the Information Coordination Focal Point (ICFP) program, the system for compiling this inventory, and the plans for continuing and expanding it are given, and suggestions for utilizing the services of the ICFP are outlined

  16. The problem of organization of a coastal coordinating computer center

    Science.gov (United States)

    Dyubkin, I. A.; Lodkin, I. I.

    1974-01-01

    The fundamental principles of the operation of a coastal coordinating and computing center under conditions of automation are presented. Special attention is devoted to the work of Coastal Computer Center of the Arctic and Antarctic Scientific Research Institute. This center generalizes from data collected in expeditions and also from observations made at polar stations.

  17. Online object oriented Monte Carlo computational tool for the needs of biomedical optics.

    Science.gov (United States)

    Doronin, Alexander; Meglinski, Igor

    2011-09-01

    Conceptual engineering design and optimization of laser-based imaging techniques and optical diagnostic systems used in the field of biomedical optics requires a clear understanding of the light-tissue interaction and of the peculiarities of localization of the detected optical radiation within the medium. The description of photon migration within turbid tissue-like media is based on the concept of radiative transfer, which forms the basis of Monte Carlo (MC) modeling. The opportunity to directly simulate the influence of structural variations of biological tissues on the probing light makes MC a primary tool for biomedical optics and optical engineering. Due to the diversity of optical modalities utilizing different properties of light and different mechanisms of light-tissue interaction, a new MC code typically has to be developed for each particular diagnostic application. In the current paper, introducing an object-oriented concept of MC modeling and utilizing modern web applications, we present a generalized online computational tool suitable for the major applications in biophotonics. The computation is supported by an NVIDIA CUDA Graphics Processing Unit, providing acceleration of modeling by up to 340 times.
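    To make the radiative-transfer basis of such MC codes concrete, the sketch below traces a single photon packet through a homogeneous medium using implicit-capture (albedo) weighting; the optical coefficients and the isotropic phase function are simplifying assumptions for illustration, not part of the authors' GPU tool.

        import numpy as np

        rng = np.random.default_rng(0)

        def trace_photon(mu_a=0.1, mu_s=10.0, n_max=1000):
            """Random walk of one photon packet in an infinite turbid medium.

            mu_a, mu_s : absorption and scattering coefficients [1/mm]
            (illustrative values). Real tissue codes typically use the
            Henyey-Greenstein phase function instead of isotropic scattering.
            """
            mu_t = mu_a + mu_s
            pos = np.zeros(3)
            direction = np.array([0.0, 0.0, 1.0])
            weight = 1.0
            for _ in range(n_max):
                step = -np.log(rng.random()) / mu_t     # sample free path length
                pos += step * direction
                weight *= mu_s / mu_t                   # deposit absorbed fraction
                if weight < 1e-4:                       # crude termination
                    break
                cos_t = 2.0 * rng.random() - 1.0        # isotropic scattering angle
                phi = 2.0 * np.pi * rng.random()
                sin_t = np.sqrt(1.0 - cos_t ** 2)
                direction = np.array([sin_t * np.cos(phi),
                                      sin_t * np.sin(phi), cos_t])
            return pos, weight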

  18. The NIH-NIAID Schistosomiasis Resource Center at the Biomedical Research Institute: Molecular Redux.

    Directory of Open Access Journals (Sweden)

    James J Cody

    2016-10-01

    Full Text Available Schistosomiasis remains a health burden in many parts of the world. The complex life cycle of Schistosoma parasites and the economic and societal conditions present in endemic areas make the prospect of eradication unlikely in the foreseeable future. Continued and vigorous research efforts must therefore be directed at this disease, particularly since only a single World Health Organization (WHO)-approved drug is available for treatment. The National Institutes of Health (NIH)-National Institute of Allergy and Infectious Diseases (NIAID) Schistosomiasis Resource Center (SRC) at the Biomedical Research Institute provides investigators with the critical raw materials needed to carry out this important research. The SRC makes available, free of charge (including international shipping costs), not only infected host organisms but also a wide array of molecular reagents derived from all life stages of each of the three main human schistosome parasites. As the field of schistosomiasis research rapidly advances, it is likely to become increasingly reliant on omics, transgenics, epigenetics, and microbiome-related research approaches. The SRC has and will continue to monitor and contribute to advances in the field in order to support these research efforts with an expanding array of molecular reagents. In addition to providing investigators with source materials, the SRC has expanded its educational mission by offering a molecular techniques training course and has recently organized an international schistosomiasis-focused meeting. This review provides an overview of the materials and services that are available at the SRC for schistosomiasis researchers, with a focus on updates that have occurred since the original overview in 2008.

  19. NASA Center for Computational Sciences: History and Resources

    Science.gov (United States)

    2000-01-01

    The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.

  20. Center for Computing Research Summer Research Proceedings 2015.

    Energy Technology Data Exchange (ETDEWEB)

    Bradley, Andrew Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Parks, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-18

    The Center for Computing Research (CCR) at Sandia National Laboratories organizes a summer student program each year, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).

  1. Ranking Iranian biomedical research centers according to H-variants (G, M, A, R) in Scopus and Web of Science.

    Science.gov (United States)

    Mahmudi, Zoleikha; Tahamtan, Iman; Sedghi, Shahram; Roudbari, Masoud

    2015-01-01

    We conducted a comprehensive bibliometrics analysis to calculate the H, G, M, A and R indicators for all Iranian biomedical research centers (IBRCs) from the output of ISI Web of Science (WoS) and Scopus between 1991 and 2010. We compared the research performance of the research centers according to these indicators. This was a cross-sectional and descriptive-analytical study, conducted on 104 Iranian biomedical research centers between August and September 2011. We collected our data through Scopus and WoS. Pearson correlation coefficient between the scientometrics indicators was calculated using SPSS, version 16. The mean values of all indicators were higher in Scopus than in WoS. Drug Applied Research Center of Tabriz University of Medical Sciences had the highest number of publications in both WoS and Scopus databases. This research center along with Royan Institute received the highest number of citations in both Scopus and WoS, respectively. The highest correlation was seen between G and R (.998) in WoS and between G and R (.990) in Scopus. Furthermore, the highest overlap of the 10 top IBRCs was between G and H in WoS (100%) and between G-R (90%) and H-R (90%) in Scopus. Research centers affiliated to the top ranked Iranian medical universities obtained a better position with respect to the studied scientometrics indicators. All aforementioned indicators are important for ranking bibliometrics studies as they refer to different attributes of scientific output and citation aspects.
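    For reference, the H and G indicators used in such rankings can be computed directly from a list of citation counts; the snippet below is a generic sketch with hypothetical numbers, not data from the study.

        def h_index(citations):
            """Largest h such that h papers have at least h citations each."""
            ranked = sorted(citations, reverse=True)
            return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

        def g_index(citations):
            """Largest g such that the top g papers together have >= g**2 citations."""
            ranked = sorted(citations, reverse=True)
            total, g = 0, 0
            for rank, c in enumerate(ranked, start=1):
                total += c
                if total >= rank * rank:
                    g = rank
            return g

        papers = [45, 32, 20, 12, 9, 7, 3, 1]      # hypothetical citation counts
        print(h_index(papers), g_index(papers))    # -> 6 8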

  2. Bioinformatics and Computational Core Technology Center

    Data.gov (United States)

    Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...

  3. Analysis of uncertainty and variability in finite element computational models for biomedical engineering: characterization and propagation

    Directory of Open Access Journals (Sweden)

    Nerea Mangado

    2016-11-01

    Full Text Available Computational modeling has become a powerful tool in biomedical engineering thanks to its potential to simulate coupled systems. However, real parameters are usually not accurately known and variability is inherent in living organisms. To cope with this, probabilistic tools, statistical analysis and stochastic approaches have been used. This article aims to review the analysis of uncertainty and variability in the context of finite element modeling in biomedical engineering. Characterization techniques and propagation methods are presented, as well as examples of their applications in biomedical finite element simulations. Uncertainty propagation methods, both non-intrusive and intrusive, are described. Finally, pros and cons of the different approaches and their use in the scientific community are presented. This leads us to identify future directions for research and methodological development of uncertainty modeling in biomedical engineering.
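    A minimal example of the non-intrusive, sampling-based propagation discussed in the review: uncertain inputs are drawn from assumed distributions and pushed through a cheap stand-in for a finite element solve. The cantilever-beam surrogate and all parameter values below are illustrative assumptions, not taken from the article.

        import numpy as np

        rng = np.random.default_rng(42)

        def tip_deflection(E, t):
            """Stand-in for an expensive FE solve: cantilever tip deflection
            delta = F*L**3 / (3*E*I) with I = w*t**3/12 (illustrative only)."""
            F, L, w = 100.0, 0.2, 0.02        # load [N], length [m], width [m]
            I = w * t ** 3 / 12.0
            return F * L ** 3 / (3.0 * E * I)

        # Characterize input uncertainty, then propagate it by Monte Carlo sampling.
        E = rng.normal(17e9, 1.5e9, size=5000)     # Young's modulus [Pa]
        t = rng.normal(0.004, 0.0003, size=5000)   # thickness [m]
        delta = tip_deflection(E, t)
        print(delta.mean(), delta.std())           # summary statistics of the output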

  4. Energy Consumption in Cloud Computing Data Centers

    OpenAIRE

    Uchechukwu Awada; Keqiu Li; Yanming Shen

    2014-01-01

    The implementation of cloud computing has promoted computing as a utility and enables pervasive applications from scientific, consumer and business domains. However, it faces concerns over tremendous energy consumption, carbon dioxide emissions and the associated costs. With energy consumption becoming a key issue for the operation and maintenance of cloud datacenters, cloud computing providers are becoming profoundly concerned. In this paper, we present formulations and solutions fo...
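    The kind of formulation referred to in the abstract is commonly built on a linear server power model; the sketch below shows that widely used model with illustrative parameter values and is not the authors' formulation.

        def server_power(u, p_idle=100.0, p_peak=250.0):
            """Linear power model: idle draw plus a load-proportional term,
            where u is CPU utilization in [0, 1] (parameters are illustrative)."""
            return p_idle + (p_peak - p_idle) * u

        def energy_kwh(utilization_trace, dt_hours=1.0):
            """Integrate power over a utilization trace sampled every dt_hours."""
            return sum(server_power(u) for u in utilization_trace) * dt_hours / 1000.0

        # One day of hourly utilization for a single server.
        trace = [0.1] * 8 + [0.7] * 10 + [0.3] * 6
        print(energy_kwh(trace), "kWh")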

  5. Building the Teraflops/Petabytes Production Computing Center

    International Nuclear Information System (INIS)

    Kramer, William T.C.; Lucas, Don; Simon, Horst D.

    1999-01-01

    In just one decade, the 1990s, supercomputer centers have undergone two fundamental transitions which require rethinking their operation and their role in high performance computing. The first transition in the early to mid-1990s resulted from a technology change in high performance computing architecture. Highly parallel distributed memory machines built from commodity parts increased the operational complexity of the supercomputer center, and required the introduction of intellectual services as equally important components of the center. The second transition is happening in the late 1990s as centers are introducing loosely coupled clusters of SMPs as their premier high performance computing platforms, while dealing with an ever-increasing volume of data. In addition, increasing network bandwidth enables new modes of use of a supercomputer center, in particular, computational grid applications. In this paper we describe what steps NERSC is taking to address these issues and stay at the leading edge of supercomputing centers.

  6. Biomedical Informatics Research and Education at the EuroMISE Center

    Czech Academy of Sciences Publication Activity Database

    Zvárová, Jana

    2006-01-01

    Roč. 45, Suppl. (2006), s. 166-173 ISSN 0026-1270 Grant - others:Evropské sociální fondy CZ04307/42011/0013 Institutional research plan: CEZ:AV0Z10300504 Keywords : biomedical informatics * research * education * healthcare * information society Subject RIV: BJ - Thermodynamics Impact factor: 1.684, year: 2006

  7. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  8. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2001-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  9. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2002-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  10. THE CENTER FOR DATA INTENSIVE COMPUTING

    Energy Technology Data Exchange (ETDEWEB)

    GLIMM,J.

    2003-11-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook.

  11. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2001-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook

  12. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2003-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook

  13. THE CENTER FOR DATA INTENSIVE COMPUTING

    International Nuclear Information System (INIS)

    GLIMM, J.

    2002-01-01

    CDIC will provide state-of-the-art computational and computer science for the Laboratory and for the broader DOE and scientific community. We achieve this goal by performing advanced scientific computing research in the Laboratory's mission areas of High Energy and Nuclear Physics, Biological and Environmental Research, and Basic Energy Sciences. We also assist other groups at the Laboratory to reach new levels of achievement in computing. We are "data intensive" because the production and manipulation of large quantities of data are hallmarks of scientific research in the 21st century and are intrinsic features of major programs at Brookhaven. An integral part of our activity to accomplish this mission will be a close collaboration with the University at Stony Brook

  14. Computer Software Management and Information Center

    Science.gov (United States)

    1983-01-01

    Computer programs for passive anti-roll tank, earth resources laboratory applications, the NIMBUS-7 coastal zone color scanner derived products, transportable applications executive, plastic and failure analysis of composites, velocity gradient method for calculating velocities in an axisymmetric annular duct, an integrated procurement management system, data I/O PRON for the Motorola exorcisor, aerodynamic shock-layer shape, kinematic modeling, hardware library for a graphics computer, and a file archival system are documented.

  15. Computational Physics Program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1984-12-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. A summary of the group's activities is presented, including computational studies in MHD equilibria and stability, plasma transport, Fokker-Planck, and efficient numerical and programming algorithms. References are included

  16. Ranking Iranian biomedical research centers according to H-variants (G, M, A, R) in Scopus and Web of Science

    Science.gov (United States)

    Mahmudi, Zoleikha; Tahamtan, Iman; Sedghi, Shahram; Roudbari, Masoud

    2015-01-01

    Background: We conducted a comprehensive bibliometrics analysis to calculate the H, G, M, A and R indicators for all Iranian biomedical research centers (IBRCs) from the output of ISI Web of Science (WoS) and Scopus between 1991 and 2010. We compared the research performance of the research centers according to these indicators. Methods: This was a cross-sectional and descriptive-analytical study, conducted on 104 Iranian biomedical research centers between August and September 2011. We collected our data through Scopus and WoS. Pearson correlation coefficient between the scientometrics indicators was calculated using SPSS, version 16. Results: The mean values of all indicators were higher in Scopus than in WoS. Drug Applied Research Center of Tabriz University of Medical Sciences had the highest number of publications in both WoS and Scopus databases. This research center along with Royan Institute received the highest number of citations in both Scopus and WoS, respectively. The highest correlation was seen between G and R (.998) in WoS and between G and R (.990) in Scopus. Furthermore, the highest overlap of the 10 top IBRCs was between G and H in WoS (100%) and between G-R (90%) and H-R (90%) in Scopus. Conclusion: Research centers affiliated to the top ranked Iranian medical universities obtained a better position with respect to the studied scientometrics indicators. All aforementioned indicators are important for ranking bibliometrics studies as they refer to different attributes of scientific output and citation aspects. PMID:26478875

  17. Building a High Performance Computing Infrastructure for Novosibirsk Scientific Center

    International Nuclear Information System (INIS)

    Adakin, A; Chubarov, D; Nikultsev, V; Belov, S; Kaplin, V; Sukharev, A; Zaytsev, A; Kalyuzhny, V; Kuchin, N; Lomakin, S

    2011-01-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including the Budker Institute of Nuclear Physics (BINP), the Institute of Computational Technologies (ICT), and the Institute of Computational Mathematics and Mathematical Geophysics (ICM and MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, there are currently several computing facilities hosted by NSC institutes, each optimized for a particular set of tasks; the largest of these are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM and MG), and the Grid Computing Facility of BINP. Recently a dedicated optical network with an initial bandwidth of 10 Gbps connecting these three facilities was built in order to make it possible to share computing resources among the research communities of the participating institutes, thus providing a common platform for building the computing infrastructure for various scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technologies based on the XEN and KVM platforms. The implemented solution was tested thoroughly within the computing environment of the KEDR detector experiment, which is being carried out at BINP, and is foreseen to be applied to the use cases of other HEP experiments in the near future.

  18. The computational physics program of the National MFE Computer Center

    International Nuclear Information System (INIS)

    Mirin, A.A.

    1988-01-01

    The principal objective of the Computational Physics Group is to develop advanced numerical models for the investigation of plasma phenomena and the simulation of present and future magnetic confinement devices. Another major objective of the group is to develop efficient algorithms and programming techniques for current and future generations of supercomputers. The Computational Physics Group is involved in several areas of fusion research. One main area is the application of Fokker-Planck/quasilinear codes to tokamaks. Another major area is the investigation of resistive magnetohydrodynamics in three dimensions, with applications to compact toroids. A third major area is the investigation of kinetic instabilities using a 3-D particle code. This work is often coupled with the task of numerically generating equilibria which model experimental devices. Ways to apply statistical closure approximations to study tokamak-edge plasma turbulence are being examined. In addition to these computational physics studies, the group has developed a number of linear systems solvers for general classes of physics problems and has been making a major effort at ascertaining how to efficiently utilize multiprocessor computers

  19. Lecture 4: Cloud Computing in Large Computer Centers

    CERN Multimedia

    CERN. Geneva

    2013-01-01

    This lecture will introduce Cloud Computing concepts, identifying and analyzing its characteristics, models, and applications. You will also learn how CERN built its Cloud infrastructure and which tools are being used to deploy and manage it. About the speaker: Belmiro Moreira is an enthusiastic software engineer passionate about the challenges and complexities of architecting and deploying Cloud Infrastructures in ve...

  20. Biomedical, Artificial Intelligence, and DNA Computing Photonics Applications and Web Engineering, Wilga, May 2012

    Science.gov (United States)

    Romaniuk, Ryszard S.

    2012-05-01

    This paper is the fifth part (out of five) of the research survey of WILGA Symposium work, May 2012 Edition, concerned with Biomedical, Artificial Intelligence and DNA Computing technologies. It presents a digest of selected technical work results presented by young researchers from different technical universities in this country during the Jubilee XXXth SPIE-IEEE Wilga 2012, May Edition, symposium on Photonics and Web Engineering. Topical tracks of the symposium embraced, among others, nanomaterials and nanotechnologies for photonics, sensory and nonlinear optical fibers, object-oriented design of hardware, photonic metrology, optoelectronics and photonics applications, photonics-electronics co-design, optoelectronic and electronic systems for astronomy and high energy physics experiments, and development of the JET tokamak and Pi-of-the-Sky experiments. The symposium is an annual summary of the development of numerous Ph.D. theses carried out in this country in the area of advanced electronic and photonic systems. It is also a great occasion for SPIE, IEEE, OSA and PSP students to meet together in a large group spanning the whole country, with guests from this part of Europe. A digest of Wilga references is presented [1-270].

  1. Statistical modeling of biomedical corpora: mining the Caenorhabditis Genetic Center Bibliography for genes related to life span

    Directory of Open Access Journals (Sweden)

    Jordan MI

    2006-05-01

    Full Text Available Abstract. Background: The statistical modeling of biomedical corpora could yield integrated, coarse-to-fine views of biological phenomena that complement discoveries made from analysis of molecular sequence and profiling data. Here, the potential of such modeling is demonstrated by examining the 5,225 free-text items in the Caenorhabditis Genetic Center (CGC) Bibliography using techniques from statistical information retrieval. Items in the CGC biomedical text corpus were modeled using the Latent Dirichlet Allocation (LDA) model. LDA is a hierarchical Bayesian model which represents a document as a random mixture over latent topics; each topic is characterized by a distribution over words. Results: An LDA model estimated from CGC items had better predictive performance than two standard models (unigram and mixture of unigrams) trained using the same data. To illustrate the practical utility of LDA models of biomedical corpora, a trained CGC LDA model was used for a retrospective study of nematode genes known to be associated with life span modification. Corpus-, document-, and word-level LDA parameters were combined with terms from the Gene Ontology to enhance the explanatory value of the CGC LDA model, and to suggest additional candidates for age-related genes. A novel, pairwise document similarity measure based on the posterior distribution on the topic simplex was formulated and used to search the CGC database for "homologs" of a "query" document discussing the life span-modifying clk-2 gene. Inspection of these document homologs enabled and facilitated the production of hypotheses about the function and role of clk-2. Conclusion: Like other graphical models for genetic, genomic and other types of biological data, LDA provides a method for extracting unanticipated insights and generating predictions amenable to subsequent experimental validation.
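    To make the modeling pipeline concrete, the sketch below fits a small topic model and recovers per-document topic proportions on the topic simplex; the toy documents, topic count, and use of scikit-learn are assumptions for illustration, not the authors' code.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Hypothetical stand-ins for free-text CGC bibliography items.
        docs = [
            "clk-2 mutants exhibit slowed development and extended life span",
            "daf-2 insulin receptor signaling controls dauer formation and longevity",
            "unc-22 encodes twitchin required for normal muscle structure",
        ]

        counts = CountVectorizer(stop_words="english").fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0)
        theta = lda.fit_transform(counts)   # per-document topic proportions

        # Rows of theta lie on the topic simplex; a pairwise document similarity
        # can then be defined from distances between these rows, as in the paper.
        print(theta)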

  2. Review of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine" by Edward H. Shortliffe and James J. Cimino

    OpenAIRE

    Clifford Gari D

    2006-01-01

    Abstract This article is an invited review of the third edition of "Biomedical Informatics; Computer Applications in Health Care and Biomedicine", one of thirty-six volumes in Springer's 'Health Informatics Series', edited by E. Shortliffe and J. Cimino. This book spans most of the current methods and issues in health informatics, ranging through subjects as varied as data acquisition and storage, standards, natural language processing, imaging, electronic health records, decision support, te...

  3. Reliability centered operational performance measure for computer systems in NPPs

    International Nuclear Information System (INIS)

    Khobare, S.K.; Chandra, Umesh; Govindarajan, G.

    1992-01-01

    There could be arguments in favour of and against the application of digital computers in highly safety-critical plants such as nuclear power plants (NPPs). The unfavourable arguments arise mainly from grey areas in the accurate assessment and demonstration of the required reliability and availability of computer systems. This requirement is particularly important for computer applications in the control and protection logic systems of NPPs. A systematic performance comparison of these applications against conventional hard-wired logic systems is required. The present paper discusses reliability-centered system and plant performance measures to analyse the gains of computer applications in NPPs. (author). 5 refs. 2 tabs

  4. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    Energy Technology Data Exchange (ETDEWEB)

    Hules, J. [ed.

    1996-11-01

    The National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  5. Argonne Laboratory Computing Resource Center - FY2004 Report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.

    2005-04-14

    In the spring of 2002, Argonne National Laboratory founded the Laboratory Computing Resource Center, and in April 2003 LCRC began full operations with Argonne's first teraflops computing cluster. The LCRC's driving mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting application use and development. This report describes the scientific activities, computing facilities, and usage in the first eighteen months of LCRC operation. In this short time LCRC has had broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. Steering for LCRC comes from the Computational Science Advisory Committee, composed of computing experts from many Laboratory divisions. The CSAC Allocations Committee makes decisions on individual project allocations for Jazz.

  6. Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence

    DEFF Research Database (Denmark)

    Sørensen, Jens Nørkær

    2014-01-01

    In order to design and operate a wind farm optimally it is necessary to know in detail how the wind behaves and interacts with the turbines in a farm. This not only requires knowledge about meteorology, turbulence and aerodynamics, but it also requires access to powerful computers and efficient software. The Center for Computational Wind Turbine Aerodynamics and Atmospheric Turbulence was established in 2010 in order to create a world-leading cross-disciplinary flow center that covers all relevant disciplines within wind farm meteorology and aerodynamics.

  7. Biomedical Engineering

    CERN Document Server

    Suh, Sang C; Tanik, Murat M

    2011-01-01

    Biomedical Engineering: Health Care Systems, Technology and Techniques is an edited volume with contributions from world experts. It provides readers with unique contributions related to current research and future healthcare systems. Practitioners and researchers focused on computer science, bioinformatics, engineering and medicine will find this book a valuable reference.

  8. Argonne's Laboratory computing center - 2007 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R.; Pieper, G. W.

    2008-05-28

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10^12 floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2007, there were over 60 active projects representing a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff use of national computing facilities, and improving the scientific

  9. The Brazilian Research and Teaching Center in Biomedicine and Aerospace Biomedical Engineering

    OpenAIRE

    Russomano, T; Falcao, P F; Dalmarco, G; Martinelli, L; Cardoso, R; Santos, M A; Sparenberg, A

    2008-01-01

    The recent engagement of Brazil in the construction and utilization of the International Space Station has motivated several Brazilian research institutions and universities to establish study centers related to Space Sciences. The Pontificia Universidade Catolica do Rio Grande do Sul (PUCRS) is no exception.

  10. Computational geometry lectures at the morningside center of mathematics

    CERN Document Server

    Wang, Ren-Hong

    2003-01-01

    Computational geometry is a borderline subject related to pure and applied mathematics, computer science, and engineering. The book contains articles on various topics in computational geometry, which are based on invited lectures and some contributed papers presented by researchers working during the program on Computational Geometry at the Morningside Center of Mathematics of the Chinese Academy of Science. The opening article by R.-H. Wang gives a nice survey of various aspects of computational geometry, many of which are discussed in more detail in other papers in the volume. The topics include problems of optimal triangulation, splines, data interpolation, problems of curve and surface design, problems of shape control, quantum teleportation, and others.

  11. UC Merced Center for Computational Biology Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Colvin, Michael; Watanabe, Masakatsu

    2010-11-30

    Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that new biological sciences undergraduate and graduate programs that emphasized biological concepts and treated biology as an information science would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy: to create new Biological Sciences majors and a graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs

  12. Computer-aided biomedical imaging and graphics physiological measurement and control. Proceedings of the Biological Engineering Society, 6th Nordic meeting, Aberdeen, July 1984

    International Nuclear Information System (INIS)

    Jordan, M.; Perkins, W.J.; Upton, J.; Markham, J.

    1984-01-01

    The proceedings of the Sixth Nordic Meeting of the Biological Engineering Society held in Aberdeen in July 1984 on computer-aided biomedical imaging and graphics and physiological measurement and control are presented. The summaries of the papers presented cover the use of computer imaging and graphics in ultrasonic imaging, nuclear medicine, radiology, biomedical radiography, tomography and NMR imaging. The papers on the use of computers in physiological measurement and control cover subject headings including computer-based instrumentation, transducers, monitoring and control, assessment and therapy, clinical measurement, blood flow and signal processing and analysis. (U.K.)

  13. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  14. Automated segmentation of synchrotron radiation micro-computed tomography biomedical images using Graph Cuts and neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga de Moura Meneses, Anderson, E-mail: ameneses@ieee.org [Radiological Sciences Laboratory, Rio de Janeiro State University, Rua Sao Francisco Xavier 524, CEP 20550-900, RJ (Brazil); Giusti, Alessandro [IDSIA (Dalle Molle Institute for Artificial Intelligence), University of Lugano (Switzerland); Pereira de Almeida, Andre; Parreira Nogueira, Liebert; Braz, Delson [Nuclear Engineering Program, Federal University of Rio de Janeiro, RJ (Brazil); Cely Barroso, Regina [Laboratory of Applied Physics on Biomedical Sciences, Physics Department, Rio de Janeiro State University, RJ (Brazil); Almeida, Carlos Eduardo de [Radiological Sciences Laboratory, Rio de Janeiro State University, Rua Sao Francisco Xavier 524, CEP 20550-900, RJ (Brazil)

    2011-12-21

    Synchrotron Radiation (SR) X-ray micro-Computed Tomography (µCT) enables magnified images to be used as a non-invasive and non-destructive technique with high spatial resolution for the qualitative and quantitative analyses of biomedical samples. The research on applications of segmentation algorithms to SR-µCT is an open problem, due to the interesting and well-known characteristics of SR images for visualization, such as the high resolution and the phase contrast effect. In this article, we describe and assess the application of the Energy Minimization via Graph Cuts (EMvGC) algorithm for the segmentation of SR-µCT biomedical images acquired at the Synchrotron Radiation for MEdical Physics (SYRMEP) beam line at the Elettra Laboratory (Trieste, Italy). We also propose a method using EMvGC with Artificial Neural Networks (EMANNs) for correcting misclassifications due to intensity variation of phase contrast, which are important and sometimes indispensable effects in certain biomedical applications, although they impair the segmentation provided by conventional techniques. Results demonstrate considerable success in the segmentation of SR-µCT biomedical images, with an average Dice Similarity Coefficient of 99.88% for bony tissue in Wistar rat rib samples (EMvGC), as well as 98.95% and 98.02% for scans of Rhodnius prolixus insect samples (the Chagas disease vector) with EMANNs, relative to manual segmentation. The EMvGC and EMANNs techniques cope with the task of performing segmentation in images with intensity variation due to phase contrast effects, presenting superior performance in comparison to conventional segmentation techniques based on thresholding and linear/nonlinear image filtering, which is also discussed in the present article.
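
    The Dice Similarity Coefficient quoted above measures the overlap between an automatic segmentation and a manual reference. As a minimal illustration (not the authors' code, and using hypothetical toy masks), it can be computed for two binary masks as follows:

```python
import numpy as np

def dice_coefficient(segmentation: np.ndarray, reference: np.ndarray) -> float:
    """Dice Similarity Coefficient between two binary masks:
    DSC = 2|A ∩ B| / (|A| + |B|)."""
    seg = segmentation.astype(bool)
    ref = reference.astype(bool)
    intersection = np.logical_and(seg, ref).sum()
    total = seg.sum() + ref.sum()
    return 2.0 * intersection / total if total > 0 else 1.0

# Hypothetical toy masks; a perfect overlap would give DSC = 1.0 (100%).
mask_auto = np.array([[0, 1, 1],
                      [0, 1, 0]])
mask_manual = np.array([[0, 1, 1],
                        [0, 0, 0]])
print(f"DSC = {dice_coefficient(mask_auto, mask_manual):.4f}")  # 0.8000
```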

  15. Dendritic silica particles with center-radial pore channels: promising platforms for catalysis and biomedical applications.

    Science.gov (United States)

    Du, Xin; Qiao, Shi Zhang

    2015-01-27

    Dendritic silica micro-/nanoparticles with center-radial pore structures, a newly developed class of porous materials, have attracted considerable attention owing to their unique open three-dimensional dendritic superstructures with large pore channels and highly accessible internal surface areas compared with conventional mesoporous silica nanoparticles (MSNs). They are very promising platforms for a variety of applications in catalysis and nanomedicine. In this review, their unique structural characteristics and properties are first analyzed; then novel and interesting synthesis methods, together with the possible formation mechanisms, are summarized to provide materials scientists with some inspiration for the preparation of this kind of dendritic particle. Subsequently, a few examples of interesting applications are presented, mainly in catalysis, biomedicine, and other important fields such as sacrificial templates and functional coatings. The review is concluded with an outlook on the prospects and challenges in terms of their controlled synthesis and potential applications. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The role of dedicated data computing centers in the age of cloud computing

    Science.gov (United States)

    Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr

    2017-10-01

    Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost-benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.

  17. New computer system for the Japan Tier-2 center

    CERN Multimedia

    Hiroyuki Matsunaga

    2007-01-01

    The ICEPP (International Center for Elementary Particle Physics) of the University of Tokyo has been operating an LCG Tier-2 center dedicated to the ATLAS experiment, and is going to switch over to the new production system which has been recently installed. The system will be of great help to the exciting physics analyses for coming years. The new computer system includes brand-new blade servers, RAID disks, a tape library system and Ethernet switches. The blade server is the Dell PowerEdge 1955, which contains two Intel dual-core Xeon (Woodcrest) CPUs running at 3 GHz; a total of 650 servers will be used as compute nodes. Each of the RAID disks is configured as RAID-6 with 16 Serial ATA HDDs. The equipment as well as the cooling system is placed in a new large computer room, and both are hooked up to UPS (uninterruptible power supply) units for stable operation. As a whole, the system has been built with a redundant configuration in a cost-effective way. The next major upgrade will take place in thre...

  18. Fundamental of biomedical engineering

    CERN Document Server

    Sawhney, GS

    2007-01-01

    About the Book: A well set out textbook explains the fundamentals of biomedical engineering in the areas of biomechanics, biofluid flow, biomaterials, bioinstrumentation and use of computing in biomedical engineering. All these subjects form a basic part of an engineer's education. The text is admirably suited to meet the needs of the students of mechanical engineering, opting for the elective of Biomedical Engineering. Coverage of bioinstrumentation, biomaterials and computing for biomedical engineers can meet the needs of the students of Electronic & Communication, Electronic & Instrumenta

  19. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc collaboration based on shared material, and organized in terms of well-defined, recurring work activities. We propose that this kind of work can be supported by a pervasive computing infrastructure together with domain-specific services, both designed from a perspective where work activities are first-class objects. We also present an exploratory prototype design and first implementation and present some initial results from evaluations in a healthcare environment.

  20. Final Report: Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [William Marsh Rice University

    2011-09-13

    As part of the Center for Programming Models for Scalable Parallel Computing, Rice University collaborated with project partners in the design, development and deployment of language, compiler, and runtime support for parallel programming models to support application development for the “leadership-class” computer systems at DOE national laboratories. Work over the course of this project has focused on the design, implementation, and evaluation of a second-generation version of Coarray Fortran. Research and development efforts of the project have focused on the CAF 2.0 language, compiler, runtime system, and supporting infrastructure. This has involved working with the teams that provide infrastructure for CAF that we rely on, implementing new language and runtime features, producing an open source compiler that enabled us to evaluate our ideas, and evaluating our design and implementation through the use of benchmarks. The report details the research, development, findings, and conclusions from this work.

  1. High Performance Computing in Science and Engineering '17 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael; HLRS 2017

    2018-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2017. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  2. High Performance Computing in Science and Engineering '15 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2015. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  3. High Performance Computing in Science and Engineering '99 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2000-01-01

    The book contains reports about the most significant projects from science and engineering of the Federal High Performance Computing Center Stuttgart (HLRS). They were carefully selected in a peer-review process and are showcases of an innovative combination of state-of-the-art modeling, novel algorithms and the use of leading-edge parallel computer technology. The projects of HLRS are using supercomputer systems operated jointly by university and industry and therefore a special emphasis has been put on the industrial relevance of results and methods.

  4. High Performance Computing in Science and Engineering '98 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    1999-01-01

    The book contains reports about the most significant projects from science and industry that are using the supercomputers of the Federal High Performance Computing Center Stuttgart (HLRS). These projects are from different scientific disciplines, with a focus on engineering, physics and chemistry. They were carefully selected in a peer-review process and are showcases for an innovative combination of state-of-the-art physical modeling, novel algorithms and the use of leading-edge parallel computer technology. As HLRS is in close cooperation with industrial companies, special emphasis has been put on the industrial relevance of results and methods.

  5. Center for Programming Models for Scalable Parallel Computing

    Energy Technology Data Exchange (ETDEWEB)

    John Mellor-Crummey

    2008-02-29

    Rice University's achievements as part of the Center for Programming Models for Scalable Parallel Computing include: (1) design and implementation of cafc, the first multi-platform CAF compiler for distributed and shared-memory machines, (2) performance studies of the efficiency of programs written using the CAF and UPC programming models, (3) a novel technique to analyze explicitly-parallel SPMD programs that facilitates optimization, (4) design, implementation, and evaluation of new language features for CAF, including communication topologies, multi-version variables, and distributed multithreading to simplify development of high-performance codes in CAF, and (5) a synchronization strength reduction transformation for automatically replacing barrier-based synchronization with more efficient point-to-point synchronization. The prototype Co-array Fortran compiler cafc developed in this project is available as open source software from http://www.hipersoft.rice.edu/caf.

  6. High Performance Computing in Science and Engineering '02 : Transactions of the High Performance Computing Center

    CERN Document Server

    Jäger, Willi

    2003-01-01

    This book presents the state-of-the-art in modeling and simulation on supercomputers. Leading German research groups present their results achieved on high-end systems of the High Performance Computing Center Stuttgart (HLRS) for the year 2002. Reports cover all fields of supercomputing simulation ranging from computational fluid dynamics to computer science. Special emphasis is given to industrially relevant applications. Moreover, by presenting results for both vector systems and microprocessor-based systems, the book allows a comparison of the performance levels and usability of a variety of supercomputer architectures. It therefore becomes an indispensable guidebook to assess the impact of the Japanese Earth Simulator project on supercomputing in the years to come.

  7. Data-Driven Approaches for Computation in Intelligent Biomedical Devices: A Case Study of EEG Monitoring for Chronic Seizure Detection

    Directory of Open Access Journals (Sweden)

    Naveen Verma

    2011-04-01

    Intelligent biomedical devices imply systems that are able to detect specific physiological processes in patients so that particular responses can be generated. This closed-loop capability can have enormous clinical value when we consider the unprecedented modalities that are beginning to emerge for sensing and stimulating patient physiology. Both delivering therapy (e.g., deep-brain stimulation, vagus nerve stimulation, etc.) and treating impairments (e.g., neural prostheses) require computational devices that can make clinically relevant inferences, especially using minimally-intrusive patient signals. The key to such devices is algorithms that are based on data-driven signal modeling as well as hardware structures that are specialized to these. This paper discusses the primary application-domain challenges that must be overcome and analyzes the most promising methods for this that are emerging. We then look at how these methods are being incorporated in ultra-low-energy computational platforms and systems. The case study for this is a seizure-detection SoC that includes instrumentation and computation blocks in support of a system that exploits patient-specific modeling to achieve accurate performance for chronic detection. The SoC samples each EEG channel at a rate of 600 Hz and performs processing to derive signal features on every two-second epoch, consuming 9 μJ/epoch/channel. Signal feature extraction reduces the data rate by a factor of over 40×, permitting wireless communication from the patient’s head while reducing the total power on the head by 14×.
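
    The abstract does not specify which signal features the SoC extracts; as a rough, hypothetical sketch of the general idea, the following reduces one two-second, 600 Hz EEG epoch to a handful of band-power features, which is the sense in which feature extraction cuts the data rate before wireless transmission (the band edges and function names are assumptions, not the paper's design):

```python
import numpy as np
from scipy.signal import welch

FS = 600         # sampling rate per channel in Hz, as stated in the abstract
EPOCH_SEC = 2    # epoch length in seconds, as stated in the abstract
# Assumed EEG frequency bands (Hz); the SoC's real feature set is not given here.
BANDS = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 45)]

def epoch_features(epoch: np.ndarray) -> np.ndarray:
    """Reduce one 2 s, 600 Hz epoch (1200 samples) to a few band-power features."""
    freqs, psd = welch(epoch, fs=FS, nperseg=256)
    # Summed PSD within each band serves as a simple band-power proxy.
    return np.array([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS])

# Hypothetical data: 1200 raw samples shrink to 5 features per epoch per channel.
epoch = np.random.default_rng(0).standard_normal(FS * EPOCH_SEC)
print(epoch_features(epoch))
```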

  8. Biomedical signal processing

    CERN Document Server

    Akay, Metin

    1994-01-01

    Sophisticated techniques for signal processing are now available to the biomedical specialist! Written in an easy-to-read, straightforward style, Biomedical Signal Processing presents techniques to eliminate background noise, enhance signal detection, and analyze computer data, making results easy to comprehend and apply. In addition to examining techniques for electrical signal analysis, filtering, and transforms, the author supplies an extensive appendix with several computer programs that demonstrate techniques presented in the text.

  9. Stream computing for biomedical signal processing: A QRS complex detection case-study.

    Science.gov (United States)

    Murphy, B M; O'Driscoll, C; Boylan, G B; Lightbody, G; Marnane, W P

    2015-01-01

    Recent developments in "Big Data" have brought significant gains in the ability to process large amounts of data on commodity server hardware. Stream computing is a relatively new paradigm in this area, addressing the need to process data in real time with very low latency. While this approach has been developed for dealing with large scale data from the world of business, security and finance, there is a natural overlap with clinical needs for physiological signal processing. In this work we present a case study of streams processing applied to a typical physiological signal processing problem: QRS detection from ECG data.
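
    The paper's streaming implementation is not reproduced here; as a minimal offline sketch of the signal-processing steps commonly used for QRS detection (band-pass filtering, differentiation, squaring, moving-window integration and thresholding, in the spirit of the classic Pan-Tompkins approach), one might write something like the following; the sampling rate, threshold and refractory period are illustrative assumptions:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_qrs(ecg: np.ndarray, fs: float = 250.0) -> np.ndarray:
    """Very simplified, offline QRS detector; returns indices of detected beats."""
    # 1. Band-pass 5-15 Hz to emphasise the QRS complex.
    b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, ecg)
    # 2. Differentiate, square, and integrate over a ~150 ms moving window.
    squared = np.diff(filtered) ** 2
    window = int(0.15 * fs)
    integrated = np.convolve(squared, np.ones(window) / window, mode="same")
    # 3. Fixed threshold plus a 200 ms refractory period (both crude assumptions).
    threshold = 0.5 * integrated.max()
    refractory = int(0.2 * fs)
    peaks, last = [], -refractory
    for i in range(1, len(integrated) - 1):
        is_local_max = (integrated[i] >= integrated[i - 1]
                        and integrated[i] > integrated[i + 1])
        if integrated[i] > threshold and is_local_max and i - last > refractory:
            peaks.append(i)
            last = i
    return np.asarray(peaks)
```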

  10. Development of a computer system at La Hague center

    International Nuclear Information System (INIS)

    Mimaud, Robert; Malet, Georges; Ollivier, Francis; Fabre, J.-C.; Valois, Philippe; Desgranges, Patrick; Anfossi, Gilbert; Gentizon, Michel; Serpollet, Roger.

    1977-01-01

    The U.P.2 plant, built at the La Hague Center, is intended mainly for the reprocessing of spent fuels coming from graphite-gas reactors (as metal) and from light-water, heavy-water and breeder reactors (as oxide). In each of the five large nuclear units, the digital processing of measurements was handled until 1974 by CAE 3030 data processors. During the period 1974-1975 a modern industrial computer system was set up. This system, equipped with T 2000/20 hardware from the Telemecanique company, consists of five measurement acquisition devices (for a total of 1500 lines processed) and two central processing units (CPUs). The connection of these two CPUs (hardware and software) enables the system to run automatically on either the first CPU or the second one. The system covers, at present, data processing, threshold monitoring, alarm systems, display devices, periodical listing, and specific calculations concerning the process (balances, etc.), and, at a later stage, automatic control of certain units of the process. [fr]

  11. Interactive computer-based programs for a cancer learning center.

    Science.gov (United States)

    Besa, E C; Nieman, L Z; Joseph, R R

    1995-01-01

    This paper describes the design and evaluation of a computer-based instruction (CBI) program that was integrated into a multidisciplinary cancer curriculum at the Medical College of Pennsylvania. Instruction took place in a cancer learning center. Modules contained literature, posters, slide sets, videocassette films, and "see, touch, and feel" models to teach and practice breast, testicular, rectal, laryngeal, and colonoscopic examinations. The CBI (programmed on HyperCard) contained tutorials divided into three levels of learning objectives: level one, epidemiology and prevention; level two, diagnosis and staging; and level three, management and prognosis. Simulated cases and test items were developed for each level. To evaluate students' perceptions of the program and provide them with feedback about their performances, the authors designed a questionnaire, held a focus group, and developed a built-in tracking system for the CBI. Results showed that the program was well received, the students answered the test items correctly, and the students wanted more time to study cancer. A description of some of the problems encountered with technology and equipment is provided for faculty who may be interested in designing and implementing similar CBI programs.

  12. Parallel Processing and Bio-inspired Computing for Biomedical Image Registration

    Directory of Open Access Journals (Sweden)

    Silviu Ioan Bejinariu

    2014-07-01

    Image Registration (IR) is an optimization problem computing optimal parameters of a geometric transform used to overlay one or more source images to a given model by maximizing a similarity measure. In this paper the use of bio-inspired optimization algorithms in image registration is analyzed. Results obtained by means of three different algorithms are compared: the Bacterial Foraging Optimization Algorithm (BFOA), the Genetic Algorithm (GA) and the Clonal Selection Algorithm (CSA). Depending on the image type, the registration may be area-based, which is slow but more precise, or feature-based, which is faster. In this paper a feature-based approach built on the Scale Invariant Feature Transform (SIFT) is proposed. Finally, results obtained using sequential and parallel implementations on multi-core systems for area-based and feature-based image registration are compared.
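
    The paper's BFOA, GA and CSA implementations are not shown here; as a toy sketch of how a bio-inspired (genetic-style) search can drive area-based registration, the following evolves a population of candidate 2-D translations that minimize the mean squared error between a source image and the model. All parameter values and image names are assumptions:

```python
import numpy as np
from scipy.ndimage import shift

def mse(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.mean((a - b) ** 2))

def register_translation_ga(model: np.ndarray, source: np.ndarray,
                            pop_size: int = 30, generations: int = 40,
                            max_shift: float = 10.0, seed: int = 0) -> np.ndarray:
    """Toy genetic-style search for a 2-D translation (dy, dx) that best
    overlays `source` on `model`, using negative MSE as the similarity measure."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-max_shift, max_shift, size=(pop_size, 2))

    def fitness(individuals: np.ndarray) -> np.ndarray:
        return np.array([-mse(model, shift(source, p, order=1)) for p in individuals])

    for _ in range(generations):
        scores = fitness(pop)
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]   # selection
        children = parents + rng.normal(0.0, 0.5, parents.shape)   # mutation
        pop = np.vstack([parents, children])
    return pop[np.argmax(fitness(pop))]

# Usage with hypothetical images:
# best_dy_dx = register_translation_ga(model_img, source_img)
```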

  13. Computational modeling of Fontan physiology: at the crossroads of pediatric cardiology and biomedical engineering.

    Science.gov (United States)

    Slesnick, Timothy C; Yoganathan, Ajit P

    2014-08-01

    The Fontan operation has evolved over the last four and a half decades and is now widely applied to patients with various forms of "single ventricle" congenital heart disease. Survival has greatly improved since the early years, but long-term morbidity and mortality continue to occur. Modeling of Fontan geometries, both in vitro and using computational fluid dynamics, has been instrumental in designing novel changes to the Fontan operation, including the application of staged surgical procedures leading to a total cavopulmonary anastomosis, lateral tunnel, extracardiac conduit, and most recently bifurcated Y-graft modifications. In this review, the history of modeling of Fontan physiologies, current state-of-the-art methodologies, and future directions are explored. The application of these techniques to cardiac magnetic resonance imaging to construct patient-specific anatomies offers the possibility of individualized surgical planning to optimize hemodynamics, including minimizing power loss, balancing hepatic factor distribution, and ultimately improving patient outcomes.

  14. Bayesian Computation Methods for Inferring Regulatory Network Models Using Biomedical Data.

    Science.gov (United States)

    Tian, Tianhai

    2016-01-01

    The rapid advancement of high-throughput technologies provides huge amounts of information on gene expression and protein activity at the genome-wide scale. The availability of genomics, transcriptomics, proteomics, and metabolomics datasets gives an unprecedented opportunity to study detailed molecular regulation, which is very important to precision medicine. However, it is still a significant challenge to design effective and efficient methods to infer the network structure and dynamic properties of regulatory networks. In recent years a number of computing methods have been designed to explore the regulatory mechanisms as well as estimate unknown model parameters. Among them, the Bayesian inference method can combine both prior knowledge and experimental data to generate updated information regarding the regulatory mechanisms. This chapter gives a brief review of Bayesian statistical methods that are used to infer the network structure and estimate model parameters based on experimental data.
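
    As a minimal sketch of the kind of Bayesian computation the chapter reviews (not its actual methods), the following uses a Metropolis-Hastings sampler to estimate one parameter of a toy Hill-function regulatory response from noisy synthetic observations; the model, prior, step size and data are all hypothetical:

```python
import numpy as np

def hill(x, k):
    """Toy regulatory response: activation with a Hill coefficient of 2."""
    return x ** 2 / (k ** 2 + x ** 2)

def metropolis_hastings(x_obs, y_obs, sigma=0.05, n_samples=5000, step=0.1, seed=0):
    """Sample the posterior of the half-saturation constant k, assuming Gaussian
    observation noise and a flat prior on k > 0."""
    rng = np.random.default_rng(seed)

    def log_like(k):
        if k <= 0:
            return -np.inf
        resid = y_obs - hill(x_obs, k)
        return -0.5 * np.sum(resid ** 2) / sigma ** 2

    k, ll, samples = 1.0, None, []
    ll = log_like(k)
    for _ in range(n_samples):
        k_new = k + rng.normal(0.0, step)
        ll_new = log_like(k_new)
        if np.log(rng.uniform()) < ll_new - ll:   # Metropolis accept/reject
            k, ll = k_new, ll_new
        samples.append(k)
    return np.asarray(samples)

# Hypothetical data generated from k = 0.8, then recovered from the posterior.
x = np.linspace(0.1, 3.0, 25)
y = hill(x, 0.8) + np.random.default_rng(1).normal(0.0, 0.05, x.size)
posterior = metropolis_hastings(x, y)
print("posterior mean of k:", posterior[1000:].mean())
```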

  15. Hybrid brain-computer interface for biomedical cyber-physical system application using wireless embedded EEG systems.

    Science.gov (United States)

    Chai, Rifai; Naik, Ganesh R; Ling, Sai Ho; Nguyen, Hung T

    2017-01-07

    One of the key challenges of the biomedical cyber-physical system is to combine cognitive neuroscience with the integration of physical systems to assist people with disabilities. Electroencephalography (EEG) has been explored as a non-invasive method of providing assistive technology by using brain electrical signals. This paper presents a unique prototype of a hybrid brain-computer interface (BCI) which senses a combined classification of mental task, steady-state visual evoked potential (SSVEP) and eyes-closed detection using only two EEG channels. In addition, a microcontroller-based, head-mounted, battery-operated wireless EEG sensor combined with a separate embedded system is used to enhance portability, convenience and cost effectiveness. This experiment has been conducted with five healthy participants and five patients with tetraplegia. Generally, the results show comparable classification accuracies between healthy subjects and tetraplegia patients. For the offline artificial neural network classification for the target group of patients with tetraplegia, the hybrid BCI system combines three mental tasks, three SSVEP frequencies and eyes closed, with an average classification accuracy of 74% and an average information transfer rate (ITR) of 27 bits/min. For the real-time testing of the intentional signal on patients with tetraplegia, the average success rate of detection is 70% and the speed of detection varies from 2 to 4 s.
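
    The abstract reports an ITR of 27 bits/min; the widely used Wolpaw formula yields a comparable figure under illustrative assumptions (seven selectable commands, 74% accuracy, roughly 3 s per selection), though the paper's exact calculation is not stated here:

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, trial_seconds: float) -> float:
    """Information transfer rate in bits/min using the Wolpaw formula:
    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1)), scaled by trials per minute."""
    n, p = n_classes, accuracy
    if p <= 0.0 or p >= 1.0:
        bits = math.log2(n) if p >= 1.0 else 0.0
    else:
        bits = math.log2(n) + p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * 60.0 / trial_seconds

# Illustrative numbers only: 7 commands (3 mental tasks + 3 SSVEP targets + eyes
# closed), 74% accuracy, and an assumed 3 s selection time give roughly 26 bits/min.
print(f"{wolpaw_itr(7, 0.74, 3.0):.1f} bits/min")
```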

  16. Uncertainty quantification and validation of 3D lattice scaffolds for computer-aided biomedical applications.

    Science.gov (United States)

    Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J

    2017-07-01

    A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
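
    As a minimal sketch of how the Bayesian Information Criterion can rank candidate parametric distributions fitted to strut-level measurements (not the authors' pipeline), the following fits a few scipy.stats distributions to hypothetical strut-diameter data and compares their BIC values:

```python
import numpy as np
from scipy import stats

def bic(log_likelihood: float, n_params: int, n_obs: int) -> float:
    """Bayesian Information Criterion: BIC = k*ln(n) - 2*ln(L)."""
    return n_params * np.log(n_obs) - 2.0 * log_likelihood

# Hypothetical strut-diameter measurements (mm); the real data are micro-CT based.
rng = np.random.default_rng(0)
data = rng.lognormal(mean=np.log(0.5), sigma=0.1, size=200)

candidates = {"normal": stats.norm, "lognormal": stats.lognorm, "gamma": stats.gamma}
for name, dist in candidates.items():
    params = dist.fit(data)                       # maximum-likelihood fit
    loglik = np.sum(dist.logpdf(data, *params))   # log-likelihood at the fitted params
    print(name, "BIC =", round(bic(loglik, len(params), data.size), 1))
# The candidate distribution with the lowest BIC would be selected.
```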

  17. Computer literacy and E-learning perception in Cameroon: the case of Yaounde Faculty of Medicine and Biomedical Sciences.

    Science.gov (United States)

    Bediang, Georges; Stoll, Beat; Geissbuhler, Antoine; Klohn, Axel M; Stuckelberger, Astrid; Nko'o, Samuel; Chastonay, Philippe

    2013-04-19

    Health science education faces numerous challenges: assimilation of knowledge, management of increasing numbers of learners or changes in educational models and methodologies. With the emergence of e-learning, the use of information and communication technologies (ICT) and the Internet to improve teaching and learning in health science training institutions has become a crucial issue for low- and middle-income countries, including sub-Saharan Africa. In this perspective, the Faculty of Medicine and Biomedical Sciences (FMBS) of Yaoundé has played a pioneering role in Cameroon in making significant efforts to improve students' and lecturers' access to computers and to the Internet on its campus. The objective is to investigate how computer literacy and the perception towards e-learning and its potential could contribute to the learning and teaching process within the FMBS academic community. A cross-sectional survey was carried out among students, residents and lecturers. The data was gathered through a written questionnaire distributed at the FMBS campus and analysed with routine statistical software. 307 participants answered the questionnaire: 218 students, 57 residents and 32 lecturers. Results show that most students, residents and lecturers have access to computers and the Internet, although students' access is mainly at home for computers and at cyber cafés for the Internet. Most of the participants have a fairly good mastery of ICT. However, some basic rules of good practice concerning the use of ICT in the health domain were still not well known. Google is the most frequently used engine to retrieve health literature for all participants; only 7% of students and 16% of residents have heard about Medical Subject Headings (MeSH). The potential of e-learning in the improvement of teaching and learning still remains insufficiently exploited. About two thirds of the students are not familiar with the concept of e-learning. 84% of students and 58% of residents had never had access to

  18. Computer literacy and E-learning perception in Cameroon: the case of Yaounde Faculty of Medicine and Biomedical Sciences

    Science.gov (United States)

    2013-01-01

    Background Health science education faces numerous challenges: assimilation of knowledge, management of increasing numbers of learners or changes in educational models and methodologies. With the emergence of e-learning, the use of information and communication technologies (ICT) and the Internet to improve teaching and learning in health science training institutions has become a crucial issue for low- and middle-income countries, including sub-Saharan Africa. In this perspective, the Faculty of Medicine and Biomedical Sciences (FMBS) of Yaoundé has played a pioneering role in Cameroon in making significant efforts to improve students’ and lecturers’ access to computers and to the Internet on its campus. The objective is to investigate how computer literacy and the perception towards e-learning and its potential could contribute to the learning and teaching process within the FMBS academic community. Method A cross-sectional survey was carried out among students, residents and lecturers. The data was gathered through a written questionnaire distributed at the FMBS campus and analysed with routine statistical software. Results 307 participants answered the questionnaire: 218 students, 57 residents and 32 lecturers. Results show that most students, residents and lecturers have access to computers and the Internet, although students’ access is mainly at home for computers and at cyber cafés for the Internet. Most of the participants have a fairly good mastery of ICT. However, some basic rules of good practice concerning the use of ICT in the health domain were still not well known. Google is the most frequently used engine to retrieve health literature for all participants; only 7% of students and 16% of residents have heard about Medical Subject Headings (MeSH). The potential of e-learning in the improvement of teaching and learning still remains insufficiently exploited. About two thirds of the students are not familiar with the concept of e-learning. 84% of students and 58% of

  19. Robust detection and segmentation of cell nuclei in biomedical images based on a computational topology framework.

    Science.gov (United States)

    Rojas-Moraleda, Rodrigo; Xiong, Wei; Halama, Niels; Breitkopf-Heinlein, Katja; Dooley, Steven; Salinas, Luis; Heermann, Dieter W; Valous, Nektarios A

    2017-05-01

    The segmentation of cell nuclei is an important step towards the automated analysis of histological images. The presence of a large number of nuclei in whole-slide images necessitates methods that are computationally tractable in addition to being effective. In this work, a method is developed for the robust segmentation of cell nuclei in histological images based on the principles of persistent homology. More specifically, an abstract simplicial homology approach for image segmentation is established. Essentially, the approach deals with the persistence of disconnected sets in the image, thus identifying salient regions that express patterns of persistence. By introducing an image representation based on topological features, the task of segmentation is less dependent on variations of color or texture. This results in a novel approach that generalizes well and provides stable performance. The method conceptualizes regions of interest (cell nuclei) pertinent to their topological features in a successful manner. The time cost of the proposed approach is lower-bounded by an almost linear behavior and upper-bounded by O(n²) in a worst-case scenario. Time complexity matches a quasilinear behavior, which is O(n^(1+ε)) for ε […] experts) and the supervised training of a random forest classifier. The results are obtained on a per-object basis. The proposed workflow successfully detected both hepatocyte and non-parenchymal cell nuclei with an accuracy of 84.6%, and hepatocyte cell nuclei only with an accuracy of 86.2%. A public histological dataset with supplied ground-truth data is also used for evaluating the performance of the proposed approach (accuracy: 94.5%). Further validations are carried out with a publicly available dataset and ground-truth data from the Gland Segmentation in Colon Histology Images Challenge (GlaS) contest. The proposed method is useful for obtaining unsupervised robust initial segmentations that can be further integrated in image/data processing

  20. Translational Bioinformatics and Clinical Research (Biomedical) Informatics.

    Science.gov (United States)

    Sirintrapun, S Joseph; Zehir, Ahmet; Syed, Aijazuddin; Gao, JianJiong; Schultz, Nikolaus; Cheng, Donavan T

    2015-06-01

    Translational bioinformatics and clinical research (biomedical) informatics are the primary domains related to informatics activities that support translational research. Translational bioinformatics focuses on computational techniques in genetics, molecular biology, and systems biology. Clinical research (biomedical) informatics involves the use of informatics in discovery and management of new knowledge relating to health and disease. This article details 3 projects that are hybrid applications of translational bioinformatics and clinical research (biomedical) informatics: The Cancer Genome Atlas, the cBioPortal for Cancer Genomics, and the Memorial Sloan Kettering Cancer Center clinical variants and results database, all designed to facilitate insights into cancer biology and clinical/therapeutic correlations. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Building the biomedical data science workforce.

    Science.gov (United States)

    Dunn, Michelle C; Bourne, Philip E

    2017-07-01

    This article describes efforts at the National Institutes of Health (NIH) from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K) training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers.

  2. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  3. ASSURED CLOUD COMPUTING UNIVERSITY CENTER OF EXCELLENCE (ACC UCOE)

    Science.gov (United States)

    2018-01-18

    Only fragments of this record are preserved: MS thesis citations from the Department of Computer Science, University of Illinois at Urbana-Champaign ("Security Monitors," August 2015; "Flow Constraints," May 2015; Fangzhou Yao, "Secure ..."), and research topics including infrastructure security, design of algorithms and techniques for real-time assuredness in cloud computing, and map-reduce task assignment with data locality.

  4. Center for computation and visualization of geometric structures. [Annual], Progress report

    Energy Technology Data Exchange (ETDEWEB)

    1993-02-12

    The mission of the Center is to establish a unified environment promoting research, education, and software and tool development. The work is centered on computing, interpreted in a broad sense to include the relevant theory, development of algorithms, and actual implementation. The research aspects of the Center are focused on geometry; correspondingly the computational aspects are focused on three (and higher) dimensional visualization. The educational aspects are likewise centered on computing and focused on geometry. A broader term than education is "communication", which encompasses the challenge of explaining to the world current research in mathematics, and specifically geometry.

  5. Biomedical signal analysis

    CERN Document Server

    Rangayyan, Rangaraj M

    2015-01-01

    The book will assist the reader in the development of techniques for the analysis of biomedical signals and computer-aided diagnosis, with a pedagogical examination of basic and advanced topics accompanied by over 350 figures and illustrations. A wide range of filtering techniques is presented to address various applications, along with 800 mathematical expressions and equations, practical questions, problems and laboratory exercises. The book also covers fractals and chaos theory with biomedical applications.

  6. B3: Fuzzy-Based Data Center Load Optimization in Cloud Computing

    Directory of Open Access Journals (Sweden)

    M. Jaiganesh

    2013-01-01

    Cloud computing started a new era in delivering a variety of information pools through various Internet connections to any connected device. It provides a pay-per-use model for clients to obtain services. A data center is a sophisticated high-end server facility which runs applications virtually in cloud computing; applications, services, and data are moved to large data centers. A data center provides more service levels, covering the maximum number of users. Determining data center utilization in order to find the overall load efficiency is thus a definite task. Hence, we propose a novel method to find the efficiency of the data center in cloud computing. The goal is to optimize data center utilization in terms of three big factors: bandwidth, memory, and Central Processing Unit (CPU) cycles. We constructed a fuzzy expert system model to obtain maximum Data Center Load Efficiency (DCLE) in cloud computing environments. The advantage of the proposed system lies in DCLE computing. While computing, it allows regular evaluation of services to any number of clients. This approach indicates that current clouds need an order-of-magnitude improvement in data center management to be used in next-generation computing.
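
    The paper's membership functions and rule base are not given in the abstract; as a toy sketch of a fuzzy model that maps bandwidth, memory and CPU utilization to a load-efficiency score (zero-order Sugeno style), one might write the following. All membership parameters, rules and output scores are assumptions:

```python
import numpy as np

def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function with vertices a <= b <= c."""
    return float(max(min((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0))

def dcle(bandwidth: float, memory: float, cpu: float) -> float:
    """Toy zero-order Sugeno fuzzy score of Data Center Load Efficiency.
    Inputs are utilizations in [0, 1]; output is an efficiency score in [0, 1]."""
    grades = []
    for x in (bandwidth, memory, cpu):
        low = tri(x, -0.4, 0.0, 0.5)
        med = tri(x, 0.1, 0.5, 0.9)
        high = tri(x, 0.5, 1.0, 1.4)
        grades.append((low, med, high))
    w_balanced = min(g[1] for g in grades)   # all resources moderately used
    w_saturated = max(g[2] for g in grades)  # any resource near saturation
    w_idle = max(g[0] for g in grades)       # any resource mostly idle
    weights = np.array([w_balanced, w_saturated, w_idle])
    scores = np.array([0.9, 0.2, 0.5])       # assumed rule outputs
    return float(weights @ scores / (weights.sum() + 1e-9))

print(round(dcle(bandwidth=0.55, memory=0.5, cpu=0.45), 2))  # ~0.8, a healthy load
```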

  7. Center for Advanced Energy Studies: Computer Assisted Virtual Environment (CAVE)

    Data.gov (United States)

    Federal Laboratory Consortium — The laboratory contains a four-walled 3D computer assisted virtual environment - or CAVE TM — that allows scientists and engineers to literally walk into their data...

  8. [Biomedical informatics].

    Science.gov (United States)

    Capurro, Daniel; Soto, Mauricio; Vivent, Macarena; Lopetegui, Marcelo; Herskovic, Jorge R

    2011-12-01

    Biomedical Informatics is a new discipline that arose from the need to incorporate information technologies into the generation, storage, distribution and analysis of information in the domain of biomedical sciences. This discipline comprises basic biomedical informatics and public health informatics. The development of the discipline in Chile has been modest and most projects have originated from the interest of individual people or institutions, without a systematic and coordinated national development. Considering the unique features of the health care system of our country, research in the area of biomedical informatics is becoming an imperative.

  9. Science gateways for biomedical big data analysis

    NARCIS (Netherlands)

    Shahand, S.

    2015-01-01

    Biomedical researchers are facing data deluge challenges such as dealing with large volumes of complex heterogeneous data and with complex, computationally demanding data processing methods. Such scale and complexity of biomedical research requires multi-disciplinary collaboration between scientists

  10. National Energy Research Scientific Computing Center 2007 Annual Report

    Energy Technology Data Exchange (ETDEWEB)

    Hules, John A.; Bashor, Jon; Wang, Ucilia; Yarris, Lynn; Preuss, Paul

    2008-10-23

    This report presents highlights of the research conducted on NERSC computers in a variety of scientific disciplines during the year 2007. It also reports on changes and upgrades to NERSC's systems and services as well as activities of NERSC staff.

  11. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, M.; Akkerboom, J.D.; Litjens, R.; Berg, J.L. van den

    2012-01-01

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes

  12. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes

  13. search GenBank: interactive orchestration and ad-hoc choreography of Web services in the exploration of the biomedical resources of the National Center For Biotechnology Information.

    Science.gov (United States)

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Siążnik, Artur

    2013-03-01

    Due to the growing number of biomedical entries in data repositories of the National Center for Biotechnology Information (NCBI), it is difficult to collect, manage and process all of these entries in one place by third-party software developers without significant investment in hardware and software infrastructure, its maintenance and administration. Web services allow development of software applications that integrate in one place the functionality and processing logic of distributed software components, without integrating the components themselves and without integrating the resources to which they have access. This is achieved by appropriate orchestration or choreography of available Web services and their shared functions. After the successful application of Web services in the business sector, this technology can now be used to build composite software tools that are oriented towards biomedical data processing. We have developed a new tool for efficient and dynamic data exploration in GenBank and other NCBI databases. A dedicated search GenBank system makes use of NCBI Web services and a package of Entrez Programming Utilities (eUtils) in order to provide extended searching capabilities in NCBI data repositories. In search GenBank users can use one of the three exploration paths: simple data searching based on the specified user's query, advanced data searching based on the specified user's query, and advanced data exploration with the use of macros. search GenBank orchestrates calls of particular tools available through the NCBI Web service providing requested functionality, while users interactively browse selected records in search GenBank and traverse between NCBI databases using available links. On the other hand, by building macros in the advanced data exploration mode, users create choreographies of eUtils calls, which can lead to the automatic discovery of related data in the specified databases. search GenBank extends standard capabilities of the
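
    The search GenBank tool itself is not shown here; as a minimal sketch of the kind of eUtils calls it orchestrates, the following queries the public ESearch and EFetch endpoints directly. The query term is illustrative, and in real use NCBI asks clients to also pass tool and email parameters:

```python
import urllib.parse
import urllib.request

EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils"

def esearch(db: str, term: str, retmax: int = 5) -> str:
    """Call the Entrez ESearch utility; returns the raw XML response."""
    params = urllib.parse.urlencode({"db": db, "term": term, "retmax": retmax})
    with urllib.request.urlopen(f"{EUTILS}/esearch.fcgi?{params}") as resp:
        return resp.read().decode()

def efetch(db: str, ids: list, rettype: str = "gb", retmode: str = "text") -> str:
    """Call the Entrez EFetch utility to retrieve full records for the given IDs."""
    params = urllib.parse.urlencode({"db": db, "id": ",".join(ids),
                                     "rettype": rettype, "retmode": retmode})
    with urllib.request.urlopen(f"{EUTILS}/efetch.fcgi?{params}") as resp:
        return resp.read().decode()

# Illustrative query against the nucleotide database; in a real workflow the <Id>
# elements would be parsed out of the ESearch XML and passed on to EFetch.
xml = esearch("nucleotide", "Rhodnius prolixus[Organism] AND COI[Gene]")
print(xml[:200])
```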

  14. BIMS: Biomedical Information Management System

    OpenAIRE

    Mora, Oscar; Bisbal, Jesús

    2013-01-01

    In this paper, we present BIMS (Biomedical Information Management System). BIMS is a software architecture designed to provide a flexible computational framework to manage the information needs of a wide range of biomedical research projects. The main goal is to facilitate the clinicians' job in data entry, and researchers' tasks in data management, in high-data-quality biomedical research projects. The BIMS architecture has been designed following the two-level modeling paradigm, a promising...

  15. Ethics in psychosocial and biomedical research – A training experience at the Interdisciplinary Center for Bioethics (CIEB) of the University of Chile

    Science.gov (United States)

    Lolas, Fernando; Rodriguez, Eduardo

    2012-01-01

    This paper reviews the experience in training Latin American professionals and scientists in the ethics of biomedical and psychosocial research at the Interdisciplinary Center for Studies in Bioethics (CIEB) of the University of Chile, aided by a grant from the Fogarty International Center (FIC) – National Institutes of Health from 2002 to 2011. In these 10 years of experience, 50 trainees have completed a 12-month training combining on-line and in-person teaching and learning activities, with further support for maintaining contact via webmail and personal meetings. The network formed by faculty and former trainees has published extensively on issues relevant to the continent and has been instrumental in promoting new master-level courses at different universities, drafting regulations and norms, and promoting the use of bioethical discourse in health care and research. Evaluation meetings have shown that while most trainees did benefit from the experience and contributed substantially to developments at their home institutions and countries, some degree of structuring of demand for qualified personnel is needed in order to better utilize the human resources created by the program. Publications and other deliverables of trainees and faculty are presented. PMID:22754084

  16. Center for Computer Security newsletter. Volume 2, Number 3

    Energy Technology Data Exchange (ETDEWEB)

    None

    1983-05-01

    The Fifth Computer Security Group Conference was held November 16 to 18, 1982, at the Knoxville Hilton in Knoxville, Tennessee. Attending were 183 people, representing the Department of Energy, DOE contractors, other government agencies, and vendor organizations. In these papers are abridgements of most of the papers presented in Knoxville. Less than half-a-dozen speakers failed to furnish either abstracts or full-text papers of their Knoxville presentations.

  17. The Federated Tier-2 Computing Center for LHC

    International Nuclear Information System (INIS)

    Liko, D.; Hoerman, N.; Kuhn, D.; Mair, G.; Jais, W.

    2010-01-01

    The Worldwide LHC Computing Grid project (WLCG) has established a global computing infrastructure for the analysis of LHC data. It is based on several large grid projects, the EGEE project in Europe and the Open Science Grid (OSG) in the US. With the start of the LHC, a new international structure has been established in Europe, the European Grid Initiative (EGI), to provide sustainable support for the infrastructure in the coming years. In Austria, supported by the BMWF, the AustrianGrid project has also established a federated Tier-2 centre for the LHC, with large computing clusters in Vienna and Innsbruck. It forms the basis for the LHC data analysis of the Austrian particle physics community. The Tier-2 is again supported by an Austrian National Grid Initiative (NGT-AT) that integrates into the European structure. With the start of collider operation, the Austrian part of the grid has also demonstrated its capabilities, and data from the ATLAS and the CMS experiments are being analysed in Austria from the first day. As of today a number of other research areas, such as biology, radiobiology and theoretical physics, are also profiting from the available resources. (author)

  18. Computer vision research at Marshall Space Flight Center

    Science.gov (United States)

    Vinz, Frank L.

    1990-01-01

    Orbital docking, inspection, and servicing are operations which have the potential for capability enhancement as well as cost reduction for space operations by the application of computer vision technology. Research at MSFC has been a natural outgrowth of orbital docking simulations for remote manually controlled vehicles such as the Teleoperator Retrieval System and the Orbital Maneuvering Vehicle (OMV). Baseline design of the OMV dictates teleoperator control from a ground station. This necessitates a high data-rate communication network and results in several seconds of time delay. Operational costs and vehicle control difficulties could be alleviated by an autonomous or semi-autonomous control system onboard the OMV which would be based on a computer vision system having capability to recognize video images in real time. A concept under development at MSFC with these attributes is based on syntactic pattern recognition. It uses tree graphs for rapid recognition of binary images of known orbiting target vehicles. This technique and others being investigated at MSFC will be evaluated in realistic conditions by the use of MSFC orbital docking simulators. Computer vision is also being applied at MSFC as part of the supporting development for Work Package One of Space Station Freedom.

  19. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  20. Local area network strategies and guidelines for a Peruvian Computer Center

    OpenAIRE

    Palomino Fonseca, Miguel A.

    1991-01-01

    Approved for public release; distribution is unlimited. This thesis examines the application of local area network (LAN) technology to the Peruvian Air Force Computer Center. The current Peruvian Air Force Computer Center communication system and its problems are discussed, along with the basic concepts of data communications, protocols and topologies. The IEEE 802.3 and IEEE 802.5 specifications are discussed in detail. This study is primarily concerned with how to design the best local...

  1. Intention and Usage of Computer Based Information Systems in Primary Health Centers

    Science.gov (United States)

    Hosizah; Kuntoro; Basuki N., Hari

    2016-01-01

    The computer-based information system (CBIS) is adopted in almost all health care settings, including the primary health centers in East Java Province, Indonesia. Some of the software packages available were SIMPUS, SIMPUSTRONIK, SIKDA Generik, and e-puskesmas. Unfortunately, most of the primary health centers did not implement them successfully. This…

  2. CNC Turning Center Advanced Operations. Computer Numerical Control Operator/Programmer. 444-332.

    Science.gov (United States)

    Skowronski, Steven D.; Tatum, Kenneth

    This student guide provides materials for a course designed to introduce the student to the operations and functions of a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 presents course expectations and syllabus, covers safety precautions, and describes the CNC turning center components, CNC…

  3. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

Full Text Available The progressive upsurge in demand for processing and computing power has led to a corresponding upsurge in data center carbon emissions, costs, unethical waste management, depletion of natural resources, and high energy utilization. This raises the issue of sustainability attainment in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT based industries use data centers to provide services to staff, practitioners, and end users. However, enterprise servers consume large quantities of energy and incur further expenditure on cooling operations, and it is difficult to meet the needs of accuracy and efficiency in data centers while also encouraging greener practice and cost reduction. This study therefore focuses on the application of Green computing practice in data centers that house servers, and presents Green computing life cycle strategies and best practices for better management of data centers in IT based industries. Data were collected through a questionnaire from 133 respondents in industries that currently operate in-house data centers. The analysed data were used to verify the Green computing life cycle strategies presented in this study. Findings show that each of the life cycle strategies is significant in helping IT based industries apply Green computing practices in their data centers. This study should be of interest to knowledge and data management practitioners as well as environmental managers and academicians deploying Green data centers in their organizations.

  4. Scientific visualization in computational aerodynamics at NASA Ames Research Center

    Science.gov (United States)

    Bancroft, Gordon V.; Plessel, Todd; Merritt, Fergus; Walatka, Pamela P.; Watson, Val

    1989-01-01

The visualization methods used in computational fluid dynamics research at the NASA-Ames Numerical Aerodynamic Simulation facility are examined, including postprocessing, tracking, and steering methods. The visualization requirements of the facility's three-dimensional graphical workstation are outlined and the types of hardware and software used to meet these requirements are discussed. The main features of the facility's current and next-generation workstations are listed. Emphasis is given to postprocessing techniques, such as dynamic interactive viewing on the workstation and recording and playback on videodisk, tape, and 16-mm film. Postprocessing software packages are described, including a three-dimensional plotter, a surface modeler, a graphical animation system, a flow analysis software toolkit, and a real-time interactive particle-tracer.

  5. Applied human factors research at the NASA Johnson Space Center Human-Computer Interaction Laboratory

    Science.gov (United States)

    Rudisill, Marianne; Mckay, Timothy D.

    1990-01-01

    The applied human factors research program performed at the NASA Johnson Space Center's Human-Computer Interaction Laboratory is discussed. Research is conducted to advance knowledge in human interaction with computer systems during space crew tasks. In addition, the Laboratory is directly involved in the specification of the human-computer interface (HCI) for space systems in development (e.g., Space Station Freedom) and is providing guidelines and support for HCI design to current and future space missions.

  6. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2010-01-01

GoeGrid is a grid resource center located in Goettingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center will be presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster will be detailed. The benefits are an efficient use of computer and manpower resources. Further interdisciplinary projects are commonly organized courses for students of all fields to support education on grid-computing.

  7. High Performance Computing in Science and Engineering '16 : Transactions of the High Performance Computing Center, Stuttgart (HLRS) 2016

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2016-01-01

    This book presents the state-of-the-art in supercomputer simulation. It includes the latest findings from leading researchers using systems from the High Performance Computing Center Stuttgart (HLRS) in 2016. The reports cover all fields of computational science and engineering ranging from CFD to computational physics and from chemistry to computer science with a special emphasis on industrially relevant applications. Presenting findings of one of Europe’s leading systems, this volume covers a wide variety of applications that deliver a high level of sustained performance. The book covers the main methods in high-performance computing. Its outstanding results in achieving the best performance for production codes are of particular interest for both scientists and engineers. The book comes with a wealth of color illustrations and tables of results.

  8. Evaluating efforts to diversify the biomedical workforce: the role and function of the Coordination and Evaluation Center of the Diversity Program Consortium.

    Science.gov (United States)

    McCreath, Heather E; Norris, Keith C; Calderόn, Nancy E; Purnell, Dawn L; Maccalla, Nicole M G; Seeman, Teresa E

    2017-01-01

The National Institutes of Health (NIH)-funded Diversity Program Consortium (DPC) includes a Coordination and Evaluation Center (CEC) to conduct a longitudinal evaluation of the two signature, national NIH initiatives - the Building Infrastructure Leading to Diversity (BUILD) and the National Research Mentoring Network (NRMN) programs - designed to promote diversity in the NIH-funded biomedical, behavioral, clinical, and social sciences research workforce. Evaluation is central to understanding the impact of the consortium activities. This article reviews the role and function of the CEC and the collaborative processes and achievements critical to establishing empirical evidence regarding the efficacy of federally-funded, quasi-experimental interventions across multiple sites. The integrated DPC evaluation is particularly significant because it is a collaboratively developed Consortium Wide Evaluation Plan and the first hypothesis-driven, large-scale systemic national longitudinal evaluation of training programs in the history of NIH/National Institute of General Medical Sciences. To guide the longitudinal evaluation, the CEC-led literature review defined key indicators at critical training and career transition points - or Hallmarks of Success. The multidimensional, comprehensive evaluation of the impact of the DPC framed by these Hallmarks is described. This evaluation uses both established and newly developed common measures across sites, and rigorous quasi-experimental designs with novel multi-method approaches (qualitative and quantitative). The CEC also promotes shared learning among Consortium partners through working groups and provides technical assistance to support high-quality process and outcome evaluation within each program. Finally, the CEC is responsible for developing high-impact dissemination channels for best practices to inform peer institutions, NIH, and other key national and international stakeholders. A strong longitudinal evaluation across

  9. Computer Science, Biology and Biomedical Informatics academy: Outcomes from 5 years of Immersing High-school Students into Informatics Research.

    Science.gov (United States)

    King, Andrew J; Fisher, Arielle M; Becich, Michael J; Boone, David N

    2017-01-01

    The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist.

  10. Computer Science, Biology and Biomedical Informatics academy: Outcomes from 5 years of Immersing High-school Students into Informatics Research

    Science.gov (United States)

    King, Andrew J.; Fisher, Arielle M.; Becich, Michael J.; Boone, David N.

    2017-01-01

    The University of Pittsburgh's Department of Biomedical Informatics and Division of Pathology Informatics created a Science, Technology, Engineering, and Mathematics (STEM) pipeline in 2011 dedicated to providing cutting-edge informatics research and career preparatory experiences to a diverse group of highly motivated high-school students. In this third editorial installment describing the program, we provide a brief overview of the pipeline, report on achievements of the past scholars, and present results from self-reported assessments by the 2015 cohort of scholars. The pipeline continues to expand with the 2015 addition of the innovation internship, and the introduction of a program in 2016 aimed at offering first-time research experiences to undergraduates who are underrepresented in pathology and biomedical informatics. Achievements of program scholars include authorship of journal articles, symposium and summit presentations, and attendance at top 25 universities. All of our alumni matriculated into higher education and 90% remain in STEM majors. The 2015 high-school program had ten participating scholars who self-reported gains in confidence in their research abilities and understanding of what it means to be a scientist. PMID:28400991

  11. Biomedical photonics handbook biomedical diagnostics

    CERN Document Server

    Vo-Dinh, Tuan

    2014-01-01

    Shaped by Quantum Theory, Technology, and the Genomics RevolutionThe integration of photonics, electronics, biomaterials, and nanotechnology holds great promise for the future of medicine. This topic has recently experienced an explosive growth due to the noninvasive or minimally invasive nature and the cost-effectiveness of photonic modalities in medical diagnostics and therapy. The second edition of the Biomedical Photonics Handbook presents fundamental developments as well as important applications of biomedical photonics of interest to scientists, engineers, manufacturers, teachers, studen

  12. CodPop: Computer Code for Defining Population Centers around NPP sites

    International Nuclear Information System (INIS)

    Lee, H. W.; Im, C. B.; Hyun, S. G.; Kim, S. Y.; Seo, Y. S.; Na, J. H.

    2009-01-01

An applicant for a reactor license is required by 10 CFR Part 100 (and the corresponding rule in MEST Notice No. 2008-7) to designate a population center distance, defined as the distance from the nuclear reactor center to the nearest boundary of a densely populated center containing more than about 25,000 residents. The population center distance must be at least one and one-third times the distance to the outer boundary of the low population zone (LPZ). The outer boundary of a population center shall be determined upon consideration of population distribution, not controlled by political boundaries (NARA, 1997). This paper presents a computer code, developed from a short-term basic research program of the Korea Institute of Nuclear Safety in 2008, for analyzing population distributions and defining the outer boundary of a population center around a nuclear power plant (Lee et al., 2008).
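    The siting criterion that such a code automates reduces to a simple distance comparison. The following minimal sketch illustrates the one-and-one-third rule only; the function name and the distances are hypothetical and are not part of CodPop's actual interface.

    ```python
    def population_center_distance_ok(pc_distance_km: float, lpz_outer_km: float) -> bool:
        """10 CFR Part 100 siting check: the population center distance must be at
        least one and one-third times the distance to the LPZ outer boundary."""
        return pc_distance_km >= (4.0 / 3.0) * lpz_outer_km

    # Hypothetical example: an LPZ outer boundary at 3 km requires any population
    # center of more than about 25,000 residents to be at least 4 km away.
    print(population_center_distance_ok(4.0, 3.0))   # True
    print(population_center_distance_ok(3.5, 3.0))   # False
    ```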

  13. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Goettingen

    CERN Document Server

    Meyer, J; The ATLAS collaboration; Weber, P

    2011-01-01

GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup, the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and manpower resources.

  14. ATLAS Tier-2 at the Compute Resource Center GoeGrid in Göttingen

    International Nuclear Information System (INIS)

    Meyer, Jörg; Quadt, Arnulf; Weber, Pavel

    2011-01-01

GoeGrid is a grid resource center located in Göttingen, Germany. The resources are commonly used, funded, and maintained by communities doing research in the fields of grid development, computer science, biomedicine, high energy physics, theoretical physics, astrophysics, and the humanities. For the high energy physics community, GoeGrid serves as a Tier-2 center for the ATLAS experiment as part of the world-wide LHC computing grid (WLCG). The status and performance of the Tier-2 center is presented with a focus on the interdisciplinary setup and administration of the cluster. Given the various requirements of the different communities on the hardware and software setup the challenge of the common operation of the cluster is detailed. The benefits are an efficient use of computer and personpower resources.

  15. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    Science.gov (United States)

    Bogdanov, A. V.; Iuzhanin, N. V.; Zolotarev, V. I.; Ezhakova, T. R.

    2017-12-01

In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered from every aspect of support. The Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  16. Impact of configuration management system of computer center on support of scientific projects throughout their lifecycle

    International Nuclear Information System (INIS)

    Bogdanov, A.V.; Yuzhanin, N.V.; Zolotarev, V.I.; Ezhakova, T.R.

    2017-01-01

In this article, the problem of supporting scientific projects throughout their lifecycle in a computer center is considered from every aspect of support. The Configuration Management system plays a connecting role in processes related to the provision and support of services of a computer center. In view of the strong integration of IT infrastructure components through virtualization, control of the infrastructure becomes even more critical to the support of research projects, which means higher requirements for the Configuration Management system. For every aspect of research project support, the influence of the Configuration Management system is reviewed and the development of the corresponding elements of the system is described in the present paper.

  17. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and for monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements to the computer-human interfaces became the dominant theme for the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing the mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  18. Biomedical nanotechnology.

    Science.gov (United States)

    Hurst, Sarah J

    2011-01-01

    This chapter summarizes the roles of nanomaterials in biomedical applications, focusing on those highlighted in this volume. A brief history of nanoscience and technology and a general introduction to the field are presented. Then, the chemical and physical properties of nanostructures that make them ideal for use in biomedical applications are highlighted. Examples of common applications, including sensing, imaging, and therapeutics, are given. Finally, the challenges associated with translating this field from the research laboratory to the clinic setting, in terms of the larger societal implications, are discussed.

  19. Computer Center.

    Science.gov (United States)

    Kramer, David W.

    1991-01-01

    Discusses the use of videodisk technology to teach biology. Presents videodisk selection criteria for biology teachers, lists 13 biology videodisks and their suppliers, and lists videodisks offered with biology textbooks. (MDH)

  20. Technical Data Management Center: a focal point for meteorological and other environmental transport computing technology

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Trubey, D.K.

    1981-01-01

The Technical Data Management Center (TDMC), which collects, packages, analyzes, and distributes information, computing technology, and data that include meteorological and other environmental transport work, is located at Oak Ridge National Laboratory within the Engineering Physics Division. Major activities include maintaining a collection of computing technology and associated literature citations to provide capabilities for meteorological and environmental work. Details of the activities on behalf of TDMC's sponsoring agency, the US Nuclear Regulatory Commission, are described

  1. CNC Turning Center Operations and Prove Out. Computer Numerical Control Operator/Programmer. 444-334.

    Science.gov (United States)

    Skowronski, Steven D.

    This student guide provides materials for a course designed to instruct the student in the recommended procedures used when setting up tooling and verifying part programs for a two-axis computer numerical control (CNC) turning center. The course consists of seven units. Unit 1 discusses course content and reviews and demonstrates set-up procedures…

  2. Computational Fluid Dynamics Modeling and Validating Experiments of Airflow in a Data Center

    Directory of Open Access Journals (Sweden)

    Emelie Wibron

    2018-03-01

Full Text Available The worldwide demand for data storage continues to increase, and both the number and the size of data centers are expanding rapidly. Energy efficiency is an important factor to consider in data centers since the total energy consumption is huge. The servers must be cooled, and the performance of the cooling system depends on the flow field of the air. Computational Fluid Dynamics (CFD) can provide detailed information about the airflow in both existing data centers and proposed data center configurations before they are built. However, the simulations must be carried out with quality and trust. The k–ε model is the most common choice to model the turbulent airflow in data centers. The aim of this study is to examine the performance of more advanced turbulence models, not previously investigated for CFD modeling of data centers. The considered turbulence models are the k–ε model, the Reynolds Stress Model (RSM) and Detached Eddy Simulations (DES). The commercial code ANSYS CFX 16.0 is used to perform the simulations and experimental values are used for validation. It is clarified that the flow fields for the different turbulence models deviate at locations that are not in the close proximity of the main components in the data center. The k–ε model fails to predict low velocity regions. RSM and DES produce very similar results and, based on the solution times, it is recommended to use RSM to model the turbulent airflow in data centers.

  3. Accurate Computation of Periodic Regions' Centers in the General M-Set with Integer Index Number

    Directory of Open Access Journals (Sweden)

    Wang Xingyuan

    2010-01-01

Full Text Available This paper presents two methods for accurately computing the periodic regions' centers. One method is suited to general M-sets with integer index number, the other to general M-sets with negative integer index number. Both methods improve the precision of computation by transforming the polynomial equations that determine the periodic regions' centers. We primarily discuss the general M-sets with negative integer index and analyze the relationship between the number of periodic regions' centers on the principal symmetric axis and in the principal symmetric interior. We can obtain the centers' coordinates with at least 48 significant digits after the decimal point in both real and imaginary parts by applying Newton's method to the transformed polynomial equation that determines the periodic regions' centers. In this paper, we list some centers' coordinates of the general M-sets' k-periodic regions (k = 3, 4, 5, 6) for the index numbers α = −25, −24, …, −1, all of which have high numerical accuracy.
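    For the classic quadratic case, the centers of the period-k regions are the roots of the polynomial P_k(c), where P_1(c) = c and P_{k+1}(c) = P_k(c)^2 + c, so the idea of applying Newton's method to a polynomial that determines the centers can be sketched as below. This is a minimal illustration for the standard M-set only; it is not the authors' code and does not cover the general or negative-index sets discussed in the paper.

    ```python
    def period_k_polynomial(c: complex, k: int):
        """Evaluate P_k(c) = f_c^k(0) and dP_k/dc for f_c(z) = z**2 + c.

        Roots of P_k are the centers of period-k regions of the quadratic M-set."""
        z, dz = 0j, 0j
        for _ in range(k):
            dz = 2.0 * z * dz + 1.0   # chain rule: d/dc of (z**2 + c)
            z = z * z + c
        return z, dz

    def newton_center(c0: complex, k: int, tol: float = 1e-14, max_iter: int = 100) -> complex:
        """Newton iteration on P_k(c) starting from an initial guess c0."""
        c = complex(c0)
        for _ in range(max_iter):
            p, dp = period_k_polynomial(c, k)
            step = p / dp
            c -= step
            if abs(step) < tol:
                break
        return c

    # Example: converges to the center of a period-3 region near c = -0.1226 + 0.7449i
    print(newton_center(-0.1 + 0.7j, 3))
    ```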

  4. Do jumbo cups cause hip center elevation in revision THA? A computer simulation.

    Science.gov (United States)

    Nwankwo, Chima; Dong, Nick N; Heffernan, Christopher D; Ries, Michael D

    2014-02-01

    Acetabular revision THA with use of a large (jumbo) cup is an effective treatment for many cavitary and segmental peripheral bone defects. However, the jumbo cup may result in elevation of the hip center and protrusion through the anterior acetabular wall as a result of the oversized geometry of the jumbo cup compared with the physiologic acetabulum. The purpose of this computer simulation was to determine how much elevation of the hip center and anterior wall protrusion occurs in revision THA with use of a jumbo cup technique in which the inferior edge of the jumbo cup is placed at the inferior acetabular rim and the superior edge of the jumbo cup is placed against host bone at the superior margin of a posterosuperior bone defect. Two hundred sixty-five pelvic CT scans were analyzed by custom CT analytical software. The computer simulated oversized reaming. The vertical and anterior reamer center shifts were measured, and anterior column bone removal was determined. The computer simulation demonstrated that the hip center shifted 0.27 mm superiorly and 0.02 mm anteriorly, and anterior column bone removal increased 0.86 mm for every 1-mm increase in reamer diameter. Our results indicate that the jumbo cup technique results in hip center elevation despite placement of the cup adjacent to the inferior acetabulum. For a hypothetical increase from a 54-mm socket to a 72-mm socket, as one might see in the context of the revision of a failed THA, our model would predict an elevation of the hip center of approximately 5 mm and loss of approximately 15 mm of anterior column bone. This suggests that an increase in femoral head length may be needed to compensate for the hip center elevation caused by the use of a large jumbo cup in revision THA. A jumbo cup may also result in protrusion through the anterior wall.
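    The closing estimate follows directly from the reported per-millimeter slopes; the quick arithmetic check below uses only the figures quoted in the abstract.

    ```python
    # Regression slopes reported by the simulation (per 1 mm increase in reamer diameter)
    hip_center_elevation_per_mm = 0.27   # mm of superior shift of the hip center
    anterior_bone_loss_per_mm = 0.86     # mm of anterior column bone removed

    diameter_increase = 72 - 54          # revising a 54 mm socket to a 72 mm jumbo cup

    print(hip_center_elevation_per_mm * diameter_increase)  # ~4.9 mm, i.e. "approximately 5 mm"
    print(anterior_bone_loss_per_mm * diameter_increase)    # ~15.5 mm, i.e. "approximately 15 mm"
    ```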

  5. High Performance Computing in Science and Engineering '08 : Transactions of the High Performance Computing Center

    CERN Document Server

    Kröner, Dietmar; Resch, Michael

    2009-01-01

The discussions and plans on all scientific, advisory, and political levels to realize an even larger "European Supercomputer" in Germany, where the hardware costs alone will be hundreds of millions of Euros – much more than in the past – are getting closer to realization. As part of the strategy, the three national supercomputing centres HLRS (Stuttgart), NIC/JSC (Jülich) and LRZ (Munich) have formed the Gauss Centre for Supercomputing (GCS) as a new virtual organization enabled by an agreement between the Federal Ministry of Education and Research (BMBF) and the state ministries for research of Baden-Württemberg, Bayern, and Nordrhein-Westfalen. Already today, the GCS provides the most powerful high-performance computing infrastructure in Europe. Through GCS, HLRS participates in the European project PRACE (Partnership for Advanced Computing in Europe) and extends its reach to all European member countries. These activities align well with the activities of HLRS in the European HPC infrastructur...

  6. BIG: a Grid Portal for Biomedical Data and Images

    Directory of Open Access Journals (Sweden)

    Giovanni Aloisio

    2004-06-01

Full Text Available Modern management of biomedical systems involves the use of many distributed resources, such as high performance computational resources to analyze biomedical data, mass storage systems to store them, medical instruments (microscopes, tomographs, etc.), and advanced visualization and rendering tools. Grids offer the computational power, security and availability needed by such novel applications. This paper presents BIG (Biomedical Imaging Grid), a Web-based Grid portal for management of biomedical information (data and images) in a distributed environment. BIG is an interactive environment that deals with complex user requests regarding the acquisition of biomedical data and the "processing" and "delivering" of biomedical images, using the power and security of Computational Grids.

  7. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    Science.gov (United States)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

This contribution reports on solutions, experiences and recent developments with the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from the inclusion of desktop clusters over institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large scale tests prove the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  8. Radiation Shielding Information Center: a source of computer codes and data for fusion neutronics studies

    International Nuclear Information System (INIS)

    McGill, B.L.; Roussin, R.W.; Trubey, D.K.; Maskewitz, B.F.

    1980-01-01

    The Radiation Shielding Information Center (RSIC), established in 1962 to collect, package, analyze, and disseminate information, computer codes, and data in the area of radiation transport related to fission, is now being utilized to support fusion neutronics technology. The major activities include: (1) answering technical inquiries on radiation transport problems, (2) collecting, packaging, testing, and disseminating computing technology and data libraries, and (3) reviewing literature and operating a computer-based information retrieval system containing material pertinent to radiation transport analysis. The computer codes emphasize methods for solving the Boltzmann equation such as the discrete ordinates and Monte Carlo techniques, both of which are widely used in fusion neutronics. The data packages include multigroup coupled neutron-gamma-ray cross sections and kerma coefficients, other nuclear data, and radiation transport benchmark problem results

  9. Argonne's Laboratory Computing Resource Center 2009 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B. (CLS-CI)

    2011-05-13

    Now in its seventh year of operation, the Laboratory Computing Resource Center (LCRC) continues to be an integral component of science and engineering research at Argonne, supporting a diverse portfolio of projects for the U.S. Department of Energy and other sponsors. The LCRC's ongoing mission is to enable and promote computational science and engineering across the Laboratory, primarily by operating computing facilities and supporting high-performance computing application use and development. This report describes scientific activities carried out with LCRC resources in 2009 and the broad impact on programs across the Laboratory. The LCRC computing facility, Jazz, is available to the entire Laboratory community. In addition, the LCRC staff provides training in high-performance computing and guidance on application usage, code porting, and algorithm development. All Argonne personnel and collaborators are encouraged to take advantage of this computing resource and to provide input into the vision and plans for computing and computational analysis at Argonne. The LCRC Allocations Committee makes decisions on individual project allocations for Jazz. Committee members are appointed by the Associate Laboratory Directors and span a range of computational disciplines. The 350-node LCRC cluster, Jazz, began production service in April 2003 and has been a research work horse ever since. Hosting a wealth of software tools and applications and achieving high availability year after year, researchers can count on Jazz to achieve project milestones and enable breakthroughs. Over the years, many projects have achieved results that would have been unobtainable without such a computing resource. In fiscal year 2009, there were 49 active projects representing a wide cross-section of Laboratory research and almost all research divisions.

  10. The Role of Computers in Research and Development at Langley Research Center

    Science.gov (United States)

    Wieseman, Carol D. (Compiler)

    1994-01-01

This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  11. Argonne's Laboratory Computing Resource Center : 2005 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Coghlan, S. C; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Pieper, G. P.

    2007-06-30

    Argonne National Laboratory founded the Laboratory Computing Resource Center in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. The first goal of the LCRC was to deploy a mid-range supercomputing facility to support the unmet computational needs of the Laboratory. To this end, in September 2002, the Laboratory purchased a 350-node computing cluster from Linux NetworX. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the fifty fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2005, there were 62 active projects on Jazz involving over 320 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to improve the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure

  12. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1973-01-01

    Advances in Biomedical Engineering, Volume 2, is a collection of papers that discusses the basic sciences, the applied sciences of engineering, the medical sciences, and the delivery of health services. One paper discusses the models of adrenal cortical control, including the secretion and metabolism of cortisol (the controlled process), as well as the initiation and modulation of secretion of ACTH (the controller). Another paper discusses hospital computer systems-application problems, objective evaluation of technology, and multiple pathways for future hospital computer applications. The pos

  13. Teaching Scientific Computing: A Model-Centered Approach to Pipeline and Parallel Programming with C

    Directory of Open Access Journals (Sweden)

    Vladimiras Dolgopolovas

    2015-01-01

Full Text Available The aim of this study is to present an approach to the introduction to pipeline and parallel computing, using a model of a multiphase queueing system. Pipeline computing, including software pipelines, is among the key concepts in modern computing and electronics engineering. Modern computer science and engineering education requires a comprehensive curriculum, so the introduction to pipeline and parallel computing is an essential topic to be included in the curriculum. At the same time, the topic is among the most motivating tasks due to its comprehensive multidisciplinary and technical requirements. To enhance the educational process, the paper proposes a novel model-centered framework and develops the relevant learning objects. It allows implementing an educational platform for a constructivist learning process, thus enabling learners' experimentation with the provided programming models, obtaining learners' competences in modern scientific research and computational thinking, and capturing the relevant technical knowledge. It also provides an integral platform that allows a simultaneous and comparative introduction to pipelining and parallel computing. The programming language C for developing programming models, and the message passing interface (MPI) and OpenMP parallelization tools, have been chosen for implementation.
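    The pipeline idea underlying the multiphase queueing model can be illustrated with a two-phase sketch in which each phase pulls items from an input queue, does some work, and pushes them downstream. The version below uses Python threads and queues for brevity (the course itself uses C with MPI/OpenMP), and all names and timings are illustrative.

    ```python
    import threading, queue, time

    def stage(inbox, outbox, work_time):
        """One phase of the pipeline: take an item, 'process' it, pass it downstream."""
        while True:
            item = inbox.get()
            if item is None:                  # poison pill terminates the phase
                if outbox is not None:
                    outbox.put(None)
                break
            time.sleep(work_time)             # stand-in for real per-item work
            if outbox is not None:
                outbox.put(item)

    q1, q2 = queue.Queue(), queue.Queue()
    workers = [threading.Thread(target=stage, args=(q1, q2, 0.01)),   # fast phase
               threading.Thread(target=stage, args=(q2, None, 0.02))] # slow phase
    for w in workers:
        w.start()

    start = time.time()
    for i in range(100):
        q1.put(i)
    q1.put(None)
    for w in workers:
        w.join()
    # Pipelined time ~= 100 * 0.02 s (the slowest phase), not 100 * (0.01 + 0.02) s as in a serial run.
    print(f"elapsed: {time.time() - start:.2f} s for 100 items")
    ```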

  14. Building the biomedical data science workforce.

    Directory of Open Access Journals (Sweden)

    Michelle C Dunn

    2017-07-01

    Full Text Available This article describes efforts at the National Institutes of Health (NIH from 2013 to 2016 to train a national workforce in biomedical data science. We provide an analysis of the Big Data to Knowledge (BD2K training program strengths and weaknesses with an eye toward future directions aimed at any funder and potential funding recipient worldwide. The focus is on extramurally funded programs that have a national or international impact rather than the training of NIH staff, which was addressed by the NIH's internal Data Science Workforce Development Center. From its inception, the major goal of BD2K was to narrow the gap between needed and existing biomedical data science skills. As biomedical research increasingly relies on computational, mathematical, and statistical thinking, supporting the training and education of the workforce of tomorrow requires new emphases on analytical skills. From 2013 to 2016, BD2K jump-started training in this area for all levels, from graduate students to senior researchers.

  15. Argonne's Laboratory computing resource center : 2006 annual report.

    Energy Technology Data Exchange (ETDEWEB)

    Bair, R. B.; Kaushik, D. K.; Riley, K. R.; Valdes, J. V.; Drugan, C. D.; Pieper, G. P.

    2007-05-31

    Argonne National Laboratory founded the Laboratory Computing Resource Center (LCRC) in the spring of 2002 to help meet pressing program needs for computational modeling, simulation, and analysis. The guiding mission is to provide critical computing resources that accelerate the development of high-performance computing expertise, applications, and computations to meet the Laboratory's challenging science and engineering missions. In September 2002 the LCRC deployed a 350-node computing cluster from Linux NetworX to address Laboratory needs for mid-range supercomputing. This cluster, named 'Jazz', achieved over a teraflop of computing power (10{sup 12} floating-point calculations per second) on standard tests, making it the Laboratory's first terascale computing system and one of the 50 fastest computers in the world at the time. Jazz was made available to early users in November 2002 while the system was undergoing development and configuration. In April 2003, Jazz was officially made available for production operation. Since then, the Jazz user community has grown steadily. By the end of fiscal year 2006, there were 76 active projects on Jazz involving over 380 scientists and engineers. These projects represent a wide cross-section of Laboratory expertise, including work in biosciences, chemistry, climate, computer science, engineering applications, environmental science, geoscience, information science, materials science, mathematics, nanoscience, nuclear engineering, and physics. Most important, many projects have achieved results that would have been unobtainable without such a computing resource. The LCRC continues to foster growth in the computational science and engineering capability and quality at the Laboratory. Specific goals include expansion of the use of Jazz to new disciplines and Laboratory initiatives, teaming with Laboratory infrastructure providers to offer more scientific data management capabilities, expanding Argonne staff

  16. The psychology of computer displays in the modern mission control center

    Science.gov (United States)

    Granaas, Michael M.; Rhea, Donald C.

    1988-01-01

    Work at NASA's Western Aeronautical Test Range (WATR) has demonstrated the need for increased consideration of psychological factors in the design of computer displays for the WATR mission control center. These factors include color perception, memory load, and cognitive processing abilities. A review of relevant work in the human factors psychology area is provided to demonstrate the need for this awareness. The information provided should be relevant in control room settings where computerized displays are being used.

  17. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  18. Knowledge management: Role of the Radiation Safety Information Computational Center (RSICC)

    Directory of Open Access Journals (Sweden)

    Valentine Timothy

    2017-01-01

Full Text Available The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  19. Efficient Computation of Multiscale Entropy over Short Biomedical Time Series Based on Linear State-Space Models

    Directory of Open Access Journals (Sweden)

    Luca Faes

    2017-01-01

Full Text Available The most common approach to assess the dynamical complexity of a time series across multiple temporal scales makes use of the multiscale entropy (MSE) and refined MSE (RMSE) measures. In spite of their popularity, MSE and RMSE lack an analytical framework allowing their calculation for known dynamic processes and cannot be reliably computed over short time series. To overcome these limitations, we propose a method to assess RMSE for autoregressive (AR) stochastic processes. The method makes use of linear state-space (SS) models to provide the multiscale parametric representation of an AR process observed at different time scales and exploits the SS parameters to quantify analytically the complexity of the process. The resulting linear MSE (LMSE) measure is first tested in simulations, both theoretically to relate the multiscale complexity of AR processes to their dynamical properties and over short process realizations to assess its computational reliability in comparison with RMSE. Then, it is applied to the time series of heart period, arterial pressure, and respiration measured for healthy subjects monitored in resting conditions and during physiological stress. This application to short-term cardiovascular variability documents that LMSE can describe better than RMSE the activity of physiological mechanisms producing biological oscillations at different temporal scales.
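    For reference, the standard non-parametric MSE that such parametric measures are compared against is computed by coarse-graining the series and taking the sample entropy at each scale. The sketch below is a minimal, brute-force version of that baseline, not the authors' state-space LMSE estimator.

    ```python
    import numpy as np

    def coarse_grain(x, scale):
        """Average non-overlapping windows of length `scale` (standard MSE coarse-graining)."""
        n = len(x) // scale
        return x[:n * scale].reshape(n, scale).mean(axis=1)

    def sample_entropy(x, m=2, r_factor=0.2):
        """Plain O(N^2) sample entropy with tolerance r = r_factor * std(x)."""
        x = np.asarray(x, dtype=float)
        r = r_factor * x.std()
        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(dist <= r)
            return count
        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    def multiscale_entropy(x, max_scale=5):
        x = np.asarray(x, dtype=float)
        return [sample_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

    # Example: for white noise the entropy decreases as the scale increases
    rng = np.random.default_rng(0)
    print(multiscale_entropy(rng.standard_normal(2000), max_scale=4))
    ```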

  20. [Projects to accelerate the practical use of innovative medical devices to collaborate with TWIns, Center for Advanced Biomedical Sciences, Waseda University and School of Engineering, The University of Tokyo].

    Science.gov (United States)

    Niimi, Shingo; Umezu, Mitsuo; Iseki, Hiroshi; Harada, Hiroshi Kasanuki Noboru; Mitsuishi, Mamoru; Kitamori, Takehiko; Tei, Yuichi; Nakaoka, Ryusuke; Haishima, Yuji

    2014-01-01

The Division of Medical Devices has been conducting projects to accelerate the practical use of innovative medical devices in collaboration with TWIns, Center for Advanced Biomedical Sciences, Waseda University, and the School of Engineering, The University of Tokyo. TWIns has been working toward the establishment of preclinical evaluation methods based on "Engineering Based Medicine" and has established the Regulatory Science Institute for Medical Devices. The School of Engineering, The University of Tokyo has been working toward the establishment of assessment methodology for innovative minimally invasive therapeutic devices, materials, and nanobio diagnostic devices. This report reviews the exchanges of personnel, the implementation systems, and the research progress of these projects.

  1. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    2016-06-13

Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. These studies have been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make the right decisions. There has been significant research growth in the computational aspects of decision-making systems, yet comparatively little effort has been devoted to identifying and articulating user needs and requirements in assessing system outputs and the extent to which human judgments could be utilized for making accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment, and the objectives of this position paper are to bring our research ideas to the workshop and to share and discuss them.

  2. Eye center localization and gaze gesture recognition for human-computer interaction.

    Science.gov (United States)

    Zhang, Wenhao; Smith, Melvyn L; Smith, Lyndon N; Farooq, Abdul

    2016-03-01

    This paper introduces an unsupervised modular approach for accurate and real-time eye center localization in images and videos, thus allowing a coarse-to-fine, global-to-regional scheme. The trajectories of eye centers in consecutive frames, i.e., gaze gestures, are further analyzed, recognized, and employed to boost the human-computer interaction (HCI) experience. This modular approach makes use of isophote and gradient features to estimate the eye center locations. A selective oriented gradient filter has been specifically designed to remove strong gradients from eyebrows, eye corners, and shadows, which sabotage most eye center localization methods. A real-world implementation utilizing these algorithms has been designed in the form of an interactive advertising billboard to demonstrate the effectiveness of our method for HCI. The eye center localization algorithm has been compared with 10 other algorithms on the BioID database and six other algorithms on the GI4E database. It outperforms all the other algorithms in comparison in terms of localization accuracy. Further tests on the extended Yale Face Database b and self-collected data have proved this algorithm to be robust against moderate head poses and poor illumination conditions. The interactive advertising billboard has manifested outstanding usability and effectiveness in our tests and shows great potential for benefiting a wide range of real-world HCI applications.
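    The gradient-feature part of such approaches can be sketched as a brute-force center-voting scheme in which every strong image gradient votes for candidate centers lying along its direction. The code below is a simplified illustration of that idea and is not the authors' isophote-based, real-time implementation; all names and thresholds are illustrative.

    ```python
    import numpy as np

    def gradient_center_vote(gray):
        """Gradient-based eye center voting (in the spirit of means-of-gradients methods).

        The true iris/pupil center maximizes the summed squared dot products between
        normalized displacement vectors and image gradients. Brute-force O(N^2);
        intended only for small image patches.
        """
        gy, gx = np.gradient(gray.astype(float))
        mag = np.hypot(gx, gy)
        mask = mag > np.percentile(mag, 90)          # keep only the strongest gradients
        gx, gy = gx[mask] / mag[mask], gy[mask] / mag[mask]
        ys, xs = np.nonzero(mask)

        h, w = gray.shape
        best, best_score = (0, 0), -1.0
        for cy in range(h):
            for cx in range(w):
                dx, dy = xs - cx, ys - cy
                norm = np.hypot(dx, dy)
                valid = norm > 0
                dots = (dx[valid] * gx[valid] + dy[valid] * gy[valid]) / norm[valid]
                score = np.mean(np.maximum(dots, 0.0) ** 2)
                if score > best_score:
                    best_score, best = score, (cx, cy)
        return best   # (x, y) of the estimated center
    ```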

  3. Healthy Eating and Harambee: curriculum development for a culturally-centered bio-medically oriented nutrition education program to reach African American women of childbearing age.

    Science.gov (United States)

    Kannan, Srimathi; Sparks, Arlene V; Webster, J DeWitt; Krishnakumar, Ambika; Lumeng, Julie

    2010-07-01

    The purpose was to develop, implement and evaluate a peer-led nutrition curriculum Healthy Eating and Harambee that addresses established objectives of maternal and infant health and to shift the stage for African American women of childbearing age in Genesee County toward healthier dietary patterns using a socio-cultural and biomedical orientation. The PEN-3 model, which frames culture in the context of health promotion interventions, was integrated with the Transtheoretical Model to guide this 13-week pre-test/post-test curriculum. Materials developed included soul food plate visuals, a micronutrient availability worksheet, a fruit stand, and gardening kits. Learning activities included affirmations, stories, case-scenarios, point-of-purchase product recognition, church health teams, and community health fairs. We investigated health-promoting dietary behaviors (consumption of more fruits and vegetables (F&V), serving more F&V to their families, and moderating dietary sodium and fat intakes), and biomedical behaviors (self-monitoring blood pressure and exercising) across five stages of change. Session attendance and program satisfaction were assessed. N = 102 women participated (mean age = 27.5 years). A majority (77%) reported adopting at least one healthy eating behavior (moderating sodium, serving more F&V to their families), 23% adopted at least two such behaviors (reading food labels for sodium; using culinary herbs/spices; serving more F&V to their families), and 45% adopted both dietary (moderating sodium; eating more fruits) and biomedical behaviors. Participants and facilitators favorably evaluated the curriculum and suggested improvements. A multi-conceptual approach coupled with cultural and biomedical tailoring has potential to promote young African American women's movement to more advanced stages of change and improve self-efficacy for fruit and vegetable intake, dietary sodium moderation, and self-monitoring blood pressure and physical activity.

  4. An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center

    Science.gov (United States)

    Gleason, J. L.; Little, M. M.

    2013-12-01

NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate has to be evaluated by integrating it with real-world operational needs across NASA and the associated maturity that would come with that. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Sciences Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and Earth's Radiant Energy System (CERES) project.

  5. Spectrum of tablet computer use by medical students and residents at an academic medical center

    Directory of Open Access Journals (Sweden)

    Robert Robinson

    2015-07-01

Full Text Available Introduction. The value of tablet computer use in medical education is an area of considerable interest, with preliminary investigations showing that the majority of medical trainees feel that tablet computers added value to the curriculum. This study investigated potential differences in tablet computer use between medical students and resident physicians. Materials & Methods. Data collection for this survey was accomplished with an anonymous online questionnaire shared with the medical students and residents at Southern Illinois University School of Medicine (SIU-SOM) in July and August of 2012. Results. There were 76 medical student responses (26% response rate) and 66 resident/fellow responses to this survey (21% response rate). Residents/fellows were more likely to use tablet computers several times daily than medical students (32% vs. 20%, p = 0.035). The most common reported uses were for accessing medical reference applications (46%), e-Books (45%), and board study (32%). Residents were more likely than students to use a tablet computer to access an electronic medical record (41% vs. 21%, p = 0.010), review radiology images (27% vs. 12%, p = 0.019), and enter patient care orders (26% vs. 3%, p < 0.001). Discussion. This study shows a high prevalence and frequency of tablet computer use among physicians in training at this academic medical center. Most residents and students use tablet computers to access medical references, e-Books, and to study for board exams. Residents were more likely to use tablet computers to complete clinical tasks. Conclusions. Tablet computer use among medical students and resident physicians was common in this survey. All learners used tablet computers for point of care references and board study. Resident physicians were more likely to use tablet computers to access the EMR, enter patient care orders, and review radiology studies. This difference is likely due to the differing educational and professional demands placed on

  6. Modeling Remote I/O versus Staging Tradeoff in Multi-Data Center Computing

    International Nuclear Information System (INIS)

    Suslu, Ibrahim H

    2014-01-01

    In multi-data center computing, the data to be processed is not always local to the computation. This is a major challenge, especially for data-intensive Cloud computing applications, since large amounts of data need to be either moved to the local sites (staging) or accessed remotely over the network (remote I/O). Cloud application developers generally choose between staging and remote I/O intuitively, without making any scientific comparison specific to their application's data access patterns, since no generic model is available that they can use. In this paper, we propose a generic model to help Cloud application developers choose the most appropriate data access mechanism for their specific application workloads. We define the parameters that potentially affect the end-to-end performance of multi-data center Cloud applications which need to access large datasets over the network. To test and validate our models, we implemented a series of synthetic benchmark applications to simulate the most common data access patterns encountered in Cloud applications. We show that our model provides promising results in different settings with different parameters, such as network bandwidth, server and client capabilities, and data access ratio.
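
    The abstract describes a model parameterized by network bandwidth, server and client capabilities, and data access ratio. The sketch below is a simplified, hypothetical cost model of the staging versus remote I/O tradeoff; it is not the authors' model, and all rates and sizes are invented.

```python
# Simplified, hypothetical cost model for choosing between staging and remote I/O.
# Illustrative only; not the model proposed in the paper. All rates are invented.

def staging_time(dataset_gb, wan_gbps, local_io_gbps, access_ratio, passes=1):
    """Stage the whole dataset once over the WAN, then read the accessed fraction locally."""
    transfer = dataset_gb * 8 / wan_gbps
    local_reads = dataset_gb * access_ratio * passes * 8 / local_io_gbps
    return transfer + local_reads

def remote_io_time(dataset_gb, wan_gbps, access_ratio, passes=1, latency_penalty=1.3):
    """Read only the accessed fraction over the WAN, with a crude per-request penalty factor."""
    return dataset_gb * access_ratio * passes * 8 / wan_gbps * latency_penalty

if __name__ == "__main__":
    for ratio in (0.1, 0.5, 1.0):
        s = staging_time(1000, wan_gbps=10, local_io_gbps=40, access_ratio=ratio)
        r = remote_io_time(1000, wan_gbps=10, access_ratio=ratio)
        print(f"access ratio {ratio:.1f}: staging {s:.0f}s, remote I/O {r:.0f}s "
              f"-> {'staging' if s < r else 'remote I/O'}")
```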

  7. Environmental/Biomedical Terminology Index

    International Nuclear Information System (INIS)

    Huffstetler, J.K.; Dailey, N.S.; Rickert, L.W.; Chilton, B.D.

    1976-12-01

    The Information Center Complex (ICC), a centrally administered group of information centers, provides information support to environmental and biomedical research groups and others within and outside Oak Ridge National Laboratory. In-house data base building and development of specialized document collections are important elements of the ongoing activities of these centers. ICC groups must be concerned with language which will adequately classify and insure retrievability of document records. Language control problems are compounded when the complexity of modern scientific problem solving demands an interdisciplinary approach. Although there are several word lists, indexes, and thesauri specific to various scientific disciplines usually grouped as Environmental Sciences, no single generally recognized authority can be used as a guide to the terminology of all environmental science. If biomedical terminology for the description of research on environmental effects is also needed, the problem becomes even more complex. The building of a word list which can be used as a general guide to the environmental/biomedical sciences has been a continuing activity of the Information Center Complex. This activity resulted in the publication of the Environmental/Biomedical Terminology Index (EBTI).

  8. Environmental/Biomedical Terminology Index

    Energy Technology Data Exchange (ETDEWEB)

    Huffstetler, J.K.; Dailey, N.S.; Rickert, L.W.; Chilton, B.D.

    1976-12-01

    The Information Center Complex (ICC), a centrally administered group of information centers, provides information support to environmental and biomedical research groups and others within and outside Oak Ridge National Laboratory. In-house data base building and development of specialized document collections are important elements of the ongoing activities of these centers. ICC groups must be concerned with language which will adequately classify and insure retrievability of document records. Language control problems are compounded when the complexity of modern scientific problem solving demands an interdisciplinary approach. Although there are several word lists, indexes, and thesauri specific to various scientific disciplines usually grouped as Environmental Sciences, no single generally recognized authority can be used as a guide to the terminology of all environmental science. If biomedical terminology for the description of research on environmental effects is also needed, the problem becomes even more complex. The building of a word list which can be used as a general guide to the environmental/biomedical sciences has been a continuing activity of the Information Center Complex. This activity resulted in the publication of the Environmental Biomedical Terminology Index (EBTI).

  9. Comparison of nonmesonic hypernuclear decay rates computed in laboratory and center-of-mass coordinates

    International Nuclear Information System (INIS)

    De Conti, C.; Barbero, C.; Galeão, A. P.; Krmpotić, F.

    2014-01-01

    In this work we compute the one-nucleon-induced nonmesonic hypernuclear decay rates of ⁵ΛHe, ¹²ΛC and ¹³ΛC using a formalism based on the independent-particle shell model in terms of laboratory coordinates. To ascertain the correctness and precision of the method, these results are compared with those obtained using a formalism in terms of center-of-mass coordinates, which has been previously reported in the literature. The formalism in terms of laboratory coordinates will be useful in the shell-model approach to two-nucleon-induced transitions.

  10. System analysis for the Huntsville Operation Support Center distributed computer system

    Science.gov (United States)

    Ingels, F. M.

    1986-01-01

    A simulation model of the NASA Huntsville Operations Support Center (HOSC) was developed. This simulation model emulates the HYPERchannel Local Area Network (LAN) that ties together the various computers of HOSC. The HOSC system is a large installation of mainframe computers such as the Perkin Elmer 3200 series and the DEC VAX series. A series of six simulation exercises of the HOSC model is described using data sets provided by NASA. An analytical analysis of the Ethernet LAN and the video terminal (VT) distribution system is presented. An interface analysis of the smart-terminal network model, which allows the data flow requirements imposed by VTs on the Ethernet LAN to be estimated, is also presented.

  11. Initial Flight Test of the Production Support Flight Control Computers at NASA Dryden Flight Research Center

    Science.gov (United States)

    Carter, John; Stephenson, Mark

    1999-01-01

    The NASA Dryden Flight Research Center has completed the initial flight test of a modified set of F/A-18 flight control computers that gives the aircraft a research control law capability. The production support flight control computers (PSFCC) provide an increased capability for flight research in the control law, handling qualities, and flight systems areas. The PSFCC feature a research flight control processor that is "piggybacked" onto the baseline F/A-18 flight control system. This research processor allows for pilot selection of research control law operation in flight. To validate flight operation, a replication of a standard F/A-18 control law was programmed into the research processor and flight-tested over a limited envelope. This paper provides a brief description of the system, summarizes the initial flight test of the PSFCC, and describes future experiments for the PSFCC.

  12. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    Energy Technology Data Exchange (ETDEWEB)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state of the art.

  13. Abstracts of digital computer code packages assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    Carter, B.J.; Maskewitz, B.F.

    1985-04-01

    This publication, ORNL/RSIC-13, Volumes I to III Revised, has resulted from an internal audit of the first 168 packages of computing technology in the Computer Codes Collection (CCC) of the Radiation Shielding Information Center (RSIC). It replaces the earlier three documents published as single volumes between 1966 and 1972. A significant number of the early code packages were considered to be obsolete and were removed from the collection in the audit process, and the CCC numbers were not reassigned. Others not currently being used by the nuclear R and D community were retained in the collection to preserve technology not replaced by newer methods, or were considered of potential value for reference purposes. Much of the early technology, however, has improved through developer/RSIC/user interaction and continues at the forefront of the advancing state of the art.

  14. Computational fluid dynamics research at the United Technologies Research Center requiring supercomputers

    Science.gov (United States)

    Landgrebe, Anton J.

    1987-01-01

    An overview of research activities at the United Technologies Research Center (UTRC) in the area of Computational Fluid Dynamics (CFD) is presented. The requirement and use of various levels of computers, including supercomputers, for the CFD activities is described. Examples of CFD directed toward applications to helicopters, turbomachinery, heat exchangers, and the National Aerospace Plane are included. Helicopter rotor codes for the prediction of rotor and fuselage flow fields and airloads were developed with emphasis on rotor wake modeling. Airflow and airload predictions and comparisons with experimental data are presented. Examples are presented of recent parabolized Navier-Stokes and full Navier-Stokes solutions for hypersonic shock-wave/boundary layer interaction, and hydrogen/air supersonic combustion. In addition, other examples of CFD efforts in turbomachinery Navier-Stokes methodology and separated flow modeling are presented. A brief discussion of the 3-tier scientific computing environment is also presented, in which the researcher has access to workstations, mid-size computers, and supercomputers.

  15. Computer-aided dispatch--traffic management center field operational test final detailed test plan : WSDOT deployment

    Science.gov (United States)

    2003-10-01

    The purpose of this document is to expand upon the evaluation components presented in "Computer-aided dispatch--traffic management center field operational test final evaluation plan : WSDOT deployment". This document defines the objective, approach,...

  16. Biomedical technology

    CERN Document Server

    Wriggers, Peter

    2015-01-01

    In recent years, computational methods have led to new approaches that can be applied within medical practice. Based on the tremendous advances in medical imaging and high-performance computing, virtual testing is able to help in medical decision processes or implant designs. Current challenges in medicine and engineering are related to the application of computational methods to clinical medicine and the study of biological systems at different scales. Additionally, manufacturers will be able to use computational tools and methods to predict the performance of their medical devices in virtual patients. Physical and animal testing procedures could be reduced by virtual prototyping of medical devices; here, simulations can enhance the performance of alternative device designs for a range of virtual patients. This will lead to a refinement of designs and to safer products. This book summarizes different aspects of approaches to enhance function, production, initialization and complications of different types o...

  17. A Decentralized Virtual Machine Migration Approach of Data Centers for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaoying Wang

    2013-01-01

    As cloud computing offers services to a large number of users worldwide, pervasive applications from customers are hosted by large-scale data centers. On such platforms, virtualization technology is employed to multiplex the underlying physical resources. Since the incoming loads of different applications vary significantly, it is important and critical to manage the placement and resource allocation of the virtual machines (VMs) in order to guarantee the quality of service. In this paper, we propose a decentralized virtual machine migration approach inside the data centers of cloud computing environments. The system models and power models are defined and described first. Then, we present the key steps of the decentralized mechanism, including the establishment of load vectors, load information collection, VM selection, and destination determination. A two-threshold decentralized migration algorithm is implemented to further reduce energy consumption while maintaining the quality of service. We examine the effect of our approach in performance evaluation experiments and analyze the thresholds and other factors. The results illustrate that the proposed approach can efficiently balance the loads across different physical nodes and can also lead to lower power consumption of the system as a whole.
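
    As a rough illustration of the threshold-based idea described above (load vectors, VM selection, destination determination), the sketch below makes migration decisions for overloaded hosts; the threshold, host names and VM loads are invented, and the authors' algorithm is not reproduced here.

```python
# Toy sketch of a threshold-based migration decision, loosely following the steps
# listed above (load vectors, VM selection, destination determination).
# Host names, VM loads and the threshold are invented; this is not the paper's algorithm.
HIGH = 0.8  # utilization threshold above which a host is considered overloaded

def select_migrations(hosts):
    """hosts: dict host -> list of VM CPU loads (fractions of host capacity)."""
    plans = []
    load = {h: sum(vms) for h, vms in hosts.items()}
    for host, vms in hosts.items():
        while load[host] > HIGH and vms:
            vm = min(vms)  # move the smallest VM first
            # destination: least-loaded host that stays below HIGH after the move
            candidates = [h for h in hosts if h != host and load[h] + vm <= HIGH]
            if not candidates:
                break
            dest = min(candidates, key=load.get)
            vms.remove(vm)
            hosts[dest].append(vm)
            load[host] -= vm
            load[dest] += vm
            plans.append((vm, host, dest))
    # A second, low threshold would trigger consolidation of underloaded hosts
    # so they can be put to sleep; that pass is omitted here.
    return plans

example = {"h1": [0.5, 0.4, 0.2], "h2": [0.1], "h3": [0.2, 0.1]}
print(select_migrations(example))
```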

  18. Examining the Fundamental Obstructs of Adopting Cloud Computing for 9-1-1 Dispatch Centers in the USA

    Science.gov (United States)

    Osman, Abdulaziz

    2016-01-01

    The purpose of this research study was to examine the fears associated with embracing cloud computing, which span dimensions such as leaders' fear of change and the complexity of the technology, in 9-1-1 dispatch centers in the USA. The problem addressed in the study was that many 9-1-1 dispatch centers in the USA are still using old…

  19. Changing the batch system in a Tier 1 computing center: why and how

    Science.gov (United States)

    Chierici, Andrea; Dal Pra, Stefano

    2014-06-01

    At the Italian Tier 1 center at CNAF we are evaluating the possibility of changing the current production batch system. This activity is motivated mainly because we are looking for a more flexible licensing model as well as a way to avoid vendor lock-in. We performed a technology-tracking exercise and, among many possible solutions, we chose to evaluate Grid Engine as an alternative because its adoption is increasing in the HEPiX community and because it is supported by the EMI middleware that we currently use on our computing farm. Another INFN site evaluated Slurm, and we will compare our results in order to understand the pros and cons of the two solutions. We will present the results of our evaluation of Grid Engine, in order to understand whether it can fit the requirements of a Tier 1 center, compared to the solution we adopted long ago. We performed a survey and a critical re-evaluation of our farming infrastructure: much production software (accounting and monitoring above all) relies on our current solution, and changing it required us to write new wrappers and adapt the infrastructure to the new system. We believe the results of this investigation can be very useful to other Tier-1 and Tier-2 centers in a similar situation, where the effort of switching may appear too great to undertake. We will provide guidelines to help understand how difficult this operation can be and how long the change may take.

  20. User participation in the development of the human/computer interface for control centers

    Science.gov (United States)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  1. Computed tomography evaluation of rotary systems on the root canal transportation and centering ability

    Energy Technology Data Exchange (ETDEWEB)

    Pagliosa, Andre; Raucci-Neto, Walter; Silva-Souza, Yara Teresinha Correa; Alfredo, Edson, E-mail: ysousa@unaerp.br [Universidade de Ribeirao Preto (UNAERP), SP (Brazil). Fac. de Odontologia]; Sousa-Neto, Manoel Damiao; Versiani, Marco Aurelio [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil). Fac. de Odontologia]

    2015-03-01

    The endodontic preparation of curved and narrow root canals is challenging, with a tendency for the prepared canal to deviate away from its natural axis. The aim of this study was to evaluate, by cone-beam computed tomography, the transportation and centering ability of curved mesiobuccal canals in maxillary molars after biomechanical preparation with different nickel-titanium (NiTi) rotary systems. Forty teeth with angles of curvature ranging from 20° to 40° and radii between 5.0 mm and 10.0 mm were selected and assigned into four groups (n = 10), according to the biomechanical preparative system used: Hero 642 (HR), Liberator (LB), ProTaper (PT), and Twisted File (TF). The specimens were inserted into an acrylic device and scanned with computed tomography prior to, and following, instrumentation at 3, 6 and 9 mm from the root apex. The canal degree of transportation and centering ability were calculated and analyzed using one-way ANOVA and Tukey’s tests (α = 0.05). The results demonstrated no significant difference (p > 0.05) in shaping ability among the rotary systems. The mean canal transportation was: -0.049 ± 0.083 mm (HR); -0.004 ± 0.044 mm (LB); -0.003 ± 0.064 mm (PT); -0.021 ± 0.064 mm (TF). The mean canal centering ability was: -0.093 ± 0.147 mm (HR); -0.001 ± 0.100 mm (LB); -0.002 ± 0.134 mm (PT); -0.033 ± 0.133 mm (TF). Also, there was no significant difference among the root segments (p > 0.05). It was concluded that the Hero 642, Liberator, ProTaper, and Twisted File rotary systems could be safely used in curved canal instrumentation, resulting in satisfactory preservation of the original canal shape. (author)
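
    Canal transportation and centering ability are typically derived from paired pre- and post-instrumentation wall-thickness measurements on each CT slice. The sketch below uses a commonly cited formulation of these two quantities; it is illustrative only and may not match the exact expressions used in this study.

```python
# Illustrative calculation of canal transportation and centering ratio from paired
# pre-/post-instrumentation dentin-thickness measurements on a CT slice, using a
# commonly cited formulation; the exact expressions used in this study may differ.
# a = mesial wall thickness, b = distal wall thickness; 1 = before, 2 = after preparation.

def transportation(a1, a2, b1, b2):
    """Sign indicates the direction of canal deviation (mm); 0 means no transportation."""
    return (a1 - a2) - (b1 - b2)

def centering_ratio(a1, a2, b1, b2):
    """1.0 means the instrument stayed perfectly centered; smaller values mean deviation."""
    da, db = a1 - a2, b1 - b2
    if max(da, db) == 0:
        return 1.0
    return min(da, db) / max(da, db)

# Hypothetical measurements (mm) at one root level:
print(round(transportation(1.20, 1.05, 1.10, 1.02), 3))      # 0.07 mm toward one wall
print(round(centering_ratio(1.20, 1.05, 1.10, 1.02), 2))     # 0.53
```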

  2. Biomedical engineering and nanotechnology

    International Nuclear Information System (INIS)

    Pawar, S.H.; Khyalappa, R.J.; Yakhmi, J.V.

    2009-01-01

    This book is predominantly a compilation of papers presented at the conference, which focused on developments in biomedical materials, biomedical devices and instrumentation, biomedical effects of electromagnetic radiation, electrotherapy, radiotherapy, biosensors, biotechnology, bioengineering, tissue engineering, clinical engineering and surgical planning, medical imaging, hospital system management, biomedical education, the biomedical industry and society, bioinformatics, structured nanomaterials for biomedical applications, nano-composites, nano-medicine, synthesis of nanomaterials, and nanoscience and technology development. The papers presented herein contain the scientific substance to satisfy the academic interests of researchers from the fields of biomedicine, biomedical engineering, materials science and nanotechnology. Papers relevant to INIS are indexed separately.

  3. Concurrent validity of an automated algorithm for computing the center of pressure excursion index (CPEI).

    Science.gov (United States)

    Diaz, Michelle A; Gibbons, Mandi W; Song, Jinsup; Hillstrom, Howard J; Choe, Kersti H; Pasquale, Maria R

    2018-01-01

    Center of Pressure Excursion Index (CPEI), a parameter computed from the distribution of plantar pressures during the stance phase of barefoot walking, has been used to assess dynamic foot function. The original custom program developed to calculate CPEI required the oversight of a user who could manually correct for certain exceptions to the computational rules. A new fully automatic program has been developed to calculate CPEI with an algorithm that accounts for these exceptions. The purpose of this paper is to compare the resulting CPEI values computed by these two programs on plantar pressure data from both asymptomatic and pathologic subjects. If comparable, the new program offers significant benefits: reduced potential for variability due to rater discretion and faster CPEI calculation. CPEI values were calculated from barefoot plantar pressure distributions during comfortably paced walking in 61 healthy asymptomatic adults, 19 diabetic adults with moderate hallux valgus, and 13 adults with mild hallux valgus. Right-foot data for each subject were analyzed with linear regression and a Bland-Altman plot. The automated algorithm yielded CPEI values that were linearly related to those of the original program (R² = 0.99) when the two computation methods were compared. Results of this analysis suggest that the new automated algorithm may be used to calculate CPEI on both healthy and pathologic feet. Copyright © 2017 Elsevier B.V. All rights reserved.
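
    The comparison described above combines linear regression with a Bland-Altman analysis. The sketch below shows a generic version of such an agreement analysis on synthetic paired values; it is not the authors' code and the data are simulated.

```python
# Generic agreement analysis between two methods (linear regression plus Bland-Altman
# statistics), of the kind used above to compare the original and automated CPEI
# programs. The paired values below are synthetic, for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
original = rng.normal(20.0, 5.0, size=60)                # CPEI from the original program
automated = original + rng.normal(0.0, 0.5, size=60)     # CPEI from the automated algorithm

slope, intercept, r, p, se = stats.linregress(original, automated)
diff = automated - original
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                            # Bland-Altman limits of agreement

print(f"R^2 = {r**2:.3f}, slope = {slope:.3f}")
print(f"bias = {bias:.3f}, limits of agreement = [{bias - loa:.3f}, {bias + loa:.3f}]")
```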

  4. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers called Seamless Thinking Aid (STA) and then to share intellectual resources called Information Technology Based Laboratory (ITBL) have been conducted, leading to construct an intelligent infrastructure for the atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize and environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer Aided Science (CASC) to promote the Atomic Energy Research and Investigation (AERI). This article reviewed achievements in R and D of grid computing technology so far obtained. (T. Tanaka)

  5. Effects of a computer-assisted language intervention in a rural Nevada center.

    Science.gov (United States)

    Krumpe, Jo Anne; Harlow, Steven

    2008-06-01

    A computer-assisted language intervention, Fast ForWord-Language (FFW-L), was tested at a rural Nevada center in a group of children (Grades 2-12) referred by parents and teachers, to assess enhancement of language skills. Given conflicting results from previous studies, language scores were measured using the Clinical Evaluation of Language Fundamentals, Third Edition (CELF-3) before and after the FFW-L intervention. The 58 children's CELF-3 postintervention scores were adjusted for age-specific expected changes and compared with pretest scores. Adjusted scores increased in both the receptive and expressive domains of the CELF-3. Children with prior diagnoses of language and/or learning impairment did not differ from other referrals on adjusted CELF-3 gain scores after treatment. Thus the Fast ForWord-Language intervention may benefit a much broader group of children referred by parents and teachers for language or reading problems.
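
    The analysis described above adjusts post-intervention scores for age-specific expected changes before comparing them with pretest scores. A minimal sketch of that kind of adjusted-gain analysis follows, using hypothetical scores and an invented expected-change lookup.

```python
# Sketch of an age-adjusted gain-score analysis like the one described above:
# adjusted gain = (post - pre) - expected age-related change, tested against zero.
# The scores and the expected-change values are hypothetical.
import numpy as np
from scipy import stats

pre = np.array([78, 85, 90, 72, 88, 95, 81], dtype=float)
post = np.array([84, 91, 93, 80, 90, 99, 85], dtype=float)
expected_change = np.array([2, 2, 3, 2, 3, 3, 2], dtype=float)  # age-specific expected gain

adjusted_gain = (post - pre) - expected_change
t, p = stats.ttest_1samp(adjusted_gain, 0.0)
print(f"mean adjusted gain = {adjusted_gain.mean():.2f}, t = {t:.2f}, p = {p:.3f}")
```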

  6. System analysis for the Huntsville Operational Support Center distributed computer system

    Science.gov (United States)

    Ingels, F. M.; Mauldin, J.

    1984-01-01

    The Huntsville Operations Support Center (HOSC) is a distributed computer system used to provide real-time data acquisition, analysis and display during NASA space missions and to perform simulation and study activities during non-mission times. The primary purpose is to provide a HOSC system simulation model that is used to investigate the effects of various HOSC system configurations. Such a model would be valuable in planning the future growth of HOSC and in ascertaining the effects of data rate variations, update table broadcasting and smart terminal data requirements on the HOSC HYPERchannel network system. A simulation model was developed in PASCAL, and results for various system configurations were obtained. A tutorial of the model is presented and the results of simulation runs are given. Some very high data rate situations were simulated to observe the effects of the HYPERchannel switch-over from contention to priority mode under high channel loading.
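
    The PASCAL simulation itself is not reproduced here, but the kind of question it addresses (how channel loading and delay respond to terminal data rates) can be illustrated with a back-of-the-envelope M/M/1 approximation; all rates and message sizes below are invented, and this is not the HOSC model.

```python
# Back-of-the-envelope sketch (not the PASCAL HOSC model): how utilization and mean
# delay on a shared channel respond to the number of terminals and their data rates,
# using an M/M/1 approximation. All rates and message sizes are invented.

def channel_stats(n_terminals, msgs_per_sec, msg_bits, channel_bps):
    arrival_rate = n_terminals * msgs_per_sec     # offered load, messages/s
    service_rate = channel_bps / msg_bits         # channel capacity, messages/s
    rho = arrival_rate / service_rate             # utilization
    if rho >= 1.0:
        return rho, None                          # saturated: queue grows without bound
    delay = 1.0 / (service_rate - arrival_rate)   # mean time in system, seconds
    return rho, delay

for n in (10, 50, 100, 200, 400):
    rho, delay = channel_stats(n, msgs_per_sec=5, msg_bits=8000, channel_bps=10_000_000)
    if delay is None:
        print(f"{n:4d} terminals: channel saturated (offered utilization {rho:.2f})")
    else:
        print(f"{n:4d} terminals: utilization {rho:.2f}, mean delay {delay * 1e3:.2f} ms")
```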

  7. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, Andrew L.; Limaye, Ashutosh S.; Srikishen, Jayanthi

    2011-01-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  8. Cloud Computing Applications in Support of Earth Science Activities at Marshall Space Flight Center

    Science.gov (United States)

    Molthan, A.; Limaye, A. S.

    2011-12-01

    Currently, the NASA Nebula Cloud Computing Platform is available to Agency personnel in a pre-release status as the system undergoes a formal operational readiness review. Over the past year, two projects within the Earth Science Office at NASA Marshall Space Flight Center have been investigating the performance and value of Nebula's "Infrastructure as a Service", or "IaaS" concept and applying cloud computing concepts to advance their respective mission goals. The Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique NASA satellite observations and weather forecasting capabilities for use within the operational forecasting community through partnerships with NOAA's National Weather Service (NWS). SPoRT has evaluated the performance of the Weather Research and Forecasting (WRF) model on virtual machines deployed within Nebula and used Nebula instances to simulate local forecasts in support of regional forecast studies of interest to select NWS forecast offices. In addition to weather forecasting applications, rapidly deployable Nebula virtual machines have supported the processing of high resolution NASA satellite imagery to support disaster assessment following the historic severe weather and tornado outbreak of April 27, 2011. Other modeling and satellite analysis activities are underway in support of NASA's SERVIR program, which integrates satellite observations, ground-based data and forecast models to monitor environmental change and improve disaster response in Central America, the Caribbean, Africa, and the Himalayas. Leveraging SPoRT's experience, SERVIR is working to establish a real-time weather forecasting model for Central America. Other modeling efforts include hydrologic forecasts for Kenya, driven by NASA satellite observations and reanalysis data sets provided by the broader meteorological community. Forecast modeling efforts are supplemented by short-term forecasts of convective initiation, determined by

  9. Biomedical engineering fundamentals

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering. Biomedical Engineering Fundamentals, the first volume of the handbook, presents material from respected scientists with diverse backgrounds in physiological systems, biomechanics, biomaterials, bioelectric phenomena, and neuroengineering. More than three dozen specific topics are examined, including cardia

  10. Signal and image analysis for biomedical and life sciences

    CERN Document Server

    Sun, Changming; Pham, Tuan D; Vallotton, Pascal; Wang, Dadong

    2014-01-01

    With an emphasis on applications of computational models for solving modern challenging problems in biomedical and life sciences, this book aims to bring collections of articles from biologists, medical/biomedical and health science researchers together with computational scientists to focus on problems at the frontier of biomedical and life sciences. The goals of this book are to build interactions of scientists across several disciplines and to help industrial users apply advanced computational techniques for solving practical biomedical and life science problems. This book is for users in t

  11. Whatever works: a systematic user-centered training protocol to optimize brain-computer interfacing individually.

    Directory of Open Access Journals (Sweden)

    Elisabeth V C Friedrich

    This study implemented a systematic user-centered training protocol for a 4-class brain-computer interface (BCI). The goal was to optimize the BCI individually in order to achieve high performance within few sessions for all users. Eight able-bodied volunteers, who were initially naïve to the use of a BCI, participated in 10 sessions over a period of about 5 weeks. In an initial screening session, users were asked to perform the following seven mental tasks while multi-channel EEG was recorded: mental rotation, word association, auditory imagery, mental subtraction, spatial navigation, motor imagery of the left hand and motor imagery of both feet. Out of these seven mental tasks, the best 4-class combination as well as the most reactive frequency band (between 8-30 Hz) was selected individually for online control. Classification was based on common spatial patterns and Fisher's linear discriminant analysis. The number and time of classifier updates varied individually. Selection speed was increased by reducing trial length. To minimize differences in brain activity between sessions with and without feedback, sham feedback was provided in the screening and calibration runs, in which usually no real-time feedback is shown. Selected task combinations and frequency ranges differed between users. The tasks that were included in the 4-class combination most often were (1) motor imagery of the left hand, (2) one brain-teaser task (word association or mental subtraction), (3) the mental rotation task, and (4) one more dynamic imagery task (auditory imagery, spatial navigation, imagery of the feet). Participants achieved mean performances over sessions of 44-84% and peak performances in single sessions of 58-93% in this user-centered 4-class BCI protocol. This protocol is highly adjustable to individual users and thus could increase the percentage of users who can gain and maintain BCI control. A high priority for future work is to examine this protocol with severely impaired users.
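
    The classification pipeline named above, common spatial patterns (CSP) followed by Fisher's linear discriminant analysis, can be sketched as follows for one pair of mental tasks. The trials are random placeholders; a real BCI would use band-pass-filtered EEG in the user-specific frequency band, and this is not the authors' implementation.

```python
# Minimal sketch of the kind of pipeline named above: common spatial patterns (CSP)
# followed by Fisher's linear discriminant analysis, for one pair of mental tasks.
# The trials are random placeholders shaped (trials, channels, samples).
import numpy as np
from scipy.linalg import eigh
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def class_covariance(trials):
    covs = [t @ t.T / np.trace(t @ t.T) for t in trials]  # normalized spatial covariances
    return np.mean(covs, axis=0)

def csp_filters(trials_a, trials_b, n_pairs=2):
    ca, cb = class_covariance(trials_a), class_covariance(trials_b)
    eigvals, eigvecs = eigh(ca, ca + cb)       # generalized eigenproblem: Ca w = l (Ca+Cb) w
    order = np.argsort(eigvals)
    picks = np.concatenate([order[:n_pairs], order[-n_pairs:]])  # most discriminative filters
    return eigvecs[:, picks].T

def log_var_features(trials, filters):
    return np.array([np.log(np.var(filters @ t, axis=1)) for t in trials])

rng = np.random.default_rng(1)
trials_a = rng.standard_normal((40, 16, 250))  # placeholder trials for mental task A
trials_b = rng.standard_normal((40, 16, 250))  # placeholder trials for mental task B

W = csp_filters(trials_a, trials_b)
X = np.vstack([log_var_features(trials_a, W), log_var_features(trials_b, W)])
y = np.array([0] * 40 + [1] * 40)
clf = LinearDiscriminantAnalysis().fit(X, y)
print("training accuracy:", clf.score(X, y))
```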

  12. Capturing the Value of Biomedical Research.

    Science.gov (United States)

    Bertuzzi, Stefano; Jamaleddine, Zeina

    2016-03-24

    Assessing the real-world impact of biomedical research is notoriously difficult. Here, we present a framework for building, from scratch, a prospective science-centered information system, an opportunity afforded by the Sidra Medical and Research Center in Qatar. This experiment is part of the global conversation on maximizing returns on research investment. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. Astigmatic single photon emission computed tomography imaging with a displaced center of rotation

    International Nuclear Information System (INIS)

    Wang, H.; Smith, M.F.; Stone, C.D.; Jaszczak, R.J.

    1998-01-01

    A filtered backprojection algorithm is developed for single photon emission computed tomography (SPECT) imaging with an astigmatic collimator having a displaced center of rotation. The astigmatic collimator has two perpendicular focal lines, one that is parallel to the axis of rotation of the gamma camera and one that is perpendicular to this axis. Using SPECT simulations of projection data from a hot rod phantom and point source arrays, it is found that a lack of incorporation of the mechanical shift in the reconstruction algorithm causes errors and artifacts in reconstructed SPECT images. The collimator and acquisition parameters in the astigmatic reconstruction formula, which include focal lengths, radius of rotation, and mechanical shifts, are often partly unknown and can be determined using the projections of a point source at various projection angles. The accurate determination of these parameters by a least squares fitting technique using projection data from numerically simulated SPECT acquisitions is studied. These studies show that the accuracy of parameter determination is improved as the distance between the point source and the axis of rotation of the gamma camera is increased. The focal length to the focal line perpendicular to the axis of rotation is determined more accurately than the focal length to the focal line parallel to this axis. copyright 1998 American Association of Physicists in Medicine
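
    The parameter-determination step described above fits acquisition geometry to the measured detector positions of a point source over many projection angles. The sketch below illustrates that idea with nonlinear least squares on a simplified single-focal-length, fan-beam-style model with the radius of rotation assumed known; it is not the astigmatic-collimator model of the paper, and all numbers are invented.

```python
# Toy illustration of the parameter-determination idea: fit acquisition geometry to the
# measured detector positions of a point source over many projection angles with
# nonlinear least squares. A simplified single-focal-length, fan-beam-style model is
# used here, with the radius of rotation assumed known; this is not the full
# astigmatic-collimator geometry of the paper, and all numbers are invented.
import numpy as np
from scipy.optimize import least_squares

R_KNOWN = 25.0  # radius of rotation (cm), assumed known from the gantry

def point_projection(theta, x, y, focal, shift):
    s = x * np.cos(theta) + y * np.sin(theta)    # lateral offset in the rotating frame
    d = -x * np.sin(theta) + y * np.cos(theta)   # offset toward the detector
    return focal * s / (focal - (R_KNOWN - d)) + shift  # magnification + center-of-rotation shift

theta = np.linspace(0.0, 2.0 * np.pi, 120, endpoint=False)
true = dict(x=6.0, y=-3.5, focal=65.0, shift=0.4)
rng = np.random.default_rng(2)
measured = point_projection(theta, **true) + rng.normal(0.0, 0.02, theta.size)

def residuals(params):
    x, y, focal, shift = params
    return point_projection(theta, x, y, focal, shift) - measured

fit = least_squares(residuals, x0=[5.0, -3.0, 60.0, 0.0])
print("estimated [x, y, focal length, shift]:", np.round(fit.x, 3))
```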

  14. Building and evaluating an informatics tool to facilitate analysis of a biomedical literature search service in an academic medical center library.

    Science.gov (United States)

    Hinton, Elizabeth G; Oelschlegel, Sandra; Vaughn, Cynthia J; Lindsay, J Michael; Hurst, Sachiko M; Earl, Martha

    2013-01-01

    This study utilizes an informatics tool to analyze a robust literature search service in an academic medical center library. Structured interviews with librarians were conducted focusing on the benefits of such a tool, expectations for performance, and visual layout preferences. The resulting application utilizes Microsoft SQL Server and .Net Framework 3.5 technologies, allowing for the use of a web interface. Customer tables and MeSH terms are included. The National Library of Medicine MeSH database and entry terms for each heading are incorporated, resulting in functionality similar to searching the MeSH database through PubMed. Data reports will facilitate analysis of the search service.

  15. COMPUTING

    CERN Document Server

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  16. Optimizing biomedical science learning in a veterinary curriculum: a review.

    Science.gov (United States)

    Warren, Amy L; Donnon, Tyrone

    2013-01-01

    As veterinary medical curricula evolve, the time dedicated to biomedical science teaching, as well as the role of biomedical science knowledge in veterinary education, has been scrutinized. Aside from being mandated by accrediting bodies, biomedical science knowledge plays an important role in developing clinical, diagnostic, and therapeutic reasoning skills, in the application of clinical skills, in supporting evidence-based veterinary practice and life-long learning, and in advancing biomedical knowledge and comparative medicine. With an increasing volume and fast pace of change in biomedical knowledge, as well as increased demands on curricular time, there has been pressure to make biomedical science education efficient and relevant for veterinary medicine. This has led to a shift in biomedical education from fact-based, teacher-centered and discipline-based teaching to applicable, student-centered, integrated teaching. This movement is supported by adult learning theories and is thought to enhance students' transference of biomedical science into their clinical practice. The importance of biomedical science in veterinary education and the theories of biomedical science learning will be discussed in this article. In addition, we will explore current advances in biomedical teaching methodologies that aim to maximize knowledge retention and application for clinical veterinary training and practice.

  17. Annual report of R and D activities in Center for Promotion of Computational Science and Engineering and Center for Computational Science and e-Systems from April 1, 2005 to March 31, 2006

    International Nuclear Information System (INIS)

    2007-03-01

    This report provides an overview of research and development activities in the Center for Computational Science and Engineering (CCSE), JAERI, in the first half of the fiscal year 2005 (April 1, 2005 - Sep. 30, 2005) and those in the Center for Computational Science and e-Systems (CCSE), JAEA, in the second half of the fiscal year 2005 (Oct 1, 2005 - March 31, 2006). In the first half, the activities were performed by 5 research groups: the Research Group for Computational Science in Atomic Energy, the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group in CCSE. At the beginning of the second half, these 5 groups were integrated into two offices, the Simulation Technology Research and Development Office and the Computer Science Research and Development Office, at the moment of the unification of JNC (Japan Nuclear Cycle Development Institute) and JAERI (Japan Atomic Energy Research Institute), and the second-half activities were operated by these two offices. A big project, the ITBL (Information Technology Based Laboratory) project, and fundamental computational research for atomic energy plants were performed mainly by two groups, the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy, in the first half and by their integrated office, the Computer Science Research and Development Office, in the second half. The main result was the verification of a structural analysis of a real plant executed in the Grid environment, which received an Honorable Mention in the Analytic Challenge at the Supercomputing 2005 (SC05) conference. The materials science and bioinformatics work in the atomic energy research field was carried out by the Research Group for Computational Material Science in Atomic Energy, the R and D Group for Computer Science, the R and D Group for Numerical Experiments, and the Quantum Bioinformatics Group.

  18. A National Coordinating Center for Trauma Research

    Science.gov (United States)

    2017-10-01

    Biomedical Research Informatics Computation System (BRICS) to meet the functional needs of the NTRR. BRICS is an NIH-developed, disease-agnostic, web... National Institutes of Health (NIH) roundtables, Centers for Disease Control meetings, and others. In 2015, the NIH and American College of Surgeons (ACS

  19. Exposure Science and the US EPA National Center for Computational Toxicology

    Science.gov (United States)

    The emerging field of computational toxicology applies mathematical and computer models and molecular biological and chemical approaches to explore both qualitative and quantitative relationships between sources of environmental pollutant exposure and adverse health outcomes. The...

  20. Biomedical engineering principles

    CERN Document Server

    Ritter, Arthur B; Valdevit, Antonio; Ascione, Alfred N

    2011-01-01

    Introduction: Modeling of Physiological Processes; Cell Physiology and Transport; Principles and Biomedical Applications of Hemodynamics; A Systems Approach to Physiology; The Cardiovascular System; Biomedical Signal Processing; Signal Acquisition and Processing; Techniques for Physiological Signal Processing; Examples of Physiological Signal Processing; Principles of Biomechanics; Practical Applications of Biomechanics; Biomaterials; Principles of Biomedical Capstone Design; Unmet Clinical Needs; Entrepreneurship: Reasons why Most Good Designs Never Get to Market; An Engineering Solution in Search of a Biomedical Problem

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  2. Bridging the digital divide by increasing computer and cancer literacy: community technology centers for head-start parents and families.

    Science.gov (United States)

    Salovey, Peter; Williams-Piehota, Pamela; Mowad, Linda; Moret, Marta Elisa; Edlund, Denielle; Andersen, Judith

    2009-01-01

    This article describes the establishment of two community technology centers affiliated with Head Start early childhood education programs focused especially on Latino and African American parents of children enrolled in Head Start. A 6-hour course concerned with computer and cancer literacy was presented to 120 parents and other community residents who earned a free, refurbished, Internet-ready computer after completing the program. Focus groups provided the basis for designing the structure and content of the course and modifying it during the project period. An outcomes-based assessment comparing program participants with 70 nonparticipants at baseline, immediately after the course ended, and 3 months later suggested that the program increased knowledge about computers and their use, knowledge about cancer and its prevention, and computer use including health information-seeking via the Internet. The creation of community computer technology centers requires the availability of secure space, capacity of a community partner to oversee project implementation, and resources of this partner to ensure sustainability beyond core funding.

  3. VI Latin American Congress on Biomedical Engineering

    CERN Document Server

    Hadad, Alejandro

    2015-01-01

    This volume presents the proceedings of CLAIB 2014, held in Paraná, Entre Ríos, Argentina, 29-31 October 2014. The proceedings, presented by the Regional Council of Biomedical Engineering for Latin America (CORAL), offer research findings, experiences and activities shared between institutions and universities to develop Bioengineering, Biomedical Engineering and related sciences. The conferences of the Latin American Congress of Biomedical Engineering are sponsored by the International Federation for Medical and Biological Engineering (IFMBE), the Society for Engineering in Biology and Medicine (EMBS) and the Pan American Health Organization (PAHO), among other organizations and international agencies, bringing together scientists, academics and biomedical engineers in Latin America and other continents in an environment conducive to exchange and professional growth. The topics include: - Bioinformatics and Computational Biology - Bioinstrumentation; Sensors, Micro and Nano Technologies - Biomaterials, Tissu...

  4. Certification of version 1.2 of the PORFLO-3 code for the WHC scientific and engineering computational center

    International Nuclear Information System (INIS)

    Kline, N.W.

    1994-01-01

    Version 1.2 of the PORFLO-3 Code has migrated from the Hanford Cray computer to workstations in the WHC Scientific and Engineering Computational Center. The workstation-based configuration and acceptance testing are inherited from the CRAY-based configuration. The purpose of this report is to document differences in the new configuration as compared to the parent Cray configuration, and summarize some of the acceptance test results which have shown that the migrated code is functioning correctly in the new environment

  5. High performance computing in science and engineering '09: transactions of the High Performance Computing Center, Stuttgart (HLRS) 2009

    National Research Council Canada - National Science Library

    Nagel, Wolfgang E; Kröner, Dietmar; Resch, Michael

    2010-01-01

    ...), NIC/JSC (Jülich), and LRZ (Munich). As part of that strategic initiative, in May 2009 NIC/JSC already installed the first phase of the GCS HPC Tier-0 resources, an IBM Blue Gene/P with roughly 300,000 cores, this time in Jülich. With that, the GCS provides the most powerful high-performance computing infrastructure in Europe alread...

  6. Computational Narrative Intelligence: A Human-Centered Goal for Artificial Intelligence

    OpenAIRE

    Riedl, Mark O.

    2016-01-01

    Narrative intelligence is the ability to craft, tell, understand, and respond affectively to stories. We argue that instilling artificial intelligences with computational narrative intelligence affords a number of applications beneficial to humans. We lay out some of the machine learning challenges necessary to solve to achieve computational narrative intelligence. Finally, we argue that computational narrative is a practical step towards machine enculturation, the teaching of sociocultural v...

  7. Biomedical applications engineering tasks

    Science.gov (United States)

    Laenger, C. J., Sr.

    1976-01-01

    The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.

  8. AHPCRC (Army High Performance Computing Rsearch Center) Bulletin. Volume 1, Issue 4

    Science.gov (United States)

    2011-01-01

    common treatment for acute blood loss arising from invasive surgery or traumatic wounding. Donated whole blood can be stored for approximately one... computational works on computational fluid dynamics on moving grids and fluid–structure interaction phenomena have contributed to a renaissance of

  9. Adaptive TrimTree: Green Data Center Networks through Resource Consolidation, Selective Connectedness and Energy Proportional Computing

    Directory of Open Access Journals (Sweden)

    Saima Zafar

    2016-10-01

    A data center is a facility with a group of networked servers used by an organization for the storage, management and dissemination of its data. The increase in data center energy consumption over the past several years is staggering; therefore, efforts are being initiated to achieve energy efficiency in the various components of data centers. One of the main reasons data centers have high energy inefficiency is that most organizations run their data centers at full capacity 24/7. This results in a number of servers and switches being underutilized or even unutilized, yet working and consuming electricity around the clock. In this paper, we present Adaptive TrimTree, a mechanism that employs a combination of resource consolidation, selective connectedness and energy proportional computing for optimizing energy consumption in a Data Center Network (DCN). Adaptive TrimTree adopts a simple traffic-and-topology-based heuristic to find a minimum-power network subset, called the 'active network subset', that satisfies the existing network traffic conditions while switching off the residual unused network components. A 'passive network subset' is also identified for redundancy; it consists of links and switches that may be required in the future, and this subset is toggled to a sleep state. An energy proportional computing technique is applied to the active network subset to adapt link data rates to the workload, thus maximizing energy savings. We compare our proposed mechanism with the fat-tree topology and with ElasticTree, a scheme based on resource consolidation. Our simulation results show that our mechanism saves 50%-70% more energy compared to fat-tree and 19.6% compared to ElasticTree, with minimal impact on packet loss percentage and delay. Additionally, our mechanism copes better with traffic anomalies and surges due to the passive network provision.
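
    The two mechanisms combined in Adaptive TrimTree, sleeping unused components and running the remaining links at the lowest sufficient rate, can be illustrated with the toy provisioning sketch below; the rate/power table and traffic values are invented, and this is not the authors' implementation.

```python
# Illustrative sketch combining the two ideas above: links with no offered traffic go
# to sleep ("selective connectedness"), and the remaining links run at the lowest rate
# that carries their load ("energy proportional computing"). Not the paper's code.
SLEEP_W = 0.1                                  # power draw of a sleeping link (W)
RATE_POWER_W = {1: 1.0, 10: 3.0, 40: 8.0}      # link rate (Gbps) -> power draw (W)

def provision(link_loads_gbps):
    """Return a per-link plan (rate or sleep, power) and the total power draw."""
    plan, total = {}, 0.0
    for link, load in link_loads_gbps.items():
        if load == 0:
            plan[link] = ("sleep", SLEEP_W)
            total += SLEEP_W
            continue
        rate = min(r for r in RATE_POWER_W if r >= load)  # lowest sufficient rate
        plan[link] = (f"{rate} Gbps", RATE_POWER_W[rate])
        total += RATE_POWER_W[rate]
    return plan, total

traffic = {"core-agg1": 7.2, "core-agg2": 0.0, "agg1-tor1": 0.6, "agg1-tor2": 22.0}
plan, power = provision(traffic)
baseline = len(traffic) * RATE_POWER_W[40]     # everything always on at full rate
print(plan)
print(f"power {power:.1f} W vs. always-on baseline {baseline:.1f} W")
```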

  10. Biomedical signal and image processing.

    Science.gov (United States)

    Cerutti, Sergio; Baselli, Giuseppe; Bianchi, Anna; Caiani, Enrico; Contini, Davide; Cubeddu, Rinaldo; Dercole, Fabio; Rienzo, Luca; Liberati, Diego; Mainardi, Luca; Ravazzani, Paolo; Rinaldi, Sergio; Signorini, Maria; Torricelli, Alessandro

    2011-01-01

    Generally, physiological modeling and biomedical signal processing constitute two important paradigms of biomedical engineering (BME): their fundamental concepts are taught starting from undergraduate studies and are more completely dealt with in the last years of graduate curricula, as well as in Ph.D. courses. Traditionally, these two cultural aspects were separated, with the first one more oriented to physiological issues and how to model them and the second one more dedicated to the development of processing tools or algorithms to enhance useful information from clinical data. A practical consequence was that those who did models did not do signal processing and vice versa. However, in recent years, the need for closer integration between signal processing and modeling of the relevant biological systems emerged very clearly [1], [2]. This is not only true for training purposes (i.e., to properly prepare the new professional members of BME) but also for the development of newly conceived research projects in which the integration between biomedical signal and image processing (BSIP) and modeling plays a crucial role. Just to give simple examples, topics such as brain–machine or brain–computer interfaces, neuroengineering, nonlinear dynamical analysis of the cardiovascular (CV) system, integration of sensory-motor characteristics aimed at the building of advanced prostheses and rehabilitation tools, and wearable devices for vital sign monitoring and others do require an intelligent fusion of modeling and signal processing competences that are certainly peculiar of our discipline of BME.

  11. The EGI-Engage EPOS Competence Center - Interoperating heterogeneous AAI mechanisms and Orchestrating distributed computational resources

    Science.gov (United States)

    Bailo, Daniele; Scardaci, Diego; Spinuso, Alessandro; Sterzel, Mariusz; Schwichtenberg, Horst; Gemuend, Andre

    2016-04-01

    manage the use of the subsurface of the Earth. EPOS started its Implementation Phase in October 2015 and is now actively working in order to integrate multidisciplinary data into a single e-infrastructure. Multidisciplinary data are organized and governed by the Thematic Core Services (TCS) - Europe-wide organizations and e-infrastructures providing community-specific data and data products - and are driven by various scientific communities encompassing a wide spectrum of Earth science disciplines. TCS data, data products and services will be integrated into the Integrated Core Services (ICS) system, which will ensure their interoperability and access to these services by the scientific community as well as other users within the society. The EPOS competence center (EPOS CC) goal is to tackle two of the main challenges that the ICS are going to face in the near future, by taking advantage of the technical solutions provided by EGI. In order to do this, we will present the two pilot use cases the EGI-EPOS CC is developing: 1) The AAI pilot, dealing with the provision of transparent and homogeneous access to the ICS infrastructure for users owning different kinds of credentials (e.g. eduGain, OpenID Connect, X509 certificates etc.). Here the focus is on the mechanisms which allow the credential delegation. 2) The computational pilot, which improves the back-end services of an existing application in the field of Computational Seismology, developed in the context of the EC funded project VERCE. The application allows the processing and the comparison of data resulting from the simulation of seismic wave propagation following a real earthquake and real measurements recorded by seismographs. While the simulation data is produced directly by the users and stored in a Data Management System, the observations need to be pre-staged from institutional data-services, which are maintained by the community itself. This use case aims at exploiting the EGI FedCloud e-infrastructure for Data

  12. Computer-Aided Diagnosis of Breast Cancer: A Multi-Center Demonstrator

    National Research Council Canada - National Science Library

    Floyd, Carey

    2000-01-01

    .... The focus has been to gather data from multiple sites in order to verify whether the artificial neural network computer aid to the diagnosis of breast cancer can be translated between locations...

  13. Computed tomography-guided percutaneous gastrostomy: initial experience at a cancer center

    Energy Technology Data Exchange (ETDEWEB)

    Tyng, Chiang Jeng; Santos, Erich Frank Vater; Guerra, Luiz Felipe Alves; Bitencourt, Almir Galvao Vieira; Barbosa, Paula Nicole Vieira Pinto; Chojniak, Rubens [A. C. Camargo Cancer Center, Sao Paulo, SP (Brazil); Universidade Federal do Espirito Santo (HUCAM/UFES), Vitoria, ES (Brazil). Hospital Universitario Cassiano Antonio de Morais. Radiologia e Diagnostico por Imagem

    2017-03-15

    Gastrostomy is indicated for patients with conditions that do not allow adequate oral nutrition. To reduce the morbidity and costs associated with the procedure, there is a trend toward the use of percutaneous gastrostomy, guided by endoscopy, fluoroscopy, or, most recently, computed tomography. The purpose of this paper was to review the computed tomography-guided gastrostomy procedure, as well as the indications for its use and the potential complications. (author)

  14. Handbook of biomedical optics

    CERN Document Server

    Boas, David A

    2011-01-01

    Biomedical optics holds tremendous promise to deliver effective, safe, non- or minimally invasive diagnostics and targeted, customizable therapeutics. Handbook of Biomedical Optics provides an in-depth treatment of the field, including coverage of applications for biomedical research, diagnosis, and therapy. It introduces the theory and fundamentals of each subject, ensuring accessibility to a wide multidisciplinary readership. It also offers a view of the state of the art and discusses advantages and disadvantages of various techniques.Organized into six sections, this handbook: Contains intr

  15. Biomedical applications of polymers

    CERN Document Server

    Gebelein, C G

    1991-01-01

    The biomedical applications of polymers span an extremely wide spectrum of uses, including artificial organs, skin and soft tissue replacements, orthopaedic applications, dental applications, and controlled release of medications. No single, short review can possibly cover all these items in detail, and dozens of books and hundreds of reviews exist on biomedical polymers. Only a few relatively recent examples will be cited here; additional reviews are listed under most of the major topics in this book. We will consider each of the major classifications of biomedical polymers to some extent, inclu

  16. Powering biomedical devices

    CERN Document Server

    Romero, Edwar

    2013-01-01

    From exoskeletons to neural implants, biomedical devices are no less than life-changing. Compact and constant power sources are necessary to keep these devices running efficiently. Edwar Romero's Powering Biomedical Devices reviews the background, current technologies, and possible future developments of these power sources, examining not only the types of biomedical power sources available (macro, mini, MEMS, and nano), but also what they power (such as prostheses, insulin pumps, and muscular and neural stimulators), and how they work (covering batteries, biofluids, kinetic and ther

  17. Advances in biomedical engineering and biotechnology during 2013-2014.

    Science.gov (United States)

    Liu, Feng; Wang, Ying; Burkhart, Timothy A; González Penedo, Manuel Francisco; Ma, Shaodong

    2014-01-01

    The 3rd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2014), held in Beijing from the 25th to the 28th of September 2014, is an annual conference that intends to provide an opportunity for researchers and practitioners around the world to present the most recent advances and future challenges in the fields of biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, amongst others. The papers published in this issue are selected from this conference, which witnesses the advances in biomedical engineering and biotechnology during 2013-2014.

  18. Status of Research in Biomedical Engineering 1968.

    Science.gov (United States)

    National Inst. of General Medical Sciences (NIH), Bethesda, MD.

    This status report is divided into eight sections. The first four represent the classical engineering or building aspects of bioengineering and deal with biomedical instrumentation, prosthetics, man-machine systems and computer and information systems. The next three sections are related to the scientific, intellectual and academic influence of…

  19. A computer system to analyze showers in nuclear emulsions: Center Director's discretionary fund report

    Science.gov (United States)

    Meegan, C. A.; Fountain, W. F.; Berry, F. A., Jr.

    1987-01-01

    A system to rapidly digitize data from showers in nuclear emulsions is described. A TV camera views the emulsions through a microscope. The TV output is superimposed on the monitor of a minicomputer. The operator uses the computer's graphics capability to mark the positions of particle tracks. The coordinates of each track are stored on a disk. The computer then predicts the coordinates of each track through successive layers of emulsion. The operator, guided by the predictions, thus tracks and stores the development of the shower. The system provides a significant improvement over purely manual methods of recording shower development in nuclear emulsion stacks.
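
    As a rough illustration of the extrapolation step described above (not the original minicomputer code), the sketch below predicts a track's (x, y) position in deeper emulsion layers by linear extrapolation from two operator-marked points; all coordinates and depths are hypothetical.

```python
# Illustrative sketch only: linear extrapolation of a particle track through
# successive emulsion layers, starting from two operator-marked positions.

def predict_track(xy0, xy1, z0, z1, z_layers):
    """Given marked track positions (x, y) at depths z0 and z1,
    predict the (x, y) coordinates at each deeper layer depth in z_layers."""
    dz = z1 - z0
    slope_x = (xy1[0] - xy0[0]) / dz
    slope_y = (xy1[1] - xy0[1]) / dz
    return [(xy1[0] + slope_x * (z - z1), xy1[1] + slope_y * (z - z1)) for z in z_layers]

# Example: track marked at 120 um and 180 um depth, predicted for three deeper layers.
print(predict_track((1.20, 3.40), (1.26, 3.31), 120.0, 180.0, [240.0, 300.0, 360.0]))
```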

  20. Biomedical Engineering Laboratory

    National Research Council Canada - National Science Library

    Bodruzzama, Mohammad

    2003-01-01

    ... and on-line analysis of the biomedical signals. Each Biopac system-based laboratory station consists of real-time data acquisition system, amplifiers for EMG, EKG, EEG, and equipment for the study of Plethysmography, evoked response, cardio...

  1. Analysis of Heavy Metals in Soil from Burnt Computer Business Center

    African Journals Online (AJOL)

    This paper looked at the determination of heavy metal concentration in soil samples at burnt computer business centre (A case study of PTI, Effurun). Soil samples were collected from the site and another sample from unaffected area as control. The samples were analyzed for the presence of copper, zinc, cadmium, lead ...

  2. Person-Centered Emotional Support and Gender Attributions in Computer-Mediated Communication

    Science.gov (United States)

    Spottswood, Erin L.; Walther, Joseph B.; Holmstrom, Amanda J.; Ellison, Nicole B.

    2013-01-01

    Without physical appearance, identification in computer-mediated communication is relatively ambiguous and may depend on verbal cues such as usernames, content, and/or style. This is important when gender-linked differences exist in the effects of messages, as in emotional support. This study examined gender attribution for online support…

  3. RESOURCE-EFFICIENT ALLOCATION HEURISTICS FOR MANAGEMENT OF DATA CENTERS FOR CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    Vitaliy Litvinov

    2014-07-01

    Full Text Available A survey of research in resource-efficient computing and architectural principles for resource-efficient management of Clouds is offered in this article. Resource-efficient allocation policies and scheduling algorithms considering QoS expectations and the power usage characteristics of the devices are defined.
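
    The article defines such policies only in general terms; as a hedged illustration of a power-aware placement heuristic in that spirit, the sketch below assigns a virtual machine to the host whose estimated power increase is smallest. The host names, capacities, and the linear power model are invented for the example, not taken from the article.

```python
# Hedged sketch of a power-aware best-fit placement heuristic; all numbers are invented.

def power(util):
    """Toy linear power model: 100 W idle, 200 W at full CPU utilization."""
    return 100 + 100 * util

def place_vm(hosts, vm_cpu):
    """Pick the host whose power increase is smallest among those with enough free CPU."""
    best, best_delta = None, float("inf")
    for name, h in hosts.items():
        if h["used"] + vm_cpu <= h["capacity"]:
            delta = power((h["used"] + vm_cpu) / h["capacity"]) - power(h["used"] / h["capacity"])
            if delta < best_delta:
                best, best_delta = name, delta
    if best is not None:
        hosts[best]["used"] += vm_cpu
    return best

hosts = {"host-a": {"capacity": 16, "used": 10}, "host-b": {"capacity": 32, "used": 4}}
print(place_vm(hosts, 4))  # chooses the host with the smaller marginal power increase
```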

  4. PROCEEDINGS OF RIKEN BNL RESEARCH CENTER WORKSHOP: HIGH PERFORMANCE COMPUTING WITH QCDOC AND BLUEGENE.

    Energy Technology Data Exchange (ETDEWEB)

    CHRIST,N.; DAVENPORT,J.; DENG,Y.; GARA,A.; GLIMM,J.; MAWHINNEY,R.; MCFADDEN,E.; PESKIN,A.; PULLEYBLANK,W.

    2003-03-11

    Staff of Brookhaven National Laboratory, Columbia University, IBM and the RIKEN BNL Research Center organized a one-day workshop held on February 28, 2003 at Brookhaven to promote the following goals: (1) To explore areas other than QCD applications where the QCDOC and BlueGene/L machines can be applied to good advantage, (2) To identify areas where collaboration among the sponsoring institutions can be fruitful, and (3) To expose scientists to the emerging software architecture. This workshop grew out of an informal visit last fall by BNL staff to the IBM Thomas J. Watson Research Center that resulted in a continuing dialog among participants on issues common to these two related supercomputers. The workshop was divided into three sessions, addressing the hardware and software status of each system, prospective applications, and future directions.

  5. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Energy Technology Data Exchange (ETDEWEB)

    Santana, Priscila do Carmo; Oliveira, Paulo Marcio Campos de; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila, E-mail: pridili@gmail.com [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil); Silva, Teogenes Augusto da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-01-15

    Objective: to evaluate the level of ambient radiation in a PET/CT center. Materials and methods: previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results: in none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/ year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion: in the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. (author)
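
    The comparison reported above reduces to simple arithmetic: scale a 32-day dosimeter reading to a year and compare it against the two Brazilian limits quoted in the abstract. The sketch below shows that calculation with a hypothetical reading; it does not reproduce any value measured in the study.

```python
# Illustrative arithmetic only; the TLD reading below is hypothetical.
# Limits quoted in the abstract: 5 mSv/year (controlled area), 0.5 mSv/year (free area).

def annualize(dose_msv, days=32):
    """Scale a dose accumulated over `days` to an estimated annual dose."""
    return dose_msv * 365.0 / days

reading_msv = 0.030                      # hypothetical 32-day result at one measurement point
annual = annualize(reading_msv)
print(f"annual estimate: {annual:.2f} mSv/year")
print("within controlled-area limit:", annual <= 5.0)
print("within free-area limit:", annual <= 0.5)
```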

  6. Noise-Resilient Quantum Computing with a Nitrogen-Vacancy Center and Nuclear Spins.

    Science.gov (United States)

    Casanova, J; Wang, Z-Y; Plenio, M B

    2016-09-23

    Selective control of qubits in a quantum register for the purposes of quantum information processing represents a critical challenge for dense spin ensembles in solid-state systems. Here we present a protocol that achieves a complete set of selective electron-nuclear gates and single nuclear rotations in such an ensemble in diamond facilitated by a nearby nitrogen-vacancy (NV) center. The protocol suppresses internuclear interactions as well as unwanted coupling between the NV center and other spins of the ensemble to achieve quantum gate fidelities well exceeding 99%. Notably, our method can be applied to weakly coupled, distant spins representing a scalable procedure that exploits the exceptional properties of nuclear spins in diamond as robust quantum memories.

  7. Ambient radiation levels in positron emission tomography/computed tomography (PET/CT) imaging center

    Science.gov (United States)

    Santana, Priscila do Carmo; de Oliveira, Paulo Marcio Campos; Mamede, Marcelo; Silveira, Mariana de Castro; Aguiar, Polyanna; Real, Raphaela Vila; da Silva, Teógenes Augusto

    2015-01-01

    Objective To evaluate the level of ambient radiation in a PET/CT center. Materials and Methods Previously selected and calibrated TLD-100H thermoluminescent dosimeters were utilized to measure room radiation levels. During 32 days, the detectors were placed in several strategically selected points inside the PET/CT center and in adjacent buildings. After the exposure period the dosimeters were collected and processed to determine the radiation level. Results In none of the points selected for measurements the values exceeded the radiation dose threshold for controlled area (5 mSv/year) or free area (0.5 mSv/year) as recommended by the Brazilian regulations. Conclusion In the present study the authors demonstrated that the whole shielding system is appropriate and, consequently, the workers are exposed to doses below the threshold established by Brazilian standards, provided the radiation protection standards are followed. PMID:25798004

  8. A Pilot Study of Biomedical Text Comprehension using an Attention-Based Deep Neural Reader: Design and Experimental Analysis.

    Science.gov (United States)

    Kim, Seongsoon; Park, Donghyeon; Choi, Yonghwa; Lee, Kyubum; Kim, Byounggun; Jeon, Minji; Kim, Jihye; Tan, Aik Choon; Kang, Jaewoo

    2018-01-05

    With the development of artificial intelligence (AI) technology centered on deep-learning, the computer has evolved to a point where it can read a given text and answer a question based on the context of the text. Such a specific task is known as the task of machine comprehension. Existing machine comprehension tasks mostly use datasets of general texts, such as news articles or elementary school-level storybooks. However, no attempt has been made to determine whether an up-to-date deep learning-based machine comprehension model can also process scientific literature containing expert-level knowledge, especially in the biomedical domain. This study aims to investigate whether a machine comprehension model can process biomedical articles as well as general texts. Since there is no dataset for the biomedical literature comprehension task, our work includes generating a large-scale question answering dataset using PubMed and manually evaluating the generated dataset. We present an attention-based deep neural model tailored to the biomedical domain. To further enhance the performance of our model, we used a pretrained word vector and biomedical entity type embedding. We also developed an ensemble method of combining the results of several independent models to reduce the variance of the answers from the models. The experimental results showed that our proposed deep neural network model outperformed the baseline model by more than 7% on the new dataset. We also evaluated human performance on the new dataset. The human evaluation result showed that our deep neural model outperformed humans in comprehension by 22% on average. In this work, we introduced a new task of machine comprehension in the biomedical domain using a deep neural model. Since there was no large-scale dataset for training deep neural models in the biomedical domain, we created the new cloze-style datasets Biomedical Knowledge Comprehension Title (BMKC_T) and Biomedical Knowledge Comprehension Last
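
    The ensemble step mentioned above is, in generic form, an averaging of the answer probabilities produced by several independently trained readers. The sketch below illustrates that idea with made-up candidate answers and probabilities; it is not the authors' implementation.

```python
# Generic ensembling sketch: average per-model probability vectors over candidate answers.
import numpy as np

def ensemble_answer(prob_vectors, candidates):
    """prob_vectors: list of per-model probability vectors over the candidate answers."""
    mean_probs = np.mean(np.stack(prob_vectors), axis=0)
    return candidates[int(np.argmax(mean_probs))], mean_probs

candidates = ["insulin", "glucagon", "leptin"]          # hypothetical cloze candidates
model_outputs = [np.array([0.6, 0.3, 0.1]),
                 np.array([0.4, 0.5, 0.1]),
                 np.array([0.7, 0.2, 0.1])]
print(ensemble_answer(model_outputs, candidates))        # averaged answer and probabilities
```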

  9. SOCaaS: Security Operations Center as a Service for Cloud Computing Environments

    OpenAIRE

    Fahad F. Alruwaili; T. Aaron Gulliver

    2014-01-01

    The management of information security operations is a complex task, especially in a cloud environment. The cloud service layers and multi-tenancy architecture create a complex environment in which to develop and manage an information security incident management and compliance program. This paper presents a novel security operations center (SOC) framework as a service for cloud service providers and customers. The goal is to protect cloud services against new and existing attacks as well a...

  10. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 3, Issue 1

    Science.gov (United States)

    2011-01-01

    alternative is to re-frame the problem in terms of a mathematical function that is easier to work with. In mathematical parlance, a homotopy method is... Mechanical Engineering Department spoke to all the PREP students about the Unmanned Aerial Systems Technical Analysis and Applications Center designed... and Eduardo Vega, New Mexico State University (mentors: Charbel Bou-Mosleh and Charbel Farhat). “Aerodynamic Analysis of the NASA Common Research

  11. Fast computed tomography and volume rendering using the body-centered cubic lattice

    OpenAIRE

    Finkbeiner, Bernhard

    2009-01-01

    Two main tasks in the field of volumetric image processing are acquisition and visualization of 3D data. The main challenge is to reduce processing costs, while maintaining high accuracy. To achieve these goals for volume rendering (visualization), we demonstrate that non-separable box splines for body-centered cubic (BCC) lattices can be adapted to fast evaluation on graphics hardware. Thus, the BCC lattice can be used for interactive volume rendering leading to better image quality than com...

  12. Ressource-Sharing on the Tera-Flop Scale for the Biomedical Research and Care Sector - The Erasmus Computing Grid and MediGRID

    NARCIS (Netherlands)

    T.A. Knoch (Tobias)

    2008-01-01

    Today advances in scientific research as well as clinical diagnostics and treatment are inevitably connected with information solutions concerning computation power and information storage. The needs for information technology are enormous and are in many cases the limiting factor

  13. Camera systems in human motion analysis for biomedical applications

    Science.gov (United States)

    Chin, Lim Chee; Basah, Shafriza Nisha; Yaacob, Sazali; Juan, Yeap Ewe; Kadir, Aida Khairunnisaa Ab.

    2015-05-01

    Human Motion Analysis (HMA) system has been one of the major interests among researchers in the field of computer vision, artificial intelligence and biomedical engineering and sciences. This is due to its wide and promising biomedical applications, namely, bio-instrumentation for human computer interfacing and surveillance system for monitoring human behaviour as well as analysis of biomedical signal and image processing for diagnosis and rehabilitation applications. This paper provides an extensive review of the camera system of HMA, its taxonomy, including camera types, camera calibration and camera configuration. The review focuses on evaluating camera system considerations of the HMA system specifically for biomedical applications. This review is important as it provides guidelines and recommendations for researchers and practitioners in selecting a camera system of the HMA system for biomedical applications.

  14. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  15. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 1, Issue 2

    Science.gov (United States)

    2011-01-01

    microwave access Wi-Fi (also known as 802.11) works well in offices, cafes, and outdoor “hotspots,” but if you move across town you’ll quickly lose... New Mexico State University. They are developing new protocols and methods specifically for operation in a complex battlefield-like environment... Dissemination and Aggregation Amiya Bhattacharya Department of Computer Science New Mexico State University amiya@cs.nmsu.edu (575) 646-6230 and Charbel

  16. AHPCRC (Army High Performance Computing Research Center) Bulletin. Volume 2, Issue 2, 2011

    Science.gov (United States)

    2011-01-01

    properties and the information required to model the hinges. Extensive research of different kinds of plastics and their common usage determined the... below). The direct heat flux method for computing the thermal conductivity is analogous to experimental measurements. It was successfully applied to... drive down a road and easily locate buried IEDs! My project goals this summer were to learn how to use and implement Verilog HDL and ModelSim

  17. Biomedical wellness challenges and opportunities

    Science.gov (United States)

    Tangney, John F.

    2012-06-01

    The mission of ONR's Human and Bioengineered Systems Division is to direct, plan, foster, and encourage Science and Technology in cognitive science, computational neuroscience, bioscience and bio-mimetic technology, social/organizational science, training, human factors, and decision making as related to future Naval needs. This paper highlights current programs that contribute to future biomedical wellness needs in context of humanitarian assistance and disaster relief. ONR supports fundamental research and related technology demonstrations in several related areas, including biometrics and human activity recognition; cognitive sciences; computational neurosciences and bio-robotics; human factors, organizational design and decision research; social, cultural and behavioral modeling; and training, education and human performance. In context of a possible future with automated casualty evacuation, elements of current science and technology programs are illustrated.

  18. Hastings Center

    Science.gov (United States)

    ... Scientists Join Forces ... Artificial intelligence and brain-computer interfaces could revolutionize the treatment ... With the power of artificial intelligence, machines can perform increasingly complex tasks, such as ...

  19. Grid Computing

    CERN Document Server

    Yen, Eric

    2008-01-01

    Based on the Grid Computing: International Symposium on Grid Computing (ISGC) 2007, held in Taipei, Taiwan in March of 2007, this title presents the grid solutions and research results in grid operations, grid middleware, biomedical operations, and e-science applications. It is suitable for graduate-level students in computer science.

  20. A canonical perturbation method for computing the guiding-center motion in magnetized axisymmetric plasma columns

    International Nuclear Information System (INIS)

    Gratreau, P.

    1987-01-01

    The motion of charged particles in a magnetized plasma column, such as that of a magnetic mirror trap or a tokamak, is determined in the framework of the canonical perturbation theory through a method of variation of constants which preserves the energy conservation and the symmetry invariance. The choice of a frame of coordinates close to that of the magnetic coordinates allows a relatively precise determination of the guiding-center motion with a low-ordered approximation in the adiabatic parameter. A Hamiltonian formulation of the motion equations is obtained
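
    For orientation, a commonly quoted lowest-order guiding-center Hamiltonian in a strong magnetic field has the form below, with the magnetic moment entering as an adiabatic invariant; the paper's own canonical perturbation construction in magnetic-like coordinates is more elaborate, so this is only a standard reference form, not the authors' result.

```latex
% Standard lowest-order guiding-center Hamiltonian (reference form only)
H(\mathbf{R}, p_{\parallel}, \mu) \;=\; \frac{p_{\parallel}^{2}}{2m} \;+\; \mu\, B(\mathbf{R}) \;+\; q\,\Phi(\mathbf{R}),
\qquad
\mu \;=\; \frac{m v_{\perp}^{2}}{2 B} \quad \text{(adiabatic invariant)} .
```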

  1. Teaching scientific principles through a computer-based, design-centered learning environment

    Science.gov (United States)

    Wolfe, Michael Brian

    Research on science instruction indicates that the traditional science classroom is not always effective in improving students' scientific understanding. Physics courses, in particular, do not promote the ability to apply scientific principles for many reasons, based on their focus on procedural problem-solving and lab exercises. In this dissertation, I propose the Designing-to-Learn Architecture (DTLA), a design-centered goal-based scenario (GBS) architecture, theoretically grounded in the literature on design-centered learning environments, goal-based scenarios, intelligent tutoring systems and simulations. The DTLA offers an alternative approach to addressing the issues encountered in the traditional science classroom. The architecture consists of an artifact with associated design goals; components with component options; a simulation; a reference database; and guided tutorials. I describe the design of Goin' Up?, the prototype DTL application, serving as the basis for evaluating the effectiveness of the DTLA. I present results of interview and testing protocols from the formative evaluation of Goin' Up?, suggesting that learning outcomes, though not statistically significant, could be improved through DTLA enhancements informed by usage patterns in software sessions. I conclude with an analysis of the results and suggestions for improvements to the DTLA, including additional components to address reflection, provide support for novice designers, and offer tutorial guidance on the analysis of the artifact.

  2. Assessing the practice of biomedical ontology evaluation: Gaps and opportunities.

    Science.gov (United States)

    Amith, Muhammad; He, Zhe; Bian, Jiang; Lossio-Ventura, Juan Antonio; Tao, Cui

    2018-04-01

    With the proliferation of heterogeneous health care data in the last three decades, biomedical ontologies and controlled biomedical terminologies play a more and more important role in knowledge representation and management, data integration, natural language processing, as well as decision support for health information systems and biomedical research. Biomedical ontologies and controlled terminologies are intended to assure interoperability. Nevertheless, the quality of biomedical ontologies has hindered their applicability and subsequent adoption in real-world applications. Ontology evaluation is an integral part of ontology development and maintenance. In the biomedicine domain, ontology evaluation is often conducted by third parties as a quality assurance (or auditing) effort that focuses on identifying modeling errors and inconsistencies. In this work, we first organized four categorical schemes of ontology evaluation methods in the existing literature to create an integrated taxonomy. Further, to understand the ontology evaluation practice in the biomedicine domain, we reviewed a sample of 200 ontologies from the National Center for Biomedical Ontology (NCBO) BioPortal-the largest repository for biomedical ontologies-and observed that only 15 of these ontologies have documented evaluation in their corresponding inception papers. We then surveyed the recent quality assurance approaches for biomedical ontologies and their use. We also mapped these quality assurance approaches to the ontology evaluation criteria. It is our anticipation that ontology evaluation and quality assurance approaches will be more widely adopted in the development life cycle of biomedical ontologies. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. Big Biomedical data as the key resource for discovery science

    Energy Technology Data Exchange (ETDEWEB)

    Toga, Arthur W.; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W.; Price, Nathan D.; Glusman, Gustavo; Heavner, Benjamin D.; Dinov, Ivo D.; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-07-21

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an “-ome to home” approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center’s computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson’s and Alzheimer’s.

  4. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Bhattacharjee, Amitava [Univ. of New Hampshire, Durham, NH (United States)

    2016-03-27

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  5. CICART Center For Integrated Computation And Analysis Of Reconnection And Turbulence

    International Nuclear Information System (INIS)

    Bhattacharjee, Amitava

    2016-01-01

    CICART is a partnership between the University of New Hampshire (UNH) and Dartmouth College. CICART addresses two important science needs of the DoE: the basic understanding of magnetic reconnection and turbulence that strongly impacts the performance of fusion plasmas, and the development of new mathematical and computational tools that enable the modeling and control of these phenomena. The principal participants of CICART constitute an interdisciplinary group, drawn from the communities of applied mathematics, astrophysics, computational physics, fluid dynamics, and fusion physics. It is a main premise of CICART that fundamental aspects of magnetic reconnection and turbulence in fusion devices, smaller-scale laboratory experiments, and space and astrophysical plasmas can be viewed from a common perspective, and that progress in understanding in any of these interconnected fields is likely to lead to progress in others. The establishment of CICART has strongly impacted the education and research mission of a new Program in Integrated Applied Mathematics in the College of Engineering and Applied Sciences at UNH by enabling the recruitment of a tenure-track faculty member, supported equally by UNH and CICART, and the establishment of an IBM-UNH Computing Alliance. The proposed areas of research in magnetic reconnection and turbulence in astrophysical, space, and laboratory plasmas include the following topics: (A) Reconnection and secondary instabilities in large high-Lundquist-number plasmas, (B) Particle acceleration in the presence of multiple magnetic islands, (C) Gyrokinetic reconnection: comparison with fluid and particle-in-cell models, (D) Imbalanced turbulence, (E) Ion heating, and (F) Turbulence in laboratory (including fusion-relevant) experiments. These theoretical studies make active use of three high-performance computer simulation codes: (1) The Magnetic Reconnection Code, based on extended two-fluid (or Hall MHD) equations, in an Adaptive Mesh

  6. 15th International Conference on Biomedical Engineering

    CERN Document Server

    2014-01-01

    This volume presents the proceedings of the 15th ICMBE held from 4th to 7th December 2013, Singapore. Biomedical engineering is applied in most aspects of our healthcare ecosystem. From electronic health records to diagnostic tools to therapeutic, rehabilitative and regenerative treatments, the work of biomedical engineers is evident. Biomedical engineers work at the intersection of engineering, life sciences and healthcare. The engineers use principles from applied science, including mechanical, electrical, chemical and computer engineering, together with physical sciences, including physics, chemistry and mathematics, and apply them to biology and medicine. Applying such concepts to the human body involves very much the same concepts that go into building and programming a machine. The goal is to better understand, replace or fix a target system to ultimately improve the quality of healthcare. With this understanding, the conference proceedings offer a single platform for individuals and organisations working i...

  7. Computer simulations of low energy displacement cascades in a face centered cubic lattice

    International Nuclear Information System (INIS)

    Schiffgens, J.O.; Bourquin, R.D.

    1976-09-01

    Computer simulations of atomic motion in a copper lattice following the production of primary knock-on atoms (PKAs) with energies from 25 to 200 eV are discussed. In this study, a mixed Moliere-Englert pair potential is used to model the copper lattice. The computer code COMENT, which employs the dynamical method, is used to analyze the motion of up to 6000 atoms per time step during cascade evolution. The atoms are specified as initially at rest on the sites of an ideal lattice. A matrix of 12 PKA directions and 6 PKA energies is investigated. Displacement thresholds in the [110] and [100] directions are calculated to be approximately 17 and 20 eV, respectively. A table showing the stability of isolated Frenkel pairs with different vacancy and interstitial orientations and separations is presented. The numbers of Frenkel pairs and atomic replacements are tabulated as a function of PKA direction for each energy. For PKA energies of 25, 50, 75, 100, 150, and 200 eV, the average numbers of Frenkel pairs per PKA are 0.4, 0.6, 1.0, 1.2, 1.4, and 2.2, and the average numbers of replacements per PKA are 2.4, 4.0, 3.3, 4.9, 9.3, and 15.8.
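
    The defect yields quoted in the last sentence can be collected into a small lookup table; the sketch below does so and adds a linear interpolation between the tabulated PKA energies, which is purely illustrative, since the simulations were run only at the listed energies.

```python
# Average defect yields quoted in the abstract; interpolation between energies is illustrative only.

FRENKEL = {25: 0.4, 50: 0.6, 75: 1.0, 100: 1.2, 150: 1.4, 200: 2.2}        # pairs per PKA
REPLACEMENTS = {25: 2.4, 50: 4.0, 75: 3.3, 100: 4.9, 150: 9.3, 200: 15.8}  # replacements per PKA

def interpolate(table, energy_ev):
    """Linear interpolation on the tabulated (energy, yield) points."""
    pts = sorted(table.items())
    if energy_ev <= pts[0][0]:
        return pts[0][1]
    for (e0, y0), (e1, y1) in zip(pts, pts[1:]):
        if e0 <= energy_ev <= e1:
            return y0 + (y1 - y0) * (energy_ev - e0) / (e1 - e0)
    return pts[-1][1]

print(interpolate(FRENKEL, 125))       # ~1.3 Frenkel pairs per PKA
print(interpolate(REPLACEMENTS, 125))  # ~7.1 replacements per PKA
```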

  8. Computer Data Analysis for Meteorology - Project-Centered Skill Development for the Early Undergraduate Career

    Science.gov (United States)

    Ellis, T. D.

    2014-12-01

    Too often in geoscience education are the computer skills necessary for success in the workforce put off until the last years of undergraduate education. This is especially true in meteorology, a form of geophysical fluid dynamics many people encounter on a daily basis. Meteorologists often need to know specialized computer skills, including the use of scripting languages to automate handling large bundles of data, manipulating four-dimensional arrays (with three spatial dimensions and one time dimension), visualizing said datasets simply and effectively for publication, and performing statistical analysis of those datasets. Such topics are often addressed only at the senior undergraduate level or graduate school. At SUNY Oneonta, we are piloting a course that teaches these skills to third-semester students with the intent of building confidence in these skills throughout students' careers and of building a tool-box of skills that can be used in upper-division courses and undergraduate research. This poster will present the methods used in building this course, the kinds of activities designed, the desired student learning outcomes, our assessment of those outcomes, and new initiatives engaged since the completion of the NSF-funded portion of the project in 2012.
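
    A minimal example of the kind of exercise the course targets, assuming NumPy and synthetic data rather than any dataset used in the course, is a four-dimensional array ordered (time, level, latitude, longitude) with two common reductions:

```python
# Sketch of handling a 4D meteorological field (time, level, lat, lon); data are synthetic.
import numpy as np

ntime, nlev, nlat, nlon = 8, 5, 73, 144
temperature = 250.0 + 30.0 * np.random.rand(ntime, nlev, nlat, nlon)   # Kelvin, synthetic

time_mean = temperature.mean(axis=0)    # (level, lat, lon): mean over the time dimension
zonal_mean = temperature.mean(axis=3)   # (time, level, lat): average over longitude

print(time_mean.shape, zonal_mean.shape)
```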

  9. Computation of Electromagnetic Fields Scattered From Dielectric Objects of Uncertain Shapes Using MLMC Center for Uncertainty

    KAUST Repository

    Litvinenko, Alexander

    2015-01-05

    Simulators capable of computing scattered fields from objects of uncertain shapes are highly useful in electromagnetics and photonics, where device designs are typically subject to fabrication tolerances. Knowledge of statistical variations in scattered fields is useful in ensuring error-free functioning of devices. Oftentimes such simulators use a Monte Carlo (MC) scheme to sample the random domain, where the variables parameterize the uncertainties in the geometry. At each sample, which corresponds to a realization of the geometry, a deterministic electromagnetic solver is executed to compute the scattered fields. However, to obtain accurate statistics of the scattered fields, the number of MC samples has to be large. This significantly increases the total execution time. In this work, to address this challenge, the Multilevel MC (MLMC) scheme is used together with a (deterministic) surface integral equation solver. The MLMC achieves a higher efficiency by “balancing” the statistical errors due to sampling of the random domain and the numerical errors due to discretization of the geometry at each of these samples. Error balancing results in a smaller number of samples requiring coarser discretizations. Consequently, total execution time is significantly shortened.
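
    In generic form, the MLMC estimator sums, over discretization levels, the sample mean of the difference between fine- and coarse-level quantities of interest, using many cheap coarse samples and few expensive fine ones. The sketch below shows that structure with a stand-in "solver"; it is not the surface integral equation code used in this work, and all numbers are invented.

```python
# Generic multilevel Monte Carlo estimator sketch; sample_q stands in for a geometry-dependent solve.
import numpy as np

def mlmc_estimate(sample_q, levels, samples_per_level):
    """E[Q_L] ~= sum_l mean(Q_l - Q_{l-1}), with level-dependent sample counts."""
    estimate = 0.0
    for level, n in zip(levels, samples_per_level):
        diffs = []
        for _ in range(n):
            omega = np.random.rand()                          # one realization of the random shape
            q_fine = sample_q(level, omega)
            q_coarse = sample_q(level - 1, omega) if level > 0 else 0.0
            diffs.append(q_fine - q_coarse)
        estimate += float(np.mean(diffs))
    return estimate

# Hypothetical "solver": a scattered-field quantity that converges as the level increases.
sample_q = lambda level, omega: 1.0 + 0.1 * omega + 2.0 ** (-level)
print(mlmc_estimate(sample_q, levels=[0, 1, 2, 3], samples_per_level=[4000, 1000, 250, 60]))
```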

  10. Computations on the massively parallel processor at the Goddard Space Flight Center

    Science.gov (United States)

    Strong, James P.

    1991-01-01

    Described are four significant algorithms implemented on the massively parallel processor (MPP) at the Goddard Space Flight Center. Two are in the area of image analysis. Of the other two, one is a mathematical simulation experiment and the other deals with the efficient transfer of data between distantly separated processors in the MPP array. The first algorithm presented is the automatic determination of elevations from stereo pairs. The second algorithm solves mathematical logistic equations capable of producing both ordered and chaotic (or random) solutions. This work can potentially lead to the simulation of artificial life processes. The third algorithm is the automatic segmentation of images into reasonable regions based on some similarity criterion, while the fourth is an implementation of a bitonic sort of data which significantly overcomes the nearest neighbor interconnection constraints on the MPP for transferring data between distant processors.
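
    The abstract does not give the exact form of the logistic equations solved on the MPP, but the discrete logistic map is the textbook example of an equation whose solutions switch between ordered and chaotic behaviour as a parameter varies; the sketch below is that generic map, not the MPP implementation.

```python
# Discrete logistic map x_{n+1} = r * x_n * (1 - x_n): ordered vs chaotic regimes.

def logistic_orbit(r, x0=0.2, n=10):
    """Iterate the logistic map n times starting from x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

print(logistic_orbit(2.8))  # settles toward a fixed point (ordered behaviour)
print(logistic_orbit(3.9))  # wanders irregularly (chaotic behaviour)
```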

  11. Simbody: multibody dynamics for biomedical research.

    Science.gov (United States)

    Sherman, Michael A; Seth, Ajay; Delp, Scott L

    Multibody software designed for mechanical engineering has been successfully employed in biomedical research for many years. For real time operation some biomedical researchers have also adapted game physics engines. However, these tools were built for other purposes and do not fully address the needs of biomedical researchers using them to analyze the dynamics of biological structures and make clinically meaningful recommendations. We are addressing this problem through the development of an open source, extensible, high performance toolkit including a multibody mechanics library aimed at the needs of biomedical researchers. The resulting code, Simbody, supports research in a variety of fields including neuromuscular, prosthetic, and biomolecular simulation, and related research such as biologically-inspired design and control of humanoid robots and avatars. Simbody is the dynamics engine behind OpenSim, a widely used biomechanics simulation application. This article reviews issues that arise uniquely in biomedical research, and reports on the architecture, theory, and computational methods Simbody uses to address them. By addressing these needs explicitly Simbody provides a better match to the needs of researchers than can be obtained by adaptation of mechanical engineering or gaming codes. Simbody is a community resource, free for any purpose. We encourage wide adoption and invite contributions to the code base at https://simtk.org/home/simbody.

  12. Biomedical engineering for health research and development.

    Science.gov (United States)

    Zhang, X-Y

    2015-01-01

    Biomedical engineering is a new area of research in medicine and biology, providing new concepts and designs for the diagnosis, treatment and prevention of various diseases. There are several types of biomedical engineering, such as tissue, genetic, neural and stem cell engineering, as well as chemical and clinical engineering for health care. Many electronic and magnetic methods and types of equipment are used in biomedical engineering, such as Computed Tomography (CT) scans, Magnetic Resonance Imaging (MRI) scans, Electroencephalography (EEG), ultrasound, regenerative medicine and stem cell cultures, and preparations of artificial cells and organs, such as pancreas, urinary bladders, liver cells, and fibroblast cells of foreskin, among others. The principle of tissue engineering is described with the various types of cells used for tissue engineering purposes. The use of several medical devices and bionics is mentioned, along with scaffolds, cells and tissue cultures, and the various materials used for biomedical engineering. The use of biomedical engineering methods is very important for human health and for the research and development of diseases. The bioreactors and preparations of artificial cells or tissues and organs are described here.

  13. Chapter 1: Biomedical knowledge integration.

    Directory of Open Access Journals (Sweden)

    Philip R O Payne

    Full Text Available The modern biomedical research and healthcare delivery domains have seen an unparalleled increase in the rate of innovation and novel technologies over the past several decades. Catalyzed by paradigm-shifting public and private programs focusing upon the formation and delivery of genomic and personalized medicine, the need for high-throughput and integrative approaches to the collection, management, and analysis of heterogeneous data sets has become imperative. This need is particularly pressing in the translational bioinformatics domain, where many fundamental research questions require the integration of large scale, multi-dimensional clinical phenotype and bio-molecular data sets. Modern biomedical informatics theory and practice has demonstrated the distinct benefits associated with the use of knowledge-based systems in such contexts. A knowledge-based system can be defined as an intelligent agent that employs a computationally tractable knowledge base or repository in order to reason upon data in a targeted domain and reproduce expert performance relative to such reasoning operations. The ultimate goal of the design and use of such agents is to increase the reproducibility, scalability, and accessibility of complex reasoning tasks. Examples of the application of knowledge-based systems in biomedicine span a broad spectrum, from the execution of clinical decision support, to epidemiologic surveillance of public data sets for the purposes of detecting emerging infectious diseases, to the discovery of novel hypotheses in large-scale research data sets. In this chapter, we will review the basic theoretical frameworks that define core knowledge types and reasoning operations with particular emphasis on the applicability of such conceptual models within the biomedical domain, and then go on to introduce a number of prototypical data integration requirements and patterns relevant to the conduct of translational bioinformatics that can be addressed

  14. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  15. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2003 to March 31, 2004

    International Nuclear Information System (INIS)

    2005-08-01

    Major research and development activities of the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, have focused on the ITBL (IT Based Laboratory) project, computational material science and quantum bioinformatics. This report provides an overview of research and development activities in CCSE in the fiscal year 2003 (April 1, 2003 - March 31, 2004). (author)

  16. Abstracts of digital computer code packages. Assembled by the Radiation Shielding Information Center

    International Nuclear Information System (INIS)

    McGill, B.; Maskewitz, B.F.; Anthony, C.M.; Comolander, H.E.; Hendrickson, H.R.

    1976-01-01

    The term ''code package'' is used to describe a miscellaneous grouping of materials which, when interpreted in connection with a digital computer, enables the scientist--user to solve technical problems in the area for which the material was designed. In general, a ''code package'' consists of written material--reports, instructions, flow charts, listings of data, and other useful material and IBM card decks (or, more often, a reel of magnetic tape) on which the source decks, sample problem input (including libraries of data) and the BCD/EBCDIC output listing from the sample problem are written. In addition to the main code, and any available auxiliary routines are also included. The abstract format was chosen to give to a potential code user several criteria for deciding whether or not he wishes to request the code package

  17. Computer vision for real-time orbital operations. Center directors discretionary fund

    Science.gov (United States)

    Vinz, F. L.; Brewster, L. L.; Thomas, L. D.

    1984-01-01

    Machine vision research is examined as it relates to the NASA Space Station program and its associated Orbital Maneuvering Vehicle (OMV). Initial operation of OMV for orbital assembly, docking, and servicing are manually controlled from the ground by means of an on board TV camera. These orbital operations may be accomplished autonomously by machine vision techniques which use the TV camera as a sensing device. Classical machine vision techniques are described. An alternate method is developed and described which employs a syntactic pattern recognition scheme. It has the potential for substantial reduction of computing and data storage requirements in comparison to the Two-Dimensional Fast Fourier Transform (2D FFT) image analysis. The method embodies powerful heuristic pattern recognition capability by identifying image shapes such as elongation, symmetry, number of appendages, and the relative length of appendages.
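
    As a hedged illustration of one of the shape cues named above (elongation), the sketch below computes the ratio of a binary silhouette's principal-axis lengths from second-order central moments; this is a generic computer vision calculation, not the report's syntactic recognition scheme.

```python
# Elongation of a binary shape from second-order central moments (illustrative only).
import numpy as np

def elongation(mask):
    """Ratio of principal-axis lengths of a binary shape (1.0 = circular, >1 = elongated)."""
    ys, xs = np.nonzero(mask)
    x0, y0 = xs.mean(), ys.mean()
    mu20 = ((xs - x0) ** 2).mean()
    mu02 = ((ys - y0) ** 2).mean()
    mu11 = ((xs - x0) * (ys - y0)).mean()
    common = np.sqrt((mu20 - mu02) ** 2 + 4 * mu11 ** 2)
    lam1 = (mu20 + mu02 + common) / 2.0          # variance along the major axis
    lam2 = (mu20 + mu02 - common) / 2.0          # variance along the minor axis
    return np.sqrt(lam1 / lam2) if lam2 > 0 else np.inf

mask = np.zeros((40, 40), dtype=int)
mask[18:22, 5:35] = 1          # a long thin bar
print(elongation(mask))        # clearly greater than 1
```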

  18. Introduction to biomedical optics

    CERN Document Server

    Splinter, Robert

    2006-01-01

    GENERAL BIOMEDICAL OPTICS THEORY: Introduction to the Use of Light for Diagnostic and Therapeutic Modalities; What Is Biomedical Optics?; Biomedical Optics Timeline; Elementary Optical Discoveries; Historical Events in Therapeutic and Diagnostic Use of Light; Light Sources; Current State of the Art; Summary; Additional Reading; Problems; Review of Optical Principles: Fundamental Electromagnetic Theory and Description of Light Sources; Definitions in Optics; Kirchhoff's Laws of Radiation; Electromagnetic Wave Theory; Light Sources; Applications of Various Lasers; Summary; Additional Reading; Problems; Review of Optical Principles: Classical Optics; Geometrical Optics; Other Optical Principles; Quantum Physics; Gaussian Optics; Summary; Additional Reading; Problems; Review of Optical Interaction Properties; Absorption and Scattering; Summary; Additional Reading; Problems; Light-Tissue Interaction Variables; Laser Variables; Tissue Variables; Light Transportation Theory; Light Propagation under Dominant Absorption; Summary; Nomenclature; Additional Reading; Problems; Light-Tissue Interaction Th

  19. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1976-01-01

    Advances in Biomedical Engineering, Volume 6, is a collection of papers that discusses the role of integrated electronics in medical systems and the usage of biological mathematical models in biological systems. Other papers deal with the health care systems, the problems and methods of approach toward rehabilitation, as well as the future of biomedical engineering. One paper discusses the use of system identification as it applies to biological systems to estimate the values of a number of parameters (for example, resistance, diffusion coefficients) by indirect means. More particularly, the i

  20. Advances in biomedical engineering

    CERN Document Server

    Brown, J H U

    1976-01-01

    Advances in Biomedical Engineering, Volume 5, is a collection of papers that deals with application of the principles and practices of engineering to basic and applied biomedical research, development, and the delivery of health care. The papers also describe breakthroughs in health improvements, as well as basic research that have been accomplished through clinical applications. One paper examines engineering principles and practices that can be applied in developing therapeutic systems by a controlled delivery system in drug dosage. Another paper examines the physiological and materials vari

  1. Biomedical enhancements as justice.

    Science.gov (United States)

    Nam, Jeesoo

    2015-02-01

    Biomedical enhancements, the applications of medical technology to make better those who are neither ill nor deficient, have made great strides in the past few decades. Using Amartya Sen's capability approach as my framework, I argue in this article that far from being simply permissible, we have a prima facie moral obligation to use these new developments for the end goal of promoting social justice. In terms of both range and magnitude, the use of biomedical enhancements will mark a radical advance in how we compensate the most disadvantaged members of society. © 2013 John Wiley & Sons Ltd.

  2. GSDC: A Unique Data Center in Korea for HEP research

    Directory of Open Access Journals (Sweden)

    Ahn Sang-Un

    2017-01-01

    Full Text Available Global Science experimental Data hub Center (GSDC) at Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea established for promoting the fundamental research fields by supporting them with the expertise on Information and Communication Technology (ICT) and the infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC) and Networking. GSDC has supported various research fields in South Korea dealing with the large scale of data, e.g. RENO experiment for neutrino research, LIGO experiment for gravitational wave detection, Genome sequencing project for bio-medical, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for ALICE experiment using the LHC at CERN since 2013. In this talk, we present the overview on computing infrastructure that GSDC runs for the research fields and we discuss on the data center infrastructure management system deployed at GSDC.

  3. GSDC: A Unique Data Center in Korea for HEP research

    Science.gov (United States)

    Ahn, Sang-Un

    2017-04-01

    Global Science experimental Data hub Center (GSDC) at Korea Institute of Science and Technology Information (KISTI) is a unique data center in South Korea established for promoting the fundamental research fields by supporting them with the expertise on Information and Communication Technology (ICT) and the infrastructure for High Performance Computing (HPC), High Throughput Computing (HTC) and Networking. GSDC has supported various research fields in South Korea dealing with the large scale of data, e.g. RENO experiment for neutrino research, LIGO experiment for gravitational wave detection, Genome sequencing project for bio-medical, and HEP experiments such as CDF at FNAL, Belle at KEK, and STAR at BNL. In particular, GSDC has run a Tier-1 center for ALICE experiment using the LHC at CERN since 2013. In this talk, we present the overview on computing infrastructure that GSDC runs for the research fields and we discuss on the data center infrastructure management system deployed at GSDC.

  4. Repeat computed tomography for trauma patients undergoing transfer to a Level I trauma center.

    Science.gov (United States)

    Young, Andrew Joseph; Meyers, Kenneth Sadler; Wolfe, Luke; Duane, Therese Marie

    2012-06-01

    Our goal was to determine the characteristics of trauma transfer patients with repeat imaging. A retrospective trauma registry review was performed to evaluate trauma patients who were transferred from referring institutions between January 2005 and December 2009. Patients were divided into those who had a duplicate computed tomography (CT) scan versus those who did not. There were 2678 patients included of whom 559 (21%) had at least one repeat CT scan, whereas 2119 (79%) did not have any repeat CT scans. Those with repeat CT scans were older (42.3 ± 27.3 years vs 37.3 ± 25.6 years), had a higher Injury Severity Score (ISS) (13.7 ± 8.7 vs 11.9 ± 8.8), and more likely to have blunt trauma (odds ratio, 4.7; confidence interval, 2.3 to 9.6) (P for all < 0.0007). Those with CT scans done only at the referring facility were younger, had a lower ISS, and shorter lengths of stay (P for all < 0.0003). ISS and age were independent predictors for repeat CT scans. Transfer patients had imaging repeated one-fifth of the time. The younger, less injured patient went without repeat imaging suggesting that they may have been adequately cared for at the outside institution.

  5. EVALUATION OF PROPTOSIS BY USING COMPUTED TOMOGRAPHY IN A TERTIARY CARE CENTER, BURLA, SAMBALPUR, ODISHA

    Directory of Open Access Journals (Sweden)

    Vikas Agrawal

    2017-07-01

    Full Text Available BACKGROUND Proptosis is defined as the abnormal anterior protrusion of the globe beyond the orbital margins.1 It is an important clinical manifestation of various orbital as well as systemic disorders. Aetiology ranges from infection to malignant tumours, among which space-occupying lesions within the orbits are the most important. Proptosis is defined as an abnormal protrusion of the eyeball. MATERIALS AND METHODS A total of 32 patients referred from various departments, mainly from ophthalmology and medicine, with history and clinical features suggestive of proptosis were evaluated in our department and, after proper history taking and clinical examination, Computed Tomography (CT) scan was done. RESULTS The age of the patients ranged from 1-55 years. Associated chief complaints in case of proptosis were, in decreasing order, pain/headache, restricted eye movement, diminished vision and diplopia. Mass lesions (46.87%) were the most common cause of proptosis followed by inflammatory lesions (37.5%). Trauma, vascular lesions and congenital conditions were infrequent causes of proptosis. In children, common causes of proptosis were retinoblastoma (35.71%) and orbital cellulitis (28.57%), and in adults the common causes were thyroid ophthalmopathy (22.22%), trauma (16.66%) and pseudo-tumour (16.66%). CONCLUSION Mass lesions (46.87%) were the most common cause of proptosis followed by inflammatory lesions (37.5%). CT scanning should be the chief investigation in evaluation of lesions causing proptosis. It is the most useful in detecting, characterising and determining the extent of the disease process. The overall accuracy of CT scan in diagnosis of proptosis is 96.87%.

  6. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Full Text Available Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. This planet-scale data brings serious challenges to storage and computing technologies. Cloud computing is an attractive alternative because it addresses both storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications should facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  7. Biomedical Engineering in Modern Society

    Science.gov (United States)

    Attinger, E. O.

    1971-01-01

    Considers definition of biomedical engineering (BME) and how biomedical engineers should be trained. State of the art descriptions of BME and BME education are followed by a brief look at the future of BME. (TS)

  8. Biomedical Data Mining

    NARCIS (Netherlands)

    Peek, N.; Combi, C.; Tucker, A.

    2009-01-01

    Objective: To introduce the special topic of Methods of Information in Medicine on data mining in biomedicine, with selected papers from two workshops on Intelligent Data Analysis in bioMedicine (IDAMAP) held in Verona (2006) and Amsterdam (2007). Methods: Defining the field of biomedical data

  9. Anatomy for Biomedical Engineers

    Science.gov (United States)

    Carmichael, Stephen W.; Robb, Richard A.

    2008-01-01

    There is a perceived need for anatomy instruction for graduate students enrolled in a biomedical engineering program. This appeared especially important for students interested in and using medical images. These students typically did not have a strong background in biology. The authors arranged for students to dissect regions of the body that…

  10. Biomedical research applications

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    The biomedical research Panel believes that the Calutron facility at Oak Ridge is a national and international resource of immense scientific value and of fundamental importance to continued biomedical research. This resource is essential to the development of new isotope uses in biology and medicine. It should therefore be nurtured by adequate support and operated in a way that optimizes its services to the scientific and technological community. The Panel sees a continuing need for a reliable supply of a wide variety of enriched stable isotopes. The past and present utilization of stable isotopes in biomedical research is documented in Appendix 7. Future requirements for stable isotopes are impossible to document, however, because of the unpredictability of research itself. Nonetheless we expect the demand for isotopes to increase in parallel with the continuing expansion of biomedical research as a whole. There are a number of promising research projects at the present time, and these are expected to lead to an increase in production requirements. The Panel also believes that a high degree of priority should be given to replacing the supplies of the 65 isotopes (out of the 224 previously available enriched isotopes) no longer available from ORNL

  11. National Space Biomedical Research Institute Annual Report

    Science.gov (United States)

    2000-01-01

    This report summarizes the activities of the National Space Biomedical Research Institute (NSBRI) during FY 2000. The NSBRI is responsible for the development of countermeasures against the deleterious effects of long-duration space flight and performs fundamental and applied space biomedical research directed towards this specific goal. Its mission is to lead a world-class, national effort in integrated, critical path space biomedical research that supports NASA's Human Exploration and Development of Space (HEDS) Strategic Plan by focusing on the enabling of long-term human presence in, development of, and exploration of space. This is accomplished by: designing, testing and validating effective countermeasures to address the biological and environmental impediments to long-term human space flight; defining the molecular, cellular, organ-level, integrated responses and mechanistic relationships that ultimately determine these impediments, where such activity fosters the development of novel countermeasures; establishing biomedical support technologies to maximize human performance in space, reduce biomedical hazards to an acceptable level, and deliver quality medical care; transferring and disseminating the biomedical advances in knowledge and technology acquired through living and working in space to the general benefit of mankind, including the treatment of patients suffering from gravity- and radiation-related conditions on Earth; and ensuring open involvement of the scientific community, industry and the public at large in the Institute's activities and fostering a robust collaboration with NASA, particularly through NASA's Lyndon B. Johnson Space Center. Attachment: Appendices A through P.

  12. Biomedical Research Institute, Biomedical Research Foundation of Northwest Louisiana, Shreveport, Louisiana

    International Nuclear Information System (INIS)

    1992-01-01

    Department of Energy (DOE) has prepared an Environmental Assessment (EA), DOE/EA-0789, evaluating the environmental impacts of construction and operation of a Biomedical Research Institute (BRI) at the Louisiana State University (LSU) Medical Center, Shreveport, Louisiana. The purpose of the BRI is to accelerate the development of biomedical research in cardiovascular disease, molecular biology, and neurobiology. Based on the analyses in the EA, DOE has determined that the proposed action does not constitute a major Federal action significantly affecting the quality of the human environment within the meaning of the National Environmental Policy Act of 1969 (NEPA). Therefore, the preparation of an Environmental Impact Statement is not required

  13. Biomedical Wireless Ambulatory Crew Monitor

    Science.gov (United States)

    Chmiel, Alan; Humphreys, Brad

    2009-01-01

    A compact, ambulatory biometric data acquisition system has been developed for space and commercial terrestrial use. BioWATCH (Biomedical Wireless and Ambulatory Telemetry for Crew Health) acquires signals from biomedical sensors using acquisition modules attached to a common data and power bus. Several slots allow the user to configure the unit by inserting sensor-specific modules. The data are then sent in real time from the unit over any commercially implemented wireless network, including 802.11b/g, WCDMA and 3G. The system has a distributed computing hierarchy with a common data controller on each sensor module. This allows for modularity of the device along with the tailored ability to control the cards using a relatively small master processor. The distributed nature of this system affords modularity, size and power consumption that better the current state of the art in medical ambulatory data acquisition. A new company was created to market this technology.
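
    The modular bus architecture described above lends itself to a plug-in software design. The sketch below is a hypothetical illustration (module names and readings are invented, not BioWATCH's actual firmware interfaces) of how a small master process might merge frames from whichever sensor cards are inserted.

        from abc import ABC, abstractmethod
        from typing import Dict, List

        class SensorModule(ABC):
            """Hypothetical sensor card behind a common data/power bus interface."""

            @abstractmethod
            def sample(self) -> Dict[str, float]:
                """Return one frame of readings from this module."""

        class EcgModule(SensorModule):
            def sample(self) -> Dict[str, float]:
                return {"ecg_mV": 0.42}        # placeholder reading

        class PulseOxModule(SensorModule):
            def sample(self) -> Dict[str, float]:
                return {"spo2_percent": 98.0}  # placeholder reading

        def poll_bus(modules: List[SensorModule]) -> Dict[str, float]:
            """Master processor merges frames from the inserted modules."""
            frame: Dict[str, float] = {}
            for module in modules:
                frame.update(module.sample())
            return frame

        print(poll_bus([EcgModule(), PulseOxModule()]))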

  14. Biomedical Science Technologists in Lagos Universities: Meeting ...

    African Journals Online (AJOL)

    Biomedical Science Technologists in Lagos Universities: Meeting Modern Standards in Biomedical Research. ... biomedical techniques. SOTA biomedical science needs adequate financial investment for the scientific resources as well as stable civic infrastructure, thus these public institutions need more of such provisions.

  15. Production Support Flight Control Computers: Research Capability for F/A-18 Aircraft at Dryden Flight Research Center

    Science.gov (United States)

    Carter, John F.

    1997-01-01

    NASA Dryden Flight Research Center (DFRC) is working with the United States Navy to complete ground testing and initiate flight testing of a modified set of F/A-18 flight control computers. The Production Support Flight Control Computers (PSFCC) can give any fleet F/A-18 airplane an in-flight, pilot-selectable research control law capability. NASA DFRC can efficiently flight test the PSFCC for the following four reasons: (1) Six F/A-18 chase aircraft are available which could be used with the PSFCC; (2) An F/A-18 processor-in-the-loop simulation exists for validation testing; (3) The expertise has been developed in programming the research processor in the PSFCC; and (4) A well-defined process has been established for clearing flight control research projects for flight. This report presents a functional description of the PSFCC. Descriptions of the NASA DFRC facilities, PSFCC verification and validation process, and planned PSFCC projects are also provided.

  16. Computer-Based Training in Eating and Nutrition Facilitates Person-Centered Hospital Care: A Group Concept Mapping Study.

    Science.gov (United States)

    Westergren, Albert; Edfors, Ellinor; Norberg, Erika; Stubbendorff, Anna; Hedin, Gita; Wetterstrand, Martin; Rosas, Scott R; Hagell, Peter

    2018-04-01

    Studies have shown that computer-based training in eating and nutrition for hospital nursing staff increased the likelihood that patients at risk of undernutrition would receive nutritional interventions. This article seeks to provide an understanding, from the perspective of nursing staff, of conceptually important areas for computer-based nutritional training and their relative importance to nutritional care, following completion of the training. Group concept mapping, an integrated qualitative and quantitative methodology, was used to conceptualize important factors relating to the training experiences through four focus groups (n = 43), statement sorting (n = 38), and importance rating (n = 32), followed by multidimensional scaling and cluster analysis. Sorting of 38 statements yielded four clusters. These clusters (number of statements) were as follows: personal competence and development (10), practice-close care development (10), patient safety (9), and awareness about the nutrition care process (9). The first and second clusters represented "the learning organization," and the third and fourth represented "quality improvement." These findings provide a conceptual basis for understanding the importance of training in eating and nutrition, which contributes to a learning organization and quality improvement, and can be linked to and facilitate person-centered nutritional care and patient safety.
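
    For readers unfamiliar with group concept mapping, the core analysis step turns participants' sortings into a statement-by-statement similarity matrix, maps it into two dimensions, and clusters the result. The sketch below, assuming scikit-learn and toy sorting data (the study's statements and sorts are not reproduced), shows that pipeline.

        import numpy as np
        from sklearn.manifold import MDS
        from sklearn.cluster import KMeans

        # Toy sorting data: each participant groups statement indices into piles.
        sorts = [
            [{0, 1, 2}, {3, 4}, {5}],
            [{0, 1}, {2, 3, 4}, {5}],
            [{0, 1, 2, 3}, {4, 5}],
        ]
        n_statements = 6

        # Similarity = number of participants who placed two statements in the same pile.
        sim = np.zeros((n_statements, n_statements))
        for piles in sorts:
            for pile in piles:
                for i in pile:
                    for j in pile:
                        sim[i, j] += 1
        dissim = sim.max() - sim
        np.fill_diagonal(dissim, 0.0)

        # Two-dimensional point map, then clusters on top of it.
        coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissim)
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(coords)
        print(labels)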

  17. Biomedical signals, imaging, and informatics

    CERN Document Server

    Bronzino, Joseph D

    2014-01-01

    Known as the bible of biomedical engineering, The Biomedical Engineering Handbook, Fourth Edition, sets the standard against which all other references of this nature are measured. As such, it has served as a major resource for both skilled professionals and novices to biomedical engineering.Biomedical Signals, Imaging, and Informatics, the third volume of the handbook, presents material from respected scientists with diverse backgrounds in biosignal processing, medical imaging, infrared imaging, and medical informatics.More than three dozen specific topics are examined, including biomedical s

  18. Crossing the Chasm: Information Technology to Biomedical Informatics

    Science.gov (United States)

    Fahy, Brenda G.; Balke, C. William; Umberger, Gloria H.; Talbert, Jeffery; Canales, Denise Niles; Steltenkamp, Carol L.; Conigliaro, Joseph

    2011-01-01

    Accelerating the translation of new scientific discoveries to improve human health and disease management is the overall goal of a series of initiatives integrated in the National Institutes of Health (NIH) “Roadmap for Medical Research.” The Clinical and Translational Science Award (CTSA) program is, arguably, the most visible component of the NIH Roadmap, providing resources to institutions to transform their clinical and translational research enterprises along the goals of the Roadmap. The CTSA program emphasizes biomedical informatics as a critical component for the accomplishment of the NIH’s translational objectives. To be optimally effective, emerging biomedical informatics programs must link with the information technology (IT) platforms of the enterprise clinical operations within academic health centers. This report details one academic health center’s transdisciplinary initiative to create an integrated academic discipline of biomedical informatics through the development of its clinical and translational science infrastructure and its response to the CTSA mechanism. This approach required a detailed informatics strategy to accomplish these goals. This transdisciplinary initiative was the impetus for creation of a specialized biomedical informatics core, the Center for Biomedical Informatics (CBI). Development of the CBI codified the need to incorporate medical informatics, including quality and safety informatics, and enterprise clinical information systems within the CBI. This paper describes the steps taken to develop the biomedical informatics infrastructure, its integration with clinical systems at one academic health center, successes achieved, and barriers encountered during these efforts. PMID:21383632

  19. Ethical Issues of Artificial Biomedical Applications

    OpenAIRE

    Alexiou, Athanasios; Psixa, Maria; Vlamos, Panagiotis

    2011-01-01

    Part 12: Medical Applications of ANN and Ethics of AI; International audience; While the plethora of artificial biomedical applications is enriched and combined with the possibilities of artificial intelligence, bioinformatics and nanotechnology, the variability in the ideological use of such concepts is associated with bioethical issues and several legal aspects. The convergence of bioethics and computer ethics, attempts to illustrate and approach problems, occurring by the fusion of human a...

  20. Research evaluation support services in biomedical libraries.

    Science.gov (United States)

    Gutzman, Karen Elizabeth; Bales, Michael E; Belter, Christopher W; Chambers, Thane; Chan, Liza; Holmes, Kristi L; Lu, Ya-Ling; Palmer, Lisa A; Reznik-Zellen, Rebecca C; Sarli, Cathy C; Suiter, Amy M; Wheeler, Terrie R

    2018-01-01

    The paper provides a review of current practices related to evaluation support services reported by seven biomedical and research libraries. A group of seven libraries from the United States and Canada described their experiences with establishing evaluation support services at their libraries. A questionnaire was distributed among the libraries to elicit information as to program development, service and staffing models, campus partnerships, training, products such as tools and reports, and resources used for evaluation support services. The libraries also reported interesting projects, lessons learned, and future plans. The seven libraries profiled in this paper report a variety of service models in providing evaluation support services to meet the needs of campus stakeholders. The service models range from research center cores, partnerships with research groups, and library programs with staff dedicated to evaluation support services. A variety of products and services were described such as an automated tool to develop rank-based metrics, consultation on appropriate metrics to use for evaluation, customized publication and citation reports, resource guides, classes and training, and others. Implementing these services has allowed the libraries to expand their roles on campus and to contribute more directly to the research missions of their institutions. Libraries can leverage a variety of evaluation support services as an opportunity to successfully meet an array of challenges confronting the biomedical research community, including robust efforts to report and demonstrate tangible and meaningful outcomes of biomedical research and clinical care. These services represent a transformative direction that can be emulated by other biomedical and research libraries.
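
    The automated rank-based metrics mentioned above are typically built from per-paper citation counts. As a small illustration (not the libraries' actual tool), the following computes one common rank-based metric, the h-index, from a plain list of citation counts.

        def h_index(citation_counts):
            """h-index: the largest h such that h papers each have at least h citations."""
            ranked = sorted(citation_counts, reverse=True)
            h = 0
            for rank, citations in enumerate(ranked, start=1):
                if citations >= rank:
                    h = rank
                else:
                    break
            return h

        print(h_index([10, 8, 5, 4, 3]))  # -> 4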

  1. 76 FR 24889 - Submission for OMB Review; Comment Request; Cancer Biomedical Informatics Grid® (caBIG®) Support...

    Science.gov (United States)

    2011-05-03

    ... Biomedical Informatics Grid (caBIG ) Support Service Provider (SSP) Program (NCI) Summary: Under the... control number. Proposed Collection: Title: cancer Biomedical Informatics Grid (caBIG ) Support Service... OMB Number. Need and Use of Information Collection: The NCI Center for Biomedical Informatics and...

  2. A Semantic Web management model for integrative biomedical informatics.

    Directory of Open Access Journals (Sweden)

    Helena F Deus

    2008-08-01

    Full Text Available Data, data everywhere. The diversity and magnitude of the data generated in the Life Sciences defies automated articulation among complementary efforts. The additional need in this field for managing property and access permissions compounds the difficulty very significantly. This is particularly the case when the integration involves multiple domains and disciplines, even more so when it includes clinical and high throughput molecular data. The emergence of Semantic Web technologies brings the promise of meaningful interoperation between data and analysis resources. In this report we identify a core model for biomedical Knowledge Engineering applications and demonstrate how this new technology can be used to weave a management model where multiple intertwined data structures can be hosted and managed by multiple authorities in a distributed management infrastructure. Specifically, the demonstration is performed by linking data sources associated with the Lung Cancer SPORE awarded to The University of Texas MD Anderson Cancer Center at Houston and the Southwestern Medical Center at Dallas. A software prototype, available with open source at www.s3db.org, was developed and its proposed design has been made publicly available as an open source instrument for shared, distributed data management. The Semantic Web technologies have the potential to address the need for distributed and evolvable representations that are critical for systems biology and translational biomedical research. As this technology is incorporated into application development we can expect that both general purpose productivity software and domain specific software installed on our personal computers will become increasingly integrated with the relevant remote resources. In this scenario, the acquisition of a new dataset should automatically trigger the delegation of its analysis.
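
    S3DB's own data model is not reproduced here, but the general Semantic Web pattern the report relies on can be sketched with the rdflib library: facts curated by different authorities live in separate RDF graphs that can be merged and queried together. The namespace, predicates and patient identifier below are hypothetical examples.

        from rdflib import Graph, Literal, Namespace

        EX = Namespace("http://example.org/spore/")  # hypothetical namespace, not the S3DB schema

        clinical = Graph()   # e.g. maintained by the clinical data authority
        clinical.add((EX.patient42, EX.hasDiagnosis, Literal("lung adenocarcinoma")))

        molecular = Graph()  # e.g. maintained by the molecular profiling group
        molecular.add((EX.patient42, EX.hasMutation, EX.EGFR_L858R))

        # A downstream consumer merges the two graphs and queries across them.
        merged = Graph()
        for g in (clinical, molecular):
            for triple in g:
                merged.add(triple)

        query = """
        PREFIX ex: <http://example.org/spore/>
        SELECT ?diagnosis ?mutation WHERE {
            ?p ex:hasDiagnosis ?diagnosis .
            ?p ex:hasMutation ?mutation .
        }
        """
        for row in merged.query(query):
            print(row.diagnosis, row.mutation)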

  3. Optical Polarization in Biomedical Applications

    CERN Document Server

    Tuchin, Valery V; Zimnyakov, Dmitry A

    2006-01-01

    Optical Polarization in Biomedical Applications introduces key developments in optical polarization methods for quantitative studies of tissues, while presenting the theory of polarization transfer in a random medium as a basis for the quantitative description of polarized light interaction with tissues. This theory uses the modified transfer equation for Stokes parameters and predicts the polarization structure of multiply scattered optical fields. The backscattering polarization matrices (Jones matrix and Mueller matrix) important for noninvasive medical diagnostics are introduced. The text also describes a number of diagnostic techniques such as CW polarization imaging and spectroscopy, polarization microscopy and cytometry. As a new tool for medical diagnosis, optical coherent polarization tomography is analyzed. The monograph also covers a range of biomedical applications, among them cataract and glaucoma diagnostics, glucose sensing, and the detection of bacteria.
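
    The Stokes/Mueller formalism mentioned above is easy to illustrate numerically. The sketch below applies the standard textbook Mueller matrix of an ideal linear polarizer (a generic example, not a calculation from the monograph) to an unpolarized Stokes vector.

        import numpy as np

        # Mueller matrix of an ideal linear polarizer with its transmission axis at 0 degrees
        # (standard textbook form, used purely as an illustration).
        M_polarizer = 0.5 * np.array([
            [1, 1, 0, 0],
            [1, 1, 0, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0],
        ], dtype=float)

        S_in = np.array([1.0, 0.0, 0.0, 0.0])  # Stokes vector (I, Q, U, V) of unpolarized light
        S_out = M_polarizer @ S_in

        print(S_out)  # [0.5 0.5 0.  0. ]: half the intensity, fully linearly polarized
        print("degree of polarization:", np.linalg.norm(S_out[1:]) / S_out[0])  # 1.0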

  4. Biomedical Shape Memory Polymers

    Directory of Open Access Journals (Sweden)

    SHEN Xue-lin

    2017-07-01

    Full Text Available Shape memory polymers (SMPs) are a class of functional "smart" materials that have shown bright prospects in the area of biomedical applications. Novel smart materials combining biodegradability and biocompatibility can be designed based on their general principles, composition and structure. In this review, the latest progress on three typical biodegradable SMPs (poly(lactic acid), poly(ε-caprolactone) and polyurethane) is summarized. These three SMPs are classified by structure and discussed, their shape-memory mechanisms, recovery and fixity rates, and response speeds are analysed in detail, and some biomedical applications are presented. Finally, the future development and applications of SMPs are considered: two-way SMPs and body-temperature-induced SMPs will be a focus of attention for researchers.

  5. Biomedical Applications of Graphene

    Science.gov (United States)

    Shen, He; Zhang, Liming; Liu, Min; Zhang, Zhijun

    2012-01-01

    Graphene exhibits a unique 2-D structure and exceptional physical and chemical properties that lead to many potential applications. Among these, biomedical applications of graphene have attracted ever-increasing interest over the last three years. In this review, we present an overview of current advances in applications of graphene in biomedicine, focusing on drug delivery, cancer therapy and biological imaging, together with a brief discussion of the challenges and perspectives for future research in this field. PMID:22448195

  6. Multilingual biomedical dictionary.

    Science.gov (United States)

    Daumke, Philipp; Markó, Kornél; Poprat, Michael; Schulz, Stefan

    2005-01-01

    We present a unique technique to create a multilingual biomedical dictionary, based on a methodology called Morpho-Semantic indexing. Our approach closes a gap caused by the absence of freely available multilingual medical dictionaries and the lack of accuracy of non-medical electronic translation tools. We first explain the underlying technology, followed by a description of the dictionary interface, which makes use of a multilingual subword thesaurus and of statistical information from a domain-specific, multilingual corpus.

  7. Cloud Based Metalearning System for Predictive Modeling of Biomedical Data

    Directory of Open Access Journals (Sweden)

    Milan Vukićević

    2014-01-01

    Full Text Available Rapid growth and storage of biomedical data has enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selecting the best predictive algorithms for the data at hand with open-source big data technologies for the analysis of biomedical data.
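
    The published system ranks algorithms using a metalearning framework on cloud infrastructure, which is not reproduced here; as a minimal stand-in for the ranking step, the sketch below cross-validates a few candidate scikit-learn classifiers on a bundled dataset and orders them by mean score.

        from sklearn.datasets import load_breast_cancer
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.naive_bayes import GaussianNB
        from sklearn.tree import DecisionTreeClassifier

        X, y = load_breast_cancer(return_X_y=True)
        candidates = {
            "logistic_regression": LogisticRegression(max_iter=5000),
            "naive_bayes": GaussianNB(),
            "decision_tree": DecisionTreeClassifier(random_state=0),
        }

        # Rank candidate algorithms by mean 5-fold cross-validated accuracy.
        scores = {name: cross_val_score(model, X, y, cv=5).mean() for name, model in candidates.items()}
        for name, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
            print(f"{name}: {score:.3f}")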

  8. Models for Predicting and Explaining Citation Count of Biomedical Articles

    OpenAIRE

    Fu, Lawrence D.; Aliferis, Constantin

    2008-01-01

    The single most important bibliometric criterion for judging the impact of biomedical papers and their authors’ work is the number of citations received, commonly referred to as the “citation count”. This metric, however, is unavailable until several years after publication time. In the present work, we build computer models that accurately predict citation counts of biomedical publications within a deep horizon of ten years using only predictive information available at publication time. O...
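
    The authors' actual predictive models are not reproduced here; the following sketch only illustrates the general setup on synthetic data, regressing a ten-year citation count on hypothetical publication-time features.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 500
        # Synthetic publication-time features: journal impact proxy, author count, abstract length.
        X = np.column_stack([
            rng.gamma(2.0, 2.0, n),
            rng.integers(1, 15, n),
            rng.normal(250.0, 50.0, n),
        ])
        # Synthetic 10-year citation counts loosely driven by the first two features.
        y = np.round(3 * X[:, 0] + 0.5 * X[:, 1] + rng.poisson(5, n))

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
        print("R^2 on held-out synthetic data:", round(model.score(X_test, y_test), 3))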

  9. Computer Center CDC Libraries.

    Science.gov (United States)

    1984-06-01

    [OCR-damaged excerpt from the library routine index. Recoverable entries include LINPACK/EISPACK-style routines such as SPPCO, SPPFA, SQRDC, SQRSL, FABSV, FCOMB, the VMULF* family, FIGI/FIGI2 (form the symmetric tridiagonal matrix determined by FIGI), BALANC (balance a real general matrix), BALBAK (back transform the eigenvectors of that real matrix), ELMHES (reduce a real general matrix to upper Hessenberg form using elementary transformations), and ELTRAN (accumulate the transformations in the reduction of a real general matrix by ELMHES).]

  10. Computational optical biomedical spectroscopy and imaging

    CERN Document Server

    Musa, Sarhan M

    2015-01-01

    Applications of Vibrational Spectroscopic Imaging in Personal Care Studies; Guojin Zhang, Roger L. McMullen, Richard Mendelsohn, and Osama M. MusaFluorescence Bioimaging with Applications to Chemistry; Ufana Riaz and S.M. AshrafNew Trends in Immunohistochemical, Genome, and Metabolomics Imaging; G. Livanos, Aditi Deshpande, C. Narayan, Ying Na, T. Quang, T. Farrahi, R. Koglin, Suman Shrestha, M. Zervakis, and George C. GiakosDeveloping a Comprehensive Taxonomy for Human Cell Types; Richard Conroy and Vinay PaiFunctional Near-Infrared S

  11. Opal web services for biomedical applications.

    Science.gov (United States)

    Ren, Jingyuan; Williams, Nadya; Clementi, Luca; Krishnan, Sriram; Li, Wilfred W

    2010-07-01

    Biomedical applications have become increasingly complex, and they often require large-scale high-performance computing resources with a large number of processors and memory. The complexity of application deployment and the advances in cluster, grid and cloud computing require new modes of support for biomedical research. Scientific Software as a Service (sSaaS) enables scalable and transparent access to biomedical applications through simple standards-based Web interfaces. Towards this end, we built a production web server (http://ws.nbcr.net) in August 2007 to support the bioinformatics application called MEME. The server has grown since to include docking analysis with AutoDock and AutoDock Vina, electrostatic calculations using PDB2PQR and APBS, and off-target analysis using SMAP. All the applications on the servers are powered by Opal, a toolkit that allows users to wrap scientific applications easily as web services without any modification to the scientific codes, by writing simple XML configuration files. Opal allows both web forms-based access and programmatic access of all our applications. The Opal toolkit currently supports SOAP-based Web service access to a number of popular applications from the National Biomedical Computation Resource (NBCR) and affiliated collaborative and service projects. In addition, Opal's programmatic access capability allows our applications to be accessed through many workflow tools, including Vision, Kepler, Nimrod/K and VisTrails. From mid-August 2007 to the end of 2009, we have successfully executed 239,814 jobs. The number of successfully executed jobs more than doubled from 205 to 411 per day between 2008 and 2009. The Opal-enabled service model is useful for a wide range of applications. It provides for interoperation with other applications with Web Service interfaces, and allows application developers to focus on the scientific tool and workflow development. Web server availability: http://ws.nbcr.net.
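
    Programmatic access to a SOAP service like the ones Opal exposes is usually driven from the service's WSDL. The snippet below is a generic, hypothetical sketch using the third-party zeep client; the WSDL URL is a placeholder, not the documented Opal interface, which should be taken from the NBCR/Opal documentation.

        from zeep import Client  # third-party SOAP client: pip install zeep

        # Placeholder WSDL location (hypothetical, for illustration only).
        client = Client("http://ws.nbcr.net/example/OpalService?wsdl")

        # Inspect what the WSDL actually exposes before calling anything.
        client.wsdl.dump()

        # A call would then look like client.service.someOperation(...), with the
        # operation name and message structure defined by the real WSDL.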

  12. Building a biomedical ontology recommender web service

    Directory of Open Access Journals (Sweden)

    Jonquet Clement

    2010-06-01

    Full Text Available Abstract Background Researchers in biomedical informatics use ontologies and terminologies to annotate their data in order to facilitate data integration and translational discoveries. As the use of ontologies for annotation of biomedical datasets has risen, a common challenge is to identify ontologies that are best suited to annotating specific datasets. The number and variety of biomedical ontologies is large, and it is cumbersome for a researcher to figure out which ontology to use. Methods We present the Biomedical Ontology Recommender web service. The system uses textual metadata or a set of keywords describing a domain of interest and suggests appropriate ontologies for annotating or representing the data. The service makes a decision based on three criteria. The first one is coverage, or the ontologies that provide the most terms covering the input text. The second is connectivity, or the ontologies that are most often mapped to by other ontologies. The final criterion is size, or the number of concepts in the ontologies. The service scores the ontologies as a function of the scores of the annotations created using the National Center for Biomedical Ontology (NCBO) Annotator web service. We used all the ontologies from the UMLS Metathesaurus and the NCBO BioPortal. Results We compare and contrast our Recommender through an exhaustive functional comparison to previously published efforts. We evaluate and discuss the results of several recommendation heuristics in the context of three real world use cases. The best recommendation heuristics, rated ‘very relevant’ by expert evaluators, are the ones based on the coverage and connectivity criteria. The Recommender service (alpha version) is available to the community and is embedded into BioPortal.
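
    The scoring idea (coverage, connectivity and size combined into one ranking) can be sketched as a weighted sum of normalized criteria. The weights and normalization below are illustrative assumptions, not the published NCBO heuristics.

        def recommend(ontologies, weights=(0.6, 0.3, 0.1)):
            """Rank candidate ontologies by a weighted, max-normalized combination of
            coverage (terms matched in the input text), connectivity (how often other
            ontologies map to it) and size (number of concepts). The weights are
            illustrative assumptions, not the published NCBO heuristic."""
            w_cov, w_con, w_size = weights

            def normalized(key):
                top = max(o[key] for o in ontologies) or 1
                return {o["name"]: o[key] / top for o in ontologies}

            cov, con, size = normalized("coverage"), normalized("connectivity"), normalized("size")
            scored = [(w_cov * cov[o["name"]] + w_con * con[o["name"]] + w_size * size[o["name"]],
                       o["name"]) for o in ontologies]
            return sorted(scored, reverse=True)

        candidates = [  # made-up numbers for illustration
            {"name": "SNOMED CT", "coverage": 120, "connectivity": 40, "size": 300000},
            {"name": "NCI Thesaurus", "coverage": 90, "connectivity": 55, "size": 100000},
            {"name": "Gene Ontology", "coverage": 30, "connectivity": 70, "size": 45000},
        ]
        print(recommend(candidates))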

  13. Acquisition and manipulation of computed tomography images of the maxillofacial region for biomedical prototyping

    Directory of Open Access Journals (Sweden)

    Maria Inês Meurer

    2008-02-01

    Full Text Available Biomedical prototyping has resulted from a merger of rapid prototyping and imaging diagnosis technologies. However, this process is complex, considering the necessary interaction between the biomedical sciences and engineering. Good results are highly dependent on the acquisition of computed tomography images and their subsequent manipulation by means of specific software. The present study describes the experience of a multidisciplinary group of researchers in the acquisition and manipulation of computed tomography images of the maxillofacial region aiming at biomedical prototyping for surgical purposes.

  14. Center for Programming Models for Scalable Parallel Computing - Towards Enhancing OpenMP for Manycore and Heterogeneous Nodes

    Energy Technology Data Exchange (ETDEWEB)

    Barbara Chapman

    2012-02-01

    OpenMP was not well recognized at the beginning of the project, around 2003, because of its limited use in DoE production applications and the immature hardware support for an efficient implementation. Yet in recent years it has been gradually adopted both in HPC applications, mostly in the form of MPI+OpenMP hybrid code, and in mid-scale desktop applications for scientific and experimental studies. We have observed this trend and worked diligently to improve our OpenMP compiler and runtimes, as well as to work with the OpenMP standards organization to make sure OpenMP evolves in a direction aligned with DoE missions. In the Center for Programming Models for Scalable Parallel Computing project, the HPCTools team at the University of Houston (UH), directed by Dr. Barbara Chapman, has been working with project partners, external collaborators and hardware vendors to increase the scalability and applicability of OpenMP for multi-core (and future manycore) platforms and for distributed memory systems by exploring different programming models, language extensions, compiler optimizations, as well as runtime library support.

  15. Exploring the Effects of Student-Centered Project-Based Learning with Initiation on Students' Computing Skills: A Quasi-Experimental Study of Digital Storytelling

    Science.gov (United States)

    Tsai, Chia-Wen; Shen, Pei-Di; Lin, Rong-An

    2015-01-01

    This study investigated, via quasi-experiments, the effects of student-centered project-based learning with initiation (SPBL with Initiation) on the development of students' computing skills. In this study, 96 elementary school students were selected from four class sections taking a course titled "Digital Storytelling" and were assigned…

  16. The combinatorics computation for Casimir operators of the symplectic Lie algebra and the application for determining the center of the enveloping algebra of a semidirect product

    International Nuclear Information System (INIS)

    Le Van Hop.

    1989-12-01

    A combinatorial computation is used to describe the Casimir operators of the symplectic Lie algebra. This result is applied to determine the center of the enveloping algebra of the semidirect product of the Heisenberg Lie algebra and the symplectic Lie algebra. (author). 10 refs

  17. IEEE International Symposium on Biomedical Imaging.

    Science.gov (United States)

    2017-01-01

    The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative from the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials, and a scientific program composed of plenary talks, invited special sessions, challenges, as well as oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings published by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.

  18. Biomedical signals and systems

    CERN Document Server

    Tranquillo, Joseph V

    2013-01-01

    Biomedical Signals and Systems is meant to accompany a one-semester undergraduate signals and systems course. It may also serve as a quick-start for graduate students or faculty interested in how signals and systems techniques can be applied to living systems. The biological nature of the examples allows for systems thinking to be applied to electrical, mechanical, fluid, chemical, thermal and even optical systems. Each chapter focuses on a topic from classic signals and systems theory: System block diagrams, mathematical models, transforms, stability, feedback, system response, control, time

  19. Statistics in biomedical research

    Directory of Open Access Journals (Sweden)

    González-Manteiga, Wenceslao

    2007-06-01

    Full Text Available The discipline of biostatistics is nowadays a fundamental scientific component of biomedical, public health and health services research. Traditional and emerging areas of application include clinical trials research, observational studies, physiology, imaging, and genomics. The present article reviews the current situation of biostatistics, considering the statistical methods traditionally used in biomedical research, as well as the ongoing development of new methods in response to the new problems arising in medicine. Clearly, the successful application of statistics in biomedical research requires appropriate training of biostatisticians. This training should aim to give due consideration to emerging new areas of statistics, while at the same time retaining full coverage of the fundamentals of statistical theory and methodology. In addition, it is important that students of biostatistics receive formal training in relevant biomedical disciplines, such as epidemiology, clinical trials, molecular biology, genetics, and neuroscience.

  20. Biomedical photonics handbook

    CERN Document Server

    Vo-Dinh, Tuan

    2003-01-01

    1.Biomedical Photonics: A Revolution at the Interface of Science and Technology, T. Vo-DinhPHOTONICS AND TISSUE OPTICS2.Optical Properties of Tissues, J. Mobley and T. Vo-Dinh3.Light-Tissue Interactions, V.V. Tuchin 4.Theoretical Models and Algorithms in Optical Diffusion Tomography, S.J. Norton and T. Vo-DinhPHOTONIC DEVICES5.Laser Light in Biomedicine and the Life Sciences: From the Present to the Future, V.S. Letokhov6.Basic Instrumentation in Photonics, T. Vo-Dinh7.Optical Fibers and Waveguides for Medical Applications, I. Gannot and

  1. Radiochemicals in biomedical research

    International Nuclear Information System (INIS)

    Evans, E.A.; Oldham, K.G.

    1988-01-01

    This volume describes the role of radiochemicals in biomedical research: as tracers in the development of new drugs, in studies of their interaction and function with receptor proteins and of the kinetics of binding in hormone-receptor interactions, and in cancer research and clinical oncology. The book also aims to identify future trends in this research, the main objective of which is to provide information leading to improvements in the quality of life, and to give readers a basic understanding of how new drugs are developed, how they function in relation to receptor proteins, and how they lead to a better understanding of the diagnosis and treatment of cancers. (author)

  2. Biomedical research competencies for osteopathic medical students.

    Science.gov (United States)

    Cruser, des Anges; Dubin, Bruce; Brown, Sarah K; Bakken, Lori L; Licciardone, John C; Podawiltz, Alan L; Bulik, Robert J

    2009-10-13

    Without systematic exposure to biomedical research concepts or applications, osteopathic medical students may be generally under-prepared to efficiently consume and effectively apply research and evidence-based medicine information in patient care. The academic literature suggests that although medical residents are increasingly expected to conduct research in their postgraduate training specialties, they generally have limited understanding of research concepts. With grant support from the National Center for Complementary and Alternative Medicine, and a grant from the Osteopathic Heritage Foundation, the University of North Texas Health Science Center (UNTHSC) is incorporating research education in the osteopathic medical school curriculum. The first phase of this research education project involved a baseline assessment of students' understanding of targeted research concepts. This paper reports the results of that assessment and discusses implications for research education during medical school. Using a novel set of research competencies supported by the literature as needed for understanding research information, we created a questionnaire to measure students' confidence and understanding of selected research concepts. Three matriculating medical school classes completed the on-line questionnaire. Data were analyzed for differences between groups using analysis of variance and t-tests. Correlation coefficients were computed for the confidence and applied understanding measures. We performed a principal component factor analysis of the confidence items, and used multiple regression analyses to explore how confidence might be related to the applied understanding. Of 496 total incoming, first, and second year medical students, 354 (71.4%) completed the questionnaire. Incoming students expressed significantly more confidence than first or second year students (F = 7.198, df = 2, 351, P = 0.001) in their ability to understand the research concepts. Factor analyses

  3. Big biomedical data as the key resource for discovery science.

    Science.gov (United States)

    Toga, Arthur W; Foster, Ian; Kesselman, Carl; Madduri, Ravi; Chard, Kyle; Deutsch, Eric W; Price, Nathan D; Glusman, Gustavo; Heavner, Benjamin D; Dinov, Ivo D; Ames, Joseph; Van Horn, John; Kramer, Roger; Hood, Leroy

    2015-11-01

    Modern biomedical data collection is generating exponentially more data in a multitude of formats. This flood of complex data poses significant opportunities to discover and understand the critical interplay among such diverse domains as genomics, proteomics, metabolomics, and phenomics, including imaging, biometrics, and clinical data. The Big Data for Discovery Science Center is taking an "-ome to home" approach to discover linkages between these disparate data sources by mining existing databases of proteomic and genomic data, brain images, and clinical assessments. In support of this work, the authors developed new technological capabilities that make it easy for researchers to manage, aggregate, manipulate, integrate, and model large amounts of distributed data. Guided by biological domain expertise, the Center's computational resources and software will reveal relationships and patterns, aiding researchers in identifying biomarkers for the most confounding conditions and diseases, such as Parkinson's and Alzheimer's. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Predicting biomedical document access as a function of past use.

    Science.gov (United States)

    Goodwin, J Caleb; Johnson, Todd R; Cohen, Trevor; Herskovic, Jorge R; Bernstam, Elmer V

    2012-01-01

    To determine whether past access to biomedical documents can predict future document access. The authors used 394 days of query log (August 1, 2009 to August 29, 2010) from PubMed users in the Texas Medical Center, which is the largest medical center in the world. The authors evaluated two document access models based on the work of Anderson and Schooler. The first is based on how frequently a document was accessed. The second is based on both frequency and recency. The model based only on frequency of past access was highly correlated with the empirical data (R²=0.932), whereas the model based on frequency and recency had a much lower correlation (R²=0.668). The frequency-only model accurately predicted whether a document will be accessed based on past use. Modeling accesses as a function of frequency requires storing only the number of accesses and the creation date for the document. This model requires low storage overheads and is computationally efficient, making it scalable to large corpora such as MEDLINE. It is feasible to accurately model the probability of a document being accessed in the future based on past accesses.
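
    The frequency-only model relates the probability of a future access to how often a document was accessed in the past. The sketch below estimates that relation empirically from a toy access log (the PubMed query logs themselves are obviously not reproducible here).

        from collections import Counter

        # Toy access log: document IDs in the order they were requested.
        past_log = ["d1", "d2", "d1", "d3", "d1", "d2", "d4"]
        future_log = ["d1", "d2", "d1", "d5"]

        past_freq = Counter(past_log)
        future_hits = Counter(future_log)

        # Empirical future-access rate grouped by past-access frequency,
        # the kind of relation the frequency-only model captures.
        by_freq = {}
        for doc, f in past_freq.items():
            accessed_again = 1 if future_hits[doc] > 0 else 0
            hits, total = by_freq.get(f, (0, 0))
            by_freq[f] = (hits + accessed_again, total + 1)

        for f in sorted(by_freq):
            hits, total = by_freq[f]
            print(f"past accesses = {f}: P(future access) ~ {hits / total:.2f}")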

  5. Handbook on advanced design and manufacturing technologies for biomedical devices

    CERN Document Server

    2013-01-01

    The last decades have seen remarkable advances in computer-aided design, engineering and manufacturing technologies, multi-variable simulation tools, medical imaging, biomimetic design, rapid prototyping, micro and nanomanufacturing methods and information management resources, all of which provide new horizons for the Biomedical Engineering fields and the Medical Device Industry. Handbook on Advanced Design and Manufacturing Technologies for Biomedical Devices covers such topics in depth, with an applied perspective and providing several case studies that help to analyze and understand the key factors of the different stages linked to the development of a novel biomedical device, from the conceptual and design steps, to the prototyping and industrialization phases. Main research challenges and future potentials are also discussed, taking into account relevant social demands and a growing market already exceeding billions of dollars. In time, advanced biomedical devices will decisively change methods and resu...

  6. [3D visualization and information interaction in biomedical applications].

    Science.gov (United States)

    Pu, F; Fan, Y; Jiang, W; Zhang, M; Mak, A F; Chen, J

    2001-06-01

    3D visualization and virtual reality are important trends in the development of modern science and technology, as well as in biomedical engineering research. This paper presents a computer procedure developed for 3D visualization in biomedical applications. The biomedical models are constructed from slice sequences based on polygon cells, and information interaction is realized on the basis of the OpenGL selection mode, with particular consideration of the specialties of this field, such as geometric irregularity and material complexity. The software developed has functions for 3D model construction and visualization, real-time modeling transformation, information interaction and so on. It could serve as a useful platform for 3D visualization in biomedical engineering research.

  7. Evolving technologies drive the new roles of Biomedical Engineering.

    Science.gov (United States)

    Frisch, P H; St Germain, J; Lui, W

    2008-01-01

    Rapidly changing technology, coupled with the financial impact of organized health care, has required hospital Biomedical Engineering organizations to augment their traditional operational and business models to increase their role in developing enhanced clinical applications utilizing new and evolving technologies. The deployment of these technology-based applications has required Biomedical Engineering organizations to re-organize to optimize the manner in which they provide and manage services. Memorial Sloan-Kettering Cancer Center has implemented a strategy to explore evolving technologies, integrating them into enhanced clinical applications while optimally utilizing the expertise of the traditional Biomedical Engineering component (Clinical Engineering) to provide expanded support in technology/equipment management, device repair, preventive maintenance and integration with legacy clinical systems. Specifically, Biomedical Engineering is an integral component of the Medical Physics Department, which provides comprehensive and integrated support to the Center in advanced physical, technical and engineering technology. This organizational structure emphasizes the integration and collaboration between a spectrum of technical expertise for clinical support and equipment management roles. The high cost of clinical equipment purchases coupled with the increasing cost of service has driven equipment management responsibilities to include significant business and financial aspects to provide a cost-effective service model. This case study details the dynamics of these expanded roles, future initiatives and benefits for Biomedical Engineering and Memorial Sloan-Kettering Cancer Center.

  8. A concept-driven biomedical knowledge extraction and visualization framework for conceptualization of text corpora.

    Science.gov (United States)

    Jahiruddin; Abulaish, Muhammad; Dey, Lipika

    2010-12-01

    A number of techniques such as information extraction, document classification, document clustering and information visualization have been developed to ease extraction and understanding of information embedded within text documents. However, knowledge that is embedded in natural language texts is difficult to extract using simple pattern matching techniques and most of these methods do not help users directly understand key concepts and their semantic relationships in document corpora, which are critical for capturing their conceptual structures. The problem arises due to the fact that most of the information is embedded within unstructured or semi-structured texts that computers can not interpret very easily. In this paper, we have presented a novel Biomedical Knowledge Extraction and Visualization framework, BioKEVis to identify key information components from biomedical text documents. The information components are centered on key concepts. BioKEVis applies linguistic analysis and Latent Semantic Analysis (LSA) to identify key concepts. The information component extraction principle is based on natural language processing techniques and semantic-based analysis. The system is also integrated with a biomedical named entity recognizer, ABNER, to tag genes, proteins and other entity names in the text. We have also presented a method for collating information extracted from multiple sources to generate semantic network. The network provides distinct user perspectives and allows navigation over documents with similar information components and is also used to provide a comprehensive view of the collection. The system stores the extracted information components in a structured repository which is integrated with a query-processing module to handle biomedical queries over text documents. We have also proposed a document ranking mechanism to present retrieved documents in order of their relevance to the user query. Copyright © 2010 Elsevier Inc. All rights reserved.
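
    BioKEVis combines linguistic analysis, entity tagging with ABNER and Latent Semantic Analysis; only the LSA step is easy to sketch in isolation. Assuming scikit-learn and a three-document toy corpus, the snippet below derives latent concept dimensions from a TF-IDF matrix via truncated SVD.

        from sklearn.decomposition import TruncatedSVD
        from sklearn.feature_extraction.text import TfidfVectorizer

        docs = [
            "EGFR mutations predict response to tyrosine kinase inhibitors in lung cancer",
            "Tyrosine kinase inhibitors target EGFR signalling pathways",
            "Aspirin reduces inflammation through COX enzyme inhibition",
        ]

        tfidf = TfidfVectorizer(stop_words="english")
        X = tfidf.fit_transform(docs)

        # Truncated SVD of the TF-IDF matrix yields latent "concept" dimensions.
        svd = TruncatedSVD(n_components=2, random_state=0).fit(X)
        terms = tfidf.get_feature_names_out()
        for k, component in enumerate(svd.components_):
            top = component.argsort()[::-1][:4]
            print(f"latent concept {k}:", [terms[i] for i in top])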

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months, activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production, and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  11. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge (CSA06) last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  12. Professional Identification for Biomedical Engineers

    Science.gov (United States)

    Long, Francis M.

    1973-01-01

    Discusses four methods of professional identification in biomedical engineering including registration, certification, accreditation, and possible membership qualification of the societies. Indicates that the destiny of the biomedical engineer may be under the control of a new profession, neither the medical nor the engineering. (CC)

  13. Egyptian Journal of Biomedical Sciences

    African Journals Online (AJOL)

    The Egyptian Journal of Biomedical Sciences publishes all aspects of biomedical research sciences. Both basic and clinical research papers are welcomed. Vol 23 (2007), Table of Contents, Articles: Phytochemical And ...

  14. 78 FR 4419 - Center for Scientific Review; Notice of Closed Meetings

    Science.gov (United States)

    2013-01-22

    ...: Center for Scientific Review Special Emphasis Panel, Biomedical Imaging and Engineering Area Review. Date... . Name of Committee: Center for Scientific Review Special Emphasis Panel, Member Conflict: Nanotechnology...

  15. Biomedical informatics and translational medicine

    Directory of Open Access Journals (Sweden)

    Sarkar Indra

    2010-02-01

    Full Text Available Abstract Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.

  16. Biomedical informatics and translational medicine.

    Science.gov (United States)

    Sarkar, Indra Neil

    2010-02-26

    Biomedical informatics involves a core set of methodologies that can provide a foundation for crossing the "translational barriers" associated with translational medicine. To this end, the fundamental aspects of biomedical informatics (e.g., bioinformatics, imaging informatics, clinical informatics, and public health informatics) may be essential in helping improve the ability to bring basic research findings to the bedside, evaluate the efficacy of interventions across communities, and enable the assessment of the eventual impact of translational medicine innovations on health policies. Here, a brief description is provided for a selection of key biomedical informatics topics (Decision Support, Natural Language Processing, Standards, Information Retrieval, and Electronic Health Records) and their relevance to translational medicine. Based on contributions and advancements in each of these topic areas, the article proposes that biomedical informatics practitioners ("biomedical informaticians") can be essential members of translational medicine teams.

  17. Biomedical applications of nanotechnology.

    Science.gov (United States)

    Ramos, Ana P; Cruz, Marcos A E; Tovani, Camila B; Ciancaglini, Pietro

    2017-04-01

    The ability to investigate substances at the molecular level has boosted the search for materials with outstanding properties for use in medicine. The application of these novel materials has generated the new research field of nanobiotechnology, which plays a central role in disease diagnosis, drug design and delivery, and implants. In this review, we provide an overview of the use of metallic and metal oxide nanoparticles, carbon-nanotubes, liposomes, and nanopatterned flat surfaces for specific biomedical applications. The chemical and physical properties of the surface of these materials allow their use in diagnosis, biosensing and bioimaging devices, drug delivery systems, and bone substitute implants. The toxicology of these particles is also discussed in the light of a new field referred to as nanotoxicology that studies the surface effects emerging from nanostructured materials.

  18. Unsupervised Structure Detection in Biomedical Data.

    Science.gov (United States)

    Vogt, Julia E

    2015-01-01

    A major challenge in computational biology is to find simple representations of high-dimensional data that best reveal the underlying structure. In this work, we present an intuitive and easy-to-implement method based on ranked neighborhood comparisons that detects structure in unsupervised data. The method is based on ordering objects in terms of similarity and on the mutual overlap of nearest neighbors. This basic framework was originally introduced in the field of social network analysis to detect actor communities. We demonstrate that the same ideas can successfully be applied to biomedical data sets in order to reveal complex underlying structure. The algorithm is very efficient and works on distance data directly without requiring a vectorial embedding of data. Comprehensive experiments demonstrate the validity of this approach. Comparisons with state-of-the-art clustering methods show that the presented method outperforms hierarchical methods as well as density based clustering methods and model-based clustering. A further advantage of the method is that it simultaneously provides a visualization of the data. Especially in biomedical applications, the visualization of data can be used as a first pre-processing step when analyzing real world data sets to get an intuition of the underlying data structure. We apply this model to synthetic data as well as to various biomedical data sets which demonstrate the high quality and usefulness of the inferred structure.
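
    The record above describes structure detection by ranked neighborhood comparison. A minimal sketch of that idea, assuming a precomputed distance matrix and an illustrative choice of k (the function names and toy data are not from the cited work):

```python
import numpy as np

def knn_sets(dist, k):
    """For each object, return the set of indices of its k nearest neighbours
    (excluding the object itself), based on a precomputed distance matrix."""
    n = dist.shape[0]
    neighbours = []
    for i in range(n):
        order = np.argsort(dist[i])
        order = order[order != i][:k]
        neighbours.append(set(order))
    return neighbours

def shared_neighbour_similarity(dist, k=5):
    """Score each pair of objects by the overlap (Jaccard index) of their
    k-nearest-neighbour sets; larger values suggest shared local structure."""
    nn = knn_sets(dist, k)
    n = dist.shape[0]
    sim = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            overlap = len(nn[i] & nn[j]) / len(nn[i] | nn[j])
            sim[i, j] = sim[j, i] = overlap
    return sim

# Toy usage: two well-separated Gaussian blobs
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(8, 1, (20, 2))])
dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
sim = shared_neighbour_similarity(dist, k=5)
print(sim[0, 1] > sim[0, 25])  # within-blob overlap exceeds between-blob overlap
```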

  19. Cardiovascular system simulation in biomedical engineering education.

    Science.gov (United States)

    Rideout, V. C.

    1972-01-01

    Use of complex cardiovascular system models, in conjunction with a large hybrid computer, in biomedical engineering courses. A cardiovascular blood pressure-flow model, driving a compartment model for the study of dye transport, was set up on the computer for use as a laboratory exercise by students who did not have the computer experience or skill to be able to easily set up such a simulation involving some 27 differential equations running at 'real time' rate. The students were given detailed instructions regarding the model, and were then able to study effects such as those due to septal and valve defects upon the pressure, flow, and dye dilution curves. The success of this experiment in the use of involved models in engineering courses was such that it seems that this type of laboratory exercise might be considered for use in physiology courses as an adjunct to animal experiments.
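
    The course model described above ran some 27 differential equations on a hybrid computer; as a much smaller illustration of the same lumped-parameter style of pressure-flow simulation, a two-element Windkessel model with assumed, non-physiological-grade parameter values might be integrated as follows:

```python
import math

# Two-element Windkessel: C * dP/dt = Q_in(t) - P / R
# Illustrative parameter values only.
R = 1.0      # peripheral resistance, mmHg*s/mL
C = 1.5      # arterial compliance, mL/mmHg
HR = 75      # heart rate, beats per minute
T = 60 / HR  # cardiac period, s

def q_in(t):
    """Pulsatile inflow: a half-sine ejection during the first third of each beat."""
    phase = t % T
    systole = T / 3
    return 300 * math.sin(math.pi * phase / systole) if phase < systole else 0.0

def simulate(p0=80.0, dt=1e-3, t_end=10.0):
    """Forward-Euler integration of arterial pressure P(t)."""
    p, t, trace = p0, 0.0, []
    while t < t_end:
        dp = (q_in(t) - p / R) / C
        p += dp * dt
        t += dt
        trace.append((t, p))
    return trace

trace = simulate()
pressures = [p for _, p in trace[-int(1.0 / 1e-3):]]  # last simulated second
print(f"systolic ~ {max(pressures):.0f} mmHg, diastolic ~ {min(pressures):.0f} mmHg")
```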

  20. The use of AMS to the biomedical sciences

    International Nuclear Information System (INIS)

    Vogel, J.S.

    1991-04-01

    The Center for Accelerator Mass Spectroscopy (AMS) began making AMS measurements in 1989. Biomedical experiments were originally limited by sample preparation techniques, but we expect the number of biomedical samples to increase five-fold. While many of the detailed techniques for making biomedical measurements resemble those used in other fields, biological tracer experiments differ substantially from the observational approaches of earth science investigators. The role of xenobiotics in initiating mutations in cells is of particular interest. One measure of the damage caused to the genetic material is obtained by counting the number of adducts formed by a chemical agent at a given dose. AMS allows direct measurement of the number of adducts through stoichiometric quantification of the 14C label attached to the DNA after exposure to a labelled carcinogen. Other isotopes of interest include tritium, 36Cl, 79Se, 41Ca, 26Al and 129I. Our experiments with low dose environmental carcinogens reflect the protocols which will become a common part of biomedical AMS. In biomedical experiments, the researcher defines the carbon to be analyzed through dissection and/or chemical purification; thus the sample is "merely" combusted and graphitized at the AMS facility. However, since biomedical samples can have a 14C range of five orders of magnitude, preparation of graphite required construction of a special manifold to prevent cross-contamination. Additionally, a strain of 14C-depleted C57BL/6 mice is being developed to further reduce background in biomedical experiments. AMS has a bright and diverse future in radioisotope tracing. Such work requires a dedicated amalgamation of AMS scientists and biomedical researchers who will redesign experimental protocols to maximize the AMS technique and minimize the danger of catastrophic contamination. 18 refs., 4 figs., 1 tab

  1. The use of AMS to the biomedical sciences

    Energy Technology Data Exchange (ETDEWEB)

    Vogel, J.S.

    1991-04-01

    The Center for Accelerator Mass Spectroscopy (AMS) began making AMS measurements in 1989. Biomedical experiments were originally limited by sample preparation techniques, but we expect the number of biomedical samples to increase five-fold. While many of the detailed techniques for making biomedical measurements resemble those used in other fields, biological tracer experiments differ substantially from the observational approaches of earth science investigators. The role of xenobiotics in initiating mutations in cells is of particular interest. One measure of the damage caused to the genetic material is obtained by counting the number of adducts formed by a chemical agent at a given dose. AMS allows direct measurement of the number of adducts through stoichiometric quantification of the {sup 14}C label attached to the DNA after exposure to a labelled carcinogen. Other isotopes of interest include tritium, {sup 36}Cl, {sup 79}Se, {sup 41}Ca, {sup 26}Al and {sup 129}I. Our experiments with low dose environmental carcinogens reflect the protocols which will become a common part of biomedical AMS. In biomedical experiments, the researcher defines the carbon to be analyzed through dissection and/or chemical purification; thus the sample is "merely" combusted and graphitized at the AMS facility. However, since biomedical samples can have a {sup 14}C range of five orders of magnitude, preparation of graphite required construction of a special manifold to prevent cross-contamination. Additionally, a strain of {sup 14}C-depleted C57BL/6 mice is being developed to further reduce background in biomedical experiments. AMS has a bright and diverse future in radioisotope tracing. Such work requires a dedicated amalgamation of AMS scientists and biomedical researchers who will redesign experimental protocols to maximize the AMS technique and minimize the danger of catastrophic contamination. 18 refs., 4 figs., 1 tab.
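
    As a rough illustration of the stoichiometry mentioned in the records above, the sketch below converts a measured 14C/C ratio into an adduct frequency, assuming exactly one 14C atom per adduct and approximate textbook constants (neither the constants nor the function come from the cited report):

```python
# Back-of-the-envelope conversion from an AMS 14C/C ratio to a DNA adduct frequency,
# assuming the labelled carcinogen contributes exactly one 14C atom per adduct.
MODERN_14C_RATIO = 1.2e-12      # approximate contemporary 14C/C ratio
CARBONS_PER_NUCLEOTIDE = 9.75   # average carbons per DNA nucleotide (A, T, G, C mean)

def adducts_per_nucleotide(measured_ratio, background_ratio=MODERN_14C_RATIO):
    """Excess 14C atoms per carbon, scaled to a per-nucleotide adduct frequency."""
    excess_per_carbon = measured_ratio - background_ratio
    return excess_per_carbon * CARBONS_PER_NUCLEOTIDE

# Example: a sample measured at 10x the contemporary ratio
freq = adducts_per_nucleotide(1.2e-11)
print(f"~{freq:.2e} adducts per nucleotide "
      f"(~{freq * 1e9:.1f} adducts per 10^9 nucleotides)")
```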

  2. All India Seminar on Biomedical Engineering 2012

    CERN Document Server

    Bhatele, Mukta

    2013-01-01

    This book is a collection of articles presented by researchers and practitioners, including engineers, biologists, health professionals and informatics/computer scientists, interested in both theoretical advances and applications of information systems, artificial intelligence, signal processing, electronics and other engineering tools in areas related to biology and medicine in the All India Seminar on Biomedical Engineering 2012 (AISOBE 2012), organized by The Institution of Engineers (India), Jabalpur Local Centre, Jabalpur, India during November 3-4, 2012. The content of the book is useful to doctors, engineers, researchers and academicians as well as industry professionals.

  3. Proceedings of the symposium on bio-medical engineering and nuclear medicine

    International Nuclear Information System (INIS)

    Kataria, S.K.; Jindal, G.D.; Joshi, V.M.; Singh, B.

    2000-01-01

    In the symposium on biomedical engineering and nuclear medicine, the major topics covered were biomedical instrumentation and measurements, imaging and image processing, computer applications in medicine, modelling of bio-electric signals and nuclear medicine. Papers relevant to INIS are indexed separately

  4. COMPUTING

    CERN Multimedia

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  6. Development of an Instrument to Measure Health Center (HC) Personnel's Computer Use, Knowledge and Functionality Demand for HC Computerized Information System in Thailand

    OpenAIRE

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, the national knowledge associate with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel’s computer use, basic IT knowledge a...

  7. Semantic similarity in biomedical ontologies.

    Directory of Open Access Journals (Sweden)

    Catia Pesquita

    2009-07-01

    Full Text Available In recent years, ontologies have become a mainstream topic in biomedical research. When biological entities are described using a common schema, such as an ontology, they can be compared by means of their annotations. This type of comparison is called semantic similarity, since it assesses the degree of relatedness between two entities by the similarity in meaning of their annotations. The application of semantic similarity to biomedical ontologies is recent; nevertheless, several studies have been published in the last few years describing and evaluating diverse approaches. Semantic similarity has become a valuable tool for validating the results drawn from biomedical studies such as gene clustering, gene expression data analysis, prediction and validation of molecular interactions, and disease gene prioritization. We review semantic similarity measures applied to biomedical ontologies and propose their classification according to the strategies they employ: node-based versus edge-based and pairwise versus groupwise. We also present comparative assessment studies and discuss the implications of their results. We survey the existing implementations of semantic similarity measures, and we describe examples of applications to biomedical research. This will clarify how biomedical researchers can benefit from semantic similarity measures and help them choose the approach most suitable for their studies. Biomedical ontologies are evolving toward increased coverage, formality, and integration, and their use for annotation is increasingly becoming a focus of both effort by biomedical experts and application of automated annotation procedures to create corpora of higher quality and completeness than are currently available. Given that semantic similarity measures are directly dependent on these evolutions, we can expect to see them gaining more relevance and even becoming as essential as sequence similarity is today in biomedical research.
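
    To make the node-based, pairwise class of measures described above concrete, the following sketch computes a Resnik-style similarity (information content of the most informative common ancestor) over a toy ontology with hypothetical annotation counts; it is not an implementation from the reviewed literature:

```python
import math

# Toy ontology: child -> parents (a small DAG standing in for, e.g., a GO slim).
PARENTS = {
    "root": [],
    "metabolic_process": ["root"],
    "biosynthetic_process": ["metabolic_process"],
    "catabolic_process": ["metabolic_process"],
    "lipid_biosynthesis": ["biosynthetic_process"],
    "protein_catabolism": ["catabolic_process"],
}

# Hypothetical cumulative annotation counts (annotations at or below each term).
ANNOTATIONS = {
    "root": 100, "metabolic_process": 60, "biosynthetic_process": 25,
    "catabolic_process": 20, "lipid_biosynthesis": 10, "protein_catabolism": 8,
}

def ancestors(term):
    """All ancestors of a term, including the term itself."""
    seen = {term}
    stack = [term]
    while stack:
        for p in PARENTS[stack.pop()]:
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen

def information_content(term):
    """IC(t) = -log p(t), with p(t) the fraction of annotations at or below t."""
    return -math.log(ANNOTATIONS[term] / ANNOTATIONS["root"])

def resnik_similarity(t1, t2):
    """Node-based, pairwise measure: IC of the most informative common ancestor."""
    common = ancestors(t1) & ancestors(t2)
    return max(information_content(t) for t in common)

print(resnik_similarity("lipid_biosynthesis", "protein_catabolism"))   # shared ancestor: metabolic_process
print(resnik_similarity("lipid_biosynthesis", "biosynthetic_process")) # higher: more specific shared ancestor
```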

  8. Semantic similarity in biomedical ontologies.

    Science.gov (United States)

    Pesquita, Catia; Faria, Daniel; Falcão, André O; Lord, Phillip; Couto, Francisco M

    2009-07-01

    In recent years, ontologies have become a mainstream topic in biomedical research. When biological entities are described using a common schema, such as an ontology, they can be compared by means of their annotations. This type of comparison is called semantic similarity, since it assesses the degree of relatedness between two entities by the similarity in meaning of their annotations. The application of semantic similarity to biomedical ontologies is recent; nevertheless, several studies have been published in the last few years describing and evaluating diverse approaches. Semantic similarity has become a valuable tool for validating the results drawn from biomedical studies such as gene clustering, gene expression data analysis, prediction and validation of molecular interactions, and disease gene prioritization. We review semantic similarity measures applied to biomedical ontologies and propose their classification according to the strategies they employ: node-based versus edge-based and pairwise versus groupwise. We also present comparative assessment studies and discuss the implications of their results. We survey the existing implementations of semantic similarity measures, and we describe examples of applications to biomedical research. This will clarify how biomedical researchers can benefit from semantic similarity measures and help them choose the approach most suitable for their studies. Biomedical ontologies are evolving toward increased coverage, formality, and integration, and their use for annotation is increasingly becoming a focus of both effort by biomedical experts and application of automated annotation procedures to create corpora of higher quality and completeness than are currently available. Given that semantic similarity measures are directly dependent on these evolutions, we can expect to see them gaining more relevance and even becoming as essential as sequence similarity is today in biomedical research.

  9. National Space Biomedical Research Institute

    Science.gov (United States)

    2003-01-01

    In June 1996, NASA released a Cooperative Agreement Notice (CAN) inviting proposals to establish a National Space Biomedical Research Institute (9-CAN-96-01). This CAN stated that: The Mission of the Institute will be to lead a National effort for accomplishing the integrated, critical path, biomedical research necessary to support the long term human presence, development, and exploration of space and to enhance life on Earth by applying the resultant advances in human knowledge and technology acquired through living and working in space. The Institute will be the focal point of NASA sponsored space biomedical research. This statement has not been amended by NASA and remains the mission of the NSBRI.

  10. New roles & responsibilities of hospital biomedical engineering.

    Science.gov (United States)

    Frisch, P H; Stone, B; Booth, P; Lui, W

    2014-01-01

    Over the last decade the changing healthcare environment has required hospitals, and specifically Biomedical Engineering, to critically evaluate, optimize and adapt their operations. The focus is now on new technologies, changes to the environment of care, support requirements and financial constraints. Memorial Sloan Kettering Cancer Center (MSKCC), an NIH-designated comprehensive cancer center, has been transitioning to an increasingly outpatient care environment. This transition is driving an increase in patient acuity, coupled with a need for more urgent support and faster response times. New technologies, regulatory requirements and financial constraints have impacted operating budgets and, in some cases, resulted in a reduction in staffing. Specific initiatives, such as the Joint Commission's National Patient Safety Goals, requirements for an electronic medical record, meaningful use and ICD10, have caused institutions to reevaluate their operations and processes, including requiring Biomedical Engineering to manage new technologies, integrations and changes in the electromagnetic environment, while optimizing operational workflow and resource utilization. This paper addresses the new and expanding responsibilities and approach of Biomedical Engineering organizations, specifically at MSKCC. It is suggested that our experience may be a template for other organizations facing similar problems. Increasing support is necessary for Medical Software - Medical Device Data Systems in the evolving wireless environment, including RTLS and RFID. It will be necessary to evaluate the potential impact of the growing electromagnetic environment on connectivity, resulting in the need for dynamic and interactive testing, and to meet the growing demand to establish new operational synergies with Information Technology operations and other operational groups within the institution, such as nursing, facilities management, central supply, and the user departments.

  11. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  14. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  15. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  17. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  18. Annual report of R and D activities in center for promotion of computational science and engineering from April 1, 2004 to March 31, 2005

    International Nuclear Information System (INIS)

    2005-09-01

    This report provides an overview of research and development activities in the Center for Promotion of Computational Science and Engineering (CCSE), JAERI, in the fiscal year 2004 (April 1, 2004 - March 31, 2005). The activities have been performed by the Research Group for Computational Science in Atomic Energy, Research Group for Computational Material Science in Atomic Energy, R and D Group for Computer Science, R and D Group for Numerical Experiments, and Quantum Bioinformatics Group in CCSE. The ITBL (Information Technology Based Laboratory) project is performed mainly by the R and D Group for Computer Science and the Research Group for Computational Science in Atomic Energy. In the mid-term evaluation of the ITBL project conducted by MEXT, the ITBL infrastructure software developed by JAERI was recognized as outstanding at the 13th Information Science and Technology Committee in the Subdivision on R and D Planning and Evaluation of the Council for Science and Technology on April 26th, 2004. (author)

  19. Zirconia in biomedical applications.

    Science.gov (United States)

    Chen, Yen-Wei; Moussi, Joelle; Drury, Jeanie L; Wataha, John C

    2016-10-01

    The use of zirconia in medicine and dentistry has rapidly expanded over the past decade, driven by its advantageous physical, biological, esthetic, and corrosion properties. Zirconia orthopedic hip replacements have shown superior wear-resistance over other systems; however, risk of catastrophic fracture remains a concern. In dentistry, zirconia has been widely adopted for endosseous implants, implant abutments, and all-ceramic crowns. Because of an increasing demand for esthetically pleasing dental restorations, zirconia-based ceramic restorations have become one of the dominant restorative choices. Areas covered: This review provides an updated overview of the applications of zirconia in medicine and dentistry with a focus on dental applications. The MEDLINE electronic database (via PubMed) was searched, and relevant original and review articles from 2010 to 2016 were included. Expert commentary: Recent data suggest that zirconia performs favorably in both orthopedic and dental applications, but quality long-term clinical data remain scarce. Concerns about the effects of wear, crystalline degradation, crack propagation, and catastrophic fracture are still debated. The future of zirconia in biomedical applications will depend on the generation of these data to resolve concerns.

  20. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2007 to March 31, 2009

    International Nuclear Information System (INIS)

    2010-01-01

    This report provides an overview of research and development activities in Center for Computational Science and e-Systems (CCSE), JAEA, during the fiscal years 2007 and 2008 (Apr 1, 2007 - March 31, 2009). These research and development activities have been performed by the Simulation Technology R and D Office and Computer Science R and D Office. These activities include development of secure computational infrastructure for atomic energy research based on the grid technology, large scale seismic analysis of an entire nuclear reactor structure, large scale fluid dynamics simulation of J-PARC mercury target, large scale plasma simulation for nuclear fusion reactor, large scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, large scale quantum simulations of superconductor for the design of new devices and fundamental understanding of superconductivity, development of protein database for the identification of radiation-resistance gene, and large scale atomic simulation of proteins. (author)

  1. Annual report of R and D activities in Center for Computational Science and e-Systems from April 1, 2009 to March 31, 2010

    International Nuclear Information System (INIS)

    2011-10-01

    This report overviews the activity of research and development (R and D) in Center for Computational Science and e-Systems (CCSE) of the Japan Atomic Energy Agency (JAEA), during the fiscal year 2009 (April 1, 2009 - March 31, 2010). The work has been accomplished by the Simulation Technology R and D Office and Computer Science R and D Office in CCSE. The activity includes researches of secure computational infrastructure for the use in atomic energy research, which is based on the grid technology, a seismic response analysis for the structure of nuclear power plants, materials science, and quantum bioinformatics. The materials science research includes large scale atomic and subatomic simulations of nuclear fuels and materials for safety assessment, large scale quantum simulations of superconductor for the design of new devices and fundamental understanding of superconductivity. The quantum bioinformatics research focuses on the development of technology for large scale atomic simulations of proteins. (author)

  2. New frontiers in biomedical science and engineering during 2014-2015.

    Science.gov (United States)

    Liu, Feng; Lee, Dong-Hoon; Lagoa, Ricardo; Kumar, Sandeep

    2015-01-01

    The International Conference on Biomedical Engineering and Biotechnology (ICBEB) is an international meeting held once a year. This, the fourth International Conference on Biomedical Engineering and Biotechnology (ICBEB2015), will be held in Shanghai, China, during August 18th-21st, 2015. This annual conference intends to provide an opportunity for researchers and practitioners at home and abroad to present the most recent frontiers and future challenges in the fields of biomedical science, biomedical engineering, biomaterials, bioinformatics and computational biology, biomedical imaging and signal processing, biomechanical engineering and biotechnology, etc. The papers published in this issue were selected from this conference and witness the advances in biomedical engineering and biotechnology during 2014-2015.

  3. Bayes' theorem: A paradigm research tool in biomedical sciences ...

    African Journals Online (AJOL)

    One of the most interesting applications of the results of probability theory involves estimating unknown probabilities and making decisions on the basis of new (sample) information. Biomedical scientists often use Bayesian decision theory for the purpose of computing diagnostic values such as sensitivity and specificity ...
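
    A small worked example of the Bayesian computation alluded to above: given an assumed sensitivity, specificity, and prevalence, Bayes' theorem yields the positive and negative predictive values (the numbers are illustrative only):

```python
def predictive_values(sensitivity, specificity, prevalence):
    """Bayes' theorem applied to a diagnostic test:
    PPV = P(disease | positive test), NPV = P(no disease | negative test)."""
    p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
    ppv = sensitivity * prevalence / p_pos
    p_neg = specificity * (1 - prevalence) + (1 - sensitivity) * prevalence
    npv = specificity * (1 - prevalence) / p_neg
    return ppv, npv

# Illustrative numbers: a fairly accurate test applied to a rare condition
ppv, npv = predictive_values(sensitivity=0.95, specificity=0.90, prevalence=0.01)
print(f"PPV = {ppv:.2%}, NPV = {npv:.2%}")  # PPV stays low despite high sensitivity
```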

  4. Molecular Biomedical Imaging Laboratory (MBIL)

    Data.gov (United States)

    Federal Laboratory Consortium — The Molecular Biomedical Imaging Laboratory (MBIL) is adjacent to, and has access to, the Department of Radiology and Imaging Sciences clinical imaging facilities. MBIL...

  5. Functionalized carbon nanotubes: biomedical applications

    Science.gov (United States)

    Vardharajula, Sandhya; Ali, Sk Z; Tiwari, Pooja M; Eroğlu, Erdal; Vig, Komal; Dennis, Vida A; Singh, Shree R

    2012-01-01

    Carbon nanotubes (CNTs) are emerging as novel nanomaterials for various biomedical applications. CNTs can be used to deliver a variety of therapeutic agents, including biomolecules, to the target disease sites. In addition, their unparalleled optical and electrical properties make them excellent candidates for bioimaging and other biomedical applications. However, the high cytotoxicity of CNTs limits their use in humans and many biological systems. The biocompatibility and low cytotoxicity of CNTs are attributed to size, dose, duration, testing systems, and surface functionalization. The functionalization of CNTs improves their solubility and biocompatibility and alters their cellular interaction pathways, resulting in much-reduced cytotoxic effects. Functionalized CNTs are promising novel materials for a variety of biomedical applications. These potential applications are particularly enhanced by their ability to penetrate biological membranes with relatively low cytotoxicity. This review provides an overview of CNTs and their functionalization for biomedical applications with minimal cytotoxicity. PMID:23091380

  6. Bio-medical CMOS ICs

    CERN Document Server

    Yoo, Hoi-Jun

    2011-01-01

    This book is based on a graduate course entitled, Ubiquitous Healthcare Circuits and Systems, that was given by one of the editors. It includes an introduction and overview to biomedical ICs and provides information on the current trends in research.

  7. Summer Biomedical Engineering Institute 1972

    Science.gov (United States)

    Deloatch, E. M.

    1973-01-01

    The five problems studied for biomedical applications of NASA technology are reported. The studies reported are: design modification of electrophoretic equipment, operating room environment control, hematological viscometry, handling system for iridium, and indirect blood pressure measuring device.

  8. New Directions for Biomedical Engineering

    Science.gov (United States)

    Plonsey, Robert

    1973-01-01

    Discusses the definition of "biomedical engineering" and the development of educational programs in the field. Includes detailed descriptions of the roles of bioengineers, medical engineers, and chemical engineers. (CC)

  9. The Cancer Biomedical Informatics Grid (caBIG) Security Infrastructure.

    Science.gov (United States)

    Langella, Stephen; Oster, Scott; Hastings, Shannon; Siebenlist, Frank; Phillips, Joshua; Ervin, David; Permar, Justin; Kurc, Tahsin; Saltz, Joel

    2007-10-11

    Security is a high-priority issue in the medical domain, because many institutions performing biomedical research regularly work with sensitive medical data. This issue becomes more complicated when it is desirable or necessary to access and analyze data in a multi-institutional setting. In the NCI cancer Biomedical Informatics Grid (caBIG) program, several security issues were raised that existing security technologies could not address. Because caBIG is envisioned to span a large number of cancer centers and investigator laboratories, these issues pose a considerable challenge. In this paper we present these issues and the infrastructure, referred to as GAARDS, which has been developed to address them.

  10. The Cancer Biomedical Informatics Grid (caBIG™) Security Infrastructure

    Science.gov (United States)

    Langella, Stephen; Oster, Scott; Hastings, Shannon; Siebenlist, Frank; Phillips, Joshua; Ervin, David; Permar, Justin; Kurc, Tahsin; Saltz, Joel

    2007-01-01

    Security is a high-priority issue in the medical domain, because many institutions performing biomedical research regularly work with sensitive medical data. This issue becomes more complicated when it is desirable or necessary to access and analyze data in a multi-institutional setting. In the NCI cancer Biomedical Informatics Grid (caBIG™) program, several security issues were raised that existing security technologies could not address. Because caBIG is envisioned to span a large number of cancer centers and investigator laboratories, these issues pose a considerable challenge. In this paper we present these issues and the infrastructure, referred to as GAARDS, which has been developed to address them. PMID:18693873

  11. Hydroxyapatite coatings for biomedical applications

    CERN Document Server

    Zhang, Sam

    2013-01-01

    Hydroxyapatite coatings are of great importance in the biological and biomedical coatings fields, especially in the current era of nanotechnology and bioapplications. With a bonelike structure that promotes osseointegration, hydroxyapatite coating can be applied to otherwise bioinactive implants to make their surface bioactive, thus achieving faster healing and recovery. In addition to applications in orthopedic and dental implants, this coating can also be used in drug delivery. Hydroxyapatite Coatings for Biomedical Applications explores developments in the processing and property characteri

  12. Considering biomedical/CAM treatments

    OpenAIRE

    Cheng, JX; Widjaja, F; Choi, JE; Hendren, RL

    2013-01-01

    Complementary and alternative medicine (CAM) is widely used to treat children with psychiatric disorders. In this review, MEDLINE was searched for various biomedical/CAM treatments in combination with the key words "children," "adolescents," "psychiatric disorders," and "complementary alternative medicine." The biomedical/CAM treatments most thoroughly researched were omega-3 fatty acids, melatonin, and memantine. Those with the fewest published studies were N-acetylcysteine, vitamin B12, a...

  13. Semantic Similarity in Biomedical Ontologies

    OpenAIRE

    Pesquita, Catia; Faria, Daniel; Falcão, André O.; Lord, Phillip; Couto, Francisco M.

    2009-01-01

    In recent years, ontologies have become a mainstream topic in biomedical research. When biological entities are described using a common schema, such as an ontology, they can be compared by means of their annotations. This type of comparison is called semantic similarity, since it assesses the degree of relatedness between two entities by the similarity in meaning of their annotations. The application of semantic similarity to biomedical ontologies is recent; nevertheless, several studies hav...

  14. BIMS: Biomedical Information Management System

    OpenAIRE

    Mora Pérez, Oscar

    2009-01-01

    This final year project presents the design principles and prototype implementation of BIMS (Biomedical Information Management System), a flexible software system which provides an infrastructure to manage all information required by biomedical research projects. The BIMS project was initiated with the motivation to solve several limitations in medical data acquisition of some research projects, in which Universitat Pompeu Fabra takes part. These limitations, based on the lack of control mechan...

  15. John Glenn Biomedical Engineering Consortium

    Science.gov (United States)

    Nall, Marsha

    2004-01-01

    The John Glenn Biomedical Engineering Consortium is an inter-institutional research and technology development effort, beginning with ten projects in FY02 that are aimed at applying GRC expertise in fluid physics and sensor development, together with local biomedical expertise, to mitigate the risks of space flight on the health, safety, and performance of astronauts. It is anticipated that several new technologies will be developed that are applicable to both medical needs in space and on earth.

  16. Modified chitosans for biomedical applications

    OpenAIRE

    Yalınca, Zülal

    2013-01-01

    ABSTRACT: The subject of this thesis is the exploration of the suitability of chitosan and some of its derivatives for some chosen biomedical applications. Chitosan-graft-poly (N-vinyl imidazole), Chitosan-tripolyphosphate and ascorbyl chitosan were synthesized and characterized for specific biomedical applications in line with their chemical functionalities. Chitosan-graft-poly (N-vinyl imidazole), Chi-graft-PNVI, was synthesized by two methods; via an N-protection route and without N-pr...

  17. Evaluation of root canal transportation, centering ratio, and remaining dentin thickness of TRUShape and ProTaper Next systems in curved root canals using micro-computed tomography.

    Science.gov (United States)

    Elnaghy, Amr M; Al-Dharrab, Ayman A; Abbas, Hisham M; Elsaka, Shaymaa E

    2017-01-01

    To evaluate and compare the volume of removed dentin, transportation, and centering ability of the TRUShape (TRS; Dentsply Tulsa Dental Specialties) system with ProTaper Next (PTN; Dentsply Maillefer) by using micro-computed tomography (µCT). Twenty extracted human mandibular first molars with two separate mesial canals with curvatures of 25 to 35 degrees were divided into two experimental groups (n = 20) according to the rotary nickel-titanium file system used in canal instrumentation: group TRS and group PTN. Canals were scanned before and after instrumentation using µCT to evaluate root canal transportation, centering ratio, and volumetric changes. Canal transportation and centering ratio values were analyzed using the independent t test. Volume change data were analyzed using the Mann-Whitney test. The statistical significance level was set at P < .05. The PTN group showed a lower mean volume of removed dentin (2.09 ± 0.41 mm3) than the TRS group (2.77 ± 0.72 mm3) (P < .05). There were no significant differences in overall canal transportation (P = .170) or centering ratio (P = .111) between the TRS and PTN groups. However, at the apical and middle levels, the PTN group had a significantly lower mean transportation value and a higher centering ratio than the TRS group (P < .05). Canal preparation with the PTN system revealed better performance, with fewer canal aberrations, than the TRS system in curved root canals.
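
    The transportation and centering-ratio values reported above are commonly computed from before/after dentin-thickness measurements using the definitions of Gambill and colleagues; the sketch below assumes those definitions and uses invented measurements, and is not code from the cited study:

```python
def transportation(a1, a2, b1, b2):
    """Gambill-style canal transportation at one cross-section.
    a1/a2: mesial dentin thickness before/after preparation;
    b1/b2: distal dentin thickness before/after preparation (same units)."""
    return abs((a1 - a2) - (b1 - b2))

def centering_ratio(a1, a2, b1, b2):
    """Ratio of the smaller to the larger amount of dentin removed on the two
    sides; 1.0 means a perfectly centered preparation, 0.0 one-sided removal."""
    removed_mesial = a1 - a2
    removed_distal = b1 - b2
    hi = max(removed_mesial, removed_distal)
    lo = min(removed_mesial, removed_distal)
    return 1.0 if hi == 0 else lo / hi

# Illustrative measurements (mm) at one apical slice of a before/after uCT scan pair
print(f"{transportation(0.90, 0.70, 0.80, 0.75):.2f} mm")  # 0.15 mm of transportation
print(f"{centering_ratio(0.90, 0.70, 0.80, 0.75):.2f}")    # centering ratio 0.25
```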

  18. Autonomous Micro-Modular Mobile Data Center Cloud Computing Study for Modeling, Simulation, Information Processing and Cyber-Security Viability

    Data.gov (United States)

    National Aeronautics and Space Administration — Cloud computing environments offer opportunities for malicious users to penetrate security layers and damage, destroy or steal data. This ability can be exploited to...

  19. Pathophysiologic mechanisms of biomedical nanomaterials

    International Nuclear Information System (INIS)

    Wang, Liming; Chen, Chunying

    2016-01-01

    Nanomaterials (NMs) have been widely used in biomedical fields, daily consumer products, and even the food industry. It is crucial to understand the safety and biomedical efficacy of NMs. In this review, we summarize recent progress on the physiological and pathological effects of NMs at several levels: the protein-nano interface, NM-subcellular structures, and cell–cell interaction. We focus on the detailed information on nano-bio interactions, especially protein adsorption, intracellular trafficking, biological barriers, and signaling pathways, as well as the associated mechanisms mediated by nanomaterials. We also introduce related analytical methods that are meaningful and helpful for biomedical effect studies in the future. We believe that knowledge about the pathophysiologic effects of NMs is not only significant for the rational design of medical NMs but also helps predict their safety and further improve their applications in the future. - Highlights: • Rapid protein adsorption onto nanomaterials that affects biomedical effects • Nanomaterials and their interaction with biological membrane, intracellular trafficking and specific cellular effects • Nanomaterials and their interaction with biological barriers • The signaling pathways mediated by nanomaterials and related biomedical effects • Novel techniques for studying translocation and biomedical effects of NMs

  20. Automatic detection of the left ventricular myocardium long axis and center in thallium-201 single photon emission computed tomography

    International Nuclear Information System (INIS)

    Cauvin, J.C.; Boire, J.Y.; Maubliant, J.C.; Bonny, J.M.; Zanca, M.; Veyre, A.

    1992-01-01

    A new method for automatically centering and reorienting the left ventricle in thallium-201 myocardial single photon emission tomography (SPECT) is proposed. The processing involves the following steps: (a) the transverse sections of the left ventricle are segmented, (b) the three-dimensional skeleton of the left ventricle is extracted using tools of mathematical morphology, (c) the skeleton is fitted to a quadratic surface by the least-squares method, (d) the left ventricle is reoriented and centered using the long axis and the coordinates of the centre of the quadratic surface. A series of 30 consecutive exercise and redistribution 201Tl SPECT studies were centered and reoriented by two operators twice with this method, and twice manually. There was no significant difference in the mean realignment performed by the automatic and the manual methods, while centering differed moderately in some instances. In all cases and for all parameters, the reproducibility of the automatic method was 1.00, while it ranged between 0.74 and 0.98 with the manual centering and reorientation. This automatic approach provides a fast and highly reproducible method for the reconstruction of short- and long-axis sections of the left ventricle in 201Tl SPECT. (orig.)
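
    Step (c) of the pipeline above, fitting the skeleton to a quadratic surface by least squares, can be sketched as follows (synthetic data and a simple z = f(x, y) surface are assumed; extraction of the long axis and centre from the fitted coefficients is not shown):

```python
import numpy as np

def fit_quadratic_surface(points):
    """Least-squares fit of z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2
    to a cloud of skeleton voxel coordinates (N x 3 array)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    design = np.column_stack([np.ones_like(x), x, y, x**2, x * y, y**2])
    coeffs, *_ = np.linalg.lstsq(design, z, rcond=None)
    return coeffs

# Synthetic "skeleton": a noisy paraboloid standing in for the LV mid-wall surface
rng = np.random.default_rng(1)
xy = rng.uniform(-1, 1, size=(500, 2))
z = 0.5 + 0.1 * xy[:, 0] + 0.8 * (xy[:, 0] ** 2 + xy[:, 1] ** 2) + rng.normal(0, 0.02, 500)
pts = np.column_stack([xy, z])

print(np.round(fit_quadratic_surface(pts), 2))  # ~[0.5, 0.1, 0.0, 0.8, 0.0, 0.8]
```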

  1. The feasibility of computer-based prism adaptation to ameliorate neglect in sub-acute stroke patients admitted to a rehabilitation center.

    Directory of Open Access Journals (Sweden)

    Miranda eSmit

    2013-07-01

    Full Text Available Abstract Introduction: There is wide interest in transferring paper-and-pencil tests to a computer-based setting, resulting in more precise recording of performance. Here, we investigated the feasibility of computer-based testing and computer-based prism adaptation (PA) to ameliorate neglect in sub-acute stroke patients admitted to a rehabilitation center. Methods: 33 neglect patients were included. PA was performed with a pair of goggles with wide-field point-to-point prismatic lenses inducing an ipsilesional optical shift of 10 degrees. A variety of digitalized neuropsychological tests were performed using an interactive tablet immediately before and after PA. Results: All 33 patients (mean age 60.36, SD 13.30; mean days post-stroke 63.73, SD 37.74) were able to work with the tablet and to understand, perform and complete the digitalized tests within the proposed time-frame, indicating that there is feasibility of computer-based assessment in this stage post-stroke. Analyses of the efficacy of PA indicated no significant change on any of the outcome measures, except time. Discussion: In conclusion, there is feasibility of computer-based testing in such an early stage, which makes the computer-based setting a promising technique for evaluating more ecologically valid tasks. Secondly, the computer-based PA can be considered as a reliable procedure. We can conclude from our analysis, addressing the efficacy of PA, that the effectiveness of single session PA may not be sufficient to produce short term effects on our static tasks. Further studies, however, need to be done to evaluate the computer-based efficacy with more ecologically valid assessments in an intensive double-blind, sham-controlled multiple PA treatment design.

  2. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  3. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  4. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  8. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  9. The Development of Hand-Centered Visual Representations in the Primate Brain: A Computer Modeling Study Using Natural Visual Scenes.

    Science.gov (United States)

    Galeazzi, Juan M; Minini, Loredana; Stringer, Simon M

    2015-01-01

    Neurons that respond to visual targets in a hand-centered frame of reference have been found within various areas of the primate brain. We investigate how hand-centered visual representations may develop in a neural network model of the primate visual system called VisNet, when the model is trained on images of the hand seen against natural visual scenes. The simulations show how such neurons may develop through a biologically plausible process of unsupervised competitive learning and self-organization. In an advance on our previous work, the visual scenes consisted of multiple targets presented simultaneously with respect to the hand. Three experiments are presented. First, VisNet was trained with computerized images consisting of a realistic image of a hand and a variety of natural objects, presented in different textured backgrounds during training. The network was then tested with just one textured object near the hand in order to verify if the output cells were capable of building hand-centered representations with a single localized receptive field. We explain the underlying principles of the statistical decoupling that allows the output cells of the network to develop single localized receptive fields even when the network is trained with multiple objects. In a second simulation we examined how some of the cells with hand-centered receptive fields decreased their shape selectivity and started responding to a localized region of hand-centered space as the number of objects presented in overlapping locations during training increases. Lastly, we explored the same learning principles training the network with natural visual scenes collected by volunteers. These results provide an important step in showing how single, localized, hand-centered receptive fields could emerge under more ecologically realistic visual training conditions.
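
    The competitive learning and self-organization described above can be illustrated with a deliberately simplified sketch: a single-layer network with a winner-take-all response and a Hebbian weight update (the actual VisNet architecture is multi-layer and uses a trace learning rule, and the input patterns here are random stand-ins for the hand-plus-object images).

        import numpy as np

        rng = np.random.default_rng(0)

        n_inputs, n_outputs, n_patterns = 64, 8, 200
        # Illustrative "retinal" input patterns (stand-ins for hand-plus-object images flattened to vectors).
        patterns = rng.random((n_patterns, n_inputs))
        weights = rng.random((n_outputs, n_inputs))
        weights /= np.linalg.norm(weights, axis=1, keepdims=True)

        learning_rate = 0.1
        for epoch in range(20):
            for x in patterns:
                activations = weights @ x          # feed-forward responses
                winner = np.argmax(activations)    # competition: the strongest cell wins
                # Hebbian update for the winning cell only, then renormalise its weights
                weights[winner] += learning_rate * x
                weights[winner] /= np.linalg.norm(weights[winner])

        # After training, each output cell responds selectively to a cluster of inputs.
        print(np.round(weights @ patterns[0], 3))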

  10. Cone-beam Computed Tomographic Assessment of Canal Centering Ability and Transportation after Preparation with Twisted File and Bio RaCe Instrumentation.

    Directory of Open Access Journals (Sweden)

    Kiamars Honardar

    2014-08-01

    Full Text Available Use of rotary Nickel-Titanium (NiTi) instruments for endodontic preparation has introduced a new era in endodontic practice, but this issue has undergone dramatic modifications in order to achieve improved shaping abilities. Cone-beam computed tomography (CBCT) has made it possible to accurately evaluate geometrical changes following canal preparation. This study was carried out to compare canal centering ability and transportation of Twisted File and BioRaCe rotary systems by means of cone-beam computed tomography. Thirty root canals from freshly extracted mandibular and maxillary teeth were selected. Teeth were mounted and scanned before and after preparation by CBCT at different apical levels. Specimens were divided into 2 groups of 15. In the first group Twisted File and in the second, BioRaCe was used for canal preparation. Canal transportation and centering ability after preparation were assessed by NNT Viewer and Photoshop CS4 software. Statistical analysis was performed using t-test and two-way ANOVA. All samples showed deviations from the original axes of the canals. No significant differences were detected between the two rotary NiTi instruments for canal centering ability in all sections. Regarding canal transportation, however, a significant difference was seen in the BioRaCe group at 7.5 mm from the apex. Under the conditions of this in vitro study, Twisted File and BioRaCe rotary NiTi files retained original canal geometry.
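
    The abstract does not give the formulas used to quantify transportation and centering; the sketch below assumes the frequently cited Gambill-style definitions computed from mesial and distal dentine thicknesses measured on the pre- and post-instrumentation CBCT slices, with hypothetical helper names and example values.

        # Hypothetical helpers illustrating Gambill-style definitions of canal transportation
        # and centering ratio; the study's exact formulas are not stated in the abstract.

        def transportation(a1, a2, b1, b2):
            """a1/a2: mesial dentine thickness before/after; b1/b2: distal thickness before/after (mm)."""
            return (a1 - a2) - (b1 - b2)   # 0 means no transportation

        def centering_ratio(a1, a2, b1, b2):
            """1.0 indicates perfect centering; values approach 0 as the file deviates to one side."""
            da, db = a1 - a2, b1 - b2
            if max(da, db) == 0:
                return 1.0
            return min(da, db) / max(da, db)

        print(transportation(1.20, 1.05, 1.10, 1.00))               # 0.05 mm toward the mesial side
        print(round(centering_ratio(1.20, 1.05, 1.10, 1.00), 2))    # 0.67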

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and related components are now installed and deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  12. Real time computer data system for the 40 x 80 ft wind tunnel facility at Ames Research Center

    Science.gov (United States)

    Cambra, J. M.; Tolari, G. P.

    1974-01-01

    The wind tunnel real-time computer system is a distributed data gathering system that features a master computer subsystem, a high-speed data gathering subsystem, a quick-look dynamic analysis and vibration control subsystem, an analog recording back-up subsystem, a pulse code modulation (PCM) on-board subsystem, a communications subsystem, and a transducer excitation and calibration subsystem. The subsystems are married to the master computer through an executive software system and standard hardware and FORTRAN software interfaces. The executive software system has four basic software routines: the playback, setup, record, and monitor routines. The standard hardware interfaces, along with the software interfaces, provide the system with the capability of adapting to new environments.

  13. [Scientometrics and bibliometrics of biomedical engineering periodicals and papers].

    Science.gov (United States)

    Zhao, Ping; Xu, Ping; Li, Bingyan; Wang, Zhengrong

    2003-09-01

    This investigation was made to reveal the current status, research trends and research level of biomedical engineering in mainland China by means of scientometrics, and to assess the quality of four domestic publications by bibliometrics. We identified all articles of the four related publications by searching Chinese and foreign databases from 1997 to 2001. All articles collected or cited by these databases were retrieved and statistically analyzed to determine the relevant distributions, including databases, years, authors, institutions, subject headings and subheadings. The sources of supporting funds and the related articles were analyzed as well. The results showed that two journals were cited by two foreign databases and five Chinese databases simultaneously. The output of the Journal of Biomedical Engineering was the highest. Its number of original papers cited by EI and CA, and the total number of papers sponsored by funds, were higher than those of the others, but the annual number and percentage of biomedical articles cited by EI decreased overall. Core domestic authors and institutions had emerged in the field of biomedical engineering. Their research topics were mainly concentrated on ten subject headings, which included biocompatible materials, computer-assisted signal processing, electrocardiography, computer-assisted image processing, biomechanics, algorithms, electroencephalography, automatic data processing, mechanical stress, hemodynamics, mathematical computing, microcomputers, theoretical models, etc. The main subheadings were concentrated on instrumentation, physiopathology, diagnosis, therapy, ultrasonography, physiology, analysis, surgery, pathology, methods, etc.

  14. Session Introduction: Challenges of Pattern Recognition in Biomedical Data.

    Science.gov (United States)

    Verma, Shefali Setia; Verma, Anurag; Basile, Anna Okula; Bishop, Marta-Byrska; Darabos, Christian

    2018-01-01

    The analysis of large biomedical data often presents various challenges related not just to the size of the data, but also to data quality issues such as heterogeneity, multidimensionality, noisiness, and incompleteness. The data-intensive nature of computational genomics problems in biomedical informatics warrants the development and use of massive computer infrastructure and advanced software tools and platforms, including but not limited to the use of cloud computing. Our session aims to address these challenges in handling big data for designing a study, performing analyses, and interpreting their outcomes. These challenges have been prevalent in many studies, including those which focus on the identification of novel genetic variant-phenotype associations using data from sources like Electronic Health Records (EHRs) or multi-omic data. One of the biggest challenges to focus on is the imperfect nature of biomedical data, in which considerable noise and sparseness are observed. In our session, we will present research articles that can help in identifying innovative ways to recognize and overcome newly arising challenges associated with pattern recognition in biomedical data.

  15. Computed tomography (CT)-compatible remote center of motion needle steering robot : Fusing CT images and electromagnetic sensor data

    NARCIS (Netherlands)

    Shahriari, Navid; Heerink, Wout; van Katwijk, Tim; Hekman, Edsko E. G.; Oudkerk, Matthijs; Misra, Sarthak

    Lung cancer is the most common cause of cancer-related death, and early detection can reduce the mortality rate. Patients with lung nodules greater than 10 mm usually undergo a computed tomography (CT) guided biopsy. However, aligning the needle with the target is difficult and the needle tends to

  16. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  17. [Master course in biomedical engineering].

    Science.gov (United States)

    Jobbágy, Akos; Benyó, Zoltán; Monos, Emil

    2009-11-22

    The Bologna Declaration aims at harmonizing the European higher education structure. In accordance with the Declaration, biomedical engineering will be offered as a master (MSc) course also in Hungary, from year 2009. Since 1995 a biomedical engineering course has been held in cooperation between three universities: Semmelweis University, Budapest Veterinary University, and Budapest University of Technology and Economics. One of the latter's faculties, the Faculty of Electrical Engineering and Informatics, has been responsible for the course. Students could start their biomedical engineering studies - usually in parallel with their first degree course - after they had collected at least 180 ECTS credits. Consequently, the biomedical engineering course could have been considered a master course even before the Bologna Declaration. Students had to collect 130 ECTS credits during the six-semester course. This is equivalent to four semesters of full-time study, because during the first three semesters the curriculum required students to gain only one third of the usual ECTS credits. The paper gives a survey of the new biomedical engineering master course, briefly summing up also the subjects in the curriculum.

  18. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  19. GT-MSOCC - A domain for research on human-computer interaction and decision aiding in supervisory control systems. [Georgia Tech - Multisatellite Operations Control Center

    Science.gov (United States)

    Mitchell, Christine M.

    1987-01-01

    The Georgia Tech-Multisatellite Operations Control Center (GT-MSOCC), a real-time interactive simulation of the operator interface to a NASA ground control system for unmanned earth-orbiting satellites, is described. The GT-MSOCC program for investigating a range of modeling, decision aiding, and workstation design issues related to the human-computer interaction is discussed. A GT-MSOCC operator function model is described in which operator actions, both cognitive and manual, are represented as the lowest level discrete control network nodes, and operator action nodes are linked to information needs or system reconfiguration commands.

  20. Development of an instrument to measure health center (HC) personnel's computer use, knowledge and functionality demand for HC computerized information system in Thailand.

    Science.gov (United States)

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, the national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel's computer use, basic IT knowledge and HC computerized information system functionality needs was developed. The instrument reveals acceptable test-retest reliability and reasonable internal consistency of the measures. The future nation-wide demonstration study will benefit from this work.
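
    The abstract reports internal consistency without naming the coefficient; a minimal sketch assuming the commonly used Cronbach's alpha, computed over a toy respondents-by-items answer matrix, is shown below.

        import numpy as np

        def cronbach_alpha(item_scores: np.ndarray) -> float:
            """item_scores: respondents x items matrix of questionnaire answers."""
            item_scores = np.asarray(item_scores, dtype=float)
            k = item_scores.shape[1]
            item_variances = item_scores.var(axis=0, ddof=1).sum()
            total_variance = item_scores.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1 - item_variances / total_variance)

        # Toy data: 5 respondents answering a 4-item scale on computer-use attitudes.
        answers = np.array([
            [4, 5, 4, 4],
            [2, 2, 3, 2],
            [5, 5, 5, 4],
            [3, 3, 2, 3],
            [4, 4, 4, 5],
        ])
        print(round(cronbach_alpha(answers), 2))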

  1. About TTC | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    The TTC facilitates licensing and co-development partnerships between biomedical industry, academia, and government agencies and the research laboratories of the NCI and nine other institutes and centers of NIH.

  2. A robust approach to extract biomedical events from literature.

    Science.gov (United States)

    Bui, Quoc-Chinh; Sloot, Peter M A

    2012-10-15

    The abundance of biomedical literature has attracted significant interest in novel methods to automatically extract biomedical relations from the literature. Until recently, most research was focused on extracting binary relations such as protein-protein interactions and drug-disease relations. However, these binary relations cannot fully represent the original biomedical data. Therefore, there is a need for methods that can extract fine-grained and complex relations known as biomedical events. In this article we propose a novel method to extract biomedical events from text. Our method consists of two phases. In the first phase, training data are mapped into structured representations. Based on that, templates are used to extract rules automatically. In the second phase, extraction methods are developed to process the obtained rules. When evaluated against the Genia event extraction abstract and full-text test datasets (Task 1), we obtain results with F-scores of 52.34 and 53.34, respectively, which are comparable to the state-of-the-art systems. Furthermore, our system achieves superior performance in terms of computational efficiency. Our source code is available for academic use at http://dl.dropbox.com/u/10256952/BioEvent.zip.
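
    The method's second phase applies automatically learned rules to structured sentence representations; the toy sketch below uses a couple of hand-written trigger and protein patterns as illustrative stand-ins for those learned rules, so the names and patterns are assumptions rather than the system's actual output.

        import re

        # Toy second-phase extractor: find an event trigger and attach nearby protein
        # arguments. The patterns here are illustrative stand-ins for the rules the
        # system learns automatically from the GENIA training data.
        TRIGGERS = {"phosphorylation": "Phosphorylation", "expression": "Gene_expression"}
        PROTEIN = re.compile(r"\b(STAT1|IL-2|TRAF2|p53)\b")

        def extract_events(sentence: str):
            events = []
            for trigger, event_type in TRIGGERS.items():
                if trigger in sentence.lower():
                    themes = PROTEIN.findall(sentence)
                    events.append({"type": event_type, "trigger": trigger, "themes": themes})
            return events

        print(extract_events("TRAF2 is required for the phosphorylation of STAT1."))
        # [{'type': 'Phosphorylation', 'trigger': 'phosphorylation', 'themes': ['TRAF2', 'STAT1']}]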

  3. 4th International Conference on Biomedical Engineering in Vietnam

    CERN Document Server

    Toan, Nguyen; Khoa, Truong; Phuong, Tran; Development of Biomedical Engineering

    2013-01-01

    This volume presents the proceedings of the Fourth International Conference on the Development of Biomedical Engineering in Vietnam which was held in Ho Chi Minh City as a Mega-conference. It is kicked off by the Regenerative Medicine Conference with the theme “BUILDING A FACE” USING A REGENERATIVE MEDICINE APPROACH”, endorsed mainly by the Tissue Engineering and Regenerative Medicine International Society (TERMIS). It is followed by the Computational Medicine Conference, endorsed mainly by the Computational Surgery International Network (COSINE) and the Computational Molecular Medicine of German National Funding Agency; and the General Biomedical Engineering Conference, endorsed mainly by the International Federation for Medical and Biological Engineering (IFMBE). It featured the contributions of 435 scientists from 30 countries, including: Australia, Austria, Belgium, Canada, China, Finland, France, Germany, Hungary, India, Iran, Italy, Japan, Jordan, Korea, Malaysia, Netherlands, Pakistan, Poland, Ru...

  4. Reclassification and Documentation in a Medium-sized Medical Center Library: The MTST System in the Simultaneous Production of Catalog Cards and a Computer Stored Record

    Science.gov (United States)

    Love, Erika; Butzin, Diane; Robinson, Robert E.; Lee, Soo

    1971-01-01

    A project to recatalog and reclassify the book collection of the Bowman Gray School of Medicine Library utilizing the Magnetic Tape/Selectric Typewriter system for simultaneous catalog card production and computer stored data acquisition marks the beginning of eventual computerization of all library operations. A keyboard optical display system will be added by late 1970. Major input operations requiring the creation of “hard copy” will continue via the MTST system. Updating, editing and retrieval operations as well as input without hard copy production will be done through the “on-line” keyboard optical display system. Once the library's first data bank, the book catalog, has been established the computer may be consulted directly for library holdings from any optical display terminal throughout the medical center. Three basic information retrieval operations may be carried out through “on-line” optical display terminals. Output options include the reproduction of part or all of a given document, or the generation of statistical data, which are derived from two Acquisition Code lines. The creation of a central bibliographic record of Bowman Gray Faculty publications patterned after the cataloging program is presently under way. The cataloging and computer storage of serial holdings records will begin after completion of the reclassification project. All acquisitions added to the collection since October 1967 are computer-stored and fully retrievable. Reclassification of older titles will be completed in early 1971. PMID:5542915

  5. Modeling in biomedical informatics: an exploratory analysis part 2.

    Science.gov (United States)

    Hasman, A; Haux, R

    2007-01-01

    Modeling is a significant part of research, education and practice in biomedical and health informatics. Our objective was to explore which types of models of processes are used in current biomedical/health informatics research, as reflected in publications of scientific journals in this field. Also, the implications for medical informatics curricula were investigated. Retrospective, prolective observational study on recent publications of the two official journals of the International Medical Informatics Association (IMIA), the International Journal of Medical Informatics (IJMI) and Methods of Information in Medicine (MIM). All publications of the years 2004 and 2005 from these journals were indexed according to a given list of model types. Random samples out of these publications were analysed in more depth. Three hundred and eighty-four publications have been analysed, 190 of IJMI and 194 of MIM. For publications in special issues (121 in IJMI) and special topics (132 in MIM) we found differences between theme-centered and conference-centered special issues/special topics (SIT) publications. In particular, we could observe a high variation between modeling in publications of theme-centered SITs. It became obvious that often sound formal knowledge as well as a strong engineering background is needed for carrying out this type of research. Usually, this knowledge and the related skills can be best provided in consecutive B.Sc. and M.Sc. programs in medical informatics (respectively, health informatics, biomedical informatics). If the focus should be primarily on health information systems and evaluation this can be offered in a M.Sc. program in medical informatics. In analysing the 384 publications it became obvious that modeling continues to be a major task in research, education and practice in biomedical and health informatics. Knowledge and skills on a broad range of model types are needed in biomedical/health informatics.

  6. Biomedical Use of Aerospace Personal Cooling Garments

    Science.gov (United States)

    Webbon, Bruce W.; Montgomery, Leslie D.; Callaway, Robert K.

    1994-01-01

    Personal thermoregulatory systems are required during extravehicular activity (EVA) to remove the metabolic heat generated by the suited astronaut. The Extravehicular and Protective Systems (STE) Branch of NASA Ames Research Center has developed advanced concepts for liquid cooling garments for both industrial and biomedical applications for the past 25 years. Examples of this work include: (1) liquid-cooled helmets for helicopter pilots and race car drivers; (2) vests for fire and mine rescue personnel; (3) bras to increase the definition of tumors during thermography; (4) lower body garments for young women with erythromelalgia; and (5) whole body garments used by patients with multiple sclerosis (MS). The benefits of the biomedical application of artificial thermoregulation received national attention through two recent events: (1) the liquid-cooled garment technology was inducted into the United States Space Foundation's Space Technology Hall of Fame (1993); and (2) NASA has signed a joint Memorandum of Understanding with the Multiple Sclerosis Association (1994) to share this technology for use with MS patient treatment. The STE Branch is currently pursuing a program to refine thermoregulatory design in light of recent technology developments that might be applicable for use by several medical patient populations. Projects have been initiated to apply thermoregulatory technology for the treatment and/or rehabilitation of patients with spinal cord injuries, multiple sclerosis, migraine headaches, and to help prevent the loss of hair during chemotherapy.

  7. Modeling and control in the biomedical sciences

    CERN Document Server

    Banks, H T

    1975-01-01

    These notes are based on (i) a series of lectures that I gave at the 14th Biennial Seminar of the Canadian Mathematical Congress held at the University of Western Ontario August 12-24, 1973 and (ii) some of my lectures in a modeling course that I have cotaught in the Division of Bio-Medical Sciences at Brown during the past several years. An earlier version of these notes appeared in the Center for Dynamical Systems Lectures Notes series (CDS LN 73-1, November 1973). I have in this revised and extended version of those earlier notes incorporated a number of changes based both on classroom experience and on my research efforts with several colleagues during the intervening period. The narrow viewpoint of the present notes (use of optimization and control theory in biomedical problems) reflects more the scope of the CMC lectures given in August, 1973 than the scope of my own interests. Indeed, my real interests have included the modeling process itself as well as the contributions made by investigators who e...

  8. Contamination control training for biomedical facilities

    International Nuclear Information System (INIS)

    Trinoskey, P.A.

    1994-10-01

    In 1991, a contamination control course was developed for the Biology and Biotechnology Research Program (BBRP) at the Lawrence Livermore National Laboratory (LLNL). This course was based on the developer's experience in Radiation Safety at the University of Utah and University of Kansas Medical Center. This course has been well received at LLNL because it addresses issues that are important to individuals handling small quantities of radioactive materials. This group of users is often overlooked. They are typically very well educated and are expected to "know" what they should do. Many of these individuals are not initially comfortable working with radioactive materials. They appreciate the opportunity to be introduced to contamination control techniques and to discuss issues they may have. In addition, the authors benefit from the experience that researchers bring from other facilities. The training course will address the specific radiological training requirements for chemists, biologists, and medical researchers who are using small amounts of dispersible radionuclides in tabletop experiments, and will not be exposed to other radiation sources. The training will include: the potential hazards of typical radionuclides, contamination control procedures, and guidance for developing and including site-specific information. The training course will eliminate the need for Radiological Worker II training for bio-medical researchers. The target audience for this training course is bio-medical researchers.

  9. Career Development among American Biomedical Postdocs

    Science.gov (United States)

    Gibbs, Kenneth D.; McGready, John; Griffin, Kimberly

    2015-01-01

    Recent biomedical workforce policy efforts have centered on enhancing career preparation for trainees, and increasing diversity in the research workforce. Postdoctoral scientists, or postdocs, are among those most directly impacted by such initiatives, yet their career development remains understudied. This study reports results from a 2012 national survey of 1002 American biomedical postdocs. On average, postdocs reported increased knowledge about career options but lower clarity about their career goals relative to PhD entry. The majority of postdocs were offered structured career development at their postdoctoral institutions, but less than one-third received this from their graduate departments. Postdocs from all social backgrounds reported significant declines in interest in faculty careers at research-intensive universities and increased interest in nonresearch careers; however, there were differences in the magnitude and period of training during which these changes occurred across gender and race/ethnicity. Group differences in interest in faculty careers were explained by career interest differences formed during graduate school but not by differences in research productivity, research self-efficacy, or advisor relationships. These findings point to the need for enhanced career development earlier in the training process, and interventions sensitive to distinctive patterns of interest development across social identity groups. PMID:26582238

  10. Innovations in Biomedical Engineering 2016

    CERN Document Server

    Tkacz, Ewaryst; Paszenda, Zbigniew; Piętka, Ewa

    2017-01-01

    This book presents the proceedings of the “Innovations in Biomedical Engineering IBE’2016” Conference held on October 16–18, 2016 in Poland, discussing recent research on innovations in biomedical engineering. The past decade has seen the dynamic development of more and more sophisticated technologies, including biotechnologies, and more general technologies applied in the area of life sciences. As such the book covers the broadest possible spectrum of subjects related to biomedical engineering innovations. Divided into four parts, it presents state-of-the-art achievements in: • engineering of biomaterials, • modelling and simulations in biomechanics, • informatics in medicine • signal analysis The book helps bridge the gap between technological and methodological engineering achievements on the one hand and clinical requirements in the three major areas diagnosis, therapy and rehabilitation on the other.

  11. Design reality gap issues within an ICT4D project:an assessment of Jigawa State Community Computer Center

    OpenAIRE

    Kanya, Rislana Abdulazeez; Good, Alice

    2013-01-01

    This paper evaluates the Jigawa State Government Community Computer centre project using the design reality gap framework. The purpose of this was to analyse the shortfall between design expectations and implementation realities, in order to find out the current situation of the project, and furthermore to analyse whether it would meet the key stakeholders' expectations. The majority of government ICT projects are classified as either failures or partial failures. Our research will underpin a case st...

  12. Hierarchical Storage Management at the NASA Center for Computational Sciences: From UniTree to SAM-QFS

    Science.gov (United States)

    Salmon, Ellen; Tarshish, Adina; Palm, Nancy; Patel, Sanjay; Saletta, Marty; Vanderlan, Ed; Rouch, Mike; Burns, Lisa; Duffy, Daniel; Caine, Robert

    2004-01-01

    This paper presents the data management issues associated with a large center like the NCCS and how these issues are addressed. More specifically, the focus of this paper is on the recent transition from a legacy UniTree (Legato) system to a SAM-QFS (Sun) system. Therefore, this paper will describe the motivations, from both a hardware and software perspective, for migrating from one system to another. Coupled with the migration from UniTree into SAM-QFS, the complete mass storage environment was upgraded to provide high availability, redundancy, and enhanced performance. This paper will describe the resulting solution and lessons learned throughout the migration process.

  13. Modernization of the graphics post-processors of the Hamburg German Climate Computer Center Carbon Cycle Codes

    Energy Technology Data Exchange (ETDEWEB)

    Stevens, E.J.; McNeilly, G.S.

    1994-03-01

    The existing National Center for Atmospheric Research (NCAR) code in the Hamburg Oceanic Carbon Cycle Circulation Model and the Hamburg Large-Scale Geostrophic Ocean General Circulation Model was modernized and reduced in size while still producing an equivalent end result. A reduction in the size of the existing code from more than 50,000 lines to approximately 7,500 lines in the new code has made the new code much easier to maintain. The existing code in the Hamburg models uses legacy NCAR graphics (including even emulated CALCOMP subroutines) to display graphical output. The new code uses only current (version 3.1) NCAR subroutines.

  14. Biomedical applications of magnetic particles

    CERN Document Server

    Mefford, Thompson

    2017-01-01

    Magnetic particles are increasingly being used in a wide variety of biomedical applications. Written by a team of internationally respected experts, this book provides an up-to-date authoritative reference for scientists and engineers. The first section presents the fundamentals of the field by explaining the theory of magnetism, describing techniques to synthesize magnetic particles, and detailing methods to characterize magnetic particles. The second section describes biomedical applications, including chemical sensors and cellular actuators, and diagnostic applications such as drug delivery, hyperthermia cancer treatment, and magnetic resonance imaging contrast.

  15. Biomedical applications of magnetic particles

    CERN Document Server

    Mefford, Thompson

    2018-01-01

    Magnetic particles are increasingly being used in a wide variety of biomedical applications. Written by a team of internationally respected experts, this book provides an up-to-date authoritative reference for scientists and engineers. The first section presents the fundamentals of the field by explaining the theory of magnetism, describing techniques to synthesize magnetic particles, and detailing methods to characterize magnetic particles. The second section describes biomedical applications, including chemical sensors and cellular actuators, and diagnostic applications such as drug delivery, hyperthermia cancer treatment, and magnetic resonance imaging contrast.

  16. Biomedical Imaging Principles and Applications

    CERN Document Server

    Salzer, Reiner

    2012-01-01

    This book presents and describes imaging technologies that can be used to study chemical processes and structural interactions in dynamic systems, principally in biomedical systems. The imaging technologies, largely biomedical imaging technologies such as MRT, fluorescence mapping, Raman mapping, nanoESCA, and CARS microscopy, have been selected according to their application range and to the chemical information content of their data. These technologies allow for the analysis and evaluation of delicate biological samples, which must not be disturbed during the process. Ultimately, this may me

  17. Minimize the Percentage of Noise in Biomedical Images Using Neural Networks

    Directory of Open Access Journals (Sweden)

    Abdul Khader Jilani Saudagar

    2014-01-01

    Full Text Available The overall goal of the research is to improve the quality of biomedical images for telemedicine, with a minimum percentage of noise in the retrieved image, and to take less computation time. The novelty of this technique lies in the implementation of spectral coding for biomedical images using neural networks in order to accomplish the above objectives. This work is a continuation of an ongoing research project aimed at developing a system for an efficient image compression approach for telemedicine in Saudi Arabia. We compare the efficiency of this technique against existing image compression techniques, namely JPEG2000, in terms of compression ratio, peak signal-to-noise ratio (PSNR), and computation time. To our knowledge, this research is the first to provide a comparative study with other techniques used in the compression of biomedical images. This work explores and tests biomedical images such as X-rays, computed tomography (CT), magnetic resonance imaging (MRI), and positron emission tomography (PET).
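
    Of the comparison metrics listed above, PSNR has a standard definition; a minimal sketch of that computation is shown below, with a toy image and bit depth assumed (the study's actual test images and settings are not reproduced here).

        import numpy as np

        def psnr(original: np.ndarray, reconstructed: np.ndarray, max_value: float = 255.0) -> float:
            """Peak signal-to-noise ratio in dB for two images of equal shape."""
            mse = np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)
            if mse == 0:
                return float("inf")        # identical images
            return 10.0 * np.log10(max_value ** 2 / mse)

        # Toy 8-bit "CT slice" and a noisy reconstruction of it.
        rng = np.random.default_rng(1)
        image = rng.integers(0, 256, size=(64, 64))
        noisy = np.clip(image + rng.normal(0, 5, size=image.shape), 0, 255)
        print(round(psnr(image, noisy), 2))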

  18. Formal ontologies in biomedical knowledge representation.

    Science.gov (United States)

    Schulz, S; Jansen, L

    2013-01-01

    Medical decision support and other intelligent applications in the life sciences depend on increasing amounts of digital information. Knowledge bases as well as formal ontologies are being used to organize biomedical knowledge and data. However, these two kinds of artefacts are not always clearly distinguished. Whereas the popular RDF(S) standard provides an intuitive triple-based representation, it is semantically weak. Description logics based ontology languages like OWL-DL carry a clear-cut semantics, but they are computationally expensive, and they are often misinterpreted to encode all kinds of statements, including those which are not ontological. We distinguish four kinds of statements needed to comprehensively represent domain knowledge: universal statements, terminological statements, statements about particulars and contingent statements. We argue that the task of formal ontologies is solely to represent universal statements, while the non-ontological kinds of statements can nevertheless be connected with ontological representations. To illustrate these four types of representations, we use a running example from parasitology. We finally formulate recommendations for semantically adequate ontologies that can efficiently be used as a stable framework for more context-dependent biomedical knowledge representation and reasoning applications like clinical decision support systems.
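
    The four kinds of statements can be made concrete with a toy triple-like encoding that keeps them in separate collections; the vocabulary below is illustrative and is not tied to any particular ontology language such as RDF(S) or OWL-DL.

        # Toy triple store separating the four kinds of statements discussed above.
        # The vocabulary is illustrative; real systems would use OWL/RDF(S) identifiers.
        universal = [
            ("Plasmodium", "is_a", "Parasite"),            # universal: every Plasmodium is a parasite
        ]
        terminological = [
            ("malaria parasite", "synonym_of", "Plasmodium"),
        ]
        particulars = [
            ("patient_42_infection", "instance_of", "PlasmodiumInfection"),
        ]
        contingent = [
            ("patient_42", "has_finding", "fever"),        # true now, not a matter of definition
        ]

        def is_a_closure(triples, entity):
            """Follow is_a links to list all classes an entity falls under (simple reasoning)."""
            supers = {o for s, p, o in triples if s == entity and p == "is_a"}
            for sup in list(supers):
                supers |= is_a_closure(triples, sup)
            return supers

        print(is_a_closure(universal, "Plasmodium"))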

  19. Localization and Tracking of Implantable Biomedical Sensors

    Directory of Open Access Journals (Sweden)

    Ilknur Umay

    2017-03-01

    Full Text Available Implantable sensor systems are effective tools for biomedical diagnosis, visualization and treatment of various health conditions, attracting the interest of researchers, as well as healthcare practitioners. These systems efficiently and conveniently provide essential data of the body part being diagnosed, such as gastrointestinal parameter values (temperature, pH, pressure), blood glucose and pressure levels and electrocardiogram data. Such data are first transmitted from the implantable sensor units to an external receiver node or network and then to a central monitoring and control (computer) unit for analysis, diagnosis and/or treatment. Implantable sensor units are typically in the form of mobile microrobotic capsules or implanted stationary (body-fixed) units. In particular, capsule-based systems have attracted significant research interest recently, with a variety of applications, including endoscopy, microsurgery, drug delivery and biopsy. In such implantable sensor systems, one of the most challenging problems is the accurate localization and tracking of the microrobotic sensor unit (e.g., robotic capsule) inside the human body. This article presents a literature review of the existing localization and tracking techniques for robotic implantable sensor systems with their merits and limitations and possible solutions of the proposed localization methods. The article also provides a brief discussion on the connection and cooperation of such techniques with wearable biomedical sensor systems.

  20. General guidelines for biomedical software development.

    Science.gov (United States)

    Silva, Luis Bastiao; Jimenez, Rafael C; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people who wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, making it hard to identify the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic.

  1. National Biomedical Tracer Facility. Project definition study

    Energy Technology Data Exchange (ETDEWEB)

    Schafer, R.

    1995-02-14

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H⁻, H⁺, and D⁺). The proposed NBTF facility includes an 80 MeV, 1 mA H⁻ cyclotron that will produce proton-induced (neutron deficient) research isotopes.

  2. National Biomedical Tracer Facility. Project definition study

    International Nuclear Information System (INIS)

    Schafer, R.

    1995-01-01

    We request a $25 million government-guaranteed, interest-free loan to be repaid over a 30-year period for construction and initial operations of a cyclotron-based National Biomedical Tracer Facility (NBTF) in North Central Texas. The NBTF will be co-located with a linear accelerator-based commercial radioisotope production facility, funded by the private sector at approximately $28 million. In addition, research radioisotope production by the NBTF will be coordinated through an association with an existing U.S. nuclear reactor center that will produce research and commercial radioisotopes through neutron reactions. The combined facilities will provide the full range of technology for radioisotope production and research: fast neutrons, thermal neutrons, and particle beams (H⁻, H⁺, and D⁺). The proposed NBTF facility includes an 80 MeV, 1 mA H⁻ cyclotron that will produce proton-induced (neutron deficient) research isotopes

  3. Full text clustering and relationship network analysis of biomedical publications.

    Directory of Open Access Journals (Sweden)

    Renchu Guan

    Full Text Available Rapid developments in the biomedical sciences have increased the demand for automatic clustering of biomedical publications. In contrast to current approaches to text clustering, which focus exclusively on the contents of abstracts, a novel method is proposed for clustering and analysis of complete biomedical article texts. To reduce dimensionality, the Cosine Coefficient is used on a sub-space of only two vectors, instead of computing the Euclidean distance within the space of all vectors. Then a strategy and algorithm are introduced for Semi-supervised Affinity Propagation (SSAP) to improve analysis efficiency, using biomedical journal names as an evaluation background. Experimental results show that by avoiding high-dimensional sparse matrix computations, SSAP outperforms conventional k-means methods and improves upon the standard Affinity Propagation algorithm. In constructing a directed relationship network and distribution matrix for the clustering results, it can be noted that overlaps in scope and interests among BioMed publications can be easily identified, providing a valuable analytical tool for editors, authors and readers.
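
    The choice of the cosine coefficient over Euclidean distance can be illustrated on toy term-frequency vectors; the vectors below are illustrative and do not reproduce the SSAP pipeline itself.

        import numpy as np

        def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
            return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

        def euclidean_distance(u: np.ndarray, v: np.ndarray) -> float:
            return float(np.linalg.norm(u - v))

        # Toy term-frequency vectors for two articles over a tiny shared vocabulary.
        doc_a = np.array([3, 0, 1, 4, 0], dtype=float)
        doc_b = np.array([6, 0, 2, 8, 1], dtype=float)   # similar topic, roughly twice as long

        print(round(cosine_similarity(doc_a, doc_b), 3))   # close to 1: length-insensitive
        print(round(euclidean_distance(doc_a, doc_b), 3))  # large, dominated by document length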

  4. Computer classes and games in virtual reality environment to reduce loneliness among students of an elderly reference center: Study protocol for a randomised cross-over design.

    Science.gov (United States)

    Antunes, Thaiany Pedrozo Campos; Oliveira, Acary Souza Bulle de; Crocetta, Tania Brusque; Antão, Jennifer Yohanna Ferreira de Lima; Barbosa, Renata Thais de Almeida; Guarnieri, Regiani; Massetti, Thais; Monteiro, Carlos Bandeira de Mello; Abreu, Luiz Carlos de

    2017-03-01

    Physical and mental changes associated with aging commonly lead to a decrease in communication capacity, reducing social interactions and increasing loneliness. Computer classes for older adults make significant contributions to social and cognitive aspects of aging. Games in a virtual reality (VR) environment stimulate the practice of communicative and cognitive skills and might also bring benefits to older adults. Furthermore, it might help to initiate their contact to the modern technology. The purpose of this study protocol is to evaluate the effects of practicing VR games during computer classes on the level of loneliness of students of an elderly reference center. This study will be a prospective longitudinal study with a randomised cross-over design, with subjects aged 50 years and older, of both genders, spontaneously enrolled in computer classes for beginners. Data collection will be done in 3 moments: moment 0 (T0) - at baseline; moment 1 (T1) - after 8 typical computer classes; and moment 2 (T2) - after 8 computer classes which include 15 minutes for practicing games in VR environment. A characterization questionnaire, the short version of the Short Social and Emotional Loneliness Scale for Adults (SELSA-S) and 3 games with VR (Random, MoviLetrando, and Reaction Time) will be used. For the intervention phase 4 other games will be used: Coincident Timing, Motor Skill Analyser, Labyrinth, and Fitts. The statistical analysis will compare the evolution in loneliness perception, performance, and reaction time during the practice of the games between the 3 moments of data collection. Performance and reaction time during the practice of the games will also be correlated to the loneliness perception. The protocol is approved by the host institution's ethics committee under the number 52305215.3.0000.0082. Results will be disseminated via peer-reviewed journal articles and conferences. This clinical trial is registered at ClinicalTrials.gov identifier: NCT

  5. Advancing biomedical imaging.

    Science.gov (United States)

    Weissleder, Ralph; Nahrendorf, Matthias

    2015-11-24

    Imaging reveals complex structures and dynamic interactive processes, located deep inside the body, that are otherwise difficult to decipher. Numerous imaging modalities harness every last inch of the energy spectrum. Clinical modalities include magnetic resonance imaging (MRI), X-ray computed tomography (CT), ultrasound, and light-based methods [endoscopy and optical coherence tomography (OCT)]. Research modalities include various light microscopy techniques (confocal, multiphoton, total internal reflection, superresolution fluorescence microscopy), electron microscopy, mass spectrometry imaging, fluorescence tomography, bioluminescence, variations of OCT, and optoacoustic imaging, among a few others. Although clinical imaging and research microscopy are often isolated from one another, we argue that their combination and integration is not only informative but also essential to discovering new biology and interpreting clinical datasets in which signals invariably originate from hundreds to thousands of cells per voxel.

  6. Archives: Journal of Medical and Biomedical Sciences

    African Journals Online (AJOL)


  7. Biomedical nanomaterials from design to implementation

    CERN Document Server

    Webster, Thomas

    2016-01-01

    Biomedical Nanomaterials brings together the engineering applications and challenges of using nanostructured surfaces and nanomaterials in healthcare in a single source. Each chapter covers important and new information in the biomedical applications of nanomaterials.

  8. African Journal of Biomedical Research: Journal Sponsorship

    African Journals Online (AJOL)


  9. Archives: Journal of Medicine and Biomedical Research

    African Journals Online (AJOL)


  10. Effort-reward imbalance and one-year change in neck-shoulder and upper extremity pain among call center computer operators.

    Science.gov (United States)

    Krause, Niklas; Burgel, Barbara; Rempel, David

    2010-01-01

    The literature on psychosocial job factors and musculoskeletal pain is inconclusive in part due to insufficient control for confounding by biomechanical factors. The aim of this study was to investigate prospectively the independent effects of effort-reward imbalance (ERI) at work on regional musculoskeletal pain of the neck and upper extremities of call center operators after controlling for (i) duration of computer use both at work and at home, (ii) ergonomic workstation design, (iii) physical activities during leisure time, and (iv) other individual worker characteristics. This was a one-year prospective study among 165 call center operators who participated in a randomized ergonomic intervention trial that has been described previously. Over an approximate four-week period, we measured ERI and 28 potential confounders via a questionnaire at baseline. Regional upper-body pain and computer use was measured by weekly surveys for up to 12 months following the implementation of ergonomic interventions. Regional pain change scores were calculated as the difference between average weekly pain scores pre- and post-intervention. A significant relationship was found between high average ERI ratios and one-year increases in right upper-extremity pain after adjustment for pre-intervention regional mean pain score, current and past physical workload, ergonomic workstation design, and anthropometric, sociodemographic, and behavioral risk factors. No significant associations were found with change in neck-shoulder or left upper-extremity pain. This study suggests that ERI predicts regional upper-extremity pain in computer operators working ≥20 hours per week. Control for physical workload and ergonomic workstation design was essential for identifying ERI as a risk factor.
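
    The ERI ratio is conventionally computed as effort divided by reward with a correction factor for the unequal number of items in the two scales; the sketch below assumes the classic Siegrist item counts, which the abstract does not confirm.

        def eri_ratio(effort_sum: float, reward_sum: float,
                      n_effort_items: int = 6, n_reward_items: int = 11) -> float:
            """Effort-reward imbalance ratio; values > 1 indicate high effort relative to reward.

            Item counts default to the classic Siegrist questionnaire but are assumptions here.
            """
            correction = n_effort_items / n_reward_items
            return effort_sum / (reward_sum * correction)

        # Toy operator: fairly high effort score, middling reward score.
        print(round(eri_ratio(effort_sum=20, reward_sum=33), 2))   # > 1 suggests imbalance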

  11. Shining Future of Biomedical Optics

    Science.gov (United States)

    Wang, Lihong

    2017-10-04

    Lihong V. Wang summarizes his tenure as Editor-in-Chief of the Journal of Biomedical Optics and introduces his successor, Brian Pogue, who will assume the role in January 2018. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  12. Mathematical modeling in biomedical imaging

    CERN Document Server

    2009-01-01

    This volume gives an introduction to a fascinating research area for applied mathematicians. It is devoted to providing an exposition of promising analytical and numerical techniques for solving challenging biomedical imaging problems, which trigger the investigation of interesting issues in various branches of mathematics.

  13. Journal of Biomedical Investigation: Submissions

    African Journals Online (AJOL)

    The following instructions relating to submissions must be adhered to. Failure to conform can lead to delay in publication. Preferred method of submission. Manuscripts may be submitted by post (Editor-in-chief Journal of Biomedical Investigation, Department of Pharmacology and Therapeutics, Faculty of Medicine College ...

  14. Biomedical Engineering Education in Perspective

    Science.gov (United States)

    Gowen, Richard J.

    1973-01-01

    Discusses recent developments in the health care industry and their impact on the future of biomedical engineering education. Indicates that a more thorough understanding of the complex functions of the living organism can be acquired through the application of engineering techniques to problems of life sciences. (CC)

  15. Statistics in three biomedical journals

    Czech Academy of Sciences Publication Activity Database

    Pilčík, Tomáš

    2003-01-01

    Roč. 52, č. 1 (2003), s. 39-43 ISSN 0862-8408 R&D Projects: GA ČR GA310/03/1381 Grant - others:Howard Hughes Medical Institute(US) HHMI55000323 Institutional research plan: CEZ:AV0Z5052915 Keywords : statistics * usage * biomedical journals Subject RIV: EC - Immunology Impact factor: 0.939, year: 2003

  16. Integrated Biomaterials for Biomedical Technology

    CERN Document Server

    Ramalingam, Murugan; Ramakrishna, Seeram; Kobayashi, Hisatoshi

    2012-01-01

    This cutting edge book provides all the important aspects dealing with the basic science involved in materials in biomedical technology, especially structure and properties, techniques and technological innovations in material processing and characterizations, as well as the applications. The volume consists of 12 chapters written by acknowledged experts of the biomaterials field and covers a wide range of topics and applications.

  17. African Journal of Biomedical Research

    African Journals Online (AJOL)

    The African Journal of Biomedical Research was founded in 1998 as a joint project between a private communications outfit (Laytal Communications) and ... is aimed at being registered in future as a non-governmental organization involved in the promotion of scientific proceedings and publications in developing countries.

  18. Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)

    Energy Technology Data Exchange (ETDEWEB)

    Turteltaub, K. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bench, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Buchholz, B. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Enright, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kulp, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCartt, A. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Malfatti, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ognibene, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Loots, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stewart, B. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-04-08

    The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra high sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC AMS to increase our capabilities for metabolic measurements, we will develop methods to understand cellular processes and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: 1.) Increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates. Specifically, develop and validate methods for macromolecule labeling, characterization and quantitation. 2.) Develop and validate methods and strategies to enable AMS to become more broadly used in human studies. Specifically, demonstrate robust methods for conducting pharmacokinetic/pharmacodynamics studies in humans and model systems. 3.) Increase the accessibility of AMS to the Biomedical research community and the throughput of AMS through direct coupling to separatory

  19. Resource for the Development of Biomedical Accelerator Mass Spectrometry (AMS)

    Energy Technology Data Exchange (ETDEWEB)

    Turteltaub, K. W. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Bench, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Buchholz, B. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Enright, H. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kulp, K. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Loots, G. G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); McCartt, A. D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Malfatti, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Ognibene, T. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Stewart, B. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-03-21

    The NIH Research Resource for Biomedical AMS was originally funded at Lawrence Livermore National Laboratory in 1999 to develop and apply the technology of accelerator mass spectrometry (AMS) in broad-based biomedical research. The Resource’s niche is to fill needs for ultra-high-sensitivity quantitation when isotope-labeled agents are used. The Research Resource’s Technology Research and Development (TR&D) efforts will focus on the needs of the biomedical research community in the context of seven Driving Biomedical Projects (DBPs) that will drive the Center’s technical capabilities through three core TR&Ds. We will expand our present capabilities by developing a fully integrated HPLC AMS to increase our capabilities for metabolic measurements; we will develop methods to understand cellular processes; and we will develop and validate methods for the application of AMS in human studies, which is a growing area of demand by collaborators and service users. In addition, we will continue to support new and ongoing collaborative and service projects that require the capabilities of the Resource. The Center will continue to train researchers in the use of the AMS capabilities being developed, and the results of all efforts will be widely disseminated to advance progress in biomedical research. Towards these goals, our specific aims are to: 1) increase the value and information content of AMS measurements by combining molecular speciation with quantitation of defined macromolecular isolates, specifically by developing and validating methods for macromolecule labeling, characterization and quantitation; 2) develop and validate methods and strategies to enable AMS to become more broadly used in human studies, specifically by demonstrating robust methods for conducting pharmacokinetic/pharmacodynamic studies in humans and model systems; and 3) increase the accessibility of AMS to the biomedical research community and the throughput of AMS through direct coupling to separatory

  20. Design and analysis of a tendon-based computed tomography-compatible robot with remote center of motion for lung biopsy.

    Science.gov (United States)

    Yang, Yunpeng; Jiang, Shan; Yang, Zhiyong; Yuan, Wei; Dou, Huaisu; Wang, Wei; Zhang, Daguang; Bian, Yuan

    2017-04-01

    Biopsy is currently a decisive method of lung cancer diagnosis, but lung biopsy is time-consuming, complex and inaccurate. A computed tomography-compatible robot for rapid and precise lung biopsy is therefore developed in this article. Following the actual operation process, the robot is divided into two modules: a 4-degree-of-freedom positioning module for locating the puncture point, suitable for almost all patient positions, and a 3-degree-of-freedom tendon-based orientation module with a remote center of motion, which is compact and computed tomography-compatible and orients and inserts the needle automatically inside the computed tomography bore. The workspace of the robot surrounds the patient's thorax, and the needle tip forms a cone under the patient's skin. A new error model of the robot based on screw theory is proposed that accounts for structure error and actuation error, both regarded as screw motions. Simulations verify the precision of the error model and contrast it with compensation via inverse kinematics. The results of an insertion experiment on a specific phantom demonstrate the feasibility of the robot, with a mean error of 1.373 mm in a laboratory environment, which is accurate enough to replace manual operation.
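    For readers less familiar with screw theory, the standard way to write a structural or actuation error as a screw motion is through the exponential map of a twist; the notation below is the generic textbook form and is only assumed to correspond to the authors' error model:

      $$ g(\theta) = e^{[\hat{\xi}]\,\theta}, \qquad [\hat{\xi}] = \begin{pmatrix} [\omega]_{\times} & v \\ \mathbf{0}^{T} & 0 \end{pmatrix} \in \mathfrak{se}(3), $$

    where $\omega$ is the screw-axis direction, $v$ encodes translation along and about the axis, $[\omega]_{\times}$ is the skew-symmetric cross-product matrix, and $\theta$ is the (small) error magnitude; composing one such exponential per error source perturbs the nominal forward kinematics.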

  1. Biomedical informatics discovering knowledge in big data

    CERN Document Server

    Holzinger, Andreas

    2014-01-01

    This book provides a broad overview of the topic of Bioinformatics (medical informatics + biological information) with a focus on data, information and knowledge. From data acquisition and storage to visualization, privacy, regulatory, and other practical and theoretical topics, the author touches on several fundamental aspects of the innovative interface between the medical and computational domains that forms biomedical informatics. Each chapter starts by providing a useful inventory of definitions and commonly used acronyms for its topic, and throughout the text the reader finds several real-world examples, methodologies, and ideas that complement the technical and theoretical background. At the beginning of each chapter, a new section called "key problems" has been added, in which the author discusses possible traps and unsolvable or major problems. This new edition also includes new sections at the end of each chapter, called "future outlook and research avenues," providing pointers to future challenges.

  2. Pharmacovigilance and Biomedical Informatics: A Model for Future Development.

    Science.gov (United States)

    Beninger, Paul; Ibara, Michael A

    2016-12-01

    The discipline of pharmacovigilance is rooted in the aftermath of the thalidomide tragedy of 1961. It has evolved as a result of collaborative efforts by many individuals and organizations, including physicians, patients, Health Authorities, universities, industry, the World Health Organization, the Council for International Organizations of Medical Sciences, and the International Conference on Harmonisation. Biomedical informatics is rooted in technologically based methodologies and has evolved at the speed of computer technology. The purpose of this review is to bring a novel lens to pharmacovigilance, looking at the evolution and development of the field of pharmacovigilance from the perspective of biomedical informatics, with the explicit goal of providing a foundation for discussion of the future direction of pharmacovigilance as a discipline. For this review, we searched [publication trend for the log10 value of the numbers of publications identified in PubMed] using the key words [informatics (INF), pharmacovigilance (PV), pharmacovigilance + informatics (PV + INF)], for [study types] articles published between [1994-2015]. We manually searched the reference lists of identified articles for additional information. Biomedical informatics has made significant contributions to the infrastructural development of pharmacovigilance. However, there has not otherwise been a systematic assessment of the role of biomedical informatics in enhancing the field of pharmacovigilance, and there has been little cross-discipline scholarship. Rapidly developing innovations in biomedical informatics pose a challenge to pharmacovigilance in finding ways to include new sources of safety information, including social media, massively linked databases, and mobile and wearable wellness applications and sensors. With biomedical informatics as a lens, it is evident that certain aspects of pharmacovigilance are evolving more slowly. However, the high levels of mutual interest in
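    For readers who want to reproduce this kind of publication-trend count, a minimal sketch using NCBI's public E-utilities is shown below; the endpoint, query syntax, and JSON fields follow the current eutils service and are assumptions here, not details taken from this record.

      import math
      import requests

      EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

      def pubmed_count(term, year):
          """Return the number of PubMed records matching `term` for a given year."""
          params = {
              "db": "pubmed",
              "term": f"{term} AND {year}[pdat]",  # publication-date filter
              "retmax": 0,                         # counts only, no record IDs
              "retmode": "json",
          }
          r = requests.get(EUTILS, params=params, timeout=30)
          r.raise_for_status()
          return int(r.json()["esearchresult"]["count"])

      # Example: log10 publication trend for "pharmacovigilance AND informatics"
      for year in range(1994, 2016):
          n = pubmed_count("pharmacovigilance AND informatics", year)
          print(year, n, round(math.log10(n), 2) if n else "-")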

  3. Environmental Modeling Center

    Data.gov (United States)

    Federal Laboratory Consortium — The Environmental Modeling Center provides the computational tools to perform geostatistical analysis, to model ground water and atmospheric releases for comparison...

  4. Modeling, stability analysis, and computational aspects of some simplest nonlinear fuzzy two-term controllers derived via center of area/gravity defuzzification.

    Science.gov (United States)

    Arun, N K; Mohan, B M

    2017-09-01

    The mathematical models reported in the literature so far have been found using the Center of Sums (CoS) defuzzification method only. It appears that no one has found models using the Center of Area (CoA) or Center of Gravity (CoG) defuzzification method. Although some works have dealt with modeling of fuzzy controllers via the Centroid method, all of them have in fact used the CoS method only. In this paper, for the first time, mathematical models of the simplest Mamdani-type fuzzy Proportional Integral (PI)/Proportional Derivative (PD) controllers via CoG defuzzification are presented. L-type and Γ-type membership functions over different Universes of Discourse (UoDs) are considered for the input variables. L-type, Π-type and Γ-type membership functions are considered for the output variable. Three linear fuzzy control rules relating all four input fuzzy sets to three output fuzzy sets are chosen. Two triangular norms, namely Algebraic Product (AP) and Minimum (Min), the Maximum (Max) triangular co-norm, and two inference methods, Larsen Product (LP) and Mamdani Minimum (MM), are used. Properties of the models are studied. Stability analysis of closed-loop systems containing one of these controller models in the loop is done using the Small Gain theorem. Since digital controllers are implemented on digital processors, the computational and memory requirements of these fuzzy controllers and of conventional (nonfuzzy) controllers are compared. A rough estimate is given of the computational time taken by a digital computer while implementing any of these discrete-time fuzzy controllers. Two nonlinear plants are considered to show the superiority of the simplest fuzzy controller obtained using the CoA or CoG defuzzification method over the simplest fuzzy controller obtained using the CoS method and reported recently. Real-time implementation of one of the developed controller models is carried out on a coupled-tank experimental setup to show the feasibility of the developed model
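    For context, the Center of Gravity (equivalently Center of Area) defuzzified output is the centroid of the aggregated output fuzzy set, whereas Center of Sums weights each rule's output set separately; in generic textbook notation (not necessarily the authors' symbols):

      $$ u_{\mathrm{CoG}} = \frac{\displaystyle\int_{Y} y\,\mu_{\mathrm{agg}}(y)\,dy}{\displaystyle\int_{Y} \mu_{\mathrm{agg}}(y)\,dy}, \qquad u_{\mathrm{CoS}} = \frac{\displaystyle\sum_{i}\int_{Y} y\,\mu_{i}(y)\,dy}{\displaystyle\sum_{i}\int_{Y} \mu_{i}(y)\,dy}, $$

    where $\mu_{\mathrm{agg}}(y) = \max_i \mu_i(y)$ aggregates the clipped or scaled rule output sets $\mu_i$. CoG counts overlapping regions only once, which is why its closed-form controller models differ from the CoS models reported earlier.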

  5. Analytical techniques in biomedical stable isotope applications : (isotope ratio) mass spectrometry or infrared spectrometry?

    NARCIS (Netherlands)

    Stellaard, F; Elzinga, H

    2005-01-01

    An overview is presented of biomedical applications of stable isotopes in general, but mainly focused on the activities of the Center for Liver, Digestive and Metabolic Diseases of the University Medical Center Groningen. The aims of metabolic studies in the areas of glucose, fat, cholesterol and

  6. Contribution for labelling study of cellular and molecular structures of biomedical interest with technetium 99

    International Nuclear Information System (INIS)

    Rebello, L.H.; Piotkwosky, M.C.; Pereira, J.A.A.; Boasquevisque, E.M.; Silva, J.R.M.; Reis, R.J.N.; Pires, E.T.; Bernardo-Filho, M.

    1992-01-01

    The methodologies for labelling bacteria, planaria and cercariae from the schistosomiasis evolution cycle, as well as oxamniquine, with technetium-99m, developed in the Biomedical Center of the Rio de Janeiro University and in the Research Center of the National Institute of Cancer, are presented. (C.G.C.)

  7. Software for biomedical engineering signal processing laboratory experiments.

    Science.gov (United States)

    Tompkins, Willis J; Wilson, J

    2009-01-01

    In the early 1990s we developed a special computer program called UW DigiScope to provide a mechanism for anyone interested in biomedical digital signal processing to study the field without requiring any instrument other than a personal computer. Many digital filtering and pattern recognition algorithms are used in processing biomedical signals. In general, students have very limited opportunity for hands-on access to the mechanisms of digital signal processing. In a typical course, the filters are designed non-interactively, which gives the student little understanding of the design constraints of such filters or of their actual performance characteristics. UW DigiScope 3.0 is the first major update since version 2.0 was released in 1994. This paper provides details on how the new version, based on MATLAB, works with signals, including the filter design tool that serves as the programming interface between UW DigiScope and processing algorithms.
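    As an illustration of the kind of filter-design exercise described here (a generic SciPy sketch, not code from UW DigiScope or its MATLAB interface), a zero-phase Butterworth band-pass filter applied to a sampled biomedical signal might look like this:

      import numpy as np
      from scipy.signal import butter, filtfilt

      def bandpass(signal, fs, low_hz=0.5, high_hz=40.0, order=4):
          """Zero-phase Butterworth band-pass, e.g. for an ECG sampled at fs Hz."""
          nyq = 0.5 * fs
          b, a = butter(order, [low_hz / nyq, high_hz / nyq], btype="band")
          return filtfilt(b, a, signal)  # forward-backward filtering: no phase shift

      fs = 360.0                                   # hypothetical sampling rate
      t = np.arange(0, 10, 1 / fs)
      raw = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.random.randn(t.size)  # toy signal
      clean = bandpass(raw, fs)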

  8. Scientific visualization uncertainty, multifield, biomedical, and scalable visualization

    CERN Document Server

    Chen, Min; Johnson, Christopher; Kaufman, Arie; Hagen, Hans

    2014-01-01

    Based on the seminar that took place in Dagstuhl, Germany in June 2011, this contributed volume studies four important topics within the scientific visualization field: uncertainty visualization, multifield visualization, biomedical visualization and scalable visualization.
      • Uncertainty visualization deals with uncertain data from simulations or sampled data, uncertainty due to the mathematical processes operating on the data, and uncertainty in the visual representation.
      • Multifield visualization addresses the need to depict multiple data at individual locations and the combination of multiple datasets.
      • Biomedical visualization is a vast field, with selected subtopics addressed ranging from scanning methodologies to structural applications to biological applications.
      • Scalability in scientific visualization is critical as data grows and computational devices range from hand-held mobile devices to exascale computational platforms.
    Scientific Visualization will be useful to practitioners of scientific visualization, ...

  9. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide participants with state-of-the-art knowledge of emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  10. Branding the bio/biomedical engineering degree.

    Science.gov (United States)

    Voigt, Herbert F

    2011-01-01

    The future challenges to medical and biological engineering, sometimes referred to as biomedical engineering or simply bioengineering, are many. Some of these are identifiable now and others will emerge from time to time as new technologies are introduced and harnessed. There is a fundamental issue regarding "Branding the bio/biomedical engineering degree" that requires a common understanding of what is meant by a B.S. degree in Biomedical Engineering, Bioengineering, or Biological Engineering. In this paper we address some of the issues involved in branding the Bio/Biomedical Engineering degree, with the aim of clarifying the Bio/Biomedical Engineering brand.

  11. Predicting disease-related genes using integrated biomedical networks

    OpenAIRE

    Peng, Jiajie; Bai, Kun; Shang, Xuequn; Wang, Guohua; Xue, Hansheng; Jin, Shuilin; Cheng, Liang; Wang, Yadong; Chen, Jin

    2017-01-01

    Background Identifying the genes associated with human diseases is crucial for disease diagnosis and drug design. Computational approaches, especially network-based approaches, have recently been developed to identify disease-related genes effectively from existing biomedical networks. Meanwhile, advances in biotechnology enable researchers to produce multi-omics data, enriching our understanding of human diseases and revealing the complex relationships between genes and diseases. Howeve...

  12. Secure management of biomedical data with cryptographic hardware.

    Science.gov (United States)

    Canim, Mustafa; Kantarcioglu, Murat; Malin, Bradley

    2012-01-01

    The biomedical community is increasingly migrating toward research endeavors that are dependent on large quantities of genomic and clinical data. At the same time, various regulations require that such data be shared beyond the initial collecting organization (e.g., an academic medical center). It is of critical importance to ensure that when such data are shared, as well as managed, it is done so in a manner that upholds the privacy of the corresponding individuals and the overall security of the system. In general, organizations have attempted to achieve these goals through deidentification methods that remove explicitly identifying, and potentially identifying, features (e.g., names, dates, and geocodes). However, a growing number of studies demonstrate that deidentified data can be reidentified to named individuals using simple automated methods. As an alternative, it was shown that biomedical data could be shared, managed, and analyzed through practical cryptographic protocols without revealing the contents of any particular record. Yet, such protocols required the inclusion of multiple third parties, which may not always be feasible in the context of trust or bandwidth constraints. Thus, in this paper, we introduce a framework that removes the need for multiple third parties by collocating services to store and to process sensitive biomedical data through the integration of cryptographic hardware. Within this framework, we define a secure protocol to process genomic data and perform a series of experiments to demonstrate that such an approach can be run in an efficient manner for typical biomedical investigations.

  13. Comparison of concept recognizers for building the Open Biomedical Annotator

    Directory of Open Access Journals (Sweden)

    Rubin Daniel

    2009-09-01

    Full Text Available Abstract The National Center for Biomedical Ontology (NCBO) is developing a system for automated, ontology-based access to online biomedical resources (Shah NH, et al.: Ontology-driven indexing of public datasets for translational bioinformatics. BMC Bioinformatics 2009, 10(Suppl 2):S1). The system's indexing workflow processes the text metadata of diverse resources such as datasets from GEO and ArrayExpress to annotate and index them with concepts from appropriate ontologies. This indexing requires the use of a concept-recognition tool to identify ontology concepts in the resource's textual metadata. In this paper, we present a comparison of two concept recognizers – NLM's MetaMap and the University of Michigan's Mgrep. We utilize a number of data sources and dictionaries to evaluate the concept recognizers in terms of precision, recall, speed of execution, scalability and customizability. Our evaluations demonstrate that Mgrep has a clear edge over MetaMap for large-scale service-oriented applications. Based on our analysis we also suggest areas of potential improvement for Mgrep. We have subsequently used Mgrep to build the Open Biomedical Annotator service. The Annotator service has access to a large dictionary of biomedical terms derived from the Unified Medical Language System (UMLS) and NCBO ontologies. The Annotator also leverages the hierarchical structure of the ontologies and their mappings to expand annotations. The Annotator service is available to the community as a REST Web service for creating ontology-based annotations of their data.
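    A minimal sketch of calling an annotator of this kind over REST is shown below; the endpoint, parameter names, and API-key requirement follow the present-day NCBO BioPortal Annotator and are assumptions here rather than details taken from this record.

      import requests

      ANNOTATOR_URL = "https://data.bioontology.org/annotator"  # assumed endpoint
      API_KEY = "YOUR_BIOPORTAL_API_KEY"                        # placeholder

      def annotate(text, ontologies=("NCIT", "GO")):
          """Return ontology class IRIs recognized in free-text metadata."""
          params = {
              "apikey": API_KEY,
              "text": text,
              "ontologies": ",".join(ontologies),
          }
          r = requests.get(ANNOTATOR_URL, params=params, timeout=30)
          r.raise_for_status()
          return [a["annotatedClass"]["@id"] for a in r.json()]

      print(annotate("breast carcinoma tissue microarray expression dataset"))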

  14. A User-Centered Mobile Cloud Computing Platform for Improving Knowledge Management in Small-to-Medium Enterprises in the Chilean Construction Industry

    Directory of Open Access Journals (Sweden)

    Daniela Núñez

    2018-03-01

    Full Text Available Knowledge management (KM) is a key element for the development of small-to-medium enterprises (SMEs) in the construction industry. This is particularly relevant in Chile, where this industry is composed almost entirely of SMEs. Although various KM system proposals can be found in the literature, they are not suitable for SMEs, due to usability problems, budget constraints, and time and connectivity issues. Mobile Cloud Computing (MCC) systems offer several advantages to construction SMEs, but they have not yet been exploited to address KM needs. Therefore, this research is aimed at the development of an MCC-based KM platform to manage lessons learned in different construction projects of SMEs, through an iterative and user-centered methodology. Usability and quality evaluations of the proposed platform show that MCC is a feasible and attractive option to address the KM issues in SMEs of the Chilean construction industry, since it is possible to consider both technical and usability requirements.

  15. Computed tomography-guided needle aspiration and biopsy of pulmonary lesions - A single-center experience in 1000 patients

    Energy Technology Data Exchange (ETDEWEB)

    Poulou, Loukia S.; Tsagouli, Paraskevi; Thanos, Loukas [Dept. of Medical Imaging and Interventional Radiology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)], e-mail: ploukia@hotmail.com; Ziakas, Panayiotis D. [Program of Outcomes Research, Div. of Infectious Diseases, Warren Alpert Medical School, Brown Univ., RI, and Div. of Infectious Diseases, Rhode Island Hospital, Rhode Island (United States); Politi, Dimitra [Dept. of Cytopathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece); Trigidou, Rodoula [Dept. of Pathology, General Hospital of Chest Diseases 'Sotiria', Athens (Greece)

    2013-07-15

    Background: Computed tomography (CT)-guided fine needle aspiration (FNA) and biopsies are well-established, minimally invasive diagnostic tools for pulmonary lesions. Purpose: To analyze retrospectively the results of 1000 consecutive lung CT-guided FNA and/or core needle biopsies (CNB), the main outcome measures being diagnostic yield, and complication rates. Material and Methods: Patients considered eligible were those referred to our department for lung lesions. The choice of FNA, CNB, or both was based upon the radiologist's judgment. Diagnostic yield was defined as the probability of having a definite result by cytology/histology. Results: The study included 733 male patients and 267 female patients, with a mean (SD) age of 66.4 (11.4) years. The mean (SD) lesion size was 3.7 (2.4) cm in maximal diameter. Six hundred and forty-one (64%) patients underwent an FNA procedure, 245 (25%) a CNB, and 114 (11%) had been subjected to both. The diagnostic yield was 960/994 (96.6%); this decreased significantly with the use of CNB only (odds ratio [OR] 0.32; 95% CI 0.12 - 0.88; P = 0.03), while it increased with lesion size (OR 1.35; 95% CI 1.03 - 1.79; P = 0.03 per cm increase). In 506 patients (52.7%), a malignant process was diagnosed by cytopathology/histology. The complication rate reached 97/1000 (9.7%); complications included: hemorrhage, 62 (6.2%); pneumothorax, 28 (2.8%); hemorrhage and pneumothorax, 5 (0.5%); and hemoptysis, 2 (0.2%). It was not significantly affected by the type of procedure or localization of the lesion. The overall risk for complications was three times higher for lesions <4 cm (OR 3.26; 95% CI 1.96 - 5.42; P < 0.001). Conclusion: CT-guided lung biopsy has a high diagnostic yield using FNA, CNB, or both. The CNB procedure alone will not suffice. Complication rates were acceptable and correlated inversely with lesion size, not localization or type of procedure.
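    To illustrate how odds ratios of this kind are obtained, the sketch below runs Fisher's exact test on a 2x2 table of complications versus lesion size; the counts are hypothetical, since the abstract does not give the per-group breakdown.

      from scipy.stats import fisher_exact

      # Hypothetical 2x2 table: rows = lesion < 4 cm / >= 4 cm,
      # columns = complication / no complication.
      table = [[60, 340],
               [37, 563]]
      odds_ratio, p_value = fisher_exact(table)
      print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
      # OR = (60 * 563) / (340 * 37) ≈ 2.69 in this invented example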

  16. Atom-Centered Potentials with Dispersion-Corrected Minimal-Basis-Set Hartree-Fock: An Efficient and Accurate Computational Approach for Large Molecular Systems.

    Science.gov (United States)

    Prasad, Viki Kumar; Otero-de-la-Roza, Alberto; DiLabio, Gino A

    2018-02-13

    We present a computational methodology based on atom-centered potentials (ACPs) for the efficient and accurate structural modeling of large molecular systems. ACPs are atom-centered one-electron potentials that have the same functional form as effective-core potentials. In recent works, we showed that ACPs can be used to produce a correction to the ground-state wave function and electronic energy to alleviate shortcomings in the underlying model chemistry. In this work, we present ACPs for H, C, N, and O atoms that are specifically designed to predict accurate non-covalent binding energies and inter- and intramolecular geometries when combined with dispersion-corrected Hartree-Fock (HF-D3) and a minimal basis set (scaled MINI, or MINIs). For example, the combined HF-D3/MINIs-ACP method demonstrates excellent performance, with mean absolute errors of 0.36 and 0.28 kcal/mol for the S22x5 and S66x8 benchmark sets, respectively, relative to highly correlated complete-basis-set data. The application of ACPs results in a significant decrease in error compared to uncorrected HF-D3/MINIs for all benchmark sets examined. In addition, HF-D3/MINIs-ACP has a cost only slightly higher than a minimal-basis-set HF calculation and can be used with any electronic structure program for molecular quantum chemistry that uses Gaussian basis sets and effective-core potentials.
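    For orientation, atom-centered potentials of this type share the standard Gaussian effective-core-potential form; the expression below is the generic semi-local form, with fitted coefficients c, exponents ζ and powers n (the authors' specific parameter values are not reproduced here):

      $$ U_{\mathrm{ACP}}(\mathbf{r}) = \sum_{l} \Big[ \sum_{i} c_{li}\, r^{\,n_{li}-2}\, e^{-\zeta_{li} r^{2}} \Big]\, \hat{P}_{l}, \qquad \hat{P}_{l} = \sum_{m=-l}^{l} \lvert l m \rangle \langle l m \rvert, $$

    where $r$ is the distance from the atomic center and $\hat{P}_l$ projects onto angular momentum $l$. Because the ACP is a one-electron potential added to the core Hamiltonian, its cost is essentially negligible next to the underlying HF-D3/MINIs calculation, consistent with the timing claim above.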

  17. Gold Nanocages for Biomedical Applications

    OpenAIRE

    Skrabalak, Sara E.; Chen, Jingyi; Au, Leslie; Lu, Xianmao; Li, Xingde; Xia, Younan

    2007-01-01

    Nanostructured materials provide a promising platform for early cancer detection and treatment. Here we highlight recent advances in the synthesis and use of Au nanocages for such biomedical applications. Gold nanocages represent a novel class of nanostructures, which can be prepared via a remarkably simple route based on the galvanic replacement reaction between Ag nanocubes and HAuCl4. The Au nanocages have a tunable surface plasmon resonance peak that extends into the near-infrared, where ...

  18. Biomedical devices and their applications

    CERN Document Server

    2004-01-01

    This volume introduces readers to the basic concepts and recent advances in the field of biomedical devices. The text gives a detailed account of novel developments in drug delivery, protein electrophoresis, estrogen mimicking methods and medical devices. It also provides the necessary theoretical background as well as describing a wide range of practical applications. The level and style make this book accessible not only to scientific and medical researchers but also to graduate students.

  19. The Ontology for Biomedical Investigations.

    Science.gov (United States)

    Bandrowski, Anita; Brinkman, Ryan; Brochhausen, Mathias; Brush, Matthew H; Bug, Bill; Chibucos, Marcus C; Clancy, Kevin; Courtot, Mélanie; Derom, Dirk; Dumontier, Michel; Fan, Liju; Fostel, Jennifer; Fragoso, Gilberto; Gibson, Frank; Gonzalez-Beltran, Alejandra; Haendel, Melissa A; He, Yongqun; Heiskanen, Mervi; Hernandez-Boussard, Tina; Jensen, Mark; Lin, Yu; Lister, Allyson L; Lord, Phillip; Malone, James; Manduchi, Elisabetta; McGee, Monnie; Morrison, Norman; Overton, James A; Parkinson, Helen; Peters, Bjoern; Rocca-Serra, Philippe; Ruttenberg, Alan; Sansone, Susanna-Assunta; Scheuermann, Richard H; Schober, Daniel; Smith, Barry; Soldatova, Larisa N; Stoeckert, Christian J; Taylor, Chris F; Torniai, Carlo; Turner, Jessica A; Vita, Randi; Whetzel, Patricia L; Zheng, Jie

    2016-01-01

    The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being addressed

  20. The Ontology for Biomedical Investigations.

    Directory of Open Access Journals (Sweden)

    Anita Bandrowski

    Full Text Available The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions. The OBI Consortium maintains a web resource (http://obi-ontology.org) providing details on the people, policies, and issues being

  1. Biomedical waste management: An overview

    OpenAIRE

    Mahendra R.R Raj

    2009-01-01

    Waste disposal management is an essential and integral part of any health care system. Health care providers have often been ignorant of, or have not fully appreciated, the basic aspects of the importance and effective management of hospital waste. This overview of biomedical waste disposal/management gives a thorough insight into the aspects of the guidelines to be followed and adopted according to the international WHO-approved methodology for a cleaner, disease-free, and health...

  2. Projected Applications of a "Climate in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    Science.gov (United States)

    Jedlovec, Gary J.; Molthan, Andrew L.; Zavodsky, Bradley; Case, Jonathan L.; LaFontaine, Frank J.

    2010-01-01

    The NASA Short-term Prediction Research and Transition (SPoRT) Center focuses on the transition of unique observations and research capabilities to the operational weather community, with a goal of improving short-term forecasts on a regional scale. Advances in research computing have led to "Climate in a Box" systems, with hardware configurations capable of producing high-resolution, near-real-time weather forecasts, but with footprints, power, and cooling requirements that are comparable to desktop systems. The SPoRT Center has developed several capabilities for incorporating unique NASA research capabilities and observations with real-time weather forecasts. Planned utilization includes the development of a fully-cycled data assimilation system used to drive 36-48 hour forecasts produced by the NASA Unified version of the Weather Research and Forecasting (WRF) model (NU-WRF). The horsepower provided by the "Climate in a Box" system is expected to facilitate the assimilation of vertical profiles of temperature and moisture provided by the Atmospheric Infrared Sounder (AIRS) aboard the NASA Aqua satellite. In addition, the Moderate Resolution Imaging Spectroradiometer (MODIS) instruments aboard NASA's Aqua and Terra satellites provide high-resolution sea surface temperatures and vegetation characteristics. The development of MODIS normalized difference vegetation index (NDVI) composites for use within the NASA Land Information System (LIS) will assist in the characterization of vegetation, and subsequently the surface albedo and processes related to soil moisture. Through application of satellite simulators, NASA satellite instruments can be used to examine forecast model errors in cloud cover and other characteristics. Through the aforementioned application of the "Climate in a Box" system and NU-WRF capabilities, an end goal is the establishment of a real-time forecast system that fully integrates modeling and analysis capabilities developed within the NASA SPo
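    For reference, the normalized difference vegetation index is computed per pixel from red and near-infrared reflectances; the sketch below is illustrative only, and the MODIS band assignments (band 1 = red, band 2 = NIR) are stated as an assumption rather than taken from this record.

      import numpy as np

      def ndvi(red, nir):
          """NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]; zero where undefined."""
          red = red.astype(np.float64)
          nir = nir.astype(np.float64)
          denom = nir + red
          safe = np.where(denom == 0, 1.0, denom)   # avoid division by zero
          return np.where(denom == 0, 0.0, (nir - red) / safe)

      # Toy surface-reflectance tiles (MODIS band 1 ~ red, band 2 ~ NIR)
      red_band = np.array([[0.08, 0.10], [0.12, 0.09]])
      nir_band = np.array([[0.35, 0.40], [0.30, 0.33]])
      print(ndvi(red_band, nir_band))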

  3. Magnetic nanoparticles for biomedical applications

    International Nuclear Information System (INIS)

    Krustev, P.; Ruskov, T.

    2007-01-01

    In this paper we describe different biomedical applications of magnetic nanoparticles. Over the past decade, a number of biomedical applications have begun to emerge for magnetic nanoparticles of differing sizes, shapes, and compositions. Areas under investigation include targeted drug delivery, ultra-sensitive disease detection, gene therapy, high-throughput genetic screening, biochemical sensing, and rapid toxicity cleansing. Magnetic nanoparticles exhibit ferromagnetic or superparamagnetic behavior, magnetizing strongly under an applied field. In the second case (superparamagnetic nanoparticles) there is no permanent magnetism once the field is removed. Superparamagnetic nanoparticles are highly attractive as in vivo probes or in vitro tools to extract information on biochemical systems. The optical properties of magnetic metal nanoparticles are spectacular and have therefore generated a great deal of excitement during the last few decades. Many applications, such as MRI imaging and hyperthermia, rely on the use of iron oxide particles. Moreover, magnetic nanoparticles conjugated with antibodies are also applied in hyperthermia and have enabled tumor-specific contrast enhancement in MRI. Other promising biomedical applications involve treating tumor cells with magnetic nanoparticles combined with X-ray ionizing radiation, which employs the magnetic nanoparticles as a complementary radiation source inside the tumor. (authors)

  4. Superhydrophobic Materials for Biomedical Applications

    Science.gov (United States)

    Colson, Yolonda L.; Grinstaff, Mark W.

    2016-01-01

    Superhydrophobic surfaces are actively studied across a wide range of applications and industries, and are now finding increased use in the biomedical arena as substrates to control protein adsorption, cellular interaction, and bacterial growth, as well as platforms for drug delivery devices and for diagnostic tools. The commonality in the design of these materials is to create a stable or metastable air state at the material surface, which lends itself to a number of unique properties. These activities are catalyzing the development of new materials, applications, and fabrication techniques, as well as collaborations across material science, chemistry, engineering, and medicine given the interdisciplinary nature of this work. The review begins with a discussion of superhydrophobicity, and then explores biomedical applications that are utilizing superhydrophobicity in depth including material selection characteristics, in vitro performance, and in vivo performance. General trends are offered for each application in addition to discussion of conflicting data in the literature, and the review concludes with the authors’ future perspectives on the utility of superhydrophobic surfaces for biomedical applications. PMID:27449946

  5. Biomedical applications of nanodiamond (Review)

    Science.gov (United States)

    Turcheniuk, K.; Mochalin, Vadym N.

    2017-06-01

    The interest in nanodiamond applications in biology and medicine is on the rise over recent years. This is due to the unique combination of properties that nanodiamond provides. Small size (∼5 nm), low cost, scalable production, negligible toxicity, chemical inertness of diamond core and rich chemistry of nanodiamond surface, as well as bright and robust fluorescence resistant to photobleaching are the distinct parameters that render nanodiamond superior to any other nanomaterial when it comes to biomedical applications. The most exciting recent results have been related to the use of nanodiamonds for drug delivery and diagnostics—two components of a quickly growing area of biomedical research dubbed theranostics. However, nanodiamond offers much more in addition: it can be used to produce biodegradable bone surgery devices, tissue engineering scaffolds, kill drug resistant microbes, help us to fight viruses, and deliver genetic material into cell nucleus. All these exciting opportunities require an in-depth understanding of nanodiamond. This review covers the recent progress as well as general trends in biomedical applications of nanodiamond, and underlines the importance of purification, characterization, and rational modification of this nanomaterial when designing nanodiamond based theranostic platforms.

  6. Career Development among American Biomedical Postdocs.

    Science.gov (United States)

    Gibbs, Kenneth D; McGready, John; Griffin, Kimberly

    2015-01-01

    Recent biomedical workforce policy efforts have centered on enhancing career preparation for trainees, and increasing diversity in the research workforce. Postdoctoral scientists, or postdocs, are among those most directly impacted by such initiatives, yet their career development remains understudied. This study reports results from a 2012 national survey of 1002 American biomedical postdocs. On average, postdocs reported increased knowledge about career options but lower clarity about their career goals relative to PhD entry. The majority of postdocs were offered structured career development at their postdoctoral institutions, but less than one-third received this from their graduate departments. Postdocs from all social backgrounds reported significant declines in interest in faculty careers at research-intensive universities and increased interest in nonresearch careers; however, there were differences in the magnitude and period of training during which these changes occurred across gender and race/ethnicity. Group differences in interest in faculty careers were explained by career interest differences formed during graduate school but not by differences in research productivity, research self-efficacy, or advisor relationships. These findings point to the need for enhanced career development earlier in the training process, and interventions sensitive to distinctive patterns of interest development across social identity groups. © 2015 K. D. Gibbs et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).

  7. Biotechnology development for biomedical applications.

    Energy Technology Data Exchange (ETDEWEB)

    Kuehl, Michael; Brozik, Susan Marie; Rogers, David Michael; Rempe, Susan L.; Abhyankar, Vinay V.; Hatch, Anson V.; Dirk, Shawn M.; Hedberg-Dirk, Elizabeth (University of New Mexico, Albuquerque, NM); Sukharev, Sergei (University of Maryland, College Park, MD); Anishken, Andriy (University of Maryland, College Park, MD); Cicotte, Kirsten; De Sapio, Vincent; Buerger, Stephen P.; Mai, Junyu

    2010-11-01

    Sandia's scientific and engineering expertise in the fields of computational biology, high-performance prosthetic limbs, biodetection, and bioinformatics has been applied to specific problems at the forefront of cancer research. Molecular modeling was employed to design stable mutations of the enzyme L-asparaginase with improved selectivity for asparagine over other amino acids with the potential for improved cancer chemotherapy. New electrospun polymer composites with improved electrical conductivity and mechanical compliance have been demonstrated with the promise of direct interfacing between the peripheral nervous system and the control electronics of advanced prosthetics. The capture of rare circulating tumor cells has been demonstrated on a microfluidic chip produced with a versatile fabrication process capable of integration with existing lab-on-a-chip and biosensor technology. In addition, software tools have been developed to increase the calculation speed of clustered heat maps for the display of relationships in large arrays of protein data. All these projects were carried out in collaboration with researchers at the University of Texas M. D. Anderson Cancer Center in Houston, TX.

  8. Computer methods in biomechanics and biomedical engineering - Supplement 1: papers from the 32nd congress of the Société de Biomécanique, Lyon, 28-29th August

    OpenAIRE

    CHEZE, L; DUMAS, R; NICOLLE, S; MIDDLETON, J; JACOBS, CR

    2007-01-01

    Subjects: Bioinformatics; Biomaterials; Biomaterials & Medical Devices; Biomaterials & Medical devices; Biomechanics; Biomechanics & Human Movement Science; Breast Cancer; Cardiovascular Imaging; Computational Mechanics; Dentistry; Diagnostic Imaging; Ergonomics; Ergonomics & Human Factors; Mechanics: Fluid Dynamics; Mechanical Engineering: Fluid Dynamics; Mathematical Biology; Mechanical Engineering: Mechanical Engineering Design; Design: Mechanical Engineering Design; Mechanical Engineering...

  9. Results of the 2016 International Skin Imaging Collaboration International Symposium on Biomedical Imaging challenge: Comparison of the accuracy of computer algorithms to dermatologists for the diagnosis of melanoma from dermoscopic images.

    Science.gov (United States)

    Marchetti, Michael A; Codella, Noel C F; Dusza, Stephen W; Gutman, David A; Helba, Brian; Kalloo, Aadi; Mishra, Nabin; Carrera, Cristina; Celebi, M Emre; DeFazio, Jennifer L; Jaimes, Natalia; Marghoob, Ashfaq A; Quigley, Elizabeth; Scope, Alon; Yélamos, Oriol; Halpern, Allan C

    2018-02-01

    Computer vision may aid in melanoma detection. We sought to compare the melanoma diagnostic accuracy of computer algorithms with that of dermatologists using dermoscopic images. We conducted a cross-sectional study using 100 randomly selected dermoscopic images (50 melanomas, 44 nevi, and 6 lentigines) from an international computer vision melanoma challenge dataset (n = 379), along with individual algorithm results from 25 teams. We used 5 methods (nonlearned and machine learning) to combine individual automated predictions into "fusion" algorithms. In a companion study, 8 dermatologists classified the lesions in the 100 images as either benign or malignant. The average sensitivity and specificity of dermatologists in classification were 82% and 59%, respectively. At 82% sensitivity, dermatologist specificity was similar to that of the top challenge algorithm (59% vs. 62%, P = .68) but lower than that of the best-performing fusion algorithm (59% vs. 76%, P = .02). The receiver operating characteristic area of the top fusion algorithm was greater than the mean receiver operating characteristic area of dermatologists (0.86 vs. 0.71, P = .001). The dataset lacked the full spectrum of skin lesions encountered in clinical practice, particularly banal lesions. Readers and algorithms were not provided clinical data (eg, age or lesion history/symptoms). Results obtained using our study design cannot be extrapolated to clinical practice. Deep learning computer vision systems classified melanoma dermoscopy images with accuracy that exceeded some but not all dermatologists. Copyright © 2017 American Academy of Dermatology, Inc. Published by Elsevier Inc. All rights reserved.
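    A minimal sketch of one simple, non-learned fusion of per-algorithm melanoma scores (averaging, then scoring with ROC AUC and a fixed operating point) is given below; the toy arrays and the choice of plain averaging are illustrative and are not the specific fusion methods evaluated in the study.

      import numpy as np
      from sklearn.metrics import roc_auc_score

      # preds: shape (n_algorithms, n_images); each row is one team's malignancy scores
      preds = np.array([
          [0.9, 0.2, 0.7, 0.4],
          [0.8, 0.3, 0.6, 0.5],
          [0.7, 0.1, 0.8, 0.3],
      ])
      y_true = np.array([1, 0, 1, 0])     # 1 = melanoma, 0 = benign

      fusion = preds.mean(axis=0)         # unweighted-average "fusion" score
      print("fusion AUC:", roc_auc_score(y_true, fusion))

      # Sensitivity/specificity at a fixed threshold, as in the reader comparison
      pred_label = (fusion >= 0.5).astype(int)
      sensitivity = (pred_label[y_true == 1] == 1).mean()
      specificity = (pred_label[y_true == 0] == 0).mean()
      print("sensitivity:", sensitivity, "specificity:", specificity)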

  10. A novel biomedical image indexing and retrieval system via deep preference learning.

    Science.gov (United States)

    Pang, Shuchao; Orgun, Mehmet A; Yu, Zhezhou

    2018-05-01

    -of-the-art techniques in indexing biomedical images. We propose a novel and automated indexing system based on deep preference learning to characterize biomedical images for developing computer aided diagnosis (CAD) systems in healthcare. Our proposed system shows an outstanding indexing ability and high efficiency for biomedical image retrieval applications and it can be used to collect and annotate the high-resolution images in a biomedical database for further biomedical image research and applications. Copyright © 2018 Elsevier B.V. All rights reserved.

  11. Do routinely repeated computed tomography scans in traumatic brain injury influence management? A prospective observational study in a level 1 trauma center.

    Science.gov (United States)

    Connon, Francis F; Namdarian, Benjamin; Ee, Joanne L C; Drummond, Katharine J; Miller, Julie A

    2011-12-01

    To prospectively examine the clinical role of routine repeat computed tomographic scans of the brain (CTB) in patients with traumatic head injury. The use of routine serial CTB after traumatic head injury is recommended by some authors, but remains controversial. From March 2007 to October 2008, all patients with traumatic head injury admitted to the Royal Melbourne Hospital, a metropolitan, Level I trauma center, were prospectively studied. After the initial computed tomography brain scans, any subsequent CTBs were assessed and were recorded as being either "clinically indicated" or "routine" and ensuing medical and surgical management. Inpatient information was recorded and comparisons made according to indication for CTB, Glasgow Coma Scale, and management changes. A total of 651 patients were admitted with traumatic head injury over the 20-month study period. Of those, 39 underwent immediate craniotomy/craniectomy and were excluded from analysis. Another 25 were excluded due to incomplete data, leaving 591 patients for analysis. Of the 591 assessed, 401 were discharged with no further computed tomography investigation. One hundred and ninety patients underwent a total of 305 repeat brain scans, of which 149 were clinically indicated, whereas 156 were obtained as a "routine" investigation with no deterioration in patients' neurological status. Of the repeated scans, 71 were improved, 169 were unchanged, and 64 were worse. None of the 156 patients who received a "routine" CTB required a change in management. The 149 CTB performed for clinical deterioration resulted in a change in management in 28 patients (19%). The patients who underwent "indicated" computed tomographic scans and subsequently required a change in management were on average younger (P < 0.001) and more severely head injured (P = 0.001) than the patients not requiring a change in management. No patients from our cohort with a "routine" repeat CTB required a change in management. Given the costs

  12. Predicting Treatment Relations with Semantic Patterns over Biomedical Knowledge Graphs.

    Science.gov (United States)

    Bakal, Gokhan; Kavuluru, Ramakanth

    2015-12-01

    Identifying new potential treatment options (say, medications and procedures) for known medical conditions that cause human disease burden is a central task of biomedical research. Since all candidate drugs cannot be tested with animal and clinical trials, in vitro approaches are first attempted to identify promising candidates. Even before this step, thanks to recent advances, in silico or computational approaches are also being employed to identify viable treatment options. Generally, natural language processing (NLP) and machine learning are used to predict specific relations between any given pair of entities using the distant supervision approach. In this paper, we report preliminary results on predicting treatment relations between biomedical entities purely based on semantic patterns over biomedical knowledge graphs. As such, we refrain from explicitly using NLP, although the knowledge graphs themselves may be built from NLP extractions. Our intuition is fairly straightforward - entities that participate in a treatment relation may be connected by similar path patterns in biomedical knowledge graphs extracted from the scientific literature. Using a dataset of treatment relation instances derived from the well-known Unified Medical Language System (UMLS), we verify our intuition by employing graph path patterns from a well-known knowledge graph as features in machine-learned models. We achieve high recall (92%), but precision decreases from 95% to an acceptable 71% as we go from a uniform class distribution to a ten-fold increase in negative instances. We also demonstrate that models trained with patterns of length ≤ 3 result in statistically significant gains in F-score over those trained with patterns of length ≤ 2. Our results show the potential of exploiting knowledge graphs for relation extraction, and we believe this is the first effort to employ graph patterns as features for identifying biomedical relations.
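    A minimal sketch of turning knowledge-graph path patterns into classifier features is shown below; it is purely illustrative, and the toy graph, node names and predicate labels are assumptions rather than the authors' actual pipeline or data.

      from collections import Counter
      import networkx as nx

      # Toy biomedical knowledge graph with labeled predicates on the edges.
      G = nx.MultiDiGraph()
      G.add_edge("aspirin", "inflammation", predicate="INHIBITS")
      G.add_edge("inflammation", "arthritis", predicate="CAUSES")
      G.add_edge("aspirin", "COX2", predicate="INHIBITS")
      G.add_edge("COX2", "arthritis", predicate="ASSOCIATED_WITH")

      def path_pattern_features(graph, source, target, max_len=3):
          """Count predicate-sequence patterns on simple paths of length <= max_len."""
          und = graph.to_undirected(as_view=False)  # allow paths that ignore direction
          feats = Counter()
          for path in nx.all_simple_paths(und, source, target, cutoff=max_len):
              preds = []
              for u, v in zip(path, path[1:]):
                  edge_data = und.get_edge_data(u, v)  # parallel edges, if any
                  preds.append(sorted(d["predicate"] for d in edge_data.values())[0])
              feats["->".join(preds)] += 1
          return feats

      print(path_pattern_features(G, "aspirin", "arthritis"))
      # e.g. Counter({'INHIBITS->CAUSES': 1, 'INHIBITS->ASSOCIATED_WITH': 1})

    Pattern-count vectors of this kind, built per entity pair, can then be fed to any standard classifier, with distant-supervision labels drawn from known treatment pairs such as those in UMLS.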

  13. NeuroTerrain--a client-server system for browsing 3D biomedical image data sets.

    Science.gov (United States)

    Gustafson, Carl; Bug, William J; Nissanov, Jonathan

    2007-02-05

    Three-dimensional biomedical image sets are becoming ubiquitous, along with the canonical atlases providing the necessary spatial context for analysis. To make full use of these 3D image sets, one must be able to present views for 2D display, either surface renderings or 2D cross-sections through the data. Typical display software is limited to presentations along one of the three orthogonal anatomical axes (coronal, horizontal, or sagittal). However, data sets precisely oriented along the major axes are rare. To make fullest use of these datasets, one must reasonably match the atlas' orientation; this involves resampling the atlas in planes matched to the data set. Traditionally, this requires that the atlas and browser reside on the user's desktop; unfortunately, in addition to being monolithic programs, these tools often require substantial local resources. In this article, we describe a network-capable, client-server framework to slice and visualize 3D atlases at off-axis angles, along with an open client architecture and development kit to support integration into complex data analysis environments. Here we describe the basic architecture of a client-server 3D visualization system, consisting of a thin Java client built on a development kit, and a computationally robust, high-performance server written in ANSI C++. The Java client components (NetOStat) support arbitrary-angle viewing and run on readily available desktop computers running Mac OS X, Windows XP, or Linux as a downloadable Java Application. Using the NeuroTerrain Software Development Kit (NT-SDK), sophisticated atlas browsing can be added to any Java-compatible application requiring as little as 50 lines of Java glue code, thus making it eminently re-usable and much more accessible to programmers building more complex biomedical data analysis tools. The NT-SDK separates the interactive GUI components from the server control and monitoring, so as to support development of non-interactive applications

  14. Understanding the Structure-Function Relationships of Dendrimers in Environmental and Biomedical Applications

    Science.gov (United States)

    Wang, Bo

    We are living in an era in which nanoparticles (NPs) are widely applied in our lives. Dendrimers are special polymeric NPs with unique physicochemical properties, which have been intensely explored for a variety of applications. Current studies on dendrimers are bottlenecked by an insufficient understanding of their structure and dynamic behaviors at the molecular level. Using primarily computational approaches supplemented by many other experimental techniques, this dissertation aims to establish structure-function relationships of dendrimers in environmental and biomedical applications. More specifically, it thoroughly investigates the interactions between dendrimers and different biomolecules and nanomaterials, including carbon-based NPs, metal-based NPs, and proteins/peptides. Those results not only provide profound knowledge for evaluating the impacts of dendrimers on environmental and biological systems but also facilitate designing next-generation functional polymeric nanomaterials. The dissertation is organized as follows. Chapter 1 provides an overview of current progress in dendrimer studies, where the methodology of Discrete Molecular Dynamics (DMD), my major research tool, is also introduced. Two directions of utilizing dendrimers will be discussed in the following chapters. Chapter 2 will focus on environmental applications of dendrimers, where two back-to-back studies are presented. I will start by describing some interesting observations from experiments, i.e., that dendrimers dispersed model oil molecules. Then, I will reveal, through computational modeling, why the surface chemistries of dendrimers lead to different remediation efficiencies. Finally, I will demonstrate different scenarios of dendrimer-small molecule association. Chapter 3 is centered on dendrimers in biomedical applications and includes two subtopics. In the first topic, we will discuss dendrimers as surfactants that modulate the interactions between proteins and NPs. Some fundamental concepts regarding NPs

  15. The ethics of biomedical big data

    CERN Document Server

    Mittelstadt, Brent Daniel

    2016-01-01

    This book presents cutting edge research on the new ethical challenges posed by biomedical Big Data technologies and practices. ‘Biomedical Big Data’ refers to the analysis of aggregated, very large datasets to improve medical knowledge and clinical care. The book describes the ethical problems posed by aggregation of biomedical datasets and re-use/re-purposing of data, in areas such as privacy, consent, professionalism, power relationships, and ethical governance of Big Data platforms. Approaches and methods are discussed that can be used to address these problems to achieve the appropriate balance between the social goods of biomedical Big Data research and the safety and privacy of individuals. Seventeen original contributions analyse the ethical, social and related policy implications of the analysis and curation of biomedical Big Data, written by leading experts in the areas of biomedical research, medical and technology ethics, privacy, governance and data protection. The book advances our understan...

  16. Text mining patents for biomedical knowledge.

    Science.gov (United States)

    Rodriguez-Esteban, Raul; Bundschus, Markus

    2016-06-01

    Biomedical text mining of scientific knowledge bases, such as Medline, has received much attention in recent years. Given that text mining is able to automatically extract biomedical facts that revolve around entities such as genes, proteins, and drugs, from unstructured text sources, it is seen as a major enabler to foster biomedical research and drug discovery. In contrast to the biomedical literature, research into the mining of biomedical patents has not reached the same level of maturity. Here, we review existing work and highlight the associated technical challenges that emerge from automatically extracting facts from patents. We conclude by outlining potential future directions in this domain that could help drive biomedical research and drug discovery. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. SCELib3.0: The new revision of SCELib, the parallel computational library of molecular properties in the Single Center Approach

    Science.gov (United States)

    Sanna, N.; Baccarelli, I.; Morelli, G.

    2009-12-01

    SCELib is a computer program which implements the Single Center Expansion (SCE) method to describe molecular electronic densities and the interaction potentials between a charged projectile (electron or positron) and a target molecular system. The first version (CPC Catalog identifier ADMG_v1_0) was submitted to the CPC Program Library in 2000, and version 2.0 (ADMG_v2_0) was submitted in 2004. We here announce the new release 3.0, which presents additional features with respect to the previous versions, aiming at a significant enhancement of its capability to deal with larger molecular systems. SCELib 3.0 allows for ab initio effective core potential (ECP) calculations of the molecular wavefunctions to be used in the SCE method, in addition to the standard all-electron description of the molecule. The list of supported architectures has been updated and the code has been ported to platforms based on accelerating coprocessors, such as NVIDIA GPGPUs; the new parallel model adopted is able to run efficiently on mixed many-core computing systems. Program summary: Program title: SCELib3.0. Catalogue identifier: ADMG_v3_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADMG_v3_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 2 018 862. No. of bytes in distributed program, including test data, etc.: 4 955 014. Distribution format: tar.gz. Programming language: C. Compilers used: xlc V8.x, Intel C V10.x, Portland Group V7.x, nvcc V2.x. Computer: All SMP platforms based on AIX, Linux and SUNOS operating systems over SPARC, POWER, Intel Itanium2, X86, em64t and Opteron processors. Operating system: SUNOS, IBM AIX, Linux RedHat (Enterprise), Linux SuSE (SLES). Has the code been vectorized or parallelized?: Yes, 1 to 32 CPUs or GPUs used. RAM: Up to 32 GB depending on the molecular
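
    For readers unfamiliar with the method the program implements, the single-center expansion referred to above writes a molecular quantity (electronic density or interaction potential) as a truncated spherical-harmonic series about one expansion center. The lines below give the generic textbook form of such an expansion, shown here only as orientation; the specific coefficients and conventions used by SCELib are defined in the program documentation and the cited CPC papers.

      V(\mathbf{r}) \;=\; \sum_{l=0}^{l_{\max}} \sum_{m=-l}^{l} v_{lm}(r)\, Y_{lm}(\hat{\mathbf{r}}),
      \qquad
      v_{lm}(r) \;=\; \int Y_{lm}^{*}(\hat{\mathbf{r}})\, V(\mathbf{r})\, d\Omega

    Here l_max is the truncation parameter that controls both accuracy and cost, which is why larger molecular systems (the stated goal of release 3.0) demand more memory and parallel resources.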

  18. The Biomedical Resource Ontology (BRO) to enable resource discovery in clinical and translational research.

    Science.gov (United States)

    Tenenbaum, Jessica D; Whetzel, Patricia L; Anderson, Kent; Borromeo, Charles D; Dinov, Ivo D; Gabriel, Davera; Kirschner, Beth; Mirel, Barbara; Morris, Tim; Noy, Natasha; Nyulas, Csongor; Rubenson, David; Saxman, Paul R; Singh, Harpreet; Whelan, Nancy; Wright, Zach; Athey, Brian D; Becich, Michael J; Ginsburg, Geoffrey S; Musen, Mark A; Smith, Kevin A; Tarantal, Alice F; Rubin, Daniel L; Lyster, Peter

    2011-02-01

    The biomedical research community relies on a diverse set of resources, both within their own institutions and at other research centers. In addition, an increasing number of shared electronic resources have been developed. Without effective means to locate and query these resources, it is challenging, if not impossible, for investigators to be aware of the myriad resources available, or to effectively perform resource discovery when the need arises. In this paper, we describe the development and use of the Biomedical Resource Ontology (BRO) to enable semantic annotation and discovery of biomedical resources. We also describe the Resource Discovery System (RDS) which is a federated, inter-institutional pilot project that uses the BRO to facilitate resource discovery on the Internet. Through the RDS framework and its associated Biositemaps infrastructure, the BRO facilitates semantic search and discovery of biomedical resources, breaking down barriers and streamlining scientific research that will improve human health. Copyright © 2010 Elsevier Inc. All rights reserved.
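
    To make the idea of ontology-backed resource discovery concrete, the toy Python sketch below annotates resources with terms from a tiny hypothetical is-a hierarchy and lets a query on a broad term retrieve resources annotated with narrower descendants. The term names, records, and functions are invented for illustration and are not the actual BRO vocabulary or the RDS/Biositemaps API.

      # Toy sketch of ontology-backed resource discovery in the spirit of BRO/Biositemaps:
      # resources are annotated with ontology terms, and a query on a term also matches
      # resources annotated with its descendants. Terms and records are hypothetical.
      IS_A = {                                         # child -> parent (invented fragment)
          "Microarray_Data": "Data_Resource",
          "Imaging_Data": "Data_Resource",
          "Data_Resource": "Resource",
      }

      RESOURCES = [                                    # hypothetical Biositemaps-style records
          {"name": "Campus microarray repository", "bro_term": "Microarray_Data"},
          {"name": "Neuro MRI archive", "bro_term": "Imaging_Data"},
      ]

      def ancestors(term):
          while term in IS_A:
              term = IS_A[term]
              yield term

      def discover(query_term):
          """Return resources annotated with the query term or with one of its descendants."""
          return [r for r in RESOURCES
                  if r["bro_term"] == query_term or query_term in ancestors(r["bro_term"])]

      print([r["name"] for r in discover("Data_Resource")])   # both resources match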

  19. Program of “Okayama Biomedical Engineering Professional” for Local Renovation

    Science.gov (United States)

    Hayashi, Kozaburo

    Okayama University of Science, Department of Biomedical Engineering, is promoting a program of “Okayama Biomedical Engineering Professional” for the development and renovation of biomedical industries in the Okayama area. This is one of the programs of the national project on “Formation of the Center for the Production of Capable Persons for Local Renovation”, sponsored by the Ministry of Education, Culture, Sports, Science and Technology, Japan, and performed by the Japan Science and Technology Agency. The purpose of the program is to develop and educate specialists for the research, development, production, and marketing of biomedical devices and equipment in local industries in the Okayama area. A half-year training course for approximately 5 students from industry consists of 12 days of lectures and experiments and is offered repeatedly over 5 years (approximately 45 students in total).

  20. An introduction to biomedical instrumentation

    CERN Document Server

    Dewhurst, D J

    1976-01-01

    An Introduction to Biomedical Instrumentation presents a course of study and applications covering the basic principles of medical and biological instrumentation, as well as the typical features of its design and construction. The book aims to aid not only the cognitive domain of the readers but also their psychomotor domain. Aside from the seminar topics provided, which are divided into 27 chapters, the book complements these topics with practical applications of the discussions. Figures and mathematical formulas are also given. Major topics discussed include the construction, handli

  1. Tritium AMS for biomedical applications

    International Nuclear Information System (INIS)

    Roberts, M.L.; Velsko, C.; Turteltaub, K.W.

    1993-08-01

    We are developing ³H AMS to measure ³H activity of mg-sized biological samples. LLNL has already successfully applied ¹⁴C AMS to a variety of problems in the area of biomedical research. Development of ³H AMS would greatly complement these studies. The ability to perform ³H AMS measurements at sensitivities equivalent to those obtained for ¹⁴C will allow us to perform experiments using compounds that are not readily available in ¹⁴C-tagged form. A ³H capability would also allow us to perform unique double-labeling experiments in which we learn the fate, distribution, and metabolism of separate fractions of biological compounds.

  2. Thermoresponsive Polymers for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Theoni K. Georgiou

    2011-08-01

    Thermoresponsive polymers are a class of “smart” materials that have the ability to respond to a change in temperature, a property that makes them useful materials in a wide range of applications and consequently attracts much scientific interest. This review focuses mainly on the studies published over the last 10 years on the synthesis and use of thermoresponsive polymers for biomedical applications including drug delivery, tissue engineering and gene delivery. A summary of the main applications is given following the different studies on thermoresponsive polymers, which are categorized based on their 3-dimensional structure: hydrogels, interpenetrating networks, micelles, crosslinked micelles, polymersomes, films and particles.

  3. Introduction to biomedical engineering technology

    CERN Document Server

    Street, Laurence J

    2011-01-01

    Contents (truncated): Introduction; History of Medical Devices; The Role of Biomedical Engineering Technologists in Health Care; Characteristics of Human Anatomy and Physiology That Relate to Medical Devices; Summary; Questions; Diagnostic Devices: Part One; Physiological Monitoring Systems; The Heart; Summary; Questions; Diagnostic Devices: Part Two; Circulatory System and Blood; Respiratory System; Nervous System; Summary; Questions; Diagnostic Devices: Part Three; Digestive System; Sensory Organs; Reproduction; Skin, Bone, Muscle, Miscellaneous; Chapter Summary; Questions; Diagnostic Imaging; Introduction; X-Rays; Magnetic Resonance Imaging Scanners; Positron Emissio

  4. Biomedical signal and image processing

    CERN Document Server

    Najarian, Kayvan

    2012-01-01

    Contents (truncated): INTRODUCTION TO DIGITAL SIGNAL AND IMAGE PROCESSING; Signals and Biomedical Signal Processing; Introduction and Overview; What is a "Signal"?; Analog, Discrete, and Digital Signals; Processing and Transformation of Signals; Signal Processing for Feature Extraction; Some Characteristics of Digital Images; Summary; Problems; Fourier Transform; Introduction and Overview; One-Dimensional Continuous Fourier Transform; Sampling and Nyquist Rate; One-Dimensional Discrete Fourier Transform; Two-Dimensional Discrete Fourier Transform; Filter Design; Summary; Problems; Image Filtering, Enhancement, and Restoration; Introduction and Overview

  5. Biomedical waste management: An overview

    Directory of Open Access Journals (Sweden)

    Mahendra R.R Raj

    2009-01-01

    Waste disposal management is an essential and integral part of any health care system. Health care providers have often been unaware of, or have not fully understood, the basic aspects of the importance and effective management of hospital waste. This overview of biomedical waste disposal and management gives a thorough insight into the guidelines to be followed and adopted, in accordance with the internationally WHO-approved methodology, for cleaner, disease-free, and healthier medical services to the populace, i.e., to hospital employees, patients, and society.

  6. Review of Biomedical Image Processing

    Directory of Open Access Journals (Sweden)

    Ciaccio Edward J

    2011-11-01

    This article is a review of the book 'Biomedical Image Processing' by Thomas M. Deserno, published by Springer-Verlag. Salient information that will be useful in deciding whether the book is relevant to topics of interest to the reader, and whether it might be suitable as a course textbook, is presented in the review. This includes information about the book details, a summary, the suitability of the text for course and research work, the framework of the book, its specific content, and conclusions.

  7. Luminescent nanodiamonds for biomedical applications.

    Science.gov (United States)

    Say, Jana M; van Vreden, Caryn; Reilly, David J; Brown, Louise J; Rabeau, James R; King, Nicholas J C

    2011-12-01

    In recent years, nanodiamonds have emerged from a primarily industrial and mechanical applications base to potentially underpin sophisticated new technologies in biomedical and quantum science. Nanodiamonds are relatively inexpensive, biocompatible, easy to surface functionalise and optically stable. This combination of physical properties is ideally suited to biological applications, including intracellular labelling and tracking, extracellular drug delivery and adsorptive detection of bioactive molecules. Here we describe some of the methods and challenges for processing nanodiamond materials, detection schemes and some of the leading applications currently under investigation.

  8. Declining trend in the use of repeat computed tomography for trauma patients admitted to a level I trauma center for traffic-related injuries.

    Science.gov (United States)

    Psoter, Kevin J; Roudsari, Bahman S; Graves, Janessa M; Mack, Christopher; Jarvik, Jeffrey G

    2013-06-01

    To evaluate the trend in utilization of repeat (i.e. ≥2) computed tomography (CT) and to compare utilization patterns across body regions for trauma patients admitted to a level I trauma center for traffic-related injuries (TRI). We linked the Harborview Medical Center trauma registry (1996-2010) to the billing department data. We extracted the following variables: type and frequency of CTs performed, age, gender, race/ethnicity, insurance status, injury mechanism and severity, length of hospitalization, intensive care unit (ICU) admission and final disposition. TRIs were defined as motor vehicle collisions, motorcycle, bicycle and pedestrian-related injuries. Logistic regression was used to evaluate the association between utilization of different body region repeat (i.e. ≥2) CTs and year of admission, adjusting for patient and injury-related characteristics that could influence utilization patterns. A total of 28,431 patients were admitted for TRIs over the study period and 9499 (33%) received repeat CTs. From 1996 to 2010, the proportion of patients receiving repeat CTs decreased by 33%. Relative to 2000 and adjusting for other covariates, patients with TRIs admitted in 2010 had significantly lower odds of undergoing repeat head (OR=0.61; 95% CI: 0.49-0.76), pelvis (OR=0.37; 95% CI: 0.27-0.52), cervical spine (OR=0.23; 95% CI: 0.12-0.43), and maxillofacial CTs (OR=0.24; 95% CI: 0.10-0.57). However, they had higher odds of receiving repeat thoracic CTs (OR=1.86; 95% CI: 1.02-3.38). A significant decrease in the utilization of repeat CTs was observed in trauma patients presenting with traffic-related injuries over a 15-year period. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
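
    The study above reports adjusted odds ratios from logistic regression with admission year as the exposure of interest. As a minimal, hedged illustration of that style of analysis (not the authors' code or data), the Python sketch below fits such a model with statsmodels on synthetic data, using year 2000 as the reference category and a single covariate standing in for the study's injury-related adjustments.

      # Minimal sketch (synthetic data, not the study's analysis): adjusted odds ratios for
      # receiving a repeat head CT by admission year, with year 2000 as reference and one
      # injury-severity covariate standing in for the study's richer set of adjustments.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      n = 2000
      df = pd.DataFrame({
          "year": rng.integers(1996, 2011, n),         # admission year, 1996-2010
          "iss": rng.integers(1, 40, n),               # injury severity score (hypothetical)
      })
      # Synthetic outcome: repeat-CT probability declines with year and rises with severity.
      lin_pred = -0.08 * (df["year"] - 2000) + 0.05 * df["iss"] - 1.0
      df["repeat_head_ct"] = (rng.random(n) < 1 / (1 + np.exp(-lin_pred))).astype(int)

      model = smf.logit(
          "repeat_head_ct ~ C(year, Treatment(reference=2000)) + iss", data=df
      ).fit(disp=0)
      odds_ratios = np.exp(model.params)               # e.g. OR for 2010 vs. the 2000 reference
      print(odds_ratios.filter(like="2010"))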

  9. Declining trend in the use of repeat computed tomography for trauma patients admitted to a level I trauma center for traffic-related injuries

    International Nuclear Information System (INIS)

    Psoter, Kevin J.; Roudsari, Bahman S.; Graves, Janessa M.; Mack, Christopher; Jarvik, Jeffrey G.

    2013-01-01

    Objective: To evaluate the trend in utilization of repeat (i.e. ≥2) computed tomography (CT) and to compare utilization patterns across body regions for trauma patients admitted to a level I trauma center for traffic-related injuries (TRI). Materials and Methods: We linked the Harborview Medical Center trauma registry (1996–2010) to the billing department data. We extracted the following variables: type and frequency of CTs performed, age, gender, race/ethnicity, insurance status, injury mechanism and severity, length of hospitalization, intensive care unit (ICU) admission and final disposition. TRIs were defined as motor vehicle collisions, motorcycle, bicycle and pedestrian-related injuries. Logistic regression was used to evaluate the association between utilization of different body region repeat (i.e. ≥2) CTs and year of admission, adjusting for patient and injury-related characteristics that could influence utilization patterns. Results: A total of 28,431 patients were admitted for TRIs over the study period and 9499 (33%) received repeat CTs. From 1996 to 2010, the proportion of patients receiving repeat CTs decreased by 33%. Relative to 2000 and adjusting for other covariates, patients with TRIs admitted in 2010 had significantly lower odds of undergoing repeat head (OR = 0.61; 95% CI: 0.49–0.76), pelvis (OR = 0.37; 95% CI: 0.27–0.52), cervical spine (OR = 0.23; 95% CI: 0.12–0.43), and maxillofacial CTs (OR = 0.24; 95% CI: 0.10–0.57). However, they had higher odds of receiving repeat thoracic CTs (OR = 1.86; 95% CI: 1.02–3.38). Conclusion: A significant decrease in the utilization of repeat CTs was observed in trauma patients presenting with traffic-related injuries over a 15-year period

  10. Declining trend in the use of repeat computed tomography for trauma patients admitted to a level I trauma center for traffic-related injuries

    Energy Technology Data Exchange (ETDEWEB)

    Psoter, Kevin J., E-mail: kevinp2@u.washington.edu [Department of Epidemiology, University of Washington, Box 357236, Seattle, WA 98195 (United States); Roudsari, Bahman S., E-mail: roudsari@u.washington.edu [Department of Radiology, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Graves, Janessa M., E-mail: janessa@u.washington.edu [Department of Pediatrics, Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Mack, Christopher, E-mail: cdmack@uw.edu [Harborview Injury Prevention and Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States); Jarvik, Jeffrey G., E-mail: jarvikj@u.washington.edu [Department of Radiology and Department of Neurological Surgery, Comparative Effectiveness, Cost and Outcomes Research Center, University of Washington, 325 Ninth Avenue, Box 359960, Seattle, WA 98104 (United States)

    2013-06-15

    Objective: To evaluate the trend in utilization of repeat (i.e. ≥2) computed tomography (CT) and to compare utilization patterns across body regions for trauma patients admitted to a level I trauma center for traffic-related injuries (TRI). Materials and Methods: We linked the Harborview Medical Center trauma registry (1996–2010) to the billing department data. We extracted the following variables: type and frequency of CTs performed, age, gender, race/ethnicity, insurance status, injury mechanism and severity, length of hospitalization, intensive care unit (ICU) admission and final disposition. TRIs were defined as motor vehicle collisions, motorcycle, bicycle and pedestrian-related injuries. Logistic regression was used to evaluate the association between utilization of different body region repeat (i.e. ≥2) CTs and year of admission, adjusting for patient and injury-related characteristics that could influence utilization patterns. Results: A total of 28,431 patients were admitted for TRIs over the study period and 9499 (33%) received repeat CTs. From 1996 to 2010, the proportion of patients receiving repeat CTs decreased by 33%. Relative to 2000 and adjusting for other covariates, patients with TRIs admitted in 2010 had significantly lower odds of undergoing repeat head (OR = 0.61; 95% CI: 0.49–0.76), pelvis (OR = 0.37; 95% CI: 0.27–0.52), cervical spine (OR = 0.23; 95% CI: 0.12–0.43), and maxillofacial CTs (OR = 0.24; 95% CI: 0.10–0.57). However, they had higher odds of receiving repeat thoracic CTs (OR = 1.86; 95% CI: 1.02–3.38). Conclusion: A significant decrease in the utilization of repeat CTs was observed in trauma patients presenting with traffic-related injuries over a 15-year period.

  11. Mathematics and physics of emerging biomedical imaging

    National Research Council Canada - National Science Library

    National Research Council Staff; Commission on Physical Sciences, Mathematics, and Applications; Division on Engineering and Physical Sciences; National Research Council; National Academy of Sciences

    .... Incorporating input from dozens of biomedical researchers who described what they perceived as key open problems of imaging that are amenable to attack by mathematical scientists and physicists...

  12. Frontiers in biomedical engineering and biotechnology.

    Science.gov (United States)

    Liu, Feng; Goodarzi, Ali; Wang, Haifeng; Stasiak, Joanna; Sun, Jianbo; Zhou, Yu

    2014-01-01

    The 2nd International Conference on Biomedical Engineering and Biotechnology (iCBEB 2013), held in Wuhan on 11–13 October 2013, is an annual conference that aims at providing an opportunity for international and national researchers and practitioners to present the most recent advances and future challenges in the fields of Biomedical Information, Biomedical Engineering and Biotechnology. The papers published in this issue were selected from this conference; they reflect the frontier of the field of Biomedical Engineering and Biotechnology, which has particularly helped to improve the level of clinical diagnosis in medical practice.

  13. Computer Center CDC Reference Manual.

    Science.gov (United States)

    1984-09-01

    For card decks, the following job card requirements apply at the DTNSRDC site: MFF - CDC CYBER 176 - orange (Carderock), yellow stripe (Annapolis); MFE - CDC CYBER 750 - blue or purple (Carderock), yellow stripe (Annapolis). No other cards in the deck should be any of the above. ... semi-private files ... day -OFF - Turn off dayfile messages for individual MSAUDITs for LO=FP (could generate wallpaper). Has no effect on other options.

  14. Publications in biomedical and environmental sciences programs, 1980

    International Nuclear Information System (INIS)

    Pfuderer, H.A.; Moody, J.B.

    1981-07-01

    This bibliography contains 690 references to articles in journals, books, and reports published in the subject area of biomedical and environmental sciences during 1980. There are 529 references to articles published in journals and books and 161 references to reports. Staff members in the Biomedical and Environmental Sciences divisions have other publications not included in this bibliography; for example, theses, book reviews, abstracts published in journals or symposia proceedings, pending journal publications and reports such as monthly and bimonthly progress reports, contractor reports, and reports for internal distribution. This document is sorted by the division, and then alphabetically by author. The sorting by divisions separates the references by subject area in a simple way. The divisions represented in the order that they appear in the bibliography are Analytical Chemistry, Biology, Chemical Technology, Information R and D, Health and Safety Research, Energy, Environmental Sciences, and Computer Sciences

  15. Publications in biomedical and environmental sciences programs, 1980

    Energy Technology Data Exchange (ETDEWEB)

    Pfuderer, H.A.; Moody, J.B.

    1981-07-01

    This bibliography contains 690 references to articles in journals, books, and reports published in the subject area of biomedical and environmental sciences during 1980. There are 529 references to articles published in journals and books and 161 references to reports. Staff members in the Biomedical and Environmental Sciences divisions have other publications not included in this bibliography; for example, theses, book reviews, abstracts published in journals or symposia proceedings, pending journal publications and reports such as monthly and bimonthly progress reports, contractor reports, and reports for internal distribution. This document is sorted by the division, and then alphabetically by author. The sorting by divisions separates the references by subject area in a simple way. The divisions represented in the order that they appear in the bibliography are Analytical Chemistry, Biology, Chemical Technology, Information R and D, Health and Safety Research, Energy, Environmental Sciences, and Computer Sciences.

  16. Bioethical Principles of Biomedical Research Involving Animals

    Directory of Open Access Journals (Sweden)

    Bakir Mehić

    2011-08-01

    animals for research, testing, or training in different countries. In the few that have done so, the measures adopted vary widely: on the one hand, legally enforceable detailed regulations with licensing of experimenters and their premises together with an official inspectorate; on the other, entirely voluntary self-regulation by the biomedical community, with lay participation. Many variations are possible between these extremes, one intermediate situation being a legal requirement that experiments or other procedures involving the use of animals should be subject to the approval of ethical committees of specified composition. The International Guiding Principles are the product of the collaboration of a representative sample of the international biomedical community, including experts of the World Health Organization, and of consultations with responsible animal welfare groups. The International Guiding Principles have already gained a considerable measure of acceptance internationally. European Medical Research Councils (EMRC), an international association that includes all the West European medical research councils, fully endorsed the Guiding Principles in 1984. Here we present the basic bioethical principles for using animals in biomedical research [3]: methods such as mathematical models, computer simulation and in vitro biological systems should be used wherever appropriate; animal experiments should be undertaken only after due consideration of their relevance for human or animal health and the advancement of biological knowledge; the animals selected for an experiment should be of an appropriate species and quality, and the minimum number required to obtain scientifically valid results; investigators and other personnel should never fail to treat animals as sentient, and should regard their proper care and use and the avoidance or minimization of discomfort, distress, or pain as ethical imperatives; procedures with animals that may cause more than momentary or minimal

  17. Biomedical applications of control engineering

    CERN Document Server

    Hacısalihzade, Selim S

    2013-01-01

    Biomedical Applications of Control Engineering is a lucidly written textbook for graduate control engineering and biomedical engineering students as well as for medical practitioners who want to get acquainted with quantitative methods. It is based on decades of experience both in control engineering and clinical practice. The book begins by reviewing basic concepts of system theory and the modeling process. It then goes on to discuss control engineering application areas such as: different models for the human operator; dosage and timing optimization in oral drug administration; measuring symptoms of and optimal dopaminergic therapy in Parkinson’s disease; measurement and control of blood glucose levels both naturally and by means of external controllers in diabetes; and control of depth of anaesthesia using inhalational anaesthetic agents like sevoflurane, using both fuzzy and state feedback controllers...

  18. Interactive Processing and Visualization of Image Data for Biomedical and Life Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staadt, Oliver G.; Natarajan, Vijay; Weber, Gunther H.; Wiley, David F.; Hamann, Bernd

    2007-02-01

    Background: Applications in biomedical science and life science produce large data sets using increasingly powerful imaging devices and computer simulations. It is becoming increasingly difficult for scientists to explore and analyze these data using traditional tools. Interactive data processing and visualization tools can help scientists overcome these limitations. Results: We show that new data processing tools and visualization systems can be used successfully in biomedical and life science applications. We present an adaptive high-resolution display system suitable for biomedical image data, algorithms for analyzing and visualizing protein surfaces and retinal optical coherence tomography data, and visualization tools for 3D gene expression data. Conclusion: We demonstrated that interactive processing and visualization methods and systems can support scientists in a variety of biomedical and life science application areas concerned with massive data analysis.

  19. Archives of Medical and Biomedical Research

    African Journals Online (AJOL)

    Archives of Medical and Biomedical Research is the official journal of the International Association of Medical and Biomedical Researchers (IAMBR) and the Society for Free Radical Research Africa (SFRR-Africa). It is an internationally peer reviewed, open access and multidisciplinary journal aimed at publishing original ...

  20. Biomedical Journals and the World Wide Web.

    Science.gov (United States)

    Schoonbaert, Dirk

    1998-01-01

    Discusses the publication of biomedical journals on the Internet. Highlights include pros and cons of electronic publishing; the Global Health Network at the University of Pittsburgh; the availability of biomedical journals on the World Wide Web; current applications, including access to journal contents tables and electronic delivery of full-text…