WorldWideScience

Sample records for computing strategy acquisition

  1. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  2. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.
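
The detection statistic described above (the ratio of the highest to the second-highest correlation result over the search space) can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: a naive serial search over code phase with a short ±1 code, hypothetical function names, and no carrier-frequency search or resampling step.

```python
# Illustrative sketch only: naive acquisition over code phase using the
# ratio of the highest to the second-highest correlation peak as the
# detection statistic. All names here are hypothetical; real receivers
# also search carrier frequency and use FFT-based correlation.

def circular_correlation(received, code):
    """Circular correlation of two equal-length real sequences."""
    n = len(received)
    return [sum(received[(i + k) % n] * code[i] for i in range(n))
            for k in range(n)]

def acquisition_metric(received, code):
    """Ratio of the highest to the second-highest correlation result."""
    peaks = sorted(circular_correlation(received, code), reverse=True)
    return peaks[0] / peaks[1]

# Toy +/-1 spreading code; the received signal is the code delayed by 3 chips.
code = [1, -1, 1, 1, -1, 1, -1, -1]
shift = 3
received = code[-shift:] + code[:-shift]

corr = circular_correlation(received, code)
best_shift = corr.index(max(corr))   # recovered code phase: 3
```

A satellite would be declared acquired when the metric exceeds a preset threshold; the resampling strategy in the paper reduces the sequence length, and hence the cost of each correlation, before this step.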

  3. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Science.gov (United States)

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301

  4. Implementing acquisition strategies

    International Nuclear Information System (INIS)

    Montgomery, G. K.

    1997-01-01

    The objective of this paper is to address some of the strategies necessary to effect a successful asset or corporate acquisition. Understanding the corporate objective, the full potential of the asset, the specific strategies to be employed, the value of time, and most importantly the interaction of all these is crucial, for missed steps are likely to result in missed opportunities. The amount of factual information that can be obtained and utilized in a timely fashion is the largest single hurdle to the capture of value in an asset or corporate acquisition. Fact, familiarity and experience are key in this context. The importance of the due diligence process prior to title or data transfer cannot be overemphasized. Some of the most important assets acquired in a merger may be the people. To maximize effectiveness, it is essential to merge both existing staff and those who came with the new acquisition as soon as possible. By thinking together as a unit, knowledge and experience can be applied to realize the potential of the asset. Hence team building is one of the challenges; doing it quickly is usually the most effective approach. Developing new directions for the newly enlarged company by combining the strengths of the old and the new creates more value, as well as a more efficient operation. Equally important to maximizing the potential of the new acquisition is maintaining the momentum generated by the need to grow that gave the impetus to acquiring new assets in the first place. In brief, the right mix of vision, facts and perceptions, quick enactment of the post-close strategies, and keeping the momentum alive are the principal ingredients of a focused strategy.

  5. 48 CFR 34.004 - Acquisition strategy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Acquisition strategy. 34... CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION General 34.004 Acquisition strategy. The program manager, as specified in agency procedures, shall develop an acquisition strategy tailored to the particular...

  6. 48 CFR 3034.004 - Acquisition strategy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Acquisition strategy. 3034.004 Section 3034.004 Federal Acquisition Regulations System DEPARTMENT OF HOMELAND SECURITY, HOMELAND... Acquisition strategy. See (HSAR) 48 CFR 3009.570 for policy applicable to acquisition strategies that consider...

  7. 48 CFR 234.004 - Acquisition strategy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 3 2010-10-01 2010-10-01 false Acquisition strategy. 234..., DEPARTMENT OF DEFENSE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION 234.004 Acquisition strategy. (1) See 209.570 for policy applicable to acquisition strategies that consider the use of lead system...

  8. 48 CFR 434.004 - Acquisition strategy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Acquisition strategy. 434.004 Section 434.004 Federal Acquisition Regulations System DEPARTMENT OF AGRICULTURE SPECIAL CATEGORIES OF CONTRACTING MAJOR SYSTEM ACQUISITION General 434.004 Acquisition strategy. (a) The program...

  9. 48 CFR 307.104-70 - Acquisition strategy.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Acquisition strategy. 307... AND ACQUISITION PLANNING ACQUISITION PLANNING Acquisition Planning 307.104-70 Acquisition strategy... designated by the HHS CIO, DASFMP, the CAO, or the cognizant HCA) shall prepare an acquisition strategy using...

  10. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    Science.gov (United States)

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  11. Modeling Human Information Acquisition Strategies

    NARCIS (Netherlands)

    Heuvelink, Annerieke; Klein, Michel C. A.; van Lambalgen, Rianne; Taatgen, Niels A.; Rijn, Hedderik van

    2009-01-01

    The focus of this paper is the development of a computational model for intelligent agents that decides on whether to acquire required information by retrieving it from memory or by interacting with the world. First, we present a task for which such decisions have to be made. Next, we discuss an

  12. Cognitive Strategies and Skill Acquisition.

    Science.gov (United States)

    1981-02-09

    Behavior (Academic Press, N.Y., 1974). (9) Craik, F.I.M., & Lockhart, R.S., Levels of processing: A framework for memory research, Journal of... C.D., & Stein, B.S., Some general constraints on learning and memory research, in: F.I.M. Craik & L.S. Cermak (eds.), Levels of Processing and... instructions, or instructions in the use of particular strategies (Belmont & Butterfield, 1971; Craik & Lockhart, 1972; Weinstein, 1978) have had

  13. Exploitative and Deceptive Resource Acquisition Strategies

    Directory of Open Access Journals (Sweden)

    Joshua J. Reynolds

    2015-07-01

    Life history strategy (LHS) and life history contingencies (LHCs) should theoretically influence the use of exploitative and deceptive resource acquisition strategies. However, little research has been done in this area. The purpose of the present work was to create measures of exploitative strategies and test the predictions of life history theory. Pilot studies developed and validated a behavioral measure of cheating called the Dot Game. The role of individual LHS and LHCs (manipulated via validated story primes) on cheating was investigated in Study 1. Studies 2a through 2c were conducted to develop and validate a self-report measure called the Exploitative and Deceptive Resource Acquisition Strategy Scale (EDRASS). Finally, Study 3 investigated life history and EDRASS. Results indicated that while LHS influences exploitative strategies, life history contingencies had little effect. Implications of these findings are discussed.

  14. Cloud computing strategies

    CERN Document Server

    Chorafas, Dimitris N

    2011-01-01

    A guide to managing cloud projects, Cloud Computing Strategies provides the understanding required to evaluate the technology and determine how it can be best applied to improve business and enhance your overall corporate strategy. Based on extensive research, it examines the opportunities and challenges that loom in the cloud. It explains exactly what cloud computing is, what it has to offer, and calls attention to the important issues management needs to consider before passing the point of no return regarding financial commitments.

  15. COMPUTER-AIDED ACQUISITION OF WRITING SKILLS

    NARCIS (Netherlands)

    Verhoef, R.; Tomic, W.

    2008-01-01

    This article presents the results of a review of the literature questioning whether and to what extent computers can be used as a means of instruction for the guided acquisition of communicative writing skills in higher education. To answer this question, the present paper first explores the

  16. 10 CFR 626.4 - General acquisition strategy.

    Science.gov (United States)

    2010-01-01

    ... 10 Energy 4 2010-01-01 2010-01-01 false General acquisition strategy. 626.4 Section 626.4 Energy DEPARTMENT OF ENERGY (CONTINUED) SALES REGULATION PROCEDURES FOR ACQUISITION OF PETROLEUM FOR THE STRATEGIC PETROLEUM RESERVE § 626.4 General acquisition strategy. (a) Criteria for commencing acquisition. To reduce...

  17. Nutrient acquisition strategies of mammalian cells.

    Science.gov (United States)

    Palm, Wilhelm; Thompson, Craig B

    2017-06-07

    Mammalian cells are surrounded by diverse nutrients, such as glucose, amino acids, various macromolecules and micronutrients, which they can import through transmembrane transporters and endolysosomal pathways. By using different nutrient sources, cells gain metabolic flexibility to survive periods of starvation. Quiescent cells take up sufficient nutrients to sustain homeostasis. However, proliferating cells depend on growth-factor-induced increases in nutrient uptake to support biomass formation. Here, we review cellular nutrient acquisition strategies and their regulation by growth factors and cell-intrinsic nutrient sensors. We also discuss how oncogenes and tumour suppressors promote nutrient uptake and thereby support the survival and growth of cancer cells.

  18. Memory strategies and ESL vocabulary acquisition

    Directory of Open Access Journals (Sweden)

    Carisma Dreyer

    2013-02-01

    This article compares the effectiveness of three learning strategies (memory strategies) for ESL vocabulary acquisition. Four intact ESL classes were divided into one control group and three treatment groups (keyword, semantic, and keyword-semantic). These Afrikaans-speaking standard 6 pupils then received 4 days of instruction. Both multiple-choice and cued-recall instruments were used to measure effects both 1 day and 9 days after instruction. The results indicated that for both the multiple-choice and cued-recall tests the combined keyword-semantic strategy differed statistically significantly as well as practically significantly from the keyword method. The results, therefore, suggest that the combined keyword-semantic strategy increased retention above the other strategies. (Afrikaans abstract, translated:) This article compares the effectiveness of three language learning strategies (memory strategies) for vocabulary acquisition. Four intact English second-language classes were divided into one control group and three experimental groups (keyword, semantic, and a combination of the keyword and semantic strategies). A group of Afrikaans-speaking standard six pupils received four days of instruction in each of the above strategies. Multiple-choice and cued-recall instruments were used to determine the effect of instruction both one day and nine days after the experiment. The results showed that the combined keyword-semantic strategy differed both statistically and practically significantly from the keyword strategy and the control group. It therefore appears that the combined keyword-semantic strategy is the most promising strategy with respect to vocabulary retention.

  19. Sustaining an Acquisition-based Growth Strategy

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Toppenberg, Gustav; Shanks, Graeme

    Value creating acquisitions are a major challenge for many firms. Our case study of Cisco Systems shows that an advanced Enterprise Architecture (EA) capability can contribute to the acquisition process through a) preparing the acquirer to become ‘acquisition ready’, b) identifying resource complementarity, c) directing and governing the integration process, and d) post-acquisition evaluation of the achieved integration and proposing ways forward. Using the EA capability in the acquisition process improves Cisco’s ability to rapidly capture value from its acquisitions and to sustain its acquisition...

  20. 2XIIB computer data acquisition system

    International Nuclear Information System (INIS)

    Tyler, G.C.

    1975-01-01

    All major plasma diagnostic measurements from the 2XIIB experiment are recorded, digitized, and stored by the computer data acquisition system. The raw data is then examined, correlated, reduced, and useful portions are quickly retrieved which direct the future conduct of the plasma experiment. This is done in real time and on line while the data is current. The immediate availability of this pertinent data has accelerated the rate at which the 2XII personnel have been able to gain knowledge in the study of plasma containment and fusion interaction. The uptime of the experiment is being used much more effectively than ever before. This paper describes the hardware configuration of our data system in relation to various plasma parameters measured, the advantages of powerful software routines to reduce and correlate the data, the present plans for expansion of the system, and the problems we have had to overcome in certain areas to meet our original goals.

  1. Strategy and Tactics of International Mergers and Acquisitions

    Directory of Open Access Journals (Sweden)

    Denys Kiriakov

    2011-12-01

    The article reviews contemporary strategy and tactics issues in international mergers and acquisitions, along with the cyclical waves of mergers and acquisitions over the last century and the motivations behind them. Five strategies adhered to by international companies initiating such agreements, as well as the challenges accompanying their execution, have been analyzed. Modern strategic and tactical tools for managing the international mergers and acquisitions process have been researched through a case study of an exemplary buyer (a corporation).

  2. A data acquisition system based on a personal computer

    International Nuclear Information System (INIS)

    Omata, K.; Fujita, Y.; Yoshikawa, N.; Sekiguchi, M.; Shida, Y.

    1991-07-01

    A versatile and flexible data acquisition system KODAQ (Kakuken Online Data AcQuisition system) has been developed. The system runs with CAMAC and one of the most popular Japanese personal computers, the PC9801 (NEC), similar to the IBM PC/AT. The system is designed to make it easy to set up data acquisition for various kinds of nuclear-physics experiments. (author)

  3. Reactor Pressure Vessel (RPV) Acquisition Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Mizia, Ronald Eugene [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2008-04-01

    The Department of Energy has selected the High Temperature Gas-cooled Reactor design for the Next Generation Nuclear Plant (NGNP) Project. The NGNP will demonstrate the use of nuclear power for electricity and hydrogen production. It will have an outlet gas temperature in the range of 900°C and a plant design service life of 60 years. The reactor design will be a graphite moderated, helium-cooled, prismatic or pebble-bed reactor and use low-enriched uranium, TRISO-coated fuel. The plant size, reactor thermal power, and core configuration will ensure passive decay heat removal without fuel damage or radioactive material releases during accidents. The NGNP Materials Research and Development (R&D) Program is responsible for performing R&D on likely NGNP materials in support of the NGNP design, licensing, and construction activities. Selection of the technology and design configuration for the NGNP must consider both the cost and risk profiles to ensure that the demonstration plant establishes a sound foundation for future commercial deployments. The NGNP challenge is to achieve a significant advancement in nuclear technology while at the same time setting the stage for an economically viable deployment of the new technology in the commercial sector soon after 2020. The purpose of this report is to address the acquisition strategy for the NGNP Reactor Pressure Vessel (RPV). This component will be larger than any nuclear reactor pressure vessel presently in service in the United States. The RPV will be taller, larger in diameter, thicker walled, heavier, and most likely fabricated at the Idaho National Laboratory (INL) site from multiple subcomponent pieces. The pressure vessel steel will either be a conventional material already used in the nuclear industry, such as those listed within the ASME A508/A533 specifications, or a newer pressure vessel material never before used for a nuclear reactor in the US. Each of these characteristics will present a

  4. Breaking Bad: Reforming Cyber Acquisition via Innovative Strategies

    Science.gov (United States)

    2015-04-01

    relate to infrastructure programs. Acquisition strategies for defensive tools, such as firewalls and antivirus software, will differ from acquisition... working on nuclear concepts for the benefit of the national will. These laboratories, as pointed out by Capt Patrick Roberts, also perform software... software development, these scientists and engineers now possess experience making cyber systems that could be focused into the development of offensive

  5. Ownership Strategy and Subsidiary Survival in Foreign Acquisitions

    DEFF Research Database (Denmark)

    Wang, Yi; Larimo, Jorma

    2017-01-01

    In this study, we analyze the general effect of acquirers’ ownership strategy on the survival in foreign acquisitions. Furthermore, we attempt to address five potential moderating effects: international, regional, target country experience, cultural distance, as well as host country development...

  6. Healthcare mergers and acquisitions: strategies for consolidation.

    Science.gov (United States)

    Zuckerman, Alan M

    2011-01-01

    The passage of federal healthcare reform legislation, in combination with other factors, makes it likely that the next few years will be a major period of consolidation for healthcare organizations. This article examines the seven key forces reshaping healthcare delivery--from insurance industry consolidation to cost inflation to the increasing gap between financially strong and struggling providers--and provides advice for organizations on both sides of an acquisition.

  7. Fusions and acquisitions, a strategy of development

    International Nuclear Information System (INIS)

    Anon.

    2002-01-01

    The fusions and acquisitions (F and A) activity remained at a high level in 2001 in the overall oil, energy and environment activities, while it strongly diminished in other sectors, such as telecommunications. The study carried out by L. Le Dortz and B. Debosscher from the French centre of external trade (CFCE) lists and comments on more than 170 F and A operations, of which about 40 exceed 1 billion US$ in terms of valorization. In a general way, the logic of concentration is confirmed in the petroleum industry, while the evolutions are more contrasted in the utilities (gas, electricity, water): a decline of the operations performed by North American groups and continued dynamism of F and As in Europe. This article briefly summarizes the content of this study. (J.S.)

  8. Assessment of Language Learners' Strategies: Do They Prefer Learning or Acquisition Strategies?

    Science.gov (United States)

    Altmisdort, Gonca

    2016-01-01

    The aim of this study is to evaluate learning and acquisition strategies used by second/foreign language learners. This study is a comparative investigation of learning and acquisition strategies of successful and less successful language learners. The main question of the study is to investigate if there is a relationship between the learners'…

  9. Acquisition of Computers That Process Corporate Information

    National Research Council Canada - National Science Library

    Gimble, Thomas

    1999-01-01

    The Secretary of Defense announced the Corporate Information Management initiative on November 16, 1990, to establish a DoD-wide concept for managing computer, communications, and information management functions...

  10. ANL statement of site strategy for computing workstations

    Energy Technology Data Exchange (ETDEWEB)

    Fenske, K.R. (ed.); Boxberger, L.M.; Amiot, L.W.; Bretscher, M.E.; Engert, D.E.; Moszur, F.M.; Mueller, C.J.; O' Brien, D.E.; Schlesselman, C.G.; Troyer, L.J.

    1991-11-01

    This Statement of Site Strategy describes the procedure at Argonne National Laboratory for defining, acquiring, using, and evaluating scientific and office workstations and related equipment and software in accord with DOE Order 1360.1A (5-30-85) and Laboratory policy. It is Laboratory policy to promote the installation and use of computing workstations to improve productivity and communications for both programmatic and support personnel, to ensure that computing workstation acquisitions meet the expressed need in a cost-effective manner, and to ensure that acquisitions of computing workstations are in accord with Laboratory and DOE policies. The overall computing site strategy at ANL is to develop a hierarchy of integrated computing system resources to address the current and future computing needs of the laboratory. The major system components of this hierarchical strategy are: supercomputers, parallel computers, centralized general purpose computers, distributed multipurpose minicomputers, and computing workstations and office automation support systems. Computing workstations include personal computers, scientific and engineering workstations, computer terminals, microcomputers, word processing and office automation electronic workstations, and associated software and peripheral devices costing less than $25,000 per item.

  11. Oral Vocabulary and Language Acquisition Strategies to Increase Literacy

    Science.gov (United States)

    Green, Grace

    2017-01-01

    This study addresses low literacy achievement in students in kindergarten and first grades. The study was designed to help identify how general education teachers can use specific daily research-based oral vocabulary acquisition strategies to close the literacy gap. This quantitative research helped to determine if the implementation of an oral…

  12. Computational Analysis of SAXS Data Acquisition.

    Science.gov (United States)

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals.
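
The pair distribution function the abstract refers to (the distribution of distances between every pair of points in the complex) can be illustrated with a naive O(N²) histogram over weighted point scatterers. This is a hedged sketch, not the authors' recursive spherical-Bessel forward model; the function names and the toy structure are assumptions.

```python
# Hedged sketch, not the authors' method: the pair distribution function of a
# structure represented as weighted point scatterers, via a naive O(N^2)
# distance histogram. Function names and the toy structure are assumptions.
import math
from itertools import combinations

def pair_distribution(points, weights, bin_width, r_max):
    """Histogram of density-weighted pairwise distances up to r_max."""
    n_bins = math.ceil(r_max / bin_width)
    hist = [0.0] * n_bins
    for (p, wp), (q, wq) in combinations(zip(points, weights), 2):
        r = math.dist(p, q)
        if r < r_max:
            hist[int(r // bin_width)] += wp * wq
    return hist

# Toy "complex": four unit-weight scatterers at the corners of a unit square.
points = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 0)]
weights = [1.0] * 4
pddf = pair_distribution(points, weights, bin_width=0.5, r_max=2.0)
# All six pairwise distances (1 and sqrt(2)) land in the [1.0, 1.5) bin.
```

The inverse problem treated in the paper runs the other way: recovering a three-dimensional density consistent with a histogram like `pddf`.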

  13. SAR: A fast computer for Camac data acquisition

    International Nuclear Information System (INIS)

    Bricaud, B.; Faivre, J.C.; Pain, J.

    1979-01-01

    This paper describes a special data acquisition and processing facility developed for Nuclear Physics experiments at intermediate energy installed at SATURNE (France) and at CERN (Geneva, Switzerland). Previously, we used a PDP 11/45 computer which was connected to the experiments through a Camac Branch highway. In a typical experiment (340 words per event), the computer limited the data acquisition rate at 4 μsec for each 16-bit transfer and the on-line data reduction at 20 events per second only. The initial goal of this project was to increase these two performances. Previous known acquisition processors were limited by the memory capacity these systems could support. Most of the time the data reduction was done on the host mini computer. Higher memory size can be designed with new fast RAM (Intel 2147) and the data processing can now take place on the front end processor

  14. Learning theories in computer-assisted foreign language acquisition

    OpenAIRE

    Baeva, D.

    2013-01-01

    This paper reviews the learning theories, focusing to the strong interest in technology use for language learning. It is important to look at how technology has been used in the field thus far. The goals of this review are to understand how computers have been used in the past years to support foreign language learning, and to explore any research evidence with regards to how computer technology can enhance language skills acquisition

  15. An Empirical Generative Framework for Computational Modeling of Language Acquisition

    Science.gov (United States)

    Waterfall, Heidi R.; Sandbank, Ben; Onnis, Luca; Edelman, Shimon

    2010-01-01

    This paper reports progress in developing a computer model of language acquisition in the form of (1) a generative grammar that is (2) algorithmically learnable from realistic corpus data, (3) viable in its large-scale quantitative performance and (4) psychologically real. First, we describe new algorithmic methods for unsupervised learning of…

  16. COMPUTER-AIDED DATA ACQUISITION FOR COMBUSTION EXPERIMENTS

    Science.gov (United States)

    The article describes the use of computer-aided data acquisition techniques to aid the research program of the Combustion Research Branch (CRB) of the U.S. EPA's Air and Energy Engineering Research Laboratory (AEERL) in Research Triangle Park, NC, in particular on CRB's bench-sca...

  17. Evaluation strategies for monadic computations

    Directory of Open Access Journals (Sweden)

    Tomas Petricek

    2012-02-01

    Full Text Available Monads have become a powerful tool for structuring effectful computations in functional programming, because they make the order of effects explicit. When translating pure code to a monadic version, we need to specify the evaluation order explicitly. Two standard translations give call-by-value and call-by-name semantics. The resulting programs have different structure and types, which makes revisiting the choice difficult. In this paper, we translate pure code to a monadic version using an additional operation, malias, that abstracts out the evaluation strategy. The malias operation is based on computational comonads; we use a categorical framework to specify the laws the operation is required to satisfy. For any monad, we show implementations of malias that give call-by-value and call-by-name semantics. Although we do not give call-by-need semantics for all monads, we show how to turn certain monads into an extended monad with call-by-need semantics, which partly answers an open question. Moreover, using our unified translation, it is possible to change the evaluation strategy of functional code translated to monadic form without changing its structure or types.
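The malias idea (an operation of type m a -> m (m a) whose implementation fixes the evaluation strategy) can be sketched outside the paper's typed Haskell setting. Below is a hedged Python toy using thunks as an effect-logging monad; the names and the effect-log monad are illustrative only, not the paper's formalism:

```python
# Hedged sketch of the malias idea: thunks (zero-argument functions) act as
# a monad that logs effects when run. Swapping the malias implementation
# switches the translated code between call-by-value and call-by-name.
log = []

def unit(x):
    return lambda: x

def bind(m, f):
    return lambda: f(m())()

def effectful(name, value):
    # a computation that records an effect each time it runs
    def run():
        log.append(name)
        return value
    return run

# malias :: m a -> m (m a)  -- abstracts out the evaluation strategy
def malias_cbv(m):
    # call-by-value: run the computation once, share its result
    return lambda: unit(m())

def malias_cbn(m):
    # call-by-name: defer; effects re-run at every use
    return unit(m)

def translate(e, malias):
    # monadic translation of the pure expression: let x = e in x + x
    return bind(malias(e), lambda x:
           bind(x, lambda a:
           bind(x, lambda b: unit(a + b))))

expr = effectful("read", 21)

log.clear()
cbv_result = translate(expr, malias_cbv)()
cbv_log = list(log)

log.clear()
cbn_result = translate(expr, malias_cbn)()
cbn_log = list(log)

print(cbv_result, cbv_log)   # 42 ['read']
print(cbn_result, cbn_log)   # 42 ['read', 'read']
```

Both strategies produce the same value, but the effect runs once under call-by-value and twice under call-by-name, which is exactly the choice the paper's single translation leaves open.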

  18. Technology Proliferation: Acquisition Strategies and Opportunities for an Uncertain Future

    Science.gov (United States)

    2018-04-20

    Report period covered: 07/31/17 to 04/09/18. Colonel Heather A...efficient and expeditious fielding of technologically superior capabilities. In today’s environment, it is commonplace for private industry to be the first to develop and deploy technologies that can be adopted for defense systems. The result is that the Department of Defense (DoD) is largely a

  19. Acquisition of gamma camera and physiological data by computer

    International Nuclear Information System (INIS)

    Hack, S.N.; Chang, M.; Line, B.R.; Cooper, J.A.; Robeson, G.H.

    1986-01-01

    We have designed, implemented, and tested a new Research Data Acquisition System (RDAS) that permits a general purpose digital computer to acquire signals from both gamma camera sources and physiological signal sources concurrently. This system overcomes the limited multi-source, high speed data acquisition capabilities found in most clinically oriented nuclear medicine computers. The RDAS can simultaneously input signals from up to four gamma camera sources with a throughput of 200 kHz per source and from up to eight physiological signal sources with an aggregate throughput of 50 kHz. Rigorous testing has found the RDAS to exhibit acceptable linearity and timing characteristics. In addition, flood images obtained by this system were compared with flood images acquired by a commercial nuclear medicine computer system. National Electrical Manufacturers Association performance standards of the flood images were found to be comparable

  20. Electron backscatter diffraction: Strategies for reliable data acquisition and processing

    International Nuclear Information System (INIS)

    Randle, Valerie

    2009-01-01

    In electron backscatter diffraction (EBSD) software packages there are many user choices both in data acquisition and in data processing and display. In order to extract maximum scientific value from an inquiry, it is helpful to have some guidelines for best practice in conducting an EBSD investigation. The purpose of this article therefore is to address selected topics of EBSD practice, in a tutorial manner. The topics covered are a brief summary on the principles of EBSD, specimen preparation, calibration of an EBSD system, experiment design, speed of data acquisition, data clean-up, microstructure characterisation (including grain size) and grain boundary characterisation. This list is not meant to cover exhaustively all areas where EBSD is used, but rather to provide a resource consisting of some useful strategies for novice EBSD users.

  1. The Acquisition of Vocabulary Through Three Memory Strategies

    Directory of Open Access Journals (Sweden)

    Libia Maritza Pérez

    2017-02-01

    Full Text Available The present study reports on an action research study that explores the implications of applying three vocabulary strategies: word cards, association with pictures, and association with a topic through fables in the acquisition of new vocabulary in a group of EFL low-level proficiency teenagers in a public school in Espinal, Tolima, Colombia. The participants had never used vocabulary strategies before and struggled to memorize and recall words.  Two types of questionnaires, a researcher’s journal, and vocabulary tests were the instruments used to gather data.  The results showed that these strategies were effective to expand the range of words progressively and improve the ability to recall them. The study also found that these strategies involve cognitive and affective factors that can affect students’ perception about the strategies and their use. The implementation of the strategies highlighted the need to train teachers and learners in strategies intended to teach and learn vocabulary and to include them in the English language program in any school.

  2. Tracker Readout ASIC for Proton Computed Tomography Data Acquisition.

    Science.gov (United States)

    Johnson, Robert P; Dewitt, Joel; Holcomb, Cole; Macafee, Scott; Sadrozinski, Hartmut F-W; Steinberg, David

    2013-10-01

    A unique CMOS chip has been designed to serve as the front-end of the tracking detector data acquisition system of a pre-clinical prototype scanner for proton computed tomography (pCT). The scanner is to be capable of measuring one to two million proton tracks per second, so the chip must be able to digitize the data and send it out rapidly while keeping the front-end amplifiers active at all times. One chip handles 64 consecutive channels, including logic for control, calibration, triggering, buffering, and zero suppression. It outputs a formatted cluster list for each trigger, and a set of field programmable gate arrays merges those lists from many chips to build the events to be sent to the data acquisition computer. The chip design has been fabricated, and subsequent tests have demonstrated that it meets all of its performance requirements, including excellent low-noise performance.

  3. D0 experiment: its trigger, data acquisition, and computers

    International Nuclear Information System (INIS)

    Cutts, D.; Zeller, R.; Schamberger, D.; Van Berg, R.

    1984-05-01

    The new collider facility to be built at Fermilab's Tevatron-I D0 region is described. The data acquisition requirements are discussed, as well as the hardware and software triggers designed to meet these needs. An array of MicroVAX computers running VAXELN will filter in parallel (a complete event in each microcomputer) and transmit accepted events via Ethernet to a host. This system, together with its subsequent offline needs, is briefly presented

  4. Replacement strategy for obsolete plant computers

    International Nuclear Information System (INIS)

    Schaefer, J.P.

    1985-01-01

    The plant computers of the first generation of larger nuclear power plants are reaching the end of their useful lifetime with respect to hardware. The software alone would be no reason for a system exchange, but new tasks for the supervisory computer system, questions of availability of maintenance personnel and spare parts, and the demand for improved operating procedures for computer users have prompted consideration of how to exchange a computer system in a nuclear power plant without extending plant outage times due to the exchange work. In the Federal Republic of Germany the planning phase of such backfitting projects is well under way, and some projects are about to be implemented. The basis for these backfitting projects is a modular supervisory computer concept designated for the new line of KWU PWRs. The main characteristic of this computer system is the splitting of the system into a data acquisition level and a data processing level. This principle allows an extension of the processing level or even repeated replacements of the processing computers. With the existing computer system still in operation, the new system can be installed in a step-by-step procedure. As soon as the first of the redundant process computers of the data processing level is in operation and the data link to the data acquisition computers is established, the old computer system can be taken out of service. Then the back-up processing computer can be commissioned to complete the new system. (author)

  5. Computed tomography: acquisition process, technology and current state

    Directory of Open Access Journals (Sweden)

    Óscar Javier Espitia Mendoza

    2016-02-01

    Full Text Available Computed tomography is a noninvasive scanning technique widely applied in areas such as medicine, industry, and geology. This technique allows the three-dimensional reconstruction of the internal structure of an object that is illuminated with an X-ray source. The reconstruction is formed from two-dimensional cross-sectional images of the object. Each cross-section is obtained from measurements of physical phenomena, such as attenuation, dispersion, and diffraction of X-rays, resulting from their interaction with the object. In general, measurement acquisition is performed with methods based on any of these phenomena and according to various architectures classified in generations. Furthermore, in response to the need to simulate acquisition systems for CT, software dedicated to this task has been developed. The objective of this research is to determine the current state of CT techniques; to this end, a review of methods, the different architectures used for acquisition, and some of their applications is presented. Additionally, simulation results are presented. The main contributions of this work are the detailed description of acquisition methods and the presentation of possible trends of the technique.
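The attenuation-based acquisition the review describes can be illustrated with a toy parallel-beam simulation: each detector reading is the source intensity attenuated by the line integral of the attenuation map along the ray (Beer-Lambert law). This is a minimal sketch with a made-up attenuation map, not code from the paper:

```python
import math

# Toy parallel-beam acquisition: measure transmitted X-ray intensity through
# a 2D attenuation map at 0 and 90 degrees (row and column line integrals).
# A real scanner samples many angles and reconstructs the map from them.
mu = [
    [0.0, 0.0, 0.0, 0.0],
    [0.0, 0.5, 0.5, 0.0],
    [0.0, 0.5, 1.0, 0.0],
    [0.0, 0.0, 0.0, 0.0],
]
I0 = 1.0  # source intensity

def projection_rows(img):
    # line integral along each row (0-degree view)
    return [sum(row) for row in img]

def projection_cols(img):
    # line integral along each column (90-degree view)
    return [sum(col) for col in zip(*img)]

# Beer-Lambert: detected intensity I = I0 * exp(-integral of mu along the ray)
sino_0 = [I0 * math.exp(-p) for p in projection_rows(mu)]
sino_90 = [I0 * math.exp(-p) for p in projection_cols(mu)]
print(sino_0)
print(sino_90)
```

Taking the negative logarithm of each reading recovers the line integrals, which is the input to reconstruction algorithms such as filtered backprojection.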

  6. Resource acquisition policy: Multiple account evaluation of electricity resource alternatives [and] resource acquisition strategy

    International Nuclear Information System (INIS)

    1994-06-01

    British Columbia Hydro has been directed by the provincial government to develop evaluation procedures to rank electricity resource alternatives in terms of their social benefits and costs, and to acquire resources on the basis of need. The current state of development of social costing at BC Hydro is detailed along with its application to the multiple account evaluation of resources. In this evaluation, BC Hydro's corporate costs, customer costs, transfer payments to the province, direct costs incurred by provincial or regional governments or other Crown agencies, direct environmental impact costs from air emissions and land/water use, community and social impact costs, and economic development impacts are taken into account. The BC Hydro resource acquisition strategy is also described as it was developed in response to provincial policy on electricity supply from independent power producers. This strategy includes a determination of need, a decision to acquire need-determined resources either by itself or from a private sector developer, and decisions to acquire resources in advance of need for reasons such as economic opportunity, long-term strategies, or load displacement. Background information is included on the calculation of air emissions costs. An illustrative example is provided of the multiple account evaluation of several types of resource projects. 1 fig., 5 tabs

  7. Replacement strategy for ASDEX upgrade's new control and data acquisition

    International Nuclear Information System (INIS)

    Raupp, G.; Behler, K.; Cole, R.; Engelhardt, K.; Lohs, A.; Lueddecke, K.; Neu, G.; Treutterer, W.; Vijverberg, Th.; Zasche, D.; Zehetbauer, Th.

    2004-01-01

    ASDEX Upgrade is being equipped with a new real-time plasma control and data acquisition system and a novel time system. Major components were implemented and installed. While much work for performance optimisation and application programming remains to be done, commissioning of the new system parallel to experiment operation is being prepared. Commissioning of the new system will be done step-by-step. To facilitate testing the old and new control systems share all input signals. Switching between old and new system can be performed within 60 min: 23 fibre optics for output of actuator commands and input triggers must be connected to the active system and minor modifications done to interface the machine protection. Commissioning phases include background listening, technical discharges and full plasma operation. With the strategy chosen we minimize risk to the machine and reduce interference with ongoing experiment campaigns

  8. Management and organisational barriers in the acquisition of computer usage skills by mature age workers.

    Science.gov (United States)

    Keogh, Mark

    2009-09-01

    To investigate workplace cultures in the acquisition of computer usage skills by mature age workers. Data were gathered through focus groups conducted at job network centres in the Greater Brisbane metropolitan region. Participants who took part were a mixture of workers and job-seekers. The results suggest that mature age workers can be exposed to inappropriate computer training practices and age-insensitive attitudes towards those with low base computer skills. There is a need for managers to be observant of ageist attitudes in the work place and to develop age-sensitive strategies to help mature age workers learn computer usage skills. Mature age workers also need to develop skills in ways which are practical and meaningful to their work.

  9. 75 FR 54524 - Defense Federal Acquisition Regulation Supplement; Acquisition Strategies To Ensure Competition...

    Science.gov (United States)

    2010-09-08

    ...because the changes are to internal Government organization and operating procedures only. The rule... 48 CFR Part 207, Government procurement. Ynette R. Shelkin, Editor, Defense Acquisition Regulations... Life Cycle of Major Defense Acquisition Programs (DFARS Case 2009-D014). AGENCY: Defense Acquisition...

  10. Data acquisition system using a C 90-10 computer

    International Nuclear Information System (INIS)

    Smiljanic, Gabro

    1969-05-01

    The aim of this study is to enable the acquisition of experimental data into the memory of a digital computer. These data come from analog-to-digital converters that analyze the amplitude of the pulses provided by detectors. Normally the computer executes its main program (data processing, transfer to magnetic tape, visualization, etc.). When information is available at the output of a converter, an interruption of the main program is requested, and after acknowledgement a subroutine handles the transfer of the information into the computer. The author has also considered bi- and tri-parametric acquisition. The computer and the converters are commercial devices (the C 90-10 computer from CII and the CA 12 or CA 25 converters from Intertechnique), while the systems for adaptation of the input-output levels and the visualization were studied and realized at CEA Saclay. An interface device was built to connect the converters; this is the hardware part of the system. On the other hand, the programs necessary for the operation of the computer have been studied and developed; this is the software part of the system. As far as possible the interface is designed to be universal, i.e. it must be able to work with other brands of equipment. The acquisition of data is carried out in two phases: a) the converter expresses the amplitude of the input signal in the form of a binary number, which is transferred into the interface at the same time as an interruption of the main program is requested. b) After acceptance of this interruption, the subprogram handles the transfer of the information from the interface into the computer, then adds one to the word located at the address determined from the information received. In other words, the system behaves like an amplitude analyzer whose operation is well known. But it is of much more flexible use because of the possibility of quick adaptation of the programs to the needs of the considered experiment, of the possibility to treat
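The acquisition loop described in phase b) above (increment the memory word addressed by the converted amplitude) is the classic pulse-height analyzer. A minimal sketch, with channel count and pulse values chosen purely for illustration:

```python
# Minimal pulse-height analyzer, as described in the abstract: each ADC
# conversion indexes a memory word, which is incremented by one.
# Channel count and simulated pulse values below are illustrative.
N_CHANNELS = 1024
spectrum = [0] * N_CHANNELS   # one "memory word" per amplitude channel

def on_interrupt(adc_value):
    # interrupt service routine: add one to the word addressed by the value
    spectrum[adc_value] += 1

# simulated converter output (amplitudes in channel units)
for pulse in [100, 512, 100, 1023, 100]:
    on_interrupt(pulse)

print(spectrum[100], spectrum[512], spectrum[1023])  # 3 1 1
```

After many pulses, `spectrum` holds the amplitude histogram that the hardware analyzer would display; the flexibility the abstract mentions comes from this logic living in software rather than in fixed circuitry.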

  11. Integrating Computer-Mediated Communication Strategy Instruction

    Science.gov (United States)

    McNeil, Levi

    2016-01-01

    Communication strategies (CSs) play important roles in resolving problematic second language interaction and facilitating language learning. While studies in face-to-face contexts demonstrate the benefits of communication strategy instruction (CSI), there have been few attempts to integrate computer-mediated communication and CSI. The study…

  12. 105-N Basin sediment removal subcontract acquisition strategy plan

    International Nuclear Information System (INIS)

    Wilsey, D.J.

    1997-01-01

    The 105-N Basin Sediment Removal Subcontract Acquisition Strategy Plan provides a detailed set of actions to specify and procure services and equipment for removal of approximately 400 ft³ (wet) of radioactive sediment. The plan outlines a cost-effective approach to remove this sediment safely and within an aggressive schedule. This 105-N Basin Sediment Removal Strategy Plan includes the following key elements: a current vendor survey of capabilities and interest in this type of work, confirming that qualified, competitive sources are available in the time frame required; a systematic review of the various options for sediment disposal in enough detail to exhaustively uncover the pros and cons of each approach; use of a workshop approach to assess different ways to accomplish the work and ensure the disposal options considered are cost and schedule effective; integration of the complicated sampling and characterization process, which is essential to successful execution of the procurement scheme; a review of the various subcontracting options to maximize the use of existing technology, existing equipment, and specialized expertise; detailed early planning and strategizing to identify problems early, before they become restraints or added costs; a tailored design schedule covering three alternate approaches so that sample characterization will not delay engineering preparation for disposal; identification of the support required from various organizations onsite as well as subcontractors, well in advance of the need for that support, improving availability of the proper personnel to support schedule and cost objectives; and a possible opportunity to process the sediment in the valve pit and pump pit through this same subcontractor's process

  13. Nutrition acquisition strategies during fungal infection of plants.

    Science.gov (United States)

    Divon, Hege H; Fluhr, Robert

    2007-01-01

    In host-pathogen interactions, efficient pathogen nutrition is a prerequisite for successful colonization and fungal fitness. Filamentous fungi have a remarkable capability to adapt and exploit the external nutrient environment. For phytopathogenic fungi, this asset has developed within the context of host physiology and metabolism. The understanding of nutrient acquisition and pathogen primary metabolism is of great importance in the development of novel disease control strategies. In this review, we discuss the current knowledge on how plant nutrient supplies are utilized by phytopathogenic fungi, and how these activities are controlled. The generation and use of auxotrophic mutants have been elemental to the determination of essential and nonessential nutrient compounds from the plant. Considerable evidence indicates that pathogen entrainment of host metabolism is a widespread phenomenon and can be accomplished by rerouting of the plant's responses. Crucial fungal signalling components for nutrient-sensing pathways as well as their developmental dependency have now been identified, and were shown to operate in a coordinate cross-talk fashion that ensures proper nutrition-related behaviour during the infection process.

  14. USMC Acquisition Strategies For Cots Mobile Devices in the Tactical Environment

    Science.gov (United States)

    2017-09-01

    ...identified and assessed acquisition strategies to support rapid adoption and integration of emerging commercial off-the-shelf (COTS) mobile devices into the tactical domain...

  15. Precise fixpoint computation through strategy iteration

    DEFF Research Database (Denmark)

    Gawlitza, Thomas; Seidl, Helmut

    2007-01-01

    We present a practical algorithm for computing least solutions of systems of equations over the integers with addition, multiplication with positive constants, maximum and minimum. The algorithm is based on strategy iteration. Its run-time (w.r.t. the uniform cost measure) is independent of the s...
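The class of systems the abstract describes (integer equations built from addition, multiplication by positive constants, max, and min) has a well-defined least solution, reached by iterating from the bottom element. The sketch below shows that fixpoint semantics on a made-up two-variable system using naive Kleene iteration; the paper's contribution, strategy iteration, is a different and more efficient solving procedure and is not implemented here:

```python
# Naive Kleene iteration on a small equation system over the integers with
# max, min, addition, and multiplication by a positive constant -- the class
# of systems the paper solves. The example system is illustrative:
#   x = max(0, y - 1)
#   y = min(2 * x, 10)
NEG_INF = float("-inf")

def step(env):
    x, y = env
    return (max(0, y - 1), min(2 * x, 10))

# least solution: start from the bottom element and iterate to stability
env = (NEG_INF, NEG_INF)
while True:
    nxt = step(env)
    if nxt == env:
        break
    env = nxt
print(env)   # (0, 0)
```

On this example the iteration stabilizes at (0, 0) after a few steps; for systems where naive iteration climbs slowly (or not at all over the unbounded integers), the paper's strategy iteration provides termination guarantees that this sketch lacks.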

  16. Computer-based theory of strategies

    Energy Technology Data Exchange (ETDEWEB)

    Findler, N V

    1983-01-01

    Some of the objectives and working tools of a new area of study, tentatively called theory of strategies, are described. It is based on the methodology of artificial intelligence, decision theory, operations research and digital gaming. The latter refers to computing activity that incorporates model building, simulation and learning programs in conflict situations. Three long-term projects which aim at automatically analyzing and synthesizing strategies are discussed. 27 references.

  17. Value Creation through Acquisition Strategy: A Study of Volvo’s Acquisition by Geely

    Directory of Open Access Journals (Sweden)

    Yane Chandera

    2012-04-01

    Full Text Available This paper examines the value created by the acquisition of Volvo Car Corp. by Zhejiang Geely Holding Group. The acquisition of Volvo by Geely is an interesting topic to discuss since it was the first time in the automotive industry that a Chinese company acquired an international company for a considerably high transaction amount. The paper examines short-term value creation using an event study, calculating the abnormal returns of each company’s stock during the announcement period and measuring the significance of the cumulative abnormal return. The findings are consistent with previous studies over the years, which have shown that most acquisitions fail to add value for shareholders of the acquiring company. The paper discusses the broad managerial implications of the findings, including the marketing aspects of integrating two different brand perceptions after the acquisition.
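The event-study methodology the paper applies can be sketched in a few lines: fit a market model over an estimation window, then cumulate the abnormal returns over the event window. All return figures below are invented for illustration and are not data from the Geely-Volvo case:

```python
# Minimal market-model event study: abnormal return AR_t = R_t - (a + b*Rm_t),
# cumulated over the event window (CAR). All returns below are made up.
def ols(x, y):
    # slope/intercept of y regressed on x by ordinary least squares
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return my - b * mx, b

# estimation window: stock vs. market daily returns
rm_est = [0.01, -0.02, 0.005, 0.015, -0.01]
r_est  = [0.012, -0.018, 0.004, 0.017, -0.008]
a, b = ols(rm_est, r_est)

# event window around the announcement
rm_evt = [0.002, -0.001, 0.01]
r_evt  = [0.03, 0.012, 0.02]

ar = [r - (a + b * rm) for r, rm in zip(r_evt, rm_evt)]
car = sum(ar)   # cumulative abnormal return
print(round(car, 4))
```

In a full study, the CAR is then tested for statistical significance (e.g., against its estimation-window variance), which is the step the paper uses to judge whether the announcement created value.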

  19. Next Generation Nuclear Plant Intermediate Heat Exchanger Acquisition Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Mizia, Ronald Eugene [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2008-04-01

    DOE has selected the High Temperature Gas-cooled Reactor (HTGR) design for the Next Generation Nuclear Plant (NGNP) Project. The NGNP will demonstrate the use of nuclear power for electricity and hydrogen production. It will have an outlet gas temperature in the range of 900°C to 950°C and a plant design service life of 60 years. The reactor design will be a graphite-moderated, helium-cooled, prismatic or pebble-bed reactor, and use low-enriched uranium, TRISO-coated fuel. The plant size, reactor thermal power, and core configuration will ensure passive decay heat removal without fuel damage or radioactive material releases during accidents. The NGNP Materials Research and Development (R&D) Program is responsible for performing R&D on likely NGNP materials in support of the NGNP design, licensing, and construction activities. Selection of the technology and design configuration for the NGNP must consider both the cost and risk profiles to ensure that the demonstration plant establishes a sound foundation for future commercial deployments. The NGNP challenge is to achieve a significant advancement in nuclear technology while at the same time setting the stage for an economically viable deployment of the new technology in the commercial sector soon after 2020. The purpose of this report is to address the acquisition strategy for the NGNP Intermediate Heat Exchanger (IHX). This component will be operated in flowing, impure helium on the primary and secondary sides at temperatures up to 950°C. There are major high-temperature design, materials availability, and fabrication issues that need to be addressed. The prospective materials are Alloys 617, 230, 800H and X, with Alloy 617 being the leading candidate for use at 950°C. The material delivery schedule for these materials does not pose a problem for a 2018 start-up, as the vendors can currently quote reasonable delivery times. The product forms and amounts needed must be finalized as soon as possible. An

  20. Quantification of Artifact Reduction With Real-Time Cine Four-Dimensional Computed Tomography Acquisition Methods

    International Nuclear Information System (INIS)

    Langner, Ulrich W.; Keall, Paul J.

    2010-01-01

    Purpose: To quantify the magnitude and frequency of artifacts in simulated four-dimensional computed tomography (4D CT) images using three real-time acquisition methods (direction-dependent displacement acquisition, simultaneous displacement and phase acquisition, and simultaneous displacement and velocity acquisition) and to compare these methods with commonly used retrospective phase sorting. Methods and Materials: Image acquisition for the four 4D CT methods was simulated with different displacement and velocity tolerances for spheres with radii of 0.5 cm, 1.5 cm, and 2.5 cm, using 58 patient-measured tumors and respiratory motion traces. The magnitude and frequency of artifacts, CT doses, and acquisition times were computed for each method. Results: The mean artifact magnitude was 50% smaller for the three real-time methods than for retrospective phase sorting. The dose was ∼50% lower, but the acquisition time was 20% to 100% longer for the real-time methods than for retrospective phase sorting. Conclusions: Real-time acquisition methods can reduce the frequency and magnitude of artifacts in 4D CT images, as well as the imaging dose, but they increase the image acquisition time. The results suggest that direction-dependent displacement acquisition is the preferred real-time 4D CT acquisition method, because on average, the lowest dose is delivered to the patient and the acquisition time is the shortest for the resulting number and magnitude of artifacts.

  1. Deliberating A Contract Type Based Risk Mitigation Strategy For South African Defense Acquisitions

    Science.gov (United States)

    2016-06-01

    ...cash flow may lead to cost overruns and schedule slippage. Table 1 shows the description, schedule, and cost performance status for two SA DOD...possibility of applying a contract-type based strategy to manage acquisition program costs and schedule risks for the South African (SA) Department of...deviations between technical, cost, and schedule performance. Subject terms: acquisition process, defense acquisition, contract-type, risk

  2. A data acquisition computer for high energy physics applications DAFNE:- hardware manual

    International Nuclear Information System (INIS)

    Barlow, J.; Seller, P.; De-An, W.

    1983-07-01

    A high-performance stand-alone computer system based on the Motorola 68000 microprocessor has been built at the Rutherford Appleton Laboratory. Although the design was strongly influenced by the requirement to provide a compact data acquisition computer for the high energy physics environment, the system is sufficiently general to find applications in a wider area. It provides colour graphics and tape and disc storage, together with access to CAMAC systems. This report is the hardware manual of the data acquisition computer, DAFNE (Data Acquisition For Nuclear Experiments), and as such contains a full description of the hardware structure of the computer system. (author)

  3. Offset-electrode profile acquisition strategy for electrical resistivity tomography

    Science.gov (United States)

    Robbins, Austin R.; Plattner, Alain

    2018-04-01

    We present an electrode layout strategy that allows electrical resistivity profiles to image the third dimension close to the profile plane. This "offset-electrode profile" approach involves laterally displacing electrodes away from the profile line in an alternating fashion and then inverting the resulting data using three-dimensional electrical resistivity tomography software. In our synthetic and field surveys, the offset-electrode method succeeds in revealing three-dimensional structures in the vicinity of the profile plane, which we could not achieve using three-dimensional inversions of linear profiles. We confirm and explain the limits of linear electrode profiles through a discussion of the three-dimensional sensitivity patterns: For a homogeneous starting model together with a linear electrode layout, all sensitivities remain symmetric with respect to the profile plane through each inversion step. This limitation can be overcome with offset-electrode layouts by breaking the symmetry pattern among the sensitivities. Thanks to freely available powerful three-dimensional resistivity tomography software and cheap modern computing power, the requirement for full three-dimensional calculations does not create a significant burden and renders the offset-electrode approach a cost-effective method. By offsetting the electrodes in an alternating pattern, as opposed to laying the profile out in a U-shape, we minimize shortening the profile length.
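The alternating lateral displacement described above is simple to generate. A hypothetical sketch producing (x, y) electrode coordinates for a profile laid out along the x-axis, with spacing and offset values chosen purely for illustration:

```python
# Generate offset-electrode positions: electrodes march along the profile
# (x-axis) at fixed spacing, displaced alternately to +y and -y off the
# profile line. Spacing and offset values here are illustrative.
def offset_electrode_layout(n_electrodes, spacing=1.0, offset=0.5):
    coords = []
    for i in range(n_electrodes):
        y = offset if i % 2 == 0 else -offset   # alternate sides of the line
        coords.append((i * spacing, y))
    return coords

layout = offset_electrode_layout(6)
print(layout)
# Electrodes no longer all lie in the plane y = 0, so the sensitivities of a
# 3-D inversion are no longer symmetric about the profile plane.
```

These coordinates would then be fed to three-dimensional resistivity tomography software in place of the usual collinear layout; breaking the y = 0 symmetry is what lets the inversion resolve structure off the profile plane.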

  4. EFL Vocabulary Acquisition through Word Cards: Student Perceptions and Strategies

    Science.gov (United States)

    Wilkinson, Darrell

    2017-01-01

    Vocabulary knowledge plays an important role in second language proficiency, and learners need to acquire thousands of words in order to become proficient in the target language. As numerous studies have shown that incidental vocabulary acquisition is not sufficient on its own, it is clear that learners must devote considerable time and effort to…

  5. Language Learning Strategies in Second & Foreign Language Acquisition

    OpenAIRE

    TAKEUCHI, Osamu

    1991-01-01

This article is an attempt to review the work on language learning strategies (LLS) in second & foreign language acquisition (SFLA) research, and to give suggestions for future language learning strategies research. In the first section, I will discuss briefly the background of language learning strategies research, and in the ensuing sections, I will review articles on: (i) the identification & classification of language learning strategies; (ii) the variables affecting the use of language learning st...


  6. Naval Ships Acquisition Strategy for the Venezuelan Navy.

    Science.gov (United States)

    1982-06-01

    8. Jefatura de Logistica Comandancia General de la Marina Avenida Vollmer, San Bernardino Caracas, Venezuela 9. Professor M. B. Kline, Code 54Kx 2...34...the GSN must determine the acquisition of defense systems, on the basis of the priority demand requested for the tasks de - rived from the...the contract is signed by both parties, the MOD and the Contractor(s). Transportation and installation of the Venezuelan Naval Mission in the

  7. Acquisition Management for System of Systems: Requirement Evolution and Acquisition Strategy Planning

    Science.gov (United States)

    2013-01-29

    of modern portfolio and control theory . The reformulation allows for possible changes in estimated quantities (e.g., due to market shifts in... Portfolio Theory (MPT). Final Report: NPS award N00244-11-1-0003 5 Extending CEM and Markov: Agent-Based Modeling Approach Research conducted in the...integration and acquisition from a robust portfolio theory standpoint. Robust portfolio management methodologies have been widely used by financial

  8. Data Management Standards in Computer-aided Acquisition and Logistic Support (CALS)

    Science.gov (United States)

    Jefferson, David K.

    1990-01-01

    Viewgraphs and discussion on data management standards in computer-aided acquisition and logistic support (CALS) are presented. CALS is intended to reduce cost, increase quality, and improve timeliness of weapon system acquisition and support by greatly improving the flow of technical information. The phase 2 standards, industrial environment, are discussed. The information resource dictionary system (IRDS) is described.

  9. Automation and schema acquisition in learning elementary computer programming : implications for the design of practice

    NARCIS (Netherlands)

    van Merrienboer, Jeroen J.G.; van Merrienboer, J.J.G.; Paas, Fred G.W.C.

    1990-01-01

Two complementary processes may be distinguished in learning a complex cognitive skill such as computer programming. First, automation offers task-specific procedures that may directly control programming behavior; second, schema acquisition offers cognitive structures that provide analogies in new

  10. Computer system design description for the spare pump mini-dacs data acquisition and control system

    International Nuclear Information System (INIS)

    Vargo, G.F. Jr.

    1994-01-01

The attached document outlines the computer software design for the mini data acquisition and control system (DACS) that supports the testing of the spare pump for Tank 241-SY-101 at the maintenance and storage facility (MASF).

  11. Information and psychomotor skills knowledge acquisition: A student-customer-centered and computer-supported approach.

    Science.gov (United States)

    Nicholson, Anita; Tobin, Mary

    2006-01-01

    This presentation will discuss coupling commercial and customized computer-supported teaching aids to provide BSN nursing students with a friendly customer-centered self-study approach to psychomotor skill acquisition.

  12. Acquisition and manipulation of computed tomography images of the maxillofacial region for biomedical prototyping

    International Nuclear Information System (INIS)

    Meurer, Maria Ines; Silva, Jorge Vicente Lopes da; Santa Barbara, Ailton; Nobre, Luiz Felipe; Oliveira, Marilia Gerhardt de; Silva, Daniela Nascimento

    2008-01-01

    Biomedical prototyping has resulted from a merger of rapid prototyping and imaging diagnosis technologies. However, this process is complex, considering the necessity of interaction between biomedical sciences and engineering. Good results are highly dependent on the acquisition of computed tomography images and their subsequent manipulation by means of specific software. The present study describes the experience of a multidisciplinary group of researchers in the acquisition and manipulation of computed tomography images of the maxillofacial region aiming at biomedical prototyping for surgical purposes. (author)

  13. Towards a computational spatial knowledge acquisition model in architectural space

    NARCIS (Netherlands)

    Lyu, J.; Vries, de B.; Sun, C.; Sun, C.; Zhang, J.

    2013-01-01

Abstract. Existing research related to spatial knowledge acquisition often shows a limited scope because of the complexity of the cognition process. Research in spatial representation, such as space syntax, presumes that vision drives movement. This assumption is only true under certain

  14. The Role of Knowledge Base and Declarative Metamemory in the Acquisition of a Reading Strategy.

    Science.gov (United States)

    Gaultney, Jane F.; Hack-Weiner, Nancy

    A study examined whether previous knowledge facilitates the acquisition of a reading comprehension strategy by children who are poor readers. Subjects, 54 fourth- and fifth-grade boys in Palm Beach County, Florida, who were poor readers and baseball experts, were trained in the use of a reading strategy (asking "why" questions), with…

  15. Parallel Computing Strategies for Irregular Algorithms

    Science.gov (United States)

    Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.
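The "smart load-balancing" the abstract mentions can be illustrated with a tiny greedy scheduler. This is a generic longest-processing-time heuristic, not the authors' actual partitioner; the task costs and processor count are invented for illustration.

```python
import heapq

def lpt_assign(task_costs, n_procs):
    """Longest-processing-time heuristic for irregular workloads: hand each
    task, largest first, to the currently least-loaded processor."""
    heap = [(0.0, p) for p in range(n_procs)]  # (current load, processor id)
    heapq.heapify(heap)
    assignment = {p: [] for p in range(n_procs)}
    for cost in sorted(task_costs, reverse=True):
        load, p = heapq.heappop(heap)
        assignment[p].append(cost)
        heapq.heappush(heap, (load + cost, p))
    makespan = max(load for load, _ in heap)  # heaviest processor's load
    return assignment, makespan

# Invented irregular task costs spread across 2 processors.
assignment, makespan = lpt_assign([7, 5, 4, 3, 2, 1], n_procs=2)
```

For dynamic, irregular applications such as unstructured remeshing, production codes use graph partitioners rather than this static heuristic, but the load-balancing objective (minimizing the heaviest processor's share) is the same.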

  16. STRATEGIES IN IMPROVING READING COMPREHENSION THROUGH VOCABULARY ACQUISITION

    Directory of Open Access Journals (Sweden)

    Khairil Razali

    2013-11-01

Full Text Available Vocabulary acquisition concerns how people expand the number of words they understand when learning a new language. Knowing words in a second or foreign language is vitally important because the reader will be able to understand the written text well, and the speaker will be able to communicate basic ideas through vocabulary even if the person does not understand how to create a grammatically correct sentence. As Madsen argued, “mastering vocabulary is the primary thing that every student should acquire in learning English” (Harold, 1983). Therefore, acquiring a sufficiently large vocabulary is one of the important tasks faced by L2 learners in order to comprehend written texts in reading, one of the four basic features of language learning.

  17. ALICE - A computer program for nuclear data acquisition

    International Nuclear Information System (INIS)

    Skaali, T.B.

    1981-02-01

This manual contains the users guide and the program documentation for the ALICE data acquisition system. The ALICE Users Guide, which is contained in part 1 of the manual, can be read independently of the program documentation in part 2. The ALICE program is written in the interpretive language NODAL. Due to the inherently slow execution speed of interpreted code, time-consuming tasks such as non-linear least-squares peak fitting cannot be implemented. On the other hand, the special features of the NODAL language have made possible facilities in ALICE which could hardly have been realized by, e.g., a FORTRAN program. The complete system can be divided into two parts: (i) the ALICE program written in NODAL, and (ii) a data acquisition package which logically represents an extension of the SINTRAN III operating system. The system is thus portable to other NORD-10/100 installations provided that the floating-point hardware is 48 bits. (Auth.)

  18. Literacy Acquisition and Cultural Awareness: Folksongs as Strategy ...

    African Journals Online (AJOL)

    This paper examines Yoruba folksongs in the context of performance for the purpose of entertainment, and more important, as an educational strategy essentially geared towards moral and cultural development. It discusses how folksongs have remained a vigorous aspect of the dissemi nation of knowledge about popular ...

  19. Primary teachers' knowledge and acquisition of stress relieving strategies.

    Science.gov (United States)

    Cockburn, A D

    1996-09-01

Over the last 20 years there have been numerous studies of teacher stress, but little is known of how teachers acquire coping strategies, their knowledge of those available to them, and their opinion of these techniques. A total of 335 Norfolk primary teachers responded to a postal questionnaire providing biographical details; levels of job satisfaction and work-related stress; responses to a range of commonly advocated techniques to reduce teacher stress; and their opinion on who, if anyone, should take more responsibility for reducing teacher stress. On average the respondents were aware of 35 stress reduction strategies. The most effective strategies were ensuring that one understood what one was about to teach and thorough lesson preparation. A significant proportion of practitioners said that they would not consider seeking expert sources of advice. A total of 89 per cent of practitioners reported that they acquired at least some strategies through their own experience. It was concluded that the issue of teacher stress needs to be considered at governmental, school and individual levels. In the light of some resistance to traditional methods of stress reduction, the implications for initial and in-service training were explored.

  20. Vocabulary Acquisition through Direct and Indirect Learning Strategies

    Science.gov (United States)

    Naeimi, Maki; Foo, Thomas Chow Voon

    2015-01-01

    Vocabulary learning has long been considered as one of the essential components for developing language learning. However, language learners are required to not just concern about memorizing definitions but also integrating vocabulary meaning into their present knowledge. Many strategies such as direct or indirect ones may be integrated to enhance…

  1. Attenuation Correction Strategies for Positron Emission Tomography/Computed Tomography and 4-Dimensional Positron Emission Tomography/Computed Tomography

    OpenAIRE

    Pan, Tinsu; Zaidi, Habib

    2013-01-01

This article discusses attenuation correction strategies in positron emission tomography/computed tomography (PET/CT) and 4-dimensional PET/CT imaging. An average CT scan derived from averaging the high-temporal-resolution CT images is effective in improving the registration of the CT and the PET images and quantification of the PET data. It underscores list-mode data acquisition in 4-dimensional PET and introduces 4-dimensional CT, popular in thoracic treatment planning, to 4-dimensional PET/CT. ...

  2. Data acquisition and manipulation program for the R2000 computer

    International Nuclear Information System (INIS)

    McWilliam, D.

    1980-03-01

    A data acquisition and manipulation program has been produced which enables the PFR simulator hardware to be used as a data logger or incident recorder or to acquire data from paper tape in a number of different formats. The data are stored on disc and can be displayed in graphical or alphanumeric form on video display screens. The data can be scanned, scaled and otherwise manipulated using buttons on the simulator control desk, and can be plotted or punched out on paper tape. (author)

  3. Automatic data acquisition system of environmental radiation monitor with a personal computer

    International Nuclear Information System (INIS)

    Ohkubo, Tohru; Nakamura, Takashi.

    1984-05-01

The automatic data acquisition system of the environmental radiation monitor was developed at low cost by using a PET personal computer. The count pulses from eight monitors located at four site boundaries were transmitted to a radiation control room by a signal transmission device and analyzed by the computer, via a 12-channel scaler and a PET-CAMAC interface, for graphic display and printing. (author)

  4. The Advantages and Disadvantages of Computer Technology in Second Language Acquisition

    Science.gov (United States)

    Lai, Cheng-Chieh; Kritsonis, William Allan

    2006-01-01

    The purpose of this article is to discuss the advantages and disadvantages of computer technology and Computer Assisted Language Learning (CALL) programs for current second language learning. According to the National Clearinghouse for English Language Acquisition & Language Instruction Educational Programs' report (2002), more than nine million…

  5. The Effect of Computer Simulations on Acquisition of Knowledge and Cognitive Load: A Gender Perspective

    Science.gov (United States)

    Kaheru, Sam J.; Kriek, Jeanne

    2016-01-01

    A study on the effect of the use of computer simulations (CS) on the acquisition of knowledge and cognitive load was undertaken with 104 Grade 11 learners in four schools in rural South Africa on the physics topic geometrical optics. Owing to the lack of resources a teacher-centred approach was followed in the use of computer simulations. The…

  6. Transmission computed tomography data acquisition with a SPECT system

    International Nuclear Information System (INIS)

    Greer, K.L.; Harris, C.C.; Jaszczak, R.J.; Coleman, R.E.; Hedlund, L.W.; Floyd, C.E.; Manglos, S.H.

    1987-01-01

    Phantom and animal transmission computed tomography (TCT) scans were performed with a camera-based single photon emission computed tomography (SPECT) system to determine system linearity as a function of object density, which is important in the accurate determination of attenuation coefficients for SPECT attenuation compensation. Results from phantoms showed promise in providing a linear relationship in measuring density while maintaining good image resolution. Animal images were essentially free of artifacts. Transmission computed tomography scans derived from a SPECT system appear to have the potential to provide data suitable for incorporation in an attenuation compensation algorithm at relatively low (calculated) radiation doses to the subjects
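The attenuation coefficients this record is concerned with follow the Beer-Lambert law, I = I0·exp(−μd). A minimal sketch of recovering μ from transmission counts is shown below; the counts and thickness are invented for illustration, not data from the study.

```python
import math

def attenuation_coefficient(counts_blank, counts_object, thickness_cm):
    """Invert Beer-Lambert, I = I0 * exp(-mu * d), giving mu = ln(I0 / I) / d.
    counts_blank is the blank (no-object) transmission scan, counts_object
    the scan through the object of the given thickness."""
    return math.log(counts_blank / counts_object) / thickness_cm

# Hypothetical example: 10 cm of water-like material attenuating
# 10000 blank-scan counts down to 2231 transmitted counts.
mu = attenuation_coefficient(10000, 2231, 10.0)  # roughly 0.15 per cm
```

Linearity of measured μ with object density, the property the study tests, is what makes such values usable in a SPECT attenuation compensation algorithm.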

  7. SAR: a fast computer for CAMAC data acquisition

    International Nuclear Information System (INIS)

    Bricaud, B.; Faivre, J.C.; Pain, J.

    1979-01-01

    An original 32-bit computer architecture has been designed, based on bit-slice microprocessors, around the AMD 2901. A 32 bit instruction set was defined with a 200 ns execution time per instruction. Basic memory capacity is equally divided into two 32K 32-bit zones named Program memory and Data memory. The computer has a Camac Branch interface; during a Camac transfer activation, which lasts seven cycles, five cycles are free for processing

  8. Comparison of E-Book Acquisitions Strategies Across Disciplines Finds Differences in Cost and Usage

    Directory of Open Access Journals (Sweden)

    Laura Costello

    2017-03-01

    Full Text Available Carrico, S.B., Cataldo, T.T., Botero, C., & Shelton, T. Objective – To compare e-book cost-usage data across different acquisitions styles and disciplines. Design – Case study. Setting – A public research university serving an annual enrollment of over 49,000 students and employing more than 3,000 faculty members in the Southern United States. Subjects – Cost and usage data from 15,006 e-books acquired by the Library through packages, firm orders, and demand-driven acquisitions. Methods – Data was collected from publishers and vendors across the three acquisitions strategies. Usage, cost, and call number information was collected for the materials purchased via firm order or demand driven acquisitions and these were sorted into disciplines based on the call number assigned. Discipline, cost, and use were determined for each package collection as a whole because information on individual titles was not provided by the publishers. The authors then compared usage and cost across disciplines and acquisitions strategies. Main Results – Overall, e-books purchased in packages had a 50% use rate and an average cost per use of $3.39, e-books purchased through firm orders had a 52% use rate and an average cost per use of $22.21, and e-books purchased through demand driven acquisitions had an average cost per use of $8.88 and 13.9 average uses per title. Package purchasing was cost effective for science, technology, engineering, and mathematics (STEM materials and medicine (MED materials. Demand driven acquisition was a particularly good strategy for humanities and social sciences (HSS titles. Conclusion – There are differences between the acquisitions strategies and disciplines in cost and use. Firm orders had a higher cost per use than the other acquisitions strategies. Commentary This study examined cost per use across three acquisitions styles and three disciplinary groups. The results agree with studies from other institutions that have
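The cost-per-use and use-rate figures compared in the study reduce to two simple ratios. The sketch below shows the arithmetic with invented totals (the inputs are not figures from the article, though they are chosen to land near its package-level results).

```python
def ebook_metrics(total_cost, total_uses, titles_used, titles_held):
    """The two comparison metrics from the study: cost per use (spend divided
    by recorded uses) and use rate (fraction of purchased titles ever used)."""
    return {
        "cost_per_use": total_cost / total_uses,
        "use_rate": titles_used / titles_held,
    }

# Invented illustrative totals for one acquisitions strategy and discipline.
m = ebook_metrics(total_cost=5000.0, total_uses=1475,
                  titles_used=520, titles_held=1000)
```

Computing both metrics per acquisitions strategy and per discipline is what lets the authors conclude, for example, that firm orders carry a much higher cost per use than packages or demand-driven acquisition.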

  9. Computer games: Apprehension of learning strategies

    Directory of Open Access Journals (Sweden)

    Carlos Antonio Bruno da Silva

    2003-12-01

Full Text Available Computer games, and mainly videogames, have proved to be an important tendency in Brazilian children's play. They are part of the playful culture, which associates modern technology with traditional play while preserving the importance of the latter. Based on Vygotsky's and Chadwick's ideas, this work studies alternatives in the use of videogames by occupational therapists, educators or parents, aiming at the prevention of learning difficulties by means of the apprehension of learning strategies. Sixty children were investigated under a dialectic, descriptive, qualitative/quantitative focus. A semi-structured interview, direct observation and a focus group were applied to this intentional sample of 60 children playing in 3 videogame rental shops in Fortaleza-CE and Quixadá-CE: 30 aged 4 to 6 years old and the other 30 aged 7 and 8. Results indicate that having the videogame played in groups favors the apprehension of learning and affective strategies, processing, and meta-cognition. Therefore, the videogame can be considered an excellent resource for preventing learning difficulties, enabling children to deal with their reality.

  10. Computer-Aided Prototyping Systems (CAPS) within the software acquisition process: a case study

    OpenAIRE

    Ellis, Mary Kay

    1993-01-01

Approved for public release; distribution is unlimited. This thesis provides a case study which examines the benefits derived from the practice of computer-aided prototyping within the software acquisition process. An experimental prototyping system currently under research is the Computer Aided Prototyping System (CAPS), managed under the Computer Science department of the Naval Postgraduate School, Monterey, California. This thesis determines the qualitative value which may be realized by ...

  11. Strategy of nitrogen acquisition and utilization by carnivorous Dionaea muscipula.

    Science.gov (United States)

    Kruse, Jörg; Gao, Peng; Honsel, Anne; Kreuzwieser, Jürgen; Burzlaff, Tim; Alfarraj, Saleh; Hedrich, Rainer; Rennenberg, Heinz

    2014-03-01

    Plant carnivory represents an exceptional means to acquire N. Snap traps of Dionaea muscipula serve two functions, and provide both N and photosynthate. Using (13)C/(15)N-labelled insect powder, we performed feeding experiments with Dionaea plants that differed in physiological state and N status (spring vs. autumn plants). We measured the effects of (15)N uptake on light-saturated photosynthesis (A(max)), dark respiration (R(D)) and growth. Depending on N status, insect capture briefly altered the dynamics of R(D)/A(max), reflecting high energy demand during insect digestion and nutrient uptake, followed by enhanced photosynthesis and growth. Organic N acquired from insect prey was immediately redistributed, in order to support swift renewal of traps and thereby enhance probability of prey capture. Respiratory costs associated with permanent maintenance of the photosynthetic machinery were thereby minimized. Dionaea's strategy of N utilization is commensurate with the random capture of large prey, occasionally transferring a high load of organic nutrients to the plant. Our results suggest that physiological adaptations to unpredictable resource availability are essential for Dionaea's success with regards to a carnivorous life style.

  12. Decision Making about Computer Acquisition and Use in American Schools.

    Science.gov (United States)

    Becker, Henry Jay

    1993-01-01

    Discusses the centralization and decentralization of decision making about computer use in elementary and secondary schools based on results of a 1989 national survey. Results unexpectedly indicate that more successful programs are the result of districtwide planning than individual teacher or school-level decision making. (LRW)

  13. Acquisition and analysis strategies in functional MRI at high fields

    International Nuclear Information System (INIS)

    Windischberger, C.

    2001-08-01

Functional magnetic resonance imaging represents a non-invasive technique to examine neuronal activity in the brain. It applies radio waves to excite nuclear spins, using the emitted signal during relaxation for image generation. Signal modulations from local blood flow and oxygenation level changes caused by neuronal activity are the basis for calculating functional brain maps with high spatial resolution. The present work discusses concepts for improving the spatial and temporal resolution, as well as sophisticated analysis approaches. Besides an exhaustive description of image reconstruction algorithms, computational simulations of echo-shifting in echo-planar imaging are presented and the effects on spatial resolution are quantified. The results demonstrate that echo-shifting causes only minimal resolution losses for high signal-to-noise data, but leads to severe resolution degradation (up to 30 %) in images with low signal-to-noise ratios. After an overview of the mechanisms that cause fMRI signal changes subsequent to neuronal activity, explorative analysis algorithms like Fuzzy Cluster Analysis, as well as parametric approaches, are described and discussed. In the context of fMRI artifacts, the effects of respiratory motion are examined. For the first time, well-defined breathing patterns are used to quantify the influences on fMRI signal intensity. Also, the variability of fMRI activation in a mental rotation paradigm is investigated using single-trial analysis. In this way, intra-subject activation consistency was successfully determined. Finally, in a second study on mental rotation, explorative data analysis was applied to retrieve neuro-functional hypotheses. (author)

  14. Dual-Energy Computed Tomography: Image Acquisition, Processing, and Workflow.

    Science.gov (United States)

    Megibow, Alec J; Kambadakone, Avinash; Ananthakrishnan, Lakshmi

    2018-07-01

    Dual energy computed tomography has been available for more than 10 years; however, it is currently on the cusp of widespread clinical use. The way dual energy data are acquired and assembled must be appreciated at the clinical level so that the various reconstruction types can extend its diagnostic power. The type of scanner that is present in a given practice dictates the way in which the dual energy data can be presented and used. This article compares and contrasts how dual source, rapid kV switching, and spectral technologies acquire and present dual energy reconstructions to practicing radiologists. Copyright © 2018 Elsevier Inc. All rights reserved.

  15. Automatic data-acquisition and communications computer network for fusion experiments

    International Nuclear Information System (INIS)

    Kemper, C.O.

    1981-01-01

A network of more than twenty computers serves the data acquisition, archiving, and analysis requirements of the ISX, EBT, and beam-line test facilities at the Fusion Division of Oak Ridge National Laboratory. The network includes PDP-8, PDP-12, PDP-11, PDP-10, and Interdata 8-32 processors, and is unified by a variety of high-speed serial and parallel communications channels. While some processors are dedicated to experimental data acquisition, and others are dedicated to later analysis and theoretical work, many processors perform a combination of acquisition, real-time analysis and display, and archiving and communications functions. A network software system has been developed which runs in each processor and automatically transports data files from point of acquisition to point or points of analysis, display, and storage, providing conversion and formatting functions as required

  16. The behaviour of Pacific metallurgical coal markets: the impact of Japan's acquisition strategy on market price

    Energy Technology Data Exchange (ETDEWEB)

    Koerner, R J [Queensland University, St. Lucia, Qld. (Australia). Graduate School of Management, Faculty of Commerce and Economics

    1993-03-01

    This paper examines whether some elements of Japan's resource acquisition strategies might have caused price and other distortions of market behaviour in the Pacific metallurgical coal trade. The industry chosen for investigation is that of steel manufacture, and the traded resources commodity examined is coking coal, which is the primary energy input for blast furnace iron making. Regression modelling studies to determine historic acquisition value and quality relationships for US, Australian and Canadian coals sold into the Japanese coking coal market are described. Departures from normal demand response behaviour to price competitiveness are also investigated. 3 figs., 3 tabs.

  17. Computer software design description for the integrated control and data acquisition system LDUA system

    International Nuclear Information System (INIS)

    Aftanas, B.L.

    1998-01-01

    This Computer Software Design Description (CSDD) document provides the overview of the software design for all the software that is part of the integrated control and data acquisition system of the Light Duty Utility Arm System (LDUA). It describes the major software components and how they interface. It also references the documents that contain the detailed design description of the components

  18. The Use of Computer-Based Videogames in Knowledge Acquisition and Retention.

    Science.gov (United States)

    Ricci, Katrina E.

    1994-01-01

    Research conducted at the Naval Training Systems Center in Orlando, Florida, investigated the acquisition and retention of basic knowledge with subject matter presented in the forms of text, test, and game. Results are discussed in terms of the effectiveness of computer-based games for military training. (Author/AEF)

  19. A local computer network for the experimental data acquisition at BESSY

    International Nuclear Information System (INIS)

    Buchholz, W.

    1984-01-01

For the users of the Berlin dedicated electron storage ring for synchrotron radiation (BESSY) a local computer network has been installed. The system is designed primarily for data acquisition and offers the users a generous hardware provision combined with maximum software flexibility

  20. Computer technology and computer programming research and strategies

    CERN Document Server

    Antonakos, James L

    2011-01-01

    Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.

  1. Cloud computing and ROI a new framework for it strategy

    CERN Document Server

    Mohapatra, Sanjay

    2014-01-01

This book develops an IT strategy for cloud computing that helps businesses evaluate their readiness for cloud services and calculate the ROI. The framework provided helps reduce the risks involved in transitioning from a traditional "on site" IT strategy to virtual "cloud computing." Since the advent of cloud computing, many organizations have made substantial gains implementing this innovation. Cloud computing allows companies to focus more on their core competencies, as IT enablement is taken care of through cloud services. Cloud Computing and ROI includes case studies covering the retail, automobile and food processing industries. Each of these case studies have successfully implemented the cloud computing framework, and their strategies are explained. As cloud computing may not be ideal for all businesses, criteria are also offered to help determine if this strategy should be adopted.
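The ROI calculation at the heart of such a framework is, at its simplest, net gain over cost. A deliberately simplified sketch follows; the book's actual framework is richer, and all figures here are invented.

```python
def simple_roi(annual_benefit, annual_cost, years):
    """Classic ROI ratio over an evaluation horizon:
    (total benefit - total cost) / total cost."""
    gain = annual_benefit * years
    cost = annual_cost * years
    return (gain - cost) / cost

# Invented example: cloud services costing 100k/yr that yield 130k/yr in
# business value, evaluated over a 3-year horizon.
cloud_roi = simple_roi(annual_benefit=130_000, annual_cost=100_000, years=3)
```

A readiness assessment would compare this figure against the equivalent on-site calculation (hardware, staffing, upgrades) before deciding whether to transition.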

  2. The Computer Book of the Internal Medicine Resident: competence acquisition and achievement of learning objectives.

    Science.gov (United States)

    Oristrell, J; Oliva, J C; Casanovas, A; Comet, R; Jordana, R; Navarro, M

    2014-01-01

The Computer Book of the Internal Medicine Resident (CBIMR) is a computer program that was validated to analyze the acquisition of competences in teams of Internal Medicine residents. To analyze the characteristics of the rotations during the Internal Medicine residency and to identify the variables associated with the acquisition of clinical and communication skills, the achievement of learning objectives, and resident satisfaction. All residents of our service (n=20) participated in the study during a period of 40 months. The CBIMR consisted of 22 self-assessment questionnaires specific to each rotation, with items on services (clinical workload, disease protocolization, resident responsibilities, learning environment, service organization and teamwork) and items on educational outcomes (acquisition of clinical and communication skills, achievement of learning objectives, overall satisfaction). Associations between service features and learning outcomes were analyzed using bivariate and multivariate analysis. An intense clinical workload, high resident responsibilities and disease protocolization were associated with the acquisition of clinical skills. High clinical competence and teamwork were both associated with better communication skills. Finally, an adequate learning environment was associated with increased clinical competence, the achievement of educational goals and resident satisfaction. Potentially modifiable variables related to the operation of clinical services had a significant impact on the acquisition of clinical and communication skills, the achievement of educational goals, and resident satisfaction during specialized training in Internal Medicine. Copyright © 2013 Elsevier España, S.L. All rights reserved.

  3. Littoral Combat Ship: Need to Address Fundamental Weaknesses in LCS and Frigate Acquisition Strategies

    Science.gov (United States)

    2016-06-01

LITTORAL COMBAT SHIP: Need to Address Fundamental Weaknesses in LCS and Frigate Acquisition Strategies. Report to... Highlights of GAO-16-356, a report to congressional committees, June 2016. ...capabilities of the LCS—a small surface combatant (SSC) consisting of a ship and reconfigurable mission packages built by two shipyards as different

  4. A multilevel modeling approach to examining individual differences in skill acquisition for a computer-based task.

    Science.gov (United States)

    Nair, Sankaran N; Czaja, Sara J; Sharit, Joseph

    2007-06-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50-80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performance and rate of change in performance varied according to the performance parameter assessed. Age and fluid intelligence predicted initial level and rate of improvement in overall quality, whereas crystallized intelligence and age predicted initial e-mail processing time, and crystallized intelligence predicted rate of change in e-mail processing time over days. We discuss the implications of these findings for the design of intervention strategies.

  5. Using a progress computer for the direct acquisition and processing of radiation protection data

    International Nuclear Information System (INIS)

    Barz, H.G.; Borchardt, K.D.; Hacke, J.; Kirschfeld, K.E.; Kluppak, B.

    1976-01-01

A process computer will be used at the Hahn-Meitner-Institute to rationalize radiation protection measures. Approximately 150 transmitters are to be connected to this computer, in particular the radiation measuring devices of a nuclear reactor, of hot cells, and of a heavy ion accelerator, as well as the emission and environmental monitoring systems. The advantages of this method are described: central data acquisition, central alarm and stoppage information, data processing of certain measurement values, and the possibility of quick disturbance analysis. Furthermore, the authors report on the preparations already completed, particularly the transmission of digital and analog values to the computer. (orig./HP) [de

  6. Computer Ethics Topics and Teaching Strategies.

    Science.gov (United States)

    DeLay, Jeanine A.

    An overview of six major issues in computer ethics is provided in this paper: (1) unauthorized and illegal database entry, surveillance and monitoring, and privacy issues; (2) piracy and intellectual property theft; (3) equity and equal access; (4) philosophical implications of artificial intelligence and computer rights; (5) social consequences…

  7. Computer-based learning: games as an instructional strategy.

    Science.gov (United States)

    Blake, J; Goodman, J

    1999-01-01

    Games are a creative teaching strategy that enhances learning and problem solving. Gaming strategies are being used by the authors to make learning interesting, stimulating and fun. This article focuses on the development and implementation of computer games as an instructional strategy. Positive outcomes have resulted from the use of games in the classroom.

  8. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  9. Evaluation of Acquisition Strategies for Image-Based Construction Site Monitoring

    Science.gov (United States)

    Tuttas, S.; Braun, A.; Borrmann, A.; Stilla, U.

    2016-06-01

Construction site monitoring is an essential task for keeping track of the ongoing construction work and providing up-to-date information for a Building Information Model (BIM). The BIM contains the as-planned states (geometry, schedule, costs, ...) of a construction project. For updating, the as-built state has to be acquired repeatedly and compared to the as-planned state. In the approach presented here, a 3D representation of the as-built state is calculated from photogrammetric images using multi-view stereo reconstruction. On construction sites one has to cope with several difficulties like security aspects, limited accessibility, occlusions or construction activity. Different acquisition strategies and techniques, namely (i) terrestrial acquisition with a hand-held camera, (ii) aerial acquisition using an Unmanned Aerial Vehicle (UAV) and (iii) acquisition using a fixed stereo camera pair at the boom of the crane, are tested on three test sites. They are assessed considering the special needs of the monitoring tasks and the limitations on construction sites. The three scenarios are evaluated based on the potential for automation, the required acquisition effort, the necessary equipment and its maintenance, the disturbance of the construction works, and the accuracy and completeness of the resulting point clouds. Based on the experiences during the test cases the following conclusions can be drawn: Terrestrial acquisition has the lowest requirements on the device setup but lacks automation and coverage. The crane camera shows the lowest flexibility but the highest grade of automation. The UAV approach can provide the best coverage by combining nadir and oblique views, but can be limited by obstacles and security aspects. The accuracy of the point clouds is evaluated based on plane fitting of selected building parts. The RMS errors of the fitted parts range from 1 to a few cm for the UAV and the hand-held scenario. First results show that the crane camera
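The plane-fitting accuracy check described in this abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the SVD-based least-squares fit is a standard technique, and the synthetic "wall patch" point cloud and its 2 mm noise level are assumptions.

```python
import numpy as np

def fit_plane_rms(points):
    """Least-squares plane fit; returns unit normal, centroid, and RMS residual."""
    centroid = points.mean(axis=0)
    # SVD of the centered points: the right singular vector with the
    # smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    distances = (points - centroid) @ normal  # signed point-to-plane distances
    return normal, centroid, np.sqrt(np.mean(distances ** 2))

# Synthetic wall patch: points on the plane z = 0 with 2 mm Gaussian noise (metres).
rng = np.random.default_rng(0)
xy = rng.uniform(0.0, 5.0, size=(1000, 2))
pts = np.column_stack([xy, rng.normal(0.0, 0.002, 1000)])
normal, _, rms = fit_plane_rms(pts)
```

The recovered RMS should sit close to the injected noise level, which is how a fitted building part with known planarity can serve as an accuracy reference.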

  10. EVALUATION OF ACQUISITION STRATEGIES FOR IMAGE-BASED CONSTRUCTION SITE MONITORING

    Directory of Open Access Journals (Sweden)

    S. Tuttas

    2016-06-01

Full Text Available Construction site monitoring is an essential task for keeping track of the ongoing construction work and providing up-to-date information for a Building Information Model (BIM). The BIM contains the as-planned states (geometry, schedule, costs, ...) of a construction project. For updating, the as-built state has to be acquired repeatedly and compared to the as-planned state. In the approach presented here, a 3D representation of the as-built state is calculated from photogrammetric images using multi-view stereo reconstruction. On construction sites one has to cope with several difficulties like security aspects, limited accessibility, occlusions or construction activity. Different acquisition strategies and techniques, namely (i) terrestrial acquisition with a hand-held camera, (ii) aerial acquisition using an Unmanned Aerial Vehicle (UAV) and (iii) acquisition using a fixed stereo camera pair at the boom of the crane, are tested on three test sites. They are assessed considering the special needs of the monitoring tasks and the limitations on construction sites. The three scenarios are evaluated based on the potential for automation, the required acquisition effort, the necessary equipment and its maintenance, the disturbance of the construction works, and the accuracy and completeness of the resulting point clouds. Based on the experiences during the test cases the following conclusions can be drawn: Terrestrial acquisition has the lowest requirements on the device setup but lacks automation and coverage. The crane camera shows the lowest flexibility but the highest grade of automation. The UAV approach can provide the best coverage by combining nadir and oblique views, but can be limited by obstacles and security aspects. The accuracy of the point clouds is evaluated based on plane fitting of selected building parts. The RMS errors of the fitted parts range from 1 to a few cm for the UAV and the hand-held scenario. First results show that the crane

  11. Design and Construction of Detector and Data Acquisition Elements for Proton Computed Tomography

    International Nuclear Information System (INIS)

    Fermi Research Alliance; Northern Illinois University

    2015-01-01

Proton computed tomography (pCT) offers an alternative to x-ray imaging, with potential for three-dimensional imaging, reduced radiation exposure, and in-situ imaging. Northern Illinois University (NIU) is developing a second-generation proton computed tomography system with the goal of demonstrating the feasibility of three-dimensional imaging within clinically realistic imaging times. The second-generation pCT system comprises a tracking system, a calorimeter, data acquisition, a computing farm, and software algorithms. The proton beam encounters the upstream tracking detectors, the patient or phantom, the downstream tracking detectors, and a calorimeter. The schematic layout of the pCT system is shown. The data acquisition sends the proton scattering information to an offline computing farm. Major innovations of the second-generation pCT project involve an increased data acquisition rate (MHz range) and the development of three-dimensional imaging algorithms. The Fermilab Particle Physics Division and the Northern Illinois Center for Accelerator and Detector Development at Northern Illinois University worked together to design and construct the tracking detectors, calorimeter, readout electronics and detector mounting system.

  12. Cloud Computing: Strategies for Cloud Computing Adoption

    OpenAIRE

    Shimba, Faith

    2010-01-01

    The advent of cloud computing in recent years has sparked an interest from different organisations, institutions and users to take advantage of web applications. This is a result of the new economic model for the Information Technology (IT) department that cloud computing promises. The model promises a shift from an organisation required to invest heavily for limited IT resources that are internally managed, to a model where the organisation can buy or rent resources that are managed by a clo...

  13. Food Security Strategy Based on Computer Innovation

    OpenAIRE

    Ruihui Mu

    2015-01-01

A case analysis was conducted to identify the innovative food security strategies of the Oriental Hotel, which voluntarily implemented food safety controls. The food security strategies and the reasons for their use were examined using multiple data sources from the accommodation and catering industry, including documents, interviews with key decision makers in the hotel, and observation of the business environment. The findings suggest that addressing food security involves not only food control...

  14. ACQUISITION AS A GENERATOR STRATEGY FROM COMPETITIVE ADVANTAGES IN THE BRAZILIAN MARKET OF FUELS DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Maurício Fernandes Pereira

    2013-06-01

Full Text Available This work addresses acquisitions as organizational strategies, guided by the general objective of identifying whether the acquisition of Texaco by the Ultra Group, in Brazil, could generate competitive advantages. Specifically, it characterizes the fuel distribution sector in Brazil, presenting its characteristics, strategies, the classification of resources and the identification of competitive advantages in the purchase of Texaco by the Ultra Group. The methodology used for this research is a case study of a qualitative nature. Data collection was performed through literature review, documentary analysis and semi-structured interviews. In the analysis of the collected data, the specific objectives were met. The analysis revealed the presence of features such as scale gains, brand exposure, better management practices, synergies, tangible and intangible assets and market growth; these resources are classified according to their competitive implications. It can therefore be concluded that Texaco's acquisition could bring competitive advantages to the Ultra/Ipiranga Group. Respondents believe the sector is growing and that businesses tend to grow despite the world crisis. They also confirmed that, in a highly competitive market, strategic alliances and market growth are factors that may ensure success for each company.

  15. Developing and implementing a data acquisition strategy for global agricultural monitoring: an inter-agency initiative

    Science.gov (United States)

    Justice, C. O.; Whitcraft, A. K.; Becker-Reshef, I.; Killough, B.

    2013-12-01

In 2011, in response to global food crises, the G20 Agricultural Ministers launched a satellite-based global agricultural monitoring initiative to develop the Group on Earth Observations Global Agriculture Monitoring (GEOGLAM) system. The GEO is aimed at enhancing the availability and use of both satellite and in situ data for societal benefit. This initiative builds on the observation requirements developed by the GEO Agricultural Community of Practice, the understanding that no one satellite system can currently provide all the data needed for agricultural monitoring, and the resulting recommendation for improved acquisition and availability of data by the world's space agencies. Implicit in this recommendation is the fact that certain regions of the Earth are imagery-rich while others are imagery-poor, leaving knowledge gaps about agricultural processes and food supply for certain areas of the world. In order to respond to these knowledge gaps and to strengthen national, regional, and global agricultural monitoring networks, GEOGLAM is working with the Committee on Earth Observation Satellites (CEOS), the space arm of GEO, to develop a coordinated global acquisition strategy. A key component of GEOGLAM is, first, an effort to articulate the temporal and spatial Earth Observation (EO) requirements for monitoring; second, the identification of current and planned missions that are capable of fulfilling these EO requirements; and third, the development of a multi-agency, multi-mission image acquisition strategy for agricultural monitoring. CEOS engineers and GEOGLAM scientists have been collaborating on the EO requirements since 2012, and are now beginning the first implementation phase of the acquisition strategy. The goal is to put in place an operational system of systems using a virtual constellation of satellite-based sensors acquiring data to meet the needs for monitoring and early warning of shortfalls in agricultural production, a goal that was articulated in the 1970s

  16. Computer-based data acquisition system in the Large Coil Test Facility

    International Nuclear Information System (INIS)

    Gould, S.S.; Layman, L.R.; Million, D.L.

    1983-01-01

    The utilization of computers for data acquisition and control is of paramount importance on large-scale fusion experiments because they feature the ability to acquire data from a large number of sensors at various sample rates and provide for flexible data interpretation, presentation, reduction, and analysis. In the Large Coil Test Facility (LCTF) a Digital Equipment Corporation (DEC) PDP-11/60 host computer with the DEC RSX-11M operating system coordinates the activities of five DEC LSI-11/23 front-end processors (FEPs) via direct memory access (DMA) communication links. This provides host control of scheduled data acquisition and FEP event-triggered data collection tasks. Four of the five FEPs have no operating system

  17. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters' or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  18. The Quality of Quantity: Mini-UAVS As An Alternative UAV Acquisition Strategy at the Army Brigade Level

    National Research Council Canada - National Science Library

    Weed, Shawn

    2002-01-01

    This monograph asks should the U.S. Army alter its current UAV acquisition strategy for maneuver brigades from one in which limited numbers of high capability systems are acquired, in favor of another that fields a large quantity...

  19. Investigation of measuring strategies in computed tomography

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2011-01-01

Computed tomography entered the industrial world in the 1980s as a technique for non-destructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates...

  20. The Strategy Blueprint: A Strategy Process Computer-Aided Design Tool

    OpenAIRE

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based tool, known as ‘the Strategy Blueprint’, consisting of a combination of nine strategy techniques, which can help organizations define the most suitable strategy, based on the internal and external f...

  1. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS
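The sample-compute-command cycle that this abstract describes can be illustrated with a toy feedback loop. All names, the proportional control law, and the one-variable "plant" below are hypothetical stand-ins, not DIII-D's PCS code:

```python
def control_loop(plant_read, actuate, setpoint, gain, cycles):
    """Minimal sample-compute-command cycle: each iteration samples a
    diagnostic, computes a command from a control algorithm (here plain
    proportional feedback), and sends it to an actuator."""
    history = []
    for _ in range(cycles):
        measurement = plant_read()
        command = gain * (setpoint - measurement)  # control algorithm
        actuate(command)
        history.append(measurement)
    return history

# Toy first-order plant: the actuator command nudges the state directly,
# so the loop drives the state toward the setpoint.
state = {"x": 0.0}
trace = control_loop(lambda: state["x"],
                     lambda u: state.__setitem__("x", state["x"] + u),
                     setpoint=1.0, gain=0.5, cycles=20)
```

In a real-time system like the PCS the same structure runs at a fixed period (here, the loop simply iterates), and the "algorithms" box is far richer, but the periodic sample-compute-actuate skeleton is the same.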

  2. Computer Solution to the Game of Pure Strategy

    Directory of Open Access Journals (Sweden)

    Laurent Bartholdi

    2012-11-01

    Full Text Available We numerically solve the classical "Game of Pure Strategy" using linear programming. We notice an intricate even-odd behaviour in the results of our computations that seems to encourage odd or maximal bids.
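The linear-programming approach named in this abstract can be illustrated on a one-shot zero-sum matrix game, which is a building block of the sequential game the paper solves. The sketch below uses the standard LP for the row player (maximize the game value v subject to the column player's best responses); the matching-pennies matrix is an assumed example, not data from the paper:

```python
import numpy as np
from scipy.optimize import linprog

def solve_zero_sum(payoff):
    """Optimal mixed strategy and value for the row player of a zero-sum
    matrix game: maximize v s.t. A^T x >= v, sum(x) = 1, x >= 0."""
    A = np.asarray(payoff, dtype=float)
    m, n = A.shape
    # Decision variables z = (x_1, ..., x_m, v); linprog minimizes, so use -v.
    c = np.zeros(m + 1)
    c[-1] = -1.0
    # -A^T x + v <= 0: no pure column response does better than the value v.
    A_ub = np.hstack([-A.T, np.ones((n, 1))])
    A_eq = np.ones((1, m + 1))
    A_eq[0, -1] = 0.0  # the probabilities x must sum to 1; v is unconstrained
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * m + [(None, None)])
    return res.x[:m], res.x[-1]

# Matching pennies: the value is 0 and optimal play is 50/50.
strategy, value = solve_zero_sum([[1, -1], [-1, 1]])
```

Solving the full Game of Pure Strategy requires stitching such LPs together over the game's bidding stages, which is where the even-odd structure the authors report emerges.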

  3. Effects of fundamentals acquisition and strategy switch on stock price dynamics

    Science.gov (United States)

    Wu, Songtao; He, Jianmin; Li, Shouwei

    2018-02-01

An agent-based artificial stock market is developed to simulate the trading behavior of investors. In the market, the acquisition and employment of information about fundamentals, together with strategy switching, are investigated to explain stock price dynamics. Investors can obtain the information from both the market and neighbors residing on their social networks. Depending on information status and the performances of different strategies, an informed investor may switch to the strategy of fundamentalist. This in turn affects the information acquisition process, since fundamentalists are more inclined to search for and spread the information than chartists. Further investigation into the price dynamics generated from three typical networks, i.e. regular lattice, small-world network and random graph, is conducted after the general relation between network structures and price dynamics is revealed. In each network, the integrated effects of different combinations of information efficiency and switch intensity are investigated. Results show that, along with increasing switch intensity, market and social information efficiency play different roles in the formation of price distortion, standard deviation and kurtosis of returns.

  4. Computer-based programs on acquisition of reading skills in schoolchildren (review of contemporary foreign investigations

    Directory of Open Access Journals (Sweden)

    Prikhoda N.A.

    2015-03-01

Full Text Available The article presents a description of 17 computer-based programs that were used over the last 5 years (2008—2013) in 15 studies of computer-assisted reading instruction and intervention for schoolchildren. The article describes the specific terminology used in these studies and the contents of the training sessions. It also briefly analyzes the main characteristics of the computer-based techniques — language of instruction, age and basic characteristics of the students, duration and frequency of training sessions, and dependent variables of the training. Special attention is paid to the efficiency of acquisition of different reading skills through computer-based programs in comparison to traditional school instruction.

  5. Scalable Strategies for Computing with Massive Data

    Directory of Open Access Journals (Sweden)

    Michael Kane

    2013-11-01

Full Text Available This paper presents two complementary statistical computing frameworks that address challenges in parallel processing and the analysis of massive data. First, the foreach package allows users of the R programming environment to define parallel loops that may be run sequentially on a single machine, in parallel on a symmetric multiprocessing (SMP) machine, or in cluster environments without platform-specific code. Second, the bigmemory package implements memory- and file-mapped data structures that provide (a) access to arbitrarily large data while retaining a look and feel that is familiar to R users and (b) data structures that are shared across processor cores in order to support efficient parallel computing techniques. Although these packages may be used independently, this paper shows how they can be used in combination to address challenges that have effectively been beyond the reach of researchers who lack specialized software development skills or expensive hardware.
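The foreach idea (the same loop body running either sequentially or on a worker pool, without changing caller code) can be sketched in Python. This is an analogy to illustrate the pattern, not the R package's API; the `for_each` helper and its parameters are invented for the example:

```python
from concurrent.futures import ThreadPoolExecutor

def for_each(func, items, parallel=False, workers=4):
    """Tiny analogue of R's foreach: the caller picks sequential or
    parallel execution with a flag, and the loop body stays the same."""
    if not parallel:
        return [func(x) for x in items]  # plain sequential loop
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, items))  # same body, on a worker pool

# Split a big range into chunks and reduce the partial results,
# mirroring the embarrassingly parallel loops the paper targets.
chunks = [range(i, i + 1000) for i in range(0, 10000, 1000)]
partial_sums = for_each(sum, chunks, parallel=True)
total = sum(partial_sums)
```

The bigmemory half of the paper has no direct stdlib analogue here; the key point this sketch captures is that backend choice is decoupled from the loop body.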

  6. Multiscale methods in turbulent combustion: strategies and computational challenges

    International Nuclear Information System (INIS)

    Echekki, Tarek

    2009-01-01

    A principal challenge in modeling turbulent combustion flows is associated with their complex, multiscale nature. Traditional paradigms in the modeling of these flows have attempted to address this nature through different strategies, including exploiting the separation of turbulence and combustion scales and a reduced description of the composition space. The resulting moment-based methods often yield reasonable predictions of flow and reactive scalars' statistics under certain conditions. However, these methods must constantly evolve to address combustion at different regimes, modes or with dominant chemistries. In recent years, alternative multiscale strategies have emerged, which although in part inspired by the traditional approaches, also draw upon basic tools from computational science, applied mathematics and the increasing availability of powerful computational resources. This review presents a general overview of different strategies adopted for multiscale solutions of turbulent combustion flows. Within these strategies, some specific models are discussed or outlined to illustrate their capabilities and underlying assumptions. These strategies may be classified under four different classes, including (i) closure models for atomistic processes, (ii) multigrid and multiresolution strategies, (iii) flame-embedding strategies and (iv) hybrid large-eddy simulation-low-dimensional strategies. A combination of these strategies and models can potentially represent a robust alternative strategy to moment-based models; but a significant challenge remains in the development of computational frameworks for these approaches as well as their underlying theories. (topical review)

  7. Teaching Concept Mapping and University Level Study Strategies Using Computers.

    Science.gov (United States)

    Mikulecky, Larry; And Others

    1989-01-01

    Assesses the utility and effectiveness of three interactive computer programs and associated print materials in instructing and modeling for undergraduates how to comprehend and reconceptualize scientific textbook material. Finds that "how to" reading strategies can be taught via computer and transferred to new material. (RS)

  8. Hardware and software maintenance strategies for upgrading vintage computers

    International Nuclear Information System (INIS)

    Wang, B.C.; Buijs, W.J.; Banting, R.D.

    1992-01-01

    The paper focuses on the maintenance of the computer hardware and software for digital control computers (DCC). Specific design and problems related to various maintenance strategies are reviewed. A foundation was required for a reliable computer maintenance and upgrading program to provide operation of the DCC with high availability and reliability for 40 years. This involved a carefully planned and executed maintenance and upgrading program, involving complementary hardware and software strategies. The computer system was designed on a modular basis, with large sections easily replaceable, to facilitate maintenance and improve availability of the system. Advances in computer hardware have made it possible to replace DCC peripheral devices with reliable, inexpensive, and widely available components from PC-based systems (PC = personal computer). By providing a high speed link from the DCC to a PC, it is now possible to use many commercial software packages to process data from the plant. 1 fig

  9. Strategies for MCMC computation in quantitative genetics

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus; Ibánēz-Escriche, Noelia; Sorensen, Daniel

… another extension of the linear mixed model introducing genetic random effects influencing the log residual variances of the observations, thereby producing a genetically structured variance heterogeneity. Considerable computational problems arise when abandoning the standard linear mixed model. Maximum … the various algorithms in the context of the heterogeneous variance model. Apart from being a model of great interest in its own right, this model has proven to be a hard test for MCMC methods. We compare the performances of the different algorithms when applied to three real datasets which differ markedly … results of applying two MCMC schemes to data sets with pig litter sizes, rabbit litter sizes, and snail weights. Some concluding remarks are given in Section 5.
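As a minimal illustration of the kind of sampler such MCMC comparisons build on, here is a generic random-walk Metropolis sketch. It is not one of the paper's algorithms: the standard-normal target, the proposal scale, and all names are assumptions made for the example:

```python
import math
import random

def random_walk_metropolis(log_target, x0, steps, scale=1.0, seed=1):
    """Random-walk Metropolis: propose x' = x + Normal(0, scale) and
    accept with probability min(1, target(x') / target(x))."""
    rng = random.Random(seed)
    x, samples, accepted = x0, [], 0
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, scale)
        # Accept/reject in log space for numerical stability.
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x, accepted = proposal, accepted + 1
        samples.append(x)
    return samples, accepted / steps

# Standard normal target, log density -x^2/2 up to a constant.
samples, acc_rate = random_walk_metropolis(
    lambda x: -0.5 * x * x, x0=0.0, steps=20000, scale=2.4)
```

Comparisons like the one in this record typically vary the proposal mechanism and blocking of parameters and then judge mixing on real datasets; the acceptance rate and autocorrelation of chains like the one above are the usual diagnostics.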

  10. ''Big Dee'' upgrade of the Doublet III diagnostic data acquisition computer system

    International Nuclear Information System (INIS)

    Mcharg, B.B.

    1983-01-01

    The ''Big Dee'' upgrade of the Doublet III tokamak facility will begin operation in 1986 with an initial quantity of data expected to be 10 megabytes per shot and eventually attaining 20-25 megabytes per shot. This is in comparison to the 4-5 megabytes of data currently acquired. To handle this greater quantity of data and to serve physics needs for significantly improved between-shot processing of data will require a substantial upgrade of the existing data acquisition system. The key points of the philosophy that have been adopted for the upgraded system to handle the greater quantity of data are (1) preserve existing hardware, (2) preserve existing software; (3) configure the system in a modular fashion; and (4) distribute the data acquisition over multiple computers. The existing system using ModComp CLASSIC 16 bit minicomputers is capable of handling 5 megabytes of data per shot

  11. Big Dee upgrade of the Doublet III diagnostic data acquisition computer system

    International Nuclear Information System (INIS)

    McHarg, B.B. Jr.

    1983-12-01

    The Big Dee upgrade of the Doublet III tokamak facility will begin operation in 1986 with an initial quantity of data expected to be 10 megabytes per shot and eventually attaining 20 to 25 megabytes per shot. This is in comparison to the 4 to 5 megabytes of data currently acquired. To handle this greater quantity of data and to serve physics needs for significantly improved between-shot processing of data will require a substantial upgrade of the existing data acquisition system. The key points of the philosophy that have been adopted for the upgraded system to handle the greater quantity of data are (1) preserve existing hardware; (2) preserve existing software; (3) configure the system in a modular fashion; and (4) distribute the data acquisition over multiple computers. The existing system using ModComp CLASSIC 16 bit minicomputers is capable of handling 5 megabytes of data per shot

  12. Adaptive Optics Facility: control strategy and first on-sky results of the acquisition sequence

    Science.gov (United States)

    Madec, P.-Y.; Kolb, J.; Oberti, S.; Paufique, J.; La Penna, P.; Hackenberg, W.; Kuntschner, H.; Argomedo, J.; Kiekebusch, M.; Donaldson, R.; Suarez, M.; Arsenault, R.

    2016-07-01

    The Adaptive Optics Facility is an ESO project aiming at converting Yepun, one of the four 8m telescopes in Paranal, into an adaptive telescope. This is done by replacing the current conventional secondary mirror of Yepun by a Deformable Secondary Mirror (DSM) and attaching four Laser Guide Star (LGS) Units to its centerpiece. In the meantime, two Adaptive Optics (AO) modules have been developed incorporating each four LGS WaveFront Sensors (WFS) and one tip-tilt sensor used to control the DSM at 1 kHz frame rate. The four LGS Units and one AO module (GRAAL) have already been assembled on Yepun. Besides the technological challenge itself, one critical area of AOF is the AO control strategy and its link with the telescope control, including Active Optics used to shape M1. Another challenge is the request to minimize the overhead due to AOF during the acquisition phase of the observation. This paper presents the control strategy of the AOF. The current control of the telescope is first recalled, and then the way the AO control makes the link with the Active Optics is detailed. Lab results are used to illustrate the expected performance. Finally, the overall AOF acquisition sequence is presented as well as first results obtained on sky with GRAAL.

  13. Computer control and data acquisition system for the R.F. Test Facility

    International Nuclear Information System (INIS)

    Stewart, K.A.; Burris, R.D.; Mankin, J.B.; Thompson, D.H.

    1986-01-01

    The Radio Frequency Test Facility (RFTF) at Oak Ridge National Laboratory, used to test and evaluate high-power ion cyclotron resonance heating (ICRH) systems and components, is monitored and controlled by a multicomponent computer system. This data acquisition and control system consists of three major hardware elements: (1) an Allen-Bradley PLC-3 programmable controller; (2) a VAX 11/780 computer; and (3) a CAMAC serial highway interface. Operating in LOCAL as well as REMOTE mode, the programmable logic controller (PLC) performs all the control functions of the test facility. The VAX computer acts as the operator's interface to the test facility by providing color mimic panel displays and allowing input via a trackball device. The VAX also provides archiving of trend data acquired by the PLC. Communications between the PLC and the VAX are via the CAMAC serial highway. Details of the hardware, software, and the operation of the system are presented in this paper

  14. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1). Revision 1

    International Nuclear Information System (INIS)

    Truitt, R.W.

    1994-01-01

    This document provides descriptions of components and tasks that are involved in the computer system for the data acquisition and control of the mitigation tests conducted on waste tank SY-101 at the Hanford Nuclear Reservation. The system was designed and implemented by Los Alamos National Laboratory and supplied to Westinghouse Hanford Company. The computers (both personal computers and specialized data-taking computers) and the software programs of the system will hereafter collectively be referred to as the DACS (Data Acquisition and Control System)

  15. An on-line data acquisition system based on Norsk-Data ND-560 computer

    International Nuclear Information System (INIS)

    Bandyopadhyay, A.; Roy, A.; Dey, S.K.; Bhattacharya, S.; Bhowmik, R.K.

    1987-01-01

    This paper describes a high-speed data acquisition system based on CAMAC for Norsk Data ND-560 computer operating in a multiuser environment. As opposed to the present trend, the system has been implemented with minimum hardware at CAMAC level taking advantage of the dual processors of ND-560. The package consists of several coordinated tasks running in the two CPUs which acquire data, record on tape, permit on-line analysis and display the data and perform related control operations. It has been used in several experiments at VECC and its performance in on-line experiments is reported. (orig.)

  16. A versatile computer based system for data acquisition manipulation and presentation

    International Nuclear Information System (INIS)

    Bardsley, D.J.

    1985-12-01

    A data acquisition system based on the Microdata M 1600L data logger and a Digital Equipment Corporation (DEC) VT103 computer system has been set up for use in a wide range of research and development projects in the field of fission detectors and associated technology. The philosophy underlying the system and its important features are described. Operating instructions for the logger are given, and its application to experimental measurements is considered. Observations on whole system performance, and recommendations for improvements are made. (U.K.)

  17. A method for improved 4D-computed tomography data acquisition

    Energy Technology Data Exchange (ETDEWEB)

    Kupper, Martin; Sprengel, Wolfgang [Technische Univ. Graz (Austria). Inst. fuer Materialphysik; Winkler, Peter; Zurl, Brigitte [Medizinische Univ. Graz (Austria). Comprehensive Cancer Center

    2017-05-01

    In four-dimensional time-dependent computed tomography (4D-CT) of the lungs, irregularities in breathing movements can cause errors in data acquisition, or even data loss. We present a method based on sending a synthetic, regular breathing signal to the CT instead of the real signal, which ensures 4D-CT data sets without data loss. Subsequent correction of the signal based on the real breathing curve enables an accurate reconstruction of the size and movement of the target volume. This makes it possible to plan radiation treatment based on the obtained data. The method was tested with dynamic thorax phantom measurements using synthetic and real breathing patterns.

  18. Software development on the DIII-D control and data acquisition computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B. Jr.; Piglowski, D.

    1997-11-01

    The various software systems developed for the DIII-D tokamak have played a highly visible and important role in tokamak operations and fusion research. Because of the heavy reliance on in-house developed software encompassing all aspects of operating the tokamak, much attention has been given to the careful design, development, and maintenance of these software systems. Software systems responsible for tokamak control and monitoring, neutral beam injection, and data acquisition demand the highest level of reliability during plasma operations. These systems, made up of hundreds of programs totaling thousands of lines of code, have presented a wide variety of software design and development issues, ranging from low-level hardware communications, database management, and distributed process control to man-machine interfaces. The focus of this paper is to describe how software is developed and managed for the DIII-D control and data acquisition computers. It includes an overview and status of software systems implemented for tokamak control, neutral beam control, and data acquisition. The issues and challenges faced in developing and managing the large amounts of software in support of the dynamic and ever-changing needs of the DIII-D experimental program are addressed

  19. Three-dimensional protein structure prediction: Methods and computational strategies.

    Science.gov (United States)

    Dorn, Márcio; E Silva, Mariel Barbachan; Buriol, Luciana S; Lamb, Luis C

    2014-10-12

    A long-standing problem in structural bioinformatics is to determine the three-dimensional (3-D) structure of a protein when only a sequence of amino acid residues is given. Many computational methodologies and algorithms have been proposed as solutions to the 3-D Protein Structure Prediction (3-D-PSP) problem. These methods can be divided into four main classes: (a) first-principle methods without database information; (b) first-principle methods with database information; (c) fold recognition and threading methods; and (d) comparative modeling methods and sequence alignment strategies. Deterministic computational techniques, optimization techniques, data mining, and machine learning approaches are typically used in the construction of computational solutions for the PSP problem. Our main goal with this work is to review the methods and computational strategies that are currently used in 3-D protein prediction. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. An empirical analysis of the dynamic programming model of stockpile acquisition strategies for China's strategic petroleum reserve

    International Nuclear Information System (INIS)

    Wu, Gang; Fan, Ying; Wei, Yi-Ming; Liu, Lan-Cui

    2008-01-01

    The world's future oil price is affected by many factors. The challenge, therefore, is how to select optimal stockpile acquisition strategies to minimize the cost of maintaining a reserve. This paper provides a new method for analyzing this problem, using an uncertain dynamic programming model of stockpile acquisition strategies for a strategic petroleum reserve. Using this model, we quantify the impact of uncertain world oil prices on the optimal stockpile acquisition strategies of China's strategic petroleum reserve for the periods 2007-2010 and 2011-2020. Our results show that future stockpile acquisition is related to oil prices and their probabilities and that, if the occurrence of an oil supply shortage is not considered, China should purchase at least 25 million barrels when the world oil price is at an optimal level. The optimal acquisition price in each year is strongly related to the probability of a high price, and the optimal expected price and size of stockpile acquisition differ from year to year. (author)
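
    The stochastic dynamic program described in this record can be illustrated with a toy backward-induction sketch: a buyer must fill a reserve over a few periods while each period's oil price is drawn from a known distribution and observed before the purchase decision. All numbers (prices, probabilities, target, lot size) are illustrative, not the paper's data.

```python
PRICES = {60: 0.3, 80: 0.5, 100: 0.2}  # price scenarios ($/bbl) and probabilities
TARGET = 100                            # reserve target (million barrels)
STEP = 25                               # purchase lot size per period
PERIODS = 4                             # acquisition periods

def solve():
    """Backward induction: value[t][s] = minimal expected remaining cost
    when holding stock s at the start of period t (price observed, then buy)."""
    INF = float("inf")
    stocks = range(0, TARGET + STEP, STEP)
    # Terminal condition: the target must be met by the end of the horizon.
    value = {PERIODS: {s: (0.0 if s >= TARGET else INF) for s in stocks}}
    for t in reversed(range(PERIODS)):
        value[t] = {}
        for s in stocks:
            exp_cost = 0.0
            for price, p in PRICES.items():
                # After observing this period's price, buy the cost-minimising lot.
                best = min(price * buy + value[t + 1][min(s + buy, TARGET)]
                           for buy in range(0, TARGET - s + STEP, STEP))
                exp_cost += p * best
            value[t][s] = exp_cost
    return value

value = solve()
```

    For these illustrative numbers the optimal policy is a price threshold: buy the full remainder whenever the low price appears, otherwise wait, buying whatever is left in the final period.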

  1. Computer-based supervisory control and data acquisition system for the radioactive waste evaporator

    International Nuclear Information System (INIS)

    Pope, N.G.; Schreiber, S.B.; Yarbro, S.L.; Gomez, B.G.; Nekimken, H.L.; Sanchez, D.E.; Bibeau, R.A.; Macdonald, J.M.

    1994-12-01

    The evaporator process at TA-55 reduces the amount of transuranic liquid radioactive waste by separating radioactive salts from relatively low-level radioactive nitric acid solution. A computer-based supervisory control and data acquisition (SCADA) system has been installed on the process that allows the operators to easily interface with process equipment. Individual single-loop controllers in the SCADA system allow more precise process operation with less human intervention. With this system, process data can be archived in computer files for later analysis. Data are distributed throughout the TA-55 site through a local area network so that real-time process conditions can be monitored at multiple locations. The entire system has been built using commercially available hardware and software components

  2. Data acquisition with the personal computer to the microwaves generator of the microtron MT-25

    International Nuclear Information System (INIS)

    Rivero Ramirez, D.; Benavides Benitez, J. I.; Quiles Latorre, F. J.; Pahor, J.; Ponikvar, D.; Lago, G.

    2000-01-01

    This paper describes the design, construction, and commissioning of a data acquisition system intended to sample the operating parameters of the microwave generator of the Microtron MT-25, which will be installed at the Higher Institute of Nuclear Sciences and Technology, Havana, Cuba. To guarantee suitable operation of the system, a monitor program has been developed in assembly language; it handles communication with a personal computer through an RS-232 interface and executes the commands received through it. The paper also covers a program for servicing the system from a personal computer using virtual-instrumentation methods

  3. A New Adaptive Checkpointing Strategy for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    MENChaoguang; ZUODecheng; YANGXiaozong

    2005-01-01

    Adaptive checkpointing is an efficient recovery scheme well suited to mobile computing systems. However, existing adaptive checkpointing schemes cannot correctly recover the system when a failure occurs during certain special periods. In this paper, the issues that lead to system inconsistency are first discussed, and then a new adaptive strategy that recovers the system to a correct consistent state is proposed. Because of logging, only the failed process needs to roll back, which improves system recovery performance.

  4. Xenon Acquisition Strategies for High-Power Electric Propulsion NASA Missions

    Science.gov (United States)

    Herman, Daniel A.; Unfried, Kenneth G.

    2015-01-01

    The benefits of high-power solar electric propulsion (SEP) for both NASA's human and science exploration missions, combined with the technology investment from the Space Technology Mission Directorate, have enabled the development of a 50 kW-class SEP mission. NASA mission concepts developed, including the Asteroid Redirect Robotic Mission, and those proposed by contracted efforts for the 30 kW-class demonstration have a range of xenon propellant loads from hundreds of kilograms up to 10,000 kg. A xenon propellant load of 10 metric tons represents greater than 10% of the global annual production rate of xenon. A single procurement of this size with short-term delivery can disrupt the xenon market, driving up pricing and making the propellant costs for the mission prohibitive. This paper examines the status of the xenon industry worldwide, including historical xenon supply and pricing. The paper discusses approaches for acquiring on the order of 10 MT of xenon propellant considering realistic programmatic constraints to support potential near-term NASA missions. Finally, the paper discusses acquisition strategies for mission campaigns utilizing multiple high-power solar electric propulsion vehicles requiring hundreds of metric tons of xenon over an extended period of time, where a longer-term acquisition approach could be implemented.

  5. The Strategy Blueprint : A Strategy Process Computer-Aided Design Tool

    NARCIS (Netherlands)

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based

  6. Use of VME computers for the data acquisition system of the PHOENICS experiment

    International Nuclear Information System (INIS)

    Zucht, B.

    1989-10-01

    The data acquisition program PHON (PHOENICS ONLINE) for the PHOENICS experiment at the stretcher ring ELSA in Bonn is described. PHON is based on fast parallel CAMAC readout with special VME front-end processors (VIP) and a VAX computer, allowing comfortable control and programming. Special tools have been developed to facilitate the implementation of user programs. The PHON compiler allows the user to specify, in a simple language, the arrangement of the CAMAC modules to be read out for each event (the CAMAC list). The CAMAC list is translated into 68000 assembly and runs on the front-end processors, making high data rates possible. User programs for monitoring and control of the experiment normally require low data rates and therefore run on the VAX computer. CAMAC operations are supported by the PHON CAMAC library. For graphic representation of the data, the CERN standard program libraries HBOOK and PAW are used. The data acquisition system is very flexible and can be easily adapted to different experiments. (orig.)

  7. Efficient Strategy Computation in Zero-Sum Asymmetric Repeated Games

    KAUST Repository

    Li, Lichun

    2017-03-06

    Zero-sum asymmetric games model decision making scenarios involving two competing players who have different information about the game being played. A particular case is that of nested information, where one (informed) player has superior information over the other (uninformed) player. This paper considers the case of nested information in repeated zero-sum games and studies the computation of strategies for both the informed and uninformed players for finite-horizon and discounted infinite-horizon nested information games. For finite-horizon settings, we exploit the fact that for both players, the security strategy, and also the opponent's corresponding best response, depend only on the informed player's history of actions. Using this property, we refine the sequence form and formulate an LP computation of player strategies that is linear in the size of the uninformed player's action set. For the infinite-horizon discounted game, we construct LP formulations to compute the approximated security strategies for both players, and provide a bound on the performance difference between the approximated security strategies and the security strategies. Finally, we illustrate the results on a network interdiction game between an informed system administrator and an uninformed intruder.
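
    The notion of a security strategy can be illustrated in the classical full-information special case of a zero-sum matrix game. Instead of the paper's LP formulations, the sketch below uses fictitious play, a simple best-response iteration whose empirical mixtures are known to converge to security strategies in zero-sum games; the payoff matrix (matching pennies) is an illustrative stand-in, not from the paper.

```python
def fictitious_play(A, iters=20000):
    """Approximate security strategies for the row (maximizer) and
    column (minimizer) players of the zero-sum payoff matrix A."""
    m, n = len(A), len(A[0])
    row_counts, col_counts = [0] * m, [0] * n
    row_counts[0] += 1          # seed with an arbitrary first action
    col_counts[0] += 1
    for _ in range(iters):
        # Row player best-responds to the column player's empirical mixture.
        row_payoffs = [sum(A[i][j] * col_counts[j] for j in range(n)) for i in range(m)]
        row_counts[row_payoffs.index(max(row_payoffs))] += 1
        # Column player best-responds to the row player's empirical mixture.
        col_payoffs = [sum(A[i][j] * row_counts[i] for i in range(m)) for j in range(n)]
        col_counts[col_payoffs.index(min(col_payoffs))] += 1
    x = [c / sum(row_counts) for c in row_counts]
    y = [c / sum(col_counts) for c in col_counts]
    # Payoff x guarantees against any column: a lower bound on the game value.
    lower = min(sum(x[i] * A[i][j] for i in range(m)) for j in range(n))
    return x, y, lower

x, y, lower = fictitious_play([[1, -1], [-1, 1]])  # matching pennies
```

    For matching pennies the empirical mixtures approach (1/2, 1/2) and the guaranteed payoff `lower` approaches the game value 0 from below.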

  8. Computational strategies for three-dimensional flow simulations on distributed computer systems

    Science.gov (United States)

    Sankar, Lakshmi N.; Weed, Richard A.

    1995-08-01

    This research effort is directed towards an examination of issues involved in porting large computational fluid dynamics codes in use within the industry to a distributed computing environment. This effort addresses strategies for implementing the distributed computing in a device independent fashion and load balancing. A flow solver called TEAM presently in use at Lockheed Aeronautical Systems Company was acquired to start this effort. The following tasks were completed: (1) The TEAM code was ported to a number of distributed computing platforms including a cluster of HP workstations located in the School of Aerospace Engineering at Georgia Tech; a cluster of DEC Alpha Workstations in the Graphics visualization lab located at Georgia Tech; a cluster of SGI workstations located at NASA Ames Research Center; and an IBM SP-2 system located at NASA ARC. (2) A number of communication strategies were implemented. Specifically, the manager-worker strategy and the worker-worker strategy were tested. (3) A variety of load balancing strategies were investigated. Specifically, the static load balancing, task queue balancing and the Crutchfield algorithm were coded and evaluated. (4) The classical explicit Runge-Kutta scheme in the TEAM solver was replaced with an LU implicit scheme. And (5) the implicit TEAM-PVM solver was extensively validated through studies of unsteady transonic flow over an F-5 wing, undergoing combined bending and torsional motion. These investigations are documented in extensive detail in the dissertation, 'Computational Strategies for Three-Dimensional Flow Simulations on Distributed Computing Systems', enclosed as an appendix.
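
    The manager-worker / task-queue pattern evaluated in this report can be sketched with threads standing in for the distributed (PVM) processes of the original work; the per-block "computation" is a placeholder, and all names here are invented for the example.

```python
import queue
import threading

def run_manager_worker(task_blocks, n_workers=4):
    """Manager fills a shared queue; idle workers pull the next block, so
    faster workers naturally process more blocks (dynamic load balancing)."""
    work, results = queue.Queue(), queue.Queue()
    for block in task_blocks:
        work.put(block)

    def worker():
        while True:
            try:
                block = work.get_nowait()
            except queue.Empty:
                return  # queue drained: worker exits
            results.put(sum(x * x for x in block))  # placeholder "flow computation"

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return sorted(results.queue)

out = run_manager_worker([[1, 2], [3], [4, 5, 6]])
```

    With real distributed solvers the blocks would be grid partitions and the workers remote processes, but the balancing logic is the same.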

  10. Computer data-acquisition and control system for Thomson-scattering measurements

    International Nuclear Information System (INIS)

    Stewart, K.A.; Foskett, R.D.; Kindsfather, R.R.; Lazarus, E.A.; Thomas, C.E.

    1983-03-01

    The Thomson-Scattering Diagnostic System (SCATPAK II) used to measure the electron temperature and density in the Impurity Study Experiment is interfaced to a Perkin-Elmer 8/32 computer that operates under the OS/32 operating system. The calibration, alignment, and operation of this diagnostic are all under computer control. Data acquired from 106 photomultiplier tubes installed on 15 spectrometers are transmitted to the computer by eighteen 12-channel, analog-to-digital integrators along a CAMAC serial highway. With each laser pulse, 212 channels of data are acquired: 106 channels of signal plus background and 106 channels of background only. Extensive use of CAMAC instrumentation enables large amounts of data to be acquired and control processes to be performed in a time-dependent environment. The Thomson-scattering computer system currently operates in three modes: user interaction and control, data acquisition and transmission, and data analysis. This paper discusses the development and implementation of this system as well as data storage and retrieval

  11. Advanced Simulation and Computing Co-Design Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Ang, James A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hoang, Thuc T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Kelly, Suzanne M. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); McPherson, Allen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Rob [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    This ASC Co-design Strategy lays out the full continuum and components of the co-design process, based on what we have experienced thus far and what we wish to do more in the future to meet the program’s mission of providing high performance computing (HPC) and simulation capabilities for NNSA to carry out its stockpile stewardship responsibility.

  12. Aviation Acquisition: A Comprehensive Strategy Is Needed for Cultural Change at FAA

    Science.gov (United States)

    1996-08-22

    The Federal Aviation Administration's (FAA) timely acquisition of new air traffic control equipment has become increasingly critical for aviation safety and efficiency. However, persistent acquisition problems raise questions about the agency's...

  13. Categorizing words using 'frequent frames': what cross-linguistic analyses reveal about distributional acquisition strategies.

    Science.gov (United States)

    Chemla, Emmanuel; Mintz, Toben H; Bernal, Savita; Christophe, Anne

    2009-04-01

    Mintz (2003) described a distributional environment called a frame, defined as the co-occurrence of two context words with one intervening target word. Analyses of English child-directed speech showed that words that fell within any frequently occurring frame consistently belonged to the same grammatical category (e.g. noun, verb, adjective, etc.). In this paper, we first generalize this result to French, a language in which the function word system allows patterns that are potentially detrimental to a frame-based analysis procedure. Second, we show that the discontinuity of the chosen environments (i.e. the fact that target words are framed by the context words) is crucial for the mechanism to be efficient. This property might be relevant for any computational approach to grammatical categorization. Finally, we investigate a recursive application of the procedure and observe that the categorization is paradoxically worse when context elements are categories rather than actual lexical items. Item-specificity is thus also a core computational principle for this type of algorithm. Our analysis, along with results from behavioural studies (Gómez, 2002; Gómez and Maye, 2005; Mintz, 2006), provides strong support for frames as a basis for the acquisition of grammatical categories by infants. Discontinuity and item-specificity appear to be crucial features.
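
    Frame extraction as defined above (two context words with one intervening target) is straightforward to sketch. The toy corpus and the top-k cutoff below are invented for illustration; they are not the study's child-directed speech data.

```python
from collections import Counter, defaultdict

def frequent_frames(sentences, top_k=2):
    """Collect (preceding word, following word) frames and the target words
    that occur between them; keep only the most frequent frames."""
    frame_counts = Counter()
    frame_targets = defaultdict(list)
    for sent in sentences:
        for a, target, b in zip(sent, sent[1:], sent[2:]):
            frame_counts[(a, b)] += 1
            frame_targets[(a, b)].append(target)
    return {f: frame_targets[f] for f, _ in frame_counts.most_common(top_k)}

corpus = [
    "you want it".split(), "you see it".split(),
    "you take it".split(), "the dog runs fast".split(),
]
frames = frequent_frames(corpus)
```

    Here the frequent frame ('you', 'it') collects want/see/take, all verbs, illustrating why targets of a frequent frame tend to share a grammatical category.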

  14. Endoleak detection using single-acquisition split-bolus dual-energy computer tomography (DECT)

    Energy Technology Data Exchange (ETDEWEB)

    Javor, D.; Wressnegger, A.; Unterhumer, S.; Kollndorfer, K.; Nolz, R.; Beitzke, D.; Loewe, C. [Medical University of Vienna, Department of Biomedical Imaging and Image-guided Therapy, Vienna (Austria)

    2017-04-15

    To assess a single-phase, dual-energy computed tomography (DECT) with a split-bolus technique and reconstruction of virtual non-enhanced images for the detection of endoleaks after endovascular aneurysm repair (EVAR). Fifty patients referred for routine follow-up post-EVAR CT and a history of at least one post-EVAR follow-up CT examination using our standard biphasic (arterial and venous phase) routine protocol (which was used as the reference standard) were included in this prospective trial. An in-patient comparison and an analysis of the split-bolus protocol and the previously used double-phase protocol were performed with regard to differences in diagnostic accuracy, radiation dose, and image quality. The analysis showed a significant reduction of radiation dose of up to 42 %, using the single-acquisition split-bolus protocol, while maintaining a comparable diagnostic accuracy (primary endoleak detection rate of 96 %). Image quality between the two protocols was comparable and only slightly inferior for the split-bolus scan (2.5 vs. 2.4). Using the single-acquisition, split-bolus approach allows for a significant dose reduction while maintaining high image quality, resulting in effective endoleak identification. (orig.)

  15. Computer programs for the acquisition and analysis of eddy-current array probe data

    International Nuclear Information System (INIS)

    Pate, J.R.; Dodd, C.V.

    1996-07-01

    The objective of the Improved Eddy-Current ISI (in-service inspection) for Steam Generators Tubing program is to upgrade and validate eddy-current inspections, including probes, instrumentation, and data processing techniques, for ISI of new, used, and repaired steam generator tubes; to improve defect detection, classification, and characterization as affected by diameter and thickness variations, denting, probe wobble, tube sheet, tube supports, and copper and sludge deposits, even when defect types and other variables occur in combination; and to transfer this advanced technology to NRC's mobile NDE laboratory and staff. This report documents computer programs that were developed for the acquisition of eddy-current data from specially designed 16-coil array probes. Complete code as well as instructions for use are provided

  16. CLOUD COMPUTING ADOPTION STRATEGIES AT PT TASPEN INDONESIA, Tbk

    Directory of Open Access Journals (Sweden)

    Julirzal Sarmedy

    2014-10-01

    PT. Taspen, as an Indonesian institution, is responsible for managing the social insurance programs of civil servants. With branch offices and business partners who are geographically dispersed throughout Indonesia, information technology is very important to support the business processes. Cloud computing is a model of information technology services that could potentially increase the effectiveness and efficiency of PT. Taspen's information system. This study examines the phenomenon at PT. Taspen with regard to adopting cloud computing in its information system, using the Technology-Organization-Environment framework, Diffusion of Innovation theory, and the Partial Least Squares method. The organizational factor is the most dominant one in PT. Taspen's adoption of cloud computing. Referring to these findings, a SWOT analysis and TOWS matrix are performed, which in this study recommend implementing a strategy model of cloud computing services that is private and gradual in its rollout.

  17. Computational assessment of visual search strategies in volumetric medical images.

    Science.gov (United States)

    Wen, Gezheng; Aizenman, Avigael; Drew, Trafton; Wolfe, Jeremy M; Haygood, Tamara Miner; Markey, Mia K

    2016-01-01

    When searching through volumetric images [e.g., computed tomography (CT)], radiologists appear to use two different search strategies: "drilling" (restrict eye movements to a small region of the image while quickly scrolling through slices), or "scanning" (search over large areas at a given depth before moving on to the next slice). To computationally identify the type of image information that is used in these two strategies, 23 naïve observers were instructed with either "drilling" or "scanning" when searching for target T's in 20 volumes of faux lung CTs. We computed saliency maps using both classical two-dimensional (2-D) saliency, and a three-dimensional (3-D) dynamic saliency that captures the characteristics of scrolling through slices. Comparing observers' gaze distributions with the saliency maps showed that search strategy alters the type of saliency that attracts fixations. Drillers' fixations aligned better with dynamic saliency and scanners with 2-D saliency. The computed saliency was greater for detected targets than for missed targets. Similar results were observed in data from 19 radiologists who searched five stacks of clinical chest CTs for lung nodules. Dynamic saliency may be superior to the 2-D saliency for detecting targets embedded in volumetric images, and thus "drilling" may be more efficient than "scanning."

  18. Impact of external knowledge acquisition strategies on innovation: a comparative study based on Dutch and Swiss panel data

    OpenAIRE

    Arvanitis, S.; Lokshin, B.; Mohnen, P.; Wörter, M.

    2013-01-01

    There is growing evidence that firms increasingly adopt open innovation practices. In this paper we investigate the impact of two such external knowledge acquisition strategies, ‘buy’ and ‘cooperate’, on firms' product innovation performance. Taking a direct (productivity) approach, we test for complementarity effects in the simultaneous use of the two strategies, and in the intensity of their use. Our results, based on large panels of Dutch and Swiss innovating firms, suggest that while both ‘...

  19. Electronics, trigger, data acquisition, and computing working group on future B physics experiments

    International Nuclear Information System (INIS)

    Geer, S.

    1993-01-01

    Electronics, trigger, data acquisition, and computing: this is a very broad list of topics. Nevertheless in a modern particle physics experiment one thinks in terms of a data pipeline in which the front end electronics, the trigger and data acquisition, and the offline reconstruction are linked together. In designing any piece of this pipeline it is necessary to understand the bigger picture of the data flow, data rates and volume, and the input rate, output rate, and latencies for each part of the pipeline. All of this needs to be developed with a clear understanding of the requirements imposed by the physics goals of the experiment; the signal efficiencies, background rates, and the amount of recorded information that needs to be propagated through the pipeline to select and analyse the events of interest. The technology needed to meet the demanding high data volume needs of the next round of B physics experiments appears to be available, now or within a couple of years. This seems to be the case for both fixed target and collider B physics experiments. Although there are many differences between the various data pipelines that are being proposed, there are also striking similarities. All experiments have a multi-level trigger scheme (most have levels 1, 2, and 3) where the final level consists of a computing farm that can run offline-type code and reduce the data volume by a factor of a few. Finally, the ability to reconstruct large data volumes offline in a reasonably short time, and making large data volumes available to many physicists for analysis, imposes severe constraints on the foreseen data pipelines, and a significant uncertainty in evaluating the various approaches proposed

  20. How Well Can Existing Software Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy

    Science.gov (United States)

    2017-04-06

    guidance to the PM regarding development and sustainment of software. The need for a strong application of software engineering principles is...on the battlefield by a government-developed network manager application. The configuration of this confluence of software will be jointly managed...How Well Can Existing Software-Support Processes Accomplish Sustainment of a Non-Developmental Item-Based Acquisition Strategy? Graciano

  1. Acquisition of Requests and Apologies in Spanish and French: Impact of Study Abroad and Strategy-Building Intervention

    Science.gov (United States)

    Cohen, Andrew D.; Shively, Rachel L.

    2007-01-01

    The primary aim of this study was to assess the impact of a curricular intervention on study-abroad students' use of language- and culture-learning strategies and on their acquisition of requests and apologies. The intervention consisted of a brief face-to-face orientation to learning speech acts, a self-study guidebook on language and culture…

  2. Effectiveness of Analogy Instructional Strategy on Undergraduate Student's Acquisition of Organic Chemistry Concepts in Mutah University, Jordan

    Science.gov (United States)

    Samara, Nawaf Ahmad Hasan

    2016-01-01

    This study aimed at investigating the effectiveness of an analogy instructional strategy on undergraduate students' acquisition of organic chemistry concepts at Mutah University, Jordan. A quasi-experimental design was used in the study; participants were 97 students who enrolled in the organic chemistry course at the department of chemistry during the…

  3. A Matchmaking Strategy Of Mixed Resource On Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Wisam Elshareef

    2015-08-01

    Full Text Available Abstract Today cloud computing has become a key technology for online allotment of computing resources and online storage of user data at lower cost, where computing resources are available all the time over the Internet on a pay-per-use basis. Recently there is a growing need for resource management strategies in a cloud computing environment that encompass both end-user satisfaction and high job-submission throughput with appropriate scheduling. One of the major and essential issues in resource management is allocating incoming tasks to suitable virtual machines (matchmaking). The main objective of this paper is to propose a matchmaking strategy between incoming requests and the various resources in the cloud environment, to satisfy user requirements and to balance the workload across resources. Load balancing is an important aspect of resource management in a cloud computing environment. This paper therefore proposes a dynamic weight active monitor (DWAM) load-balancing algorithm, which allocates incoming requests on the fly to all available virtual machines in an efficient manner, in order to achieve better performance in parameters such as response time, processing time, and resource utilization. The feasibility of the proposed algorithm is analyzed using the CloudSim simulator, which demonstrates the superiority of the proposed DWAM algorithm over its counterparts in the literature. Simulation results show that the proposed algorithm dramatically improves response time and data processing time, and achieves better resource utilization, compared with the Active Monitor and VM-assign algorithms.
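
    The dynamic-weight allocation idea above can be sketched in a few lines. This is a minimal illustration, not the DWAM algorithm itself (whose exact weight formula is not given here): each virtual machine's current load is monitored, and an incoming request goes to the machine with the most spare capacity. The `VM` class and the capacity figures are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    mips: float            # total processing capacity (hypothetical units)
    load: float = 0.0      # work currently assigned

    @property
    def available(self) -> float:
        return self.mips - self.load

def allocate(vms, task_size):
    """Greedy dynamic-weight rule: monitor each VM's current load and send
    the incoming request to the VM with the most spare capacity."""
    best = max(vms, key=lambda vm: vm.available)
    best.load += task_size
    return best

vms = [VM("vm1", 1000), VM("vm2", 500), VM("vm3", 750)]
for size in (200, 200, 200, 200):
    allocate(vms, size)
# loads are now vm1=600, vm2=0, vm3=200: vm3 receives the third request
# once vm1's spare capacity falls below vm3's.
```

    With this greedy rule the most capable machine is used first, but requests spill over to other machines as its spare capacity shrinks, which is the load-balancing behaviour the abstract describes.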

  4. The impact of constructivist teaching strategies on the acquisition of higher order cognition and learning

    Science.gov (United States)

    Merrill, Alison Saricks

    The purpose of this quasi-experimental, quantitative, mixed-design study was to compare the effectiveness of brain-based teaching strategies versus a traditional lecture format in the acquisition of higher order cognition as determined by test scores. A second purpose was to elicit student feedback about the two teaching approaches. The design was a 2 x 2 x 2 factorial design with repeated measures on the last factor. The independent variables were type of student, teaching method, and a within-group change over time. Dependent variables were a between-group comparison of pre-test/post-test gain scores and a within- and between-group comparison of course examination scores. A convenience sample of students enrolled in medical-surgical nursing was used. One group (n=36) was made up of traditional students and the other group (n=36) consisted of second-degree students. Four learning units were included in this study. Pre- and post-tests were given on the first two units. Course examination scores from all four units were compared. In one cohort two of the units were taught via lecture format and two using constructivist activities. These methods were reversed for the other cohort. The conceptual basis for this study derives from neuroscience and cognitive psychology. Learning is defined as the growth of new dendrites. Cognitive psychologists view learning as a constructive activity in which new knowledge is built on an internal foundation of existing knowledge. Constructivist teaching strategies are designed to stimulate the brain's natural learning ability. There was a statistically significant difference based on type of teaching strategy (t = -2.078, df = 270, p = .039, d = .25), with higher mean scores on the examinations covering brain-based learning units. There was no statistically significant difference based on type of student. Qualitative data collection was conducted in an on-line forum at the end of the semester.
Students had overall positive responses about the

  5. Computing security strategies in finite horizon repeated Bayesian games

    KAUST Repository

    Lichun Li

    2017-07-10

    This paper studies security strategies in two-player zero-sum repeated Bayesian games with finite horizon. In such games, each player has a private type which is independently chosen according to a publicly known a priori probability. Players' types are fixed throughout the game. The game is played for finitely many stages. At every stage, players simultaneously choose their actions, which are observed by the public. The one-stage payoff of player 1 (or penalty to player 2) depends on both players' types and actions, and is not directly observed by either player. While player 1 aims to maximize the total payoff over the game, player 2 wants to minimize it. This paper provides each player with two ways to compute the security strategy, i.e. the optimal strategy in the worst case. First, a security strategy that directly depends on both players' action histories is derived by refining the sequence form. Noticing that the history action space grows exponentially with respect to the time horizon, this paper further presents a security strategy that depends on the player's fixed-size sufficient statistic. The sufficient statistic is shown to consist of the belief on one's own type, the regret on the other player's type, and the stage, and is independent of the other player's strategy.
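
    As a hedged illustration of the "security strategy" notion, the worst-case optimal mixed strategy of a one-shot zero-sum matrix game (the one-stage building block of the repeated game above) can be computed by linear programming. The sketch below uses SciPy's `linprog`; the paper's sequence-form and sufficient-statistic constructions are not reproduced here.

```python
import numpy as np
from scipy.optimize import linprog

def security_strategy(A):
    """Security (maximin) mixed strategy x for the row player of a zero-sum
    game with payoff matrix A: maximize v subject to A^T x >= v, sum x = 1,
    x >= 0. Solved as a minimization of -v."""
    m, n = A.shape
    c = np.zeros(m + 1)
    c[-1] = -1.0                                  # minimize -v
    A_ub = np.hstack([-A.T, np.ones((n, 1))])     # v - (A^T x)_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])
    b_eq = np.array([1.0])                        # probabilities sum to 1
    bounds = [(0, None)] * m + [(None, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
    return res.x[:m], res.x[-1]

# Matching pennies: the security strategy is uniform and the game value is 0.
x, v = security_strategy(np.array([[1.0, -1.0], [-1.0, 1.0]]))
```

    The returned value v is the payoff the row player can guarantee no matter what the column player does, which is exactly the worst-case guarantee the abstract refers to.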

  6. Retrieval and organizational strategies in conceptual memory a computer model

    CERN Document Server

    Kolodner, Janet L

    2014-01-01

    'Someday we expect that computers will be able to keep us informed about the news. People have imagined being able to ask their home computers questions such as "What's going on in the world?"…'. Originally published in 1984, this book is a fascinating look at the world of memory and computers before the internet became the mainstream phenomenon it is today. It looks at the early development of a computer system that could keep us informed in a way that we now take for granted. Presenting a theory of remembering, based on human information processing, it begins to address many of the hard problems implicated in the quest to make computers remember. The book had two purposes in presenting this theory of remembering: first, to be used in implementing intelligent computer systems, including fact retrieval systems and intelligent systems in general. Any intelligent program needs to store and use a great deal of knowledge. The strategies and structures in the book were designed to be used for that purpos...

  7. Data acquisition

    International Nuclear Information System (INIS)

    Clout, P.N.

    1982-01-01

    Data acquisition systems are discussed for molecular biology experiments using synchrotron radiation sources. The data acquisition system requirements are considered. The components of the solution are described including hardwired solutions and computer-based solutions. Finally, the considerations for the choice of the computer-based solution are outlined. (U.K.)

  8. Evolution Of The Operational Energy Strategy And Its Consideration In The Defense Acquisition Process

    Science.gov (United States)

    2016-09-01

    Figure 12. PPBE Process Flowchart. Source: AcqNotes (2016). We comment above that once a program manager has completed his major Defense...acquisition system: 1) acquisition, 2) requirements and 3) planning, programming, budgeting, and execution (PPBE). We looked at the evolution of...to gain traction or improve promulgation of key guidance and documentation for new-starts and/or upgrades to weapon system acquisition programs

  9. Market driven strategy for acquisition of waste acceptance and transportation services for commercial spent fuel in the United States

    International Nuclear Information System (INIS)

    Lemeshewky, W.; Macaluso, C.; Smith, P.; Teer, B.

    1998-05-01

    The Department of Energy (DOE) has the responsibility for the shipment of spent nuclear fuel (SNF) from commercial reactors to a Federal facility for storage and/or disposal. DOE has developed a strategy for a market-driven approach to the acquisition of transportation services and equipment which will maximize the participation of private industry. To implement this strategy, DOE is planning to issue a Request for Proposal (RFP) for the provision of the required services and equipment to accept SNF from the utilities and transport it to a Federal facility. The paper discusses this strategy and describes the RFP.

  10. A Multilevel Modeling Approach to Examining Individual Differences in Skill Acquisition for a Computer-Based Task

    OpenAIRE

    Nair, Sankaran N.; Czaja, Sara J.; Sharit, Joseph

    2007-01-01

    This article explores the role of age, cognitive abilities, prior experience, and knowledge in skill acquisition for a computer-based simulated customer service task. Fifty-two participants aged 50–80 performed the task over 4 consecutive days following training. They also completed a battery that assessed prior computer experience and cognitive abilities. The data indicated that overall quality and efficiency of performance improved with practice. The predictors of initial level of performan...

  11. The Effect of Motor Difficulty on the Acquisition of a Computer Task: A Comparison between Young and Older Adults

    Science.gov (United States)

    Fezzani, K.; Albinet, C.; Thon, B.; Marquie, J. -C.

    2010-01-01

    The present study investigated the extent to which the impact of motor difficulty on the acquisition of a computer task varies as a function of age. Fourteen young and 14 older participants performed 352 sequences of 10 serial pointing movements with a wireless pen on a digitiser tablet. A conditional probabilistic structure governed the…

  12. Optimal core acquisition and pricing strategies for hybrid manufacturing and remanufacturing systems

    NARCIS (Netherlands)

    Caner Bulmus, Serra; Zhu, Stuart X.; Teunter, Ruud H.

    2014-01-01

    In this study, we combine two aspects of remanufacturing, namely product acquisition management and marketing (pricing) of the remanufactured products. We consider an original equipment manufacturer (OEM) who decides on the acquisition prices offered for returns from different quality types and on

  13. A strategy to compute plastic post-buckling of structures

    International Nuclear Information System (INIS)

    Combescure, A.

    1983-08-01

    The paper gives a general framework for the different strategies used to compute the post-buckling of structures. Two particular strategies are studied in more detail and it is shown how they can be applied in the plastic regime. All the methods suppose that the loads F are proportional to a single parameter lambda; more precisely: F = lambda F0 (1). The paper shows how these methods can be implemented in a very simple way. In the elastic case we show the application of the method to the calculation of the post-buckling response of a clamped arch. The method is also applied to a very simple case of two bars which can be calculated analytically. In the plastic range, the method is applied to the post-buckling of an imperfect ring which can be calculated analytically. Another example is the comparison of the computed post-buckling of a thin cylinder under axial compression with the experimental behavior of the same cylinder. The limitations of these types of strategies are also mentioned and the physical significance of calculations in the post-buckling regime is discussed.

  14. Microevolution analysis of Bacillus coahuilensis unveils differences in phosphorus acquisition strategies and their regulation

    Directory of Open Access Journals (Sweden)

    Zulema eGómez-Lunar

    2016-02-01

    Full Text Available Bacterial genomes undergo numerous events of gene losses and gains that generate genome variability among strains of the same species (microevolution). Our aim was to compare the genomes and relevant phenotypes of three Bacillus coahuilensis strains from two oligotrophic hydrological systems in the Cuatro Ciénegas Basin (México), to unveil the environmental challenges that this species copes with, and the microevolutionary differences among these genotypes. Since the strains were isolated from a low-phosphorus environment, we placed emphasis on the search for different phosphorus acquisition strategies. The three B. coahuilensis strains exhibited similar numbers of coding DNA sequences, of which 82% (2,893) constituted the core genome and 18% corresponded to accessory genes. Most of the genes in this last group were associated with mobile genetic elements or were annotated as hypothetical proteins. Ten percent of the pangenome consisted of strain-specific genes. Alignment of the three B. coahuilensis genomes indicated a high level of synteny and revealed the presence of several genomic islands. Unexpectedly, one of these islands contained genes that encode the 2-keto-3-deoxymannooctulosonic acid (Kdo) biosynthesis enzymes, a feature associated with the cell walls of Gram-negative bacteria. Some microevolutionary changes were clearly associated with mobile genetic elements. Our analysis revealed inconsistencies between phenotype and genotype, which we suggest result from the impossibility of mapping regulatory features through genome analysis. Experimental results revealed variability in the types and numbers of auxotrophies between the strains that could not consistently be explained by in silico metabolic models. Several intraspecific differences in preferences for carbohydrate and phosphorus utilization were observed. Regarding phosphorus recycling, scavenging, and storage, variations were found between the three genomes. The three strains exhibited differences regarding

  15. Xenon Acquisition Strategies for High-Power Electric Propulsion NASA Missions

    Science.gov (United States)

    Herman, Daniel A.; Unfried, Kenneth G.

    2015-01-01

    Solar electric propulsion (SEP) has been used for station-keeping of geostationary communications satellites since the 1980s. Solar electric propulsion has also benefited from success on NASA science missions such as Deep Space One and Dawn. The xenon propellant loads for these applications have been in the 100s-of-kilograms range. Recent studies performed for NASA's Human Exploration and Operations Mission Directorate (HEOMD) have demonstrated that SEP is critically enabling for both near-term and future exploration architectures. The high payoff for both human and science exploration missions, together with technology investment from NASA's Space Technology Mission Directorate (STMD), is providing the necessary convergence and impetus for a 30-kilowatt-class SEP mission. Multiple 30-50-kilowatt Solar Electric Propulsion Technology Demonstration Mission (SEP TDM) concepts have been developed by STMD based on the maturing electric propulsion and solar array technologies, with recent efforts focusing on an Asteroid Redirect Robotic Mission (ARRM). Xenon is the optimal propellant for the existing state-of-the-art electric propulsion systems considering efficiency, storability, and contamination potential. NASA mission concepts developed, and those proposed by contracted efforts, for the 30-kilowatt-class demonstration have a range of xenon propellant loads from 100s of kilograms up to 10,000 kilograms. This paper examines the status of the xenon industry worldwide, including historical xenon supply and pricing, and provides updated information on the xenon market relative to previous papers that discussed xenon production and NASA mission needs. The paper will discuss the various approaches for acquiring on the order of 10 metric tons of xenon propellant to support potential near-term NASA missions. Finally, the paper will discuss acquisition strategies for larger NASA missions requiring 100s of metric tons of xenon.

  16. Evaluation of different EEG acquisition systems concerning their suitability for building a brain-computer interface

    Directory of Open Access Journals (Sweden)

    Andreas Pinegger

    2016-09-01

    Full Text Available One important aspect of non-invasive brain-computer interface (BCI) research is to acquire the electroencephalogram (EEG) in a proper way. From an end-user perspective, this means maximum comfort and no extra inconveniences (e.g., washing the hair), whereas from a technical perspective, the signal quality has to be optimal to make the BCI work effectively and efficiently. In this work we evaluated three different commercially available EEG acquisition systems that differ in the type of electrode (gel-, water-, and dry-based), the amplifier technique, and the data transmission method. Every system was tested regarding three different aspects: technical, BCI effectiveness and efficiency (P300 communication and control), and user satisfaction (comfort). We found that the water-based system had the lowest short-circuit noise level, the hydrogel-based system had the highest P300 spelling accuracies, and the dry-electrode system caused the least inconveniences. Therefore, building a reliable BCI is possible with all evaluated systems, and it is up to the user to decide which system best meets the given requirements.

  17. Patterns for election of active computing nodes in high availability distributed data acquisition systems

    International Nuclear Information System (INIS)

    Nair, Preetha; Padmini, S.; Diwakar, M.P.; Gohel, Nilesh

    2013-01-01

    Computer-based systems for power plants and research reactors are expected to have high availability. Redundancy is a common approach to improve the availability of a system. In a redundant configuration the challenge is to select one node as active and, in case of failure of the current active node, to provide automatic fast switchover by electing another node to function as active and restore normal operation. Additional constraints include: exactly one node should be elected as active in an n-way redundant architecture. This paper discusses various high availability configurations developed by Electronics Division and deployed in power and research reactors, and the patterns followed to elect active nodes in distributed data acquisition systems. The systems are categorized into two types: Active/Passive, where changeover takes effect only on the failure of the active node, and Active/Active, where changeover takes effect in alternate cycles. A novel concept of a priority-driven, state-based active (master) node election pattern is described for Active/Passive systems, which allows multiple redundancy and dynamic election of a single master. The paper also discusses the Active/Active pattern, which uncovers failures early by activating all the nodes alternately in a redundant system. This pattern can be extended to multiple redundant nodes. (author)
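
    A minimal sketch of the priority-driven election idea described above, with a hypothetical `Node` interface (the deployed systems' state machine and health monitoring are not reproduced here): among healthy nodes, exactly one, the highest-priority one, is elected Active, and failure of the current Active node automatically promotes the next candidate.

```python
from dataclasses import dataclass

@dataclass
class Node:
    node_id: int
    priority: int          # static priority: lower value = higher priority
    healthy: bool = True

def elect_active(nodes):
    """Priority-driven election: among healthy nodes exactly one, the one
    with the highest priority (ties broken by node id), becomes Active;
    the rest remain Passive. Returns None if no node is healthy."""
    alive = [n for n in nodes if n.healthy]
    return min(alive, key=lambda n: (n.priority, n.node_id)) if alive else None

nodes = [Node(1, priority=1), Node(2, priority=2), Node(3, priority=3)]
assert elect_active(nodes).node_id == 1   # normal operation
nodes[0].healthy = False                  # active node fails...
assert elect_active(nodes).node_id == 2   # ...automatic switchover
```

    Because the election rule is a deterministic function of the set of healthy nodes, every node evaluating it reaches the same answer, which is what guarantees that exactly one node claims the Active role.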

  18. Shifts in nitrogen acquisition strategies enable enhanced terrestrial carbon storage under elevated CO2 in a global model

    Science.gov (United States)

    Sulman, B. N.; Brzostek, E. R.; Menge, D.; Malyshev, S.; Shevliakova, E.

    2017-12-01

    Earth System Model (ESM) projections of terrestrial carbon (C) uptake are critical to understanding the future of the global C cycle. Current ESMs include intricate representations of photosynthetic C fixation in plants, allowing them to simulate the stimulatory effect of increasing atmospheric CO2 levels on photosynthesis. However, they lack sophisticated representations of plant nutrient acquisition, calling into question their ability to project the future land C sink. We conducted simulations using a new model of terrestrial C and nitrogen (N) cycling within the Geophysical Fluid Dynamics Laboratory (GFDL) global land model LM4 that uses a return on investment framework to simulate global patterns of N acquisition via fixation of N2 from the atmosphere, scavenging of inorganic N from soil solution, and mining of organic N from soil organic matter (SOM). We show that these strategies drive divergent C cycle responses to elevated CO2 at the ecosystem scale, with the scavenging strategy leading to N limitation of plant growth and the mining strategy facilitating stimulation of plant biomass accumulation over decadal time scales. In global simulations, shifts in N acquisition from inorganic N scavenging to organic N mining along with increases in N fixation supported long-term acceleration of C uptake under elevated CO2. Our results indicate that the ability of the land C sink to mitigate atmospheric CO2 levels is tightly coupled to the functional diversity of ecosystems and their capacity to change their N acquisition strategies over time. Incorporation of these mechanisms into ESMs is necessary to improve confidence in model projections of the global C cycle.

  19. Shoot to root communication is necessary to control the expression of iron-acquisition genes in Strategy I plants.

    Science.gov (United States)

    García, María J; Romera, Francisco J; Stacey, Minviluz G; Stacey, Gary; Villar, Eduardo; Alcántara, Esteban; Pérez-Vicente, Rafael

    2013-01-01

    Previous research showed that auxin, ethylene, and nitric oxide (NO) can activate the expression of iron (Fe)-acquisition genes in the roots of Strategy I plants grown with low levels of Fe, but not in plants grown with high levels of Fe. However, it is still an open question as to how Fe acts as an inhibitor and which pool of Fe (e.g., root, phloem, etc.) in the plant acts as the key regulator for gene expression control. To further clarify this, we studied the effect of the foliar application of Fe on the expression of Fe-acquisition genes in several Strategy I plants, including wild-type cultivars of Arabidopsis [Arabidopsis thaliana (L.) Heynh], pea [Pisum sativum L.], tomato [Solanum lycopersicon Mill.], and cucumber [Cucumis sativus L.], as well as mutants showing constitutive expression of Fe-acquisition genes when grown under Fe-sufficient conditions [Arabidopsis opt3-2 and frd3-3, pea dgl and brz, and tomato chln (chloronerva)]. The results showed that the foliar application of Fe blocked the expression of Fe-acquisition genes in the wild-type cultivars and in the frd3-3, brz, and chln mutants, but not in the opt3-2 and dgl mutants, probably affected in the transport of a Fe-related repressive signal in the phloem. Moreover, the addition of either ACC (ethylene precursor) or GSNO (NO donor) to Fe-deficient plants up-regulated the expression of Fe-acquisition genes, but this effect did not occur in Fe-deficient plants sprayed with foliar Fe, again suggesting the existence of a Fe-related repressive signal moving from leaves to roots.

  20. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk considerations presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach reduces computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and Conditional Value-at-Risk (CVaR).
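
    A toy version of the simulation-based parametric approach: the schedule for selling X shares over T periods is parameterized (here through a softmax, an assumption, to keep it a valid schedule), execution costs are simulated under a simple temporary-impact plus random-walk price model, and the parameters are optimized against a sampled estimate of expected cost plus a CVaR penalty. All parameter values and the cost model itself are illustrative, not the paper's.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

X, T = 1.0, 4                 # shares to sell, number of trading periods
eta, sigma = 0.1, 0.02        # temporary impact and per-period volatility
kappa, alpha = 1.0, 0.05      # CVaR weight and tail level (all illustrative)
Z = rng.standard_normal((5000, T))   # fixed noise paths (common random numbers)

def schedule(theta):
    w = np.exp(theta)                 # softmax keeps the schedule positive
    return X * w / w.sum()            # and summing to the full position X

def cost_samples(theta):
    v = schedule(theta)
    W = sigma * np.cumsum(Z, axis=1)  # random-walk price deviations
    # cost = quadratic temporary impact minus proceeds from price moves
    return (eta * v**2).sum() - W @ v

def objective(theta):
    c = cost_samples(theta)
    tail = np.sort(c)[-int(alpha * len(c)):]   # worst alpha-fraction of costs
    return c.mean() + kappa * tail.mean()      # E[cost] + kappa * CVaR_alpha

res = minimize(objective, np.zeros(T), method="Nelder-Mead")
v_opt = schedule(res.x)    # risk-aware schedule, vs. the uniform one
```

    Fixing the noise paths before optimizing (common random numbers) makes the sampled objective deterministic in the parameters, which is what lets a static optimizer search it.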

  1. Comparison of Print Monograph Acquisitions Strategies Finds Circulation Advantage to Firm Orders

    Directory of Open Access Journals (Sweden)

    Laura Costello

    2017-12-01

    Full Text Available A Review of: Ke, I., Gao, W., & Bronicki, J. (2017). Does title-by-title selection make a difference? A usage title analysis on print monograph purchasing. Collection Management, 42(1), 34-47. http://dx.doi.org/10.1080/01462679.2016.1249040 Abstract Objective – To compare usage of print monographs acquired through firm order to those acquired through approval plans. Design – Quantitative study. Setting – A public research university serving an annual enrollment of over 43,500 students and employing more than 2,600 faculty members in the South Central United States. Subjects – Circulation and call number data from 21,356 print books acquired through approval plans, and 23,920 print books acquired through firm orders. Methods – Item records for print materials purchased between January 1, 2011 and December 31, 2014 were extracted from the catalog and separated by acquisitions strategy into firm order and approval plan lists. Items without call numbers and materials that had been placed on course reserves were removed from the lists. The authors examined accumulated circulation counts and conducted trend analyses to examine year-to-year usage. The authors also measured circulation performance in each Library of Congress call number class; they grouped these classes into science, social science, and humanities titles. Main Results – The authors found that 31% of approval plan books and 39% of firm order books had circulated at least once. The firm order books that had circulated were used an average of 1.87 times, compared to approval plan books, which were used an average of 1.47 times. The year-to-year analysis showed that the initial circulation rate for approval plan books decreased from 42% in 2011 to 14% in 2014, and from 46% to 24% for firm order books. Subject area analysis showed that medicine and military science had the highest circulation rates at over 45%, and that agriculture and bibliography titles had the lowest circulation

  2. Optimizing 4D cone beam computed tomography acquisition by varying the gantry velocity and projection time interval

    International Nuclear Information System (INIS)

    O’Brien, Ricky T; Cooper, Benjamin J; Keall, Paul J

    2013-01-01

    Four-dimensional cone beam computed tomography (4DCBCT) is an emerging clinical image guidance strategy for tumour sites affected by respiratory motion. In current-generation 4DCBCT techniques, both the gantry rotation speed and imaging frequency are constant and independent of the patient’s breathing, which can lead to projection clustering. We present a mixed integer quadratic programming (MIQP) model for respiratory-motion-guided 4DCBCT (RMG-4DCBCT) which regulates the gantry velocity and projection time interval, in response to the patient’s respiratory signal, so that a full set of evenly spaced projections can be taken in a number of phase, or displacement, bins during the respiratory cycle. In each respiratory bin, an image can be reconstructed from the projections to give a 4D view of the patient’s anatomy so that the motion of the lungs, and tumour, can be observed during the breathing cycle. A solution to the full MIQP model in a practical amount of time, 10 s, is not possible with the leading commercial MIQP solvers, so a heuristic method is presented. Using parameter settings typically used on current-generation 4DCBCT systems (4 min image acquisition, 1200 projections, 10 respiratory bins) and a sinusoidal breathing trace with a 4 s period, we show that the root mean square (RMS) of the angular separation between projections with displacement binning is 2.7° using existing constant gantry speed systems and 0.6° using RMG-4DCBCT. For phase-based binning the RMS is 2.7° using constant gantry speed systems and 2.5° using RMG-4DCBCT. The optimization algorithm presented is a critical step on the path to developing a system for RMG-4DCBCT. (paper)
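
    The projection clustering that motivates RMG-4DCBCT can be reproduced in miniature. The sketch below (an illustration, not the paper's MIQP model) uses the stated settings of a 4 min acquisition, 1200 projections, 10 bins, and a 4 s sinusoidal trace, assigns each projection to a displacement bin (simple amplitude deciles, an assumed binning that may differ from the paper's), and computes the RMS deviation of the angular gaps in one bin from even spacing:

```python
import math

# Stated acquisition settings: 4 min scan, 1200 projections, 10 bins, 4 s
# sinusoidal breathing trace; constant gantry speed and projection rate.
T_total, n_proj, period, n_bins = 240.0, 1200, 4.0, 10

times = [i * T_total / n_proj for i in range(n_proj)]
angles = [360.0 * t / T_total for t in times]        # constant gantry speed

def disp_bin(t):
    """Displacement bin of the breathing signal (amplitude deciles here)."""
    s = math.sin(2.0 * math.pi * t / period)
    return min(int((s + 1.0) / 2.0 * n_bins), n_bins - 1)

bin0 = [a for a, t in zip(angles, times) if disp_bin(t) == 0]
gaps = [b - a for a, b in zip(bin0, bin0[1:])]
ideal = 360.0 / len(bin0)                            # even-spacing target
rms = math.sqrt(sum((g - ideal) ** 2 for g in gaps) / len(gaps))
# The projections in one bin arrive in clusters (one per breath), so the
# gaps alternate between small intra-cluster and large inter-cluster values.
```

    The exact RMS value depends on the assumed bin boundaries, but the clustering pattern, tight groups of projections once per breath separated by large angular jumps, is exactly what the varying gantry velocity of RMG-4DCBCT is designed to remove.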

  3. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).
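
    The byte-ordering and floating-point issues mentioned above are commonly handled by writing data in one explicit, platform-independent layout. A small sketch using Python's `struct` module, illustrative rather than the authors' actual Fortran tooling:

```python
import struct

# Write samples in an explicit little-endian IEEE-754 layout ("<") so the
# bytes read identically on big- and little-endian machines.
samples = [0.25, -1.5, 3.0]
payload = struct.pack("<I", len(samples)) + struct.pack(f"<{len(samples)}f", *samples)

# Reading back uses the same explicit format, regardless of native order.
(n,) = struct.unpack_from("<I", payload, 0)
decoded = list(struct.unpack_from(f"<{n}f", payload, 4))
assert decoded == samples   # these values are exactly representable in float32
```

    Pinning the byte order in the format string (rather than relying on the machine's native order) is the whole trick: the file becomes self-describing enough that no per-platform byte swapping is needed.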

  4. An appraisal of problems and solutions to the acquisition of ...

    African Journals Online (AJOL)

    This study, which sought to find out the problems and strategies for enhancing the acquisition of computer literacy skills by academic staff members of the Nasarawa State University Keffi (NSUK) and those of the University of Jos (UNIJOS), dwelt on the methods available for the acquisition of computer literacy skills by ...

  5. Data of NODDI diffusion metrics in the brain and computer simulation of hybrid diffusion imaging (HYDI acquisition scheme

    Directory of Open Access Journals (Sweden)

    Chandana Kodiweera

    2016-06-01

    Full Text Available This article provides NODDI diffusion metrics in the brains of 52 healthy participants and computer simulation data to support the compatibility of the hybrid diffusion imaging (HYDI) acquisition scheme, “Hybrid diffusion imaging” [1], with fitting the neurite orientation dispersion and density imaging (NODDI) model, “NODDI: practical in vivo neurite orientation dispersion and density imaging of the human brain” [2]. HYDI is an extremely versatile diffusion magnetic resonance imaging (dMRI) technique that enables various analysis methods using a single diffusion dataset. One of these analysis methods is the NODDI computation, which models the brain tissue with three compartments: fast isotropic diffusion (e.g., cerebrospinal fluid), anisotropic hindered diffusion (e.g., extracellular space), and anisotropic restricted diffusion (e.g., intracellular space). The NODDI model produces microstructural metrics in the developing brain, the aging brain, or the brain with neurologic disorders. The first dataset provided here comprises the means and standard deviations of NODDI metrics in 48 white matter regions of interest (ROIs), averaged across the 52 healthy participants. The second dataset is a computer simulation with initial conditions guided by the first dataset as inputs and as the gold standard for model fitting. The computer simulation data provide a direct comparison of NODDI indices computed from the HYDI acquisition [1] to NODDI indices computed from the originally proposed acquisition [2]. These data are related to the accompanying research article “Age Effects and Sex Differences in Human Brain White Matter of Young to Middle-Aged Adults: A DTI, NODDI, and q-Space Study” [3].

  6. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    International Nuclear Information System (INIS)

    Arevalo, S; Atwood, C; Bell, P; Blacker, T D; Dey, S; Fisher, D; Fisher, D A; Genalis, P; Gorski, J; Harris, A; Hill, K; Hurwitz, M; Kendall, R P; Meakin, R L; Morton, S; Moyer, E T; Post, D E; Strawn, R; Veldhuizen, D v; Votta, L G

    2008-01-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams

  7. Cleared risks? Proactive strategies as service in Cloud Computing contexts

    Directory of Open Access Journals (Sweden)

    Manuela Moro-Cabero

    2018-02-01

    Full Text Available The increasing use of Cloud Computing services in organizations is an undeniable reality nowadays. Records managers must therefore adopt a proactive and committed position, giving advice to users. However, because of a lack of knowledge of the function and regulation of the field, the implementation of those services faces considerable resistance. This descriptive essay, supported by a relevant number of heterogeneous resources, systematizes the risks of managing and storing information with these services. At the same time, the study establishes a set of action strategies for both reaching and contracting agreements. The objective of the paper is, therefore, to make professionals aware of the potential of these strategies as assessment and control tools, as well as means of ensuring digital continuity and the preservation of records in the Cloud.

  8. A high-performance data acquisition system for computer-based multichannel analyzer

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Bai Rongsheng; Wen Liangbi; Huang Yanwen

    1996-01-01

    A high-performance data acquisition system for a multichannel analyzer, designed around a single-chip microcomputer, is described. The paper presents the principle and method of achieving simultaneous data acquisition, data pre-processing, and fast bidirectional data transfer by means of direct memory access based on dual-port RAM. The system also allows the dead and live time of the ADC to be measured efficiently.

  9. Multi parametric system for the acquisition and processing of nuclear data on a personal computer

    International Nuclear Information System (INIS)

    Toledo Acosta, R. B.; Osorio Deliz, J. F.; Arista Romeu, E.; Perez Sanchez, R.; Lopez Torres, E.

    1997-01-01

    A four-parameter multiparametric system for the acquisition and processing of nuclear data is described. It is characterized by its flexibility and relatively low cost while guaranteeing a high acquisition capacity. The system can be operated in multiparametric mode, in pulse-height analysis mode, or in any combination of the two for each parameter. The hardware and software of the system are described, and a general explanation of its operation and characteristics is offered. (author)

  10. Hybrid data acquisition and processing strategies with increased throughput and selectivity: pSMART analysis for global qualitative and quantitative analysis.

    Science.gov (United States)

    Prakash, Amol; Peterman, Scott; Ahmad, Shadab; Sarracino, David; Frewen, Barbara; Vogelsang, Maryann; Byram, Gregory; Krastins, Bryan; Vadali, Gouri; Lopez, Mary

    2014-12-05

    Data-dependent acquisition (DDA) and data-independent acquisition (DIA) strategies have both improved the understanding of proteomics samples. Each strategy has well-documented advantages and disadvantages: DDA is typically applied for deep discovery, while DIA may be used to create sample records. In this paper, we present a hybrid data acquisition and processing strategy (pSMART) that combines the strengths of both techniques and provides significant benefits for qualitative and quantitative peptide analysis. The performance of pSMART is compared to published DIA strategies in an experiment that allows the objective assessment of DIA performance with respect to interrogation of previously acquired MS data. The results of this experiment demonstrate that pSMART creates fewer decoy hits than a standard DIA strategy. Moreover, we show that pSMART is more selective, sensitive, and reproducible than either standard DIA or DDA strategies alone.

  11. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    Science.gov (United States)

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for the collimator-detector response, improving the reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess the improvement in image quality obtained with the software and to evaluate the potential of performing reduced-time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification was the acquisition of an eight-frame gated dataset, acquired using an ECG simulator with a fixed signal as the trigger. This had the effect of partitioning the data such that the effect of reduced acquisition time could be assessed without imposing additional scanning time on the patient. The summed datasets were then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts on reconstructed image quality, as judged by three experienced observers. Datasets reconstructed with the RR software were compared with the local standard processing protocols: filtered back-projection and ordered-subset expectation maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over the current local processing protocols (P…). The RR algorithm improved image quality compared with the local processing protocols and has been
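    The gating trick described above, partitioning one acquisition into equal frames so that summing k of them emulates a k/8-time scan, can be sketched as follows. This is a hypothetical Monte Carlo illustration (not the authors' software): each detected count is assigned to one of eight gates at random, which preserves Poisson statistics within each gate.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def partition_into_gates(projections, n_gates=8):
        """Split each projection-bin count into n_gates frames by multinomial
        thinning, mimicking a fixed-rate gated acquisition: each gate is a
        statistically valid 1/n_gates-time scan, and summing k gates emulates
        a k/n_gates-time acquisition without rescanning the patient."""
        flat = projections.ravel().astype(int)
        gates = np.array([rng.multinomial(c, [1.0 / n_gates] * n_gates)
                          for c in flat])
        return gates.T.reshape((n_gates,) + projections.shape)

    full = rng.poisson(100.0, size=(16, 16))   # toy full-time projection
    gates = partition_into_gates(full)
    half_time = gates[:4].sum(axis=0)          # emulated half-time scan
    ```

    By construction the eight gates sum back exactly to the full-time data, so the comparison between count levels is internally consistent.
    
    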

  12. Locus of legitimacy and startup resource acquisition strategies: Evidence from social enterprises in South Korea and Taiwan

    Directory of Open Access Journals (Sweden)

    Yi Ling Yang

    2018-05-01

    Full Text Available Purpose - Theoretically, the paper aims to provide locus of legitimacy as a framework to not only introduce a multidimensional perspective on legitimacy but also expand the understanding of resource acquisition strategies of social enterprises. Empirically, the authors test the theoretical predictions using cases from South Korea and Taiwan. Practically, the authors intend to assist chief executive officers (CEOs) of social enterprises in their effort to secure valuable resources, and to provide policy implications so that both South Korea and Taiwan can learn from each other. Design/methodology/approach - The authors use case methods to find evidence for the proposed theoretical framework. The initial search for target companies showed that social enterprises in South Korea and Taiwan were ideal samples. In-person, email and phone interviews were conducted with CEOs, and archival data on institutional environments and various aspects of social enterprises were collected. Collected data were analyzed using the locus of legitimacy framework to find out how different emphases on locus of legitimacy impacted critical decisions of social enterprises, such as human, financial and network resources. Findings - As predicted by the locus of legitimacy framework, the analyses confirmed that locus of legitimacy did explain critical decisions of social enterprises in South Korea and Taiwan. First, significant institutional forces existed, shaping social enterprises' behavior. For example, Taiwanese Jinu placed greater emphasis on internal legitimacy, while South Korean Sohwa was higher in external locus of legitimacy. Such differences systematically impacted choices made in resource acquisition strategies. Jinu showed a greater similarity to for-profit companies, aligning key resource acquisition decisions to achieve financial viability as a top priority. However, Sohwa, though financial performance was still important

  13. Evaluation of Parallel and Fan-Beam Data Acquisition Geometries and Strategies for Myocardial SPECT Imaging

    Science.gov (United States)

    Qi, Yujin; Tsui, B. M. W.; Gilland, K. L.; Frey, E. C.; Gullberg, G. T.

    2004-06-01

    This study evaluates myocardial SPECT images obtained with parallel-hole (PH) and fan-beam (FB) collimator geometries using both circular-orbit (CO) and noncircular-orbit (NCO) acquisitions. A newly developed 4-D NURBS-based cardiac-torso (NCAT) phantom was used to simulate /sup 99m/Tc-sestamibi uptake in the human torso with myocardial defects in the left ventricular (LV) wall. Two phantoms were generated to simulate patients with thick and thin body builds. Projection data including the effects of attenuation, collimator-detector response, and scatter were generated using SIMSET Monte Carlo simulations. A large number of photon histories were generated such that the projection data were close to noise free. Poisson noise fluctuations were then added to simulate the count densities found in clinical data. Noise-free and noisy projection data were reconstructed using the iterative OS-EM reconstruction algorithm with attenuation compensation. The reconstructed images from noisy projection data show that the noise levels are lower for the FB than for the PH collimator, owing to the increase in detected counts. The NCO acquisition method provides slightly better resolution and a small improvement in defect contrast compared with the CO acquisition method in noise-free reconstructed images. Despite lower projection counts, the NCO shows the same noise level as the CO in the attenuation-corrected reconstructed images. The results of the channelized Hotelling observer (CHO) study show that the FB collimator is superior to the PH collimator in myocardial defect detection, but the NCO shows no statistically significant difference from the CO for either collimator. In conclusion, our results indicate that data acquisition using the NCO yields only a very small improvement in resolution over the CO for myocardial SPECT imaging. This small improvement does not make a significant difference in myocardial defect detection.
However, an FB collimator provides better defect detection than a
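    The noise-addition step used in the simulation pipeline above (scaling near-noise-free Monte Carlo projections to clinical count densities, then adding Poisson fluctuations) can be sketched as follows; the array sizes and count level are illustrative assumptions, not the study's actual values.

    ```python
    import numpy as np

    def add_clinical_noise(noise_free, total_counts, seed=0):
        """Scale an (almost) noise-free Monte Carlo projection set so that
        its total equals a clinically realistic count density, then draw
        Poisson-distributed counts around the scaled means."""
        rng = np.random.default_rng(seed)
        scaled = noise_free * (total_counts / noise_free.sum())
        return rng.poisson(scaled)

    ideal = np.full((64, 64), 10.0)            # toy noise-free projection
    noisy = add_clinical_noise(ideal, 2.0e5)   # ~200k counts, an assumed level
    ```

    Because the Poisson draws are taken around the scaled means, the noisy data keep the spatial structure of the noise-free projections while matching the target count statistics.
    
    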

  14. The Lean Acquisition Strategy Behind the DOD’s 2015 Electronic Health Record System

    Science.gov (United States)

    2016-09-01

    the acquisition of CHCS, numerous errors caused delays in system development contracting. First, the DOD miscommunicated with some vendors and...Unsolidified program milestones Manager Lack of commitment Program focus on home-grown solutions 3. Question 2: What Were the Key Obstacles and Risks...differences in a way … solutions are presented” (personal communication, June 10, 2016). Program manager 1 echoes the findings from the survey that the

  15. Engineering and Humanities Students' Strategies for Vocabulary Acquisition: An Iranian Experience

    Directory of Open Access Journals (Sweden)

    Hassan Soodmand Afshar

    2014-05-01

    Full Text Available The present study set out to investigate the differences between EAP (English for Academic Purposes) students of Humanities and Engineering in terms of vocabulary strategy choice and use. One hundred and five undergraduate Iranian students (39 from the Engineering Faculty and 66 from the Humanities Faculty) studying at Bu-Ali Sina University, Hamedan, during the academic year 2011–2012 participated in this study. For data collection, a pilot-tested, factor-analyzed, five-point Likert-scale vocabulary learning strategies questionnaire (VLSQ) containing 45 statements was adopted. The results of an independent-samples t-test indicated that, overall, the two groups did not differ significantly in the choice and use of vocabulary learning strategies. However, chi-square analyses revealed significant differences in individual strategy use for 6 of the 45 strategies. That is, while Humanities students used more superficial and straightforward strategies such as repetition and seeking help from others, the Engineering students preferred deeper, more thought-provoking and sophisticated strategies such as using a monolingual dictionary and learning vocabulary through collocations and coordinates. Further, the most and least frequently used vocabulary learning strategies of the two groups were identified, of which only two strategies in each category were shared by both groups. The possible reasons for these results, as well as the implications of the study, are discussed in detail in the paper.

  16. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    Science.gov (United States)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
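    As a concrete illustration of the Fitts'-law quantities involved, the sketch below computes the index of difficulty (Shannon formulation) and an information transfer rate from target distance, width, and acquisition time. The specific numbers are hypothetical, not data from the study.

    ```python
    import math

    def index_of_difficulty(distance, width):
        """Shannon formulation of Fitts' index of difficulty, in bits."""
        return math.log2(distance / width + 1.0)

    def information_transfer_rate(distance, width, movement_time):
        """Throughput in bits/s: task difficulty divided by acquisition
        time, enabling comparisons across control modalities (e.g. EEG
        cursor control vs. joystick) on the same task."""
        return index_of_difficulty(distance, width) / movement_time

    # Hypothetical trials: a harder target takes longer to acquire but can
    # still convey more information per second.
    easy = information_transfer_rate(distance=100, width=100, movement_time=0.8)
    hard = information_transfer_rate(distance=700, width=100, movement_time=1.6)
    ```

    Fitting movement time against the index of difficulty across targets gives the linear relationship MT = a + b·ID that the study evaluates per movement direction.
    
    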

  17. Strategy Ranges: Describing Change in Prospective Elementary Teachers' Approaches to Mental Computation of Sums and Differences

    Science.gov (United States)

    Whitacre, Ian

    2015-01-01

    This study investigated the sets of mental computation strategies used by prospective elementary teachers to compute sums and differences of whole numbers. In the context of an intervention designed to improve the number sense of prospective elementary teachers, participants were interviewed pre/post, and their mental computation strategies were…

  18. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  19. Isolating second language learning factors in a computational study of bilingual construction acquisition

    NARCIS (Netherlands)

    Matusevych, Y.; Alishahi, A.; Backus, A.M.; Bello, P.; Guarini, M.; McShane, M.; Scassellati, B.

    2014-01-01

    The study of second language acquisition (SLA) is often hindered by substantial variability in the background of learners, their learning process and the input they receive. This diversity often makes it difficult to isolate specific learning factors and study their impact on L2 development. We

  20. Combining Language Corpora with Experimental and Computational Approaches for Language Acquisition Research

    Science.gov (United States)

    Monaghan, Padraic; Rowland, Caroline F.

    2017-01-01

    Historically, first language acquisition research was a painstaking process of observation, requiring the laborious hand coding of children's linguistic productions, followed by the generation of abstract theoretical proposals for how the developmental process unfolds. Recently, the ability to collect large-scale corpora of children's language…

  1. Market driven strategy for acquisition of waste acceptance and transportation services for commercial spent fuel in the united states

    International Nuclear Information System (INIS)

    Lemeshewsky, W.; Macaluso, C.; Smith, P.; Teer, B.

    1998-01-01

    The Office of Civilian Radioactive Waste Management (OCRWM) in the United States Department of Energy (DOE) has the responsibility under the Nuclear Waste Policy Act of 1982 (the Act) for the shipment of spent nuclear fuel (SNF) from commercial reactors to a Federal facility for storage and/or disposal. The Act requires the use of private industry to the 'fullest extent possible' in the transportation of spent fuel. An OCRWM goal is to develop a safe, efficient and effective transportation system while meeting the mandate of the Act. OCRWM has therefore developed a strategy for a market-driven approach to the acquisition of transportation services and equipment. To implement this strategy, OCRWM is planning to issue a Request for Proposals (RFP) for the provision of the required services and equipment to accept SNF from the utilities and transport it to a Federal facility. Two draft RFPs have been issued, with the second draft incorporating comments on the first from potential contractors and other interested parties. The overall strategy outlined in the draft RFP relies on private industry to use the innovative powers of the marketplace to help DOE accomplish its mission objectives. DOE intends to pursue this procurement strategy whether or not the OCRWM program includes interim storage. The concept described in the draft RFP provides for DOE to purchase services and equipment from a contractor-operated waste acceptance and transportation organization. The contractor is expected to provide initial financing for the project, including that necessary for the initial acquisition of operational equipment; establish the necessary management organization; and mobilize the necessary resources and capabilities to provide the SNF delivery services at a fixed rate. DOE will retain final approval on all routes and maintain primary responsibility to the States, tribes, and local units of government for assuring appropriate interaction and consideration of their input on

  2. A strategy to compute plastic post-buckling of structures

    International Nuclear Information System (INIS)

    Combescure, A.

    1983-01-01

    All the methods presented here give, in some cases, interesting computed solutions. It has been observed that the different strategies do not always yield the same post-buckling path. More fundamentally, it has been observed that when buckling is unstable, the post-buckling path is characterized by a dynamic movement. All inertial effects are neglected in the approaches presented here, so the post-buckling load-deflection curve is valid only if very little kinetic energy is associated with the post-buckling. As presented, the method is also limited to a load depending on a single parameter lambda; the case of more than one parameter is not yet clear. In conclusion, the method presented here solves a class of post-buckling problems: if the post-buckling occurs with small kinetic energy (displacement-controlled buckling) and the loads depend on only one parameter, these methods should give good results even in the plastic range. If the buckling is unstable and a large kinetic energy is involved in the post-buckling, these methods are not realistic. (orig./RW)
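    The idea of displacement-controlled buckling with a single load parameter lambda can be illustrated on a one-degree-of-freedom snap-through model (a hypothetical toy problem, not the paper's formulation): prescribing the displacement and recovering lambda from equilibrium traces the full load-deflection curve, including the unstable descending branch that pure load control cannot follow past the limit point.

    ```python
    def internal_force(u):
        """Toy 1-DOF snap-through resistance curve: the internal force rises
        to a limit point at u = 1 (lambda = 2), then drops along the
        unstable post-buckling branch before rising again is no longer
        possible within this cubic model."""
        return 3.0 * u - u**3

    # Displacement control: prescribe u and recover the single load
    # parameter lambda = f_int(u) / P from equilibrium.  This traces the
    # whole equilibrium path; load control would fail at lambda_max.
    P = 1.0
    path = [(u / 10.0, internal_force(u / 10.0) / P) for u in range(0, 31)]
    lam_max = max(lam for _, lam in path)   # limit load of the toy model
    ```

    The descending part of `path` (lambda decreasing while u grows) is exactly the regime where, as the abstract notes, neglected inertial effects would matter in a real unstable snap-through.
    
    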

  3. Computer control and data acquisition system for the Mirror Fusion Test Facility Ion Cyclotron Resonant Heating System (ICRH)

    International Nuclear Information System (INIS)

    Cheshire, D.L.; Thomas, R.A.

    1985-01-01

    The Lawrence Livermore National Laboratory (LLNL) large Mirror Fusion Test Facility (MFTF-B) will employ an Ion Cyclotron Resonant Heating (ICRH) system for plasma startup. As the MFTF-B Industrial Participant, TRW has responsibility for the ICRH system, including development of its data acquisition and control system. During MFTF-B operation, the ICRH system will be run from the MFTF-B Supervisory Control and Diagnostic System (SCDS). For subsystem development and checkout at TRW, and for verification and acceptance testing at LLNL, the system will be run from a stand-alone computer system designed to simulate the functions of SCDS. The 'SCDS Simulator' was developed originally for the MFTF-B ECRH system; descriptions of the hardware and software are updated in this paper. The computer control and data acquisition functions implemented for ICRH are described, including development status and the test schedule at TRW and at LLNL. The application software is written for the SCDS Simulator, but it is programmed in PASCAL and designed to facilitate conversion for use on the SCDS computers

  4. Prioritization of the hemodialysis patients' preferences in acquisition of health information: A strategy for patient education

    Directory of Open Access Journals (Sweden)

    Hassan Babamohamadi

    2016-07-01

    Full Text Available Full training tailored to the information needs of patients reduces health care costs and increases the quality of care. The present study was conducted to prioritize the preferences of hemodialysis patients in acquiring health information, so that training can be provided according to these preferences. This descriptive cross-sectional study was conducted on all hemodialysis patients who visited Kowsar Hospital in Semnan during 2014-2015. The data collection tool was a researcher-made questionnaire that assessed the physical information needs of patients in four areas: nutrition, energy, pain and discomfort, and sleep and rest. Data were analyzed with SPSS version 16 using descriptive statistics. Seventy-one hemodialysis patients participated in this study. 68.6%, 50.7%, 42.6% and 46.7% of patients named information about hematopoietic foods, how to increase mobility, how to relieve itching during dialysis, and mental activities before sleep, respectively, as their first priorities. The results of this study show what kinds of information hemodialysis patients need regarding their physical problems. Providing information and training based on the real needs of patients will help facilitate their adaptation and self-care.

  5. Confidence range estimate of extended source imagery acquisition algorithms via computer simulations. [in optical communication systems

    Science.gov (United States)

    Chen, CHIEN-C.; Hui, Elliot; Okamoto, Garret

    1992-01-01

    Spatial acquisition using the sun-lit Earth as a beacon source provides several advantages over active beacon-based systems for deep-space optical communication systems. However, since the angular extent of the Earth image is large compared to the laser beam divergence, the acquisition subsystem must be capable of resolving the image to derive the proper pointing orientation. The algorithms used must be capable of deducing the receiver location given the blurring introduced by the imaging optics and the large Earth albedo fluctuation. Furthermore, because of the complexity of modelling the Earth and the tracking algorithms, an accurate estimate of the algorithm accuracy can only be made via simulation using realistic Earth images. An image simulator was constructed for this purpose, and the results of the simulation runs are reported.

  6. Strategies for improving approximate Bayesian computation tests for synchronous diversification.

    Science.gov (United States)

    Overcast, Isaac; Bagley, Justin C; Hickerson, Michael J

    2017-08-24

    Estimating the variability in isolation times across co-distributed taxon pairs that may have experienced the same allopatric isolating mechanism is a core goal of comparative phylogeography. The use of hierarchical Approximate Bayesian Computation (ABC) and coalescent models to infer temporal dynamics of lineage co-diversification has been a contentious topic in recent years. Key issues that remain unresolved include the choice of an appropriate prior on the number of co-divergence events (Ψ), as well as the optimal strategies for data summarization. Through simulation-based cross validation we explore the impact of the strategy for sorting summary statistics and the choice of prior on Ψ on the estimation of co-divergence variability. We also introduce a new setting (β) that can potentially improve estimation of Ψ by enforcing a minimal temporal difference between pulses of co-divergence. We apply this new method to three empirical datasets: one dataset each of co-distributed taxon pairs of Panamanian frogs and freshwater fishes, and a large set of Neotropical butterfly sister-taxon pairs. We demonstrate that the choice of prior on Ψ has little impact on inference, but that sorting summary statistics yields substantially more reliable estimates of co-divergence variability despite violations of assumptions about exchangeability. We find the implementation of β improves estimation of Ψ, with improvement being most dramatic given larger numbers of taxon pairs. We find equivocal support for synchronous co-divergence for both of the Panamanian groups, but we find considerable support for asynchronous divergence among the Neotropical butterflies. Our simulation experiments demonstrate that using sorted summary statistics results in improved estimates of the variability in divergence times, whereas the choice of hyperprior on Ψ has negligible effect. Additionally, we demonstrate that estimating the number of pulses of co-divergence across co-distributed taxon
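    The effect of sorting summary statistics can be illustrated with a minimal rejection-ABC sketch (a toy model, not the authors' hierarchical implementation): taxon-pair divergence times are drawn from Ψ shared pulses, and sorting the summary vector makes it exchangeable across taxon pairs before distances are computed.

    ```python
    import random

    random.seed(1)
    N_PAIRS = 8

    def simulate(psi):
        """Draw psi pulse times on [0, 1] and assign each taxon pair to one
        pulse; returns the vector of pairwise divergence times."""
        pulses = [random.uniform(0.0, 1.0) for _ in range(psi)]
        return [random.choice(pulses) for _ in range(N_PAIRS)]

    def distance(obs, sim, sort=True):
        """Euclidean distance between summary vectors; sorting removes the
        arbitrary ordering of taxon pairs (exchangeability)."""
        a, b = (sorted(obs), sorted(sim)) if sort else (obs, sim)
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    observed = simulate(1)   # ground truth: one synchronous pulse

    accepted = []
    for _ in range(20000):
        psi = random.randint(1, N_PAIRS)   # discrete uniform prior on psi
        if distance(observed, simulate(psi)) < 0.15:
            accepted.append(psi)

    # Posterior mass concentrated on small psi supports synchronous
    # co-divergence in this toy setting.
    post_1 = accepted.count(1) / len(accepted)
    ```

    With synchronous truth, simulations under small Ψ match the sorted observed summaries far more often, so the accepted sample concentrates on Ψ = 1.
    
    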

  7. Computer system requirements specification for 101-SY hydrogen mitigation test project data acquisition and control system (DACS-1)

    International Nuclear Information System (INIS)

    McNeece, S.G.; Truitt, R.W.

    1994-01-01

    The system requirements specification for the SY-101 Hydrogen Mitigation Test Project (HMTP) data acquisition and control system (DACS-1) documents the system requirements for the DACS-1 project. The purpose of the DACS is to provide data acquisition and control capabilities for the hydrogen mitigation testing of Tank SY-101. Mitigation testing uses a pump immersed in the waste, directed at varying angles and operated at different speeds and time durations. Signals from tank and supporting instrumentation are brought into the DACS to monitor the status of the tank and to provide information on the effectiveness of the mitigation test. Instrumentation is also provided for closed-loop control of pump operation. The DACS is also capable of being expanded to control and monitor other mitigation testing. The intended audience for the computer system requirements specification includes the SY-101 hydrogen mitigation test data acquisition and control system designers (analysts, programmers, instrument engineers, operators, and maintainers) and the data users: tank farm operations, mitigation test engineers, the Test Review Group (TRG), data management support staff, data analysts, Hanford data stewards, and external reviewers

  8. A response strategy predicts acquisition of schedule-induced polydipsia in rats.

    Science.gov (United States)

    Gregory, James Gardner; Hawken, Emily R; Banasikowski, Tomek J; Dumont, Eric C; Beninger, Richard J

    2015-08-03

    Schedule-induced polydipsia (SIP) is excessive, non-regulatory drinking. We aimed to identify phenotypic learning traits representative of the neural circuitry that underlies SIP, and we hypothesized that rats that are response-learners would be more susceptible to developing compulsive water drinking. Using the Y-maze, the rats were characterized as either place- or response-learners. They were then exposed to the SIP protocol for 21 days. Subsequent histological staining for FosB/ΔFosB examined neuronal activation associated with SIP in several brain regions. The rats with a preference for a response-learning strategy were more likely to develop SIP than the rats using a place-learning strategy. Furthermore, amphetamine sensitization, which was observed to increase SIP, also shifted rats toward a response-learning strategy. No differences in FosB/ΔFosB expression were observed between SIP and non-SIP rats in the dorsolateral striatum (DLS) or the CA1 region of the hippocampus. However, SIP rats had greater FosB/ΔFosB expression in prefrontal cortex regions. Thus, rats that develop SIP have a preference for response-learning strategies and increased neuronal activation in frontal cortical regions associated with habit formation and compulsion. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Assessment of temporal resolution of multi-detector row computed tomography in helical acquisition mode using the impulse method.

    Science.gov (United States)

    Ichikawa, Katsuhiro; Hara, Takanori; Urikura, Atsushi; Takata, Tadanori; Ohashi, Kazuya

    2015-06-01

    The purpose of this study was to propose a method for assessing the temporal resolution (TR) of multi-detector row computed tomography (CT) (MDCT) in the helical acquisition mode using temporal impulse signals generated by a metal ball passing through the acquisition plane. An 11-mm diameter metal ball was shot along the central axis at approximately 5 m/s during a helical acquisition, and the temporal sensitivity profile (TSP) was measured from the streak image intensities in the reconstructed helical CT images. To assess the validity, we compared the measured and theoretical TSPs for the 4-channel modes of two MDCT systems. A 64-channel MDCT system was used to compare TSPs and image quality of a motion phantom for the pitch factors P of 0.6, 0.8, 1.0 and 1.2 with a rotation time R of 0.5 s, and for two R/P combinations of 0.5/1.2 and 0.33/0.8. Moreover, the temporal transfer functions (TFs) were calculated from the obtained TSPs. The measured and theoretical TSPs showed perfect agreement. The TSP narrowed with an increase in the pitch factor. The image sharpness of the 0.33/0.8 combination was inferior to that of the 0.5/1.2 combination, despite their almost identical full width at tenth maximum values. The temporal TFs quantitatively confirmed these differences. The TSP results demonstrated that the TR in the helical acquisition mode significantly depended on the pitch factor as well as the rotation time, and the pitch factor and reconstruction algorithm affected the TSP shape. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
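The temporal-resolution metrics this record relies on (full width at half/tenth maximum of the temporal sensitivity profile, and the temporal transfer function as its Fourier magnitude) can be sketched in a few lines. The Gaussian TSP and all numeric values below are illustrative stand-ins, not data from the study:

```python
# Sketch: FWHM/FWTM of a temporal sensitivity profile (TSP) and its
# temporal transfer function (normalized |FFT|). The Gaussian profile
# is a stand-in for a measured TSP; all parameters are invented.
import numpy as np

def width_at_fraction(t, tsp, fraction):
    """Profile width at `fraction` of peak, with sub-sample interpolation."""
    level = fraction * tsp.max()
    idx = np.where(tsp >= level)[0]
    i0, i1 = idx[0], idx[-1]
    # interpolate the level crossing on the rising and falling edges
    left = np.interp(level, [tsp[i0 - 1], tsp[i0]], [t[i0 - 1], t[i0]])
    right = np.interp(level, [tsp[i1 + 1], tsp[i1]], [t[i1 + 1], t[i1]])
    return right - left

dt = 0.001                       # 1 ms sampling of the profile
t = np.arange(-0.5, 0.5, dt)     # seconds
sigma = 0.080                    # assumed 80 ms effective temporal aperture
tsp = np.exp(-t**2 / (2 * sigma**2))

fwhm = width_at_fraction(t, tsp, 0.5)
fwtm = width_at_fraction(t, tsp, 0.1)

# temporal transfer function: |FFT| of the TSP, normalized at zero frequency
ttf = np.abs(np.fft.rfft(tsp))
ttf /= ttf[0]
freqs = np.fft.rfftfreq(len(tsp), dt)   # Hz
```

For a Gaussian, FWHM = 2*sqrt(2 ln 2)*sigma and FWTM = 2*sqrt(2 ln 10)*sigma, which the sketch recovers numerically.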

  10. Good year, bad year: changing strategies, changing networks? A two-year study on seed acquisition in northern Cameroon

    Directory of Open Access Journals (Sweden)

    Chloé Violon

    2016-06-01

    Full Text Available Analysis of seed exchange networks at a single point in time may reify sporadic relations into apparently fixed and long-lasting ones. In northern Cameroon, where the environment is not only strongly seasonal but also shows unpredictable interannual variation, farmers' social networks are flexible from year to year. When adjusting their strategies, Tupuri farmers do not systematically solicit the same partners to acquire the desired propagules. Seed acquisitions documented during a single cropping season may thus not accurately reflect the underlying larger social network that can be mobilized at the local level. To test this hypothesis, we documented, at the outset of two cropping seasons (2010 and 2011), the relationships through which seeds were acquired by the members of 16 households in a Tupuri community. In 2011, farmers faced sudden failure of the rains and had to solicit distant relatives, highlighting their ability to quickly trigger specific social relations to acquire necessary seeding material. Observing the same set of individuals during two successive years and the seed sources they solicited in each year enabled us to discriminate repeated relations from sporadic ones. Although farmers did not acquire seeds from the same individuals from one year to the next, they relied on quite similar relational categories of people. However, the worse weather conditions during the second year led to (1) a shift from red sorghum seeds to pearl millet seeds, (2) a geographical extension of the network, and (3) an increased participation of women in seed acquisitions. In critical situations, women mobilized their own kin almost exclusively. We suggest that studying the seed acquisition network over a single year provides a misrepresentation of the underlying social network. Depending on the difficulties farmers face, they may occasionally call on relationships that transcend the local relationships used each year.
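The distinction the record draws between repeated and sporadic ties can be made concrete by comparing the two years' edge sets. The households and partner names below are invented purely for illustration:

```python
# Sketch: year-to-year turnover in a seed-acquisition network, measured by
# comparing edge sets across two seasons. All ties here are hypothetical.
year1 = {("hh1", "kinA"), ("hh1", "neighborB"), ("hh2", "kinC"), ("hh3", "marketD")}
year2 = {("hh1", "kinA"), ("hh2", "kinE"), ("hh3", "marketD"), ("hh4", "kinF")}

repeated = year1 & year2          # ties mobilized in both years
sporadic = year1 ^ year2          # ties seen in only one of the two years
jaccard = len(repeated) / len(year1 | year2)   # overlap of the two networks
```

A low Jaccard overlap with stable relational *categories* (kin, neighbors, market) would match the pattern the study reports.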

  11. Data acquisition instruments: Psychopharmacology

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1998-01-01

    This report contains the results of a Direct Assistance Project performed by Lockheed Martin Energy Systems, Inc., for Dr. K. O. Jobson. The purpose of the project was to perform preliminary analysis of the data acquisition instruments used in the field of psychiatry, with the goal of identifying commonalities of data and strategies for handling and using the data in the most advantageous fashion. Data acquisition instruments from 12 sources were provided by Dr. Jobson. Several commonalities were identified and a potentially useful data strategy is reported here. Analysis of the information collected for utility in performing diagnoses is recommended. In addition, further work is recommended to refine the commonalities into a directly useful computer systems structure.

  12. Hard real-time quick EXAFS data acquisition with all open source software on a commodity personal computer

    International Nuclear Information System (INIS)

    So, I.; Siddons, D.P.; Caliebe, W.A.; Khalid, S.

    2007-01-01

    We describe here the data acquisition subsystem of the Quick EXAFS (QEXAFS) experiment at the National Synchrotron Light Source of Brookhaven National Laboratory. For ease of future growth and flexibility, almost all software components are open source with very active maintainers: Linux running on an x86 desktop computer, RTAI for real-time response, the COMEDI driver for the data acquisition hardware, Qt and PyQt for the graphical user interface, PyQwt for plotting, and Python for scripting. The signal (A/D) and energy-reading (IK220 encoder) devices in the PCI computer are also EPICS enabled. The control system scans the monochromator energy through a networked EPICS motor. With the real-time kernel, the system is capable of a deterministic data-sampling period of tens of microseconds with typical timing jitter of several microseconds. At the same time, Linux runs other non-real-time processes handling the user interface. A modern Qt-based controls frontend enhances productivity. The fast plotting and zooming of data in time or energy coordinates let the experimenters verify the quality of the data before detailed analysis. Python scripting is built in for automation. The typical data rate for continuous runs is around 10 MB/min.
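The deterministic-sampling claim (tens of microseconds period, a few microseconds jitter) is the kind of figure one verifies from recorded sample timestamps. The synthetic timestamps below are illustrative, not measured data from this system:

```python
# Sketch: quantifying sampling-period jitter from sample timestamps, as one
# might do to check a real-time DAQ's timing. Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
nominal_period = 50e-6            # assumed 50 us target sampling period
jitter_sigma = 2e-6               # assumed 2 us RMS jitter
n = 10_000
timestamps = np.cumsum(nominal_period + rng.normal(0, jitter_sigma, n))

periods = np.diff(timestamps)     # actual sample-to-sample intervals
mean_period = periods.mean()
rms_jitter = periods.std()
worst_case = np.abs(periods - nominal_period).max()
```

On a hard real-time kernel the worst-case deviation, not just the RMS, is the figure of merit.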

  13. Development of a computer systems for operational data acquisition of uranium isotopic enrichment pilot plant

    International Nuclear Information System (INIS)

    Maia, W.M.C.

    1985-01-01

    A pilot plant for uranium enrichment using the jet nozzle process was transferred from the Federal Republic of Germany to Brazil, to train Brazilian technicians in its operation and to improve the process. This pilot plant is monitored by a data acquisition system, and faulty events would cause serious difficulties as far as maintenance is concerned (for instance, unavailable special components). The development of a new system is described, proposed in order to minimize maintenance difficulties; its assembly uses large-scale-integration circuits and it is controlled by a microcomputer. (Author) [pt

  14. Computer-controlled data acquisition system for the ISX-B neutral injection system

    International Nuclear Information System (INIS)

    Edmonds, P.H.; Sherrill, B.; Pearce, J.W.

    1980-05-01

    A data acquisition system for the Impurity Study Experiment (ISX-B) neutral injection system at the Oak Ridge National Laboratory is presented. The system is based on CAMAC standards and is controlled by a MIK-11/2 microcomputer. The system operates at the ion source high voltage on the source table, transmitting the analyzed data to a terminal at ground potential. This reduces the complexity of the communications link and also allows much flexibility in the diagnostics and eventual control of the beam line

  15. A synchronization method for wireless acquisition systems, application to brain computer interfaces.

    Science.gov (United States)

    Foerster, M; Bonnet, S; van Langhenhove, A; Porcherot, J; Charvet, G

    2013-01-01

    A synchronization method for wireless acquisition systems has been developed and implemented on a wireless ECoG recording implant and on a wireless EEG recording helmet. The presented algorithm and hardware implementation allow the precise synchronization of several data streams from several sensor nodes for applications where timing is critical, such as event-related potential (ERP) studies. The proposed method has been successfully applied to obtain visual evoked potentials and compared with a reference biosignal amplifier. Control over the exact sampling frequency reduces synchronization errors that would otherwise accumulate during a recording. The method is scalable to several sensor nodes communicating with a shared base station.
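The record describes hardware synchronization; the software counterpart is aligning streams from nodes with offset, slightly mismatched clocks onto one time base. The sampling rates, offset, and test signal below are invented for illustration:

```python
# Sketch: aligning two asynchronously sampled streams onto a common time
# base by linear interpolation against each node's own timestamps.
import numpy as np

fs_a, fs_b = 1000.0, 1024.0                  # two nodes, different clock rates
t_a = np.arange(0, 1.0, 1 / fs_a)
t_b = np.arange(0, 1.0, 1 / fs_b) + 0.003    # node B starts 3 ms late
sig = lambda t: np.sin(2 * np.pi * 10 * t)   # common 10 Hz source signal
a, b = sig(t_a), sig(t_b)

# resample both onto a shared 1 kHz grid inside both streams' coverage
t_common = np.arange(0.01, 0.99, 0.001)
a_sync = np.interp(t_common, t_a, a)
b_sync = np.interp(t_common, t_b, b)

residual = np.max(np.abs(a_sync - b_sync))   # small once both share one clock
```

Without the timestamp correction, a 3 ms offset alone would misalign an ERP by several samples; after resampling the residual is limited by interpolation error.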

  16. Effects of two retraining strategies on nursing students' acquisition and retention of BLS/AED skills: A cluster randomised trial.

    Science.gov (United States)

    Hernández-Padilla, José Manuel; Suthers, Fiona; Granero-Molina, José; Fernández-Sola, Cayetano

    2015-08-01

    To determine and compare the effects of two different retraining strategies on nursing students' acquisition and retention of BLS/AED skills. Nursing students (N = 177) from two European universities were randomly assigned to either an instructor-directed (IDG) or a student-directed (SDG) 4-h retraining session in BLS/AED. A multiple-choice questionnaire, the Cardiff Test, Laerdal SkillReporter(®) software and a self-efficacy scale were used to assess students' overall competency (knowledge, psychomotor skills and self-efficacy) in BLS/AED at pre-test, post-test and 3-month retention-test. GEE, chi-squared and McNemar tests were performed to examine statistical differences amongst groups across time. There was a significant increase in the proportion of students who achieved competency for all variables measuring knowledge, psychomotor skills and self-efficacy between pre-test and post-test in both groups (all p-values significant). The study demonstrated that using a student-directed strategy to retrain BLS/AED skills resulted in a higher proportion of nursing students achieving and retaining competency in BLS/AED at three months when compared to an instructor-directed strategy. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
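Among the tests the record names, McNemar's test is the natural one for paired pass/fail competency data (did a given student pass at pre-test vs. post-test?). The discordant counts below are invented, and this is a plain chi-square version rather than any particular statistics package:

```python
# Sketch: McNemar's test on paired pass/fail outcomes, with continuity
# correction. Counts are hypothetical, not the study's data.

def mcnemar_chi2(b, c):
    """b = passed pre-test only, c = passed post-test only (discordant pairs)."""
    if b + c == 0:
        return 0.0
    return (abs(b - c) - 1) ** 2 / (b + c)

# hypothetical discordant counts: 3 students regressed, 41 newly competent
stat = mcnemar_chi2(3, 41)
significant = stat > 3.841   # chi-square critical value, 1 df, alpha = .05
```

Only the discordant pairs enter the statistic; students who passed (or failed) at both time points carry no information about change.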

  17. The effect of acquisition moves on income, pre-tax profits and future strategy of logistics firms

    Directory of Open Access Journals (Sweden)

    Judit Oláh

    2017-12-01

    Full Text Available Our research deals with a comprehensive study of the management success factors of logistics service providers using a new approach, and examines the life of logistics service companies. The data were collected from 51 logistics service providers in Hungary. We examined the proper enterprise scale – acquisition strategies (including methods of seeking economies of scale) in the LSP segment – and the role of strategy choice. Our research found that among logistics companies, those firms which followed the growth pattern had significantly higher sales revenue than the companies growing organically. Additionally, logistics companies – considering their pre-tax profits – work more efficiently when they have a growth strategy (regardless of its time lag). However, this claim is true only for those companies that did not have any (revenue) growth over the previous period. The results of our research can effectively help logistics service providers find their business success factors, which will enable them to fulfil the expectations of their customers in the supply chain better.

  18. The Effects of Computer-Assisted Feedback Strategies in Technology Education: A Comparison of Learning Outcomes

    Science.gov (United States)

    Adams, Ruifang Hope; Strickland, Jane

    2012-01-01

    This study investigated the effects of computer-assisted feedback strategies that have been utilized by university students in a technology education curriculum. Specifically, the study examined the effectiveness of the computer-assisted feedback strategy "Knowledge of Response feedback" (KOR), and the "Knowledge of Correct Responses feedback"…

  19. Mental Computation or Standard Algorithm? Children's Strategy Choices on Multi-Digit Subtractions

    Science.gov (United States)

    Torbeyns, Joke; Verschaffel, Lieven

    2016-01-01

    This study analyzed children's use of mental computation strategies and the standard algorithm on multi-digit subtractions. Fifty-eight Flemish 4th graders of varying mathematical achievement level were individually offered subtractions that either stimulated the use of mental computation strategies or the standard algorithm in one choice and two…

  20. English Language Learners' Strategies for Reading Computer-Based Texts at Home and in School

    Science.gov (United States)

    Park, Ho-Ryong; Kim, Deoksoon

    2016-01-01

    This study investigated four elementary-level English language learners' (ELLs') use of strategies for reading computer-based texts at home and in school. The ELLs in this study were in the fourth and fifth grades in a public elementary school. We identify the ELLs' strategies for reading computer-based texts in home and school environments. We…

  1. Effects of Computer-Based Practice on the Acquisition and Maintenance of Basic Academic Skills for Children with Moderate to Intensive Educational Needs

    Science.gov (United States)

    Everhart, Julie M.; Alber-Morgan, Sheila R.; Park, Ju Hee

    2011-01-01

    This study investigated the effects of computer-based practice on the acquisition and maintenance of basic academic skills for two children with moderate to intensive disabilities. The special education teacher created individualized computer games that enabled the participants to independently practice academic skills that corresponded with their…

  2. The movement kinematics and learning strategies associated with adopting different foci of attention during both acquisition and anxious performance.

    Directory of Open Access Journals (Sweden)

    Gavin Peter Lawrence

    2012-11-01

    Full Text Available Research suggests that implicit strategies adopted during learning help prevent the breakdown of automatic processes and the subsequent performance decrements associated with the presence of pressure. According to the Constrained Action Hypothesis, automaticity of movement is promoted when adopting an external focus of attention. The purpose of the current experiment was to investigate whether learning with an external focus of attention can enhance performance under subsequent pressure situations by promoting implicit learning and automaticity. Since previous research has generally used outcome measures of performance, the current study adopted measures of movement production. Specifically, we calculated within-subject variability in trajectory velocity and distance travelled at every 10% of movement time. This detailed kinematic analysis allowed investigation into some of the previously unexplored mechanisms responsible for the benefits of adopting an external focus of attention. Novice participants performed a 2.5 m golf putt. Following a pre-test, participants were randomly assigned to one of three focus groups (internal, external, control). Participants then completed 400 acquisition trials over two consecutive days before being subjected to both a low-anxiety and a high-anxiety transfer test. Dependent variables included variability, number of successful putts and mean radial error. Results revealed that variability was greater in the internal compared to the external and control groups. Putting performance revealed that all groups increased performance following acquisition. However, only the control group demonstrated a decrement in performance in the high-anxiety transfer test. These findings suggest that adopting an appropriate focus of attention during learning can prevent choking; with an external focus inhibiting the breakdown of automatic processes and an internal focus acting as a self-focus learning strategy and thus desensitizing individuals

  3. Pedagogical Strategies to Increase Pre-service Teachers’ Confidence in Computer Learning

    Directory of Open Access Journals (Sweden)

    Li-Ling Chen

    2004-07-01

    Full Text Available Pre-service teachers’ attitudes towards computers significantly influence their future adoption of integrating computer technology into their teaching. What are the pedagogical strategies that a teacher education instructor or an instructional designer can incorporate to enhance a pre-service teacher’s comfort level in using computers? In this exploratory report, the researcher synthesizes related literature, provides a comprehensive list of theory-based instructional strategies, and describes a study of the perceptions of 189 pre-service teachers regarding strategies related to increasing their comfort in using computers.

  4. In-Depth Analysis of Computer Memory Acquisition Software for Forensic Purposes.

    Science.gov (United States)

    McDown, Robert J; Varol, Cihan; Carvajal, Leonardo; Chen, Lei

    2016-01-01

    The comparison studies on random access memory (RAM) acquisition tools are either limited in metrics, or the selected tools were designed to be executed on older operating systems. Therefore, this study evaluates seven widely used shareware or freeware/open source RAM acquisition forensic tools that are compatible with the latest 64-bit Windows operating systems. These tools' user interface capabilities, platform limitations, reporting capabilities, total execution time, shared and proprietary DLLs, modified registry keys, and invoked files during processing were compared. We observed that Windows Memory Reader and Belkasoft's Live Ram Capturer leave the fewest fingerprints in memory when loaded. On the other hand, ProDiscover and FTK Imager perform poorly in memory usage, processing time, DLL usage, and unwanted artifacts introduced into the system. While Belkasoft's Live Ram Capturer is the fastest to obtain an image of the memory, ProDiscover takes the longest time to do the same job. © 2015 American Academy of Forensic Sciences.

  5. Facility location, capacity acquisition and technology selection models for manufacturing strategy planning

    OpenAIRE

    Verter, Vedat

    1993-01-01

    Ankara : The Institute of Engineering and Science, Bilkent Univ., 1993. Thesis (Ph.D.) -- Bilkent University, 1993. Includes bibliographical references leaves 129-141. The primary aim of this dissertation research is to contribute to the manufacturing strategy planning process. The firm is perceived as a value chain which can be represented by a production-distribution network. Structural decisions regarding the value chain of a firm are the means to implement the firm’s manufacturin...

  6. The Effect of Various Strategies on the Acquisition, Retention, and Transfer of a Serial Positioning Task

    Science.gov (United States)

    1979-07-01

    child development and behavior. N. Y.: Academic Press, 1974. Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework for memory research...F. I. M. Craik & L. S. Cermak (Eds.), Levels of processing and theories of memory. Hillsdale, N. J.: Erlbaum, 1978. Bruner, J. S. The act of discovery... Lockhart, 1972; Craik & Tulving, 1975). Although the dependent measures differ, the conclusions drawn remain similar. Strategy usage has a facilitatory

  7. The Influence of Learning Strategies in the Acquisition, Retention, and Transfer of a Visual Tracking Task

    Science.gov (United States)

    1979-08-01

    Psychology, Psychoanalysis and Neurology X. N. Y.: Van Nostrand Reinhold, 1977. Craik, F. I. M., & Lockhart, R. S. Levels of processing: A framework for...Morris, C. D., & Stein, B. S. Some general constraints on learning and memory research. In F. I. M. Craik & L. S. Cermak (Eds.), Levels of processing... Craik & Lockhart, 1972; Craik & Tulving, 1975). Although the dependent measures differ, the conclusions drawn remain similar. Strategy usage has a

  8. Using Mixed-Modality Learning Strategies via e-Learning for Second Language Vocabulary Acquisition

    Science.gov (United States)

    Yang, Fang-Chuan Ou; Wu, Wen-Chi Vivian

    2015-01-01

    This study demonstrated an e-learning system, MyEVA, based on a mixed-modality vocabulary strategy in assisting learners of English as a second language (L2 learners) to improve their vocabulary. To explore the learning effectiveness of MyEVA, the study compared four vocabulary-learning techniques, MyEVA in preference mode, MyEVA in basic mode, an…

  9. Advanced Technology Acquisition Strategies of the People’s Republic of China

    Science.gov (United States)

    2010-09-01

    Institute of Applied Physics and Computational Mathematics ICBM inter-continental ballistic missile IOE Institute of Optics and Electronics IRC...corporations, and other entities to devise collection schemes according to their particular needs. Another is China’s emphasis on “actuarial... mathematics, astronomy, manufacturing, and many other facets of economic and cultural development. The “four great inventions” (sì dà fā míng) of

  10. Knowledge acquisition in ecological poduct design: the effects of computer-mediated communication and elicitation method

    OpenAIRE

    Sauer, J.; Schramme, S.; Rüttinger, B.

    2000-01-01

    This article presents a study that examines multiple effects of using different means of computer-mediated communication and knowledge elicitation methods during a product design process. The experimental task involved a typical scenario in product design, in which a knowledge engineer consults two experts to generate knowledge about a design issue. Employing a 3x2 between-subjects design, three conference types (face-to-face, computer, multimedia) and two knowledge elicitation methods (struc...

  11. Neurolinguistics and psycholinguistics as a basis for computer acquisition of natural language

    Energy Technology Data Exchange (ETDEWEB)

    Powers, D.M.W.

    1983-04-01

    Research into natural language understanding systems for computers has concentrated on implementing particular grammars and grammatical models of the language concerned. This paper presents a rationale for research into natural language understanding systems based on neurological and psychological principles. Important features of the approach are that it seeks to place the onus of learning the language on the computer, and that it seeks to make use of the vast wealth of relevant psycholinguistic and neurolinguistic theory. 22 references.

  12. Listening Strategy Use and Influential Factors in Web-Based Computer Assisted Language Learning

    Science.gov (United States)

    Chen, L.; Zhang, R.; Liu, C.

    2014-01-01

    This study investigates second and foreign language (L2) learners' listening strategy use and factors that influence their strategy use in a Web-based computer assisted language learning (CALL) system. A strategy inventory, a factor questionnaire and a standardized listening test were used to collect data from a group of 82 Chinese students…

  13. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1998-01-01

    The technique of employing single-chip microcomputers and PC computers to compose a fast and large scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under Windows 3.2 were also described. One-, two- and three-dimensional spectra measured by this system were demonstrated
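The one-dimensional spectra such a system demonstrates are pulse-height histograms of digitized detector pulses. A minimal sketch, with a simulated 12-bit ADC stream (Gaussian photopeak over a flat background) standing in for real acquisition hardware:

```python
# Sketch: building a 1-D pulse-height spectrum from ADC samples.
# The photopeak position, width, and counts are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
# simulated 12-bit ADC pulse heights: a photopeak at channel ~2048
peak = rng.normal(2048, 40, 5000).astype(int)
background = rng.integers(0, 4096, 2000)     # flat background counts
samples = np.clip(np.concatenate([peak, background]), 0, 4095)

# one histogram bin per ADC channel
spectrum, _ = np.histogram(samples, bins=4096, range=(0, 4096))
peak_channel = int(spectrum.argmax())
```

Two- and three-dimensional spectra generalize this to coincident samples from multiple ADC channels histogrammed on a grid.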

  14. The composing technique of fast and large scale nuclear data acquisition and control system with single chip microcomputers and PC computers

    International Nuclear Information System (INIS)

    Xu Zurun; Wu Shiying; Liu Haitao; Yao Yangsen; Wang Yingguan; Yang Chaowen

    1997-01-01

    The technique of employing single-chip microcomputers and PC computers to compose a fast and large scale nuclear data acquisition and control system was discussed in detail. The optimum composition mode of this kind of system, the acquisition and control circuit unit based on single-chip microcomputers, the real-time communication methods and the software composition under Windows 3.2 were also described. One-, two- and three-dimensional spectra measured by this system were demonstrated

  15. The selection of acquisition strategy and solving trade surpluses of food products by using the simulation

    Directory of Open Access Journals (Sweden)

    Mikić Neven

    2015-01-01

    Full Text Available The real business environment opens up many possible modes of conduct, so that appropriate strategies, compatible with the multicriteria requirements of the environment, can lead to the realization of the set goal. An adequate schedule and an optimal combination of the available resources can be established through mathematical formalization, in terms of a theoretical model that connects business outcomes with their causes or the probability of their occurrence. The initial motive of this paper is precisely to research the possibility of using and applying the results of such theoretical models in solving specific tasks, namely expressing the relations among the initial assumptions involved in selecting the optimal operating strategy. A theoretical model describing a real problem can be analysed analytically or by simulation, depending on its complexity and the type of variables which describe it. The model should enable managerial balance to be achieved through correction of the available operational resources, thereby increasing the capacity of the decision-making system under insufficient knowledge of the future. The research results show that applying the simulation model in this particular example enables a company to significantly increase its level of business efficiency, utilize its capacities more fully, increase its competitiveness, etc.
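The simulation-based strategy selection described here can be sketched as a small Monte Carlo comparison of two acquisition quantities under uncertain demand. All prices, costs, and demand parameters below are invented, and this newsvendor-style setup is only one possible instance of such a model:

```python
# Sketch: choosing between two acquisition strategies by Monte Carlo
# simulation of profit under random demand. Parameters are hypothetical.
import random

random.seed(7)

def expected_profit(qty, price=10.0, cost=6.0, mean=100.0, sd=20.0, runs=20_000):
    """Monte Carlo estimate of mean profit for acquisition quantity `qty`."""
    total = 0.0
    for _ in range(runs):
        demand = max(0.0, random.gauss(mean, sd))
        total += price * min(qty, demand) - cost * qty   # unsold units are sunk cost
    return total / runs

# two candidate strategies: lean buying vs. bulk buying
profit_lean = expected_profit(90.0)
profit_bulk = expected_profit(130.0)
best_qty = 90.0 if profit_lean > profit_bulk else 130.0
```

With a margin of 4 against a unit cost of 6, over-acquisition is penalized more than lost sales, so the lean strategy wins under these assumed numbers.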

  16. Playing at Serial Acquisitions

    NARCIS (Netherlands)

    J.T.J. Smit (Han); T. Moraitis (Thras)

    2010-01-01

    Behavioral biases can result in suboptimal acquisition decisions, with the potential for errors exacerbated in consolidating industries, where consolidators design serial acquisition strategies and fight escalating takeover battles for platform companies that may determine their future.

  17. The research of computer network security and protection strategy

    Science.gov (United States)

    He, Jian

    2017-05-01

    With the widespread popularity of computer network applications, network security has also received a high degree of attention. The factors affecting network safety are complex, and doing a good job of network security is systematic work and a considerable challenge. Addressing the safety and reliability problems of computer network systems, this paper draws on practical work experience to discuss threats to network security, security technologies, and system design principles, and offers suggestions and measures intended to help the mass of users enhance their safety awareness and master basic network security techniques.

  18. Efficient Strategy Computation in Zero-Sum Asymmetric Repeated Games

    KAUST Repository

    Li, Lichun; Shamma, Jeff S.

    2017-01-01

    -horizon nested information games. For finite-horizon settings, we exploit the fact that, for both players, the security strategy and the opponent's corresponding best response depend only on the informed player's history of actions. Using this property, we refine
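The record's LP-based security strategies for nested-information games are not reproduced here; as background, a security (maxmin) strategy for the simplest case, a zero-sum matrix game with no information asymmetry, can be approximated by fictitious play. The matching-pennies payoff matrix and iteration count are illustrative:

```python
# Sketch: fictitious play approximating the security strategies of a
# zero-sum matrix game (matching pennies). This is a classic background
# method, not the paper's algorithm.
import numpy as np

# row player maximizes, column player minimizes
A = np.array([[1.0, -1.0],
              [-1.0, 1.0]])

row_counts = np.ones(2)
col_counts = np.ones(2)
for _ in range(50_000):
    col_mix = col_counts / col_counts.sum()
    row_counts[np.argmax(A @ col_mix)] += 1    # row best-responds to column's history
    row_mix = row_counts / row_counts.sum()
    col_counts[np.argmin(row_mix @ A)] += 1    # column best-responds to row's history

row_strategy = row_counts / row_counts.sum()
col_strategy = col_counts / col_counts.sum()
game_value = float(row_strategy @ A @ col_strategy)
```

For matching pennies the unique equilibrium is (1/2, 1/2) for both players with value 0, and the empirical frequencies converge there.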

  19. Regional differences in brain volume predict the acquisition of skill in a complex real-time strategy videogame.

    Science.gov (United States)

    Basak, Chandramallika; Voss, Michelle W; Erickson, Kirk I; Boot, Walter R; Kramer, Arthur F

    2011-08-01

    Previous studies have found that differences in brain volume among older adults predict performance in laboratory tasks of executive control, memory, and motor learning. In the present study we asked whether regional differences in brain volume as assessed by the application of a voxel-based morphometry technique on high resolution MRI would also be useful in predicting the acquisition of skill in complex tasks, such as strategy-based video games. Twenty older adults were trained for over 20 h to play Rise of Nations, a complex real-time strategy game. These adults showed substantial improvements over the training period in game performance. MRI scans obtained prior to training revealed that the volume of a number of brain regions, which have been previously associated with subsets of the trained skills, predicted a substantial amount of variance in learning on the complex game. Thus, regional differences in brain volume can predict learning in complex tasks that entail the use of a variety of perceptual, cognitive and motor processes. Copyright © 2011 Elsevier Inc. All rights reserved.
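The "volume predicts variance in learning" analysis amounts to regressing a learning score on regional volumes. A synthetic stand-in with invented effect sizes (this is not the study's data or its VBM pipeline):

```python
# Sketch: predicting a learning outcome from regional brain volumes by
# ordinary least squares. All data are synthetic; coefficients are invented.
import numpy as np

rng = np.random.default_rng(42)
n, k = 20, 3                        # 20 participants, 3 regional volumes
volumes = rng.normal(0, 1, (n, k))  # standardized regional volumes
true_w = np.array([0.8, 0.0, -0.5]) # assumed region effects on learning
learning = volumes @ true_w + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), volumes])     # add intercept column
coef, *_ = np.linalg.lstsq(X, learning, rcond=None)
pred = X @ coef
r2 = 1 - np.sum((learning - pred) ** 2) / np.sum((learning - learning.mean()) ** 2)
```

A high R² here corresponds to the paper's claim that regional volumes "predicted a substantial amount of variance in learning"; with n = 20, cross-validation would be needed to guard against overfitting.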

  20. Mass spectrometry in plant metabolomics strategies: from analytical platforms to data acquisition and processing.

    Science.gov (United States)

    Ernst, Madeleine; Silva, Denise Brentan; Silva, Ricardo Roberto; Vêncio, Ricardo Z N; Lopes, Norberto Peporine

    2014-06-01

    Covering: up to 2013. Plant metabolomics is a relatively recent research field that has gained increasing interest in the past few years. Up to the present day numerous review articles and guide books on the subject have been published. This review article focuses on the current applications and limitations of the modern mass spectrometry techniques, especially in combination with electrospray ionisation (ESI), an ionisation method which is most commonly applied in metabolomics studies. As a possible alternative to ESI, perspectives on matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-MS) in metabolomics studies are introduced, a method which still is not widespread in the field. In metabolomics studies the results must always be interpreted in the context of the applied sampling procedures as well as data analysis. Different sampling strategies are introduced and the importance of data analysis is illustrated in the example of metabolic network modelling.

  1. Darwinian Spacecraft: Soft Computing Strategies Breeding Better, Faster Cheaper

    Science.gov (United States)

    Noever, David A.; Baskaran, Subbiah

    1999-01-01

    Computers can generate and evaluate vast numbers of candidate solutions to a particular problem, an approach called "soft computing." This approach uses statistical comparables, neural networks, genetic algorithms, fuzzy variables in uncertain environments, and flexible machine learning to create systems that increase spacecraft robustness and improve metric evaluation. These concepts will allow for the development of spacecraft that can perform missions at lower cost.
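Of the soft-computing techniques listed, a genetic algorithm is the easiest to sketch end to end: a population of candidate solutions is bred by selection, crossover, and mutation. The toy "OneMax" objective and all parameters below are illustrative, not anything from the NASA work:

```python
# Sketch: a minimal genetic algorithm on the OneMax toy problem
# (maximize the number of 1-bits). All parameters are invented.
import random

random.seed(1)

GENOME_LEN = 20

def fitness(bits):
    """OneMax objective: count of 1-bits in the genome."""
    return sum(bits)

def evolve(pop_size=30, generations=60, mutation_rate=0.02):
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]              # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENOME_LEN)   # one-point crossover
            child = [bit ^ 1 if random.random() < mutation_rate else bit
                     for bit in a[:cut] + b[cut:]]  # bit-flip mutation
            children.append(child)
        pop = children
        gen_best = max(pop, key=fitness)
        if fitness(gen_best) > fitness(best):
            best = gen_best
    return best

best = evolve()
```

In a spacecraft-design setting the genome would encode design parameters and the fitness function would score mission metrics, but the breed-select-mutate loop is the same.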

  2. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    Energy Technology Data Exchange (ETDEWEB)

    Joubert, Wayne [ORNL; Kothe, Douglas B [ORNL; Nam, Hai Ah [ORNL

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to insure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  3. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    International Nuclear Information System (INIS)

    Joubert, Wayne; Kothe, Douglas B.; Nam, Hai Ah

    2009-01-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will become necessarily heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption portends to be substantial, not unlike the change to the message passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be

  4. Computer game-based mathematics education : Embedded faded worked examples facilitate knowledge acquisition

    NARCIS (Netherlands)

    ter Vrugte, Judith; de Jong, Anthonius J.M.; Vandercruysse, Sylke; Wouters, Pieter; van Oostendorp, Herre; Elen, Jan

    This study addresses the added value of faded worked examples in a computer game-based learning environment. The faded worked examples were introduced to encourage active selection and processing of domain content in the game. The content of the game was proportional reasoning and participants were

  5. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    International Nuclear Information System (INIS)

    VanderLaan, J.F.; Cummings, J.W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is the MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and higher computer I/O speeds are also expected to increase data rates.

  6. CAMAC throughput of a new RISC-based data acquisition computer at the DIII-D tokamak

    Science.gov (United States)

    Vanderlaan, J. F.; Cummings, J. W.

    1993-10-01

    The amount of experimental data acquired per plasma discharge at DIII-D has continued to grow. The largest shot size in May 1991 was 49 Mbyte; in May 1992, 66 Mbyte; and in April 1993, 80 Mbyte. The increasing load has prompted the installation of a new Motorola 88100-based MODCOMP computer to supplement the existing core of three older MODCOMP data acquisition CPUs. New Kinetic Systems CAMAC serial highway driver hardware runs on the 88100 VME bus. The new operating system is the MODCOMP REAL/IX version of AT&T System V UNIX with real-time extensions and networking capabilities; future plans call for installation of additional computers of this type for tokamak and neutral beam control functions. Experiences with the CAMAC hardware and software will be chronicled, including observation of data throughput. The Enhanced Serial Highway crate controller is advertised as twice as fast as the previous crate controller, and higher computer I/O speeds are also expected to increase data rates.

  7. Three-dimensional image acquisition and reconstruction system on a mobile device based on computer-generated integral imaging.

    Science.gov (United States)

    Erdenebat, Munkh-Uchral; Kim, Byeong-Jun; Piao, Yan-Ling; Park, Seo-Yeon; Kwon, Ki-Chul; Piao, Mei-Lan; Yoo, Kwan-Hee; Kim, Nam

    2017-10-01

    A mobile three-dimensional image acquisition and reconstruction system using a computer-generated integral imaging technique is proposed. A depth camera connected to the mobile device acquires the color and depth data of a real object simultaneously, and an elemental image array is generated based on the original three-dimensional information for the object, with lens array specifications input into the mobile device. The three-dimensional visualization of the real object is reconstructed on the mobile display through optical or digital reconstruction methods. The proposed system is implemented successfully and the experimental results certify that the system is an effective and interesting method of displaying real three-dimensional content on a mobile device.

  8. Problems and accommodation strategies reported by computer users with rheumatoid arthritis or fibromyalgia.

    Science.gov (United States)

    Baker, Nancy A; Rubinstein, Elaine N; Rogers, Joan C

    2012-09-01

    Little is known about the problems experienced by and the accommodation strategies used by computer users with rheumatoid arthritis (RA) or fibromyalgia (FM). This study (1) describes specific problems and accommodation strategies used by people with RA and FM during computer use; and (2) examines if there were significant differences in the problems and accommodation strategies between the different equipment items for each diagnosis. Subjects were recruited from the Arthritis Network Disease Registry. Respondents completed a self-report survey, the Computer Problems Survey. Data were analyzed descriptively (percentages; 95% confidence intervals). Differences in the number of problems and accommodation strategies were calculated using nonparametric tests (Friedman's test and Wilcoxon Signed Rank Test). Eighty-four percent of respondents reported at least one problem with at least one equipment item (RA = 81.5%; FM = 88.9%), with most respondents reporting problems with their chair. Respondents most commonly used timing accommodation strategies to cope with mouse and keyboard problems, personal accommodation strategies to cope with chair problems and environmental accommodation strategies to cope with monitor problems. The number of problems during computer use was substantial in our sample, and our respondents with RA and FM may not implement the most effective strategies to deal with their chair, keyboard, or mouse problems. This study suggests that workers with RA and FM might potentially benefit from education and interventions to assist with the development of accommodation strategies to reduce problems related to computer use.

  9. Study on Cloud Computing Resource Scheduling Strategy Based on the Ant Colony Optimization Algorithm

    OpenAIRE

    Lingna He; Qingshui Li; Linan Zhu

    2012-01-01

    In order to replace traditional Internet software usage patterns and enterprise management modes, this paper proposes a new business computing mode: cloud computing. Resource scheduling strategy is a key technology in cloud computing. Based on a study of the cloud computing system structure and its mode of operation, the paper focuses on the work-scheduling and resource-allocation problems in cloud computing using an ant colony algorithm, with detailed analysis and design of the...
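The scheduling idea this record describes can be illustrated with a toy ant colony optimizer that assigns tasks to virtual machines to minimize makespan. This is a minimal sketch: all names, parameter values, and the pheromone-update rule are illustrative assumptions, not the paper's algorithm.

```python
import random

def aco_schedule(task_len, vm_speed, ants=20, iters=50, rho=0.1, seed=1):
    """Toy ant colony scheduler: assign tasks to VMs to minimize makespan.

    Pheromone tau[t][v] biases task t toward VM v; the heuristic term
    favours faster machines; evaporation rate rho forgets stale trails.
    """
    rng = random.Random(seed)
    T, V = len(task_len), len(vm_speed)
    tau = [[1.0] * V for _ in range(T)]
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            load = [0.0] * V
            assign = []
            for t in range(T):
                # probability ~ pheromone * heuristic (VM speed)
                w = [tau[t][v] * vm_speed[v] for v in range(V)]
                v = rng.choices(range(V), weights=w)[0]
                load[v] += task_len[t] / vm_speed[v]
                assign.append(v)
            cost = max(load)  # makespan of this ant's schedule
            if cost < best_cost:
                best, best_cost = assign, cost
        # evaporate all trails, then reinforce the best-so-far schedule
        for t in range(T):
            for v in range(V):
                tau[t][v] *= (1.0 - rho)
            tau[t][best[t]] += 1.0 / best_cost
    return best, best_cost
```

For four tasks of lengths 4, 4, 2, 2 on VMs of speeds 2 and 1, the optimal makespan is 4.0 (e.g. both length-4 tasks on the fast VM), which this sketch finds quickly.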

  10. Assisting at-risk community college students' acquisition of critical thinking learning strategies in human anatomy and physiology

    Science.gov (United States)

    Arburn, Theresa Morkovsky

    1998-11-01

    The purpose of this study was to investigate whether learning thinking strategies within the context of a community college course in Human Anatomy and Physiology would result in increased academic performance and the incidence of critical thinking skills. Included in the study sample were 68 community college students, many of whom would be categorized as "at-risk," who were enrolled in four sections of a Human Anatomy and Physiology class. Two of the class sections served as the experimental group and two sections served as the control group. During the course of one semester, members of the experimental group participated in the use of a student-generated questioning technique in conjunction with lecture presentations, while members of the control group did not. All students were pretested using the Learning and Study Strategies Inventory (LASSI) and the California Critical Thinking Skills Test (CCTST). Posttesting was completed using these same instruments and an end-of-course comprehensive examination. Analysis of data revealed no significant differences between the experimental and control groups with regard to their overall achievement, their ability to process information, or their demonstration of critical thinking. It was interesting to note, however, that members of the experimental group did exhibit a change in their ability to select main ideas, apply deductive reasoning, and use inference. While the use of thinking strategies within the context of the course did not effect a significant change in academic achievement or critical thinking among at-risk community college students, it should be noted that application of a non-lecture method of class participation had no negative impact on student performance. Whether more abstruse changes have occurred with regard to the acquisition of cognitive skills remains to be elucidated.

  11. The Effect of Using Jigsaw Strategy in Teaching Science on the Acquisition of Scientific Concepts among the Fourth Graders of Bani Kinana Directorate of Education

    Science.gov (United States)

    Hamadneh, Qaseem Mohammad Salim

    2017-01-01

    The study aimed to identify the effect of using Jigsaw strategy in teaching science on the acquisition of scientific concepts among the fourth graders of Bani Kinana Directorate of Education compared to the traditional way. The study sample consisted of 70 male and female students, divided into two groups: experimental and control where the…

  12. Projection matrix acquisition for cone-beam computed tomography iterative reconstruction

    Science.gov (United States)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Shi, Wenlong; Zhang, Caixin; Gao, Zongzhao

    2017-02-01

    Projection matrix computation is an essential and time-consuming part of computed tomography (CT) iterative reconstruction. In this article a novel algorithm for calculating the three-dimensional (3D) projection matrix is proposed to quickly acquire the matrix for cone-beam CT (CBCT). The volume to be reconstructed is treated as three orthogonal sets of equally spaced, parallel planes rather than as individual voxels. After the intersections of each ray with the voxel surfaces are found, the coordinates of the intersection points are compared with the voxel vertices to obtain the index values of the voxels the ray traverses. Instead of computing the ray's slope with respect to each voxel, only the positions of two points need to be compared. Finally, computer simulation is used to verify the effectiveness of the algorithm.
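The plane-intersection idea in this abstract is close in spirit to classic Siddon-style ray tracing: intersect the ray with the orthogonal sets of grid planes and read off the traversed voxel indices. A minimal 2D sketch under assumed names and a unit cell size (illustrative only, not the authors' code):

```python
def traversed_cells(p0, p1, nx, ny, cell=1.0):
    """Return the (ix, iy) grid cells crossed by the ray p0 -> p1.

    Siddon-style: intersect the ray with the two orthogonal sets of
    equally spaced grid planes, merge the parametric hit values, and
    identify the cell lying between each pair of consecutive hits.
    """
    (x0, y0), (x1, y1) = p0, p1
    dx, dy = x1 - x0, y1 - y0
    alphas = {0.0, 1.0}                       # ray endpoints in parameter space
    if dx:                                    # vertical planes x = i*cell
        for i in range(nx + 1):
            a = (i * cell - x0) / dx
            if 0.0 < a < 1.0:
                alphas.add(a)
    if dy:                                    # horizontal planes y = j*cell
        for j in range(ny + 1):
            a = (j * cell - y0) / dy
            if 0.0 < a < 1.0:
                alphas.add(a)
    alphas = sorted(alphas)
    cells = []
    for a, b in zip(alphas, alphas[1:]):
        m = (a + b) / 2.0                     # midpoint lies inside exactly one cell
        ix = int((x0 + m * dx) // cell)
        iy = int((y0 + m * dy) // cell)
        if 0 <= ix < nx and 0 <= iy < ny:
            cells.append((ix, iy))
    return cells
```

A diagonal ray across a 2x2 grid visits cells (0, 0) and (1, 1); the segment length between consecutive alpha values would give each cell's weight in the projection matrix.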

  13. Simulation of skill acquisition in sequential learning of a computer game

    DEFF Research Database (Denmark)

    Hansen, John Paulin; Nielsen, Finn Ravnsbjerg; Rasmussen, Jens

    1995-01-01

    The paper presents some theoretical assumptions about the cognitive control mechanisms of subjects learning to play a computer game. A simulation model has been developed to investigate these assumptions. The model is an automaton, reacting to instruction-like cue action rules. The prototypical performances of 23 experimental subjects at succeeding levels of training are compared to the performance of the model. The findings are interpreted in terms of a general taxonomy for cognitive task analysis.

  14. MR equipment acquisition strategies: low-field or high-field scanners

    International Nuclear Information System (INIS)

    Marti-Bonmati, L.; Kormano, M.

    1997-01-01

    Magnetic resonance (MR) field strength is one of the key aspects to consider when purchasing MR equipment. Other aspects include the gradient system, coil design, computer and pulse sequence availability, purchase cost, local reimbursement policies, and current opinion within the medical community. Our objective here is to evaluate the decision-influencing aspects of the MR market, with a focus on some specific areas such as high resolution studies, examination times, special techniques, instrumentation, open design magnets, costs and reimbursement policies, academic and industrial interests, contrast media, clinical efficacy, and finally, clinicians' preferences. Certainly the advantage of high-field is a higher signal-to-noise ratio and improved resolution. With a high-field unit, higher spatial resolution images and higher temporal resolution images can be obtained. Typical imaging times needed to produce clinically diagnostic images are about 3 times longer at 0.1 T than at 1.0 or 1.5 T. High-field-related advanced techniques, such as functional imaging, spectroscopy and microscopy, may become clinically useful in the near future. As long as there is an unlimited demand for MR examinations, it appears financially profitable to run a high-field system, despite the associated higher costs. However, if demand for MR becomes saturated, low-field systems will cause less financial strain on the reimbursement organisation and service provider. Recent emphasis on cost containment, the development of interventional techniques, the increased use of MR for patients in intensive care and operating suites, the deployment of magnets in office suites, and the development of new magnet configurations, all favour the supplementary use of low-field systems. Hence, MR units of all field strengths have a role in radiology. (orig.)

  15. Computing security strategies in finite horizon repeated Bayesian games

    KAUST Repository

    Lichun Li; Langbort, Cedric; Shamma, Jeff S.

    2017-01-01

    in the worst case. First, a security strategy that directly depends on both players' history actions is derived by refining the sequence form. Noticing that history action space grows exponentially with respect to the time horizon, this paper further presents a

  16. Strategies for Involving Communities in the Funding of Computer ...

    African Journals Online (AJOL)

    Toshiba

    An International Multidisciplinary Journal, Ethiopia. Vol. 8 (1), Serial … This is a survey design which investigated the strategies for involving communities in … logical operations on the data and finally gives the result in the form of information. … Adesina (1981) observed that the major constraints of educational financing in…

  17. Plant nutrient acquisition strategies in tundra species: at which soil depth do species take up their nitrogen?

    Science.gov (United States)

    Limpens, Juul; Heijmans, Monique; Nauta, Ake; van Huissteden, Corine; van Rijssel, Sophie

    2016-04-01

    The Arctic is warming at unprecedented rates. Increased thawing of permafrost releases nutrients locked up in the previously frozen soil layers, which may initiate shifts in vegetation composition. The direction in which the vegetation shifts will co-determine whether Arctic warming is mitigated or accelerated, making understanding successional trajectories urgent. One of the key factors influencing the competitive relationships between plant species is their access to nutrients, in particular nitrogen (N). We assessed the depth at which plant species took up N by performing a 15N tracer study, injecting 15(NH4)2SO4 at three depths (5, 15, 20 cm) into the soil in arctic tundra in north-eastern Siberia in July. In addition we explored plant nutrient acquisition strategy by analyzing natural abundances of 15N in leaves. We found that vascular plants took up 15N at all injection depths, irrespective of species, but also that species showed a clear preference for specific soil layers that coincided with their functional group (graminoids, dwarf shrubs, cryptogams). Graminoids took up most 15N at 20 cm depth, nearest to the thaw front, with grasses showing a more pronounced preference than sedges. Dwarf shrubs took up most 15N at 5 cm depth, with deciduous shrubs displaying more preference than evergreens. Cryptogams did not take up any of the supplied 15N. The natural 15N abundances confirmed the pattern of nutrient acquisition from deeper soil layers in graminoids and from shallow soil layers in both deciduous and evergreen dwarf shrubs. Our results prove that graminoids and shrubs differ in their N uptake strategies, with graminoids profiting from nutrients released at the thaw front, whereas shrubs forage in the upper soil layers. The above implies that graminoids, grasses in particular, will have a competitive advantage over shrubs as the thaw front proceeds and/or superficial soil layers dry out. Our results suggest that the vertical distribution of nutrients

  18. Computer Networking Strategies for Building Collaboration among Science Educators.

    Science.gov (United States)

    Aust, Ronald

    The development and dissemination of science materials can be associated with technical delivery systems such as the Unified Network for Informatics in Teacher Education (UNITE). The UNITE project was designed to investigate ways for using computer networking to improve communications and collaboration among university schools of education and…

  19. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require an efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
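As a rough illustration of the kind of allocation strategy such tools compare, here is a toy admission simulator for equally sized clusters. The first-fit/best-fit policies, the function name, and the capacity units are all assumptions for illustration, not the authors' tools:

```python
def simulate(capacity, num_clusters, demands, policy="first"):
    """Admit session demands into equally sized clusters; return how
    many sessions were blocked (no single cluster could host them).

    policy="first": first-fit, the simplest (and cheapest) strategy.
    policy="best":  best-fit, packs tighter at higher search cost.
    """
    free = [capacity] * num_clusters       # remaining capacity per cluster
    blocked = 0
    for d in demands:
        fits = [i for i, f in enumerate(free) if f >= d]
        if not fits:
            blocked += 1                   # session request rejected
            continue
        if policy == "best":
            i = min(fits, key=lambda j: free[j])  # tightest remaining fit
        else:
            i = fits[0]                            # first cluster that fits
        free[i] -= d
    return blocked
```

With two clusters of capacity 10, the demand sequence 6, 6, 5 blocks one session under first-fit, while 5, 5, 6 admits everything; running many such traces is how one would compare occupation across policies and cluster sizes.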

  20. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.
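The strategy of farming independent analyses out in parallel can be sketched with a worker pool. Here threads on one machine stand in for the paper's network of smaller computers, and the quadratic "analysis" is a placeholder; every name below is an assumption for illustration:

```python
from concurrent.futures import ThreadPoolExecutor

def analyze(design):
    """Placeholder for a validated analysis code scoring one design point."""
    x, y = design
    return (x - 3) ** 2 + (y + 1) ** 2

def evaluate_parallel(designs, workers=4):
    """Evaluate independent design points concurrently.

    In the paper's setting each worker would be a separate machine on
    the network; results come back in submission order, so the caller's
    data flow is unchanged relative to a serial loop.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, designs))
```

Because the design evaluations share no state, turnaround time shrinks roughly with the number of workers, which is the decomposition property the paper highlights.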

  1. Computer-aided acquisition and logistics support (CALS): Concept of Operations for Depot Maintenance

    Energy Technology Data Exchange (ETDEWEB)

    Bourgeois, N.C.; Greer, D.K.

    1993-04-01

    This CALS Concept of Operations for Depot Maintenance provides the foundation strategy and the near-term tactical plan for CALS implementation in the depot maintenance environment. The user requirements enumerated and the overarching architecture outlined serve as the primary framework for implementation planning. The seamless integration of depot maintenance business processes and supporting information systems with the emerging global CALS environment will be critical to the efficient realization of depot users' information requirements, and as such will be a fundamental theme in depot implementations.

  2. Real-time data acquisition and computation for the SSC using optical and electronic technologies

    International Nuclear Information System (INIS)

    Cantrell, C.D.; Fenyves, E.J.; Wallace, B.

    1990-01-01

    The authors discuss combinations of optical and electronic technologies that may be able to address major data-filtering and data-analysis problems at the SSC. Novel scintillation detectors and optical readout may permit the use of optical processing techniques for trigger decisions and particle tracking. Very-high-speed fiberoptic local-area networks will be necessary to pipeline data from the detectors to the triggers and from the triggers to computers. High-speed, few-processor MIMD supercomputers with advanced fiberoptic I/O technology offer a usable, cost-effective alternative to the microprocessor farms currently proposed for event selection and analysis for the SSC. The use of a real-time operating system that provides standard programming tools will facilitate all tasks, from reprogramming the detectors' event-selection criteria to detector simulation and event analysis. 34 refs., 1 fig., 1 tab

  3. Image analysis in modern ophthalmology: from acquisition to computer assisted diagnosis and telemedicine

    Science.gov (United States)

    Marrugo, Andrés G.; Millán, María S.; Cristóbal, Gabriel; Gabarda, Salvador; Sorel, Michal; Sroubek, Filip

    2012-06-01

    Medical digital imaging has become a key element of modern health care procedures. It provides visual documentation and a permanent record for the patients, and most important the ability to extract information about many diseases. Modern ophthalmology thrives and develops on the advances in digital imaging and computing power. In this work we present an overview of recent image processing techniques proposed by the authors in the area of digital eye fundus photography. Our applications range from retinal image quality assessment to image restoration via blind deconvolution and visualization of structural changes in time between patient visits. All proposed within a framework for improving and assisting the medical practice and the forthcoming scenario of the information chain in telemedicine.

  4. A real time data acquisition system using the MIL-STD-1553B bus. [for transmission of data to host computer for control law processing

    Science.gov (United States)

    Peri, Frank, Jr.

    1992-01-01

    A flight digital data acquisition system that uses the MIL-STD-1553B bus for transmission of data to a host computer for control law processing is described. The instrument, the Remote Interface Unit (RIU), can accommodate up to 16 input channels and eight output channels. The RIU employs a digital signal processor to perform local digital filtering before sending data to the host. The system allows flexible sensor and actuator data organization to facilitate quick control law computations on the host computer. The instrument can also run simple control laws autonomously without host intervention. The RIU and host computer together have replaced a similar larger, ground minicomputer system with favorable results.
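The RIU's local digital filtering before data is shipped to the host can be illustrated with a direct-form FIR filter. This is a minimal sketch under assumed tap values and names, not the instrument's actual filter:

```python
def fir_filter(samples, taps):
    """Direct-form FIR filtering of one input channel's sample stream.

    Each output is the dot product of the tap coefficients with the
    most recent samples, which is the kind of local pre-filtering a
    DSP would apply before sending channel data to the host.
    """
    out = []
    hist = [0.0] * len(taps)            # shift register of recent samples
    for x in samples:
        hist = [x] + hist[:-1]          # newest sample first
        out.append(sum(t * h for t, h in zip(taps, hist)))
    return out
```

For example, the two-tap moving average `[0.5, 0.5]` smooths a step input, settling to the steady value after one sample of transient.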

  5. Computing strategy of Alpha-Magnetic Spectrometer experiment

    International Nuclear Information System (INIS)

    Choutko, V.; Klimentov, A.

    2003-01-01

    The Alpha-Magnetic Spectrometer (AMS) is an experiment to search in space for dark matter, missing matter, and antimatter, scheduled to be flown on the International Space Station in the fall of 2005 for at least 3 consecutive years. This paper gives an overview of the AMS software with emphasis on the distributed production system based on a client/server approach. We also describe our choice of hardware components to build a processing farm with TByte RAID arrays of IDE disks and highlight the strategies that make our system different from many other experimental systems.

  6. Interactive uncertainty reduction strategies and verbal affection in computer-mediated communication

    NARCIS (Netherlands)

    Antheunis, M.L.; Schouten, A.P.; Valkenburg, P.M.; Peter, J.

    2012-01-01

    The goal of this study was to investigate the language-based strategies that computer-mediated communication (CMC) users employ to reduce uncertainty in the absence of nonverbal cues. Specifically, this study investigated the prevalence of three interactive uncertainty reduction strategies (i.e.,

  7. Effects of Computer-Assisted Jigsaw II Cooperative Learning Strategy on Physics Achievement and Retention

    Science.gov (United States)

    Gambari, Isiaka Amosa; Yusuf, Mudasiru Olalere

    2016-01-01

    This study investigated the effects of computer-assisted Jigsaw II cooperative strategy on physics achievement and retention. The study also determined how the moderating variable of achievement level affects students' performance in physics when Jigsaw II cooperative learning is used as an instructional strategy. Purposive sampling technique…

  8. Note-Taking with Computers: Exploring Alternative Strategies for Improved Recall

    Science.gov (United States)

    Bui, Dung C.; Myerson, Joel; Hale, Sandra

    2013-01-01

    Three experiments examined note-taking strategies and their relation to recall. In Experiment 1, participants were instructed either to take organized lecture notes or to try and transcribe the lecture, and they either took their notes by hand or typed them into a computer. Those instructed to transcribe the lecture using a computer showed the…

  9. New Perspectives on Computational and Cognitive Strategies for Word Sense Disambiguation

    CERN Document Server

    Kwong, Oi Yee

    2013-01-01

    Cognitive and Computational Strategies for Word Sense Disambiguation examines cognitive strategies by humans and computational strategies by machines, for WSD in parallel.  Focusing on a psychologically valid property of words and senses, author Oi Yee Kwong discusses their concreteness or abstractness and draws on psycholinguistic data to examine the extent to which existing lexical resources resemble the mental lexicon as far as the concreteness distinction is concerned. The text also investigates the contribution of different knowledge sources to WSD in relation to this very intrinsic nature of words and senses. 

  10. Computer-based system for acquisition of nuclear well log data

    International Nuclear Information System (INIS)

    Meisner, J.E.

    1983-01-01

    There is described a computer-based well logging system, for acquiring nuclear well log data, including gamma ray energy spectrum and neutron population decay rate data, and providing a real-time presentation of the data on an operator's display based on a traversal by a downhole instrument of a prescribed borehole depth interval. The system has a multichannel analyzer including a pulse height analyzer and a memory. After a spectral gamma ray pulse signal coming from a downhole instrument over a logging cable is amplified and conditioned, the pulse height analyzer converts the pulse height into a digital code by peak detection, sample-and-hold action, and analog-to-digital conversion. The digital code defines the address of a memory location or channel, corresponding to a particular gamma ray energy and having a count value to be incremented. The spectrum data is then accessed by the system central processing unit (CPU) for analysis, and routed to the operator's display for presentation as a plot of relative gamma ray emissions activity versus energy level. For acquiring neutron decay rate data, the system has a multichannel scaling unit including a memory and a memory address generator. After a burst of neutrons downhole, thermal and epithermal neutron detector pulses build up and die away. Using the neutron source trigger as an initializing reference, the address generator produces a sequence of memory address codes, each code addressing the memory for a prescribed period of time, so as to define a series of time slots. A detector pulse signal produced during a time slot results in the incrementing of the count value in an address memory location. (author)
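The pulse-height-analysis step described above (peak detection, analog-to-digital conversion, then incrementing the memory channel addressed by the digital code) amounts to histogramming pulse heights into energy channels. A minimal sketch; the channel count, voltage range, and names are assumed values for illustration, not from the patent-style description:

```python
def mca_accumulate(pulse_heights, n_channels=256, v_max=10.0):
    """Multichannel-analyzer core: build a gamma-ray energy spectrum.

    Each pulse height is converted to a channel address (the ADC step)
    and that channel's count is incremented, so the resulting array is
    relative emission activity versus energy level.
    """
    spectrum = [0] * n_channels
    for v in pulse_heights:
        ch = int(v / v_max * n_channels)       # height -> channel address
        ch = min(max(ch, 0), n_channels - 1)   # clamp under/overflow pulses
        spectrum[ch] += 1
    return spectrum
```

The multichannel-scaling mode for neutron decay works the same way with time replacing energy: the address generator selects the channel from elapsed time since the neutron burst rather than from pulse height.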

  11. Current strategies for dosage reduction in computed tomography

    International Nuclear Information System (INIS)

    May, M.S.; Wuest, W.; Lell, M.M.; Uder, M.; Kalender, W.A.; Schmidt, B.

    2012-01-01

    The potential risks of radiation exposure associated with computed tomography (CT) imaging are a reason for ongoing concern for both medical staff and patients. Radiation dose reduction is, according to the "as low as reasonably achievable" (ALARA) principle, an important issue in clinical routine, research and development. The complex interaction of preparation, examination and post-processing provides a high potential for optimization on the one hand, but a high risk of errors on the other. The radiologist is responsible for the quality of the CT examination, which requires specialized and up-to-date knowledge. Most of the techniques for radiation dose reduction are independent of the system and manufacturer. The basic principle should be radiation dose optimization without loss of diagnostic image quality, rather than mere reduction. (orig.) [de]

  12. Hypoxia and bicarbonate could limit the expression of iron acquisition genes in Strategy I plants by affecting ethylene synthesis and signaling in different ways.

    Science.gov (United States)

    García, María J; García-Mateo, María J; Lucena, Carlos; Romera, Francisco J; Rojas, Carmen L; Alcántara, Esteban; Pérez-Vicente, Rafael

    2014-01-01

    In a previous work, it was shown that bicarbonate (one of the most important factors causing Fe chlorosis in Strategy I plants) can limit the expression of several genes involved in Fe acquisition. Hypoxia is considered another important factor causing Fe chlorosis, mainly on calcareous soils. However, to date it is not known whether hypoxia aggravates Fe chlorosis by affecting bicarbonate concentration or by specific negative effects on Fe acquisition. Results found in this work show that hypoxia, generated by eliminating the aeration of the nutrient solution, can limit the expression of several Fe acquisition genes in Fe-deficient Arabidopsis, cucumber and pea plants, like the genes for ferric reductases AtFRO2, PsFRO1 and CsFRO1; iron transporters AtIRT1, PsRIT1 and CsIRT1; H(+)-ATPase CsHA1; and transcription factors AtFIT, AtbHLH38, and AtbHLH39. Interestingly, the limitation of the expression of Fe-acquisition genes by hypoxia did not occur in the Arabidopsis ethylene constitutive mutant ctr1, which suggests that the negative effect of hypoxia is related to ethylene, a hormone involved in the upregulation of Fe acquisition genes. As for hypoxia, results obtained by applying bicarbonate to the nutrient solution suggest that ethylene is also involved in its negative effect, since ACC (1-aminocyclopropane-1-carboxylic acid; the ethylene precursor) partially reversed the negative effect of bicarbonate on the expression of Fe acquisition genes. Taken together, the results obtained show that hypoxia and bicarbonate could induce Fe chlorosis by limiting the expression of Fe acquisition genes, probably because each factor negatively affects different steps of ethylene synthesis and/or signaling. © 2013 Scandinavian Plant Physiology Society.

  13. Defense strategies for cloud computing multi-site server infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Nageswara S. [ORNL]; Ma, Chris Y. T. [Hang Seng Management College, Hong Kong]; He, Fei [Texas A&M University, Kingsville, TX, USA]

    2018-01-01

    We consider cloud computing server infrastructures for big data applications, which consist of multiple server sites connected over a wide-area network. The sites house a number of servers, network elements and local-area connections, and the wide-area network plays a critical, asymmetric role of providing vital connectivity between them. We model this infrastructure as a system of systems, wherein the sites and wide-area network are represented by their cyber and physical components. These components can be disabled by cyber and physical attacks, and can also be protected against them using component reinforcements. The effects of attacks propagate within the systems, and also beyond them via the wide-area network. We characterize these effects using correlations at two levels: (a) an aggregate failure correlation function that specifies the infrastructure failure probability given the failure of an individual site or network, and (b) first-order differential conditions on system survival probabilities that characterize the component-level correlations within individual systems. We formulate a game between an attacker and a provider using utility functions composed of survival probability and cost terms. At Nash equilibrium, we derive expressions for the expected capacity of the infrastructure, given by the number of operational servers connected to the network, for sum-form, product-form and composite utility functions.

  14. A Novel Strategy for Mechanism Based Computational Drug Discovery

    Science.gov (United States)

    Subha, Kalyaanamoorthy; Kumar, Gopal Ramesh; Rajalakshmi, Rajasekaran; Aravindhan, Ganesan

    2010-01-01

    Glioma, the common brain tumor, which arises from the glial cells, offers worse prognosis and therapy than any other tumors. Despite the genetic and pathological diversities of malignant gliomas, common signaling pathways that drive cellular proliferation, survival, invasion and angiogenesis have been identified. Very often, various tyrosine kinase receptors are inappropriately activated in human brain tumors and contribute to tumor malignancy. During such tumourous states where multiple pathways are involved, a few of them are responsible for cell differentiation, proliferation and anti-apoptosis. Computational simulation studies of normal EGFR signaling in glioma, together with the mutant EGFR mediated signaling and the MAPK signaling in glioma, were carried out. There were no significant cross talks observed between the mutant EGFR and the MAPK pathways, and thus from the simulation results we propose a novel concept of ‘multiple-targeting’ that combines EGFR and Ras targeted therapy, thereby providing a better therapeutic value against glioma. Diallyl disulfide (DADS), which has been commonly used for Ras inhibition in glioma, was taken for analysis, and the effect of inhibiting the EGFR downstream signaling protein with DADS was analyzed using simulation and docking studies. PMID:24179383

  15. A Novel Strategy for Mechanism Based Computational Drug Discovery

    Directory of Open Access Journals (Sweden)

    Kalyaanamoorthy Subha

    2010-01-01

    Full Text Available Glioma, the common brain tumor, which arises from the glial cells, offers worse prognosis and therapy than any other tumors. Despite the genetic and pathological diversities of malignant gliomas, common signaling pathways that drive cellular proliferation, survival, invasion and angiogenesis have been identified. Very often, various tyrosine kinase receptors are inappropriately activated in human brain tumors and contribute to tumor malignancy. During such tumourous states where multiple pathways are involved, a few of them are responsible for cell differentiation, proliferation and anti-apoptosis. Computational simulation studies of normal EGFR signaling in glioma, together with the mutant EGFR mediated signaling and the MAPK signaling in glioma, were carried out. There were no significant cross talks observed between the mutant EGFR and the MAPK pathways, and thus from the simulation results we propose a novel concept of ‘multiple-targeting’ that combines EGFR and Ras targeted therapy, thereby providing a better therapeutic value against glioma. Diallyl disulfide (DADS), which has been commonly used for Ras inhibition in glioma, was taken for analysis, and the effect of inhibiting the EGFR downstream signaling protein with DADS was analyzed using simulation and docking studies.

  16. Mergers and Acquisitions

    OpenAIRE

    Frasch, Manfred; Leptin, Maria

    2000-01-01

    Mergers and acquisitions (M&As) are becoming a strategy of choice for organizations attempting to maintain a competitive advantage. Previous research on mergers and acquisitions suggests that acquirers do not normally benefit from acquisitions. Targets, on the other hand, have a tendency to gain positive returns in the few days surrounding merger announcements due to several characteristics of the acquisition deal. The announcement period wealth effect on acquiring firms, however, is as cle...

  17. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, this review compares the specifications and advantages of experimental and computational FBDD, and discusses limitations and future prospects.

  18. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    Science.gov (United States)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is established and a fitness function is defined according to the model; the improved differential evolution algorithm then optimizes this fitness function, using a dynamic selection strategy based on the evolution generation and a dynamic mutation strategy to balance global and local search ability. Performance tests were carried out on the CloudSim simulation platform. The experimental results show that the improved differential evolution algorithm can reduce cloud computing task execution time and save user cost, achieving good optimal scheduling of cloud computing tasks.
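
    The scheduling idea can be sketched roughly as follows. The task lengths, VM speeds, encoding, and the linearly decaying mutation factor below are all assumptions for illustration, not the authors' parameters.

```python
import random

random.seed(1)

# Toy scheduling instance: assign n tasks to m virtual machines to
# minimise makespan (all numbers are invented for illustration).
lengths = [14, 7, 21, 3, 12, 9, 18, 5]     # task sizes
speeds = [1.0, 2.0, 1.5]                   # VM processing speeds
n, m = len(lengths), len(speeds)

def decode(vec):
    # Each continuous gene in [0, m) is floored to a VM index.
    return [min(int(g), m - 1) for g in vec]

def makespan(vec):
    load = [0.0] * m
    for length, vm in zip(lengths, decode(vec)):
        load[vm] += length / speeds[vm]
    return max(load)

def evolve(pop_size=20, gens=200, cr=0.9):
    pop = [[random.uniform(0, m) for _ in range(n)] for _ in range(pop_size)]
    for gen in range(gens):
        f = 0.9 - 0.5 * gen / gens         # dynamic mutation factor (assumption)
        for i in range(pop_size):
            r1, r2, r3 = random.sample([k for k in range(pop_size) if k != i], 3)
            trial = [pop[i][d] if random.random() > cr else
                     min(max(pop[r1][d] + f * (pop[r2][d] - pop[r3][d]), 0.0),
                         m - 1e-9)
                     for d in range(n)]
            if makespan(trial) <= makespan(pop[i]):   # greedy selection
                pop[i] = trial
    return min(pop, key=makespan)

best = evolve()
print(round(makespan(best), 2))
```

    Shrinking the mutation factor over generations shifts the search from global exploration toward local refinement, which is the balance the abstract describes.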

  19. Unsupervised Language Acquisition

    Science.gov (United States)

    de Marcken, Carl

    1996-11-01

    This thesis presents a computational theory of unsupervised language acquisition, precisely defining procedures for learning language from ordinary spoken or written utterances, with no explicit help from a teacher. The theory is based heavily on concepts borrowed from machine learning and statistical estimation. In particular, learning takes place by fitting a stochastic, generative model of language to the evidence. Much of the thesis is devoted to explaining conditions that must hold for this general learning strategy to arrive at linguistically desirable grammars. The thesis introduces a variety of technical innovations, among them a common representation for evidence and grammars, and a learning strategy that separates the "content" of linguistic parameters from their representation. Algorithms based on it suffer from few of the search problems that have plagued other computational approaches to language acquisition. The theory has been tested on problems of learning vocabularies and grammars from unsegmented text and continuous speech, and mappings between sound and representations of meaning. It performs extremely well on various objective criteria, acquiring knowledge that causes it to assign almost exactly the same structure to utterances as humans do. This work has application to data compression, language modeling, speech recognition, machine translation, information retrieval, and other tasks that rely on either structural or stochastic descriptions of language.
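
    A minimal flavor of the approach: given a unigram lexicon (the probabilities below are made up; the thesis learns them from raw data), a Viterbi search assigns the most probable word structure to an unsegmented utterance.

```python
from math import log

def viterbi_segment(text, probs, max_len=7):
    """Most probable segmentation of `text` under a unigram lexicon."""
    # best[i] holds (log-probability, segmentation) of text[:i].
    best = [(0.0, [])] + [(float("-inf"), None)] * len(text)
    for i in range(1, len(text) + 1):
        for j in range(max(0, i - max_len), i):
            word = text[j:i]
            score = best[j][0] + log(probs.get(word, 1e-6))  # smoothed OOV
            if score > best[i][0]:
                best[i] = (score, best[j][1] + [word])
    return best[-1][1]

# Made-up lexicon probabilities; the thesis estimates these from evidence.
lexicon = {"the": 0.3, "dog": 0.2, "cat": 0.2, "ate": 0.1, "a": 0.05}
print(viterbi_segment("thedogatethecat", lexicon))
# → ['the', 'dog', 'ate', 'the', 'cat']
```

    The full system alternates between segmenting under the current model and re-estimating the model from the segmentations, with additional machinery (not shown) to avoid degenerate grammars.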

  20. Language Acquisition without an Acquisition Device

    Science.gov (United States)

    O'Grady, William

    2012-01-01

    Most explanatory work on first and second language learning assumes the primacy of the acquisition phenomenon itself, and a good deal of work has been devoted to the search for an "acquisition device" that is specific to humans, and perhaps even to language. I will consider the possibility that this strategy is misguided and that language…

  1. Computational Evidence that Frequency Trajectory Theory Does Not Oppose but Emerges from Age-of-Acquisition Theory

    Science.gov (United States)

    Mermillod, Martial; Bonin, Patrick; Meot, Alain; Ferrand, Ludovic; Paindavoine, Michel

    2012-01-01

    According to the age-of-acquisition hypothesis, words acquired early in life are processed faster and more accurately than words acquired later. Connectionist models have begun to explore the influence of the age/order of acquisition of items (and also their frequency of encounter). This study attempts to reconcile two different methodological and…

  2. Axial power deviation control strategy and computer simulation for Daya Bay Nuclear Power Station

    International Nuclear Information System (INIS)

    Liao Yehong; Zhou Xiaoling; Xiao Min

    2004-01-01

    Daya Bay Nuclear Power Station has a very tight operation diagram, especially on its right side, so successful control of the axial power deviation is crucial to the nuclear safety of this PWR. After analyzing the effects of various core characteristics on the axial power distribution, several axial power deviation control strategies have been proposed to comply with different power-varying operation scenarios. Application and computer simulation of the strategies have shown that our predictions of axial power deviation evolution are comparable to the measured values, and that our control strategies are effective. Engineering experience shows that the application of our methodology can accurately predict the transient of the axial power deviation, and it has therefore become a useful tool for reactor operation and safety control. This paper presents the axial power control characteristics, reactor operation strategy research, computer simulation, and comparison with measurement results at Daya Bay Nuclear Power Station. (author)

  3. Temporal resolution measurement of 128-slice dual source and 320-row area detector computed tomography scanners in helical acquisition mode using the impulse method.

    Science.gov (United States)

    Hara, Takanori; Urikura, Atsushi; Ichikawa, Katsuhiro; Hoshino, Takashi; Nishimaru, Eiji; Niwa, Shinji

    2016-04-01

    To analyse the temporal resolution (TR) of modern computed tomography (CT) scanners using the impulse method, and to assess the actual maximum TR of the respective helical acquisition modes. To assess the actual TR of the helical acquisition modes of a 128-slice dual-source CT (DSCT) scanner and a 320-row area detector CT (ADCT) scanner, we measured the TRs of various acquisition combinations of pitch factor (P) and gantry rotation time (R). The TR of the helical acquisition modes for the 128-slice DSCT scanner continuously improved with a shorter gantry rotation time and a greater pitch factor. However, for the 320-row ADCT scanner with a pitch factor of >1.0, the TR was approximately one half of the gantry rotation time. The maximum TR values of the single- and dual-source helical acquisition modes for the 128-slice DSCT scanner were 0.138 (R/P=0.285/1.5) and 0.074 s (R/P=0.285/3.2), and the maximum TR values of the 64×0.5- and 160×0.5-mm detector configurations of the helical acquisition modes for the 320-row ADCT scanner were 0.120 (R/P=0.275/1.375) and 0.195 s (R/P=0.3/0.6), respectively. Because the TR of a CT scanner is not accurately depicted in the specifications of the individual scanner, appropriate acquisition conditions should be determined based on the actual TR measurement. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  4. Computing the Pareto-Nash equilibrium set in finite multi-objective mixed-strategy games

    Directory of Open Access Journals (Sweden)

    Victoria Lozan

    2013-10-01

    Full Text Available The Pareto-Nash equilibrium set (PNES is described as intersection of graphs of efficient response mappings. The problem of PNES computing in finite multi-objective mixed-strategy games (Pareto-Nash games is considered. A method for PNES computing is studied. Mathematics Subject Classification 2010: 91A05, 91A06, 91A10, 91A43, 91A44.

  5. Reference Computational Meshing Strategy for Computational Fluid Dynamics Simulation of Departure from Nucleate Boiling

    Energy Technology Data Exchange (ETDEWEB)

    Pointer, William David [ORNL

    2017-08-01

    The objective of this effort is to establish a strategy and process for generation of suitable computational mesh for computational fluid dynamics simulations of departure from nucleate boiling in a 5 by 5 fuel rod assembly held in place by PWR mixing vane spacer grids. This mesh generation process will support ongoing efforts to develop, demonstrate and validate advanced multi-phase computational fluid dynamics methods that enable more robust identification of dryout conditions and DNB occurrence. Building upon prior efforts and experience, multiple computational meshes were developed using the native mesh generation capabilities of the commercial CFD code STAR-CCM+. These meshes were used to simulate two test cases from the Westinghouse 5 by 5 rod bundle facility. The sensitivity of predicted quantities of interest to the mesh resolution was then established using two evaluation methods, the Grid Convergence Index method and the Least Squares method. This evaluation suggests that the Least Squares method can reliably establish the uncertainty associated with local parameters such as vector velocity components at a point in the domain or surface-averaged quantities such as outlet velocity magnitude. However, neither method is suitable for characterization of uncertainty in global extrema such as peak fuel surface temperature, primarily because such parameters are not necessarily associated with a fixed point in space. This shortcoming is significant because the current generation algorithm for identification of DNB event conditions relies on identification of such global extrema. Ongoing efforts to identify DNB based on local surface conditions will address this challenge.
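
    The Grid Convergence Index mentioned above follows Roache's standard three-mesh formulation; a sketch with invented solution values, not the report's data:

```python
from math import log

def gci(f_fine, f_med, f_coarse, r, fs=1.25):
    """Roache's Grid Convergence Index for three systematically refined meshes."""
    p = log(abs((f_coarse - f_med) / (f_med - f_fine))) / log(r)  # observed order
    e21 = abs((f_med - f_fine) / f_fine)       # relative error, fine vs. medium
    return fs * e21 / (r ** p - 1), p

# Invented solution values (e.g. outlet velocity magnitude) on fine, medium
# and coarse meshes with a refinement ratio of 2; not data from the report.
uncertainty, order = gci(1.000, 1.010, 1.050, r=2.0)
print(f"observed order = {order:.2f}, GCI = {100 * uncertainty:.2f}%")
```

    The GCI reports a banded relative-error estimate on the fine-mesh solution; because it is evaluated at a fixed location, it inherits exactly the limitation the abstract notes for global extrema that move between meshes.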

  6. A Computer Assisted Method to Track Listening Strategies in Second Language Learning

    Science.gov (United States)

    Roussel, Stephanie

    2011-01-01

    Many studies about listening strategies are based on what learners report while listening to an oral message in the second language (Vandergrift, 2003; Graham, 2006). By recording a video of the computer screen while L2 learners (L1 French) were listening to an MP3-track in German, this study uses a novel approach and recent developments in…

  7. Portraits of PBL: Course Objectives and Students' Study Strategies in Computer Engineering, Psychology and Physiotherapy.

    Science.gov (United States)

    Dahlgren, Madeleine Abrandt

    2000-01-01

    Compares the role of course objectives in relation to students' study strategies in problem-based learning (PBL). Results comprise data from three PBL programs at Linkopings University (Sweden), in physiotherapy, psychology, and computer engineering. Faculty provided course objectives to function as supportive structures and guides for students'…

  8. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

  9. Computer-generated versus nurse-determined strategy for incubator humidity and time to regain birthweight

    NARCIS (Netherlands)

    Helder, Onno K.; Mulder, Paul G. H.; van Goudoever, Johannes B.

    2008-01-01

    To compare effects on premature infants' weight gain of a computer-generated and a nurse-determined incubator humidity strategy. An optimal humidity protocol is thought to reduce time to regain birthweight. Prospective randomized controlled design. Level IIIC neonatal intensive care unit in the

  10. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  11. Computing in the Curriculum: Challenges and Strategies from a Teacher's Perspective

    Science.gov (United States)

    Sentance, Sue; Csizmadia, Andrew

    2017-01-01

    Computing is being introduced into the curriculum in many countries. Teachers' perspectives enable us to discover what challenges this presents, and also the strategies teachers claim to be using successfully in teaching the subject across primary and secondary education. The study described in this paper was carried out in the UK in 2014 where…

  12. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    Science.gov (United States)

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses, as MMITs should be effective in increasing student knowledge and should positively impact motivation and learning strategies without increasing costs. This…

  13. A practical strategy for the characterization of ponicidin metabolites in vivo and in vitro by UHPLC-Q-TOF-MS based on nontargeted SWATH data acquisition.

    Science.gov (United States)

    Xie, Weiwei; Jin, Yiran; Hou, Ludan; Ma, Yinghua; Xu, Huijun; Zhang, Kerong; Zhang, Lantong; Du, Yingfeng

    2017-10-25

    Ponicidin is an active natural ent-kaurane diterpenoid ingredient originating from many Isodon herbs and is expected to become a new anticancer agent. In this study, a practical strategy was developed for the identification of ponicidin metabolites in vivo and in vitro utilizing ultra-high-performance liquid chromatography coupled with hybrid triple quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF-MS). The analytical strategy was as follows: potential ponicidin metabolites were detected by a novel on-line data acquisition approach, i.e., sequential window acquisition of all theoretical fragment-ion spectra (SWATH™). Compared to the traditional information-dependent acquisition (IDA) method, SWATH™ significantly improved the hit rate of low-level or trace metabolites because it could obtain all MS/MS spectra. Moreover, many data post-processing methods were used to deduce the metabolite structures. As a result, a total of 20 metabolites were characterized in vivo and in vitro. The results showed that ponicidin could undergo general metabolic reactions, such as oxidation, reduction, hydrolysis, methylation and glucuronidation. Furthermore, there was an obvious difference in the ponicidin metabolites among four species in vitro. This is the first time that the SWATH™ data acquisition mode has been used to characterize ponicidin metabolites in trace amounts or in a biological matrix. These results not only provided a better understanding of the safety and efficacy of ponicidin but also showed a valuable methodology for the identification of other ent-kaurane diterpenoid metabolites. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. A Longitudinal Study of Handwriting Skills in Pre-Schoolers: The Acquisition of Syllable Oriented Programming Strategies

    Science.gov (United States)

    Soler Vilageliu, Olga; Kandel, Sonia

    2012-01-01

    Previous studies have shown the relevance of the syllable as a programming unit in handwriting production, both in adults and elementary school children. This longitudinal study focuses on the acquisition of writing skills in a group of preschoolers. It examines how and when the syllable structure of the word starts regulating motor programming in…

  15. TH-E-17A-07: Improved Cine Four-Dimensional Computed Tomography (4D CT) Acquisition and Processing Method

    International Nuclear Information System (INIS)

    Castillo, S; Castillo, R; Castillo, E; Pan, T; Ibbott, G; Balter, P; Hobbs, B; Dai, J; Guerrero, T

    2014-01-01

    Purpose: Artifacts arising from the 4D CT acquisition and post-processing methods add systematic uncertainty to the treatment planning process. We propose an alternate cine 4D CT acquisition and post-processing method to consistently reduce artifacts, and explore patient parameters indicative of image quality. Methods: In an IRB-approved protocol, 18 patients with primary thoracic malignancies received a standard cine 4D CT acquisition followed by an oversampling 4D CT that doubled the number of images acquired. A second cohort of 10 patients received the clinical 4D CT plus 3 oversampling scans for intra-fraction reproducibility. The clinical acquisitions were processed by the standard phase sorting method. The oversampling acquisitions were processed using Dijkstra's algorithm to optimize an artifact metric over available image data. Image quality was evaluated with a one-way mixed ANOVA model using a correlation-based artifact metric calculated from the final 4D CT image sets. Spearman correlations and a linear mixed model tested the association between breathing parameters, patient characteristics, and image quality. Results: The oversampling 4D CT scans reduced artifact presence significantly by 27% and 28% for the first and second cohorts, respectively. From cohort 2, the inter-replicate deviation for the oversampling method was within approximately 13% of the cross-scan average at the 0.05 significance level. Artifact presence for both clinical and oversampling methods was significantly correlated with breathing period (ρ=0.407, p-value<0.032 clinical; ρ=0.296, p-value<0.041 oversampling). Artifact presence in the oversampling method was significantly correlated with the amount of data acquired (ρ=-0.335, p-value<0.02), indicating decreased artifact presence with increased breathing cycles per scan location. Conclusion: The 4D CT oversampling acquisition with optimized sorting reduced artifact presence significantly and reproducibly compared to the phase

  16. Retention strategies and factors associated with missed visits among low income women at increased risk of HIV acquisition in the US (HPTN 064).

    Science.gov (United States)

    Haley, Danielle F; Lucas, Jonathan; Golin, Carol E; Wang, Jing; Hughes, James P; Emel, Lynda; El-Sadr, Wafaa; Frew, Paula M; Justman, Jessica; Adimora, Adaora A; Watson, Christopher Chauncey; Mannheimer, Sharon; Rompalo, Anne; Soto-Torres, Lydia; Tims-Cook, Zandraetta; Carter, Yvonne; Hodder, Sally L

    2014-04-01

    Women at high-risk for HIV acquisition often face challenges that hinder their retention in HIV prevention trials. These same challenges may contribute to missed clinical care visits among HIV-infected women. This article, informed by the Gelberg-Andersen Behavioral Model for Vulnerable Populations, identifies factors associated with missed study visits and describes the multifaceted retention strategies used by study sites. HPTN 064 was a multisite, longitudinal HIV seroincidence study in 10 US communities. Eligible women were aged 18-44 years, resided in a census tract/zipcode with high poverty and HIV prevalence, and self-reported ≥1 personal or sex partner behavior related to HIV acquisition. Multivariate analyses of predisposing (e.g., substance use) and enabling (e.g., unmet health care needs) characteristics, and study attributes (i.e., recruitment venue, time of enrollment) identified factors associated with missed study visits. Retention strategies included: community engagement; interpersonal relationship building; reduction of external barriers; staff capacity building; and external tracing. Visit completion was 93% and 94% at 6 and 12 months. Unstable housing and later date of enrollment were associated with increased likelihood of missed study visits. Black race, recruitment from an outdoor venue, and financial responsibility for children were associated with greater likelihood of attendance. Multifaceted retention strategies may reduce missed study visits. Knowledge of factors associated with missed visits may help to focus efforts.

  17. A Cloud-Computing-Based Data Placement Strategy in High-Speed Railway

    Directory of Open Access Journals (Sweden)

    Hanning Wang

    2012-01-01

    Full Text Available As an important component of China's transportation data sharing system, high-speed railway data sharing is a typical application of data-intensive computing. Currently, most high-speed railway data is shared in a cloud computing environment. Thus, there is an urgent need for an effective cloud-computing-based data placement strategy for high-speed railway. In this paper, a new data placement strategy, named the hierarchical structure data placement strategy, is proposed. The proposed method combines the semidefinite programming algorithm with the dynamic interval mapping algorithm. The semidefinite programming algorithm is suitable for the placement of files with multiple replications, ensuring that different replications of a file are placed on different storage devices, while the dynamic interval mapping algorithm ensures better self-adaptability of the data storage system. A hierarchical data placement strategy is proposed for large-scale networks. A new theoretical analysis is provided and compared with several previous data placement approaches, and experiments demonstrate the efficacy of the proposed strategy.
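
    The interval mapping component can be sketched as follows. The device names, capacities, and the salt-probing rule for keeping replicas on distinct devices are all assumptions for illustration, not the paper's algorithm.

```python
import hashlib

# Interval-mapping sketch: each storage device owns a subinterval of [0, 1)
# proportional to its capacity, so a capacity change only rescales the
# boundaries instead of remapping all stored data.
capacities = {"dev0": 2.0, "dev1": 1.0, "dev2": 1.0}

def boundaries(caps):
    total, acc, edges = sum(caps.values()), 0.0, []
    for name in sorted(caps):
        acc += caps[name] / total
        edges.append((acc, name))
    return edges

def place(filename, n_replicas, caps):
    """Map each replica of `filename` to a distinct device."""
    edges = boundaries(caps)
    chosen, salt = [], 0
    while len(chosen) < min(n_replicas, len(caps)):
        digest = hashlib.sha256(f"{filename}:{salt}".encode()).hexdigest()
        point = int(digest, 16) / 16 ** len(digest)   # uniform in [0, 1)
        device = next(name for edge, name in edges if point < edge)
        if device not in chosen:          # replicas land on distinct devices
            chosen.append(device)
        salt += 1
    return chosen

print(place("train_schedule.csv", 3, capacities))
```

    Probing with an incremented salt is a simple stand-in for the paper's replica-separation guarantee; the semidefinite programming step that optimizes the placement is not sketched here.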

  18. Influence of Learning Strategy of Cognitive Conflict on Student Misconception in Computational Physics Course

    Science.gov (United States)

    Akmam, A.; Anshari, R.; Amir, H.; Jalinus, N.; Amran, A.

    2018-04-01

    Misconception is one of the factors that leads students to choose unsuitable problem-solving methods. The Computational Physics course is a major subject in the Department of Physics, FMIPA UNP Padang. A recent problem in Computational Physics learning is that students have difficulties in constructing knowledge, indicated by student learning outcomes that do not achieve mastery learning. The root of the problem is students' weak critical thinking ability. Student critical thinking can be improved using cognitive conflict learning strategies. This research aims to determine the effect of a cognitive conflict learning strategy on student misconceptions in the Computational Physics Course at the Department of Physics, Faculty of Mathematics and Science, Universitas Negeri Padang. The experiment used a before-after design with a sample of 60 students selected by cluster random sampling. Data were analyzed using repeated-measures ANOVA. The cognitive conflict learning strategy has a significant effect on student misconceptions in the Computational Physics Course.

  19. A strategy for improved computational efficiency of the method of anchored distributions

    Science.gov (United States)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function called bundling that relaxes the requirement for high quantities of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability of a set of similar model parametrizations "bundle" replicating field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.
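
    A minimal sketch of the bundling idea (the scalar forward model, grid-based bundling rule, and Gaussian likelihood below are stand-ins, not the paper's 3-D flow and transport setup):

```python
import random
from math import exp
from collections import Counter

random.seed(0)

# Rather than running the forward model for every sampled parametrization,
# group similar parametrizations into bundles and run the model once per
# bundle, weighting each evaluation by the bundle size.
def forward_model(k):
    return 3.0 * k + 1.0            # stand-in for an expensive simulator

observation, noise = 7.3, 0.5       # invented field measurement and error

def likelihood(pred):
    return exp(-0.5 * ((pred - observation) / noise) ** 2)

samples = [random.uniform(0.0, 4.0) for _ in range(10000)]

# "Bundle" parametrizations by rounding onto a coarse grid.
bundles = Counter(round(k, 1) for k in samples)

exact = sum(likelihood(forward_model(k)) for k in samples) / len(samples)
bundled = sum(n * likelihood(forward_model(c))
              for c, n in bundles.items()) / len(samples)
print(len(bundles), "forward runs instead of", len(samples))
```

    The bundled estimate tracks the exact Monte Carlo average at a small fraction of the forward-model cost; the paper's contribution lies in how similarity among parametrizations is enforced and how convergence of the likelihood is detected, which this sketch does not attempt.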

  20. A Fast GPU-accelerated Mixed-precision Strategy for Fully Nonlinear Water Wave Computations

    DEFF Research Database (Denmark)

    Glimberg, Stefan Lemvig; Engsig-Karup, Allan Peter; Madsen, Morten G.

    2011-01-01

    We present performance results of a mixed-precision strategy developed to improve a recently developed massively parallel GPU-accelerated tool for fast and scalable simulation of unsteady fully nonlinear free surface water waves over uneven depths (Engsig-Karup et al. 2011). The underlying wave…-preconditioned defect correction method. The strategy improves performance by exploiting architectural features of modern GPUs for mixed-precision computations, and is tested in a recently developed generic library for fast prototyping of PDE solvers. The new wave tool is applicable to solve and analyze…
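
    The core of a mixed-precision defect correction iteration can be sketched as follows (a dense NumPy stand-in, not the paper's GPU wave solver): solve cheaply in low precision, but accumulate residuals and corrections in double precision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a well-conditioned test system (illustrative, not a wave model).
n = 200
A = rng.standard_normal((n, n)) + n * np.eye(n)
b = rng.standard_normal(n)
A32 = A.astype(np.float32)                         # low-precision operator

x = np.zeros(n)
for _ in range(10):
    r = b - A @ x                                  # residual in float64
    d = np.linalg.solve(A32, r.astype(np.float32)) # low-precision correction
    x = x + d.astype(np.float64)                   # accumulate in float64

print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))
```

    Each sweep contracts the error by roughly the accuracy of the low-precision solve, so a handful of cheap iterations recovers double-precision residuals; a practical implementation would factor the float32 operator once and reuse it (np.linalg.solve refactors on every call).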

  1. A strategy for the phased replacement of CANDU digital control computers

    International Nuclear Information System (INIS)

    Hepburn, G.A.

    2001-01-01

    Significant developments have occurred with respect to the replacement of the Plant Digital Control Computers (DCCs) on CANDU plants in the past six months. This paper summarises the conclusions of the condition assessment carried out on these machines at Point Lepreau Generating Station, and describes a strategy for a phased transition to a replacement system based on today's technology. Most elements of the strategy are already in place, and sufficient technical work has been done to allow those components which have been assessed as requiring prompt attention to be replaced in a matter of months. (author)

  2. Teaching strategies applied to teaching computer networks in Engineering in Telecommunications and Electronics

    Directory of Open Access Journals (Sweden)

    Elio Manuel Castañeda-González

    2016-07-01

    Full Text Available Because of the large impact of computer networks today, their study in related fields such as Telecommunications and Electronics Engineering holds great appeal for students. However, when the content lacks a strong practical component, this interest can decrease considerably. This paper proposes the use of teaching strategies, analogies, media, and interactive applications that enhance the teaching of the computer networks discipline and encourage its study. It starts from an analysis of how the discipline is currently taught, followed by a description of each strategy and its respective contribution to student learning.

  3. Functioning strategy study on control systems of large physical installations used with a digital computer

    International Nuclear Information System (INIS)

    Bel'man, L.B.; Lavrikov, S.A.; Lenskij, O.D.

    1975-01-01

    A criterion is proposed for evaluating the functioning efficiency of control systems for large physical installations operated by means of a control computer. The criteria are the object utilization factor and the computer load factor. Different strategies of control system functioning are described, and a comparative analysis of them is made, including the choice of such important parameters as the sampling time and the parameter correction time. A single factor for evaluating the system's functioning efficiency is introduced, and its dependence on the sampling interval is given. Using the attached diagrams, it is easy to find the optimum value of the sampling interval and the corresponding maximum value of the proposed single efficiency factor.

  4. Incorporating electronic-based and computer-based strategies: graduate nursing courses in administration.

    Science.gov (United States)

    Graveley, E; Fullerton, J T

    1998-04-01

    The use of electronic technology allows faculty to improve their course offerings. Four graduate courses in nursing administration were contemporized to incorporate fundamental computer-based skills that would be expected of graduates in the work setting. Principles of adult learning offered a philosophical foundation that guided course development and revision. Course delivery strategies included computer-assisted instructional modules, e-mail interactive discussion groups, and use of the electronic classroom. Classroom seminar discussions and two-way interactive video conferencing focused on group resolution of problems derived from employment settings and assigned readings. Using these electronic technologies, a variety of courses can be revised to accommodate the learners' needs.

  5. Effects of a Computer-Assisted Concept Mapping Learning Strategy on EFL College Students' English Reading Comprehension

    Science.gov (United States)

    Liu, Pei-Lin; Chen, Chiu-Jung; Chang, Yu-Ju

    2010-01-01

    The purpose of this research was to investigate the effects of a computer-assisted concept mapping learning strategy on EFL college learners' English reading comprehension. The research questions were: (1) what was the influence of the computer-assisted concept mapping learning strategy on different learners' English reading comprehension? (2) did…

  6. Optimization of helical acquisition parameters to preserve uniformity of mouse whole body using multipinhole collimator in single-photon emission computed tomography

    Directory of Open Access Journals (Sweden)

    Naoyuki Ukon

    Full Text Available Focusing on whole-body uniformity in small-animal single-photon emission computed tomography (SPECT), we examined the optimal helical acquisition parameters using five-pinhole collimators for mouse imaging. SPECT images of an 80-mm-long cylindrical phantom filled with 99mTc solution were acquired using an Inveon multimodality imaging platform. The bed travels used in this study were 0, 30, 60, 90 and 120 mm, and the numbers of revolutions traversed during the SPECT scan were 1.0, 2.0, 3.0, 4.0, 5.0 and 7.0. Artifacts that degrade uniformity in reconstructed images were conspicuous when the bed travel was smaller than the object length. Regarding the distal-to-center ratio (DCR) of SPECT values along the object's axial direction, the DCR nearest to the ideal ratio of 1.00 was 1.02, obtained with 4.0 revolutions and a bed travel of 120 mm. Moreover, helical acquisition using these parameters suppressed the formation of artifacts. We propose the following optimal parameters for whole-body helical SPECT: the bed travel should be sufficiently larger than the object length, and 4.0 or more revolutions are required for a pitch of approximately 30 mm/revolution. Optimal acquisition parameters that preserve uniformity in SPECT should contribute to accurate quantification of whole-body biodistribution. Keywords: Helical acquisition, Multipinhole collimator, Computed tomography, SPECT
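
The record's rule of thumb (bed travel sufficiently larger than the object length, at least 4 revolutions, pitch of about 30 mm/revolution) can be condensed into a small helper. The function below is a hypothetical illustration distilled from the abstract, not code from the study:

```python
import math

def helical_params(object_length_mm, pitch_mm_per_rev=30.0):
    """Suggest helical-acquisition parameters per the abstract's rule of
    thumb: bed travel larger than the object, at least 4 revolutions,
    and a pitch of roughly 30 mm/revolution."""
    revolutions = max(4, math.ceil(object_length_mm / pitch_mm_per_rev))
    bed_travel = revolutions * pitch_mm_per_rev
    return bed_travel, revolutions

# For the study's 80-mm phantom this yields 4 revolutions and 120 mm of
# bed travel, matching the parameters the study found optimal.
print(helical_params(80.0))
```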

  7. Learning Support Assessment Study of a Computer Simulation for the Development of Microbial Identification Strategies

    Directory of Open Access Journals (Sweden)

    Tristan E. Johnson

    2009-12-01

    Full Text Available This paper describes a study that examined how microbiology students construct knowledge of bacterial identification while using a computer simulation. The purpose of this study was to understand how the simulation affects the cognitive processing of students during thinking, problem solving, and learning about bacterial identification and to determine how the simulation facilitates the learning of a domain-specific problem-solving strategy. As part of an upper-division microbiology course, five students participated in several simulation assignments. The data were collected using think-aloud protocol and video action logs as the students used the simulation. The analysis revealed two major themes that determined the performance of the students: Simulation Usage (how the students used the software features) and Problem-Solving Strategy Development (the strategy level students started with and the skill level they achieved when they completed their use of the simulation). Several conclusions emerged from the analysis of the data: (i) The simulation affects various aspects of cognitive processing by creating an environment that makes it possible to practice the application of a problem-solving strategy. The simulation was used as an environment that allowed students to practice the cognitive skills required to solve an unknown. (ii) Identibacter (the computer simulation) may be considered a cognitive tool to facilitate the learning of a bacterial identification problem-solving strategy. (iii) The simulation characteristics did support student learning of a problem-solving strategy. (iv) Students demonstrated problem-solving strategy development specific to bacterial identification. (v) Participants demonstrated improved performance from their repeated use of the simulation.

  8. Acquisition Research Program Homepage

    OpenAIRE

    2015-01-01

    Includes an image of the main page on this date and compressed file containing additional web pages. Established in 2003, Naval Postgraduate School’s (NPS) Acquisition Research Program provides leadership in innovation, creative problem solving and an ongoing dialogue, contributing to the evolution of Department of Defense acquisition strategies.

  9. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e. a new mouse driver that replaces the standard mouse driver and is able to monitor mouse movement and intercept click actions). Initially, both…

  10. Detection of parathyroid adenomas using a monophasic dual-energy computed tomography acquisition: diagnostic performance and potential radiation dose reduction

    International Nuclear Information System (INIS)

    Leiva-Salinas, Carlos; Flors, Lucia; Durst, Christopher R.; Hou, Qinghua; Mukherjee, Sugoto; Patrie, James T.; Wintermark, Max

    2016-01-01

    The aims of the study were to compare the diagnostic performance of a combination of virtual non-contrast (VNC) images and arterial images obtained from a single-phase dual-energy CT (DECT) acquisition with that of standard non-contrast and arterial images from a biphasic protocol, and to study the potential radiation dose reduction of the former approach. All DECT examinations performed for evaluation of parathyroid adenomas during a 13-month period were retrospectively reviewed. An initial single-energy unenhanced acquisition was followed by a dual-energy arterial phase acquisition. "Virtual non-contrast" images were generated from the dual-energy acquisition. Two independent and blinded radiologists evaluated three different sets of images during three reading sessions: single arterial phase, single-phase DECT (virtual non-contrast and arterial phase), and standard biphasic protocol (true non-contrast and arterial phase). The accuracy of interpretation in lateralizing an adenoma to the side of the neck and localizing it to a quadrant in the neck was evaluated. Sixty patients (mean age, 65.5 years; age range, 38-87 years) were included in the study. The lateralization and localization accuracy, sensitivity, and positive predictive value (PPV) and negative predictive value (NPV) of the different image datasets were comparable. The combination of VNC and arterial images was more specific than arterial images alone for lateralizing a parathyroid lesion (OR = 1.93, p = 0.043). The use of the single-phase protocol resulted in a calculated radiation exposure reduction of 52.8 %. Virtual non-contrast and arterial images from a single DECT acquisition showed similar diagnostic accuracy to a biphasic protocol, providing a significant dose reduction. (orig.)

  11. Detection of parathyroid adenomas using a monophasic dual-energy computed tomography acquisition: diagnostic performance and potential radiation dose reduction

    Energy Technology Data Exchange (ETDEWEB)

    Leiva-Salinas, Carlos; Flors, Lucia; Durst, Christopher R.; Hou, Qinghua; Mukherjee, Sugoto [University of Virginia, Department of Radiology, Division of Neuroradiology, Charlottesville, VA (United States); Patrie, James T. [University of Virginia, Department of Public Health Sciences, Charlottesville, VA (United States); Wintermark, Max [Stanford University, Department of Radiology, Palo Alto, CA (United States)

    2016-11-15

    The aims of the study were to compare the diagnostic performance of a combination of virtual non-contrast (VNC) images and arterial images obtained from a single-phase dual-energy CT (DECT) acquisition with that of standard non-contrast and arterial images from a biphasic protocol, and to study the potential radiation dose reduction of the former approach. All DECT examinations performed for evaluation of parathyroid adenomas during a 13-month period were retrospectively reviewed. An initial single-energy unenhanced acquisition was followed by a dual-energy arterial phase acquisition. "Virtual non-contrast" images were generated from the dual-energy acquisition. Two independent and blinded radiologists evaluated three different sets of images during three reading sessions: single arterial phase, single-phase DECT (virtual non-contrast and arterial phase), and standard biphasic protocol (true non-contrast and arterial phase). The accuracy of interpretation in lateralizing an adenoma to the side of the neck and localizing it to a quadrant in the neck was evaluated. Sixty patients (mean age, 65.5 years; age range, 38-87 years) were included in the study. The lateralization and localization accuracy, sensitivity, and positive predictive value (PPV) and negative predictive value (NPV) of the different image datasets were comparable. The combination of VNC and arterial images was more specific than arterial images alone for lateralizing a parathyroid lesion (OR = 1.93, p = 0.043). The use of the single-phase protocol resulted in a calculated radiation exposure reduction of 52.8 %. Virtual non-contrast and arterial images from a single DECT acquisition showed similar diagnostic accuracy to a biphasic protocol, providing a significant dose reduction. (orig.)

  12. A Strategy for Automatic Performance Tuning of Stencil Computations on GPUs

    Directory of Open Access Journals (Sweden)

    Joseph D. Garvey

    2018-01-01

    Full Text Available We propose and evaluate a novel strategy for tuning the performance of a class of stencil computations on Graphics Processing Units. The strategy uses a machine learning model to predict the optimal way to load data from memory followed by a heuristic that divides other optimizations into groups and exhaustively explores one group at a time. We use a set of 104 synthetic OpenCL stencil benchmarks that are representative of many real stencil computations. We first demonstrate the need for auto-tuning by showing that the optimization space is sufficiently complex that simple approaches to determining a high-performing configuration fail. We then demonstrate the effectiveness of our approach on NVIDIA and AMD GPUs. Relative to a random sampling of the space, we find configurations that are 12%/32% faster on the NVIDIA/AMD platform in 71% and 4% less time, respectively. Relative to an expert search, we achieve 5% and 9% better performance on the two platforms in 89% and 76% less time. We also evaluate our strategy for different stencil computational intensities, varying array sizes and shapes, and in combination with expert search.
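
The group-wise exhaustive search described above can be sketched generically: the optimizations are partitioned into groups, and each group is swept exhaustively while the others are held at their current best values. The parameter names and the synthetic cost model below are invented for illustration; the actual tuner pairs this heuristic with a machine learning model for the memory-loading choice:

```python
from itertools import product

# Hypothetical tuning parameters, split into two groups that are explored
# exhaustively one at a time while the other group is held fixed.
groups = [
    {"tile_x": [8, 16, 32], "tile_y": [4, 8]},
    {"unroll": [1, 2, 4], "vector_width": [1, 2, 4]},
]

def cost(cfg):
    # Synthetic stand-in for a measured kernel runtime (lower is better).
    return (abs(cfg["tile_x"] - 16) + abs(cfg["tile_y"] - 8)
            + abs(cfg["unroll"] - 4) + abs(cfg["vector_width"] - 2))

# Start from a default configuration and refine group by group.
best = {"tile_x": 8, "tile_y": 4, "unroll": 1, "vector_width": 1}
evaluations = 0
for group in groups:
    keys = list(group)
    best_cost = cost(best)
    for values in product(*(group[k] for k in keys)):
        trial = {**best, **dict(zip(keys, values))}
        evaluations += 1
        if cost(trial) < best_cost:
            best, best_cost = trial, cost(trial)

# 15 evaluations here, versus 54 for a full exhaustive sweep of all four
# parameters jointly.
print(best, "found in", evaluations, "evaluations")
```

The saving grows multiplicatively with the number of groups; the risk, as with any coordinate-descent-style search, is missing optima that require changing parameters in different groups simultaneously.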

  13. A cluster-based strategy for assessing the overlap between large chemical libraries and its application to a recent acquisition.

    Science.gov (United States)

    Engels, Michael F M; Gibbs, Alan C; Jaeger, Edward P; Verbinnen, Danny; Lobanov, Victor S; Agrafiotis, Dimitris K

    2006-01-01

    We report on the structural comparison of the corporate collections of Johnson & Johnson Pharmaceutical Research & Development (JNJPRD) and 3-Dimensional Pharmaceuticals (3DP), performed in the context of the recent acquisition of 3DP by JNJPRD. The main objective of the study was to assess the druglikeness of the 3DP library and the extent to which it enriched the chemical diversity of the JNJPRD corporate collection. The two databases, at the time of acquisition, collectively contained more than 1.1 million compounds with a clearly defined structural description. The analysis was based on a clustering approach and aimed at providing an intuitive quantitative estimate and visual representation of this enrichment. A novel hierarchical clustering algorithm called divisive k-means was employed in combination with Kelley's cluster-level selection method to partition the combined data set into clusters, and the diversity contribution of each library was evaluated as a function of the relative occupancy of these clusters. Typical 3DP chemotypes enriching the diversity of the JNJPRD collection were catalogued and visualized using a modified maximum common substructure algorithm. The joint collection of JNJPRD and 3DP compounds was also compared to other databases of known medicinally active or druglike compounds. The potential of the methodology for the analysis of very large chemical databases is discussed.
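
Divisive (bisecting) k-means recursively splits clusters in two until a tightness criterion is met. The sketch below is a deliberately minimal 1-D illustration of the idea, not the paper's implementation (which combines it with Kelley's cluster-level selection on high-dimensional chemical descriptors); the spread threshold and toy data are invented:

```python
import random

random.seed(1)

def two_means(points, iters=20):
    """Naive 1-D 2-means split, seeded at the extremes."""
    c1, c2 = min(points), max(points)
    for _ in range(iters):
        a = [p for p in points if abs(p - c1) <= abs(p - c2)]
        b = [p for p in points if abs(p - c1) > abs(p - c2)]
        if a:
            c1 = sum(a) / len(a)
        if b:
            c2 = sum(b) / len(b)
    return a, b

def divisive_kmeans(points, max_spread=1.0):
    """Recursively bisect clusters until each is tight enough."""
    if max(points) - min(points) <= max_spread:
        return [points]
    a, b = two_means(points)
    if not a or not b:
        return [points]
    return divisive_kmeans(a, max_spread) + divisive_kmeans(b, max_spread)

# Two well-separated "chemotype" populations on a 1-D descriptor axis.
data = ([random.gauss(0.0, 0.1) for _ in range(20)]
        + [random.gauss(5.0, 0.1) for _ in range(20)])
clusters = divisive_kmeans(data)
print(len(clusters), "clusters")
```

Library overlap in the paper's setting is then read off from how members of the two collections co-occupy the resulting clusters.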

  14. Preliminary study on X-ray fluorescence computed tomography imaging of gold nanoparticles: Acceleration of data acquisition by multiple pinholes scheme

    Science.gov (United States)

    Sasaya, Tenta; Sunaguchi, Naoki; Seo, Seung-Jum; Hyodo, Kazuyuki; Zeniya, Tsutomu; Kim, Jong-Ki; Yuasa, Tetsuya

    2018-04-01

    Gold nanoparticles (GNPs) have recently attracted attention in nanomedicine as novel contrast agents for cancer imaging. A decisive tomographic imaging technique has not yet been established to depict the 3-D distribution of GNPs in an object. An imaging technique known as pinhole-based X-ray fluorescence computed tomography (XFCT) is a promising method that can be used to reconstruct the distribution of GNPs from the X-ray fluorescence emitted by GNPs. We address the acceleration of data acquisition in pinhole-based XFCT for preclinical use using a multiple pinhole scheme. In this scheme, multiple projections are simultaneously acquired through a multi-pinhole collimator with a 2-D detector and full-field volumetric beam to enhance the signal-to-noise ratio of the projections; this enables fast data acquisition. To demonstrate the efficacy of this method, we performed an imaging experiment using a physical phantom with an actual multi-pinhole XFCT system that was constructed using the beamline AR-NE7A at KEK. The preliminary study showed that the multi-pinhole XFCT achieved a data acquisition time of 20 min at a theoretical detection limit of approximately 0.1 mg Au/ml and at a spatial resolution of 0.4 mm.

  15. Post-Acquisition IT Integration

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Yetton, Philip

    2013-01-01

    The extant research on post-acquisition IT integration analyzes how acquirers realize IT-based value in individual acquisitions. However, serial acquirers make 60% of acquisitions. These acquisitions are not isolated events, but are components in growth-by-acquisition programs. To explain how serial acquirers realize IT-based value, we develop three propositions on the sequential effects on post-acquisition IT integration in acquisition programs. Their combined explanation is that serial acquirers must have a growth-by-acquisition strategy that includes the capability to improve IT integration capabilities, to sustain high alignment across acquisitions and to maintain a scalable IT infrastructure with a flat or decreasing cost structure. We begin the process of validating the three propositions by investigating a longitudinal case study of a growth-by-acquisition program.

  16. Evaluation of a Mixing versus a Cycling Strategy of Antibiotic Use in Critically-Ill Medical Patients: Impact on Acquisition of Resistant Microorganisms and Clinical Outcomes.

    Directory of Open Access Journals (Sweden)

    Nazaret Cobos-Trigueros

    Full Text Available To compare the effect of two strategies of antibiotic use (mixing vs. cycling) on the acquisition of resistant microorganisms, infections and other clinical outcomes. Prospective cohort study in an 8-bed intensive care unit during 35 months in which a mixing-cycling policy of antipseudomonal beta-lactams (meropenem, ceftazidime/piperacillin-tazobactam) and fluoroquinolones was operative. Nasopharyngeal and rectal swabs and respiratory secretions were obtained within 48 h of admission and thrice weekly thereafter. Target microorganisms included methicillin-resistant S. aureus, vancomycin-resistant enterococci, third-generation cephalosporin-resistant Enterobacteriaceae and non-fermenters. A total of 409 (42%) patients were included in mixing and 560 (58%) in cycling. Exposure to ceftazidime/piperacillin-tazobactam and fluoroquinolones was significantly higher in mixing, while exposure to meropenem was higher in cycling, although overall use of antipseudomonals was not significantly different (37.5/100 patient-days vs. 38.1/100 patient-days). There was a slightly higher acquisition rate of target microorganisms during mixing, but this difference lost its significance when the cases due to an exogenous Burkholderia cepacia outbreak were excluded (19.3% vs. 15.4%, OR 0.8, CI 0.5-1.1). Acquisition of Pseudomonas aeruginosa resistant to the intervention antibiotics or with multiple-drug resistance was similar. There were no significant differences between mixing and cycling in the proportion of patients acquiring any infection (16.6% vs. 14.5%, OR 0.9, CI 0.6-1.2), any infection due to target microorganisms (5.9% vs. 5.2%, OR 0.9, CI 0.5-1.5), length of stay (median 5 d for both groups) or mortality (13.9% vs. 14.3%, OR 1.03, CI 0.7-1.3). A cycling strategy of antibiotic use with a 6-week cycle duration is similar to mixing in terms of acquisition of resistant microorganisms, infections, length of stay and mortality.

  17. Feasibility of Systematic Respiratory-Gated Acquisition in Unselected Patients Referred for 18F-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography

    Directory of Open Access Journals (Sweden)

    Philippe Robin

    2018-02-01

    Full Text Available Objective: Respiratory motion in 18F-fluorodeoxyglucose positron emission tomography/computed tomography (FDG PET/CT) induces blurred images, leading to errors in location and quantification for lung and abdominal lesions. Various methods have been developed to correct for these artifacts, and most current PET/CT scanners are equipped with a respiratory gating system. However, these corrections are not routinely performed because their use is time-consuming. The aim of this study is to assess the feasibility and quantitative impact of a systematic respiratory-gated acquisition in unselected patients referred for FDG PET/CT, without increasing acquisition time. Methods: Patients referred for an FDG PET/CT examination to the nuclear medicine department of Brest University Hospital were consecutively enrolled during a 3-month period. Cases presenting lung or liver uptakes were analyzed. Two sets of images were reconstructed from data recorded during a single acquisition with a continuous table speed of 1 mm/s on the Biograph mCT Flow PET/CT scanner: standard free-breathing images and respiratory-gated images. Lesion location and quantitative parameters were recorded and compared. Results: From October 1 2015 to December 31 2015, 847 patients were referred for FDG PET/CT, of whom 741 underwent a respiratory-gated acquisition. Of these, 213 (29%) had one or more lung or liver uptakes, but 82 (38%) had no usable respiratory-gating signal. Accordingly, 131 (62%) patients with 183 lung or liver uptakes were analyzed. Of the 183 lesions, 140 were located in the lungs and 43 in the liver. The median (IQR) difference between respiratory-gated images and non-gated images was 18% (4−32) for SUVmax, increasing to 30% (14−57) in lower lobes for lung lesions, and −18% (−40 to −4) for MTV (p < 0.05). Technologists' active personal dosimetry and mean total examination duration were not statistically different between periods with and without

  18. Usefulness of high helical pitch acquisition for reduction of patient radiation dose in cardiac multidetector computed tomography

    International Nuclear Information System (INIS)

    Sano, Tomonari; Matsutani, Hideyuki; Kondo, Takeshi; Sekine, Takako; Arai, Takehiro; Morita, Hitomi; Takase, Shinichi

    2009-01-01

    Helical pitch (HP) is usually decided automatically by the software (Heart Navi) included in the MDCT machine (Aquilion 64), depending on the gantry rotation speed (r) and heart rate (HR). To reduce the radiation dose, 255 consecutive patients with low HR (≤60 bpm) and without arrhythmia underwent cardiac MDCT using a high HP. We had already reported that the relationship among r, HP, and the maximum data acquisition time interval (Tmax) that does not create a data deficit in arrhythmia is represented as Tmax=(69.88/HP-0.64)r (equation 1). From equation 1, HP=69.88r/(Tmax+0.64r) (equation 2) was derived. We measured the maximum R-R interval (R-Rmax) on the electrocardiogram (ECG) before MDCT acquisition, and R-Rmax × 1.1 was taken as Tmax in consideration of R-Rmax prolongation during MDCT acquisition. The HP of the high-HP acquisition was calculated from equation 2. For HR≤50 bpm, Heart Navi determined r: 0.35 sec/rot and HP: 9.8, and for 51 bpm≤HR≤66 bpm, r: 0.35 sec/rot and HP: 11.2. The HP of the high-HP protocol (16.4±1.2) was significantly (p<0.0001) higher than that of Heart Navi (10.9±0.6). The scanning time (6.5±0.6 sec) of high HP was significantly (p<0.0001) shorter than that of Heart Navi (9.0±0.8 sec), and the dose-length product of high HP (675±185 mGy·cm) was significantly (p<0.0001) lower than that of Heart Navi (923±252 mGy·cm). The high HP produced fine images in 251/255 patients. In conclusion, high-HP acquisition is useful for reducing radiation dose and scanning time. (author)
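
The two equations in the abstract are algebraic inverses of one another, so the high HP for a given patient follows directly from the measured R-Rmax. A small numerical sketch (the R-Rmax of 1.2 s is a hypothetical example; r = 0.35 sec/rot as in the abstract):

```python
def t_max(hp, r):
    # Equation 1: maximum data acquisition time interval without data deficit.
    return (69.88 / hp - 0.64) * r

def high_hp(tmax, r):
    # Equation 2: helical pitch solved from equation 1.
    return 69.88 * r / (tmax + 0.64 * r)

r = 0.35              # gantry rotation time in sec/rot, as in the abstract
r_r_max = 1.2         # hypothetical maximum R-R interval measured on ECG (sec)
tmax = r_r_max * 1.1  # padded by 10% for R-R prolongation during the scan

hp = high_hp(tmax, r)
print(f"high HP = {hp:.1f}")  # falls within the reported 16.4 +/- 1.2

# Equations 1 and 2 are inverses of each other.
assert abs(t_max(hp, r) - tmax) < 1e-9
```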

  19. Heme acquisition mechanisms of Porphyromonas gingivalis - strategies used in a polymicrobial community in a heme-limited host environment.

    Science.gov (United States)

    Smalley, J W; Olczak, T

    2017-02-01

    Porphyromonas gingivalis, a main etiologic agent and key pathogen responsible for initiation and progression of chronic periodontitis, requires heme as a source of iron and protoporphyrin IX for its survival and the ability to establish an infection. Porphyromonas gingivalis is able to accumulate a defensive cell-surface heme-containing pigment in the form of μ-oxo bisheme. The main sources of heme for P. gingivalis in vivo are hemoproteins present in saliva, gingival crevicular fluid, and erythrocytes. To acquire heme, P. gingivalis uses several mechanisms. Among them, the best characterized are those employing hemagglutinins, hemolysins, and gingipains (Kgp, RgpA, RgpB), TonB-dependent outer-membrane receptors (HmuR, HusB, IhtA), and hemophore-like proteins (HmuY, HusA). Proteins involved in intracellular heme transport, storage, and processing are less well characterized (e.g. PgDps). Importantly, P. gingivalis may also use the heme acquisition systems of other bacteria to fulfill its own heme requirements. Porphyromonas gingivalis displays a novel paradigm for heme acquisition from hemoglobin, whereby the Fe(II)-containing oxyhemoglobin molecule must first be oxidized to methemoglobin to facilitate heme release. This process not only involves P. gingivalis arginine- and lysine-specific gingipains, but also other proteases (e.g. interpain A from Prevotella intermedia) or pyocyanin produced by Pseudomonas aeruginosa. Porphyromonas gingivalis is then able to fully proteolyze the more susceptible methemoglobin substrate to release free heme or to wrest heme from it directly through the use of the HmuY hemophore. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-10-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the prediction-observation-explanation (POE) strategy (White and Gunstone in Probing understanding. Routledge, New York, 1992) on facilitating preschoolers' acquisition of scientific concepts regarding light and shadow. The children's alternative conceptions were explored as well. Fifty participants were randomly assigned into either an experimental group that played a computer game integrating the POE model or a control group that played a non-POE computer game. By assessing the students' conceptual understanding through interviews, this study revealed that the students in the experimental group significantly outperformed their counterparts in the concepts regarding "shadow formation in daylight" and "shadow orientation." However, children in both groups, after playing the games, still expressed some alternative conceptions such as "Shadows always appear behind a person" and "Shadows should be on the same side as the sun."

  1. The Effect of Using a Proposed Teaching Strategy Based on the Selective Thinking on Students' Acquisition Concepts in Mathematics

    Science.gov (United States)

    Qudah, Ahmad Hassan

    2016-01-01

    This study aimed to identify the effect of using a proposed teaching strategy based on selective thinking on the acquisition of mathematical concepts by classroom teacher students at Al al-Bayt University. The sample of the study consisted of (74) students, equally distributed into a control group and an experimental group. The selective thinking…

  2. Copy-Right for Software and Computer Games: Strategies and Challenges

    Directory of Open Access Journals (Sweden)

    Hojatollah Ayoubi

    2009-11-01

    Full Text Available Copyright was initially used in the cultural and art industries. From that time there have been two different approaches to the matter: the commercial-economic approach, which is concerned with the rights of suppliers and investors, and the cultural approach, which is especially concerned with the rights of the author. The first approach is rooted in the Anglo-American countries, while the other is originally French. The expansion of the computer market, and the separation of the software and hardware markets, led to the so-called "velvet-rubbery", which refers to illegal reproduction in the market. Therefore, there have been struggles all over the world to protect the rights of producers. Besides the domestic and international difficulties these strategies encounter, this article reviews the different strategies to face this challenge.

  3. Time Synchronization Strategy Between On-Board Computer and FIMS on STSAT-1

    Directory of Open Access Journals (Sweden)

    Seong Woo Kwak

    2004-06-01

    Full Text Available STSAT-1 was launched in September 2003 with the main payload of the Far Ultra-violet Imaging Spectrograph (FIMS). The mission of FIMS is to observe the universe and the aurora. In this paper, we suggest a simple and reliable strategy adopted in STSAT-1 to synchronize time between the On-board Computer (OBC) and FIMS. Given the characteristics of STSAT-1, this strategy is devised to maintain the reliability of the satellite system and to reduce implementation cost by using minimal electronic circuits. We suggest two methods with different synchronization resolutions to cope with unexpected faults in space. The backup method, with low resolution, can be activated when the main method has problems.

  4. Estimation of kinetic and thermodynamic ligand-binding parameters using computational strategies.

    Science.gov (United States)

    Deganutti, Giuseppe; Moro, Stefano

    2017-04-01

    Kinetic and thermodynamic ligand-protein binding parameters are gaining growing importance as key information to consider in drug discovery. The determination of molecular structures, particularly using X-ray and NMR techniques, is crucial for understanding how a ligand recognizes its target in the final binding complex. However, for a better understanding of the recognition processes, experimental studies of ligand-protein interactions are needed. Even though several techniques can be used to investigate both the thermodynamic and kinetic profiles of a ligand-protein complex, these procedures are very often laborious, time-consuming and expensive. In the last 10 years, computational approaches have shown enormous potential in providing insights into each of the above effects and in parsing their contributions to changes in both kinetic and thermodynamic binding parameters. The main purpose of this review is to summarize the state of the art of computational strategies for estimating the kinetic and thermodynamic parameters of ligand-protein binding.

  5. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae.

    Directory of Open Access Journals (Sweden)

    Emanuel C Mora

    2013-06-01

    Full Text Available Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as echo-delay) to assess target-distance. Target-distance is represented in the brain by delay-tuned neurons that are classified as either heteroharmonic or homoharmonic. Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher-order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e., heteroharmonic neurons receive information from call and echo in different frequency bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy. We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability to use long constant-frequency echolocation calls, high duty cycle echolocation and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been advantageous for categorizing prey size, hunting eared insects and living in large conspecific colonies. We make five testable predictions that might help future investigations clarify the evolution of heteroharmonic echolocation in Mormoopidae and other families.

  6. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    Science.gov (United States)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Even though seismic reflection profiling has a broad application range, spanning from shallow targets a few meters deep to targets at a depth of several kilometers, it is primarily used by the hydrocarbon industry and rarely for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands large experience from the user. In-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for such studies. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. The

  7. A hybrid computational strategy to address WGS variant analysis in >5000 samples.

    Science.gov (United States)

    Huang, Zhuoyi; Rustagi, Navin; Veeraraghavan, Narayanan; Carroll, Andrew; Gibbs, Richard; Boerwinkle, Eric; Venkata, Manjunath Gorentla; Yu, Fuli

    2016-09-10

    The decreasing costs of sequencing are driving the need for cost-effective and real-time variant calling of whole genome sequencing data. The scale of these projects is far beyond the capacity of typical computing resources available to most research labs. Other infrastructures like the AWS cloud environment and supercomputers also have limitations that make large-scale joint variant calling infeasible, and infrastructure-specific variant calling strategies either fail to scale up to large datasets or abandon joint calling strategies. We present a high-throughput framework including multiple variant callers for single nucleotide variant (SNV) calling, which leverages a hybrid computing infrastructure consisting of the AWS cloud, supercomputers and local high performance computing infrastructures. We present a novel binning approach for large-scale joint variant calling and imputation which can scale up to over 10,000 samples while producing SNV callsets with high sensitivity and specificity. As a proof of principle, we present results of analysis on the Cohorts for Heart And Aging Research in Genomic Epidemiology (CHARGE) WGS freeze 3 dataset, in which joint calling, imputation and phasing of over 5300 whole genome samples was produced in under 6 weeks using four state-of-the-art callers: SNPTools, GATK-HaplotypeCaller, GATK-UnifiedGenotyper and GotCloud. We used Amazon AWS, a 4000-core in-house cluster at Baylor College of Medicine, the IBM Power PC BlueBioU at Rice and Rhea at Oak Ridge National Laboratory (ORNL) for the computation. AWS was used for joint calling of 180 TB of BAM files, and the ORNL and Rice supercomputers were used for the imputation and phasing step. All other steps were carried out on the local compute cluster. The entire operation used 5.2 million core hours and transferred only a total of 6 TB of data across the platforms. Even with increasing sizes of whole genome datasets, ensemble joint calling of SNVs for low
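The binning idea behind this record, partitioning the genome into independent regions that can be processed in parallel and scheduled across heterogeneous platforms, can be sketched as follows. The bin size and platform names are invented for illustration; the paper's actual binning strategy is more involved.

```python
def make_bins(chrom_length, bin_size):
    """Split a chromosome into contiguous [start, end) bins."""
    return [(start, min(start + bin_size, chrom_length))
            for start in range(0, chrom_length, bin_size)]

def assign_bins(bins, platforms):
    """Round-robin assignment of bins to compute platforms."""
    return {bin_: platforms[i % len(platforms)]
            for i, bin_ in enumerate(bins)}

# Illustrative scheduling of one small chromosome across three platforms:
bins = make_bins(chrom_length=250_000, bin_size=100_000)
schedule = assign_bins(bins, ["aws", "supercomputer", "local"])
```

Because each bin is jointly called independently, the heavy joint-calling step parallelizes across whatever mix of cloud, supercomputer, and local nodes is available, which is the essence of the hybrid strategy.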

  8. Can pictures promote the acquisition of sight-word reading? An evaluation of two potential instructional strategies.

    Science.gov (United States)

    Richardson, Amy R; Lerman, Dorothea C; Nissen, Melissa A; Luck, Kally M; Neal, Ashley E; Bao, Shimin; Tsami, Loukia

    2017-01-01

    Sight-word instruction can be a useful supplement to phonics-based methods under some circumstances. Nonetheless, few studies have evaluated the conditions under which pictures may be used successfully to teach sight-word reading. In this study, we extended prior research by examining two potential strategies for reducing the effects of overshadowing when using picture prompts. Five children with developmental disabilities and two typically developing children participated. In the first experiment, the therapist embedded sight words within pictures but gradually faded in the pictures as needed using a least-to-most prompting hierarchy. In the second experiment, the therapist embedded text-to-picture matching within the sight-word reading sessions. Results suggested that these strategies reduced the interference typically observed with picture prompts and enhanced performance during teaching sessions for the majority of participants. Text-to-picture matching also accelerated mastery of the sight words relative to a condition under which the therapist presented text without pictures. © 2016 Society for the Experimental Analysis of Behavior.

  9. Computer system design description for SY-101 hydrogen mitigation test project data acquisition and control system (DACS-1)

    International Nuclear Information System (INIS)

    Ermi, A.M.

    1997-01-01

    Description of the Proposed Activity/REPORTABLE OCCURRENCE or PIAB: This ECN changes the computer system design description support document describing the computer system used to control, monitor and archive the processes and outputs associated with the Hydrogen Mitigation Test Pump installed in SY-101. There is no new activity or procedure associated with the updating of this reference document. The updating of this computer system design description maintains an agreed-upon documentation program initiated within the test program and carried into operations at the time of turnover, to maintain configuration control as outlined by design authority practicing guidelines. There are no new credible failure modes associated with the updating of information in a support description document. The failure analysis of each change was reviewed at the time of implementation of the Systems Change Request for all the processes changed. This document simply provides a history of implementation and current system status.

  10. Axial power difference control strategy and computer simulation for GNPS during stretch-out and power decrease

    International Nuclear Information System (INIS)

    Liao Yehong; Xiao Min; Li Xianfeng; Zhu Minhong

    2004-01-01

    Successful control of the axial power difference for a PWR is crucial to nuclear safety. After analyzing the effects of various elements on the axial power distribution, different axial power deviation control strategies have been proposed to comply with different power decrease scenarios. Application of the strategy in computer simulation shows that our prediction of the axial power deviation evolution is comparable to the measured values, and that our control strategy is effective.

  11. Comparison of different strategies in prenatal screening for Down's syndrome: cost effectiveness analysis of computer simulation.

    Science.gov (United States)

    Gekas, Jean; Gagné, Geneviève; Bujold, Emmanuel; Douillard, Daniel; Forest, Jean-Claude; Reinharz, Daniel; Rousseau, François

    2009-02-13

    To assess and compare the cost effectiveness of three different strategies for prenatal screening for Down's syndrome (integrated test, sequential screening, and contingent screening) and to determine the most useful cut-off values for risk. Computer simulations to study integrated, sequential, and contingent screening strategies with various cut-offs leading to 19 potential screening algorithms. The computer simulation was populated with data from the Serum Urine and Ultrasound Screening Study (SURUSS), real unit costs for healthcare interventions, and a population of 110 948 pregnancies from the province of Québec for the year 2001. Cost effectiveness ratios, incremental cost effectiveness ratios, and screening options' outcomes. The contingent screening strategy dominated all other screening options: it had the best cost effectiveness ratio ($C26,833 per case of Down's syndrome) with fewer procedure-related euploid miscarriages and unnecessary terminations (respectively, 6 and 16 per 100,000 pregnancies). It also outperformed serum screening in the second trimester. In terms of the incremental cost effectiveness ratio, contingent screening was still dominant: compared with screening based on maternal age alone, the savings were $C30,963 per additional birth with Down's syndrome averted. Contingent screening was the only screening strategy that offered early reassurance to the majority of women (77.81%) in the first trimester and minimised costs by limiting retesting during the second trimester (21.05%). 
For the contingent and sequential screening strategies, the choice of cut-off value for risk in the first trimester test significantly affected the cost effectiveness ratios (respectively, from $C26,833 to $C37,260 and from $C35,215 to $C45,314 per case of Down's syndrome), the number of procedure-related euploid miscarriages (from 6 to 46 and from 6 to 45 per 100,000 pregnancies), and the number of unnecessary terminations (from 16 to 26 and from 16 to 25 per 100
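The two headline metrics in this record, the cost effectiveness ratio (cost per case detected) and the incremental cost effectiveness ratio (extra cost per extra case detected relative to a comparator), reduce to simple arithmetic. The figures in the example are illustrative only, not the SURUSS data.

```python
def cost_effectiveness_ratio(total_cost, cases_detected):
    """Cost per case of Down's syndrome detected."""
    return total_cost / cases_detected

def incremental_cer(cost_new, effect_new, cost_ref, effect_ref):
    """Extra cost per extra case detected vs. a reference strategy."""
    return (cost_new - cost_ref) / (effect_new - effect_ref)

# Illustrative numbers only:
cer = cost_effectiveness_ratio(2_683_300, 100)          # cost per case
icer = incremental_cer(2_683_300, 100, 1_000_000, 40)   # vs. reference
```

A strategy "dominates" (as contingent screening does here) when it is both cheaper and more effective than the alternatives, so no incremental trade-off has to be paid.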

  12. IMPLEMENTING THE COMPUTER-BASED NATIONAL EXAMINATION IN INDONESIAN SCHOOLS: THE CHALLENGES AND STRATEGIES

    Directory of Open Access Journals (Sweden)

    Heri Retnawati

    2017-12-01

    Full Text Available In line with technological development, the computer-based national examination (CBNE) has become an urgent matter, as its implementation faces various challenges, especially in developing countries. Strategies for implementing CBNE are thus needed to face these challenges. The aim of this research was to analyse the challenges and strategies of Indonesian schools in implementing CBNE. This research was qualitative and phenomenological in nature. The data were collected through a questionnaire and a focus group discussion. The research participants were teachers who were test supervisors and technicians at junior high schools and senior high schools (i.e. Levels 1 and 2) and vocational high schools implementing CBNE in Yogyakarta, Indonesia. The data were analysed using the Bogdan and Biklen model. The results indicate that (1) in implementing CBNE, the schools should initially make efforts to provide the electronic equipment supporting it; (2) the implementation of CBNE is challenged by problems concerning the Internet and the electricity supply; (3) the test supervisors have to learn their duties by themselves; and (4) the students are not yet familiar with the beneficial use of information technology. To deal with such challenges, the schools employed strategies of making efforts to provide the standard electronic equipment through collaboration with the students’ parents and improving the curriculum content by adding information technology as a school subject.

  13. Computationally efficient design of optimal output feedback strategies for controllable passive damping devices

    International Nuclear Information System (INIS)

    Kamalzare, Mahmoud; Johnson, Erik A; Wojtkiewicz, Steven F

    2014-01-01

    Designing control strategies for smart structures, such as those with semiactive devices, is complicated by the nonlinear nature of the feedback control, secondary clipping control and other additional requirements such as device saturation. The usual design approach resorts to large-scale simulation parameter studies that are computationally expensive. The authors have previously developed an approach for state-feedback semiactive clipped-optimal control design, based on a nonlinear Volterra integral equation that provides for the computationally efficient simulation of such systems. This paper expands the applicability of the approach by demonstrating that it can also be adapted to accommodate more realistic cases when, instead of full state feedback, only a limited set of noisy response measurements is available to the controller. This extension requires incorporating a Kalman filter (KF) estimator, which is linear, into the nominal model of the uncontrolled system. The efficacy of the approach is demonstrated by a numerical study of a 100-degree-of-freedom frame model, excited by a filtered Gaussian random excitation, with noisy acceleration sensor measurements to determine the semiactive control commands. The results show that the proposed method can improve computational efficiency by more than two orders of magnitude relative to a conventional solver, while retaining a comparable level of accuracy. Further, the proposed approach is shown to be similarly efficient for an extensive Monte Carlo simulation to evaluate the effects of sensor noise levels and KF tuning on the accuracy of the response. (paper)
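The extension described in this record hinges on inserting a linear Kalman filter estimator between the noisy measurements and the controller. A minimal scalar sketch of the predict/update recursion (not the authors' formulation, which operates on the full structural state of the frame model) looks like this:

```python
import numpy as np

def kalman_filter(measurements, q=1e-4, r=0.09):
    """Scalar Kalman filter for a random-walk state observed in noise.

    q: process-noise variance, r: measurement-noise variance.
    Returns the filtered state estimate at every time step.
    """
    x, p = 0.0, 1.0              # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                # predict: covariance grows with process noise
        k = p / (p + r)          # Kalman gain
        x = x + k * (z - x)      # update: correct toward the measurement
        p = (1.0 - k) * p
        estimates.append(x)
    return np.array(estimates)
```

In the paper's setting, the filtered state estimate, rather than the raw noisy acceleration measurements, would feed the clipped-optimal control law; because the KF is linear, it can be folded into the nominal uncontrolled-system model as the abstract describes.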

  14. A comparison of educational strategies for the acquisition of nursing student's performance and critical thinking: simulation-based training vs. integrated training (simulation and critical thinking strategies).

    Science.gov (United States)

    Zarifsanaiey, Nahid; Amini, Mitra; Saadat, Farideh

    2016-11-16

    There is a need to change the focus of nursing education from traditional teacher-centered training programs to student-centered active methods. The integration of the two active learning techniques will improve the effectiveness of training programs. The objective of this study is to compare the effects of integrated training (simulation and critical thinking strategies) and simulation-based training on the performance level and critical thinking ability of nursing students. The present quasi-experimental study was performed in 2014 on 40 students who were studying the practical nursing principles and skills course in the first half of the academic year at Shiraz University of Medical Sciences. Students were randomly divided into control (n = 20) and experimental (n = 20) groups. After training students through simulation and integrated education (simulation and critical thinking strategies), the students' critical thinking ability and performance were evaluated using the California Critical Thinking Skills Test, Form B (CCTST) and an Objective Structured Clinical Examination (OSCE) comprising 10 stations, respectively. The external reliability of the California Critical Thinking test (Form B) has been reported to be between 0.78 and 0.80, and the validity of the OSCE was approved by 5 members of the faculty. Furthermore, using the split-half method (the correlation between odd and even stations), the reliability of the test was approved with a correlation coefficient of 0.66. Data were analyzed using the t-test and Mann-Whitney test. A significance level of 0.05 was considered to be statistically significant. The mean score of the experimental group's performance level was higher than that of the control group. This difference was statistically significant, and students in the experimental group had significantly higher performance in the OSCE stations than the control group (P critical thinking did not increase before and after the

  15. Early Intervention with Children of Dyslexic Parents: Effects of Computer-Based Reading Instruction at Home on Literacy Acquisition

    Science.gov (United States)

    Regtvoort, Anne G. F. M.; van der Leij, Aryan

    2007-01-01

    The hereditary basis of dyslexia makes it possible to identify children at risk early on. Pre-reading children genetically at risk received during 14 weeks a home- and computer-based training in phonemic awareness and letter-sound relationships in the context of reading instruction. At posttest training effects were found for both phonemic…

  16. Early intervention with children of dyslexic parents: Effects of computer-based reading instruction at home on literacy acquisition

    NARCIS (Netherlands)

    Regtvoort, A.G.F.M.; van der Leij, A.

    2007-01-01

    The hereditary basis of dyslexia makes it possible to identify children at risk early on. Pre-reading children genetically at risk received during 14 weeks a home- and computer-based training in phonemic awareness and letter-sound relationships in the context of reading instruction. At posttest

  17. Selective lesion of septal cholinergic neurons in rats impairs acquisition of a delayed matching to position T-maze task by delaying the shift from a response to a place strategy.

    Science.gov (United States)

    Fitz, Nicholas F; Gibbs, Robert B; Johnson, David A

    2008-12-16

    This study tested the hypothesis that septal cholinergic lesions impair acquisition of a delayed matching to position (DMP) T-maze task in male rats by affecting learning strategy. Rats received either the selective cholinergic immunotoxin, 192 IgG-saporin (SAP) or artificial cerebrospinal fluid directly into the medial septum. Two weeks later, animals were trained to acquire the DMP task. SAP-treated rats took significantly longer to acquire the task than corresponding controls. Both SAP-treated and control rats adopted a persistent turn and utilized a response strategy during early periods of training. By the time rats reached criterion the persistent turn was no longer evident, and all rats had shifted to an allocentric strategy, i.e., were relying on extramaze cues to a significant degree. During the acquisition period, SAP-treated rats spent significantly more days showing a persistent turn and using a response strategy than corresponding controls. The added time spent using a response strategy accounted entirely for the added days required to reach criterion among the SAP-treated rats. This suggests that the principal mechanism by which septal cholinergic lesions impair DMP acquisition in male rats is by increasing the predisposition to use a response vs. a place strategy, thereby affecting the ability to switch from one strategy to another.

  18. Value of computed tomography arthrography with delayed acquisitions in the work-up of ganglion cysts of the tarsal tunnel: report of three cases

    International Nuclear Information System (INIS)

    Omoumi, Patrick; Gheldere, Antoine de; Leemrijse, Thibaut; Galant, Christine; Van den Bergh, Peter; Malghem, Jacques; Simoni, Paolo; Berg, Bruno C.V.; Lecouvet, Frederic E.

    2010-01-01

    Ganglion cysts are a common cause of tarsal tunnel syndrome. As in other locations, these cysts are believed to communicate with neighboring joints. The positive diagnosis and preoperative work-up of these cysts require identification and location of the cyst pedicles so that they may be excised and the risk of recurrence decreased. This can be challenging with ultrasonography and magnetic resonance (MR) imaging. We present three cases of symptomatic ganglion cysts of the tarsal tunnel, diagnosed by MR imaging, where computed tomography (CT) arthrography with delayed acquisitions helped to confirm the diagnosis and identify precisely the topography of the communication with the subtalar joint. These cases provide new evidence of the articular origin of ganglion cysts developing in the tarsal tunnel. (orig.)

  19. Evaluation of linear measurements of implant sites based on head orientation during acquisition: An ex vivo study using cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sabban, Hanadi; Mahdian, Mina; Dhingra, Ajay; Lurie, Alan G.; Tadinada, Aditya [University of Connecticut School of Dental Medicine, Farmington (United States)

    2015-06-15

    This study evaluated the effect of various head orientations during cone-beam computed tomography (CBCT) image acquisition on linear measurements of potential implant sites. Six dry human skulls with a total of 28 implant sites were evaluated in seven different head orientations. The scans were acquired using a Hitachi CB-MercuRay CBCT machine. The scanned volumes were reconstructed. Horizontal and vertical measurements were made and were compared to measurements made after simulating the head position at corrected head angulations. Data were analyzed using a two-way ANOVA test. Statistical analysis revealed a significant interaction between the mean errors in vertical measurements, with a marked difference observed at the extension head position (P<0.05). Statistical analysis failed to yield any significant interaction between the mean errors in horizontal measurements at various head positions. Head orientation could significantly affect vertical measurements in CBCT scans. The main head position influencing the measurements was extension.

  20. Combined positron emission tomography/computed tomography (PET/CT) for clinical oncology: technical aspects and acquisition protocols

    International Nuclear Information System (INIS)

    Beyer, T.

    2004-01-01

    Combined PET/CT imaging is a non-invasive means of reviewing both the anatomy and the molecular pathways of a patient during a quasi-simultaneous examination. Since the introduction of the prototype PET/CT in 1998, a rapid development of this imaging technology has been witnessed. The incorporation of fast PET detector technology into PET/CT designs and the routine use of the CT transmission images for attenuation correction of the PET allow anato-metabolic whole-body examinations to be completed in less than 30 min. Thus, PET/CT imaging offers a logistical advantage to both the patient and the clinicians, since the two complementary exams - whenever clinically indicated - can be performed almost at the same time and a single integrated report can be created. Nevertheless, a number of pitfalls, primarily from the use of CT-based attenuation correction, have been identified and are being addressed through optimized acquisition protocols. It is fair to say that PET/CT has been integrated into the diagnostic imaging arena, and in many cases has led to close collaboration between different, yet complementary, diagnostic and therapeutic medical disciplines. (orig.)

  1. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Full Text Available Quantitative exposure data is important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject’s true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics, and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm, which calculates partitioning based upon tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have a significant impact on determining partitioning and species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human

  2. Practical considerations for optimizing cardiac computed tomography protocols for comprehensive acquisition prior to transcatheter aortic valve replacement.

    Science.gov (United States)

    Khalique, Omar K; Pulerwitz, Todd C; Halliburton, Sandra S; Kodali, Susheel K; Hahn, Rebecca T; Nazif, Tamim M; Vahl, Torsten P; George, Isaac; Leon, Martin B; D'Souza, Belinda; Einstein, Andrew J

    2016-01-01

    Transcatheter aortic valve replacement (TAVR) is performed frequently in patients with severe, symptomatic aortic stenosis who are at high risk or inoperable for open surgical aortic valve replacement. Computed tomography angiography (CTA) has become the gold standard imaging modality for pre-TAVR cardiac anatomic and vascular access assessment. Traditionally, cardiac CTA has been most frequently used for assessment of coronary artery stenosis, and scanning protocols have generally been tailored for this purpose. Pre-TAVR CTA has different goals than coronary CTA and the high prevalence of chronic kidney disease in the TAVR patient population creates a particular need to optimize protocols for a reduction in iodinated contrast volume. This document reviews details which allow the physician to tailor CTA examinations to maximize image quality and minimize harm, while factoring in multiple patient and scanner variables which must be considered in customizing a pre-TAVR protocol. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.

  3. Performance Assessment Strategies: A computational framework for conceptual design of large roofs

    Directory of Open Access Journals (Sweden)

    Michela Turrin

    2014-01-01

    Full Text Available Using engineering performance evaluations to explore design alternatives during the conceptual phase of architectural design helps to understand the relationships between form and performance, and is crucial for developing well-performing final designs. Computer-aided conceptual design has the potential to aid the design team in discovering and highlighting these relationships, especially by means of procedural and parametric geometry to support the generation of geometric designs, and building performance simulation tools to support performance assessments. However, current tools and methods for computer-aided conceptual design in architecture neither explicitly reveal nor allow backtracking of the relationships between the performance and the geometry of the design. They currently support post-engineering, rather than early design decisions and the design exploration process. Focusing on large roofs, this research aims at developing a computational design approach to support designers in performance-driven explorations. The approach is meant to facilitate the multidisciplinary integration and the learning process of the designer, and not to constrain the process in precompiled procedures or hard engineering formulations, nor to automate it by delegating the design creativity to computational procedures. PAS (Performance Assessment Strategies) as a method is the main output of the research. It consists of a framework including guidelines and an extensible library of procedures for parametric modelling. It is structured in three parts. Pre-PAS provides guidelines for defining a design strategy, toward the parameterization process. Model-PAS provides guidelines, procedures and scripts for building the parametric models. Explore-PAS supports solution assessment based on numeric evaluations and performance simulations, until the identification of a suitable design solution. PAS has been developed based on action research. 
Several case studies

  4. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and a Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure in both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject’s true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between
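The pH-partition component of such models can be illustrated with a simplified sketch. The full Schmitt algorithm accounts for tissue composition and lipid binding; the function below keeps only the ionization (Henderson-Hasselbalch) and plasma protein-binding terms, and its parameter values are illustrative.

```python
def saliva_to_plasma_ratio(pka, fu_plasma, ph_plasma=7.4, ph_saliva=6.4,
                           weak_acid=True):
    """Estimate the saliva:plasma concentration ratio of a chemical.

    Assumes only the unbound, un-ionized species crosses by passive
    diffusion (pH-partition hypothesis); fu_plasma is the unbound
    fraction in plasma.
    """
    if weak_acid:
        # fraction ionized grows as pH rises above pKa
        ionized_p = 1 + 10 ** (ph_plasma - pka)
        ionized_s = 1 + 10 ** (ph_saliva - pka)
    else:
        # weak base: ionization grows as pH drops below pKa
        ionized_p = 1 + 10 ** (pka - ph_plasma)
        ionized_s = 1 + 10 ** (pka - ph_saliva)
    return fu_plasma * ionized_s / ionized_p
```

Consistent with the sensitivity analysis described above, the result is driven almost entirely by pKa and the unbound fraction: an acidic chemical is retained in the more alkaline plasma, while a weak base accumulates in the more acidic saliva.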

  5. Inductive acquisition of expert knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Muggleton, S.H.

    1986-01-01

    Expert systems divide neatly into two categories: those in which (1) the expert decisions result in changes to some external environment (control systems), and (2) the expert decisions merely seek to describe the environment (classification systems). Both the explanation of computer-based reasoning and the bottleneck (Feigenbaum, 1979) of knowledge acquisition are major issues in expert-systems research. The author contributed to these areas of research in two ways: 1. He implemented an expert-system shell, the Mugol environment, which facilitates knowledge acquisition by inductive inference and provides automatic explanation of run-time reasoning on demand. RuleMaster, a commercial version of this environment, was used to advantage industrially in the construction and testing of two large classification systems. 2. He investigated a new technique called 'sequence induction' that can be used in the construction of control systems. Sequence induction is based on theoretical work in grammatical learning. He improved existing grammatical learning algorithms as well as suggesting and theoretically characterizing new ones. These algorithms were successfully applied to acquisition of knowledge for a diverse set of control systems, including inductive construction of robot plans and chess end-game strategies.

  6. Computer use problems and accommodation strategies at work and home for people with systemic sclerosis: a needs assessment.

    Science.gov (United States)

    Baker, Nancy A; Aufman, Elyse L; Poole, Janet L

    2012-01-01

    We identified the extent of the need for interventions and assistive technology to prevent computer use problems in people with systemic sclerosis (SSc) and the accommodation strategies they use to alleviate such problems. Respondents were recruited through the Scleroderma Foundation. Twenty-seven people with SSc who used a computer and reported difficulty in working completed the Computer Problems Survey. All but one of the respondents reported at least one problem with at least one equipment type. The highest numbers of respondents reported problems with keyboards (88%) and chairs (85%). More than half reported discomfort in the past month associated with the chair, keyboard, and mouse. Respondents used a variety of accommodation strategies. Many respondents experienced problems and discomfort related to computer use. The characteristic symptoms of SSc may contribute to these problems. Occupational therapy interventions for computer use problems in clients with SSc need to be tested. Copyright © 2012 by the American Occupational Therapy Association, Inc.

  7. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with a focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple-metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also yielded implications for teaching with multiple metaphors and multimodalities.

  8. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes DEACO, a task scheduling strategy based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). Addressing the fact that cloud computing task scheduling typically optimizes a single objective, DEACO jointly considers the shortest task completion time, cost and load balancing. DEACO uses the solution of the DE to initialize the initial pheromone of ACO, reducing the time ACO spends accumulating pheromone in the early iterations, and improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with min-min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.
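
    The DE-to-ACO hand-off described above can be sketched as follows. This is a minimal illustration, not the paper's exact DEACO: a crude random-mutation search stands in for full differential evolution, the objective is plain makespan, and the names (`de_like_search`, `init_pheromone`, `tau0`, `boost`) are invented for illustration.

```python
import random

def makespan(assign, task_len, vm_speed):
    """Completion time of the most loaded VM under a given assignment."""
    load = [0.0] * len(vm_speed)
    for t, v in enumerate(assign):
        load[v] += task_len[t] / vm_speed[v]
    return max(load)

def de_like_search(task_len, vm_speed, iters=200, seed=0):
    """Crude evolutionary search standing in for the DE phase."""
    rng = random.Random(seed)
    n, m = len(task_len), len(vm_speed)
    best = [rng.randrange(m) for _ in range(n)]
    best_cost = makespan(best, task_len, vm_speed)
    for _ in range(iters):
        cand = best[:]
        cand[rng.randrange(n)] = rng.randrange(m)  # mutate one gene
        cost = makespan(cand, task_len, vm_speed)
        if cost < best_cost:
            best, best_cost = cand, cost
    return best, best_cost

def init_pheromone(assign, n_tasks, n_vms, tau0=1.0, boost=2.0):
    """Seed the ACO pheromone matrix: extra deposit on the DE solution."""
    tau = [[tau0] * n_vms for _ in range(n_tasks)]
    for t, v in enumerate(assign):
        tau[t][v] += boost
    return tau

task_len = [10, 20, 30, 40, 15]   # arbitrary task lengths
vm_speed = [1.0, 2.0]             # arbitrary VM speeds
assign, cost = de_like_search(task_len, vm_speed)
tau = init_pheromone(assign, len(task_len), len(vm_speed))
```

    The ants then start from a pheromone landscape already biased toward a good schedule instead of a uniform one, which is the source of the early-iteration speed-up claimed above.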

  9. Proxy-equation paradigm: A strategy for massively parallel asynchronous computations

    Science.gov (United States)

    Mittal, Ankita; Girimaji, Sharath

    2017-09-01

    Massively parallel simulations of transport equation systems call for a paradigm change in algorithm development to achieve efficient scalability. Traditional approaches require time synchronization of processing elements (PEs), which severely restricts scalability. Relaxing the synchronization requirement introduces error and slows down convergence. In this paper, we propose and develop a novel "proxy equation" concept for a general transport equation that (i) tolerates asynchrony with minimal added error, (ii) preserves convergence order and thus (iii) is expected to scale efficiently on massively parallel machines. The central idea is to modify a priori the transport equation at the PE boundaries to offset asynchrony errors. Proof-of-concept computations are performed using a one-dimensional advection (convection) diffusion equation. The results demonstrate the promise and advantages of the present strategy.
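
    The proof-of-concept equation can be illustrated with a minimal synchronous finite-difference solver. This is a sketch only: the paper's actual contribution, the proxy-equation modification at PE boundaries that tolerates stale neighbor data, is not reproduced here, and the grid and coefficient values are arbitrary.

```python
def step(u, c, nu, dx, dt):
    """One explicit (FTCS) step of u_t + c*u_x = nu*u_xx on a periodic grid."""
    n = len(u)
    new = u[:]
    for i in range(n):
        left, right = u[(i - 1) % n], u[(i + 1) % n]  # periodic boundary
        adv = -c * (right - left) / (2 * dx)          # central advection
        dif = nu * (right - 2 * u[i] + left) / dx**2  # diffusion
        new[i] = u[i] + dt * (adv + dif)
    return new

# Diffuse an initial spike: total "mass" is conserved and the peak decays.
u = [0.0] * 10
u[5] = 1.0
for _ in range(50):
    u = step(u, c=0.1, nu=0.05, dx=0.1, dt=0.01)
```

    In an asynchronous run, `left` and `right` at a PE boundary may come from an older time level; the proxy-equation strategy compensates for that by altering the equation coefficients near the boundary rather than by waiting.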

  10. Digital mammography: Signal-extraction strategies for computer-aided detection of microcalcifications

    International Nuclear Information System (INIS)

    Chan, H.P.; Doi, K.; Metz, C.E.; Vyborny, C.J.; Lam, K.L.; Schmidt, R.A.

    1987-01-01

    The authors found that the structured background of a mammogram can be removed effectively either by a difference-image technique (using a matched filter in combination with a median filter, a contrast-reversal filter, or a box-rim filter) or by a visual response filter alone. Locally adaptive gray-level thresholding and region-growing techniques can then be employed to extract microcalcifications from the processed image. Signals are further distinguished from noise or artifacts by their size, contrast, and clustering properties. The authors studied the dependence of the detectability of microcalcifications on the various signal-extraction strategies. Potential application of the computer-aided system to mammography is assessed by its performance on clinical mammograms.
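
    A hedged sketch of the locally adaptive thresholding and region-growing step, not the authors' exact filters: a pixel becomes a seed when it exceeds its local mean by a contrast margin, and seeds are grouped into 4-connected regions whose size, contrast and count could then feed the clustering tests described above. The grid, margin and window radius are invented for illustration.

```python
def local_mean(img, i, j, r=1):
    """Mean gray level in a (2r+1)x(2r+1) window, clipped at the borders."""
    vals = [img[a][b]
            for a in range(max(0, i - r), min(len(img), i + r + 1))
            for b in range(max(0, j - r), min(len(img[0]), j + r + 1))]
    return sum(vals) / len(vals)

def extract_regions(img, margin=2.0):
    """Locally adaptive threshold, then grow 4-connected seed regions."""
    h, w = len(img), len(img[0])
    seeds = {(i, j) for i in range(h) for j in range(w)
             if img[i][j] - local_mean(img, i, j) > margin}
    seen, regions = set(), []
    for s in sorted(seeds):
        if s in seen:
            continue
        stack, region = [s], []
        while stack:
            i, j = stack.pop()
            if (i, j) in seen or (i, j) not in seeds:
                continue
            seen.add((i, j))
            region.append((i, j))
            stack.extend([(i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)])
        regions.append(region)
    return regions

# A flat background with two bright specks yields two separate regions.
img = [[1] * 8 for _ in range(8)]
img[2][2] = 9
img[5][6] = 9
print(len(extract_regions(img)))  # 2
```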

  11. Using computer-aided drug design and medicinal chemistry strategies in the fight against diabetes.

    Science.gov (United States)

    Semighini, Evandro P; Resende, Jonathan A; de Andrade, Peterson; Morais, Pedro A B; Carvalho, Ivone; Taft, Carlton A; Silva, Carlos H T P

    2011-04-01

    The aim of this work is to present a simple, practical and efficient protocol for drug design, in particular for diabetes, comprising selection of the illness, careful choice of a target and a bioactive ligand, and then use of various computer-aided drug design and medicinal chemistry tools to design novel potential drug candidates for different diseases. We selected the validated target dipeptidyl peptidase IV (DPP-IV), whose inhibition helps to reduce glucose levels in type 2 diabetes patients. The most active inhibitor with a reported X-ray structure of the complex was initially extracted from the BindingDB database. By using molecular modification strategies widely used in medicinal chemistry, together with current state-of-the-art tools in drug design (including flexible docking, virtual screening, molecular interaction fields, molecular dynamics, ADME and toxicity predictions), we have proposed 4 novel potential DPP-IV inhibitors with drug properties for diabetes control, which have been supported and validated by all the computational tools used herewith.

  12. 48 CFR 7.107 - Additional requirements for acquisitions involving bundling.

    Science.gov (United States)

    2010-10-01

    ...; and (2) The acquisition strategy provides for maximum practicable participation by small business... the Government. However, because of the potential impact on small business participation, the head of... acquisition strategy involves substantial bundling, the acquisition strategy must additionally— (1) Identify...

  13. Flat panel detector-based cone beam computed tomography with a circle-plus-two-arcs data acquisition orbit: Preliminary phantom study

    International Nuclear Information System (INIS)

    Ning Ruola; Tang Xiangyang; Conover, David; Yu Rongfeng

    2003-01-01

    Cone beam computed tomography (CBCT) has been investigated in the past two decades due to its potential advantages over a fan beam CT. These advantages include (a) great improvement in data acquisition efficiency, spatial resolution, and spatial resolution uniformity, (b) substantially better utilization of x-ray photons generated by the x-ray tube compared to a fan beam CT, and (c) significant advancement in clinical three-dimensional (3D) CT applications. However, most past studies of CBCT have focused on cone beam data acquisition theories and reconstruction algorithms. The recent development of x-ray flat panel detectors (FPD) has made CBCT imaging feasible and practical. This paper reports a newly built flat panel detector-based CBCT prototype scanner and presents the results of the preliminary evaluation of the prototype through a phantom study. The prototype consisted of an x-ray tube, a flat panel detector, a GE 8800 CT gantry, a patient table and a computer system. The prototype was constructed by modifying a GE 8800 CT gantry such that both a single-circle cone beam acquisition orbit and a circle-plus-two-arcs orbit can be achieved. With a circle-plus-two-arcs orbit, a complete set of cone beam projection data can be obtained, consisting of a set of circle projections and a set of arc projections. Using the prototype scanner, the set of circle projections was acquired by rotating the x-ray tube and the FPD together on the gantry, and the set of arc projections was obtained by tilting the gantry while the x-ray tube and detector were at the 12 and 6 o'clock positions, respectively. A filtered backprojection exact cone beam reconstruction algorithm based on a circle-plus-two-arcs orbit was used for cone beam reconstruction from both the circle and arc projections. The system was first characterized in terms of the linearity and dynamic range of the detector. Then the uniformity, spatial resolution and low contrast resolution were assessed using

  14. An image acquisition and registration strategy for the fusion of hyperpolarized helium-3 MRI and x-ray CT images of the lung

    Science.gov (United States)

    Ireland, Rob H.; Woodhouse, Neil; Hoggard, Nigel; Swinscoe, James A.; Foran, Bernadette H.; Hatton, Matthew Q.; Wild, Jim M.

    2008-11-01

    The purpose of this ethics committee approved prospective study was to evaluate an image acquisition and registration protocol for hyperpolarized helium-3 magnetic resonance imaging (3He-MRI) and x-ray computed tomography. Nine patients with non-small cell lung cancer (NSCLC) gave written informed consent to undergo a free-breathing CT, an inspiration breath-hold CT and a 3D ventilation 3He-MRI in CT position using an elliptical birdcage radiofrequency (RF) body coil. 3He-MRI to CT image fusion was performed using a rigid registration algorithm which was assessed by two observers using anatomical landmarks and a percentage volume overlap coefficient. Registration of 3He-MRI to breath-hold CT was more accurate than to free-breathing CT; overlap 82.9 ± 4.2% versus 59.8 ± 9.0% (p < 0.001) and mean landmark error 0.75 ± 0.24 cm versus 1.25 ± 0.60 cm (p = 0.002). Image registration is significantly improved by using an imaging protocol that enables both 3He-MRI and CT to be acquired with similar breath holds and body position through the use of a birdcage 3He-MRI body RF coil and an inspiration breath-hold CT. Fusion of 3He-MRI to CT may be useful for the assessment of patients with lung diseases.
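
    The percentage volume overlap reported above is typically a Dice-style coefficient; a sketch over flattened boolean voxel masks (the study's exact definition may differ) is:

```python
def volume_overlap(mask_a, mask_b):
    """Dice-style overlap, as a percentage, of two binary voxel masks."""
    inter = sum(1 for a, b in zip(mask_a, mask_b) if a and b)
    total = sum(mask_a) + sum(mask_b)
    return 200.0 * inter / total  # 100% when the masks are identical

a = [1, 1, 1, 0, 0]
b = [0, 1, 1, 1, 0]
print(volume_overlap(a, b))  # 2 shared voxels out of 3+3 -> ~66.7%
```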

  15. An image acquisition and registration strategy for the fusion of hyperpolarized helium-3 MRI and x-ray CT images of the lung

    International Nuclear Information System (INIS)

    Ireland, Rob H; Woodhouse, Neil; Hoggard, Nigel; Swinscoe, James A; Foran, Bernadette H; Hatton, Matthew Q; Wild, Jim M

    2008-01-01

    The purpose of this ethics committee approved prospective study was to evaluate an image acquisition and registration protocol for hyperpolarized helium-3 magnetic resonance imaging (3He-MRI) and x-ray computed tomography. Nine patients with non-small cell lung cancer (NSCLC) gave written informed consent to undergo a free-breathing CT, an inspiration breath-hold CT and a 3D ventilation 3He-MRI in CT position using an elliptical birdcage radiofrequency (RF) body coil. 3He-MRI to CT image fusion was performed using a rigid registration algorithm which was assessed by two observers using anatomical landmarks and a percentage volume overlap coefficient. Registration of 3He-MRI to breath-hold CT was more accurate than to free-breathing CT; overlap 82.9 ± 4.2% versus 59.8 ± 9.0% (p < 0.001) and mean landmark error 0.75 ± 0.24 cm versus 1.25 ± 0.60 cm (p = 0.002). Image registration is significantly improved by using an imaging protocol that enables both 3He-MRI and CT to be acquired with similar breath holds and body position through the use of a birdcage 3He-MRI body RF coil and an inspiration breath-hold CT. Fusion of 3He-MRI to CT may be useful for the assessment of patients with lung diseases.

  16. Comparison of Tc-99m-sestamibi-F-18-fluorodeoxyglucose dual isotope simultaneous acquisition and rest-stress Tc-99m-sestamibi single photon emission computed tomography for the assessment of myocardial viability

    NARCIS (Netherlands)

    De Boer, J; Slart, RHJA; Blanksma, Paulus; Willemsen, ATM; Jager, PL; Paans, AMJ; Vaalburg, W; Piers, DA

    Dual isotope simultaneous acquisition single photon emission computed tomography (DISA SPECT) offers the advantage of obtaining information on myocardial perfusion using Tc-99m-sestamibi (Tc-99m-MIBI) and metabolism using F-18-fluorodeoxyglucose (F-18-FDG) in a single study. The prerequisite is that

  17. Comparison of 99mTc-sestamibi-18F-fluorodeoxyglucose dual isotope simultaneous acquisition and rest-stress 99mTc-sestamibi single photon emission computed tomography for the assessment of myocardial viability

    NARCIS (Netherlands)

    den Boer, Johan; Slart, R H J A; Blanksma, P K; Willemsen, Antonius; Jager, P L; Paans, A M J; Vaalburg, W; Piers, D A

    Dual isotope simultaneous acquisition single photon emission computed tomography (DISA SPECT) offers the advantage of obtaining information on myocardial perfusion using 99mTc-sestamibi (99mTc-MIBI) and metabolism using 18F-fluorodeoxyglucose (18F-FDG) in a single study. The prerequisite is that the 99mTc-MIBI

  18. Examining the Effects of Two Computer Programming Learning Strategies: Self-Explanation versus Reading Questions and Answers

    Directory of Open Access Journals (Sweden)

    Nancy Lee

    2017-08-01

    The study described here explored the differential effects of two learning strategies, self-explanation and reading questions and answers, on learning the computer programming language JavaScript. Students' test performance and perceptions of effectiveness toward the two strategies were examined. An online interactive tutorial implementing worked examples and multimedia learning principles was developed for this study. Participants were 147 high school students (ages 14 to 18) in a computer introductory course taught in six periods, which were randomly divided into two groups (n = 78; n = 69) of three periods each. The two groups alternated learning strategies to learn five lessons. Students' prerequisite knowledge of XHTML and motivation to learn computer programming languages were measured before starting the tutorial. Students largely expressed a preference for self-explanation over reading questions and answers: they thought self-explanation involved much more work yet was more effective. However, the two learning strategies did not have differential effects on students' test performance. The seeming discrepancy between students' preferred strategy and their test performance is discussed in terms of familiar versus new strategies, the difficulty of the learning materials and testing method, and the experimental duration.

  19. Usefulness and safety of propranolol injection into vein for acquisition of coronary multidetector-row computed tomography

    International Nuclear Information System (INIS)

    Sekine, Takako; Kodama, Takahide; Kondo, Takeshi

    2010-01-01

    A low heart rate (HR), associated with a prolonged slow filling phase (SF), is necessary to obtain a high quality coronary CT at a low radiation dose with conventional 64 multidetector-row computed tomography (MDCT). The purpose of our study was to confirm the safety of injecting propranolol (2-10 mg) into the vein for lowering heart rate in patients requiring MDCT and to document the effect of the drug on HR, PQ and SF. Of 1290 consecutive patients who were initially considered for enrollment in the coronary MDCT study, 40 patients with atrial fibrillations, 3 with atrial flutters, and 13 with artificial pacemakers were excluded. Of the remaining 1234 patients (M/F=714/520), 331 had already taken an oral beta-blocker before the CT examination, and were included in the study. In patients with no contraindications, propranolol was aggressively injected (2-10 mg) into the vein to reduce the HR. In patients not taking an oral beta blocker, 2 mg propranolol reduced the HR by -10±5 bpm and 10 mg, by -20±7 bpm. However, in patients taking an oral beta-blocker, the decrease in HR by propranolol was minimal (2 mg, -6±4 bpm; 10 mg, -10±6 bpm). Propranolol significantly prolonged the PQ interval (from 169±27 to 179±29 ms, P<0.0001), and SF (from 125±69 to 264±79 ms, P<0.0001). Adverse effects of propranolol injection were observed in only 3 [2 mild hypotension and 1 paroxysmal atrial fibrillation (recovered to sinus rhythm by DC counter shock)] of 3212 patients. All 3 patients became stable after 1 or 2 hours of rest and could return home. Propranolol injection was a relatively safe and useful method to reduce HR and prolong SF, necessary for obtaining high quality coronary MDCT with a low radiation dose. (author)

  20. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

    Directory of Open Access Journals (Sweden)

    Deyan Luan

    2007-07-01

    The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa) activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
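
    The Monte Carlo sensitivity idea can be sketched on a deliberately tiny stand-in model (NOT the 92-protein cascade): each parameter is perturbed around randomly sampled nominal values and the normalized (logarithmic) response change is averaged, so large values flag fragile mechanisms. All names and ranges below are invented for illustration.

```python
import random

def model(params):
    """Toy stand-in response: linear in k1, saturating in k2."""
    k1, k2 = params
    return k1 * k2 / (1.0 + k2)

def sensitivity(model, n_params, index, samples=1000, rel_step=0.01, seed=1):
    """Monte Carlo average of the normalized sensitivity d ln y / d ln p."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        p = [rng.uniform(0.5, 2.0) for _ in range(n_params)]
        base = model(p)
        q = p[:]
        q[index] *= 1.0 + rel_step  # perturb one parameter
        acc += (model(q) - base) / base / rel_step
    return acc / samples

s1 = sensitivity(model, 2, 0)  # k1 enters linearly: sensitivity ~ 1 (fragile)
s2 = sensitivity(model, 2, 1)  # k2 saturates: sensitivity < 1 (more robust)
```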

  1. CLOUD COMPUTING AND BIG DATA AS CONVERGENT TECHNOLOGIES FOR RETAIL PRICING STRATEGIES OF SMEs

    Directory of Open Access Journals (Sweden)

    George SUCIU

    2014-05-01

    Most retailers know that technology has played an increasingly important role in helping retailers set prices. Online business decision systems are at the core of an SME's management and reporting activities. But, until recently, these efforts have been rooted in advances in computing technology, such as cloud computing and big data mining, rather than in newfound applications of scientific principles. In addition, in previous approaches big data mining solutions were implemented locally on private clouds, and no SME could aggregate and analyze the information that consumers are exchanging with each other. Real science is a powerful, pervasive force in retail today, particularly so for addressing the complex challenge of retail pricing. Cloud computing comes in to provide access to entirely new business capabilities through sharing resources and services and managing and assigning resources effectively. Done right, the application of scientific principles to the creation of a true price optimization strategy can lead to significant sales, margin, and profit lift for retailers. In this paper we describe a method to provide mobile retail consumers with reviews, brand ratings and detailed product information at the point of sale. Furthermore, we present how we use the Exalead CloudView platform to search for weak signals in big data by analyzing multimedia data (text, voice, picture, video) and mining online social networks. The analysis makes not only customer profiling possible, but also brand promotion in the form of coupons, discounts or upselling to generate more sales, thus providing the opportunity for retailer SMEs to connect directly to their customers in real time. The paper explains why retailers can no longer thrive without a science-based pricing system, defines and illustrates the right science-based approach, and calls out the key features and functionalities of leading science-based price optimization systems. In particular, given

  2. Variable Selection for Nonparametric Gaussian Process Priors: Models and Computational Strategies.

    Science.gov (United States)

    Savitsky, Terrance; Vannucci, Marina; Sha, Naijun

    2011-02-01

    This paper presents a unified treatment of Gaussian process models that extends to data from the exponential dispersion family and to survival data. Our specific interest is in the analysis of data sets with predictors that have an a priori unknown form of possibly nonlinear associations to the response. The modeling approach we describe incorporates Gaussian processes in a generalized linear model framework to obtain a class of nonparametric regression models where the covariance matrix depends on the predictors. We consider, in particular, continuous, categorical and count responses. We also look into models that account for survival outcomes. We explore alternative covariance formulations for the Gaussian process prior and demonstrate the flexibility of the construction. Next, we focus on the important problem of selecting variables from the set of possible predictors and describe a general framework that employs mixture priors. We compare alternative MCMC strategies for posterior inference and achieve a computationally efficient and practical approach. We demonstrate performances on simulated and benchmark data sets.
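
    How variable selection enters the covariance can be sketched with a squared-exponential kernel gated by a binary inclusion vector gamma; excluded predictors simply drop out of the distance. The mixture priors and MCMC over gamma described in the paper are not reproduced here, and all names below are illustrative.

```python
import math

def se_kernel(x, y, gamma, lengthscale=1.0, var=1.0):
    """Squared-exponential covariance over the selected coordinates only."""
    d2 = sum(g * (a - b) ** 2 for g, a, b in zip(gamma, x, y))
    return var * math.exp(-0.5 * d2 / lengthscale ** 2)

x = (0.0, 0.0, 0.0)
y = (0.0, 5.0, 0.0)
# With the second predictor excluded, the two points look identical to the
# GP; including it makes their covariance drop sharply.
print(se_kernel(x, y, gamma=(1, 0, 1)))        # 1.0
print(se_kernel(x, y, gamma=(1, 1, 1)) < 1.0)  # True
```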

  3. The Goal Specificity Effect on Strategy Use and Instructional Efficiency during Computer-Based Scientific Discovery Learning

    Science.gov (United States)

    Kunsting, Josef; Wirth, Joachim; Paas, Fred

    2011-01-01

    Using a computer-based scientific discovery learning environment on buoyancy in fluids we investigated the "effects of goal specificity" (nonspecific goals vs. specific goals) for two goal types (problem solving goals vs. learning goals) on "strategy use" and "instructional efficiency". Our empirical findings close an important research gap,…

  4. Game-Based Practice versus Traditional Practice in Computer-Based Writing Strategy Training: Effects on Motivation and Achievement

    Science.gov (United States)

    Proske, Antje; Roscoe, Rod D.; McNamara, Danielle S.

    2014-01-01

    Achieving sustained student engagement with practice in computer-based writing strategy training can be a challenge. One potential solution is to foster engagement by embedding practice in educational games; yet there is currently little research comparing the effectiveness of game-based practice versus more traditional forms of practice. In this…

  5. The Effectiveness of Interactive Computer Assisted Modeling in Teaching Study Strategies and Concept Mapping of College Textbook Material.

    Science.gov (United States)

    Mikulecky, Larry

    A study evaluated the effectiveness of a series of print materials and interactive computer-guided study programs designed to lead undergraduate students to apply basic textbook reading and concept mapping strategies to the study of science and social science textbooks. Following field testing with 25 learning skills students, 50 freshman biology…

  6. Technologies and Business Value of Cloud Computing: Strategy for the Department of Information Processing

    OpenAIRE

    Manyando, Obyster

    2013-01-01

    This research focuses on the area of cloud computing, an important research area in computer and service science. The general aim of this research is to emphasize the business value of cloud computing to businesses. The objective of this research is to explore the economic benefits of cloud computing in order to promote its use in business and the education sector. Consequently, the research recommends the adoption of cloud computing by the Kemi-Tornio University of Applied Sciences. The ...

  7. Indagini informatiche e acquisizione della prova nel processo penale / Enquêtes informatiques et acquisition d’éléments de preuve dans le procès pénal / Computer investigation and acquisition of evidence in criminal proceedings

    Directory of Open Access Journals (Sweden)

    Bravo Fabio

    2010-03-01

    This article examines the new regulations introduced by the Budapest Convention on Cybercrime and the Italian ratification law, Law No. 48/2008. Attention is focused primarily on their impact on computer investigation and the acquisition of evidence in criminal proceedings. The article also analyzes some relevant Italian court decisions, which show an uncertain and fluctuating trend regarding the elements and principles of computer forensics. Although Italian law has now implemented the principles of computer forensics in several articles of the Criminal Procedure Code, it will be necessary to verify how these principles will be applied in practice.

  8. Four-channel multidetector-row computed tomography in the evaluation of facial fractures - optimized parameters for acquisition and multiplanar reformation

    International Nuclear Information System (INIS)

    Omid, P. M.

    2002-08-01

    The first part of this thesis is designed to give the reader a comprehensive survey of the complex basic principles of computed tomography (CT), from its early beginnings to the recent development of multidetector-row CT (MD-CT). Attention is focused on imaging of trauma in general and on imaging of facial fractures in particular. The second part of this thesis describes a clinical study performed to optimize acquisition protocols and multiplanar reformation (MPR) algorithms for the evaluation of facial fractures using MD-CT, which had not yet been described in the literature. For this study, a cadaver head with artificial blunt facial trauma was examined using a 4-channel MD-CT scanner. The influence of acquisition parameters (collimation: 2x0.5 mm/4x1 mm/4x2.5 mm; tube current: 120 mAs/90 mAs/60 mAs), image reconstruction algorithms (standard vs. ultra-high resolution (UHR) modes; reconstructed slice thicknesses: 0.5 mm/1 mm/3 mm; increment: 0.3 mm/0.6 mm/1.5 mm), and reformation algorithms (slice thicknesses: 0.5 mm/1 mm/3 mm; overlap: 0.5 mm/1 mm/3 mm) on the detectability of facial fractures in MPRs with MD-CT was analyzed. Effects of algorithms and parameters on image noise, artifacts and delineation of soft tissues were evaluated. The results of this study reliably demonstrate that fracture detection was significantly higher with thin MPRs (0.5/0.5 mm, 1/0.5 mm, 1/1 mm) (p = 0.014) acquired with 2x0.5 mm collimation (p = 0.046), in UHR mode (p < 0.0005) with 120 mAs (p = 0.025). Inter-observer variability showed very good agreement (κ ≥ 0.942). Non-UHR mode, lower mAs and thick MPRs (3/0.5 mm, 3/1 mm, 3/0.5 mm) showed significantly decreased detectability. (author)

  9. New KENS data acquisition system

    International Nuclear Information System (INIS)

    Arai, M.; Furusaka, M.; Satoh, S.

    1989-01-01

    In this report, the authors discuss a data acquisition system, KENSnet, which was newly introduced to the KENS facility. The criteria for the data acquisition system were about 1 MIPS of CPU speed and 150 Mbytes of storage capacity per spectrometer computer. VAX computers were chosen, with their proprietary operating system, VMS. The VAX computers are connected by a DECnet network mediated by Ethernet. Front-end computers, Apple Macintosh Plus and Macintosh II, were chosen for their user-friendly manipulation and intelligence. New CAMAC-based data acquisition electronics were developed. The data acquisition control program (ICP) and the general data analysis program (Genie) were both developed at ISIS and have been installed. 2 refs., 3 figs., 1 tab

  10. A systematic data acquisition and mining strategy for chemical profiling of Aster tataricus rhizoma (Ziwan) by UHPLC-Q-TOF-MS and the corresponding anti-depressive activity screening.

    Science.gov (United States)

    Sun, Yupeng; Li, Li; Liao, Man; Su, Min; Wan, Changchen; Zhang, Lantong; Zhang, Hailin

    2018-05-30

    In this study, a systematic data acquisition and mining strategy for the traditional Chinese medicine (TCM) complex system, based on ultra high-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF-MS), is reported. The workflow of this strategy is as follows: first, high-resolution mass data are acquired in both data-dependent acquisition (DDA) and data-independent acquisition (DIA) modes. Then global data mining that combines targeted and non-targeted compound finding is applied to analyze the mass spectral data. Furthermore, assistant tools, such as key product ions (KPIs), are employed for compound hunting and identification. The TCM Ziwan (ZW, Aster tataricus rhizoma) was used to illustrate this strategy for the first time. In this research, a total of 131 compounds including organic acids, peptides, terpenes, steroids, flavonoids, coumarins, anthraquinones and aldehydes were identified or tentatively characterized in ZW based on accurate mass measurements within ±5 ppm error, and 50 of them were unambiguously confirmed by comparison with standard compounds. Afterwards, based on traditional Chinese medical theory and the role of the firing patterns of ventral tegmental area (VTA) dopamine (DA) neurons as key determinants in the development of depression, the confirmed compounds were evaluated for their pharmacological effect on the activity of VTA DA neurons and for anti-depressive efficacy. This research provides not only a chemical profile for further in vivo study of ZW, but also an efficient data acquisition and mining strategy to profile the chemical constituents and find new bioactive substances in other TCM complex systems. Copyright © 2018 Elsevier B.V. All rights reserved.
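
    The key product ion (KPI) idea can be sketched as a ppm-tolerance fragment filter: a candidate spectrum is flagged when any of its fragment m/z values falls within a few ppm of a diagnostic fragment. The spectra, fragment values and the 5 ppm tolerance below are invented for illustration.

```python
def ppm_match(mz, target, tol_ppm=5.0):
    """True if mz is within tol_ppm parts-per-million of target."""
    return abs(mz - target) / target * 1e6 <= tol_ppm

def find_kpi_hits(spectra, kpi_mz, tol_ppm=5.0):
    """Return the names of spectra containing the key product ion."""
    hits = []
    for name, peaks in spectra.items():
        if any(ppm_match(mz, kpi_mz, tol_ppm) for mz in peaks):
            hits.append(name)
    return hits

spectra = {
    "peak_A": [85.0288, 133.0648],  # hypothetical fragment lists
    "peak_B": [121.0295, 163.0390],
}
print(find_kpi_hits(spectra, kpi_mz=85.0289))  # ['peak_A']
```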

  11. Individual Differences in Strategy Use on Division Problems: Mental versus Written Computation

    Science.gov (United States)

    Hickendorff, Marian; van Putten, Cornelis M.; Verhelst, Norman D.; Heiser, Willem J.

    2010-01-01

    Individual differences in strategy use (choice and accuracy) were analyzed. A sample of 362 Grade 6 students solved complex division problems under 2 different conditions. In the choice condition students were allowed to use either a mental or a written strategy. In the subsequent no-choice condition, they were required to use a written strategy.…

  12. Acquisition IT Integration

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Øhrgaard, Christian

    2015-01-01

    of temporary agency workers. Following an analytic induction approach, theoretically grounded in the resource-based view of the firm, we identify the complementary and supplementary roles consultants can assume in acquisition IT integration. Through case studies of three acquirers, we investigate how... the acquirers appropriate the use of agency workers as part of their acquisition strategies. For the investigated acquirers, assigning roles to agency workers is contingent on balancing the needs of knowledge induction and knowledge retention, as well as experience richness and in-depth understanding. Composition...

  13. LEGS data acquisition facility

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1985-01-01

    The data acquisition facility for the LEGS medium energy photonuclear beam line is composed of an auxiliary crate controller (ACC) acting as a front-end processor, loosely coupled to a time-sharing host computer based on a UNIX-like environment. The ACC services all real-time demands in the CAMAC crate: it responds to LAMs generated by data acquisition modules, to keyboard commands, and it refreshes the graphics display at frequent intervals. The host processor is needed only for printing histograms and recording event buffers on magnetic tape. The host also provides the environment for software development. The CAMAC crate is interfaced by a VERSAbus CAMAC branch driver

  14. A moving blocker-based strategy for simultaneous megavoltage and kilovoltage scatter correction in cone-beam computed tomography image acquired during volumetric modulated arc therapy

    International Nuclear Information System (INIS)

    Ouyang, Luo; Lee, Huichen Pam; Wang, Jing

    2015-01-01

    Purpose: To evaluate a moving blocker-based approach in estimating and correcting megavoltage (MV) and kilovoltage (kV) scatter contamination in kV cone-beam computed tomography (CBCT) acquired during volumetric modulated arc therapy (VMAT). Methods and materials: During the concurrent CBCT/VMAT acquisition, a physical attenuator (i.e., “blocker”) consisting of equally spaced lead strips was mounted and moved constantly between the CBCT source and patient. Both kV and MV scatter signals were estimated from the blocked region of the imaging panel, and interpolated into the unblocked region. A scatter corrected CBCT was then reconstructed from the unblocked projections after scatter subtraction using an iterative image reconstruction algorithm based on constraint optimization. Experimental studies were performed on a Catphan® phantom and an anthropomorphic pelvis phantom to demonstrate the feasibility of using a moving blocker for kV–MV scatter correction. Results: Scatter induced cupping artifacts were substantially reduced in the moving blocker corrected CBCT images. Quantitatively, the root mean square error of Hounsfield units (HU) in seven density inserts of the Catphan phantom was reduced from 395 to 40. Conclusions: The proposed moving blocker strategy greatly improves the image quality of CBCT acquired with concurrent VMAT by reducing the kV–MV scatter induced HU inaccuracy and cupping artifacts
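
    In one dimension, the blocker-based scatter estimate amounts to interpolating the signal measured behind the lead strips across the open regions and subtracting it (a minimal sketch; the detector profile and strip positions are invented):

```python
import numpy as np

# 1-D sketch of the moving-blocker idea: pixels behind the lead strips see
# (almost) pure scatter; interpolate that signal across the open regions
# and subtract it. Profile values and strip positions are illustrative.
detector = np.array([10.0, 52.0, 55.0, 12.0, 50.0, 53.0, 11.0, 49.0])
blocked = np.array([0, 3, 6])            # pixel indices behind lead strips
open_px = np.setdiff1d(np.arange(detector.size), blocked)

scatter_est = np.interp(open_px, blocked, detector[blocked])
corrected = detector.copy()
corrected[open_px] -= scatter_est        # scatter-subtracted projection
corrected[blocked] = 0.0                 # blocked pixels carry no primary signal
```

In the actual method the interpolation runs in 2-D across the strip pattern, and the corrected projections feed an iterative constrained-optimization reconstruction.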

  15. Evaluation of cognitive loads imposed by traditional paper-based and innovative computer-based instructional strategies.

    Science.gov (United States)

    Khalil, Mohammed K; Mansour, Mahmoud M; Wilhite, Dewey R

    2010-01-01

    Strategies of presenting instructional information affect the type of cognitive load imposed on the learner's working memory. Effective instruction reduces extraneous (ineffective) cognitive load and promotes germane (effective) cognitive load. Eighty first-year students from two veterinary schools completed a two-section questionnaire that evaluated their perspectives on the educational value of a computer-based instructional program. They compared the difference between cognitive loads imposed by paper-based and computer-based instructional strategies used to teach the anatomy of the canine skeleton. Section I included 17 closed-ended items, rated on a five-point Likert scale, that assessed the use of graphics, content, and the learning process. Section II included a nine-point mental effort rating scale to measure the level of difficulty of instruction; students were asked to indicate the amount of mental effort invested in the learning task using both paper-based and computer-based presentation formats. The closed-ended data were expressed as means and standard deviations. A paired t test with an alpha level of 0.05 was used to determine the overall mean difference between the two presentation formats. Students positively evaluated their experience with the computer-based instructional program with a mean score of 4.69 (SD=0.53) for use of graphics, 4.70 (SD=0.56) for instructional content, and 4.45 (SD=0.67) for the learning process. The mean difference of mental effort (1.50) between the two presentation formats was significant, t=8.26, p≤.0001, df=76, for two-tailed distribution. Consistent with cognitive load theory, innovative computer-based instructional strategies decrease extraneous cognitive load compared with traditional paper-based instructional strategies.
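
    The paired t test used to compare the two presentation formats reduces to the mean of the per-student differences divided by its standard error; a minimal sketch with hypothetical ratings:

```python
import math

# Minimal paired t statistic, as used to compare mental-effort ratings
# between formats. The rating lists are hypothetical, not study data.
def paired_t(x, y):
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance
    return mean / math.sqrt(var / n)

paper    = [7, 6, 8, 7, 6, 7]   # hypothetical mental-effort ratings
computer = [5, 5, 6, 6, 5, 5]
print(round(paired_t(paper, computer), 3))
```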

  16. A computational approach to compare regression modelling strategies in prediction research.

    Science.gov (United States)

    Pajouheshnia, Romin; Pestman, Wiebe R; Teerenstra, Steven; Groenwold, Rolf H H

    2016-08-25

    It is often unclear which approach to fit, assess and adjust a model will yield the most accurate prediction model. We present an extension of an approach for comparing modelling strategies in linear regression to the setting of logistic regression and demonstrate its application in clinical prediction research. A framework for comparing logistic regression modelling strategies by their likelihoods was formulated using a wrapper approach. Five different strategies for modelling, including simple shrinkage methods, were compared in four empirical data sets to illustrate the concept of a priori strategy comparison. Simulations were performed in both randomly generated data and empirical data to investigate the influence of data characteristics on strategy performance. We applied the comparison framework in a case study setting. Optimal strategies were selected based on the results of a priori comparisons in a clinical data set and the performance of models built according to each strategy was assessed using the Brier score and calibration plots. The performance of modelling strategies was highly dependent on the characteristics of the development data in both linear and logistic regression settings. A priori comparisons in four empirical data sets found that no strategy consistently outperformed the others. The percentage of times that a model adjustment strategy outperformed a logistic model ranged from 3.9 to 94.9 %, depending on the strategy and data set. However, in our case study setting the a priori selection of optimal methods did not result in detectable improvement in model performance when assessed in an external data set. The performance of prediction modelling strategies is a data-dependent process and can be highly variable between data sets within the same clinical domain. A priori strategy comparison can be used to determine an optimal logistic regression modelling strategy for a given data set before selecting a final modelling approach.
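
    The Brier score used to assess the models is simply the mean squared difference between predicted probabilities and observed 0/1 outcomes; a minimal sketch with invented predictions from two hypothetical strategies:

```python
# Brier score: mean squared difference between predicted probability and
# the binary outcome; lower is better. Predictions below are invented.
def brier_score(probs, outcomes):
    return sum((p - y) ** 2 for p, y in zip(probs, outcomes)) / len(probs)

y       = [1, 0, 1, 1, 0]                 # observed outcomes
model_a = [0.9, 0.2, 0.8, 0.7, 0.1]       # hypothetical strategy A
model_b = [0.6, 0.4, 0.5, 0.6, 0.5]       # hypothetical strategy B
print(brier_score(model_a, y), brier_score(model_b, y))
```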

  17. A complex-plane strategy for computing rotating polytropic models - Numerical results for strong and rapid differential rotation

    International Nuclear Information System (INIS)

    Geroyannis, V.S.

    1990-01-01

    In this paper, a numerical method, called the complex-plane strategy, is implemented in the computation of polytropic models distorted by strong and rapid differential rotation. The differential rotation model results from a direct generalization of the classical model within the framework of the complex-plane strategy; this generalization yields very strong differential rotation. Accordingly, the polytropic models assume extremely distorted interiors, while their boundaries are only slightly distorted. For an accurate simulation of differential rotation, a versatile method, called the multiple partition technique, is developed and implemented. It is shown that the method remains reliable up to rotation states where other elaborate techniques fail to give accurate results. 11 refs

  18. Computer-aided detection of breast masses: Four-view strategy for screening mammography

    International Nuclear Information System (INIS)

    Wei Jun; Chan Heangping; Zhou Chuan; Wu Yita; Sahiner, Berkman; Hadjiiski, Lubomir M.; Roubidoux, Marilyn A.; Helvie, Mark A.

    2011-01-01

    Purpose: To improve the performance of a computer-aided detection (CAD) system for mass detection by using four-view information in screening mammography. Methods: The authors developed a four-view CAD system that emulates radiologists' reading by using the craniocaudal and mediolateral oblique views of the ipsilateral breast to reduce false positives (FPs) and the corresponding views of the contralateral breast to detect asymmetry. The CAD system consists of four major components: (1) Initial detection of breast masses on individual views, (2) information fusion of the ipsilateral views of the breast (referred to as two-view analysis), (3) information fusion of the corresponding views of the contralateral breast (referred to as bilateral analysis), and (4) fusion of the four-view information with a decision tree. The authors collected two data sets for training and testing of the CAD system: A mass set containing 389 patients with 389 biopsy-proven masses and a normal set containing 200 normal subjects. All cases had four-view mammograms. The true locations of the masses on the mammograms were identified by an experienced MQSA radiologist. The authors randomly divided the mass set into two independent sets for cross validation training and testing. The overall test performance was assessed by averaging the free response receiver operating characteristic (FROC) curves of the two test subsets. The FP rates during the FROC analysis were estimated by using the normal set only. The jackknife free-response ROC (JAFROC) method was used to estimate the statistical significance of the difference between the test FROC curves obtained with the single-view and the four-view CAD systems. Results: Using the single-view CAD system, the breast-based test sensitivities were 58% and 77% at the FP rates of 0.5 and 1.0 per image, respectively. With the four-view CAD system, the breast-based test sensitivities were improved to 76% and 87% at the corresponding FP rates, respectively

  19. Biliary drainage strategy of unresectable malignant hilar strictures by computed tomography volumetry.

    Science.gov (United States)

    Takahashi, Ei; Fukasawa, Mitsuharu; Sato, Tadashi; Takano, Shinichi; Kadokura, Makoto; Shindo, Hiroko; Yokota, Yudai; Enomoto, Nobuyuki

    2015-04-28

    The aim was to identify criteria for predicting successful drainage of unresectable malignant hilar biliary strictures (UMHBS), because no ideal strategy currently exists. We examined 78 patients with UMHBS who underwent biliary drainage. Drainage was considered effective when the serum bilirubin level decreased by ≥ 50% from the value before stent placement within 2 wk after drainage, without additional intervention. Complications that occurred within 7 d after stent placement were considered early complications. Before drainage, the liver volume of each section (lateral and medial sections of the left liver and anterior and posterior sections of the right liver) was measured using computed tomography (CT) volumetry. Drained liver volume was calculated based on the volume of each liver section and the type of bile duct stricture (according to the Bismuth classification). Tumor volume, also calculated by CT volumetry, was excluded from the volume of each section. Receiver operating characteristic (ROC) analysis was performed to identify the optimal cutoff values for drained liver volume. In addition, factors associated with the effectiveness of drainage and early complications were evaluated. Multivariate analysis showed that drained liver volume [odds ratio (OR) = 2.92, 95%CI: 1.648-5.197; P < 0.001] and impaired liver function (with decompensated liver cirrhosis) (OR = 0.06, 95%CI: 0.009-0.426; P = 0.005) were independent factors contributing to the effectiveness of drainage. ROC analysis for effective drainage showed cutoff values of 33% of liver volume for patients with preserved liver function (with normal liver or compensated liver cirrhosis) and 50% for patients with impaired liver function (with decompensated liver cirrhosis). The sensitivity and specificity of these cutoff values were 82% and 80% for preserved liver function, and 100% and 67% for impaired liver function, respectively. 
Among patients who met these criteria, the rate of effective drainage
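
    Taken together, the reported cutoffs amount to a simple decision rule (a sketch only; the percentages come from the abstract, while the function and its name are illustrative):

```python
# Decision rule implied by the reported ROC cutoffs: drainage is predicted
# effective when the drained fraction of tumor-free liver volume reaches
# 33% for preserved liver function or 50% for impaired liver function.
def drainage_predicted_effective(drained_pct: float, impaired: bool) -> bool:
    cutoff = 50.0 if impaired else 33.0
    return drained_pct >= cutoff

print(drainage_predicted_effective(40.0, impaired=False))   # meets 33% cutoff
print(drainage_predicted_effective(40.0, impaired=True))    # below 50% cutoff
```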

  20. Computer simulation of wolf-removal strategies for animal damage control

    Science.gov (United States)

    Robert G. Haight; Laurel E. Travis; Kevin Nimerfro; L. David Mech

    2002-01-01

    Because of the sustained growth of the gray wolf (Canis lupus) population in the western Great Lakes region of the United States, management agencies are anticipating gray wolf removal from the federal endangered species list and are proposing strategies for wolf management. Strategies are needed that would balance conflicting public demands for wolf...

  1. Towards a strategy for the introduction of information and computer literacy (ICL) courses

    NARCIS (Netherlands)

    Plomp, T.; Carleer, G.J.

    1987-01-01

    An important goal of the national policy on computers in education in the Netherlands is the familiarization of all citizens with information technology. This policy was a plea for some basic education in information and computer literacy. In the beginning of the implementation of this basic

  2. Time-resolved 3D pulmonary perfusion MRI: comparison of different k-space acquisition strategies at 1.5 and 3 T.

    Science.gov (United States)

    Attenberger, Ulrike I; Ingrisch, Michael; Dietrich, Olaf; Herrmann, Karin; Nikolaou, Konstantin; Reiser, Maximilian F; Schönberg, Stefan O; Fink, Christian

    2009-09-01

    Time-resolved pulmonary perfusion MRI requires both high temporal and high spatial resolution, which can be achieved by using several nonconventional k-space acquisition techniques. The aim of this study is to compare the image quality of time-resolved 3D pulmonary perfusion MRI with different k-space acquisition techniques in healthy volunteers at 1.5 and 3 T. Ten healthy volunteers underwent contrast-enhanced time-resolved 3D pulmonary MRI at 1.5 and 3 T using the following k-space acquisition techniques: (a) generalized autocalibrating partially parallel acquisition (GRAPPA) with an internal acquisition of reference lines (IRS), (b) GRAPPA with a single "external" acquisition of reference lines (ERS) before the measurement, and (c) a combination of GRAPPA with an internal acquisition of reference lines and view sharing (VS). The spatial resolution was kept constant at both field strengths to exclusively evaluate the influence of the temporal resolution achieved with the different k-space sampling techniques on image quality. The temporal resolutions were 2.11 seconds IRS, 1.31 seconds ERS, and 1.07 seconds VS at 1.5 T, and 2.04 seconds IRS, 1.30 seconds ERS, and 1.19 seconds VS at 3 T. Image quality was rated by 2 independent radiologists with regard to signal intensity, perfusion homogeneity, artifacts (e.g., wrap-around, noise), and visualization of pulmonary vessels using a 3-point scale (1 = nondiagnostic, 2 = moderate, 3 = good). Furthermore, the signal-to-noise ratio in the lungs was assessed. At 1.5 T the lowest image quality (sum score: 154) was observed for the ERS technique and the highest quality for the VS technique (sum score: 201). In contrast, at 3 T images acquired with VS were hampered by strong artifacts and image quality was rated significantly inferior (sum score: 137) compared with IRS (sum score: 180) and ERS (sum score: 174). Comparing 1.5 and 3 T, in particular the overall rating of the IRS technique (sum score: 180) was very similar at both field

  3. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
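
    The sampling-based propagation can be sketched for interval-valued focal elements: sample each interval, push the samples through the model, and accumulate belief when a focal element lies entirely inside the event of interest and plausibility when it merely intersects it. The model, intervals, and basic probability assignments below are invented:

```python
import random

# Sketch of sampling-based propagation of a Dempster-Shafer evidence
# structure: each focal element is an interval with a basic probability
# assignment (BPA); the model's range over each interval is estimated by
# sampling. Model, intervals, and BPAs are illustrative.
random.seed(0)

def f(x):                        # illustrative model response
    return x * x

focal = [((0.0, 1.0), 0.6),      # (input interval, BPA)
         ((0.5, 2.0), 0.4)]

threshold = 1.0                  # event of interest: f(x) <= threshold
belief = plaus = 0.0
for (lo, hi), m in focal:
    ys = [f(random.uniform(lo, hi)) for _ in range(10_000)]
    if max(ys) <= threshold:     # focal element entirely inside the event
        belief += m
    if min(ys) <= threshold:     # focal element intersects the event
        plaus += m

print(belief, plaus)             # belief <= "true probability" <= plausibility
```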

  4. Different strategies in solving series completion inductive reasoning problems: an fMRI and computational study.

    Science.gov (United States)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng

    2014-08-01

    Neural correlate of human inductive reasoning process is still unclear. Number series and letter series completion are two typical inductive reasoning tasks, and with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks; even the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves direct retrieving of arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially the equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significant greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. High performance computing of density matrix renormalization group method for 2-dimensional model. Parallelization strategy toward peta computing

    International Nuclear Information System (INIS)

    Yamada, Susumu; Igarashi, Ryo; Machida, Masahiko; Imamura, Toshiyuki; Okumura, Masahiko; Onishi, Hiroaki

    2010-01-01

    We parallelize the density matrix renormalization group (DMRG) method, which is a ground-state solver for one-dimensional quantum lattice systems. The parallelization allows us to extend the applicable range of the DMRG to n-leg ladders i.e., quasi two-dimension cases. Such an extension is regarded to bring about several breakthroughs in e.g., quantum-physics, chemistry, and nano-engineering. However, the straightforward parallelization requires all-to-all communications between all processes which are unsuitable for multi-core systems, which is a mainstream of current parallel computers. Therefore, we optimize the all-to-all communications by the following two steps. The first one is the elimination of the communications between all processes by only rearranging data distribution with the communication data amount kept. The second one is the avoidance of the communication conflict by rescheduling the calculation and the communication. We evaluate the performance of the DMRG method on multi-core supercomputers and confirm that our two-steps tuning is quite effective. (author)

  6. Mergers + acquisitions.

    Science.gov (United States)

    Hoppszallern, Suzanna

    2002-05-01

    The hospital sector in 2001 led the health care field in mergers and acquisitions. Most deals involved a network augmenting its presence within a specific region or in a market adjacent to its primary service area. Analysts expect M&A activity to increase in 2002.

  7. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    Directory of Open Access Journals (Sweden)

    Li MingChu

    2017-01-01

    Full Text Available Coordinated terrorist attacks are an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage, so an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous work on monitoring-strategy generation fails to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure in watching potential terrorists and destroying their plots, and pose a huge risk to public security. This paper makes two major contributions. Firstly, we develop a new Stackelberg game model for the security agency to generate an optimal monitoring strategy that takes information leakage into account. Secondly, we provide a double-oracle framework, DO-TPDIL, for effective calculation. The experimental results show that our approach can obtain robust strategies against information leakage with high feasibility and efficiency.
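
    The paper's Stackelberg double-oracle framework is far richer than an abstract can convey, but the notion of an optimal mixed monitoring strategy can be illustrated on a toy 2×2 zero-sum game, where the defender randomizes so that the attacker is indifferent between targets (the game and payoffs are invented, not from the paper):

```python
# Toy illustration of an optimal mixed strategy: a defender monitors one
# of two targets, an attacker attacks one. Payoffs (to the defender) are
# invented; with no saddle point, the optimal mix makes the attacker
# indifferent between its two choices.
def optimal_row_mix(a, b, c, d):
    """Row player's probability of row 1 in the 2x2 zero-sum game
    [[a, b], [c, d]], assuming an interior (no saddle point) solution."""
    return (d - c) / ((a - b) + (d - c))

# rows = monitor target 1 / target 2; cols = attack target 1 / target 2
p = optimal_row_mix(4.0, -2.0, -1.0, 3.0)   # monitor target 1 with prob p
print(p)
```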

  8. Organizational Strategy and Business Environment Effects Based on a Computation Method

    Science.gov (United States)

    Reklitis, Panagiotis; Konstantopoulos, Nikolaos; Trivellas, Panagiotis

    2007-12-01

    According to many researchers in organizational theory, a great number of the problems encountered by manufacturing firms are due to their failure to respond to significant changes in their external environment and to align their competitive strategy accordingly. From this point of view, the pursuit of the appropriate generic strategy is vital for firms facing a dynamic and highly competitive environment. In the present paper, we adopt Porter's typology to operationalise organizational strategy (cost leadership, innovative and marketing differentiation, and focus), considering changes in the external business environment (dynamism, complexity and munificence). Although simulating social events is a difficult task, since so many considerations (not all well understood) are involved, in the present study we developed a dynamic system based on the conceptual framework of strategy-environment associations.

  9. The Pitzer-Lee-Kesler-Teja (PLKT) Strategy and Its Implementation by Meta-Computing Software

    Czech Academy of Sciences Publication Activity Database

    Smith, W. R.; Lísal, Martin; Missen, R. W.

    2001-01-01

    Roč. 35, č. 1 (2001), s. 68-73 ISSN 0009-2479 Institutional research plan: CEZ:AV0Z4072921 Keywords: Pitzer-Lee-Kesler-Teja (PLKT) strategy * implementation Subject RIV: CF - Physical; Theoretical Chemistry

  10. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    OpenAIRE

    Li MingChu; Yang Zekun; Lu Kun; Guo Cheng

    2017-01-01

    Coordinated terrorist attacks are an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage, so an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous work on monitoring-strategy generation fails to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure of watch...

  11. Fuzzy Cognitive and Social Negotiation Agent Strategy for Computational Collective Intelligence

    Science.gov (United States)

    Chohra, Amine; Madani, Kurosh; Kanzari, Dalel

    Finding an adequate (win-win for both parties) negotiation strategy with incomplete information for autonomous agents, even in one-to-one negotiation, is a complex problem. Moreover, negotiation behaviors, in which characters such as conciliatory or aggressive define a 'psychological' aspect of the negotiator's personality, play an important role. The aim of this paper is to develop a fuzzy cognitive and social negotiation strategy for autonomous agents with incomplete information, in which conciliatory, neutral, or aggressive characters are integrated into negotiation behaviors (inspired by research aiming to analyze human behavior and by work on social negotiation psychology). For this purpose, first, a one-to-one bargaining process, in which a buyer agent and a seller agent negotiate over a single issue (price), is developed for a time-dependent strategy (based on the time-dependent behaviors of Faratin et al.) and for a fuzzy cognitive and social strategy. Second, experimental environments and measures, allowing a set of experiments carried out for different negotiation deadlines of buyer and seller agents, are detailed. Third, experimental results for both the time-dependent and the fuzzy cognitive and social strategies are presented, analyzed, and compared for different agent deadlines. The suggested fuzzy cognitive and social strategy allows agents to improve the negotiation process, relative to the time-dependent one, in terms of agent utilities, number of rounds to reach an agreement, and percentage of agreements.
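
    The time-dependent behaviors of Faratin et al. that the baseline strategy builds on are commonly written as a concession function alpha(t) = k + (1 - k)(t/t_max)^(1/beta); a minimal sketch with illustrative parameters and price range:

```python
# Time-dependent concession function in the style of Faratin et al., on
# which the paper's baseline strategy is based. Parameter values and the
# price range below are illustrative, not from the paper.
def alpha(t, t_max, k=0.1, beta=1.0):
    """Concession level in [k, 1]; beta < 1 is Boulware, beta > 1 is Conceder."""
    return k + (1.0 - k) * (min(t, t_max) / t_max) ** (1.0 / beta)

def buyer_offer(t, t_max, low, high, beta):
    """Buyer concedes upward from 'low' toward its reservation price 'high'."""
    return low + alpha(t, t_max, beta=beta) * (high - low)

for t in (0, 5, 10):
    print(round(buyer_offer(t, 10, 50.0, 100.0, beta=2.0), 2))
```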

  12. What's New in Software? Computers and the Writing Process: Strategies That Work.

    Science.gov (United States)

    Ellsworth, Nancy J.

    1990-01-01

    The computer can be a powerful tool to help students who are having difficulty learning the skills of prewriting, composition, revision, and editing. Specific software is suggested for each phase, as well as for classroom publishing. (Author/JDD)

  13. Tree Based Decision Strategies and Auctions in Computational Multi-Agent Systems

    Czech Academy of Sciences Publication Activity Database

    Šlapák, M.; Neruda, Roman

    2017-01-01

    Roč. 38, č. 4 (2017), s. 335-342 ISSN 0257-4306 Institutional support: RVO:67985807 Keywords: auction systems * decision making * genetic programming * multi-agent system * task distribution Subject RIV: IN - Informatics, Computer Science OBOR OECD: Computer sciences, information science, bioinformatics (hardware development to be 2.2, social aspect to be 5.8) http://rev-inv-ope.univ-paris1.fr/fileadmin/rev-inv-ope/files/38417/38417-04.pdf

  14. Development and Evaluation of a Computer-Based Learning Environment for Teachers: Assessment of Learning Strategies in Learning Journals

    Directory of Open Access Journals (Sweden)

    Inga Glogger

    2013-01-01

    Full Text Available Training teachers to assess important components of self-regulated learning, such as learning strategies, is an important, yet somewhat neglected, aspect of the integration of self-regulated learning at school. Learning journals can be used to assess learning strategies in line with cyclical process models of self-regulated learning, allowing for rich formative feedback. Against this background, we developed a computer-based learning environment (CBLE) that trains teachers to assess learning strategies with learning journals. The contents of the CBLE and its instructional design were derived from theory. The CBLE was further shaped by research in a design-based manner. Finally, in two evaluation studies, student teachers (N1=44; N2=89) worked with the CBLE. We analyzed satisfaction, interest, usability, and assessment skills. Additionally, in evaluation study 2, the effects of an experimental variation on motivation and assessment skills were tested. We found high satisfaction, interest, and good usability, as well as satisfactory assessment skills, after working with the CBLE. The results show that teachers can be trained to assess learning strategies in learning journals. The developed CBLE offers new perspectives on how to support teachers in fostering learning strategies as a central component of effective self-regulated learning at school.

  15. The Effectiveness of Using Contextual Clues, Dictionary Strategy and Computer Assisted Language Learning (CALL) in Learning Vocabulary

    Directory of Open Access Journals (Sweden)

    Zuraina Ali

    2013-07-01

    Full Text Available This study investigates the effectiveness of three vocabulary learning methods, Contextual Clues, Dictionary Strategy, and Computer Assisted Language Learning (CALL), in learning vocabulary among ESL learners. First, it aims at finding which of the vocabulary learning methods, namely Dictionary Strategy, Contextual Clues, and CALL, results in the highest number of words learnt in the immediate and delayed recall tests. Second, it compares the results of the Pre-test and the Delayed Recall Post-test to determine the differences in learning vocabulary using these methods. A quasi-experiment that tested the effectiveness of learning vocabulary using Dictionary Strategy, Contextual Clues, and CALL involved 123 first-year university students. Qualitative procedures included the collection of data from interviews, which were conducted to triangulate the data obtained from the quantitative inquiries. Findings from the study using ANOVA revealed that there were significant differences when students were exposed to Dictionary Strategy, Contextual Clues and CALL in the immediate recall tests, but not in the Delayed Recall Post-test. Also, there were significant differences when a t test was used to compare the scores between the Pre-test and the Delayed Recall Post-test for the three methods of vocabulary learning. Although many researchers have advocated the relative effectiveness of Dictionary Strategy, Contextual Clues, and CALL in learning vocabulary, this study is still valuable, since no study has empirically investigated the relative efficacy of these three methods in a single study.

  16. Flexible data acquisition system

    Energy Technology Data Exchange (ETDEWEB)

    Clout, P N; Ridley, P A [Science Research Council, Daresbury (UK). Daresbury Lab.

    1978-06-01

    A data acquisition system has been developed which enables several independent experiments to be controlled by a 24 K word PDP-11 computer. Significant features of the system are the use of CAMAC, a high level language (RTL/2) and a general-purpose operating system executive which assist the rapid implementation of new experiments. This system has been used successfully for EXAFS and photo-electron spectroscopy experiments. It is intended to provide powerful concurrent data analysis and feedback facilities to the experimenter by on-line connection to the central IBM 370/165 computer.

  17. VersaCold: Analysis of Change Management in Mergers & Acquisitions

    OpenAIRE

    Eslami, Sara

    2011-01-01

    Many firms use mergers and acquisitions as a corporate strategy to increase shareholder value. Therefore, understanding such a widely exercised strategy and its implications for corporate change would be critical for organizations that wish to pursue this strategy. This study provides an in-depth review of mergers and acquisitions and introduces best practices for managing changes that result from mergers and acquisitions. Next, the concepts are applied to two cases of acquisitions in VersaCol...

  18. Improving Outcomes in the Chemistry Teaching and Learning Strategies Course through Independent Study with the Help of Computer-Based Media

    Science.gov (United States)

    Sugiharti, Gulmah

    2018-03-01

    This study aims to examine the improvement of student learning outcomes through independent learning using computer-based learning media in the STBM (Teaching and Learning Strategy) Chemistry course. The population in this research comprised all four classes of the 2014 cohort taking the STBM Chemistry course. The sample of two classes of 32 students each was taken purposively, as the control class and the experiment class. The instrument used was a test of learning outcomes in the form of 20 multiple-choice questions that had been declared valid and reliable. Data analysis used a one-sided t-test, and improved learning outcomes were measured with a normalized gain test. Based on the learning outcome data, the average normalized gain value is 0.530 for the experimental class and 0.224 for the control class; that is, the improvement in learning outcomes is 53% for the experimental class and 22.4% for the control class. Hypothesis testing obtained tcount > ttable (9.02 > 1.6723) at the significance level α = 0.05 with db = 58. This means that Ha is accepted: the use of computer-based learning media (CAI) can improve student learning outcomes in the Teaching and Learning Strategy (STBM) Chemistry course in the 2017/2018 academic year.
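    The normalized gain reported above is usually computed as Hake's gain, g = (post - pre) / (max - pre), averaged over students. A small sketch with hypothetical scores (not the study's data):

```python
# Normalized gain g = (post - pre) / (max - pre); assumes pre < max_score.
def normalized_gain(pre, post, max_score=100):
    return (post - pre) / (max_score - pre)

def mean_gain(pre_scores, post_scores, max_score=100):
    gains = [normalized_gain(p, q, max_score)
             for p, q in zip(pre_scores, post_scores)]
    return sum(gains) / len(gains)

# Hypothetical pre/post percentages for three students
pre  = [40, 50, 30]
post = [70, 75, 65]
g = mean_gain(pre, post)  # 0.5, i.e. 50% of the possible improvement realized
```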

  19. Mergers & Acquisitions

    DEFF Research Database (Denmark)

    Fomcenco, Alex

    This dissertation is a legal dogmatic thesis, the goal of which is to describe and analyze the current state of law in Europe in regard to some relevant selected elements related to mergers and acquisitions, and the adviser’s counsel in this regard. Having regard to the topic of the dissertation...... and fiscal neutrality, group-related issues, holding-structure issues, employees, stock exchange listing issues, and corporate nationality....

  20. Survey of Storage and Fault Tolerance Strategies Used in Cloud Computing

    Science.gov (United States)

    Ericson, Kathleen; Pallickara, Shrideep

    Cloud computing has gained significant traction in recent years. Companies such as Google, Amazon and Microsoft have been building massive data centers over the past few years. Spanning geographic and administrative domains, these data centers tend to be built out of commodity desktops, with the total number of computers managed by these companies being on the order of millions. Additionally, the use of virtualization allows a physical node to be presented as a set of virtual nodes, resulting in a seemingly inexhaustible set of computational resources. By leveraging economies of scale, these data centers can provision CPU, networking, and storage at substantially reduced prices, which in turn underpins the move by many institutions to host their services in the cloud.

  1. The Acquisition Experiences of Kazoil

    DEFF Research Database (Denmark)

    Minbaeva, Dana; Muratbekova-Touron, Maral

    2016-01-01

    This case describes two diverging post-acquisition experiences of KazOil, an oil drilling company in Kazakhstan, in the years after the dissolution of the Soviet Union. When the company was bought by the Canadian corporation Hydrocarbons Ltd in 1996, exposed to new human resource strategies...... among students that cultural distance is not the main determinant for the success of social integration mechanisms in post-acquisition situations. On the contrary, the relationship between integration instrument and integration success is also governed by contextual factors such as the attractiveness...... of the acquisition target or state of development of HRM in the target country....

  2. A Case Study on Neural Inspired Dynamic Memory Management Strategies for High Performance Computing.

    Energy Technology Data Exchange (ETDEWEB)

    Vineyard, Craig Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Verzi, Stephen Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-09-01

    As high performance computing architectures pursue more computational power there is a need for increased memory capacity and bandwidth as well. A multi-level memory (MLM) architecture addresses this need by combining multiple memory types with different characteristics as varying levels of the same architecture. How to efficiently utilize this memory infrastructure is an unknown challenge, and in this research we sought to investigate whether neural inspired approaches can meaningfully help with memory management. In particular we explored neurogenesis inspired resource allocation, and were able to show a neural inspired mixed controller policy can beneficially impact how MLM architectures utilize memory.

  3. Monitoring and Depth of Strategy Use in Computer-Based Learning Environments for Science and History

    Science.gov (United States)

    Deekens, Victor M.; Greene, Jeffrey A.; Lobczowski, Nikki G.

    2018-01-01

    Background: Self-regulated learning (SRL) models position metacognitive monitoring as central to SRL processing and predictive of student learning outcomes (Winne & Hadwin, 2008; Zimmerman, 2000). A body of research evidence also indicates that depth of strategy use, ranging from surface to deep processing, is predictive of learning…

  4. Different strategies in solving series completion inductive reasoning problems : An fMRI and computational study

    NARCIS (Netherlands)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A.; Zhong, Ning; Li, Kuncheng

    The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series

  5. Data acquisition techniques using PC

    CERN Document Server

    Austerlitz, Howard

    1991-01-01

    Data Acquisition Techniques Using Personal Computers contains all the information required by a technical professional (engineer, scientist, technician) to implement a PC-based acquisition system. Including both basic tutorial information as well as some advanced topics, this work is suitable as a reference book for engineers or as a supplemental text for engineering students. It gives the reader enough understanding of the topics to implement a data acquisition system based on commercial products. A reader can alternatively learn how to custom build hardware or write his or her own software.

  6. InfoMall: An Innovative Strategy for High-Performance Computing and Communications Applications Development.

    Science.gov (United States)

    Mills, Kim; Fox, Geoffrey

    1994-01-01

    Describes the InfoMall, a program led by the Northeast Parallel Architectures Center (NPAC) at Syracuse University (New York). The InfoMall features a partnership of approximately 24 organizations offering linked programs in High Performance Computing and Communications (HPCC) technology integration, software development, marketing, education and…

  7. A dimension reduction strategy for improving the efficiency of computer-aided detection for CT colonography

    Science.gov (United States)

    Song, Bowen; Zhang, Guopeng; Wang, Huafeng; Zhu, Wei; Liang, Zhengrong

    2013-02-01

    Various types of features, e.g., geometric features, texture features, projection features, etc., have been introduced for polyp detection and differentiation tasks via computer-aided detection and diagnosis (CAD) for computed tomography colonography (CTC). Although these features together cover more information of the data, some of them are statistically highly related to others, which makes the feature set redundant and burdens the computation task of CAD. In this paper, we propose a new dimension reduction method which combines hierarchical clustering and principal component analysis (PCA) for the false-positive (FP) reduction task. First, we group all the features based on their similarity using hierarchical clustering, and then PCA is employed within each group. Different numbers of principal components are selected from each group to form the final feature set. A support vector machine is used to perform the classification. The results show that when three principal components were chosen from each group, we achieved an area under the receiver operating characteristic curve of 0.905, which is as high as that of the original dataset. Meanwhile, the computation time is reduced by 70% and the feature set size is reduced by 77%. It can be concluded that the proposed method captures the most important information of the feature set and that the classification accuracy is not affected after the dimension reduction. The result is promising, and further investigation, such as automatic threshold setting, is worthwhile and under way.
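    The pipeline described (group correlated features, then PCA within each group) can be sketched without external libraries. Here a greedy correlation-threshold grouping stands in for the paper's hierarchical clustering, and the first principal component is found by power iteration; the feature columns and threshold are illustrative, not the paper's:

```python
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (sx * sy)

def group_features(cols, thresh=0.9):
    """Greedy grouping: a feature joins the first group whose seed
    feature it is highly correlated with (a simple stand-in for
    hierarchical clustering)."""
    groups = []
    for j in range(len(cols)):
        for g in groups:
            if abs(pearson(cols[g[0]], cols[j])) >= thresh:
                g.append(j)
                break
        else:
            groups.append([j])
    return groups

def first_pc(cols):
    """Sample scores on the first principal component, via power
    iteration on the feature-by-feature covariance matrix."""
    n, d = len(cols[0]), len(cols)
    centered = [[v - sum(c) / n for v in c] for c in cols]
    cov = [[sum(centered[i][k] * centered[j][k] for k in range(n)) / (n - 1)
            for j in range(d)] for i in range(d)]
    w = [1.0] * d
    for _ in range(100):
        w = [sum(cov[i][j] * w[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(v * v for v in w)) or 1.0
        w = [v / norm for v in w]
    return [sum(w[i] * centered[i][k] for i in range(d)) for k in range(n)]

# Three synthetic feature columns: 0 and 1 strongly correlated, 2 not
c0 = [float(i) for i in range(50)]
c1 = [2.0 * i + (0.5 if i % 2 else -0.5) for i in range(50)]
c2 = [math.sin(i) for i in range(50)]
cols = [c0, c1, c2]

groups = group_features(cols)                      # [[0, 1], [2]]
reduced = [first_pc([cols[j] for j in g]) for g in groups]
```

In practice one would use library implementations of agglomerative clustering and PCA and keep several components per group, as the paper does.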

  8. Do company strategies and structures converge in global markets? Evidence from the computer industry

    NARCIS (Netherlands)

    Duysters, G.M.; Hagedoorn, J.

    2001-01-01

    This note examines isomorphism and diversity in a global industry. We study how the ongoing internationalisation process has affected companies from various regions of the world. Empirical research is focussed on the international computer industry. We find that companies in this sector have become

  9. Text-mining strategies to support computational research in chemical toxicity (ACS 2017 Spring meeting)

    Science.gov (United States)

    With 26 million citations, PubMed is one of the largest sources of information about the activity of chemicals in biological systems. Because this information is expressed in natural language and not stored as data, using the biomedical literature directly in computational resear...

  10. Exploring the Strategies for a Community College Transition into a Cloud-Computing Environment

    Science.gov (United States)

    DeBary, Narges

    2017-01-01

    The use of the Internet has resulted in the birth of an innovative virtualization technology called cloud computing. Virtualization can tremendously improve the instructional and operational systems of a community college. Although the incidental adoption of cloud solutions in community colleges of higher education has increased,…

  11. A study on evaluation strategies in dimensional X-ray computed tomography by estimation of measurement uncertainties

    DEFF Research Database (Denmark)

    Müller, Pavel; Hiller, Jochen; Cantatore, Angela

    2012-01-01

    Computed tomography entered the industrial world in the 1980s as a technique for nondestructive testing and has nowadays become a revolutionary tool for dimensional metrology, suitable for actual/nominal comparison and verification of geometrical and dimensional tolerances. This paper evaluates...... measurement results using different measuring strategies applied in different inspection software packages for volume and surface data analysis. The strategy influence is determined by calculating the measurement uncertainty. This investigation includes measurements of two industrial items, an aluminium pipe...... connector and a plastic toggle, a hearing aid component. These are measured using a commercial CT scanner. Traceability is transferred using tactile and optical coordinate measuring machines, which are used to produce reference measurements. Results show that measurements of diameter for both parts resulted...
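    Measurement uncertainty statements of the kind described are conventionally built up GUM-style: uncorrelated standard uncertainty contributions are combined in quadrature and multiplied by a coverage factor k. A minimal sketch (the contribution values are hypothetical, not the paper's budget):

```python
import math

# Combined standard uncertainty for uncorrelated inputs (GUM approach),
# and the expanded uncertainty U = k * u_c used to state the result.
def combined_uncertainty(contributions):
    return math.sqrt(sum(u * u for u in contributions))

def expanded_uncertainty(contributions, k=2.0):
    # k = 2 gives ~95% coverage for a normal distribution
    return k * combined_uncertainty(contributions)

# Hypothetical contributions (mm): reference-artefact calibration,
# measurement repeatability, and temperature correction
u_cal, u_rep, u_temp = 0.002, 0.003, 0.001
U = expanded_uncertainty([u_cal, u_rep, u_temp])
```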

  12. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention

    Science.gov (United States)

    2013-01-01

    Background The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. Methods The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants’ characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. Results A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. Conclusions With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle

  13. The use of a proactive dissemination strategy to optimize reach of an internet-delivered computer tailored lifestyle intervention.

    Science.gov (United States)

    Schneider, Francine; Schulz, Daniela N; Pouwels, Loes H L; de Vries, Hein; van Osch, Liesbeth A D M

    2013-08-05

    The use of reactive strategies to disseminate effective Internet-delivered lifestyle interventions restricts their level of reach within the target population. This stresses the need to invest in proactive strategies to offer these interventions to the target population. The present study used a proactive strategy to increase reach of an Internet-delivered multi component computer tailored intervention, by embedding the intervention in an existing online health monitoring system of the Regional Public Health Services in the Netherlands. The research population consisted of Dutch adults who were invited to participate in the Adult Health Monitor (N = 96,388) offered by the Regional Public Health Services. This Monitor consisted of an online or a written questionnaire. A prospective design was used to determine levels of reach, by focusing on actual participation in the lifestyle intervention. Furthermore, adequacy of reach among the target group was assessed by composing detailed profiles of intervention users. Participants' characteristics, like demographics, behavioral and mental health status and quality of life, were included in the model as predictors. A total of 41,155 (43%) people participated in the Adult Health Monitor, of which 41% (n = 16,940) filled out the online version. More than half of the online participants indicated their interest (n = 9169; 54%) in the computer tailored intervention and 5168 participants (31%) actually participated in the Internet-delivered computer tailored intervention. Males, older respondents and individuals with a higher educational degree were significantly more likely to participate in the intervention. Furthermore, results indicated that especially participants with a relatively healthier lifestyle and a healthy BMI were likely to participate. With one out of three online Adult Health Monitor participants actually participating in the computer tailored lifestyle intervention, the employed proactive
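    The reach figures reported above form a simple participation funnel, and the percentages can be checked directly from the record's counts:

```python
# Participation funnel from the study: invited -> Monitor participants
# -> online participants -> interested -> intervention participants.
invited      = 96388
monitor      = 41155
online       = 16940
interested   = 9169
intervention = 5168

def pct(part, whole):
    return round(100 * part / whole)

funnel = {
    "monitor/invited":     pct(monitor, invited),       # 43%
    "online/monitor":      pct(online, monitor),        # 41%
    "interested/online":   pct(interested, online),     # 54%
    "intervention/online": pct(intervention, online),   # 31%
}
```

The last ratio is the "one out of three online participants" figure cited in the conclusions.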

  14. IT-based Value Creation in Serial Acquisitions

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Yetton, Philip

    2013-01-01

    serial acquirers realize IT-based value, we integrate and model the findings on individual acquisitions from the extant literature, and extend that model to explain the effects of sequential acquisitions in a growth-by-acquisition strategy. This extended model, drawing on the Resource-Based Theory......The extant research on post-acquisition IT integration analyzes how acquirers realize IT-based value in individual acquisitions. However, serial acquirers make 60% of acquisitions. These acquisitions are not isolated events, but are components in growth-by-acquisition programs. To explain how...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  16. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

    Full Text Available Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors, are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  17. COMPUTATION IN CLOUD; A COMPETITIVE STRATEGY FOR THE SMALL AND MEDIUM COMPANIES IN MEXICO

    Directory of Open Access Journals (Sweden)

    Alma Lilia Sapién Aguilar

    2014-05-01

    Full Text Available Cloud computing is a technology that provides services via the Internet. With this technology, companies can gain a competitive advantage, provide better customer service and reduce costs. The objective of this research was to analyze the cloud computing services of companies in the city of Chihuahua, Mexico. This was a non-experimental, descriptive and empirical study with a quantitative approach, based on a survey conducted from January 2012 through April 2013. The study population consisted of small and medium enterprises (SMEs) in the industrial, commercial and service sectors. The finite population formula was used to obtain the sample size, and the enterprises were selected at random. The results showed that 93% of companies reduced costs by using cloud computing. Storing and sharing information were detected as among the most used services. Companies want savings in technology infrastructure in order to increase the life cycle of their equipment, in addition to providing a higher-quality service to customers.

  18. Computational Strategies for Dissecting the High-Dimensional Complexity of Adaptive Immune Repertoires

    Directory of Open Access Journals (Sweden)

    Enkelejda Miho

    2018-02-01

    Full Text Available The adaptive immune system recognizes antigens via an immense array of antigen-binding antibodies and T-cell receptors, the immune repertoire. The interrogation of immune repertoires is of high relevance for understanding the adaptive immune response in disease and infection (e.g., autoimmunity, cancer, HIV). Adaptive immune receptor repertoire sequencing (AIRR-seq) has driven the quantitative and molecular-level profiling of immune repertoires, thereby revealing the high-dimensional complexity of the immune receptor sequence landscape. Several methods for the computational and statistical analysis of large-scale AIRR-seq data have been developed to resolve immune repertoire complexity and to understand the dynamics of adaptive immunity. Here, we review the current research on (i) diversity, (ii) clustering and network, (iii) phylogenetic, and (iv) machine learning methods applied to dissect, quantify, and compare the architecture, evolution, and specificity of immune repertoires. We summarize outstanding questions in computational immunology and propose future directions for systems immunology toward coupling AIRR-seq with the computational discovery of immunotherapeutics, vaccines, and immunodiagnostics.
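    Among the diversity methods reviewed, Shannon entropy over clone frequencies is one of the simplest and most widely used; a minimal sketch (the clone counts are hypothetical, not from any dataset in the review):

```python
import math

# Shannon entropy and evenness of a repertoire's clone-frequency
# distribution, a basic diversity measure for AIRR-seq data.
def shannon_entropy(counts):
    total = sum(counts)
    probs = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

def evenness(counts):
    """Entropy normalized by log(richness); 1.0 = perfectly even repertoire."""
    return shannon_entropy(counts) / math.log(len(counts))

# Hypothetical clone counts from one AIRR-seq sample
clones = [500, 300, 150, 30, 20]
h = shannon_entropy(clones)
```

A skewed, clonally expanded repertoire gives low evenness; a naive, even repertoire approaches 1.0.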

  19. Amplitudes, acquisition and imaging

    Energy Technology Data Exchange (ETDEWEB)

    Bloor, Robert

    1998-12-31

    Accurate seismic amplitude information is important for the successful evaluation of many prospects and the importance of such amplitude information is increasing with the advent of time lapse seismic techniques. It is now widely accepted that the proper treatment of amplitudes requires seismic imaging in the form of either time or depth migration. A key factor in seismic imaging is the spatial sampling of the data and its relationship to the imaging algorithms. This presentation demonstrates that acquisition caused spatial sampling irregularity can affect the seismic imaging and perturb amplitudes. Equalization helps to balance the amplitudes, and the dealiasing strategy improves the imaging further when there are azimuth variations. Equalization and dealiasing can also help with the acquisition irregularities caused by shot and receiver dislocation or missing traces. 2 refs., 2 figs.

  20. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  1. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  3. Data acquisition for PLT

    International Nuclear Information System (INIS)

    Thompson, P.A.

    1975-01-01

    DA/PLT, the data acquisition system for the Princeton Large Torus (PLT) fusion research device, consists of a PDP-10 host computer, five satellite PDP-11s connected to the host by a special high-speed interface, miscellaneous other minicomputers and commercially supplied instruments, and much PPPL produced hardware. The software consists of the standard PDP-10 monitor with local modifications and the special systems and applications programs to customize the DA/PLT for the specific job of supporting data acquisition, analysis, display, and archiving, with concurrent off-line analysis, program development, and, in the background, general batch and timesharing. Some details of the over-all architecture are presented, along with a status report of the different PLT experiments being supported

  4. An automated tuberculosis screening strategy combining X-ray-based computer-aided detection and clinical information

    Science.gov (United States)

    Melendez, Jaime; Sánchez, Clara I.; Philipsen, Rick H. H. M.; Maduskar, Pragnya; Dawson, Rodney; Theron, Grant; Dheda, Keertan; van Ginneken, Bram

    2016-04-01

    Lack of human resources and radiological interpretation expertise impair tuberculosis (TB) screening programmes in TB-endemic countries. Computer-aided detection (CAD) constitutes a viable alternative for chest radiograph (CXR) reading. However, no automated techniques that exploit the additional clinical information typically available during screening exist. To address this issue and optimally exploit this information, a machine learning-based combination framework is introduced. We have evaluated this framework on a database containing 392 patient records from suspected TB subjects prospectively recruited in Cape Town, South Africa. Each record comprised a CAD score, automatically computed from a CXR, and 12 clinical features. Comparisons with strategies relying on either CAD scores or clinical information alone were performed. Our results indicate that the combination framework outperforms the individual strategies in terms of the area under the receiver operating characteristic curve (0.84 versus 0.78 and 0.72), specificity at 95% sensitivity (49% versus 24% and 31%) and negative predictive value (98% versus 95% and 96%). Thus, it is believed that combining CAD and clinical information to estimate the risk of active disease is a promising tool for TB screening.
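    The combination idea, fusing a CAD score with clinical information and comparing areas under the ROC curve, can be sketched with a simple weighted sum and the Mann-Whitney form of the AUC. The scores, labels, and equal 0.5/0.5 weighting below are made-up illustrations, not the paper's learned combiner:

```python
# ROC AUC via the Mann-Whitney statistic: the fraction of
# positive/negative pairs ranked correctly (ties count half).
def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical screening data: CAD score, one clinical feature
# rescaled to 0-1, and TB status (1 = active TB)
cad      = [0.9, 0.4, 0.8, 0.3, 0.6, 0.2]
clinical = [0.7, 0.8, 0.9, 0.1, 0.3, 0.2]
labels   = [1,   1,   1,   0,   0,   0]

combined = [0.5 * c + 0.5 * x for c, x in zip(cad, clinical)]
auc_cad = auc(cad, labels)
auc_combined = auc(combined, labels)  # higher than auc_cad on this toy data
```

In the paper the combiner is learned from the 12 clinical features rather than fixed by hand, but the evaluation logic is the same.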

  5. Nuclear cycle length economics strategy using stochastic and deterministic Monte Carlo computation models

    International Nuclear Information System (INIS)

    Wook Ahn, T.

    2014-01-01

    Nuclear power plants (NPP) have historically been a low-cost base-load electricity source because of their high fuel density and operational reliability. In the United States, NPPs typically run 18- to 24-month cycles to limit outage times and maximize capacity factor. Recently, however, increased volatility in energy and fuel prices, lower natural gas prices, higher material costs, and new sources are challenging the nuclear industry. This warrants a study in developing a more robust cycle length and fuel burnup strategy to make NPPs more competitive. (Author)

  6. Nuclear cycle length economics strategy using stochastic and deterministic Monte Carlo computation models

    Energy Technology Data Exchange (ETDEWEB)

    Wook Ahn, T.

    2014-07-01

    Nuclear power plants (NPP) have historically been a low-cost base-load electricity source because of their high fuel density and operational reliability. In the United States, NPPs typically run 18- to 24-month cycles to limit outage times and maximize capacity factor. Recently, however, increased volatility in energy and fuel prices, lower natural gas prices, higher material costs, and new sources are challenging the nuclear industry. This warrants a study in developing a more robust cycle length and fuel burnup strategy to make NPPs more competitive. (Author)

  7. Calculation of demands for nuclear fuels and fuel cycle services. Description of computer model and strategies developed by Working Group 1

    International Nuclear Information System (INIS)

    Working Group 1 examined a range of reactor deployment strategies and fuel cycle options, in order to estimate the range of nuclear fuel requirements and fuel cycle service needs which would result. The computer model, its verification in comparison with other models, the strategies to be examined through use of the model, and the range of results obtained are described

  8. Computational metabolic engineering strategies for growth-coupled biofuel production by Synechocystis

    Directory of Open Access Journals (Sweden)

    Kiyan Shabestary

    2016-12-01

    Full Text Available Chemical and fuel production by photosynthetic cyanobacteria is a promising technology but to date has not reached competitive rates and titers. Genome-scale metabolic modeling can reveal limitations in cyanobacteria metabolism and guide genetic engineering strategies to increase chemical production. Here, we used constraint-based modeling and optimization algorithms on a genome-scale model of Synechocystis PCC6803 to find ways to improve productivity of fermentative, fatty-acid, and terpene-derived fuels. OptGene and MOMA were used to find heuristics for knockout strategies that could increase biofuel productivity. OptKnock was used to find a set of knockouts that led to coupling between biofuel and growth. Our results show that high productivity of fermentation or reversed beta-oxidation derived alcohols such as 1-butanol requires elimination of NADH sinks, while terpenes and fatty-acid based fuels require creating imbalances in intracellular ATP and NADPH production and consumption. The FBA-predicted productivities of these fuels are at least 10-fold higher than those reported so far in the literature. We also discuss the physiological and practical feasibility of implementing these knockouts. This work gives insight into how cyanobacteria could be engineered to reach competitive biofuel productivities. Keywords: Cyanobacteria, Modeling, Flux balance analysis, Biofuel, MOMA, OptFlux, OptKnock
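The constraint-based workflow described above can be illustrated with a minimal flux-balance analysis: maximize a product flux subject to the steady-state mass balance S v = 0 and flux bounds, with a knockout simulated by pinning a reaction's flux to zero. The three-reaction network, bounds, and reaction names below are illustrative assumptions, not the Synechocystis PCC6803 genome-scale model or the OptKnock algorithm itself:

```python
# Toy flux-balance analysis (FBA) as a linear program.
import numpy as np
from scipy.optimize import linprog

# Metabolites: A, B.  Reactions: R1 uptake -> A, R2 A -> B, R3 B -> product.
S = np.array([[1, -1,  0],   # mass balance for metabolite A
              [0,  1, -1]])  # mass balance for metabolite B
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake capped at 10 flux units

def max_product_flux(knockouts=()):
    """Maximize flux through R3; a knockout pins a reaction's flux to zero."""
    b = list(bounds)
    for k in knockouts:
        b[k] = (0, 0)
    # linprog minimizes, so negate the objective to maximize v3
    res = linprog(c=[0, 0, -1], A_eq=S, b_eq=np.zeros(2), bounds=b)
    return res.x[2]

print(max_product_flux())      # limited by the uptake bound: 10.0
print(max_product_flux((1,)))  # knocking out R2 cuts production to 0.0
```

Algorithms such as OptKnock search over which bounds to pin to zero so that product formation becomes coupled to growth; this sketch shows only the inner FBA evaluation.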

  9. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    Science.gov (United States)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely-packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emissions estimates from signals with a variety of nearfield influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to hone in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.
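The node-pairing idea can be sketched with synthetic data: keep only the hours when the wind blows from the source toward the downwind monitor, then average the downwind-minus-upwind mixing-ratio difference. All numbers below (background level, enhancement, wind sector) are invented for illustration and are not BEACO2N values:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
wind_dir = rng.uniform(0, 360, n)            # degrees
background = 410 + rng.normal(0, 1, n)       # ppm CO2, shared by both nodes
upwind = background.copy()
# The downwind node sees a +5 ppm enhancement only when wind comes from ~90 deg
from_source = np.abs(wind_dir - 90) < 30
downwind = background + np.where(from_source, 5.0, 0.0) + rng.normal(0, 1, n)

# Low-computation estimate: mean paired difference over the source wind sector
enhancement = np.mean(downwind[from_source] - upwind[from_source])
print(round(enhancement, 1))  # close to the injected 5.0 ppm
```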

  10. On a numerical strategy to compute gravity currents of non-Newtonian fluids

    International Nuclear Information System (INIS)

    Vola, D.; Babik, F.; Latche, J.-C.

    2004-01-01

    This paper is devoted to the presentation of a numerical scheme for the simulation of gravity currents of non-Newtonian fluids. The two dimensional computational grid is fixed and the free-surface is described as a polygonal interface independent from the grid and advanced in time by a Lagrangian technique. Navier-Stokes equations are semi-discretized in time by the Characteristic-Galerkin method, which finally leads to solve a generalized Stokes problem posed on a physical domain limited by the free surface to only a part of the computational grid. To this purpose, we implement a Galerkin technique with a particular approximation space, defined as the restriction to the fluid domain of functions of a finite element space. The decomposition-coordination method allows to deal without any regularization with a variety of non-linear and possibly non-differentiable constitutive laws. Beside more analytical tests, we revisit with this numerical method some simulations of gravity currents of the literature, up to now investigated within the simplified thin-flow approximation framework

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  12. Shortening the Defense Acquisition Cycle: A Transformational Imperative?

    National Research Council Canada - National Science Library

    Vollmecke, Kirk

    2004-01-01

    .... The acquisition system is both political and complex. This Strategy Research Project paper explores the effectiveness of past policy changes to reduce cycle time, and reviews current acquisition issues or problems related to cycle time reduction...

  13. Trigger and data acquisition: The bytes start and stop here!

    International Nuclear Information System (INIS)

    Tschirhart, R.

    2010-01-01

    The modern trigger and data acquisition systems that instrument discovery experiments at the Large Hadron Collider (LHC) at CERN are very complex digital systems that select, reduce, and process enormous volumes of data in real-time to match the resources of state-of-the-art distributed computing available to researchers. Never before in particle physics have such powerful digital reconstruction and filtering systems been matched to a world-wide distributed system of computing of unprecedented scale. The goal of these massive aggregate computing systems is to extract as much physical information as possible from collision events at the LHC with well understood selection criteria and biases. Current strategies and future challenges in providing these aggregate real-time and offline computing systems are described.

  14. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  15. COMPUTING

    CERN Multimedia

    M. Kasemann, P. McBride. Edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  16. KENS data acquisition system KENSnet

    International Nuclear Information System (INIS)

    Arai, Masatoshi; Furusaka, Michihiro; Satoh, Setsuo; Johnson, M.W.

    1988-01-01

    The installation of a new data acquisition system, KENSnet, has been completed at the KENS neutron facility. For data collection, 160 Mbytes are necessary for temporary disk storage, and 1 MIPS of CPU is required. For the computing system, models were chosen from the VAX family of computers running the proprietary operating system VMS. The VMS operating system has a very user-friendly interface and is well suited to instrument control applications. New data acquisition electronics were developed. A gate module receives a signal of the proton extraction time from the accelerator and checks the veto signals from the sample environment equipment (vacuum, temperature, chopper phasing, etc.). The signal is then issued to a delay-time module. A time-control module starts timing from the delayed start signal from the delay-time module and distributes an encoded time-boundary address to memory modules at the preset times, enabling the memory modules to accumulate data histograms. The data acquisition control program (ICP) and the general data analysis program (Genie) were both developed at ISIS, and have been installed in the new data acquisition system. They give the experimenter 'user-friendly' data acquisition and a good environment for data manipulation. The ICP controls the DAE and transfers the histogram data into the computers. (N.K.)
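The time-boundary histogramming performed by the memory modules amounts to binning event arrival times against preset boundaries, which can be sketched as follows (the boundary values and event times below are illustrative, not KENS settings):

```python
import numpy as np

# Preset time boundaries (microseconds after the delayed start signal);
# these values are invented for illustration.
boundaries = np.array([0, 100, 200, 400, 800, 1600])

# Simulated neutron event arrival times for one detector channel
events = np.array([5, 50, 150, 350, 900, 1500, 120, 90])

# Each memory-module "bin" accumulates the count of events between boundaries
hist, _ = np.histogram(events, bins=boundaries)
print(hist.tolist())  # → [3, 2, 1, 0, 2]
```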

  17. Use of an expert system for the data acquisition input of a computer software for materials flux of an irradiated fuel reprocessing plant

    International Nuclear Information System (INIS)

    Ouvrier, N.; Castelli, P.

    1993-01-01

    An irradiated fuel reprocessing plant produces purified plutonium and various kinds of waste. The PROBILUS program has been developed to calculate material transfer fluxes. Because its data input is lengthy, an expert system, MARCMOD, has been developed. PROBILUS was conceived as a collection of standards. We wanted to automate the tedious tasks, reduce the delay in obtaining results, and improve result quality. For these reasons, the MARCMOD expert system with a graphical interface was built. With this tool the user may describe the process graphically, check the coherence of the acquired information, generate the whole input data set for PROBILUS automatically, and interactively modify some predefined data

  18. Effects of Ordering Strategies and Programming Paradigms on Sparse Matrix Computations

    Science.gov (United States)

    Oliker, Leonid; Li, Xiaoye; Husbands, Parry; Biswas, Rupak; Biegel, Bryan (Technical Monitor)

    2002-01-01

    The Conjugate Gradient (CG) algorithm is perhaps the best-known iterative technique for solving sparse linear systems that are symmetric and positive definite. For systems that are ill-conditioned, it is often necessary to use a preconditioning technique. In this paper, we investigate the effects of various ordering and partitioning strategies on the performance of parallel CG and ILU(0)-preconditioned CG (PCG) using different programming paradigms and architectures. Results show that, for this class of applications: ordering significantly improves overall performance on both distributed and distributed shared-memory systems; cache reuse may be more important than reducing communication; it is possible to achieve message-passing performance using shared-memory constructs through careful data ordering and distribution; and a hybrid MPI+OpenMP paradigm increases programming complexity with little performance gain. An implementation of CG on the Cray MTA does not require special ordering or partitioning to obtain high efficiency and scalability, giving it a distinct advantage for adaptive applications; however, it shows limited scalability for PCG due to a lack of thread-level parallelism.
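A minimal sketch of CG with an optional Jacobi (diagonal) preconditioner, the simplest stand-in for the ILU(0) preconditioning discussed above; the 2x2 SPD test system is made up for illustration:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000, M_inv_diag=None):
    """Plain CG for SPD A; optional Jacobi (diagonal) preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x                      # initial residual
    z = r * M_inv_diag if M_inv_diag is not None else r
    p = z.copy()                       # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)          # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = r * M_inv_diag if M_inv_diag is not None else r
        rz_new = r @ z
        p = z + (rz_new / rz) * p      # conjugate update of the direction
        rz = rz_new
    return x

# SPD test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = conjugate_gradient(A, b, M_inv_diag=1.0 / np.diag(A))
print(np.allclose(A @ x, b))  # → True
```

The paper's point about ordering and partitioning concerns how the sparse `A @ p` product is laid out across processors; the algorithm itself is unchanged.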

  19. Microcomputer data acquisition and control.

    Science.gov (United States)

    East, T D

    1986-01-01

    In medicine and biology there are many tasks that involve routine, well-defined procedures. These tasks are ideal candidates for computerized data acquisition and control. As the performance of microcomputers rapidly increases and cost continues to go down, the temptation to automate the laboratory becomes great. To the novice computer user the choices of hardware and software are overwhelming, and sadly most computer salespeople are not at all familiar with real-time applications. If you want to bill your patients you have hundreds of packaged systems to choose from; however, if you want to do real-time data acquisition the choices are very limited and confusing. The purpose of this chapter is to provide the novice computer user with the basics needed to set up a real-time data acquisition system with common microcomputers. This chapter covers the following issues necessary to establish a real-time data acquisition and control system: Analysis of the research problem: Definition of the problem; Description of data and sampling requirements; Cost/benefit analysis. Choice of microcomputer hardware and software: Choice of microprocessor and bus structure; Choice of operating system; Choice of layered software. Digital data acquisition: Parallel data transmission; Serial data transmission; Hardware and software available. Analog data acquisition: Description of amplitude and frequency characteristics of the input signals; Sampling theorem; Specification of the analog-to-digital converter; Hardware and software available; Interface to the microcomputer. Microcomputer control: Analog output; Digital output; Closed-loop control. Microcomputer data acquisition and control in the 21st century: What is in the future? High-speed digital medical equipment networks; Medical decision making and artificial intelligence.
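Two of the listed design steps, applying the sampling theorem and specifying the analog-to-digital converter, reduce to short calculations. The margin factor and the example signal bandwidth below are illustrative choices, not recommendations from the chapter:

```python
def min_sample_rate(f_max_hz, margin=2.5):
    """Nyquist requires > 2*f_max; a practical margin of ~2.5-10x is common."""
    return margin * f_max_hz

def adc_lsb_volts(v_ref, bits):
    """Smallest voltage step a bits-wide ADC over a v_ref span can resolve."""
    return v_ref / (2 ** bits)

# A signal with ~25 Hz of useful content needs more than 50 samples/s:
print(min_sample_rate(25))                       # → 62.5
# A 12-bit converter over a 10 V span resolves about 2.44 mV per count:
print(round(adc_lsb_volts(10.0, 12) * 1000, 2))  # → 2.44
```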

  20. Data Acquisition System

    International Nuclear Information System (INIS)

    Watwood, D.; Beatty, J.

    1991-01-01

    The Data Acquisition System (DAS) is comprised of a Hewlett-Packard (HP) model 9816, Series 200 Computer System with the appropriate software to acquire, control, and archive data from a Data Acquisition/Control Unit, models HP3497A and HP3498A. The primary storage medium is an HP9153 16-megabyte hard disc. The data is backed up on three floppy discs. One floppy disc drive is contained in the HP9153 chassis; the other two comprise an HP9122 dual disc drive. An HP82906A line printer supplies hard-copy backup. A block diagram of the hardware setup is shown. The HP3497A/3498A Data Acquisition/Control Units read each input channel and transmit the raw voltage reading to the HP9816 CPU via the HPIB bus. The HP9816 converts this voltage to the appropriate engineering units using the calibration curves for the sensor being read. The HP9816 archives both the raw and processed data, along with the time the readings were taken, to hard and floppy discs. The processed values and reading time are printed on the line printer. This system is designed to accommodate several types of sensors; each type is discussed in the following sections
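The voltage-to-engineering-units step can be sketched as evaluating a per-sensor calibration polynomial on each raw reading; the linear pressure-sensor calibration below is a hypothetical example, not an HP9816 routine:

```python
import numpy as np

# Hypothetical linear calibration for a pressure sensor: 0 V -> 0 kPa,
# 5 V -> 500 kPa. Coefficients would normally come from a calibration run.
calib = np.array([100.0, 0.0])  # engineering_units = 100 * volts + 0

def to_engineering_units(raw_volts, coeffs=calib):
    """Convert raw ADC voltages to engineering units via a polynomial fit."""
    return np.polyval(coeffs, raw_volts)

readings = np.array([0.0, 2.5, 5.0])  # raw voltages from the HPIB bus
print(to_engineering_units(readings).tolist())  # → [0.0, 250.0, 500.0]
```

Both the raw voltages and the converted values would then be archived together with the reading time, as the record describes.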

  1. Complexity in language acquisition.

    Science.gov (United States)

    Clark, Alexander; Lappin, Shalom

    2013-01-01

    Learning theory has frequently been applied to language acquisition, but discussion has largely focused on information-theoretic problems, in particular on the absence of direct negative evidence. Such arguments typically neglect the probabilistic nature of cognition and learning in general. We argue first that these arguments, and analyses based on them, suffer from a major flaw: they systematically conflate the hypothesis class and the learnable concept class. As a result, they do not allow one to draw significant conclusions about the learner. Second, we claim that the real problem for language learning is the computational complexity of constructing a hypothesis from input data. Studying this problem allows for a more direct approach to the object of study, the language acquisition device, rather than the learnable class of languages, which is epiphenomenal and possibly hard to characterize. The learnability results informed by complexity studies are much more insightful. They strongly suggest that target grammars need to be objective, in the sense that the primitive elements of these grammars are based on objectively definable properties of the language itself. These considerations support the view that language acquisition proceeds primarily through data-driven learning of some form. Copyright © 2013 Cognitive Science Society, Inc.

  2. A Soft Computing Based Approach Using Modified Selection Strategy for Feature Reduction of Medical Systems

    Directory of Open Access Journals (Sweden)

    Kursat Zuhtuogullari

    2013-01-01

    Full Text Available Systems with high-dimensional input spaces require long processing times and heavy memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism with the addition of middle-region solution candidates. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm-based soft computing methods, locking into local solutions is also a problem, and it too is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.
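The roulette wheel (fitness-proportionate) selection mechanism mentioned above can be sketched as follows; the population and fitness values are illustrative:

```python
import random

def roulette_select(population, fitness, rng=random):
    """Fitness-proportionate (roulette wheel) selection of one individual.

    Each individual occupies a slice of the wheel proportional to its
    fitness; a uniform spin picks the slice the pointer lands in.
    """
    total = sum(fitness)
    pick = rng.uniform(0, total)
    acc = 0.0
    for individual, f in zip(population, fitness):
        acc += f
        if pick < acc:
            return individual
    return population[-1]  # pick landed exactly on the wheel's end

# With all fitness mass on one individual, the wheel must return it:
print(roulette_select(["a", "b", "c"], [0.0, 0.0, 5.0]))  # → c
```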

  3. A soft computing based approach using modified selection strategy for feature reduction of medical systems.

    Science.gov (United States)

    Zuhtuogullari, Kursat; Allahverdi, Novruz; Arikan, Nihat

    2013-01-01

    Systems with high-dimensional input spaces require long processing times and heavy memory usage. Most attribute selection algorithms suffer from limits on input dimensionality and from information storage problems. These problems are eliminated by the developed feature reduction software, which uses a new modified selection mechanism with the addition of middle-region solution candidates. The hybrid system software is constructed for reducing the input attributes of systems with a large number of input variables. The designed software also supports the roulette wheel selection mechanism. Linear order crossover is used as the recombination operator. In genetic algorithm-based soft computing methods, locking into local solutions is also a problem, and it too is eliminated by the developed software. Faster and more effective results are obtained in the test procedures. Twelve input variables of the urological system have been reduced to reducts (reduced input attribute sets) with seven, six, and five elements. It can be seen from the obtained results that the developed software with modified selection has advantages in memory allocation, execution time, classification accuracy, sensitivity, and specificity when compared with other reduction algorithms on the urological test data.

  4. Beat the Cheater: Computing Game-Theoretic Strategies for When to Kick a Gambler out of a Casino

    DEFF Research Database (Denmark)

    Sørensen, Troels Bjerre; Dalis, Melissa; Korzhyk, Dmytro

    2014-01-01

    Gambles in casinos are usually set up so that the casino makes a profit in expectation, as long as gamblers play honestly. However, some gamblers are able to cheat, reducing the casino’s profit. How should the casino address this? A common strategy is to selectively kick gamblers out. We model the problem as a Bayesian game in which the casino is a Stackelberg leader that can commit to a (possibly randomized) policy for when to kick gamblers out, and we provide efficient algorithms for computing the optimal policy. Besides being potentially useful to casinos, we imagine that similar techniques could be useful for addressing related problems, for example, illegal trades in financial markets.
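The commitment computation can be illustrated with a toy Stackelberg model: the casino commits to an ejection probability, the gambler best-responds by cheating or playing honestly, and the leader's best commitment is found by search. All payoffs below are invented for illustration and are not from the cited paper:

```python
def follower_best_response(p):
    """Gambler's best response to ejection probability p.

    Cheating pays 2 per round if not caught, honesty pays 1; ejection pays 0.
    Standard Stackelberg tie-breaking: at indifference the follower breaks
    ties in the leader's favour, i.e. plays honestly.
    """
    cheat_value = (1 - p) * 2.0
    honest_value = 1.0
    return "cheat" if cheat_value > honest_value else "honest"

def leader_utility(p):
    if follower_best_response(p) == "cheat":
        return (1 - p) * (-1.0)  # casino loses 1 to an uncaught cheater
    return 1.0                   # honest play earns the house edge

# Grid search over commitment probabilities for the leader's optimum
best_p = max((p / 100 for p in range(101)), key=leader_utility)
print(best_p, leader_utility(best_p))  # → 0.5 1.0
```

The optimum sits exactly at the follower's indifference point, the hallmark of Stackelberg commitment: the leader randomizes just enough to deter cheating and no more.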

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. The Effectiveness of the Instructional Programs Based on Self-Management Strategies in Acquisition of Social Skills by the Children with Intellectual Disabilities

    Science.gov (United States)

    Avcioglu, Hasan

    2012-01-01

    The purpose of this study is to evaluate the effectiveness of a self-management skills training program, based on self-control strategies, on students with intellectual disabilities. A multiple-probe across-subjects single-subject research design was used in this study. Nine students with intellectual disabilities, whose ages are between…

  13. Strategies of Consumer Demand Information Acquisition Based on the Internet Community

    Institute of Scientific and Technical Information of China (English)

    徐颖; 李倩

    2011-01-01

    Starting from the roles played by Internet community participants, namely community builders, topic publishers, and topic discussants, the paper proposes basic policies for enterprises collecting consumer demand information in Internet communities. It analyzes how the forms taken by consumer demand information are converted, and constructs concrete routes for gathering that information in Internet communities, thereby providing a reference for enterprises formulating consumer demand information acquisition strategies.

  14. Knowledge-sharing Behavior and Post-acquisition Integration Failure

    DEFF Research Database (Denmark)

    Gammelgaard, Jens; Husted, Kenneth; Michailova, Snejina

    2004-01-01

    Abstract: Not achieving the anticipated synergy effects in the post-acquisition integration context is a serious cause for the high acquisition failure rate. While existing studies on failures of acquisitions exist from economics, finance, strategy, organization theory, and human resources management, this paper applies insights from the knowledge-sharing literature. The paper establishes a conceptual link between obstacles in the post-acquisition integration processes and individual knowledge-sharing behavior as related to knowledge transmitters and knowledge receivers. We argue that such an angle offers important insights to explaining the high failure rate in acquisitions. Descriptors: post-acquisition integration, acquisition failure, individual knowledge-sharing behavior

  15. Defragging Computer/Videogame Implementation and Assessment in the Social Studies

    Science.gov (United States)

    McBride, Holly

    2014-01-01

    Students in this post-industrial technological age require opportunities for the acquisition of new skills, especially in the marketplace of innovation. A pedagogical strategy that is becoming more and more popular within social studies classrooms is the use of computer and video games as enhancements to everyday lesson plans. Computer/video games…

  16. Behavior acquisition in artificial agents

    OpenAIRE

    Thurau, Christian

    2006-01-01

    Computational skill acquisition in robots and simulated agents has been a topic of increasing popularity in recent years. Despite impressive progress, autonomous behavior at the level of animals and humans is not yet replicated by machines. Especially when a complex environment demands versatile, goal-oriented behavior, current artificial systems show shortcomings. Consider for instance modern 3D computer games. Despite their key role for a more immersive game experience, surprisingly l...

  17. Advanced IPNE data acquisition system

    International Nuclear Information System (INIS)

    Duma, M.; Moisa, D.; Petrovici, M.; Berceanu, I.; Ivascu, M.; Pascovici, G.; Simion, V.; Osvath, E.; Bock, R.; Gobbi, A.; Hildebrand, K.D.; Lynen, U.; Mueller, W.F.J.; Beeskow, M.

    1987-05-01

    A complex and flexible data acquisition system has been developed in order to run relatively complex experiments on our accelerator system, ALIGATOR. The AIDA programme runs on a small PDP-11/34 computer and is based on CAMAC hardware. The main hardware and software features are presented. (authors)

  18. The SINQ data acquisition environment

    Energy Technology Data Exchange (ETDEWEB)

    Maden, D [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1996-11-01

    The data acquisition environment for the neutron scattering instruments supported by LNS at SINQ is described. The intention is to provide future users with the necessary background to the computing facilities on site rather than to present a user manual for the on-line system. (author) 5 figs., 6 refs.

  19. E802 data acquisition complex

    International Nuclear Information System (INIS)

    LeVine, M.J.

    1985-01-01

    The data acquisition architecture planned for experiment E802 is described. A VAX 11/785 will be front-ended by a VME-based array of 68000-family microprocessors, which will be used as intelligent CAMAC crate controllers, event builders, and to augment the computing power of the VAX for on-line analysis of the data. 4 refs

  20. The SINQ data acquisition environment

    International Nuclear Information System (INIS)

    Maden, D.

    1996-01-01

    The data acquisition environment for the neutron scattering instruments supported by LNS at SINQ is described. The intention is to provide future users with the necessary background to the computing facilities on site rather than to present a user manual for the on-line system. (author) 5 figs., 6 refs

  1. Data-Acquisition Systems for Fusion Devices

    NARCIS (Netherlands)

    van Haren, P. C.; Oomens, N. A.

    1993-01-01

    During the last two decades, computerized data acquisition systems (DASs) have been applied at magnetic confinement fusion devices. Present-day data acquisition is done by means of distributed computer systems and transient recorders in CAMAC systems. The development of DASs has been technology

  2. EBT data acquisition and analysis system

    International Nuclear Information System (INIS)

    Burris, R.D.; Greenwood, D.E.; Stanton, J.S.; Geoffroy, K.A.

    1980-10-01

    This document describes the design and implementation of a data acquisition and analysis system for the EBT fusion experiment. The system includes data acquisition on five computers, automatic transmission of that data to a large, central data base, and a powerful data retrieval system. The system is flexible and easy to use, and it provides a fully documented record of the experiments

  3. Data Acquisition and Real-Time Systems.

    Science.gov (United States)

    Lawrence, D. E., Ed.; Fenwick, P. M., Ed.

    The first group of papers starts with a tutorial paper which surveys the methods used in data acquisition systems. Other papers in this group describe: (1) some problems involved in the computer acquisition of high-speed randomly-occurring data and the protection of this data from accidental corruption, (2) an input/output bus to allow an IBM…

  4. A Strategy Combining Higher Energy C-Trap Dissociation with Neutral Loss- and Product Ion-Based MSn Acquisition for Global Profiling and Structure Annotation of Fatty Acids Conjugates.

    Science.gov (United States)

    Bi, Qi-Rui; Hou, Jin-Jun; Yang, Min; Shen, Yao; Qi, Peng; Feng, Rui-Hong; Dai, Zhuo; Yan, Bing-Peng; Wang, Jian-Wei; Shi, Xiao-Jian; Wu, Wan-Ying; Guo, De-An

    2017-03-01

    Fatty acid conjugates (FACs) are ubiquitous but found in trace amounts in the natural world. They are composed of multiple unknown substructures and side chains, and thus are difficult to analyze by traditional mass spectrometric methods. In this study, an integrated strategy was developed for global profiling and targeted structure annotation of FACs in complex matrices by LTQ Orbitrap. Dicarboxylic acid conjugated bufotoxins (DACBs) in Venenum bufonis (VB) were used as model compounds. The new strategy (abbreviated as HPNA) combined higher-energy C-trap dissociation (HCD) with product ion- (PI) and neutral loss- (NL) based MSn (n ≥ 3) acquisition in both positive-ion and negative-ion modes. Several advantages are presented. First, various side chains, both known and unknown, were found under HCD in negative-ion mode. Second, DACBs with multiple side chains were detected simultaneously in one run; compared with traditional quadrupole-based mass methods, this greatly increased analysis throughput. Third, the fragment ions of the side-chain and steroid substructures could be obtained by PI- and NL-based MSn acquisition, respectively, which greatly increased the accuracy of the structure annotation of DACBs. In all, 78 DACBs were discovered, of which 68 were new compounds; 25 types of substructure formulas and seven dicarboxylic acid side chains were found, including five new side chains: two saturated dicarboxylic acids [azelaic acid (C9) and sebacic acid (C10)] and three unsaturated dicarboxylic acids (u-C8, u-C9, and u-C10). All these results greatly enriched the known structures of DACBs in VB. Graphical Abstract available.
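
    The NL-based acquisition described above keys on characteristic mass differences between precursor and fragment ions. As a minimal illustration of the idea only (not the authors' implementation; the function name, tolerance, and target-loss table are assumptions), a neutral-loss match for singly charged ions can be sketched as:

```python
def neutral_loss_hits(precursor_mz, fragment_mzs, target_losses, tol=0.01):
    """Flag fragments whose mass difference from a singly charged
    precursor matches a target neutral-loss mass within tol (Da)."""
    hits = []
    for frag in fragment_mzs:
        loss = precursor_mz - frag
        for name, mass in target_losses.items():
            if abs(loss - mass) <= tol:
                hits.append((frag, name))
    return hits

# Illustrative target: loss of water (monoisotopic mass 18.0106 Da).
print(neutral_loss_hits(500.0, [481.9894, 350.0], {"H2O": 18.0106}))
```

    In the real strategy the target list would hold the dicarboxylic acid side-chain masses, and matching would trigger the MSn scans.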

  5. 2017 NAIP Acquisition Map

    Data.gov (United States)

    Farm Service Agency, Department of Agriculture — Planned States for 2017 NAIP acquisition and acquisition status layer (updated daily). Updates to the acquisition seasons may be made during the season to...

  6. How Cisco Systems Used Enterprise Architecture Capability to Sustain Acquisition-Based Growth

    DEFF Research Database (Denmark)

    Toppenberg, Gustav; Shanks, Graeme; Henningsson, Stefan

    2015-01-01

    Value-creating acquisitions are a major challenge for many firms. The case of Cisco Systems shows that an advanced enterprise architecture (EA) capability can contribute to the four phases of the acquisition process: pre-acquisition preparation, acquisition selection, acquisition integration and post-integration management. Cisco's EA capability improves its ability to rapidly capture value from acquisitions and to sustain its acquisition-based growth strategy over time.

  7. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  8. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  10. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  11. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data were then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and in the flexibility of using resources. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week, with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  13. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  14. Triggering and data acquisition general considerations

    International Nuclear Information System (INIS)

    Butler, Joel N.

    2003-01-01

    We provide a general introduction to trigger and data acquisition systems in High Energy Physics. We emphasize the new possibilities and new approaches that have been made possible by developments in computer technology and networking

  15. Emerging Market Firms’ Acquisitions in Advanced Markets

    DEFF Research Database (Denmark)

    Stucchi, Tamara

    2012-01-01

    This study draws upon the resource-based view and the institution-based view of the firm to provide a comprehensive overview of how different resource-, institution- and industry-based antecedents affect the motivations guiding the acquisitions that emerging market firms undertake in advanced markets. These antecedents can influence emerging market firms' capacities to absorb or exploit technological and/or marketing advantages in advanced markets. In order to be successful, emerging market firms have to undertake those upmarket acquisitions that best "fit" their antecedents. Four mutually exclusive acquisition strategies are derived, which are then illustrated using examples of Indian firms' acquisitions in advanced markets.

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and its components are now also deployed at CERN, in addition to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  17. Syntax acquisition.

    Science.gov (United States)

    Crain, Stephen; Thornton, Rosalind

    2012-03-01

    Every normal child acquires a language in just a few years. By 3 or 4 years old, children have effectively become adults in their abilities to produce and understand endlessly many sentences in a variety of conversational contexts. There are two alternative accounts of the course of children's language development. These different perspectives can be traced back to the nature versus nurture debate about how knowledge is acquired in any cognitive domain. One perspective dates back to Plato's dialog 'The Meno'. In this dialog, the protagonist, Socrates, demonstrates to Meno, an aristocrat in Ancient Greece, that a young slave knows more about geometry than he could have learned from experience. By extension, Plato's Problem refers to any gap between experience and knowledge. How children fill in the gap in the case of language continues to be the subject of much controversy in cognitive science. Any model of language acquisition must address three factors, inter alia: 1. The knowledge children accrue; 2. The input children receive (often called the primary linguistic data); 3. The nonlinguistic capacities of children to form and test generalizations based on the input. According to the famous linguist Noam Chomsky, the main task of linguistics is to explain how children bridge the gap (Chomsky calls it a 'chasm') between what they come to know about language and what they could have learned from experience, even given optimistic assumptions about their cognitive abilities. Proponents of the alternative 'nurture' approach accuse nativists like Chomsky of overestimating the complexity of what children learn, underestimating the data children have to work with, and manifesting undue pessimism about children's abilities to extract information based on the input. The modern 'nurture' approach is often referred to as the usage-based account. We discuss the usage-based account first, and then the nativist account. After that, we report and discuss the findings of several

  18. Perceived risks in online hotel services acquisition: Determinant factors of reduction strategies and their relation with consumers' demographic characteristics

    Directory of Open Access Journals (Sweden)

    Anderson Gomes de Souza

    2012-09-01

    Full Text Available The purpose of this article was to identify which factors determine the adoption of strategies to reduce perceived risk in the purchase of hotel services online, checking whether there is any relation between those risk relievers and demographic characteristics. A factor analysis was conducted after application of a structured questionnaire to consumers who habitually travel and book hotels through the internet. The results showed that the factors characterizing the strategies used by consumers as a means of reducing risk in the virtual environment are: users' and partners' own experience, regulation, guarantees and certainty, and higher price. That is, these factors are relevant and tend to be used by consumers as a way to reduce risk when purchasing a hotel service online. On the other hand, it was found that demographic characteristics show no relationship with consumers' perception of risk. With these results, the question that remains is whether the companies in this field develop actions aligned with these factors in order to enable consumers to reduce their risk perceptions while buying these services on websites.

  19. Are Educational Computer Micro-Games Engaging and Effective for Knowledge Acquisition at High-Schools? A Quasi-Experimental Study

    Science.gov (United States)

    Brom, Cyril; Preuss, Michal; Klement, Daniel

    2011-01-01

    Curricular schooling can benefit from the usage of educational computer games, but it is difficult to integrate them in the formal schooling system. Here, we investigate one possible approach to this integration, which capitalizes on using a micro-game that can be played with a teacher's guidance as a supplement after a traditional expository…

  20. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase the availability of more sites, such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  1. Acquisition of a potent and selective TC-PTP inhibitor via a stepwise fluorophore-tagged combinatorial synthesis and screening strategy.

    Science.gov (United States)

    Zhang, Sheng; Chen, Lan; Luo, Yong; Gunawan, Andrea; Lawrence, David S; Zhang, Zhong-Yin

    2009-09-16

    Protein tyrosine phosphatases (PTPs) regulate a broad range of cellular processes including proliferation, differentiation, migration, apoptosis, and immune responses. Dysfunction of PTP activity is associated with cancers, metabolic syndromes, and autoimmune disorders. Consequently, small molecule PTP inhibitors should serve not only as powerful tools to delineate the physiological roles of these enzymes in vivo but also as lead compounds for therapeutic development. We describe a novel stepwise fluorophore-tagged combinatorial library synthesis and competitive fluorescence polarization screening approach that transforms a weak and general PTP inhibitor into an extremely potent and selective TC-PTP inhibitor with highly efficacious cellular activity. The result serves as a proof-of-concept in PTP inhibitor development, as it demonstrates the feasibility of acquiring potent, yet highly selective, cell permeable PTP inhibitory agents. Given the general nature of the approach, this strategy should be applicable to other PTP targets.

  2. Multi spectral scaling data acquisition system

    International Nuclear Information System (INIS)

    Behere, Anita; Patil, R.D.; Ghodgaonkar, M.D.; Gopalakrishnan, K.R.

    1997-01-01

    In nuclear spectroscopy applications, it is often desired to acquire data at high rates with high resolution. With the availability of low-cost computers, it is possible to build a powerful data acquisition system with minimal hardware and software development by designing a PC plug-in acquisition board. But if the PC processor is used for data acquisition, the PC cannot serve as a multitasking node. Keeping this in view, PC plug-in acquisition boards with an on-board processor find tremendous application. A transputer-based data acquisition board has been designed which can be configured as a high-count-rate pulse-height MCA or as a multi-spectral scaler. Multi Spectral Scaling (MSS) is a new technique in which multiple spectra are acquired in small time frames and are then analyzed. This paper describes the details of this multi-spectral-scaling data acquisition system. 2 figs

  3. Cloud Computing Strategy

    Science.gov (United States)

    2012-07-01

    regardless of access point or the device being used across the Global Information Grid (GIG). These data centers will host existing applications... state. It illustrates that the DoD Enterprise Cloud is an integrated environment on the GIG, consisting of DoD Components, commercial entities... Operations and Maintenance (O&M) costs by leveraging economies of scale, and automate monitoring and provisioning to reduce the human cost of service

  4. Development of a Data Acquisition Program for the Purpose of Monitoring Processing Statistics Throughout the BaBar Online Computing Infrastructure's Farm Machines

    Energy Technology Data Exchange (ETDEWEB)

    Stonaha, P.

    2004-09-03

    A current shortcoming of the BaBar monitoring system is the lack of systematic gathering, archiving, and access to the running statistics of the BaBar Online Computing Infrastructure's farm machines. A program has been written in C to gather the raw running statistics of each machine and compute various rates and percentages that can be used for system monitoring. These rates and percentages can then be stored in an EPICS database for graphing, archiving, and future access. Graphical outputs confirm the reception of the data into the EPICS database. The C program detects whether the data are 32- or 64-bit and corrects for counter overflows. This program is not exclusive to BaBar and can be easily modified for any system.
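
    The overflow correction mentioned in the abstract is a common pattern when sampling fixed-width kernel counters. A minimal sketch of the idea (the function name and the use of Python rather than the original C are assumptions for brevity):

```python
def counter_delta(prev, curr, width=32):
    """Increment between two raw readings of a fixed-width wrapping
    counter; a wrap-around shows up as curr < prev and is corrected
    modulo 2**width."""
    return (curr - prev) % (1 << width)

# No wrap: plain difference.
print(counter_delta(100, 150))       # 50
# A 32-bit counter wrapped past its maximum: 10 counts elapsed.
print(counter_delta(2**32 - 5, 5))   # 10
```

    Dividing such deltas by the sampling interval yields the per-second rates that a monitor would push into a time-series database.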

  5. An Analysis of the RCA Price-S Cost Estimation Model as it Relates to Current Air Force Computer Software Acquisition and Management.

    Science.gov (United States)

    1979-12-01

    because of the use of complex computational algorithms (Ref 25). Another important factor affecting the cost of software is the size of the development... involved the alignment and navigational algorithm portions of the software. The second avionics system application was the development of an inertial...

  6. An efficient rhythmic component expression and weighting synthesis strategy for classifying motor imagery EEG in a brain computer interface

    Science.gov (United States)

    Wang, Tao; He, Bin

    2004-03-01

    The recognition of mental states during motor imagery tasks is crucial for EEG-based brain computer interface research. We have developed a new algorithm by means of frequency decomposition and a weighting synthesis strategy for recognizing imagined right- and left-hand movements. A frequency range from 5 to 25 Hz was divided into 20 band bins for each trial, and the corresponding envelopes of filtered EEG signals for each trial were extracted as a measure of instantaneous power at each frequency band. The dimensionality of the feature space was reduced from 200 (corresponding to 2 s) to 3 by down-sampling of the envelopes of the feature signals and subsequently applying principal component analysis. The linear discriminant analysis algorithm was then used to classify the features, due to its generalization capability. Each frequency band bin was weighted by a function determined according to the classification accuracy during the training process. The present classification algorithm was applied to a dataset of nine human subjects, and achieved a success rate of classification of 90% in training and 77% in testing. The present promising results suggest that the present classification algorithm can be used for general-purpose mental-state recognition based on motor imagery tasks.
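
    The feature-extraction pipeline described (band decomposition, envelope extraction, block-averaged down-sampling, before PCA and LDA) can be sketched numerically. This is an illustrative reconstruction, not the authors' code: the function names, the numpy-only Hilbert transform, and the block-averaging details are assumptions; the PCA and discriminant stages are omitted for brevity.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal via the FFT (same construction as scipy.signal.hilbert)."""
    n = x.size
    spec = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.fft.ifft(spec * h)

def band_envelope(x, fs, lo, hi):
    """Amplitude envelope of x restricted to [lo, hi) Hz:
    FFT band masking followed by the analytic-signal magnitude."""
    n = x.size
    freqs = np.fft.rfftfreq(n, 1.0 / fs)
    spec = np.where((freqs >= lo) & (freqs < hi), np.fft.rfft(x), 0.0)
    band = np.fft.irfft(spec, n)
    return np.abs(analytic_signal(band))

def features(trial, fs, n_bands=20, f_lo=5.0, f_hi=25.0, n_keep=10):
    """One feature vector per trial: 20 band envelopes, each
    down-sampled to n_keep values by block averaging."""
    edges = np.linspace(f_lo, f_hi, n_bands + 1)
    rows = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        env = band_envelope(trial, fs, lo, hi)
        env = env[: (env.size // n_keep) * n_keep]
        rows.append(env.reshape(n_keep, -1).mean(axis=1))
    return np.concatenate(rows)

# Sanity check: a pure 10 Hz tone has a flat envelope of 1 in its band.
fs = 250
t = np.arange(500) / fs
tone = np.sin(2 * np.pi * 10 * t)
print(features(tone, fs).shape)   # (200,)
```

    The resulting 200-dimensional vectors would then be reduced to 3 components (e.g. via SVD-based PCA) and fed to a linear discriminant, per the abstract.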

  7. Cation-π interactions: computational analyses of the aromatic box motif and the fluorination strategy for experimental evaluation.

    Science.gov (United States)

    Davis, Matthew R; Dougherty, Dennis A

    2015-11-21

    Cation-π interactions are common in biological systems, and many structural studies have revealed the aromatic box as a common motif. With the aim of understanding the nature of the aromatic box, several computational methods were evaluated for their ability to reproduce experimental cation-π binding energies. We find the DFT method M06 with the 6-31G(d,p) basis set performs best of several methods tested. The binding of benzene to a number of different cations (sodium, potassium, ammonium, tetramethylammonium, and guanidinium) was studied. In addition, the binding of the organic cations NH4(+) and NMe4(+) to ab initio generated aromatic boxes as well as examples of aromatic boxes from protein crystal structures were investigated. These data, along with a study of the distance dependence of the cation-π interaction, indicate that multiple aromatic residues can meaningfully contribute to cation binding, even with displacements of more than an angstrom from the optimal cation-π interaction. Progressive fluorination of benzene and indole was studied as well, and binding energies obtained were used to reaffirm the validity of the "fluorination strategy" to study cation-π interactions in vivo.

  8. Data acquisition for experiments with multi-detector arrays

    Indian Academy of Sciences (India)

    Experiments with multi-detector arrays have special requirements and place higher demands on computer data acquisition systems. In this contribution we discuss data acquisition systems with special emphasis on multi-detector arrays and in particular we describe a new data acquisition system, AMPS which we have ...

  9. Data-acquisition systems

    International Nuclear Information System (INIS)

    Cyborski, D.R.; Teh, K.M.

    1995-01-01

    Up to now, DAPHNE, the data-acquisition system developed for ATLAS, was used routinely for experiments at ATLAS and the Dynamitron. More recently, the Division implemented 2 MSU/DAPHNE systems. The MSU/DAPHNE system is a hybrid data-acquisition system which combines the front-end of the Michigan State University (MSU) DA system with the traditional DAPHNE back-end. The MSU front-end is based on commercially available modules. This alleviates the problems encountered with the DAPHNE front-end which is based on custom designed electronics. The first MSU system was obtained for the APEX experiment and was used there successfully. A second MSU front-end, purchased as a backup for the APEX experiment, was installed as a fully-independent second MSU/DAPHNE system with the procurement of a DEC 3000 Alpha host computer, and was used successfully for data-taking in an experiment at ATLAS. Additional hardware for a third system was bought and will be installed. With the availability of 2 MSU/DAPHNE systems in addition to the existing APEX setup, it is planned that the existing DAPHNE front-end will be decommissioned

  10. Relative Effectiveness of Computer-Supported Jigsaw II, STAD and TAI Cooperative Learning Strategies on Performance, Attitude, and Retention of Secondary School Students in Physics

    Science.gov (United States)

    Gambari, Amosa Isiaka; Yusuf, Mudasiru Olalere

    2017-01-01

    This study investigated the relative effectiveness of computer-supported cooperative learning strategies on the performance, attitudes, and retention of secondary school students in physics. A purposive sampling technique was used to select four senior secondary schools from Minna, Nigeria. The students were allocated to one of four groups:…

  11. Acquisition of data from on-line laser turbidimeter and calculation of some kinetic variables in computer-coupled automated fed-batch culture

    International Nuclear Information System (INIS)

    Kadotani, Y.; Miyamoto, K.; Mishima, N.; Kominami, M.; Yamane, T.

    1995-01-01

    Output signals of a commercially available on-line laser turbidimeter exhibit fluctuations due to air and/or CO₂ bubbles. A simple data-processing algorithm and personal-computer software have been developed to smooth the noisy turbidity data acquired and to use them for on-line calculation of some kinetic variables involved in batch and fed-batch cultures of uniformly dispersed microorganisms. With this software, about 10³ instantaneous turbidity readings acquired over 55 s are averaged and converted to dry cell concentration, X, every minute. The volume of the culture broth, V, is estimated from the averaged weight loss, W, of the feed-solution reservoir, measured by an electronic balance on which the reservoir is placed. The software then performs linear regression analyses over the past 30 min of the total biomass, VX, the natural logarithm of the total biomass, ln(VX), and the weight loss, W, to calculate the volumetric growth rate, d(VX)/dt, the specific growth rate, μ [= d ln(VX)/dt], and the rate of weight loss, dW/dt, every minute in a fed-batch culture. This first-order regression analysis of VX, ln(VX) and W was applied to batch and fed-batch cultures of Escherichia coli on minimum synthetic or natural complex media. Sample determination coefficients of the three variables (VX, ln(VX) and W) were close to unity, indicating that the calculations are accurate. Furthermore, the growth yield, Y_x/s, and the specific substrate consumption rate, q_sc, were approximately estimated from these data in a 'balanced' fed-batch culture of E. coli on the minimum synthetic medium, where the computer-aided substrate-feeding system automatically matches the cell growth. (author)
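
    The core on-line calculation described above is a rolling first-order regression: μ is the least-squares slope of ln(VX) over the most recent 30-minute window, with one averaged sample per minute. A rough sketch under those assumptions (the function name and the synthetic data are illustrative, not from the paper):

```python
import numpy as np

def specific_growth_rate(t_min, ln_vx, window=30):
    """Estimate mu = d ln(VX)/dt as the least-squares slope over the
    most recent `window` samples (one sample per minute, as in the paper)."""
    t = np.asarray(t_min[-window:], dtype=float)
    y = np.asarray(ln_vx[-window:], dtype=float)
    slope, _intercept = np.polyfit(t, y, 1)  # first-order regression
    return slope  # per minute

# Synthetic exponential growth: VX = VX0 * exp(mu * t) with mu = 0.01 / min,
# so ln(VX) is exactly linear in t.
t = np.arange(0, 60)            # one turbidity-derived sample per minute
ln_vx = np.log(0.5) + 0.01 * t
mu_hat = specific_growth_rate(t, ln_vx)
print(round(mu_hat, 4))         # recovers 0.01
```

    The same windowed `polyfit` slope applied to VX and W directly yields d(VX)/dt and dW/dt.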

  12. Prospectively electrocardiogram-triggered high-pitch spiral acquisition coronary computed tomography angiography for assessment of biodegradable vascular scaffold expansion: Comparison with optical coherence tomography

    Energy Technology Data Exchange (ETDEWEB)

    D’Alfonso, Maria Grazia [Interventional Cardiology Unit University Of Florence, Heart and Vessels department, AOU Careggi, Florence (Italy); Mattesini, Alessio, E-mail: amattesini@gmail.com [Interventional Cardiology Unit University Of Florence, Heart and Vessels department, AOU Careggi, Florence (Italy); Meucci, Francesco [Interventional Cardiology Unit University Of Florence, Heart and Vessels department, AOU Careggi, Florence (Italy); Acquafresca, Manlio [Radiology Unit 4, Radiology Department, AOU Careggi, Florence (Italy); Gensini, Gian Franco; Valente, Serafina [Interventional Cardiology Unit University Of Florence, Heart and Vessels department, AOU Careggi, Florence (Italy)

    2014-11-15

    BVS polymeric struts are transparent to light, so the vessel-wall contour can be easily visualized using optical coherence tomography (OCT). OCT therefore represents a unique tool both for evaluating the resorption process and for assessing acute BVS mechanical failure. Similarly, the metal-free struts allow unrestricted coronary computed tomography angiography (CCTA), so this non-invasive method might become the gold standard for non-invasive assessment of BVS. In this case we show the ability of CCTA, performed with a low X-ray dose, to provide a good evaluation of scaffold expansion. The quantitative measurements were in agreement with those obtained with OCT.

  13. Speed in Acquisitions

    DEFF Research Database (Denmark)

    Meglio, Olimpia; King, David R.; Risberg, Annette

    2017-01-01

    The advantage of speed is often invoked by academics and practitioners as an essential condition during post-acquisition integration, frequently without consideration of the impact earlier decisions have on acquisition speed. In this article, we examine the role speed plays across the acquisition process, using research organized around characteristics that display complexity with respect to acquisition speed. We incorporate existing research with a process perspective of acquisitions in order to present trade-offs, and consider the influence of both stakeholders and the pre-deal-completion context on acquisition speed, as well as the organization's capabilities to facilitate that speed. Observed trade-offs suggest both that acquisition speed often requires longer planning time before an acquisition and that the associated decisions require managerial judgement. A framework for improving...

  14. The ALICE data acquisition system

    CERN Document Server

    Carena, F; Chapeland, S; Chibante Barroso, V; Costa, F; Dénes, E; Divià, R; Fuchs, U; Grigore, A; Kiss, T; Simonetti, G; Soós, C; Telesca, A; Vande Vyvre, P; Von Haller, B

    2014-01-01

    In this paper we describe the design, construction, commissioning and operation of the Data Acquisition (DAQ) and Experiment Control System (ECS) of the ALICE experiment at the CERN Large Hadron Collider (LHC). The DAQ and the ECS are the systems used, respectively, for the acquisition of all physics data and for the overall control of the experiment. They are two computing systems made of hundreds of PCs and data-storage units interconnected via two networks. The collection of experimental data from the detectors is performed by several hundred high-speed optical links. We describe in detail the design considerations for these systems, which handle the extreme data throughput resulting from central lead-ion collisions at LHC energies. The implementation of the resulting requirements into hardware (custom optical links and commercial computing equipment), infrastructure (racks, cooling, power distribution, control room), and software led to many innovative solutions, which are described together with ...
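
    A central task in any such DAQ is event building: fragments of the same collision arrive independently over many detector links and must be merged into complete events. The toy sketch below is purely illustrative (three links, string payloads, in-memory dicts); it is not ALICE code, only the general pattern of keying fragments by event number and emitting an event once all links have reported.

```python
from collections import defaultdict

N_LINKS = 3  # assumed number of detector links in this toy example

def build_events(fragments):
    """fragments: iterable of (event_id, link_id, payload) tuples arriving
    in arbitrary order. Yields (event_id, payloads-ordered-by-link) once
    all N_LINKS fragments of an event have arrived."""
    pending = defaultdict(dict)
    for event_id, link_id, payload in fragments:
        pending[event_id][link_id] = payload
        if len(pending[event_id]) == N_LINKS:
            links = pending.pop(event_id)
            yield event_id, [links[i] for i in sorted(links)]

# Fragments from two events, interleaved across links:
stream = [(1, 0, "a0"), (2, 0, "b0"), (1, 2, "a2"),
          (1, 1, "a1"), (2, 2, "b2"), (2, 1, "b1")]
events = list(build_events(stream))
print(events)  # [(1, ['a0', 'a1', 'a2']), (2, ['b0', 'b1', 'b2'])]
```

    A production system replaces the dict with bounded buffers, timeouts for incomplete events, and network transport between the link readers and the builder nodes.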

  15. Design of the Digital Sky Survey DA and online system: A case history in the use of computer aided tools for data acquisition system design

    Science.gov (United States)

    Petravick, D.; Berman, E.; Nicinski, T.; Rechenmacher, R.; Oleynik, G.; Pordes, R.; Stoughton, C.

    1991-06-01

    As part of its expanding astrophysics program, Fermilab is participating in the Digital Sky Survey (DSS), as part of a collaboration involving the University of Chicago, Princeton University, and the Institute of Advanced Studies (at Princeton). The main DSS results will be a photometric imaging survey and a redshift survey of galaxies and color-selected quasars over π steradians of the Northern Galactic Cap. This paper focuses on our use of Computer Aided Software Engineering (CASE) in specifying the data system for the DSS. Extensions to standard methodologies were necessary to compensate for tool shortcomings and to improve communication among the collaboration members. One important extension was the incorporation of CASE information into the specification document.

  16. SSCL computer planning

    International Nuclear Information System (INIS)

    Price, L.E.

    1990-01-01

    The SSC Laboratory is in the process of planning the acquisition of a substantial computing system to support the design of detectors. Advice has been sought from users and computer experts in several stages. This paper discusses that process.

  17. Effects of Computer-Assisted Instruction in Using Formal Decision-Making Strategies to Choose a College Major.

    Science.gov (United States)

    Mau, Wei-Cheng; Jepsen, David A.

    1992-01-01

    Compared decision-making strategies and college major choice among 113 first-year students assigned to Elimination by Aspects Strategy (EBA), Subjective Expected Utility Strategy (SEU), and control groups. "Rational" EBA students scored significantly higher on choice certainty; lower on choice anxiety and career indecision than "rational"…

  18. Influence of Cone-beam Computed Tomography on Endodontic Retreatment Strategies among General Dental Practitioners and Endodontists.

    Science.gov (United States)

    Rodríguez, Gustavo; Patel, Shanon; Durán-Sindreu, Fernando; Roig, Miguel; Abella, Francesc

    2017-09-01

    Treatment options for endodontic failure include nonsurgical or surgical endodontic retreatment, intentional replantation, and extraction with or without replacement of the tooth. The aim of the present study was to determine the impact of cone-beam computed tomographic (CBCT) imaging on clinical decision making among general dental practitioners and endodontists after failed root canal treatment. A second objective was to assess the self-reported level of difficulty in making a treatment choice before and after viewing a preoperative CBCT scan. Eight patients with endodontically treated teeth diagnosed with symptomatic apical periodontitis, acute apical abscess, or chronic apical abscess were selected. In the first session, the examiners were given the details of each case, including any relevant radiographs, and were asked to choose 1 of the proposed treatment alternatives and rate the difficulty of making a decision. One month later, the examiners reviewed the same 8 cases in random order with the additional information from the CBCT data. The examiners altered their treatment plan after viewing the CBCT scan in 49.8% of the cases. A significant difference in treatment plan between the 2 imaging modalities was recorded for both endodontists and general practitioners (P < .05). After CBCT evaluation, neither group altered their self-reported level of difficulty when choosing a treatment plan (P = .0524). The extraction option rose significantly, to 20%, after viewing the CBCT scan (P < .05). CBCT imaging directly influences endodontic retreatment strategies among general dental practitioners and endodontists. Copyright © 2017 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  19. Rayleigh’s quotient–based damage detection algorithm: Theoretical concepts, computational techniques, and field implementation strategies

    DEFF Research Database (Denmark)

    NJOMO WANDJI, Wilfried

    2017-01-01

    Three levels are targeted: existence, location, and severity. The proposed algorithm is developed analytically from dynamics theory and the virtual-energy principle. Some computational techniques are proposed for carrying out the required computations, including discretization, integration, derivation, and suitable...
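
    The Rayleigh quotient underlying such algorithms is R(φ) = (φᵀKφ)/(φᵀMφ), which equals the squared natural frequency when φ is an exact mode shape; local stiffness loss lowers the quotient evaluated at the healthy mode shape. A rough sketch of that damage-existence check (the two-DOF mass-spring chain and the 20% stiffness reduction are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def rayleigh_quotient(phi, K, M):
    """Rayleigh quotient R(phi) = (phi^T K phi) / (phi^T M phi).
    For an exact mode shape phi, R equals the corresponding eigenvalue
    (the squared natural frequency)."""
    return (phi @ K @ phi) / (phi @ M @ phi)

# Healthy two-DOF mass-spring chain (unit masses, unit spring stiffnesses).
M = np.eye(2)
K = np.array([[2.0, -1.0],
              [-1.0, 1.0]])

# First mode shape of the healthy structure (M = I, so the generalized
# eigenproblem reduces to a standard one; eigh returns ascending eigenvalues).
w2, modes = np.linalg.eigh(K)
phi1 = modes[:, 0]

# "Damage": 20% stiffness loss in the second spring.
K_damaged = np.array([[1.8, -0.8],
                      [-0.8, 0.8]])

R_healthy = rayleigh_quotient(phi1, K, M)
R_damaged = rayleigh_quotient(phi1, K_damaged, M)
print(R_damaged < R_healthy)  # True: stiffness loss lowers the quotient
```

    Localization and severity estimation then come from evaluating such quotients with spatially restricted virtual displacement fields, which is where the discretization and integration techniques mentioned above enter.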

  20. Modelling a data acquisition system

    International Nuclear Information System (INIS)

    Green, P.W.

    1986-01-01

    A data acquisition system to be run on a Data General ECLIPSE computer has been completely designed and developed using a VAX 11/780. This required that many of the features of the RDOS operating system be simulated on the VAX. Advantages and disadvantages of this approach are discussed, with particular regard to transportability of the system among different machines/operating systems, and the effect of the approach on various design decisions