WorldWideScience

Sample records for computations targeting multiple

  1. Multiple operating system rotation environment moving target defense

    Science.gov (United States)

    Evans, Nathaniel; Thompson, Michael

    2016-03-22

    Systems and methods for providing a multiple operating system rotation environment ("MORE") moving target defense ("MTD") computing system are described. The MORE-MTD system provides enhanced computer system security through a rotation of multiple operating systems. The MORE-MTD system increases attacker uncertainty, increases the cost of attacking the system, reduces the likelihood of an attacker locating a vulnerability, and reduces the exposure time of any located vulnerability. The MORE-MTD environment is effectuated by rotation of the operating systems at a given interval. The rotating operating systems create a consistently changing attack surface for remote attackers.

  2. Protein search for multiple targets on DNA

    Energy Technology Data Exchange (ETDEWEB)

    Lange, Martin [Johannes Gutenberg University, Mainz 55122 (Germany); Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Kochugaeva, Maria [Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Kolomeisky, Anatoly B., E-mail: tolya@rice.edu [Department of Chemistry, Rice University, Houston, Texas 77005 (United States); Center for Theoretical Biological Physics, Rice University, Houston, Texas 77005 (United States)

    2015-09-14

    Protein-DNA interactions are crucial for all biological processes. One of the most important fundamental aspects of these interactions is the process of protein searching and recognizing specific binding sites on DNA. A large number of experimental and theoretical investigations have been devoted to uncovering the molecular description of these phenomena, but many aspects of the mechanisms of protein search for targets on DNA remain poorly understood. One of the most intriguing problems is the role of multiple targets in protein search dynamics. Using a recently developed theoretical framework we analyze this question in detail. Our method is based on a discrete-state stochastic approach that takes into account the most relevant physical-chemical processes and leads to a fully analytical description of all dynamic properties. Specifically, systems with two and three targets have been explicitly investigated. It is found that multiple targets in most cases accelerate the search in comparison with a single-target situation. However, the acceleration is not always proportional to the number of targets. Surprisingly, there are even situations when it takes longer to find one of the multiple targets than the single target. This depends on the spatial positions of the targets, the distances between them, the average scanning lengths of protein molecules on DNA, and the total DNA length. Physical-chemical explanations of the observed results are presented. Our predictions are compared with experimental observations as well as with results from a continuum theory for the protein search. Extensive Monte Carlo computer simulations fully support our theoretical calculations.

  3. WISDOM-II: Screening against multiple targets implicated in malaria using computational grid infrastructures

    Directory of Open Access Journals (Sweden)

    Kenyon Colin

    2009-05-01

    Full Text Available Abstract Background Despite continuous efforts of the international community to reduce the impact of malaria on developing countries, no significant progress has been made in recent years and the discovery of new drugs is more than ever needed. Out of the many proteins involved in the metabolic activities of the Plasmodium parasite, some are promising targets for rational drug discovery. Motivation Recent years have witnessed the emergence of grids, which are highly distributed computing infrastructures particularly well fitted for embarrassingly parallel computations like docking. In 2005, a first attempt at using grids for large-scale virtual screening focused on plasmepsins and ended up in the identification of previously unknown scaffolds, which were confirmed in vitro to be active plasmepsin inhibitors. Following this success, a second deployment took place in the fall of 2006 focussing on one well-known target, dihydrofolate reductase (DHFR), and on a new promising one, glutathione-S-transferase. Methods In silico drug design, especially vHTS, is a widely accepted technology in lead identification and lead optimization. This approach therefore builds upon the progress made in computational chemistry to achieve more accurate in silico docking and in information technology to design and operate large-scale grid infrastructures. Results On the computational side, a sustained infrastructure has been developed: docking at large scale, different strategies for result analysis, on-the-fly storage of the results into MySQL databases, and molecular dynamics refinement with MM-PBSA and MM-GBSA rescoring. The modeling results obtained are very promising. Based on the modeling results, in vitro assays are underway for all the targets against which screening is performed. Conclusion The current paper describes the rational drug discovery activity at large scale, especially molecular docking using FlexX software

  4. Matrix multiplication operations with data pre-conditioning in a high performance computing architecture

    Science.gov (United States)

    Eichenberger, Alexandre E; Gschwind, Michael K; Gunnels, John A

    2013-11-05

    Mechanisms for performing matrix multiplication operations with data pre-conditioning in a high performance computing architecture are provided. A vector load operation is performed to load a first vector operand of the matrix multiplication operation to a first target vector register. A load and splat operation is performed to load an element of a second vector operand and replicate the element to each of a plurality of elements of a second target vector register. A multiply add operation is performed on elements of the first target vector register and elements of the second target vector register to generate a partial product of the matrix multiplication operation. The partial product of the matrix multiplication operation is accumulated with other partial products of the matrix multiplication operation.
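
    The load, load-and-splat, and multiply-add sequence described above amounts to an outer-product accumulation. The following Python/NumPy sketch is only an illustration of that sequence under assumed operand names and register widths, not the patented hardware mechanism itself.

    import numpy as np

    def matmul_load_splat(A, B):
        """Illustrative outer-product matrix multiply.

        At each step a 'vector load' reads a column of A, a 'load and splat'
        replicates one element of B across a register-sized vector, and a
        multiply-add accumulates the resulting partial product.
        """
        m, k = A.shape
        k2, n = B.shape
        assert k == k2
        C = np.zeros((m, n))
        for j in range(k):
            a_vec = A[:, j]                        # vector load: one column of A
            for col in range(n):
                b_splat = np.full(m, B[j, col])    # load and splat one element of B
                C[:, col] += a_vec * b_splat       # multiply-add partial product
        return C

    # quick check against NumPy's built-in matrix multiplication
    A = np.random.rand(4, 3)
    B = np.random.rand(3, 5)
    assert np.allclose(matmul_load_splat(A, B), A @ B)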

  5. A mathematical analysis of multiple-target SELEX.

    Science.gov (United States)

    Seo, Yeon-Jung; Chen, Shiliang; Nilsen-Hamilton, Marit; Levine, Howard A

    2010-10-01

    SELEX (Systematic Evolution of Ligands by Exponential Enrichment) is a procedure by which a mixture of nucleic acids can be fractionated with the goal of identifying those with specific biochemical activities. One combines the mixture with a specific target molecule and then separates the target-NA complex from the resulting reaction mixture. The target-NA complex is separated from the unbound NA by mechanical means (such as by filtration), the NA is eluted from the complex, amplified by PCR (polymerase chain reaction), and the process repeated. After several rounds, one should be left with the nucleic acids that best bind to the target. The problem was first formulated mathematically in Irvine et al. (J. Mol. Biol. 222:739-761, 1991). In Levine and Nilsen-Hamilton (Comput. Biol. Chem. 31:11-25, 2007), a mathematical analysis of the process was given. In Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998), multiple-target SELEX was considered. It was assumed that each target has a single nucleic acid binding site that permits occupation by no more than one nucleic acid. Here, we revisit Vant-Hull et al. (J. Mol. Biol. 278:579-597, 1998) using the same assumptions. The iteration scheme is shown to be convergent and a simplified algorithm is given. Our interest here is in the behavior of the multiple-target SELEX process as a discrete "time" dynamical system. Our goal is to characterize the limiting states and their dependence on the initial distribution of nucleic acid and target fraction components. (In multiple-target SELEX, we regard the target component fractions, but not their concentrations, as fixed, and the initial pool of nucleic acids as a variable starting condition.) Given N nucleic acids and a target consisting of M subtarget component species, there is an M × N matrix of affinities, the (i,j) entry corresponding to the affinity of the jth nucleic acid for the ith subtarget. We give a structure condition on this matrix that is equivalent to the following
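
    As a concrete illustration of the multiple-target setting (M subtargets, N nucleic acids, an M × N affinity matrix), the Python sketch below runs a toy selection-amplification iteration under a simplified linear-binding assumption; it is a hypothetical model for intuition, not the convergent iteration scheme analyzed in the paper.

    import numpy as np

    def selex_round(x, K, f):
        """One round of a toy multiple-target SELEX iteration.

        x : current nucleic-acid pool fractions (length N, sums to 1)
        K : M x N affinity matrix; K[i, j] is the affinity of NA j for subtarget i
        f : target component fractions (length M, sums to 1)

        Simplified linear-binding assumption (hypothetical, for intuition only):
        the amount of NA j carried into the next round is proportional to its
        pool fraction weighted by its affinity for the target mixture.
        """
        weight = f @ K                    # effective affinity of each NA for the target mix
        x_next = x * weight
        return x_next / x_next.sum()      # PCR amplification modeled as renormalization

    # Example: 2 subtargets, 3 nucleic acids
    K = np.array([[10.0, 1.0, 0.1],
                  [0.5, 8.0, 0.2]])
    f = np.array([0.5, 0.5])
    x = np.ones(3) / 3
    for _ in range(10):
        x = selex_round(x, K, f)          # pool converges toward the best binders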

  6. Genome-wide identification of the regulatory targets of a transcription factor using biochemical characterization and computational genomic analysis

    Directory of Open Access Journals (Sweden)

    Jolly Emmitt R

    2005-11-01

    Full Text Available Abstract Background A major challenge in computational genomics is the development of methodologies that allow accurate genome-wide prediction of the regulatory targets of a transcription factor. We present a method for target identification that combines experimental characterization of binding requirements with computational genomic analysis. Results Our method identified potential target genes of the transcription factor Ndt80, a key transcriptional regulator involved in yeast sporulation, using the combined information of binding affinity, positional distribution, and conservation of the binding sites across multiple species. We have also developed a mathematical approach to compute the false positive rate and the total number of targets in the genome based on the multiple selection criteria. Conclusion We have shown that combining biochemical characterization and computational genomic analysis leads to accurate identification of the genome-wide targets of a transcription factor. The method can be extended to other transcription factors and can complement other genomic approaches to transcriptional regulation.

  7. Precision Modeling Of Targets Using The VALUE Computer Program

    Science.gov (United States)

    Hoffman, George A.; Patton, Ronald; Akerman, Alexander

    1989-08-01

    The 1976-vintage LASERX computer code has been augmented to produce realistic electro-optical images of targets. Capabilities lacking in LASERX but recently incorporated into its VALUE successor include: shadows cast onto the ground; shadows cast onto parts of the target; see-through transparencies (e.g., canopies); apparent images due both to atmospheric scattering and turbulence; and surfaces characterized by multiple bi-directional reflectance functions. VALUE not only provides realistic target modeling through its precise and comprehensive representation of all target attributes, but is also very user friendly. Specifically, setup of runs is accomplished by screen-prompting menus in a sequence of queries that is logical to the user. VALUE also incorporates the Optical Encounter (OPEC) software developed by Tricor Systems, Inc., Elgin, IL.

  8. First passage times for multiple particles with reversible target-binding kinetics

    Science.gov (United States)

    Grebenkov, Denis S.

    2017-10-01

    We investigate the first passage problem for multiple particles that diffuse towards a target, partially adsorb there, and then desorb after a finite exponentially distributed residence time. We search for the first time when m particles undergoing such reversible target-binding kinetics are found simultaneously on the target that may trigger an irreversible chemical reaction or a biophysical event. Even if the particles are independent, the finite residence time on the target yields an intricate temporal coupling between particles. We compute analytically the mean first passage time (MFPT) for two independent particles by mapping the original problem to higher-dimensional surface-mediated diffusion and solving the coupled partial differential equations. The respective effects of the adsorption and desorption rates on the MFPT are revealed and discussed.

  9. Controlling data transfers from an origin compute node to a target compute node

    Science.gov (United States)

    Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Ratterman, Joseph D [Rochester, MN; Smith, Brian E [Rochester, MN

    2011-06-21

    Methods, apparatus, and products are disclosed for controlling data transfers from an origin compute node to a target compute node that include: receiving, by an application messaging module on the target compute node, an indication of a data transfer from an origin compute node to the target compute node; and administering, by the application messaging module on the target compute node, the data transfer using one or more messaging primitives of a system messaging module in dependence upon the indication.

  10. Secure Multiparty Quantum Computation for Summation and Multiplication.

    Science.gov (United States)

    Shi, Run-hua; Mu, Yi; Zhong, Hong; Cui, Jie; Zhang, Shun

    2016-01-21

    As a fundamental primitive, Secure Multiparty Summation and Multiplication can be used to build complex secure protocols for other multiparty computations, especially numerical computations. However, there is still a lack of systematic and efficient quantum methods to compute Secure Multiparty Summation and Multiplication. In this paper, we present a novel and efficient quantum approach to securely compute the summation and multiplication of multiparty private inputs, respectively. Compared to classical solutions, our proposed approach can ensure unconditional security and perfect privacy protection based on the physical principles of quantum mechanics.

  11. NEWTONIAN IMPERIALIST COMPETITIVE APPROACH TO OPTIMIZING OBSERVATION OF MULTIPLE TARGET POINTS IN MULTISENSOR SURVEILLANCE SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. Afghan-Toloee

    2013-09-01

    Full Text Available The problem of specifying the minimum number of sensors to deploy in a certain area to face multiple targets has been studied extensively in the literature. In this paper, we address the multi-sensor deployment problem (MDP). The multi-sensor placement problem can be formulated as minimizing the cost required to cover the multiple target points in the area. We propose a more feasible method for the multi-sensor placement problem. Our method provides the high coverage of grid-based placements while minimizing the cost, as found in perimeter placement techniques. The NICA algorithm, an improved ICA (Imperialist Competitive Algorithm), is used to decrease the time needed to find an adequate solution compared to other meta-heuristic schemes such as GA, PSO and ICA. A three-dimensional area is used to specify the multiple target and placement points, allowing x, y, and z computations in the observation algorithm. A model structure for the multi-sensor placement problem is proposed: the problem is constructed as an optimization problem with the objective of minimizing the cost while covering all multiple target points subject to a given probability of observation tolerance.

  12. Memory for found targets interferes with subsequent performance in multiple-target visual search.

    Science.gov (United States)

    Cain, Matthew S; Mitroff, Stephen R

    2013-10-01

    Multiple-target visual searches--when more than 1 target can appear in a given search display--are commonplace in radiology, airport security screening, and the military. Whereas 1 target is often found accurately, additional targets are more likely to be missed in multiple-target searches. To better understand this decrement in 2nd-target detection, here we examined 2 potential forms of interference that can arise from finding a 1st target: interference from the perceptual salience of the 1st target (a now highly relevant distractor in a known location) and interference from a newly created memory representation for the 1st target. Here, we found that removing found targets from the display or making them salient and easily segregated color singletons improved subsequent search accuracy. However, replacing found targets with random distractor items did not improve subsequent search accuracy. Removing and highlighting found targets likely reduced both a target's visual salience and its memory load, whereas replacing a target removed its visual salience but not its representation in memory. Collectively, the current experiments suggest that the working memory load of a found target has a larger effect on subsequent search accuracy than does its perceptual salience. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  13. A global calibration method for multiple vision sensors based on multiple targets

    International Nuclear Information System (INIS)

    Liu, Zhen; Zhang, Guangjun; Wei, Zhenzhong; Sun, Junhua

    2011-01-01

    The global calibration of multiple vision sensors (MVS) has been widely studied in the last two decades. In this paper, we present a global calibration method for MVS with non-overlapping fields of view (FOVs) using multiple targets (MT). The MT is constructed by fixing several targets, called sub-targets, together. The mutual coordinate transformations between sub-targets need not be known. The main procedures of the proposed method are as follows: one vision sensor is selected from the MVS to establish the global coordinate frame (GCF). The MT is placed in front of the vision sensors several (at least four) times. Using the constraint that the relative positions of all sub-targets are invariant, the transformation matrix from the coordinate frame of each vision sensor to the GCF can be solved. Both synthetic and real experiments are carried out and good results are obtained. The proposed method has been applied to several real measurement systems and shown to be both flexible and accurate. It can serve as an attractive alternative to existing global calibration methods.
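
    A basic building block for expressing points reconstructed in one sensor's coordinate frame in the global coordinate frame is the estimation of a rigid transformation between corresponding 3-D point sets. The Python sketch below shows the standard Kabsch/SVD solution under assumed point correspondences; it illustrates the kind of transformation being solved for, not the paper's full invariance-constraint formulation for non-overlapping FOVs.

    import numpy as np

    def rigid_transform(P, Q):
        """Estimate rotation R and translation t such that Q ~= R @ P + t.

        P, Q : 3 x N arrays of corresponding 3-D points, e.g. sub-target points
        expressed in a sensor frame and in the global coordinate frame.
        Kabsch algorithm: subtract centroids, then solve the rotation by SVD.
        """
        p_mean = P.mean(axis=1, keepdims=True)
        q_mean = Q.mean(axis=1, keepdims=True)
        H = (P - p_mean) @ (Q - q_mean).T
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = q_mean - R @ p_mean
        return R, t

    # sanity check with a known rigid transformation
    rng = np.random.default_rng(0)
    P = rng.random((3, 6))
    R_true, _ = np.linalg.qr(rng.random((3, 3)))
    if np.linalg.det(R_true) < 0:
        R_true[:, 0] *= -1                               # make it a proper rotation
    t_true = np.array([[0.3], [-0.2], [1.0]])
    Q = R_true @ P + t_true
    R_est, t_est = rigid_transform(P, Q)
    assert np.allclose(R_est @ P + t_est, Q)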

  14. Assisting People with Multiple Disabilities by Improving Their Computer Pointing Efficiency with an Automatic Target Acquisition Program

    Science.gov (United States)

    Shih, Ching-Hsiang; Shih, Ching-Tien; Peng, Chin-Ling

    2011-01-01

    This study evaluated whether two people with multiple disabilities would be able to improve their pointing performance through an Automatic Target Acquisition Program (ATAP) and a newly developed mouse driver (i.e., a new mouse driver replaces the standard mouse driver and is able to monitor mouse movement and intercept click actions). Initially, both…

  15. Adaptive Waveform Design for Cognitive Radar in Multiple Targets Situations

    Directory of Open Access Journals (Sweden)

    Xiaowen Zhang

    2018-02-01

    Full Text Available In this paper, the problem of cognitive radar (CR) waveform optimization design for target detection and estimation in multiple extended target situations is investigated. This problem is analyzed in signal-dependent interference, as well as additive channel noise, for extended targets with unknown target impulse response (TIR). To address this problem, an improved algorithm is employed for target detection by maximizing the detection probability of the received echo on the premise of ensuring TIR estimation precision. In this algorithm, an additional weight vector is introduced to achieve a trade-off among different targets. Both the estimate of the TIR and the transmit waveform can be updated at each step based on the previous step. Under the same constraints on waveform energy and bandwidth, the information-theoretic approach is also considered. In addition, the relationship between the waveforms designed based on the two criteria is discussed. Unlike most existing works that only consider a single target with temporally correlated characteristics, waveform design for multiple extended targets is considered in this method. Simulation results demonstrate that, compared with a linear frequency modulated (LFM) signal, waveforms designed based on the maximum detection probability and maximum mutual information (MI) criteria can make radar echoes contain more multiple-target information and improve radar performance as a result.

  16. Parent-administered computer-assisted tutoring targeting letter-sound knowledge: Evaluation via multiple-baseline across three preschool students.

    Science.gov (United States)

    DuBois, Matthew R; Volpe, Robert J; Burns, Matthew K; Hoffman, Jessica A

    2016-12-01

    Knowledge of letter sounds has been identified as a primary objective of preschool instruction and intervention. Despite this designation, large disparities exist in the number of letter sounds children know at school entry. Enhancing caregivers' ability to teach their preschool-aged children letter sounds may represent an effective practice for reducing this variability and ensuring that more children are prepared to experience early school success. This study used a non-concurrent multiple-baseline-across-participants design to evaluate the effectiveness of caregivers (N=3) delivering a computer-assisted tutoring program (Tutoring Buddy) targeting letter sound knowledge to their preschool-aged children. Visual analyses and effect size estimates derived from Percentage of All Non-Overlapping Data (PAND) statistics indicated consistent results for letter sound acquisition, as 6 weeks of intervention yielded large effects for letter sound knowledge (LSK) across all three children. Large effect sizes were also found for letter sound fluency (LSF) and nonsense word fluency (NWF) for two children. All three caregivers rated the intervention as highly usable and were able to administer it with high levels of fidelity. Taken together, the results of the present study found Tutoring Buddy to be an effective, simple, and usable way for caregivers to support their children's literacy development. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  17. Dynamic studies of multiple configurations of CERN's Antiproton Decelerator Target core under proton beam impact

    CERN Document Server

    AUTHOR|(CDS)2248381

    Antiprotons, like many other exotic particles, are produced by impacting high energy proton beams onto fixed targets. At the European Organization for Nuclear Research (CERN), this is done in the Antiproton Decelerator (AD) Facility. The engineering challenges related to the design of an optimal configuration of the AD-Target system derive from the extremely high energy depositions reached in the very thin target core as a consequence of each proton beam impact. A new target design is foreseen for operation after 2021, triggering multiple R&D activities since 2013 for this purpose. The goal of the present Master Thesis is to complement these activities with analytical and numerical calculations, delving into the phenomena associated with the dynamic response of the target core. In this context, two main studies have been carried out. First, the experimental data observed in targets subjected to low intensity proton pulses was cross-checked with analytical and computational methods for modal analysis, applie...

  18. Through-Wall Multiple Targets Vital Signs Tracking Based on VMD Algorithm

    Directory of Open Access Journals (Sweden)

    Jiaming Yan

    2016-08-01

    Full Text Available Targets located at the same distance are easily neglected in most through-wall multiple target detection applications which use a single-input single-output (SISO) ultra-wideband (UWB) radar system. In this paper, a novel multiple-target vital signs tracking algorithm for through-wall detection using SISO UWB radar has been proposed. Taking advantage of the high-resolution decomposition of the Variational Mode Decomposition (VMD) based algorithm, the respiration signals of different targets can be decomposed into different sub-signals, and then we can track the time-varying respiration signals accurately when human targets are located at the same distance. Intensive evaluation has been conducted to show the effectiveness of our scheme with a 0.15 m thick concrete brick wall. Constant, piecewise-constant and time-varying vital signs could be separated and tracked successfully with the proposed VMD based algorithm for two targets, and even up to three targets. For multiple-target vital signs tracking issues like urban search and rescue missions, our algorithm has superior capability in most detection applications.

  19. Multiple-Targeted Graphene-based Nanocarrier for Intracellular Imaging of mRNAs

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Ying; Li, Zhaohui; Liu, Misha; Hu, Dehong; Lin, Yuehe; Li, Jinghong

    2017-08-29

    Simultaneous detection and imaging of multiple intracellular messenger RNAs (mRNAs) hold great significance for early cancer diagnostics and preventive medicine development. Herein, we propose a multiple-targeted graphene oxide (GO) nanocarrier that can simultaneously detect and image different types of mRNAs in living cells. First of all, in vitro detection of multiple targets has been realized successfully based on the multiple-targeted GO nanocarrier, with a linear relationship ranging from 3 nM to 200 nM, as well as sensitive detection limits of 1.84 nM for manganese superoxide dismutase (Mn-SOD) mRNA and 2.45 nM for β-actin mRNA. Additionally, this nanosensing platform, composed of fluorescently labeled single-strand DNA probes and the GO nanocarrier, can identify Mn-SOD mRNA and endogenous mRNA of β-actin in living cancer cells, showing rapid response, high specificity, nuclease stability, and good biocompatibility during cell imaging. Thirdly, changes in the expression levels of mRNA in living cells before and after drug treatment can be monitored successfully. By using multiple ssDNA as probes and the GO nanocarrier as the cellular delivery cargo, the proposed simultaneous multiple-targeted sensing platform has great potential as a powerful tool for studying intracellular trafficking processes, from basic research to clinical diagnosis.

  20. Targeted therapy of multiple myeloma.

    Science.gov (United States)

    Dolloff, Nathan G; Talamo, Giampaolo

    2013-01-01

    Multiple myeloma (MM) is a plasma cell malignancy and the second most common hematologic cancer. MM is characterized by the accumulation of malignant plasma cells within the bone marrow, and presents clinically with a broad range of symptoms, including hypercalcemia, renal insufficiency, anemia, and lytic bone lesions. MM is a heterogeneous disease associated with genomic instability, where patients may express multiple genetic abnormalities that affect several oncogenic pathways. Commonly detected genetic aberrations are translocations involving immunoglobulin heavy chain (IgH) switch regions (chromosome 14q32) and oncogenes such as c-maf [t(14:16)], cyclin D1 [t(11:14)], and FGFR3/MMSET [t(4:14)]. Advances in the basic understanding of MM and the development of novel agents, such as the immunomodulatory drugs (IMiDs) thalidomide and lenalidomide and the proteasome inhibitor bortezomib, have increased therapeutic response rates and prolonged patient survival. Despite these advances MM remains incurable in the majority of patients, and it is therefore critical to identify additional therapeutic strategies and targets for its treatment. In this chapter, we review the underlying genetic components of MM and discuss the results of recent clinical trials that demonstrate the effectiveness of targeted agents in the management of MM. In addition, we discuss experimental therapies that are currently in clinical development along with their molecular rationale in the treatment of MM.

  1. MAGERI: Computational pipeline for molecular-barcoded targeted resequencing.

    Directory of Open Access Journals (Sweden)

    Mikhail Shugay

    2017-05-01

    Full Text Available Unique molecular identifiers (UMIs show outstanding performance in targeted high-throughput resequencing, being the most promising approach for the accurate identification of rare variants in complex DNA samples. This approach has application in multiple areas, including cancer diagnostics, thus demanding dedicated software and algorithms. Here we introduce MAGERI, a computational pipeline that efficiently handles all caveats of UMI-based analysis to obtain high-fidelity mutation profiles and call ultra-rare variants. Using an extensive set of benchmark datasets including gold-standard biological samples with known variant frequencies, cell-free DNA from tumor patient blood samples and publicly available UMI-encoded datasets we demonstrate that our method is both robust and efficient in calling rare variants. The versatility of our software is supported by accurate results obtained for both tumor DNA and viral RNA samples in datasets prepared using three different UMI-based protocols.

  2. Targeting accuracy of single-isocenter intensity-modulated radiosurgery for multiple lesions

    Energy Technology Data Exchange (ETDEWEB)

    Calvo-Ortega, J.F., E-mail: jfcdrr@yahoo.es; Pozo, M.; Moragues, S.; Casals, J.

    2017-07-01

    To investigate the targeting accuracy of intensity-modulated SRS (IMRS) plans designed to simultaneously treat multiple brain metastases with a single isocenter. A home-made acrylic phantom able to support a film (EBT3) in its coronal plane was used. The phantom was CT scanned and three coplanar small targets (a central and two peripheral) were outlined in the Eclipse system. Peripheral targets were 6 cm apart from the central one. A reference IMRS plan was designed to simultaneously treat the three targets, but only a single isocenter located at the center of the central target was used. After positioning the phantom on the linac using the room lasers, a CBCT scan was acquired and the reference plan was mapped onto it, by placing the planned isocenter at the intersection of the landmarks used in the film showing the linac isocenter. The mapped plan was then recalculated and delivered. The film dose distribution was derived using a cloud computing application (www.radiochromic.com) that uses a triple-channel dosimetry algorithm. Comparisons of dose distributions using the gamma index (5%/1 mm) were performed over a 5 × 5 cm² region centered over each target. 2D shifts required to get the best gamma passing rates on the peripheral target regions were compared with the reported ones for the central target. The experiment was repeated ten times in different sessions. Average 2D shifts required to achieve optimal gamma passing rates (99%, 97%, 99%) were 0.7 mm (SD: 0.3 mm), 0.8 mm (SD: 0.4 mm) and 0.8 mm (SD: 0.3 mm) for the central and the two peripheral targets, respectively. No statistical differences (p > 0.05) were found for targeting accuracy between the central and the two peripheral targets. The study revealed a targeting accuracy within 1 mm for off-isocenter targets within 6 cm of the linac isocenter when a single-isocenter IMRS plan is designed.
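
    For readers unfamiliar with the gamma index, the Python sketch below computes a simplified global 2-D gamma passing rate on a regular grid with an assumed pixel spacing; the film analysis used in the study (triple-channel dosimetry via www.radiochromic.com) is considerably more involved.

    import numpy as np

    def gamma_passing_rate(dose_eval, dose_ref, spacing_mm, dose_crit=0.05, dist_crit_mm=1.0):
        """Simplified global 2-D gamma analysis (here 5%/1 mm) on a regular grid.

        For each reference pixel, evaluated pixels within the search window are
        scanned and the minimum combined dose/distance deviation is kept; the
        pixel passes if that minimum gamma value is <= 1.
        """
        ny, nx = dose_ref.shape
        norm = dose_crit * dose_ref.max()                  # global dose criterion
        search = int(np.ceil(dist_crit_mm / spacing_mm)) + 1
        passed = 0
        for iy in range(ny):
            for ix in range(nx):
                best = np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        jy, jx = iy + dy, ix + dx
                        if 0 <= jy < ny and 0 <= jx < nx:
                            dist2 = ((dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2) / dist_crit_mm ** 2
                            dose2 = ((dose_eval[jy, jx] - dose_ref[iy, ix]) / norm) ** 2
                            best = min(best, dist2 + dose2)
                passed += best <= 1.0                      # gamma^2 <= 1  <=>  gamma <= 1
        return 100.0 * passed / (ny * nx)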

  3. Multiple Embedded Processors for Fault-Tolerant Computing

    Science.gov (United States)

    Bolotin, Gary; Watson, Robert; Katanyoutanant, Sunant; Burke, Gary; Wang, Mandy

    2005-01-01

    A fault-tolerant computer architecture has been conceived in an effort to reduce vulnerability to single-event upsets (spurious bit flips caused by impingement of energetic ionizing particles or photons). As in some prior fault-tolerant architectures, the redundancy needed for fault tolerance is obtained by use of multiple processors in one computer. Unlike prior architectures, the multiple processors are embedded in a single field-programmable gate array (FPGA). What makes this new approach practical is the recent commercial availability of FPGAs that are capable of having multiple embedded processors. A working prototype (see figure) consists of two embedded IBM PowerPC 405 processor cores and a comparator built on a Xilinx Virtex-II Pro FPGA. This relatively simple instantiation of the architecture implements an error-detection scheme. A planned future version, incorporating four processors and two comparators, would correct some errors in addition to detecting them.

  4. Possibilities of computer tomography in multiple sclerosis

    International Nuclear Information System (INIS)

    Vymazal, J.; Bauer, J.

    1983-01-01

    Computer tomography was performed in 41 patients with multiple sclerosis, the average age of the patients being 40.8 years. Native examinations were made of 17 patients, examinations with contrast medium of 19, and both methods were used in the examination of 5 patients. In 26 patients, i.e. in almost two-thirds, cerebral atrophy was found, in 11 of a severe type. In 9 patients atrophy affected only the hemispheres, in 16 also the stem and cerebellum. The stem and cerebellum only were affected in 1 patient. Hypodense foci were found in 21 patients, i.e. more than half of those examined. In 9 there were multiple foci. In most of the 19 examined patients the hypodense changes were in the hemispheres and only in 2 in the cerebellum and brain stem. No hyperdense changes were detected. The value and possibilities of examination by computer tomography in multiple sclerosis are discussed. (author)

  5. Effects of multiple scattering and target structure on photon emission

    International Nuclear Information System (INIS)

    Blankenbecler, R.

    1996-05-01

    The Landau-Pomeranchuk-Migdal effect is the suppression of Bethe-Heitler radiation caused by multiple scattering in the target medium. The quantum treatment given by S.D. Drell and the author for homogeneous targets of finite thickness will be reviewed. It will then be extended to structured targets. In brief, it is shown that radiators composed of separated plates or of a medium with a spatially varying radiation length can exhibit unexpected structure, even coherence maxima and minima, in their photon spectra. Finally, a functional integral method for performing the averaging implicit in multiple scattering will be briefly discussed and the leading corrections to previous results evaluated

  6. Multiple-targeted graphene-based nanocarrier for intracellular imaging of mRNAs

    International Nuclear Information System (INIS)

    Wang, Ying; Li, Zhaohui; Liu, Misha; Xu, Jinjin; Hu, Dehong; Lin, Yuehe; Li, Jinghong

    2017-01-01

    Simultaneous detection and imaging of multiple intracellular messenger RNAs (mRNAs) hold great significance for early cancer diagnostics and preventive medicine development. Herein, we propose a multiple-targeted graphene oxide (GO) nanocarrier that can simultaneously detect and image different types of mRNAs in living cells. First of all, in vitro detection of multiple targets has been realized successfully based on the multiple-targeted GO nanocarrier, with a linear relationship ranging from 3 nM to 200 nM, as well as sensitive detection limits of 1.84 nM for manganese superoxide dismutase (Mn-SOD) mRNA and 2.45 nM for β-actin mRNA. Additionally, this nanosensing platform, composed of fluorescently labelled single-strand DNA probes and the GO nanocarrier, can identify Mn-SOD mRNA and endogenous mRNA of β-actin in living cancer cells, showing rapid response, high specificity, nuclease stability, and good biocompatibility during cell imaging. Thirdly, changes in the expression levels of mRNA in living cells before and after drug treatment can be monitored successfully. By using multiple ssDNA as probes and the GO nanocarrier as the cellular delivery cargo, the proposed simultaneous multiple-targeted sensing platform has great potential as a powerful tool for studying intracellular trafficking processes, from basic research to clinical diagnosis. - Graphical abstract: Schematic illustration of simultaneous monitoring of multiple mRNAs inside a single living breast cancer cell based on the GO nanocarrier. In particular, the fluorescent signals can be monitored when the Mn-SOD probe (red) and the β-actin probe (green) hybridize with their mRNA targets inside the living cells. A random probe (orange) was used as the control probe for the sensing strategy. - Highlights: • A multiple-targeted GO nanocarrier was used for mRNA imaging, and expression changes after drug treatment can be monitored successfully. • Sensitive detection limit of 1.84 nM for manganese superoxide dismutase (Mn-SOD) m

  7. Computational Modeling of Ablation on an Irradiated Target

    Science.gov (United States)

    Mehmedagic, Igbal; Thangam, Siva

    2017-11-01

    Computational modeling of pulsed nanosecond laser interaction with an irradiated metallic target is presented. The model formulation involves ablation of the metallic target irradiated by a pulsed high intensity laser at normal atmospheric conditions. Computational findings based on effective representation and prediction of the heat transfer, melting and vaporization of the target material, as well as plume formation and expansion, are presented along with their relevance for the development of protective shields. In this context, the available results for a representative irradiation from a 1064 nm laser pulse are used to analyze various ablation mechanisms, variable thermo-physical and optical properties, plume expansion and surface geometry. Funded in part by U. S. Army ARDEC, Picatinny Arsenal, NJ.

  8. Cooperative target convergence using multiple agents

    International Nuclear Information System (INIS)

    Kwok, K.S.; Driessen, B.J.

    1997-01-01

    This work considers the problem of causing multiple (100's) autonomous mobile robots to converge to a target and provides a follow-the-leader approach to the problem. Each robot has only a limited-range sensor for sensing the target and also a larger, but still limited-range, robot-to-robot communication capability. Because of the small amount of information available to the robots, a practical approach to improve convergence to the target is to have a robot follow the robot with the best quality of information. Specifically, each robot emits a signal that informs in-range robots what its status is. A robot has a status value of 0 if it is itself in range of the target. A robot has a status of 1 if it is not in range of the target but is in communication range of a robot that is in range of the target. A robot has a status of 2 if it is not in range of the target but is within range of another robot that has status 1, and so on. Of all the mobile robots that any given robot is in range of, it follows the one with the best status. The emergent behavior is the ant-like trails of robots following each other toward the target. If the robot is not in range of another robot that is either in range of the target or following another robot, the robot will assign -1 to its quality-of-information, and will execute an exhaustive search. The exhaustive search will continue until it encounters either the target or another robot with a nonnegative quality-of-information. The quality-of-information approach was extended to the case where each robot only has two-bit signals informing it of distance to in-range robots.

  9. Cooperative target convergence using multiple agents

    Energy Technology Data Exchange (ETDEWEB)

    Kwok, K.S.; Driessen, B.J.

    1997-10-01

    This work considers the problem of causing multiple (100's) autonomous mobile robots to converge to a target and provides a follow-the-leader approach to the problem. Each robot has only a limited-range sensor for sensing the target and also a larger, but still limited-range, robot-to-robot communication capability. Because of the small amount of information available to the robots, a practical approach to improve convergence to the target is to have a robot follow the robot with the best quality of information. Specifically, each robot emits a signal that informs in-range robots what its status is. A robot has a status value of 0 if it is itself in range of the target. A robot has a status of 1 if it is not in range of the target but is in communication range of a robot that is in range of the target. A robot has a status of 2 if it is not in range of the target but is within range of another robot that has status 1, and so on. Of all the mobile robots that any given robot is in range of, it follows the one with the best status. The emergent behavior is the ant-like trails of robots following each other toward the target. If the robot is not in range of another robot that is either in range of the target or following another robot, the robot will assign -1 to its quality-of-information, and will execute an exhaustive search. The exhaustive search will continue until it encounters either the target or another robot with a nonnegative quality-of-information. The quality-of-information approach was extended to the case where each robot only has two-bit signals informing it of distance to in-range robots.
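
    The status-propagation and follow-the-leader rule described in the two records above can be summarized in a short Python sketch; positions, sensing range, and communication range are hypothetical parameters, and a large sentinel value stands in for the "-1 / no information" state so that the minimum operation works uniformly.

    import numpy as np

    def update_statuses(positions, target, sense_range, comm_range):
        """One synchronous round of the quality-of-information update.

        positions : (n, 2) robot coordinates; target : (2,) target coordinates.
        Status 0 = target in sensor range; status k = within communication range
        of a robot with status k-1.
        """
        n = len(positions)
        no_info = n + 1                              # sentinel for "no information"
        status = np.full(n, no_info)
        status[np.linalg.norm(positions - target, axis=1) <= sense_range] = 0
        for _ in range(n):                           # propagate through the communication graph
            for i in range(n):
                for j in range(n):
                    in_range = np.linalg.norm(positions[i] - positions[j]) <= comm_range
                    if i != j and in_range:
                        status[i] = min(status[i], status[j] + 1)
        return status

    def choose_leader(i, positions, status, comm_range):
        """Robot i follows the in-range neighbor with the best (lowest) status."""
        best, leader = status[i], None
        for j in range(len(positions)):
            in_range = np.linalg.norm(positions[i] - positions[j]) <= comm_range
            if j != i and in_range and status[j] < best:
                best, leader = status[j], j
        return leader                                # None: no better neighbor, do exhaustive search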

  10. DOA Estimation of Low Altitude Target Based on Adaptive Step Glowworm Swarm Optimization-multiple Signal Classification Algorithm

    Directory of Open Access Journals (Sweden)

    Zhou Hao

    2015-06-01

    Full Text Available The traditional MUltiple SIgnal Classification (MUSIC) algorithm requires significant computational effort and cannot be employed for the Direction Of Arrival (DOA) estimation of targets in a low-altitude multipath environment. As such, a novel MUSIC approach is proposed on the basis of the algorithm of Adaptive Step Glowworm Swarm Optimization (ASGSO). The virtual spatial smoothing of the matrix formed by each snapshot is used to realize the decorrelation of the multipath signal and the establishment of a full-order correlation matrix. ASGSO optimizes the function and estimates the elevation of the target. The simulation results suggest that the proposed method can overcome the low-altitude multipath effect and estimate the DOA of the target readily and precisely without radar effective aperture loss.
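
    For reference, a minimal classical MUSIC pseudospectrum for a uniform linear array is sketched below in Python (standard textbook form, not the ASGSO-accelerated variant or the virtual spatial smoothing proposed in the paper); the array geometry and snapshot matrix are assumptions.

    import numpy as np

    def music_spectrum(X, n_sources, d_over_lambda=0.5):
        """Classical MUSIC pseudospectrum for a uniform linear array.

        X : (n_antennas, n_snapshots) complex snapshot matrix.
        The sample covariance is eigendecomposed, the noise subspace is taken
        from the smallest eigenvalues, and steering vectors are scanned over
        candidate angles.
        """
        n_ant = X.shape[0]
        R = X @ X.conj().T / X.shape[1]                  # sample covariance matrix
        eigvals, eigvecs = np.linalg.eigh(R)             # eigenvalues in ascending order
        En = eigvecs[:, : n_ant - n_sources]             # noise subspace
        k = np.arange(n_ant)
        angles_deg = np.arange(-90.0, 90.0, 0.5)
        spectrum = []
        for theta in np.deg2rad(angles_deg):
            a = np.exp(2j * np.pi * d_over_lambda * k * np.sin(theta))   # steering vector
            spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
        return angles_deg, np.array(spectrum)            # peaks indicate estimated DOAs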

  11. A compound chimeric antigen receptor strategy for targeting multiple myeloma.

    Science.gov (United States)

    Chen, K H; Wada, M; Pinz, K G; Liu, H; Shuai, X; Chen, X; Yan, L E; Petrov, J C; Salman, H; Senzel, L; Leung, E L H; Jiang, X; Ma, Y

    2018-02-01

    Current clinical outcomes using chimeric-antigen receptors (CARs) against multiple myeloma show promise in the eradication of bulk disease. However, relapse is commonly observed after treatment with these anti-BCMA (CD269) CARs due to the reemergence of either antigen-positive or -negative cells. Hence, the development of improvements in CAR design to target antigen loss and increase effector cell persistency represents a critical need. Here, we report on the anti-tumor activity of a CAR T-cell possessing two complete and independent CAR receptors against the multiple myeloma antigens BCMA and CS1. We determined that the resulting compound CAR (cCAR) T-cell possesses consistent, potent and directed cytotoxicity against each target antigen population. Using multiple mouse models of myeloma and mixed cell populations, we are further able to show superior in vivo survival by directed cytotoxicity against multiple populations compared to a single-expressing CAR T-cell. These findings indicate that compound targeting of BCMA and CS1 on myeloma cells can potentially be an effective strategy for augmenting the response against myeloma bulk disease and for initiation of broader coverage CAR therapy.

  12. Multiple-User, Multitasking, Virtual-Memory Computer System

    Science.gov (United States)

    Generazio, Edward R.; Roth, Don J.; Stang, David B.

    1993-01-01

    Computer system designed and programmed to serve multiple users in research laboratory. Provides for computer control and monitoring of laboratory instruments, acquisition and analysis of data from those instruments, and interaction with users via remote terminals. System provides fast access to shared central processing units and associated large (from megabytes to gigabytes) memories. Underlying concept of system also applicable to monitoring and control of industrial processes.

  13. Design and Delivery of Multiple Server-Side Computer Languages Course

    Science.gov (United States)

    Wang, Shouhong; Wang, Hai

    2011-01-01

    Given the emergence of service-oriented architecture, IS students need to be knowledgeable of multiple server-side computer programming languages to be able to meet the needs of the job market. This paper outlines the pedagogy of an innovative course of multiple server-side computer languages for the undergraduate IS majors. The paper discusses…

  14. SWIMS: a small-angle multiple scattering computer code

    International Nuclear Information System (INIS)

    Sayer, R.O.

    1976-07-01

    SWIMS (Sigmund and WInterbon Multiple Scattering) is a computer code for calculation of the angular dispersion of ion beams that undergo small-angle, incoherent multiple scattering by gaseous or solid media. The code uses the tabulated angular distributions of Sigmund and Winterbon for a Thomas-Fermi screened Coulomb potential. The fraction of the incident beam scattered into a cone defined by the polar angle α is computed as a function of α for reduced thicknesses over the range 0.01 ≤ τ ≤ 10.0. 1 figure, 2 tables
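
    As a generic illustration of the quantity computed by SWIMS, the Python sketch below integrates a tabulated angular distribution to obtain the fraction of the scattered beam inside a cone of half-angle α; the tabulated values are a hypothetical stand-in for the Sigmund-Winterbon distributions, not the code's actual tables.

    import numpy as np

    def _trapezoid(y, x):
        """Simple trapezoidal integration (kept local for portability)."""
        return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

    def cone_fraction(alpha_grid, f_alpha, alpha_max):
        """Fraction of the scattered beam inside a cone of half-angle alpha_max.

        alpha_grid : tabulated polar angles in radians, ascending.
        f_alpha    : tabulated angular distribution per unit solid angle
                     (hypothetical stand-in for the Sigmund-Winterbon tables).
        """
        weight = f_alpha * 2.0 * np.pi * np.sin(alpha_grid)     # solid-angle weighting
        total = _trapezoid(weight, alpha_grid)
        mask = alpha_grid <= alpha_max
        return _trapezoid(weight[mask], alpha_grid[mask]) / total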

  15. Immunogenic Targets for Specific Immunotherapy in Multiple Myeloma

    Directory of Open Access Journals (Sweden)

    Lu Zhang

    2012-01-01

    Full Text Available Multiple myeloma remains an incurable disease although the prognosis has been improved by novel therapeutics and agents recently. Relapse occurs in the majority of patients and becomes fatal finally. Immunotherapy might be a powerful intervention to maintain a long-lasting control of minimal residual disease or to even eradicate disseminated tumor cells. Several tumor-associated antigens have been identified in patients with multiple myeloma. These antigens are expressed in a tumor-specific or tumor-restricted pattern, are able to elicit immune response, and thus could serve as targets for immunotherapy. This review discusses immunogenic antigens with therapeutic potential for multiple myeloma.

  16. Computed Tomography diagnosis of skeletal involvement in multiple myeloma

    International Nuclear Information System (INIS)

    Scutellari, Pier Nuccio; Galeotti, Roberto; Leprotti, Stefano; Piva, Nadia; Spanedda, Romedio

    1997-01-01

    The authors assess the role of Computed Tomography in the diagnosis and management of multiple myeloma (MM) and investigate whether Computed Tomography findings can influence the clinical approach, prognosis and treatment. 273 multiple myeloma patients were submitted to Computed Tomography from June 1994 to December 1996. The patients were 143 men and 130 women (mean age: 65 years): 143 were stage I, 38 stage II and 92 stage III according to Durie and Salmon's clinical classification. All patients were submitted to blood tests, spinal radiography and Computed Tomography, the latter with serial 5-mm scans on several vertebral bodies. Computed Tomography depicted vertebral arch and process involvement in 3 cases with the vertebral pedicle sign. Moreover, Computed Tomography proved superior to radiography in showing the spread of myelomatous masses into the soft tissues in a case with a solitary permeative lesion in the left pubic bone, which facilitated subsequent biopsy. As for extraosseous localizations, Computed Tomography demonstrated thoracic soft tissue (1 woman) and pelvic (1 man) involvement by myelomatous masses penetrating into surrounding tissues. In our series, only one case of osteosclerotic bone myeloma was observed, in the pelvis, associated with lytic abnormalities. Computed Tomography findings do not seem to improve the clinical approach and therapeutic management of the disease. Nevertheless, the authors recommend Computed Tomography for some myelomatous conditions, namely: a) in patients with focal bone pain but normal skeletal radiographs; b) in patients with M protein, bone marrow plasmocytosis and back pain, but with an inconclusive multiple myeloma diagnosis; c) to assess bone spread in regions which are anatomically complex or difficult to study with radiography and to depict soft tissue involvement; d) for bone biopsy

  17. Rational polypharmacology: systematically identifying and engaging multiple drug targets to promote axon growth

    Science.gov (United States)

    Al-Ali, Hassan; Lee, Do-Hun; Danzi, Matt C.; Nassif, Houssam; Gautam, Prson; Wennerberg, Krister; Zuercher, Bill; Drewry, David H.; Lee, Jae K.; Lemmon, Vance P.; Bixby, John L.

    2016-01-01

    Mammalian Central Nervous System (CNS) neurons regrow their axons poorly following injury, resulting in irreversible functional losses. Identifying therapeutics that encourage CNS axon repair has been difficult, in part because multiple etiologies underlie this regenerative failure. This suggests a particular need for drugs that engage multiple molecular targets. Although multi-target drugs are generally more effective than highly selective alternatives, we lack systematic methods for discovering such drugs. Target-based screening is an efficient technique for identifying potent modulators of individual targets. In contrast, phenotypic screening can identify drugs with multiple targets; however, these targets remain unknown. To address this gap, we combined the two drug discovery approaches using machine learning and information theory. We screened compounds in a phenotypic assay with primary CNS neurons and also in a panel of kinase enzyme assays. We used learning algorithms to relate the compounds’ kinase inhibition profiles to their influence on neurite outgrowth. This allowed us to identify kinases that may serve as targets for promoting neurite outgrowth, as well as others whose targeting should be avoided. We found that compounds that inhibit multiple targets (polypharmacology) promote robust neurite outgrowth in vitro. One compound with exemplary polypharmacology was found to promote axon growth in a rodent spinal cord injury model. A more general applicability of our approach is suggested by its ability to deconvolve known targets for a breast cancer cell line, as well as targets recently shown to mediate drug resistance. PMID:26056718

  18. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

    Full Text Available From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions, and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning …) and the diversification of interaction devices can be seen as a factor of flexibility albeit introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour by providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  19. Seeing is believing: video classification for computed tomographic colonography using multiple-instance learning.

    Science.gov (United States)

    Wang, Shijun; McKenna, Matthew T; Nguyen, Tan B; Burns, Joseph E; Petrick, Nicholas; Sahiner, Berkman; Summers, Ronald M

    2012-05-01

    In this paper, we present development and testing results for a novel colonic polyp classification method for use as part of a computed tomographic colonography (CTC) computer-aided detection (CAD) system. Inspired by the interpretative methodology of radiologists using 3-D fly-through mode in CTC reading, we have developed an algorithm which utilizes sequences of images (referred to here as videos) for classification of CAD marks. For each CAD mark, we created a video composed of a series of intraluminal, volume-rendered images visualizing the detection from multiple viewpoints. We then framed the video classification question as a multiple-instance learning (MIL) problem. Since a positive (negative) bag may contain negative (positive) instances, which in our case depends on the viewing angles and camera distance to the target, we developed a novel MIL paradigm to accommodate this class of problems. We solved the new MIL problem by maximizing an L2-norm soft margin using semidefinite programming, which can optimize the relevant parameters automatically. We tested our method by analyzing a CTC data set obtained from 50 patients from three medical centers. Our proposed method showed significantly better performance compared with several traditional MIL methods.
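
    Under the standard MIL assumption (a bag is positive if at least one of its instances is positive), the bag-level decision can be illustrated with the short Python sketch below; the linear instance scorer and feature vectors are assumptions for illustration, not the authors' semidefinite-programming formulation.

    import numpy as np

    def bag_score(bag, w, b):
        """Score a bag (one CAD mark's video) as the maximum instance score.

        bag : (n_frames, n_features) array of per-frame feature vectors.
        The max over instances reflects the MIL rule that a single positive
        viewpoint makes the whole bag positive.
        """
        return float(np.max(bag @ w + b))

    def classify_bags(bags, w, b, threshold=0.0):
        """Label each candidate positive if its best-scoring view exceeds the threshold."""
        return [int(bag_score(bag, w, b) > threshold) for bag in bags]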

  20. Global Calibration of Multiple Cameras Based on Sphere Targets

    Directory of Open Access Journals (Sweden)

    Junhua Sun

    2016-01-01

    Full Text Available Global calibration methods for multi-camera systems are critical to the accuracy of vision measurement. Proposed in this paper is such a method based on several groups of sphere targets and a precision auxiliary camera. Each camera to be calibrated observes a group of spheres (at least three), while the auxiliary camera observes all the spheres. The global calibration can be achieved after each camera reconstructs the sphere centers in its field of view. In the process of reconstructing a sphere center, a parameter equation is used to describe the sphere projection model. Theoretical analysis and computer simulation are carried out to analyze the factors that affect the calibration accuracy. Simulation results show that the parameter equation can largely improve the reconstruction accuracy. In the experiments, a two-camera system calibrated by our method is used to measure a distance of about 578 mm, and the root mean squared error is within 0.14 mm. Furthermore, the experiments indicate that the method has simple operation and good flexibility, especially for onsite multiple cameras without a common field of view.

  1. Quantum partial search for uneven distribution of multiple target items

    Science.gov (United States)

    Zhang, Kun; Korepin, Vladimir

    2018-06-01

    Quantum partial search algorithm is an approximate search. It aims to find a target block (which has the target items). It runs a little faster than full Grover search. In this paper, we consider the quantum partial search algorithm for multiple target items unevenly distributed in a database (target blocks have different numbers of target items). The algorithm we describe can locate one of the target blocks. Efficiency of the algorithm is measured by the number of queries to the oracle. We optimize the algorithm in order to improve efficiency. By the perturbation method, we find that the algorithm runs the fastest when target items are evenly distributed in the database.
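
    For context, the full Grover search that partial search slightly improves upon locates one of t target items in an unstructured database of N items using approximately

        Q_full ≈ (π/4) · √(N/t)

    oracle queries (a standard textbook baseline, stated here only for orientation); partial search trades exact item location for block location and thereby saves queries, with the precise saving depending on the block structure and on how the target items are distributed among the blocks, which is the question the paper analyzes.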

  2. Irradiation uniformity of spherical targets by multiple uv beams from OMEGA

    International Nuclear Information System (INIS)

    Beich, W.; Dunn, M.; Hutchison, R.

    1984-01-01

    Direct-drive laser fusion demands extremely high levels of irradiation uniformity to ensure uniform compression of spherical targets. The assessment of illumination uniformity of targets irradiated by multiple beams from the OMEGA facility is made with the aid of multiple-beam spherical superposition codes, which take into account ray tracing and absorption and a detailed knowledge of the intensity distribution of each beam in the target plane. In this report, recent estimates of the irradiation uniformity achieved with 6 and 12 uv beams of OMEGA will be compared with previous measurements in the IR, and predictions will be made for the uv illumination uniformity achievable with 24 beams of OMEGA

  3. Computer studies of multiple-quantum spin dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Murdoch, J.B.

    1982-11-01

    The excitation and detection of multiple-quantum (MQ) transitions in Fourier transform NMR spectroscopy is an interesting problem in the quantum mechanical dynamics of spin systems as well as an important new technique for investigation of molecular structure. In particular, multiple-quantum spectroscopy can be used to simplify overly complex spectra or to separate the various interactions between a nucleus and its environment. The emphasis of this work is on computer simulation of spin-system evolution to better relate theory and experiment.

  4. Computer studies of multiple-quantum spin dynamics

    International Nuclear Information System (INIS)

    Murdoch, J.B.

    1982-11-01

    The excitation and detection of multiple-quantum (MQ) transitions in Fourier transform NMR spectroscopy is an interesting problem in the quantum mechanical dynamics of spin systems as well as an important new technique for investigation of molecular structure. In particular, multiple-quantum spectroscopy can be used to simplify overly complex spectra or to separate the various interactions between a nucleus and its environment. The emphasis of this work is on computer simulation of spin-system evolution to better relate theory and experiment

  5. Cooperative multi-robot observation of multiple moving targets

    International Nuclear Information System (INIS)

    Parker, L.E.; Emmons, B.A.

    1997-01-01

    An important issue that arises in the automation of many security, surveillance, and reconnaissance tasks is that of monitoring, or observing, the movements of targets navigating in a bounded area of interest. A key research issue in these problems is that of sensor placement--determining where sensors should be located to maintain the targets in view. In complex applications of this type, the use of multiple sensors dynamically moving over time is required. In this paper, the authors investigate the use of a cooperative team of autonomous sensor-based robots for multi-robot observation of multiple moving targets. They focus primarily on developing the distributed control strategies that allow the robot team to attempt to maximize the collective time during which each object is being observed by at least one robot in the area of interest. The initial efforts in this problem address the aspects of distributed control in homogeneous robot teams with equivalent sensing and movement capabilities working in an uncluttered, bounded area. This paper first formalizes the problem, discusses related work, and then shows that this problem is NP-hard. The authors then present a distributed approximate approach to solving this problem that combines low-level multi-robot control with higher-level control

  6. Computational design of trimeric influenza-neutralizing proteins targeting the hemagglutinin receptor binding site

    Energy Technology Data Exchange (ETDEWEB)

    Strauch, Eva-Maria; Bernard, Steffen M.; La, David; Bohn, Alan J.; Lee, Peter S.; Anderson, Caitlin E.; Nieusma, Travis; Holstein, Carly A.; Garcia, Natalie K.; Hooper, Kathryn A.; Ravichandran, Rashmi; Nelson, Jorgen W.; Sheffler, William; Bloom, Jesse D.; Lee, Kelly K.; Ward, Andrew B.; Yager, Paul; Fuller, Deborah H.; Wilson, Ian A.; Baker, David (UWASH); (Scripps); (FHCRC)

    2017-06-12

    Many viral surface glycoproteins and cell surface receptors are homo-oligomers, and thus can potentially be targeted by geometrically matched homo-oligomers that engage all subunits simultaneously to attain high avidity and/or lock subunits together. The adaptive immune system cannot generally employ this strategy since the individual antibody binding sites are not arranged with appropriate geometry to simultaneously engage multiple sites in a single target homo-oligomer. We describe a general strategy for the computational design of homo-oligomeric protein assemblies with binding functionality precisely matched to homo-oligomeric target sites. In the first step, a small protein is designed that binds a single site on the target. In the second step, the designed protein is assembled into a homo-oligomer such that the designed binding sites are aligned with the target sites. We use this approach to design high-avidity trimeric proteins that bind influenza A hemagglutinin (HA) at its conserved receptor binding site. The designed trimers can both capture and detect HA in a paper-based diagnostic format, neutralize influenza in cell culture, and completely protect mice when given as a single dose 24 h before or after challenge with influenza.

  7. Vertical Load Distribution for Cloud Computing via Multiple Implementation Options

    Science.gov (United States)

    Phan, Thomas; Li, Wen-Syan

    Cloud computing looks to deliver software as a provisioned service to end users, but the underlying infrastructure must be sufficiently scalable and robust. In our work, we focus on large-scale enterprise cloud systems and examine how enterprises may use a service-oriented architecture (SOA) to provide a streamlined interface to their business processes. To scale up the business processes, each SOA tier usually deploys multiple servers for load distribution and fault tolerance, a scenario which we term horizontal load distribution. One limitation of this approach is that load cannot be distributed further when all servers in the same tier are loaded. In complex multi-tiered SOA systems, a single business process may actually be implemented by multiple different computation pathways among the tiers, each with different components, in order to provide resilience and scalability. Such multiple implementation options give opportunities for vertical load distribution across tiers. In this chapter, we look at a novel request routing framework for SOA-based enterprise computing with multiple implementation options that takes into account the options of both horizontal and vertical load distribution.
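
    A minimal Python sketch of the routing idea is given below. The tier names, capacities, and the load metric are illustrative assumptions, not the framework described in the chapter: each pathway (implementation option) is costed by its most loaded tier, the cheapest pathway is chosen (vertical distribution), and within each tier the least-loaded server is picked (horizontal distribution).

      # Hypothetical sketch of combined horizontal and vertical load distribution.
      from dataclasses import dataclass
      from typing import List

      @dataclass
      class Server:
          name: str
          capacity: int          # maximum concurrent requests (assumed load metric)
          load: int = 0

          def utilization(self) -> float:
              return self.load / self.capacity

      @dataclass
      class Pathway:
          """One implementation option: an ordered list of server pools (tiers)."""
          name: str
          tiers: List[List[Server]]

          def cost(self) -> float:
              # Each tier is served by its least-loaded server (horizontal distribution);
              # the pathway cost is the utilization of the busiest such choice.
              return max(min(s.utilization() for s in tier) for tier in self.tiers)

      def route(pathways: List[Pathway]) -> List[Server]:
          """Vertical distribution: choose the cheapest implementation option,
          then the least-loaded server in each of its tiers."""
          best = min(pathways, key=lambda p: p.cost())
          chosen = [min(tier, key=lambda s: s.utilization()) for tier in best.tiers]
          for s in chosen:
              s.load += 1
          return chosen

      if __name__ == "__main__":
          fast = Pathway("cache-backed", [[Server("web1", 10), Server("web2", 10)],
                                          [Server("cache1", 5)]])
          slow = Pathway("db-backed", [[Server("web3", 10)],
                                       [Server("app1", 8), Server("app2", 8)],
                                       [Server("db1", 4)]])
          for _ in range(6):
              print([s.name for s in route([fast, slow])])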

  8. Changing paradigm from one target one ligand towards multi target directed ligand design for key drug targets of Alzheimer disease: An important role of Insilco methods in multi target directed ligands design.

    Science.gov (United States)

    Kumar, Akhil; Tiwari, Ashish; Sharma, Ashok

    2018-03-15

    Alzheimer disease (AD) is now considered a multifactorial neurodegenerative disorder; its prevalence is rapidly increasing to an alarming level and it causes a high death rate. The one target, one ligand hypothesis cannot provide a complete solution for AD because of the multifactorial nature of the disease, and one target, one drug approaches seem to fail to provide better treatment against AD. Moreover, currently available treatments are limited, and most of the upcoming treatments under clinical trials are based on modulating a single target. So current AD drug discovery research is shifting towards a new approach for a better solution that simultaneously modulates more than one target in the neurodegenerative cascade. This can be achieved by network pharmacology, multi-modal therapies, multifaceted approaches, and/or the more recently proposed "multi-target directed ligands" (MTDLs). Drug discovery projects are tedious, costly and long term. Moreover, multi-target AD drug discovery adds extra challenges, such as good binding affinity of ligands for multiple targets, optimal ADME/T properties, no or fewer off-target side effects, and crossing of the blood-brain barrier. These hurdles may be addressed by in silico methods, which can provide efficient solutions in less time and at lower cost, as computational methods have been successfully applied to single-target drug discovery projects. Here we summarize some of the most prominent and computationally explored single targets against AD, and we further discuss successful examples of dual or multiple inhibitors for the same targets. Moreover, we focus on ligand- and structure-based computational approaches to design MTDLs against AD. It is not an easy task to balance dual activity in a single molecule, but computational approaches such as virtual screening, docking, QSAR, simulation and free-energy calculations are useful for future MTDL drug discovery, alone or in combination with fragment-based methods. However, rational and logical implementations of computational drug designing methods are capable of assisting AD drug

  9. Multi-target detection and positioning in crowds using multiple camera surveillance

    Science.gov (United States)

    Huang, Jiahu; Zhu, Qiuyu; Xing, Yufeng

    2018-04-01

    In this study, we propose a pixel correspondence algorithm for positioning in crowds based on constraints on the distance between lines of sight, grayscale differences, and height in a world coordinates system. First, a Gaussian mixture model is used to obtain the background and foreground from multi-camera videos. Second, the hair and skin regions are extracted as regions of interest. Finally, the correspondences between each pixel in the region of interest are found under multiple constraints and the targets are positioned by pixel clustering. The algorithm can provide appropriate redundancy information for each target, which decreases the risk of losing targets due to a large viewing angle and wide baseline. To address the correspondence problem for multiple pixels, we construct a pixel-based correspondence model based on a similar permutation matrix, which converts the correspondence problem into a linear programming problem where a similar permutation matrix is found by minimizing an objective function. The correct pixel correspondences can be obtained by determining the optimal solution of this linear programming problem and the three-dimensional position of the targets can also be obtained by pixel clustering. Finally, we verified the algorithm with multiple cameras in experiments, which showed that the algorithm has high accuracy and robustness.
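
    The correspondence step can be sketched with a much simplified stand-in: instead of the paper's linear program over a similar permutation matrix, the Hungarian algorithm (scipy.optimize.linear_sum_assignment) finds an optimal one-to-one matching for a toy cost that combines the distance between lines of sight and the grayscale difference. The cost weight and the synthetic data below are assumptions for illustration only.

      # Simplified stand-in for the pixel correspondence step between two views.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      rng = np.random.default_rng(0)
      n = 5
      # Toy features: a 3-D closest-approach point on each pixel's line of sight
      # and a gray level, for camera A and camera B.
      points_a = rng.uniform(0, 10, size=(n, 3))
      points_b = points_a + rng.normal(0, 0.2, size=(n, 3))   # noisy counterparts
      gray_a = rng.uniform(0, 255, size=n)
      gray_b = gray_a + rng.normal(0, 5, size=n)

      # Combined cost matrix: geometric distance plus weighted grayscale difference.
      w_gray = 0.05
      dist = np.linalg.norm(points_a[:, None, :] - points_b[None, :, :], axis=2)
      cost = dist + w_gray * np.abs(gray_a[:, None] - gray_b[None, :])

      rows, cols = linear_sum_assignment(cost)       # optimal one-to-one correspondence
      print("matches:", list(zip(rows.tolist(), cols.tolist())))
      print("total cost:", cost[rows, cols].sum())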

  10. Targeting multiple heterogeneous hardware platforms with OpenCL

    Science.gov (United States)

    Fox, Paul A.; Kozacik, Stephen T.; Humphrey, John R.; Paolini, Aaron; Kuller, Aryeh; Kelmelis, Eric J.

    2014-06-01

    The OpenCL API allows for the abstract expression of parallel, heterogeneous computing, but hardware implementations have substantial implementation differences. The abstractions provided by the OpenCL API are often insufficiently high-level to conceal differences in hardware architecture. Additionally, implementations often do not take advantage of potential performance gains from certain features due to hardware limitations and other factors. These factors make it challenging to produce code that is portable in practice, resulting in much OpenCL code being duplicated for each hardware platform being targeted. This duplication of effort offsets the principal advantage of OpenCL: portability. The use of certain coding practices can mitigate this problem, allowing a common code base to be adapted to perform well across a wide range of hardware platforms. To this end, we explore some general practices for producing performant code that are effective across platforms. Additionally, we explore some ways of modularizing code to enable optional optimizations that take advantage of hardware-specific characteristics. The minimum requirement for portability implies avoiding the use of OpenCL features that are optional, not widely implemented, poorly implemented, or missing in major implementations. Exposing multiple levels of parallelism allows hardware to take advantage of the types of parallelism it supports, from the task level down to explicit vector operations. Static optimizations and branch elimination in device code help the platform compiler to effectively optimize programs. Modularization of some code is important to allow operations to be chosen for performance on target hardware. Optional subroutines exploiting explicit memory locality allow for different memory hierarchies to be exploited for maximum performance. The C preprocessor and JIT compilation using the OpenCL runtime can be used to enable some of these techniques, as well as to factor in hardware

  11. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan; Abrar, Shafayat

    2017-01-01

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.

  12. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan

    2017-01-22

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.

  13. Computer code MLCOSP for multiple-correlation and spectrum analysis with a hybrid computer

    International Nuclear Information System (INIS)

    Oguma, Ritsuo; Fujii, Yoshio; Usui, Hozumi; Watanabe, Koichi

    1975-10-01

    Usage of the computer code MLCOSP (Multiple Correlation and Spectrum) is described for a hybrid computer installed in JAERI. Functions of the hybrid computer and its terminal devices are utilized ingeniously in the code to reduce the complexity of the data handling that occurs in the analysis of multivariable experimental data and to perform the analysis in perspective. Features of the code are as follows: experimental data can be fed to the digital computer through the analog part of the hybrid computer by connecting a data recorder; the computed results are displayed as figures, and hardcopies are taken when necessary; messages from the code are shown on the terminal, so man-machine communication is possible; and data can also be entered through a keyboard, so case studies based on the results of the analysis are possible. (auth.)

  14. Improved Targeting Through Collaborative Decision-Making and Brain Computer Interfaces

    Science.gov (United States)

    Stoica, Adrian; Barrero, David F.; McDonald-Maier, Klaus

    2013-01-01

    This paper reports a first step toward a brain-computer interface (BCI) for collaborative targeting. Specifically, we explore, from a broad perspective, how the collaboration of a group of people can increase the performance on a simple target identification task. To this end, we asked a group of people to identify the location and color of a sequence of targets appearing on the screen and measured the time and accuracy of the response. The individual results are compared to a collective identification result determined by simple majority voting, with a random choice in case of a draw. The results are promising, as the identification becomes significantly more reliable even with this simple voting and a small number of people (either an odd or even number) involved in the decision. In addition, the paper briefly analyzes the role of brain-computer interfaces in collaborative targeting, extending the targeting task by using a BCI instead of a mechanical response.

  15. Lesional-targeting of neuroprotection to the inflammatory penumbra in experimental multiple sclerosis

    NARCIS (Netherlands)

    Al-Izki, S.; Pryce, G.; Hankey, D.J.R.; Lidster, K.; von Kutzleben, S.M.; Browne, L.; Clutterbuck, L.; Posada, C.; Chan, A.W.E.; Amor, S.; Perkins, V.; Gerritsen, W.H.; Ummenthum, K.; Peferoen-Baert, R.; van der Valk, P.; Montoya, A.; Joel, S.P.; Garthwaite, J.; Giovannoni, G.; Selwood, D.L.; Baker, D.

    2014-01-01

    Progressive multiple sclerosis is associated with metabolic failure of the axon and excitotoxicity that leads to chronic neurodegeneration. Global sodium-channel blockade causes side effects that can limit its use for neuroprotection in multiple sclerosis. Through selective targeting of drugs to

  16. Targeting an efficient target-to-target interval for P300 speller brain–computer interfaces

    Science.gov (United States)

    Sellers, Eric W.; Wang, Xingyu

    2013-01-01

    Longer target-to-target intervals (TTIs) produce greater P300 event-related potential amplitude, which can increase brain–computer interface (BCI) classification accuracy and decrease the number of flashes needed for accurate character classification. However, longer TTIs require more time for each trial, which decreases the information transfer rate of the BCI. In this paper, a P300 BCI using a 7 × 12 matrix explored new flash patterns (16-, 18- and 21-flash patterns) with different TTIs to assess the effects of TTI on P300 BCI performance. The new flash patterns were designed to minimize TTI, decrease repetition blindness, and examine the temporal relationship between each flash of a given stimulus by placing a minimum of one (16-flash pattern), two (18-flash pattern), or three (21-flash pattern) non-target flashes between target flashes. Online results showed that the 16-flash pattern yielded the lowest classification accuracy among the three patterns. The results also showed that the 18-flash pattern provides a significantly higher information transfer rate (ITR) than the 21-flash pattern; both patterns provide high ITR and high accuracy for all subjects. PMID:22350331

  17. Computing proton dose to irregularly moving targets

    International Nuclear Information System (INIS)

    Phillips, Justin; Gueorguiev, Gueorgui; Grassberger, Clemens; Dowdell, Stephen; Paganetti, Harald; Sharp, Gregory C; Shackleford, James A

    2014-01-01

    Purpose: While four-dimensional computed tomography (4DCT) and deformable registration can be used to assess the dose delivered to regularly moving targets, there are few methods available for irregularly moving targets. 4DCT captures an idealized waveform, but human respiration during treatment is characterized by gradual baseline shifts and other deviations from a periodic signal. This paper describes a method for computing the dose delivered to irregularly moving targets based on 1D or 3D waveforms captured at the time of delivery. Methods: The procedure uses CT or 4DCT images for dose calculation, and 1D or 3D respiratory waveforms of the target position at time of delivery. Dose volumes are converted from their Cartesian geometry into a beam-specific radiological depth space, parameterized in 2D by the beam aperture, and longitudinally by the radiological depth. In this new frame of reference, the proton doses are translated according to the motion found in the 1D or 3D trajectory. These translated dose volumes are weighted and summed, then transformed back into Cartesian space, yielding an estimate of the dose that includes the effect of the measured breathing motion. The method was validated using a synthetic lung phantom and a single representative patient CT. Simulated 4DCT was generated for the phantom with 2 cm peak-to-peak motion. Results: A passively-scattered proton treatment plan was generated using 6 mm and 5 mm smearing for the phantom and patient plans, respectively. The method was tested without motion, and with two simulated breathing signals: a 2 cm amplitude sinusoid, and a 2 cm amplitude sinusoid with 3 cm linear drift in the phantom. The tumor positions were equally weighted for the patient calculation. Motion-corrected dose was computed based on the mid-ventilation CT image in the phantom and the peak exhale position in the patient. Gamma evaluation was 97.8% without motion, 95.7% for 2 cm sinusoidal motion, 95.7% with 3 cm drift in

  18. DDR: Efficient computational method to predict drug–target interactions using graph mining and machine learning approaches

    KAUST Repository

    Olayan, Rawan S.

    2017-11-23

    Motivation: Computationally finding drug–target interactions (DTIs) is a convenient strategy to identify new DTIs at low cost with reasonable accuracy. However, current DTI prediction methods suffer from high false-positive prediction rates. Results: We developed DDR, a novel method that improves DTI prediction accuracy. DDR is based on the use of a heterogeneous graph that contains known DTIs together with multiple similarities between drugs and multiple similarities between target proteins. DDR applies a non-linear similarity fusion method to combine the different similarities. Before fusion, DDR performs a pre-processing step in which a subset of similarities is selected in a heuristic process to obtain an optimized combination of similarities. Then, DDR applies a random forest model using different graph-based features extracted from the DTI heterogeneous graph. Using five repeats of 10-fold cross-validation, three testing setups, and the weighted average of area under the precision-recall curve (AUPR) scores, we show that DDR significantly reduces the AUPR score error relative to the next best state-of-the-art method for predicting DTIs by 34% when the drugs are new, by 23% when the targets are new, and by 34% when the drugs and the targets are known but not all DTIs between them are known. Using independent sources of evidence, we verify as correct 22 out of the top 25 DDR novel predictions. This suggests that DDR can be used as an efficient method to identify correct DTIs.
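
    The following toy Python sketch illustrates the general idea only (multiple similarities turned into features for drug-target pairs that feed a random forest); it is not the published DDR pipeline, and the data, feature construction, and scoring below are invented for illustration.

      # Toy sketch: similarity-derived pair features + random forest for DTI scoring.
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(1)
      n_drugs, n_targets = 30, 20

      # Two drug-drug and two target-target similarity matrices (placeholders for,
      # e.g., chemical vs. side-effect similarity, sequence vs. GO similarity).
      drug_sims = [rng.uniform(0, 1, (n_drugs, n_drugs)) for _ in range(2)]
      target_sims = [rng.uniform(0, 1, (n_targets, n_targets)) for _ in range(2)]
      known = rng.integers(0, 2, size=(n_drugs, n_targets))   # known DTI matrix (0/1)

      def pair_features(d, t):
          """Simple graph-flavoured features: for each similarity, how strongly the
          drug's neighbours interact with t, and the target's neighbours with d.
          (In a real evaluation the pair being scored would be masked out of `known`
          to avoid leakage.)"""
          feats = []
          for S in drug_sims:
              feats.append(S[d] @ known[:, t] / n_drugs)
          for S in target_sims:
              feats.append(S[t] @ known[d, :] / n_targets)
          return feats

      pairs = [(d, t) for d in range(n_drugs) for t in range(n_targets)]
      X = np.array([pair_features(d, t) for d, t in pairs])
      y = np.array([known[d, t] for d, t in pairs])

      clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      scores = clf.predict_proba(X)[:, 1]
      print("top-scoring pair:", pairs[int(np.argmax(scores))])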

  19. Multiple-target method for sputtering amorphous films for bubble-domain devices

    International Nuclear Information System (INIS)

    Burilla, C.T.; Bekebrede, W.R.; Smith, A.B.

    1976-01-01

    Previously, sputtered amorphous metal alloys for bubble applications have ordinarily been prepared by standard sputtering techniques using a single target electrode. The deposition of these alloys is reported using a multiple-target rf technique in which a separate target is used for each element contained in the alloy. One of the main advantages of this multiple-target approach is that the film composition can be easily changed by simply varying the voltages applied to the elemental targets. In the apparatus, the centers of the targets are positioned on a 15 cm-radius circle. The platform holding the film substrate is on a 15 cm-long arm which can rotate about the center, thus bringing the sample successively under each target. The platform rotation rate is adjustable from 0 to 190 rpm. That this latter speed is sufficient to homogenize the alloys produced is demonstrated by measurements made of the uniaxial anisotropy constant in Gd0.12Co0.59Cu0.29 films. The anisotropy is 6.0 × 10^5 ergs/cm^3 and independent of rotation rate above approximately 25 rpm, but it drops rapidly for slower rotation rates, reaching 1.8 × 10^5 ergs/cm^3 for 7 rpm. The film quality is equal to that of films made by conventional methods. Coercivities of a few oersteds in samples with stripe widths of 1 to 2 μm and magnetizations of 800 to 2800 G were observed

  20. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    By Autumn R Kulaga, Kathryn L Loftis, and Eric Murray; Army Research Laboratory. Approved for public release.

  1. Teacher regulation of multiple computer-supported collaborating groups

    NARCIS (Netherlands)

    Van Leeuwen, Anouschka; Janssen, Jeroen; Erkens, Gijsbert; Brekelmans, Mieke

    2015-01-01

    Teachers regulating groups of students during computer-supported collaborative learning (CSCL) face the challenge of orchestrating their guidance at student, group, and class level. During CSCL, teachers can monitor all student activity and interact with multiple groups at the same time. Not much is

  2. Multiple Maneuvering Target Tracking by Improved Particle Filter Based on Multiscan JPDA

    Directory of Open Access Journals (Sweden)

    Jing Liu

    2012-01-01

    Full Text Available A multiple maneuvering target tracking algorithm based on a particle filter is addressed. The equivalent-noise approach is adopted, which uses a simple dynamic model consisting of the target state and an equivalent noise that accounts for the combined effects of the process noise and maneuvers. The equivalent-noise approach converts the problem of maneuvering target tracking to that of state estimation in the presence of nonstationary process noise with unknown statistics. A novel method for identifying the nonstationary process noise is proposed in the particle filter framework. Furthermore, a particle-filter-based multiscan Joint Probability Data Association (JPDA) filter is proposed to deal with the data association problem in multiple maneuvering target tracking. In the proposed multiscan JPDA algorithm, the distributions of interest are the marginal filtering distributions for each of the targets, and these distributions are approximated with particles. The multiscan JPDA algorithm examines the joint association events in a multiscan sliding window and calculates the marginal posterior probability based on the multiscan joint association events. The proposed algorithm is illustrated via an example involving the tracking of two highly maneuvering, and at times closely spaced and crossed, targets, based on resolved measurements.

  3. Automatic detection of multiple UXO-like targets using magnetic anomaly inversion and self-adaptive fuzzy c-means clustering

    Science.gov (United States)

    Yin, Gang; Zhang, Yingtang; Fan, Hongbo; Ren, Guoquan; Li, Zhining

    2017-12-01

    We have developed a method for automatically detecting UXO-like targets based on magnetic anomaly inversion and self-adaptive fuzzy c-means clustering. Magnetic anomaly inversion methods are used to estimate the initial locations of multiple UXO-like sources. Although these initial locations have some errors with respect to the real positions, they form dense clouds around the actual positions of the magnetic sources. We then use the self-adaptive fuzzy c-means clustering algorithm to cluster these initial locations. The estimated number of cluster centroids represents the number of targets, and the cluster centroids are regarded as the locations of the magnetic targets. The effectiveness of the method has been demonstrated using synthetic datasets. Computational results show that the proposed method can be applied to the case of several UXO-like targets that are randomly scattered within a confined, shallow subsurface volume. A field test was carried out to check the validity of the proposed method, and the experimental results show that the prearranged magnets can be detected unambiguously and located precisely.
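
    A minimal NumPy implementation of standard fuzzy c-means, applied to synthetic clouds of initial location estimates, is sketched below; the self-adaptive variant used in the paper (which also chooses the number of clusters) is not reproduced, and all data are invented.

      # Minimal standard fuzzy c-means; cluster centroids play the role of estimated
      # target locations formed from noisy inversion estimates.
      import numpy as np

      def fuzzy_c_means(X, c, m=2.0, iters=100, tol=1e-6, seed=0):
          rng = np.random.default_rng(seed)
          U = rng.uniform(size=(len(X), c))
          U /= U.sum(axis=1, keepdims=True)                   # memberships sum to 1
          for _ in range(iters):
              Um = U ** m
              centers = (Um.T @ X) / Um.sum(axis=0)[:, None]  # weighted centroids
              d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
              U_new = 1.0 / (d ** (2 / (m - 1)))
              U_new /= U_new.sum(axis=1, keepdims=True)       # standard FCM membership update
              if np.abs(U_new - U).max() < tol:
                  U = U_new
                  break
              U = U_new
          return centers, U

      if __name__ == "__main__":
          rng = np.random.default_rng(42)
          true_sources = np.array([[2.0, 3.0], [8.0, 7.0], [5.0, 1.0]])
          # Dense clouds of inversion estimates around each true source position.
          X = np.vstack([s + rng.normal(0, 0.3, size=(40, 2)) for s in true_sources])
          centers, U = fuzzy_c_means(X, c=3)
          print(np.round(centers, 2))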

  4. Optimized computational imaging methods for small-target sensing in lens-free holographic microscopy

    Science.gov (United States)

    Xiong, Zhen; Engle, Isaiah; Garan, Jacob; Melzer, Jeffrey E.; McLeod, Euan

    2018-02-01

    Lens-free holographic microscopy is a promising diagnostic approach because it is cost-effective, compact, and suitable for point-of-care applications, while providing high resolution together with an ultra-large field-of-view. It has been applied to biomedical sensing, where larger targets like eukaryotic cells, bacteria, or viruses can be directly imaged without labels, and smaller targets like proteins or DNA strands can be detected via scattering labels like micro- or nano-spheres. Automated image processing routines can count objects and infer target concentrations. In these sensing applications, sensitivity and specificity are critically affected by image resolution and signal-to-noise ratio (SNR). Pixel super-resolution approaches have been shown to boost resolution and SNR by synthesizing a high-resolution image from multiple, partially redundant, low-resolution images. However, there are several computational methods that can be used to synthesize the high-resolution image, and previously, it has been unclear which methods work best for the particular case of small-particle sensing. Here, we quantify the SNR achieved in small-particle sensing using a regularized gradient-descent optimization method, where the regularization is based on cardinal-neighbor differences, Bayer-pattern noise reduction, or sparsity in the image. In particular, we find that gradient descent with sparsity-based regularization works best for small-particle sensing. These computational approaches were evaluated on images acquired using a lens-free microscope that we assembled from an off-the-shelf LED array and color image sensor. Compared to other lens-free imaging systems, our hardware integration, calibration, and sample preparation are particularly simple. We believe our results will help to enable the best performance in lens-free holographic sensing.
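
    The sparsity-regularized gradient descent can be illustrated on a generic linear model: the ISTA sketch below minimizes 0.5*||Ax - y||^2 + lambda*||x||_1 by proximal gradient steps. The lens-free forward model (sub-pixel shifts, Bayer pattern, holographic propagation) is not modeled; A, the step size, and lambda are illustrative assumptions.

      # Generic gradient descent with a sparsity (L1) regularizer: ISTA on a toy
      # underdetermined linear model y = A x + noise with sparse x.
      import numpy as np

      def soft_threshold(v, t):
          return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

      def ista(A, y, lam, iters=500):
          """Minimize 0.5*||A x - y||^2 + lam*||x||_1 by proximal gradient descent."""
          step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the largest eigenvalue of A^T A
          x = np.zeros(A.shape[1])
          for _ in range(iters):
              grad = A.T @ (A @ x - y)
              x = soft_threshold(x - step * grad, step * lam)
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n, p, k = 60, 200, 5                     # more unknowns than measurements
          A = rng.normal(size=(n, p)) / np.sqrt(n)
          x_true = np.zeros(p)
          x_true[rng.choice(p, k, replace=False)] = rng.normal(3, 1, k)
          y = A @ x_true + 0.01 * rng.normal(size=n)
          x_hat = ista(A, y, lam=0.05)
          print("recovered support:", sorted(np.flatnonzero(np.abs(x_hat) > 0.1)))
          print("true support:     ", sorted(np.flatnonzero(x_true)))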

  5. A Convenient Cas9-based Conditional Knockout Strategy for Simultaneously Targeting Multiple Genes in Mouse.

    Science.gov (United States)

    Chen, Jiang; Du, Yinan; He, Xueyan; Huang, Xingxu; Shi, Yun S

    2017-03-31

    The most powerful way to probe protein function is to characterize the consequence of its deletion. Compared to conventional gene knockout (KO), conditional knockout (cKO) provides an advanced gene targeting strategy with which gene deletion can be performed in a spatially and temporally restricted manner. However, for most species, which are amphiploid, the widely used Cre-flox conditional KO (cKO) system would need the targeting loci in both alleles to be loxP flanked, which in practice requires time- and labor-consuming breeding. This becomes especially significant when one is dealing with multiple genes. The CRISPR/Cas9 genome modulation system has the advantage of being able to target multiple sites simultaneously. Here we propose a strategy that can achieve conditional KO of multiple genes in mouse with Cre-recombinase-dependent Cas9 expression. By transgenic construction of loxP-stop-loxP (LSL) controlled Cas9 (LSL-Cas9) together with sgRNAs targeting EGFP, we showed that the fluorescent molecule could be eliminated in a Cre-dependent manner. We further verified the efficacy of this novel strategy for targeting multiple sites by deleting c-Maf and MafB simultaneously and specifically in macrophages. Compared to the traditional Cre-flox cKO strategy, this sgRNAs-LSL-Cas9 cKO system is simpler and faster, and makes conditional manipulation of multiple genes feasible.

  6. Multiple scattering in electron fluid and energy loss in multi-ionic targets

    Energy Technology Data Exchange (ETDEWEB)

    Deutsch, C., E-mail: claude.deutsch@u-psud.fr [LPGP, UParis-Sud, 91405-Orsay (France); Tahir, N.A. [GSI, 1Planck Str., 64291-Darmstadt (Germany); Barriga-Carrasco, M. [ETSII, UCastilla-la-Mancha, 13071 Ciudad-Real (Spain); Ceban, V. [LPGP, UParis-Sud, 91405-Orsay (France); Fromy, P. [CRI, UParis-Sud, 91405-Orsay (France); Gilles, D. [CEA/Saclay/DSM/IRFU/SAP, 91191-Gif-s-Yvette (France); Leger, D. [Laboratoire Monthouy, UValenciennes-Hainaut Cambresis (France); Maynard, G. [LPGP, UParis-Sud, 91405-Orsay (France); Tashev, B. [Department of Physics, KazNu, Tole Bi82, Almaty (Kazakhstan); Volpe, L. [Department of Physics, UMilano-Bicocca, Milano 20126 (Italy)

    2014-01-01

    Extensions of the standard stopping model (SSM) for ion projectiles interacting with dense targets of timely concern for ICF and WDM are reviewed. They include multiple scattering on partially degenerate electrons, low velocity ion slowing down in demixing H–He mixtures within Jovian planets core or multiionic target such as Kapton.

  7. Multiple scattering in electron fluid and energy loss in multi-ionic targets

    International Nuclear Information System (INIS)

    Deutsch, C.; Tahir, N.A.; Barriga-Carrasco, M.; Ceban, V.; Fromy, P.; Gilles, D.; Leger, D.; Maynard, G.; Tashev, B.; Volpe, L.

    2014-01-01

    Extensions of the standard stopping model (SSM) for ion projectiles interacting with dense targets of timely concern for ICF and WDM are reviewed. They include multiple scattering on partially degenerate electrons, low velocity ion slowing down in demixing H–He mixtures within Jovian planets core or multiionic target such as Kapton

  8. TRANGE: computer code to calculate the energy beam degradation in target stack

    International Nuclear Information System (INIS)

    Bellido, Luis F.

    1995-07-01

    A computer code to calculate the projectile energy degradation along a target stack was developed for an IBM or compatible personal microcomputer. A comparison of protons and deuterons bombarding uranium and aluminium targets was made. The results showed that the data obtained with TRANGE were in agreement with other computer codes such as TRIM and EDP, and with the Williamson and Janni range and stopping-power tables. TRANGE can be used for any charged ion, for energies between 1 and 100 MeV, in metal foils and solid compound targets. (author). 8 refs., 2 tabs
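
    The bookkeeping such a code performs can be sketched as stepping the beam energy through each foil while integrating dE/dx over small thickness steps. In the Python sketch below, the stopping-power function is a crude 1/E placeholder rather than the Williamson and Janni tables the code actually uses, and the foil data and material constants are invented.

      # Sketch of energy degradation through a target stack (toy stopping power).
      def stopping_power(energy_mev, material):
          """Placeholder dE/dx in MeV per (mg/cm^2); real codes interpolate tables."""
          k = {"Al": 0.9, "U": 0.35}[material]            # invented material constants
          return k / max(energy_mev, 0.1)

      def degrade(energy_mev, stack, step=0.1):
          """Return the beam energy after each foil. stack = [(material, thickness in mg/cm^2)]."""
          exit_energies = []
          for material, thickness in stack:
              x = 0.0
              while x < thickness and energy_mev > 0.0:
                  dx = min(step, thickness - x)
                  energy_mev -= stopping_power(energy_mev, material) * dx
                  x += dx
              energy_mev = max(energy_mev, 0.0)
              exit_energies.append(energy_mev)
          return exit_energies

      if __name__ == "__main__":
          stack = [("Al", 20.0), ("U", 50.0), ("Al", 20.0)]   # toy thicknesses in mg/cm^2
          print(degrade(30.0, stack))                          # 30 MeV projectile through the stack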

  9. Building an organic computing device with multiple interconnected brains

    OpenAIRE

    Pais-Vieira, Miguel; Chiuffa, Gabriela; Lebedev, Mikhail; Yadav, Amol; Nicolelis, Miguel A. L.

    2015-01-01

    Recently, we proposed that Brainets, i.e. networks formed by multiple animal brains, cooperating and exchanging information in real time through direct brain-to-brain interfaces, could provide the core of a new type of computing device: an organic computer. Here, we describe the first experimental demonstration of such a Brainet, built by interconnecting four adult rat brains. Brainets worked by concurrently recording the extracellular electrical activity generated by populations of cortical ...

  10. Multiple network alignment on quantum computers

    Science.gov (United States)

    Daskin, Anmer; Grama, Ananth; Kais, Sabre

    2014-12-01

    Comparative analyses of graph-structured datasets underlie diverse problems. Examples of these problems include identification of conserved functional components (biochemical interactions) across species, structural similarity of large biomolecules, and recurring patterns of interactions in social networks. A large class of such analysis methods quantifies the topological similarity of nodes across networks. The resulting correspondence of nodes across networks, also called node alignment, can be used to identify invariant subgraphs across the input graphs. Given graphs as input, alignment algorithms use topological information to assign a similarity score to each tuple of nodes, with one element (node) drawn from each of the input graphs. Nodes are considered similar if their neighbors are also similar. An alternate, equivalent view of these network alignment algorithms is to consider the Kronecker product of the input graphs and to identify high-ranked nodes in the Kronecker product graph. Conventional methods such as PageRank and HITS (Hypertext-Induced Topic Selection) can be used for this purpose. These methods typically require computation of the principal eigenvector of a suitably modified Kronecker product matrix of the input graphs. We adopt this alternate view of the problem to address the problem of multiple network alignment. Using the phase estimation algorithm, we show that the multiple network alignment problem can be efficiently solved on quantum computers. We characterize the accuracy and performance of our method and show that it can deliver exponential speedups over conventional (non-quantum) methods.
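
    The classical (non-quantum) view described above can be illustrated in a few lines of NumPy: form the Kronecker product of two small adjacency matrices and compute its principal eigenvector by power iteration; entry (i, j) of the reshaped vector scores the pairing of node i of the first graph with node j of the second. The toy graphs are assumptions, and practical methods modify the Kronecker matrix (for example with node-similarity priors) before this step.

      # Classical illustration: principal eigenvector of a Kronecker product graph.
      import numpy as np

      def principal_eigenvector(M, iters=200):
          v = np.ones(M.shape[0]) / M.shape[0]
          for _ in range(iters):
              v = M @ v
              v /= np.linalg.norm(v)
          return v

      # Two small graphs: a triangle with a pendant node, and a plain triangle.
      A = np.array([[0, 1, 1, 1],
                    [1, 0, 1, 0],
                    [1, 1, 0, 0],
                    [1, 0, 0, 0]], dtype=float)
      B = np.array([[0, 1, 1],
                    [1, 0, 1],
                    [1, 1, 0]], dtype=float)

      scores = principal_eigenvector(np.kron(A, B)).reshape(A.shape[0], B.shape[0])
      # scores[i, j] ranks the correspondence between node i of A and node j of B.
      print(np.round(scores, 3))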

  11. Strategies for Sharing Seismic Data Among Multiple Computer Platforms

    Science.gov (United States)

    Baker, L. M.; Fletcher, J. B.

    2001-12-01

    the user. Commercial software packages, such as MatLab, also have the ability to share data in their own formats across multiple computer platforms. Our Fortran applications can create plot files in Adobe PostScript, Illustrator, and Portable Document Format (PDF) formats. Vendor support for reading these files is readily available on multiple computer platforms. We will illustrate by example our strategies for sharing seismic data among our multiple computer platforms, and we will discuss our positive and negative experiences. We will include our solutions for handling the different byte ordering, floating-point formats, and text file "end-of-line" conventions on the various computer platforms we use (6 different operating systems on 5 processor architectures).

  12. Computing all hybridization networks for multiple binary phylogenetic input trees.

    Science.gov (United States)

    Albrecht, Benjamin

    2015-07-30

    The computation of phylogenetic trees on the same set of species that are based on different orthologous genes can lead to incongruent trees. One possible explanation for this behavior are interspecific hybridization events recombining genes of different species. An important approach to analyze such events is the computation of hybridization networks. This work presents the first algorithm computing the hybridization number as well as a set of representative hybridization networks for multiple binary phylogenetic input trees on the same set of taxa. To improve its practical runtime, we show how this algorithm can be parallelized. Moreover, we demonstrate the efficiency of the software Hybroscale, containing an implementation of our algorithm, by comparing it to PIRNv2.0, which is so far the best available software computing the exact hybridization number for multiple binary phylogenetic trees on the same set of taxa. The algorithm is part of the software Hybroscale, which was developed specifically for the investigation of hybridization networks including their computation and visualization. Hybroscale is freely available and runs on all three major operating systems. Our simulation study indicates that our approach is on average 100 times faster than PIRNv2.0. Moreover, we show how Hybroscale improves the interpretation of the reported hybridization networks by adding certain features to its graphical representation.

  13. Evaluation of Network Reliability for Computer Networks with Multiple Sources

    Directory of Open Access Journals (Sweden)

    Yi-Kuei Lin

    2012-01-01

    Full Text Available Evaluating the reliability of a network with multiple sources to multiple sinks is a critical issue from the perspective of quality management. Due to the unrealistic definition of paths of network models in previous literature, existing models are not appropriate for real-world computer networks such as the Taiwan Advanced Research and Education Network (TWAREN). This paper proposes a modified stochastic-flow network model to evaluate the network reliability of a practical computer network with multiple sources where data are transmitted through several light paths (LPs). Network reliability is defined as the probability of delivering a specified amount of data from the sources to the sink. It is taken as a performance index to measure the service level of TWAREN. This paper studies the network reliability of the international portion of TWAREN from two sources (Taipei and Hsinchu) to one sink (New York), which goes through submarine and land-surface cables between Taiwan and the United States.

  14. Forward-backward multiplicity correlations of target fragments in nucleus-emulsion collisions at a few hundred MeV/u

    International Nuclear Information System (INIS)

    Zhang Donghai; Chen Yanling; Wang Guorong; Li Wangdong; Wang Qing; Yao Jijie; Zhou Jianguo; Li Rong; Li Junsheng; Li Huiling

    2015-01-01

    The forward-backward multiplicities and correlations of target evaporated fragments (black track particles) and target recoiled protons (grey track particles) emitted from 150 A MeV 4He, 290 A MeV 12C, 400 A MeV 12C, 400 A MeV 20Ne and 500 A MeV 56Fe induced interactions with different types of nuclear emulsion targets are investigated. It is found that the forward and backward averaged multiplicities of grey, black and heavily ionized track particles increase with the increase of the target size. The averaged multiplicities of forward black track particles, backward black track particles, and backward grey track particles do not depend on the projectile size and energy, but the averaged multiplicity of forward grey track particles increases with an increase of projectile size and energy. The backward grey track particle multiplicity distribution follows an exponential decay law, and the decay constant decreases with an increase of target size. The backward-forward multiplicity correlations follow a linear law that is independent of the projectile size and energy, and a saturation effect is observed in some heavy target data sets. (authors)

  15. Sensitivity studies of the neutron multiplicity spectrum in the spallation of Pb targets

    International Nuclear Information System (INIS)

    Sinha, A.; Garg, S.B.; Srinivasan, M.

    1986-01-01

    The number of neutrons produced per incident proton in the spallation of Pb targets is of direct relevance to the design of accelerator breeders. The nuclear cascade initiated by high-energy protons in spallation targets is usually described by an intranuclear cascade evaporation (INCE) model. Even though this model describes various average nuclear properties of spallation targets fairly well, differential quantities such as energy spectra, angular spectra etc., are not reproduced within the limits of experimental uncertainty. One of the reasons for this is the uncertainty in the magnitude of the parameters involved in the model, notably the level density parameter B0, whose magnitude is quoted by different workers to be in the range of 8-20 MeV. The accuracy of B0 could be improved if we could experimentally determine a quantity which is much more sensitive to B0 than the average neutron yield. In this paper we discuss one such quantity, namely the neutron multiplicity spectrum (MS). We compute the MS due to the spallation of Pb targets of different sizes at proton energies of 1.5, 1.0 and 0.59 GeV using the Monte Carlo code HETC. It is noticed that for the 1.5 GeV proton case the probability P(ν) for leakage of ν neutrons for ν in the range of 60-65, changes by about 70% when B0 is varied from 8 to 20 MeV. The corresponding change in the average neutron yield is <20%. It is therefore suggested that an accurate measurement of the MS can serve as a useful tool to narrow down the range of uncertainty in the B0 parameter. (author)

  16. Feature-space assessment of electrical impedance tomography coregistered with computed tomography in detecting multiple contrast targets

    International Nuclear Information System (INIS)

    Krishnan, Kalpagam; Liu, Jeff; Kohli, Kirpal

    2014-01-01

    Purpose: Fusion of electrical impedance tomography (EIT) with computed tomography (CT) can be useful as a clinical tool for providing additional physiological information about tissues, but requires suitable fusion algorithms and validation procedures. This work explores the feasibility of fusing EIT and CT images using an algorithm for coregistration. The imaging performance is validated through feature space assessment on phantom contrast targets. Methods: EIT data were acquired by scanning a phantom using a circuit, configured for injecting current through 16 electrodes, placed around the phantom. A conductivity image of the phantom was obtained from the data using electrical impedance and diffuse optical tomography reconstruction software (EIDORS). A CT image of the phantom was also acquired. The EIT and CT images were fused using a region of interest (ROI) coregistration fusion algorithm. Phantom imaging experiments were carried out on objects of different contrasts, sizes, and positions. The conductive medium of the phantoms was made of a tissue-mimicking bolus material that is routinely used in clinical radiation therapy settings. To validate the imaging performance in detecting different contrasts, the ROI of the phantom was filled with distilled water and normal saline. Spatially separated cylindrical objects of different sizes were used for validating the imaging performance in multiple target detection. Analyses of the CT, EIT and the EIT/CT phantom images were carried out based on the variations of contrast, correlation, energy, and homogeneity, using a gray level co-occurrence matrix (GLCM). A reference image of the phantom was simulated using EIDORS, and the performances of the CT and EIT imaging systems were evaluated and compared against the performance of the EIT/CT system using various feature metrics, detectability, and structural similarity index measures. Results: In detecting distilled and normal saline water in bolus medium, EIT as a stand
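
    The texture-feature step can be sketched with scikit-image: compute a gray level co-occurrence matrix and the four properties named above (contrast, correlation, energy, homogeneity). The image below is a synthetic placeholder rather than EIT/CT data; recent scikit-image releases name the functions graycomatrix/graycoprops, while older releases use the "grey" spelling.

      # GLCM texture features on a synthetic placeholder image.
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(0)
      image = rng.integers(0, 8, size=(64, 64), dtype=np.uint8)   # 8 gray levels

      # Co-occurrence matrix for 1-pixel offsets in 4 directions, symmetric and normalized.
      glcm = graycomatrix(image, distances=[1],
                          angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                          levels=8, symmetric=True, normed=True)

      for prop in ("contrast", "correlation", "energy", "homogeneity"):
          print(prop, np.round(graycoprops(glcm, prop).mean(), 4))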

  17. Design and implementation of the modified signed digit multiplication routine on a ternary optical computer.

    Science.gov (United States)

    Xu, Qun; Wang, Xianchao; Xu, Chao

    2017-06-01

    Multiplication on traditional electronic computers suffers from limited calculating accuracy and long computation delays. To overcome these problems, the modified signed digit (MSD) multiplication routine is established based on the MSD number system and a carry-free adder. Its parallel algorithm and optimization techniques are also studied in detail. Exploiting the characteristics of a ternary optical computer, the structured data processor is designed especially for the multiplication routine. Several ternary optical operators are constructed to perform M transformations and summations in parallel, which accelerates the iterative process of multiplication. In particular, the routine allocates the data bits of the ternary optical processor based on the digits of the multiplication inputs, so the accuracy of the calculation results can always satisfy the users. Finally, the routine is verified by simulation experiments, and the results are in full compliance with expectations. Compared with an electronic computer, the MSD multiplication routine is not only good at dealing with large-value data and high-precision arithmetic, but also maintains lower power consumption and fewer calculation delays.
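
    The signed-digit idea can be illustrated without any optical hardware: rewrite the multiplier with digits in {-1, 0, 1} (non-adjacent form) and form the product as a sum of shifted, signed copies of the multiplicand. The Python sketch below does exactly that and checks the result against ordinary integer multiplication; the ternary optical carry-free adder and the M transformations of the routine are not reproduced here.

      # Signed-digit (MSD-style) multiplication sketch, verified against int multiply.
      def to_signed_digits(n):
          """Non-adjacent form of a positive integer, least significant digit first."""
          digits = []
          while n != 0:
              if n % 2:
                  d = 2 - (n % 4)      # 1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
                  n -= d
              else:
                  d = 0
              digits.append(d)
              n //= 2
          return digits

      def msd_multiply(a, b):
          """Shift-and-add multiplication driven by the signed-digit form of b."""
          return sum(d * (a << i) for i, d in enumerate(to_signed_digits(b)))

      if __name__ == "__main__":
          a, b = 123456789, 987654321
          print(to_signed_digits(13))              # [1, 0, -1, 0, 1], i.e. 13 = 16 - 4 + 1
          assert msd_multiply(a, b) == a * b
          print(msd_multiply(a, b))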

  18. Random Scenario Generation for a Multiple Target Tracking Environment Evaluation

    DEFF Research Database (Denmark)

    Hussain, Dil Muhammad Akbar

    2006-01-01

    , which were normally crossing targets, was to test the efficiency of the track splitting algorithm for different situations. However, this approach only gives a measure of performance for a specific, possibly unrealistic, scenario, and it was felt appropriate to develop procedures that would enable a more general performance assessment. Therefore, a random target motion scenario is adopted. Its implementation, in particular for testing the track splitting algorithm using Kalman filters, is used, and a couple of tracking performance parameters are computed to investigate such random scenarios.
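
    A minimal generator of such random target-motion scenarios is sketched below: each target follows nearly constant-velocity motion with a random initial state and small process noise, which could then be fed to a tracking filter. All parameters (area size, speeds, noise levels) are arbitrary illustrative choices, not values from the paper.

      # Random scenario generator for multiple-target tracking experiments.
      import numpy as np

      def random_scenario(n_targets=4, n_steps=50, dt=1.0, area=1000.0,
                          max_speed=15.0, accel_noise=0.5, seed=None):
          rng = np.random.default_rng(seed)
          F = np.array([[1, 0, dt, 0],     # state: [x, y, vx, vy]
                        [0, 1, 0, dt],
                        [0, 0, 1, 0],
                        [0, 0, 0, 1]])
          tracks = []
          for _ in range(n_targets):
              state = np.concatenate([rng.uniform(0, area, 2),
                                      rng.uniform(-max_speed, max_speed, 2)])
              path = []
              for _ in range(n_steps):
                  noise = np.concatenate([np.zeros(2), rng.normal(0, accel_noise, 2) * dt])
                  state = F @ state + noise
                  path.append(state.copy())
              tracks.append(np.array(path))
          return tracks

      if __name__ == "__main__":
          for i, track in enumerate(random_scenario(seed=7)):
              print(f"target {i}: start {np.round(track[0, :2], 1)}, end {np.round(track[-1, :2], 1)}")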

  19. Qweak Data Analysis for Target Modeling Using Computational Fluid Dynamics

    Science.gov (United States)

    Moore, Michael; Covrig, Silviu

    2015-04-01

    The 2.5 kW liquid hydrogen (LH2) target used in the Qweak parity violation experiment is the highest power LH2 target in the world and the first to be designed with Computational Fluid Dynamics (CFD) at Jefferson Lab. The Qweak experiment determined the weak charge of the proton by measuring the parity-violating elastic scattering asymmetry of longitudinally polarized electrons from unpolarized liquid hydrogen at small momentum transfer (Q^2 = 0.025 GeV^2). This target met its design goals, and the CFD simulations have been benchmarked with the Qweak target data. This work is an essential ingredient in future designs of very high power, low-noise targets like MOLLER (5 kW, target noise asymmetry contribution < 25 ppm) and MESA (4.5 kW).

  20. Targeting the Pim kinases in multiple myeloma.

    LENUS (Irish Health Repository)

    Keane, N A

    2015-07-17

    Multiple myeloma (MM) is a plasma cell malignancy that remains incurable. Novel treatment strategies to improve survival are urgently required. The Pims are a small family of serine/threonine kinases with increased expression across the hematological malignancies. Pim-2 shows highest expression in MM and constitutes a promising therapeutic target. It is upregulated by the bone marrow microenvironment to mediate proliferation and promote MM survival. Pim-2 also has a key role in the bone destruction typically seen in MM. Additional putative roles of the Pim kinases in MM include trafficking of malignant cells, promoting oncogenic signaling in the hypoxic bone marrow microenvironment and mediating resistance to therapy. A number of Pim inhibitors are now under development with lead compounds entering the clinic. The ATP-competitive Pim inhibitor LGH447 has recently been reported to have single agent activity in MM. It is anticipated that Pim inhibition will be of clinical benefit in combination with standard treatments and/or with novel drugs targeting other survival pathways in MM.

  1. A computer program for determining multiplicities of powder reflexions

    International Nuclear Information System (INIS)

    Rouse, K.D.; Cooper, M.J.

    1977-01-01

    A computer program has been written which determines the multiplicity factors for a given set of X-ray or neutron powder diffraction reflexions for crystals of any space group. The value of the multiplicity for each reflexion is determined from a look-up table which is indexed by the symmetry type, determined directly from the space-group number, and the reflexion type, determined from the Miller indices. There are no restrictions on the choice of indices which are used to specify the reflexions. (Auth.)
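
    As an illustration of what such a look-up table encodes, for the holohedral cubic Laue class (m-3m) the multiplicity of a reflexion (h k l) can be obtained directly by counting the distinct index triples generated by all permutations and sign changes of (h, k, l); the Python sketch below does this. The program itself instead uses tables indexed by symmetry type and reflexion type and is valid for every space group.

      # Multiplicity factors for the holohedral cubic Laue class (m-3m) by counting.
      from itertools import permutations, product

      def cubic_multiplicity(h, k, l):
          equivalents = set()
          for perm in permutations((h, k, l)):
              for signs in product((1, -1), repeat=3):
                  equivalents.add(tuple(p * s for p, s in zip(perm, signs)))
          return len(equivalents)

      if __name__ == "__main__":
          for hkl in [(1, 0, 0), (1, 1, 0), (1, 1, 1), (2, 1, 0), (3, 2, 1)]:
              print(hkl, cubic_multiplicity(*hkl))
          # Expected for m-3m: 100 -> 6, 110 -> 12, 111 -> 8, 210 -> 24, 321 -> 48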

  2. Method for Multiple Targets Tracking in Cognitive Radar Based on Compressed Sensing

    Directory of Open Access Journals (Sweden)

    Yang Jun

    2016-02-01

    Full Text Available A multiple-target cognitive radar tracking method based on Compressed Sensing (CS) is proposed. In this method, the theory of CS is introduced into the cognitive radar tracking process in a multiple-target scenario. The echo signal is sparsely expressed. The designs of the sparse matrix and the measurement matrix are accomplished by expressing the echo signal sparsely, and subsequently, the reconstruction of the measurement signal under the down-sampling condition is realized. On the receiving end, considering that the traditional particle filter suffers from degeneracy and requires a large number of particles, the particle swarm optimization particle filter is used to track the targets. On the transmitting end, the Posterior Cramér-Rao Bound (PCRB) on the tracking accuracy is deduced, and the radar waveform parameters are further cognitively designed using the PCRB. Simulation results show that the proposed method can not only reduce the data quantity, but also provide better tracking performance compared with the traditional method.

  3. A General Cross-Layer Cloud Scheduling Framework for Multiple IoT Computer Tasks.

    Science.gov (United States)

    Wu, Guanlin; Bao, Weidong; Zhu, Xiaomin; Zhang, Xiongtao

    2018-05-23

    The diversity of IoT services and applications brings enormous challenges to improving the performance of multiple computer tasks' scheduling in cross-layer cloud computing systems. Unfortunately, the commonly-employed frameworks fail to adapt to the new patterns on the cross-layer cloud. To solve this issue, we design a new computer task scheduling framework for multiple IoT services in cross-layer cloud computing systems. Specifically, we first analyze the features of the cross-layer cloud and computer tasks. Then, we design the scheduling framework based on the analysis and present detailed models to illustrate the procedures of using the framework. With the proposed framework, the IoT services deployed in cross-layer cloud computing systems can dynamically select suitable algorithms and use resources more effectively to finish computer tasks with different objectives. Finally, the algorithms are given based on the framework, and extensive experiments are also given to validate its effectiveness, as well as its superiority.

  4. Distributed Cooperative Search Control Method of Multiple UAVs for Moving Target

    Directory of Open Access Journals (Sweden)

    Chang-jian Ru

    2015-01-01

    Full Text Available To reduce the impact of uncertainties caused by unknown motion parameters on the search plan for moving targets and to improve the efficiency of UAV searching, a novel distributed multi-UAV cooperative search control method for moving targets is proposed in this paper. Based on the detection results of onboard sensors, the target probability map is updated using Bayesian theory. A Gaussian distribution of the target transition probability density function is introduced to calculate the prediction probability of moving target existence, and then the target probability map can be further updated in real time. A performance index function combining target cost, environment cost, and cooperative cost is constructed, and the cooperative searching problem can be transformed into a central optimization problem. To improve computational efficiency, the distributed model predictive control method is presented, and thus the control command of each UAV can be obtained. The simulation results verify that the proposed method can better avoid blind searching by the UAVs and effectively improve the overall efficiency of the team.
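
    The probability-map recursion described above can be sketched in a few lines: a prediction step spreads the map with a Gaussian transition kernel, and a Bayesian update incorporates each observed cell's detection result. The grid size, sensor footprint, and detection/false-alarm probabilities below are invented for illustration and assume a single target.

      # Target probability map: Gaussian prediction + single-target Bayes update.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      P_D, P_F = 0.9, 0.05                      # detection / false-alarm probabilities (assumed)

      def predict(prob_map, sigma=1.0):
          """Spread the map with a Gaussian target-transition kernel, then renormalize."""
          p = gaussian_filter(prob_map, sigma=sigma, mode="constant")
          return p / p.sum()

      def update(prob_map, observed_cells, detections):
          """Bayes update: each observed cell reports detection or no detection."""
          likelihood = np.ones_like(prob_map)    # P(reports | target in this cell)
          for (i, j), detected in zip(observed_cells, detections):
              inside = P_D if detected else 1.0 - P_D     # target really is in (i, j)
              outside = P_F if detected else 1.0 - P_F    # target is elsewhere
              likelihood *= outside
              likelihood[i, j] *= inside / outside
          posterior = likelihood * prob_map
          return posterior / posterior.sum()

      if __name__ == "__main__":
          grid = np.full((20, 20), 1.0 / 400)             # uniform prior over a 20x20 area
          footprint = [(r, c) for r in range(5, 8) for c in range(5, 8)]
          for step in range(3):
              grid = predict(grid, sigma=0.8)
              grid = update(grid, footprint, detections=[False] * len(footprint))
          print("probability mass left in the searched patch:",
                round(sum(grid[i, j] for i, j in footprint), 4))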

  5. Performance of Cloud Computing Centers with Multiple Priority Classes

    NARCIS (Netherlands)

    Ellens, W.; Zivkovic, Miroslav; Akkerboom, J.; Litjens, R.; van den Berg, Hans Leo

    In this paper we consider the general problem of resource provisioning within cloud computing. We analyze the problem of how to allocate resources to different clients such that the service level agreements (SLAs) for all of these clients are met. A model with multiple service request classes

  6. MULGRES: a computer program for stepwise multiple regression analysis

    Science.gov (United States)

    A. Jeff Martin

    1971-01-01

    MULGRES is a computer program source deck that is designed for multiple regression analysis employing the technique of stepwise deletion in the search for the most significant variables. The features of the program, along with inputs and outputs, are briefly described, with a note on machine compatibility.
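
    MULGRES itself is a Fortran source deck; a compact modern equivalent of the stepwise-deletion idea is sketched below: fit an ordinary least-squares model and repeatedly drop the predictor with the largest p-value until every remaining predictor is significant. The data and the 0.05 threshold are illustrative assumptions.

      # Backward elimination (stepwise deletion) with statsmodels OLS.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      def stepwise_deletion(X, y, names, alpha=0.05):
          names = list(names)
          while names:
              model = sm.OLS(y, sm.add_constant(X[names])).fit()
              pvals = model.pvalues.drop("const")
              worst = pvals.idxmax()
              if pvals[worst] <= alpha:
                  return model, names
              names.remove(worst)               # delete the least significant variable
          return None, []

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n = 200
          X = pd.DataFrame(rng.normal(size=(n, 4)), columns=["x1", "x2", "x3", "x4"])
          y = 2.0 * X["x1"] - 1.5 * X["x3"] + rng.normal(scale=1.0, size=n)   # x2, x4 irrelevant
          model, kept = stepwise_deletion(X, y, X.columns)
          print("kept variables:", kept)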

  7. Exploiting off-targeting in guide-RNAs for CRISPR systems for simultaneous editing of multiple genes

    DEFF Research Database (Denmark)

    Ferreira, Raphael; Gatto, Francesco; Nielsen, Jens

    2017-01-01

    Bioinformatics tools to design guide-RNAs (gRNAs) in Clustered Regularly Interspaced Short Palindromic Repeats systems mostly focused on minimizing off-targeting to enhance efficacy of genome editing. However, there are circumstances in which off-targeting might be desirable to target multiple ge...

  8. Complex matrix multiplication operations with data pre-conditioning in a high performance computing architecture

    Science.gov (United States)

    Eichenberger, Alexandre E; Gschwind, Michael K; Gunnels, John A

    2014-02-11

    Mechanisms for performing a complex matrix multiplication operation are provided. A vector load operation is performed to load a first vector operand of the complex matrix multiplication operation to a first target vector register. The first vector operand comprises a real and imaginary part of a first complex vector value. A complex load and splat operation is performed to load a second complex vector value of a second vector operand and replicate the second complex vector value within a second target vector register. The second complex vector value has a real and imaginary part. A cross multiply add operation is performed on elements of the first target vector register and elements of the second target vector register to generate a partial product of the complex matrix multiplication operation. The partial product is accumulated with other partial products and a resulting accumulated partial product is stored in a result vector register.
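
    The data movement described above can be emulated in NumPy: the first operand is an interleaved [re, im, re, im, ...] vector, the second complex value is loaded and splatted as [c, d, c, d, ...], and a cross multiply-add pass combines them into the interleaved partial product, verified against the ordinary complex product. This mimics the vector pattern only; it does not reproduce the SIMD instructions of any particular architecture.

      # Emulation of complex load-and-splat plus cross multiply-add.
      import numpy as np

      def complex_load_and_splat(c_re, c_im, n_pairs):
          return np.tile([c_re, c_im], n_pairs)            # [c, d, c, d, ...]

      def cross_multiply_add(vec_a, splat, acc):
          """acc += interleaved complex product of vec_a (pairs) with the splatted value."""
          a_re, a_im = vec_a[0::2], vec_a[1::2]
          c_re, c_im = splat[0::2], splat[1::2]
          out = acc.copy()
          out[0::2] += a_re * c_re - a_im * c_im           # real parts: ac - bd
          out[1::2] += a_re * c_im + a_im * c_re           # imaginary parts: ad + bc
          return out

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          a = rng.normal(size=4) + 1j * rng.normal(size=4)  # a column of complex values
          c = complex(1.5, -2.0)                            # one element of the second matrix

          vec_a = np.empty(8)
          vec_a[0::2], vec_a[1::2] = a.real, a.imag
          splat = complex_load_and_splat(c.real, c.imag, n_pairs=4)
          acc = cross_multiply_add(vec_a, splat, np.zeros(8))

          expected = a * c                                  # reference complex product
          assert np.allclose(acc[0::2], expected.real) and np.allclose(acc[1::2], expected.imag)
          print(np.round(acc, 3))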

  9. Computer experiments of the time-sequence of individual steps in multiple Coulomb-excitation

    International Nuclear Information System (INIS)

    Boer, J. de; Dannhaueser, G.

    1982-01-01

    The way in which the multiple E2 steps in the Coulomb-excitation of a rotational band of a nucleus follow one another is elucidated for selected examples using semiclassical computer experiments. The role a given transition plays for the excitation of a given final state is measured by a quantity named "importance function". It is found that these functions, calculated for the highest rotational state, peak at times forming a sequence for the successive E2 transitions starting from the ground state. This sequential behaviour is used to approximately account for the effects on the projectile orbit of the sequential transfer of excitation energy and angular momentum from projectile to target. These orbits lead to similar deflection functions and cross sections as those obtained from a symmetrization procedure approximately accounting for the transfer of angular momentum and energy. (Auth.)

  10. Estimating Accurate Target Coordinates with Magnetic Resonance Images by Using Multiple Phase-Encoding Directions during Acquisition.

    Science.gov (United States)

    Kim, Minsoo; Jung, Na Young; Park, Chang Kyu; Chang, Won Seok; Jung, Hyun Ho; Chang, Jin Woo

    2018-06-01

    Stereotactic procedures are image guided, often using magnetic resonance (MR) images limited by image distortion, which may influence targets for stereotactic procedures. The aim of this work was to assess methods of identifying target coordinates for stereotactic procedures with MR in multiple phase-encoding directions. In 30 patients undergoing deep brain stimulation, we acquired 5 image sets: stereotactic brain computed tomography (CT), T2-weighted images (T2WI), and T1WI in both right-to-left (RL) and anterior-to-posterior (AP) phase-encoding directions. Using CT coordinates as a reference, we analyzed anterior commissure and posterior commissure coordinates to identify any distortion relating to phase-encoding direction. Compared with CT coordinates, RL-directed images had more positive x-axis values (0.51 mm in T1WI, 0.58 mm in T2WI). AP-directed images had more negative y-axis values (0.44 mm in T1WI, 0.59 mm in T2WI). We adopted 2 methods to predict CT coordinates with MR image sets: parallel translation and selective choice of axes according to phase-encoding direction. Both were equally effective at predicting CT coordinates using only MR; however, the latter may be easier to use in clinical settings. Acquiring MR in multiple phase-encoding directions and selecting axes according to the phase-encoding direction allows identification of more accurate coordinates for stereotactic procedures. © 2018 S. Karger AG, Basel.
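    The "selective choice of axes" strategy can be illustrated with a few lines of code. The rule below is an illustrative reading of the abstract (distortion appears mainly along the phase-encoding direction, so each coordinate is taken from the image encoded along the other axis); the z-axis handling and the numbers are assumptions, not the paper's protocol.

      def predict_ct_coords(rl_img_xyz, ap_img_xyz):
          """Combine coordinates from RL- and AP-encoded MR images:
          take x from the AP-encoded image (RL encoding distorts x),
          y from the RL-encoded image (AP encoding distorts y),
          and average z (illustrative choice)."""
          x = ap_img_xyz[0]
          y = rl_img_xyz[1]
          z = 0.5 * (rl_img_xyz[2] + ap_img_xyz[2])
          return (x, y, z)

      # toy example with ~0.5 mm shifts along each phase-encoding direction
      rl = (10.5, 20.0, 30.0)   # x over-estimated on the RL-encoded image
      ap = (10.0, 20.6, 30.0)   # y over-estimated on the AP-encoded image
      print(predict_ct_coords(rl, ap))   # -> (10.0, 20.0, 30.0)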

  11. Computer-aided target tracking in motion analysis studies

    Science.gov (United States)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  12. Getting satisfied with "satisfaction of search": How to measure errors during multiple-target visual search.

    Science.gov (United States)

    Biggs, Adam T

    2017-07-01

    Visual search studies are common in cognitive psychology, and the results generally focus upon accuracy, response times, or both. Most research has focused upon search scenarios where no more than 1 target will be present for any single trial. However, if multiple targets can be present on a single trial, it introduces an additional source of error because the found target can interfere with subsequent search performance. These errors have been studied thoroughly in radiology for decades, although their emphasis in cognitive psychology studies has been more recent. One particular issue with multiple-target search is that these subsequent search errors (i.e., specific errors which occur following a found target) are measured differently by different studies. There is currently no guidance as to which measurement method is best or what impact different measurement methods could have upon various results and conclusions. The current investigation provides two efforts to address these issues. First, the existing literature is reviewed to clarify the appropriate scenarios where subsequent search errors could be observed. Second, several different measurement methods are used with several existing datasets to contrast and compare how each method would have affected the results and conclusions of those studies. The evidence is then used to provide appropriate guidelines for measuring multiple-target search errors in future studies.

  13. SAR Target Recognition Based on Multi-feature Multiple Representation Classifier Fusion

    Directory of Open Access Journals (Sweden)

    Zhang Xinzheng

    2017-10-01

    Full Text Available In this paper, we present a Synthetic Aperture Radar (SAR) image target recognition algorithm based on multi-feature multiple representation learning classifier fusion. First, it extracts three features from the SAR images, namely principal component analysis, wavelet transform, and Two-Dimensional Slice Zernike Moments (2DSZM) features. Second, we harness the sparse representation classifier and the cooperative representation classifier with the above-mentioned features to get six predictive labels. Finally, we adopt classifier fusion to obtain the final recognition decision. We researched three different classifier fusion algorithms in our experiments, and the results demonstrate that using Bayesian decision fusion gives the best recognition performance. The method based on multi-feature multiple representation learning classifier fusion integrates the discrimination of multi-features and combines the sparse and cooperative representation classification performance to gain complementary advantages and to improve recognition accuracy. The experiments are based on the Moving and Stationary Target Acquisition and Recognition (MSTAR) database, and they demonstrate the effectiveness of the proposed approach.

  14. A multiple sampling ionization chamber for the External Target Facility

    International Nuclear Information System (INIS)

    Zhang, X.H.; Tang, S.W.; Ma, P.; Lu, C.G.; Yang, H.R.; Wang, S.T.; Yu, Y.H.; Yue, K.; Fang, F.; Yan, D.; Zhou, Y.; Wang, Z.M.; Sun, Y.; Sun, Z.Y.; Duan, L.M.; Sun, B.H.

    2015-01-01

    A multiple sampling ionization chamber used as a particle identification device for high energy heavy ions has been developed for the External Target Facility. The performance of this detector was tested with a 239Pu α source and RI beams. A Z resolution (FWHM) of 0.4–0.6 was achieved for nuclear fragments of 18O at 400 AMeV

  15. Modern Approaches to the Computation of the Probability of Target Detection in Cluttered Environments

    Science.gov (United States)

    Meitzler, Thomas J.

    The field of computer vision interacts with fields such as psychology, vision research, machine vision, psychophysics, mathematics, physics, and computer science. The focus of this thesis is new algorithms and methods for the computation of the probability of detection (Pd) of a target in a cluttered scene. The scene can be either a natural visual scene such as one sees with the naked eye (visual), or a scene displayed on a monitor with the help of infrared sensors. The relative clutter and the temperature difference between the target and background (ΔT) are defined and then used to calculate a relative signal-to-clutter ratio (SCR) from which the Pd is calculated for a target in a cluttered scene. It is shown how this definition can include many previous definitions of clutter and ΔT. Next, fuzzy and neural-fuzzy techniques are used to calculate the Pd and it is shown how these methods can give results that have a good correlation with experiment. The experimental design for actually measuring the Pd of a target by observers is described. Finally, wavelets are applied to the calculation of clutter and it is shown how this new definition of clutter based on wavelets can be used to compute the Pd of a target.

  16. Computer optimization of cutting yield from multiple ripped boards

    Science.gov (United States)

    A.R. Stern; K.A. McDonald

    1978-01-01

    RIPYLD is a computer program that optimizes the cutting yield from multiple-ripped boards. Decisions are based on automatically collected defect information, cutting bill requirements, and sawing variables. The yield of clear cuttings from a board is calculated for every possible permutation of specified rip widths and both the maximum and minimum percent yield...
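    The core idea, evaluating the clear-cutting yield for every permutation of the specified rip widths, can be sketched as below. This is a one-dimensional toy model, not RIPYLD: defects are reduced to point positions across the board width, and the width values are invented.

      from itertools import permutations

      def clear_yield(rip_widths, board_width, defect_positions):
          """Fraction of the board width recovered in defect-free strips for one
          ordering of rip widths laid edge to edge across the board."""
          clear, x = 0.0, 0.0
          for w in rip_widths:
              if x + w > board_width:
                  break                                   # strip falls off the board
              if not any(x <= d < x + w for d in defect_positions):
                  clear += w                              # strip contains no defect
              x += w
          return clear / board_width

      def best_rip_sequence(rip_widths, board_width, defect_positions):
          """Brute-force search over every permutation of the specified rip widths."""
          return max(permutations(rip_widths),
                     key=lambda p: clear_yield(p, board_width, defect_positions))

      # toy board: 12 units wide, defects 3.5 and 9.2 units from one edge
      widths = (2, 3, 3, 4)
      best = best_rip_sequence(widths, 12.0, [3.5, 9.2])
      print(best, clear_yield(best, 12.0, [3.5, 9.2]))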

  17. Optimal assignment of multiple utilities in heat exchange networks

    International Nuclear Information System (INIS)

    Salama, A.I.A.

    2009-01-01

    Existing numerical geometry-based techniques, developed by [A.I.A. Salama, Numerical techniques for determining heat energy targets in pinch analysis, Computers and Chemical Engineering 29 (2005) 1861-1866; A.I.A. Salama, Determination of the optimal heat energy targets in heat pinch analysis using a geometry-based approach, Computers and Chemical Engineering 30 (2006) 758-764], have been extended to optimally assign multiple utilities in a heat exchange network (HEN). These techniques utilize the horizontal shift between the cold composite curve (CC) and the stationary hot CC to determine the HEN optimal energy targets, the grand composite curve (GCC), and the complement grand composite curve (CGCC). The numerical technique developed in this paper is direct and simultaneously determines the optimal heat-energy targets and optimally assigns multiple utilities, as compared with an existing technique based on sequential assignment of multiple utilities. The technique starts by arranging the HEN stream and target temperatures in ascending order; the resulting set is labelled T. Furthermore, the temperatures at which multiple utilities are introduced are arranged in ascending order and labelled Tic and Tih for the cold and hot sides, respectively. The graphical presentation of the results is facilitated by inserting, at each multiple-utility temperature, a perturbed temperature equal to the insertion temperature minus a small perturbation. Furthermore, using the heat exchanger network (HEN) minimum temperature-differential approach (ΔTmin) and stream heat-capacity flow rates, the presentation is facilitated by using the conventional temperature shift of the HEN CCs. The set of temperature-shifted stream and target temperatures and perturbed temperatures in the overlap range between the CCs is labelled Tol. Using Tol, a simple formula employing enthalpy-flow differences between the hot composite curve CCh and the cold composite curve CCc is

  18. Hypergraph partitioning implementation for parallelizing matrix-vector multiplication using CUDA GPU-based parallel computing

    Science.gov (United States)

    Murni, Bustamam, A.; Ernastuti, Handhika, T.; Kerami, D.

    2017-07-01

    Calculation of the matrix-vector multiplication in the real-world problems often involves large matrix with arbitrary size. Therefore, parallelization is needed to speed up the calculation process that usually takes a long time. Graph partitioning techniques that have been discussed in the previous studies cannot be used to complete the parallelized calculation of matrix-vector multiplication with arbitrary size. This is due to the assumption of graph partitioning techniques that can only solve the square and symmetric matrix. Hypergraph partitioning techniques will overcome the shortcomings of the graph partitioning technique. This paper addresses the efficient parallelization of matrix-vector multiplication through hypergraph partitioning techniques using CUDA GPU-based parallel computing. CUDA (compute unified device architecture) is a parallel computing platform and programming model that was created by NVIDIA and implemented by the GPU (graphics processing unit).

  19. Detection and localization of multiple short range targets using FMCW radar signal

    KAUST Repository

    Jardak, Seifallah

    2016-07-26

    In this paper, a 24 GHz frequency-modulated continuous wave radar is used to detect and localize both stationary and moving targets. Depending on the application, the implemented software offers different modes of operation. For example, it can simply output raw data samples for advanced offline processing or directly carry out a two dimensional fast Fourier transform to estimate the location and velocity of multiple targets. To suppress clutter and detect only moving targets, two methods based on the background reduction and the slow time processing techniques are implemented. A trade-off between the two methods is presented based on their performance and the required processing time. © 2016 IEEE.
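    The two dimensional fast Fourier transform mentioned above is the standard way to turn raw FMCW beat samples into a range-Doppler map. The sketch below is generic and not tied to the authors' 24 GHz implementation; the array layout (chirps by fast-time samples), the simulated target and the noise level are assumptions.

      import numpy as np

      def range_doppler_map(raw):
          """2-D FFT of FMCW beat samples arranged as (chirps, samples per chirp):
          the fast-time FFT gives range bins, the slow-time FFT gives velocity bins."""
          rng_fft = np.fft.fft(raw, axis=1)                        # fast time -> range
          return np.abs(np.fft.fftshift(np.fft.fft(rng_fft, axis=0), axes=0))

      # toy data: one target producing a beat tone plus a Doppler phase ramp
      chirps, samples = 64, 256
      f_beat, f_dopp = 40 / samples, 5 / chirps                    # normalised frequencies
      n = np.arange(samples)
      m = np.arange(chirps)[:, None]
      raw = np.exp(2j * np.pi * (f_beat * n + f_dopp * m))
      raw += 0.1 * np.random.randn(chirps, samples)                # measurement noise
      rd = range_doppler_map(raw)
      print(np.unravel_index(rd.argmax(), rd.shape))               # (Doppler bin, range bin)

    Clutter suppression by background reduction would subtract a reference map (or the slow-time mean) from the raw samples before the transform, which is one way to keep only moving targets.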

  20. Evaluation of myocardial ischemia by multiple detector computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, Fabio Vieira, E-mail: rccury@me.com [Hospital do Coracao (HCor), Sao Paulo, SP (Brazil); Cury, Roberto Caldeira [Hospital Samaritano, Sao Paulo, SP (Brazil)

    2015-01-15

    For years, cardiovascular diseases have been the leading cause of death worldwide, bringing important social and economic consequences. Given this scenario, the search for a method capable of diagnosing coronary artery disease early and accurately has become ever more intense. Coronary computed tomography angiography is already widely established for the stratification of coronary artery disease, and, more recently, computed tomography myocardial perfusion imaging has been providing relevant information by correlating ischemia with the coronary anatomy. The objective of this review is to describe the evaluation of myocardial ischemia by multiple detector computed tomography. This study draws on controlled clinical trials that show the possibility of a single method identifying the atherosclerotic load, the presence of coronary artery luminal narrowing and possible myocardial ischemia, by means of a fast, practical and reliable method validated by a multicenter study. (author)

  1. Multi-Target Angle Tracking Algorithm for Bistatic Multiple-Input Multiple-Output (MIMO) Radar Based on the Elements of the Covariance Matrix

    Directory of Open Access Journals (Sweden)

    Zhengyan Zhang

    2018-03-01

    Full Text Available In this paper, we consider the problem of tracking the direction of arrivals (DOA) and the direction of departure (DOD) of multiple targets for bistatic multiple-input multiple-output (MIMO) radar. A high-precision tracking algorithm for target angle is proposed. First, the linear relationship between the covariance matrix difference and the angle difference of the adjacent moment was obtained through three approximate relations. Then, the proposed algorithm obtained the relationship between the elements in the covariance matrix difference. On this basis, the performance of the algorithm was improved by averaging the covariance matrix element. Finally, the least square method was used to estimate the DOD and DOA. The algorithm realized the automatic correlation of the angle and provided better performance when compared with the adaptive asymmetric joint diagonalization (AAJD) algorithm. The simulation results demonstrated the effectiveness of the proposed algorithm. The algorithm provides the technical support for the practical application of MIMO radar.

  2. Multi-Target Angle Tracking Algorithm for Bistatic Multiple-Input Multiple-Output (MIMO) Radar Based on the Elements of the Covariance Matrix.

    Science.gov (United States)

    Zhang, Zhengyan; Zhang, Jianyun; Zhou, Qingsong; Li, Xiaobo

    2018-03-07

    In this paper, we consider the problem of tracking the direction of arrivals (DOA) and the direction of departure (DOD) of multiple targets for bistatic multiple-input multiple-output (MIMO) radar. A high-precision tracking algorithm for target angle is proposed. First, the linear relationship between the covariance matrix difference and the angle difference of the adjacent moment was obtained through three approximate relations. Then, the proposed algorithm obtained the relationship between the elements in the covariance matrix difference. On this basis, the performance of the algorithm was improved by averaging the covariance matrix element. Finally, the least square method was used to estimate the DOD and DOA. The algorithm realized the automatic correlation of the angle and provided better performance when compared with the adaptive asymmetric joint diagonalization (AAJD) algorithm. The simulation results demonstrated the effectiveness of the proposed algorithm. The algorithm provides the technical support for the practical application of MIMO radar.

  3. Efficient computation of the joint sample frequency spectra for multiple populations.

    Science.gov (United States)

    Kamm, John A; Terhorst, Jonathan; Song, Yun S

    2017-01-01

    A wide range of studies in population genetics have employed the sample frequency spectrum (SFS), a summary statistic which describes the distribution of mutant alleles at a polymorphic site in a sample of DNA sequences and provides a highly efficient dimensional reduction of large-scale population genomic variation data. Recently, there has been much interest in analyzing the joint SFS data from multiple populations to infer parameters of complex demographic histories, including variable population sizes, population split times, migration rates, admixture proportions, and so on. SFS-based inference methods require accurate computation of the expected SFS under a given demographic model. Although much methodological progress has been made, existing methods suffer from numerical instability and high computational complexity when multiple populations are involved and the sample size is large. In this paper, we present new analytic formulas and algorithms that enable accurate, efficient computation of the expected joint SFS for thousands of individuals sampled from hundreds of populations related by a complex demographic model with arbitrary population size histories (including piecewise-exponential growth). Our results are implemented in a new software package called momi (MOran Models for Inference). Through an empirical study we demonstrate our improvements to numerical stability and computational complexity.

  4. X-ray luminescence computed tomography imaging via multiple intensity weighted narrow beam irradiation

    Science.gov (United States)

    Feng, Bo; Gao, Feng; Zhao, Huijuan; Zhang, Limin; Li, Jiao; Zhou, Zhongxing

    2018-02-01

    The purpose of this work is to introduce and study a novel x-ray beam irradiation pattern for X-ray Luminescence Computed Tomography (XLCT), termed multiple intensity-weighted narrow-beam irradiation. The proposed XLCT imaging method is studied through simulations of x-ray and diffuse light propagation. The emitted optical photons from x-ray excitable nanophosphors were collected by optical fiber bundles from the right-side surface of the phantom. Image reconstruction is based on simulated measurements from 6 or 12 angular projections using a 3- or 5-beam scanning mode. The proposed XLCT imaging method is compared against constant intensity weighted narrow-beam XLCT. From the reconstructed XLCT images, we found that the Dice similarity and the quantitative ratio of the targets improve to a certain degree. The results demonstrated that the proposed method can offer simultaneously high image quality and fast image acquisition.

  5. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back propagation neural network approach to compute the reliability of a system. Since states of failure occurrence are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks of the Markovian model for steady-state reliability computation and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back propagation neural network approach to compute reliability. • Markovian based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
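    The Markovian half of such a scheme reduces to solving for the stationary distribution of a state-transition model. The sketch below is a generic illustration rather than the paper's model: the three-state AGV chain, its failure and repair rates, and the availability measure are all assumptions.

      import numpy as np

      def steady_state(Q):
          """Stationary distribution pi of a continuous-time Markov chain: solve
          pi Q = 0 with sum(pi) = 1 (one balance equation is replaced by the
          normalisation condition)."""
          n = Q.shape[0]
          A = np.vstack([Q.T[:-1], np.ones(n)])
          b = np.zeros(n)
          b[-1] = 1.0
          return np.linalg.solve(A, b)

      # toy 3-state AGV model: 0 = working, 1 = degraded, 2 = failed
      lam1, lam2, mu1, mu2 = 0.02, 0.05, 0.5, 0.1      # failure / repair rates (1/h)
      Q = np.array([[-lam1,          lam1,   0.0],
                    [  mu1, -(mu1 + lam2),  lam2],
                    [  mu2,           0.0,  -mu2]])
      pi = steady_state(Q)
      print(pi, "availability =", pi[0] + pi[1])       # probability of not being failed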

  6. Two-dimensional multiplicity fluctuation analysis of target residues in nuclear collisions

    International Nuclear Information System (INIS)

    Dong-Hai, Zhang; Yao-Jie, Niu; Li-Chun, Wang; Wen-Jun, Yan; Li-Juan, Gao; Ming-Xing, Li; Li-Ping, Wu; Hui-Ling, Li; Jun-Sheng, Li

    2010-01-01

    Multiplicity fluctuations of the target residues emitted in interactions over a wide range of projectile energies from 500 A MeV to 60 A GeV are investigated in the framework of the two-dimensional scaled factorial moment methodology. Evidence of non-statistical multiplicity fluctuation is found in 16O–AgBr collisions at 60 A GeV, but not in 56Fe–AgBr collisions at 500 A MeV, 84Kr–AgBr collisions at 1.7 A GeV, 16O–AgBr collisions at 3.7 A GeV and 197Au–AgBr collisions at 10.7 A GeV. (nuclear physics)
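    For reference, a horizontally averaged two-dimensional scaled factorial moment F_q can be computed as below. This is a single-sample toy, not the analysis of the paper (which averages over many events); the uniform emission pattern and the normalisation convention are assumptions.

      import numpy as np

      def scaled_factorial_moment(x, y, q, m):
          """F_q for an m x m partition of the (x, y) plane, with both coordinates
          rescaled to [0, 1): mean of n(n-1)...(n-q+1) over cells, divided by <n>^q."""
          counts, _, _ = np.histogram2d(x, y, bins=m, range=[[0, 1], [0, 1]])
          n = counts.ravel()
          fact = np.ones_like(n)
          for k in range(q):
              fact = fact * (n - k)                 # falling factorial n(n-1)...(n-q+1)
          return np.mean(fact) / np.mean(n) ** q

      # toy sample: 500 uniformly emitted fragments, F_2 versus the number of bins
      rng = np.random.default_rng(1)
      x, y = rng.random(500), rng.random(500)
      for m in (2, 4, 8, 16):
          print(m, scaled_factorial_moment(x, y, q=2, m=m))

    A rise of F_q with the number of bins, beyond what purely statistical (Poissonian) fluctuations produce, is the signature of non-statistical fluctuations searched for in such analyses.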

  7. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

    Modeling large military targets is a challenge as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and more importantly various emergent behaviors, displayed by the targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources, while dealing with the inherent uncertainty, incompleteness and time criticality of real world information. To overcome these challenges, we present a probabilistic reasoning network based framework called complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, it provides unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we will describe how our framework can be used for modeling targets with a focus on methodologies for quantifying NCO performance metrics.

  8. Targeting Accuracy of Image-Guided Radiosurgery for Intracranial Lesions: A Comparison Across Multiple Linear Accelerator Platforms.

    Science.gov (United States)

    Huang, Yimei; Zhao, Bo; Chetty, Indrin J; Brown, Stephen; Gordon, James; Wen, Ning

    2016-04-01

    To evaluate the overall positioning accuracy of image-guided intracranial radiosurgery across multiple linear accelerator platforms. A computed tomography scan with a slice thickness of 1.0 mm was acquired of an anthropomorphic head phantom in a BrainLAB U-frame mask. The phantom was embedded with three 5-mm diameter tungsten ball bearings, simulating a central, a left, and an anterior cranial lesion. The ball bearings were positioned to radiation isocenter under ExacTrac X-ray or cone-beam computed tomography image guidance on 3 Linacs: (1) ExacTrac X-ray localization on a Novalis Tx; (2) cone-beam computed tomography localization on the Novalis Tx; (3) cone-beam computed tomography localization on a TrueBeam; and (4) cone-beam computed tomography localization on an Edge. Each ball bearing was positioned 5 times to the radiation isocenter with different initial setup error following the 4 image guidance procedures on the 3 Linacs, and the mean (µ) and one standard deviation (σ) of the residual error were compared. Averaged over all 3 ball bearing locations, the vector length of the residual setup error in mm (µ ± σ) was 0.6 ± 0.2, 1.0 ± 0.5, 0.2 ± 0.1, and 0.3 ± 0.1 on ExacTrac X-ray localization on a Novalis Tx, cone-beam computed tomography localization on the Novalis Tx, cone-beam computed tomography localization on a TrueBeam, and cone-beam computed tomography localization on an Edge, with their range in mm being 0.4 to 1.1, 0.4 to 1.9, 0.1 to 0.5, and 0.2 to 0.6, respectively. The congruence between imaging and radiation isocenters in mm was 0.6 ± 0.1, 0.7 ± 0.1, 0.3 ± 0.1, and 0.2 ± 0.1, for the 4 systems, respectively. Targeting accuracy comparable to frame-based stereotactic radiosurgery can be achieved with image-guided intracranial stereotactic radiosurgery treatment. © 2018 S. Karger AG, Basel.

  9. SPANDY: a Monte Carlo program for gas target scattering geometry

    International Nuclear Information System (INIS)

    Jarmie, N.; Jett, J.H.; Niethammer, A.C.

    1977-02-01

    A Monte Carlo computer program is presented that simulates a two-slit gas target scattering geometry. The program is useful in estimating effects due to finite geometry and multiple scattering in the target foil. Details of the program are presented and experience with a specific example is discussed

  10. Identification of Multiple Cryptococcal Fungicidal Drug Targets by Combined Gene Dosing and Drug Affinity Responsive Target Stability Screening

    Directory of Open Access Journals (Sweden)

    Yoon-Dong Park

    2016-08-01

    Full Text Available Cryptococcus neoformans is a pathogenic fungus that is responsible for up to half a million cases of meningitis globally, especially in immunocompromised individuals. Common fungistatic drugs, such as fluconazole, are less toxic for patients but have low efficacy for initial therapy of the disease. Effective therapy against the disease is provided by the fungicidal drug amphotericin B; however, due to its high toxicity and the difficulty in administering its intravenous formulation, it is imperative to find new therapies targeting the fungus. The antiparasitic drug bithionol has been recently identified as having potent fungicidal activity. In this study, we used a combined gene dosing and drug affinity responsive target stability (GD-DARTS) screen as well as protein modeling to identify a common drug binding site of bithionol within multiple NAD-dependent dehydrogenase drug targets. This combination genetic and proteomic method thus provides a powerful method for identifying novel fungicidal drug targets for further development.

  11. Prioritizing multiple therapeutic targets in parallel using automated DNA-encoded library screening

    Science.gov (United States)

    Machutta, Carl A.; Kollmann, Christopher S.; Lind, Kenneth E.; Bai, Xiaopeng; Chan, Pan F.; Huang, Jianzhong; Ballell, Lluis; Belyanskaya, Svetlana; Besra, Gurdyal S.; Barros-Aguirre, David; Bates, Robert H.; Centrella, Paolo A.; Chang, Sandy S.; Chai, Jing; Choudhry, Anthony E.; Coffin, Aaron; Davie, Christopher P.; Deng, Hongfeng; Deng, Jianghe; Ding, Yun; Dodson, Jason W.; Fosbenner, David T.; Gao, Enoch N.; Graham, Taylor L.; Graybill, Todd L.; Ingraham, Karen; Johnson, Walter P.; King, Bryan W.; Kwiatkowski, Christopher R.; Lelièvre, Joël; Li, Yue; Liu, Xiaorong; Lu, Quinn; Lehr, Ruth; Mendoza-Losana, Alfonso; Martin, John; McCloskey, Lynn; McCormick, Patti; O'Keefe, Heather P.; O'Keeffe, Thomas; Pao, Christina; Phelps, Christopher B.; Qi, Hongwei; Rafferty, Keith; Scavello, Genaro S.; Steiginga, Matt S.; Sundersingh, Flora S.; Sweitzer, Sharon M.; Szewczuk, Lawrence M.; Taylor, Amy; Toh, May Fern; Wang, Juan; Wang, Minghui; Wilkins, Devan J.; Xia, Bing; Yao, Gang; Zhang, Jean; Zhou, Jingye; Donahue, Christine P.; Messer, Jeffrey A.; Holmes, David; Arico-Muendel, Christopher C.; Pope, Andrew J.; Gross, Jeffrey W.; Evindar, Ghotas

    2017-07-01

    The identification and prioritization of chemically tractable therapeutic targets is a significant challenge in the discovery of new medicines. We have developed a novel method that rapidly screens multiple proteins in parallel using DNA-encoded library technology (ELT). Initial efforts were focused on the efficient discovery of antibacterial leads against 119 targets from Acinetobacter baumannii and Staphylococcus aureus. The success of this effort led to the hypothesis that the relative number of ELT binders alone could be used to assess the ligandability of large sets of proteins. This concept was further explored by screening 42 targets from Mycobacterium tuberculosis. Active chemical series for six targets from our initial effort as well as three chemotypes for DHFR from M. tuberculosis are reported. The findings demonstrate that parallel ELT selections can be used to assess ligandability and highlight opportunities for successful lead and tool discovery.

  12. Predicting the Noise of High Power Fluid Targets Using Computational Fluid Dynamics

    Science.gov (United States)

    Moore, Michael; Covrig Dusa, Silviu

    The 2.5 kW liquid hydrogen (LH2) target used in the Qweak parity violation experiment is the highest power LH2 target in the world and the first to be designed with Computational Fluid Dynamics (CFD) at Jefferson Lab. The Qweak experiment determined the weak charge of the proton by measuring the parity-violating elastic scattering asymmetry of longitudinally polarized electrons from unpolarized liquid hydrogen at small momentum transfer (Q2 = 0.025 GeV2). This target satisfied its design goals, and the CFD simulations were bench-marked with the Qweak target data. This work is an essential component in future designs of very high power low noise targets like MOLLER (5 kW, target noise asymmetry contribution < 25 ppm) and MESA (4.5 kW).

  13. Detection-Discrimination Method for Multiple Repeater False Targets Based on Radar Polarization Echoes

    Directory of Open Access Journals (Sweden)

    Z. W. ZONG

    2014-04-01

    Full Text Available Multiple repeat false targets (RFTs), created by the digital radio frequency memory (DRFM) system of a jammer, are widely used in practice to effectively exhaust the limited tracking and discrimination resources of a defence radar. In this paper, a common characteristic of the radar polarization echoes of multiple RFTs is used for target recognition. Based on the echoes from two receiving polarization channels, the instantaneous polarization ratio (IPR) is defined and its variance is derived by employing a Taylor series expansion. A detection-discrimination method is designed based on probability grids. Using data from a microwave anechoic chamber, the detection threshold of the method is confirmed. Theoretical analysis and simulations indicate that the method is valid and feasible. Furthermore, the estimation performance of the IPRs of RFTs under the influence of the signal-to-noise ratio (SNR) is also covered.

  14. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    Science.gov (United States)

    Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.

    1997-09-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
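    The min-max (bottleneck) flavour of this assignment problem can be prototyped with standard tools. The sketch below is an illustrative formulation, not the authors' code: it wraps SciPy's Hungarian solver in a binary search over a distance threshold, and the robot and slot coordinates are made up.

      import numpy as np
      from scipy.optimize import linear_sum_assignment
      from scipy.spatial.distance import cdist

      def bottleneck_assignment(robots, slots):
          """Assign each robot to one grid slot so that the maximum travel distance
          is minimised (binary search over a threshold, with a perfect-matching test
          via the Hungarian algorithm at each step)."""
          d = cdist(robots, slots)
          thresholds = np.unique(d)
          lo, hi = 0, len(thresholds) - 1
          while lo < hi:
              mid = (lo + hi) // 2
              forbidden = (d > thresholds[mid]).astype(float)
              r, c = linear_sum_assignment(forbidden)
              if forbidden[r, c].sum() == 0:        # perfect matching exists
                  hi = mid
              else:
                  lo = mid + 1
          forbidden = (d > thresholds[lo]).astype(float)
          r, c = linear_sum_assignment(d + 1e6 * forbidden)
          return c, d[r, c].max()

      # toy example: 5 robots at random positions, 5 surveillance slots on a line
      rng = np.random.default_rng(2)
      robots = rng.random((5, 2)) * 10
      slots = np.column_stack([np.arange(5) * 2.0, np.zeros(5)])
      assignment, worst = bottleneck_assignment(robots, slots)
      print(assignment, worst)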

  15. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    International Nuclear Information System (INIS)

    Kwok, K.S.; Driessen, B.J.; Phillips, C.A.; Tovey, C.A.

    1997-01-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. The authors wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which they must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. They have found these mobile robot problems to be a very interesting application of network optimization methods, and they expect this to be a fruitful area for future research

  16. Matrix-vector multiplication using digital partitioning for more accurate optical computing

    Science.gov (United States)

    Gary, C. K.

    1992-01-01

    Digital partitioning offers a flexible means of increasing the accuracy of an optical matrix-vector processor. This algorithm can be implemented with the same architecture required for a purely analog processor, which gives optical matrix-vector processors the ability to perform high-accuracy calculations at speeds comparable with or greater than electronic computers as well as the ability to perform analog operations at a much greater speed. Digital partitioning is compared with digital multiplication by analog convolution, residue number systems, and redundant number representation in terms of the size and the speed required for an equivalent throughput as well as in terms of the hardware requirements. Digital partitioning and digital multiplication by analog convolution are found to be the most efficient algorithms if coding time and hardware are considered, and the architecture for digital partitioning permits the use of analog computations to provide the greatest throughput for a single processor.
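    Digital partitioning itself is easy to illustrate in software: each matrix and vector entry is split into low-precision digits, the digit-level products (the part an analog optical processor would compute) are formed, and the exact result is rebuilt by weighting the partial products with powers of the base. The code below is a numerical illustration only; the base, digit count and integer ranges are assumptions.

      import numpy as np

      def digits(v, base, n_digits):
          """Split non-negative integers into n_digits base-`base` digits (LSB first)."""
          out = []
          for _ in range(n_digits):
              out.append(v % base)
              v = v // base
          return out

      def partitioned_matvec(A, x, base=4, n_digits=4):
          """Matrix-vector product rebuilt from low-precision partial products."""
          A_d = digits(A, base, n_digits)
          x_d = digits(x, base, n_digits)
          y = np.zeros(A.shape[0], dtype=np.int64)
          for i, Ai in enumerate(A_d):                 # low-accuracy "analog" products
              for j, xj in enumerate(x_d):
                  y += (base ** (i + j)) * (Ai @ xj)   # digital recombination
          return y

      rng = np.random.default_rng(3)
      A = rng.integers(0, 256, size=(3, 5))
      x = rng.integers(0, 256, size=5)
      print(partitioned_matvec(A, x), A @ x)           # identical results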

  17. Targeting CD38 with Daratumumab Monotherapy in Multiple Myeloma

    DEFF Research Database (Denmark)

    Lokhorst, Henk M; Plesner, Torben; Laubach, Jacob P

    2015-01-01

    BACKGROUND: Multiple myeloma cells uniformly overexpress CD38. We studied daratumumab, a CD38-targeting, human IgG1κ monoclonal antibody, in a phase 1-2 trial involving patients with relapsed myeloma or relapsed myeloma that was refractory to two or more prior lines of therapy. METHODS: In part 1...... interval [CI], 4.2 to 8.1), and 65% (95% CI, 28 to 86) of the patients who had a response did not have progression at 12 months. CONCLUSIONS: Daratumumab monotherapy had a favorable safety profile and encouraging efficacy in patients with heavily pretreated and refractory myeloma. (Funded by Janssen...

  18. Exploitation of Microdoppler and Multiple Scattering Phenomena for Radar Target Recognition

    Science.gov (United States)

    2006-08-24

    This report summarizes progress on the research grant "Exploitation of MicroDoppler and Multiple Scattering Phenomena for Radar Target Recognition" during the reporting period. It describes a methodology for modeling interferometric synthetic aperture radar (IFSAR) images of targets using ray-based EM techniques, and presents an IFSAR simulation methodology that simulates the interferogram in several collection modes, including the single antenna transmit mode, the ping-pong mode, and the repeat-pass mode.

  19. Category-based attentional guidance can operate in parallel for multiple target objects.

    Science.gov (United States)

    Jenkins, Michael; Grubert, Anna; Eimer, Martin

    2018-04-30

    The question whether the control of attention during visual search is always feature-based or can also be based on the category of objects remains unresolved. Here, we employed the N2pc component as an on-line marker for target selection processes to compare the efficiency of feature-based and category-based attentional guidance. Two successive displays containing pairs of real-world objects (line drawings of kitchen or clothing items) were separated by a 10 ms SOA. In Experiment 1, target objects were defined by their category. In Experiment 2, one specific visual object served as target (exemplar-based search). On different trials, targets appeared either in one or in both displays, and participants had to report the number of targets (one or two). Target N2pc components were larger and emerged earlier during exemplar-based search than during category-based search, demonstrating the superior efficiency of feature-based attentional guidance. On trials where target objects appeared in both displays, both targets elicited N2pc components that overlapped in time, suggesting that attention was allocated in parallel to these target objects. Critically, this was the case not only in the exemplar-based task, but also when targets were defined by their category. These results demonstrate that attention can be guided by object categories, and that this type of category-based attentional control can operate concurrently for multiple target objects. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. A novel sampling method for multiple multiscale targets from scattering amplitudes at a fixed frequency

    Science.gov (United States)

    Liu, Xiaodong

    2017-08-01

    A sampling method using the scattering amplitude is proposed for shape and location reconstruction in inverse acoustic scattering problems. Only matrix multiplication is involved in the computation, thus the novel sampling method is very easy and simple to implement. With the help of the factorization of the far field operator, we establish an inf-criterion for the characterization of the underlying scatterers. This result is then used to give a lower bound of the proposed indicator functional for sampling points inside the scatterers. For the sampling points outside the scatterers, we show that the indicator functional decays like the Bessel functions as the sampling point moves away from the boundary of the scatterers. We also show that the proposed indicator functional depends continuously on the scattering amplitude, which further implies that the novel sampling method is extremely stable with respect to errors in the data. Different from classical sampling methods such as the linear sampling method or the factorization method, from the numerical point of view the novel indicator takes its maximum near the boundary of the underlying target and decays like the Bessel functions as the sampling points move away from the boundary. The numerical simulations also show that the proposed sampling method can deal with the multiple multiscale case, even when the different components are close to each other.

  1. Analysis of multiple-foil XRL targets using x-ray spectroscopy

    International Nuclear Information System (INIS)

    Wang, J.; Boehly, T.; Yaakobi, B.; Epstein, R.; Meyerhofer, D.; Richardson, M.C.; Russotto, M.; Soures, J.M.

    1989-01-01

    The multiple-foil collisional excitation x-ray laser targets proposed by LLE have been studied spectroscopically. Using spatially resolved 3d-2p x-ray spectra, the authors compare the temperatures and densities obtained in single- and double-foil geometries. They use the ratio of the dipole transitions to the electric quadrupole transitions in the Neon-like species as a density diagnostic. A non-LTE average-ion atomic physics model is used to describe the ionization process and a relativistic atomic physics code is used for calculation of the level energies, populations, and gain calculations. They support their claims that the double foils provide higher densities and in some cases concave density profiles. The XUV spectra in the range of 20-300 A show the effect of target geometry and incident laser intensity on the lasing lines and the ionization balance

  2. A computational procedure for finding multiple solutions of convective heat transfer equations

    International Nuclear Information System (INIS)

    Mishra, S; DebRoy, T

    2005-01-01

    In recent years numerical solutions of the convective heat transfer equations have provided significant insight into the complex materials processing operations. However, these computational methods suffer from two major shortcomings. First, these procedures are designed to calculate temperature fields and cooling rates as output, and the unidirectional structure of these solutions precludes specification of these variables as input even when their desired values are known. Second, and more important, these procedures cannot determine multiple pathways or multiple sets of input variables to achieve a particular output from the convective heat transfer equations. Here we propose a new method that overcomes the aforementioned shortcomings of the commonly used solutions of the convective heat transfer equations. The procedure combines the conventional numerical solution methods with a real number based genetic algorithm (GA) to achieve bi-directionality, i.e. the ability to calculate the required input variables to achieve a specific output such as temperature field or cooling rate. More important, the ability of the GA to find a population of solutions enables this procedure to search for and find multiple sets of input variables, all of which can lead to the desired specific output. The proposed computational procedure has been applied to convective heat transfer in a liquid layer locally heated on its free surface by an electric arc, where various sets of input variables are computed to achieve a specific fusion zone geometry defined by an equilibrium temperature. Good agreement is achieved between the model predictions and the independent experimental results, indicating significant promise for the application of this procedure in finding multiple solutions of convective heat transfer equations
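    The coupling of a forward solver with a real-number genetic algorithm can be sketched generically. The fragment below is an illustration of the bi-directional idea only: the blend crossover, mutation scale, and the algebraic "forward model" standing in for the heat-transfer solver are all assumptions.

      import numpy as np

      def genetic_search(forward_model, target, bounds, pop=40, gens=60, keep=10):
          """Evolve candidate input vectors so the forward model's output matches a
          desired target; several near-optimal inputs are returned, illustrating that
          multiple input sets can produce the same output."""
          rng = np.random.default_rng(4)
          lo, hi = np.array(bounds).T
          P = rng.uniform(lo, hi, size=(pop, len(bounds)))
          for _ in range(gens):
              fit = np.array([abs(forward_model(p) - target) for p in P])
              elite = P[np.argsort(fit)[:keep]]                          # selection
              parents = elite[rng.integers(0, keep, size=(pop, 2))]
              alpha = rng.random((pop, len(bounds)))
              P = alpha * parents[:, 0] + (1 - alpha) * parents[:, 1]    # blend crossover
              P += rng.normal(0, 0.02 * (hi - lo), P.shape)              # mutation
              P = np.clip(P, lo, hi)
          fit = np.array([abs(forward_model(p) - target) for p in P])
          return P[np.argsort(fit)[:keep]]

      # toy stand-in for the solver: peak temperature as a function of power and speed
      peak_temp = lambda p: 300 + 0.8 * p[0] / (0.1 + p[1])
      solutions = genetic_search(peak_temp, target=1800.0,
                                 bounds=[(500, 3000), (0.1, 2.0)])  # power, speed
      print(solutions[:3])          # several (power, speed) pairs giving ~1800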

  3. Assembler absolute forward thick-target bremsstrahlung spectra program

    International Nuclear Information System (INIS)

    Niculescu, V.I.R.; Baciu, G.; Ionescu-Bujor, M.

    1981-12-01

    The program is intended to compute the absolute forward thick-target bremsstrahlung spectrum for electrons in the energy range 1-24 MeV. The program takes into account the following phenomena: multiple scattering, energy loss and the attenuation of the emitted gamma rays. The computer program is written in Assembler; it has a higher degree of generality and better performance than the FORTRAN version. (authors)

  4. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    Directory of Open Access Journals (Sweden)

    Fan Zhang

    2016-04-01

    Full Text Available With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massive parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfers broken, but several optimization strategies are also applied, such as streaming and parallel pipelining. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging on a single-core CPU by 270 times and realizes real-time imaging, in that the imaging rate outperforms the raw data generation rate.

  5. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    Science.gov (United States)

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially GPU based methods. In the classical GPU based imaging algorithm, the GPU is employed to accelerate image processing by massive parallel computing, and the CPU is only used to perform auxiliary work such as data input/output (IO). However, the computing capability of the CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed task partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the CPU parallel imaging part, the advanced vector extension (AVX) method is first introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only are the bottlenecks of memory limitation and frequent data transfers broken, but several optimization strategies are also applied, such as streaming and parallel pipelining. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging on a single-core CPU by 270 times and realizes real-time imaging, in that the imaging rate outperforms the raw data generation rate.

  6. Improving your target-template alignment with MODalign

    OpenAIRE

    Barbato, Alessandro; Benkert, Pascal; Schwede, Torsten; Tramontano, Anna; Kosinski, Jan

    2012-01-01

    Summary: MODalign is an interactive web-based tool aimed at helping protein structure modelers to inspect and manually modify the alignment between the sequences of a target protein and of its template(s). It interactively computes, displays and, upon modification of the target-template alignment, updates the multiple sequence alignments of the two protein families, their conservation score, secondary structure and solvent accessibility values, and local quality scores of the implied three-di...

  7. A multicolor panel of TALE-KRAB based transcriptional repressor vectors enabling knockdown of multiple gene targets.

    Science.gov (United States)

    Zhang, Zhonghui; Wu, Elise; Qian, Zhijian; Wu, Wen-Shu

    2014-12-05

    Stable and efficient knockdown of multiple gene targets is highly desirable for dissection of molecular pathways. Because it allows sequence-specific DNA binding, transcription activator-like effector (TALE) offers a new genetic perturbation technique that allows for gene-specific repression. Here, we constructed a multicolor lentiviral TALE-Kruppel-associated box (KRAB) expression vector platform that enables knockdown of multiple gene targets. This platform is fully compatible with the Golden Gate TALEN and TAL Effector Kit 2.0, a widely used and efficient method for TALE assembly. We showed that this multicolor TALE-KRAB vector system when combined together with bone marrow transplantation could quickly knock down c-kit and PU.1 genes in hematopoietic stem and progenitor cells of recipient mice. Furthermore, our data demonstrated that this platform simultaneously knocked down both c-Kit and PU.1 genes in the same primary cell populations. Together, our results suggest that this multicolor TALE-KRAB vector platform is a promising and versatile tool for knockdown of multiple gene targets and could greatly facilitate dissection of molecular pathways.

  8. Target assignment for security officers to K targets (TASK)

    International Nuclear Information System (INIS)

    Rowland, J.R.; Shelton, K.W.; Stunkel, C.B.

    1983-02-01

    A probabilistic algorithm is developed to provide an optimal Target Assignment for Security officers to K targets (TASK) using a maximin criterion. Under the assumption of only a limited number (N) of security officers, the TASK computer model determines deployment assignments which maximize the system protection against sabotage by an adversary who may select any link in the system, including the weakest, for the point of attack. Applying the TASK model to a hypothetical nuclear facility containing a nine-level building reveals that aggregate targets covering multiple vital areas should be utilized to reduce the number of possible target assignments to a value equal to or only slightly larger than N. The increased probability that a given aggregate target is covered by one or more security officers offsets the slight decrease in interruption probability due to its occurring earlier in the adversary's path. In brief, the TASK model determines the optimal maximin deployment strategy for limited numbers of security officers and calculates a quantitative measure of the resulting system protection

  9. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Directory of Open Access Journals (Sweden)

    Qian Li

    Full Text Available BACKGROUND: Traditional virtual screening methods pay more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and are therefore often less effective for discovering drugs used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods of computationally estimating the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. METHODOLOGY: We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From the results of the network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological processes in the human clotting cascade system. Furthermore, the method, which combined network efficiency with molecular docking scores, was applied to estimate the anticoagulant activities of a series of argatroban intermediates and of eight natural products, respectively. The better correlation (r = 0.671) between the experimental data and the decrease of network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of the anticoagulant activities of compounds in drug discovery. CONCLUSIONS: This article proposes a network-based multi-target computational estimation

  10. A network-based multi-target computational estimation scheme for anticoagulant activities of compounds.

    Science.gov (United States)

    Li, Qian; Li, Xudong; Li, Canghai; Chen, Lirong; Song, Jun; Tang, Yalin; Xu, Xiaojie

    2011-03-22

    Traditional virtual screening methods pay more attention to the predicted binding affinity between a drug molecule and a target related to a certain disease than to phenotypic data of the drug molecule against the disease system, and are therefore often less effective for discovering drugs used to treat many types of complex diseases. Virtual screening against a complex disease by general network estimation has become feasible with the development of network biology and systems biology. More effective methods of computationally estimating the whole efficacy of a compound in a complex disease system are needed, given the distinct weightiness of the different targets in a biological process and the standpoint that partial inhibition of several targets can be more efficient than the complete inhibition of a single target. We developed a novel approach by integrating the affinity predictions from multi-target docking studies with biological network efficiency analysis to estimate the anticoagulant activities of compounds. From the results of the network efficiency calculation for the human clotting cascade, factor Xa and thrombin were identified as the two most fragile enzymes, while the catalytic reaction mediated by the complex IXa:VIIIa and the formation of the complex VIIIa:IXa were recognized as the two most fragile biological processes in the human clotting cascade system. Furthermore, the method, which combined network efficiency with molecular docking scores, was applied to estimate the anticoagulant activities of a series of argatroban intermediates and of eight natural products, respectively. The better correlation (r = 0.671) between the experimental data and the decrease of network efficiency suggests that the approach could be a promising computational systems biology tool to aid identification of the anticoagulant activities of compounds in drug discovery. This article proposes a network-based multi-target computational estimation method for anticoagulant activities of compounds by
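    The network-efficiency part of such a scheme can be illustrated with a small graph. The sketch below uses NetworkX global efficiency and a toy cascade-like graph that merely stands in for the paper's clotting-cascade model; the node names and edges are placeholders, and the docking-score integration is omitted.

      import networkx as nx

      def fragility_ranking(G):
          """Rank nodes by the drop in global network efficiency caused by their
          removal: the larger the drop, the more fragile the corresponding target."""
          base = nx.global_efficiency(G)
          drops = {}
          for node in G.nodes():
              H = G.copy()
              H.remove_node(node)
              drops[node] = base - nx.global_efficiency(H)
          return sorted(drops.items(), key=lambda kv: kv[1], reverse=True)

      # toy stand-in for a coagulation-style cascade (not the paper's actual network)
      G = nx.Graph([("XIIa", "XIa"), ("XIa", "IXa"), ("IXa", "Xa"), ("VIIa", "Xa"),
                    ("VIIIa", "IXa"), ("Va", "Xa"), ("Xa", "IIa"), ("IIa", "Fibrin")])
      for enzyme, drop in fragility_ranking(G)[:3]:
          print(enzyme, round(drop, 3))

    In the reported approach, the efficiency drop attributable to each target is then combined with the docking score of a compound against that target to estimate the compound's overall anticoagulant effect.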

  11. ceRNAs in plants: computational approaches and associated challenges for target mimic research.

    Science.gov (United States)

    Paschoal, Alexandre Rossi; Lozada-Chávez, Irma; Domingues, Douglas Silva; Stadler, Peter F

    2017-05-30

    The competing endogenous RNA hypothesis has gained increasing attention as a potential global regulatory mechanism of microRNAs (miRNAs), and as a powerful tool to predict the function of many noncoding RNAs, including miRNAs themselves. Most studies have focused on animals, although target mimic (TM) discovery, as well as important computational and experimental advances, has been developed in plants over the past decade. Thus, our contribution summarizes recent progress in computational approaches for the study of miRNA:TM interactions. We divided this article into three main contributions. First, a general overview of research on TMs in plants is presented with practical descriptions of the available literature, tools, data, databases and computational reports. Second, we describe a common protocol for the computational and experimental analyses of TMs. Third, we provide a bioinformatics approach for the prediction of TM motifs potentially cross-targeting members within the same or from different miRNA families, based on the identification of consensus miRNA-binding sites from known TMs across sequenced genomes, transcriptomes and known miRNAs. This computational approach is promising because, in contrast to animals, miRNA families in plants are large, with identical or similar members, several of which are also highly conserved. Of the three consensus TM motifs found with our approach, MIM166, MIM171 and MIM159/319, the last one has found strong support in the recent experimental work by Reichel and Millar [Specificity of plant microRNA TMs: cross-targeting of mir159 and mir319. J Plant Physiol 2015;180:45-8]. Finally, we discuss the major computational and associated experimental challenges that have to be faced in future ceRNA studies. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. A Single Unexpected Change in Target- but Not Distractor Motion Impairs Multiple Object Tracking

    Directory of Open Access Journals (Sweden)

    Hauke S. Meyerhoff

    2013-02-01

    Full Text Available Recent research addresses the question whether motion information of multiple objects contributes to maintaining a selection of objects across a period of motion. Here, we investigate whether target and/or distractor motion information is used during attentive tracking. We asked participants to track four objects and changed either the motion direction of targets, the motion direction of distractors, neither, or both during a brief flash in the middle of a tracking interval. We observed that a single direction change of targets is sufficient to impair tracking performance. In contrast, changing the motion direction of distractors had no effect on performance. This indicates that target- but not distractor motion information is evaluated during tracking.

  13. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight.

    Science.gov (United States)

    Regad, Leslie; Chéron, Jean-Baptiste; Triki, Dhoha; Senac, Caroline; Flatters, Delphine; Camproux, Anne-Claude

    2017-01-01

    Protein flexibility is often implicated in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through the available redundant structures of a target, called multiple target conformations (MTC), obtained with experimental or modeling methods and under different biological conditions or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we developed an efficient tool, SA-conf, dedicated to capturing and linking amino acid and local structure variability and to analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB obtained by NMR and crystallography or from homology models. This tool can also be applied to analyze MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective in quantifying the structural variability of an MTC set and in localizing the structurally variable positions and regions of the target. By selecting adapted MTC subsets and comparing their variability detected by SA-conf, we highlighted different sources of target flexibility, such as flexibility induced by a binding partner, induced by mutation, and intrinsic flexibility. Our results support the interest of mining the available structures associated with a target to offer valuable insight into target flexibility and interaction mechanisms. The SA-conf executable script, with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels.
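
    The core idea of profiling per-position variability across aligned conformations encoded in a structural alphabet can be sketched with a simple Shannon-entropy profile. This is an illustrative reimplementation, not SA-conf itself; the encoded strings below are invented.

```python
# Minimal sketch: per-position variability of a target from aligned structural-alphabet
# strings (one string per conformation). Not the SA-conf implementation; the input
# strings are invented for illustration.
import math
from collections import Counter

def variability_profile(encoded_conformations):
    """Shannon entropy (bits) of the structural letter observed at each position."""
    length = len(encoded_conformations[0])
    assert all(len(s) == length for s in encoded_conformations), "alignment required"
    profile = []
    for pos in range(length):
        counts = Counter(s[pos] for s in encoded_conformations)
        n = sum(counts.values())
        entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
        profile.append(entropy)
    return profile

# Four hypothetical conformations of the same target, already aligned.
mtc_set = ["AAWWDDKL", "AAWWDDKL", "AAWYDEKL", "AAWYDGKL"]
for pos, h in enumerate(variability_profile(mtc_set), start=1):
    flag = "variable" if h > 0 else "conserved"
    print(f"position {pos}: H = {h:.2f} bits ({flag})")
```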

  14. Exploring the potential of a structural alphabet-based tool for mining multiple target conformations and target flexibility insight.

    Directory of Open Access Journals (Sweden)

    Leslie Regad

    Full Text Available Protein flexibility is often implicated in binding with different partners and is essential for protein function. The growing number of macromolecular structures in the Protein Data Bank and their redundancy has become a major source of structural knowledge of the protein universe. The analysis of structural variability through the available redundant structures of a target, called multiple target conformations (MTC), obtained with experimental or modeling methods and under different biological conditions or from different sources, is one way to explore protein flexibility. This analysis is essential to improve the understanding of various mechanisms associated with protein target function and flexibility. In this study, we explored the structural variability of three biological targets by analyzing different MTC sets associated with these targets. To facilitate the study of these MTC sets, we developed an efficient tool, SA-conf, dedicated to capturing and linking amino acid and local structure variability and to analyzing the target structural variability space. The advantage of SA-conf is that it can be applied to diverse sets composed of MTCs available in the PDB obtained by NMR and crystallography or from homology models. This tool can also be applied to analyze MTC sets obtained by dynamics approaches. Our results showed that the SA-conf tool is effective in quantifying the structural variability of an MTC set and in localizing the structurally variable positions and regions of the target. By selecting adapted MTC subsets and comparing their variability detected by SA-conf, we highlighted different sources of target flexibility, such as flexibility induced by a binding partner, induced by mutation, and intrinsic flexibility. Our results support the interest of mining the available structures associated with a target to offer valuable insight into target flexibility and interaction mechanisms. The SA-conf executable script, with a set of pre-compiled binaries, is available at http://www.mti.univ-paris-diderot.fr/recherche/plateformes/logiciels.

  15. Multi-target QSPR modeling for simultaneous prediction of multiple gas-phase kinetic rate constants of diverse chemicals

    Science.gov (United States)

    Basant, Nikita; Gupta, Shikha

    2018-03-01

    The reactions of molecular ozone (O3), hydroxyl (•OH) and nitrate (NO3) radicals are among the major pathways of removal of volatile organic compounds (VOCs) in the atmospheric environment. The gas-phase kinetic rate constants (kO3, kOH, kNO3) are thus important in assessing the ultimate fate and exposure risk of atmospheric VOCs. Experimental rate constants are not available for many emerging VOCs, and the computational methods reported so far address single-target modeling only. In this study, we have developed a multi-target (mt) QSPR model for simultaneous prediction of multiple kinetic rate constants (kO3, kOH, kNO3) of diverse organic chemicals, considering an experimental data set of VOCs for which values of all three rate constants are available. The mt-QSPR model identified and used five descriptors related to the molecular size, degree of saturation and electron density in a molecule, which were mechanistically interpretable. These descriptors successfully predicted the three rate constants simultaneously. The model yielded high correlations (R2 = 0.874-0.924) between the experimental and simultaneously predicted endpoint rate constant (kO3, kOH, kNO3) values in the test arrays for all three systems. The model also passed all the stringent statistical validation tests for external predictivity. The proposed multi-target QSPR model can be successfully used for predicting the reactivity of new VOCs simultaneously for their exposure risk assessment.
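
    The simultaneous (multi-target) prediction itself can be sketched with any multi-output regressor; the snippet below uses scikit-learn's random forest on synthetic descriptors standing in for real VOC data, and is not the published mt-QSPR model.

```python
# Minimal sketch of simultaneous (multi-target) prediction of three rate constants
# from molecular descriptors. The descriptors and targets here are synthetic random
# numbers standing in for real VOC data; this is not the published mt-QSPR model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 5))                      # five hypothetical descriptors
W = rng.normal(size=(5, 3))
Y = X @ W + 0.1 * rng.normal(size=(120, 3))        # synthetic log kO3, log kOH, log kNO3

X_tr, X_te, Y_tr, Y_te = train_test_split(X, Y, test_size=0.25, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, Y_tr)
Y_hat = model.predict(X_te)

for i, name in enumerate(["kO3", "kOH", "kNO3"]):
    print(name, "R2 =", round(r2_score(Y_te[:, i], Y_hat[:, i]), 3))
```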

  16. ESPRIT-like algorithm for computational-efficient angle estimation in bistatic multiple-input multiple-output radar

    Science.gov (United States)

    Gong, Jian; Lou, Shuntian; Guo, Yiduo

    2016-04-01

    An estimation of signal parameters via a rotational invariance techniques-like (ESPRIT-like) algorithm is proposed to estimate the direction of arrival and direction of departure for bistatic multiple-input multiple-output (MIMO) radar. The properties of a noncircular signal and Euler's formula are first exploited to establish a real-valued bistatic MIMO radar array data, which is composed of sine and cosine data. Then the receiving/transmitting selective matrices are constructed to obtain the receiving/transmitting rotational invariance factors. Since the rotational invariance factor is a cosine function, symmetrical mirror angle ambiguity may occur. Finally, a maximum likelihood function is used to avoid the estimation ambiguities. Compared with the existing ESPRIT, the proposed algorithm can save about 75% of computational load owing to the real-valued ESPRIT algorithm. Simulation results confirm the effectiveness of the ESPRIT-like algorithm.
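
    The rotational-invariance step that ESPRIT-type methods rely on is easy to illustrate with the standard complex-valued ESPRIT on a half-wavelength uniform linear array. The sketch below is not the paper's real-valued, noncircular variant; the array size, source angles and noise level are arbitrary.

```python
# Minimal sketch of standard (complex-valued) ESPRIT for DOA estimation on a uniform
# linear array with half-wavelength spacing. This illustrates the rotational-invariance
# idea only; it is not the real-valued, noncircular ESPRIT-like algorithm of the paper.
import numpy as np

rng = np.random.default_rng(1)
M, K, N = 8, 2, 400                      # sensors, sources, snapshots
true_deg = np.array([-20.0, 35.0])
m = np.arange(M)[:, None]
A = np.exp(1j * np.pi * m * np.sin(np.deg2rad(true_deg)))   # steering matrix (d = lambda/2)
S = (rng.normal(size=(K, N)) + 1j * rng.normal(size=(K, N))) / np.sqrt(2)
X = A @ S + 0.05 * (rng.normal(size=(M, N)) + 1j * rng.normal(size=(M, N)))

R = X @ X.conj().T / N                   # sample covariance
eigvals, eigvecs = np.linalg.eigh(R)     # ascending eigenvalues
Es = eigvecs[:, -K:]                     # signal subspace
Psi, *_ = np.linalg.lstsq(Es[:-1], Es[1:], rcond=None)   # rotational invariance factor
phases = np.angle(np.linalg.eigvals(Psi))
est_deg = np.rad2deg(np.arcsin(phases / np.pi))
print("estimated DOAs (deg):", np.sort(est_deg))
```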

  17. Computational design of high efficiency release targets for use at ISOL facilities

    CERN Document Server

    Liu, Y

    1999-01-01

    This report describes efforts made at the Oak Ridge National Laboratory to design high-efficiency-release targets that simultaneously incorporate the short diffusion lengths, high permeabilities, controllable temperatures, and heat-removal properties required for the generation of useful radioactive ion beam (RIB) intensities for nuclear physics and astrophysics research using the isotope separation on-line (ISOL) technique. Short diffusion lengths are achieved either by using thin fibrous target materials or by coating thin layers of selected target material onto low-density carbon fibers such as reticulated-vitreous-carbon fiber (RVCF) or carbon-bonded-carbon fiber (CBCF) to form highly permeable composite target matrices. Computational studies that simulate the generation and removal of primary beam deposited heat from target materials have been conducted to optimize the design of target/heat-sink systems for generating RIBs. The results derived from diffusion release-rate simulation studies for selected t...
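
    The diffusion-release part of such simulations can be illustrated with the classic Fickian series solution for the cumulative fraction released from a spherical grain; the diffusion coefficient and grain radius below are placeholders, not parameters of any actual ISOL target material.

```python
# Minimal sketch: cumulative fraction of a radioactive species released by diffusion
# from a spherical grain of radius r after time t (classic Fickian series solution).
# D and r are illustrative values, not parameters of any actual ISOL target material.
import math

def release_fraction(D, r, t, terms=200):
    """f(t) = 1 - (6/pi^2) * sum_n (1/n^2) exp(-n^2 pi^2 D t / r^2)."""
    tau = D * t / r**2
    s = sum(math.exp(-n**2 * math.pi**2 * tau) / n**2 for n in range(1, terms + 1))
    return 1.0 - 6.0 / math.pi**2 * s

D = 1e-11        # cm^2/s, hypothetical diffusion coefficient at operating temperature
r = 1e-4         # cm, hypothetical grain/fiber radius
for t in (0.1, 1.0, 10.0, 60.0):
    print(f"t = {t:5.1f} s  released fraction = {release_fraction(D, r, t):.3f}")
```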

  18. Trade-Off Exploration for Target Tracking Application in a Customized Multiprocessor Architecture

    Directory of Open Access Journals (Sweden)

    Yassin El-Hillali

    2009-01-01

    Full Text Available This paper presents the design of an FPGA-based multiprocessor-system-on-chip (MPSoC) architecture optimized for Multiple Target Tracking (MTT) in automotive applications. An MTT system uses an automotive radar to track the speed and relative position of all the vehicles (targets) within its field of view. As the number of targets increases, the computational needs of the MTT system also increase, making it difficult for a single processor to handle it alone. Our implementation distributes the computational load among multiple soft processor cores optimized for executing specific computational tasks. The paper explains how we designed and profiled the MTT application to partition it among different processors. It also explains how we applied different optimizations to customize the individual processor cores to their assigned tasks and to assess their impact on performance and FPGA resource utilization. The result is a complete MTT application running on an optimized MPSoC architecture that fits in a contemporary medium-sized FPGA and that meets the application's real-time constraints.

  19. Interacting Multiple Model (IMM) Fifth-Degree Spherical Simplex-Radial Cubature Kalman Filter for Maneuvering Target Tracking

    Directory of Open Access Journals (Sweden)

    Hua Liu

    2017-06-01

    Full Text Available For improving the tracking accuracy and model switching speed of maneuvering target tracking in nonlinear systems, a new algorithm named the interacting multiple model fifth-degree spherical simplex-radial cubature Kalman filter (IMM5thSSRCKF) is proposed in this paper. The new algorithm is a combination of the interacting multiple model (IMM) filter and the fifth-degree spherical simplex-radial cubature Kalman filter (5thSSRCKF). The proposed algorithm makes use of Markov process to describe the switching probability among the models, and uses 5thSSRCKF to deal with the state estimation of each model. The 5thSSRCKF is an improved filter algorithm, which utilizes the fifth-degree spherical simplex-radial rule to improve the filtering accuracy. Finally, the tracking performance of the IMM5thSSRCKF is evaluated by simulation in a typical maneuvering target tracking scenario. Simulation results show that the proposed algorithm has better tracking performance and quicker model switching speed when disposing maneuver models compared with the interacting multiple model unscented Kalman filter (IMMUKF), the interacting multiple model cubature Kalman filter (IMMCKF) and the interacting multiple model fifth-degree cubature Kalman filter (IMM5thCKF).

  20. Interacting Multiple Model (IMM) Fifth-Degree Spherical Simplex-Radial Cubature Kalman Filter for Maneuvering Target Tracking.

    Science.gov (United States)

    Liu, Hua; Wu, Wen

    2017-06-13

    For improving the tracking accuracy and model switching speed of maneuvering target tracking in nonlinear systems, a new algorithm named the interacting multiple model fifth-degree spherical simplex-radial cubature Kalman filter (IMM5thSSRCKF) is proposed in this paper. The new algorithm is a combination of the interacting multiple model (IMM) filter and the fifth-degree spherical simplex-radial cubature Kalman filter (5thSSRCKF). The proposed algorithm makes use of Markov process to describe the switching probability among the models, and uses 5thSSRCKF to deal with the state estimation of each model. The 5thSSRCKF is an improved filter algorithm, which utilizes the fifth-degree spherical simplex-radial rule to improve the filtering accuracy. Finally, the tracking performance of the IMM5thSSRCKF is evaluated by simulation in a typical maneuvering target tracking scenario. Simulation results show that the proposed algorithm has better tracking performance and quicker model switching speed when disposing maneuver models compared with the interacting multiple model unscented Kalman filter (IMMUKF), the interacting multiple model cubature Kalman filter (IMMCKF) and the interacting multiple model fifth-degree cubature Kalman filter (IMM5thCKF).
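
    The interacting-multiple-model logic (mixing, per-model filtering, likelihood-weighted model probabilities) is independent of which filter runs inside each model. The sketch below substitutes two plain linear Kalman filters with different process noise for the fifth-degree cubature filters, so it only illustrates the IMM machinery; all noise levels, transition probabilities and the simulated maneuver are arbitrary.

```python
# Minimal IMM sketch: two constant-velocity Kalman filters with different process
# noise stand in for the 5thSSRCKF models of the paper. Transition probabilities,
# noise levels and the simulated trajectory are illustrative only.
import numpy as np

dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])            # state: [position, velocity]
H = np.array([[1.0, 0.0]])
R = np.array([[4.0]])                            # measurement noise variance

def q_matrix(q):                                  # white-noise-acceleration covariance
    return q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])

Qs = [q_matrix(0.01), q_matrix(2.0)]              # quiet vs. maneuvering model
P_trans = np.array([[0.95, 0.05], [0.05, 0.95]])  # Markov model-switching matrix

def kf_step(x, P, Q, z):
    """One predict/update cycle; returns posterior mean, covariance and likelihood."""
    x_pred, P_pred = F @ x, F @ P @ F.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    innov = z - H @ x_pred
    like = np.exp(-0.5 * innov @ np.linalg.inv(S) @ innov) / np.sqrt(2 * np.pi * np.linalg.det(S))
    return x_pred + K @ innov, (np.eye(2) - K @ H) @ P_pred, float(like)

# Simulated target: constant velocity, then a sudden maneuver (velocity jump).
rng = np.random.default_rng(0)
truth = np.zeros((60, 2)); truth[0] = [0.0, 1.0]
for k in range(1, 60):
    truth[k] = F @ truth[k - 1]
    if k == 30:
        truth[k, 1] += 5.0                       # maneuver
zs = truth[:, :1] + rng.normal(scale=2.0, size=(60, 1))

mu = np.array([0.5, 0.5])                        # model probabilities
xs = [np.array([0.0, 0.0]), np.array([0.0, 0.0])]
Ps = [np.eye(2) * 10.0 for _ in range(2)]

for k, z in enumerate(zs):
    c = P_trans.T @ mu                           # predicted model probabilities
    mix_w = (P_trans * mu[:, None]) / c          # mixing weights mu_{i|j}
    x_mix = [sum(mix_w[i, j] * xs[i] for i in range(2)) for j in range(2)]
    P_mix = [sum(mix_w[i, j] * (Ps[i] + np.outer(xs[i] - x_mix[j], xs[i] - x_mix[j]))
                 for i in range(2)) for j in range(2)]
    out = [kf_step(x_mix[j], P_mix[j], Qs[j], z) for j in range(2)]
    xs, Ps, likes = [o[0] for o in out], [o[1] for o in out], np.array([o[2] for o in out])
    mu = likes * c
    mu /= mu.sum()
    x_comb = sum(mu[j] * xs[j] for j in range(2))
    if k in (29, 31, 35):
        print(f"k={k:2d}  P(maneuver model)={mu[1]:.2f}  est pos={x_comb[0]:7.2f}")
```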

  1. Locations of serial reach targets are coded in multiple reference frames.

    Science.gov (United States)

    Thompson, Aidan A; Henriques, Denise Y P

    2010-12-01

    an egocentric frame anchored to the eye. However, the amount of change in this distance was smaller than predicted by a pure eye-fixed representation, suggesting that relative positions of the targets or allocentric coding was also used in sequential reach planning. The spatial coding and updating of sequential reach target locations seems to rely on a combined weighting of multiple reference frames, with one of them centered on the eye. Copyright © 2010 Elsevier Ltd. All rights reserved.

  2. High-Performance Matrix-Vector Multiplication on the GPU

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg

    2012-01-01

    In this paper, we develop a high-performance GPU kernel for one of the most popular dense linear algebra operations, the matrix-vector multiplication. The target hardware is the most recent Nvidia Tesla 20-series (Fermi architecture), which is designed from the ground up for scientific computing...

  3. HONEI: A collection of libraries for numerical computations targeting multiple processor architectures

    Science.gov (United States)

    van Dyk, Danny; Geveler, Markus; Mallach, Sven; Ribbrock, Dirk; Göddeke, Dominik; Gutwenger, Carsten

    2009-12-01

    We present HONEI, an open-source collection of libraries offering a hardware oriented approach to numerical calculations. HONEI abstracts the hardware, and applications written on top of HONEI can be executed on a wide range of computer architectures such as CPUs, GPUs and the Cell processor. We demonstrate the flexibility and performance of our approach with two test applications, a Finite Element multigrid solver for the Poisson problem and a robust and fast simulation of shallow water waves. By linking against HONEI's libraries, we achieve a two-fold speedup over straight forward C++ code using HONEI's SSE backend, and additional 3-4 and 4-16 times faster execution on the Cell and a GPU. A second important aspect of our approach is that the full performance capabilities of the hardware under consideration can be exploited by adding optimised application-specific operations to the HONEI libraries. HONEI provides all necessary infrastructure for development and evaluation of such kernels, significantly simplifying their development. Program summaryProgram title: HONEI Catalogue identifier: AEDW_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AEDW_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GPLv2 No. of lines in distributed program, including test data, etc.: 216 180 No. of bytes in distributed program, including test data, etc.: 1 270 140 Distribution format: tar.gz Programming language: C++ Computer: x86, x86_64, NVIDIA CUDA GPUs, Cell blades and PlayStation 3 Operating system: Linux RAM: at least 500 MB free Classification: 4.8, 4.3, 6.1 External routines: SSE: none; [1] for GPU, [2] for Cell backend Nature of problem: Computational science in general and numerical simulation in particular have reached a turning point. The revolution developers are facing is not primarily driven by a change in (problem-specific) methodology, but rather by the fundamental paradigm shift of the

  4. Multiple exciton generation in chiral carbon nanotubes: Density functional theory based computation

    Science.gov (United States)

    Kryjevski, Andrei; Mihaylov, Deyan; Kilina, Svetlana; Kilin, Dmitri

    2017-10-01

    We use a Boltzmann transport equation (BE) to study time evolution of a photo-excited state in a nanoparticle including phonon-mediated exciton relaxation and the multiple exciton generation (MEG) processes, such as exciton-to-biexciton multiplication and biexciton-to-exciton recombination. BE collision integrals are computed using Kadanoff-Baym-Keldysh many-body perturbation theory based on density functional theory simulations, including exciton effects. We compute internal quantum efficiency (QE), which is the number of excitons generated from an absorbed photon in the course of the relaxation. We apply this approach to chiral single-wall carbon nanotubes (SWCNTs), such as (6,2) and (6,5). We predict efficient MEG in the (6,2) and (6,5) SWCNTs within the solar spectrum range starting at the 2Eg energy threshold and with QE reaching ˜1.6 at about 3Eg, where Eg is the electronic gap.

  5. Multiple Target Laser Designator (MTLD)

    Science.gov (United States)

    2007-03-01

    [Report fragment] Recoverable headings from this quarterly progress report: optimized liquid crystal scanning element; optimization of the nonimaging predictive algorithm for target ranging, tracking, and position estimation; ...commercial potential; 3.0 Progress This Quarter; 3.1 Optimization of Nonimaging Holographic Antenna for Target Tracking and Position Estimation (Task 6).

  6. Treatment planning with intensity modulated particle therapy for multiple targets in stage IV non-small cell lung cancer

    Science.gov (United States)

    Anderle, Kristjan; Stroom, Joep; Vieira, Sandra; Pimentel, Nuno; Greco, Carlo; Durante, Marco; Graeff, Christian

    2018-01-01

    Intensity modulated particle therapy (IMPT) can produce highly conformal plans, but is limited in advanced lung cancer patients with multiple lesions due to motion and planning complexity. A 4D IMPT optimization including all motion states was expanded to include multiple targets, where each target (isocenter) is designated to specific field(s). Furthermore, to achieve stereotactic treatment planning objectives, target and OAR weights plus objective doses were automatically iteratively adapted. Finally, 4D doses were calculated for different motion scenarios. The results from our algorithm were compared to clinical stereotactic body radiation treatment (SBRT) plans. The study included eight patients with 24 lesions in total. Intended dose regimen for SBRT was 24 Gy in one fraction, but lower fractionated doses had to be delivered in three cases due to OAR constraints or failed plan quality assurance. The resulting IMPT treatment plans had no significant difference in target coverage compared to SBRT treatment plans. Average maximum point dose and dose to specific volume in OARs were on average 65% and 22% smaller with IMPT. IMPT could also deliver 24 Gy in one fraction in a patient where SBRT was limited due to the OAR vicinity. The developed algorithm shows the potential of IMPT in treatment of multiple moving targets in a complex geometry.

  7. Low Complexity Moving Target Parameter Estimation for MIMO Radar using 2D-FFT

    KAUST Repository

    Jardak, Seifallah

    2017-06-16

    In multiple-input multiple-output radar, to localize a target and estimate its reflection coefficient, a given cost function is usually optimized over a grid of points. The performance of such algorithms is directly affected by the grid resolution. Increasing the number of grid points enhances the resolution of the estimator but also increases its computational complexity exponentially. In this work, two reduced-complexity algorithms are derived, based on Capon and amplitude and phase estimation (APES), to estimate the reflection coefficient, angular location, and Doppler shift of multiple moving targets. By exploiting the structure of the terms, the cost function is brought into a form that allows us to apply the two-dimensional fast Fourier transform (2D-FFT) and reduce the computational complexity of estimation. Using a low-resolution 2D-FFT, the proposed algorithm identifies sub-optimal estimates and feeds them as initial points to the derived Newton gradient algorithm. In contrast to grid-based search algorithms, the proposed algorithm can optimally estimate on- and off-the-grid targets with very low computational complexity. A new APES cost function with better estimation performance is also discussed. Generalized expressions of the Cramér-Rao lower bound are derived to assess the performance of the proposed algorithm.
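
    The way a zero-padded 2D-FFT turns the angle-Doppler search into a single transform can be illustrated with a simple matched-filter-style map; the sketch below is not the Capon/APES estimator of the paper, and the array geometry, target parameters and noise level are arbitrary.

```python
# Minimal sketch of the 2D-FFT idea: evaluate a matched-filter-like angle-Doppler map
# over the FFT grid instead of a dense search. Array geometry, target parameters and
# noise level are illustrative; this is not the Capon/APES estimator of the paper.
import numpy as np

rng = np.random.default_rng(2)
M, P = 16, 64                              # virtual array elements, slow-time pulses
spatial_f, doppler_f = 0.21, -0.13         # normalized frequencies of one target
m, p = np.arange(M)[:, None], np.arange(P)[None, :]
signal = np.exp(2j * np.pi * (spatial_f * m + doppler_f * p))
data = 0.8 * signal + 0.3 * (rng.normal(size=(M, P)) + 1j * rng.normal(size=(M, P)))

grid = np.fft.fft2(data, s=(4 * M, 4 * P)) / (M * P)     # zero-padded 2D-FFT
idx = np.unravel_index(np.argmax(np.abs(grid)), grid.shape)
f_sp = np.fft.fftfreq(4 * M)[idx[0]]
f_do = np.fft.fftfreq(4 * P)[idx[1]]
print(f"estimated spatial freq {f_sp:+.3f} (true {spatial_f:+.3f}), "
      f"Doppler freq {f_do:+.3f} (true {doppler_f:+.3f})")
# With half-wavelength element spacing the angle follows from sin(theta) = 2 * f_sp.
```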

  8. Low Complexity Moving Target Parameter Estimation for MIMO Radar using 2D-FFT

    KAUST Repository

    Jardak, Seifallah; Ahmed, Sajid; Alouini, Mohamed-Slim

    2017-01-01

    In multiple-input multiple-output radar, to localize a target and estimate its reflection coefficient, a given cost function is usually optimized over a grid of points. The performance of such algorithms is directly affected by the grid resolution. Increasing the number of grid points enhances the resolution of the estimator but also increases its computational complexity exponentially. In this work, two reduced-complexity algorithms are derived, based on Capon and amplitude and phase estimation (APES), to estimate the reflection coefficient, angular location, and Doppler shift of multiple moving targets. By exploiting the structure of the terms, the cost function is brought into a form that allows us to apply the two-dimensional fast Fourier transform (2D-FFT) and reduce the computational complexity of estimation. Using a low-resolution 2D-FFT, the proposed algorithm identifies sub-optimal estimates and feeds them as initial points to the derived Newton gradient algorithm. In contrast to grid-based search algorithms, the proposed algorithm can optimally estimate on- and off-the-grid targets with very low computational complexity. A new APES cost function with better estimation performance is also discussed. Generalized expressions of the Cramér-Rao lower bound are derived to assess the performance of the proposed algorithm.

  9. A minimal unified model of disease trajectories captures hallmarks of multiple sclerosis

    KAUST Repository

    Kannan, Venkateshan; Kiani, Narsis A.; Piehl, Fredrik; Tegner, Jesper

    2017-01-01

    Multiple Sclerosis (MS) is an autoimmune disease targeting the central nervous system (CNS) causing demyelination and neurodegeneration leading to accumulation of neurological disability. Here we present a minimal, computational model involving

  10. Security prospects through cloud computing by adopting multiple clouds

    DEFF Research Database (Denmark)

    Jensen, Meiko; Schwenk, Jörg; Bohli, Jens Matthias

    2011-01-01

    Clouds impose new security challenges, which are amongst the biggest obstacles when considering the usage of cloud services. This has triggered a lot of research activity in this direction, resulting in a number of proposals targeting the various security threats. Besides the security issues coming with the cloud paradigm, it can also provide a new set of unique features which open the path towards novel security approaches, techniques and architectures. This paper initiates this discussion by contributing a concept which achieves security merits by making use of multiple distinct clouds at the same time.

  11. Monitoring system of multiple fire fighting based on computer vision

    Science.gov (United States)

    Li, Jinlong; Wang, Li; Gao, Xiaorong; Wang, Zeyong; Zhao, Quanke

    2010-10-01

    With the high demand for fire control in spacious buildings, computer vision is playing a more and more important role. This paper presents a new monitoring system for multiple fire fighting based on computer vision and color detection. The system can aim at the fire position and then extinguish the fire by itself. In this paper, the system structure, working principle, fire orientation, hydrant angle adjustment and system calibration are described in detail; the design of the relevant hardware and software is also introduced. At the same time, the principle and process of color detection and image processing are given as well. The system ran well in testing, and it has high reliability, low cost, and easy node expansion, giving it a bright prospect of application and popularization.

  12. An algebraic substructuring using multiple shifts for eigenvalue computations

    International Nuclear Information System (INIS)

    Ko, Jin Hwan; Jung, Sung Nam; Byun, Do Young; Bai, Zhaojun

    2008-01-01

    Algebraic substructuring (AS) is a state-of-the-art method in eigenvalue computations, especially for large-sized problems, but originally it was designed to calculate only the smallest eigenvalues. Recently, an updated version of AS has been introduced to calculate the interior eigenvalues over a specified range by using a shift concept that is referred to as the shifted AS. In this work, we propose a combined method of both AS and the shifted AS by using multiple shifts for solving a considerable number of eigensolutions in a large-sized problem, which is an emerging computational issue of noise or vibration analysis in vehicle design. In addition, we investigated the accuracy of the shifted AS by presenting an error criterion. The proposed method has been applied to the FE model of an automobile body. The combined method yielded a higher efficiency without loss of accuracy in comparison to the original AS
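
    The multiple-shift idea, probing several interior points of the spectrum and collecting the eigenpairs found near each shift, can be mimicked with an off-the-shelf shift-invert sparse eigensolver. The sketch below uses SciPy's generic solver on a 1D Laplacian rather than algebraic substructuring on an automobile body model; the shift values and counts are illustrative.

```python
# Minimal sketch: find interior eigenvalues near several shifts with a shift-invert
# sparse eigensolver. The 1D Laplacian stands in for a large structural FE matrix;
# shift values and counts are illustrative, and this is not algebraic substructuring.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

n = 2000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
A = sp.diags([off, main, off], [-1, 0, 1], format="csc")   # SPD test matrix

shifts = [0.5, 1.0, 2.0, 3.0]          # interior points of the spectrum to probe
found = set()
for sigma in shifts:
    vals, _ = eigsh(A, k=6, sigma=sigma, which="LM")       # eigenvalues nearest sigma
    found.update(np.round(vals, 8))
print(f"collected {len(found)} distinct eigenvalues near {len(shifts)} shifts")
print(sorted(found)[:5])
```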

  13. Distributed Factorization Computation on Multiple Volunteered Mobile Resource to Break RSA Key

    Science.gov (United States)

    Jaya, I.; Hardi, S. M.; Tarigan, J. T.; Zamzami, E. M.; Sihombing, P.

    2017-01-01

    Similar to other common asymmetric encryption schemes, RSA can be cracked using a series of mathematical calculations. The private key used to decrypt the message can be computed from the public key. However, finding the private key may require a massive amount of calculation. In this paper, we propose a method to perform distributed computing to calculate RSA's private key. The proposed method uses multiple volunteered mobile devices to contribute during the calculation process. Our objective is to demonstrate how the use of volunteer computing on mobile devices may be a feasible option to reduce the time required to break a weak RSA encryption and to observe the behavior and running time of the application on mobile devices.
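
    The divide-and-distribute idea can be sketched by giving each "volunteer" worker a disjoint range of candidate divisors of the modulus. The sketch below uses local processes instead of mobile devices, trial division instead of a serious factoring algorithm, and a toy semiprime instead of a real RSA key.

```python
# Minimal sketch of the divide-and-distribute idea: each "volunteer" worker searches a
# disjoint range of candidate divisors of the RSA modulus. Only practical for toy key
# sizes; the modulus below is a small illustrative semiprime, not a real RSA key.
from concurrent.futures import ProcessPoolExecutor

def search_range(args):
    n, start, stop = args
    d = start | 1                      # check odd candidates only
    while d < stop:
        if n % d == 0:
            return d
        d += 2
    return None

def distributed_factor(n, workers=4):
    """Split the odd candidates up to sqrt(n) into one chunk per worker."""
    limit = int(n ** 0.5) + 1
    chunk = max(1, limit // workers)
    tasks = []
    for i in range(workers):
        start = max(3, i * chunk)
        stop = limit if i == workers - 1 else (i + 1) * chunk
        tasks.append((n, start, stop))
    with ProcessPoolExecutor(max_workers=workers) as pool:
        for result in pool.map(search_range, tasks):
            if result is not None:
                return result, n // result
    return None

if __name__ == "__main__":
    modulus = 1000003 * 999983          # toy semiprime standing in for a weak RSA modulus
    print("factors:", distributed_factor(modulus))
```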

  14. Low power multiple shell fusion targets for use with electron and ion beams

    International Nuclear Information System (INIS)

    Lindl, J.D.; Bangerter, R.O.

    1975-01-01

    Use of double shell targets, with a separate low Z, low density ablator at large radius for the outer shell, reduces the focusing and power requirements while maintaining reasonable aspect ratios. A high Z, high density pusher shell is placed at a much smaller radius in order to obtain an aspect ratio small enough to protect against fluid instability. Velocity multiplication between these shells further lowers the power requirements. Careful tuning of the power profile and intershell density results in a low entropy implosion which allows breakeven at low powers. Ion beams appear to be a promising power source, and breakeven at 10-20 terawatts with 10 MeV alpha particles appears feasible. Predicted performance of targets with various energy sources is shown and comparison is made with single shell targets.

  15. Group Targets Tracking Using Multiple Models GGIW-CPHD Based on Best-Fitting Gaussian Approximation and Strong Tracking Filter

    Directory of Open Access Journals (Sweden)

    Yun Wang

    2016-01-01

    Full Text Available The Gamma Gaussian inverse Wishart cardinalized probability hypothesis density (GGIW-CPHD) algorithm has commonly been used to track group targets in the presence of cluttered measurements and missed detections. A multiple models GGIW-CPHD algorithm based on the best-fitting Gaussian approximation method (BFG) and the strong tracking filter (STF) is proposed, aiming at the defect that the tracking error of the GGIW-CPHD algorithm increases when the group targets are maneuvering. The best-fitting Gaussian approximation method is proposed to implement the fusion of multiple models, using the strong tracking filter to correct the predicted covariance matrix of the GGIW component. The corresponding likelihood functions are deduced to update the probabilities of the multiple tracking models. From the simulation results we can see that the proposed tracking algorithm MM-GGIW-CPHD can effectively deal with the combination/spawning of groups, and the tracking error of group targets in the maneuvering stage is decreased.

  16. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae.

    Directory of Open Access Journals (Sweden)

    Emanuel C Mora

    2013-06-01

    Full Text Available Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as echo-delay) to assess target distance. Target distance is represented in the brain by delay-tuned neurons that are classified as either heteroharmonic or homoharmonic. Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e., heteroharmonic neurons receive information from call and echo in different frequency bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy. We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability of using long constant-frequency echolocation calls, high duty cycle echolocation and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been of advantage for categorizing prey size, hunting eared insects and living in large conspecific colonies. We make five testable predictions that might help future investigations to clarify the evolution of heteroharmonic echolocation in Mormoopidae and other families.

  17. Novel computational methods to predict drug–target interactions using graph mining and machine learning approaches

    KAUST Repository

    Olayan, Rawan S.

    2017-12-01

    Computational drug repurposing aims at finding new medical uses for existing drugs. The identification of novel drug-target interactions (DTIs) can be a useful part of such a task. Computational determination of DTIs is a convenient strategy for systematic screening of a large number of drugs in the attempt to identify new DTIs at low cost and with reasonable accuracy. This necessitates the development of accurate computational methods that can help focus the follow-up experimental validation on a smaller number of highly likely targets for a drug. Although many methods have been proposed for computational DTI prediction, they suffer from high false positive prediction rates or they do not predict the effect that drugs exert on targets in DTIs. In this report, first, we present a comprehensive review of the recent progress in the field of DTI prediction from data-centric and algorithm-centric perspectives. The aim is to provide a comprehensive review of computational methods for identifying DTIs, which could help in constructing more reliable methods. Then, we present DDR, an efficient method to predict the existence of DTIs. DDR achieves significantly more accurate results compared to the other state-of-the-art methods. As supported by independent evidence, we verified as correct 22 out of the top 25 DDR DTI predictions. This validation proves the practical utility of DDR, suggesting that DDR can be used as an efficient method to identify correct DTIs. Finally, we present the DDR-FE method, which predicts the effect types of a drug on its target. On different representative datasets, under various test setups, and using different performance measures, we show that DDR-FE achieves extremely good performance. Using blind test data, we verified as correct 2,300 out of 3,076 DTI effects predicted by DDR-FE. This suggests that DDR-FE can be used as an efficient method to identify correct effects of a drug on its target.

  18. Simple and Efficient Targeting of Multiple Genes Through CRISPR-Cas9 in Physcomitrella patens

    Directory of Open Access Journals (Sweden)

    Mauricio Lopez-Obando

    2016-11-01

    Full Text Available Powerful genome editing technologies are needed for efficient gene function analysis. The CRISPR-Cas9 system has been adapted as an efficient gene-knock-out technology in a variety of species. However, in a number of situations, knocking out or modifying a single gene is not sufficient; this is particularly true for genes belonging to a common family, or for genes showing redundant functions. Like many plants, the model organism Physcomitrella patens has experienced multiple events of polyploidization during evolution that has resulted in a number of families of duplicated genes. Here, we report a robust CRISPR-Cas9 system, based on the codelivery of a CAS9 expressing cassette, multiple sgRNA vectors, and a cassette for transient transformation selection, for gene knock-out in multiple gene families. We demonstrate that CRISPR-Cas9-mediated targeting of five different genes allows the selection of a quintuple mutant, and all possible subcombinations of mutants, in one experiment, with no mutations detected in potential off-target sequences. Furthermore, we confirmed the observation that the presence of repeats in the vicinity of the cutting region favors deletion due to the alternative end joining pathway, for which induced frameshift mutations can be potentially predicted. Because the number of multiple gene families in Physcomitrella is substantial, this tool opens new perspectives to study the role of expanded gene families in the colonization of land by plants.
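
    The first computational step of such multiplexed designs, enumerating candidate SpCas9 protospacers (20 nt followed by an NGG PAM) in each target gene, can be sketched with a simple sequence scan. The gene names and sequences below are invented, and a real design would also scan the reverse strand and filter for off-targets.

```python
# Minimal sketch: enumerate candidate SpCas9 protospacers (20 nt + NGG PAM) on the
# forward strand of several target genes. Gene names and sequences are invented; real
# sgRNA design also checks the reverse strand, off-targets and GC content.
import re

PROTOSPACER = re.compile(r"(?=([ACGT]{20})[ACGT]GG)")   # lookahead allows overlaps

def candidate_guides(sequence):
    seq = sequence.upper()
    return [(m.start(), m.group(1)) for m in PROTOSPACER.finditer(seq)]

genes = {   # hypothetical fragments of a duplicated gene family
    "PpGENE1a": "ATGCTGGATCCGTACGTAGCTAGGCTAGCTAGCTAGGAGGTACGATCGATCGTAGCTAGCGG",
    "PpGENE1b": "ATGCTGGATCCGTACGTAGCAAGGCTAGCTAGCTAGGAGGTACGATCGATCGTAGCTAGCGG",
}
for name, seq in genes.items():
    guides = candidate_guides(seq)
    print(name, f"{len(guides)} candidate guides")
    for pos, guide in guides[:2]:
        print(f"  pos {pos:3d}  5'-{guide}-NGG")
```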

  19. A computational approach to finding novel targets for existing drugs.

    Directory of Open Access Journals (Sweden)

    Yvonne Y Li

    2011-09-01

    Full Text Available Repositioning existing drugs for new therapeutic uses is an efficient approach to drug discovery. We have developed a computational drug repositioning pipeline to perform large-scale molecular docking of small molecule drugs against protein drug targets, in order to map the drug-target interaction space and find novel interactions. Our method emphasizes removing false positive interaction predictions using criteria from known interaction docking, consensus scoring, and specificity. In all, our database contains 252 human protein drug targets that we classify as reliable-for-docking as well as 4621 approved and experimental small molecule drugs from DrugBank. These were cross-docked, then filtered through stringent scoring criteria to select top drug-target interactions. In particular, we used MAPK14 and the kinase inhibitor BIM-8 as examples where our stringent thresholds enriched the predicted drug-target interactions with known interactions up to 20 times compared to standard score thresholds. We validated nilotinib as a potent MAPK14 inhibitor in vitro (IC50 40 nM), suggesting a potential use for this drug in treating inflammatory diseases. The published literature indicated experimental evidence for 31 of the top predicted interactions, highlighting the promising nature of our approach. Novel interactions discovered may lead to the drug being repositioned as a therapeutic treatment for its off-target's associated disease, added insight into the drug's mechanism of action, and added insight into the drug's side effects.

  20. Computer-aided Molecular Design of Compounds Targeting Histone Modifying Enzymes

    Science.gov (United States)

    Andreoli, Federico; Del Rio, Alberto

    2015-01-01

    Growing evidence shows that epigenetic mechanisms play crucial roles in the genesis and progression of many physiopathological processes. As a result, research in epigenetics grew at a fast pace in the last decade. In particular, the study of histone post-translational modifications has progressed extraordinarily, and many modifications have been characterized and associated with fundamental biological processes and pathological conditions. Histone modifications are the catalytic result of a large set of enzyme families that operate covalent modifications on specific residues at the histone tails. Taken together, these modifications elicit a complex and concerted processing that greatly contributes to chromatin remodeling and may drive different pathological conditions, especially cancer. For this reason, several epigenetic targets are currently under validation for drug discovery purposes, and different academic and industrial programs have already been launched to produce the first pre-clinical and clinical outcomes. In this scenario, computer-aided molecular design techniques are offering important tools, mainly as a consequence of the increasing structural information available for these targets. In this mini-review we will briefly discuss the most common types of known histone modifications and the corresponding operating enzymes, emphasizing the computer-aided molecular design approaches that can be of use to speed up the efforts to generate new pharmaceutically relevant compounds. PMID:26082827

  1. A computer program for multiple decrement life table analyses.

    Science.gov (United States)

    Poole, W K; Cooley, P C

    1977-06-01

    Life table analysis has traditionally been the tool of choice in analyzing distributions of "survival" times when a parametric form for the survival curve cannot be reasonably assumed. Chiang, in two papers [1,2], formalized the theory of life table analyses in a Markov chain framework and derived maximum likelihood estimates of the relevant parameters for the analyses. He also discussed how the techniques could be generalized to consider competing risks and follow-up studies. Although various computer programs exist for doing different types of life table analysis [3], to date there has not been a generally available, well documented computer program to carry out multiple decrement analyses, either by Chiang's or any other method. This paper describes such a program developed by Research Triangle Institute. A user's manual is available at printing cost; it supplements the contents of this paper with a discussion of the formulas used in the program listing.
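
    The arithmetic of a multiple decrement table, splitting each age interval's overall probability of dying among competing causes in proportion to observed deaths, can be sketched in a few lines. The counts below are invented, and the sketch is not the Research Triangle Institute program.

```python
# Minimal sketch of a multiple decrement life table: overall probabilities of dying in
# each age interval are split by cause in proportion to observed deaths. The counts
# below are invented; this is not the Research Triangle Institute program.
radix = 100_000
# (age interval, persons at risk, deaths from cause A, deaths from cause B) -- invented
data = [("0-19", 50_000, 40, 10),
        ("20-39", 48_000, 120, 60),
        ("40-59", 45_000, 900, 300),
        ("60-79", 30_000, 3_000, 1_500)]

survivors = radix
print(f"{'age':>6} {'l_x':>8} {'q_x':>8} {'d_A':>7} {'d_B':>7}")
for age, at_risk, deaths_a, deaths_b in data:
    deaths = deaths_a + deaths_b
    q_all = deaths / at_risk                    # crude interval probability of dying
    d_all = survivors * q_all
    d_a = d_all * deaths_a / deaths             # decrement attributed to cause A
    d_b = d_all * deaths_b / deaths
    print(f"{age:>6} {survivors:8.0f} {q_all:8.4f} {d_a:7.0f} {d_b:7.0f}")
    survivors -= d_all
```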

  2. Computation of subsonic flow around airfoil systems with multiple separation

    Science.gov (United States)

    Jacob, K.

    1982-01-01

    A numerical method for computing the subsonic flow around multi-element airfoil systems was developed, allowing for flow separation at one or more elements. Besides multiple rear separation, short bubbles on the upper surface and cove bubbles can also be approximately taken into account. Compressibility effects for purely subsonic flow are also approximately accounted for. After its presentation, the method is applied to several examples and improved in some details. Finally, the present limitations and desirable extensions are discussed.

  3. One For All? Hitting multiple Alzheimer’s Disease targets with one drug

    Directory of Open Access Journals (Sweden)

    Rebecca Ellen Hughes

    2016-04-01

    Full Text Available Alzheimer’s disease is a complex and multifactorial disease for which the mechanism is still not fully understood. As new insights into disease progression are discovered, new drugs must be designed to target those aspects of the disease that cause neuronal damage rather than just the symptoms currently addressed by single-target drugs. It is becoming possible to target several aspects of the disease pathology at once using multi-target drugs. Intended as an introduction for non-experts, this review describes the key multi-target drug design approaches, namely structure-based, in silico, and data-mining, to evaluate what is preventing compounds from progressing through the clinic to the market. Repurposing current drugs using their off-target effects reduces the cost of development, time to launch, and also the uncertainty associated with safety and pharmacokinetics. The most promising drugs currently being investigated for repurposing to Alzheimer’s disease are rasagiline, originally developed for the treatment of Parkinson’s disease, and liraglutide, an antidiabetic. Rational drug design can combine pharmacophores of multiple drugs, systematically change functional groups, and rank them by virtual screening. Hits confirmed experimentally are rationally modified to generate an effective multi-potent lead compound. Examples from this approach are ASS234, with properties similar to rasagiline, and donecopride, a hybrid of an acetylcholinesterase inhibitor and a 5-HT4 receptor agonist with pro-cognitive effects. Exploiting these interdisciplinary approaches, public-private collaborative lead factories promise faster delivery of new drugs to the clinic.

  4. Simultaneous detection of multiple DNA targets by integrating dual-color graphene quantum dot nanoprobes and carbon nanotubes.

    Science.gov (United States)

    Qian, Zhaosheng; Shan, Xiaoyue; Chai, Lujing; Chen, Jianrong; Feng, Hui

    2014-12-01

    Simultaneous detection of multiple DNA targets was achieved on a biocompatible graphene quantum dot (GQD) and carbon nanotube (CNT) platform through spontaneous assembly between dual-color GQD-based probes and CNTs and subsequent self-recognition between DNA probes and targets. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. A triple axes multiple target holder assembly

    International Nuclear Information System (INIS)

    Tribedi, L.C.; Narvekar, S.D.; Pillay, R.G.; Tandon, P.N.

    1993-01-01

    We have designed and fabricated a rotatable target holder assembly capable of accommodating 27 targets. The target foils are mounted along two concentric circles on a ss wheel. On the outer circle 18 targets can be mounted, each 20deg apart, and on the inner circle the remaining targets are positioned, each 40deg apart. The self-supporting or carbon-backed targets are mounted on thin frames and are placed concentrically at the target position on the wheel. Three degrees of freedom are provided to the target holder assembly. (author). 1 fig

  6. EBF factors drive expression of multiple classes of target genes governing neuronal development.

    Science.gov (United States)

    Green, Yangsook S; Vetter, Monica L

    2011-04-30

    Early B cell factor (EBF) family members are transcription factors known to have important roles in several aspects of vertebrate neurogenesis, including commitment, migration and differentiation. Knowledge of how EBF family members contribute to neurogenesis is limited by a lack of detailed understanding of genes that are transcriptionally regulated by these factors. We performed a microarray screen in Xenopus animal caps to search for targets of EBF transcriptional activity, and identified candidate targets with multiple roles, including transcription factors of several classes. We determined that, among the most upregulated candidate genes with expected neuronal functions, most require EBF activity for some or all of their expression, and most have overlapping expression with ebf genes. We also found that the candidate target genes that had the most strongly overlapping expression patterns with ebf genes were predicted to be direct transcriptional targets of EBF transcriptional activity. The identification of candidate targets that are transcription factor genes, including nscl-1, emx1 and aml1, improves our understanding of how EBF proteins participate in the hierarchy of transcription control during neuronal development, and suggests novel mechanisms by which EBF activity promotes migration and differentiation. Other candidate targets, including pcdh8 and kcnk5, expand our knowledge of the types of terminal differentiated neuronal functions that EBF proteins regulate.

  7. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    Science.gov (United States)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms for computing this time-optimal consensus point, the control law to be used by each agent and the time taken for the consensus to occur, are proposed. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and, (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm, using convexity of attainable sets and Helly's theorem, is proposed, to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N2) run time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.

  8. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Science.gov (United States)

    Bednar, David; Beerens, Koen; Sebestova, Eva; Bendl, Jaroslav; Khare, Sagar; Chaloupkova, Radka; Prokop, Zbynek; Brezovsky, Jan; Baker, David; Damborsky, Jiri

    2015-11-01

    There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  9. FireProt: Energy- and Evolution-Based Computational Design of Thermostable Multiple-Point Mutants.

    Directory of Open Access Journals (Sweden)

    David Bednar

    2015-11-01

    Full Text Available There is great interest in increasing proteins' stability to enhance their utility as biocatalysts, therapeutics, diagnostics and nanomaterials. Directed evolution is a powerful, but experimentally strenuous approach. Computational methods offer attractive alternatives. However, due to the limited reliability of predictions and potentially antagonistic effects of substitutions, only single-point mutations are usually predicted in silico, experimentally verified and then recombined in multiple-point mutants. Thus, substantial screening is still required. Here we present FireProt, a robust computational strategy for predicting highly stable multiple-point mutants that combines energy- and evolution-based approaches with smart filtering to identify additive stabilizing mutations. FireProt's reliability and applicability was demonstrated by validating its predictions against 656 mutations from the ProTherm database. We demonstrate that thermostability of the model enzymes haloalkane dehalogenase DhaA and γ-hexachlorocyclohexane dehydrochlorinase LinA can be substantially increased (ΔTm = 24°C and 21°C) by constructing and characterizing only a handful of multiple-point mutants. FireProt can be applied to any protein for which a tertiary structure and homologous sequences are available, and will facilitate the rapid development of robust proteins for biomedical and biotechnological applications.

  10. Tracking and Recognition of Multiple Human Targets Moving in a Wireless Pyroelectric Infrared Sensor Network

    Directory of Open Access Journals (Sweden)

    Ji Xiong

    2014-04-01

    Full Text Available With its characteristics of low cost and easy deployment, the distributed wireless pyroelectric infrared sensor network has attracted extensive interest, with the aim of making it an alternative to infrared video sensors in thermal biometric applications for tracking and identifying human targets. In these applications, effectively processing the signals collected from the sensors and extracting the features of different human targets has become crucial. This paper proposes the application of empirical mode decomposition and the Hilbert-Huang transform to extract features of moving human targets in both the time domain and the frequency domain. Moreover, the support vector machine is selected as the classifier. The experimental results demonstrate that by using this method the identification rates of multiple moving human targets are around 90%.

  11. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what is the computational benefit for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered, which are persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  12. Macromolecular target prediction by self-organizing feature maps.

    Science.gov (United States)

    Schneider, Gisbert; Schneider, Petra

    2017-03-01

    Rational drug discovery would greatly benefit from a more nuanced appreciation of the activity of pharmacologically active compounds against a diverse panel of macromolecular targets. Already, computational target-prediction models assist medicinal chemists in library screening, de novo molecular design, optimization of active chemical agents, drug re-purposing, in the spotting of potential undesired off-target activities, and in the 'de-orphaning' of phenotypic screening hits. The self-organizing map (SOM) algorithm has been employed successfully for these and other purposes. Areas covered: The authors recapitulate contemporary artificial neural network methods for macromolecular target prediction, and present the basic SOM algorithm at a conceptual level. Specifically, they highlight consensus target-scoring by the employment of multiple SOMs, and discuss the opportunities and limitations of this technique. Expert opinion: Self-organizing feature maps represent a straightforward approach to ligand clustering and classification. Some of the appeal lies in their conceptual simplicity and broad applicability domain. Despite known algorithmic shortcomings, this computational target prediction concept has been proven to work in prospective settings with high success rates. It represents a prototypic technique for future advances in the in silico identification of the modes of action and macromolecular targets of bioactive molecules.
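
    A minimal SOM can be written in a few dozen lines; the sketch below trains a small map on synthetic compound descriptors from two hypothetical target classes and reads the dominant class of the winning neuron as a crude target prediction. It is not any published target-prediction model, and all descriptors, labels and map parameters are invented.

```python
# Minimal sketch of a self-organizing map trained on compound descriptor vectors and
# used to read off the dominant target class of the winning neuron. Descriptors,
# labels and map size are synthetic; this is not a published target-prediction model.
import numpy as np

rng = np.random.default_rng(3)

def train_som(X, rows=6, cols=6, epochs=30, lr0=0.5, sigma0=2.0):
    W = rng.normal(size=(rows, cols, X.shape[1]))
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    n_steps = epochs * len(X)
    step = 0
    for _ in range(epochs):
        for x in rng.permutation(X):
            frac = step / n_steps
            lr, sigma = lr0 * (1 - frac), sigma0 * (1 - frac) + 0.5
            bmu = np.unravel_index(np.argmin(((W - x) ** 2).sum(axis=2)), (rows, cols))
            dist2 = ((grid - np.array(bmu)) ** 2).sum(axis=2)
            h = np.exp(-dist2 / (2 * sigma ** 2))[..., None]     # neighborhood function
            W += lr * h * (x - W)
            step += 1
    return W

def best_matching_unit(W, x):
    return np.unravel_index(np.argmin(((W - x) ** 2).sum(axis=2)), W.shape[:2])

# Two synthetic "target classes" of ligands in a 10-dimensional descriptor space.
class_a = rng.normal(loc=+1.0, size=(40, 10))
class_b = rng.normal(loc=-1.0, size=(40, 10))
X = np.vstack([class_a, class_b])
labels = ["targetA"] * 40 + ["targetB"] * 40

W = train_som(X)
hits = {}                                   # map each neuron to the labels it attracts
for x, lab in zip(X, labels):
    hits.setdefault(best_matching_unit(W, x), []).append(lab)

query = rng.normal(loc=+1.0, size=10)       # new compound resembling class A
neuron = best_matching_unit(W, query)
votes = hits.get(neuron, [])
print("query mapped to neuron", neuron, "->",
      max(set(votes), key=votes.count) if votes else "no neighbors")
```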

  13. Multiple single-board-computer system for the KEK positron generator control

    International Nuclear Information System (INIS)

    Nakahara, Kazuo; Abe, Isamu; Enomoto, Atsushi; Otake, Yuji; Urano, Takao

    1986-01-01

    The KEK positron generator is controlled by means of a distributed microprocessor network. The control system is composed of three kinds of equipment: device controllers for the linac equipment, operation management stations and a communication network. Each piece of linac equipment has its own microprocessor-based controller. A multiple single board computer (SBC) system is used for communication control and for equipment surveillance; it holds a database containing communication and linac equipment status information. The linac operation management, which should be the most flexible part of the control system, is separated from the multiple SBC system and is carried out by workstations. The principle that every processor executes only one task is maintained throughout the control system. This made the software architecture very simple. (orig.)

  14. Target localization on standard axial images in computed tomography (CT) stereotaxis for functional neurosurgery - a technical note

    International Nuclear Information System (INIS)

    Patil, A.-A.

    1986-01-01

    A simple technique for marking functional neurosurgery target on computed tomography (CT) axial image is described. This permits the use of standard axial image for computed tomography (CT) stereotaxis in functional neurosurgery. (Author)

  15. Charged particles multiplicity in interactions of 3.7 A GeV 28Si with light and heavy target nuclei in nuclear emulsions

    International Nuclear Information System (INIS)

    Singh, B.K.; Tuli, S.K.

    1998-01-01

    Results from measurements of the multiplicities of different charged particles emitted in the interactions of 3.7 A GeV 28 Si with different target groups in nuclear emulsion, and correlations among them, are presented. The nature of the dependence of the charged-particle multiplicities on the impact parameter is examined. Analysis of the data in terms of specific multiplicity for different target groups is performed and the results are discussed in the light of the superposition model. (author)

  16. Detection and Identification of Multiple Stationary Human Targets Via Bio-Radar Based on the Cross-Correlation Method

    Directory of Open Access Journals (Sweden)

    Yang Zhang

    2016-10-01

    Full Text Available Ultra-wideband (UWB) radar has been widely used for detecting human physiological signals (respiration, movement, etc.) in the fields of rescue, security, and medicine owing to its high penetrability and range resolution. In these applications, especially in rescue after disaster (earthquake, collapse, mine accident, etc.), the presence, number, and location of the trapped victims to be detected and rescued are the key issues of concern. Ample research has been done on the first issue, whereas the identification and localization of multi-targets remains a challenge. False positive and negative identification results are two common problems associated with the detection of multiple stationary human targets. This is mainly because the energy of the signal reflected from the target close to the receiving antenna is considerably stronger than those of the targets at further range, often leading to missing or false recognition if the identification method is based on the energy of the respiratory signal. Therefore, a novel method based on cross-correlation is proposed in this paper that is based on the relativity and periodicity of the signals, rather than on the energy. The validity of this method is confirmed through experiments using different scenarios; the results indicate a discernible improvement in the detection precision and identification of the multiple stationary targets.

  17. Detection and Identification of Multiple Stationary Human Targets Via Bio-Radar Based on the Cross-Correlation Method.

    Science.gov (United States)

    Zhang, Yang; Chen, Fuming; Xue, Huijun; Li, Zhao; An, Qiang; Wang, Jianqi; Zhang, Yang

    2016-10-27

    Ultra-wideband (UWB) radar has been widely used for detecting human physiological signals (respiration, movement, etc.) in the fields of rescue, security, and medicine owing to its high penetrability and range resolution. In these applications, especially in rescue after disaster (earthquake, collapse, mine accident, etc.), the presence, number, and location of the trapped victims to be detected and rescued are the key issues of concern. Ample research has been done on the first issue, whereas the identification and localization of multi-targets remains a challenge. False positive and negative identification results are two common problems associated with the detection of multiple stationary human targets. This is mainly because the energy of the signal reflected from the target close to the receiving antenna is considerably stronger than those of the targets at further range, often leading to missing or false recognition if the identification method is based on the energy of the respiratory signal. Therefore, a novel method based on cross-correlation is proposed in this paper that is based on the relativity and periodicity of the signals, rather than on the energy. The validity of this method is confirmed through experiments using different scenarios; the results indicate a discernible improvement in the detection precision and identification of the multiple stationary targets.
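    A hedged sketch of the core idea above, cross-correlating slow-time (respiration-band) signals from different range bins and relying on their periodicity and mutual correlation rather than on signal energy, is shown below. The real bio-radar processing chain (clutter removal, filtering, adaptive thresholds) is considerably more involved, and every parameter and the grouping rule here are illustrative assumptions.

      import numpy as np

      def normalized_xcorr(a, b):
          """Peak of the normalized cross-correlation between two zero-mean signals."""
          a = (a - a.mean()) / (a.std() + 1e-12)
          b = (b - b.mean()) / (b.std() + 1e-12)
          c = np.correlate(a, b, mode="full") / len(a)
          return np.max(np.abs(c))

      def detect_targets(range_profiles, corr_threshold=0.6):
          """range_profiles: slow-time signal per range bin (n_bins x n_samples).
          Adjacent bins dominated by the same breathing motion correlate strongly, so each
          group of mutually correlated bins is reported as one candidate stationary target."""
          targets, current = [], [0]
          for i in range(1, len(range_profiles)):
              if normalized_xcorr(range_profiles[i - 1], range_profiles[i]) > corr_threshold:
                  current.append(i)
              else:
                  if len(current) > 1:
                      targets.append(current)
                  current = [i]
          if len(current) > 1:
              targets.append(current)
          return targets  # list of range-bin groups, one per detected target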

  18. miR-137 inhibits the invasion of melanoma cells through downregulation of multiple oncogenic target genes.

    Science.gov (United States)

    Luo, Chonglin; Tetteh, Paul W; Merz, Patrick R; Dickes, Elke; Abukiwan, Alia; Hotz-Wagenblatt, Agnes; Holland-Cunz, Stefan; Sinnberg, Tobias; Schittek, Birgit; Schadendorf, Dirk; Diederichs, Sven; Eichmüller, Stefan B

    2013-03-01

    MicroRNAs are small noncoding RNAs that regulate gene expression and have important roles in various types of cancer. Previously, miR-137 was reported to act as a tumor suppressor in different cancers, including malignant melanoma. In this study, we show that low miR-137 expression is correlated with poor survival in stage IV melanoma patients. We identified and validated two genes (c-Met and YB1) as direct targets of miR-137 and confirmed two previously known targets, namely enhancer of zeste homolog 2 (EZH2) and microphthalmia-associated transcription factor (MITF). Functional studies showed that miR-137 suppressed melanoma cell invasion through the downregulation of multiple target genes. The decreased invasion caused by miR-137 overexpression could be phenocopied by small interfering RNA knockdown of EZH2, c-Met, or Y box-binding protein 1 (YB1). Furthermore, miR-137 inhibited melanoma cell migration and proliferation. Finally, miR-137 induced apoptosis in melanoma cell lines and decreased BCL2 levels. In summary, our study confirms that miR-137 acts as a tumor suppressor in malignant melanoma and reveals that miR-137 regulates multiple targets including c-Met, YB1, EZH2, and MITF.

  19. EBF factors drive expression of multiple classes of target genes governing neuronal development

    Directory of Open Access Journals (Sweden)

    Vetter Monica L

    2011-04-01

    Full Text Available Abstract Background Early B cell factor (EBF) family members are transcription factors known to have important roles in several aspects of vertebrate neurogenesis, including commitment, migration and differentiation. Knowledge of how EBF family members contribute to neurogenesis is limited by a lack of detailed understanding of genes that are transcriptionally regulated by these factors. Results We performed a microarray screen in Xenopus animal caps to search for targets of EBF transcriptional activity, and identified candidate targets with multiple roles, including transcription factors of several classes. We determined that, among the most upregulated candidate genes with expected neuronal functions, most require EBF activity for some or all of their expression, and most have overlapping expression with ebf genes. We also found that the candidate target genes that had the most strongly overlapping expression patterns with ebf genes were predicted to be direct transcriptional targets of EBF transcriptional activity. Conclusions The identification of candidate targets that are transcription factor genes, including nscl-1, emx1 and aml1, improves our understanding of how EBF proteins participate in the hierarchy of transcription control during neuronal development, and suggests novel mechanisms by which EBF activity promotes migration and differentiation. Other candidate targets, including pcdh8 and kcnk5, expand our knowledge of the types of terminal differentiated neuronal functions that EBF proteins regulate.

  20. Common features of microRNA target prediction tools

    Directory of Open Access Journals (Sweden)

    Sarah M. Peterson

    2014-02-01

    Full Text Available The human genome encodes for over 1800 microRNAs, which are short noncoding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one microRNA to target multiple gene transcripts, microRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of microRNA targets is a critical initial step in identifying microRNA:mRNA target interactions for experimental validation. The available tools for microRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to microRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all microRNA target prediction tools, four main aspects of the microRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MicroRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.
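    To make the 'seed match' feature above concrete, the short sketch below checks whether the reverse complement of a microRNA seed (here the canonical 7mer-m8 site, nucleotides 2-8) occurs in a candidate 3' UTR. The other shared features named in the review (conservation, free energy, site accessibility) need genome alignments and RNA-folding tools and are not shown; the example sequences are illustrative only.

      COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

      def seed_site(mirna, start=1, end=8):
          """Reverse complement of the miRNA seed region (0-based slice 1:8 = nucleotides 2-8)."""
          return "".join(COMPLEMENT[nt] for nt in reversed(mirna[start:end]))

      def find_seed_matches(mirna, utr):
          """Return 0-based positions in the UTR (RNA alphabet) that match the 7mer-m8 seed site."""
          site = seed_site(mirna)
          return [i for i in range(len(utr) - len(site) + 1) if utr[i:i + len(site)] == site]

      # Illustrative miRNA and UTR fragment; the returned positions here are [3, 13].
      positions = find_seed_matches("UGAGGUAGUAGGUUGUAUAGUU", "AAGCUACCUCAAACUACCUCAGG")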

  1. Primary intestinal lymphangiectasia: Multiple detector computed tomography findings after direct lymphangiography.

    Science.gov (United States)

    Sun, Xiaoli; Shen, Wenbin; Chen, Xiaobai; Wen, Tingguo; Duan, Yongli; Wang, Rengui

    2017-10-01

    To analyse the findings of multiple detector computed tomography (MDCT) after direct lymphangiography in primary intestinal lymphangiectasia (PIL). Fifty-five patients with PIL were retrospectively reviewed. All patients underwent MDCT after direct lymphangiography. The pathologies of 16 patients were confirmed by surgery and the remaining 39 patients were confirmed by gastroendoscopy and/or capsule endoscopy. After direct lymphangiography, MDCT found intra- and extraintestinal as well as lymphatic vessel abnormalities. Among the intra- and extraintestinal disorders, 49 patients had varying degrees of intestinal dilatation, 46 had small bowel wall thickening, 9 had pleural and pericardial effusions, 21 had ascites, 41 had mesenteric oedema, 20 had mesenteric nodules and 9 had abdominal lymphatic cysts. Features of lymphatic vessel abnormalities included intestinal trunk reflux (43.6%, n = 24), lumbar trunk reflux (89.1%, n = 49), pleural and pulmonary lymph reflux (14.5%, n = 8), pericardial and mediastinal lymph reflux (16.4%, n = 9), mediastinal and pulmonary lymph reflux (18.2%, n = 10), and thoracic duct outlet obstruction (90.9%, n = 50). Multiple detector computed tomography after direct lymphangiography provides a safe and accurate examination method and is an excellent tool for the diagnosis of PIL. © 2017 The Royal Australian and New Zealand College of Radiologists.

  2. Influences of target geometry on the microdosimetry of alpha particles in water

    International Nuclear Information System (INIS)

    Huston, T.E.

    1992-01-01

    Application of microdosimetric concepts to radiation exposure situations requires knowledge of the single-event density function, f 1 (z), where z denotes the specific energy imparted to target matter. Multiple-event density functions are calculated by taking convolutions of f 1 (z) with itself; the overall specific energy density function is then found by employing a compound Poisson process involving the single- and multiple-event spectra. The function f 1 (z) depends strongly on the geometric details of the source, the target, and all intermediate matter. While most past applications of microdosimetry have represented targets as spheres, many targets may be better modeled as prolate or oblate spheroids. Using a ray-tracing technique coupled with a continuous-slowing-down approximation, methods are developed and presented for calculating single-event density functions for spheroidal targets irradiated by alpha-emitting point sources. The computational methods are incorporated into a fortran computer code entitled SEROID (single-event density functions for spheroids), which is listed in this paper. The code was used to generate several single-event density functions, along with related means and standard deviations in specific energy, for spheroidal targets irradiated by alpha particles. Targets of varying shapes and orientations are examined. Results for non-spherical targets are compared to spherical targets of equal volume in order to assess the influence that target geometry has on single-event quantities. From these comparisons it is found that both target shape and orientation are important in adequately characterizing the quantities examined in this study; over-simplifying the target geometry can lead to substantial error.
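    The single-to-multiple-event construction described above can be illustrated with a short numerical sketch: the n-event density is the n-fold self-convolution of f1(z), and the overall specific-energy density mixes these densities with Poisson weights for the expected number of events. This is a generic illustration of the compound Poisson step on a uniform z-grid, not the SEROID code; the discretized delta for the zero-event term and the cutoff on the number of events are assumptions.

      import numpy as np
      from math import exp, factorial

      def specific_energy_density(f1, dz, mean_events, max_events=20):
          """Compound-Poisson specific-energy density built from a single-event density f1(z).

          f1 is sampled on a uniform grid of spacing dz and normalized so that sum(f1) * dz == 1."""
          f = np.zeros(len(f1) * max_events)
          fn = np.array([1.0 / dz])                      # n = 0 term: delta at z = 0, discretized
          for n in range(max_events + 1):
              p_n = exp(-mean_events) * mean_events ** n / factorial(n)
              f[:len(fn)] += p_n * fn                    # add the n-event contribution
              fn = np.convolve(fn, f1) * dz              # n-fold convolution -> (n+1)-fold
          return f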

  3. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  4. Multiplicities of charged kaons from deep-inelastic muon scattering off an isoscalar target

    CERN Document Server

    Adolph, C.

    2017-04-10

    Precise measurements of charged-kaon multiplicities in deep inelastic scattering were performed. The results are presented in three-dimensional bins of the Bjorken scaling variable x, the relative virtual-photon energy y, and the fraction z of the virtual-photon energy carried by the produced hadron. The data were obtained by the COMPASS Collaboration by scattering 160 GeV muons off an isoscalar 6 LiD target. They cover the kinematic domain above 1 (GeV/c)^2 in the photon virtuality Q^2 and above 5 GeV/c^2 in the invariant mass W of the hadronic system. The results from the sum of the z-integrated K+ and K- multiplicities at high x point to a value of the non-strange quark fragmentation function larger than obtained by the earlier DSS fit.

  5. An algorithm to compute a rule for division problems with multiple references

    Directory of Open Access Journals (Sweden)

    Sánchez Sánchez, Francisca J.

    2012-01-01

    Full Text Available In this paper we consider an extension of the classic division problem with claims: the division problem with multiple references. Hinojosa et al. (2012) provide a solution for this type of problem. The aim of this work is to extend their results by proposing an algorithm that calculates allocations based on these results. All computational details are provided in the paper.

  6. A framework for multiple kernel support vector regression and its applications to siRNA efficacy prediction.

    Science.gov (United States)

    Qiu, Shibin; Lane, Terran

    2009-01-01

    The cell defense mechanism of RNA interference has applications in gene function analysis and promising potentials in human disease therapy. To effectively silence a target gene, it is desirable to select appropriate initiator siRNA molecules having satisfactory silencing capabilities. Computational prediction for silencing efficacy of siRNAs can assist this screening process before using them in biological experiments. String kernel functions, which operate directly on the string objects representing siRNAs and target mRNAs, have been applied to support vector regression for the prediction and improved accuracy over numerical kernels in multidimensional vector spaces constructed from descriptors of siRNA design rules. To fully utilize information provided by string and numerical data, we propose to unify the two in a kernel feature space by devising a multiple kernel regression framework where a linear combination of the kernels is used. We formulate the multiple kernel learning into a quadratically constrained quadratic programming (QCQP) problem, which although yields global optimal solution, is computationally demanding and requires a commercial solver package. We further propose three heuristics based on the principle of kernel-target alignment and predictive accuracy. Empirical results demonstrate that multiple kernel regression can improve accuracy, decrease model complexity by reducing the number of support vectors, and speed up computational performance dramatically. In addition, multiple kernel regression evaluates the importance of constituent kernels, which for the siRNA efficacy prediction problem, compares the relative significance of the design rules. Finally, we give insights into the multiple kernel regression mechanism and point out possible extensions.
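    A small sketch of the kernel-combination idea, assuming precomputed kernel matrices: the weights are set from a simplified kernel-target alignment (one of the heuristics mentioned above, used here in reduced form rather than the QCQP formulation), and the combined kernel is passed to a support-vector regressor with a precomputed kernel. At prediction time the same weights would be applied to the test-versus-train kernel matrices.

      import numpy as np
      from sklearn.svm import SVR

      def alignment(K, y):
          """Simplified kernel-target alignment between kernel matrix K and target vector y."""
          Ky = np.outer(y, y)
          return np.sum(K * Ky) / (np.linalg.norm(K) * np.linalg.norm(Ky))

      def combine_kernels(kernels, y):
          """Weight each kernel by its (clipped, normalized) alignment with the targets."""
          w = np.array([max(alignment(K, y), 0.0) for K in kernels])
          w = w / w.sum()
          return sum(wi * K for wi, K in zip(w, kernels)), w

      def fit_mkl_svr(train_kernels, y_train, C=1.0, epsilon=0.1):
          """Fit an SVR on the weighted sum of precomputed (n_train x n_train) kernel matrices."""
          K_combined, weights = combine_kernels(train_kernels, y_train)
          model = SVR(kernel="precomputed", C=C, epsilon=epsilon)
          model.fit(K_combined, y_train)
          return model, weights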

  7. Clinical efficacy and management of monoclonal antibodies targeting CD38 and SLAMF7 in multiple myeloma

    DEFF Research Database (Denmark)

    van de Donk, Niels W C J; Moreau, Philippe; Plesner, Torben

    2016-01-01

    Immunotherapeutic strategies are emerging as promising therapeutic approaches in multiple myeloma (MM), with several monoclonal antibodies in advanced stages of clinical development. Of these agents, CD38-targeting antibodies have marked single agent activity in extensively pretreated MM...... of therapeutic antibodies with immunofixation and serum protein electrophoresis assays may lead to underestimation of complete response. Strategies to mitigate interference, based on shifting the therapeutic antibody band, are in development. Furthermore, daratumumab, and probably also other CD38-targeting...

  8. An individual differences approach to multiple-target visual search errors: How search errors relate to different characteristics of attention.

    Science.gov (United States)

    Adamo, Stephen H; Cain, Matthew S; Mitroff, Stephen R

    2017-12-01

    A persistent problem in visual search is that searchers are more likely to miss a target if they have already found another in the same display. This phenomenon, the Subsequent Search Miss (SSM) effect, has remained despite being a known issue for decades. Increasingly, evidence supports a resource depletion account of SSM errors: a previously detected target consumes attentional resources, leaving fewer resources available for the processing of a second target. However, "attention" is broadly defined and is composed of many different characteristics, leaving considerable uncertainty about how attention affects second-target detection. The goal of the current study was to identify which attentional characteristics (i.e., selection, limited capacity, modulation, and vigilance) related to second-target misses. The current study compared second-target misses to an attentional blink task and a vigilance task, which both have established measures that were used to operationally define each of four attentional characteristics. Second-target misses in the multiple-target search were correlated with (1) a measure of the time it took for the second target to recover from the blink in the attentional blink task (i.e., modulation), and (2) target sensitivity (d') in the vigilance task (i.e., vigilance). Participants with longer recovery and poorer vigilance had more second-target misses in the multiple-target visual search task. The results add further support to a resource depletion account of SSM errors and highlight that worse modulation and poor vigilance reflect a deficit in attentional resources that can account for SSM errors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Aliskiren targets multiple systems to alleviate cancer cachexia.

    Science.gov (United States)

    Wang, Chaoyi; Guo, Dunwei; Wang, Qiang; You, Song; Qiao, Zhongpeng; Liu, Yong; Dai, Hang; Tang, Hua

    2016-11-01

    To examine the effects of aliskiren, a small-molecule renin inhibitor, on cancer cachexia and to explore the underlying mechanisms. A cancer cachexia model was established by subcutaneously injecting C26 mouse colon carcinoma cells into isogenic BALB/c mice. Aliskiren was administered intragastrically [10 mg/kg body weight (BW)] on day 5 (as a preventive strategy, AP group) or on day 12 (as a therapeutic strategy, AT group) after C26 injection. Mice that received no C26 injection (healthy controls, HC group) or only C26 injection but not aliskiren (cancer, CA group) were used as controls. BW, tumor growth, whole body functions, and survival were monitored daily in half of the mice in each group, whereas serum, tumors, and gastrocnemius muscles were harvested from the other mice after sacrifice on day 20 for further analysis. Aliskiren significantly alleviated multiple cachexia‑associated symptoms, including BW loss, tumor burden, muscle wasting, muscular dysfunction, and shortened survival. On the molecular level, aliskiren antagonized cachexia‑induced activation of the renin‑angiotensin system (RAS), systematic and muscular inflammation, oxidative stress, and autophagy‑lysosome as well as ubiquitin‑proteasome stimulation. In addition, early administration of aliskiren before cachexia development (AP group) resulted in more robust effects in alleviating cachexia or targeting underlying mechanisms than administration after cachexia development (AT group). Aliskiren exhibited potent anti‑cachexia activities. These activities were achieved through the targeting of at least four mechanisms underlying cachexia development: RAS activation, increase in systematic inflammation, upregulation of oxidative stress, and stimulation of autophagy-lysosome pathway (ALP) and ubiquitin-proteasome pathway (UPP).

  10. ClustalXeed: a GUI-based grid computation version for high performance and terabyte size multiple sequence alignment

    Directory of Open Access Journals (Sweden)

    Kim Taeho

    2010-09-01

    Full Text Available Abstract Background There is an increasing demand to assemble and align large-scale biological sequence data sets. The commonly used multiple sequence alignment programs are still limited in their ability to handle very large amounts of sequences because the system lacks a scalable high-performance computing (HPC) environment with a greatly extended data storage capacity. Results We designed ClustalXeed, a software system for multiple sequence alignment with incremental improvements over previous versions of the ClustalX and ClustalW-MPI software. The primary advantage of ClustalXeed over other multiple sequence alignment software is its ability to align a large family of protein or nucleic acid sequences. To solve the conventional memory-dependency problem, ClustalXeed uses both physical random access memory (RAM) and a distributed file-allocation system for distance matrix construction and pair-align computation. The computational efficiency of the disk-storage system was markedly improved by implementing an efficient load-balancing algorithm, called the "idle node-seeking task algorithm" (INSTA). The new editing option and the graphical user interface (GUI) provide ready access to a parallel-computing environment for users who seek fast and easy alignment of large DNA and protein sequence sets. Conclusions ClustalXeed can now compute a large volume of biological sequence data sets, which were not tractable in any other parallel or single MSA program. The main developments include: (1) the ability to tackle larger sequence alignment problems than possible with previous systems through markedly improved storage-handling capabilities; (2) an efficient task load-balancing algorithm, INSTA, which improves overall processing times for multiple sequence alignment with input sequences of non-uniform length; and (3) support for both single PC and distributed cluster systems.

  11. Time-resolved x-ray line emission studies of thermal transport in multiple beam uv-irradiated targets

    International Nuclear Information System (INIS)

    Jaanimagi, P.A.; Henke, B.L.; Delettrez, J.; Richardson, M.C.

    1984-01-01

    Thermal transport in spherical targets irradiated with multiple, nanosecond duration laser beams, has been a topic of much discussion recently. Different inferences on the level of thermal flux inhibition have been drawn from plasma velocity and x-ray spectroscopic diagnostics. We present new measurements of thermal transport on spherical targets made through time-resolved x-ray spectroscopic measurements of the progress of the ablation surface through thin layers of material on the surface of the target. These measurements, made with 6 and 12 uv (351 nm) nanosecond beams from OMEGA, will be compared to previous thermal transport measurements. Transparencies of the conference presentation are given

  12. Multiple-Choice versus Constructed-Response Tests in the Assessment of Mathematics Computation Skills.

    Science.gov (United States)

    Gadalla, Tahany M.

    The equivalence of multiple-choice (MC) and constructed response (discrete) (CR-D) response formats as applied to mathematics computation at grade levels two to six was tested. The difference between total scores from the two response formats was tested for statistical significance, and the factor structure of items in both response formats was…

  13. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    Science.gov (United States)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the `speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
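    The non-dominated (Pareto) filtering that underlies this kind of multi-objective parameter fitting can be expressed compactly; the sketch below keeps only parameter sets whose per-dataset error vectors are not dominated by any other candidate. The speciation and genetic operators of the proposed algorithm are omitted, and model_error is a hypothetical placeholder for a fit measure against one dataset.

      import numpy as np

      def dominates(a, b):
          """a dominates b if it is no worse on every objective and strictly better on at least one
          (objectives are errors, so lower is better)."""
          return np.all(a <= b) and np.any(a < b)

      def non_dominated(population, errors):
          """Keep parameter sets whose error vectors (one entry per dataset) are non-dominated."""
          keep = []
          for i, e_i in enumerate(errors):
              if not any(dominates(errors[j], e_i) for j in range(len(errors)) if j != i):
                  keep.append(population[i])
          return keep

      # Illustrative use with a hypothetical model_error(params, dataset) fit measure:
      # errors = np.array([[model_error(p, d) for d in datasets] for p in population])
      # front = non_dominated(population, errors)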

  14. The computer-aided design of a servo system as a multiple-criteria decision problem

    NARCIS (Netherlands)

    Udink ten Cate, A.J.

    1986-01-01

    This paper treats the selection of controller gains of a servo system as a multiple-criteria decision problem. In contrast to the usual optimization-based approaches to computer-aided design, inequality constraints are included in the problem as unconstrained objectives. This considerably simplifies

  15. A Lévy HJM Multiple-Curve Model with Application to CVA Computation

    DEFF Research Database (Denmark)

    Crépey, Stéphane; Grbac, Zorana; Ngor, Nathalie

    2015-01-01

    , the calibration to OTM swaptions guaranteeing that the model correctly captures volatility smile effects and the calibration to co-terminal ATM swaptions ensuring an appropriate term structure of the volatility in the model. To account for counterparty risk and funding issues, we use the calibrated multiple......-curve model as an underlying model for CVA computation. We follow a reduced-form methodology through which the problem of pricing the counterparty risk and funding costs can be reduced to a pre-default Markovian BSDE, or an equivalent semi-linear PDE. As an illustration, we study the case of a basis swap...... and a related swaption, for which we compute the counterparty risk and funding adjustments...

  16. Multiplicities of charged kaons from deep-inelastic muon scattering off an isoscalar target

    Directory of Open Access Journals (Sweden)

    C. Adolph

    2017-04-01

    Full Text Available Precise measurements of charged-kaon multiplicities in deep inelastic scattering were performed. The results are presented in three-dimensional bins of the Bjorken scaling variable x, the relative virtual-photon energy y, and the fraction z of the virtual-photon energy carried by the produced hadron. The data were obtained by the COMPASS Collaboration by scattering 160 GeV muons off an isoscalar 6LiD target. They cover the kinematic domain above 1 (GeV/c)^2 in the photon virtuality Q^2 and above 5 GeV/c^2 in the invariant mass W of the hadronic system. The results from the sum of the z-integrated K+ and K− multiplicities at high x point to a value of the non-strange quark fragmentation function larger than obtained by the earlier DSS fit.

  17. ApicoAP: the first computational model for identifying apicoplast-targeted proteins in multiple species of Apicomplexa.

    Directory of Open Access Journals (Sweden)

    Gokcen Cilingir

    Full Text Available Most of the parasites of the phylum Apicomplexa contain a relict prokaryotic-derived plastid called the apicoplast. This organelle is important not only for the survival of the parasite, but its unique properties make it an ideal drug target. The majority of apicoplast-associated proteins are nuclear encoded and targeted post-translationally to the organellar lumen via a bipartite signaling mechanism that requires an N-terminal signal and transit peptide (TP. Attempts to define a consensus motif that universally identifies apicoplast TPs have failed.In this study, we propose a generalized rule-based classification model to identify apicoplast-targeted proteins (ApicoTPs that use a bipartite signaling mechanism. Given a training set specific to an organism, this model, called ApicoAP, incorporates a procedure based on a genetic algorithm to tailor a discriminating rule that exploits the known characteristics of ApicoTPs. Performance of ApicoAP is evaluated for four labeled datasets of Plasmodium falciparum, Plasmodium yoelii, Babesia bovis, and Toxoplasma gondii proteins. ApicoAP improves the classification accuracy of the published dataset for P. falciparum to 94%, originally 90% using PlasmoAP.We present a parametric model for ApicoTPs and a procedure to optimize the model parameters for a given training set. A major asset of this model is that it is customizable to different parasite genomes. The ApicoAP prediction software is available at http://code.google.com/p/apicoap/ and http://bcb.eecs.wsu.edu.

  18. P2-21: Searching for Multiple Targets Using the iPad

    Directory of Open Access Journals (Sweden)

    Ian Thornton

    2012-10-01

    Full Text Available Search for multiple targets is constrained by both retrospective (i.e., where you've been) and prospective (i.e., where you're planning to go) components of performance. Previous studies using the Multi-Item Localisation (MILO) task have demonstrated that participants accurately remember and discount locations they have already visited and that they plan future actions up to 2 or 3 items ahead (Thornton & Horowitz, 2004, Perception & Psychophysics, 66, 38–50). A prominent feature of the MILO serial-reaction time (SRT) function is a highly elevated, that is slowed, response to T1 compared to T2 and all the other items. This "prospective gap" is typically between 600 ms and 1000 ms. Here we present three experiments that use the MILO iPad app to explore this "prospective gap". In Experiment 1, we "shuffled" the position of future targets each time a response was made. This blocks planning and thus slows all responses to the level of the first target, effectively eliminating the gap. In Experiment 2, participants responded to eight identical targets, removing the need to plan a specific sequence of actions. In this situation, absolute response time is greatly reduced and the T1–T2 gap shrinks to around 350 ms. In Experiment 3, participants repeated their search through the same array 10 times. Under these circumstances, the gap systematically reduced from 1300 ms on trial 1 to 300 ms on trial 10. Together, these results suggest that the previously observed prospective gap is a combination of set-up time for registering a new visual layout, response preparation, and sequence planning.

  19. 'Multi-epitope-targeted' immune-specific therapy for a multiple sclerosis-like disease via engineered multi-epitope protein is superior to peptides.

    Directory of Open Access Journals (Sweden)

    Nathali Kaushansky

    Full Text Available Antigen-induced peripheral tolerance is potentially one of the most efficient and specific therapeutic approaches for autoimmune diseases. Although highly effective in animal models, antigen-based strategies have not yet been translated into practicable human therapy, and several clinical trials using a single antigen or peptidic epitope in multiple sclerosis (MS) yielded disappointing results. In these clinical trials, however, the apparent complexity and dynamics of the pathogenic autoimmunity associated with MS, which result from the multiplicity of potential target antigens and "epitope spread", have not been sufficiently considered. Thus, targeting pathogenic T-cells reactive against a single antigen/epitope is unlikely to be sufficient; to be effective, immunospecific therapy for MS should logically neutralize concomitantly T-cells reactive against as many major target antigens/epitopes as possible. We investigated such a "multi-epitope-targeting" approach in murine experimental autoimmune encephalomyelitis (EAE) associated with a single ("classical") or multiple ("complex") anti-myelin autoreactivities, using a cocktail of different encephalitogenic peptides vis-a-vis an artificial multi-epitope protein (designated Y-MSPc) encompassing rationally selected MS-relevant epitopes of five major myelin antigens, as "multi-epitope-targeting" agents. Y-MSPc was superior to peptide(s) in concomitantly downregulating pathogenic T-cells reactive against multiple myelin antigens/epitopes, via inducing more effective, longer lasting peripheral regulatory mechanisms (cytokine shift, anergy, and Foxp3+ CTLA4+ regulatory T-cells). Y-MSPc was also consistently more effective than the disease-inducing single peptide or peptide cocktail, not only in suppressing the development of "classical" or "complex EAE" or ameliorating ongoing disease, but most importantly, in reversing chronic EAE. Overall, our data emphasize that a "multi-epitope-targeting" strategy is required for

  20. Realization of quantum gates with multiple control qubits or multiple target qubits in a cavity

    Science.gov (United States)

    Waseem, Muhammad; Irfan, Muhammad; Qamar, Shahid

    2015-06-01

    We propose a scheme to realize a three-qubit controlled phase gate and a multi-qubit controlled NOT gate of one qubit simultaneously controlling n-target qubits with a four-level quantum system in a cavity. The implementation time for multi-qubit controlled NOT gate is independent of the number of qubit. Three-qubit phase gate is generalized to n-qubit phase gate with multiple control qubits. The number of steps reduces linearly as compared to conventional gate decomposition method. Our scheme can be applied to various types of physical systems such as superconducting qubits coupled to a resonator and trapped atoms in a cavity. Our scheme does not require adjustment of level spacing during the gate implementation. We also show the implementation of Deutsch-Joza algorithm. Finally, we discuss the imperfections due to cavity decay and the possibility of physical implementation of our scheme.
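    For reference, the unitaries these gates implement can be written down directly; the sketch below builds, in plain NumPy, the matrix of a phase gate with several control qubits and of a NOT applied to several target qubits under one control. This is generic linear algebra for checking the gate action, independent of the cavity-QED implementation proposed in the paper.

      import numpy as np

      def multi_controlled_phase(n_controls, phi):
          """Diagonal unitary that applies exp(i*phi) only when all n_controls + 1 qubits are |1>."""
          dim = 2 ** (n_controls + 1)
          U = np.eye(dim, dtype=complex)
          U[-1, -1] = np.exp(1j * phi)              # the |11...1> basis state picks up the phase
          return U

      def one_control_many_targets_not(n_targets):
          """NOT on n target qubits controlled by one qubit (taken as the most significant bit)."""
          dim = 2 ** (n_targets + 1)
          U = np.zeros((dim, dim))
          for state in range(dim):
              if (state >> n_targets) & 1:           # control bit set: flip every target bit
                  U[state ^ (2 ** n_targets - 1), state] = 1.0
              else:                                  # control bit clear: identity
                  U[state, state] = 1.0
          return U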

  1. Target matching based on multi-view tracking

    Science.gov (United States)

    Liu, Yahui; Zhou, Changsheng

    2011-01-01

    A feature matching method is proposed based on Maximally Stable Extremal Regions (MSER) and Scale Invariant Feature Transform (SIFT) to solve the problem of the same target matching in multiple cameras. Target foreground is extracted by using frame difference twice and bounding box which is regarded as target regions is calculated. Extremal regions are got by MSER. After fitted into elliptical regions, those regions will be normalized into unity circles and represented with SIFT descriptors. Initial matching is obtained from the ratio of the closest distance to second distance less than some threshold and outlier points are eliminated in terms of RANSAC. Experimental results indicate the method can reduce computational complexity effectively and is also adapt to affine transformation, rotation, scale and illumination.
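    A hedged OpenCV sketch of the matching chain described above (MSER regions, SIFT descriptors, ratio-test matching, RANSAC outlier rejection) is given below. The frame-differencing foreground step and the normalization of elliptical regions to unit circles are omitted, MSER region points are subsampled into keypoints as a simplification, and all thresholds are illustrative.

      import cv2
      import numpy as np

      def match_target(img_a, img_b, ratio=0.75):
          """Match a target region between two camera views with MSER keypoints and SIFT descriptors."""
          mser = cv2.MSER_create()
          sift = cv2.SIFT_create()

          def describe(img):
              regions, _ = mser.detectRegions(img)
              kps = [cv2.KeyPoint(float(x), float(y), 6.0) for r in regions for x, y in r[::20]]
              return sift.compute(img, kps)

          kps_a, des_a = describe(img_a)
          kps_b, des_b = describe(img_b)

          matcher = cv2.BFMatcher(cv2.NORM_L2)
          good = [m for m, n in matcher.knnMatch(des_a, des_b, k=2) if m.distance < ratio * n.distance]

          src = np.float32([kps_a[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
          dst = np.float32([kps_b[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
          H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)   # RANSAC removes outlier matches
          return H, int(mask.sum()) if mask is not None else 0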

  2. On Combining Multiple-Instance Learning and Active Learning for Computer-Aided Detection of Tuberculosis

    NARCIS (Netherlands)

    Melendez Rodriguez, J.C.; Ginneken, B. van; Maduskar, P.; Philipsen, R.H.H.M.; Ayles, H.; Sanchez, C.I.

    2016-01-01

    The major advantage of multiple-instance learning (MIL) applied to a computer-aided detection (CAD) system is that it allows optimizing the latter with case-level labels instead of accurate lesion outlines as traditionally required for a supervised approach. As shown in previous work, a MIL-based

  3. Detecting and Georegistering Moving Ground Targets in Airborne QuickSAR via Keystoning and Multiple-Phase Center Interferometry

    Directory of Open Access Journals (Sweden)

    R. P. Perry

    2008-03-01

    Full Text Available SAR images experience significant range walk and, without some form of motion compensation, can be quite blurred. The MITRE-developed Keystone formatting simultaneously and automatically compensates for range walk due to the radial velocity component of each moving target, independent of the number of targets or the value of each target's radial velocity with respect to the ground. Target radial motion also causes moving targets in synthetic aperture radar images to appear at locations offset from their true instantaneous locations on the ground. In a multichannel radar, the interferometric phase values associated with all nonmoving points on the ground appear as a continuum of phase differences while the moving targets appear as interferometric phase discontinuities. By multiple threshold comparisons and grouping of pixels within the intensity and the phase images, we show that it is possible to reliably detect and accurately georegister moving targets within short-duration SAR (QuickSAR images.

  4. Monoclonal IgG in MGUS and multiple myeloma targets infectious pathogens

    Science.gov (United States)

    Bosseboeuf, Adrien; Feron, Delphine; Tallet, Anne; Rossi, Cédric; Charlier, Cathy; Garderet, Laurent; Caillot, Denis; Moreau, Philippe; Cardó-Vila, Marina; Pasqualini, Renata; Nelson, Alfreda Destea; Wilson, Bridget S.; Perreault, Hélène; Piver, Eric; Weigel, Pierre; Harb, Jean; Bigot-Corbel, Edith; Hermouet, Sylvie

    2017-01-01

    Subsets of mature B cell neoplasms are linked to infection with intracellular pathogens such as Epstein-Barr virus (EBV), hepatitis C virus (HCV), or Helicobacter pylori. However, the association between infection and the immunoglobulin-secreting (Ig-secreting) B proliferative disorders remains largely unresolved. We investigated whether the monoclonal IgG (mc IgG) produced by patients diagnosed with monoclonal gammopathy of undetermined significance (MGUS) or multiple myeloma (MM) targets infectious pathogens. Antigen specificity of purified mc IgG from a large patient cohort (n = 244) was determined using a multiplex infectious-antigen array (MIAA), which screens for reactivity to purified antigens or lysates from 9 pathogens. Purified mc IgG from 23.4% of patients (57 of 244) specifically recognized 1 pathogen in the MIAA. EBV was the most frequent target (15.6%), with 36 of 38 mc IgGs recognizing EBV nuclear antigen-1 (EBNA-1). MM patients with EBNA-1–specific mc IgG (14.0%) showed substantially greater bone marrow plasma cell infiltration and higher β2-microglobulin and inflammation/infection–linked cytokine levels compared with other smoldering myeloma/MM patients. Five other pathogens were the targets of mc IgG: herpes simplex virus-1 (2.9%), varicella zoster virus (1.6%), cytomegalovirus (0.8%), hepatitis C virus (1.2%), and H. pylori (1.2%). We conclude that a dysregulated immune response to infection may underlie disease onset and/or progression of MGUS and MM for subsets of patients. PMID:28978808

  5. Multi-Stage System for Automatic Target Recognition

    Science.gov (United States)

    Chao, Tien-Hsin; Lu, Thomas T.; Ye, David; Edens, Weston; Johnson, Oliver

    2010-01-01

    A multi-stage automated target recognition (ATR) system has been designed to perform computer vision tasks with adequate proficiency in mimicking human vision. The system is able to detect, identify, and track targets of interest. Potential regions of interest (ROIs) are first identified by the detection stage using an Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter combined with a wavelet transform. False positives are then eliminated by the verification stage using feature extraction methods in conjunction with neural networks. Feature extraction transforms the ROIs using filtering and binning algorithms to create feature vectors. A feedforward back-propagation neural network (NN) is then trained to classify each feature vector and to remove false positives. The system parameter optimizations process has been developed to adapt to various targets and datasets. The objective was to design an efficient computer vision system that can learn to detect multiple targets in large images with unknown backgrounds. Because the target size is small relative to the image size in this problem, there are many regions of the image that could potentially contain the target. A cursory analysis of every region can be computationally efficient, but may yield too many false positives. On the other hand, a detailed analysis of every region can yield better results, but may be computationally inefficient. The multi-stage ATR system was designed to achieve an optimal balance between accuracy and computational efficiency by incorporating both models. The detection stage first identifies potential ROIs where the target may be present by performing a fast Fourier domain OT-MACH filter-based correlation. Because threshold for this stage is chosen with the goal of detecting all true positives, a number of false positives are also detected as ROIs. The verification stage then transforms the regions of interest into feature space, and eliminates false positives using an
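    The detection stage sketched above is, at its core, a frequency-domain correlation followed by thresholding. The snippet below shows that skeleton with a generic matched filter built from a single target template; deriving a true OT-MACH filter additionally requires the training-image statistics, so this generic filter stands in for it only as an assumption, and the peak-picking parameters are illustrative.

      import numpy as np

      def correlation_plane(image, template):
          """Correlate a large image with a filter template in the Fourier domain."""
          H = np.conj(np.fft.fft2(template, s=image.shape))   # matched filter from the template
          return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

      def detect_rois(image, template, rel_threshold=0.7, size=32, max_candidates=50):
          """Return coarse regions of interest around correlation peaks for a later verification stage."""
          plane = correlation_plane(image, template)
          peaks = np.argwhere(plane > rel_threshold * plane.max())
          rois = []
          for y, x in peaks[:max_candidates]:                 # cap the candidates passed downstream
              rois.append((max(0, y - size // 2), max(0, x - size // 2), size, size))
          return rois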

  6. Multiple single-element transducer photoacoustic computed tomography system

    Science.gov (United States)

    Kalva, Sandeep Kumar; Hui, Zhe Zhi; Pramanik, Manojit

    2018-02-01

    Light absorption by the chromophores (hemoglobin, melanin, water, etc.) present in any biological tissue results in a local temperature rise. This rise in temperature results in the generation of pressure waves due to the thermoelastic expansion of the tissue. In a circular scanning photoacoustic computed tomography (PACT) system, these pressure waves can be detected using a single-element ultrasound transducer (SUST) (while rotating in full 360° around the sample) or using a circular array transducer. An SUST takes several minutes to acquire the PA data around the sample, whereas the circular array transducer takes only a fraction of a second. Hence, for real-time imaging, circular array transducers are preferred. However, these circular array transducers are custom made, expensive, and not easily available in the market, whereas SUSTs are cheap and readily available. Using SUSTs for PACT systems is therefore still cost effective. In order to reduce the scanning time to a few seconds, instead of using a single SUST (rotating 360°), multiple SUSTs can be used at the same time to acquire the PA data. This reduces the scanning time by two-fold in the case of two SUSTs (rotating 180°), or by four-fold and eight-fold in the case of four SUSTs (rotating 90°) and eight SUSTs (rotating 45°), respectively. Here we show that with multiple SUSTs, similar PA images (numerical and experimental phantom data) can be obtained as with a single SUST.

  7. Nucleotide excision repair is a potential therapeutic target in multiple myeloma

    Science.gov (United States)

    Szalat, R; Samur, M K; Fulciniti, M; Lopez, M; Nanjappa, P; Cleynen, A; Wen, K; Kumar, S; Perini, T; Calkins, A S; Reznichenko, E; Chauhan, D; Tai, Y-T; Shammas, M A; Anderson, K C; Fermand, J-P; Arnulf, B; Avet-Loiseau, H; Lazaro, J-B; Munshi, N C

    2018-01-01

    Despite the development of novel drugs, alkylating agents remain an important component of therapy in multiple myeloma (MM). DNA repair processes contribute towards sensitivity to alkylating agents and therefore we here evaluate the role of nucleotide excision repair (NER), which is involved in the removal of bulky adducts and DNA crosslinks in MM. We first evaluated NER activity using a novel functional assay and observed a heterogeneous NER efficiency in MM cell lines and patient samples. Using next-generation sequencing data, we identified that expression of the canonical NER gene, excision repair cross-complementation group 3 (ERCC3), significantly impacted the outcome in newly diagnosed MM patients treated with alkylating agents. Next, using small RNA interference, stable knockdown and overexpression, and small-molecule inhibitors targeting xeroderma pigmentosum complementation group B (XPB), the DNA helicase encoded by ERCC3, we demonstrate that NER inhibition significantly increases sensitivity and overcomes resistance to alkylating agents in MM. Moreover, inhibiting XPB leads to the dual inhibition of NER and transcription and is particularly efficient in myeloma cells. Altogether, we show that NER impacts alkylating agents sensitivity in myeloma cells and identify ERCC3 as a potential therapeutic target in MM. PMID:28588253

  8. One Novel Multiple-Target Plasmid Reference Molecule Targeting Eight Genetically Modified Canola Events for Genetically Modified Canola Detection.

    Science.gov (United States)

    Li, Zhuqing; Li, Xiang; Wang, Canhua; Song, Guiwen; Pi, Liqun; Zheng, Lan; Zhang, Dabing; Yang, Litao

    2017-09-27

    Multiple-target plasmid DNA reference materials have been generated and utilized as good substitutes of matrix-based reference materials in the analysis of genetically modified organisms (GMOs). Herein, we report the construction of one multiple-target plasmid reference molecule, pCAN, which harbors eight GM canola event-specific sequences (RF1, RF2, MS1, MS8, Topas 19/2, Oxy235, RT73, and T45) and a partial sequence of the canola endogenous reference gene PEP. The applicability of this plasmid reference material in qualitative and quantitative PCR assays of the eight GM canola events was evaluated, including the analysis of specificity, limit of detection (LOD), limit of quantification (LOQ), and performance of pCAN in the analysis of various canola samples, etc. The LODs are 15 copies for RF2, MS1, and RT73 assays using pCAN as the calibrator and 10 genome copies for the other events. The LOQ in each event-specific real-time PCR assay is 20 copies. In quantitative real-time PCR analysis, the PCR efficiencies of all event-specific and PEP assays are between 91% and 97%, and the squared regression coefficients (R 2 ) are all higher than 0.99. The quantification bias values varied from 0.47% to 20.68% with relative standard deviation (RSD) from 1.06% to 24.61% in the quantification of simulated samples. Furthermore, 10 practical canola samples sampled from imported shipments in the port of Shanghai, China, were analyzed employing pCAN as the calibrator, and the results were comparable with those assays using commercial certified materials as the calibrator. Concluding from these results, we believe that this newly developed pCAN plasmid is one good candidate for being a plasmid DNA reference material in the detection and quantification of the eight GM canola events in routine analysis.

  9. Combining on-chip synthesis of a focused combinatorial library with computational target prediction reveals imidazopyridine GPCR ligands.

    Science.gov (United States)

    Reutlinger, Michael; Rodrigues, Tiago; Schneider, Petra; Schneider, Gisbert

    2014-01-07

    Using the example of the Ugi three-component reaction we report a fast and efficient microfluidic-assisted entry into the imidazopyridine scaffold, where building block prioritization was coupled to a new computational method for predicting ligand-target associations. We identified an innovative GPCR-modulating combinatorial chemotype featuring ligand-efficient adenosine A1/2B and adrenergic α1A/B receptor antagonists. Our results suggest the tight integration of microfluidics-assisted synthesis with computer-based target prediction as a viable approach to rapidly generate bioactivity-focused combinatorial compound libraries with high success rates. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Introducing the RadBioStat Educational Software: Computer-Assisted Teaching of the Random Nature of Cell Killing

    Directory of Open Access Journals (Sweden)

    Safari A

    2014-06-01

    Full Text Available The interaction of radiation with cells and tissues has a random nature. Therefore, understanding the random nature of cell killing, which is described by Poisson statistics, is an essential point in education on radiation biology. RadBioStat is a newly developed educational MATLAB-based software package designed for computer-assisted learning of target theory in radiation biology. Although its potential applications are developing rapidly, currently RadBioStat can be a useful tool in the computer-assisted education of radiobiological models such as single-target single-hit, multiple-target single-hit and multiple-target multiple-hit. Scholars’ feedback is valuable to the producers of this software and helps them continuously improve this product, add new features and increase its desirability and functionality.
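    The multi-target single-hit model named above has a simple closed form that such software visualizes: if each of n targets is hit (with Poisson statistics) with probability 1 - exp(-D/D0), the cell survives unless all n targets are hit, so S(D) = 1 - (1 - exp(-D/D0))^n, with single-target single-hit as the n = 1 case. A brief Python sketch with illustrative parameter values follows (the tool itself is MATLAB-based):

      import numpy as np

      def survival_multi_target_single_hit(dose, d0, n_targets):
          """Surviving fraction under the multi-target single-hit model.
          Each of the n targets is inactivated with probability 1 - exp(-D/D0);
          the cell is killed only when every target has been hit."""
          p_target_hit = 1.0 - np.exp(-dose / d0)
          return 1.0 - p_target_hit ** n_targets

      doses = np.linspace(0.0, 10.0, 6)                                          # Gy, illustrative
      s_single = survival_multi_target_single_hit(doses, d0=1.5, n_targets=1)    # pure exponential
      s_multi = survival_multi_target_single_hit(doses, d0=1.5, n_targets=3)     # shouldered curve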

  11. Multiple targets of salicylic acid and its derivatives in plants and animals

    Directory of Open Access Journals (Sweden)

    Daniel F. Klessig

    2016-05-01

    Full Text Available Salicylic acid (SA) is a critical plant hormone that is involved in many processes, including seed germination, root initiation, stomatal closure, floral induction, thermogenesis, and response to abiotic and biotic stresses. Its central role in plant immunity, although extensively studied, is still only partially understood. Classical biochemical approaches and, more recently, genome-wide high-throughput screens have identified more than two dozen plant SA-binding proteins (SABPs), as well as multiple candidates that have yet to be characterized. Some of these proteins bind SA with high affinity, while others exhibit low affinity. Given that SA levels vary greatly even within a particular plant species depending on subcellular location, tissue type, developmental stage, and with respect to both time and location after an environmental stimulus such as infection, the presence of SABPs exhibiting a wide range of affinities for SA may provide great flexibility and multiple mechanisms through which SA can act. SA and its derivatives, both natural and synthetic, also have multiple targets in animals/humans. Interestingly, many of these proteins, like their plant counterparts, are associated with immunity or disease development. Two recently identified SABPs, High Mobility Group Box protein (HMGB) and Glyceraldehyde 3-Phosphate Dehydrogenase (GAPDH), are critical proteins that not only serve key structural or metabolic functions, but also play prominent roles in disease responses in both kingdoms.

  12. Evaluation of a modified Fitts law brain-computer interface target acquisition task in able and motor disabled individuals

    Science.gov (United States)

    Felton, E. A.; Radwin, R. G.; Wilson, J. A.; Williams, J. C.

    2009-10-01

    A brain-computer interface (BCI) is a communication system that takes recorded brain signals and translates them into real-time actions, in this case movement of a cursor on a computer screen. This work applied Fitts' law to the evaluation of performance on a target acquisition task during sensorimotor rhythm-based BCI training. Fitts' law, which has been used as a predictor of movement time in studies of human movement, was used here to determine the information transfer rate, which was based on target acquisition time and target difficulty. The information transfer rate was used to make comparisons between control modalities and subject groups on the same task. Data were analyzed from eight able-bodied and five motor disabled participants who wore an electrode cap that recorded and translated their electroencephalogram (EEG) signals into computer cursor movements. Direct comparisons were made between able-bodied and disabled subjects, and between EEG and joystick cursor control in able-bodied subjects. Fitts' law aptly described the relationship between movement time and index of difficulty for each task movement direction when evaluated separately and averaged together. This study showed that Fitts' law can be successfully applied to computer cursor movement controlled by neural signals.
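    In Fitts' law terms, the quantities referred to above are the index of difficulty, ID = log2(2D/W) for target distance D and width W, and a throughput-style information transfer rate ID/MT obtained from the target acquisition (movement) time MT. The snippet below computes them for a set of trials using the classical Fitts formulation; this is the standard textbook form, not necessarily the authors' exact analysis code, and the trial values are made up.

      import numpy as np

      def index_of_difficulty(distance, width):
          """Fitts index of difficulty in bits: ID = log2(2D / W)."""
          return np.log2(2.0 * np.asarray(distance) / np.asarray(width))

      def information_transfer_rate(distance, width, movement_time):
          """Per-trial throughput in bits per second: ID divided by acquisition time."""
          return index_of_difficulty(distance, width) / np.asarray(movement_time)

      # Illustrative trials: cursor travel distance and target width in screen units, time in seconds.
      d = [200.0, 300.0, 400.0]
      w = [50.0, 50.0, 25.0]
      t = [1.2, 1.5, 2.3]
      itr = information_transfer_rate(d, w, t)      # bits per second for each trial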

  13. Interactive computer graphics for stereotactic neurosurgery

    International Nuclear Information System (INIS)

    Goodman, J.H.; Davis, J.R.; Gahbauer, R.A.

    1986-01-01

    A microcomputer (IBM PC/AT) system has been developed to incorporate multiple image sources for stereotactic neurosurgery. Hard-copy data are calibrated and captured with a video camera and frame grabber. Line contours are generated automatically on the basis of gray-scale density or digitized manually. Interactive computer graphics provide a format with acceptable speed and accuracy for stereotactic neurosurgery. The ability to dimensionally integrate existing image data from multiple sources for target selection makes preoperative scans and scanner-compatible head holders unnecessary. The system description and examples of its use for brain tumor biopsy and brachytherapy are presented.

  14. Computational multiple steady states for enzymatic esterification of ethanol and oleic acid in an isothermal CSTR.

    Science.gov (United States)

    Ho, Pang-Yen; Chuang, Guo-Syong; Chao, An-Chong; Li, Hsing-Ya

    2005-05-01

    The capacity of complex biochemical reaction networks (consisting of 11 coupled non-linear ordinary differential equations) to show multiple steady states was investigated. The system involved esterification of ethanol and oleic acid by lipase in an isothermal continuous stirred tank reactor (CSTR). The Deficiency One Algorithm and the Subnetwork Analysis were applied to determine the steady state multiplicity. A set of rate constants and two corresponding steady states were computed. The phenomena of bistability, hysteresis and bifurcation are discussed. Moreover, the capacity for steady state multiplicity is extended to the family of the studied reaction networks.

  15. Computed a multiple band metamaterial absorber and its application based on the figure of merit value

    Science.gov (United States)

    Chen, Chao; Sheng, Yuping; Jun, Wang

    2018-01-01

    A high-performance multiple-band metamaterial absorber, composed of two kinds of separated metal-particle sub-structures, is designed and computed with the software Ansoft HFSS 10.0. The multiple-band absorption of the metamaterial absorber arises from the resonance of localized surface plasmon (LSP) modes excited near the edges of the metal particles. The damping constant of the gold layer is optimized to obtain a near-perfect absorption rate. Four kinds of dielectric layers are computed to achieve perfect absorption performance. The absorption performance of the metamaterial absorber is further enhanced by optimizing the structural parameters (R = 75 nm, w = 80 nm). Moreover, a perfect absorption band is achieved because of the plasmonic hybridization between LSP modes. The designed metamaterial absorber is highly sensitive to changes in the refractive index of the surrounding liquid. A liquid refractive index sensing strategy is proposed based on the computed figure of merit (FOM) value of the metamaterial absorber. High FOM values (116, 111, and 108) are achieved with three liquids (methanol, carbon tetrachloride, and carbon disulfide).
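
    For context, a commonly used figure of merit for refractive-index sensors is the bulk sensitivity divided by the resonance linewidth. The abstract does not give the exact expression used in the study, so the following is only an assumed, minimal sketch with invented numbers.

```python
def figure_of_merit(resonance_shift_nm, index_change_riu, fwhm_nm):
    """FOM = bulk sensitivity (nm per refractive-index unit) / resonance linewidth (nm)."""
    sensitivity = resonance_shift_nm / index_change_riu   # nm / RIU
    return sensitivity / fwhm_nm

# Invented numbers: a 30 nm resonance shift for a 0.05 RIU change with a 12 nm linewidth
print(figure_of_merit(30.0, 0.05, 12.0))   # -> 50.0
```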

  16. Computational design and application of endogenous promoters for transcriptionally targeted gene therapy for rheumatoid arthritis.

    NARCIS (Netherlands)

    Geurts, J.; Joosten, L.A.B.; Takahashi, N.; Arntz, O.J.; Gluck, A.; Bennink, M.B.; Berg, W.B. van den; Loo, F.A.J. van de

    2009-01-01

    The promoter regions of genes that are differentially regulated in the synovial membrane during the course of rheumatoid arthritis (RA) represent attractive candidates for application in transcriptionally targeted gene therapy. In this study, we applied an unbiased computational approach to define

  17. ODMBP: Behavior Forwarding for Multiple Property Destinations in Mobile Social Networks

    Directory of Open Access Journals (Sweden)

    Jia Xu

    2016-01-01

    Full Text Available Smartphones have become widely available in recent years. Wireless networks and personalized mobile devices are deeply integrated and embedded in our lives. Behavior-based forwarding has become a new transmission paradigm supporting many novel applications. However, commodities, services, and individuals usually have multiple properties in their interests and behaviors. In this paper, we profile these multiple properties and propose an Opportunistic Dissemination Protocol based on Multiple Behavior Profile (ODMBP) for mobile social networks. We first map the interest space to the behavior space and extract multiple behavior profiles from the behavior space. Then, we propose a correlation computing model, based on the principle of BM25, to calculate the correlation metric between multiple behavior profiles. In our protocol, the correlation metric is used to forward the message to users who are more similar to the target. ODMBP consists of three stages: user initialization, gradient ascent, and group spread. Through extensive simulations, we demonstrate that the proposed multiple behavior profile and correlation computing model are correct and efficient. Compared to other classical routing protocols, ODMBP significantly improves performance in terms of delivery ratio, delay, and overhead ratio.
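
    The abstract states only that the correlation metric follows the principle of BM25; how the protocol maps behavior profiles onto BM25's terms is not specified. The sketch below is therefore one plausible adaptation, in which the target's profile plays the role of the query and a candidate relay's profile plays the role of the document; all profile keys, frequencies, IDF weights, and parameters are illustrative assumptions.

```python
def bm25_correlation(candidate, target, idf, avg_len, k1=1.5, b=0.75):
    """BM25-style correlation between a candidate relay's behavior profile
    (the 'document') and the target's profile (the 'query').
    Profiles map behavior keys to observed frequencies."""
    doc_len = sum(candidate.values())
    score = 0.0
    for behavior in target:
        freq = candidate.get(behavior, 0.0)
        if freq == 0.0:
            continue
        norm = freq * (k1 + 1) / (freq + k1 * (1 - b + b * doc_len / avg_len))
        score += idf.get(behavior, 1.0) * norm
    return score

# Illustrative profiles and IDF weights (all values are invented)
target = {"sports": 3, "music": 1}
candidate = {"sports": 2, "news": 4}
print(bm25_correlation(candidate, target, idf={"sports": 1.2, "music": 0.8}, avg_len=5.0))
```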

  18. D⁻ production by multiple charge-transfer collisions in metal-vapor targets. [1 to 50 keV D⁺]

    Energy Technology Data Exchange (ETDEWEB)

    Schlachter, A.S.

    1977-09-01

    A beam of D⁻ ions can be produced by multiple charge-transfer collisions of a D⁺ beam in a thick metal-vapor target. Cross sections and equilibrium charge-state fractions are presented and discussed.

  19. Utilisation of computational fluid dynamics techniques for design of molybdenum target specification

    International Nuclear Information System (INIS)

    Yeoh, G.H.; Wassink, D.

    2003-01-01

    A three-dimensional computational fluid dynamics (CFD) model is described for investigating the hydraulic behaviour within a model of the liner and irradiation rig located in the central portion of the HIFAR fuel element. Flow visualisation and LDV measurements are performed to better understand the fluid flow around the narrow spaces within the irradiation rig, annular target cans and liner. Based on an unstructured mesh of triangular elements and tetrahedra generated for the flow space of the geometrical structure, the CFD model was able to predict complex flow structures inside the liner containing the irradiation rig and target cans. The reliability of the model was validated against experiments. The predicted flow behaviour was comparable to the experimental observations. Predicted velocities were also found to be in good agreement with LDV measurements. (author)

  20. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    Science.gov (United States)

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing has led to a shortage of efficient ultra-large biological sequence alignment approaches capable of coping with different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g. files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets (files larger than 1 GB) showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences. HAlign-II shows extremely high memory efficiency and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II with open-source codes and datasets was established at http://lab.malab.cn/soft/halign.

  1. Pacing a data transfer operation between compute nodes on a parallel computer

    Science.gov (United States)

    Blocksome, Michael A [Rochester, MN

    2011-09-13

    Methods, systems, and products are disclosed for pacing a data transfer between compute nodes on a parallel computer that include: transferring, by an origin compute node, a chunk of an application message to a target compute node; sending, by the origin compute node, a pacing request to a target direct memory access (`DMA`) engine on the target compute node using a remote get DMA operation; determining, by the origin compute node, whether a pacing response to the pacing request has been received from the target DMA engine; and transferring, by the origin compute node, a next chunk of the application message if the pacing response to the pacing request has been received from the target DMA engine.
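
    The pacing scheme described in the abstract can be summarized as a simple loop: transfer a chunk, issue a pacing request to the target DMA engine, and transfer the next chunk only after the pacing response arrives. The sketch below is a schematic rendering only; the callables are hypothetical stand-ins for the origin node's DMA operations, not the actual messaging API of the patented system.

```python
def paced_send(message, chunk_size, send_chunk, send_pacing_request, pacing_response_received):
    """Schematic pacing loop: after each chunk, issue a pacing request to the
    target DMA engine and wait for its response before sending the next chunk.
    All callables are hypothetical stand-ins for the origin node's DMA operations."""
    offset = 0
    while offset < len(message):
        send_chunk(message[offset:offset + chunk_size])  # transfer one chunk to the target node
        send_pacing_request()                            # remote-get pacing request to the target DMA engine
        while not pacing_response_received():            # pacing response gates further progress
            pass
        offset += chunk_size
```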

  2. Interpolating between random walks and optimal transportation routes: Flow with multiple sources and targets

    Science.gov (United States)

    Guex, Guillaume

    2016-05-01

    In recent articles about graphs, different models proposed a formalism to find a type of path between two nodes, the source and the target, at a crossroads between the shortest-path and the random-walk path. These models include a freely adjustable parameter, allowing the behavior of the path to be tuned toward randomized movements or direct routes. This article presents a natural generalization of these models, namely a model with multiple sources and targets. In this context, source nodes can be viewed as locations with a supply of a certain good (e.g. people, money, information) and target nodes as locations with a demand of the same good. An algorithm is constructed to display the flow of goods in the network between sources and targets. With again a freely adjustable parameter, this flow can be tuned to follow routes of minimum cost, thus displaying the flow in the context of the optimal transportation problem or, by contrast, a random flow, known to be similar to the electrical current flow if the random walk is reversible. Moreover, a source-target coupling can be retrieved from this flow, offering an optimal assignment to the transportation problem. This algorithm is described in the first part of this article and then illustrated with case studies.

  3. Generalized internal multiple imaging

    KAUST Repository

    Zuberi, Mohammad Akbar Hosain

    2014-12-04

    Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.

  4. Generalized internal multiple imaging

    KAUST Repository

    Zuberi, Mohammad Akbar Hosain; Alkhalifah, Tariq

    2014-01-01

    Various examples are provided for generalized internal multiple imaging (GIMI). In one example, among others, a method includes generating a higher order internal multiple image using a background Green's function and rendering the higher order internal multiple image for presentation. In another example, a system includes a computing device and a generalized internal multiple imaging (GIMI) application executable in the computing device. The GIMI application includes logic that generates a higher order internal multiple image using a background Green's function and logic that renders the higher order internal multiple image for display on a display device. In another example, a non-transitory computer readable medium has a program executable by processing circuitry that generates a higher order internal multiple image using a background Green's function and renders the higher order internal multiple image for display on a display device.

  5. Selected Arterial Infusion Chemotherapy Combined with Target Drugs for Non-small Cell Lung Cancer with Multiple Brain Metastases

    Directory of Open Access Journals (Sweden)

    Jinduo LI

    2012-05-01

    Full Text Available Background and objective The aim of this study is to evaluate the efficacy of selected arterial infusion chemotherapy in treating non-small cell lung cancer (NSCLC) with multiple brain metastases and the corresponding factors influencing prognosis. Methods From September 2008 to October 2011, a total of 31 patients of NSCLC with multiple brain metastases (≥3) received selected intracranial, bronchial and corresponding target arterial infusion chemotherapy combined with EGFR-TKIs. Interventional treatment was performed every four weeks, for two to six cycles, with synchronized or sequential targeted drugs (erlotinib, gefitinib or icotinib). Follow-up CT and MRI were performed at four-week intervals after two cycles of interventional treatment or while the patients were taking targeted drugs, in order to evaluate the efficacy of the therapy. The procedure was stopped when the disease progressed or the patient could no longer tolerate drug toxicity. Results The 31 patients received two to six cycles of interventional therapy, three cycles on average. Response assessment showed that 5 (16.1%) patients achieved a complete response (CR), 7 (22.6%) had a partial response (PR), 11 (35.5%) had stable disease (SD) and 8 (25.8%) had progressive disease (PD). The objective response rate (ORR) was 38.7%, and the disease control rate was 74.2%. The median progression free survival (PFS) and overall survival (OS) were 13.1 months and 15.1 months. The 6-month, one-year and two-year survival rates were 79%, 61.1%, and 31.1%, respectively. The patients' OS and PFS were influenced by smoking status, tumor pathology, extracranial metastases, duration of targeted drug use and performance status, but not by sex, age, prior therapy or the total number of brain metastases. Conclusion Selected arterial infusion chemotherapy with targeted drugs is an effective and safe treatment for NSCLC with multiple brain metastases. Smoking status, tumor

  6. Detailed computer simulation of damage accumulation in ion irradiated crystalline targets

    International Nuclear Information System (INIS)

    Jaraiz, M.; Arias, J.; Bailon, L.A.; Barbolla, J.J.

    1993-01-01

    A new version of the collision cascade simulation program MARLOWE is presented. This version incorporates damage build-up in full detail, i.e. every interstitial and vacancy generated is retained throughout the simulation and can become a target in subsequent collisions, unless it recombines at some stage during the implantation. Vacancy-interstitial recombination is simulated by annihilating those pairs whose radius is less than a specified recombination radius. Also, stopped atoms are moved to their nearest lattice interstitial site if it is not occupied. In this way, a fully physical simulation can be carried out in detail, thus preserving a valuable feature of MARLOWE. To overcome the prohibitive computation time and memory required, a scheme has been followed to handle in a suitable way the data generated as the simulation proceeds. The model is described. Examples of memory and computation time requirements and damage accumulation effects on channelling in ion implantation are also presented. (Author)

  7. Computer-Aided Approaches for Targeting HIVgp41

    Directory of Open Access Journals (Sweden)

    William J. Allen

    2012-08-01

    Full Text Available Virus-cell fusion is the primary means by which the human immunodeficiency virus-1 (HIV) delivers its genetic material into the human T-cell host. Fusion is mediated in large part by the viral glycoprotein 41 (gp41), which advances through four distinct conformational states: (i) native, (ii) pre-hairpin intermediate, (iii) fusion active (fusogenic), and (iv) post-fusion. The pre-hairpin intermediate is a particularly attractive step for therapeutic intervention given that gp41 N-terminal heptad repeat (NHR) and C-terminal heptad repeat (CHR) domains are transiently exposed prior to the formation of a six-helix bundle required for fusion. Most peptide-based inhibitors, including the FDA-approved drug T20, target the intermediate and there are significant efforts to develop small molecule alternatives. Here, we review current approaches to studying interactions of inhibitors with gp41 with an emphasis on atomic-level computer modeling methods including molecular dynamics, free energy analysis, and docking. Atomistic modeling yields a unique level of structural and energetic detail, complementary to experimental approaches, which will be important for the design of improved next generation anti-HIV drugs.

  8. Neural Computations in a Dynamical System with Multiple Time Scales

    Directory of Open Access Journals (Sweden)

    Yuanyuan Mi

    2016-09-01

    Full Text Available Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at single neurons, and short-term facilitation (STF) and depression (STD) at neuronal synapses. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what the computational benefit is for the brain to have such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in their dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  9. The multiple roles of computational chemistry in fragment-based drug design

    Science.gov (United States)

    Law, Richard; Barker, Oliver; Barker, John J.; Hesterkamp, Thomas; Godemann, Robert; Andersen, Ole; Fryatt, Tara; Courtney, Steve; Hallett, Dave; Whittaker, Mark

    2009-08-01

    Fragment-based drug discovery (FBDD) represents a change in strategy from the screening of molecules with higher molecular weights and physical properties more akin to fully drug-like compounds, to the screening of smaller, less complex molecules. This is because it has been recognised that fragment hit molecules can be efficiently grown and optimised into leads, particularly after the binding mode to the target protein has been first determined by 3D structural elucidation, e.g. by NMR or X-ray crystallography. Several studies have shown that medicinal chemistry optimisation of an already drug-like hit or lead compound can result in a final compound with too high molecular weight and lipophilicity. The evolution of a lower molecular weight fragment hit therefore represents an attractive alternative approach to optimisation as it allows better control of compound properties. Computational chemistry can play an important role both prior to a fragment screen, in producing a target focussed fragment library, and post-screening in the evolution of a drug-like molecule from a fragment hit, both with and without the available fragment-target co-complex structure. We will review many of the current developments in the area and illustrate with some recent examples from successful FBDD discovery projects that we have conducted.

  10. Targeting p110gamma in gastrointestinal cancers: attack on multiple fronts

    Directory of Open Access Journals (Sweden)

    Marco eFalasca

    2014-10-01

    Full Text Available Phosphoinositide 3-kinases (PI3Ks) regulate several cellular functions that are critical for cancer progression and development, including cell survival, proliferation and migration. Three classes of PI3Ks exist, with the class I PI3Ks encompassing four isoforms of the catalytic subunit known as p110α, p110β, p110γ and p110δ. Although for many years attention has been mainly focused on p110α, recent evidence supports the conclusion that p110β, p110γ and p110δ can also have a role in cancer. Amongst these, accumulating evidence now supports the conclusion that p110γ is involved in several cellular processes associated with cancer development and progression, and indeed this specific isoform has emerged as a novel important player in cancer progression. Studies from our laboratory have identified a specific overexpression of p110γ in human pancreatic ductal adenocarcinoma (PDAC) and in hepatocellular carcinoma (HCC) tissues compared to their normal counterparts. Our data have further established that selective inhibition of this PI3K isoform is able to block PDAC and HCC cell proliferation, strongly suggesting that pharmacological inhibition of this enzyme can directly affect the growth of these tumors. Furthermore, increasing evidence suggests that p110γ also plays a key role in the interactions between cancer cells and the tumor microenvironment, and in particular in the tumor-associated immune response. It has also been reported that p110γ can regulate invasion of myeloid cells into tumors and tumor angiogenesis. Finally, p110γ has also been directly involved in regulation of cancer cell migration. Taken together, these data indicate that p110γ plays multiple roles in the regulation of several processes that are critical for tumor progression and metastasis. This review will discuss the role of p110γ in gastrointestinal tumor development and progression and how targeting this enzyme might represent a way to target very aggressive tumors such as pancreatic and

  11. SeedVicious: Analysis of microRNA target and near-target sites.

    Science.gov (United States)

    Marco, Antonio

    2018-01-01

    Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
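
    Canonical site detection of the kind seedVicious performs can be illustrated with a toy seed-matching routine: find the reverse complement of the miRNA seed in a transcript and also report windows that differ by exactly one nucleotide (near-target sites). This sketch is not the seedVicious implementation; the sequence inputs and the choice of the 7mer-m8 seed definition are illustrative assumptions.

```python
COMPLEMENT = {"A": "T", "C": "G", "G": "C", "T": "A", "U": "A"}

def seed_site(mirna, kind="7mer-m8"):
    """DNA reverse complement of the miRNA seed (positions 2-8 or 2-7)."""
    seed = mirna.upper()[1:8] if kind == "7mer-m8" else mirna.upper()[1:7]
    return "".join(COMPLEMENT[b] for b in reversed(seed))

def find_sites(utr, mirna):
    """Positions of canonical seed matches and near-target sites (one mismatch)."""
    utr, site = utr.upper(), seed_site(mirna)
    k = len(site)
    canonical, near = [], []
    for i in range(len(utr) - k + 1):
        mismatches = sum(a != b for a, b in zip(utr[i:i + k], site))
        if mismatches == 0:
            canonical.append(i)
        elif mismatches == 1:
            near.append(i)
    return canonical, near

# Toy example with the let-7 seed; the UTR fragment is invented
print(find_sites("AAGCTACCTCAGTTCTACGTC", "UGAGGUAGUAGGUUGUAUAGUU"))
```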

  12. The pituitary tumor transforming gene 1 (PTTG-1): An immunological target for multiple myeloma

    Directory of Open Access Journals (Sweden)

    Gagliano Nicoletta

    2008-04-01

    Full Text Available Abstract Background Multiple Myeloma is a cancer of B plasma cells, which produce non-specific antibodies and proliferate uncontrollably. Due to the potential for relapse and the non-specificity of current treatments, immunotherapy promises to be more specific and may induce long-term immunity in patients. The pituitary tumor transforming gene 1 (PTTG-1) has been shown to be a novel oncogene, expressed in the testis, thymus, colon, lung and placenta (undetectable in most other tissues). Furthermore, it is overexpressed in many tumors such as pituitary adenoma, breast and gastrointestinal cancers, leukemia, lymphoma, and lung cancer, and it seems to be associated with tumorigenesis, angiogenesis and cancer progression. The purpose was to investigate the presence/rate of expression of PTTG-1 in multiple myeloma patients. Methods We analyzed PTTG-1 expression at the transcriptional and protein level, by PCR, immunocytochemical methods, Dot-blot and ELISA performed on patients' sera, in 19 multiple myeloma patients, 6 different multiple myeloma cell lines and normal human tissue. Results We did not find PTTG-1 in the normal human tissue panel, but PTTG-1 mRNA was detectable in 12 of the 19 patients, giving evidence of a 63% rate of expression (data confirmed by ELISA). Four of the 6 investigated cell lines (66.6%) were positive for PTTG-1. Investigations of protein expression gave evidence of 26.3% cytoplasmic expression and 16% surface expression in the plasma cells of multiple myeloma patients. Protein presence was also confirmed by Dot-blot in both cell lines and patients. Conclusion We established PTTG-1's presence at both the transcriptional and protein levels. These data suggest that PTTG-1 is aberrantly expressed in multiple myeloma plasma cells, is highly immunogenic and is a suitable target for immunotherapy of multiple myeloma.

  13. Compute-unified device architecture implementation of a block-matching algorithm for multiple graphical processing unit cards.

    Science.gov (United States)

    Massanes, Francesc; Cadennes, Marie; Brankov, Jovan G

    2011-07-01

    In this paper we describe and evaluate a fast implementation of a classical block matching motion estimation algorithm for multiple Graphical Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) computing engine. The implemented block matching algorithm (BMA) uses summed absolute difference (SAD) error criterion and full grid search (FS) for finding optimal block displacement. In this evaluation we compared the execution time of a GPU and CPU implementation for images of various sizes, using integer and non-integer search grids. The results show that use of a GPU card can shorten computation time by a factor of 200 times for integer and 1000 times for a non-integer search grid. The additional speedup for non-integer search grid comes from the fact that GPU has built-in hardware for image interpolation. Further, when using multiple GPU cards, the presented evaluation shows the importance of the data splitting method across multiple cards, but an almost linear speedup with a number of cards is achievable. In addition we compared execution time of the proposed FS GPU implementation with two existing, highly optimized non-full grid search CPU based motion estimations methods, namely implementation of the Pyramidal Lucas Kanade Optical flow algorithm in OpenCV and Simplified Unsymmetrical multi-Hexagon search in H.264/AVC standard. In these comparisons, FS GPU implementation still showed modest improvement even though the computational complexity of FS GPU implementation is substantially higher than non-FS CPU implementation. We also demonstrated that for an image sequence of 720×480 pixels in resolution, commonly used in video surveillance, the proposed GPU implementation is sufficiently fast for real-time motion estimation at 30 frames-per-second using two NVIDIA C1060 Tesla GPU cards.
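
    For reference, the full-grid SAD criterion that the GPU kernel parallelizes can be expressed in a few lines of CPU code. The sketch below is a plain NumPy version (not the CUDA implementation from the paper) intended only to make the exhaustive search explicit; the test data are invented.

```python
import numpy as np

def full_search_sad(ref_block, search_area):
    """Exhaustive (full-grid) block matching with the SAD error criterion.
    Returns the (row, col) offset of the best match and its SAD value."""
    bh, bw = ref_block.shape
    ref = ref_block.astype(np.int64)
    best_sad, best_pos = np.inf, (0, 0)
    for r in range(search_area.shape[0] - bh + 1):
        for c in range(search_area.shape[1] - bw + 1):
            sad = np.abs(search_area[r:r + bh, c:c + bw].astype(np.int64) - ref).sum()
            if sad < best_sad:
                best_sad, best_pos = sad, (r, c)
    return best_pos, int(best_sad)

# Tiny self-check: locate an 8x8 block cut out of a 32x32 random frame
rng = np.random.default_rng(0)
frame = rng.integers(0, 256, (32, 32), dtype=np.uint8)
print(full_search_sad(frame[10:18, 12:20], frame))   # expect ((10, 12), 0)
```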

  14. Computed tomography in multiple trauma patients. Technical aspects, work flow, and dose reduction

    International Nuclear Information System (INIS)

    Fellner, F.A.; Krieger, J.; Floery, D.; Lechner, N.

    2014-01-01

    Patients with severe, life-threatening trauma require a fast and accurate clinical and imaging diagnostic workup during the first phase of trauma management. Early whole-body computed tomography has clearly been proven to be the current standard of care of these patients. A similar imaging quality can be achieved in the multiple trauma setting compared with routine imaging especially using rapid, latest generation computed tomography (CT) scanners. This article encompasses a detailed view on the use of CT in patients with life-threatening trauma. A special focus is placed on radiological procedures in trauma units and on the methods for CT workup in routine cases and in challenging situations. Another focus discusses the potential of dose reduction of CT scans in multiple trauma as well as the examination of children with severe trauma. Various studies have demonstrated that early whole-body CT positively correlates with low morbidity and mortality and is clearly superior to the use of other imaging modalities. Optimal trauma unit management means a close cooperation between trauma surgeons, anesthesiologists and radiologists, whereby the radiologist is responsible for a rapid and accurate radiological workup and the rapid communication of imaging findings. However, even in the trauma setting, aspects of patient radiation doses should be kept in mind. (orig.) [de

  15. Optimization of OT-MACH Filter Generation for Target Recognition

    Science.gov (United States)

    Johnson, Oliver C.; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin

    2009-01-01

    An automatic Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter generator for use in a gray-scale optical correlator (GOC) has been developed for improved target detection at JPL. While the OT-MACH filter has been shown to be an optimal filter for target detection, actually solving for the optimum is too computationally intensive for multiple targets. Instead, an adaptive step gradient descent method was tested to iteratively optimize the three OT-MACH parameters, alpha, beta, and gamma. The feedback for the gradient descent method was a composite of the performance measures, correlation peak height and peak to side lobe ratio. The automated method generated and tested multiple filters in order to approach the optimal filter quicker and more reliably than the current manual method. Initial usage and testing has shown preliminary success at finding an approximation of the optimal filter, in terms of alpha, beta, gamma values. This corresponded to a substantial improvement in detection performance where the true positive rate increased for the same average false positives per image.
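
    The adaptive-step gradient descent over the three OT-MACH trade-off parameters can be sketched as finite-difference gradient ascent on a composite performance score. The `score` callable below is a hypothetical stand-in for building the filter and measuring correlation peak height and peak-to-sidelobe ratio, and the step-adaptation rule is an assumption rather than the exact scheme used at JPL.

```python
import numpy as np

def optimize_otmach_params(score, params=(0.3, 0.3, 0.4), step=0.05, iters=50, eps=1e-3):
    """Finite-difference gradient ascent on a composite filter-performance score.
    `score(alpha, beta, gamma)` is a hypothetical callable that would build an
    OT-MACH filter with the given trade-off parameters and return a combination
    of correlation peak height and peak-to-sidelobe ratio."""
    p = np.array(params, dtype=float)
    for _ in range(iters):
        grad = np.zeros_like(p)
        for i in range(len(p)):
            dp = np.zeros_like(p)
            dp[i] = eps
            grad[i] = (score(*(p + dp)) - score(*(p - dp))) / (2 * eps)
        candidate = np.clip(p + step * grad, 0.0, 1.0)
        if score(*candidate) >= score(*p):
            p = candidate              # accept the uphill move
        else:
            step *= 0.5                # adaptive step: shrink when the score drops
    return p

# Usage with a dummy quadratic score peaking at (0.5, 0.2, 0.8)
dummy = lambda a, b, g: -((a - 0.5) ** 2 + (b - 0.2) ** 2 + (g - 0.8) ** 2)
print(optimize_otmach_params(dummy))
```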

  16. The display of multiple images derived from emission computed assisted tomography (ECAT)

    International Nuclear Information System (INIS)

    Jackson, P.C.; Davies, E.R.; Goddard, P.R.; Wilde, R.P.H.

    1983-01-01

    In emission computed assisted tomography, a technique has been developed to display the multiple sections of an organ within a single image, such that three dimensional appreciation of the organ can be obtained, whilst also preserving functional information. The technique when tested on phantoms showed no obvious deterioration in resolution and when used clinically gave satisfactory visual results. Such a method should allow easier appreciation of the extent of a lesion through an organ and thus allow dimensions to be obtained by direct measurement. (U.K.)

  17. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    Science.gov (United States)

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    Existing real-time monitoring systems for multiple physiological parameters have problems such as insufficient server capacity for physiological data storage and analysis (so that data consistency cannot be guaranteed), poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters, based on cloud computing, with clustered background data storage and processing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The process included the resource virtualization of the IaaS layer for the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of data streams in the SaaS layer, and the bottleneck problem of multi-parameter data transmission, etc. The result was real-time transmission, storage and analysis of large amounts of physiological data. The simulation test results showed that the remote multiple physiological parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solved the problems of long turnaround time, poor real-time analysis performance, lack of extensibility and other issues that exist in traditional remote medical services. Technical support was thus provided to facilitate a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving towards home health monitoring for multiple physiological parameter wireless monitoring.

  18. Improving your target-template alignment with MODalign.

    KAUST Repository

    Barbato, Alessandro

    2012-02-04

    SUMMARY: MODalign is an interactive web-based tool aimed at helping protein structure modelers to inspect and manually modify the alignment between the sequences of a target protein and of its template(s). It interactively computes, displays and, upon modification of the target-template alignment, updates the multiple sequence alignments of the two protein families, their conservation score, secondary structure and solvent accessibility values, and local quality scores of the implied three-dimensional model(s). Although it has been designed to simplify the target-template alignment step in modeling, it is suitable for all cases where a sequence alignment needs to be inspected in the context of other biological information. AVAILABILITY AND IMPLEMENTATION: Freely available on the web at http://modorama.biocomputing.it/modalign. Website implemented in HTML and JavaScript with all major browsers supported. CONTACT: jan.kosinski@uniroma1.it.

  19. The MORPG-Based Learning System for Multiple Courses: A Case Study on Computer Science Curriculum

    Science.gov (United States)

    Liu, Kuo-Yu

    2015-01-01

    This study aimed at developing a Multiplayer Online Role Playing Game-based (MORPG) Learning system which enabled instructors to construct a game scenario and manage sharable and reusable learning content for multiple courses. It used the curriculum of "Introduction to Computer Science" as a study case to assess students' learning…

  20. Free-time and fixed end-point multi-target optimal control theory: Application to quantum computing

    International Nuclear Information System (INIS)

    Mishima, K.; Yamashita, K.

    2011-01-01

    Graphical abstract: The two-state Deutsch-Jozsa algorithm used to demonstrate the utility of free-time and fixed end-point multi-target optimal control theory. Research highlights: → Free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) was constructed. → The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. → The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. → The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. → The calculation examples show that our theory is useful for minor adjustment of the external fields. - Abstract: An extension of free-time and fixed end-point optimal control theory (FRFP-OCT) to monotonically convergent free-time and fixed end-point multi-target optimal control theory (FRFP-MTOCT) is presented. The features of our theory include optimization of the external time-dependent perturbations with high transition probabilities, that of the temporal duration, the monotonic convergence, and the ability to optimize multiple laser pulses simultaneously. The advantage of the theory and a comparison with conventional fixed-time and fixed end-point multi-target optimal control theory (FIFP-MTOCT) are presented by comparing data calculated using the present theory with those published previously [K. Mishima, K. Yamashita, Chem. Phys. 361 (2009) 106]. The qubit system of our interest consists of two polar NaCl molecules coupled by dipole-dipole interaction. The calculation examples show that our theory is useful for minor

  1. Targeting MUC1-C suppresses polycomb repressive complex 1 in multiple myeloma.

    Science.gov (United States)

    Tagde, Ashujit; Markert, Tahireh; Rajabi, Hasan; Hiraki, Masayuki; Alam, Maroof; Bouillez, Audrey; Avigan, David; Anderson, Kenneth; Kufe, Donald

    2017-09-19

    The polycomb repressive complex 1 (PRC1) includes the BMI1, RING1 and RING2 proteins. BMI1 is required for survival of multiple myeloma (MM) cells. The MUC1-C oncoprotein is aberrantly expressed by MM cells, activates MYC and is also necessary for MM cell survival. The present studies show that targeting MUC1-C with (i) stable and inducible silencing and CRISPR/Cas9 editing and (ii) the pharmacologic inhibitor GO-203, which blocks MUC1-C function, downregulates BMI1, RING1 and RING2 expression. The results demonstrate that MUC1-C drives BMI1 transcription by a MYC-dependent mechanism. MUC1-C thus promotes MYC occupancy on the BMI1 promoter and thereby activates BMI1 expression. We also show that the MUC1-C→MYC pathway induces RING2 expression. Moreover, in contrast to BMI1 and RING2, we found that MUC1-C drives RING1 by an NF-κB p65-dependent mechanism. Targeting MUC1-C, and thereby the suppression of these key PRC1 proteins, was associated with downregulation of the PRC1 E3 ligase activity as evidenced by decreases in ubiquitylation of histone H2A. Targeting MUC1-C also resulted in activation of the PRC1-repressed tumor suppressor genes PTEN, CDKN2A and BIM. These findings identify a heretofore unrecognized role for MUC1-C in the epigenetic regulation of MM cells.

  2. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

    2015-01-01

    Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells often target diverse regions in highly variable viral pathogens and this diversity may need to be addressed through redefinition of suitable peptide targets. Methods: We have developed a method for antigen assessment and target selection for polyvalent vaccines, with which we identified immune epitopes from variable regions, where all variants bind HLA. These regions, although variable, can thus be considered stable in terms of HLA binding and represent valuable vaccine targets. Results: We applied this method to predict CD8+ T-cell targets in influenza A H7N9 hemagglutinin and significantly increased

  3. Technical Note: Using k-means clustering to determine the number and position of isocenters in MLC-based multiple target intracranial radiosurgery.

    Science.gov (United States)

    Yock, Adam D; Kim, Gwe-Ya

    2017-09-01

    To present the k-means clustering algorithm as a tool to address treatment planning considerations characteristic of stereotactic radiosurgery using a single isocenter for multiple targets. For 30 patients treated with stereotactic radiosurgery for multiple brain metastases, the geometric centroids and radii of each metastasis were determined from the treatment planning system. In-house software used this as well as weighted and unweighted versions of the k-means clustering algorithm to group the targets to be treated with a single isocenter, and to position each isocenter. The algorithm results were evaluated using the within-cluster sum of squares as well as a minimum target coverage metric that considered the effect of target size. Both versions of the algorithm were applied to an example patient to demonstrate the prospective determination of the appropriate number and location of isocenters. Both weighted and unweighted versions of the k-means algorithm were applied successfully to determine the number and position of isocenters. Comparing the two, both the within-cluster sum of squares metric and the minimum target coverage metric resulting from the unweighted version were less than those from the weighted version. The average magnitudes of the differences were small (-0.2 cm² and 0.1% for the within-cluster sum of squares and minimum target coverage, respectively) but statistically significant (Wilcoxon signed-rank test). These differences between the two versions of the k-means clustering algorithm represented an advantage of the unweighted version for the within-cluster sum of squares metric, and an advantage of the weighted version for the minimum target coverage metric. While additional treatment planning considerations have a large influence on the final treatment plan quality, both versions of the k-means algorithm provide automatic, consistent, quantitative, and objective solutions to the tasks associated with SRS treatment planning using a single isocenter for multiple targets. © 2017 The Authors. Journal
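
    A minimal weighted k-means over target centroids, of the kind described above, might look like the sketch below. The weighting by target radius and the example coordinates are assumptions for illustration, not the in-house implementation referenced in the study; passing equal weights gives the unweighted variant.

```python
import numpy as np

def weighted_kmeans(points, weights, k, iters=100, seed=0):
    """Weighted k-means over target centroids: one isocenter per cluster.
    Using target radii as weights (an assumption here) pulls each isocenter
    toward larger targets."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), size=k, replace=False)].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)                 # assign each target to its nearest isocenter
        for j in range(k):
            members = labels == j
            if members.any():
                centers[j] = np.average(points[members], axis=0, weights=weights[members])
    return centers, labels

# Illustrative data: five metastasis centroids (cm) with radii (cm) as weights
mets = np.array([[0.0, 0.0, 0.0], [1.0, 0.5, 0.2], [6.0, 5.5, 4.0],
                 [5.5, 6.0, 4.2], [0.5, 0.2, 0.1]])
radii = np.array([0.5, 0.3, 1.0, 0.4, 0.2])
isocenters, assignment = weighted_kmeans(mets, radii, k=2)
print(isocenters, assignment)
```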

  4. Signal processing, sensor fusion, and target recognition; Proceedings of the Meeting, Orlando, FL, Apr. 20-22, 1992

    Science.gov (United States)

    Libby, Vibeke; Kadar, Ivan

    Consideration is given to a multiordered mapping technique for target prioritization, a neural network approach to multiple-target-tracking problems, a multisensor fusion algorithm for multitarget multibackground classification, deconvolution of multiple images of the same object, neural networks and genetic algorithms for combinatorial optimization of sensor data fusion, classification of atmospheric acoustic signals from fixed-wing aircraft, and an optics approach to sensor fusion for target recognition. Also treated are a zoom lens for automatic target recognition, a hybrid model for the analysis of radar sensors, an innovative test bed for developing and assessing air-to-air noncooperative target identification algorithms, SAR imagery scene segmentation using fractal processing, sonar feature-based bandwidth compression, laboratory experiments for a new sonar system, computational algorithms for discrete transforms using fixed-size filter matrices, and pattern recognition for power systems.

  5. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts

    DEFF Research Database (Denmark)

    Vilhelmsen, Troels Norvin; Ferre, Ty Paul

    2017-01-01

    In the present study, we extend previous data worth analyses to include simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when

  6. Lorazepam induces multiple disturbances in selective attention: attentional overload, decrement in target processing efficiency, and shifts in perceptual discrimination and response bias.

    Science.gov (United States)

    Michael, George Andrew; Bacon, Elisabeth; Offerlin-Meyer, Isabelle

    2007-09-01

    There is a general consensus that benzodiazepines affect attentional processes, yet only a few studies have tried to investigate these impairments in detail. The purpose of the present study was to investigate the effects of a single dose of lorazepam on performance in a target cancellation task with important time constraints. We measured correct target detections and correct distractor rejections, misses and false positives. The results show that lorazepam produces multiple kinds of shifts in performance, which suggests that it impairs multiple processes: (a) the evolution of performance over time was not the same between the placebo and the lorazepam groups, with lorazepam affecting performance quite early after the beginning of the test. This is suggestive of a depletion of attentional resources during sequential attentional processing; (b) lorazepam affected target and distractor processing differently, with target detection being the most impaired; (c) misses were more frequent under lorazepam than under placebo, but no such difference was observed as far as false positives were concerned. Signal detection analyses showed that lorazepam (d) decreased perceptual discrimination, and (e) reliably increased response bias. Our results provide new insights into the multiple effects of lorazepam on selective attention which, when combined, may have deleterious effects on human performance.

  7. Targeted intervention: Computational approaches to elucidate and predict relapse in alcoholism.

    Science.gov (United States)

    Heinz, Andreas; Deserno, Lorenz; Zimmermann, Ulrich S; Smolka, Michael N; Beck, Anne; Schlagenhauf, Florian

    2017-05-01

    Alcohol use disorder (AUD), and addiction in general, is characterized by failures of choice resulting in repeated drug intake despite severe negative consequences. Behavioral change is hard to accomplish, and relapse after detoxification is common and can be promoted by consumption of small amounts of alcohol as well as exposure to alcohol-associated cues or stress. While the environmental factors contributing to relapse have long been identified, the underlying psychological and neurobiological mechanisms on which those factors act are to date incompletely understood. Based on the reinforcing effects of drugs of abuse, animal experiments showed that drug, cue and stress exposure affect Pavlovian and instrumental learning processes, which can increase the salience of drug cues and promote habitual drug intake. In humans, computational approaches can help to quantify changes in key learning mechanisms during the development and maintenance of alcohol dependence, e.g. by using sequential decision making in combination with computational modeling to elucidate individual differences in model-free versus more complex, model-based learning strategies and their neurobiological correlates such as prediction error signaling in fronto-striatal circuits. Computational models can also help to explain how alcohol-associated cues trigger relapse: mechanisms such as Pavlovian-to-Instrumental Transfer can quantify the degree to which Pavlovian conditioned stimuli can facilitate approach behavior, including alcohol seeking and intake. By using generative models of behavioral and neural data, computational approaches can help to quantify individual differences in the psychophysiological mechanisms that underlie the development and maintenance of AUD and thus promote targeted intervention. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Direction-of-arrival estimation for co-located multiple-input multiple-output radar using structural sparsity Bayesian learning

    International Nuclear Information System (INIS)

    Wen Fang-Qing; Zhang Gong; Ben De

    2015-01-01

    This paper addresses the direction of arrival (DOA) estimation problem for co-located multiple-input multiple-output (MIMO) radar with random arrays. The spatially distributed sparsity of the targets in the background makes compressive sensing (CS) desirable for DOA estimation. A spatial CS framework is presented, which links the DOA estimation problem to support recovery from a known over-complete dictionary. A modified statistical model is developed to accurately represent the intra-block correlation of the received signal. A structural sparsity Bayesian learning algorithm is proposed for the sparse recovery problem. The proposed algorithm, which exploits intra-signal correlation, is capable of being applied with limited data support and in low signal-to-noise ratio (SNR) scenes. Furthermore, the proposed algorithm has a lower computational load compared to the classical Bayesian algorithm. Simulation results show that the proposed algorithm provides more accurate DOA estimation than the traditional multiple signal classification (MUSIC) algorithm and other CS recovery algorithms. (paper)

  9. Continuous analog of multiplicative algebraic reconstruction technique for computed tomography

    Science.gov (United States)

    Tateishi, Kiyoko; Yamaguchi, Yusaku; Abou Al-Ola, Omar M.; Kojima, Takeshi; Yoshinaga, Tetsuya

    2016-03-01

    We propose a hybrid dynamical system as a continuous analog to the block-iterative multiplicative algebraic reconstruction technique (BI-MART), which is a well-known iterative image reconstruction algorithm for computed tomography. The hybrid system is described by a switched nonlinear system with a piecewise smooth vector field or differential equation and, for consistent inverse problems, the convergence of non-negatively constrained solutions to a globally stable equilibrium is guaranteed by the Lyapunov theorem. Namely, we can prove theoretically that a weighted Kullback-Leibler divergence measure can be a common Lyapunov function for the switched system. We show that discretizing the differential equation by using the first-order approximation (Euler's method) based on the geometric multiplicative calculus leads to the same iterative formula of the BI-MART with the scaling parameter as a time-step of numerical discretization. The present paper is the first to reveal that a kind of iterative image reconstruction algorithm is constructed by the discretization of a continuous-time dynamical system for solving tomographic inverse problems. Iterative algorithms with not only the Euler method but also the Runge-Kutta methods of lower-orders applied for discretizing the continuous-time system can be used for image reconstruction. A numerical example showing the characteristics of the discretized iterative methods is presented.
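
    For orientation, a simultaneous multiplicative ART update (the single-block special case of the BI-MART iteration discussed above) can be written compactly as below. This is a generic sketch of the discrete iteration, not the authors' code, and the toy system is invented for illustration.

```python
import numpy as np

def smart(A, y, iters=200, lam=1.0):
    """Simultaneous multiplicative ART (single-block special case of BI-MART).
    Returns a non-negative image consistent with the projections y = A x."""
    A = np.asarray(A, dtype=float)
    y = np.asarray(y, dtype=float)
    x = np.ones(A.shape[1])                       # positive initial image
    col_sums = A.sum(axis=0) + 1e-12
    for _ in range(iters):
        ratio = y / (A @ x + 1e-12)               # y_i / (A x)_i
        x *= np.exp(lam * (A.T @ np.log(ratio)) / col_sums)   # multiplicative update
    return x

# Toy consistent system: a flattened 2x2 image observed through row and column sums
A = np.array([[1, 1, 0, 0],
              [0, 0, 1, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1]], dtype=float)
x_true = np.array([1.0, 2.0, 3.0, 4.0])
print(smart(A, A @ x_true))                       # a non-negative solution matching the projections
```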

  10. Incorporating 3D-printing technology in the design of head-caps and electrode drives for recording neurons in multiple brain regions.

    Science.gov (United States)

    Headley, Drew B; DeLucca, Michael V; Haufler, Darrell; Paré, Denis

    2015-04-01

    Recent advances in recording and computing hardware have enabled laboratories to record the electrical activity of multiple brain regions simultaneously. Lagging behind these technical advances, however, are the methods needed to rapidly produce microdrives and head-caps that can flexibly accommodate different recording configurations. Indeed, most available designs target single or adjacent brain regions, and, if multiple sites are targeted, specially constructed head-caps are used. Here, we present a novel design style, for both microdrives and head-caps, which takes advantage of three-dimensional printing technology. This design facilitates targeting of multiple brain regions in various configurations. Moreover, the parts are easily fabricated in large quantities, with only minor hand-tooling and finishing required. Copyright © 2015 the American Physiological Society.

  11. Computer-Aided Targeting of the PI3K/Akt/mTOR Pathway: Toxicity Reduction and Therapeutic Opportunities

    Directory of Open Access Journals (Sweden)

    Tan Li

    2014-10-01

    Full Text Available The PI3K/Akt/mTOR pathway plays an essential role in a wide range of biological functions, including metabolism, macromolecular synthesis, cell growth, proliferation and survival. Its versatility, however, makes it a conspicuous target of many pathogens; and the consequential deregulations of this pathway often lead to complications, such as tumorigenesis, type 2 diabetes and cardiovascular diseases. Molecular targeted therapy, aimed at modulating the deregulated pathway, holds great promise for controlling these diseases, though side effects may be inevitable, given the ubiquity of the pathway in cell functions. Here, we review a variety of factors found to modulate the PI3K/Akt/mTOR pathway, including gene mutations, certain metabolites, inflammatory factors, chemical toxicants, drugs found to rectify the pathway, as well as viruses that hijack the pathway for their own synthetic purposes. Furthermore, this evidence of PI3K/Akt/mTOR pathway alteration and related pathogenesis has inspired the exploration of computer-aided targeting of this pathway to optimize therapeutic strategies. Herein, we discuss several possible options, using computer-aided targeting, to reduce the toxicity of molecularly-targeted therapy, including mathematical modeling, to reveal system-level control mechanisms and to confer a low-dosage combination therapy, the potential of PP2A as a therapeutic target, the formulation of parameters to identify patients who would most benefit from specific targeted therapies and molecular dynamics simulations and docking studies to discover drugs that are isoform specific or mutation selective so as to avoid undesired broad inhibitions. We hope this review will stimulate novel ideas for pharmaceutical discovery and deepen our understanding of curability and toxicity by targeting the PI3K/Akt/mTOR pathway.

  12. Data fusion for target tracking and classification with wireless sensor network

    Science.gov (United States)

    Pannetier, Benjamin; Doumerc, Robin; Moras, Julien; Dezert, Jean; Canevet, Loic

    2016-10-01

    In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded processing. Based on track statistics, the classification algorithm uses estimated location, velocity and acceleration to help classify targets. The algorithms enable tracking of humans and vehicles moving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve tracking performance under constraint, with classification fusion. Our algorithm also uses different dynamic models to handle target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).

  13. Computer-aided design of multi-target ligands at A1R, A2AR and PDE10A, key proteins in neurodegenerative diseases.

    Science.gov (United States)

    Kalash, Leen; Val, Cristina; Azuaje, Jhonny; Loza, María I; Svensson, Fredrik; Zoufir, Azedine; Mervin, Lewis; Ladds, Graham; Brea, José; Glen, Robert; Sotelo, Eddy; Bender, Andreas

    2017-12-30

    Compounds designed to display polypharmacology may have utility in treating complex diseases, where activity at multiple targets is required to produce a clinical effect. In particular, suitable compounds may be useful in treating neurodegenerative diseases by promoting neuronal survival in a synergistic manner via their multi-target activity at the adenosine A1 and A2A receptors (A1R and A2AR) and phosphodiesterase 10A (PDE10A), which modulate intracellular cAMP levels. Hence, in this work we describe a computational method for the design of synthetically feasible ligands that bind to A1 and A2A receptors and inhibit phosphodiesterase 10A (PDE10A), involving a retrosynthetic approach employing in silico target prediction and docking, which may be generally applicable to multi-target compound design at several target classes. This approach has identified 2-aminopyridine-3-carbonitriles as the first multi-target ligands at A1R, A2AR and PDE10A, by showing agreement between the ligand and structure based predictions at these targets. The series were synthesized via an efficient one-pot scheme and validated pharmacologically as A1R/A2AR-PDE10A ligands, with IC50 values of 2.4-10.0 μM at PDE10A and Ki values of 34-294 nM at A1R and/or A2AR. Furthermore, selectivity profiling of the synthesized 2-amino-pyridin-3-carbonitriles against other subtypes of both protein families showed that the multi-target ligand 8 exhibited a minimum of twofold selectivity over all tested off-targets. In addition, both compounds 8 and 16 exhibited the desired multi-target profile, which could be considered for further functional efficacy assessment, analog modification for the improvement of selectivity towards A1R, A2AR and PDE10A collectively, and evaluation of their potential synergy in modulating cAMP levels.

  14. Computing the Free Energy Barriers for Less by Sampling with a Coarse Reference Potential while Retaining Accuracy of the Target Fine Model.

    Science.gov (United States)

    Plotnikov, Nikolay V

    2014-08-12

    Proposed in this contribution is a protocol for calculating fine-physics (e.g., ab initio QM/MM) free-energy surfaces at a high level of accuracy locally (e.g., only at reactants and at the transition state for computing the activation barrier) from targeted fine-physics sampling and extensive exploratory coarse-physics sampling. The full free-energy surface is still computed but at a lower level of accuracy from coarse-physics sampling. The method is analytically derived in terms of the umbrella sampling and the free-energy perturbation methods which are combined with the thermodynamic cycle and the targeted sampling strategy of the paradynamics approach. The algorithm starts by computing low-accuracy fine-physics free-energy surfaces from the coarse-physics sampling in order to identify the reaction path and to select regions for targeted sampling. Thus, the algorithm does not rely on the coarse-physics minimum free-energy reaction path. Next, segments of high-accuracy free-energy surface are computed locally at selected regions from the targeted fine-physics sampling and are positioned relative to the coarse-physics free-energy shifts. The positioning is done by averaging the free-energy perturbations computed with multistep linear response approximation method. This method is analytically shown to provide results of the thermodynamic integration and the free-energy interpolation methods, while being extremely simple in implementation. Incorporating the metadynamics sampling to the algorithm is also briefly outlined. The application is demonstrated by calculating the B3LYP//6-31G*/MM free-energy barrier for an enzymatic reaction using a semiempirical PM6/MM reference potential. These modifications allow computing the activation free energies at a significantly reduced computational cost but at the same level of accuracy compared to computing full potential of mean force.
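
    For reference, the two textbook relations that such protocols combine are the free-energy perturbation (FEP) formula and the linear response approximation (LRA). The block below states them in generic notation for a coarse reference potential and a fine target potential; it is a restatement of standard formulas, not the paper's specific multistep-LRA parametrization.

        \Delta G_{\mathrm{coarse}\to\mathrm{fine}}
          = -k_B T \,\ln\Big\langle e^{-\left(U_{\mathrm{fine}}-U_{\mathrm{coarse}}\right)/k_B T}\Big\rangle_{\mathrm{coarse}}
        \qquad \text{(FEP)}

        \Delta G_{\mathrm{coarse}\to\mathrm{fine}}
          \approx \tfrac{1}{2}\Big(\big\langle U_{\mathrm{fine}}-U_{\mathrm{coarse}}\big\rangle_{\mathrm{coarse}}
          + \big\langle U_{\mathrm{fine}}-U_{\mathrm{coarse}}\big\rangle_{\mathrm{fine}}\Big)
        \qquad \text{(LRA)}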

  15. Targeting multiple pathogenic mechanisms with polyphenols for the treatment of Alzheimer’s disease: Experimental approach and therapeutic implications

    Directory of Open Access Journals (Sweden)

    Jun eWang

    2014-03-01

    Full Text Available Alzheimer's disease (AD) is the most prevalent neurodegenerative disease of aging and currently has no cure. Its onset and progression are influenced by multiple factors. There is growing consensus that successful treatment will rely on simultaneously targeting multiple pathological features of AD. Polyphenol compounds have many proven health benefits. In this study, we tested the hypothesis that combining three polyphenolic preparations (grape seed extract, resveratrol and Concord grape juice extract), with different polyphenolic compositions and partially redundant bioactivities, may simultaneously and synergistically mitigate amyloid-β (Aβ)-mediated neuropathology and cognitive impairments in a mouse model of AD. We found that administration of the polyphenols in combination did not alter the profile of bioactive polyphenol metabolites in the brain. We also found that combination treatment resulted in better protection against cognitive impairments compared to individual treatments, in J20 AD mice. Electrophysiological examination showed that acute treatment with select brain-penetrating polyphenol metabolites, derived from these polyphenols, improved oligomeric Aβ (oAβ)-induced long-term potentiation (LTP) deficits in hippocampal slices. Moreover, we found greatly reduced total amyloid content in the brain following combination treatment. Our studies provided experimental evidence that application of polyphenols targeting multiple disease mechanisms may yield a greater likelihood of therapeutic efficacy.

  16. Simulation of multiple scattering background in heavy ion backscattering spectrometry

    International Nuclear Information System (INIS)

    Li, M.M.; O'Connor, D.J.

    1999-01-01

    With the development of heavy ion backscattering spectrometry (HIBS) for the detection of trace quantities of heavy-atom impurities on Si surfaces, it is necessary to quantify the multiple scattering contribution to the spectral background. In the present work, the Monte Carlo computer simulation program TRIM has been used to study the backscattering spectrum and the multiple scattering background features for the heavy ions C, Ne, Si, Ar and Kr impinging on four types of targets: (1) a single ultra-thin (free-standing) Au film of 10 Å thickness, (2) a 10 Å Au film on a 50 Å Si surface, (3) a 10 Å Au film on an Si substrate (10 000 Å), and (4) a thick target (10 000 Å) of pure Si. The ratio of the signal from the Au thin layer to the background due to multiple scattering has been derived by fitting the simulation results. The simulations show that the background develops only when both the Au film and the Si substrate are present, arising from the ions' multiple scattering in the substrate; it is generated neither by the Au thin layer alone nor by the pure Si substrate alone. The corresponding multiple scattering mechanism in the target can be explained as one large-angle scattering in the Au layer followed by several small-angle scatterings in the substrate. This study allows an appropriate choice of incident beam species and energy range when HIBS is used to analyse low-level impurities in Si wafers.
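
    As a side illustration (separate from the TRIM simulations described above), the single-collision kinematic factor explains why the background must come from multiple scattering: a projectile heavier than the substrate atoms cannot reach a backscattering detector in a single collision with Si. The masses below are nominal mass numbers and the 150 degree detector angle is an assumed example.

        import numpy as np

        def kinematic_factor(m1, m2, theta_deg):
            """Elastic-scattering kinematic factor K = E1/E0 for projectile mass m1
            hitting target mass m2 at laboratory scattering angle theta.
            Returns None when single scattering to that angle is kinematically
            forbidden (m1 > m2 and theta beyond the maximum deflection angle)."""
            th = np.radians(theta_deg)
            disc = m2**2 - (m1 * np.sin(th))**2
            if disc < 0:
                return None
            return ((m1 * np.cos(th) + np.sqrt(disc)) / (m1 + m2))**2

        # Kr (A=84) cannot reach a 150-degree detector by a single collision with
        # Si (A=28), so any Kr counts there must come from multiple scattering,
        # whereas single scattering from Au (A=197) is allowed.
        print(kinematic_factor(84, 28, 150))    # None
        print(kinematic_factor(84, 197, 150))   # ~0.18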

  17. Computationally Efficient Automatic Coast Mode Target Tracking Based on Occlusion Awareness in Infrared Images.

    Science.gov (United States)

    Kim, Sohyun; Jang, Gwang-Il; Kim, Sungho; Kim, Junmo

    2018-03-27

    This paper proposes automatic coast mode tracking of centroid trackers for infrared images to overcome target occlusion. The centroid tracking method, which uses only the brightness information of an image, is still widely used in infrared imaging tracking systems because it is difficult to extract meaningful features from infrared images. However, centroid trackers are likely to lose the track because they are highly vulnerable to being screened by clutter or background. Coast mode, one of the tracking modes, maintains the servo slew rate at the tracking rate in effect right before the loss of track. The proposed automatic coast mode tracking method decides to enter coast mode by predicting target occlusion, and tries to re-lock the target and resume tracking after the blind time. The algorithm comprises three steps. The first step predicts the occlusion by checking both objects that have target-like brightness and objects that may screen the target despite having different brightness. The second step generates inertial tracking commands for the servo. The last step re-locks the target based on target modeling with a histogram ratio. The effectiveness of the proposed algorithm is demonstrated by experimental results based on computer simulation with various test imagery sequences, compared to published tracking algorithms. The proposed algorithm is also tested in a real environment with a naval electro-optical tracking system (EOTS) and an airborne EO/IR system.
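
    A minimal sketch of the general idea, in Python: a brightness-centroid tracker that coasts on its last tracking rate when too few target-like pixels remain and declares loss of track after a blind-time limit. The thresholds, the blind-time limit and the test frame are placeholder assumptions; the occlusion-prediction and histogram-ratio re-lock steps of the paper are not reproduced.

        import numpy as np

        def centroid_track_step(frame, state, thresh=0.6, min_pixels=20, blind_limit=5):
            """One step of a brightness-centroid tracker with a simple coast mode.

            frame : 2-D array of pixel intensities in [0, 1]
            state : dict with 'pos' (row, col), 'vel' (drow, dcol), 'blind' counter.
            thresh, min_pixels, blind_limit are assumed tuning values."""
            ys, xs = np.nonzero(frame > thresh)
            if len(ys) >= min_pixels:
                # Enough target-like pixels: normal track mode.
                new_pos = np.array([ys.mean(), xs.mean()])
                state['vel'] = new_pos - state['pos']      # tracking-rate estimate
                state['pos'] = new_pos
                state['blind'] = 0
                state['mode'] = 'track'
            elif state['blind'] < blind_limit:
                # Occlusion suspected: coast on the last rate, try to re-lock later.
                state['pos'] = state['pos'] + state['vel']
                state['blind'] += 1
                state['mode'] = 'coast'
            else:
                state['mode'] = 'lost'
            return state

        state = {'pos': np.array([32.0, 32.0]), 'vel': np.zeros(2), 'blind': 0, 'mode': 'track'}
        frame = np.zeros((64, 64)); frame[30:35, 33:38] = 1.0   # bright target blob
        state = centroid_track_step(frame, state)
        print(state['mode'], state['pos'])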

  18. Computationally Efficient Automatic Coast Mode Target Tracking Based on Occlusion Awareness in Infrared Images

    Directory of Open Access Journals (Sweden)

    Sohyun Kim

    2018-03-01

    Full Text Available This paper proposes automatic coast mode tracking of centroid trackers for infrared images to overcome target occlusion. The centroid tracking method, which uses only the brightness information of an image, is still widely used in infrared imaging tracking systems because it is difficult to extract meaningful features from infrared images. However, centroid trackers are likely to lose the track because they are highly vulnerable to being screened by clutter or background. Coast mode, one of the tracking modes, maintains the servo slew rate at the tracking rate in effect right before the loss of track. The proposed automatic coast mode tracking method decides to enter coast mode by predicting target occlusion, and tries to re-lock the target and resume tracking after the blind time. The algorithm comprises three steps. The first step predicts the occlusion by checking both objects that have target-like brightness and objects that may screen the target despite having different brightness. The second step generates inertial tracking commands for the servo. The last step re-locks the target based on target modeling with a histogram ratio. The effectiveness of the proposed algorithm is demonstrated by experimental results based on computer simulation with various test imagery sequences, compared to published tracking algorithms. The proposed algorithm is also tested in a real environment with a naval electro-optical tracking system (EOTS) and an airborne EO/IR system.

  19. Neutron-multiplication measurement instrument

    Energy Technology Data Exchange (ETDEWEB)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results.

  20. Neutron multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1983-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  1. Neutron-multiplication measurement instrument

    International Nuclear Information System (INIS)

    Nixon, K.V.; Dowdy, E.J.; France, S.W.; Millegan, D.R.; Robba, A.A.

    1982-01-01

    The Advanced Nuclear Technology Group of the Los Alamos National Laboratory is now using intelligent data-acquisition and analysis instrumentation for determining the multiplication of nuclear material. Earlier instrumentation, such as the large NIM-crate systems, depended on house power and required additional computation to determine multiplication or to estimate error. The portable, battery-powered multiplication measurement unit, with advanced computational power, acquires data, calculates multiplication, and completes error analysis automatically. Thus, the multiplication is determined easily and an available error estimate enables the user to judge the significance of results

  2. Some safety studies of the MEGAPIE spallation source target performed using computational fluid dynamics

    International Nuclear Information System (INIS)

    Smith, B.L.

    2011-01-01

    Such a target forms part of the evolutionary Accelerator-Driven System (ADS) concept, in which neutrons are generated in an otherwise sub-critical core by spallation reactions resulting from bombardment by a proton beam. The international project MEGAPIE had the objective of demonstrating the feasibility of the spallation process for a particular target design under strict test conditions. The test was carried out over a period of four months at the end of 2006 at the SINQ facility of the Paul Scherrer Institute in Switzerland. The design studies carried out for the MEGAPIE target prior to irradiation using Computational Fluid Dynamics (CFD) resulted in an optimum flow configuration being defined for the coolant circulation. Simultaneously, stresses in the structural components were examined using Finite Element Method (FEM) techniques. For this purpose, an interface program was written which enabled different specialist groups to carry out the thermal hydraulics and structural mechanics analyses within the project with fully consistent model data. Results for steady-state operation of the target show that the critical lower target components are adequately cooled, and that stresses and displacements are well within tolerances. Transient analyses were also performed to demonstrate the robustness of the design in the event of abnormal operation, including pump failure and burn-through of the target casing by the proton beam. In the latter case, the CFD analyses complemented and extended full-scale tests. (author)

  3. Some safety studies of the MEGAPIE spallation source target performed using computational fluid dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Smith, B.L., E-mail: brian.smith@psi.ch [Paul Scherrer Institute, OHSA/C08, 5232 Villigen PSI (Switzerland)

    2011-07-01

    Such a target forms part of the evolutionary Accelerator-Driven System (ADS) concept, in which neutrons are generated in an otherwise sub-critical core by spallation reactions resulting from bombardment by a proton beam. The international project MEGAPIE had the objective of demonstrating the feasibility of the spallation process for a particular target design under strict test conditions. The test was carried out over a period of four months at the end of 2006 at the SINQ facility of the Paul Scherrer Institute in Switzerland. The design studies carried out for the MEGAPIE target prior to irradiation using Computational Fluid Dynamics (CFD) resulted in an optimum flow configuration being defined for the coolant circulation. Simultaneously, stresses in the structural components were examined using Finite Element Method (FEM) techniques. For this purpose, an interface program was written which enabled different specialist groups to carry out the thermal hydraulics and structural mechanics analyses within the project with fully consistent model data. Results for steady-state operation of the target show that the critical lower target components are adequately cooled, and that stresses and displacements are well within tolerances. Transient analyses were also performed to demonstrate the robustness of the design in the event of abnormal operation, including pump failure and burn-through of the target casing by the proton beam. In the latter case, the CFD analyses complemented and extended full-scale tests. (author)

  4. Efficiently outsourcing multiparty computation under multiple keys

    NARCIS (Netherlands)

    Peter, Andreas; Tews, Erik; Tews, Erik; Katzenbeisser, Stefan

    2013-01-01

    Secure multiparty computation enables a set of users to evaluate certain functionalities on their respective inputs while keeping these inputs encrypted throughout the computation. In many applications, however, outsourcing these computations to an untrusted server is desirable, so that the server

  5. Molecular diagnostics of a single drug-resistant multiple myeloma case using targeted next-generation sequencing

    Directory of Open Access Journals (Sweden)

    Ikeda H

    2015-10-01

    Full Text Available Hiroshi Ikeda,1 Kazuya Ishiguro,1 Tetsuyuki Igarashi,1 Yuka Aoki,1 Toshiaki Hayashi,1 Tadao Ishida,1 Yasushi Sasaki,1,2 Takashi Tokino,2 Yasuhisa Shinomura1 1Department of Gastroenterology, Rheumatology and Clinical Immunology, 2Medical Genome Sciences, Research Institute for Frontier Medicine, Sapporo Medical University, Sapporo, Japan Abstract: A 69-year-old man was diagnosed with IgG λ-type multiple myeloma (MM, Stage II in October 2010. He was treated with one cycle of high-dose dexamethasone. After three cycles of bortezomib, the patient exhibited slow elevations in the free light-chain levels and developed a significant new increase of serum M protein. Bone marrow cytogenetic analysis revealed a complex karyotype characteristic of malignant plasma cells. To better understand the molecular pathogenesis of this patient, we sequenced for mutations in the entire coding regions of 409 cancer-related genes using a semiconductor-based sequencing platform. Sequencing analysis revealed eight nonsynonymous somatic mutations in addition to several copy number variants, including CCND1 and RB1. These alterations may play roles in the pathobiology of this disease. This targeted next-generation sequencing can allow for the prediction of drug resistance and facilitate improvements in the treatment of MM patients. Keywords: multiple myeloma, drug resistance, genome-wide sequencing, semiconductor sequencer, target therapy

  6. Integrating computational methods to retrofit enzymes to synthetic pathways.

    Science.gov (United States)

    Brunk, Elizabeth; Neri, Marilisa; Tavernelli, Ivano; Hatzimanikatis, Vassily; Rothlisberger, Ursula

    2012-02-01

    Microbial production of desired compounds provides an efficient framework for the development of renewable energy resources. To be competitive with traditional chemistry, one requirement is to utilize the full capacity of the microorganism to produce target compounds with high yields and turnover rates. We use integrated computational methods to generate and quantify the performance of novel biosynthetic routes that contain highly optimized catalysts. Engineering a novel reaction pathway entails addressing feasibility on multiple levels, which involves handling the complexity of large-scale biochemical networks while respecting the critical chemical phenomena at the atomistic scale. To pursue this multi-layer challenge, our strategy merges knowledge-based metabolic engineering methods with computational chemistry methods. By bridging multiple disciplines, we provide an integrated computational framework that could accelerate the discovery and implementation of novel biosynthetic production routes. Using this approach, we have identified and optimized a novel biosynthetic route for the production of 3HP from pyruvate. Copyright © 2011 Wiley Periodicals, Inc.

  7. Computational and Biochemical Discovery of RSK2 as a Novel Target for Epigallocatechin Gallate (EGCG.

    Directory of Open Access Journals (Sweden)

    Hanyong Chen

    Full Text Available The most active anticancer component in green tea is epigallocatechin-3-gallate (EGCG). Protein interaction with EGCG is a critical step for mediating the effects of EGCG on the regulation of various key molecules involved in signal transduction. By using computational docking screening methods for protein identification, we identified a serine/threonine kinase, 90-kDa ribosomal S6 kinase (RSK2), as a novel molecular target of EGCG. RSK2 includes two kinase catalytic domains in the N-terminal (NTD) and the C-terminal (CTD), and RSK2 full activation requires phosphorylation of both terminals. The computer prediction was confirmed by an in vitro kinase assay in which EGCG inhibited RSK2 activity in a dose-dependent manner. Pull-down assay results showed that EGCG could bind with RSK2 at both kinase catalytic domains in vitro and ex vivo. Furthermore, results of an ATP competition assay and a computer-docking model showed that EGCG binds with RSK2 in an ATP-dependent manner. In RSK2+/+ and RSK2-/- murine embryonic fibroblasts, EGCG decreased viability only in the presence of RSK2. EGCG also suppressed epidermal growth factor-induced neoplastic cell transformation by inhibiting phosphorylation of histone H3 at Ser10. Overall, these results indicate that RSK2 is a novel molecular target of EGCG.

  8. The meninges: new therapeutic targets for multiple sclerosis.

    Science.gov (United States)

    Russi, Abigail E; Brown, Melissa A

    2015-02-01

    The central nervous system (CNS) largely comprises nonregenerating cells, including neurons and myelin-producing oligodendrocytes, which are particularly vulnerable to immune cell-mediated damage. To protect the CNS, mechanisms exist that normally restrict the transit of peripheral immune cells into the brain and spinal cord, conferring an "immune-specialized" status. Thus, there has been a long-standing debate as to how these restrictions are overcome in several inflammatory diseases of the CNS, including multiple sclerosis (MS). In this review, we highlight the role of the meninges, tissues that surround and protect the CNS and enclose the cerebral spinal fluid, in promoting chronic inflammation that leads to neuronal damage. Although the meninges have traditionally been considered structures that provide physical protection for the brain and spinal cord, new data have established these tissues as sites of active immunity. It has been hypothesized that the meninges are important players in normal immunosurveillance of the CNS but also serve as initial sites of anti-myelin immune responses. The resulting robust meningeal inflammation elicits loss of localized blood-brain barrier (BBB) integrity and facilitates a large-scale influx of immune cells into the CNS parenchyma. We propose that targeting the cells and molecules mediating these inflammatory responses within the meninges offers promising therapies for MS that are free from the constraints imposed by the BBB. Importantly, such therapies may avoid the systemic immunosuppression often associated with the existing treatments. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Optimizing Sparse Matrix-Multiple Vectors Multiplication for Nuclear Configuration Interaction Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Aktulga, Hasan Metin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Buluc, Aydin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Samuel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yang, Chao [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-08-14

    Obtaining highly accurate predictions on the properties of light atomic nuclei using the configuration interaction (CI) approach requires computing a few extremal eigenpairs of the many-body nuclear Hamiltonian matrix. In the Many-body Fermion Dynamics for nuclei (MFDn) code, a block eigensolver is used for this purpose. Due to the large size of the sparse matrices involved, a significant fraction of the time spent on the eigenvalue computations is associated with the multiplication of a sparse matrix (and the transpose of that matrix) with multiple vectors (SpMM and SpMM-T). Existing implementations of SpMM and SpMM-T significantly underperform expectations. Thus, in this paper, we present and analyze optimized implementations of SpMM and SpMM-T. We base our implementation on the compressed sparse blocks (CSB) matrix format and target systems with multi-core architectures. We develop a performance model that allows us to understand and estimate the performance characteristics of our SpMM kernel implementations, and demonstrate the efficiency of our implementation on a series of real-world matrices extracted from MFDn. In particular, we obtain a 3-4x speedup on the requisite operations over good implementations based on the commonly used compressed sparse row (CSR) matrix format. The improvements in the SpMM kernel suggest we may attain roughly a 40% speedup in the overall execution time of the block eigensolver used in MFDn.
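
    To make the kernel concrete, the sketch below forms the sparse-matrix-times-multiple-vectors product with SciPy's CSR format and checks it against a vector-by-vector loop; it only illustrates why a block of vectors amortizes the traversal of the matrix, and does not implement the CSB blocking or the performance model of the paper. The matrix size and density are arbitrary.

        import numpy as np
        from scipy.sparse import random as sparse_random

        # Sparse matrix times a block of vectors (SpMM), the kernel discussed above.
        # SciPy stores A in CSR here; the CSB format and the cache-blocking strategy
        # of the paper are not reproduced in this sketch.
        n, nvec = 10000, 8
        A = sparse_random(n, n, density=1e-3, format='csr', dtype=np.float64)
        X = np.random.rand(n, nvec)                 # block of vectors

        Y = A @ X                                   # SpMM: one pass over A for all vectors
        Yt = A.T @ X                                # SpMM-T with the same stored matrix

        # Multiplying vector by vector touches A nvec times instead of once:
        Y_loop = np.column_stack([A @ X[:, j] for j in range(nvec)])
        assert np.allclose(Y, Y_loop)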

  10. Multiple Model Particle Filtering For Multi-Target Tracking

    National Research Council Canada - National Science Library

    Hero, Alfred; Kreucher, Chris; Kastella, Keith

    2004-01-01

    .... The details of this method have been presented elsewhere [1]. One feature of real targets is that they are poorly described by a single kinematic model; target behavior may change dramatically, i.e...

  11. New and emerging immune-targeted drugs for the treatment of multiple sclerosis.

    Science.gov (United States)

    Palmer, Alan M

    2014-07-01

    Multiple sclerosis (MS) is a neurodegenerative disease with a major inflammatory component that constitutes the most common progressive and disabling neurological condition in young adults. Injectable immunomodulatory medicines such as interferon drugs and glatiramer acetate have dominated the MS market for over the past two decades but this situation is set to change. This is because of: (i) patent expirations, (ii) the introduction of natalizumab, which targets the interaction between leukocytes and the blood-CNS barrier, (iii) the launch of three oral immunomodulatory drugs (fingolimod, dimethyl fumarate and teriflunomide), with another (laquinimod) under regulatory review and (iv) a number of immunomodulatory monoclonal antibodies (alemtuzumab, daclizumab and ocrelizumab) about to enter the market. Current and emerging medicines are reviewed and their impact on people with MS considered. © 2013 The British Pharmacological Society.

  12. A game theory approach to target tracking in sensor networks.

    Science.gov (United States)

    Gu, Dongbing

    2011-02-01

    In this paper, we investigate a moving-target tracking problem with sensor networks. Each sensor node has a sensor to observe the target and a processor to estimate the target position. It also has wireless communication capability, but with limited range, and can only communicate with neighbors. The moving target is assumed to be an intelligent agent, which is "smart" enough to escape detection by maximizing the estimation error. This adversarial behavior makes the target tracking problem more difficult. We formulate this target estimation problem as a zero-sum game and use a minimax filter to estimate the target position. The minimax filter is a robust filter that minimizes the estimation error by considering the worst-case noise. Furthermore, we develop a distributed version of the minimax filter for multiple sensor nodes. The distributed computation is implemented by modeling the information received from neighbors as measurements in the minimax filter. The simulation results show that the target tracking algorithm proposed in this paper provides a satisfactory result.

  13. Dual-target cost in visual search for multiple unfamiliar faces.

    Science.gov (United States)

    Mestry, Natalie; Menneer, Tamaryn; Cave, Kyle R; Godwin, Hayward J; Donnelly, Nick

    2017-08-01

    The efficiency of visual search for one (single-target) and either of two (dual-target) unfamiliar faces was explored to understand the manifestations of capacity and guidance limitations in face search. The visual similarity of distractor faces to target faces was manipulated using morphing (Experiments 1 and 2) and multidimensional scaling (Experiment 3). A dual-target cost was found in all experiments, evidenced by slower and less accurate search in dual- than single-target conditions. The dual-target cost was unequal across the targets, with performance being maintained on one target and reduced on the other, which we label "preferred" and "non-preferred" respectively. We calculated the capacity for each target face and show reduced capacity for representing the non-preferred target face. However, results show that the capacity for the non-preferred target can be increased when the dual-target condition is conducted after participants complete the single-target conditions. Analyses of eye movements revealed evidence for weak guidance of fixations in single-target search, and when searching for the preferred target in dual-target search. Overall, the experiments show dual-target search for faces is capacity- and guidance-limited, leading to superior search for 1 face over the other in dual-target search. However, learning faces individually may improve capacity with the second face. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  14. Extending Data Worth Analyses to Select Multiple Observations Targeting Multiple Forecasts.

    Science.gov (United States)

    Vilhelmsen, Troels N; Ferré, Ty P A

    2017-09-15

    Hydrological models are often set up to provide specific forecasts of interest. Owing to the inherent uncertainty in the data used to derive model structure and to constrain parameter variations, the model forecasts will be uncertain. Additional data collection is often performed to minimize this forecast uncertainty. Given our common financial restrictions, it is critical that we identify data with maximal information content with respect to the forecasts of interest. In practice, this often devolves to qualitative decisions based on expert opinion. However, there is no assurance that this will lead to optimal design, especially for complex hydrogeological problems. Specifically, these complexities include considerations of multiple forecasts, shared information among potential observations, information content of existing data, and the assumptions and simplifications underlying model construction. In the present study, we extend previous data worth analyses to include simultaneous selection of multiple new measurements and consideration of multiple forecasts of interest. We show how the suggested approach can be used to optimize data collection. This can be used in a manner that suggests specific measurement sets or that produces probability maps indicating areas likely to be informative for specific forecasts. Moreover, we provide examples documenting that sequential measurement selection approaches often lead to suboptimal designs and that estimates of data covariance should be included when selecting future measurement sets. © 2017, National Ground Water Association.
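
    A minimal linear-Gaussian sketch of the idea, in Python, with an invented prior parameter covariance, candidate observation sensitivities, observation error variance and forecast sensitivity vector: pairs of candidate measurements are scored jointly by the forecast variance that remains after assimilating them, which is the kind of simultaneous selection the paper advocates over purely sequential choices.

        import numpy as np
        from itertools import combinations

        def forecast_variance(P, H, R, s):
            """Posterior forecast variance s' P_post s after assimilating observations
            with sensitivity rows H and noise covariance R (linear-Gaussian update)."""
            S = H @ P @ H.T + R
            P_post = P - P @ H.T @ np.linalg.solve(S, H @ P)
            return float(s @ P_post @ s)

        rng = np.random.default_rng(0)
        P = np.diag([4.0, 1.0, 2.0])                 # assumed prior parameter covariance
        H_all = rng.normal(size=(5, 3))              # 5 candidate observations (assumed sensitivities)
        s = np.array([1.0, 0.5, -1.0])               # forecast sensitivity to parameters
        r = 0.5                                      # assumed observation error variance

        # Score all pairs of candidate measurements jointly and keep the best pair.
        best = min(combinations(range(5), 2),
                   key=lambda idx: forecast_variance(P, H_all[list(idx)], r * np.eye(2), s))
        print("best joint pair:", best,
              "remaining forecast variance:",
              forecast_variance(P, H_all[list(best)], r * np.eye(2), s))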

  15. Multiplicities of charged pions and charged hadrons from deep-inelastic scattering of muons off an isoscalar target

    Directory of Open Access Journals (Sweden)

    C. Adolph

    2017-01-01

    Full Text Available Multiplicities of charged pions and charged hadrons produced in deep-inelastic scattering were measured in three-dimensional bins of the Bjorken scaling variable x, the relative virtual-photon energy y and the relative hadron energy z. Data were obtained by the COMPASS Collaboration using a 160 GeV muon beam and an isoscalar target (6LiD). They cover the kinematic domain in the photon virtuality Q2 > 1 (GeV/c)2, 0.004 < x < 0.4, 0.2 < z < 0.85 and 0.1 < y < 0.7. In addition, a leading-order pQCD analysis was performed using the pion multiplicity results to extract quark fragmentation functions.

  16. Identification of control targets in Boolean molecular network models via computational algebra.

    Science.gov (United States)

    Murrugarra, David; Veliz-Cuba, Alan; Aguilar, Boris; Laubenbacher, Reinhard

    2016-09-23

    Many problems in biomedicine and other areas of the life sciences can be characterized as control problems, with the goal of finding strategies to change a disease or otherwise undesirable state of a biological system into another, more desirable, state through an intervention, such as a drug or other therapeutic treatment. The identification of such strategies is typically based on a mathematical model of the process to be altered through targeted control inputs. This paper focuses on processes at the molecular level that determine the state of an individual cell, involving signaling or gene regulation. The mathematical model type considered is that of Boolean networks. The potential control targets can be represented by a set of nodes and edges that can be manipulated to produce a desired effect on the system. This paper presents a method for the identification of potential intervention targets in Boolean molecular network models using algebraic techniques. The approach exploits an algebraic representation of Boolean networks to encode the control candidates in the network wiring diagram as the solutions of a system of polynomial equations, and then uses computational algebra techniques to find such controllers. The control methods in this paper are validated through the identification of combinatorial interventions in the signaling pathways of previously reported control targets in two well-studied systems, a p53-mdm2 network and a blood T cell lymphocyte granular leukemia survival signaling network. Supplementary data are available online, and our code in Macaulay2 and Matlab is available via http://www.ms.uky.edu/~dmu228/ControlAlg . This paper presents a novel method for the identification of intervention targets in Boolean network models. The results in this paper show that the proposed methods are useful and efficient for moderately large networks.
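
    As a toy illustration of the underlying idea (not the Macaulay2 implementation referenced above), the sketch below encodes an invented three-node Boolean network, enumerates its fixed points, and tests whether pinning one node, as a candidate control, makes a desired state the only fixed point.

        from itertools import product

        # A small, invented 3-node Boolean network: each function gives the next
        # value of a node from the current state (x0, x1, x2).
        f = [lambda x: x[1] and not x[2],   # node 0
             lambda x: x[0],                # node 1
             lambda x: x[0] or x[2]]        # node 2

        def step(x, pinned=None):
            """Synchronous update; 'pinned' maps node index -> forced value (the control)."""
            y = tuple(int(fi(x)) for fi in f)
            if pinned:
                y = tuple(pinned.get(i, v) for i, v in enumerate(y))
            return y

        def fixed_points(pinned=None):
            return [x for x in product((0, 1), repeat=3) if step(x, pinned) == x]

        print("uncontrolled fixed points:", fixed_points())      # (0,0,0) and (0,0,1)
        # Control candidate: pin node 2 to 1 and check whether the desired
        # state (0, 0, 1) becomes the only fixed point.
        print("with node 2 pinned to 1:  ", fixed_points({2: 1}))  # only (0,0,1)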

  17. Computational design of RNAs with complex energy landscapes.

    Science.gov (United States)

    Höner zu Siederdissen, Christian; Hammer, Stefan; Abfalter, Ingrid; Hofacker, Ivo L; Flamm, Christoph; Stadler, Peter F

    2013-12-01

    RNA has become an integral building material in synthetic biology. Dominated by their secondary structures, which can be computed efficiently, RNA molecules are amenable not only to in vitro and in vivo selection, but also to rational, computation-based design. While the inverse folding problem of constructing an RNA sequence with a prescribed ground-state structure has received considerable attention for nearly two decades, there have been few efforts to design RNAs that can switch between distinct prescribed conformations. We introduce a user-friendly tool for designing RNA sequences that fold into multiple target structures. The underlying algorithm makes use of a combination of graph coloring and heuristic local optimization to find sequences whose energy landscapes are dominated by the prescribed conformations. A flexible interface allows the specification of a wide range of design goals. We demonstrate that bi- and tri-stable "switches" can be designed easily with moderate computational effort for the vast majority of compatible combinations of desired target structures. RNAdesign is freely available under the GPL-v3 license. Copyright © 2013 Wiley Periodicals, Inc.

  18. ISDD: A computational model of particle sedimentation, diffusion and target cell dosimetry for in vitro toxicity studies

    Science.gov (United States)

    2010-01-01

    Background The difficulty of directly measuring cellular dose is a significant obstacle to application of target tissue dosimetry for nanoparticle and microparticle toxicity assessment, particularly for in vitro systems. As a consequence, the target tissue paradigm for dosimetry and hazard assessment of nanoparticles has largely been ignored in favor of using metrics of exposure (e.g. μg particle/mL culture medium, particle surface area/mL, particle number/mL). We have developed a computational model of solution particokinetics (sedimentation, diffusion) and dosimetry for non-interacting spherical particles and their agglomerates in monolayer cell culture systems. Particle transport to cells is calculated by simultaneous solution of Stokes Law (sedimentation) and the Stokes-Einstein equation (diffusion). Results The In vitro Sedimentation, Diffusion and Dosimetry model (ISDD) was tested against measured transport rates or cellular doses for multiple sizes of polystyrene spheres (20-1100 nm), 35 nm amorphous silica, and large agglomerates of 30 nm iron oxide particles. Overall, without adjusting any parameters, model predicted cellular doses were in close agreement with the experimental data, differing from as little as 5% to as much as three-fold, but in most cases approximately two-fold, within the limits of the accuracy of the measurement systems. Applying the model, we generalize the effects of particle size, particle density, agglomeration state and agglomerate characteristics on target cell dosimetry in vitro. Conclusions Our results confirm our hypothesis that for liquid-based in vitro systems, the dose-rates and target cell doses for all particles are not equal; they can vary significantly, in direct contrast to the assumption of dose-equivalency implicit in the use of mass-based media concentrations as metrics of exposure for dose-response assessment. The difference between equivalent nominal media concentration exposures on a μg/mL basis and target cell
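
    The two transport relations solved by ISDD can be stated compactly; the sketch below evaluates them for an assumed 100 nm particle of silica-like density settling in a water-like medium at 37 degrees C, to show how the delivered dose rate depends on particle size and density. The numbers are illustrative and are not taken from the paper's validation data.

        import numpy as np

        kB = 1.380649e-23          # J/K
        T = 310.15                 # K (37 C)
        mu = 6.9e-4                # Pa*s, approximate viscosity of water at 37 C
        rho_f = 993.0              # kg/m^3, medium density (assumed ~water)
        g = 9.81                   # m/s^2

        def sedimentation_velocity(d, rho_p):
            """Stokes settling velocity (m/s) for a sphere of diameter d and density rho_p."""
            return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

        def diffusion_coefficient(d):
            """Stokes-Einstein diffusion coefficient (m^2/s) for a sphere of diameter d."""
            return kB * T / (3.0 * np.pi * mu * d)

        d = 100e-9                                   # 100 nm particle (assumed example)
        print("v_sed  =", sedimentation_velocity(d, rho_p=2000.0), "m/s")  # silica-like density
        print("D_diff =", diffusion_coefficient(d), "m^2/s")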

  19. ISDD: A computational model of particle sedimentation, diffusion and target cell dosimetry for in vitro toxicity studies

    Directory of Open Access Journals (Sweden)

    Chrisler William B

    2010-11-01

    Full Text Available Abstract Background The difficulty of directly measuring cellular dose is a significant obstacle to application of target tissue dosimetry for nanoparticle and microparticle toxicity assessment, particularly for in vitro systems. As a consequence, the target tissue paradigm for dosimetry and hazard assessment of nanoparticles has largely been ignored in favor of using metrics of exposure (e.g. μg particle/mL culture medium, particle surface area/mL, particle number/mL). We have developed a computational model of solution particokinetics (sedimentation, diffusion) and dosimetry for non-interacting spherical particles and their agglomerates in monolayer cell culture systems. Particle transport to cells is calculated by simultaneous solution of Stokes Law (sedimentation) and the Stokes-Einstein equation (diffusion). Results The In vitro Sedimentation, Diffusion and Dosimetry model (ISDD) was tested against measured transport rates or cellular doses for multiple sizes of polystyrene spheres (20-1100 nm), 35 nm amorphous silica, and large agglomerates of 30 nm iron oxide particles. Overall, without adjusting any parameters, model predicted cellular doses were in close agreement with the experimental data, differing from as little as 5% to as much as three-fold, but in most cases approximately two-fold, within the limits of the accuracy of the measurement systems. Applying the model, we generalize the effects of particle size, particle density, agglomeration state and agglomerate characteristics on target cell dosimetry in vitro. Conclusions Our results confirm our hypothesis that for liquid-based in vitro systems, the dose-rates and target cell doses for all particles are not equal; they can vary significantly, in direct contrast to the assumption of dose-equivalency implicit in the use of mass-based media concentrations as metrics of exposure for dose-response assessment. The difference between equivalent nominal media concentration exposures on a

  20. A distributed computational search strategy for the identification of diagnostics targets: Application to finding aptamer targets for methicillin-resistant staphylococci

    Directory of Open Access Journals (Sweden)

    Flanagan Keith

    2014-06-01

    Full Text Available The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.

  1. A distributed computational search strategy for the identification of diagnostics targets: application to finding aptamer targets for methicillin-resistant staphylococci.

    Science.gov (United States)

    Flanagan, Keith; Cockell, Simon; Harwood, Colin; Hallinan, Jennifer; Nakjang, Sirintra; Lawry, Beth; Wipat, Anil

    2014-06-30

    The rapid and cost-effective identification of bacterial species is crucial, especially for clinical diagnosis and treatment. Peptide aptamers have been shown to be valuable for use as a component of novel, direct detection methods. These small peptides have a number of advantages over antibodies, including greater specificity and longer shelf life. These properties facilitate their use as the detector components of biosensor devices. However, the identification of suitable aptamer targets for particular groups of organisms is challenging. We present a semi-automated processing pipeline for the identification of candidate aptamer targets from whole bacterial genome sequences. The pipeline can be configured to search for protein sequence fragments that uniquely identify a set of strains of interest. The system is also capable of identifying additional organisms that may be of interest due to their possession of protein fragments in common with the initial set. Through the use of Cloud computing technology and distributed databases, our system is capable of scaling with the rapidly growing genome repositories, and consequently of keeping the resulting data sets up-to-date. The system described is also more generically applicable to the discovery of specific targets for other diagnostic approaches such as DNA probes, PCR primers and antibodies.

  2. Roles of computed tomography and [18F]fluorodeoxyglucose-positron emission tomography/computed tomography in the characterization of multiple solitary solid lung nodules

    OpenAIRE

    Travaini, LL; Trifirò, G; Vigna, PD; Veronesi, G; De Pas, TM; Spaggiari, L; Paganelli, G; Bellomi, M

    2012-01-01

    The purpose of this study is to compare the performance of multidetector computed tomography (CT) and positron emission tomography/CT (PET/CT) with [18F]fluorodeoxyglucose in the diagnosis of multiple solitary lung nodules in 14 consecutive patients with suspicious lung cancer. CT and PET/CT findings were reviewed by a radiologist and nuclear medicine physician, respectively, blinded to the pathological diagnoses of lung cancer, considering nodule size, shape, and location (CT) and maximum st...

  3. Computation of beam quality parameters for Mo/Mo, Mo/Rh, Rh/Rh, and W/Al target/filter combinations in mammography

    International Nuclear Information System (INIS)

    Kharrati, Hedi; Zarrad, Boubaker

    2003-01-01

    A computer program was implemented to predict mammography x-ray beam parameters in the range 20-40 kV for Mo/Mo, Mo/Rh, Rh/Rh, and W/Al target/filter combinations. The computation method used to simulate mammography x-ray spectra is based on the Boone et al. model. The beam quality parameters, such as the half-value layer (HVL), the homogeneity coefficient (HC), and the average photon energy, were computed by simulating the interaction of the spectrum photons with matter. The computation was checked by comparing the results with published data and with measured values obtained at the Netherlands Metrology Institute Van Swinden Laboratorium, the National Institute of Standards and Technology, and the International Atomic Energy Agency. The predicted values, with mean deviations of 3.3% for HVL, 3.7% for HC, and 1.5% for average photon energy, show acceptable agreement with published data and measurements for all target/filter combinations in the 23-40 kV range. The accuracy of this computation can be considered clinically acceptable and allows an appreciable estimation of the beam quality parameters.
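
    The HVL computation can be illustrated with a toy two-energy spectrum: the air kerma transmitted through increasing aluminium thickness follows exponential attenuation, and the thickness that halves the kerma is found numerically. The photon energies, fluence weights, attenuation coefficients and kerma weights below are rough placeholder values, not the Boone et al. spectral model used in the paper.

        import numpy as np
        from scipy.optimize import brentq

        # Toy spectrum: two energy bins (keV), relative fluence, Al attenuation
        # coefficients (1/mm) and kerma-per-fluence weights -- all assumed values.
        energies = np.array([17.0, 20.0])
        fluence  = np.array([1.0, 0.6])
        mu_al    = np.array([0.80, 0.50])     # approximate linear attenuation in Al, 1/mm
        kerma_w  = np.array([1.0, 1.2])       # relative air-kerma weight per photon

        def kerma(t_mm):
            """Relative air kerma behind t_mm of aluminium."""
            return float(np.sum(fluence * kerma_w * np.exp(-mu_al * t_mm)))

        k0 = kerma(0.0)
        hvl = brentq(lambda t: kerma(t) - 0.5 * k0, 0.0, 10.0)    # first half-value layer
        qvl = brentq(lambda t: kerma(t) - 0.25 * k0, 0.0, 20.0)   # quarter-value layer
        print("HVL = %.3f mm Al, homogeneity coefficient = %.3f" % (hvl, hvl / (qvl - hvl)))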

  4. A Targeted "Capture" and "Removal" Scavenger toward Multiple Pollutants for Water Remediation based on Molecular Recognition.

    Science.gov (United States)

    Wang, Jie; Shen, Haijing; Hu, Xiaoxia; Li, Yan; Li, Zhihao; Xu, Jinfan; Song, Xiufeng; Zeng, Haibo; Yuan, Quan

    2016-03-01

    For water remediation techniques based on adsorption, the long-standing contradictions between selectivity and multiple adsorbability, as well as between affinity and recyclability, have left them poorly positioned amid an increasingly severe environmental crisis. Here, a pollutant-targeting hydrogel scavenger is reported for water remediation with both high selectivity and multiple adsorbability for several pollutants, and with strong affinity and good recyclability, achieved by rationally integrating the advantages of multiple functional materials. In the scavenger, aptamers fold into binding pockets that accommodate the molecular structure of pollutants to afford perfect selectivity, and Janus nanoparticles with antibacterial function as well as anisotropic surfaces to immobilize multiple aptamers allow for simultaneously handling different kinds of pollutants. The scavenger exhibits high efficiencies in removing pollutants from water, and it can be easily recycled many times without significant loss of loading capacity. Moreover, the residual concentrations of each contaminant are well below drinking water standards. The thermodynamic behavior of the adsorption process is investigated and the rate-controlling process is determined. Furthermore, a point-of-use device is constructed and displays high efficiency in removing pollutants from environmental water. The scavenger shows great promise for application in the next generation of water purification systems.

  5. Estimating Single and Multiple Target Locations Using K-Means Clustering with Radio Tomographic Imaging in Wireless Sensor Networks

    Science.gov (United States)

    2015-03-26

    K-means clustering is an algorithm that has been used in data mining applications such as machine learning, pattern recognition and hyper-spectral imagery; here it is applied to estimating single and multiple target locations with radio tomographic imaging (RTI) in wireless sensor networks (WSN).
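
    A minimal sketch of the clustering step, assuming a reconstructed RTI attenuation image is already available: pixels above a threshold are clustered with K-means and the cluster centres are taken as the estimated target locations. The image, threshold and number of targets are invented for illustration.

        import numpy as np
        from sklearn.cluster import KMeans

        # Invented 20x20 RTI attenuation image with two bright regions standing in for targets.
        img = np.zeros((20, 20))
        img[4:7, 4:7] = 1.0
        img[14:17, 12:15] = 1.0
        img += 0.05 * np.random.default_rng(1).random(img.shape)

        # Keep only pixels whose attenuation exceeds a threshold, then cluster their coordinates.
        ys, xs = np.nonzero(img > 0.5)
        coords = np.column_stack([ys, xs]).astype(float)
        km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(coords)
        print("estimated target locations (row, col):")
        print(km.cluster_centers_)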

  6. Contrast-enhanced ultrasound and computed tomography findings of granulomatosis with polyangiitis presenting with multiple intrarenal microaneurysms: A case report.

    Science.gov (United States)

    Kim, Youe Ree; Lee, Young Hwan; Lee, Jong-Ho; Yoon, Kwon-Ha

    Granulomatosis with polyangiitis (GPA) is a systemic disorder that affects small- and medium- sized vessels in many organs. Although the kidneys are the second most commonly involved organ in patients with GPA, its manifestation as multiple intrarenal aneurysms is rare. We report an unusual manifestation of GPA with multiple intrarenal microaneurysms, as demonstrated by contrast-enhanced ultrasound and computed tomography. Copyright © 2017 Elsevier Inc. All rights reserved.

  7. Mapsembler, targeted and micro assembly of large NGS datasets on a desktop computer

    Directory of Open Access Journals (Sweden)

    Peterlongo Pierre

    2012-03-01

    Full Text Available Abstract Background The analysis of next-generation sequencing data from large genomes is a timely research topic. Sequencers are producing billions of short sequence fragments from newly sequenced organisms. Computational methods for reconstructing whole genomes/transcriptomes (de novo assemblers) are typically employed to process such data. However, these methods require large memory resources and computation time. Many basic biological questions could be answered by targeting specific information in the reads, thus avoiding complete assembly. Results We present Mapsembler, an iterative micro and targeted assembler which processes large datasets of reads on commodity hardware. Mapsembler checks for the presence of given regions of interest that can be constructed from reads and builds a short assembly around them, either as a plain sequence or as a graph showing contextual structure. We introduce new algorithms to retrieve approximate occurrences of a sequence from reads and to construct an extension graph. Among other results presented in this paper, Mapsembler enabled the retrieval of previously described human breast cancer candidate fusion genes, and the detection of new ones not previously known. Conclusions Mapsembler is the first software that enables de novo discovery, around a region of interest, of repeats, SNPs, exon skipping, gene fusion, as well as other structural events, directly from raw sequencing reads. As indexing is localized, the memory footprint of Mapsembler is negligible. Mapsembler is released under the CeCILL license and can be freely downloaded from http://alcovna.genouest.org/mapsembler/.

  8. Multiple-Parameter Estimation Method Based on Spatio-Temporal 2-D Processing for Bistatic MIMO Radar

    Directory of Open Access Journals (Sweden)

    Shouguo Yang

    2015-12-01

    Full Text Available A novel spatio-temporal two-dimensional (2-D) processing method that can jointly estimate the transmitting-receiving azimuth and Doppler frequency for bistatic multiple-input multiple-output (MIMO) radar in the presence of spatial colored noise and an unknown number of targets is proposed. In the temporal domain, the cross-correlation of the matched filters' outputs for different time-delay sampling is used to eliminate the spatial colored noise. In the spatial domain, the proposed method uses a diagonal loading method and subspace theory to estimate the direction of departure (DOD) and direction of arrival (DOA); the Doppler frequency can then be accurately estimated through the estimation of the DOD and DOA. By skipping target number estimation and the eigenvalue decomposition (EVD) of the data covariance matrix, and requiring only a one-dimensional search, the proposed method achieves low computational complexity. Furthermore, the proposed method is suitable for bistatic MIMO radar with an arbitrary transmitting and receiving geometrical configuration. The correctness and efficiency of the proposed method are verified by computer simulation results.

  9. Fuzzy multiple linear regression: A computational approach

    Science.gov (United States)

    Juang, C. H.; Huang, X. H.; Fleming, J. W.

    1992-01-01

    This paper presents a new computational approach for performing fuzzy regression. In contrast to Bardossy's approach, the new approach, while dealing with fuzzy variables, closely follows the conventional regression technique. In this approach, treatment of fuzzy input is more 'computational' than 'symbolic.' The following sections first outline the formulation of the new approach, then deal with the implementation and computational scheme, and this is followed by examples to illustrate the new procedure.

  10. Multiplicities of charged pions and unidentified charged hadrons from deep-inelastic scattering of muons off an isoscalar target

    CERN Document Server

    Adolph, C.; Aghasyan, M.; Akhunzyanov, R.; Alexeev, M.G.; Alexeev, G.D.; Amoroso, A.; Andrieux, V.; Anfimov, N.V.; Anosov, V.; Augustyniak, W.; Austregesilo, A.; Azevedo, C.D.R.; Badelek, B.; Balestra, F.; Barth, J.; Beck, R.; Bedfer, Y.; Bernhard, J.; Bicker, K.; Bielert, E.R.; Birsa, R.; Bisplinghoff, J.; Bodlak, M.; Boer, M.; Bordalo, P.; Bradamante, F.; Braun, C.; Bressan, A.; Buechele, M.; Capozza, L.; Chang, W. -C.; Chatterjee, C.; Chiosso, M.; Choi, I.; Chung, S. -U.; Cicuttin, A.; Crespo, M.L.; Curiel, Q.; Dalla Torre, S.; Dasgupta, S.S.; Dasgupta, S.; Denisov, O. Yu.; Dhara, L.; Donskov, S.V.; Doshita, N.; Duic, V.; Duennweber, W.; Dziewiecki, M.; Efremov, A.; Eversheim, P.D.; Eyrich, W.; Faessler, M.; Ferrero, A.; Finger, M.; Fischer, H.; Franco, C.; von Hohenesche, N. du Fresne; Friedrich, J.M.; Frolov, V.; Fuchey, E.; Gautheron, F.; Gavrichtchouk, O.P.; Gerassimov, S.; Giordano, F.; Gnesi, I.; Gorzellik, M.; Grabmueller, S.; Grasso, A.; Grosse Perdekamp, M.; Grube, B.; Grussenmeyer, T.; Guskov, A.; Haas, F.; Hahne, D.; von Harrach, D.; Hashimoto, R.; Heinsius, F.H.; Heitz, R.; Herrmann, F.; Hinterberger, F.; Horikawa, N.; dHose, N.; Hsieh, C. -Y.; Huber, S.; Ishimoto, S.; Ivanov, A.; Ivanshin, Yu.; Iwata, T.; Jahn, R.; Jary, V.; Joosten, R.; Joerg, P.; Kabuss, E.; Ketzer, B.; Khaustov, G.V.; Khokhlov, Yu. A.; Kisselev, Yu.; Klein, F.; Klimaszewski, K.; Koivuniemi, J.H.; Kolosov, V.N.; Kondo, K.; Koenigsmann, K.; Konorov, I.; Konstantinov, V.F.; Kotzinian, A.M.; Kouznetsov, O.M.; Kuhn, R.; Kraemer, M.; Kremser, P.; Krinner, F.; Kroumchtein, Z.V.; Kulinich, Y.; Kunne, F.; Kurek, K.; Kurjata, R.P.; Lednev, A.A.; Lehmann, A.; Levillain, M.; Levorato, S.; Lichtenstadt, J.; Longo, R.; Maggiora, A.; Magnon, A.; Makins, N.; Makke, N.; Mallot, G.K.; Marchand, C.; Marianski, B.; Martin, A.; Marzec, J.; Matousek, J.; Matsuda, H.; Matsuda, T.; Meshcheryakov, G.V.; Meyer, W.; Michigami, T.; Mikhailov, Yu. V.; Mikhasenko, M.; Mitrofanov, E.; Mitrofanov, N.; Miyachi, Y.; Montuenga, P.; Nagaytsev, A.; Nerling, F.; Neyret, D.; Nikolaenko, V.I.; Novy, J.; Nowak, W.-D.; Nukazuka, G.; Nunes, A.S.; Olshevsky, A.G.; Orlov, I.; Ostrick, M.; Panzieri, D.; Parsamyan, B.; Paul, S.; Peng, J. -C.; Pereira, F.; Pesek, M.; Peshekhonov, D.V.; Pierre, N.; Platchkov, S.; Pochodzalla, J.; Polyakov, V.A.; Pretz, J.; Quaresma, M.; Quintans, C.; Ramos, S.; Regali, C.; Reicherz, G.; Riedl, C.; Roskot, M.; Ryabchikov, D.I.; Rybnikov, A.; Rychter, A.; Salac, R.; Samoylenko, V.D.; Sandacz, A.; Santos, C.; Sarkar, S.; Savin, I.A.; Sawada, T.; Sbrizzai, G.; Schiavon, P.; Schmidt, K.; Schmieden, H.; Schoenning, K.; Schopferer, S.; Seder, E.; Selyunin, A.; Shevchenko, O. Yu.; Steffen, D.; Silva, L.; Sinha, L.; Sirtl, S.; Slunecka, M.; Smolik, J.; Sozzi, F.; Srnka, A.; Stolarski, M.; Sulc, M.; Suzuki, H.; Szabelski, A.; Szameitat, T.; Sznajder, P.; Takekawa, S.; Tasevsky, M.; Tessaro, S.; Tessarotto, F.; Thibaud, F.; Tosello, F.; Tskhay, V.; Uhl, S.; Veloso, J.; Virius, M.; Vondra, J.; Weisrock, T.; Wilfert, M.; Windmolders, R.; ter Wolbeek, J.; Zaremba, K.; Zavada, P.; Zavertyaev, M.; Zemlyanichkina, E.; Ziembicki, M.; Zink, A.

    2017-01-10

    Multiplicities of charged pions and unidentified hadrons produced in deep-inelastic scattering were measured in bins of the Bjorken scaling variable $x$, the relative virtual-photon energy $y$ and the relative hadron energy $z$. Data were obtained by the COMPASS Collaboration using a 160 GeV muon beam and an isoscalar target ($^6$LiD). They cover the kinematic domain in the photon virtuality $Q^2$ > 1(GeV/c$)^2$, $0.004 < x < 0.4$, $0.2 < z < 0.85$ and $0.1 < y < 0.7$. In addition, a leading-order pQCD analysis was performed using the pion multiplicity results to extract quark fragmentation functions.

  11. Addressing the targeting range of the ABILHAND-56 in relapsing-remitting multiple sclerosis: A mixed methods psychometric study.

    Science.gov (United States)

    Cleanthous, Sophie; Strzok, Sara; Pompilus, Farrah; Cano, Stefan; Marquis, Patrick; Cohan, Stanley; Goldman, Myla D; Kresa-Reahl, Kiren; Petrillo, Jennifer; Castrillo-Viguera, Carmen; Cadavid, Diego; Chen, Shih-Yin

    2018-01-01

    ABILHAND, a manual ability patient-reported outcome instrument originally developed for stroke patients, has been used in multiple sclerosis clinical trials; however, psychometric analyses indicated the measure's limited measurement range and precision in higher-functioning multiple sclerosis patients. The purpose of this study was to identify candidate items to expand the measurement range of the ABILHAND-56, thus improving its ability to detect differences in manual ability in higher-functioning multiple sclerosis patients. A step-wise mixed methods design strategy was used, comprising two waves of patient interviews, a combination of qualitative (concept elicitation and cognitive debriefing) and quantitative (Rasch measurement theory) analytic techniques, and consultation interviews with three clinical neurologists specializing in multiple sclerosis. Original ABILHAND was well understood in this context of use. Eighty-two new manual ability concepts were identified. Draft supplementary items were generated and refined with patient and neurologist input. Rasch measurement theory psychometric analysis indicated supplementary items improved targeting to higher-functioning multiple sclerosis patients and measurement precision. The final pool of Early Multiple Sclerosis Manual Ability items comprises 20 items. The synthesis of qualitative and quantitative methods used in this study improves the ABILHAND content validity to more effectively identify manual ability changes in early multiple sclerosis and potentially help determine treatment effect in higher-functioning patients in clinical trials.

  12. Multiple anatomy optimization of accumulated dose

    International Nuclear Information System (INIS)

    Watkins, W. Tyler; Siebers, Jeffrey V.; Moore, Joseph A.; Gordon, James; Hugo, Geoffrey D.

    2014-01-01

    Purpose: To investigate the potential advantages of multiple anatomy optimization (MAO) for lung cancer radiation therapy compared to the internal target volume (ITV) approach. Methods: MAO aims to optimize a single fluence to be delivered under free-breathing conditions such that the accumulated dose meets the plan objectives, where accumulated dose is defined as the sum of deformably mapped doses computed on each phase of a single four dimensional computed tomography (4DCT) dataset. Phantom and patient simulation studies were carried out to investigate potential advantages of MAO compared to ITV planning. Through simulated delivery of the ITV- and MAO-plans, target dose variations were also investigated. Results: By optimizing the accumulated dose, MAO shows the potential to ensure dose to the moving target meets plan objectives while simultaneously reducing dose to organs at risk (OARs) compared with ITV planning. While consistently superior to the ITV approach, MAO resulted in equivalent OAR dosimetry at planning objective dose levels to within 2% volume in 14/30 plans and to within 3% volume in 19/30 plans for each lung V20, esophagus V25, and heart V30. Despite large variations in per-fraction respiratory phase weights in simulated deliveries at high dose rates (e.g., treating 4/10 phases during single fraction beams) the cumulative clinical target volume (CTV) dose after 30 fractions and per-fraction dose were constant independent of planning technique. In one case considered, however, per-phase CTV dose varied from 74% to 117% of prescription implying the level of ITV-dose heterogeneity may not be appropriate with conventional, free-breathing delivery. Conclusions: MAO incorporates 4DCT information in an optimized dose distribution and can achieve a superior plan in terms of accumulated dose to the moving target and OAR sparing compared to ITV-plans. An appropriate level of dose heterogeneity in MAO plans must be further investigated.
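
    The accumulated-dose bookkeeping at the heart of MAO can be illustrated with a short sketch. The Python fragment below is only a structural illustration under stated assumptions: the per-phase doses are assumed to be already computed and deformably mapped to a common reference grid, and the function name and optional phase weights are hypothetical, not part of the published method.

```python
# Sketch of accumulated dose: a weighted sum of deformably mapped per-phase
# dose grids, evaluated on a common reference-phase grid.
import numpy as np

def accumulate_dose(mapped_phase_doses, phase_weights=None):
    """mapped_phase_doses : list of 3-D numpy arrays, one per 4DCT phase,
                            all resampled to the reference-phase grid.
       phase_weights      : optional per-phase weights (e.g. breathing-phase
                            probabilities); defaults to equal weighting."""
    n_phases = len(mapped_phase_doses)
    if phase_weights is None:
        phase_weights = np.full(n_phases, 1.0 / n_phases)
    accumulated = np.zeros_like(mapped_phase_doses[0])
    for dose, w in zip(mapped_phase_doses, phase_weights):
        accumulated += w * dose
    return accumulated

# An optimizer would evaluate its objectives (target coverage, OAR limits)
# on the output of accumulate_dose(...) rather than on any single-phase dose.
```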

  13. Multiple anatomy optimization of accumulated dose

    Energy Technology Data Exchange (ETDEWEB)

    Watkins, W. Tyler, E-mail: watkinswt@virginia.edu; Siebers, Jeffrey V. [Department of Radiation Oncology, University of Virginia, Charlottesville, Virginia 22908 and Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States); Moore, Joseph A. [Department of Radiation Oncology and Molecular Radiation Sciences, Johns Hopkins University, Baltimore, Maryland 21231 and Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States); Gordon, James [Henry Ford Health System, Detroit, Michigan 48202 and Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States); Hugo, Geoffrey D. [Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia 23298 (United States)

    2014-11-01

    Purpose: To investigate the potential advantages of multiple anatomy optimization (MAO) for lung cancer radiation therapy compared to the internal target volume (ITV) approach. Methods: MAO aims to optimize a single fluence to be delivered under free-breathing conditions such that the accumulated dose meets the plan objectives, where accumulated dose is defined as the sum of deformably mapped doses computed on each phase of a single four dimensional computed tomography (4DCT) dataset. Phantom and patient simulation studies were carried out to investigate potential advantages of MAO compared to ITV planning. Through simulated delivery of the ITV- and MAO-plans, target dose variations were also investigated. Results: By optimizing the accumulated dose, MAO shows the potential to ensure dose to the moving target meets plan objectives while simultaneously reducing dose to organs at risk (OARs) compared with ITV planning. While consistently superior to the ITV approach, MAO resulted in equivalent OAR dosimetry at planning objective dose levels to within 2% volume in 14/30 plans and to within 3% volume in 19/30 plans for each lung V20, esophagus V25, and heart V30. Despite large variations in per-fraction respiratory phase weights in simulated deliveries at high dose rates (e.g., treating 4/10 phases during single fraction beams) the cumulative clinical target volume (CTV) dose after 30 fractions and per-fraction dose were constant independent of planning technique. In one case considered, however, per-phase CTV dose varied from 74% to 117% of prescription implying the level of ITV-dose heterogeneity may not be appropriate with conventional, free-breathing delivery. Conclusions: MAO incorporates 4DCT information in an optimized dose distribution and can achieve a superior plan in terms of accumulated dose to the moving target and OAR sparing compared to ITV-plans. An appropriate level of dose heterogeneity in MAO plans must be further investigated.

  14. Multiple anatomy optimization of accumulated dose.

    Science.gov (United States)

    Watkins, W Tyler; Moore, Joseph A; Gordon, James; Hugo, Geoffrey D; Siebers, Jeffrey V

    2014-11-01

    To investigate the potential advantages of multiple anatomy optimization (MAO) for lung cancer radiation therapy compared to the internal target volume (ITV) approach. MAO aims to optimize a single fluence to be delivered under free-breathing conditions such that the accumulated dose meets the plan objectives, where accumulated dose is defined as the sum of deformably mapped doses computed on each phase of a single four dimensional computed tomography (4DCT) dataset. Phantom and patient simulation studies were carried out to investigate potential advantages of MAO compared to ITV planning. Through simulated delivery of the ITV- and MAO-plans, target dose variations were also investigated. By optimizing the accumulated dose, MAO shows the potential to ensure dose to the moving target meets plan objectives while simultaneously reducing dose to organs at risk (OARs) compared with ITV planning. While consistently superior to the ITV approach, MAO resulted in equivalent OAR dosimetry at planning objective dose levels to within 2% volume in 14/30 plans and to within 3% volume in 19/30 plans for each lung V20, esophagus V25, and heart V30. Despite large variations in per-fraction respiratory phase weights in simulated deliveries at high dose rates (e.g., treating 4/10 phases during single fraction beams) the cumulative clinical target volume (CTV) dose after 30 fractions and per-fraction dose were constant independent of planning technique. In one case considered, however, per-phase CTV dose varied from 74% to 117% of prescription implying the level of ITV-dose heterogeneity may not be appropriate with conventional, free-breathing delivery. MAO incorporates 4DCT information in an optimized dose distribution and can achieve a superior plan in terms of accumulated dose to the moving target and OAR sparing compared to ITV-plans. An appropriate level of dose heterogeneity in MAO plans must be further investigated.

  15. Hindsight regulates photoreceptor axon targeting through transcriptional control of jitterbug/Filamin and multiple genes involved in axon guidance in Drosophila.

    Science.gov (United States)

    Oliva, Carlos; Molina-Fernandez, Claudia; Maureira, Miguel; Candia, Noemi; López, Estefanía; Hassan, Bassem; Aerts, Stein; Cánovas, José; Olguín, Patricio; Sierralta, Jimena

    2015-09-01

    During axon targeting, a stereotyped pattern of connectivity is achieved by the integration of intrinsic genetic programs and the response to extrinsic long and short-range directional cues. How this coordination occurs is the subject of intense study. Transcription factors play a central role due to their ability to regulate the expression of multiple genes required to sense and respond to these cues during development. Here we show that the transcription factor HNT regulates layer-specific photoreceptor axon targeting in Drosophila through transcriptional control of jbug/Filamin and multiple genes involved in axon guidance and cytoskeleton organization. Using a microarray analysis we identified 235 genes whose expression levels were changed by HNT overexpression in the eye primordia. We analyzed nine candidate genes involved in cytoskeleton regulation and axon guidance, six of which displayed significantly altered gene expression levels in hnt mutant retinas. Functional analysis confirmed the role of OTK/PTK7 in photoreceptor axon targeting and uncovered Tiggrin, an integrin ligand, and Jbug/Filamin, a conserved actin-binding protein, as new factors that participate in photoreceptor axon targeting. Moreover, we provided in silico and molecular evidence that supports jbug/Filamin as a direct transcriptional target of HNT and that HNT acts partially through Jbug/Filamin in vivo to regulate axon guidance. Our work broadens the understanding of how HNT regulates the coordinated expression of a group of genes to achieve the correct connectivity pattern in the Drosophila visual system. © 2015 Wiley Periodicals, Inc. Develop Neurobiol 75: 1018-1032, 2015.

  16. Non-target adjacent stimuli classification improves performance of classical ERP-based brain computer interface

    Science.gov (United States)

    Ceballos, G. A.; Hernández, L. F.

    2015-04-01

    Objective. The classical ERP-based speller, or P300 Speller, is one of the most commonly used paradigms in the field of Brain Computer Interfaces (BCI). Several alterations to the visual stimuli presentation system have been developed to avoid unfavorable effects elicited by adjacent stimuli. However, little, if any, attention has been paid to the useful information about the spatial location of target symbols that is contained in responses to adjacent stimuli. This paper aims to demonstrate that combining the classification of non-target adjacent stimuli with standard classification (target versus non-target) significantly improves classical ERP-based speller efficiency. Approach. Four SWLDA classifiers were trained and combined with the standard classifier: the lower row, upper row, right column and left column classifiers. This new feature extraction procedure and the classification method were carried out on three open databases: the UAM P300 database (Universidad Autonoma Metropolitana, Mexico), BCI competition II (dataset IIb) and BCI competition III (dataset II). Main results. The inclusion of the classification of non-target adjacent stimuli improves target classification in the classical row/column paradigm. A gain of 9.6% in mean single-trial classification and an overall improvement of 25% in simulated spelling speed were achieved. Significance. We have provided further evidence that the ERPs produced by adjacent stimuli present discriminable features, which could provide additional information about the spatial location of intended symbols. This work encourages the search for information in peripheral stimulation responses to improve the performance of emerging visual ERP-based spellers.
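
    As a rough illustration of the fusion idea described above, the sketch below trains one standard target/non-target classifier plus four adjacent-stimulus classifiers and sums their scores. Plain LDA from scikit-learn stands in for SWLDA, and the feature layout, label construction and simple additive fusion rule are assumptions for illustration rather than the authors' exact procedure.

```python
# One standard classifier plus four adjacent-stimulus classifiers
# (lower row, upper row, right column, left column), fused by summing scores.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_classifiers(X, y_target, y_adjacent):
    """X: (n_epochs, n_features) ERP features.
    y_target: 1 if the flashed row/column contains the target, else 0.
    y_adjacent: dict of four binary label vectors, keyed 'lower', 'upper',
    'right', 'left' (1 if the flash is adjacent to the target)."""
    clfs = {'target': LinearDiscriminantAnalysis().fit(X, y_target)}
    for name, y in y_adjacent.items():
        clfs[name] = LinearDiscriminantAnalysis().fit(X, y)
    return clfs

def combined_score(clfs, x):
    """Fuse the standard score with the adjacent-stimulus scores (simple sum)."""
    x = np.asarray(x).reshape(1, -1)
    score = clfs['target'].decision_function(x)[0]
    for name in ('lower', 'upper', 'right', 'left'):
        score += clfs[name].decision_function(x)[0]
    return score
```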

  17. Encryption and display of multiple-image information using computer-generated holography with modified GS iterative algorithm

    Science.gov (United States)

    Xiao, Dan; Li, Xiaowei; Liu, Su-Juan; Wang, Qiong-Hua

    2018-03-01

    In this paper, a new scheme of multiple-image encryption and display based on computer-generated holography (CGH) and maximum length cellular automata (MLCA) is presented. With the scheme, the computer-generated hologram, which has the information of the three primitive images, is generated by modified Gerchberg-Saxton (GS) iterative algorithm using three different fractional orders in fractional Fourier domain firstly. Then the hologram is encrypted using MLCA mask. The ciphertext can be decrypted combined with the fractional orders and the rules of MLCA. Numerical simulations and experimental display results have been carried out to verify the validity and feasibility of the proposed scheme.
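
    The core of the scheme is a Gerchberg-Saxton style iteration. The minimal sketch below uses ordinary FFTs in place of the paper's fractional Fourier transforms of three different orders and omits the MLCA encryption step, so it only illustrates the alternating amplitude/phase constraints, not the published algorithm.

```python
# Minimal Gerchberg-Saxton loop: alternate between the image plane (enforce
# the target amplitude) and the hologram plane (keep phase only).
import numpy as np

def gerchberg_saxton(target_amplitude, n_iter=50):
    """Return a phase-only hologram whose transform approximates the target."""
    rng = np.random.default_rng(0)
    phase = np.exp(1j * 2 * np.pi * rng.random(target_amplitude.shape))
    field = target_amplitude * phase
    for _ in range(n_iter):
        hologram = np.fft.ifft2(field)               # back to hologram plane
        hologram = np.exp(1j * np.angle(hologram))   # keep phase only
        field = np.fft.fft2(hologram)                # forward to image plane
        field = target_amplitude * np.exp(1j * np.angle(field))  # enforce amplitude
    return np.angle(hologram)
```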

  18. GAMUT: GPU accelerated microRNA analysis to uncover target genes through CUDA-miRanda

    Science.gov (United States)

    2014-01-01

    Background Non-coding sequences such as microRNAs have important roles in disease processes. Computational microRNA target identification (CMTI) is becoming increasingly important since traditional experimental methods for target identification pose many difficulties. These methods are time-consuming, costly, and often need guidance from computational methods to narrow down candidate genes anyway. However, most CMTI methods are computationally demanding, since they need to handle not only several million query microRNA and reference RNA pairs, but also several million nucleotide comparisons within each given pair. Thus, the need to perform microRNA identification at such large scale has increased the demand for parallel computing. Methods Although most CMTI programs (e.g., the miRanda algorithm) are based on a modified Smith-Waterman (SW) algorithm, the existing parallel SW implementations (e.g., CUDASW++ 2.0/3.0, SWIPE) are unable to meet this demand in CMTI tasks. We present CUDA-miRanda, a fast microRNA target identification algorithm that takes advantage of massively parallel computing on Graphics Processing Units (GPU) using NVIDIA's Compute Unified Device Architecture (CUDA). CUDA-miRanda specifically focuses on the local alignment of short (i.e., ≤ 32 nucleotides) sequences against longer reference sequences (e.g., 20K nucleotides). Moreover, the proposed algorithm is able to report multiple alignments (up to 191 top scores) and the corresponding traceback sequences for any given (query sequence, reference sequence) pair. Results Speeds over 5.36 Giga Cell Updates Per Second (GCUPs) are achieved on a server with 4 NVIDIA Tesla M2090 GPUs. Compared to the original miRanda algorithm, which is evaluated on an Intel Xeon E5620@2.4 GHz CPU, the experimental results show up to 166 times performance gains in terms of execution time. In addition, we have verified that the exact same targets were predicted in both CUDA-miRanda and the original mi
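
    At its core, the computation parallelized by CUDA-miRanda is Smith-Waterman local alignment of a short query against a longer reference. The plain-Python, scoring-only sketch below illustrates the recurrence; the match/mismatch/gap values are placeholders and do not reproduce miRanda's scoring scheme.

```python
# Smith-Waterman local alignment score (no traceback) for a short query
# against a longer reference sequence.
import numpy as np

def smith_waterman_score(query, ref, match=5, mismatch=-4, gap=-8):
    m, n = len(query), len(ref)
    H = np.zeros((m + 1, n + 1))
    best = 0.0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            s = match if query[i - 1] == ref[j - 1] else mismatch
            H[i, j] = max(0.0,
                          H[i - 1, j - 1] + s,   # diagonal (match/mismatch)
                          H[i - 1, j] + gap,     # gap in the reference
                          H[i, j - 1] + gap)     # gap in the query
            best = max(best, H[i, j])
    return best

# Example: a short query scored against a longer reference.
print(smith_waterman_score("UGAGGUAG", "AACUAUGAGGUAGUAGGUUGUAUAGUU"))
```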

  19. Monte Carlo computation of Bremsstrahlung intensity and energy spectrum from a 15 MV linear electron accelerator tungsten target to optimise LINAC head shielding

    International Nuclear Information System (INIS)

    Biju, K.; Sharma, Amiya; Yadav, R.K.; Kannan, R.; Bhatt, B.C.

    2003-01-01

    Knowledge of the exact photon intensity and energy distributions from the target of an electron accelerator is necessary when designing the shielding for the accelerator head from a radiation safety point of view. The computations were carried out for the intensity and energy distribution of the photon spectrum from a 0.4 cm thick tungsten target in different angular directions for 15 MeV electrons using the validated Monte Carlo code MCNP4A. Similar results were computed for 30 MeV electrons and found to agree with the data available in the literature. These graphs and the TVT values in lead help to suggest an optimum shielding thickness for a 15 MV Linac head. (author)

  20. Endocytosis of Cytotoxic Granules Is Essential for Multiple Killing of Target Cells by T Lymphocytes.

    Science.gov (United States)

    Chang, Hsin-Fang; Bzeih, Hawraa; Schirra, Claudia; Chitirala, Praneeth; Halimani, Mahantappa; Cordat, Emmanuelle; Krause, Elmar; Rettig, Jens; Pattu, Varsha

    2016-09-15

    CTLs are serial killers that kill multiple target cells via exocytosis of cytotoxic granules (CGs). CG exocytosis is tightly regulated and has been investigated in great detail; however, whether CG proteins are endocytosed following exocytosis and contribute to serial killing remains unknown. By using primary CTLs derived from a knock-in mouse of the CG membrane protein Synaptobrevin2, we show that CGs are endocytosed in a clathrin- and dynamin-dependent manner. Following acidification, endocytosed CGs are recycled through early and late, but not recycling endosomes. CGs are refilled with granzyme B at the late endosome stage and polarize to subsequent synapses formed between the CTL and new target cells. Importantly, inhibiting CG endocytosis in CTLs results in a significant reduction of their cytotoxic activity. Thus, our data demonstrate that continuous endocytosis of CG membrane proteins is a prerequisite for efficient serial killing of CTLs and identify key events in this process. Copyright © 2016 by The American Association of Immunologists, Inc.

  1. Identification of polycystic ovary syndrome potential drug targets based on pathobiological similarity in the protein-protein interaction network

    OpenAIRE

    Huang, Hao; He, Yuehan; Li, Wan; Wei, Wenqing; Li, Yiran; Xie, Ruiqiang; Guo, Shanshan; Wang, Yahui; Jiang, Jing; Chen, Binbin; Lv, Junjie; Zhang, Nana; Chen, Lina; He, Weiming

    2016-01-01

    Polycystic ovary syndrome (PCOS) is one of the most common endocrinological disorders in reproductive-aged women. PCOS and Type 2 Diabetes (T2D) are closely linked at multiple levels and possess high pathobiological similarity. Here, we put forward a new computational approach based on pathobiological similarity to identify PCOS potential drug target modules (PPDT-Modules) and PCOS potential drug targets in the protein-protein interaction network (PPIN). From the systems level and biologi...

  2. Multi-UAV joint target recognizing based on binocular vision theory

    Directory of Open Access Journals (Sweden)

    Yuan Zhang

    2017-01-01

    Target recognition by an unmanned aerial vehicle (UAV) based on image processing takes advantage of the 2D information contained in the image to identify the target. Compared to a single UAV with an electro-optical tracking system (EOTS), a multi-UAV system with EOTS can take a group of images of the suspected target from multiple viewpoints. By matching each pair of images in this group, the set of matched feature points encodes the depth of each point. The coordinates of the target feature points can then be computed from these depths. This depth information forms a point cloud, from which an exclusive 3D model is reconstructed for the recognition system. Since target recognition does not require a precise target model, the cloud of feature points is regrouped into n subsets and reconstructed into a semi-3D model. Projecting these subsets onto Cartesian coordinate planes and feeding the projections into convolutional neural networks (CNNs), the integrated output of the networks yields an improved recognition result.

  3. Damage to Preheated Tungsten Targets after Multiple Plasma Impacts Simulating ITER ELMs

    Energy Technology Data Exchange (ETDEWEB)

    Garkusha, I.E.; Bandura, A.N.; Byrka, O.V.; Chebotarev, V.V.; Makhlay, V.A.; Tereshin, V.I. [Kharkov Inst. of Physics and Technology, Inst. of Plasma Physics of National Science Center, Akademicheskaya street, 1, 61108 Kharkov (Ukraine); Landman, I.; Pestchanyi, S. [FZK-Forschungszentrum Karlsruhe, Association Euratom-FZK, Technik und Umwelt, Postfach 3640, D-76021 Karlsruhe (Germany)

    2007-07-01

    Full text of publication follows: The energy loads onto ITER divertor surfaces associated with Type I ELMs are expected to be up to 1 MJ/m{sup 2} during 0.1-0.5 ms, with about 10{sup 3} pulses per discharge. Tungsten is a candidate material for a major part of the surface, but its brittleness can result in substantial macroscopic erosion after repetitive heat loads. To minimize brittle destruction, tungsten may be preheated above the ductile-to-brittle transition temperature. In this work the behavior of preheated tungsten targets under repetitive ELM-like plasma pulses is studied in simulation experiments with the quasi-stationary plasma accelerator QSPA Kh-50. The targets were exposed to up to 450 pulses of duration 0.25 ms with heat loads of either 0.45 MJ/m{sup 2} or 0.75 MJ/m{sup 2}, which are respectively below and above the melting threshold. During the exposures the targets were permanently kept preheated at 650 deg. C by a heater at the target backside. In the course of the exposures the irradiated surfaces were examined after regular numbers of pulses using SEM and optical microscopy. Profilometry, XRD, microhardness and weight-loss measurements were performed, as well as comparisons of surface damage after heat loads both below and above the melting threshold. It is found that macro-cracks do not develop on the preheated surface. After the impacts with surface melting, a fine mesh of intergranular microcracks appeared. The width of the fine intergranular cracks grows with pulse number, reaching 1-1.5 microns after 100 pulses, and after 210 pulses the crack width increases up to 20 microns, which is comparable with the grain size. Threshold changes in surface morphology resulting in corrugation structures and pits on the surface, as well as the importance of surface tension in the resulting 'micro-brush' structures, are discussed. Further evolution of the surface pattern is caused by loss of separated grains on exposed

  4. Energy, target, projectile and multiplicity dependences of intermittency behaviour in high energy O(Si,S) induced interactions

    International Nuclear Information System (INIS)

    Adamovich, M.I.; Alexandrov, Y.A.; Chernyavski, M.M.; Gerassimov, S.G.; Kharlamov, S.P.; Larionova, V.G.; Maslennikova, N.V.; Orlova, G.I.; Peresadko, N.G.; Salmanova, N.A.; Tretyakova, M.I.; Ameeva, Z.U.; Andreeva, N.P.; Anzon, Z.V.; Bubnov, V.I.; Chasnikov, I.Y.; Eligbaeva, G.Z.; Eremenko, G.Z.; Gaitinov, A.S.; Kalyachkina, G.S.; Kanygina, E.K.; Skakhova, C.I.; Bhalla, K.B.; Kumar, V.; Lal, P.; Lokanathan, S.; Mookerjee, S.; Raniwala, R.; Raniwala, S.; Burnett, T.H.; Grote, J.; Koss, T.; Lord, J.; Skelding, D.; Strausz, S.C.; Wilkes, R.J.; Cai, X.; Huang, H.; Liu, L.S.; Qian, W.Y.; Wang, H.Q.; Zhou, D.C.; Zhou, J.C.; Chernova, L.P.; Gadzhieva, S.I.; Gulamov, K.G.; Kadyrov, F.G.; Lukicheva, N.S.; Navotny, V.S.; Svechnikova, L.N.; Friedlander, E.M.; Heckman, H.H.; Lindstrom, P.J.; Garpman, S.; Jakobsson, B.; Otterlund, I.; Persson, S.; Soederstroem, K.; Stenlund, E.; Judek, B.; Nasyrov, S.H.; Petrov, N.V.; Xu, G.F.; Zheng, P.Y.

    1991-01-01

    Fluctuations of charged particles in high energy oxygen, silicon and sulphur induced interactions are investigated with the method of scaled factorial moments. It is found that for decreasing bin size down to δη ≈ 0.1 the EMU01 data exhibit intermittent behaviour. The intermittency indices are found to decrease with increasing incident energy and multiplicity and to increase with increasing target mass. They also seem to increase as the projectile mass increases. (orig.)
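
    For readers unfamiliar with the method, the sketch below computes a scaled factorial moment for a given bin size; the normalization (by the overall mean bin multiplicity) is one common convention and is an assumption here, not necessarily the exact prescription used by the EMU01 analysis.

```python
# Scaled factorial moment F_q for one pseudorapidity bin size: the event- and
# bin-averaged falling factorial n(n-1)...(n-q+1) of the bin multiplicity,
# normalized by the q-th power of the mean bin multiplicity.  Intermittency
# appears as a power-law rise of F_q with decreasing bin size.
import numpy as np

def scaled_factorial_moment(events_eta, q, eta_min, eta_max, n_bins):
    """events_eta: list of arrays of particle pseudorapidities, one per event."""
    edges = np.linspace(eta_min, eta_max, n_bins + 1)
    counts = np.array([np.histogram(ev, bins=edges)[0] for ev in events_eta])
    falling = np.ones_like(counts, dtype=float)
    for k in range(q):                 # n (n-1) ... (n-q+1) per bin
        falling *= counts - k
    return falling.mean() / counts.mean() ** q
```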

  5. Using multiple metaphors and multimodalities as a semiotic resource when teaching year 2 students computational strategies

    Science.gov (United States)

    Mildenhall, Paula; Sherriff, Barbara

    2017-06-01

    Recent research indicates that using multimodal learning experiences can be effective in teaching mathematics. Using a social semiotic lens within a participationist framework, this paper reports on a professional learning collaboration with a primary school teacher designed to explore the use of metaphors and modalities in mathematics instruction. This video case study was conducted in a year 2 classroom over two terms, with the focus on building children's understanding of computational strategies. The findings revealed that the teacher was able to successfully plan both multimodal and multiple metaphor learning experiences that acted as semiotic resources to support the children's understanding of abstract mathematics. The study also led to implications for teaching when using multiple metaphors and multimodalities.

  6. Enhanced Algorithms for EO/IR Electronic Stabilization, Clutter Suppression, and Track-Before-Detect for Multiple Low Observable Targets

    Science.gov (United States)

    Tartakovsky, A.; Brown, A.; Brown, J.

    The paper describes the development and evaluation of a suite of advanced algorithms which provide significantly-improved capabilities for finding, fixing, and tracking multiple ballistic and flying low observable objects in highly stressing cluttered environments. The algorithms have been developed for use in satellite-based staring and scanning optical surveillance suites for applications including theatre and intercontinental ballistic missile early warning, trajectory prediction, and multi-sensor track handoff for midcourse discrimination and intercept. The functions performed by the algorithms include electronic sensor motion compensation providing sub-pixel stabilization (to 1/100 of a pixel), as well as advanced temporal-spatial clutter estimation and suppression to below sensor noise levels, followed by statistical background modeling and Bayesian multiple-target track-before-detect filtering. The multiple-target tracking is performed in physical world coordinates to allow for multi-sensor fusion, trajectory prediction, and intercept. Output of detected object cues and data visualization are also provided. The algorithms are designed to handle a wide variety of real-world challenges. Imaged scenes may be highly complex and infinitely varied -- the scene background may contain significant celestial, earth limb, or terrestrial clutter. For example, when viewing combined earth limb and terrestrial scenes, a combination of stationary and non-stationary clutter may be present, including cloud formations, varying atmospheric transmittance and reflectance of sunlight and other celestial light sources, aurora, glint off sea surfaces, and varied natural and man-made terrain features. The targets of interest may also appear to be dim, relative to the scene background, rendering much of the existing deployed software useless for optical target detection and tracking. Additionally, it may be necessary to detect and track a large number of objects in the threat cloud

  7. The NASA Ames PAH IR Spectroscopic Database: Computational Version 3.00 with Updated Content and the Introduction of Multiple Scaling Factors

    Science.gov (United States)

    Bauschlicher, Charles W., Jr.; Ricca, A.; Boersma, C.; Allamandola, L. J.

    2018-02-01

    Version 3.00 of the library of computed spectra in the NASA Ames PAH IR Spectroscopic Database (PAHdb) is described. Version 3.00 introduces the use of multiple scale factors, instead of the single scaling factor used previously, to align the theoretical harmonic frequencies with the experimental fundamentals. The use of multiple scale factors permits the use of a variety of basis sets; this allows new PAH species to be included in the database, such as those containing oxygen, and yields an improved treatment of strained species and those containing nitrogen. In addition, the computed spectra of 2439 new PAH species have been added. The impact of these changes on the analysis of an astronomical spectrum through database-fitting is considered and compared with a fit using Version 2.00 of the library of computed spectra. Finally, astronomical constraints are defined for the PAH spectral libraries in PAHdb.

  8. TRANGE: computer code to calculate the energy beam degradation in target stack; TRANGE: programa para calcular a degradacao de energia de particulas carregadas em alvos

    Energy Technology Data Exchange (ETDEWEB)

    Bellido, Luis F.

    1995-07-01

    A computer code to calculate the projectile energy degradation along a target stack was developed for an IBM or compatible personal microcomputer. A comparison of protons and deuterons bombarding uranium and aluminium targets was made. The results showed that the data obtained with TRANGE were in agreement with other computer codes such as TRIM and EDP, and with the Williamson and Janni range and stopping-power tables. TRANGE can be used for any charged-particle ion, for energies between 1 and 100 MeV, in metal foils and solid compound targets. (author). 8 refs., 2 tabs.
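
    The kind of calculation TRANGE performs can be sketched as a simple numerical integration of the stopping power through each foil of the stack. In the fragment below the stopping-power function, material labels and step count are placeholders; a real calculation would use tabulated stopping powers such as the Williamson or Janni tables mentioned above.

```python
# Step the beam energy through a stack of foils by integrating dE/dx.
def degrade_energy(e_in_mev, stack, stopping_power, n_steps=1000):
    """stack: list of (material, thickness_cm); stopping_power(material, E)
    must return dE/dx in MeV/cm for that material at energy E (MeV)."""
    e = e_in_mev
    for material, thickness in stack:
        dx = thickness / n_steps
        for _ in range(n_steps):
            e -= stopping_power(material, e) * dx
            if e <= 0.0:
                return 0.0          # particle stopped inside the stack
    return e

# Hypothetical usage: a proton beam through an aluminium degrader and a
# uranium foil, with my_stopping_power supplied from tabulated data.
# e_out = degrade_energy(30.0, [("Al", 0.05), ("U", 0.01)], my_stopping_power)
```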

  9. LIBVERSIONINGCOMPILER: An easy-to-use library for dynamic generation and invocation of multiple code versions

    Science.gov (United States)

    Cherubin, S.; Agosta, G.

    2018-01-01

    We present LIBVERSIONINGCOMPILER, a C++ library designed to support the dynamic generation of multiple versions of the same compute kernel in a HPC scenario. It can be used to provide continuous optimization, code specialization based on the input data or on workload changes, or otherwise to dynamically adjust the application, without the burden of a full dynamic compiler. The library supports multiple underlying compilers but specifically targets the LLVM framework. We also provide examples of use, showing the overhead of the library, and providing guidelines for its efficient use.

  10. Reverse screening methods to search for the protein targets of chemopreventive compounds

    Science.gov (United States)

    Huang, Hongbin; Zhang, Guigui; Zhou, Yuquan; Lin, Chenru; Chen, Suling; Lin, Yutong; Mai, Shangkang; Huang, Zunnan

    2018-05-01

    This article is a systematic review of reverse screening methods used to search for the protein targets of chemopreventive compounds or drugs. Typical chemopreventive compounds include components of traditional Chinese medicine, natural compounds and Food and Drug Administration (FDA)-approved drugs. Such compounds are somewhat selective but are predisposed to bind multiple protein targets distributed throughout diverse signaling pathways in human cells. In contrast to conventional virtual screening, which identifies the ligands of a targeted protein from a compound database, reverse screening is used to identify the potential targets or unintended targets of a given compound from a large number of receptors by examining their known ligands or crystal structures. This method, also known as in silico or computational target fishing, is highly valuable for discovering the target receptors of query molecules from terrestrial or marine natural products, exploring the molecular mechanisms of chemopreventive compounds, finding alternative indications of existing drugs by drug repositioning, and detecting adverse drug reactions and drug toxicity. Reverse screening can be divided into three major groups: shape screening, pharmacophore screening and reverse docking. Several large software packages, such as Schrödinger and Discovery Studio; typical software/network services such as ChemMapper, PharmMapper, idTarget and INVDOCK; and practical databases of known target ligands and receptor crystal structures, such as ChEMBL, BindingDB and the Protein Data Bank (PDB), are available for use in these computational methods. Different programs, online services and databases have different applications and constraints. Here, we conducted a systematic analysis and multilevel classification of the computational programs, online services and compound libraries available for shape screening, pharmacophore screening and reverse docking to enable non-specialist users to quickly learn and

  11. A Novel Sensor Selection and Power Allocation Algorithm for Multiple-Target Tracking in an LPI Radar Network

    Directory of Open Access Journals (Sweden)

    Ji She

    2016-12-01

    Radar networks are proven to have numerous advantages over traditional monostatic and bistatic radar. With recent developments, radar networks have become an attractive platform due to their low probability of intercept (LPI) performance for target tracking. In this paper, a joint sensor selection and power allocation algorithm for multiple-target tracking in a radar network based on LPI is proposed. It is found that this algorithm can minimize the total transmitted power of a radar network on the basis of a predetermined mutual information (MI) threshold between the target impulse response and the reflected signal. The MI is required by the radar network system to estimate target parameters, and it can be calculated predictively from the estimate of the target state. The optimization problem of sensor selection and power allocation, which contains two variables, is non-convex, and it can be solved by separating the power allocation problem from the sensor selection problem. To be specific, the power allocation problem can be solved by using the bisection method for each sensor selection scheme, while the sensor selection problem can be solved by a lower-complexity algorithm based on the allocated powers. According to the simulation results, the proposed algorithm can effectively reduce the total transmitted power of a radar network, which is conducive to improving LPI performance.
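
    The per-scheme power search lends itself to a compact sketch: assuming the MI grows monotonically with transmit power, the minimum power meeting the threshold can be bracketed by bisection. The MI model used below is a placeholder, not the expression derived in the paper.

```python
# Bisection search for the smallest transmit power meeting an MI threshold,
# assuming mi_of_power(p) is monotonically increasing in p.
import math

def min_power_bisection(mi_threshold, mi_of_power, p_max, tol=1e-6):
    """Return the smallest power in [0, p_max] with mi_of_power(p) >= mi_threshold,
    or None if even p_max cannot meet the threshold."""
    if mi_of_power(p_max) < mi_threshold:
        return None
    lo, hi = 0.0, p_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mi_of_power(mid) >= mi_threshold:
            hi = mid
        else:
            lo = mid
    return hi

# Placeholder MI model with an assumed target/channel gain.
gain = 2.5
print(min_power_bisection(1.0, lambda p: math.log2(1.0 + gain * p), p_max=10.0))
```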

  12. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    Science.gov (United States)

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

    Fragment-based drug design (FBDD) has been an effective methodology for drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel, potentially active compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and limitations and future prospects are discussed and emphasized.

  13. Fine tuning of RFX/DAF-19-regulated target gene expression through binding to multiple sites in Caenorhabditis elegans

    OpenAIRE

    Chu, Jeffery S. C.; Tarailo-Graovac, Maja; Zhang, Di; Wang, Jun; Uyar, Bora; Tu, Domena; Trinh, Joanne; Baillie, David L.; Chen, Nansheng

    2011-01-01

    In humans, mutations of a growing list of regulatory factor X (RFX) target genes have been associated with devastating genetic disease conditions, including ciliopathies. However, the mechanisms underlying RFX transcription factor (TF)-mediated gene expression regulation, especially differential gene expression regulation, are largely unknown. In this study, we explore the functional significance of the co-existence of multiple X-box motifs in regulating differential gene expression in Caenorha...

  14. TargetCompare: A web interface to compare simultaneous miRNAs targets.

    Science.gov (United States)

    Moreira, Fabiano Cordeiro; Dustan, Bruno; Hamoy, Igor G; Ribeiro-Dos-Santos, André M; Dos Santos, Andrea Ribeiro

    2014-01-01

    MicroRNAs (miRNAs) are small non-coding nucleotide sequences between 17 and 25 nucleotides in length that primarily function in the regulation of gene expression. Since a single miRNA has thousands of predicted targets in a complex regulatory cell-signaling network, it is of interest to study multiple target genes simultaneously. Hence, we describe a web tool (developed using the Java programming language and the MySQL database server) to analyse multiple targets of pre-selected miRNAs. We cross-validated the tool on the eight most highly expressed miRNAs in the antrum region of the stomach. This helped to identify 43 potential genes that are targets of at least six of these miRNAs. The developed tool aims to reduce randomness and increase the chance of selecting strong candidate target genes and miRNAs responsible for playing important roles in the studied tissue. http://lghm.ufpa.br/targetcompare
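
    The comparison that TargetCompare automates reduces to counting, for each gene, how many of the selected miRNAs predict it as a target. The small sketch below illustrates this; the miRNA names, gene symbols and threshold are hypothetical examples.

```python
# Keep the genes predicted as targets by at least `min_mirnas` of the
# selected miRNAs.
from collections import Counter

def shared_targets(targets_by_mirna, min_mirnas):
    """targets_by_mirna: dict miRNA -> set of predicted target genes."""
    counts = Counter(gene for genes in targets_by_mirna.values() for gene in genes)
    return {gene for gene, n in counts.items() if n >= min_mirnas}

# Hypothetical example with three miRNAs and a threshold of two.
example = {
    "miR-A": {"TP53", "KRAS", "CDH1"},
    "miR-B": {"KRAS", "CDH1", "MYC"},
    "miR-C": {"CDH1", "MYC"},
}
print(shared_targets(example, min_mirnas=2))   # {'KRAS', 'CDH1', 'MYC'}
```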

  15. Identifying Drug-Target Interactions with Decision Templates.

    Science.gov (United States)

    Yan, Xiao-Ying; Zhang, Shao-Wu

    2018-01-01

    During the development of new drugs, identification of drug-target interactions is a primary concern. However, chemical or biological experiments are limited in coverage and carry a huge cost in both time and money. Based on drug similarity and target similarity, chemogenomic methods can predict potential drug-target interactions (DTIs) on a large scale and do not require target structures or ligand entries. Such methods need to reflect the cases in which drugs with different structures interact with common targets and targets with dissimilar sequences interact with the same drugs; in addition, although several other similarity metrics have been developed to predict DTIs, the naïve combination of multiple similarity metrics (especially heterogeneous similarities) does not sufficiently exploit them. In this paper, based on Gene Ontology and pathway annotation, we introduce two novel target similarity metrics to address the above issues. More importantly, we propose a more effective strategy, based on decision templates, to integrate multiple classifiers designed with multiple similarity metrics. In the scenarios of predicting existing targets for new drugs and predicting approved drugs for new protein targets, the results on the DTI benchmark datasets show that our target similarity metrics enhance the predictive accuracy in both scenarios, and the elaborate fusion strategy of multiple classifiers has better predictive power than the naïve combination of multiple similarity metrics. Compared with two other state-of-the-art approaches on the four popular benchmark datasets of binary drug-target interactions, our method achieves the best results in terms of AUC and AUPR for predicting available targets for new drugs (S2) and predicting approved drugs for new protein targets (S3). These results demonstrate that our method can effectively predict drug-target interactions. The software package can
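
    Decision-template fusion itself is straightforward to sketch: per-class templates are the mean decision profiles of the base classifiers over training samples, and a test sample is assigned to the class whose template lies closest to its own decision profile. The fragment below leaves the base classifiers and similarity metrics abstract and uses Euclidean distance, which is one common choice rather than necessarily the one used in the paper.

```python
# Decision-template fusion over the soft outputs of several base classifiers.
import numpy as np

def build_templates(decision_profiles, labels, n_classes):
    """decision_profiles: (n_samples, n_classifiers, n_classes) soft outputs;
    labels: integer class label per training sample."""
    return np.stack([decision_profiles[labels == c].mean(axis=0)
                     for c in range(n_classes)])

def classify(templates, profile):
    """profile: (n_classifiers, n_classes) decision profile of one test sample.
    Returns the class whose template is nearest (Euclidean distance)."""
    distances = [np.linalg.norm(profile - t) for t in templates]
    return int(np.argmin(distances))
```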

  16. Numerical Computation of Underground Inundation in Multiple Layers Using the Adaptive Transfer Method

    Directory of Open Access Journals (Sweden)

    Hyung-Jun Kim

    2018-01-01

    Extreme rainfall causes surface runoff to flow towards lowlands and subterranean facilities, such as subway stations and buildings with underground spaces in densely packed urban areas. These facilities and areas are therefore vulnerable to catastrophic submergence. However, flood modeling of underground space has not yet been adequately studied because there are difficulties in reproducing the associated multiple horizontal layers connected with staircases or elevators. This study proposes a convenient approach to simulate underground inundation when two layers are connected. The main facet of this approach is to compute the flow flux passing through staircases in an upper layer and to transfer the equivalent quantity to a lower layer. This is defined as the ‘adaptive transfer method’. This method overcomes the limitations of 2D modeling by introducing a layer-connecting concept that prevents large variations in mesh size caused by complicated underlying obstacles or local details. Consequently, this study aims to contribute to the numerical analysis of flow in inundated underground spaces with multiple floors.
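
    Conceptually, the adaptive transfer method amounts to evaluating a discharge through each staircase opening in the upper layer and feeding that volume to the connected cell of the lower layer at each time step. The sketch below uses an assumed weir-type discharge formula, coefficient and cell area purely for illustration; they are not the relations used in the study.

```python
# Transfer the volume passing through each staircase opening from the upper
# layer to the connected lower-layer cell in one time step.
CELL_AREA = 4.0   # m^2, assumed uniform horizontal cell area for this sketch

def staircase_flux(water_depth_m, opening_width_m, coeff=1.7):
    """Weir-type discharge (m^3/s) through a staircase opening (assumed formula)."""
    return coeff * opening_width_m * max(water_depth_m, 0.0) ** 1.5

def transfer_between_layers(upper_depths, lower_sources, stairs, dt):
    """stairs: list of (upper_cell, lower_cell, opening_width_m) connections;
    upper_depths / lower_sources: dicts keyed by cell index."""
    for upper_cell, lower_cell, width in stairs:
        volume = staircase_flux(upper_depths[upper_cell], width) * dt
        lower_sources[lower_cell] = lower_sources.get(lower_cell, 0.0) + volume
        upper_depths[upper_cell] = max(
            upper_depths[upper_cell] - volume / CELL_AREA, 0.0)
```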

  17. Superresolution radar imaging based on fast inverse-free sparse Bayesian learning for multiple measurement vectors

    Science.gov (United States)

    He, Xingyu; Tong, Ningning; Hu, Xiaowei

    2018-01-01

    Compressive sensing has been successfully applied to inverse synthetic aperture radar (ISAR) imaging of moving targets. By exploiting the block-sparse structure of the target image, a sparse solution for multiple measurement vectors (MMV) can be applied in ISAR imaging and a substantial performance improvement can be achieved. As an effective sparse recovery method, sparse Bayesian learning (SBL) for MMV involves a matrix inverse at each iteration, and its associated computational complexity grows significantly with the problem size. To address this problem, we develop a fast inverse-free (IF) SBL method for MMV. A relaxed evidence lower bound (ELBO), which is computationally more amenable than the traditional ELBO used by SBL, is obtained by invoking a fundamental property of smooth functions. A variational expectation-maximization scheme is then employed to maximize the relaxed ELBO, and a computationally efficient IF-MSBL algorithm is proposed. Numerical results based on simulated and real data show that the proposed method can reconstruct row-sparse signals accurately and obtain clear superresolution ISAR images. Moreover, the running time and computational complexity are greatly reduced compared with traditional SBL methods.

  18. Computational-experimental approach to drug-target interaction mapping: A case study on kinase inhibitors.

    Directory of Open Access Journals (Sweden)

    Anna Cichonska

    2017-08-01

    Due to the relatively high costs and labor required for experimental profiling of the full target space of chemical compounds, various machine learning models have been proposed as cost-effective means to advance this process in terms of predicting the most potent compound-target interactions for subsequent verification. However, most of the model predictions lack direct experimental validation in the laboratory, making their practical benefits for drug discovery or repurposing applications largely unknown. Here, we therefore introduce and carefully test a systematic computational-experimental framework for the prediction and pre-clinical verification of drug-target interactions using a well-established kernel-based regression algorithm as the prediction model. To evaluate its performance, we first predicted unmeasured binding affinities in a large-scale kinase inhibitor profiling study, and then experimentally tested 100 compound-kinase pairs. The relatively high correlation of 0.77 (p < 0.0001) between the predicted and measured bioactivities supports the potential of the model for filling the experimental gaps in existing compound-target interaction maps. Further, we subjected the model to the more challenging task of predicting target interactions for a new candidate drug compound that lacks prior binding profile information. As a specific case study, we used tivozanib, an investigational VEGF receptor inhibitor with a currently unknown off-target profile. Among 7 kinases with high predicted affinity, we experimentally validated 4 new off-targets of tivozanib, namely the Src-family kinases FRK and FYN A, the non-receptor tyrosine kinase ABL1, and the serine/threonine kinase SLK. Our subsequent experimental validation protocol effectively avoids any possible information leakage between the training and validation data, and therefore enables rigorous model validation for practical applications. These results demonstrate that the kernel
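
    A generic stand-in for the kernel-based regression model is sketched below: a pairwise (compound, kinase) kernel formed as the product of a compound kernel and a kinase kernel, fitted by kernel ridge regression. The function names and the plain product-kernel ridge formulation are assumptions for illustration, not the exact algorithm or kernels used in the study.

```python
# Pairwise kernel ridge regression for (drug, target) binding affinities.
import numpy as np

def fit_pairwise_krr(Kd, Kt, pairs, y, lam=1.0):
    """Kd: drug-drug kernel matrix, Kt: target-target kernel matrix,
    pairs: list of (drug_idx, target_idx) with measured affinities y."""
    n = len(pairs)
    K = np.empty((n, n))
    for a, (i, s) in enumerate(pairs):
        for b, (j, t) in enumerate(pairs):
            K[a, b] = Kd[i, j] * Kt[s, t]        # product pairwise kernel
    return np.linalg.solve(K + lam * np.eye(n), y)   # dual coefficients

def predict_pair(Kd, Kt, pairs, alpha, drug_idx, target_idx):
    k = np.array([Kd[drug_idx, i] * Kt[target_idx, s] for i, s in pairs])
    return float(k @ alpha)
```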

  19. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    Energy Technology Data Exchange (ETDEWEB)

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.

  20. Computational Magnetohydrodynamics of General Materials in Generalized Coordinates and Applications to Laser-Target Interactions

    Science.gov (United States)

    MacGillivray, Jeff T.; Peterkin, Robert E., Jr.

    2003-10-01

    We have developed a multiblock arbitrary coordinate Hydromagnetics (MACH) code for computing the time-evolution of materials of arbitrary phase (solid, liquid, gas, and plasma) in response to forces that arise from material and magnetic pressures. MACH is a single-fluid, time-dependent, arbitrary Lagrangian-Eulerian (ALE) magnetohydrodynamic (MHD) simulation environment. The 2 1/2 -dimensional MACH2 and the parallel 3-D MACH3 are widely used in the MHD community to perform accurate simulation of the time evolution of electrically conducting materials in a wide variety of laboratory situations. In this presentation, we discuss simulations of the interaction of an intense laser beam with a solid target in an ambient gas. Of particular interest to us is a laser-supported detonation wave (blast wave) that originates near the surface of the target when the laser intensity is sufficiently large to vaporize target material within the focal spot of the beam. Because the MACH3 simulations are fully three-dimensional, we are able to simulate non-normal laser incidence. A magnetic field is also produced from plasma energy near the edge of the focal spot.

  1. Tracking a convoy of multiple targets using acoustic sensor data

    Science.gov (United States)

    Damarla, T. R.

    2003-08-01

    In this paper we present an algorithm to track a convoy of several targets in a scene using acoustic sensor array data. The tracking algorithm is based on a template of the direction of arrival (DOA) angles for the leading target. Often the first target is the closest target to the sensor array and hence the loudest, with a good signal-to-noise ratio. Several steps were used to generate a template of the DOA angle for the leading target, namely, (a) the angle at the present instant should be close to the angle at the previous instant and (b) the angle at the present instant should be within error bounds of the predicted value based on the previous values. Once the template of the DOA angles of the leading target is developed, it is used to predict the DOA angle tracks of the remaining targets. In order to generate the tracks for the remaining targets, a track is first established if the angles correspond to the initial track values of the first target. Second, the time delay between the first track and each of the remaining tracks is estimated at the point of highest correlation between them. As the vehicles move at different speeds, the tracks either compress or expand depending on whether a target is moving faster or slower than the first target. The expansion and compression ratios are estimated and used to estimate the predicted DOA angle values of the remaining targets. Based on these predicted DOA angles, the DOA angles obtained from MVDR or incoherent MUSIC are appropriately assigned to the proper tracks. Several other rules were developed to avoid mixing the tracks. The algorithm is tested on data collected at Aberdeen Proving Ground with a convoy of 3, 4 and 5 vehicles, some of them tracked and some wheeled. The tracking algorithm results are found to be good. The results will be presented at the conference and in the paper.
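
    One ingredient of the procedure, estimating the time offset between the leading target's DOA template and another target's DOA track from their correlation peak, can be sketched compactly. The fragment below omits the expansion/compression ratios and the track-assignment rules, and the de-meaning and lag convention are illustrative choices rather than the authors' exact implementation.

```python
# Estimate the lag (in samples) that best aligns another target's DOA track
# with the leading target's DOA-angle template, via the cross-correlation peak.
import numpy as np

def track_delay(template_doa, other_doa):
    a = np.asarray(template_doa) - np.mean(template_doa)
    b = np.asarray(other_doa) - np.mean(other_doa)
    corr = np.correlate(a, b, mode="full")
    return int(np.argmax(corr)) - (len(b) - 1)   # lag at maximum correlation
```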

  2. Designing high power targets with computational fluid dynamics (CFD)

    International Nuclear Information System (INIS)

    Covrig, S. D.

    2013-01-01

    High power liquid hydrogen (LH2) targets, up to 850 W, have been widely used at Jefferson Lab for the 6 GeV physics program. The typical luminosity loss of a 20 cm long LH2 target was 20% for a beam current of 100 μA rastered on a square of side 2 mm on the target. The 35 cm long, 2500 W LH2 target for the Qweak experiment had a luminosity loss of 0.8% at 180 μA beam rastered on a square of side 4 mm at the target. The Qweak target was the highest power liquid hydrogen target in the world and with the lowest noise figure. The Qweak target was the first one designed with CFD at Jefferson Lab. A CFD facility is being established at Jefferson Lab to design, build and test a new generation of low noise high power targets.

  3. Designing high power targets with computational fluid dynamics (CFD)

    Energy Technology Data Exchange (ETDEWEB)

    Covrig, S. D. [Thomas Jefferson National Laboratory, Newport News, VA 23606 (United States)

    2013-11-07

    High power liquid hydrogen (LH2) targets, up to 850 W, have been widely used at Jefferson Lab for the 6 GeV physics program. The typical luminosity loss of a 20 cm long LH2 target was 20% for a beam current of 100 μA rastered on a square of side 2 mm on the target. The 35 cm long, 2500 W LH2 target for the Qweak experiment had a luminosity loss of 0.8% at 180 μA beam rastered on a square of side 4 mm at the target. The Qweak target was the highest power liquid hydrogen target in the world and with the lowest noise figure. The Qweak target was the first one designed with CFD at Jefferson Lab. A CFD facility is being established at Jefferson Lab to design, build and test a new generation of low noise high power targets.

  4. Iodide and xenon enhancement of computed tomography (CT) in multiple sclerosis (MS)

    International Nuclear Information System (INIS)

    Radue, E.W.; Kendall, B.E.

    1978-01-01

    The characteristic findings on computed tomography (CT) in multiple sclerosis (MS) are discussed. In a series of 49 cases plain CT was normal in 21 (43%), cerebral atrophy alone was present in 17 (35%) and plaques were visible in 11 (23%). These were most often adjacent to the lateral ventricles (14 plaques) and in the parietal white matter (10 plaques). CT was performed after the intravenous administration of iodide in 16 of these cases. Two patients with low attenuation plaques were scanned with xenon enhancement; the plaques absorbed less xenon than the corresponding contralateral brain substance and additional, previously isodense plaques were revealed. In one case the white matter absorbed much less xenon than normal and its uptake relative to grey matter was reduced. (orig.) [de

  5. FrFT-CSWSF: Estimating cross-range velocities of ground moving targets using multistatic synthetic aperture radar

    Directory of Open Access Journals (Sweden)

    Li Chenlei

    2014-10-01

    Estimating cross-range velocity is a challenging task for space-borne synthetic aperture radar (SAR), which is important for ground moving target indication (GMTI). Because the velocity of a target is very small compared with that of the satellite, it is difficult to correctly estimate it using a conventional monostatic platform algorithm. To overcome this problem, a novel method employing multistatic SAR is presented in this letter. The proposed hybrid method, which is based on an extended space-time model (ESTIM) of the azimuth signal, has two steps: first, a set of finite impulse response (FIR) filter banks based on a fractional Fourier transform (FrFT) is used to separate multiple targets within a range gate; second, a cross-correlation spectrum weighted subspace fitting (CSWSF) algorithm is applied to each of the separated signals in order to estimate their respective parameters. As verified through computer simulation with the constellations of Cartwheel, Pendulum and Helix, this proposed time-frequency-subspace method effectively improves the estimation precision of the cross-range velocities of multiple targets.

  6. Using the Dual-Target Cost to Explore the Nature of Search Target Representations

    Science.gov (United States)

    Stroud, Michael J.; Menneer, Tamaryn; Cave, Kyle R.; Donnelly, Nick

    2012-01-01

    Eye movements were monitored to examine search efficiency and infer how color is mentally represented to guide search for multiple targets. Observers located a single color target very efficiently by fixating colors similar to the target. However, simultaneous search for 2 colors produced a dual-target cost. In addition, as the similarity between…

  7. The Vasa Homolog RDE-12 engages target mRNA and multiple argonaute proteins to promote RNAi in C. elegans.

    Science.gov (United States)

    Shirayama, Masaki; Stanney, William; Gu, Weifeng; Seth, Meetu; Mello, Craig C

    2014-04-14

    Argonaute (AGO) proteins are key nuclease effectors of RNAi. Although purified AGOs can mediate a single round of target RNA cleavage in vitro, accessory factors are required for small interfering RNA (siRNA) loading and to achieve multiple-target turnover. To identify AGO cofactors, we immunoprecipitated the C. elegans AGO WAGO-1, which engages amplified small RNAs during RNAi. These studies identified a robust association between WAGO-1 and a conserved Vasa ATPase-related protein RDE-12. rde-12 mutants are deficient in RNAi, including viral suppression, and fail to produce amplified secondary siRNAs and certain endogenous siRNAs (endo-siRNAs). RDE-12 colocalizes with WAGO-1 in germline P granules and in cytoplasmic and perinuclear foci in somatic cells. These findings and our genetic studies suggest that RDE-12 is first recruited to target mRNA by upstream AGOs (RDE-1 and ERGO-1), where it promotes small RNA amplification and/or WAGO-1 loading. Downstream of these events, RDE-12 forms an RNase-resistant (target mRNA-independent) complex with WAGO-1 and may thus have additional functions in target mRNA surveillance and silencing. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Integrating publicly-available data to generate computationally ...

    Science.gov (United States)

    The adverse outcome pathway (AOP) framework provides a way of organizing knowledge related to the key biological events that result in a particular health outcome. For the majority of environmental chemicals, the availability of curated pathways characterizing potential toxicity is limited. Methods are needed to assimilate large amounts of available molecular data and quickly generate putative AOPs for further testing and use in hazard assessment. A graph-based workflow was used to facilitate the integration of multiple data types to generate computationally-predicted (cp) AOPs. Edges between graph entities were identified through direct experimental or literature information or computationally inferred using frequent itemset mining. Data from the TG-GATEs and ToxCast programs were used to channel large-scale toxicogenomics information into a cpAOP network (cpAOPnet) of over 20,000 relationships describing connections between chemical treatments, phenotypes, and perturbed pathways measured by differential gene expression and high-throughput screening targets. Sub-networks of cpAOPs for a reference chemical (carbon tetrachloride, CCl4) and outcome (hepatic steatosis) were extracted using the network topology. Comparison of the cpAOP subnetworks to published mechanistic descriptions for both CCl4 toxicity and hepatic steatosis demonstrate that computational approaches can be used to replicate manually curated AOPs and identify pathway targets that lack genomic mar

  9. Bioresponsive and fluorescent hyaluronic acid-iodixanol nanogels for targeted X-ray computed tomography imaging and chemotherapy of breast tumors

    NARCIS (Netherlands)

    Zhu, Yaqin; Wang, Xinhui; Wang, X.; Chen, J.; Meng, Fenghua; Deng, D.; Cheng, R.; Feijen, Jan; Zhong, Zhiyuan

    2016-01-01

    Nanotheranostics is a rapidly growing field combining disease diagnosis and therapy, which ultimately may aid in the development of ‘personalized medicine’. Here, we designed and developed bioresponsive and fluorescent hyaluronic acid-iodixanol nanogels (HAI-NGs) for targeted X-ray computed

  10. Performance evaluation for volumetric segmentation of multiple sclerosis lesions using MATLAB and computing engine in the graphical processing unit (GPU)

    Science.gov (United States)

    Le, Anh H.; Park, Young W.; Ma, Kevin; Jacobs, Colin; Liu, Brent J.

    2010-03-01

    Multiple Sclerosis (MS) is a progressive neurological disease affecting myelin pathways in the brain. Multiple lesions in the white matter can cause paralysis and severe motor disabilities of the affected patient. To solve the issue of inconsistency and user-dependency in manual lesion measurement on MRI, we have proposed a 3-D automated lesion quantification algorithm to enable objective and efficient lesion volume tracking. The computer-aided detection (CAD) of MS, written in MATLAB, utilizes the K-Nearest Neighbors (KNN) method to compute the probability of lesions on a per-voxel basis. Despite the highly optimized image-processing algorithms used in CAD development, MS CAD integration and evaluation in clinical workflow is technically challenging due to the high computation rates and memory bandwidth required by the recursive nature of the algorithm. In this paper, we present the development and evaluation of a computing engine in the graphical processing unit (GPU) with MATLAB for segmentation of MS lesions. The paper investigates the utilization of a high-end GPU for parallel computing of KNN in the MATLAB environment to improve algorithm performance. The integration is accomplished using NVIDIA's CUDA developmental toolkit for MATLAB. The results of this study will validate the practicality and effectiveness of the prototype MS CAD in a clinical setting. The GPU method may allow MS CAD to integrate rapidly into an electronic patient record or any disease-centric health care system.
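
    The record above describes a MATLAB/CUDA implementation; as a rough illustration of the per-voxel KNN step it mentions, the following Python sketch (function name, feature layout, and parameters are hypothetical, not the authors' code) computes a lesion probability for each voxel from labeled training features.

```python
# Illustrative sketch (not the authors' MATLAB/CUDA code): per-voxel KNN
# lesion probability from intensity features, assuming a labeled training set.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def knn_lesion_probability(train_features, train_labels, test_features, k=15):
    """Return P(lesion) for each test voxel.

    train_features : (n_voxels, n_features) array, e.g. T1/T2/FLAIR intensities
    train_labels   : (n_voxels,) array of 0 (normal) / 1 (lesion)
    test_features  : (m_voxels, n_features) array for the scan being segmented
    """
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(train_features, train_labels)
    # predict_proba returns class probabilities; column 1 is the lesion class
    return knn.predict_proba(test_features)[:, 1]

# Toy usage with random data standing in for co-registered MR intensities
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 3))
y_train = (X_train.sum(axis=1) > 1.5).astype(int)
X_test = rng.normal(size=(200, 3))
prob = knn_lesion_probability(X_train, y_train, X_test)
lesion_mask = prob > 0.5
```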

  11. Aufwandsanalyse für computerunterstützte Multiple-Choice Papierklausuren [Cost analysis for computer supported multiple-choice paper examinations]

    Directory of Open Access Journals (Sweden)

    Mandel, Alexander

    2011-11-01

    Full Text Available [english] Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were carried out with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis (including adjustments of the analysis) of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, the computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with manual correction, and compared to purely electronically conducted exams it requires a much simpler technological infrastructure and fewer staff during the exam. [german] Einleitung: Multiple

  12. Single-Isocenter Multiple-Target Stereotactic Radiosurgery: Risk of Compromised Coverage

    International Nuclear Information System (INIS)

    Roper, Justin; Chanyavanich, Vorakarn; Betzel, Gregory; Switchenko, Jeffrey; Dhabaan, Anees

    2015-01-01

    Purpose: To determine the dosimetric effects of rotational errors on target coverage using volumetric modulated arc therapy (VMAT) for multitarget stereotactic radiosurgery (SRS). Methods and Materials: This retrospective study included 50 SRS cases, each with 2 intracranial planning target volumes (PTVs). Both PTVs were planned for simultaneous treatment to 21 Gy using a single-isocenter, noncoplanar VMAT SRS technique. Rotational errors of 0.5°, 1.0°, and 2.0° were simulated about all axes. The dose to 95% of the PTV (D95) and the volume covered by 95% of the prescribed dose (V95) were evaluated using multivariate analysis to determine how PTV coverage was related to PTV volume, PTV separation, and rotational error. Results: At 0.5° rotational error, D95 values and V95 coverage rates were ≥95% in all cases. For rotational errors of 1.0°, 7% of targets had D95 and V95 values below 95%; at 2.0°, D95 and V95 values remained above 95% for only 63% of the targets. Multivariate analysis showed that PTV volume and distance to isocenter were strong predictors of target coverage. Conclusions: The effects of rotational errors on target coverage were studied across a broad range of SRS cases. In general, the risk of compromised coverage increased with decreasing target volume, increasing rotational error and increasing distance between targets. Multivariate regression models from this study may be used to quantify the dosimetric effects of rotational errors on target coverage given patient-specific input parameters of PTV volume and distance to isocenter.
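
    As a rough geometric illustration of why target volume and distance to isocenter drive coverage loss, the sketch below (not the study's planning-system dosimetry; all dimensions are hypothetical) rotates a spherical target about the isocenter and reports the fraction of its volume that stays inside the planned high-dose sphere.

```python
# Illustrative geometry sketch (not the planning-system dosimetry): rotate a
# target's voxel positions about the isocenter and estimate the fraction that
# stays inside a spherical high-dose region centred on the planned target.
import numpy as np

def rotation_matrix_z(angle_deg):
    a = np.radians(angle_deg)
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def coverage_after_rotation(target_center, target_radius, angle_deg,
                            n_samples=20000, rng=None):
    """Fraction of target volume still covered when the plan is rotated about
    the isocenter (origin) by angle_deg around the z axis."""
    rng = rng or np.random.default_rng(0)
    # Sample points uniformly inside the spherical PTV
    pts = rng.normal(size=(n_samples, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    radii = target_radius * rng.random(n_samples) ** (1.0 / 3.0)
    pts = target_center + pts * radii[:, None]
    # A rotational setup error moves the delivered dose cloud; equivalently,
    # rotate the target the opposite way and test against the planned sphere.
    rotated = pts @ rotation_matrix_z(-angle_deg).T
    inside = np.linalg.norm(rotated - target_center, axis=1) <= target_radius
    return inside.mean()

# A small (5 mm radius) target 60 mm from the isocenter loses more coverage
# per degree than the same target located at the isocenter.
print(coverage_after_rotation(np.array([60.0, 0.0, 0.0]), 5.0, 1.0))
print(coverage_after_rotation(np.array([0.0, 0.0, 0.0]), 5.0, 1.0))
```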

  13. Coronary artery analysis: Computer-assisted selection of best-quality segments in multiple-phase coronary CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Chuan, E-mail: chuan@umich.edu; Chan, Heang-Ping; Hadjiyski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A. [Department of Radiology, The University of Michigan, Ann Arbor, Michigan 48109-0904 (United States)

    2016-10-15

    Purpose: The authors are developing an automated method to identify the best-quality coronary arterial segment from multiple-phase coronary CT angiography (cCTA) acquisitions, which may be used by either interpreting physicians or computer-aided detection systems to optimally and efficiently utilize the diagnostic information available in multiple-phase cCTA for the detection of coronary artery disease. Methods: After initialization with a manually identified seed point, each coronary artery tree is automatically extracted from multiple cCTA phases using our multiscale coronary artery response enhancement and 3D rolling balloon region growing vessel segmentation and tracking method. The coronary artery trees from multiple phases are then aligned by a global registration using an affine transformation with quadratic terms and nonlinear simplex optimization, followed by a local registration using a cubic B-spline method with fast localized optimization. The corresponding coronary arteries among the available phases are identified using a recursive coronary segment matching method. Each of the identified vessel segments is transformed by the curved planar reformation (CPR) method. Four features are extracted from each corresponding segment as quality indicators in the original computed tomography volume and the straightened CPR volume, and each quality indicator is used as a voting classifier for the arterial segment. A weighted voting ensemble (WVE) classifier is designed to combine the votes of the four voting classifiers for each corresponding segment. The segment with the highest WVE vote is then selected as the best-quality segment. In this study, the training and test sets consisted of 6 and 20 cCTA cases, respectively, each with 6 phases, containing a total of 156 cCTA volumes and 312 coronary artery trees. An observer preference study was also conducted with one expert cardiothoracic radiologist and four nonradiologist readers to visually rank vessel segment

  14. Coronary artery analysis: Computer-assisted selection of best-quality segments in multiple-phase coronary CT angiography

    International Nuclear Information System (INIS)

    Zhou, Chuan; Chan, Heang-Ping; Hadjiyski, Lubomir M.; Chughtai, Aamer; Wei, Jun; Kazerooni, Ella A.

    2016-01-01

    Purpose: The authors are developing an automated method to identify the best-quality coronary arterial segment from multiple-phase coronary CT angiography (cCTA) acquisitions, which may be used by either interpreting physicians or computer-aided detection systems to optimally and efficiently utilize the diagnostic information available in multiple-phase cCTA for the detection of coronary artery disease. Methods: After initialization with a manually identified seed point, each coronary artery tree is automatically extracted from multiple cCTA phases using our multiscale coronary artery response enhancement and 3D rolling balloon region growing vessel segmentation and tracking method. The coronary artery trees from multiple phases are then aligned by a global registration using an affine transformation with quadratic terms and nonlinear simplex optimization, followed by a local registration using a cubic B-spline method with fast localized optimization. The corresponding coronary arteries among the available phases are identified using a recursive coronary segment matching method. Each of the identified vessel segments is transformed by the curved planar reformation (CPR) method. Four features are extracted from each corresponding segment as quality indicators in the original computed tomography volume and the straightened CPR volume, and each quality indicator is used as a voting classifier for the arterial segment. A weighted voting ensemble (WVE) classifier is designed to combine the votes of the four voting classifiers for each corresponding segment. The segment with the highest WVE vote is then selected as the best-quality segment. In this study, the training and test sets consisted of 6 and 20 cCTA cases, respectively, each with 6 phases, containing a total of 156 cCTA volumes and 312 coronary artery trees. An observer preference study was also conducted with one expert cardiothoracic radiologist and four nonradiologist readers to visually rank vessel segment

  15. Automated computer analysis of x-ray radiographs greatly facilitates measurement of coating-thickness variations in laser-fusion targets

    International Nuclear Information System (INIS)

    Stupin, D.M.; Moore, K.R.; Thomas, G.D.; Whitman, R.L.

    1981-01-01

    An automated system was built to analyze x-ray radiographs of laser fusion targets which greatly facilitates the detection of coating thickness variations. Many laser fusion targets require opaque coatings 1 to 20 μm thick which have been deposited on small glass balloons 100 to 500 μm in diameter. These coatings must be uniformly thick to 1% for the targets to perform optimally. Our system is designed to detect variations as small as 100 Å in 1-μm-thick coatings by converting the optical density variations of contact x-ray radiographs into coating thickness variations. Radiographic images are recorded in HRP emulsions, magnified by an optical microscope, imaged onto a television camera, digitized, and processed on a Data General S/230 computer with a code by Whitman. After an initial set-up by the operator, as many as 200 targets will be automatically characterized

  16. Optimizing megakaryocyte polyploidization by targeting multiple pathways of cytokinesis.

    Science.gov (United States)

    Avanzi, Mauro P; Chen, Amanda; He, Wu; Mitchell, W Beau

    2012-11-01

    Large-scale in vitro production of platelets (PLTs) from cord blood stem cells is one goal of stem cell research. One step toward this goal will be to produce polyploid megakaryocytes capable of releasing high numbers of PLTs. Megakaryocyte polyploidization requires distinct cytoskeletal and cellular mechanisms, including actin polymerization, myosin activation, microtubule formation, and increased DNA production. In this study we variably combined inhibition of these principal mechanisms of cytokinesis with the goal of driving polyploidization in megakaryocytes. Megakaryocytes were derived from umbilical cord blood and cultured with reagents that inhibit distinct mechanisms of cytokinesis: Rho-Rock inhibitor (RRI), Src inhibitor (SI), nicotinamide (NIC), aurora B inhibitor (ABI), and myosin light chain kinase inhibitor (MLCKI). Combinations of reagents were used to determine their interactions and to maximize megakaryocyte ploidy. Treatment with RRI, NIC, SI, and ABI, but not with MLCKI, increased the final ploidy and RRI was the most effective single reagent. RRI and MLCKI, both inhibitors of MLC activation, resulted in opposite ploidy outcomes. Combinations of reagents also increased ploidy and the use of NIC, SI, and ABI was as effective as RRI alone. Addition of MLCKI to NIC, SI, and ABI reached the highest level of polyploidization. Megakaryocyte polyploidization results from modulation of a combination of distinct cytokinesis pathways. Reagents targeting distinct cytoskeletal pathways produced additive effects in final megakaryocyte ploidy. The RRI, however, showed no additive effect but produced a high final ploidy due to overlapping inhibition of multiple cytokinesis pathways. © 2012 American Association of Blood Banks.

  17. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult.
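
    The dose-equivalent sum that such a code evaluates has the familiar MIRD-style form; the sketch below is a minimal illustration of that sum, with hypothetical component values and simplified units, and is not SFACTOR itself.

```python
# Minimal sketch of the MIRD-style sum that an S-factor code evaluates:
#   S(target <- source) = sum_i QF_i * Delta_i * AF_i(target <- source) / m_target
# The units and example numbers below are illustrative, not SFACTOR's data.
def s_factor(components, target_mass_g):
    """components: list of dicts with
         'delta' - mean energy emitted per transition for radiation type i
         'af'    - absorbed fraction in the target for that radiation type
         'qf'    - quality factor converting absorbed dose to dose equivalent
       target_mass_g: mass of the target organ in grams."""
    return sum(c['qf'] * c['delta'] * c['af'] for c in components) / target_mass_g

# Hypothetical radionuclide with one beta and one gamma component
components = [
    {'delta': 0.3, 'af': 1.0, 'qf': 1.0},   # beta: locally absorbed
    {'delta': 0.5, 'af': 0.05, 'qf': 1.0},  # gamma: small absorbed fraction
]
print(s_factor(components, target_mass_g=310.0))  # e.g. a liver-sized target
```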

  18. SFACTOR: a computer code for calculating dose equivalent to a target organ per microcurie-day residence of a radionuclide in a source organ

    International Nuclear Information System (INIS)

    Dunning, D.E. Jr.; Pleasant, J.C.; Killough, G.G.

    1977-11-01

    A computer code SFACTOR was developed to estimate the average dose equivalent S (rem/μCi-day) to each of a specified list of target organs per microcurie-day residence of a radionuclide in source organs in man. Source and target organs of interest are specified in the input data stream, along with the nuclear decay information. The SFACTOR code computes components of the dose equivalent rate from each type of decay present for a particular radionuclide, including alpha, electron, and gamma radiation. For those transuranic isotopes which also decay by spontaneous fission, components of S from the resulting fission fragments, neutrons, betas, and gammas are included in the tabulation. Tabulations of all components of S are provided for an array of 22 source organs and 24 target organs for 52 radionuclides in an adult

  19. Computational enzyme design approaches with significant biological outcomes: progress and challenges

    OpenAIRE

    Li, Xiaoman; Zhang, Ziding; Song, Jiangning

    2012-01-01

    Enzymes are powerful biocatalysts; however, there is still a large gap between the number of enzyme-based practical applications and that of naturally occurring enzymes. Multiple experimental approaches have been applied to generate nearly all possible mutations of target enzymes, allowing the identification of desirable variants with improved properties to meet practical needs. Meanwhile, an increasing number of computational methods have been developed to assist in the modificati...

  20. Applications of Fast Truncated Multiplication in Cryptography

    Directory of Open Access Journals (Sweden)

    Laszlo Hars

    2006-12-01

    Full Text Available Truncated multiplications compute truncated products, contiguous subsequences of the digits of integer products. For an n-digit multiplication algorithm of time complexity O(n^α), with 1 < α ≤ 2, there is a truncated multiplication algorithm which is a constant factor faster when computing a short enough truncated product. Applying these fast truncated multiplications, several cryptographic long-integer arithmetic algorithms are improved, including integer reciprocals, divisions, Barrett and Montgomery multiplications, and 2n-digit modular multiplication on hardware for n-digit half products. For example, Montgomery multiplication is performed in 2.6 Karatsuba multiplication times.
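
    To make the idea of a truncated product concrete, the sketch below computes only the low n digits of a schoolbook product by skipping the partial products that cannot affect them; the speedups cited above come from applying the same idea to subquadratic algorithms, which this simple illustration does not attempt.

```python
# Sketch of a truncated product: only the low n digits of a*b are computed,
# so partial products a_i*b_j with i+j >= n (which cannot influence those
# digits) are skipped.  Digits are little-endian in base `base`.
def low_truncated_product(a_digits, b_digits, n, base=10):
    result = [0] * n
    for i, ai in enumerate(a_digits):
        if i >= n:
            break
        for j, bj in enumerate(b_digits):
            if i + j >= n:
                break
            result[i + j] += ai * bj
    # propagate carries, discarding anything at position >= n
    carry = 0
    for k in range(n):
        total = result[k] + carry
        result[k] = total % base
        carry = total // base
    return result

# 1234 * 5678 = 7006652; the low 4 digits are 6652
print(low_truncated_product([4, 3, 2, 1], [8, 7, 6, 5], n=4))  # [2, 5, 6, 6]
```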

  1. A fast ellipse extended target PHD filter using box-particle implementation

    Science.gov (United States)

    Zhang, Yongquan; Ji, Hongbing; Hu, Qi

    2018-01-01

    This paper presents a box-particle implementation of the ellipse extended target probability hypothesis density (ET-PHD) filter, called the ellipse extended target box particle PHD (EET-BP-PHD) filter, where the extended targets are described as a Poisson model developed by Gilholm et al. and the term "box" is here equivalent to the term "interval" used in interval analysis. The proposed EET-BP-PHD filter is capable of dynamically tracking multiple ellipse extended targets and estimating the target states and the number of targets, in the presence of clutter measurements, false alarms and missed detections. To derive the PHD recursion of the EET-BP-PHD filter, a suitable measurement likelihood is defined for a given partitioning cell, and the main implementation steps are presented along with the necessary box approximations and manipulations. The limitations and capabilities of the proposed EET-BP-PHD filter are illustrated by simulation examples. The simulation results show that a box-particle implementation of the ET-PHD filter can avoid the high number of particles and reduce computational burden, compared to a particle implementation of that for extended target tracking.

  2. Application of the 2-D discrete-ordinates method to multiple scattering of laser radiation

    International Nuclear Information System (INIS)

    Zardecki, A.; Gerstl, S.A.W.; Embury, J.F.

    1983-01-01

    The discrete-ordinates finite-element radiation transport code twotran is applied to describe the multiple scattering of a laser beam from a reflecting target. For a model scenario involving a 99% relative humidity rural aerosol we compute the average intensity of the scattered radiation and correction factors to the Beer-Lambert law arising from multiple scattering. As our results indicate, 2-D x-y and r-z geometry modeling can reliably describe a realistic 3-D scenario. Specific results are presented for the two visual ranges of 1.52 and 0.76 km which show that, for sufficiently high aerosol concentrations (e.g., equivalent to V = 0.76 km), the target signature in a distant detector becomes dominated by multiply scattered radiation from interactions of the laser light with the aerosol environment. The merits of the scaling group and the delta-M approximation for the transfer equation are also explored

  3. Accuracy & Computational Considerations for Wide-Angle One-way Seismic Propagators and Multiple Scattering by Invariant Embedding

    Science.gov (United States)

    Thomson, C. J.

    2004-12-01

    Pseudodifferential operators (PSDOs) yield in principle exact one-way seismic wave equations, which are attractive both conceptually and for their promise of computational efficiency. The one-way operators can be extended to include multiple-scattering effects, again in principle exactly. In practice approximations must be made and, as an example, the variable-wavespeed Helmholtz equation for scalar waves in two space dimensions is here factorized to give the one-way wave equation. This simple case permits clear identification of a sequence of physically reasonable approximations to be used when the mathematically exact PSDO one-way equation is implemented on a computer. As intuition suggests, these approximations hinge on the medium gradients in the direction transverse to the main propagation direction. A key point is that narrow-angle approximations are to be avoided in the interests of accuracy. Another key consideration stems from the fact that the so-called "standard-ordering" PSDO indicates how lateral interpolation of the velocity structure can significantly reduce computational costs associated with the Fourier or plane-wave synthesis lying at the heart of the calculations. The decision on whether a slow or a fast Fourier transform code should be used rests upon how many lateral model parameters are truly distinct. A third important point is that the PSDO theory shows what approximations are necessary in order to generate an exponential one-way propagator for the laterally varying case, representing the intuitive extension of classical integral-transform solutions for a laterally homogeneous medium. This exponential propagator suggests the use of larger discrete step sizes, and it can also be used to approach phase-screen like approximations (though the latter are not the main interest here). Numerical comparisons with finite-difference solutions will be presented in order to assess the approximations being made and to gain an understanding

  4. Computer-Based Video Instruction to Teach Students with Intellectual Disabilities to Use Public Bus Transportation

    Science.gov (United States)

    Mechling, Linda; O'Brien, Eileen

    2010-01-01

    This study investigated the effectiveness of computer-based video instruction (CBVI) to teach three young adults with moderate intellectual disabilities to push a "request to stop bus signal" and exit a city bus in response to target landmarks. A multiple probe design across three students and one bus route was used to evaluate effectiveness of…

  5. Computer simulation for the effect of target angle in diagnostic x-ray tube output and half-value layer

    International Nuclear Information System (INIS)

    Hayami, Akimune; Fuchihata, Hajime; Yamazaki, Takeshi; Mori, Yoshinobu; Ozeki, Syuji.

    1980-01-01

    The target angle of an X-ray tube plays an important role in determining both the output and the quality of the X-rays. A computer simulation was made to estimate the effect of target angle on the output and the quality (half-value layer: HVL) in the central ray using Storm's semiempirical formula. The data presented here are the values of output and HVL for target angles of 10, 15, 20 and 30 degrees and for total filtrations of 1, 2, 3 and 4 mm Al eq., in 10 kV steps of applied voltage between 50 and 150 kV. The output values and HVLs as a function of target angle, applied voltage and total filtration are shown for a full-wave rectified diagnostic X-ray generator. As a result, changes ranging from 17 to 76% in the output and 5 to 66% in the HVL were noted by varying the target angle from 10 to 30 degrees. Therefore, the target angle of the X-ray tube should be clearly stated whenever the output and the quality (HVL) of an X-ray generator are discussed. (author)

  6. Comparison of static conformal field with multiple noncoplanar arc techniques for stereotactic radiosurgery or stereotactic radiotherapy

    International Nuclear Information System (INIS)

    Hamilton, Russell J.; Kuchnir, Franca T.; Sweeney, Patrick; Rubin, Steven J.; Dujovny, Manuel; Pelizzari, Charles A.; Chen, George T. Y.

    1995-01-01

    Purpose: Compare the use of static conformal fields with the use of multiple noncoplanar arcs for stereotactic radiosurgery or stereotactic radiotherapy treatment of intracranial lesions. Evaluate the efficacy of these treatment techniques to deliver dose distributions comparable to those considered acceptable in current radiotherapy practice. Methods and Materials: A previously treated radiosurgery case of a patient presenting with an irregularly shaped intracranial lesion was selected. Using a three-dimensional (3D) treatment-planning system, treatment plans using a single isocenter multiple noncoplanar arc technique and multiple noncoplanar conformal static fields were generated. Isodose distributions and dose volume histograms (DVHs) were computed for each treatment plan. We required that the 80% (of maximum dose) isodose surface enclose the target volume for all treatment plans. The prescription isodose was set equal to the minimum target isodose. The DVHs were analyzed to evaluate and compare the different treatment plans. Results: The dose distribution in the target volume becomes more uniform as the number of conformal fields increases. The volume of normal tissue receiving low doses (> 10% of prescription isodose) increases as the number of static fields increases. The single isocenter multiple arc plan treats the greatest volume of normal tissue to low doses, approximately 1.6 times more volume than that treated by four static fields. The volume of normal tissue receiving high (> 90% of prescription isodose) and intermediate (> 50% of prescription isodose) doses decreases by 29 and 22%, respectively, as the number of static fields is increased from four to eight. Increasing the number of static fields to 12 only further reduces the high and intermediate dose volumes by 10 and 6%, respectively. The volume receiving the prescription dose is more than 3.5 times larger than the target volume for all treatment plans. Conclusions: Use of a multiple noncoplanar
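
    The coverage and normal-tissue comparisons in the record above are read off cumulative dose-volume histograms; the following sketch (synthetic dose grid and mask, hypothetical function names) shows how such a curve is computed from a 3-D dose array and a structure mask.

```python
# Sketch of a cumulative dose-volume histogram (DVH) from a 3-D dose grid and
# a boolean structure mask; metrics such as the volume receiving >50% or >90%
# of the prescription dose are read off such a curve.
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Return (dose_levels, fraction_of_structure_receiving_at_least_that_dose)."""
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_fraction = [(d >= lv).mean() for lv in levels]
    return levels, np.array(volume_fraction)

# Toy example: random dose grid and a spherical "normal tissue" mask
rng = np.random.default_rng(1)
dose = rng.random((40, 40, 40)) * 25.0          # Gy, synthetic
zyx = np.indices(dose.shape) - 20
mask = (zyx ** 2).sum(axis=0) <= 15 ** 2
levels, vol = cumulative_dvh(dose, mask)
# e.g. fraction of the structure receiving at least 90% of a 21 Gy prescription
print(vol[np.searchsorted(levels, 0.9 * 21.0)])
```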

  7. Mean associated multiplicities in deep inelastic processes

    International Nuclear Information System (INIS)

    Dzhaparidze, G.Sh.; Kiselev, A.V.; Petrov, V.A.

    1982-01-01

    A formula is derived for the mean hadron multiplicity in the target fragmentation range of deep inelastic scattering processes. It is shown that in the high-x region the ratio of the mean multiplicities in the current fragmentation region and in the target fragmentation region tends to unity at high energies. The mean multiplicity for the Drell-Yan process is considered

  8. Mean associated multiplicities in deep inelastic processes

    International Nuclear Information System (INIS)

    Dzhaparidze, G.S.; Kiselev, A.V.; Petrov, V.A.

    1982-01-01

    A formula is derived for the mean multiplicity of hadrons in the target-fragmentation region in the process of deep inelastic scattering. It is shown that in the region of large x the ratio of the mean multiplicities in the current- and target-fragmentation regions tends to unity at high energies. The mean multiplicity in the Drell-Yan process is also discussed

  9. The Calibration Target for the Mars 2020 SHERLOC Instrument: Multiple Science Roles for Future Manned and Unmanned Mars Exploration

    Science.gov (United States)

    Fries, M.; Bhartia, R.; Beegle, L.; Burton, A.; Ross, A.; Shahar, A.

    2014-01-01

    The Scanning Habitable Environments with Raman & Luminescence for Organics & Chemicals (SHERLOC) instrument is a deep ultraviolet (UV) Raman/fluorescence instrument selected as part of the Mars 2020 rover instrument suite. SHERLOC will be mounted on the rover arm and its primary role is to identify carbonaceous species in martian samples, which may be selected for inclusion into a returnable sample cache. The SHERLOC instrument will require the use of a calibration target, and by design, multiple science roles will be addressed in the design of the target. Samples of materials used in NASA Extravehicular Mobility unit (EMU, or "space suit") manufacture have been included in the target to serve as both solid polymer calibration targets for SHERLOC instrument function, as well as for testing the resiliency of those materials under martian ambient conditions. A martian meteorite will also be included in the target to serve as a well-characterized example of a martian rock that contains trace carbonaceous material. This rock will be the first rock that we know of that has completed a round trip between planets and will therefore serve an EPO role to attract public attention to science and planetary exploration. The SHERLOC calibration target will address a wide range of NASA goals to include basic science of interest to both the Science Mission Directorate (SMD) and Human Exploration and Operations Mission Directorate (HEOMD).

  10. Multiple cyber attacks against a target with observation errors and dependent outcomes: Characterization and optimization

    International Nuclear Information System (INIS)

    Hu, Xiaoxiao; Xu, Maochao; Xu, Shouhuai; Zhao, Peng

    2017-01-01

    In this paper we investigate a cybersecurity model: An attacker can launch multiple attacks against a target, with a termination strategy under which the attacker stops after observing a given number of successful attacks or when the attacker runs out of attack resources. However, the attacker's observation of the attack outcomes (i.e., random variables indicating whether the target is compromised or not) has an observation error that is specified by both a false-negative and a false-positive probability. The novelty of the model we study is the accommodation of the dependence between the attack outcomes, because this dependence was assumed away in the literature. In this model, we characterize the monotonicity and bounds of the compromise probability (i.e., the probability that the target is compromised). In addition to extensively showing the impact of dependence on quantities such as compromise probability and attack cost, we give methods for finding the optimal strategy that leads to maximum compromise probability or minimum attack cost. This study highlights that the dependence between random variables cannot be assumed away, because the results will otherwise be misleading. - Highlights: • A novel cybersecurity model is proposed to accommodate the dependence among attack outcomes. • The monotonicity and bounds of the compromise probability are studied. • The dependence effect on the compromise probability and attack cost is discussed via simulation. • The optimal strategy that leads to maximum compromise probability or minimum attack cost is presented.
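
    As an illustration of how dependence between attack outcomes can be explored by simulation (the paper's own model is analytical; the parameterization below is hypothetical), a small Monte Carlo sketch follows.

```python
# Illustrative Monte Carlo sketch (not the paper's analytical model): dependent
# attack outcomes with observation errors, and an attacker that stops after
# observing k successes or exhausting m attempts.  All parameters hypothetical.
import numpy as np

def simulate(m_attacks=10, k_observed_successes=2, p_success=0.3,
             dependence=0.5, p_false_neg=0.1, p_false_pos=0.05,
             n_runs=100_000, rng=None):
    rng = rng or np.random.default_rng(0)
    compromised = 0
    total_attacks = 0
    for _ in range(n_runs):
        # One latent factor per run induces positive dependence among outcomes
        latent = rng.random()
        observed = 0
        success = False
        for attempt in range(m_attacks):
            p = p_success * (1 - dependence) + dependence * latent
            outcome = rng.random() < p
            success = success or outcome
            # The attacker observes the outcome through a noisy channel
            if outcome:
                observed += rng.random() >= p_false_neg
            else:
                observed += rng.random() < p_false_pos
            total_attacks += 1
            if observed >= k_observed_successes:
                break
        compromised += success
    return compromised / n_runs, total_attacks / n_runs

p_compromise, mean_cost = simulate()
print(f"compromise probability ~ {p_compromise:.3f}, mean attacks ~ {mean_cost:.2f}")
```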

  11. Multiple constant multiplication optimizations for field programmable gate arrays

    CERN Document Server

    Kumm, Martin

    2016-01-01

    This work covers field programmable gate array (FPGA)-specific optimizations of circuits computing the multiplication of a variable by several constants, commonly denoted as multiple constant multiplication (MCM). These optimizations focus on low resource usage but high performance. They comprise the use of fast carry-chains in adder-based constant multiplications including ternary (3-input) adders as well as the integration of look-up table-based constant multipliers and embedded multipliers to get the optimal mapping to modern FPGAs. The proposed methods can be used for the efficient implementation of digital filters, discrete transforms and many other circuits in the domain of digital signal processing, communication and image processing. Contents: Heuristic and ILP-Based Optimal Solutions for the Pipelined Multiple Constant Multiplication Problem; Methods to Integrate Embedded Multipliers, LUT-Based Constant Multipliers and Ternary (3-Input) Adders; An Optimized Multiple Constant Multiplication Architecture ...
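
    The core idea behind MCM optimization is that each constant multiplication is realized from shifts and adders, with intermediate terms shared across constants; the sketch below shows the arithmetic identity for a hypothetical pair of constants, not a circuit from the book.

```python
# Sketch of the shift-and-add idea behind multiple constant multiplication:
# each constant is realised from shifts (free in hardware) and adders, and
# intermediate terms can be shared between constants.  The constants chosen
# here are just an example.
def mcm_7_and_21(x):
    # shared intermediate: 7x = (x << 3) - x   (one subtractor)
    t7 = (x << 3) - x
    # 21x reuses 7x:      21x = (7x << 1) + 7x (one more adder)
    t21 = (t7 << 1) + t7
    return t7, t21

for x in (1, 5, 123):
    assert mcm_7_and_21(x) == (7 * x, 21 * x)
print("7x and 21x computed with two add/subtract operations per sample")
```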

  12. SU-E-I-27: Establishing Target Exposure Index Values for Computed Radiography

    International Nuclear Information System (INIS)

    Murphy, N; Tchou, P; Belcher, K; Scott, A

    2014-01-01

    Purpose: To develop a standard set of target exposure index (TEI) values to be applied to Agfa Computed Radiography (CR) readers in accordance with International Electrotechnical Commission (IEC) 62494-1 (ed. 1.0). Methods: A large data cohort was collected from six USAF Medical Treatment Facilities that exclusively use Agfa CR readers. Dose monitoring statistics were collected from each reader. The data were analyzed based on anatomic region, view, and processing speed class. The Agfa-specific exposure metric, logarithmic mean (LGM), was converted to exposure index (EI) for each data set. The optimum TEI value was determined by minimizing the number of studies that fell outside the acceptable deviation index (DI) range of +/−2 for phototimed techniques or a range of +/−3 for fixed techniques. An anthropomorphic radiographic phantom was used to corroborate the TEI recommendations. Images were acquired of several anatomic regions and views using standard techniques. The images were then evaluated by two radiologists as either acceptable or unacceptable. The acceptable image with the lowest exposure and EI value was compared to the recommended TEI values using a passing DI range. Results: Target EI values were determined for a comprehensive list of anatomic regions and views. Conclusion: Target EI values must be established on each CR unit in order to provide a positive feedback system for the technologist. This system will serve as a mechanism to prevent under- or overexposures of patients. The TEI recommendations are a first attempt at a large-scale process improvement with the goal of setting reasonable and standardized TEI values. The implementation and effectiveness of the recommended TEI values should be monitored and adjustments made as necessary
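
    The deviation index that drives this feedback is commonly computed as DI = 10·log10(EI/EI_T) in IEC 62494-1; the small sketch below applies that formula together with the acceptance ranges quoted above, and should be treated as an assumption if vendor documentation states otherwise.

```python
# Sketch of the deviation-index feedback loop implied above.  The formula
# DI = 10 * log10(EI / EI_T) follows IEC 62494-1; treat it as an assumption
# if your vendor documentation differs.
import math

def deviation_index(ei, target_ei):
    return 10.0 * math.log10(ei / target_ei)

def exposure_ok(ei, target_ei, phototimed=True):
    limit = 2.0 if phototimed else 3.0   # acceptance ranges used above
    return abs(deviation_index(ei, target_ei)) <= limit

print(deviation_index(250, 200))   # ~ +0.97, slight overexposure
print(exposure_ok(250, 200))       # True
```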

  13. FACC: A Novel Finite Automaton Based on Cloud Computing for the Multiple Longest Common Subsequences Search

    Directory of Open Access Journals (Sweden)

    Yanni Li

    2012-01-01

    Full Text Available Searching for the multiple longest common subsequences (MLCS) has significant applications in areas such as bioinformatics, information processing, and data mining. Although a few parallel MLCS algorithms have been proposed, their efficiency and effectiveness are not satisfactory given the increasing complexity and size of biologic data. To overcome the shortcomings of the existing MLCS algorithms, and considering that the MapReduce parallel framework of cloud computing is a promising technology for cost-effective high-performance parallel computing, a novel finite automaton (FA) based on cloud computing, called FACC, is proposed under the MapReduce parallel framework, so as to develop a more efficient and effective general parallel MLCS algorithm. FACC adopts the ideas of matched pairs and finite automata by preprocessing sequences, constructing successor tables, and building a common-subsequence finite automaton to search for the MLCS. Simulation experiments on a set of benchmarks from both real DNA and amino acid sequences have been conducted, and the results show that the proposed FACC algorithm outperforms the current leading parallel MLCS algorithm FAST-MLCS.
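
    For readers unfamiliar with the problem, the following baseline dynamic-programming sketch fixes the MLCS definition for a handful of short sequences; it is exponential in the number of sequences and is not the FACC algorithm described above.

```python
# Baseline dynamic-programming MLCS for a few short sequences, shown only to
# fix the problem definition; product-of-lengths state space, so it does not
# scale to real biological data.
from functools import lru_cache

def mlcs(*seqs):
    @lru_cache(maxsize=None)
    def rec(idx):
        if any(i == 0 for i in idx):
            return ""
        heads = [s[i - 1] for s, i in zip(seqs, idx)]
        if all(h == heads[0] for h in heads):
            # all current last characters match: include that character
            return rec(tuple(i - 1 for i in idx)) + heads[0]
        best = ""
        for k in range(len(seqs)):
            # otherwise drop the last character of one sequence at a time
            cand = rec(tuple(i - 1 if j == k else i for j, i in enumerate(idx)))
            if len(cand) > len(best):
                best = cand
        return best
    return rec(tuple(len(s) for s in seqs))

print(mlcs("ACGGTTA", "ACTGTA", "AGGTTCA"))  # one longest common subsequence
```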

  14. Charged particle fusion targets

    International Nuclear Information System (INIS)

    Bangerter, R.O.; Meeker, D.J.

    1977-01-01

    The power, voltage, energy and other requirements of electron and ion beam fusion targets are reviewed. Single shell, multiple shell and magnetically insulated target designs are discussed. Questions of stability are also considered. In particular, it is shown that ion beam targets are stabilized by an energy spread in the ion beam

  15. Computer simulation of multiple stability regions in an internally cooled superconducting conductor and of helium replenishment in a bath-cooled conductor

    International Nuclear Information System (INIS)

    Turner, L.R.; Shindler, J.

    1984-09-01

    For upcoming fusion experiments and future fusion reactors, superconducting magnets have been chosen or considered that employ cooling by pool-boiling HeI, by HeII, and by internally flowing HeI. The choice of conductor and cooling method should be determined in part by the response of the magnet to sudden localized heat pulses of various magnitudes. The paper describes the successful computer simulation of multiple stability in internally cooled conductors, as observed experimentally, using the computer code SSICC. It also describes the modeling of helium replenishment in the cooling channels of a bath-cooled conductor, using the computer code TASS

  16. Multiple polysaccharide-drug complex-loaded liposomes: A unique strategy in drug loading and cancer targeting.

    Science.gov (United States)

    Ruttala, Hima Bindu; Ramasamy, Thiruganesh; Gupta, Biki; Choi, Han-Gon; Yong, Chul Soon; Kim, Jong Oh

    2017-10-01

    In the present study, a unique strategy was developed to produce nanocarriers containing multiple therapeutics with controlled release characteristics. We demonstrated the synthesis of dextran sulfate-doxorubicin (DS-DOX) and alginate-cisplatin (AL-CIS) polymer-drug complexes to produce a transferrin ligand-conjugated liposome. The targeted nanoparticles (TL-DDAC) were nano-sized and spherical. The targeted liposome exhibited a specific receptor-mediated endocytic uptake in cancer cells. The enhanced cellular uptake of TL-DDAC resulted in a significantly better anticancer effect in resistant and sensitive breast cancer cells compared to that of the free drugs. Specifically, DOX and CIS at a molar ratio of 1:1 exhibited better therapeutic performance compared to that of other combinations. The combination of an anthracycline-based topoisomerase II inhibitor (DOX) and a platinum compound (CIS) resulted in significantly higher cell apoptosis (early and late) in both types of cancer cells. In conclusion, treatment with DS-DOX and AL-CIS based combination liposomes modified with transferrin (TL-DDAC) was an effective cancer treatment strategy. Further investigation in clinically relevant animal models is warranted to prove the therapeutic efficacy of this unique strategy. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. A computational method for identification of vaccine targets from protein regions of conserved human leukocyte antigen binding

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Simon, Christian; Kudahl, Ulrich J.

    2015-01-01

    Background: Computational methods for T cell-based vaccine target discovery focus on selection of highly conserved peptides identified across pathogen variants, followed by prediction of their binding of human leukocyte antigen molecules. However, experimental studies have shown that T cells ofte...... or proteome using human leukocyte antigen binding predictions and made a web-accessible software implementation freely available at http://met-hilab.cbs.dtu.dk/blockcons/....

  18. Clinical Validation of Atlas-Based Auto-Segmentation of Multiple Target Volumes and Normal Tissue (Swallowing/Mastication) Structures in the Head and Neck

    Energy Technology Data Exchange (ETDEWEB)

    Teguh, David N. [Department of Radiation Oncology, Erasmus Medical Center-Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Levendag, Peter C., E-mail: p.levendag@erasmusmc.nl [Department of Radiation Oncology, Erasmus Medical Center-Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Voet, Peter W.J.; Al-Mamgani, Abrahim [Department of Radiation Oncology, Erasmus Medical Center-Daniel den Hoed Cancer Center, Rotterdam (Netherlands); Han Xiao; Wolf, Theresa K.; Hibbard, Lyndon S. [Elekta-CMS Software, Maryland Heights, MO 63043 (United States); Nowak, Peter; Akhiat, Hafid; Dirkx, Maarten L.P.; Heijmen, Ben J.M.; Hoogeman, Mischa S. [Department of Radiation Oncology, Erasmus Medical Center-Daniel den Hoed Cancer Center, Rotterdam (Netherlands)

    2011-11-15

    Conclusion: Multiple-subject ABAS of computed tomography images proved to be a useful novel tool in the rapid delineation of target and normal tissues. Although editing of the autocontours is inevitable, a substantial time reduction was achieved using editing, instead of manual contouring (180 vs. 66 min).

  19. Clinical Validation of Atlas-Based Auto-Segmentation of Multiple Target Volumes and Normal Tissue (Swallowing/Mastication) Structures in the Head and Neck

    International Nuclear Information System (INIS)

    Teguh, David N.; Levendag, Peter C.; Voet, Peter W.J.; Al-Mamgani, Abrahim; Han Xiao; Wolf, Theresa K.; Hibbard, Lyndon S.; Nowak, Peter; Akhiat, Hafid; Dirkx, Maarten L.P.; Heijmen, Ben J.M.; Hoogeman, Mischa S.

    2011-01-01

    Conclusion: Multiple-subject ABAS of computed tomography images proved to be a useful novel tool in the rapid delineation of target and normal tissues. Although editing of the autocontours is inevitable, a substantial time reduction was achieved using editing, instead of manual contouring (180 vs. 66 min).

  20. Matrine Is Identified as a Novel Macropinocytosis Inducer by a Network Target Approach

    Directory of Open Access Journals (Sweden)

    Bo Zhang

    2018-01-01

    Full Text Available Comprehensively understanding the pharmacological functions of natural products is a key issue to be addressed for the discovery of new drugs. Unlike some single-target drugs, natural products always exert diverse therapeutic effects through acting on a "network" that consists of multiple targets, making it necessary to develop a systematic approach, e.g., network pharmacology, to reveal the pharmacological functions of natural products and infer their mechanisms of action. In this work, to identify the "network target" of a natural product, we perform a functional analysis of matrine, a marketed drug in China extracted from the medicinal herb Ku-Shen (Radix Sophorae Flavescentis). Here, the network target of matrine was first predicted by drugCIPHER, a genome-wide target prediction method. Based on the network target of matrine, we performed a functional gene set enrichment analysis to computationally identify the potential pharmacological functions of matrine, most of which are supported by literature evidence, including the neurotoxicity and neuropharmacological activities of matrine. Furthermore, computational results demonstrated that matrine has the potential for the induction of macropinocytosis and the regulation of ATP metabolism. Our experimental data revealed that the large vesicles induced by matrine are consistent with the typical characteristics of macropinosomes. Our verification results also suggested that matrine could decrease cellular ATP levels. These findings demonstrate the availability and effectiveness of the network target strategy for identifying the comprehensive pharmacological functions of natural products.

  1. Mining Emerging Patterns for Recognizing Activities of Multiple Users in Pervasive Computing

    DEFF Research Database (Denmark)

    Gu, Tao; Wu, Zhanqing; Wang, Liang

    2009-01-01

    Understanding and recognizing human activities from sensor readings is an important task in pervasive computing. Existing work on activity recognition mainly focuses on recognizing activities for a single user in a smart home environment. However, in real life, there are often multiple inhabitants...... activity models, and propose an Emerging Pattern based Multi-user Activity Recognizer (epMAR) to recognize both single-user and multiuser activities. We conduct our empirical studies by collecting real-world activity traces done by two volunteers over a period of two weeks in a smart home environment...... sensor readings in a home environment, and propose a novel pattern mining approach to recognize both single-user and multi-user activities in a unified solution. We exploit Emerging Pattern – a type of knowledge pattern that describes significant changes between classes of data – for constructing our...

  2. Cross-scale Efficient Tensor Contractions for Coupled Cluster Computations Through Multiple Programming Model Backends

    Energy Technology Data Exchange (ETDEWEB)

    Ibrahim, Khaled Z. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Epifanovsky, Evgeny [Q-Chem, Inc., Pleasanton, CA (United States); Williams, Samuel W. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Computational Research Division; Krylov, Anna I. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Chemistry

    2016-07-26

    Coupled-cluster methods provide highly accurate models of molecular structure by explicit numerical calculation of tensors representing the correlation between electrons. These calculations are dominated by a sequence of tensor contractions, motivating the development of numerical libraries for such operations. While based on matrix-matrix multiplication, these libraries are specialized to exploit symmetries in the molecular structure and in electronic interactions, and thus reduce the size of the tensor representation and the complexity of contractions. The resulting algorithms are irregular and their parallelization has previously been achieved via the use of dynamic scheduling or specialized data decompositions. We introduce our efforts to extend the Libtensor framework to work in the distributed memory environment in a scalable and energy efficient manner. We achieve up to 240× speedup compared with the best optimized shared memory implementation. We attain scalability to hundreds of thousands of compute cores on three distributed-memory architectures (Cray XC30 & XC40, BlueGene/Q), and on a heterogeneous GPU-CPU system (Cray XK7). As the bottlenecks shift from compute-bound DGEMMs to communication-bound collectives as the size of the molecular system scales, we adopt two radically different parallelization approaches for handling load imbalance. Nevertheless, we preserve a unified interface to both programming models to maintain the productivity of computational quantum chemists.
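
    Tensor contractions of the kind these libraries accelerate can be written compactly in einsum notation; the toy sketch below (illustrative index names and dimensions, with no symmetry or block-sparsity handling) shows two contractions in that spirit.

```python
# Toy tensor contractions in the spirit of coupled-cluster kernels: contract a
# two-electron-integral-like tensor with a T2-amplitude-like tensor.  The
# index names and dimensions are illustrative only; real codes exploit
# symmetry and block-sparsity, which is what makes them irregular.
import numpy as np

no, nv = 6, 10                           # occupied / virtual orbital counts
rng = np.random.default_rng(0)
V = rng.normal(size=(nv, nv, no, no))    # <ab|ij>-style integrals
T2 = rng.normal(size=(nv, nv, no, no))   # t_{ij}^{ab}-style amplitudes

# energy-like scalar: sum_{abij} V[a,b,i,j] * T2[a,b,i,j]
e = np.einsum("abij,abij->", V, T2)

# residual-like contraction over internal indices c,k:
# R[a,b,i,j] = sum_{c,k} V[a,c,i,k] * T2[c,b,k,j]
R = np.einsum("acik,cbkj->abij", V, T2)
print(e, R.shape)
```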

  3. Computerized nipple identification for multiple image analysis in computer-aided diagnosis

    International Nuclear Information System (INIS)

    Zhou Chuan; Chan Heangping; Paramagul, Chintana; Roubidoux, Marilyn A.; Sahiner, Berkman; Hadjiiski, Lubomir M.; Petrick, Nicholas

    2004-01-01

    Correlation of information from multiple-view mammograms (e.g., MLO and CC views, bilateral views, or current and prior mammograms) can improve the performance of breast cancer diagnosis by radiologists or by computer. The nipple is a reliable and stable landmark on mammograms for the registration of multiple mammograms. However, accurate identification of nipple location on mammograms is challenging because of the variations in image quality and in the nipple projections, resulting in some nipples being nearly invisible on the mammograms. In this study, we developed a computerized method to automatically identify the nipple location on digitized mammograms. First, the breast boundary was obtained using a gradient-based boundary tracking algorithm, and then the gray level profiles along the inside and outside of the boundary were identified. A geometric convergence analysis was used to limit the nipple search to a region of the breast boundary. A two-stage nipple detection method was developed to identify the nipple location using the gray level information around the nipple, the geometric characteristics of nipple shapes, and the texture features of glandular tissue or ducts which converge toward the nipple. At the first stage, a rule-based method was designed to identify the nipple location by detecting significant changes of intensity along the gray level profiles inside and outside the breast boundary and the changes in the boundary direction. At the second stage, a texture orientation-field analysis was developed to estimate the nipple location based on the convergence of the texture pattern of glandular tissue or ducts towards the nipple. The nipple location was finally determined from the detected nipple candidates by a rule-based confidence analysis. In this study, 377 and 367 randomly selected digitized mammograms were used for training and testing the nipple detection algorithm, respectively. Two experienced radiologists identified the nipple locations

  4. Geometrical differences in target volumes based on 18F-fluorodeoxyglucose positron emission tomography/computed tomography and four-dimensional computed tomography maximum intensity projection images of primary thoracic esophageal cancer.

    Science.gov (United States)

    Guo, Y; Li, J; Wang, W; Zhang, Y; Wang, J; Duan, Y; Shang, D; Fu, Z

    2014-01-01

    The objective of the study was to compare geometrical differences of target volumes based on four-dimensional computed tomography (4DCT) maximum intensity projection (MIP) and 18F-fluorodeoxyglucose positron emission tomography/computed tomography (18F-FDG PET/CT) images of primary thoracic esophageal cancer for radiation treatment. Twenty-one patients with thoracic esophageal cancer sequentially underwent contrast-enhanced three-dimensional computed tomography (3DCT), 4DCT, and 18F-FDG PET/CT thoracic simulation scans during normal free breathing. The internal gross target volume defined as IGTVMIP was obtained by contouring on MIP images. The gross target volumes based on PET/CT images (GTVPET ) were determined with nine different standardized uptake value (SUV) thresholds and manual contouring: SUV≥2.0, 2.5, 3.0, 3.5 (SUVn); ≥20%, 25%, 30%, 35%, 40% of the maximum (percentages of SUVmax, SUVn%). The differences in volume ratio (VR), conformity index (CI), and degree of inclusion (DI) between IGTVMIP and GTVPET were investigated. The mean centroid distance between GTVPET and IGTVMIP ranged from 4.98 mm to 6.53 mm. The VR ranged from 0.37 to 1.34, being significantly (P<0.05) closest to 1 at SUV2.5 (0.94), SUV20% (1.07), or manual contouring (1.10). The mean CI ranged from 0.34 to 0.58, being significantly closest to 1 (P<0.05) at SUV2.0 (0.55), SUV2.5 (0.56), SUV20% (0.56), SUV25% (0.53), or manual contouring (0.58). The mean DI of GTVPET in IGTVMIP ranged from 0.61 to 0.91, and the mean DI of IGTVMIP in GTVPET ranged from 0.34 to 0.86. The SUV threshold setting of SUV2.5, SUV20% or manual contouring yields the best tumor VR and CI with internal-gross target volume contoured on MIP of 4DCT dataset, but 3DPET/CT and 4DCT MIP could not replace each other for motion encompassing target volume delineation for radiation treatment. © 2014 International Society for Diseases of the Esophagus.
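
    The volume ratio, conformity index, and degree of inclusion compared above can be computed from boolean masks on a common voxel grid; the sketch below uses common definitions of these metrics (the study's exact definitions may differ) on synthetic masks.

```python
# Sketch of the overlap metrics compared above, computed from boolean masks on
# a common voxel grid.  Definitions follow common usage (VR = volume ratio,
# CI = intersection over union, DI of A in B = |A∩B|/|A|); the study's exact
# definitions may differ slightly.
import numpy as np

def overlap_metrics(gtv_pet, igtv_mip, voxel_volume=1.0):
    a, b = gtv_pet.astype(bool), igtv_mip.astype(bool)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return {
        "VR": a.sum() / b.sum(),
        "CI": inter / union,
        "DI_pet_in_mip": inter / a.sum(),
        "DI_mip_in_pet": inter / b.sum(),
        "volumes_cc": (a.sum() * voxel_volume, b.sum() * voxel_volume),
    }

# Toy masks: two overlapping spheres on a 1 mm grid
z, y, x = np.indices((60, 60, 60))
pet = (z - 30) ** 2 + (y - 30) ** 2 + (x - 28) ** 2 <= 10 ** 2
mip = (z - 30) ** 2 + (y - 30) ** 2 + (x - 33) ** 2 <= 12 ** 2
print(overlap_metrics(pet, mip, voxel_volume=0.001))  # cc for 1 mm voxels
```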

  5. Computer simulation of sputtering of graphite target in magnetron sputtering device with two zones of erosion

    Directory of Open Access Journals (Sweden)

    Bogdanov R.V.

    2015-03-01

    Full Text Available A computer simulation program for the discharge in a magnetron sputtering device with two erosion zones was developed. The basic laws of the graphite target sputtering process and of the transport of sputtered material to the substrate were taken into account in the Monte Carlo code. The results of the computer simulation of the radial distributions of the density and energy flux of carbon atoms on the substrate (at different values of discharge current and working-gas pressure) confirmed the possibility of obtaining high-quality homogeneous films using this magnetron sputtering device. The discharge modes were also determined for this magnetron sputtering device in which the energy and density of the carbon atom fluxes are suitable for the deposition of carbon films containing carbon nanotubes and other nanoparticles.

  6. Visualizing Matrix Multiplication

    Science.gov (United States)

    Daugulis, Peteris; Sondore, Anita

    2018-01-01

    Efficient visualizations of computational algorithms are important tools for students, educators, and researchers. In this article, we point out an innovative visualization technique for matrix multiplication. This method differs from the standard, formal approach by using block matrices to make computations more visual. We find this method a…
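
    The block-matrix view mentioned in the record above can be checked numerically: partition A and B into 2×2 blocks, form C block-by-block, and compare with the whole-matrix product. A small sketch with illustrative sizes follows.

```python
# Block-matrix view of matrix multiplication: partition A and B into 2x2
# blocks and form C block-by-block; the result matches the whole-matrix
# product exactly.
import numpy as np

rng = np.random.default_rng(0)
A = rng.integers(0, 5, size=(4, 4))
B = rng.integers(0, 5, size=(4, 4))

def blocks(M):
    return [[M[:2, :2], M[:2, 2:]], [M[2:, :2], M[2:, 2:]]]

Ab, Bb = blocks(A), blocks(B)
# C_ij = A_i0 @ B_0j + A_i1 @ B_1j for each block position (i, j)
Cb = [[Ab[i][0] @ Bb[0][j] + Ab[i][1] @ Bb[1][j] for j in range(2)]
      for i in range(2)]
C_block = np.block(Cb)
assert np.array_equal(C_block, A @ B)
print(C_block)
```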

  7. MODELING OF TARGETED DRUG DELIVERY PART II. MULTIPLE DRUG ADMINISTRATION

    Directory of Open Access Journals (Sweden)

    A. V. Zaborovskiy

    2017-01-01

    Full Text Available In oncology practice, despite significant advances in early cancer detection, surgery, radiotherapy, laser therapy, targeted therapy, etc., chemotherapy is unlikely to lose its relevance in the near future. In this context, the development of new antitumor agents is one of the most important problems of cancer research. In spite of the importance of searching for new compounds with antitumor activity, the possibilities of the “old” agents have not been fully exhausted. Targeted delivery of antitumor agents can give them a “second life”. When developing new targeted drugs and their further introduction into clinical practice, the change in their pharmacodynamics and pharmacokinetics plays a special role. The paper describes a pharmacokinetic model of the targeted drug delivery. The conditions under which it is meaningful to search for a delivery vehicle for the active substance were described. Primary screening of antitumor agents was undertaken to modify them for the targeted delivery based on underlying assumptions of the model.

  8. Reduce in Variation and Improve Efficiency of Target Volume Delineation by a Computer-Assisted System Using a Deformable Image Registration Approach

    International Nuclear Information System (INIS)

    Chao, K.S. Clifford; Bhide, Shreerang FRCR; Chen, Hansen; Asper, Joshua PAC; Bush, Steven; Franklin, Gregg; Kavadi, Vivek; Liengswangwong, Vichaivood; Gordon, William; Raben, Adam; Strasser, Jon; Koprowski, Christopher; Frank, Steven; Chronowski, Gregory; Ahamad, Anesa; Malyapa, Robert; Zhang Lifei; Dong Lei

    2007-01-01

    Purpose: To determine whether a computer-assisted target volume delineation (CAT) system using a deformable image registration approach can reduce the variation of target delineation among physicians with different head and neck (HN) IMRT experiences and reduce the time spent on the contouring process. Materials and Methods: We developed a deformable image registration method for mapping contours from a template case to a patient case with a similar tumor manifestation but different body configuration. Eight radiation oncologists with varying levels of clinical experience in HN IMRT performed target delineation on two HN cases, one with base-of-tongue (BOT) cancer and another with nasopharyngeal cancer (NPC), by first contouring from scratch and then by modifying the contours deformed by the CAT system. The gross target volumes were provided. Regions of interest for comparison included the clinical target volumes (CTVs) and normal organs. The volumetric and geometric variation of these regions of interest and the time spent on contouring were analyzed. Results: We found that the variation in delineating CTVs from scratch among the physicians was significant, and that using the CAT system reduced volumetric variation and improved geometric consistency in both BOT and NPC cases. The average timesaving when using the CAT system was 26% to 29% for more experienced physicians and 38% to 47% for the less experienced ones. Conclusions: A computer-assisted target volume delineation approach, using a deformable image-registration method with template contours, was able to reduce the variation among physicians with different experiences in HN IMRT while saving contouring time

  9. The effects of interventions targeting multiple health behaviors on smoking cessation outcomes: a rapid realist review protocol.

    Science.gov (United States)

    Minian, Nadia; deRuiter, Wayne K; Lingam, Mathangee; Corrin, Tricia; Dragonetti, Rosa; Manson, Heather; Taylor, Valerie H; Zawertailo, Laurie; Ebnahmady, Arezoo; Melamed, Osnat C; Rodak, Terri; Hahn, Margaret; Selby, Peter

    2018-03-01

    Health behaviors directly impact the health of individuals, and populations. Since individuals tend to engage in multiple unhealthy behaviors such as smoking, excessive alcohol use, physical inactivity, and eating an unhealthy diet simultaneously, many large community-based interventions have been implemented to reduce the burden of disease through the modification of multiple health behaviors. Smoking cessation can be particularly challenging as the odds of becoming dependent on nicotine increase with every unhealthy behavior a smoker exhibits. This paper presents a protocol for a rapid realist review which aims to identify factors associated with effectively changing tobacco use and target two or more additional unhealthy behaviors. An electronic literature search will be conducted using the following bibliographic databases: MEDLINE, Embase, PsycINFO, Cumulative Index to Nursing and Allied Health Literature (CINAHL), The Cochrane Library, Social Science Abstracts, Social Work Abstracts, and Web of Science. Two reviewers will screen titles and abstracts for relevant research, and the selected full papers will be used to extract data and assess the quality of evidence. Throughout this process, the rapid realist approach proposed by Saul et al., 2013 will be used to refine our initial program theory and identify contextual factors and mechanisms that are associated with successful multiple health behavior change. This review will provide evidence-based research on the context and mechanisms that may drive the success or failure of interventions designed to support multiple health behavior change. This information will be used to guide curriculum and program development for a government funded project on improving smoking cessation by addressing multiple health behaviors in people in Canada. PROSPERO CRD42017064430.

  10. A Study of Multiplicities in Hadronic Interactions

    Energy Technology Data Exchange (ETDEWEB)

    Estrada Tristan, Nora Patricia; /San Luis Potosi U.

    2006-02-01

    Using data from the SELEX (Fermilab E781) experiment obtained with a minimum-bias trigger, we study multiplicity and angular distributions of secondary particles produced in interactions in the experimental targets. We observe interactions of Σ⁻, proton, π⁻, and π⁺, at beam momenta between 250 GeV/c and 650 GeV/c, in copper, polyethylene, graphite, and beryllium targets. We show that the multiplicity and angular distributions for meson and baryon beams at the same momentum are identical. We also show that the mean multiplicity increases with beam momentum and shows only small variations with the target material.

  11. Systemic delivery of microRNA-101 potently inhibits hepatocellular carcinoma in vivo by repressing multiple targets.

    Directory of Open Access Journals (Sweden)

    Fang Zheng

    2015-02-01

    Full Text Available Targeted therapy based on adjustment of microRNA (miRNA) activity holds great promise due to the ability of these small RNAs to modulate cellular behavior. However, the efficacy of miR-101 replacement therapy for hepatocellular carcinoma (HCC) remains unclear. In the current study, we first observed that plasma levels of miR-101 were significantly lower in distant metastatic HCC patients than in HCCs without distant metastasis, and down-regulation of plasma miR-101 predicted a worse disease-free survival (DFS, P<0.05). In an animal model of HCC, we demonstrated that systemic delivery of lentivirus-mediated miR-101 abrogated HCC growth in the liver, intrahepatic metastasis and distant metastasis to the lung and to the mediastinum, resulting in a dramatic suppression of HCC development and metastasis in mice without toxicity and extending life expectancy. Furthermore, enforced overexpression of miR-101 in HCC cells not only decreased EZH2, COX2 and STMN1, but also directly down-regulated a novel target ROCK2, inhibited Rho/Rac GTPase activation, and blocked HCC cell epithelial-mesenchymal transition (EMT) and angiogenesis, inducing a strong abrogation of HCC tumorigenesis and aggressiveness both in vitro and in vivo. These results provide proof-of-concept support for systemic delivery of lentivirus-mediated miR-101 as a powerful anti-HCC therapeutic modality by repressing multiple molecular targets.

  12. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes prediction based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
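
    As a rough illustration of the representation-and-projection idea described in this record, the sketch below embeds drugs and targets from toy similarity networks with a truncated eigendecomposition (a simplification of DTINet's diffusion-based embedding) and scores unseen pairs through a projection matrix fitted on known interactions; all matrices, sizes, and thresholds are invented for illustration and are not part of the published pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy symmetric similarity networks and a sparse known-interaction matrix;
# the real pipeline integrates several curated drug- and protein-related networks.
n_drugs, n_targets, k = 30, 40, 8
S_drug = rng.random((n_drugs, n_drugs))
S_drug = (S_drug + S_drug.T) / 2
S_targ = rng.random((n_targets, n_targets))
S_targ = (S_targ + S_targ.T) / 2
A = (rng.random((n_drugs, n_targets)) > 0.9).astype(float)   # known drug-target interactions

def embed(S, k):
    """Low-dimensional node representation via a truncated eigendecomposition
    (a stand-in for the diffusion-based embedding used by DTINet)."""
    w, v = np.linalg.eigh(S)
    top = np.argsort(w)[::-1][:k]
    return v[:, top] * np.sqrt(np.abs(w[top]))

X = embed(S_drug, k)     # drug embeddings, shape (n_drugs, k)
Y = embed(S_targ, k)     # target embeddings, shape (n_targets, k)

# Fit a projection matrix Z so that X @ Z @ Y.T approximates the known
# interactions: vec(X Z Y^T) = (Y kron X) vec(Z), solved by least squares.
z, *_ = np.linalg.lstsq(np.kron(Y, X), A.reshape(-1, order="F"), rcond=None)
Z = z.reshape(k, k, order="F")

scores = X @ Z @ Y.T                                   # predicted interaction scores
new_hits = np.argwhere((scores > scores.mean() + 2 * scores.std()) & (A == 0))
print("candidate new drug-target pairs:", new_hits[:5])
```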

  13. Benznidazole biotransformation and multiple targets in Trypanosoma cruzi revealed by metabolomics.

    Directory of Open Access Journals (Sweden)

    Andrea Trochine

    2014-05-01

    Full Text Available The first-line treatment for Chagas disease, a neglected tropical disease caused by the protozoan parasite Trypanosoma cruzi, involves administration of benznidazole (Bzn). Bzn is a 2-nitroimidazole pro-drug which requires nitroreduction to become active, although its mode of action is not fully understood. In the present work we used a non-targeted MS-based metabolomics approach to study the metabolic response of T. cruzi to Bzn. Parasites treated with Bzn were minimally altered compared to untreated trypanosomes, although the redox-active thiols trypanothione, homotrypanothione and cysteine were significantly diminished in abundance post-treatment. In addition, multiple Bzn-derived metabolites were detected after treatment. These metabolites included reduction products, fragments and covalent adducts of reduced Bzn linked to each of the major low molecular weight thiols: trypanothione, glutathione, γ-glutamylcysteine, glutathionylspermidine, cysteine and ovothiol A. Bzn products known to be generated in vitro by the unusual trypanosomal nitroreductase, TcNTRI, were found within the parasites, but low molecular weight adducts of glyoxal, a proposed toxic end-product of NTRI Bzn metabolism, were not detected. Our data are indicative of a major role of the thiol binding capacity of Bzn reduction products in the mechanism of Bzn toxicity against T. cruzi.

  14. B-cell activating factor in the pathophysiology of multiple myeloma: a target for therapy?

    International Nuclear Information System (INIS)

    Hengeveld, P J; Kersten, M J

    2015-01-01

    Multiple myeloma (MM) is a currently incurable malignancy of plasma cells. Malignant myeloma cells (MMCs) are heavily dependent upon the bone marrow (BM) microenvironment for their survival. One component of this tumor microenvironment, B-Cell Activating Factor (BAFF), has been implicated as a key player in this interaction. This review discusses the role of BAFF in the pathophysiology of MM, and the potential of BAFF-inhibitory therapy for the treatment of MM. Multiple studies have shown that BAFF functions as a survival factor for MMCs. Furthermore, MMCs express several BAFF-binding receptors. Of these, only Transmembrane Activator and CAML Interactor (TACI) correlates with the MMC's capability to ligate BAFF. Additionally, the level of expression of TACI correlates with the level of the MMC's BM dependency. Ligation of BAFF receptors on MMCs causes activation of the Nuclear Factor of κ-B (NF-κB) pathway, a crucial pathway for the pathogenesis of many B-cell malignancies. Serum BAFF levels are significantly elevated in MM patients when compared to healthy controls, and correlate inversely with overall survival. BAFF signaling is thus an interesting target for the treatment of MM. Several BAFF-inhibitory drugs are currently under evaluation for the treatment of MM. These include BAFF-monoclonal antibodies (tabalumab) and antibody-drug conjugates (GSK2857916).

  15. Computational Prediction of Hot Spot Residues

    Science.gov (United States)

    Morrow, John Kenneth; Zhang, Shuxing

    2013-01-01

    Most biological processes involve multiple proteins interacting with each other. It has been recently discovered that certain residues in these protein-protein interactions, which are called hot spots, contribute more significantly to binding affinity than others. Hot spot residues have unique and diverse energetic properties that make them challenging yet important targets in the modulation of protein-protein complexes. Design of therapeutic agents that interact with hot spot residues has proven to be a valid methodology in disrupting unwanted protein-protein interactions. Using biological methods to determine which residues are hot spots can be costly and time consuming. Recent advances in computational approaches to predict hot spots have incorporated a myriad of features, and have shown increasing predictive successes. Here we review the state of knowledge around protein-protein interactions, hot spots, and give an overview of multiple in silico prediction techniques of hot spot residues. PMID:22316154
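
    As a purely illustrative companion to the feature-based predictors surveyed in this record, the sketch below trains a random forest on a synthetic table of per-residue descriptors (stand-ins for features such as solvent accessibility, conservation, and interface contacts) to label hot spots; the feature names, labels, and threshold are invented, not any published feature set.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Toy per-residue feature matrix: columns stand for hypothetical descriptors
# such as relative solvent accessibility, conservation score, and number of
# inter-chain contacts. Labels: 1 = hot spot, 0 = not a hot spot.
n_residues = 300
X = rng.random((n_residues, 3))
y = (0.5 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0, 0.2, n_residues) > 1.2).astype(int)

# A simple feature-based classifier of the kind many in silico predictors use.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```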

  16. COMPUTER-MEDIATED COMMUNICATION IN FOREIGN LANGUAGE EDUCATION: Use of Target Language and Learner Perceptions

    Directory of Open Access Journals (Sweden)

    Nesrin OZDENER

    2008-04-01

    Full Text Available Among the challenges many teachers face in facilitating the improvement of speaking skills are sparing sufficient time for practice to enable students to achieve fluency in speaking through internalizing the structures, and establishing a balance between fluency and accuracy. This study aimed to seek an answer to the question of whether computer-mediated communication technologies could be a solution for overcoming these problems. The study was conducted as additional practice to the foreign language lessons with the participation of 60 students. Task-based language teaching principles were taken as the basis for preparing the teaching materials in the study, in which text and voice chat applications among the computer-mediated communication technologies were used. During the applications, data were collected in several ways: participants' perspectives regarding their changing experiences and the types of tasks used were investigated through the use of open-ended questionnaires after each session; a general insight was obtained into the students' experiences with close-ended questionnaires given at the end of the study; and the use of the target language in communications among students was determined by investigating the text communication logs. From a user-oriented perspective, the results of the study shed light on the strategies that can be used in computer-mediated communication technologies while valuing the experiences and perceptions of the learners.

  17. A survey of current trends in computational drug repositioning.

    Science.gov (United States)

    Li, Jiao; Zheng, Si; Chen, Bin; Butte, Atul J; Swamidass, S Joshua; Lu, Zhiyong

    2016-01-01

    Computational drug repositioning or repurposing is a promising and efficient tool for discovering new uses for existing drugs and holds great potential for precision medicine in the age of big data. The explosive growth of large-scale genomic and phenotypic data, as well as data on small-molecule compounds with granted regulatory approval, is enabling new developments for computational repositioning. To achieve the shortest path toward new drug indications, advanced data processing and analysis strategies are critical for making sense of these heterogeneous molecular measurements. In this review, we present recent advancements in the critical areas of computational drug repositioning from multiple aspects. First, we summarize available data sources and the corresponding computational repositioning strategies. Second, we characterize the commonly used computational techniques. Third, we discuss validation strategies for repositioning studies, including both computational and experimental methods. Finally, we highlight potential opportunities and use-cases, including a few target areas such as cancers. We conclude with a brief discussion of the remaining challenges in computational drug repositioning. Published by Oxford University Press 2015. This work is written by US Government employees and is in the public domain in the US.

  18. An Automatic Multi-Target Independent Analysis Framework for Non-Planar Infrared-Visible Registration.

    Science.gov (United States)

    Sun, Xinglong; Xu, Tingfa; Zhang, Jizhou; Zhao, Zishu; Li, Yuankun

    2017-07-26

    In this paper, we propose a novel automatic multi-target registration framework for non-planar infrared-visible videos. Previous approaches usually analyzed multiple targets together and then estimated a global homography for the whole scene; however, such approaches cannot achieve precise multi-target registration when the scene is non-planar. Our framework is devoted to solving the problem using feature matching and multi-target tracking. The key idea is to analyze and register each target independently. We present a fast and robust feature matching strategy, where only the features on the corresponding foreground pairs are matched. In addition, new reservoirs based on the Gaussian criterion are created for all targets, and a multi-target tracking method is adopted to determine the relationships between the reservoirs and foreground blobs. With the matches in the corresponding reservoir, the homography of each target is computed according to its moving state. We tested our framework on both public near-planar and non-planar datasets. The results demonstrate that the proposed framework outperforms the state-of-the-art global registration method and the manual global registration matrix in all tested datasets.
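
    The per-target registration step can be sketched with OpenCV as follows: features are matched only on a corresponding pair of foreground masks, and a homography is fitted for that single target with RANSAC. The function below is a minimal sketch under that assumption; image, mask, and threshold choices are placeholders rather than the authors' implementation.

```python
import cv2
import numpy as np

def register_single_target(ir_img, vis_img, ir_mask, vis_mask):
    """Estimate a homography for one target using only the features that
    fall on its foreground mask in each modality (illustrative sketch)."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(ir_img, ir_mask)
    kp2, des2 = orb.detectAndCompute(vis_img, vis_mask)
    if des1 is None or des2 is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:50]
    if len(matches) < 4:
        return None  # a homography needs at least 4 correspondences
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H  # homography for this target only, not a global scene model
```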

  19. Moving target detection based on temporal-spatial information fusion for infrared image sequences

    Science.gov (United States)

    Toing, Wu-qin; Xiong, Jin-yu; Zeng, An-jun; Wu, Xiao-ping; Xu, Hao-peng

    2009-07-01

    Moving target detection and localization are among the most fundamental tasks in visual surveillance. In this paper, through analyzing the advantages and disadvantages of traditional approaches to moving target detection, a novel approach based on temporal-spatial information fusion is proposed. The proposed method combines the spatial features within a single frame and the temporal properties across multiple frames of a moving-target image sequence. First, the method uses spatial image segmentation to separate the target from the background and uses the local temporal variance to extract targets and remove the trail artifact. Second, the logical "and" operator is used to fuse the temporal and spatial information. Finally, morphological filtering and blob analysis are applied to the fused image sequence to extract the exact moving target. The algorithm not only requires minimal computation and memory but also quickly adapts to changes of background and environment. Compared with other methods, such as the KDE, the Mixture of K Gaussians, etc., the simulation results show the proposed method has better validity and higher adaptability for moving target detection, especially in infrared image sequences with complex illumination change, noise change, and so on.
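
    A minimal sketch of the fusion idea described here, assuming NumPy/OpenCV and a grayscale infrared sequence: a spatial mask from per-frame intensity segmentation is combined by a logical AND with a temporal mask from local temporal variance, then cleaned by morphological filtering before blob analysis. Thresholds and the simple segmentation rule are illustrative, not the paper's exact algorithm.

```python
import numpy as np
import cv2

def detect_moving_targets(frames, spatial_thresh=200, var_thresh=100.0):
    """frames: (T, H, W) uint8 infrared sequence. Returns the fused binary
    mask for the last frame and the detected blob centroids (illustrative)."""
    frames = frames.astype(np.float32)
    # Spatial cue: simple intensity segmentation of the current frame.
    spatial_mask = frames[-1] > spatial_thresh
    # Temporal cue: per-pixel variance over the sliding window of frames.
    temporal_mask = frames.var(axis=0) > var_thresh
    fused = np.logical_and(spatial_mask, temporal_mask).astype(np.uint8) * 255
    # Morphological opening removes isolated noise pixels before blob analysis.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    fused = cv2.morphologyEx(fused, cv2.MORPH_OPEN, kernel)
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(fused)
    return fused, centroids[1:]  # skip the background component
```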

  20. Computing multiple-output regression quantile regions

    Czech Academy of Sciences Publication Activity Database

    Paindaveine, D.; Šiman, Miroslav

    2012-01-01

    Vol. 56, No. 4 (2012), pp. 840-853 ISSN 0167-9473 R&D Projects: GA MŠk(CZ) 1M06047 Institutional research plan: CEZ:AV0Z10750506 Keywords: halfspace depth * multiple-output regression * parametric linear programming * quantile regression Subject RIV: BA - General Mathematics Impact factor: 1.304, year: 2012 http://library.utia.cas.cz/separaty/2012/SI/siman-0376413.pdf

  1. Silver nanoclusters-assisted ion-exchange reaction with CdTe quantum dots for photoelectrochemical detection of adenosine by target-triggering multiple-cycle amplification strategy.

    Science.gov (United States)

    Zhao, Yang; Tan, Lu; Gao, Xiaoshan; Jie, Guifen; Huang, Tingyu

    2018-07-01

    Herein, we successfully devised a novel photoelectrochemical (PEC) platform for ultrasensitive detection of adenosine by target-triggering cascade multiple cycle amplification based on the silver nanocluster-assisted ion-exchange reaction with CdTe quantum dots (QDs). In the presence of target adenosine, DNA s1 is released from the aptamer and then hybridizes with hairpin DNA (HP1), which initiates the cycling cleavage process under the action of a nicking endonuclease. Then the product (DNA b) of cycle I acts as the "DNA trigger" of cycle II to further generate a large number of DNA s1, which again go back to cycle I; thus, a cascade multiple DNA cycle amplification was carried out to produce abundant DNA c. These DNA c fragments with the cytosine (C)-rich loop were captured by magnetic beads, and numerous silver nanoclusters (Ag NCs) were synthesized from AgNO3 and sodium borohydride. The dissolved AgNCs released numerous silver ions, which induced an ion-exchange reaction with the CdTe QDs, resulting in a greatly amplified photocurrent change for target detection. The linear detection range for adenosine was 1.0 fM to 10 nM, with a detection limit of 0.5 fM. The present PEC strategy, combining cascade multiple DNA cycle amplification and AgNCs-induced ion-exchange reaction with QDs, provides new insight into rapid and ultrasensitive PEC detection of different biomolecules and shows great potential for detecting trace amounts in bioanalysis and clinical biomedicine. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. Computational identification of conserved microRNAs and their targets from expression sequence tags of blueberry (Vaccinium corybosum).

    Science.gov (United States)

    Li, Xuyan; Hou, Yanming; Zhang, Li; Zhang, Wenhao; Quan, Chen; Cui, Yuhai; Bian, Shaomin

    2014-01-01

    MicroRNAs (miRNAs) are a class of endogenous non-coding RNAs, approximately 21 nt in length, which mediate the expression of target genes primarily at post-transcriptional levels. miRNAs play critical roles in almost all plant cellular and metabolic processes. Although numerous miRNAs have been identified in the plant kingdom, the miRNAs in blueberry, which is an economically important small fruit crop, remain unknown. In this study, we report a computational identification of miRNAs and their targets in blueberry. Using an EST-based comparative genomics approach, 9 potential vco-miRNAs were discovered from 22,402 blueberry ESTs according to a series of filtering criteria, designated as vco-miR156-5p, vco-miR156-3p, vco-miR1436, vco-miR1522, vco-miR4495, vco-miR5120, vco-miR5658, vco-miR5783, and vco-miR5986. Based on sequence complementarity between a miRNA and its target transcript, 34 target ESTs from blueberry and 70 targets from other species were identified for the vco-miRNAs. The targets were found to be involved in transcription, RNA splicing and binding, DNA duplication, signal transduction, transport and trafficking, stress response, as well as synthesis and metabolic processes. These findings will greatly contribute to future research in regard to functions and regulatory mechanisms of blueberry miRNAs.
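
    The complementarity criterion at the heart of EST-based target prediction can be illustrated with a toy sketch: slide a candidate miRNA along a transcript and keep windows whose base pairing leaves few mismatches. The sequences and the mismatch cutoff below are invented for illustration and do not reproduce the filtering criteria used in this study.

```python
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def mismatches(mirna, site):
    """Count non-Watson-Crick positions between a miRNA (5'->3') and a
    candidate site read antiparallel (G:U wobbles counted as mismatches here)."""
    return sum(COMPLEMENT[m] != s for m, s in zip(mirna, reversed(site)))

def find_target_sites(mirna, transcript, max_mismatch=3):
    """Return (position, mismatch count) for every window of the transcript
    sufficiently complementary to the miRNA (illustrative criterion only)."""
    k = len(mirna)
    hits = []
    for i in range(len(transcript) - k + 1):
        mm = mismatches(mirna, transcript[i:i + k])
        if mm <= max_mismatch:
            hits.append((i, mm))
    return hits

# Invented 20-nt miRNA and toy transcript with two perfect target sites.
mirna = "UGACAGAAGAGAGUGAGCAC"
site = "GUGCUCACUCUCUUCUGUCA"        # perfect complement of the miRNA above
transcript = site + "AAAAG" + site   # toy transcript, not a real blueberry EST
print(find_target_sites(mirna, transcript))
```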

  3. Computing multiple periodic solutions of nonlinear vibration problems using the harmonic balance method and Groebner bases

    Science.gov (United States)

    Grolet, Aurelien; Thouverez, Fabrice

    2015-02-01

    This paper is devoted to the study of vibration of mechanical systems with geometric nonlinearities. The harmonic balance method is used to derive systems of polynomial equations whose solutions give the frequency components of the possible steady states. Groebner basis methods are used for computing all solutions of polynomial systems. This approach allows the complete system to be reduced to a single polynomial equation in one variable from which all solutions of the problem are derived. In addition, in order to decrease the number of variables, we propose to first work on the undamped system and recover solutions of the damped system using a continuation on the damping parameter. The search for multiple solutions is illustrated on a simple system, where the influence of the number of retained harmonics is studied. Finally, the procedure is applied to a simple cyclic system and we give a representation of the multiple states versus frequency.
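
    A small illustration of the reduction described above, using SymPy and an invented two-variable polynomial system standing in for the harmonic-balance equations of a real structure: a lexicographic Groebner basis triangularises the system so that all steady states stem from the real roots of a single univariate polynomial.

```python
import sympy as sp

a, b = sp.symbols("a b")

# Invented polynomial system of the kind produced by harmonic balance with a
# cubic (geometric) nonlinearity; it is not taken from the paper's test case.
eqs = [a**3 + a*b**2 - 2*a + sp.Rational(1, 2),
       b**3 + a**2*b - b]

# A lexicographic Groebner basis triangularises the system: its last
# generator involves only b, so every steady state stems from a real root
# of one univariate polynomial.
G = sp.groebner(eqs, a, b, order="lex")
univariate = G.exprs[-1]
print("univariate generator:", sp.factor(univariate))
print("number of real roots:", sp.Poly(univariate, b).count_roots())
```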

  4. Integration of multiple determinants in the neuronal computation of economic values.

    Science.gov (United States)

    Raghuraman, Anantha P; Padoa-Schioppa, Camillo

    2014-08-27

    Economic goods may vary on multiple dimensions (determinants). A central conjecture in decision neuroscience is that choices between goods are made by comparing subjective values computed through the integration of all relevant determinants. Previous work identified three groups of neurons in the orbitofrontal cortex (OFC) of monkeys engaged in economic choices: (1) offer value cells, which encode the value of individual offers; (2) chosen value cells, which encode the value of the chosen good; and (3) chosen juice cells, which encode the identity of the chosen good. In principle, these populations could be sufficient to generate a decision. Critically, previous work did not assess whether offer value cells (the putative input to the decision) indeed encode subjective values as opposed to physical properties of the goods, and/or whether offer value cells integrate multiple determinants. To address these issues, we recorded from the OFC while monkeys chose between risky outcomes. Confirming previous observations, three populations of neurons encoded the value of individual offers, the value of the chosen option, and the value-independent choice outcome. The activity of both offer value cells and chosen value cells encoded values defined by the integration of juice quantity and probability. Furthermore, both populations reflected the subjective risk attitude of the animals. We also found additional groups of neurons encoding the risk associated with a particular option, the risky nature of the chosen option, and whether the trial outcome was positive or negative. These results provide substantial support for the conjecture described above and for the involvement of OFC in good-based decisions. Copyright © 2014 the authors 0270-6474/14/3311583-21$15.00/0.

  5. Drug-target interaction prediction via class imbalance-aware ensemble learning.

    Science.gov (United States)

    Ezzat, Ali; Wu, Min; Li, Xiao-Li; Kwoh, Chee-Keong

    2016-12-22

    Multiple computational methods for predicting drug-target interactions have been developed to facilitate the drug discovery process. These methods use available data on known drug-target interactions to train classifiers with the purpose of predicting new undiscovered interactions. However, a key challenge regarding this data that has not yet been addressed by these methods, namely class imbalance, is potentially degrading the prediction performance. Class imbalance can be divided into two sub-problems. Firstly, the number of known interacting drug-target pairs is much smaller than that of non-interacting drug-target pairs. This imbalance ratio between interacting and non-interacting drug-target pairs is referred to as the between-class imbalance. Between-class imbalance degrades prediction performance due to the bias in prediction results towards the majority class (i.e. the non-interacting pairs), leading to more prediction errors in the minority class (i.e. the interacting pairs). Secondly, there are multiple types of drug-target interactions in the data with some types having relatively fewer members (or are less represented) than others. This variation in representation of the different interaction types leads to another kind of imbalance referred to as the within-class imbalance. In within-class imbalance, prediction results are biased towards the better represented interaction types, leading to more prediction errors in the less represented interaction types. We propose an ensemble learning method that incorporates techniques to address the issues of between-class imbalance and within-class imbalance. Experiments show that the proposed method improves results over 4 state-of-the-art methods. In addition, we simulated cases for new drugs and targets to see how our method would perform in predicting their interactions. New drugs and targets are those for which no prior interactions are known. Our method displayed satisfactory prediction performance and was
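
    One common way to neutralise between-class imbalance inside an ensemble (not necessarily the authors' exact algorithm) is to train each base learner on all minority-class pairs plus an equally sized random draw of majority-class pairs and average the members' scores; the sketch below illustrates this on synthetic data.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic drug-target pair features; 1 = interacting (rare), 0 = non-interacting.
X = rng.normal(size=(2000, 10))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 2000) > 2.2).astype(int)

def balanced_ensemble(X, y, n_members=25):
    """Train one tree per balanced resample: all positives plus an equally
    sized random draw of negatives (undersampling the majority class)."""
    pos, neg = np.flatnonzero(y == 1), np.flatnonzero(y == 0)
    members = []
    for _ in range(n_members):
        idx = np.concatenate([pos, rng.choice(neg, size=len(pos), replace=False)])
        members.append(DecisionTreeClassifier(max_depth=5).fit(X[idx], y[idx]))
    return members

def predict_proba(members, X):
    # Average the positive-class probability over all ensemble members.
    return np.mean([m.predict_proba(X)[:, 1] for m in members], axis=0)

members = balanced_ensemble(X, y)
print("mean score, interacting pairs:    ", predict_proba(members, X[y == 1]).mean())
print("mean score, non-interacting pairs:", predict_proba(members, X[y == 0]).mean())
```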

  6. Application of Soft Computing Techniques and Multiple Regression Models for CBR prediction of Soils

    Directory of Open Access Journals (Sweden)

    Fatimah Khaleel Ibrahim

    2017-08-01

    Full Text Available Soft computing techniques such as Artificial Neural Networks (ANN) have improved predictive capability and have found application in geotechnical engineering. The aim of this research is to utilize soft computing techniques and Multiple Regression Models (MLR) for forecasting the California bearing ratio (CBR) of soil from its index properties. The CBR of soil can be predicted from various soil-characterizing parameters with the assistance of MLR and ANN methods. The database was collected in the laboratory by conducting tests on 86 soil samples gathered from different projects in Basrah districts. Data gained from the experimental results were used in the regression models and in the soft computing technique using an artificial neural network. The liquid limit, plasticity index, modified compaction test and the CBR test were determined. In this work, different ANN and MLR models were formulated with different collections of inputs in order to recognize their significance in the prediction of CBR. The strengths of the developed models were examined in terms of the regression coefficient (R2), relative error (RE%) and mean square error (MSE) values. From the results of this paper, it was noticed that all the proposed ANN models perform better than the MLR model. In particular, the ANN model with all input parameters reveals better outcomes than the other ANN models.
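
    A hedged sketch of the ANN-versus-MLR comparison described above, using scikit-learn on synthetic stand-ins for the index properties (liquid limit, plasticity index, maximum dry density, optimum moisture content); the data, network size, and generating rule are illustrative only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(42)

# Synthetic soil records: columns stand for liquid limit, plasticity index,
# maximum dry density, and optimum moisture content; CBR follows an invented
# mildly nonlinear rule plus noise (86 samples, mirroring the study size).
n = 86
X = np.column_stack([rng.uniform(25, 60, n), rng.uniform(5, 30, n),
                     rng.uniform(1.5, 2.1, n), rng.uniform(8, 20, n)])
cbr = 2 + 8 * X[:, 2] - 0.1 * X[:, 1] + 0.002 * (X[:, 0] - 40) ** 2 + rng.normal(0, 0.5, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, cbr, test_size=0.25, random_state=0)

models = {
    "MLR": LinearRegression(),
    "ANN": make_pipeline(StandardScaler(),
                         MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)),
}
for name, model in models.items():
    pred = model.fit(X_tr, y_tr).predict(X_te)
    print(f"{name}: R2 = {r2_score(y_te, pred):.3f}, MSE = {mean_squared_error(y_te, pred):.3f}")
```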

  7. Novel maximum likelihood approach for passive detection and localisation of multiple emitters

    Science.gov (United States)

    Hernandez, Marcel

    2017-12-01

    In this paper, a novel target acquisition and localisation algorithm (TALA) is introduced that offers a capability for detecting and localising multiple targets using the intermittent "signals-of-opportunity" (e.g. acoustic impulses or radio frequency transmissions) they generate. The TALA is a batch estimator that addresses the complex multi-sensor/multi-target data association problem in order to estimate the locations of an unknown number of targets. The TALA is unique in that it does not require measurements to be of a specific type, and can be implemented for systems composed of either homogeneous or heterogeneous sensors. The performance of the TALA is demonstrated in simulated scenarios with a network of 20 sensors and up to 10 targets. The sensors generate angle-of-arrival (AOA), time-of-arrival (TOA), or hybrid AOA/TOA measurements. It is shown that the TALA is able to successfully detect 83-99% of the targets, with a negligible number of false targets declared. Furthermore, the localisation errors of the TALA are typically within 10% of the errors generated by a "genie" algorithm that is given the correct measurement-to-target associations. The TALA also performs well in comparison with an optimistic Cramér-Rao lower bound, with typical differences in performance of 10-20%, and differences in performance of 40-50% in the most difficult scenarios considered. The computational expense of the TALA is also controllable, which allows the TALA to maintain computational feasibility even in the most challenging scenarios considered. This allows the approach to be implemented in time-critical scenarios, such as in the localisation of artillery firing events. It is concluded that the TALA provides a powerful situational awareness aid for passive surveillance operations.
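
    As a much-reduced illustration of maximum-likelihood localisation from time-of-arrival measurements (a single emitter with known emission time, so none of the TALA's multi-emitter data association), the sketch below evaluates the Gaussian TOA likelihood on a grid and takes the argmax; the sensor layout, noise level, and grid are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
C = 343.0                      # propagation speed (m/s), acoustic example

# Invented sensor network and true emitter position.
sensors = rng.uniform(0, 1000, size=(20, 2))
emitter = np.array([420.0, 660.0])
sigma_t = 1e-3                 # TOA noise standard deviation (s)

t0 = 0.0                       # emission time assumed known here for brevity
toa = t0 + np.linalg.norm(sensors - emitter, axis=1) / C + rng.normal(0, sigma_t, len(sensors))

# Grid-search maximum likelihood: with Gaussian TOA errors the negative
# log-likelihood is a sum of squared time residuals.
xs = np.linspace(0, 1000, 201)
ys = np.linspace(0, 1000, 201)
gx, gy = np.meshgrid(xs, ys)
grid = np.stack([gx.ravel(), gy.ravel()], axis=1)

pred = np.linalg.norm(grid[:, None, :] - sensors[None, :, :], axis=2) / C + t0
nll = ((toa[None, :] - pred) ** 2).sum(axis=1)
best = grid[np.argmin(nll)]
print("true emitter:", emitter, " ML estimate:", best)
```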

  8. The computational form of craving is a selective multiplication of economic value.

    Science.gov (United States)

    Konova, Anna B; Louie, Kenway; Glimcher, Paul W

    2018-04-17

    Craving is thought to be a specific desire state that biases choice toward the desired object, be it chocolate or drugs. A vast majority of people report having experienced craving of some kind. In its pathological form craving contributes to health outcomes in addiction and obesity. Yet despite its ubiquity and clinical relevance we still lack a basic neurocomputational understanding of craving. Here, using an instantaneous measure of subjective valuation and selective cue exposure, we identify a behavioral signature of a food craving-like state and advance a computational framework for understanding how this state might transform valuation to bias choice. We find desire induced by exposure to a specific high-calorie, high-fat/sugar snack good is expressed in subjects' momentary willingness to pay for this good. This effect is selective but not exclusive to the exposed good; rather, we find it generalizes to nonexposed goods in proportion to their subjective attribute similarity to the exposed ones. A second manipulation of reward size (number of snack units available for purchase) further suggested that a multiplicative gain mechanism supports the transformation of valuation during laboratory craving. These findings help explain how real-world food craving can result in behaviors inconsistent with preferences expressed in the absence of craving and open a path for the computational modeling of craving-like phenomena using a simple and repeatable experimental tool for assessing subjective states in economic terms. Copyright © 2018 the Author(s). Published by PNAS.

  9. Reduced complexity FFT-based DOA and DOD estimation for moving target in bistatic MIMO radar

    KAUST Repository

    Ali, Hussain

    2016-06-24

    In this paper, we consider a bistatic multiple-input multiple-output (MIMO) radar. We propose a reduced complexity algorithm to estimate the direction-of-arrival (DOA) and direction-of-departure (DOD) for a moving target. We show that the parameter estimation can be expressed in terms of one-dimensional fast Fourier transforms, which drastically reduces the complexity of the optimization algorithm. The performance of the proposed algorithm is compared with the two-dimensional multiple signal classification (2D-MUSIC) and reduced-dimension MUSIC (RD-MUSIC) algorithms. Simulations show that our proposed algorithm has better estimation performance and lower computational complexity than the 2D-MUSIC and RD-MUSIC algorithms. Moreover, simulation results also show that the proposed algorithm achieves the Cramer-Rao lower bound. © 2016 IEEE.
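
    The key observation, that matched filtering of a uniform-linear-array snapshot over angle can be evaluated with a single zero-padded FFT, can be sketched as follows for a one-dimensional receive array and one target; the bistatic DOD/DOA case treated in the paper applies the same idea in a second dimension. Array size, SNR, and angle below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

M = 16                     # receive elements, half-wavelength spacing
theta_true = 25.0          # target DOA in degrees
snr_db = 10.0

# Single-snapshot ULA model: x[m] = exp(j*pi*m*sin(theta)) + complex noise.
m = np.arange(M)
x = np.exp(1j * np.pi * m * np.sin(np.deg2rad(theta_true)))
x += (rng.normal(size=M) + 1j * rng.normal(size=M)) * 10 ** (-snr_db / 20) / np.sqrt(2)

# A zero-padded FFT evaluates the beamforming spectrum over the spatial
# frequency u = sin(theta) on a dense grid in one pass.
n_fft = 4096
spectrum = np.abs(np.fft.fft(x, n_fft)) ** 2
u = np.fft.fftfreq(n_fft) * 2            # spatial frequency axis, u = sin(theta)
valid = np.abs(u) <= 1
theta_hat = np.rad2deg(np.arcsin(u[valid][np.argmax(spectrum[valid])]))
print(f"true DOA = {theta_true:.1f} deg, FFT estimate = {theta_hat:.2f} deg")
```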

  10. A boundary integral method for numerical computation of radar cross section of 3D targets using hybrid BEM/FEM with edge elements

    Science.gov (United States)

    Dodig, H.

    2017-11-01

    This contribution presents a boundary integral formulation for the numerical computation of the time-harmonic radar cross section of 3D targets. The method relies on a hybrid edge-element BEM/FEM to compute near-field edge element coefficients that are associated with the near electric and magnetic fields at the boundary of the computational domain. A special boundary integral formulation is presented that computes the radar cross section directly from these edge element coefficients. Consequently, there is no need for a near-to-far-field transformation (NTFFT), which is a common step in RCS computations. It is demonstrated that the formulation yields accurate results for canonical models such as spheres, cubes, cones and pyramids. The method remains accurate even in the case of a dielectrically coated PEC sphere at an interior resonance frequency, which is a common problem for computational electromagnetics codes.

  11. HomoTarget: a new algorithm for prediction of microRNA targets in Homo sapiens.

    Science.gov (United States)

    Ahmadi, Hamed; Ahmadi, Ali; Azimzadeh-Jamalkandi, Sadegh; Shoorehdeli, Mahdi Aliyari; Salehzadeh-Yazdi, Ali; Bidkhori, Gholamreza; Masoudi-Nejad, Ali

    2013-02-01

    MiRNAs play an essential role in the networks of gene regulation by inhibiting the translation of target mRNAs. Several computational approaches have been proposed for the prediction of miRNA target genes. Reports reveal a large fraction of under-predicted or falsely predicted target genes. Thus, there is an imperative need to develop a computational method by which the target mRNAs of existing miRNAs can be correctly identified. In this study, a combined pattern recognition neural network (PRNN) and principal component analysis (PCA) architecture has been proposed in order to model the complicated relationship between miRNAs and their target mRNAs in humans. The results of several types of intelligent classifiers and our proposed model were compared, showing that our algorithm outperformed them with higher sensitivity and specificity. Using the recent release of the miRBase database to find potential targets of miRNAs, this model incorporated twelve structural, thermodynamic and positional features of miRNA:mRNA binding sites to select target candidates. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Burglar Target Selection

    Science.gov (United States)

    Townsley, Michael; Bernasco, Wim; Ruiter, Stijn; Johnson, Shane D.; White, Gentry; Baum, Scott

    2015-01-01

    Objectives: This study builds on research undertaken by Bernasco and Nieuwbeerta and explores the generalizability of a theoretically derived offender target selection model in three cross-national study regions. Methods: Taking a discrete spatial choice approach, we estimate the impact of both environment- and offender-level factors on residential burglary placement in the Netherlands, the United Kingdom, and Australia. Combining cleared burglary data from all study regions in a single statistical model, we make statistical comparisons between environments. Results: In all three study regions, the likelihood an offender selects an area for burglary is positively influenced by proximity to their home, the proportion of easily accessible targets, and the total number of targets available. Furthermore, in two of the three study regions, juvenile offenders under the legal driving age are significantly more influenced by target proximity than adult offenders. Post hoc tests indicate the magnitudes of these impacts vary significantly between study regions. Conclusions: While burglary target selection strategies are consistent with opportunity-based explanations of offending, the impact of environmental context is significant. As such, the approach undertaken in combining observations from multiple study regions may aid criminology scholars in assessing the generalizability of observed findings across multiple environments. PMID:25866418

  13. Legal Issues in Cyber Targeting

    DEFF Research Database (Denmark)

    Juhlin, Jonas Alastair

    Imagine this scenario: Two states are in armed conflict with each other. In order to gain an advantage, one side launches a cyber-attack against the opponent’s computer network. The malware paralyzes the military computer network, as intended, but it also spreads into the civilian … system, with physical damage to follow. This can happen, and the natural question arises: What must be considered lawful targeting according to international humanitarian law in cyber warfare? What steps must an attacker take to minimize the damage done to unlawful targets when conducting an offensive … operation? How can the attacker separate military targets from civilian targets in cyber space? This paper addresses these questions and argues that a network (civilian or military) consists of several software components and that it is the individual components that are the target. If the components are used...

  14. THE RELIABILITY OF IDENTIFICATION EVIDENCE WITH MULTIPLE LINEUPS

    Directory of Open Access Journals (Sweden)

    Nick J. Broers

    2013-01-01

    Full Text Available This study aimed to establish the diagnostic value of multiple lineup decisions made for portrait, body, and profile lineups, including multiple target/suspect choices, rejections, foil choices, and don’t know answers. A total of 192 participants identified a thief and a victim of theft from independent simultaneous target-present or target-absent 6-person portrait, body and profile lineups after watching one of two stimulus films. As hypothesized, multiple target/suspect choices had incriminating value whereas multiple rejections, foil choices, and don’t know answers had mostly exonerating value. For suspect choices, the combination of all three lineup modes consistently elicited high diagnosticities across targets. Combinations of non-suspect choices (rejections, foil choices, or don’t know answers) were less successful, and the different combinations showed less consistency in terms of diagnosticity. It was concluded that the use of multiple lineups, such as different facial poses and different aspects of a person, should be particularly beneficial in three situations: if a witness only saw the perpetrator from a pose different from the frontal view normally used for lineups; if one or more witnesses saw the perpetrator from more than one perspective; and if different witnesses saw the perpetrator from different perspectives.

  15. Multiple-Swarm Ensembles: Improving the Predictive Power and Robustness of Predictive Models and Its Use in Computational Biology.

    Science.gov (United States)

    Alves, Pedro; Liu, Shuang; Wang, Daifeng; Gerstein, Mark

    2018-01-01

    Machine learning is an integral part of computational biology, and has already shown its use in various applications, such as prognostic tests. In the last few years in the non-biological machine learning community, ensembling techniques have shown their power in data mining competitions such as the Netflix challenge; however, such methods have not found wide use in computational biology. In this work, we endeavor to show how ensembling techniques can be applied to practical problems, including problems in the field of bioinformatics, and how they often outperform other machine learning techniques in both predictive power and robustness. Furthermore, we develop a methodology of ensembling, Multi-Swarm Ensemble (MSWE) by using multiple particle swarm optimizations and demonstrate its ability to further enhance the performance of ensembles.

  16. Moving-Target Position Estimation Using GPU-Based Particle Filter for IoT Sensing Applications

    Directory of Open Access Journals (Sweden)

    Seongseop Kim

    2017-11-01

    Full Text Available A particle filter (PF) has been introduced for effective position estimation of moving targets in non-Gaussian and nonlinear systems. The time difference of arrival (TDOA) method using an acoustic sensor array has normally been used for estimation by concealing the location of a moving target, especially underwater. In this paper, we propose a GPU-based acceleration of target position estimation using a PF and propose an efficient system and software architecture. The proposed graphics processing unit (GPU)-based algorithm has more advantages in applying PF signal processing to a target system that consists of large-scale Internet of Things (IoT)-driven sensors because its parallelization is scalable. For the TDOA measurement from the acoustic sensor array, we use the generalized cross-correlation phase transform (GCC-PHAT) method to obtain the correlation coefficient of the signal using the Fast Fourier Transform (FFT), and we accelerate the calculation of GCC-PHAT-based TDOA measurements using the FFT with the GPU Compute Unified Device Architecture (CUDA). The proposed approach utilizes a parallelization method in the target position estimation algorithm using GPU-based PF processing. In addition, it can efficiently estimate sudden changes in target movement using GPU-based parallel computing, which can also be used for multiple target tracking. It also provides scalability in extending the detection algorithm according to the increase in the number of sensors. Therefore, the proposed architecture can be applied in IoT sensing applications with a large number of sensors. The target estimation algorithm was verified using MATLAB and implemented using GPU CUDA. We implemented the proposed signal processing acceleration system on the target GPU to analyze the execution time. The execution time of the algorithm is reduced by 55% compared to CPU standalone operation on the target embedded board, an NVIDIA Jetson TX1. Also, to apply large
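
    The GCC-PHAT step mentioned above is naturally expressed with FFTs; the NumPy sketch below estimates an integer-sample TDOA from synthetic signals (a GPU port would swap the FFT calls for a CUDA FFT, which is the acceleration this record pursues). Signal length, sampling rate, and delay are invented.

```python
import numpy as np

def gcc_phat(sig, ref, fs):
    """Return the TDOA (s) between sig and ref via GCC-PHAT.
    The PHAT weighting whitens the cross-spectrum so only phase remains."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n)
    REF = np.fft.rfft(ref, n)
    R = SIG * np.conj(REF)
    R /= np.maximum(np.abs(R), 1e-12)          # PHAT normalisation
    cc = np.fft.irfft(R, n)
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs

# Synthetic check: a noise burst delayed by 25 samples at 16 kHz.
rng = np.random.default_rng(0)
fs, delay = 16000, 25
s = rng.normal(size=4096)
delayed = np.concatenate([np.zeros(delay), s])[:len(s)]
print("estimated TDOA:", gcc_phat(delayed, s, fs), "s  (true:", delay / fs, "s)")
```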

  17. Automated target recognition and tracking using an optical pattern recognition neural network

    Science.gov (United States)

    Chao, Tien-Hsin

    1991-01-01

    The on-going development of an automatic target recognition and tracking system at the Jet Propulsion Laboratory is presented. This system is an optical pattern recognition neural network (OPRNN) that is an integration of an innovative optical parallel processor and a feature extraction based neural net training algorithm. The parallel optical processor provides high speed and vast parallelism as well as full shift invariance. The neural network algorithm enables simultaneous discrimination of multiple noisy targets despite variations in scale, rotation, perspective, and other deformations. This fully developed OPRNN system can be effectively utilized for the automated spacecraft recognition and tracking that will lead to success in the Automated Rendezvous and Capture (AR&C) of the unmanned Cargo Transfer Vehicle (CTV). One of the most powerful optical parallel processors for automatic target recognition is the multichannel correlator. With the inherent advantages of parallel processing capability and shift invariance, multiple objects can be simultaneously recognized and tracked using this multichannel correlator. This target tracking capability can be greatly enhanced by utilizing a powerful feature extraction based neural network training algorithm such as the neocognitron. The OPRNN, currently under investigation at JPL, is constructed with an optical multichannel correlator where holographic filters have been prepared using the neocognitron training algorithm. The computation speed of the neocognitron-type OPRNN is up to 10^14 analog connections/sec, enabling the OPRNN to outperform its state-of-the-art electronics counterpart by at least two orders of magnitude.

  18. Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation

    Directory of Open Access Journals (Sweden)

    Chua Kia

    2005-09-01

    Full Text Available This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation. This system is introduced to improve the level of automation of underwater Remote Operated Vehicle (ROV) operations. A prototype which combines computer vision with an underwater robotics system is successfully designed and developed to perform target tracking and intelligent navigation. This study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The vision system developed is capable of interpreting an underwater scene by extracting subjective uncertainties of the object of interest. Subjective uncertainties are further processed as multiple inputs of a fuzzy inference system that is capable of making crisp decisions concerning where to navigate. The important part of the image analysis is morphological filtering. The applications focus on binary images with the extension of gray-level concepts. An open-loop fuzzy control system is developed for classifying the traverse of terrain. The great achievement is the system's capability to recognize and perform target tracking of the object of interest (pipeline) in perspective view based on perceived condition. The effectiveness of this approach is demonstrated by computer and prototype simulations. This work originated from the desire to develop a robotics vision system with the ability to mimic the human expert's judgement and reasoning when maneuvering an ROV in the traverse of the underwater terrain.

  19. Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation

    Directory of Open Access Journals (Sweden)

    Chua Kia

    2008-11-01

    Full Text Available This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation. This system is introduced to improve the level of automation of underwater Remote Operated Vehicle (ROV) operations. A prototype which combines computer vision with an underwater robotics system is successfully designed and developed to perform target tracking and intelligent navigation. This study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The vision system developed is capable of interpreting an underwater scene by extracting subjective uncertainties of the object of interest. Subjective uncertainties are further processed as multiple inputs of a fuzzy inference system that is capable of making crisp decisions concerning where to navigate. The important part of the image analysis is morphological filtering. The applications focus on binary images with the extension of gray-level concepts. An open-loop fuzzy control system is developed for classifying the traverse of terrain. The great achievement is the system's capability to recognize and perform target tracking of the object of interest (pipeline) in perspective view based on perceived condition. The effectiveness of this approach is demonstrated by computer and prototype simulations. This work originated from the desire to develop a robotics vision system with the ability to mimic the human expert's judgement and reasoning when maneuvering an ROV in the traverse of the underwater terrain.

  20. Autonomous Shepherding Behaviors of Multiple Target Steering Robots.

    Science.gov (United States)

    Lee, Wonki; Kim, DaeEun

    2017-11-25

    This paper presents a distributed coordination methodology for multi-robot systems, based on nearest-neighbor interactions. Among many interesting tasks that may be performed using swarm robots, we propose a biologically-inspired control law for a shepherding task, whereby a group of external agents drives another group of agents to a desired location. First, we generated sheep-like robots that act like a flock. We assume that each agent is capable of measuring the relative location and velocity to each of its neighbors within a limited sensing area. Then, we designed a control strategy for shepherd-like robots that have information regarding where to go and a steering ability to control the flock, according to the robots' position relative to the flock. We define several independent behavior rules; each agent calculates to what extent it will move by summarizing each rule. The flocking sheep agents detect the steering agents and try to avoid them; this tendency leads to movement of the flock. Each steering agent only needs to focus on guiding the nearest flocking agent to the desired location. Without centralized coordination, multiple steering agents produce an arc formation to control the flock effectively. In addition, we propose a new rule for collecting behavior, whereby a scattered flock or multiple flocks are consolidated. From simulation results with multiple robots, we show that each robot performs actions for the shepherding behavior, and only a few steering agents are needed to control the whole flock. The results are displayed in maps that trace the paths of the flock and steering robots. Performance is evaluated via time cost and path accuracy to demonstrate the effectiveness of this approach.

  1. Autonomous Shepherding Behaviors of Multiple Target Steering Robots

    Directory of Open Access Journals (Sweden)

    Wonki Lee

    2017-11-01

    Full Text Available This paper presents a distributed coordination methodology for multi-robot systems, based on nearest-neighbor interactions. Among many interesting tasks that may be performed using swarm robots, we propose a biologically-inspired control law for a shepherding task, whereby a group of external agents drives another group of agents to a desired location. First, we generated sheep-like robots that act like a flock. We assume that each agent is capable of measuring the relative location and velocity to each of its neighbors within a limited sensing area. Then, we designed a control strategy for shepherd-like robots that have information regarding where to go and a steering ability to control the flock, according to the robots’ position relative to the flock. We define several independent behavior rules; each agent calculates to what extent it will move by summarizing each rule. The flocking sheep agents detect the steering agents and try to avoid them; this tendency leads to movement of the flock. Each steering agent only needs to focus on guiding the nearest flocking agent to the desired location. Without centralized coordination, multiple steering agents produce an arc formation to control the flock effectively. In addition, we propose a new rule for collecting behavior, whereby a scattered flock or multiple flocks are consolidated. From simulation results with multiple robots, we show that each robot performs actions for the shepherding behavior, and only a few steering agents are needed to control the whole flock. The results are displayed in maps that trace the paths of the flock and steering robots. Performance is evaluated via time cost and path accuracy to demonstrate the effectiveness of this approach.
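
    The rule-summation scheme described in this record can be sketched as a few vector rules evaluated per agent and summed each step; the weights, radii, and two-dimensional toy world below are invented rather than the authors' parameters.

```python
import numpy as np

rng = np.random.default_rng(1)

def flock_step(sheep, shepherds, dt=0.1, r_sense=3.0, w_coh=0.4, w_sep=1.0, w_fear=2.0):
    """One update of sheep-like agents: cohesion toward neighbours, separation
    from very close neighbours, and repulsion from nearby shepherds (illustrative rules)."""
    new = sheep.copy()
    for i, p in enumerate(sheep):
        vel = np.zeros(2)
        d_sheep = np.linalg.norm(sheep - p, axis=1)
        near = (d_sheep > 0) & (d_sheep < r_sense)
        if near.any():
            vel += w_coh * (sheep[near].mean(axis=0) - p)          # cohesion
            too_close = near & (d_sheep < 1.0)
            if too_close.any():
                vel += w_sep * (p - sheep[too_close].mean(axis=0)) # separation
        d_shep = np.linalg.norm(shepherds - p, axis=1)
        threat = d_shep < r_sense
        if threat.any():
            vel += w_fear * (p - shepherds[threat].mean(axis=0))   # flee the shepherds
        new[i] = p + dt * vel
    return new

sheep = rng.uniform(0, 10, size=(15, 2))
shepherds = np.array([[0.0, 0.0], [0.0, 10.0]])   # two steering agents behind the flock
for _ in range(100):
    sheep = flock_step(sheep, shepherds)
print("flock centroid after 100 steps:", sheep.mean(axis=0))
```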

  2. Computer Detection of Low Contrast Targets.

    Science.gov (United States)

    1982-06-18

    computed from the Hessian and the gradient and is given by the formula w(x) = −H_f(∇f(x), ∇f(x)) / |∇f(x)|³. Because of the amount of noise present in these ... π(n² + 1 + 2n cos t)^(1/2), and this integral is a maximum for n = 1 and decreases as n increases, exactly what a good measure of curvature should do

  3. Computed tomography contrast media extravasation: treatment algorithm and immediate treatment by squeezing with multiple slit incisions.

    Science.gov (United States)

    Kim, Sue Min; Cook, Kyung Hoon; Lee, Il Jae; Park, Dong Ha; Park, Myong Chul

    2017-04-01

    In our hospital, an adverse event reporting system was initiated that alerts the plastic surgery department immediately when contrast media extravasation injury is suspected. This system is particularly important for large-volume extravasation during power injector use. Between March 2011 and May 2015, a retrospective chart review was performed on all patients experiencing contrast media extravasation while being treated at our hospital. Immediate treatment by squeezing with multiple slit incisions was conducted for a portion of these patients. Eighty cases of extravasation were reported from approximately 218 000 computed tomography scans. In 23 patients, the expected extravasation volume was larger than 50 ml or severe pressure was felt on the affected limb. They were treated with multiple slit incisions followed by squeezing. Oedema of the affected limb disappeared 1-2 hours after treatment, and the skin incisions healed within a week. We propose a set of guidelines for the initial management of contrast media extravasation injuries to allow timely intervention. For large-volume extravasation cases, immediate management with multiple slit incisions is safe and effective in reducing the swelling quickly, preventing patient discomfort and decreasing skin and soft tissue problems. © 2016 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  4. Computational prediction of protein hot spot residues.

    Science.gov (United States)

    Morrow, John Kenneth; Zhang, Shuxing

    2012-01-01

    Most biological processes involve multiple proteins interacting with each other. It has been recently discovered that certain residues in these protein-protein interactions, which are called hot spots, contribute more significantly to binding affinity than others. Hot spot residues have unique and diverse energetic properties that make them challenging yet important targets in the modulation of protein-protein complexes. Design of therapeutic agents that interact with hot spot residues has proven to be a valid methodology in disrupting unwanted protein-protein interactions. Using biological methods to determine which residues are hot spots can be costly and time consuming. Recent advances in computational approaches to predict hot spots have incorporated a myriad of features, and have shown increasing predictive successes. Here we review the state of knowledge around protein-protein interactions, hot spots, and give an overview of multiple in silico prediction techniques of hot spot residues.

  5. Two adults with multiple disabilities use a computer-aided telephone system to make phone calls independently.

    Science.gov (United States)

    Lancioni, Giulio E; O'Reilly, Mark F; Singh, Nirbhay N; Sigafoos, Jeff; Oliva, Doretta; Alberti, Gloria; Lang, Russell

    2011-01-01

    This study extended the assessment of a newly developed computer-aided telephone system with two participants (adults) who presented with blindness or severe visual impairment and motor or motor and intellectual disabilities. For each participant, the study was carried out according to an ABAB design, in which A represented baseline phases and B represented intervention phases, during which the special telephone system was available. The system involved, among other components, a netbook computer provided with specific software, a global system for mobile communication modem, and a microswitch. Both participants learned to use the system very rapidly and managed to make phone calls independently to a variety of partners such as family members, friends and staff personnel. The results were discussed in terms of the technology under investigation (its advantages, drawbacks, and need for improvement) and the social-communication impact it can have for persons with multiple disabilities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Network motif-based identification of transcription factor-target gene relationships by integrating multi-source biological data

    Directory of Open Access Journals (Sweden)

    de los Reyes Benildo G

    2008-04-01

    Full Text Available Background: Integrating data from multiple global assays and curated databases is essential to understand the spatio-temporal interactions within cells. Different experiments measure cellular processes at various widths and depths, while databases contain biological information based on established facts or published data. Integrating these complementary datasets helps infer a mutually consistent transcriptional regulatory network (TRN) with strong similarity to the structure of the underlying genetic regulatory modules. Decomposing the TRN into a small set of recurring regulatory patterns, called network motifs (NM), facilitates the inference. Identifying NMs defined by specific transcription factors (TF) establishes the framework structure of a TRN and allows the inference of TF-target gene relationships. This paper introduces a computational framework for utilizing data from multiple sources to infer TF-target gene relationships on the basis of NMs. The data include time course gene expression profiles, genome-wide location analysis data, binding sequence data, and gene ontology (GO) information. Results: The proposed computational framework was tested using gene expression data associated with cell cycle progression in yeast. Among 800 cell cycle related genes, 85 were identified as candidate TFs and classified into four previously defined NMs. The NMs for a subset of TFs are obtained from the literature. Support vector machine (SVM) classifiers were used to estimate NMs for the remaining TFs. The potential downstream target genes for the TFs were clustered into 34 biologically significant groups. The relationships between TFs and potential target gene clusters were examined by training recurrent neural networks whose topologies mimic the NMs to which the TFs are classified. The identified relationships between TFs and gene clusters were evaluated using the following biological validation and statistical analyses: (1) Gene set enrichment

  7. Supertracker: A Programmable Parallel Pipeline Arithmetic Processor For Auto-Cueing Target Processing

    Science.gov (United States)

    Mack, Harold; Reddi, S. S.

    1980-04-01

    Supertracker represents a programmable parallel pipeline computer architecture that has been designed to meet the real-time image processing requirements of auto-cueing target data processing. The prototype breadboard currently under development will be designed to perform input video preprocessing and processing for 525-line and 875-line TV format FLIR video, automatic display gain and contrast control, and automatic target cueing, classification, and tracking. The video preprocessor is capable of performing operations on full frames of video data in real time, e.g., frame integration, storage, 3 x 3 convolution, and neighborhood processing. The processor architecture is being implemented using bit-slice microprogrammable arithmetic processors, operating in parallel. Each processor is capable of up to 20 million operations per second. Multiple frame memories are used for additional flexibility.

  8. Multiple-scattering analysis of laser-beam propagation in the atmosphere and through obscurants

    International Nuclear Information System (INIS)

    Zardecki, A.; Gerstl, S.A.W.

    1983-01-01

    The general purpose, discrete-ordinates transport code TWOTRAN is applied to describe the propagation and multiple scattering of a laser beam in a nonhomogeneous aerosol medium. For the medium composed of smoke, haze, and a rain cloud, the problem of the target detectability in a realistic atmospheric scenario is addressed and solved. The signals reflected from the target vs the signals scattered from the smoke cloud are analyzed as a function of the smoke concentration. By calculating the average intensity and a correction factor in the x-y and r-z geometries, the consistency of the rectangular and cylindrical geometry models is assessed. Received power for a detector with a small field of view is computed on a sphere of 1-km radius around the laser source for the Air Force Geophysics Laboratory rural aerosol model with extinction coefficients of 4 km⁻¹ and 10 km⁻¹. This computation allows us to study the received power as a function of the angle between the detector and source axes. The correction factor describing the multiple-scattering enhancement with respect to the simple Lambert-Beer law is introduced, and its calculation is employed to validate the use of the small-angle approximation for the transmissometer configuration. An outline of the theory for a finite field of view detector is followed by numerical results pertaining to the received power and intensity for various aerosol models. Recommendations regarding future work are also formulated
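
    As a point of reference for the correction factor described above, the short snippet below evaluates the plain (unscattered) Lambert-Beer transmittance over the 1-km path for the two quoted extinction coefficients. It is only an illustrative calculation of the single-scattering baseline; it does not reproduce any part of the TWOTRAN transport computation.

    ```python
    # Single-scattering Lambert-Beer baseline: T = exp(-k * L).
    # The 1-km path and the two extinction coefficients come from the
    # abstract; everything else is illustrative.
    import math

    path_km = 1.0
    for k in (4.0, 10.0):                   # extinction coefficients, km^-1
        T = math.exp(-k * path_km)          # transmittance along the path
        print(f"k = {k:4.1f} km^-1  ->  T = {T:.2e}")
    ```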

  9. The study of infrared target recognition at sea background based on visual attention computational model

    Science.gov (United States)

    Wang, Deng-wei; Zhang, Tian-xu; Shi, Wen-jun; Wei, Long-sheng; Wang, Xiao-ping; Ao, Guo-qing

    2009-07-01

    Infrared images at sea background are notorious for their low signal-to-noise ratio; therefore, target recognition in infrared imagery with traditional methods is very difficult. In this paper, we present a novel target recognition method based on the integration of a visual attention computational model and a conventional approach (selective filtering and segmentation). The two distinct image processing techniques are combined in a manner that exploits the strengths of both. The visual attention algorithm searches for salient regions automatically, represents them by a set of winner points, and delineates the salient regions as circles centered at these winner points. This provides a priori knowledge for the filtering and segmentation process. Based on each winner point, we construct a rectangular region to facilitate the filtering and segmentation; a labeling operation is then added selectively as required. Making use of the labeled information, we obtain from the final segmentation result the position of the region of interest, mark its centroid on the corresponding original image, and thus complete the localization of the target. The processing time depends not on the size of the image but on the number of salient regions, so the time consumed is greatly reduced. The method is applied to the recognition of several kinds of real infrared images, and the experimental results demonstrate the effectiveness of the algorithm presented in this paper.

  10. Multi-Robot, Multi-Target Particle Swarm Optimization Search in Noisy Wireless Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2009-05-01

    Multiple small robots (swarms) can work together using Particle Swarm Optimization (PSO) to perform tasks that are difficult or impossible for a single robot to accomplish. The problem considered in this paper is the exploration of an unknown environment with the goal of finding one or more targets at unknown locations using multiple small mobile robots. This work demonstrates the use of a distributed PSO algorithm with a novel adaptive received signal strength (RSS) weighting factor to guide robots in locating targets in high-risk environments. The approach was developed and analyzed for multi-robot search of single and multiple targets, and further extended to multi-robot, multi-target search in noisy environments. The experimental results demonstrate how the availability of the radio frequency signal can significantly affect the time robots need to reach a target.
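
    For readers unfamiliar with the core update rule, the sketch below shows a minimal, generic particle swarm optimization loop in which robot positions are driven toward a simulated target by a noisy received-signal-strength (RSS) fitness. The inverse-square RSS model, the parameter values, and the single-swarm (non-distributed) structure are all assumptions made for illustration; the paper's adaptive RSS weighting factor and distributed implementation are not reproduced here.

    ```python
    # Minimal PSO sketch for RSS-guided target search; illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    target = np.array([8.0, 3.0])            # true source position (unknown to the robots)

    def rss(pos):
        """Noisy inverse-square signal strength measured at a robot position."""
        d2 = np.sum((pos - target) ** 2) + 1e-6
        return 1.0 / d2 + rng.normal(0.0, 1e-3)

    n_robots, dims, iters = 6, 2, 200
    w, c1, c2 = 0.7, 1.5, 1.5                # inertia and acceleration weights

    pos = rng.uniform(0.0, 10.0, (n_robots, dims))
    vel = np.zeros((n_robots, dims))
    pbest, pbest_val = pos.copy(), np.array([rss(p) for p in pos])

    for _ in range(iters):
        gbest = pbest[np.argmax(pbest_val)]                  # swarm-best position
        r1, r2 = rng.random((n_robots, dims)), rng.random((n_robots, dims))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([rss(p) for p in pos])
        better = vals > pbest_val                            # update personal bests
        pbest[better], pbest_val[better] = pos[better], vals[better]

    print("best estimate of target location:", pbest[np.argmax(pbest_val)])
    ```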

  11. Parallel Computing Using Web Servers and "Servlets".

    Science.gov (United States)

    Lo, Alfred; Bloor, Chris; Choi, Y. K.

    2000-01-01

    Describes parallel computing and presents inexpensive ways to implement a virtual parallel computer with multiple Web servers. Highlights include performance measurement of parallel systems; models for using Java and intranet technology including single server, multiple clients and multiple servers, single client; and a comparison of CGI (common…

  12. Network Understanding of Herb Medicine via Rapid Identification of Ingredient-Target Interactions

    Science.gov (United States)

    Zhang, Hai-Ping; Pan, Jian-Bo; Zhang, Chi; Ji, Nan; Wang, Hao; Ji, Zhi-Liang

    2014-01-01

    Today, herb medicines have become a major source for the discovery of novel agents against disease. However, many of them are largely under-explored in pharmacology due to the limitations of current experimental approaches. Therefore, we proposed a computational framework in this study for network understanding of herb pharmacology via rapid identification of putative ingredient-target interactions at the human structural proteome level. A marketed anti-cancer herb medicine in China, Yadanzi (Brucea javanica), was chosen for mechanistic study. In total, 7,119 ingredient-target interactions were identified for thirteen Yadanzi active ingredients. Among them, about 29.5% were estimated to have better binding affinity than their corresponding marketed drug-target interactions. Further bioinformatics analyses suggest that simultaneous manipulation of multiple proteins in the MAPK signaling pathway and of the anti-apoptotic phosphorylation process may largely account for the activity of Yadanzi against non-small cell lung cancers. In summary, our strategy provides an efficient and economical solution for the systematic understanding of herbs' power.

  13. Targeting poly(ADP-ribose) polymerase partially contributes to bufalin-induced cell death in multiple myeloma cells.

    Directory of Open Access Journals (Sweden)

    He Huang

    Full Text Available Despite recent pharmaceutical advancements in therapeutic drugs, multiple myeloma (MM) remains an incurable disease. Recently, poly(ADP-ribose) polymerase 1 (PARP1) has been shown to be a potentially promising target for MM therapy. A previous report suggested that bufalin, a component of the traditional Chinese medicine "Chan Su", might target PARP1. However, this hypothesis has not been verified. We here showed that bufalin could inhibit PARP1 activity in vitro and reduce DNA-damage-induced poly(ADP-ribosyl)ation in MM cells. Molecular docking analysis revealed that the site of bufalin interaction is within the catalytic domain of PARP1. Thus, PARP1 is a putative target of bufalin. Furthermore, we showed, for the first time, that the proliferation of MM cell lines (NCI-H929, U266, RPMI8226 and MM.1S) and primary CD138+ MM cells could be inhibited by bufalin, mainly via apoptosis and G2-M phase cell cycle arrest. MM cell apoptosis was confirmed by apoptotic cell morphology, Annexin-V positive cells, and caspase-3 activation. We further evaluated the role of PARP1 in bufalin-induced apoptosis, discovering that PARP1 overexpression partially suppressed bufalin-induced cell death. Moreover, bufalin can act as a chemosensitizer to enhance the cell growth-inhibitory effects of topotecan, camptothecin, etoposide and vorinostat in MM cells. Collectively, our data suggest that bufalin is a novel PARP1 inhibitor and a potentially promising therapeutic agent against MM, alone or in combination with other drugs.

  14. Computational modeling and in-vitro/in-silico correlation of phospholipid-based prodrugs for targeted drug delivery in inflammatory bowel disease

    Science.gov (United States)

    Dahan, Arik; Markovic, Milica; Keinan, Shahar; Kurnikov, Igor; Aponick, Aaron; Zimmermann, Ellen M.; Ben-Shabat, Shimon

    2017-11-01

    Targeting drugs to the inflamed intestinal tissue(s) represents a major advancement in the treatment of inflammatory bowel disease (IBD). In this work we present a powerful in-silico modeling approach to guide the molecular design of novel prodrugs targeting the enzyme PLA2, which is overexpressed in the inflamed tissues of IBD patients. The prodrug consists of the drug moiety bound to the sn-2 position of a phospholipid (PL) through a carbonic linker, aiming to allow PLA2 to release the free drug. The linker length dictates the affinity of the PL-drug conjugate for PLA2, and the optimal linker will enable maximal PLA2-mediated activation. Thermodynamic integration and the Weighted Histogram Analysis Method (WHAM)/Umbrella Sampling method were used to compute the changes in PLA2 transition state binding free energy of the prodrug molecule (ΔΔGtr) associated with decreasing/increasing linker length. The simulations revealed that a 6-carbon linker is optimal, whereas shorter or longer linkers resulted in decreased PLA2-mediated activation. These in-silico results were shown to be in excellent correlation with experimental in-vitro data. Overall, this modern computational approach enables optimization of the molecular design of novel prodrugs, which may allow targeting the free drug specifically to the diseased intestinal tissue of IBD patients.
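
    To make the free-energy bookkeeping concrete, the sketch below shows how a thermodynamic-integration estimate is assembled from per-window averages of dU/dλ using the trapezoidal rule. The λ grid and the window averages are synthetic placeholders, not data from the study, and the actual work combines such estimates with WHAM/umbrella sampling rather than this bare integration.

    ```python
    # Toy thermodynamic integration: dG = integral over lambda of <dU/dlambda>,
    # evaluated with the trapezoidal rule. Window averages are made-up numbers.
    import numpy as np

    lambdas = np.linspace(0.0, 1.0, 11)                     # coupling-parameter windows
    dUdl = np.array([12.1, 10.4, 8.9, 7.1, 5.6, 4.0, 2.7, 1.5, 0.6, -0.2, -0.8])

    dG = np.sum(0.5 * (dUdl[1:] + dUdl[:-1]) * np.diff(lambdas))
    print(f"estimated dG = {dG:.2f} (same units as dU/dlambda)")
    ```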

  15. Functional Multiple-Set Canonical Correlation Analysis

    Science.gov (United States)

    Hwang, Heungsun; Jung, Kwanghee; Takane, Yoshio; Woodward, Todd S.

    2012-01-01

    We propose functional multiple-set canonical correlation analysis for exploring associations among multiple sets of functions. The proposed method includes functional canonical correlation analysis as a special case when only two sets of functions are considered. As in classical multiple-set canonical correlation analysis, computationally, the…

  16. Multiple target sound quality balance for hybrid electric powertrain noise

    Science.gov (United States)

    Mosquera-Sánchez, J. A.; Sarrazin, M.; Janssens, K.; de Oliveira, L. P. R.; Desmet, W.

    2018-01-01

    The integration of the electric motor to the powertrain in hybrid electric vehicles (HEVs) presents acoustic stimuli that elicit new perceptions. The large number of spectral components, as well as the wider bandwidth of this sort of noise, poses new challenges to current noise, vibration and harshness (NVH) approaches. This paper presents a framework for enhancing the sound quality (SQ) of the hybrid electric powertrain noise perceived inside the passenger compartment. Compared with current active sound quality control (ASQC) schemes, where the SQ improvement is just an effect of the control actions, the proposed technique features an optimization stage, which enables the NVH specialist to actively implement the amplitude balance of the tones that best fits the auditory expectations. Since Loudness, Roughness, Sharpness and Tonality are the most relevant SQ metrics for interior HEV noise, they are used as performance metrics in the concurrent optimization analysis, which, eventually, drives the control design method. Thus, multichannel active sound profiling systems that feature cross-channel compensation schemes are guided by the multi-objective optimization stage, by means of optimal sets of amplitude gain factors that can be implemented at each single sensor location, while minimizing cross-channel effects that can either degrade the original SQ condition, or even hinder the implementation of independent SQ targets. The proposed framework is verified experimentally, with realistic stationary hybrid electric powertrain noise, showing SQ enhancement for multiple locations within a scaled vehicle mock-up. The results show total success rates in excess of 90%, which indicates that the proposed method is promising, not only for the improvement of the SQ of HEV noise, but also for a variety of periodic disturbances with similar features.

  17. Drug-Target Interactions: Prediction Methods and Applications.

    Science.gov (United States)

    Anusuya, Shanmugam; Kesherwani, Manish; Priya, K Vishnu; Vimala, Antonydhason; Shanmugam, Gnanendra; Velmurugan, Devadasan; Gromiha, M Michael

    2018-01-01

    Identifying the interactions between drugs and target proteins is a key step in drug discovery. This not only aids in understanding the disease mechanism, but also helps to identify unexpected therapeutic activity or adverse side effects of drugs. Hence, drug-target interaction prediction has become an essential tool in the field of drug repurposing. The availability of heterogeneous biological data on known drug-target interactions has enabled many researchers to develop various computational methods to decipher unknown drug-target interactions. This review provides an overview of these computational methods for predicting drug-target interactions, along with available webservers and databases for drug-target interactions. Further, the applicability of drug-target interactions in various diseases for identifying lead compounds has been outlined. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  18. The study of Kruskal's and Prim's algorithms on the Multiple Instruction and Single Data stream computer system

    Directory of Open Access Journals (Sweden)

    A. Yu. Popov

    2015-01-01

    Full Text Available Bauman Moscow State Technical University is implementing a project to develop the operating principles of a computer system with a radically new architecture. A working model of the system has allowed us to evaluate the efficiency of the developed hardware and software. The experimental results presented in previous studies, as well as the analysis of the operating principles of the new computer system, permit us to draw conclusions regarding its efficiency in solving discrete optimization problems related to the processing of sets. The new architecture is based on direct hardware support of operations of discrete mathematics, which is reflected in the use of special facilities for processing sets and data structures. Within the framework of the project a special device was designed, i.e., a structure processor (SP), which improved the performance without limiting the scope of applications of such a computer system. Previous works presented the basic principles of the organization of the computational process in the MISD (Multiple Instructions, Single Data) system and showed the structure and features of the structure processor and the general principles of solving discrete optimization problems on graphs. This paper examines two minimum spanning tree search algorithms, namely Kruskal's and Prim's algorithms. It studies implementations of the algorithms for two SP operation modes: the coprocessor mode and the MISD mode. The paper presents results of an experimental comparison of MISD system performance in coprocessor mode with mainframes.
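
    For reference, the sketch below is a standard heap-based Prim's algorithm on an adjacency-list graph; it is included only to illustrate the kind of set- and graph-processing workload the abstract discusses, and it says nothing about how the structure processor implements these operations in hardware. The example graph is made up.

    ```python
    # Minimal Prim's algorithm (binary heap), illustrative only.
    import heapq

    def prim_mst(graph, start=0):
        """graph: dict mapping vertex -> list of (weight, neighbour) pairs."""
        visited = {start}
        heap = list(graph[start])
        heapq.heapify(heap)
        mst_edges, total = [], 0
        while heap and len(visited) < len(graph):
            w, v = heapq.heappop(heap)
            if v in visited:
                continue                      # edge leads back into the tree
            visited.add(v)
            mst_edges.append((w, v))
            total += w
            for edge in graph[v]:
                if edge[1] not in visited:
                    heapq.heappush(heap, edge)
        return mst_edges, total

    g = {0: [(4, 1), (1, 2)], 1: [(4, 0), (2, 2), (5, 3)],
         2: [(1, 0), (2, 1), (8, 3)], 3: [(5, 1), (8, 2)]}
    print(prim_mst(g))                        # ([(1, 2), (2, 1), (5, 3)], 8)
    ```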

  19. Fusion target design

    International Nuclear Information System (INIS)

    Bangerter, R.O.

    1978-01-01

    Most detailed fusion target design is done by numerical simulation using large computers. Although numerical simulation is briefly discussed, this lecture deals primarily with the way in which basic physical arguments, driver technology considerations and economical power production requirements are used to guide and augment the simulations. Physics topics discussed include target energetics, preheat, stability and symmetry. A specific design example is discussed

  20. Role of whole-body 64-slice multidetector computed tomography in treatment planning for multiple myeloma.

    Science.gov (United States)

    Razek, Ahmed Abdel Khalek Abdel; Ezzat, Amany; Azmy, Emad; Tharwat, Nehal

    2013-08-01

    The authors evaluated the role of whole-body 64-slice multidetector computed tomography (WB-MDCT) in treatment planning for multiple myeloma. This was a prospective study of 28 consecutive patients with multiple myeloma (19 men, nine women; age range, 51-73 years; mean age, 60 years) who underwent WB-MDCT and conventional radiography (CR) of the skeleton. The images were interpreted for the presence of bony lesions, medullary lesions, fractures and extraosseous lesions. We evaluated any changes in treatment planning as a result of WB-MDCT findings. WB-MDCT was superior to CR for detecting bony lesions (p=0.001), especially of the spine (p=0.001) and thoracic cage (p=0.006). WB-MDCT upstaged 14 patients, with a significant difference in staging (p=0.002) between WB-MDCT and CR. Medullary involvement, either focal (n=6) or diffuse (n=3), had a positive correlation with the overall score (r=0.790) and stage (r=0.618) of disease. Spine fractures were better detected at WB-MDCT (n=4) than at CR (n=2). Extraosseous soft tissue lesions (n=7) were detected only at WB-MDCT. Findings detected at WB-MDCT led to changes in the patient's treatment plan in 39% of cases. Upstaging of seven patients (25%) altered the medical treatment plan, and four of 28 (14%) patients required additional radiotherapy (7%) and vertebroplasty (7%). We conclude that WB-MDCT has an impact on treatment planning and prognosis in patients with multiple myeloma, as it has a high rate of detecting cortical and medullary bone lesions, spinal fracture and extraosseous lesions. This information may alter treatment planning in multiple myeloma due to disease upstaging and detection of spine fracture and extraosseous spinal lesions.

  1. Multiple-instance learning for computer-aided detection of tuberculosis

    Science.gov (United States)

    Melendez, J.; Sánchez, C. I.; Philipsen, R. H. H. M.; Maduskar, P.; van Ginneken, B.

    2014-03-01

    Detection of tuberculosis (TB) on chest radiographs (CXRs) is a hard problem. Therefore, to help radiologists or even take their place when they are not available, computer-aided detection (CAD) systems are being developed. In order to reach a performance comparable to that of human experts, the pattern recognition algorithms of these systems are typically trained on large CXR databases that have been manually annotated to indicate the abnormal lung regions. However, manually outlining those regions constitutes a time-consuming process that, besides, is prone to inconsistencies and errors introduced by interobserver variability and the absence of an external reference standard. In this paper, we investigate an alternative pattern classification method, namely multiple-instance learning (MIL), that does not require such detailed information for a CAD system to be trained. We have applied this alternative approach to a CAD system aimed at detecting textural lesions associated with TB. Only the case (or image) condition (normal or abnormal) was provided in the training stage. We compared the resulting performance with those achieved by several variations of a conventional system trained with detailed annotations. A database of 917 CXRs was constructed for experimentation. It was divided into two roughly equal parts that were used as training and test sets. The area under the receiver operating characteristic curve was utilized as a performance measure. Our experiments show that, by applying the investigated MIL approach, comparable results as with the aforementioned conventional systems are obtained in most cases, without requiring condition information at the lesion level.
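
    To illustrate the bag-level training idea described above, the sketch below implements a naive multiple-instance learning baseline on synthetic data: every instance inherits its bag's (image-level) label for training, and at test time a bag is scored by the maximum instance probability. This generic baseline is only meant to show how training can proceed without lesion-level annotations; it is not the classifier or the features used in the paper.

    ```python
    # Naive MIL baseline: instance classifier + max-pooling over bags.
    # All data are synthetic; features and labels are illustrative.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)

    def make_bag(abnormal, n_inst=20, n_feat=5):
        X = rng.normal(0.0, 1.0, (n_inst, n_feat))
        if abnormal:
            X[:3] += 2.0                      # a few "lesion" instances are shifted
        return X

    bags = [make_bag(i % 2 == 1) for i in range(40)]
    bag_labels = np.array([i % 2 for i in range(40)])

    # Training: every instance gets its bag's label (no lesion outlines needed).
    X_train = np.vstack(bags)
    y_train = np.repeat(bag_labels, [len(b) for b in bags])
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Inference: bag score = maximum instance probability.
    scores = np.array([clf.predict_proba(b)[:, 1].max() for b in bags])
    print("mean score, abnormal bags:", scores[bag_labels == 1].mean())
    print("mean score, normal bags:  ", scores[bag_labels == 0].mean())
    ```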

  2. M402, a novel heparan sulfate mimetic, targets multiple pathways implicated in tumor progression and metastasis.

    Directory of Open Access Journals (Sweden)

    He Zhou

    Full Text Available Heparan sulfate proteoglycans (HSPGs) play a key role in shaping the tumor microenvironment by presenting growth factors, cytokines, and other soluble factors that are critical for host cell recruitment and activation, as well as promoting tumor progression, metastasis, and survival. M402 is a rationally engineered, non-cytotoxic heparan sulfate (HS) mimetic, designed to inhibit multiple factors implicated in tumor-host cell interactions, including VEGF, FGF2, SDF-1α, P-selectin, and heparanase. A single s.c. dose of M402 effectively inhibited seeding of B16F10 murine melanoma cells to the lung in an experimental metastasis model. Fluorescent-labeled M402 demonstrated selective accumulation in the primary tumor. Immunohistological analyses of the primary tumor revealed a decrease in microvessel density in M402 treated animals, suggesting anti-angiogenesis to be one of the mechanisms involved in vivo. M402 treatment also normalized circulating levels of myeloid-derived suppressor cells in tumor bearing mice. Chronic administration of M402, alone or in combination with cisplatin or docetaxel, inhibited spontaneous metastasis and prolonged survival in an orthotopic 4T1 murine mammary carcinoma model. These data demonstrate that modulating HSPG biology represents a novel approach to target multiple factors involved in tumor progression and metastasis.

  3. Dosimetric consequences of the shift towards computed tomography guided target definition and planning for breast conserving radiotherapy

    Directory of Open Access Journals (Sweden)

    Korevaar Erik W

    2008-01-01

    Full Text Available Abstract Background The shift from conventional two-dimensional (2D) to three-dimensional (3D)-conformal target definition and dose-planning seems to have introduced volumetric as well as geometric changes. The purpose of this study was to compare coverage of computed tomography (CT)-based breast and boost planning target volumes (PTV), absolute volumes irradiated, and dose delivered to the organs at risk with conventional 2D and 3D-conformal breast conserving radiotherapy. Methods Twenty-five patients with left-sided breast cancer were subject of CT-guided target definition and 3D-conformal dose-planning, and conventionally defined target volumes and treatment plans were reconstructed on the planning CT. Accumulated dose-distributions were calculated for the conventional and 3D-conformal dose-plans, taking into account a prescribed dose of 50 Gy for the breast plans and 16 Gy for the boost plans. Results With conventional treatment plans, CT-based breast and boost PTVs received the intended dose in 78% and 32% of the patients, respectively, and smaller volumes received the prescribed breast and boost doses compared with 3D-conformal dose-planning. The mean lung dose, the volume of the lungs receiving > 20 Gy, the mean heart dose, and volume of the heart receiving > 30 Gy were significantly less with conventional treatment plans. Specific areas within the breast and boost PTVs systematically received a lower than intended dose with conventional treatment plans. Conclusion The shift towards CT-guided target definition and planning as the golden standard for breast conserving radiotherapy has resulted in improved target coverage at the cost of larger irradiated volumes and an increased dose delivered to organs at risk. Tissue is now included into the breast and boost target volumes that was never explicitly defined or included with conventional treatment. Therefore, a coherent definition of the breast and boost target volumes is needed, based on

  4. Dosimetric consequences of the shift towards computed tomography guided target definition and planning for breast conserving radiotherapy

    International Nuclear Information System (INIS)

    Laan, Hans Paul van der; Dolsma, Wil V; Maduro, John H; Korevaar, Erik W; Langendijk, Johannes A

    2008-01-01

    The shift from conventional two-dimensional (2D) to three-dimensional (3D)-conformal target definition and dose-planning seems to have introduced volumetric as well as geometric changes. The purpose of this study was to compare coverage of computed tomography (CT)-based breast and boost planning target volumes (PTV), absolute volumes irradiated, and dose delivered to the organs at risk with conventional 2D and 3D-conformal breast conserving radiotherapy. Twenty-five patients with left-sided breast cancer were subject of CT-guided target definition and 3D-conformal dose-planning, and conventionally defined target volumes and treatment plans were reconstructed on the planning CT. Accumulated dose-distributions were calculated for the conventional and 3D-conformal dose-plans, taking into account a prescribed dose of 50 Gy for the breast plans and 16 Gy for the boost plans. With conventional treatment plans, CT-based breast and boost PTVs received the intended dose in 78% and 32% of the patients, respectively, and smaller volumes received the prescribed breast and boost doses compared with 3D-conformal dose-planning. The mean lung dose, the volume of the lungs receiving > 20 Gy, the mean heart dose, and volume of the heart receiving > 30 Gy were significantly less with conventional treatment plans. Specific areas within the breast and boost PTVs systematically received a lower than intended dose with conventional treatment plans. The shift towards CT-guided target definition and planning as the golden standard for breast conserving radiotherapy has resulted in improved target coverage at the cost of larger irradiated volumes and an increased dose delivered to organs at risk. Tissue is now included into the breast and boost target volumes that was never explicitly defined or included with conventional treatment. Therefore, a coherent definition of the breast and boost target volumes is needed, based on clinical data confirming tumour control probability and normal

  5. Tumor-targeted nanomedicines for cancer theranostics

    Science.gov (United States)

    Lammers, Twan; Shi, Yang

    2017-01-01

    Chemotherapeutic drugs have multiple drawbacks, including severe side effects and suboptimal therapeutic efficacy. Nanomedicines assist in improving the biodistribution and the target accumulation of chemotherapeutic drugs, and are therefore able to enhance the balance between efficacy and toxicity. Multiple different types of nanomedicines have been evaluated over the years, including liposomes, polymer-drug conjugates and polymeric micelles, which rely on strategies such as passive targeting, active targeting and triggered release for improved tumor-directed drug delivery. Based on the notion that tumors and metastases are highly heterogeneous, it is important to integrate imaging properties in nanomedicine formulations in order to enable non-invasive and quantitative assessment of targeting efficiency. By allowing for patient pre-selection, such next generation nanotheranostics are useful for facilitating clinical translation and personalizing nanomedicine treatments. PMID:27865762

  6. Identification of potential inhibitors based on compound proposal contest: Tyrosine-protein kinase Yes as a target.

    Science.gov (United States)

    Chiba, Shuntaro; Ikeda, Kazuyoshi; Ishida, Takashi; Gromiha, M Michael; Taguchi, Y-H; Iwadate, Mitsuo; Umeyama, Hideaki; Hsin, Kun-Yi; Kitano, Hiroaki; Yamamoto, Kazuki; Sugaya, Nobuyoshi; Kato, Koya; Okuno, Tatsuya; Chikenji, George; Mochizuki, Masahiro; Yasuo, Nobuaki; Yoshino, Ryunosuke; Yanagisawa, Keisuke; Ban, Tomohiro; Teramoto, Reiji; Ramakrishnan, Chandrasekaran; Thangakani, A Mary; Velmurugan, D; Prathipati, Philip; Ito, Junichi; Tsuchiya, Yuko; Mizuguchi, Kenji; Honma, Teruki; Hirokawa, Takatsugu; Akiyama, Yutaka; Sekijima, Masakazu

    2015-11-26

    A search of a broader range of chemical space is important for drug discovery. Different methods of computer-aided drug discovery (CADD) are known to propose compounds in different chemical spaces as hit molecules for the same target protein. This study aimed at using multiple CADD methods through open innovation to achieve a level of hit molecule diversity that is not achievable with any particular single method. We held a compound proposal contest, in which multiple research groups participated and predicted inhibitors of tyrosine-protein kinase Yes. This showed whether collective knowledge based on individual approaches helped to obtain hit compounds from a broad range of chemical space and whether the contest-based approach was effective.

  7. 48 CFR 227.7203-2 - Acquisition of noncommercial computer software and computer software documentation.

    Science.gov (United States)

    2010-10-01

    ... at one site or multiple site licenses, and the format and media in which the software or... noncommercial computer software and computer software documentation. 227.7203-2 Section 227.7203-2 Federal... CONTRACTING REQUIREMENTS PATENTS, DATA, AND COPYRIGHTS Rights in Computer Software and Computer Software...

  8. Computing Accurate Grammatical Feedback in a Virtual Writing Conference for German-Speaking Elementary-School Children: An Approach Based on Natural Language Generation

    Science.gov (United States)

    Harbusch, Karin; Itsova, Gergana; Koch, Ulrich; Kuhner, Christine

    2009-01-01

    We built a natural language processing (NLP) system implementing a "virtual writing conference" for elementary-school children, with German as the target language. Currently, state-of-the-art computer support for writing tasks is restricted to multiple-choice questions or quizzes because automatic parsing of the often ambiguous and fragmentary…

  9. Generation and compression of a target plasma for magnetized target fusion

    International Nuclear Information System (INIS)

    Kirkpatrick, R.C.; Lindemuth, I.R.; Sheehey, P.T.

    1998-01-01

    This is the final report of a three-year, Laboratory Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). Magnetized target fusion (MTF) is intermediate between the two very different approaches to fusion: inertial and magnetic confinement fusion (ICF and MCF). Results from collaboration with a Russian MTF team on their MAGO experiments suggest they have a target plasma suitable for compression to provide an MTF proof of principle. This LDRD project had two main objectives: first, to provide a computational basis for experimental investigation of an alternative MTF plasma, and second, to explore the physics and computational needs for a continuing program. Secondary objectives included analytic and computational support for MTF experiments. The first objective was fulfilled. The second main objective has several facets to be described in the body of this report. Finally, the authors have developed tools for analyzing data collected on the MAGO and LDRD experiments, and have tested them on limited MAGO data

  10. Golay Complementary Waveforms in Reed–Müller Sequences for Radar Detection of Nonzero Doppler Targets

    Science.gov (United States)

    Wang, Xuezhi; Huang, Xiaotao; Suvorova, Sofia; Moran, Bill

    2018-01-01

    Golay complementary waveforms can, in theory, yield radar returns of high range resolution with essentially zero sidelobes. In practice, when deployed conventionally, while high signal-to-noise ratios can be achieved for static target detection, significant range sidelobes are generated by target returns of nonzero Doppler, causing unreliable detection. We consider signal processing techniques using Golay complementary waveforms to improve radar detection performance in scenarios involving multiple nonzero Doppler targets. A signal processing procedure based on an existing, so-called Binomial Design algorithm that alters the transmission order of Golay complementary waveforms and weights the returns is proposed in an attempt to achieve an enhanced illumination performance. The procedure applies one of three proposed waveform transmission ordering algorithms, followed by a pointwise nonlinear processor combining the outputs of the Binomial Design algorithm and one of the ordering algorithms. The computational complexity of the Binomial Design algorithm and the three ordering algorithms is compared, and a statistical analysis of the performance of the pointwise nonlinear processing is given. Estimation of the areas in the Delay–Doppler map occupied by significant range sidelobes for given targets is also discussed. Numerical simulations for the comparison of the performances of the Binomial Design algorithm and the three ordering algorithms are presented for both fixed and randomized target locations. The simulation results demonstrate that the proposed signal processing procedure has a better detection performance in terms of lower sidelobes and higher Doppler resolution in the presence of multiple nonzero Doppler targets compared to existing methods. PMID:29324708
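
    As background for why summed Golay returns have no range sidelobes, the sketch below builds a complementary pair by the standard recursive (concatenation) construction and checks that the sum of the two aperiodic autocorrelations is an impulse. It is a generic illustration of the complementary property only; the transmission-ordering algorithms, Binomial Design weighting, and Doppler processing from the paper are not reproduced.

    ```python
    # Golay complementary pair of length 2**k and its zero-sidelobe property.
    import numpy as np

    def golay_pair(k):
        a, b = np.array([1.0]), np.array([1.0])
        for _ in range(k):
            a, b = np.concatenate([a, b]), np.concatenate([a, -b])
        return a, b

    def acorr(x):
        return np.correlate(x, x, mode="full")    # aperiodic autocorrelation

    a, b = golay_pair(6)                          # length-64 pair
    s = acorr(a) + acorr(b)
    zero_lag = len(a) - 1
    print("peak at zero delay:", s[zero_lag])                          # 2N
    print("largest sidelobe:  ", np.abs(np.delete(s, zero_lag)).max()) # ~0
    ```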

  11. Multiple-beam laser–plasma interactions in inertial confinement fusion

    Energy Technology Data Exchange (ETDEWEB)

    Myatt, J. F., E-mail: jmya@lle.rochester.edu; Zhang, J.; Maximov, A. V. [Laboratory for Laser Energetics, University of Rochester, 250 East River Road, Rochester, New York 14623-1299 (United States); Department of Mechanical Engineering, University of Rochester, Rochester, New York 14627 (United States); Short, R. W.; Seka, W.; Edgell, D. H.; Michel, D. T.; Igumenshchev, I. V. [Laboratory for Laser Energetics, University of Rochester, 250 East River Road, Rochester, New York 14623-1299 (United States); Froula, D. H. [Laboratory for Laser Energetics, University of Rochester, 250 East River Road, Rochester, New York 14623-1299 (United States); Department of Physics and Astronomy, University of Rochester, Rochester, New York 14627-0171 (United States); Hinkel, D. E.; Michel, P.; Moody, J. D. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551-0808 (United States)

    2014-05-15

    The experimental evidence for multiple-beam laser-plasma instabilities of relevance to laser driven inertial confinement fusion at the ignition scale is reviewed, in both the indirect and direct-drive approaches. The instabilities described are cross-beam energy transfer (in both indirectly driven targets on the NIF and in direct-drive targets), multiple-beam stimulated Raman scattering (for indirect-drive), and multiple-beam two-plasmon decay instability (in direct drive). Advances in theoretical understanding and in the numerical modeling of these multiple beam instabilities are presented.

  12. Optical Interconnection Via Computer-Generated Holograms

    Science.gov (United States)

    Liu, Hua-Kuang; Zhou, Shaomin

    1995-01-01

    Method of free-space optical interconnection developed for data-processing applications like parallel optical computing, neural-network computing, and switching in optical communication networks. In method, multiple optical connections between multiple sources of light in one array and multiple photodetectors in another array made via computer-generated holograms in electrically addressed spatial light modulators (ESLMs). Offers potential advantages of massive parallelism, high space-bandwidth product, high time-bandwidth product, low power consumption, low cross talk, and low time skew. Also offers advantage of programmability with flexibility of reconfiguration, including variation of strengths of optical connections in real time.

  13. Materials and nanosystems : interdisciplinary computational modeling at multiple scales

    International Nuclear Information System (INIS)

    Huber, S.E.

    2014-01-01

    Over the last five decades, computer simulation and numerical modeling have become valuable tools complementing the traditional pillars of science, experiment and theory. In this thesis, several applications of computer-based simulation and modeling shall be explored in order to address problems and open issues in chemical and molecular physics. Attention shall be paid especially to the different degrees of interrelatedness and multiscale-flavor, which may - at least to some extent - be regarded as inherent properties of computational chemistry. In order to do so, a variety of computational methods are used to study features of molecular systems which are of relevance in various branches of science and which correspond to different spatial and/or temporal scales. Proceeding from small to large measures, first, an application in astrochemistry, the investigation of spectroscopic and energetic aspects of carbonic acid isomers shall be discussed. In this respect, very accurate and hence at the same time computationally very demanding electronic structure methods like the coupled-cluster approach are employed. These studies are followed by the discussion of an application in the scope of plasma-wall interaction which is related to nuclear fusion research. There, the interactions of atoms and molecules with graphite surfaces are explored using density functional theory methods. The latter are computationally cheaper than coupled-cluster methods and thus allow the treatment of larger molecular systems, but yield less accuracy and especially reduced error control at the same time. The subsequently presented exploration of surface defects at low-index polar zinc oxide surfaces, which are of interest in materials science and surface science, is another surface science application. The necessity to treat even larger systems of several hundreds of atoms requires the use of approximate density functional theory methods. Thin gold nanowires consisting of several thousands of

  14. Final results of the 'Benchmark on computer simulation of radioactive nuclides production rate and heat generation rate in a spallation target'

    International Nuclear Information System (INIS)

    Janczyszyn, J.; Pohorecki, W.; Domanska, G.; Maiorino, R.J.; David, J.C.; Velarde, F.A.

    2011-01-01

    A benchmark has been organized to assess the computer simulation of nuclide production and heat generation in a spallation lead target. The physical models applied for the calculation of thick lead target activation do not produce satisfactory results for the majority of analysed nuclides; however, better or worse quantitative agreement with the experimental results can be observed. Analysis of the quality of the calculated results shows the best performance for heavy nuclides (A: 170 - 190). Intermediate nuclides (A: 60 - 130) are almost all underestimated, while those with A: 130 - 170 are mainly overestimated. The shape of the activity distribution in the target is well reproduced in the calculations by all models, but the numerical comparison shows a performance similar to that for the whole target. The Isabel model yields the best results. As for the whole-target heating rate, the results from all participants are consistent, and only small differences are observed between results from the physical models. The heating distribution within the target, however, is reproduced less consistently. The quantitative comparison of the distributions yielded by different spallation reaction models shows no serious differences for the major part of the target - generally below 10%. However, in the outermost parts of the target front layers and in the part of the target at its end, behind the primary proton range, a spread higher than 40% is obtained

  15. Target for production of X-rays

    Energy Technology Data Exchange (ETDEWEB)

    Korenev, S.A. E-mail: sergey_korenev@steris.com

    2004-10-01

    A patented new type of X-ray target is considered in this report. The main concept of the target is a sandwich structure in which a coating of high-Z material is deposited on a substrate with low Z, high thermal conductivity and high thermal stability. The target is thus a multiple-layer system. The thermal conditions for the X-ray target are discussed. Experimental results for a Ta target on Al and Cu substrates are presented.

  16. Target for production of X-rays

    International Nuclear Information System (INIS)

    Korenev, S.A.

    2004-01-01

    A patented new type of X-ray target is considered in this report. The main concept of the target is a sandwich structure in which a coating of high-Z material is deposited on a substrate with low Z, high thermal conductivity and high thermal stability. The target is thus a multiple-layer system. The thermal conditions for the X-ray target are discussed. Experimental results for a Ta target on Al and Cu substrates are presented

  17. Target for production of X-rays

    Science.gov (United States)

    Korenev, S. A.

    2004-09-01

    A patented new type of X-ray target is considered in this report. The main concept of the target is a sandwich structure in which a coating of high-Z material is deposited on a substrate with low Z, high thermal conductivity and high thermal stability. The target is thus a multiple-layer system. The thermal conditions for the X-ray target are discussed. Experimental results for a Ta target on Al and Cu substrates are presented.

  18. Investigation of flow asymmetry and instability in the liquid mercury target of the Spallation Neutron Source

    International Nuclear Information System (INIS)

    Pointer, D.; Ruggles, A.; Wendel, M.; Crye, J.

    2000-01-01

    The Spallation Neutron Source (SNS) will utilize a liquid mercury target placed in the path of a high-energy proton beam to produce neutrons for research activities. As the high-energy protons interact with the mercury target, the majority of the beam energy is converted to thermal energy. The liquid mercury must provide sufficient heat transfer to maintain the temperature of the target structure within the thermal limits of the structural materials. Therefore, the behavior of the liquid mercury flow must be characterized in sufficient detail to ensure accurate evaluation of heat transfer in the mercury target. A combination of experimental and computational methods is utilized to characterize the flow in these preliminary analyses. Preliminary studies of the liquid mercury flow in the SNS target indicate that the flow in the exit channel may exhibit multiple recirculation zones, flow asymmetries, and possibly large-scale flow instabilities. While these studies are not conclusive, they serve to focus the efforts of subsequent CFD modeling and experimental programs to better characterize the flow patterns in the SNS mercury target

  19. Homography-based multiple-camera person-tracking

    Science.gov (United States)

    Turk, Matthew R.

    2009-01-01

    Multiple video cameras are cheaply installed overlooking an area of interest. While computerized single-camera tracking is well-developed, multiple-camera tracking is a relatively new problem. The main multi-camera problem is to give the same tracking label to all projections of a real-world target. This is called the consistent labelling problem. Khan and Shah (2003) introduced a method to use field of view lines to perform multiple-camera tracking. The method creates inter-camera meta-target associations when objects enter at the scene edges. They also said that a plane-induced homography could be used for tracking, but this method was not well described. Their homography-based system would not work if targets use only one side of a camera to enter the scene. This paper overcomes this limitation and fully describes a practical homography-based tracker. A new method to find the feet feature is introduced. The method works especially well if the camera is tilted, when using the bottom centre of the target's bounding-box would produce inaccurate results. The new method is more accurate than the bounding-box method even when the camera is not tilted. Next, a method is presented that uses a series of corresponding point pairs "dropped" by oblivious, live human targets to find a plane-induced homography. The point pairs are created by tracking the feet locations of moving targets that were associated using the field of view line method. Finally, a homography-based multiple-camera tracking algorithm is introduced. Rules governing when to create the homography are specified. The algorithm ensures that homography-based tracking only starts after a non-degenerate homography is found. The method works when not all four field of view lines are discoverable; only one line needs to be found to use the algorithm. To initialize the system, the operator must specify pairs of overlapping cameras. Aside from that, the algorithm is fully automatic and uses the natural movement of
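
    To make the plane-induced homography step concrete, the sketch below estimates a 3x3 homography from four or more corresponding ground-plane points with the standard direct linear transform (DLT) and maps a point between views. The point coordinates are invented for illustration, and this is a generic DLT, not the specific tracking pipeline (feet-feature detection, field-of-view-line association) described in the paper.

    ```python
    # Plane-induced homography via DLT from point correspondences (illustrative).
    import numpy as np

    def estimate_homography(src, dst):
        """src, dst: (N, 2) arrays of corresponding points, N >= 4."""
        A = []
        for (x, y), (u, v) in zip(src, dst):
            A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
            A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
        _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
        H = Vt[-1].reshape(3, 3)              # null-space vector -> 3x3 matrix
        return H / H[2, 2]

    def map_point(H, p):
        q = H @ np.array([p[0], p[1], 1.0])
        return q[:2] / q[2]

    src = np.array([[10, 20], [200, 30], [220, 180], [15, 160]], float)
    dst = np.array([[50, 40], [240, 60], [250, 210], [60, 190]], float)
    H = estimate_homography(src, dst)
    print(map_point(H, src[0]))               # close to dst[0]
    ```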

  20. GSK3-mediated MAF phosphorylation in multiple myeloma as a potential therapeutic target

    International Nuclear Information System (INIS)

    Herath, N I; Rocques, N; Garancher, A; Eychène, A; Pouponnot, C

    2014-01-01

    Multiple myeloma (MM) is an incurable haematological malignancy characterised by the proliferation of mature antibody-secreting plasma B cells in the bone marrow. MM can arise from initiating translocations, of which the musculoaponeurotic fibrosarcoma (MAF) family is implicated in ∼5%. MMs bearing Maf translocations are of poor prognosis. These translocations are associated with elevated Maf expression, including c-MAF, MAFB and MAFA, and with t(14;16) and t(14;20) translocations, involving c-MAF and MAFB, respectively. c-MAF is also overexpressed in MM through MEK/ERK activation, bringing the number of MMs driven by the deregulation of a Maf gene close to 50%. Here we demonstrate that MAFB and c-MAF are phosphorylated by the Ser/Thr kinase GSK3 in human MM cell lines. We show that LiCl-induced GSK3 inhibition targets these phosphorylations and specifically decreases proliferation and colony formation of Maf-expressing MM cell lines. Interestingly, bortezomib induced stabilisation of Maf phosphorylation, an observation that could explain, at least partially, the low efficacy of bortezomib for patients carrying Maf translocations. Thus, GSK3 inhibition could represent a new therapeutic approach for these patients

  1. Identification of the APC/C co-factor FZR1 as a novel therapeutic target for multiple myeloma.

    Science.gov (United States)

    Crawford, Lisa J; Anderson, Gordon; Johnston, Cliona K; Irvine, Alexandra E

    2016-10-25

    Multiple Myeloma (MM) is a haematological neoplasm characterised by the clonal proliferation of malignant plasma cells in the bone marrow. The success of proteasome inhibitors in the treatment of MM has highlighted the importance of the ubiquitin proteasome system (UPS) in the pathogenesis of this disease. In this study, we analysed gene expression of UPS components to identify novel therapeutic targets within this pathway in MM. Here we demonstrate how this approach identified previously validated and novel therapeutic targets. In addition we show that FZR1 (Fzr), a cofactor of the multi-subunit E3 ligase complex anaphase-promoting complex/cyclosome (APC/C), represents a novel therapeutic target in myeloma. The APC/C associates independently with two cofactors, Fzr and Cdc20, to control cell cycle progression. We found high levels of FZR1 in MM primary cells and cell lines and demonstrate that expression is further increased on adhesion to bone marrow stromal cells (BMSCs). Specific knockdown of either FZR1 or CDC20 reduced viability and induced growth arrest of MM cell lines, and resulted in accumulation of APC/CFzr substrate Topoisomerase IIα (TOPIIα) or APC/CCdc20 substrate Cyclin B. Similar effects were observed following treatment with proTAME, an inhibitor of both APC/CFzr and APC/CCdc20. Combinations of proTAME with topoisomerase inhibitors, etoposide and doxorubicin, significantly increased cell death in MM cell lines and primary cells, particularly if TOPIIα levels were first increased through pre-treatment with proTAME. Similarly, combinations of proTAME with the microtubule inhibitor vincristine resulted in enhanced cell death. This study demonstrates the potential of targeting the APC/C and its cofactors as a therapeutic approach in MM.

  2. Multiple proviral integration events after virological synapse-mediated HIV-1 spread

    International Nuclear Information System (INIS)

    Russell, Rebecca A.; Martin, Nicola; Mitar, Ivonne; Jones, Emma; Sattentau, Quentin J.

    2013-01-01

    HIV-1 can move directly between T cells via virological synapses (VS). Although aspects of the molecular and cellular mechanisms underlying this mode of spread have been elucidated, the outcomes for infection of the target cell remain incompletely understood. We set out to determine whether HIV-1 transfer via VS results in productive, high-multiplicity HIV-1 infection. We found that HIV-1 cell-to-cell spread resulted in nuclear import of multiple proviruses into target cells as seen by fluorescence in-situ hybridization. Proviral integration into the target cell genome was significantly higher than that seen in a cell-free infection system, and consequent de novo viral DNA and RNA production in the target cell detected by quantitative PCR increased over time. Our data show efficient proviral integration across VS, implying the probability of multiple integration events in target cells that drive productive T cell infection. - Highlights: • Cell-to-cell HIV-1 infection delivers multiple vRNA copies to the target cell. • Cell-to-cell infection results in productive infection of the target cell. • Cell-to-cell transmission is more efficient than cell-free HIV-1 infection. • Suggests a mechanism for recombination in cells infected with multiple viral genomes

  3. Target preparation

    International Nuclear Information System (INIS)

    Hinn, G.M.

    1984-01-01

    A few of the more interesting of the 210 targets prepared in the Laboratory last year are listed. In addition the author continues to use powdered silver mixed with ⁹,¹⁰BeO to produce sources for accelerator radio dating of Alaskan and South Polar snow. Currently, he is trying to increase production by multiple sample processing. Also the author routinely makes 3 μg/cm² cracked slacked carbon stripper foils and is continuing research with some degree of success in making enriched ²⁸Si targets starting with the oxide

  4. Efficient sparse matrix-matrix multiplication for computing periodic responses by shooting method on Intel Xeon Phi

    Science.gov (United States)

    Stoykov, S.; Atanassov, E.; Margenov, S.

    2016-10-01

    Many scientific applications involve sparse or dense matrix operations, such as solving linear systems, matrix-matrix products, eigensolvers, etc. In structural nonlinear dynamics, the computation of periodic responses and the determination of the stability of the solution are of primary interest. The shooting method is widely used for obtaining periodic responses of nonlinear systems. The method involves simultaneous operations with sparse and dense matrices. One of the computationally expensive operations in the method is the multiplication of sparse by dense matrices. In the current work, a new algorithm for sparse matrix by dense matrix products is presented. The algorithm takes into account the structure of the sparse matrix, which is obtained by space discretization of the nonlinear Mindlin plate equation of motion by the finite element method. The algorithm is developed to use the vector engine of Intel Xeon Phi coprocessors. It is compared with the standard sparse matrix by dense matrix algorithm and the one developed by Intel MKL, and it is shown that better algorithms can be developed by considering the properties of the sparse matrix.
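
    For orientation, the sketch below spells out the access pattern of a plain sparse-by-dense product with the sparse factor stored in compressed sparse row (CSR) form: each nonzero scales one row of the dense matrix and accumulates it into the corresponding output row. It illustrates only the baseline algorithm; the Xeon Phi vectorized kernel and the structure-aware optimizations from the paper are not shown.

    ```python
    # Baseline CSR-by-dense matrix product, C = A @ B (illustrative).
    import numpy as np

    def csr_dense_matmul(data, indices, indptr, n_rows, B):
        """A is given in CSR form by (data, indices, indptr)."""
        C = np.zeros((n_rows, B.shape[1]))
        for i in range(n_rows):
            for k in range(indptr[i], indptr[i + 1]):
                C[i] += data[k] * B[indices[k]]   # nonzero A[i, j] scales row j of B
        return C

    # Small check against dense multiplication: A = [[1, 0, 2], [0, 3, 0]].
    data, indices, indptr = np.array([1.0, 2.0, 3.0]), np.array([0, 2, 1]), np.array([0, 2, 3])
    B = np.arange(12.0).reshape(3, 4)
    print(csr_dense_matmul(data, indices, indptr, 2, B))
    print(np.array([[1.0, 0, 2], [0, 3, 0]]) @ B)
    ```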

  5. Computed tomography imaging of early coronary artery lesions in stable individuals with multiple cardiovascular risk factors

    Directory of Open Access Journals (Sweden)

    Xi Yang

    2015-04-01

    Full Text Available OBJECTIVES: To investigate the prevalence, extent, severity, and features of coronary artery lesions in stable patients with multiple cardiovascular risk factors. METHODS: Seventy-seven patients with more than 3 cardiovascular risk factors who were suspected of having coronary artery disease (the high-risk group) and 39 controls with no risk factors were enrolled in the study. The related risk factors included hypertension, impaired glucose tolerance, dyslipidemia, smoking history, and overweight. The characteristics of coronary lesions were identified and evaluated by 64-slice coronary computed tomography angiography. RESULTS: The incidence of coronary atherosclerosis was higher in the high-risk group than in the no-risk group. The involved branches of the coronary artery, the diffusivity of the lesion, the degree of stenosis, and the nature of the plaques were significantly more severe in the high-risk group compared with the no-risk group (all p < 0.05). CONCLUSION: Among stable individuals with high-risk factors, early coronary artery lesions are common and severe. Computed tomography has promising value for the early screening of coronary lesions.

  6. Cooperative Robots to Observe Moving Targets: Review.

    Science.gov (United States)

    Khan, Asif; Rinner, Bernhard; Cavallaro, Andrea

    2018-01-01

    The deployment of multiple robots for achieving a common goal helps to improve the performance, efficiency, and/or robustness in a variety of tasks. In particular, the observation of moving targets is an important multirobot application that still exhibits numerous open challenges, including the effective coordination of the robots. This paper reviews control techniques for cooperative mobile robots monitoring multiple targets. The simultaneous movement of robots and targets makes this problem particularly interesting, and our review systematically addresses this cooperative multirobot problem for the first time. We classify and critically discuss the control techniques: cooperative multirobot observation of multiple moving targets, cooperative search, acquisition, and track, cooperative tracking, and multirobot pursuit evasion. We also identify the five major elements that characterize this problem, namely, the coordination method, the environment, the target, the robot and its sensor(s). These elements are used to systematically analyze the control techniques. The majority of the studied work is based on simulation and laboratory studies, which may not accurately reflect real-world operational conditions. Importantly, while our systematic analysis is focused on multitarget observation, our proposed classification is useful also for related multirobot applications.

  7. Computer simulation of multiple dynamic photorefractive gratings

    DEFF Research Database (Denmark)

    Buchhave, Preben

    1998-01-01

    The benefits of a direct visualization of space-charge grating buildup are described. The visualization is carried out by a simple repetitive computer program, which simulates the basic processes in the band-transport model and displays the result graphically or in the form of numerical data. The...

  8. Collaborative filtering on a family of biological targets.

    Science.gov (United States)

    Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua

    2006-01-01

    Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family, for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples. The more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target. We evaluate JRank, a kernel-based method designed for collaborative filtering. We show their performance on compound prioritization for an HTS campaign and the underlying shared representation between targets. JRank outperformed the neural network both in the single- and multi-target models.
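
    To illustrate the shared-representation idea behind multi-task learning, the sketch below fits one network with a common hidden layer to several related synthetic targets, so that an undersampled target can in principle benefit from the signal of the others. scikit-learn's MLPRegressor merely stands in for the multi-target neural network concept; the data, the architecture, and the JRank method from the paper are not reproduced.

    ```python
    # Multi-target model with a shared hidden representation (illustrative).
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    X = rng.normal(size=(300, 16))                    # compound descriptors (synthetic)
    W = rng.normal(size=(16, 3))
    Y = X @ W + 0.1 * rng.normal(size=(300, 3))       # three related targets share structure

    model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
    model.fit(X[:250], Y[:250])                       # joint training over all targets
    pred = model.predict(X[250:])

    corr = [float(np.corrcoef(pred[:, t], Y[250:, t])[0, 1]) for t in range(3)]
    print("held-out correlation per target:", [round(c, 2) for c in corr])
    ```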

  9. The effect of target and non-target similarity on neural classification performance: A boost from confidence

    OpenAIRE

    Amar R Marathe; Anthony J Ries; Vernon J Lawhern; Brent J Lance; Jonathan Touryan; Kaleb McDowell; Hubert Cecotti

    2015-01-01

    Brain computer interaction (BCI) technologies have proven effective in utilizing single-trial classification algorithms to detect target images in rapid serial visualization presentation tasks. While many factors contribute to the accuracy of these algorithms, a critical aspect that is often overlooked concerns the feature similarity between target and non-target images. In most real-world environments there are likely to be many shared features between targets and non-targets resulting in si...

  10. The effect of target and non-target similarity on neural classification performance: a boost from confidence

    OpenAIRE

    Marathe, Amar R.; Ries, Anthony J.; Lawhern, Vernon J.; Lance, Brent J.; Touryan, Jonathan; McDowell, Kaleb; Cecotti, Hubert

    2015-01-01

    Brain computer interaction (BCI) technologies have proven effective in utilizing single-trial classification algorithms to detect target images in rapid serial visualization presentation tasks. While many factors contribute to the accuracy of these algorithms, a critical aspect that is often overlooked concerns the feature similarity between target and non-target images. In most real-world environments there are likely to be many shared features between targets and non-targets resulting in si...

  11. Ion tail filling in laser-fusion targets

    International Nuclear Information System (INIS)

    Henderson, D.B.

    1975-06-01

    Thermonuclear burn begins in laser-fusion targets with the collapse of the imploding fuel shell. At this instant the ion velocity distribution is non-Maxwellian, requiring correction to the commonly used computer simulation codes. This correction is computed and compared with that arising from the loss of fast ions in marginal (ρR less than 0.01 g cm⁻²) targets. (U.S.)

  12. Designing multiple ligands - medicinal chemistry strategies and challenges.

    Science.gov (United States)

    Morphy, Richard; Rankovic, Zoran

    2009-01-01

    It has been widely recognised over recent years that parallel modulation of multiple biological targets can be beneficial for treatment of diseases with complex etiologies such as cancer, asthma, and psychiatric disease. In this article, current strategies for the generation of ligands with a specific multi-target profile (designed multiple ligands or DMLs) are described and a number of illustrative examples are given. Designing multiple ligands is frequently a challenging endeavour for medicinal chemists, with the need to appropriately balance affinity for 2 or more targets whilst obtaining physicochemical and pharmacokinetic properties that are consistent with the administration of an oral drug. Given that the properties of DMLs are influenced to a large extent by the proteomic superfamily to which the targets belong and the lead generation strategy that is pursued, an early assessment of the feasibility of any given DML project is essential.

  13. Hairpin RNA Targeting Multiple Viral Genes Confers Strong Resistance to Rice Black-Streaked Dwarf Virus

    Directory of Open Access Journals (Sweden)

    Fangquan Wang

    2016-05-01

    Full Text Available Rice black-streaked dwarf virus (RBSDV) belongs to the genus Fijivirus in the family Reoviridae and causes severe yield loss in rice-producing areas in Asia. RNA silencing, as a natural defence mechanism against plant viruses, has been successfully exploited for engineering virus resistance in plants, including rice. In this study, we generated transgenic rice lines harbouring a hairpin RNA (hpRNA) construct targeting four RBSDV genes, S1, S2, S6 and S10, encoding the RNA-dependent RNA polymerase, the putative core protein, the RNA silencing suppressor and the outer capsid protein, respectively. Both field nursery and artificial inoculation assays of three generations of the transgenic lines showed that they had strong resistance to RBSDV infection. The RBSDV resistance in the segregating transgenic populations correlated perfectly with the presence of the hpRNA transgene. Furthermore, the hpRNA transgene was expressed in the highly resistant transgenic lines, giving rise to abundant levels of 21–24 nt small interfering RNA (siRNA). Small RNA deep sequencing of the RBSDV-resistant transgenic lines detected siRNAs from all four viral gene sequences in the hpRNA transgene, indicating that the whole chimeric fusion sequence can be efficiently processed by Dicer into siRNAs. Taken together, our results suggest that long hpRNA targeting multiple viral genes can be used to generate stable and durable virus resistance in rice, as well as other plant species.

  14. Hairpin RNA Targeting Multiple Viral Genes Confers Strong Resistance to Rice Black-Streaked Dwarf Virus.

    Science.gov (United States)

    Wang, Fangquan; Li, Wenqi; Zhu, Jinyan; Fan, Fangjun; Wang, Jun; Zhong, Weigong; Wang, Ming-Bo; Liu, Qing; Zhu, Qian-Hao; Zhou, Tong; Lan, Ying; Zhou, Yijun; Yang, Jie

    2016-05-11

    Rice black-streaked dwarf virus (RBSDV) belongs to the genus Fijivirus in the family Reoviridae and causes severe yield loss in rice-producing areas in Asia. RNA silencing, as a natural defence mechanism against plant viruses, has been successfully exploited for engineering virus resistance in plants, including rice. In this study, we generated transgenic rice lines harbouring a hairpin RNA (hpRNA) construct targeting four RBSDV genes, S1, S2, S6 and S10, encoding the RNA-dependent RNA polymerase, the putative core protein, the RNA silencing suppressor and the outer capsid protein, respectively. Both field nursery and artificial inoculation assays of three generations of the transgenic lines showed that they had strong resistance to RBSDV infection. The RBSDV resistance in the segregating transgenic populations correlated perfectly with the presence of the hpRNA transgene. Furthermore, the hpRNA transgene was expressed in the highly resistant transgenic lines, giving rise to abundant levels of 21-24 nt small interfering RNA (siRNA). Small RNA deep sequencing of the RBSDV-resistant transgenic lines detected siRNAs from all four viral gene sequences in the hpRNA transgene, indicating that the whole chimeric fusion sequence can be efficiently processed by Dicer into siRNAs. Taken together, our results suggest that long hpRNA targeting multiple viral genes can be used to generate stable and durable virus resistance in rice, as well as other plant species.

  15. On the implementation of the Ford–Fulkerson algorithm on the Multiple Instruction and Single Data computer system

    Directory of Open Access Journals (Sweden)

    A. Yu. Popov

    2014-01-01

    Full Text Available Network and directed-graph optimization algorithms find broad application in practical tasks. However, with the large-scale introduction of information technologies into human activity, the requirements on input data volumes and on the speed of obtaining solutions keep growing. Although a large number of algorithms have by now been studied and implemented for various models of computers and computing systems, solving key optimization problems at realistic problem sizes remains difficult. The search for new, more efficient computing structures, as well as the adaptation of known algorithms, is therefore of great current interest. The work considers an implementation of a maximum-flow search algorithm on a directed graph for the Multiple Instruction, Single Data (MISD) computer system developed at BMSTU. A key feature of this architecture is deep hardware support for operations over sets and data structures. Storage and access functions are realized on a specialized structure-processing processor (SP), which can perform at the hardware level such operations as add, delete, search, intersect, complement, merge, and others. The advantage of such a system is the possibility of executing the parts of a computing task that access sets and data structures in parallel with the arithmetic and logical processing of information. Previous works present the general principles of organizing the computing process and the features of programs implemented in the MISD system, describe the structure and operating principles of the structure-processing processor, show the general principles of solving graph problems in such a system, and experimentally study the efficiency of the resulting algorithms. The present work gives the command formats of the SP processor, offers a technique for adapting algorithms to the MISD system, and suggests a variant of the Ford-Fulkerson algorithm
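
    The record above concerns a hardware-accelerated variant; for orientation, the sketch below shows the classical Ford-Fulkerson idea it builds on, in its common Edmonds-Karp form (BFS for the augmenting path). This is a generic illustration, not the MISD/SP implementation described in the paper, and the toy graph is invented.

    ```python
    from collections import deque

    def max_flow(capacity, source, sink):
        """Edmonds-Karp variant of Ford-Fulkerson: repeatedly find a shortest
        augmenting path with BFS and push the bottleneck flow along it."""
        n = len(capacity)
        residual = [row[:] for row in capacity]   # residual capacities
        total = 0
        while True:
            parent = [-1] * n
            parent[source] = source
            queue = deque([source])
            while queue and parent[sink] == -1:
                u = queue.popleft()
                for v in range(n):
                    if parent[v] == -1 and residual[u][v] > 0:
                        parent[v] = u
                        queue.append(v)
            if parent[sink] == -1:                # no augmenting path left
                return total
            # bottleneck capacity along the path found by BFS
            v, bottleneck = sink, float("inf")
            while v != source:
                u = parent[v]
                bottleneck = min(bottleneck, residual[u][v])
                v = u
            # update residual capacities (forward and reverse edges)
            v = sink
            while v != source:
                u = parent[v]
                residual[u][v] -= bottleneck
                residual[v][u] += bottleneck
                v = u
            total += bottleneck

    # toy directed graph with 4 vertices, source 0, sink 3
    caps = [[0, 3, 2, 0],
            [0, 0, 1, 2],
            [0, 0, 0, 2],
            [0, 0, 0, 0]]
    print(max_flow(caps, 0, 3))   # expected maximum flow: 4
    ```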

  16. Direction-of-arrival estimation for co-located multiple-input multiple-output radar using structural sparsity Bayesian learning

    Science.gov (United States)

    Wen, Fang-Qing; Zhang, Gong; Ben, De

    2015-11-01

    This paper addresses the direction of arrival (DOA) estimation problem for the co-located multiple-input multiple-output (MIMO) radar with random arrays. The spatially distributed sparsity of the targets in the background makes compressive sensing (CS) desirable for DOA estimation. A spatial CS framework is presented, which links the DOA estimation problem to support recovery from a known over-complete dictionary. A modified statistical model is developed to accurately represent the intra-block correlation of the received signal. A structural sparsity Bayesian learning algorithm is proposed for the sparse recovery problem. The proposed algorithm, which exploits intra-signal correlation, can be applied to scenes with limited data support and low signal-to-noise ratio (SNR). Furthermore, the proposed algorithm has a lower computational load than the classical Bayesian algorithm. Simulation results show that the proposed algorithm gives more accurate DOA estimates than the traditional multiple signal classification (MUSIC) algorithm and other CS recovery algorithms. Project supported by the National Natural Science Foundation of China (Grant Nos. 61071163, 61271327, and 61471191), the Funding for Outstanding Doctoral Dissertation in Nanjing University of Aeronautics and Astronautics, China (Grant No. BCXJ14-08), the Funding of Innovation Program for Graduate Education of Jiangsu Province, China (Grant No. KYLX 0277), the Fundamental Research Funds for the Central Universities, China (Grant No. 3082015NP2015504), and the Priority Academic Program Development of Jiangsu Higher Education Institutions (PADA), China.
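
    As a rough illustration of the compressive-sensing framing described above (DOA estimation recast as support recovery over an over-complete dictionary of steering vectors), the sketch below uses plain orthogonal matching pursuit as a stand-in for the structural sparsity Bayesian learning step; the array geometry, angle grid, and target angles are assumptions made for the example.

    ```python
    import numpy as np

    def steering_dictionary(m, angles_deg, d=0.5):
        """Over-complete dictionary of steering vectors for an m-element
        uniform linear array with spacing d (in wavelengths)."""
        angles = np.deg2rad(angles_deg)
        k = np.arange(m)[:, None]
        return np.exp(2j * np.pi * d * k * np.sin(angles)[None, :])

    def omp_doa(y, A, n_sources):
        """Greedy sparse recovery (orthogonal matching pursuit), used here only
        as a simple stand-in for the Bayesian learning step in the paper."""
        residual, support = y.copy(), []
        for _ in range(n_sources):
            idx = int(np.argmax(np.abs(A.conj().T @ residual)))
            support.append(idx)
            x, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ x
        return sorted(support)

    m, grid = 10, np.arange(-90, 91, 1.0)
    A = steering_dictionary(m, grid)
    true_idx = [70, 110]                        # targets at -20 and +20 degrees
    y = A[:, true_idx] @ np.array([1.0, 0.8])   # noiseless single snapshot
    print([grid[i] for i in omp_doa(y, A, 2)])  # recovered angles, near -20 and 20
    ```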

  17. Multiple ionization of atoms by ion impact

    International Nuclear Information System (INIS)

    DuBois, R.D.

    1988-01-01

    In order to model the energy deposition of fast ions as they slow down in gaseous media, information about the ionization occurring in collisions between ions and target atoms/molecules is required. Our measurements of doubly differential electron emission cross sections provide detailed information about the ionization process but do not provide any information about the final states of the target. They also do not distinguish between the emission of one or more target electrons in a single collision. It is important to know the relative importance of multiple-, with respect to single-, target ionization in order to accurately model the energy deposition. To date, multiple ionization of He, Ne, Ar, Kr, and Xe targets has been studied. Primarily, H and He ions were used, although some data for heavier ions (C, N and O) have also been obtained

  18. Propagator formalism and computer simulation of restricted diffusion behaviors of inter-molecular multiple-quantum coherences

    International Nuclear Information System (INIS)

    Cai Congbo; Chen Zhong; Cai Shuhui; Zhong Jianhui

    2005-01-01

    In this paper, behaviors of single-quantum coherences and inter-molecular multiple-quantum coherences under restricted diffusion in nuclear magnetic resonance experiments were investigated. The propagator formalism based on the loss of spin phase memory during random motion was applied to describe the diffusion-induced signal attenuation. The exact expression of the signal attenuation under the short gradient pulse approximation for restricted diffusion between two parallel plates was obtained using this propagator method. For long gradient pulses, a modified formalism was proposed. The simulated signal attenuation under the effects of gradient pulses of different width based on the Monte Carlo method agrees with the theoretical predictions. The propagator formalism and computer simulation can provide convenient, intuitive and precise methods for the study of the diffusion behaviors

  19. Emission computed tomography

    International Nuclear Information System (INIS)

    Ott, R.J.

    1986-01-01

    Emission Computed Tomography is a technique used for producing single or multiple cross-sectional images of the distribution of radionuclide labelled agents in vivo. The techniques of Single Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) are described with particular regard to the function of the detectors used to produce images and the computer techniques used to build up images. (UK)

  20. Three-dimensional multiple reciprocity boundary element method for one-group neutron diffusion eigenvalue computations

    International Nuclear Information System (INIS)

    Itagaki, Masafumi; Sahashi, Naoki.

    1996-01-01

    The multiple reciprocity method (MRM) in conjunction with the boundary element method has been employed to solve one-group eigenvalue problems described by the three-dimensional (3-D) neutron diffusion equation. The domain integral related to the fission source is transformed into a series of boundary-only integrals, with the aid of the higher order fundamental solutions based on the spherical and the modified spherical Bessel functions. Since each degree of the higher order fundamental solutions in the 3-D cases has a singularity of order (1/r), the above series of boundary integrals requires additional terms which do not appear in the 2-D MRM formulation. The critical eigenvalue itself can be also described using only boundary integrals. Test calculations show that Wielandt's spectral shift technique guarantees rapid and stable convergence of 3-D MRM computations. (author)

  1. Damage to tungsten macro-brush targets under multiple ELM-like heat loads. Experiments vs. numerical simulations and extrapolation to ITER

    Energy Technology Data Exchange (ETDEWEB)

    Bazylev, B.; Landman, I. [Forschungszentrum Karlsruhe (Germany). IHM]; Janeschitz, G. [Forschungszentrum Karlsruhe (DE). Fusion EURATOM] (and others)

    2007-07-01

    Operation of ITER at high fusion gain is assumed to be in the H-mode. A characteristic feature of this regime is the transient release of energy from the confined plasma onto PFCs by multiple ELMs (about 10{sup 4} ELMs per ITER discharge), which can play a determining role in the erosion rate and lifetime of these components. The expected energy heat loads on the ITER divertor during Type I ELMs are in the range 0.5-4 MJ/m{sup 2} on timescales of 0.3-0.6 ms. Tungsten macro-brush armour (W-brushes) is foreseen as one of the plasma facing components (PFC) for the ITER divertor and dome. During the intense transient events in ITER, surface melting, melt motion, melt splashing and evaporation are seen as the main mechanisms of W erosion. The expected erosion of the ITER plasma facing components under transient energy loads can be properly estimated by numerical simulations validated against the target erosion measured in experiments at the plasma gun facility QSPA-T. Within the collaboration established between the EU fusion programme and the Russian Federation, W-brush targets (produced either from pure tungsten or tungsten with 1% of La{sub 2}O{sub 3}), manufactured according to the EU specifications for the ITER divertor targets, have been exposed to multiple ITER ELM-like loads in plasma gun facilities at TRINITI in the range 0.5-2.2 MJ/m{sup 2} with a pulse duration of 0.5 ms. The measured material erosion data have been used to validate the codes MEMOS and PHEMOBRID. Numerical simulations, including 3D simulations (codes MEMOS and PHEMOBRID), carried out for the conditions of the QSPA-T experiments with heat loads in the range 0.5-2.2 MJ/m{sup 2} and the timescale 0.5 ms demonstrated rather good agreement with the data obtained at the plasma gun facility QSPA: melting of brush edges at low heat loads, and intense melt motion and bridge formation caused by the Rayleigh-Taylor instability at heat loads Q>1.3 MJ/m{sup 2}. The melt splashing generated by the Kelvin-Helmholtz and Rayleigh

  2. Catabolite regulation analysis of Escherichia coli for acetate overflow mechanism and co-consumption of multiple sugars based on systems biology approach using computer simulation.

    Science.gov (United States)

    Matsuoka, Yu; Shimizu, Kazuyuki

    2013-10-20

    It is quite important to understand the basic principles embedded in the main metabolism for the interpretation of fermentation data. For this, it is useful to understand the regulation mechanism based on a systems biology approach. In the present study, we performed a perturbation analysis together with computer simulation based on models which include the effects of global regulators on pathway activation in the main metabolism of Escherichia coli. The main focus is the acetate overflow metabolism and the co-fermentation of multiple carbon sources. The perturbation analysis was first made to understand the nature of the feed-forward loop formed by the activation of Pyk by FDP (F1,6BP) and the feed-back loop formed by the inhibition of Pfk by PEP in glycolysis. These, together with the effect of the transcription factor Cra mediated by the FDP level, affected the glycolysis activity. The PTS (phosphotransferase system) acts as a feed-back system by repressing the glucose uptake rate as it increases. It was also shown that an increased PTS flux (or glucose consumption rate) causes the PEP/PYR ratio to decrease, and EIIA-P, Cya and cAMP-Crp to decrease, where lowered cAMP-Crp in turn represses the TCA cycle and more acetate is formed. This was further verified by detailed computer simulation. In the case of multiple carbon sources such as glucose and xylose, computer simulation showed sequential utilization of the carbon sources for the wild type, whereas co-consumption of multiple carbon sources at slow consumption rates was observed for the ptsG mutant, and this was verified by experiments. Moreover, the effect of a specific gene knockout such as Δpyk on the metabolic characteristics was also investigated based on the computer simulation. Copyright © 2013 Elsevier B.V. All rights reserved.

  3. Multiplicity distributions in inelastic reactions on nuclei

    CERN Document Server

    Caneschi, L; Schwimmer, A

    1976-01-01

    The multiplicity distribution of knocked-out nucleons and its correlation with the multiplicity of produced mesons in inelastic particle-nucleus scattering are computed.

  4. Target Price Accuracy

    Directory of Open Access Journals (Sweden)

    Alexander G. Kerl

    2011-04-01

    Full Text Available This study analyzes the accuracy of forecasted target prices within analysts’ reports. We compute a measure for target price forecast accuracy that evaluates the ability of analysts to exactly forecast the ex-ante (unknown) 12-month stock price. Furthermore, we determine factors that explain this accuracy. Target price accuracy is negatively related to analyst-specific optimism and stock-specific risk (measured by volatility and price-to-book ratio). However, target price accuracy is positively related to the level of detail of each report, company size and the reputation of the investment bank. The potential conflicts of interest between an analyst and a covered company do not bias forecast accuracy.
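
    The paper's exact accuracy measure is not reproduced in this record; the snippet below is only a hypothetical proxy showing how a bounded "ability to exactly forecast the 12-month price" score could be computed from a target price and the realized price.

    ```python
    def target_price_accuracy(target_price, realized_price_12m):
        """Illustrative accuracy score in [0, 1]: 1 means the realized 12-month
        price exactly met the forecast, lower values mean larger relative misses.
        This is a made-up proxy, not the measure defined in the paper."""
        rel_error = abs(realized_price_12m - target_price) / realized_price_12m
        return max(0.0, 1.0 - rel_error)

    print(target_price_accuracy(110.0, 100.0))  # a 10% miss gives a score of 0.9
    ```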

  5. Collectively loading programs in a multiple program multiple data environment

    Science.gov (United States)

    Aho, Michael E.; Attinella, John E.; Gooding, Thomas M.; Miller, Samuel J.

    2016-11-08

    Techniques are disclosed for loading programs efficiently in a parallel computing system. In one embodiment, nodes of the parallel computing system receive a load description file which indicates, for each program of a multiple program multiple data (MPMD) job, nodes which are to load the program. The nodes determine, using collective operations, a total number of programs to load and a number of programs to load in parallel. The nodes further generate a class route for each program to be loaded in parallel, where the class route generated for a particular program includes only those nodes on which the program needs to be loaded. For each class route, a node is selected using a collective operation to be a load leader which accesses a file system to load the program associated with a class route and broadcasts the program via the class route to other nodes which require the program.
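
    A rough analogue of the described scheme can be sketched with MPI-style collectives: nodes that need the same program form a sub-communicator (standing in for a class route), the group's rank 0 acts as the load leader, and the program image is broadcast within the group. The load-description mapping, program names, and the stand-in for the file-system read below are all hypothetical.

    ```python
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()

    # hypothetical load description: which program each node (rank) should run
    program_of_node = {0: "solver_a", 1: "solver_a", 2: "solver_b", 3: "solver_b"}
    my_program = program_of_node.get(rank, "solver_a")

    # nodes that need the same program join the same sub-communicator,
    # a rough stand-in for the per-program "class route"
    programs = sorted(set(program_of_node.values()))
    group = comm.Split(color=programs.index(my_program), key=rank)

    if group.Get_rank() == 0:
        # the "load leader": in the real scheme this rank would read the program
        # image from the file system exactly once per group
        image = ("binary image of " + my_program).encode()
    else:
        image = None

    # broadcast the image within the group, analogous to sending it down the class route
    image = group.bcast(image, root=0)
    print(rank, my_program, len(image))
    ```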

  6. N-way FRET microscopy of multiple protein-protein interactions in live cells.

    Directory of Open Access Journals (Sweden)

    Adam D Hoppe

    Full Text Available Fluorescence Resonance Energy Transfer (FRET) microscopy has emerged as a powerful tool to visualize nanoscale protein-protein interactions while capturing their microscale organization and millisecond dynamics. Recently, FRET microscopy was extended to imaging of multiple donor-acceptor pairs, thereby enabling visualization of multiple biochemical events within a single living cell. These methods require numerous equations that must be defined on a case-by-case basis. Here, we present a universal multispectral microscopy method (N-Way FRET) to enable quantitative imaging for any number of interacting and non-interacting FRET pairs. This approach redefines linear unmixing to incorporate the excitation and emission couplings created by FRET, which cannot be accounted for in conventional linear unmixing. Experiments on a three-fluorophore system using blue, yellow and red fluorescent proteins validate the method in living cells. In addition, we propose a simple linear algebra scheme for error propagation from input data to estimate the uncertainty in the computed FRET images. We demonstrate the strength of this approach by monitoring the oligomerization of three FP-tagged HIV Gag proteins whose tight association in the viral capsid is readily observed. Replacement of one FP-Gag molecule with a lipid raft-targeted FP allowed direct observation of Gag oligomerization with no association between FP-Gag and raft-targeted FP. The N-Way FRET method provides a new toolbox for capturing multiple molecular processes with high spatial and temporal resolution in living cells.
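
    The unmixing-with-error-propagation step described above can be illustrated with ordinary least squares: if the measured channel intensities are modelled as a mixing matrix times component amplitudes, the pseudo-inverse gives the amplitudes and the same linear map propagates the measurement covariance. The mixing matrix and noise levels below are invented for illustration and do not come from the paper.

    ```python
    import numpy as np

    A = np.array([[0.9, 0.1, 0.2],    # rows: spectral channels
                  [0.1, 0.8, 0.3],    # cols: donor, acceptor, FRET component
                  [0.0, 0.1, 0.7]])

    y = np.array([1.2, 0.9, 0.4])             # measured channel intensities in one pixel
    cov_y = np.diag([0.05, 0.05, 0.05]) ** 2  # assumed measurement covariance

    A_pinv = np.linalg.pinv(A)
    x = A_pinv @ y                        # unmixed component amplitudes
    cov_x = A_pinv @ cov_y @ A_pinv.T     # propagated uncertainty (the linear algebra step)
    print(x, np.sqrt(np.diag(cov_x)))
    ```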

  7. MUSIDH, multiple use of simulated demographic histories, a novel method to reduce computation time in microsimulation models of infectious diseases.

    Science.gov (United States)

    Fischer, E A J; De Vlas, S J; Richardus, J H; Habbema, J D F

    2008-09-01

    Microsimulation of infectious diseases requires simulation of many life histories of interacting individuals. In particular, relatively rare infections such as leprosy need to be studied in very large populations. Computation time increases disproportionally with the size of the simulated population. We present a novel method, MUSIDH, an acronym for multiple use of simulated demographic histories, to reduce computation time. Demographic history refers to the processes of birth, death and all other demographic events, which should be unrelated to the natural course of the infection; the method therefore applies to non-fatal infections. MUSIDH attaches a fixed number of infection histories to each demographic history, and these infection histories interact as if they were the infection histories of separate individuals. With two examples, mumps and leprosy, we show that the method can give a factor 50 reduction in computation time at the cost of a small loss in precision. The largest reductions are obtained for rare infections with complex demographic histories.
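
    A minimal sketch of the MUSIDH idea, with invented rates: one (comparatively expensive) demographic history is simulated and then reused for several independent, non-fatal infection histories, as if they belonged to separate individuals.

    ```python
    import random

    K = 50            # infection histories attached to each demographic history
    BETA = 0.02       # per-year infection probability (hypothetical)

    def simulate_demography():
        """One demographic history: reduced here to an age at death."""
        return random.uniform(40, 90)

    def simulate_infection(age_at_death):
        """One non-fatal infection history layered on a demographic history."""
        age = 0.0
        while age < age_at_death:
            if random.random() < BETA:
                return True          # infected at some point in life
            age += 1.0
        return False

    demographies = [simulate_demography() for _ in range(1000)]
    infected = sum(simulate_infection(d) for d in demographies for _ in range(K))
    print("prevalence of ever-infected:", infected / (1000 * K))
    ```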

  8. Scaling laws for simple heavy ion targets

    International Nuclear Information System (INIS)

    Gula, W.P.; Magelssen, G.R.

    1981-01-01

    We have examined the behavior of single shell DT gas filled spherical targets irradiated by a constant power heavy ion beam pulse. For targets in which the ion range is less than the shell thickness, our computational results suggest that the target can be divided into three regions: (1) the absorber (100 to 400 eV for the energies we have considered), (2) the cold pusher (a few eV), and (3) the DT gas fuel. We have examined the pusher collapse time, velocity, and maximum kinetic energy variations as functions of the various target parameters and ion beam energy. The results are expressed in analytic terms and verified by computer simulation

  9. A bio-inspired swarm robot coordination algorithm for multiple target searching

    Science.gov (United States)

    Meng, Yan; Gan, Jing; Desai, Sachi

    2008-04-01

    The coordination of a multi-robot system searching for multiple targets is challenging in a dynamic environment, since the multi-robot system demands group coherence (agents need to have the incentive to work together faithfully) and group competence (agents need to know how to work together well). In our previously proposed bio-inspired coordination method, Local Interaction through Virtual Stigmergy (LIVS), one problem is the considerable randomness of the robot movement during coordination, which may lead to more power consumption and longer searching time. To address these issues, an adaptive LIVS (ALIVS) method is proposed in this paper, which not only considers the travel cost and target weight, but also predicts the target/robot ratio and potential robot redundancy with respect to the detected targets. Furthermore, a dynamic weight adjustment is applied to improve the searching performance. This new method is a truly distributed method where each robot makes its own decision based on its local sensing information and the information from its neighbors. Basically, each robot only communicates with its neighbors through a virtual stigmergy mechanism and makes its local movement decision based on a Particle Swarm Optimization (PSO) algorithm. The proposed ALIVS algorithm has been implemented on the embodied robot simulator Player/Stage in a target searching task. The simulation results demonstrate its efficiency and robustness, in a power-efficient manner, under real-world constraints.
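
    For illustration, the snippet below shows the standard particle swarm optimization update that such a local movement decision can be based on; the inertia and attraction weights, and the neighbourhood information, are placeholders rather than the ALIVS parameters.

    ```python
    import random

    W, C1, C2 = 0.7, 1.5, 1.5   # inertia, personal and neighbourhood attraction (placeholders)

    def pso_step(pos, vel, personal_best, neighbourhood_best):
        """One PSO update of a robot's 2-D position and velocity."""
        new_vel = tuple(
            W * v
            + C1 * random.random() * (pb - x)
            + C2 * random.random() * (nb - x)
            for x, v, pb, nb in zip(pos, vel, personal_best, neighbourhood_best)
        )
        new_pos = tuple(x + v for x, v in zip(pos, new_vel))
        return new_pos, new_vel

    pos, vel = (0.0, 0.0), (0.0, 0.0)
    pos, vel = pso_step(pos, vel, personal_best=(2.0, 1.0), neighbourhood_best=(3.0, 4.0))
    print(pos, vel)
    ```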

  10. CosmoTransitions: Computing cosmological phase transition temperatures and bubble profiles with multiple fields

    Science.gov (United States)

    Wainwright, Carroll L.

    2012-09-01

    I present a numerical package (CosmoTransitions) for analyzing finite-temperature cosmological phase transitions driven by single or multiple scalar fields. The package analyzes the different vacua of a theory to determine their critical temperatures (where the vacuum energy levels are degenerate), their supercooling temperatures, and the bubble wall profiles which separate the phases and describe their tunneling dynamics. I introduce a new method of path deformation to find the profiles of both thin- and thick-walled bubbles. CosmoTransitions is freely available for public use. Program summary: Program Title: CosmoTransitions. Catalogue identifier: AEML_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEML_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 8775. No. of bytes in distributed program, including test data, etc.: 621096. Distribution format: tar.gz. Programming language: Python. Computer: Developed on a 2009 MacBook Pro. No computer-specific optimization was performed. Operating system: Designed and tested on Mac OS X 10.6.8. Compatible with any OS with Python installed. RAM: Approximately 50 MB, mostly for loading plotting packages. Classification: 1.9, 11.1. External routines: SciPy, NumPy, matplotlib. Nature of problem: I describe a program to analyze early-Universe finite-temperature phase transitions with multiple scalar fields. The goal is to analyze the phase structure of an input theory, determine the amount of supercooling at each phase transition, and find the bubble-wall profiles of the nucleated bubbles that drive the transitions. Solution method: To find the bubble-wall profile, the program assumes that tunneling happens along a fixed path in field space. This reduces the equations of motion to one dimension, which can then be solved using the overshoot
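
    The solution method quoted above refers to the overshoot/undershoot technique; a hedged, self-contained sketch of that idea for a single scalar field is given below. The toy potential, friction coefficient, and step sizes are illustrative only and are not taken from CosmoTransitions itself.

    ```python
    # Overshoot/undershoot shooting for a 1-D bounce: bisect on the field value at
    # the bubble centre until the field released there just rolls to rest at the
    # false vacuum at large radius. Toy quartic potential, for illustration only.
    def dV(phi):
        # false vacuum at phi = 0, barrier near phi = 1, true vacuum near phi = 2.2
        return 0.5 * phi * (phi - 1.0) * (phi - 2.2)

    def shoot(phi0, alpha=2.0, dr=2e-3, r_max=80.0):
        """Integrate phi'' + (alpha/r) phi' = dV/dphi outward from r ~ 0.
        Return +1 on overshoot (phi passes the false vacuum) and -1 on
        undershoot (phi turns back before reaching it)."""
        phi, dphi, r = phi0, 0.0, dr
        while r < r_max:
            ddphi = dV(phi) - (alpha / r) * dphi
            dphi += ddphi * dr
            phi += dphi * dr
            r += dr
            if phi <= 0.0:
                return +1            # overshoot
            if dphi >= 0.0:
                return -1            # undershoot: the field turned around
        return -1                    # never reached the false vacuum in range

    lo, hi = 1.0, 2.2 - 1e-6         # bracket the release value between the extrema
    for _ in range(30):              # bisection on phi(0)
        mid = 0.5 * (lo + hi)
        if shoot(mid) > 0:
            hi = mid                 # overshot: release further from the true vacuum
        else:
            lo = mid                 # undershot: release closer to the true vacuum
    print("field at the bubble centre:", 0.5 * (lo + hi))
    ```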

  11. A minimal unified model of disease trajectories captures hallmarks of multiple sclerosis

    KAUST Repository

    Kannan, Venkateshan

    2017-03-29

    Multiple Sclerosis (MS) is an autoimmune disease targeting the central nervous system (CNS) causing demyelination and neurodegeneration leading to accumulation of neurological disability. Here we present a minimal, computational model involving the immune system and CNS that generates the principal subtypes of the disease observed in patients. The model captures several key features of MS, especially those that distinguish the chronic progressive phase from that of the relapse-remitting. In addition, a rare subtype of the disease, progressive relapsing MS, naturally emerges from the model. The model posits the existence of two key thresholds, one in the immune system and the other in the CNS, that separate dynamically distinct behavior of the model. Exploring the two-dimensional space of these thresholds, we obtain multiple phases of disease evolution, and these show greater variation than the clinical classification of MS, thus capturing the heterogeneity that is manifested in patients.

  12. Introduction to programming multiple-processor computers

    International Nuclear Information System (INIS)

    Hicks, H.R.; Lynch, V.E.

    1985-04-01

    FORTRAN applications programs can be executed on multiprocessor computers in either a unitasking (traditional) or multitasking form. The latter allows a single job to use more than one processor simultaneously, with a consequent reduction in wall-clock time and, perhaps, the cost of the calculation. An introduction to programming in this environment is presented. The concepts of synchronization and data sharing using EVENTS and LOCKS are illustrated with examples. The strategy of strong synchronization and the use of synchronization templates are proposed. We emphasize that incorrect multitasking programs can produce irreproducible results, which makes debugging more difficult

  13. LANSCE target system performance

    International Nuclear Information System (INIS)

    Russell, G.J.; Gilmore, J.S.; Robinson, H.; Legate, G.L.; Bridge, A.; Sanchez, R.J.; Brewton, R.J.; Woods, R.; Hughes, H.G. III

    1989-01-01

    We measured neutron beam fluxes at LANSCE using gold foil activation techniques. We did an extensive computer simulation of the as-built LANSCE Target/Moderator/Reflector/Shield geometry. We used this mockup in a Monte Carlo calculation to predict LANSCE neutronic performance for comparison with measured results. For neutron beam fluxes at 1 eV, the ratio of measured data to calculated varies from ∼0.6-0.9. The computed 1 eV neutron leakage at the moderator surface is 3.9 x 10¹⁰ n/(eV·sr·s·μA) for the LANSCE high-intensity water moderators. The corresponding values for the LANSCE high-resolution water moderator and the liquid hydrogen moderator are 3.3 and 2.9 x 10¹⁰, respectively. LANSCE predicted moderator intensities (per proton) for a tungsten target are essentially the same as ISIS predicted moderator intensities for a depleted uranium target. The calculated LANSCE steady-state unperturbed thermal neutron flux is of the order of 10¹³ n/cm²·s. The unique LANSCE split-target/flux-trap-moderator system is performing exceedingly well. The system has operated without a target or moderator change for over three years at nominal proton currents of ∼25 μA of 800-MeV protons. (author)

  14. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  15. Assessing Multiple Pathways for Achieving China’s National Emissions Reduction Target

    Directory of Open Access Journals (Sweden)

    Mingyue Wang

    2018-06-01

    Full Text Available In order to achieve China’s target for carbon intensity reduction by 2030, there is a need to identify a scientific pathway and feasible strategies. In this study, we used a stochastic frontier analysis method of energy efficiency, incorporating energy structure, economic structure, human capital, capital stock and potential energy efficiency, to identify an efficient pathway for achieving the emissions reduction target. We set up 96 scenarios, including single-factor scenarios and multi-factor combination scenarios, for the simulation. The effects of each scenario on achieving the carbon intensity reduction target are then evaluated. It is found that: (1) potential energy efficiency makes the greatest contribution to the carbon intensity reduction target; (2) the 2030 carbon intensity reduction target of 60% is unlikely to be reached by optimizing a single factor only; (3) in order to achieve the 2030 target, several aspects have to be adjusted: the fossil fuel ratio must be lower than 80% and its average growth rate must be decreased by 2.2%; the service sector ratio in GDP must be higher than 58.3%, while the growth rate of non-service sectors must be lowered by 2.4%; and both human capital and capital stock must achieve and maintain stable growth, with a 1% annual increase in energy efficiency. Finally, specific recommendations of this research are discussed: energy efficiency should be constantly improved; the upgrading of China’s industrial structure must be accelerated; emissions reduction must be addressed at the root, the energy sources; multi-level input mechanisms in education and training must be established to cultivate human capital; and investment in emerging equipment must be increased and the closure of backward production capacity accelerated to accumulate capital stock.

  16. Visualizing multiple inter-organelle contact sites using the organelle-targeted split-GFP system.

    Science.gov (United States)

    Kakimoto, Yuriko; Tashiro, Shinya; Kojima, Rieko; Morozumi, Yuki; Endo, Toshiya; Tamura, Yasushi

    2018-04-18

    Functional integrity of eukaryotic organelles relies on direct physical contacts between distinct organelles. However, the identity of organelle-tethering factors is not well understood due to the lack of means to analyze inter-organelle interactions in living cells. Here we evaluate the split-GFP system for visualizing organelle contact sites in vivo and show its advantages and disadvantages. We observed punctate GFP signals from the split-GFP fragments targeted to any pairs of organelles among the ER, mitochondria, peroxisomes, vacuole and lipid droplets in yeast cells, which suggests that these organelles form contact sites with multiple organelles simultaneously, although it is difficult to rule out the possibility that these organelle contact sites are artificially formed by the irreversible associations of the split-GFP probes. Importantly, split-GFP signals in the overlapped regions of the ER and mitochondria were mainly co-localized with ERMES, an authentic ER-mitochondria tethering structure, suggesting that split-GFP assembly depends on the preexisting inter-organelle contact sites. We also confirmed that the split-GFP system can be applied to detection of the ER-mitochondria contact sites in HeLa cells. We thus propose that the split-GFP system is a potential tool to observe and analyze inter-organelle contact sites in living yeast and mammalian cells.

  17. Range distributions in multiply implanted targets

    International Nuclear Information System (INIS)

    Kostic, S.; Jimenez-Rodriguez, J.J.; Karpuzov, D.S.; Armour, D.G.; Carter, G.; Salford Univ.

    1984-01-01

    Range distributions in inhomogeneous binary targets have been investigated both theoretically and experimentally. Silicon single crystal targets [(111) orientation] were implanted with 40 keV Pb⁺ ions to fluences in the range from 5x10¹⁴ to 7.5x10¹⁶ cm⁻² prior to bombardment with 80 keV Kr⁺ ions to a fluence of 5x10¹⁵ cm⁻². The samples were analysed using high resolution Rutherford backscattering before and after the krypton implantation in order to determine the dependence of the krypton distribution on the amount of lead previously implanted. The theoretical analysis was undertaken using the formalism developed in [1] and the computer simulation was based on the MARLOWE code. The agreement between the experimental, theoretical and computational krypton profiles is very good and the results indicate that accurate prediction of range profiles in inhomogeneous binary targets is possible using available theoretical and computational treatments. (orig.)

  18. Generalised two target localisation using passive monopulse radar

    KAUST Repository

    Jardak, Seifallah

    2017-04-07

    The simultaneous lobing technique, also known as the monopulse technique, has been widely used for fast target localisation and tracking purposes. Many works focused on accurately localising one or two targets lying within a narrow beam centred around the monopulse antenna boresight. In this study, a new approach is proposed, which uses the outputs of four antennas to rapidly localise two point targets present in the hemisphere. If both targets have the same elevation angle, the proposed scheme cannot detect them. To detect such targets, a second set of antennas is required. In this study, to detect two targets at generalised locations, the antenna array is divided into multiple overlapping sets of four antennas each. Two algorithms are proposed to combine the outputs from multiple sets and improve the detection performance. Simulation results show that the algorithm is able to localise both targets with <2° mean square error in azimuth and elevation.

  19. Measuring vigilance decrement using computer vision assisted eye tracking in dynamic naturalistic environments.

    Science.gov (United States)

    Bodala, Indu P; Abbasi, Nida I; Yu Sun; Bezerianos, Anastasios; Al-Nashash, Hasan; Thakor, Nitish V

    2017-07-01

    Eye tracking offers a practical solution for monitoring cognitive performance in real world tasks. However, eye tracking in dynamic environments is difficult due to high spatial and temporal variation of stimuli, needing further and thorough investigation. In this paper, we study the possibility of developing a novel computer vision assisted eye tracking analysis by using fixations. Eye movement data is obtained from a long duration naturalistic driving experiment. The scale-invariant feature transform (SIFT) algorithm was implemented using the VLFeat toolbox to identify multiple areas of interest (AOIs). A new measure called `fixation score' was defined to understand the dynamics of fixation position between the target AOI and the non-target AOIs. The fixation score is maximum when the subjects focus on the target AOI and diminishes when they gaze at the non-target AOIs. A statistically significant negative correlation was found between the fixation score and the reaction time data (r = -0.2253). With vigilance decrement, the fixation score decreases as visual attention shifts away from the target objects, resulting in an increase in the reaction time.
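
    The record does not reproduce the exact definition of the fixation score; purely as an illustration of the qualitative description (maximal on the target AOI, diminishing towards non-target AOIs), one possible distance-based proxy is sketched below.

    ```python
    import numpy as np

    def fixation_score(fix, target_aoi, nontarget_aois):
        """Hypothetical proxy: close to 1 when the fixation lands on the target AOI
        centre, decaying as it moves towards the nearest non-target AOI."""
        d_target = np.linalg.norm(np.subtract(fix, target_aoi))
        d_nontarget = min(np.linalg.norm(np.subtract(fix, a)) for a in nontarget_aois)
        return d_nontarget / (d_target + d_nontarget + 1e-9)

    print(fixation_score((100, 100), target_aoi=(105, 98),
                         nontarget_aois=[(300, 40), (20, 250)]))
    ```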

  20. Study of target and non-target interplay in spatial attention task.

    Science.gov (United States)

    Sweeti; Joshi, Deepak; Panigrahi, B K; Anand, Sneh; Santhosh, Jayasree

    2018-02-01

    Selective visual attention is the ability to selectively pay attention to targets while inhibiting distractors. This paper aims to study the interplay of targets and non-targets in a spatial attention task in which the subject attends to the target object present in one visual hemifield and ignores the distractor present in the other visual hemifield. The paper performs averaged evoked response potential (ERP) analysis and time-frequency analysis. The ERP analysis supports left-hemisphere superiority in the late potentials for targets present in the right visual hemifield. The time-frequency analysis uses two parameters, i.e. event-related spectral perturbation (ERSP) and inter-trial coherence (ITC). These parameters show the same properties for targets present in either visual hemifield but differ when comparing the activity corresponding to targets and non-targets. In this way, the study helps to visualise the difference between targets present in the left and right visual hemifields, and also between the targets and non-targets present in the left and right visual hemifields. These results could be utilised to monitor subjects' performance in brain-computer interface (BCI) and neurorehabilitation.

  1. Modeling multiple time series annotations as noisy distortions of the ground truth: An Expectation-Maximization approach.

    Science.gov (United States)

    Gupta, Rahul; Audhkhasi, Kartik; Jacokes, Zach; Rozga, Agata; Narayanan, Shrikanth

    2018-01-01

    Studies of time-continuous human behavioral phenomena often rely on ratings from multiple annotators. Since the ground truth of the target construct is often latent, the standard practice is to use ad-hoc metrics (such as averaging annotator ratings). Despite being easy to compute, such metrics may not provide accurate representations of the underlying construct. In this paper, we present a novel method for modeling multiple time series annotations over a continuous variable that computes the ground truth by modeling annotator specific distortions. We condition the ground truth on a set of features extracted from the data and further assume that the annotators provide their ratings as modifications of the ground truth, with each annotator having specific distortion tendencies. We train the model using an Expectation-Maximization based algorithm and evaluate it on a study involving natural interaction between a child and a psychologist, to predict confidence ratings of the children's smiles. We compare and analyze the model against two baselines where: (i) the ground truth is considered to be the framewise mean of ratings from various annotators and, (ii) each annotator is assumed to bear a distinct time delay in annotation and their annotations are aligned before computing the framewise mean.
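
    In the spirit of the model described above, the sketch below alternates between estimating a latent ground-truth signal and re-fitting a per-annotator affine distortion (scale and bias). The noise model, features, and synthetic data are assumptions made for the example, so this is an illustration of the idea rather than the paper's full EM algorithm.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, A = 200, 3
    truth = np.sin(np.linspace(0, 6, T))                       # latent ground truth
    scales, biases = np.array([1.0, 0.6, 1.4]), np.array([0.0, 0.3, -0.2])
    ratings = scales[:, None] * truth + biases[:, None] + 0.05 * rng.standard_normal((A, T))

    s, b = np.ones(A), np.zeros(A)
    for _ in range(50):
        est = ((ratings - b[:, None]) / s[:, None]).mean(axis=0)   # "E-step": latent estimate
        for a in range(A):                                          # "M-step": per-annotator fit
            s[a], b[a] = np.polyfit(est, ratings[a], 1)
        s /= s.mean()                                               # fix the overall scale ambiguity
    print(np.corrcoef(est, truth)[0, 1])                            # close to 1 under these assumptions
    ```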

  2. Foraging through multiple target categories reveals the flexibility of visual working memory.

    Science.gov (United States)

    Kristjánsson, Tómas; Kristjánsson, Árni

    2018-02-01

    A key assumption in the literature on visual attention is that templates, actively maintained in visual working memory (VWM), guide visual attention. An important question therefore involves the nature and capacity of VWM. According to load theories, more than one search template can be active at the same time and capacity is determined by the total load rather than a precise number of templates. By an alternative account only one search template can be active within visual working memory at any given time, while other templates are in an accessory state - but do not affect visual selection. We addressed this question by varying the number of targets and distractors in a visual foraging task for 40 targets among 40 distractors in two ways: 1) Fixed-distractor-number, involving two distractor types while target categories varied from one to four. 2) Fixed-color-number (7), so that if the target types were two, distractor types were five, while if target number increased to three, distractor types were four (etc.). The two accounts make differing predictions. Under the single-template account, we should expect large switch costs as target types increase to two, but switch costs should not increase much as target types increase beyond two. Load accounts predict an approximately linear increase in switch costs with increased target type number. The results were that switch costs increased roughly linearly in both conditions, in line with load accounts. The results are discussed in light of recent proposals that working memory reflects lingering neural activity at various sites that operate on the stimuli in each case and findings showing neurally silent working memory representations. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Single-photon sensitive fast ebCMOS camera system for multiple-target tracking of single fluorophores: application to nano-biophotonics

    Science.gov (United States)

    Cajgfinger, Thomas; Chabanat, Eric; Dominjon, Agnes; Doan, Quang T.; Guerin, Cyrille; Houles, Julien; Barbier, Remi

    2011-03-01

    Nano-biophotonics applications will benefit from new fluorescence microscopy methods based essentially on super-resolution techniques (beyond the diffraction limit) on large biological structures (membranes) with fast frame rates (1000 Hz). This trend tends to push the photon detectors to the single-photon counting regime and the camera acquisition system to real-time dynamic multiple-target tracing. The LUSIPHER prototype presented in this paper aims to give a different approach from that of electron-multiplied CCD (EMCCD) technology and tries to answer the stringent demands of the new nano-biophotonics imaging techniques. The electron bombarded CMOS (ebCMOS) device has the potential to respond to this challenge, thanks to the linear gain of the accelerating high voltage of the photo-cathode, to the possible ultra fast frame rate of CMOS sensors and to the single-photon sensitivity. We produced a camera system based on a 640 kpixel ebCMOS with its acquisition system. The proof of concept of single-photon based tracking for multiple single emitters is the main result of this paper.

  4. Least-squares reverse time migration of multiples

    KAUST Repository

    Zhang, Dongliang

    2013-12-06

    The theory of least-squares reverse time migration of multiples (RTMM) is presented. In this method, least squares migration (LSM) is used to image free-surface multiples where the recorded traces are used as the time histories of the virtual sources at the hydrophones and the surface-related multiples are the observed data. For a single source, the entire free-surface becomes an extended virtual source where the downgoing free-surface multiples more fully illuminate the subsurface compared to the primaries. Since each recorded trace is treated as the time history of a virtual source, knowledge of the source wavelet is not required and the ringy time series for each source is automatically deconvolved. If the multiples can be perfectly separated from the primaries, numerical tests on synthetic data for the Sigsbee2B and Marmousi2 models show that least-squares reverse time migration of multiples (LSRTMM) can significantly improve the image quality compared to RTMM or standard reverse time migration (RTM) of primaries. However, if there is imperfect separation and the multiples are strongly interfering with the primaries then LSRTMM images show no significant advantage over the primary migration images. In some cases, they can be of worse quality. Applying LSRTMM to Gulf of Mexico data shows higher signal-to-noise imaging of the salt bottom and top compared to standard RTM images. This is likely attributed to the fact that the target body is just below the sea bed so that the deep water multiples do not have strong interference with the primaries. Migrating a sparsely sampled version of the Marmousi2 ocean bottom seismic data shows that LSM of primaries and LSRTMM provides significantly better imaging than standard RTM. A potential liability of LSRTMM is that multiples require several round trips between the reflector and the free surface, so that high frequencies in the multiples suffer greater attenuation compared to the primary reflections. This can lead to lower

  5. Compression of magnetized target in the magneto-inertial fusion

    Science.gov (United States)

    Kuzenov, V. V.

    2017-12-01

    This paper presents a mathematical model, numerical method and results of the computer analysis of the compression process and the energy transfer in the target plasma, used in magneto-inertial fusion. The computer simulation of the compression process of magnetized cylindrical target by high-power laser pulse is presented.

  6. Parallel solid-phase isothermal amplification and detection of multiple DNA targets in microliter-sized wells of a digital versatile disc

    International Nuclear Information System (INIS)

    Santiago-Felipe, Sara; Tortajada-Genaro, Luis Antonio; Puchades, Rosa; Maquieira, Ángel

    2016-01-01

    An integrated method for the parallelized detection of multiple DNA target sequences is presented by using microstructures in a digital versatile disc (DVD). Samples and reagents were managed by using both the capillary and centrifugal forces induced by disc rotation. Recombinase polymerase amplification (RPA), in a bridge solid phase format, took place in separate wells, which thereby modified their optical properties. Then the DVD drive reader recorded the modifications of the transmitted laser beam. The strategy allowed tens of genetic determinations to be made simultaneously within <2 h, with small sample volumes (3 μL), low manipulation and at low cost. The method was applied to high-throughput screening of relevant safety threats (allergens, GMOs and pathogenic bacteria) in food samples. Satisfactory results were obtained in terms of sensitivity (48.7 fg of DNA) and reproducibility (below 18 %). This scheme warrants cost-effective multiplex amplification and detection and is perceived to represent a viable tool for screening of nucleic acid targets. (author)

  7. Hybrid value foraging: How the value of targets shapes human foraging behavior.

    Science.gov (United States)

    Wolfe, Jeremy M; Cain, Matthew S; Alaoui-Soce, Abla

    2018-04-01

    In hybrid foraging, observers search visual displays for multiple instances of multiple target types. In previous hybrid foraging experiments, although there were multiple types of target, all instances of all targets had the same value. Under such conditions, behavior was well described by the marginal value theorem (MVT). Foragers left the current "patch" for the next patch when the instantaneous rate of collection dropped below their average rate of collection. An observer's specific target selections were shaped by previous target selections. Observers were biased toward picking another instance of the same target. In the present work, observers forage for instances of four target types whose value and prevalence can vary. If value is kept constant and prevalence manipulated, participants consistently show a preference for the most common targets. Patch-leaving behavior follows MVT. When value is manipulated, observers favor more valuable targets, though individual foraging strategies become more diverse, with some observers favoring the most valuable target types very strongly, sometimes moving to the next patch without collecting any of the less valuable targets.
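
    The marginal value theorem mentioned above has a compact patch-leaving form; the notation below is introduced here for illustration and is not taken from the paper.

    ```latex
    % Patch-leaving rule of the marginal value theorem (symbols defined here):
    % g_i(t)  cumulative gain after spending time t in patch i
    % t_i^*   time at which patch i is left,  \tau  mean travel time between patches
    % Leave the current patch once the instantaneous gain rate drops to the
    % long-run average rate R^* achieved over patches plus travel:
    \[
    \left.\frac{\mathrm{d}g_i(t)}{\mathrm{d}t}\right|_{t=t_i^{*}} \;=\;
    R^{*} \;=\; \frac{\sum_i g_i\!\left(t_i^{*}\right)}{\sum_i \left(t_i^{*}+\tau\right)}
    \]
    ```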

  8. MO-C-17A-06: Online Adaptive Re-Planning to Account for Independent Motions Between Multiple Targets During Radiotherapy of Lung Cancer

    International Nuclear Information System (INIS)

    Liu, F; Tai, A; Ahunbay, E; Gore, E; Johnstone, C; Li, X

    2014-01-01

    Purpose: To quantify interfractional independent motions between multiple targets in radiotherapy (RT) of lung cancer, and to study the dosimetric benefits of an online adaptive replanning method to account for these variations. Methods: Ninety five diagnostic-quality daily CTs acquired for 9 lung cancer patients treated with IGRT using an in-room CT (CTVision, Siemens) were analyzed. On each daily CT set, contours of the targets (GTV, CTV, or involved nodes) and organs at risk were generated by populating the planning contours using an auto-segmentation tool (ABAS, Elekta) with manual editing. For each patient, an IMRT plan was generated based on the planning CT with a prescription dose of 60 Gy in 2Gy fractions. Three plans were generated and compared for each daily CT set: an IGRT (repositioning) plan by copying the original plan with the required shifts, an online adaptive plan by rapidly modifying the aperture shapes and segment weights of the original plan to conform to the daily anatomy, and a new fully re-optimized plan based on the daily CT using a planning system (Panther, Prowess). Results: The daily deviations of the distance between centers of masses of the targets from the plans varied daily from -10 to 8 mm with an average −0.9±4.1 mm (one standard deviation). The average CTV V100 values are 99.0±0.7%, 97.9±2.8%, 99.0±0.6%, and 99.1±0.6%, and the lung V20 Gy values are 928±332 cc, 944±315 cc, 917±300 cc, and 891±295 cc for the original, repositioning, adaptive, and re-optimized plans, respectively. Wilcoxon signed-rank tests show that the adaptive plans are statistically significantly better than the repositioning plans and comparable with the reoptimized plans. Conclusion: There exist unpredictable, interfractional, relative volume changes and independent motions between multiple targets during lung cancer RT which cannot be accounted for by the current IGRT repositioning but can be corrected by the online adaptive replanning method.

  9. MO-C-17A-06: Online Adaptive Re-Planning to Account for Independent Motions Between Multiple Targets During Radiotherapy of Lung Cancer

    Energy Technology Data Exchange (ETDEWEB)

    Liu, F; Tai, A; Ahunbay, E; Gore, E; Johnstone, C; Li, X [Medical College of Wisconsin, Milwaukee, WI (United States)]

    2014-06-15

    Purpose: To quantify interfractional independent motions between multiple targets in radiotherapy (RT) of lung cancer, and to study the dosimetric benefits of an online adaptive replanning method to account for these variations. Methods: Ninety five diagnostic-quality daily CTs acquired for 9 lung cancer patients treated with IGRT using an in-room CT (CTVision, Siemens) were analyzed. On each daily CT set, contours of the targets (GTV, CTV, or involved nodes) and organs at risk were generated by populating the planning contours using an auto-segmentation tool (ABAS, Elekta) with manual editing. For each patient, an IMRT plan was generated based on the planning CT with a prescription dose of 60 Gy in 2Gy fractions. Three plans were generated and compared for each daily CT set: an IGRT (repositioning) plan by copying the original plan with the required shifts, an online adaptive plan by rapidly modifying the aperture shapes and segment weights of the original plan to conform to the daily anatomy, and a new fully re-optimized plan based on the daily CT using a planning system (Panther, Prowess). Results: The daily deviations of the distance between centers of masses of the targets from the plans varied daily from -10 to 8 mm with an average −0.9±4.1 mm (one standard deviation). The average CTV V100 values are 99.0±0.7%, 97.9±2.8%, 99.0±0.6%, and 99.1±0.6%, and the lung V20 Gy values are 928±332 cc, 944±315 cc, 917±300 cc, and 891±295 cc for the original, repositioning, adaptive, and re-optimized plans, respectively. Wilcoxon signed-rank tests show that the adaptive plans are statistically significantly better than the repositioning plans and comparable with the reoptimized plans. Conclusion: There exist unpredictable, interfractional, relative volume changes and independent motions between multiple targets during lung cancer RT which cannot be accounted for by the current IGRT repositioning but can be corrected by the online adaptive replanning method.

  10. Molecular Imaging of Cancer Using X-ray Computed Tomography with Protease Targeted Iodinated Activity-Based Probes.

    Science.gov (United States)

    Gaikwad, Hanmant K; Tsvirkun, Darya; Ben-Nun, Yael; Merquiol, Emmanuelle; Popovtzer, Rachela; Blum, Galia

    2018-03-14

    X-ray computed tomography (CT) is a robust, precise, fast, and reliable imaging method that enables excellent spatial resolution and quantification of contrast agents throughout the body. However, CT is largely inadequate for molecular imaging applications, mainly due to its low contrast sensitivity, which forces the use of large concentrations of contrast agents for detection. To overcome this limitation, we generated a new class of iodinated nanoscale activity-based probes (IN-ABPs) that accumulate sufficiently at the target site by covalently binding cysteine cathepsins, which are exceptionally highly expressed in cancer. The IN-ABPs are composed of a short targeting peptide selective for specific cathepsins, an electrophilic moiety that allows activity-dependent covalent binding, and tags containing dendrimers with up to 48 iodine atoms. IN-ABPs selectively bind and inhibit the activity of recombinant and intracellular cathepsins B, L, and S. We compared the in vivo kinetics, biodistribution, and tumor accumulation of IN-ABPs bearing 18 and 48 iodine atoms each, and their control counterparts lacking the targeting moiety. Here we show that although both IN-ABPs bind specifically to cathepsins within the tumor and produce detectable CT contrast, the 48-iodine IN-ABP was found to be optimal, with signals over 2.1-fold higher than its nontargeted counterpart. In conclusion, this study shows the synthetic feasibility and potential utility of IN-ABPs as potent contrast agents that enable molecular imaging of tumors using CT.

  11. On the average complexity of sphere decoding in lattice space-time coded multiple-input multiple-output channel

    KAUST Repository

    Abediseid, Walid

    2012-12-21

    The exact average complexity analysis of the basic sphere decoder for general space-time codes applied to the multiple-input multiple-output (MIMO) wireless channel is known to be difficult. In this work, we shed light on the computational complexity of sphere decoding for the quasi-static, lattice space-time (LAST) coded MIMO channel. Specifically, we derive an upper bound on the tail distribution of the decoder's computational complexity. We show that when the computational complexity exceeds a certain limit, this upper bound becomes dominated by the outage probability achieved by LAST coding and sphere decoding schemes. We then calculate the minimum average computational complexity that is required by the decoder to achieve near optimal performance in terms of the system parameters. Our results indicate that there exists a cut-off rate (multiplexing gain) for which the average complexity remains bounded. Copyright © 2012 John Wiley & Sons, Ltd.
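
    The complexity measure discussed above, the number of lattice points (search-tree nodes) a sphere decoder visits, can be illustrated with a small self-contained sketch. This is a generic depth-first sphere decoder over a finite alphabet with node counting, under assumed toy parameters; it is not the decoder or the bounds analyzed in the paper.

```python
# Minimal, generic sphere decoder (toy setup, not the paper's scheme): returns the
# ML symbol vector and the number of search-tree nodes visited, the usual empirical
# proxy for sphere-decoding complexity.
import numpy as np

def sphere_decode(y, H, symbols):
    """Find x in symbols^n minimizing ||y - Hx||; also count visited nodes."""
    n = H.shape[1]
    Q, R = np.linalg.qr(H)            # ||y - Hx|| = ||Q^T y - Rx||, R upper triangular
    z = Q.T @ y
    best = {"x": None, "d2": np.inf, "nodes": 0}

    def search(level, partial, dist2):
        best["nodes"] += 1
        if dist2 >= best["d2"]:       # prune branches outside the current sphere
            return
        if level < 0:                 # complete candidate vector found
            best["d2"], best["x"] = dist2, partial.copy()
            return
        for s in symbols:             # extend the candidate at this tree level
            partial[level] = s
            resid = z[level] - R[level, level:] @ partial[level:]
            search(level - 1, partial, dist2 + resid ** 2)

    search(n - 1, np.zeros(n), 0.0)
    return best["x"], best["nodes"]

# Hypothetical 4x4 real-valued MIMO channel with BPSK symbols
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 4))
x_true = rng.choice([-1.0, 1.0], size=4)
y = H @ x_true + 0.1 * rng.normal(size=4)
x_hat, nodes = sphere_decode(y, H, symbols=(-1.0, 1.0))
print(x_hat, x_true, nodes)
```

    Averaging the node count over many channel and noise realizations gives an empirical estimate of the average complexity whose behaviour the paper characterizes analytically.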

  12. Perfect quantum multiple-unicast network coding protocol

    Science.gov (United States)

    Li, Dan-Dan; Gao, Fei; Qin, Su-Juan; Wen, Qiao-Yan

    2018-01-01

    In order to realize long-distance and large-scale quantum communication, it is natural to use quantum repeaters. For a general quantum multiple-unicast network, it has remained unclear how to complete communication tasks perfectly with fewer resources, such as registers. In this paper, we solve this problem. By applying quantum repeaters to the multiple-unicast communication problem, we give encoding-decoding schemes for source, internal, and target nodes, respectively. Source and target nodes share EPR pairs by using our encoding-decoding schemes over the quantum multiple-unicast network. Quantum communication can then be accomplished perfectly via teleportation. Compared with existing schemes, our schemes can reduce resource consumption and realize long-distance transmission of quantum information.
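
    The teleportation step that the protocol ultimately relies on can be checked with a small statevector simulation. The sketch below is a generic three-qubit teleportation verification in numpy, not the authors' encoding-decoding scheme for multiple-unicast networks.

```python
# Generic teleportation sketch (illustrative assumption, not the paper's protocol):
# qubit 0 holds the state to send, qubits 1-2 hold the shared EPR pair, and the
# receiver's correction Z^m0 X^m1 recovers the state for every measurement outcome.
import numpy as np
from itertools import product

I2 = np.eye(2)
X = np.array([[0., 1.], [1., 0.]])
Z = np.diag([1., -1.])
H = np.array([[1., 1.], [1., -1.]]) / np.sqrt(2)

def on(gate, qubit, n=3):
    """Embed a single-qubit gate on `qubit` of an n-qubit register (qubit 0 = leftmost)."""
    ops = [I2] * n
    ops[qubit] = gate
    full = ops[0]
    for g in ops[1:]:
        full = np.kron(full, g)
    return full

def cnot(control, target, n=3):
    """Permutation matrix implementing CNOT between two qubits of an n-qubit register."""
    dim = 2 ** n
    U = np.zeros((dim, dim))
    for i in range(dim):
        bits = [(i >> (n - 1 - k)) & 1 for k in range(n)]
        if bits[control]:
            bits[target] ^= 1
        j = sum(b << (n - 1 - k) for k, b in enumerate(bits))
        U[j, i] = 1.0
    return U

psi = np.array([0.6, 0.8])                       # hypothetical state to teleport
epr = np.array([1., 0., 0., 1.]) / np.sqrt(2)    # EPR pair shared by sender and receiver
state = on(H, 0) @ cnot(0, 1) @ np.kron(psi, epr)

for m0, m1 in product((0, 1), repeat=2):         # enumerate the sender's outcomes
    branch = state.reshape(2, 2, 2)[m0, m1, :]   # receiver's unnormalized qubit
    branch = branch / np.linalg.norm(branch)
    corrected = np.linalg.matrix_power(Z, m0) @ np.linalg.matrix_power(X, m1) @ branch
    assert np.allclose(corrected, psi), (m0, m1)
print("teleportation recovers psi for every measurement outcome")
```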

  13. Rationally engineered nanoparticles target multiple myeloma cells, overcome cell-adhesion-mediated drug resistance, and show enhanced efficacy in vivo

    International Nuclear Information System (INIS)

    Kiziltepe, T; Ashley, J D; Stefanick, J F; Qi, Y M; Alves, N J; Handlogten, M W; Suckow, M A; Navari, R M; Bilgicer, B

    2012-01-01

    In the continuing search for effective cancer treatments, we report the rational engineering of a multifunctional nanoparticle that combines traditional chemotherapy with cell-targeting and anti-adhesion functionalities. Very late antigen-4 (VLA-4)-mediated adhesion of multiple myeloma (MM) cells to bone marrow stroma confers MM cells with cell-adhesion-mediated drug resistance (CAM-DR). In our design, we used micellar nanoparticles as dynamic self-assembling scaffolds to present VLA-4-antagonist peptides and doxorubicin (Dox) conjugates simultaneously, to selectively target MM cells and to overcome CAM-DR. Dox was conjugated to the nanoparticles through an acid-sensitive hydrazone bond. VLA-4-antagonist peptides were conjugated via a multifaceted synthetic procedure that generates a precisely controlled number of targeting functionalities. The nanoparticles were efficiently internalized by MM cells and induced cytotoxicity. Mechanistic studies revealed that the nanoparticles induced DNA double-strand breaks and apoptosis in MM cells. Importantly, the multifunctional nanoparticles overcame CAM-DR and were more efficacious than Dox when MM cells were cultured on fibronectin-coated plates. Finally, in an MM xenograft model, the nanoparticles preferentially homed to MM tumors with ∼10-fold more drug accumulation and demonstrated dramatic tumor growth inhibition with reduced overall systemic toxicity. Altogether, we demonstrate the disease-driven engineering of a nanoparticle-based drug delivery system, providing a model for an integrative approach to the treatment of MM.

  14. MULTIPLE OBJECTS

    Directory of Open Access Journals (Sweden)

    A. A. Bosov

    2015-04-01

    Purpose. The development of complex production and management processes, information systems, computer science, and applied objects of systems theory requires improved mathematical methods and new approaches to the study of applied systems. The variety and diversity of subject systems makes it necessary to develop a model that generalizes classical sets and their extension, sets of sets. Multiple objects, unlike sets, are built from multiple structures and are represented by both structure and content. The aim of this work is to analyze the multiple structures that generate multiple objects and to further develop operations on these objects in applied systems. Methodology. To achieve these objectives, the structure of a multiple object is represented as a constructive triple consisting of a carrier (medium), a signature, and axiomatics. A multiple object is determined by its structure and content and is represented by a hybrid superposition composed of sets, multisets, ordered sets (lists), and heterogeneous sets (sequences, corteges). Findings. We study the properties and characteristics of the components of hybrid multiple objects in complex systems, propose assessments of their complexity, and state the rules for internal and external operations on such objects. We introduce an arbitrary order relation over multiple objects and define functions and mappings on objects of multiple structures. Originality. We consider the development of the multiple structures that generate multiple objects. Practical value. The transition from abstract multiple structures to subject-specific ones requires a transformation of the system and its multiple objects. This transformation involves three successive stages: specification (binding to the domain), interpretation (multiple sites), and particularization (goals). The proposed systems approach is based on hybrid sets.
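
    As an interpretive sketch only, one way to render a hybrid multiple object in code is to bundle a set, a multiset, an ordered list, and a cortege (tuple) and define componentwise operations. The field names and the combination rule below are assumptions made for illustration, not definitions from the paper.

```python
# Interpretive sketch (assumptions, not the paper's formal definitions):
# a "multiple object" modeled as a hybrid superposition of a set, a multiset,
# an ordered list and a cortege (tuple), with one componentwise operation.
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class MultipleObject:
    plain_set: set = field(default_factory=set)          # unordered, unique elements
    multiset: Counter = field(default_factory=Ounter := Counter)  # elements with multiplicities
    ordered: list = field(default_factory=list)           # ordered, repetitions allowed
    cortege: tuple = ()                                    # fixed heterogeneous record

    def combine(self, other: "MultipleObject") -> "MultipleObject":
        """An example of an external operation, applied componentwise."""
        return MultipleObject(
            self.plain_set | other.plain_set,
            self.multiset + other.multiset,
            self.ordered + other.ordered,
            self.cortege + other.cortege,
        )

a = MultipleObject({1, 2}, Counter("aab"), [3, 1], ("node", 5))
b = MultipleObject({2, 3}, Counter("bc"), [2], ("edge", 7))
print(a.combine(b))
```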

  15. Applying Ancestry and Sex Computation as a Quality Control Tool in Targeted Next-Generation Sequencing.

    Science.gov (United States)

    Mathias, Patrick C; Turner, Emily H; Scroggins, Sheena M; Salipante, Stephen J; Hoffman, Noah G; Pritchard, Colin C; Shirts, Brian H

    2016-03-01

    To apply techniques for ancestry and sex computation from next-generation sequencing (NGS) data as an approach to confirm sample identity and detect sample processing errors. We combined a principal component analysis method with k-nearest neighbors classification to compute the ancestry of patients undergoing NGS testing. By combining this calculation with X chromosome copy number data, we determined the sex and ancestry of patients for comparison with self-report. We also modeled the sensitivity of this technique in detecting sample processing errors. We applied this technique to 859 patient samples with reliable self-report data. Our k-nearest neighbors ancestry screen had an accuracy of 98.7% for patients reporting a single ancestry. Visual inspection of principal component plots was consistent with self-report in 99.6% of single-ancestry and mixed-ancestry patients. Our model demonstrates that approximately two-thirds of potential sample swaps could be detected in our patient population using this technique. Patient ancestry can be estimated from NGS data incidentally sequenced in targeted panels, enabling an inexpensive quality control method when coupled with patient self-report. © American Society for Clinical Pathology, 2016. All rights reserved.
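
    A hedged sketch of the general approach (PCA projection of genotypes plus k-nearest neighbors classification, with a separate X-chromosome copy-number check against self-report) follows. The reference panel, thresholds, and genotype encoding are illustrative assumptions, not the authors' pipeline.

```python
# Illustrative sketch of PCA + k-NN ancestry estimation and an X-copy-number sex
# check; all data, labels and the 1.5 copy-number threshold are hypothetical.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)

# Hypothetical reference panel: rows = individuals, columns = variant genotypes (0/1/2)
ref_genotypes = rng.integers(0, 3, size=(300, 500)).astype(float)
ref_ancestry = rng.choice(["EUR", "AFR", "EAS"], size=300)
patient_genotypes = rng.integers(0, 3, size=(5, 500)).astype(float)

pca = PCA(n_components=4).fit(ref_genotypes)
knn = KNeighborsClassifier(n_neighbors=15).fit(pca.transform(ref_genotypes), ref_ancestry)
predicted_ancestry = knn.predict(pca.transform(patient_genotypes))

# Sex call from normalized X-chromosome copy number (threshold is an assumption)
x_copy_number = np.array([1.0, 2.0, 1.9, 1.1, 2.1])
predicted_sex = np.where(x_copy_number < 1.5, "male", "female")

# Compare with (hypothetical) self-report and flag discordant samples as possible swaps
self_reported = [("EUR", "male"), ("AFR", "female"), ("EAS", "female"),
                 ("EUR", "male"), ("EUR", "female")]
for predicted, reported in zip(zip(predicted_ancestry, predicted_sex), self_reported):
    if tuple(predicted) != reported:
        print("possible sample swap:", tuple(predicted), "vs self-report", reported)
```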

  16. Memory and selective attention in multiple sclerosis: cross-sectional computer-based assessment in a large outpatient sample.

    Science.gov (United States)

    Adler, Georg; Lembach, Yvonne

    2015-08-01

    Cognitive impairments may have a severe impact on everyday functioning and quality of life of patients with multiple sclerosis (MS). However, there are methodological problems in their assessment, and only a few studies allow a representative estimate of the prevalence and severity of cognitive impairments in MS patients. We applied a computer-based method, the memory and attention test (MAT), in 531 outpatients with MS, who were assessed at nine neurological practices or specialized outpatient clinics. The findings were compared with those obtained in an age-, sex- and education-matched control group of 84 healthy subjects. Episodic short-term memory was substantially decreased in the MS patients. About 20% of them scored more than two standard deviations below the mean of the control group. The episodic short-term memory score was negatively correlated with the EDSS score. Smaller but still significant impairments were found in the MS patients for verbal short-term memory, episodic working memory, and selective attention. The computer-based MAT was found to be useful for routine assessment of cognition in MS outpatients.

  17. Common pitfalls in preclinical cancer target validation.

    Science.gov (United States)

    Kaelin, William G

    2017-07-01

    An alarming number of papers from laboratories nominating new cancer drug targets contain findings that cannot be reproduced by others or are simply not robust enough to justify drug discovery efforts. This problem probably has many causes, including an underappreciation of the danger of being misled by off-target effects when using pharmacological or genetic perturbants in complex biological assays. This danger is particularly acute when, as is often the case in cancer pharmacology, the biological phenotype being measured is a 'down' readout (such as decreased proliferation, decreased viability or decreased tumour growth) that could simply reflect a nonspecific loss of cellular fitness. These problems are compounded by multiple hypothesis testing, such as when candidate targets emerge from high-throughput screens that interrogate multiple targets in parallel, and by a publication and promotion system that preferentially rewards positive findings. In this Perspective, I outline some of the common pitfalls in preclinical cancer target identification and some potential approaches to mitigate them.
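
    Where candidate targets emerge from screens that test many hypotheses in parallel, a standard mitigation is false-discovery-rate control. The sketch below applies Benjamini-Hochberg correction to hypothetical screen p-values; it is a generic illustration, not a method prescribed in the Perspective.

```python
# Generic illustration of multiple-hypothesis correction for screen hits;
# the p-values are hypothetical and the FDR threshold is an assumption.
from statsmodels.stats.multitest import multipletests

screen_p_values = [0.0004, 0.012, 0.03, 0.04, 0.20, 0.45, 0.81]  # one per candidate target
reject, p_adjusted, _, _ = multipletests(screen_p_values, alpha=0.05, method="fdr_bh")

for p, p_adj, keep in zip(screen_p_values, p_adjusted, reject):
    print(f"raw p = {p:.4f}  BH-adjusted p = {p_adj:.4f}  significant: {keep}")
```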

  18. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Directory of Open Access Journals (Sweden)

    Haydda Manolla Chaves da Hora

    2012-04-01

    In Brazil today there are many cases of incompatibility between water use and water availability. Owing to the increase in the required variety and volume of water, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs under several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software – Water Modeling System) as a tool for water resources management.

  19. Bioinformatic analysis of xenobiotic reactive metabolite target proteins and their interacting partners

    Directory of Open Access Journals (Sweden)

    Hanzlik Robert P

    2009-06-01

    Background Protein covalent binding by reactive metabolites of drugs, chemicals and natural products can lead to acute cytotoxicity. Recent rapid progress in reactive metabolite target protein identification has shown that adduction is surprisingly selective and has inspired the hope that analysis of target proteins might reveal protein factors that differentiate target vs. non-target proteins and illuminate mechanisms connecting covalent binding to cytotoxicity. Results Sorting 171 known reactive metabolite target proteins revealed a number of GO categories and KEGG pathways to be significantly enriched in targets, but in most cases the classes were too large, and the "percent coverage" too small, to allow meaningful conclusions about mechanisms of toxicity. However, a similar analysis of the directly interacting partners of 28 common targets of multiple reactive metabolites revealed highly significant enrichments in terms likely to be highly relevant to cytotoxicity (e.g., MAP kinase pathways, apoptosis, response to unfolded protein). Machine learning was used to rank the contribution of 211 computed protein features to determining protein susceptibility to adduction. Protein lysine (but not cysteine) content and protein instability index (i.e., rate of turnover in vivo) were among the features most important in determining susceptibility. Conclusion As yet there is no good explanation for why some low-abundance proteins become heavily adducted while some abundant proteins become only lightly adducted in vivo. Analyzing the directly interacting partners of target proteins appears to yield greater insight into mechanisms of toxicity than analyzing target proteins per se. The insights provided can readily be formulated as hypotheses to test in future experimental studies.
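
    The feature-ranking step can be illustrated with a generic sketch: fit a classifier on computed protein features and rank the features by importance. The model choice (random forest), feature names, and data below are assumptions for illustration, not the study's actual pipeline.

```python
# Illustrative only: ranking computed protein features by their contribution to
# predicting target vs. non-target status; model, features and data are assumed.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)
feature_names = ["lysine_content", "cysteine_content", "instability_index", "abundance"]
X = pd.DataFrame(rng.normal(size=(200, len(feature_names))), columns=feature_names)
y = rng.integers(0, 2, size=200)   # 1 = known reactive-metabolite target, 0 = non-target

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
ranking = pd.Series(model.feature_importances_, index=feature_names).sort_values(ascending=False)
print(ranking)
```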

  20. TargetMine, an integrated data warehouse for candidate gene prioritisation and target discovery.

    Directory of Open Access Journals (Sweden)

    Yi-An Chen

    Prioritising candidate genes for further experimental characterisation is a non-trivial challenge in drug discovery and biomedical research in general. An integrated approach that combines results from multiple data types is best suited for optimal target selection. We developed TargetMine, a data warehouse for efficient target prioritisation. TargetMine utilises the InterMine framework, with new data models such as protein-DNA interactions integrated in a novel way. It enables complicated searches that are difficult to perform with existing tools and it also offers integration of custom annotations and in-house experimental data. We proposed an objective protocol for target prioritisation using TargetMine and set up a benchmarking procedure to evaluate its performance. The results show that the protocol can identify known disease-associated genes with high precision and coverage. A demonstration version of TargetMine is available at http://targetmine.nibio.go.jp/.
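
    A minimal sketch of the benchmarking idea, scoring a ranked candidate list against a set of known disease-associated genes for precision and coverage, is given below. The gene symbols and the cutoff are hypothetical; TargetMine itself is accessed through its own web interface.

```python
# Hypothetical benchmarking sketch: precision and coverage of a ranked candidate
# gene list against known disease genes. Gene symbols and the cutoff are made up.
known_disease_genes = {"BRCA1", "TP53", "PTEN", "ATM"}
ranked_candidates = ["TP53", "EGFR", "BRCA1", "MYC", "PTEN", "KRAS"]

top_k = ranked_candidates[:5]
hits = [gene for gene in top_k if gene in known_disease_genes]

precision = len(hits) / len(top_k)               # fraction of prioritised genes that are known
coverage = len(hits) / len(known_disease_genes)  # fraction of known genes recovered
print(f"precision@5 = {precision:.2f}, coverage = {coverage:.2f}")
```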