WorldWideScience

Sample records for previously published algorithm

  1. Publish/Subscribe on Top of DHT Using RETE Algorithm

    Science.gov (United States)

    Shvartzshnaider, Yan; Ott, Maximilian; Levy, David

    This paper discusses the construction of a Global Semantic Graph (GSG) [1] to support future information- and collaboration-centric applications and services. The GSG is a publish/subscribe (pub/sub) based architecture that supports publication of tuples and subscriptions with standing graph queries. We believe that an implementation of an efficient pattern matching algorithm such as Rete [2] on top of a distributed environment could serve as a substrate for GSG's pub/sub facility. Rete operates on loosely coupled alpha, beta and join nodes, and we therefore chose it for implementation in a distributed setting.
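
    The alpha/beta node structure mentioned above can be sketched in miniature. The following toy sketch (in Python, with illustrative tuples and patterns that are not from the paper) shows alpha nodes filtering individual tuples by pattern and a join node combining matches on a shared variable:

```python
# Toy flavor of Rete's structure: alpha nodes filter tuples,
# a beta/join node combines matches on a shared variable (?doc).
# Hand-rolled sketch, not the paper's distributed implementation.

tuples = [
    ("alice", "authored", "doc1"),
    ("doc1", "topic", "pubsub"),
    ("bob", "authored", "doc2"),
]

# Alpha nodes: select tuples matching each pattern
authored = [t for t in tuples if t[1] == "authored"]  # (?who authored ?doc)
topics   = [t for t in tuples if t[1] == "topic"]     # (?doc topic ?t)

# Beta/join node: join on the shared ?doc variable
matches = [(a[0], t[2]) for a in authored for t in topics if a[2] == t[0]]
print(matches)  # [('alice', 'pubsub')]
```

    A full Rete network would additionally keep per-node memories so that newly published tuples are matched incrementally; distributing those node memories is what makes the algorithm a candidate substrate for a DHT-based pub/sub facility.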

  2. A new warfarin dosing algorithm including VKORC1 3730 G > A polymorphism: comparison with results obtained by other published algorithms.

    Science.gov (United States)

    Cini, Michela; Legnani, Cristina; Cosmi, Benilde; Guazzaloca, Giuliana; Valdrè, Lelia; Frascaro, Mirella; Palareti, Gualtiero

    2012-08-01

    Warfarin dosing is affected by clinical and genetic variants, but the contribution of genotypes associated with warfarin resistance in pharmacogenetic algorithms has not yet been well assessed. We developed a new dosing algorithm including polymorphisms associated with both warfarin sensitivity and resistance in the Italian population, and its performance was compared with those of eight previously published algorithms. Clinical and genetic data (CYP2C9*2, CYP2C9*3, VKORC1 -1639 G > A, and VKORC1 3730 G > A) were used to elaborate the new algorithm. Derivation and validation groups comprised 55 (58.2% men, mean age 69 years) and 40 (57.5% men, mean age 70 years) patients, respectively, who had been on stable anticoagulation therapy for at least 3 months with different oral anticoagulation therapy (OAT) indications. Performance of the new algorithm, evaluated using the mean absolute error (MAE, defined as the absolute value of the difference between observed daily maintenance dose and predicted daily dose), the correlation with the observed dose, and the R(2) value, was comparable with or slightly lower than that obtained using the other algorithms. The new algorithm correctly assigned 53.3%, 50.0%, and 57.1% of patients to the low (≤25 mg/week), intermediate (26-44 mg/week) and high (≥45 mg/week) dosing ranges, respectively. Our data showed a significant increase in predictive accuracy among patients requiring a high warfarin dose compared with the other algorithms (ranging from 0% to 28.6%). The algorithm including VKORC1 3730 G > A, associated with warfarin resistance, allowed a more accurate identification of resistant patients who require higher warfarin dosages.
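
    As a hedged illustration of the evaluation metrics described above (the function names and dose values below are invented for illustration, not the study's data or model), the MAE and the weekly dosing-range classification can be computed as:

```python
# Minimal sketch of the evaluation: MAE over daily doses, and the
# low/intermediate/high weekly dosing ranges quoted in the abstract.

def mean_absolute_error(observed, predicted):
    """MAE = mean of |observed daily dose - predicted daily dose|."""
    return sum(abs(o - p) for o, p in zip(observed, predicted)) / len(observed)

def dose_range(weekly_dose_mg):
    """Dosing ranges from the abstract (mg/week)."""
    if weekly_dose_mg <= 25:
        return "low"
    elif weekly_dose_mg <= 44:
        return "intermediate"
    return "high"

observed = [3.0, 5.5, 7.0]    # daily maintenance doses, mg (illustrative)
predicted = [3.5, 5.0, 6.0]
print(mean_absolute_error(observed, predicted))  # 0.6666...
print(dose_range(50))  # high
```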

  3. Availability of nuclear decay data in electronic form, including beta spectra not previously published

    International Nuclear Information System (INIS)

    Eckerman, K.F.; Westfall, R.J.; Ryman, J.C.; Cristy, M.

    1994-01-01

    The unabridged data used in preparing ICRP Publication 38 (1983) and a monograph of the Medical Internal Radiation Dose (MIRD) Committee are now available in electronic form. The "ICRP38 collection" contains data on the energies and intensities of radiations emitted by 825 radionuclides (those in ICRP Publication 38 plus 13 from the MIRD monograph), and the "MIRD collection" contains data on 242 radionuclides. Each collection consists of a radiations data file and a beta spectra data file. The radiations data file contains the complete listing of the emitted radiations, their types, mean or unique energies, and absolute intensities for each radionuclide; the beta spectra data file gives, for each radionuclide, the probability that a beta particle will be emitted with kinetic energies defined by a standard energy grid. Although summary information from the radiation data files has been published, neither the unabridged data nor the beta spectra have been published. These data files and a data extraction utility, which runs on a personal computer, are available from the Radiation Shielding Information Center at Oak Ridge National Laboratory. 13 refs., 1 fig., 6 tabs

  4. List of new names and new combinations previously effectively, but not validly, published.

    Science.gov (United States)

    2008-09-01

    The purpose of this announcement is to effect the valid publication of the following effectively published new names and new combinations under the procedure described in the Bacteriological Code (1990 Revision). Authors and other individuals wishing to have new names and/or combinations included in future lists should send three copies of the pertinent reprint or photocopies thereof, or an electronic copy of the published paper, to the IJSEM Editorial Office for confirmation that all of the other requirements for valid publication have been met. It is also a requirement of IJSEM and the ICSP that authors of new species, new subspecies and new combinations provide evidence that types are deposited in two recognized culture collections in two different countries (i.e. documents certifying deposition and availability of type strains). It should be noted that the date of valid publication of these new names and combinations is the date of publication of this list, not the date of the original publication of the names and combinations. The authors of the new names and combinations are as given below, and these authors' names will be included in the author index of the present issue and in the volume author index. Inclusion of a name on these lists validates the publication of the name and thereby makes it available in bacteriological nomenclature. The inclusion of a name on this list is not to be construed as taxonomic acceptance of the taxon to which the name is applied. Indeed, some of these names may, in time, be shown to be synonyms, or the organisms may be transferred to another genus, thus necessitating the creation of a new combination.

  5. C-mixture and multi-constraints based genetic algorithm for collaborative data publishing

    Directory of Open Access Journals (Sweden)

    Yogesh R. Kulkarni

    2018-04-01

    With the increasing use of distributed databases, there is high demand for sharing data so that useful information can be updated and accessed without interruption. Sharing distributed databases raises a serious information-security issue, since the databases contain sensitive personal information. To preserve sensitive information while still releasing useful information, researchers have made significant efforts under the heading of privacy-preserving data publishing, which has received considerable attention in recent years. In this work, a new privacy measure, called c-mixture, is introduced to maintain the privacy constraint without affecting the utility of the database. To apply the proposed privacy measure to privacy-preserving data publishing, a new algorithm called CPGEN is developed using a genetic algorithm and multi-objective constraints. The proposed multi-objective optimization considers multiple privacy constraints along with a utility measurement. The proposed CPGEN is also adapted to handle the cold-start problem, which commonly occurs in distributed databases. The algorithm is evaluated on the Adult dataset, and quantitative performance is analyzed using generalized information loss and the average equivalence class size metric. The experiments show that the proposed algorithm maintains privacy and utility compared with the existing algorithms. Keywords: Privacy, Utility, Distributed databases, Data publishing, Optimization, Sensitive information
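
    As a rough sketch of the genetic-algorithm machinery described above (the fitness function below is a stand-in, not the paper's c-mixture measure or its multi-objective constraints; all names and parameters are illustrative):

```python
# Minimal GA skeleton in the spirit of CPGEN: candidate generalization
# levels (0..3 per attribute) are evolved against a combined
# privacy/utility score via selection, one-point crossover and mutation.
import random

def fitness(candidate):
    # Stand-in objective: more generalization raises privacy but also
    # information loss. The real CPGEN scores c-mixture plus multiple
    # privacy constraints and a utility measurement.
    privacy = sum(candidate) / len(candidate)
    info_loss = privacy  # generalizing more loses more detail
    return privacy - 0.5 * info_loss

def evolve(pop_size=20, genes=8, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 3) for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, genes)       # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(genes)] = rng.randint(0, 3)  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```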

  6. Publishing data from electronic health records while preserving privacy: a survey of algorithms

    OpenAIRE

    Gkoulalas-Divanis, Aris; Loukides, Grigorios; Sun, Jimeng

    2014-01-01

    The dissemination of Electronic Health Records (EHRs) can be highly beneficial for a range of medical studies, spanning from clinical trials to epidemic control studies, but it must be performed in a way that preserves patients’ privacy. This is not straightforward, because the disseminated data need to be protected against several privacy threats, while remaining useful for subsequent analysis tasks. In this work, we present a survey of algorithms that have been proposed for publishing struc...

  7. Publishing data from electronic health records while preserving privacy: a survey of algorithms.

    Science.gov (United States)

    Gkoulalas-Divanis, Aris; Loukides, Grigorios; Sun, Jimeng

    2014-08-01

    The dissemination of Electronic Health Records (EHRs) can be highly beneficial for a range of medical studies, spanning from clinical trials to epidemic control studies, but it must be performed in a way that preserves patients' privacy. This is not straightforward, because the disseminated data need to be protected against several privacy threats, while remaining useful for subsequent analysis tasks. In this work, we present a survey of algorithms that have been proposed for publishing structured patient data, in a privacy-preserving way. We review more than 45 algorithms, derive insights on their operation, and highlight their advantages and disadvantages. We also provide a discussion of some promising directions for future research in this area. Copyright © 2014 Elsevier Inc. All rights reserved.
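
    Many of the surveyed publishing algorithms enforce privacy models such as k-anonymity: every combination of quasi-identifier values must appear in at least k records. A minimal check (the records below are illustrative, not an algorithm from the survey):

```python
# k-anonymity check: group records by their quasi-identifier values
# and verify every group has at least k members.
from collections import Counter

def is_k_anonymous(records, quasi_identifiers, k):
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return all(count >= k for count in groups.values())

records = [
    {"age": "30-40", "zip": "537**", "diagnosis": "flu"},
    {"age": "30-40", "zip": "537**", "diagnosis": "asthma"},
    {"age": "50-60", "zip": "941**", "diagnosis": "flu"},
]
print(is_k_anonymous(records, ["age", "zip"], 2))  # False: the 50-60 group has 1 record
```

    Publishing algorithms then generalize or suppress quasi-identifier values until such a check passes, trading information loss against the privacy guarantee.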

  8. Reproducibility discrepancies following reanalysis of raw data for a previously published study on diisononyl phthalate (DINP) in rats

    Directory of Open Access Journals (Sweden)

    Min Chen

    2017-08-01

    A 2011 publication by Boberg et al. entitled "Reproductive and behavioral effects of diisononyl phthalate (DINP) in perinatally exposed rats" [1] reported statistically significant changes in sperm parameters, testicular histopathology, anogenital distance and retained nipples in developing males. Using the statistical methods as reported by Boberg et al. (2011) [1], we reanalyzed the publicly available raw data ([dataset] US EPA (United States Environmental Protection Agency), 2016 [2]). The output of our reanalysis and the discordances with the data as published in Boberg et al. (2011) [1] are highlighted herein. Further discussion of the basis for the replication discordances, and of the insufficiency of the Boberg et al. (2011) [1] response to address them, can be found in a companion letter of correspondence (doi: 10.1016/j.reprotox.2017.03.013; Morfeld et al., 2011 [3]).

  9. Is email a reliable means of contacting authors of previously published papers? A study of the Emergency Medicine Journal for 2001.

    Science.gov (United States)

    O'Leary, F

    2003-07-01

    To determine whether it is possible to contact authors of previously published papers via email. A cross-sectional study of the Emergency Medicine Journal for 2001. 118 articles were included in the study. The response rate from those with valid email addresses was 73%. There was no statistical difference between the type of email address used and the address being invalid (p=0.392) or between the type of article and the likelihood of a reply (p=0.197). More responses were obtained from work addresses than from Hotmail addresses (86% v 57%, p=0.02). Email is a valid means of contacting authors of previously published articles, particularly within the emergency medicine specialty. A work-based email address may be a more reliable means of contact than a Hotmail address.

  10. Using Genetic Algorithm and MODFLOW to Characterize Aquifer System of Northwest Florida (Published Proceedings)

    Science.gov (United States)

    By integrating Genetic Algorithm and MODFLOW2005, an optimizing tool is developed to characterize the aquifer system of Region II, Northwest Florida. The history and the newest available observation data of the aquifer system are fitted automatically by using the numerical model c...

  11. Technical Note: A novel leaf sequencing optimization algorithm which considers previous underdose and overdose events for MLC tracking radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Wisotzky, Eric, E-mail: eric.wisotzky@charite.de, E-mail: eric.wisotzky@ipk.fraunhofer.de; O’Brien, Ricky; Keall, Paul J., E-mail: paul.keall@sydney.edu.au [Radiation Physics Laboratory, Sydney Medical School, University of Sydney, Sydney, NSW 2006 (Australia)

    2016-01-15

    Purpose: Multileaf collimator (MLC) tracking radiotherapy is complex as the beam pattern needs to be modified due to the planned intensity modulation as well as the real-time target motion. The target motion cannot be planned; therefore, the modified beam pattern differs from the original plan and the MLC sequence needs to be recomputed online. Current MLC tracking algorithms use a greedy heuristic in that they optimize for a given time but ignore past errors. To overcome this problem, the authors have developed and improved an algorithm that minimizes large underdose and overdose regions. Additionally, previous underdose and overdose events are taken into account to avoid regions accumulating a high number of dose events. Methods: The authors improved the existing MLC motion control algorithm by introducing a cumulative underdose/overdose map. This map represents the actual projection of the planned tumor shape and logs dose events occurring at each specific region. These events affect the dose cost calculation and reduce the recurrence of dose events at each region. The authors studied the improvement of the new temporal optimization algorithm in terms of the L1-norm minimization of the sum of overdose and underdose, compared to not accounting for previous dose events. For evaluation, the authors simulated the delivery of 5 conformal and 14 intensity-modulated radiotherapy (IMRT) plans with 7 3D patient-measured tumor motion traces. Results: Simulations with conformal shapes showed an improvement of the L1-norm of up to 8.5% after 100 MLC modification steps. Experiments showed comparable improvements with the same type of treatment plans. Conclusions: A novel leaf sequencing optimization algorithm which considers previous dose events for MLC tracking radiotherapy has been developed and investigated. Reductions in underdose/overdose are observed for conformal and IMRT delivery.
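
    The cumulative underdose/overdose bookkeeping described above can be sketched as follows (the binary aperture grids and the scoring are simplified illustrations, not the authors' implementation):

```python
# Sketch: per-cell dose error over the projected target, its L1 norm
# as the cost to minimize, and a cumulative map logging past events.

def dose_error_map(planned, delivered):
    """Per-cell error: +1 overdose, -1 underdose, 0 match (binary apertures)."""
    return [[d - p for p, d in zip(prow, drow)]
            for prow, drow in zip(planned, delivered)]

def l1_norm(error_map):
    """Sum of |underdose| + |overdose| over all cells."""
    return sum(abs(cell) for row in error_map for cell in row)

def accumulate(cumulative, error_map):
    """Log dose events so past errors penalize future leaf choices."""
    return [[c + e for c, e in zip(crow, erow)]
            for crow, erow in zip(cumulative, error_map)]

planned   = [[1, 1, 0], [1, 1, 0]]   # projected tumor shape (illustrative)
delivered = [[1, 0, 0], [1, 1, 1]]   # candidate MLC aperture
err = dose_error_map(planned, delivered)
print(l1_norm(err))  # 2: one underdosed cell, one overdosed cell
```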

  12. Compilation of new and previously published geochemical and modal data for Mesoproterozoic igneous rocks of the St. Francois Mountains, southeast Missouri

    Science.gov (United States)

    du Bray, Edward A.; Day, Warren C.; Meighan, Corey J.

    2018-04-16

    The purpose of this report is to present recently acquired as well as previously published geochemical and modal petrographic data for igneous rocks in the St. Francois Mountains, southeast Missouri, as part of an ongoing effort to understand the regional geology and ore deposits of the Mesoproterozoic basement rocks of southeast Missouri, USA. The report includes geochemical data that are (1) newly acquired by the U.S. Geological Survey and (2) compiled from numerous sources published during the last fifty-five years. These data are required for ongoing petrogenetic investigations of these rocks. Voluminous Mesoproterozoic igneous rocks in the St. Francois Mountains of southeast Missouri constitute the basement buried beneath Paleozoic sedimentary rock that is over 600 meters thick in places. The Mesoproterozoic rocks of southeast Missouri represent a significant component of the approximately 1.4 billion-year-old (Ga) igneous rocks that crop out extensively in North America along the southeast margin of Laurentia; subsequent researchers have suggested that iron oxide-copper deposits in the St. Francois Mountains are genetically associated with ca. 1.4 Ga magmatism in this region. The geochemical and modal data sets described herein were compiled to support investigations concerning the tectonic setting and petrologic processes responsible for the associated magmatism.

  13. A compendium of P- and S-wave velocities from surface-to-borehole logging; summary and reanalysis of previously published data and analysis of unpublished data

    Science.gov (United States)

    Boore, David M.

    2003-01-01

    For over 28 years, the U.S. Geological Survey (USGS) has been acquiring seismic velocity and geologic data at a number of locations in California, many of which were chosen because strong ground motions from earthquakes were recorded at the sites. The method for all measurements involves picking first arrivals of P- and S-waves from a surface source recorded at various depths in a borehole (as opposed to noninvasive methods, such as the SASW method [e.g., Brown et al., 2002]). The results from most of the sites are contained in a series of U.S. Geological Survey Open-File Reports (see References). Until now, none of the results have been available as computer files, and before 1992 the interpretation of the arrival times was in terms of piecemeal interval velocities, with no attempt to derive a layered model that would fit the travel times in an overall sense (the one exception is Porcella, 1984). In this report I reanalyze all of the arrival times in terms of layered models for P- and for S-wave velocities at each site, and I provide the results as computer files. In addition to the measurements reported in the open-file reports, I also include some borehole results from other reports, as well as some results never before published. I include data for 277 boreholes (at the time of this writing; more will be added to the web site as they are obtained), all in California (I have data from boreholes in Washington and Utah, but these will be published separately). I am also in the process of interpreting travel time data obtained using a seismic cone penetrometer at hundreds of sites; these data can be interpreted in the same way as those obtained from surface-to-borehole logging. When available, the data will be added to the web site (see below for information on obtaining data from the World Wide Web (WWW)). In addition to the basic borehole data and results, I provide information concerning strong-motion stations that I judge to be close enough to the boreholes
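
    The layered-model interpretation rests on a simple forward model: the vertical travel time from the surface to depth z is the sum of thickness/velocity over the layers above z. A sketch (the layer values are illustrative, not from the report):

```python
# Forward model for fitting layered velocity models to borehole
# first-arrival picks: travel time = sum of thickness/velocity.

def travel_time(layers, depth):
    """layers: list of (thickness_m, velocity_m_per_s), top layer first."""
    t, z = 0.0, 0.0
    for thickness, v in layers:
        dz = min(thickness, depth - z)   # portion of this layer above depth
        if dz <= 0:
            break
        t += dz / v
        z += dz
    return t

layers = [(10.0, 200.0), (20.0, 400.0)]  # e.g. soil over stiffer sediment
print(travel_time(layers, 15.0))  # 0.0625 s: 10/200 + 5/400
```

    Fitting then amounts to adjusting layer thicknesses and velocities until these predicted times match the picked arrival times at all receiver depths in an overall (least-misfit) sense.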

  14. Men without a sense of smell exhibit a strongly reduced number of sexual relationships, women exhibit reduced partnership security - a reanalysis of previously published data.

    Science.gov (United States)

    Croy, Ilona; Bojanowski, Viola; Hummel, Thomas

    2013-02-01

    Olfactory function influences social behavior. For instance, olfaction seems to play a key role in mate choice and helps in detecting emotions in other people. In a previous study, we showed that people who were born without a sense of smell exhibit enhanced social insecurity. Based on the comments on this article, we decided to take a closer look at whether the absence of the sense of smell affects men and women differently. With this focus, questionnaire data from 32 patients diagnosed with isolated congenital anosmia (10 men, 22 women) and from 36 age-matched healthy controls (15 men, 21 women) were reanalyzed. The reanalysis showed that men and women without a sense of smell report enhanced social insecurity, but with different consequences: men who were born without a sense of smell exhibit a strongly reduced number of sexual relationships, and women are affected such that they feel less secure about their partner. This emphasizes the importance of the sense of smell for intimate relationships. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Using the genome aggregation database, computational pathogenicity prediction tools, and patch clamp heterologous expression studies to demote previously published long QT syndrome type 1 mutations from pathogenic to benign.

    Science.gov (United States)

    Clemens, Daniel J; Lentino, Anne R; Kapplinger, Jamie D; Ye, Dan; Zhou, Wei; Tester, David J; Ackerman, Michael J

    2018-04-01

    Mutations in the KCNQ1-encoded Kv7.1 potassium channel cause long QT syndrome (LQTS) type 1 (LQT1). It has been suggested that ∼10%-20% of rare LQTS case-derived variants in the literature may have been published erroneously as LQT1-causative mutations and may be "false positives." The purpose of this study was to determine which previously published KCNQ1 case variants are likely false positives. A list of all published, case-derived KCNQ1 missense variants (MVs) was compiled. The occurrence of each MV within the Genome Aggregation Database (gnomAD) was assessed. Eight in silico tools were used to predict each variant's pathogenicity. Case-derived variants that were either (1) too frequently found in gnomAD or (2) absent in gnomAD but predicted to be pathogenic by ≤2 tools were considered potential false positives. Three of these variants were characterized functionally using whole-cell patch clamp technique. Overall, there were 244 KCNQ1 case-derived MVs. Of these, 29 (12%) were seen in ≥10 individuals in gnomAD and are demotable. However, 157 of 244 MVs (64%) were absent in gnomAD. Of these, 7 (4%) were predicted to be pathogenic by ≤2 tools, 3 of which we characterized functionally. There was no significant difference in current density between heterozygous KCNQ1-F127L, -P477L, or -L619M variant-containing channels compared to KCNQ1-WT. This study offers preliminary evidence for the demotion of 32 (13%) previously published LQT1 MVs. Of these, 29 were demoted because of their frequent sighting in gnomAD. Additionally, in silico analysis and in vitro functional studies have facilitated the demotion of 3 ultra-rare MVs (F127L, P477L, L619M). Copyright © 2017 Heart Rhythm Society. Published by Elsevier Inc. All rights reserved.
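
    The demotion criteria described in the abstract can be paraphrased in code (the thresholds follow the abstract; the function and argument names are assumptions for illustration):

```python
# Sketch of the study's screening logic for previously published
# LQT1 case variants: common in gnomAD -> demotable; absent from
# gnomAD but called pathogenic by <=2 of 8 in silico tools ->
# potential false positive warranting functional follow-up.

def demotable(gnomad_count, pathogenic_tool_votes, tools_used=8):
    """Return True if a case-derived variant is a potential false positive."""
    if gnomad_count >= 10:
        return True   # seen too often for a rare disease-causing allele
    if gnomad_count == 0 and pathogenic_tool_votes <= 2:
        return True   # rare but computationally benign: candidate for patch clamp
    return False

print(demotable(gnomad_count=37, pathogenic_tool_votes=6))  # True
print(demotable(gnomad_count=0, pathogenic_tool_votes=7))   # False
```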

  16. Algorithms

    Indian Academy of Sciences (India)

    Algorithms for (polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming ...

  17. Algorithms

    Indian Academy of Sciences (India)

    ... referred to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...

  18. Desktop Publishing.

    Science.gov (United States)

    Stanley, Milt

    1986-01-01

    Defines desktop publishing, describes microcomputer developments and software tools that make it possible, and discusses its use as an instructional tool to improve writing skills. Reasons why students' work should be published, examples of what to publish, and types of software and hardware to facilitate publishing are reviewed. (MBR)

  19. Algorithms

    Indian Academy of Sciences (India)

    ... maticians but also forms the foundation of computer science. Two ... with methods of developing algorithms for solving a variety of problems but ... applications of computers in science and engineering ... numerical calculus are as important. We will ...

  20. Music publishing

    OpenAIRE

    Simões, Alberto; Almeida, J. J.

    2003-01-01

    Current music publishing on the Internet is mainly concerned with sound publishing. We claim that music publishing is not only making sound available but also defining relations between a set of music objects such as music scores, guitar chords, lyrics and their meta-data. We want an easy way to publish music on the Internet, to make high-quality paper booklets and even to create audio CDs. In this document we present a workbench for music publishing based on open formats, using open-source t...

  1. Algorithms

    Indian Academy of Sciences (India)

    algorithm design technique called 'divide-and-conquer'. One of ... Turtle graphics, September 1996. ... whole list named 'PO' is a pointer to the first element of the list; ... Program for computing matrices X and Y and placing the result in C ...

  2. Algorithms

    Indian Academy of Sciences (India)

    algorithm that it is implicitly understood that we know how to generate the next natural ... Explicit comparisons are made in line (1), where maximum and minimum are ... It can be shown that the function T(n) = 3n/2 - 2 is the solution to the above ...

  3. Algorithms

    Indian Academy of Sciences (India)

    will become clear in the next article when we discuss a simple Logo-like programming language. ... Rod B may be used as an auxiliary store. The problem is to find an algorithm which performs this task. ... No disks are moved from A to B using C as auxiliary rod. • move_disk(A, C); the (N0 + 1)th disk is moved from A to C directly ...

  4. Publisher Correction

    DEFF Research Database (Denmark)

    Turcot, Valérie; Lu, Yingchang; Highland, Heather M

    2018-01-01

    In the published version of this paper, the name of author Emanuele Di Angelantonio was misspelled. This error has now been corrected in the HTML and PDF versions of the article.

  5. Publisher Correction

    DEFF Research Database (Denmark)

    Bonàs-Guarch, Sílvia; Guindo-Martínez, Marta; Miguel-Escalada, Irene

    2018-01-01

    In the originally published version of this Article, the affiliation details for Santi González, Jian'an Luan and Claudia Langenberg were inadvertently omitted. Santi González should have been affiliated with 'Barcelona Supercomputing Center (BSC), Joint BSC-CRG-IRB Research Program in Computatio...

  6. Publisher Correction

    DEFF Research Database (Denmark)

    Stokholm, Jakob; Blaser, Martin J.; Thorsen, Jonathan

    2018-01-01

    The originally published version of this Article contained an incorrect version of Figure 3 that was introduced following peer review and inadvertently not corrected during the production process. Both versions contain the same set of abundance data, but the incorrect version has the children...

  7. Publisher Correction

    DEFF Research Database (Denmark)

    Turcot, Valérie; Lu, Yingchang; Highland, Heather M

    2018-01-01

    In the version of this article originally published, one of the two authors with the name Wei Zhao was omitted from the author list and the affiliations for both authors were assigned to the single Wei Zhao in the author list. In addition, the ORCID for Wei Zhao (Department of Biostatistics and E...

  8. Dear Publisher.

    Science.gov (United States)

    Chelton, Mary K.

    1992-01-01

    Addresses issues that concern the relationship between publishers and librarians, including differences between libraries and bookstores; necessary information for advertisements; out-of-stock designations and their effect on budgets; the role of distributors and vendors; direct mail for book promotions; unsolicited review copies; communications…

  9. Electronic Publishing.

    Science.gov (United States)

    Lancaster, F. W.

    1989-01-01

    Describes various stages involved in the applications of electronic media to the publishing industry. Highlights include computer typesetting, or photocomposition; machine-readable databases; the distribution of publications in electronic form; computer conferencing and electronic mail; collaborative authorship; hypertext; hypermedia publications;…

  10. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Science.gov (United States)

    Janssen, Paddy K C; Olivier, Berend; Zwinderman, Aeilko H; Waldinger, Marcel D

    2014-01-01

    To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). Calculation of fraction observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%, p = 1 for HWE). Analysis of PCRs of six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in-HWE the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not-in-HWE the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. In the three studies not-in-HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in a very low SL and a resulting shift to a very high SS genotype frequency outcome. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three studies in-HWE. However, based on the three studies in-HWE with ORs of about 1, there is no indication that in men with lifelong PE the frequency of the LL, SL and SS genotypes deviates from the general male population and/or that the SL or SS genotype is in any way associated with lifelong PE.
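
    The Hardy-Weinberg check underlying this critique is straightforward: estimate the L-allele frequency from the genotype counts and compare observed with expected counts. A sketch (the counts below are the ideal population from the abstract, not any study's data):

```python
# Hardy-Weinberg expected genotype counts from observed LL/SL/SS counts:
# p = freq(L) = (2*LL + SL) / (2n); expected LL = p^2*n, SL = 2pq*n, SS = q^2*n.

def hwe_expected(n_LL, n_SL, n_SS):
    n = n_LL + n_SL + n_SS
    p = (2 * n_LL + n_SL) / (2 * n)   # frequency of the L allele
    q = 1 - p
    return p * p * n, 2 * p * q * n, q * q * n  # expected LL, SL, SS counts

exp_LL, exp_SL, exp_SS = hwe_expected(25, 50, 25)
print(exp_LL, exp_SL, exp_SS)  # 25.0 50.0 25.0 -- the ideal 25/50/25 population
```

    Large gaps between observed and expected counts (e.g. far too few SL and far too many SS, as in the three not-in-HWE studies) point to genotyping or PCR problems rather than a true population effect.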

  11. Measurement errors in polymerase chain reaction are a confounding factor for a correct interpretation of 5-HTTLPR polymorphism effects on lifelong premature ejaculation: a critical analysis of a previously published meta-analysis of six studies.

    Directory of Open Access Journals (Sweden)

    Paddy K C Janssen

    OBJECTIVE: To analyze a recently published meta-analysis of six studies on 5-HTTLPR polymorphism and lifelong premature ejaculation (PE). METHODS: Calculation of fraction observed and expected genotype frequencies and Hardy-Weinberg equilibrium (HWE) of cases and controls. LL, SL and SS genotype frequencies of patients were subtracted from genotype frequencies of an ideal population (LL 25%, SL 50%, SS 25%, p = 1 for HWE). Analysis of PCRs of six studies and re-analysis of the analysis and odds ratios (ORs) reported in the recently published meta-analysis. RESULTS: Three studies deviated from HWE in patients and one study deviated from HWE in controls. In the three studies in-HWE the mean deviation of genotype frequencies from a theoretical population not deviating from HWE was small: LL (1.7%), SL (-2.3%), SS (0.6%). In the three studies not-in-HWE the mean deviation of genotype frequencies was high: LL (-3.3%), SL (-18.5%) and SS (21.8%), with a very low percentage of the SL genotype concurrent with a very high percentage of the SS genotype. The most serious PCR deviations were reported in the three not-in-HWE studies. The three in-HWE studies had normal ORs. In contrast, the three not-in-HWE studies had low ORs. CONCLUSIONS: In the three studies not-in-HWE and with very low ORs, inadequate PCR analysis and/or inadequate interpretation of its gel electrophoresis resulted in a very low SL and a resulting shift to a very high SS genotype frequency outcome. Consequently, the PCRs of these three studies are not reliable. Failure to note the inadequacy of PCR tests makes such PCRs a confounding factor in the clinical interpretation of genetic studies. Currently, a meta-analysis can only be performed on the three studies in-HWE. However, based on the three studies in-HWE with ORs of about 1, there is no indication that in men with lifelong PE the frequency of the LL, SL and SS genotypes deviates from the general male population and/or that the SL or SS genotype is in any way associated with lifelong PE.

  12. Towards an Ethical Framework for Publishing Twitter Data in Social Research: Taking into Account Users' Views, Online Context and Algorithmic Estimation.

    Science.gov (United States)

    Williams, Matthew L; Burnap, Pete; Sloan, Luke

    2017-12-01

    New and emerging forms of data, including posts harvested from social media sites such as Twitter, have become part of the sociologist's data diet. In particular, some researchers see an advantage in the perceived 'public' nature of Twitter posts, representing them in publications without seeking informed consent. While such practice may not be at odds with Twitter's terms of service, we argue there is a need to interpret these through the lens of social science research methods that imply a more reflexive ethical approach than provided in 'legal' accounts of the permissible use of these data in research publications. To challenge some existing practice in Twitter-based research, this article brings to the fore: (1) views of Twitter users through analysis of online survey data; (2) the effect of context collapse and online disinhibition on the behaviours of users; and (3) the publication of identifiable sensitive classifications derived from algorithms.

  13. Transition to electronic publishing

    Science.gov (United States)

    Bowning, Sam

    Previous communications have described some of the many changes that will occur in the next few months as AGU makes the transition to fully electronic publishing. With the advent of the new AGU electronic publishing system, manuscripts will be submitted, edited, reviewed, and published in electronic formats. This piece discusses how the electronic journals will differ from the print journals. Electronic publishing will require some adjustments to the ways we currently think about journals from our perspective of standard print versions. Visiting the Web site of AGU's Geochemistry, Geophysics, Geosystems (G-Cubed) is a great way to get familiar with the look and feel of electronic publishing. However, protocols, especially for citations of articles, are still evolving. Some of the biggest changes for users of AGU publications may be the lack of page numbers, the use of a unique identifier (DOI), and changes in citation style.

  14. Algorithming the Algorithm

    DEFF Research Database (Denmark)

    Mahnke, Martina; Uprichard, Emma

    2014-01-01

    Imagine sailing across the ocean. The sun is shining, vastness all around you. And suddenly [BOOM] you’ve hit an invisible wall. Welcome to the Truman Show! Ever since Eli Pariser published his thoughts on a potential filter bubble, this movie scenario seems to have become reality, just with slight... changes: it’s not the ocean, it’s the internet we’re talking about, and it’s not a TV show producer, but algorithms that constitute a sort of invisible wall. Building on this assumption, most research is trying to ‘tame the algorithmic tiger’. While this is a valuable and often inspiring approach, we...

  15. Publishing and Revising Content

    Science.gov (United States)

    Editors and Webmasters can publish content without going through a workflow. Publishing times and dates can be set, and multiple pages can be published in bulk. Making an edit to published content creates a revision.

  16. Publishing with XML structure, enter, publish

    CERN Document Server

    Prost, Bernard

    2015-01-01

    XML is now at the heart of book publishing techniques: it provides the industry with a robust, flexible format which is relatively easy to manipulate. Above all, it preserves the future: the XML text becomes a genuine tactical asset enabling publishers to respond quickly to market demands. When new publishing media appear, it will be possible to very quickly make your editorial content available at a lower cost. On the downside, XML can become a bottomless pit for publishers attracted by its possibilities. There is a strong temptation to switch to audiovisual production and to add video and a

  17. An Electronic Publishing Model for Academic Publishers.

    Science.gov (United States)

    Gold, Jon D.

    1994-01-01

    Describes an electronic publishing model based on Standard Generalized Markup Language (SGML) and considers its use by an academic publisher. Highlights include how SGML is used to produce an electronic book, hypertext, methods of delivery, intellectual property rights, and future possibilities. Sample documents are included. (two references) (LRW)

  18. Getting Your Textbook Published.

    Science.gov (United States)

    Irwin, Armond J.

    1982-01-01

    Points to remember in getting a textbook published are examined: book idea, publisher's sales representatives, letter of inquiry, qualifications for authorship, author information form, idea proposal, reviews, marketing and sales, publishing agreement, author royalties, and copyright assignment. (CT)

  19. Plagiarism in scientific publishing.

    Science.gov (United States)

    Masic, Izet

    2012-12-01

    Scientific publishing is the ultimate product of a scientist's work. The number of publications and their citations are measures of a scientist's success, while unpublished research is invisible to the scientific community, and as such nonexistent. Researchers rely on the work of their predecessors, and the extent to which one scientist's work is used as a source by other authors is the verification of its contribution to the growth of human knowledge. If an author has published an article in a scientific journal, it cannot be published in any other journal with a few minor adjustments, or without quoting the parts of the first article that are used in the second. Copyright infringement occurs when the author of a new article, with or without mentioning the source, uses substantial portions of previously published articles, including tables and figures. Scientific institutions and universities should, in accordance with the principles of Good Scientific Practice (GSP) and Good Laboratory Practice (GLP), have a center for monitoring, securing, promoting and developing the quality of research. Establishing rules of good scientific practice, and ensuring compliance with them, are obligations of every research institution, university and individual researcher, regardless of which area of science is investigated. In this way, internal quality control ensures that a research institution such as a university assumes responsibility for creating an environment that promotes standards of excellence, intellectual honesty and legality. Although truth should be the aim of scientific research, it is not the guiding principle of all scientists. The best way to reach the truth and to avoid methodological and ethical mistakes is to consistently apply scientific methods and ethical standards in research. Although variously defined, plagiarism is basically intended to deceive the reader about one's own scientific contribution. There is no general regulation of control of

  1. Embracing Electronic Publishing.

    Science.gov (United States)

    Wills, Gordon

    1996-01-01

    Electronic publishing is the grandest revolution in the capture and dissemination of academic and professional knowledge since Caxton developed the printing press. This article examines electronic publishing, describes different electronic publishing scenarios (authors' cooperative, consolidator/retailer/agent oligopsony, publisher oligopoly), and…

  2. PUBLISHER'S ANNOUNCEMENT: Refereeing standards

    Science.gov (United States)

    Bender, C.; Scriven, N.

    2004-08-01

    On 1 January 2004 I will be assuming the position of Editor-in-Chief of Journal of Physics A: Mathematical and General (J. Phys. A). I am flattered at the confidence expressed in my ability to carry out this challenging job and I will try hard to justify this confidence. The previous Editor-in-Chief, Ed Corrigan, has worked tirelessly for the last five years and has done an excellent job for the journal. Everyone at the journal is profoundly grateful for his leadership and for his achievements. Before accepting the position of Editor-in-Chief, I visited the office of J. Phys. A to examine the organization and to assess its strengths and weaknesses. This office is located at the Institute of Physics Publishing (IOPP) headquarters in Bristol. J. Phys. A has been expanding rapidly and now publishes at the rate of nearly 1000 articles (or about 14,000 pages) per year. The entire operation of the journal is conducted in a very small space---about 15 square metres! Working in this space are six highly intelligent, talented, hard working, and dedicated people: Neil Scriven, Publisher; Mike Williams, Publishing Editor; Rose Gray and Sarah Nadin, Publishing Administrators; Laura Smith and Steve Richards, Production Editors. In this small space every day about eight submitted manuscripts are downloaded from the computer or received in the post. These papers are then processed and catalogued, referees are selected, and the papers are sent out for evaluation. In this small space the referees' reports are received, publication decisions are made, and accepted articles are then published quickly by IOPP. The whole operation is amazingly efficient. Indeed, one of the great strengths of J. Phys. A is the speed at which papers are processed. The average time between the receipt of a manuscript and an editorial decision is under sixty days. (Many distinguished journals take three to five times this amount of time.) This speed of publication is an extremely strong enticement for

  3. Copyright of Electronic Publishing.

    Science.gov (United States)

    Dong, Elaine; Wang, Bob

    2002-01-01

    Analyzes the importance of copyright, considers the main causes of copyright infringement in electronic publishing, discusses fair use of a copyrighted work, and suggests methods to safeguard copyrighted electronic publishing, including legislation, contracts, and technology. (Author/LRW)

  4. Publishing: The Creative Business.

    Science.gov (United States)

    Bohne, Harald; Van Ierssel, Harry

    This book offers guidelines to emerging and would-be publishers, whether they plan to enter publishing as a career, a sideline, or a diversion. It stresses the business aspects of publishing and emphasizes the major housekeeping functions encountered in the business, except methods of sales and distribution. Contents include "The Mechanics of…

  5. Academic Nightmares: Predatory Publishing

    Science.gov (United States)

    Van Nuland, Sonya E.; Rogers, Kem A.

    2017-01-01

    Academic researchers who seek to publish their work are confronted daily with a barrage of e-mails from aggressive marketing campaigns that solicit them to publish their research with a specialized, often newly launched, journal. Known as predatory journals, they often promise high editorial and publishing standards, yet their exploitive business…

  6. Desktop Publishing Made Simple.

    Science.gov (United States)

    Wentling, Rose Mary

    1989-01-01

    The author discusses the types of computer hardware and software necessary to set up a desktop publishing system, both for use in educational administration and for instructional purposes. Classroom applications of desktop publishing are presented. The author also provides guidelines for preparing to teach desktop publishing. (CH)

  7. Data Sharing & Publishing at Nature Publishing Group

    Science.gov (United States)

    VanDecar, J. C.; Hrynaszkiewicz, I.; Hufton, A. L.

    2015-12-01

    In recent years, the research community has come to recognize that upon-request data sharing has important limitations [1,2]. The Nature-titled journals feel that researchers have a duty to share data without undue qualifications, in a manner that allows others to replicate and build upon their published findings. Historically, the Nature journals have been strong supporters of data deposition in communities with existing data mandates, and have required data sharing upon request in all other cases. To help address some of the limitations of upon-request data sharing, the Nature titles have strengthened their existing data policies and forged a new partnership with Scientific Data, to promote wider data sharing in discoverable, citeable and reusable forms, and to ensure that scientists get appropriate credit for sharing [3]. Scientific Data is a new peer-reviewed journal for descriptions of research datasets, which works with a wide range of public data repositories [4]. Articles at Scientific Data may either expand on research publications at other journals or may be used to publish new datasets. The Nature Publishing Group has also signed the Joint Declaration of Data Citation Principles [5], and Scientific Data is our first journal to include formal data citations. We are currently in the process of adding data citation support to our various journals. [1] Wicherts, J. M., Borsboom, D., Kats, J. & Molenaar, D. The poor availability of psychological research data for reanalysis. Am. Psychol. 61, 726-728, doi:10.1037/0003-066x.61.7.726 (2006). [2] Vines, T. H. et al. Mandated data archiving greatly improves access to research data. FASEB J. 27, 1304-1308, doi:10.1096/fj.12-218164 (2013). [3] Data-access practices strengthened. Nature 515, 312, doi:10.1038/515312a (2014). [4] More bang for your byte. Sci. Data 1, 140010, doi:10.1038/sdata.2014.10 (2014). [5] Data Citation Synthesis Group: Joint Declaration of Data Citation Principles. (FORCE11, San Diego, CA, 2014).

  8. Publishing studies: what else?

    Directory of Open Access Journals (Sweden)

    Bertrand Legendre

    2015-07-01

    Full Text Available This paper intends to reposition “publishing studies” in the long process that runs from the beginnings of book history to current research on cultural industries. It raises questions about interdisciplinarity and the possibility of considering publishing independently of other sectors of the media and cultural offerings. Publishing is now included in a large range of industries, and, at the same time, analyses tend to become more and more segmented according to production sectors and scientific fields. In addition to the problems this double movement creates from a professional point of view, it requires a questioning of the concept of “publishing studies”.

  9. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Full Text Available Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication for laparoscopy. The aim of this study is to show that insertion of a Veress needle in the region of the umbilicus is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we have performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, pathology of the genital organs, Cesarean section or abdominal war injuries were the most common causes of previous laparotomy. During these operations, and during entry into the abdominal cavity, we did not experience any complications, while in 7 patients we performed conversion to laparotomy following diagnostic laparoscopy. In all patients, insertion of the Veress needle and trocar in the umbilical region was performed, namely the technique of closed laparoscopy. In no patient were adhesions found in the region of the umbilicus, and no abdominal organs were injured.

  10. Publisher Correction to

    NARCIS (Netherlands)

    Barrio, Isabel C.; Lindén, Elin; Beest, Te Mariska; Olofsson, Johan; Rocha, Adrian; Soininen, Eeva M.; Alatalo, Juha M.; Andersson, Tommi; Asmus, Ashley; Boike, Julia; Bråthen, Kari Anne; Bryant, John P.; Buchwal, Agata; Bueno, C.G.; Christie, Katherine S.; Egelkraut, Dagmar; Ehrich, Dorothee; Fishback, Lee Ann; Forbes, Bruce C.; Gartzia, Maite; Grogan, Paul; Hallinger, Martin; Heijmans, Monique M.P.D.; Hik, David S.; Hofgaard, Annika; Holmgren, Milena; Høye, Toke T.; Huebner, Diane C.; Jónsdóttir, Ingibjörg Svala; Kaarlejärvi, Elina; Kumpula, Timo; Lange, Cynthia Y.M.J.G.; Lange, Jelena; Lévesque, Esther; Limpens, Juul; Macias-Fauria, Marc; Myers-Smith, Isla; Nieukerken, van Erik J.; Normand, Signe; Post, Eric S.; Schmidt, Niels Martin; Sitters, Judith; Skoracka, Anna; Sokolov, Alexander; Sokolova, Natalya; Speed, James D.M.; Street, Lorna E.; Sundqvist, Maja K.; Suominen, Otso; Tananaev, Nikita; Tremblay, Jean Pierre; Urbanowicz, Christine; Uvarov, Sergey A.; Watts, David; Wilmking, Martin; Wookey, Philip A.; Zimmermann, Heike H.; Zverev, Vitali; Kozlov, Mikhail V.

    2018-01-01

    The above-mentioned article was originally scheduled for publication in the special issue on Ecology of Tundra Arthropods with guest editors Toke T. Høye and Lauren E. Culler. Erroneously, the article was published in Polar Biology, Volume 40, Issue 11, November, 2017. The publisher sincerely

  11. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0258-252X. AJOL African Journals Online.

  12. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1596-6798. AJOL African Journals Online.

  13. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1115-2613. AJOL African Journals Online.

  14. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0047-651X. AJOL African Journals Online.

  15. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0856-7212. AJOL African Journals Online.

  16. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0378-4738. AJOL African Journals Online.

  17. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0254-2765. AJOL African Journals Online.

  18. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0850-3907. AJOL African Journals Online.

  19. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2141-8322. AJOL African Journals Online.

  20. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0794-7410. AJOL African Journals Online.

  1. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2078-6778. AJOL African Journals Online.

  2. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2305-8862. AJOL African Journals Online.

  3. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1596-9819. AJOL African Journals Online.

  4. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0379-4350. AJOL African Journals Online.

  5. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2408-8137. AJOL African Journals Online.

  6. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1029-5933. AJOL African Journals Online.

  7. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2467-8252. AJOL African Journals Online.

  8. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0376-4753. AJOL African Journals Online.

  9. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1118-1028. AJOL African Journals Online.

  10. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1597-4292. AJOL African Journals Online.

  11. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0189-9686. AJOL African Journals Online.

  12. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2360-994X. AJOL African Journals Online.

  13. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1595-1413. AJOL African Journals Online.

  14. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 2078-5151. AJOL African Journals Online.

  15. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1694-0423. AJOL African Journals Online.

  16. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0855-4307. AJOL African Journals Online.

  17. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 1596-9827. AJOL African Journals Online.

  18. About this Publishing System

    African Journals Online (AJOL)

    This journal uses Open Journal Systems 2.4.3.0, which is open source journal management and publishing software developed, supported, and freely distributed by the Public Knowledge Project under the GNU General Public License. OJS Editorial and Publishing Process. ISSN: 0379-9069. AJOL African Journals Online.

  19. Desktop Publishing for Counselors.

    Science.gov (United States)

    Lucking, Robert; Mitchum, Nancy

    1990-01-01

    Discusses the fundamentals of desktop publishing for counselors, including hardware and software systems and peripherals. Notes by using desktop publishing, counselors can produce their own high-quality documents without the expense of commercial printers. Concludes computers present a way of streamlining the communications of a counseling…

  20. Publishing: Alternatives and Economics.

    Science.gov (United States)

    Penchansky, Mimi; And Others

    The Library Association of the City University of New York presents an annotated bibliography on the subject of small and alternative publishing. In the first section directories, indexes, catalogs, and reviews are briefly described. Book distributors for small publishers are listed next. The major portion of the bibliography is a listing of books…

  9. The Academic Publishing Industry

    DEFF Research Database (Denmark)

    Nell, Phillip Christopher; Wenzel, Tim Ole; Schmidt, Florian

    2014-01-01

    The case starts with introducing the outstanding profitability of academic journal publishers such as Elsevier and then dives into describing the research process from an idea to conducting research and to publishing the results in academic journals. Subsequently, demand and supply for scientific ... journals and papers are discussed, including drivers and involved parties. Furthermore, the case describes competition between suppliers, customers, and publishers. In sum, the case study features a rich description of the industry's many unusual attributes, which allows for discussing the benefits ...

  10. Elearning and digital publishing

    CERN Document Server

    Ching, Hsianghoo Steve; Mc Naught, Carmel

    2006-01-01

    "ELearning and Digital Publishing" will occupy a unique niche in the literature accessed by library and publishing specialists, and by university teachers and planners. It examines the interfaces between the work done by four groups of university staff who have been in the past quite separate from, or only marginally related to, each other - library staff, university teachers, university policy makers, and staff who work in university publishing presses. All four groups are directly and intimately connected with the main functions of universities - the creation, management and dissemination

  11. Desktop Publishing in Libraries.

    Science.gov (United States)

    Cisler, Steve

    1987-01-01

    Describes the components, costs, and capabilities of several desktop publishing systems, and examines their possible impact on work patterns within organizations. The text and graphics of the article were created using various microcomputer software packages. (CLB)

  12. Sisyphus desperately seeking publisher

    Indian Academy of Sciences (India)

    Antoinette Molinié

    The editors wield their Olympian authority by making today's scientists endlessly push their weighty boulders up ... since publishing has become a highly lucrative business. ... estimate that the richest 8.4 % own 83.3 % (see Global Wealth.

  13. Issues in Electronic Publishing.

    Science.gov (United States)

    Meadow, Charles T.

    1997-01-01

    Discusses issues related to electronic publishing. Topics include writing; reading; production, distribution, and commerce; copyright and ownership of intellectual property; archival storage; technical obsolescence; control of content; equality of access; and cultural changes. (Author/LRW)

  14. The Library as Publisher.

    Science.gov (United States)

    Field, Roy

    1979-01-01

    Presents a guide to for-profit library publishing of reprints, original manuscripts, and smaller items. Discussed are creation of a publications panel to manage finances and preparation, determining prices of items, and drawing up author contracts. (SW)

  15. The Book Publishing Industry

    OpenAIRE

    Jean-Paul Simon; Giuditta de Prato

    2012-01-01

    This report offers an in-depth analysis of the major economic developments in the book publishing industry. The analysis integrates data from a statistical report published earlier as part of this project. The report is divided into 4 main parts. Chapter 1, the introduction, puts the sector into an historical perspective. Chapter 2 introduces the markets at a global and regional level; describes some of the major EU markets (France, Germany, Italy, Spain and the United Kingdom). Chapter 3 ana...

  16. Open-Access Publishing

    Directory of Open Access Journals (Sweden)

    Nedjeljko Frančula

    2013-06-01

    Full Text Available Nature, one of the most prominent scientific journals, dedicated one of its issues to recent changes in scientific publishing (Vol. 495, Issue 7442, 27 March 2013). Its editors stressed that the words technology and revolution are closely related when it comes to scientific publishing. In addition, the transformation of research publishing is not so much a revolution as a war of attrition in which all sides are dug in. The most important change they refer to is the open-access model, in which an author or an institution pays in advance for publishing a paper in a journal, and the paper is then available to users on the Internet free of charge. According to preliminary results of a survey conducted among 23,000 scientists by the publisher of Nature, 45% of them believe all papers should be published in open access, but at the same time 22% of them would not allow the use of papers for commercial purposes. Attitudes toward open access vary according to scientific disciplines, leading the editors to conclude that the revolution still does not suit everyone.

  17. Utility-preserving anonymization for health data publishing.

    Science.gov (United States)

    Lee, Hyukki; Kim, Soohyung; Kim, Jong Wook; Chung, Yon Dohn

    2017-07-11

    Publishing raw electronic health records (EHRs) may be considered a breach of individuals' privacy because they usually contain sensitive information. A common practice in privacy-preserving data publishing is to anonymize the data before publishing so that they satisfy privacy models such as k-anonymity. Among various anonymization techniques, generalization is the most commonly used in medical/health data processing. Generalization inevitably causes information loss, and various methods have thus been proposed to reduce it. However, existing generalization-based data anonymization methods cannot avoid excessive information loss and therefore fail to preserve data utility. We propose a utility-preserving anonymization for privacy-preserving data publishing (PPDP). To preserve data utility, the proposed method comprises three parts: (1) a utility-preserving model, (2) counterfeit record insertion, and (3) a catalog of the counterfeit records. We also propose an anonymization algorithm using the proposed method; it applies a full-domain generalization algorithm. We evaluate our method against an existing method on two aspects: information loss, measured through various quality metrics, and the error rate of analysis results. For all quality metrics, the proposed method shows lower information loss than the existing method. In a real-world EHR analysis, the results show only a small error between the data anonymized by the proposed method and the original data. Through experiments on various datasets, we show that the utility of EHRs anonymized by the proposed method is significantly better than that of data anonymized by previous approaches.
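    The full-domain generalization named in this abstract can be sketched as follows: every value of a quasi-identifier is generalized to the same level of a hierarchy, and levels are raised until each quasi-identifier combination occurs at least k times. The attributes, hierarchies, and toy dataset below are illustrative assumptions, not taken from the paper, and the counterfeit-record machinery of the proposed method is not shown.

    ```python
    # Minimal sketch of full-domain generalization for k-anonymity.
    from collections import Counter

    # Generalization hierarchies: each level maps a value to a coarser one.
    AGE_LEVELS = [
        lambda a: a,                                          # level 0: exact age
        lambda a: f"{(a // 10) * 10}-{(a // 10) * 10 + 9}",   # level 1: decade
        lambda a: "*",                                        # level 2: suppressed
    ]
    ZIP_LEVELS = [
        lambda z: z,             # level 0: full ZIP code
        lambda z: z[:3] + "**",  # level 1: first three digits
        lambda z: "*****",       # level 2: suppressed
    ]

    def generalize(records, age_level, zip_level):
        """Apply one domain-wide level to every record's quasi-identifiers."""
        return [(AGE_LEVELS[age_level](age), ZIP_LEVELS[zip_level](zip_))
                for age, zip_ in records]

    def is_k_anonymous(qi_tuples, k):
        """Every quasi-identifier combination must occur at least k times."""
        return all(count >= k for count in Counter(qi_tuples).values())

    def anonymize(records, k):
        """Search levels from least to most general; return the first k-anonymous table."""
        for total in range(len(AGE_LEVELS) + len(ZIP_LEVELS) - 1):
            for age_level in range(min(total, len(AGE_LEVELS) - 1) + 1):
                zip_level = total - age_level
                if zip_level >= len(ZIP_LEVELS):
                    continue
                table = generalize(records, age_level, zip_level)
                if is_k_anonymous(table, k):
                    return table, (age_level, zip_level)
        return None, None

    records = [(34, "53715"), (38, "53715"), (31, "53703"), (36, "53703")]
    table, levels = anonymize(records, k=2)
    # Ages collapse to decades while ZIP codes stay exact: levels == (1, 0).
    ```

    Searching levels in order of increasing generality is what keeps information loss low; the paper's contribution is to go further and recover utility that even this minimal search cannot preserve.
    
    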

  18. A new algorithm for hip fracture surgery

    DEFF Research Database (Denmark)

    Palm, Henrik; Krasheninnikoff, Michael; Holck, Kim

    2012-01-01

    Background and purpose Treatment of hip fracture patients is controversial. We implemented a new operative and supervision algorithm (the Hvidovre algorithm) for surgical treatment of all hip fractures, primarily based on our own previously published results. Methods 2,000 consecutive patients over 50 ... years of age who were admitted and operated on because of a hip fracture were prospectively included. 1,000 of these patients were included after implementation of the algorithm. Demographic parameters, hospital treatment, and reoperations within the first postoperative year were assessed from patient ... by reoperations was reduced from 24% of total hospitalization before the algorithm was introduced to 18% after it was introduced. Interpretation It is possible to implement an algorithm for treatment of all hip fracture patients in a large teaching hospital. In our case, the Hvidovre algorithm both raised ...

  19. From Genetics to Genetic Algorithms

    Indian Academy of Sciences (India)

    Genetic algorithms (GAs) are computational optimisation schemes with an ... The algorithms solve optimisation problems ... Genetic Algorithms in Search, Optimisation and Machine Learning, Addison-Wesley Publishing Company, Inc., 1989.
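    The optimisation scheme this record refers to can be illustrated with a minimal genetic algorithm. The OneMax task (maximise the number of 1-bits in a fixed-length bitstring), the operators, and all parameters below are illustrative assumptions for a sketch, not taken from the cited article or book.

    ```python
    # Minimal genetic algorithm sketch: selection, crossover, mutation on OneMax.
    import random

    random.seed(0)  # fixed seed so the run is reproducible

    LENGTH, POP_SIZE, GENERATIONS, MUTATION_RATE = 20, 30, 60, 0.02

    def fitness(bits):
        """OneMax: the number of 1-bits is the score to maximise."""
        return sum(bits)

    def tournament(pop):
        """Pick the fitter of two random individuals (selection pressure)."""
        a, b = random.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    def crossover(p1, p2):
        """Single-point crossover: splice a prefix of one parent onto the other."""
        cut = random.randrange(1, LENGTH)
        return p1[:cut] + p2[cut:]

    def mutate(bits):
        """Flip each bit independently with a small probability."""
        return [b ^ 1 if random.random() < MUTATION_RATE else b for b in bits]

    # Random initial population, then repeated selection/crossover/mutation.
    pop = [[random.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(POP_SIZE)]

    best = max(pop, key=fitness)
    ```

    Real applications swap in a problem-specific fitness function and encoding; the selection/crossover/mutation loop stays the same.
    
    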

  20. Publishers and repositories

    CERN Multimedia

    CERN. Geneva

    2007-01-01

    The impact of self-archiving on journals and publishers is an important topic for all those involved in scholarly communication. There is some evidence that the physics arXiv has had no impact on physics journals, while 'economic common sense' suggests that some impact is inevitable. I shall review recent studies of librarian attitudes towards repositories and journals, and place this in the context of IOP Publishing's experiences with arXiv. I shall offer some possible reasons for the mis-match between these perspectives and then discuss how IOP has linked with arXiv and experimented with OA publishing. As well as launching OA journals we have co-operated with Cornell and the arXiv on Eprintweb.org, a platform that offers new features to repository users.

  1. Ethics in Scientific Publishing

    Science.gov (United States)

    Sage, Leslie J.

    2012-08-01

    We all learn in elementary school not to turn in other people's writing as if it were our own (plagiarism), and in high school science labs not to fake our data. But there are many other practices in scientific publishing that are depressingly common and almost as unethical. At about the 20 percent level, authors are deliberately hiding recent work -- by themselves as well as by others -- so as to enhance the apparent novelty of their most recent paper. Some people lie about the dates the data were obtained, to cover up conflicts of interest or inappropriate use of privileged information. Others will publish the same conference proceeding in multiple volumes, or publish the same result in multiple journals with only trivial additions of data or analysis (self-plagiarism). These shady practices should be roundly condemned and stopped. I will discuss these and other unethical actions I have seen over the years, and the steps editors are taking to stop them.

  2. Publisher Correction: Predicting unpredictability

    Science.gov (United States)

    Davis, Steven J.

    2018-06-01

    In this News & Views article originally published, the wrong graph was used for panel b of Fig. 1, and the numbers on the y axes of panels a and c were incorrect; the original and corrected Fig. 1 is shown below. This has now been corrected in all versions of the News & Views.

  3. Web Publishing Schedule

    Science.gov (United States)

    Section 207(f)(2) of the E-Gov Act requires federal agencies to develop an inventory of information to be published on their Web sites, establish a schedule for publishing that information, make those schedules available for public comment, and post the schedules on the Web site.

  4. Hprints - Licence to publish

    DEFF Research Database (Denmark)

    Rabow, Ingegerd; Sikström, Marjatta; Drachen, Thea Marie

    2010-01-01

    realised the potential advantages for them. The universities have a role here as well as the libraries that manage the archives and support scholars in various aspects of the publishing processes. Libraries are traditionally service providers with a mission to facilitate the knowledge production...

  5. The Academic Publishing Industry

    DEFF Research Database (Denmark)

    Nell, Phillip Christopher; Wenzel, Tim Ole; Schmidt, Florian

    2014-01-01

    ... The case is intended to be used as a basis for class discussion rather than to illustrate effective handling of a managerial situation. It is based on published sources, interviews, and personal experience. The authors have disguised some names and other identifying information to protect confidentiality.

  6. Desktop Publishing in Education.

    Science.gov (United States)

    Hall, Wendy; Layman, J.

    1989-01-01

    Discusses the state of desktop publishing (DTP) in education today and describes the weaknesses of the systems available for use in the classroom. Highlights include document design and layout; text composition; graphics; word processing capabilities; a comparison of commercial and educational DTP packages; and skills required for DTP. (four…

  7. Support open access publishing

    DEFF Research Database (Denmark)

    Ekstrøm, Jeannette

    2013-01-01

    The Support Open Access Publishing project aims to have the Sherpa/Romeo database (www.sherpa.ac.uk/romeo) updated with professionally relevant Danish journals. The project will furthermore investigate the possibilities of developing a database where researchers, across relevant journal information ...

  8. Prepare to publish.

    Science.gov (United States)

    Price, P M

    2000-01-01

    "I couldn't possibly write an article." "I don't have anything worthwhile to write about." "I am not qualified to write for publication." Do any of these statements sound familiar? This article is intended to dispel these beliefs. You can write an article. You care for the most complex patients in the health care system, so you do have something worthwhile to write about. Besides correct spelling and grammar, there are no special skills, certificates, or diplomas required for publishing. You are qualified to write for publication. The purpose of this article is to take the mystique out of the publication process. Each step of publishing an article will be explained, from idea formation to framing your first article. Practical examples and recommendations will be presented. The essential components of the APA format necessary for Dynamics: The Official Journal of the Canadian Association of Critical Care Nurses will be outlined, and resources to assist you will be provided.

  9. Reclaiming Society Publishing

    Directory of Open Access Journals (Sweden)

    Philip E. Steinberg

    2015-07-01

    Full Text Available Learned societies have become aligned with commercial publishers, who have increasingly taken over the latter’s function as independent providers of scholarly information. Using the example of geographical societies, the advantages and disadvantages of this trend are examined. It is argued that in an era of digital publication, learned societies can offer leadership with a new model of open access that can guarantee high quality scholarly material whose publication costs are supported by society membership dues.

  10. RETRACTION: Publishers' Note

    Science.gov (United States)

    Graeme Watt (Executive Editor)

    2010-06-01

    Withdrawal of the paper "Was the fine-structure constant variable over cosmological time?" by L. D. Thong, N. M. Giao, N. T. Hung and T. V. Hung (EPL, 87 (2009) 69002) This paper has been formally withdrawn on ethical grounds because the article contains extensive and repeated instances of plagiarism. EPL treats all identified evidence of plagiarism in the published articles most seriously. Such unethical behaviour will not be tolerated under any circumstance. It is unfortunate that this misconduct was not detected before going to press. My thanks to Editor colleagues from other journals for bringing this fact to my attention.

  11. PUBLISHER'S ANNOUNCEMENT: Editorial developments

    Science.gov (United States)

    2009-01-01

    We are delighted to announce that from January 2009, Professor Murray T Batchelor of the Australian National University, Canberra will be the new Editor-in-Chief of Journal of Physics A: Mathematical and Theoretical. Murray Batchelor has been Editor of the Mathematical Physics section of the journal since 2007. Prior to this, he served as a Board Member and an Advisory Panel member for the journal. His primary area of research is the statistical mechanics of exactly solved models. He holds a joint appointment in mathematics and physics and has held visiting positions at the Universities of Leiden, Amsterdam, Oxford and Tokyo. We very much look forward to working with Murray to continue to improve the journal's quality and interest to the readership. We would like to thank our outgoing Editor-in-Chief, Professor Carl M Bender. Carl has done a magnificent job as Editor-in-Chief and has worked tirelessly to improve the journal over the last five years. Carl has been instrumental in designing and implementing strategies that have enhanced the quality of papers published and service provided by Journal of Physics A: Mathematical and Theoretical. Notably, under his tenure, we have introduced the Fast Track Communications (FTC) section to the journal. This section provides a venue for outstanding short papers that report new and timely developments in mathematical and theoretical physics and offers accelerated publication and high visibility for our authors. During the last five years, we have raised the quality threshold for acceptance in the journal and now reject over 60% of submissions. As a result, papers published in Journal of Physics A: Mathematical and Theoretical are amongst the best in the field. We have also maintained and improved on our excellent receipt-to-first-decision times, which now average less than 50 days for papers. We have recently announced another innovation; the Journal of Physics A Best Paper Prize. These prizes will honour excellent papers

  12. Why publish with AGU?

    Science.gov (United States)

    Graedel, T. E.

    The most visible activity of the American Geophysical Union is its publication of scientific journals. There are eight of these: Journal of Geophysical Research—Space Physics (JGR I), Journal of Geophysical Research—Solid Earth (JGR II), Journal of Geophysical Research—Oceans and Atmospheres (JGR III), Radio Science (RS), Water Resources Research (WRR), Geophysical Research Letters (GRL), Reviews of Geophysics and Space Physics (RGSP), and the newest, Tectonics. AGU's journals have established solid reputations for scientific excellence over the years. Reputation is not sufficient to sustain a high quality journal, however, since other factors enter into an author's decision on where to publish his or her work. In this article the characteristics of AGU's journals are compared with those of its competitors, with the aim of furnishing guidance to prospective authors and a better understanding of the value of the products to purchasers.

  13. Previously unknown species of Aspergillus.

    Science.gov (United States)

    Gautier, M; Normand, A-C; Ranque, S

    2016-08-01

    The use of multi-locus DNA sequence analysis has led to the description of previously unknown 'cryptic' Aspergillus species, whereas classical morphology-based identification of Aspergillus remains limited to the section or species-complex level. The current literature highlights two main features concerning these 'cryptic' Aspergillus species. First, the prevalence of such species in clinical samples is relatively high compared with emergent filamentous fungal taxa such as Mucorales, Scedosporium or Fusarium. Second, it is clearly important to identify these species in the clinical laboratory because of the high frequency of antifungal drug-resistant isolates of such Aspergillus species. Matrix-assisted laser desorption/ionization-time of flight mass spectrometry (MALDI-TOF MS) has recently been shown to enable the identification of filamentous fungi with an accuracy similar to that of DNA sequence-based methods. As MALDI-TOF MS is well suited to the routine clinical laboratory workflow, it facilitates the identification of these 'cryptic' Aspergillus species at the routine mycology bench. The rapid establishment of enhanced filamentous fungi identification facilities will lead to a better understanding of the epidemiology and clinical importance of these emerging Aspergillus species. Based on routine MALDI-TOF MS-based identification results, we provide original insights into the key interpretation issues of a positive Aspergillus culture from a clinical sample. Which ubiquitous species that are frequently isolated from air samples are rarely involved in human invasive disease? Can both the species and the type of biological sample indicate Aspergillus carriage, colonization or infection in a patient? Highly accurate routine filamentous fungi identification is central to enhance the understanding of these previously unknown Aspergillus species, with a vital impact on further improved patient care. Copyright © 2016 European Society of Clinical Microbiology and

  14. Standardized mortality in eating disorders--a quantitative summary of previously published and new evidence

    DEFF Research Database (Denmark)

    Nielsen, Søren; Møller-Madsen, S.; Isager, Torben

    2011-01-01

    strong evidence for an increase in SMR for anorexia nervosa (AN), whereas no firm conclusions could be drawn for bulimia nervosa (BN). Bias caused by loss to follow-up was quantified and found non-negligible in some samples (possible increase in SMR from 25% to 240%). We did not find a significant effect...
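
The standardized mortality ratio (SMR) underlying these summaries is the count of observed deaths divided by the deaths expected when reference-population mortality rates are applied to the cohort's person-years. A minimal sketch, with hypothetical stratum labels and rates:

```python
def standardized_mortality_ratio(observed, person_years, ref_rates):
    """SMR = observed deaths / expected deaths, where the expected count
    applies per-stratum reference mortality rates to the cohort's
    person-years of follow-up. Strata and rates here are hypothetical."""
    expected = sum(person_years[s] * ref_rates[s] for s in person_years)
    return observed / expected

# Cohort with 10 observed deaths vs. 2 expected -> SMR = 5.0
smr = standardized_mortality_ratio(
    observed=10,
    person_years={"15-24": 1000.0, "25-34": 500.0},   # person-years per age stratum
    ref_rates={"15-24": 0.001, "25-34": 0.002},       # deaths per person-year
)
```

An SMR above 1 indicates excess mortality relative to the reference population, which is the quantity the meta-analysis pools across samples.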

  15. Standardized mortality in eating disorders--a quantitative summary of previously published and new evidence

    DEFF Research Database (Denmark)

    Nielsen, Søren; Møller-Madsen, S.; Isager, Torben

    1998-01-01

    strong evidence for an increase in SMR for anorexia nervosa (AN), whereas no firm conclusions could be drawn for bulimia nervosa (BN). Bias caused by loss to follow-up was quantified and found non-negligible in some samples (possible increase in SMR from 25% to 240%). We did not find a significant effect...

  16. Leiomyosarcoma of the Prostate: Case Report and Review of 54 Previously Published Cases

    Directory of Open Access Journals (Sweden)

    Gerasimos P. Vandoros

    2008-01-01

    Full Text Available Prostate leiomyosarcoma is an extremely rare and highly aggressive neoplasm that accounts for less than 0.1% of primary prostate malignancies. We present a patient with primary leiomyosarcoma of the prostate and review 54 cases reported in the literature to discuss the clinical, diagnostic and therapeutic aspects of this uncommon tumor. Median survival was estimated at 17 months (95% C.I. 20.7–43.7 months and the 1-, 3-, and 5-year actuarial survival rates were 68%, 34%, and 26%, respectively. The only factors predictive of long-term survival were negative surgical margins and absence of metastatic disease at presentation. A multidisciplinary approach is necessary for appropriate management of this dire entity.

  17. δ-dependency for privacy-preserving XML data publishing.

    Science.gov (United States)

    Landberg, Anders H; Nguyen, Kinh; Pardede, Eric; Rahayu, J Wenny

    2014-08-01

    An ever increasing amount of medical data, such as electronic health records, is being collected, stored, shared and managed in large online health information systems and electronic medical record systems (EMR) (Williams et al., 2001; Virtanen, 2009; Huang and Liou, 2007) [1-3]. From such rich collections, data is often published in the form of census and statistical data sets for the purpose of knowledge sharing and enabling medical research. This brings with it an increasing need to protect the privacy of individuals, which becomes an issue of great importance especially when information about patients is exposed to the public. While the concept of data privacy has been comprehensively studied for relational data, models and algorithms addressing the distinct differences and complex structure of XML data are yet to be explored. Currently, the common compromise method is to convert private XML data into relational data for publication. This ad hoc approach results in significant loss of useful semantic information previously carried in the private XML data. Health data often has a very complex structure, which is best expressed in XML. In fact, XML is the standard format for exchanging (e.g. HL7 version 3(1)) and publishing health information. Lack of means to deal directly with data in XML format is inevitably a serious drawback. In this paper we propose a novel privacy protection model for XML, and an algorithm for implementing this model. We provide general rules, both for transforming a private XML schema into a published XML schema, and for mapping private XML data to the new privacy-protected published XML data. In addition, we propose a new privacy property, δ-dependency, which can be applied to both relational and XML data, and that takes into consideration the hierarchical nature of sensitive data (as opposed to "quasi-identifiers"). Lastly, we provide an implementation of our model, algorithm and privacy property, and perform an experimental analysis

  18. Choosing the Right Desktop Publisher.

    Science.gov (United States)

    Eiser, Leslie

    1988-01-01

    Investigates the many different desktop publishing packages available today. Lists the steps to desktop publishing. Suggests which package to use with specific hardware available. Compares several packages for IBM, Mac, and Apple II based systems. (MVL)

  19. EPIC: Electronic Publishing is Cheaper.

    Science.gov (United States)

    Regier, Willis G.

    Advocates of inexpensive publishing confront a widespread complaint that there is already an overproduction of scholarship that electronic publishing will make worse. The costs of electronic publishing correlate to a clutch of choices: speeds of access, breadth and depth of content, visibility, flexibility, durability, dependability, definition of…

  20. International Marketing Developing Publishing Business

    Directory of Open Access Journals (Sweden)

    Eugenijus Chlivickas

    2015-05-01

    Full Text Available Lithuania's integration into the Eurozone, and the development of the Lithuanian publishing business within the European Union and beyond, have become important problems requiring a solution. To promote the dissemination of printed books and literacy in Lithuania and abroad, and to properly present Lithuania's achievements to foreign countries, it is important to ensure the development of Lithuanian-language, educational and scientific book publishing. The article examines the characteristics of international marketing in publishing and of state publishing houses in the world and in Lithuania, drawing on the theoretical insights of foreign and Lithuanian scholars about the instruments and opportunities of international marketing, and develops proposals for the integration of the publishing business under new economic conditions.

  1. What comes first? Publishing business or publishing studies?

    Directory of Open Access Journals (Sweden)

    Josipa Selthofer

    2015-07-01

    Full Text Available The aim of this paper is to analyze and compare publishing studies, their programmes at the undergraduate and graduate levels and scholars involved in the teaching of publishing courses at the top universities around the world and in Croatia. Since the traditional publishing business is rapidly changing, new skills and new jobs are involved in it. The main research question is: Can modern publishing studies produce a modern publisher? Or, is it the other way around? The hypothesis of the paper is that scholars involved in the teaching of publishing courses at the top universities around the world have a background in publishing business. So, can they prepare their students for the future and can their students gain competencies they need to compete in a confusing world of digital authors and electronic books? The research methods used were content analysis and comparison. The research sample included 36 university publishing programmes at the undergraduate and graduate level worldwide (24 MA, 12 BA). The research sample was limited mainly to the English-speaking countries. In most non-English-speaking countries, it was difficult to analyse the programme curriculum in the native language because the programme and course descriptions did not exist. In the data gathering phase, a customized web application was used for content analysis. The application has three main sections: a list of websites to evaluate, a visual representation of the uploaded website and a list of characteristics grouped by categories for quantifying data. About twenty years ago, publishing was not considered a separate scientific branch in Croatia. Publishing studies are therefore a new phenomenon to both scholars and publishers in Croatia. To create a new, ideal publishing course, can we simply copy global trends or is it better to create something of our own?

  2. A theoretical analysis of the median LMF adaptive algorithm

    DEFF Research Database (Denmark)

    Bysted, Tommy Kristensen; Rusu, C.

    1999-01-01

    Higher order adaptive algorithms are sensitive to impulse interference. In the case of the LMF (Least Mean Fourth) algorithm, an easy and effective way to reduce this sensitivity is to median filter the instantaneous gradient of the LMF algorithm. Although previously published simulations have indicated that this reduces the speed of convergence, no analytical studies have yet been made to prove this. In order to enhance the usability, this paper presents a convergence and steady-state analysis of the median LMF adaptive algorithm. As expected, this proves that the median LMF has a slower convergence and a lower steady...
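
The median-LMF idea, median filtering the instantaneous gradient of the Least Mean Fourth update to suppress impulsive interference, can be sketched as follows. Parameter values and the toy system-identification setup are illustrative, not taken from the paper:

```python
import numpy as np

def median_lmf(x, d, num_taps=4, mu=5e-3, win=3):
    """Median-LMF sketch: the instantaneous LMF gradient e^3 * u is
    median-filtered (per tap) over the last `win` iterations before each
    weight update, trading convergence speed for impulse robustness."""
    w = np.zeros(num_taps)
    grad_hist = np.zeros((win, num_taps))    # circular buffer of raw gradients
    errors = []
    for n in range(num_taps, len(x)):
        u = x[n - num_taps:n][::-1]          # tap-input vector
        e = d[n] - w @ u                     # a priori error
        grad_hist[n % win] = (e ** 3) * u    # instantaneous LMF gradient
        w = w + mu * np.median(grad_hist, axis=0)  # robust (median) update
        errors.append(e)
    return w, np.array(errors)

# Toy system identification: recover a known 4-tap filter from its output.
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2, 0.1])
x = rng.standard_normal(4000)
d = np.array([h @ x[n - 4:n][::-1] if n >= 4 else 0.0 for n in range(4000)])
w, err = median_lmf(x, d)
```

Setting `win=1` recovers the plain LMF update, which makes the trade-off analyzed in the paper easy to observe empirically.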

  3. THE QUALITY CRITERIA AND SELF-PUBLISHING IN SCIENTIFIC PUBLISHING

    Directory of Open Access Journals (Sweden)

    Almudena Mangas-Vega

    2015-11-01

    Full Text Available Self-publishing is a growing phenomenon in recent years. It is a process that goes beyond a simple change of leader in the publication, since it involves also a change of role of agents that were consolidated over time. A self-published work does not have to mean lack of quality, so it is important to define parameters and indicators that help its evaluation and identify who has the responsibility of those criteria. The article shows these aspects from the possibilities for cross-platform publishing and concludes with an analysis of the aspects that can be considered in assessing the quality of self-publishing.

  4. Thomas Jefferson, Page Design, and Desktop Publishing.

    Science.gov (United States)

    Hartley, James

    1991-01-01

    Discussion of page design for desktop publishing focuses on the importance of functional issues as opposed to aesthetic issues, and criticizes a previous article that stressed aesthetic issues. Topics discussed include balance, consistency in text structure, and how differences in layout affect the clarity of "The Declaration of…

  5. What Desktop Publishing Can Teach Professional Writing Students about Publishing.

    Science.gov (United States)

    Dobberstein, Michael

    1992-01-01

    Points out that desktop publishing is a metatechnology that allows professional writing students access to the production phase of publishing, giving students hands-on practice in preparing text for printing and in learning how that preparation affects the visual meaning of documents. (SR)

  6. E-publishing and multimodalities

    OpenAIRE

    Yngve Nordkvelle

    2008-01-01

    In the literature on e-publishing there has been a consistent call, from the advent of e-publishing until now, to explore new ways of expressing ideas through the new media. It has been claimed that the Internet opens an alley of possibilities and opportunities for publishing that will change the ways of publishing once and for all. In the area of publication of e-journals, however, the call for changes has received a very modest response. The thing is, it appears, that the conventional paper ...

  7. A Global algorithm for linear radiosity

    OpenAIRE

    Sbert Cassasayas, Mateu; Pueyo Sánchez, Xavier

    1993-01-01

    A linear algorithm for radiosity is presented, linear both in time and storage. The new algorithm is based on previous work by the authors and on the well known algorithms for progressive radiosity and Monte Carlo particle transport.

  8. Desktop Publishing in the University.

    Science.gov (United States)

    Burstyn, Joan N., Ed.

    Highlighting changes in the work of people within the university, this book presents nine essays that examine the effects of desktop publishing and electronic publishing on professors and students, librarians, and those who work at university presses and in publication departments. Essays in the book are: (1) "Introduction: The Promise of Desktop…

  9. The Decision to Publish Electronically.

    Science.gov (United States)

    Craig, Gary

    1983-01-01

    Argues that decision to publish a given intellectual product "electronically" is a business decision based on customer needs, available format alternatives, current business climate, and variety of already existing factors. Publishers are most influenced by customers' acceptance of new products and their own role as intermediaries in…

  10. Publishing in Open Access Journals

    International Development Research Centre (IDRC) Digital Library (Canada)

    mbrunet

    • An ISSN (International Standard Serial Number, e.g. 1234-5678) has ...
    • Publisher uses direct and unsolicited marketing (i.e., spamming) or advertising is obtrusive (to publish articles or serve on editorial board).
    • No information is ...

  11. Comics, Copyright and Academic Publishing

    Directory of Open Access Journals (Sweden)

    Ronan Deazley

    2014-05-01

    Full Text Available This article considers the extent to which UK-based academics can rely upon the copyright regime to reproduce extracts and excerpts from published comics and graphic novels without having to ask the copyright owner of those works for permission. In doing so, it invites readers to engage with a broader debate about the nature, demands and process of academic publishing.

  12. Electronic Publishing: Baseline Data 1993.

    Science.gov (United States)

    Brock, Laurie

    1993-01-01

    Provides highlights of a report describing research conducted to analyze and compare publishers' and developers' current and planned involvement in electronic publishing. Topics include acceptance of new media, licensing issues, costs and other perceived obstacles, and CD-ROM platforms. (EAM)

  13. The Evolution of Electronic Publishing.

    Science.gov (United States)

    Lancaster, F. W.

    1995-01-01

    Discusses the evolution of electronic publishing from the early 1960s when computers were used merely to produce conventional printed products to the present move toward networked scholarly publishing. Highlights include library development, periodicals on the Internet, online journals versus paper journals, problems, and the future of…

  14. The handbook of journal publishing

    CERN Document Server

    Morris, Sally; LaFrenier, Douglas; Reich, Margaret

    2013-01-01

    The Handbook of Journal Publishing is a comprehensive reference work written by experienced professionals, covering all aspects of journal publishing, both online and in print. Journals are crucial to scholarly communication, but changes in recent years in the way journals are produced, financed, and used make this an especially turbulent and challenging time for journal publishers - and for authors, readers, and librarians. The Handbook offers a thorough guide to the journal publishing process, from editing and production through marketing, sales, and fulfilment, with chapters on management, finances, metrics, copyright, and ethical issues. It provides a wealth of practical tools, including checklists, sample documents, worked examples, alternative scenarios, and extensive lists of resources, which readers can use in their day-to-day work. Between them, the authors have been involved in every aspect of journal publishing over several decades and bring to the text their experience working for a wide range of ...

  15. Developments in Publishing: The Potential of Digital Publishing

    OpenAIRE

    X. Tian

    2007-01-01

    This research aims to identify issues associated with the impact of digital technology on the publishing industry with a specific focus on aspects of the sustainability of existing business models in Australia. Based on the case studies, interviews and Australian-wide online surveys, the research presents a review of the traditional business models in book publishing for investigating their effectiveness in a digital environment. It speculates on how and what should be considered for construc...

  16. Sound algorithms

    OpenAIRE

    De Götzen , Amalia; Mion , Luca; Tache , Olivier

    2007-01-01

    International audience; We call sound algorithms the categories of algorithms that deal with the digital sound signal. Sound algorithms appeared in the very infancy of computing. Sound algorithms present strong specificities that are the consequence of two dual considerations: the properties of the digital sound signal itself and its uses, and the properties of auditory perception.

  17. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
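
The selection/crossover/mutation loop described above can be sketched in a few lines. This is a generic toy GA on bitstrings with a OneMax fitness, not the software tool the abstract mentions:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=60,
                      crossover_rate=0.9, mutation_rate=0.02, seed=0):
    """Minimal generational GA on bitstrings: tournament selection,
    one-point crossover, bit-flip mutation. All defaults are illustrative."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament(k=3):
        # "Survival of the fittest": best of k randomly drawn individuals.
        return max(rng.sample(pop, k), key=fitness)

    for _ in range(generations):
        next_pop = []
        while len(next_pop) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_bits)           # one-point crossover
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for child in (c1, c2):
                for i in range(n_bits):
                    if rng.random() < mutation_rate:     # bit-flip mutation
                        child[i] ^= 1
                next_pop.append(child)
        pop = next_pop[:pop_size]
    return max(pop, key=fitness)

# OneMax: the fitness of a bitstring is simply its number of 1 bits.
best = genetic_algorithm(sum)
```

The same skeleton adapts to other problems by changing the encoding and the `fitness` callable; the highly parallel character noted in the abstract comes from the fact that fitness evaluations within a generation are independent.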

  18. Enhancing Breast Cancer Recurrence Algorithms Through Selective Use of Medical Record Data.

    Science.gov (United States)

    Kroenke, Candyce H; Chubak, Jessica; Johnson, Lisa; Castillo, Adrienne; Weltzien, Erin; Caan, Bette J

    2016-03-01

    The utility of data-based algorithms in research has been questioned because of errors in identification of cancer recurrences. We adapted previously published breast cancer recurrence algorithms, selectively using medical record (MR) data to improve classification. We evaluated second breast cancer event (SBCE) and recurrence-specific algorithms previously published by Chubak and colleagues in 1535 women from the Life After Cancer Epidemiology (LACE) and 225 women from the Women's Health Initiative cohorts and compared classification statistics to published values. We also sought to improve classification with minimal MR examination. We selected pairs of algorithms, one with high sensitivity/high positive predictive value (PPV) and another with high specificity/high PPV, using MR information to resolve discrepancies between algorithms and properly classify events based on review; we called this "triangulation." Finally, in LACE, we compared associations between breast cancer survival risk factors and recurrence using MR data, single Chubak algorithms, and triangulation. The SBCE algorithms performed well in identifying SBCE and recurrences. Recurrence-specific algorithms performed more poorly than the published values, except for the high-specificity/high-PPV algorithm, which performed well. The triangulation method (sensitivity = 81.3%, specificity = 99.7%, PPV = 98.1%, NPV = 96.5%) improved recurrence classification over two single algorithms (sensitivity = 57.1%, specificity = 95.5%, PPV = 71.3%, NPV = 91.9%; and sensitivity = 74.6%, specificity = 97.3%, PPV = 84.7%, NPV = 95.1%), with 10.6% MR review. Triangulation performed well in survival risk factor analyses vs analyses using MR-identified recurrences. Use of multiple recurrence algorithms in administrative data, in combination with selective examination of MR data, may improve recurrence data quality and reduce research costs. © The Author 2015. Published by Oxford University Press. All rights reserved. For
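
The "triangulation" scheme described above, accepting a label where a high-sensitivity and a high-specificity algorithm agree and spending manual chart review only on disagreements, can be sketched generically. The dictionaries and the `review_record` callback are hypothetical stand-ins, not the paper's implementation:

```python
def triangulate(high_sens, high_spec, review_record):
    """Combine two classifier outputs, sending only the disagreements to
    medical-record (MR) review. `high_sens` and `high_spec` map patient
    id -> predicted recurrence (bool); `review_record` is a stand-in for
    the manual chart-review step used to resolve discrepancies."""
    final, reviewed = {}, []
    for pid, a in high_sens.items():
        b = high_spec[pid]
        if a == b:
            final[pid] = a               # algorithms agree: accept as-is
        else:
            reviewed.append(pid)         # disagreement: resolve by MR review
            final[pid] = review_record(pid)
    return final, reviewed

final, reviewed = triangulate(
    {"p1": True, "p2": True, "p3": False},    # high-sensitivity calls
    {"p1": True, "p2": False, "p3": False},   # high-specificity calls
    lambda pid: True,                         # stand-in for chart review
)
```

The fraction of cases landing in `reviewed` is the manual-review cost; the paper reports that this kind of selective review touched about 10.6% of records while markedly improving classification statistics.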

  19. How libraries use publisher metadata

    Directory of Open Access Journals (Sweden)

    Steve Shadle

    2013-11-01

    Full Text Available With the proliferation of electronic publishing, libraries are increasingly relying on publisher-supplied metadata to meet user needs for discovery in library systems. However, many publisher/content provider staff creating metadata are unaware of the end-user environment and how libraries use their metadata. This article provides an overview of the three primary discovery systems that are used by academic libraries, with examples illustrating how publisher-supplied metadata directly feeds into these systems and is used to support end-user discovery and access. Commonly seen metadata problems are discussed, with recommendations suggested. Based on a series of presentations given in Autumn 2012 to the staff of a large publisher, this article uses the University of Washington Libraries systems and services as illustrative examples. Judging by the feedback received from these presentations, publishers (specifically staff not familiar with the big picture of metadata standards work) would benefit from a better understanding of the systems and services libraries provide using the data that is created and managed by publishers.

  20. From protocol to published report

    DEFF Research Database (Denmark)

    Berendt, Louise; Callréus, Torbjörn; Petersen, Lene Grejs

    2016-01-01

    and published reports of academic clinical drug trials. METHODS: A comparison was made between study protocols and their corresponding published reports. We assessed the overall consistency, which was defined as the absence of discrepancy regarding study type (categorized as either exploratory or confirmatory... in 1999, 2001, and 2003, 95 of which fulfilled the eligibility criteria and had at least one corresponding published report reporting data on trial subjects. Overall consistency was observed in 39% of the trials (95% CI: 29 to 49%). Randomized controlled trials (RCTs) constituted 72% (95% CI: 63 to 81%) of the sample, and 87% (95% CI: 80 to 94%) of the trials were hospital based. CONCLUSIONS: Overall consistency between protocols and their corresponding published reports was low. Motivators for the inconsistencies are unknown but do not seem restricted to economic incentives....

  1. Desktop publishing com o scribus

    OpenAIRE

    Silva, Fabrício Riff; Uchôa, Kátia Cilene Amaral

    2015-01-01

    This article presents a brief tutorial on desktop publishing, with emphasis on the free-software tool Scribus, by building a practical example that explores some of its main features.

  2. Publisher Correction: On our bookshelf

    Science.gov (United States)

    Karouzos, Marios

    2018-03-01

    In the version of this Books and Arts originally published, the book title Spectroscopy for Amateur Astronomy was incorrect; it should have read Spectroscopy for Amateur Astronomers. This has now been corrected.

  3. Published journal article with data

    Data.gov (United States)

    U.S. Environmental Protection Agency — published journal article. This dataset is associated with the following publication: Schumacher, B., J. Zimmerman, J. Elliot, and G. Swanson. The Effect of...

  4. Free Publishing Culture. Sustainable Models?

    Directory of Open Access Journals (Sweden)

    Silvia Nanclares Escudero

    2013-03-01

    Full Text Available As a result of the collective research on the possibilities for publishing production and distribution offered nowadays by the Free Culture scenario, we present here a mapping of symptoms in order to propose a transitory diagnostic of the question: Is it possible to generate an economically sustainable publishing model based on the uses and customs generated and provided by Free Culture? Data, intuitions, experiences and ideas attempt to back up our affirmative answer.

  5. THE TYPES OF PUBLISHING SLOGANS

    Directory of Open Access Journals (Sweden)

    Ryzhov Konstantin Germanovich

    2015-03-01

    Full Text Available The author of the article focuses his attention on publishing slogans which are posted on 100 present-day Russian publishing houses' official websites and have not yet been studied in the special literature. The author has developed his own classification of publishing slogans based on the results of analysis and considering the current scientific views on the classification of slogans. The examined items are classified into autonomous and text-dependent according to interrelationship with an advertising text; marketable, corporative and mixed according to a presentation subject; rational, emotional and complex depending on the method of influence upon a recipient; slogan-presentation, slogan-assurance, slogan-identifier, slogan-appraisal, slogan-appeal depending on the communicative strategy; slogans consisting of one sentence and of two or more sentences; Russian and foreign ones. The analysis of the slogans of all kinds presented in the actual material allowed the author to determine the dominant features of the Russian publishing slogan which is an autonomous sentence in relation to the advertising text. In spite of that, the slogan shows the publishing output, influences the recipient emotionally, actualizes the communicative strategy of publishing house presentation of its distinguishing features, gives assurance to the target audience and distinguishes the advertised subject among competitors.

  6. Improved ultrashort pulse-retrieval algorithm for frequency-resolved optical gating

    International Nuclear Information System (INIS)

    DeLong, K.W.; Trebino, R.

    1994-01-01

    We report on significant improvements in the pulse-retrieval algorithm used to reconstruct the amplitude and the phase of ultrashort optical pulses from the experimental frequency-resolved optical gating trace data in the polarization-gate geometry. These improvements involve the use of an intensity constraint, an overcorrection technique, and a multidimensional minimization scheme. While the previously published, basic algorithm converged for most common ultrashort pulses, it failed to retrieve pulses with significant intensity substructure. The improved composite algorithm successfully converges for such pulses. It can now retrieve essentially all pulses of practical interest. We present examples of complex waveforms that were retrieved by the improved algorithm

  7. The Community Publishing Project: assisting writers to self-publish ...

    African Journals Online (AJOL)

    This article examines the need for a small project such as the Community Publishing Project in South Africa and explores its aims. The method of involving writers and community groups in the publication process is described and two completed projects are evaluated. Lessons learnt by the Centre for the Book in managing ...

  8. Algorithmic cryptanalysis

    CERN Document Server

    Joux, Antoine

    2009-01-01

    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program. Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applications.

  9. Web publishing today and tomorrow

    CERN Document Server

    Lie, Hakon W

    1999-01-01

    The three lectures will give participants the grand tour of the Web as we know it today, as well as peeks into the past and the future. Many three-letter acronyms will be expanded, and an overview will be provided to see how the various specifications work together. Web publishing is the common theme throughout the lectures and in the second lecture, special emphasis will be given to data formats for publishing, including HTML, XML, MathML and SMIL. In the last lecture, automatic document manipulation and presentation will be discussed, including CSS, DOM and XTL.

  10. Parallel algorithms for placement and routing in VLSI design. Ph.D. Thesis

    Science.gov (United States)

    Brouwer, Randall Jay

    1991-01-01

    The computational requirements for high quality synthesis, analysis, and verification of very large scale integration (VLSI) designs have rapidly increased with the fast growing complexity of these designs. Research in the past has focused on the development of heuristic algorithms, special purpose hardware accelerators, or parallel algorithms for the numerous design tasks to decrease the time required for solution. Two new parallel algorithms are proposed for two VLSI synthesis tasks, standard cell placement and global routing. The first algorithm, a parallel algorithm for global routing, uses hierarchical techniques to decompose the routing problem into independent routing subproblems that are solved in parallel. Results are then presented which compare the routing quality to the results of other published global routers and which evaluate the speedups attained. The second algorithm, a parallel algorithm for cell placement and global routing, hierarchically integrates a quadrisection placement algorithm, a bisection placement algorithm, and the previous global routing algorithm. Unique partitioning techniques are used to decompose the various stages of the algorithm into independent tasks which can be evaluated in parallel. Finally, results are presented which evaluate the various algorithm alternatives and compare the algorithm performance to other placement programs. Measurements are presented on the parallel speedups available.
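The decomposition strategy this thesis abstract describes can be illustrated with a toy sketch (not the thesis code): a routing region is quadrisected into independent subregions, which are then handed to a parallel executor. Here `solve_subregion` is a hypothetical stand-in for a real global router; it just reports the subregion's area.

```python
# Toy sketch of hierarchical decomposition for parallel global routing:
# quadrisect the chip region, then solve the four independent
# subproblems concurrently. Illustrative only; names are made up.
from concurrent.futures import ThreadPoolExecutor

def quadrisect(region):
    """Split a rectangular region (x0, y0, x1, y1) into four quadrants."""
    x0, y0, x1, y1 = region
    mx, my = (x0 + x1) // 2, (y0 + y1) // 2
    return [(x0, y0, mx, my), (mx, y0, x1, my),
            (x0, my, mx, y1), (mx, my, x1, y1)]

def solve_subregion(region):
    """Stand-in for a real router: return the subregion's area."""
    x0, y0, x1, y1 = region
    return (x1 - x0) * (y1 - y0)

chip = (0, 0, 64, 64)
with ThreadPoolExecutor() as pool:
    # Each quadrant is an independent subproblem, so they can be
    # dispatched in parallel with no shared state.
    results = list(pool.map(solve_subregion, quadrisect(chip)))
print(results)  # [1024, 1024, 1024, 1024]
```

In the thesis the same idea is applied recursively: each quadrant is itself quadrisected until the subproblems are small enough to route directly.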

  11. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.

  12. A Course in Desktop Publishing.

    Science.gov (United States)

    Somerick, Nancy M.

    1992-01-01

    Describes "Promotional Publications," a required course for public relations majors, which teaches the basics of desktop publishing. Outlines how the course covers the preparation of publications used as communication tools in public relations, advertising, and organizations, with an emphasis upon design, layout, and technology. (MM)

  13. Improving Published Descriptions of Germplasm.

    Science.gov (United States)

    Published descriptions of new germplasm, such as in the Journal of Plant Registrations (JPR) and, prior to mid-2007, in Crop Science, are important vehicles for allowing researchers and other interested parties to learn about such germplasm and the methods used to generate them. Launched in 2007, JP...

  14. Publishing in Open Access Journals

    International Development Research Centre (IDRC) Digital Library (Canada)

    mbrunet

    While most open access journals are peer‐reviewed and high quality, there are a number of ... Publisher has a negative reputation (e.g., documented examples in Chronicle of Higher Education, ... A key part of Canada's aid program, IDRC supports research in developing countries to promote growth and development.

  15. FTP: Full-Text Publishing?

    Science.gov (United States)

    Jul, Erik

    1992-01-01

    Describes the use of file transfer protocol (FTP) on the INTERNET computer network and considers its use as an electronic publishing system. The differing electronic formats of text files are discussed; the preparation and access of documents are described; and problems are addressed, including a lack of consistency. (LRW)

  16. Library Networks and Electronic Publishing.

    Science.gov (United States)

    Olvey, Lee D.

    1995-01-01

    Provides a description of present and proposed plans and strategies of OCLC (Online Computer Library Center) and their relationship to electronic publishing. FirstSearch (end-user access to secondary information), GUIDON (electronic journals online) and FastDoc (document delivery) are emphasized. (JKP)

  17. Total algorithms

    NARCIS (Netherlands)

    Tel, G.

    We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of

  18. Group leaders optimization algorithm

    Science.gov (United States)

    Daskin, Anmer; Kais, Sabre

    2011-03-01

    We present a new global optimization algorithm in which the influence of the leaders in social groups is used as an inspiration for the evolutionary technique which is designed into a group architecture. To demonstrate the efficiency of the method, a standard suite of single and multi-dimensional optimization functions along with the energies and the geometric structures of Lennard-Jones clusters are given as well as the application of the algorithm on quantum circuit design problems. We show that as an improvement over previous methods, the algorithm scales as N^2.5 for the Lennard-Jones clusters of N-particles. In addition, an efficient circuit design is shown for a two-qubit Grover search algorithm which is a quantum algorithm providing quadratic speedup over the classical counterpart.
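The leader-influence idea the abstract describes can be sketched as a minimal toy optimizer: a population is split into groups, each group's best member acts as its leader, and new candidates blend a member's position with its leader's plus a small random exploration term. This is an illustrative reading of the abstract, not the authors' published algorithm; all weights and parameter values are arbitrary.

```python
# Toy sketch of a group-leader-influenced evolutionary optimizer.
# Illustrative reading of the abstract only; parameters are arbitrary.
import random

def sphere(x):
    """Simple test objective: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def group_leaders_sketch(f, dim=3, groups=4, members=10, iters=200, seed=1):
    rng = random.Random(seed)
    pop = [[[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(members)]
           for _ in range(groups)]
    for _ in range(iters):
        for g in range(groups):
            leader = min(pop[g], key=f)  # group leader = best member
            for m in range(members):
                old = pop[g][m]
                # Blend member, leader, and a random exploration term.
                new = [0.6 * o + 0.3 * l + 0.1 * rng.uniform(-5, 5)
                       for o, l in zip(old, leader)]
                if f(new) < f(old):  # greedy acceptance
                    pop[g][m] = new
    return min((min(grp, key=f) for grp in pop), key=f)

best = group_leaders_sketch(sphere)
print(round(sphere(best), 4))
```

On this toy objective the best member's cost typically shrinks toward zero as the leaders pull their groups into better regions.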

  19. Critical appraisal of published literature

    Science.gov (United States)

    Umesh, Goneppanavar; Karippacheril, John George; Magazine, Rahul

    2016-01-01

    With a large output of medical literature coming out every year, it is impossible for readers to read every article. Critical appraisal of scientific literature is an important skill to be mastered not only by academic medical professionals but also by those involved in clinical practice. Before incorporating changes into the management of their patients, a thorough evaluation of the current or published literature is an important step in clinical practice. It is necessary for assessing the published literature for its scientific validity and generalizability to the specific patient community and reader's work environment. Simple steps have been provided by Consolidated Standard for Reporting Trial statements, Scottish Intercollegiate Guidelines Network and several other resources which if implemented may help the reader to avoid reading flawed literature and prevent the incorporation of biased or untrustworthy information into our practice. PMID:27729695

  20. Critical appraisal of published literature

    Directory of Open Access Journals (Sweden)

    Goneppanavar Umesh

    2016-01-01

    Full Text Available With a large output of medical literature coming out every year, it is impossible for readers to read every article. Critical appraisal of scientific literature is an important skill to be mastered not only by academic medical professionals but also by those involved in clinical practice. Before incorporating changes into the management of their patients, a thorough evaluation of the current or published literature is an important step in clinical practice. It is necessary for assessing the published literature for its scientific validity and generalizability to the specific patient community and reader's work environment. Simple steps have been provided by Consolidated Standard for Reporting Trial statements, Scottish Intercollegiate Guidelines Network and several other resources which if implemented may help the reader to avoid reading flawed literature and prevent the incorporation of biased or untrustworthy information into our practice.

  1. Bibliography of published papers, 1977

    International Nuclear Information System (INIS)

    1978-01-01

    Papers published by RERF (a cooperative Japan-U.S. research organization) personnel, mainly in 1977 issues of journals, are listed as a bibliography giving the title, authors, etc., mostly in both Japanese and English. The approximately 50 papers cover areas such as diseases (including cancer and cardiovascular disease), dosimetry, genetics, pathology, radiation effects, and summary reports. (Mori, K.)

  2. Publisher Correction: Eternal blood vessels

    Science.gov (United States)

    Hindson, Jordan

    2018-05-01

    This article was originally published with an incorrect reference for the original article. The reference has been amended. Please see the correct reference below. Qiu, Y. et al. Microvasculature-on-a-chip for the long-term study of endothelial barrier dysfunction and microvascular obstruction in disease. Nat. Biomed. Eng. https://doi.org/10.1038/s41551-018-0224-z (2018)

  3. The Industrial Engineering publishing landscape

    OpenAIRE

    Claasen, Schalk

    2012-01-01

    Looking at the Industrial Engineering publishing landscape through the window of Google Search, an interesting panorama unfolds. The view that I took is actually just a peek and therefore my description of what I saw is not meant to be comprehensive. The African landscape is empty except for the South African Journal of Industrial Engineering (SAJIE). This is an extraordinary situation if compared to the South American continent where there are Industrial Engineering journals in at least ...

  4. Where is smoking research published?

    Science.gov (United States)

    Liguori, A.; Hughes, J. R.

    1996-01-01

    OBJECTIVE: To identify journals that have a focus on human nicotine/smoking research and to investigate the coverage of smoking in "high-impact" journals. DESIGN: The MEDLINE computer database was searched for English-language articles on human studies published in 1988-1992 using "nicotine", "smoking", "smoking cessation", "tobacco", or "tobacco use disorder" as focus descriptors. This search was supplemented with a similar search of the PSYCLIT computer database. Fifty-eight journals containing at least 20 nicotine/smoking articles over the five years were analysed for impact factor (IF; citations per article). RESULTS: Among the journals with the highest percentage of nicotine- or smoking-focused articles (that is, 9-39% of their articles were on nicotine/smoking), Addiction, American Journal of Public Health, Cancer Causes and Control, Health Psychology, and Preventive Medicine had the greatest IF (range = 1.3-2.6). Among the journals highest in impact factor (IF > 3), only American Journal of Epidemiology, American Review of Respiratory Disease, Journal of the National Cancer Institute, and Journal of the American Medical Association published more than 10 nicotine/smoking articles per year (3-5% of all articles). Of these, only Journal of the American Medical Association published a large number of nicotine/smoking articles (32 per year). CONCLUSIONS: Although smoking causes 20% of all mortality in developed countries, the topic is not adequately covered in high-impact journals. Most smoking research is published in low-impact journals. PMID:8795857

  5. Rapid fish stock depletion in previously unexploited seamounts: the ...

    African Journals Online (AJOL)

    Rapid fish stock depletion in previously unexploited seamounts: the case of Beryx splendens from the Sierra Leone Rise (Gulf of Guinea) ... A spectral analysis and red-noise spectra procedure (REDFIT) algorithm was used to identify the red-noise spectrum from the gaps in the observed time-series of catch per unit effort by ...

  6. Books average previous decade of economic misery.

    Science.gov (United States)

    Bentley, R Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a 'literary misery index' derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade.
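The core computation the abstract describes can be sketched in a few lines: build the economic misery index (inflation plus unemployment), smooth it with an 11-year trailing moving average (the paper's best-fit window), and correlate the result with a literary misery series. The numbers below are made-up toy values, not the authors' data.

```python
# Hedged sketch of correlating a literary misery series with a trailing
# moving average of the economic misery index. Toy data only.
from statistics import mean

def misery_index(inflation, unemployment):
    """Economic misery index: inflation rate plus unemployment rate."""
    return [i + u for i, u in zip(inflation, unemployment)]

def trailing_mean(series, window):
    """Moving average over the previous `window` values (inclusive)."""
    return [mean(series[t - window + 1 : t + 1])
            for t in range(window - 1, len(series))]

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

# Toy 15-year series (hypothetical percentages).
inflation    = [2, 3, 5, 8, 6, 4, 3, 2, 2, 3, 4, 6, 5, 3, 2]
unemployment = [4, 5, 6, 9, 8, 7, 6, 5, 5, 6, 7, 8, 7, 6, 5]
economic = misery_index(inflation, unemployment)
smoothed = trailing_mean(economic, window=11)  # 5 values remain

# A hypothetical literary misery series aligned to the same final years.
literary = [0.8, 1.1, 0.9, 0.7, 0.6]
r = pearson(literary, smoothed)
print(round(r, 3))
```

The paper's analysis then scans the window length and finds the goodness of fit peaking at 11 years.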

  7. Books Average Previous Decade of Economic Misery

    Science.gov (United States)

    Bentley, R. Alexander; Acerbi, Alberto; Ormerod, Paul; Lampos, Vasileios

    2014-01-01

    For the 20th century since the Depression, we find a strong correlation between a ‘literary misery index’ derived from English language books and a moving average of the previous decade of the annual U.S. economic misery index, which is the sum of inflation and unemployment rates. We find a peak in the goodness of fit at 11 years for the moving average. The fit between the two misery indices holds when using different techniques to measure the literary misery index, and this fit is significantly better than other possible correlations with different emotion indices. To check the robustness of the results, we also analysed books written in German language and obtained very similar correlations with the German economic misery index. The results suggest that millions of books published every year average the authors' shared economic experiences over the past decade. PMID:24416159

  8. Open access to scientific publishing

    Directory of Open Access Journals (Sweden)

    Janne Beate Reitan

    2016-12-01

    Full Text Available Interest in open access (OA) to scientific publications is steadily increasing, both in Norway and internationally. From the outset, FORMakademisk has been published as a digital journal, and it was one of the first to offer OA in Norway. We have since the beginning used Open Journal Systems (OJS) as publishing software. OJS is part of the Public Knowledge Project (PKP), which was created by Canadian John Willinsky and colleagues at the Faculty of Education at the University of British Columbia in 1998. The first version of OJS came as open source software in 2001. The programme is free for everyone to use and is part of a larger collective movement wherein knowledge is shared. When FORMakademisk started in 2008, we received much help from the journal Acta Didactic (n.d.) at the University of Oslo, which had started the year before us. They had also translated the programme to Norwegian. From the start, we were able to publish in both Norwegian and English. Other journals have used FORMakademisk as a model and source of inspiration when starting or when converting from subscription-based print journals to electronic OA, including the Journal of Norwegian Media Researchers [Norsk medietidsskrift]. It is in this way that the movement around PKP works and continues to grow to provide free access to research. As the articles are OA, they are also easily accessible to non-scientists. We also emphasise that the language should be readily available, although it should maintain a high scientific quality. Often there may be two sides of the same coin. We on the editorial team are now looking forward to adopting the newly developed OJS 3 this spring, with many new features and an improved design for users, including authors, peer reviewers, editors and readers.

  9. Electronic publishing of SPE papers

    International Nuclear Information System (INIS)

    Perdue, J.M.

    1992-01-01

    This paper reports that the SPE is creating an electronic index to over 25,000 technical papers and will produce a CD-ROM as an initial product. This SPE CD-ROM Masterdisc will be available at the SPE Annual Meeting in Washington, D.C. on October 4-7, 1992. The SPE Board has appointed an Ad Hoc Committee on Electronic Publishing to coordinate and oversee this project and to recommend authoring standards for submitting SPE papers electronically in the future

  10. Open Access Publishing with Drupal

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2011-10-01

    Full Text Available In January 2009, the Colorado Association of Libraries (CAL) suspended publication of its print quarterly journal, Colorado Libraries, as a cost-saving measure in a time of fiscal uncertainty. Printing and mailing the journal to its 1300 members cost CAL more than $26,000 per year. Publication of the journal was placed on an indefinite hiatus until the editorial staff proposed an online, open access format a year later. The benefits of migrating to open access included: significantly lower costs; a green platform; instant availability of content; a greater level of access for users with disabilities; and a higher level of visibility of the journal and the association. The editorial staff chose Drupal, including the E-journal module, and while Drupal is notorious for its steep learning curve (which exacerbated delays to content that had been created before the publishing hiatus), the fourth electronic issue was published recently at coloradolibrariesjournal.org. This article will discuss both the benefits and challenges of transitioning to an open access model and the choice of Drupal as a platform over other more established journal software options.

  11. E-publishing and multimodalities

    Directory of Open Access Journals (Sweden)

    Yngve Nordkvelle

    2008-12-01

    Full Text Available In the literature of e-publishing there has been a consistent call, from the advent of e-publishing until now, to explore new ways of expressing ideas through the new media. It has been claimed that the Internet opens an alley of possibilities and opportunities for publishing that will change the ways of publishing once and for all. In the area of publication of e-journals, however, the call for changes has received very modest responses. The thing is, it appears, that the conventional paper journal has a solid grip on the accepted formats of publishing. In a published research paper, Mayernik (2007) explains some of the reasons for that. Although pioneers of e-publishing suggested various areas where academic publishing could be expanded, the opportunities given are scarcely used. Mayernik outlines "Non-linearity", "Multimedia", "Multiple use", "Interactivity" and "Rapid Publication" as areas of expansion for the academic e-journal (2007). The paper deserves a thorough reading in itself, and I will briefly quote from his conclusion: "It is likely that the traditional linear article will continue to be the prevalent format for scholarly journals, both print and electronic, for the foreseeable future, and while electronic features will garner more and more use as technology improves, they will continue to be used to supplement, and not supplant, the traditional article." This is a challenging situation. If we accept the present dominant style of presenting scientific literature, we would best use our energy in seeking a way of improving the efficiency of that communication style. The use of multimedia, non-linearity etc. would perfect the present state but still keep the scientific article as the main template. It is very unlikely that scientific publication will substitute the scholarly article with unproven alternatives. What we face is a rather conservative style of remediation that blurs the impact of the new media, - or "transparency" if

  12. Publishing corruption discussion: predatory journalism.

    Science.gov (United States)

    Jones, James W; McCullough, Laurence B

    2014-02-01

    Dr Spock is a brilliant young vascular surgeon who is up for tenure next year. He has been warned by the chair of surgery that he needs to increase his list of publications to assure passage. He has recently had a paper reviewed by one of the top journals in his specialty, Journal X-special, with several suggestions for revision. He received an e-mail request for manuscript submission from a newly minted, open access, Journal of Vascular Disease Therapy, which promises a quick and likely favorable response for a fee. What should be done? A. Send the paper to another peer reviewed journal with the suggested revisions. B. Resubmit the paper to Journal X-special. C. Submit to the online journal as is to save time. D. Submit to the online journal and another regular journal. E. Look for another job. Copyright © 2014 Society for Vascular Surgery. Published by Mosby, Inc. All rights reserved.

  13. The IAEA as a publisher

    International Nuclear Information System (INIS)

    1965-01-01

    One of the largest publishing enterprises in Vienna has developed in the Agency, incidental to its function of disseminating scientific information. The Agency recently completed its sixth year of scientific publication of literature dealing with the peaceful uses of atomic energy. Quite early in the history of IAEA, this work grew to considerable dimensions. In 1959 the programme consisted of two volumes in the Proceedings series, one in the Safety series, and four Technical Directories, making a total in that year of 18 000 books, in addition to those prepared for free distribution. In the following year, as Agency meetings and other activities developed, the list was much longer, consisting of six volumes in the Proceedings series, two in the Safety series, two in the Technical Directory series, eight in the Review series, two in the Bibliographical series, three panel reports, one volume in the Legal series and the first issue of 'Nuclear Fusion'. The total number of volumes sold was 24 000, in addition to the large number for free distribution. Thereafter, there was some difficulty in keeping up with the expanding demands, and some arrears of contract printing began to accumulate. It was therefore decided to introduce internal printing of Agency publications. The adoption of the 'cold type' method in 1962 led to considerable savings and faster production. During 1963, printing and binding equipment was installed which rendered the Agency independent of contractual services. Current policy is to print and bind internally all IAEA publications except the journal, 'Nuclear Fusion'. Average annual production now consists of about twenty volumes of the proceedings of scientific meetings, six technical directories (the Directory of Nuclear Reactors has been published in its fifth edition), several bibliographies and numerous technical reports.

  14. ESTABLISHING A PUBLISHING OUTFIT IN NIGERIA EMENYONU ...

    African Journals Online (AJOL)

    CIU

    stakeholders in the publishing industry, the legal environment of publishing, ... retailers. Publishing is a peculiar form of business for which a special group of very .... The publisher should provide furniture and fittings for the staff and intended ...

  15. The Psychopharmacology Algorithm Project at the Harvard South Shore Program: An Algorithm for Generalized Anxiety Disorder.

    Science.gov (United States)

    Abejuela, Harmony Raylen; Osser, David N

    2016-01-01

    This revision of previous algorithms for the pharmacotherapy of generalized anxiety disorder was developed by the Psychopharmacology Algorithm Project at the Harvard South Shore Program. Algorithms from 1999 and 2010 and associated references were reevaluated. Newer studies and reviews published from 2008-14 were obtained from PubMed and analyzed with a focus on their potential to justify changes in the recommendations. Exceptions to the main algorithm for special patient populations, such as women of childbearing potential, pregnant women, the elderly, and those with common medical and psychiatric comorbidities, were considered. Selective serotonin reuptake inhibitors (SSRIs) are still the basic first-line medication. Early alternatives include duloxetine, buspirone, hydroxyzine, pregabalin, or bupropion, in that order. If response is inadequate, then the second recommendation is to try a different SSRI. Additional alternatives now include benzodiazepines, venlafaxine, kava, and agomelatine. If the response to the second SSRI is unsatisfactory, then the recommendation is to try a serotonin-norepinephrine reuptake inhibitor (SNRI). Other alternatives to SSRIs and SNRIs for treatment-resistant or treatment-intolerant patients include tricyclic antidepressants, second-generation antipsychotics, and valproate. This revision of the GAD algorithm responds to issues raised by new treatments under development (such as pregabalin) and organizes the evidence systematically for practical clinical application.

  16. An implicit flux-split algorithm to calculate hypersonic flowfields in chemical equilibrium

    Science.gov (United States)

    Palmer, Grant

    1987-01-01

    An implicit, finite-difference, shock-capturing algorithm that calculates inviscid, hypersonic flows in chemical equilibrium is presented. The flux vectors and flux Jacobians are differenced using a first-order, flux-split technique. The equilibrium composition of the gas is determined by minimizing the Gibbs free energy at every node point. The code is validated by comparing results over an axisymmetric hemisphere against previously published results. The algorithm is also applied to more practical configurations. The accuracy, stability, and versatility of the algorithm have been promising.

  17. Clinical algorithms to aid osteoarthritis guideline dissemination.

    Science.gov (United States)

    Meneses, S R F; Goode, A P; Nelson, A E; Lin, J; Jordan, J M; Allen, K D; Bennell, K L; Lohmander, L S; Fernandes, L; Hochberg, M C; Underwood, M; Conaghan, P G; Liu, S; McAlindon, T E; Golightly, Y M; Hunter, D J

    2016-09-01

    Numerous scientific organisations have developed evidence-based recommendations aiming to optimise the management of osteoarthritis (OA). Uptake, however, has been suboptimal. The purpose of this exercise was to harmonize the recent recommendations and develop a user-friendly treatment algorithm to facilitate translation of evidence into practice. We updated a previous systematic review on clinical practice guidelines (CPGs) for OA management. The guidelines were assessed using the Appraisal of Guidelines for Research and Evaluation for quality and the standards for developing trustworthy CPGs as established by the National Academy of Medicine (NAM). Four case scenarios and algorithms were developed by consensus of a multidisciplinary panel. Sixteen guidelines were included in the systematic review. Most recommendations were directed toward physicians and allied health professionals, and most had multi-disciplinary input. Analysis for trustworthiness suggests that many guidelines still present a lack of transparency. A treatment algorithm was developed for each case scenario advised by recommendations from guidelines and based on panel consensus. Strategies to facilitate the implementation of guidelines in clinical practice are necessary. The algorithms proposed are examples of how to apply recommendations in the clinical context, helping the clinician to visualise the patient flow and timing of different treatment modalities. Copyright © 2016 Osteoarthritis Research Society International. Published by Elsevier Ltd. All rights reserved.

  18. Training nuclei detection algorithms with simple annotations

    Directory of Open Access Journals (Sweden)

    Henning Kost

    2017-01-01

    Full Text Available Background: Generating good training datasets is essential for machine learning-based nuclei detection methods. However, creating exhaustive nuclei contour annotations, to derive optimal training data from, is often infeasible. Methods: We compared different approaches for training nuclei detection methods solely based on nucleus center markers. Such markers contain less accurate information, especially with regard to nuclear boundaries, but can be produced much easier and in greater quantities. The approaches use different automated sample extraction methods to derive image positions and class labels from nucleus center markers. In addition, the approaches use different automated sample selection methods to improve the detection quality of the classification algorithm and reduce the run time of the training process. We evaluated the approaches based on a previously published generic nuclei detection algorithm and a set of Ki-67-stained breast cancer images. Results: A Voronoi tessellation-based sample extraction method produced the best performing training sets. However, subsampling of the extracted training samples was crucial. Even simple class balancing improved the detection quality considerably. The incorporation of active learning led to a further increase in detection quality. Conclusions: With appropriate sample extraction and selection methods, nuclei detection algorithms trained on the basis of simple center marker annotations can produce comparable quality to algorithms trained on conventionally created training sets.
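The "simple class balancing" step the abstract credits with a considerable quality gain can be sketched as plain majority-class downsampling: keep all samples of the rarer class and randomly subsample the other to match. This is a generic illustration of the idea, not the authors' implementation.

```python
# Illustrative sketch of simple class balancing for training-sample
# selection: downsample the majority class so nucleus-center (positive)
# and background (negative) samples occur in equal numbers.
import random

def balance_classes(samples, labels, seed=0):
    """Return an equal number of positive and negative samples."""
    rng = random.Random(seed)
    pos = [s for s, l in zip(samples, labels) if l == 1]
    neg = [s for s, l in zip(samples, labels) if l == 0]
    n = min(len(pos), len(neg))
    balanced = rng.sample(pos, n) + rng.sample(neg, n)
    return balanced, [1] * n + [0] * n

# Toy data: 3 positive (nucleus) and 7 negative (background) samples.
samples = list(range(10))
labels = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]
bal_samples, bal_labels = balance_classes(samples, labels)
print(bal_labels)  # [1, 1, 1, 0, 0, 0]
```

The paper goes further (Voronoi tessellation-based extraction and active learning), but even this kind of balancing changes the class prior the detector is trained on.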

  19. A meditation on the use of hands. Previously published in Scandinavian Journal of Occupational Therapy 1995; 2: 153-166.

    Science.gov (United States)

    Kielhofner, G

    2014-01-01

    The theme of mind-body unity is fundamental to occupational therapy. Nonetheless, the field continues to embrace a dualism of mind and body. This dualism persists because the field views the body only as an object, ignoring how the body is lived. Drawing upon phenomenological discussions of bodily experience, this paper illustrates how the lived body is a locus of intelligence, intentionality, adaptiveness, and experience. It also considers the bodily ground of motivation and thought and discusses how the body constitutes and incorporates its world. Finally, the paper considers implications of the lived body for therapy.

  20. Outcome of unicompartmental knee arthroplasty in octogenarians with tricompartmental osteoarthritis: A longer followup of a previously published report

    Directory of Open Access Journals (Sweden)

    Sanjiv KS Marya

    2013-01-01

    Background: Unicompartmental knee arthroplasty (UKA) has specific indications, producing excellent results. It, however, has a limited lifespan and needs eventual conversion to total knee arthroplasty (TKA). It is, therefore, a temporizing procedure in select active young patients with advanced unicompartmental osteoarthritis (UCOA). Being a less morbid procedure, it is suggested as an alternative in very elderly patients with tricompartmental osteoarthritis (TCOA). We performed UKA in a series of 45 octogenarians with TCOA with predominant medial compartment osteoarthritis (MCOA) and analyzed the results. Materials and Methods: Forty-five octogenarian patients with TCOA and predominant MCOA underwent UKA (19 bilateral) from January 2002 to January 2012. All had similar preoperative work-up, surgical approach, procedure, implants and postoperative protocol. Clinicoradiological assessment was done at 3-monthly intervals for the first year, then yearly till the last followup (average 72 months, range 8-128 months). Results were evaluated using the knee society scores (KSS), satisfaction index [using the visual analogue scale (VAS)] and orthogonal radiographs (for loosening, subsidence, lysis or implant wear). Resurgery for any cause was considered failure. Results: Four patients (six knees) died due to medical conditions and two patients (three knees) were lost to followup; these were excluded from the final analysis. Barring two failures, all the remaining patients were pain-free and performing well at the final followup. Indications for resurgery were: medial femoral condyle fracture needing fixation with subsequent conversion to TKA at 2 years (n=1) and progression of arthritis and pain leading to revision TKA at 6 years (n=1). Conclusion: UKA has shown successful outcomes with regard to pain relief and function, with 96.4% implant survival and 94.9% good or excellent outcomes. Due to lower demands, early rehabilitation, less morbidity, and relatively short life expectancy, UKA can successfully manage TCOA in octogenarians.

  1. Electrons, Electronic Publishing, and Electronic Display.

    Science.gov (United States)

    Brownrigg, Edwin B.; Lynch, Clifford A.

    1985-01-01

    Provides a perspective on electronic publishing by distinguishing between "Newtonian" publishing and "quantum-mechanical" publishing. Highlights include media and publishing, works delivered through electronic media, electronic publishing and the printed word, management of intellectual property, and recent copyright-law issues…

  2. New recursive-least-squares algorithms for nonlinear active control of sound and vibration using neural networks.

    Science.gov (United States)

    Bouchard, M

    2001-01-01

    In recent years, a few articles describing the use of neural networks for nonlinear active control of sound and vibration were published. Using a control structure with two multilayer feedforward neural networks (one as a nonlinear controller and one as a nonlinear plant model), steepest descent algorithms based on two distinct gradient approaches were introduced for the training of the controller network. The two gradient approaches were sometimes called the filtered-x approach and the adjoint approach. Some recursive-least-squares algorithms were also introduced, using the adjoint approach. In this paper, a heuristic procedure is introduced for the development of recursive-least-squares algorithms based on the filtered-x and the adjoint gradient approaches. This leads to the development of new recursive-least-squares algorithms for the training of the controller neural network in the two-network structure. These new algorithms produce a better convergence performance than previously published algorithms. Differences in the performance of algorithms using the filtered-x and the adjoint gradient approaches are discussed in the paper. The computational load of the algorithms discussed in the paper is evaluated for multichannel systems of nonlinear active control. Simulation results are presented to compare the convergence performance of the algorithms, showing the convergence gain provided by the new algorithms.

  3. Algorithmic alternatives

    International Nuclear Information System (INIS)

    Creutz, M.

    1987-11-01

    A large variety of Monte Carlo algorithms are being used for lattice gauge simulations. For purely bosonic theories, present approaches are generally adequate; nevertheless, overrelaxation techniques promise savings by a factor of about three in computer time. For fermionic fields the situation is more difficult and less clear. Algorithms which involve an extrapolation to a vanishing step size are all quite closely related. Methods which do not require such an approximation tend to require computer time which grows as the square of the volume of the system. Recent developments combining global accept/reject stages with Langevin or microcanonical updatings promise to reduce this growth to V^(4/3).
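    The accept/reject stages mentioned above build on the basic Metropolis update. As a reminder of that primitive, here is a minimal sketch of a local Metropolis sweep for a free scalar field on a 1-D periodic lattice; the action, step size and parameters are illustrative assumptions, not the lattice gauge setting of the text.

```python
import math
import random

def metropolis_sweep(phi, beta, rng, step=0.5):
    """One Metropolis sweep for a 1-D lattice with nearest-neighbor action
    S = beta * sum_i (phi[i+1] - phi[i])**2, periodic boundary conditions.
    Generic local accept/reject update; returns the acceptance fraction."""
    n = len(phi)
    accepted = 0
    for i in range(n):
        old = phi[i]
        new = old + rng.uniform(-step, step)       # trial change at one site
        left, right = phi[(i - 1) % n], phi[(i + 1) % n]
        # change in action from updating this single site
        dS = beta * (((new - left) ** 2 + (new - right) ** 2)
                     - ((old - left) ** 2 + (old - right) ** 2))
        # Metropolis accept/reject
        if dS <= 0 or rng.random() < math.exp(-dS):
            phi[i] = new
            accepted += 1
    return accepted / n

rng = random.Random(1)
phi = [0.0] * 32
rates = [metropolis_sweep(phi, beta=1.0, rng=rng) for _ in range(100)]
```

    Each sweep costs time proportional to the lattice volume; the fermionic cost growth discussed in the abstract arises because the effective action there is non-local.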

  4. Combinatorial algorithms

    CERN Document Server

    Hu, T C

    2002-01-01

    Newly enlarged, updated second edition of a valuable, widely used text presents algorithms for shortest paths, maximum flows, dynamic programming and backtracking. Also discussed are binary trees, heuristic and near optimums, matrix multiplication, and NP-complete problems. 153 black-and-white illus. 23 tables. New to this edition: Chapter 9

  5. Why should we publish Linked Data?

    Science.gov (United States)

    Blower, Jon; Riechert, Maik; Koubarakis, Manolis; Pace, Nino

    2016-04-01

    We use the Web every day to access information from all kinds of different sources. But the complexity and diversity of scientific data mean that discovering, accessing and interpreting data remain a large challenge to researchers, decision-makers and other users. Different sources of useful information on data, algorithms, instruments and publications are scattered around the Web. How can we link all these things together to help users to better understand and exploit earth science data? How can we combine scientific data with other relevant data sources, when standards for describing and sharing data vary so widely between communities? "Linked Data" is a term that describes a set of standards and "best practices" for sharing data on the Web (http://www.w3.org/standards/semanticweb/data). These principles can be summarised as follows: 1. Create unique and persistent identifiers for the important "things" in a community (e.g. datasets, publications, algorithms, instruments). 2. Allow users to "look up" these identifiers on the web to find out more information about them. 3. Make this information machine-readable in a community-neutral format (such as RDF, Resource Description Framework). 4. Within this information, embed links to other things and concepts and say how these are related. 5. Optionally, provide web service interfaces to allow the user to perform sophisticated queries over this information (using a language such as SPARQL). The promise of Linked Data is that, through these techniques, data will be more discoverable, more comprehensible and more usable by different communities, not just the community that produced the data. As a result, many data providers (particularly public-sector institutions) are now publishing data in this way. However, this area is still in its infancy in terms of real-world applications. Data users need guidance and tools to help them use Linked Data. Data providers need reassurance that the investments they are making in
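    The five principles above can be made concrete with a toy in-memory triple store. All URIs and predicate names below are invented examples (loosely styled after Dublin Core terms); a real deployment would publish RDF and expose a SPARQL endpoint, e.g. via a library such as rdflib.

```python
# Toy triple store illustrating Linked Data: persistent URIs for "things",
# machine-readable statements about them, and typed links between them.
triples = set()

def add(subject, predicate, obj):
    triples.add((subject, predicate, obj))

def query(subject=None, predicate=None, obj=None):
    """Return all triples matching the pattern (None = wildcard) -- a
    drastically simplified stand-in for a SPARQL basic graph pattern."""
    return {t for t in triples
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)}

DATASET = "http://example.org/dataset/sst-v1"      # principle 1: a URI for the dataset
PAPER = "http://example.org/paper/doi-10.0000-x"   # ... and for the paper describing it
add(DATASET, "dct:title", "Sea surface temperature, v1")
add(DATASET, "dct:isReferencedBy", PAPER)          # principle 4: a typed link
add(PAPER, "dct:creator", "http://example.org/person/jdoe")

# "Look up" the dataset (principle 2) and follow its links:
dataset_facts = query(subject=DATASET)
```

    Following the `dct:isReferencedBy` link from the dataset to the paper, and from there to its creator, is exactly the cross-community navigation the abstract argues for.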

  6. Autodriver algorithm

    Directory of Open Access Journals (Sweden)

    Anna Bourmistrova

    2011-02-01

    The autodriver algorithm is an intelligent method to eliminate the need of steering by a driver on a well-defined road. The proposed method performs best on a four-wheel steering (4WS) vehicle, though it is also applicable to two-wheel-steering (TWS) vehicles. The algorithm is based on coinciding the actual vehicle center of rotation and road center of curvature, by adjusting the kinematic center of rotation. The road center of curvature is assumed prior information for a given road, while the dynamic center of rotation is the output of dynamic equations of motion of the vehicle using steering angle and velocity measurements as inputs. We use the kinematic condition of steering to set the steering angles in such a way that the kinematic center of rotation of the vehicle sits at a desired point. At low speeds the ideal and actual paths of the vehicle are very close. With increase of forward speed the road and tire characteristics, along with the motion dynamics of the vehicle, cause the vehicle to turn about time-varying points. By adjusting the steering angles, our algorithm controls the dynamic turning center of the vehicle so that it coincides with the road curvature center, hence keeping the vehicle on a given road autonomously. The position and orientation errors are used as feedback signals in a closed loop control to adjust the steering angles. The application of the presented autodriver algorithm demonstrates reliable performance under different driving conditions.

  7. Desktop Publishing Choices: Making an Appropriate Decision.

    Science.gov (United States)

    Crawford, Walt

    1991-01-01

    Discusses various choices available for desktop publishing systems. Four categories of software are described, including advanced word processing, graphics software, low-end desktop publishing, and mainstream desktop publishing; appropriate hardware is considered; and selection guidelines are offered, including current and future publishing needs,…

  8. MUCH Electronic Publishing Environment: Principles and Practices.

    Science.gov (United States)

    Min, Zheng; Rada, Roy

    1994-01-01

    Discusses the electronic publishing system called Many Using and Creating Hypermedia (MUCH). The MUCH system supports collaborative authoring; reuse; formatting and printing; management; hypermedia publishing and delivery; and interchange. This article examines electronic publishing environments; the MUCH environment; publishing activities; and…

  9. Self-Published Books: An Empirical "Snapshot"

    Science.gov (United States)

    Bradley, Jana; Fulton, Bruce; Helm, Marlene

    2012-01-01

    The number of books published by authors using fee-based publication services, such as Lulu and AuthorHouse, is overtaking the number of books published by mainstream publishers, according to Bowker's 2009 annual data. Little empirical research exists on self-published books. This article presents the results of an investigation of a random sample…

  10. Multispecies Coevolution Particle Swarm Optimization Based on Previous Search History

    Directory of Open Access Journals (Sweden)

    Danping Wang

    2017-01-01

    A hybrid coevolution particle swarm optimization algorithm with a dynamic multispecies strategy based on K-means clustering and a nonrevisit strategy based on a Binary Space Partitioning fitness tree (called MCPSO-PSH) is proposed. Previous search history memorized in the Binary Space Partitioning fitness tree can effectively restrain the individuals' revisit phenomenon. The whole population is partitioned into several subspecies and cooperative coevolution is realized by an information communication mechanism between subspecies, which can enhance the global search ability of particles and avoid premature convergence to local optima. To demonstrate the power of the method, comparisons between the proposed algorithm and state-of-the-art algorithms are grouped into two categories: 10 basic benchmark functions (10-dimensional and 30-dimensional) and 10 CEC2005 benchmark functions (30-dimensional), and a real-world problem (multilevel image segmentation problems). Experimental results show that MCPSO-PSH displays a competitive performance compared to the other swarm-based or evolutionary algorithms in terms of solution accuracy and statistical tests.
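    For readers unfamiliar with the underlying primitive, a minimal global-best PSO (without the multispecies, K-means or nonrevisit machinery of MCPSO-PSH) can be sketched as follows; all parameter values are conventional defaults, not the paper's settings.

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization of f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                    # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]   # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, val = pso_minimize(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10))
```

    On a multimodal function, all particles chasing one global best is exactly what causes the premature convergence that MCPSO-PSH's subspecies and nonrevisit tree are designed to counter.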

  11. Optimal Fungal Space Searching Algorithms.

    Science.gov (United States)

    Asenova, Elitsa; Lin, Hsin-Yu; Fu, Eileen; Nicolau, Dan V; Nicolau, Dan V

    2016-10-01

    Previous experiments have shown that fungi use an efficient natural algorithm for searching the space available for their growth in micro-confined networks, e.g., mazes. This natural "master" algorithm, which comprises two "slave" sub-algorithms, i.e., collision-induced branching and directional memory, has been shown to be more efficient than alternatives, with one, or the other, or both sub-algorithms turned off. In contrast, the present contribution compares the performance of the fungal natural algorithm against several standard artificial homologues. It was found that the space-searching fungal algorithm consistently outperforms uninformed algorithms, such as Depth-First-Search (DFS). Furthermore, while the natural algorithm is inferior to informed ones, such as A*, this under-performance does not increase appreciably with the size of the maze. These findings suggest that a systematic effort of harvesting the natural space-searching algorithms used by microorganisms is warranted and possibly overdue. These natural algorithms, if efficient, can be reverse-engineered for graph and tree search strategies.
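    The uninformed DFS baseline that the fungal algorithm is compared against can be sketched on a grid maze; the maze encoding (0 = open, 1 = wall) and the example maze are assumptions for illustration.

```python
def dfs_path(maze, start, goal):
    """Uninformed depth-first search on a grid maze (0 = open, 1 = wall).
    Returns a path from start to goal as a list of (row, col) cells, or
    None if the goal is unreachable. This is the DFS baseline, not the
    fungal algorithm itself."""
    rows, cols = len(maze), len(maze[0])
    stack, seen = [(start, [start])], {start}
    while stack:
        (r, c), path = stack.pop()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and maze[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                stack.append(((nr, nc), path + [(nr, nc)]))
    return None

maze = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = dfs_path(maze, (0, 0), (0, 2))
```

    An informed search such as A* differs only in expanding cells in order of path cost plus a heuristic distance to the goal, which is the advantage the abstract says grows only mildly with maze size.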

  12. Successive combination jet algorithm for hadron collisions

    International Nuclear Information System (INIS)

    Ellis, S.D.; Soper, D.E.

    1993-01-01

    Jet finding algorithms, as they are used in e+e- and hadron collisions, are reviewed and compared. It is suggested that a successive combination style algorithm, similar to that used in e+e- physics, might be useful also in hadron collisions, where cone style algorithms have been used previously.

  13. To develop a universal gamut mapping algorithm

    International Nuclear Information System (INIS)

    Morovic, J.

    1998-10-01

    When a colour image from one colour reproduction medium (e.g. nature, a monitor) needs to be reproduced on another (e.g. on a monitor or in print) and these media have different colour ranges (gamuts), it is necessary to have a method for mapping between them. If such a gamut mapping algorithm can be used under a wide range of conditions, it can also be incorporated in an automated colour reproduction system and considered to be in some sense universal. In terms of preliminary work, a colour reproduction system was implemented, for which a new printer characterisation model (including grey-scale correction) was developed. Methods were also developed for calculating gamut boundary descriptors and for calculating gamut boundaries along given lines from them. The gamut mapping solution proposed in this thesis is a gamut compression algorithm developed with the aim of being accurate and universally applicable. It was arrived at by way of an evolutionary gamut mapping development strategy for the purposes of which five test images were reproduced between a CRT and printed media obtained using an inkjet printer. Initially, a number of previously published algorithms were chosen and psychophysically evaluated whereby an important characteristic of this evaluation was that it also considered the performance of algorithms for individual colour regions within the test images used. New algorithms were then developed on their basis, subsequently evaluated and this process was repeated once more. In this series of experiments the new GCUSP algorithm, which consists of a chroma-dependent lightness compression followed by a compression towards the lightness of the reproduction cusp on the lightness axis, gave the most accurate and stable performance overall. The results of these experiments were also useful for improving the understanding of some gamut mapping factors - in particular gamut difference. 
In addition to looking at accuracy, the pleasantness of reproductions obtained

  14. Algorithmic Self

    DEFF Research Database (Denmark)

    Markham, Annette

    This paper takes an actor network theory approach to explore some of the ways that algorithms co-construct identity and relational meaning in contemporary use of social media. Based on intensive interviews with participants as well as activity logging and data tracking, the author presents a richly layered set of accounts to help build our understanding of how individuals relate to their devices, search systems, and social network sites. This work extends critical analyses of the power of algorithms in implicating the social self by offering narrative accounts from multiple perspectives. It also contributes an innovative method for blending actor network theory with symbolic interaction to grapple with the complexity of everyday sensemaking practices within networked global information flows.

  15. Structure and navigation for electronic publishing

    Science.gov (United States)

    Tillinghast, John; Beretta, Giordano B.

    1998-01-01

    The sudden explosion of the World Wide Web as a new publication medium has given a dramatic boost to the electronic publishing industry, which previously was a limited market centered around CD-ROMs and on-line databases. While the phenomenon has parallels to the advent of the tabloid press in the middle of the last century, the electronic nature of the medium brings with it the typical characteristic of 4th wave media, namely the acceleration in its propagation speed and the volume of information. Consequently, e-publications are even flatter than print media; Shakespeare's Romeo and Juliet shares the same computer screen with a home-made plagiarized copy of Deep Throat. The most touted tool for locating useful information on the World Wide Web is the search engine. However, due to the medium's flatness, sought information is drowned in a sea of useless information. A better solution is to build tools that allow authors to structure information so that it can easily be navigated. We experimented with the use of ontologies as a tool to formulate structures for information about a specific topic, so that related concepts are placed in adjacent locations and can easily be navigated using simple and ergonomic user models. We describe our effort in building a World Wide Web based photo album that is shared among a small network of people.

  16. Analysis of longitudinal variations in North Pacific alkalinity to improve predictive algorithms

    Science.gov (United States)

    Fry, Claudia H.; Tyrrell, Toby; Achterberg, Eric P.

    2016-10-01

    The causes of natural variation in alkalinity in the North Pacific surface ocean need to be investigated to understand the carbon cycle and to improve predictive algorithms. We used GLODAPv2 to test hypotheses on the causes of three longitudinal phenomena in Alk*, a tracer of calcium carbonate cycling. These phenomena are (a) an increase from east to west between 45°N and 55°N, (b) an increase from west to east between 25°N and 40°N, and (c) a minor increase from west to east in the equatorial upwelling region. Between 45°N and 55°N, Alk* is higher on the western than on the eastern side, and this is associated with denser isopycnals with higher Alk* lying at shallower depths. Between 25°N and 40°N, upwelling along the North American continental shelf causes higher Alk* in the east. Along the equator, a strong east-west trend was not observed, even though the upwelling on the eastern side of the basin is more intense, because the water brought to the surface is not high in Alk*. We created two algorithms to predict alkalinity, one for the entire Pacific Ocean north of 30°S and one for the eastern margin. The Pacific Ocean algorithm is more accurate than the commonly used algorithm published by Lee et al. (2006), of similar accuracy to the best previously published algorithm by Sasse et al. (2013), and is less biased with longitude than other algorithms in the subpolar North Pacific. Our eastern margin algorithm is more accurate than previously published algorithms.

  17. Types of Open Access Publishers in Scopus

    Directory of Open Access Journals (Sweden)

    David Solomon

    2013-05-01

    This study assessed characteristics of publishers who published 2010 open access (OA) journals indexed in Scopus. Publishers were categorized into six types: professional, society, university, scholar/researcher, government, and other organizations. Type of publisher was broken down by number of journals/articles published in 2010, funding model, location, discipline and whether the journal was born OA or converted to OA. Universities and societies accounted for 50% of the journals and 43% of the articles published. Professional publishers accounted for a third of the journals and 42% of the articles. With the exception of professional and scholar/researcher publishers, most journals were originally subscription journals that made at least their digital version freely available. Arts, humanities and social science journals are largely published by societies and universities outside the major publishing countries. Professional OA publishing is most common in biomedicine, mathematics, the sciences and engineering. Approximately a quarter of the journals are hosted on national/international platforms, in Latin America, Eastern Europe and Asia, largely published by universities and societies without the need for publishing fees. This type of collaboration between governments, universities and/or societies may be an effective means of expanding open access publications.

  18. What was hidden in the Publisher's Archive

    DEFF Research Database (Denmark)

    Mai, Anne-Marie

    2015-01-01

    On the Danish Author Elsa Gress and her correspondence on American Literature with the Publisher, K. E. Hermann, Arena.

  19. Desktop Publishing: Changing Technology, Changing Occupations.

    Science.gov (United States)

    Stanton, Michael

    1991-01-01

    Describes desktop publishing (DTP) and its place in corporations. Lists job titles of those working in desktop publishing and describes DTP as it is taught at secondary and postsecondary levels and by private trainers. (JOW)

  20. Making the Leap to Desktop Publishing.

    Science.gov (United States)

    Schleifer, Neal

    1986-01-01

    Describes one teacher's approach to desktop publishing. Explains how the Macintosh and LaserWriter were used in the publication of a school newspaper. Guidelines are offered to teachers for the establishment of a desktop publishing lab. (ML)

  1. Promises and Realities of Desktop Publishing.

    Science.gov (United States)

    Thompson, Patricia A.; Craig, Robert L.

    1991-01-01

    Examines the underlying assumptions of the rhetoric of desktop publishing promoters. Suggests four criteria to help educators provide insights into issues and challenges concerning desktop publishing technology that design students will face on the job. (MG)

  2. Electronic Journal Publishing: Observations from Inside.

    Science.gov (United States)

    Hunter, Karen

    1998-01-01

    Focuses on electronic scholarly-journal publishing. Discusses characteristics of current academic electronic publishing; effects of the World Wide Web; user needs and positions of academic libraries; costs; and decisions of research librarians that drive the industry. (AEF)

  3. DNABIT Compress - Genome compression algorithm.

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-22

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel algorithm of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Significantly better compression results show that "DNABIT Compress" is the best among the remaining compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves the running time of all previous DNA compression programs. Assigning binary bits (unique bit codes) to fragments of DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. This proposed new algorithm achieves a compression ratio as good as 1.58 bits/base, where the existing best methods could not achieve a ratio less than 1.72 bits/base.
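    For context, the naive fixed-code baseline that any DNA compressor must beat packs each of the four bases into 2 bits; DNABIT Compress improves on this by additionally assigning bit codes to repeated fragments. The sketch below implements only the naive 2-bit packing, not the published algorithm.

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASES = "ACGT"

def pack(seq):
    """Pack a DNA string at 2 bits/base into bytes (plus its length,
    needed to discard padding bits on decode)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        chunk = seq[i:i + 4]
        byte = 0
        for ch in chunk:
            byte = (byte << 2) | CODE[ch]
        byte <<= 2 * (4 - len(chunk))      # left-pad a short final byte
        out.append(byte)
    return bytes(out), len(seq)

def unpack(data, n):
    """Inverse of pack: recover the first n bases from the byte string."""
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(BASES[(byte >> shift) & 0b11])
    return "".join(seq[:n])

packed, n = pack("ACGTACGTAC")
assert unpack(packed, n) == "ACGTACGTAC"
```

    This baseline is fixed at exactly 2 bits/base; reaching 1.58 bits/base, as reported above, requires exploiting repeats and other sequence structure beyond a fixed per-base code.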

  4. Basics of Desktop Publishing. Second Edition.

    Science.gov (United States)

    Beeby, Ellen; Crummett, Jerrie

    This document contains teacher and student materials for a basic course in desktop publishing. Six units of instruction cover the following: (1) introduction to desktop publishing; (2) desktop publishing systems; (3) software; (4) type selection; (5) document design; and (6) layout. The teacher edition contains some or all of the following…

  5. The Changing Business of Scholarly Publishing.

    Science.gov (United States)

    Hunter, Karen

    1993-01-01

    Discussion of changes and trends in scholarly publishing highlights monographs; journals; user-centered publishing; electronic products and services, including adding value, marketing strategies, and new pricing systems; changing attitudes regarding copyright; trends in publishing industry reorganization; and impacts on research libraries. (LRW)

  6. A unified algorithm for predicting partition coefficients for PBPK modeling of drugs and environmental chemicals

    International Nuclear Information System (INIS)

    Peyret, Thomas; Poulin, Patrick; Krishnan, Kannan

    2010-01-01

    The algorithms in the literature for predicting the tissue:blood partition coefficient (Ptb) of environmental chemicals and the tissue:plasma partition coefficient based on total (Kp) or unbound concentration (Kpu) of drugs differ in their consideration of binding to hemoglobin, plasma proteins and charged phospholipids. The objective of the present study was to develop a unified algorithm such that Ptb, Kp and Kpu for both drugs and environmental chemicals could be predicted. The development of the unified algorithm was accomplished by integrating all mechanistic algorithms previously published to compute the PCs. Furthermore, the algorithm was structured in such a way as to facilitate predictions of the distribution of organic compounds at the macro (i.e., whole tissue) and micro (i.e., cells and fluids) levels. The resulting unified algorithm was applied to compute the rat Ptb, Kp or Kpu of muscle (n = 174), liver (n = 139) and adipose tissue (n = 141) for acidic, neutral, zwitterionic and basic drugs as well as ketones, acetate esters, alcohols, aliphatic hydrocarbons, aromatic hydrocarbons and ethers. The unified algorithm adequately reproduced the values predicted previously by the published algorithms for a total of 142 drugs and chemicals. The sensitivity analysis demonstrated the relative importance of the various compound properties reflective of specific mechanistic determinants relevant to prediction of PC values of drugs and environmental chemicals. Overall, the present unified algorithm uniquely facilitates the computation of macro- and micro-level PCs for developing organ- and cellular-level PBPK models for both chemicals and drugs.

  7. Parallel algorithms

    CERN Document Server

    Casanova, Henri; Robert, Yves

    2008-01-01

    ""…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  8. Algorithm 865

    DEFF Research Database (Denmark)

    Gustavson, Fred G.; Reid, John K.; Wasniewski, Jerzy

    2007-01-01

    We present subroutines for the Cholesky factorization of a positive-definite symmetric matrix and for solving corresponding sets of linear equations. They exploit cache memory by using the block hybrid format proposed by the authors in a companion article. The matrix is packed into n(n + 1)/2 real variables, and the speed is usually better than that of the LAPACK algorithm that uses full storage (n^2 variables). Included are subroutines for rearranging a matrix whose upper or lower-triangular part is packed by columns to this format and for the inverse rearrangement. Also included is a kernel
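    The n(n + 1)/2 figure comes from packed triangular storage. A minimal sketch of column-packed indexing for a lower-triangular matrix (the starting point that the block hybrid format rearranges for cache efficiency) is:

```python
def packed_index(i, j, n):
    """0-based flat index of element (i, j), i >= j, of an n x n
    lower-triangular matrix whose triangle is packed column by column
    into an array of n*(n+1)//2 entries -- the storage count quoted in
    the abstract. Column j starts after the n, n-1, ..., n-j+1 entries
    of the preceding columns."""
    assert 0 <= j <= i < n
    col_start = j * n - j * (j - 1) // 2   # entries in columns 0..j-1
    return col_start + (i - j)

n = 4
flat = [0.0] * (n * (n + 1) // 2)          # 10 entries instead of 16
flat[packed_index(3, 1, n)] = 7.0          # store element (row 3, col 1)
```

    Plain packed storage halves memory but strides awkwardly through cache; the paper's block hybrid format repacks these columns into small contiguous blocks to recover full-storage speed.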

  9. Peer-review: An IOP Publishing Perspective

    Science.gov (United States)

    Smith, Timothy

    2015-03-01

    Online publishing is challenging, and potentially changing, the role of publishers in both managing the peer-review process and disseminating the work that they publish in meeting contrasting needs from diverse groups of research communities. Recognizing the value of peer-review as a fundamental service to authors and the research community, the underlying principles of managing the process for journals published by IOP Publishing remain unchanged and yet the potential and demand for alternative models exists. This talk will discuss the traditional approach to peer-review placed in the context of this changing demand.

  10. Preoperative screening: value of previous tests.

    Science.gov (United States)

    Macpherson, D S; Snow, R; Lofgren, R P

    1990-12-15

    To determine the frequency of tests done in the year before elective surgery that might substitute for preoperative screening tests and to determine the frequency of test results that change from a normal value to a value likely to alter perioperative management. Retrospective cohort analysis of computerized laboratory data (complete blood count, sodium, potassium, and creatinine levels, prothrombin time, and partial thromboplastin time). Urban tertiary care Veterans Affairs Hospital. Consecutive sample of 1109 patients who had elective surgery in 1988. At admission, 7549 preoperative tests were done, 47% of which duplicated tests performed in the previous year. Of 3096 previous results that were normal as defined by the hospital reference range and done closest to the time of but before admission (median interval, 2 months), 13 (0.4%; 95% CI, 0.2% to 0.7%) repeat values were outside a range considered acceptable for surgery. Most of the abnormalities were predictable from the patient's history, and most were not noted in the medical record. Of 461 previous tests that were abnormal, 78 (17%; CI, 13% to 20%) repeat values at admission were outside a range considered acceptable for surgery (P less than 0.001, comparing the frequency of clinically important abnormalities in patients with normal previous results with that in patients with abnormal previous results). Physicians evaluating patients preoperatively could safely substitute the previous test results analyzed in this study for preoperative screening tests if the previous tests are normal and no obvious indication for retesting is present.

  11. Structural Revision of Some Recently Published Iridoid Glucosides

    DEFF Research Database (Denmark)

    Jensen, Søren Rosendal; Calis, Ihsan; Gotfredsen, Charlotte Held

    2007-01-01

    ). Finally, two alleged iridoid galactosides from Buddleja crispa named buddlejosides A and B (12a and 12b) have been shown to be the corresponding glucosides; the former is identical to agnuside (13a) while the latter is 3,4-dihydroxybenzoylaucubin (13b), an iridoid glucoside not previously published...

  12. Augmenting Data with Published Results in Bayesian Linear Regression

    Science.gov (United States)

    de Leeuw, Christiaan; Klugkist, Irene

    2012-01-01

    In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…

  13. Poet's Market, 1997: Where & How To Publish Your Poetry.

    Science.gov (United States)

    Martin, Christine, Ed.; Bentley, Chantelle, Ed.

    This directory provides 1700 listings and evaluations of poetry publishers--300 more than in the previous edition--along with complete submission and contact information. Listings include both domestic and international markets, from mass circulation and literary magazines to small presses and university quarterlies, and contain complete profiles…

  14. Automatic electromagnetic valve for previous vacuum

    International Nuclear Information System (INIS)

    Granados, C. E.; Martin, F.

    1959-01-01

    A valve is described which maintains the vacuum in an installation when the electric current fails. It also admits air into the backing ('previous') vacuum pump to prevent the oil from rising into the vacuum tubes. (Author)

  15. Ethical issues in publishing in predatory journals.

    Science.gov (United States)

    Ferris, Lorraine E; Winker, Margaret A

    2017-06-15

    Predatory journals, or journals that charge an article processing charge (APC) to authors yet do not have the hallmarks of legitimate scholarly journals such as peer review and editing, Editorial Boards, editorial offices, and other editorial standards, pose a number of new ethical issues in journal publishing. This paper discusses ethical issues around predatory journals and publishing in them. These issues include misrepresentation; lack of editorial and publishing standards and practices; academic deception; wasted research and funding; lack of archived content; and undermining of confidence in the research literature. It is important that the scholarly community, including authors, institutions, editors, and publishers, support the legitimate scholarly research enterprise, and avoid supporting predatory journals by not publishing in them, serving as their editors or on their Editorial Boards, or permitting faculty to knowingly publish in them without consequences.

  16. Kidnapping Detection and Recognition in Previous Unknown Environment

    Directory of Open Access Journals (Sweden)

    Yang Tian

    2017-01-01

    An unaware event referred to as kidnapping makes the estimation result of localization incorrect. In a previously unknown environment, an incorrect localization result caused by kidnapping leads to an incorrect mapping result in Simultaneous Localization and Mapping (SLAM). In this situation, the explored and unexplored areas are divided in a way that makes kidnapping recovery difficult. To provide sufficient information on kidnapping, a framework is proposed to judge whether kidnapping has occurred and to identify the type of kidnapping with filter-based SLAM. The framework, called double kidnapping detection and recognition (DKDR), performs two checks before and after the "update" process with different metrics in real time. To explain one of the principles of DKDR, we describe a property of filter-based SLAM that corrects the mapping result of the environment using the current observations after the "update" process. Two classical filter-based SLAM algorithms, Extended Kalman Filter (EKF) SLAM and Particle Filter (PF) SLAM, are modified to show that DKDR can be simply and widely applied in existing filter-based SLAM algorithms. Furthermore, a technique to determine adapted thresholds for the metrics in real time without previous data is presented. Both simulated and experimental results demonstrate the validity and accuracy of the proposed method.
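The two checks before and after the "update" step can be illustrated with a small sketch. The metric used here (squared Mahalanobis distance of the filter innovation) and the state labels are illustrative assumptions, not the paper's exact metrics or terminology:

```python
import numpy as np

def mahalanobis2(innovation, S):
    """Squared Mahalanobis distance of a 1-D measurement innovation
    with innovation covariance S (a common filter consistency metric)."""
    return float(innovation @ np.linalg.solve(S, innovation))

def kidnapping_check(d2_pre, d2_post, thr_pre, thr_post):
    """Double check in the spirit of DKDR: one metric evaluated before
    the filter 'update' and one after it, each with its own threshold.
    The returned labels are illustrative, not the paper's categories."""
    if d2_pre <= thr_pre:
        return "none"        # observations consistent with the prediction
    # inconsistency detected; the post-update metric separates the cases
    return "recoverable" if d2_post <= thr_post else "severe"
```

In a real filter the thresholds would be adapted online (e.g. from running statistics of the metric), as the abstract describes.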

  17. QlikView Server and Publisher

    CERN Document Server

    Redmond, Stephen

    2014-01-01

    This is a comprehensive guide with a step-by-step approach that enables you to host and manage servers using QlikView Server and QlikView Publisher.If you are a server administrator wanting to learn about how to deploy QlikView Server for server management,analysis and testing, and QlikView Publisher for publishing of business content then this is the perfect book for you. No prior experience with QlikView is expected.

  18. Strategic Brand Management Tools in Publishing

    OpenAIRE

    Pitsaki, Irini

    2011-01-01

    Further to the introduction of the brand concept evolution and theory, as well as the ways these operate in the publishing sector (see paper: Pitsaki, I. 2010), the present paper treats publishing strategies and the tools used to establish them. Publishers often base their brand strategy on classic marketing approaches, such as the marketing mix -product, price, promotion, placement and people. They also direct their products to specific market segments in regard to the type of content and te...

  19. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    Science.gov (United States)

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination (R²), while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
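The core geometric idea, translating each succeeding recession segment horizontally in time so that its vertex lands on the curve assembled so far, can be sketched as follows (an illustrative reimplementation of the idea, not the published VBA code):

```python
import numpy as np

def build_mrc(segments):
    """Assemble a master recession curve from recession segments.

    Each segment is a (t, h) pair of arrays with h strictly declining;
    segments are assumed ordered by decreasing vertex value. Every
    succeeding segment is shifted in time so that its vertex (its first,
    highest value) falls onto the curve built so far.
    """
    t0, h0 = segments[0]
    mrc_t = list(t0 - t0[0])
    mrc_h = list(h0)
    for t, h in segments[1:]:
        # time on the current curve at which the new vertex value occurs
        # (np.interp needs increasing x, hence the reversed arrays)
        t_vertex = float(np.interp(h[0], mrc_h[::-1], mrc_t[::-1]))
        mrc_t += list(t - t[0] + t_vertex)
        mrc_h += list(h)
    order = np.argsort(mrc_t)
    return np.array(mrc_t)[order], np.array(mrc_h)[order]
```

For example, a segment starting at value 7 is slid onto the point of the preceding segment where its value equals 7, so the overlapped segments form one declining curve.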

  20. Mergers, Acquisitions, and Access: STM Publishing Today

    Science.gov (United States)

    Robertson, Kathleen

    Electronic publishing is changing the fundamentals of the entire printing/delivery/archive system that has served as the distribution mechanism for scientific research over the last century and a half. The merger-mania of the last 20 years, preprint pools, and publishers' licensing and journals-bundling plans are among the phenomena impacting the scientific information field. Science-Technology-Medical (STM) publishing is experiencing a period of intense consolidation and reorganization. This paper gives an overview of the economic factors fueling these trends, the major STM publishers, and the government regulatory bodies that referee this industry in Europe, Canada, and the USA.

  1. Investigation of previously derived Hyades, Coma, and M67 reddenings

    International Nuclear Information System (INIS)

    Taylor, B.J.

    1980-01-01

    New Hyades polarimetry and field star photometry have been obtained to check the Hyades reddening, which was found to be nonzero in a previous paper. The new Hyades polarimetry implies essentially zero reddening; this is also true of polarimetry published by Behr (which was incorrectly interpreted in the previous paper). Four photometric techniques which are presumed to be insensitive to blanketing are used to compare the Hyades to nearby field stars; these four techniques also yield essentially zero reddening. When all of these results are combined with others which the author has previously published and a simultaneous solution for the Hyades, Coma, and M67 reddenings is made, the results are E(B-V) = 3 ± 2 (σ) mmag, −1 ± 3 (σ) mmag, and 46 ± 6 (σ) mmag, respectively. No support for a nonzero Hyades reddening is offered by the new results. When the newly obtained reddenings for the Hyades, Coma, and M67 are compared with results from techniques given by Crawford and by users of the David Dunlap Observatory photometric system, no differences between the new and other reddenings are found which are larger than about 2 sigma. The author had previously found that the M67 main-sequence stars have about the same blanketing as that of Coma and less blanketing than the Hyades; this conclusion is essentially unchanged by the revised reddenings.

  2. False gold: Safely navigating open access publishing to avoid predatory publishers and journals.

    Science.gov (United States)

    McCann, Terence V; Polacsek, Meg

    2018-04-01

    The aim of this study was to review and discuss predatory open access publishing in the context of nursing and midwifery and develop a set of guidelines that serve as a framework to help clinicians, educators and researchers avoid predatory publishers. Open access publishing is increasingly common across all academic disciplines. However, this publishing model is vulnerable to exploitation by predatory publishers, posing a threat to nursing and midwifery scholarship and practice. Guidelines are needed to help researchers recognize predatory journals and publishers and understand the negative consequences of publishing in them. Discussion paper. A literature search of BioMed Central, CINAHL, MEDLINE with Full Text and PubMed for terms related to predatory publishing, published in the period 2007-2017. Lack of awareness of the risks, and pressure to publish in international journals, may result in nursing and midwifery researchers publishing their work in dubious open access journals. Caution should be taken prior to writing and submitting a paper, to avoid predatory publishers. The advantage of open access publishing is that it provides readers with access to peer-reviewed research as soon as it is published online. However, predatory publishers use deceptive methods to exploit open access publishing for their own profit. Clear guidelines are needed to help researchers safely navigate open access publishing. A deeper understanding of the risks of predatory publishing is needed. Clear guidelines should be followed by nursing and midwifery researchers seeking to publish their work in open access journals. © 2017 John Wiley & Sons Ltd.

  3. Genetic algorithms and fuzzy multiobjective optimization

    CERN Document Server

    Sakawa, Masatoshi

    2002-01-01

    Since the introduction of genetic algorithms in the 1970s, an enormous number of articles together with several significant monographs and books have been published on this methodology. As a result, genetic algorithms have made a major contribution to optimization, adaptation, and learning in a wide variety of unexpected fields. Over the years, many excellent books in genetic algorithm optimization have been published; however, they focus mainly on single-objective discrete or other hard optimization problems under certainty. There appears to be no book that is designed to present genetic algorithms for solving not only single-objective but also fuzzy and multiobjective optimization problems in a unified way. Genetic Algorithms And Fuzzy Multiobjective Optimization introduces the latest advances in the field of genetic algorithm optimization for 0-1 programming, integer programming, nonconvex programming, and job-shop scheduling problems under multiobjectiveness and fuzziness. In addition, the book treats a w...
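As a minimal illustration of the kind of method the book covers, here is a toy generational GA for a 0-1 knapsack problem. This is a deliberately simple sketch (elitist selection, one-point crossover, bit-flip mutation); the book's fuzzy and multiobjective formulations are far more elaborate:

```python
import random

def ga_01(fitness, n_bits, pop_size=30, gens=60, seed=0):
    """Minimal generational GA for 0-1 programming problems."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the better half
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_bits)    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.1:            # occasional bit-flip mutation
                child[rng.randrange(n_bits)] ^= 1
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# toy 0-1 knapsack: maximise value subject to a weight capacity
weights, values, cap = [3, 4, 5, 2], [4, 5, 6, 3], 9
def fit(x):
    wt = sum(w * xi for w, xi in zip(weights, x))
    return sum(v * xi for v, xi in zip(values, x)) if wt <= cap else 0

best = ga_01(fit, n_bits=4)
```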

  4. Academic Publishing: Making the Implicit Explicit

    Directory of Open Access Journals (Sweden)

    Cecile Badenhorst

    2016-07-01

    For doctoral students, publishing in peer-reviewed journals is a task many face with anxiety and trepidation. The world of publishing, from choosing a journal, negotiating with editors and navigating reviewers' responses, is a bewildering place. Looking in from the outside, it seems that successful and productive academic writers have knowledge that is inaccessible to novice scholars. While there is a growing literature on writing for scholarly publication, many of these publications promote writing and publishing as a straightforward activity that anyone can achieve if they follow the rules. We argue that the specific and situated contexts in which academic writers negotiate publishing practices are more complicated and messy. In this paper, we attempt to make explicit our publishing processes to highlight the complex nature of publishing. We use autoethnographic narratives to provide discussion points and insights into the challenges of publishing peer-reviewed articles. One narrative is by a doctoral student at the beginning of her publishing career, who expresses her desires, concerns and anxieties about writing for publication. The other narrative focuses on the publishing practices of a more experienced academic writer. Both are international scholars working in the Canadian context. The purpose of this paper is to explore academic publishing through the juxtaposition of these two narratives to make explicit some of the more implicit processes. Four themes emerge from these narratives. To publish successfully, academic writers need: (1) to be discourse analysts; (2) to have a critical competence; (3) to have writing fluency; and (4) to be emotionally intelligent.

  5. Publish or perish: authorship and peer review

    Science.gov (United States)

    Publish or perish is defined in Wikipedia as the pressure to publish work constantly to further or sustain one’s career in academia. This is an apt description given that refereed scientific publications are the currency of science and the primary means for broad dissemination of knowledge. Professi...

  6. Open Access Publishing: What Authors Want

    Science.gov (United States)

    Nariani, Rajiv; Fernandez, Leila

    2012-01-01

    Campus-based open access author funds are being considered by many academic libraries as a way to support authors publishing in open access journals. Article processing fees for open access have been introduced recently by publishers and have not yet been widely accepted by authors. Few studies have surveyed authors on their reasons for publishing…

  7. Publisher Correction: Invisible Trojan-horse attack

    DEFF Research Database (Denmark)

    Sajeed, Shihan; Minshull, Carter; Jain, Nitin

    2017-01-01

    A correction to this article has been published and is linked from the HTML version of this paper. The error has been fixed in the paper.

  8. The cost of publishing in Danish astronomy

    DEFF Research Database (Denmark)

    Dorch, Bertil F.

    I investigate the cost of publishing in Danish astronomy on a fine scale, including all direct publication costs. The figures show how the annual number of publications with authors from Denmark in astronomy journals increased by a factor of approximately four during 15 years (Elsevier's Scopus database), and the corresponding increase in the potential (maximum) cost of publishing.

  9. Pages from the Desktop: Desktop Publishing Today.

    Science.gov (United States)

    Crawford, Walt

    1994-01-01

    Discusses changes that have made desktop publishing appealing and reasonably priced. Hardware, software, and printer options for getting started and moving on, typeface developments, and the key characteristics of desktop publishing are described. The author's notes on 33 articles from the personal computing literature from January-March 1994 are…

  10. Desktop Publishing for the Gifted/Talented.

    Science.gov (United States)

    Hamilton, Wayne

    1987-01-01

    Examines the nature of desktop publishing and how it can be used in the classroom for gifted/talented students. Characteristics and special needs of such students are identified, and it is argued that desktop publishing addresses those needs, particularly with regard to creativity. Twenty-six references are provided. (MES)

  11. Equity for open-access journal publishing.

    Directory of Open Access Journals (Sweden)

    Stuart M Shieber

    2009-08-01

    Open-access journals, which provide access to their scholarly articles freely and without limitations, are at a systematic disadvantage relative to traditional closed-access journal publishing and its subscription-based business model. A simple, cost-effective remedy to this inequity could put open-access publishing on a path to become a sustainable, efficient system.

  12. 20 CFR 902.3 - Published information.

    Science.gov (United States)

    2010-04-01

    20 CFR 902.3, Published information. Title 20, Employees' Benefits (2010-04-01). Joint Board for the Enrollment of Actuaries, Rules Regarding Availability of Information. § 902.3 Published information. (a) Federal Register. Pursuant to sections 552 and 553 of title 5 of the...

  13. Publishers' Sales Strategies: A Questionable Business.

    Science.gov (United States)

    Eaglen, Audrey B.

    1988-01-01

    Speed, fill rate, and discount are reasons why it is often preferable for libraries to order directly from publishers rather than through a distributor. Nevertheless, some publishers have decided not to accept orders from libraries and schools. This has had a deleterious effect on libraries and library collections. (MES)

  14. Scientific publishing: some food for thought

    Directory of Open Access Journals (Sweden)

    Vittorio Bo

    2007-03-01

    Scientific publishing, here considered in a broad sense as covering both specialised scientific journals and science popularisation works addressed to a wider audience, has been sailing in troubled waters for some years. The purpose of this brief article is to gather some possible food for thought.

  15. Electronic Publishing in Science: Changes and Risks.

    Science.gov (United States)

    Kinne, Otto

    1999-01-01

    Discussion of the Internet and the guidance of the World Wide Web Consortium focuses on scientific communication and electronic publishing. Considers the speed of communicating and disseminating information; quality issues; cost; library subscriptions; publishers; and risks and concerns, including the role of editors and reviewers or referees.…

  16. Another Interface: Electronic Publishing and Technical Services.

    Science.gov (United States)

    Yamamoto, Rumi

    1986-01-01

    Discusses the problems of assimilating electronic publishing within the technical services area of academic libraries: whether to consider electronic journals as acquisitions; how to catalog them; whether to charge users for access to them; and how to preserve online publications for future research. Future trends in electronic publishing are…

  17. Electronic Publishing: Introduction to This Issue.

    Science.gov (United States)

    Siegel, Martin A.

    1994-01-01

    Provides an overview of this special issue that addresses the possibilities and implications of electronic publishing and information dissemination as key components of effective education. Highlights include the theory and framework of electronic publishing; differences between electronic text and print; development of new educational materials;…

  18. New journals for publishing medical case reports.

    Science.gov (United States)

    Akers, Katherine G

    2016-04-01

    Because they do not rank highly in the hierarchy of evidence and are not frequently cited, case reports describing the clinical circumstances of single patients are seldom published by medical journals. However, many clinicians argue that case reports have significant educational value, advance medical knowledge, and complement evidence-based medicine. Over the last several years, a vast number (∼160) of new peer-reviewed journals have emerged that focus on publishing case reports. These journals are typically open access and have relatively high acceptance rates. However, approximately half of the publishers of case reports journals engage in questionable or "predatory" publishing practices. Authors of case reports may benefit from greater awareness of these new publication venues as well as an ability to discriminate between reputable and non-reputable journal publishers.

  19. Open Access, data capitalism and academic publishing.

    Science.gov (United States)

    Hagner, Michael

    2018-02-16

    Open Access (OA) is widely considered a breakthrough in the history of academic publishing, rendering the knowledge produced by the worldwide scientific community accessible to all. In numerous countries, national governments, funding institutions and research organisations have undertaken enormous efforts to establish OA as the new publishing standard. The benefits and new perspectives, however, cause various challenges. This essay addresses several issues, including that OA is deeply embedded in the logic and practices of data capitalism. Given that OA has proven an attractive business model for commercial publishers, the key predictions of OA-advocates, namely that OA would liberate both scientists and tax payers from the chains of global publishing companies, have not become true. In its conclusion, the paper discusses the opportunities and pitfalls of non-commercial publishing.

  20. Exploring Digital News Publishing Business Models

    DEFF Research Database (Denmark)

    Lindskow, Kasper

    News publishers in the industrialized world are experiencing a fundamental challenge to their business models because of the changing modes of consumption, competition, and production of their offerings that are associated with the emergence of the networked information society. The erosion of the traditional business models poses an existential threat to news publishing and has given rise to a continuing struggle among news publishers to design digital business models that will be sustainable in the future. This dissertation argues that a central and underresearched aspect of digital news publishing business models concerns the production networks that support the co-production of digital news offerings. To fill this knowledge gap, this dissertation explores the strategic design of the digital news publishing production networks that are associated with HTML-based news offerings on the open Web...

  1. A formal analysis of a dynamic distributed spanning tree algorithm

    NARCIS (Netherlands)

    Mooij, A.J.; Wesselink, J.W.

    2003-01-01

    We analyze the spanning tree algorithm in the IEEE 1394.1 draft standard, whose correctness has not previously been proved. This algorithm is a fully-dynamic distributed graph algorithm, which, in general, is hard to develop. The approach we use is to formally develop an algorithm that is
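The flavour of the static part of such an algorithm can be conveyed by a centralised simulation of IEEE 1394-style tree identification: a node adopts a parent once all but one of its neighbours have become its children. This is only an illustrative sketch; the draft-standard algorithm analysed in the paper is asynchronous, message-based, and handles dynamic topology changes:

```python
def tree_identify(adj):
    """Compute child -> parent links over an undirected tree given as an
    adjacency dict. Nodes repeatedly check whether exactly one neighbour
    is not yet their child; if so, that neighbour becomes their parent.
    The node left without a parent is the root; ties (the analogue of
    'root contention') are resolved here simply by iteration order."""
    parent = {}
    children = {n: set() for n in adj}
    changed = True
    while changed:
        changed = False
        for n in adj:
            if n in parent:
                continue
            pending = [m for m in adj[n] if m not in children[n]]
            if len(pending) == 1:
                parent[n] = pending[0]
                children[pending[0]].add(n)
                changed = True
    return parent
```

On the path graph a - b - c, leaves attach first and the last undecided node (here c) ends up as root.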

  2. 77 FR 70176 - Previous Participation Certification

    Science.gov (United States)

    2012-11-23

    ... participants' previous participation in government programs and ensure that the past record is acceptable prior to granting approval to participate... information is designed to be 100 percent automated and digital submission of all data and certifications is...

  3. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  4. Subsequent pregnancy outcome after previous foetal death

    NARCIS (Netherlands)

    Nijkamp, J. W.; Korteweg, F. J.; Holm, J. P.; Timmer, A.; Erwich, J. J. H. M.; van Pampus, M. G.

    Objective: A history of foetal death is a risk factor for complications and foetal death in subsequent pregnancies as most previous risk factors remain present and an underlying cause of death may recur. The purpose of this study was to evaluate subsequent pregnancy outcome after foetal death and to

  5. Algorithmic chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Fontana, W.

    1990-12-13

    In this paper complex adaptive systems are defined by a self-referential loop in which objects encode functions that act back on these objects. A model for this loop is presented. It uses a simple recursive formal language, derived from the lambda-calculus, to provide a semantics that maps character strings into functions that manipulate symbols on strings. The interaction between two functions, or algorithms, is defined naturally within the language through function composition, and results in the production of a new function. An iterated map acting on sets of functions and a corresponding graph representation are defined. Their properties are useful to discuss the behavior of a fixed size ensemble of randomly interacting functions. This "function gas", or "Turing gas", is studied under various conditions, and evolves cooperative interaction patterns of considerable intricacy. These patterns adapt under the influence of perturbations consisting of the addition of new random functions to the system. Different organizations emerge depending on the availability of self-replicators.
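The collision rule of such a "Turing gas" (two functions interact by composition, and the product re-enters a fixed-size ensemble) can be caricatured in a few lines; plain integer functions stand in here for the paper's lambda-calculus terms:

```python
import random

def collide(pop, rng):
    """One collision: pick two functions, compose them, and let the
    product replace a random member so the ensemble size stays fixed."""
    f, g = rng.sample(pop, 2)
    product = lambda x, f=f, g=g: f(g(x))   # interaction = composition
    pop[rng.randrange(len(pop))] = product

rng = random.Random(0)
pop = [lambda x: x + 1, lambda x: 2 * x, lambda x: x * x]
for _ in range(100):
    collide(pop, rng)
```

After many collisions the ensemble contains increasingly nested compositions of the initial functions; the paper studies the interaction structure that emerges from this iterated map.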

  6. Sensitivity of NTCP parameter values against a change of dose calculation algorithm

    International Nuclear Information System (INIS)

    Brink, Carsten; Berg, Martin; Nielsen, Morten

    2007-01-01

    Optimization of radiation treatment planning requires estimations of the normal tissue complication probability (NTCP). A number of models exist that estimate NTCP from a calculated dose distribution. Since different dose calculation algorithms use different approximations the dose distributions predicted for a given treatment will in general depend on the algorithm. The purpose of this work is to test whether the optimal NTCP parameter values change significantly when the dose calculation algorithm is changed. The treatment plans for 17 breast cancer patients have retrospectively been recalculated with a collapsed cone algorithm (CC) to compare the NTCP estimates for radiation pneumonitis with those obtained from the clinically used pencil beam algorithm (PB). For the PB calculations the NTCP parameters were taken from previously published values for three different models. For the CC calculations the parameters were fitted to give the same NTCP as for the PB calculations. This paper demonstrates that significant shifts of the NTCP parameter values are observed for three models, comparable in magnitude to the uncertainties of the published parameter values. Thus, it is important to quote the applied dose calculation algorithm when reporting estimates of NTCP parameters in order to ensure correct use of the models
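As a concrete example of the model class involved, a Lyman-Kutcher-Burman-style NTCP calculation from a differential dose-volume histogram might look like this (one common NTCP model family; the parameter values in the usage line are illustrative, not those fitted in the paper):

```python
from math import erf, sqrt

def lkb_ntcp(doses, volumes, td50, m, n):
    """LKB NTCP: reduce the DVH to a generalised equivalent uniform dose
    (gEUD) with volume parameter n, then map it through a normal CDF
    with position TD50 and slope parameter m."""
    total = sum(volumes)
    geud = sum(v / total * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n
    t = (geud - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))  # standard normal CDF

# uniform dose equal to TD50 gives NTCP = 0.5 by construction
print(lkb_ntcp([30.0], [1.0], td50=30.0, m=0.3, n=1.0))  # prints: 0.5
```

Because the dose distribution entering the DVH depends on the dose calculation algorithm (pencil beam vs. collapsed cone), the fitted TD50, m, and n values shift with the algorithm, which is the point the abstract makes.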

  7. Critical analysis of marketing in Croatian publishing

    Directory of Open Access Journals (Sweden)

    Silvija Gašparić

    2018-03-01

    Marketing is an inevitable part of today's modern lifestyle; its role has grown so large that it has become a central part of business. Because of the crisis that is still affecting publishers in Croatia, this paper emphasizes the power of advertising as a key ingredient in overcoming this situation and upgrading the system of publishing in Croatia. The framework of the paper is based on marketing as a tool that leads to the popularization of books and to increased sales. Besides the experimental part, which gives an insight into the public's opinion about books, publishing and marketing, the first chapter gives a literature review and an analysis of the whole process of book publishing in Croatia, pointing out mistakes that Croatian publishers make. The benefits of foreign publishing are also discussed and used for comparison with, and projection onto, the problems of the native market. The aim of this analysis and this viewpoint paper is to contribute to the comprehension of marketing strategies and activities and their use and gains in Croatian publishing.

  8. Navigating the heavy seas of online publishing

    DEFF Research Database (Denmark)

    Carpentier, Samuel; Dörry, Sabine; Lord, Sébastien

    2015-01-01

    Articulo – Journal of Urban Research celebrates its 10th anniversary! To celebrate this milestone, the current editors discuss the numerous changes and challenges related to publishing a peer-reviewed online journal. Since 2005, Articulo has progressively become more international, more professio... rough seas of online publishing in the future.

  9. Preparing and Publishing a Scientific Manuscript

    Directory of Open Access Journals (Sweden)

    Padma R Jirge

    2017-01-01

    Publishing original research in a peer-reviewed and indexed journal is an important milestone for a scientist or a clinician. It is an important parameter to assess academic achievements. However, technical and language barriers may prevent many enthusiasts from ever publishing. This review highlights the important preparatory steps for creating a good manuscript and the most widely used IMRaD (Introduction, Materials and Methods, Results, and Discussion) method for writing a good manuscript. It also provides a brief overview of the submission and review process of a manuscript for publishing in a biomedical journal.

  10. [SciELO: method for electronic publishing].

    Science.gov (United States)

    Laerte Packer, A; Rocha Biojone, M; Antonio, I; Mayumi Takemaka, R; Pedroso García, A; Costa da Silva, A; Toshiyuki Murasaki, R; Mylek, C; Carvalho Reisl, O; Rocha F Delbucio, H C

    2001-01-01

    It describes the SciELO (Scientific Electronic Library Online) Methodology for electronic publishing of scientific periodicals, examining issues such as the transition from traditional printed publication to electronic publishing, the scientific communication process, the principles that founded the methodology's development, its application in building the SciELO site, its modules and components, the tools used for its construction, etc. The article also discusses the potentialities and trends for the area in Brazil and Latin America, pointing out questions and proposals which should be investigated and solved by the methodology. It concludes that the SciELO Methodology is an efficient, flexible and comprehensive solution for scientific electronic publishing.

  11. Open Access Publishing in the Electronic Age.

    Science.gov (United States)

    Kovács, Gábor L

    2014-10-01

    The principle of open-access (OA) publishing is increasingly prevalent in the field of laboratory medicine as well. Open-access journals (OAJs) are available online to the reader, usually without financial, legal, or technical barriers. Some are subsidized, and some require payment on behalf of the author. OAJs are one of the two general methods for providing OA; the other is self-archiving in a repository. The electronic journal of the IFCC (eJIFCC) is a platinum OAJ, i.e., there is no charge to read or to submit to this journal. Traditionally, the author was required to transfer the copyright to the journal publisher. Publishers claimed this was necessary in order to protect authors' rights. However, many authors found this unsatisfactory, and have used their influence to effect a gradual move towards a license to publish instead. Under such a system, the publisher has permission to edit, print, and distribute the article commercially, but the author(s) retain the other rights themselves. An OA mandate is a policy adopted by a research institution, research funder, or government which requires researchers to make their published, peer-reviewed journal articles and conference papers OA, either by self-archiving their peer-reviewed drafts in a repository ("green OA") or by publishing them in an OAJ ("gold OA"). Creative Commons (CC) is a nonprofit organization that enables the sharing and use of creativity and knowledge through free legal tools. The free, easy-to-use copyright licenses provide a simple, standardized way to give the public permission to share and use creative work. CC licenses let you easily change your copyright terms from the default of "all rights reserved" to "some rights reserved." OA publishing also raises a number of new ethical problems (e.g. predatory publishers, fake papers). Laboratory scientists are encouraged to publish their scientific results OA (especially in eJIFCC). They should, however, be aware of their rights, institutional mandate

  12. Subsequent childbirth after a previous traumatic birth.

    Science.gov (United States)

    Beck, Cheryl Tatano; Watson, Sue

    2010-01-01

    Nine percent of new mothers in the United States who participated in the Listening to Mothers II Postpartum Survey screened positive for meeting the Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition criteria for posttraumatic stress disorder after childbirth. Women who have had a traumatic birth experience report fewer subsequent children and a longer length of time before their second baby. Childbirth-related posttraumatic stress disorder impacts couples' physical relationship, communication, conflict, emotions, and bonding with their children. The purpose of this study was to describe the meaning of women's experiences of a subsequent childbirth after a previous traumatic birth. Phenomenology was the research design used. An international sample of 35 women participated in this Internet study. Women were asked, "Please describe in as much detail as you can remember your subsequent pregnancy, labor, and delivery following your previous traumatic birth." Colaizzi's phenomenological data analysis approach was used to analyze the stories of the 35 women. Data analysis yielded four themes: (a) riding the turbulent wave of panic during pregnancy; (b) strategizing: attempts to reclaim their body and complete the journey to motherhood; (c) bringing reverence to the birthing process and empowering women; and (d) still elusive: the longed-for healing birth experience. Subsequent childbirth after a previous birth trauma has the potential to either heal or retraumatize women. During pregnancy, women need permission and encouragement to grieve their prior traumatic births to help remove the burden of their invisible pain.

  13. Quantum algorithm for linear regression

    Science.gov (United States)

    Wang, Guoming

    2017-07-01

    We present a quantum algorithm for fitting a linear regression model to a given data set using the least-squares approach. Differently from previous algorithms, which yield a quantum state encoding the optimal parameters, our algorithm outputs these numbers in classical form. So by running it once, one completely determines the fitted model and can then use it to make predictions on new data at little cost. Moreover, our algorithm works in the standard oracle model and can handle data sets with nonsparse design matrices. It runs in time poly(log(N), d, κ, 1/ε), where N is the size of the data set, d is the number of adjustable parameters, κ is the condition number of the design matrix, and ε is the desired precision in the output. We also show that the polynomial dependence on d and κ is necessary; thus, our algorithm cannot be significantly improved. Furthermore, we give a quantum algorithm that estimates the quality of the least-squares fit (without computing its parameters explicitly). This algorithm runs faster than the one for finding the fit, and can be used to check whether the given data set qualifies for linear regression in the first place.
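    The classical object this algorithm outputs can be illustrated directly: a minimal NumPy sketch of the least-squares fit and the condition number κ that governs the stated runtime. The data here are synthetic and illustrative, and this is of course the classical computation, not the quantum procedure:

    ```python
    import numpy as np

    # Design matrix X (N samples, d adjustable parameters) and noisy targets y.
    rng = np.random.default_rng(0)
    N, d = 100, 3
    X = rng.normal(size=(N, d))
    true_beta = np.array([2.0, -1.0, 0.5])   # illustrative "ground truth"
    y = X @ true_beta + 0.01 * rng.normal(size=N)

    # Classical least-squares solution: beta = argmin ||X beta - y||^2.
    beta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)

    # Condition number kappa of the design matrix (ratio of extreme singular
    # values) is the quantity that enters the quantum algorithm's runtime.
    kappa = sv[0] / sv[-1]

    print(np.round(beta, 2))  # close to true_beta
    ```

    With well-conditioned X (small κ) and low noise, the recovered parameters match the ground truth to high precision.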

  14. 18th International Conference on Electronic Publishing

    CERN Document Server

    Dobreva, Milena

    2014-01-01

    The ways in which research data is used and handled continue to capture public attention and are the focus of increasing interest. Electronic publishing is intrinsic to digital data management, and relevant to the fields of data mining, digital publishing and social networks, with their implications for scholarly communication, information services, e-learning, e-business and the cultural heritage sector. This book presents the proceedings of the 18th International Conference on Electronic Publishing (ELPUB), held in Thessaloniki, Greece, in June 2014. The conference brings together researchers and practitioners to discuss the many aspects of electronic publishing, and the theme this year is 'Let's put data to use: digital scholarship for the next generation'. As well as examining the role of cultural heritage and service organisations in the creation, accessibility, duration and long-term preservation of data, it provides a discussion forum for the appraisal, citation and licensing of research data and the n...

  15. Predatory publishing and cybercrime targeting academics.

    Science.gov (United States)

    Umlauf, Mary Grace; Mochizuki, Yuki

    2018-04-01

    The purpose of this report is to inform and warn academics about practices used by cybercriminals who seek to profit from unwary scholars and undermine the industry of science. This report describes the signs, symptoms, characteristics, and consequences of predatory publishing and related forms of consumer fraud. Methods to curb these cybercrimes include educating scholars and students about tactics used by predatory publishers; institutional changes in how faculty are evaluated using publications; soliciting cooperation from the industries that support academic publishing and indexing to curb incorporation of illegitimate journals; and taking an offensive position by reporting these consumer fraud crimes to the authorities. Over and above the problem of publishing good science in fraudulent journals, disseminating and citing poor-quality research threaten the credibility of science and of nursing. © 2018 John Wiley & Sons Australia, Ltd.

  16. Design Options for a Desktop Publishing Course.

    Science.gov (United States)

    Mayer, Kenneth R.; Nelson, Sandra J.

    1992-01-01

    Offers recommendations for development of an undergraduate desktop publishing course. Discusses scholastic level and prerequisites, purpose and objectives, instructional resources and methodology, assignments and evaluation, and a general course outline. (SR)

  17. Open Access publishing in physics gains momentum

    CERN Multimedia

    2006-01-01

    The first meeting of European particle physics funding agencies took place on 3 November at CERN to establish a consortium for Open Access publishing in particle physics, SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). Open Access could transform the academic publishing world, with a great impact on research. The traditional model of research publication is funded through reader subscriptions. Open Access will turn this model on its head by changing the funding structure of research results, without increasing the overall cost of publishing. Instead of demanding payment from readers, publications will be distributed free of charge, financed by funding agencies via laboratories and the authors. This new concept will bring greater benefits and broaden opportunities for researchers and funding agencies by providing unrestricted distribution of the results of publicly funded research. The meeting marked a positive step forward, with international support from laboratories, fundin...

  18. INNOVATION MANAGEMENT TOOLS IN PUBLISHING COMPANIES

    Directory of Open Access Journals (Sweden)

    A. Shegda

    2013-09-01

    Full Text Available This article is devoted to the highly topical issue of innovation management in the modern publishing business, where the introduction of technological innovation is seen as a promising strategy for the constructive development of the industry. The paper deals with the main problems in managing publishing companies and reviews the available innovation management tools. The article examines the problem of the declining trend in book sales, which requires publishers to introduce innovative methods of production and distribution. The process of innovation management draws on basic tools such as marketing innovation, benchmarking, franchising, and engineering innovation. The aim of the article is thus to analyze the modern tools of innovation management in the publishing field.

  19. Monitoring Information By Industry - Printing and Publishing

    Science.gov (United States)

    Stationary source emissions monitoring is required to demonstrate that a source is meeting the requirements in Federal or state rules. This page is about control techniques used to reduce pollutant emissions in the printing and publishing industry.

  20. Printing and Publishing Industry Training Board

    Science.gov (United States)

    Industrial Training International, 1974

    1974-01-01

    Described is the supervisory training program currently in operation in the printing and publishing industry. The purpose of the training program is to increase managerial efficiency and to better prepare new supervisors. (DS)

  1. NSA Diana Wueger Published in Washington Quarterly

    OpenAIRE

    Grant, Catherine L.

    2016-01-01

    National Security Affairs (NSA) News NSA Faculty Associate for Research Diana Wueger has recently had an article titled “India’s Nuclear-Armed Submarines: Deterrence or Danger?” published in the Washington Quarterly.

  2. Multi-scale graph-cut algorithm for efficient water-fat separation.

    Science.gov (United States)

    Berglund, Johan; Skorpil, Mikael

    2017-09-01

    To improve the accuracy and robustness to noise in water-fat separation by unifying the multi-scale and graph-cut based approaches to B0 correction. A previously proposed water-fat separation algorithm that corrects for B0 field inhomogeneity in 3D by a single quadratic pseudo-Boolean optimization (QPBO) graph cut was incorporated into a multi-scale framework, where field-map solutions are propagated from coarse to fine scales for voxels that are not resolved by the graph cut. The accuracy of the single-scale and multi-scale QPBO algorithms was evaluated against benchmark reference datasets. The robustness to noise was evaluated by adding noise to the input data prior to water-fat separation. Both algorithms achieved the highest accuracy when compared with seven previously published methods, while computation times were acceptable for implementation in clinical routine. The multi-scale algorithm was more robust to noise than the single-scale algorithm, while causing only a small increase (+10%) in reconstruction time. The proposed 3D multi-scale QPBO algorithm offers accurate water-fat separation, robustness to noise, and fast reconstruction. The software implementation is freely available to the research community. Magn Reson Med 78:941-949, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  3. Decentralized provenance-aware publishing with nanopublications

    Directory of Open Access Journals (Sweden)

    Tobias Kuhn

    2016-08-01

    Full Text Available Publication and archival of scientific results is still commonly considered the responsibility of classical publishing companies. Classical forms of publishing, however, which center around printed narrative articles, no longer seem well-suited in the digital age. In particular, there currently exist no efficient, reliable, and agreed-upon methods for publishing scientific datasets, which have become increasingly important for science. In this article, we propose to design scientific data publishing as a web-based bottom-up process, without top-down control of central authorities such as publishing companies. Based on a novel combination of existing concepts and technologies, we present a server network to decentrally store and archive data in the form of nanopublications, an RDF-based format to represent scientific data. We show how this approach allows researchers to publish, retrieve, verify, and recombine datasets of nanopublications in a reliable and trustworthy manner, and we argue that this architecture could be used as a low-level data publication layer to serve the Semantic Web in general. Our evaluation of the current network shows that this system is efficient and reliable.

  4. Electronic astronomical information handling and flexible publishing.

    Science.gov (United States)

    Heck, A.

    The current dramatic evolution in information technology is bringing major modifications to the way scientists work and communicate. The concept of electronic information handling encompasses the diverse types of information, the different media, and the various communication methodologies and technologies. It ranges from the very collection of data to the final publication of results and sharing of knowledge. New problems and challenges also result from the new information culture, especially on legal, ethical, and educational grounds. Electronic publishing will have to diverge from being an electronic version of contributions on paper and will be part of a more general flexible-publishing policy. The benefits of private publishing are questioned. The procedures for validating published material and for evaluating scientific activities will have to be adjusted too. Provision of electronic refereed information independently of commercial publishers is now feasible. Scientists and scientific institutions now have the possibility to run an efficient information server with validated (refereed) material without the help of a commercial publisher.

  5. Electronic publishing and Acupuncture in Medicine.

    Science.gov (United States)

    White, Adrian

    2006-09-01

    The internet has fundamentally altered scientific publishing; this article discusses current models and how they affect this journal. The greatest innovation is a new range of open access journals published only on the internet, aimed at rapid publication and universal access. In most cases authors pay a publication charge for the overhead costs of the journal. Journals that are published by professional organisations primarily for their members have some functions other than publishing research, including clinical articles, conference reports and news items. A small number of these journals are permitting open access to their research reports. Commercial science publishing still exists, where profit for shareholders provides motivation in addition to the desire to spread knowledge for the benefit of all. A range of electronic databases now exists that offer various levels of listing and searching. Some databases provide direct links to journal articles, such as the LinkOut scheme in PubMed. Acupuncture in Medicine will continue to publish in paper format; all research articles will be available on open access, but non-subscribers will need to pay for certain other articles for the first 12 months after publication. All Acupuncture in Medicine articles will in future be included in the LinkOut scheme, and be presented to the databases electronically.

  6. Analysis of thirteen predatory publishers: a trap for eager-to-publish researchers.

    Science.gov (United States)

    Bolshete, Pravin

    2018-01-01

    To demonstrate a strategy employed by predatory publishers to trap eager-to-publish authors or researchers into submitting their work. This was a case study of 13 potential, possible, or probable predatory scholarly open-access publishers with similar characteristics. Eleven publishers were included from Beall's list and two additional publishers were identified from a Google web search. Each publisher's site was visited and its content analyzed. Publishers publishing biomedical journals were further explored and additional data were collected regarding their volumes, details of publications and editorial-board members. Overall, the look and feel of all 13 publishers was similar, including the names of publishers, website addresses, homepage content, homepage images, and lists of journals and subject areas, as if they had been copied and pasted. There were discrepancies in article-processing charges among the publishers. None of the publishers identified names in their contact details, which primarily included only email addresses. Author instructions were similar across all 13 publishers. Most publishers listed journals of varied subject areas, including biomedical journals (12 publishers), covering different geographic locations. Most biomedical journals published no articles or very few. The highest number of articles published by any single biomedical journal was 28. Several editorial-board members were listed across more than one journal, with one member listed 81 times in 69 different journals (i.e. twice in 12 journals). There was strong reason to believe that predatory publishers may operate several publication houses with different names under a single roof to trap authors from different geographic locations.

  7. Recently Published Lectures and Tutorials for ATLAS

    CERN Multimedia

    J. Herr

    2006-01-01

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project, a collaboration between the University of Michigan and CERN, has developed a synchronized system for recording and publishing educational multimedia presentations, using the Web as medium. The current system, including future developments for the project and the field in general, was recently presented at the CHEP 2006 conference in Mumbai, India. The relevant presentations and papers can be found here: The Web Lecture Archive Project A Web Lecture Capture System with Robotic Speaker Tracking This year, the University of Michigan team has been asked to record and publish all ATLAS Plenary sessions, as well as a large number of Physics and Computing tutorials. A significant amount of this material has already been published and can be accessed via the links below. All lectures can be viewed on any major platform with any common internet browser, either via streaming or local download (for limited bandwidth). Please enjoy the l...

  8. Recently Published Lectures and Tutorials for ATLAS

    CERN Multimedia

    Goldfarb, S.

    2006-01-01

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project, WLAP, a collaboration between the University of Michigan and CERN, has developed a synchronized system for recording and publishing educational multimedia presentations, using the Web as medium. The current system, including future developments for the project and the field in general, was recently presented at the CHEP 2006 conference in Mumbai, India. The relevant presentations and papers can be found here: The Web Lecture Archive Project. A Web Lecture Capture System with Robotic Speaker Tracking This year, the University of Michigan team has been asked to record and publish all ATLAS Plenary sessions, as well as a large number of Physics and Computing tutorials. A significant amount of this material has already been published and can be accessed via the links below. All lectures can be viewed on any major platform with any common internet browser, either via streaming or local download (for limited bandwidth). Please e...

  9. Electronic Publishing or Electronic Information Handling?

    Science.gov (United States)

    Heck, A.

    The current dramatic evolution in information technology is bringing major modifications to the way scientists communicate. The concept of 'electronic publishing' is too restrictive and often has different, sometimes conflicting, interpretations. It is thus giving way to the broader notion of 'electronic information handling', encompassing the diverse types of information, the different media, and the various communication methodologies and technologies. New problems and challenges also result from this new information culture, especially on legal, ethical, and educational grounds. The procedures for validating 'published material' and for evaluating scientific activities will have to be adjusted too. 'Fluid' information is becoming a common concept. Electronic publishing cannot be conceived without links to knowledge bases and intelligent information retrieval tools.

  10. Optimization of wind farm turbines layout using an evolutive algorithm

    International Nuclear Information System (INIS)

    Gonzalez, Javier Serrano; Santos, Jesus Riquelme; Payan, Manuel Burgos; Gonzalez Rodriguez, Angel G.; Mora, Jose Castro

    2010-01-01

    The optimum wind farm configuration problem is discussed in this paper and an evolutive algorithm to optimize the wind farm layout is proposed. The algorithm's optimization process is based on a global wind farm cost model using the initial investment and the present value of the yearly net cash flow during the entire wind-farm life span. The proposed algorithm calculates the yearly income due to the sale of the net generated energy, taking into account each wind turbine's loss of production due to wake decay effects, and it can deal with terrains with non-uniform load-bearing soil capacity and a different roughness length for every wind direction, as well as restrictions such as forbidden areas or limitations on the number of wind turbines or the investment. The results are first favorably compared with those previously published, and a second collection of test cases is used to prove the performance and suitability of the proposed evolutive algorithm for finding the optimum wind farm configuration. (author)
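    The abstract's evolutionary loop (evaluate a layout-cost model, select, vary) can be sketched with a toy algorithm. The objective below is a deliberately simplified stand-in for the paper's cash-flow model: it merely rewards turbine spacing as a proxy for reduced wake losses, and all names and parameters are our own assumptions:

    ```python
    import random

    # Toy evolutionary sketch of wind-farm layout optimisation on a 10x10 grid.
    GRID, N_TURBINES, POP, GENS = 10, 5, 30, 40
    random.seed(1)

    def fitness(layout):
        # Proxy objective: sum of pairwise squared distances between turbines.
        # More spacing stands in for fewer wake losses and higher net income.
        return sum((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
                   for i, a in enumerate(layout) for b in layout[i + 1:])

    def random_layout():
        cells = [(x, y) for x in range(GRID) for y in range(GRID)]
        return random.sample(cells, N_TURBINES)

    def mutate(layout):
        # Move one randomly chosen turbine to a random grid cell.
        child = list(layout)
        child[random.randrange(N_TURBINES)] = (random.randrange(GRID),
                                               random.randrange(GRID))
        return child

    pop = [random_layout() for _ in range(POP)]
    init_best = max(fitness(l) for l in pop)
    for _ in range(GENS):
        pop.sort(key=fitness, reverse=True)
        elite = pop[:POP // 2]                       # elitist selection
        pop = elite + [mutate(random.choice(elite))
                       for _ in range(POP - len(elite))]

    best = max(pop, key=fitness)
    print(fitness(best))
    ```

    Because the top half of the population is carried over each generation, the best layout's fitness never decreases; the real algorithm replaces this proxy with the full investment-plus-cash-flow model and handles terrain and wind-direction constraints.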

  11. Electronic publishing and intelligent information retrieval

    Science.gov (United States)

    Heck, A.

    1992-01-01

    Europeans are now taking steps to homogenize policies and standardize procedures in electronic publishing (EP) in astronomy and space sciences. This arose from an open meeting organized in Oct. 1991 at Strasbourg Observatory (France) and another business meeting held late Mar. 1992 with the major publishers and journal editors in astronomy and space sciences. The ultimate aim of EP might be considered the so-called 'intelligent information retrieval' (IIR), better named 'advanced information retrieval' (AIR), taking advantage of the fact that the material to be published appears at some stage in machine-readable form. It is obvious that the combination of desktop and electronic publishing with networking and new structuring of knowledge bases will profoundly reshape not only our ways of publishing, but also our procedures for communicating and retrieving information. It should be noted that a world-wide survey among astronomers and space scientists, carried out before the October 1991 colloquium, on the various packages and machines used indicated that TEX-related packages were already the most widely used in our community. It has also been stressed at each meeting that the European developments should be carried out in collaboration with what is done in the US (the STELLAR project, for instance). American scientists and journal editors actually attended both meetings mentioned above. The paper will offer a review of the status of electronic publishing in astronomy and its possible contribution to advanced information retrieval in this field. It will also report on recent meetings such as the 'Astronomy from Large Databases-2 (ALD-2)' conference, dealing with the latest developments in networking; in data, information, and knowledge bases; and in the related methodologies.

  12. Introduction to scientific publishing backgrounds, concepts, strategies

    CERN Document Server

    Öchsner, Andreas

    2013-01-01

    This book is a very concise introduction to the basic knowledge of scientific publishing. It starts with the basics of writing a scientific paper and recalls the different types of scientific documents. It gives an overview of the major scientific publishing companies and their different business models. The book also introduces abstracting and indexing services and how they can be used for the evaluation of science, scientists, and institutions. Last but not least, this short book addresses the problem of plagiarism and publication ethics.

  13. Publishing activities improves undergraduate biology education.

    Science.gov (United States)

    Smith, Michelle K

    2018-06-01

    To improve undergraduate biology education, there is an urgent need for biology instructors to publish their innovative active-learning instructional materials in peer-reviewed journals. To do this, instructors can measure student knowledge about a variety of biology concepts, iteratively design activities, explore student learning outcomes and publish the results. Creating a set of well-vetted activities, searchable through a journal interface, saves other instructors time and encourages the use of active-learning instructional practices. For authors, these publications offer new opportunities to collaborate and can provide evidence of a commitment to using active-learning instructional techniques in the classroom.

  14. Publishing to become an 'ideal academic'

    DEFF Research Database (Denmark)

    Lund, Rebecca

    2012-01-01

    over a two-year period in a recently merged Finnish university. I focus specifically on how a translocal discourse of competitive performance measurement and standards of academic excellence is accomplished in the local construction of the “ideal academic” as a person who publishes articles in A-level journals. While the construct is hard for anyone to live up to, it would seem to be more difficult for some people than for others. The current obsession with getting published in top journals places those women who are heavily engaged in teaching activities and have responsibilities besides academic work...

  15. Advances in semantic authoring and publishing

    CERN Document Server

    Groza, T

    2012-01-01

    Dissemination can be seen as a communication process between scientists. Over the course of several publications, they expose and support their findings, while discussing stated claims. Such discourse structures are trapped within the content of the publications, thus making the semantics discoverable only by humans. In addition, the lack of advances in scientific publishing, where electronic publications are still used as simple projections of paper documents, combined with the current growth in the amount of scientific research being published, transforms the process of finding relevant lite

  16. Publish Subscribe Systems Design and Principles

    CERN Document Server

    Tarkoma, Sasu

    2012-01-01

    This book offers a unified treatment of the problems solved by publish/subscribe and of how to design and implement the solutions. In this book, the author provides insight into publish/subscribe technology, including the design, implementation, and evaluation of new systems based on the technology. The book also addresses the basic design patterns and solutions, and discusses their application in practical scenarios. Furthermore, the author examines current standards and industry best practices as well as recent research proposals in the area. Finally, necessary content ma
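    The core pattern the book treats can be sketched in a few lines: a broker decouples publishers from subscribers so that neither knows the other. This minimal topic-based broker is our own illustrative sketch, not an API from the book:

    ```python
    from collections import defaultdict

    class Broker:
        """Minimal synchronous topic-based publish/subscribe broker."""

        def __init__(self):
            self._subs = defaultdict(list)   # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self._subs[topic].append(callback)

        def publish(self, topic, message):
            # Decoupling: the publisher never learns who the subscribers are.
            for cb in self._subs[topic]:
                cb(message)

    broker = Broker()
    received = []
    broker.subscribe("alerts", received.append)
    broker.publish("alerts", "disk full")    # delivered
    broker.publish("other", "ignored")       # no subscribers on this topic
    print(received)  # -> ['disk full']
    ```

    Real systems add content-based filtering, topic hierarchies, asynchronous delivery, and distribution across brokers, which are exactly the design dimensions the book evaluates.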

  17. Scholarly publishing depends on peer reviewers.

    Science.gov (United States)

    Fernandez-Llimos, Fernando

    2018-01-01

    The peer-review crisis is posing a risk to the scholarly peer-reviewed journal system. Journals have to ask many potential peer reviewers to obtain a minimum acceptable number of peers willing to review a manuscript. Several solutions have been suggested to overcome this shortage, from reimbursing reviewers for the job to eliminating pre-publication review; one cannot predict which is more dangerous for the future of scholarly publishing. And why not acknowledge reviewers' contribution to the final version of the published article? PubMed created two categories of contributors: authors [AU] and collaborators [IR]. Why not a third category for the peer-reviewer?

  18. Scholarly publishing depends on peer reviewers

    Directory of Open Access Journals (Sweden)

    Fernandez-Llimos F

    2018-03-01

    Full Text Available The peer-review crisis is posing a risk to the scholarly peer-reviewed journal system. Journals have to ask many potential peer reviewers to obtain a minimum acceptable number of peers willing to review a manuscript. Several solutions have been suggested to overcome this shortage, from reimbursing reviewers for the job to eliminating pre-publication review; one cannot predict which is more dangerous for the future of scholarly publishing. And why not acknowledge reviewers' contribution to the final version of the published article? PubMed created two categories of contributors: authors [AU] and collaborators [IR]. Why not a third category for the peer-reviewer?

  19. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Mencel, Liam A.

    2014-01-01

    computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result

  20. A Clustal Alignment Improver Using Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Thomsen, Rene; Fogel, Gary B.; Krink, Thimo

    2002-01-01

    Multiple sequence alignment (MSA) is a crucial task in bioinformatics. In this paper we extended previous work with evolutionary algorithms (EA) by using MSA solutions obtained from the well-known Clustal V algorithm as a candidate solution seed of the initial EA population. Our results clearly show...
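The seeding strategy this record describes — starting the evolutionary search from a known-good solution (here, a Clustal alignment) instead of a purely random population — can be sketched generically. This is a toy illustration with a made-up fitness function standing in for an alignment score, not the authors' actual MSA encoding:

```python
import random

def evolve(seed, fitness, pop_size=20, generations=100, mutation_rate=0.1):
    """Generic (mu+lambda)-style EA whose initial population is seeded
    with one known-good candidate plus mutated copies of it."""
    def mutate(ind):
        return [g if random.random() > mutation_rate else random.choice("ACGT-")
                for g in ind]
    # Seeded initialisation: the known solution plus mutated variants.
    population = [list(seed)] + [mutate(list(seed)) for _ in range(pop_size - 1)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]       # keep the best half
        population = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(population, key=fitness)

# Toy fitness: number of positions matching a hidden target "alignment".
target = list("ACGT-ACGT-")
score = lambda ind: sum(a == b for a, b in zip(ind, target))

best = evolve("AAAAAAAAAA", score)
```

Because the top half of each generation survives unchanged, the best fitness never drops below that of the seed, which is the practical appeal of seeding over random initialisation.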

  1. [Trends of electronic publishing in medicine and life sciences].

    Science.gov (United States)

    Strelski-Waisman, Neta; Waisman, Dan

    2005-09-01

    Scientific publication in the electronic media is gaining popularity in academic libraries, research institutions and commercial organizations. The electronic journal may shorten the processes of writing and publication, decrease publication and distribution costs, and enable access from any location in the world. Electronic publications have unique advantages: it is possible to search them, to create hyperlinks to references and footnotes, as well as to information on the web and to include graphics and photographs at a very low cost. Audio, video and tri-dimensional images may also be included. Electronic publishing may also speed up review and publication processes and enable the writer to receive immediate feedback through the web. However, in spite of the advantages, there are certain points that must be considered: accessibility to previously published material is not guaranteed as databases are not always stable and coverage may change without notice. In addition, the price that commercial publishers charge for their services may be very high or be subject to the purchase of a packaged deal that may include unwanted databases. Many issues of copyright and the use of published material are not yet finalized. In this review we discuss the advantages and disadvantages of the electronic scientific publication, the feasibility of keeping appropriate quality and peer-review process, the stability and accessibility of databases managed by the publishers and the acceptance of the electronic format by scientists and clinicians.

  2. Pseudo-deterministic Algorithms

    OpenAIRE

    Goldwasser , Shafi

    2012-01-01

    International audience; In this talk we describe a new type of probabilistic algorithm which we call Bellagio Algorithms: a randomized algorithm which is guaranteed to run in expected polynomial time, and to produce a correct and unique solution with high probability. These algorithms are pseudo-deterministic: they cannot be distinguished from deterministic algorithms in polynomial time by a probabilistic polynomial time observer with black box access to the algorithm. We show a necessary an...
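The "unique solution with high probability" property can be illustrated with a small example of my own (not taken from the talk): a randomized search whose answer is deterministically verified, so every run returns the same unique value and a black-box observer cannot tell it apart from a deterministic procedure:

```python
import random

def majority_element(xs, trials=64):
    """Pseudo-deterministic sketch: randomized guessing, deterministic check.
    If xs has a majority element, every run returns that same unique value
    with probability at least 1 - 2**-trials."""
    for _ in range(trials):
        candidate = random.choice(xs)          # randomized guess
        if xs.count(candidate) * 2 > len(xs):  # deterministic verification
            return candidate
    raise RuntimeError("no majority element found")

majority_element([3, 1, 3, 3, 2, 3, 3])  # 3 on (essentially) every run
```

Each trial succeeds with probability above 1/2, so the expected number of iterations is constant, yet the output is canonical rather than run-dependent.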

  3. Educational Systems Design Implications of Electronic Publishing.

    Science.gov (United States)

    Romiszowski, Alexander J.

    1994-01-01

    Discussion of electronic publishing focuses on the four main purposes of media in general: communication, entertainment, motivation, and education. Highlights include electronic journals and books; hypertext; user control; computer graphics and animation; electronic games; virtual reality; multimedia; electronic performance support;…

  4. Hypertext Publishing and the Revitalization of Knowledge.

    Science.gov (United States)

    Louie, Steven; Rubeck, Robert F.

    1989-01-01

    Discusses the use of hypertext for publishing and other document control activities in higher education. Topics discussed include a model of hypertext, called GUIDE, that is used at the University of Arizona Medical School; the increase in the number of scholarly publications; courseware development by faculty; and artificial intelligence. (LRW)

  5. Open Access Publishing in Particle Physics

    CERN Document Server

    2007-01-01

    Particle Physics, often referred to as High Energy Physics (HEP), spearheaded the Open Access dissemination of scientific results with the mass mailing of preprints in the pre-Web era and with the launch of the arXiv preprint system at the dawn of the '90s. The HEP community is now ready for a further push to Open Access while retaining all the advantages of the peer-review system and, at the same time, bringing the spiralling cost of journal subscriptions under control. I will present a plan for the conversion to Open Access of HEP peer-reviewed journals, through a consortium of HEP funding agencies, laboratories and libraries: SCOAP3 (Sponsoring Consortium for Open Access Publishing in Particle Physics). SCOAP3 will engage with scientific publishers towards building a sustainable model for Open Access publishing, which is as transparent as possible for HEP authors. The current system in which journal income comes from subscription fees is replaced with a scheme where SCOAP3 compensates publishers for the costs...

  6. Publishing Qualitative Research in Counseling Journals

    Science.gov (United States)

    Hunt, Brandon

    2011-01-01

    This article focuses on the essential elements to be included when developing a qualitative study and preparing the findings for publication. Using the sections typically found in a qualitative article, the author describes content relevant to each section, with additional suggestions for publishing qualitative research.

  7. Publisher Correction: Geometric constraints during epithelial jamming

    Science.gov (United States)

    Atia, Lior; Bi, Dapeng; Sharma, Yasha; Mitchel, Jennifer A.; Gweon, Bomi; Koehler, Stephan A.; DeCamp, Stephen J.; Lan, Bo; Kim, Jae Hun; Hirsch, Rebecca; Pegoraro, Adrian F.; Lee, Kyu Ha; Starr, Jacqueline R.; Weitz, David A.; Martin, Adam C.; Park, Jin-Ah; Butler, James P.; Fredberg, Jeffrey J.

    2018-06-01

    In the version of this Article originally published, the Supplementary Movies were linked to the wrong descriptions. These have now been corrected. Additionally, the authors would like to note that co-authors James P. Butler and Jeffrey J. Fredberg contributed equally to this Article; this change has now been made.

  8. Doing Publishable Research with Undergraduate Students

    Science.gov (United States)

    Fenn, Aju J.; Johnson, Daniel K. N.; Smith, Mark Griffin; Stimpert, J. L.

    2010-01-01

    Many economics majors write a senior thesis. Although this experience can be the pinnacle of their education, publication is not the common standard for undergraduates. The authors describe four approaches that have allowed students to get their work published: (1) identify a topic, such as competitive balance in sports, and have students work on…

  9. Desktop publishing: a useful tool for scientists.

    Science.gov (United States)

    Lindroth, J R; Cooper, G; Kent, R L

    1994-01-01

    Desktop publishing offers features that are not available in word processing programs. The process yields an impressive and professional-looking document that is legible and attractive. It is a simple but effective tool to enhance the quality and appearance of your work and perhaps also increase your productivity.

  10. Desktop Publishing as a Learning Resources Service.

    Science.gov (United States)

    Drake, David

    In late 1988, Midland College in Texas implemented a desktop publishing service to produce instructional aids and reduce and complement the workload of the campus print shop. The desktop service was placed in the Media Services Department of the Learning Resource Center (LRC) for three reasons: the LRC was already established as a campus-wide…

  11. Desktop Publishing: Things Gutenberg Never Taught You.

    Science.gov (United States)

    Bowman, Joel P.; Renshaw, Debbie A.

    1989-01-01

    Provides a desktop publishing (DTP) overview, including: advantages and disadvantages; hardware and software requirements; and future development. Discusses cost-effectiveness, confidentiality, credibility, effects on volume of paper-based communication, and the need for training in layout and design which DTP creates. Includes a glossary of DTP…

  12. Basics of Desktop Publishing. Teacher Edition.

    Science.gov (United States)

    Beeby, Ellen

    This color-coded teacher's guide contains curriculum materials designed to give students an awareness of various desktop publishing techniques before they determine their computer hardware and software needs. The guide contains six units, each of which includes some or all of the following basic components: objective sheet, suggested activities…

  13. Reconfiguration Service for Publish/Subscribe Middleware

    NARCIS (Netherlands)

    Zieba, Bogumil; Glandrup, Maurice; van Sinderen, Marten J.; Wegdam, M.

    2006-01-01

    Mission-critical, distributed systems are often designed as a set of distributed components that interact using publish/subscribe middleware. Currently, in these systems, software components are usually statically allocated to the nodes to fulfil predictability and reliability requirements. However, a
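The publish/subscribe interaction style this record refers to can be sketched in a few lines. This toy in-process broker (hypothetical names, unrelated to the middleware under study) shows why components stay loosely coupled: publishers and subscribers only know topics, never each other:

```python
from collections import defaultdict

class Broker:
    """Minimal topic-based publish/subscribe broker (illustrative sketch)."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # Decoupling: the publisher never references any subscriber directly.
        for callback in self._subscribers[topic]:
            callback(message)

broker = Broker()
received = []
broker.subscribe("sensor/temp", received.append)
broker.publish("sensor/temp", 21.5)   # received == [21.5]
```

In real middleware the broker is distributed across nodes, which is exactly what makes reallocating components at runtime (the reconfiguration problem above) non-trivial.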

  14. Awareness and Perceptions of Published Osteoporosis Clinical ...

    African Journals Online (AJOL)

    Awareness and Perceptions of Published Osteoporosis Clinical Guidelines-a Survey of Primary Care Practitioners in the Cape Town Metropolitan Area. ... Further attention needs to be focused on developing implementation and dissemination strategies of evidence-based guidelines in South Africa. South African Journal of ...

  15. Librarians and Libraries Supporting Open Access Publishing

    Science.gov (United States)

    Richard, Jennifer; Koufogiannakis, Denise; Ryan, Pam

    2009-01-01

    As new models of scholarly communication emerge, librarians and libraries have responded by developing and supporting new methods of storing and providing access to information and by creating new publishing support services. This article will examine the roles of libraries and librarians in developing and supporting open access publishing…

  16. 12 CFR 271.3 - Published information.

    Science.gov (United States)

    2010-01-01

    ... preceding year upon all matters of policy relating to open market operations, showing the reasons underlying... information relating to open market operations of the Federal Reserve Banks is published in the Federal... Banks and Banking FEDERAL RESERVE SYSTEM (CONTINUED) FEDERAL OPEN MARKET COMMITTEE RULES REGARDING...

  17. Electronic Publishing and The American Astronomical Society

    Science.gov (United States)

    Milkey, R. W.

    1999-12-01

    Electronic Publishing has created, and will continue to create, new opportunities and challenges for representing scientific work in new media and formats. The AAS will position itself to take advantage of these, both for newly created works and for improved representation of works already published. It is the view of the AAS that we hold the works that we publish in trust for our community and are obligated to protect the integrity of these works and to assure that they continue to be available to the research community. Assignment of copyright to the AAS by the author plays a central role in the preservation of the integrity and accessibility of the literature published by the American Astronomical Society. In return for such assignment the AAS allows the author to freely use the work for his/her own purpose and to control the grant of permission to third parties to use such materials. The AAS retains the right to republish the work in whatever format or medium, and to retain the rights after the author's death. Specific advantages to this approach include: Assurance of the continued availability of the materials to the research and educational communities; A guarantee of the intellectual integrity of the materials in the archive; Stimulation of the development of new means of presentation or of access to the archival literature; and Provision of a uniformity of treatment for copyright issues and to relieve the individual authors of much of the administrative work.

  18. Open access publishing in physics gains momentum

    CERN Multimedia

    2006-01-01

    "The first meeting of European particle physics funding agencies took place today at CERN to establish a consortium for Open Access publishing in particle physics, SCOAP3. This is the first time an entire scientific field is exploring the conversion of its reader-paid journals into an author-paid Open Access format." (1 page)

  19. [Medical publishing in Norway 1905-2005].

    Science.gov (United States)

    Nylenna, Magne; Larsen, Øivind

    2005-06-02

    The nation-building process in Norway mainly took place before the Norwegian-Swedish union came to a close in 1905. This was not a dramatic change, though the end of the union did bring a lift to Norwegian national consciousness. In 1905 there were three general medical journals in Norway and approximately 1200 doctors. German was the most important language of international science, but most scientific publishing was done in Norwegian. After the Second World War, English became the dominant language of scientific communication. The twentieth century was an era of specialisation and internationalisation in medicine and medical publishing. Norwegian medicine has to a large extent been internationalised through Nordic cooperation, with the Nordic specialist journals being of particular importance. With increasing professionalism in research, international English-language journals have become the major channels of communication, though several Norwegian-language journals (on paper or on the internet) have been established and are of crucial importance to a national identity within medical specialties. In 2005 there is only one general medical journal in Norwegian, in a country with approximately 20,000 doctors. A national identity related to medical publishing is not given much attention, though national medicine is still closely tied in with national culture. Good clinical practice should be based on a firm knowledge of local society and local tradition. This is a challenge in contemporary medical publishing.

  20. Humanists, Libraries, Electronic Publishing, and the Future.

    Science.gov (United States)

    Sweetland, James H.

    1992-01-01

    Discusses the impact of computerization on humanists and libraries. Highlights include a lack of relevant databases; a reliance on original text; vocabulary and language issues; lack of time pressure; research style; attitudes of humanists toward technology; trends in electronic publishing; hypertext; collection development; electronic mail;…

  1. Evolving Digital Publishing Opportunities across Composition Studies

    Science.gov (United States)

    Hawishler, Gail E.; Selfe, Cynthia L.

    2014-01-01

    In this article, the authors report that since the early 1980s the profession has seen plenty of changes in the arena of digital scholarly publishing: during this time, while the specific challenges have seldom remained the same, the presence and the pressures of rapid technological change endure. In fact, as an editorial team that has, in part,…

  2. Electronic Publishing in Library and Information Science.

    Science.gov (United States)

    Lee, Joel M.; And Others

    1988-01-01

    Discusses electronic publishing as it refers to machine-readable databases. Types of electronic products and services are described and related topics considered: (1) usage of library and information science databases; (2) production and distribution of databases; (3) trends and projections in the electronic information industry; and (4)…

  3. Publisher Correction: The price of fast fashion

    Science.gov (United States)

    2018-02-01

    In the version of this Editorial originally published, the rate of clothing disposal to landfill was incorrectly given as `one rubbish truck per day'; it should have read `one rubbish truck per second'. This has now been corrected in the online versions of the Editorial.

  4. Data Publishing - View from the Front

    Science.gov (United States)

    Carlson, David; Pfeiffenberger, Hans

    2014-05-01

    As data publishing journals - Earth System Science Data (ESSD, Copernicus, since 2009), Geophysical Data Journal (GDJ, Wiley, recent) and Scientific Data (SD, Nature Publishing Group, anticipated from May 2014) - expose data sets, implement data description and data review practices, and develop partnerships with data centres and data providers, we anticipate substantial benefits for the broad earth system and environmental research communities but also substantial challenges for all parties. A primary advantage emerges from open access to convergent data: subsurface hydrographic data near Antarctica, for example, now available for combination and comparison with nearby atmospheric data (both documented in ESSD), basin-scale precipitation data (accessed through GDJ) for comparison and interpolation with long-term global precipitation records (accessed from ESSD), or, imagining not too far into the future, stomach content and abundance data for European fish (from ESSD) linked to genetic or nutritional data (from SD). In addition to increased opportunity for discovery and collaboration, we also notice parallel developments of new tools for (published) data visualization and display and increasing acceptance of data publication as a useful and anticipated dissemination step included in project- and institution-based data management plans. All parties - providers, publishers and users - will benefit as various indexing services (SCI, SCOPUS, DCI etc.) acknowledge the creative, intellectual and meritorious efforts of data preparation and data provision. The challenges facing data publication, in most cases very familiar to the data community but made more acute by the advances in data publishing, include diverging metadata standards (among biomedical, green ocean modeling and meteorological communities, for example), adhering to standards and practices for permanent identification while also accommodating 'living' data, and maintaining prompt but rigorous review and

  5. Underestimation of Severity of Previous Whiplash Injuries

    Science.gov (United States)

    Naqui, SZH; Lovell, SJ; Lovell, ME

    2008-01-01

    INTRODUCTION We noted a report that more significant symptoms may be expressed after second whiplash injuries by a suggested cumulative effect, including degeneration. We wondered if patients were underestimating the severity of their earlier injury. PATIENTS AND METHODS We studied recent medicolegal reports, to assess subjects with a second whiplash injury. They had been asked whether their earlier injury was worse, the same or lesser in severity. RESULTS From the study cohort, 101 patients (87%) felt that they had fully recovered from their first injury and 15 (13%) had not. Seventy-six subjects considered their first injury of lesser severity, 24 worse and 16 the same. Of the 24 that felt the violence of their first accident was worse, only 8 had worse symptoms, and 16 felt their symptoms were mainly the same or less than their symptoms from their second injury. Statistical analysis of the data revealed that the proportion of those claiming a difference who said the previous injury was lesser was 76% (95% CI 66–84%). The observed proportion with a lesser injury was considerably higher than the 50% anticipated. CONCLUSIONS We feel that subjects may underestimate the severity of an earlier injury and associated symptoms. Reasons for this may include secondary gain rather than any proposed cumulative effect. PMID:18201501

  6. [Electronic cigarettes - effects on health. Previous reports].

    Science.gov (United States)

    Napierała, Marta; Kulza, Maksymilian; Wachowiak, Anna; Jabłecka, Katarzyna; Florek, Ewa

    2014-01-01

    Electronic cigarettes (e-cigarettes) have recently become very popular on the tobacco-products market. These products are considered potentially less harmful compared to traditional tobacco products. However, current reports indicate that manufacturers' statements regarding the composition of the e-liquids are not always sufficient, and consumers often lack reliable information on the quality of the product they use. This paper contains a review of previous reports on the composition of e-cigarettes and their impact on health. Most of the observed health effects were related to symptoms of the respiratory tract, mouth, throat and sensory organs, and to neurological complications. Particularly hazardous effects of e-cigarettes included pneumonia, congestive heart failure, confusion, convulsions, hypotension, aspiration pneumonia, second-degree facial burns, blindness, chest pain and rapid heartbeat. The literature contains no information on passive exposure to the aerosols released during e-cigarette smoking. Furthermore, information regarding the long-term use of these products is also unavailable.

  7. Small sum privacy and large sum utility in data publishing.

    Science.gov (United States)

    Fu, Ada Wai-Chee; Wang, Ke; Wong, Raymond Chi-Wing; Wang, Jia; Jiang, Minhao

    2014-08-01

    While the study of privacy preserving data publishing has drawn a lot of interest, some recent work has shown that existing mechanisms do not limit all inferences about individuals. This paper is a positive note in response to this finding. We point out that not all inference attacks should be countered, in contrast to all existing works known to us, and based on this we propose a model called SPLU. This model protects sensitive information, by which we refer to answers for aggregate queries with small sums, while queries with large sums are answered with higher accuracy. Using SPLU, we introduce a sanitization algorithm to protect data while maintaining high data utility for queries with large sums. Empirical results show that our method behaves as desired.
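The small-sum/large-sum idea — protect only those aggregates small enough to identify individuals, and release the rest accurately — can be sketched as follows. This is a hypothetical illustration of the intuition, not the paper's actual SPLU sanitization algorithm:

```python
import random

def answer_sum_query(values, predicate, threshold=10, noise_scale=5.0):
    """Toy small-sum/large-sum mechanism: aggregate queries whose true sum
    is small (potentially identifying) get noise; large sums are released
    exactly.  Parameter names and the noise model are illustrative only."""
    true_sum = sum(v for v in values if predicate(v))
    if true_sum < threshold:
        # Small sum: mask it with noise comparable to the sum itself.
        return true_sum + random.gauss(0, noise_scale)
    return true_sum  # large sum: answered with full accuracy

salaries = [1, 2, 3, 50, 60, 70]
answer_sum_query(salaries, lambda v: v > 40)   # 180, released exactly
answer_sum_query(salaries, lambda v: v < 5)    # 6, returned only with noise
```

The utility claim in the abstract corresponds to the second branch: queries with large sums pass through untouched, so aggregate analytics stay accurate.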

  8. Consortium Negotiations with Publishers - Past and Future

    Directory of Open Access Journals (Sweden)

    Pierre Carbone

    2007-09-01

    Full Text Available Since the mid-nineties, with the development of online access to information (journals, databases, e-books), libraries strengthened their cooperation. They set up consortia at different levels around the world, generally with the support of the public authorities, for negotiating collectively with publishers and information providers general agreements for access to these resources. This cooperation has been reinforced at the international level with the exchange of experiences and the debates in the ICOLC seminars and statements. So did the French consortium Couperin, which now gathers more than 200 academic and research institutions. The level of access to and downloading from these resources is growing geometrically, and reaches a scale beyond comparison with ILL or access to printed documents, but the costs did not decrease and library budgets did not increase. At first, agreements with the major journal publishers were based on cross-access, and evolved rapidly to access to a large bundle of titles in the so-called Big Deal. After experiencing the advantages of the Big Deal, libraries are now more sensitive to its limits, lack of flexibility and cost-effectiveness. These Big Deals were based on a model where the online access fee is built on the cost of print subscriptions, and the problem for the consortia and for the publishers is now to evolve from this print-plus-online model to an e-only model, no longer based on the historical amount of the print subscriptions: a new deal. In many European countries, VAT legislation is an obstacle to e-only, and this problem must be discussed at the European level. This change to e-only takes place at a moment when changes in the scientific publishing world are important (mergers of publishing houses, growth of research and of scientific publishing in the developing countries, the open access and open archives movement). The transition to e-only leads also the library

  9. Publishing Landscape Archaeology in the Digital World

    Directory of Open Access Journals (Sweden)

    Howry Jeffrey C.

    2017-12-01

    Full Text Available The challenge of presenting micro- and macro-scale data in landscape archaeology studies is eased by a diversity of GIS technologies. Specific to scholarly research is the need to selectively share certain types of data with collaborators and academic researchers while also publishing general information in the public domain. This article presents a general model for scholarly online collaboration and teaching while providing examples of the kinds of landscape archaeology that can be published online. Specifically illustrated is WorldMap, an interactive mapping platform built upon open-source software and accessed through browsers that follow open standards. The various features of this platform allow tight user viewing control, views with URL referencing, commenting and certification of layers, as well as user annotation. Illustration of WorldMap features and its value for scholarly research and teaching is provided in the context of landscape archaeology studies.

  10. Springer Publishing Booth | 4-5 October

    CERN Multimedia

    2016-01-01

    In the spirit of continuation of the CERN Book Fairs of the past years, Springer Nature will be present with a book and journal booth on October 4th and 5th, located as usual in the foyer of the Main Building. Some of the latest titles in particle physics and related fields will be on sale.   You are cordially invited to come to the booth to meet Heike Klingebiel (Licensing Manager / Library Sales), Hisako Niko (Publishing Editor) and Christian Caron (Publishing Editor). In particular, information about the new Nano database – nanomaterial and device profiles from high-impact journals and patents, manually abstracted, curated and updated by nanotechnology experts – will be available. The database is accessible here: http://nano.nature.com/. 

  11. Publishing priorities of biomedical research funders

    Science.gov (United States)

    Collins, Ellen

    2013-01-01

    Objectives To understand the publishing priorities, especially in relation to open access, of 10 UK biomedical research funders. Design Semistructured interviews. Setting 10 UK biomedical research funders. Participants 12 employees with responsibility for research management at 10 UK biomedical research funders; a purposive sample to represent a range of backgrounds and organisation types. Conclusions Publicly funded and large biomedical research funders are committed to open access publishing and are pleased with recent developments which have stimulated growth in this area. Smaller charitable funders are supportive of the aims of open access, but are concerned about the practical implications for their budgets and their funded researchers. Across the board, biomedical research funders are turning their attention to other priorities for sharing research outputs, including data, protocols and negative results. Further work is required to understand how smaller funders, including charitable funders, can support open access. PMID:24154520

  12. The Open Data Repositorys Data Publisher

    Science.gov (United States)

    Stone, N.; Lafuente, B.; Downs, R. T.; Blake, D.; Bristow, T.; Fonda, M.; Pires, A.

    2015-01-01

    Data management and data publication are becoming increasingly important components of researchers' workflows. The complexity of managing data, publishing data online, and archiving data has not decreased significantly even as computing access and power have greatly increased. The Open Data Repository's Data Publisher software strives to make data archiving, management, and publication a standard part of a researcher's workflow using simple, web-based tools and commodity server hardware. The publication engine allows for uploading, searching, and display of data with graphing capabilities and downloadable files. Access is controlled through a robust permissions system that can control publication at the field level and can be granted to the general public or protected so that only registered users at various permission levels receive access. Data Publisher also allows researchers to subscribe to meta-data standards through a plugin system, embargo data publication at their discretion, and collaborate with other researchers through various levels of data sharing. As the software matures, semantic data standards will be implemented to facilitate machine reading of data and each database will provide a REST application programming interface for programmatic access. Additionally, a citation system will allow snapshots of any data set to be archived and cited for publication while the data itself can remain living and continuously evolve beyond the snapshot date. The software runs on a traditional LAMP (Linux, Apache, MySQL, PHP) server and is available on GitHub (http://github.com/opendatarepository) under a GPLv2 open source license. The goal of the Open Data Repository is to lower the cost and training barrier to entry so that any researcher can easily publish their data and ensure it is archived for posterity.

  13. Publisher Correction: Local sourcing in astronomy

    Science.gov (United States)

    2018-06-01

    In the version of this Editorial originally published, we mistakenly wrote that `the NAOJ ... may decommission Subaru in favour of other priorities'. In fact, the National Astronomical Observatory of Japan is committed to the long-term operation of the Subaru telescope. In the corrected version that whole sentence has been replaced with: `It will be critical to maintain such smaller telescopes in the age of the ELTs.'

  14. Electronic pre-publishing for worldwide access

    International Nuclear Information System (INIS)

    Dallman, D.; Draper, M.; Schwarz, S.

    1994-01-01

    In High Energy Physics, as in other areas of research, paper preprints have traditionally been the primary method of communication before publishing in a journal. Electronic bulletin boards (EBBs) are now taking over as the dominant medium. While fast and readily available, EBBs do not constitute electronic journals, as they bypass the referee system crucial for prestigious research journals, although this too may be achieved electronically in time. (UK)

  15. The Open Data Repository's Data Publisher

    Science.gov (United States)

    Stone, N.; Lafuente, B.; Downs, R. T.; Bristow, T.; Blake, D. F.; Fonda, M.; Pires, A.

    2015-12-01

    Data management and data publication are becoming increasingly important components of research workflows. The complexity of managing data, publishing data online, and archiving data has not decreased significantly even as computing access and power have greatly increased. The Open Data Repository's Data Publisher software (http://www.opendatarepository.org) strives to make data archiving, management, and publication a standard part of a researcher's workflow using simple, web-based tools and commodity server hardware. The publication engine allows for uploading, searching, and display of data with graphing capabilities and downloadable files. Access is controlled through a robust permissions system that can control publication at the field level and can be granted to the general public or protected so that only registered users at various permission levels receive access. Data Publisher also allows researchers to subscribe to meta-data standards through a plugin system, embargo data publication at their discretion, and collaborate with other researchers through various levels of data sharing. As the software matures, semantic data standards will be implemented to facilitate machine reading of data and each database will provide a REST application programming interface for programmatic access. Additionally, a citation system will allow snapshots of any data set to be archived and cited for publication while the data itself can remain living and continuously evolve beyond the snapshot date. The software runs on a traditional LAMP (Linux, Apache, MySQL, PHP) server and is available on GitHub (http://github.com/opendatarepository) under a GPLv2 open source license. The goal of the Open Data Repository is to lower the cost and training barrier to entry so that any researcher can easily publish their data and ensure it is archived for posterity. We gratefully acknowledge the support for this study by the Science-Enabling Research Activity (SERA), and NASA NNX11AP82A

  16. Promising Products for Printing and Publishing Market

    Directory of Open Access Journals (Sweden)

    Renata Činčikaitė

    2011-04-01

    Full Text Available The article surveys the printing and publishing market, including its strengths and weaknesses. The concept of a new product is described, as well as its life cycle and the necessity of its introduction to the market. The enterprise X operating on this market is analysed, and its strengths and weaknesses are presented. A segmentation of the company's consumers is performed. On the basis of this analysis, potentially promising company products are identified. Article in Lithuanian.

  17. Redressing the inverted pyramid of scientific publishing

    Science.gov (United States)

    Caux, Jean-Sébastien

    2017-11-01

    Scientific publishing is currently undergoing a progressively rapid transformation away from the traditional subscription model. With the Open Access movement in full swing, existing business practices and future plans are coming under increasing scrutiny, while new "big deals" are being made at breakneck speed. Scientists can rightfully ask themselves if all these changes are going the right way, and if not, what can be done about it.

  18. Open Access publishing in physics gains momentum

    CERN Multimedia

    2006-01-01

    "As if inventing the World-Wide Web were not revolutionary enough, the European Organisation for Nuclear Research (CERN) is now on its way to unleashing a paradigm shift in the world of academic publishing. For the first time ever, an entire scientific field is exploring the possibility of converting its reader-paid journals into an author-paid Open Access format." (1 page)

  19. The BiPublishers ranking: Main results and methodological problems when constructing rankings of academic publishers

    Directory of Open Access Journals (Sweden)

    Torres-Salinas, Daniel

    2015-12-01

    Full Text Available We present the results of the Bibliometric Indicators for Publishers project (also known as BiPublishers). This project represents the first attempt to systematically develop bibliometric publisher rankings. The data for this project were derived from the Book Citation Index, and the study time period was 2009-2013. We have developed 42 rankings: 4 by fields and 38 by disciplines. We display six indicators for publishers divided into three types: output, impact and publisher's profile. The aim is to capture different characteristics of the research performance of publishers. A total of 254 publishers were processed and classified according to publisher type: commercial publishers and university presses. We present the main publishers by field and then discuss the principal challenges encountered when developing this type of tool. The BiPublishers ranking is an ongoing project which aims to develop and explore new data sources and indicators to better capture and define the research impact of publishers.

  20. Recently Published Lectures and Tutorials for ATLAS

    CERN Multimedia

    Herr, J.

    2006-01-01

    As reported in the September 2004 ATLAS eNews, the Web Lecture Archive Project, WLAP, a collaboration between the University of Michigan and CERN, has developed a synchronized system for recording and publishing educational multimedia presentations, using the Web as medium. This year, the University of Michigan team has been asked to record and publish all ATLAS Plenary sessions, as well as a large number of Physics and Computing tutorials. A significant amount of this material has already been published and can be accessed via the links below. The WLAP model is spreading. This summer, CERN's High School Teachers program has used WLAP's system to record several physics lectures directed toward a broad audience. And a new project called MScribe, which is essentially the WLAP system coupled with an infrared tracking camera, is being used by the University of Michigan to record several University courses this academic year. All lectures can be viewed on any major platform with any common internet browser...

  1. The ethics of open access publishing.

    Science.gov (United States)

    Parker, Michael

    2013-03-22

    Should those who work on ethics welcome or resist moves to open access publishing? This paper analyses arguments for and against the increasing requirement for open access publishing and considers their implications for bioethics research. In the context of biomedical science, major funders are increasingly mandating open access as a condition of funding and such moves are also common in other disciplines. Whilst there has been some debate about the implications of open access for the social sciences and humanities, there has been little if any discussion about the implications of open access for ethics. This is surprising given both the central role of public reason and critique in ethics and the fact that many of the arguments made for and against open access have been couched in moral terms. In what follows I argue that those who work in ethics have a strong interest in supporting moves towards more open publishing approaches which have the potential both to inform and promote richer and more diverse forms of public deliberation and to be enriched by them. The importance of public deliberation in practical and applied ethics suggests that ethicists have a particular interest in the promotion of diverse and experimental forms of publication and debate and in supporting new, more creative and more participatory approaches to publication.

  2. Applied and implied semantics in crystallographic publishing

    Directory of Open Access Journals (Sweden)

    McMahon Brian

    2012-08-01

    Full Text Available Background: Crystallography is a data-rich, software-intensive scientific discipline with a community that has undertaken direct responsibility for publishing its own scientific journals. That community has worked actively to develop information exchange standards allowing readers of structure reports to access directly, and interact with, the scientific content of the articles. Results: Structure reports submitted to some journals of the International Union of Crystallography (IUCr) can be automatically validated and published through an efficient and cost-effective workflow. Readers can view and interact with the structures in three-dimensional visualization applications, and can access the experimental data should they wish to perform their own independent structure solution and refinement. The journals also layer on top of this facility a number of automated annotations and interpretations to add further scientific value. Conclusions: The benefits of semantically rich information exchange standards have revolutionised the scholarly publishing process for crystallography, and establish a model relevant to many other physical science disciplines.
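
    The information exchange standard behind this workflow is the Crystallographic Information File (CIF), whose simplest data items are `_tag value` pairs inside a `data_` block. The sketch below extracts such items from a minimal fragment; it ignores `loop_` constructs, multi-line values and full quoting rules, so it is an illustration of the idea rather than a conforming CIF reader.

```python
def parse_cif_items(text):
    """Collect simple `_tag value` data items from a CIF fragment.
    Loops (`loop_`), multi-line values and full quoting rules are ignored."""
    items = {}
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("_"):
            parts = line.split(None, 1)   # tag, then the rest of the line
            if len(parts) == 2:
                items[parts[0]] = parts[1]
    return items

# Minimal illustrative fragment (cell parameters for quartz).
sample = """\
data_quartz
_cell_length_a    4.9137
_cell_length_c    5.4047
_symmetry_space_group_name_H-M  'P 32 2 1'
"""

print(parse_cif_items(sample)["_cell_length_a"])
# 4.9137
```

Automated validation of submitted structure reports, as described above, starts from exactly this kind of machine-readable tagging.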

  3. Open Access Publishing - Strengths and Strategies

    Science.gov (United States)

    Rasmussen, Martin

    2010-05-01

    The journal crisis and the demand for free accessibility to the results of publicly funded research were the main drivers of the Open Access movement since the late 1990s. Besides many academic institutions that support the different ways of Open Access publishing, there is a growing number of publishing houses that are specialized in this new access and business model of scholarly literature. The lecture provides an overview of the different kinds of Open Access publishing, discusses the variety of underlying business models, names the advantages and potentials for researchers and the public, and addresses some common objections to Open Access. Besides the increased visibility and information supply, the topic of copyrights and exploitation rights will be discussed. Furthermore, it is a central aim of the presentation to show that Open Access does not only support full peer review, but also provides the potential for even enhanced quality assurance. The financing of business models based on openly accessible literature is another important part to be outlined in the lecture.

  4. Testing mapping algorithms of the cancer-specific EORTC QLQ-C30 onto EQ-5D in malignant mesothelioma.

    Science.gov (United States)

    Arnold, David T; Rowen, Donna; Versteegh, Matthijs M; Morley, Anna; Hooper, Clare E; Maskell, Nicholas A

    2015-01-23

    In order to estimate utilities for cancer studies where the EQ-5D was not used, the EORTC QLQ-C30 can be used to estimate EQ-5D using existing mapping algorithms. Several mapping algorithms exist for this transformation; however, algorithms tend to lose accuracy in patients in poor health states. The aim of this study was to test all existing mapping algorithms of QLQ-C30 onto EQ-5D in a dataset of patients with malignant pleural mesothelioma, an invariably fatal malignancy where no previous mapping estimation has been published. Health related quality of life (HRQoL) data where both the EQ-5D and QLQ-C30 were used simultaneously were obtained from the UK-based prospective observational SWAMP (South West Area Mesothelioma and Pemetrexed) trial. In the original trial, 73 patients with pleural mesothelioma were offered palliative chemotherapy and their HRQoL was assessed across five time points. This data was used to test the nine available mapping algorithms found in the literature, comparing predicted against observed EQ-5D values. The ability of algorithms to predict the mean, minimise error and detect clinically significant differences was assessed. The dataset had a total of 250 observations across the 5 time points. The linear regression mapping algorithms tested generally performed poorly, over-estimating the predicted compared to observed EQ-5D values, especially when observed EQ-5D was below 0.5. The best performing algorithm used a response mapping method and accurately predicted the mean EQ-5D, with an average root mean squared error of 0.17 (standard deviation 0.22). This algorithm reliably discriminated between clinically distinct subgroups seen in the primary dataset. This study tested mapping algorithms in a population with poor health states, where they have been previously shown to perform poorly. Further research into EQ-5D estimation should be directed at response mapping methods given their superior performance in this study.
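
    The headline statistics in this record are straightforward to reproduce. The sketch below (with made-up utility values, not data from the SWAMP trial) computes the root mean squared error and the mean error between observed and mapped EQ-5D utilities; a positive mean error indicates that a mapping over-estimates utility, the failure mode reported above for poor health states.

```python
import math

def rmse(observed, predicted):
    """Root mean squared error between observed and mapped EQ-5D utilities."""
    n = len(observed)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def mean_error(observed, predicted):
    """Mean (predicted - observed); positive values indicate over-estimation."""
    return sum(p - o for o, p in zip(observed, predicted)) / len(observed)

# Illustrative values only -- not data from the SWAMP trial.
obs = [0.3, 0.5, 0.7]
pred = [0.45, 0.6, 0.75]
print(round(rmse(obs, pred), 3), round(mean_error(obs, pred), 3))
# 0.108 0.1
```

Reporting both statistics matters: a mapping can achieve a small RMSE while still being systematically biased upward, which is exactly what the study observed for utilities below 0.5.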

  5. Publishing bioethics and bioethics--reflections on academic publishing by a journal editor.

    Science.gov (United States)

    Schüklenk, Udo

    2011-02-01

    This article by one of the Editors of Bioethics, published in the 25th anniversary issue of the journal, describes some of the revolutionary changes academic publishing has undergone during the last decades. Many humanities journals went from typically small print-runs, counting by the hundreds, to on-line availability in thousands of university libraries worldwide. Article up-take by our subscribers can be measured efficiently. The implications of this and other changes to academic publishing are discussed. Important ethical challenges need to be addressed in areas such as the enforcement of plagiarism-related policies, the so-called 'impact factor' and its impact on academic integrity, and the question of whether on-line only publishing can currently guarantee the integrity of academic publishing histories. © 2010 Blackwell Publishing Ltd.

  6. Hamiltonian Algorithm Sound Synthesis

    OpenAIRE

    大矢, 健一

    2013-01-01

    Hamiltonian Algorithm (HA) is an algorithm for searching for solutions to optimization problems. This paper introduces a sound synthesis technique using the Hamiltonian Algorithm and shows a simple example. "Hamiltonian Algorithm Sound Synthesis" uses the phase transition effect in HA. Because of this transition effect, totally new waveforms are produced.

  7. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; de Berg, M.T.; Bouts, Q.W.; ten Brink, Alex P.; Buchin, K.A.; Westenberg, M.A.

    2015-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms
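
    The abstract defines the progressive paradigm only in words. As a minimal generic illustration (not one of the authors' geometric algorithms), the generator below emits successively refined estimates of a mean while it is still consuming its input, finishing with the exact answer:

```python
def progressive_mean(stream, report_every=1000):
    """Progressive-algorithm sketch: yield intermediate approximations of
    the mean of `stream` every `report_every` items, then the exact mean."""
    total, count = 0.0, 0
    for x in stream:
        total += x
        count += 1
        if count % report_every == 0:
            yield total / count   # intermediate solution
    if count:
        yield total / count       # complete solution
```

Each yielded value approximates the complete solution increasingly well, mirroring the framework's notion of intermediate solutions; for example, `list(progressive_mean(range(10), report_every=4))` yields `[1.5, 3.5, 4.5]`.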

  8. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; Berg, de M.T.; Bouts, Q.W.; Brink, ten A.P.; Buchin, K.; Westenberg, M.A.

    2014-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  9. Reinvention of publishers' revenue model: expectations of advertisers towards publishers' products

    OpenAIRE

    Koller, Hans; Dennstedt, Bianca

    2017-01-01

    Publishers have to reconsider their revenue model. Facing a massive decline in the circulation of newspapers and magazines over the past years, publishers have lost not only readers but also many advertisers. Thus, publishers are faced with both changed customer expectations as well as difficulty in generating profit. Users are increasingly less willing to pay for digital products and their expectations of digital content have changed: They would like to contribute their own content as well a...

  10. The Algorithmic Imaginary

    DEFF Research Database (Denmark)

    Bucher, Taina

    2017-01-01

    ...of algorithms affect people's use of these platforms, if at all? To help answer these questions, this article examines people's personal stories about the Facebook algorithm through tweets and interviews with 25 ordinary users. To understand the spaces where people and algorithms meet, this article develops the notion of the algorithmic imaginary. It is argued that the algorithmic imaginary – ways of thinking about what algorithms are, what they should be and how they function – is not just productive of different moods and sensations but plays a generative role in moulding the Facebook algorithm itself...

  11. The BR eigenvalue algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Geist, G.A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Howell, G.W. [Florida Inst. of Tech., Melbourne, FL (United States). Dept. of Applied Mathematics; Watkins, D.S. [Washington State Univ., Pullman, WA (United States). Dept. of Pure and Applied Mathematics

    1997-11-01

    The BR algorithm, a new method for calculating the eigenvalues of an upper Hessenberg matrix, is introduced. It is a bulge-chasing algorithm like the QR algorithm, but, unlike the QR algorithm, it is well adapted to computing the eigenvalues of the narrowband, nearly tridiagonal matrices generated by the look-ahead Lanczos process. This paper describes the BR algorithm and gives numerical evidence that it works well in conjunction with the Lanczos process. On the biggest problems run so far, the BR algorithm beats the QR algorithm by a factor of 30--60 in computing time and a factor of over 100 in matrix storage space.
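
    The BR algorithm itself is beyond a short sketch, but the bulge-chasing family it belongs to builds on the basic QR iteration, which for an upper Hessenberg matrix needs only n-1 Givens rotations per step. The unshifted pure-Python sketch below (no shifts or deflation, so far slower than either BR or a practical QR code) drives the diagonal of a nearly tridiagonal test matrix toward its eigenvalues:

```python
import math

def qr_step_hessenberg(H):
    """One unshifted QR step (H <- RQ) using Givens rotations, in place.
    The Hessenberg structure means only n-1 rotations are needed."""
    n = len(H)
    rotations = []
    # Factor H = QR: zero each subdiagonal entry with a Givens rotation.
    for k in range(n - 1):
        a, b = H[k][k], H[k + 1][k]
        r = math.hypot(a, b)
        c, s = (1.0, 0.0) if r == 0 else (a / r, b / r)
        rotations.append((c, s))
        for j in range(k, n):           # rotate rows k and k+1
            x, y = H[k][j], H[k + 1][j]
            H[k][j] = c * x + s * y
            H[k + 1][j] = -s * x + c * y
    # Form RQ: apply the same rotations to the columns of R.
    for k, (c, s) in enumerate(rotations):
        for i in range(min(k + 2, n)):  # R is upper triangular
            x, y = H[i][k], H[i][k + 1]
            H[i][k] = c * x + s * y
            H[i][k + 1] = -s * x + c * y

def hessenberg_eigenvalues(H, iters=300):
    """Iterate QR steps; for a well-separated real spectrum the diagonal
    converges to the eigenvalues (practical codes add shifts/deflation)."""
    for _ in range(iters):
        qr_step_hessenberg(H)
    return sorted(H[i][i] for i in range(len(H)))

# Tridiagonal test matrix with known eigenvalues 2 - 2*cos(k*pi/5).
T = [[2.0, -1.0, 0.0, 0.0],
     [-1.0, 2.0, -1.0, 0.0],
     [0.0, -1.0, 2.0, -1.0],
     [0.0, 0.0, -1.0, 2.0]]
print([round(e, 4) for e in hessenberg_eigenvalues([row[:] for row in T])])
```

The speedups quoted in the abstract come from BR exploiting the narrow band structure that this plain QR step ignores.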

  12. PUBLISHER'S ANNOUNCEMENT: Editorial developments Editorial developments

    Science.gov (United States)

    2010-01-01

    I am delighted to inform you that from January 2010 Professor Alfred K Louis of the University of Saarland, Germany, will be the new Editor-in-Chief of Inverse Problems. Alfred joins us with a wealth of experience and a great deal of respect from the community. He has served the journal in a number of ways as an Editorial Board member, outstanding reviewer and author. We very much look forward to working with him to continue to publish the highest quality articles in the field and build on our extremely successful special section and topical review programmes. Whilst welcoming Alfred to the position, we are also keen to thank our outgoing Editor-in-Chief, Professor Bill Symes, for the fabulous job that he has done over the past five years. Under Bill's direction, Inverse Problems has gone from strength to strength. In fact, in the last year we have taken the step of moving from six to 12 issues a year, reflecting the increased number of high-quality papers submitted to the journal. During the last five years we have published a wide range of fantastic special sections and topical reviews, including a celebration of the journal's 25th year (issue 12 2009), in which Bill played a pivotal role. We are very much looking forward to 2010 and will be celebrating our 25th birthday further with a selection of highlighted articles chosen from the past 25 years. We hope that you will continue to enjoy reading the journal. If you have any feedback, comments or questions, please do not hesitate to contact us at ip@iop.org. Zoë Crossman Publisher

  13. Scientific Publishing and the Data Deluge (Invited)

    Science.gov (United States)

    Hanson, B.

    2010-12-01

    The ability to collect and analyze huge data sets is changing and revolutionizing many aspects of science, including scientific publishing. Policies and practices with respect to data management and archiving have been evolving at journals, but many outstanding problems and challenges have emerged and some are growing. Journals have an evolving mission including a traditional role in advancing science and an increasingly important role of accrediting peer-reviewed research used in public policy and the legal and regulatory systems. Publishing is increasingly responsible for assuring the reliability and transparency of data for both uses, and policies have been evolving to meet these goals. Most journals now include data supplements and have strengthened sharing and archiving requirements. For example, Science now requires all references to be available (published) at publication, and to the extent possible, supporting data to be archived in online supplements. Many problems remain and are growing: Journals cannot handle some of the large data sets routinely being produced now, and must rely on public databases. Of these, too many do not have reliable funding, and others (e.g., personal or institutional WWW sites) are not reliably curated. Much usable data is being discarded. Journals are in the role of monitoring and in too many cases enforcing deposition and sharing of data. Presentation and visualization of data requires new tools that are challenging to standardize and maintain, and to represent within traditional formats still used by most users. Much data is being archived in a minimally usable form (PDF) without common metadata. A growing burden is being placed on reviewers and editors as papers are longer and more complex, and many journals are seeing large growths in submissions. In some disciplines, huge private data sets, third-party data, or privacy issues are increasingly important, and scientists and journals may be unaware of use restrictions. 

  14. Conceiving, Writing and Publishing a Scientific Paper

    Directory of Open Access Journals (Sweden)

    Marta Christina Suciu

    2018-02-01

    Full Text Available I consider that the second edition of the book “Conceiving, Writing and Publishing a Scientific Paper. An approach in the context of economic research” is a work of high scientific standing, excellently documented, addressed to knowledgeable readers but also to the new generations of PhD students and researchers interested in attaining the highest academic and scientific performance in the field of Economics. From my point of view, both editions of the book are highly useful for the academic environment. I especially appreciate the authors' efforts to improve and expand the second edition. These improvements reflect the changes that have occurred in the evolution of scientometric indicators, the feedback received from readers, and the desire to cover the networks and platforms that host professional and institutional profiles, so as to help researchers disseminate the results of their scientific work. The authors enjoy unanimous recognition and are highly appreciated, a fact proven, for example, by the ten existing reviews of the book's first edition, published in journals indexed in Web of Science, Scopus and other international databases. I am most impressed by the authors' permanent concern to improve their book's scientific content. The first edition was awarded the distinction “The Best Book in the Field of Economics” published in 2016 in Romania by the Association of Romanian Economic Faculties, as well as the 2016 prize of the Association of Cultural-Educational Cooperation from Suceava (ACCES).

  15. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  16. XML Publishing with Adobe InDesign

    CERN Document Server

    Hoskins, Dorothy

    2010-01-01

    From Adobe InDesign CS2 to InDesign CS5, the ability to work with XML content has been built into every version of InDesign. Some of the useful applications are importing database content into InDesign to create catalog pages, exporting XML that will be useful for subsequent publishing processes, and building chunks of content that can be reused in multiple publications. In this Short Cut, we'll play with the contents of a college course catalog and see how we can use XML for course descriptions, tables, and other content. Underlying principles of XML structure, DTDs, and the InDesign namesp

  17. Tackling Tumblr Web Publishing Made Simple

    CERN Document Server

    Hedengren, Thord Daniel

    2011-01-01

    A comprehensive guide to the popular web publishing site Tumblr The popularity of Tumblr is growing by leaps and bounds, as it continues to make a name for itself as a reliable, accessible blogging platform. Yet, there is very little documentation on Tumblr, leaving newcomers confused as to where to start. That's where this helpful book comes in. Written by well-respected author Thord Hedengren, this step-by-step guide is an ideal starting point for Tumblr newcomers as well as web designers who want to take their Tumblblogs to the next level. You'll learn how to maximize the full potential of

  18. Astronomical Publishing: Yesterday, Today and Tomorrow

    Science.gov (United States)

    Huchra, John

    Just in the last few years scientific publishing has moved rapidly away from the modes that served it well for over two centuries. As "digital natives" take over the field and rapid and open access comes to dominate the way we communicate, both scholarly journals and libraries need to adopt new business models to serve their communities. This is best done by identifying new "added value" such as databases, full text searching, full cross indexing while at the same time retaining the high quality of peer reviewed publication.

  19. Algorithms for worst-case tolerance optimization

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans; Madsen, Kaj

    1979-01-01

    New algorithms are presented for the solution of optimum tolerance assignment problems. The problems considered are defined mathematically as a worst-case problem (WCP), a fixed tolerance problem (FTP), and a variable tolerance problem (VTP). The basic optimization problem without tolerances is denoted the zero tolerance problem (ZTP). For solution of the WCP we suggest application of interval arithmetic and also alternative methods. For solution of the FTP an algorithm is suggested which is conceptually similar to algorithms previously developed by the authors for the ZTP. Finally, the VTP is solved by a double-iterative algorithm in which the inner iteration is performed by the FTP algorithm. The application of the algorithm is demonstrated by means of relatively simple numerical examples. Basic properties, such as convergence properties, are displayed based on the examples.
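
    As a small illustration of the interval-arithmetic idea suggested for the WCP (a generic sketch, not the authors' algorithm), interval operations bound a design's performance over an entire tolerance box at once, so a single evaluation certifies the worst case:

```python
class Interval:
    """Closed interval [lo, hi]; only + and * are needed for this sketch."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

def worst_case_ok(perf, spec_lo, spec_hi):
    """WCP-style feasibility check: the design passes only if the performance
    interval lies inside the specification for every point of the box."""
    return spec_lo <= perf.lo and perf.hi <= spec_hi

# Two series resistors, each with a +/-5% tolerance around 2.0 and 1.0 ohms.
total = Interval(1.9, 2.1) + Interval(0.95, 1.05)
print(worst_case_ok(total, 2.8, 3.2))   # spec: total resistance in 2.8..3.2 ohms
```

Because interval bounds can be conservative for expressions with repeated variables, practical worst-case tolerance methods combine such checks with the alternative methods the abstract mentions.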

  20. Pioneers, publishers and the dissemination of archaeological knowledge: A study of publishing in British archaeology 1816-1851

    Directory of Open Access Journals (Sweden)

    Sarah Scott

    2013-08-01

    Full Text Available The first half of the nineteenth century was a formative period in the development of archaeology as a discipline and archaeological publishing played a key role in this. Libraries were an essential marker of social and intellectual status and there now exists a considerable body of scholarship on the most impressive publications of the day and on the factors influencing their presentation; for example, in relation to the publication of Mediterranean classical antiquities. The crucial role which publishers played in the selection and dissemination of scholarship has been addressed in recent studies of the history of the book, and there is a growing literature on the role of publishers in the dissemination of scientific knowledge, but there has to date been very limited evaluation of the role of publishers in the selection and dissemination of archaeological knowledge in Britain in this period. This study will investigate the extent to which the publication and dissemination of archaeological knowledge, and hence the discipline itself, was shaped by the intellectual and/or commercial concerns of publishers, with a view to providing a more nuanced understanding of the ways in which knowledge was filtered and the impact that this had. Key trends in archaeological publishing in the period 1816-51 will be identified, based on the London Catalogue of Books, and will show how and why this kind of study should be seen as an essential component of any research which considers the history of the discipline. Selected case studies will show the immense, and previously unacknowledged, importance of decisions made during the publication process on the development of archaeology in Britain, and directions for further study will be identified.

  1. Desktop Publishing: A Brave New World and Publishing from the Desktop.

    Science.gov (United States)

    Lormand, Robert; Rowe, Jane J.

    1988-01-01

    The first of two articles presents basic selection criteria for desktop publishing software packages, including discussion of expectations, required equipment, training costs, publication size, desired software features, additional equipment needed, and quality control. The second provides a brief description of desktop publishing using the Apple…

  2. Scientific Journal Publishing in India: Promoting electronic publishing of scholarly journals in India

    OpenAIRE

    Abraham, Thomas; Minj, Suvarsha

    2007-01-01

    Provides a report about the Scientific Journal Publishing in India (SJPI) Project which promotes electronic publishing of scholarly journals. It covers briefly the objectives, implementation and outcomes of the Project. Open Journal Systems and Open Archives Harvester were used to achieve the goals of the Project.

  3. Entomological journals and publishing in Japan.

    Science.gov (United States)

    Fukatsu, Takema

    Here I present an overview of entomological journals and publishing in Japan, thereby providing a convenient portal to the valuable scientific resources for the world's entomological researchers and scientific communities. Currently, except for several international journals published fully in English such as Applied Entomology and Zoology and Entomological Science, many entomological and entomology-related journals in Japan are not indexed by major scientific databases like Web of Science, and therefore they are neither conveniently recognizable nor accessible for the world's entomological communities. However, I point out that many of the contents of such journals are freely available via Japan's public platforms for electronic scientific literature, Japan Science and Technology Information Aggregator, Electronic (J-stage) or Citation Information by National Institute of Informatics (CiNii). Here I list 32 entomological and entomology-related societies and their 45 journals, the majority of which belong to either the Union of Japanese Societies for Insect Sciences (UJSIS), the Union of the Japanese Societies for Systematic Biology (UJSSB), the Union of Japanese Societies for Natural History (UJSNH), or the Union of Japanese Societies for Biological Science (UJSBS), with their respective URL and open-access availability.

  4. Open meeting on changing the publishing model

    CERN Multimedia

    2005-01-01

    The Director-General is calling all CERN editors and authors to a meeting to contribute to the discussion on the direction that CERN should take in its experimentation with new publishing models. The current subscription-funded publishing model for journal articles (where access to a particular journal is granted upon payment of a subscription, often arranged by the institutional library) has been the status quo for many years. However, new evidence suggests that removing this subscription barrier gives access to a greater number of readers and so leads to a higher citation rate and therefore greater impact. New so-called Open Access models are emerging but these require the support of authors and editors to be successful. A number of presentations have been solicited which will explain the background to the current situation and Chief Scientific Officer, Jos Engelen, will lead a discussion about the pros and cons of CERN following a particular model. Your input and support is crucial to the success of suc...

  5. Copyright case a victory for science publishing

    Science.gov (United States)

    Cole, Stephen

    An important victory for the financial health and future of scientific journals was won July 23 when Judge Pierre Leval of the Federal District Court in New York handed down his decision on the copyright infringement suit, American Geophysical Union, et al. v. Texaco Inc. Leval ruled that profit-making companies cannot photocopy copyrighted journal articles without permission and without compensating the copyright holder.The class action suit was brought in 1985 by AGU and six other scientific publishers on behalf of 8500 publishers worldwide who make their titles available for legal copying under licenses granted by the Copyright Clearance Center, Inc. This licensing system was designed in cooperation with major corporations to facilitate compliance with the 1976 Copyright Act. Although more than 200 companies now use the center, some corporations, such as Texaco, have not. The suit was initiated to force compliance with copyright law. The current decision is very important because it establishes legal precedents on the “fair use” issue.

  6. Publish or perish: tools for survival

    Directory of Open Access Journals (Sweden)

    Quan SF

    2017-02-01

    Full Text Available No abstract available. Article truncated at 150 words. Success in one’s chosen profession is often predicated upon meeting a profession-wide standard of excellence or productivity. In the corporate world, the metric might be sales volume and in clinical medicine it may be patient satisfaction and/or number of patients seen. In academic medicine, including the fields of Pulmonary and Critical Care Medicine, the “coin of the realm” is demonstrable written scholarship. In large part, this is determined by the number and quality of publications in scientific journals. Unfortunately, the skills required to navigate the complexities of how to publish in the scientific literature rarely are taught in either medical school or postgraduate training. To assist the inexperienced academic physician or scientist, the Writing for Scholarship Interest Group of the Harvard Medical School Academy recently published “A Writer’s Toolkit” (1. This comprehensive monograph provides valuable information on all phases of the writing process ranging from conceptualization of a manuscript to …

  7. Publishing perishing? Towards tomorrow's information architecture

    Directory of Open Access Journals (Sweden)

    Gerstein Mark B

    2007-01-01

    Full Text Available Abstract Scientific articles are tailored to present information in human-readable aliquots. Although the Internet has revolutionized the way our society thinks about information, the traditional text-based framework of the scientific article remains largely unchanged. This format imposes sharp constraints upon the type and quantity of biological information published today. Academic journals alone cannot capture the findings of modern genome-scale inquiry. Like many other disciplines, molecular biology is a science of facts: information inherently suited to database storage. In the past decade, a proliferation of public and private databases has emerged to house genome sequence, protein structure information, functional genomics data and more; these digital repositories are now a vital component of scientific communication. The next challenge is to integrate this vast and ever-growing body of information with academic journals and other media. To truly integrate scientific information we must modernize academic publishing to exploit the power of the Internet. This means more than online access to articles, hyperlinked references and web-based supplemental data; it means making articles fully computer-readable with intelligent markup and Structured Digital Abstracts. Here, we examine the changing roles of scholarly journals and databases. We present our vision of the optimal information architecture for the biosciences, and close with tangible steps to improve our handling of scientific information today while paving the way for an expansive central index in the future.

  8. AAS Publishing News: Astronomical Software Citation Workshop

    Science.gov (United States)

    Kohler, Susanna

    2015-07-01

    Do you write code for your research? Use astronomical software? Do you wish there were a better way of citing, sharing, archiving, or discovering software for astronomy research? You're not alone! In April 2015, AAS's publishing team joined other leaders in the astronomical software community in a meeting funded by the Sloan Foundation, with the purpose of discussing these issues and potential solutions. In attendance were representatives from academic astronomy, publishing, libraries, for-profit software sharing platforms, telescope facilities, and grantmaking institutions. The goal of the group was to establish “protocols, policies, and platforms for astronomical software citation, sharing, and archiving,” in the hopes of encouraging a set of normalized standards across the field. The AAS is now collaborating with leaders at GitHub to write grant proposals for a project to develop strategies for software discoverability and citation, in astronomy and beyond. If this topic interests you, you can find more details in this document released by the group after the meeting: http://astronomy-software-index.github.io/2015-workshop/ The group hopes to move this project forward with input and support from the broader community. Please share the above document, discuss it on social media using the hashtag #astroware (so that your conversations can be found!), or send private comments to julie.steffen@aas.org.

  9. Computational geometry algorithms and applications

    CERN Document Server

    de Berg, Mark; Overmars, Mark; Schwarzkopf, Otfried

    1997-01-01

Computational geometry emerged from the field of algorithms design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can on the one hand be explained from the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains--computer graphics, geographic information systems (GIS), robotics, and others--in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can ...

  10. PUBLISHER'S ANNOUNCEMENT: Editorial developments Editorial developments

    Science.gov (United States)

    Gillan, Rebecca

    2009-01-01

    We are delighted to announce that from January 2009, Professor Murray T Batchelor of the Australian National University, Canberra will be the new Editor-in-Chief of Journal of Physics A: Mathematical and Theoretical. Murray Batchelor has been Editor of the Mathematical Physics section of the journal since 2007. Prior to this, he served as a Board Member and an Advisory Panel member for the journal. His primary area of research is the statistical mechanics of exactly solved models. He holds a joint appointment in mathematics and physics and has held visiting positions at the Universities of Leiden, Amsterdam, Oxford and Tokyo. We very much look forward to working with Murray to continue to improve the journal's quality and interest to the readership. We would like to thank our outgoing Editor-in-Chief, Professor Carl M Bender. Carl has done a magnificent job as Editor-in-Chief and has worked tirelessly to improve the journal over the last five years. Carl has been instrumental in designing and implementing strategies that have enhanced the quality of papers published and service provided by Journal of Physics A: Mathematical and Theoretical. Notably, under his tenure, we have introduced the Fast Track Communications (FTC) section to the journal. This section provides a venue for outstanding short papers that report new and timely developments in mathematical and theoretical physics and offers accelerated publication and high visibility for our authors. During the last five years, we have raised the quality threshold for acceptance in the journal and now reject over 60% of submissions. As a result, papers published in Journal of Physics A: Mathematical and Theoretical are amongst the best in the field. We have also maintained and improved on our excellent receipt-to-first-decision times, which now average less than 50 days for papers. We have recently announced another innovation; the Journal of Physics A Best Paper Prize. These prizes will honour excellent papers

  11. Predictive effects of previous episodes on the risk of recurrence in depressive and bipolar disorders

    DEFF Research Database (Denmark)

    Kessing, Lars Vedel; Andersen, Per Kragh

    2005-01-01

    Findings from several studies have suggested that the risk of recurrence increases with the number of previous episodes in depressive and bipolar disorders. However, a comprehensive and critical review of the literature published during the past century shows that in several previous studies...

  12. Testing the performance of empirical remote sensing algorithms in the Baltic Sea waters with modelled and in situ reflectance data

    Directory of Open Access Journals (Sweden)

    Martin Ligi

    2017-01-01

Full Text Available Remote sensing studies published up to now show that the performance of empirical (band-ratio type) algorithms in different parts of the Baltic Sea is highly variable. The best performing algorithms differ between regions of the Baltic Sea. Moreover, there is indication that the algorithms have to be seasonal, as the optical properties of the phytoplankton assemblages dominating in spring and summer are different. We modelled 15,600 reflectance spectra using the HydroLight radiative transfer model to test 58 previously published empirical algorithms. 7200 of the spectra were modelled using specific inherent optical properties (SIOPs) of the open parts of the Baltic Sea in summer and 8400 with SIOPs of the spring season. The concentration ranges of chlorophyll-a, coloured dissolved organic matter (CDOM) and suspended matter used in the model simulations were based on actually measured values available in the literature. For each optically active constituent we added one concentration below the measured minimum and one concentration above the measured maximum in order to test the performance of the algorithms over a wider range. 77 in situ reflectance spectra from rocky (Sweden) and sandy (Estonia, Latvia) coastal areas were used to evaluate the performance of the algorithms in coastal waters as well. Seasonal differences in algorithm performance were confirmed, but we also found algorithms that can be used in both spring and summer conditions. The algorithms that use bands available on OLCI, launched in February 2016, are highlighted, as this sensor will be available for Baltic Sea monitoring for coming decades.
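Band-ratio chlorophyll algorithms of the kind tested here generally share one functional form: a polynomial in the log of a blue-to-green reflectance ratio, exponentiated back to a concentration. The sketch below shows that form only; the coefficients are hypothetical placeholders, not those of any of the 58 published algorithms evaluated in the study.

```python
import math

def band_ratio_chl(rrs_blue, rrs_green, coeffs=(0.3, -2.5, 1.2, -0.6)):
    """Generic band-ratio chlorophyll-a estimate (mg m^-3).

    Follows the common OC-style form chl = 10**P(log10(ratio)), where P
    is a polynomial in x = log10(Rrs_blue / Rrs_green). The default
    coefficients are illustrative placeholders only.
    """
    x = math.log10(rrs_blue / rrs_green)
    poly = sum(c * x**i for i, c in enumerate(coeffs))
    return 10.0 ** poly

# With these placeholder coefficients, a higher blue/green ratio
# (clearer water) maps to a lower chlorophyll estimate.
low = band_ratio_chl(0.004, 0.002)    # ratio 2.0
mid = band_ratio_chl(0.002, 0.002)    # ratio 1.0
high = band_ratio_chl(0.001, 0.002)   # ratio 0.5
```

Regional or seasonal tuning, as the abstract describes, amounts to refitting the coefficient tuple against local in situ data.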

  13. Webmail: an Automated Web Publishing System

    Science.gov (United States)

    Bell, David

    A system for publishing frequently updated information to the World Wide Web will be described. Many documents now hosted by the NOAO Web server require timely posting and frequent updates, but need only minor changes in markup or are in a standard format requiring only conversion to HTML. These include information from outside the organization, such as electronic bulletins, and a number of internal reports, both human and machine generated. Webmail uses procmail and Perl scripts to process incoming email messages in a variety of ways. This processing may include wrapping or conversion to HTML, posting to the Web or internal newsgroups, updating search indices or links on related pages, and sending email notification of the new pages to interested parties. The Webmail system has been in use at NOAO since early 1997 and has steadily grown to include fourteen recipes that together handle about fifty messages per week.
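The real Webmail system used procmail and Perl recipes; as a rough illustration of its simplest recipe, wrapping an incoming plain-text message as an HTML page, here is a minimal Python sketch. The message content and function name are invented for the example.

```python
import email
import html
from email.policy import default

def wrap_message_as_html(raw_message: str) -> str:
    """Wrap a plain-text email body in a minimal HTML page.

    A sketch of the 'wrapping' step described above; the actual system
    supported many more recipes (format conversion, index updates,
    newsgroup posting, notification email).
    """
    msg = email.message_from_string(raw_message, policy=default)
    subject = html.escape(msg["subject"] or "(no subject)")
    body = html.escape(msg.get_body(preferencelist=("plain",)).get_content())
    return (f"<html><head><title>{subject}</title></head>"
            f"<body><h1>{subject}</h1><pre>{body}</pre></body></html>")

raw = "Subject: Nightly Bulletin\n\nAll telescopes operational.\n"
page = wrap_message_as_html(raw)
```

In the described system, a dispatcher (procmail in the original) would route each incoming message to the appropriate recipe before a step like this one runs.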

  14. A Learned Society's Perspective on Publishing.

    Science.gov (United States)

    Suzuki, Kunihiko; Edelson, Alan; Iversen, Leslie L; Hausmann, Laura; Schulz, Jörg B; Turner, Anthony J

    2016-10-01

    Scientific journals that are owned by a learned society, like the Journal of Neurochemistry (JNC), which is owned by the International Society for Neurochemistry (ISN), benefit the scientific community in that a large proportion of the income is returned to support the scientific mission of the Society. The income generated by the JNC enables the ISN to organize conferences as a platform for members and non-members alike to share their research, supporting researchers particularly in developing countries by travel grants and other funds, and promoting education in student schools. These direct benefits and initiatives for ISN members and non-members distinguish a society journal from pure commerce. However, the world of scholarly publishing is changing rapidly. Open access models have challenged the business model of traditional journal subscription and hence provided free access to publicly funded scientific research. In these models, the manuscript authors pay a publication cost after peer review and acceptance of the manuscript. Over the last decade, numerous new open access journals have been launched and traditional subscription journals have started to offer open access (hybrid journals). However, open access journals follow the general scheme that, of all participating parties, the publisher receives the highest financial benefit. The income is generated by researchers whose positions and research are mostly financed by taxpayers' or funders' money, and by reviewers and editors, who frequently are not reimbursed. Last but not least, the authors pay for the publication of their work after a rigorous and sometimes painful review process. JNC itself has an open access option, at a significantly reduced cost for Society members as an additional benefit. 
This article provides first-hand insights from two former Editors-in-Chief, Kunihiko Suzuki and Leslie Iversen, about the history of JNC's ownership and about the difficulties and battles fought along the way to

  15. Promotion and monetization in publishing mobile applications

    Directory of Open Access Journals (Sweden)

    Fernando Peinado Miguel

    2016-07-01

Full Text Available New technologies and digital systems for transmitting the message open up novel business channels that can support current media management, and the printed media can benefit, since they may find substantial advantages in a channel of growing consumption: mobile applications. The modern editor coexists with these new technologies, native applications or applications built on web languages that offer multiple possibilities for loyalty, monetization, distribution and synergies with the channels already considered classic. The study of apps that generate revenue in the media may define a new contextual paradigm for journalism and establish the key factors for better profitability for the participants in this industry, which is increasingly weakened, financially and as a business, by the accelerated arrival of technology in publishing groups accustomed to changes of slow evolution.

  16. Quantum Computation and Algorithms

    International Nuclear Information System (INIS)

    Biham, O.; Biron, D.; Biham, E.; Grassi, M.; Lidar, D.A.

    1999-01-01

    It is now firmly established that quantum algorithms provide a substantial speedup over classical algorithms for a variety of problems, including the factorization of large numbers and the search for a marked element in an unsorted database. In this talk I will review the principles of quantum algorithms, the basic quantum gates and their operation. The combination of superposition and interference, that makes these algorithms efficient, will be discussed. In particular, Grover's search algorithm will be presented as an example. I will show that the time evolution of the amplitudes in Grover's algorithm can be found exactly using recursion equations, for any initial amplitude distribution
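The amplitude recursion mentioned in the abstract can be written down exactly for the simplest case of one marked item among N: the oracle flips the sign of the marked amplitude and the diffusion operator inverts every amplitude about the mean, which yields a linear two-variable recursion. A short sketch:

```python
import math

def grover_amplitudes(n_items, iterations):
    """Exact amplitude recursion for Grover search, one marked item.

    k is the marked amplitude and l the common unmarked amplitude.
    Each iteration composes the oracle sign flip with inversion about
    the mean, giving the linear update below.
    """
    N = n_items
    k = l = 1.0 / math.sqrt(N)          # uniform initial superposition
    for _ in range(iterations):
        k, l = ((N - 2) * k + 2 * (N - 1) * l) / N, ((N - 2) * l - 2 * k) / N
    return k, l

# The near-optimal iteration count is round(pi/4 * sqrt(N)); after that
# many iterations the marked-item probability k**2 is close to 1.
N = 1024
k, l = grover_amplitudes(N, round(math.pi / 4 * math.sqrt(N)))
```

The closed form k = sin((2t+1)·arcsin(1/√N)) after t iterations follows from this recursion, which is the kind of exact analysis the talk describes.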

  17. Marshall Rosenbluth and the Metropolis algorithm

    International Nuclear Information System (INIS)

    Gubernatis, J.E.

    2005-01-01

The 1953 publication, 'Equation of State Calculations by Very Fast Computing Machines' by N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller [J. Chem. Phys. 21, 1087 (1953)] marked the beginning of the use of the Monte Carlo method for solving problems in the physical sciences. The method described in this publication subsequently became known as the Metropolis algorithm, undoubtedly the most famous and most widely used Monte Carlo algorithm ever published. As none of the authors made subsequent use of the algorithm, they became unknown to the large simulation physics community that grew from this publication and their roles in its development became the subject of mystery and legend. At a conference marking the 50th anniversary of the 1953 publication, Marshall Rosenbluth gave his recollections of the algorithm's development. The present paper describes the algorithm, reconstructs the historical context in which it was developed, and summarizes Marshall's recollections
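The accept/reject rule introduced in the 1953 paper (propose a symmetric random move, accept with probability min(1, p(x')/p(x))) can be sketched in a few lines. The target here, a standard normal, and all parameter values are illustrative choices, not anything from the paper.

```python
import math
import random

def metropolis_sample(log_prob, x0=0.0, steps=50_000, width=2.0, seed=1):
    """Random-walk Metropolis sampler.

    Proposes a symmetric uniform move and accepts it with probability
    min(1, p(x')/p(x)), computed in log space for numerical safety;
    on rejection the current state is recorded again.
    """
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(steps):
        proposal = x + rng.uniform(-width, width)
        log_ratio = log_prob(proposal) - log_prob(x)
        if log_ratio >= 0 or rng.random() < math.exp(log_ratio):
            x = proposal
        samples.append(x)
    return samples

# Sample from a standard normal target: log p(x) = -x^2/2 + const.
samples = metropolis_sample(lambda x: -0.5 * x * x)
```

Note that only the ratio p(x')/p(x) is needed, so the normalizing constant of the target never has to be computed, which is precisely what made the method practical for equations of state.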

  18. Issues in Science Publishing. What's Hot and What's not?

    Directory of Open Access Journals (Sweden)

    Jaime A. Teixeira da Silva

    2015-06-01

Full Text Available Science is in crisis: a crisis of trust, and a crisis of values. Yet, this is an opportune moment for scientists to examine the issues that underlie science to discover how they may be of use, beyond their laboratory or field experience, to improve the research and publishing landscapes and to create an environment that better suits their needs. Traditionally, the science publishing landscape has been controlled by the science, technology and medicine publishers, who have always touted their peer review systems as being fail-safe. Yet, considerable moss has been gathered by the post-publication peer review (PPPR) movement over the past few years, indicating that the voice of the average scientist now carries more weight, and more value, than ever before. Despite this, most scientists are unaware of their potential power of opinion, especially when it comes to commenting on, and correcting, the already published literature. Commenting by name, or anonymously, is the new PPPR publishing reality. There needs also to be a concomitant movement away from artificial metrics, such as the impact factor, which serve only as ego-boosting parameters and which distract the wider readership from the weaknesses of the traditional peer review system currently in place. Increasing abuse of peer review, such as the creation of fake identities, affiliations or e-mail addresses, further highlights the need for scientists to be vigilant, without necessarily being vigilantes. The discovery, within a matter of years, that the literature is more corrupted than was previously thought, in some cases through clear cases of editorial cronyism or abuse, has created a need for scientists to exceed their functions as mere scientists and evolve into whistle-blowers. Some ethical guidelines are in place, such as those by COPE, yet what is being increasingly witnessed is a discrepancy between the values preached by select COPE member journals, and the literature that

  19. A leaf sequencing algorithm to enlarge treatment field length in IMRT

    International Nuclear Information System (INIS)

    Xia Ping; Hwang, Andrew B.; Verhey, Lynn J.

    2002-01-01

With MLC-based IMRT, the maximum usable field size is often smaller than the maximum field size for conventional treatments. This is due to the constraints of the overtravel distances of MLC leaves and/or jaws. Using a new leaf sequencing algorithm, the usable IMRT field length (perpendicular to the MLC motion) can in most cases be made equal to the full length of the MLC field without violating the upper jaw overtravel limit. For any given intensity pattern, a criterion was proposed to assess whether it can be delivered without violating the jaw position constraints. If the criterion is met, the new algorithm will consider the jaw position constraints during the segmentation for the step and shoot delivery method. The strategy employed by the algorithm is to connect the intensity elements outside the jaw overtravel limits with those inside the jaw overtravel limits. Several methods were used to establish these connections during segmentation by modifying a previously published algorithm (areal algorithm), including changing the intensity level, alternating the leaf-sequencing direction, or limiting the segment field size. The algorithm was tested with 1000 random intensity patterns with dimensions of 21x27 cm2, 800 intensity patterns with higher intensity outside the jaw overtravel limit, and three different types of clinical treatment plans that were undeliverable using a segmentation method from a commercial treatment planning system. The new algorithm achieved a success rate of 100% with these test patterns. For the 1000 random patterns, the new algorithm yields a similar average number of segments (36.9±2.9, versus 36.6±1.3 when using the areal algorithm). For the 800 patterns with higher intensities outside the jaw overtravel limits, the new algorithm results in an increase of 25% in the average number of segments compared to the areal algorithm. However, the areal algorithm fails to create deliverable segments for 90% of these
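The core operation in any step-and-shoot leaf sequencer is decomposing an intensity pattern into uniform open/closed segments. As a minimal illustration (a generic level decomposition in one dimension, not the jaw-constrained areal-style algorithm of this paper), one can open segment h wherever the intensity reaches level h:

```python
def sweep_segments(profile):
    """Decompose an integer intensity profile into unit-weight segments.

    Segment h is open (1) wherever the intensity is at least h, so
    summing all segments bixel by bixel reproduces the profile. Real
    sequencers additionally enforce leaf/jaw overtravel and
    interdigitation constraints, which this sketch ignores.
    """
    return [[1 if v >= level else 0 for v in profile]
            for level in range(1, max(profile) + 1)]

profile = [0, 2, 3, 1, 0, 2]
segments = sweep_segments(profile)
```

Jaw-constrained sequencing, as described above, amounts to restricting which bixels a segment may open and compensating in later segments, while preserving this reconstruction property.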

  20. Variable depth recursion algorithm for leaf sequencing

    International Nuclear Information System (INIS)

    Siochi, R. Alfredo C.

    2007-01-01

The processes of extraction and sweep are basic segmentation steps that are used in leaf sequencing algorithms. A modified version of a commercial leaf sequencer changed the way that the extracts are selected and expanded the search space, but the modification maintained the basic search paradigm of evaluating multiple solutions, each one consisting of up to 12 extracts and a sweep sequence. While it generated the best solutions compared to other published algorithms, it used more computation time. A new, faster algorithm selects one extract at a time but calls itself as an evaluation function a user-specified number of times, after which it uses the bidirectional sweeping window algorithm as the final evaluation function. To achieve a performance comparable to that of the modified commercial leaf sequencer, 2-3 calls were needed, and in all test cases, there were only slight improvements beyond two calls. For the 13 clinical test maps, computation speeds improved by a factor between 12 and 43, depending on the constraints, namely the ability to interdigitate and the avoidance of the tongue-and-groove under dose. The new algorithm was compared to the original and modified versions of the commercial leaf sequencer. It was also compared to other published algorithms for 1400 random 15x15 test maps with 3-16 intensity levels. In every single case the new algorithm provided the best solution
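The extract-then-recurse idea with a depth-limited self-call can be illustrated on a 1D profile. Everything below is a deliberate simplification for illustration: candidate extracts are maximal positive runs at their minimum height, the base-case evaluation is the standard sum-of-positive-gradients sweep count, and none of it reproduces the paper's actual sequencer.

```python
def sweep_count(profile):
    """Unit segments needed by a simple unidirectional sweep: the sum of
    positive gradients of the profile (a standard 1D result)."""
    prev, total = 0, 0
    for v in profile:
        total += max(0, v - prev)
        prev = v
    return total

def candidate_extracts(profile):
    """Maximal runs of positive intensity; each candidate extract removes
    the run's minimum value from every bixel in the run (one segment)."""
    runs, i, n = [], 0, len(profile)
    while i < n:
        if profile[i] > 0:
            j = i
            while j < n and profile[j] > 0:
                j += 1
            runs.append((i, j, min(profile[i:j])))
            i = j
        else:
            i += 1
    return runs

def min_segments(profile, depth):
    """Depth-limited recursion: try each extract, call itself with one
    less level of lookahead, and fall back to sweep_count at depth 0."""
    if not any(profile):
        return 0
    if depth == 0:
        return sweep_count(profile)
    best = sweep_count(profile)        # sweeping alone is always an option
    for i, j, h in candidate_extracts(profile):
        rest = list(profile)
        for m in range(i, j):
            rest[m] -= h
        best = min(best, 1 + min_segments(rest, depth - 1))
    return best
```

Deeper recursion can only match or improve the segment count, at exponentially growing cost, which mirrors the paper's observation that gains beyond two or three self-calls were slight.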

  1. Biomedical informatics: we are what we publish.

    Science.gov (United States)

    Elkin, P L; Brown, S H; Wright, G

    2013-01-01

This article is part of a For-Discussion-Section of Methods of Information in Medicine on "Biomedical Informatics: We are what we publish". It is introduced by an editorial and followed by a commentary paper with invited comments. In subsequent issues the discussion may continue through letters to the editor. Informatics experts have attempted to define the field via consensus projects, which have led to consensus statements by both AMIA and IMIA. We add to the output of this process the results of a study of the PubMed publications with abstracts from the field of Biomedical Informatics. We took the terms from the AMIA consensus document and the terms from the IMIA definitions of the field of Biomedical Informatics and combined them through human review to create the Health Informatics Ontology. We built a terminology server using the Intelligent Natural Language Processor (iNLP). Then we downloaded the entire set of articles in Medline identified by searching the literature for "Medical Informatics" OR "Bioinformatics". The articles were parsed by the joint AMIA/IMIA terminology and then again using SNOMED CT; the Bioinformatics articles were also parsed using the HGNC Ontology. We identified 153,580 articles using "Medical Informatics" and 20,573 articles using "Bioinformatics". This resulted in 168,298 unique articles and an overlap of 5,855 articles. Of these, 62,244 articles (37%) had titles and abstracts that contained at least one concept from the Health Informatics Ontology. SNOMED CT indexing showed that the field interacts with nearly all clinical fields of medicine. Further defining the field by what we publish can add value to the consensus driven processes that have been the mainstay of the efforts to date. Next steps should be to extract terms from the literature that are uncovered and create class hierarchies and relationships for this content. We should also examine the frequent occurrence of MeSH terms as markers to define Biomedical Informatics
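The reported counts are internally consistent under inclusion-exclusion, which is worth making explicit since the overlap is what quantifies how much "Medical Informatics" and "Bioinformatics" share:

```python
# Counts reported in the abstract: articles retrieved per search term,
# and the number of unique articles after merging the two result sets.
medical_informatics = 153_580
bioinformatics = 20_573
unique_articles = 168_298

# Inclusion-exclusion: |A| + |B| - |A ∪ B| = |A ∩ B|, the overlap.
overlap = medical_informatics + bioinformatics - unique_articles

# Share of unique articles containing at least one ontology concept.
ontology_matches = 62_244
share_percent = round(100 * ontology_matches / unique_articles)
```

This recovers the abstract's overlap of 5,855 articles and its 37% ontology-coverage figure.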

  2. A numeric comparison of variable selection algorithms for supervised learning

    International Nuclear Information System (INIS)

    Palombo, G.; Narsky, I.

    2009-01-01

Datasets in modern High Energy Physics (HEP) experiments are often described by dozens or even hundreds of input variables. Reducing a full variable set to a subset that most completely represents information about data is therefore an important task in analysis of HEP data. We compare various variable selection algorithms for supervised learning using several datasets, such as the imaging gamma-ray Cherenkov telescope (MAGIC) data found at the UCI repository. We use classifiers and variable selection methods implemented in the statistical package StatPatternRecognition (SPR), a free open-source C++ package developed in the HEP community (http://sourceforge.net/projects/statpatrec/). For each dataset, we select a powerful classifier and estimate its learning accuracy on variable subsets obtained by various selection algorithms. When possible, we also estimate the CPU time needed for the variable subset selection. The results of this analysis are compared with those published previously for these datasets using other statistical packages such as R and Weka. We show that the most accurate, yet slowest, method is a wrapper algorithm known as generalized sequential forward selection ('Add N Remove R') implemented in SPR.
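The wrapper idea can be sketched in its simplest form, plain sequential forward selection ("Add 1"), where 'Add N Remove R' generalizes each step to adding N features and removing R. The evaluation function below is a toy stand-in with hand-picked weights and a redundancy penalty, not SPR's cross-validated classifier accuracy:

```python
def forward_select(features, evaluate, min_gain=1e-9):
    """Greedy sequential forward selection: repeatedly add the single
    feature that most improves the score; stop when no addition helps."""
    selected, score = [], evaluate([])
    while True:
        best_feat, best_score = None, score
        for f in features:
            if f in selected:
                continue
            s = evaluate(selected + [f])
            if s > best_score + min_gain:
                best_feat, best_score = f, s
        if best_feat is None:
            return selected, score
        selected.append(best_feat)
        score = best_score

# Toy evaluation: additive usefulness per feature, minus a penalty when
# two strongly redundant features (0 and 1) are both present.
WEIGHTS = {0: 0.5, 1: 0.4, 2: 0.3, 3: 0.1, 4: 0.0}

def toy_score(subset):
    penalty = 0.35 if 0 in subset and 1 in subset else 0.0
    return sum(WEIGHTS[f] for f in subset) - penalty

subset, final = forward_select(list(WEIGHTS), toy_score)
```

Because the score is re-evaluated on each candidate subset through the model (the "wrapper"), redundancy between features is handled automatically, which is also why such methods are the slowest in the comparison.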

  3. Publisher Correction: Measuring progress from nationally determined contributions to mid-century strategies

    Science.gov (United States)

    Iyer, Gokul; Ledna, Catherine; Clarke, Leon; Edmonds, James; McJeon, Haewon; Kyle, Page; Williams, James H.

    2018-03-01

    In the version of this Article previously published, technical problems led to the wrong summary appearing on the homepage, and an incorrect Supplementary Information file being uploaded. Both errors have now been corrected.

  4. A second chance for authors of hijacked journals to publish in legitimate journals.

    Science.gov (United States)

    Jalalian, Mehrdad

    2015-01-01

    This article proposes the republication of articles that have previously been published in counterfeit websites of hijacked journals. The paper also discusses the technical and ethical aspects of republishing such articles.

  5. IAEA publishes first health and safety manual

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1959-01-15

A 'Manual on the Safe Handling of Radioisotopes' was published in English on 15 December 1958 by the International Atomic Energy Agency. This is a comprehensive handbook of internationally compiled recommendations for users of radioisotopes. It covers organizational, medical and technical aspects of radiation safety practices. It is also the Agency's first technical publication. French, Russian and Spanish editions will appear shortly. The Manual should prove useful to all users of radioisotopes in industry, medicine, research, etc., but is directed mainly to small scale users who may not have access to other sources of information. The recommendations apply only to radioactivity surpassing the limit of 0.002 microcurie concentration per gram of material; or a total activity of more than 0.1 microcuries in the working areas; this limit is based on the most dangerous radioisotopes. The experts state that the limiting level might be higher for less dangerous isotopes, but recommend that all be treated as potentially dangerous. This would have educational value and avoid accidents caused by misidentification. The Manual also stressed that good radiation safety practices depend on effective organization and warns that even very competent workers sometimes ignore or forget important health and safety requirements.

  6. History of attempts to publish a paper

    International Nuclear Information System (INIS)

    Kowalski, Ludwik

    2006-01-01

A paper reviewing a recent cold fusion claim, written by the author in 2004, has been rejected (without being sent to referees and without any criticism being offered) by the editors of seven journals, namely, Physics Today (USA), American Scientist (USA), Scientific American (USA), Nature (UK), New Scientist (UK), The Physics Teacher (USA), Science (USA). The present paper has the following contents: 1. Here is how my paper was introduced to the editor of one of the above journals. Other accompanying letters were similar; 2. In rejecting my paper the editor of Physics Today wrote; 3. And here is how the editor of American Scientist responded to my submission; 4. Responding to the above I wrote; 5. Seek not the golden egg, seek the goose; 6. In a subsequent reply I wrote; 7. The manuscript was submitted to Scientific American; 8. I then tried to publish the paper in Nature; 9. I then tried another UK journal, New Scientist; 10. My immediate reply; 11. The manuscript was then submitted to the Editor in chief of Science

  7. History of attempts to publish a paper

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, Ludwik [Department of Mathematical Sciences, Montclair State University, 341, Brook Avenue, Passaic NJ, 07055 (United States)

    2006-07-01

A paper reviewing a recent cold fusion claim, written by the author in 2004, has been rejected (without being sent to referees and without any criticism being offered) by the editors of seven journals, namely, Physics Today (USA), American Scientist (USA), Scientific American (USA), Nature (UK), New Scientist (UK), The Physics Teacher (USA), Science (USA). The present paper has the following contents: 1. Here is how my paper was introduced to the editor of one of the above journals. Other accompanying letters were similar; 2. In rejecting my paper the editor of Physics Today wrote; 3. And here is how the editor of American Scientist responded to my submission; 4. Responding to the above I wrote; 5. Seek not the golden egg, seek the goose; 6. In a subsequent reply I wrote; 7. The manuscript was submitted to Scientific American; 8. I then tried to publish the paper in Nature; 9. I then tried another UK journal, New Scientist; 10. My immediate reply; 11. The manuscript was then submitted to the Editor in chief of Science.

  8. New developments in publishing related to authorship.

    Science.gov (United States)

    Donev, Doncho

    2014-01-01

To present inappropriate types of authorship and practice, and the most recent developments related to basic principles and criteria for a fair system of allocating authorship in scientific publications. An analysis of relevant materials and documents, internet sources, published literature, and the author's personal experience and observations. Working in multidisciplinary teams is a common feature of modern research processes. The most sensitive question is how to decide whom to acknowledge as an author of a multi-authored publication. The pertinence of this question is growing with the increasing importance of individual scientists' publication records for professional status and career. However, discussions about authorship allocation might lead to serious conflicts and disputes among coworkers which could even endanger cooperation and successful completion of a research project. It seems that discussion and education about ethical standards and practical guidelines for fairly allocating authorship are insufficient, and the question of ethical practices related to authorship in multi-authored publications remains generally unresolved. It is necessary to raise awareness about the importance of, and need for, education about the principles of scientific communication, the fair allocation of authorship, the ethics of research, and the publication of results. The use of various forms of education in the scientific community, especially among young researchers and students, in order to create an ethical environment, is one of the most effective ways to prevent the emergence of scientific and publication dishonesty and fraud, including the pathology of authorship.

  9. IAEA publishes first health and safety manual

    International Nuclear Information System (INIS)

    1959-01-01

    A 'Manual on the Safe Handling of Radioisotopes' was published in English on 15 December 1958 by the International Atomic Energy Agency. This is a comprehensive handbook of internationally compiled recommendations for users of radioisotopes. It covers organizational, medical and technical aspects of radiation safety practices. It is also the Agency's first technical publication. French, Russian and Spanish editions will appear shortly. The Manual should prove useful to all users of radioisotopes in industry, medicine, research, etc., but is directed mainly to small-scale users who may not have access to other sources of information. The recommendations apply only to radioactivity surpassing a concentration of 0.002 microcuries per gram of material, or a total activity of more than 0.1 microcuries in the working areas; this limit is based on the most dangerous radioisotopes. The experts state that the limiting level might be higher for less dangerous isotopes, but recommend that all be treated as potentially dangerous. This would have educational value and avoid accidents caused by misidentification. The Manual also stresses that good radiation safety practices depend on effective organization and warns that even very competent workers sometimes ignore or forget important health and safety requirements.

  10. Clinical effectiveness of a Bayesian algorithm for the diagnosis and management of heparin-induced thrombocytopenia.

    Science.gov (United States)

    Raschke, R A; Gallo, T; Curry, S C; Whiting, T; Padilla-Jones, A; Warkentin, T E; Puri, A

    2017-08-01

    Essentials We previously published a diagnostic algorithm for heparin-induced thrombocytopenia (HIT). In this study, we validated the algorithm in an independent large healthcare system. The accuracy was 98%, sensitivity 82% and specificity 99%. The algorithm has potential to improve accuracy and efficiency in the diagnosis of HIT. Background Heparin-induced thrombocytopenia (HIT) is a life-threatening drug reaction caused by antiplatelet factor 4/heparin (anti-PF4/H) antibodies. Commercial tests to detect these antibodies have suboptimal operating characteristics. We previously developed a diagnostic algorithm for HIT that incorporated 'four Ts' (4Ts) scoring and a stratified interpretation of an anti-PF4/H enzyme-linked immunosorbent assay (ELISA) and yielded a discriminant accuracy of 0.97 (95% confidence interval [CI], 0.93-1.00). Objectives The purpose of this study was to validate the algorithm in an independent patient population and quantitate effects that algorithm adherence could have on clinical care. Methods A retrospective cohort comprised patients who had undergone anti-PF4/H ELISA and serotonin release assay (SRA) testing in our healthcare system from 2010 to 2014. We determined the algorithm recommendation for each patient, compared recommendations with the clinical care received, and enumerated consequences of discrepancies. Operating characteristics were calculated for algorithm recommendations using SRA as the reference standard. Results Analysis was performed on 181 patients, 10 of whom were ruled in for HIT. The algorithm accurately stratified 98% of patients (95% CI, 95-99%), ruling out HIT in 158, ruling in HIT in 10 and recommending an SRA in 13 patients. Algorithm adherence would have obviated 165 SRAs and prevented 30 courses of unnecessary antithrombotic therapy for HIT. Diagnostic sensitivity was 0.82 (95% CI, 0.48-0.98), specificity 0.99 (95% CI, 0.97-1.00), PPV 0.90 (95% CI, 0.56-0.99) and NPV 0.99 (95% CI, 0.96-1.00). 
Conclusions An

  11. An Algorithm for the Convolution of Legendre Series

    KAUST Repository

    Hale, Nicholas; Townsend, Alex

    2014-01-01

    An O(N²) algorithm for the convolution of compactly supported Legendre series is described. The algorithm is derived from the convolution theorem for Legendre polynomials and the recurrence relation satisfied by spherical Bessel functions. Combining with previous work yields an O(N²) algorithm for the convolution of Chebyshev series. Numerical results are presented to demonstrate the improved efficiency over the existing algorithm. © 2014 Society for Industrial and Applied Mathematics.

  12. Fermion cluster algorithms

    International Nuclear Information System (INIS)

    Chandrasekharan, Shailesh

    2000-01-01

    Cluster algorithms have been recently used to eliminate sign problems that plague Monte-Carlo methods in a variety of systems. In particular such algorithms can also be used to solve sign problems associated with the permutation of fermion world lines. This solution leads to the possibility of designing fermion cluster algorithms in certain cases. Using the example of free non-relativistic fermions we discuss the ideas underlying the algorithm

  13. Autonomous Star Tracker Algorithms

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren

    1998-01-01

    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performance.

  14. A note on the linear memory Baum-Welch algorithm

    DEFF Research Database (Denmark)

    Jensen, Jens Ledet

    2009-01-01

    We demonstrate the simplicity and generality of the recently introduced linear space Baum-Welch algorithm for hidden Markov models. We also point to previous literature on the subject.

  15. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (i.e., to deliver less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise, and the noise will propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a previously scanned normal-dose CT image of high diagnostic quality may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further explores ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of the nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties.
The gain by the use
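The nonlocal-means mechanism described above can be sketched in one dimension: each restored sample is a weighted average of the previous normal-dose signal, with weights derived from patch similarity between the current noisy signal and the prior. This is only an illustration of the idea; the patch radius, the smoothing parameter h and the 1D setting are assumptions for brevity, not the paper's ndiNLM implementation.

```python
import numpy as np

def nlm_with_prior(noisy, prior, patch_radius=2, h=0.3):
    """Restore `noisy` by averaging samples of the normal-dose `prior`,
    weighting each prior patch by its similarity to the local noisy patch."""
    n = len(noisy)
    pad_noisy = np.pad(noisy, patch_radius, mode="edge")
    pad_prior = np.pad(prior, patch_radius, mode="edge")
    out = np.empty(n)
    for i in range(n):
        patch_i = pad_noisy[i:i + 2 * patch_radius + 1]
        weights = np.empty(n)
        for j in range(n):
            patch_j = pad_prior[j:j + 2 * patch_radius + 1]
            # similarity weight: Gaussian in the mean squared patch difference
            weights[j] = np.exp(-np.mean((patch_i - patch_j) ** 2) / h ** 2)
        out[i] = np.sum(weights * prior) / np.sum(weights)
    return out

rng = np.random.default_rng(1)
clean = np.repeat([0.0, 1.0], 50)               # simple step "image"
noisy = clean + 0.2 * rng.standard_normal(100)  # simulated low-dose noise
restored = nlm_with_prior(noisy, clean)         # prior: previous normal-dose scan
```

Because the weights come from patch matching rather than from pixel position, a modest misalignment between the prior and the current scan mainly redistributes weight instead of corrupting the estimate, which is the intuition behind the reduced registration sensitivity claimed above.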

  16. Publishing integrity and good practices in editing in biomedicine.

    Science.gov (United States)

    Polenakovic, Momir; Gucev, Zoran

    2014-01-01

    accept reports which support the reviewer's concepts of thinking and, like Procrustes, cutting everything else out. Authorship is often a contentious issue, as undeserved authors appear on the list of authors. Some principles are now a norm in academic publishing. This applies to the declaration of a conflict of interest, the consent of the patient and the approval of the Ethical Board of the institution. This global informational technological revolution has, unfortunately, led to widespread and increasingly sophisticated deviations: plagiarism, data fabrication and data falsification as forms of scientific misconduct. These events are now more widespread than in the past. Fortunately, the new tools to track them are much better than before. The race for perfect publishing integrity and for the best good practices in editing in biomedicine is on. New and old challenges will be met. A benevolent and caring society, educated professionals and an enlightened public remain essential preconditions. The wealth of nations depends on R&D and consequently on academic publishing.

  17. Algorithm for Video Summarization of Bronchoscopy Procedures

    Directory of Open Access Journals (Sweden)

    Leszczuk Mikołaj I

    2011-12-01

    Full Text Available Abstract Background The duration of bronchoscopy examinations varies considerably depending on the diagnostic and therapeutic procedures used. It can last more than 20 minutes if a complex diagnostic work-up is included. With wide access to videobronchoscopy, the whole procedure can be recorded as a video sequence. Common practice relies on an active attitude of the bronchoscopist, who initiates the recording process and usually chooses to archive only selected views and sequences. However, it may be important to record the full bronchoscopy procedure as documentation when liability issues are at stake. Furthermore, an automatic recording of the whole procedure enables the bronchoscopist to focus solely on the performed procedures. Video recordings registered during bronchoscopies include a considerable number of frames of poor quality due to blurry or unfocused images. It seems that such frames are unavoidable due to the relatively tight endobronchial space, rapid movements of the respiratory tract due to breathing or coughing, and secretions which occur commonly in the bronchi, especially in patients suffering from pulmonary disorders. Methods The use of recorded bronchoscopy video sequences for diagnostic, reference and educational purposes could be considerably extended with efficient, flexible summarization algorithms. Thus, the authors developed a prototype system to create shortcuts (called summaries or abstracts) of bronchoscopy video recordings. Such a system, based on models described in previously published papers, employs image analysis methods to exclude frames or sequences of limited diagnostic or educational value. Results The algorithm for the selection or exclusion of specific frames or shots from video sequences recorded during bronchoscopy procedures is based on several criteria, including automatic detection of "non-informative" frames, frames showing the branching of the airways, and frames including pathological lesions. Conclusions
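A common building block for rejecting "non-informative" (blurred or defocused) frames is a sharpness score. The sketch below uses the variance of a discrete Laplacian as that score; the threshold, the pure-NumPy formulation and the toy frames are illustrative assumptions, not the authors' detectors.

```python
import numpy as np

def sharpness(frame):
    """Variance of a 5-point discrete Laplacian; blurred frames score low."""
    lap = (-4.0 * frame[1:-1, 1:-1]
           + frame[:-2, 1:-1] + frame[2:, 1:-1]
           + frame[1:-1, :-2] + frame[1:-1, 2:])
    return lap.var()

def informative_indices(frames, threshold=0.01):
    """Keep only frames sharp enough to plausibly carry diagnostic detail."""
    return [i for i, f in enumerate(frames) if sharpness(f) > threshold]

rng = np.random.default_rng(0)
sharp = rng.random((32, 32))               # frame with high-frequency content
blurred = np.full((32, 32), sharp.mean())  # completely defocused frame
kept = informative_indices([sharp, blurred])
```

In a real summarizer this score would be one criterion among several, combined with detectors for airway branchings and lesions as the abstract describes.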

  18. DNABIT Compress – Genome compression algorithm

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-01

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences, particularly for large genomes. Significantly better compression results show that the "DNABIT Compress" algorithm is the best among the compared compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique BIT CODE) to fragments of a DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, whereas the existing best methods could not achieve a ratio of less than 1.72 bits/base. PMID:21383923
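For context on the bits/base figures above, the trivial baseline every DNA compressor must beat is fixed 2-bit packing of the four bases (versus 8 bits/base in ASCII). The sketch below shows only that bit-packing baseline; the actual DNABIT Compress scheme additionally assigns codes to repeated fragments, which this illustration does not attempt.

```python
# Illustrative 2-bit packing of DNA bases: 4 bases per byte.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
BASE = {v: k for k, v in CODE.items()}

def pack(seq):
    """Pack a DNA string into bytes; the length is kept separately."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        chunk = seq[i:i + 4]
        for base in chunk:
            byte = (byte << 2) | CODE[base]
        byte <<= 2 * (4 - len(chunk))      # left-align a partial last byte
        out.append(byte)
    return bytes(out)

def unpack(data, n_bases):
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(BASE[(byte >> shift) & 0b11])
    return "".join(seq[:n_bases])

packed = pack("ACGTTGCA")   # 8 bases -> 2 bytes, i.e. 2 bits/base
```

Getting below 2 bits/base, as the 1.58 bits/base result does, requires exploiting repeats rather than coding bases independently.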

  19. A verified LLL algorithm

    NARCIS (Netherlands)

    Divasón, Jose; Joosten, Sebastiaan; Thiemann, René; Yamada, Akihisa

    2018-01-01

    The Lenstra-Lenstra-Lovász basis reduction algorithm, also known as LLL algorithm, is an algorithm to find a basis with short, nearly orthogonal vectors of an integer lattice. Thereby, it can also be seen as an approximation to solve the shortest vector problem (SVP), which is an NP-hard problem,
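For intuition about what LLL computes, the two-dimensional special case (the Lagrange/Gauss reduction algorithm, in which a shortest vector is provably found) is short enough to sketch. This is an informal illustration only, not the verified formal development the record refers to.

```python
def gauss_reduce(u, v):
    """Return a reduced basis of the 2D integer lattice spanned by u and v,
    whose first vector is a shortest nonzero lattice vector."""
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    if dot(u, u) > dot(v, v):
        u, v = v, u                         # keep u the shorter vector
    while True:
        m = round(dot(u, v) / dot(u, u))    # nearest-integer projection of v on u
        v = (v[0] - m * u[0], v[1] - m * u[1])
        if dot(v, v) >= dot(u, u):
            return u, v                     # no further shortening possible
        u, v = v, u

u, v = gauss_reduce((1, 1), (3, 2))         # a skewed basis of Z^2
```

LLL generalizes this size-reduce-and-swap loop to n dimensions, where it yields short, nearly orthogonal vectors and an approximation to the shortest vector.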

  20. Data publishing - visions of the future

    Science.gov (United States)

    Schäfer, Leonie; Klump, Jens; Bertelmann, Roland; Klar, Jochen; Enke, Harry; Rathmann, Torsten; Koudela, Daniela; Köhler, Klaus; Müller-Pfefferkorn, Ralph; van Uytvanck, Dieter; Strathmann, Stefan; Engelhardt, Claudia

    2013-04-01

    This poster describes future scenarios of information infrastructures in science and other fields of research. The scenarios presented are based on practical experience resulting from interaction with research data in a research center and its library, and further enriched by the results of a baseline study of existing data repositories and data infrastructures. The baseline study was conducted as part of the project "Requirements for a multi-disciplinary research data infrastructure (Radieschen)", which is funded by the German Research Foundation (DFG). Current changes in information infrastructures pose new challenges to libraries and scientific journals, which both act as information service providers, facilitating access to digital media, support publications of research data and enable their long-term archiving. Digital media and research data open new aspects in the field of activity of libraries and scientific journals. What will a library of the future look like? Will a library purely serve as interface to data centres? Will libraries and data centres merge into a new service unit? Will a future library be the interface to academic cloud services? Scientific journals already converted from mostly print editions to print and e-journals. What type of journals will emerge in the future? Is there a role for data-centred journals? Will there be journals to publish software code to make this type of research result citable and a part of the record of science? Just as users evolve from being consumers of information into producers, the role of information service providers, such as libraries, changes from a purely supporting to a contributing role. Furthermore, the role of the library changes from a central point of access for the search of publications to an important link in the value-adding chain from author to publication. Journals for software publication might be another vision for the future in data publishing. Software forms the missing link between big

  1. Civility in scientific publishing: The glyphosate paper.

    Science.gov (United States)

    Blaylock, Russell Lane

    2015-01-01

    In recent years, we have witnessed a decline in civility in the public arena when various socially sensitive issues are being presented. Those of us engaged in the publishing of scientific papers and in our comments on these papers, need to be cognizant of the social graces, courteous demeanor, and chivalry. Debates are essential to our learning and in being able to ferret out the essentials of various scientific issues that are of value. Because of the amount of time and effort connected with analyzing the complex problems and the years invested in such endeavors, we often resort to the behavior, that is, contentious and at times even quite insulting to our opponents during our defense. This is the part of human nature but as civilized human beings, we must strive to maintain the courtesy and a calm demeanor during such discussions and debates. I have yielded to such temptations myself but am striving to repent of my sins. The medical and scientific history should have taught us that in defending our ideas we learn and sometimes come to the realization that our paradigm or hypothesis is wrong, either in part or whole. Such debates allow us to fine tune our ideas and correct our errors in thinking, which are easily, consciously, or subconsciously sublimated by our enthusiasm. The glyphosate papers presented ideas that, while well supported by the scientific studies and logical conclusions, also contained some possible errors in its suppositions. Dr. Miguel Faria challenged some of these concepts and was met with some degree of derision by one of the authors. This editorial comment is in response to these issues.

  2. Modified automatic R-peak detection algorithm for patients with epilepsy using a portable electrocardiogram recorder.

    Science.gov (United States)

    Jeppesen, J; Beniczky, S; Fuglsang Frederiksen, A; Sidenius, P; Johansen, P

    2017-07-01

    Earlier studies have shown that short-term heart rate variability (HRV) analysis of the ECG seems promising for detection of epileptic seizures. A precise and accurate automatic R-peak detection algorithm is a necessity for real-time, continuous measurement of HRV in a portable ECG device. We used the portable CE-marked ePatch® heart monitor to record the ECG of 14 patients, who were enrolled in the video-EEG long-term monitoring unit for clinical workup of epilepsy. Recordings of the first 7 patients were used as the training set of data for the R-peak detection algorithm and the recordings of the last 7 patients (467.6 recording hours) were used to test the performance of the algorithm. We aimed to modify an existing QRS-detection algorithm into a more precise R-peak detection algorithm to avoid the jitter that Q- and S-peaks can create in the tachogram, which causes error in short-term HRV analysis. The proposed R-peak detection algorithm showed a high sensitivity (Se = 99.979%) and positive predictive value (P+ = 99.976%), which was comparable with a previously published QRS-detection algorithm for the ePatch® ECG device when testing the same dataset. The novel R-peak detection algorithm designed to avoid jitter has very high sensitivity and specificity and thus is a suitable tool for robust, fast, real-time HRV analysis in patients with epilepsy, creating the possibility of real-time seizure detection for these patients.
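The refinement step the abstract motivates, locating the exact R maximum rather than an arbitrary point on the QRS complex, can be sketched generically: detect an energy burst, then snap to the raw-signal maximum nearby. The synthetic signal, thresholds and window lengths below are assumptions for illustration, not the published ePatch algorithm.

```python
import numpy as np

def detect_r_peaks(ecg, fs, threshold_frac=0.5, refractory_s=0.2):
    """Find QRS energy bursts, then refine each to the raw-signal maximum
    (the R-peak) so the tachogram is free of Q/S jitter."""
    win = int(0.05 * fs)
    energy = np.convolve(np.diff(ecg) ** 2, np.ones(win), mode="same")
    threshold = threshold_frac * energy.max()
    peaks, last = [], -np.inf
    for i in np.flatnonzero(energy > threshold):
        if i - last < refractory_s * fs:
            continue                        # still inside the previous beat
        lo, hi = max(i - win, 0), min(i + win, len(ecg))
        peaks.append(lo + int(np.argmax(ecg[lo:hi])))  # exact R maximum
        last = i
    return np.array(peaks)

fs = 250
ecg = np.zeros(10 * fs)
for p in range(200, len(ecg), 200):          # synthetic beats at 75 bpm
    ecg[p - 1:p + 2] = [0.5, 1.0, 0.5]
peaks = detect_r_peaks(ecg, fs)
```

The final `argmax` over the raw signal is the point of the exercise: thresholding alone returns a crossing somewhere on the QRS slope, and the sample-level offset it introduces varies beat to beat, which is exactly the jitter that corrupts short-term HRV measures.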

  3. Mixed media : feminist presses and publishing politics in twentieth-century Britain

    OpenAIRE

    Murray, S. E.

    1999-01-01

    The high cultural profile of contemporary feminist publishing in Britain has previously met with a curiously evasive response from those spheres of academic discourse in which it might be expected to figure: women's studies, while asserting the innate politicality of all communication, has tended to overlook the subject of publishing in favour of less materialist cultural modes; while publishing studies has conventionally overlooked the significance of gender as a differential ...

  4. HEART TRANSPLANTATION IN PATIENTS WITH PREVIOUS OPEN HEART SURGERY

    Directory of Open Access Journals (Sweden)

    R. Sh. Saitgareev

    2016-01-01

    Full Text Available Heart transplantation (HTx) to date remains the most effective and radical method of treatment of patients with end-stage heart failure. The deficit of donor hearts is forcing increasing resort to different long-term mechanical circulatory support systems, including as a «bridge» to follow-up HTx. According to the ISHLT Registry, the proportion of recipients who had previously undergone cardiopulmonary bypass surgery increased from 40% in the period from 2004 to 2008 to 49.6% for the period from 2009 to 2015. HTx performed in such repeat patients, on the one hand, involves considerable technical difficulties and high risks; on the other hand, there is often no alternative medical intervention to HTx, and unless dictated by absolute contraindications, denial of the surgery is equivalent to 100% mortality. This review summarizes the results of a number of published studies aimed at understanding the immediate and late results of HTx in patients who previously underwent open heart surgery. The effect of resternotomy during HTx, the specific features associated with its implementation in recipients previously operated on open heart, and its effects on immediate and long-term survival were considered in this review. Results of studies analyzing the risk factors for perioperative complications in repeat recipients were also demonstrated. Separately, HTx risks after implantation of prolonged mechanical circulatory support systems were examined. The literature does not allow a clear definition of the impact of earlier open heart surgery on the course of the perioperative period and on the prognosis of survival in recipients who underwent HTx. On the other hand, provided that the HTx and the perioperative period follow a regular course, the risks in this clinical situation are justified, and the long-term prognosis of recipients who previously underwent open heart surgery is comparable to that of patients who underwent primary HTx. Studies

  5. Nature-inspired optimization algorithms

    CERN Document Server

    Yang, Xin-She

    2014-01-01

    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  6. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    Science.gov (United States)

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of lacking sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. 
Allowing users to select eight existing performance indices and 15

  7. VISUALIZATION OF PAGERANK ALGORITHM

    OpenAIRE

    Perhaj, Ervin

    2013-01-01

    The goal of the thesis is to develop a web application that helps users understand the functioning of the PageRank algorithm. The thesis consists of two parts. First we develop an algorithm to calculate the PageRank values of web pages. The input of the algorithm is a list of web pages and the links between them. The user enters the list through the web interface. From these data the algorithm calculates a PageRank value for each page. The algorithm repeats the process until the difference of PageRank va...
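The iteration the abstract describes, recomputing PageRank values until the change between passes falls below a threshold, can be sketched with power iteration. The damping factor and tolerance below are the conventional textbook choices, assumed rather than taken from the thesis.

```python
def pagerank(links, d=0.85, tol=1e-10, max_iter=100):
    """links: dict mapping each page to the list of pages it links to."""
    pages = sorted(set(links) | {p for out in links.values() for p in out})
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        # mass held by dangling pages (no outlinks) is spread uniformly
        dangling = sum(rank[p] for p in pages if not links.get(p))
        new = {}
        for p in pages:
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links.get(q, []))
            new[p] = (1 - d) / n + d * (incoming + dangling / n)
        if max(abs(new[p] - rank[p]) for p in pages) < tol:
            rank = new
            break                    # converged: values changed less than tol
        rank = new
    return rank

ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

In this tiny graph, C accumulates the highest rank because it receives links from both A and B, while the ranks always sum to 1.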

  8. Parallel sorting algorithms

    CERN Document Server

    Akl, Selim G

    1985-01-01

    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on parallel sorting problems. The text also presents twenty different algorithms for architectures such as linear arrays, mesh-connected computers, and cube-connected computers. Another example where an algorithm can be applied is on shared-memory SIMD (single instruction stream, multiple data stream) computers in which the whole sequence to be sorted can fit in the

  9. Music Consumption and Publishing in Today's Music Industry : Music publishing for an independent record label

    OpenAIRE

    Pienimäki, Kristian

    2015-01-01

    For the last two decades the changes in music technology and music consumption have affected music publishing as well as its viability. Largely owing to music digitalization and the overall decline in physical sales, the music industry has been forced to re-evaluate its means of publishing. The topic of the thesis is of current interest since the music industry is still in a state of change and new research is important. The thesis was assigned by an independent record label called Meiän Mu...

  10. Modified Clipped LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Lotfizad Mojtaba

    2005-01-01

    Full Text Available Abstract A new algorithm is proposed for updating the weights of an adaptive filter. The proposed algorithm is a modification of an existing method, namely the clipped LMS, and uses a three-level quantization scheme that involves threshold clipping of the input signals in the filter weight update formula. Mathematical analysis shows the convergence of the filter weights to the optimum Wiener filter weights. Also, it can be proved that the proposed modified clipped LMS (MCLMS) algorithm has better tracking than the LMS algorithm. In addition, this algorithm has reduced computational complexity relative to the unmodified one. By using a suitable threshold, it is possible to increase the tracking capability of the MCLMS algorithm compared to the LMS algorithm, but this causes slower convergence. Computer simulations confirm the mathematical analysis presented.
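The update the abstract describes replaces the input vector in the LMS weight update with a three-level quantized version (-1, 0 or +1, with values inside a clipping threshold mapped to zero), so each tap update needs no multiplication by the input. The step size, threshold and system-identification setup below are illustrative assumptions, not the paper's simulation parameters.

```python
import numpy as np

def quantize3(x, threshold):
    """Three-level quantization: -1, 0 or +1; zero inside the clip threshold."""
    return np.where(np.abs(x) > threshold, np.sign(x), 0.0)

def clipped_lms_identify(x, d, n_taps=4, mu=0.01, threshold=0.1):
    """Adapt FIR weights w so that w . window tracks the desired signal d,
    using the quantized input (not the input itself) in the weight update."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        window = x[n - n_taps + 1:n + 1][::-1]      # newest sample first
        e = d[n] - w @ window                       # a-priori error
        w += mu * e * quantize3(window, threshold)  # clipped update
    return w

rng = np.random.default_rng(0)
x = rng.standard_normal(20000)
h = np.array([0.5, -0.3, 0.2, 0.1])                # unknown system to identify
d = np.convolve(x, h)[:len(x)]
w = clipped_lms_identify(x, d)
```

With white Gaussian input and no measurement noise, the weights settle at the Wiener solution, here the true impulse response, which matches the convergence claim in the abstract.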

  11. Behind the Spam: A "Spectral Analysis" of Predatory Publishers

    Science.gov (United States)

    Beall, Jeffrey

    2016-10-01

    Most researchers today are bombarded with spam email solicitations from questionable scholarly publishers. These emails solicit article manuscripts, editorial board service, and even ad hoc peer reviews. These "predatory" publishers exploit the scholarly publishing process, patterning themselves after legitimate scholarly publishers yet performing little or no peer review, quickly accepting submitted manuscripts and collecting fees from submitting authors. These counterfeit publishers and journals have published much junk science, especially in the field of cosmology, threatening the integrity of the academic record. This paper examines the current state of predatory publishing and advises researchers how to navigate scholarly publishing to best avoid predatory publishers and other scholarly publishing-related perils.

  12. Some multigrid algorithms for SIMD machines

    Energy Technology Data Exchange (ETDEWEB)

    Dendy, J.E. Jr. [Los Alamos National Lab., NM (United States)

    1996-12-31

    Previously, a semicoarsening multigrid algorithm suitable for use on SIMD architectures was investigated. Through the use of new software tools, the performance of this algorithm has been considerably improved. The method has also been extended to three space dimensions. The method performs well for strongly anisotropic problems and for problems with coefficients jumping by orders of magnitude across internal interfaces. The parallel efficiency of this method is analyzed, and its actual performance on the CM-5 is compared with its performance on the CRAY Y-MP. A standard coarsening multigrid algorithm is also considered, and we compare its performance on these two platforms as well.

  13. Urethrotomy has a much lower success rate than previously reported.

    Science.gov (United States)

    Santucci, Richard; Eisenberg, Lauren

    2010-05-01

    We evaluated the success rate of direct vision internal urethrotomy as a treatment for simple male urethral strictures. A retrospective chart review was performed on 136 patients who underwent urethrotomy from January 1994 through March 2009. The Kaplan-Meier method was used to analyze stricture-free probability after the first, second, third, fourth and fifth urethrotomy. Patients with complex strictures (36) were excluded from the study for reasons including previous urethroplasty, neophallus or previous radiation, and 24 patients were lost to followup. Data were available for 76 patients. The stricture-free rate after the first urethrotomy was 8% with a median time to recurrence of 7 months. For the second urethrotomy stricture-free rate was 6% with a median time to recurrence of 9 months. For the third urethrotomy stricture-free rate was 9% with a median time to recurrence of 3 months. For procedures 4 and 5 stricture-free rate was 0% with a median time to recurrence of 20 and 8 months, respectively. Urethrotomy is a popular treatment for male urethral strictures. However, the performance characteristics are poor. Success rates were no higher than 9% in this series for first or subsequent urethrotomy during the observation period. Most of the patients in this series will be expected to experience failure with longer followup and the expected long-term success rate from any (1 through 5) urethrotomy approach is 0%. Urethrotomy should be considered a temporizing measure until definitive curative reconstruction can be planned. 2010 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  14. Line-breaking algorithm enhancement in inverse typesetting paradigma

    Directory of Open Access Journals (Sweden)

    Jan Přichystal

    2007-01-01

    Full Text Available High-quality text preparation in desktop publishing systems usually relies on a line-breaking algorithm that cannot take line heights into account and cannot typeset a paragraph accurately when the composition width changes, for example at a page break, a line index, or another inserted object. This article deals with an enhancement of the line-breaking algorithm based on the optimum-fit algorithm. The enhanced algorithm calculates the immediate typesetting width and thus solves the problem of forced width changes. This enhancement of the line-breaking algorithm extends the possibilities of high-quality typesetting to cases not yet covered by present typesetting systems.
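    The optimum-fit approach that the enhancement builds on chooses breakpoints globally, minimizing a badness score summed over all lines rather than filling each line greedily. A minimal dynamic-programming sketch of that idea (word widths and line width are arbitrary illustrative units, not the article's implementation):

```python
def break_lines(word_widths, line_width, space=1):
    """Optimum-fit line breaking: choose breakpoints that minimize the
    total squared leftover space (badness) over all lines, via DP."""
    n = len(word_widths)
    INF = float("inf")
    best = [INF] * (n + 1)       # best[j] = min cost to typeset words[:j]
    best[0] = 0.0
    brk = [0] * (n + 1)          # brk[j] = start index of the line ending at j
    for j in range(1, n + 1):
        width = 0
        for i in range(j - 1, -1, -1):              # line holds words i..j-1
            width += word_widths[i] + (space if width else 0)
            if width > line_width:
                break
            slack = line_width - width
            cost = best[i] + (0 if j == n else slack ** 2)  # last line is free
            if cost < best[j]:
                best[j], brk[j] = cost, i
    lines, j = [], n                                # recover the breakpoints
    while j > 0:
        lines.append((brk[j], j))
        j = brk[j]
    return best[n], lines[::-1]

cost, lines = break_lines([3, 2, 2, 5, 3, 4], line_width=9)
```

    Because the cost is minimized over whole paragraphs, a change in composition width mid-paragraph only requires recomputing the width available to each candidate line, which is the hook the article's enhancement exploits.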

  15. Places, Publishers and Personal Ties; the relational qualities of urban environments for book publishers

    NARCIS (Netherlands)

    Heebels, B.; van Aalst, I.; Atzema, O.A.L.C.

    2014-01-01

    Book publishers act as cultural mediators. Personal networks and face-to-face contacts with authors, booksellers, colleagues and people from the press are crucial for their business. This is in accordance with the literature on cultural entrepreneurship, which emphasizes the importance of informal

  16. PCIU: Hardware Implementations of an Efficient Packet Classification Algorithm with an Incremental Update Capability

    Directory of Open Access Journals (Sweden)

    O. Ahmed

    2011-01-01

    Full Text Available Packet classification plays a crucial role in a number of network services, such as policy-based routing, firewalls, and traffic billing, to name a few. However, classification can become a bottleneck in the above-mentioned applications if not implemented properly and efficiently. In this paper, we propose PCIU, a novel classification algorithm which improves upon previously published work. PCIU provides lower preprocessing time, lower memory consumption, ease of incremental rule update, and reasonable classification time compared to state-of-the-art algorithms. The proposed algorithm was evaluated and compared to RFC and HiCut using several benchmarks. The results obtained indicate that PCIU outperforms these algorithms in terms of speed, memory usage, incremental update capability, and preprocessing time. The algorithm was furthermore improved and made more accessible for a variety of applications through implementation in hardware. Two such implementations are detailed and discussed in this paper. The results indicate that a hardware/software codesign approach yields a PCIU solution that is slower, but easier to optimize and improve within time constraints. A hardware accelerator based on an ESL approach using Handel-C, on the other hand, achieved a 31x speed-up over a pure software implementation running on a state-of-the-art Xeon processor.
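    For context, the baseline that packet classifiers are measured against is a priority-ordered linear scan over rules, which trivially supports the incremental rule updates that PCIU is designed to preserve in faster data structures. A hypothetical sketch (the rule format and fields are illustrative, not PCIU's):

```python
class LinearClassifier:
    """Baseline packet classifier: rules are (priority, field ranges, action);
    classification scans rules in priority order. Updates are trivially
    incremental, the property faster structures like PCIU aim to keep."""

    def __init__(self):
        self.rules = []                     # kept sorted by priority (low = first)

    def add_rule(self, priority, ranges, action):
        self.rules.append((priority, ranges, action))
        self.rules.sort(key=lambda r: r[0])

    def remove_rule(self, priority):
        self.rules = [r for r in self.rules if r[0] != priority]

    def classify(self, packet):
        """packet: tuple of field values, matched against per-field ranges."""
        for _, ranges, action in self.rules:
            if all(lo <= f <= hi for f, (lo, hi) in zip(packet, ranges)):
                return action
        return "default"

c = LinearClassifier()
c.add_rule(1, [(0, 100), (80, 80)], "drop")          # e.g. (src byte, dst port)
c.add_rule(2, [(0, 255), (0, 65535)], "accept")      # catch-all
```

    The scan is O(rules) per packet, which is exactly the bottleneck that decision-tree and decomposition schemes such as RFC, HiCut, and PCIU trade preprocessing and memory to remove.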

  17. Ripple FPN reduced algorithm based on temporal high-pass filter and hardware implementation

    Science.gov (United States)

    Li, Yiyang; Li, Shuo; Zhang, Zhipeng; Jin, Weiqi; Wu, Lei; Jin, Minglei

    2016-11-01

    Cooled infrared detector arrays often suffer from undesired ripple fixed-pattern noise (FPN) when observing sky scenes. Ripple FPN seriously degrades the imaging quality of a thermal imager, especially for small-target detection and tracking, and it is hard to eliminate with calibration-based techniques or current scene-based nonuniformity correction algorithms. In this paper, we present a modified spatial low-pass and temporal high-pass nonuniformity correction algorithm using an adaptive time-domain threshold (THP&GM). The threshold is designed to significantly reduce ghosting artifacts. We test the algorithm on real infrared sequences and compare it with several previously published methods. The algorithm not only effectively corrects common FPN such as stripes, but also clearly outperforms current methods in detail preservation and convergence speed, especially for ripple FPN correction. Furthermore, we present our architecture with a prototype built on a Xilinx Virtex-5 XC5VLX50T field-programmable gate array (FPGA). The FPGA-based hardware implementation of the algorithm has two advantages: (1) low resource consumption and (2) small hardware delay (less than 20 lines). The hardware has been successfully applied in an actual system.
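    The temporal high-pass principle behind such scene-based correction can be sketched as a per-pixel recursive filter: a slowly updated temporal mean is treated as the fixed-pattern offset and subtracted, with a simple change gate standing in for the paper's adaptive threshold. A minimal NumPy sketch, not the THP&GM algorithm itself (the constants are illustrative):

```python
import numpy as np

def thp_nuc(frames, alpha=0.01, motion_thresh=5.0):
    """Temporal high-pass nonuniformity correction sketch.
    Each pixel's slowly varying temporal mean is taken as its fixed-pattern
    offset; subtracting it leaves the scene signal. The gate freezes the
    update on strongly changing pixels to limit ghosting artifacts."""
    offset = frames[0].astype(np.float64).copy()
    corrected = []
    for frame in frames:
        f = frame.astype(np.float64)
        still = np.abs(f - offset) < motion_thresh       # quasi-static pixels only
        offset[still] = (1 - alpha) * offset[still] + alpha * f[still]
        corrected.append(f - offset + offset.mean())     # restore mean level
    return corrected
```

    On a synthetic sequence with a fixed per-pixel offset, the corrected frames show markedly lower spatial variance than the raw input; the paper's contribution lies in choosing the gate threshold adaptively.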

  18. Social participation: redesign of education, research, and practice in occupational therapy. Previously published in Scandinavian Journal of Occupational Therapy 2013; 20: 2-8.

    Science.gov (United States)

    Piškur, Barbara

    2014-01-01

    There is growing attention to participation and social participation in literature and policy reports. Occupational therapists strongly believe that creating coherence between the person's occupations and environment will facilitate participation of each individual. Nowadays, societal developments such as "health literacy and self-management", "Web 2.0 social media", "empowering communities", and "Nothing About Us Without Us" increase opportunities for people to interact on different levels of social participation. Social participation can be used as an outcome, though it can also be seen as a means to change society and to develop solutions for barriers experienced by people with chronic diseases or disabilities. Societal developments will have an impact on social participation in terms of supporting each other and contributing to society. Additionally, these changes will have a major influence on the way we educate, conduct research, and deliver occupational therapy practice.

  19. Population-based Neisseria gonorrhoeae, Chlamydia trachomatis and Trichomonas vaginalis Prevalence Using Discarded, Deidentified Urine Specimens Previously Collected for Drug Testing (Open Access Publisher’s Version)

    Science.gov (United States)

    2017-10-24


  20. Semioptimal practicable algorithmic cooling

    International Nuclear Information System (INIS)

    Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2011-01-01

    Algorithmic cooling (AC) of spins applies entropy manipulation algorithms in open spin systems in order to cool spins far beyond Shannon's entropy bound. Algorithmic cooling of nuclear spins was demonstrated experimentally and may contribute to nuclear magnetic resonance spectroscopy. Several cooling algorithms were suggested in recent years, including practicable algorithmic cooling (PAC) and exhaustive AC. Practicable algorithms have simple implementations, yet their level of cooling is far from optimal; exhaustive algorithms, on the other hand, cool much better, and some even reach (asymptotically) an optimal level of cooling, but they are not practicable. We introduce here semioptimal practicable AC (SOPAC), wherein a few cycles (typically two to six) are performed at each recursive level. Two classes of SOPAC algorithms are proposed and analyzed. Both attain cooling levels significantly better than PAC and are much more efficient than the exhaustive algorithms. These algorithms are shown to bridge the gap between PAC and exhaustive AC. In addition, we calculated the number of spins required by SOPAC in order to purify qubits for quantum computation. As few as 12 and 7 spins are required (in an ideal scenario) to yield a mildly pure spin (60% polarized) from initial polarizations of 1% and 10%, respectively. In the latter case, about five more spins are sufficient to produce a highly pure spin (99.99% polarized), which could be relevant for fault-tolerant quantum computing.

  1. Towards mainstreaming of biodiversity data publishing: recommendations of the GBIF Data Publishing Framework Task Group.

    Science.gov (United States)

    Moritz, Tom; Krishnan, S; Roberts, Dave; Ingwersen, Peter; Agosti, Donat; Penev, Lyubomir; Cockerill, Matthew; Chavan, Vishwas

    2011-01-01

    Data are the evidentiary basis for scientific hypotheses, analyses and publication, for policy formation and for decision-making. They are essential to the evaluation and testing of results by peer scientists both present and future. There is broad consensus in the scientific and conservation communities that data should be freely, openly available in a sustained, persistent and secure way, and thus standards for 'free' and 'open' access to data have become well developed in recent years. The question of effective access to data remains highly problematic. Specifically with respect to scientific publishing, the ability to critically evaluate a published scientific hypothesis or scientific report is contingent on the examination, analysis, evaluation - and if feasible - on the re-generation of data on which conclusions are based. It is not coincidental that in the recent 'climategate' controversies, the quality and integrity of data and their analytical treatment were central to the debate. There is recent evidence that even when scientific data are requested for evaluation they may not be available. The history of dissemination of scientific results has been marked by paradigm shifts driven by the emergence of new technologies. In recent decades, the advance of computer-based technology linked to global communications networks has created the potential for broader and more consistent dissemination of scientific information and data. Yet, in this digital era, scientists and conservationists, organizations and institutions have often been slow to make data available. Community studies suggest that the withholding of data can be attributed to a lack of awareness, to a lack of technical capacity, to concerns that data should be withheld for reasons of perceived personal or organizational self interest, or to lack of adequate mechanisms for attribution. There is a clear need for institutionalization of a 'data publishing framework' that can address sociocultural

  2. Towards mainstreaming of biodiversity data publishing: recommendations of the GBIF Data Publishing Framework Task Group

    Directory of Open Access Journals (Sweden)

    Moritz Tom

    2011-12-01

    Full Text Available Abstract Background Data are the evidentiary basis for scientific hypotheses, analyses and publication, for policy formation and for decision-making. They are essential to the evaluation and testing of results by peer scientists both present and future. There is broad consensus in the scientific and conservation communities that data should be freely, openly available in a sustained, persistent and secure way, and thus standards for 'free' and 'open' access to data have become well developed in recent years. The question of effective access to data remains highly problematic. Discussion Specifically with respect to scientific publishing, the ability to critically evaluate a published scientific hypothesis or scientific report is contingent on the examination, analysis, evaluation - and if feasible - on the re-generation of data on which conclusions are based. It is not coincidental that in the recent 'climategate' controversies, the quality and integrity of data and their analytical treatment were central to the debate. There is recent evidence that even when scientific data are requested for evaluation they may not be available. The history of dissemination of scientific results has been marked by paradigm shifts driven by the emergence of new technologies. In recent decades, the advance of computer-based technology linked to global communications networks has created the potential for broader and more consistent dissemination of scientific information and data. Yet, in this digital era, scientists and conservationists, organizations and institutions have often been slow to make data available. Community studies suggest that the withholding of data can be attributed to a lack of awareness, to a lack of technical capacity, to concerns that data should be withheld for reasons of perceived personal or organizational self interest, or to lack of adequate mechanisms for attribution. Conclusions There is a clear need for institutionalization of a

  3. Toxoplasma gondii and schizophrenia: a review of published RCTs.

    Science.gov (United States)

    Chorlton, Sam D

    2017-07-01

    Over the last 60 years, accumulating evidence has suggested that acute, chronic, and maternal Toxoplasma gondii infections predispose to schizophrenia. More recent evidence suggests that chronically infected patients with schizophrenia present with more severe disease. After acute infection, parasites form walled cysts in the brain, leading to lifelong chronic infection and drug resistance to commonly used antiparasitics. Chronic infection is the most studied and closely linked with development and severity of schizophrenia. There are currently four published randomized controlled trials evaluating antiparasitic drugs, specifically azithromycin, trimethoprim, artemisinin, and artemether, in patients with schizophrenia. No trials have demonstrated a change in psychopathology with adjunctive treatment. Published trials have either selected drugs without evidence against chronic infection or used them at doses too low to reduce brain cyst burden. Furthermore, trials have failed to achieve sufficient power or account for confounders such as previous antipsychotic treatment, sex, age, or rhesus status on antiparasitic effect. There are currently no ongoing trials of anti-Toxoplasma therapy in schizophrenia despite ample evidence to justify further testing.

  4. Perception of Key Barriers in Using and Publishing Open Data

    Directory of Open Access Journals (Sweden)

    Martin Beno

    2017-12-01

    Full Text Available There is a growing body of literature recognizing the benefits of Open Data. However, many potential data providers are unwilling to publish their data, and at the same time data users are often faced with difficulties when attempting to use Open Data in practice. Although various barriers to using and publishing Open Data are still present, studies that systematically collect and assess these barriers are rare. Based on this observation, we present a review of the prior literature on barriers and the results of an empirical study aimed at assessing both the users' and the publishers' views on obstacles to Open Data adoption. We collected data with an online survey in Austria and internationally. Using a sample of 183 participants, we draw conclusions about the relative importance of the barriers reported in the literature. In comparison to a previous paper presented at the Conference for E-Democracy and Open Government, this article includes additional data from participants outside Austria, reports new analyses, and substantially extends the discussion of results and of possible strategies for mitigating Open Data barriers.

  5. Publishing for Learned Societies: The Secret Life of a Scholarly Publisher

    Science.gov (United States)

    Nicholson, David

    Wiley-Blackwell was formed in February 2007 as a result of the acquisition of Blackwell Publishing Ltd. by John Wiley & Sons, Inc. and the merger between Blackwell and Wiley's Scientific, Technical, and Medical business. Wiley-Blackwell publishes approximately 1,250 scholarly peer-reviewed journals including Monthly Notices of the Royal Astronomical Society and Astronomische Nachrichten, and has relationships with over 800 learned societies. The "secret life" of the article's title refers to the two broad areas of activity we undertake for our society partners, namely practical assistance and strategic advice. One of our goals at Wiley-Blackwell is to set the standard for both areas, and this article illustrates how we are doing this with a series of tangible examples.

  6. The GBIF integrated publishing toolkit: facilitating the efficient publishing of biodiversity data on the internet.

    Directory of Open Access Journals (Sweden)

    Tim Robertson

    Full Text Available The planet is experiencing an ongoing global biodiversity crisis. Measuring the magnitude and rate of change more effectively requires access to organized, easily discoverable, and digitally formatted biodiversity data, both legacy and new, from across the globe. Assembling this coherent digital representation of biodiversity requires the integration of data that have historically been analog, dispersed, and heterogeneous. The Integrated Publishing Toolkit (IPT) is a software package developed to support biodiversity dataset publication in a common format. The IPT's two primary functions are to (1) encode existing species occurrence datasets and checklists, such as records from natural history collections or observations, in the Darwin Core standard to enhance interoperability of data, and (2) publish and archive data and metadata for broad use in a Darwin Core Archive, a set of files following a standard format. Here we discuss the key need for the IPT, how it has developed in response to community input, and how it continues to evolve to streamline and enhance the interoperability, discoverability, and mobilization of new data types beyond basic Darwin Core records. We close with a discussion of how the IPT has impacted the biodiversity research community and how it enhances data publishing in more traditional journal venues, along with new features implemented in the latest version of the IPT and future plans for further enhancements.

  7. Shape: automatic conformation prediction of carbohydrates using a genetic algorithm

    Directory of Open Access Journals (Sweden)

    Rosen Jimmy

    2009-09-01

    Full Text Available Abstract Background Detailed experimental three dimensional structures of carbohydrates are often difficult to acquire. Molecular modelling and computational conformation prediction are therefore commonly used tools for three dimensional structure studies. Modelling procedures generally require significant training and computing resources, which is often impractical for most experimental chemists and biologists. Shape has been developed to improve the availability of modelling in this field. Results The Shape software package has been developed for simplicity of use and conformation prediction performance. A trivial user interface coupled to an efficient genetic algorithm conformation search makes it a powerful tool for automated modelling. Carbohydrates up to a few hundred atoms in size can be investigated on common computer hardware. It has been shown to perform well for the prediction of over four hundred bioactive oligosaccharides, as well as compare favourably with previously published studies on carbohydrate conformation prediction. Conclusion The Shape fully automated conformation prediction can be used by scientists who lack significant modelling training, and performs well on computing hardware such as laptops and desktops. It can also be deployed on computer clusters for increased capacity. The prediction accuracy under the default settings is good, as it agrees well with experimental data and previously published conformation prediction studies. This software is available both as open source and under commercial licenses.
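    A genetic-algorithm conformation search of the kind described can be sketched generically: individuals are vectors of torsion angles, fitness is an energy function (a toy stand-in below), and selection, crossover, and mutation drive the search. This is purely illustrative, not Shape's implementation:

```python
import math, random

def energy(angles):
    """Toy stand-in for a conformational energy function: minimized
    when every torsion angle sits at 60 degrees."""
    return sum(1 - math.cos(math.radians(a - 60)) for a in angles)

def ga_search(n_angles=4, pop_size=30, generations=200, seed=1):
    """Elitist GA over torsion-angle vectors: truncation selection,
    one-point crossover, Gaussian mutation of a single angle."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 360) for _ in range(n_angles)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        survivors = pop[: pop_size // 2]            # keep the fitter half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_angles)        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.3:                  # mutation: nudge one angle
                i = rng.randrange(n_angles)
                child[i] = (child[i] + rng.gauss(0, 15)) % 360
            children.append(child)
        pop = survivors + children
    return min(pop, key=energy)

best = ga_search()
```

    In a real tool the energy term is a molecular force field and each gene a glycosidic torsion, but the search loop has the same shape.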

  8. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm optimization

  9. Recursive forgetting algorithms

    DEFF Research Database (Denmark)

    Parkum, Jens; Poulsen, Niels Kjølstad; Holst, Jan

    1992-01-01

    In the first part of the paper, a general forgetting algorithm is formulated and analysed. It contains most existing forgetting schemes as special cases. Conditions are given ensuring that the basic convergence properties will hold. In the second part of the paper, the results are applied...... to a specific algorithm with selective forgetting. Here, the forgetting is non-uniform in time and space. The theoretical analysis is supported by a simulation example demonstrating the practical performance of this algorithm...
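    A standard concrete instance of such forgetting schemes is recursive least squares with a uniform exponential forgetting factor, where data k steps old are weighted by lam**k; the paper's selective scheme is non-uniform in time and space, which this minimal sketch does not attempt:

```python
import numpy as np

def rls_forgetting(X, y, lam=0.95, delta=100.0):
    """Recursive least squares with exponential forgetting factor lam.
    Old data are discounted geometrically, so the estimate can track
    slowly drifting parameters instead of averaging over all history."""
    n = X.shape[1]
    theta = np.zeros(n)
    P = delta * np.eye(n)                  # large initial "covariance"
    for x, yk in zip(X, y):
        Px = P @ x
        k = Px / (lam + x @ Px)            # gain vector
        theta = theta + k * (yk - x @ theta)
        P = (P - np.outer(k, Px)) / lam    # discounted covariance update
    return theta
```

    With lam = 0.95 the effective memory is roughly 1 / (1 - lam) = 20 samples; lam = 1 recovers ordinary recursive least squares with no forgetting.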

  10. A GPU-based finite-size pencil beam algorithm with 3D-density correction for radiotherapy dose calculation

    International Nuclear Information System (INIS)

    Gu Xuejun; Jia Xun; Jiang, Steve B; Jelen, Urszula; Li Jinsheng

    2011-01-01

    Targeting at the development of an accurate and efficient dose calculation engine for online adaptive radiotherapy, we have implemented a finite-size pencil beam (FSPB) algorithm with a 3D-density correction method on graphics processing unit (GPU). This new GPU-based dose engine is built on our previously published ultrafast FSPB computational framework (Gu et al 2009 Phys. Med. Biol. 54 6287-97). Dosimetric evaluations against Monte Carlo dose calculations are conducted on ten IMRT treatment plans (five head-and-neck cases and five lung cases). For all cases, there is improvement with the 3D-density correction over the conventional FSPB algorithm and for most cases the improvement is significant. Regarding the efficiency, because of the appropriate arrangement of memory access and the usage of GPU intrinsic functions, the dose calculation for an IMRT plan can be accomplished well within 1 s (except for one case) with this new GPU-based FSPB algorithm. Compared to the previous GPU-based FSPB algorithm without 3D-density correction, this new algorithm, though slightly sacrificing the computational efficiency (∼5-15% lower), has significantly improved the dose calculation accuracy, making it more suitable for online IMRT replanning.

  11. Explaining algorithms using metaphors

    CERN Document Server

    Forišek, Michal

    2013-01-01

    There is a significant difference between designing a new algorithm, proving its correctness, and teaching it to an audience. When teaching algorithms, the teacher's main goal should be to convey the underlying ideas and to help the students form correct mental models related to the algorithm. This process can often be facilitated by using suitable metaphors. This work provides a set of novel metaphors identified and developed as suitable tools for teaching many of the 'classic textbook' algorithms taught in undergraduate courses worldwide. Each chapter provides exercises and didactic notes fo

  12. Algorithms in Algebraic Geometry

    CERN Document Server

    Dickenstein, Alicia; Sommese, Andrew J

    2008-01-01

    In the last decade, there has been a burgeoning of activity in the design and implementation of algorithms for algebraic geometric computation. Some of these algorithms were originally designed for abstract algebraic geometry, but now are of interest for use in applications, and some of these algorithms were originally designed for applications, but now are of interest for use in abstract algebraic geometry. The workshop on Algorithms in Algebraic Geometry was held in the framework of the IMA Annual Program Year in Applications of Algebraic Geometry by the Institute for Mathematics and Its Applications.

  13. Shadow algorithms data miner

    CERN Document Server

    Woo, Andrew

    2012-01-01

    Digital shadow generation continues to be an important aspect of visualization and visual effects in film, games, simulations, and scientific applications. This resource offers a thorough picture of the motivations, complexities, and categorized algorithms available to generate digital shadows. From general fundamentals to specific applications, it addresses shadow algorithms and how to manage huge data sets from a shadow perspective. The book also examines the use of shadow algorithms in industrial applications, in terms of what algorithms are used and what software is applicable.

  14. Spectral Decomposition Algorithm (SDA)

    Data.gov (United States)

    National Aeronautics and Space Administration — Spectral Decomposition Algorithm (SDA) is an unsupervised feature extraction technique similar to PCA that was developed to better distinguish spectral features in...

  15. Quick fuzzy backpropagation algorithm.

    Science.gov (United States)

    Nikov, A; Stoeva, S

    2001-03-01

    A modification of the fuzzy backpropagation (FBP) algorithm, called the QuickFBP algorithm, is proposed, in which the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP and FBP algorithms are defined and proved for: (1) single-output neural networks in the case of training patterns with different targets; and (2) multiple-output neural networks in the case of training patterns with an equivalued target vector. They support the automation of the weight training process (quasi-unsupervised learning), establishing the target value(s) depending on the network's input values. In these cases the simulation results confirm the convergence of both algorithms. An example with a large neural network illustrates the significantly greater training speed of the QuickFBP compared to the FBP algorithm. The adaptation of an interactive web system to users on the basis of the QuickFBP algorithm is presented. Since the QuickFBP algorithm ensures quasi-unsupervised learning, it is broadly applicable in adaptive and adaptable interactive systems, data mining, and similar applications.

  16. Portfolios of quantum algorithms.

    Science.gov (United States)

    Maurer, S M; Hogg, T; Huberman, B A

    2001-12-17

    Quantum computation holds promise for the solution of many intractable problems. However, since many quantum algorithms are stochastic in nature they can find the solution of hard problems only probabilistically. Thus the efficiency of the algorithms has to be characterized by both the expected time to completion and the associated variance. In order to minimize both the running time and its uncertainty, we show that portfolios of quantum algorithms analogous to those of finance can outperform single algorithms when applied to the NP-complete problems such as 3-satisfiability.
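    The portfolio effect can be illustrated numerically: for a heavy-tailed run-time distribution, running k independent copies and stopping at the first success reduces the expected cost even after charging for the extra copies. A toy Monte Carlo sketch with an invented run-time distribution (not a quantum algorithm):

```python
import random

def portfolio_time(k, trials=20000, seed=0):
    """Average cost of a portfolio of k independent copies of a stochastic
    algorithm whose solo run time follows a heavy-tailed toy distribution.
    The portfolio finishes when its fastest copy does; cost is scaled by k
    to charge for running the extra copies."""
    rng = random.Random(seed)

    def solo():
        # 80% of runs finish quickly, 20% get stuck for a long time
        return rng.uniform(1, 2) if rng.random() < 0.8 else rng.uniform(50, 100)

    total = 0.0
    for _ in range(trials):
        total += k * min(solo() for _ in range(k))
    return total / trials

print(portfolio_time(1), portfolio_time(3))
```

    With three copies the portfolio is only stuck when all three runs are stuck, so both the mean and the variance of the completion time drop sharply, which is the paper's finance-style diversification argument in miniature.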

  17. Technical Writing Teachers and the Challenges of Desktop Publishing.

    Science.gov (United States)

    Kalmbach, James

    1988-01-01

    Argues that technical writing teachers must understand desktop publishing. Discusses the strengths that technical writing teachers bring to desktop publishing, and the impact desktop publishing will have on technical writing courses and programs. (ARH)

  18. Writing and Publishing Books in Counseling: A Survey of Authors.

    Science.gov (United States)

    Seligman, Linda; Kelly, Shirley C.

    1990-01-01

    Presents data and ideas from 74 authors who published books in counseling field. Reviews writing and publishing process. Provides information on timetables, book contracts, and remuneration as well as suggestions on publisher selection and contract negotiation. (Author/CM)

  19. Establishing a publishing outfit in Nigeria | Emenyonu | International ...

    African Journals Online (AJOL)

    The paper examines the steps in establishing a publishing firm in the Nigerian environment. ... to follow in establishing a publishing company, networking with stakeholders in the publishing industry, the ...

  20. Persistent seropositivity for yellow fever in a previously vaccinated autologous hematopoietic stem cell transplantation recipient.

    Science.gov (United States)

    Hayakawa, Kayoko; Takasaki, Tomohiko; Tsunemine, Hiroko; Kanagawa, Shuzo; Kutsuna, Satoshi; Takeshita, Nozomi; Mawatari, Momoko; Fujiya, Yoshihiro; Yamamoto, Kei; Ohmagari, Norio; Kato, Yasuyuki

    2015-08-01

    The duration of a protective level of yellow fever antibodies after autologous hematopoietic stem cell transplantation in a previously vaccinated person is unclear. The case of a patient who had previously been vaccinated for yellow fever and who remained seropositive for 22 months after autologous peripheral blood stem cell transplantation for malignant lymphoma is described herein. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Machine Learning Algorithms Outperform Conventional Regression Models in Predicting Development of Hepatocellular Carcinoma

    Science.gov (United States)

    Singal, Amit G.; Mukherjee, Ashin; Elmunzer, B. Joseph; Higgins, Peter DR; Lok, Anna S.; Zhu, Ji; Marrero, Jorge A; Waljee, Akbar K

    2015-01-01

    Background Predictive models for hepatocellular carcinoma (HCC) have been limited by modest accuracy and lack of validation. Machine learning algorithms offer a novel methodology, which may improve HCC risk prognostication among patients with cirrhosis. Our study's aim was to develop and compare predictive models for HCC development among cirrhotic patients, using conventional regression analysis and machine learning algorithms. Methods We enrolled 442 patients with Child A or B cirrhosis at the University of Michigan between January 2004 and September 2006 (UM cohort) and prospectively followed them until HCC development, liver transplantation, death, or study termination. Regression analysis and machine learning algorithms were used to construct predictive models for HCC development, which were tested on an independent validation cohort from the Hepatitis C Antiviral Long-term Treatment against Cirrhosis (HALT-C) Trial. Both models were also compared to the previously published HALT-C model. Discrimination was assessed using receiver operating characteristic curve analysis and diagnostic accuracy was assessed with net reclassification improvement and integrated discrimination improvement statistics. Results After a median follow-up of 3.5 years, 41 patients developed HCC. The UM regression model had a c-statistic of 0.61 (95%CI 0.56-0.67), whereas the machine learning algorithm had a c-statistic of 0.64 (95%CI 0.60-0.69) in the validation cohort. The machine learning algorithm had significantly better diagnostic accuracy as assessed by net reclassification improvement, and it also outperformed the previously published HALT-C model (p=0.047). Conclusion Machine learning algorithms improve the accuracy of risk stratifying patients with cirrhosis and can be used to accurately identify patients at high risk of developing HCC. PMID:24169273
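    The c-statistic used to compare the models is the probability that a randomly chosen case receives a higher risk score than a randomly chosen control (ties count one half). A minimal sketch with illustrative scores, not study data:

```python
def c_statistic(scores, labels):
    """Concordance statistic (ROC AUC): the fraction of case/control pairs
    in which the case receives the higher risk score; ties count 0.5."""
    cases = [s for s, l in zip(scores, labels) if l == 1]
    controls = [s for s, l in zip(scores, labels) if l == 0]
    concordant = sum((c > d) + 0.5 * (c == d) for c in cases for d in controls)
    return concordant / (len(cases) * len(controls))

# Illustrative risk scores for 4 hypothetical cases and 6 controls
scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   1,   1,   0,   0,   0,   0,   0,   0]
print(c_statistic(scores, labels))
```

    A value of 0.5 is chance-level ranking and 1.0 is perfect discrimination, which puts the abstract's 0.61 vs 0.64 comparison in context.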

  2. On the performance of pre-microRNA detection algorithms

    DEFF Research Database (Denmark)

    Saçar Demirci, Müşerref Duygu; Baumbach, Jan; Allmer, Jens

    2017-01-01

    We assess 13 ab initio pre-miRNA detection approaches using all relevant, published, and novel data sets, while judging algorithm performance based on ten intrinsic performance measures. We present an extensible framework, izMiR, which allows for the unbiased comparison of existing algorithms, adding new...

  3. Algorithm 426 : Merge sort algorithm [M1

    NARCIS (Netherlands)

    Bron, C.

    1972-01-01

    Sorting by means of a two-way merge has a reputation of requiring a clerically complicated and cumbersome program. This ALGOL 60 procedure demonstrates that, using recursion, an elegant and efficient algorithm can be designed, the correctness of which is easily proved [2]. Sorting n objects gives
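    The recursive two-way merge that the procedure demonstrates translates directly into modern languages; a Python sketch of the same idea (not the ALGOL 60 original):

```python
def merge_sort(a):
    """Sort by recursive two-way merge: split, sort the halves, merge."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps the sort stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 4, 6, 1, 3]))
```

    The recursion makes the correctness argument short, exactly the point of the original procedure: each half is sorted by induction, and the merge preserves order.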

  4. A Study on the Communication Mechanism for Publishing and Producing News on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Ruhan Zhao

    2016-07-01

    Full Text Available This study explores the communication mechanism for publishing and producing news through analyzing media such as Microblog, WeChat and, in particular, the mobile app TouTiao. The results of this study show that the status and practice of professional journalism and gatekeepers are being phased out of news production. Conversely, algorithms and technology are taking their place at the center of the circle of news production.

  5. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search schemes, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2,” to search new regions of the solution space and enhance the convergence rate for the global optimization problem. In order to verify the performance of the different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithms perform better than, or at least comparably to, the original algorithm in the quality of the solutions obtained. However, these schemes still cannot achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three of the proposed search schemes, “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1,” with three control parameters, using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability on the 23 benchmark functions.
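
    For illustration, a hedged sketch of what a “rand/1”-style donor-generation scheme looks like in a differential-search/differential-evolution setting; the scale factor F, the seed, and the list representation are assumptions for the sketch, not the authors' exact operator:

```python
import random

def ds_rand_1(pop, F=0.5, seed=0):
    """Generate one donor vector per population member as r1 + F*(r2 - r3),
    where r1, r2, r3 are three distinct members other than the target.
    A generic 'rand/1' mutation sketch, not the paper's exact operator."""
    rnd = random.Random(seed)
    donors = []
    for i in range(len(pop)):
        r1, r2, r3 = rnd.sample([j for j in range(len(pop)) if j != i], 3)
        donors.append([pop[r1][k] + F * (pop[r2][k] - pop[r3][k])
                       for k in range(len(pop[i]))])
    return donors
```

    A composite scheme in the spirit of CDS would pick one of several such generators at random for each offspring.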

  6. Algorithms and Their Explanations

    NARCIS (Netherlands)

    Benini, M.; Gobbo, F.; Beckmann, A.; Csuhaj-Varjú, E.; Meer, K.

    2014-01-01

    By analysing the explanation of the classical heapsort algorithm via the method of levels of abstraction mainly due to Floridi, we give a concrete and precise example of how to deal with algorithmic knowledge. To do so, we introduce a concept already implicit in the method, the ‘gradient of
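
    At the lowest level of abstraction, the heapsort the paper analyses is just a sift-down routine plus two loops; a compact Python rendering of the classical algorithm:

```python
def heapsort(a):
    """In-place heapsort: build a max-heap, then repeatedly swap the
    root (maximum) to the end and restore the heap on the prefix."""
    def sift_down(lo, hi):
        root = lo
        while 2 * root + 1 <= hi:
            child = 2 * root + 1
            if child + 1 <= hi and a[child] < a[child + 1]:
                child += 1          # pick the larger child
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return
    n = len(a)
    for start in range(n // 2 - 1, -1, -1):   # heapify phase
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):           # extraction phase
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a
```

    Each of the named phases corresponds to a distinct level of abstraction in the explanation the paper discusses.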

  7. Finite lattice extrapolation algorithms

    International Nuclear Information System (INIS)

    Henkel, M.; Schuetz, G.

    1987-08-01

    Two algorithms for sequence extrapolation, due to von den Broeck and Schwartz, and to Bulirsch and Stoer, are reviewed and critically compared. Applications to three-state and six-state quantum chains and to the (2+1)D Ising model show that the algorithm of Bulirsch and Stoer is superior, in particular when only very few finite-lattice data are available. (orig.)
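
    The flavour of such sequence extrapolation can be illustrated with polynomial (Neville-style) extrapolation of finite-lattice estimates a(h) to the limit h → 0; the Bulirsch–Stoer algorithm proper uses rational functions, so this is only a simplified sketch:

```python
def extrapolate_to_zero(hs, vals):
    """Neville's scheme evaluated at h = 0: given samples vals[i] = a(hs[i]),
    return the value at 0 of the interpolating polynomial through all points."""
    t = list(vals)
    n = len(t)
    for k in range(1, n):
        for i in range(n - k):
            t[i] = (hs[i + k] * t[i] - hs[i] * t[i + 1]) / (hs[i + k] - hs[i])
    return t[0]
```

    With three exact samples of a quadratic in h, for example, the h → 0 limit is recovered exactly; in the physical setting, h plays the role of an inverse lattice size.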

  8. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  9. Graph Colouring Algorithms

    DEFF Research Database (Denmark)

    Husfeldt, Thore

    2015-01-01

    This chapter presents an introduction to graph colouring algorithms. The focus is on vertex-colouring algorithms that work for general classes of graphs with worst-case performance guarantees in a sequential model of computation. The presentation aims to demonstrate the breadth of available...
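
    A minimal example of a sequential vertex-colouring algorithm with a worst-case guarantee is first-fit greedy colouring, which uses at most Δ+1 colours on a graph of maximum degree Δ; one of the standard baselines such a chapter covers:

```python
def greedy_colour(adj):
    """First-fit greedy colouring in a fixed vertex order: give each vertex
    the smallest colour index not used by an already-coloured neighbour.
    adj maps each vertex to its list of neighbours."""
    colour = {}
    for v in adj:
        used = {colour[u] for u in adj[v] if u in colour}
        c = 0
        while c in used:
            c += 1
        colour[v] = c
    return colour
```

    The worst-case bound follows because a vertex of degree d can see at most d forbidden colours, so colour d is always available.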

  10. 8. Algorithm Design Techniques

    Indian Academy of Sciences (India)

    Home; Journals; Resonance – Journal of Science Education; Volume 2; Issue 8. Algorithms - Algorithm Design Techniques. R K Shyamasundar. Series Article Volume 2 ... Author Affiliations. R K Shyamasundar1. Computer Science Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India ...

  11. Efficient sequential and parallel algorithms for record linkage.

    Science.gov (United States)

    Mamun, Abdullah-Al; Mi, Tian; Aseltine, Robert; Rajasekaran, Sanguthevar

    2014-01-01

    Integrating data from multiple sources is a crucial and challenging problem. Even though there exist numerous algorithms for record linkage or deduplication, they suffer either from long running times or from restrictions on the number of datasets that they can integrate. In this paper we report efficient sequential and parallel algorithms for record linkage which handle any number of datasets and outperform previous algorithms. Our algorithms employ hierarchical clustering algorithms as the basis. A key idea that we use is radix sorting on certain attributes to eliminate identical records before any further processing. Another novel idea is to form a graph that links similar records and find the connected components. Our sequential and parallel algorithms have been tested on a real dataset of 1,083,878 records and synthetic datasets ranging in size from 50,000 to 9,000,000 records. Our sequential algorithm runs at least two times faster, for any dataset, than the previous best-known algorithm, the two-phase algorithm using faster computation of the edit distance (TPA (FCED)). The speedups obtained by our parallel algorithm are almost linear. For example, we get a speedup of 7.5 with 8 cores (residing in a single node), 14.1 with 16 cores (residing in two nodes), and 26.4 with 32 cores (residing in four nodes). We have compared the performance of our sequential algorithm with TPA (FCED) and found that our algorithm outperforms the previous one. The accuracy is the same as that of this previous best-known algorithm.
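
    The two ideas singled out in the abstract (eliminating identical records by sorting, then linking similar records and reading off connected components) can be sketched as follows; the `similar` predicate, the all-pairs comparison, and plain `sorted` are simplifying stand-ins for the paper's edit-distance test and radix sort:

```python
def dedupe_and_link(records, similar):
    """Collapse exact duplicates by sorting, then union-find the
    connected components of the 'similar records' graph.
    similar(a, b) is a user-supplied pairwise predicate (assumed)."""
    uniq = sorted(set(records))          # stand-in for radix sort on key attributes
    parent = list(range(len(uniq)))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x
    def union(a, b):
        parent[find(a)] = find(b)
    for i in range(len(uniq)):           # O(n^2) for clarity only;
        for j in range(i + 1, len(uniq)):  # the paper avoids all-pairs work
            if similar(uniq[i], uniq[j]):
                union(i, j)
    groups = {}
    for i, r in enumerate(uniq):
        groups.setdefault(find(i), []).append(r)
    return list(groups.values())
```

    Each returned group is one linked entity; hierarchical clustering would refine these components further.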

  12. Parallel algorithms for continuum dynamics

    International Nuclear Information System (INIS)

    Hicks, D.L.; Liebrock, L.M.

    1987-01-01

    Simply porting existing parallel programs to a new parallel processor may not achieve the full speedup possible; to achieve the maximum efficiency may require redesigning the parallel algorithms for the specific architecture. The authors discuss here parallel algorithms that were developed first for the HEP processor and then ported to the CRAY X-MP/4, the ELXSI/10, and the Intel iPSC/32. Focus is mainly on the most recent parallel processing results produced, i.e., those on the Intel Hypercube. The applications are simulations of continuum dynamics in which the momentum and stress gradients are important. Examples of these are inertial confinement fusion experiments, severe breaks in the coolant system of a reactor, weapons physics, and shock-wave physics. Speedup efficiencies on the Intel iPSC Hypercube are very sensitive to the ratio of communication to computation. Great care must be taken in designing algorithms for this machine to avoid global communication. This is much more critical on the iPSC than it was on the three previous parallel processors.

  13. Evaluation of novel algorithm embedded in a wearable sEMG device for seizure detection

    DEFF Research Database (Denmark)

    Conradsen, Isa; Beniczky, Sandor; Wolf, Peter

    2012-01-01

    We implemented a modified version of a previously published algorithm for detection of generalized tonic-clonic seizures into a prototype wireless surface electromyography (sEMG) recording device. The method was modified to require minimum computational load, and two parameters were trained...... on prior sEMG data recorded with the device. Along with the normal sEMG recording, the device is able to set an alarm whenever the implemented algorithm detects a seizure. These alarms are annotated in the data file along with the signal. The device was tested at the Epilepsy Monitoring Unit (EMU......) at the Danish Epilepsy Center. Five patients were included in the study and two of them had generalized tonic-clonic seizures. All patients were monitored for 2–5 days. A double-blind study was made on the five patients. The overall result showed that the device detected four of seven seizures and had a false...

  14. Geometric approximation algorithms

    CERN Document Server

    Har-Peled, Sariel

    2011-01-01

    Exact algorithms for dealing with geometric objects are complicated, hard to implement in practice, and slow. Over the last 20 years a theory of geometric approximation algorithms has emerged. These algorithms tend to be simple, fast, and more robust than their exact counterparts. This book is the first to cover geometric approximation algorithms in detail. In addition, more traditional computational geometry techniques that are widely used in developing such algorithms, like sampling, linear programming, etc., are also surveyed. Other topics covered include approximate nearest-neighbor search, shape approximation, coresets, dimension reduction, and embeddings. The topics covered are relatively independent and are supplemented by exercises. Close to 200 color figures are included in the text to illustrate proofs and ideas.

  15. Fast geometric algorithms

    International Nuclear Information System (INIS)

    Noga, M.T.

    1984-01-01

    This thesis addresses a number of important problems that fall within the framework of the new discipline of Computational Geometry. The list of topics covered includes sorting and selection, convex hull algorithms, the L1 hull, determination of the minimum encasing rectangle of a set of points, the Euclidean and L1 diameter of a set of points, the metric traveling salesman problem, and finding the superrange of star-shaped and monotone polygons. The main theme of all the work was to develop a set of very fast state-of-the-art algorithms that supersede any rivals in terms of speed and ease of implementation. In some cases existing algorithms were refined; for others, new techniques were developed that add to the present database of fast adaptive geometric algorithms. What emerges is a collection of techniques that is successful at merging modern tools developed in the analysis of algorithms with those of classical geometry.
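
    As a concrete example of the fast, easy-to-implement convex hull algorithms the thesis targets, here is Andrew's monotone-chain method, which runs in O(n log n); it is a standard textbook algorithm, not necessarily one from the thesis itself:

```python
def convex_hull(pts):
    """Andrew's monotone-chain convex hull: sort the points, then build
    the lower and upper chains with a cross-product turn test.
    Returns hull vertices in counter-clockwise order."""
    pts = sorted(set(pts))
    if len(pts) <= 2:
        return pts
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]
```

    The sort dominates the cost; each point is pushed and popped at most once per chain, so the scan itself is linear.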

  16. Totally parallel multilevel algorithms

    Science.gov (United States)

    Frederickson, Paul O.

    1988-01-01

    Four totally parallel algorithms for the solution of a sparse linear system have common characteristics which become quite apparent when they are implemented on a highly parallel hypercube such as the CM2. These four algorithms are Parallel Superconvergent Multigrid (PSMG) of Frederickson and McBryan, Robust Multigrid (RMG) of Hackbusch, the FFT-based Spectral Algorithm, and Parallel Cyclic Reduction. In fact, all four can be formulated as particular cases of the same totally parallel multilevel algorithm, which is referred to as TPMA. In certain cases the spectral radius of TPMA is zero, and it is recognized to be a direct algorithm. In many other cases the spectral radius, although not zero, is small enough that a single iteration per timestep keeps the local error within the required tolerance.

  17. Governance by algorithms

    Directory of Open Access Journals (Sweden)

    Francesca Musiani

    2013-08-01

    Full Text Available Algorithms are increasingly often cited as one of the fundamental shaping devices of our daily, immersed-in-information existence. Their importance is acknowledged, their performance scrutinised in numerous contexts. Yet, a lot of what constitutes 'algorithms' beyond their broad definition as “encoded procedures for transforming input data into a desired output, based on specified calculations” (Gillespie, 2013) is often taken for granted. This article seeks to contribute to the discussion about 'what algorithms do' and in which ways they are artefacts of governance, providing two examples drawing from the internet and ICT realm: search engine queries and e-commerce websites’ recommendations to customers. The question of the relationship between algorithms and rules is likely to occupy an increasingly central role in the study and the practice of internet governance, in terms of both institutions’ regulation of algorithms, and algorithms’ regulation of our society.

  18. Where genetic algorithms excel.

    Science.gov (United States)

    Baum, E B; Boneh, D; Garrett, C

    2001-01-01

    We analyze the performance of a genetic algorithm (GA) we call Culling, and a variety of other algorithms, on a problem we refer to as the Additive Search Problem (ASP). We show that the problem of learning the Ising perceptron is reducible to a noisy version of ASP. Noisy ASP is the first problem we are aware of where a genetic-type algorithm bests all known competitors. We generalize ASP to k-ASP to study whether GAs will achieve "implicit parallelism" in a problem with many more schemata. GAs fail to achieve this implicit parallelism, but we describe an algorithm we call Explicitly Parallel Search that succeeds. We also compute the optimal culling point for selective breeding, which turns out to be independent of the fitness function and the population distribution. We also analyze a mean-field-theoretic algorithm that performs similarly to Culling on many problems. These results provide insight into when and how GAs can beat competing methods.
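
    A toy version of truncation-selection ("culling") can be sketched on OneMax, a simple additive bit-counting objective; this is only loosely analogous to the paper's Additive Search Problem, and every parameter below is an illustrative assumption:

```python
import random

def culling_ga(n_bits=20, pop_size=40, keep=10, gens=60, seed=1):
    """Toy GA with culling (truncation selection): keep only the fittest
    `keep` individuals each generation, refill by uniform crossover plus
    one-bit mutation. Fitness is OneMax (number of 1-bits)."""
    rnd = random.Random(seed)
    pop = [[rnd.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=sum, reverse=True)
        parents = pop[:keep]              # cull everything below the cutoff
        children = []
        while len(children) < pop_size - keep:
            a, b = rnd.sample(parents, 2)
            child = [a[i] if rnd.random() < 0.5 else b[i] for i in range(n_bits)]
            i = rnd.randrange(n_bits)     # one-bit mutation
            child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(sum(x) for x in pop)
```

    Because the culled elite is carried over unchanged, the best fitness never decreases from one generation to the next.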

  19. Network-Oblivious Algorithms

    DEFF Research Database (Denmark)

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino

    2016-01-01

    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network......-oblivious algorithm be specified on a parallel model of computation where the only parameter is the problem’s input size, and then evaluated on a model with two parameters, capturing parallelism granularity and communication latency. It is shown that for a wide class of network-oblivious algorithms, optimality...... of cache hierarchies, to the realm of parallel computation. Its effectiveness is illustrated by providing optimal network-oblivious algorithms for a number of key problems. Some limitations of the oblivious approach are also discussed....

  20. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing

    2014-09-01

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).

  1. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Mencel, Liam A.

    2014-05-06

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomised, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε))

  2. A Faster Algorithm for Computing Straight Skeletons

    KAUST Repository

    Cheng, Siu-Wing; Mencel, Liam A.; Vigneron, Antoine E.

    2014-01-01

    We present a new algorithm for computing the straight skeleton of a polygon. For a polygon with n vertices, among which r are reflex vertices, we give a deterministic algorithm that reduces the straight skeleton computation to a motorcycle graph computation in O(n (log n) log r) time. It improves on the previously best known algorithm for this reduction, which is randomized, and runs in expected O(n √(h+1) log² n) time for a polygon with h holes. Using known motorcycle graph algorithms, our result yields improved time bounds for computing straight skeletons. In particular, we can compute the straight skeleton of a non-degenerate polygon in O(n (log n) log r + r^(4/3 + ε)) time for any ε > 0. On degenerate input, our time bound increases to O(n (log n) log r + r^(17/11 + ε)).

  3. Considerations and Algorithms for Compression of Sets

    DEFF Research Database (Denmark)

    Larsson, Jesper

    We consider compression of unordered sets of distinct elements. After a discussion of the general problem, we focus on compressing sets of fixed-length bitstrings in the presence of statistical information. We survey techniques from previous work, suggesting some adjustments, and propose a novel...... compression algorithm that allows transparent incorporation of various estimates for probability distribution. Our experimental results allow the conclusion that set compression can benefit from incorporating statistics, using our method or variants of previously known techniques....
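
    A common order-oblivious baseline for this problem is to sort the set and varint-encode the gaps between consecutive elements; this generic technique (not Larsson's proposed algorithm) already exploits the fact that a set carries no ordering information:

```python
def compress_set(values):
    """Compress a set of distinct non-negative integers (fixed-width
    bitstrings viewed as numbers): sort, delta-encode, varint-encode.
    Continuation bit 0x80 marks 'more bytes follow' (LEB128-style)."""
    out = bytearray()
    prev = -1
    for v in sorted(values):
        gap = v - prev          # gaps are >= 1 since elements are distinct
        prev = v
        while gap >= 0x80:
            out.append((gap & 0x7F) | 0x80)
            gap >>= 7
        out.append(gap)
    return bytes(out)

def decompress_set(data):
    """Inverse of compress_set; returns the elements in sorted order."""
    vals, cur, shift, prev = [], 0, 0, -1
    for byte in data:
        cur |= (byte & 0x7F) << shift
        if byte & 0x80:
            shift += 7
        else:
            prev += cur
            vals.append(prev)
            cur, shift = 0, 0
    return vals
```

    Statistical information, as in the abstract, would replace the fixed varint code with a code tuned to the estimated gap distribution.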

  4. 44 CFR 5.21 - Effect of failure to publish.

    Science.gov (United States)

    2010-10-01

    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Effect of failure to publish... failure to publish. 5 U.S.C. 552(a)(1) provides that, except to the extent that a person has actual and... adversely affected by, a matter required to be published in the Federal Register and not so published. ...

  5. 22 CFR 212.11 - Materials to be published.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Materials to be published. 212.11 Section 212... Federal Register § 212.11 Materials to be published. (a) USAID separately states and currently publishes... section. (b) USAID Public Notice No. 1 and the USAID Regulations published in chapter II of Title 22 and...

  6. 15 CFR 10.13 - Withdrawal of a published standard.

    Science.gov (United States)

    2010-01-01

    ...) Before withdrawing a standard published under these procedures, the Director will review the relative... 15 Commerce and Foreign Trade 1 2010-01-01 2010-01-01 false Withdrawal of a published standard. 10... DEVELOPMENT OF VOLUNTARY PRODUCT STANDARDS § 10.13 Withdrawal of a published standard. (a) Standards published...

  7. The Ins and the Outs of Electronic Publishing.

    Science.gov (United States)

    Wills, Mathew; Wills, Gordon

    1996-01-01

    Examines electronic publishing for academic and professional publishers. Discusses benefits of electronic publishing to authors and readers, argues that the hard sell and product-driven mindsets will not work in a customer-focused communications medium, and outlines characteristics of electronic publishing that must be incorporated in successful…

  8. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.; Yan, Lie

    2014-01-01

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  9. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.

    2014-08-29

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  10. Behind the Spam: A "Spectral Analysis" of Predatory Publishers

    Science.gov (United States)

    Beall, Jeffrey

    2015-08-01

    Most researchers today are bombarded with spam email solicitations from questionable scholarly publishers. These emails solicit article manuscripts, editorial board service, and even ad hoc peer reviews. These "predatory" publishers exploit the scholarly publishing process, patterning themselves after legitimate scholarly publishers yet performing little or no peer review and quickly accepting submitted manuscripts and collecting fees from submitting authors. These counterfeit publishers and journals have published much junk science — especially in the field of cosmology — threatening the integrity of the academic record. This presentation examines the current state of predatory publishing and related scams such as fake impact factors and advises researchers how to navigate scholarly publishing to best avoid predatory publishers and other scholarly publishing-related perils.

  11. EDITORIAL: Roberts Prize for the best paper published in 2010 Roberts Prize for the best paper published in 2010

    Science.gov (United States)

    Webb, Steve; Harris, Simon

    2011-08-01

    The publishers of Physics in Medicine and Biology (PMB), IOP Publishing, in association with the journal owners, the Institute of Physics and Engineering in Medicine (IPEM), jointly award an annual prize for the best paper published in PMB during the previous year. The procedure for deciding the winner has been made as thorough as possible, to try to ensure that an outstanding paper wins the prize. We started off with a shortlist of the 10 research papers published in 2010 which were rated the best based on the referees' quality assessments. Following the submission of a short 'case for winning' document by each of the shortlisted authors, an IPEM college of jurors of the status of FIPEM assessed and rated these 10 papers in order to choose a winner, which was then endorsed by the Editorial Board. We have much pleasure in advising readers that the Roberts Prize for the best paper published in 2010 is awarded to M M Paulides et al from Erasmus MC, Rotterdam, The Netherlands, for their paper on hyperthermia treatment: The clinical feasibility of deep hyperthermia treatment in the head and neck: new challenges for positioning and temperature measurement M M Paulides, J F Bakker, M Linthorst, J van der Zee, Z Rijnen, E Neufeld, P M T Pattynama, P P Jansen, P C Levendag and G C van Rhoon 2010 Phys. Med. Biol. 55 2465 Our congratulations go to these authors. Of course all of the shortlisted papers were of great merit, and the full top-10 is listed below (in alphabetical order). Steve Webb Editor-in-Chief Simon Harris Publisher References Alonzo-Proulx O, Packard N, Boone J M, Al-Mayah A, Brock K K, Shen S Z and Yaffe M J 2010 Validation of a method for measuring the volumetric breast density from digital mammograms Phys. Med. Biol. 55 3027 Bian J, Siewerdsen J H, Han X, Sidky E Y, Prince J L, Pelizzari C A and Pan X 2010 Evaluation of sparse-view reconstruction from flat-panel-detector cone-beam CT Phys. Med. Biol. 
55 6575 Brun M-A, Formanek F, Yasuda A, Sekine M, Ando N

  12. Quality assessment of published health economic analyses from South America.

    Science.gov (United States)

    Machado, Márcio; Iskedjian, Michael; Einarson, Thomas R

    2006-05-01

    Health economic analyses have become important to healthcare systems worldwide. No studies have previously examined South America's contribution in this area. To survey the literature with the purpose of reviewing, quantifying, and assessing the quality of published South American health economic analyses. A search of MEDLINE (1990-December 2004), EMBASE (1990-December 2004), International Pharmaceutical Abstracts (1990-December 2004), Literatura Latino-Americana e do Caribe em Ciências da Saúde (1982-December 2004), and Sistema de Informacion Esencial en Terapéutica y Salud (1980-December 2004) was completed using the key words cost-effectiveness analysis (CEA), cost-utility analysis (CUA), cost-minimization analysis (CMA), and cost-benefit analysis (CBA); abbreviations CEA, CUA, CMA, and CBA; and all South American country names. Papers were categorized by type and country by 2 independent reviewers. Quality was assessed using a 12 item checklist, characterizing scores as 4 (good), 3 (acceptable), 2 (poor), 1 (unable to judge), and 0 (unacceptable). To be included in our investigation, studies needed to have simultaneously examined costs and outcomes. We retrieved 25 articles; one duplicate article was rejected, leaving 24 (CEA = 15, CBA = 6, CMA = 3; Brazil = 9, Argentina = 5, Colombia = 3, Chile = 2, Ecuador = 2, 1 each from Peru, Uruguay, Venezuela). Variability between raters was less than 0.5 point on overall scores (OS) and less than 1 point on all individual items. Mean OS was 2.6 (SD 1.0, range 1.4-3.8). CBAs scored highest (OS 2.8, SD 0.8), CEAs next (OS 2.7, SD 0.7), and CMAs lowest (OS 2.0, SD 0.5). When scored by type of question, definition of study aim scored highest (OS 3.0, SD 0.8), while ethical issues scored lowest (OS 1.5, SD 0.9). By country, Peru scored highest (mean OS 3.8) and Uruguay had the lowest scores (mean OS 2.2). A nonsignificant time trend was noted for OS (R2 = 0.12; p = 0.104). Quality scores of health economic analyses

  13. Algorithm for cellular reprogramming.

    Science.gov (United States)

    Ronquist, Scott; Patterson, Geoff; Muir, Lindsey A; Lindsly, Stephen; Chen, Haiming; Brown, Markus; Wicha, Max S; Bloch, Anthony; Brockett, Roger; Rajapakse, Indika

    2017-11-07

    The day we understand the time evolution of subcellular events at a level of detail comparable to physical systems governed by Newton's laws of motion seems far away. Even so, quantitative approaches to cellular dynamics add to our understanding of cell biology. With data-guided frameworks we can develop better predictions about, and methods for, control over specific biological processes and system-wide cell behavior. Here we describe an approach for optimizing the use of transcription factors (TFs) in cellular reprogramming, based on a device commonly used in optimal control. We construct an approximate model for the natural evolution of a cell-cycle-synchronized population of human fibroblasts, based on data obtained by sampling the expression of 22,083 genes at several time points during the cell cycle. To arrive at a model of moderate complexity, we cluster gene expression based on division of the genome into topologically associating domains (TADs) and then model the dynamics of TAD expression levels. Based on this dynamical model and additional data, such as known TF binding sites and activity, we develop a methodology for identifying the top TF candidates for a specific cellular reprogramming task. Our data-guided methodology identifies a number of TFs previously validated for reprogramming and/or natural differentiation and predicts some potentially useful combinations of TFs. Our findings highlight the immense potential of dynamical models, mathematics, and data-guided methodologies for improving strategies for control over biological processes. Copyright © 2017 the Author(s). Published by PNAS.

  14. Parallel Algorithms for the Exascale Era

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-10-19

    New parallel algorithms are needed to reach the Exascale level of parallelism with millions of cores. We look at some of the research developed by students in projects at LANL. The research blends ideas from the early days of computing while weaving in the fresh approach brought by students new to the field of high performance computing. We look at reproducibility of global sums and why it is important to parallel computing. Next we look at how the concept of hashing has led to the development of more scalable algorithms suitable for next-generation parallel computers. Nearly all of this work has been done by undergraduates and published in leading scientific journals.
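
    Reproducibility of global sums matters because floating-point addition is not associative, so different reduction orders give different results. One standard ingredient (shown as general background; the student projects may well use other techniques) is compensated summation, here in Neumaier's variant of Kahan's algorithm:

```python
def neumaier_sum(xs):
    """Compensated summation (Neumaier's variant): track the rounding
    error of each addition in c and fold it back in at the end. Unlike
    plain Kahan summation, this stays accurate when a term is larger
    in magnitude than the running sum."""
    s = 0.0
    c = 0.0   # accumulated compensation for lost low-order bits
    for x in xs:
        t = s + x
        if abs(s) >= abs(x):
            c += (s - t) + x   # low-order bits of x were lost
        else:
            c += (x - t) + s   # low-order bits of s were lost
        s = t
    return s + c
```

    On [1e16, 1.0, -1e16] the naive left-to-right sum loses the 1.0 entirely, while the compensated sum recovers it.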

  15. Algorithms for the Computation of Debris Risk

    Science.gov (United States)

    Matney, Mark J.

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry, including the physical geometry of orbits and the geometry of satellites. A number of tools have been developed in NASA’s Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA’s Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper presents an introduction to these algorithms and the assumptions upon which they are based.

  16. Algorithms for the Computation of Debris Risks

    Science.gov (United States)

    Matney, Mark

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry, including the physical geometry of orbits and the geometry of non-spherical satellites. A number of tools have been developed in NASA's Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA's Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper will present an introduction to these algorithms and the assumptions upon which they are based.

  17. Algorithms in Singular

    Directory of Open Access Journals (Sweden)

    Hans Schonemann

    1996-12-01

    Some algorithms for singularity theory and algebraic geometry. The use of Gröbner basis computations for treating systems of polynomial equations has become an important tool in many areas. This paper introduces the concept of standard bases (a generalization of Gröbner bases) and its application to some problems from algebraic geometry. The examples are presented as SINGULAR commands. A general introduction to Gröbner bases can be found in the textbook [CLO], and an introduction to syzygies in [E] and [St1]. SINGULAR is a computer algebra system for computing information about singularities, for use in algebraic geometry. The basic algorithms in SINGULAR are several variants of a general standard basis algorithm for general monomial orderings (see [GG]). This includes well-orderings (Buchberger's algorithm [B1], [B2]) and tangent cone orderings (Mora's algorithm [M1], [MPT]) as special cases: it is able to work with non-homogeneous and homogeneous input and also to compute in the localization of the polynomial ring at 0. Recent versions include algorithms to factorize polynomials and a factorizing Gröbner basis algorithm. For a complete description of SINGULAR see [Si].
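    The standard-basis machinery specializes, for well-orderings, to an ordinary Gröbner basis computation. A small sketch using Python's sympy rather than SINGULAR (the ideal and the ordering are illustrative choices, not from the paper):

    ```python
    from sympy import groebner, symbols

    x, y = symbols('x y')

    # Reduced Groebner basis of the ideal <x^2 + y, x*y - 1>
    # under the lexicographic ordering with x > y.
    G = groebner([x**2 + y, x*y - 1], x, y, order='lex')

    # Ideal membership can be tested by reduction against the basis:
    # each generator reduces to remainder zero.
    _, remainder = G.reduce(x**2 + y)
    print(list(G.exprs), remainder)
    ```

    The same computation in SINGULAR would use `std(...)` on the ideal; sympy is used here only to keep the example self-contained.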

  18. Impact of previously disadvantaged land-users on sustainable ...

    African Journals Online (AJOL)

    Impact of previously disadvantaged land-users on sustainable agricultural ... about previously disadvantaged land users involved in communal farming systems ... of input, capital, marketing, information and land use planning, with effect on ...

  19. A New Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Medha Gupta

    2016-07-01

    Nature-inspired meta-heuristic algorithms study the emergent collective intelligence of groups of simple agents. The Firefly Algorithm is one such swarm-based meta-heuristic, inspired by the flashing behavior of fireflies. The algorithm was first proposed in 2008 and has since been used successfully to solve various optimization problems. In this work, we propose a new modified version of the Firefly Algorithm (MoFA) and compare its performance with the standard Firefly Algorithm and various other meta-heuristic algorithms. Numerical studies and results demonstrate that the proposed algorithm is superior to the existing algorithms.

  20. Genetic Algorithm Applied to the Eigenvalue Equalization Filtered-x LMS Algorithm (EE-FXLMS

    Directory of Open Access Journals (Sweden)

    Stephan P. Lovstedt

    2008-01-01

    The FXLMS algorithm, used extensively in active noise control (ANC), exhibits frequency-dependent convergence behavior. This leads to degraded performance for time-varying tonal noise and noise with multiple stationary tones. Previous work by the authors proposed the eigenvalue equalization filtered-x least mean squares (EE-FXLMS) algorithm. In that algorithm, the magnitude coefficients of the secondary path transfer function are modified to decrease variation in the eigenvalues of the filtered-x autocorrelation matrix, while preserving the phase, giving faster convergence and increasing overall attenuation. This paper revisits the EE-FXLMS algorithm, using a genetic algorithm to find the magnitude coefficients that give the least variation in eigenvalues. This method overcomes some of the problems in implementing the EE-FXLMS algorithm that arise from the finite resolution of sampled systems. Experimental control results are compared for the previous EE-FXLMS implementation and the genetic algorithm implementation, using both the original secondary path model and a modified secondary path model.
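    The core eigenvalue-equalization idea, reshaping the magnitude response of the secondary-path model while leaving its phase untouched, can be sketched with NumPy. This is an illustration of the simplest variant (flattening every bin to one uniform magnitude), not the authors' exact procedure; their genetic algorithm instead searches for per-bin magnitude coefficients:

    ```python
    import numpy as np

    def equalize_magnitude(s_hat):
        """Flatten the magnitude response of a secondary-path FIR model
        while keeping its phase response unchanged."""
        S = np.fft.rfft(s_hat)          # frequency response of the model
        target = np.abs(S).mean()       # one uniform magnitude for every bin
        phase = np.angle(S)             # phase must be preserved for FXLMS
        S_eq = target * np.exp(1j * phase)
        return np.fft.irfft(S_eq, n=len(s_hat))
    ```

    A flat magnitude response makes the filtered reference signal's power roughly uniform across frequency, which compresses the eigenvalue spread of its autocorrelation matrix and equalizes the modal convergence rates of the LMS update.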