WorldWideScience

Sample records for sources fifty-one sets

  1. The American oil industry and the Fifty-Fifty Agreement of 1950

    International Nuclear Information System (INIS)

    Anderson, T.H.

    1988-01-01

    This paper traces the history of the Fifty-Fifty Agreement of 1950 and the role of the American oil industry in that agreement. Regardless of the definition one chooses for the term foreign policy, it can now be argued with considerable force that the Fifty-Fifty Agreement was, in fact, the product of American foreign policy. Defining the term narrowly, as the enunciated policy of the Executive Branch of Government on matters outside the national boundary, it is clear that a consistent policy had long been in place supporting exactly the type of action that was taken here. Defining the term broadly, as the de facto cumulative thrust of all competing interest groups within a nation-state (both governmental and private) outside the national boundary, makes the assertion even clearer. Including Aramco and the parent companies in the equation leaves little doubt that the Fifty-Fifty Agreement was the product of American foreign policy.

  2. One hundred and fifty years of combustion of fossil hydrocarbons: The emergent alternatives

    International Nuclear Information System (INIS)

    Laine, Jorge

    2009-01-01

    One hundred and fifty years after the drilling of the first commercial petroleum wells, which led to the intensive use of liquid fuels to power transport vehicles, we are arriving at the peak of worldwide petroleum reserves. Yet a good portion remains to be spent, with the hope that the consequences will be better than in the first part, which brought several wars and deterioration of the environment. This essay reviews the history of fossil fuels and surveys the prospects of the emerging energy alternatives, placing emphasis on bioenergy as a bridge between the present combustion age and a new age of clean energy.

  3. Community Energy Systems and the Law of Public Utilities. Volume Fifty-one. Wisconsin

    Energy Technology Data Exchange (ETDEWEB)

    Feurer, D.A.; Weaver, C.L.

    1981-01-01

    A detailed description is presented of the laws and programs of the State of Wisconsin governing the regulation of public energy utilities, the siting of energy generating and transmission facilities, the municipal franchising of public energy utilities, and the prescription of rates to be charged by utilities including attendant problems of cost allocations, rate base and operating expense determinations, and rate of return allowances. These laws and programs are analyzed to identify impediments which they may present to the implementation of Integrated Community Energy Systems (ICES). This report is one of fifty-one separate volumes which describe such regulatory programs at the Federal level and in each state as background to the report entitled Community Energy Systems and the Law of Public Utilities - Volume One: An Overview. This report also contains a summary of a strategy described in Volume One - An Overview for overcoming these impediments by working within the existing regulatory framework and by making changes in the regulatory programs to enhance the likelihood of ICES implementation.

  4. "Fifty-fifty" - an unexpectedly successful start. Implementation of Agenda 21 results in energy savings in school buildings; "Fifty-fifty" - ein erfreulich erfolgreicher Start. Energieeinsparung durch praktizierte Agenda 21 in Schulen

    Energy Technology Data Exchange (ETDEWEB)

    Schafhausen, F. [Bundesministerium fuer Umwelt, Naturschutz und Reaktorsicherheit, Bonn (Germany)

    1998-10-01

    The idea of the "Fifty-fifty" pilot project was conceived at the environmental department of the Hamburg municipal authority; its implementation took place in close cooperation with the department for schools, youth, and vocational training (BSJB). The pilot started in October 1994 and concluded in June 1997. After its successful completion, "Fifty-fifty" now has the status of a permanent project. It has been known for decades that public buildings consume excess energy (power and heat) as well as water, not only because technical equipment is lacking or obsolete, but also because of negligent and careless behaviour. "Fifty-fifty" endeavours to bring about a change. (orig.)

  5. Extreme Markup: The Fifty US Hospitals With The Highest Charge-To-Cost Ratios.

    Science.gov (United States)

    Bai, Ge; Anderson, Gerard F

    2015-06-01

    Using Medicare cost reports, we examined the fifty US hospitals with the highest charge-to-cost ratios in 2012. These hospitals have markups (ratios of charges over Medicare-allowable costs) of approximately ten, compared to a national average of 3.4 and a mode of 2.4. Analysis of the fifty hospitals showed that forty-nine are for profit (98 percent), forty-six are owned by for-profit hospital systems (92 percent), and twenty (40 percent) operate in Florida. One for-profit hospital system owns half of these fifty hospitals. While most public and private health insurers do not use hospital charges to set their payment rates, uninsured patients are commonly asked to pay the full charges, and out-of-network patients and casualty and workers' compensation insurers are often expected to pay a large portion of the full charges. Because it is difficult for patients to compare prices, market forces fail to constrain hospital charges. Federal and state governments may want to consider limitations on the charge-to-cost ratio, some form of all-payer rate setting, or mandated price disclosure to regulate hospital markups.

  6. Fifty shades of exploitation: Fan labor and Fifty Shades of Grey

    Directory of Open Access Journals (Sweden)

    Bethan Jones

    2014-03-01

    This exploration of the debates that have taken place in fandom over the ethics of pulling fan fiction and publishing it as original work draws on the notion of the fannish gift economy, which postulates that gifts such as fan fiction and fan art have value in the fannish community because they are designed to create and cement its social structure. Tension exists between fans who subscribe to the notion of a fannish gift economy and those who exploit fandom by using it to sell their pulled-to-publish works. An examination of E. L. James's 2012 Fifty Shades trilogy (comprising the books Fifty Shades of Grey, Fifty Shades Darker, and Fifty Shades Freed), which began as Twilight fan fiction, in addition to Twilight fan art sold through sites such as Redbubble and Etsy, demonstrates a tension between the two modes of fan expression: the sale of artworks appears to be an acceptable practice in fandom, but the commercial sale of fan fic, even when marketed as original fiction, is widely contested.

  7. Toward Exact Number: Young Children Use One-to-one Correspondence to Measure Set Identity but not Numerical Equality

    Science.gov (United States)

    Izard, Véronique; Streri, Arlette; Spelke, Elizabeth S.

    2014-01-01

    Exact integer concepts are fundamental to a wide array of human activities, but their origins are obscure. Some have proposed that children are endowed with a system of natural number concepts, whereas others have argued that children construct these concepts by mastering verbal counting or other numeric symbols. This debate remains unresolved, because it is difficult to test children’s mastery of the logic of integer concepts without using symbols to enumerate large sets, and the symbols themselves could be a source of difficulty for children. Here, we introduce a new method, focusing on large quantities and avoiding the use of words or other symbols for numbers, to study children’s understanding of an essential property underlying integer concepts: the relation of exact numerical equality. Children aged 32-36 months, who possessed no symbols for exact numbers beyond 4, were given one-to-one correspondence cues to help them track a set of puppets, and their enumeration of the set was assessed by a non-verbal manual search task. Children used one-to-one correspondence relations to reconstruct exact quantities in sets of 5 or 6 objects, as long as the elements forming the sets remained the same individuals. In contrast, they failed to track exact quantities when one element was added, removed, or substituted for another. These results suggest an alternative to both nativist and symbol-based constructivist theories of the development of natural number concepts: Before learning symbols for exact numbers, children have a partial understanding of the properties of exact numbers. PMID:24680885

  8. Motivation and Goal-Setting in College Athletes

    OpenAIRE

    Cash, Erin

    2009-01-01

    Motivation and goal-setting are important concepts in athletics and in sport and exercise psychology. However, little research has compared motivation and goal-setting by gender. Using self-determination theory, the purpose of this study was to determine whether there is a difference between male and female athletes in amotivation, external regulation, identified regulation, intrinsic motivation, and goal-setting. One hundred and six student-athletes (fifty-one males and f...

  9. Set of devices for producing radioactive 60Co-sources

    International Nuclear Information System (INIS)

    Eichhorn, P.; Tobisch, F.

    1982-01-01

    A set of devices for producing radioactive 60Co sources was developed. A single source has a radioactivity of 445x10^10 GBq. It consists of a double envelope of stainless steel filled with a mixture of small pieces of cobalt and stainless steel wire. The diameter of a source is 11 mm; the length is 80 mm. Cobalt wires of different radioactivity, about 110 mm long and 0.8 mm in diameter, are the raw material. The set is located in a hot cell. Construction, functions and operation of the set are described in detail. (author)

  10. Las notas al pie en la traducción de Fifty Shades (Footnotes in the Translation of Fifty Shades

    Directory of Open Access Journals (Sweden)

    Xinia Valverde Jara

    2017-11-01

    This study examines the use of footnotes as a translation strategy in the independent translation of commercial literature in digital format, based on the trilogy written by E. L. James: Fifty Shades of Grey, Fifty Shades Darker and Fifty Shades Freed. The analysis measures the degree of influence of these extratextual elements within the complex translation process, specifically those of the target-language context, and concludes that they condition the translation strategies used.

  11. Sexist Attitudes Among Emerging Adult Women Readers of Fifty Shades Fiction.

    Science.gov (United States)

    Altenburger, Lauren E; Carotta, Christin L; Bonomi, Amy E; Snyder, Anastasia

    2017-02-01

    Stereotypical sexist representations of men and women in popular culture reinforce rigid views of masculinity (e.g., males as strong, in control, masterful, and aggressive) and femininity (e.g., women as fragile and weak, unassertive, peaceful, irrational, and driven by emotions). The present study examined associations between the fictional series Fifty Shades, a popular culture mechanism that includes pervasive traditional gender role representations, and underlying sexist beliefs among a sample of 715 women ages 18-24 years. Analyses revealed associations between Fifty Shades readership and sexism, as measured through the Ambivalent Sexism Inventory. Namely, women who reported reading Fifty Shades had higher levels of ambivalent, benevolent, and hostile sexism. Further, those who interpreted Fifty Shades as "romantic" had higher levels of ambivalent and benevolent sexism. Our findings support prior empirical studies noting associations between interacting with aspects of popular culture, such as television and video games, and individual beliefs and behaviors.

  12. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to chall
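
The classic resolution of the subway puzzle is that the two trains, though equally frequent, are unevenly interleaved: if the uptown train always arrives one minute after the downtown train on a ten-minute cycle, a rider arriving at a uniformly random moment catches the uptown train only one time in ten. A quick Monte Carlo sketch (the schedule parameters are illustrative assumptions, not taken from the book) confirms this:

```python
import random

def uptown_fraction(trials=100_000, cycle=10.0, offset=1.0, seed=42):
    """Fraction of random arrivals that catch the uptown train first, when
    downtown trains run at t = 0, 10, 20, ... and uptown trains one minute
    later at t = 1, 11, 21, ... (all times in minutes)."""
    random.seed(seed)
    uptown = 0
    for _ in range(trials):
        t = random.uniform(0.0, cycle)   # Marvin's arrival within one cycle
        wait_down = (0.0 - t) % cycle    # wait for the next downtown train
        wait_up = (offset - t) % cycle   # wait for the next uptown train
        if wait_up < wait_down:
            uptown += 1
    return uptown / trials

# With this schedule Marvin sees his mother about 10% of the time,
# matching his two-dinners-in-twenty-days record.
frac = uptown_fraction()
```

Only when Marvin happens to arrive in the one-minute window between a downtown departure and the following uptown train does he head uptown, which is why equal frequency does not imply equal odds.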

  13. Fifty-one years of Los Alamos Spacecraft

    Energy Technology Data Exchange (ETDEWEB)

    Fenimore, Edward E. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-04

    From 1963 to 2014, the Los Alamos National Laboratory was involved in at least 233 spacecraft. There are probably only one or two institutions in the world that have been involved in so many spacecraft. Los Alamos space exploration started with the Vela satellites for nuclear test detection, but soon expanded to ionospheric research (mostly barium releases), radioisotope thermoelectric generators, solar physics, solar wind, magnetospheres, astrophysics, national security, planetary physics, earth resources, radio propagation in the ionosphere, and cubesats. Here, we present a list of the spacecraft, their purpose, and their launch dates for use during RocketFest.

  14. Fifty years of fuzzy logic and its applications

    CERN Document Server

    Rishe, Naphtali; Kandel, Abraham

    2015-01-01

    This book presents a comprehensive report on the evolution of Fuzzy Logic since its formulation in Lotfi Zadeh’s seminal paper on “fuzzy sets,” published in 1965. In addition, it features a stimulating sampling from the broad field of research and development inspired by Zadeh’s paper. The chapters, written by pioneers and prominent scholars in the field, show how fuzzy sets have been successfully applied to artificial intelligence, control theory, inference, and reasoning. The book also reports on theoretical issues; features recent applications of Fuzzy Logic in the fields of neural networks, clustering, data mining, and software testing; and highlights an important paradigm shift caused by Fuzzy Logic in the area of uncertainty management. Conceived by the editors as an academic celebration of the fifty years’ anniversary of the 1965 paper, this work is a must-have for students and researchers willing to get an inspiring picture of the potentialities, limitations, achievements and accomplishments...

  15. Deterrence, denuclearization, and proliferation: Alternative visions of the next fifty years

    International Nuclear Information System (INIS)

    Lehman, R.F. II.

    1994-01-01

    The great library of Alexandria may have contained fewer volumes than the number which have been written on the subject of nuclear weapons in the Cold War. With the end of the Cold War, a new nuclear library is in the making. Much thought is being given to the next steps in nuclear policy, strategy, forces, arms control, and nonproliferation. For this very distinguished conference, however, I have been asked to look further ahead indeed: forward fifty years. Prognostication is always a risky business. Detailed predictions beyond the shortest duration are difficult to label as "scientific" even in the social sciences. Forecasting ahead fifty years in an age of ever accelerating change would seem to be hopeless. Projecting the future of nuclear weapons, however, may not be as complex as one might think. Detailing the future fifty years from now is not necessary. We want to inform upcoming decisions by examining the possibilities, not write a history in advance of what is to happen. Our look forward can benefit from a brief look back fifty years. In retrospect, those years passed quickly, and with each additional year, analysts make them appear more simple than they seemed at the time. This paper contributes further to this process of oversimplification, as we say, "for heuristic purposes." When in doubt, I have erred on the side of being provocative.

  16. Deterrence, denuclearization, and proliferation: Alternative visions of the next fifty years

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, R.F. II

    1994-02-12

    The great library of Alexandria may have contained fewer volumes than the number which have been written on the subject of nuclear weapons in the Cold War. With the end of the Cold War, a new nuclear library is in the making. Much thought is being given to the next steps in nuclear policy, strategy, forces, arms control, and nonproliferation. For this very distinguished conference, however, I have been asked to look further ahead indeed: forward fifty years. Prognostication is always a risky business. Detailed predictions beyond the shortest duration are difficult to label as "scientific" even in the social sciences. Forecasting ahead fifty years in an age of ever accelerating change would seem to be hopeless. Projecting the future of nuclear weapons, however, may not be as complex as one might think. Detailing the future fifty years from now is not necessary. We want to inform upcoming decisions by examining the possibilities, not write a history in advance of what is to happen. Our look forward can benefit from a brief look back fifty years. In retrospect, those years passed quickly, and with each additional year, analysts make them appear more simple than they seemed at the time. This paper contributes further to this process of oversimplification, as we say, "for heuristic purposes." When in doubt, I have erred on the side of being provocative.

  17. Atoms for peace plus fifty

    International Nuclear Information System (INIS)

    Eisenhower, S.

    2003-01-01

    One of Dwight Eisenhower's most significant political legacies stemmed from his management of the nuclear question. Five decades after Eisenhower's 'Atoms for Peace' speech before the United Nations, the nuclear dilemma persists, but the world is a different, and I would submit, a better place today than it might have been had that vision not been articulated, or its proposals not advanced. The 'Atoms for Peace' speech had a number of objectives, but its overarching goal was to propose a set of ideas, a nuclear strategy, which would call on the Soviets to cooperate internationally for the betterment of mankind. This would reengage the Soviets in discussions on nuclear matters at a time when arms control talks had stalled, but it would also offer hope, and a practical set of ideas, to the developing world. 'Atoms for Peace' spawned many developments, including the establishment of the International Atomic Energy Agency, and eventually the Nuclear Non-Proliferation Treaty. While 'Atoms for Peace', as well as the institutions it created, has come under fire in recent years, it is hard to imagine what the world would have been like without it. Largely through the International Atomic Energy Agency, nations around the world have participated in research and development programs, including the use of nuclear energy in important civilian applications. Nuclear electric power accounts for nearly one-fifth of the world's electricity, reducing global tensions by replacing oil in many applications, and providing much of the world's electricity that is generated without the release of greenhouse gases or other destructive emissions. Many other nuclear and radiation-related technologies, especially radiopharmaceuticals and medical advances involving radiation, have resulted in large part from research spawned by 'Atoms for Peace'. Millions of lives have been saved in the process.
While the 'nuclear dilemma' remains a challenge almost as complex as it was fifty years ago, the

  18. The next fifty years

    International Nuclear Information System (INIS)

    Scaldwell, Reg

    1995-01-01

    The General Manager of the reactor division of Centronics, a world class manufacturer of reactor control systems and radiation detectors, describes its achievements and looks forward to speculate on what the next fifty years may mean in terms of technological development. Tables are given of the locations of Centronics Fission Chambers and of reactors controlled with Centronic Detectors. (UK)

  19. Fifty years of Cuba's medical diplomacy: from idealism to pragmatism.

    Science.gov (United States)

    Feinsilver, Julie M

    2010-01-01

    Medical diplomacy, the collaboration between countries to simultaneously produce health benefits and improve relations, has been a cornerstone of Cuban foreign policy since the outset of the revolution fifty years ago. It has helped Cuba garner symbolic capital (goodwill, influence, and prestige) well beyond what would have been possible for a small, developing country, and it has contributed to making Cuba a player on the world stage. In recent years, medical diplomacy has been instrumental in providing considerable material capital (aid, credit, and trade), as the oil-for-doctors deal with Venezuela demonstrates. This has helped keep the revolution afloat in trying economic times. What began as the implementation of one of the core values of the revolution, namely health as a basic human right for all peoples, has continued as both an idealistic and a pragmatic pursuit. This article examines the factors that enabled Cuba to conduct medical diplomacy over the past fifty years, the rationale behind the conduct of this type of soft power politics, the results of that effort, and the mix of idealism and pragmatism that has characterized the experience. Moreover, it presents a typology of the medical diplomacy that Cuba has used over the past fifty years.

  20. One-to-one dietary interventions undertaken in a dental setting to change dietary behaviour.

    Science.gov (United States)

    Harris, Rebecca; Gamboa, Ana; Dailey, Yvonne; Ashcroft, Angela

    2012-03-14

    The dental care setting is an appropriate place to deliver dietary assessment and advice as part of patient management. However, we do not know whether this is effective in changing dietary behaviour. To assess the effectiveness of one-to-one dietary interventions for all ages carried out in a dental care setting in changing dietary behaviour. The effectiveness of these interventions in the subsequent changing of oral and general health is also assessed. The following electronic databases were searched: the Cochrane Oral Health Group Trials Register (to 24 January 2012), the Cochrane Central Register of Controlled Trials (CENTRAL) (The Cochrane Library 2012, Issue 1), MEDLINE via OVID (1950 to 24 January 2012), EMBASE via OVID (1980 to 24 January 2012), CINAHL via EBSCO (1982 to 24 January 2012), PsycINFO via OVID (1967 to 24 January 2012), and Web of Science (1945 to 12 April 2011). We also undertook an electronic search of key conference proceedings (IADR and ORCA between 2000 and 13 July 2011). Reference lists of relevant articles, thesis publications (Dissertations Abstracts Online 1861 to 2011) were searched. The authors of eligible trials were contacted to identify any unpublished work. Randomised controlled trials assessing the effectiveness of one-to-one dietary interventions delivered in a dental care setting. Abstract screening, eligibility screening and data extraction decisions were all carried out independently and in duplicate by two review authors. Consensus between the two opinions was achieved by discussion, or involvement of a third review author. Five studies met the criteria for inclusion in the review. Two of these were multi-intervention studies where the dietary intervention was one component of a wider programme of prevention, but where data on dietary behaviour change were reported. One of the single intervention studies was concerned with dental caries prevention. The other two concerned general health outcomes. There were no studies

  1. The fifty-year-old woman and midlife stress.

    Science.gov (United States)

    Campbell, S

    It has been assumed that the fifties are a relatively stable decade; however, women in their fifties are susceptible to many stresses, internal and external. The possibilities of widowhood, divorce, or poverty, combined with intra- and interpersonal strains, make this a time of insecurity about aging for many women. Some suggestions as to why women nonetheless cope successfully with aging are considered.

  2. Evaluation of setting time and flow properties of self-synthesize alginate impressions

    Science.gov (United States)

    Halim, Calista; Cahyanto, Arief; Sriwidodo; Harsatiningsih, Zulia

    2018-02-01

    Alginate is an elastic hydrocolloid dental impression material used to obtain a negative reproduction of the oral mucosa, such as to record soft-tissue and occlusal relationships. The aim of the present study was to synthesize alginate and to determine its setting time and flow properties. Five groups of alginate were tested, comprising fifty samples of self-synthesized alginate and a commercial alginate impression product. The fifty samples were divided between two tests, twenty-five samples each for the setting-time and flow tests. Setting time was recorded in seconds (s); flow was recorded in square millimetres (mm2). The fastest setting time was in group three (148.8 s) and the slowest in group four. The highest flow was in group three (69.70 mm2) and the lowest in group one (58.34 mm2). Results were analyzed statistically by one-way ANOVA (α = 0.05), which showed a statistically significant difference in setting time, but not in flow properties, between the self-synthesized alginates and the commercial alginate impression product. In conclusion, the alginate impression material was successfully self-synthesized, and variations in composition influence setting time and flow properties. Group three most closely resembles the setting time of the control group; group four most closely resembles its flow.
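
The comparison the authors report (one-way ANOVA at α = 0.05) can be sketched with a plain F-statistic computation; the data below are made-up, hand-checkable numbers, not the study's measurements, and in practice the F value is compared against the tabulated critical value for the appropriate degrees of freedom.

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over a list of sample lists:
    F = (between-group mean square) / (within-group mean square)."""
    k = len(groups)                                   # number of groups
    n = sum(len(g) for g in groups)                   # total sample size
    grand = sum(x for g in groups for x in g) / n     # grand mean
    means = [sum(g) / len(g) for g in groups]         # per-group means
    ssb = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ssw = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ssb / (k - 1)) / (ssw / (n - k))

# Tiny demos: widely separated group means give a large F,
# identical group means give F = 0.
f_far = one_way_anova_f([[0.0, 1.0], [10.0, 11.0]])          # = 200.0
f_same = one_way_anova_f([[1, 2, 3], [1, 2, 3], [1, 2, 3]])  # = 0.0
```

A large F (relative to the critical value) is what would justify the paper's conclusion that composition significantly affects setting time.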

  3. Controlled-source seismic interferometry with one way wave fields

    Science.gov (United States)

    van der Neut, J.; Wapenaar, K.; Thorbecke, J. W.

    2008-12-01

    In Seismic Interferometry we generally cross-correlate registrations at two receiver locations and sum over an array of sources to retrieve a Green's function, as if one of the receiver locations hosted a (virtual) source and the other hosted an actual receiver. One application of this concept is to redatum an area of surface sources to a downhole receiver location, without requiring information about the medium between the sources and receivers, thus providing an effective tool for imaging below complex overburden; this is also known as the Virtual Source method. We demonstrate how elastic wavefield decomposition can be effectively combined with controlled-source Seismic Interferometry to generate virtual sources in a downhole receiver array that radiate only down- or upgoing P- or S-waves, with receivers sensing only down- or upgoing P- or S-waves. For this purpose we derive exact Green's matrix representations from a reciprocity theorem for decomposed wavefields. Required is the deployment of multi-component sources at the surface and multi-component receivers in a horizontal borehole. The theory is supported with a synthetic elastic model, where redatumed traces are compared with those of a directly modeled reflection response, generated by placing active sources at the virtual source locations and applying elastic wavefield decomposition on both source and receiver side.
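
The cross-correlate-and-sum recipe in the first sentence can be illustrated with a deliberately crude 1-D toy (impulsive direct arrivals only, unit velocity, hypothetical geometry): correlating the recordings at two receivers and stacking over a line of sources leaves a peak at the inter-receiver traveltime, i.e., the response of a virtual source at the first receiver.

```python
def xcorr_full(a, b):
    """Full cross-correlation of two equal-length traces (lists);
    output index k corresponds to lag k - (len(a) - 1)."""
    n = len(a)
    return [sum(a[i] * b[i - lag] for i in range(max(lag, 0), min(n, n + lag)))
            for lag in range(-(n - 1), n)]

nt = 50                                  # samples per trace
xr1, xr2 = 10, 20                        # receiver positions (unit velocity,
sources = [-25, -20, -15, -10, -5, 0]    #  so position = traveltime in samples)

corr = [0.0] * (2 * nt - 1)
for xs in sources:
    r1 = [0.0] * nt
    r2 = [0.0] * nt
    r1[xr1 - xs] = 1.0                   # direct arrival at receiver 1
    r2[xr2 - xs] = 1.0                   # direct arrival at receiver 2
    corr = [c + x for c, x in zip(corr, xcorr_full(r2, r1))]

# The source-dependent traveltimes cancel in each correlation, so every
# source contributes at the same lag: the inter-receiver traveltime.
lag = corr.index(max(corr)) - (nt - 1)   # = xr2 - xr1 = 10 samples
```

The same cancellation of the common source-to-receiver path is what lets the method work without knowing the medium above the receivers.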

  4. Students' Consideration of Source Information during the Reading of Multiple Texts and Its Effect on Intertextual Conflict Resolution

    Science.gov (United States)

    Kobayashi, Keiichi

    2014-01-01

    This study investigated students' spontaneous use of source information for the resolution of conflicts between texts. One hundred fifty-four undergraduate students read two conflicting explanations concerning the relationship between blood type and personality under two conditions: either one explanation with a higher credibility source and…

  5. Fifty years of nuclear magnetic resonance

    International Nuclear Information System (INIS)

    Martinez Valderrama, Juan Crisostomo

    1997-01-01

    Short information about the main developments of nuclear magnetic resonance during its fifty years of existence is presented. In addition, two examples of its application (HETCOR and INADEQUATE) to the structural determination of organic compounds are described.

  6. FIFTY-FIVE GALLON DRUM STANDARD STUDY

    International Nuclear Information System (INIS)

    Puigh, R.J.

    2009-01-01

    Fifty-five gallon drums are routinely used within the U.S. for the storage and eventual disposal of fissionable materials as transuranic or low-level waste. To support these operations, criticality safety evaluations are required. A questionnaire was developed and sent to selected end users at Hanford, Idaho National Laboratory, Lawrence Livermore National Laboratory, Oak Ridge and the Savannah River Site to solicit current practices. This questionnaire was used to gather information on the kinds of fissionable materials packaged into drums, the models used in performing criticality safety evaluations in support of operations involving these drums, and the limits and controls established for the handling and storage of these drums. The completed questionnaires were reviewed, and clarifications were solicited through individual communications with each end user to obtain more complete and consistent responses. All five sites have similar drum operations involving thousands to tens of thousands of fissionable-material waste drums. The primary sources for these drums are legacy (prior operations) and decontamination and decommissioning wastes at all sites except Lawrence Livermore National Laboratory. The results from this survey and our review are discussed in this paper.

  7. Resource cost results for one-way entanglement distillation and state merging of compound and arbitrarily varying quantum sources

    International Nuclear Information System (INIS)

    Boche, H.; Janßen, G.

    2014-01-01

    We consider one-way quantum state merging and entanglement distillation under compound and arbitrarily varying source models. Regarding quantum compound sources, where the source is memoryless but the source state is an unknown member of a certain set of density matrices, we continue investigations begun in the work of Bjelaković et al. ["Universal quantum state merging," J. Math. Phys. 54, 032204 (2013)] and determine the classical as well as the entanglement cost of state merging. We further investigate quantum state merging and entanglement distillation protocols for arbitrarily varying quantum sources (AVQS). In the AVQS model, the source state is assumed to vary in an arbitrary manner for each source output due to environmental fluctuations or adversarial manipulation. We determine the one-way entanglement distillation capacity for AVQS, where we invoke the famous robustification and elimination techniques introduced by Ahlswede. Regarding quantum state merging for AVQS, we show by example that the robustification- and elimination-based approach generally leads to suboptimal entanglement as well as classical communication rates.

  8. Estimation of distance error by fuzzy set theory required for strength determination of HDR (192)Ir brachytherapy sources.

    Science.gov (United States)

    Kumar, Sudhir; Datta, D; Sharma, S D; Chourasiya, G; Babu, D A R; Sharma, D N

    2014-04-01

    Verification of the strength of high dose rate (HDR) (192)Ir brachytherapy sources on receipt from the vendor is an important component of an institutional quality assurance program. Either reference air-kerma rate (RAKR) or air-kerma strength (AKS) is the recommended quantity to specify the strength of gamma-emitting brachytherapy sources. The use of a Farmer-type cylindrical ionization chamber of sensitive volume 0.6 cm(3) is one of the recommended methods for measuring RAKR of HDR (192)Ir brachytherapy sources. While using the cylindrical chamber method, it is required to determine the positioning error of the ionization chamber with respect to the source, which is called the distance error. An attempt has been made to apply fuzzy set theory to estimate the subjective uncertainty associated with the distance error. A simplified approach of applying fuzzy set theory has been proposed for quantifying the uncertainty associated with the distance error. In order to express the uncertainty in the framework of fuzzy sets, the uncertainty index was estimated and was found to be within 2.5%, which further indicates that the possibility of error in measuring such distance may be of this order. It is observed that the relative distances li estimated by the analytical method and the fuzzy set theoretic approach are consistent with each other. The crisp values of li estimated using the analytical method lie within the bounds computed using fuzzy set theory. This indicates that li values estimated using the analytical method are within 2.5% uncertainty. This value of uncertainty in distance measurement should be incorporated in the uncertainty budget while estimating the expanded uncertainty in HDR (192)Ir source strength measurement.
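    The alpha-cut idea behind such a fuzzy uncertainty estimate can be sketched in a few lines. This is an illustrative toy (not the authors' exact formulation): the measured distance is modeled as a triangular fuzzy number, and the relative half-width of its support gives an uncertainty index; the 10 cm nominal distance and 0.25 cm spread below are hypothetical numbers chosen to reproduce a 2.5% bound.

    ```python
    # Sketch only: triangular fuzzy number (a, m, b) with mode m and support [a, b].

    def triangular_alpha_cut(a, m, b, alpha):
        """Interval [lo, hi] of the fuzzy number at membership level alpha in [0, 1]."""
        lo = a + alpha * (m - a)
        hi = b - alpha * (b - m)
        return lo, hi

    def relative_uncertainty(a, m, b, alpha=0.0):
        """Half-width of the alpha-cut relative to the modal value, in percent."""
        lo, hi = triangular_alpha_cut(a, m, b, alpha)
        return 100.0 * (hi - lo) / (2.0 * m)

    # Hypothetical distance: mode 10.0 cm, support [9.75, 10.25] cm.
    print(relative_uncertainty(9.75, 10.0, 10.25))  # 2.5
    ```

    At alpha = 1 the cut collapses to the crisp modal value, mirroring how the crisp analytical estimate lies inside the fuzzy bounds.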

  9. Fiction or not? Fifty Shades is associated with health risks in adolescent and young adult females.

    Science.gov (United States)

    Bonomi, Amy E; Nemeth, Julianna M; Altenburger, Lauren E; Anderson, Melissa L; Snyder, Anastasia; Dotto, Irma

    2014-09-01

    No prior study has empirically characterized the association between health risks and reading popular fiction depicting violence against women. Fifty Shades--a blockbuster fiction series--depicts pervasive violence against women, perpetuating a broader social narrative that normalizes these types of risks and behaviors in women's lives. The present study compared health risks in women who read and did not read Fifty Shades; while our cross-sectional study design precluded causal determinations, it provides an empirical characterization of the health risks in women consuming the problematic messages in Fifty Shades. Females ages 18 to 24 (n=715), who were enrolled in a large Midwestern university, completed a cross-sectional online survey about their health behaviors and Fifty Shades' readership. The analysis included 655 females (219 who read at least the first Fifty Shades novel and 436 who did not read any part of Fifty Shades). Age- and race-adjusted multivariable models characterized Fifty Shades' readers and nonreaders on intimate partner violence victimization (experiencing physical, sexual and psychological abuse, including cyber-abuse, at some point during their lifetime); binge drinking (consuming five or more alcoholic beverages on six or more days in the last month); sexual practices (having five or more intercourse partners and/or one or more anal sex partner during their lifetime); and using diet aids or fasting for 24 or more hours at some point during their lifetime. One-third of subjects read Fifty Shades (18.6%, or 122/655, read all three novels, and 14.8%, or 97/655, read at least the first novel but not all three). In age- and race-adjusted models, compared with nonreaders, females who read at least the first novel (but not all three) were more likely than nonreaders to have had, during their lifetime, a partner who shouted, yelled, or swore at them (relative risk [RR]=1.25) and who delivered unwanted calls/text messages

  10. Unsplit schemes for hyperbolic conservation laws with source terms in one space dimension

    International Nuclear Information System (INIS)

    Papalexandris, M.V.; Leonard, A.; Dimotakis, P.E.

    1997-01-01

    The present work is concerned with an application of the theory of characteristics to conservation laws with source terms in one space dimension, such as the Euler equations for reacting flows. Space-time paths are introduced on which the flow/chemistry equations decouple to a characteristic set of ODEs for the corresponding homogeneous laws, thus allowing the introduction of functions analogous to the Riemann invariants in classical theory. The geometry of these paths depends on the spatial gradients of the solution. This particular decomposition can be used in the design of efficient unsplit algorithms for the numerical integration of the equations. As a first step, these ideas are implemented for the case of a scalar conservation law with a nonlinear source term. The resulting algorithm belongs to the class of MUSCL-type, shock-capturing schemes. Its accuracy and robustness are checked through a series of tests. The stiffness of the source term is also studied. Then, the algorithm is generalized for a system of hyperbolic equations, namely the Euler equations for reacting flows. A numerical study of unstable detonations is performed. 57 refs
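    The "unsplit" idea for a scalar law can be sketched minimally: advance the flux term and the source term in the same update rather than in alternating fractional steps. The toy below is a first-order upwind scheme for u_t + a u_x = s(u) with a hypothetical linear decay source, not the paper's MUSCL-type characteristic scheme.

    ```python
    import numpy as np

    # Minimal unsplit first-order upwind sketch for u_t + a*u_x = s(u), a > 0,
    # on a periodic grid. Flux and source are applied in one combined update.

    def step(u, a, dx, dt, source):
        flux = a * (u - np.roll(u, 1)) / dx      # upwind difference (periodic)
        return u - dt * flux + dt * source(u)    # advection and source, unsplit

    a, dx, dt = 1.0, 0.1, 0.05                   # CFL = a*dt/dx = 0.5
    u = np.where(np.arange(0.0, 1.0, dx) < 0.5, 1.0, 0.0)  # step initial data
    for _ in range(4):
        u = step(u, a, dx, dt, lambda v: -0.1 * v)  # mild linear decay source
    ```

    For stiff sources, as the abstract notes, an explicit combined update like this one degrades and implicit or characteristic-based treatments become necessary.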

  11. Deutsches Atomforum turns fifty

    International Nuclear Information System (INIS)

    Geisler, Maja

    2009-01-01

    Fifty years ago, the Deutsches Atomforum e. V. was founded to promote the peaceful uses of nuclear power in Germany. On July 1, 2009, the organization celebrated its fiftieth birthday in Berlin. The anniversary was celebrated in the Berlin electricity plant, Germany's oldest existing building for commercial electricity generation. DAtF President Dr. Walter Hohlefelder welcomed some 200 high-ranking guests from politics, industry, and from the nuclear community, above all, the Chancellor of the Federal Republic of Germany, Dr. Angela Merkel, and, as keynote speaker, Professor Dr. Arnulf Baring. (orig.)

  12. Critical sets in one-parametric mathematical programs with complementarity constraints

    NARCIS (Netherlands)

    Bouza Allende, G.; Guddat, J.; Still, Georg J.

    2008-01-01

    One-parametric mathematical programs with complementarity constraints are considered. The structure of the set of generalized critical points is analysed for the generic case. It is shown how this analysis can locally be reduced to the study of appropriate standard one-parametric finite problems. By

  13. Diabetes mellitus during pregnancy: a study of fifty cases

    International Nuclear Information System (INIS)

    Randhawa, M. S.; Moin, S.; Shoaib, F.

    2003-01-01

    To review and critically evaluate the incidence, epidemiology, clinical pattern, diagnosis, management, complications and outcome of diabetes mellitus during pregnancy in a hospital-based study. Results: The total number of women delivered was 11271. Fifty cases of diabetes mellitus during pregnancy were studied. Most patients were multiparous women more than 30 years of age, with gestational diabetes in 80% of cases, Type-II diabetes in 16%, and Type-I diabetes in only 4%. Insulin was required in 40% of patients. Eight women out of 50 had spontaneous miscarriage, 5 underwent preterm delivery, while 36 reached term with one intrauterine death. The total number of babies delivered alive was 41. There was one stillbirth and 3 neonatal deaths. Conclusion: Management of diabetes mellitus in pregnancy involves teamwork of obstetricians, physicians and neonatologists. (author)

  14. Muscular variations during axillary dissection: A clinical study in fifty patients

    Directory of Open Access Journals (Sweden)

    Upasna

    2015-01-01

    Methods: The anatomy of the axilla regarding muscular variations was studied in 50 patients who had an axillary dissection for the staging and treatment of invasive primary breast cancer over one year. Results: In a period of one year, two patients (4%) with axillary arch and one patient (2%) with absent pectoralis major and minor muscles were identified among fifty patients undergoing axillary surgery for breast cancer. Conclusions: Axillary arch, when present, should always be identified and formally divided to allow adequate exposure of axillary contents, in order to achieve a complete lymphatic dissection. Complete absence of pectoralis major and minor muscles precludes the insertion of breast implants and worsens the prognosis of breast cancer.

  15. Open-Source Electronic Health Record Systems for Low-Resource Settings: Systematic Review.

    Science.gov (United States)

    Syzdykova, Assel; Malta, André; Zolfo, Maria; Diro, Ermias; Oliveira, José Luis

    2017-11-13

    Despite the great impact of information and communication technologies on clinical practice and on the quality of health services, this trend has been almost exclusive to developed countries, whereas countries with poor resources suffer from many economic and social issues that have hindered the real benefits of electronic health (eHealth) tools. As a component of eHealth systems, electronic health records (EHRs) play a fundamental role in patient management and effective medical care services. Thus, the adoption of EHRs in regions with a lack of infrastructure, untrained staff, and ill-equipped health care providers is an important task. However, the main barrier to adopting EHR software in low- and middle-income countries is the cost of its purchase and maintenance, which highlights the open-source approach as a good solution for these underserved areas. The aim of this study was to conduct a systematic review of open-source EHR systems based on the requirements and limitations of low-resource settings. First, we reviewed existing literature on the comparison of available open-source solutions. In close collaboration with the University of Gondar Hospital, Ethiopia, we identified common limitations in poor resource environments and also the main requirements that EHRs should support. Then, we extensively evaluated the current open-source EHR solutions, discussing their strengths and weaknesses, and their appropriateness to fulfill a predefined set of features relevant for low-resource settings. The evaluation methodology allowed assessment of several key aspects of available solutions that are as follows: (1) integrated applications, (2) configurable reports, (3) custom reports, (4) custom forms, (5) interoperability, (6) coding systems, (7) authentication methods, (8) patient portal, (9) access control model, (10) cryptographic features, (11) flexible data model, (12) offline support, (13) native client, (14) Web client, (15) other clients, (16) code

  16. Quantum mechanics over sets

    Science.gov (United States)

    Ellerman, David

    2014-03-01

    In models of QM over finite fields (e.g., Schumacher's "modal quantum theory" MQT), one finite field stands out, Z2, since Z2 vectors represent sets. QM (finite-dimensional) mathematics can be transported to sets resulting in quantum mechanics over sets or QM/sets. This gives a full probability calculus (unlike MQT with only zero-one modalities) that leads to a fulsome theory of QM/sets including "logical" models of the double-slit experiment, Bell's Theorem, QIT, and QC. In QC over Z2 (where gates are non-singular matrices as in MQT), a simple quantum algorithm (one gate plus one function evaluation) solves the Parity SAT problem (finding the parity of the sum of all values of an n-ary Boolean function). Classically, the Parity SAT problem requires 2^n function evaluations in contrast to the one function evaluation required in the quantum algorithm. This is quantum speedup but with all the calculations over Z2 just like classical computing. This shows definitively that the source of quantum speedup is not in the greater power of computing over the complex numbers, and confirms the idea that the source is in superposition.
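    The classical baseline the abstract refers to is easy to make concrete: the parity of the sum of all values of an n-ary Boolean function requires evaluating the function on all 2^n inputs. The sketch below shows that baseline (the quantum algorithm over Z2 described in the abstract needs only one evaluation); the example function is 3-input XOR.

    ```python
    from itertools import product

    def parity_sat(f, n):
        """Parity of the sum of f over all 2^n Boolean inputs (2^n evaluations)."""
        return sum(f(*bits) for bits in product((0, 1), repeat=n)) % 2

    # Example: 3-input XOR outputs 1 on exactly 4 of the 8 inputs, so parity 0.
    xor3 = lambda a, b, c: a ^ b ^ c
    print(parity_sat(xor3, 3))  # 0
    ```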

  17. Analytical one parameter method for PID motion controller settings

    NARCIS (Netherlands)

    van Dijk, Johannes; Aarts, Ronald G.K.M.

    2012-01-01

    In this paper analytical expressions for PID-controllers settings for electromechanical motion systems are presented. It will be shown that by an adequate frequency domain oriented parametrization, the parameters of a PID-controller are analytically dependent on one variable only, the cross-over

  18. 75 FR 17832 - Pricing for 2010 Lincoln One-Cent Coin Two-Roll Set

    Science.gov (United States)

    2010-04-07

    ... DEPARTMENT OF THE TREASURY United States Mint Pricing for 2010 Lincoln One-Cent Coin Two-Roll Set AGENCY: United States Mint, Department of the Treasury. ACTION: Notice. SUMMARY: The United States Mint is announcing the price of the 2010 Lincoln One-Cent Coin Two-Roll Set. The 2010 Lincoln One-Cent...

  19. Mathematical card magic fifty-two new effects

    CERN Document Server

    Mulcahy, Colm

    2013-01-01

    Mathematical card effects offer both beginning and experienced magicians an opportunity to entertain with a minimum of props. Featuring mostly original creations, Mathematical Card Magic: Fifty-Two New Effects presents an entertaining look at new mathematically based card tricks. Each chapter contains four card effects, generally starting with simple applications of a particular mathematical principle and ending with more complex ones. Practice a handful of the introductory effects and, in no time, you'll establish your reputation as a "mathemagician." Delve a little deeper into each chapter and the mathematics gets more interesting. The author explains the mathematics as needed in an easy-to-follow way. He also provides additional details, background, and suggestions for further explorations.Suitable for recreational math buffs and amateur card lovers or as a text in a first-year seminar, this color book offers a diverse collection of new mathemagic principles and effects.

  20. PANDORA: keyword-based analysis of protein sets by integration of annotation sources.

    Science.gov (United States)

    Kaplan, Noam; Vaaknin, Avishay; Linial, Michal

    2003-10-01

    Recent advances in high-throughput methods and the application of computational tools for automatic classification of proteins have made it possible to carry out large-scale proteomic analyses. Biological analysis and interpretation of sets of proteins is a time-consuming undertaking carried out manually by experts. We have developed PANDORA (Protein ANnotation Diagram ORiented Analysis), a web-based tool that provides an automatic representation of the biological knowledge associated with any set of proteins. PANDORA uses a unique approach of keyword-based graphical analysis that focuses on detecting subsets of proteins that share unique biological properties and the intersections of such sets. PANDORA currently supports SwissProt keywords, NCBI Taxonomy, InterPro entries and the hierarchical classification terms from ENZYME, SCOP and GO databases. The integrated study of several annotation sources simultaneously allows a representation of biological relations of structure, function, cellular location, taxonomy, domains and motifs. PANDORA is also integrated into the ProtoNet system, thus allowing testing thousands of automatically generated clusters. We illustrate how PANDORA enhances the biological understanding of large, non-uniform sets of proteins originating from experimental and computational sources, without the need for prior biological knowledge on individual proteins.

  1. Circulations, debates and appropriations of the Fifty Shades of Grey in Argentina

    Directory of Open Access Journals (Sweden)

    Karina Felitti

    2018-01-01

    Full Text Available This article studies the social effects of the circulation and reception of the phenomenon of Fifty Shades of Grey in Argentina, in the context of the sexualization of culture, postfeminism and the local agenda on gender policies. It analyzes the discussions that the series aroused in the academy and in feminist militancy, its repercussions in the press and in the market of erotic goods and services, and the opinions and experiences of a set of readers. It is a qualitative study that combines textual analysis and in-depth interviews. Similarities are observed with analyses carried out in other countries, and local specificities are noted in relation to discussions of gender violence, its manifestations, and women’s agency.

  2. Fifty years of Erlangen radiochemistry

    International Nuclear Information System (INIS)

    Morell, W.

    2007-01-01

    On June 29, 2006, the Radiochemical Laboratory of AREVA NP GmbH (formerly Siemens AG) in Erlangen celebrated its fiftieth anniversary. The occasion was marked by an event attended by more than 1,000 guests, among them Werner Gebauhr, the 85-year-old founder and first head of the Laboratory; the Managing Directors of AREVA NP GmbH, Ralf Gueldner and Ruediger Steuerlein; representatives of universities, research institutions, power utilities, and public authorities. The present head of the Radiochemical Laboratory, Wilfred Morell, sketched the highlights of the work performed over the past fifty years, which ranged from solid-state and very-high-purity materials technologies to development and service activities for nuclear technology. Manfred Erve, head of the Technical Center of AREVA NP GmbH, of which the Radiochemical Laboratory is a part, emphasized the changes in priorities over the past fifty years, which had always been met successfully by Radiochemistry. In the scientific part of the event, Wolfgang Schwarz (E.ON Kernkraftwerk GmbH, KKW Isar), Ulf Ilg (EnBW Kraftwerk AG, KKW Philippsburg), and Hans-Josef Allelein (Gesellschaft fuer Anlagen- und Reaktorsicherheit mbH) explained 3 major subject areas in which Erlangen Radiochemistry over many years has contributed basic findings (see other articles in this atw issue). On the occasion of the anniversary, a comprehensive booklet was published under the title of '50 Jahre Radiochemie Erlangen - 1956-2006'. (orig.)

  3. What's in a ray set: moving towards a unified ray set format

    Science.gov (United States)

    Muschaweck, Julius

    2011-10-01

    For the purpose of optical simulation, a plethora of formats exist to describe the properties of a light source. Except for the EULUMDAT and IES formats, which describe sources in terms of aperture area and far field intensity, all these formats are vendor specific, and no generally accepted standard exists. Most illumination simulation software vendors use their own format for ray sets, which describe sources in terms of many rays. Some of them keep their format definition proprietary. Thus, software packages typically can read or write only their own specific format, although the actual data content is not so different. Typically, they describe the origin and direction of each ray as 3D vectors, plus a single number for magnitude, where magnitude may denote radiant flux, luminous flux (equivalently tristimulus Y), or tristimulus X and Z. Sometimes each ray also carries its wavelength, while other formats allow an overall spectrum to be specified for the whole source. In addition, in at least one format, polarization properties are also included for each ray. This situation makes it inefficient and potentially error prone for light source manufacturers to provide ray data sets for their sources in many different formats. Furthermore, near field goniometer vendors again use their proprietary formats to store the source description in terms of luminance data, and offer their proprietary software to generate ray sets from this data base. Again, this plethora of formats makes ray set production inefficient and potentially error prone. In this paper, we propose to describe ray data sets in terms of phase space, as a step towards a standardized ray set format. It is well known that luminance and radiance can be defined as flux density in phase space: luminance is flux divided by etendue. Therefore, single rays can be thought of as center points of phase space cells, where each cell possesses its volume (i.e. etendue), its flux, and therefore its luminance. In
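    The common data content described above can be sketched as a record layout. This is a hypothetical structure for illustration, not any vendor's actual format: per-ray geometry plus one magnitude, with wavelength either per ray or inherited from a source-wide spectrum, and luminance recovered from the phase-space view as flux over etendue.

    ```python
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Ray:
        origin: tuple       # (x, y, z) start point, e.g. in mm
        direction: tuple    # (dx, dy, dz) unit vector
        flux: float         # luminous flux in lm (or radiant flux in W)
        wavelength_nm: Optional[float] = None  # None -> use the source spectrum

    def luminance(ray: Ray, etendue: float) -> float:
        """Phase-space view: each ray is the center of a cell with a given
        etendue; its luminance is the cell's flux divided by that etendue."""
        return ray.flux / etendue

    r = Ray((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), flux=1e-4)
    ```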

  4. Columbia physics in the fifties: Untold tales

    Directory of Open Access Journals (Sweden)

    J. Sucher

    2000-07-01

    Full Text Available Eyvind Wichmann and I were both graduate students at Columbia University in the fifties, a decade of remarkable creativity by a star-studded physics faculty, which included some ten Nobel Laureates. I share some reminiscences about our time there and explain the role played in our relationship by an eightball.

  5. Fifty years ago - nuclear physics at Cambridge

    International Nuclear Information System (INIS)

    Burcham, W.E.

    1982-01-01

    Fifty years ago, the Cavendish saw the first nuclear transformations using artificially accelerated particles, and was soon to provide confirmation of the discovery of the positron. In 1932, the Cavendish Laboratory under the guiding hand of Rutherford was a world focus for research and with these discoveries saw the birth of modern particle physics. (orig.).

  6. EDITORIAL: Semiconductor lasers: the first fifty years Semiconductor lasers: the first fifty years

    Science.gov (United States)

    Calvez, S.; Adams, M. J.

    2012-09-01

    Anniversaries call for celebrations. Since it is now fifty years since the first semiconductor lasers were reported, it is highly appropriate to celebrate this anniversary with a Special Issue dedicated to the topic. The semiconductor laser now has a major effect on our daily lives since it has been a key enabler in the development of optical fibre communications (and hence the internet and e-mail), optical storage (CDs, DVDs, etc) and barcode scanners. In the early 1960s it was impossible for most people (with the exception of very few visionaries) to foresee any of these future developments, and the first applications identified were for military purposes (range-finders, target markers, etc). Of course, many of the subsequent laser applications were made possible by developments in semiconductor materials, in the associated growth and fabrication technology, and in the increased understanding of the underlying fundamental physics. These developments continue today, so that the subject of semiconductor lasers, although mature, is in good health and continues to grow. Hence, we can be confident that the pervasive influence of semiconductor lasers will continue to develop as optoelectronics technology makes further advances into other sectors such as healthcare, security and a whole host of applications based on the global imperatives to reduce energy consumption, minimise environmental impact and conserve resources. The papers in this Special Issue are intended to tell some of the story of the last fifty years of laser development as well as to provide evidence of the current state of semiconductor laser research. Hence, there are a number of papers where the early developments are recalled by authors who played prominent parts in the story, followed by a selection of papers from authors who are active in today's exciting research. The twenty-fifth anniversary of the semiconductor laser was celebrated by the publication of a number of papers dealing with the early

  7. Fifty years after Hiroshima

    International Nuclear Information System (INIS)

    Imai, R.

    1997-01-01

    Any discussion of the nuclear-proliferation regime in the post-Cold War era, its present role and expected function in the future must take into account the outcome of the New York Non-proliferation Treaty Review Conference held in 1995, and the four factors that shaped the twentieth century: fascism and communism, nuclear energy and environmental concerns, as well as the interaction between them. None of these factors can be handled within the boundaries of a single state. After fifty years of nuclear competition both for military and energy generation purposes, international conferences on disarmament, the Non-proliferation Treaty, the Comprehensive Test Ban Treaty, and START negotiations and agreements, nuclear issues still remain. Population growth, energy demands and supplies, all related to global environmental changes, must be taken into account.

  8. Los Angeles OneSource System Youth Participant Customer Satisfaction Survey, 2010-2011

    Science.gov (United States)

    Heisley, Deborah D.; Moore, Richard W.; Patch, Robin N.

    2012-01-01

    As part of the Workforce Investment Act of 1998, Los Angeles OneSource Centers offer low-income youth ages 14-21 services aimed at improving educational achievement, enhancing job skills, and preparing for college. The primary purpose of this study was to evaluate the youths' satisfaction with services received at 14 OneSource Centers throughout…

  9. Information Systems Curricula: A Fifty Year Journey

    Science.gov (United States)

    Longenecker, Herbert E., Jr.; Feinstein, David; Clark, Jon D.

    2013-01-01

    This article presents the results of research to explore the nature of changes in skills over a fifty year period spanning the life of Information Systems model curricula. Work begun in 1999 was expanded both backwards in time, as well as forwards to 2012 to define skills relevant to Information Systems curricula. The work in 1999 was based on job…

  10. At the sources of one's well-being: early rehabilitation for employees with symptoms of distress.

    Science.gov (United States)

    Kuoppala, Jaana; Kekoni, Jouni

    2013-07-01

    To examine the effects of a new multifaceted early rehabilitation program on employee well-being targeted on distressed employees in small-to-medium sized workplaces. Fifty-two employees (92% women; age: 34 to 66 years) participated in five biweekly sessions with one follow-up day at 6 months. Rehabilitation professionals specially trained for the mindfulness method covered topics from health, nutrition, sleep, physical activity to stress management. Employees were divided by their well-being level at baseline into "healthy" and "symptomatic" groups. Main outcomes were job, mental, and physical well-being. Well-being among the symptomatic employees reached that of the healthy ones at baseline. Also, the healthy participants benefited from the program to a small degree. The preliminary findings of this new program are promising although more research is needed on its effects and cost-effectiveness.

  11. [Origin exploration of "the fifty-nine acupoints for febrile disease"].

    Science.gov (United States)

    Li, Guangyi

    2017-02-12

    The fifty-nine acupoints for febrile disease are recorded in Huangdi Neijing ( Huangdi's Internal Classics ). By analyzing the combination of these acupoints, the writer discovered the acupoint composition and traced their origins in Huangdi's Internal Classics , in which the terms biaoben, qijie and beishu are involved in the theoretic evidence. The writer thought the "fifty-nine acupoints for febrile disease" implied the self-evolution of an acupuncture school in ancient times, which was formed by absorbing the theoretic experiences of the other schools. It is necessary to analyze and interpret other literature besides Huangdi's Internal Classics to gain a fuller understanding of it.

  12. 1977 flywheel technology symposium proceedings. [Fifty-two papers

    Energy Technology Data Exchange (ETDEWEB)

    Chang, G.C.; Stone, R.G. (eds.)

    1978-03-01

    Fifty-two papers, four paper abstracts, and four brief summaries of panel discussions are presented on flywheel energy storage technology. A separate abstract was prepared for each of 41 papers for inclusion in DOE Energy Research Abstracts (ERA). Eleven papers were processed previously for inclusion in the data base. (PMA)

  13. Young Women's Perceptions of the Relationship in Fifty Shades of Grey.

    Science.gov (United States)

    Bonomi, Amy E; Nichols, Emily M; Carotta, Christin L; Kiuchi, Yuya; Perry, Samantha

    2016-02-01

    Millions of women are interacting with Fifty Shades of Grey-a best-selling novel and film. Yet, to date, no social science study has been undertaken to examine women's perceptions of the Fifty Shades relationship narrative in its film adaptation-what they deem appealing, what they deem unappealing, and what they would welcome or resist in their own relationship. In the present study, we used focus groups to examine women's perceptions of the relationship patterns in the Fifty Shades of Grey film. Focus groups were conducted with 35 young adult women (randomly sampled from the registrar's office of a large Midwestern university) immediately after watching the Fifty Shades film with the study team at a local theater within two days of the film's release. Seven semistructured questions concentrating on reactions to the relationship patterns between Christian Grey and Anastasia Steele depicted in the Fifty Shades film were asked, including general reactions, appealing and unappealing characteristics, romantic and dangerous elements, and aspects that participants would tolerate (or not tolerate) in their own relationships. While participants assessed parts of the relationship between Christian and Anastasia as exciting and romantic, they consistently indicated an unappealing lack of health in the relationship. Participants expressed grave concerns over Christian's stalking, controlling, manipulative, and emotionally abusive behavior, anger in sexual interactions, and neglect of Anastasia's needs. At the same time, they sympathized with and rationalized Christian's behaviors as a function of his personality, needs, and abilities. A small contingent implicated Anastasia in the unhealthy relationship process, whereas a broader majority of participants highlighted the challenges with trying to "speak up" in an unhealthy relationship like Christian and Anastasia's. When asked where participants would draw the line in their own relationship, participants indicated they would

  14. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights Into Developmental Theories From Adult Data.

    Science.gov (United States)

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second orthogonal dimension (e.g., shape). The experiments showed performance of the FIST involves suppression of the representation of the ignored dimension; response times for selecting a target object in an immediately following oddity task were slower when the oddity target was the previously-ignored stimulus of the FIST. However, proactive interference from the previously relevant stimulus dimension also impaired responding. The results are discussed with respect to two prominent theories of the source of difficulty for children and adults on dimensional shifting tasks: attentional inertia and negative priming. In contrast to prior work emphasizing one over the other process, the findings indicate that difficulty in the FIST, and by extension other set-shifting tasks, can be attributed to both the need to shift away from the previously attended representation (attentional inertia), and the need to shift to the previously ignored representation (negative priming). Results are discussed in relation to theoretical explanations for cognitive inflexibility in adults and children.

  15. Detrital zircon provenance of the Hartselle Sandstone Unit, Southeastern USA: Insights into sediment source, paleogeography, and setting

    Science.gov (United States)

    Harthy, M. A.; Gifford, J.

    2017-12-01

    The Hartselle sandstone is an excellent example of an oil sand, a resource rich in bitumen. The unit is a light-colored, thick-bedded to massive quartzose sandstone that is widespread across an area from Georgia in the east to Mississippi in the west, and from Alabama in the south to Kentucky in the north. Formation thickness ranges from 0 to more than 150 feet. The unit has been stratigraphically dated to the Middle-Upper Mississippian. One hypothesis suggests that the sandstone unit formed from the geological remains of barrier islands located in the ocean between Gondwana and Laurentia. The Hartselle is thought to have formed by the movement of waves and currents along the shoreline, which carried sand and concentrated it into a set of northwest- to southeast-trending barrier islands. Transgression-regression events shifted the islands back and forth relative to the position of the shoreline, producing the large areal extent of the unit. However, the current data are insufficient to explain the geographic orientation of the Hartselle sandstone unit, which does not run parallel to the ancient shoreline. Another open question is the source of the sand: some workers believe it was derived from the south (Gondwana), others that it was eroded from the north (Laurentia). Detrital zircon provenance analysis will address this uncertainty in sediment source. We will compare zircon U-Pb age spectra to possible Laurentian and Gondwanan source areas to discriminate between these possibilities. In addition, the age of the youngest detrital zircon population will provide additional constraints on the maximum age of deposition for the unit. These detrital ages will also help us understand the tectonic setting at the time of Hartselle deposition. Lastly, we aim to explain the widespread nature of the unit and the processes involved in the formation of the Hartselle sandstone. Taken together, these interpretations will illuminate the age, depositional and tectonic setting of a

  16. Basic limnology of fifty-one lakes in Costa Rica.

    Science.gov (United States)

    Haberyan, Kurt A; Horn, Sally P; Umaña, Gerardo

    2003-03-01

    We visited 51 lakes in Costa Rica as part of a broad-based survey to document their physical and chemical characteristics and how these relate to the mode of formation and geographical distribution of the lakes. The four oxbow lakes were low in elevation and tended to be turbid, high in conductivity and CO2, but low in dissolved O2; one of these, L. Gandoca, had a hypolimnion essentially composed of sea water. These were similar to the four wetland lakes, but the latter instead had low conductivities and pH, and turbidity was often due to tannins rather than suspended sediments. The thirteen artificial lakes formed a very heterogeneous group, whose features varied depending on local factors. The thirteen lakes dammed by landslides, lava flows, or lahars occurred in areas with steep slopes, and were more likely to be stratified than most other types of lakes. The eight lakes that occupy volcanic craters tended to be deep, stratified, clear, and cool; two of these, L. Hule and L. Río Cuarto, appeared to be oligomictic (tending toward meromictic). The nine glacial lakes, all located above 3440 m elevation near Cerro Chirripó, were clear, cold, dilute, and are probably polymictic. Cluster analysis resulted in three significant groups of lakes. Cluster 1 included four calcium-rich lakes (average 48 mg/l), Cluster 2 included fourteen lakes with more Si than Ca2+ and higher Cl- than the other clusters, and Cluster 3 included the remaining thirty-three lakes that were generally less concentrated. Each cluster included lakes of various origins located in different geographical regions; these data indicate that, apart from the high-altitude glacial lakes and lakes in the Miravalles area, similarity in lake chemistry is independent of lake distribution.

  17. Family dinner frequency, settings and sources, and body weight in US adults.

    Science.gov (United States)

    Sobal, Jeffery; Hanson, Karla

    2014-07-01

    Contemporary families and food systems are both becoming more dynamic and complex, and current associations between adult family meals and body mass index (BMI) are not well understood. This investigation took a new approach by examining diverse settings and sources of food for family dinners in relation to BMI in a cross-sectional, nationally representative survey of 360 US adults aged 18-85 living with family members. In this sample, 89% of adults ate family dinners at least 5 days per week, and almost all ate family dinners cooked and eaten at home. About half of these adults also ate family dinners at restaurants or fast food places, or ate takeout food at home; less common were family dinners at the homes of relatives or friends. Family dinners eaten at fast food places, but not in other settings or from other sources, were significantly associated with higher BMI. Overall, adult family dinners were commonplace, usually involved home cooking, and, when at fast food places, may be associated with higher adult body weight. Copyright © 2014. Published by Elsevier Ltd.

  18. Ranking Tehran’s Stock Exchange Top Fifty Stocks Using Fundamental Indexes and Fuzzy TOPSIS

    Directory of Open Access Journals (Sweden)

    E. S. Saleh

    2017-08-01

    Full Text Available Investment through the purchase of securities constitutes an important part of countries' economic exchange. Decisions about investing in a particular stock have therefore become one of the most active areas of economic and financial research, and various institutions have begun to rank companies' stocks and set priorities for stock purchase and investment. The current research, by determining the indexes required for ranking companies based on their share value on the Tehran Stock Exchange, can greatly help the accurate ranking of the fifty premier listed companies. Initial ranking indicators are extracted, and then a decision-making group (exchange experts) determines the final indexes using the Delphi method and non-parametric statistical methods. Next, criteria weights are obtained using fuzzy ANP, taking their interactions with each other into account. Finally, the fifty premier companies listed on the Tehran Stock Exchange in 2014 are ranked using fuzzy TOPSIS, with company information extracted with the software "Rahavard Novin". A sensitivity analysis with respect to the criteria weights and the related discussion conclude the study.
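
    TOPSIS ranks alternatives by their relative closeness to an ideal solution and distance from an anti-ideal one. The sketch below illustrates only that core idea with a crisp (non-fuzzy) TOPSIS; the study itself uses a fuzzy variant, and the stocks, criteria, scores, and weights here are entirely hypothetical.

    ```python
    import math

    def topsis(matrix, weights, benefit):
        """Score alternatives (rows) by relative closeness to the ideal solution."""
        ncols = len(matrix[0])
        # vector-normalize each column, then apply criterion weights
        norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
        v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
        cols = list(zip(*v))
        ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
        anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
        # relative closeness: 1 = at the ideal point, 0 = at the anti-ideal point
        return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
                for row in v]

    # rows: stocks; columns: hypothetical P/E (cost), ROE (benefit), EPS growth (benefit)
    matrix = [[8.0, 22.0, 10.0],
              [12.0, 18.0, 15.0],
              [6.0, 9.0, 4.0]]
    scores = topsis(matrix, weights=[0.3, 0.4, 0.3], benefit=[False, True, True])
    ranking = sorted(range(len(scores)), key=lambda i: -scores[i])
    print(scores, ranking)
    ```

    In the full method, the weights fed into this step would come from the fuzzy ANP stage rather than being fixed by hand.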

  19. Solute transport with periodic input point source in one-dimensional ...

    African Journals Online (AJOL)

    JOY

    groundwater flow velocity is considered proportional to multiple of temporal function and ζ th ... One-dimensional solute transport through porous media with or without .... solute free. ... the periodic concentration at source of the boundary i.e.,. 0.

  20. mmpdb: An Open-Source Matched Molecular Pair Platform for Large Multiproperty Data Sets.

    Science.gov (United States)

    Dalke, Andrew; Hert, Jérôme; Kramer, Christian

    2018-05-29

    Matched molecular pair analysis (MMPA) enables the automated and systematic compilation of medicinal chemistry rules from compound/property data sets. Here we present mmpdb, an open-source matched molecular pair (MMP) platform to create, compile, store, retrieve, and use MMP rules. mmpdb is suitable for the large data sets typically found in pharmaceutical and agrochemical companies and provides new algorithms for fragment canonicalization and stereochemistry handling. The platform is written in Python and based on the RDKit toolkit. It is freely available from https://github.com/rdkit/mmpdb.

  1. Examining the Types of Social Support and the Actual Sources of Support in Older Chinese and Korean Immigrants

    Science.gov (United States)

    Wong, Sabrina T.; Yoo, Grace J.; Stewart, Anita L.

    2005-01-01

    This study explored social support domains and actual sources of support for older Chinese and Korean immigrants and compared them to the traditional domains based on mainly White, middle class populations. Fifty-two older Cantonese and Korean speaking immigrants participated in one of eight focus groups. We identified four similar domains:…

  2. German Atomic Energy Act turns fifty

    International Nuclear Information System (INIS)

    Schneider, Horst

    2009-01-01

    The German Atomic Energy Act entered into force on January 1, 1960. It turns fifty at the beginning of 2010. Is this a reason to celebrate or rather the opposite? Lawyers, in principle, can view old pieces of legislation from two perspectives: On the one hand, aged laws are treated in a spirit of veneration and are celebrated as proven. On the other hand, an anniversary of this kind can be a welcome occasion for demands to abolish or, at least, fundamentally renew the law. Over the past half century, the German Atomic Energy Act went through stormy and varied phases of both a legal and a political character. Its 50th anniversary is likely to spark off very conflicting evaluations as well. A review of legal history shows that the German or, rather, the Federal German Atomic Energy Act (AtG) was not a first-of-its-kind piece of legislation but stemmed from the 1957 EURATOM Treaty, in a way representing a latecomer of that treaty. The Atomic Energy Act experienced a number of important developments throughout its history: - In 1975, compulsory licensing of fuel element factories was introduced. - The back end of the fuel cycle, especially final storage, was first comprehensively incorporated into the Atomic Energy Act in 1976. - In 1985, legislators decided in favor of unlimited nuclear liability. - In 1994 and 1998, only selected innovations were introduced under the headings of environmental impact assessment and suitability for repository storage, because the controversy about nuclear power did not permit a fundamental alignment toward a more comprehensive, modern safety law. - The decision to opt out of the peaceful uses of nuclear power in 2002 drew the final line so far under decisions about the direction of nuclear law in a major amendment. In parallel, the decisions by the Federal Constitutional Court and the Federal Administrative Court in the late 1970s and, above all, the 1980s provided important assistance which has remained valid to this day.
What is

  3. ISINA: INTEGRAL Source Identification Network Algorithm

    Science.gov (United States)

    Scaringi, S.; Bird, A. J.; Clark, D. J.; Dean, A. J.; Hill, A. B.; McBride, V. A.; Shaw, S. E.

    2008-11-01

    We give an overview of ISINA: INTEGRAL Source Identification Network Algorithm. This machine learning algorithm, using random forests, is applied to the IBIS/ISGRI data set in order to ease the production of unbiased future soft gamma-ray source catalogues. First, we introduce the data set and the problems encountered when dealing with images obtained using the coded mask technique. The initial step of source candidate searching is introduced and an initial candidate list is created. A description of the feature extraction on the initial candidate list is then performed together with feature merging for these candidates. Three training and testing sets are created in order to deal with the diverse time-scales encountered when dealing with the gamma-ray sky. Three independent random forests are built: one dealing with faint persistent source recognition, one dealing with strong persistent sources and a final one dealing with transients. For the latter, a new transient detection technique is introduced and described: the transient matrix. Finally the performance of the network is assessed and discussed using the testing set and some illustrative source examples. Based on observations with INTEGRAL, an ESA project with instruments and science data centre funded by ESA member states (especially the PI countries: Denmark, France, Germany, Italy, Spain), Czech Republic and Poland, and the participation of Russia and the USA. E-mail: simo@astro.soton.ac.uk
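
    The ensemble-voting idea behind the random forests ISINA uses can be illustrated with a deliberately tiny toy: many weak, randomized classifiers (here, one-feature decision stumps trained on bootstrap samples) vote on each candidate. Nothing below reflects the actual IBIS/ISGRI features or the ISINA implementation; the two-feature data set is synthetic and purely for illustration.

    ```python
    import random

    random.seed(0)

    def make_data(n):
        # synthetic two-feature data: class 1 clusters near (3, 3), class 0 near (0, 0)
        data = []
        for _ in range(n):
            label = random.randint(0, 1)
            x = [random.gauss(3.0 * label, 0.5), random.gauss(3.0 * label, 0.5)]
            data.append((x, label))
        return data

    def train_stump(sample):
        # weak learner: threshold one randomly chosen feature at its sample mean,
        # and orient the stump to match the majority class above the threshold
        feat = random.randint(0, 1)
        vals = [x[feat] for x, _ in sample]
        thresh = sum(vals) / len(vals)
        above = [y for x, y in sample if x[feat] > thresh]
        hi = 1 if above and 2 * sum(above) >= len(above) else 0
        return feat, thresh, hi

    def predict(forest, x):
        # majority vote over all stumps
        votes = sum(hi if x[feat] > thresh else 1 - hi for feat, thresh, hi in forest)
        return 1 if 2 * votes >= len(forest) else 0

    train = make_data(200)
    forest = [train_stump(random.choices(train, k=len(train))) for _ in range(25)]
    test = make_data(100)
    accuracy = sum(predict(forest, x) == y for x, y in test) / len(test)
    print("held-out accuracy:", accuracy)
    ```

    A real random forest grows full decision trees with random feature subsets at every split; the bootstrap-plus-vote structure shown here is the part the sketch shares with it.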

  4. A Source Identification Algorithm for INTEGRAL

    Science.gov (United States)

    Scaringi, Simone; Bird, Antony J.; Clark, David J.; Dean, Anthony J.; Hill, Adam B.; McBride, Vanessa A.; Shaw, Simon E.

    2008-12-01

    We give an overview of ISINA: INTEGRAL Source Identification Network Algorithm. This machine learning algorithm, using Random Forests, is applied to the IBIS/ISGRI dataset in order to ease the production of unbiased future soft gamma-ray source catalogues. The key steps of candidate searching, filtering and feature extraction are described. Three training and testing sets are created in order to deal with the diverse timescales and diverse objects encountered when dealing with the gamma-ray sky. Three independent Random Forest are built: one dealing with faint persistent source recognition, one dealing with strong persistent sources and a final one dealing with transients. For the latter, a new transient detection technique is introduced and described: the Transient Matrix. Finally the performance of the network is assessed and discussed using the testing set and some illustrative source examples.

  5. Electricity market auction settings in a future Danish electricity system with a high penetration of renewable energy sources - A comparison of marginal pricing and pay-as-bid

    International Nuclear Information System (INIS)

    Nielsen, Steffen; Sorknaes, Peter; Ostergaard, Poul Alberg

    2011-01-01

    The long-term goal for Danish energy policy is to be free of fossil fuels through the increasing use of renewable energy sources (RES) including fluctuating renewable electricity (FRE). The Danish electricity market is part of the Nordic power exchange, which uses a Marginal Price auction system (MPS) for the day-ahead auctions. The market price is thus equal to the bidding price of the most expensive auction winning unit. In the MPS, the FRE bid at prices of or close to zero resulting in reduced market prices during hours of FRE production. In turn, this reduces the FRE sources' income from market sales. As more FRE is implemented, this effect will only become greater, thereby reducing the income for FRE producers. Other auction settings could potentially help to reduce this problem. One candidate is the pay-as-bid auction setting (PAB), where winning units are paid their own bidding price. This article investigates the two auction settings, to find whether a change of auction setting would provide a more suitable frame for large shares of FRE. This has been done with two energy system scenarios with different shares of FRE. From the analysis, it is found that MPS is generally better for the FRE sources. The result is, however, very sensitive to the base assumptions used for the calculations. -- Highlights: → In this study two different auction settings for the Danish electricity market are compared. → Two scenarios are used in the analyses, one representing the present system and one representing a future 100% renewable energy system. → We find that marginal price auction system is most suitable for supporting fluctuating renewable energy in both scenarios. → The results are very sensitive to the assumptions about bidding prices for each technology.
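
    The difference between the two auction settings reduces to a few lines of code: under MPS every accepted unit is paid the bid of the most expensive accepted unit, while under PAB each unit is paid its own bid, so a zero-bidding FRE producer earns the marginal price under MPS but essentially nothing under PAB. The bids, capacities, and demand below are illustrative assumptions, not figures from the study.

    ```python
    def clear_auction(bids, demand_mw):
        """Accept bids cheapest-first until demand is met.

        bids: list of (name, price_per_mwh, capacity_mw) tuples, in any order.
        Returns a list of (name, bid_price, accepted_mw).
        """
        accepted, remaining = [], demand_mw
        for name, price, cap in sorted(bids, key=lambda b: b[1]):
            if remaining <= 0:
                break
            take = min(cap, remaining)
            accepted.append((name, price, take))
            remaining -= take
        return accepted

    def revenues(accepted):
        # MPS: every winner is paid the most expensive accepted bid (the marginal price)
        # PAB: every winner is paid its own bid
        marginal_price = max(price for _, price, _ in accepted)
        mps = {name: marginal_price * mw for name, price, mw in accepted}
        pab = {name: price * mw for name, price, mw in accepted}
        return mps, pab

    bids = [("wind", 0.0, 500), ("coal", 30.0, 400), ("gas", 55.0, 300)]
    mps, pab = revenues(clear_auction(bids, demand_mw=1000))
    print("wind revenue:", mps["wind"], "under MPS vs", pab["wind"], "under PAB")
    ```

    In practice PAB bidders would raise their bids toward the expected clearing price rather than bid marginal cost, which is one reason the comparison in the article is sensitive to the assumed bidding behavior.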

  6. Conference on Nuclear Energy and Science for the 21st Century: Atoms for Peace Plus Fifty - Washington, D.C., October 2003

    Energy Technology Data Exchange (ETDEWEB)

    Pfaltzgraff, Robert L [Institute for Foreign Policy Analysis

    2006-10-22

    This conference's focus was the peaceful uses of the atom and their implications for nuclear science, energy security, nuclear medicine and national security. The conference also provided the setting for the presentation of the prestigious Enrico Fermi Prize, a Presidential Award which recognizes the contributions of distinguished members of the scientific community for a lifetime of exceptional achievement in the science and technology of nuclear, atomic, molecular, and particle interactions and effects. An impressive group of distinguished speakers addressed various issues that included: the impact and legacy of the Eisenhower Administration’s “Atoms for Peace” concept, the current and future role of nuclear power as an energy source, the challenges of controlling and accounting for existing fissile material, and the horizons of discovery for particle or high-energy physics. The basic goal of the conference was to examine what has been accomplished over the past fifty years as well as to peer into the future to gain insights into what may occur in the fields of nuclear energy, nuclear science, nuclear medicine, and the control of nuclear materials.

  7. First Fifty Years of Chemoresistive Gas Sensors

    Directory of Open Access Journals (Sweden)

    Giovanni Neri

    2015-01-01

    Full Text Available The first fifty years of chemoresistive sensors for gas detection are reviewed here, focusing on the main scientific and technological innovations that have occurred in the field over the course of these years. A look at advances made in fundamental and applied research leading to the development of actual high-performance chemoresistive devices is presented. The approaches devoted to the synthesis of novel semiconducting materials with unprecedented nanostructures and gas-sensing properties are also presented. Perspectives on new technologies and future applications of chemoresistive gas sensors are also highlighted.

  8. Nuclear fission discovered fifty years ago

    International Nuclear Information System (INIS)

    Weis, M.

    1988-01-01

    Fifty years ago, Otto Hahn, Lise Meitner and Fritz Strassmann discovered the process of nuclear fission which, more than other scientific discoveries to date, profoundly has changed the world and continues to influence our life significantly: This discovery made the up to then incontestable physicists' view of the atom as an inseparable whole suddenly shatter to pieces. It has brought about the invaluable advantages of a peaceful utilization of nuclear energy, and at the same time put scientists in the position to build the most terrible weapon ever, threatening mankind and earth with complete destruction. All this certainly is reason enough to recall the scientists, their work and the spirit of the time. (orig.) [de

  9. Nuclear spin: Fifty years of ups and downs

    Energy Technology Data Exchange (ETDEWEB)

    Pines, A. [Lawrence Berkeley National Lab., CA (United States)

    1996-12-31

    Rumors of its demise notwithstanding, nuclear magnetic resonance (NMR) continues to flourish fifty years after our birth. The lecture will be a reminiscence about moments of excitation, coherence and relaxation in the history of NMR which produced, among other developments, spin echoes and time reversal, Fourier transform and multidimensional spectroscopy, magnetic resonance imaging, and high resolution solid state NMR. Applications of modern NMR spectroscopy cut across the boundaries of physics, chemistry, materials, biology and medicine.

  10. Defect inspection in hot slab surface: multi-source CCD imaging based fuzzy-rough sets method

    Science.gov (United States)

    Zhao, Liming; Zhang, Yi; Xu, Xiaodong; Xiao, Hong; Huang, Chao

    2016-09-01

    To provide an accurate surface-defect inspection method and to automate a robust image region-of-interest (ROI) delineation strategy on the production line, a multi-source CCD imaging based fuzzy-rough sets method is proposed for hot slab surface quality assessment. The presented method and the devised system apply mainly to surface quality inspection for strip, billet, slab, and similar products. In this work we take into account the complementary advantages of two common machine vision (MV) systems: line-array CCD traditional scanning imaging (LS-imaging) and area-array CCD laser three-dimensional (3D) scanning imaging (AL-imaging). By establishing a fuzzy-rough sets model in the detection system, the seeds for relative fuzzy connectedness (RFC) delineation of the ROI can be placed adaptively; the model introduces upper and lower approximation sets for ROI definition, by which the boundary region can be delineated through an RFC region competitive classification mechanism. For the first time, a multi-source CCD imaging based fuzzy-rough sets strategy is attempted for CC-slab surface defect inspection, allowing AI algorithms and powerful ROI delineation strategies to be applied automatically in the MV inspection field.

  11. Detection of hepatitis C virus RNA: comparison of one-stage polymerase chain reaction (PCR) with nested-set PCR.

    OpenAIRE

    Gretch, D R; Wilson, J J; Carithers, R L; dela Rosa, C; Han, J H; Corey, L

    1993-01-01

    We evaluated a new hepatitis C virus RNA assay based on one-stage PCR followed by liquid hybridization with an oligonucleotide probe and compared it with nested-set PCR. The one-stage and nested-set PCR assays had identical sensitivities in analytical experiments and showed 100% concordance when clinical specimens were used. One-stage PCR may be less prone to contamination than nested-set PCR.

  12. DMPD: Fifty years of interferon research: aiming at a moving target. [Dynamic Macrophage Pathway CSML Database

    Lifescience Database Archive (English)

    Full Text Available Immunity. 2006 Sep;25(3):343-8. PubMed ID 16979566. Title: Fifty years of interferon research: aiming at a moving target.

  13. Tag-KEM from Set Partial Domain One-Way Permutations

    Science.gov (United States)

    Abe, Masayuki; Cui, Yang; Imai, Hideki; Kurosawa, Kaoru

    Recently a framework called Tag-KEM/DEM was introduced to construct efficient hybrid encryption schemes. Although it is known that generic encode-then-encrypt construction of chosen ciphertext secure public-key encryption also applies to secure Tag-KEM construction and some known encoding method like OAEP can be used for this purpose, it is worth pursuing more efficient encoding method dedicated for Tag-KEM construction. This paper proposes an encoding method that yields efficient Tag-KEM schemes when combined with set partial one-way permutations such as RSA and Rabin's encryption scheme. To our knowledge, this leads to the most practical hybrid encryption scheme of this type. We also present an efficient Tag-KEM which is CCA-secure under general factoring assumption rather than Blum factoring assumption.

  14. Tracking brachytherapy sources using emission imaging with one flat panel detector

    International Nuclear Information System (INIS)

    Song Haijun; Bowsher, James; Das, Shiva; Yin Fangfang

    2009-01-01

    This work proposes to use the radiation from brachytherapy sources to track their dwell positions in three-dimensional (3D) space. The prototype device uses a single flat panel detector and a BB tray. The BBs are arranged in a defined pattern. The shadow of the BBs on the flat panel is analyzed to derive the 3D coordinates of the illumination source, i.e., the dwell position of the brachytherapy source. A kilovoltage x-ray source located 3.3 m away was used to align the center BB with the center pixel on the flat panel detector. For a test plan of 11 dwell positions, with an Ir-192 high dose rate unit, one projection was taken for each dwell point, and locations of the BB shadows were manually identified on the projection images. The 3D coordinates for the 11 dwell positions were reconstructed based on two BBs. The distances between dwell points were compared with the expected values. The average difference was 0.07 cm with a standard deviation of 0.15 cm. With automated BB shadow recognition in the future, this technique possesses the potential of tracking the 3D trajectory and the dwell times of a brachytherapy source in real time, enabling real time source position verification.
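
    The reconstruction idea can be sketched geometrically: each BB and its shadow on the flat panel define a line, and two such lines intersect at the source's dwell position. The coordinates, the BB tray height, and the detector plane placement below are hypothetical assumptions for illustration, not the prototype's geometry, and the actual system uses a full BB pattern with automated shadow detection rather than two hand-picked BBs.

    ```python
    def shadow(source, bb):
        """Project a BB onto the detector plane (z = 0) along the ray from the source."""
        sx, sy, sz = source
        bx, by, bz = bb
        t = sz / (sz - bz)  # ray parameter at which z reaches the detector plane
        return (sx + t * (bx - sx), sy + t * (by - sy), 0.0)

    def reconstruct(bb1, sh1, bb2, sh2):
        """Intersect the two lines shadow->BB; both pass through the source.

        Solves for the shared line parameter using x-coordinates, so it assumes
        the two lines are not parallel when projected onto the x-z plane."""
        u = (sh2[0] - sh1[0]) / ((bb1[0] - sh1[0]) - (bb2[0] - sh2[0]))
        return tuple(s + u * (b - s) for s, b in zip(sh1, bb1))

    source = (2.0, 1.0, 50.0)                      # "unknown" dwell position
    bb1, bb2 = (0.0, 0.0, 10.0), (4.0, 2.0, 10.0)  # two BBs in the tray plane
    rec = reconstruct(bb1, shadow(source, bb1), bb2, shadow(source, bb2))
    print(rec)  # recovers (2.0, 1.0, 50.0)
    ```

    With measurement noise in the detected shadow positions, a least-squares intersection over many BB lines would replace this exact two-line solution.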

  15. Community Energy Systems and the Law of Public Utilities. Volume Fifty. West Virginia

    Energy Technology Data Exchange (ETDEWEB)

    Feurer, D.A.; Weaver, C.L.

    1981-01-01

    A detailed description is presented of the laws and programs of the State of West Virginia governing the regulation of public energy utilities, the siting of energy generating and transmission facilities, the municipal franchising of public energy utilities, and the prescription of rates to be charged by utilities including attendant problems of cost allocations, rate base and operating expense determinations, and rate of return allowances. These laws and programs are analyzed to identify impediments which they may present to the implementation of Integrated Community Energy Systems (ICES). This report is one of fifty-one separate volumes which describe such regulatory programs at the Federal level and in each state as background to the report entitled Community Energy Systems and the Law of Public Utilities - Volume One: An Overview. This report also contains a summary of a strategy described in Volume One - An Overview for overcoming these impediments by working within the existing regulatory framework and by making changes in the regulatory programs to enhance the likelihood of ICES implementation.

  16. Community Energy Systems and the Law of Public Utilities. Volume Fifty-two. Wyoming

    Energy Technology Data Exchange (ETDEWEB)

    Feurer, D.A.; Weaver, C.L.

    1981-01-01

    A detailed description is presented of the laws and programs of the State of Wyoming governing the regulation of public energy utilities, the siting of energy generating and transmission facilities, the municipal franchising of public energy utilities, and the prescription of rates to be charged by utilities including attendant problems of cost allocations, rate base and operating expense determinations, and rate of return allowances. These laws and programs are analyzed to identify impediments which they may present to the implementation of Integrated Community Energy Systems (ICES). This report is one of fifty-one separate volumes which describe such regulatory programs at the Federal level and in each state as background to the report entitled Community Energy Systems and the Law of Public Utilities - Volume One: An Overview. This report also contains a summary of a strategy described in Volume One - An Overview for overcoming these impediments by working within the existing regulatory framework and by making changes in the regulatory programs to enhance the likelihood of ICES implementation.

  17. Inferring source attribution from a multiyear multisource data set of Salmonella in Minnesota.

    Science.gov (United States)

    Ahlstrom, C; Muellner, P; Spencer, S E F; Hong, S; Saupe, A; Rovira, A; Hedberg, C; Perez, A; Muellner, U; Alvarez, J

    2017-12-01

    Salmonella enterica is a global health concern because of its widespread association with foodborne illness. Bayesian models have been developed to attribute the burden of human salmonellosis to specific sources with the ultimate objective of prioritizing intervention strategies. Important considerations of source attribution models include the evaluation of the quality of input data, assessment of whether attribution results logically reflect the data trends and identification of patterns within the data that might explain the detailed contribution of different sources to the disease burden. Here, more than 12,000 non-typhoidal Salmonella isolates from human, bovine, porcine, chicken and turkey sources that originated in Minnesota were analysed. A modified Bayesian source attribution model (available in a dedicated R package), accounting for non-sampled sources of infection, attributed 4,672 human cases to sources assessed here. Most (60%) cases were attributed to chicken, although there was a spike in cases attributed to a non-sampled source in the second half of the study period. Molecular epidemiological analysis methods were used to supplement risk modelling, and a visual attribution application was developed to facilitate data exploration and comprehension of the large multiyear data set assessed here. A large amount of within-source diversity and low similarity between sources was observed, and visual exploration of data provided clues into variations driving the attribution modelling results. Results from this pillared approach provided first attribution estimates for Salmonella in Minnesota and offer an understanding of current data gaps as well as key pathogen population features, such as serotype frequency, similarity and diversity across the sources. Results here will be used to inform policy and management strategies ultimately intended to prevent and control Salmonella infection in the state. © 2017 Blackwell Verlag GmbH.
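
    The attribution idea can be reduced to a much-simplified proportional split: human cases of each serotype are divided across sources in proportion to how often that serotype occurs in each source. The study itself uses a modified Bayesian model that also accounts for non-sampled sources; the sources, serotypes, and counts below are hypothetical.

    ```python
    # serotype counts observed in each sampled source (hypothetical)
    source_counts = {
        "chicken": {"Enteritidis": 80, "Typhimurium": 20},
        "porcine": {"Enteritidis": 10, "Typhimurium": 60},
    }
    # human cases by serotype (hypothetical)
    human_cases = {"Enteritidis": 90, "Typhimurium": 40}

    attributed = {source: 0.0 for source in source_counts}
    for serotype, cases in human_cases.items():
        total = sum(counts.get(serotype, 0) for counts in source_counts.values())
        for source, counts in source_counts.items():
            # split this serotype's cases in proportion to its frequency per source
            attributed[source] += cases * counts.get(serotype, 0) / total
    print(attributed)  # {'chicken': 90.0, 'porcine': 40.0}
    ```

    A Bayesian treatment would additionally place priors on these proportions and propagate uncertainty, which is what allows the full model to flag cases better explained by a non-sampled source.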

  18. Basic limnology of fifty-one lakes in Costa Rica

    Directory of Open Access Journals (Sweden)

    Kurt A. Haberyan

    2003-03-01

    Full Text Available We visited 51 lakes in Costa Rica as part of a broad-based survey to document their physical and chemical characteristics and how these relate to the mode of formation and geographical distribution of the lakes. The four oxbow lakes were low in elevation and tended to be turbid, high in conductivity and CO2, but low in dissolved O2; one of these, L. Gandoca, had a hypolimnion essentially composed of sea water. These were similar to the four wetland lakes, but the latter instead had low conductivities and pH, and turbidity was often due to tannins rather than suspended sediments. The thirteen artificial lakes formed a very heterogeneous group, whose features varied depending on local factors. The thirteen lakes dammed by landslides, lava flows, or lahars occurred in areas with steep slopes, and were more likely to be stratified than most other types of lakes. The eight lakes that occupy volcanic craters tended to be deep, stratified, clear, and cool; two of these, L. Hule and L. Río Cuarto, appeared to be oligomictic (tending toward meromictic). The nine glacial lakes, all located above 3440 m elevation near Cerro Chirripó, were clear, cold, dilute, and are probably polymictic. Cluster analysis resulted in three significant groups of lakes. Cluster 1 included four calcium-rich lakes (average 48 mg/l), Cluster 2 included fourteen lakes with more Si than Ca2+ and higher Cl- than the other clusters, and Cluster 3 included the remaining thirty-three lakes that were generally less concentrated. Each cluster included lakes of various origins located in different geographical regions; these data indicate that, apart from the high-altitude glacial lakes and lakes in the Miravalles area, similarity in lake chemistry is independent of lake distribution.

  19. Laser applications for energy. Fifty years since advent of laser and next thirty years

    International Nuclear Information System (INIS)

    Nakai, Sadao

    2011-01-01

    The utilization of light has changed since the advent of the laser about fifty years ago. Now, in the twenty-first century, laser science is applied in every industry as a fundamental technology. In recent years, remarkable progress has been made in semiconductor lasers of high power and wide wavelength range. Amazing developments in ceramic laser materials like YAG and in nonlinear optical materials such as organic crystals have been achieved, along with big progress in fiber lasers. It is also to be pointed out that very high power, ultrashort laser pulses have become available. In the field of power photonics, which is based on power semiconductor lasers, fiber lasers and new laser materials, various industrial applications are expected to develop further in civil engineering, manufacturing technology, agricultural and biological applications, medical utilization and space sciences. With the development of ultrashort-pulse and ultrahigh-mean-power lasers, particle acceleration, ultrahigh-density science, nuclear fusion neutron sources and laser fusion power reactors are expected to advance drastically. Recent developments and future prospects of high power lasers are illustrated. Lasers are now regarded as one of the key technologies in line with the national policy toward the creation of innovative industries. Realization of the laser fusion reactor is the most challenging target for the coming thirty years. (S. Funahashi)

  20. InP MMIC Chip Set for Power Sources Covering 80-170 GHz

    Science.gov (United States)

    Ngo, Catherine

    2001-01-01

We will present a Monolithic Millimeter-wave Integrated Circuit (MMIC) chip set which provides high-output-power sources for driving diode frequency multipliers into the terahertz range. The chip set was fabricated at HRL Laboratories using a 0.1-micrometer gate-length InAlAs/InGaAs/InP high electron mobility transistor (HEMT) process, and features transistors with an f_max above 600 GHz. The HRL InP HEMT process has already demonstrated amplifiers in the 60-200 GHz range. In this paper, these high-frequency HEMTs form the basis for power sources up to 170 GHz. A number of state-of-the-art InP HEMT MMICs will be presented, including voltage-controlled and fixed-tuned oscillators, power amplifiers, and an active doubler. We will first discuss an 80 GHz voltage-controlled oscillator with 5 GHz of tunability and at least 17 mW of output power, as well as a 120 GHz oscillator providing 7 mW of output power. In addition, we will present results for a power amplifier which covers the full WR10 waveguide band (75-110 GHz) and provides 40-50 mW of output power. Furthermore, we will present an active doubler at 164 GHz providing 8% bandwidth, 3 mW of output power, and an unprecedented 2 dB of conversion loss for an InP HEMT MMIC at this frequency. Finally, we will demonstrate a power amplifier covering 140-170 GHz with 15-25 mW of output power and 8 dB of gain. These components can form a power source in the 155-165 GHz range by cascading the 80 GHz oscillator, W-band power amplifier, 164 GHz active doubler and final 140-170 GHz power amplifier into a stable, compact local oscillator subsystem, which could be used for atmospheric science or astrophysics radiometers.

  1. An open-source wireless sensor stack: from Arduino to SDI-12 to Water One Flow

    Science.gov (United States)

    Hicks, S.; Damiano, S. G.; Smith, K. M.; Olexy, J.; Horsburgh, J. S.; Mayorga, E.; Aufdenkampe, A. K.

    2013-12-01

Implementing a large-scale streaming environmental sensor network has previously been limited by the high cost of the datalogging and data communication infrastructure. The Christina River Basin Critical Zone Observatory (CRB-CZO) is overcoming the obstacles to large near-real-time data collection networks by using Arduino, an open-source electronics platform, in combination with XBee ZigBee wireless radio modules. These extremely low-cost and easy-to-use open-source electronics are at the heart of the new DIY movement and have provided solutions to countless projects by over half a million users worldwide. However, their use in environmental sensing is in its infancy. At present a primary limitation to widespread deployment of open-source electronics for environmental sensing is the lack of a simple, open-source software stack to manage streaming data from heterogeneous sensor networks. Here we present a functioning prototype software stack that receives sensor data over a self-meshing ZigBee wireless network from over a hundred sensors, stores the data locally and serves it on demand as a CUAHSI Water One Flow (WOF) web service. We highlight a few new, innovative components, including: (1) a versatile open data logger design based on the Arduino electronics platform and ZigBee radios; (2) a software library implementing the SDI-12 communication protocol between any Arduino platform and SDI-12-enabled sensors without the need for additional hardware (https://github.com/StroudCenter/Arduino-SDI-12); and (3) 'midStream', a lightweight set of Python code that receives streaming sensor data, appends it with metadata on the fly by querying a relational database structured on an early version of the Observations Data Model version 2.0 (ODM2), and uses the WOFpy library to serve the data as WaterML via SOAP and REST web services.
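To illustrate the wire format the stack speaks at its lowest layer: an SDI-12 "send data" response begins with the one-character sensor address, followed by signed values concatenated without separators. A minimal parser for that format might look like the following sketch (the function name and example responses are illustrative, not taken from the Arduino-SDI-12 library):

```python
import re

def parse_sdi12_data(response: str):
    """Parse an SDI-12 'send data' response such as '0+3.14-25.0+1.2'.

    The first character is the sensor address; the remainder is a run of
    signed numeric values concatenated back to back.
    """
    address, body = response[0], response[1:].strip()
    values = [float(v) for v in re.findall(r"[+-]\d+(?:\.\d+)?", body)]
    return address, values
```

A logger like the one described would apply such a parser to each polled sensor before handing the values to the streaming layer.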

  2. Distributed Large Independent Sets in One Round On Bounded-independence Graphs

    OpenAIRE

    Halldorsson , Magnus M.; Konrad , Christian

    2015-01-01

We present a randomized one-round, single-bit-message, distributed algorithm for the maximum independent set problem in polynomially bounded-independence graphs with a poly-logarithmic approximation factor. Bounded-independence graphs capture various models of wireless networks such as the unit disc graph model and the quasi unit disc graph model. For instance, on unit disc graphs, our achieved approximation ratio is O((log(n)/log(log(n)))^2). A starting point of our w...
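The flavor of one-round randomized independent-set computation can be conveyed with a classic Luby-style step: every node draws a random priority and joins the set iff it beats all of its neighbours. This is a simplified stand-in for illustration, not the single-bit-message protocol of the paper:

```python
import random

def one_round_independent_set(adj, seed=42):
    """One synchronous round: each node draws a random priority and joins
    the set iff its priority beats every neighbour's.

    The winners always form an independent set (two neighbours cannot both
    beat each other), and the globally largest priority always wins, so the
    result is non-empty on any non-empty graph.
    """
    rng = random.Random(seed)
    priority = {v: rng.random() for v in adj}
    return {v for v in adj if all(priority[v] > priority[u] for u in adj[v])}
```

On a path graph {0-1, 1-2, 2-3}, for example, the returned set contains no two adjacent nodes regardless of the seed.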

  3. Thorium molecular negative ion production in a cesium sputter source at BARC-TIFR pelletron accelerator ion source test set up

    International Nuclear Information System (INIS)

    Gupta, A.K.; Mehrotra, N.; Kale, R.M.; Alamelu, D.; Aggarwal, S.K.

    2005-01-01

The ion source test set-up at the Pelletron Accelerator facility has been used extensively for the production and characterization of negative ions, with particular emphasis placed on species of interest to experimental users. Attention has been focused on the formation of rare-earth negative ions, due to their importance in the ongoing accelerator mass spectrometry program and in isotopic abundance measurements using secondary negative ion mass spectrometry.

  4. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore important to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms combines low-level image processing and geospatial information handling tools with high-level workflows. In particular, two main products are released under the GPL license: the source code, aimed at developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance are guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: In the absence of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models such as the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References: [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30 January 2014

  5. A seasonal variation of the three Leading diagnoses over fifty ...

    African Journals Online (AJOL)

A seasonal variation of the three leading diagnoses over fifty months at the Duk Lost Boys Clinic, South Sudan. Reed, William; Dannan, Tom; Friedman, Daniel, MD; Manyok, Gabriel; Connor, Barbara, MD; Reed, David, MD. Introduction: The Duk Lost Boys Clinic, a Primary Health Care Clinic in Duk Payuel, is the only ...

  6. Deutsches Atomforum turns fifty; 50 Jahre Deutsches Atomforum

    Energy Technology Data Exchange (ETDEWEB)

    Geisler, Maja [Deutsches Atomforum e.V., Berlin (Germany). Bereich Oeffentlichkeitsarbeit, Informationskreis KernEnergie

    2009-07-15

Fifty years ago, the Deutsches Atomforum e. V. was founded to promote the peaceful uses of nuclear power in Germany. On July 1, 2009, the organization celebrated its fiftieth anniversary in Berlin, in the Berlin electricity plant, Germany's oldest surviving building for commercial electricity generation. DAtF President Dr. Walter Hohlefelder welcomed some 200 high-ranking guests from politics, industry, and the nuclear community, above all the Chancellor of the Federal Republic of Germany, Dr. Angela Merkel, and, as keynote speaker, Professor Dr. Arnulf Baring. (orig.)

  7. Enlarge the training set based on inter-class relationship for face recognition from one image per person.

    Science.gov (United States)

    Li, Qin; Wang, Hua Jing; You, Jane; Li, Zhao Ming; Li, Jin Xue

    2013-01-01

In some large-scale face recognition tasks, such as driver license identification and law enforcement, the training set contains only one image per person. This situation is referred to as the one sample problem. Because many face recognition techniques implicitly assume that several (at least two) images per person are available for training, they cannot deal with the one sample problem. This paper investigates principal component analysis (PCA), Fisher linear discriminant analysis (LDA), and locality preserving projections (LPP) and shows why they cannot perform well on the one sample problem. After that, this paper presents four reasons that make the one sample problem itself difficult: the small sample size problem; the lack of representative samples; the underestimated intra-class variation; and the overestimated inter-class variation. Based on this analysis, this paper proposes to enlarge the training set based on the inter-class relationship. This paper also extends LDA and LPP to extract features from the enlarged training set. The experimental results show the effectiveness of the proposed method.
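The enlargement idea can be caricatured in a few lines: with one image per person, borrow variation from other identities by adding scaled difference vectors between pairs of other people's samples to each gallery image. This is an illustrative stand-in for the paper's inter-class-relationship construction, with a made-up scale factor, not the authors' exact scheme:

```python
def enlarge_training_set(gallery):
    """Enlarge a one-sample-per-person gallery: to each person's single image
    vector, add scaled difference vectors formed from pairs of OTHER
    identities (borrowed inter-class variation). Sketch for illustration.
    """
    people = list(gallery)
    enlarged = {p: [v[:]] for p, v in gallery.items()}  # keep the original
    for p, v in gallery.items():
        for a in people:
            for b in people:
                if a != b and p not in (a, b):
                    diff = [x - y for x, y in zip(gallery[a], gallery[b])]
                    enlarged[p].append(
                        [vi + 0.5 * di for vi, di in zip(v, diff)]
                    )
    return enlarged
```

With three people, each identity ends up with its original vector plus two synthesized samples, giving a classifier such as LDA more than one sample per class to estimate intra-class scatter from.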

  8. Anthropogenic Sulfur Dioxide Emissions, 1850-2005: National and Regional Data Set by Source Category, Version 2.86

    Data.gov (United States)

    National Aeronautics and Space Administration — The Anthropogenic Sulfur Dioxide Emissions, 1850-2005: National and Regional Data Set by Source Category, Version 2.86 provides annual estimates of anthropogenic...

  9. UHE point source survey at Cygnus experiment

    International Nuclear Information System (INIS)

    Lu, X.; Yodh, G.B.; Alexandreas, D.E.; Allen, R.C.; Berley, D.; Biller, S.D.; Burman, R.L.; Cady, R.; Chang, C.Y.; Dingus, B.L.; Dion, G.M.; Ellsworth, R.W.; Gilra, M.K.; Goodman, J.A.; Haines, T.J.; Hoffman, C.M.; Kwok, P.; Lloyd-Evans, J.; Nagle, D.E.; Potter, M.E.; Sandberg, V.D.; Stark, M.J.; Talaga, R.L.; Vishwanath, P.R.; Zhang, W.

    1991-01-01

A new method of searching for UHE point sources has been developed. With a data sample of 150 million events, we have surveyed the sky for point sources over 3314 locations (1.4 degree <δ<70.4 degree). It was found that the distribution of excesses is consistent with random fluctuation. In addition, fifty-two known potential sources, including pulsars and binary x-ray sources, were studied. The source with the largest positive excess is the Crab Nebula. An excess of 2.5 sigma above the background is observed in a bin of 2.3 degree by 2.5 degree in declination and right ascension, respectively.
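The quoted 2.5 sigma can be illustrated with the simplest Gaussian excess estimate, significance ≈ (N_on − N_bg)/√N_bg, valid for large counts. The abstract does not describe the experiment's actual background estimation, and the counts below are invented solely to show the arithmetic:

```python
from math import sqrt

def excess_significance(n_on, n_bg):
    """Naive Gaussian significance of an on-source count n_on over an
    expected background n_bg (large-count approximation only)."""
    return (n_on - n_bg) / sqrt(n_bg)

# Hypothetical bin counts chosen to reproduce a 2.5 sigma excess:
sigma = excess_significance(10250, 10000)  # -> 2.5
```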

  10. Assessing data quality and the variability of source data verification auditing methods in clinical research settings.

    Science.gov (United States)

    Houston, Lauren; Probst, Yasmine; Martin, Allison

    2018-05-18

Data audits within clinical settings are used extensively as a major strategy to identify errors, monitor study operations and ensure high-quality data. However, clinical trial guidelines are non-specific with regard to the recommended frequency, timing and nature of data audits. The absence of a well-defined data quality definition and of a method to measure error undermines the reliability of data quality assessment. This review aimed to assess the variability of source data verification (SDV) auditing methods used to monitor data quality in a clinical research setting. The scientific databases MEDLINE, Scopus and Science Direct were searched for English-language publications, with no date limits applied. Studies were considered if they included data from a clinical trial or clinical research setting and measured and/or reported data quality using an SDV auditing method. In total, 15 publications were included. The nature and extent of the SDV audit methods in the articles varied widely, depending upon the complexity of the source document, type of study, variables measured (primary or secondary), data audit proportion (3-100%) and collection frequency (6-24 months). Methods for coding, classifying and calculating error were also inconsistent. Transcription errors and inexperienced personnel were the main sources of reported error. Repeated SDV audits using the same dataset demonstrated a ∼40% improvement in data accuracy and completeness over time. No description was given of what determines poor data quality in clinical trials. A wide range of SDV auditing methods is reported in the published literature, though no uniform SDV auditing method could be identified as "best practice" for clinical trials. Published audit methodology articles are warranted for the development of a standardised SDV auditing method to monitor data quality in clinical research settings. Copyright © 2018. Published by Elsevier Inc.

  11. Representasi Perempuan Dalam Film (Analisis Semiotika Representasi Perempuan Dalam Film “Fifty Shades of Grey”)

    OpenAIRE

    Aviomeita, Friska

    2016-01-01

This study, entitled "Representation of Women in Film (A Roland Barthes Semiotic Analysis of the Film Fifty Shades of Grey)", aims to find out how women are represented in the film "Fifty Shades of Grey" through denotation, connotation and myth. Film has always influenced and shaped the public through the messages behind its content; the messages or values contained in a film may affect the audience. In this study, the researchers used several theorie...

  12. Enlarge the training set based on inter-class relationship for face recognition from one image per person.

    Directory of Open Access Journals (Sweden)

    Qin Li

In some large-scale face recognition tasks, such as driver license identification and law enforcement, the training set contains only one image per person. This situation is referred to as the one sample problem. Because many face recognition techniques implicitly assume that several (at least two) images per person are available for training, they cannot deal with the one sample problem. This paper investigates principal component analysis (PCA), Fisher linear discriminant analysis (LDA), and locality preserving projections (LPP) and shows why they cannot perform well on the one sample problem. After that, this paper presents four reasons that make the one sample problem itself difficult: the small sample size problem; the lack of representative samples; the underestimated intra-class variation; and the overestimated inter-class variation. Based on this analysis, this paper proposes to enlarge the training set based on the inter-class relationship. This paper also extends LDA and LPP to extract features from the enlarged training set. The experimental results show the effectiveness of the proposed method.

  13. Pattern-set generation algorithm for the one-dimensional multiple stock sizes cutting stock problem

    Science.gov (United States)

    Cui, Yaodong; Cui, Yi-Ping; Zhao, Zhigang

    2015-09-01

    A pattern-set generation algorithm (PSG) for the one-dimensional multiple stock sizes cutting stock problem (1DMSSCSP) is presented. The solution process contains two stages. In the first stage, the PSG solves the residual problems repeatedly to generate the patterns in the pattern set, where each residual problem is solved by the column-generation approach, and each pattern is generated by solving a single large object placement problem. In the second stage, the integer linear programming model of the 1DMSSCSP is solved using a commercial solver, where only the patterns in the pattern set are considered. The computational results of benchmark instances indicate that the PSG outperforms existing heuristic algorithms and rivals the exact algorithm in solution quality.
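The "single large object placement problem" solved in the first stage asks which combinations of piece counts fit on one stock length. A brute-force enumeration of the maximal patterns (those to which no further piece can be added) conveys the idea on toy sizes, though the PSG generates patterns one at a time via column generation rather than enumerating them all:

```python
def maximal_patterns(stock, lengths):
    """Enumerate maximal cutting patterns for a single stock length:
    tuples of piece counts that fit within `stock` and to which no further
    piece of any length can be added. Brute-force illustration only."""
    patterns = set()

    def extend(idx, counts, used):
        # Add pieces in non-decreasing index order to enumerate each
        # multiset of pieces exactly once.
        for i in range(idx, len(lengths)):
            if used + lengths[i] <= stock:
                counts[i] += 1
                extend(i, counts, used + lengths[i])
                counts[i] -= 1
        # Record only patterns where no piece of ANY length still fits.
        if any(counts) and all(used + piece > stock for piece in lengths):
            patterns.add(tuple(counts))

    extend(0, [0] * len(lengths), 0)
    return patterns
```

For a stock length of 10 and piece lengths 3 and 4, the maximal patterns are three 3s, two 3s plus one 4, and two 4s; the 1DMSSCSP then chooses how many times to apply each pattern on each stock size.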

  14. Function of One Regular Separable Relation Set Decided for the Minimal Covering in Multiple Valued Logic

    Directory of Open Access Journals (Sweden)

    Liu Yu Zhen

    2016-01-01

Multiple-valued logic is an important branch of computer science and technology. It studies the theory of multiple-valued functions, multiple-valued circuits and systems, and their applications. In the theory of multiple-valued logic, one primary and important problem is the completeness of function sets, which can be solved by deciding all the precomplete sets (also called maximal closed sets) of the K-valued function sets, denoted PK*; another is the decision problem for Sheffer functions, which can be solved completely by picking out all of the minimal coverings of the precomplete sets. In the function structure theory of multiple-valued logic, the decision of Sheffer functions plays an important role; it covers the structure and decision of both full and partial multiple-valued logic, and is closely related to the decision of completeness of function sets, which can be done by deciding the minimal coverings of full and partial multiple-valued logic. Using the completeness theory of partial multiple-valued logic, we prove that the function set of one regular separable relation is not a minimal covering of PK* under the condition m = 2, σ = e.

  15. Olfactory source localization in the open field using one or both nostrils.

    Science.gov (United States)

    Welge-Lussen, A; Looser, G L; Westermann, B; Hummel, T

    2014-03-01

This study examines humans' ability to localize odorants in the open field. Young participants were tested on a localization task using a relatively selective olfactory stimulus (2-phenylethyl alcohol, PEA) and cineol, an odorant with a strong trigeminal component. Participants were blindfolded and had to localize an odorant source at a 2 m distance (far-field condition) and a 0.4 m distance (near-field condition) with either both nostrils open or only one nostril open. For the odorant with trigeminal properties, the number of correct trials did not differ between one and both nostrils, while more PEA localization trials were completed correctly with both nostrils than with one. In the near-field condition, correct localization was possible in 72-80% of the trials, irrespective of the odorant and the number of nostrils used. Localization accuracy, measured as spatial deviation from the olfactory source, was significantly higher in the near-field than in the far-field condition, but independent of the odorant being localized. Odorant localization in the open field is difficult, but possible. In contrast to the general view, humans seem able to exploit the two-nostril advantage with increasing task difficulty.

  16. Developing open source, self-contained disease surveillance software applications for use in resource-limited settings

    Directory of Open Access Journals (Sweden)

    Campbell Timothy C

    2012-09-01

Background: Emerging public health threats often originate in resource-limited countries. In recognition of this fact, the World Health Organization issued revised International Health Regulations in 2005, which call for significantly increased reporting and response capabilities for all signatory nations. Electronic biosurveillance systems can improve the timeliness of public health data collection, aid in the early detection of and response to disease outbreaks, and enhance situational awareness. Methods: As components of its Suite for Automated Global bioSurveillance (SAGES) program, The Johns Hopkins University Applied Physics Laboratory developed two open-source, electronic biosurveillance systems for use in resource-limited settings. OpenESSENCE provides web-based data entry, analysis, and reporting. ESSENCE Desktop Edition provides similar capabilities for settings without internet access. Both systems may be configured to collect data using locally available cell phone technologies. Results: ESSENCE Desktop Edition has been deployed for two years in the Republic of the Philippines. Local health clinics have rapidly adopted the new technology to provide daily reporting, thus eliminating the two-to-three-week data lag of the previous paper-based system. Conclusions: OpenESSENCE and ESSENCE Desktop Edition are two open-source software products capable of significantly improving disease surveillance in a wide range of resource-limited settings. These products, and other emerging surveillance technologies, can assist resource-limited countries in complying with the revised International Health Regulations.

  17. Atmospheric deposition having been one of the major source of Pb in Jiaozhou Bay

    Science.gov (United States)

    Yang, Dongfang; Miao, Zhenqing; Zhang, Xiaolong; Wang, Qi; Li, Haixia

    2018-03-01

Many marine bays have been polluted by Pb due to the rapid development of industry, and identifying the major sources of Pb is essential to pollution control. This paper analyzed the distribution and pollution sources of Pb in Jiaozhou Bay in 1988. Results showed that Pb contents in surface waters in Jiaozhou Bay in April, July and October 1988 were 5.52-24.61 μg L⁻¹, 7.66-38.62 μg L⁻¹ and 6.89-19.30 μg L⁻¹, respectively. The major Pb sources in this bay were atmospheric deposition and marine currents, whose source strengths were 19.30-24.61 μg L⁻¹ and 38.62 μg L⁻¹, respectively. Atmospheric deposition had been one of the major Pb sources in Jiaozhou Bay, and its source strength was stable and strong. The pollution level of Pb in this bay in 1988 was moderate to heavy, and source control measures were necessary.

  18. Energy well. Ground-source heat in one-family houses; Energiakaivo. Maalaemmoen hyoedyntaeminen pientaloissa

    Energy Technology Data Exchange (ETDEWEB)

    Juvonen, J.; Lapinlampi, T.

    2013-08-15

This guide deals with the legislation, planning, building, usage and maintenance of ground-source heat systems. It gives national-level recommendations and instructions on permit practices and on how to carry out a complete ground-source heat system project. The main focus of the guide is on energy wells for one-family houses. The principle is that an action permit is needed to build a ground-source heat system. In groundwater areas a permit under the Water Act may also be required. To avoid problems, the placement of the system needs to be planned carefully. This guide gives the customer an overview of the issues that need to be considered before ordering, during construction, while the system is running, and when decommissioning the ground-source heat system. (orig.)

  19. Efficient One-click Browsing of Large Trajectory Sets

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2014-01-01

This paper presents a novel query type called a sheaf, where users can browse trajectory data sets using a single mouse click. Sheaves are very versatile and can be used for location-based advertising, travel-time analysis, intersection analysis, and reachability analysis (isochrones). A novel in-memory trajectory index compresses the data by a factor of 12.4 and enables execution of sheaf queries in 40 ms. This is up to 2 orders of magnitude faster than existing work. We demonstrate the simplicity, versatility, and efficiency of sheaf queries using a real-world trajectory set consisting of 2.7 million...

  20. Fifty Shades a way readers put sex back in their lives

    DEFF Research Database (Denmark)

    Knudsen, Gry Høngsmark

    2014-01-01

    This is a guest blog from Gry Høngsmark Knudsen, a former student, who visited my “Women’s Economy” class this winter to report on her project about the reception of Fifty Shades of Grey. We all enjoyed her report, so I asked her to write a short blog for Double X Readers....

  1. Time-Dependent Selection of an Optimal Set of Sources to Define a Stable Celestial Reference Frame

    Science.gov (United States)

    Le Bail, Karine; Gordon, David

    2010-01-01

Temporal statistical position stability is required for VLBI sources to define a stable Celestial Reference Frame (CRF) and has been studied in many recent papers. This study analyzes the sources from the latest realization of the International Celestial Reference Frame (ICRF2) with the Allan variance, in addition to taking into account the apparent linear motions of the sources. Focusing on the 295 defining sources shows how they are a good compromise between different criteria, such as statistical stability and sky distribution, as well as having a sufficient number of sources, despite the fact that the most stable sources of the entire ICRF2 are mostly in the Northern Hemisphere. Nevertheless, the selection of a stable set is not unique: studying different solutions (GSF005a and AUG24 from GSFC and OPA from the Paris Observatory) over different time periods (1989.5 to 2009.5 and 1999.5 to 2009.5) leads to selections that can differ in up to 20% of the sources. Improvements in observing, recording, and networks are among the causes, with the CRF showing better stability over the last decade than over the last twenty years. But this may also be explained by the assumption of stationarity, which is not necessarily right for some sources.
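For a position time series, the Allan variance at a given averaging interval is half the mean squared difference of successive bin averages. A toy implementation is sketched below; real ICRF analyses work on session-wise coordinate series with uneven sampling, which this deliberately ignores:

```python
def allan_variance(y, m=1):
    """Non-overlapping Allan variance of series y at averaging factor m:
    AVAR = <(ybar_{k+1} - ybar_k)^2> / 2 over consecutive bin means.
    Assumes evenly spaced samples; sketch for illustration only."""
    # Average consecutive groups of m samples (drop any trailing remainder).
    bins = [sum(y[i:i + m]) / m for i in range(0, len(y) - len(y) % m, m)]
    diffs = [(b - a) ** 2 for a, b in zip(bins, bins[1:])]
    return sum(diffs) / (2 * len(diffs))
```

A perfectly stable source (constant coordinate) gives zero, while a source flipping between positions keeps a large Allan variance at short averaging times, which is why the statistic separates stable from unstable sources.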

  2. Archival Theory and the Shaping of Educational History: Utilizing New Sources and Reinterpreting Traditional Ones

    Science.gov (United States)

    Glotzer, Richard

    2013-01-01

    Information technology has spawned new evidentiary sources, better retrieval systems for existing ones, and new tools for interpreting traditional source materials. These advances have contributed to a broadening of public participation in civil society (Blouin and Rosenberg 2006). In these culturally unsettled and economically fragile times…

  3. The First Fifty ABO Blood Group Incompatible Kidney Transplantations: The Rotterdam Experience

    Directory of Open Access Journals (Sweden)

    Madelon van Agteren

    2014-01-01

This study describes the single-center experience and long-term results of ABOi kidney transplantation using a pretransplantation protocol involving immunoadsorption combined with rituximab, intravenous immunoglobulins, and triple immune suppression. Fifty patients received an ABOi kidney transplant in the period from 2006 to 2012, with a follow-up of at least one year. Eleven antibody-mediated rejections were noted, of which 5 were mixed antibody- and cellular-mediated rejections. Nine cellular-mediated rejections were recorded. Two grafts were lost due to rejection in the first year. One-year graft survival of the ABOi grafts was comparable to that of 100 matched ABO-compatible renal grafts: 96% versus 99%. At 5-year follow-up, graft survival was 90% in the ABOi group versus 97% in the control group. Posttransplantation immunoadsorption was not an essential part of the protocol, and no association was found between antibody titers and subsequent graft rejection. Steroids could be withdrawn safely 3 months after transplantation. Adverse events specifically related to the ABOi protocol were not observed. The currently used ABOi protocol shows good short- and mid-term results despite a high rate of antibody-mediated rejections in the first years after the start of the program.

  4. What history tells us XIII. Fifty years of the Central Dogma

    Indian Academy of Sciences (India)

    2008-05-14

Michel Morange. Journal of Biosciences, Volume 33, Issue 2, June 2008, pp. 171-175. Permanent link: https://www.ias.ac.in/article/fulltext/jbsc/033/02/0171-0175

  5. Unbounded dynamics and compact invariant sets of one Hamiltonian system defined by the minimally coupled field

    Energy Technology Data Exchange (ETDEWEB)

    Starkov, Konstantin E., E-mail: kstarkov@ipn.mx

    2015-06-12

In this paper we study some features of the global dynamics of one Hamiltonian system arising in cosmology, formed by the minimally coupled field; this system was introduced by Maciejewski et al. in 2007. We establish that under some simple conditions imposed on the parameters of this system, all trajectories are unbounded in both time directions. Further, we present other conditions on the system parameters under which we localize the domain with unbounded dynamics; this domain is defined with the help of bounds for values of the Hamiltonian level surface parameter. We describe the case when our system possesses periodic orbits, which are found explicitly. In the remaining cases we obtain localization bounds for compact invariant sets. - Highlights: • The domain with unbounded dynamics is localized. • Equations for periodic orbits are given in one level set. • Localizations for compact invariant sets are obtained.

  6. Hubble Source Catalog

    Science.gov (United States)

    Lubow, S.; Budavári, T.

    2013-10-01

    We have created an initial catalog of objects observed by the WFPC2 and ACS instruments on the Hubble Space Telescope (HST). The catalog is based on observations taken on more than 6000 visits (telescope pointings) of ACS/WFC and more than 25000 visits of WFPC2. The catalog is obtained by cross matching by position in the sky all Hubble Legacy Archive (HLA) Source Extractor source lists for these instruments. The source lists describe properties of source detections within a visit. The calculations are performed on a SQL Server database system. First we collect overlapping images into groups, e.g., Eta Car, and determine nearby (approximately matching) pairs of sources from different images within each group. We then apply a novel algorithm for improving the cross matching of pairs of sources by adjusting the astrometry of the images. Next, we combine pairwise matches into maximal sets of possible multi-source matches. We apply a greedy Bayesian method to split the maximal matches into more reliable matches. We test the accuracy of the matches by comparing the fluxes of the matched sources. The result is a set of information that ties together multiple observations of the same object. A byproduct of the catalog is greatly improved relative astrometry for many of the HST images. We also provide information on nondetections that can be used to determine dropouts. With the catalog, for the first time, one can carry out time domain, multi-wavelength studies across a large set of HST data. The catalog is publicly available. Much more can be done to expand the catalog capabilities.
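The pairwise-matching step can be sketched as a greedy nearest-neighbour match on a flat-sky approximation. This is an illustration only: the catalog's actual pipeline adjusts the astrometry of whole images and applies a Bayesian split to maximal match sets, and a proper cross-match must handle spherical geometry and the cos(dec) scaling of right ascension, all of which the sketch ignores:

```python
from math import hypot

def crossmatch(list_a, list_b, tol):
    """Greedily pair each (ra, dec) source in list_a with the nearest
    still-unmatched source in list_b, accepting pairs within `tol`
    (flat-sky Euclidean distance, same angular units throughout)."""
    matches, used = [], set()
    for i, (ra1, dec1) in enumerate(list_a):
        best = min(
            ((hypot(ra1 - ra2, dec1 - dec2), j)
             for j, (ra2, dec2) in enumerate(list_b) if j not in used),
            default=(None, None),
        )
        if best[0] is not None and best[0] <= tol:
            matches.append((i, best[1]))
            used.add(best[1])
    return matches
```

Chaining such pairwise matches across many overlapping source lists, then splitting inconsistent chains, is what ties multiple observations of the same object together in the catalog.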

  7. X-Ray Scattering Applications Using Pulsed X-Ray Sources

    Energy Technology Data Exchange (ETDEWEB)

    Larson, B.C.

    1999-05-23

Pulsed x-ray sources have been used in investigations of transient structural phenomena for over fifty years; however, until the advent of synchrotron sources and the development of table-top picosecond lasers, general access to high temporal resolution x-ray diffraction was relatively limited. Advances in diffraction techniques, sample excitation schemes, and detector systems, in addition to increased access to pulsed sources, have led to what is now a diverse and growing array of pulsed-source measurement applications. A survey of time-resolved investigations using pulsed x-ray sources is presented, and research opportunities using both present and planned pulsed x-ray sources are discussed.

  8. Dual energy CTA of the supraaortic arteries: Technical improvements with a novel dual source CT system

    International Nuclear Information System (INIS)

    Lell, Michael M.; Hinkmann, Fabian; Nkenke, Emeka; Schmidt, Bernhard; Seidensticker, Peter; Kalender, Willi A.; Uder, Michael; Achenbach, Stephan

    2010-01-01

Objectives: Computed tomography angiography (CTA) is a well-accepted imaging modality for evaluating the supraaortic vessels. Initial reports have suggested that dual energy CTA (DE-CTA) can enhance diagnosis by creating bone-free data sets, which can be visualized in 3D, but a number of limitations of this technique have also been noted. We sought to describe the performance of DE-CTA of the supraaortic vessels with a novel dual source CT system, with special emphasis on image quality and post-processing related artifacts. Materials and methods: Thirty-three patients underwent carotid CT angiography on a second generation dual source CT system. Simultaneous acquisitions of 100 and 140 kV data sets in the arterial phase were performed. Two examiners evaluated overall bone suppression with a 3-point scale (1 = poor; 3 = excellent) and image quality regarding integrity of the vessel lumen of different vessel segments (n = 26) with a 5-point scale (1 = poor; 5 = excellent); CTA source data served as the reference. Results: Excellent bone suppression could be achieved in the head and neck. Only minor bone remnants occurred; the mean score for bone removal was 2.9. The mean score for vessel integrity was 4.3. Eight hundred fifty-seven vessel segments could be evaluated. Six hundred thirty-five segments (74%) showed no lumen alteration, 65 segments (7.6%) lumen alterations 10% resulting in a total luminal reduction 50%, and 113 segments (13.2%) showed a gap in the vessel course (100% total lumen reduction). Artificial gaps of the vessel lumen occurred in 28 vessel segments due to artifacts caused by dental hardware and in all but one (65) of the ophthalmic arteries. Conclusions: Excellent bone suppression could be achieved; DE imaging with 100 and 140 kV led to improved image quality and vessel integrity in the shoulder region compared with previous reports. The ophthalmic artery still cannot be adequately visualized.

  9. Fifty years experiences in nuclear engineering education at Tokyo Institute of Technology

    International Nuclear Information System (INIS)

    Fujii, Yasuhiko; Saito, Masaki; Aritomi, Masanori

    2008-01-01

Nuclear engineering education was initiated in 1957 at the graduate school of Tokyo Institute of Technology. Educational activities have been conducted for fifty years with the support of the Research Laboratory for Nuclear Reactors. In the past fifty years, about 1000 Master students and 200 Doctoral students graduated from our Nuclear Engineering Department at Tokyo Institute of Technology. Many of them found jobs in nuclear industries and institutes. An international course in nuclear engineering was initiated in 1994, and so far about 90 students from 15 overseas countries have graduated from our Master and Doctoral Programs. In 2003, our proposal of 'Innovative Nuclear Energy System for the Sustainable World' was adopted as a Center of Excellence Program sponsored by the Ministry of Education, Science and Technology. Recently, a collaborative education network has been developed among Kanazawa University, Fukui University, Ibaraki University, Okayama University, Tokyo Institute of Technology and the Japan Atomic Energy Agency. (author)

  10. The richness of discovery : Amoco's first fifty years in Canada (1948-1998)

    International Nuclear Information System (INIS)

    McKenzie-Brown, P.

    1998-01-01

    A review of Amoco's first fifty years of operations in Canada, including investments, discoveries, and policies was presented. While no claim is made for this attractively produced slim volume to be a definitive history, it does manage to shed light on some of the great achievements and outstanding deeds of the people behind the company. The book provides a glimpse into how Amoco contributed to the growth of the Canadian petroleum industry in diverse areas including the manufacturing sector, the petroleum service sector, oil field technology, oil field infrastructure and petrochemical development. The company enjoyed spectacular success during the 1950s and 1960s. As evidence of that success, in 1997 Amoco Canada was the largest Canadian producer and exporter of natural gas and NGLs, the largest cold producer of heavy oil, the second largest in situ producer of heavy oil, and one of the 10 largest producers of conventional oil. refs., tabs., figs

  11. Constraints on equivalent elastic source models from near-source data

    International Nuclear Information System (INIS)

    Stump, B.

    1993-01-01

A phenomenologically based seismic source model is important in quantifying the physical processes that affect the observed seismic radiation in the linear-elastic regime. Representations such as these were used to assess yield effects on seismic waves under a Threshold Test Ban Treaty and to help transport seismic coupling experience from one test site to another. These same characterizations in a non-proliferation environment find applications in understanding the generation of the different types of body and surface waves from nuclear explosions, single chemical explosions, arrays of chemical explosions used in mining, rock bursts and earthquakes. Seismologists typically begin with an equivalent elastic representation of the source which, when convolved with the propagation path effects, produces a seismogram. The Representation Theorem replaces the true source with an equivalent set of body forces, boundary conditions or initial conditions. An extension of this representation shows the equivalence of the body forces, boundary conditions and initial conditions and replaces the source with a set of force moments, the first degree moment tensor for a point source representation. The difficulty with this formulation, which can completely describe the observed waveforms when the propagation path effects are known, is in the physical interpretation of the actual physical processes acting in the source volume. Observational data from within the source region, where processes are often nonlinear, linked to numerical models of the important physical processes in this region, are critical to a unique physical understanding of the equivalent elastic source function.

  12. Community Exchange Systems. What They Are. How They Work. How to Set One Up.

    Science.gov (United States)

    Page, Leslie

    This booklet explains the concept of a community exchange system (CES), or barter system, for the exchange of goods and services and describes how to set one up. The booklet is concerned only with nonprofit, voluntary organizations. The booklet is organized in four sections. The first section introduces the community exchange systems idea and…

  13. The Research of Dr. Joanne Simpson: Fifty Years Investigating Hurricanes, Tropical Clouds and Cloud Systems

    Science.gov (United States)

    Tao, W. -K.; Halverson, J.; Adler, R.; Garstang, M.; Houze, R., Jr.; LeMone, M.; Pielke, R., Sr.; Woodley, W.; O'C.Starr, David (Technical Monitor)

    2001-01-01

This AMS Meteorological Monograph is dedicated to Dr. Joanne Simpson for her many pioneering research efforts in tropical meteorology during her fifty-year career. Dr. Simpson's major areas of scientific research involved the "hot tower" hypothesis and its role in hurricanes, the structure and maintenance of trade winds, air-sea interaction, and observations of and mechanisms for hurricanes and waterspouts. She was also a pioneer in cloud modeling, with the first one-dimensional model and the first cumulus model on a computer. She also played a major role in planning and leading observational experiments on convective cloud systems. The launch of the Tropical Rainfall Measuring Mission (TRMM) satellite, a joint U.S.-Japan project, in November of 1997 made it possible to obtain quantitative measurements of tropical rainfall on a continuous basis over the entire global tropics. Dr. Simpson was the TRMM Project Scientist from 1986 until its launch in 1997. Her efforts during this crucial period ensured that the mission was both well planned scientifically and well engineered, as well as within budget. In this paper, nine specific accomplishments of Dr. Simpson's fifty-year career are described and discussed: (1) the hot tower hypothesis, (2) hurricanes, (3) airflow and clouds over heated islands, (4) cloud models, (5) trade winds and their role in cumulus development, (6) air-sea interaction, (7) cloud-cloud interactions and mergers, (8) waterspouts, and (9) TRMM science.

  14. Urine testing and urinary tract infections in febrile infants seen in office settings: the Pediatric Research in Office Settings' Febrile Infant Study.

    Science.gov (United States)

    Newman, Thomas B; Bernzweig, Jane A; Takayama, John I; Finch, Stacia A; Wasserman, Richard C; Pantell, Robert H

    2002-01-01

To determine the predictors and results of urine testing of young febrile infants seen in office settings. Prospective cohort study. Offices of 573 pediatric practitioners from 219 practices in the American Academy of Pediatrics Pediatric Research in Office Settings' research network. A total of 3066 infants 3 months or younger with temperatures of 38 degrees C or higher were evaluated and treated according to the judgment of their practitioners. Urine testing results, early and late urinary tract infections (UTIs), and UTIs with bacteremia. Fifty-four percent of the infants initially had urine tested, of whom 10% had a UTI. The height of the fever was associated with urine testing and a UTI among those tested (adjusted odds ratio per degree Celsius, 2.2 for both). Younger age, ill appearance, and lack of a fever source were associated with urine testing but not with a UTI, whereas lack of circumcision (adjusted odds ratio, 11.6), female sex (adjusted odds ratio, 5.4), and longer duration of fever (adjusted odds ratio, 1.8 for fever lasting ≥24 hours) were not associated with urine testing but were associated with a UTI. Bacteremia accompanied the UTI in 10% of the patients, including 17% of those younger than 1 month. Among 807 infants not initially tested or treated with antibiotics, only 2 had a subsequent documented UTI; both did well. Practitioners order urine tests selectively, focusing on younger and more ill-appearing infants and on those without an apparent fever source. Such selective urine testing, with close follow-up, was associated with few late UTIs in this large study. Urine testing should focus particularly on uncircumcised boys, girls, the youngest and sickest infants, and those with persistent fever.

  15. The digestible energy, metabolizable energy, and net energy content of dietary fat sources in thirteen- and fifty-kilogram pigs.

    Science.gov (United States)

    Kellner, T A; Patience, J F

    2017-09-01

The objective was to determine the energy concentration of a diverse array of dietary fat sources and, from these data, develop regression equations that explain differences based on chemical composition. A total of 120 Genetiporc 6.0 × Genetiporc F25 (PIC, Inc., Hendersonville, TN) individually housed barrows were studied for 56 d. These barrows (initial BW of 9.9 ± 0.6 kg) were randomly allotted to 1 of 15 dietary treatments. Each experimental diet included 95% of a corn-soybean meal basal diet plus 5% of either corn starch or 1 of 14 dietary fat sources. The 14 dietary fat sources (animal-vegetable blend, canola oil, choice white grease source A, choice white grease source B, coconut oil, corn oil source A, corn oil source B, fish oil, flaxseed oil, palm oil, poultry fat, soybean oil source A, soybean oil source B, and tallow) were selected to provide a diverse and robust range of unsaturated fatty acid:SFA ratios (U:S). Pigs were limit-fed experimental diets from d 0 to 10 and from d 46 to 56, providing a 7-d adaptation for fecal collection on d 7 to 10 (13 kg BW) and d 53 to 56 (50 kg BW). At 13 kg BW, the average energy content of the 14 sources was 8.42 Mcal DE/kg, 8.26 Mcal ME/kg, and 7.27 Mcal NE/kg. At 50 kg BW, the average energy content was 8.45 Mcal DE/kg, 8.28 Mcal ME/kg, and 7.29 Mcal NE/kg. At 13 kg BW, the variation of dietary fat DE content was explained by DE (Mcal/kg) = 9.363 + [0.097 × (FFA, %)] - [0.016 × omega-6:omega-3 fatty acids ratio] - [1.240 × (arachidic acid, %)] - [5.054 × (insoluble impurities, %)] + [0.014 × (palmitic acid, %)] (P = 0.008, R² = 0.82). At 50 kg BW, the variation of dietary fat DE content was explained by DE (Mcal/kg) = 8.357 + [0.189 × U:S] - [0.195 × (FFA, %)] - [6.768 × (behenic acid, %)] + [0.024 × (PUFA, %)] (P = 0.002, R² = 0.81). In summary, the chemical composition of dietary fat explained a large degree of the variation observed in the energy content of dietary fat sources at both 13 and 50 kg BW.
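The 13-kg regression reported above can be transcribed directly as a function. Inputs follow the abstract's units (percentages for the fatty-acid, FFA and insoluble-impurity fractions; a plain ratio for omega-6:omega-3); the function name is illustrative.

```python
def de_13kg(ffa_pct, n6_n3_ratio, arachidic_pct, insoluble_pct, palmitic_pct):
    """DE (Mcal/kg) of a dietary fat source at 13 kg BW, per the regression
    reported in the abstract; inputs in the abstract's units."""
    return (9.363
            + 0.097 * ffa_pct
            - 0.016 * n6_n3_ratio
            - 1.240 * arachidic_pct
            - 5.054 * insoluble_pct
            + 0.014 * palmitic_pct)
```

A hypothetical fat source with 1% FFA, an omega-6:omega-3 ratio of 10, 0.1% arachidic acid, 0.01% insoluble impurities and 20% palmitic acid would evaluate to about 9.41 Mcal DE/kg.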

  16. Fifty years in fusion and the way forward

    International Nuclear Information System (INIS)

    Jacquinot, J.

    2010-01-01

    This particular 'Fusion Pioneers Memorial lecture' was given 50 years after the first historical FEC conference in 1958. It was a unique occasion to perform a global reflection on thermonuclear fusion which is summarized in this paper. We first consider the case for fusion energy then move on to the scientific achievements during the past five decades. Finally, the lessons drawn from the past give a framework to consider the challenges ahead of us. The 1958 pioneers had the vision of the vital importance of international collaboration to succeed in this unique endeavour. Fifty years later, this vision has amply proven its worth. Looking at the way forward, this vision constitutes a strong basis to harness fusion energy in the decades to come.

  17. Domain analysis of computational science - Fifty years of a scientific computing group

    Energy Technology Data Exchange (ETDEWEB)

    Tanaka, M.

    2010-02-23

I employed bibliometric and historical methods to study the domain of the Scientific Computing group at Brookhaven National Laboratory (BNL) over an extended period of fifty years, from 1958 to 2007. I noted and confirmed the growing emergence of interdisciplinarity within the group. I also identified a strong, consistent mathematics and physics orientation within it.

  18. Fifty years of sociological leadership at Social Science and Medicine.

    Science.gov (United States)

    Timmermans, Stefan; Tietbohl, Caroline

    2018-01-01

In this review article, we examine some of the conceptual contributions of the sociology of health and illness over the past fifty years. Specifically, we focus on research dealing with medicalization, the management of stigma, research on adherence and compliance, and patient-doctor interaction. We show how these themes, which originated within sociology, diffused into other disciplines. Sociology in Social Science and Medicine started as an applied research tradition but morphed into a robust, stand-alone social science tradition. Copyright © 2017 Elsevier Ltd. All rights reserved.

  19. Professional and organizational commitment in paediatric occupational therapists: the influence of practice setting.

    Science.gov (United States)

    Seruya, Francine M; Hinojosa, Jim

    2010-09-01

    The professional and organizational commitment of paediatric occupational therapists working in two distinct practice settings, schools and medically based settings, was investigated. A web-based survey program was used to administer a questionnaire to occupational therapists employed in New York, New Jersey and Connecticut. The study employed social identity theory as a guiding perspective in understanding therapists' professional and organizational commitment. One hundred and fifty-seven paediatric therapists responded to the Professional Commitment Questionnaire and the Organizational Commitment Questionnaire to gauge their commitment to both the profession and their employing organizations. Results indicated that paediatric therapists, regardless of employment setting, have high professional commitment. Paediatric occupational therapists employed in medically based settings indicated statistically significant higher organizational commitment than their school-based counterparts. For therapists that work in school settings, the presence of a professional cohort did not influence professional commitment scores. As the study employed a web-based survey methodology, only individuals who were members of associations and had access to a computer and the Internet were able to participate. Further study might include widening the participant pool as well as adding additional instruments to explore both professional and organizational commitment on a more national scale. Copyright 2010 John Wiley & Sons, Ltd.

  20. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data.

    Science.gov (United States)

    Nuel, Gregory; Regad, Leslie; Martin, Juliette; Camproux, Anne-Claude

    2010-01-26

    In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations both for low and high complexity patterns in the framework of homogeneous or heterogeneous Markov models. The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models. It also permits to avoid any product of convolution of the pattern distribution in individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in the complexity by taking advantage of previous computations to obtain moment generating functions efficiently. In the particular case of low or moderate complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms and their relative interest in comparison with existing ones were then tested and discussed on a toy-example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. Our algorithms prove to be effective and able to handle real data sets with multiple sequences, as well as biological patterns of
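The Markov-chain-embedding idea behind these algorithms can be sketched in a simplified form: build the KMP automaton of the pattern and propagate a joint distribution over (automaton state, occurrence count). For brevity this sketch assumes an i.i.d. (order-0) source; the paper's algorithms extend this to higher-order and heterogeneous Markov models and combine per-sequence distributions across a set of independent sequences.

```python
from collections import defaultdict

def kmp_automaton(pattern, alphabet):
    """DFA over prefix-match lengths: delta[(q, c)] is the length of the
    longest pattern prefix that is a suffix of (matched prefix q) + c."""
    m = len(pattern)
    delta = {}
    for q in range(m + 1):
        for c in alphabet:
            s = pattern[:q] + c
            k = min(m, len(s))
            while k and s[len(s) - k:] != pattern[:k]:
                k -= 1
            delta[(q, c)] = k
    return delta

def count_distribution(pattern, probs, n):
    """Exact distribution of the number of (possibly overlapping)
    occurrences of `pattern` in an i.i.d. sequence of length n,
    with symbol probabilities `probs`."""
    delta = kmp_automaton(pattern, list(probs))
    m = len(pattern)
    dist = {(0, 0): 1.0}                      # (dfa state, count) -> prob
    for _ in range(n):
        nxt = defaultdict(float)
        for (q, k), p in dist.items():
            for c, pc in probs.items():
                q2 = delta[(q, c)]
                nxt[(q2, k + (q2 == m))] += p * pc
        dist = dict(nxt)
    counts = defaultdict(float)
    for (_, k), p in dist.items():
        counts[k] += p
    return dict(counts)
```

For example, for the pattern "ab" over a uniform two-letter alphabet and n = 2, the distribution is {0: 0.75, 1: 0.25}, matching direct enumeration of the four sequences.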

  1. Fifty years of continuous improvement: (What has DOE done for analytical chemistry?)

    Energy Technology Data Exchange (ETDEWEB)

    Shults, W.D.

    1993-11-01

Over the past fifty years, analytical scientists within the DOE complex have had a tremendous impact on the field of analytical chemistry. This paper suggests six 'high impact' research/development areas that either originated within or were brought to maturity within the DOE laboratories. 'High impact' means they led to new subdisciplines or to new ways of doing business.

  2. One hundred and fifty years of sprint and distance running – Past trends and future prospects

    Science.gov (United States)

    Weiss, Martin; Newman, Alexandra; Whitmore, Ceri; Weiss, Stephan

    2016-01-01

Sprint and distance running have experienced remarkable performance improvements over the past century. Attempts to forecast running performances share an almost similarly long history but have relied so far on relatively short data series. Here, we compile a comprehensive set of season-best performances for eight Olympically contested running events. With this data set, we conduct (1) an exponential time series analysis and (2) a power-law experience curve analysis to quantify the rate of past performance improvements and to forecast future performances until the year 2100. We find that the sprint and distance running performances of women and men improve exponentially with time and converge at yearly rates of 4% ± 3% and 2% ± 2%, respectively, towards their asymptotic limits. Running performances can also be modelled with the experience curve approach, yielding learning rates of 3% ± 1% and 6% ± 2% for the women's and men's events, respectively. Long-term trends suggest that: (1) women will continue to run 10–20% slower than men, (2) 9.50 s over 100 m dash may only be broken at the end of this century and (3) several middle- and long-distance records may be broken within the next two to three decades. The prospects of witnessing a sub-2 hour marathon before 2100 remain inconclusive. Our results should be interpreted cautiously as forecasting human behaviour is intrinsically uncertain. The future season-best sprint and distance running performances will continue to scatter around the trends identified here and may yield unexpected improvements of standing world records. PMID:26088705
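The exponential convergence toward an asymptotic limit described above can be fitted with a standard log-linear trick when the asymptote is treated as known. This is a sketch on synthetic data: the 9.40 s limit, amplitude and rate below are illustrative assumptions, not the paper's fitted values.

```python
import numpy as np

def fit_exponential(years, times, asymptote):
    """Least-squares fit of t(y) = L + A * exp(-r * (y - y0)) with a known
    asymptote L, by regressing log(t - L) linearly on the year offset."""
    years = np.asarray(years, dtype=float)
    z = np.log(np.asarray(times, dtype=float) - asymptote)
    slope, intercept = np.polyfit(years - years[0], z, 1)
    return np.exp(intercept), -slope          # amplitude A, rate r

# Synthetic season-best 100 m times converging to an assumed 9.40 s limit
years = np.arange(1970, 2021, 5)
L, true_A, true_r = 9.40, 0.85, 0.02
times = L + true_A * np.exp(-true_r * (years - years[0]))
A, r = fit_exponential(years, times, L)
```

With an unknown asymptote, a nonlinear least-squares fit over (L, A, r) is needed instead; the log-linear version above only works once L is fixed.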

  3. Fifty years after Hiroshima and Nagasaki

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1996-01-01

factors such as those mentioned above must be considered in interpreting the effect of the atomic bombing, instead of ascribing all the effects solely to ionizing radiation. Fifty years ago, in 1945, the first three atomic bombs in human history were produced by the United States; one of these bombs was exploded experimentally at the testing ground in the desert 80 km from Alamogordo, New Mexico, on 16 July. The remaining two were used against Japan; the one called 'Little Boy', using Uranium 235, was dropped on Hiroshima on 6 August, the other, 'Fat Man', using Plutonium 239, on Nagasaki on 9 August, 1945

  4. Fifty years with nuclear fission

    International Nuclear Information System (INIS)

    Behrens, J.W.; Carlson, A.D.

    1989-01-01

The news of the discovery of nuclear fission, by Otto Hahn and Fritz Strassmann in Germany, was brought to the United States by Niels Bohr in January 1939. Since its discovery, the United States, and the world for that matter, has never been the same. It therefore seemed appropriate to acknowledge the fiftieth anniversary of the discovery by holding a topical meeting entitled ''Fifty years with nuclear fission'' in the United States during the year 1989. The objective of the meeting was to bring together pioneers of the nuclear industry and other scientists and engineers to report on reminiscences of the past and on the more recent developments in fission science and technology. The conference highlighted the early pioneers of the nuclear industry by dedicating a full day (April 26), consisting of two plenary sessions, at the National Academy of Sciences (NAS) in Washington, DC. More recent developments in fission science and technology, in addition to historical reflections, were the topics for two full days of sessions (April 27 and 28) at the main site of NIST in Gaithersburg, Maryland. The wide range of topics covered by Volume 2 of this topical meeting included plenary, invited, and contributed sessions entitled: Nuclear fission -- a perspective; reactors II; fission science II; medical and industrial applications of by-products; reactors and safeguards; general research, instrumentation, and by-products; and fission data, astrophysics, and space applications. The individual papers have been cataloged separately

  5. New generation of light sources: Present and future

    International Nuclear Information System (INIS)

    Couprie, M.E.

    2014-01-01

Spectroscopy and imaging in the VUV–X-ray domain are very sensitive tools for investigating the properties of matter [1–3]. Time-resolved studies make it possible to follow ultra-fast reactions as they unfold. More than fifty years after the discovery of the laser [4], VUV–X-ray light sources are being actively developed around the world. Among them, high order harmonics generated in gas, X-ray lasers, synchrotron radiation and free electron lasers provide a wide offering, from laboratory-size sources to large scale facilities, with various features suitable for different types of experiments. The properties of these sources are reviewed here. The quest for new performance and flexibility is also discussed

  6. Effects of setting new source performance standards for fluidized-bed combustion systems

    Energy Technology Data Exchange (ETDEWEB)

    1978-02-01

    This study was undertaken for the US Environmental Protection Agency to examine the potential consequences of revisions in New Source Performance Standards (NSPS) on fluidized-bed combustor-based steam electric generators of greater than 250,000,000 Btu. A study of the appropriateness and differential effects of alternate regulatory approaches to the standards-setting process was made. Problems dealing with an emerging technology such as fluidized-bed combustion were emphasized. Finally, an examination was made of the potential benefits of fluidized-bed combustion (FBC) systems relative to conventional coal-fired systems equipped with scrubbers. Information is included on the relative advantages and disadvantages of utility-sized fluidized-bed combustors, the technical consequences of NSPS alternatives, policy implications concerning NSPS for steam-electric generators, and cost models for atmospheric and pressurized FBC systems. (LCL)

  7. The history of fifty years of institute of electrical engineers 1947-1996

    International Nuclear Information System (INIS)

    1997-07-01

This book starts with a survey of a century of the Korean electrical industry, covering the electric power business and the electrical machinery industry. Next, it surveys fifty years of the Korean Institute of Electrical Engineers. It then details the Institute's articles of association, organization, board members, the role of the administrative organization, and its study and institute business activities.

  8. An inverse problem for a one-dimensional time-fractional diffusion problem

    KAUST Repository

    Jin, Bangti

    2012-06-26

We study an inverse problem of recovering a spatially varying potential term in a one-dimensional time-fractional diffusion equation from the flux measurements taken at a single fixed time corresponding to a given set of input sources. The unique identifiability of the potential is shown for two cases, i.e. the flux at one end and the net flux, provided that the set of input sources forms a complete basis in L²(0, 1). An algorithm of the quasi-Newton type is proposed for the efficient and accurate reconstruction of the coefficient from finite data, and the injectivity of the Jacobian is discussed. Numerical results for both exact and noisy data are presented. © 2012 IOP Publishing Ltd.

  9. A One-Source Approach for Estimating Land Surface Heat Fluxes Using Remotely Sensed Land Surface Temperature

    Directory of Open Access Journals (Sweden)

    Yongmin Yang

    2017-01-01

The partitioning of available energy between sensible heat and latent heat is important for precise water resources planning and management in the context of global climate change. Land surface temperature (LST) is a key variable in the energy balance process, and remotely sensed LST is widely used for estimating surface heat fluxes at regional scale. However, the inequality between LST and the aerodynamic surface temperature (Taero) poses a great challenge for regional heat flux estimation in one-source energy balance models. To address this issue, we propose a One-Source Model for Land (OSML) to estimate regional surface heat fluxes without requirements for empirical extra resistance, roughness parameterization or wind velocity. The proposed OSML employs both the conceptual VFC/LST trapezoid model and the electrical analog formula of sensible heat flux (H) to analytically estimate the radiometric-convective resistance (rae) via a quartic equation. To evaluate the performance of OSML, the model was applied to the Soil Moisture-Atmosphere Coupling Experiment (SMACEX) in the United States and the Multi-Scale Observation Experiment on Evapotranspiration (MUSOEXE) in China, using remotely sensed retrievals as auxiliary data sets at regional scale. Validated against tower-based surface flux observations, the root mean square deviations (RMSD) of H and latent heat flux (LE) from OSML are 34.5 W/m2 and 46.5 W/m2 at the SMACEX site and 50.1 W/m2 and 67.0 W/m2 at the MUSOEXE site. The performance of OSML is very comparable to other published studies. In addition, the proposed OSML model demonstrates similar skill in predicting surface heat fluxes in comparison to SEBS (Surface Energy Balance System). Since OSML does not require specification of aerodynamic surface characteristics, roughness parameterization or meteorological conditions with high spatial variation such as wind speed, the proposed method shows high potential for routine acquisition of latent heat flux estimates.
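The one-source partition underlying this class of models can be sketched in its simplest form: sensible heat H from the electrical-analog formula using the radiometric temperature, and latent heat LE as the energy-balance residual. Here the resistance r_ae is prescribed for illustration, whereas OSML solves a quartic equation for it; the value of ρcp is an approximation.

```python
RHO_CP = 1200.0   # volumetric heat capacity of air, J m^-3 K^-1 (approx.)

def surface_fluxes(rn, g, t_rad, t_air, r_ae):
    """One-source partition of available energy (all fluxes in W/m2):
    H from the electrical analog using radiometric surface temperature
    t_rad (K), air temperature t_air (K) and a prescribed radiometric-
    convective resistance r_ae (s/m); LE as the energy-balance residual."""
    h = RHO_CP * (t_rad - t_air) / r_ae
    le = rn - g - h                     # LE = Rn - G - H
    return h, le
```

For a net radiation of 500 W/m2, soil heat flux of 50 W/m2, a 3 K surface-air temperature difference and r_ae = 60 s/m, this gives H = 60 W/m2 and LE = 390 W/m2.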

  10. Implementation of Bourbaki's Elements of Mathematics in Coq: Part One, Theory of Sets

    OpenAIRE

    Grimm , José

    2013-01-01

We believe that it is possible to put the whole work of Bourbaki into a computer. One of the objectives of the Gaia project concerns homological algebra (theory as well as algorithms); as a first step we want to implement all nine chapters of the book Algebra. But this requires a theory of sets (with the axiom of choice, etc.) more powerful than what is provided by Ensembles; we have chosen the work of Carlos Simpson as a basis. This report lists and comments on all definitions and theorems of the Cha...

  11. APERO, AN OPEN SOURCE BUNDLE ADJUSTMENT SOFTWARE FOR AUTOMATIC CALIBRATION AND ORIENTATION OF SET OF IMAGES

    Directory of Open Access Journals (Sweden)

    M. Pierrot Deseilligny

    2012-09-01

IGN has developed a set of photogrammetric tools, APERO and MICMAC, for computing 3D models from sets of images. This software, initially developed for IGN's internal needs, is now delivered as open source code. This paper focuses on the presentation of APERO, the orientation software. Compared to some other free software initiatives, it is probably more complex but also more complete; its targeted users are professionals (architects, archaeologists, geomorphologists) rather than the general public. APERO uses both a computer vision approach for the estimation of the initial solution and photogrammetry for a rigorous compensation of the total error; it has a large library of parametric distortion models allowing a precise modelization of all the kinds of pinhole camera we know, including several fish-eye models; there are also several tools for geo-referencing the results. The results are illustrated on various applications, including the data set of the 3D-Arch workshop.

  12. The decision book fifty models for strategic thinking

    CERN Document Server

    Krogerus, Mikael

    2011-01-01

    Most of us face the same questions every day: What do I want? And how can I get it? How can I live more happily and work more efficiently? A worldwide bestseller, The Decision Book distils into a single volume the fifty best decision-making models used on MBA courses and elsewhere that will help you tackle these important questions - from the well known (the Eisenhower matrix for time management) to the less familiar but equally useful (the Swiss Cheese model). It will even show you how to remember everything you will have learned by the end of it. Stylish and compact, this little black book is a powerful asset. Whether you need to plot a presentation, assess someone's business idea or get to know yourself better, this unique guide will help you simplify any problem and take steps towards the right decision.

  13. Fifty years of 'Atoms for Peace'

    International Nuclear Information System (INIS)

    Heller, W.

    2004-01-01

    Fifty years ago, on December 8, 1953, the then U.S. President, Dwight D. Eisenhower, in his famous speech before the General Assembly of the United Nations proclaimed his 'Atoms for Peace' program, which was to initiate a policy of international cooperation. The event had been preceded by a policy of the United States intended to guarantee to the United States the monopoly in the production and use of nuclear weapons, which ultimately failed because of the resistance of the Soviet Union. The doctrine of a technological monopoly in the nuclear field was to be changed in favor of cooperative ventures under the rigorous control of the United States. The 1954 Atomic Energy Act clearly formulated the will to cooperate. Following a U.S. initiative, the International Atomic Energy Agency (IAEA) was founded in 1956 to assist in transfers of nuclear technology and assume controlling functions to prevent abuse for non-peaceful purposes. Quite a number of countries used the 'Atoms for Peace' offer to develop nuclear power in very close cooperation with American industry and depending on U.S. nuclear fuel supply. On the whole, 'Atoms for Peace' has paved the way to a worldwide peaceful use of nuclear power. (orig.)

  14. Observations of radio sources or 'What happened to radio stars?'

    International Nuclear Information System (INIS)

    Conway, R.G.

    1988-01-01

    A review is given of the early history of the interpretation of the radiation mechanisms following the discovery of the discrete radio sources, both galactic and extragalactic. The conflicting views which prevailed in the early fifties are discussed in some detail: some advocated thermal radiation from stars relatively close by, and others proposed the alternative that synchrotron radiation was responsible for the majority of the radio sources. Attention is drawn to the importance of high-resolution interferometry, whereby the structure of many of the sources could be obtained. Red-shift measurements and spectral distributions also played a part in determining distances and flux strengths at the sources. (U.K.)

  15. The Kramers problem: Fifty years of development

    International Nuclear Information System (INIS)

    Mel'nikov, V.O.

    1990-09-01

    In the last fifty years the seminal work by Kramers of 1940 has been greatly extended both by elaboration of new theoretical approaches and through applications to new experimental systems. The most interesting case turns out to be the regime of weak-to-medium damping, in which case the Fokker-Planck equation can be reduced to an equation or to a system of integral equations of the Wiener-Hopf type. Exact solutions can then be given for the escape rate from single- and double-well potentials. This general scheme can be naturally extended to include quantum penetration through a semiclassical barrier and the effect of quantum noise. Finally, we consider the Brownian motion in a tilted washboard potential using Josephson junctions as an illustrative example. In that context we calculate (i) fluctuation-induced voltage-current characteristics, (ii) the lifetime of a zero-voltage state, (iii) the lifetime of the running state, (iv) partial probabilities of the phase jumps by 2πn (n is an integer) and (v) retrapping current distribution in both classical and quantum regimes. (author). 61 refs, 17 figs

  16. Pre-coding method and apparatus for multiple source or time-shifted single source data and corresponding inverse post-decoding method and apparatus

    Science.gov (United States)

    Yeh, Pen-Shu (Inventor)

    1998-01-01

    A pre-coding method and device for improving data compression performance by removing correlation between a first original data set and a second original data set, each having M members, respectively. The pre-coding method produces a compression-efficiency-enhancing double-difference data set. The method and device produce a double-difference data set, i.e., an adjacent-delta calculation performed on a cross-delta data set or a cross-delta calculation performed on two adjacent-delta data sets, from either one of (1) two adjacent spectral bands coming from two discrete sources, respectively, or (2) two time-shifted data sets coming from a single source. The resulting double-difference data set is then coded using either a distortionless data encoding scheme (entropy encoding) or a lossy data compression scheme. Also, a post-decoding method and device for recovering a second original data set having been represented by such a double-difference data set.
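The double-difference operation the abstract describes (a cross-delta between the two data sets followed by an adjacent-delta, and its inverse post-decoding) can be sketched as follows. This is an illustrative NumPy reconstruction from the abstract only, not the patented implementation; the function names and the choice of the cross-delta-first ordering are assumptions.

```python
import numpy as np

def double_difference(set_a, set_b):
    """Cross-delta between two M-member data sets, then an adjacent-delta
    along the resulting cross-delta set (one of the two orderings the
    abstract mentions)."""
    cross = set_b.astype(np.int64) - set_a.astype(np.int64)  # cross-delta
    return np.diff(cross, prepend=0)                         # adjacent-delta

def post_decode(set_a, dd):
    """Inverse post-decoding: recover the second original data set from
    the first set and the double-difference set."""
    cross = np.cumsum(dd)   # undo the adjacent-delta
    return set_a + cross    # undo the cross-delta

# Hypothetical example: two correlated "adjacent spectral bands".
a = np.array([100, 102, 105, 109])
b = np.array([101, 104, 108, 113])
dd = double_difference(a, b)           # small values, easier to entropy-code
assert np.array_equal(post_decode(a, dd), b)   # lossless round trip
```

The point of the transform is that `dd` concentrates near zero when the two sets are correlated, which is what makes the subsequent entropy (or lossy) coding stage more efficient.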

  17. Vannevar Bush: Fifty Years Later

    Science.gov (United States)

    Lagowski, J. J.

    1995-12-01

    strategy will invariably decrease the flow of truly new knowledge in a discipline, a process that will eventually affect the viability of our technology base. Some argue for a third view, namely, expanding the career options for PhD's by altering the details of the training process. If there was a flaw in the Bush plan, it was to be found in the implicit premise that an ever-growing supply of scientists would stimulate new demand for scientific expertise, not just in government and universities, but in industry and the professional venues. Bush probably never expected that, because of federal funding, university scientists would in 50 years produce not just the national reserve of scientists he sought to develop, but a growing number of young PhD's, many of whom wanted nothing more--and nothing less--than to be university scientists themselves. Bush probably never guessed at the efficiency of the process for the education of scientists he set into motion. The absence of a plan to complement supply with demand is one source of the inherent structural problem in American science today. Young PhD's do not receive a sufficiently versatile training to do anything other than academic scientific research. Science as a way of knowing is clearly a sound foundation for a variety of careers. Numerous opportunities exist that can use the skills of the scientist while rewarding creativity, autonomy, problem-solving, industriousness, and the yearning for knowledge--all the characteristics associated with well-trained scientists. The challenge for academe is to refine or adapt Vannevar Bush's original "social contract" into a new one, more appropriate for the 21st century.

  18. Exact distribution of a pattern in a set of random sequences generated by a Markov source: applications to biological data

    Directory of Open Access Journals (Sweden)

    Regad Leslie

    2010-01-01

    Full Text Available Abstract Background In bioinformatics it is common to search for a pattern of interest in a potentially large set of rather short sequences (upstream gene regions, proteins, exons, etc.). Although many methodological approaches allow practitioners to compute the distribution of a pattern count in a random sequence generated by a Markov source, no specific developments have taken into account the counting of occurrences in a set of independent sequences. We aim to address this problem by deriving efficient approaches and algorithms to perform these computations, both for low- and high-complexity patterns, in the framework of homogeneous or heterogeneous Markov models. Results The latest advances in the field allowed us to use a technique of optimal Markov chain embedding based on deterministic finite automata to introduce three innovative algorithms. Algorithm 1 is the only one able to deal with heterogeneous models; it also avoids any product of convolutions of the pattern distribution in individual sequences. When working with homogeneous models, Algorithm 2 yields a dramatic reduction in complexity by taking advantage of previous computations to obtain moment-generating functions efficiently. In the particular case of low- or moderate-complexity patterns, Algorithm 3 exploits power computation and binary decomposition to further reduce the time complexity to a logarithmic scale. All these algorithms, and their relative interest in comparison with existing ones, were then tested and discussed on a toy example and three biological data sets: structural patterns in protein loop structures, PROSITE signatures in a bacterial proteome, and transcription factors in upstream gene regions. On these data sets, we also compared our exact approaches to the tempting approximation that consists in concatenating the sequences in the data set into a single sequence. 
Conclusions Our algorithms prove to be effective and able to handle real data sets with
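To illustrate the kind of computation involved (this is a minimal sketch of Markov chain embedding via a deterministic automaton for a single sequence, not the paper's Algorithms 1-3), an exact pattern-count distribution for a first-order Markov source can be obtained by dynamic programming over (last symbol, automaton state, count):

```python
from collections import defaultdict

def pattern_count_distribution(pattern, init, trans, length):
    """Exact distribution of the number of (overlapping) occurrences of
    `pattern` in a sequence of `length` symbols from a first-order Markov
    source. init[a] = P(X1 = a); trans[a][b] = P(next = b | current = a).
    A KMP-style automaton plays the role of the deterministic finite
    automaton used for the embedding."""
    m = len(pattern)

    def step(state, sym):
        # Longest suffix of (matched prefix + sym) that is a pattern prefix.
        s = pattern[:state] + sym
        while s and not pattern.startswith(s):
            s = s[1:]
        return len(s)

    # prob[(last_symbol, automaton_state, count)] = probability
    prob = defaultdict(float)
    for a, p in init.items():
        st = step(0, a)
        prob[(a, st, int(st == m))] += p
    for _ in range(length - 1):
        nxt = defaultdict(float)
        for (a, st, c), p in prob.items():
            for b, q in trans[a].items():
                s2 = step(st, b)
                nxt[(b, s2, c + (s2 == m))] += p * q
        prob = nxt
    dist = defaultdict(float)
    for (_, _, c), p in prob.items():
        dist[c] += p
    return dict(dist)

# Toy check: uniform i.i.d. binary source, pattern "11", length 3.
u = {'0': 0.5, '1': 0.5}
d = pattern_count_distribution('11', u, {'0': u, '1': u}, 3)
# P(count = 2) = P("111") = 1/8; P(count = 1) = P("110" or "011") = 1/4.
```

The state space here grows with the pattern length and alphabet; the paper's contribution is precisely in making such computations efficient across a *set* of independent sequences and for high-complexity patterns, which this sketch does not attempt.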

  19. "Double crap!" abuse and harmed identity in Fifty Shades of Grey.

    Science.gov (United States)

    Bonomi, Amy E; Altenburger, Lauren E; Walton, Nicole L

    2013-09-01

    when genuinely angry, dismisses Anastasia's requests for boundaries, and threatens her). Anastasia experiences reactions typical of abused women, including: constant perceived threat ("my stomach churns from his threats"); altered identity (describes herself as a "pale, haunted ghost"); and stressful managing (engages in behaviors to "keep the peace," such as withholding information about her social whereabouts to avoid Christian's anger). Anastasia becomes disempowered and entrapped in the relationship as her behaviors become mechanized in response to Christian's abuse. Our analysis identified patterns in Fifty Shades that reflect pervasive intimate partner violence-one of the biggest problems of our time. Further, our analysis adds to a growing body of literature noting dangerous violence standards being perpetuated in popular culture.

  20. Statement to the fifty-fifth session of the United Nations General Assembly

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2000-01-01

    In his statement to the fifty-fifth session of the United Nations General Assembly, the Director General of the IAEA briefly presented the three fundamental functions of the IAEA, namely: its role as a catalyst for the development and transfer of peaceful nuclear technologies, the efforts to prevent the proliferation of nuclear weapons and move towards nuclear disarmament, and the work to build and maintain a global nuclear safety regime

  1. Dual energy CTA of the supraaortic arteries: Technical improvements with a novel dual source CT system

    Energy Technology Data Exchange (ETDEWEB)

    Lell, Michael M., E-mail: Michael.lell@uk-erlangen.de [Department of Radiology, University Erlangen, Maximiliansplatz 1, 91054 Erlangen (Germany); Hinkmann, Fabian [Department of Radiology, University Erlangen, Maximiliansplatz 1, 91054 Erlangen (Germany); Nkenke, Emeka [Department of Maxillofacial Surgery, University Erlangen (Germany); Schmidt, Bernhard [Bayer-Schering Healthcare, Berlin (Germany); Seidensticker, Peter [Siemens Healthcare, CT-Division, Forchheim (Germany); Kalender, Willi A. [Institute of Medical Physics, University Erlangen (Germany); Uder, Michael [Department of Radiology, University Erlangen, Maximiliansplatz 1, 91054 Erlangen (Germany); Achenbach, Stephan [Department of Cardiology, University Erlangen (Germany)

    2010-11-15

    Objectives: Computed tomography angiography (CTA) is a well-accepted imaging modality to evaluate the supraaortic vessels. Initial reports have suggested that dual energy CTA (DE-CTA) can enhance diagnosis by creating bone-free data sets, which can be visualized in 3D, but a number of limitations of this technique have also been addressed. We sought to describe the performance of DE-CTA of the supraaortic vessels with a novel dual source CT system, with special emphasis on image quality and post-processing related artifacts. Materials and methods: Thirty-three patients underwent carotid CT angiography on a second generation dual source CT system. Simultaneous acquisitions of 100 and 140 kV data sets in the arterial phase were performed. Two examiners evaluated overall bone suppression with a 3-point scale (1 = poor; 3 = excellent) and image quality regarding integrity of the vessel lumen of different vessel segments (n = 26) with a 5-point scale (1 = poor; 5 = excellent); CTA source data served as the reference. Results: Excellent bone suppression could be achieved in the head and neck. Only minor bone remnants occurred; the mean score for bone removal was 2.9. The mean score for vessel integrity was 4.3. Eight hundred fifty-seven vessel segments could be evaluated. Six hundred thirty-five segments (74%) showed no lumen alteration, 65 segments (7.6%) lumen alterations <10%, 27 segments (3.1%) lumen alterations >10% resulting in a total luminal reduction <50%, 17 segments (2%) lumen alterations of more than 10% resulting in a total luminal reduction >50%, and 113 segments (13.2%) showed a gap in the vessel course (100% total lumen reduction). Artificial gaps of the vessel lumen occurred in 28 vessel segments due to artifacts caused by dental hardware and in all but one (65) of the ophthalmic arteries. Conclusions: Excellent bone suppression could be achieved; DE imaging with 100 and 140 kV led to improved image quality and vessel integrity in the shoulder region than previously

  2. Fifty years with nuclear fission

    International Nuclear Information System (INIS)

    Behrens, J.W.; Carlson, A.D.

    1989-01-01

    The news of the discovery of nuclear fission, by Otto Hahn and Fritz Strassmann in Germany, was brought to the United States by Niels Bohr in January 1939. Since its discovery, the United States, and the world for that matter, has never been the same. It therefore seemed appropriate to acknowledge the fiftieth anniversary of its discovery by holding a topical meeting entitled ''Fifty Years with Nuclear Fission'' in the United States during the year 1989. The objective of the meeting was to bring together pioneers of the nuclear industry and other scientists and engineers to report on reminiscences of the past and on more recent developments in fission science and technology. The conference highlighted the early pioneers of the nuclear industry by dedicating a full day (April 26), consisting of two plenary sessions, at the National Academy of Sciences (NAS) in Washington, DC. More recent developments in fission science and technology, in addition to historical reflections, were topics for two full days of sessions (April 27 and 28) at the main site of NIST in Gaithersburg, Maryland. The wide range of topics covered in this Volume 1 of the topical meeting included plenary, invited, and contributed sessions entitled: Prelude to the First Chain Reaction -- 1932 to 1942; Early Fission Research -- Nuclear Structure and Spontaneous Fission; 50 Years of Fission, Science, and Technology; Nuclear Reactors, Secure Energy for the Future; Reactors 1; Fission Science 1; Safeguards and Space Applications; Fission Data; Nuclear Fission -- Its Various Aspects; Theory and Experiments in Support of Theory; Reactors and Safeguards; and General Research, Instrumentation, and By-Product. The individual papers have been cataloged separately

  3. Physics Mining of Multi-Source Data Sets

    Science.gov (United States)

    Helly, John; Karimabadi, Homa; Sipes, Tamara

    2012-01-01

    Powerful new parallel data mining algorithms can produce diagnostic and prognostic numerical models and analyses from observational data. These techniques yield higher-resolution measures than ever before of environmental parameters by fusing synoptic imagery and time-series measurements. These techniques are general and relevant to observational data, including raster, vector, and scalar, and can be applied in all Earth- and environmental-science domains. Because they can be highly automated and are parallel, they scale to large spatial domains and are well suited to change and gap detection. This makes it possible to analyze spatial and temporal gaps in information, and facilitates within-mission replanning to optimize the allocation of observational resources. The basis of the innovation is the extension of a recently developed set of algorithms packaged into MineTool to multi-variate time-series data. MineTool is unique in that it automates the various steps of the data mining process, thus making it amenable to autonomous analysis of large data sets. Unlike techniques such as artificial neural nets, which yield a black-box solution, MineTool's outcome is always an analytical model in parametric form that expresses the output in terms of the input variables. This has the advantage that the derived equation can then be used to gain insight into the physical relevance and relative importance of the parameters and coefficients in the model. This is referred to as physics-mining of data. The capabilities of MineTool have been extended to include both supervised and unsupervised algorithms, to handle multi-type data sets, and to run in parallel.
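The distinction the abstract draws between a black-box model and an analytical model in parametric form can be illustrated with a plain least-squares fit (this is a generic sketch of the idea, not MineTool's actual algorithm or API, which the abstract does not specify):

```python
import numpy as np

# "Physics-mining" in miniature: fit an explicit parametric model
# y = c0 + c1*x1 + c2*x2 from observations, so that each coefficient
# can be inspected for physical relevance -- unlike a black-box net.
rng = np.random.default_rng(0)
x = rng.normal(size=(200, 2))                    # two input variables
y = 1.0 + 3.0 * x[:, 0] - 2.0 * x[:, 1]          # noiseless synthetic data
design = np.column_stack([np.ones(len(x)), x])   # columns [1, x1, x2]
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
# coef recovers [1.0, 3.0, -2.0]: the output is an equation in the
# input variables, not an opaque set of network weights.
```

With noisy real data the recovered coefficients carry uncertainty, but the key property survives: the model remains a human-readable equation whose terms can be compared against physical expectations.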

  4. The Implementation of One-Week-One-Article Program in a Reading Class: A Reflective Practice

    Directory of Open Access Journals (Sweden)

    Yudi Rahmatullah

    2017-07-01

    Full Text Available This article presents my reflections on the implementation of a one-week-one-article program. Fifty-three students participated in this program. Every week they presented the article they had read. I found that the majority of students actively participated in this program, showing seriousness in understanding the content of the article, the pronunciation of difficult words, and the flow of the presentation. This program promoted at least three aspects: students' motivation, cooperative learning, and their critical thinking. Even though this program was conducted for university students, it would likely work with students of junior and senior secondary schools with some modification.

  5. The right pain in the right place. See you on the battlefield! Young blood rocks : Slide-Fifty

    Index Scriptorium Estoniae

    2006-01-01

    On the bands performing at the Rabarock festival in Järvakandi on 9-10 June: The Skreppers, Metsatöll, Slide Fifty (see www.rabarock.ee, www.skreppers.com, www.metsatoll.ee, www.slidefifty.com)

  6. Fifty Cases of Parkinson's Disease Treated by Acupuncture Combined with Madopar

    Institute of Scientific and Technical Information of China (English)

    REN Xiao-ming

    2008-01-01

    Objective: To search for an effective therapy for treating motor disorder due to Parkinson's disease (PD). Methods: Fifty cases in a treatment group were treated by acupuncture combined with madopar, and 30 cases in a control group were treated by madopar only. Results: A total effective rate of 92% was achieved, with obvious alleviation of motor disorder in the treatment group, which was significantly higher than that in the control group (P<0.05). Conclusion: Acupuncture can enhance the therapeutic effects of western medicine and lessen the dose of medicine needed.

  7. Nuclear power: A look at the future. International Conference on Fifty Years of Nuclear Power: The Next Fifty Years, 27 June 2004, Moscow, Russia

    International Nuclear Information System (INIS)

    ElBaradei, M.

    2004-01-01

    other electricity sources. Critical Issues in Shaping the Future are: Carbon Emissions and the Growth in Demand; Security of Supply; Public Perceptions and Misconceptions: Shaping National Choices; Performance in Addressing Key Concerns: Nuclear Safety, Management and Disposal of Spent Nuclear Fuel, Nuclear Security, Technological and Policy Innovation, Fuel for Transportation (Growing Interest in the Potential for a 'Hydrogen Economy'). Although it is difficult to predict with any confidence what the next fifty years holds for nuclear power, the factors that will shape the future of nuclear power are relatively evident, and action should be taken to address those factors, to enhance the prospects that nuclear energy remains a viable source of safe, secure and environmentally benign energy

  8. A qualitative study of gestational weight gain goal setting.

    Science.gov (United States)

    Criss, Shaniece; Oken, Emily; Guthrie, Lauren; Hivert, Marie-France

    2016-10-20

    Gestational weight gain (GWG) is an important predictor of short- and long-term pregnancy outcomes for both mother and child, and women who set a GWG goal are more likely to gain within recommended ranges. Little information is available regarding potentially modifiable factors that underlie a woman's GWG goals. Our aims were to explore women's perceptions regarding factors that affect GWG, their understanding of appropriate GWG, their goal-setting experiences including patient-health care provider (HCP) conversations, and the supportive interventions they would most like to help them achieve the recommended GWG. We conducted nine in-depth interviews and seven focus groups with a total of 33 Boston, Massachusetts (MA) area women who were pregnant and had delivered within the prior 6 months. We recorded and transcribed all interviews. Two investigators independently coded the resulting transcripts. We managed data using MAXQDA2 and conducted a content analysis. Perceived factors that contributed to GWG goal-setting included the mother's weight control behaviors concerning exercise and diet-including a "new way of eating for two" and "semblance of control", experiences during prior pregnancies, conversations with HCPs, and influence from various information sources. Women focused on behaviors with consistent messaging across multiple sources of information, but mainly trusted their HCP, valued one-to-one conversations with them about GWG, preferred that the HCP initiate the conversation about GWG goals, and would be open to having the conversation started from a visual aid based on their own GWG progression. Pregnant women highly value discussions with their HCP to set GWG goals. Pregnant women view their clinicians as the most reliable source of information and believe that clinicians should open weight-related discussions throughout pregnancy.

  9. THE CHANDRA SOURCE CATALOG

    International Nuclear Information System (INIS)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G.; Grier, John D.; Hain, Roger M.; Harbo, Peter N.; He Xiangqun; Karovska, Margarita; Kashyap, Vinay L.; Davis, John E.; Houck, John C.; Hall, Diane M.

    2010-01-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  10. The Chandra Source Catalog

    Science.gov (United States)

    Evans, Ian N.; Primini, Francis A.; Glotfelty, Kenny J.; Anderson, Craig S.; Bonaventura, Nina R.; Chen, Judy C.; Davis, John E.; Doe, Stephen M.; Evans, Janet D.; Fabbiano, Giuseppina; Galle, Elizabeth C.; Gibbs, Danny G., II; Grier, John D.; Hain, Roger M.; Hall, Diane M.; Harbo, Peter N.; He, Xiangqun Helen; Houck, John C.; Karovska, Margarita; Kashyap, Vinay L.; Lauer, Jennifer; McCollough, Michael L.; McDowell, Jonathan C.; Miller, Joseph B.; Mitschang, Arik W.; Morgan, Douglas L.; Mossman, Amy E.; Nichols, Joy S.; Nowak, Michael A.; Plummer, David A.; Refsdal, Brian L.; Rots, Arnold H.; Siemiginowska, Aneta; Sundheim, Beth A.; Tibbetts, Michael S.; Van Stone, David W.; Winkelman, Sherry L.; Zografou, Panagoula

    2010-07-01

    The Chandra Source Catalog (CSC) is a general purpose virtual X-ray astrophysics facility that provides access to a carefully selected set of generally useful quantities for individual X-ray sources, and is designed to satisfy the needs of a broad-based group of scientists, including those who may be less familiar with astronomical data analysis in the X-ray regime. The first release of the CSC includes information about 94,676 distinct X-ray sources detected in a subset of public Advanced CCD Imaging Spectrometer imaging observations from roughly the first eight years of the Chandra mission. This release of the catalog includes point and compact sources with observed spatial extents ≲30''. The catalog (1) provides access to the best estimates of the X-ray source properties for detected sources, with good scientific fidelity, and directly supports scientific analysis using the individual source data; (2) facilitates analysis of a wide range of statistical properties for classes of X-ray sources; and (3) provides efficient access to calibrated observational data and ancillary data products for individual X-ray sources, so that users can perform detailed further analysis using existing tools. The catalog includes real X-ray sources detected with flux estimates that are at least 3 times their estimated 1σ uncertainties in at least one energy band, while maintaining the number of spurious sources at a level of ≲1 false source per field for a 100 ks observation. For each detected source, the CSC provides commonly tabulated quantities, including source position, extent, multi-band fluxes, hardness ratios, and variability statistics, derived from the observations in which the source is detected. In addition to these traditional catalog elements, for each X-ray source the CSC includes an extensive set of file-based data products that can be manipulated interactively, including source images, event lists, light curves, and spectra from each observation in which a

  11. Management of 'orphan' sources

    International Nuclear Information System (INIS)

    Telleria, D.; Spano, F.; Rudelli, M.D.

    1998-01-01

    Experience has shown that most accidents with severe radiological consequences take place when radioactive sources are beyond the control system. In Argentina, the primary framework in radiological safety was established in the late fifties, with a non-prescriptive regulatory approach. For any application involving radioactive material, users must be authorised by the Authority, unless the application has been demonstrated to be exempt. The licensees are responsible for ensuring protection against the risk associated with exposure to radiation, and for the safety of radioactive sources. To obtain an authorisation, the applicant has to demonstrate to the Authority the knowledge and capability to carry out the application. Not only normal operating circumstances are considered, but also every conceivable accidental situation. Experience has also shown the existence of radioactive sources not attributable to an authorised user or installation, and therefore outside the primary control structure described above. These sources, from here on called 'orphan' sources, have several origins. The regulatory authority should therefore foresee mechanisms for the early detection and management of these sources, before an undesired consequence arises. To some extent, the deployment of multiple and varied organisations or procedures could be understood as a 'defence in depth' concept applied to control. (author)

  12. An inverse problem for a one-dimensional time-fractional diffusion problem

    KAUST Repository

    Jin, Bangti; Rundell, William

    2012-01-01

    We study an inverse problem of recovering a spatially varying potential term in a one-dimensional time-fractional diffusion equation from the flux measurements taken at a single fixed time corresponding to a given set of input sources. The unique

  13. A geographic distribution data set of biodiversity in Italian freshwaters

    Directory of Open Access Journals (Sweden)

    Angela Boggero

    2016-10-01

    Full Text Available We present a data set on the biodiversity of Italian freshwaters, including lakeshores and riverbanks, of natural (N=379: springs, streams and lakes) and artificial (N=11: fountains) sites. The data set belongs partly to the Italian Long Term Ecological Research network (LTER-Italy) and partly to LifeWatch, the European e-Science infrastructure for biodiversity and ecosystem research. The data included cover a time period corresponding to the last fifty years (1962-2014). They span a large number of taxa, from prokaryotes and unicellular eukaryotes to vertebrates and plants, including taxa linked to the aquatic habitat in at least part of their life cycles (like immature stages of insects, amphibians, birds and vascular plants). The data set consists of 6463 occurrence data and distribution records for 1738 species. The complete data set is available in csv file format via the LifeWatch Service Centre.

  14. Estimating and correcting the amplitude radiation pattern of a virtual source

    NARCIS (Netherlands)

    Van der Neut, J.; Bakulin, A.

    2009-01-01

    In the virtual source (VS) method we crosscorrelate seismic recordings at two receivers to create a new data set as if one of these receivers were a virtual source and the other a receiver. We focus on the amplitudes and kinematics of VS data, generated by an array of active sources at the surface
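The crosscorrelation step of the VS method can be sketched numerically as follows. This is an illustrative toy (array shapes, names, and the single-impulse geometry are my assumptions, not from the paper), showing only the kinematic core: the sum over surface sources of crosscorrelations between the two receivers.

```python
import numpy as np

def virtual_source_trace(rec_a, rec_b):
    """Crosscorrelate the recordings at receiver A with those at receiver B,
    summing over the surface sources, to synthesize the trace that would be
    recorded at B if A acted as a (virtual) source.
    rec_a, rec_b: arrays of shape (n_sources, n_samples)."""
    n = rec_a.shape[1]
    vs = np.zeros(2 * n - 1)
    for trace_a, trace_b in zip(rec_a, rec_b):
        vs += np.correlate(trace_b, trace_a, mode="full")  # sum over sources
    return vs  # zero lag sits at index n - 1

# One source, an impulse arriving at A at sample 2 and at B at sample 5:
n = 8
rec_a = np.zeros((1, n)); rec_a[0, 2] = 1.0
rec_b = np.zeros((1, n)); rec_b[0, 5] = 1.0
vs = virtual_source_trace(rec_a, rec_b)
# The correlation peak appears at lag 3: the A-to-B traveltime survives,
# while the common source-to-A path cancels -- the kinematic heart of VS.
```

Recovering correct *amplitudes* (the radiation pattern of the virtual source), rather than just traveltimes, is exactly the harder problem this paper addresses; the toy above does not attempt that.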

  15. One-week triple therapy for eradication of helicobacter pylori

    International Nuclear Information System (INIS)

    Shah, N.H.; Shah, M.S.; Khan, I.; Hameed, K.

    2002-01-01

    Objective: The optimum therapy for Helicobacter pylori infection is yet to be defined in Pakistan despite a high prevalence of Helicobacter-associated diseases in this community. The most popular and effective regimen among the currently recommended combinations used worldwide was therefore chosen to document its efficacy in our symptomatic Helicobacter-positive dyspeptic patients. Design: It was a prospective, non-randomized study. Place and duration of Study: The study lasted from January 1998 till June 1999 at the Postgraduate Institute, Government Lady Reading Hospital and Fauji Foundation Hospital, Peshawar. Subjects and Methods: Consecutive dyspeptic patients with peptic ulcer disease as well as non-ulcer dyspepsia, with a positive H. pylori status on histology from specimens obtained from the antral region of the stomach, who consented to take part in the study were enrolled. They were given omeprazole 20 mg bd, clarithromycin 500 mg bd, and amoxycillin 1 gm bd for one week. One group comprised patients with confirmed peptic ulcer disease while the second group comprised patients with macroscopic/microscopic antral gastritis. Patients with peptic ulcer disease were given an additional course of omeprazole for another 4 weeks to ensure healing of their ulcers. All patients were re-scoped after stopping all drugs and their H. pylori status re-assessed on histology. Results: A total of 84 patients consented to enter the study. Fifty-nine were males and twenty-five were females. Fifty-eight patients completed the study while the others were lost to follow-up. There were no dropouts due to side effects of the drugs. Sixteen patients had peptic ulcer disease while 68 had macroscopic/microscopic active antral gastritis only. Helicobacter pylori eradication was successful in only 12 patients, giving a cure rate of 20.60% as determined by per-protocol analysis. The eradication rates were disappointingly low in both groups. Conclusion: The results are extremely

  16. Alternative Energy Sources

    CERN Document Server

    Michaelides, Efstathios E (Stathis)

    2012-01-01

    Alternative Energy Sources is designed to give the reader a clear view of the role each form of alternative energy may play in supplying the energy needs of human society in the near and intermediate future (20-50 years). The first two chapters, on energy demand and supply and on environmental effects, set the tone as to why the widespread use of alternative energy is essential for the future of human society. The third chapter exposes the reader to the laws of energy conversion processes, as well as the limitations of converting one energy form to another. The sections on exergy give a succinct, quantitative background on the capability/potential of each energy source to produce power on a global scale. The fourth, fifth and sixth chapters are expositions of fission and fusion nuclear energy. The following five chapters (seventh to eleventh) include detailed descriptions of the most common renewable energy sources – wind, solar, geothermal, biomass, hydroelectric – and some of the less common sources...

  17. Women in Management - A Movement from the Fifties to the New Millennium

    OpenAIRE

    Parikh Indira J; Engineer Mahrukh

    2002-01-01

    "Women in Management - A Movement from the Fifties to the New Millennium" views the evolution and changes that have occurred from the 1950s onwards and looks at new opportunities for women managers in the new millennium. Women in management are coming of age. The transformation of the Indian woman from an enigmatic figure, covered in meters of fabric, to today's educated, successful and accomplished professional has not been without great personal sacrifices. These are women who have broken t...

  18. The Henderson Question? The Melbourne Institute and fifty years of welfare policy

    OpenAIRE

    R. G. Gregory

    2013-01-01

    We discuss selected research contributions of the Melbourne Institute of Applied Economics and Social Research to fifty years of welfare policy for those of workforce age, focusing particularly on R. F. Henderson, the inaugural director. Following the spirit of his 1960s poverty research, the government, in the mid-1970s, doubled unemployment allowances in real terms and increased pensions by approximately forty per cent. Both income support payments were to be indexed by av...

  19. Nuclear magnetic resonance evaluation of fifty children with cancer

    International Nuclear Information System (INIS)

    Cohen, M.D.; Klatte, E.C.; Smith, J.A.; Carr, B.E.; Martin-Simmerman, P.

    1985-01-01

    Fifty children with cancer have been studied by MR. The patients studied include ten with lymphoma, nine with neuroblastoma, five with rhabdomyosarcoma, six with leukemia, five with Ewing's sarcoma, four with Wilms' tumor and several with other miscellaneous tumors. The results of scanning show that MR is well tolerated by children. Primary tumor has been identified in every case. Metastases have been identified in many patients. MR has proved helpful in identifying the organ of origin of a tumor. Because of excellent vessel visualization it is helpful in planning surgical resection of a tumor. In addition, in a number of patients MR has proved helpful in monitoring the response of tumor to nonsurgical therapy. With continued improvement in image quality it is believed that MR has a major role to play in pediatric tumor imaging

  20. UpSet: Visualization of Intersecting Sets

    Science.gov (United States)

    Lex, Alexander; Gehlenborg, Nils; Strobelt, Hendrik; Vuillemot, Romain; Pfister, Hanspeter

    2016-01-01

    Understanding relationships between sets is an important analysis task that has received widespread attention in the visualization community. The major challenge in this context is the combinatorial explosion of the number of set intersections if the number of sets exceeds a trivial threshold. In this paper we introduce UpSet, a novel visualization technique for the quantitative analysis of sets, their intersections, and aggregates of intersections. UpSet is focused on creating task-driven aggregates, communicating the size and properties of aggregates and intersections, and on the duality between the visualization of the elements in a dataset and their set membership. UpSet visualizes set intersections in a matrix layout and introduces aggregates based on groupings and queries. The matrix layout enables the effective representation of associated data, such as the number of elements in the aggregates and intersections, as well as additional summary statistics derived from subset or element attributes. Sorting according to various measures enables a task-driven analysis of relevant intersections and aggregates. The elements represented in the sets and their associated attributes are visualized in a separate view. Queries based on containment in specific intersections or aggregates, or driven by attribute filters, are propagated between both views. We also introduce several advanced visual encodings and interaction methods to overcome the problems of varying scales and to address scalability. UpSet is web-based and open source. We demonstrate its general utility in multiple use cases from various domains. PMID:26356912
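The exclusive-intersection aggregation underlying UpSet's matrix layout can be illustrated in a few lines (a minimal Python sketch, not the authors' implementation; the example sets are hypothetical):

```python
def upset_intersections(sets):
    """Sizes of exclusive set intersections, as shown in an UpSet matrix:
    each element is counted once, under the exact combination of sets
    that contain it (so the sizes sum to the size of the union)."""
    names = list(sets)
    universe = set().union(*sets.values())
    counts = {}
    for elem in universe:
        membership = frozenset(n for n in names if elem in sets[n])
        counts[membership] = counts.get(membership, 0) + 1
    # Sort by descending size, a common ordering for the intersection bar chart
    return sorted(counts.items(), key=lambda kv: -kv[1])

sets = {"A": {1, 2, 3, 4}, "B": {3, 4, 5}, "C": {4, 5, 6, 7}}
for combo, size in upset_intersections(sets):
    print(sorted(combo), size)
```

Note these are *exclusive* intersections (element 4 counts only toward A∩B∩C, not toward A, B, or C individually), which is what keeps the bar sizes additive despite the combinatorial explosion of possible combinations.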

  1. Exploring the Power of Heterogeneous Information Sources

    Science.gov (United States)

    2011-01-01

    A set of movies is derived from two information sources: movie genres and users. The genre information may indicate that two movies that are both “animations” are more similar than two movies one of which is an “animation” and one of which is a “romance” movie. Similarly, movies watched by the same set of users may be judged more similar. [Figure 6.2: example movies X1-The Lion King; X2-Toy Story; X3-Kungfu Panda; X4-Wall-E; X5-Casablanca; X6-Titanic; X7-The Notebook, grouped into “kids” and “grown-ups” sets.]

  2. One TV, One Price?

    OpenAIRE

    Jean Imbs; Haroon Mumtaz; Morten O. Ravn; Hélène Rey

    2009-01-01

    We use a unique dataset on television prices across European countries and regions to investigate the sources of differences in price levels. Our findings are as follows: (i) Quality is a crucial determinant of price differences. Even in an integrated economic zone such as Europe, rich economies tend to consume higher-quality goods. This effect accounts for the lion’s share of international price dispersion. (ii) Sizable international price differentials subsist even for the same television sets. ...

  3. The Proton Synchrotron, going strong at fifty years

    CERN Multimedia

    Django Manglunki

    It was on the evening of 24 November 1959 that an incredulous Hildred Blewett, on detachment to CERN from the Brookhaven laboratory, exclaimed “Yes! We’re through transition!” The first beam of ten billion protons had not only broken through the 5.2 GeV barrier but gone on all the way to 24 GeV, the machine’s top energy at that time.   An operational screenshot from the PS, taken on its 50th anniversary. The three white peaks depict different phases (cycles) of the PS’s operation. In the first and third cycles, the PS is producing a very low-intensity beam for LHC commissioning. In the second cycle, protons are being spilled out for use in the East Area. Fifty years ago the PS, the first strong-focusing proton synchrotron using alternating-gradient technology, began to circulate beams at an unprecedented level of energy. Over the years, a complex of linear and circular accelerators and storage rings grew up around the PS. In the mid-1990s ...

  4. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    Science.gov (United States)

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks, and risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which the Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City would have high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County would have high risk values in terms of agricultural pollution. Overall, the risk values of the northern regions close to the main stream and reservoir were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (i.e., level II), two at a moderate risk level (i.e., level III), one at a higher risk level (i.e., level IV) and three at the highest risk level (i.e., level V). Risks from industrial discharge are also higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
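The entropy weight step used to weight the risk indicators can be sketched as follows (a minimal numpy sketch of the standard formulation; the indicator matrix is hypothetical, and the k-means and set pair analysis stages of the paper's model are omitted):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicators whose values vary more across the
    assessed sources carry more information and receive larger weights.
    X: (n_sources, n_indicators) matrix of positive indicator values."""
    X = np.asarray(X, dtype=float)
    n, m = X.shape
    P = X / X.sum(axis=0)                     # column-wise proportions
    logP = np.where(P > 0, np.log(np.where(P > 0, P, 1.0)), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(n)   # entropy of each indicator, in [0, 1]
    d = 1.0 - e                               # degree of divergence
    return d / d.sum()                        # weights summing to 1

# Hypothetical data: 4 pollution sources x 3 risk indicators
X = np.array([[0.9, 10.0, 3.0],
              [0.5,  2.0, 3.1],
              [0.4,  1.0, 2.9],
              [0.2,  0.5, 3.0]])
w = entropy_weights(X)
risk = (X / X.max(axis=0)) @ w                # simple weighted risk score per source
```

The third indicator is nearly constant across sources, so it contributes little discriminating information and gets a near-zero weight.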

  5. Non-polar organic compounds as PM2.5 source tracers: Investigation of their sources and degradation in the Pearl River Delta, China

    Science.gov (United States)

    Wang, Q.; Feng, Y.; Huang, X. H. H.; Griffith, S.; Zhang, T.; Zhang, Q.; Wu, D.; Yu, J.

    2016-12-01

    Nonpolar organic compounds (NPOCs), including alkanes, polycyclic aromatic hydrocarbons (PAHs), hopanes, steranes, and 1,3,5-triphenylbenzene, were quantified in PM2.5 samples at four sites in the Pearl River Delta (PRD) region, China over a two-year period from 2011 to 2012. The four sites comprise an industrial zone (Nanhai), an urban site (Guangzhou), an urban outskirt (Dongguan) and a suburban location (Nansha). Some NPOCs are uniquely emitted from particular combustion sources and thereby serve as convenient markers in source apportionment. Based on this multi-year and multi-site data set, spatial and seasonal variations, correlation analysis and ratio-ratio plots were used to investigate the source information and degradation of NPOC tracers. In summer, NPOCs showed distinct local emission characteristics, with urban sites having much higher concentrations than the suburban site. In winter, regional transport was an important influence on NPOC levels, driving up concentrations at all sampling sites and diminishing the urban-suburban spatial gradient. The lighter NPOCs exhibited more prominent seasonal variations, suggesting their particle-phase abundance is more influenced by temperature, a critical factor controlling the extent to which semi-volatile organics partition into the aerosol phase. The heavier NPOCs, especially PAHs, showed negligible correlation among the four sites, suggesting more influence from local emissions. Ratio-ratio plots indicate photo-degradation and mixing of various sources for the NPOCs in the PRD. A positive matrix factorization (PMF) analysis of this large NPOC data set suggests that heavier NPOCs are more suitable source indicators than lighter NPOCs. Incorporating particle-phase light NPOC concentrations in PMF produces a separate factor, which primarily contains those light NPOCs and likely is not a source factor. Total NPOC concentrations predicted using Pankow partitioning theory were explored as PMF inputs, however, the PMF

  6. Technical and tactical performance indicators based on the outcome of the set in the school volleyball

    Directory of Open Access Journals (Sweden)

    Yago Pessoa da Costa

    2017-09-01

    Full Text Available The aim of the study was to identify and compare technical and tactical performance indicators based on the outcome of the set in school female volleyball. The study included 110 athletes, aged between 12 and 14 years, belonging to 11 teams. Fifty-eight sets of 28 games were filmed, and 7194 actions were analyzed: 2830 serves, 2157 serve receptions, 1358 passes and 1299 attacks. Afterwards, the game sets were divided into winning and losing sets. Teams that won their sets had an advantage in serve reception, setting and attack on the error and excellence/point criteria (p< 0.05) and in serve, setting and attack (p< 0.001). In conclusion, the winning sets were those with better technical-tactical performance, both quantitatively and qualitatively.

  7. Chevy Corvette: Icon Of American Life In The Fifties

    Directory of Open Access Journals (Sweden)

    Wishnoebroto Wishnoebroto

    2007-11-01

    Full Text Available Cars do not function simply as a means of transportation. Like paintings, the design of a car can represent a certain cultural and social phenomenon of a country. The design of the Chevrolet (Chevy) Corvette was very different compared to its competitors in the 50s. The size, engine, weight, and materials of this car were chosen on the assumption that speed and agility come before everything else. It was not surprising that in the 50s, when the first Corvette was designed and launched, the US was involved in a cold war with the Soviets. The arms race and the competition to be first were the major issues, and the Corvette was the first car to suggest this spirit. This paper tries to show the distinctiveness of the Corvette and how it can be used to explain the character of the American people in the fifties

  8. Human, animal and environmental contributors to antibiotic resistance in low-resource settings: integrating behavioural, epidemiological and One Health approaches.

    Science.gov (United States)

    Rousham, Emily K; Unicomb, Leanne; Islam, Mohammad Aminul

    2018-04-11

    Antibiotic resistance (ABR) is recognized as a One Health challenge because of the rapid emergence and dissemination of resistant bacteria and genes among humans, animals and the environment on a global scale. However, there is a paucity of research assessing ABR contemporaneously in humans, animals and the environment in low-resource settings. This critical review seeks to identify the extent of One Health research on ABR in low- and middle-income countries (LMICs). Existing research has highlighted hotspots for environmental contamination; food-animal production systems that are likely to harbour reservoirs or promote transmission of ABR; as well as high and increasing human rates of colonization with ABR commensal bacteria such as Escherichia coli. However, very few studies have integrated all three components of the One Health spectrum to understand the dynamics of transmission and the prevalence of community-acquired resistance in humans and animals. Microbiological, epidemiological and social science research is needed at community and population levels across the One Health spectrum in order to fill the large gaps in knowledge of ABR in low-resource settings. © 2018 The Author(s).

  9. Multi-Province Listeriosis Outbreak Linked to Contaminated Deli Meat Consumed Primarily in Institutional Settings, Canada, 2008.

    Science.gov (United States)

    Currie, Andrea; Farber, Jeffrey M; Nadon, Céline; Sharma, Davendra; Whitfield, Yvonne; Gaulin, Colette; Galanis, Eleni; Bekal, Sadjia; Flint, James; Tschetter, Lorelee; Pagotto, Franco; Lee, Brenda; Jamieson, Fred; Badiani, Tina; MacDonald, Diane; Ellis, Andrea; May-Hadford, Jennifer; McCormick, Rachel; Savelli, Carmen; Middleton, Dean; Allen, Vanessa; Tremblay, Francois-William; MacDougall, Laura; Hoang, Linda; Shyng, Sion; Everett, Doug; Chui, Linda; Louie, Marie; Bangura, Helen; Levett, Paul N; Wilkinson, Krista; Wylie, John; Reid, Janet; Major, Brian; Engel, Dave; Douey, Donna; Huszczynski, George; Di Lecci, Joe; Strazds, Judy; Rousseau, Josée; Ma, Kenneth; Isaac, Leah; Sierpinska, Urszula

    2015-08-01

    A multi-province outbreak of listeriosis occurred in Canada from June to November 2008. Fifty-seven persons were infected with 1 of 3 similar outbreak strains defined by pulsed-field gel electrophoresis, and 24 (42%) individuals died. Forty-one (72%) of 57 individuals were residents of long-term care facilities or hospital inpatients during their exposure period. Descriptive epidemiology, product traceback, and detection of the outbreak strains of Listeria monocytogenes in food samples and the plant environment confirmed delicatessen meat manufactured by one establishment and purchased primarily by institutions was the source of the outbreak. The food safety investigation identified a plant environment conducive to the introduction and proliferation of L. monocytogenes and persistently contaminated with Listeria spp. This outbreak demonstrated the need for improved listeriosis surveillance, strict control of L. monocytogenes in establishments producing ready-to-eat foods, and advice to vulnerable populations and institutions serving these populations regarding which high-risk foods to avoid.

  10. Management of colon wounds in the setting of damage control laparotomy: a cautionary tale.

    Science.gov (United States)

    Weinberg, Jordan A; Griffin, Russell L; Vandromme, Marianne J; Melton, Sherry M; George, Richard L; Reiff, Donald A; Kerby, Jeffrey D; Rue, Loring W

    2009-11-01

    Although colon wounds are commonly treated in the setting of damage control laparotomy (DCL), a paucity of data exists to guide management. The purpose of this study was to evaluate our experience with the management of colonic wounds in the context of DCL, using colonic wound outcomes after routine, single laparotomy (SL) as a benchmark. Consecutive patients during a 7-year period with full-thickness or devitalizing colon injury were identified, and early deaths were excluded. Colon-related complications (abscess, suture or staple leak, and stomal ischemia) were compared between those managed in the setting of DCL versus those managed by SL, both overall and as stratified by procedure (primary repair, resection and anastomosis, and resection and colostomy). One hundred fifty-seven patients met study criteria: 101 had undergone SL and 56 had undergone DCL. Comparison of DCL patients with SL patients was notable for a significant difference in colon-related complications (30% vs. 12%), particularly among those who underwent resection and anastomosis (DCL: 39% vs. SL: 18%). Management of colonic wounds in the setting of DCL is associated with a relatively high incidence of complications. The excessive incidence of leak overall, and the morbidity particular to resection and anastomosis, give us pause. Although stoma construction is not without its own complications in the setting of DCL, it may be the safer alternative.

  11. An Inverse Source Problem for a One-dimensional Wave Equation: An Observer-Based Approach

    KAUST Repository

    Asiri, Sharefa M.

    2013-05-25

    Observers are well known in the theory of dynamical systems. They are used to estimate the states of a system from some measurements. Recently, however, observers have also been developed to estimate unknowns of systems governed by partial differential equations. Our aim is to design an observer to solve the inverse source problem for a one-dimensional wave equation. Firstly, the problem is discretized in both space and time; then an adaptive observer based on partial field measurements (i.e., measurements taken from the solution of the wave equation) is applied to estimate both the states and the source. We see the effectiveness of this observer in both noise-free and noisy cases. In each case, numerical simulations are provided to illustrate the effectiveness of this approach. Finally, we compare the performance of the observer approach with the Tikhonov regularization approach.
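The Tikhonov baseline the thesis compares against can be illustrated on a small discretization (a self-contained sketch, not the thesis code; the grid sizes, the Gaussian source shape, the known time modulation g(t), and the sensor positions are all assumptions for illustration):

```python
import numpy as np

def forward(f, nx, nt, dt, dx, sensors):
    """Forward model: leapfrog scheme for u_tt = u_xx + f(x) g(t),
    u = 0 at both boundaries; returns the stacked sensor measurements."""
    g = np.exp(-200.0 * (np.arange(nt) * dt - 0.1) ** 2)  # assumed known in time
    r = (dt / dx) ** 2                                    # r <= 1 for stability
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    out = np.zeros((nt, len(sensors)))
    for n in range(nt):
        u_next = np.zeros(nx)
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r * (u[2:] - 2 * u[1:-1] + u[:-2])
                        + dt * dt * f[1:-1] * g[n])
        u_prev, u = u, u_next
        out[n] = u[sensors]
    return out.ravel()

nx, nt = 41, 300
dx = 1.0 / (nx - 1)
dt = 0.5 * dx                             # CFL-stable time step
sensors = [5, 20, 35]                     # hypothetical measurement locations
x = np.linspace(0.0, 1.0, nx)
f_true = np.exp(-80.0 * (x - 0.4) ** 2)   # the "unknown" spatial source

# The map f -> measurements is linear, so assemble it column by column
A = np.column_stack([forward(e, nx, nt, dt, dx, sensors) for e in np.eye(nx)])
y = A @ f_true                            # noise-free synthetic measurements

# Tikhonov-regularized least squares: min ||A f - y||^2 + lam ||f||^2
lam = 1e-8
f_hat = np.linalg.solve(A.T @ A + lam * np.eye(nx), A.T @ y)
```

The observer of the thesis instead estimates the state and source recursively as measurements arrive; the batch solve above is the regularization approach it is benchmarked against.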

  12. Orientation Estimation and Signal Reconstruction of a Directional Sound Source

    DEFF Research Database (Denmark)

    Guarato, Francesco

    Previous works in the literature about one-tone or broadband sound sources mainly deal with algorithms and methods developed in order to localize the source and, occasionally, estimate the source bearing angle (with respect to a global reference frame). The problem setting assumes, in these cases, omnidirectional receivers collecting the acoustic signal from the source: analysis of arrival times in the recordings together with microphone positions and source directivity cues allows one to get information about source position and bearing. Moreover, sound sources have been included into sensor systems together... Orientation estimates, one for each call emission, were compared to those calculated through a pre-existing technique based on interpolation of sound-pressure levels at microphone locations. The application of the method to the bat calls could provide knowledge on bat behaviour that may be useful for a bat-inspired sensor...

  13. Stratospheric Water and OzOne Satellite Homogenized (SWOOSH) data set

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Stratospheric Water and Ozone Satellite Homogenized (SWOOSH) data set is a merged record of stratospheric ozone and water vapor measurements taken by a number of...

  14. Pulsed negative hydrogen source for currents up to one ampere

    International Nuclear Information System (INIS)

    Prelec, K.; Sluyters, T.

    1975-01-01

    During the 2nd Symposium on Ion Sources and Formation of Ion Beams, the development of a Mk II pulsed double-slit magnetron source for the production of negative hydrogen ions was discussed. The source was capable of yielding beam currents up to 125 milliamperes, corresponding to current densities of 1.25 A/cm². In order to increase negative hydrogen beam intensities by an order of magnitude (this would be quite useful for initial high-energy neutral injector systems on Tokamaks), a larger, Mk III magnetron has been constructed, with the number of slits increased up to six. The idea was to utilize the plasma width in a more efficient way. In addition, such a source geometry will be more adaptable for beam formation and acceleration than single-slit structures. With three extraction slits, a negative hydrogen yield of 300 mA was obtained with current densities of 1.2 A/cm²; preliminary results with six extraction slits showed beam currents in excess of half an ampere with averaged current densities in excess of 0.75 A/cm². (U.S.)

  15. DIFFUSION - WRS system module number 7539 for solving a set of multigroup diffusion equations in one dimension

    International Nuclear Information System (INIS)

    Grimstone, M.J.

    1978-06-01

    The WRS Modular Programming System has been developed as a means by which programmes may be more efficiently constructed, maintained and modified. In this system a module is a self-contained unit typically composed of one or more Fortran routines, and a programme is constructed from a number of such modules. This report describes one WRS module, the function of which is to solve a set of multigroup diffusion equations for a system represented in one-dimensional plane, cylindrical or spherical geometry. The information given in this manual is of use both to the programmer wishing to incorporate the module in a programme, and to the user of such a programme. (author)
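The kind of problem the module solves reduces, for a single energy group in plane geometry, to a tridiagonal finite-difference system (an illustrative numpy sketch, not the WRS module itself; one group, uniform material properties, and flux set to zero at both boundaries are assumptions):

```python
import numpy as np

def diffusion_slab(D, sigma_a, S, L=10.0, n=201):
    """Solve -D phi'' + sigma_a * phi = S on (0, L) with phi(0) = phi(L) = 0,
    using second-order central differences on the interior nodes."""
    h = L / (n - 1)
    main = np.full(n - 2, 2.0 * D / h**2 + sigma_a)  # diagonal of the system
    off = np.full(n - 3, -D / h**2)                  # coupling to neighbours
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    phi = np.zeros(n)
    phi[1:-1] = np.linalg.solve(A, np.full(n - 2, S))
    return phi

# Uniform source; the analytic peak flux for this problem is
# (S/sigma_a) * (1 - 1/cosh(L / (2*sqrt(D/sigma_a)))) ~ 6.05 for these values
phi = diffusion_slab(D=1.0, sigma_a=0.1, S=1.0)
```

A multigroup solver iterates such one-group solves coupled through scattering and fission source terms; in production codes the tridiagonal system would be solved with a banded solver rather than a dense one.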

  16. Fifty years of entomological publications in the Revista de Biología Tropical.

    Science.gov (United States)

    Hanson, Paul

    2002-01-01

    Over its fifty-year history nearly twenty percent of the papers published in the Revista de Biología Tropical have been about insects and arachnids. In the 1950's papers on arthropods of medical importance were dominant, in the 1960's there was a proliferation of papers on bees, and in more recent years the subjects have become increasingly diverse. In terms of the nationality of contributing authors, the journal appears to have become increasingly international in later years.

  17. USE OF COMPOSITE DATA SETS FOR SOURCE-TRACKING ENTEROCOCCI IN THE WATER COLUMN AND SHORELINE INTERSTITIAL WATERS ON PENSACOLA BEACH, FL

    Science.gov (United States)

    Genthner, Fred J., Joseph B. James, Diane F. Yates and Stephanie D. Friedman. Submitted. Use of Composite Data Sets for Source-Tracking Enterococci in the Water Column and Shoreline Interstitial Waters on Pensacola Beach Florida. Mar. Pollut. Bull. 33 p. (ERL,GB 1212). So...

  18. L'etat, c'est moi. Fifty years of history and philosophy of evolutionary biology.

    Science.gov (United States)

    Ruse, Michael

    2016-01-01

    I reflect on my fifty-year history as a philosopher of biology, showing how it has taken me from rather narrow analytic studies, through the history of ideas, and now on to issues to do with science and religion. I argue that moral concerns were and still are a major motivation behind what I do and write. Copyright: © 2016 by Fabrizio Serra editore, Pisa · Roma.

  19. Special set linear algebra and special set fuzzy linear algebra

    OpenAIRE

    Kandasamy, W. B. Vasantha; Smarandache, Florentin; Ilanthenral, K.

    2009-01-01

    The authors in this book introduce the notions of special set linear algebra and special set fuzzy linear algebra, which extend the notions of set linear algebra and set fuzzy linear algebra. These concepts are best suited to applications in multi-expert models and cryptology. This book has five chapters. In chapter one the basic concepts of set linear algebra are given in order to make this book self-contained. The notion of special set linear algebra and their fuzzy analog...

  20. Temporal dynamics of conflict monitoring and the effects of one or two conflict sources on error-(related) negativity.

    Science.gov (United States)

    Armbrecht, Anne-Simone; Wöhrmann, Anne; Gibbons, Henning; Stahl, Jutta

    2010-09-01

    The present electrophysiological study investigated the temporal development of response conflict and the effects of diverging conflict sources on error(-related) negativity (Ne). Eighteen participants performed a combined stop-signal flanker task comprising two different conflict sources: a left-right response conflict and a go-stop response conflict. It is assumed that the Ne reflects the activity of a conflict monitoring system and thus increases according to (i) the number of conflict sources and (ii) the temporal development of the conflict activity. No increase of the Ne amplitude was found after double errors (comprising two conflict sources) as compared to hand errors and stop errors (comprising one conflict source), whereas a higher Ne amplitude was observed after a delayed stop-signal onset. The results suggest that the Ne is not sensitive to an increase in the number of conflict sources, but to the temporal dynamics of a go-stop response conflict. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  1. Source contributions to PM10 and arsenic concentrations in Central Chile using positive matrix factorization

    Science.gov (United States)

    Hedberg, Emma; Gidhagen, Lars; Johansson, Christer

    Sampling of particles (PM10) was conducted during a one-year period at two rural sites in Central Chile, Quillota and Linares, and the samples were analyzed for elemental composition. The data sets underwent source-receptor analyses in order to estimate the sources and their abundances in the PM10 size fraction, using the factor-analytical method positive matrix factorization (PMF). The analysis showed that PM10 was dominated by soil resuspension at both sites during the summer months, while during winter traffic dominated the particle mass at Quillota and local wood burning dominated the particle mass at Linares. Two copper smelters impacted the Quillota station, contributing on average 10% and 16% of PM10 during summer and winter, respectively. One smelter impacted Linares, with 8% and 19% of PM10 in the summer and winter, respectively. For arsenic, the two smelters accounted for 87% of the monitored arsenic levels at Quillota, and at Linares one smelter contributed 72% of the measured mass. In comparison with PMF, the use of a dispersion model tended to overestimate the smelter contribution to arsenic levels at both sites. The robustness of the PMF model was tested by using randomly reduced data sets, in which 85%, 70%, 50% and 33% of the samples were included. In this way the ability of the model to reconstruct the sources initially found with the original data set could be tested. On average over all sources, the relative standard deviation increased from 7% to 25% for the variables identifying the sources when the data set was decreased from 85% to 33% of the samples, indicating that the solution initially found was very stable. It was also noted, however, that sources due to industrial or combustion processes were more sensitive to the size of the data set than natural sources such as local soil and sea spray.
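The subsampling robustness check described above can be sketched generically (a numpy-only sketch using plain multiplicative-update NMF as a stand-in for PMF, which additionally weights residuals by measurement uncertainties and enforces other constraints; the synthetic source profiles are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, k, iters=500):
    """Non-negative factorization V ~ W H via Lee-Seung multiplicative updates:
    W holds sample contributions, H holds source (species) profiles."""
    n, m = V.shape
    W = rng.random((n, k)) + 0.1
    H = rng.random((k, m)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
        W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    return W, H

# Synthetic receptor data: 200 samples x 8 species mixed from two source profiles
H_true = np.array([[5., 1., 0., 2., 0., 1., 0., 1.],
                   [0., 2., 4., 0., 3., 0., 1., 1.]])
V = rng.random((200, 2)) @ H_true

W, H = nmf(V, k=2)
full_err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)

# Robustness protocol from the paper: refit on progressively smaller random
# subsets of samples and collect the normalized profiles to compare stability
profiles = []
for frac in (0.85, 0.70, 0.50, 0.33):
    idx = rng.choice(len(V), int(len(V) * frac), replace=False)
    _, H_sub = nmf(V[idx], k=2)
    profiles.append(H_sub / H_sub.sum(axis=1, keepdims=True))
```

Comparing the spread of `profiles` across subset sizes (after matching factors up to permutation) mirrors the paper's relative-standard-deviation measure of solution stability.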

  2. Dendrobium: Sources of Active Ingredients to Treat Age-Related Pathologies

    Science.gov (United States)

    Cakova, Veronika; Bonte, Frederic; Lobstein, Annelise

    2017-01-01

    Dendrobium represents one of the most important orchid genera, ornamentally and medicinally. Dendrobiums are sympodial epiphytic plants, worthy of their name, which comes from the Greek "dendros", tree, and "bios", life. Dendrobium species have been used for a thousand years as first-rate herbs in traditional Chinese medicine (TCM). They are a source of tonic, astringent, analgesic, antipyretic, and anti-inflammatory substances, and have been traditionally used as medicinal herbs in the treatment of a variety of disorders, such as nourishing the stomach, enhancing production of body fluids, or nourishing Yin. The Chinese consider Dendrobium one of the fifty fundamental herbs used to treat all kinds of ailments and use Dendrobium tonic for longevity. This review is focused on the main research conducted during the last decade (2006-2016) on Dendrobium plants and their constituents, which have been subjected to investigations of their pharmacological effects, involving anticancer, anti-diabetic, neuroprotective and immunomodulating activities, to report their undeniable potential for treating age-related pathologies. PMID:29344419

  3. Intercomparison of JAERI Torso Phantom lung sets

    International Nuclear Information System (INIS)

    Kramer, Gary H.; Hauck, Barry M.

    2000-01-01

    During the course of an IAEA-sponsored in vivo intercomparison using the JAERI phantom, the Human Monitoring Laboratory was able to intercompare thirteen lung sets made by three suppliers. One set consisted of sliced lungs with planar inserts containing different radionuclides. The others consisted of whole lung sets with the activity homogeneously distributed throughout the tissue-substitute material. Radionuclides in the study were: natural uranium, 3% enriched uranium, 241Am, 238Pu, 239Pu, 152Eu, and 232Th. Except for the 241Am (59.5 keV) and occasionally one of the 232Th (209 keV) photopeaks, the lung sets that had radioactivity homogeneously distributed throughout the tissue-equivalent lung material showed good agreement. The 241Am lung set gave a counting efficiency that appeared 25% too high for all overlay plate configurations. This was observed by other participants, and it exemplifies that the manufacture of tissue-substitute lung sets is still something of a black art. Despite all precautions, this lung set is either inhomogeneous or has had the wrong activity added. Heterogeneity can lead to an error in the activity estimate of a factor of three if the activity was severely localised due to improper mixing. A factor of 1.25, which appears to be the discrepancy, could easily be explained in this way. It will not be known for some time, however, what the true reason is, as the participants are still waiting for the destructive analysis of this lung set to determine the 'true' activity. The sliced lungs (241Am, 152Eu, and natural uranium) manufactured by the Human Monitoring Laboratory are in excellent agreement with the other lung sets. The advantages of sliced lung sets and planar sources are manifold. Activity can be distributed in a known and reproducible manner to mimic either a homogeneous or heterogeneous distribution in the lung. Short-lived radionuclides can be used. Cost is much less than purchasing or manufacturing lung sets that have the

  4. Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.

    Science.gov (United States)

    Houston, Lauren; Probst, Yasmine; Humphries, Allison

    2015-01-01

    Health data have long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if >5% of data variables were incorrect, a second 10% random sample would be extracted from the trial data set. Each variable was coded: correct, incorrect (valid or invalid), not recorded or not entered. Audit-1 had a total error of 33% and audit-2 of 36%. The physiological section was the only audit section to have <5% error. Data not recorded on case report forms had the greatest impact on error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether or not data were deemed correct or incorrect. Our study developed a straightforward method for performing an SDV audit. An audit rule was identified and error coding was implemented. The findings demonstrate that monitoring data quality by an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented for future research.
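    The audit procedure described above (a 10% random sample, per-variable error coding, and a >5% quality-assurance trigger) can be sketched as follows. The function names and the five-code scheme are illustrative, not taken from the paper's protocol:

```python
import random

# Illustrative error codes, loosely following the coding described above
CODES = ("correct", "incorrect_valid", "incorrect_invalid",
         "not_recorded", "not_entered")

def audit_sample(files, fraction=0.10, seed=0):
    """Draw a random sample of participant files for 100% manual verification."""
    rng = random.Random(seed)
    n = max(1, round(len(files) * fraction))
    return rng.sample(files, n)

def error_rate(coded_variables):
    """Fraction of audited variables not coded 'correct'."""
    errors = sum(1 for c in coded_variables if c != "correct")
    return errors / len(coded_variables)

def needs_second_audit(coded_variables, threshold=0.05):
    """Quality-assurance rule: >5% incorrect variables triggers a second sample."""
    return error_rate(coded_variables) > threshold

# Example: 100 participant files, audit a 10% sample
sample = audit_sample(list(range(100)))
```

    In a real audit the "files" would be participant records and the coding would be done by hand; this sketch only automates the sampling and the trigger rule.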

  5. Set theory essentials

    CERN Document Server

    Milewski, Emil G

    2012-01-01

    REA's Essentials provide quick and easy access to critical information in a variety of different fields, ranging from the most basic to the most advanced. As its name implies, these concise, comprehensive study guides summarize the essentials of the field covered. Essentials are helpful when preparing for exams, doing homework and will remain a lasting reference source for students, teachers, and professionals. Set Theory includes elementary logic, sets, relations, functions, denumerable and non-denumerable sets, cardinal numbers, Cantor's theorem, axiom of choice, and order relations.

  6. Outcomes of a Community-Based Paediatric Weight Management Programme in an Irish Midlands Setting

    LENUS (Irish Health Repository)

    Bennett, AE

    2018-02-01

    Ongoing investigation is needed into feasible approaches which reduce excess weight in childhood. This study aimed to assess the effectiveness of an adapted version of the Scottish Childhood Overweight Treatment Trial (SCOTT) in an Irish primary care setting. Families were offered monthly dietitian-led sessions for six months. These sessions targeted dietary habits, family meals, screen time and exercise. Of the 95 children (mean age 7.6 years) referred, 90.5% (n=86) were obese and 9.5% (n=9) were overweight. Fifty-one (53.7%) families opted into the programme from referral, and 18 completed the programme (64.7% attrition). Statistically significant reductions in body mass index (BMI) were observed between sessions one and six (25.7±4.2 kg/m² and 25.3±4.8 kg/m², respectively, p<0.01). BMI z-score modestly decreased by 0.2 (p=0.01). Despite these reductions, issues with programme referral, attrition and long-term effectiveness were evident. Further investigation into strategies which reduce paediatric overweight is warranted.

  7. Image-guided radiotherapy for fifty-eight patients with lung cancer

    International Nuclear Information System (INIS)

    Liang Jun; Zhang Tao; Wang Wenqin

    2009-01-01

    Objective: To study the value of image-guided radiotherapy (IGRT) in lung cancer. Methods: From Mar. 2007 to Dec. 2007, 58 patients with lung cancer were treated with IGRT. Set-up errors in each axial direction were calculated based on the IGRT images of each patient. The change of GTV was evaluated on both cone-beam CT and CT simulator images. Results: Twenty-two patients with left lung cancer, 30 with right lung cancer, 5 with mediastinal lymph node metastasis and one with vertebral metastasis were included. The set-up errors in the x, y and z axes were (0.02±0.26) cm, (0.14±0.49) cm and (-0.13±0.27) cm, respectively, while the rotary set-up errors were -0.15°±1.59°, -0.01°±1.50° and 0.12°±1.08°, respectively. The set-up errors were significantly decreased by using IGRT. GTV movement was observed in 15 patients (25.9%), including 5 with left upper lung cancer. GTV movement in the anterior direction was observed in 9 patients, including 4 with left upper lung cancer. GTV was reduced in 23 (44.2%) patients during treatment. Asymmetric GTV reduction of 22 lesions was observed, with a mean volume reduction of 4.9 cm³. When the GTV began to shrink, the delivered dose was 4-46 Gy, being 20-30 Gy in 9 patients. Conclusions: The use of IGRT can significantly reduce set-up errors. GTV movement and reduction are observed in some cases. The timing of target volume modification needs further study. (authors)

  8. Set operads in combinatorics and computer science

    CERN Document Server

    Méndez, Miguel A

    2015-01-01

    This monograph has two main objectives. The first is to give a self-contained exposition of the relevant facts about set operads in the context of combinatorial species and its operations. This approach has various advantages: one of them is that the definitions of the combinatorial operations on species (product, sum, substitution and derivative) are simple and natural. They were designed as the set-theoretical counterparts of the homonymous operations on exponential generating functions, giving immediate insight into their combinatorial meaning. The second objective is more ambitious. Before formulating it, the authors present a brief historical account of the sources of decomposition theory. For more than forty years, decompositions of discrete structures have been studied in different branches of discrete mathematics: combinatorial optimization, network and graph theory, switching design or boolean functions, simple multi-person games and clutters, etc.

  9. Physics Mining of Multi-source Data Sets, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to implement novel physics mining algorithms with analytical capabilities to derive diagnostic and prognostic numerical models from multi-source...

  10. New perspectives from new generations of neutron sources

    International Nuclear Information System (INIS)

    Mezei, F.

    2007-01-01

    Since the early fifties, a substantial fraction of the vital multidisciplinary progress in understanding condensed matter has been based on results of neutron scattering experiments. Neutron scattering is an inherently intensity-limited method, and after 50 years of considerable advance, primarily achieved by improving the scattering instruments, the maturation of the technique of pulsed spallation sources now opens the way to providing more neutrons with improved cost and energy efficiency. A quantitative analysis of the figure-of-merit of instruments specialized for pulsed-source operation shows that intensity gains of up to 2 orders of magnitude can be achieved in the next decade with the advent of high-power spallation sources. The first stations on this road, the MW-class short-pulse spallation sources SNS in the USA (under commissioning) and J-PARC in Japan (under construction), will be followed by the 5 MW long-pulse European Spallation Source (ESS). Further progress that can be envisaged in the longer term could amount to as much as another factor of 10 improvement. (author)

  11. Pulse explosion ion beam source with one pulse regime supply for surface modification of materials

    International Nuclear Information System (INIS)

    Korenev, S.A.

    1989-01-01

    A variant of an explosion-emission ion beam source with a single positive pulse supply for surface modification of materials is described. The ion source consists of a vacuum diode and a pulse generator of the Arcadiev-Marx type. The residual gas pressure in the diode was p∼5×10⁻⁵ Torr. The ion species is determined by the material of the anode plasma initiator: to produce carbon ions a carbon-fibre initiator is used, and for niobium and niobium-titanium ions a niobium-titanium cable with a copper matrix. The ion current density is regulated by changing the diode gap, in accordance with the Child-Langmuir law. For carbon ions the current density is j∼6 A/cm² at a voltage U∼100 kV and j∼32 A/cm² at U∼300 kV. 7 refs.; 1 fig
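    The Child-Langmuir scaling invoked above can be checked numerically. A minimal sketch, assuming singly charged carbon ions (mass 12 u) in an ideal planar diode; the 3 mm gap is an illustrative value, not one reported for this source:

```python
import math

EPS0 = 8.854e-12   # vacuum permittivity, F/m
Q_E = 1.602e-19    # elementary charge, C
AMU = 1.661e-27    # atomic mass unit, kg

def child_langmuir_j(voltage, gap, mass_amu=12.0, charge_state=1):
    """Space-charge-limited current density (A/m^2) of a planar diode:
    j = (4/9) * eps0 * sqrt(2q/m) * V^(3/2) / d^2."""
    q = charge_state * Q_E
    m = mass_amu * AMU
    return (4.0 / 9.0) * EPS0 * math.sqrt(2.0 * q / m) * voltage**1.5 / gap**2

# At a fixed gap the law predicts j proportional to V^(3/2):
ratio = child_langmuir_j(300e3, 3e-3) / child_langmuir_j(100e3, 3e-3)
# (300/100)^1.5 is about 5.2, consistent with the reported ~6 -> ~32 A/cm² rise
```

    For a gap of a few millimetres this formula gives j on the order of a few A/cm² at 100 kV, the same order as the quoted figures.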

  12. Age-related changes in the macula. A histopathological study of fifty Indian donor eyes

    OpenAIRE

    Biswas Jyotirmay; Raman Rajiv

    2002-01-01

    PURPOSE: Age-related macular degeneration (ARMD) is clinically less common in India compared to the West. Therefore, clinicians are unfamiliar with histopathologic evidence of age-related macular changes in the Indian population. METHODS: Fifty consecutive human donor eyes removed for corneal grafting were studied for gross, microscopic and histochemical features of age-related changes in the macula in the Indian population. A horizontal block was cut from the globe including the optic disc, ...

  13. Performance of one hundred watt HVM LPP-EUV source

    Science.gov (United States)

    Mizoguchi, Hakaru; Nakarai, Hiroaki; Abe, Tamotsu; Nowak, Krzysztof M.; Kawasuji, Yasufumi; Tanaka, Hiroshi; Watanabe, Yukio; Hori, Tsukasa; Kodama, Takeshi; Shiraishi, Yutaka; Yanagida, Tatsuya; Soumagne, Georg; Yamada, Tsuyoshi; Yamazaki, Taku; Okazaki, Shinji; Saitou, Takashi

    2015-03-01

    We have been developing a CO2-Sn LPP EUV light source, the most promising solution for a 13.5 nm high-power light source for HVM EUVL. Unique and original technologies developed at Gigaphoton Inc. include the combination of a pulsed CO2 laser with Sn droplets, dual-wavelength laser pulse shooting, and debris mitigation with a magnetic field. Theoretical and experimental data have clearly shown the advantage of our proposed strategy, and based on these data we are developing the first practical source for HVM, the "GL200E". These data indicate that 250 W of EUV power should be realizable with a pulsed CO2 laser at around the 20 kW level. We have reported engineering data from recent tests, such as about 43 W average clean power at CE=2.0% in 100 kHz operation, and other data [19]. We have already completed preparation of a CO2 laser with average output power above 20 kW in cooperation with Mitsubishi Electric Corporation [14]. Recently we achieved 92 W in 50 kHz, 50% duty cycle operation [20]. We report component-technology progress of the EUV light source system, including promising experimental data and simulation results for the magnetic mitigation system in the Proto #1 system. With the Proto #2 system we demonstrated: (1) 140 W in-burst emission at 70 kHz, 50% duty cycle for 10 minutes; (2) 118 W in-burst emission at 60 kHz, 70% duty cycle for 10 minutes; (3) 42 W in-burst emission at 20 kHz, 50% duty cycle (10000 pls/0.5 ms OFF) for 3 hours (110 Mpls). We also report construction of the Pilot #1 system. The final target is week-level operation at 250 W EUV power with CE=4% and more than 27 kW of CO2 laser power by the end of Q2 2015.
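    The in-burst figures above translate into time-averaged power via the duty cycle. A back-of-the-envelope sketch, assuming average power is simply in-burst power times duty cycle (my reading of the reported numbers, not a statement from the authors):

```python
def average_power(in_burst_w, duty_cycle):
    """Time-averaged power, assuming the source emits only during bursts."""
    return in_burst_w * duty_cycle

# (in-burst EUV power [W], duty cycle) for the three Proto #2 demonstrations:
demos = [(140.0, 0.50), (118.0, 0.70), (42.0, 0.50)]
averages = [average_power(p, d) for p, d in demos]
```

    Under this assumption the three operating points average to roughly 70 W, 83 W and 21 W, which is one way to compare burst-mode demonstrations against a continuous-power target.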

  14. A generic coordinate system and a set of generic variables for MFE database

    International Nuclear Information System (INIS)

    Miner, W.H. Jr.; Ross, D.W.; Solano, E.R.; Valanju, P.M.; Wiley, J.C.

    1993-01-01

    Over the last several years, profile data from nine different tokamaks have been stored in the magnetic fusion energy database (MFEDB). These data sets have come from a variety of sources and most are given in different coordinate systems. In order to attempt any intermachine analysis, it is convenient to transform these data sets into one generic coordinate system and to choose a uniform set of variable names. The authors describe the data sets from each tokamak indicating the source of the data and the coordinate system in which it is given. Next, they discuss the generic coordinate that has been adopted and show how it is implemented for each tokamak. Finally, the generic naming convention that has been adopted is discussed. It follows closely that which was used by Christiansen et al. for the ITER Global Energy Confinement H-Mode Database. For further clarification, they discuss the characteristics of the magnetic geometry given a Fourier representation of the magnetic equilibria

  15. Open source Matrix Product States: Opening ways to simulate entangled many-body quantum systems in one dimension

    Science.gov (United States)

    Jaschke, Daniel; Wall, Michael L.; Carr, Lincoln D.

    2018-04-01

    Numerical simulations are a powerful tool for studying quantum systems beyond the exactly solvable models that admit analytic expressions. For one-dimensional entangled quantum systems, tensor network methods, amongst them Matrix Product States (MPSs), have attracted interest from different fields of quantum physics, ranging from solid state systems to quantum simulators and quantum computing. Our open source MPS code provides the community with a toolset to analyze the statics and dynamics of one-dimensional quantum systems. Here, we present our open source library, Open Source Matrix Product States (OSMPS), of MPS methods implemented in Python and Fortran2003. The library includes tools for ground state calculation and for excited states via the variational ansatz. We also support ground states for infinite systems with translational invariance. Dynamics are simulated with different algorithms, including three algorithms with support for long-range interactions. Convenient features include built-in support for fermionic systems and number conservation with rotational U(1) and discrete Z2 symmetries for finite systems, as well as data parallelism with MPI. We explain the principles and techniques used in this library along with examples of how to efficiently use the general interfaces to analyze the Ising and Bose-Hubbard models. This description includes the preparation of simulations as well as their dispatching and post-processing.

  16. Fifty years of fat: news coverage of trends that predate obesity prevalence.

    Science.gov (United States)

    Davis, Brennan; Wansink, Brian

    2015-07-10

    Obesity prevalence has risen over the past fifty years. While people generally expect media mentions of health risks like obesity to follow health-risk trends, food consumption trends may precede obesity prevalence trends. This research therefore investigates whether media mentions of food predate obesity prevalence. Fifty years of non-advertising articles in the New York Times (and 17 years in The London Times) are coded for mentions of less healthy (5 salty and 5 sweet snacks) and healthy (5 fruits and 5 vegetables) food items by year, and then associated with annual obesity prevalence in subsequent years. Time-series generalized linear models test whether food-related mentions predate or postdate obesity prevalence in each country. United States obesity prevalence is positively associated with New York Times mentions of sweet snacks (b = 55.2, CI = 42.4 to 68.1, p = .000) and negatively associated with mentions of fruits (b = -71.28, CI = -91.5 to -51.1, p = .000) and vegetables (b = -13.6, CI = -17.5 to -9.6, p = .000). Similar results are found for the United Kingdom and The London Times. Importantly, the "obesity followed mentions" models are stronger than the "obesity preceded mentions" models. It may be possible to estimate a nation's future obesity prevalence (e.g., three years from now) based on how frequently national media mention sweet snacks (positively related) and vegetables or fruits (negatively related) today. This may provide public health officials and epidemiologists with new tools to more quickly assess the effectiveness of current obesity interventions.
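    The lead-lag question at the heart of the study (do mentions precede prevalence?) can be illustrated with a plain lagged correlation. This toy sketch is not the authors' time-series GLM, and the series below are synthetic:

```python
from statistics import mean, stdev

def pearson(x, y):
    """Sample Pearson correlation of two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

def lagged_corr(mentions, prevalence, lag):
    """Correlate mentions in year t with prevalence in year t+lag;
    lag > 0 tests whether mentions *precede* prevalence."""
    if lag == 0:
        return pearson(mentions, prevalence)
    return pearson(mentions[:-lag], prevalence[lag:])

# Synthetic series in which prevalence copies mentions three years later:
mentions = [1, 2, 3, 5, 8, 9, 9, 10, 12, 13]
prevalence = [0, 0, 0] + mentions[:-3]
```

    On this construction the lag-3 correlation is perfect while the contemporaneous (lag-0) correlation is weaker, which is the qualitative pattern the "obesity followed mentions" models capture.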

  17. Two denominators for one numerator: the example of neonatal mortality.

    Science.gov (United States)

    Harmon, Quaker E; Basso, Olga; Weinberg, Clarice R; Wilcox, Allen J

    2018-06-01

    Preterm delivery is one of the strongest predictors of neonatal mortality. A given exposure may increase neonatal mortality directly, or indirectly by increasing the risk of preterm birth. Efforts to assess these direct and indirect effects are complicated by the fact that neonatal mortality arises from two distinct denominators (i.e. two risk sets). One risk set comprises fetuses, susceptible to intrauterine pathologies (such as malformations or infection) that can result in neonatal death. The other risk set comprises live births, who (unlike fetuses) are susceptible to problems of immaturity and complications of delivery. In practice, fetal and neonatal sources of neonatal mortality cannot be separated, not only because of incomplete information, but because risks from both sources can act on the same newborn. We use simulations to assess the repercussions of this structural problem. We first construct a scenario in which fetal and neonatal factors contribute separately to neonatal mortality. We then introduce an exposure that increases the risk of preterm birth (and thus neonatal mortality) without affecting the two baseline sets of neonatal mortality risk, and calculate the apparent gestational-age-specific mortality for exposed and unexposed newborns, using as the denominator either fetuses or live births at a given gestational age. If conditioning on gestational age successfully blocked the mediating effect of preterm delivery, the exposure would have no effect on gestational-age-specific risk. Instead, we find apparent exposure effects with either denominator. Except for prediction, neither denominator provides a meaningful way to define gestational-age-specific neonatal mortality.

  18. Automatic classification of time-variable X-ray sources

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M. [Sydney Institute for Astronomy, School of Physics, The University of Sydney, Sydney, NSW 2006 (Australia)

    2014-05-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice, since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources whose features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% for a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest-derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.
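    The ensemble idea behind Random Forest (bootstrap resampling plus majority voting over many trees) can be sketched in miniature. This toy uses depth-1 stumps and omits Random Forest's per-split feature subsampling; none of it reflects the paper's actual pipeline or features:

```python
import random
from collections import Counter

def stump_fit(X, y):
    """Exhaustively fit the best single-feature threshold split (a depth-1 tree)."""
    best = (-1.0, 0, X[0][0], y[0], y[0])
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [lbl for row, lbl in zip(X, y) if row[f] <= t]
            right = [lbl for row, lbl in zip(X, y) if row[f] > t]
            lmaj = Counter(left).most_common(1)[0][0]
            rmaj = Counter(right).most_common(1)[0][0] if right else lmaj
            acc = (sum(l == lmaj for l in left) +
                   sum(r == rmaj for r in right)) / len(y)
            if acc > best[0]:
                best = (acc, f, t, lmaj, rmaj)
    return best[1:]   # (feature, threshold, left label, right label)

def forest_fit(X, y, n_trees=25, seed=0):
    """Train stumps on bootstrap resamples of the training set (bagging)."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(stump_fit([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def forest_predict(forest, row):
    """Majority vote across the ensemble."""
    votes = Counter(lmaj if row[f] <= t else rmaj for f, t, lmaj, rmaj in forest)
    return votes.most_common(1)[0][0]

# Toy separable data: feature 0 carries the class signal, feature 1 is noise
X = [[0.1, 5], [0.2, 3], [0.3, 4], [0.9, 5], [1.0, 3], [1.1, 4]]
y = [0, 0, 0, 1, 1, 1]
forest = forest_fit(X, y)
```

    In practice one would use a full Random Forest implementation (e.g. scikit-learn's `RandomForestClassifier`) rather than this sketch; the point here is only the bagging-plus-voting mechanism.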

  19. Automatic classification of time-variable X-ray sources

    International Nuclear Information System (INIS)

    Lo, Kitty K.; Farrell, Sean; Murphy, Tara; Gaensler, B. M.

    2014-01-01

    To maximize the discovery potential of future synoptic surveys, especially in the field of transient science, it will be necessary to use automatic classification to identify some of the astronomical sources. The data mining technique of supervised classification is suitable for this problem. Here, we present a supervised learning method to automatically classify variable X-ray sources in the Second XMM-Newton Serendipitous Source Catalog (2XMMi-DR2). Random Forest is our classifier of choice, since it is one of the most accurate learning algorithms available. Our training set consists of 873 variable sources whose features are derived from time series, spectra, and other multi-wavelength contextual information. The 10-fold cross-validation accuracy on the training data is ∼97% for a 7-class data set. We applied the trained classification model to 411 unknown variable 2XMM sources to produce a probabilistically classified catalog. Using the classification margin and the Random Forest-derived outlier measure, we identified 12 anomalous sources, of which 2XMM J180658.7–500250 appears to be the most unusual source in the sample. Its X-ray spectrum is suggestive of an ultraluminous X-ray source, but its variability makes it highly unusual. Machine-learned classification and anomaly detection will facilitate scientific discoveries in the era of all-sky surveys.

  20. Fifty years of mathematical physics selected works of Ludwig Faddeev

    CERN Document Server

    Faddeev, Ludwig; Niemi, Antti J

    2016-01-01

    This unique volume summarizes, with a historical perspective, several of the major scientific achievements of Ludwig Faddeev, with a foreword by Nobel Laureate C N Yang. The volume, which spans over fifty years of Faddeev's career, begins where he started his own scientific research, in the subject of scattering theory and the three-body problem. It then continues to describe Faddeev's contributions to automorphic functions, followed by an extensive account of his many fundamental contributions to quantum field theory, including his original article on ghosts with Popov. Faddeev's contributions to soliton theory and integrable models are then described, followed by a survey of his work on quantum groups. The final scientific section is devoted to Faddeev's contemporary research, including articles on his long-term interest in constructing knotted solitons and understanding confinement. The volume concludes with his personal view on science, and on mathematical physics in particular.

  1. Characterization of full set material constants of piezoelectric materials based on ultrasonic method and inverse impedance spectroscopy using only one sample.

    Science.gov (United States)

    Li, Shiyang; Zheng, Limei; Jiang, Wenhua; Sahul, Raffi; Gopalan, Venkatraman; Cao, Wenwu

    2013-09-14

    The most difficult task in the characterization of complete-set material properties for piezoelectric materials is self-consistency. Because there are many independent elastic, dielectric, and piezoelectric constants, several samples are needed to obtain the full set, and property variation from sample to sample often makes the obtained data set lack self-consistency. Here, we present a method, based on pulse-echo ultrasound and inverse impedance spectroscopy, to precisely determine the full set of physical properties of piezoelectric materials using only one small sample, which eliminates the sample-to-sample variation problem and guarantees self-consistency. The method has been applied to characterize a [001]c-poled Mn-modified 0.27Pb(In1/2Nb1/2)O3-0.46Pb(Mg1/3Nb2/3)O3-0.27PbTiO3 single crystal, and the validity of the measured data is confirmed by a previously established method. For the inverse calculations using the impedance spectrum, the stability of the reconstructed results is analyzed by fluctuation analysis of the input data. In contrast to conventional regression methods, our method takes full advantage of both the ultrasonic and inverse impedance spectroscopy methods to extract all constants from only one small sample. The method provides a powerful tool for assessing novel piezoelectric materials of small size and for generating the input data sets needed for device designs using finite element simulations.

  2. ARGO, 1-D Neutron Diffusion in Slab, Cylindrical, Spherical Geometry from JAERI Fast-Set, ABBN, RCBN

    International Nuclear Information System (INIS)

    Ikawa, Koji

    1971-01-01

    1 - Nature of physical problem solved: ARGO is a one-dimensional (slab, cylinder or sphere), multigroup diffusion code for use in fast reactor criticality and kinetic-parameter analysis. Three 25-group cross-section sets, i.e., the JAERI-Fast-Set, the ABBN-Set and the RCBN-Set, are prepared for the code as its library tapes. 2 - Method of solution: Eigenvalues are computed by ordinary source-iteration techniques with ordinary acceleration methods for convergence. 3 - Restrictions on the complexity of the problem: Sphere geometry

  3. Re-Setting Music Education's "Default Settings"

    Science.gov (United States)

    Regelski, Thomas A.

    2013-01-01

    This paper explores the effects and problems of one highly influential default setting of the "normal style template" of music education and proposes some alternatives. These do not require abandoning all traditional templates for school music. But re-setting the default settings does depend on reconsidering the promised function of…

  4. Irish Accents, Foreign Voices: Mediated Agency and Authenticity in In the Name of the Father and Fifty Dead Men Walking

    Directory of Open Access Journals (Sweden)

    Nicole Ives-Allison

    2013-06-01

    Full Text Available Given the intensity of narrative contestation over the public history of and discourse around the modern period of Northern Irish civil conflict known locally as 'the Troubles', for filmmakers from outside of Northern Ireland to be seen as making a legitimate contribution to existing debates, there is a pressure for their film texts to be read as 'authentic'. This desire for authenticity fundamentally shapes the narrative approach taken by these filmmakers. Various filmmaking strategies have been employed in the pursuit of authenticity, but both Jim Sheridan's In the Name of the Father (1993) and Kari Skogland's Fifty Dead Men Walking (2008) have taken a distinctly narrative approach, relying upon local written autobiographical material. However, the way in which Sheridan and Skogland have sought to deploy the authenticity embedded in locally grounded source material flirts with self-defeatism, as both films problematically obscure the limitations on agency imposed by the filmmakers on the local voices upon whom claims of authenticity, and thus the films' legitimacy, depend.

  5. 26 CFR 1.414(r)-4 - Qualified separate line of business-fifty-employee and notice requirements.

    Science.gov (United States)

    2010-04-01

    ...-employee and notice requirements. 1.414(r)-4 Section 1.414(r)-4 Internal Revenue INTERNAL REVENUE SERVICE... Bonus Plans, Etc. § 1.414(r)-4 Qualified separate line of business—fifty-employee and notice... business (as determined under § 1.414(r)-3) satisfies the 50-employee and notice requirements of § 1.414(r...

  6. Estimating and correcting the amplitude radiation pattern of a virtual source

    OpenAIRE

    Van der Neut, J.; Bakulin, A.

    2009-01-01

    In the virtual source (VS) method we crosscorrelate seismic recordings at two receivers to create a new data set as if one of these receivers were a virtual source and the other a receiver. We focus on the amplitudes and kinematics of VS data, generated by an array of active sources at the surface and recorded by an array of receivers in a borehole. The quality of the VS data depends on the radiation pattern of the virtual source, which in turn is controlled by the spatial aperture of the sur...
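    A toy illustration of the crosscorrelation step, assuming two spike-like traces; real VS processing sums correlations over the whole source array and handles amplitudes far more carefully than this:

```python
def best_lag(a, b):
    """Lag (in samples) maximizing the crosscorrelation sum_i a[i]*b[i+lag];
    a positive lag means the event arrives later on trace b than on trace a."""
    n = len(a)
    def xcorr(lag):
        return sum(a[i] * b[i + lag]
                   for i in range(max(0, -lag), min(n, n - lag)))
    return max(range(-n + 1, n), key=xcorr)

# Two traces carrying the same wavelet, 4 samples apart:
trace_a = [0.0] * 32
trace_b = [0.0] * 32
trace_a[9:12] = [1.0, 2.0, 1.0]
trace_b[13:16] = [1.0, 2.0, 1.0]
```

    The crosscorrelation peaks at the travel-time difference between the two receivers, which is the mechanism that lets the VS method redatum the wavefield to the downhole receiver.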

  7. Supply risk management functions of sourcing intermediaries – an investigation of the clothing industry

    DEFF Research Database (Denmark)

    Vedel, Mette; Ellegaard, Chris

    2013-01-01

    Purpose: The purpose of this research is to uncover how buying companies use sourcing intermediaries to manage supply risks in global sourcing. Design/methodology/approach: We carry out an explorative qualitative study of the clothing industry, interviewing key respondents that occupy different...... intermediary types, characterised by the set of functions they handle. Research limitations/implications: By analysing a limited set of in-depth interviews in one industry we have traded off broader analytical generalization for in-depth exploration and theory building. Therefore, future research should test...... by identifying the supply risk management functions that sourcing intermediaries carry out for buying companies. We also contribute by uncovering different types of sourcing intermediaries, determined by the collection of functions handled....

  8. Sparse Bayesian Learning for Nonstationary Data Sources

    Science.gov (United States)

    Fujimaki, Ryohei; Yairi, Takehisa; Machida, Kazuo

    This paper proposes an online Sparse Bayesian Learning (SBL) algorithm for modeling nonstationary data sources. Although most learning algorithms implicitly assume that a data source does not change over time (stationarity), real-world sources usually do change, due to various factors such as dynamically changing environments, device degradation, sudden failures, etc. (nonstationarity). The proposed algorithm reduces to stationary online SBL when its time-decay parameters are set to zero, and as such it can be interpreted as a single unified framework for online SBL with both stationary and nonstationary data sources. Tests on four types of benchmark problems and on actual stock price data have shown it to perform well.
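    The role of the time-decay parameter can be seen in a much simpler online estimator. This is only an analogy for the forgetting mechanism, not the SBL update itself, and the names below are mine:

```python
def forgetful_mean(stream, decay=0.0):
    """Online level estimate. decay > 0 down-weights older samples
    exponentially (nonstationary mode); decay = 0 recovers the plain
    running mean (stationary mode), mirroring the unified framework above."""
    lam = 1.0 - decay
    s = w = 0.0
    history = []
    for x in stream:
        s = lam * s + x      # exponentially decayed sum of samples
        w = lam * w + 1.0    # exponentially decayed sample count
        history.append(s / w)
    return history

# A source that jumps from level 0 to level 10 halfway through:
stream = [0.0] * 50 + [10.0] * 50
stationary = forgetful_mean(stream, decay=0.0)[-1]   # about 5: averages everything
adaptive = forgetful_mean(stream, decay=0.2)[-1]     # about 10: tracks the new level
```

    The same trade-off governs the SBL case: zero decay treats all history as equally relevant, while positive decay lets the model follow a drifting source at the cost of using fewer effective samples.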

  9. Simple one-pot aqueous synthesis of CdHgTe nanocrystals using sodium tellurite as the Te source

    International Nuclear Information System (INIS)

    Shen, Zhitao; Luo, Chunhua; Huang, Rong; Wang, Yiting; Peng, Hui; Travas-sejdic, Jadranka

    2014-01-01

    In this work, we systematically investigated the one-pot aqueous synthesis conditions of CdHgTe nanocrystals (NCs) using sodium tellurite (Na2TeO3) as the Te source, and found that the added content of Hg2+ and the initial pH value of the reaction solution significantly affected the photoluminescence quantum yield (PL QY) of the alloyed CdHgTe NCs. When the concentration of Cd was 1.0 mmol/L, the mole ratio of Cd/Te/Hg/MPA was 1:0.5:0.05:2.4, and the initial pH value of the reaction solution was about 8.78, the PL QY of the as-prepared CdHgTe NCs was up to 45%. Characterization by HRTEM and XRD confirmed the crystalline nature of the CdHgTe NCs. Compared to other synthetic approaches, our experimental results indicate that Na2TeO3 could be an attractive alternative Te source for directly synthesizing CdHgTe NCs in aqueous media. - Highlights: • A one-pot method was developed for the synthesis of highly luminescent CdHgTe nanocrystals (NCs). • Sodium tellurite was used as the Te source. • The quantum yield reached up to 45%. • The experimental conditions were optimized and the prepared CdHgTe NCs were characterized

  10. Sources of Cognitive Inflexibility in Set-Shifting Tasks: Insights Into Developmental Theories From Adult Data

    OpenAIRE

    Dick, Anthony Steven

    2012-01-01

    Two experiments examined the processes underlying cognitive inflexibility in set-shifting tasks typically used to assess the development of executive function in children. Adult participants performed a Flexible Item Selection Task (FIST) that requires shifting from categorizing by one dimension (e.g., color) to categorizing by a second, orthogonal dimension (e.g., shape). The experiments showed that performance of the FIST involves suppression of the representation of the ignored dimension; response t...

  11. [National Academy of Medicine, one hundred and fifty-fourth academic year].

    Science.gov (United States)

    Mansilla-Olivares, Armando

    2017-01-01

    The incalculable value treasured by the Academia Nacional de Medicina, and its enormous influence on medical, scientific and epidemiological decision-making, rest solely and exclusively on the mastery of knowledge, the scientific minds and the talent of each and every one of its members, from its foundation in 1864 to the present day.

  12. Simplified two and fifty-one region state-based EEIO model comparison

    Data.gov (United States)

    U.S. Environmental Protection Agency — Supporting data for 2 region and 51 region models assessed in the manuscript "Exploring the relevance of spatial scale to life cycle inventory results using...

  13. Utility of the American-European Consensus Group and American College of Rheumatology Classification Criteria for Sjögren's syndrome in patients with systemic autoimmune diseases in the clinical setting.

    Science.gov (United States)

    Hernández-Molina, Gabriela; Avila-Casado, Carmen; Nuñez-Alvarez, Carlos; Cárdenas-Velázquez, Francisco; Hernández-Hernández, Carlos; Luisa Calderillo, María; Marroquín, Verónica; Recillas-Gispert, Claudia; Romero-Díaz, Juanita; Sánchez-Guerrero, Jorge

    2015-03-01

    The aim of this study was to evaluate the feasibility and performance of the American-European Consensus Group (AECG) and ACR Classification Criteria for SS in patients with systemic autoimmune diseases. Three hundred and fifty patients with primary SS, SLE, RA or scleroderma were randomly selected from our patient registry. Each patient was clinically diagnosed as probable/definitive SS or non-SS following a standardized evaluation including clinical symptoms and manifestations, confirmatory tests, fluorescein staining test, autoantibodies, lip biopsy and medical chart review. Using the clinical diagnosis as the gold standard, the degree of agreement with each criteria set and between the criteria sets was estimated. One hundred fifty-four (44%) patients were diagnosed with SS. The AECG criteria were incomplete in 36 patients (10.3%) and the ACR criteria in 96 (27.4%; P vs 62.3 and a specificity of 94.3 vs 91.3, respectively. Either set of criteria was met by 123 patients (80%); 95 (61.7%) met the AECG criteria and 96 (62.3%) met the ACR criteria, but only 68 (44.2%) patients met both sets. The concordance rate between clinical diagnosis and AECG or ACR criteria was moderate (k statistic 0.58 and 0.55, respectively). Among 99 patients with definitive SS sensitivity was 83.3 vs 77.7 and specificity was 90.8 vs 85.6, respectively. A discrepancy between clinical diagnosis and criteria was seen in 59 patients (17%). The feasibility of the SS AECG criteria is superior to that of the ACR criteria, however, their performance was similar among patients with systemic autoimmune diseases. A subset of SS patients is still missed by both criteria sets. © The Author 2014. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. THE HIGHEST-ENERGY COSMIC RAYS CANNOT BE DOMINANTLY PROTONS FROM STEADY SOURCES

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Ke [Department of Astronomy, University of Maryland, College Park, MD 20742-2421 (United States); Kotera, Kumiko [Sorbonne Universités, UPMC Univ. Paris 6 et CNRS, UMR 7095, Institut d’Astrophysique de Paris, 98 bis bd Arago, F-75014 Paris (France)

    2016-11-20

    The bulk of observed ultrahigh-energy cosmic rays could be light or heavier elements and originate from either a steady or a transient population of sources. This leaves us with four general categories of sources. Energetic requirements set a lower limit on single-source luminosities, while the distribution of particle arrival directions in the sky sets a lower limit on the source number density. The latter constraint depends on the angular smearing in the skymap due to the magnetic deflections of the charged particles during their propagation from the source to the Earth. We contrast these limits with the luminosity functions from surveys of existing luminous steady objects in the nearby universe and strongly constrain one of the four categories of source models, namely, steady proton sources. The possibility that cosmic rays with energy >8 × 10¹⁹ eV are dominantly pure protons coming from steady sources is excluded at 95% confidence level, under the safe assumption that protons experience less than 30° magnetic deflection on flight.

  15. Simulation of fruit-set and trophic competition and optimization of yield advantages in six Capsicum cultivars using functional-structural plant modelling.

    Science.gov (United States)

    Ma, Y T; Wubs, A M; Mathieu, A; Heuvelink, E; Zhu, J Y; Hu, B G; Cournède, P H; de Reffye, P

    2011-04-01

    Many indeterminate plants can have wide fluctuations in the pattern of fruit-set and harvest. Fruit-set in these types of plants depends largely on the balance between source (assimilate supply) and sink strength (assimilate demand) within the plant. This study aims to evaluate the ability of functional-structural plant models to simulate different fruit-set patterns among Capsicum cultivars through source-sink relationships. A greenhouse experiment of six Capsicum cultivars characterized by different fruit weight and fruit-set was conducted. Fruit-set patterns and potential fruit sink strength were determined through measurement. Source and sink strength of other organs were determined via the GREENLAB model, with a description of plant organ weight and dimensions according to plant topological structure established from the measured data as inputs. Parameters were optimized using a generalized least squares method for the entire growth cycle. Fruit sink strength differed among cultivars. Vegetative sink strength was generally lower for large-fruited cultivars than for small-fruited ones. The larger the size of the fruit, the larger the variation in fruit-set and fruit yield. Large-fruited cultivars need a higher source-sink ratio for fruit-set, which means higher demand for assimilates. Temporal heterogeneity of fruit-set affected both number and yield of fruit. The simulation study showed that reduced heterogeneity of fruit-set could be obtained by different approaches: for example, increasing source strength; decreasing vegetative sink strength, the source-sink ratio for fruit-set and the flower appearance rate; and harvesting individual fruits earlier, before full ripeness. Simulation results showed that, when we increased source strength or decreased vegetative sink strength, fruit-set and fruit weight increased. However, no significant differences were found between large-fruited and small-fruited groups of cultivars regarding the effects of source

  16. Nutrition screening tools: does one size fit all? A systematic review of screening tools for the hospital setting.

    Science.gov (United States)

    van Bokhorst-de van der Schueren, Marian A E; Guaitoli, Patrícia Realino; Jansma, Elise P; de Vet, Henrica C W

    2014-02-01

    Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. A systematic review of English, French, German, Spanish, Portuguese and Dutch articles identified via MEDLINE, Cinahl and EMBASE (from inception to the 2nd of February 2012). Additional studies were identified by checking reference lists of identified manuscripts. Search terms included key words for malnutrition, screening or assessment instruments, and terms for hospital setting and adults. Data were extracted independently by 2 authors. Only studies expressing the (construct, criterion or predictive) validity of a tool were included. 83 studies (32 screening tools) were identified: 42 studies on construct or criterion validity versus a reference method and 51 studies on predictive validity on outcome (i.e. length of stay, mortality or complications). None of the tools performed consistently well to establish the patients' nutritional status. For the elderly, the MNA performed fair to good; for adults, the MUST performed fair to good. SGA, NRS-2002 and MUST performed well in predicting outcome in approximately half of the studies reviewed in adults, but not in older patients. No single screening or assessment tool is capable of both adequate nutrition screening and prediction of poor nutrition-related outcome. Development of new tools seems redundant and will most probably not lead to new insights. New studies comparing different tools within one patient population are required. Copyright © 2013 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.

  17. Effectiveness of locomotion training in a home visit preventive care project: one-group pre-intervention versus post-intervention design study.

    Science.gov (United States)

    Ito, Shinya; Hashimoto, Mari; Aduma, Saori; Yasumura, Seiji

    2015-11-01

    Locomotion training in a home visit-type preventive-care program has been reported elsewhere. However, continuation of appropriate exercises in a home setting is difficult, and few reports are available on locomotion training in a home setting. The objective of this study was to evaluate the effectiveness of locomotion training over 3 months in a home visit-type preventive-care program for improvement of motor function among elderly people. Nine hundred and fifty-eight elderly people in Tendo City in Japan who were not currently attending any preventive-care program were invited to participate in the study, and 87 were enrolled. In the pre-intervention and post-intervention assessments, we administered an interview survey (the Kihon Checklist), the timed one-leg standing test with eyes open and the sit-to-stand test, at the participants' homes. The intervention involved one set of training exercises with the participants standing on each leg for 1 min and squatting five or six times. The participants were asked to repeat one set of the exercises three times a day at home. In addition, the participants were regularly asked over the telephone about their performance of the exercises. Physical strength, cognitive function, and total scores of the Kihon Checklist were significantly lower after the intervention than before. In addition, the one-leg standing test time was significantly longer after the intervention (mean ± SD, 23.9 ± 35.4) than before (15.7 ± 20.5), and the sit-to-stand test time was significantly shorter after the intervention (13.0 ± 6.2) than before (14.8 ± 8.3). Locomotion training in a home-visit preventive-care program with telephone support effectively improved the motor function of elderly people who were not currently attending any preventive-care program organized by the long-term care insurance system.

  18. An Evolving Worldview: Making Open Source Easy

    Science.gov (United States)

    Rice, Z.

    2017-12-01

    NASA Worldview is an interactive interface for browsing full-resolution, global satellite imagery. Worldview supports an open data policy so that academia, private industry and the general public can use NASA's satellite data to address Earth science related issues. Worldview was open sourced in 2014. By shifting to an open source approach, the Worldview application has evolved to better serve end-users. Project developers are able to have discussions with end-users and community developers to understand issues and develop new features. Community developers are able to track upcoming features, collaborate on them and make their own contributions. Developers who discover issues are able to address those issues and submit a fix. This reduces the time it takes for a project developer to reproduce an issue or develop a new feature. Getting new developers to contribute to the project has been one of the most important and difficult aspects of open sourcing Worldview. After witnessing potential outside contributors struggle, we have focused on making the installation of Worldview simple, to reduce the initial learning curve and make contributing code easy. One way we have addressed this is through a simplified setup process. Our setup documentation includes a set of prerequisites and a set of straightforward commands to clone, configure, install and run. This presentation will emphasize our efforts to simplify and standardize Worldview's open source code so that more people are able to contribute. The more people who contribute, the better the application will become over time.

  19. One-way mode transmission in one-dimensional phononic crystal plates

    Science.gov (United States)

    Zhu, Xuefeng; Zou, Xinye; Liang, Bin; Cheng, Jianchun

    2010-12-01

    We investigate theoretically the band structures of one-dimensional phononic crystal (PC) plates with both antisymmetric and symmetric structures, and show how unidirectional transmission behavior can be obtained for either antisymmetric waves (A modes) or symmetric waves (S modes) by exploiting mode conversion and selection in the linear plate systems. The theoretical approach is illustrated for one PC plate example where unidirectional transmission behavior is obtained in certain frequency bands. Employing harmonic frequency analysis, we numerically demonstrate the one-way mode transmission for the PC plate with finite superlattice by calculating the steady-state displacement fields under an A-mode source (or S-mode source) in the forward and backward directions, respectively. The results show that incident waves from an A-mode source (or S-mode source) are transformed into S-mode waves (or A-mode waves) after passing through the superlattice in the forward direction, and the Lamb wave rejection in the backward direction is striking, with a power extinction ratio of more than 1000. The present structure can easily be extended to two-dimensional PC plates and should encourage practical studies of experimental realization, which we believe to be of much significance for one-way Lamb wave mode transmission.

  20. Estimation of influential points in any data set from coefficient of determination and its leave-one-out cross-validated counterpart.

    Science.gov (United States)

    Tóth, Gergely; Bodai, Zsolt; Héberger, Károly

    2013-10-01

    Coefficient of determination (R²) and its leave-one-out cross-validated analogue (denoted by Q² or R²cv) are the most frequently published values used to characterize the predictive performance of models. In this article we use R² and Q² in a reversed aspect to determine uncommon points, i.e. influential points, in any data set. The term (1 − Q²)/(1 − R²) corresponds to the ratio of the predictive residual sum of squares and the residual sum of squares. The ratio correlates with the number of influential points in experimental and random data sets. We propose an (approximate) F test on the (1 − Q²)/(1 − R²) term to quickly pre-estimate the presence of influential points in the training sets of models. The test is founded upon the routinely calculated Q² and R² values and warns model builders to verify the training set, to perform influence analysis or even to change to robust modeling.
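
    The diagnostic described in this abstract is directly computable. A minimal sketch for a simple least-squares line, using ordinary R² and a leave-one-out refit for Q² (not the authors' implementation):

```python
import numpy as np

def r2_q2_ratio(x, y):
    """Return R^2, leave-one-out Q^2 and the ratio (1 - Q^2)/(1 - R^2)
    for a simple least-squares line y = a*x + b."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = len(x)
    a, b = np.polyfit(x, y, 1)                     # full-data fit
    rss = np.sum((y - (a * x + b)) ** 2)           # residual sum of squares
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1.0 - rss / ss_tot
    # leave-one-out: refit without point i, predict the held-out y_i
    press = 0.0                                    # predictive residual SS
    for i in range(n):
        mask = np.arange(n) != i
        ai, bi = np.polyfit(x[mask], y[mask], 1)
        press += (y[i] - (ai * x[i] + bi)) ** 2
    q2 = 1.0 - press / ss_tot
    return r2, q2, (1.0 - q2) / (1.0 - r2)
```

    Since PRESS is never smaller than the residual sum of squares for least squares, the ratio is at least 1; influential points push it well above 1.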

  1. Sources of Stress among Undergraduate Students in the University of Benin, Benin City, Nigeria: Implications for Counselling

    Science.gov (United States)

    Alika, Ijeoma Henrietta

    2012-01-01

    The study examined the role of inadequate facilities/accommodation, poor health, emotional problems, socio-economic status and poor time management as sources of stress among University of Benin undergraduates. The research instrument used was a questionnaire. The survey method was adopted for the study. Seven hundred and fifty respondents were…

  2. Gauge evolution of elementary particles physics during the last fifty years

    International Nuclear Information System (INIS)

    Khodjaev, L.Sh

    2002-01-01

    The gauge evolution of elementary particle physics was marked by outstanding and exciting discoveries during the last fifty years of the 20th century. We review a new tendency in the development of modern elementary particle physics. The phenomenological basis for the formulation of the Standard Model is reviewed, and the Standard Model is formulated from its fundamental postulates. The concept of fundamental symmetries is introduced: one looks not for fundamental particles but for fundamental symmetries. The Standard Model is renormalizable and therefore potentially consistent at all energy scales. The Standard Model can in principle describe the properties of the Universe beginning 10⁻⁴³ s after the Big Bang. In the search for a more general theory, the obvious program is to look first of all for global symmetries and then study the consequences of localizing these global symmetries

  3. On The Computation Of The Best-fit Okada-type Tsunami Source

    Science.gov (United States)

    Miranda, J. M. A.; Luis, J. M. F.; Baptista, M. A.

    2017-12-01

    The forward simulation of earthquake-induced tsunamis usually assumes that the initial sea surface elevation mimics the co-seismic deformation of the ocean bottom described by a simple "Okada-type" source (rectangular fault with constant slip in a homogeneous elastic half space). This approach is highly effective, in particular in far-field conditions. With this assumption, and a given set of tsunami waveforms recorded by deep sea pressure sensors and/or coastal tide stations, it is possible to deduce the set of parameters of the Okada-type solution that best fits a set of sea level observations. To do this, we build a "space of possible tsunami sources" (solution space). Each solution consists of a combination of parameters: earthquake magnitude, length, width, slip, depth and angles (strike, rake, and dip). To constrain the number of possible solutions we use the earthquake parameters defined by seismology and establish a range of possible values for each parameter. We select the "best Okada source" by comparison of the results of direct tsunami modeling using the solution space of tsunami sources. However, direct tsunami modeling is a time-consuming process for the whole solution space. To overcome this problem, we use a precomputed database of Empirical Green Functions to compute the tsunami waveforms resulting from unit water sources and search which one best matches the observations. In this study, we use as a test case the Solomon Islands tsunami of 6 February 2013 caused by a magnitude 8.0 earthquake. The "best Okada" source is the solution that best matches the tsunami recorded at six DART stations in the area. We discuss the differences between the initial seismic solution and the final one obtained from tsunami data. This publication received funding of FCT-project UID/GEO/50019/2013-Instituto Dom Luiz.
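
    The Green-function search described above amounts to superposing precomputed unit-source waveforms and minimizing a misfit against the recorded signal. A toy illustration of the idea (function and variable names are hypothetical; a real search would span many gauges and full Okada parameter sets):

```python
import numpy as np

def best_fit_source(unit_waveforms, observed, candidates):
    """Pick the candidate source whose synthetic waveform best matches data.

    unit_waveforms -- (n_units, n_samples) precomputed Green functions,
                      one row per unit water source
    observed       -- (n_samples,) recorded waveform at one gauge
    candidates     -- iterable of (label, weights); weights (n_units,) say
                      how strongly a candidate source excites each unit source
    Returns (label, misfit) of the L2-best candidate.
    """
    best, best_misfit = None, np.inf
    for label, w in candidates:
        synthetic = w @ unit_waveforms            # linear superposition
        misfit = np.sum((synthetic - observed) ** 2)
        if misfit < best_misfit:
            best, best_misfit = label, misfit
    return best, best_misfit
```

    The linearity assumption is what makes the database approach cheap: each candidate costs one matrix-vector product instead of a full tsunami simulation.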

  4. Seasonal coefficient of performance for ground source heat pump and groundwater one in Białystok

    Science.gov (United States)

    Gajewski, Andrzej

    2017-11-01

    European Economic Area (EEA) states have declared that they will contain greenhouse gas emissions at 20% by 2020, whereas the European Union (EU) targets 40% before 2030, which encourages the application of low-carbon technologies. The Coefficient of Performance (COP) and Seasonal Coefficient of Performance (SCOPnet) are obtained using temperature measurements made by the Institute of Meteorology and Water Management - National Research Institute (IMGW-PIB) at the weather station in Bialystok over a ten-year period. The first variant is a ground source heat pump (GSHP) and the second one is a groundwater source heat pump (WSHP), which can optionally be equipped with a separating heat exchanger (SHE). In both cases heat is generated for the heating system only. Ground temperature is determined from the Baggs (1983) formula, using the adaptation to the Polish climate by Oleśkowicz-Popiel et al. (2002) and substituting the local constants obtained by Biernacka (2010). Water temperature in a groundwater basin is obtained from the Kowalski (2007) equation. The estimation is done for each hour of the heating season. All COP values are higher than the 3.5 required by the EU (2013). The SCOPnet values are 6.12, 5.86 and 5.03 for the WSHP, WSHP+SHE and GSHP, respectively. Since the WSHP needs only two boreholes, it is recommended for areas underlain by a groundwater basin.

  5. Seasonal coefficient of performance for ground source heat pump and groundwater one in Białystok

    Directory of Open Access Journals (Sweden)

    Gajewski Andrzej

    2017-01-01

    Full Text Available European Economic Area (EEA) states have declared that they will contain greenhouse gas emissions at 20% by 2020, whereas the European Union (EU) targets 40% before 2030, which encourages the application of low-carbon technologies. The Coefficient of Performance (COP) and Seasonal Coefficient of Performance (SCOPnet) are obtained using temperature measurements made by the Institute of Meteorology and Water Management – National Research Institute (IMGW-PIB) at the weather station in Bialystok over a ten-year period. The first variant is a ground source heat pump (GSHP) and the second one is a groundwater source heat pump (WSHP), which can optionally be equipped with a separating heat exchanger (SHE). In both cases heat is generated for the heating system only. Ground temperature is determined from the Baggs (1983) formula, using the adaptation to the Polish climate by Oleśkowicz-Popiel et al. (2002) and substituting the local constants obtained by Biernacka (2010). Water temperature in a groundwater basin is obtained from the Kowalski (2007) equation. The estimation is done for each hour of the heating season. All COP values are higher than the 3.5 required by the EU (2013). The SCOPnet values are 6.12, 5.86 and 5.03 for the WSHP, WSHP+SHE and GSHP, respectively. Since the WSHP needs only two boreholes, it is recommended for areas underlain by a groundwater basin.
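
    The abstract quotes SCOPnet values without the aggregation formula; a common net seasonal definition is total heat delivered divided by total electricity drawn, summed over every hour of the heating season. A minimal sketch under that assumption (not the authors' code):

```python
def seasonal_cop(heat_demand_kwh, cop_per_hour):
    """Net seasonal COP: total heat delivered over total electricity drawn.

    heat_demand_kwh -- heat delivered in each hour of the heating season [kWh]
    cop_per_hour    -- instantaneous COP in each of those hours
    """
    total_heat = sum(heat_demand_kwh)
    total_electricity = sum(q / cop for q, cop in zip(heat_demand_kwh, cop_per_hour))
    return total_heat / total_electricity
```

    Note that SCOPnet is a demand-weighted harmonic mean of the hourly COP, so cold hours with high demand and low COP pull it down more than mild hours pull it up.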

  6. Lloyd Berkner: Catalyst for Meteorology's Fabulous Fifties

    Science.gov (United States)

    Lewis, J. M.

    2002-05-01

    In the long sweep of meteorological history - from Aristotle's Meteorologica to the threshold of the third millennium - the 1950s will surely be recognized as a defining decade. The contributions of many individuals were responsible for the combination of vision and institution building that marked this decade and set the stage for explosive development during the subsequent forty years. In the minds of many individuals who were active during those early years, however, one name stands out as a prime mover par excellence: Lloyd Viel Berkner. On May 1, 1957, Berkner addressed the National Press Club. The address was entitled "Horizons of Meteorology". It reveals Berkner's insights into meteorology from his position as Chairman of the Committee on Meteorology of the National Academy of Sciences, soon to release the path-breaking report, Research and Education in Meteorology (1958). The address also reflects the viewpoint of an individual deeply involved in the International Geophysical Year (IGY). It is an important footnote to meteorological history. We welcome this opportunity to profile Berkner and to discuss "Horizons of Meteorology" in light of meteorology's state of affairs in the 1950s and the possible relevance of Berkner's ideas to contemporary issues.

  7. Health Research Governance: Introduction of a New Web-based Research Evaluation Model in Iran: One-decade Experience

    Science.gov (United States)

    MALEKZADEH, Reza; AKHONDZADEH, Shahin; EBADIFAR, Asghar; BARADARAN EFTEKHARI, Monir; OWLIA, Parviz; GHANEI, Mostafa; FALAHAT, Katayoun; HABIBI, Elham; SOBHANI, Zahra; DJALALINIA, Shirin; PAYKARI, Niloofar; MOJARRAB, Shahnaz; ELTEMASI, Masoumeh; LAALI, Reza

    2016-01-01

    Background: Governance is one of the main functions of a Health Research System (HRS) and consists of four essential elements, one of which is setting up an evaluation system. The goal of this study was to introduce a new web-based research evaluation model in Iran. Methods: Based on the main elements of governance, research indicators were clarified and, with the cooperation of a technical team, appropriate software was designed. The three main steps in this study were developing a mission-oriented program, creating an enabling environment, and setting up the Iran Research Medical Portal as a center for research evaluation. Results: Fifty-two universities of medical sciences, of three types, participated. After training the evaluation focal points in all medical universities, access for data entry and uploading of all documents was provided. With regard to the mission-based program, the contribution of medical universities to knowledge production was 60% for type one, 31% for type two and 9% for type three. The research priorities, based on the Essential National Health Research (ENHR) approach and the mosaic model, were gathered from universities of medical sciences and aggregated into nine main areas as national health research priorities. Ethical committees were established in all medical universities. Conclusion: The web-based research evaluation model is a comprehensive and integrated system for data collection in research. This system is an appropriate tool for national health research ranking. PMID:27957437

  8. A One-Dimensional Thermoelastic Problem due to a Moving Heat Source under Fractional Order Theory of Thermoelasticity

    Directory of Open Access Journals (Sweden)

    Tianhu He

    2014-01-01

    Full Text Available The dynamic response of a one-dimensional problem for a thermoelastic rod with finite length is investigated in the context of the fractional order theory of thermoelasticity in the present work. The rod is fixed at both ends and subjected to a moving heat source. The fractional order thermoelastic coupled governing equations for the rod are formulated. The Laplace transform, together with its numerical inversion, is applied to solve the governing equations. The variations of the considered temperature, displacement, and stress in the rod are obtained and demonstrated graphically. The effects of time, velocity of the moving heat source, and fractional order parameter on the distributions of the considered variables are examined and discussed in detail.

  9. Source apportionment of secondary organic aerosol in China using a regional source-oriented chemical transport model and two emission inventories.

    Science.gov (United States)

    Wang, Peng; Ying, Qi; Zhang, Hongliang; Hu, Jianlin; Lin, Yingchao; Mao, Hongjun

    2018-06-01

    A Community Multiscale Air Quality (CMAQ) model with source-oriented lumped SAPRC-11 (S11L) photochemical mechanism and secondary organic aerosol (SOA) module was applied to determine the contributions of anthropogenic and biogenic sources to SOA concentrations in China. A one-year simulation of 2013 using the Multi-resolution Emission Inventory for China (MEIC) shows that summer SOA are generally higher (10-15 μg m⁻³) due to large contributions of biogenic (country average 60%) and industrial sources (17%). In winter, SOA formation was mostly due to anthropogenic emissions from industries (40%) and residential sources (38%). Emissions from other countries in southeast China account for approximately 14% of the SOA in both summer and winter, and 46% in spring due to elevated open biomass burning in southeast Asia. The Regional Emission inventory in ASia v2.1 (REAS2) was applied in this study for January and August 2013. Two sets of simulations with the REAS2 inventory were conducted using two different methods to speciate total non-methane carbon into model species. One approach uses total non-methane hydrocarbon (NMHC) emissions and representative speciation profiles from the SPECIATE database. The other approach retains the REAS2 speciated species that can be directly mapped to S11L model species and uses source-specific splitting factors to map other REAS2 lumped NMHC species. Biogenic emissions are still the most significant contributor in summer based on these two sets of simulations. However, contributions from the transportation sector to SOA in January are predicted to be much more important based on the two REAS2 emission inventories (∼30-40% vs. ∼5% by MEIC), and contributions from residential sources according to REAS2 were much lower (∼21-24% vs. ∼42%). These discrepancies in source contributions to SOA need to be further investigated as the country seeks optimal emission control strategies to fight severe air pollution.
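
    The mapping of lumped inventory species onto model species via source-specific splitting factors can be sketched as a weighted fan-out. The species names and factors below are illustrative placeholders, not the actual REAS2 or SAPRC-11 tables:

```python
# Hypothetical splitting factors: fraction of each lumped inventory species
# assigned to each photochemical-mechanism model species (rows sum to 1).
SPLIT = {
    "LUMPED_AROMATICS": {"ARO1": 0.6, "ARO2": 0.4},
    "LUMPED_ALKENES":   {"OLE1": 0.7, "OLE2": 0.3},
}

def speciate(emissions):
    """Map lumped NMHC emissions (e.g. mol/h) onto model species.

    Species without a splitting entry are passed through unchanged,
    mirroring inventory species that map directly to model species.
    """
    out = {}
    for lumped, total in emissions.items():
        for model_sp, frac in SPLIT.get(lumped, {lumped: 1.0}).items():
            out[model_sp] = out.get(model_sp, 0.0) + frac * total
    return out
```

    In a real inventory the factors would differ by emission source sector, which is what makes the mapping "source-specific".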

  10. Socioeconomic rehabilitation of successful renal transplant patients and impact of funding source: Indian scenario.

    Science.gov (United States)

    Kapoor, Rakesh; Sharma, Raj Kumar; Srivastava, Aneesh; Kapoor, Rohit; Arora, Sohrab; Sureka, Sanjoy Kumar

    2015-01-01

    Socio-economic rehabilitation is an important outcome parameter in successful renal transplant recipients, particularly in developing countries with low-income patients who often depend on extraneous sources to fund their surgery costs. We studied the socioeconomic rehabilitation and changes in socioeconomic status (SES) of successful renal allograft recipients among Indian patients and its correlation with their source of funding for the surgery. A cross-sectional, questionnaire-based study was conducted on 183 patients between January 2010 and January 2013. Patients with a follow-up of at least 1 year after successful renal transplant were included. During the interview, two questionnaires were administered, one relating to the SES (including the source of funding) before transplantation and another relating to the same at the time of the interview. Changes in SES were categorized as improvement, stable or deterioration if the post-transplant SES score increased by >5%, changed by no more than 5%, or decreased by >5% of the pre-transplant value, respectively. In this cohort, 97 (52.7%), 67 (36.4%) and 19 (10.3%) patients were non-funded (self-funded), one-time funded and continuously funded, respectively. Fifty-six (30.4%) recipients had improvement in SES, whereas 89 (48.4%) and 38 (20.7%) recipients had deterioration and stable SES. Improvement in SES was seen in 68% of patients with continuous funding support, but in only 36% and 12% of patients with non-funded and one-time funding support, respectively (P = 0.001). A significant correlation was found (R = 0.715) between baseline socioeconomic strata and changes in SES after transplant. 70% of the patients with upper and upper middle class status had improving SES. Patients with middle class, lower middle and lower class status had deterioration of SES after transplant in 47.4%, 79.6% and 66.7% of patients, respectively. Most of the recipients from middle and lower social strata, which included more than 65% of our patient's population, had deteriorating SES even after a
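
    The ±5% categorization rule quoted in the abstract is simple to encode; a minimal sketch (the function name is ours, not the study's):

```python
def classify_ses_change(pre_score, post_score):
    """Label the change in socioeconomic status per the study's rule:
    'improvement' if the score rose by more than 5% of the pre-transplant
    value, 'deterioration' if it fell by more than 5%, else 'stable'."""
    change = (post_score - pre_score) / pre_score
    if change > 0.05:
        return "improvement"
    if change < -0.05:
        return "deterioration"
    return "stable"
```
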

  11. Tied Up In Knots: Irony, Ambiguity, and the 'Difficult' Pleasures of FIFTY SHADES OF GREY

    OpenAIRE

    McCulloch, Richard

    2016-01-01

    Upon its release in February 2015, Sam Taylor-Johnson's Fifty Shades of Grey met with vehement critical derision, finding itself reproached on both artistic and ideological grounds. While the precise focus of this indignation varied between reviews, complaints broadly fell into three categories: (1) it isn't titillating enough, (2) it is misogynistic, and (3) the love story and characters are clichéd and unrealistic. In this article, however, I demonstrate that the film is far more s...

  12. Celebrating fifty years of research and applications in reminiscence and life review: State of the art and new directions

    NARCIS (Netherlands)

    Westerhof, Gerben Johan; Bohlmeijer, Ernst Thomas

    2014-01-01

    Fifty years ago, psychiatrist Robert Butler (1963) published an influential article on the recollection and evaluation of personal memories in later life. We discuss the major insights and applications in psychological gerontology that were inspired by Butler. Reminiscence and life review serve to

  13. The anomaly of Marquesan ceramics : a fifty year retrospective

    International Nuclear Information System (INIS)

    Allen, M.S.; Dickinson, W.R.; Huebert, J.M.

    2012-01-01

    Fifty years ago pioneering archaeologist Robert Suggs reported a small number of pottery sherds from the Marquesas Islands. The first such finds in East Polynesia, at the time they were considered indicative of both a Marquesan homeland and local ceramic manufacture. In the intervening years, additional sherds have been recovered from three other Marquesan localities resulting in a total of 14 specimens. Prior petrographic studies demonstrate unambiguously that some derive from Fiji. Others have been interpreted historically as representative of an indigenous Marquesan ceramic industry. Here we bring together key petrographic analyses from Polynesia, recent chronological assessments of the Marquesan sequence, and insights from new field research to reassess the origins and chronology of Marquesan pottery. We suggest that there is little support for an indigenous Marquesan ceramic industry, and most likely all of the specimens are imports. With respect to the timing of ceramic arrivals, three hypotheses are explored: 1) with founding settlers, 2) as a component of long-distance exchange networks operating between the 12th to 16th centuries AD, or 3) as late prehistoric or historic imports. The preponderance of evidence points to the second alternative, although the other two cannot be completely discounted for the assemblage as a whole. (author). 63 refs., 5 figs., 3 tabs.

  14. Production of accelerated electrons near an electron source in the plasma resonance region

    International Nuclear Information System (INIS)

    Fedorov, V.A.

    1989-01-01

    The conditions for generation of accelerated plasma electrons, and their characteristics, in the vicinity of an electron source are determined. The source considered is electrically isolated, has a perfectly conducting surface, and is placed in unbounded collisionless plasma with ω₀ >> ν, where ω₀ is the plasma frequency of the unperturbed plasma and ν is the frequency of collisions of plasma electrons with other plasma particles. Electrons are injected spherically symmetrically from the source surface, at a rate modulated at frequency ω. The description of phenomena in the vicinity of the electron source proceeds from the quasihydrodynamic set of equations.

  15. Ion source

    International Nuclear Information System (INIS)

    1977-01-01

    The specifications of a set of point-shaped electrodes of non-corrodible material that can hold a film of liquid material of equal thickness are described. Contained in a jacket, this set forms an ion source. The electrode is made of tungsten with a glassy carbon layer for insulation and an outer layer of aluminium-oxide ceramic material.

  16. Upper gastrointestinal bleeding: Five-year experience from one centre

    Directory of Open Access Journals (Sweden)

    Jovanović Ivan

    2008-01-01

    Introduction: Acute upper gastrointestinal bleeding is the commonest emergency managed by gastroenterologists. Objective: To assess the frequency of erosive gastropathy and duodenal ulcer as causes of upper gastrointestinal (GI) bleeding, as well as their relation to age, gender and known risk factors. Method: We conducted a retrospective observational analysis of emergency endoscopy reports from the records of the Emergency Department of the Clinic for Gastroenterology and Hepatology, Clinical Centre of Serbia, during the period from 2000 to 2005. Data consisted of patients' demographics, endoscopic findings and potential risk factors. Results: During the period 2000-2005, three thousand nine hundred and fifty-four emergency upper endoscopies were performed for acute bleeding. In one quarter of cases, acute gastric erosions were the actual cause of bleeding. One half of them were associated with excessive consumption of salicylates and NSAIDs. In most of the examined cases, bleeding stopped spontaneously, while 7.6% of the cases required endoscopic intervention. Duodenal ulcer was detected as a source of bleeding in 1320 (33.4%) patients and was significantly associated with male gender (71.8%) and salicylate or NSAID abuse (59.1%) (χ²-test; p=0.007). Conclusion: Erosive gastropathy and duodenal ulcer represent a significant cause of upper gastrointestinal bleeding, accounting for up to 60% of all cases that required emergency endoscopy during the 5-year period. Consumption of NSAIDs and salicylates was associated more frequently with bleeding from a duodenal ulcer than with erosive gastropathy, leading to the conclusion that we must explore other causes of erosive gastropathy more thoroughly.

  17. Validation of the Comprehensive ICF Core Set for obstructive pulmonary diseases from the perspective of physiotherapists.

    Science.gov (United States)

    Rauch, Alexandra; Kirchberger, Inge; Stucki, Gerold; Cieza, Alarcos

    2009-12-01

    The 'Comprehensive ICF Core Set for obstructive pulmonary diseases' (OPD) is an application of the International Classification of Functioning, Disability and Health (ICF) and represents the typical spectrum of problems in functioning of patients with OPD. To optimize a multidisciplinary and patient-oriented approach in pulmonary rehabilitation, in which physiotherapy plays an important role, the ICF offers a standardized language and understanding of functioning. For it to be a useful tool for physiotherapists in the rehabilitation of patients with OPD, the objective of this study was to validate this Comprehensive ICF Core Set for OPD from the perspective of physiotherapists. A three-round Delphi survey of physiotherapists experienced in the treatment of OPD asked about the problems, resources and aspects of the environment of patients with OPD that physiotherapists treat in clinical practice (physiotherapy intervention categories). Responses were linked to the ICF and compared with the existing Comprehensive ICF Core Set for OPD. Fifty-one physiotherapists from 18 countries named 904 single terms that were linked to 124 ICF categories, 9 personal factors and 16 'not classified' concepts. The identified ICF categories were mainly third-level categories, compared with the mainly second-level categories of the Comprehensive ICF Core Set for OPD. Seventy of the ICF categories, all personal factors and 15 'not classified' concepts gained more than 75% agreement among the physiotherapists. Of these ICF categories, 55 (78.5%) were covered by the Comprehensive ICF Core Set for OPD. The validity of the Comprehensive ICF Core Set for OPD was largely supported by the physiotherapists. Nevertheless, the ICF categories that were not covered, the personal factors and the 'not classified' terms offer opportunities towards the final ICF Core Set for OPD and further research to strengthen physiotherapists' perspective in pulmonary rehabilitation.

  18. How Many Separable Sources? Model Selection In Independent Components Analysis

    Science.gov (United States)

    Woods, Roger P.; Hansen, Lars Kai; Strother, Stephen

    2015-01-01

    Unlike mixtures consisting solely of non-Gaussian sources, mixtures including two or more Gaussian components cannot be separated using standard independent components analysis methods that are based on higher order statistics and independent observations. The mixed Independent Components Analysis/Principal Components Analysis (mixed ICA/PCA) model described here accommodates one or more Gaussian components in the independent components analysis model and uses principal components analysis to characterize contributions from this inseparable Gaussian subspace. Information theory can then be used to select from among potential model categories with differing numbers of Gaussian components. Based on simulation studies, the assumptions and approximations underlying the Akaike Information Criterion do not hold in this setting, even with a very large number of observations. Cross-validation is a suitable, though computationally intensive alternative for model selection. Application of the algorithm is illustrated using Fisher's iris data set and Howells' craniometric data set. Mixed ICA/PCA is of potential interest in any field of scientific investigation where the authenticity of blindly separated non-Gaussian sources might otherwise be questionable. Failure of the Akaike Information Criterion in model selection also has relevance in traditional independent components analysis where all sources are assumed non-Gaussian. PMID:25811988

  19. One-dimensional Schroedinger operators with interactions singular on a discrete set

    International Nuclear Information System (INIS)

    Gesztesy, F.; Kirsch, W.

    We study the self-adjointness of Schroedinger operators -d²/dx² + V(x) on an arbitrary interval (a,b), with V(x) locally integrable on (a,b)\X, where X is a discrete set. The treatment of quantum mechanical systems describing point interactions or periodic (possibly strongly singular) potentials is thereby included, and explicit examples are presented. (orig.)

  20. Venice: Fifty years after the great flood of November 4, 1966

    Science.gov (United States)

    Rizzoli, P. M.

    2017-12-01

    Fifty years ago Venice and its lagoon suffered the most devastating flood in their millennial history. The causes of the increasingly recurring floods will be examined, namely the man-induced subsidence in the period 1925-1970 and the storm surges of the Adriatic Sea. The engineering solution designed for their protection, named the MOSE system, will be discussed in detail. The MOSE was started in 2003 and is near completion. It consists of four barriers, invisible in normal conditions, which will close the inlets to the lagoon under the prediction of a forthcoming flood. Finally, the perspective of the MOSE capability of protecting the city under scenarios of future global sea level rise will be assessed. This assessment must critically take into account that Venice and its lagoon are confined in the northernmost corner of the semi-enclosed, marginal Mediterranean Sea, for which the uncertainties of future sea level rise greatly exceed the uncertainties of the global averages.

  1. What makes a champion! over fifty extraordinary individuals share their insights

    CERN Document Server

    2014-01-01

    What drives great and successful individuals — be they athletes, artists, or scientists — or businesses, to achieve the extraordinary? Over fifty champions from all walks of life, brought together by Allan Snyder, draw on their experiences to explore the secrets of success in this inspiring, revealing and thought-provoking book. Hear from the authors what made a McDonald's branch become the most successful in the world; how a cottage business is catapulted into a world brand; how a visual artist's work crosses almost every medium imaginable; how an Ernst and Young setup becomes a top-notch employer; or why many geniuses or brilliant individuals never become champions, while many 'ordinary' individuals do; why many people don't know about their talent; what constitutes a champion outcome; and the neurological explanation for championship. Straddling academia and practitioners in all fields — government, entertainment, sports, business, arts, education, medicine, media — the authors include business...

  2. Comparison of content in phenolic compounds, polyphenol oxidase and peroxidase in grains of fifty sorghum cultivars from Burkina Faso.

    NARCIS (Netherlands)

    Dicko, M.H.; Hilhorst, M.H.; Gruppen, H.; Traore, A.S.; Laane, N.C.M.; Berkel, van W.J.H.; Voragen, A.G.J.

    2002-01-01

    Analysis of fifty sorghum [Sorghum bicolor (L.) Moench] varieties used in Burkina Faso showed that they have different contents of phenolic compounds, peroxidase (POX), and polyphenol oxidase (PPO). Most of the varieties (82%) had a tannin content less than 0.25% (w/w). POX specific activity was

  3. Linear finite element method for one-dimensional diffusion problems

    Energy Technology Data Exchange (ETDEWEB)

    Brandao, Michele A.; Dominguez, Dany S.; Iglesias, Susana M., E-mail: micheleabrandao@gmail.com, E-mail: dany@labbi.uesc.br, E-mail: smiglesias@uesc.br [Universidade Estadual de Santa Cruz (LCC/DCET/UESC), Ilheus, BA (Brazil). Departamento de Ciencias Exatas e Tecnologicas. Laboratorio de Computacao Cientifica

    2011-07-01

    We describe in this paper the fundamentals of the Linear Finite Element Method (LFEM) applied to one-speed diffusion problems in slab geometry. We present the mathematical formulation to solve eigenvalue and fixed-source problems. First, we discretize the calculation domain using a finite set of elements. At this point, we obtain the spatial balance equations for the zero-order and first-order spatial moments inside each element. Then, we introduce the linear auxiliary equations to approximate neutron flux and current inside the element and architect a numerical scheme to obtain the solution. We offer numerical results for typical fixed-source model problems to illustrate the method's accuracy for coarse-mesh calculations in homogeneous and heterogeneous domains. Also, we compare the accuracy and computational performance of the LFEM formulation with the conventional Finite Difference Method (FDM). (author)
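    For readers unfamiliar with the scheme, a minimal linear-FEM solver for a one-speed, fixed-source slab problem can be sketched as follows. Everything here (the mesh, cross sections, and the analytic check) is an illustrative toy, not taken from the paper:

```python
def solve_tridiag(sub, diag, sup, rhs):
    """Thomas algorithm for a tridiagonal linear system."""
    n = len(diag)
    cp = [0.0] * n
    dp = [0.0] * n
    cp[0] = sup[0] / diag[0] if n > 1 else 0.0
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - sub[i - 1] * cp[i - 1]
        cp[i] = sup[i] / m if i < n - 1 else 0.0
        dp[i] = (rhs[i] - sub[i - 1] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def fem_slab_diffusion(D, sig_a, S, L, n):
    """Linear finite elements for -D*phi'' + sig_a*phi = S on (0, L)
    with phi(0) = phi(L) = 0, on a uniform mesh of n elements."""
    h = L / n
    m = n - 1                                  # interior nodes
    diag = [2 * D / h + 2 * sig_a * h / 3] * m  # stiffness + consistent mass
    off = [-D / h + sig_a * h / 6] * m          # sub/super diagonal
    rhs = [S * h] * m                           # S*h/2 from each adjacent element
    phi = solve_tridiag(off, diag, off, rhs)
    return [0.0] + phi + [0.0]
```

    With D = 1, Σa = 1, S = 1 on a slab of width 2, the midline flux converges with mesh refinement to the analytic value (S/Σa)(1 − 1/cosh(1)) ≈ 0.352.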

  4. Sources of PCR-induced distortions in high-throughput sequencing data sets

    Science.gov (United States)

    Kebschull, Justus M.; Zador, Anthony M.

    2015-01-01

    PCR permits the exponential and sequence-specific amplification of DNA, even from minute starting quantities. PCR is a fundamental step in preparing DNA samples for high-throughput sequencing. However, there are errors associated with PCR-mediated amplification. Here we examine the effects of four important sources of error—bias, stochasticity, template switches and polymerase errors—on sequence representation in low-input next-generation sequencing libraries. We designed a pool of diverse PCR amplicons with a defined structure, and then used Illumina sequencing to search for signatures of each process. We further developed quantitative models for each process, and compared predictions of these models to our experimental data. We find that PCR stochasticity is the major force skewing sequence representation after amplification of a pool of unique DNA amplicons. Polymerase errors become very common in later cycles of PCR but have little impact on the overall sequence distribution as they are confined to small copy numbers. PCR template switches are rare and confined to low copy numbers. Our results provide a theoretical basis for removing distortions from high-throughput sequencing data. In addition, our findings on PCR stochasticity will have particular relevance to quantification of results from single cell sequencing, in which sequences are represented by only one or a few molecules. PMID:26187991
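    The dominant role of stochasticity at low input can be reproduced with a toy Galton-Watson model of PCR, in which each template molecule is duplicated with probability `eff` per cycle. The efficiencies, cycle counts, and copy numbers below are illustrative choices of mine, not parameters from the paper:

```python
import random

def amplify(n0, cycles, eff, rng):
    """Simulate PCR on one amplicon as a branching process:
    each molecule is copied with probability `eff` per cycle."""
    n = n0
    for _ in range(cycles):
        n += sum(1 for _ in range(n) if rng.random() < eff)
    return n

def cv(xs):
    """Coefficient of variation (std/mean) across amplicons."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return var ** 0.5 / mean

rng = random.Random(0)
# Libraries starting from single molecules vs. 50 molecules per amplicon:
low_input = [amplify(1, 10, 0.8, rng) for _ in range(200)]
high_input = [amplify(50, 10, 0.8, rng) for _ in range(100)]
# Chance duplications in the earliest cycles are locked in by the later
# exponential growth, so single-molecule inputs end up far more skewed.
```

    Comparing `cv(low_input)` with `cv(high_input)` shows the skew shrinking roughly as one over the square root of the starting copy number, which is why the effect matters most for single-cell sequencing.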

  5. Observation of extragalactic X-ray sources

    International Nuclear Information System (INIS)

    Bui-Van, Andre.

    1973-01-01

    A narrow angular resolution detection apparatus using a high performance collimator has proved particularly well suited for the programs of observation of X ray sources. The experimental set-up and its performance are described. One chapter deals with the particular problems involved in the observation of X ray sources with the aid of sounding balloons. The absorption of extraterrestrial photons by the earth atmosphere is taken into account in the procesing of the observation data using two methods of calculation: digital and with simulation techniques. The results of three balloon flights are then presented with the interpretation of the observations carried out using both thermal and non thermal emission models. This analysis leads to some possible characteristics of structure of the Perseus galaxy cluster [fr

  6. Set-fit effects in choice.

    Science.gov (United States)

    Evers, Ellen R K; Inbar, Yoel; Zeelenberg, Marcel

    2014-04-01

    In 4 experiments, we investigate how the "fit" of an item with a set of similar items affects choice. We find that people have a notion of a set that "fits" together--one where all items are the same, or all items differ, on salient attributes. One consequence of this notion is that in addition to preferences over the set's individual items, choice reflects set-fit. This leads to predictable shifts in preferences, sometimes even resulting in people choosing normatively inferior options over superior ones.

  7. Source allocation by least-squares hydrocarbon fingerprint matching

    Energy Technology Data Exchange (ETDEWEB)

    William A. Burns; Stephen M. Mudge; A. Edward Bence; Paul D. Boehm; John S. Brown; David S. Page; Keith R. Parker [W.A. Burns Consulting Services LLC, Houston, TX (United States)

    2006-11-01

    There has been much controversy regarding the origins of the natural polycyclic aromatic hydrocarbon (PAH) and chemical biomarker background in Prince William Sound (PWS), Alaska, site of the 1989 Exxon Valdez oil spill. Different authors have attributed the sources to various proportions of coal, natural seep oil, shales, and stream sediments. The different probable bioavailabilities of hydrocarbons from these various sources can affect environmental damage assessments from the spill. This study compares two different approaches to source apportionment with the same data (136 PAHs and biomarkers) and investigates whether increasing the number of coal source samples from one to six increases coal attributions. The constrained least-squares (CLS) source allocation method, which fits concentrations, meets geologic and chemical constraints better than partial least-squares (PLS), which predicts variance. The field data set was expanded to include coal samples reported by others, and CLS fits confirm earlier findings of low coal contributions to PWS. 15 refs., 5 figs.
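    The idea behind fitting concentrations under physical constraints can be illustrated with a toy two-source version of the problem: given fingerprints for two candidate sources, find the mixing fraction that minimizes squared error against a field sample. The fingerprints and the 25% fraction below are invented for illustration, not data from the study:

```python
def mix(f1, f2, a):
    """Linear two-source mixture: fraction a of f1, (1 - a) of f2."""
    return [a * x + (1 - a) * y for x, y in zip(f1, f2)]

def sse(u, v):
    """Sum of squared errors between two fingerprints."""
    return sum((x - y) ** 2 for x, y in zip(u, v))

# Hypothetical normalized fingerprints (e.g. relative PAH abundances).
coal = [0.50, 0.10, 0.30, 0.10]
seep_oil = [0.10, 0.40, 0.20, 0.30]
sample = mix(coal, seep_oil, 0.25)   # synthetic field sample: 25% coal

# Constrained fit: scan only the physically meaningful range 0 <= a <= 1.
err, best_a = min((sse(mix(coal, seep_oil, a / 1000), sample), a / 1000)
                  for a in range(1001))
print(best_a)  # recovers the 25% coal contribution
```

    Restricting the search to non-negative fractions that sum to one is the essence of the constraint; a variance-predicting method like PLS has no such guardrail, which is one reason the two approaches can disagree on coal attributions.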

  8. Modeling of Single Event Transients With Dual Double-Exponential Current Sources: Implications for Logic Cell Characterization

    Science.gov (United States)

    Black, Dolores A.; Robinson, William H.; Wilcox, Ian Z.; Limbrick, Daniel B.; Black, Jeffrey D.

    2015-08-01

    Single event effects (SEE) are a reliability concern for modern microelectronics. Bit corruptions can be caused by single event upsets (SEUs) in the storage cells or by sampling single event transients (SETs) from a logic path. An accurate prediction of soft error susceptibility from SETs requires good models to convert collected charge into compact descriptions of the current injection process. This paper describes a simple, yet effective, method to model the current waveform resulting from a charge collection event for SET circuit simulations. The model uses two double-exponential current sources in parallel, and the results illustrate why a conventional model based on one double-exponential source can be incomplete. A small set of logic cells with varying input conditions, drive strength, and output loading are simulated to extract the parameters for the dual double-exponential current sources. The parameters are based upon both the node capacitance and the restoring current (i.e., drive strength) of the logic cell.
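    A dual double-exponential injection source can be sketched as the sum of two independent pulses; the collected charge of each pulse is then I0*(τ_fall − τ_rise), so the total is just the sum. All amplitudes and time constants below are made-up placeholders, not extracted device parameters:

```python
import math

def double_exp(t, i0, tau_rise, tau_fall):
    """One double-exponential current pulse, zero for t < 0."""
    if t < 0.0:
        return 0.0
    return i0 * (math.exp(-t / tau_fall) - math.exp(-t / tau_rise))

def dual_double_exp(t, p1, p2):
    """Dual-source model: two double-exponential pulses in parallel."""
    return double_exp(t, *p1) + double_exp(t, *p2)

def charge(p1, p2, t_end, n=50000):
    """Collected charge: trapezoidal integral of the total current."""
    dt = t_end / n
    total = 0.0
    for k in range(n + 1):
        w = 0.5 if k in (0, n) else 1.0
        total += w * dual_double_exp(k * dt, p1, p2)
    return total * dt

# A fast 'prompt' spike plus a slower 'diffusion' tail (placeholder values).
prompt = (1.0e-3, 5.0e-12, 2.0e-10)   # 1 mA, 5 ps rise, 200 ps fall
tail = (2.0e-4, 5.0e-11, 1.0e-9)      # 0.2 mA, 50 ps rise, 1 ns fall
q_analytic = sum(i0 * (tf - tr) for i0, tr, tf in (prompt, tail))
q_numeric = charge(prompt, tail, 1.0e-8)
```

    Giving the fit two amplitudes and four time constants is what lets the model reproduce both the prompt spike and the slower tail of a charge-collection transient, which a single double-exponential source cannot capture simultaneously.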

  9. Fifty Years of HF Doppler Observations

    Directory of Open Access Journals (Sweden)

    T Ogawa

    2009-04-01

    High frequency Doppler observations of the ionosphere began in August of 1957 in Kyoto. The number of observation points worldwide was about 40 in 1980 and is about 20 at present. With this method, the movement of the ionospheric reflection height, and of the electron density below that height, can be observed. Such variations are caused by a wide variety of sources.

  10. The use of source memory to identify one's own episodic confusion errors.

    Science.gov (United States)

    Smith, S M; Tindell, D R; Pierce, B H; Gilliland, T R; Gerkens, D R

    2001-03-01

    In 4 category cued recall experiments, participants falsely recalled nonlist common members, a semantic confusion error. Errors were more likely if critical nonlist words were presented on an incidental task, causing source memory failures called episodic confusion errors. Participants could better identify the source of falsely recalled words if they had deeply processed the words on the incidental task. For deep but not shallow processing, participants could reliably include or exclude incidentally shown category members in recall. The illusion that critical items actually appeared on categorized lists was diminished but not eradicated when participants identified episodic confusion errors post hoc among their own recalled responses; participants often believed that critical items had been on both the incidental task and the study list. Improved source monitoring can potentially mitigate episodic (but not semantic) confusion errors.

  11. Source amplitudes for active exterior cloaking

    International Nuclear Information System (INIS)

    Norris, Andrew N; Amirkulova, Feruza A; Parnell, William J

    2012-01-01

    The active cloak comprises a discrete set of multipole sources that destructively interfere with an incident time harmonic scalar wave to produce zero total field over a finite spatial region. For a given number of sources and their positions in two dimensions it is shown that the multipole amplitudes can be expressed as infinite sums of the coefficients of the incident wave decomposed into regular Bessel functions. The field generated by the active sources vanishes in the infinite region exterior to a set of circles defined by the relative positions of the sources. The results provide a direct solution to the inverse problem of determining the source amplitudes. They also define a broad class of non-radiating discrete sources. (paper)

  12. Flare-up rate in molars with periapical radiolucency in one-visit vs two-visit endodontic treatment.

    Science.gov (United States)

    Akbar, Iftikhar; Iqbal, Azhar; Al-Omiri, Mahmoud K

    2013-05-01

    The objective of this study was to compare postobturation flare-ups following single and two-visit endodontic treatment of molar teeth with periapical radiolucency. A total of 100 patients with asymptomatic molar teeth with periapical radiolucency were selected. They were randomly allocated into two groups. Fifty patients received complete endodontic treatment in one-visit. Fifty patients received treatment by debridement and instrumentation at the first visit followed by obturation at the second visit. 10% of patients had flare-ups in the single visit group and 8% of patients had flare-ups in the two-visit group. Number of visits did not affect the success of endodontic treatment (p>0.05). Age, gender and tooth type had no effects on the occurrence of flare-ups regardless the number of visits (p>0.05). One-visit endodontic treatment was as successful as two-visit endodontic treatment as evaluated by rate of flareups in asymptomatic molar teeth with periapical radiolucency.
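    The reported p > 0.05 comparison can be reproduced with the standard 2x2 chi-square statistic. The counts below (5/50 vs 4/50 flare-ups) follow from the 10% and 8% rates in the abstract, but the code is my own worked example of the usual Pearson formula, not the study's analysis:

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# One-visit group: 5 flare-ups out of 50; two-visit group: 4 out of 50.
chi2 = chi_square_2x2(5, 45, 4, 46)
print(chi2 < 3.841)  # below the 5% critical value (1 d.f.), so p > 0.05
```

    With expected cell counts this small, a Yates-corrected chi-square or Fisher's exact test is often preferred in practice; either way, a difference of one flare-up in fifty is nowhere near significance.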

  13. Wind Power - A Power Source Enabled by Power Electronics

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Chen, Zhe

    2004-01-01

    The global electrical energy consumption is still rising and there is a steady demand to increase the power capacity. The production, distribution and use of energy should be as technologically efficient as possible, and incentives to save energy at the end-user should be set up. The deregulation of energy has lowered the investment in bigger power plants, which means the need for new electrical power sources may be very high in the near future. Two major technologies will play important roles in solving the future problems. One is to change the electrical power production sources from the conventional, fossil (and short-term) based energy sources to renewable energy sources. The other is to use highly efficient power electronics in power systems, power production and end-user applications. This paper discusses the most emerging renewable energy source, wind energy, which by means of power...

  14. X-ray system with coupled source drive and detector drive

    International Nuclear Information System (INIS)

    1976-01-01

    An electronic coupling replacing the (more expensive) mechanical coupling which controls the speed of two sets of two electric motors, one driving an X-ray source and the other an X-ray detector, is described. Source and detector are kept rotating in parallel planes with a fairly constant velocity ratio. The drives are controlled by an electronic system comprising a comparator circuit comparing the position-indicative signals, a process control circuit and an inverter switch. The control system regulates the speed of the electric motors. The signal processing is described

  15. Challenges in combining different data sets during analysis when using grounded theory.

    Science.gov (United States)

    Rintala, Tuula-Maria; Paavilainen, Eija; Astedt-Kurki, Päivi

    2014-05-01

    To describe the challenges in combining two data sets during grounded theory analysis. The use of grounded theory in nursing research is common. It is a suitable method for studying human action and interaction. It is recommended that many alternative sources of data are collected to create as rich a dataset as possible. Data from interviews with people with diabetes (n=19) and their family members (n=19). Combining two data sets. When using grounded theory, there are numerous challenges in collecting and managing data, especially for the novice researcher. One challenge is to combine different data sets during the analysis. There are many methodological textbooks about grounded theory but there is little written in the literature about combining different data sets. Discussion is needed on the management of data and the challenges of grounded theory. This article provides a means for combining different data sets in the grounded theory analysis process.

  16. A survey of tobacco dependence treatment guidelines in 121 countries

    Science.gov (United States)

    Piné-Abata, Hembadoon; McNeill, Ann; Raw, Martin; Bitton, Asaf; Rigotti, Nancy; Murray, Rachael

    2013-01-01

    Aims: To report progress among Parties to the World Health Organization (WHO) Framework Convention on Tobacco Control (FCTC) in developing national tobacco treatment guidelines in accordance with FCTC Article 14 guideline recommendations. Design: Cross-sectional study. Setting: Electronic survey from December 2011 to August 2012; participants were asked to complete either an online or attached Microsoft Word questionnaire. Participants: One hundred and sixty-three of the 173 Parties to the FCTC at the time of our survey. Measurements: The 51-item questionnaire contained 30 items specifically on guidelines. Questions covered the areas of guideline writing process, content, key recommendations and other characteristics. Findings: One hundred and twenty-one countries (73%) responded. Fifty-three countries (44%) had guidelines, ranging from 75% among high-income countries to 11% among low-income countries. Nearly all guidelines recommended brief advice (93%), intensive specialist support (93%) and medications (96%), while 66% recommended quitlines. Fifty-seven percent had a dissemination strategy, 76% stated a funding source and 68% had professional endorsement. Conclusion: Fewer than half of the Parties to the WHO FCTC have developed national tobacco treatment guidelines, but, where guidelines exist, they broadly follow FCTC Article 14 guideline recommendations. PMID:23437892

  17. Open Source Web Based Geospatial Processing with OMAR

    Directory of Open Access Journals (Sweden)

    Mark Lucas

    2009-01-01

    The availability of geospatial data sets is exploding. New satellites, aerial platforms, video feeds, global positioning system tagged digital photos, and traditional GIS information are dramatically increasing across the globe. These raw materials need to be dynamically processed, combined and correlated to generate value added information products to answer a wide range of questions. This article provides an overview of OMAR web based geospatial processing. OMAR is part of the Open Source Software Image Map project under the Open Source Geospatial Foundation. The primary contributors of OSSIM make their livings by providing professional services to US Government agencies and programs. OMAR provides one example that open source software solutions are increasingly being deployed in US government agencies. We will also summarize the capabilities of OMAR and its plans for near term development.

  18. Policy issues in setting de minimis standards for latent cancer risks of radiation and chemical carcinogens

    International Nuclear Information System (INIS)

    Spangler, M.

    1984-01-01

    In the fuel cycles for the development and utilization of alternative energy resources, the risk of latent cancer arises from a number of sources. Included are ionizing radiation and the carcinogenic potential of polluting chemicals present in certain fuels or in materials associated with the construction, operation, maintenance or waste treatment processes of nuclear power, fossil fuels, synfuels, biomass, and other sources of energy. One aspect of developing a carcinogen guideline policy for a consistent and effective regulatory regime to use in dealing with these assorted carcinogenic risks is the setting of de minimis quantitative standards. In this report, 11 policy issues related to the setting of such regulatory standards are identified and a brief commentary is provided. 15 references, 1 table

  19. Priority setting: what constitutes success? A conceptual framework for successful priority setting.

    Science.gov (United States)

    Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K

    2009-03-05

    The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and Delphi study including scholars and decision makers from five countries). This paper synthesizes the findings from three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of explicit process, information management, consideration of values and context, and revision or appeals mechanism. The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts.

  20. Fifty Years of Safeguards under the EURATOM Treaty. A Regulatory Review

    International Nuclear Information System (INIS)

    Patel, B.; Chare, P.

    2007-01-01

March 2007 marked the 50th anniversary of the signing of one of the founding treaties of the European Community. The EURATOM Treaty has its origins at a time when the stability of energy supplies in Europe was a major concern. Recently, much debate has centred on the possible reform or repeal of some parts of the treaty, given that its original aim was to promote and oversee the development of nuclear energy in Europe. This debate has focused attention on the future contribution of nuclear power to increasing energy demands in an enlarged Europe. However, despite these issues there is near universal agreement that the EURATOM Treaty has played a vital role in the protection of European citizens through the controls required for nuclear materials. Chapter 7 of the treaty (Safeguards) confers wide regulatory powers on the European Commission to ensure that civil nuclear materials are not diverted from their intended use as declared by the operators. This paper describes the early period of operation of the safeguards inspectorate, gives statistics on the numbers and types of inspections carried out by the EURATOM inspectors, and discusses from an operational point of view the value of inspection activities. Further, a critical appraisal of Articles 77-85 within Chapter 7 is made. The paper also considers those safeguards requirements that are important to strengthen in order to maintain a strong regulatory system to oversee future challenges, particularly in the context of increasing decommissioning activities within Europe. It is noteworthy that fifty years after the founding of the treaty, many of the concerns about security of energy supply have re-emerged. It is a measure of the vision and forward thinking of its founders that the treaty has successfully overseen the safe and secure development of nuclear power in Europe (which currently provides a third of its electricity needs) and despite the many changes and developments that have occurred, that the

  1. The State of Hawaii Department of Education Job Sharing Pilot Project.

    Science.gov (United States)

    Hawaii State Dept. of Education, Honolulu.

    Intended to test the feasibility of job-sharing in Hawaii's schools, this project was set up to provide job sharing of one hundred teaching positions on a fifty-fifty basis between experienced tenured teachers and new hires. The report describes the purpose and intent of the project; defines job sharing; establishes tentative guidelines for…

  2. 46 CFR 111.10-5 - Multiple energy sources.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Multiple energy sources. 111.10-5 Section 111.10-5...-GENERAL REQUIREMENTS Power Supply § 111.10-5 Multiple energy sources. Failure of any single generating set energy source such as a boiler, diesel, gas turbine, or steam turbine must not cause all generating sets...

  3. Energy sources and nuclear energy. Comparative analysis and ethical reflections

    International Nuclear Information System (INIS)

    Hoenraet, C.

    1999-01-01

Under the authority of the episcopacy of Brugge in Belgium, an independent working group, Ethics and Nuclear Energy, was set up. The purpose of the working group was to collect all the necessary information on existing energy sources and to carry out a comparative analysis of their impact on mankind and the environment. Attention was also paid to economic and social aspects. The results of the study are subjected to an ethical reflection. The book is aimed at politicians, teachers, journalists and every interested layman who wants to gain insight into the consequences of the use of nuclear energy and other energy sources. Based on the information in this book, one should be able to objectively define one's position in future debates on this subject

  4. Stable source reconstruction from a finite number of measurements in the multi-frequency inverse source problem

    DEFF Research Database (Denmark)

    Karamehmedovic, Mirza; Kirkeby, Adrian; Knudsen, Kim

    2018-01-01

setting: From measurements made at a finite set of frequencies we uniquely determine and reconstruct sources in a subspace spanned by finitely many Fourier-Bessel functions. Further, we obtain a constructive criterion for identifying a minimal set of measurement frequencies sufficient for reconstruction, and under an additional, mild assumption, the reconstruction method is shown to be stable. Our analysis is based on a singular value decomposition of the source-to-measurement forward operators and the distribution of positive zeros of the Bessel functions of the first kind. The reconstruction method
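The reconstruction in this record rests on a singular value decomposition of the forward operator. As an illustrative sketch only (the matrix below is a random stand-in, not the paper's Fourier-Bessel operator, and the sizes, noise level, and threshold are all invented), a truncated-SVD recovery of a source from finitely many measurements can look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretized forward map: measurements = A @ source.
# The diagonal scaling mimics decaying singular values, so we
# regularize by truncating small singular values before inverting.
n_meas, n_src = 40, 25
A = rng.standard_normal((n_meas, n_src)) @ np.diag(1.0 / (1 + np.arange(n_src))**2)

x_true = np.zeros(n_src)
x_true[[3, 7]] = [1.0, -0.5]                          # source on a few modes
b = A @ x_true + 1e-8 * rng.standard_normal(n_meas)   # slightly noisy data

# Truncated-SVD reconstruction: invert only singular values above a threshold.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = int(np.sum(s > 1e-6 * s[0]))                      # truncation level
x_rec = Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

err = np.linalg.norm(x_rec - x_true) / np.linalg.norm(x_true)
```

The truncation level k plays the role of a stability safeguard: singular values below the threshold are discarded rather than inverted, so measurement noise is not amplified.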

  5. The Efficient Utilization of Open Source Information

    Energy Technology Data Exchange (ETDEWEB)

    Baty, Samuel R. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Intelligence and Systems Analysis

    2016-08-11

    These are a set of slides on the efficient utilization of open source information. Open source information consists of a vast set of information from a variety of sources. Not only does the quantity of open source information pose a problem, the quality of such information can hinder efforts. To show this, two case studies are mentioned: Iran and North Korea, in order to see how open source information can be utilized. The huge breadth and depth of open source information can complicate an analysis, especially because open information has no guarantee of accuracy. Open source information can provide key insights either directly or indirectly: looking at supporting factors (flow of scientists, products and waste from mines, government budgets, etc.); direct factors (statements, tests, deployments). Fundamentally, it is the independent verification of information that allows for a more complete picture to be formed. Overlapping sources allow for more precise bounds on times, weights, temperatures, yields or other issues of interest in order to determine capability. Ultimately, a "good" answer almost never comes from an individual, but rather requires the utilization of a wide range of skill sets held by a team of people.

  6. Sets in Coq, Coq in Sets

    Directory of Open Access Journals (Sweden)

    Bruno Barras

    2010-01-01

Full Text Available This work is about formalizing models of various type theories of the Calculus of Constructions family. Here we focus on set theoretical models. The long-term goal is to build a formal set theoretical model of the Calculus of Inductive Constructions, so we can be sure that Coq is consistent with the language used by most mathematicians. One aspect of this work is to axiomatize several set theories: ZF, possibly with inaccessible cardinals, and HF, the theory of hereditarily finite sets. On top of these theories we have developed a piece of the usual set theoretical construction of functions, ordinals and fixpoint theory. We then proved sound several models of the Calculus of Constructions, its extension with an infinite hierarchy of universes, and its extension with the inductive type of natural numbers where recursion follows the type-based termination approach. The other aspect is to try and discharge (most of) these assumptions. The goal here is rather to compare the theoretical strengths of all these formalisms. As already noticed by Werner, the replacement axiom of ZF in its general form seems to require a type-theoretical axiom of choice (TTAC).

  7. Synchrotron light sources and free-electron lasers accelerator physics, instrumentation and science applications

    CERN Document Server

    Khan, Shaukat; Schneider, Jochen; Hastings, Jerome

    2016-01-01

Hardly any other discovery of the nineteenth century had such an impact on science and technology as Wilhelm Conrad Röntgen's seminal find of the X-rays. X-ray tubes soon made their way as excellent instruments for numerous applications in medicine, biology, materials science and testing, chemistry and public security. Developing new radiation sources with higher brilliance and much extended spectral range resulted in stunning developments like the electron synchrotron and electron storage ring and the free-electron laser. This handbook highlights these developments in fifty chapters. The reader is given not only an inside view of exciting science areas but also of design concepts for the most advanced light sources. The theory of synchrotron radiation and of the free-electron laser, design examples and the technology basis are presented. The handbook presents advanced concepts like seeding and harmonic generation, the booming field of Terahertz radiation sources and upcoming brilliant light sources dri...

  8. Validation and Comparison of One-Dimensional Ground Motion Methodologies

    International Nuclear Information System (INIS)

    B. Darragh; W. Silva; N. Gregor

    2006-01-01

Both point- and finite-source stochastic one-dimensional ground motion models, coupled to vertically propagating equivalent-linear shear-wave site response models, are validated using an extensive set of strong motion data as part of the Yucca Mountain Project. The validation and comparison exercises are presented entirely in terms of 5% damped pseudo absolute response spectra. The study consists of quantitative analyses involving modeling of nineteen well-recorded earthquakes, M 5.6 to 7.4, at over 600 sites. The sites range in distance from about 1 to about 200 km in the western US (460 km for the central-eastern US). In general, this validation demonstrates that the stochastic point- and finite-source models produce accurate predictions of strong ground motions over the range of 0 to 100 km and for magnitudes M 5.0 to 7.4. The stochastic finite-source model appears to be broadband, producing near zero bias from about 0.3 Hz (low frequency limit of the analyses) to the high frequency limit of the data (100 and 25 Hz for response and Fourier amplitude spectra, respectively)

  9. Validation and Comparison of One-Dimensional Ground Motion Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    B. Darragh; W. Silva; N. Gregor

    2006-06-28

Both point- and finite-source stochastic one-dimensional ground motion models, coupled to vertically propagating equivalent-linear shear-wave site response models, are validated using an extensive set of strong motion data as part of the Yucca Mountain Project. The validation and comparison exercises are presented entirely in terms of 5% damped pseudo absolute response spectra. The study consists of quantitative analyses involving modeling of nineteen well-recorded earthquakes, M 5.6 to 7.4, at over 600 sites. The sites range in distance from about 1 to about 200 km in the western US (460 km for the central-eastern US). In general, this validation demonstrates that the stochastic point- and finite-source models produce accurate predictions of strong ground motions over the range of 0 to 100 km and for magnitudes M 5.0 to 7.4. The stochastic finite-source model appears to be broadband, producing near zero bias from about 0.3 Hz (low frequency limit of the analyses) to the high frequency limit of the data (100 and 25 Hz for response and Fourier amplitude spectra, respectively).

  10. Radiography method comprising determination of corrected absorption values for members of sets of mutually inclined beam paths

    International Nuclear Information System (INIS)

    McLeMay, C.A.G.

    1978-01-01

An x-ray apparatus is described for examining a body, including a source of a fan of radiation and detectors of the radiation along beams in the fan. The beams are traversed in a plane to provide data for a number of sets of parallel beams in the plane. An orbital motion is provided to give data for further sets at different inclinations in the plane. The data can be processed by arrangements using such parallel sets. The orbital motion is continuous, but the extent of angular change in one lateral scan is kept small so that lack of parallelism in the parallel sets does not give excessive errors

  11. Identifying Cases of Type 2 Diabetes in Heterogeneous Data Sources: Strategy from the EMIF Project.

    Directory of Open Access Journals (Sweden)

    Giuseppe Roberto

Full Text Available Due to the heterogeneity of existing European sources of observational healthcare data, data source-tailored choices are needed to execute multi-data source, multi-national epidemiological studies. This makes transparent documentation paramount. In this proof-of-concept study, a novel standard data derivation procedure was tested in a set of heterogeneous data sources. Identification of subjects with type 2 diabetes (T2DM) was the test case. We included three primary care data sources (PCDs), three record linkage of administrative and/or registry data sources (RLDs), one hospital and one biobank. Overall, data from 12 million subjects from six European countries were extracted. Based on a shared event definition, sixteen standard algorithms (components) useful to identify T2DM cases were generated through a top-down/bottom-up iterative approach. Each component was based on one single data domain among diagnoses, drugs, diagnostic test utilization and laboratory results. Diagnoses-based components were subclassified considering the healthcare setting (primary, secondary, inpatient care). The Unified Medical Language System was used for semantic harmonization within data domains. Individual components were extracted and the proportion of the population identified was compared across data sources. Drug-based components performed similarly in RLDs and PCDs, unlike diagnoses-based components. Using components as building blocks, logical combinations with AND, OR, AND NOT were tested and local experts recommended their preferred data source-tailored combination. The population identified per data source by the resulting algorithms varied from 3.5% to 15.7%; however, age-specific results were fairly comparable. The impact of individual components was assessed: diagnoses-based components identified the majority of cases in PCDs (93-100%), while drug-based components were the main contributors in RLDs (81-100%). The proposed data derivation procedure allowed the

  12. A lossless one-pass sorting algorithm for symmetric three-dimensional gamma-ray data sets

    International Nuclear Information System (INIS)

    Brinkman, M.J.; Manatt, D.R.; Becker, J.A.; Henry, E.A.

    1992-01-01

    An algorithm for three-dimensional sorting and storing of the large data sets expected from the next generation of large gamma-ray detector arrays (i.e., EUROGAM, GAMMASPHERE) is presented. The algorithm allows the storage of realistic data sets on standard mass storage media. A discussion of an efficient implementation of the algorithm is provided with a proposed technique for exploiting its inherently parallel nature. (author). 5 refs., 2 figs
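A sketch of the idea behind symmetric storage, under my own assumptions (the canonical "tetrahedral" indexing below is a standard trick, not necessarily the authors' exact scheme): a triple-coincidence event is order-independent, so sorting the three channel indices folds the n³ cube into n(n+1)(n+2)/6 bins in a single pass, losslessly for symmetric data.

```python
def tetra_index(e1, e2, e3):
    """Canonical (order-independent) bin index for a symmetric 3-D histogram."""
    i, j, k = sorted((e1, e2, e3))
    return i + j * (j + 1) // 2 + k * (k + 1) * (k + 2) // 6

n = 16                                # channels per axis (toy value; real arrays use thousands)
size = n * (n + 1) * (n + 2) // 6     # bins needed, versus n**3 for the full cube
hist = [0] * size

# One pass over the event stream: each triple is folded into its canonical bin.
events = [(3, 1, 2), (2, 3, 1), (5, 5, 0), (15, 15, 15)]
for ev in events:
    hist[tetra_index(*ev)] += 1
```

For n = 16 this needs 816 bins instead of 4096; the saving grows with n, which is what makes realistic data sets fit on standard mass storage media.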

  13. A lossless one-pass sorting algorithm for symmetric three-dimensional gamma-ray data sets

    Energy Technology Data Exchange (ETDEWEB)

    Brinkman, M J; Manatt, D R; Becker, J A; Henry, E A [Lawrence Livermore National Lab., CA (United States)

    1992-08-01

    An algorithm for three-dimensional sorting and storing of the large data sets expected from the next generation of large gamma-ray detector arrays (i.e., EUROGAM, GAMMASPHERE) is presented. The algorithm allows the storage of realistic data sets on standard mass storage media. A discussion of an efficient implementation of the algorithm is provided with a proposed technique for exploiting its inherently parallel nature. (author). 5 refs., 2 figs.

  14. Thermal modeling of multi-shape heating sources on n-layer electronic board

    Directory of Open Access Journals (Sweden)

    Monier-Vinard Eric

    2017-01-01

Full Text Available The present work completes the toolbox of analytical solutions that deal with resolving steady-state temperatures of a multi-layered structure heated by one or many heat sources. The problem of heating sources having non-rectangular shapes is addressed to enlarge the capability of analytical approaches. Moreover, various heating sources can be located on the external surfaces of the sandwiched layers as well as embedded at the interfaces of its constitutive layers. To demonstrate its relevance, the updated analytical solution has been compared with numerical simulations in the case of a multi-layered electronic board submitted to a set of heating source configurations. The comparison shows high agreement between analytical and numerical calculations in predicting the centroid and average temperatures. The promoted analytical approach establishes a kit of practical expressions, easy to implement, which can be combined, using the superposition principle, to help electronic designers detect early any component or board temperatures beyond manufacturer limits. The ability to eliminate bad concept candidates with a minimum of set-up, relevant assumptions and low computation time can thus be easily achieved.
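The superposition step described above can be illustrated with a toy model. The Gaussian footprint below is only a placeholder for the paper's analytical per-layer solutions; the board size, source positions, and power weights are invented:

```python
import numpy as np

# Toy stand-in: each heat source contributes a temperature-rise field
# T_i(x, y); in the analytical method these come from per-layer series
# solutions, here a 2-D Gaussian bump is used for illustration.
def source_field(xx, yy, x0, y0, power, spread=0.01):
    return power * np.exp(-((xx - x0)**2 + (yy - y0)**2) / spread)

x = np.linspace(0.0, 0.1, 101)        # 10 cm square board, metres
xx, yy = np.meshgrid(x, x)

sources = [                           # (x0, y0, dissipated-power weight)
    (0.03, 0.03, 1.0),
    (0.07, 0.05, 0.5),
]

# Superposition principle: total rise = sum of individual source solutions.
T_ambient = 25.0
T = T_ambient + sum(source_field(xx, yy, *s) for s in sources)

hotspot = float(T.max())              # candidate check against a temperature limit
```

Because the steady-state heat equation is linear, any number of source configurations can be screened by re-summing precomputed fields, which is the low-cost screening the abstract describes.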

  15. Sediment composition of big Chinese and Indochinese rivers reflects geology of their source, not tectonic setting of their sink.

    Science.gov (United States)

    Garzanti, Eduardo; Andò, Sergio; Limonta, Mara; Nie, Junsheng; Resentini, Alberto; Vezzoli, Giovanni; Wang, Jiangang; Yang, Shouye

    2016-04-01

There are several reasons why the tectonic setting of a sedimentary basin cannot be inferred from the composition of its sedimentary fill. One is that sediments can be, and quite often are, transported for thousands of kilometers from sources uplifted by certain tectonic processes to subsident basins created by totally different tectonic processes. A classical case is the Amazon River, carrying detritus from the Andean Cordillera to the Atlantic passive margin on the opposite side of South America (Franzinelli and Potter, 1983; Dickinson, 1988). Similar is the case of major rivers in China and Indochina, sourced in Tibetan orogenic highlands and reaching the Chinese passive margin or the back-arc/pull-apart Andaman Sea. The Huang He (Yellow River), the most sediment-laden river in the world, delivers annually to the Bohai Sea 1 billion tons of litho-feldspatho-quartzose sedimentaclastic/metamorphiclastic sediments with moderately rich amphibole-epidote-garnet suites including apatite and zircon (Nie et al., 2015). The Changjiang (Yangtze) River, the fourth longest on Earth and the largest in Eurasia, carries to the East China Sea litho-feldspatho-quartzose sedimentaclastic/metamorphiclastic sand with moderately poor amphibole-epidote suites including clinopyroxene and garnet (Vezzoli et al., 2016). The Ayeyarwadi (Irrawaddy) River, ranking among the five major rivers in the world for its annual load of 0.4 billion tons, carries to the Andaman Sea litho-feldspatho-quartzose metamorphiclastic/sedimentaclastic sand with moderately rich amphibole-epidote suites including garnet and clinopyroxene (Garzanti et al., 2013). Detrital modes in these three very big river basins are thus similar, and would plot in the "Recycled Orogen" field of Dickinson (1985) rather than in the "Continental Block" or "Magmatic Arc" fields. The orogenic signature acquired in mountainous headwaters is carried all the way to the mouth, and even after long-distance transport across wide

  16. Falls documentation in nursing homes: agreement between the minimum data set and chart abstractions of medical and nursing documentation.

    Science.gov (United States)

    Hill-Westmoreland, Elizabeth E; Gruber-Baldini, Ann L

    2005-02-01

To assess the agreement between falls as recorded in the Minimum Data Set (MDS) and fall events abstracted from chart documentation of elderly nursing home (NH) residents. Secondary analysis of data from a longitudinal panel study. Fifty-six randomly selected NHs in Maryland stratified by facility size and geographic region. Four hundred sixty-two NH residents, aged 65 and older, in NHs for 1 year. Falls were abstracted from resident charts and compared with MDS fall variables. Fall events data obtained from other sources of chart documentation were matched for the corresponding periods of 30 and 180 days before the 1-year MDS assessment date. For a 30-day period, concordance between the MDS and chart abstractions of falls occurred in 65% of cases, with a kappa coefficient of 0.29. Falls that the MDS missed involved residents with significantly more activity of daily living impairment and significantly less unsteady gait and cane/walker use. The MDS underreported falls. Nurses completing MDS assessments must carefully review residents' medical records for falls documentation. Future studies should use caution when employing MDS data as the only indicator of falls.

  17. The Impact of a One-Dose versus Two-Dose Oral Cholera Vaccine Regimen in Outbreak Settings: A Modeling Study

    Science.gov (United States)

    Azman, Andrew S.; Luquero, Francisco J.; Ciglenecki, Iza; Grais, Rebecca F.; Sack, David A.; Lessler, Justin

    2015-01-01

    uncertainty due to imperfect surveillance data and uncertainty about the transmission dynamics of cholera in each setting. Conclusions Reactive vaccination campaigns using a single dose of OCV may avert more cases and deaths than a standard two-dose campaign when vaccine supplies are limited, while at the same time reducing logistical complexity. These findings should motivate consideration of the trade-offs between one- and two-dose campaigns in resource-constrained settings, though further field efficacy data are needed and should be a priority in any one-dose campaign. PMID:26305226

  18. The Impact of a One-Dose versus Two-Dose Oral Cholera Vaccine Regimen in Outbreak Settings: A Modeling Study.

    Directory of Open Access Journals (Sweden)

    Andrew S Azman

    2015-08-01

surveillance data and uncertainty about the transmission dynamics of cholera in each setting. Reactive vaccination campaigns using a single dose of OCV may avert more cases and deaths than a standard two-dose campaign when vaccine supplies are limited, while at the same time reducing logistical complexity. These findings should motivate consideration of the trade-offs between one- and two-dose campaigns in resource-constrained settings, though further field efficacy data are needed and should be a priority in any one-dose campaign.

  19. Nuclear Material Detection by One-Short-Pulse-Laser-Driven Neutron Source

    International Nuclear Information System (INIS)

    Favalli, Andrea; Aymond, F.; Bridgewater, Jon S.; Croft, Stephen; Deppert, O.; Devlin, Matthew James; Falk, Katerina; Fernandez, Juan Carlos; Gautier, Donald Cort; Gonzales, Manuel A.; Goodsell, Alison Victoria; Guler, Nevzat; Hamilton, Christopher Eric; Hegelich, Bjorn Manuel; Henzlova, Daniela; Ianakiev, Kiril Dimitrov; Iliev, Metodi; Johnson, Randall Philip; Jung, Daniel; Kleinschmidt, Annika; Koehler, Katrina Elizabeth; Pomerantz, Ishay; Roth, Markus; Santi, Peter Angelo; Shimada, Tsutomu; Swinhoe, Martyn Thomas; Taddeucci, Terry Nicholas; Wurden, Glen Anthony; Palaniyappan, Sasikumar; McCary, E.

    2015-01-01

    Covered in the PowerPoint presentation are the following areas: Motivation and requirements for active interrogation of nuclear material; laser-driven neutron source; neutron diagnostics; active interrogation of nuclear material; and, conclusions, remarks, and future works.

  20. Point source reconstruction principle of linear inverse problems

    International Nuclear Information System (INIS)

    Terazono, Yasushi; Matani, Ayumu; Fujimaki, Norio; Murata, Tsutomu

    2010-01-01

Exact point source reconstruction for underdetermined linear inverse problems with a block-wise structure was studied. In a block-wise problem, elements of a source vector are partitioned into blocks. Accordingly, a leadfield matrix, which represents the forward observation process, is also partitioned into blocks. A point source is a source having only one nonzero block. An example of such a problem is current distribution estimation in electroencephalography and magnetoencephalography, where a source vector represents a vector field and a point source represents a single current dipole. In this study, the block-wise norm, a block-wise extension of the l_p-norm, was defined as the family of cost functions of the inverse method. The main result is that a set of three conditions was found to be necessary and sufficient for block-wise norm minimization to ensure exact point source reconstruction for any leadfield matrix that admits such reconstruction. The block-wise norm that satisfies the conditions is the sum of the cost of all the observations of source blocks, or in other words, the block-wisely extended leadfield-weighted l_1-norm. Additional results are that minimization of such a norm always provides block-wisely sparse solutions and that its solutions form cones in source space
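A small numerical illustration of the block-wise setting, with invented sizes: here a point source (one nonzero block) is recovered by brute-force scanning over blocks, which shows the goal of the reconstruction. Note that the paper's actual method minimizes a block-wise weighted l_1-norm; the exhaustive search below is just the simplest way to exhibit what "exact point source reconstruction" means.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy leadfield with block structure: n_blocks blocks of block_dim columns
# each (e.g. the 3 components of one current dipole per block).
n_meas, n_blocks, block_dim = 12, 8, 3
L = rng.standard_normal((n_meas, n_blocks * block_dim))

x_true = np.zeros(n_blocks * block_dim)
x_true[6:9] = [1.0, -2.0, 0.5]        # a "point source": only block 2 is nonzero
b = L @ x_true                        # noiseless observations

# Scan blocks: fit each block's sub-leadfield by least squares and keep
# the block that explains the data best (zero residual for the true block).
best_block, best_res, best_coef = None, np.inf, None
for blk in range(n_blocks):
    cols = slice(blk * block_dim, (blk + 1) * block_dim)
    coef, *_ = np.linalg.lstsq(L[:, cols], b, rcond=None)
    res = np.linalg.norm(L[:, cols] @ coef - b)
    if res < best_res:
        best_block, best_res, best_coef = blk, res, coef
```

With more measurements than columns per block, a generic leadfield lets only the true block reach zero residual, which is the situation the paper's three conditions characterize for norm minimization.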

  1. Drawing and Writing in Digital Science Notebooks: Sources of Formative Assessment Data

    Science.gov (United States)

    Shelton, Angi; Smith, Andrew; Wiebe, Eric; Behrle, Courtney; Sirkin, Ruth; Lester, James

    2016-01-01

    Formative assessment strategies are used to direct instruction by establishing where learners' understanding is, how it is developing, informing teachers and students alike as to how they might get to their next set of goals of conceptual understanding. For the science classroom, one rich source of formative assessment data about scientific…

  2. Organ Transplantation and the Uniform Anatomical Gift Act: A Fifty-Year Perspective.

    Science.gov (United States)

    Sadler, Blair L; Sadler, Alfred M

    2018-03-01

    Fifty years ago this summer, the Uniform Anatomical Gift Act was adopted by the National Conference of Commissioners on Uniform State Laws and approved by the American Bar Association. The UAGA has provided a sound and stable legal platform on which to base an effective nationwide organ donation system. The cardinal principles of altruism, autonomy, and public trust are still important. At a time when confidence and trust in our government and many private institutions has declined, maintaining trust and confidence in our health care system and its commitment to "first, do no harm" has never been more important. Any policies that override these core ethical principles could cause irreparable damage to the public's faith in our transplant system. While progress has been made to increase organ registration and the number of organs transplanted, much more must be done to realize the potential of life-saving therapy without jeopardizing ethical principles. © 2018 The Hastings Center.

  3. Using a source-receptor approach to characterise VOC behaviour in a French urban area influenced by industrial emissions. Part II: source contribution assessment using the Chemical Mass Balance (CMB) model.

    Science.gov (United States)

    Badol, Caroline; Locoge, Nadine; Galloo, Jean-Claude

    2008-01-25

In Part I of this study (Badol C, Locoge N, Leonardis T, Galloo JC. Using a source-receptor approach to characterise VOC behaviour in a French urban area influenced by industrial emissions, Part I: Study area description, data set acquisition and qualitative data analysis of the data set. Sci Total Environ 2007; submitted as companion manuscript.) the study area, acquisition of the one-year data set and qualitative analysis of the data set have been described. In Part II a source profile has been established for each activity present in the study area: 6 profiles (urban heating, solvent use, natural gas leakage, biogenic emissions, gasoline evaporation and vehicle exhaust) have been extracted from the literature to characterise urban sources, and 7 industrial profiles have been established via canister sampling around industrial plants (hydrocarbon cracking, oil refinery, hydrocarbon storage, lubricant storage, lubricant refinery, surface treatment and metallurgy). The CMB model is briefly described and its implementation is discussed through the selection of source profiles and fitting species. The main results of the CMB modelling for the Dunkerque area are presented. (1) The daily evolution of source contributions for the urban wind sector shows that the vehicle exhaust source contribution varies between 40 and 55%, and its relative increase at traffic rush hours is hardly perceptible. (2) The relative contribution of vehicle exhaust varies from 55% in winter down to 30% in summer. This decrease is due to the increase of the relative contribution of the hydrocarbon storage source, reaching up to 20% in summer. (3) The evolution of source contributions with wind direction has confirmed that in urban wind sectors the contribution of vehicle exhaust dominates at around 45-55%. For the other wind sectors, which include some industrial plants, the contribution of industrial sources is around 60% and could reach 80% for the sector 280-310 degrees, which corresponds to the most dense
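At its core, the Chemical Mass Balance model expresses ambient concentrations as a linear mix of source profiles and solves for the source contributions. A minimal sketch with invented species and profile numbers (a real CMB run uses measured profiles and effective-variance weighted least squares, not the plain least squares shown here):

```python
import numpy as np

# Invented example: 4 fitting species, 3 source profiles (mass fractions).
# Columns: vehicle exhaust, natural gas leakage, biogenic emissions.
profiles = np.array([
    [0.10, 0.80, 0.00],   # ethane
    [0.30, 0.05, 0.00],   # benzene
    [0.40, 0.05, 0.05],   # toluene
    [0.02, 0.00, 0.90],   # isoprene
])

true_contrib = np.array([10.0, 2.0, 1.0])    # source strengths (e.g. ug/m3)
ambient = profiles @ true_contrib            # synthetic receptor concentrations

# CMB inversion: with clean synthetic data, least squares recovers the
# contributions exactly; percentages give the relative source shares.
contrib, *_ = np.linalg.lstsq(profiles, ambient, rcond=None)
percent = 100 * contrib / contrib.sum()
```

Percentages like the 40-55% vehicle-exhaust share quoted in the abstract are of this kind: fitted contributions normalized over all resolved sources.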

  4. Source Reference File

    Data.gov (United States)

    Social Security Administration — This file contains a national set of names and contact information for doctors, hospitals, clinics, and other facilities (known collectively as sources) from which...

  5. Dual-Source Swept-Source Optical Coherence Tomography Reconstructed on Integrated Spectrum

    Directory of Open Access Journals (Sweden)

    Shoude Chang

    2012-01-01

Full Text Available Dual-source swept-source optical coherence tomography (DS-SSOCT) has two individual sources with different central wavelengths, linewidths, and bandwidths. Because of the differences between the two sources, the individually reconstructed tomograms from each source have different aspect ratios, which makes comparison and integration difficult. We report a method to merge two sets of DS-SSOCT raw data onto a common spectrum, on which both data sets have the same spectral density and the correct separation. The reconstructed tomographic image seamlessly integrates the two bands of OCT data. The final image has higher axial resolution and richer spectroscopic information than either of the individually reconstructed tomographic images.
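The merge onto an integrated spectrum can be sketched as resampling both bands onto one uniform wavenumber grid, zero-filling the gap between them, and reconstructing with a single FFT. All numbers below are illustrative stand-ins, not instrument parameters:

```python
import numpy as np

# Two swept-source bands with different ranges and sample densities
# (arbitrary wavenumber units), probing a single reflector at "depth".
k1 = np.linspace(1.00, 1.10, 256)     # source 1 wavenumber samples
k2 = np.linspace(1.12, 1.25, 320)     # source 2: different range and density

depth = 40.0                          # single reflector -> cosine fringes
s1 = np.cos(2 * np.pi * depth * k1)
s2 = np.cos(2 * np.pi * depth * k2)

# Resample both interferograms onto one uniform grid spanning the full
# range, leaving the inter-band gap at zero, then reconstruct with one FFT.
k = np.linspace(k1[0], k2[-1], 1024)
merged = np.zeros_like(k)
in1 = (k >= k1[0]) & (k <= k1[-1])
in2 = (k >= k2[0]) & (k <= k2[-1])
merged[in1] = np.interp(k[in1], k1, s1)
merged[in2] = np.interp(k[in2], k2, s2)

ascan = np.abs(np.fft.rfft(merged))
dk = k[1] - k[0]
depth_axis = np.fft.rfftfreq(k.size, d=dk)
peak_bin = int(np.argmax(ascan[1:])) + 1   # skip the DC bin
estimated = depth_axis[peak_bin]           # should land near "depth"
```

Because the merged spectrum spans the combined bandwidth of both sources, the axial point-spread function is narrower than either band alone would give, which is the resolution gain the abstract reports.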

  6. A realist review of one-to-one breastfeeding peer support experiments conducted in developed country settings.

    Science.gov (United States)

    Trickey, Heather; Thomson, Gill; Grant, Aimee; Sanders, Julia; Mann, Mala; Murphy, Simon; Paranjothy, Shantini

    2018-01-01

The World Health Organisation guidance recommends breastfeeding peer support (BFPS) as part of a strategy to improve breastfeeding rates. In the UK, BFPS is supported by National Institute for Health and Care Excellence guidance and a variety of models are in use. The experimental evidence for BFPS in developed countries is mixed, and traditional methods of systematic review are ill-equipped to explore heterogeneity, complexity, and context influences on effectiveness. This review aimed to enhance learning from the experimental evidence base for one-to-one BFPS intervention. Principles of realist review were applied to intervention case studies associated with published experimental studies. The review aimed (a) to explore heterogeneity in theoretical underpinnings and intervention design for one-to-one BFPS intervention; (b) to inform design decisions by identifying transferable lessons developed from cross-case comparison of context-mechanism-outcome relationships; and (c) to inform evaluation design by identifying context-mechanism-outcome relationships associated with experimental conditions. Findings highlighted poor attention to intervention theory and considerable heterogeneity in BFPS intervention design. Transferable mid-range theories to inform design emerged, which could be grouped into seven categories: (a) congruence with local infant feeding norms, (b) integration with the existing system of health care, (c) overcoming practical and emotional barriers to access, (d) ensuring friendly, competent, and proactive peers, (e) facilitating authentic peer-mother interactions, (f) motivating peers to ensure positive within-intervention amplification, and (g) ensuring positive legacy and maintenance of gains. There is a need to integrate realist principles into evaluation design to improve our understanding of what forms of BFPS work, for whom and under what circumstances. © 2017 John Wiley & Sons Ltd.

  7. Fertility in cancer patients after cryopreservation of one ovary

    DEFF Research Database (Denmark)

    Schmidt, K T; Andersen, Anders Nyboe; Greve, T

    2013-01-01

This questionnaire study describes the fertility and ovarian function in 143 adult female cancer survivors with only one ovary due to cryopreservation of the other. The women were asked about their ovarian function (as defined by the presence of a spontaneous menstrual cycle), pregnancies...... and their outcome. The mean follow-up time was 58 months after cryopreservation (range 24-129 months). The risk of premature ovarian failure was high in the group of patients with leukaemia (13/15; 87%) but low in the breast cancer group (5/54; 9%). Fifty-seven women had actively tried to become pregnant after end...

  8. Antecedents to agenda setting and framing in health news: an examination of priority, angle, source, and resource usage from a national survey of U.S. health reporters and editors.

    Science.gov (United States)

    Wallington, Sherrie Flynt; Blake, Kelly; Taylor-Clark, Kalahn; Viswanath, K

    2010-01-01

The influence of news media on audience cognitions, attitudes, and behaviors in the realm of politics, race relations, science, and health has been extensively documented. Agenda setting and framing studies show that news media influence how people develop schema and place priorities on issues, with media stories serving as a major source of issue frames. Although news media are an important intermediary in the translation of scientific knowledge to different publics, little has been documented about the production of health news and factors that may predict media agenda setting and framing in health journalism. We used data from a 2005 national survey of U.S. health reporters and editors to examine predictors of source, resource, story angle, and frame usage among reporters and editors by variables such as organizational structure, individual characteristics of respondents (such as education and years working as a journalist), and perceptions of occupational autonomy. Multivariable logistic regression models revealed several differences among U.S. health reporters and editors in the likelihood of using a variety of news sources, resources, priorities, and angles in reporting. Media agenda setting and framing theories suggest that practitioners familiar with media processes can work with journalists to frame messages, thereby increasing the probability of accurate and effective reporting. Results from this study may help to inform interactions between public health and medical practitioners and the press [corrected].

  9. An Ode to Stuart Hall's "The Supply of Demand": The Case of Post-Secondary Education in Ontario Fifty Years Later

    Science.gov (United States)

    FitzGerald Murphy, Maggie

    2016-01-01

Despite the fact that over fifty years have passed since its publication, Stuart Hall's article "The Supply of Demand" (1960) is remarkably relevant today. The central message that society must not be blinded by "prosperity" such that it no longer envisions and demands a better world is especially pertinent in light of the…

  10. A New Technique to Identify Arbitrarily Shaped Noise Sources

    Directory of Open Access Journals (Sweden)

    Roberto A. Tenenbaum

    2006-01-01

Acoustic intensity is one of the available tools for evaluating sound radiation from vibrating bodies. Active intensity may, in some situations, not give a faithful insight into how much energy is in fact carried into the far field. A new parameter was therefore proposed, the supersonic acoustic intensity, which takes into account only the intensity generated by components having a smaller wavenumber than the acoustic one. However, the method is only effective for simple sources, such as plane plates, cylinders and spheres. This work presents a new technique, based on the Boundary Elements Method and the Singular Value Decomposition, to compute the supersonic acoustic intensity for arbitrarily shaped sources. The technique is based on the Kirchhoff-Helmholtz equation in a discretized approach, leading to a radiation operator that relates the normal velocity on the source's surface mesh to the pressure at grid points located in the field. The singular value decomposition is then applied to the radiation operator and a cutoff criterion is used to remove non-propagating components. Some numerical examples are presented.
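    The pipeline sketched in the abstract (radiation operator, SVD, cutoff, filtered velocity) can be illustrated in a few lines. The operator below is a random stand-in, since assembling the true operator requires the actual Kirchhoff-Helmholtz BEM discretization, and the cutoff fraction is an arbitrary illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in radiation operator R mapping surface normal velocities
# (n mesh nodes) to field pressures (m grid points): p = R v.
# A real BEM implementation would assemble R from the discretized
# Kirchhoff-Helmholtz equation.
m, n = 40, 30
R = rng.standard_normal((m, n))
v = rng.standard_normal(n)          # surface normal velocity

# Singular value decomposition of the radiation operator.
U, s, Vt = np.linalg.svd(R, full_matrices=False)

# Illustrative cutoff criterion: keep only components whose singular
# values exceed a fraction of the largest one (the efficient radiators).
k = int(np.sum(s > 0.1 * s[0]))

# Filter the velocity: project onto the retained right singular vectors,
# discarding weakly radiating (non-propagating) components.
v_super = Vt[:k].T @ (Vt[:k] @ v)

# Field pressure carried by the propagating part only.
p_super = R @ v_super
```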

  11. Novel gene sets improve set-level classification of prokaryotic gene expression data.

    Science.gov (United States)

    Holec, Matěj; Kuželka, Ondřej; Železný, Filip

    2015-10-28

Set-level classification of gene expression data has received significant attention recently. In this setting, high-dimensional vectors of features corresponding to genes are converted into lower-dimensional vectors of features corresponding to biologically interpretable gene sets. The dimensionality reduction brings the promise of a decreased risk of overfitting, potentially resulting in improved accuracy of the learned classifiers. However, recent empirical research has not confirmed this expectation. Here we hypothesize that the reported unfavorable classification results in the set-level framework were due to the adoption of unsuitable gene sets defined typically on the basis of the Gene ontology and the KEGG database of metabolic networks. We explore an alternative approach to defining gene sets, based on regulatory interactions, which we expect to collect genes with more correlated expression. We hypothesize that such more correlated gene sets will enable learning of more accurate classifiers. We define two families of gene sets using information on regulatory interactions, and evaluate them on phenotype-classification tasks using public prokaryotic gene expression data sets. From each of the two gene-set families, we first select the best-performing subtype. The two selected subtypes are then evaluated on independent (testing) data sets against state-of-the-art gene sets and against the conventional gene-level approach. The novel gene sets are indeed more correlated than the conventional ones, and lead to significantly more accurate classifiers. Novel gene sets defined on the basis of regulatory interactions improve set-level classification of gene expression data. The experimental scripts and other material needed to reproduce the experiments are available at http://ida.felk.cvut.cz/novelgenesets.tar.gz.
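    The set-level feature transformation itself is easy to sketch: each gene set becomes one feature, here the mean expression of its member genes. The expression matrix and the gene sets below are made-up placeholders, not the paper's regulatory-interaction sets.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy expression matrix: 20 samples x 100 genes (sizes are illustrative).
X = rng.standard_normal((20, 100))

# Hypothetical gene sets, e.g. genes sharing a regulator; here each set
# is simply a list of gene (column) indices.
gene_sets = [list(range(i, i + 10)) for i in range(0, 100, 10)]

# Set-level transformation: replace each gene set by the mean expression
# of its member genes, reducing 100 gene features to 10 set features.
X_set = np.column_stack([X[:, idx].mean(axis=1) for idx in gene_sets])

print(X_set.shape)  # (20, 10)
```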

  12. The Effects of Environmental Management Systems on Source Separation in the Work and Home Settings

    Directory of Open Access Journals (Sweden)

    Chris von Borgstede

    2012-06-01

Measures that challenge the generation of waste are needed to address the global problem of the increasing volumes of waste that are generated in both private homes and workplaces. Source separation at the workplace is commonly implemented by environmental management systems (EMS). In the present study, the relationship between source separation at work and at home was investigated. A questionnaire that maps psychological and behavioural predictors of source separation was distributed to employees at different workplaces. The results show that respondents with awareness of EMS report higher levels of source separation at work, stronger environmental concern, personal and social norms, and perceive source separation to be less difficult. Furthermore, the results support the notion that after the adoption of EMS at the workplace, source separation at work spills over into source separation in the household. The potential implications for environmental management systems are discussed.

  13. X fluorescence spectrometer including at least one toroidal monochromator with logarithmic spiral

    International Nuclear Information System (INIS)

    Florestan, J.

    1986-01-01

This spectrometer includes an X-ray source, an entrance diaphragm, a revolution monochromator with monocrystal thin plates and a seal set in its center, an outer diaphragm and an X-ray detector. A second monochromator can be set between the source and the sample. The thin plates are set so as to form a toroidal ring whose cross section in an axial plane describes a logarithmic spiral [fr]

  14. One-pot four-component synthesis of 2-aryl-3,3-dihaloacrylonitriles using potassium hexacyanoferrate(II) as environmentally benign cyanide source

    International Nuclear Information System (INIS)

    Zhao, Zhouxing; Li, Zheng

    2011-01-01

An efficient route to one-pot four-component reactions of aroyl chlorides, potassium hexacyanoferrate(II), triphenylphosphine and carbon tetrahalides to synthesize 2-aryl-3,3-dichloroacrylonitriles and 2-aryl-3,3-dibromoacrylonitriles is described. This protocol has the advantages of a non-toxic cyanide source, high yields and a simple work-up procedure. (author)

  15. Point-source inversion techniques

    Science.gov (United States)

    Langston, Charles A.; Barker, Jeffrey S.; Pavlin, Gregory B.

    1982-11-01

    A variety of approaches for obtaining source parameters from waveform data using moment-tensor or dislocation point source models have been investigated and applied to long-period body and surface waves from several earthquakes. Generalized inversion techniques have been applied to data for long-period teleseismic body waves to obtain the orientation, time function and depth of the 1978 Thessaloniki, Greece, event, of the 1971 San Fernando event, and of several events associated with the 1963 induced seismicity sequence at Kariba, Africa. The generalized inversion technique and a systematic grid testing technique have also been used to place meaningful constraints on mechanisms determined from very sparse data sets; a single station with high-quality three-component waveform data is often sufficient to discriminate faulting type (e.g., strike-slip, etc.). Sparse data sets for several recent California earthquakes, for a small regional event associated with the Koyna, India, reservoir, and for several events at the Kariba reservoir have been investigated in this way. Although linearized inversion techniques using the moment-tensor model are often robust, even for sparse data sets, there are instances where the simplifying assumption of a single point source is inadequate to model the data successfully. Numerical experiments utilizing synthetic data and actual data for the 1971 San Fernando earthquake graphically demonstrate that severe problems may be encountered if source finiteness effects are ignored. These techniques are generally applicable to on-line processing of high-quality digital data, but source complexity and inadequacy of the assumed Green's functions are major problems which are yet to be fully addressed.
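    Because synthetic waveforms are linear in the six independent moment-tensor components, the generalized (linearized) inversion described above reduces to a least-squares problem d = G m. The sketch below uses a random stand-in for the Green's-function matrix, which in practice must be computed for the assumed Earth structure.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in Green's-function matrix G: each column is the waveform
# response (samples stacked over stations/components) to one of the
# six independent moment-tensor elements.
n_samples, n_mt = 200, 6
G = rng.standard_normal((n_samples, n_mt))

# Synthetic "observed" data from a known moment tensor plus noise.
m_true = np.array([1.0, -0.5, -0.5, 0.3, 0.0, 0.2])
d = G @ m_true + 0.01 * rng.standard_normal(n_samples)

# Linearized inversion: least-squares estimate of the moment tensor.
m_est, *_ = np.linalg.lstsq(G, d, rcond=None)
```

With sparse data sets, G has few rows and the estimate degrades, which is where the grid-testing constraints mentioned in the abstract come in.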

  16. [Assessing the balance of quality indicator sets of external quality assurance according to SGB V section 136].

    Science.gov (United States)

    Doebler, Klaus; Geraedts, Max

    2017-12-20

    The value and usefulness of the results of indicator-based performance measurement in healthcare for different purposes do not only depend on the methodological quality of the individual indicators but also on the composition of the indicator sets. So far, the balance of the currently used indicator sets of the German mandatory national performance measurement system for hospitals has not been systematically analyzed. Due to the lack of a methodological gold standard for the assessment of balance and orientation of indicator sets we adapted the OECD concept of quality dimensions and defined four categories: 1) "Achieving primary goals of treatment", 2) "Avoiding adverse events", 3) "Indication" and 4) "Patient-centeredness". We defined rules for the assignment to the categories and analyzed the distribution of the 239 indicators from 29 medical areas in relation to these categories. 63 indicators (26.4 %) were assigned to the category "Achieving primary goals of treatment", 153 (64.0 %) to the category "Avoiding adverse events", 18 (7.5 %) to the category "Indication", one indicator (0.4 %) to the category "Patient-centeredness". Four indicators (1.7 %) addressed documentation quality. 12 of the 29 indicator sets only covered one OECD quality dimension by at least one indicator. The current indicator sets seem to be unbalanced with a strong focus on the category "Avoiding adverse events". As regards the goal of monitoring the compliance with minimal safety standards and performing improvement interventions, the direction of the indicator sets seems to be appropriate. With respect to other goals, such as for example the identification of "excellence", further development efforts are required. One relevant reason for the dominant focus on the category "Avoiding adverse events" seems to be that data sources for a follow-up and for the inclusion of the patient perspective have not been available until recently. There is a strong demand for the consequent use of

  17. An alternative subspace approach to EEG dipole source localization

    Science.gov (United States)

    Xu, Xiao-Liang; Xu, Bobby; He, Bin

    2004-01-01

    In the present study, we investigate a new approach to electroencephalography (EEG) three-dimensional (3D) dipole source localization by using a non-recursive subspace algorithm called FINES. In estimating source dipole locations, the present approach employs projections onto a subspace spanned by a small set of particular vectors (FINES vector set) in the estimated noise-only subspace instead of the entire estimated noise-only subspace in the case of classic MUSIC. The subspace spanned by this vector set is, in the sense of principal angle, closest to the subspace spanned by the array manifold associated with a particular brain region. By incorporating knowledge of the array manifold in identifying FINES vector sets in the estimated noise-only subspace for different brain regions, the present approach is able to estimate sources with enhanced accuracy and spatial resolution, thus enhancing the capability of resolving closely spaced sources and reducing estimation errors. The present computer simulations show, in EEG 3D dipole source localization, that compared to classic MUSIC, FINES has (1) better resolvability of two closely spaced dipolar sources and (2) better estimation accuracy of source locations. In comparison with RAP-MUSIC, FINES' performance is also better for the cases studied when the noise level is high and/or correlations among dipole sources exist.

  18. An alternative subspace approach to EEG dipole source localization

    International Nuclear Information System (INIS)

    Xu Xiaoliang; Xu, Bobby; He Bin

    2004-01-01

    In the present study, we investigate a new approach to electroencephalography (EEG) three-dimensional (3D) dipole source localization by using a non-recursive subspace algorithm called FINES. In estimating source dipole locations, the present approach employs projections onto a subspace spanned by a small set of particular vectors (FINES vector set) in the estimated noise-only subspace instead of the entire estimated noise-only subspace in the case of classic MUSIC. The subspace spanned by this vector set is, in the sense of principal angle, closest to the subspace spanned by the array manifold associated with a particular brain region. By incorporating knowledge of the array manifold in identifying FINES vector sets in the estimated noise-only subspace for different brain regions, the present approach is able to estimate sources with enhanced accuracy and spatial resolution, thus enhancing the capability of resolving closely spaced sources and reducing estimation errors. The present computer simulations show, in EEG 3D dipole source localization, that compared to classic MUSIC, FINES has (1) better resolvability of two closely spaced dipolar sources and (2) better estimation accuracy of source locations. In comparison with RAP-MUSIC, FINES' performance is also better for the cases studied when the noise level is high and/or correlations among dipole sources exist
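    The core subspace idea shared by classic MUSIC and FINES, namely that a true source's forward (lead-field) vector is nearly orthogonal to the estimated noise-only subspace, can be illustrated with a toy MUSIC-style scan. The random lead fields and sizes are assumptions, and the FINES-specific step of selecting a small region-dependent vector set inside the noise subspace is omitted.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy sensor array and candidate source locations (sizes are made up).
n_sensors, n_candidates = 16, 50
A = rng.standard_normal((n_sensors, n_candidates))  # candidate lead-field vectors
true_idx = 17                                       # the one active source

# Simulated sensor snapshots: one active source plus sensor noise.
n_snapshots = 500
src = rng.standard_normal(n_snapshots)
X = np.outer(A[:, true_idx], src) + 0.1 * rng.standard_normal((n_sensors, n_snapshots))

# Eigendecomposition of the sample covariance; assuming one source, all
# but the largest eigenvector span the estimated noise-only subspace.
C = X @ X.T / n_snapshots
eigvals, eigvecs = np.linalg.eigh(C)   # ascending eigenvalues
En = eigvecs[:, :-1]                   # estimated noise subspace

# Scan: the true lead field is nearly orthogonal to the noise subspace,
# so its normalized projection onto it is minimal at the true location.
proj = np.linalg.norm(En.T @ A, axis=0) / np.linalg.norm(A, axis=0)
best = int(np.argmin(proj))
```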

  19. Noise-tolerant parity learning with one quantum bit

    Science.gov (United States)

    Park, Daniel K.; Rhee, June-Koo K.; Lee, Soonchil

    2018-03-01

    Demonstrating quantum advantage with less powerful but more realistic devices is of great importance in modern quantum information science. Recently, a significant quantum speedup was achieved in the problem of learning a hidden parity function with noise. However, if all data qubits at the query output are completely depolarized, the algorithm fails. In this work, we present a quantum parity learning algorithm that exhibits quantum advantage as long as one qubit is provided with nonzero polarization in each query. In this scenario, the quantum parity learning naturally becomes deterministic quantum computation with one qubit. Then the hidden parity function can be revealed by performing a set of operations that can be interpreted as measuring nonlocal observables on the auxiliary result qubit having nonzero polarization and each data qubit. We also discuss the source of the quantum advantage in our algorithm from the resource-theoretic point of view.

  20. Fire forbids fifty-fifty forest

    Science.gov (United States)

    Staal, Arie; Hantson, Stijn; Holmgren, Milena; Pueyo, Salvador; Bernardi, Rafael E.; Flores, Bernardo M.; Xu, Chi; Scheffer, Marten

    2018-01-01

    Recent studies have interpreted patterns of remotely sensed tree cover as evidence that forest with intermediate tree cover might be unstable in the tropics, as it will tip into either a closed forest or a more open savanna state. Here we show that across all continents the frequency of wildfires rises sharply as tree cover falls below ~40%. Using a simple empirical model, we hypothesize that the steepness of this pattern causes intermediate tree cover (30‒60%) to be unstable for a broad range of assumptions on tree growth and fire-driven mortality. We show that across all continents, observed frequency distributions of tropical tree cover are consistent with this hypothesis. We argue that percolation of fire through an open landscape may explain the remarkably universal rise of fire frequency around a critical tree cover, but we show that simple percolation models cannot predict the actual threshold quantitatively. The fire-driven instability of intermediate states implies that tree cover will not change smoothly with climate or other stressors and shifts between closed forest and a state of low tree cover will likely tend to be relatively sharp and difficult to reverse. PMID:29351323

  1. Fire forbids fifty-fifty forest.

    Science.gov (United States)

    van Nes, Egbert H; Staal, Arie; Hantson, Stijn; Holmgren, Milena; Pueyo, Salvador; Bernardi, Rafael E; Flores, Bernardo M; Xu, Chi; Scheffer, Marten

    2018-01-01

    Recent studies have interpreted patterns of remotely sensed tree cover as evidence that forest with intermediate tree cover might be unstable in the tropics, as it will tip into either a closed forest or a more open savanna state. Here we show that across all continents the frequency of wildfires rises sharply as tree cover falls below ~40%. Using a simple empirical model, we hypothesize that the steepness of this pattern causes intermediate tree cover (30‒60%) to be unstable for a broad range of assumptions on tree growth and fire-driven mortality. We show that across all continents, observed frequency distributions of tropical tree cover are consistent with this hypothesis. We argue that percolation of fire through an open landscape may explain the remarkably universal rise of fire frequency around a critical tree cover, but we show that simple percolation models cannot predict the actual threshold quantitatively. The fire-driven instability of intermediate states implies that tree cover will not change smoothly with climate or other stressors and shifts between closed forest and a state of low tree cover will likely tend to be relatively sharp and difficult to reverse.

  2. Coral seas in fifty years: Need for local policies

    Science.gov (United States)

    Longley, P.; Cheng, N. S.; Fontaine, R. M.; Horton, K.; Bhattacharya, A.

    2017-12-01

Stressors arising from both global and local sources threaten coral reefs, with studies indicating that local and global stressors might reduce coral resilience. Local sources include sediment stress and nutrient stress from fishing; global sources include increasing sea surface temperature and ocean acidification. Through an in-depth review and re-analysis of published work, conducted as part of a course during the spring 2017 semester with follow-up research over the summer and fall of 2017, students in the Environmental Studies course ENVS 4100: Coral Reefs at the University of Colorado Boulder have developed a framework to initiate a discussion of global and local policies focused on protection of coral reefs. The research aims to assess current threats and suggest mitigation efforts. The paper uses secondary research to analyze the impact of ocean acidification on aragonite saturation levels, current thermal stress, nutrient stress, and sediment factors that influence the health of coral and its surrounding ecosystem over the Common Era. Case studies in this paper include the Caribbean and Red Sea coral reefs, due to the variation of the atmosphere, temperature, and human activity in these regions. This paper intends to offer sufficient evidence that will lead to appropriate policy decisions that pertain to reef conservation.

  3. Open source EMR software: profiling, insights and hands-on analysis.

    Science.gov (United States)

    Kiah, M L M; Haiqi, Ahmed; Zaidan, B B; Zaidan, A A

    2014-11-01

    The use of open source software in health informatics is increasingly advocated by authors in the literature. Although there is no clear evidence of the superiority of the current open source applications in the healthcare field, the number of available open source applications online is growing and they are gaining greater prominence. This repertoire of open source options is of a great value for any future-planner interested in adopting an electronic medical/health record system, whether selecting an existent application or building a new one. The following questions arise. How do the available open source options compare to each other with respect to functionality, usability and security? Can an implementer of an open source application find sufficient support both as a user and as a developer, and to what extent? Does the available literature provide adequate answers to such questions? This review attempts to shed some light on these aspects. The objective of this study is to provide more comprehensive guidance from an implementer perspective toward the available alternatives of open source healthcare software, particularly in the field of electronic medical/health records. The design of this study is twofold. In the first part, we profile the published literature on a sample of existent and active open source software in the healthcare area. The purpose of this part is to provide a summary of the available guides and studies relative to the sampled systems, and to identify any gaps in the published literature with respect to our research questions. In the second part, we investigate those alternative systems relative to a set of metrics, by actually installing the software and reporting a hands-on experience of the installation process, usability, as well as other factors. The literature covers many aspects of open source software implementation and utilization in healthcare practice. 
Roughly, those aspects could be distilled into a basic taxonomy, making the

  4. Impact of intimate partner violence on clinic attendance, viral suppression and CD4 cell count of women living with HIV in an urban clinic setting.

    Science.gov (United States)

    Anderson, Jocelyn C; Campbell, Jacquelyn C; Glass, Nancy E; Decker, Michele R; Perrin, Nancy; Farley, Jason

    2018-04-01

The substance abuse, violence and HIV/AIDS (SAVA) syndemic represents a complex set of social determinants of health that impacts the lives of women. Specifically, there is growing evidence that intimate partner violence (IPV) places women at risk for both HIV acquisition and poorer HIV-related outcomes. This study assessed the prevalence of IPV in an HIV clinic setting, as well as the associations between IPV, symptoms of depression and PTSD and three HIV-related outcomes: CD4 count, viral load, and missed clinic visits. In total, 239 adult women attending an HIV-specialty clinic were included. Fifty-one percent (95% CI: 45%-58%) reported past-year psychological, physical, or sexual intimate partner abuse. In unadjusted models, IPV was associated with CD4 count suppression, but not with missing more than 33% of past-year all-type clinic visits (OR: 1.535, 95% CI: 0.920-2.560, p = 0.101) or HIV specialty clinic visits (OR: 1.251, 95% CI: 0.732-2.140). In multivariable regression, controlling for substance use, mental health symptoms and demographic covariates, IPV remained associated with CD4 count suppression. The association between IPV and lower CD4 counts, but not adherence markers such as viral suppression and missed visits, indicates a need to examine potential physiologic impacts of trauma that may alter the immune functioning of women living with HIV. Incorporating trauma-informed approaches into current HIV care settings is one opportunity that begins to address IPV in this patient population.
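    As a side note on the reported statistics, an unadjusted odds ratio and its Wald 95% confidence interval are computed from a 2×2 table as exp(log OR ± 1.96·SE). The counts below are invented for illustration only and are not the study's data.

```python
import math

# Illustrative 2x2 table (made-up counts): exposure vs outcome.
a, b = 30, 40   # exposed:   outcome yes / no
c, d = 20, 60   # unexposed: outcome yes / no

log_or = math.log((a * d) / (b * c))
se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # Wald standard error of log OR

odds_ratio = math.exp(log_or)
ci_low = math.exp(log_or - 1.96 * se)
ci_high = math.exp(log_or + 1.96 * se)
```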

  5. Global digital data sets of soil type, soil texture, surface slope and other properties: Documentation of archived data tape

    Science.gov (United States)

    Staub, B.; Rosenzweig, C.; Rind, D.

    1987-01-01

    The file structure and coding of four soils data sets derived from the Zobler (1986) world soil file is described. The data were digitized on a one-degree square grid. They are suitable for large-area studies such as climate research with general circulation models, as well as in forestry, agriculture, soils, and hydrology. The first file is a data set of codes for soil unit, land-ice, or water, for all the one-degree square cells on Earth. The second file is a data set of codes for texture, land-ice, or water, for the same soil units. The third file is a data set of codes for slope, land-ice, or water for the same units. The fourth file is the SOILWRLD data set, containing information on soil properties of land cells of both Matthews' and Food and Agriculture Organization (FAO) sources. The fourth file reconciles land-classification differences between the two and has missing data filled in.

  6. Fast temperature optimization of multi-source hyperthermia applicators with reduced-order modeling of 'virtual sources'

    International Nuclear Information System (INIS)

    Cheng, K-S; Stakhursky, Vadim; Craciunescu, Oana I; Stauffer, Paul; Dewhirst, Mark; Das, Shiva K

    2008-01-01

    The goal of this work is to build the foundation for facilitating real-time magnetic resonance image guided patient treatment for heating systems with a large number of physical sources (e.g. antennas). Achieving this goal requires knowledge of how the temperature distribution will be affected by changing each source individually, which requires time expenditure on the order of the square of the number of sources. To reduce computation time, we propose a model reduction approach that combines a smaller number of predefined source configurations (fewer than the number of actual sources) that are most likely to heat tumor. The source configurations consist of magnitude and phase source excitation values for each actual source and may be computed from a CT scan based plan or a simplified generic model of the corresponding patient anatomy. Each pre-calculated source configuration is considered a 'virtual source'. We assume that the actual best source settings can be represented effectively as weighted combinations of the virtual sources. In the context of optimization, each source configuration is treated equivalently to one physical source. This model reduction approach is tested on a patient upper-leg tumor model (with and without temperature-dependent perfusion), heated using a 140 MHz ten-antenna cylindrical mini-annular phased array. Numerical simulations demonstrate that using only a few pre-defined source configurations can achieve temperature distributions that are comparable to those from full optimizations using all physical sources. The method yields close to optimal temperature distributions when using source configurations determined from a simplified model of the tumor, even when tumor position is erroneously assumed to be ∼2.0 cm away from the actual position as often happens in practical clinical application of pre-treatment planning. The method also appears to be robust under conditions of changing, nonlinear, temperature-dependent perfusion. The
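    The reduced-order step described above can be sketched as a small linear problem: each pre-computed virtual source contributes one temperature-response column, and the optimizer works on the few combination weights instead of all physical antenna settings. The response matrix, target and unconstrained least-squares fit below are illustrative stand-ins for the actual constrained temperature optimization.

```python
import numpy as np

rng = np.random.default_rng(4)

# Stand-in temperature responses at n tissue points for each of k
# pre-computed "virtual source" configurations (columns), rather than
# per-antenna responses for all physical sources.
n_points, k_virtual = 100, 3
T = rng.random((n_points, k_virtual))

# Desired temperature rise at each tumor point (illustrative target).
target = np.full(n_points, 6.0)

# Reduced-order optimization: best weighted combination of the virtual
# sources in the least-squares sense; only k weights are optimized.
w, *_ = np.linalg.lstsq(T, target, rcond=None)
t_achieved = T @ w
```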

  7. Mass spectrometric characterization of a pyrolytic radical source using femtosecond ionization

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H M; Beaud, P; Mischler, B; Radi, P P; Tzannis, A P; Gerber, T [Paul Scherrer Inst. (PSI), Villigen (Switzerland)

    1997-06-01

Radicals play, as reactive species, an important role in the chemistry of combustion. In contrast to atmospheric flames, where spectra are congested due to high vibrational and rotational excitation, experiments in the cold environment of a molecular beam (MB) yield clean spectra that can be easily attributed to one species by Resonantly Enhanced Multi-Photon Ionization (REMPI). A pyrolytic radical source has been set up. To characterize the efficiency of the source, 'soft' ionization with femtosecond pulses is applied, which results in less fragmentation, simplifying the interpretation of the mass spectrum. (author) figs., tabs., refs.

  8. Sets, Planets, and Comets

    Science.gov (United States)

    Baker, Mark; Beltran, Jane; Buell, Jason; Conrey, Brian; Davis, Tom; Donaldson, Brianna; Detorre-Ozeki, Jeanne; Dibble, Leila; Freeman, Tom; Hammie, Robert; Montgomery, Julie; Pickford, Avery; Wong, Justine

    2013-01-01

    Sets in the game "Set" are lines in a certain four-dimensional space. Here we introduce planes into the game, leading to interesting mathematical questions, some of which we solve, and to a wonderful variation on the game "Set," in which every tableau of nine cards must contain at least one configuration for a player to pick up.
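    The encoding behind "sets are lines in a certain four-dimensional space" is standard: each card is a point of the four-dimensional space over the three-element field, one coordinate per attribute, and three cards form a line exactly when every coordinate sums to 0 mod 3. A minimal sketch (the attribute ordering is an arbitrary choice):

```python
# Each Set card is a vector in (Z_3)^4: one coordinate per attribute
# (number, shading, color, shape), each taking the values 0, 1 or 2.
def is_set(c1, c2, c3):
    """Three cards form a line ("set") iff each coordinate sums to 0 mod 3."""
    return all((a + b + c) % 3 == 0 for a, b, c in zip(c1, c2, c3))

# All-same in two attributes, all-different in the other two: a valid set.
print(is_set((0, 1, 2, 0), (1, 1, 2, 1), (2, 1, 2, 2)))  # True

# Two cards agree in an attribute while the third differs: not a set.
print(is_set((0, 0, 0, 0), (0, 0, 0, 1), (1, 1, 1, 2)))  # False
```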

  9. Social Set Visualizer (SoSeVi) II

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Vatrapu, Ravi

    2016-01-01

    This paper reports the second iteration of the Social Set Visualizer (SoSeVi), a set theoretical visual analytics dashboard of big social data. In order to further demonstrate its usefulness in large-scale visual analytics tasks of individual and collective behavior of actors in social networks......, the current iteration of the Social Set Visualizer (SoSeVi) in version II builds on recent advancements in visualizing set intersections. The development of the SoSeVi dashboard involved cutting-edge open source visual analytics libraries (D3.js) and creation of new visualizations such as of actor mobility...

  10. SARNET. Severe Accident Research Network - key issues in the area of source term

    International Nuclear Information System (INIS)

    Giordano, P.; Micaelli, J.C.; Haste, T.; Herranz, L.

    2005-01-01

About fifty European organisations integrate in SARNET (a Network of Excellence of the EU 6th Framework Programme) their research capacities to better resolve the most important remaining uncertainties and safety issues concerning existing and future Nuclear Power Plants (NPPs) under hypothetical Severe Accident (SA) conditions. Wishing to maintain a long-lasting cooperation, they conduct three types of activities: integrating activities, spreading of excellence and jointly executed research. This paper summarises the main results obtained by the network after the first year, giving more prominence to those from jointly executed research in the Source Term area. Integrating activities have been performed through different means: the ASTEC integral computer code for severe accident transient modelling, through development of PSA2 methodologies, through the setting of a structure for definition of evolving R and D priorities and through the development of a web-network of databases that hosts experimental data. Such activities have been facilitated by the development of an Advanced Communication Tool. Concerning spreading of excellence, educational courses covering Severe Accident Analysis Methodology and Level 2 PSA have been set up, to be given in early 2006. A detailed text book on Severe Accident Phenomenology has been designed and agreed amongst SARNET members. A mobility programme for students and young researchers is being developed, some detachments are already completed or in progress, and examples are quoted. Jointly executed research activities concern key issues grouped in the Corium, Containment and Source Term areas. In Source Term, the behaviour of the highly radio-toxic ruthenium under oxidising conditions (like air ingress) for HBU and MOX fuel has been investigated. First modelling proposals for ASTEC have been made for oxidation of fuel and of ruthenium. Experiments on transport of highly volatile ruthenium oxide species have been performed. Reactor

  11. Price setting in turbulent times

    DEFF Research Database (Denmark)

    Ólafsson, Tjörvi; Pétursdóttir, Ásgerdur; Vignisdóttir, Karen Á.

    This price setting survey among Icelandic firms aims to make two contributions to the literature. First, it studies price setting in an advanced economy within a more turbulent macroeconomic environment than has previously been done. The results indicate that price adjustments are to a larger...... extent driven by exchange rate fluctuations than in most other advanced countries. The median Icelandic firm reviews its prices every four months and changes them every six months. The main sources of price rigidity and the most commonly used price setting methods are the same as in most other countries....... A second contribution to the literature is our analysis of the nexus between price setting and exchange rate movements, a topic that has attracted surprisingly limited attention in this survey-based literature. A novel aspect of our approach is to base our analysis on a categorisation of firms...

  12. Invariant sets for Windows

    CERN Document Server

    Morozov, Albert D; Dragunov, Timothy N; Malysheva, Olga V

    1999-01-01

    This book deals with the visualization and exploration of invariant sets (fractals, strange attractors, resonance structures, patterns etc.) for various kinds of nonlinear dynamical systems. The authors have created a special Windows 95 application called WInSet, which allows one to visualize the invariant sets. A WInSet installation disk is enclosed with the book.The book consists of two parts. Part I contains a description of WInSet and a list of the built-in invariant sets which can be plotted using the program. This part is intended for a wide audience with interests ranging from dynamical

  13. Emittance studies of the 2.45 GHz permanent magnet ECR ion source

    Science.gov (United States)

    Zelenak, A.; Bogomolov, S. L.; Yazvitsky, N. Yu.

    2004-05-01

    During the past several years different types of permanent magnet 2.45 GHz (electron cyclotron resonance) ion sources were developed for the production of singly charged ions. Ion sources of this type are used in the first stage of the DRIBs project, and are planned to be used in the MASHA mass separator. The emittance of the beam provided by the source is one of the important parameters for these applications. An emittance scanner composed of a set of parallel slits and a rotary wire beam profile monitor was used to study the beam emittance characteristics. The emittance of helium and argon ion beams was measured with different shapes of the plasma electrode for several ion source parameters: microwave power, source potential, plasma aperture-puller aperture gap distance, and gas pressure. The results of the measurements are compared with previous simulations of the ion optics.

  14. Emittance studies of the 2.45 GHz permanent magnet ECR ion source

    International Nuclear Information System (INIS)

    Zelenak, A.; Bogomolov, S.L.; Yazvitsky, N.Yu.

    2004-01-01

    During the past several years different types of permanent magnet 2.45 GHz (electron cyclotron resonance) ion sources were developed for the production of singly charged ions. Ion sources of this type are used in the first stage of the DRIBs project, and are planned to be used in the MASHA mass separator. The emittance of the beam provided by the source is one of the important parameters for these applications. An emittance scanner composed of a set of parallel slits and a rotary wire beam profile monitor was used to study the beam emittance characteristics. The emittance of helium and argon ion beams was measured with different shapes of the plasma electrode for several ion source parameters: microwave power, source potential, plasma aperture-puller aperture gap distance, and gas pressure. The results of the measurements are compared with previous simulations of the ion optics.

  15. Sources and fate of environmental radioactivity at the earth's surface

    International Nuclear Information System (INIS)

    El-Daoushy, F.

    2010-01-01

    Sources and fate of environmental radioactivity at the earth's surface. The aim is to link environmental radioactivity to RP in Africa, to describe the benefits to Africa from this field in terms of RP, safety and security policies, and to create a mission and a vision to fulfil the needs of ONE PEOPLE, ONE GOAL, ONE FAITH. Sources, processes and fate of environmental radioactivity: previous experience helps in setting up an African agenda. (1) Factors influencing cosmogenic radionuclides. (2) Factors influencing artificial radionuclides: (a) nuclear weapon tests, (b) nuclear accidents, (c) energy, mining and industrial waste. (3) Factors influencing the global Rn-222 and its daughters. (4) Dynamics of cycles of natural radioactivity, e.g. Pb-210. (5) Environmental radiotracers act as DIAGNOSTIC TOOLS to assess air and water quality and the impacts of the atmospheric and hydrospheric compartments on ecosystems. (6) Definition of base-lines for rehabilitation and protection. Climate influences the sources/behaviour/fate of environmental radioactivity. Impacts on life forms in Africa would be severe. Assessing environmental radioactivity resolves these issues.

  16. British battleships of world war one new revised edition

    CERN Document Server

    Burt, R A

    2012-01-01

    This superb reference book achieved the status of 'classic' soon after its first publication in 1986; it was soon out of print and is now one of the most sought-after naval reference books on the secondhand market.
It presents, in one superb volume, the complete technical history of British capital ship design and construction during the dreadnought era. One hundred years ago at Jutland, Dogger Bank, Heligoland Bight and the first battle for the Falklands, mighty squadrons of these great armoured ships fought their German counterparts for command of the seas. Beginning with Dreadnought, the book continues to the end of the First World War, and all of the fifty dreadnoughts, 'super-dreadnoughts' and battlecruisers that served the Royal Navy during this era are described and superbly illustrated with photographs and line drawings. 
Each class of ship is described in detail so that design origins, and technical and operational factors, are discussed alongside characteristics, with special emphasis on armament...

  17. Source Distribution Method for Unsteady One-Dimensional Flows With Small Mass, Momentum, and Heat Addition and Small Area Variation

    Science.gov (United States)

    Mirels, Harold

    1959-01-01

    A source distribution method is presented for obtaining flow perturbations due to small unsteady area variations, mass, momentum, and heat additions in a basic uniform (or piecewise uniform) one-dimensional flow. First, the perturbations due to an elemental area variation, mass, momentum, and heat addition are found. The general solution is then represented by a spatial and temporal distribution of these elemental (source) solutions. Emphasis is placed on discussing the physical nature of the flow phenomena. The method is illustrated by several examples. These include the determination of perturbations in basic flows consisting of (1) a shock propagating through a nonuniform tube, (2) a constant-velocity piston driving a shock, (3) ideal shock-tube flows, and (4) deflagrations initiated at a closed end. The method is particularly applicable for finding the perturbations due to relatively thin wall boundary layers.

  18. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    Science.gov (United States)

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  19. Multi-attribute bilateral bargaining in a one-to-many setting

    NARCIS (Netherlands)

    E.H. Gerding (Enrico); D.J.A. Somefun (Koye); J.A. La Poutré (Han)

    2005-01-01

    htmlabstractNegotiations are an important way of reaching agreements between selfish autonomous agents. In this paper we focus on one-to-many bargaining within the context of agent-mediated electronic commerce. We consider an approach where a seller negotiates over multiple interdependent attributes

  20. Goal setting as an outcome measure: A systematic review.

    Science.gov (United States)

    Hurn, Jane; Kneebone, Ian; Cropley, Mark

    2006-09-01

    Goal achievement has been considered to be an important measure of outcome by clinicians working with patients in physical and neurological rehabilitation settings. This systematic review was undertaken to examine the reliability, validity and sensitivity of goal setting and goal attainment scaling approaches when used with working age and older people. To review the reliability, validity and sensitivity of both goal setting and goal attainment scaling when employed as an outcome measure within a physical and neurological working age and older person rehabilitation environment, by examining the research literature covering the 36 years since goal-setting theory was proposed. Data sources included a computer-aided literature search of published studies examining the reliability, validity and sensitivity of goal setting/goal attainment scaling, with further references sourced from articles obtained through this process. There is strong evidence for the reliability, validity and sensitivity of goal attainment scaling. Empirical support was found for the validity of goal setting but research demonstrating its reliability and sensitivity is limited. Goal attainment scaling appears to be a sound measure for use in physical rehabilitation settings with working age and older people. Further work needs to be carried out with goal setting to establish its reliability and sensitivity as a measurement tool.
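
Goal attainment scaling, as reviewed above, is conventionally scored with the Kiresuk-Sherman T-score. A minimal sketch of that calculation (the ratings, weights, and the customary inter-goal correlation rho = 0.3 are illustrative conventions, not values taken from this review):

```python
from math import sqrt

def gas_t_score(ratings, weights=None, rho=0.3):
    """Kiresuk-Sherman goal attainment scaling T-score.

    ratings: per-goal attainment on the usual -2..+2 scale
             (0 = goal met exactly as expected)
    weights: per-goal importance weights (defaults to equal weights)
    rho:     assumed average inter-goal correlation (0.3 by convention)
    """
    if weights is None:
        weights = [1.0] * len(ratings)
    numerator = 10.0 * sum(w * x for w, x in zip(weights, ratings))
    denominator = sqrt((1.0 - rho) * sum(w * w for w in weights)
                       + rho * sum(weights) ** 2)
    return 50.0 + numerator / denominator

# Meeting every goal exactly scores the population mean of 50.
print(gas_t_score([0, 0, 0]))       # -> 50.0
# Exceeding two of three equally weighted goals pushes the score above 50.
print(gas_t_score([1, 1, 0]) > 50)  # -> True
```

By construction the score is standardised: across patients whose goals are set without systematic bias, T-scores are expected to centre on 50 with a standard deviation of 10.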

  1. Setting up of Nuclide GRAF-3S spark source mass spectrometer for the analysis of high purity materials

    International Nuclear Information System (INIS)

    Mahalingam, T.R.; Murugaiyan, P.; Soni, K.S.; Venkateswarlu, Ch.

    1975-01-01

    A spark source mass spectrometer model GRAF-35 manufactured by the Nuclide Corporation, U.S.A., was set up for analysis of nuclear-grade and high purity materials. The main difficulty with its successful operation was to achieve and maintain the required level of vacuum, i.e. less than 2×10⁻⁸ torr, in the magnetic analyser region. With a 100 l/s ion pump, the required vacuum could be achieved, but the spectrometer required periodic baking, which shortens the life of the instrument. The pumping system was replaced by an Ultek Boostivac pump - a combination of an ion pump (150 l/s) and a titanium sublimation pump (1000 l/s pumping speed for condensable vapours) - which eliminated baking, as the necessary level of vacuum could be easily achieved whenever required. Results of the analysis of zone-refined indium and uranium for trace impurities are given. (M.G.B.)

  2. Fifty years of progress in acoustic phonetics

    Science.gov (United States)

    Stevens, Kenneth N.

    2004-10-01

    Three events that occurred 50 or 60 years ago shaped the study of acoustic phonetics, and in the following few decades these events influenced research and applications in speech disorders, speech development, speech synthesis, speech recognition, and other subareas in speech communication. These events were: (1) the source-filter theory of speech production (Chiba and Kajiyama; Fant); (2) the development of the sound spectrograph and its interpretation (Potter, Kopp, and Green; Joos); and (3) the birth of research that related distinctive features to acoustic patterns (Jakobson, Fant, and Halle). Following these events there has been systematic exploration of the articulatory, acoustic, and perceptual bases of phonological categories, and some quantification of the sources of variability in the transformation of this phonological representation of speech into its acoustic manifestations. This effort has been enhanced by studies of how children acquire language in spite of this variability and by research on speech disorders. Gaps in our knowledge of this inherent variability in speech have limited the directions of applications such as synthesis and recognition of speech, and have led to the implementation of data-driven techniques rather than theoretical principles. Some examples of advances in our knowledge, and limitations of this knowledge, are reviewed.

  3. Characteristics and locations of sources

    International Nuclear Information System (INIS)

    Lahtinen, J.; Poellaenen, R.; Toivonen, H.

    1997-01-01

    Ten artificial radiation sources were placed in the terrain in order to test the capability of airborne measuring teams to detect them. One of the sources was a line source, others were point sources (three of them collimated). The radionuclides used in the sources were 60 Co, 137 Cs, 99m Tc and 192 Ir. The source activities ranged from about 26 MBq (one of the cobalt sources) to 0.56 TBq (iridium). (au)

  4. HARVESTING, INTEGRATING AND DISTRIBUTING LARGE OPEN GEOSPATIAL DATASETS USING FREE AND OPEN-SOURCE SOFTWARE

    Directory of Open Access Journals (Sweden)

    R. Oliveira

    2016-06-01

    Full Text Available Federal, State and Local government agencies in the USA are investing heavily in the dissemination of Open Data sets produced by each of them. The main driver behind this thrust is to increase agencies’ transparency and accountability, as well as to improve citizens’ awareness. However, not all Open Data sets are easy to access and integrate with other Open Data sets available even from the same agency. The City and County of Denver Open Data Portal distributes several types of geospatial datasets, one of which is the city parcels information containing 224,256 records. Although this data layer contains many pieces of information, it is incomplete for some custom purposes. Open-source software was used to first collect data from diverse City of Denver Open Data sets, then upload them to a repository in the Cloud where they were processed using a PostgreSQL installation on the Cloud and Python scripts. Our method was able to extract non-spatial information from a ‘not-ready-to-download’ source that could then be combined with the initial data set to enhance its potential use.
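
The enrichment step this abstract describes, joining a harvested base layer with non-spatial attributes scraped from a second Open Data source, can be sketched in SQL. The table and column names below are invented for illustration (the paper's actual Denver schema is not reproduced), and an in-memory sqlite3 database stands in for the Cloud PostgreSQL instance:

```python
import sqlite3

# In-memory stand-in for the Cloud PostgreSQL repository.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE parcels (parcel_id TEXT PRIMARY KEY, address TEXT);
CREATE TABLE permits (parcel_id TEXT, permit_type TEXT);
""")
# Rows harvested from two (hypothetical) Open Data sources.
con.executemany("INSERT INTO parcels VALUES (?, ?)",
                [("P1", "100 Main St"), ("P2", "200 Elm St")])
con.executemany("INSERT INTO permits VALUES (?, ?)",
                [("P1", "roofing"), ("P1", "solar"), ("P2", "fence")])

# Enhance the parcel layer with a permit count from the second source.
rows = con.execute("""
    SELECT p.parcel_id, p.address, COUNT(m.permit_type) AS n_permits
    FROM parcels p LEFT JOIN permits m USING (parcel_id)
    GROUP BY p.parcel_id, p.address
    ORDER BY p.parcel_id
""").fetchall()
print(rows)  # -> [('P1', '100 Main St', 2), ('P2', '200 Elm St', 1)]
```

The LEFT JOIN keeps every parcel even when the second source has no matching rows, which is the usual requirement when enriching an authoritative base layer.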

  5. Optimal energy window setting depending on the energy resolution for radionuclides used in gamma camera imaging. Planar imaging evaluation

    International Nuclear Information System (INIS)

    Kojima, Akihiro; Watanabe, Hiroyuki; Arao, Yuichi; Kawasaki, Masaaki; Takaki, Akihiro; Matsumoto, Masanori

    2007-01-01

    In this study, we examined whether the optimal energy window (EW) setting depending on an energy resolution of a gamma camera, which we previously proposed, is valid on planar scintigraphic imaging using Tl-201, Ga-67, Tc-99m, and I-123. Image acquisitions for line sources and paper sheet phantoms containing each radionuclide were performed in air and with scattering materials. For the six photopeaks excluding the Hg-201 characteristic x-rays' one, the conventional 20%-width energy window (EW20%) setting and the optimal energy window (optimal EW) setting (15%-width below 100 keV and 13%-width above 100 keV) were compared. For the Hg-201 characteristic x-rays' photopeak, the conventional on-peak EW20% setting was compared with the off-peak EW setting (73 keV-25%) and the wider off-peak EW setting (77 keV-29%). Image-count ratio (defined as the ratio of the image counts obtained with an EW and the total image counts obtained with the EW covered the whole photopeak for a line source in air), image quality, spatial resolutions (full width half maximum (FWHM) and full width tenth maximum (FWTM) values), count-profile curves, and defect-contrast values were compared between the conventional EW setting and the optimal EW setting. Except for the Hg-201 characteristic x-rays, the image-count ratios were 94-99% for the EW20% setting, but 78-89% for the optimal EW setting. However, the optimal EW setting reduced scatter fraction (defined as the scattered-to-primary counts ratio) effectively, as compared with the EW20% setting. Consequently, all the images with the optimal EW setting gave better image quality than ones with the EW20% setting. For the Hg-201 characteristic x-rays, the off-peak EW setting showed great improvement in image quality in comparison with the EW20% setting and the wider off-peak EW setting gave the best results. 
In conclusion, from our planar imaging study it was shown that although the optimal EW setting proposed by us gives less image-count ratio by
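
The window arithmetic compared throughout this record is simple to make concrete: an n%-width window spans the photopeak energy plus or minus n/2 percent. A small helper, assuming symmetric windows (140.5 keV is the standard Tc-99m photopeak; the off-peak centre/width pairs are quoted from the abstract):

```python
def energy_window(center_kev, width_pct):
    """Symmetric energy window: center +/- (width_pct / 2) percent of center."""
    half = center_kev * width_pct / 200.0
    return (center_kev - half, center_kev + half)

def optimal_window(center_kev):
    """Resolution-dependent widths proposed in the study:
    15% below 100 keV, 13% at or above 100 keV."""
    return energy_window(center_kev, 15 if center_kev < 100 else 13)

# Conventional EW20% vs. the optimal setting for the Tc-99m 140.5 keV peak.
print(energy_window(140.5, 20))
print(optimal_window(140.5))
# Off-peak settings quoted for the Hg-201 characteristic x-rays.
print(energy_window(73, 25))   # "73 keV-25%"
print(energy_window(77, 29))   # "77 keV-29%"
```

For the 140.5 keV peak, the EW20% setting spans roughly 126.5-154.6 keV, while the 13% optimal window narrows this to about 131.4-149.6 keV, which is how the optimal setting trades a few percent of image counts for a lower scatter fraction.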

  6. Crowd-Sourcing the Aesthetics of Platform Games

    DEFF Research Database (Denmark)

    Shaker, Noor; Yannakakis, Georgios N.; Togelius, Julian

    2014-01-01

    What are the aesthetics of platform games and what makes a platform level engaging, challenging and/or frustrating? We attempt to answer such questions through mining a large-set of crowd-sourced gameplay data of a clone of the classic platform game Super Mario Bros. The data consists of 40 short...... game levels that differ along six key level design parameters. Collectively, these levels are played 1560 times over the Internet and the perceived experience is annotated by experiment participants via self-reported ranking (pairwise preferences). Given the wealth of this crowd-sourced data, as all...... details about players’ in-game behaviour are logged, the problem becomes one of extracting meaningful numerical features at the appropriate level of abstraction for the construction of generic computational models of player experience and, thereby, game aesthetics. We explore dissimilar types of features...

  7. Complementarity among climate related energy sources: Sensitivity study to climate characteristics across Europe

    Science.gov (United States)

    Francois, Baptiste; Hingray, Benoit; Creutin, Jean-Dominique; Raynaud, Damien; Borga, Marco; Vautard, Robert

    2015-04-01

    Climate related energy sources like solar-power, wind-power and hydro-power are important contributors to the transitions to a low-carbon economy. Past studies, mainly based on solar and wind powers, showed that the power from such energy sources fluctuates in time and space following their driving climatic variables. However, when combining different energy sources together, their intermittent feature is smoothed, resulting to lower time variability of the produced power and to lower storage capacity required for balancing. In this study, we consider solar, wind and hydro energy sources in a 100% renewable Europe using a set of 12 regions following two climate transects, the first one going from the Northern regions (Norway, Finland) to the Southern ones (Greece, Andalucía, Tunisia) and the second one going from the oceanic climate (West of France, Galicia) to the continental one (Romania, Belorussia). For each of those regions, we combine wind and solar irradiance data from the Weather Research and Forecasting Model (Vautard et al., 2014), temperature data from the European Climate Assessment & Dataset (Haylock et al., 2008) and runoff from the Global Runoff Data Center (GRDC, 1999) for estimating solar-power, wind-power, run-of-the-river hydro-power and the electricity demand over a time period of 30 years. The use of this set of 12 regions across Europe allows integrating knowledge about time and space variability for each different energy sources. We then assess the optimal share of each energy sources, aiming to decrease the time variability of the regional energy balance at different time scales as well as the energy storage required for balancing within each region. We also evaluate how energy transport among regions contributes for smoothing out both the energy balance and the storage requirement. The strengths of this study are i) to handle with run-of-the-river hydro power in addition to wind and solar energy sources and ii) to carry out this analysis
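
The optimal-share question posed in this abstract, choosing mix weights that minimise the variability of the regional energy balance, can be sketched as a grid search over shares. The synthetic generation and demand profiles below are toy placeholders for the WRF, ECA&D and GRDC series used in the study:

```python
import math

def balance_std(shares, sources, demand):
    """Standard deviation of (mixed generation - demand) for given shares."""
    mix = [sum(s * src[t] for s, src in zip(shares, sources))
           for t in range(len(demand))]
    resid = [m - d for m, d in zip(mix, demand)]
    mean = sum(resid) / len(resid)
    return math.sqrt(sum((r - mean) ** 2 for r in resid) / len(resid))

def best_shares(sources, demand, step=0.05):
    """Grid search over (solar, wind, hydro) shares summing to 1."""
    best = None
    n = int(round(1 / step))
    for i in range(n + 1):
        for j in range(n + 1 - i):
            s = (i * step, j * step, 1 - (i + j) * step)
            score = balance_std(s, sources, demand)
            if best is None or score < best[0]:
                best = (score, s)
    return best

# Toy anti-correlated profiles: solar peaks when wind is low, and vice versa.
solar  = [0, 2, 4, 2, 0, 0]
wind   = [4, 2, 0, 2, 4, 4]
hydro  = [2, 2, 2, 2, 2, 2]
demand = [2, 2, 2, 2, 2, 2]
score, shares = best_shares([solar, wind, hydro], demand)
print(shares, round(score, 3))  # -> (0.0, 0.0, 1.0) 0.0
```

With these toy profiles the optimum is degenerate: flat hydro alone balances the flat demand exactly, but so does a 50/50 solar-wind mix, because the two intermittent sources are perfectly anti-correlated and smooth each other out, which is precisely the complementarity effect the study quantifies across its 12 European regions.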

  8. The Advanced Light Source Upgrade

    International Nuclear Information System (INIS)

    Chemla, Daniel S.; Feinberg, Benjamin; Hussain, Zahid; Krebs, Gary F.; Padmore, Howard A.; Robin, David S.; Robinson, Arthur L.; Smith, Neville V.

    2003-01-01

    The ALS, a third-generation synchrotron light source at Berkeley Lab, has been operating for almost a decade and is generating forefront science by exploiting the high brightness of a third-generation source in three areas: (1) high resolving power for spectroscopy; (2) high spatial resolution for microscopy and spectromicroscopy; and (3) high coherence for experiments such as speckle. However, the ALS was one of the first third-generation machines to be designed, and accelerator and insertion-device technology have significantly changed since its conception. As a result, its performance will inevitably be outstripped by newer, more advanced sources. To remain competitive and then set a new standard, the performance of the ALS, in particular its brightness, must be enhanced. Substantial improvements in brightness and current have always been feasible in principle, but they incur the penalty of a much reduced lifetime, which is totally unacceptable to our users. Significant brightness improvements can be realized in the core soft x-ray region by going to top-off operation, where injection would be quasi-continuous and the lifetime objections disappear. In top-off mode with higher average current, a reduced vertical emittance and beta function, and small-gap permanent-magnet or superconducting insertion devices, one to two orders of magnitude improvement in brightness can be had in the soft x-ray range. These improvements also extend the high energy range of the undulator radiation beyond the current limit of 2000 eV. Descriptions of the upgrade and the important new science achievable are presented

  9. A New Source Biasing Approach in ADVANTG

    International Nuclear Information System (INIS)

    Bevill, Aaron M.; Mosher, Scott W.

    2012-01-01

    The ADVANTG code has been developed at Oak Ridge National Laboratory to generate biased sources and weight window maps for MCNP using the CADIS and FW-CADIS methods. In preparation for an upcoming RSICC release, a new approach for generating a biased source has been developed. This improvement streamlines user input and improves reliability. Previous versions of ADVANTG generated the biased source from ADVANTG input, writing an entirely new general fixed-source definition (SDEF). Because volumetric sources were translated into SDEF-format as a finite set of points, the user had to perform a convergence study to determine whether the number of source points used accurately represented the source region. Further, the large number of points that must be written in SDEF-format made the MCNP input and output files excessively long and difficult to debug. ADVANTG now reads SDEF-format distributions and generates corresponding source biasing cards, eliminating the need for a convergence study. Many problems of interest use complicated source regions that are defined using cell rejection. In cell rejection, the source distribution in space is defined using an arbitrarily complex cell and a simple bounding region. Source positions are sampled within the bounding region but accepted only if they fall within the cell; otherwise, the position is resampled entirely. When biasing in space is applied to sources that use rejection sampling, current versions of MCNP do not account for the rejection in setting the source weight of histories, resulting in an 'unfair game'. This problem was circumvented in previous versions of ADVANTG by translating volumetric sources into a finite set of points, which does not alter the mean history weight (w̄). To use biasing parameters without otherwise modifying the original cell-rejection SDEF-format source, ADVANTG users now apply a correction factor for w̄ in post-processing. 
A stratified-random sampling approach in ADVANTG is under
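
The 'unfair game' described above can be illustrated numerically: when source positions are biased in space and then thinned by cell rejection, the surviving histories no longer have unit mean weight, and dividing tallies by the observed mean history weight restores a fair estimate. A toy one-dimensional sketch (the geometry, biased density, and tally below are invented for illustration and are not ADVANTG's actual scheme):

```python
import random
import statistics

random.seed(42)

N = 100_000
weights, scores = [], []
accepted = 0
while accepted < N:
    # Biased spatial sampling over the bounding region [0, 1]: pdf q(x) = 2x.
    x = random.random() ** 0.5      # inverse-CDF sampling of q
    if not (0.1 <= x < 0.5):        # cell rejection: the source cell is [0.1, 0.5)
        continue                    # position is resampled entirely
    accepted += 1
    w = 1.0 / (2.0 * x)             # analog pdf (uniform, 1.0) over biased pdf q(x)
    weights.append(w)
    scores.append(w * x * x)        # tally an example response f(x) = x^2

w_bar = statistics.fmean(weights)   # observed mean history weight (not 1!)
naive = statistics.fmean(scores)    # the "unfair game": inflated by w_bar
corrected = naive / w_bar           # post-processing correction
print(round(w_bar, 3), round(naive, 3), round(corrected, 3))
```

For this geometry the mean weight converges to 5/3, and the corrected tally converges to the analog answer (the mean of x^2 over the cell, 0.1033...), while the uncorrected tally overestimates it by exactly the factor w̄.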

  10. A simple 2D SOC model for one of the main sources of geomagnetic disturbances: Flares

    International Nuclear Information System (INIS)

    Meirelles, M.C.; Dias, V.H.A.; Oliva, D.; Papa, A.R.R.

    2010-01-01

    We introduce a simple model for solar flares, one of the main sources of geomagnetic disturbances. We have obtained power-laws for the probability distribution functions of some relevant physical characteristics of flares which could serve as the fingerprint of a critical state at the base of such phenomena and, given that we have not introduced a fine-tuning mechanism, of self-organized criticality. We compare our results with some recent experimental work on the statistics of flares and analyze the possible connection of these power laws with others already found by our group in geomagnetic disturbances distributions. We also present some limitations of our model as well as possible extensions and corrections to be taken into account in future works.
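
The class of model described, a slowly driven lattice that relaxes through avalanches and yields power-law event statistics without parameter tuning, can be illustrated with a minimal Bak-Tang-Wiesenfeld-style 2D sandpile. This generic sketch is not the authors' specific flare model; the lattice size, drive count, and toppling threshold are arbitrary choices:

```python
import random

def sandpile(size=20, grains=20_000, zc=4, seed=1):
    """Drive a 2-D lattice one grain at a time; record avalanche sizes."""
    random.seed(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        i, j = random.randrange(size), random.randrange(size)
        grid[i][j] += 1                      # slow external driving
        avalanche = 0
        unstable = [(i, j)] if grid[i][j] >= zc else []
        while unstable:
            x, y = unstable.pop()
            if grid[x][y] < zc:              # already relaxed by an earlier pop
                continue
            grid[x][y] -= zc                 # topple: redistribute zc grains
            avalanche += 1                   # to the four neighbours
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < size and 0 <= ny < size:
                    grid[nx][ny] += 1
                    if grid[nx][ny] >= zc:
                        unstable.append((nx, ny))
                # grains pushed past the boundary are dissipated
        if avalanche:
            sizes.append(avalanche)
    return sizes

sizes = sandpile()
# Heavy-tailed size distribution: many small events, rare large ones.
print(len(sizes), max(sizes), sorted(sizes)[len(sizes) // 2])
```

Once the lattice reaches its stationary (critical) state, a histogram of `sizes` on log-log axes shows the approximately straight line whose slope is the kind of power-law exponent the paper compares against flare observations.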

  11. The Standard Days Method(®): efficacy, satisfaction and demand at regular family planning service delivery settings in Turkey.

    Science.gov (United States)

    Kursun, Zerrin; Cali, Sanda; Sakarya, Sibel

    2014-06-01

    To evaluate the demand, efficacy, and satisfaction concerning the Standard Days Method(®) (SDM; a fertility awareness method) as an option presented among other contraceptive methods at regular service delivery settings. The survey group consisted of 993 women who presented at the primary care units in Umraniye District of Istanbul, Turkey, between 1 October 2006 and 31 March 2008, and started to use a new method. Women were enrolled until reaching a limit of 250 new users for each method, or expiration of the six-month registration period. Participants were followed for up to one year of method use. The characteristics of women who chose the SDM were similar to those of participants who opted for other methods. The most common reasons for selecting it were that it is natural and causes no side effects. Fifty-one percent used the SDM for the full year, compared to 71% who chose an intrauterine device (IUD). Continuation rates were significantly lower for all other methods. During the one-year follow-up period, 12% of SDM-, 7% of pill-, 7% of condom-, 3% of monthly injection-, 1% of quarterly injection-, and 0.5% of IUD users became pregnant. The SDM had relatively high continuation rates and relatively good levels of satisfaction among participants and their husbands. It should be mentioned among the routinely offered contraceptive methods.

  12. Characteristics and locations of sources

    Energy Technology Data Exchange (ETDEWEB)

    Lahtinen, J.; Poellaenen, R.; Toivonen, H. [Finnish Centre for Radiation and Nuclear Safety, Helsinki (Finland)

    1997-12-31

    Ten artificial radiation sources were placed in the terrain in order to test the capability of airborne measuring teams to detect them. One of the sources was a line source, others were point sources (three of them collimated). The radionuclides used in the sources were {sup 60}Co, {sup 137}Cs, {sup 99m}Tc and {sup 192}Ir. The source activities ranged from about 26 MBq (one of the cobalt sources) to 0.56 TBq (iridium). (au).

  14. Proceedings of the fifty sixth congress of Indian Society of Theoretical and Applied Mechanics: souvenir

    International Nuclear Information System (INIS)

    Pradhan, V.H.

    2011-01-01

    The fifty-sixth congress of the Indian Society of Theoretical and Applied Mechanics was organized during December 19-21, 2011. Mathematics has played a key role in the development of society, from the use of geometry in land measurement to satellite launching. Appropriate mathematical tools provide the right answers to real-world problems, which have multiplied many-fold with the advancement of the various branches of applied science, engineering and technology and the development of high-speed digital supercomputers. Topics such as wireless communication, health systems, financing, budgeting, planning, management and decision making, which play an important role in day-to-day life, were covered in this meeting. Papers relevant to INIS have been indexed separately

  15. Effects of various cone-beam computed tomography settings on the detection of recurrent caries under restorations in extracted primary teeth

    OpenAIRE

    Kamburoğlu, Kıvanç; Sönmez, Gül; Berktaş, Zeynep Serap; Kurt, Hakan; Özen, Doğukan

    2017-01-01

    Purpose The aim of this study was to assess the ex vivo diagnostic ability of 9 different cone-beam computed tomography (CBCT) settings in the detection of recurrent caries under amalgam restorations in primary teeth. Materials and Methods Fifty-two primary teeth were used. Twenty-six teeth had dentine caries and 26 teeth did not have dentine caries. Black class II cavities were prepared and restored with amalgam. In the 26 carious teeth, recurrent caries were left under restorations. The oth...

  16. Soft-to-hard templating to well-dispersed N-doped mesoporous carbon nanospheres via one-pot carbon/silica source copolymerization

    Institute of Scientific and Technical Information of China (English)

    Qinglu Kong; Lingxia Zhang; Min Wang; Mengli Li; Heliang Yao; Jianlin Shi

    2016-01-01

    Here we report a new approach, referred to as a "soft-to-hard templating" strategy, via the copolymerization of a carbon source (dopamine) and a silica source (tetraethyl orthosilicate) for the synthesis of well-dispersed N-doped mesoporous carbon nanospheres (MCNs), which exhibit high performance as electrochemical supercapacitors. This method overcomes the shortcomings of uncontrolled dispersity and complicated procedures of soft- and hard-templating methods, respectively. Moreover, the synthesized MCNs feature enriched heteroatom N-doping and easy functionalization by noble-metal nanoparticles during the one-pot synthesis. All the above characteristics make the as-prepared MCNs a promising platform for a variety of applications. To demonstrate the applicability of the synthesized nitrogen-doped MCNs, this material has been employed as an electrode for a high-performance electrochemical supercapacitor, which shows capacitances of 223 and 140 F/g at current densities of 0.5 and 10 A/g in 1 mol/L KOH electrolyte, respectively.

  17. Multiple time-reversed guide-sources in shallow water

    Science.gov (United States)

    Gaumond, Charles F.; Fromm, David M.; Lingevitch, Joseph F.; Gauss, Roger C.; Menis, Richard

    2003-10-01

    Detection in a monostatic, broadband, active sonar system in shallow water is degraded by propagation-induced spreading. The detection improvement from multiple spatially separated guide sources (GSs) is presented as a method to mitigate this degradation. The improvement of detection by using information from a set of one-way transmissions from a variety of positions is shown using sea data. The experimental area is south of the Hudson Canyon off the coast of New Jersey. The data were taken using five elements of a time-reversing VLA. The five elements were contiguous and at midwater depth. The target and guide source was an echo repeater positioned at various ranges and at middepth. The transmitted signals were 3.0- to 3.5-kHz LFMs. The data are analyzed to show the amount of information present in the collection, a baseline probability of detection (PD) not using the collection of GS signals, and the improvement in PD from the use of various sets of GS signals. The dependence of the improvement on range is also shown. [The authors acknowledge support from Dr. Jeffrey Simmen, ONR321OS, and the chief scientist Dr. Charles Holland. Work supported by ONR.]

  18. [Explicit memory for type font of words in source monitoring and recognition tasks].

    Science.gov (United States)

    Hatanaka, Yoshiko; Fujita, Tetsuya

    2004-02-01

    We investigated whether people can consciously remember the type fonts of words, using two methods of examining explicit memory: source monitoring and old/new recognition. We set up matched, non-matched, and non-studied conditions between the study and test words using two type fonts: Gothic and MARU. After studying words in one way of encoding, semantic or physical, subjects in a source-monitoring task made a three-way discrimination between new words, Gothic words, and MARU words (Exp. 1). Subjects in an old/new-recognition task indicated whether test words had been previously presented or not (Exp. 2). We compared the source judgments with the old/new recognition data. These data showed conscious recollection of the type font of words in the source-monitoring task, and a dissociation between source-monitoring and old/new-recognition performance.

  19. Benefits and problems of health-care robots in aged care settings: A comparison trial.

    Science.gov (United States)

    Broadbent, Elizabeth; Kerse, Ngaire; Peri, Kathryn; Robinson, Hayley; Jayawardena, Chandimal; Kuo, Tony; Datta, Chandan; Stafford, Rebecca; Butler, Haley; Jawalkar, Pratyusha; Amor, Maddy; Robins, Ben; MacDonald, Bruce

    2016-03-01

    This study investigated whether multiple health-care robots could have any benefits or cause any problems in an aged care facility. Fifty-three residents and 53 staff participated in a non-randomised controlled trial over 12 weeks. Six robots provided entertainment, communication and health-monitoring functions in staff rooms and activity lounges. These settings were compared to control settings without robots. There were no significant differences between groups in resident or staff outcomes, except a significant increase in job satisfaction in the control group only. The intervention group perceived the robots had more agency and experience than the control group did. Perceived agency of the robots decreased over time in both groups. Overall, we received very mixed responses with positive, neutral and negative comments. The robots had no major benefits or problems. Future research could give robots stronger operational roles, use more specific outcome measures, and perform cost-benefit analyses. © 2015 AJA Inc.

  20. Identifying Sources of Scientific Knowledge: classifying non-source items in the WoS

    Energy Technology Data Exchange (ETDEWEB)

    Calero-Medina, C.M.

    2016-07-01

    The sources of scientific knowledge can be tracked using the references in scientific publications. For instance, the publications from the scientific journals covered by the Web of Science database (WoS) contain references to publications for which an indexed source record exists in the WoS (source items) or to publications for which an indexed source record does not exist in the WoS (non-source items). The classification of the non-source items is the main objective of the work in progress presented here. Other scholars have classified and identified non-source items for different purposes (e.g. Butler & Visser (2006); Lisée, Larivière & Archambault (2008); Nederhof, van Leeuwen & van Raan (2010); Hicks & Wang (2013); Boyack & Klavans (2014)). But these studies focus on specific source types, fields, or sets of papers. The work presented here is much broader in terms of the number of publications, source types, and fields. (Author)

  1. Zebrafish Expression Ontology of Gene Sets (ZEOGS): A Tool to Analyze Enrichment of Zebrafish Anatomical Terms in Large Gene Sets

    Science.gov (United States)

    Marsico, Annalisa

    2013-01-01

    Abstract The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few to no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene

  2. Zebrafish Expression Ontology of Gene Sets (ZEOGS): a tool to analyze enrichment of zebrafish anatomical terms in large gene sets.

    Science.gov (United States)

    Prykhozhij, Sergey V; Marsico, Annalisa; Meijsing, Sebastiaan H

    2013-09-01

    The zebrafish (Danio rerio) is an established model organism for developmental and biomedical research. It is frequently used for high-throughput functional genomics experiments, such as genome-wide gene expression measurements, to systematically analyze molecular mechanisms. However, the use of whole embryos or larvae in such experiments leads to a loss of the spatial information. To address this problem, we have developed a tool called Zebrafish Expression Ontology of Gene Sets (ZEOGS) to assess the enrichment of anatomical terms in large gene sets. ZEOGS uses gene expression pattern data from several sources: first, in situ hybridization experiments from the Zebrafish Model Organism Database (ZFIN); second, it uses the Zebrafish Anatomical Ontology, a controlled vocabulary that describes connected anatomical structures; and third, the available connections between expression patterns and anatomical terms contained in ZFIN. Upon input of a gene set, ZEOGS determines which anatomical structures are overrepresented in the input gene set. ZEOGS allows one for the first time to look at groups of genes and to describe them in terms of shared anatomical structures. To establish ZEOGS, we first tested it on random gene selections and on two public microarray datasets with known tissue-specific gene expression changes. These tests showed that ZEOGS could reliably identify the tissues affected, whereas few to no enriched terms were found in the random gene sets. Next we applied ZEOGS to microarray datasets of 24 and 72 h postfertilization zebrafish embryos treated with beclomethasone, a potent glucocorticoid. This analysis resulted in the identification of several anatomical terms related to glucocorticoid-responsive tissues, some of which were stage-specific. Our studies highlight the ability of ZEOGS to extract spatial information from datasets derived from whole embryos, indicating that ZEOGS could be a useful tool to automatically analyze gene expression

  3. DC Motor control using motor-generator set with controlled generator field

    Science.gov (United States)

    Belsterling, Charles A.; Stone, John

    1982-01-01

    A d.c. generator is connected in series, opposed to the polarity of a d.c. power source supplying a d.c. drive motor. The generator is part of a motor-generator set whose motor is supplied from the same power source. A generator field control means varies the field produced by at least one of the generator windings in order to change the effective voltage output. When the generator voltage is exactly equal to the d.c. supply voltage, no voltage is applied across the drive motor. As the field of the generator is reduced, the drive motor is supplied a greater voltage, until the full voltage of the d.c. power source is supplied when the generator has zero field applied. Additional voltage may be applied across the drive motor by reversing and increasing the reversed field on the generator. The drive motor may be reversed in direction from standstill by increasing the generator field so that a reverse voltage is applied across the d.c. motor.
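    The series-opposition arrangement in this record reduces to simple arithmetic: the drive motor sees the supply voltage minus the generator's output. A minimal sketch (the voltages are illustrative assumptions, not figures from the patent):

```python
def motor_voltage(v_supply: float, v_generator: float) -> float:
    """Voltage across the drive motor when the generator is connected
    in series opposition to the d.c. supply."""
    return v_supply - v_generator

# Full generator output cancels the supply: the motor sees 0 V.
print(motor_voltage(240.0, 240.0))   # 0.0
# Zero generator field: the motor sees the full supply voltage.
print(motor_voltage(240.0, 0.0))     # 240.0
# Reversed generator field adds to the supply voltage.
print(motor_voltage(240.0, -60.0))   # 300.0
```

    Reversing the sign of the generator voltage (reversed field) models the "additional voltage" case described in the abstract.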

  4. An artificial neural network approach to reconstruct the source term of a nuclear accident

    International Nuclear Information System (INIS)

    Giles, J.; Palma, C. R.; Weller, P.

    1997-01-01

    This work makes use of one of the main features of artificial neural networks, which is their ability to 'learn' from sets of known input and output data. Indeed, a trained artificial neural network can be used to make predictions on the input data when the output is known, and this feedback process enables one to reconstruct the source term from field observations. With this aim, an artificial neural network has been trained, using the projections of a segmented plume atmospheric dispersion model at fixed points, simulating a set of gamma detectors located outside the perimeter of a nuclear facility. The resulting set of artificial neural networks was used to determine the release fraction and rate for each of the noble gases, iodines and particulate fission products that could originate from a nuclear accident. Model projections were made using a large data set consisting of effective release height, release fraction of noble gases, iodines and particulate fission products, atmospheric stability, wind speed and wind direction. The model computed nuclide-specific gamma dose rates. The locations of the detectors were chosen taking into account both building shine and wake effects, and varied in distance between 800 and 1200 m from the reactor. The inputs to the artificial neural networks consisted of the measurements from the detector array, atmospheric stability, wind speed and wind direction; the outputs comprised a set of release fractions and heights. Once trained, the artificial neural networks were used to reconstruct the source term from the detector responses for data sets not used in training. The preliminary results are encouraging and show that the noble gases and particulate fission product release fractions are well determined.
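    The train-then-invert workflow described in this abstract can be sketched with scikit-learn. Everything below is an assumption for illustration: a random linear map stands in for the segmented-plume dispersion model, 16 detectors and 4 release parameters replace the real geometry, and `MLPRegressor` replaces the authors' network:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical forward model standing in for the dispersion code:
# detector dose rates are a fixed random linear function of the
# release parameters.
n_detectors, n_params = 16, 4
A = rng.random((n_params, n_detectors))

X_release = rng.random((500, n_params))   # known source terms
Y_dose = X_release @ A                    # simulated detector readings

# Train the inverse mapping: detector readings -> source term.
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=3000, random_state=0)
net.fit(Y_dose, X_release)

# Reconstruct the source term for an unseen release.
truth = rng.random((1, n_params))
estimate = net.predict(truth @ A)
print(np.abs(estimate - truth).max())  # typically a small reconstruction error
```

    The real problem is harder (nonlinear dispersion physics, noisy detectors), but the structure is the same: simulate forward, train the network on (output, input) pairs, then run detector data through it.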

  5. A simple mass-conserved level set method for simulation of multiphase flows

    Science.gov (United States)

    Yuan, H.-Z.; Shu, C.; Wang, Y.; Shu, S.

    2018-04-01

    In this paper, a modified level set method is proposed for the simulation of multiphase flows with large density ratios and high Reynolds numbers. The present method simply introduces a source or sink term into the level set equation to compensate for mass loss or offset mass increase. The source or sink term is derived analytically by applying the mass conservation principle to the level set equation and the continuity equation of the flow field. Since only a source term is introduced, the application of the present method is as simple as the original level set method, but it can guarantee overall mass conservation. To validate the present method, the vortex flow problem is first considered. The simulation results are compared with those from the original level set method, which demonstrates that the modified level set method has the capability of accurately capturing the interface and keeping the mass conserved. Then, the proposed method is further validated by simulating the Laplace law, the merging of two bubbles, a bubble rising with a high density ratio, and Rayleigh-Taylor instability with a high Reynolds number. Numerical results show that mass is well conserved by the present method.
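    A plausible form of the modification described here, with notation assumed rather than taken from the paper: the level set advection equation gains a source term S chosen so that the volume of the tracked phase (and hence its mass, for constant phase density) stays constant:

```latex
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = S,
\qquad
S \ \text{chosen such that}\ \
\frac{\mathrm{d}}{\mathrm{d}t}\int_{\Omega} H(\phi)\,\mathrm{d}V = 0,
```

    where \(\phi\) is the level set function, \(\mathbf{u}\) the flow velocity, and \(H\) the Heaviside function marking the tracked phase; setting \(S = 0\) recovers the original level set equation.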

  6. BEAMLINE-CONTROLLED STEERING OF SOURCE-POINT ANGLE AT THE ADVANCED PHOTON SOURCE

    Energy Technology Data Exchange (ETDEWEB)

    Emery, L.; Fystro, G.; Shang, H.; Smith, M.

    2017-06-25

    An EPICS-based steering software system has been implemented for beamline personnel to directly steer the angle of the synchrotron radiation sources at the Advanced Photon Source. A script running on a workstation monitors "start steering" beamline EPICS records, and effects a steering given by the value of the "angle request" EPICS record. The new system makes the steering process much faster than before, although the older steering protocols can still be used. The robustness features of the original steering remain. Feedback messages are provided to the beamlines and the accelerator operators. Underpinning this new steering protocol is the recent refinement of the global orbit feedback process whereby feedforward of dipole corrector set points and orbit set points are used to create a local steering bump in a rapid and seamless way.

  7. Fifty years of polymer science

    NARCIS (Netherlands)

    Du Prez, Filip; Hoogenboom, Richard; Klumperman, Bert; Meier, Michael; Monteiro, Michael; Müller, Alejandro; Vancso, Gyula J.

    2015-01-01

    The European Polymer Journal (EPJ) has been serving the scientific community for 50 years, which makes it one of the older macromolecular journals with a broad focus. Since its launch 50 years ago, EPJ has provided a distinguished forum for publications in polymer research, including chemistry,

  8. Sources of Artefacts in Synthetic Aperture Radar Interferometry Data Sets

    Science.gov (United States)

    Becek, K.; Borkowski, A.

    2012-07-01

    In recent years, much attention has been devoted to digital elevation models (DEMs) produced using Synthetic Aperture Radar Interferometry (InSAR). This has been triggered by the relative novelty of the InSAR method and its world-famous product—the Shuttle Radar Topography Mission (SRTM) DEM. However, much less attention, if at all, has been paid to sources of artefacts in SRTM. In this work, we focus not on the missing pixels (null pixels) due to shadows or the layover effect, but rather on outliers that were undetected by the SRTM validation process. The aim of this study is to identify some of the causes of the elevation outliers in SRTM. Such knowledge may be helpful to mitigate similar problems in future InSAR DEMs, notably the ones currently being developed from data acquired by the TanDEM-X mission. We analysed many cross-sections derived from SRTM. These cross-sections were extracted over the elevation test areas, which are available from the Global Elevation Data Testing Facility (GEDTF) whose database contains about 8,500 runways with known vertical profiles. Whenever a significant discrepancy between the known runway profile and the SRTM cross-section was detected, a visual interpretation of the high-resolution satellite image was carried out to identify the objects causing the irregularities. A distance and a bearing from the outlier to the object were recorded. Moreover, we considered the SRTM look direction parameter. A comprehensive analysis of the acquired data allows us to establish that large metallic structures, such as hangars or car parking lots, are causing the outliers. Water areas or plain wet terrains may also cause an InSAR outlier. The look direction and the depression angle of the InSAR system in relation to the suspected objects influence the magnitude of the outliers. We hope that these findings will be helpful in designing the error detection routines of future InSAR or, in fact, any microwave aerial- or space-based survey. The

  9. SOURCES OF ARTEFACTS IN SYNTHETIC APERTURE RADAR INTERFEROMETRY DATA SETS

    Directory of Open Access Journals (Sweden)

    K. Becek

    2012-07-01

    Full Text Available In recent years, much attention has been devoted to digital elevation models (DEMs) produced using Synthetic Aperture Radar Interferometry (InSAR). This has been triggered by the relative novelty of the InSAR method and its world-famous product—the Shuttle Radar Topography Mission (SRTM) DEM. However, much less attention, if at all, has been paid to sources of artefacts in SRTM. In this work, we focus not on the missing pixels (null pixels) due to shadows or the layover effect, but rather on outliers that were undetected by the SRTM validation process. The aim of this study is to identify some of the causes of the elevation outliers in SRTM. Such knowledge may be helpful to mitigate similar problems in future InSAR DEMs, notably the ones currently being developed from data acquired by the TanDEM-X mission. We analysed many cross-sections derived from SRTM. These cross-sections were extracted over the elevation test areas, which are available from the Global Elevation Data Testing Facility (GEDTF) whose database contains about 8,500 runways with known vertical profiles. Whenever a significant discrepancy between the known runway profile and the SRTM cross-section was detected, a visual interpretation of the high-resolution satellite image was carried out to identify the objects causing the irregularities. A distance and a bearing from the outlier to the object were recorded. Moreover, we considered the SRTM look direction parameter. A comprehensive analysis of the acquired data allows us to establish that large metallic structures, such as hangars or car parking lots, are causing the outliers. Water areas or plain wet terrains may also cause an InSAR outlier. The look direction and the depression angle of the InSAR system in relation to the suspected objects influence the magnitude of the outliers. We hope that these findings will be helpful in designing the error detection routines of future InSAR or, in fact, any microwave aerial- or space

  10. Study of cover source mismatch in steganalysis and ways to mitigate its impact

    Science.gov (United States)

    Kodovský, Jan; Sedighi, Vahid; Fridrich, Jessica

    2014-02-01

    When a steganalysis detector trained on one cover source is applied to images from a different source, generally the detection error increases due to the mismatch between both sources. In steganography, this situation is recognized as the so-called cover source mismatch (CSM). The drop in detection accuracy depends on many factors, including the properties of both sources, the detector construction, the feature space used to represent the covers, and the steganographic algorithm. Although well recognized as the single most important factor negatively affecting the performance of steganalyzers in practice, the CSM received surprisingly little attention from researchers. One of the reasons for this is the diversity with which the CSM can manifest. On a series of experiments in the spatial and JPEG domains, we refute some of the common misconceptions that the severity of the CSM is tied to the feature dimensionality or their "fragility." The CSM impact on detection appears too difficult to predict due to the effect of complex dependencies among the features. We also investigate ways to mitigate the negative effect of the CSM using simple measures, such as by enlarging the diversity of the training set (training on a mixture of sources) and by employing a bank of detectors trained on multiple different sources and testing on a detector trained on the closest source.
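    The mixture-training mitigation mentioned at the end of this abstract can be illustrated on synthetic data. Everything here is a toy stand-in: two Gaussian "sources" replace real cover feature distributions, a fixed offset models the embedding changes, and logistic regression replaces a real steganalyzer:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

def make_source(n, shift):
    """Toy 'cover source': 2-D features; stego samples (label 1) sit at a
    fixed offset from covers (label 0). `shift` models source-specific
    differences between acquisition pipelines."""
    covers = rng.normal(shift, 1.0, size=(n, 2))
    stegos = rng.normal(shift + 1.5, 1.0, size=(n, 2))
    return np.vstack([covers, stegos]), np.array([0] * n + [1] * n)

Xa, ya = make_source(500, shift=0.0)   # training source
Xb, yb = make_source(500, shift=2.0)   # mismatched test source

# Detector trained on source A only: the CSM degrades accuracy on B.
single = LogisticRegression().fit(Xa, ya)
acc_mismatch = single.score(Xb, yb)

# Mitigation: enlarge the diversity of the training set (mix both sources).
mixed = LogisticRegression().fit(np.vstack([Xa, Xb]), np.hstack([ya, yb]))
acc_mixed = mixed.score(Xb, yb)
print(acc_mismatch, acc_mixed)  # the mixed-source detector does better on B
```

    Real steganalysis features are high-dimensional and the dependencies are far more complex, which is exactly why the paper finds the CSM impact hard to predict; the sketch only shows the direction of the mixture-training effect.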

  11. Source contributions of fine particulate matter during one winter haze episodes in Xi'an, China

    Science.gov (United States)

    Yang, X.; Wu, Q.

    2017-12-01

    Long-term exposure to high levels of fine particulate matter (PM2.5) is associated with adverse effects on human health, the ecological environment, and climate change. Identification of the major source regions of fine particulate matter is essential to proposing proper joint prevention and control strategies for heavy haze mitigation. In this work, the Comprehensive Air Quality Model with extensions (CAMx), together with the Particulate Source Apportionment Technology (PSAT) and the Weather Research and Forecast Model (WRF), has been applied to analyze the major source regions of PM2.5 in Xi'an during the heavy haze episodes in winter (29 December 2016 - 5 January 2017); the framework of the model system is shown in Fig. 1. First, the model evaluation of the daily PM2.5 concentrations for the two months shows that the model performs well: the fraction of predictions within a factor of 2 of the observations (FAC2) is 84%, while the correlation coefficient (R) is 0.80 in Xi'an. Using PSAT in the CAMx model, a detailed source region contribution matrix is derived for all points within the Xi'an region, its six surrounding areas, and long-range regional transport. The results show that local emissions in Xi'an are the main source at the downtown area, contributing 72.9% as shown in Fig. 2, and that the contribution of transport between adjacent areas depends on wind direction. Meanwhile, three different suburban areas were selected for detailed analysis of fine particle sources. Compared to the downtown area, the sources of the suburban areas are more diverse, and transport contributes 40%-82%. In the suburban areas, regional inflows play an important role in fine particle concentrations, indicating a strong need for regional joint emission control efforts. The results enhance the quantitative understanding of the PM2.5 source regions and provide a basis for policymaking to advance the control of pollution

  12. Rough set soft computing cancer classification and network: one stone, two birds.

    Science.gov (United States)

    Zhang, Yue

    2010-07-15

    Gene expression profiling provides tremendous information to help unravel the complexity of cancer. The selection of the most informative genes from huge noise for cancer classification has taken centre stage, along with predicting the function of such identified genes and the construction of direct gene regulatory networks at different system levels with a tuneable parameter. A new study by Wang and Gotoh described a novel Variable Precision Rough Sets-rooted robust soft computing method to successfully address these problems and has yielded some new insights. The significance of this progress and its perspectives will be discussed in this article.

  13. Setting the One Health Agenda and the Human–Companion Animal Bond

    Directory of Open Access Journals (Sweden)

    Gregg K. Takashima

    2014-10-01

    Full Text Available “One Health”, also called “One Medicine”, began in the 1800s as an initiative advocating greater integration of human and animal medicine. This concept has recently come to prominence, driven by the recognition that 75% of newly emerging infectious diseases will arise from animal reservoirs, and that successful control and prevention will require a coordinated human medical and veterinary approach. Consequently, many One Health discussions have centered on the surveillance of animals in order to anticipate the potential emergence of new zoonotic diseases. An area that has been given only cursory mention is the many ways that small companion animals benefit individual, community and possibly world health. The goal of this paper is to briefly review some of the evidence-based data concerning the benefits of having companion animals in our lives, focusing on four major areas: cancer, heart disease, autism spectrum disorder (ASD), and the potential positive economic effects of the human-companion animal bond on One Health. Heart disease and cancer are the two leading causes of morbidity and mortality in the world, while ASD is a growing concern, not only for its individual effects, but also for its effect on family units, educational institutions, and its social implications for the community. In addition, these diseases can greatly affect the national and global cost of healthcare, as well as the economic output of a nation. It is therefore important to include and build on the concept of the Human-Animal Bond (HAB) as it relates to healthcare in these areas.

  14. Data format standard for sharing light source measurements

    Science.gov (United States)

    Gregory, G. Groot; Ashdown, Ian; Brandenburg, Willi; Chabaud, Dominique; Dross, Oliver; Gangadhara, Sanjay; Garcia, Kevin; Gauvin, Michael; Hansen, Dirk; Haraguchi, Kei; Hasna, Günther; Jiao, Jianzhong; Kelley, Ryan; Koshel, John; Muschaweck, Julius

    2013-09-01

    Optical design requires accurate characterization of light sources for computer aided design (CAD) software. Various methods have been used to model sources, from accurate physical models to measurement of light output. It has become common practice for designers to include measured source data for design simulations. Typically, a measured source will contain rays which sample the output distribution of the source. The ray data must then be exported to various formats suitable for import into optical analysis or design software. Source manufacturers are also making measurements of their products and supplying CAD models along with ray data sets for designers. The increasing availability of data has been beneficial to the design community but has caused a large expansion in storage needs for the source manufacturers since each software program uses a unique format to describe the source distribution. In 2012, the Illuminating Engineering Society (IES) formed a working group to understand the data requirements for ray data and recommend a standard file format. The working group included representatives from software companies supplying the analysis and design tools, source measurement companies providing metrology, source manufacturers creating the data and users from the design community. Within one year the working group proposed a file format which was recently approved by the IES for publication as TM-25. This paper will discuss the process used to define the proposed format, highlight some of the significant decisions leading to the format and list the data to be included in the first version of the standard.

  15. The costs of switching attentional sets

    NARCIS (Netherlands)

    Dombrowe, I.C.; Donk, M.; Olivers, C.N.L.

    2012-01-01

    People prioritize those aspects of the visual environment that match their attentional set. In the present study, we investigated whether switching from one attentional set to another is associated with a cost. We asked observers to sequentially saccade toward two color-defined targets, one on the

  16. The costs of switching attentional sets

    NARCIS (Netherlands)

    Dombrowe, I.C.; Donk, M.; Olivers, C.N.L.

    2011-01-01

    People prioritize those aspects of the visual environment that match their attentional set. In the present study, we investigated whether switching from one attentional set to another is associated with a cost. We asked observers to sequentially saccade toward two color-defined targets, one on the

  17. Development of a dc low pressure D- surface-conversion source using a 10-cm-diameter solid barium converter

    International Nuclear Information System (INIS)

    Kwan, J.W.; Anderson, O.A.; Chan, C.F.; Cooper, W.S.; deVries, G.J.; Kunkel, W.B.; Leung, K.N.; Lietzke, A.F.; Steele, W.F.; van Os, C.F.A.; Wells, R.P.; Williams, M.D.

    1991-09-01

    A D - surface-conversion source using a solid barium converter is designed for steady-state operation to produce 200 mA of D - . A similar ion source of twice the size of the one discussed here will meet the requirements set by the present US-ITER neutral beam injector design. Among the possible types of ion sources being considered for the US-ITER neutral beam design, the barium converter surface-conversion source is the only kind that does not use cesium in the discharge. This absence of cesium will minimize the number of accelerator breakdowns. 15 refs., 4 figs

  18. Chemical Topic Modeling: Exploring Molecular Data Sets Using a Common Text-Mining Approach.

    Science.gov (United States)

    Schneider, Nadine; Fechner, Nikolas; Landrum, Gregory A; Stiefl, Nikolaus

    2017-08-28

    Big data is one of the key transformative factors which increasingly influences all aspects of modern life. Although this transformation brings vast opportunities, it also generates novel challenges, not the least of which is organizing and searching this data deluge. The field of medicinal chemistry is no different: more and more data are being generated, for instance, by technologies such as DNA-encoded libraries, peptide libraries, text mining of large literature corpora, and new in silico enumeration methods. Handling those huge sets of molecules effectively is quite challenging and requires compromises that often come at the expense of the interpretability of the results. In order to find an intuitive and meaningful approach to organizing large molecular data sets, we adopted a probabilistic framework called "topic modeling" from the text-mining field. Here we present the first chemistry-related implementation of this method, which allows large molecule sets to be assigned to "chemical topics" and the relationships between those to be investigated. In this first study, we thoroughly evaluate this novel method in different experiments and discuss both its advantages and disadvantages. We show very promising results in reproducing human-assigned concepts, using the approach to identify and retrieve chemical series from sets of molecules. We have also created an intuitive visualization of the chemical topics output by the algorithm. This is a huge benefit compared to other unsupervised machine-learning methods, like clustering, which are commonly used to group sets of molecules. Finally, we applied the new method to the 1.6 million molecules of the ChEMBL22 data set to test its robustness and efficiency. In about 1 h we built a 100-topic model of this large data set, in which we could identify interesting topics like "proteins", "DNA", or "steroids". Along with this publication we provide our data sets and an open-source implementation of the new method (CheTo) which
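    The idea of treating molecular substructures as the "words" of a topic model can be sketched with scikit-learn's `LatentDirichletAllocation`. Note this is a generic stand-in, not the authors' CheTo implementation, and the fragment names and counts below are invented:

```python
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(42)

# Invented bag-of-fragments counts: each row is a "molecule" described by
# counts of hypothetical substructures (the analogue of words in a text).
series_a = rng.poisson([5, 4, 3, 0, 0, 0], size=(40, 6))  # aryl-amide-like series
series_b = rng.poisson([0, 0, 0, 5, 4, 3], size=(40, 6))  # steroid-like series
X = np.vstack([series_a, series_b])

lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(X)   # per-molecule topic loadings

# Each "chemical topic" is a distribution over fragments; molecules of the
# same series should load predominantly on the same topic.
topic_a = doc_topics[:40].mean(axis=0).argmax()
topic_b = doc_topics[40:].mean(axis=0).argmax()
print(topic_a, topic_b)  # the two series load on different topics
```

    Scaled up, the same mechanics let molecules be grouped by shared "chemical topics" rather than by pairwise similarity, which is what distinguishes the approach from clustering.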

  19. All Set! Evidence of Simultaneous Attentional Control Settings for Multiple Target Colors

    Science.gov (United States)

    Irons, Jessica L.; Folk, Charles L.; Remington, Roger W.

    2012-01-01

    Although models of visual search have often assumed that attention can only be set for a single feature or property at a time, recent studies have suggested that it may be possible to maintain more than one attentional control setting. The aim of the present study was to investigate whether spatial attention could be guided by multiple attentional…

  20. Training Delivery Methods as Source of Dynamic Capabilities: The Case of Sports' Organisations

    Science.gov (United States)

    Arraya, Marco António Mexia; Porfírio, Jose António

    2017-01-01

    Purpose: Training, as a source of dynamic capabilities (DC), is important to the performance of sports' organisations (SO), both for athletes and for non-athletic staff. There are a variety of training delivery methods (TDMs). The purpose of this study is to determine from a set of six TDMs which one is considered to be the most suitable to…

  1. Comparison of seismic sources for imaging geologic structures on the Oak Ridge Reservation, Tennessee

    International Nuclear Information System (INIS)

    Doll, W.E.

    1997-02-01

    In this study, five non-invasive swept sources, three non-invasive impulsive sources and one invasive impulsive source were compared. Previous shallow seismic source tests (Miller and others, 1986, 1992, 1994) have established that site characteristics should be considered in determining the optimal source. These studies evaluated a number of invasive sources along with a few non-invasive impulsive sources. Several sources (particularly the high frequency vibrators) that were included in the ORR test were not available or not practical during previous tests, cited above. This study differs from previous source comparisons in that it (1) includes many swept sources, (2) is designed for a greater target depth, (3) was conducted in a very different geologic environment, and (4) generated a larger and more diverse data set (including high fold CMP sections and walkaway vertical seismic profiles) for each source. The test site is centered around test injection well HF-2, between the southern end of Waste Area Grouping 5 (WAG 5) and the High Flux Isotope Reactor (HFIR)

  2. MAYAK production association ``the fifty year old secret''

    Science.gov (United States)

    Chikshov, A. I.; Glagolenko, Y. V.; Malykh, Y. A.; Langley, Roger; Latham, Ian

    1998-06-01

    This paper details the history, production capability, facilities and capacity of MAYAK Production Association. The paper also gives technical details of reactors, storage ponds and handling facilities used for fabrication of Co-60 and Cs-137 sources for Gamma Irradiation Plants and other applications, and describes the quality assurance system and methods.

  3. Integrating multiple data sources for malware classification

    Science.gov (United States)

    Anderson, Blake Harrell; Storlie, Curtis B; Lane, Terran

    2015-04-28

    Disclosed herein are representative embodiments of tools and techniques for classifying programs. According to one exemplary technique, at least one graph representation of at least one dynamic data source of at least one program is generated. Also, at least one graph representation of at least one static data source of the at least one program is generated. Additionally, at least using the at least one graph representation of the at least one dynamic data source and the at least one graph representation of the at least one static data source, the at least one program is classified.
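    A hedged sketch of the general idea (not the patented technique): turn a program's dynamic call-transition graph into a fixed-size feature vector and feed it to an off-the-shelf classifier. The system-call alphabet and traces below are invented for illustration.

```python
# Hedged sketch: classify programs via graph representations of their behavior.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

SYSCALLS = ["open", "read", "write", "connect"]   # hypothetical alphabet
IDX = {s: i for i, s in enumerate(SYSCALLS)}

def graph_features(edges):
    """Flatten a call-transition graph into an adjacency-count vector."""
    adj = np.zeros((len(SYSCALLS), len(SYSCALLS)))
    for src, dst in edges:
        adj[IDX[src], IDX[dst]] += 1
    return adj.ravel()

# Toy traces: benign programs read/write files, the "malware" phones home.
benign = [[("open", "read"), ("read", "write")]] * 3
malicious = [[("open", "connect"), ("connect", "write")]] * 3
X = np.array([graph_features(e) for e in benign + malicious])
y = [0] * 3 + [1] * 3

clf = RandomForestClassifier(random_state=0).fit(X, y)
print(clf.predict([graph_features([("open", "connect")])]))  # [1]
```

    A real system would combine several such vectors, one per dynamic and static data source, before classifying, as the abstract describes.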

  4. Sources of Evidence-of-Learning: Learning and Assessment in the Era of Big Data

    Science.gov (United States)

    Cope, Bill; Kalantzis, Mary

    2015-01-01

    This article sets out to explore a shift in the sources of evidence-of-learning in the era of networked computing. One of the key features of recent developments has been popularly characterized as "big data". We begin by examining, in general terms, the frame of reference of contemporary debates on machine intelligence and the role of…

  5. MOTIVATION: Goals and Goal Setting

    Science.gov (United States)

    Stratton, Richard K.

    2005-01-01

    Goal setting has great impact on a team's performance. Goals enable a team to synchronize their efforts to achieve success. In this article, the author talks about goals and goal setting. This article complements Domain 5--Teaching and Communication (p.14) and discusses one of the benchmarks listed therein: "Teach the goal setting process and…

  6. Automated novelty detection in the WISE survey with one-class support vector machines

    Science.gov (United States)

    Solarz, A.; Bilicki, M.; Gromadzki, M.; Pollo, A.; Durkalec, A.; Wypych, M.

    2017-10-01

    Wide-angle photometric surveys of previously uncharted sky areas or wavelength regimes will always bring in unexpected sources - novelties or even anomalies - whose existence and properties cannot be easily predicted from earlier observations. Such objects can be efficiently located with novelty detection algorithms. Here we present an application of such a method, called one-class support vector machines (OCSVM), to search for anomalous patterns among sources preselected from the mid-infrared AllWISE catalogue covering the whole sky. To create a model of expected data we train the algorithm on a set of objects with spectroscopic identifications from the SDSS DR13 database, present also in AllWISE. The OCSVM method detects as anomalous those sources whose patterns - WISE photometric measurements in this case - are inconsistent with the model. Among the detected anomalies we find artefacts, such as objects with spurious photometry due to blending, but more importantly also real sources of genuine astrophysical interest. Among the latter, OCSVM has identified a sample of heavily reddened AGN/quasar candidates distributed uniformly over the sky and in a large part absent from other WISE-based AGN catalogues. It also allowed us to find a specific group of sources of mixed types, mostly stars and compact galaxies. By combining the semi-supervised OCSVM algorithm with standard classification methods it will be possible to improve the latter by accounting for sources which are not present in the training sample, but are otherwise well-represented in the target set. Anomaly detection adds flexibility to automated source separation procedures and helps verify the reliability and representativeness of the training samples. It should be thus considered as an essential step in supervised classification schemes to ensure completeness and purity of produced catalogues. The catalogues of outlier data are only available at the CDS via anonymous ftp to http
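    The OCSVM step described above can be sketched with scikit-learn's `OneClassSVM`; the toy 2-D points below stand in for the WISE photometric measurements and are not the authors' actual data.

```python
# One-class SVM novelty detection: train on "expected" data, flag outliers.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
train = rng.normal(loc=0.0, scale=0.3, size=(500, 2))  # "known" sources

# nu bounds the fraction of training points allowed to fall outside the model.
ocsvm = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05).fit(train)

# +1 = consistent with the trained model, -1 = flagged as anomalous.
print(ocsvm.predict([[0.0, 0.1], [4.0, 4.0]]))
```

    The key property is that only "normal" examples are needed for training, which is what lets the method surface sources absent from any labeled sample.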

  7. Monte Carlo criticality source convergence in a loosely coupled fuel storage system

    International Nuclear Information System (INIS)

    Blomquist, Roger N.; Gelbard, Ely M.

    2003-01-01

    The fission source convergence of a very loosely coupled array of 36 fuel subassemblies with slightly non-symmetric reflection is studied. The fission source converges very slowly from a uniform guess to the fundamental mode in which about 40% of the fissions occur in one corner subassembly. Eigenvalue and fission source estimates are analyzed using a set of statistical tests similar to those used in MCNP, including the 'drift-in-mean' test and a new drift-in-mean test using a linear fit to the cumulative estimate drift, the Shapiro-Wilk test for normality, the relative error test, and the '1/N' test. The normality test does not detect a drifting eigenvalue or fission source. Applied to eigenvalue estimates, the other tests generally fail to detect an unconverged solution, but they are sometimes effective when evaluating fission source distributions. None of the tests provides a completely reliable indication of convergence, although they can detect nonconvergence. (author)
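    A minimal sketch of a "drift-in-mean"-style diagnostic like the one described: fit a line to the running mean of cycle-wise k estimates and flag drift when the slope exceeds a tolerance. The tolerance and synthetic data are illustrative assumptions, not the MCNP implementation.

```python
# Drift-in-mean convergence check on synthetic cycle-wise k estimates.
import numpy as np

def drift_in_mean(k_cycles, tol=1e-4):
    """Flag nonconvergence: slope of a linear fit to the running mean of k."""
    running = np.cumsum(k_cycles) / np.arange(1, len(k_cycles) + 1)
    slope = np.polyfit(np.arange(len(running)), running, 1)[0]
    return abs(slope) > tol

rng = np.random.default_rng(1)
converged = 1.0 + rng.normal(0.0, 1e-3, 200)                  # stationary k
drifting = np.linspace(0.90, 1.0, 200) + rng.normal(0.0, 1e-3, 200)

print(drift_in_mean(converged), drift_in_mean(drifting))
```

    As the abstract notes, passing such a test does not prove convergence; a nonzero slope is only reliable as evidence of nonconvergence.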

  8. Programming settings and recharge interval in a prospective study of a rechargeable sacral neuromodulation system for the treatment of overactive bladder.

    Science.gov (United States)

    Blok, Bertil; Van Kerrebroeck, Philip; de Wachter, Stefan; Ruffion, Alain; Van der Aa, Frank; Jairam, Ranjana; Perrouin-Verbe, Marie; Elneil, Sohier

    2018-02-01

    The RELAX-OAB study is designed to confirm the safety, efficacy, and technical performance of the Axonics r-SNM System, a miniaturized, rechargeable SNM system approved in Europe and Canada for the treatment of bladder and bowel dysfunction. The purpose of this article is to describe study subjects' ability to charge the rechargeable neurostimulator and to document their neurostimulator program settings and recharge interval over time. Fifty-one OAB patients were implanted in a single-stage procedure. These results represent the 3-month charging experience for 48 subjects who completed the 3-month follow-up. Recharge intervals were estimated using therapy stimulation settings, and subject experience was evaluated using questionnaires. Forty-seven of forty-eight (98%) subjects were able to successfully charge their device prior to follow-up within 1 month post-implant. At 3 months post-implant, 98% of subjects were able to charge prior to their follow-up visit. Average stimulation amplitude across all subjects was 1.8 mA (±1.1 mA). A total of 69% of subjects had ≥14-day recharge intervals (time between charging) and 98% of subjects had ≥7-day recharge intervals. No charging-related adverse events occurred. Study subjects were able to charge the Axonics r-SNM System, and stimulation settings provided 2 weeks of therapy between recharging for most subjects. Subject satisfaction indicates that subjects are satisfied with rechargeable SNM therapy. © 2018 The Authors. Neurourology and Urodynamics Published by Wiley Periodicals, Inc.

  9. The effect of energy distribution of external source on source multiplication in fast assemblies

    International Nuclear Information System (INIS)

    Karam, R.A.; Vakilian, M.

    1976-02-01

    This study examines the effect of the energy distribution of an external source on the detection rate as a function of k-effective in fast assemblies. This effect, as a function of k, was studied in a fission chamber using the ABN cross-section set and the Mach 1 code. It was found that with a source which has a fission spectrum, the reciprocal count rate versus mass relationship is linear down to a k-effective of 0.59. For a thermal source, linearity was never achieved. (author)

  10. A description of assistive technology sources, services and outcomes of use in a number of African settings.

    Science.gov (United States)

    Visagie, Surona; Eide, Arne H; Mannan, Hasheem; Schneider, Marguerite; Swartz, Leslie; Mji, Gubela; Munthali, Alister; Khogali, Mustafa; van Rooy, Gert; Hem, Karl-Gerhard; MacLachlan, Malcolm

    2017-10-01

    Purpose statement: The article explores assistive technology sources, services and outcomes in South Africa, Namibia, Malawi and Sudan. A survey was done in purposively selected sites of the study countries. Cluster sampling followed by random sampling served to identify 400-500 households (HHs) with members with disabilities per country. A HH questionnaire and an individual questionnaire were completed. Country-level analysis was limited to descriptive statistics. Walking mobility aids were most commonly bought/provided (46.3%), followed by visual aids (42.6%). The most common sources of assistive technology were government health services (37.8%), "other" (29.8%), and private health services (22.9%). Of the participants, 59.3% received full information on how to use the device. Maintenance was mostly done by users and their families (37.3%). Devices helped a lot in 73.3% of cases and improved quality of life for 67.9% of participants, while 39.1% experienced functional difficulties despite the devices. Although there is variation between the study settings, the main impression is that of fragmented or absent systems of provision of assistive technology. Implications for rehabilitation: Provision of assistive technology and services varied between countries, but the overall impression was of poor provision and fragmented services. The limited provision of assistive technology for personal care and handling products is of concern, as many of these devices require little training and ongoing support while they can make big functional differences. Rural respondents experienced more difficulties when using the device and received less information on use and maintenance of the device than their urban counterparts. A lack of government responsibility for assistive device services correlated with a lack of information and/or training of participants and maintenance of devices.

  11. The trials methodological research agenda: results from a priority setting exercise

    Science.gov (United States)

    2014-01-01

    Background Research into the methods used in the design, conduct, analysis, and reporting of clinical trials is essential to ensure that effective methods are available and that clinical decisions made using results from trials are based on the best available evidence, which is reliable and robust. Methods An on-line Delphi survey of 48 UK Clinical Research Collaboration registered Clinical Trials Units (CTUs) was undertaken. During round one, CTU Directors were asked to identify important topics that require methodological research. During round two, their opinion about the level of importance of each topic was recorded, and during round three, they were asked to review the group’s average opinion and revise their previous opinion if appropriate. Direct reminders were sent to maximise the number of responses at each round. Results are summarised using descriptive methods. Results Forty-one (85%) CTU Directors responded to at least one round of the Delphi process: 25 (52%) responded in round one, 32 (67%) in round two, and 24 (50%) in round three. Only 12 (25%) responded to all three rounds, and 18 (38%) responded to both rounds two and three. Consensus was achieved amongst CTU Directors that the top three priorities for trials methodological research were ‘Research into methods to boost recruitment in trials’ (considered the highest priority), ‘Methods to minimise attrition’ and ‘Choosing appropriate outcomes to measure’. Fifty other topics were included in the list of priorities, and consensus was reached that two topics, ‘Radiotherapy study designs’ and ‘Low carbon trials’, were not priorities. Conclusions This priority setting exercise has identified the research topics felt to be most important to the key stakeholder group of Directors of UKCRC registered CTUs. The use of robust methodology to identify these priorities will help ensure that this work informs the trials methodological research agenda, with

  12. The Chandra Source Catalog : Automated Source Correlation

    Science.gov (United States)

    Hain, Roger; Evans, I. N.; Evans, J. D.; Glotfelty, K. J.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Davis, J. E.; Doe, S. M.; Fabbiano, G.; Galle, E.; Gibbs, D. G.; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Houck, J. C.; Karovska, M.; Lauer, J.; McCollough, M. L.; McDowell, J. C.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Nowak, M. A.; Plummer, D. A.; Primini, F. A.; Refsdal, B. L.; Rots, A. H.; Siemiginowska, A. L.; Sundheim, B. A.; Tibbetts, M. S.; Van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-01-01

    Chandra Source Catalog (CSC) master source pipeline processing seeks to automatically detect sources and compute their properties. Since Chandra is a pointed mission and not a sky survey, different sky regions are observed for a different number of times at varying orientations, resolutions, and other heterogeneous conditions. While this provides an opportunity to collect data from a potentially large number of observing passes, it also creates challenges in determining the best way to combine different detection results for the most accurate characterization of the detected sources. The CSC master source pipeline correlates data from multiple observations by updating existing cataloged source information with new data from the same sky region as they become available. This process sometimes leads to relatively straightforward conclusions, such as when single sources from two observations are similar in size and position. Other observation results require more logic to combine, such as one observation finding a single, large source and another identifying multiple, smaller sources at the same position. We present examples of different overlapping source detections processed in the current version of the CSC master source pipeline. We explain how they are resolved into entries in the master source database, and examine the challenges of computing source properties for the same source detected multiple times. Future enhancements are also discussed. This work is supported by NASA contract NAS8-03060 (CXC).

  13. Social Set Visualizer

    DEFF Research Database (Denmark)

    Flesch, Benjamin; Hussain, Abid; Vatrapu, Ravi

    2015-01-01

    This paper presents a state-of-the-art visual analytics dashboard, Social Set Visualizer (SoSeVi), of approximately 90 million Facebook actions from 11 different companies that have been mentioned in the traditional media in relation to garment factory accidents in Bangladesh. The enterprise… …-edge open source visual analytics libraries from D3.js and creation of new visualizations (actor mobility across time, conversational comets etc). Evaluation of the dashboard, consisting of technical testing, usability testing, and domain-specific testing with CSR students, yielded positive results.

  14. Beginning Typewriting: A Fifty-Fifty Proposition

    Science.gov (United States)

    Ivarie, Ted

    1976-01-01

    Beginning typewriting should be a 50-50 proposition with equal time devoted to machine operation and skill development and to language arts instruction in elementary and secondary education. (Author/LH)

  15. The Isolation and Detection of Staphylococcus aureus Enterotoxins A-E and TSST-1 Genes from Different Sources by PCR Method

    Directory of Open Access Journals (Sweden)

    Norouzi J

    2012-09-01

    Full Text Available Background and Objectives: Enterotoxins and toxic shock syndrome toxin-1 (TSST-1) are important virulence factors of S. aureus. The purpose of this study was to analyze the presence of S. aureus enterotoxin (sea-see) and tst genes in samples collected from different sources by the PCR method. Methods: During 5 months, from 150 collected samples, 80 strains were identified as S. aureus. A PCR reaction was used to investigate the presence of genes for staphylococcal enterotoxins (A-E) and toxic shock syndrome toxin-1 (TSST-1). Results: Fifty-three samples (66.25%) out of 80 were positive for one or more ET and TSST-1 genes. Of these positive strains, 17 (32.07%) were positive for sea, 39 (73.58%) for seb, 30 (56.6%) for sec, 2 (3.7%) for sed, 21 (39.62%) for see, and 14 (26.41%) for tst. Enterotoxin and tst genes were observed together in 40 samples (75.47%). Conclusion: In this study, a high prevalence of S. aureus and its enterotoxin and tst genes was observed in clinical samples, food samples, and healthy people. This fact emphasizes the role of humans as an original source and carrier of S. aureus. The use of the PCR reaction for detection of these genes in S. aureus isolated from various sources is also recommended.

  16. A Test Set for stiff Initial Value Problem Solvers in the open source software R: Package deTestSet

    NARCIS (Netherlands)

    Mazzia, F.; Cash, J.R.; Soetaert, K.

    2012-01-01

    In this paper we present the R package deTestSet, which includes challenging test problems written as ordinary differential equations (ODEs), differential algebraic equations (DAEs) of index up to 3, and implicit differential equations (IDEs). In addition, it includes 6 new codes to solve initial value
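    The R package itself is not reproduced here; as an illustration, below is the Robertson problem, a classic stiff IVP test case of the kind collected in such test sets, solved with a stiff (BDF) integrator in Python.

```python
# Robertson chemical kinetics: a standard stiff initial value test problem.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    """Right-hand side of the Robertson ODE system."""
    y1, y2, y3 = y
    return [-0.04 * y1 + 1e4 * y2 * y3,
            0.04 * y1 - 1e4 * y2 * y3 - 3e7 * y2 ** 2,
            3e7 * y2 ** 2]

sol = solve_ivp(robertson, (0.0, 1e4), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-6, atol=1e-8)

# The three concentrations should always sum to 1 (mass conservation),
# a convenient sanity check on the solver's accuracy.
print(sol.success, np.allclose(sol.y.sum(axis=0), 1.0, atol=1e-4))
```

    Problems like this are useful as benchmarks precisely because explicit methods need impractically small steps on them, while implicit codes such as BDF solve them cheaply.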

  17. Building Background Knowledge through Reading: Rethinking Text Sets

    Science.gov (United States)

    Lupo, Sarah M.; Strong, John Z.; Lewis, William; Walpole, Sharon; McKenna, Michael C.

    2018-01-01

    To increase reading volume and help students access challenging texts, the authors propose a four-dimensional framework for text sets. The quad text set framework is designed around a target text: a challenging content area text, such as a canonical literary work, research article, or historical primary source document. The three remaining…

  18. Spatial and temporal variations of loads and sources of total and dissolved Phosphorus in a set of rivers (Western France).

    Science.gov (United States)

    Legeay, Pierre-Louis; Moatar, Florentina; Gascuel-Odoux, Chantal; Gruau, Gérard

    2015-04-01

    In intensive agricultural regions with important livestock farming, long-term land application of phosphorus (P), both as chemical fertilizer and as animal waste, has resulted in elevated P contents in soils. Although high P concentrations in rivers are a major concern, few studies have been done to assess the spatiotemporal variability of P loads in rivers and the apportionment of point and nonpoint sources in total loads. Here we focus on Brittany (Western France) where, even though P is a great issue in terms of human and drinking water safety (cyanotoxins), environmental protection and economic costs, with regard to the periodic proliferations of cyanobacteria that occur every year in this region, no regional-scale systematic study has been carried out so far. We selected a set of small rivers (stream order 3-5) with homogeneous agriculture and granitic catchments. By gathering data from three water quality monitoring networks, covering more than 100 measurement stations, we provide a regional-scale quantification of the spatiotemporal variability of dissolved P (DP) and total P (TP) interannual loads from 1992 to 2012. Built on the mean P load during low flows and statistical significance tests, we developed a new indicator, called the 'low flow P load' (LFP-load), which allows us to determine the importance of domestic and industrial P sources in the total P load and to assess their spatiotemporal variability compared to agricultural sources. The calculation and map representation of DP and TP interannual load variations allow identification of the greatest and lowest P-contributing catchments over the study period and of the way P loads of Brittany rivers have evolved through time. Both mean DP and TP loads have been more than halved over the last 20 years. Mean LFDP-load decreased by more than 60% and mean LFTP-load by more than 45% on average over the same period, showing that this marked temporal decrease in total load is largely due to the
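    A hedged sketch of a low-flow load indicator of this kind: the mean phosphorus load computed over low-flow days only, used as a proxy for point (domestic/industrial) sources, since agricultural runoff is small at low flow. The low-flow threshold and numbers below are illustrative assumptions, not the authors' definition.

```python
# Illustrative low-flow P load: mean daily load restricted to low-flow days.
import numpy as np

flow = np.array([0.2, 0.3, 5.0, 4.0, 0.25, 0.3])       # discharge, m3/s
conc = np.array([0.30, 0.28, 0.05, 0.06, 0.29, 0.31])  # total P, mg/L

# m3/s * mg/L = 1 g/s; over 86400 s that is 86.4 kg/day.
load = flow * conc * 86.4                                # kg P/day

low_flow = flow <= np.quantile(flow, 0.5)                # low-flow days
lfp_load = load[low_flow].mean()                         # indicator value

print(round(lfp_load, 2))
```

    On these toy numbers the low-flow load is dominated by the high concentrations seen at low discharge, the signature of a steady point-source input.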

  19. "One Problem Became Another": Disclosure of Rape-Related Pregnancy in the Abortion Care Setting.

    Science.gov (United States)

    Perry, Rachel; Murphy, Molly; Haider, Sadia; Harwood, Bryna

    2015-01-01

    We sought to explore the experiences of women who disclosed that their pregnancies resulted from rape in the abortion care setting, as well as the experiences of professionals involved in care of women with rape-related pregnancy. In-depth interviews were conducted with 9 patients who had terminated rape-related pregnancies and 12 professionals working in abortion care or rape crisis advocacy (5 abortion providers, 4 rape crisis center advocates, 2 social workers, and 1 clinic administrator). Transcribed interviews were coded and analyzed for themes related to the experiences of disclosing rape and the consequences of disclosure in the abortion care setting. Patients and professionals involved in care of women with rape-related pregnancy described opportunities arising from disclosure, including interpersonal (explaining abortion decision making in the context of assault, belief, and caring by providers), as well as structural opportunities (funding assistance, legal options, and mental health options). Whereas most patients did not choose to pursue all three structural opportunities, both patients and professionals emphasized the importance of offering them. The most important consequence of disclosure for patients was being believed and feeling that providers cared about them. Rape-related pregnancy disclosure in the abortion care setting can lead to opportunities for interpersonal support and open options for funding, legal recourse, and mental health care. Those working in abortion care should create environments conducive to disclosure and opportunities for rape survivors to access these additional options if they desire. Copyright © 2015 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  20. Comparison between commercial and open-source SCADA packages-A case study

    International Nuclear Information System (INIS)

    Barana, O.; Barbato, P.; Breda, M.; Capobianco, R.; Luchetta, A.; Molon, F.; Moressa, M.; Simionato, P.; Taliercio, C.; Zampiva, E.

    2010-01-01

    SPIDER (Source for the Production of Ions of Deuterium Extracted from Rf plasma) is a new experimental device under development at Consorzio RFX aimed at providing a full-scale ion source prototype for MITICA (Megavolt ITer Injector and Concept Advance), the experiment devoted to supplying a full-scale prototype of the ITER Heating Neutral Beam Injector (HNB). Both experimental devices will be hosted in a new facility known as PRIMA (Padova Research on Injectors Megavolt Accelerated). The correct operation of SPIDER and MITICA will be guaranteed by process automation and plant monitoring that will be implemented using suitable controllers (cycle time greater than 10 ms) in conjunction with appropriate SCADA (Supervisory Control And Data Acquisition) software. This paper presents the tests performed at Consorzio RFX to evaluate commercial and open-source SCADA packages and prepare a technical base for the selection of the SCADA system for SPIDER. Two commercial solutions and two open-source solutions (EPICS and TANGO) were investigated. The typical test-bed was a SCADA system exchanging data with a PLC (Programmable Logic Controller). The case study consisted of: (a) the development of a minimal panel provided with fields for setting parameters and of a trend window; (b) the set-up of two kinds of communication, a direct connection between the SCADA and the PLC and an indirect one by means of an OPC (Object Linking and Embedding for Process Control) server. The communication performance was evaluated by measuring the network traffic with a fixed number of data variables exchanged and different polling cycle times. The conclusions show that the final choice of a SCADA package for SPIDER will be between one commercial SCADA and EPICS. This choice will not depend uniquely on the results of the tests, but will also be dictated by the early schedule of the SPIDER operations.

  1. Romanian experience on safety and security of radiation sources

    International Nuclear Information System (INIS)

    Botgros, Madalina; Coroianu, Anton; Negreanu, Mircea

    2008-01-01

    Romania established its first administrative structure for controlling the deployment of nuclear activities in 1961, and the first Romanian nuclear law was published in 1974. At present, Law no. 111, published in 1996 and republished in 2003, is in force. Moreover, facilities and services are available to the persons authorized to manage radioactive sources. The regulation for safety and security of radioactive sources was amended twice in order to implement the international recommendations for setting up the national system for accounting and control of radiation sources and to coordinate the recovery activities. As part of the national control programme, the national inventory of sources and devices is updated permanently: when issuing a new authorization, when modifying an existing one, or when renewing an authorization, the system records the changes in the database. The government's responsibility for orphan sources is stated in the law on the radioactive waste management and decommissioning fund. There is a protocol between CNCAN, the Ministry of Internal Affairs and the Ministry of Health and Family regarding co-operation in the case of finding orphan sources. When a radiation source is spent, it becomes radioactive waste that has to be disposed of properly. Depending on the case, the holder of a spent source may either return the radioactive source to its manufacturer for regeneration or transfer it to the Radioactive Waste Treatment Facility. (author)

  2. Microwave and RF vacuum electronic power sources

    CERN Document Server

    Carter, Richard G

    2018-01-01

    Do you design and build vacuum electron devices, or work with the systems that use them? Quickly develop a solid understanding of how these devices work with this authoritative guide, written by an author with over fifty years of experience in the field. Rigorous in its approach, it focuses on the theory and design of commercially significant types of gridded, linear-beam, crossed-field and fast-wave tubes. Essential components such as waveguides, resonators, slow-wave structures, electron guns, beams, magnets and collectors are also covered, as well as the integration and reliable operation of devices in microwave and RF systems. Complex mathematical analysis is kept to a minimum, and Mathcad worksheets supporting the book online aid understanding of key concepts and connect the theory with practice. Including coverage of primary sources and current research trends, this is essential reading for researchers, practitioners and graduate students working on vacuum electron devices.

  3. Wind power - a power source now enabled by power electronics

    DEFF Research Database (Denmark)

    Blaabjerg, Frede; Iov, Florin

    2007-01-01

    The global electrical energy consumption is still rising and there is a steady demand to increase the power capacity. It is expected that it has to be doubled within 20 years. The production, distribution and use of the energy should be as technologically efficient as possible, and incentives to save energy at the end-user should be set up. Deregulation of energy has lowered the investment in larger power plants, which means the need for new electrical power sources may increase in the near future. Two major technologies will play important roles in solving the future problems. One is to change the electrical power production sources from the conventional, fossil (and short-term) based energy sources to renewable energy resources. Another is to use highly efficient power electronics in power generation, power transmission/distribution and end-user applications. This paper discusses the most emerging…

  4. Sourcing Excellence

    DEFF Research Database (Denmark)

    Adeyemi, Oluseyi

    2011-01-01

    Sourcing Excellence is one of the key performance indicators (KPIs) in this world of ever-changing sourcing strategies. Manufacturing companies need to assess and diagnose the reliability and competencies of existing suppliers in order to coordinate and develop them. This would help in managing

  5. Y-source inverter

    DEFF Research Database (Denmark)

    Siwakoti, Yam P.; Town, Graham; Loh, Poh Chiang

    2014-01-01

    This paper introduces a new 3-phase Y-source inverter whose gain is presently not matched by classical impedance-network-based inverters operating at the same duty ratio. The proposed network uses a tightly coupled transformer with three windings. By squeezing the shoot-through range while keeping higher boost, the inverter can operate at a higher modulation index, thereby minimizing switching device stress and providing better output power quality. In addition, the inverter has more degrees of freedom for setting the voltage gain and modulation index than other classical impedance-source networks…

  6. Development of irradiator 60Co sources

    International Nuclear Information System (INIS)

    Mosca, Rodrigo C.; Moura, Eduardo S.; Zeituni, Carlos A.; Mathor, Monica B.

    2011-01-01

    According to a recent report by the International Agency for Research on Cancer (IARC)/WHO (2008-2010), the global impact of cancer more than doubled in 30 years. In this report, it was estimated that about 12 million new cancer cases and 7 million deaths occurred. In Brazil, estimates made in 2010 for the year 2011 point to the occurrence of 489,270 new cases of cancer. Among the possibilities for cancer treatment, radiotherapy is one of the most important therapeutic resources used to combat it. However, complications inherent to the treatment can occur, such as tiredness, loss of appetite, radiodermatitis and, in more extreme cases, late radionecrosis. In order to reproduce a point of radionecrosis in the vicinity of radiodermatitis, mimicking these effects in animals and producing a model for the assessment of tissue repair, we propose the setting up of a collimated 60Co irradiator source. The development was based on 11 sources of 60Co, 1 mm thick, which were inserted by inference into a stainless steel 'gate-source' screw (patent pending) and later adjusted in a reinforced cross-shaped arrangement so that the radiation beam is directed to a target point, sparing the regions around this target point. The main use of this 60Co irradiator is to cause just one radionecrosis point (target point) of approximately 5 mm² with a surrounding, adjacent area of radiodermatitis of about 8 to 10 mm² in laboratory animals, for subsequent coating with an epidermal-dermal matrix populated by a cell culture of human fibroblasts, keratinocytes and mesenchymal stem cells. Its use will thus be valuable for the evaluation of curative, rather than merely palliative, treatments against bone radionecrosis. (author)

  7. Source apportionment of toxic chemical pollutants at Trombay region

    International Nuclear Information System (INIS)

    Sahu, S.K.; Pandit, G.G.; Puranik, V.D.

    2007-05-01

    Through anthropogenic activities like industrial production and transportation, a wide range of chemical pollutants, such as trace and toxic metals, pesticides, polycyclic aromatic hydrocarbons etc., eventually find their way into various environmental compartments. One of the main issues of environmental pollution is the chemical composition of aerosols and their sources. In spite of all the efforts, a considerable part of the atmospheric aerosol mass is still not accounted for. This report describes some of the activities of the Environmental Assessment Division which have direct relevance to public health and regulatory bodies. Extensive studies were carried out in our laboratories for the Trombay site, over the years, on organic as well as inorganic pollution in the environment, to understand the inter-compartmental behaviour of these chemical pollutants. In this report an attempt has been made to collect different size-fractionated ambient aerosols and to quantify the percentage contribution of each size fraction to the total aerosol mass. Subsequently, an effort has been made at chemical characterization (inorganic, organic and carbon content) of this particulate matter using different analytical techniques. The comprehensive data set on chemical characterization of particulate matter thus generated is being used with receptor modeling techniques to identify the possible sources contributing to the observed concentrations of the measured pollutants, and has been helpful in distinguishing the source types. Receptor modeling techniques are powerful tools that can be used to locate sources of pollutants to the atmosphere. Their major advantage is that actual ambient data are used to apportion source contributions, negating the need for dispersion calculations. Pollution sources affecting the sampling site were statistically identified using varimax rotated factor analysis of...

  8. Long term leaching of chlorinated solvents from source zones in low permeability settings with fractures

    DEFF Research Database (Denmark)

    Bjerg, Poul Løgstrup; Chambon, Julie Claire Claudia; Troldborg, Mads

    2008-01-01

    spreads to the low permeability matrix by diffusion. This results in a long term source of contamination due to back-diffusion. Leaching from such sources is further complicated by microbial degradation under anaerobic conditions to sequentially form the daughter products trichloroethylene, cis...

  9. Point Pollution Sources Dimensioning

    Directory of Open Access Journals (Sweden)

    Georgeta CUCULEANU

    2011-06-01

    Full Text Available In this paper a method for determining the main physical characteristics of point pollution sources is presented. These characteristics are the top inside source diameter and the physical height. The top inside source diameter is calculated from the gas flow-rate. To calculate the physical height of the source, one uses the relation given by the proportionality factor, defined as the ratio between the plume rise and the physical height of the source. The plume rise depends on the gas exit velocity and gas temperature. This relation is necessary for diminishing the environmental pollution when the production capacity of the plant varies in comparison with the nominal one.
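
A minimal sketch of the two sizing relations described in the abstract. The symbols (Q, v, k) and the sample numbers are illustrative assumptions, not the paper's exact formulation: the top diameter follows from continuity, Q = v·πd²/4, and the physical height from the stated proportionality factor k = (plume rise)/(physical height).

```python
import math

def top_diameter(q_gas, v_exit):
    """Top inside diameter (m) from volumetric gas flow-rate, Q = v * pi*d^2/4."""
    return math.sqrt(4.0 * q_gas / (math.pi * v_exit))

def physical_height(plume_rise, k):
    """Physical stack height (m) from the proportionality factor k = dh / h."""
    return plume_rise / k

# Illustrative inputs: Q in m^3/s, exit velocity in m/s, plume rise in m.
d = top_diameter(q_gas=15.0, v_exit=12.0)
h = physical_height(plume_rise=40.0, k=0.8)
print(round(d, 3), round(h, 1))
```

With these made-up inputs the source would need a top diameter of roughly 1.3 m and a physical height of 50 m; the point of the sketch is only the functional form of the two relations.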

  10. Blind source separation dependent component analysis

    CERN Document Server

    Xiang, Yong; Yang, Zuyuan

    2015-01-01

    This book provides readers a complete and self-contained set of knowledge about dependent source separation, including the latest development in this field. The book gives an overview on blind source separation where three promising blind separation techniques that can tackle mutually correlated sources are presented. The book further focuses on the non-negativity based methods, the time-frequency analysis based methods, and the pre-coding based methods, respectively.

  11. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    Energy Technology Data Exchange (ETDEWEB)

    Spackman, Peter R.; Karton, Amir, E-mail: amir.karton@uwa.edu.au [School of Chemistry and Biochemistry, The University of Western Australia, Perth, WA 6009 (Australia)

    2015-05-15

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L{sup α} two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol{sup –1}. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality, and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol{sup –1}.
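
The two-point formula E(L) = E_CBS + B/L^α quoted in the abstract can be sketched directly: given correlation energies at two cardinal numbers (DZ, L=2; TZ, L=3), eliminate B and solve for the basis-set-limit energy. The fixed α = 3 here is only an illustrative global exponent; the paper fits global and system-dependent exponents rather than assuming one.

```python
def extrapolate_cbs(e_dz, e_tz, alpha=3.0, l_dz=2, l_tz=3):
    """Two-point extrapolation: eliminate B between E(l_dz) and E(l_tz),
    return the estimated complete-basis-set-limit energy E_CBS."""
    w_dz = l_dz ** alpha
    w_tz = l_tz ** alpha
    return (e_tz * w_tz - e_dz * w_dz) / (w_tz - w_dz)

# Synthetic check: energies generated from a known E_CBS are recovered exactly.
e_cbs_true, b = -100.0, 5.0
e_dz = e_cbs_true + b / 2 ** 3
e_tz = e_cbs_true + b / 3 ** 3
print(round(extrapolate_cbs(e_dz, e_tz), 10))  # -100.0
```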

  12. Estimating the CCSD basis-set limit energy from small basis sets: basis-set extrapolations vs additivity schemes

    International Nuclear Information System (INIS)

    Spackman, Peter R.; Karton, Amir

    2015-01-01

    Coupled cluster calculations with all single and double excitations (CCSD) converge exceedingly slowly with the size of the one-particle basis set. We assess the performance of a number of approaches for obtaining CCSD correlation energies close to the complete basis-set limit in conjunction with relatively small DZ and TZ basis sets. These include global and system-dependent extrapolations based on the A + B/L^α two-point extrapolation formula, and the well-known additivity approach that uses an MP2-based basis-set-correction term. We show that the basis-set convergence rate can change dramatically between different systems (e.g., it is slower for molecules with polar bonds and/or second-row elements). The system-dependent basis-set extrapolation scheme, in which unique basis-set extrapolation exponents for each system are obtained from lower-cost MP2 calculations, significantly accelerates the basis-set convergence relative to the global extrapolations. Nevertheless, we find that the simple MP2-based basis-set additivity scheme outperforms the extrapolation approaches. For example, the following root-mean-squared deviations are obtained for the 140 basis-set limit CCSD atomization energies in the W4-11 database: 9.1 (global extrapolation), 3.7 (system-dependent extrapolation), and 2.4 (additivity scheme) kJ mol^–1. The CCSD energy in these approximations is obtained from basis sets of up to TZ quality, and the latter two approaches require additional MP2 calculations with basis sets of up to QZ quality. We also assess the performance of the basis-set extrapolations and additivity schemes for a set of 20 basis-set limit CCSD atomization energies of larger molecules including amino acids, DNA/RNA bases, aromatic compounds, and platonic hydrocarbon cages. We obtain the following RMSDs for the above methods: 10.2 (global extrapolation), 5.7 (system-dependent extrapolation), and 2.9 (additivity scheme) kJ mol^–1

  13. Comparative study of ultrasonographic, radiographic and after surgery signs in fifty bitches with pyometra

    International Nuclear Information System (INIS)

    Tello, L.; Martin, F; Valdes F, Alberto; Albala, A.

    1996-01-01

    Diagnosis of pyometra is based especially on clinical signs, clinical pathology findings and radiography. Ultrasound is a new, only recently available diagnostic tool. To establish a relation between radiography and ultrasonography scans, fifty female dogs with clinical signs of pyometra were scanned with ultrasound and radiography. Lateral and ventro-dorsal views were obtained, and scored from 0 to 3 according to the presence of radiographic signs. In addition, this study attempted to correlate uterine section measurements between ultrasound and measurements of the isolated organ after surgery at 3 anatomic points: a) caudal to the kidneys, b) lateral to the navel, c) the uterine body, caudal to the horn bifurcation. Lateral radiographs were more successful in diagnosing raised uterine volume than ventro-dorsal radiographs. The radiographic technique was less efficient than ultrasound in distinguishing positive cases of pyometra. Ultrasound was able to detect 100% of the pyometra cases. Measurements of uterine sections larger than 900 mm² were correlated with higher values (2 or 3) in radiographic scores (p ...)

  14. Viability of using a 75Se source in the analysis of uranium, thorium and rare earths by energy dispersive X-ray fluorescence

    International Nuclear Information System (INIS)

    Nova Mussel, W. da.

    1989-01-01

    This work studies the viability of using a 75Se source as an excitation source for energy dispersive X-ray fluorescence (EDXRF) in the analysis of uranium, thorium and the rare earths. The following arrangement was built: an HPGe detector and two 75Se sources at 30° positions in the castle, with a dead time of 5%. Using this arrangement, the calibration curves for U and Th were measured; the correlation coefficient was r = 0.999, and for the rare earths it was above r = 0.960. The response of this system was considered very good. (author)

  15. SARS and hospital priority setting: a qualitative case study and evaluation

    Directory of Open Access Journals (Sweden)

    Upshur Ross EG

    2004-12-01

    Full Text Available Abstract Background Priority setting is one of the most difficult issues facing hospitals because of funding restrictions and changing patient need. A deadly communicable disease outbreak, such as the Severe Acute Respiratory Syndrome (SARS) in Toronto in 2003, amplifies the difficulties of hospital priority setting. The purpose of this study is to describe and evaluate priority setting in a hospital in response to SARS using the ethical framework 'accountability for reasonableness'. Methods This study was conducted at a large tertiary hospital in Toronto, Canada. There were two data sources: 1) over 200 key documents (e.g. emails, bulletins), and 2) 35 interviews with key informants. Analysis used a modified thematic technique in three phases: open coding, axial coding, and evaluation. Results Participants described the types of priority setting decisions, the decision making process and the reasoning used. Although the hospital leadership made an effort to meet the conditions of 'accountability for reasonableness', they acknowledged that the decision making was not ideal. We described good practices and opportunities for improvement. Conclusions 'Accountability for reasonableness' is a framework that can be used to guide fair priority setting in health care organizations, such as hospitals. In the midst of a crisis such as SARS, where guidance is incomplete, consequences uncertain, and information constantly changing, and where hour-by-hour decisions involve life and death, fairness is more important, not less.

  16. SARS and hospital priority setting: a qualitative case study and evaluation.

    Science.gov (United States)

    Bell, Jennifer A H; Hyland, Sylvia; DePellegrin, Tania; Upshur, Ross E G; Bernstein, Mark; Martin, Douglas K

    2004-12-19

    Priority setting is one of the most difficult issues facing hospitals because of funding restrictions and changing patient need. A deadly communicable disease outbreak, such as the Severe Acute Respiratory Syndrome (SARS) in Toronto in 2003, amplifies the difficulties of hospital priority setting. The purpose of this study is to describe and evaluate priority setting in a hospital in response to SARS using the ethical framework 'accountability for reasonableness'. This study was conducted at a large tertiary hospital in Toronto, Canada. There were two data sources: 1) over 200 key documents (e.g. emails, bulletins), and 2) 35 interviews with key informants. Analysis used a modified thematic technique in three phases: open coding, axial coding, and evaluation. Participants described the types of priority setting decisions, the decision making process and the reasoning used. Although the hospital leadership made an effort to meet the conditions of 'accountability for reasonableness', they acknowledged that the decision making was not ideal. We described good practices and opportunities for improvement. 'Accountability for reasonableness' is a framework that can be used to guide fair priority setting in health care organizations, such as hospitals. In the midst of a crisis such as SARS, where guidance is incomplete, consequences uncertain, and information constantly changing, and where hour-by-hour decisions involve life and death, fairness is more important, not less.

  17. Attitudes towards Internationalism through the Lens of Cognitive Effort, Global Mindset, and Cultural Intelligence

    Science.gov (United States)

    Romano, Joan; Platania, Judith

    2014-01-01

    In the current study we examine attitudes towards internationalism through the lens of a specific set of constructs necessary in defining an effective global leader. One hundred fifty-nine undergraduates responded to items measuring need for cognition, cultural intelligence, and a set of items measuring the correlates of global mindset. In…

  18. Tsunami hazard at the Western Mediterranean Spanish coast from seismic sources

    Science.gov (United States)

    Álvarez-Gómez, J. A.; Aniel-Quiroga, Í.; González, M.; Otero, L.

    2011-01-01

    Spain represents an important part of the tourism sector in the Western Mediterranean, which has been affected in the past by tsunamis. Although the tsunami risk at the Spanish coasts is not the highest of the Mediterranean, the necessity of tsunami risk mitigation measures should not be neglected. In the Mediterranean area, Spain is exposed to two different tectonic environments with contrasting characteristics. On one hand, the Alboran Basin characterised by transcurrent and transpressive tectonics and, on the other hand, the North Algerian fold and thrust belt, characterised by compressive tectonics. A set of 22 seismic tsunamigenic sources has been used to estimate the tsunami threat over the Spanish Mediterranean coast of the Iberian peninsula and the Balearic Islands. Maximum wave elevation maps and tsunami travel times have been computed by means of numerical modelling and we have obtained estimations of threat levels for each source over the Spanish coast. The sources on the Western edge of North Algeria are the most dangerous, due to their threat to the South-Eastern coast of the Iberian Peninsula and to the Western Balearic Islands. In general, the Northern Algerian sources pose a greater risk to the Spanish coast than the Alboran Sea sources, which only threaten the peninsular coast. In the Iberian Peninsula, the Spanish provinces of Almeria and Murcia are the most exposed, while all the Balearic Islands can be affected by the North Algerian sources with probable severe damage, especially the islands of Ibiza and Minorca. The results obtained in this work are useful to plan future regional and local warning systems, as well as to set the priority areas to conduct research on detailed tsunami risk.

  19. OSIRIX: open source multimodality image navigation software

    Science.gov (United States)

    Rosset, Antoine; Pysher, Lance; Spadola, Luca; Ratib, Osman

    2005-04-01

    The goal of our project is to develop a completely new software platform that will allow users to efficiently and conveniently navigate through large sets of multidimensional data without the need for high-end expensive hardware or software. We also elected to develop our system on new open source software libraries, allowing other institutions and developers to contribute to this project. OsiriX is a free and open-source imaging software designed to manipulate and visualize large sets of medical images: http://homepage.mac.com/rossetantoine/osirix/

  20. Defining an Open Source Strategy for NASA

    Science.gov (United States)

    Mattmann, C. A.; Crichton, D. J.; Lindsay, F.; Berrick, S. W.; Marshall, J. J.; Downs, R. R.

    2011-12-01

    Over the course of the past year, we have worked to help frame a strategy for NASA and open source software. This includes defining information processes to understand open source licensing, attribution, commerciality, redistribution, communities, architectures, and interactions within the agency. Specifically, we held a training session at the NASA Earth Science Data Systems Working Group meeting on open source software as it relates to the NASA Earth Science data systems enterprise, including EOSDIS, the Distributed Active Archive Centers (DAACs), ACCESS proposals, and the MEASURES communities, and efforts to understand how open source software can be both consumed and produced within that ecosystem. In addition, we presented at the 1st NASA Open Source Summit (OSS) and helped to define an agency-level strategy, a set of recommendations, and paths forward for how to identify healthy open source communities, how to deal with issues such as contributions originating from other agencies, and how to search out talent with the right skills to develop software for NASA in the modern age. This talk will review our current recommendations for open source at NASA, cover the set of thirteen recommendations output from the NASA Open Source Summit, and discuss some of their implications for the agency.

  1. User's manual for ONEDANT: a code package for one-dimensional, diffusion-accelerated, neutral-particle transport

    International Nuclear Information System (INIS)

    O'Dell, R.D.; Brinkley, F.W. Jr.; Marr, D.R.

    1982-02-01

    ONEDANT is designed for the CDC-7600, but the program has been implemented and run on the IBM-370/190 and CRAY-I computers. ONEDANT solves the one-dimensional multigroup transport equation in plane, cylindrical, spherical, and two-angle plane geometries. Both regular and adjoint, inhomogeneous and homogeneous (k-eff and eigenvalue search) problems subject to vacuum, reflective, periodic, white, albedo, or inhomogeneous boundary flux conditions are solved. General anisotropic scattering is allowed and anisotropic inhomogeneous sources are permitted. ONEDANT numerically solves the one-dimensional, multigroup form of the neutral-particle, steady-state Boltzmann transport equation. The discrete-ordinates approximation is used for treating the angular variation of the particle distribution and the diamond-difference scheme is used for phase-space discretization. Negative fluxes are eliminated by a local set-to-zero-and-correct algorithm. A standard inner (within-group) iteration, outer (energy-group-dependent source) iteration technique is used. Both inner and outer iterations are accelerated using the diffusion synthetic acceleration method
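
The diamond-difference scheme and the set-to-zero-and-correct fixup mentioned in the abstract can be illustrated with a one-group, single-direction sweep in slab geometry. This is only a toy sketch of the discretization, with illustrative problem data, not code from ONEDANT: cell averages are taken as the mean of the two edge fluxes, and a negative outgoing edge flux is zeroed with the cell balance re-corrected.

```python
def sweep(psi_in, mu, sigma_t, source, dx, n_cells):
    """March edge fluxes left to right for mu > 0; return cell-average fluxes."""
    psi_edge = psi_in
    averages = []
    for i in range(n_cells):
        # Diamond difference: psi_avg = (psi_left + psi_right) / 2 combined
        # with the cell balance mu*(psi_right - psi_left)/dx + sigma_t*psi_avg = q.
        psi_avg = (source[i] + 2.0 * mu / dx * psi_edge) / (sigma_t + 2.0 * mu / dx)
        psi_out = 2.0 * psi_avg - psi_edge
        if psi_out < 0.0:          # set-to-zero-and-correct fixup
            psi_out = 0.0
            psi_avg = 0.5 * psi_edge
        averages.append(psi_avg)
        psi_edge = psi_out
    return averages

# Pure absorber, no interior source: the flux must decay monotonically.
flux = sweep(psi_in=1.0, mu=0.5, sigma_t=1.0, source=[0.0] * 10,
             dx=0.2, n_cells=10)
print(all(a > b >= 0.0 for a, b in zip(flux, flux[1:])))  # True
```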

  2. 48 CFR 206.203 - Set-asides for small business concerns.

    Science.gov (United States)

    2010-10-01

    48 Federal Acquisition Regulations System 3 2010-10-01 false Set-asides for small business concerns. 206.203 Section 206.203 Federal Acquisition Regulations System DEFENSE ACQUISITION... Competition After Exclusion of Sources 206.203 Set-asides for small business concerns. (b) Also no separate...

  3. Wien filter for a polarized ions source

    International Nuclear Information System (INIS)

    Perez A, P.I.

    1977-01-01

    In order to carry out research on nuclear structure, the Nuclear Center of Mexico has a 12 MV Tandem Van de Graaff accelerator. This center now has a polarized ion source, in its setup phase, constructed entirely in the accelerator workshop. This source supplies an ion beam with a polarization whose direction is not the adequate one for the scattering and reaction processes to be studied. A Wien filter was used to obtain the correct direction of the polarization vector. The purpose of this work is the study of the conditions the filter must satisfy in order to reach this objective. In the first part some generalities are given about polarization phenomena, the polarized ion source, and the performance of the Wien filter. In the second part, the problem of the passage of a polarized beam through the filter is treated and solved. Finally, the design and construction of the filter are presented, together with experimental results that justify the assumptions made in the solution of the filter problem. (author)
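
A Wien filter transmits undeflected only those ions whose speed satisfies the crossed-field balance qE = qvB, i.e. v = E/B, which is the basic condition any such design must satisfy. The numbers below are illustrative, not the source's actual parameters; the non-relativistic kinetic-energy relation is assumed.

```python
import math

PROTON_MASS = 1.67262192e-27   # kg
ELEM_CHARGE = 1.602176634e-19  # C

def required_e_field(kinetic_ev, b_tesla, mass=PROTON_MASS):
    """E field (V/m) that balances the magnetic force for an ion of the
    given kinetic energy (eV) in a transverse field B (T)."""
    v = math.sqrt(2.0 * kinetic_ev * ELEM_CHARGE / mass)  # non-relativistic speed
    return v * b_tesla

# A few-keV proton beam with a modest transverse B field:
print(round(required_e_field(kinetic_ev=5e3, b_tesla=0.05)))
```

Ions faster or slower than E/B are deflected out of the beam, which is what lets the device rotate or select the polarization direction without changing the transmitted velocity class.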

  4. Analytical representation of the solution of the space kinetic diffusion equation in a one-dimensional and homogeneous domain

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda; Bodmann, Bardo E. J.; Vilhena, Marco T. [Universidade Federal do Rio Grande do Sul (PROMEC/UFRGS), Porto Alegre, RS (Brazil). Programa de Pos Graduacao em Engenharia Mecanica; Lapa, Celso M.F., E-mail: fernanda.tumelero@yahoo.com.br, E-mail: bardo.bodmann@ufrgs.br, E-mail: mtmbvilhena@gmail.com, E-mail: lapa@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    In this work we solve the space kinetic diffusion equation in a one-dimensional geometry considering a homogeneous domain, for two energy groups and six groups of delayed neutron precursors. The proposed methodology makes use of a Taylor expansion in the space variable of the scalar neutron flux (fast and thermal) and the concentration of delayed neutron precursors, allocating the time dependence to the coefficients. Upon truncating the Taylor series at quadratic order, one obtains a set of recursive systems of ordinary differential equations, where a modified decomposition method is applied. The coefficient matrix is split into two: one constant diagonal matrix, and a second one with the remaining time-dependent and off-diagonal terms. Moreover, the equation system is reorganized such that the terms containing the latter matrix are treated as source terms. Note that the homogeneous equation system has a well-known solution, since the matrix is diagonal and constant. This solution plays the role of the recursion initialization of the decomposition method. The recursion scheme is set up in a fashion where the solutions of the previous recursion steps determine the source terms of the subsequent steps. A second feature of the method is the choice of the initial and boundary conditions, which are satisfied by the recursion initialization, while from the first recursion step onward the initial and boundary conditions are homogeneous. The recursion depth is then governed by a prescribed accuracy for the solution. (author)
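
The splitting described above can be demonstrated on a toy problem. A 2x2 constant-coefficient system stands in for the kinetic equations (an illustrative assumption, not the paper's model): the matrix A is split into its constant diagonal D plus the remainder N, recursion step 0 solves the D-only system with the true initial data, and each later step has zero initial data and is driven by N times the previous iterate as a source term. The truncated sum of iterates converges to the solution of the full system.

```python
import numpy as np

A = np.array([[-1.0, 0.3],
              [0.2, -0.5]])
D = np.diag(np.diag(A))   # constant diagonal part (solvable exactly)
N = A - D                 # remaining off-diagonal "source" part
y0 = np.array([1.0, 0.0])
t, steps = 1.0, 2000
dt = t / steps

def advance(y, src_history):
    """Euler-integrate y' = D y + source(t), with the source stored per step."""
    ys = [y.copy()]
    for k in range(steps):
        y = y + dt * (D @ y + src_history[k])
        ys.append(y.copy())
    return ys

# Recursion 0: homogeneous diagonal system carrying the initial condition.
terms = [advance(y0, [np.zeros(2)] * steps)]
# Later recursions: zero initial data, previous term drives the source.
for _ in range(4):
    prev = terms[-1]
    terms.append(advance(np.zeros(2), [N @ prev[k] for k in range(steps)]))

approx = sum(term[-1] for term in terms)
# Reference: direct Euler integration of the full system y' = A y.
y = y0.copy()
for _ in range(steps):
    y = y + dt * (A @ y)
print(np.allclose(approx, y, atol=1e-3))  # True
```

Five recursion terms already reproduce the direct solution here; in general the recursion depth is chosen from a prescribed accuracy, exactly as the abstract states.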

  5. Nutri One-on-One: The Assessment and Evaluation of a Brief One-on-One Nutritional Coaching in Patients Affected by Metabolic Syndrome

    Directory of Open Access Journals (Sweden)

    Jennifer King

    2015-01-01

    Full Text Available Nutri One-on-One was a program aiming to positively modify medical clinic patients’ nutritional habits and lifestyles through a brief one-on-one health coaching session. Each session was conducted using motivational interviewing techniques to allow for tailored nutrition education and goal setting, and was followed by a phone call to participants one month later. The outcomes assessed were participant perception of achieving personal nutrition and lifestyle goals, retention of knowledge, and participants’ satisfaction with the program. Physicians working in the clinic were assessed for satisfaction with the program. Most of the physicians were generally satisfied with the program and found it to be an asset to their practice. Participants perceived that they achieved their goals, were pleased with the program, and retained knowledge.

  6. Vitali systems in R^n with irregular sets

    DEFF Research Database (Denmark)

    Mejlbro, Leif; Topsøe, Flemming

    1996-01-01

    Vitali type theorems are results stating that out of a given family of sets one can select pairwise disjoint sets which fill out a "large" region. Usually one works with "regular" sets such as balls. We shall establish results with sets of a more complicated geometrical structure; e.g., Cantor-like sets are allowed. The results are related to a generalisation of the classical notion of a differentiation basis. They concern real n-space R^n and Lebesgue measure.

  7. Computer aided control of the Bonn Penning polarized ion source

    International Nuclear Information System (INIS)

    He, N.W.; VonRossen, P.; Eversheim, P.D.; Busch, R.

    1984-01-01

    A CBM computer system is described which has been set up to control the Bonn polarized ion source. The controlling program, besides setting and logging parameters, performs an optimization of the ion source output. A freely definable figure of merit, composed of the ionizer current and its variance, has proven to be an effective means of directing the source optimization. The performance reached during the first successful tests is reported

  8. Water sources in mangroves in four hydrogeomorphic settings in Florida and Mexico

    Science.gov (United States)

    Christina Stringer; Mark. Rains

    2016-01-01

    Mangroves are transitional environments, where fresh water from the terrestrial environments mix with seawater from the marine environment. The relative contributions of these sources vary and play a role in controlling the physical and chemical hydrological characteristics of mangroves and facilitate the exchange of mass, energy, and organisms between mangroves and...

  9. Constraining sources of ultrahigh energy cosmic rays and shear acceleration mechanism of particles in relativistic jets

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Ruoyu

    2015-06-10

    Ultrahigh energy cosmic rays are extremely energetic particles from outer space. They have aroused great interest among scientists for more than fifty years. However, due to the rarity of the events and the complexity of their propagation to Earth, they are still one of the biggest puzzles in modern high energy astrophysics. This dissertation is dedicated to studying the origin of ultrahigh energy cosmic rays from various aspects. Firstly, we discuss a possible link between recently discovered sub-PeV/PeV neutrinos and ultrahigh energy cosmic rays. If these two kinds of particles share the same origin, the observation of neutrinos may provide additional and non-trivial constraints on the sources of ultrahigh energy cosmic rays. Secondly, we jointly employ the chemical composition measurement and the arrival directions of ultrahigh energy cosmic rays, and find a robust upper limit on the distances of sources of ultrahigh energy cosmic rays above ~55 EeV, as well as a lower limit on their metallicities. Finally, we study the shear acceleration mechanism in relativistic jets, which is a more efficient mechanism for the acceleration of higher energy particles. We compute the acceleration efficiency and the time-dependent particle energy spectrum, and explore the features of synchrotron radiation of the accelerated particles. Possible realizations of this mechanism for the acceleration of ultrahigh energy cosmic rays in different astrophysical environments are also discussed.

  10. On the spectrum of the one-speed slab-geometry discrete ordinates operator in neutron transport theory

    International Nuclear Information System (INIS)

    Abreu, Marcos Pimenta de

    1998-01-01

    We describe a numerical method applied to the first-order form of one-speed slab-geometry discrete ordinates equations modelling time-independent neutron transport problems with anisotropic scattering, with no interior source and defined in a nonmultiplying homogeneous host medium. Our numerical method is concerned with the generation of the spectrum and of a vector basis for the null space of the one-speed slab-geometry discrete ordinates operator. Moreover, it allows us to overcome the difficulties introduced in previous methods by anisotropic scattering and by angular quadrature sets of high order. To illustrate the positive features of our numerical method, we present numerical results for one-speed slab-geometry neutron transport model problems with anisotropic scattering

  11. Data envelopment analysis - DEA and fuzzy sets to assess the performance of academic departments: a case study at Federal University of Santa Catarina - UFSC

    Directory of Open Access Journals (Sweden)

    Lopes Ana Lúcia Miranda

    2002-01-01

    Full Text Available This paper addresses the issue of performance evaluation - productivity and quality - of academic departments at a university. A DEA model was used to simulate a process of cross-evaluation between departments. The results of DEA in the dimensions of teaching, research, service and quality were modelled as fuzzy numbers and then aggregated through a weighted ordered aggregator. A single index of performance for each department was generated. The proposal is to identify departments with low performance in one or more dimensions that should receive additional evaluation from an external auditing committee. A by-product of the model is to enlarge the possibility of working with more variables than a conventional DEA model. The model, applied to a set of fifty-eight departments of a Brazilian university, showed fifteen with low performance. Zero correlation between department teaching, research and service was observed. Weak correlation was detected between research productivity and quality. Weak scale effects were detected.
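    The weighted ordered aggregation step can be sketched as follows (a minimal illustration on crisp scores; the paper aggregates fuzzy numbers, and the scores and weights shown here are hypothetical):

```python
def owa(scores, weights):
    """Ordered weighted averaging: weights are applied to the
    scores ranked in descending order, not to fixed dimensions."""
    assert len(scores) == len(weights)
    assert abs(sum(weights) - 1.0) < 1e-9
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ranked))

# Aggregate one department's (hypothetical) teaching, research,
# service and quality scores into a single performance index.
index = owa([0.6, 0.9, 0.3, 0.8], [0.4, 0.3, 0.2, 0.1])
```

    Because the weights attach to ranks rather than dimensions, the aggregator can emphasize a department's best (or worst) dimensions regardless of which ones they are.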

  12. Life cycle assessment of renewable energy sources

    CERN Document Server

    Singh, Anoop; Olsen, Stig Irving

    2013-01-01

    Governments are setting challenging targets to increase the production of energy and transport fuel from sustainable sources. The emphasis is increasingly on renewable sources including wind, solar, geothermal, biomass based biofuel, photovoltaics or energy recovery from waste. What are the environmental consequences of adopting these other sources? How do these various sources compare to each other? Life Cycle Assessment of Renewable Energy Sources tries to answer these questions based on the universally adopted method of Life Cycle Assessment (LCA). This book introduces the concept and impor

  13. OpenMC In Situ Source Convergence Detection

    Energy Technology Data Exchange (ETDEWEB)

    Aldrich, Garrett Allen [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Univ. of California, Davis, CA (United States); Dutta, Soumya [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); The Ohio State Univ., Columbus, OH (United States); Woodring, Jonathan Lee [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-07

    We designed and implemented an in situ version of particle source convergence detection for the OpenMC particle transport simulator. OpenMC is a Monte Carlo-based particle simulator for neutron criticality calculations. For the transport simulation to be accurate, source particles must converge on a spatial distribution. Typically, convergence is obtained by iterating the simulation for a user-settable, fixed number of steps, and it is assumed that convergence is achieved. We instead implement a method to detect convergence, using the stochastic oscillator for identifying convergence of source particles based on their accumulated Shannon Entropy. Using our in situ convergence detection, we are able to detect and begin tallying results for the full simulation once the proper source distribution has been confirmed. Our method ensures that the simulation is not started too early, by a user setting overly optimistic parameters, or too late, by setting overly conservative ones.
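    The entropy-based check can be sketched as follows, with a simplified tolerance-band criterion standing in for the paper's stochastic oscillator (the binning, window and threshold are illustrative, not OpenMC's actual parameters):

```python
import math

def shannon_entropy(counts):
    """Shannon entropy (bits) of a binned source distribution."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c > 0:
            p = c / total
            h -= p * math.log2(p)
    return h

def detect_convergence(entropies, window=5, tol=0.01):
    """Return the first batch index at which the entropy series has
    stayed within +/- tol of its window mean for `window` consecutive
    batches, or None if it never settles."""
    for i in range(window, len(entropies) + 1):
        recent = entropies[i - window:i]
        mean = sum(recent) / window
        if all(abs(h - mean) <= tol for h in recent):
            return i - window
    return None
```

    Tallying would begin only from the batch index returned by `detect_convergence`, rather than after a fixed number of inactive batches.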

  14. A hybrid source-driven method to compute fast neutron fluence in reactor pressure vessel - 017

    International Nuclear Information System (INIS)

    Ren-Tai, Chiang

    2010-01-01

    A hybrid source-driven method is developed to compute fast neutron fluence with neutron energy greater than 1 MeV in nuclear reactor pressure vessel (RPV). The method determines neutron flux by solving a steady-state neutron transport equation with hybrid neutron sources composed of peripheral fixed fission neutron sources and interior chain-reacted fission neutron sources. The relative rod-by-rod power distribution of the peripheral assemblies in a nuclear reactor obtained from reactor core depletion calculations and subsequent rod-by-rod power reconstruction is employed as the relative rod-by-rod fixed fission neutron source distribution. All fissionable nuclides other than U-238 (such as U-234, U-235, U-236, Pu-239 etc) are replaced with U-238 to avoid counting the fission contribution twice and to preserve fast neutron attenuation for heavy nuclides in the peripheral assemblies. An example is provided to show the feasibility of the method. Since the interior fuels only have a marginal impact on RPV fluence results due to rapid attenuation of interior fast fission neutrons, a generic set or one of several generic sets of interior fuels can be used as the driver and only the neutron sources in the peripheral assemblies will be changed in subsequent hybrid source-driven fluence calculations. Consequently, this hybrid source-driven method can simplify and reduce cost for fast neutron fluence computations. This newly developed hybrid source-driven method should be a useful and simplified tool for computing fast neutron fluence at selected locations of interest in RPV of contemporary nuclear power reactors. (authors)

  15. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper, a method for testing rate-control algorithms by means of a video source model is suggested. The proposed method significantly improves algorithm testing over a large test set.

  16. Study allowing a decision-making from the activity calculation of a iodine 131 source detected in a dump at the incineration facility

    International Nuclear Information System (INIS)

    Houy, J.C.; Laugle, S.

    2000-01-01

    This study is divided into six parts: the first details the determination of the different thresholds needed to make the decision; the second describes the gantry placed at the incineration facility; the third is devoted to the gantry calibration by detector; the fourth concerns the theoretical determination of the rate ratios as a function of the source position in the truck; the fifth compares the theoretical calculations with practical measurements; the sixth expresses the determination of the source activity and position in the truck, together with the decision-making. To conclude, the alarm threshold of the Rennes incineration facility is set to twice the background level, without taking the source position in the domestic waste truck into account. The alarm can therefore be triggered by a low-activity source close to the truck wall while, conversely, a MBq source in the middle of the truck may go undetected. The alarm should instead be driven by a calculation program that takes the detectors' readings into account, in order to estimate the activity and position of the source in the truck and to support the decision-making for the management of these wastes. (N.C.)
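    The strong dependence of the alarm response on source position follows from the inverse-square law for a point source. A rough sketch (the dose-rate constant used is an approximate Cs-137 value assumed for illustration, not a figure from the study):

```python
def dose_rate(activity_mbq, distance_m, gamma=0.0927):
    """Point-source gamma dose rate in uSv/h.
    gamma: dose-rate constant in uSv*m^2/(MBq*h); 0.0927 is an
    approximate Cs-137 value (assumption, check reference data)."""
    return gamma * activity_mbq / distance_m ** 2

# The same source reads 16 times weaker from the middle of the
# load (2 m from the detector) than from near the wall (0.5 m).
near = dose_rate(100.0, 0.5)
far = dose_rate(100.0, 2.0)
```

    This is why a fixed threshold of twice background cannot treat wall-adjacent and centrally buried sources consistently, and why a position-aware calculation is preferable.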

  17. Configuration of electro-optic fire source detection system

    Science.gov (United States)

    Fabian, Ram Z.; Steiner, Zeev; Hofman, Nir

    2007-04-01

    The recent fighting activities in various parts of the world have highlighted the need for accurate fire source detection on the one hand and a fast "sensor to shooter" cycle on the other. Both needs can be met by the SPOTLITE system, which dramatically enhances the capability to rapidly engage a hostile fire source with a minimum of casualties to friendly forces and to innocent bystanders. The modular system design makes it possible to meet each customer's specific requirements and provides excellent future growth and upgrade potential. The design and build of a fire source detection system is governed by sets of requirements issued by the operators. These can be translated into the following design criteria: I) Long range, fast and accurate fire source detection capability. II) Different threat detection and classification capability. III) Threat investigation capability. IV) Fire source data distribution capability (location, direction, video image, voice). V) Man-portability. In order to meet these design criteria, an optimized concept was presented and exercised for the SPOTLITE system. Three major modular components were defined: I) Electro-Optical Unit - including FLIR camera, CCD camera, laser range finder and marker. II) Electronic Unit - including system computer and electronics. III) Controller Station Unit - including the HMI of the system. This article discusses the definition and optimization processes of the system's components, and also shows how SPOTLITE designers successfully managed to introduce excellent solutions for other system parameters.

  18. Fifty years of Technical Cooperation

    International Nuclear Information System (INIS)

    2007-01-01

    The International Atomic Energy Agency (IAEA) was established in Vienna in 1957. The Statute of the IAEA, approved by 81 nations, founded the organization on three pillars: nuclear verification; safety and security; and the transfer of technology. Today, these three pillars still remain at the heart of the organization's work. However, the way in which the IAEA carries out this work, particularly with regard to technology transfer, has changed greatly over the years. When the IAEA opened for business, nuclear science and technology were in their infancy. Many Member States had no nuclear capacity at all. The IAEA's 'technical assistance' programme, as it was then known, was modest. Early projects were small in scale and short lived, focusing mainly on building human capacities and creating institutions and facilities that would support the introduction of nuclear technology in a safe and effective manner. Today, the picture is more complex. Instead of merely offering assistance, the IAEA focuses on cooperation for sustainable socioeconomic development, building on the skills and infrastructure that Member States have acquired over the past five decades. Member States are full partners in the process, guiding the IAEA's technical cooperation activities, setting national and regional priorities, and offering training opportunities and technical support to the IAEA and to other Member States. Technical cooperation between developing countries is facilitated and supported through regional cooperative agreements. Regional centres of expertise play an important role in sharing the benefits of nuclear science and technology among Member States

  19. Treatment planning source assessment

    International Nuclear Information System (INIS)

    Calzetta Larrieu, O.; Blaumann, H.; Longhino, J.

    2000-01-01

    The reactor RA-6 NCT system was improved during the last year mainly in two aspects: the facility itself, achieving lower contamination factors, and the use of better measurement techniques to obtain lower uncertainties in its characterization. In this work we show the different steps taken to obtain the source representing the NCT facility to be used in the treatment planning code. The first step was to compare the dosimetry in a water phantom between a calculation using the entire facility, including core, filter and shields, and one using a surface source at the end of the beam. The second was to transform this particle-by-particle source into a distribution source, determining the minimum spatial, energy and angular resolution needed to obtain similar results. Finally, we compare calculated and experimental values with and without the water phantom to adjust the distribution source. The results are discussed. (author)

  20. Spatial part-set cuing facilitation.

    Science.gov (United States)

    Kelley, Matthew R; Parasiuk, Yuri; Salgado-Benz, Jennifer; Crocco, Megan

    2016-07-01

    Cole, Reysen, and Kelley [2013. Part-set cuing facilitation for spatial information. Journal of Experimental Psychology: Learning, Memory, & Cognition, 39, 1615-1620] reported robust part-set cuing facilitation for spatial information using snap circuits (a colour-coded electronics kit designed for children to create rudimentary circuit boards). In contrast, Drinkwater, Dagnall, and Parker [2006. Effects of part-set cuing on experienced and novice chess players' reconstruction of a typical chess midgame position. Perceptual and Motor Skills, 102(3), 645-653] and Watkins, Schwartz, and Lane [1984. Does part-set cuing test for memory organization? Evidence from reconstructions of chess positions. Canadian Journal of Psychology/Revue Canadienne de Psychologie, 38(3), 498-503] showed no influence of part-set cuing for spatial information when using chess boards. One key difference between the two procedures was that the snap circuit stimuli were explicitly connected to one another, whereas chess pieces were not. Two experiments examined the effects of connection type (connected vs. unconnected) and cue type (cued vs. uncued) on memory for spatial information. Using chess boards (Experiment 1) and snap circuits (Experiment 2), part-set cuing facilitation only occurred when the stimuli were explicitly connected; there was no influence of cuing with unconnected stimuli. These results are potentially consistent with the retrieval strategy disruption hypothesis, as well as the two- and three-mechanism accounts of part-set cuing.

  1. A static analysis tool set for assembler code verification

    International Nuclear Information System (INIS)

    Dhodapkar, S.D.; Bhattacharjee, A.K.; Sen, Gopa

    1991-01-01

    Software Verification and Validation (V and V) is an important step in assuring reliability and quality of the software. The verification of program source code forms an important part of the overall V and V activity. The static analysis tools described here are useful in verification of assembler code. The tool set consists of static analysers for Intel 8086 and Motorola 68000 assembly language programs. The analysers examine the program source code and generate information about control flow within the program modules, unreachable code, well-formation of modules, call dependency between modules etc. The analysis of loops detects unstructured loops and syntactically infinite loops. Software metrics relating to size and structural complexity are also computed. This report describes the salient features of the design, implementation and the user interface of the tool set. The outputs generated by the analyser are explained using examples taken from some projects analysed by this tool set. (author). 7 refs., 17 figs

  2. Determination of a basic set of Eigen-functions and of the corresponding norm in the case of the one-velocity integral differential Boltzmann equation in spherical geometry

    International Nuclear Information System (INIS)

    Lafore, P.

    1965-01-01

    The object of the present work is to draw up a basic set of orthogonal eigenfunctions for the resolution of the one-velocity integro-differential Boltzmann equation, in the case of a spherical geometry system. (author) [fr

  3. Investigation of an incident due to orphan sources in Iran

    Energy Technology Data Exchange (ETDEWEB)

    Mehdizadeh, S. [Shiraz Univ. (Iran, Islamic Republic of); Kardan, M.R.; Abdollah Miangi, F.; Rastkhah, N. [National Radiation Protection Dept., Tehran (Iran, Islamic Republic of)

    2006-07-01

    This paper discusses an incident that occurred in one of the radiation application centers in Iran, the follow-up investigations, and the lessons learnt. In January 2004 the Regulatory Authority was informed through the Shiraz University R.P.O. of an incident regarding orphan radioactive sources belonging to a Radiation Application Center (R.A.C.) in Shiraz. In this incident, a metallic box containing one neutron-emitting and three gamma-emitting sources belonging to the above-mentioned center was unearthed from a burial site outside Shiraz by some person or persons, and the box was eventually set on fire. Investigations conducted later showed that despite the melting of the paraffin shield of the neutron-emitting source, the capsules containing the sources were intact and unharmed, and no radioactive leakage had occurred. Further investigations showed that this box contained one Am-Be source and three Cs-137 sources which had been given to the above-mentioned center long before by a foreign well logging company from the former Czech Republic, without notifying the Regulatory Authority. Follow-up measurements indicated the maximum dose rates of the 111 GBq Am-Be and 3.7 GBq Cs-137 sources to be 13 mSv/h and 7 mSv/h respectively. The maximum dose to the people involved in this incident is therefore estimated to have been about 100 mSv; consequently no severe deterministic effects to individuals are expected. The findings showed that the main reasons for the incident were as follows: 1. Lack of professionalism in working with radioactive materials. 2. Violation of obligations under the radiation protection act and related regulations by the owner of the sources, a well logging foreign company. 3. Leaving the sources in improper storage conditions. 4. Unauthorized access to the radiation sources in the owner center. 5. Lack of an effective national register system in the Regulatory Authority. 6. Lack of a well scheduled and risk-based inspection program for the

  4. Monthly CO surface sources inventory based on the 2000-2001 MOPITT satellite data

    Science.gov (United States)

    Pétron, Gabrielle; Granier, Claire; Khattatov, Boris; Yudin, Valery; Lamarque, Jean-François; Emmons, Louisa; Gille, John; Edwards, David P.

    2004-11-01

    This paper presents results of the inverse modeling of carbon monoxide surface sources on a monthly and regional basis using the MOPITT (Measurement Of the Pollution In The Troposphere) CO retrievals. The targeted time period is from April 2000 to March 2001. A sequential and time-dependent inversion scheme is implemented to correct an a priori set of monthly mean CO sources. The a posteriori estimates for the total anthropogenic (fossil fuel + biofuel + biomass burning) surface sources of CO in TgCO/yr are 509 in Asia, 267 in Africa, 140 in North America, 90 in Europe and 84 in Central and South America. Inverting on a monthly scale allows one to assess a corrected seasonality specific to each source type and each region. Forward CTM simulations with the a posteriori emissions show a substantial improvement of the agreement between modeled CO and independent in situ observations.
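    A sequential, time-dependent inversion of this kind can be illustrated with a scalar Kalman-type update of a source estimate against a single observation (a toy sketch, not the study's actual multi-region, multi-month scheme; all numbers are hypothetical):

```python
def update_source(x_prior, var_prior, obs, obs_var, h):
    """One sequential update of a CO source estimate x, where the
    transport model predicts the observation as obs ~= h * x.
    Returns the a posteriori estimate and its variance."""
    gain = var_prior * h / (h * h * var_prior + obs_var)
    x_post = x_prior + gain * (obs - h * x_prior)
    var_post = (1.0 - gain * h) * var_prior
    return x_post, var_post

# A priori source of 400 TgCO/yr corrected upward by a MOPITT-like
# column observation that the model underpredicts.
x, v = update_source(400.0, 100.0**2, 55.0, 5.0**2, 0.1)
```

    Repeating such an update month by month is what allows a corrected seasonality to be recovered for each source region, rather than a single annual scaling factor.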

  5. Analysis of inconsistent source sampling in monte carlo weight-window variance reduction methods

    Directory of Open Access Journals (Sweden)

    David P. Griesheimer

    2017-09-01

    Full Text Available The application of Monte Carlo (MC to large-scale fixed-source problems has recently become possible with new hybrid methods that automate generation of parameters for variance reduction techniques. Two common variance reduction techniques, weight windows and source biasing, have been automated and popularized by the consistent adjoint-driven importance sampling (CADIS method. This method uses the adjoint solution from an inexpensive deterministic calculation to define a consistent set of weight windows and source particles for a subsequent MC calculation. One of the motivations for source consistency is to avoid the splitting or rouletting of particles at birth, which requires computational resources. However, it is not always possible or desirable to implement such consistency, which results in inconsistent source biasing. This paper develops an original framework that mathematically expresses the coupling of the weight window and source biasing techniques, allowing the authors to explore the impact of inconsistent source sampling on the variance of MC results. A numerical experiment supports this new framework and suggests that certain classes of problems may be relatively insensitive to inconsistent source sampling schemes with moderate levels of splitting and rouletting.
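    The weight-window splitting and rouletting that the paper couples to source biasing can be sketched as follows (an illustrative stand-in, not the CADIS implementation; the choice of survival weight is an assumption):

```python
import random

def apply_weight_window(weight, w_low, w_high, rng=random.random):
    """Apply a weight window [w_low, w_high] to one particle.
    Returns the list of surviving particle weights: heavy particles
    are split into equal-weight copies, light ones play roulette."""
    if weight > w_high:
        n = int(weight / w_high) + 1
        return [weight / n] * n        # split; total weight preserved
    if weight < w_low:
        w_survive = w_low              # survival weight (illustrative)
        if rng() < weight / w_survive:
            return [w_survive]         # survives roulette
        return []                      # killed
    return [weight]                    # inside the window: unchanged
```

    Source-consistent biasing aims to emit particles already inside the window so that this routine does nothing at birth; inconsistent biasing leaves some birth weights outside the window, triggering the splitting and rouletting whose cost and variance impact the paper analyzes.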

  6. New Maximal Two-distance Sets

    DEFF Research Database (Denmark)

    Lisonek, Petr

    1996-01-01

    A two-distance set in E^d is a point set X in the d-dimensional Euclidean space such that the distances between distinct points in X assume only two different non-zero values. Based on results from classical distance geometry, we develop an algorithm to classify, for a given dimension, all maximal (largest possible) two-distance sets in E^d. Using this algorithm we have completed the full classification for all dimensions less than or equal to 7, and we have found one set in E^8 whose maximality follows from Blokhuis' upper bound on sizes of s-distance sets. While in the dimensions less than or equal to 6...
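    The defining property is easy to verify computationally; a minimal checker (illustrative only, not the classification algorithm from the paper):

```python
from itertools import combinations

def is_two_distance_set(points):
    """True iff the pairwise distances between distinct points take
    exactly two distinct non-zero values. Squared distances are
    compared, rounded to absorb floating-point noise."""
    sq = {round(sum((a - b) ** 2 for a, b in zip(p, q)), 9)
          for p, q in combinations(points, 2)}
    return len(sq) == 2

# The vertices of a unit square form a two-distance set in E^2:
# the side (length 1) and the diagonal (length sqrt(2)).
square = [(0, 0), (1, 0), (0, 1), (1, 1)]
```

    Classification is the hard direction: checking a candidate set is trivial, but enumerating all maximal sets in a given dimension requires the distance-geometry machinery the paper describes.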

  7. Localization of the solution of a one-dimensional one-phase Stefan problem

    OpenAIRE

    Cortazar, C.; Elgueta, M.; Primicerio, M.

    1996-01-01

    We study localization, the set of blow up points and some aspects of the speed of the free boundary of solutions of a one-dimensional, one-phase Stefan problem.

  8. Action plan for renewable energy sources

    International Nuclear Information System (INIS)

    2000-03-01

    In the Finnish Energy Strategy, approved by the Finnish Government in 1997, the emphasis is laid on the importance of bioenergy and other renewable energy sources in creating the prerequisites for a Finnish energy economy in which the supply of energy is secured, the price of energy is competitive and the emissions from energy generation are within the limits set by the international commitments made by Finland. In 1998, the European Union Meeting of the Ministers of Energy adopted a resolution taking a positive attitude to the Communication from the Commission 'Energy for the future: Renewable sources of energy' - White Paper for a Community Strategy and Action Plan. National measures play a key role in the achievement of the objectives set in the White Paper. This Action Plan for Renewable Energy Sources is a national programme in line with the EU's White Paper. It comprises all renewable sources of energy available in Finland, including peat, which in Finland has traditionally been considered a solid biofuel but is internationally classified as a non-renewable source of energy. The Action Plan sets objectives for the volume of renewable energy sources used in the year 2010 and includes a prognosis of the development by the year 2025. The goal is that by the year 2010 the volume of energy generated using renewable energy sources will have increased by 50% compared with the year 1995. This would mean an increase of 3 Mtoe, which is about 1 Mtoe more than anticipated in the outlook based on the Finnish Energy Strategy. A further goal is to double the use of renewable energy sources by the year 2025. The aggregate use of renewable energy sources depends to a large extent both on the development of the price of energy produced using other energy sources and on possible changes in the production volume of the Finnish forest industry. The most important objective stated in the Action Plan is to improve the competitiveness of renewable

  9. Infection prevention needs assessment in Colorado hospitals: rural and urban settings.

    Science.gov (United States)

    Reese, Sara M; Gilmartin, Heather; Rich, Karen L; Price, Connie S

    2014-06-01

    The purpose of our study was to conduct a needs assessment for infection prevention programs in both rural and urban hospitals in Colorado. Infection control professionals (ICPs) from Colorado hospitals participated in an online survey on training, personnel, and experience; ICP time allocation; and types of surveillance. Responses were evaluated and compared based on hospital status (rural or urban). Additionally, rural ICPs participated in an interview about resources and training. Surveys were received from 62 hospitals (77.5% response); 33 rural (75.0% response) and 29 urban (80.6% response). Fifty-two percent of rural ICPs reported multiple job responsibilities compared with 17.2% of urban ICPs. Median length of experience for rural ICPs was 4.0 years compared with 11.5 years for urban ICPs (P = .008). Fifty-one percent of rural ICPs reported no access to infectious disease physicians (0.0% urban) and 81.8% of rural hospitals reported no antimicrobial stewardship programs (31.0% urban). Through the interviews it was revealed that priorities for rural ICPs were training and communication. Our study revealed numerous differences between infection prevention programs in rural versus urban hospitals. An infection prevention outreach program established in Colorado could potentially address the challenges faced by rural hospital infection prevention departments. Copyright © 2014 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Mosby, Inc. All rights reserved.

  10. Neutron source

    International Nuclear Information System (INIS)

    Cason, J.L. Jr.; Shaw, C.B.

    1975-01-01

    A neutron source which is particularly useful for neutron radiography consists of a vessel containing a moderating media of relatively low moderating ratio, a flux trap including a moderating media of relatively high moderating ratio at the center of the vessel, a shell of depleted uranium dioxide surrounding the moderating media of relatively high moderating ratio, a plurality of guide tubes each containing a movable source of neutrons surrounding the flux trap, a neutron shield surrounding one part of each guide tube, and at least one collimator extending from the flux trap to the exterior of the neutron source. The shell of depleted uranium dioxide has a window provided with depleted uranium dioxide shutters for each collimator. Reflectors are provided above and below the flux trap and on the guide tubes away from the flux trap

  11. Mortality and Clostridium difficile infection in an Australian setting.

    Science.gov (United States)

    Mitchell, Brett G; Gardner, Anne; Hiller, Janet E

    2013-10-01

    To quantify the risk of death associated with Clostridium difficile infection in an Australian tertiary hospital. Two reviews examining Clostridium difficile infection and mortality indicate that Clostridium difficile infection is associated with increased mortality in hospitalized patients. Studies investigating the mortality of Clostridium difficile infection in settings outside of Europe and North America are required so that the epidemiology of Clostridium difficile infection in these regions can be understood and appropriate prevention strategies put in place. An observational non-concurrent cohort study design was used. Data from all persons who had Clostridium difficile infection (exposed) and a matched sample of persons who did not (non-exposed), for the calendar years 2007-2010, were analysed. The risk of dying within 30, 60, 90 and 180 days was compared between the two groups. Kaplan-Meier survival analysis and conditional logistic regression models were applied to the data to examine time to death and mortality risk adjusted for comorbidities using the Charlson Comorbidity Index. One hundred and fifty-eight cases of infection were identified. A statistically significant difference in all-cause mortality was identified between the exposed and non-exposed groups at 60 and 180 days. In a conditional regression model, mortality in the exposed group was significantly higher at 180 days. In this Australian study, Clostridium difficile infection was associated with increased mortality. This finding highlights the need for nurses to immediately instigate contact precautions for persons suspected of having Clostridium difficile infection and to facilitate timely faecal collection for testing. Our findings support ongoing surveillance of Clostridium difficile infection and associated prevention and control activities. © 2013 Blackwell Publishing Ltd.
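    The Kaplan-Meier survival estimate used in such analyses can be computed in a few lines (a generic sketch on made-up follow-up data, not the study's dataset):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up times; events: 1 = death observed, 0 = censored.
    Returns a list of (t, S(t)) pairs at each distinct event time."""
    data = sorted(zip(times, events))
    s, out, i = 1.0, [], 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        at_risk = sum(1 for tt, _ in data if tt >= t)
        if deaths:
            s *= 1.0 - deaths / at_risk
            out.append((t, s))
        i += sum(1 for tt, _ in data if tt == t)  # skip ties at t
    return out
```

    Censored observations reduce the number at risk at later times without themselves counting as deaths, which is what distinguishes this estimator from a naive death-rate calculation.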

  12. Source characterization of underground explosions from hydrodynamic-to-elastic coupling simulations

    Science.gov (United States)

    Chiang, A.; Pitarka, A.; Ford, S. R.; Ezzedine, S. M.; Vorobiev, O.

    2017-12-01

    A major improvement in ground motion simulation capabilities for underground explosion monitoring during the first phase of the Source Physics Experiment (SPE) is the development of a wave propagation solver that can propagate explosion-generated non-linear near-field ground motions to the far field. The calculation is done using a hybrid modeling approach with one-way hydrodynamic-to-elastic coupling in three dimensions, where near-field motions are computed using GEODYN-L, a Lagrangian hydrodynamics code, and then passed to WPP, an elastic finite-difference code for seismic waveform modeling. The advancement in ground motion simulation capabilities gives us the opportunity to assess moment tensor inversion of a realistic volumetric source with near-field effects in a controlled setting, where we can evaluate the recovered source properties as a function of modeling parameters (e.g., the velocity model) and gain insights into previous source studies on SPE Phase I chemical shots and other historical nuclear explosions. For example, the moment tensor inversion of far-field SPE seismic data demonstrated that while vertical motions are well modeled using existing velocity models, large misfits still persist in predicting the tangential shear wave motions from explosions. One possible explanation we can explore is errors and uncertainties in the underlying Earth model. Here we investigate the recovered moment tensor solution, particularly the non-volumetric component, by inverting far-field ground motions simulated from physics-based explosion source models in fractured material, where the physics-based source models are based on modeling of the SPE-4P, SPE-5 and SPE-6 near-field data. The hybrid modeling approach provides new prospects for modeling explosion sources and understanding the associated uncertainties.

  13. Genotypic and phenotypic diversity of Ralstonia pickettii and Ralstonia insidiosa isolates from clinical and environmental sources including High-purity Water.

    LENUS (Irish Health Repository)

    Ryan, Michael P

    2011-08-30

    Abstract Background: Ralstonia pickettii is a nosocomial infectious agent and a significant industrial contaminant. It has been found in many different environments including clinical situations, soil and industrial High Purity Water. This study compares the phenotypic and genotypic diversity of a selection of strains of Ralstonia collected from a variety of sources. Results: Ralstonia isolates (fifty-nine) from clinical, industrial and environmental origins were compared genotypically using i) species-specific PCR, ii) PCR and sequencing of the 16S-23S rRNA interspatial region (ISR), iii) the fliC gene, iv) RAPD and BOX-PCR, and v) phenotypically using biochemical testing. The species-specific PCR identified fifteen out of fifty-nine designated R. pickettii isolates as actually being the closely related species R. insidiosa. PCR-ribotyping of the 16S-23S rRNA ISR indicated few major differences between the isolates. Analysis of all isolates demonstrated different banding patterns for both the RAPD and BOX primers; however, these were found not to vary significantly. Conclusions: R. pickettii species isolated from wide geographic and environmental sources appear to be reasonably homogenous based on genotypic and phenotypic characteristics. R. insidiosa can at present only be distinguished from R. pickettii using species-specific PCR. R. pickettii and R. insidiosa isolates do not differ significantly phenotypically or genotypically based on environmental or geographical origin.

  14. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    International Nuclear Information System (INIS)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P.

    2012-09-01

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)

  15. Using Bayesian Belief Network (BBN) modelling for Rapid Source Term Prediction. RASTEP Phase 1

    Energy Technology Data Exchange (ETDEWEB)

    Knochenhauer, M.; Swaling, V.H.; Alfheim, P. [Scandpower AB, Sundbyberg (Sweden)

    2012-09-15

    The project is connected to the development of RASTEP, a computerized source term prediction tool aimed at providing a basis for improving off-site emergency management. RASTEP uses Bayesian belief networks (BBN) to model severe accident progression in a nuclear power plant in combination with pre-calculated source terms (i.e., amount, timing, and pathway of released radio-nuclides). The output is a set of possible source terms with associated probabilities. In the NKS project, a number of complex issues associated with the integration of probabilistic and deterministic analyses are addressed. This includes issues related to the method for estimating source terms, signal validation, and sensitivity analysis. One major task within Phase 1 of the project addressed the problem of how to make the source term module flexible enough to give reliable and valid output throughout the accident scenario. Of the alternatives evaluated, it is recommended that RASTEP is connected to a fast running source term prediction code, e.g., MARS, with a possibility of updating source terms based on real-time observations. (Author)
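The RASTEP approach described above couples a Bayesian belief network over accident progression with pre-calculated source terms. A minimal sketch of the underlying inference step (the states, indicators, and probabilities below are invented for illustration and are not RASTEP's actual model):

```python
# Hypothetical two-layer belief network: a hidden accident state emits
# binary plant observations; each state indexes a pre-calculated source term.
prior = {"intact": 0.7, "vented": 0.2, "bypass": 0.1}

# P(indicator observed | state), assumed values for illustration only.
likelihood = {
    "intact": {"high_pressure": 0.2, "rad_alarm": 0.05},
    "vented": {"high_pressure": 0.1, "rad_alarm": 0.7},
    "bypass": {"high_pressure": 0.6, "rad_alarm": 0.9},
}

def posterior(observed):
    """Posterior over accident states given a set of observed indicators."""
    unnorm = {}
    for state, p in prior.items():
        for obs in observed:
            p *= likelihood[state][obs]
        unnorm[state] = p
    z = sum(unnorm.values())
    return {s: p / z for s, p in unnorm.items()}

# As real-time observations arrive, the posterior is updated; the tool's
# output is the set of pre-calculated source terms weighted by it.
post = posterior({"rad_alarm"})
```

Real BBNs as used in RASTEP have many interdependent nodes and use proper network inference rather than this naive enumeration, but the output has the same shape: possible source terms with associated probabilities.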

  16. Tsunami hazard at the Western Mediterranean Spanish coast from seismic sources

    Directory of Open Access Journals (Sweden)

    J. A. Álvarez-Gómez

    2011-01-01

    Full Text Available Spain represents an important part of the tourism sector in the Western Mediterranean, which has been affected in the past by tsunamis. Although the tsunami risk at the Spanish coasts is not the highest of the Mediterranean, the necessity of tsunami risk mitigation measures should not be neglected. In the Mediterranean area, Spain is exposed to two different tectonic environments with contrasting characteristics. On one hand, the Alboran Basin is characterised by transcurrent and transpressive tectonics and, on the other hand, the North Algerian fold and thrust belt is characterised by compressive tectonics. A set of 22 seismic tsunamigenic sources has been used to estimate the tsunami threat over the Spanish Mediterranean coast of the Iberian Peninsula and the Balearic Islands. Maximum wave elevation maps and tsunami travel times have been computed by means of numerical modelling, and we have obtained estimations of threat levels for each source over the Spanish coast. The sources on the western edge of North Algeria are the most dangerous, due to their threat to the south-eastern coast of the Iberian Peninsula and to the western Balearic Islands. In general, the Northern Algerian sources pose a greater risk to the Spanish coast than the Alboran Sea sources, which only threaten the peninsular coast. In the Iberian Peninsula, the Spanish provinces of Almeria and Murcia are the most exposed, while all the Balearic Islands can be affected by the North Algerian sources with probable severe damage, especially the islands of Ibiza and Minorca. The results obtained in this work are useful for planning future regional and local warning systems, as well as for setting the priority areas for research on detailed tsunami risk.

  17. The PLOS ONE Synthetic Biology Collection: Six Years and Counting

    Science.gov (United States)

    Peccoud, Jean; Isalan, Mark

    2012-01-01

    Since it was launched in 2006, PLOS ONE has published over fifty articles illustrating the many facets of the emerging field of synthetic biology. This article reviews these publications by organizing them into broad categories focused on DNA synthesis and assembly techniques, the development of libraries of biological parts, the use of synthetic biology in protein engineering applications, and the engineering of gene regulatory networks and metabolic pathways. Finally, we review articles that describe enabling technologies such as software and modeling, along with new instrumentation. In order to increase the visibility of this body of work, the papers have been assembled into the PLOS ONE Synthetic Biology Collection (www.ploscollections.org/synbio). Many of the innovative features of the PLOS ONE web site will help make this collection a resource that will support a lively dialogue between readers and authors of PLOS ONE synthetic biology papers. The content of the collection will be updated periodically by including relevant articles as they are published by the journal. Thus, we hope that this collection will continue to meet the publishing needs of the synthetic biology community. PMID:22916228

  18. IVF twins: buy one get one free?

    Science.gov (United States)

    Ismail, Laura; Mittal, Monica; Kalu, Emmanuel

    2012-10-01

    There has been an overall increase in the incidence of multiple pregnancies and assisted reproduction technology is largely responsible for this rise. Although twins may appeal to couples undergoing in vitro fertilisation (IVF), they have been associated with serious health consequences to the babies, their mothers and the family unit, as well as having massive financial implications for the National Health Service. Transfer of more than one embryo during IVF is mainly responsible for IVF twins, and elective transfer of a single embryo at a time with cryopreservation of surplus embryos for later transfer has been shown to be an effective strategy to minimise the risk of twins without compromising IVF success rates. Factors that will impact on the success of the policy of elective single embryo transfer (eSET) include improvement in embryo selection for transfer, better cryopreservation techniques and adequate state funding for IVF. However, in implementing the policy of eSET it is important that each case is assessed on an individual basis since in some situations (e.g. in older women) the transfer of two embryos may be more cost effective. Adequate and continuous education of all stakeholders is essential if the policy of eSET is to be successful in the UK.

  19. Pollen sources in the Bojanów forest complex identified on honeybee pollen load by microscopic analysis

    Directory of Open Access Journals (Sweden)

    Ernest Stawiarz

    2017-11-01

    Full Text Available The aim of this study was to determine sources of pollen for the honeybee in the Bojanów forest complex, Nowa Dęba Forest District (southeastern Poland). Sampling of pollen loads from bees extended from the beginning of May until the end of September 2016 and was carried out at 7-day intervals using pollen traps mounted at the entrance of beehives. A total of 73 pollen load samples were collected from the study area. Fifty-nine taxa from 31 plant families were identified in the analyzed material. From 4 to 21 taxa (average 9.5) were recorded in one sample. The pollen of Brassicaceae (“others”), Taraxacum type, Solidago type, and Rumex had the highest frequency in the pollen loads examined. Apart from these four taxa, pollen grains of Rubus type, Poaceae (“others”), Calluna, Fagopyrum, Trifolium repens s. l., Phacelia, Aster type, Melampyrum, Quercus, Cornus, and Veronica were recorded in the dominant pollen group. The forest habitat taxa that provided pollen rewards to honeybees in the Bojanów forest complex were the following: Rubus, Calluna, Prunus, Tilia, Frangula alnus, Pinus, Quercus, Cornus, Robinia pseudoacacia, Salix, and Vaccinium. Apart from forest vegetation, the species from meadows and wastelands adjacent to this forest complex, represented by Taraxacum, Rumex, Plantago, Poaceae, Trifolium repens, and Solidago, proved to be an important source of pollen. The study indicates that forest communities are a valuable source of pollen for pollinating insects from early spring through to late fall.

  20. 31 CFR 92.4 - Uncirculated Mint Sets.

    Science.gov (United States)

    2010-07-01

    ... 31 Money and Finance: Treasury 1 2010-07-01 2010-07-01 false Uncirculated Mint Sets. 92.4 Section... OPERATIONS AND PROCEDURES Numismatic Operations § 92.4 Uncirculated Mint Sets. Uncirculated Mint Sets, i.e., specially packaged coin sets containing one coin of each denomination struck at the Mints at Philadelphia...

  1. Development of irradiator {sup 60}Co sources

    Energy Technology Data Exchange (ETDEWEB)

    Mosca, Rodrigo C.; Moura, Eduardo S.; Zeituni, Carlos A.; Mathor, Monica B., E-mail: rcmosca@usp.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2011-07-01

    According to a recent report by the International Agency for Research on Cancer (IARC)/WHO (2008-2010), the global impact of cancer more than doubled in 30 years. In this report, it was estimated that about 12 million new cancer cases and 7 million deaths occurred. For Brazil, estimates made in 2010 for the year 2011 point to the occurrence of 489,270 new cases of cancer. Among the possibilities for cancer treatment, radiotherapy is one of the most important therapeutic resources used to combat the disease. However, complications inherent to the treatment can occur, such as tiredness, loss of appetite, radiodermatitis and, in more extreme cases, late radionecrosis. In order to reproduce a point of radionecrosis in the vicinity of radiodermatitis, mimicking these effects in animals and producing a model for the assessment of tissue repair, we propose the setting up of an irradiator with collimated {sup 60}Co sources. The development was based on 11 {sup 60}Co sources of 1 mm thickness that were inserted into a stainless steel 'gate-source' screw (patent pending) and later arranged in a reinforced cross-shaped configuration so that the radiation beam is directed to a target point, sparing the surrounding regions. The main use of this irradiator is to cause a single radionecrosis point (target point) of approximately 5 mm{sup 2}, with a surrounding adjacent area of radiodermatitis of about 8 to 10 mm{sup 2}, in laboratory animals, for subsequent coating with an epidermal-dermal matrix populated by a cell culture of human fibroblasts, keratinocytes and mesenchymal stem cells. Its use will thus be valuable for the evaluation of curative, rather than merely palliative, treatments against radiodermatitis and radionecrosis. (author)

  2. newspapers' agricultural agenda setting and extension agents ...

    African Journals Online (AJOL)

    p2333147

    Keywords: Newspapers', agricultural, extension agents' agenda setting. ABSTRACT ... from the priorities of political or other interest groups to the news priorities of media ... people. The questions that arise are: what are the sources operating for ... The ADPs presently adopt the training and visit (T & V) system of extension.

  3. Scoping Study of Machine Learning Techniques for Visualization and Analysis of Multi-source Data in Nuclear Safeguards

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Yonggang

    2018-05-07

    In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.

  4. Developing and setting up optical methods to study the speckle patterns created by optical beam smoothing

    International Nuclear Information System (INIS)

    Surville, J.

    2005-12-01

    We have developed three main optical methods to study the speckles generated by a smoothed laser source. The first method addresses the measurement of the temporal and spatial correlation functions of the source, with a modified Michelson interferometer. The second one is a pump-probe technique created to capture an image of a speckle pattern generated at a set time. The third one is an evolution of the second method dedicated to time-frequency coding, thanks to a frequency-chirped probe pulse; thus, the speckles can be followed in time and their motion can be described. With these three methods, the average size and duration of the speckles can be measured. It is also possible to measure the size and duration of each of them and, most importantly, their velocity in a given direction. All the results obtained have been compared with the different existing theories. We show that the statistical distributions of the measured speckles' size and intensity agree satisfactorily with theoretical values
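A standard way to extract an average speckle size from an intensity pattern, as measured by methods like those above, is the width of the spatial autocorrelation of the intensity fluctuations. A minimal sketch on a synthetic speckle field (the aperture size and the half-width criterion are assumptions for illustration, not the thesis' measurement chain):

```python
import numpy as np

# Estimate average speckle size from the autocorrelation of the intensity,
# computed with FFTs (Wiener-Khinchin theorem). Synthetic data only.
rng = np.random.default_rng(1)

n = 256
k = np.fft.fftfreq(n)
kx, ky = np.meshgrid(k, k)
aperture = (np.hypot(kx, ky) < 0.05).astype(float)  # assumed aperture radius
# Random phases through a band-limited aperture give a speckle-like field.
field = np.fft.ifft2(aperture * np.exp(2j * np.pi * rng.random((n, n))))
intensity = np.abs(field) ** 2

# Circular autocorrelation of the intensity fluctuations, normalized at lag 0.
fluct = intensity - intensity.mean()
acorr = np.fft.ifft2(np.abs(np.fft.fft2(fluct)) ** 2).real
acorr /= acorr[0, 0]

# Average speckle radius ~ half-width at half-maximum of the central row.
row = np.fft.fftshift(acorr)[n // 2]
center = n // 2
size = np.argmax(row[center:] < 0.5)   # first lag below 0.5, in pixels
```

The same autocorrelation, taken along the time axis of a chirp-coded acquisition, would give the average speckle duration.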

  5. Radiation protection and fuzzy set theory

    International Nuclear Information System (INIS)

    Nishiwaki, Y.

    1993-01-01

    In radiation protection we encounter a variety of sources of uncertainty which are due to fuzziness in our cognition or perception of objects. For systematic treatment of this type of uncertainty, the concepts of fuzzy sets or fuzzy measures can be applied to construct system models which take into consideration both subjective (intrinsic) and objective (extrinsic) fuzziness. The theory of fuzzy sets and fuzzy measures is still in a developing stage, but its concepts may be applied to various problems of subjective risk perception, nuclear safety and radiation protection, as well as to problems of the man-machine interface and human factors engineering or ergonomics

  6. Geographic patterns of carbon dioxide emissions from fossil-fuel burning, hydraulic cement production, and gas flaring on a one degree by one degree grid cell basis: 1950 to 1990

    Energy Technology Data Exchange (ETDEWEB)

    Brenkert, A.L. [ed.] [Oak Ridge National Lab., TN (United States). Carbon Dioxide Information Analysis Center; Andres, R.J. [Univ. of Alaska, Fairbanks, AK (United States). Inst. of Northern Engineering; Marland, G. [Oak Ridge National Lab., TN (United States). Environmental Sciences Div.; Fung, I. [Univ. of Victoria, British Columbia (Canada)]|[National Aeronautics and Space Administration, New York, NY (United States). Goddard Inst. for Space Studies; Matthews, E. [Columbia Univ., New York, NY (United States)]|[National Aeronautics and Space Administration, New York, NY (United States). Goddard Inst. for Space Studies

    1997-03-01

    Data sets of one degree latitude by one degree longitude carbon dioxide (CO{sub 2}) emissions in units of thousand metric tons of carbon (C) per year from anthropogenic sources have been produced for 1950, 1960, 1970, 1980 and 1990. Detailed geographic information on CO{sub 2} emissions can be critical in understanding the pattern of the atmospheric and biospheric response to these emissions. Global, regional and national annual estimates for 1950 through 1992 were published previously. Those national, annual CO{sub 2} emission estimates were based on statistics on fossil-fuel burning, cement manufacturing and gas flaring in oil fields, as well as energy production, consumption and trade data, using the methods of Marland and Rotty. The national annual estimates were combined with gridded one-degree data on political units and 1984 human populations to create the new gridded CO{sub 2} emission data sets. The same population distribution was used for each of the years as a proxy for the emission distribution within each country. The implied assumption of that procedure is that per capita energy use and fuel mix are uniform over a political unit. The consequence of this first-order procedure is that the spatial changes observed over time are solely due to changes in national energy consumption and nation-based fuel mix. Increases in emissions over time are apparent for most areas.
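The gridding procedure described above distributes each national total over that country's grid cells in proportion to each cell's share of the national population. A minimal sketch with invented numbers (country names, totals, and populations are hypothetical):

```python
# National CO2 emission totals (kt C/yr), hypothetical values.
national_emissions = {"A": 1200.0, "B": 300.0}

# (cell_id, country, population) for a toy one-degree grid.
cells = [
    ("a1", "A", 5_000_000),
    ("a2", "A", 1_000_000),
    ("b1", "B", 2_000_000),
    ("b2", "B", 2_000_000),
]

# Total population per country, used as the distribution denominator.
country_pop = {}
for _, country, pop in cells:
    country_pop[country] = country_pop.get(country, 0) + pop

# Each cell receives the national total weighted by its population share,
# which encodes the stated assumption of uniform per capita energy use.
gridded = {
    cell: national_emissions[country] * pop / country_pop[country]
    for cell, country, pop in cells
}
```

By construction the cell values of each country sum back to its national total, so the gridded product conserves the published national estimates.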

  7. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  8. Polar source analysis : technical memorandum

    Science.gov (United States)

    2017-09-29

    The following technical memorandum describes the development, testing and analysis of various polar source data sets. The memorandum also includes recommendations for potential inclusion in future releases of AEDT. This memorandum is the final deliver...

  9. Studying constructions of national identity across historical settings

    DEFF Research Database (Denmark)

    Ydesen, Christian; Øland, Trine

    2018-01-01

    This article aims to demonstrate how constructions of national identity can be studied across historical settings. In this sense, the article contributes knowledge about how Danish-ness is constructed in two historical settings characterized by great upheavals in popular moral codes, culture, and national self-imagery. The first setting is the first decade after World War II and the second setting is the post-9/11 era. The empirical focus is based on sources pertaining to the way police officers and related professionals of the Danish welfare nation-state construct disturbing behavior and how these constructions are made into categories that activate an array of interventions. Using a comparative outlook between the two historical settings and by putting theoretically guided questions to work empirically, the purpose of this article is to understand 1) the boundaries of legitimate behavior and membership...

  10. Reconstruction of extended sources for the Helmholtz equation

    KAUST Repository

    Kress, Rainer

    2013-02-26

    The basis of most imaging methods is to detect hidden obstacles or inclusions within a body when one can only make measurements on an exterior surface. Our underlying model is that of inverse acoustic scattering based on the Helmholtz equation. Our inclusions are interior forces with compact support and our data consist of a single measurement of near-field Cauchy data on the external boundary. We propose an algorithm that under certain assumptions allows for the determination of the support set of these forces by solving a simpler 'equivalent point source' problem, and which uses a Newton scheme to improve the corresponding initial approximation. © 2013 IOP Publishing Ltd.

  11. Glucose becomes one of the worst carbon sources for E.coli on poor nitrogen sources due to suboptimal levels of cAMP

    Science.gov (United States)

    Bren, Anat; Park, Junyoung O.; Towbin, Benjamin D.; Dekel, Erez; Rabinowitz, Joshua D.; Alon, Uri

    2016-01-01

    In most conditions, glucose is the best carbon source for E. coli: it provides faster growth than other sugars, and is consumed first in sugar mixtures. Here we identify conditions in which E. coli strains grow slower on glucose than on other sugars, namely when a single amino acid (arginine, glutamate, or proline) is the sole nitrogen source. In sugar mixtures with these nitrogen sources, E. coli still consumes glucose first, but grows faster rather than slower after exhausting glucose, generating a reversed diauxic shift. We trace this counterintuitive behavior to a metabolic imbalance: levels of TCA-cycle metabolites including α-ketoglutarate are high, and levels of the key regulatory molecule cAMP are low. Growth rates were increased by experimentally increasing cAMP levels, either by adding external cAMP, by genetically perturbing the cAMP circuit or by inhibition of glucose uptake. Thus, the cAMP control circuitry seems to have a ‘bug’ that leads to slow growth under what may be an environmentally rare condition. PMID:27109914

  12. Source-space ICA for MEG source imaging.

    Science.gov (United States)

    Jonmohamadi, Yaqub; Jones, Richard D

    2016-02-01

    One of the most widely used approaches in electroencephalography/magnetoencephalography (EEG/MEG) source imaging is application of an inverse technique (such as dipole modelling or sLORETA) on the component extracted by independent component analysis (ICA) (sensor-space ICA + inverse technique). The advantage of this approach over an inverse technique alone is that it can identify and localize multiple concurrent sources. Among inverse techniques, the minimum-variance beamformers offer a high spatial resolution. However, in order to have both the high spatial resolution of the beamformer and the ability to handle multiple concurrent sources, sensor-space ICA + beamformer is not an ideal combination. We propose source-space ICA for MEG as a powerful alternative approach which can provide the high spatial resolution of the beamformer and handle multiple concurrent sources. The concept of source-space ICA for MEG is to apply the beamformer first and then singular value decomposition + ICA. In this paper we have compared source-space ICA with sensor-space ICA both in simulation and real MEG. The simulations included two challenging scenarios of correlated/concurrent cluster sources. Source-space ICA provided superior performance in spatial reconstruction of source maps, even though both techniques performed equally from a temporal perspective. Real MEG data from two healthy subjects with visual stimuli were also used to compare the performance of sensor-space ICA and source-space ICA. We have also proposed a new variant of minimum-variance beamformer called weight-normalized linearly-constrained minimum-variance with orthonormal lead-field. As sensor-space ICA-based source reconstruction is popular in EEG and MEG imaging, and given that source-space ICA has superior spatial performance, it is expected that source-space ICA will supersede its predecessor in many applications.
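The "beamformer first, then SVD + ICA" pipeline can be sketched in a few lines. This is a simplified illustration with synthetic data and stand-in lead fields (not the paper's implementation); the final ICA step is omitted, with the SVD standing in as the dimensionality-reduction stage that precedes it:

```python
import numpy as np

rng = np.random.default_rng(2)

n_sensors, n_times, n_sources = 32, 500, 3
L = rng.normal(size=(n_sensors, n_sources))              # stand-in lead fields
s = rng.normal(size=(n_sources, n_times))                # source activity
x = L @ s + 0.1 * rng.normal(size=(n_sensors, n_times))  # sensor data

C = np.cov(x)                                        # sensor covariance
Cinv = np.linalg.inv(C + 1e-6 * np.eye(n_sensors))   # regularized inverse

# LCMV beamformer weights for each candidate source location:
# w = C^-1 l / (l^T C^-1 l), which enforces unit gain on its own
# lead field (w^T l = 1) while minimizing output variance.
W = np.stack([Cinv @ L[:, i] / (L[:, i] @ Cinv @ L[:, i])
              for i in range(n_sources)])

source_ts = W @ x        # project sensor data into source space

# SVD compresses the (typically huge) source space before ICA is applied
# to the reduced time series in the approach described above.
U, svals, Vt = np.linalg.svd(source_ts, full_matrices=False)
```

In a real application the candidate locations come from a volumetric grid over the brain, so the beamformer output has thousands of rows and the SVD step is essential before ICA.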

  13. The continuous time random walk, still trendy: fifty-year history, state of art and outlook

    Science.gov (United States)

    Kutner, Ryszard; Masoliver, Jaume

    2017-03-01

    In this article we demonstrate the very inspiring role of the continuous-time random walk (CTRW) formalism, the numerous modifications permitted by its flexibility, its various applications, and the promising perspectives in various fields of knowledge. A short review of significant achievements and possibilities is given. However, this review is still far from complete. We focus on the pivotal role of CTRWs mainly in anomalous stochastic processes discovered in physics and beyond. This article plays the role of an extended announcement of the Eur. Phys. J. B Special Issue [http://epjb.epj.org/open-calls-for-papers/123-epj-b/1090-ctrw-50-years-on] containing articles which show the incredible possibilities of the CTRWs. Contribution to the Topical Issue "Continuous Time Random Walk Still Trendy: Fifty-year History, Current State and Outlook", edited by Ryszard Kutner and Jaume Masoliver.

  14. Refreshing File Aggregate of Distributed Data Warehouse in Sets of Electric Apparatus

    Institute of Scientific and Technical Information of China (English)

    YU Baoqin; WANG Taiyong; ZHANG Jun; ZHOU Ming; HE Gaiyun; LI Guoqin

    2006-01-01

    Integrating heterogeneous data sources is a precondition for sharing data among enterprises. Highly efficient data updating can both save system expenses and offer real-time data. Rapid data modification in the pre-processing area of the data warehouse is one of the hot issues. An extract-transform-load design is proposed based on a new data algorithm called Diff-Match, which is developed by utilizing mode matching and data-filtering technology. It can accelerate data renewal, filter the heterogeneous data, and seek out differing sets of data. Its efficiency has been proved by its successful application in an enterprise of electric apparatus groups.

  15. A Maximum Resonant Set of Polyomino Graphs

    Directory of Open Access Journals (Sweden)

    Zhang Heping

    2016-05-01

    Full Text Available A polyomino graph P is a connected finite subgraph of the infinite plane grid such that each finite face is surrounded by a regular square of side length one and each edge belongs to at least one square. A dimer covering of P corresponds to a perfect matching. Different dimer coverings can interact via an alternating cycle (or square with respect to them. A set of disjoint squares of P is a resonant set if P has a perfect matching M so that each one of those squares is M-alternating. In this paper, we show that if K is a maximum resonant set of P, then P − K has a unique perfect matching. We further prove that the maximum forcing number of a polyomino graph is equal to the cardinality of a maximum resonant set. This confirms a conjecture of Xu et al. [26]. We also show that if K is a maximal alternating set of P, then P − K has a unique perfect matching.

  16. Calculated and measured brachytherapy dosimetry parameters in water for the Xoft Axxent X-Ray Source: An electronic brachytherapy source

    International Nuclear Information System (INIS)

    Rivard, Mark J.; Davis, Stephen D.; DeWerd, Larry A.; Rusch, Thomas W.; Axelrod, Steve

    2006-01-01

    A new x-ray source, the model S700 Axxent{sup TM} X-Ray Source (Source), has been developed by Xoft Inc. for electronic brachytherapy. Unlike brachytherapy sources containing radionuclides, this Source may be turned on and off at will and may be operated at variable currents and voltages to change the dose rate and penetration properties. The in-water dosimetry parameters for this electronic brachytherapy source have been determined from measurements and calculations at 40, 45, and 50 kV settings. Monte Carlo simulations of radiation transport utilized the MCNP5 code and the EPDL97-based mcplib04 cross-section library. Inter-tube consistency was assessed for 20 different Sources, measured with a PTW 34013 ionization chamber. As the Source is intended to be used for a maximum of ten treatment fractions, tube stability was also assessed. Photon spectra were measured using a high-purity germanium (HPGe) detector, and calculated using MCNP. Parameters used in the two-dimensional (2D) brachytherapy dosimetry formalism were determined. While the Source was characterized as a point due to the small anode size, P(5) values were 0.20, 0.24, and 0.29 for the 40, 45, and 50 kV voltage settings, respectively. The dosimetric behavior was comparable to that of {sup 125}I and {sup 103}Pd, yet with capability for variable and much higher dose rates and subsequently adjustable penetration capabilities. This paper presents the calculated and measured in-water brachytherapy dosimetry parameters for the model S700 Source at the aforementioned three operating voltages
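The 2D brachytherapy dosimetry formalism referenced above (the AAPM TG-43 formalism) combines an air-kerma strength, a dose-rate constant, a geometry factor, a radial dose function, and an anisotropy function. A minimal sketch of its point-source form; all numerical values and the fitted functions here are hypothetical placeholders, not the measured parameters of the S700 Source:

```python
import math

# Point-source form of the TG-43 dose-rate equation:
# dose_rate(r) = S_K * Lambda * (r0/r)^2 * g(r) * phi_an(r)
S_K = 1.0        # air-kerma strength, U (hypothetical)
LAMBDA = 1.1     # dose-rate constant, cGy h^-1 U^-1 (hypothetical)
R0 = 1.0         # reference distance, cm

def g(r):
    """Radial dose function, normalized to 1 at r0 (assumed exponential fit)."""
    return math.exp(-0.5 * (r - R0))

def phi_an(r):
    """1D anisotropy function (assumed constant here)."""
    return 0.95

def dose_rate(r):
    """Dose rate in water at distance r (cm) on the transverse axis."""
    return S_K * LAMBDA * (R0 / r) ** 2 * g(r) * phi_an(r)
```

For an electronic source, g(r) and the dose-rate constant change with the operating voltage, which is why the abstract reports separate parameter sets at 40, 45, and 50 kV.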

  17. The sources of process innovation in user firms: an exploration of the antecedents and impact of non-R&D innovation and learning-by-doing

    OpenAIRE

    Bogers, Marcel; Foray, Dominique

    2009-01-01

    Previous research has shown that innovation can have various sources and different forms. Innovation can thus be developed by different types of organizations or individuals for several reasons, and it can be related to product or process technologies. One particular stream of research has explored the role of users as a source of innovation. Examples of user innovations can be found in a variety of fields and settings such as mountain biking, snowboarding, open source software, scientific in...

  18. Open Source Software: critical review of scientific literature and other sources

    OpenAIRE

    Querol del Amo, Marc

    2007-01-01

    This thesis presents the results of a survey of Open Source Licensing literature. It aims to assist the reader in choosing the best license for his/her business. For this reason, the content of this thesis can be divided into: (i) an open source licensing overview, (ii) the explication of the main features of the most popular open source licenses, (iii) the consequences of using one or another and (iv) the critical or controversial issues related to Open Source Licensing. Furthermore, at the ...

  19. Stereoscopic radiographic images with gamma source encoding

    International Nuclear Information System (INIS)

    Strocovsky, S.G.; Otero, D

    2012-01-01

    Conventional radiography with an X-ray tube has several drawbacks, such as the compromise between the size of the focal spot and the fluence: the finite dimensions of the focal spot impose a limit on the spatial resolution. Gamma radiography uses gamma-ray sources, which surpass X-ray tubes in size, portability and simplicity. However, their low intrinsic fluence forces the use of extended sources, which also degrade the spatial resolution. In this work, we show the principles of a new radiographic technique that overcomes the limitations associated with the finite dimensions of X-ray sources and offers additional benefits over conventional techniques. The new technique, called coding source imaging (CSI), is based on the use of extended sources, edge-encoding of radiation and differential detection. The mathematical principles and the method of image reconstruction with the newly proposed technique are explained in the present work. Analytical calculations were made to determine the maximum spatial resolution and the variables on which it depends. The CSI technique was tested by means of Monte Carlo simulations with sets of spherical objects. We show that CSI has stereoscopic capabilities and can resolve objects smaller than the source size. The CSI decoding algorithm reconstructs four different projections from the same object simultaneously, while conventional radiography produces only one projection per acquisition. Projections are located in separate image fields on the detector plane. Our results show it is possible to apply an extremely simple radiographic technique with extended sources and get 3D information on the attenuation-coefficient distribution of simple-geometry objects in a single acquisition. The results are promising enough to warrant future research with more complex objects typical of diagnostic medical radiography and industrial gamma radiography (author)

  20. Counting SET-free sets

    OpenAIRE

    Harman, Nate

    2016-01-01

    We consider the following counting problem related to the card game SET: How many $k$-element SET-free sets are there in an $n$-dimensional SET deck? Through a series of algebraic reformulations and reinterpretations, we show the answer to this question satisfies two polynomiality conditions.
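
    As a concrete illustration of the counting problem, a brute-force check is possible for tiny decks: the $n$-dimensional SET deck can be modelled as the vectors of $\mathbb{F}_3^n$, with three distinct cards forming a SET exactly when their coordinatewise sum is 0 mod 3. A minimal sketch (the function name is ours, not the paper's):

    ```python
    from itertools import product, combinations

    def count_setfree(n, k):
        # The n-dimensional SET deck: all 3**n vectors over F_3.
        deck = list(product(range(3), repeat=n))

        def is_set(a, b, c):
            # Three distinct cards form a SET iff their coordinatewise
            # sum is congruent to 0 mod 3 in every position.
            return all((x + y + z) % 3 == 0 for x, y, z in zip(a, b, c))

        # Count k-element subsets of the deck containing no SET.
        return sum(
            1 for subset in combinations(deck, k)
            if not any(is_set(*triple) for triple in combinations(subset, 3))
        )
    ```

    For example, in the 2-dimensional deck (9 cards) the SETs are exactly the 12 lines of the affine plane of order 3, so of the 84 possible 3-card hands, 72 are SET-free.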

  1. Boys' Music? School Context and Middle-School Boys' Musical Choices

    Science.gov (United States)

    Bennetts, Kathleen Scott

    2013-01-01

    This article focusses primarily on the findings relating to the musical participation of boys in one Melbourne school. As part of a project that investigated boys' attitudes and participation at fifty-one schools, several contextual features were identified that set "Balton Boys' High School" apart from other participating schools,…

  2. Asymmetric Joint Source-Channel Coding for Correlated Sources with Blind HMM Estimation at the Receiver

    Directory of Open Access Journals (Sweden)

    Ser Javier Del

    2005-01-01

    Full Text Available We consider the case of two correlated sources whose correlation has memory, modelled by a hidden Markov chain. The paper studies the problem of reliable communication of the information sent by one source over an additive white Gaussian noise (AWGN) channel when the output of the other source is available as side information at the receiver. We assume that the receiver has no a priori knowledge of the correlation statistics between the sources. In particular, we propose the use of a turbo code for joint source-channel coding of the transmitted source. The joint decoder uses an iterative scheme in which the unknown parameters of the correlation model are estimated jointly within the decoding process. It is shown that reliable communication is possible at signal-to-noise ratios close to the theoretical limits set by the combination of the Shannon and Slepian-Wolf theorems.

  3. Using 137Cs and 210Pbex and other sediment source fingerprints to document suspended sediment sources in small forested catchments in south-central Chile

    International Nuclear Information System (INIS)

    Schuller, P.; Walling, D.E.; Iroumé, A.; Quilodrán, C.; Castillo, A.; Navas, A.

    2013-01-01

    A study of the impact of forest harvesting operations on sediment mobilization from forested catchments has been undertaken in south-central Chile. The study focused on two sets of small paired catchments (treatment and control), with similar soil type but contrasting mean annual rainfall, located about 400 km apart at Nacimiento (1200 mm yr⁻¹) and Los Ulmos (2500 mm yr⁻¹). The objective was to study the changes in the relative contribution of the primary sources of fine sediment caused by forestry operations. Attention focused on the pre-harvest and post-harvest periods, and the post-replanting period was included for the Nacimiento treatment catchment. The sediment source fingerprinting technique was used to document the contributions of the potential sources. Emphasis was placed on discriminating between the forest slopes, forest roads and channel erosion as potential sources of fine sediment and on assessing the relative contributions of these three sources to the sediment yield from the catchments. The fallout radionuclides (FRNs) 137Cs and excess lead-210 (210Pbex), the environmental radionuclides 226Ra and 40K, and soil organic matter (SOM) were tested as possible fingerprints for discriminating between potential sediment sources. The Kruskal–Wallis test and discriminant function analysis were used to guide the selection of the optimum fingerprint set for each catchment and observation period. Either one or both of the FRNs were selected for inclusion in the optimum fingerprint for all datasets. The relative contribution of each sediment source to the target sediment load was estimated using the selected fingerprint properties and a mixing model coupled with a Monte Carlo simulation technique that takes account of uncertainty in characterizing sediment source properties. The goodness of fit of the mixing model was tested by comparing the measured and simulated fingerprint properties for the target sediment samples. In the Nacimiento treatment catchment
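
    The mixing-model-with-Monte-Carlo approach described in this record can be sketched roughly as follows; the tracer values, source labels, target fingerprint and uncertainty level below are invented for illustration and are not taken from the study:

    ```python
    import random

    # Hypothetical mean fingerprint values for two tracers (e.g. 137Cs, 210Pbex)
    # in each of the three candidate sources; illustrative numbers only.
    SOURCES = {
        "forest slopes": [12.0, 80.0],
        "forest roads":  [1.0, 10.0],
        "channel":       [0.5, 5.0],
    }
    TARGET = [4.2, 30.0]  # tracer values measured in the target sediment sample

    def mix_error(props, sources):
        """Relative sum of squares between measured and mixture-predicted tracers."""
        err = 0.0
        for t, measured in enumerate(TARGET):
            predicted = sum(p * vals[t] for p, vals in zip(props, sources.values()))
            err += ((measured - predicted) / measured) ** 2
        return err

    def best_mix(sources, step=0.01):
        """Grid search over the simplex p1 + p2 + p3 = 1 for the best-fit mixture."""
        best, best_err = None, float("inf")
        n = int(round(1 / step))
        for i in range(n + 1):
            for j in range(n + 1 - i):
                props = (i * step, j * step, (n - i - j) * step)
                err = mix_error(props, sources)
                if err < best_err:
                    best, best_err = props, err
        return best

    def monte_carlo(n_iter=100, cv=0.1):
        """Average best-fit proportions over randomly perturbed source properties,
        propagating the uncertainty in characterizing the sources."""
        totals = [0.0] * len(SOURCES)
        for _ in range(n_iter):
            perturbed = {name: [random.gauss(v, cv * v) for v in vals]
                         for name, vals in SOURCES.items()}
            props = best_mix(perturbed, step=0.02)
            totals = [t + p for t, p in zip(totals, props)]
        return [t / n_iter for t in totals]
    ```

    With these made-up numbers, `best_mix(SOURCES)` recovers the proportions (0.3, 0.5, 0.2) that reproduce the target fingerprint exactly; the Monte Carlo wrapper then shows how repeated perturbation of the source tracer values yields a distribution of apportionments rather than a single point estimate.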

  4. One-dimensional transport code for one-group problems in plane geometry

    International Nuclear Information System (INIS)

    Bareiss, E.H.; Chamot, C.

    1970-09-01

    Equations and results are given for various methods of solution of the one-dimensional transport equation for one energy group in plane geometry with inelastic scattering and an isotropic source. After considerable investigation, a matrix method of solution was found to be faster and more stable than iteration procedures. A description of the code is included which allows for up to 24 regions, 250 points, and 16 angles such that the product of the number of angles and the number of points is less than 600

  5. Expert analogy use in a naturalistic setting

    Science.gov (United States)

    Kretz, Donald R.; Krawczyk, Daniel C.

    2014-01-01

    The use of analogy is an important component of human cognition. The type of analogy we produce and communicate depends heavily on a number of factors, such as the setting, the level of domain expertise present, and the speaker's goal or intent. In this observational study, we recorded economics experts during scientific discussion and examined the categorical distance and structural depth of the analogies they produced. We also sought to characterize the purpose of the analogies that were generated. Our results supported previous conclusions about the infrequency of superficial similarity in subject-generated analogs, but also showed that distance and depth characteristics were more evenly balanced than in previous observational studies. This finding was likely due to the nature of the goals of the participants, as well as the broader nature of their expertise. An analysis of analogical purpose indicated that the generation of concrete source examples of more general target concepts was most prevalent. We also noted frequent instances of analogies intended to form visual images of source concepts. Other common purposes for analogies were the addition of colorful speech, inclusion (i.e., subsumption) of a target into a source concept, or differentiation between source and target concepts. We found no association between depth and either of the other two characteristics, but our findings suggest a relationship between purpose and distance; i.e., that visual imagery typically entailed an outside-domain source whereas exemplification was most frequently accomplished using within-domain analogies. Overall, we observed a rich and diverse set of spontaneously produced analogical comparisons. The high degree of expertise within the observed group along with the richly comparative nature of the economics discipline likely contributed to this analogical abundance. PMID:25505437

  6. Expert Analogy Use in a Naturalistic Setting

    Directory of Open Access Journals (Sweden)

    Donald R Kretz

    2014-11-01

    Full Text Available The use of analogy is an important component of human cognition. The type of analogy we produce and communicate depends heavily on a number of factors, such as the setting, the level of domain expertise present, and the speaker’s goal or intent. In this observational study, we recorded economics experts during scientific discussion and examined the categorical distance and structural depth of the analogies they produced. We also sought to characterize the purpose of the analogies that were generated. Our results supported previous conclusions about the infrequency of superficial similarity in subject-generated analogs, but also showed that distance and depth characteristics were more evenly balanced than in previous observational studies. This finding was likely due to the nature of the goals of the participants, as well as the broader nature of their expertise. An analysis of analogical purpose indicated that the generation of concrete source examples of more general target concepts was most prevalent. We also noted frequent instances of analogies intended to form visual images of source concepts. Other common purposes for analogies were the addition of colorful speech, inclusion (i.e., subsumption) of a target into a source concept, or differentiation between source and target concepts. We found no association between depth and either of the other two characteristics, but our findings suggest a relationship between purpose and distance; i.e., that visual imagery typically entailed an outside-domain source whereas exemplification was most frequently accomplished using within-domain analogies. Overall, we observed a rich and diverse set of spontaneously produced analogical comparisons. The high degree of expertise within the observed group along with the richly comparative nature of the economics discipline likely contributed to this analogical abundance.

  7. Validation of secondary commercial data sources for physical activity facilities in urban and nonurban settings.

    Science.gov (United States)

    Han, Euna; Powell, Lisa; Slater, Sandy; Quinn, Christopher

    2012-11-01

    Secondary data are often necessary to assess the availability of commercial physical activity (PA) facilities and to examine its association with individual behaviors and outcomes, yet the validity of such sources has been explored in only a limited number of studies. Field data were collected on the presence and attributes of commercial PA facilities in a random sample of 30 urban, 15 suburban, and 15 rural Census tracts in the Chicago metropolitan statistical area and surrounding area. Approximately 40% of PA establishments in the field data were listed in both D&B and InfoUSA for urban and nonurban tracts alike, except for nonurban tracts in D&B (35%); coverage improved significantly in the combined D&B and InfoUSA list. Approximately one-quarter of the PA facilities listed in D&B were found on the ground, whereas 40% to 50% of PA facilities listed in InfoUSA were found on the ground. PA establishments that offered instruction programs or lessons, or that had a court or pool, were less likely to be listed, particularly in the nonurban tracts. Secondary commercial business lists of PA facilities should therefore be used with caution when assessing the built environment.

  8. Nutrition screening tools: Does one size fit all? A systematic review of screening tools for the hospital setting

    NARCIS (Netherlands)

    van Bokhorst-de van der Schueren, M.A.E.; Guaitoli, P.R.; Jansma, E.P.; de Vet, H.C.W.

    2014-01-01

    Background & aims: Numerous nutrition screening tools for the hospital setting have been developed. The aim of this systematic review is to study construct or criterion validity and predictive validity of nutrition screening tools for the general hospital setting. Methods: A systematic review of

  9. Global combustion sources of organic aerosols: model comparison with 84 AMS factor-analysis data sets

    Science.gov (United States)

    Tsimpidi, Alexandra P.; Karydis, Vlassis A.; Pandis, Spyros N.; Lelieveld, Jos

    2016-07-01

    Emissions of organic compounds from biomass, biofuel, and fossil fuel combustion strongly influence the global atmospheric aerosol load. Some of the organics are directly released as primary organic aerosol (POA). Most are emitted in the gas phase and undergo chemical transformations (i.e., oxidation by hydroxyl radical) and form secondary organic aerosol (SOA). In this work we use the global chemistry climate model ECHAM/MESSy Atmospheric Chemistry (EMAC) with a computationally efficient module for the description of organic aerosol (OA) composition and evolution in the atmosphere (ORACLE). The tropospheric burden of open biomass and anthropogenic (fossil and biofuel) combustion particles is estimated to be 0.59 and 0.63 Tg, respectively, accounting for about 30 and 32 % of the total tropospheric OA load. About 30 % of the open biomass burning and 10 % of the anthropogenic combustion aerosols originate from direct particle emissions, whereas the rest is formed in the atmosphere. A comprehensive data set of aerosol mass spectrometer (AMS) measurements along with factor-analysis results from 84 field campaigns across the Northern Hemisphere are used to evaluate the model results. Both the AMS observations and the model results suggest that over urban areas both POA (25-40 %) and SOA (60-75 %) contribute substantially to the overall OA mass, whereas further downwind and in rural areas the POA concentrations decrease substantially and SOA dominates (80-85 %). EMAC does a reasonable job in reproducing POA and SOA levels during most of the year. However, it tends to underpredict POA and SOA concentrations during winter indicating that the model misses wintertime sources of OA (e.g., residential biofuel use) and SOA formation pathways (e.g., multiphase oxidation).

  10. Experienced iPad-Using Early Childhood Teachers: Practices in the One-to-One iPad Classroom

    Science.gov (United States)

    Lu, Ya-Huei; Ottenbreit-Leftwich, Anne T.; Ding, Ai-Chu; Glazewski, Krista

    2017-01-01

    Although many elementary schools have adopted one-to-one programs, we still lack information on how teachers integrate iPads or other tablets into their daily instruction, especially in early childhood settings. The purpose of this case study was to present how four experienced iPad-using early childhood teachers integrated one-to-one iPads into…

  11. Scalar one-loop integrals for QCD

    International Nuclear Information System (INIS)

    Ellis, R. Keith; Zanderighi, Giulia

    2008-01-01

    We construct a basis set of infra-red and/or collinearly divergent scalar one-loop integrals and give analytic formulas, for tadpole, bubble, triangle and box integrals, regulating the divergences (ultra-violet, infra-red or collinear) by regularization in D = 4-2ε dimensions. For scalar triangle integrals we give results for our basis set containing 6 divergent integrals. For scalar box integrals we give results for our basis set containing 16 divergent integrals. We provide analytic results for the 5 divergent box integrals in the basis set which are missing in the literature. Building on the work of van Oldenborgh, a general, publicly available code has been constructed, which calculates both finite and divergent one-loop integrals. The code returns the coefficients of 1/ε², 1/ε¹ and 1/ε⁰ as complex numbers for an arbitrary tadpole, bubble, triangle or box integral

  12. Entanglement entropy in causal set theory

    Science.gov (United States)

    Sorkin, Rafael D.; Yazdi, Yasaman K.

    2018-04-01

    Entanglement entropy is now widely accepted as having deep connections with quantum gravity. It is therefore desirable to understand it in the context of causal sets, especially since they provide in a natural manner the UV cutoff needed to render entanglement entropy finite. Formulating a notion of entanglement entropy in a causal set is not straightforward because the type of canonical hypersurface data on which its definition typically relies is not available. Instead, we appeal to the more global expression given in Sorkin (2012, arXiv:1205.2953) which, for a Gaussian scalar field, expresses the entropy of a spacetime region in terms of the field’s correlation function within that region (its ‘Wightman function’ W(x, x')). Carrying this formula over to the causal set, one obtains an entropy which is both finite and of a Lorentz invariant nature. We evaluate this global entropy-expression numerically for certain regions (primarily order-intervals or ‘causal diamonds’) within causal sets of 1 + 1 dimensions. For the causal-set counterpart of the entanglement entropy, we obtain, in the first instance, a result that follows a (spacetime) volume law instead of the expected (spatial) area law. We find, however, that one obtains an area law if one truncates the commutator function (‘Pauli–Jordan operator’) and the Wightman function by projecting out the eigenmodes of the Pauli–Jordan operator whose eigenvalues are too close to zero according to a geometrical criterion which we describe more fully below. In connection with these results and the questions they raise, we also study the ‘entropy of coarse-graining’ generated by thinning out the causal set, and we compare it with what one obtains by similarly thinning out a chain of harmonic oscillators, finding the same, ‘universal’ behaviour in both cases.

  13. Pseudodynamic Source Characterization for Strike-Slip Faulting Including Stress Heterogeneity and Super-Shear Ruptures

    KAUST Repository

    Mena, B.

    2012-08-08

    Reliable ground‐motion prediction for future earthquakes depends on the ability to simulate realistic earthquake source models. Though dynamic rupture calculations have recently become more popular, they are still computationally demanding. An alternative is to invoke the framework of pseudodynamic (PD) source characterizations that use simple relationships between kinematic and dynamic source parameters to build physically self‐consistent kinematic models. Based on the PD approach of Guatteri et al. (2004), we propose new relationships for PD models for moderate‐to‐large strike‐slip earthquakes that include local supershear rupture speed due to stress heterogeneities. We conduct dynamic rupture simulations using stochastic initial stress distributions to generate a suite of source models in the magnitude Mw 6–8. This set of models shows that local supershear rupture speed prevails for all earthquake sizes, and that the local rise‐time distribution is not controlled by the overall fault geometry, but rather by local stress changes on the faults. Based on these findings, we derive a new set of relations for the proposed PD source characterization that accounts for earthquake size, buried and surface ruptures, and includes local rise‐time variations and supershear rupture speed. By applying the proposed PD source characterization to several well‐recorded past earthquakes, we verify that significant improvements in fitting synthetic ground motion to observed ones is achieved when comparing our new approach with the model of Guatteri et al. (2004). The proposed PD methodology can be implemented into ground‐motion simulation tools for more physically reliable prediction of shaking in future earthquakes.

  14. Proceedings of the workshop on ion source issues relevant to a pulsed spallation neutron source: Part 1: Workshop summary

    International Nuclear Information System (INIS)

    Schroeder, L.; Leung, K.N.; Alonso, J.

    1994-10-01

    The workshop reviewed the ion-source requirements for high-power accelerator-driven spallation neutron facilities, and the performance of existing ion sources. Proposals for new facilities in the 1- to 5-MW range call for a widely differing set of ion-source requirements. For example, the source peak current requirements vary from 40 mA to 150 mA, while the duty factor ranges from 1% to 9%. Much of the workshop discussion centered on the state of the art of negative hydrogen ion source (H⁻) technology and the present experience with Penning and volume sources. In addition, other ion source technologies, for positive ions or CW applications, were reviewed. Some of these sources have been operational at existing accelerator complexes and some are in the source-development stage on test stands

  15. Clinical nutrition managers have access to sources of empowerment.

    Science.gov (United States)

    Mislevy, J M; Schiller, M R; Wolf, K N; Finn, S C

    2000-09-01

    To ascertain the perceived access of dietitians to power in the workplace. The conceptual framework was Kanter's theory of organizational power. The Conditions for Work Effectiveness Questionnaire was used to measure perceived access to sources of power: information, support, resources, and opportunities. Demographic data were collected to identify factors that may enhance empowerment. The questionnaire was sent to a random sample of 348 dietitians chosen from members of the Clinical Nutrition Management dietetic practice group of the American Dietetic Association. Blank questionnaires were returned by 99 (28.4%) people not working as clinical nutrition managers, which left 249 in the sample. Descriptive statistics were used to organize and summarize data. One-way analysis of variance and t tests were performed to identify differences in responses based on levels of education, work setting, and information technology skills. Usable questionnaires were received from 178 people (71.5%). On a 5-point scale, scores for access to information (mean +/- standard deviation [SD] = 3.8 +/- 0.7), opportunity (mean +/- SD = 3.6 +/- 0.7), support (mean +/- SD = 3.2 +/- 0.9), and resources (mean +/- SD = 3.1 +/- 0.8) demonstrated that clinical nutrition managers perceived themselves as having substantial access to sources of empowerment. Those having higher levels of education, working in larger hospitals, having better-developed information technology skills, and using information technology more frequently had statistically significantly higher empowerment scores. Clinical nutrition managers are thus well positioned to assume leadership roles in today's health care settings. Their power may be enhanced by asserting more pressure to gain greater access to sources of power: support, information, resources, and opportunities.

  16. The identification of teaching interactions used in one-to-one teaching of number in the early years of schooling

    Directory of Open Access Journals (Sweden)

    Bronwyn Ewing

    2016-12-01

    Full Text Available This research paper reports on phase one of an investigation of video-recorded intensive one-to-one teaching interactions with 6–7-year-old students who were in their second year of schooling in Australia and identified by their teacher as low attaining in early number. The two-phased study from which this paper emerges was originally conducted in 1998 as part of my Bachelor of Teaching Honours (Research) program at Southern Cross University, Lismore, New South Wales. That study identified teaching interactions particularly suited to one-to-one teaching in the Maths Recovery Program, a program designed for students at risk of failure in early number. Since that time little has changed, with limited literature available that comprehensively reports on teaching interactions in intensive one-to-one settings. Revisiting the original study is considered timely given the increasing number of withdrawal and intensive programs now funded and adopted by schools and yet rarely reported on in terms of the effectiveness of the teaching interactions that occur in such settings. This paper then presents a discussion of a preliminary series of teaching interactions that either positively or negatively influence an intensive one-to-one teaching and learning setting.

  17. Improving nuclear utility generation capacity, understanding the sources of forced outage and learning how to prevent them

    International Nuclear Information System (INIS)

    Brodeur, D.L.; Todreas, N.E.; Angus, V.T.

    1998-01-01

    MIT and PECO Energy have completed a detailed examination of the sources of forced outages at the Limerick Generating Station (LGS) Boiling Water Reactor Class IV (BWR IV) site over a five-year period and contrasted that information with similar BWR IV utilities in the United States over the same period. Each forced outage was attributed to one system and assigned causal codes of equipment versus human factors, and failure attributes such as weak design, poor craftsmanship, and worn parts. It was found that fifty-four percent of the lost power at LGS was the result of Balance of Plant failures. Industry-wide data identify fifty-nine percent of the lost power as attributable to Balance of Plant failures. Balance of Plant systems are those systems not included in the primary and safety-related system category. Considering failure causal factors, forty-six percent of the lost power at the utility under study was the result of equipment factors such as weak design or worn parts. Significantly, the study showed a high variance between the systems which caused significant forced outage at the two sister LGS units, demonstrating the infrequent nature of plant forced outages within a given system. This was supported by the observation that the dominant systems contributing to forced outage at LGS were not equally represented in industry data. It is suggested that for individual utilities to dramatically improve unit capability factors with regard to Balance of Plant systems, they must learn from industry-wide experience and develop cooperative means of exchanging lessons learned among similarly designed plants and systems. With a broad knowledge base of system failures, current designs must be frequently assessed and altered until each system poses an acceptable level of risk to generation capacity. (author)

  18. One or many Buddhas?

    DEFF Research Database (Denmark)

    Sobisch, Jan-Ulrich

    2016-01-01

    According to Pali Buddhist sources, there can only be one Buddha per world system. Mahayana Buddhism maintains a different view, which is argued here by 'Jig rten gsum mgön based, among other things, on a quotation from the Uttaratantrashastra (= Ratnagotravibhaga).

  19. A one-stage cultivation process for lipid- and carbohydrate-rich biomass of Scenedesmus obtusiusculus based on artificial and natural water sources.

    Science.gov (United States)

    Schulze, Christian; Reinhardt, Jakob; Wurster, Martina; Ortiz-Tena, José Guillermo; Sieber, Volker; Mundt, Sabine

    2016-10-01

    A one-stage cultivation process for the microalga Scenedesmus obtusiusculus with a medium based on natural water sources was developed to enhance lipid and carbohydrate yields. A medium based on artificial sea water, Baltic Sea water and river water, with nutrient concentrations optimized relative to standard BG-11 for nitrate (-75%) and for phosphate and iron (-90%), was used for cultivation. Although nitrate exhaustion over the cultivation resulted in nitrate limitation, growth of the microalgae was not reduced. The lipid content increased from 6.0% to 19.9%, and an increase in oleic and stearic acid was observed. The unsaponifiable matter of the lipid fraction was reduced from 19.5% to 11.4%. The carbohydrate yield rose from 45% to 50% and the protein content decreased from 32.4% to 15.9%. Using natural water sources with optimized nutrient concentrations could open the opportunity to modulate biomass composition and to reduce cultivation costs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  20. Passage from Pen and Paper to Keyboard and Screen: An Investigation of the Evolution of Writing Instruction in One-to-One Laptop Settings

    Science.gov (United States)

    Jett, Janice Rowe

    2013-01-01

    With the steady increase of ubiquitous computing initiatives across the country in the last decade, there is a pressing need for specific research looking at content area instruction in 1:1 settings. This qualitative multiple case study examines writing instruction at two middle schools as it is delivered by experienced teachers in five English…

  1. Information contraction and extraction by multivariate autoregressive (MAR) modelling. Pt. 2. Dominant noise sources in BWRS

    International Nuclear Information System (INIS)

    Morishima, N.

    1996-01-01

    The multivariate autoregressive (MAR) modelling of a vector noise process is discussed in terms of the estimation of dominant noise sources in BWRs. The discussion is based on a physical approach: a transfer function model of BWR core dynamics is utilized in developing a noise model, and a set of input-output relations between three system variables and twelve different noise sources is obtained. By least-squares fitting of a theoretical PSD for neutron noise to an experimental one, four kinds of dominant noise sources are selected. It is shown that some of the dominant noise sources consist of two or more different noise sources and have the spectral properties of being coloured and correlated with each other. By diagonalizing the PSD matrix for the dominant noise sources, we may obtain an MAR expression for a vector noise process as a response to the diagonal elements (i.e. residual noises) being white and mutually independent. (Author)

  2. Natural radioactivity in groundwater sources in Ireland

    Energy Technology Data Exchange (ETDEWEB)

    Currivan, L.; Dowdall, A.; Mcginnity, P.; Ciara, M. [Radiological Protection Institute of Ireland (Ireland); Craig, M. [Environmental Protection Agency (Ireland)

    2014-07-01

    The Radiological Protection Institute of Ireland (RPII), in collaboration with the Irish Environmental Protection Agency (EPA), undertook a national survey of radioactivity in groundwater sources for compliance with the parameters set out in the European Communities Drinking Water Directive. The Directive outlines the minimum requirements for the quality of drinking water and water intended for human consumption. Over two hundred samples were screened for radioactivity. Where indicated, analysis for individual radionuclide activity was undertaken and the resulting radiation dose calculated. Furthermore, samples were analysed for radon concentration. This survey is the first comprehensive national survey of radioactivity in groundwater sources in Ireland. Approximately 18 per cent of drinking water in Ireland originates from groundwater and springs, with the remainder from surface water. Between 2007 and 2011, water samples from a representative network of groundwater sources were analysed and assessed for compliance with the radioactivity parameters set out in the Drinking Water Directive. The assessment was carried out using the methodology for screening drinking water set out by the WHO. For practical purposes, the WHO-recommended screening levels below which no further action is required (100 mBq/l for gross alpha activity and 1000 mBq/l for gross beta activity) were applied. Of the 203 groundwater sources screened for gross alpha and gross beta activity, all met the gross beta criterion of less than 1000 mBq/l, and 175 supplies had gross alpha activity concentrations of less than 100 mBq/l. For these sources no further analysis was required. The remaining 28 sources required further (radionuclide-specific) analysis from an alpha activity perspective. Results on the ranges and distributions of radionuclide concentrations in groundwater, as well as ingestion doses estimated for consumers of these water supplies, will be presented.
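
    The WHO screening logic quoted in this record reduces to a simple threshold test: a source passes if gross alpha activity is below 100 mBq/l and gross beta activity is below 1000 mBq/l, and otherwise requires radionuclide-specific analysis. A minimal sketch (the function name is illustrative, not from the survey):

    ```python
    # WHO screening levels quoted in the record, in mBq/l.
    GROSS_ALPHA_LIMIT = 100.0
    GROSS_BETA_LIMIT = 1000.0

    def needs_radionuclide_analysis(gross_alpha_mbq_l, gross_beta_mbq_l):
        """True if a source fails screening and needs nuclide-specific analysis.

        A source passes only when both activity concentrations are below
        their respective screening levels.
        """
        return (gross_alpha_mbq_l >= GROSS_ALPHA_LIMIT
                or gross_beta_mbq_l >= GROSS_BETA_LIMIT)
    ```

    Applied to the survey's figures, this rule is what flagged 28 of the 203 screened sources for further alpha-emitter analysis.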

  3. The Fifty Year Rehabilitation of the Egg

    Directory of Open Access Journals (Sweden)

    Donald J. McNamara

    2015-10-01

    Full Text Available In 1968 the American Heart Association announced a dietary recommendation that all individuals consume less than 300 mg of dietary cholesterol per day and no more than three whole eggs per week. This recommendation not only significantly impacted the dietary patterns of the population, but also led the public to limit a highly nutritious and affordable source of high-quality nutrients, including choline, which was limited in the diets of most individuals. The egg industry addressed the issue with research documenting the minimal effect of egg intake on plasma lipoprotein levels, as well as research verifying the importance of egg nutrients in a variety of issues related to health promotion. By 2015, dietary cholesterol and egg restrictions had been dropped by most health promotion agencies worldwide and were recommended to be dropped from the 2015 Dietary Guidelines for Americans.

  4. The Chandra Source Catalog: Algorithms

    Science.gov (United States)

    McDowell, Jonathan; Evans, I. N.; Primini, F. A.; Glotfelty, K. J.; McCollough, M. L.; Houck, J. C.; Nowak, M. A.; Karovska, M.; Davis, J. E.; Rots, A. H.; Siemiginowska, A. L.; Hain, R.; Evans, J. D.; Anderson, C. S.; Bonaventura, N. R.; Chen, J. C.; Doe, S. M.; Fabbiano, G.; Galle, E. C.; Gibbs, D. G., II; Grier, J. D.; Hall, D. M.; Harbo, P. N.; He, X.; Lauer, J.; Miller, J. B.; Mitschang, A. W.; Morgan, D. L.; Nichols, J. S.; Plummer, D. A.; Refsdal, B. L.; Sundheim, B. A.; Tibbetts, M. S.; van Stone, D. W.; Winkelman, S. L.; Zografou, P.

    2009-09-01

    Creation of the Chandra Source Catalog (CSC) required adjustment of existing pipeline processing, adaptation of existing interactive analysis software for automated use, and development of entirely new algorithms. Data calibration was based on the existing pipeline, but more rigorous data cleaning was applied and the latest calibration data products were used. For source detection, a local background map was created including the effects of ACIS source readout streaks. The existing wavelet source detection algorithm was modified and a set of post-processing scripts used to correct the results. To analyse the source properties we ran the SAOTrace ray-trace code for each source to generate a model point spread function, allowing us to find encircled energy correction factors and estimate source extent. Further algorithms were developed to characterize the spectral, spatial and temporal properties of the sources and to estimate the confidence intervals on count rates and fluxes. Finally, sources detected in multiple observations were matched, and best estimates of their merged properties derived. In this paper we present an overview of the algorithms used, with more detailed treatment of some of the newly developed algorithms presented in companion papers.
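
    The encircled-energy correction mentioned in this abstract can be illustrated with a minimal sketch: a count rate measured through a finite aperture is divided by the fraction of the model PSF that the aperture encloses. The function and all numbers below are invented for illustration, not CSC pipeline values.

    ```python
    # Illustrative aperture correction using an encircled-energy fraction,
    # in the spirit of the CSC source-property step described above.
    def aperture_corrected_rate(net_counts, exposure_s, ee_fraction):
        """Count rate corrected for PSF flux falling outside the aperture."""
        if not 0.0 < ee_fraction <= 1.0:
            raise ValueError("encircled-energy fraction must be in (0, 1]")
        return net_counts / (exposure_s * ee_fraction)

    # 900 net counts in 10 ks through an aperture enclosing 90% of the PSF:
    rate = aperture_corrected_rate(900, 10_000, 0.9)
    print(rate)  # 0.1 counts/s
    ```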

  5. Understanding Constructions of Danish-ness using Heterogeneous Types of Sources across Historical Settings

    DEFF Research Database (Denmark)

    Øland, Trine; Ydesen, Christian

    -state construct ‘disturbing/disquieting behavior’, delinquency, and misbehavior and how these constructions are made into a category that activates educational and other interventions. Using a comparative outlook between the two historical settings the purpose is to understand 1) the boundaries of legitimate...

  6. Technical Note: Display window setting: An important factor for detecting subtle but clinically relevant artifacts in daily CT quality control.

    Science.gov (United States)

    Long, Zaiyang; Bruesewitz, Michael R; Sheedy, Emily N; Powell, Michele A; Kramer, Jacqualynn C; Supalla, Randall R; Colvin, Chance M; Bechel, Jessica R; Favazza, Christopher P; Kofler, James M; Leng, Shuai; McCollough, Cynthia H; Yu, Lifeng

    2016-12-01

    This study aimed to investigate the influence of display window setting on technologists' performance in detecting subtle but clinically relevant artifacts in daily computed tomography (CT) quality control (dQC) images. Fifty-three sets of dQC images were retrospectively selected, including 30 sets without artifacts and 23 with subtle but clinically relevant artifacts. They were randomized and shown to six CT technologists (two new and four experienced). Each technologist reviewed all images in each of two sessions, one with a display window width (WW) of 100 HU, which is currently recommended by the American College of Radiology, and the other with a narrow WW of 40 HU, both at a window level of 0 HU. For each case, technologists rated the presence of image artifacts on a five-point scale. The area under the receiver operating characteristic curve (AUC) was used to evaluate artifact detection performance. At a WW of 100 HU, the AUC (95% confidence interval) was 0.658 (0.576, 0.740), 0.532 (0.429, 0.635), and 0.616 (0.543, 0.619) for the experienced, new, and all technologists, respectively. At a WW of 40 HU, the AUC was 0.768 (0.687, 0.850), 0.546 (0.433, 0.658), and 0.694 (0.619, 0.769), respectively. The performance significantly improved at a WW of 40 HU for experienced technologists (p = 0.009) and for all technologists (p = 0.040). Use of a narrow display WW significantly improved technologists' performance in dQC for detecting subtle but clinically relevant artifacts as compared to that using a 100 HU display WW.
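
    AUC values like those reported above can be computed directly from ordinal ratings with the rank-based (Mann-Whitney) estimator: the AUC is the probability that a randomly chosen artifact case is rated higher than a randomly chosen artifact-free case, with ties counting one half. A minimal sketch, using invented five-point-scale ratings rather than the study's data:

    ```python
    # Rank-based AUC from ordinal artifact ratings (Mann-Whitney estimator).
    # A tie between an artifact case and a clean case contributes 0.5.
    def auc_from_ratings(artifact_ratings, clean_ratings):
        wins = 0.0
        for a in artifact_ratings:
            for c in clean_ratings:
                if a > c:
                    wins += 1.0
                elif a == c:
                    wins += 0.5
        return wins / (len(artifact_ratings) * len(clean_ratings))

    # Invented 5-point-scale ratings for cases with and without artifacts:
    print(auc_from_ratings([3, 4, 5, 4], [1, 2, 2, 3]))  # 0.96875
    ```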

  7. Comparative study between Steiner's cephalometric-radiographic patterns and the ones of Brazilian's, white teenagers, who present normal occlusion

    International Nuclear Information System (INIS)

    Domingues, A.P. de.

    1986-01-01

    The purpose of this study was to compare the cephalometric-radiographic patterns of Steiner's analysis with those of Brazilian white teenagers presenting normal occlusion. The sample comprised fifty-seven lateral teleradiographs of Brazilian teenagers. The teenagers were white, and their parents were Brazilian of Mediterranean descent. The examined teenagers had not undergone previous orthodontic treatment and, as noted above, presented normal occlusion. (author) [pt

  8. Fifty years old, and still going strong: Transmission electron optical studies of materials

    International Nuclear Information System (INIS)

    Brown, L.M.

    2008-01-01

    Highlights in the history of transmission electron microscopy and scanning transmission electron microscopy include the introduction of diffraction contrast, resolution of periodic lattices by phase contrast and incoherent imaging via the high-angle annular dark-field detector. Convergent-beam electron diffraction and analytical electron microscopy, especially the application of energy-dispersive X-ray and electron energy-loss spectrometry, have provided structural and chemical information in addition to strain contrast from lattice defects. From the outset, novel specimen stages and improvements to aid the operator enhanced the electron-optical engineering provided by the instrument makers. The spatial resolution achieved was mainly determined by the way the instrument was used, and not by the basic resolution limit set by the electron optics. However, the application of computer-controlled correction of spherical (and higher-order) aberration has resulted in a new generation of instruments capable of sub-Angstrom point-to-point resolution. This improved performance, combined with electron energy-loss spectrometry, promises genuine three-dimensional determination of atomic and electronic structure: an indispensable weapon in the battle to fabricate and control useful nanostructures. The uncertainty principle now fundamentally restricts some of the observations one can make, but much more technical development over the next decades must occur before one can say that the techniques of electron-optical imaging of material structure have reached their fundamental limitations. One can expect remarkable progress over the next few years.

  9. Confusion-limited extragalactic source survey at 4.755 GHz. I. Source list and areal distributions

    International Nuclear Information System (INIS)

    Ledden, J.E.; Broderick, J.J.; Condon, J.J.; Brown, R.L.

    1980-01-01

    A confusion-limited 4.755-GHz survey covering 0.00956 sr between right ascensions 07h 05m and 18h near declination +35° has been made with the NRAO 91-m telescope. The survey found 237 sources and is complete above 15 mJy. Source counts between 15 and 100 mJy were obtained directly. The P(D) distribution was used to determine the number counts between 0.5 and 13.2 mJy, to search for anisotropy in the density of faint extragalactic sources, and to set a 99%-confidence upper limit of 1.83 mK to the rms temperature fluctuation of the 2.7-K cosmic microwave background on angular scales smaller than 7.3 arcmin. The discrete-source density, normalized to the static Euclidean slope, falls off sufficiently rapidly below 100 mJy that no new population of faint flat-spectrum sources is required to explain the 4.755-GHz source counts.

  10. Laser-produced X-ray sources

    International Nuclear Information System (INIS)

    Hudson, L.T.; Seely, J.F.

    2010-01-01

    A formidable array of advanced laser systems are emerging that produce extreme states of light and matter. By irradiating solid and gaseous targets with lasers of increasing energy densities, new physical regimes of radiation effects are being explored for the first time in controlled laboratory settings. One result that is being accomplished or pursued using a variety of techniques, is the realization of novel sources of X-rays with unprecedented characteristics and light-matter interactions, the mechanisms of which are in many cases still being elucidated. Examples include the megajoule class of laser-produced plasmas designed in pursuit of alternative-energy and security applications and the petawatt class of lasers used for fast ignition and X-ray radiographic applications such as medical imaging and real-time imaging of plasma hydrodynamics. As these technologies mature, increased emphasis will need to be placed on advanced instrumentation and diagnostic metrology to characterize the spectra, time structure, and absolute brightness of X-rays emitted by these unconventional sources. Such customized and absolutely calibrated measurement tools will serve as an enabling technology that can help in assessing the overall system performance and progress, as well as identification of the underlying interaction mechanisms of interest to basic and applied strong-field and high-energy-density science.

  11. Population studies of the unidentified EGRET sources

    Energy Technology Data Exchange (ETDEWEB)

    Siegal-Gaskins, J M [University of Chicago, Chicago, IL 60637 (United States); Pavlidou, V [University of Chicago, Chicago, IL 60637 (United States); Olinto, A V [University of Chicago, Chicago, IL 60637 (United States); Brown, C [University of Chicago, Chicago, IL 60637 (United States); Fields, B D [University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States)

    2007-03-15

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background.

  12. Population studies of the unidentified EGRET sources

    International Nuclear Information System (INIS)

    Siegal-Gaskins, J M; Pavlidou, V; Olinto, A V; Brown, C; Fields, B D

    2007-01-01

    The third EGRET catalog contains a large number of unidentified sources. Current data allows the intriguing possibility that some of these objects may represent a new class of yet undiscovered gamma-ray sources. By assuming that galaxies similar to the Milky Way host comparable populations of objects, we constrain the allowed Galactic abundance and distribution of various classes of gamma-ray sources using the EGRET data set. Furthermore, regardless of the nature of the unidentified sources, faint unresolved objects of the same class contribute to the observed diffuse gamma-ray background. We investigate the potential contribution of these unresolved sources to the extragalactic gamma-ray background

  13. Lead shot from hunting as a source of lead in human blood

    International Nuclear Information System (INIS)

    Johansen, Poul; Pedersen, Henning Sloth; Asmund, Gert; Riget, Frank

    2006-01-01

    This study investigates the relationship between the intake of birds hunted with lead shot and the lead concentration in human blood. Fifty adult men from Nuuk, Greenland took part in the study. From September 2003 to June 2004 they regularly gave blood samples and recorded how many birds they ate. We found a clear relationship between the number of bird meals and blood lead and also a clear seasonal variation. The concentration was highest in mid-winter when bird consumption is at its highest. Blood lead was low (15 μg/L, mean concentration) among the participants reporting not eating birds. Among those reporting to eat birds regularly, blood lead was significantly higher, up to 128 μg/L (mean concentration). Concentrations depended on the frequency of bird meals: the more the bird meals, the higher the resulting blood lead. This clear relationship points to lead shot as the dominating lead source to people in Greenland. - Birds hunted with lead shot and consumed are a source of lead in human blood

  14. Lead shot from hunting as a source of lead in human blood

    Energy Technology Data Exchange (ETDEWEB)

    Johansen, Poul [National Environmental Research Institute, Frederiksborgvej 399, DK-4000 Roskilde (Denmark)]. E-mail: poj@dmu.dk; Pedersen, Henning Sloth [Primary Health Care Center, DK-3900 Nuuk (Greenland); Asmund, Gert [National Environmental Research Institute, Frederiksborgvej 399, DK-4000 Roskilde (Denmark); Riget, Frank [National Environmental Research Institute, Frederiksborgvej 399, DK-4000 Roskilde (Denmark)

    2006-07-15

    This study investigates the relationship between the intake of birds hunted with lead shot and the lead concentration in human blood. Fifty adult men from Nuuk, Greenland took part in the study. From September 2003 to June 2004 they regularly gave blood samples and recorded how many birds they ate. We found a clear relationship between the number of bird meals and blood lead and also a clear seasonal variation. The concentration was highest in mid-winter when bird consumption is at its highest. Blood lead was low (15 μg/L, mean concentration) among the participants reporting not eating birds. Among those reporting to eat birds regularly, blood lead was significantly higher, up to 128 μg/L (mean concentration). Concentrations depended on the frequency of bird meals: the more the bird meals, the higher the resulting blood lead. This clear relationship points to lead shot as the dominating lead source to people in Greenland. - Birds hunted with lead shot and consumed are a source of lead in human blood.

  15. Open-source intelligence in the Czech military knowledge system and process design

    OpenAIRE

    Krejci, Roman

    2002-01-01

    Owing to the recent transitions in the Czech Republic, the Czech military must satisfy a large set of new requirements. One way the military intelligence can become more effective and can conserve resources is by increasing the efficiency of open-source intelligence (OSINT), which plays an important part in intelligence gathering in the age of information. When using OSINT effectively, the military intelligence can elevate its responsiveness to different types of crises and can also properly ...

  16. Comparison of Aspergillus species-complexes detected in different environmental settings

    OpenAIRE

    Sabino, Raquel; Viegas, Carla; Veríssimo, Carla; Clemons, K. V.; Stevens, D. A.

    2014-01-01

    Purpose: Samples from different environmental sources were screened for the presence of Aspergillus, and the distribution of the different species-complexes was determined, in order to understand differences in that distribution across the several environmental sources and which of these species-complexes are present in specific environmental settings. Methods: Four distinct environments (beaches, poultries, swineries and hospital) were studied and analyzed for which Aspergillus complexes were ...

  17. Source Similarity and Social Media Health Messages: Extending Construal Level Theory to Message Sources.

    Science.gov (United States)

    Young, Rachel

    2015-09-01

    Social media users post messages about health goals and behaviors to online social networks. Compared with more traditional sources of health communication such as physicians or health journalists, peer sources are likely to be perceived as more socially close or similar, which influences how messages are processed. This experimental study uses construal level theory of psychological distance to predict how mediated health messages from peers influence health-related cognition and behavioral intention. Participants were exposed to source cues that identified peer sources as being either highly attitudinally and demographically similar to or different from participants. As predicted by construal level theory, participants who perceived sources of social media health messages as highly similar listed a greater proportion of beliefs about the feasibility of health behaviors and a greater proportion of negative beliefs, while participants who perceived sources as more dissimilar listed a greater proportion of positive beliefs about the health behaviors. Results of the study could be useful in determining how health messages from peers could encourage individuals to set realistic health goals.

  18. Characterization of the IOTA Proton Source

    Energy Technology Data Exchange (ETDEWEB)

    Young, Samantha [Chicago U.

    2017-08-11

    This project focuses on characterizing the IOTA proton source by changing the parameters of four components of the Low Energy Beam Transport (LEBT). Because of an inefficient filament, current was limited to 2 mA when 40 mA is ultimately desired. Through an investigation of the solenoids and trims of the LEBT, we sought more knowledge about the optimum settings for running the IOTA proton source.

  19. H2RM: A Hybrid Rough Set Reasoning Model for Prediction and Management of Diabetes Mellitus

    Directory of Open Access Journals (Sweden)

    Rahman Ali

    2015-07-01

    Full Text Available Diabetes is a chronic disease characterized by high blood glucose level that results either from a deficiency of insulin produced by the body, or the body’s resistance to the effects of insulin. Accurate and precise reasoning and prediction models greatly help physicians to improve diagnosis, prognosis and treatment procedures of different diseases. Though numerous models have been proposed to solve issues of diagnosis and management of diabetes, they have the following drawbacks: (1) restricted one type of diabetes; (2) lack understandability and explanatory power of the techniques and decision; (3) limited either to prediction purpose or management over the structured contents; and (4) lack competence for dimensionality and vagueness of patient’s data. To overcome these issues, this paper proposes a novel hybrid rough set reasoning model (H2RM) that resolves problems of inaccurate prediction and management of type-1 diabetes mellitus (T1DM) and type-2 diabetes mellitus (T2DM). For verification of the proposed model, experimental data from fifty patients, acquired from a local hospital in semi-structured format, is used. First, the data is transformed into structured format and then used for mining prediction rules. Rough set theory (RST) based techniques and algorithms are used to mine the prediction rules. During the online execution phase of the model, these rules are used to predict T1DM and T2DM for new patients. Furthermore, the proposed model assists physicians to manage diabetes using knowledge extracted from online diabetes guidelines. Correlation-based trend analysis techniques are used to manage diabetic observations. Experimental results demonstrate that the proposed model outperforms the existing methods with 95.9% average and balanced accuracies.

  20. H2RM: A Hybrid Rough Set Reasoning Model for Prediction and Management of Diabetes Mellitus.

    Science.gov (United States)

    Ali, Rahman; Hussain, Jamil; Siddiqi, Muhammad Hameed; Hussain, Maqbool; Lee, Sungyoung

    2015-07-03

    Diabetes is a chronic disease characterized by high blood glucose level that results either from a deficiency of insulin produced by the body, or the body's resistance to the effects of insulin. Accurate and precise reasoning and prediction models greatly help physicians to improve diagnosis, prognosis and treatment procedures of different diseases. Though numerous models have been proposed to solve issues of diagnosis and management of diabetes, they have the following drawbacks: (1) restricted one type of diabetes; (2) lack understandability and explanatory power of the techniques and decision; (3) limited either to prediction purpose or management over the structured contents; and (4) lack competence for dimensionality and vagueness of patient's data. To overcome these issues, this paper proposes a novel hybrid rough set reasoning model (H2RM) that resolves problems of inaccurate prediction and management of type-1 diabetes mellitus (T1DM) and type-2 diabetes mellitus (T2DM). For verification of the proposed model, experimental data from fifty patients, acquired from a local hospital in semi-structured format, is used. First, the data is transformed into structured format and then used for mining prediction rules. Rough set theory (RST) based techniques and algorithms are used to mine the prediction rules. During the online execution phase of the model, these rules are used to predict T1DM and T2DM for new patients. Furthermore, the proposed model assists physicians to manage diabetes using knowledge extracted from online diabetes guidelines. Correlation-based trend analysis techniques are used to manage diabetic observations. Experimental results demonstrate that the proposed model outperforms the existing methods with 95.9% average and balanced accuracies.
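
    The rough-set machinery these two abstracts rely on can be illustrated with a minimal lower/upper-approximation sketch: objects with identical attribute values form indiscernibility classes, and a target set is approximated from below (classes fully inside it) and above (classes that merely touch it). The toy patient table, labels, and function names below are invented for illustration and are not the H2RM data or algorithms.

    ```python
    # Minimal rough-set approximation sketch (invented toy data).
    def indiscernibility_classes(objects, attributes):
        """Group object names by their tuple of attribute values."""
        classes = {}
        for name, values in objects.items():
            key = tuple(values[a] for a in attributes)
            classes.setdefault(key, set()).add(name)
        return list(classes.values())

    def approximations(objects, attributes, target):
        """Lower approximation (certain members) and upper (possible members)."""
        lower, upper = set(), set()
        for cls in indiscernibility_classes(objects, attributes):
            if cls <= target:      # class entirely inside the target set
                lower |= cls
            if cls & target:       # class overlaps the target set
                upper |= cls
        return lower, upper

    # Toy patient table (invented): glucose level and BMI category.
    patients = {
        "p1": {"glucose": "high", "bmi": "obese"},
        "p2": {"glucose": "high", "bmi": "obese"},
        "p3": {"glucose": "high", "bmi": "normal"},
        "p4": {"glucose": "low",  "bmi": "normal"},
    }
    diabetic = {"p1", "p3"}  # invented labels
    low, up = approximations(patients, ["glucose", "bmi"], diabetic)
    print(sorted(low), sorted(up))  # ['p3'] ['p1', 'p2', 'p3']
    ```

    Here p1 and p2 are indiscernible but only p1 is labeled diabetic, so that class lands in the upper approximation only; the boundary region (upper minus lower) is where rule mining must cope with vagueness, the issue H2RM targets.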