WorldWideScience

Sample records for deferred acceptance algorithm

  1. Dynamic Matching Markets and the Deferred Acceptance Mechanism

    DEFF Research Database (Denmark)

    Kennes, John; Monte, Daniel; Tumennasan, Norovsambuu

In many dynamic matching markets, priorities depend on previous allocations. In such environments, agents on the proposing side can manipulate the period-by-period deferred acceptance (DA) mechanism. We show that the fraction of agents with incentives to manipulate the DA mechanism approaches zero as the market size increases. In addition, we provide a novel algorithm to calculate the percentage of markets that can be manipulated. Based on randomly generated data, we find that the DA becomes approximately non-manipulable when the schools' capacity reaches 20. Our theoretical and simulation results together justify the implementation of the period-by-period DA mechanism in dynamic markets.
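For reference, the student-proposing deferred acceptance procedure that these records build on can be sketched in a few lines (an illustrative sketch, not the authors' implementation; all names are invented):

```python
def deferred_acceptance(student_prefs, school_prefs, capacity):
    """Student-proposing deferred acceptance (Gale-Shapley).

    student_prefs: dict student -> list of schools, most preferred first
    school_prefs:  dict school  -> list of students, most preferred first
    capacity:      dict school  -> number of seats
    """
    # rank[s][i] = position of student i in school s's priority order
    rank = {s: {i: r for r, i in enumerate(p)} for s, p in school_prefs.items()}
    next_choice = {i: 0 for i in student_prefs}   # next school i will propose to
    held = {s: [] for s in school_prefs}          # tentatively held students
    free = list(student_prefs)
    while free:
        i = free.pop()
        if next_choice[i] >= len(student_prefs[i]):
            continue                              # i has exhausted their list
        s = student_prefs[i][next_choice[i]]
        next_choice[i] += 1
        held[s].append(i)
        held[s].sort(key=lambda j: rank[s][j])    # best-ranked first
        if len(held[s]) > capacity[s]:
            free.append(held[s].pop())            # reject the worst held student
    return {s: sorted(held[s]) for s in held}
```

Rejections are only tentative ("deferred") until the loop ends, which is what makes the mechanism strategy-proof for the proposing side in the static one-period case.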

  2. Judicial Deference Allows European Consensus to Emerge

    DEFF Research Database (Denmark)

    Dothan, Shai

    2018-01-01

...jurisdiction. But the ECHR sometimes defers to countries, even if their policies fall short of the standard accepted by most of the countries in Europe. This deference is accomplished by using the so-called "margin of appreciation" doctrine. Naturally, emerging consensus and margin of appreciation are often ... the paper demonstrates that a correct application of the margin of appreciation doctrine actually helps emerging consensus reach optimal results, by giving countries an incentive to make their policies independently.

  3. New accountant job market reform by computer algorithm: an experimental study

    Directory of Open Access Journals (Sweden)

    Hirose Yoshitaka

    2017-01-01

The purpose of this study is to examine the matching of new accountants with accounting firms in Japan. A notable feature of the present study is that it brings a computer algorithm to the job-hiring task. Job recruitment activities for new accountants in Japan are one-time, short-term struggles, and they change every year. Accordingly, many have searched for new rules to replace the current ones of the process. This study proposes modifying these job recruitment activities by combining computer and human efforts. Furthermore, the study formulates the job recruitment activities using a model and conducts experiments. As a result, the Deferred Acceptance (DA) algorithm yields a high truth-telling percentage, a high stable-matching percentage, and greater efficiency compared with the previous approach. This suggests the potential of the DA algorithm as a replacement for current approaches. In terms of accuracy and stability, the DA algorithm is superior to the current methods and should be adopted.
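The stable-matching percentage reported in experiments like this is measured by checking a matching for blocking pairs; a minimal sketch (a hypothetical helper, assuming a dict-based encoding of preferences and assignments):

```python
def is_stable(match, cand_prefs, firm_prefs, capacity):
    """Check a many-to-one matching for blocking pairs.

    match:      dict firm -> list of assigned candidates
    cand_prefs: dict candidate -> firms, most preferred first
    firm_prefs: dict firm -> candidates, most preferred first
    capacity:   dict firm -> number of positions
    """
    assigned = {c: f for f, cs in match.items() for c in cs}
    for c, prefs in cand_prefs.items():
        # index of c's current firm (len(prefs) if unmatched)
        current = prefs.index(assigned[c]) if c in assigned else len(prefs)
        for f in prefs[:current]:            # firms c strictly prefers
            rank = firm_prefs[f]
            if c not in rank:
                continue                     # f would never hire c
            held = match.get(f, [])
            if len(held) < capacity[f]:
                return False                 # free position: (c, f) blocks
            if any(rank.index(c) < rank.index(j) for j in held):
                return False                 # f prefers c to someone held
    return True
```

A matching is stable exactly when no candidate and firm would both rather abandon their assignment for each other.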

  4. Deferred Tax Assets and Deferred Tax Expense Against Tax Planning Profit Management

    Directory of Open Access Journals (Sweden)

    Warsono

    2017-09-01

The purpose of this study is to examine the probability of earnings management performed by Property and Real Estate companies listed on the Indonesia Stock Exchange (BEI) in the period 2011-2015. Management can influence the accounting numbers through deferred tax assets, deferred tax expense, and tax planning in the financial statements. This paper examines the effect of deferred tax assets, deferred tax expense, and tax planning on earnings management conducted by the company. The research uses secondary data from company financial statements downloaded from the official website of the Indonesia Stock Exchange. Sampling was performed by purposive sampling. The study population is the Property and Real Estate companies listed on the Indonesia Stock Exchange in the period 2011-2015, from which a sample of 34 companies was taken. Hypothesis testing uses multiple regression with SPSS software version 22. The results show that deferred tax assets have a positive and significant effect on earnings management, while deferred tax expense and tax planning have a significant negative effect on earnings management.

  5. Analyzing actual risk in malaria-deferred donors through selective serologic testing.

    Science.gov (United States)

    Nguyen, Megan L; Goff, Tami; Gibble, Joan; Steele, Whitney R; Leiby, David A

    2013-08-01

Approximately 150,000 US blood donors are deferred annually for travel to malaria-endemic areas. However, the majority do not travel to the high-risk areas of Africa associated with transfusion-transmitted malaria (TTM) but visit low-risk areas such as Mexico. This study tests for Plasmodium infection among malaria-deferred donors, particularly those visiting Mexico. Blood donors deferred for malaria risk (travel, residence, or previous infection) provided blood samples and completed a questionnaire. Plasma was tested for Plasmodium antibodies by enzyme immunoassay (EIA); repeat-reactive (RR) samples were considered positive and tested by real-time polymerase chain reaction (PCR). Accepted donors provided background testing data. During 2005 to 2011, a total of 5610 malaria-deferred donors were tested by EIA, including 5412 travel deferrals. Overall, 88 (1.6%) were EIA RR; none were PCR positive. Forty-nine (55.7%) RR donors previously had malaria irrespective of deferral category, including 34 deferred for travel. Among 1121 travelers to Mexico, 90% visited Quintana Roo (no or very low risk), but just 2.2% visited Oaxaca/Chiapas (moderate or high risk). Only two Mexican travelers tested RR; both previously had malaria not acquired in Mexico. Travel to Mexico represents a large percentage of US donors deferred for malaria risk; however, these donors primarily visit no- or very-low-risk areas. No malaria cases acquired in Mexico were identified, thereby supporting previous risk estimates. Consideration should be given to allowing blood donations from US donors who travel to Quintana Roo and other low-risk areas in Mexico. A more effective approach to preventing TTM would be to defer all donors with a history of malaria, even if remote. © 2012 American Association of Blood Banks.

  6. Making Deferred Taxes Relevant

    NARCIS (Netherlands)

    Brouwer, Arjan; Naarding, Ewout

    2018-01-01

We analyse the conceptual problems in current accounting for deferred taxes and provide solutions derived from the literature in order to make International Financial Reporting Standards (IFRS) deferred tax numbers value-relevant. In our view, the empirical results concerning the value relevance of ...

  7. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch.

    Science.gov (United States)

    Yurtkuran, Alkın; Emel, Erdal

    2016-01-01

The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinct characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
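The acceptance rule described above, always keep improving candidates and occasionally accept worse ones with a nonlinearly decaying probability, can be sketched as follows (the exponential decay form and the p0 parameter are assumptions for illustration, not the paper's exact formula):

```python
import math
import random

def accept(old_cost, new_cost, iteration, max_iter, p0=0.3, rng=random):
    """ABC-SA-style acceptance rule (illustrative sketch).

    Better candidates are always accepted; worse ones are accepted with a
    probability that decays nonlinearly from p0 toward 0 over the run.
    """
    if new_cost <= old_cost:
        return True                                  # greedy part: never reject improvement
    p = p0 * math.exp(-5.0 * iteration / max_iter)   # assumed nonlinear decay schedule
    return rng.random() < p
```

Early in the search this behaves diversely (worse moves escape local optima); late in the search it degenerates to the greedy selection of the standard ABC.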

  9. 26 CFR 31.3306(r)(2)-1 - Treatment of amounts deferred under certain nonqualified deferred compensation plans.

    Science.gov (United States)

    2010-04-01

§ 31.3306(r)(2)-1 Treatment of amounts deferred under certain nonqualified deferred compensation plans. (a) In general. Section 3306(r)(2) provides a special timing rule for the tax...

  10. How to avoid deferred-compensation troubles.

    Science.gov (United States)

    Freeman, Todd I

    2005-06-01

    Executive compensation packages have long included stock options and deferred compensation plans in order to compete for talent. Last year, Congress passed a law in response to the Enron debacle, in which executives were perceived to be protecting their deferred compensation at the expense of employees, creditors, and investors. The new law is designed to protect companies and their shareholders from being raided by the very executives that guided the company to financial ruin. Physicians who are part owners of medical practices need to know about the changes in the law regarding deferred compensation and how to avoid costly tax penalties. This article discusses how the changes affect medical practices as well as steps physician-owned clinics can take to avoid the risk of penalty, such as freezing deferred compensation and creating a new deferred compensation plan.

  11. Deferred Personal Life Decisions of Women Physicians.

    Science.gov (United States)

    Bering, Jamie; Pflibsen, Lacey; Eno, Cassie; Radhakrishnan, Priya

    2018-05-01

Inadequate work-life balance can have significant implications for individual performance, retention, and the future of the workforce in medicine. The purpose of this study was to determine whether women physicians defer personal life decisions in pursuit of their medical career. We conducted a survey study of women physicians ages 20-80 from various medical specialties using a combination of social media platforms and women physicians' professional listservs, with 801 survey responses collected from May through November 2015. The primary endpoint was whether women physicians deferred personal life decisions in pursuit of their medical career. Secondary outcomes included types of decisions deferred and correlations with age, hours worked per week, specialty, number of children, and career satisfaction. Respondents were categorized into deferred and nondeferred groups. Personal decision deferments were reported by 64% of respondents. Of these, 86% reported waiting to have children and 22% reported waiting to get married. Finally, while 85% of women in the nondeferment group would choose medicine again as a career, only 71% of women in the deferment group would do so (p ...); respondents also cited job satisfaction and insurance/administrative burden. The results of this survey have significant implications for the future of the workforce in medicine. Overall, our analysis shows that 64% of women physicians defer important life decisions in pursuit of their medical career. With an increase in the number of women physicians entering the workforce, lack of support and deferred personal decisions have a potential negative impact on individual performance and retention. Employers must consider the economic impact and potential workforce shortages that may develop if these issues are not addressed.

  12. On Two Competing Affirmative Actions under Deferred Acceptance Algorithm

    DEFF Research Database (Denmark)

    Liu, Yun

In this paper, we study two kinds of affirmative action policies, quota-based and reserve-based, under the Gale-Shapley student-optimal stable mechanism (SOSM). We first try to reveal the source of perverse effects of affirmative action policies, especially on the purported beneficiaries. We show that a variant of the Ergin-acyclicity structure, type-specific acyclicity, is crucial for effective affirmative action policies. This result may provide a simple criterion to decide whether affirmative action is appropriate to implement under a certain market structure. We next take colleges' incentives into consideration, and indicate that for all markets without type-specific cycles and with sufficient competition for each unfilled seat, the reserve-based affirmative action is more vulnerable to manipulation compared to its quota-based counterpart. This argument implies that the efficiency gain from the more ...

  13. 17 CFR 256.190 - Accumulated deferred income taxes.

    Science.gov (United States)

    2010-04-01

UNIFORM SYSTEM OF ACCOUNTS FOR MUTUAL SERVICE COMPANIES AND SUBSIDIARY SERVICE COMPANIES, PUBLIC UTILITY HOLDING COMPANY ACT OF 1935. Deferred Debits: § 256.190 Accumulated deferred income taxes. (a) ...

  14. The Transmutation of Deference in Medicine: An Ethico-Legal Perspective.

    Science.gov (United States)

    Devaney, Sarah; Holm, Søren

    2018-05-01

    This article critically considers the question of whether an increase in legal recognition of patient autonomy culminating in the decision of the Supreme Court in Montgomery v Lanarkshire Health Board in 2015 has led to the death of deference to doctors, not only within the courts and healthcare regulatory arenas in England and Wales but also in the consulting room and the health care system more broadly. We argue that deference has not been eradicated, but that the types of deference paid to doctors and to the medical profession have changed. In addition, whilst traditionally deference was extended towards the medical profession, increasing instances of deference being shown to other parties in the healthcare setting can be identified, allowing wider debate or recognition of the complexity of understandings, interests and aims of all those involved. Finally, we note instances in which deference to the medical profession has become more hidden.

  15. Deferred correction approach on generic transport equation

    International Nuclear Information System (INIS)

    Shah, I.A.; Ali, M.

    2004-01-01

In this study, a two-dimensional steady convection-diffusion equation was solved using a deferred correction approach, and the results were compared with standard spatial discretization schemes. Numerical investigations were carried out based on the velocity and flow direction, for various diffusivity coefficients covering a range from diffusive to convective flows. The results show that the deferred correction approach gives more accurate and stable results than UDS and CDS discretization of the convective terms. The deferred correction approach suppresses the wiggles produced by central-difference discretization in convective flows and also reduces the dissipative error generated by first-order upwind discretization of the convective fluxes. (author)
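The deferred correction idea, treating the low-order upwind flux implicitly and adding its difference from the higher-order central flux explicitly using previous-iteration values, can be illustrated for a single cell face (a 1D sketch under assumed notation; phi_U and phi_D are the upstream-side and downstream-side nodal values):

```python
def face_flux_deferred(phi_U, phi_D, u, beta, phi_U_old, phi_D_old):
    """Deferred-correction convective face flux (1D sketch).

    The implicit part uses first-order upwind (UDS); the difference
    between central (CDS) and upwind fluxes is added explicitly from the
    previous iteration's values (phi_*_old). beta in [0, 1] blends the
    correction in; at convergence with beta = 1 the flux equals pure CDS.
    """
    upwind = u * (phi_U if u >= 0 else phi_D)      # implicit UDS part
    cds_old = u * 0.5 * (phi_U_old + phi_D_old)    # explicit CDS, old values
    uds_old = u * (phi_U_old if u >= 0 else phi_D_old)
    return upwind + beta * (cds_old - uds_old)     # deferred correction term
```

The coefficient matrix keeps the diagonal dominance of the upwind scheme, while the converged solution inherits the accuracy of the central scheme, which is why the approach damps both wiggles and dissipative error.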

  16. 25 CFR 152.35 - Deferred payment sales.

    Science.gov (United States)

    2010-04-01

... desire, a sale may be made or approved on the deferred payment plan. The terms of the sale will be ... (ISSUANCE OF PATENTS IN FEE, CERTIFICATES OF COMPETENCY, REMOVAL OF RESTRICTIONS, AND SALE OF CERTAIN INDIAN LANDS; Mortgages and Deeds of ...)

  17. Reporting Deferred Gifts: CASE-NACUBO Guidelines Ensure Consistency.

    Science.gov (United States)

    Ridenour, James F.; Munger, Peter L.

    1983-01-01

    Three methods for reporting the value of a deferred gift are described: the tax method, net realizable value, and fair market value. Three major categories of deferred gifts are identified: pooled income funds, charitable remainder trusts, and charitable gift annuities. (MLW)

  18. Are groups more likely to defer choice than their members?

    Directory of Open Access Journals (Sweden)

    Chris M. White

    2011-04-01

When faced with a choice, people can normally select no option, i.e., defer choice. Previous research has investigated when and why individuals defer choice, but has almost never looked at these questions when groups of people make choices. Separate reasons predict that groups may be equally likely, more likely, or less likely than individuals to defer choice. We re-analyzed some previously published data and conducted a new experiment to address this question. We found that small groups of people tended to defer choice more often than their members would. Assuming that the groups used a plurality rule but gave additional weight to individual preferences to defer choice allowed the groups' responses to be predicted quite well. We discuss several possible explanations of these findings.
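The predictive model described, a plurality rule with extra weight on members' preferences to defer, might be sketched as follows (defer_weight is an assumed illustrative parameter, not the value fitted in the paper):

```python
from collections import Counter

def predict_group_choice(member_choices, defer_weight=2.0, defer='defer'):
    """Predict a group's choice from its members' individual choices.

    Plurality rule, except that each individual preference to defer
    counts defer_weight times instead of once (illustrative model of
    the paper's account).
    """
    votes = Counter()
    for choice in member_choices:
        votes[choice] += defer_weight if choice == defer else 1.0
    return max(votes, key=votes.get)   # option with the largest weighted tally
```

With defer_weight > 1 the model reproduces the observed pattern: groups defer more often than a simple majority of their members would.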

  19. Complex stimulation of peripheral nerve regeneration after deferred neurorrhaphy

    Directory of Open Access Journals (Sweden)

    Ivanov A.N.

    2017-09-01

The aim is to study the effect of complex stimulation, including skin autotransplantation and electrical stimulation of the sciatic nerve, on microcirculatory, electrophysiological and morphological changes after deferred neurorrhaphy in rats. Material and methods. The experiment was performed on 50 albino rats divided into control, comparison and experimental groups. In the experimental group, skin autotransplantation and electrical stimulation of the sciatic nerve were carried out in addition to deferred neurorrhaphy. In the comparison group only deferred neurorrhaphy was performed. Research methods included laser Doppler flowmetry, electroneuromyography and morphological analysis of the operated nerve. Results. Complex stimulation, including skin autotransplantation and direct action of electrical pulses on the sciatic nerve after its deferred neurorrhaphy, restores blood flow in the operated limb and promotes restoration of nerve fibers. Conclusion. The intensification of sciatic nerve regeneration after deferred neurorrhaphy in rats under complex stimulation, including full-thickness skin graft autotransplantation and direct action of electrical pulses, experimentally substantiates clinical testing of the given method for treatment of patients with peripheral nerve injuries.

  20. 18 CFR 367.2550 - Account 255, Accumulated deferred investment tax credits.

    Science.gov (United States)

    2010-04-01

Account 255, Accumulated deferred investment tax credits. This account must be credited with all investment tax credits deferred by companies that have elected to follow deferral accounting, partial or full...

  1. Deferred tax analysis and impact on firm's economic efficiency ratios

    Directory of Open Access Journals (Sweden)

    Hana Bohušová

    2005-01-01

The category of deferred income tax is a complex topic involving the whole accounting system and the income tax. The calculation method can be time-consuming, demanding a high-quality system of analytical evidence, a system of valuation, and a high level of accountants' knowledge. The aim at the theoretical level was to analyze the process of calculation and recording of deferred tax. The importance of recording deferred tax and its impact on financial analysis ratios was analyzed. Fourteen business entities were examined. Deferred tax recording is a legal way to reduce retained earnings and to protect them from careless allocation.
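The mechanics of a single deferred tax item can be shown with a textbook-style calculation (an illustrative sketch, not drawn from the paper's data; the 19% rate in the example is assumed):

```python
def deferred_tax(book_value, tax_base, tax_rate):
    """Deferred tax arising from one temporary difference (sketch).

    A taxable temporary difference (book_value > tax_base) gives rise to
    a deferred tax liability; a deductible one (book_value < tax_base)
    gives rise to a deferred tax asset.
    """
    diff = book_value - tax_base
    amount = abs(diff) * tax_rate
    kind = 'liability' if diff > 0 else 'asset' if diff < 0 else None
    return kind, round(amount, 2)
```

For example, an asset carried at 100 in the accounts but at 80 for tax purposes, at a 19% rate, produces a deferred tax liability of 3.8; this is the kind of item whose recording affects retained earnings and the financial ratios analyzed above.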

  2. Analysis of Deferred Taxes in the Business Environment in Serbia

    Directory of Open Access Journals (Sweden)

    Savka VUČKOVIĆ-MILUTINOVIĆ

    2013-06-01

The flow-through model of income tax reporting in general-purpose financial statements had a long history of use in Serbia. It was only in 2004 (and 2003 for banks) that the implementation of the deferred taxes model started. It was inevitable, because IAS/IFRS became the mandatory basis for preparing financial statements. In this paper we examine the quality of deferred tax disclosures in the financial statements of companies in Serbia. We also document the most common temporary differences that arise in measuring accounting and taxable income, and in that way we identify the major sources of deferred tax. We analyze the materiality of deferred taxes and their effect on company performance in Serbia.

3. Findings concerning testis, vas deferens, and epididymis in adult cases with nonpalpable testes

    Directory of Open Access Journals (Sweden)

    Coskun Sahin

    2011-12-01

In this study, we aimed to describe the relationship between the testis, epididymis and vas deferens in adult cases with nonpalpable testis. Between January 1996 and December 2009, we evaluated 154 adult cases with nonpalpable testes. Mean age was 23 years (20-27 years). Explorations were performed by open inguinal incision, laparoscopy, and by inguinal incision and laparoscopy together in 22, 131 and 1 patient, respectively. Of all the unilateral cases, 32 were accepted as vanishing testis. In five of these cases, the vas deferens ended inside the abdomen, and in the others, it ended inside the scrotum. In the remaining 99 unilateral and 22 bilateral cases, 143 testes were found in total. Testes were found in the inguinal canal as atrophic in one case, at the right renal pedicle level as a dysmorphic testis in one case, and anterior to the internal ring between the bladder and the common iliac vessels at a smaller than normal size in 119 cases. One (0.69%) case did not have an epididymis. While the epididymis was attached to the testis only at the head and tail in 88 (61.53%) cases, it was totally attached to the testis in 54 (37.76%) cases. There is an obviously high incidence of testis and vas deferens anomalies, of which epididymal anomalies are the most frequent. In cases with abdominal testes, this rate is highest for high localised abdominal testes.

  4. NAMA 80/20 DEFERRED PAYMENT INITIATIVE PARTICIPATION FORM

    OpenAIRE

    2012-01-01

Brochure detailing the Deferred Payment Initiative's key features and information on how to apply for the initiative: "NAMA has launched a Deferred Payment Initiative (the 'Initiative') on a pilot basis. The Initiative is aimed at potential owner-occupiers who are interested in purchasing residential property but are concerned at the risk of further price declines."

  5. Myocardial Damage in Patients With Deferred Stenting After STEMI

    DEFF Research Database (Denmark)

    Lønborg, Jacob; Engstrøm, Thomas; Ahtarovski, Kiril Aleksov

    2017-01-01

BACKGROUND: Although some studies found improved coronary flow and myocardial salvage when stent implantation was deferred, the DANAMI-3-DEFER (Third DANish Study of Optimal Acute Treatment of Patients With ST-elevation Myocardial Infarction) did not show any improvement in clinical outcome in patients ...

  6. Deferred Imitation and Social Communication in Speaking and Nonspeaking Children with Autism

    Science.gov (United States)

    Strid, Karin; Heimann, Mikael; Gillberg, Christopher; Smith, Lars; Tjus, Tomas

    2013-01-01

    Deferred imitation and early social communication skills were compared among speaking and nonspeaking children with autism and children developing typically. Overall, the children with autism showed a lower frequency on measures of deferred imitation and social communication compared with typically developing children. Deferred imitation was…

  7. 34 CFR 682.210 - Deferment.

    Science.gov (United States)

    2010-07-01

    ... provide information, including an example, on the impact of capitalization of accrued, unpaid interest on... applicable, the post-deferment grace period expire, a borrower resumes any delinquency status that existed...

  8. What parents of children who have received emergency care think about deferring consent in randomised trials of emergency treatments: postal survey.

    Directory of Open Access Journals (Sweden)

    Carrol Gamble

To investigate parents' views about deferred consent to inform management of trial disclosure after a child's death. A postal questionnaire survey was sent to members of the Meningitis Research Foundation UK charity whose child had suffered from bacterial meningitis or meningococcal septicaemia within the previous 5 years. Main outcome measures were acceptability of deferred consent; timing of requesting consent; and the management of disclosure of the trial after a child's death. 220 families were sent questionnaires, of whom 63 (29%) were bereaved. 68 families responded (31%), of whom 19 (28%) were bereaved. The majority (67%) were willing for their child to be involved in the trial without the trial being explained to them beforehand; 70% wanted to be informed about the trial as soon as their child's condition had stabilised. In the event of a child's death before the trial could be discussed, the majority of bereaved parents (66%; 12/18) anticipated wanting to be told about the trial at some time. This compared with 37% (18/49) of non-bereaved families (p = 0.06). Parents' free-text responses indicated that the word 'trial' held strongly negative connotations. A few parents regarded gaps in the evidence base about emergency treatments as indicating staff lacked expertise to care for a critically ill child. Bereaved parents' free-text responses indicated the importance of individualised management of disclosure about a trial following a child's death. Deferred consent is acceptable to the majority of respondents. Parents whose children had recovered differed in their views compared to bereaved parents. Most bereaved parents would want to be informed about the trial in the aftermath of a child's death, although a minority strongly opposed such disclosure. A distinction should be drawn between the views of bereaved and non-bereaved parents when considering the acceptability of different consent processes.

  9. Diversity Controlling Genetic Algorithm for Order Acceptance and Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Cheng Chen

    2014-01-01

Order selection and scheduling are important topics in production systems. To tackle the order acceptance and scheduling problem on a single machine with release dates, tardiness penalties, and sequence-dependent setup times, this paper proposes a diversity controlling genetic algorithm (DCGA), in which a diversified population is maintained during the whole search process through survival selection considering both the fitness and the diversity of individuals. To measure the similarity between individuals, a modified Hamming distance that ignores the unaccepted orders in the chromosome is adopted. The proposed DCGA was validated on 1500 benchmark instances with up to 100 orders. Compared with the state-of-the-art algorithms, the experimental results show that DCGA significantly improves the solution quality obtained, in terms of the deviation from the upper bound.
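The modified Hamming distance described, one that skips genes corresponding to unaccepted orders, admits a simple sketch (encoding unaccepted orders as None is an assumption for illustration, not the paper's exact representation):

```python
def order_distance(a, b, unaccepted=None):
    """Hamming-style distance between two order-acceptance chromosomes.

    Positions where the order is unaccepted in BOTH solutions are
    skipped, so two individuals that reject the same orders are not
    counted as different there (one plausible reading of the paper's
    modified Hamming distance).
    """
    return sum(
        1
        for x, y in zip(a, b)
        if not (x == unaccepted and y == unaccepted) and x != y
    )
```

Survival selection can then trade off fitness against this distance to the rest of the population, which is what keeps the population diversified.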

  10. Automated backbone assignment of labeled proteins using the threshold accepting algorithm

    International Nuclear Information System (INIS)

    Leutner, Michael; Gschwind, Ruth M.; Liermann, Jens; Schwarz, Christian; Gemmecker, Gerd; Kessler, Horst

    1998-01-01

The sequential assignment of backbone resonances is the first step in the structure determination of proteins by heteronuclear NMR. For larger proteins, an assignment strategy based on proton side-chain information is no longer suitable for use in an automated procedure. Our program PASTA (Protein ASsignment by Threshold Accepting) is therefore designed to partially or fully automate the sequential assignment of proteins, based on the analysis of NMR backbone resonances plus Cβ information. In order to overcome the problems caused by peak overlap and missing signals in an automated assignment process, PASTA uses threshold accepting, a combinatorial optimization strategy, which is superior to simulated annealing due to generally faster convergence and better solutions. The reliability of this algorithm is shown by reproducing the complete sequential backbone assignment of several proteins from published NMR data. The robustness of the algorithm against misassigned signals, noise, spectral overlap and missing peaks is shown by repeating the assignment with reduced sequential information and increased chemical shift tolerances. The performance of the program on real data is finally demonstrated with automatically picked peak lists of human nonpancreatic synovial phospholipase A2, a protein with 124 residues.
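Threshold accepting itself is a deterministic variant of simulated annealing: a worse candidate is accepted whenever its cost increase stays below the current threshold, and the threshold schedule is lowered toward zero. A generic sketch (not the PASTA implementation; the toy problem and schedule are invented):

```python
import random

def threshold_accepting(cost, neighbor, x0, thresholds, steps_per_t=100, rng=None):
    """Minimize cost() by threshold accepting (Dueck & Scheuer style).

    thresholds: decreasing sequence, e.g. [1.0, 0.5, 0.1, 0.0]; at
    threshold t, a move to y is accepted iff cost(y) - cost(x) < t,
    so no random acceptance test is needed.
    """
    rng = rng or random.Random(0)
    x = best = x0
    for t in thresholds:
        for _ in range(steps_per_t):
            y = neighbor(x, rng)
            if cost(y) - cost(x) < t:   # accept unless worse by at least t
                x = y
                if cost(x) < cost(best):
                    best = x
    return best
```

On a toy objective such as (x - 3)^2 with a small random step as the neighbor move, the search settles near the minimum as the thresholds shrink; PASTA applies the same strategy to the combinatorial cost of a candidate backbone assignment.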

  11. Concurrent non-deferred reference counting on the Microgrid: first experiences

    NARCIS (Netherlands)

    Herhut, S.; Joslin, C.; Scholz, S.-B.; Poss, R.; Grelck, C.

    2011-01-01

We present a first evaluation of our novel approach for non-deferred reference counting on the Microgrid many-core architecture. Non-deferred reference counting is a fundamental building block of implicit heap management of functional array languages in general and Single Assignment C in particular.

  12. 17 CFR 256.411 - Provision for deferred income taxes-credit.

    Science.gov (United States)

    2010-04-01

§ 256.411 Provision for deferred income taxes, credit. This account shall be credited, and Accumulated Deferred Income Taxes debited, with an amount equal to the portion of taxes on income payable for the year which is attributable to a...

  13. Deferment cutting in Appalachian hardwoods: the what, whys, and hows

    Science.gov (United States)

    H. Clay Smith; Gary W. Miller

    1991-01-01

    Deferment cutting is a regeneration practice that resembles a seed-tree or shelterwood cutting. The difference is that residual trees are not cut when the reproduction becomes established. Instead, residual trees are left until new reproduction matures to sawtimber size, and another regeneration cut is the silvicultural objective. Hence, with deferment cutting specific...

  14. 26 CFR 1.615-3 - Election to defer pre-1970 exploration expenditures.

    Science.gov (United States)

    2010-04-01

    ... (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Natural Resources § 1.615-3 Election to defer pre... section 1231 (except coal or iron ore to which section 631(c) applies), the deferred exploration...

  15. Memory and representation in young children with Down syndrome: Exploring deferred imitation and object permanence.

    Science.gov (United States)

    Rast, Mechthild; Meltzoff, Andrew N

    1995-01-01

    Deferred imitation and object permanence (OP) were tested in 48 young children with Down syndrome (DS), ranging from 20 to 43 months of age. Deferred imitation and high-level OP (invisible displacements) have long been held to be synchronous developments during sensory-motor "Stage 6" (18-24 months of age in unimpaired children). The results of the current study demonstrate deferred imitation in young children with DS, showing they can learn novel behaviors from observation and retain multiple models in memory. This is the first demonstration of deferred imitation in young children with DS. The average OP level passed in this sample was A-not-B, a task passed at 8-12 months of age in normally developing infants. Analyses showed that individual children who failed high-level OP (invisible displacements) could still perform deferred imitation. This indicates that deferred imitation and OP invisible displacements are not synchronous developments in children with DS. This asynchrony is compatible with new data from unimpaired children suggesting that deferred imitation and high-level OP entail separate and distinctive kinds of memory and representation.

  16. 48 CFR 9904.415 - Accounting for the cost of deferred compensation.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Accounting for the cost of deferred compensation. 9904.415 Section 9904.415 Federal Acquisition Regulations System COST ACCOUNTING... AND COST ACCOUNTING STANDARDS COST ACCOUNTING STANDARDS 9904.415 Accounting for the cost of deferred...

  17. Taxable and Tax-deferred Investing with the Limited Use of Losses

    DEFF Research Database (Denmark)

    Fischer, Marcel; Gallmeyer, Michael

    2017-01-01

    We study the impact of the different tax treatment of capital gains and losses on the optimal location of assets in taxable and tax-deferred accounts. The classical result of Black (1980) and Tepper (1981) suggests that investors should follow a strict pecking order asset location rule and hold those assets that are subject to the highest tax rate preferentially in tax-deferred accounts. We show that with the different tax treatment of realized gains and losses, only tax-efficient equity mutual funds are optimally held in taxable accounts, whereas mutual funds with average tax-(in)efficiency are preferentially held in tax-deferred accounts.

  18. 18 CFR 367.4112 - Account 411.2, Provision for deferred income taxes-Credit, other income and deductions.

    Science.gov (United States)

    2010-04-01

    ..., Provision for deferred income taxes-Credit, other income and deductions. 367.4112 Section 367.4112... deferred taxes and deferrals of taxes, credit, that relate to other income and deductions. ... Accounts Service Company Operating Income § 367.4112 Account 411.2, Provision for deferred income taxes...

  19. The impact of excess choice on deferment of decisions to volunteer

    Directory of Open Access Journals (Sweden)

    Lauren S. Carroll

    2011-10-01

    Full Text Available: Excess choice has previously been shown to have detrimental effects on decisions about consumer products. As the number of options increases, people are more likely to put off making an active choice (i.e., defer) and show less satisfaction with any purchase actually made. We extend this line of enquiry to choosing a charitable organisation to volunteer for. The issue is important because the number of voluntary organisations is enormous and the impact of such a decision may be greater than for consumer decisions in terms of time commitment and benefits to the volunteer and society. Study 1 asked students to examine a real volunteering website and record how many organisations they considered, decision difficulty and whether or not they would like to sign up for a chosen organisation or prefer to defer a decision. Study 2 presented either a relatively small (10) or large (30) choice set of hypothetical organisations and measured deferment likelihood and decision difficulty. In both studies the more options considered, the greater the likelihood to defer. This effect was mediated by decision difficulty. This research is the first to find that detrimental effects of excess choice extend to volunteering. Implications for volunteer recruitment are discussed.

  20. Deferred Tax Assets and Bank Regulatory Capital

    NARCIS (Netherlands)

    Gallemore, J.

    2012-01-01

    Abstract: In this study, I examine three issues: (1) whether the probability of bank failure is increasing in the proportion of regulatory capital composed of deferred tax assets (DTA), (2) whether market participants incorporate the increased failure risk associated with the DTA component of

  1. 26 CFR 1.453-4 - Sale of real property involving deferred periodic payments.

    Science.gov (United States)

    2010-04-01

    ..., for the purpose of determining whether a sale is on the installment plan, be included as a part of the... 26 Internal Revenue 6 2010-04-01 2010-04-01 false Sale of real property involving deferred... Included § 1.453-4 Sale of real property involving deferred periodic payments. (a) In general. Sales of...

  2. Preschool children's proto-episodic memory assessed by deferred imitation.

    Science.gov (United States)

    Burns, Patrick; Russell, Charlotte; Russell, James

    2015-01-01

    In two experiments, both employing deferred imitation, we studied the developmental origins of episodic memory in two- to three-year-old children by adopting a "minimalist" view of episodic memory based on its What-When-Where ("WWW": spatiotemporal plus semantic) content. We argued that the temporal element within spatiotemporal should be the order/simultaneity of the event elements, but that it is not clear whether the spatial content should be egocentric or allocentric. We also argued that episodic recollection should be configural (tending towards all-or-nothing recall of the WWW elements). Our first deferred imitation experiment, using a two-dimensional (2D) display, produced superior-to-chance performance after 2.5 years but no evidence of configural memory. Moreover, performance did not differ from that on a What-What-What control task. Our second deferred imitation study required the children to reproduce actions on an object in a room, thereby affording layout-based spatial cues. In this case, not only was there superior-to-chance performance after 2.5 years but memory was also configural at both ages. We discuss the importance of allocentric spatial cues in episodic recall in early proto-episodic memory and reflect on the possible role of hippocampal development in this process.

  3. Donating blood for research: a potential method for enhancing customer satisfaction of permanently deferred blood donors.

    Science.gov (United States)

    Waller, Daniel; Thijsen, Amanda; Garradd, Allira; Hayman, Jane; Smith, Geoff

    2017-01-01

    Each year, a large number of individuals in Australia are deferred from donating blood. A deferral may have a negative impact on donor satisfaction and subsequent word-of-mouth communication. The Australian Red Cross Blood Service (the Blood Service) is, therefore, investigating options for managing service interactions with deferred donors to maintain positive relationships. While public research institutes in Australia have established independent research donor registries, other countries provide programmes allowing deferred donors to donate blood for research via blood collection agencies. This study examined attitudes towards donating blood for research use in a sample of permanently deferred Australian donors. Donors permanently deferred because of a risk of variant Creutzfeldt-Jakob disease (n=449) completed a postal survey that examined attitudes towards research donation. The majority of participants were interested in donating blood for research (96%), and joining a registry of research donors (93%). Participants preferred to donate for transfusion or clinical research, and were willing to travel large distances. Results indicated that positive attitudes towards the Blood Service would be extended if the opportunity to donate blood was provided. These findings indicate a desire for continued engagement with the Blood Service despite deferral. Donating blood for research is a potential way of maintaining positive relationships with permanently deferred donors which also benefits the health research community. Through maintaining positive relationships with deferred donors, positive word-of-mouth activity can be stimulated. Further work is needed to determine the feasibility of implementing research donation through the Blood Service in Australia.

  4. 18 CFR 367.4101 - Account 410.1, Provision for deferred income taxes, operating income.

    Science.gov (United States)

    2010-04-01

    ..., FEDERAL POWER ACT AND NATURAL GAS ACT Income Statement Chart of Accounts Service Company Operating Income § 367.4101 Account 410.1, Provision for deferred income taxes, operating income. This account must..., Provision for deferred income taxes, operating income. 367.4101 Section 367.4101 Conservation of Power and...

  5. 76 FR 39105 - Notice of Request for Comments on Proposed Deferred Maintenance and Repairs Standards

    Science.gov (United States)

    2011-07-05

    ... FEDERAL ACCOUNTING STANDARDS ADVISORY BOARD Notice of Request for Comments on Proposed Deferred Maintenance and Repairs Standards AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board... Accounting Standards Advisory Board (FASAB) is requesting comments on the Exposure Draft, Deferred...

  6. 18 CFR 367.4111 - Account 411.1, Provision for deferred income taxes-Credit, operating income.

    Science.gov (United States)

    2010-04-01

    ..., Provision for deferred income taxes-Credit, operating income. 367.4111 Section 367.4111 Conservation of... Company Operating Income § 367.4111 Account 411.1, Provision for deferred income taxes—Credit, operating... taxes, credit, that relate to service company operating income. ...

  7. Accounting choices in Brazil: identifying the characteristics of publicly traded companies that opted to maintain versus derecognise deferred assets

    Directory of Open Access Journals (Sweden)

    Fernando Drago Lorencini

    2012-04-01

    Full Text Available: The issuance of Brazilian Law 11.638/2007 is a critical step in the convergence of the Brazilian Generally Accepted Accounting Principles (GAAPs) towards International Financial Reporting Standards. After the law was implemented and later modified by Provisional Executive Order 449/2008 (converted into Law 11.941/2009), certain accounting choices were allowed during the transition period. The Brazilian GAAPs allowed for restructuring costs and costs related to opening a new facility to be recognised as assets. As a transitional provision, companies were allowed to choose between maintaining or eliminating these values. In this paper, we attempted to identify which company characteristics were associated with this accounting choice. The final sample consisted of Brazilian companies listed on the BM&FBOVESPA, and a logistic regression identified two characteristics. Participation in one of the three different corporate governance levels of the BM&FBOVESPA was associated with the choice to derecognise the deferred assets, while companies decided to maintain the deferred asset if it was relatively large. The empirical evidence reported here contributes to the literature by explaining the manner in which a set of firm characteristics is related to a firm's accounting choices.

  8. Minimum radwaste system to support commercial operation-what equipment can be deferred

    International Nuclear Information System (INIS)

    Marshall, R.W.; Tafazzoli, M.M.

    1984-01-01

    Because of cash flow problems being experienced by utilities as nuclear power stations approach completion, areas of the plant for which the completion of the construction effort could be deferred past commercial operation should be reviewed. The radwaste treatment systems are prime candidates for such a deferral because of the availability, either temporary or permanent, of alternative treatment methods for the waste streams expected to be produced. In order to identify the radwaste equipment, components and associated hardware in the radwaste building which could be deferred past commercial operation, a study was performed by Impell Corporation to evaluate the existing radwaste treatment system and determine the minimum system necessary to support commercial operation of a typical BWR. The study identified the minimum-installed radwaste treatment system which, in combination with portable temporary equipment, would accommodate the waste types and quantities likely to be produced in the first few years of operation. In addition, the minimum-installed system had to be licensable, and excessive radiation exposures should not be incurred during the construction of the deferred portions of the system after commercial operation. From this study, it was concluded that a significant quantity of radwaste processing equipment and the associated piping, valves and instrumentation could be deferred. The estimated savings, in construction man-hours (excluding field distributables) alone, was over 102,000 man-hours.

  9. Impairment of Goodwill and Deferred Taxes under IFRS

    NARCIS (Netherlands)

    Detzen, D.; Stork-Wersborg, T.; Zulch, H.

    2016-01-01

    This article discusses the effect of deferred tax liabilities (DTLs) on an impairment test of goodwill. While IAS 12.66 acknowledges that DTLs arising in a business combination influence the amount of goodwill an entity recognises, International Financial Reporting Standards are silent on the

  10. 48 CFR 32.607-2 - Deferment of collection.

    Science.gov (United States)

    2010-10-01

    ... financially weak contractors, balancing the need for Government security against loss and undue hardship on... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Deferment of collection. 32.607-2 Section 32.607-2 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION...

  11. Effect of deferred or no treatment with ursodeoxycholic acid in patients with early primary biliary cholangitis.

    Science.gov (United States)

    Tanaka, Atsushi; Hirohara, Junko; Nakano, Toshiaki; Yagi, Minami; Namisaki, Tadashi; Yoshiji, Hitoshi; Nakanuma, Yasuni; Takikawa, Hajime

    2018-02-06

    As primary biliary cholangitis (PBC) is a heterogeneous disease, we hypothesized that there is a population of patients with early PBC who do not require prompt treatment with ursodeoxycholic acid (UDCA). In this study, we analyzed data from a large-scale PBC cohort in Japan, and retrospectively investigated whether outcomes of early PBC patients were affected by prompt or deferred/no UDCA treatment. We defined early PBC as asymptomatic disease with serum alkaline phosphatase below a cutoff, and compared outcomes of early PBC patients between the treatment regimens: the prompt treatment group (UDCA initiated within 1 year after diagnosis) and the deferred/no treatment group (UDCA initiated >1 year after diagnosis or never initiated). Furthermore, we examined the outcomes of early PBC patients alternatively defined only with symptomatology and biochemistry. We identified 562 early PBC patients (prompt: n = 509; deferred/no treatment: n = 53). Incidence rates (per 1000 patient-years) for liver-related mortality or liver transplantation and decompensating events were 0.5 and 5.4, respectively, in the prompt treatment group, and 0 and 8.7, respectively, in the deferred/no treatment group. Multivariate analyses showed that age and bilirubin were significantly associated with developing decompensating events, whereas the prompt and deferred/no treatments were not. We obtained similar results in early PBC patients defined without histological examination. We showed that deferred/no treatment for early PBC patients did not affect the outcomes. This study provides a rationale for a future prospective, randomized study. © 2018 The Japan Society of Hepatology.

  12. 47 CFR 1.104 - Preserving the right of review; deferred consideration of application for review.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Preserving the right of review; deferred... Actions Taken by the Commission and Pursuant to Delegated Authority; Effective Dates and Finality Dates of Actions § 1.104 Preserving the right of review; deferred consideration of application for review. (a) The...

  13. Long-term memory, forgetting, and deferred imitation in 12-month-old infants

    OpenAIRE

    Klein, Pamela J.; Meltzoff, Andrew N.

    1999-01-01

    Long-term recall memory, as indexed by deferred imitation, was assessed in 12-month-old infants. Independent groups of infants were tested after retention intervals of 3 min, 1 week and 4 weeks. Deferred imitation was assessed using the ‘observation-only’ procedure in which infants were not allowed motor practice on the tasks before the delay was imposed. Thus, the memory could not have been based on re-accessing a motor habit, because none was formed in the first place. After the delay, memo...

  14. 75 FR 22813 - Guidance for Industry: Requalification Method for Reentry of Blood Donors Deferred Because of...

    Science.gov (United States)

    2010-04-30

    ...] Guidance for Industry: Requalification Method for Reentry of Blood Donors Deferred Because of Reactive Test... availability of a document entitled ``Guidance for Industry: Requalification Method for Reentry of Blood Donors... document entitled ``Guidance for Industry: Requalification Method for Reentry of Blood Donors Deferred...

  15. 26 CFR 1.616-2 - Election to defer.

    Science.gov (United States)

    2010-04-01

    ... Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES (CONTINUED) Natural Resources § 1.616-2 Election to defer. (a) General rule. In lieu of... exchange of a capital asset or property treated under section 1231 (except coal or iron ore to which...

  16. 13 CFR 120.1717 - Seller's Pool Loan deferments.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Seller's Pool Loan deferments. 120.1717 Section 120.1717 Business Credit and Assistance SMALL BUSINESS ADMINISTRATION BUSINESS LOANS Establishment of SBA Secondary Market Guarantee Program for First Lien Position 504 Loan Pools § 120.1717 Seller...

  17. 47 CFR 32.4100 - Net current deferred operating income taxes.

    Science.gov (United States)

    2010-10-01

    ... SERVICES UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Balance Sheet Accounts § 32.4100 Net current deferred operating income taxes. (a) This account shall include the balance...

  18. Strong Stability Preserving Property of the Deferred Correction Time Discretization

    National Research Council Canada - National Science Library

    Liu, Yuan; Shu, Chi-Wang; Zhang, Mengping

    2007-01-01

    In this paper, we study the strong stability preserving (SSP) property of a class of deferred correction time discretization methods, for solving the method-of-lines schemes approximating hyperbolic...

  19. Embracing equifinality with efficiency: Limits of Acceptability sampling using the DREAM(LOA) algorithm

    Science.gov (United States)

    Vrugt, Jasper A.; Beven, Keith J.

    2018-04-01

    This essay illustrates some recent developments to the DiffeRential Evolution Adaptive Metropolis (DREAM) MATLAB toolbox of Vrugt (2016) to delineate and sample the behavioural solution space of set-theoretic likelihood functions used within the GLUE (Limits of Acceptability) framework (Beven and Binley, 1992, 2014; Beven and Freer, 2001; Beven, 2006). This work builds on the DREAM(ABC) algorithm of Sadegh and Vrugt (2014) and enhances significantly the accuracy and CPU-efficiency of Bayesian inference with GLUE. In particular it is shown how lack of adequate sampling in the model space might lead to unjustified model rejection.
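The Limits of Acceptability idea can be sketched independently of DREAM: a parameter draw is "behavioural" only if every simulated value falls within its observation-specific acceptability band. The brute-force Monte Carlo sketch below is purely illustrative (the linear toy model, uniform priors, and bands are invented; DREAM(LOA) replaces this blind loop with adaptive MCMC to sample the behavioural space efficiently):

```python
import random

def sample_behavioural(simulate, observations, limits, n_draws=10000, seed=0):
    """Keep only parameter draws whose simulated outputs lie within
    every observation-specific acceptability band (GLUE LOA criterion)."""
    rng = random.Random(seed)
    behavioural = []
    for _ in range(n_draws):
        theta = (rng.uniform(0, 2), rng.uniform(0, 2))   # uniform prior box
        sim = simulate(theta)
        if all(obs - tol <= s <= obs + tol
               for s, obs, tol in zip(sim, observations, limits)):
            behavioural.append(theta)
    return behavioural

# Toy model: y_t = a*t + b, with +/-0.5 acceptability bands around each observation.
def simulate(theta):
    a, b = theta
    return [a * t + b for t in range(1, 6)]

obs = [1.5, 2.5, 3.5, 4.5, 5.5]        # generated with a = 1, b = 0.5
kept = sample_behavioural(simulate, obs, limits=[0.5] * 5)
print(len(kept))
```

Brute-force rejection wastes almost every draw when the behavioural region is small, which is exactly the inefficiency the adaptive sampler is designed to remove.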

  20. Different supplements for finishing of Nellore cattle on deferred Brachiaria decumbens pasture during the dry season

    Directory of Open Access Journals (Sweden)

    Antonio Tadeu de Andrade

    2015-07-01

    Full Text Available: This study evaluated the effect of four types of supplement on the finishing of Nellore cattle on deferred Brachiaria decumbens pasture during the dry season. Sixty-four castrated Nellore males with an age of approximately 34 months and initial body weight (BW) ranging from 360 to 380 kg were divided into 16 animals per treatment in a completely randomized design. The treatments consisted of four types of pasture supplement: deferred Brachiaria decumbens pasture + energy protein mineral salt (SuEPM), used as control; deferred Brachiaria decumbens pasture + urea + cottonseed meal (28% CP) + ground corn grain (SuCo); deferred Brachiaria decumbens pasture + urea + cottonseed meal (28% CP) + citrus pulp (SuCPu); deferred Brachiaria decumbens pasture + urea + cottonseed meal (28% CP) + soy hull (SuSH). The pasture was deferred for 170 days and provided 3,482 kg DM/ha of forage, permitting a stocking rate of 1.56 AU/ha (DM intake of 2.25% BW and 50% pasture efficiency). The animals received the supplement ad libitum in the SuEPM treatment and as % BW in the other treatments from July to October. The animals were slaughtered at a minimum BW of 457 kg. The following variables were evaluated: final weight, weight gain during the period (WG), average daily gain (ADG), hot carcass weight (HCW), and hot carcass yield (HCY). With respect to final weight, the supplement in the SuCo, SuCPu and SuSH treatments permitted a greater supply of nutrients and the animals therefore exhibited better performance (P<0.05) compared to the SuEPM treatment (mean of 478.68 vs 412.62 kg). The same effect was observed for the other parameters studied. Analysis of WG and ADG showed that SuSH was superior to the SuCo and SuCPu treatments (P<0.05) due to the increased offer of concentrate and SuEPM was inferior to the other treatments. Higher HCW (260.05 kg) and HCY (53.92%) were obtained with treatment SuSH as a result of greater performance. Supplementation of cattle during the dry period on

  1. 18 CFR 367.4102 - Account 410.2, Provision for deferred income taxes, other income and deductions.

    Science.gov (United States)

    2010-04-01

    ... COMPANY ACT OF 2005, FEDERAL POWER ACT AND NATURAL GAS ACT Income Statement Chart of Accounts Service Company Operating Income § 367.4102 Account 410.2, Provision for deferred income taxes, other income and..., Provision for deferred income taxes, other income and deductions. 367.4102 Section 367.4102 Conservation of...

  2. 5 CFR 847.907 - How is the monthly annuity rate used to compute the present value of the deferred annuity without...

    Science.gov (United States)

    2010-01-01

    ... compute the present value of the deferred annuity without credit for NAFI service determined? 847.907... the present value of the deferred annuity without credit for NAFI service determined? (a) The monthly annuity rate used to compute the present value of the deferred annuity under § 847.906 of this subpart for...

  3. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...
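Of the paradigms the record lists, divide-and-conquer is the easiest to show concretely; merge sort is the textbook instance (a generic sketch, not code from the article):

```python
def merge_sort(xs):
    """Divide-and-conquer sorting: split the list, sort each half
    recursively, then merge the two sorted halves."""
    if len(xs) <= 1:          # base case: already sorted
        return list(xs)
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge step
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 2, 8, 1, 9, 3]))  # [1, 2, 3, 5, 8, 9]
```

Balancing the split is what yields the O(n log n) bound: each level of recursion does linear merging work over log n levels.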

  4. A burn center paradigm to fulfill deferred consent public disclosure and community consultation requirements for emergency care research.

    Science.gov (United States)

    Blackford, Martha G; Falletta, Lynn; Andrews, David A; Reed, Michael D

    2012-09-01

    To fulfill Food and Drug Administration and Department of Health and Human Services emergency care research informed consent requirements, our burn center planned and executed a deferred consent strategy gaining Institutional Review Board (IRB) approval to proceed with the clinical study. These federal regulations dictate public disclosure and community consultation unique to acute care research. Our regional burn center developed and implemented a deferred consent public notification and community consultation paradigm appropriate for a burn study. Published accounts of deferred consent strategies focus on acute care resuscitation practices. We adapted those strategies to design and conduct a comprehensive public notification/community consultation plan to satisfy deferred consent requirements for burn center research. To implement a robust media campaign we engaged the hospital's public relations department, distributed media materials, recruited hospital staff for speaking engagements, enlisted community volunteers, and developed initiatives to inform "hard-to-reach" populations. The hospital's IRB determined we fulfilled our obligation to notify the defined community. Our communication strategy should provide a paradigm other burn centers may appropriate and adapt when planning and executing a deferred consent initiative. Copyright © 2012 Elsevier Ltd and ISBI. All rights reserved.

  5. Survival after primary and deferred cystectomy for stage T1 transitional cell carcinoma of the bladder

    Directory of Open Access Journals (Sweden)

    Bedeir Ali-El-Dein

    2011-01-01

    Conclusions: Cancer-specific survival is statistically comparable for primary and deferred cystectomy in T1 bladder cancer, although there is a non-significant difference in favor of primary cystectomy. In the deferred cystectomy group, the number of TURBTs beyond three is associated with lower survival. Conservative treatment should be adopted for most cases in this category.

  6. 47 CFR 32.4341 - Net deferred tax liability adjustments.

    Science.gov (United States)

    2010-10-01

    ... income tax charges and credits pertaining to Account 32.4361, Deferred tax regulatory adjustments—net. (b... be recorded in Account 4361 as required by paragraph (a) of this section. (3) The tax effects of... UNIFORM SYSTEM OF ACCOUNTS FOR TELECOMMUNICATIONS COMPANIES Instructions for Balance Sheet Accounts § 32...

  7. Is justice deferred, justice denied? Not necessarily

    OpenAIRE

    Ryder, N.; Palmer, A.

    2016-01-01

    At long last, the Serious Fraud Office has received a major boost in its prosecution of bribery. Serious Fraud Office v Standard Bank PLC is a landmark case because it is not only the first case where the SFO has looked to prosecute a commercial organisation for failure to prevent bribery under Bribery Act 2010, but the first occasion where it has sought to enter a Deferred Prosecution Agreement under Crime and Courts Act 2013.

  8. 17 CFR 256.255 - Accumulated deferred investment tax credits.

    Science.gov (United States)

    2010-04-01

    ... investment tax credits. 256.255 Section 256.255 Commodity and Securities Exchanges SECURITIES AND EXCHANGE... investment tax credits. (a) This account shall be credited and account 411.5, Investment tax credit, debited with investment tax credits deferred by companies which do not apply such credits as a reduction of the...

  9. 26 CFR 1.174-4 - Treatment as deferred expenses.

    Science.gov (United States)

    2010-04-01

    ... which are not chargeable to property of a character subject to an allowance for depreciation or... taxpayer first realizes benefits from the expenditures. The length of the period shall be selected by the... showing to the contrary, the taxpayer will be deemed to have begun to realize benefits from the deferred...

  10. Deferred versus conventional stent implantation in patients with ST-segment elevation myocardial infarction (DANAMI 3-DEFER)

    DEFF Research Database (Denmark)

    Kelbæk, Henning; Høfsten, Dan Eik; Køber, Lars

    2016-01-01

    to assess the clinical outcomes of deferred stent implantation versus standard PCI in patients with STEMI. METHODS: We did this open-label, randomised controlled trial at four primary PCI centres in Denmark. Eligible patients (aged >18 years) had acute onset symptoms lasting 12 h or less, and ST-segment elevation of 0·1 mV or more in at least two or more contiguous electrocardiographic leads or newly developed left bundle branch block. Patients were randomly assigned (1:1), via an electronic web-based system with permuted block sizes of two to six, to receive either standard primary PCI with immediate

  11. Validation of clinical acceptability of an atlas-based segmentation algorithm for the delineation of organs at risk in head and neck cancer

    Energy Technology Data Exchange (ETDEWEB)

    Hoang Duc, Albert K., E-mail: albert.hoangduc.ucl@gmail.com; McClelland, Jamie; Modat, Marc; Cardoso, M. Jorge; Mendelson, Alex F. [Center for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom); Eminowicz, Gemma; Mendes, Ruheena; Wong, Swee-Ling; D’Souza, Derek [Radiotherapy Department, University College London Hospitals, 235 Euston Road, London NW1 2BU (United Kingdom); Veiga, Catarina [Department of Medical Physics and Bioengineering, University College London, London WC1E 6BT (United Kingdom); Kadir, Timor [Mirada Medical UK, Oxford Center for Innovation, New Road, Oxford OX1 1BY (United Kingdom); Ourselin, Sebastien [Centre for Medical Image Computing, University College London, London WC1E 6BT (United Kingdom)

    2015-09-15

    Purpose: The aim of this study was to assess whether clinically acceptable segmentations of organs at risk (OARs) in head and neck cancer can be obtained automatically and efficiently using the novel “similarity and truth estimation for propagated segmentations” (STEPS) compared to the traditional “simultaneous truth and performance level estimation” (STAPLE) algorithm. Methods: First, 6 OARs were contoured by 2 radiation oncologists in a dataset of 100 patients with head and neck cancer on planning computed tomography images. Each image in the dataset was then automatically segmented with STAPLE and STEPS using those manual contours. Dice similarity coefficient (DSC) was then used to compare the accuracy of these automatic methods. Second, in a blind experiment, three separate and distinct trained physicians graded manual and automatic segmentations into one of the following three grades: clinically acceptable as determined by universal delineation guidelines (grade A), reasonably acceptable for clinical practice upon manual editing (grade B), and not acceptable (grade C). Finally, STEPS segmentations graded B were selected and one of the physicians manually edited them to grade A. Editing time was recorded. Results: Significant improvements in DSC can be seen when using the STEPS algorithm on large structures such as the brainstem, spinal canal, and left/right parotid compared to the STAPLE algorithm (all p < 0.001). In addition, across all three trained physicians, manual and STEPS segmentation grades were not significantly different for the brainstem, spinal canal, parotid (right/left), and optic chiasm (all p > 0.100). In contrast, STEPS segmentation grades were lower for the eyes (p < 0.001). Across all OARs and all physicians, STEPS produced segmentations graded as well as manual contouring at a rate of 83%, giving a lower bound on this rate of 80% with 95% confidence. Reduction in manual interaction time was on average 61% and 93% when automatic
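The Dice similarity coefficient (DSC) used above to compare the automatic methods is straightforward to compute; a minimal sketch over binary segmentation masks represented as sets of voxel coordinates (the example masks are invented):

```python
def dice(a, b):
    """Dice similarity coefficient between two binary masks given as
    coordinate sets: 2*|A intersect B| / (|A| + |B|); 1.0 = perfect overlap."""
    if not a and not b:
        return 1.0          # two empty masks agree trivially
    return 2 * len(a & b) / (len(a) + len(b))

# Hypothetical 2D masks: an "automatic" and a "manual" contour.
auto = {(0, 0), (0, 1), (1, 0), (1, 1)}
manual = {(0, 1), (1, 0), (1, 1), (2, 1)}
print(dice(auto, manual))  # 0.75
```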

  12. Life Cycle Asset Allocation in the Presence of Housing and Tax-Deferred Investing

    DEFF Research Database (Denmark)

    Marekwica, Marcel; Schaefer, Alexander; Sebastian, Steffen

    2013-01-01

    We study the dynamic consumption-portfolio problem over the life cycle, with respect to tax-deferred investing, for investors who acquire housing services by either renting or owning a home. The joint existence of these two investment vehicles creates potential for tax arbitrage. Specifically, investors can deduct mortgage interest payments from taxable income, while simultaneously earning interest in tax-deferred accounts tax-free. Matching empirical evidence, our model predicts that investors with higher retirement savings choose higher loan-to-value ratios to exploit this tax arbitrage opportunity. However, many households could benefit from more effectively taking advantage of tax arbitrage.

  13. Pulsed feedback defers cellular differentiation.

    Directory of Open Access Journals (Sweden)

    Joe H Levine

    2012-01-01

    Environmental signals induce diverse cellular differentiation programs. In certain systems, cells defer differentiation for extended time periods after the signal appears, proliferating through multiple rounds of cell division before committing to a new fate. How can cells set a deferral time much longer than the cell cycle? Here we study Bacillus subtilis cells that respond to sudden nutrient limitation with multiple rounds of growth and division before differentiating into spores. A well-characterized genetic circuit controls the concentration and phosphorylation of the master regulator Spo0A, which rises to a critical concentration to initiate sporulation. However, it remains unclear how this circuit enables cells to defer sporulation for multiple cell cycles. Using quantitative time-lapse fluorescence microscopy of Spo0A dynamics in individual cells, we observed pulses of Spo0A phosphorylation at a characteristic cell cycle phase. Pulse amplitudes grew systematically and cell-autonomously over multiple cell cycles leading up to sporulation. This pulse growth required a key positive feedback loop involving the sporulation kinases, without which the deferral of sporulation became ultrasensitive to kinase expression. Thus, deferral is controlled by a pulsed positive feedback loop in which kinase expression is activated by pulses of Spo0A phosphorylation. This pulsed positive feedback architecture provides a more robust mechanism for setting deferral times than constitutive kinase expression. Finally, using mathematical modeling, we show how pulsing and time delays together enable "polyphasic" positive feedback, in which different parts of a feedback loop are active at different times. Polyphasic feedback can enable more accurate tuning of long deferral times. Together, these results suggest that Bacillus subtilis uses a pulsed positive feedback loop to implement a "timer" that operates over timescales much longer than a cell cycle.

  14. Extending RTA/Linux with fixed-priority scheduling with deferred preemption

    NARCIS (Netherlands)

    Bergsma, M.; Holenderski, M.J.; Bril, R.J.; Lukkien, J.J.; Petters, S.M.; Zijlstra, P.

    2009-01-01

    Fixed-Priority Scheduling with Deferred Preemption (FPDS) is a middle ground between Fixed-Priority Pre-emptive Scheduling and Fixed-Priority Non-preemptive Scheduling, and offers advantages with respect to context switch overhead and resource access control. In this paper we present our work on

  15. Mixed-Precision Spectral Deferred Correction: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Grout, Ray W. S.

    2015-09-02

    Convergence of spectral deferred correction (SDC), where low-order time integration methods are used to construct higher-order methods through iterative refinement, can be accelerated in terms of computational effort by using mixed-precision methods. Using ideas from multi-level SDC (in turn based on FAS multigrid ideas), some of the SDC correction sweeps can use function values computed in reduced precision without adversely impacting the accuracy of the final solution. This is particularly beneficial for the performance of combustion solvers such as S3D [6] which require double precision accuracy but are performance limited by the cost of data motion.
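The mixed-precision idea can be illustrated on a toy explicit SDC step. This is a hedged sketch, not the S3D or multi-level SDC code: it assumes three equispaced nodes, forward-Euler sweeps and a scalar ODE, and the `sweep_dtype` parameter (an invention of this sketch) stands in for evaluating correction-sweep function values in reduced precision:

```python
import numpy as np

def sdc_step(f, y0, dt, sweeps=2, sweep_dtype=np.float64):
    """One explicit SDC step on [t, t+dt] with 3 equispaced nodes.

    A forward-Euler predictor is refined by `sweeps` correction sweeps;
    each sweep raises the formal order by one (up to the quadrature order).
    `sweep_dtype` lets the previous iterate's function values be evaluated
    in reduced precision, mimicking the mixed-precision idea above.
    """
    h = dt / 2.0                       # nodes at t, t + dt/2, t + dt
    y = [y0, 0.0, 0.0]                 # predictor: forward Euler
    y[1] = y[0] + h * f(y[0])
    y[2] = y[1] + h * f(y[1])
    for _ in range(sweeps):
        fv = [float(f(sweep_dtype(yi))) for yi in y]  # possibly low precision
        # Exact integrals of the quadratic interpolant of f on each subinterval.
        i01 = h / 12.0 * (5 * fv[0] + 8 * fv[1] - fv[2])
        i12 = h / 12.0 * (-fv[0] + 8 * fv[1] + 5 * fv[2])
        z = [y0, 0.0, 0.0]             # correction sweep
        z[1] = z[0] + h * (f(z[0]) - fv[0]) + i01
        z[2] = z[1] + h * (f(z[1]) - fv[1]) + i12
        y = z
    return y[2]

def integrate(f, y0, t_end, n_steps, **kw):
    y, dt = y0, t_end / n_steps
    for _ in range(n_steps):
        y = sdc_step(f, y, dt, **kw)
    return y

# y' = -y, y(0) = 1: compare full- and reduced-precision correction sweeps.
exact = np.exp(-1.0)
full = integrate(lambda y: -y, 1.0, 1.0, 20)
low = integrate(lambda y: -y, 1.0, 1.0, 20, sweep_dtype=np.float32)
```

With two sweeps the step is formally third order, and the float32 sweeps perturb the answer only near single-precision roundoff, which is the effect the abstract exploits for performance.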

  16. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.; Carr, R.

    1995-01-01

    We introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but we find the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long free electron lasers (FELs). Our application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but that it converged significantly faster. We present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules
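Threshold accepting differs from simulated annealing only in the acceptance rule: a move is accepted whenever its cost increase falls below the current threshold, with no probabilistic exp(-Δ/T) test. A hedged toy sketch (the swap moves and the `field_error` cost are illustrative stand-ins, not the authors' insertion-device cost function):

```python
import random

def threshold_accept(cost, state, thresholds, moves_per_level=300, seed=0):
    """Threshold accepting for a permutation problem: accept any swap whose
    cost increase is below the current threshold (no exp(-delta/T) test,
    unlike simulated annealing), lowering the threshold on a fixed schedule."""
    rng = random.Random(seed)
    cur = cost(state)
    best_state, best = state[:], cur
    for thr in thresholds:                 # deterministic "cooling" schedule
        for _ in range(moves_per_level):
            i, j = rng.randrange(len(state)), rng.randrange(len(state))
            state[i], state[j] = state[j], state[i]   # propose a swap
            new = cost(state)
            if new - cur < thr:            # accept, possibly slightly uphill
                cur = new
                if cur < best:
                    best_state, best = state[:], cur
            else:                          # reject: undo the swap
                state[i], state[j] = state[j], state[i]
    return best_state, best

# Hypothetical stand-in cost: penalize large strength jumps between
# neighbouring magnets (a real sorter would model the device's field errors).
def field_error(xs):
    return sum((a - b) ** 2 for a, b in zip(xs, xs[1:]))

rng = random.Random(1)
magnets = [rng.uniform(-1.0, 1.0) for _ in range(30)]
start = field_error(magnets)
ordering, err = threshold_accept(field_error, magnets[:],
                                 [0.3, 0.1, 0.03, 0.01, 0.0])
```

Because acceptance is a plain comparison rather than an exponential draw, each iteration is cheaper than an annealing step, which is the source of the faster convergence reported above.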

  17. Faster magnet sorting with a threshold acceptance algorithm

    International Nuclear Information System (INIS)

    Lidia, S.

    1994-08-01

    The authors introduce here a new technique for sorting magnets to minimize the field errors in permanent magnet insertion devices. Simulated annealing has been used in this role, but they find the technique of threshold acceptance produces results of equal quality in less computer time. Threshold accepting would be of special value in designing very long insertion devices, such as long FEL's. Their application of threshold acceptance to magnet sorting showed that it converged to equivalently low values of the cost function, but that it converged significantly faster. They present typical cases showing time to convergence for various error tolerances, magnet numbers, and temperature schedules

  18. California; Bay Area Air Quality Management District; Determination To Defer Sanctions

    Science.gov (United States)

    EPA is making an interim final determination to defer imposition of sanctions based on a proposed determination that CARB submitted rules on behalf of BAAQMD that satisfy part D of the Clean Air Act for areas under the jurisdiction of the BAAQMD.

  19. Etiology of anemia of blood donor candidates deferred by hematologic screening

    Directory of Open Access Journals (Sweden)

    Michel Alves da Silva

    2012-01-01

    OBJECTIVE: Iron deficiency is the most common cause of anemia and one of the main factors in the clinical deferral of blood donors. This fact prompted the current study, which aimed to determine the prevalence and etiology of anemia in blood donor candidates and to evaluate the hematological screening technique used for the exclusion of these donors. METHODS: This was a prospective study that compared two groups (Anemic and Non-anemic). Initially, screening for anemia was performed by manually measuring hemoglobin (Bioclin® Kit); the results were subsequently compared with an automated screening method (Coulter T-890). The etiology was investigated by hemoglobin electrophoresis at alkaline and acid pH, Hb A2 dosage and measurement of the ferritin concentration by immunoagglutination. Differences and associations of interest were analyzed using the Yates and McNemar's chi-square tests and the Fisher, Mann-Whitney, Wilcoxon and Kruskal-Wallis tests. RESULTS: The deferral rate due to anemia was 4.2%; iron deficiency was identified in 37.5% and beta thalassemia in 9.3% of the excluded candidates. There was a significant discrepancy between the two techniques used to measure hemoglobin, with 38.1% of initially deferred donors presenting normal hemoglobin levels by the automated method. CONCLUSION: The results show a high rate of blood donors being deferred for anemia and confirm that iron deficiency is the most prevalent cause. The discrepancies found by comparing screening methods suggest that hemoglobin and hematocrit levels should be confirmed before deferring a donor due to anemia; this may increase supplies in blood banks.
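McNemar's chi-square test, cited above, compares paired yes/no outcomes (here, deferral decisions from the manual and automated hemoglobin methods on the same donors) using only the discordant pairs. A sketch with hypothetical counts, not the study's data:

```python
from math import erfc, sqrt

def mcnemar(b: int, c: int):
    """McNemar's chi-square with continuity correction for a paired 2x2
    table; b and c are the discordant-pair counts (donors on whom the two
    screening methods disagree)."""
    chi2 = (abs(b - c) - 1) ** 2 / (b + c)
    # p-value from the chi-square(1 df) survival function:
    # P(X > x) = erfc(sqrt(x / 2)) for one degree of freedom.
    p = erfc(sqrt(chi2 / 2.0))
    return chi2, p

# Hypothetical counts: 30 donors deferred only by the manual method,
# 8 deferred only by the automated method.
chi2, p = mcnemar(30, 8)
```

A small p-value here would indicate a systematic disagreement between the two methods, consistent with the discrepancy the study reports.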

  20. Deferred endoscopic urethral realignment: Role in management of traumatic posterior urethral disruption

    Directory of Open Access Journals (Sweden)

    M.A. Elgammal

    2014-06-01

    Conclusion: When early realignment is postponed for any reason, deferred endoscopic realignment is considered an adequate substitute, because urethral continuity can be achieved in a group of patients without an increased incidence of impotence and incontinence.

  1. Short communication: immediate and deferred milk production responses to concentrate supplements in cows grazing fresh pasture.

    Science.gov (United States)

    Roche, J R; Kay, J K; Rius, A G; Grala, T M; Sheahan, A J; White, H M; Phyn, C V C

    2013-04-01

    The objective of this study was to determine the increase in milk production from supplementation that occurred after supplementation ceased. This portion of the total response (i.e., the deferred response), although accepted, is generally not accounted for in short-term component research projects, but it is important in determining the economic impact of supplementary feeding. Fifty-nine multiparous Holstein-Friesian dairy cows were offered a generous allowance of spring pasture [>45 kg of dry matter (DM)/cow per day] and were supplemented with 0, 3, or 6 kg DM/d of pelleted concentrate (half of the allowance at each milking event) in a completely randomized design. Treatments were imposed for the first 12 wk of lactation and were balanced for cow age (5.4 ± 1.68 yr), calving date (July 27 ± 26.0 d), and genetic merit for milk component yield. During the period of supplementation, milk yield and the yield of milk components increased (1.19 kg of milk, 0.032 kg of fat, 0.048 kg of protein, and 0.058 kg of lactose/kg of concentrate DM consumed), but neither body condition score nor body weight was affected. After concentrate supplementation ceased and cows returned to a common diet of fresh pasture, milk and milk component yields remained greater for 3 wk in the cows previously supplemented. During this 3-wk period, cows that previously received 3 and 6 kg of concentrate DM per day produced an additional 2.3 and 4.5 kg of milk/d, 0.10 and 0.14 kg of fat/d, 0.10 and 0.14 kg of protein/d, and 0.10 and 0.19 kg of lactose/d, respectively, relative to unsupplemented cows. This is equivalent to an additional 0.19 kg of milk, 0.006 kg of fat, 0.006 kg of protein, and 0.008 kg of lactose per 1 kg of concentrate DM previously consumed, which would not be accounted for in the immediate response. As a result of this deferred response to supplements, the total milk production benefit of concentrate supplements is between 7% (lactose yield) and 32% (fat yield) greater
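The per-kilogram deferred response quoted above is simple arithmetic: the extra milk produced over the 3-week carry-over period divided by the total concentrate DM consumed during the 12-week (84-day) treatment. A quick check of the 0.19 kg figure for the 3 kg/d group:

```python
# Deferred milk response per kg of concentrate DM previously consumed
# (figures from the abstract: 12 wk of supplementation = 84 d, 3 wk carry-over = 21 d).
extra_milk_per_day = 2.3        # kg/d above unsupplemented cows, 3 kg/d group
carryover_days = 21
concentrate_per_day = 3.0       # kg DM/d during treatment
treatment_days = 84

deferred_per_kg = (extra_milk_per_day * carryover_days) / (concentrate_per_day * treatment_days)
print(round(deferred_per_kg, 2))  # → 0.19, matching the reported value
```

The same calculation for the 6 kg/d group (4.5 × 21 / (6 × 84)) also rounds to 0.19 kg, so the reported per-kilogram figure is internally consistent.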

  2. The American Jobs Creation Act and its impact on deferred compensation: reassessment from a business perspective.

    Science.gov (United States)

    Johnson, David G

    2005-01-01

    The American Jobs Creation Act (AJCA), which was signed into law in October 2004, will have an impact on almost every deferred compensation program in the United States. This article argues that as companies continue to evaluate the transition alternatives under AJCA and contemplate the necessary changes to the plan program, companies also should consider simultaneously addressing broader issues surrounding nonqualified deferred compensation arrangements. These include ongoing business purpose, financial planning considerations, education of participants, corporate governance considerations and the potential implications to international assignees.

  3. How parents and practitioners experience research without prior consent (deferred consent) for emergency research involving children with life threatening conditions: a mixed method study

    Science.gov (United States)

    Woolfall, Kerry; Frith, Lucy; Gamble, Carrol; Gilbert, Ruth; Mok, Quen; Young, Bridget

    2015-01-01

    Objective Alternatives to prospective informed consent to enable children with life-threatening conditions to be entered into trials of emergency treatments are needed. Across Europe, a process called deferred consent has been developed as an alternative. Little is known about the views and experiences of those with first-hand experience of this controversial consent process. To inform how consent is sought for future paediatric critical care trials, we explored the views and experiences of parents and practitioners involved in the CATheter infections in CHildren (CATCH) trial, which allowed for deferred consent in certain circumstances. Design Mixed method survey, interview and focus group study. Participants 275 parents completed a questionnaire; 20 families participated in an interview (18 mothers, 5 fathers). 17 CATCH practitioners participated in one of four focus groups (10 nurses, 3 doctors and 4 clinical trial unit staff). Setting 12 UK children's hospitals. Results Some parents were momentarily shocked or angered to discover that their child had, or could have been, entered into CATCH without their prior consent. These feelings resolved once the reasons why consent needed to be deferred were explained and parents understood that the CATCH interventions were already used in clinical care. Prior to seeking deferred consent for the first few times, CATCH practitioners were apprehensive, although their feelings abated with experience of talking to parents about CATCH. Parents reported that their decisions about their child's participation in the trial had been voluntary. However, mistiming the deferred consent discussion had caused distress for some. Practitioners and parents supported the use of deferred consent in CATCH and in future trials of interventions already used in clinical care.
Conclusions Our study provides evidence to support the use of deferred consent in paediatric emergency medicine; it also indicates the crucial importance of practitioner communication

  4. Pulmonary effects of immediate versus deferred antiretroviral therapy in HIV-positive individuals

    DEFF Research Database (Denmark)

    Kunisaki, Ken M; Niewoehner, Dennis E; Collins, Gary

    2016-01-01

    BACKGROUND: Observational data have been conflicted regarding the potential role of HIV antiretroviral therapy (ART) as a causative factor for, or protective factor against, COPD. We therefore aimed to investigate the effect of immediate versus deferred ART on decline in lung function in HIV...... Services guidelines) either immediately, or deferred until CD4 T-cell counts decreased to 350 per μL or AIDS developed. The randomisation was determined by participation in the parent START study, and was not specific to the substudy. Because of the nature of our study, site investigators and participants...... were not masked to the treatment group assignment; however, the assessors who reviewed the outcomes were masked to the treatment group. The primary outcome was the annual rate of decline in lung function, expressed as the FEV1 slope in mL/year; spirometry was done annually during follow-up for up to 5...

  5. Signalgrass structure during pasture deferment

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2010-04-01

    The experiment was carried out to evaluate tiller numbers and the mass of forage and of its morphological components in Brachiaria decumbens cv. Basilisk pastures during pasture deferment. The treatments were four deferment periods (18, 46, 74 and 121 days), starting on 8 March 2004, in a randomized block design with two replications. The numbers of vegetative (VT), reproductive (RT) and dead (DT) tillers were determined, as well as the masses of green leaf blade (GLBM), green stem (GSM) and dead forage (DFM). During the deferment period, the number of VT decreased (from 1,491 to 944 tillers m-2). RT and DT numbers were not influenced by the deferment period; their averages were 211 and 456 tillers m-2, respectively. The deferment period increased GSM (from 2,965 to 4,877 kg ha-1 of dry matter) and DFM (from 2,324 to 4,823 kg ha-1 of dry matter), but did not influence GLBM (on average, 2,047 kg ha-1 of dry matter). In Viçosa, Minas Gerais State, a B. decumbens pasture fertilized with nitrogen and deferred in early March can remain deferred for about 70 days, reconciling forage quantity with good morphological composition.

  6. Proton Spectroscopy in a Cross-Section of HIV-Positive Asymptomatic Patients Receiving Immediate Compared with Deferred Zidovudine (Concorde Study).

    Science.gov (United States)

    Hall-Craggs, M A; Williams, I G; Wilkinson, I D; Paley, M; Chinn, R J; Chong, W K; Kendall, B E; Harrison, M J; Baldeweg, T; Pugh, K; Riccio, M; Catalan, J; Weller, I V

    1997-01-01

    The purpose of this study was to examine, by proton spectroscopy, whether cerebral metabolites differed between patients taking part in the Concorde study (comparing the efficacy of immediate versus deferred treatment with zidovudine in asymptomatic HIV-infected individuals). Forty-seven HIV-positive male patients [29 immediate, 18 deferred zidovudine] were examined in the last 9 months of the therapeutic trial. Magnetic resonance imaging and proton spectroscopy were performed at 1.5 Tesla using a single voxel placed in the parieto-occipital white matter. No significant difference was found in metabolite ratios between immediate and deferred zidovudine (NA/NA+Cho+Cr 0.52 vs. 0.52). High-quality spectra were acquired in relatively large numbers of patients, and logistically, spectroscopy may be applied to clinical therapeutic studies.

  7. 27 CFR 26.95 - Deferred payment of tax-release of wine.

    Science.gov (United States)

    2010-04-01

    ...-release of wine. 26.95 Section 26.95 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND... ISLANDS Taxpayment of Liquors and Articles in Puerto Rico Wine § 26.95 Deferred payment of tax—release of wine. (a) Action by proprietor. Where the proprietor has furnished bond, on Form 2897, and payment of...

  8. Conservative multi-implicit integral deferred correction methods with adaptive mesh refinement

    International Nuclear Information System (INIS)

    Layton, A.T.

    2004-01-01

    In most models of reacting gas dynamics, the characteristic time scales of chemical reactions are much shorter than the hydrodynamic and diffusive time scales, rendering the reaction part of the model equations stiff. Moreover, nonlinear forcings may introduce into the solutions sharp gradients or shocks, the robust behavior and correct propagation of which require the use of specialized spatial discretization procedures. This study presents high-order conservative methods for the temporal integration of model equations of reacting flows. By means of a method of lines discretization on the flux difference form of the equations, these methods compute approximations to the cell-averaged or finite-volume solution. The temporal discretization is based on a multi-implicit generalization of integral deferred correction methods. The advection term is integrated explicitly, and the diffusion and reaction terms are treated implicitly but independently, with the splitting errors present in traditional operator splitting methods reduced via the integral deferred correction procedure. To reduce computational cost, time steps used to integrate processes with widely-differing time scales may differ in size. (author)

  9. Revisionist integral deferred correction with adaptive step-size control

    KAUST Repository

    Christlieb, Andrew

    2015-03-27

    © 2015 Mathematical Sciences Publishers. Adaptive step-size control is a critical feature for the robust and efficient numerical solution of initial-value problems in ordinary differential equations. In this paper, we show that adaptive step-size control can be incorporated within a family of parallel time integrators known as revisionist integral deferred correction (RIDC) methods. The RIDC framework allows for various strategies to implement stepsize control, and we report results from exploring a few of them.

  10. DEFERRED TAXES GENERATED BY THE CAPITALIZED INTERESTS IN THE AMOUNT

    Directory of Open Access Journals (Sweden)

    PALIU – POPA LUCIA

    2015-08-01

    According to the General Framework for preparing and presenting financial statements elaborated by the IASB, the usefulness of information is provided by qualitative characteristics such as understandability, relevance, reliability and comparability. To be reliable, financial information must not be erroneous, biased or distorting of the entity's position, and one of the elements defining the reliability of information is prudence. Prudential accounting treatments thus affect, on the one hand, the relevance and reliability of accounting information and, on the other, both the producers and the users of financial information, through the economic consequences they generate. From this perspective, and considering that the economic agents involved are not neutral in their choice between neutral, prudent or aggressive accounting practices, we considered it useful to conduct a study on the relevance of accounting information related to deferred taxes generated by interest capitalized in the cost of fixed assets, since recognizing these taxes results in compliance with the principle of prudence in accounting. In this context, and by comparison with the dominant accounting systems, namely the continental system and the Anglo-Saxon system, in which accounting information is characterized as legalistic or, respectively, as addressed to external users (especially investors), the study pursued the following directions: the main differences between the national, European and Anglo-Saxon accounting regulations and the international referential with respect to prudence; the occurrence and evolution of deferred taxes generated by interest capitalized in the cost of fixed assets; and the informational benefits of accounting prudence concerning the reflection of the deferred taxes established by the

  11. The relationship between attention and deferred imitation in 12-month-old infants.

    Science.gov (United States)

    Zmyj, Norbert; Schölmerich, Axel; Daum, Moritz M

    2017-08-01

    Imitation is a frequent behavior in the first years of life, and serves both a social function (e.g., to interact with others) and a cognitive function (e.g., to learn a new skill). Infants differ in their temperament, and temperament might be related to the dominance of one function of imitation. In this study, we investigated whether temperament and deferred imitation are related in 12-month-old infants. Temperament was measured via the Infant Behavior Questionnaire Revised (IBQ-R) and parts of the Laboratory Temperament Assessment Battery (Lab-TAB). Deferred imitation was measured via the Frankfurt Imitation Test for 12-month-olds (FIT-12). Regression analyses revealed that the duration of orienting (IBQ-R) and the latency of the first look away in the Task Orientation task (Lab-TAB) predicted the infants' imitation score. These results suggest that attention-related processes may play a major role when infants start to imitate. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. How parents and practitioners experience research without prior consent (deferred consent) for emergency research involving children with life threatening conditions: a mixed method study.

    Science.gov (United States)

    Woolfall, Kerry; Frith, Lucy; Gamble, Carrol; Gilbert, Ruth; Mok, Quen; Young, Bridget

    2015-09-18

    Alternatives to prospective informed consent to enable children with life-threatening conditions to be entered into trials of emergency treatments are needed. Across Europe, a process called deferred consent has been developed as an alternative. Little is known about the views and experiences of those with first-hand experience of this controversial consent process. To inform how consent is sought for future paediatric critical care trials, we explored the views and experiences of parents and practitioners involved in the CATheter infections in CHildren (CATCH) trial, which allowed for deferred consent in certain circumstances. Mixed method survey, interview and focus group study. 275 parents completed a questionnaire; 20 families participated in an interview (18 mothers, 5 fathers). 17 CATCH practitioners participated in one of four focus groups (10 nurses, 3 doctors and 4 clinical trial unit staff). 12 UK children's hospitals. Some parents were momentarily shocked or angered to discover that their child had, or could have been, entered into CATCH without their prior consent. These feelings resolved once the reasons why consent needed to be deferred were explained and parents understood that the CATCH interventions were already used in clinical care. Prior to seeking deferred consent for the first few times, CATCH practitioners were apprehensive, although their feelings abated with experience of talking to parents about CATCH. Parents reported that their decisions about their child's participation in the trial had been voluntary. However, mistiming the deferred consent discussion had caused distress for some. Practitioners and parents supported the use of deferred consent in CATCH and in future trials of interventions already used in clinical care. Our study provides evidence to support the use of deferred consent in paediatric emergency medicine; it also indicates the crucial importance of practitioner communication and appropriate timing of deferred consent discussions

  13. The African Union at Ten Years Old: A Dream Deferred! | Mbeki ...

    African Journals Online (AJOL)

    In his famous poem, 'Harlem', first published in 1951, the eminent African American poet, writer, thinker and activist, Langston Hughes, asked challenging questions when he wrote: What happens to a dream deferred? Does it dry up like a raisin in the sun? Or fester like a sore – And then run? Does it stink like rotten meat?

  14. 27 CFR 26.104 - Deferred payment of tax-release of beer.

    Science.gov (United States)

    2010-04-01

    ...-release of beer. 26.104 Section 26.104 Alcohol, Tobacco Products and Firearms ALCOHOL AND TOBACCO TAX AND... ISLANDS Taxpayment of Liquors and Articles in Puerto Rico Beer § 26.104 Deferred payment of tax—release of beer. (a) Action by brewer. Where the brewer has furnished bond on Form 2898, and payment of the tax is...

  15. 26 CFR 1.381(c)(10)-1 - Deferred exploration and development expenditures.

    Science.gov (United States)

    2010-04-01

    ... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Deferred exploration and development expenditures. 1.381(c)(10)-1 Section 1.381(c)(10)-1 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) INCOME TAX (CONTINUED) INCOME TAXES Insolvency Reorganizations § 1.381(c)(10...

  16. Production and quality of Panicum coloratum L. deferred forage under two deferment periods and three defoliation times

    Directory of Open Access Journals (Sweden)

    M.R. Steinberg

    2012-06-01

    During winter, livestock production systems in the subtropical areas of Argentina use deferred forage from natural pastures or warm-season grasses, built up from growth accumulated over summer. The aim of this study was to determine the dry matter production, percentages of leaves, stems, crude protein, neutral detergent fiber, acid detergent fiber and ash, and digestibility of deferred forage of Panicum coloratum cv Verde. Two deferment periods were evaluated: total deferral (TD, forage accumulated from spring regrowth) and partial deferral (PD, forage accumulated after a cut in late December), each with three defoliation times: early (May), intermediate (July) and late (August). TD produced more dry matter, but with a high proportion of stems, more fiber and lower percentages of protein and ash, whereas PD had fewer stems and more leaves and, consequently, higher percentages of protein and ash. It is concluded that P. coloratum is a suitable resource for deferral only when used with a short deferment period and early defoliation, since it then retains a minimum protein level sufficient to meet animal requirements without nitrogen supplementation, and digestibility above 55%.

  17. It is out of my hands: how deferring control to God can decrease quality of life for breast cancer patients.

    Science.gov (United States)

    McLaughlin, Bryan; Yoo, Woohyun; D'Angelo, Jonathan; Tsang, Stephanie; Shaw, Bret; Shah, Dhavan; Baker, Timothy; Gustafson, David

    2013-12-01

    This paper seeks to contribute to the understanding of how and why religion affects psychosocial health outcomes. We propose a theoretical model predicting that when women with breast cancer defer control to God they will experience fewer breast cancer related concerns. Deferring control to God, however, should also reduce the likelihood that they take a proactive coping approach, which will be exacerbated by lowered breast cancer concerns. We therefore predict that this passive coping style will ultimately result in lower levels of quality of life. Data were collected as part of a randomized clinical trial funded by the National Cancer Institute. A total of 192 women with breast cancer participated in a computer-mediated social support group. Deferring control to God statements were captured by using computer-aided content analysis of discussion posts. Psychosocial outcomes were measured using longitudinal survey data. Analysis was performed using structural equation modeling. The results of our analysis largely confirm our mediation model for which we find significant model fit. As predicted, deferring control to God leads to lower levels of breast cancer concerns but also to more passive coping styles. Ultimately, deferring control to God can lead to lower levels of quality of life. Our study demonstrates how and why religious coping can lead to both positive and negative psychosocial health outcomes. Health care practitioners should encourage patients who are relying on religion to keep their end of the bargain and maintain an active coping style. Copyright © 2013 John Wiley & Sons, Ltd.

  18. How does the possibility to defer pension payments affect the labour supply of elderly Danish workers?

    DEFF Research Database (Denmark)

    Amilon, Anna; Nielsen, Torben Heien

    2010-01-01

    This chapter investigates the effects of a recent Danish policy reform on the number of hours worked after age 65. In short, the policy reform involved the option to defer pension payments and a reduction in the official retirement age from 67 to 65. Using a quasi-experimental design, we find that the reform has had a small positive impact on the number of hours worked at age 65. In the longer term (from age 65 to 67) the effect disappears, probably due to the reduction in the official retirement age causing people to retire earlier. It is mainly men, the highly educated and people holding advanced positions that choose to defer their pensions. The results, therefore, indicate that the reform has mainly improved the situation of an already well-off group.

  19. Deference, Denial, and Beyond: A Repertoire Approach to Mass Media and Schooling

    Science.gov (United States)

    Rymes, Betsy

    2011-01-01

    In this article, the author outlines two general research approaches, within the education world, to these mass-mediated formations: "Deference" and "Denial." Researchers who recognize the social practices that give local meaning to mass media formations and ways of speaking do not attempt to recontextualize youth media in their own social…

  20. 14 CFR 399.44 - Treatment of deferred Federal income taxes for rate purposes.

    Science.gov (United States)

    2010-01-01

    ... TRANSPORTATION (AVIATION PROCEEDINGS) POLICY STATEMENTS STATEMENTS OF GENERAL POLICY Policies Relating to Rates and Tariffs § 399.44 Treatment of deferred Federal income taxes for rate purposes. For rate-making purposes other than the determination of subsidy under section 406(b), it is the policy of the Board that...

  1. 48 CFR 252.227-7027 - Deferred ordering of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7027 Section 252.227-7027 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(b), use the following clause: Deferred Ordering of Technical Data or Computer Software (APR 1988) In addition to technical data or computer software...

  2. 48 CFR 252.227-7026 - Deferred delivery of technical data or computer software.

    Science.gov (United States)

    2010-10-01

    ... technical data or computer software. 252.227-7026 Section 252.227-7026 Federal Acquisition Regulations... data or computer software. As prescribed at 227.7103-8(a), use the following clause: Deferred Delivery of Technical Data or Computer Software (APR 1988) The Government shall have the right to require, at...

  3. Validation of a spectrophotometer-based method for estimating daily sperm production and deferent duct transit.

    Science.gov (United States)

    Froman, D P; Rhoads, D D

    2012-10-01

    The objectives of the present work were 3-fold. First, a new method for estimating daily sperm production was validated. This method, in turn, was used to evaluate testis output as well as deferent duct throughput. Next, this analytical approach was evaluated in 2 experiments. The first experiment compared left and right reproductive tracts within roosters. The second experiment compared reproductive tract throughput in roosters from low and high sperm mobility lines. Standard curves were constructed from which unknown concentrations of sperm cells and sperm nuclei could be predicted from observed absorbance. In each case, the independent variable was based upon hemacytometer counts, and absorbance was a linear function of concentration. Reproductive tracts were excised, semen recovered from each duct, and the extragonadal sperm reserve determined by multiplying volume by sperm cell concentration. Testicular sperm nuclei were procured by homogenization of a whole testis, overlaying a 20-mL volume of homogenate upon 15% (wt/vol) Accudenz (Accurate Chemical and Scientific Corporation, Westbury, NY), and then washing nuclei by centrifugation through the Accudenz layer. Daily sperm production was determined by dividing the predicted number of sperm nuclei within the homogenate by 4.5 d (i.e., the time sperm with elongated nuclei spend within the testis). Sperm transit through the deferent duct was estimated by dividing the extragonadal reserve by daily sperm production. Neither the efficiency of sperm production (sperm per gram of testicular parenchyma per day) nor deferent duct transit differed between left and right reproductive tracts (P > 0.05). Whereas efficiency of sperm production did not differ (P > 0.05) between low and high sperm mobility lines, deferent duct transit differed between lines (P < 0.001). On average, this process required 2.2 and 1.0 d for low and high lines, respectively. In summary, we developed and then tested a method for quantifying male
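The arithmetic of the method (inverting the linear standard curve, dividing testicular nuclei by the 4.5-d residence time, then dividing the extragonadal reserve by daily production) can be sketched as follows. All numeric values and the calibration fit below are hypothetical placeholders, not data from the study.

```python
# Illustrative sketch of the calculations described above. Every number
# here is a hypothetical placeholder, not a value from the paper.

def concentration_from_absorbance(absorbance, slope, intercept):
    """Invert the linear standard curve A = slope * C + intercept."""
    return (absorbance - intercept) / slope

# Hypothetical calibration fit (hemacytometer counts vs. absorbance).
slope, intercept = 2.0e-9, 0.01     # absorbance units per (nuclei/mL)

# Sperm nuclei recovered from a 20-mL testis homogenate.
nuclei_per_ml = concentration_from_absorbance(0.41, slope, intercept)
total_nuclei = nuclei_per_ml * 20.0

# Daily sperm production: divide by the 4.5 d that sperm with
# elongated nuclei spend within the testis.
dsp = total_nuclei / 4.5

# Extragonadal reserve = semen volume * sperm concentration (hypothetical).
reserve = 0.5 * 4.0e9               # mL * sperm/mL

# Deferent duct transit time (days) = reserve / daily sperm production.
transit_days = reserve / dsp
print(f"DSP = {dsp:.3g} sperm/day, transit = {transit_days:.2f} d")
```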

  4. Pasture height at the beginning of deferment as a determinant of signal grass structure and potential selectivity by cattle - doi: 10.4025/actascianimsci.v35i4.20421

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2013-10-01

    Full Text Available Current experiment identified the height of signal grass {Urochloa decumbens (Stapf R. D. Webster cv. Basilisk [syn. Brachiaria decumbens Stapf cv. Basilisk]} at the beginning of deferment that provided an appropriate pasture structure and potential selectivity by cattle on deferred pastures. Four pasture heights at the beginning of deferment (10, 20, 30 and 40 cm and two forage samples (available on pasture and simulated grazing were studied. The experimental design was set in completely randomized blocks, with two replications, in a split-plot arrangement. Higher percentage of live leaf blades and lower percentage of live stems and senescent forage were recorded in the forage sample from simulated grazing. The increase in pasture height increased the percentage of senescent forage and reduced the percentage of live leaf blades in forage samples. Pasture height at the beginning of deferment did not affect the potential selectivity index by cattle for the percentage of live leaf blade. The potential selectivity index varied quadratically for the percentage of live stems and increased linearly for the percentage of senescent forage with pasture height. A 10-to-20 cm reduction in pasture height at the beginning of deferment improved the structure of deferred signal grass and optimized selectivity by cattle.  

  5. High-order multi-implicit spectral deferred correction methods for problems of reactive flow

    International Nuclear Information System (INIS)

    Bourlioux, Anne; Layton, Anita T.; Minion, Michael L.

    2003-01-01

    Models for reacting flow are typically based on advection-diffusion-reaction (A-D-R) partial differential equations. Many practical cases correspond to situations where the relevant time scales associated with each of the three sub-processes can be widely different, leading to disparate time-step requirements for robust and accurate time-integration. In particular, interesting regimes in combustion correspond to systems in which diffusion and reaction are much faster processes than advection. The numerical strategy introduced in this paper is a general procedure to account for this time-scale disparity. The proposed methods are high-order multi-implicit generalizations of spectral deferred correction methods (MISDC methods), constructed for the temporal integration of A-D-R equations. Spectral deferred correction methods compute a high-order approximation to the solution of a differential equation by using a simple, low-order numerical method to solve a series of correction equations, each of which increases the order of accuracy of the approximation. The key feature of MISDC methods is their flexibility in handling several sub-processes implicitly but independently, while avoiding the splitting errors present in traditional operator-splitting methods and also allowing for different time steps for each process. The stability, accuracy, and efficiency of MISDC methods are first analyzed using a linear model problem and the results are compared to semi-implicit spectral deferred correction methods. Furthermore, numerical tests on simplified reacting flows demonstrate the expected convergence rates for MISDC methods of orders three, four, and five. The gain in efficiency by independently controlling the sub-process time steps is illustrated for nonlinear problems, where reaction and diffusion are much stiffer than advection. Although the paper focuses on this specific time-scales ordering, the generalization to any ordering combination is straightforward
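The correction-sweep idea can be made concrete with a minimal single-rate SDC step: a provisional forward-Euler pass over the quadrature nodes, followed by sweeps that each raise the formal order by one. This is an illustrative sketch of plain SDC with explicit Euler sweeps on three equispaced nodes (the quadrature weights assume exactly those nodes), not the multi-implicit MISDC scheme of the paper.

```python
# Minimal single-rate spectral deferred correction (SDC) sketch for a
# scalar ODE y' = f(t, y). Illustrative only; not the MISDC variant.

def sdc_step(f, t0, y0, h, sweeps=4):
    """One SDC step on [t0, t0 + h] with 3 equispaced nodes."""
    nodes = [t0, t0 + h / 2, t0 + h]
    # Row m integrates the quadratic interpolant of f over
    # [t_m, t_{m+1}], in units of h (SDC integration matrix).
    S = [[5 / 24, 8 / 24, -1 / 24],
         [-1 / 24, 8 / 24, 5 / 24]]
    # Provisional solution: forward Euler across the nodes.
    y = [y0, 0.0, 0.0]
    for m in range(2):
        dt = nodes[m + 1] - nodes[m]
        y[m + 1] = y[m] + dt * f(nodes[m], y[m])
    # Correction sweeps: each sweep raises the order of accuracy by one.
    for _ in range(sweeps):
        F = [f(nodes[m], y[m]) for m in range(3)]
        y_new = [y0, 0.0, 0.0]
        for m in range(2):
            dt = nodes[m + 1] - nodes[m]
            quad = h * sum(S[m][j] * F[j] for j in range(3))
            y_new[m + 1] = (y_new[m]
                            + dt * (f(nodes[m], y_new[m]) - F[m])
                            + quad)
        y = y_new
    return y[2]

# Usage: y' = -y, y(0) = 1; one step of h = 0.1 approximates exp(-0.1).
approx = sdc_step(lambda t, y: -y, 0.0, 1.0, 0.1)
```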

  6. Deferred Compensation for Personnel of Tax-Exempt Universities: Effective Use of Section 403(b) Plans.

    Science.gov (United States)

    Crain, John L.; And Others

    1989-01-01

    Under the Tax Reform Act of 1986 many university employees are no longer able to make tax deductible contributions to an IRA. Several alternative plans of action are discussed including tax-deferred annuities. Tax planning strategies are offered. (MLW)

  7. Affiliation of the beneficiaries of a deferred pension to the CERN Health Insurance Scheme

    CERN Multimedia

    2003-01-01

    Subsequent to the modifications to the Rules and Regulations of the Pension Fund allowing members of the personnel having five years of affiliation to the Fund to opt for a deferred retirement pension, the Organization wishes to recall the rules relating to the affiliation of those beneficiaries to the CERN Health Insurance Scheme (CHIS). In accordance with Articles III 2.02 and VIII 4.02 of the CHIS Rules, beneficiaries of a deferred retirement pension can only be Members of the CHIS as CERN pensioners if they applied to remain Members of the Scheme upon termination of their compulsory membership as a member of the personnel and if their membership has been uninterrupted up to the moment they become CERN pensioners. The applicable contribution for this intermediate period is indicated in Articles III 5.03 and X 1.02 of the CHIS Rules. The amount is revised annually, and is set at 936 CHF/month for 2003. Human Resources Division Tel. 73635

  8. 78 FR 37719 - Interim Final Determination To Defer Sanctions; California; South Coast Air Quality Management...

    Science.gov (United States)

    2013-06-24

    ... Determination To Defer Sanctions; California; South Coast Air Quality Management District AGENCY: Environmental... Quality Management District's (SCAQMD) portion of the California State Implementation Plan (SIP) published... California submitted the ``South Coast Air Quality Management District Proposed Contingency Measures for the...

  9. Overstory cohort survival in an Appalachian hardwood deferment cutting: 35-year results

    Science.gov (United States)

    John P. Brown; Melissa A. Thomas-Van Gundy; Thomas M. Schuler

    2018-01-01

    Deferment cutting is a two-aged regeneration method in which the majority of the stand is harvested and a dispersed component of overstory trees (approximately 15 to 20% of the basal area) is retained for at least one-half rotation and up to one full rotation for reasons other than regeneration. Careful consideration of residual trees, in both characteristics and harvesting,...

  10. What Infant Memory Tells Us about Infantile Amnesia: Long-Term Recall and Deferred Imitation.

    Science.gov (United States)

    Meltzoff, Andrew N.

    1995-01-01

    Long-term recall memory was assessed in 14- and 16-month-olds using a nonverbal method requiring subjects to reenact a past event from memory. The results demonstrated significant deferred imitation after delays of two and four months, and that the toddlers retained and imitated multiple acts. (MDM)

  11. To Defer or To Stand Up? How Offender Formidability Affects Third Party Moral Outrage

    DEFF Research Database (Denmark)

    Jensen, Niels Holm; Petersen, Michael Bang

    2011-01-01

    Deciding whether to defer to or stand up against a formidable exploiter is a complicated decision, as there is both much to lose (formidable individuals are able and prone to retaliate) and much to gain (formidable individuals pose a great future threat). An optimally designed outrage system should

  12. Deferred imitation in 18-month-olds from two cultural contexts: the case of Cameroonian Nso farmer and German-middle class infants.

    Science.gov (United States)

    Borchert, Sonja; Lamm, Bettina; Graf, Frauke; Knopf, Monika

    2013-12-01

    Imitative learning has been described in naturalistic studies for different cultures, but lab-based research studying imitative learning across different cultural contexts is largely absent. Therefore, imitative learning was assessed with 18-month-old German middle-class and Cameroonian Nso farmer infants, representing two highly different eco-cultural contexts associated with different cultural models, psychological autonomy and hierarchical relatedness, by using the deferred imitation paradigm. Study 1 revealed that the infants from both cultural contexts performed a higher number of target actions in the deferred imitation than in the baseline phase. Moreover, German middle-class infants showed a higher mean imitation rate, performing more target actions in the deferred imitation phase than Cameroonian Nso farmer infants. It was speculated that the opportunity to manipulate the test objects directly after the demonstration of the target actions could enhance the mean deferred imitation rate of the Cameroonian Nso farmer infants, which was confirmed in Study 2. Possible explanations for the differences in the number of imitated target actions of German middle-class and Cameroonian Nso farmer infants are discussed, considering the object-related, dyadic setting of the imitation paradigm with respect to the different learning contexts underlying the different cultural models of learning. Copyright © 2013 Elsevier Inc. All rights reserved.

  13. The Response of Deferred Executive Compensation to Changes in Tax Rates

    OpenAIRE

    Aspen Gorry; Kevin A. Hassett; R. Glenn Hubbard; Aparna Mathur

    2015-01-01

    Given the increasing use of stock options in executive compensation, we examine how taxes influence the choice of compensation and document that income deferral is an important margin of adjustment in response to tax rate changes. To account for this option in the empirical analysis, we explore deferral by estimating how executives’ choice of compensation between current and deferred income depends on changes in tax policy. Our empirical results suggest a significant impact of taxes on the co...

  14. Discordant human T-lymphotropic virus screening with Western blot confirmation: evaluation of the dual-test algorithm for US blood donations.

    Science.gov (United States)

    Stramer, Susan L; Townsend, Rebecca L; Foster, Gregory A; Johnson, Ramona; Weixlmann, Barbara; Dodd, Roger Y

    2018-03-01

    Human T-lymphotropic virus (HTLV) blood donation screening has used a dual-testing algorithm beginning with either a chemiluminescent immunoassay or enzyme-linked immunosorbent screening assay (ELISA). Before the availability of a licensed HTLV supplemental assay, repeat-reactive (RR) samples on a first assay (Assay 1) were retested with a second screening assay (Assay 2). Donors with RR results by Assay 2 were deferred from blood donation and further tested using an unlicensed supplemental test to confirm reactivity while nonreactive (NR) donors remained eligible for donation until RR on a subsequent donation. This "dual-test" algorithm was replaced in May 2016 with the requirement that all RRs by Assay 1 be further tested by a licensed HTLV supplemental test (Western blot [WB]). In this study, we have requalified the dual-test algorithm using the available licensed HTLV WB. We tested 100 randomly selected HTLV RRs on screening Assay 1 (Abbott PRISM chemiluminescent immunoassay) but NR on screening Assay 2 (Avioq ELISA) by a Food and Drug Administration-licensed WB (MP Biomedicals) to ensure that no confirmed positives were among those that were RR by Assay 1 but NR by Assay 2. Of the 100 samples evaluated, 79 of 100 were WB seronegative, 21 of 100 indeterminate, and 0 of 100 seropositive. Of the 79 of 100 seronegative specimens, 73 of 79 did not express any bands on WB. We demonstrated that none of the 100 samples RR on Assay 1 but NR on Assay 2 were confirmed positive. This algorithm prevents such donors from requiring further testing and from being deferred. © 2018 AABB.
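The dual-test disposition logic described above is simple enough to state as code. This is a hedged sketch of the decision rule as the abstract describes it, with illustrative function and label names, not software from a blood-bank system.

```python
# Sketch of the dual-test HTLV screening algorithm described above.
# Labels and the function name are illustrative, not an operational API.

def dual_test_disposition(assay1_rr: bool, assay2_rr: bool) -> str:
    """Return the donor disposition under the dual-test algorithm."""
    if not assay1_rr:
        return "eligible"          # non-reactive on screening Assay 1
    if not assay2_rr:
        # Repeat-reactive on Assay 1 but non-reactive on Assay 2:
        # donor remains eligible; no Western blot is required.
        return "eligible"
    # Repeat-reactive on both screening assays: defer the donor and
    # confirm with a licensed supplemental test.
    return "deferred + supplemental test"

# The study's point: of 100 Assay-1-RR / Assay-2-NR samples, 0 were WB
# seropositive, so routing them to "eligible" misses no true positives.
print(dual_test_disposition(True, False))   # eligible
```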

  15. ACCOUNTING – TAXATION REPORT IN TERMS OF DEFERRED TAXES ON ASSETS REVALUATION

    Directory of Open Access Journals (Sweden)

    PALIU – POPA LUCIA

    2014-12-01

    Full Text Available There has always been, and always will be, a relationship between accounting and taxation; the ongoing debate concerns its intensity, the interrelation between the two, and the reciprocal effects they generate. Profit is the "wealth" achieved by the economic entity, the share due to shareholders after paying income tax where applicable, so profit strongly influences how income tax is determined and the accounting treatment it receives, which in turn depends on the accounting cultures competing for supremacy, namely the European and the Anglo-Saxon accounting cultures. Because the users of financial statement information seek to assess a company's performance and profitability in general, and because income tax is the element that raises the most debate on the relationship between accounting and taxation, we deemed it useful to study the accounting-taxation relationship in terms of deferred taxes related to asset revaluation. Recording the deferred tax amount for each type of temporary difference eliminates tax effects from accounting, with the aim of revealing the entity's real earnings rather than its fiscal result, which is a step toward disconnecting accounting from taxation.
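The mechanics behind a deferred tax on revaluation are a short piece of arithmetic: the uplift in the asset's carrying amount is a taxable temporary difference, and a deferred tax liability is recognized on it. The figures below are hypothetical, chosen only to illustrate the calculation.

```python
# Illustrative arithmetic (hypothetical figures) for the deferred tax
# created by an asset revaluation: the uplift in carrying amount is a
# taxable temporary difference, taxed at the applicable rate.

carrying_amount = 100_000.0   # book value before revaluation
revalued_amount = 130_000.0   # fair value after revaluation
tax_rate = 0.16               # hypothetical income tax rate

temporary_difference = revalued_amount - carrying_amount   # 30,000
deferred_tax_liability = temporary_difference * tax_rate   # 4,800

# The revaluation surplus reported in equity is shown net of that tax.
net_surplus = temporary_difference - deferred_tax_liability  # 25,200
```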

  16. Pension Accounting and Reporting with Other Comprehensive Income and Deferred Taxes: A Worksheet Approach

    Science.gov (United States)

    Jackson, Robert E.; Sneathen, L. Dwight, Jr.; Veal, Timothy R.

    2012-01-01

    This instructional tool presents pension accounting using a worksheet approach where debits equal credits for both the employer and for the plan. Transactions associated with the initiation of the plan through the end of the second year of the plan are presented, including their impact on accumulated other comprehensive income and deferred taxes.…

  17. 75 FR 25856 - Release of Exposure Draft on Definitional Changes Related to Deferred Maintenance and Repairs...

    Science.gov (United States)

    2010-05-10

    ... Property, Plant, and Equipment. The proposed Exposure Draft represents a first step toward improving... Related to Deferred Maintenance and Repairs: Amending SFFAS 6, Accounting for Property, Plant, and Equipment AGENCY: Federal Accounting Standards Advisory Board. ACTION: Notice. Board Action: Pursuant to 31...

  18. Deferred Imitation in 9-Month-Olds: How Do Model and Task Characteristics Matter across Cultures?

    Science.gov (United States)

    Teiser, Johanna; Lamm, Bettina; Böning, Mirjam; Graf, Frauke; Gudi, Helene; Goertz, Claudia; Fassbender, Ina; Freitag, Claudia; Spangler, Sibylle; Teubert, Manuel; Lohaus, Arnold; Schwarzer, Gudrun; Knopf, Monika; Keller, Heidi

    2014-01-01

    Studies investigating imitation are usually conducted with adult models in Western contexts; therefore, the influence of cultural context and the model's age on infants' imitation is largely unknown. This study assessed deferred imitation in 9-month-old infants from the German middle-class (N = 44) and the ethnic group of Nso in rural…

  19. Caracterização dos perfilhos em pastos de capim-braquiária diferidos e adubados com nitrogênio Tiller characteristics in nitrogen fertilized and deferred signalgrass pastures

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2009-04-01

    Full Text Available This work aimed to evaluate tillering and the structural characteristics of tillers in Brachiaria decumbens cv. Basilisk pastures under three deferment periods (73, 95 and 116 days) and four nitrogen (N) doses (0, 40, 80 and 120 kg/ha). A randomized block design with three replicates in a split-plot arrangement was used. The number of vegetative tillers (VT) decreased as the deferment period increased. There was an interaction between deferment period and N dose for VT population density. The longest deferment period increased the number of reproductive tillers (RT) in the signalgrass pastures, but N dose had no effect on the number of reproductive tillers. The number of dead tillers was not influenced by the factors studied. Both the deferment period and nitrogen fertilization increased the weight of all tiller categories of signalgrass. The structural characteristics of the tillers in the deferred pastures were altered by the deferment period and the N dose. Reducing the deferment period and applying nitrogen fertilization are adequate management strategies to increase the number of vegetative tillers in Brachiaria decumbens cv. Basilisk pastures.

  20. Why birds with deferred sexual maturity are sedentary on islands: a systematic review.

    Directory of Open Access Journals (Sweden)

    Miguel Ferrer

    Full Text Available BACKGROUND: Island faunas have played central roles in the development of evolutionary biology and ecology. Birds are among the most studied organisms on islands, in part because of their dispersal powers linked to migration. Even so, we lack information about differences in the movement ecology of island versus mainland populations of birds. METHODOLOGY/PRINCIPAL FINDINGS: Here we present a new general pattern indicating that large birds with deferred sexual maturity are sedentary on islands, and that they become so even when they are migratory on the mainland. Density-dependent variation in the age at first breeding affects the survivorship of insular populations and this, in turn, affects the movement ecology of large birds. Because density-dependent variation in the age of first breeding is critical to the long-term survival of small isolated populations of long-lived species, migratory forms can successfully colonize islands only if they become sedentary once there. Analyses of the movement ecology of continental and insular populations of 314 species of raptors, 113 species of Ciconiiformes and 136 species of passerines, along with individual-based population simulations, confirm this prediction. CONCLUSIONS: This finding has several consequences for the speciation, colonization and survival of small isolated populations of species with deferred sexual maturity.

  1. Towards best-case response times of real-time tasks under fixed-priority scheduling with deferred preemption

    NARCIS (Netherlands)

    Bril, R.J.; Verhaegh, W.F.J.; Puaut, I.

    2005-01-01

    In this paper, we present lower bounds for best-case response times of periodic tasks under fixed-priority scheduling with deferred preemption (FPDS) and arbitrary phasing. Our analysis is based on a dedicated conjecture for an optimal instant, and uses the notion of best-case occupied time. We

  2. 34 CFR 611.44 - Under what circumstances may the Secretary defer a scholarship recipient's service obligation?

    Science.gov (United States)

    2010-07-01

    ... a scholarship recipient's service obligation? (a) Upon written request, the Secretary may defer a service obligation for a scholarship recipient who— (1) Has not begun teaching in a high-need school of a... scholarship recipient's service obligation? 611.44 Section 611.44 Education Regulations of the Offices of the...

  3. 76 FR 56116 - Interim Final Determination To Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution...

    Science.gov (United States)

    2011-09-12

    ... Determination To Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution Control District AGENCY... on a proposed approval of revisions to the San Joaquin Valley Unified Air Pollution Control District... Part 52 Environmental protection, Air pollution control, Incorporation by reference, Intergovernmental...

  4. 76 FR 56114 - Interim Final Determination to Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution...

    Science.gov (United States)

    2011-09-12

    ... Determination to Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution Control District AGENCY... on a proposed approval of revisions to the San Joaquin Valley Unified Air Pollution Control District... Part 52 Environmental protection, Air pollution control, Incorporation by reference, Intergovernmental...

  5. 76 FR 59254 - Interim Final Determination To Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution...

    Science.gov (United States)

    2011-09-26

    ... Determination To Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution Control District AGENCY... on a proposed approval of revisions to the San Joaquin Valley Unified Air Pollution Control District...)(2)). List of Subjects in 40 CFR Part 52 Environmental protection, Air pollution control...

  6. [Evaluation of the efficacy of medical screening of blood donors on preventing blood transfusion-transmitted infectious agents].

    Science.gov (United States)

    Seck, M; Dièye, B; Guèye, Y B; Faye, B F; Senghor, A B; Toure, S A; Dieng, N; Sall, A; Toure, A O; Dièye, T N; Diop, S

    2016-05-01

    The aim of this study was to evaluate the efficacy of medical screening to retain blood donors in window period by comparing the seroprevalence of infectious agents (HIV, hepatitis B and C, syphilis) in deferred versus accepted blood donors. This prospective and transversal study was performed during 4 months in the National Blood Transfusion Center in Dakar (Senegal). We conducted a convenience sampling comparing the seroprevalence of infectious agents (HIV, HBsAg, HCV and syphilis) in deferred versus accepted blood donors after medical selection. In total, 8219 blood donors were included. Medical selection had authorized 8048 donors (97.92%) and deferred donors were 171 (2.08%). The prevalence of HIV was higher in the deferred than in accepted blood donors (1.75% vs. 0.05%) (P=0.0003; OR=35.91), as well as for HBsAg (12.87% vs. 7.35%) (P=0.006; OR=1.86). HCV antibodies were present in 0.71% of accepted blood donors and 0.58% in deferred blood donors (P=0.65; OR=0.82). Only accepted donors had brought the infection of syphilis (0.34%) (P=0.56; OR=0). Medical selection is efficient to exclude blood donors at high risk of HIV transmission and to a lesser extent of HBV. However, current medical screening procedures do not allow us to exclude donors asymptomatic carriers of HCV and syphilis. Copyright © 2015 Elsevier Masson SAS. All rights reserved.
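The reported HIV odds ratio can be reproduced from the counts implied by the prevalences: 1.75% of 171 deferred donors is about 3 cases, and 0.05% of 8048 accepted donors is about 4 cases. The back-calculated counts are an inference from the reported percentages, not figures stated directly in the abstract.

```python
# Recomputing the HIV odds ratio from counts back-calculated out of the
# reported prevalences (an inference: ~3/171 deferred, ~4/8048 accepted).

def odds_ratio(cases_a, total_a, cases_b, total_b):
    """OR comparing group A's odds of infection with group B's."""
    odds_a = cases_a / (total_a - cases_a)
    odds_b = cases_b / (total_b - cases_b)
    return odds_a / odds_b

or_hiv = odds_ratio(3, 171, 4, 8048)
print(round(or_hiv, 2))   # 35.91, matching the reported OR
```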

  7. Techniques Applied for Accounting of Revenues with Deferred Payments

    Directory of Open Access Journals (Sweden)

    Traian Cristin Nicolae

    2016-01-01

    Full Text Available In entity accounting, the difference between income and gains is not always clear. Gains arise from transactions associated with operations that are not ordinary, and they can sometimes be significant. For example, the headquarters in which a company operates can later be sold to realize a profit on the sale. An investment company can hold properties because they can generate revenue through sales; when such properties are sold, the resulting gain is included in the total turnover, and such a company carries those properties in the current assets category. Normally, other businesses record such property differently, and any gain from its sale is reflected in an account that is not part of the turnover. The paper describes specific aspects of accounting for revenues from sales with deferred payments.

  8. A comparison of high-order explicit Runge–Kutta, extrapolation, and deferred correction methods in serial and parallel

    KAUST Repository

    Ketcheson, David I.

    2014-06-13

    We compare the three main types of high-order one-step initial value solvers: extrapolation, spectral deferred correction, and embedded Runge–Kutta pairs. We consider orders four through twelve, including both serial and parallel implementations. We cast extrapolation and deferred correction methods as fixed-order Runge–Kutta methods, providing a natural framework for the comparison. The stability and accuracy properties of the methods are analyzed by theoretical measures, and these are compared with the results of numerical tests. In serial, the eighth-order pair of Prince and Dormand (DOP8) is most efficient. But other high-order methods can be more efficient than DOP8 when implemented in parallel. This is demonstrated by comparing a parallelized version of the well-known ODEX code with the (serial) DOP853 code. For an N-body problem with N = 400, the experimental extrapolation code is as fast as the tuned Runge–Kutta pair at loose tolerances, and is up to two times as fast at tight tolerances.
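Casting extrapolation as a fixed-order Runge–Kutta method can be illustrated with the smallest possible case: Richardson-extrapolating forward Euler (one step of size h against two of size h/2) reduces algebraically to the explicit midpoint rule, a two-stage second-order RK method. The sketch below is illustrative and not code from the paper.

```python
# Smallest example of extrapolation-as-RK: Richardson extrapolation of
# forward Euler equals the explicit midpoint rule (2-stage RK2).

def euler_extrap_step(f, t, y, h):
    """One extrapolated step: combine one h step with two h/2 steps."""
    y1 = y + h * f(t, y)                  # one Euler step of size h
    ya = y + (h / 2) * f(t, y)            # two Euler steps of size h/2
    y2 = ya + (h / 2) * f(t + h / 2, ya)
    return 2 * y2 - y1                    # cancels the O(h^2) error term

def midpoint_step(f, t, y, h):
    """Explicit midpoint rule; algebraically the same map as above."""
    k1 = f(t, y)
    return y + h * f(t + h / 2, y + (h / 2) * k1)

# Usage: y' = y, y(0) = 1, integrated to t = 1 with second-order
# accuracy; y should approximate e = 2.71828...
y, t, h = 1.0, 0.0, 0.01
while t < 1.0 - 1e-12:
    y = euler_extrap_step(lambda t, y: y, t, y, h)
    t += h
```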

  9. Binar Sort: A Linear Generalized Sorting Algorithm

    OpenAIRE

    Gilreath, William F.

    2008-01-01

    Sorting is a common and ubiquitous activity for computers. It is not surprising that there exist a plethora of sorting algorithms. For all the sorting algorithms, it is an accepted performance limit that sorting algorithms are linearithmic or O(N lg N). The linearithmic lower bound in performance stems from the fact that the sorting algorithms use the ordering property of the data. The sorting algorithm uses comparison by the ordering property to arrange the data elements from an initial perm...
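The O(N lg N) bound applies only to algorithms that sort by comparing elements. As a contrast, a sort that inspects bits instead of performing comparisons runs in O(N · b) for N non-negative integers of b bits. The block below is an ordinary least-significant-bit radix sort shown purely to illustrate that point; it is not the binar sort of the paper.

```python
# Ordinary LSD radix sort on bits -- shown only to illustrate that
# non-comparison sorts escape the O(N lg N) comparison bound; this is
# NOT the binar sort described in the paper.

def binary_radix_sort(keys):
    """Sort non-negative ints in O(N * b), b = bits in the maximum key."""
    bits = max(keys).bit_length() if keys else 0
    out = list(keys)
    for bit in range(bits):
        zeros = [k for k in out if not (k >> bit) & 1]
        ones = [k for k in out if (k >> bit) & 1]
        out = zeros + ones        # stable: preserves order from lower bits
    return out

print(binary_radix_sort([5, 3, 8, 1, 2]))   # [1, 2, 3, 5, 8]
```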

  10. PREDICTIVE VALUE OF THE DEFERRED TAXES GENERATED BY THE SUBVENTIONS FOR INVESTMENTS – ESSENTIAL ELEMENT FOR PRESENTING THE INFORMATION IN THE FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    PALIU – POPA LUCIA

    2015-12-01

    Full Text Available Most of the information underlying investment decisions at company level is provided by accounting, which has become a common language of business on international markets; accounting standardization has accordingly been extended from the national to the international level, driven by the need for comparability and transparency in entities' financial statements regardless of the geopolitical area in which they were prepared. These concerns justify efforts to improve both accounting treatments and the procedures for preparing and presenting data in the financial statements, so that users benefit from credible and transparent information. One major development regarding entity performance is the preparation of a single statement of company performance, the statement of comprehensive income, whose primary objective is to facilitate performance forecasting; within it, the deferred taxes generated by subsidies for investments are an essential element with important predictive value. In this context, starting from the main differences between national and Anglo-Saxon accounting regulations and the international reference framework with respect to the predictive value of deferred taxes, and continuing with the occurrence and evolution of the deferred taxes generated by subsidies for investments, the study aims to highlight the predictive value that this information in the annual financial statements provides to users.

  11. ACCEPT: Introduction of the Adverse Condition and Critical Event Prediction Toolbox

    Science.gov (United States)

    Martin, Rodney A.; Santanu, Das; Janakiraman, Vijay Manikandan; Hosein, Stefan

    2015-01-01

    The prediction of anomalies or adverse events is a challenging task, and there are a variety of methods which can be used to address the problem. In this paper, we introduce a generic framework developed in MATLAB® called ACCEPT (Adverse Condition and Critical Event Prediction Toolbox). ACCEPT is an architectural framework designed to compare and contrast the performance of a variety of machine learning and early warning algorithms, and tests the capability of these algorithms to robustly predict the onset of adverse events in any time-series data generating systems or processes.

  12. Foresight beyond the very next event: Four-year-olds can link past and deferred future episodes

    Directory of Open Access Journals (Sweden)

    Jonathan eRedshaw

    2013-07-01

    Full Text Available Previous experiments have demonstrated that by four years of age children can use information from a past episode to solve a problem for the very next future episode. However, it remained unclear whether four-year-olds can similarly use such information to solve a problem for a more removed future episode that is not of immediate concern. In the current study we introduced four-year-olds to problems in one room before taking them to another room and distracting them for 15 minutes. The children were then offered a choice of items to place into a bucket that was to be taken back to the first room when a five-minute sand-timer had completed a cycle. Across two conceptually distinct domains, the children placed the item that could solve the deferred future problem above chance level. This result demonstrates that by 48 months many children can recall a problem from the past and act in the present to solve that problem for a deferred future episode. We discuss implications for theories about the nature of episodic foresight.

  13. Repeat Courses of Stereotactic Radiosurgery (SRS), Deferring Whole-Brain Irradiation, for New Brain Metastases After Initial SRS

    Energy Technology Data Exchange (ETDEWEB)

    Shultz, David B.; Modlin, Leslie A.; Jayachandran, Priya; Von Eyben, Rie; Gibbs, Iris C. [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, California (United States); Choi, Clara Y.H. [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, California (United States); Department of Radiation Oncology, Santa Clara Valley Medical Center, San Jose, California (United States); Chang, Steven D.; Harsh, Griffith R.; Li, Gordon; Adler, John R. [Department of Neurosurgery, Stanford University School of Medicine, Stanford, California (United States); Hancock, Steven L. [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, California (United States); Soltys, Scott G., E-mail: sgsoltys@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, California (United States)

    2015-08-01

    Purpose: To report the outcomes of repeat stereotactic radiosurgery (SRS), deferring whole-brain radiation therapy (WBRT), for distant intracranial recurrences and identify factors associated with prolonged overall survival (OS). Patients and Methods: We retrospectively identified 652 metastases in 95 patients treated with 2 or more courses of SRS for brain metastases, deferring WBRT. Cox regression analyzed factors predictive for OS. Results: Patients had a median of 2 metastases (range, 1-14) treated per course, with a median of 2 courses (range, 2-14) of SRS per patient. With a median follow-up after first SRS of 15 months (range, 3-98 months), the median OS from the time of the first and second course of SRS was 18 (95% confidence interval [CI] 15-24) and 11 months (95% CI 6-17), respectively. On multivariate analysis, histology, graded prognostic assessment score, aggregate tumor volume (but not number of metastases), and performance status correlated with OS. The 1-year cumulative incidence, with death as a competing risk, of local failure was 5% (95% CI 4-8%). Eighteen (24%) of 75 deaths were from neurologic causes. Nineteen patients (20%) eventually received WBRT. Adverse radiation events developed in 2% of SRS sites. Conclusion: Multiple courses of SRS, deferring WBRT, for distant brain metastases after initial SRS, seem to be a safe and effective approach. The graded prognostic assessment score, updated at each course, and aggregate tumor volume may help select patients in whom the deferral of WBRT might be most beneficial.

  14. Qualitative evaluation of a deferred consent process in paediatric emergency research: a PREDICT study

    OpenAIRE

    Furyk, Jeremy; McBain-Rigg, Kristin; Watt, Kerrianne; Emeto, Theophilus I; Franklin, Richard C; Franklin, Donna; Schibler, Andreas; Dalziel, Stuart R; Babl, Franz E; Wilson, Catherine; Phillips, Natalie; Ray, Robin

    2017-01-01

    Background A challenge of conducting research in critically ill children is that the therapeutic window for the intervention may be too short to seek informed consent prior to enrolment. In specific circumstances, most international ethical guidelines allow for children to be enrolled in research with informed consent obtained later, termed deferred consent (DC) or retrospective consent. There is a paucity of data on the attitudes of parents to this method of enrolment in paediatric emergency...

  15. On factoring RSA modulus using random-restart hill-climbing algorithm and Pollard’s rho algorithm

    Science.gov (United States)

    Budiman, M. A.; Rachmawati, D.

    2017-12-01

    The security of the widely used RSA public-key cryptography algorithm depends on the difficulty of factoring a big integer into two large prime numbers. For many years, the integer factorization problem has been intensively and extensively studied in the field of number theory, and as a result many deterministic algorithms, such as Euler's algorithm, Kraitchik's algorithm, and variants of Pollard's algorithms, have been researched comprehensively. Our study takes a rather uncommon approach: rather than making use of intensive number theory, we attempt to factorize the RSA modulus n by using the random-restart hill-climbing algorithm, which belongs to the class of metaheuristic algorithms. The factorization time for RSA moduli of different lengths is recorded and compared with the factorization time of Pollard's rho algorithm, which is a deterministic algorithm. Our experimental results indicate that while the random-restart hill-climbing algorithm is an acceptable candidate for factorizing smaller RSA moduli, its factorization speed is much slower than that of Pollard's rho algorithm.
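For reference, the Pollard's rho baseline the authors compare against fits in a few lines. This sketch uses Floyd cycle detection and a small composite for illustration; it is a textbook version, not the authors' implementation:

```python
from math import gcd

def pollard_rho(n, c=1):
    """Pollard's rho with Floyd cycle detection: iterate x -> x^2 + c
    (mod n) at two speeds and take gcd(|x - y|, n) each step."""
    if n % 2 == 0:
        return 2
    x = y = 2
    d = 1
    while d == 1:
        x = (x * x + c) % n          # tortoise: one step
        y = (y * y + c) % n
        y = (y * y + c) % n          # hare: two steps
        d = gcd(abs(x - y), n)
    return d if d != n else None     # None means retry with another c

print(pollard_rho(8051))  # 8051 = 83 * 97, so this prints 97
```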

  16. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Shi-hua Zhan

    2016-01-01

    Full Text Available The simulated annealing (SA) algorithm is a popular intelligent optimization algorithm that has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm for the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature: a list of temperatures is created first, and the maximum temperature in the list is then used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and parameter sensitivity of the list-based cooling schedule are illustrated on benchmark TSP instances. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
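The list-based cooling idea can be sketched as follows. This is an illustrative reconstruction from the abstract on a toy four-city instance; the function names, the 2-opt neighbourhood, and the exact list-update rule are assumptions, not the authors' code:

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def lbsa_tsp(dist, list_len=20, iters=2000, seed=0):
    """List-based SA sketch: the maximum temperature in a list drives the
    Metropolis test, and accepted uphill moves refresh the list."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    cur = tour_length(tour, dist)
    # seed the temperature list from random 2-opt perturbations
    temps = []
    for _ in range(list_len):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        temps.append(abs(tour_length(cand, dist) - cur) + 1e-9)
    best, best_len = tour[:], cur
    for _ in range(iters):
        t_max = max(temps)
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        cand_len = tour_length(cand, dist)
        delta = cand_len - cur
        if delta <= 0 or rng.random() < math.exp(-delta / t_max):
            if delta > 0:
                # adapt the list: replace t_max with the temperature
                # implied by accepting this particular uphill move
                temps.remove(t_max)
                temps.append(-delta / math.log(max(rng.random(), 1e-12)))
            tour, cur = cand, cand_len
            if cur < best_len:
                best, best_len = tour[:], cur
    return best, best_len

# four cities on a unit square, deliberately listed in a suboptimal order
pts = [(0, 0), (1, 1), (0, 1), (1, 0)]
dist = [[math.dist(a, b) for b in pts] for a in pts]
best, best_len = lbsa_tsp(dist)
print(best_len)  # the optimal square tour has length 4.0
```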

  17. Immediate or deferred adjustment of drug regimens in multidose drug dispensing systems.

    Science.gov (United States)

    Mertens, Bram J; Kwint, Henk-Frans; van Marum, Rob J; Bouvy, Marcel L

    2018-05-18

    Multidose drug dispensing (MDD) is used to help patients take their medicines appropriately. Little is known about drug regimen changes within these MDD systems and how they are effectuated by the community pharmacist. Manual immediate adjustments of the MDD system can introduce dispensing errors; MDD guidelines therefore recommend effectuating drug regimen changes at the start of a new MDD system. The aim of this study was to investigate the frequency, type, procedure followed, immediate necessity, and time taken to make MDD adjustments. This was a cross-sectional study in eight community pharmacies in the Netherlands. All adjustments to MDD systems were systematically documented for 3 weeks by the community pharmacist. Overall, 261 MDD adjustments involving 364 drug changes were documented for 250 patients: 127 (35%) drug changes involved the addition of a new drug, 124 (34%) a change in dosage, and 95 (26%) drug discontinuation. Of the MDD adjustments, 135 (52%) were effectuated immediately: 81 (31%) by adjusting the MDD system manually, 49 (19%) by temporarily dispensing the drug separately from the MDD system, and 5 (2%) by ordering a new MDD system. Pharmacists considered that 36 (27%) of the immediate MDD adjustments could have been deferred until the next MDD system was produced. Immediate adjustment took significantly longer than deferred adjustment (p < 0.001). This study shows that in patients using MDD systems, over half of the drug regimen changes are adjusted immediately. The necessity of these immediate changes should be critically evaluated. Copyright © 2018. Published by Elsevier Inc.

  18. Spatial and temporal variability of vegetation on deferred signalgrass pastures

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2010-04-01

    Full Text Available The spatial and temporal variability of characteristics describing the condition of deferred pastures of Brachiaria decumbens cv. Basilisk (signalgrass) was evaluated. Treatments consisted of combinations of pasture deferment periods (73, 103, 131 and 163 days) with grazing periods (29, 57 and 85 days). A split-plot scheme in a randomized block design with two replicates was used. The dispersion of the values of pasture height (PH), stretched plant height (SPH) and the falling index of the pasture was determined. The spatial variability of pasture height increased linearly with the deferment period but was not influenced by the grazing period. The coefficient of variation of stretched plant height decreased linearly in pastures subjected to the longest deferment periods and was not affected by the grazing period. The variability of the falling index, however, showed a quadratic response to the deferment period, with a maximum coefficient of variation of 38.25% at 130 days. In pastures deferred for a short period (73 days), there was a negative variation of the coefficient of variation during the grazing period. Signalgrass pastures deferred for long periods show greater variability in pasture height and less heterogeneity in stretched plant height. Besides the temporal effect, great spatial variability occurs in deferred signalgrass pastures.

  19. Algorithmic alternatives

    International Nuclear Information System (INIS)

    Creutz, M.

    1987-11-01

    A large variety of Monte Carlo algorithms are being used for lattice gauge simulations. For purely bosonic theories, present approaches are generally adequate; nevertheless, overrelaxation techniques promise savings of about a factor of three in computer time. For fermionic fields the situation is more difficult and less clear. Algorithms that involve an extrapolation to a vanishing step size are all quite closely related, while methods that do not require such an approximation tend to require computer time which grows as the square of the volume of the system. Recent developments combining global accept/reject stages with Langevin or microcanonical updating promise to reduce this growth to V^(4/3).
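The accept/reject stage referred to here is the Metropolis test. A toy Monte Carlo update for a single bosonic degree of freedom with a Gaussian action (an illustrative sketch, not lattice code) looks like:

```python
import math
import random

def metropolis_chain(action, x0=0.0, step=1.0, n_steps=20000, seed=1):
    """Metropolis algorithm: propose a local change, then accept it
    with probability min(1, exp(-dS)), where dS is the action change."""
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        x_new = x + rng.uniform(-step, step)
        dS = action(x_new) - action(x)
        # accept/reject stage: always accept downhill, sometimes uphill
        if dS <= 0 or rng.random() < math.exp(-dS):
            x = x_new
        samples.append(x)
    return samples

# Gaussian action S(x) = x^2/2 should reproduce unit variance
samples = metropolis_chain(lambda x: 0.5 * x * x)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # mean near 0, variance near 1
```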

  20. Improved quality of life with immediate versus deferred initiation of antiretroviral therapy in early asymptomatic HIV infection.

    Science.gov (United States)

    Lifson, Alan R; Grund, Birgit; Gardner, Edward M; Kaplan, Richard; Denning, Eileen; Engen, Nicole; Carey, Catherine L; Chen, Fabian; Dao, Sounkalo; Florence, Eric; Sanz, Jesus; Emery, Sean

    2017-04-24

    To determine if immediate compared to deferred initiation of antiretroviral therapy (ART) in healthy persons living with HIV had a more favorable impact on health-related quality of life (QOL), or self-assessed physical, mental, and overall health status. QOL was measured in the Strategic Timing of Antiretroviral Therapy study, which randomized healthy ART-naive persons living with HIV with CD4 cell counts above 500 cells/μl from 35 countries to immediate versus deferred ART. At baseline, months 4 and 12, then annually, participants completed a visual analog scale (VAS) for 'perceived current health' and the Short-Form 12-Item Health Survey version 2 from which the following were computed: general health perception; physical component summary (PCS); and mental component summary (MCS); the VAS and general health were rated from 0 (lowest) to 100 (highest). QOL at study entry was high (mean scores: VAS = 80.9, general health = 72.5, PCS = 53.7, MCS = 48.2). Over a mean follow-up of 3 years, changes in all QOL measures favored the immediate group (P < 0.001); estimated differences were as follows: VAS = 1.9, general health = 3.6, PCS = 0.8, MCS = 0.9. When QOL changes were assessed across various demographic and clinical subgroups, treatment differences continued to favor the immediate group. QOL was poorer in those experiencing primary outcomes; however, when excluding those with primary events, results remained favorable for immediate ART recipients. In an international randomized trial in ART-naive participants with above 500 CD4 cells/μl, there were modest but significant improvements in self-assessed QOL among those initiating ART immediately compared to deferring treatment, supporting patient-perceived health benefits of initiating ART as soon as possible after an HIV diagnosis.

  1. The evolution of prestige: freely conferred deference as a mechanism for enhancing the benefits of cultural transmission.

    Science.gov (United States)

    Henrich, J; Gil-White, F J.

    2001-05-01

    This paper advances an "information goods" theory that explains prestige processes as an emergent product of psychological adaptations that evolved to improve the quality of information acquired via cultural transmission. Natural selection favored social learners who could evaluate potential models and copy the most successful among them. In order to improve the fidelity and comprehensiveness of such rank-biased copying, social learners further evolved dispositions to sycophantically ingratiate themselves with their chosen models, so as to gain close proximity to, and prolonged interaction with, these models. Once common, these dispositions created, at the group level, distributions of deference that new entrants can adaptively exploit to decide whom to begin copying. This generated a preference for models who seem generally "popular." Building on social exchange theories, we argue that a wider range of phenomena associated with prestige processes can more plausibly be explained by this simple theory than by others, and we test its predictions with data from throughout the social sciences. In addition, we distinguish carefully between dominance (force or force threat) and prestige (freely conferred deference).

  2. Immune control of HIV-1 infection after therapy interruption: immediate versus deferred antiretroviral therapy

    Directory of Open Access Journals (Sweden)

    Bernaschi Massimo

    2009-10-01

    Full Text Available Abstract Background The optimal stage for initiating antiretroviral therapy in HIV-1-bearing patients is still a matter of debate. Methods We present computer simulations of HIV-1 infection aimed at identifying the pros and cons of immediate as compared to deferred Highly Active Antiretroviral Therapy (HAART). Results Our simulations highlight that a prompt, specific CD8+ cytotoxic T lymphocyte response is detected when therapy is delayed. Compared to very early initiation of HAART, in deferred-treatment patients CD8+ T cells manage to mediate the decline of viremia in a shorter time, and at interruption of therapy the virus experiences stronger immune pressure. We also observe, however, that the immunological effects of the therapy fade with time in both therapeutic regimens; thus, within one year from discontinuation, the viral burden recovers to the value at which it would level off in the absence of therapy. In summary, the simulations show that immediate therapy does not prolong the disease-free period and does not confer a survival benefit when compared to treatment started during the chronic infection phase. Conclusion Our conclusion is that, since no therapy to date guarantees life-long protection, deferral of therapy should be preferred in order to minimize the risk of adverse effects, the occurrence of drug resistance, and the costs of treatment.

  3. STRUCTURE AND NUTRITIVE VALUE OF DEFERRED PASTURE OF BRACHIARIA DECUMBENS CV. BASILISK DURING THE GRAZING PERIOD

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2011-07-01

    Full Text Available The experiment aimed to evaluate the effect of the grazing period on the morphology and nutritive value of deferred Brachiaria decumbens cv. Basilisk pasture and of samples obtained by hand-plucking with cattle. A split-plot arrangement in a randomized block design with two replicates was used. Four grazing periods (1, 31, 57 and 88 days) and two forage samples (available in pasture (AP) and obtained by hand-plucking (HP)) were studied. The percentages of live leaf laminae (LLL), potentially digestible neutral detergent fiber (PDNDF), potentially digestible dry matter (PDDM) and crude protein (CP) were higher, and the percentages of dead leaf laminae (DLL), dead stem (DS), neutral detergent fiber (NDF) and indigestible NDF were lower, in HP samples than in forage AP. The grazing period linearly decreased the LLL, PDNDF and PDDM percentages and linearly increased the DS and indigestible NDF levels in the forages. The potential selectivity indexes (PSI) of LLL and indigestible NDF increased linearly with the grazing period. The PSI of live stem was smaller and the PSI of CP was higher in intermediate grazing periods. Reducing the deferment period results in B. decumbens with better morphological composition and nutritive value, which favors animal selectivity.

  4. What Infant Memory Tells Us about Infantile Amnesia: Long-Term Recall and Deferred Imitation

    OpenAIRE

    Meltzoff, Andrew N.

    1995-01-01

    Long-term recall memory was assessed using a nonverbal method requiring subjects to reenact a past event from memory (deferred imitation). A large sample of infants (N = 192), evenly divided between 14- and 16-month-olds, was tested across two experiments. A delay of 2 months was used in Experiment 1 and a delay of 4 months in Experiment 2. In both experiments two treatment groups were used. In one treatment group, motor practice (immediate imitation) was allowed before the delay was imposed;...

  5. The Effects of Deferred Action for Childhood Arrivals on the Educational Outcomes of Undocumented Students

    OpenAIRE

    Hsin, Amy; Ortega, Francesc

    2017-01-01

    Deferred Action for Childhood Arrivals (DACA) is the first large-scale immigration reform to affect undocumented immigrants in the United States in decades and offers eligible undocumented youth temporary relief from deportation and renewable work permits. While DACA has improved the economic conditions and mental health of undocumented immigrants, we do not know how DACA improves the social mobility of undocumented immigrants through its effect on educational attainment. This paper uses admi...

  6. The Effects of DACAmentation: The Impact of Deferred Action for Childhood Arrivals on Unauthorized Immigrants

    OpenAIRE

    Pope, Nolan G.

    2016-01-01

    As the largest immigration policy in 25 years, Deferred Action for Childhood Arrivals (DACA) made deportation relief and work authorization available to 1.7 million unauthorized immigrants. This paper looks at how DACA affects DACA-eligible immigrants' labor market outcomes. I use a difference-in-differences design for unauthorized immigrants near the criteria cutoffs for DACA eligibility. I find DACA increases the likelihood of working by increasing labor force participation and decreasing t...

  7. Enhanced intelligent water drops algorithm for multi-depot vehicle routing problem.

    Science.gov (United States)

    Ezugwu, Absalom E; Akutsah, Francis; Olusanya, Micheal O; Adewumi, Aderemi O

    2018-01-01

    The intelligent water drop algorithm is a swarm-based metaheuristic inspired by the behaviour of water drops in a river and the environmental changes caused by the flowing water. Since its appearance as an alternative stochastic optimization method, the algorithm has been applied to a wide range of combinatorial and functional optimization problems. This paper presents an improved intelligent water drop algorithm for solving multi-depot vehicle routing problems. A simulated annealing algorithm is introduced into the proposed algorithm as a local search metaheuristic to prevent the intelligent water drop algorithm from getting trapped in local minima and to improve its solution quality. In addition, some potentially problematic issues associated with simulated annealing are investigated, including high computational runtime and the exponential calculation in the probability of acceptance criterion, which is computationally expensive. Therefore, in order to maximize the performance of the hybrid algorithm, a cheaper way of calculating the probability of acceptance criterion is considered. The performance of the proposed hybrid algorithm is evaluated on 33 standard test problems, and the results are compared with the solutions offered by four well-known techniques from the literature. Experimental results and statistical tests show that the new method possesses outstanding performance in terms of solution quality and runtime. In addition, the proposed algorithm is suitable for solving large-scale problems.
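One commonly used way to cheapen the Metropolis acceptance test (a generic sketch, not necessarily the variant the authors adopt) is to skip the exponential entirely for downhill moves and to rewrite the remaining comparison u < exp(-Δ/T) in the equivalent form T·ln(u) < -Δ:

```python
import math
import random

def accept(delta, temperature, rng=random):
    """Metropolis acceptance criterion. Downhill moves (delta <= 0) skip
    the transcendental call entirely; uphill moves use the test
    T * ln(u) < -delta, which is equivalent to u < exp(-delta / T)."""
    if delta <= 0:
        return True
    return temperature * math.log(rng.random()) < -delta

# empirical check: acceptance rate for delta=1, T=2 should be ~exp(-0.5)
rng = random.Random(0)
n = 100_000
rate = sum(accept(1.0, 2.0, rng) for _ in range(n)) / n
print(rate)  # close to exp(-0.5) ≈ 0.607
```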

  8. Impact of a deferred recruitment model in a randomised controlled trial in primary care (CREAM study).

    Science.gov (United States)

    Shepherd, Victoria; Thomas-Jones, Emma; Ridd, Matthew J; Hood, Kerenza; Addison, Katy; Francis, Nick A

    2017-11-10

    Recruitment of participants is particularly challenging in primary care, with fewer than a third of randomised controlled trials (RCTs) achieving their target within the original time frame. Participant identification, consent, randomisation and data collection can all be time-consuming. Trials recruiting an incident, as opposed to a prevalent, population may be particularly affected. This paper describes the impact of a deferred recruitment model in an RCT of antibiotics for children with infected eczema in primary care, which required the recruitment of cases presenting acutely. Eligible children were identified by participating general practitioners (GPs) and referred to a study research nurse, who then visited them at home. This allowed the consent and recruitment processes to take place outside the general practice setting. Information was recorded about patients who were referred and recruited or, if not, the reasons for non-recruitment. Data on recruitment challenges were collected through semi-structured interviews and questionnaires with a sample of participating GPs. Data were thematically analysed to identify key themes. Of the children referred to the study, 34% (58/171) were not recruited: 48% (28/58) because of difficulties arranging a baseline visit within the defined time frame, 31% (18/58) because they did not meet the study inclusion criteria at the time of nurse assessment, and 21% (12/58) because they declined participation. GPs had positive views about the recruitment process, reporting that parents valued and benefitted from additional contact with a nurse. GPs felt that the deferred recruitment model did not negatively impact the study. GPs and parents recognised the benefits of deferred recruitment, but these did not translate into enhanced recruitment of participants. The model resulted in the loss of a third of children who were identified by the GP as eligible but not subsequently recruited to the study. If the potential for improving outcomes in primary care

  9. A new chaotic algorithm for image encryption

    International Nuclear Information System (INIS)

    Gao Haojiang; Zhang Yisheng; Liang Shuyun; Li Dequn

    2006-01-01

    Recent research on image encryption algorithms has been increasingly based on chaotic systems, but one-dimensional chaotic cryptosystems have the obvious drawbacks of a small key space and weak security. This paper presents a new nonlinear chaotic algorithm (NCA) which uses a power function and a tangent function instead of a linear function; its structural parameters are obtained by experimental analysis. An image encryption algorithm in a one-time-one-password system is then designed. The experimental results demonstrate that the image encryption algorithm based on NCA shows the advantages of a large key space and high-level security, while maintaining acceptable efficiency. Compared with some general encryption algorithms such as DES, the encryption algorithm is more secure.
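As a rough illustration of the chaotic-keystream idea (using the classic logistic map rather than the paper's power/tangent NCA map, and with no claim of security), image bytes can be XOR-masked with a stream derived from a chaotic orbit:

```python
def chaotic_keystream(length, x0=0.3567, r=3.99):
    """Keystream from the logistic map x -> r*x*(1-x); each iterate is
    quantized to one byte. The key is the initial condition x0 (and r).
    Illustrative only: plain logistic-map ciphers are known to be weak,
    which is the motivation for nonlinear maps like the NCA."""
    x, out = x0, []
    for _ in range(length):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) % 256)
    return bytes(out)

def xor_cipher(data, x0=0.3567):
    """XOR the data with the chaotic keystream; applying the same
    operation twice with the same key recovers the plaintext."""
    ks = chaotic_keystream(len(data), x0)
    return bytes(a ^ b for a, b in zip(data, ks))

msg = b"pixel data"
enc = xor_cipher(msg)
assert xor_cipher(enc) == msg  # decryption is the same XOR operation
```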

  10. From patient deference towards negotiated and precarious informality: An Eliasian analysis of English general practitioners' understandings of changing patient relations

    NARCIS (Netherlands)

    Brown, P.R.; Elston, M.A.; Gabe, J.

    2015-01-01

    This article contributes to sociological debates about trends in the power and status of medical professionals, focussing on claims that deferent patient relations are giving way to a more challenging consumerism. Analysing data from a mixed methods study involving general practitioners in England,

  11. Relations between 18-month-olds' gaze pattern and target action performance: a deferred imitation study with eye tracking.

    Science.gov (United States)

    Óturai, Gabriella; Kolling, Thorsten; Knopf, Monika

    2013-12-01

    Deferred imitation studies are used to assess infants' declarative memory performance. These studies have found that deferred imitation performance improves with age, which is usually attributed to advancing memory capabilities. Imitation studies, however, are also used to assess infants' action understanding. In this second research program it has been observed that infants around the age of one year imitate selectively, i.e., they imitate certain kinds of target actions and omit others. In contrast, two-year-olds usually imitate the model's exact actions. 18-month-olds imitate more exactly than one-year-olds but more selectively than two-year-olds, which makes this age group especially interesting, since the processes underlying selective vs. exact imitation are widely debated. The question of whether, for example, selective attention to certain kinds of target actions accounts for preferential imitation of these actions in young infants is still open. Additionally, relations between memory capabilities and selective imitation processes, as well as their role in shaping 18-month-olds' neither completely selective nor completely exact imitation, have not been thoroughly investigated yet. The present study therefore assessed 18-month-olds' gaze toward two types of actions (functional vs. arbitrary target actions) and the model's face during target action demonstration, as well as infants' deferred imitation performance. Although infants' fixation times to functional target actions were not longer than to arbitrary target actions, they imitated the functional target actions more frequently than the arbitrary ones. This suggests that selective imitation does not rely on selective gaze toward functional target actions during the demonstration phase. In addition, a post hoc analysis of interindividual differences suggested that infants' attention to the model's social-communicative cues might play an important role in exact imitation, meaning the imitation

  12. 26 CFR 20.6324A-1 - Special lien for estate tax deferred under section 6166 or 6166A.

    Science.gov (United States)

    2010-04-01

    ...) for payment of the estate tax. Such value must take into account any encumbrance on the property (such... the required value must be added to the agreement within 90 days after notice and demand from the...

  13. Participation in Black Lives Matter and Deferred Action for Childhood Arrivals: Modern Activism among Black and Latino College Students

    Science.gov (United States)

    Hope, Elan C.; Keels, Micere; Durkee, Myles I.

    2016-01-01

    Political activism is one way racially/ethnically marginalized youth can combat institutional discrimination and seek legislative change toward equality and justice. In the current study, we examine participation in #BlackLivesMatter (BLM) and advocacy for Deferred Action for Childhood Arrivals (DACA) as political activism popular among youth.…

  14. Can Authorization Reduce Poverty among Undocumented Immigrants? Evidence from the Deferred Action for Childhood Arrivals Program

    OpenAIRE

    Amuedo-Dorantes, Catalina; Antman, Francisca M.

    2016-01-01

    We explore the impact of authorization on the poverty exposure of households headed by undocumented immigrants. The identification strategy makes use of the 2012 Deferred Action for Childhood Arrivals (DACA) program, which provided a temporary work authorization and reprieve from deportation to eligible immigrants. Using a difference-in-differences approach, we compare DACA-eligible to DACA-ineligible likely unauthorized immigrants, before and after the program implementation. We find that DA...

  15. Is selective attention the basis for selective imitation in infants? An eye-tracking study of deferred imitation with 12-month-olds.

    Science.gov (United States)

    Kolling, Thorsten; Oturai, Gabriella; Knopf, Monika

    2014-08-01

    Infants and children do not blindly copy every action they observe during imitation tasks. Research demonstrated that infants are efficient selective imitators. The impact of selective perceptual processes (selective attention) for selective deferred imitation, however, is still poorly described. The current study, therefore, analyzed 12-month-old infants' looking behavior during demonstration of two types of target actions: arbitrary versus functional actions. A fully automated remote eye tracker was used to assess infants' looking behavior during action demonstration. After a 30-min delay, infants' deferred imitation performance was assessed. Next to replicating a memory effect, results demonstrate that infants do imitate significantly more functional actions than arbitrary actions (functionality effect). Eye-tracking data show that whereas infants do not fixate significantly longer on functional actions than on arbitrary actions, amount of fixations and amount of saccades differ between functional and arbitrary actions, indicating different encoding mechanisms. In addition, item-level findings differ from overall findings, indicating that perceptual and conceptual item features influence looking behavior. Looking behavior on both the overall and item levels, however, does not relate to deferred imitation performance. Taken together, the findings demonstrate that, on the one hand, selective imitation is not explainable merely by selective attention processes. On the other hand, notwithstanding this reasoning, attention processes on the item level are important for encoding processes during target action demonstration. Limitations and future studies are discussed. Copyright © 2014 Elsevier Inc. All rights reserved.

  16. Globalized Newton-Krylov-Schwarz Algorithms and Software for Parallel Implicit CFD

    Science.gov (United States)

    Gropp, W. D.; Keyes, D. E.; McInnes, L. C.; Tidriri, M. D.

    1998-01-01

    Implicit solution methods are important in applications modeled by PDEs with disparate temporal and spatial scales. Because such applications require high resolution with reasonable turnaround, "routine" parallelization is essential. The pseudo-transient matrix-free Newton-Krylov-Schwarz (Psi-NKS) algorithmic framework is presented as an answer. We show that, for the classical problem of three-dimensional transonic Euler flow about an M6 wing, Psi-NKS can simultaneously deliver: globalized, asymptotically rapid convergence through adaptive pseudo-transient continuation and Newton's method; reasonable parallelizability for an implicit method through deferred synchronization and favorable communication-to-computation scaling in the Krylov linear solver; and high per-processor performance through attention to distributed memory and cache locality, especially through the Schwarz preconditioner. Two discouraging features of Psi-NKS methods are their sensitivity to the coding of the underlying PDE discretization and the large number of parameters that must be selected to govern convergence. We therefore distill several recommendations from our experience and from our reading of the literature on various algorithmic components of Psi-NKS, and we describe a freely available, MPI-based portable parallel software implementation of the solver employed here.
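    The pseudo-transient continuation at the heart of Psi-NKS can be illustrated in miniature. The sketch below is a hedged illustration on a scalar equation, not the authors' parallel implementation: each step solves a backward-Euler-like system, and the timestep grows as the residual falls (a switched-evolution-relaxation rule), so the iteration transitions from damped time-stepping into Newton's method. All names and parameter values here are illustrative choices.

```python
import math

def psi_tc(f, fprime, u0, dt0=0.1, tol=1e-10, max_iter=200):
    """Pseudo-transient continuation for a scalar equation f(u) = 0.

    Each step solves (1/dt + f'(u)) * du = -f(u); the timestep dt grows
    as the residual shrinks (switched evolution relaxation), so the
    iteration morphs from damped time-stepping into Newton's method.
    """
    u, dt = u0, dt0
    r0 = abs(f(u0))
    for _ in range(max_iter):
        r = f(u)
        if abs(r) < tol:
            break
        du = -r / (1.0 / dt + fprime(u))
        u += du
        dt = dt0 * r0 / max(abs(f(u)), tol)  # SER timestep growth
    return u

# Solve u - cos(u) = 0 from a poor starting point; early steps are damped,
# late steps are essentially Newton iterations.
root = psi_tc(lambda u: u - math.cos(u), lambda u: 1 + math.sin(u), u0=3.0)
```

    In the full Psi-NKS framework the same idea is applied to a large PDE residual, with the linear solve handled by a Schwarz-preconditioned Krylov method instead of scalar division.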

  17. Elaboration of an algorithm for preserving a projective skin flap above the tumor when planning subcutaneous mastectomy from an aesthetically acceptable area in patients with breast nodule cancer

    Directory of Open Access Journals (Sweden)

    A. R. Khamitov

    2016-01-01

    Full Text Available The article discusses indications for preserving the skin flap over the tumor so that the surgical access can be relocated to an aesthetically acceptable zone in patients with primary nodular breast cancer. Survey results of 203 patients (T1–2N0–3M0) are analyzed. The study revealed that the risk factors for skin flap involvement are the presence of skin flattening as well as topographic and anatomical characteristics: a tumor < 3 cm located at a depth of < 0.46 ± 0.2 cm, or a tumor ≥ 3 cm located at a depth of < 1.66 cm. Based on these data, an algorithm for immediate breast reconstruction from an aesthetically acceptable zone was compiled for the surgical oncologist.

  18. Forest FIRE and FIRE wood : tools for tree automata and tree algorithms

    NARCIS (Netherlands)

    Cleophas, L.G.W.A.; Piskorski, J.; Watson, B.W.; Yli-Jyrä, A.

    2009-01-01

    Pattern matching, acceptance, and parsing algorithms on node-labeled, ordered, ranked trees ('tree algorithms') are important for applications such as instruction selection and tree transformation/term rewriting. Many such algorithms have been developed. They often are based on results from such

  19. Qualitative evaluation of a deferred consent process in paediatric emergency research: a PREDICT study.

    Science.gov (United States)

    Furyk, Jeremy; McBain-Rigg, Kristin; Watt, Kerrianne; Emeto, Theophilus I; Franklin, Richard C; Franklin, Donna; Schibler, Andreas; Dalziel, Stuart R; Babl, Franz E; Wilson, Catherine; Phillips, Natalie; Ray, Robin

    2017-11-15

    A challenge of conducting research in critically ill children is that the therapeutic window for the intervention may be too short to seek informed consent prior to enrolment. In specific circumstances, most international ethical guidelines allow for children to be enrolled in research with informed consent obtained later, termed deferred consent (DC) or retrospective consent. There is a paucity of data on the attitudes of parents to this method of enrolment in paediatric emergency research. The objectives were to explore the attitudes of parents to the concept of DC and to expand knowledge of the limitations of informed consent and DC in these situations. Children presenting with uncomplicated febrile seizures or bronchiolitis were identified from three separate hospital emergency department databases. Parents were invited to participate in a semistructured telephone interview exploring themes of limitations of prospective informed consent, acceptability of the DC process and the most appropriate time to seek DC. Transcripts underwent inductive thematic analysis with intercoder agreement, using Nvivo 11 software. A total of 39 interviews were conducted. Participants comprehended the limitations of informed consent under emergency circumstances and were generally supportive of DC. However, they frequently confused concepts of clinical care and research, and support for participation was commonly linked to their belief of personal benefit. Participants acknowledged the requirement for alternatives to prospective informed consent in emergency research, and were supportive of the concept of DC. Our results suggest that current research practice seems to align with community expectations. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  20. 17 CFR 270.6c-10 - Exemption for certain open-end management investment companies to impose deferred sales loads.

    Science.gov (United States)

    2010-04-01

    ... management investment companies to impose deferred sales loads. 270.6c-10 Section 270.6c-10 Commodity and... ACT OF 1940 § 270.6c-10 Exemption for certain open-end management investment companies to impose... purposes of this section: (1) Company means a registered open-end management investment company, other than...

  1. Accurate simulation of neutrons in less than one minute Pt. 1: Acceptance diagram shading

    International Nuclear Information System (INIS)

    Bentley, Phillip M.; Andersen, Ken H.

    2009-01-01

    We describe an algorithm for modelling neutron guides based on an extension of acceptance diagram methods: neutron acceptance diagram shading (NADS), using a computer graphical approach. It has a similar black-box, modular strategy to Monte-Carlo simulations, but often outperforms Monte-Carlo by orders of magnitude in terms of calculation speed without sacrificing significant detail. NADS is expected to increase the productivity in the design stage of neutron instrument development, and facilitate the widespread application of advanced, modern artificial-intelligence-based optimisation algorithms for the design of neutron instrumentation.

  2. DACA at the Two-Year Mark: A National and State Profile of Youth Eligible and Applying for Deferred Action

    Science.gov (United States)

    Batalova, Jeanne; Hooker, Sarah; Capps, Randy

    2014-01-01

    Since the Obama administration launched the Deferred Action for Childhood Arrivals (DACA) program in 2012, which offers temporary relief from deportation and the right to apply for work authorization for certain unauthorized immigrants who came to the United States as children, 55 percent of the 1.2 million youth who immediately met the program's…

  3. Health consequences of the US Deferred Action for Childhood Arrivals (DACA) immigration programme: a quasi-experimental study

    OpenAIRE

    Atheendar S Venkataramani, DrMD; Sachin J Shah, MD; Rourke O'Brien, PhD; Ichiro Kawachi, ProfPhD; Alexander C Tsai, MD

    2017-01-01

    Summary: Background: The effects of changes in immigration policy on health outcomes among undocumented immigrants are not well known. We aimed to examine the physical and mental health effects of the Deferred Action for Childhood Arrivals (DACA) programme, a 2012 US immigration policy that provided renewable work permits and freedom from deportation for a large number of undocumented immigrants. Methods: We did a retrospective, quasi-experimental study using nationally representative, repea...

  4. 77 FR 24857 - Interim Final Determination To Stay and Defer Sanctions, San Joaquin Valley Unified Air Pollution...

    Science.gov (United States)

    2012-04-26

    ...EPA is making an interim final determination to stay the imposition of offset sanctions and to defer the imposition of highway sanctions based on a proposed approval of revisions to the San Joaquin Valley Unified Air Pollution Control District (SJVUAPCD) portion of the California State Implementation Plan (SIP) published elsewhere in this Federal Register. The revisions concern SJVUAPCD Rule 4352, Solid Fuel Fired Boilers, Steam Generators and Process Heaters.

  5. Infrastructure investment for tomorrow: A financing plan to eliminate the deferred maintenance on the nation's roads

    OpenAIRE

    Regan, Edward V.

    1994-01-01

    The author presents a long-term public investment proposal to preserve and upgrade the nation's infrastructure system, offering a unique financing plan to eliminate much of the backlog of deferred maintenance that plagues America's roads and bridges. The plan would allow states and municipalities to get out from under this burden with a one-time upgrading program, and then attain a new capacity to maintain and improve their infrastructure networks. Regan concludes that the goal of long-term i...

  6. Undocumented students pursuing medical education: The implications of deferred action for childhood arrivals (DACA).

    Science.gov (United States)

    Balderas-Medina Anaya, Yohualli; del Rosario, Mithi; Doyle, Lawrence Hy; Hayes-Bautista, David E

    2014-12-01

    There are about 1.8 million young immigrants in the United States who came or were brought to the country without documentation before the age of 16. These youth have been raised and educated in the United States and have aspirations and educational achievements similar to those of their native-born peers. However, their undocumented status has hindered their pursuit of higher education, especially in medical and other graduate health sciences. Under a new discretionary policy, Deferred Action for Childhood Arrivals (DACA), many of these young immigrants are eligible to receive permission to reside and work in the United States. DACA defers deportation of eligible, undocumented youth and grants lawful presence in the United States, work permits, Social Security numbers, and, in most states, driver's licenses. These privileges have diminished the barriers undocumented students traditionally have faced in obtaining higher education, specifically in pursuing medicine. With the advent of DACA, students are slowly matriculating into U.S. medical schools and residencies. However, this applicant pool remains largely untapped. In the face of a physician shortage and the implementation of the Affordable Care Act, an increase in matriculation of qualified undocumented students would be greatly beneficial. This Perspective is intended to begin discussion within the academic medicine community of the implications of DACA in reducing barriers for the selection and matriculation of undocumented medical students and residents. Moreover, this Perspective is a call to peers in the medical community to support undocumented students seeking access to medical school, residency, and other health professions.

  7. Deconstructing the risk for malaria in United States donors deferred for travel to Mexico.

    Science.gov (United States)

    Spencer, Bryan; Kleinman, Steven; Custer, Brian; Cable, Ritchard; Wilkinson, Susan L; Steele, Whitney; High, Patrick M; Wright, David

    2011-11-01

    More than 66,000 blood donors are deferred annually in the United States due to travel to malaria-endemic areas of Mexico. Mexico accounts for the largest share of malaria travel deferrals, yet it has extremely low risk for malaria transmission throughout most of its national territory, suggesting a suboptimal balance between blood safety and availability. This study sought to determine whether donor deferral requirements might be relaxed for parts of Mexico without compromising blood safety. Travel destination was recorded from a representative sample of presenting blood donors deferred for malaria travel from six blood centers during 2006. We imputed to these donors reporting Mexican travel a risk for acquiring malaria equivalent to Mexican residents in the destination location, adjusted for length of stay. We extrapolated these results to the overall US blood donor population. Risk for malaria in Mexico varies significantly across endemic areas and is greatest in areas infrequently visited by study donors. More than 70% of blood donor deferrals were triggered by travel to the state of Quintana Roo on the Yucatán Peninsula, an area of very low malaria transmission. Eliminating the travel deferral requirement for all areas except the state of Oaxaca might result in the recovery of almost 65,000 blood donors annually at risk of approximately one contaminated unit collected every 20 years. Deferral requirements should be relaxed for presenting donors who traveled to areas within Mexico that confer exceptionally small risks for malaria, such as Quintana Roo. © 2011 American Association of Blood Banks.

  8. Changes in inflammatory and coagulation biomarkers: a randomized comparison of immediate versus deferred antiretroviral therapy in patients with HIV infection

    DEFF Research Database (Denmark)

    Baker, Jason V; Neuhaus, Jacqueline; Duprez, Daniel

    2011-01-01

    Among a subgroup of participants in the Strategies for Management of Antiretroviral Therapy (SMART) Trial that were naïve to antiretroviral therapy (ART) or off ART (6 months or longer) at study entry, risk of AIDS and serious non-AIDS events were increased for participants who deferred ART compa...

  9. Acceptance and implementation of a system of planning computerized based on Monte Carlo

    International Nuclear Information System (INIS)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-01-01

    Acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy yield of the linear electron accelerator head and calculates the dose with an x-ray algorithm (XVMC) based on the Monte Carlo method. (Author)

  10. Maximizing Total Profit in Two-agent Problem of Order Acceptance and Scheduling

    Directory of Open Access Journals (Sweden)

    Mohammad Reisi-Nafchi

    2017-03-01

    Full Text Available In competitive markets, attracting potential customers and keeping current customers is a survival condition for each company, so paying attention to the requests of customers is important and vital. In this paper, the problem of order acceptance and scheduling is studied, in which two types of customers, or agents, compete in a single machine environment. The objective is to maximize the sum of the total profit of the first agent's accepted orders and the total revenue of the second agent. Only the first agent incurs a penalty, measured as lateness; the second agent's orders share a common due date, and no tardy order of the second agent is accepted. To solve the problem, a mathematical programming formulation, a heuristic algorithm and a pseudo-polynomial dynamic programming algorithm are proposed. Computational results confirm the ability to solve all problem instances with up to 70 orders optimally, and 93.12% of problem instances with up to 150 orders, by dynamic programming.
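    To illustrate the dynamic programming flavour of such order acceptance and scheduling problems, here is a minimal single-agent sketch, not the authors' two-agent algorithm: accepted orders are sequenced in earliest-due-date (EDD) order, a common simplification, and a 0/1 knapsack-style DP over completed-work time decides which orders to accept. The instance and all names are hypothetical.

```python
def max_profit(orders):
    """Single-agent order acceptance and scheduling, a minimal sketch.

    orders: list of (processing_time, due_date, revenue, tardiness_weight).
    Accepted orders are sequenced in EDD order (a heuristic simplification);
    dp maps total completed-work time -> best achievable profit.
    """
    orders = sorted(orders, key=lambda o: o[1])  # EDD sequencing
    horizon = sum(p for p, _, _, _ in orders)
    NEG = float("-inf")
    dp = [NEG] * (horizon + 1)
    dp[0] = 0.0
    for p, d, r, w in orders:
        for t in range(horizon - p, -1, -1):  # backwards: each order used once
            if dp[t] > NEG:
                finish = t + p
                profit = dp[t] + r - w * max(0, finish - d)
                if profit > dp[finish]:
                    dp[finish] = profit
    return max(dp)

# Three candidate orders: accepting the third would cost more in tardiness
# penalty than its revenue is worth, so the DP rejects it.
best = max_profit([(2, 2, 10, 1), (3, 3, 10, 1), (4, 4, 5, 10)])
```

    The state space is pseudo-polynomial in the total processing time, which mirrors the pseudo-polynomial complexity reported in the abstract.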

  11. 78 FR 12243 - Interim Final Determination To Stay and Defer Sanctions, Placer County Air Pollution Control...

    Science.gov (United States)

    2013-02-22

    ...EPA is making an interim final determination to stay the imposition of offset sanctions and to defer the imposition of highway sanctions based on a proposed approval of a revision to the Placer County Air Pollution Control District (PCAPCD) and Feather River Air Quality Management District (FRAQMD) portion of the California State Implementation Plan (SIP) published elsewhere in this Federal Register. The SIP revision concerns two permitting rules submitted by the PCAPCD and FRAQMD, respectively: Rule 502, New Source Review, and Rule 10.1, New Source Review.

  12. Applying a Stochastic Financial Planning System to an Individual: Immediate or Deferred Life Annuities?

    DEFF Research Database (Denmark)

    Konicz, Agnieszka Karolina; Mulvey, John M.

    2013-01-01

    Individuals are often faced with financial decisions that have long-term implications for themselves and their families, but they have few sources of unbiased assistance. The authors suggest that a stochastic financial planning system, properly constructed and calibrated, can be applied to a number of such financial decisions, especially in the retirement arena. They present as an example the choice to purchase a life annuity for a middle-aged person. Buyers must choose whether to purchase before retirement or at the date of retirement. The article provides some guidelines on whether or not to purchase deferred life annuities, and who might most benefit from such a purchase.

  13. One Step In and One Step Out : The Lived Experience of the Deferred Action for Childhood Arrivals Program

    OpenAIRE

    Kosnac, Hillary Sue

    2014-01-01

    After over a decade of congressional stalemate on the Development, Relief and Education for Alien Minors (DREAM) Act, the Obama administration announced the Deferred Action for Childhood Arrivals (DACA) program in the summer of 2012. A form of prosecutorial discretion, DACA offers certain undocumented youth a two-year reprieve from deportation, employment authorization and, in some states like California, a driver's license. Nevertheless, because DACA does not provide a pathway to citizenship...

  14. Proposed prediction algorithms based on hybrid approach to deal with anomalies of RFID data in healthcare

    Directory of Open Access Journals (Sweden)

    A. Anny Leema

    2013-07-01

    Full Text Available RFID technology has penetrated the healthcare sector due to its increased functionality, low cost, high reliability, and ease of use. It is being deployed for various applications, and the data captured by RFID readers grow with every timestamp, resulting in an enormous volume of duplicates, false positives, and false negatives. The dirty data stream generated by RFID readers is one of the main factors limiting the widespread adoption of RFID technology. To provide reliable data to RFID applications, the collected data must be cleaned effectively before they are warehoused. The existing approaches to dealing with anomalies are the physical, middleware, and deferred approaches. Their shortcomings are analyzed, and it is found that a robust RFID system can be built by integrating the middleware and deferred approaches. Our proposed algorithms, based on this hybrid approach, are tested in a healthcare environment and predict false positives, false negatives, and redundant data. In this paper, a healthcare environment is simulated using RFID, and the data observed by the RFID reader contain false-positive, false-negative, and duplicate anomalies. Experimental evaluation shows that our cleansing methods remove errors in RFID data more accurately and efficiently. Thus, with the aid of the proposed data cleaning technique, we can bring down healthcare costs, optimize business processes, streamline patient identification, and improve patient safety.
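    A window-based cleaning pass in the spirit of such middleware/deferred hybrids can be sketched as follows. This is an assumption-laden toy, not the authors' algorithms: it collapses duplicate reads, drops weakly supported tags as likely false positives, and interpolates short gaps as likely false negatives. The thresholds and names are illustrative.

```python
from collections import defaultdict

def clean_reads(reads, window=3, min_support=2):
    """Window-based cleaning of a raw RFID stream, a minimal illustration.

    reads: list of (timestamp, tag_id) tuples from a reader.
    - duplicates: each tag is reported at most once per timestamp;
    - false positives: tags seen fewer than `min_support` times are
      treated as ghost reads and dropped;
    - false negatives: a tag seen both before and after a gap no wider
      than `window` is interpolated for the missing timestamps.
    """
    by_tag = defaultdict(set)
    for ts, tag in reads:            # sets collapse repeated reads (dedup)
        by_tag[tag].add(ts)
    cleaned = []
    for tag, stamps in by_tag.items():
        if len(stamps) < min_support:
            continue                 # likely false positive: drop
        filled = set(stamps)
        for ts in sorted(stamps):
            nxt = min((s for s in stamps if s > ts), default=None)
            if nxt is not None and nxt - ts <= window:
                filled.update(range(ts, nxt + 1))  # fill missed reads
        cleaned.extend((ts, tag) for ts in sorted(filled))
    return sorted(cleaned)

# Tag A is read twice at t=1, misses t=3; tag B appears once (a ghost read).
raw = [(1, "A"), (1, "A"), (2, "A"), (4, "A"), (3, "B")]
cleaned = clean_reads(raw)
```

    Real deployments tune the window and support thresholds per reader and tag mobility; the deferred stage would then apply rule-based corrections at query time.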

  15. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    Energy Technology Data Exchange (ETDEWEB)

    Thanh, Vo Hong, E-mail: vo@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Priami, Corrado, E-mail: priami@cosbi.eu [The Microsoft Research - University of Trento Centre for Computational and Systems Biology, Piazza Manifattura 1, Rovereto 38068 (Italy); Department of Mathematics, University of Trento, Trento (Italy); Zunino, Roberto, E-mail: roberto.zunino@unitn.it [Department of Mathematics, University of Trento, Trento (Italy)

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and the high variability of chemical species populations. One approach to accelerating the simulation is to allow multiple reaction firings before performing an update, assuming that reaction propensities change by a negligible amount during the time interval. Species with small populations involved in fast reactions significantly affect both the performance and accuracy of this approach, and the problem is worse when such species participate in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds and a rejection mechanism to select the next reaction firings. The selected reaction is guaranteed to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. The new algorithm reduces the computational cost both of selecting the next reaction firing and of updating reaction propensities.
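    The exact rejection-based selection step that this work builds on can be sketched as follows. This is a simplified illustration of the candidate-then-accept mechanism only; the paper's contribution, bounding the acceptance probability to permit approximate multi-firing updates, is not reproduced here.

```python
import random

def rejection_select(upper_bounds, exact_propensity, rng=random.random):
    """Select the next reaction using propensity upper bounds (RSSA idea).

    A candidate reaction is drawn proportionally to its upper bound, then
    accepted with probability exact/upper; on rejection we redraw. The
    exact propensity is evaluated only for candidates, which is the source
    of the speed-up when exact evaluation is expensive.
    """
    total = sum(upper_bounds)
    while True:
        u, acc = rng() * total, 0.0
        for j, b in enumerate(upper_bounds):  # candidate ~ upper_bounds
            acc += b
            if u <= acc:
                break
        if rng() * upper_bounds[j] <= exact_propensity(j):  # acceptance test
            return j
```

    With two reactions whose upper bounds are both 2.0 but whose exact propensities are 2.0 and 1.0, reaction 0 is returned with probability 2/3, exactly as if the true propensities had been used directly.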

  16. Guidelines and algorithms for managing the difficult airway.

    Science.gov (United States)

    Gómez-Ríos, M A; Gaitini, L; Matter, I; Somri, M

    2018-01-01

    The difficult airway constitutes a continuous challenge for anesthesiologists. Guidelines and algorithms are key to preserving patient safety, by recommending specific plans and strategies that address predicted or unexpected difficult airway. However, there are currently no "gold standard" algorithms or universally accepted standards. The aim of this article is to present a synthesis of the recommendations of the main guidelines and difficult airway algorithms. Copyright © 2017 Sociedad Española de Anestesiología, Reanimación y Terapéutica del Dolor. Publicado por Elsevier España, S.L.U. All rights reserved.

  17. Duality reconstruction algorithm for use in electrical impedance tomography

    International Nuclear Information System (INIS)

    Abdullah, M.Z.; Dickin, F.J.

    1996-01-01

    A duality reconstruction algorithm for solving the inverse problem in electrical impedance tomography (EIT) is described. In this method, an algorithm based on the Geselowitz compensation (GC) theorem is used first to reconstruct an approximate version of the image. This is then fed as an initial guess to the modified Newton-Raphson (MNR) algorithm, which iteratively corrects the image until an acceptable final solution is reached. The implementation of the GC- and MNR-based algorithms using the finite element method will be discussed. Reconstructed images produced by the algorithm will also be presented. Consideration is also given to the most computationally intensive aspects of the algorithm, namely the inversion of the large, sparse matrices. The methods taken to approximately compute the inverses of those matrices will be outlined. (author)
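    The two-stage structure of the duality algorithm, a cheap approximate reconstruction refined by Newton-Raphson iterations, can be sketched on a toy scalar problem. The forward model below is a stand-in for the EIT forward problem, and all names are illustrative, not the authors' implementation.

```python
def forward(c):
    """Toy nonlinear forward model standing in for the EIT forward problem."""
    return c + 0.1 * c ** 3

def jacobian(c):
    """Derivative of the toy forward model (stand-in for the sensitivity matrix)."""
    return 1.0 + 0.3 * c ** 2

def reconstruct(measured, tol=1e-12, max_iter=50):
    """Two-stage reconstruction in the spirit of the duality algorithm.

    Stage 1 (analogous to the Geselowitz-compensation step) produces a
    cheap approximate image by ignoring the nonlinearity; stage 2 refines
    it with Newton-Raphson iterations until the residual is small.
    """
    c = measured                      # stage 1: linearized one-shot estimate
    for _ in range(max_iter):         # stage 2: iterative correction
        residual = forward(c) - measured
        if abs(residual) < tol:
            break
        c -= residual / jacobian(c)
    return c

# Recover the "conductivity" that produced a synthetic measurement.
c_hat = reconstruct(forward(1.5))
```

    In real EIT the division by the Jacobian becomes the solution of a large, sparse, ill-conditioned linear system, which is why the abstract emphasises approximate matrix inversion.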

  18. Support the Design of Improved IUE NEWSIPS High Dispersion Extraction Algorithms: Improved IUE High Dispersion Extraction Algorithms

    Science.gov (United States)

    Lawton, Pat

    2004-01-01

    The objective of this work was to support the design of improved IUE NEWSIPS high dispersion extraction algorithms. The work evaluated use of the Linearized Image (LIHI) file versus the Re-Sampled Image (SIHI) file, evaluated various extraction approaches, and designed algorithms for the evaluation of IUE high dispersion spectra. It was concluded that use of the Re-Sampled Image (SIHI) file was acceptable. Since the Gaussian profile worked well for the core and the Lorentzian profile worked well for the wings, the Voigt profile was chosen for use in the extraction algorithm. It was found that the gamma and sigma parameters varied significantly across the detector, so gamma and sigma masks for the SWP detector were developed. Extraction code was written.
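    The rationale for the Voigt choice, a Gaussian core with Lorentzian wings, can be illustrated with the common pseudo-Voigt approximation: a weighted sum of the two profiles rather than the true convolution. This is a generic illustration with hypothetical parameter values, not the NEWSIPS extraction code.

```python
import math

def gaussian(x, sigma):
    """Normalised Gaussian: dominates the line core, decays exponentially."""
    return math.exp(-x * x / (2 * sigma * sigma)) / (sigma * math.sqrt(2 * math.pi))

def lorentzian(x, gamma):
    """Normalised Lorentzian: power-law wings that a Gaussian cannot match."""
    return gamma / (math.pi * (x * x + gamma * gamma))

def pseudo_voigt(x, sigma, gamma, eta=0.5):
    """Pseudo-Voigt: weighted mix of Gaussian core and Lorentzian wings.

    The true Voigt profile is the convolution of the two; the pseudo-Voigt
    (mixing parameter eta) is a standard closed-form approximation.
    """
    return eta * lorentzian(x, gamma) + (1 - eta) * gaussian(x, sigma)
```

    Far from the line centre the Gaussian term is negligible, so the profile is dominated by the Lorentzian wings, which is exactly the behaviour the abstract reports motivating the Voigt choice.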

  19. ENHANCING STAKEHOLDER ACCEPTANCE OF BIOREMEDIATION TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Focht, Will; Albright, Matt; Anex, Robert P., Jr., ed.

    2009-04-21

    control over exposure as low. Slightly more than half believe that risk reduction should be balanced against cost. We also found that distrust of DOE and its contractors exists, primarily due to the perception that site managers do not share public values; hence, the public is generally unwilling to defer to DOE in its decision-making. The concomitant belief of inefficacy confounds distrust by generating frustration that DOE does not care. Moreover, the public is split with respect to trust of each other, primarily because of the belief that citizens lack technical competence. With respect to bioremediation support, we found that more than 40% of the public has no opinion. However, of those who do, 3 of 4 are favorably disposed – particularly among those who believe that risk is lower and who are more trusting of site management. We presented survey respondents with four alternative participation strategies based on the results of the Q analysis and asked their judgments of each. The public prefers strategies that shift power to them. The least empowered strategy (feedback) was supported by 46%; support grew as public power increased, reaching 66% support for independently facilitated deliberation. More DOE distrust generates more support for high power strategies. We offer the following recommendations to enhance public acceptance. First, and perhaps most importantly, site managers should pursue robust trust-building efforts to gain public confidence in DOE risk management that meets public expectations. Public trust decreases risk perception, which increases public willingness to defer to site managers’ discretion in decision-making, which in turn increases public acceptance of the decisions that result. 
Second, site managers should address public concerns about bioremediation such as its effectiveness in reducing risk, performance compared to other remediation alternatives, costs compared against benefits, time required to start and complete remediation, level of

  20. Enhancing Stakeholder Acceptance Of Bioremediation Technologies

    International Nuclear Information System (INIS)

    Focht, Will; Albright, Matt; Anex, Robert P. Jr.

    2009-01-01

    control over exposure as low. Slightly more than half believe that risk reduction should be balanced against cost. We also found that distrust of DOE and its contractors exists, primarily due to the perception that site managers do not share public values; hence, the public is generally unwilling to defer to DOE in its decision-making. The concomitant belief of inefficacy confounds distrust by generating frustration that DOE does not care. Moreover, the public is split with respect to trust of each other, primarily because of the belief that citizens lack technical competence. With respect to bioremediation support, we found that more than 40% of the public has no opinion. However, of those who do, 3 of 4 are favorably disposed - particularly among those who believe that risk is lower and who are more trusting of site management. We presented survey respondents with four alternative participation strategies based on the results of the Q analysis and asked their judgments of each. The public prefers strategies that shift power to them. The least empowered strategy (feedback) was supported by 46%; support grew as public power increased, reaching 66% support for independently facilitated deliberation. More DOE distrust generates more support for high power strategies. We offer the following recommendations to enhance public acceptance. First, and perhaps most importantly, site managers should pursue robust trust-building efforts to gain public confidence in DOE risk management that meets public expectations. Public trust decreases risk perception, which increases public willingness to defer to site managers' discretion in decision-making, which in turn increases public acceptance of the decisions that result. 
Second, site managers should address public concerns about bioremediation such as its effectiveness in reducing risk, performance compared to other remediation alternatives, costs compared against benefits, time required to start and complete remediation, level of risk

  1. Tag SNP selection via a genetic algorithm.

    Science.gov (United States)

    Mahdevar, Ghasem; Zahiri, Javad; Sadeghi, Mehdi; Nowzari-Dalini, Abbas; Ahrabian, Hayedeh

    2010-10-01

    Single Nucleotide Polymorphisms (SNPs) provide valuable information on human evolutionary history and may lead us to identify genetic variants responsible for human complex diseases. Unfortunately, molecular haplotyping methods are costly, laborious, and time consuming; therefore, algorithms for constructing full haplotype patterns from a small amount of available data through computational methods (the tag SNP selection problem) are convenient and attractive. This problem is proved to be NP-hard, so heuristic methods may be useful. In this paper we present a heuristic method based on a genetic algorithm to find a reasonable solution within an acceptable time. The algorithm was tested on a variety of simulated and experimental data. In comparison with the exact algorithm, based on a brute-force approach, results show that our method can obtain optimal solutions in almost all cases and runs much faster than the exact algorithm when the number of SNP sites is large. Our software is available upon request from the corresponding author.
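    A toy version of the genetic-algorithm approach to this problem might look like the following sketch. The representation (bitmasks over SNP sites), operators, and fitness function are illustrative assumptions, not the authors' implementation; the fitness simply rewards small subsets of sites that keep every haplotype distinguishable.

```python
import random

def distinguishes(haplotypes, cols):
    """True if the selected SNP columns give every haplotype a unique pattern."""
    patterns = {tuple(h[c] for c in cols) for h in haplotypes}
    return len(patterns) == len(haplotypes)

def ga_tag_snps(haplotypes, pop=30, gens=200, seed=0):
    """Toy genetic algorithm for tag SNP selection.

    Individuals are bitmasks over SNP sites; fitness rewards masks that
    keep all haplotypes distinguishable while using few sites.
    """
    rng = random.Random(seed)
    n = len(haplotypes[0])

    def fitness(mask):
        cols = [i for i in range(n) if mask[i]]
        if not cols or not distinguishes(haplotypes, cols):
            return -n                 # infeasible mask
        return n - len(cols)          # fewer tag SNPs is better

    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop // 2]        # truncation selection
        children = []
        while len(children) < pop - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)             # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1          # point mutation
            children.append(child)
        population = survivors + children
    best = max(population, key=fitness)
    return [i for i in range(n) if best[i]]

# Four haplotypes over five SNP sites; two well-chosen sites suffice.
haps = ["00110", "01010", "11000", "10101"]
tags = ga_tag_snps(haps)
```

    Real tag SNP selection works on much larger panels, where the exact brute-force search over subsets becomes infeasible and a heuristic of this kind pays off.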

  2. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of developing and maintaining multimodal grammars for integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions for automating grammar generation and updating. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and afterward makes use of two learning operators and the minimum description length metric to improve the grammar description and avoid the over-generalization problem. The experimental results highlight the acceptable performance of the proposed algorithm, which has a very high probability of parsing valid sentences.

  3. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

    Full Text Available E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the ‘action and design’ class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered, and the model’s generic applicability is discussed.

  4. A problem of finding an acceptable variant in generalized project networks

    Directory of Open Access Journals (Sweden)

    David Blokh

    2005-01-01

    Full Text Available A project network often has some activities or groups of activities which can be performed at different stages of the project. Then, the problem of finding an optimal/acceptable time and/or an optimal/acceptable order for such an activity or group of activities arises. Such a problem emerges, in particular, in house-building management, where the beginnings of some activities may vary in time and/or order. We consider a mathematical formulation of the problem, show its computational complexity, and describe an algorithm for solving the problem.

  5. Dose Calculation Accuracy of the Monte Carlo Algorithm for CyberKnife Compared with Other Commercially Available Dose Calculation Algorithms

    International Nuclear Information System (INIS)

    Sharma, Subhash; Ott, Joseph; Williams, Jamone; Dickow, Danny

    2011-01-01

    Monte Carlo dose calculation algorithms have the potential for greater accuracy than traditional model-based algorithms. This enhanced accuracy is particularly evident in regions of lateral scatter disequilibrium, which can develop during treatments incorporating small field sizes and low-density tissue. A heterogeneous slab phantom was used to evaluate the accuracy of several commercially available dose calculation algorithms, including Monte Carlo dose calculation for CyberKnife, the Analytical Anisotropic Algorithm and Pencil Beam convolution for the Eclipse planning system, and convolution-superposition for the Xio planning system. The phantom accommodated slabs of varying density; comparisons between planned and measured dose distributions were accomplished with radiochromic film. The Monte Carlo algorithm provided the most accurate agreement between planned and measured dose distributions. In each phantom irradiation, the Monte Carlo predictions resulted in gamma analysis comparisons >97%, using acceptance criteria of 3% dose and 3-mm distance to agreement. In general, the gamma analysis comparisons for the other algorithms were <95%. The Monte Carlo dose calculation algorithm for CyberKnife provides more accurate dose distribution calculations in regions of lateral electron disequilibrium than commercially available model-based algorithms. This is primarily because of the ability of Monte Carlo algorithms to implicitly account for tissue heterogeneities; density scaling functions and/or effective depth correction factors are not required.
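
    The 3%/3-mm gamma analysis used for these comparisons can be sketched in one dimension as follows. This is a generic, simplified gamma-index computation (global dose normalization, brute-force search over reference points), not the film-analysis pipeline of the study:

```python
import numpy as np

def gamma_1d(ref_pos, ref_dose, eval_pos, eval_dose, dose_crit=0.03, dta_mm=3.0):
    """Global 1-D gamma index: for each evaluated point, take the minimum
    combined dose-difference / distance-to-agreement metric over all
    reference points. A point passes the comparison when gamma <= 1."""
    dose_tol = dose_crit * ref_dose.max()  # global dose criterion (3% of max)
    gammas = np.empty(len(eval_pos))
    for n, (xe, de) in enumerate(zip(eval_pos, eval_dose)):
        g2 = ((de - ref_dose) / dose_tol) ** 2 + ((xe - ref_pos) / dta_mm) ** 2
        gammas[n] = np.sqrt(g2.min())
    return gammas
```

    A gamma pass rate such as the >97% quoted above is then simply `(gammas <= 1.0).mean()` over the evaluated profile.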

  6. The hybrid Monte Carlo Algorithm and the chiral transition

    International Nuclear Information System (INIS)

    Gupta, R.

    1987-01-01

    In this talk the author describes tests of the Hybrid Monte Carlo algorithm for QCD done in collaboration with Greg Kilcup and Stephen Sharpe. We find that the acceptance in the global Metropolis step for staggered fermions can be tuned and kept large without having to make the step size prohibitively small. We present results for the finite temperature transition on 4⁴ and 4 × 6³ lattices using this algorithm.
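
    As an illustration of the global Metropolis accept/reject step discussed here, the following is a toy Hybrid Monte Carlo sampler for a one-dimensional Gaussian target rather than lattice QCD; the leapfrog step size `eps` and trajectory length `L` play the role of the tunable parameters:

```python
import numpy as np

def hmc_sample(logp_grad, x0, n_samples, eps=0.2, L=5, rng=None):
    """Toy 1-D Hybrid Monte Carlo. logp_grad(x) returns (log p(x), d log p/dx)."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = float(x0)
    lp, g = logp_grad(x)
    samples, accepted = [], 0
    for _ in range(n_samples):
        p = rng.normal()                       # fresh momentum each trajectory
        x_new, g_new = x, g
        p_new = p + 0.5 * eps * g_new          # leapfrog: initial half step
        for i in range(L):
            x_new = x_new + eps * p_new        # full position step
            lp_new, g_new = logp_grad(x_new)
            if i != L - 1:
                p_new = p_new + eps * g_new    # full momentum step
        p_new = p_new + 0.5 * eps * g_new      # final half step
        # global Metropolis accept/reject on the total energy change
        h_old = -lp + 0.5 * p * p
        h_new = -lp_new + 0.5 * p_new * p_new
        if rng.random() < np.exp(h_old - h_new):
            x, lp, g = x_new, lp_new, g_new
            accepted += 1
        samples.append(x)
    return np.array(samples), accepted / n_samples

# standard normal target: log p(x) = -x^2/2 up to a constant
std_normal = lambda x: (-0.5 * x * x, -x)
```

    Because the leapfrog integrator nearly conserves energy, the acceptance rate stays high even for moderate step sizes, which is the tuning behaviour the talk reports for staggered fermions.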

  7. From patient deference towards negotiated and precarious informality: An Eliasian analysis of English general practitioners' understandings of changing patient relations.

    Science.gov (United States)

    Brown, Patrick; Elston, Mary Ann; Gabe, Jonathan

    2015-12-01

    This article contributes to sociological debates about trends in the power and status of medical professionals, focussing on claims that deferent patient relations are giving way to a more challenging consumerism. Analysing data from a mixed methods study involving general practitioners in England, we found some support for the idea that an apparent 'golden age' of patient deference is receding. Although not necessarily expressing nostalgia for such doctor-patient relationships, most GPs described experiencing disruptive or verbally abusive interactions at least occasionally and suggested that these were becoming more common. Younger doctors tended to rate patients as less respectful than their older colleagues but were also more likely to be egalitarian in attitude. Our data suggest that GPs, especially younger ones, tend towards a more informal yet limited engagement with their patients and with the communities in which they work. These new relations might be a basis for mutual respect between professionals and patients in the consulting room, but may also generate uncertainty and misunderstanding. Such shifts are understood through an Eliasian framework as the functional democratisation of patient-doctor relations via civilising processes, but with this shift existing alongside decivilising tendencies involving growing social distance across broader social figurations. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  8. A Problem-Reduction Evolutionary Algorithm for Solving the Capacitated Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Wanfeng Liu

    2015-01-01

    Full Text Available Assessment of the components of a solution helps provide useful information for an optimization problem. This paper presents a new population-based problem-reduction evolutionary algorithm (PREA) based on the assessment of solution components. An individual solution is regarded as being constructed from basic elements, and the concept of acceptability is introduced to evaluate them. The PREA consists of a searching phase and an evaluation phase. The acceptability of basic elements is calculated in the evaluation phase and passed to the searching phase. In the searching phase, for each individual solution, the original optimization problem is reduced to a new smaller-size problem. As the algorithm evolves, the number of common basic elements in the population increases until all individual solutions are exactly the same, which is taken to be the near-optimal solution of the optimization problem. The new algorithm is applied to a large variety of capacitated vehicle routing problems (CVRP) with up to nearly 500 customers. Experimental results show that the proposed algorithm has the advantages of fast convergence and robustness in solution quality over the comparative algorithms.

  9. ‘Humblewise’: Deference and Complaint in the Court of Requests

    Directory of Open Access Journals (Sweden)

    Liam J. Meyer

    2015-03-01

    Full Text Available When servants, laborers, and apprentices sued their masters for back wages or mistreatment in the Court of Requests they took advantage of the court’s doctrine of equity. Since these plaintiffs often lacked the strict written proofs required by common law, or were bound by unfair written contracts, they badly needed an equitable jurisdiction where fairness, extenuating circumstances, and broad social mores could overrule the letter of the law. The formal tropes of their Complaints negotiate the tension between these two conceptions of justice and reveal how that tension relates to early seventeenth century economic culture, where customary ideas about patronage and hierarchical obligations coexisted with emerging notions of self-interest and contractual equality. In appealing to the court with older but still vibrant discourses of social justice and mutual obligation, plaintiffs modulated their Complaints with expressions of deference and helplessness. Their pleadings therefore take the sophisticated rhetorical form of self-assertion articulated as abject submission. The documents are highly mediated by lawyers and institutional constraints, but nevertheless reveal subordinates tactically using expressions of weakness to elicit pathos and use the ideology of paternalism against their masters.

  10. Hybrid cryptosystem for image file using elgamal and double playfair cipher algorithm

    Science.gov (United States)

    Hardi, S. M.; Tarigan, J. T.; Safrina, N.

    2018-03-01

    In this paper, we present an implementation of image file encryption using hybrid cryptography. We chose the ElGamal algorithm to perform asymmetric encryption and Double Playfair for the symmetric encryption. Our objective is to show that these algorithms are capable of encrypting an image file with an acceptable running time and encrypted file size while maintaining the level of security. The application was built using the C# programming language and runs as a stand-alone desktop application under the Windows operating system. Our test shows that the system is capable of encrypting an image with a resolution of 500×500 to a size of 976 kilobytes with an acceptable running time.
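
    The paper's exact construction is not given in the abstract. A minimal sketch of the hybrid pattern (asymmetric ElGamal protecting a session key, a symmetric cipher protecting the bulk data) might look as follows, with toy textbook-sized parameters and a hash-derived XOR keystream standing in for Double Playfair:

```python
import hashlib
import secrets

# Toy ElGamal parameters (textbook-sized); a real system needs a large prime.
P, G = 2579, 2

def elgamal_keygen():
    x = secrets.randbelow(P - 2) + 1           # private key in [1, P-2]
    return x, pow(G, x, P)                     # (private, public)

def elgamal_encrypt(pub, m):
    k = secrets.randbelow(P - 2) + 1           # ephemeral key
    return pow(G, k, P), (m * pow(pub, k, P)) % P

def elgamal_decrypt(priv, c1, c2):
    # c1^(P-1-priv) is the modular inverse of c1^priv (Fermat's little theorem)
    return (c2 * pow(c1, P - 1 - priv, P)) % P

def sym_encrypt(session_key, data):
    # Stand-in for Double Playfair: any symmetric cipher keyed by the shared
    # session key fits here; we use a SHA-256-derived XOR keystream.
    stream = hashlib.sha256(str(session_key).encode()).digest()
    return bytes(b ^ stream[i % len(stream)] for i, b in enumerate(data))

def hybrid_encrypt(pub, data):
    session_key = secrets.randbelow(P - 2) + 1
    c1, c2 = elgamal_encrypt(pub, session_key)     # asymmetric: protect the key
    return c1, c2, sym_encrypt(session_key, data)  # symmetric: protect the data

def hybrid_decrypt(priv, c1, c2, ciphertext):
    session_key = elgamal_decrypt(priv, c1, c2)
    return sym_encrypt(session_key, ciphertext)    # XOR keystream is its own inverse
```

    The design point is the same as in the paper: the slow asymmetric cipher only ever encrypts a short session key, while the bulk image bytes go through the fast symmetric cipher.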

  11. Estrutura do capim-braquiária durante o diferimento da pastagem - doi: 10.4025/actascianimsci.v32i2.7922 Signalgrass structure during pasture deferment - doi: 10.4025/actascianimsci.v32i2.7922

    Directory of Open Access Journals (Sweden)

    Marcela Azevedo Magalhães

    2010-07-01

    Full Text Available O experimento foi realizado para avaliar o número de perfilhos, a massa de forragem e de seus componentes morfológicos em pastos de Brachiaria decumbens cv. Basilisk durante o diferimento da pastagem. Os tratamentos foram quatro períodos de diferimento (18, 46, 74 e 121 dias), a partir de 8/3/2004. O delineamento foi em blocos casualizados com duas repetições. Foram determinados os números de perfilhos vegetativos (PV), reprodutivos (PR) e mortos (PM), bem como as massas de lâmina foliar verde (MLV), colmo verde (MCV) e forragem morta (MFM). Durante o período de diferimento houve redução no número de PV (de 1.491 para 944 perfilhos m-2). Os números de PR e PM não foram influenciados pelo período de diferimento e suas médias foram 211 e 456 perfilhos m-2, respectivamente. O período de diferimento causou incremento nas MCV (de 2.965 para 4.877 kg ha-1 de massa seca) e MFM (2.324 para 4.823 kg ha-1 de massa seca), porém não influenciou a MLV (em média, 2.047 kg ha-1 de massa seca). Em Viçosa, Estado de Minas Gerais, o pasto de B. decumbens, adubado com nitrogênio e diferido no início de março, pode permanecer diferido por cerca de 70 dias para conciliar produção de forragem em quantidade com boa composição morfológica. This experiment was performed aiming to evaluate tiller population density, forage mass and its morphological components on pastures of Brachiaria decumbens cv. Basilisk during pasture deferment. The treatments encompassed four deferred grazing periods (18, 46, 74 and 121 days). A randomized block design with two replications was used. The numbers of vegetative tillers (VT), reproductive tillers (RT) and dead tillers (DT) in the pasture were determined. The masses of green leaf blade (GLBM), green stem (GSM) and dead forage (DFM) were also determined. There was a reduction in the number of VT (from 1,491 to 944 tillers m-2) during the deferment period. RT and DT numbers were not influenced by the deferment periods; their averages were 211 and 456 tillers m-2, respectively. The deferment period increased the green stem mass (from 2,965 to 4,877 kg ha-1 of dry mass) and the dead forage mass (from 2,324 to 4,823 kg ha-1 of dry mass), but did not influence the green leaf blade mass (on average, 2,047 kg ha-1 of dry mass). In Viçosa, Minas Gerais State, B. decumbens pasture fertilized with nitrogen and deferred at the beginning of March can remain deferred for about 70 days to reconcile forage production in quantity with good morphological composition.

  12. WDM Multicast Tree Construction Algorithms and Their Comparative Evaluations

    Science.gov (United States)

    Makabe, Tsutomu; Mikoshi, Taiju; Takenaka, Toyofumi

    We propose novel tree construction algorithms for multicast communication in photonic networks. Since multicast communications consume many more link resources than unicast communications, effective algorithms for route selection and wavelength assignment are required. We propose a novel tree construction algorithm, called the Weighted Steiner Tree (WST) algorithm and a variation of the WST algorithm, called the Composite Weighted Steiner Tree (CWST) algorithm. Because these algorithms are based on the Steiner Tree algorithm, link resources among source and destination pairs tend to be commonly used and link utilization ratios are improved. Because of this, these algorithms can accept many more multicast requests than other multicast tree construction algorithms based on the Dijkstra algorithm. However, under certain delay constraints, the blocking characteristics of the proposed Weighted Steiner Tree algorithm deteriorate since some light paths between source and destinations use many hops and cannot satisfy the delay constraint. In order to adapt the approach to the delay-sensitive environments, we have devised the Composite Weighted Steiner Tree algorithm comprising the Weighted Steiner Tree algorithm and the Dijkstra algorithm for use in a delay constrained environment such as an IPTV application. In this paper, we also give the results of simulation experiments which demonstrate the superiority of the proposed Composite Weighted Steiner Tree algorithm compared with the Distributed Minimum Hop Tree (DMHT) algorithm, from the viewpoint of the light-tree request blocking.
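
    The WST and CWST algorithms themselves are not reproduced in the abstract, but the Dijkstra-based construction they are compared against can be sketched as a shortest-path light-tree: run Dijkstra from the source and take the union of the shortest paths to all multicast destinations. This is a hypothetical minimal version that ignores wavelength assignment and delay constraints:

```python
import heapq

def dijkstra(adj, src):
    """adj: {node: {neighbor: weight}}; returns distances and predecessors."""
    dist, prev = {src: 0}, {}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return dist, prev

def shortest_path_tree(adj, src, dests):
    """Multicast tree as the union of shortest paths from src to each destination."""
    _, prev = dijkstra(adj, src)
    edges = set()
    for d in dests:
        v = d
        while v != src:  # walk predecessors back to the source
            edges.add((prev[v], v))
            v = prev[v]
    return edges
```

    A Steiner-style heuristic such as the proposed WST would instead weight links so that paths to different destinations share edges, improving link utilization over this per-destination construction.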

  13. Spiral-CT-angiography of acute pulmonary embolism: factors that influence the implementation into standard diagnostic algorithms

    International Nuclear Information System (INIS)

    Bankier, A.; Herold, C.J.; Fleischmann, D.; Janata-Schwatczek, K.

    1998-01-01

    Purpose: Debate about the potential implementation of Spiral-CT in diagnostic algorithms for pulmonary embolism is often focussed on sensitivity and specificity in the context of comparative methodologic studies. We intend to investigate whether additional factors might influence this debate. Results: The factors availability, acceptance, patient outcome, and cost-effectiveness studies have a substantial influence on the implementation of Spiral-CT in the diagnostic algorithms of pulmonary embolism. Incorporation of these factors into the discussion might lead to more flexible and more patient-oriented algorithms for the diagnosis of pulmonary embolism. Conclusion: Availability of equipment, acceptance among clinicians, patient outcome, and cost-effectiveness evaluations should be incorporated into the debate about the potential implementation of Spiral-CT in routine diagnostic imaging algorithms for pulmonary embolism. (orig./AJ) [de

  14. The decline of judicial deference to medical opinion in medical negligence litigation in Malaysia.

    Science.gov (United States)

    Kassim, Puteri Nemie J

    2008-06-01

    The decision of the Federal Court of Malaysia to abandon the Bolam principle in relation to a doctor's duty to disclose risks has clearly marked the decline of judicial deference to medical opinion in medical negligence litigation in Malaysia. It is undeniable that the Bolam principle has acted as a gatekeeper on the number of claims against medical practitioners. This has always been seen as necessary to protect society from the unwanted effects of defensive medicine. However, will these changes contribute significantly to the growth of medical negligence cases in Malaysia? This article will trace the development of the Bolam principle in medical negligence litigation in Malaysia since 1965 and analyse the influence of selected Commonwealth cases on that development. The implications of the Federal Court ruling will also be discussed.

  15. A hybrid Jaya algorithm for reliability-redundancy allocation problems

    Science.gov (United States)

    Ghavidel, Sahand; Azizivahed, Ali; Li, Li

    2018-04-01

    This article proposes an efficient improved hybrid Jaya algorithm based on time-varying acceleration coefficients (TVACs) and the learning phase introduced in teaching-learning-based optimization (TLBO), named the LJaya-TVAC algorithm, for solving various types of nonlinear mixed-integer reliability-redundancy allocation problems (RRAPs) and standard real-parameter test functions. RRAPs include series, series-parallel, complex (bridge) and overspeed protection systems. The search power of the proposed LJaya-TVAC algorithm for finding the optimal solutions is first tested on the standard real-parameter unimodal and multi-modal functions with dimensions of 30-100, and then tested on various types of nonlinear mixed-integer RRAPs. The results are compared with the original Jaya algorithm and the best results reported in the recent literature. The optimal results obtained with the proposed LJaya-TVAC algorithm provide evidence for its better and acceptable optimization performance compared to the original Jaya algorithm and other reported optimal results.
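
    For reference, the plain Jaya update that the LJaya-TVAC variant builds on moves each candidate toward the current best solution and away from the worst. A minimal sketch on a continuous test function (not the authors' hybrid, which adds TVACs and a TLBO-style learning phase) is:

```python
import numpy as np

def jaya(obj, bounds, pop_size=20, iters=200, rng=None):
    """Plain Jaya minimization: x' = x + r1*(best - |x|) - r2*(worst - |x|),
    with greedy acceptance of improving candidates only."""
    rng = np.random.default_rng(1) if rng is None else rng
    lo, hi = bounds
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([obj(x) for x in pop])
    for _ in range(iters):
        best = pop[fit.argmin()]
        worst = pop[fit.argmax()]
        for i in range(pop_size):
            r1, r2 = rng.random(dim), rng.random(dim)
            cand = pop[i] + r1 * (best - np.abs(pop[i])) - r2 * (worst - np.abs(pop[i]))
            cand = np.clip(cand, lo, hi)   # keep candidates inside the bounds
            f = obj(cand)
            if f < fit[i]:                 # greedy acceptance
                pop[i], fit[i] = cand, f
    return pop[fit.argmin()], fit.min()
```

    Jaya's appeal, which the article exploits, is that this update has no algorithm-specific control parameters beyond population size and iteration count.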

  16. CiSE: a circular spring embedder layout algorithm.

    Science.gov (United States)

    Dogrusoz, Ugur; Belviranli, Mehmet E; Dilek, Alptug

    2013-06-01

    We present a new algorithm for automatic layout of clustered graphs using a circular style. The algorithm tries to determine optimal location and orientation of individual clusters intrinsically within a modified spring embedder. Heuristics such as reversal of the order of nodes in a cluster and swap of neighboring node pairs in the same cluster are employed intermittently to further relax the spring embedder system, resulting in reduced inter-cluster edge crossings. Unlike other algorithms generating circular drawings, our algorithm does not require the quotient graph to be acyclic, nor does it sacrifice the edge crossing number of individual clusters to improve respective positioning of the clusters. Moreover, it reduces the total area required by a cluster by using the space inside the associated circle. Experimental results show that the execution time and quality of the produced drawings with respect to commonly accepted layout criteria are quite satisfactory, surpassing previous algorithms. The algorithm has also been successfully implemented and made publicly available as part of a compound and clustered graph editing and layout tool named CHISIO.

  17. Advanced metaheuristic algorithms for laser optimization

    International Nuclear Information System (INIS)

    Tomizawa, H.

    2010-01-01

    A laser is one of the most important experimental tools. In the synchrotron radiation field, lasers are widely used for experiments with pump-probe techniques. Especially for X-ray FELs, a laser has important roles as a seed light source or as a photocathode-illuminating light source to generate a high-brightness electron bunch. Control of laser pulse characteristics is required for many kinds of experiments. However, the laser must be tuned and customized for each requirement by laser experts. Automatic laser tuning therefore needs to be realized with sophisticated algorithms. Metaheuristic algorithms are useful candidates for finding a solution that is as close to the optimum as acceptable. A metaheuristic laser tuning system is expected to save human resources and time in laser preparation. I have shown successful results with a metaheuristic algorithm based on a genetic algorithm to optimize spatial (transverse) laser profiles, and with a hill-climbing method extended with fuzzy set theory to choose one of the best laser alignments automatically for each experimental requirement. (author)

  18. A tunable algorithm for collective decision-making.

    Science.gov (United States)

    Pratt, Stephen C; Sumpter, David J T

    2006-10-24

    Complex biological systems are increasingly understood in terms of the algorithms that guide the behavior of system components and the information pathways that link them. Much attention has been given to robust algorithms, or those that allow a system to maintain its functions in the face of internal or external perturbations. At the same time, environmental variation imposes a complementary need for algorithm versatility, or the ability to alter system function adaptively as external circumstances change. An important goal of systems biology is thus the identification of biological algorithms that can meet multiple challenges rather than being narrowly specified to particular problems. Here we show that emigrating colonies of the ant Temnothorax curvispinosus tune the parameters of a single decision algorithm to respond adaptively to two distinct problems: rapid abandonment of their old nest in a crisis and deliberative selection of the best available new home when their old nest is still intact. The algorithm uses a stepwise commitment scheme and a quorum rule to integrate information gathered by numerous individual ants visiting several candidate homes. By varying the rates at which they search for and accept these candidates, the ants yield a colony-level response that adaptively emphasizes either speed or accuracy. We propose such general but tunable algorithms as a design feature of complex systems, each algorithm providing elegant solutions to a wide range of problems.
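
    A toy model of such a tunable quorum rule can illustrate the speed-accuracy trade-off: scouts accept a visited site with probability equal to its quality, and the colony commits once a site's recruit count reaches a quorum threshold. The site qualities, visit model, and thresholds below are hypothetical, not Temnothorax data:

```python
import random

def emigrate(quorum, qualities=(0.8, 0.4), rng=None):
    """One emigration: scouts assess candidate nests one at a time, accepting
    a visited site with probability equal to its quality; the colony commits
    to the first site whose recruit count reaches the quorum."""
    rng = rng or random.Random(0)
    counts = [0, 0]
    steps = 0
    while True:
        steps += 1
        site = rng.randrange(2)             # a scout visits a random candidate
        if rng.random() < qualities[site]:  # per-visit acceptance rate
            counts[site] += 1
        if counts[site] >= quorum:
            return site, steps              # quorum met: colony commits

def speed_accuracy(quorum, trials=300):
    wins, total_steps = 0, 0
    for t in range(trials):
        site, steps = emigrate(quorum, rng=random.Random(t))
        wins += site == 0                   # site 0 is the better nest
        total_steps += steps
    return wins / trials, total_steps / trials
```

    Raising the quorum mimics the deliberative regime reported for intact nests (slower, more accurate), while lowering it mimics the crisis regime (faster, more error-prone).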

  19. An evaluation of solution algorithms and numerical approximation methods for modeling an ion exchange process

    Science.gov (United States)

    Bu, Sunyoung; Huang, Jingfang; Boyer, Treavor H.; Miller, Cass T.

    2010-07-01

    The focus of this work is on the modeling of an ion exchange process that occurs in drinking water treatment applications. The model formulation consists of a two-scale model in which a set of microscale diffusion equations representing ion exchange resin particles that vary in size and age are coupled through a boundary condition with a macroscopic ordinary differential equation (ODE), which represents the concentration of a species in a well-mixed reactor. We introduce a new age-averaged model (AAM) that averages all ion exchange particle ages for a given size particle to avoid the expensive Monte-Carlo simulation associated with previous modeling applications. We discuss two different numerical schemes to approximate both the original Monte-Carlo algorithm and the new AAM for this two-scale problem. The first scheme is based on the finite element formulation in space coupled with an existing backward difference formula-based ODE solver in time. The second scheme uses an integral equation based Krylov deferred correction (KDC) method and a fast elliptic solver (FES) for the resulting elliptic equations. Numerical results are presented to validate the new AAM algorithm, which is also shown to be more computationally efficient than the original Monte-Carlo algorithm. We also demonstrate that the higher order KDC scheme is more efficient than the traditional finite element solution approach and this advantage becomes increasingly important as the desired accuracy of the solution increases. We also discuss issues of smoothness, which affect the efficiency of the KDC-FES approach, and outline additional algorithmic changes that would further improve the efficiency of these developing methods for a wide range of applications.

  20. From Undocumented to DACAmented: Benefits and Limitations of the Deferred Action for Childhood Arrivals (DACA) Program, Three Years Following its Announcement

    OpenAIRE

    Patler, Caitlin; Cabrera, Jorge; Dream Team Los Angeles

    2015-01-01

    Announced by President Obama in June 2012, the Deferred Action for Childhood Arrivals (DACA) program offers eligible undocumented youth and young adults a reprieve from deportation and temporary work authorization. This study assesses DACA’s impacts on the educational and socioeconomic trajectories and health and wellbeing of young adults in Southern California, comparing DACA recipients with undocumented youth who do not have DACA status. The study took place 2.5 years after DACA’s initiatio...

  1. A Cultural Algorithm for Optimal Design of Truss Structures

    Directory of Open Access Journals (Sweden)

    Shahin Jalili

    Full Text Available A cultural algorithm was utilized in this study to solve the optimal design of truss structures problem, achieving the minimum weight objective under stress and deflection constraints. The algorithm is inspired by principles of human social evolution. It simulates the social interaction between people and their beliefs in a belief space. The Cultural Algorithm (CA) utilizes a belief space and a population space, which affect each other through acceptance and influence functions. The belief space of the CA consists of different knowledge components. In this paper, only situational and normative knowledge components are used within the belief space. The performance of the method is demonstrated through four benchmark design examples. Comparison of the obtained results with those of some previous studies demonstrates the efficiency of this algorithm.

  2. Treatment Algorithm for Ameloblastoma

    Directory of Open Access Journals (Sweden)

    Madhumati Singh

    2014-01-01

    Full Text Available Ameloblastoma is the second most common benign odontogenic tumour (Shafer et al. 2006), which constitutes 1–3% of all cysts and tumours of the jaw, with locally aggressive behaviour, a high recurrence rate, and malignant potential (Chaine et al. 2009). Various treatment algorithms for ameloblastoma have been reported; however, a universally accepted approach remains unsettled and controversial (Chaine et al. 2009). The treatment algorithm to be chosen depends on size (Escande et al. 2009; Sampson and Pogrel 1999), anatomical location (Feinberg and Steinberg 1996), histologic variant (Philipsen and Reichart 1998), and anatomical involvement (Jackson et al. 1996). In this paper, various such treatment modalities, including enucleation with peripheral osteotomy, partial maxillectomy, segmental resection with fibula graft reconstruction, and radical resection with rib graft reconstruction, are reviewed together with their recurrence rates through the study of five cases.

  3. Optimization of Straight Cylindrical Turning Using Artificial Bee Colony (ABC) Algorithm

    Science.gov (United States)

    Prasanth, Rajanampalli Seshasai Srinivasa; Hans Raj, Kandikonda

    2017-04-01

    Artificial bee colony (ABC) algorithm, which mimics the intelligent foraging behavior of honey bees, is increasingly gaining acceptance in the field of process optimization, as it is capable of handling nonlinearity, complexity and uncertainty. Straight cylindrical turning is a complex and nonlinear machining process which involves the selection of appropriate cutting parameters that affect the quality of the workpiece. This paper presents the estimation of optimal cutting parameters of the straight cylindrical turning process using the ABC algorithm. The ABC algorithm is first tested on four benchmark problems of numerical optimization and its performance is compared with the genetic algorithm (GA) and the ant colony optimization (ACO) algorithm. Results indicate that the rate of convergence of the ABC algorithm is better than that of GA and ACO. Then, the ABC algorithm is used to predict optimal cutting parameters such as cutting speed, feed rate, depth of cut and tool nose radius to achieve a good surface finish. Results indicate that the ABC algorithm estimated a comparable surface finish when compared with a real-coded genetic algorithm and a differential evolution algorithm.
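
    A minimal version of the ABC loop (employed, onlooker, and scout phases with greedy acceptance) can be sketched as follows on a generic continuous objective; the fitness mapping below assumes nonnegative costs and is not tied to the turning-parameter model of the paper:

```python
import numpy as np

def abc_minimize(obj, bounds, n_food=10, limit=20, iters=300, rng=None):
    """Minimal artificial bee colony minimizer over box bounds."""
    rng = np.random.default_rng(2) if rng is None else rng
    lo, hi = np.array(bounds[0], float), np.array(bounds[1], float)
    dim = lo.size
    foods = rng.uniform(lo, hi, (n_food, dim))
    cost = np.array([obj(x) for x in foods])
    trials = np.zeros(n_food, int)

    def try_improve(i):
        k = rng.integers(n_food)
        while k == i:
            k = rng.integers(n_food)
        j = rng.integers(dim)
        x = foods[i].copy()                 # perturb one dimension toward/away
        x[j] += rng.uniform(-1, 1) * (foods[i][j] - foods[k][j])
        x = np.clip(x, lo, hi)
        c = obj(x)
        if c < cost[i]:                     # greedy acceptance
            foods[i], cost[i], trials[i] = x, c, 0
        else:
            trials[i] += 1

    best_x, best_c = None, np.inf
    for _ in range(iters):
        for i in range(n_food):             # employed bee phase
            try_improve(i)
        fit = 1.0 / (1.0 + cost)            # fitness; assumes cost >= 0
        p = fit / fit.sum()
        for _ in range(n_food):             # onlooker bee phase
            try_improve(rng.choice(n_food, p=p))
        i = trials.argmax()                 # scout bee phase
        if trials[i] > limit:               # abandon an exhausted source
            foods[i] = rng.uniform(lo, hi, dim)
            cost[i], trials[i] = obj(foods[i]), 0
        i = cost.argmin()
        if cost[i] < best_c:
            best_x, best_c = foods[i].copy(), cost[i]
    return best_x, best_c
```

    In a turning application, the decision vector would hold cutting speed, feed rate, depth of cut, and tool nose radius, with the objective predicting surface roughness.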

  4. A Hybrid Optimization Algorithm for Low RCS Antenna Design

    Directory of Open Access Journals (Sweden)

    W. Shao

    2012-12-01

    Full Text Available In this article, a simple and efficient method is presented to design low radar cross section (RCS) patch antennas. The method consists of a hybrid optimization algorithm, which combines a genetic algorithm (GA) with a tabu search algorithm (TSA), and an electromagnetic field solver. The TSA, embedded into the GA frame, defines the acceptable neighborhood region of the parameters and screens out poor-scoring individuals. Thus, repeated searches are avoided and the amount of time-consuming electromagnetic simulation is largely reduced. Moreover, the whole design procedure is automatically controlled by a program in the VBScript language. A slot patch antenna example is provided to verify the accuracy and efficiency of the proposed method.

  5. Características estruturais e índice de tombamento de Brachiaria decumbens cv. Basilisk em pastagens diferidas Structural characteristics and falling index of Brachiaria decumbens cv. Basilisk on deferred pastures

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2009-04-01

    Full Text Available Objetivou-se avaliar o efeito dos períodos de diferimento e de pastejo sobre a densidade populacional de perfilhos, a massa dos componentes morfológicos da forragem e o índice de tombamento em pastagens de Brachiaria decumbens cv. Basilisk. Dois ensaios foram conduzidos: o primeiro denominado ano 1 e, o segundo, ano 2. Adotou-se o esquema de parcelas subdivididas, segundo o delineamento em blocos casualizados, com duas repetições para cada combinação dos períodos de diferimento da pastagem com os períodos de pastejo. No ano 1, os períodos de diferimento foram 103, 121, 146 e 163 dias; e no ano 2, foram 73, 103, 131 e 163 dias. Os períodos de pastejo foram 1, 29, 57 e 85 dias. O aumento do período de diferimento elevou a densidade populacional de perfilhos reprodutivos (ano 2: de 37 para 304 perfilhos/m²) e reduziu a de perfilhos vegetativos (ano 1: de 1.253 para 889 perfilhos/m²; ano 2: de 1.235 para 627 perfilhos/m²). Durante o período de pastejo, ocorreu diminuição no número de perfilhos vegetativos (ano 1: de 988 para 868 perfilhos/m²) e reprodutivos (ano 1: de 216 para 0 perfilhos/m²; ano 2: de 203 para 0 perfilhos/m²) e aumento no número de perfilhos mortos (ano 1: 463 para 1.088 perfilhos/m²; ano 2: de 341 para 1.010 perfilhos/m²). Pastagens sob maiores períodos de diferimento e de pastejo apresentaram maior massa de colmo morto (6.093 e 3.819 kg/ha de MS nos anos 1 e 2, respectivamente) e menor massa de lâmina foliar verde (341 e 177 kg/ha de MS nos anos 1 e 2, respectivamente). Pastos de Brachiaria decumbens cv. Basilisk, submetidos a longos períodos de diferimento e de pastejo possuem características estruturais desfavoráveis à produção animal. This work aimed to evaluate the effects of deferring and grazing periods on the tiller population density, morphological component mass of forage and falling index on Brachiaria decumbens cv. Basilisk pastures. Two assays were carried out: first year and second year.

  6. Parallel Computing Strategies for Irregular Algorithms

    Science.gov (United States)

    Biswas, Rupak; Oliker, Leonid; Shan, Hongzhang; Biegel, Bryan (Technical Monitor)

    2002-01-01

    Parallel computing promises several orders of magnitude increase in our ability to solve realistic computationally-intensive problems, but relies on their efficient mapping and execution on large-scale multiprocessor architectures. Unfortunately, many important applications are irregular and dynamic in nature, making their effective parallel implementation a daunting task. Moreover, with the proliferation of parallel architectures and programming paradigms, the typical scientist is faced with a plethora of questions that must be answered in order to obtain an acceptable parallel implementation of the solution algorithm. In this paper, we consider three representative irregular applications: unstructured remeshing, sparse matrix computations, and N-body problems, and parallelize them using various popular programming paradigms on a wide spectrum of computer platforms ranging from state-of-the-art supercomputers to PC clusters. We present the underlying problems, the solution algorithms, and the parallel implementation strategies. Smart load-balancing, partitioning, and ordering techniques are used to enhance parallel performance. Overall results demonstrate the complexity of efficiently parallelizing irregular algorithms.

  7. Early versus deferred treatment for smoldering multiple myeloma: a meta-analysis of randomized, controlled trials.

    Directory of Open Access Journals (Sweden)

    Minjie Gao

    Full Text Available Whether patients with smoldering multiple myeloma (SMM) need to receive early interventional treatment remains controversial. Herein, we conducted a meta-analysis comparing the efficacy and safety of early treatment over deferred treatment for patients with SMM. MEDLINE and the Cochrane Library were searched to May 2014 for randomized controlled trials (RCTs) that assessed the effect of early treatment over deferred treatment. The primary outcome measure was mortality; secondary outcome measures were progression, response rate, and adverse events. Overall, 5 trials including 449 patients were identified. There was a markedly reduced risk of disease progression with early treatment (Odds Ratio [OR] = 0.13, 95% confidence interval [CI] = 0.07 to 0.24). There were no significant differences in mortality and response rate (OR = 0.85, 95% CI = 0.45 to 1.60, and OR = 0.63, 95% CI = 0.32 to 1.23, respectively). More patients in the early treatment arm experienced gastrointestinal toxicities (OR = 10.02, 95% CI = 4.32 to 23.23), constipation (OR = 8.58, 95% CI = 3.20 to 23.00) and fatigue or asthenia (OR = 2.72, 95% CI = 1.30 to 5.67). No significant differences were seen with the development of acute leukemia (OR = 2.80, 95% CI = 0.42 to 18.81), hematologic cancer (OR = 2.07, 95% CI = 0.43 to 10.01), second primary tumors (OR = 3.45, 95% CI = 0.81 to 14.68), or vertebral compression (OR = 0.18, 95% CI = 0.02 to 1.59). Early treatment delayed disease progression but increased the risk of gastrointestinal toxicities, constipation, and fatigue or asthenia. The differences in vertebral compression, acute leukemia, hematological cancer and second primary tumors were not statistically significant. Based on the current evidence, early treatment did not significantly affect mortality or response rate. However, further, much larger trials are needed to provide more evidence.

  8. Eat now, pay later? Evidence of deferred food-processing costs in diving seals.

    Science.gov (United States)

    Sparling, Carol E; Fedak, Mike A; Thompson, Dave

    2007-02-22

    Seals may delay costly physiological processes (e.g. digestion) that are incompatible with the physiological adjustments to diving until after periods of active foraging. We present unusual profiles of metabolic rate (MR) in grey seals measured during long-term simulation of foraging trips (4-5 days) that provide evidence for this. We measured extremely high MRs (up to almost seven times the baseline levels) and high heart rates during extended surface intervals, where the seals were motionless at the surface. These occurred most often during the night, frequently many hours after the end of feeding bouts. The duration and amount of oxygen consumed above baseline levels during these events were correlated with the amount of food eaten, confirming that these metabolic peaks were related to the processing of food eaten during foraging periods earlier in the day. We suggest that these periods of high MR represent a payback of costs deferred during foraging.

  9. Dynamic Vehicle Routing Using an Improved Variable Neighborhood Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yingcheng Xu

    2013-01-01

    Full Text Available In order to effectively solve the dynamic vehicle routing problem with time windows, a mathematical model is established and an improved variable neighborhood search algorithm is proposed. In the algorithm, customer allocation and route planning for the initial solution are completed by a clustering method. Hybrid insert and exchange operators are used to implement the shaking process, a post-optimization process is introduced to improve the solution, and a best-improvement strategy is adopted, which allows the algorithm to achieve a better balance between solution quality and running time. The idea of simulated annealing is introduced to control the acceptance of new solutions, and the influences of arrival time, geographical distribution, and time window range on route selection are analyzed. In the experiments, the proposed algorithm is applied to DVRP instances of different sizes. Comparison with other algorithms shows that the proposed algorithm is effective and feasible.
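The shaking and acceptance logic described above can be sketched in a simplified form. This is not the authors' implementation: it works on a single route without time windows, uses the insert and exchange operators as shaking moves, and accepts only improving candidates (the clustering initialization and simulated-annealing acceptance are omitted). All names are illustrative.

```python
import random

def route_cost(route, dist):
    """Total distance of a route that starts and ends at depot 0."""
    path = [0] + route + [0]
    return sum(dist[a][b] for a, b in zip(path, path[1:]))

def insert_move(route):
    """Remove a customer and reinsert it at a random position."""
    r = route[:]
    c = r.pop(random.randrange(len(r)))
    r.insert(random.randrange(len(r) + 1), c)
    return r

def exchange_move(route):
    """Swap two customers (the 'exchange' operator)."""
    r = route[:]
    i, j = random.sample(range(len(r)), 2)
    r[i], r[j] = r[j], r[i]
    return r

def vns(route, dist, iters=500, seed=1):
    """Shake with insert/exchange; keep a candidate only if it improves."""
    random.seed(seed)
    best, best_cost = route[:], route_cost(route, dist)
    for _ in range(iters):
        for shake in (insert_move, exchange_move):
            cand = shake(best)
            cand_cost = route_cost(cand, dist)
            if cand_cost < best_cost:   # best-improvement style acceptance
                best, best_cost = cand, cand_cost
    return best, best_cost
```

A full DVRP-TW solver would additionally track vehicle capacities, time-window feasibility, and dynamically arriving requests.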

  10. Algorithm for the real-structure design of neutron supermirrors

    International Nuclear Information System (INIS)

    Pleshanov, N.K.

    2004-01-01

    The effect of structure imperfections of neutron supermirrors on their performance is well known. Nevertheless, supermirrors are designed with algorithms based on theories of reflection from perfect layered structures. In the present paper an approach is suggested in which the design of a supermirror is made on the basis of its real-structure model (the RSD algorithm) with the use of exact numerical methods. It allows taking the growth laws and the reflectance of real structures into account. The new algorithm was compared with the Gukasov-Ruban-Bedrizova (GRB) algorithm and with the most frequently used algorithm of Hayter and Mook (HM). Calculations showed that, when the parameters of the algorithms are chosen so that the supermirrors designed for a given angular acceptance m have the same number of bilayers, (a) for perfect layers the GRB, HM and RSD algorithms generate sequences of practically the same reflectance; (b) for real structures with rough interfaces and interdiffusion the GRB and HM algorithms generate sequences with an insufficient number of thinner layers, and the RSD algorithm turns out to be more responsive and efficient. The efficiency of the RSD algorithm increases for larger m. In addition, calculations have been carried out to demonstrate the effect of fabrication errors and absorption on the reflectance of Ni/Ti supermirrors.

  11. A particle identification technique for large acceptance spectrometers

    International Nuclear Information System (INIS)

    Cappuzzello, F.; Cavallaro, M.; Cunsolo, A.; Foti, A.; Carbone, D.; Orrigo, S.E.A.; Rodrigues, M.R.D.

    2010-01-01

    A technique to identify the heavy ions produced in nuclear reactions is presented. It is based on the use of a hybrid detector, which measures the energy loss, the residual energy, the position and angle of the ions at the focal plane of a magnetic spectrometer. The key point is the use of a powerful algorithm for the reconstruction of the ion trajectory, which makes the technique reliable even with large acceptance optical devices. Experimental results with the MAGNEX spectrometer show a remarkable resolution of about 1/160 in the mass parameter.

  12. A Novel Algorithm for Feature Level Fusion Using SVM Classifier for Multibiometrics-Based Person Identification

    Directory of Open Access Journals (Sweden)

    Ujwalla Gawande

    2013-01-01

    Full Text Available Recent times have witnessed many advancements in the fields of biometrics and multimodal biometrics. This is typically observed in the areas of security, privacy, and forensics. Even for the best of unimodal biometric systems, it is often not possible to achieve a higher recognition rate. Multimodal biometric systems overcome various limitations of unimodal biometric systems, such as nonuniversality, lower false acceptance, and higher genuine acceptance rates. More reliable recognition performance is achievable as multiple pieces of evidence of the same identity are available. The work presented in this paper is focused on a multimodal biometric system using fingerprint and iris. Distinct textural features of the iris and fingerprint are extracted using a Haar wavelet-based technique. A novel feature-level fusion algorithm is developed to combine these unimodal features using the Mahalanobis distance technique. A support-vector-machine-based learning algorithm is used to train the system using the features extracted. The performance of the proposed algorithms is validated and compared with other algorithms using the CASIA iris database and a real fingerprint database. From the simulation results, it is evident that our algorithm has a higher recognition rate and a much lower false rejection rate compared to existing approaches.
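As a rough illustration of feature-level fusion, the sketch below concatenates two unimodal feature vectors and scores a probe against enrolled templates with a Mahalanobis distance under a diagonal covariance, a common simplification of the full form. The paper's Haar-wavelet extraction and SVM training are not reproduced, and all names are hypothetical.

```python
def fuse(iris_feat, fp_feat):
    """Feature-level fusion: concatenate the two unimodal feature vectors."""
    return list(iris_feat) + list(fp_feat)

def mahalanobis_diag(x, mean, var):
    """Mahalanobis distance assuming a diagonal covariance
    (per-feature variances) -- a simplification of the full form."""
    return sum((xi - mi) ** 2 / vi for xi, mi, vi in zip(x, mean, var)) ** 0.5

def identify(probe, gallery, var):
    """Return the enrolled identity whose fused template is nearest to the probe."""
    return min(gallery, key=lambda pid: mahalanobis_diag(probe, gallery[pid], var))
```

In the paper, such distance-based fused features are then fed to an SVM classifier rather than a nearest-template rule.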

  13. Coming of Age on the Margins: Mental Health and Wellbeing Among Latino Immigrant Young Adults Eligible for Deferred Action for Childhood Arrivals (DACA)

    OpenAIRE

    Siemons, R; Raymond-Flesh, M; Auerswald, CL; Brindis, CD

    2017-01-01

    © 2016, Springer Science+Business Media New York. Undocumented immigrant young adults growing up in the United States face significant challenges. For those qualified, the Deferred Action for Childhood Arrivals (DACA) program’s protections may alleviate stressors, with implications for their mental health and wellbeing (MHWB). We conducted nine focus groups with 61 DACA-eligible Latinos (ages 18–31) in California to investigate their health needs. Participants reported MHWB as their greatest ...

  14. The potential benefits and drawbacks of deferring the decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    Nash, R.; Pomfret, D.G.

    2000-01-01

    The need to decommission redundant plants or sites at the end of operations is accepted throughout the industry, but the timing of such work requires careful consideration. In a world where political issues dominate, safety and business drivers are often only two of the inputs, and it is therefore imperative that industry is clear in its understanding of the different drivers that affect this issue. The issues are: 1. The formal positions adopted by governments and regulators are often directed specifically at water reactors. Should other types of facilities be constrained by the same policies? 2. Political drivers can be dominated by short- to medium-term considerations. Can we encourage long-term strategic planning? 3. The failings of the industry (especially accidents) and fear of nuclear weapons can often dominate public perception. Can industry counter these adverse associations? The key benefits relate to better risk management (especially if the radiological inventory was minimised at plant closure). Extending the decommissioning programme does allow dose reductions to take place, especially if short-lived isotopes are involved. Other benefits are that more time is allowed for optioneering and planning of the decommissioning processes. This time frame also gives the opportunity to develop new or better technologies. With proper control, political strategies can be developed and financial planning can take place. The concern 'Will there be enough money to fund decommissioning?' can become 'A little funding set aside early, and properly managed, can pay for decommissioning in the future'. Savings in dose, impact on the environment, and money can be achieved through a well-managed deferred programme. An extended programme of decommissioning demands the political will and the infrastructure to remain in place over a long period. Tied in to this is the need to keep money available for a long time, and to be able to retain and recover the relevant knowledge.

  15. Sensitivity of Calibrated Parameters and Water Resource Estimates on Different Objective Functions and Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Delaram Houshmand Kouchi

    2017-05-01

    Full Text Available The successful application of hydrological models relies on careful calibration and uncertainty analysis. However, there are many different calibration/uncertainty analysis algorithms, and each could be run with different objective functions. In this paper, we highlight the fact that each combination of optimization algorithm and objective function may lead to a different set of optimum parameters, while having the same performance; this makes the interpretation of dominant hydrological processes in a watershed highly uncertain. We used three different optimization algorithms (SUFI-2, GLUE, and PSO) and eight different objective functions (R2, bR2, NSE, MNS, RSR, SSQR, KGE, and PBIAS) in a SWAT model to calibrate the monthly discharges in two watersheds in Iran. The results show that all three algorithms, using the same objective function, produced acceptable calibration results, however with significantly different parameter ranges. Similarly, an algorithm using different objective functions also produced acceptable calibration results, but with different parameter ranges. The different calibrated parameter ranges consequently resulted in significantly different water resource estimates. Hence, the parameters and the outputs that they produce in a calibrated model are “conditioned” on the choices of the optimization algorithm and objective function. This adds another level of non-negligible uncertainty to watershed models, calling for more attention and investigation in this area.
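Three of the objective functions named above (NSE, PBIAS, and KGE) have compact closed forms. The minimal reference implementations below follow the standard definitions of these metrics rather than any code from this paper:

```python
def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean of obs."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

def pbias(obs, sim):
    """Percent bias: 0 means no systematic over- or underestimation."""
    return 100 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

def kge(obs, sim):
    """Kling-Gupta efficiency from correlation, variability ratio, and bias ratio."""
    n = len(obs)
    mo, ms = sum(obs) / n, sum(sim) / n
    so = (sum((o - mo) ** 2 for o in obs) / n) ** 0.5
    ss = (sum((s - ms) ** 2 for s in sim) / n) ** 0.5
    r = sum((o - mo) * (s - ms) for o, s in zip(obs, sim)) / (n * so * ss)
    return 1 - ((r - 1) ** 2 + (ss / so - 1) ** 2 + (ms / mo - 1) ** 2) ** 0.5
```

Because each metric weights errors differently (NSE emphasizes peaks, PBIAS volume, KGE balances correlation, variability, and bias), optimizing against different ones naturally pulls the calibrated parameters to different regions, which is the paper's central point.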

  16. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm

    Directory of Open Access Journals (Sweden)

    Levi Kitchen

    2016-06-01

    Full Text Available Introduction: Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. Discussion: The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaiting further evidence. Conclusion: When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient’s risk factors for both thrombus propagation and complications of anticoagulation. [West J Emerg Med. 2016;17(4):384-390.]

  17. Emergency Department Management of Suspected Calf-Vein Deep Venous Thrombosis: A Diagnostic Algorithm.

    Science.gov (United States)

    Kitchen, Levi; Lawrence, Matthew; Speicher, Matthew; Frumkin, Kenneth

    2016-07-01

    Unilateral leg swelling with suspicion of deep venous thrombosis (DVT) is a common emergency department (ED) presentation. Proximal DVT (thrombus in the popliteal or femoral veins) can usually be diagnosed and treated at the initial ED encounter. When proximal DVT has been ruled out, isolated calf-vein deep venous thrombosis (IC-DVT) often remains a consideration. The current standard for the diagnosis of IC-DVT is whole-leg vascular duplex ultrasonography (WLUS), a test that is unavailable in many hospitals outside normal business hours. When WLUS is not available from the ED, recommendations for managing suspected IC-DVT vary. The objectives of the study are to use current evidence and recommendations to (1) propose a diagnostic algorithm for IC-DVT when definitive testing (WLUS) is unavailable; and (2) summarize the controversy surrounding IC-DVT treatment. The Figure combines D-dimer testing with serial CUS or a single deferred FLUS for the diagnosis of IC-DVT. Such an algorithm has the potential to safely direct the management of suspected IC-DVT when definitive testing is unavailable. Whether or not to treat diagnosed IC-DVT remains widely debated and awaiting further evidence. When IC-DVT is not ruled out in the ED, the suggested algorithm, although not prospectively validated by a controlled study, offers an approach to diagnosis that is consistent with current data and recommendations. When IC-DVT is diagnosed, current references suggest that a decision between anticoagulation and continued follow-up outpatient testing can be based on shared decision-making. The risks of proximal progression and life-threatening embolization should be balanced against the generally more benign natural history of such thrombi, and an individual patient's risk factors for both thrombus propagation and complications of anticoagulation.

  18. Accepting or declining non-invasive ventilation or gastrostomy in amyotrophic lateral sclerosis: patients' perspectives.

    Science.gov (United States)

    Greenaway, L P; Martin, N H; Lawrence, V; Janssen, A; Al-Chalabi, A; Leigh, P N; Goldstein, L H

    2015-01-01

    The objective was to identify factors associated with decisions made by patients with amyotrophic lateral sclerosis (ALS) to accept or decline non-invasive ventilation (NIV) and/or gastrostomy in a prospective population-based study. Twenty-one people with ALS, recruited from the South-East ALS Register who made an intervention decision during the study timeframe underwent a face-to-face in-depth interview, with or without their informal caregiver present. Sixteen had accepted an intervention (11 accepted gastrostomy, four accepted NIV and one accepted both interventions). Five patients had declined gastrostomy. Thematic analysis revealed three main themes: (1) patient-centric factors (including perceptions of control, acceptance and need, and aspects of fear); (2) external factors (including roles played by healthcare professionals, family, and information provision); and (3) the concept of time (including living in the moment and the notion of 'right thing, right time'). Many aspects of these factors were inter-related. Decision-making processes for the patients were found to be complex and multifaceted and reinforce arguments for individualised (rather than 'algorithm-based') approaches to facilitating decision-making by people with ALS who require palliative interventions.

  19. Variants of Evolutionary Algorithms for Real-World Applications

    CERN Document Server

    Weise, Thomas; Michalewicz, Zbigniew

    2012-01-01

    Evolutionary Algorithms (EAs) are population-based, stochastic search algorithms that mimic natural evolution. Due to their ability to find excellent solutions for conventionally hard and dynamic problems within acceptable time, EAs have attracted interest from many researchers and practitioners in recent years. This book “Variants of Evolutionary Algorithms for Real-World Applications” aims to promote the practitioner’s view on EAs by providing a comprehensive discussion of how EAs can be adapted to the requirements of various applications in the real-world domains. It comprises 14 chapters, including an introductory chapter re-visiting the fundamental question of what an EA is and other chapters addressing a range of real-world problems such as production process planning, inventory system and supply chain network optimisation, task-based jobs assignment, planning for CNC-based work piece construction, mechanical/ship design tasks that involve runtime-intense simulations, data mining for the predictio...

  20. Automatic boiling water reactor control rod pattern design using particle swarm optimization algorithm and local search

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Cheng-Der, E-mail: jdwang@iner.gov.tw [Nuclear Engineering Division, Institute of Nuclear Energy Research, No. 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan, ROC (China); Lin, Chaung [National Tsing Hua University, Department of Engineering and System Science, 101, Section 2, Kuang Fu Road, Hsinchu 30013, Taiwan (China)

    2013-02-15

    Highlights: ► The PSO algorithm was adopted to automatically design a BWR CRP. ► The local search procedure was added to improve the result of PSO algorithm. ► The results show that the obtained CRP is the same good as that in the previous work. -- Abstract: This study developed a method for the automatic design of a boiling water reactor (BWR) control rod pattern (CRP) using the particle swarm optimization (PSO) algorithm. The PSO algorithm is more random compared to the rank-based ant system (RAS) that was used to solve the same BWR CRP design problem in the previous work. In addition, the local search procedure was used to make improvements after PSO, by adding the single control rod (CR) effect. The design goal was to obtain the CRP so that the thermal limits and shutdown margin would satisfy the design requirement and the cycle length, which is implicitly controlled by the axial power distribution, would be acceptable. The results showed that the same acceptable CRP found in the previous work could be obtained.
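A generic PSO kernel of the kind this study builds on can be sketched as follows. The reactor-specific objective (thermal limits, shutdown margin, cycle length) and the single-CR local search are beyond a short example, so a toy continuous objective stands in; all parameter values are illustrative.

```python
import random

def pso(f, dim, n_particles=20, iters=100, seed=0,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimal particle swarm: each velocity update pulls a particle toward
    its personal best and the global best found so far."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:          # update personal best ...
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:         # ... and possibly the global best
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

In the CRP application, a discrete encoding of rod positions replaces the continuous positions, and the result is polished by a local search over single control-rod moves.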

  1. Matching Theory for Channel Allocation in Cognitive Radio Networks

    Directory of Open Access Journals (Sweden)

    L. Cao

    2016-12-01

    Full Text Available For a cognitive radio network (CRN) in which a set of secondary users (SUs) competes for a limited number of channels (spectrum resources) belonging to primary users (PUs), channel allocation is a challenge that dominates the throughput and congestion of the network. In this paper, the channel allocation problem is first formulated as a 0-1 integer programming optimization, considering the overall utility of both the primary and the secondary system. Inspired by matching theory, a many-to-one matching game is used to remodel the channel allocation problem, and the corresponding PU-proposing deferred acceptance (PPDA) algorithm is proposed to yield a stable matching. We compare the performance and computational complexity of these two solutions. Numerical results demonstrate the efficiency of the proposed scheme and quantify its communication overhead.
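The deferred acceptance step can be illustrated with a standard many-to-one Gale-Shapley sketch. Note that the paper's PPDA has the PUs proposing; for brevity this generic version has SUs propose to channels, which tentatively hold their best proposers up to capacity. Complete preference lists are assumed and all identifiers are illustrative.

```python
def deferred_acceptance(su_prefs, ch_prefs, capacity):
    """SU-proposing deferred acceptance with channel capacities.
    su_prefs: SU -> ordered list of channels; ch_prefs: channel -> ordered
    list of SUs (complete lists assumed); capacity: channel -> quota."""
    rank = {c: {s: i for i, s in enumerate(prefs)} for c, prefs in ch_prefs.items()}
    next_choice = {s: 0 for s in su_prefs}   # index of each SU's next proposal
    held = {c: [] for c in ch_prefs}         # tentative assignments per channel
    free = [s for s in su_prefs]
    while free:
        s = free.pop()
        if next_choice[s] >= len(su_prefs[s]):
            continue                         # s exhausted its list: stays unmatched
        c = su_prefs[s][next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=lambda x: rank[c][x])   # channel keeps its best SUs
        if len(held[c]) > capacity[c]:
            free.append(held[c].pop())           # worst tentative SU is released
    return {c: sorted(held[c]) for c in held}
```

Deferral is what makes the outcome stable: a channel never commits until it has seen every proposal it will receive, so no SU-channel pair can profitably deviate.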

  2. Service Composition Instantiation Based on Cross-Modified Artificial Bee Colony Algorithm

    Institute of Scientific and Technical Information of China (English)

    Lei Huo; Zhiliang Wang

    2016-01-01

    The Internet of things (IoT) imposes new challenges on service composition, as it is difficult to manage the quick instantiation of a complex service from a growing number of dynamic candidate services. A cross-modified Artificial Bee Colony Algorithm (CMABC) is proposed to achieve the optimal solution services in an acceptable time and with high accuracy. Firstly, a web service instantiation model was established. Furthermore, to overcome the problem of a discrete and chaotic solution space, the global optimal solution was used to accelerate the convergence rate by imitating the crossover operation of the Genetic Algorithm (GA). The simulation experiment results show that CMABC exhibits faster convergence speed and better convergence accuracy than some other intelligent optimization algorithms.

  3. Bridging the Gap between Social Acceptance and Ethical Acceptability.

    Science.gov (United States)

    Taebi, Behnam

    2017-10-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological risk, particularly when we evaluate technologies with transnational and intergenerational risks. I argue that good governance of risky technology requires analyzing both social acceptance and ethical acceptability. Conceptually, these two notions are mostly complementary. Social acceptance studies are not capable of sufficiently capturing all the morally relevant features of risky technologies; ethical analyses do not typically include stakeholders' opinions, and they therefore lack the relevant empirical input for a thorough ethical evaluation. Only when carried out in conjunction are these two types of analysis relevant to national and international governance of risky technology. I discuss the Rawlsian wide reflective equilibrium as a method for marrying social acceptance and ethical acceptability. Although the rationale of my argument is broadly applicable, I will examine the case of multinational nuclear waste repositories in particular. This example will show how ethical issues may be overlooked if we focus only on social acceptance, and will provide a test case for demonstrating how the wide reflective equilibrium can help to bridge the proverbial acceptance-acceptability gap. © 2016 The Authors Risk Analysis published by Wiley Periodicals, Inc. on behalf of Society for Risk Analysis.

  4. An Improved Parallel DNA Algorithm of 3-SAT

    Directory of Open Access Journals (Sweden)

    Wei Liu

    2007-09-01

    Full Text Available There are many large-size and difficult computational problems in mathematics and computer science. For many of these problems, traditional computers cannot handle the mass of data in acceptable time frames; such problems are called NP problems. DNA computing is a means of solving a class of intractable computational problems in which the computing time grows exponentially with problem size. This paper proposes a parallel algorithm model for the universal 3-SAT problem based on the Adleman-Lipton model and applies biological operations to handle the mass of data in the solution space. In this manner, we can control the run time of the algorithm to be finite and approximately constant.
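For contrast with the DNA approach, a conventional brute-force sketch of 3-SAT makes the exponentially growing solution space explicit: it enumerates all 2^n assignments sequentially, whereas the biological operations of the Adleman-Lipton model filter them in parallel. This is an illustration, not the paper's encoding.

```python
from itertools import product

def solve_3sat(clauses, n_vars):
    """Brute-force SAT check over all 2^n assignments.
    A literal k > 0 means x_k; k < 0 means NOT x_k.
    Returns a satisfying assignment {var: bool} or None."""
    for bits in product([False, True], repeat=n_vars):
        assign = {i + 1: b for i, b in enumerate(bits)}
        # a clause is satisfied if any literal matches its polarity
        if all(any(assign[abs(l)] == (l > 0) for l in clause) for clause in clauses):
            return assign
    return None
```

The loop body is cheap, but the iteration count doubles with every added variable, which is exactly the growth the DNA model sidesteps by encoding every assignment in a separate strand.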

  5. An optimized outlier detection algorithm for jury-based grading of engineering design projects

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Espensen, Christina; Clemmensen, Line Katrine Harder

    2016-01-01

    This work characterizes and optimizes an outlier detection algorithm to identify potentially invalid scores produced by jury members while grading engineering design projects. The paper describes the original algorithm and the associated adjudication process in detail. The impact of the various...... (the base rule and the three additional conditions) play a role in the algorithm's performance and should be included in the algorithm. Because there is significant interaction between the base rule and the additional conditions, many acceptable combinations that balance the FPR and FNR can be found......, but no true optimum seems to exist. The performance of the best optimizations and the original algorithm are similar. Therefore, it should be possible to choose new coefficient values for jury populations in other cultures and contexts logically and empirically without a full optimization as long...

  6. Discrete size optimization of steel trusses using a refined big bang-big crunch algorithm

    Science.gov (United States)

    Hasançebi, O.; Kazemzadeh Azad, S.

    2014-01-01

    This article presents a methodology for design optimization of steel truss structures based on a refined big bang-big crunch (BB-BC) algorithm. It is shown that a standard formulation of the BB-BC algorithm occasionally falls short of producing acceptable solutions to problems from discrete size optimum design of steel trusses. A reformulation of the algorithm is proposed and implemented for design optimization of various discrete truss structures according to American Institute of Steel Construction Allowable Stress Design (AISC-ASD) specifications. Furthermore, the performance of the proposed BB-BC algorithm is compared to its standard version as well as other well-known metaheuristic techniques. The numerical results confirm the efficiency of the proposed algorithm in practical design optimization of truss structures.
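The big bang-big crunch idea (scatter candidates, then "crunch" them to a fitness-weighted center of mass with shrinking spread) can be sketched on a toy continuous objective. This is a generic BB-BC kernel, not the refined discrete-truss formulation of the article, and all parameters are illustrative.

```python
import random

def big_bang_big_crunch(f, dim, lo, hi, pop=30, iters=200, seed=0):
    """Generic BB-BC minimizer: each 'crunch' contracts the population to its
    fitness-weighted center of mass; each 'bang' scatters new candidates
    around that center with a spread that shrinks every cycle."""
    rng = random.Random(seed)
    cands = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    best, best_val = None, float("inf")
    for k in range(1, iters + 1):
        vals = [f(c) for c in cands]
        for c, v in zip(cands, vals):
            if v < best_val:
                best, best_val = c[:], v
        weights = [1.0 / (v + 1e-12) for v in vals]   # lower cost => more weight
        total = sum(weights)
        center = [sum(w * c[d] for w, c in zip(weights, cands)) / total
                  for d in range(dim)]
        spread = (hi - lo) / k                        # spread shrinks each cycle
        cands = [[min(hi, max(lo, center[d] + rng.gauss(0, 1) * spread))
                  for d in range(dim)] for _ in range(pop)]
    return best, best_val
```

The refinement studied in the article concerns how this contraction behaves on discrete section catalogs, where the standard kernel above can stall.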

  7. The Impact of Deferred Tax Assets, Discretionary Accrual, Leverage, Company Size and Tax Planning Onearnings Management Practices

    Directory of Open Access Journals (Sweden)

    Jacobus Widiatmoko

    2016-04-01

    Full Text Available The purpose of this study is to analyze and provide empirical evidence of the influence of deferred tax assets, discretionary accruals, leverage, company size, and tax planning on earnings management. Financial performance is an indicator that is required by company management to measure the effectiveness of company performance. This research used secondary data obtained from the annual reports published at www.idx.co.id and data from the Indonesian Capital Market Directory (ICMD). The population of the research is manufacturing companies listed on the Indonesia Stock Exchange from 2011-2013. Samples were selected by using the purposive sampling method. There are 208 observations that were examined by logistic regression analysis. The result shows that deferred tax assets have a negative and not significant effect on earnings management, discretionary accruals have a negative and not significant effect on earnings management, leverage has a negative and significant effect on earnings management, company size has a positive and significant effect on earnings management, and tax planning has a positive and not significant effect on earnings management. The purpose of this study is to analyze empirical evidence on the effect of deferred tax assets, discretionary accruals, leverage, company size, and tax planning on earnings management. Financial performance is an indicator for measuring the effectiveness of a company. This study uses secondary data obtained from www.idx.co.id as well as data from the Indonesian Capital Market Directory (ICMD). The population of this study is manufacturing companies listed on the IDX from 2011-2013. Samples were selected by purposive sampling. A total of 208 observations were tested with a logistic regression model.
The results of this study show that deferred tax assets have a negative and not significant effect on earnings management practices, and discretionary accruals have a negative and not significant effect on

  8. Comparison of machine learning algorithms for detecting coral reef

    Directory of Open Access Journals (Sweden)

    Eduardo Tusa

    2014-09-01

    Full Text Available (Received: 2014/07/31 - Accepted: 2014/09/23) This work focuses on developing a fast coral reef detector for use on an autonomous underwater vehicle (AUV). Fast detection secures the AUV's stabilization with respect to an area of reef as quickly as possible and prevents devastating collisions. We use the algorithm of Purser et al. (2009) because of its precision. This detector has two parts: feature extraction, which uses Gabor wavelet filters, and feature classification, which uses machine learning based on neural networks. Due to the long running time of the neural networks, we substitute a classification algorithm based on decision trees. We use a database of 621 images of coral reef in Belize (110 images for training and 511 images for testing). We implemented the bank of Gabor wavelet filters using C++ and the OpenCV library. We compared the accuracy and running time of 9 machine learning algorithms, which resulted in the selection of the decision tree algorithm. Our coral detector runs in 70 ms, compared to the 22 s required by the algorithm of Purser et al. (2009).

  9. BALL - biochemical algorithms library 1.3

    Directory of Open Access Journals (Sweden)

    Stöckel Daniel

    2010-10-01

    Full Text Available Abstract Background The Biochemical Algorithms Library (BALL) is a comprehensive rapid application development framework for structural bioinformatics. It provides an extensive C++ class library of data structures and algorithms for molecular modeling and structural bioinformatics. Using BALL as a programming toolbox not only greatly reduces application development times but also helps ensure stability and correctness by avoiding the error-prone reimplementation of complex algorithms, replacing them with calls into a library that has been well tested by a large number of developers. In the ten years since its original publication, BALL has seen a substantial increase in functionality and numerous other improvements. Results Here, we discuss BALL's current functionality and highlight the key additions and improvements: support for additional file formats, molecular edit functionality, new molecular mechanics force fields, novel energy minimization techniques, docking algorithms, and support for cheminformatics. Conclusions BALL is available for all major operating systems, including Linux, Windows, and MacOS X. It is available free of charge under the GNU Lesser General Public License (LGPL). Parts of the code are distributed under the GNU Public License (GPL). BALL is available as source code and binary packages from the project web site at http://www.ball-project.org. Recently, it has been accepted into the Debian project; integration into further distributions is currently being pursued.

  10. A New Fuzzy Harmony Search Algorithm Using Fuzzy Logic for Dynamic Parameter Adaptation

    Directory of Open Access Journals (Sweden)

    Cinthia Peraza

    2016-10-01

    Full Text Available In this paper, a new fuzzy harmony search algorithm (FHS) for solving optimization problems is presented. FHS is based on a recent method using fuzzy logic for dynamic adaptation of the harmony memory accepting (HMR) and pitch adjustment (PArate) parameters, which improves the convergence rate of the traditional harmony search algorithm (HS). The objective of the method is to dynamically adjust the parameters in the range from 0.7 to 1. The impact of using fixed parameters in the harmony search algorithm is discussed, and a strategy for efficiently tuning these parameters using fuzzy logic is presented. The FHS algorithm was successfully applied to different benchmark optimization problems. The results of simulation and comparison studies demonstrate the effectiveness and efficiency of the proposed approach.
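A minimal harmony search kernel with dynamically adapted HMR/PArate parameters is sketched below. A simple linear ramp over [0.7, 1.0] stands in for the paper's fuzzy controller, and all other parameter values are illustrative assumptions.

```python
import random

def harmony_search(f, dim, lo, hi, hms=10, iters=2000, seed=0):
    """Minimal harmony search. The memory-accepting and pitch-adjustment
    rates ramp linearly from 0.7 to 1.0, a stand-in for fuzzy adaptation."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [f(h) for h in memory]
    for t in range(iters):
        hmcr = par = 0.7 + 0.3 * t / iters      # dynamic parameters in [0.7, 1.0]
        new = []
        for d in range(dim):
            if rng.random() < hmcr:             # draw from harmony memory...
                x = memory[rng.randrange(hms)][d]
                if rng.random() < par:          # ...and possibly pitch-adjust it
                    x += rng.uniform(-0.05, 0.05) * (hi - lo)
            else:                               # or improvise a fresh value
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        val = f(new)
        worst = max(range(hms), key=lambda i: scores[i])
        if val < scores[worst]:                 # replace the worst harmony
            memory[worst], scores[worst] = new, val
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]
```

Raising both rates over the run shifts the search from exploration (fresh random values) toward exploitation (reuse and fine-tuning of stored harmonies), which is the convergence-rate effect the paper targets with its fuzzy rules.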

  11. Traffic sharing algorithms for hybrid mobile networks

    Science.gov (United States)

    Arcand, S.; Murthy, K. M. S.; Hafez, R.

    1995-01-01

    In a hybrid (terrestrial + satellite) mobile personal communications networks environment, a large size satellite footprint (supercell) overlays on a large number of smaller size, contiguous terrestrial cells. We assume that the users have either a terrestrial-only single mode terminal (SMT) or a terrestrial/satellite dual mode terminal (DMT), and the ratio of DMTs to the total terminals is defined as gamma. It is assumed that the call assignments to and handovers between terrestrial cells and satellite supercells take place in a dynamic fashion when necessary. The objectives of this paper are twofold: (1) to propose and define a class of traffic sharing algorithms to manage terrestrial and satellite network resources efficiently by handling call handovers dynamically, and (2) to analyze and evaluate the algorithms by maximizing the traffic load handling capability (defined in erl/cell) over a wide range of terminal ratios (gamma) given an acceptable range of blocking probabilities. Two of the algorithms (G & S) in the proposed class perform extremely well for a wide range of gamma.
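    The paper's figure of merit is offered load (erl/cell) against an acceptable blocking probability. For a single cell with n channels, blocking under the classical Erlang B model — a standard teletraffic formula, not one specific to this paper — is computed with the usual numerically stable recurrence:

    ```python
    def erlang_b(offered_load, channels):
        """Blocking probability B(a, n) via the Erlang B recurrence:
        B(a, 0) = 1;  B(a, n) = a*B(a, n-1) / (n + a*B(a, n-1))."""
        b = 1.0
        for n in range(1, channels + 1):
            b = offered_load * b / (n + offered_load * b)
        return b

    # e.g. 2 erlangs offered to 2 channels blocks 40% of calls
    print(erlang_b(2.0, 2))  # ≈ 0.4
    ```

    The recurrence avoids the factorials of the closed-form expression, so it stays accurate even for hundreds of channels.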

  12. Outcomes of Patients Presenting With Clinical Indices of Spontaneous Reperfusion in ST-Elevation Acute Coronary Syndrome Undergoing Deferred Angiography.

    Science.gov (United States)

    Fefer, Paul; Beigel, Roy; Atar, Shaul; Aronson, Doron; Pollak, Arthur; Zahger, Doron; Asher, Elad; Iakobishvili, Zaza; Shlomo, Nir; Alcalai, Ronny; Einhorn-Cohen, Michal; Segev, Amit; Goldenberg, Ilan; Matetzky, Shlomi

    2017-07-25

    Few data are available regarding the optimal management of ST-elevation myocardial infarction patients with clinically defined spontaneous reperfusion (SR). We report on the characteristics and outcomes of patients with SR in the primary percutaneous coronary intervention era, and assess whether immediate reperfusion can be deferred. Data were drawn from a prospective nationwide survey, ACSIS (Acute Coronary Syndrome Israeli Survey). Definition of SR was predefined as both (1) ≥70% reduction in ST-segment elevation on consecutive ECGs and (2) ≥70% resolution of pain. Of 2361 consecutive ST-elevation acute coronary syndrome patients in Killip class 1, 405 (17%) were not treated with primary reperfusion therapy because of SR. Intervention in SR patients was performed a median of 26 hours after admission. These patients were compared with the 1956 ST-elevation myocardial infarction patients who underwent primary reperfusion with a median door-to-balloon time of 66 minutes (interquartile range 38-106). Baseline characteristics were similar except for a slightly higher incidence of renal dysfunction and prior angina pectoris in SR patients. Time from symptom onset to medical contact was significantly greater in SR patients. Patients with SR had significantly less in-hospital heart failure (4% versus 11%) and cardiogenic shock (0% versus 2%) (P<0.01 for all). No significant differences were found in in-hospital mortality (1% versus 2%), 30-day major cardiac events (4% versus 4%), and mortality at 30 days (1% versus 2%) and 1 year (4% versus 4%). Patients with clinically defined SR have a favorable prognosis. Deferring immediate intervention seems to be safe in patients with clinical indices of spontaneous reperfusion. © 2017 The Authors. Published on behalf of the American Heart Association, Inc., by Wiley.

  13. Experimental Methods for the Analysis of Optimization Algorithms

    DEFF Research Database (Denmark)

    In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on different topics in algorithm design, statistical design, optimization and heuristics, and most chapters provide theoretical background and are enriched with case studies. This book is written for researchers and practitioners in operations research and computer science who wish to improve the experimental assessment...

  14. Genetic algorithms for protein threading.

    Science.gov (United States)

    Yadgari, J; Amir, A; Unger, R

    1998-01-01

    Despite many years of efforts, a direct prediction of protein structure from sequence is still not possible. As a result, in the last few years researchers have started to address the "inverse folding problem": identifying and aligning a sequence to the fold with which it is most compatible, a process known as "threading". In two meetings in which protein folding predictions were objectively evaluated, it became clear that threading as a concept promises a real breakthrough, but that much improvement is still needed in the technique itself. Threading is an NP-hard problem, and thus no general polynomial solution can be expected. Still, a practical approach with demonstrated ability to find optimal solutions in many cases, and acceptable solutions in other cases, is needed. We applied the technique of Genetic Algorithms in order to significantly improve the ability of threading algorithms to find the optimal alignment of a sequence to a structure, i.e. the alignment with the minimum free energy. A major progress reported here is the design of a representation of the threading alignment as a string of fixed length. With this representation, validation of alignments and genetic operators are effectively implemented. Appropriate data structures and parameters have been selected. It is shown that Genetic Algorithm threading is effective and is able to find the optimal alignment in a few test cases. Furthermore, the described algorithm is shown to perform well even without pre-definition of core elements. Existing threading methods are dependent on such constraints to make their calculations feasible. But the concept of core elements is inherently arbitrary and should be avoided if possible. While a rigorous proof is hard to submit yet, we present indications that indeed Genetic Algorithm threading is capable of consistently finding good solutions of full alignments in search spaces of size up to 10^70.
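    The abstract's key design decision is representing an alignment as a fixed-length string so that standard genetic operators apply directly. The free-energy function and validation rules are specific to the paper, so the sketch below substitutes a toy energy over fixed-length integer strings just to show the representation and the selection/crossover/mutation loop (all names and the objective are illustrative):

    ```python
    import random

    def toy_energy(s, target):
        """Stand-in for alignment free energy: distance to a known string."""
        return sum(abs(a - b) for a, b in zip(s, target))

    def ga_minimize(length=12, alphabet=8, pop_size=40, gens=200, seed=7):
        rng = random.Random(seed)
        target = [rng.randrange(alphabet) for _ in range(length)]
        energy = lambda s: toy_energy(s, target)
        # Fixed-length string representation: every individual has `length` genes.
        pop = [[rng.randrange(alphabet) for _ in range(length)] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=energy)
            survivors = pop[: pop_size // 2]           # truncation selection (elitist)
            children = []
            while len(survivors) + len(children) < pop_size:
                p1, p2 = rng.sample(survivors, 2)
                cut = rng.randrange(1, length)         # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.3:                 # point mutation
                    child[rng.randrange(length)] = rng.randrange(alphabet)
                children.append(child)
            pop = survivors + children
        return energy(min(pop, key=energy))

    print(ga_minimize())
    ```

    Because every individual is a fixed-length string, crossover and mutation never change the shape of a candidate, which is exactly what makes validation cheap in the paper's setting.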

  15. New Parallel Algorithms for Structural Analysis and Design of Aerospace Structures

    Science.gov (United States)

    Nguyen, Duc T.

    1998-01-01

    Subspace and Lanczos iterations have been developed, well documented, and widely accepted as efficient methods for obtaining p-lowest eigen-pair solutions of large-scale, practical engineering problems. The focus of this paper is to incorporate recent developments in vectorized sparse technologies in conjunction with Subspace and Lanczos iterative algorithms for computational enhancements. Numerical performance, in terms of accuracy and efficiency of the proposed sparse strategies for Subspace and Lanczos algorithm, is demonstrated by solving for the lowest frequencies and mode shapes of structural problems on the IBM-R6000/590 and SunSparc 20 workstations.

  16. SVC control enhancement applying self-learning fuzzy algorithm for islanded microgrid

    Directory of Open Access Journals (Sweden)

    Hossam Gabbar

    2016-03-01

    Full Text Available Maintaining voltage stability, within acceptable levels, for islanded Microgrids (MGs) is a challenge due to limited exchange power between generation and loads. This paper proposes an algorithm to enhance the dynamic performance of islanded MGs in the presence of load disturbance using a Static VAR Compensator (SVC) with a Fuzzy Model Reference Learning Controller (FMRLC). The proposed algorithm compensates MG nonlinearity via fuzzy membership functions and an inference mechanism embedded in both the controller and the inverse model. Hence, the MG keeps the desired performance as required at any operating condition. Furthermore, the self-learning capability of the proposed control algorithm compensates for grid parameter variations even with inadequate information about load dynamics. A reference model was designed to reject bus voltage disturbance with performance achievable by the proposed fuzzy controller. Three simulation scenarios have been presented to investigate the effectiveness of the proposed control algorithm in improving the steady-state and transient performance of islanded MGs. The first scenario was conducted without SVC, the second with SVC using a PID controller, and the third using the FMRLC algorithm. A comparison of the results shows the ability of the proposed control algorithm to enhance disturbance rejection due to the learning process.

  17. Responsible technology acceptance

    DEFF Research Database (Denmark)

    Toft, Madeleine Broman; Schuitema, Geertje; Thøgersen, John

    2014-01-01

    As a response to climate change and the desire to gain independence from imported fossil fuels, there is pressure to increase the proportion of electricity from renewable sources, which is one of the reasons why electricity grids are currently being turned into Smart Grids. In this paper, we focus on private consumers' acceptance of having Smart Grid technology installed in their home. We analyse acceptance in a combined framework of the Technology Acceptance Model and the Norm Activation Model. We propose that individuals are only likely to accept Smart Grid technology if they assess usefulness in terms of a positive impact for society and the environment. Therefore, we expect that Smart Grid technology acceptance can be better explained when the well-known technology acceptance parameters included in the Technology Acceptance Model are supplemented by moral norms as suggested by the Norm...

  18. Clustering Using Boosted Constrained k-Means Algorithm

    Directory of Open Access Journals (Sweden)

    Masayuki Okabe

    2018-03-01

    Full Text Available This article proposes a constrained clustering algorithm with performance competitive with, and computation time lower than, state-of-the-art methods; it consists of a constrained k-means algorithm enhanced by the boosting principle. Constrained k-means clustering, which uses constraints as background knowledge, although easy to implement and quick, has insufficient performance compared with metric learning-based methods. Since it simply adds a function into the data assignment process of the k-means algorithm to check for constraint violations, it often exploits only a small number of constraints. Metric learning-based methods, which exploit constraints to create a new metric for data similarity, have shown promising results, although the methods proposed so far are often slow depending on the amount of data or the number of feature dimensions. We present a method that exploits the advantages of the constrained k-means and metric learning approaches. It incorporates a mechanism for accepting constraint priorities and a metric learning framework based on the boosting principle into a constrained k-means algorithm. In the framework, a metric is learned in the form of a kernel matrix that integrates weak cluster hypotheses produced by the constrained k-means algorithm, which works as a weak learner under the boosting principle. Experimental results for 12 data sets from 3 data sources demonstrated that our method has performance competitive with that of state-of-the-art constrained clustering methods for most data sets and that it takes much less computation time. Experimental evaluation demonstrated the effectiveness of controlling the constraint priorities by using the boosting principle and that our constrained k-means algorithm functions correctly as a weak learner of boosting.
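    The "function in the data assignment process that checks for constraint violations" that the abstract describes is the classic COP-k-means idea: each point goes to its nearest centroid that does not break a must-link or cannot-link constraint with points already assigned. A one-pass assignment sketch (not the authors' boosted version; names are illustrative):

    ```python
    def violates(i, cluster, labels, must_link, cannot_link):
        """True if putting point i into `cluster` breaks a constraint
        against any point that already has a label."""
        for a, b in must_link:
            if i == a and labels.get(b) is not None and labels[b] != cluster:
                return True
            if i == b and labels.get(a) is not None and labels[a] != cluster:
                return True
        for a, b in cannot_link:
            if i == a and labels.get(b) == cluster:
                return True
            if i == b and labels.get(a) == cluster:
                return True
        return False

    def constrained_assign(points, centroids, must_link=(), cannot_link=()):
        """One assignment pass of COP-k-means over 1-D points."""
        labels = {}
        for i, p in enumerate(points):
            # Centroids ordered by distance; take the first feasible one.
            order = sorted(range(len(centroids)), key=lambda c: abs(p - centroids[c]))
            for c in order:
                if not violates(i, c, labels, must_link, cannot_link):
                    labels[i] = c
                    break
            else:
                labels[i] = order[0]  # no feasible cluster: fall back to nearest
        return [labels[i] for i in range(len(points))]

    # A cannot-link constraint forces the two nearby points 0.0 and 0.1 apart.
    print(constrained_assign([0.0, 0.1, 5.0, 5.1], [0.05, 5.05],
                             cannot_link=[(0, 1)]))  # → [0, 1, 1, 1]
    ```

    The paper's point is that this check alone wastes most constraints; their boosting framework turns the same weak assignment step into a learned kernel metric.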

  19. Trilateral market coupling. Algorithm appendix

    International Nuclear Information System (INIS)

    2006-03-01

    each local market. The Market Coupling algorithm provides as an output for each market: The set of accepted Block Orders; The Net Position for each Settlement Period of the following day; and The price (MCP) for each Settlement Period of the following day. The results of the Market Coupling algorithm are consistent with a number of 'High Level Properties'. The High Level Properties can be divided into two subsets: Market Coupling High Level Properties (constraints that the Market Results fulfill for each Settlement Period), and Exchanges High Level Properties (constraints that the Market Results must fulfill for each Settlement Period. They reflect the requirements of individual participants trading on the exchanges). Using the ATCs and NECs, the Market Coupling algorithm can determine for each Settlement Period the Price and Net Position of each market. A NEC is built for a given set of accepted Block Orders (Winning Subset). When a set of NECs is used to determine the prices and Net Positions of each market, the set of prices returned for each market may very well not be compatible with this assumed Winning Subset. The Winning Subset needs to be updated and the calculations run again with the derived new NEC. This procedure must be repeated until a stable solution is found. As a consequence, the Market Coupling algorithm involves iterations between two modules: The Coordination Module which is in charge of the centralized computations; The Block Selector of each exchange which performs the decentralized computations. The iterative nature of the algorithm derives from the treatment of Block Orders. The data flows and calculations of the iterative algorithm are described in the rest of the document
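    The iteration the appendix describes — assume a Winning Subset of Block Orders, compute prices from the resulting NECs, re-select blocks against those prices, and repeat until the set stops changing — can be sketched with a deliberately simplified single-market price rule. The linear price function and the order data below are invented for illustration; the real algorithm couples three markets through ATCs and NECs:

    ```python
    def clearing_price(accepted, base_price=50.0, slope=1.0):
        """Toy price rule: each accepted sell block depresses the price."""
        return base_price - slope * sum(vol for _, vol in accepted)

    def couple_blocks(blocks, max_iters=100):
        """Fixed-point iteration over the winning subset of block orders.
        A sell block (limit, volume) is accepted iff limit <= resulting price."""
        accepted = set(blocks)  # optimistic start: every block wins
        for _ in range(max_iters):
            price = clearing_price(accepted)
            new_accepted = {b for b in blocks if b[0] <= price}
            if new_accepted == accepted:
                return accepted, price  # stable winning subset found
            accepted = new_accepted
        raise RuntimeError("no stable winning subset within iteration limit")

    blocks = {(30.0, 10.0), (48.0, 10.0)}   # (price limit, volume)
    winners, mcp = couple_blocks(blocks)
    print(sorted(winners), mcp)             # → [(30.0, 10.0)] 40.0
    ```

    The instability the appendix warns about is visible even here: accepting both blocks pushes the price below the second block's limit, so the Winning Subset must be revised and the price recomputed until the two are mutually consistent.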

  20. Trilateral market coupling. Algorithm appendix

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-03-15

    participants in each local market. The Market Coupling algorithm provides as an output for each market: The set of accepted Block Orders; The Net Position for each Settlement Period of the following day; and The price (MCP) for each Settlement Period of the following day. The results of the Market Coupling algorithm are consistent with a number of 'High Level Properties'. The High Level Properties can be divided into two subsets: Market Coupling High Level Properties (constraints that the Market Results fulfill for each Settlement Period), and Exchanges High Level Properties (constraints that the Market Results must fulfill for each Settlement Period. They reflect the requirements of individual participants trading on the exchanges). Using the ATCs and NECs, the Market Coupling algorithm can determine for each Settlement Period the Price and Net Position of each market. A NEC is built for a given set of accepted Block Orders (Winning Subset). When a set of NECs is used to determine the prices and Net Positions of each market, the set of prices returned for each market may very well not be compatible with this assumed Winning Subset. The Winning Subset needs to be updated and the calculations run again with the derived new NEC. This procedure must be repeated until a stable solution is found. As a consequence, the Market Coupling algorithm involves iterations between two modules: The Coordination Module which is in charge of the centralized computations; The Block Selector of each exchange which performs the decentralized computations. The iterative nature of the algorithm derives from the treatment of Block Orders. The data flows and calculations of the iterative algorithm are described in the rest of the document.

  1. Acceptance of Others, Feeling of Being Accepted and Striving for Being Accepted Among the Representatives of Different Kinds of Occupations

    Directory of Open Access Journals (Sweden)

    Gergana Stanoeva

    2012-05-01

    Full Text Available This paper deals with an important issue related to the human attitudes and needs in interpersonal and professional aspects. The theoretical part deals with several psychological components of the self-esteem and esteem of others – acceptance of others, feeling of being accepted, need for approval. Some gender differences in manifestations of acceptance and feeling of being accepted at the workplace are discussed. This article presents some empirical data for the degree of acceptance of others, feeling of being accepted and striving for being accepted among the representatives of helping, pedagogical, administrative and economic occupations, as well as non-qualified workers. The goals of the study were to reveal the interdependency between these constructs and to find significant differences between the representatives of the four groups of occupations. The methods of the first study were W. Fey's scales "Acceptance of others" and "How do I feel accepted by others". The method of the second study was the Crowne and Marlowe Scale for Social Desirability. The results indicated some significant differences in acceptance of others and feeling of being accepted between the non-qualified workers and the representatives of helping, administrative and economic occupations. There was no significant difference in striving for being accepted between the four occupational groups.

  2. Acceptable levels of digital image compression in chest radiology

    International Nuclear Information System (INIS)

    Smith, I.

    2000-01-01

    The introduction of picture archiving and communication systems (PACS) and teleradiology has prompted an examination of techniques that optimize the storage capacity and speed of digital storage and distribution networks. The general acceptance of the move to replace conventional screen-film capture with computed radiography (CR) is an indication that clinicians within the radiology community are willing to accept images that have been 'compressed'. The question to be answered, therefore, is what level of compression is acceptable. The purpose of the present study is to provide an assessment of the ability of a group of imaging professionals to determine whether an image has been compressed. To undertake this study, a single mobile chest image, selected for the presence of some subtle pathology in the form of a number of septal lines in both costophrenic angles, was compressed to levels of 10:1, 20:1 and 30:1. These images were randomly ordered and shown to the observers for interpretation. Analysis of the responses indicates that in general it was not possible to distinguish the original image from its compressed counterparts. Furthermore, a preference appeared to be shown for images that have undergone low levels of compression. This preference can most likely be attributed to the 'de-noising' effect of the compression algorithm at low levels. Copyright (1999) Blackwell Science Pty. Ltd

  3. Backtracking algorithm for lepton reconstruction with HADES

    International Nuclear Information System (INIS)

    Sellheim, P

    2015-01-01

    The High Acceptance Di-Electron Spectrometer (HADES) at the GSI Helmholtzzentrum für Schwerionenforschung investigates dilepton and strangeness production in elementary and heavy-ion collisions. In April - May 2012 HADES recorded 7 billion Au+Au events at a beam energy of 1.23 GeV/u with the highest multiplicities measured so far. The track reconstruction and particle identification in the high track density environment are challenging. The most important detector component for lepton identification is the Ring Imaging Cherenkov detector. Its main purpose is the separation of electrons and positrons from the large background of charged hadrons produced in heavy-ion collisions. In order to improve lepton identification, a backtracking algorithm was developed. In this contribution we will show the results of the algorithm compared to the currently applied method for e+/- identification. Efficiency and purity of a reconstructed e+/- sample will be discussed as well. (paper)

  4. Analysis of blood donor pre-donation deferral in Dubai: characteristics and reasons.

    Science.gov (United States)

    Al Shaer, Laila; Sharma, Ranjita; AbdulRahman, Mahera

    2017-01-01

    To ensure an adequate and safe blood supply, it is crucial to select suitable donors according to stringent eligibility criteria. Understanding the reasons for donor deferral can help in planning more efficient recruitment strategies and evaluating donor selection criteria. This study aims to define donor pre-donation deferral rates, causes of deferral, and characteristics of deferred donors in Dubai. This retrospective study was conducted on all donors who presented for allogeneic blood donation between January 1, 2010, until June 30, 2013, in Dubai Blood Donation Centre, accredited by the American Association of Blood Banks. The donation and deferral data were analyzed to determine the demographic characteristics of accepted and deferred donors, and frequency analyses were also conducted. Among 142,431 individuals presenting during the study period, 114,827 (80.6%) were accepted for donation, and 27,604 (19.4%) were deferred. The overall proportion of deferrals was higher among individuals less than 21 years old (35%). The deferral rate in Dubai is relatively high compared to internationally reported rates. This rate was higher among first-time donors and females, with low hemoglobin as the major factor leading to a temporary deferral of donors. Strategies to mitigate deferral and improve blood donor retention are urged in Dubai to avoid additional stress on the blood supply.

  5. Multi-scale graph-cut algorithm for efficient water-fat separation.

    Science.gov (United States)

    Berglund, Johan; Skorpil, Mikael

    2017-09-01

    To improve the accuracy and robustness to noise in water-fat separation by unifying the multiscale and graph cut based approaches to B0 correction. A previously proposed water-fat separation algorithm that corrects for B0 field inhomogeneity in 3D by a single quadratic pseudo-Boolean optimization (QPBO) graph cut was incorporated into a multi-scale framework, where field map solutions are propagated from coarse to fine scales for voxels that are not resolved by the graph cut. The accuracy of the single-scale and multi-scale QPBO algorithms was evaluated against benchmark reference datasets. The robustness to noise was evaluated by adding noise to the input data prior to water-fat separation. Both algorithms achieved the highest accuracy when compared with seven previously published methods, while computation times were acceptable for implementation in clinical routine. The multi-scale algorithm was more robust to noise than the single-scale algorithm, while causing only a small increase (+10%) of the reconstruction time. The proposed 3D multi-scale QPBO algorithm offers accurate water-fat separation, robustness to noise, and fast reconstruction. The software implementation is freely available to the research community. Magn Reson Med 78:941-949, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  6. Experimental methods for the analysis of optimization algorithms

    CERN Document Server

    Bartz-Beielstein, Thomas; Paquete, Luis; Preuss, Mike

    2010-01-01

    In operations research and computer science it is common practice to evaluate the performance of optimization algorithms on the basis of computational results, and the experimental approach should follow accepted principles that guarantee the reliability and reproducibility of results. However, computational experiments differ from those in other sciences, and the last decade has seen considerable methodological research devoted to understanding the particular features of such experiments and assessing the related statistical methods. This book consists of methodological contributions on diffe

  7. CAMPAIGN: an open-source library of GPU-accelerated data clustering algorithms.

    Science.gov (United States)

    Kohlhoff, Kai J; Sosnick, Marc H; Hsu, William T; Pande, Vijay S; Altman, Russ B

    2011-08-15

    Data clustering techniques are an essential component of a good data analysis toolbox. Many current bioinformatics applications are inherently compute-intense and work with very large datasets. Sequential algorithms are inadequate for providing the necessary performance. For this reason, we have created Clustering Algorithms for Massively Parallel Architectures, Including GPU Nodes (CAMPAIGN), a central resource for data clustering algorithms and tools that are implemented specifically for execution on massively parallel processing architectures. CAMPAIGN is a library of data clustering algorithms and tools, written in 'C for CUDA' for Nvidia GPUs. The library provides up to two orders of magnitude speed-up over respective CPU-based clustering algorithms and is intended as an open-source resource. New modules from the community will be accepted into the library and the layout of it is such that it can easily be extended to promising future platforms such as OpenCL. Releases of the CAMPAIGN library are freely available for download under the LGPL from https://simtk.org/home/campaign. Source code can also be obtained through anonymous subversion access as described on https://simtk.org/scm/?group_id=453. kjk33@cantab.net.

  8. Comprehensive preference optimization of an irreversible thermal engine using pareto based mutable smart bee algorithm and generalized regression neural network

    DEFF Research Database (Denmark)

    Mozaffari, Ahmad; Gorji-Bandpy, Mofid; Samadian, Pendar

    2013-01-01

    Optimizing and controlling of complex engineering systems is a phenomenon that has attracted an incremental interest of numerous scientists. Until now, a variety of intelligent optimizing and controlling techniques such as neural networks, fuzzy logic, game theory, support vector machines...... and stochastic algorithms have been proposed to facilitate controlling of the engineering systems. In this study, an extended version of the mutable smart bee algorithm (MSBA) called Pareto based mutable smart bee (PBMSB) is proposed to cope with multi-objective problems. Besides, a set of benchmark problems and four...... well-known Pareto based optimizing algorithms, i.e. the multi-objective bee algorithm (MOBA), multi-objective particle swarm optimization (MOPSO) algorithm, non-dominated sorting genetic algorithm (NSGA-II), and strength Pareto evolutionary algorithm (SPEA 2), are utilized to confirm the acceptable...

  9. Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society.

    Science.gov (United States)

    Lerario, Antonio C; Chacra, Antonio R; Pimazoni-Netto, Augusto; Malerbi, Domingos; Gross, Jorge L; Oliveira, José Ep; Gomes, Marilia B; Santos, Raul D; Fonseca, Reine Mc; Betti, Roberto; Raduan, Roberto

    2010-06-08

    The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and of implementing a new way of elaborating SBD Position Statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed from a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009. An additional source used as a basis for the new algorithm was an assessment of the acceptability of controversial arguments published in the international literature, through a panel of renowned Brazilian specialists. Thirty controversial arguments in diabetes have been selected with their respective references, where each argument was assessed and scored according to its acceptability level and the personal conviction of each member of the evaluation panel. This methodology was adapted using an approach similar to the one adopted in the recent position statement by the American College of Cardiology on coronary revascularization, in which not only cardiologists took part, but also specialists of other related areas.

  10. Algorithm for the treatment of type 2 diabetes: a position statement of Brazilian Diabetes Society

    Directory of Open Access Journals (Sweden)

    Lerario Antonio C

    2010-06-01

    Full Text Available Abstract The Brazilian Diabetes Society is starting an innovative project of quantitative assessment of medical arguments and of implementing a new way of elaborating SBD Position Statements. The final aim of this particular project is to propose a new Brazilian algorithm for the treatment of type 2 diabetes, based on the opinions of endocrinologists surveyed from a poll conducted on the Brazilian Diabetes Society website regarding the latest algorithm proposed by the American Diabetes Association/European Association for the Study of Diabetes, published in January 2009. An additional source used as a basis for the new algorithm was an assessment of the acceptability of controversial arguments published in the international literature, through a panel of renowned Brazilian specialists. Thirty controversial arguments in diabetes have been selected with their respective references, where each argument was assessed and scored according to its acceptability level and the personal conviction of each member of the evaluation panel. This methodology was adapted using an approach similar to the one adopted in the recent position statement by the American College of Cardiology on coronary revascularization, in which not only cardiologists took part, but also specialists of other related areas.

  11. Energy functions for regularization algorithms

    Science.gov (United States)

    Delingette, H.; Hebert, M.; Ikeuchi, K.

    1991-01-01

    Regularization techniques are widely used for inverse problem solving in computer vision such as surface reconstruction, edge detection, or optical flow estimation. Energy functions used for regularization algorithms measure how smooth a curve or surface is, and to render acceptable solutions these energies must satisfy certain properties such as invariance with Euclidean transformations or invariance with parameterization. The notion of smoothness energy is extended here to the notion of a differential stabilizer, and it is shown that to avoid the systematic underestimation of curvature for planar curve fitting, it is necessary that circles be the curves of maximum smoothness. A set of stabilizers is proposed that meet this condition as well as invariance with rotation and parameterization.
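    The requirement that circles be the curves of maximum smoothness can be checked numerically with a discrete curvature estimate: the Menger curvature of three consecutive samples, 4·Area / (product of the three side lengths), equals exactly 1/R everywhere on a circle of radius R. A small sketch (the sampling scheme and helper names are illustrative, not from the paper):

    ```python
    import math

    def menger_curvature(p, q, r):
        """Curvature of the circle through three points: 4*Area / (|pq||qr||rp|)."""
        (x1, y1), (x2, y2), (x3, y3) = p, q, r
        area2 = abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))  # 2*Area
        a = math.dist(p, q)
        b = math.dist(q, r)
        c = math.dist(r, p)
        return 2.0 * area2 / (a * b * c)

    # Sample a circle of radius 2: the discrete curvature is 1/2 at every triple.
    R = 2.0
    pts = [(R * math.cos(t), R * math.sin(t))
           for t in [i * 2 * math.pi / 36 for i in range(36)]]
    ks = [menger_curvature(pts[i - 1], pts[i], pts[i + 1]) for i in range(1, 35)]
    print(max(abs(k - 1 / R) for k in ks))  # ≈ 0 (up to floating point)
    ```

    A stabilizer that assigns circles zero smoothness penalty therefore leaves this constant-curvature solution unbiased, which is the paper's criterion for avoiding systematic curvature underestimation.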

  12. ZAP: a distributed channel assignment algorithm for cognitive radio networks

    OpenAIRE

    Junior , Paulo Roberto ,; Fonseca , Mauro; Munaretto , Anelise; Viana , Aline ,; Ziviani , Artur

    2011-01-01

    Abstract We propose ZAP, an algorithm for the distributed channel assignment in cognitive radio (CR) networks. CRs are capable of identifying underutilized licensed bands of the spectrum, allowing their reuse by secondary users without interfering with primary users. In this context, efficient channel assignment is challenging as ideally it must be simple, incur acceptable communication overhead, provide timely response, and be adaptive to accommodate frequent changes in the network. Another ...

  13. COOBBO: A Novel Opposition-Based Soft Computing Algorithm for TSP Problems

    Directory of Open Access Journals (Sweden)

    Qingzheng Xu

    2014-12-01

    Full Text Available In this paper, we propose a novel definition of the opposite path. Its core feature is that the sequence of candidate paths and the distances between adjacent nodes in the tour are considered simultaneously. In a sense, the candidate path and its corresponding opposite path have the same (or at least similar) distance to the optimal path in the current population. Based on an accepted framework for employing opposition-based learning, Oppositional Biogeography-Based Optimization using the Current Optimum, called the COOBBO algorithm, is introduced to solve traveling salesman problems. We demonstrate its performance on eight benchmark problems and compare it with other optimization algorithms. Simulation results illustrate that the excellent performance of our proposed algorithm is attributed to the distinct definition of the opposite path. In addition, its great strength lies in exploitation for enhancing the solution accuracy, not exploration for improving the population diversity. Finally, by comparing different versions of COOBBO, another conclusion is that each successful opposition-based soft computing algorithm needs to strike and maintain a good balance between the backward adjacent node and the forward adjacent node.
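    The paper's opposite-path definition is distance-aware, but the generic opposition-based learning step it refines — pair each candidate tour with an "opposite" tour and keep whichever is shorter — can be sketched with the textbook index-complement opposite. Note this complement is the standard OBL operator, not the authors' refined definition; all names are illustrative:

    ```python
    import math

    def tour_length(tour, coords):
        """Total length of a closed tour over city coordinates."""
        return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def opposite_tour(tour, n):
        """Index-complement opposite: city i is replaced by city n-1-i.
        The complement of a permutation is again a valid permutation."""
        return [n - 1 - c for c in tour]

    def keep_better(tour, coords):
        """Opposition-based step: evaluate a tour and its opposite, keep the shorter."""
        opp = opposite_tour(tour, len(coords))
        return min(tour, opp, key=lambda t: tour_length(t, coords))

    coords = [(0, 0), (3, 0), (3, 2), (0, 2), (1, 1)]
    candidate = [0, 2, 4, 1, 3]
    better = keep_better(candidate, coords)
    ```

    Evaluating candidate and opposite together doubles the chance of landing near the optimum per iteration, which is the exploitation benefit the abstract attributes to its (more refined) opposite path.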

  14. [An operational remote sensing algorithm of land surface evapotranspiration based on NOAA PAL dataset].

    Science.gov (United States)

    Hou, Ying-Yu; He, Yan-Bo; Wang, Jian-Lin; Tian, Guo-Liang

    2009-10-01

    Based on the time series of 10-day composite NOAA Pathfinder AVHRR Land (PAL) data (8 km x 8 km), and by using the land surface energy balance equation and the "VI-Ts" (vegetation index-land surface temperature) method, a new algorithm for land surface evapotranspiration (ET) was constructed. This new algorithm does not need support from meteorological observation data, and all of its parameters and variables are directly inverted or derived from remote sensing data. A widely accepted remote sensing ET model, the SEBS model, was chosen to validate the new algorithm. The validation test showed that the ET and its seasonal variation trend estimated by the SEBS model and our new algorithm agreed well, suggesting that the ET estimated by the new algorithm is reliable and able to reflect the actual land surface ET. The new remote sensing ET algorithm is practical and operational, and it offers a new approach to studying the spatiotemporal variation of ET at continental and global scales based on long-term time series of satellite remote sensing images.

  15. Global Distribution of Net Electron Acceptance in Subseafloor Sediment

    Science.gov (United States)

    Fulfer, V. M.; Pockalny, R. A.; D'Hondt, S.

    2017-12-01

    We quantified the global distribution of net electron acceptance rates (e-/m2/year) in subseafloor sediment (>1.5 meters below seafloor [mbsf]) using (i) a modified version of the chemical-reaction-rate algorithm by Wang et al. (2008), (ii) physical properties and dissolved oxygen and sulfate data from interstitial waters of sediment cores collected by the Ocean Drilling Program, Integrated Ocean Drilling Program, International Ocean Discovery Program, and U.S. coring expeditions, and (iii) correlation of net electron acceptance rates to global oceanographic properties. Calculated net rates vary from 4.8 x 10^19 e-/m2/year for slowly accumulating abyssal clay to 1.2 x 10^23 e-/m2/year for regions of high sedimentation rate. Net electron acceptance rate correlates strongly with mean sedimentation rate. Where sedimentation rate is very low (e.g., 1 m/Myr), dissolved oxygen penetrates more than 70 mbsf and is the primary terminal electron acceptor. Where sedimentation rate is moderate (e.g., 3 to 60 m/Myr), dissolved sulfate penetrates as far as 700 mbsf and is the principal terminal electron acceptor. Where sedimentation rate is high (e.g., > 60 m/Myr), dissolved sulfate penetrates only meters, but is the principal terminal electron acceptor in subseafloor sediment to the depth of sulfate penetration. Because microbial metabolism continues at greater depths than the depth of sulfate penetration in fast-accumulating sediment, complete quantification of subseafloor metabolic rates will require consideration of other chemical species.

  16. Algebraic dynamics algorithm: Numerical comparison with Runge-Kutta algorithm and symplectic geometric algorithm

    Institute of Scientific and Technical Information of China (English)

    WANG ShunJin; ZHANG Hua

    2007-01-01

    Based on the exact analytical solution of ordinary differential equations, a truncation of the Taylor series of the exact solution to the Nth order leads to the Nth order algebraic dynamics algorithm. A detailed numerical comparison is presented with the Runge-Kutta algorithm and the symplectic geometric algorithm for 12 test models. The results show that the algebraic dynamics algorithm can better preserve both the geometrical and dynamical fidelity of a dynamical system at a controllable precision, and it can solve the problem of algorithm-induced dissipation for the Runge-Kutta algorithm and the problem of algorithm-induced phase shift for the symplectic geometric algorithm.
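    The truncated-Taylor idea can be illustrated on the linear test model y' = lam*y, where every derivative is available in closed form. This is only a minimal sketch under that assumption; the paper's 12 test models and its fidelity-preservation analysis are not reproduced here.

    ```python
    import math

    def taylor_step(y, lam, h, order):
        # For y' = lam * y the k-th derivative is lam**k * y, so the
        # truncated Taylor series of the exact solution is explicit.
        return y * sum((lam * h) ** k / math.factorial(k) for k in range(order + 1))

    def rk4_step(y, lam, h):
        # Classical fourth-order Runge-Kutta step for the same model.
        f = lambda u: lam * u
        k1 = f(y)
        k2 = f(y + h / 2 * k1)
        k3 = f(y + h / 2 * k2)
        k4 = f(y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    lam, h, steps = -1.0, 0.1, 100
    y_taylor = y_rk4 = 1.0
    for _ in range(steps):
        y_taylor = taylor_step(y_taylor, lam, h, order=8)
        y_rk4 = rk4_step(y_rk4, lam, h)

    exact = math.exp(lam * h * steps)
    print(abs(y_taylor - exact), abs(y_rk4 - exact))
    ```

    With an 8th-order truncation the Taylor step tracks the exact flow several orders of magnitude more closely than RK4 at the same step size, which is the sense in which the truncation order gives a controllable precision.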

  18. An improved algorithm for finding all minimal paths in a network

    International Nuclear Information System (INIS)

    Bai, Guanghan; Tian, Zhigang; Zuo, Ming J.

    2016-01-01

    Minimal paths (MPs) play an important role in network reliability evaluation. In this paper, we report an efficient recursive algorithm for finding all MPs in two-terminal networks, which consist of a source node and a sink node. A linked path structure indexed by nodes is introduced, which accepts both directed and undirected forms of networks. The distance between each node and the sink node is defined, and a simple recursive algorithm is presented for labeling the distance for each node. Based on the distance between each node and the sink node, additional conditions for backtracking are incorporated to reduce the number of search branches. With the newly introduced linked node structure, the distances between each node and the sink node, and the additional backtracking conditions, an improved backtracking algorithm for searching for all MPs is developed. In addition, the proposed algorithm can be adapted to search for all minimal paths for each source–sink pair in networks consisting of multiple source nodes and/or multiple sink nodes. Through computational experiments, it is demonstrated that the proposed algorithm is more efficient than existing algorithms when the network size is not too small. The proposed algorithm becomes more advantageous as the size of the network grows. - Highlights: • A linked path structure indexed by nodes is introduced to represent networks. • Additional conditions for backtracking are proposed based on the distance of each node. • An efficient algorithm is developed to find all MPs for two-terminal networks. • The computational efficiency of the algorithm for two-terminal networks is investigated. • The computational efficiency of the algorithm for multi-terminal networks is investigated.
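    The skeleton of such an MP search can be sketched as a depth-first backtracking over an undirected network, with a sink-distance labeling used to prune nodes that cannot reach the sink. This is only a baseline illustration; the paper's linked path structure and its full distance-based backtracking conditions are simplified away.

    ```python
    from collections import defaultdict, deque

    def distances_to_sink(adj, sink):
        # Breadth-first search from the sink labels every node with its
        # hop distance to the sink (the labeling step).
        dist = {sink: 0}
        queue = deque([sink])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def all_minimal_paths(edges, source, sink):
        adj = defaultdict(set)
        for u, v in edges:              # undirected two-terminal network
            adj[u].add(v)
            adj[v].add(u)
        dist = distances_to_sink(adj, sink)
        paths, stack = [], [source]

        def dfs(u):
            if u == sink:
                paths.append(list(stack))
                return
            for v in adj[u]:
                # backtracking conditions: skip already-visited nodes and
                # nodes with no remaining route to the sink
                if v in stack or v not in dist:
                    continue
                stack.append(v)
                dfs(v)
                stack.pop()

        dfs(source)
        return paths

    # The classic bridge network has exactly four minimal paths from 1 to 4.
    bridge = [(1, 2), (1, 3), (2, 3), (2, 4), (3, 4)]
    mps = sorted(all_minimal_paths(bridge, 1, 4))
    print(mps)
    ```

    For the bridge network the search enumerates 1-2-4, 1-2-3-4, 1-3-4 and 1-3-2-4; the paper's additional conditions cut down the number of dead-end branches such a plain search visits on larger networks.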

  19. Valor nutritivo de perfilhos e componentes morfológicos em pastos de capim-braquiária diferidos e adubados com nitrogênio Nutritive value of tillers and morphological components on deferred and nitrogen fertilized pastures of Brachiaria decumbens cv. Basilisk

    Directory of Open Access Journals (Sweden)

    Manoel Eduardo Rozalino Santos

    2010-09-01

    Full Text Available This work was carried out to evaluate the nutritive value of tillers and morphological components in deferred, nitrogen-fertilized pastures of Brachiaria decumbens cv. Basilisk. Two experiments were performed in a randomized block design with three replications and a split-plot scheme. In the first experiment, nitrogen doses (0, 40, 80 and 120 kg/ha) were combined with the morphological components of the pasture (green leaf, green stem, dead leaf and dead stem) or with tiller categories (vegetative and reproductive). In the second experiment, combinations of deferment periods (73, 95 and 116 days) and tiller categories (vegetative and reproductive) were studied. Nitrogen fertilization increased the neutral detergent fiber (NDF) content of green stems and the crude protein (CP) content of green leaves, dead leaves and green stems. It also raised the CP content of vegetative and reproductive tillers, as well as the NDF content of vegetative tillers. A longer deferment period increased the NDF content and reduced the CP content of vegetative and reproductive tillers. Nitrogen dose and deferment period alter the nutritive value of the morphological components and tillers of B. decumbens cv. Basilisk pastures. When deferring a pasture, management actions that result in higher percentages of green leaf blade and vegetative tillers contribute to improving the nutritive value of the deferred forage.

  20. Algorithmic acquisition of diagnostic patterns in district heating billing system

    International Nuclear Information System (INIS)

    Kiluk, Sebastian

    2012-01-01

    An application of algorithmic exploration of billing data is examined for fault detection and diagnosis (FDD) based on evaluation of the present state and detection of unexpected changes in the energy efficiency of buildings. Large data sets from district heating (DH) billing systems are used for construction of the feature space, diagnostic rules and classification of buildings according to their energy efficiency properties. The algorithmic approach automates discovering knowledge about common, and thus accepted, changes in buildings' properties, equipment and inhabitants' behavior, reflecting progress in technology and lifestyle. This article presents the implementation of a Data Mining and Knowledge Discovery (DMKD) method in a supervision system, with exemplary results based on real data. Crucial steps of data processing influencing the diagnostic results are described in detail.

  1. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is: "are there any algorithms that can design evolutionary algorithms automatically?" A more complete formulation of the question is: "can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?" In this paper, a novel evolutionary algorithm based on automatic designing of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space like most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which shows that algorithms designed automatically by computers can compete with algorithms designed by human beings.

  2. Acceptance and implementation of a system of planning computerized based on Monte Carlo; Aceptacion y puesta en marcha de un sistema de planificacion comutarizada basado en Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Tarjuelo, J.; Garcia-Molla, R.; Suan-Senabre, X. J.; Quiros-Higueras, J. Q.; Santos-Serra, A.; Marco-Blancas, N.; Calzada-Feliu, S.

    2013-07-01

    Acceptance for clinical use of the Monaco computerized planning system has been carried out. The system is based on a virtual model of the energy output of the head of the linear electron accelerator, and it calculates dose with an X-ray algorithm (XVMC) based on Monte Carlo methods. (Author)

  3. Application of artificial neural network for development of an algorithm for TLD badge system in the mixed field dosimetry of X and gamma radiation in terms of Hp(10)

    International Nuclear Information System (INIS)

    Srivastava, K.; Bakshi, A.K.; Geetha, V.; Kher, R.K.; Dhar, V.K.

    2005-01-01

    ICRU has introduced new operational quantities for individual monitoring. It is therefore required to develop an algorithm that gives the direct response of the TLD badge in terms of the operational quantities. For this purpose, and also to improve precision in mixed-field dosimetry, two methods were studied: i) an analytical method, developing an algorithm based on a higher-order polynomial fit of the data points for known delivered doses, and ii) the use of an Artificial Neural Network (ANN). A study of the response of the TLD badge system based on CaSO4:Dy Teflon TLD discs in mixed fields of X and gamma radiation in terms of the operational quantity Hp(10) was carried out using the prevalent algorithm, the algorithm developed using the higher-order polynomial, and the neural-network-predicted algorithm for different proportions of dose delivered by X and gamma radiations. It was found that the uncertainties in the dose response for a few fields are beyond the acceptable limit for the prevalent algorithm and within the acceptable limit for the other two algorithms. The algorithm based on ANN gives higher precision in the mixed field of two radiations compared to the other two algorithms. (author)

  4. Bridging the Gap between Social Acceptance and Ethical Acceptability

    NARCIS (Netherlands)

    Taebi, B.

    2016-01-01

    New technology brings great benefits, but it can also create new and significant risks. When evaluating those risks in policymaking, there is a tendency to focus on social acceptance. By solely focusing on social acceptance, we could, however, overlook important ethical aspects of technological

  5. Video Game Acceptance: A Meta-Analysis of the Extended Technology Acceptance Model.

    Science.gov (United States)

    Wang, Xiaohui; Goh, Dion Hoe-Lian

    2017-11-01

    The current study systematically reviews and summarizes the existing literature on game acceptance, identifies the core determinants, and evaluates the strength of the relationships in the extended technology acceptance model. Moreover, this study segments video games into two categories, hedonic and utilitarian, and examines player acceptance of these two types separately. Through a meta-analysis of 50 articles, we find that perceived ease of use (PEOU), perceived usefulness (PU), and perceived enjoyment (PE) significantly associate with attitude and behavioral intention. PE is the dominant predictor of hedonic game acceptance, while PEOU and PU are the main determinants of utilitarian game acceptance. Furthermore, we find that respondent type and game platform are significant moderators. Findings of this study provide critical insights into the phenomenon of game acceptance and suggest directions for future research.

  6. Monte Carlo tests of the ELIPGRID-PC algorithm

    International Nuclear Information System (INIS)

    Davidson, J.R.

    1995-04-01

    The standard tool for calculating the probability of detecting pockets of contamination called hot spots has been the ELIPGRID computer code of Singer and Wickman. The ELIPGRID-PC program has recently made this algorithm available for an IBM® PC. However, no known independent validation of the ELIPGRID algorithm exists. This document describes a Monte Carlo simulation-based validation of a modified version of the ELIPGRID-PC code. The modified ELIPGRID-PC code is shown to match Monte Carlo-calculated hot-spot detection probabilities to within ±0.5% for 319 out of 320 test cases. The one exception, a very thin elliptical hot spot located within a rectangular sampling grid, differed from the Monte Carlo-calculated probability by about 1%. These results provide confidence in the ability of the modified ELIPGRID-PC code to accurately predict hot-spot detection probabilities within an acceptable range of error.
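    The Monte Carlo side of such a validation can be sketched for the simplest case of a circular hot spot on a square sampling grid. ELIPGRID itself handles elliptical targets and several grid shapes, so the function and parameter names below are illustrative only.

    ```python
    import math
    import random

    def hit_probability(grid_spacing, radius, trials=20000, seed=42):
        # Drop a circular hot spot at a random position within one grid cell
        # and count how often a grid node falls inside it; the hit fraction
        # estimates the detection probability.
        rng = random.Random(seed)
        hits = 0
        for _ in range(trials):
            cx = rng.uniform(0, grid_spacing)
            cy = rng.uniform(0, grid_spacing)
            # With radius <= half the spacing, only the nearest node can hit.
            nx = round(cx / grid_spacing) * grid_spacing
            ny = round(cy / grid_spacing) * grid_spacing
            if (cx - nx) ** 2 + (cy - ny) ** 2 <= radius ** 2:
                hits += 1
        return hits / trials

    # For a small circular target the exact answer is pi * r**2 / g**2,
    # so radius 0.3 on a unit grid should give roughly 0.283.
    print(hit_probability(1.0, 0.3))
    ```

    A validation in this spirit compares such simulated hit fractions against the analytically computed detection probabilities over many target sizes and grid spacings.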

  7. Dense Matching Comparison Between Census and a Convolutional Neural Network Algorithm for Plant Reconstruction

    Science.gov (United States)

    Xia, Y.; Tian, J.; d'Angelo, P.; Reinartz, P.

    2018-05-01

    3D reconstruction of plants is hard to implement, as the complex leaf distribution highly increases the difficulty of dense matching. Semi-Global Matching has been successfully applied to recover the depth information of a scene, but may perform variably when different matching cost algorithms are used. In this paper two matching cost computation algorithms, the Census transform and an algorithm using a convolutional neural network, are tested for plant reconstruction based on Semi-Global Matching. High-resolution close-range photogrammetric images from a handheld camera are used for the experiment. The disparity maps generated based on the two selected matching cost methods are comparable, with acceptable quality, which shows the good performance of Census and the potential of neural networks to improve dense matching.

  8. Adaptive control of call acceptance in WCDMA network

    Directory of Open Access Journals (Sweden)

    Milan Manojle Šunjevarić

    2013-10-01

    Full Text Available In this paper, an overview of the algorithms for access control in mobile wireless networks is presented. A review of adaptive methods of call acceptance control in WCDMA networks is given, based on an overview of the algorithms used for this purpose and a comparison between them, with appropriate comments and conclusions on their basic characteristics. The OVSF codes are explained, as well as how the allocation method influences capacity and blocking probability. Introduction: We are witnessing a steady increase in the number of demands placed upon modern wireless networks. New applications, an increasing number of users and the growth of user activity in recent years reinforce the need for efficient use of the spectrum and its proper distribution among different applications and classes of services. Besides humans, the last few years saw different computers, machines and applications and, in the future, many other devices, RFID applications and, finally, networked objects, as a new kind of wireless network "users". Because of the exceptional rise in the number of users, the demands placed upon modern wireless networks are becoming larger, and spectrum management plays an important role. For these reasons, choosing an appropriate call admission control algorithm is of great importance. Multiple access and resource management in wireless networks: Radio resource management of mobile networks is a set of algorithms to manage the use of radio resources, with the aim of maximizing the total capacity of wireless systems with an equal distribution of resources to users. Management of radio resources in cellular networks is usually located in the base station controller, the base station and the mobile terminal, and is based on decisions made from appropriate measurements and feedback. It is often defined as the maximum volume of traffic load that the system can provide for some of the requirements for the
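    The OVSF construction mentioned above can be sketched in a few lines: starting from the root code (1), each code c of length n spawns two mutually orthogonal children (c, c) and (c, -c) of length 2n. This is a minimal sketch of code generation only, not of the allocation strategies such papers compare.

    ```python
    def ovsf_codes(spreading_factor):
        # Grow the OVSF code tree level by level: each code c of length n
        # has children c+c and c+(-c) of length 2n. spreading_factor must
        # be a power of two.
        codes = [[1]]
        while len(codes[0]) < spreading_factor:
            codes = [child
                     for c in codes
                     for child in (c + c, c + [-x for x in c])]
        return codes

    codes = ovsf_codes(4)
    print(codes)
    ```

    All codes at one level have zero pairwise dot product, which is what lets simultaneous users share the channel; an allocation method then decides which tree nodes to hand out, since assigning a code blocks all of its ancestors and descendants.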

  9. Implementation of pencil kernel and depth penetration algorithms for treatment planning of proton beams

    International Nuclear Information System (INIS)

    Russell, K.R.; Saxner, M.; Ahnesjoe, A.; Montelius, A.; Grusell, E.; Dahlgren, C.V.

    2000-01-01

    The implementation of two algorithms for calculating dose distributions for radiation therapy treatment planning of intermediate energy proton beams is described. A pencil kernel algorithm and a depth penetration algorithm have been incorporated into a commercial three-dimensional treatment planning system (Helax-TMS, Helax AB, Sweden) to allow conformal planning techniques using irregularly shaped fields, proton range modulation, range modification and dose calculation for non-coplanar beams. The pencil kernel algorithm is developed from the Fermi-Eyges formalism and Moliere multiple-scattering theory with range straggling corrections applied. The depth penetration algorithm is based on the energy loss in the continuous slowing down approximation with simple correction factors applied to the beam penumbra region and has been implemented for fast, interactive treatment planning. Modelling of the effects of air gaps and range modifying device thickness and position are implicit to both algorithms. Measured and calculated dose values are compared for a therapeutic proton beam in both homogeneous and heterogeneous phantoms of varying complexity. Both algorithms model the beam penumbra as a function of depth in a homogeneous phantom with acceptable accuracy. Results show that the pencil kernel algorithm is required for modelling the dose perturbation effects from scattering in heterogeneous media. (author)

  10. Analysis of blood donor pre-donation deferral in Dubai: characteristics and reasons

    Directory of Open Access Journals (Sweden)

    Al Shaer L

    2017-05-01

    Full Text Available Laila Al Shaer,1 Ranjita Sharma,2 Mahera AbdulRahman2 1College of Medicine, Mohammed Bin Rashid University of Medicine and Health Sciences, Dubai, UAE; 2Dubai Health Authority, Dubai, UAE Background: To ensure an adequate and safe blood supply, it is crucial to select suitable donors according to stringent eligibility criteria. Understanding the reasons for donor deferral can help in planning more efficient recruitment strategies and evaluating donor selection criteria. This study aims to define donor pre-donation deferral rates, causes of deferral, and characteristics of deferred donors in Dubai. Materials and methods: This retrospective study was conducted on all donors who presented for allogeneic blood donation between January 1, 2010, and June 30, 2013, at the Dubai Blood Donation Centre, accredited by the American Association of Blood Banks. The donation and deferral data were analyzed to determine the demographic characteristics of accepted and deferred donors, and frequency analyses were also conducted. Results: Among 142,431 individuals presenting during the study period, 114,827 (80.6%) were accepted for donation, and 27,604 (19.4%) were deferred. The overall proportion of deferrals was higher among individuals less than 21 years old (35%, P<0.0001), females (44% were deferred compared to 15% of males, P<0.0001), and first-time donors (22% were deferred vs 14% of repeat donors, P<0.0001). The main causes for a temporary deferral were low hemoglobin and high blood pressure. Discussion: The deferral rate among blood donors in Dubai is relatively high compared to internationally reported rates. This rate was higher among first-time donors and females, with low hemoglobin as the major factor leading to a temporary deferral of donors. Strategies to mitigate deferral and improve blood donor retention are urged in Dubai to avoid additional stress on the blood supply. Keywords: blood donation, blood safety, donor deferral, selection criteria

  11. Expert-guided evolutionary algorithm for layout design of complex space stations

    Science.gov (United States)

    Qian, Zhiqin; Bi, Zhuming; Cao, Qun; Ju, Weiguo; Teng, Hongfei; Zheng, Yang; Zheng, Siyu

    2017-08-01

    The layout of a space station should be designed in such a way that different equipment and instruments are placed for the station as a whole to achieve the best overall performance. The station layout design is a typical nondeterministic polynomial problem. In particular, how to manage the design complexity to achieve an acceptable solution within a reasonable timeframe poses a great challenge. In this article, a new evolutionary algorithm is proposed to meet this challenge. It is called the expert-guided evolutionary algorithm with a tree-like structure decomposition (EGEA-TSD). Two innovations in EGEA-TSD are: (i) to deal with the design complexity, the entire design space is divided into subspaces with a tree-like structure, which reduces the computation and facilitates experts' involvement in the solving process; (ii) a human-intervention interface is developed to allow experts' involvement in avoiding local optima and accelerating convergence. To validate the proposed algorithm, the layout design of one space station is formulated as a multi-disciplinary design problem, the developed algorithm is programmed and executed, and the result is compared with those from two other algorithms, illustrating the superior performance of the proposed EGEA-TSD.

  12. Hybrid algorithm for rotor angle security assessment in power systems

    Directory of Open Access Journals (Sweden)

    D. Prasad Wadduwage

    2015-08-01

    Full Text Available Transient rotor angle stability assessment and oscillatory rotor angle stability assessment subsequent to a contingency are integral components of dynamic security assessment (DSA) in power systems. This study proposes a hybrid algorithm to determine whether the post-fault power system is secure with respect to both transient rotor angle stability and oscillatory rotor angle stability subsequent to a set of known contingencies. The hybrid algorithm first uses a new security measure developed based on the concept of Lyapunov exponents (LEs) to determine the transient security of the post-fault power system. Later, the transient-secure power swing curves are analysed using an improved Prony algorithm, which extracts the dominant oscillatory modes and estimates their damping ratios. The damping ratio is a measure of the oscillatory security of the post-fault power system subsequent to the contingency. The suitability of the proposed hybrid algorithm for DSA in power systems is illustrated using different contingencies of a 16-generator 68-bus test system and a 50-generator 470-bus test system. The accuracy of the stability conclusions and the acceptable computational burden indicate that the proposed hybrid algorithm is suitable for real-time security assessment with respect to both transient rotor angle stability and oscillatory rotor angle stability under multiple contingencies of the power system.

  13. A New Lane Departure Warning Algorithm Considering the Driver’s Behavior Characteristics

    Directory of Open Access Journals (Sweden)

    Lun Hui Xu

    2015-01-01

    Full Text Available In order to meet the driving safety warning requirements of different driver types and situations, a new lane departure warning (LDW) algorithm is proposed. Its adaptability is much improved by setting different thresholds of time to lane crossing (TLC), using a fuzzy control method, for drivers with different driving behaviors in different lanes and with different vehicle movements. To ensure the accuracy of the TLC computation under different actual driving scenarios, the algorithm was established based on vehicle kinematics and advanced mathematics, in contrast to other ways of computing TLC. On this basis, an LDW strategy that determines the driver's intentions was presented by identifying vehicle movements. Finally, a large number of real-vehicle experiments were performed to demonstrate the effectiveness of the proposed LDW algorithm. The test results show that the algorithm can effectively decrease the false alarm rate, because real-time vehicle movements distinguish unintentional departures from conscious maneuvers, and can improve adaptability to driver behavior characteristics, so it has favorable driver acceptance and strong intelligence.
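    The quantity at the heart of such warnings, time to lane crossing, reduces in the simplest straight-lane case to the remaining lateral distance divided by the lateral speed. The paper's kinematic derivation covers more vehicle states, so the function below is only the first-order textbook version with illustrative parameter names.

    ```python
    import math

    def time_to_lane_crossing(lateral_offset, heading_angle, speed, lane_half_width):
        """First-order TLC for a straight lane.

        lateral_offset:  vehicle centre distance from the lane centre (m)
        heading_angle:   angle between vehicle heading and lane direction (rad)
        speed:           vehicle speed (m/s)
        lane_half_width: lane centre-to-boundary distance (m)
        Returns None when the vehicle is not drifting toward the boundary.
        """
        lateral_speed = speed * math.sin(heading_angle)
        if lateral_speed <= 0:
            return None
        return (lane_half_width - lateral_offset) / lateral_speed

    # A vehicle 0.5 m off-centre, drifting at 0.02 rad and 25 m/s in a 3.5 m lane:
    print(time_to_lane_crossing(0.5, 0.02, 25.0, 1.75))
    ```

    An adaptive LDW scheme in the spirit of the abstract would then compare this TLC against driver-dependent thresholds tuned by a fuzzy controller rather than against one fixed value.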

  14. In acceptance we trust? Conceptualising acceptance as a viable approach to NGO security management.

    Science.gov (United States)

    Fast, Larissa A; Freeman, C Faith; O'Neill, Michael; Rowley, Elizabeth

    2013-04-01

    This paper documents current understanding of acceptance as a security management approach and explores issues and challenges non-governmental organisations (NGOs) confront when implementing an acceptance approach to security management. It argues that the failure of organisations to systematise and clearly articulate acceptance as a distinct security management approach and a lack of organisational policies and procedures concerning acceptance hinder its efficacy as a security management approach. The paper identifies key and cross-cutting components of acceptance that are critical to its effective implementation in order to advance a comprehensive and systematic concept of acceptance. The key components of acceptance illustrate how organisational and staff functions affect positively or negatively an organisation's acceptance, and include: an organisation's principles and mission, communications, negotiation, programming, relationships and networks, stakeholder and context analysis, staffing, and image. The paper contends that acceptance is linked not only to good programming, but also to overall organisational management and structures. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  15. Parallel algorithm for determining motion vectors in ice floe images by matching edge features

    Science.gov (United States)

    Manohar, M.; Ramapriyan, H. K.; Strong, J. P.

    1988-01-01

    A parallel algorithm is described to determine motion vectors of ice floes using time sequences of images of the Arctic Ocean obtained from the Synthetic Aperture Radar (SAR) instrument flown on-board the SEASAT spacecraft. The algorithm, implemented on the MPP, locates corresponding objects based on their translationally and rotationally invariant features. It first approximates the edges in the images by polygons or sets of connected straight-line segments. Each such edge structure is then reduced to a seed point. Associated with each seed point are the descriptions (lengths, orientations and sequence numbers) of the lines constituting the corresponding edge structure. A parallel matching algorithm is used to match packed arrays of such descriptions to identify corresponding seed points in the two images. The matching algorithm is designed such that fragmentation and merging of ice floes are taken into account by accepting partial matches. The technique has been demonstrated to work on synthetic test patterns and real image pairs from SEASAT in times ranging from 0.5 to 0.7 seconds for 128 x 128 images.

  16. Survey of neonatologists' attitudes toward limiting life-sustaining treatments in the neonatal intensive care unit.

    Science.gov (United States)

    Feltman, D M; Du, H; Leuthner, S R

    2012-11-01

    To understand neonatologists' attitudes toward end-of-life (EOL) management in clinical scenarios, EOL ethical concepts and resource utilization. American Academy of Pediatrics (AAP) Perinatal section members completed an anonymous online survey. Respondents indicated preferences in limiting life-sustaining treatments in four clinical scenarios, ranked agreement with EOL-care ethics statements, indicated outside resources previously used and provided demographic information. In all, 451 surveys were analyzed. Across clinical scenarios and as general ethical concepts, withdrawal of mechanical ventilation in severely affected patients was most accepted by respondents; withdrawal of artificial nutrition and hydration was least accepted. One-third of neonatologists did not agree that non-initiation of treatment is ethically equivalent to withdrawal. Around 20% of neonatologists would not defer care if uncomfortable with a parent's request. Respondents' resources included ethics committees, AAP guidelines and legal counsel/courts. Challenges to providing just, unified EOL care strategies are discussed, including deferring care, limiting artificial nutrition/hydration and conditions surrounding ventilator withdrawal.

  17. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Science.gov (United States)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, in both manufacturing and services. Scheduling is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs or operations. The selection of an optimum sequence of jobs from a permutation is an essential issue in scheduling research, since the optimum sequence is the optimum solution of the scheduling problem. Scheduling becomes an NP-hard problem when the number of jobs in the sequence exceeds what an exact algorithm can process in reasonable time. In order to obtain optimum results, a method is needed that can solve complex scheduling problems in acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published Crow Search Algorithm (CSA) is adopted in this research to solve the scheduling problem. CSA is an evolutionary meta-heuristic based on the flocking behavior of crows. The results of CSA on the scheduling problem are compared with other algorithms. The comparison shows that CSA has better performance than the other algorithms in terms of both solution quality and computation time.
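
    As a sketch of how CSA searches, the update rule of Askarzadeh's original continuous formulation is shown below on a toy minimization problem; the paper's scheduling application additionally maps continuous positions to job permutations, which is omitted here, and the parameter values are illustrative:

```python
import random

def crow_search(f, dim, n_crows=20, iters=200, fl=2.0, ap=0.1,
                bounds=(-5.0, 5.0), seed=1):
    """Crow Search Algorithm: each crow tails a randomly chosen crow's
    memorized food position; with awareness probability ap the followed
    crow notices and the follower is sent to a random position instead."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_crows)]
    M = [x[:] for x in X]                 # each crow's best position so far
    fit = [f(m) for m in M]
    for _ in range(iters):
        for i in range(n_crows):
            j = rng.randrange(n_crows)    # crow i tails crow j
            if rng.random() >= ap:        # j unaware: move toward j's memory
                new = [X[i][d] + rng.random() * fl * (M[j][d] - X[i][d])
                       for d in range(dim)]
            else:                         # j aware: i is fooled, goes random
                new = [rng.uniform(lo, hi) for _ in range(dim)]
            if all(lo <= v <= hi for v in new):   # keep feasible moves only
                X[i] = new
                fx = f(new)
                if fx < fit[i]:           # update memory on improvement
                    M[i], fit[i] = new[:], fx
    best = min(range(n_crows), key=lambda k: fit[k])
    return M[best], fit[best]

# Toy objective standing in for a scheduling makespan: the sphere function.
pos, val = crow_search(lambda x: sum(v * v for v in x), dim=3)
```

    The flight length fl controls step size and the awareness probability ap controls the balance between intensification (following memories) and diversification (random relocation).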

  18. A novel orthoimage mosaic method using the weighted A* algorithm for UAV imagery

    Science.gov (United States)

    Zheng, Maoteng; Zhou, Shunping; Xiong, Xiaodong; Zhu, Junfeng

    2017-12-01

    A weighted A* algorithm is proposed to select optimal seam-lines in orthoimage mosaicking for UAV (Unmanned Aerial Vehicle) imagery. The whole workflow includes four steps: the initial seam-line network is first generated by the standard Voronoi diagram algorithm; an edge diagram is then detected based on DSM (Digital Surface Model) data; the vertices (conjunction nodes) of the initial network are relocated, since some of them lie on high objects (buildings, trees and other artificial structures); and the initial seam-lines are finally refined using the weighted A* algorithm, based on the edge diagram and the relocated vertices. The method was tested with two real UAV datasets. Preliminary results show that the proposed method produces acceptable mosaic images in both urban and mountainous areas, and outperforms the state-of-the-art methods on these datasets.
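
    The refinement step can be sketched as weighted A* over a pixel cost grid, where f = g + w·h and low cell costs mark preferred seam locations (e.g. away from elevated structures in the edge diagram); the grid values and weight below are toy assumptions:

```python
import heapq

def weighted_a_star(cost, start, goal, w=1.5):
    """Weighted A* on a 4-connected grid. Inflating the heuristic by w > 1
    trades optimality for speed, steering the seam-line toward low-cost
    cells while keeping it close to the straight path to the goal."""
    rows, cols = len(cost), len(cost[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    g = {start: 0.0}
    parent = {start: None}
    heap = [(w * h(start), 0.0, start)]
    while heap:
        _, gc, p = heapq.heappop(heap)
        if p == goal:                       # reconstruct the seam-line
            path = []
            while p is not None:
                path.append(p)
                p = parent[p]
            return path[::-1]
        if gc > g.get(p, float("inf")):     # stale heap entry
            continue
        r, c = p
        for q in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= q[0] < rows and 0 <= q[1] < cols:
                ng = gc + cost[q[0]][q[1]]
                if ng < g.get(q, float("inf")):
                    g[q] = ng
                    parent[q] = p
                    heapq.heappush(heap, (ng + w * h(q), ng, q))
    return None

# High values (9) stand in for buildings/trees the seam-line should avoid.
grid = [[1, 9, 1, 1],
        [1, 9, 1, 9],
        [1, 1, 1, 9],
        [9, 9, 1, 1]]
seam = weighted_a_star(grid, (0, 0), (3, 3))
```

    With w = 1 this reduces to plain A*; larger weights expand fewer nodes at the cost of a possibly longer seam.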

  19. DENSE MATCHING COMPARISON BETWEEN CENSUS AND A CONVOLUTIONAL NEURAL NETWORK ALGORITHM FOR PLANT RECONSTRUCTION

    Directory of Open Access Journals (Sweden)

    Y. Xia

    2018-05-01

    Full Text Available 3D reconstruction of plants is hard to implement, as the complex distribution of leaves greatly increases the difficulty of dense matching. Semi-Global Matching has been successfully applied to recover the depth information of a scene, but may perform variably when different matching cost algorithms are used. In this paper two matching cost computation algorithms, the Census transform and an algorithm using a convolutional neural network, are tested for plant reconstruction based on Semi-Global Matching. High-resolution close-range photogrammetric images from a handheld camera are used for the experiment. The disparity maps generated with the two selected matching cost methods are comparable and of acceptable quality, which shows the good performance of Census and the potential of neural networks to improve dense matching.
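
    The Census matching cost tested in the paper can be sketched in a few lines: each pixel is encoded by comparing it with its neighbors, and the cost between two pixels is the Hamming distance of their codes. The 3 x 3 window and the toy images below are illustrative choices:

```python
def census_transform(img, win=3):
    """Census transform: encode each pixel as a bit string recording whether
    each neighbor in a win x win window is darker than the center pixel."""
    r = win // 2
    rows, cols = len(img), len(img[0])
    codes = [[0] * cols for _ in range(rows)]
    for y in range(r, rows - r):
        for x in range(r, cols - r):
            bits = 0
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    if dy == 0 and dx == 0:
                        continue
                    # each neighbor comparison contributes one bit
                    bits = (bits << 1) | (img[y + dy][x + dx] < img[y][x])
            codes[y][x] = bits
    return codes

def census_cost(code_left, code_right):
    """Matching cost = Hamming distance between two census codes."""
    return bin(code_left ^ code_right).count("1")

left = [[10, 20, 30], [40, 50, 60], [70, 80, 90]]
right = [[v + 5 for v in row] for row in left]   # global brightness shift
cost = census_cost(census_transform(left)[1][1], census_transform(right)[1][1])
```

    Because only intensity orderings matter, the cost between corresponding pixels is 0 despite the brightness shift, which is what makes Census robust to radiometric differences between views.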

  20. An international consensus algorithm for management of chronic postoperative inguinal pain.

    Science.gov (United States)

    Lange, J F M; Kaufmann, R; Wijsmuller, A R; Pierie, J P E N; Ploeg, R J; Chen, D C; Amid, P K

    2015-02-01

    Tension-free mesh repair of inguinal hernia has led to uniformly low recurrence rates. Morbidity associated with this operation is mainly related to chronic pain. No consensus guidelines exist for the management of this condition. The goal of this study is to design an expert-based algorithm for diagnostic and therapeutic management of chronic inguinal postoperative pain (CPIP). A group of surgeons considered experts on inguinal hernia surgery was solicited to develop the algorithm. Consensus regarding each step of an algorithm proposed by the authors was sought by means of the Delphi method leading to a revised expert-based algorithm. With the input of 28 international experts, an algorithm for a stepwise approach for management of CPIP was created. 26 participants accepted the final algorithm as a consensus model. One participant could not agree with the final concept. One expert did not respond during the final phase. There is a need for guidelines with regard to management of CPIP. This algorithm can serve as a guide with regard to the diagnosis, management, and treatment of these patients and improve clinical outcomes. If an expectative phase of a few months has passed without any amelioration of CPIP, a multidisciplinary approach is indicated and a pain management team should be consulted. Pharmacologic, behavioral, and interventional modalities including nerve blocks are essential. If conservative measures fail and surgery is considered, triple neurectomy, correction for recurrence with or without neurectomy, and meshoma removal if indicated should be performed. Surgeons less experienced with remedial operations for CPIP should not hesitate to refer their patients to dedicated hernia surgeons.

  1. MEASUREMENT FOR ACCEPTANCE OF SUPPLY CHAIN SIMULATOR APPLICATION USING TECHNOLOGY ACCEPTANCE MODEL

    Directory of Open Access Journals (Sweden)

    Mulyati E.

    2018-03-01

    Full Text Available The aim of this research was to measure user acceptance of a simulator application built as a tool for students learning about supply chains, particularly the bullwhip effect problem. Acceptance of the supply chain simulator application was measured using the Technology Acceptance Model on 162 samples, which were analyzed with Confirmatory Factor Analysis and Structural Equation Modelling. The results indicate that user acceptance (shown by customer participation) of the supply chain simulator was directly influenced by the perceived usefulness of the application (positive and significant); indirectly influenced by the perceived ease of use of the application (positive but not significant); and indirectly influenced by perceived enjoyment when the application was used. The research gives students a better understanding of the bullwhip effect and a richer experience than conventional learning, in which such tools are not used.

  2. A novel image encryption algorithm based on a 3D chaotic map

    Science.gov (United States)

    Kanso, A.; Ghebleh, M.

    2012-07-01

    Recently, Solak et al. [Solak E, Çokal C, Yildiz OT, Biyikoǧlu T. Cryptanalysis of Fridrich's chaotic image encryption. Int J Bifur Chaos 2010;20:1405-1413] cryptanalyzed the chaotic image encryption algorithm of Fridrich [Fridrich J. Symmetric ciphers based on two-dimensional chaotic maps. Int J Bifur Chaos 1998;8(6):1259-1284], which was considered a benchmark for measuring the security of many image encryption algorithms. This attack can also be applied to other encryption algorithms that have a structure similar to Fridrich's, such as that of Chen et al. [Chen G, Mao Y, Chui C. A symmetric image encryption scheme based on 3D chaotic cat maps. Chaos Soliton Fract 2004;21:749-761]. In this paper, we suggest a novel image encryption algorithm based on a three-dimensional (3D) chaotic map that can defeat the aforementioned attack, among other existing attacks. The design of the proposed algorithm is simple and efficient, and is based on three phases which provide the necessary properties for a secure image encryption algorithm, including the confusion and diffusion properties. In phase I, the image pixels are shuffled according to a search rule based on the 3D chaotic map. In phases II and III, 3D chaotic maps are used to scramble the shuffled pixels through mixing and masking rules, respectively. Simulation results show that the suggested algorithm satisfies the required performance tests, such as high-level security, large key space and acceptable encryption speed. These characteristics make it a suitable candidate for use in cryptographic applications.
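
    The confusion/diffusion structure described above can be illustrated with a one-dimensional logistic map standing in for the paper's 3D map (a deliberate simplification; the key value, map parameter and byte quantization below are assumptions):

```python
def logistic_sequence(x0, n, r=3.99):
    """Iterate the logistic map x -> r*x*(1-x), chaotic for r close to 4."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def encrypt(pixels, key=0.6123):
    """Confusion: permute pixel positions by ranking a chaotic sequence.
    Diffusion: XOR each shuffled pixel with a chaotic keystream byte."""
    n = len(pixels)
    seq = logistic_sequence(key, 2 * n)
    perm = sorted(range(n), key=lambda i: seq[i])
    shuffled = [pixels[i] for i in perm]
    stream = [int(s * 256.0) % 256 for s in seq[n:]]
    return [p ^ b for p, b in zip(shuffled, stream)]

def decrypt(cipher, key=0.6123):
    """Regenerate the chaotic sequence from the key and invert both phases."""
    n = len(cipher)
    seq = logistic_sequence(key, 2 * n)
    perm = sorted(range(n), key=lambda i: seq[i])
    stream = [int(s * 256.0) % 256 for s in seq[n:]]
    shuffled = [c ^ b for c, b in zip(cipher, stream)]
    plain = [0] * n
    for dst, src in enumerate(perm):
        plain[src] = shuffled[dst]
    return plain

plain = [13, 200, 7, 7, 255, 0, 42, 99]   # a toy row of 8-bit pixels
cipher = encrypt(plain)
```

    A real scheme would also chain pixels so that one changed plaintext pixel diffuses into the whole ciphertext; that feedback step is omitted here.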

  3. Expanded pharmacy technician roles: Accepting verbal prescriptions and communicating prescription transfers.

    Science.gov (United States)

    Frost, Timothy P; Adams, Alex J

    2017-11-01

    As the role of the clinical pharmacist continues to develop and advance, it is critical to ensure pharmacists can operate in a practice environment and workflow that supports the full deployment of their clinical skills. When pharmacy technician roles are optimized, patient safety can be enhanced and pharmacists may dedicate more time to advanced clinical services. Currently, 17 states allow technicians to accept verbal prescriptions called in by a prescriber or prescriber's agent, or transfer a prescription order from one pharmacy to another. States that allow these activities generally put few legal limitations on them, and instead defer to the professional judgment of the supervising pharmacist whether to delegate these tasks or not. These activities were more likely to be seen in states that require technicians to be registered and certified, and in states that have accountability mechanisms (e.g., discipline authority) in place for technicians. There is little evidence to suggest these tasks cannot be performed safely and accurately by appropriately trained technicians, and the track record of success with these tasks spans four decades in some states. Pharmacists can adopt strong practice policies and procedures to mitigate the risk of harm from verbal orders, such as instituting read-back/spell-back techniques, or requiring the indication for each phoned-in medication, among other strategies. Pharmacists may also exercise discretion in deciding to whom to delegate these tasks. As the legal environment becomes more permissive, we foresee investment in more robust education and training of technicians to cover these activities. Thus, with the adoption of robust practice policies and procedures, delegation of verbal orders and prescription transfers can be safe and effective, remove undue stress on pharmacists, and potentially free up pharmacist time for higher-order clinical care. Copyright © 2016 Elsevier Inc. All rights reserved.

  4. An algorithm for automated layout of process description maps drawn in SBGN.

    Science.gov (United States)

    Genc, Begum; Dogrusoz, Ugur

    2016-01-01

    Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specialize on process description (PD) maps as defined by SBGN. We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes and extensively making use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. An implementation of our algorithm in Java is available within the ChiLay library (https://github.com/iVis-at-Bilkent/chilay). Contact: ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
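
    The force scheme underlying CoSE can be illustrated with a minimal spring embedder; the CoSE-specific heuristics (compound nodes, complex tiling, SBGN placement rules) are omitted, and the constants below are arbitrary:

```python
import math
import random

def spring_layout(nodes, edges, ideal=1.0, iters=200, seed=2):
    """Minimal force-directed (spring embedder) layout in the spirit of CoSE:
    every node pair repels (ideal^2/d), every edge attracts (d^2/ideal);
    a cooling schedule shrinks the step size over the iterations."""
    rng = random.Random(seed)
    pos = {v: [rng.uniform(-1.0, 1.0), rng.uniform(-1.0, 1.0)] for v in nodes}
    for it in range(iters):
        step = 0.1 * (1.0 - it / iters)        # cooling
        force = {v: [0.0, 0.0] for v in nodes}
        for u in nodes:                         # pairwise repulsion
            for v in nodes:
                if u == v:
                    continue
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                rep = ideal * ideal / d
                force[u][0] += rep * dx / d
                force[u][1] += rep * dy / d
        for u, v in edges:                      # spring attraction along edges
            dx = pos[v][0] - pos[u][0]
            dy = pos[v][1] - pos[u][1]
            d = math.hypot(dx, dy) or 1e-9
            att = d * d / ideal
            force[u][0] += att * dx / d
            force[u][1] += att * dy / d
            force[v][0] -= att * dx / d
            force[v][1] -= att * dy / d
        for v in nodes:
            pos[v][0] += step * force[v][0]
            pos[v][1] += step * force[v][1]
    return pos
```

    At equilibrium an isolated edge settles near the ideal length, since repulsion ideal²/d and attraction d²/ideal balance at d = ideal.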

  5. Deferred slanted-edge analysis: a unified approach to spatial frequency response measurement on distorted images and color filter array subsets.

    Science.gov (United States)

    van den Bergh, F

    2018-03-01

    The slanted-edge method of spatial frequency response (SFR) measurement is usually applied to grayscale images under the assumption that any distortion of the expected straight edge is negligible. By decoupling the edge orientation and position estimation step from the edge spread function construction step, it is shown in this paper that the slanted-edge method can be extended to allow it to be applied to images suffering from significant geometric distortion, such as produced by equiangular fisheye lenses. This same decoupling also allows the slanted-edge method to be applied directly to Bayer-mosaicked images so that the SFR of the color filter array subsets can be measured directly without the unwanted influence of demosaicking artifacts. Numerical simulation results are presented to demonstrate the efficacy of the proposed deferred slanted-edge method in relation to existing methods.
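
    The decoupling argument concerns how the edge-spread function (ESF) is built; once an ESF exists, the rest of the slanted-edge pipeline is standard. A minimal sketch of that common tail, starting from an already-sampled ESF (edge estimation, projection and binning are omitted):

```python
import cmath
import math

def sfr_from_esf(esf):
    """Differentiate the edge-spread function to get the line-spread function,
    then take the normalized DFT magnitude as the SFR (MTF) estimate."""
    lsf = [esf[i + 1] - esf[i] for i in range(len(esf) - 1)]
    n = len(lsf)
    mags = [abs(sum(lsf[j] * cmath.exp(-2j * math.pi * k * j / n)
                    for j in range(n)))
            for k in range(n // 2)]
    return [m / mags[0] for m in mags]     # normalize so SFR(0) = 1

ideal = [0.0] * 8 + [1.0] * 8                         # perfect step edge
blurred = [0.0] * 5 + [0.25, 0.5, 0.75] + [1.0] * 8   # smeared edge
```

    A perfect step yields an impulse LSF and hence a flat SFR, while any blur attenuates the high frequencies.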

  6. The Role of Campus Support, Undocumented Identity, and Deferred Action for Childhood Arrivals on Civic Engagement for Latinx Undocumented Undergraduates.

    Science.gov (United States)

    Katsiaficas, Dalal; Volpe, Vanessa; Raza, Syeda S; Garcia, Yuliana

    2017-08-30

    This study examined civic engagement in a sample of 790 undocumented Latinx undergraduates (aged 18-30). The relations between social supports (campus safe spaces and peer support) and civic engagement and whether a strong sense of undocumented identity mediated this relation were examined. Competing statistical models examined the role of participants' status (whether or not they received temporary protection from deportation with Deferred Action for Childhood Arrivals [DACA]) in this mediational process. Results revealed that having a strong identification with being undocumented mediated the role of social supports on civic engagement in the overall sample, and that this process was specifically important for those with DACA status. The intersection of policies such as DACA and the lived experiences of Latinx undocumented college students are discussed. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.

  7. A comparison between physicians and computer algorithms for form CMS-2728 data reporting.

    Science.gov (United States)

    Malas, Mohammed Said; Wish, Jay; Moorthi, Ranjani; Grannis, Shaun; Dexter, Paul; Duke, Jon; Moe, Sharon

    2017-01-01

    The CMS-2728 form (Medical Evidence Report) assesses 23 comorbidities chosen to reflect poor outcomes and increased mortality risk. Previous studies have questioned the validity of physician reporting on form CMS-2728. We hypothesized that reporting of comorbidities by computer algorithms identifies more comorbidities than physician completion and is therefore more reflective of the underlying disease burden. We collected data from CMS-2728 forms for all 296 patients who had an incident ESRD diagnosis and received chronic dialysis from 2005 through 2014 at Indiana University outpatient dialysis centers. We analyzed patients' data from electronic medical record systems that collated information from multiple health care sources. Previously utilized algorithms or natural language processing were used to extract data on 10 comorbidities for a period of up to 10 years prior to ESRD incidence. These algorithms incorporate billing codes, prescriptions, and other relevant elements. We compared the presence or unchecked status of these comorbidities on the forms to their presence or absence according to the algorithms. The computer algorithms reported more comorbidities than the forms completed by physicians. This remained true when decreasing the data span to one year and using only a single health center source. The algorithms' determinations were well accepted by a physician panel. Importantly, use of the algorithms significantly increased the expected deaths and lowered the standardized mortality ratios. Computer algorithms showed superior identification of comorbidities for form CMS-2728 and altered standardized mortality ratios. Adapting similar algorithms in available EMR systems may offer more thorough evaluation of comorbidities and improve quality reporting. © 2016 International Society for Hemodialysis.

  8. Pediatric chest HRCT using the iDose4 Hybrid Iterative Reconstruction Algorithm: Which iDose level to choose?

    International Nuclear Information System (INIS)

    Smarda, M; Alexopoulou, E; Mazioti, A; Kordolaimi, S; Ploussi, A; Efstathopoulos, E; Priftis, K

    2015-01-01

    The purpose of the study is to determine the appropriate iterative reconstruction (IR) algorithm level that combines image quality and diagnostic confidence for pediatric patients undergoing high-resolution computed tomography (HRCT). During the last 2 years, a total of 20 children up to 10 years old with a clinical presentation of chronic bronchitis underwent HRCT in our department's 64-detector-row CT scanner using the iDose IR algorithm, with similar image settings (80 kVp, 40-50 mAs). CT images were reconstructed with all iDose levels (levels 1 to 7) as well as with the filtered back projection (FBP) algorithm. Subjective image quality was evaluated by 2 experienced radiologists in terms of image noise, sharpness, contrast and diagnostic acceptability using a 5-point scale (1 = excellent image, 5 = non-acceptable image). The presence of artifacts was also noted. All mean scores from both radiologists corresponded to satisfactory image quality (score ≤3), even with the FBP algorithm. Almost excellent (score <2) overall image quality was achieved with iDose levels 5 to 7, but oversmoothing artifacts appearing with iDose levels 6 and 7 affected diagnostic confidence. In conclusion, the use of iDose level 5 enables almost excellent image quality without considerable artifacts affecting the diagnosis. Further evaluation is needed in order to draw more precise conclusions. (paper)

  9. Algorithms

    Indian Academy of Sciences (India)

    polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used to describe an algorithm for execution on a computer. An algorithm expressed using a programming.

  10. 7 CFR 993.58 - Deferment of time for withholding.

    Science.gov (United States)

    2010-01-01

    ... undertaking shall be secured by a bond or bonds to be filed with and acceptable to the committee in the amount or amounts specified, conditioned upon full compliance with such undertaking. (b)(1) Each bond shall..., with reserve pool funds for distribution to equity holders. (3) If for any reason the committee is...

  11. A novel algorithm for image encryption based on mixture of chaotic maps

    International Nuclear Information System (INIS)

    Behnia, S.; Akhshani, A.; Mahmodi, H.; Akhavan, A.

    2008-01-01

    Chaos-based encryption appeared in the early 1990s as an original application of nonlinear dynamics in the chaotic regime. In this paper, an implementation of a digital image encryption scheme based on a mixture of chaotic systems is reported. The chaotic cryptography technique used in this paper is a symmetric key cryptography. In this algorithm, a typical coupled map was mixed with a one-dimensional chaotic map and used for high-security image encryption, while its speed remains acceptable. The proposed algorithm is described in detail, along with its security analysis and implementation. The experimental results based on the mixture of chaotic maps confirm the effectiveness of the proposed method and of its implementation. This mixed application of chaotic maps shows the advantages of a large key space and high-level security. The ciphertext generated by this method is the same size as the plaintext and is suitable for practical use in the secure transmission of confidential information over the Internet.

  12. Exergetic optimization of shell and tube heat exchangers using a genetic based algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Oezcelik, Yavuz [Ege University, Bornova, Izmir (Turkey). Engineering Faculty, Chemical Engineering Department

    2007-08-15

    In computer-based optimization, many thousands of alternative shell and tube heat exchangers may be examined by varying a large number of exchanger parameters, such as tube length, tube outer diameter, pitch size, layout angle, baffle space ratio, and number of tube-side passes. In the present study, a genetic-based algorithm was developed, programmed, and applied to estimate the optimum values of the discrete and continuous variables of MINLP (mixed integer nonlinear programming) test problems. The results of the test problems show that the genetic-based algorithm can estimate acceptable values of the continuous variables and optimum values of the integer variables. Finally, the genetic-based algorithm was extended to carry out parametric studies and to find the optimum configuration of heat exchangers by minimizing the sum of the annual capital cost and the exergetic cost of the shell and tube heat exchangers. The results of the example problems show that the proposed algorithm is applicable for finding optimum and near-optimum alternatives of shell and tube heat exchanger configurations. (author)
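
    The kind of mixed discrete/continuous search the paper performs can be sketched with a toy genetic algorithm; the quadratic surrogate below merely stands in for the annual capital-plus-exergetic cost, and the parameter names and bounds are invented for illustration:

```python
import random

def genetic_minimize(cost, int_bounds, cont_bounds, pop=30, gens=60, seed=3):
    """Toy genetic algorithm over mixed integer/continuous design variables:
    tournament selection, uniform crossover, per-gene mutation."""
    rng = random.Random(seed)
    ni = len(int_bounds)

    def random_individual():
        return ([rng.randint(lo, hi) for lo, hi in int_bounds] +
                [rng.uniform(lo, hi) for lo, hi in cont_bounds])

    population = [random_individual() for _ in range(pop)]
    best = min(population, key=cost)
    for _ in range(gens):
        offspring = []
        while len(offspring) < pop:
            a = min(rng.sample(population, 3), key=cost)   # tournament
            b = min(rng.sample(population, 3), key=cost)
            child = [a[i] if rng.random() < 0.5 else b[i]  # uniform crossover
                     for i in range(len(a))]
            for i in range(len(child)):                    # mutation
                if rng.random() < 0.15:
                    if i < ni:
                        lo, hi = int_bounds[i]
                        child[i] = rng.randint(lo, hi)
                    else:
                        lo, hi = cont_bounds[i - ni]
                        child[i] = rng.uniform(lo, hi)
            offspring.append(child)
        population = offspring
        cand = min(population, key=cost)
        if cost(cand) < cost(best):                        # keep best ever
            best = cand
    return best

# Hypothetical surrogate: one integer gene ("tube-side passes", optimum 4)
# and one continuous gene ("tube length", optimum 7.5 m).
surrogate = lambda x: (x[0] - 4) ** 2 + (x[1] - 7.5) ** 2
design = genetic_minimize(surrogate, int_bounds=[(1, 8)], cont_bounds=[(1.0, 12.0)])
```

    Encoding integer and continuous genes side by side is what lets a single population search the MINLP design space directly.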

  13. Key stakeholder perceptions about consent to participate in acute illness research: a rapid, systematic review to inform epi/pandemic research preparedness.

    Science.gov (United States)

    Gobat, Nina H; Gal, Micaela; Francis, Nick A; Hood, Kerenza; Watkins, Angela; Turner, Jill; Moore, Ronald; Webb, Steve A R; Butler, Christopher C; Nichol, Alistair

    2015-12-29

    A rigorous research response is required to inform clinical and public health decision-making during an epi/pandemic. However, the ethical conduct of such research, which often involves critically ill patients, may be complicated by the diminished capacity to consent and an imperative to initiate trial therapies within short time frames. Alternative approaches to taking prospective informed consent may therefore be used. We aimed to rapidly review evidence on key stakeholder (patients, their proxy decision-makers, clinicians and regulators) views concerning the acceptability of various approaches for obtaining consent relevant to pandemic-related acute illness research. We conducted a rapid evidence review, using the Internet, database and hand-searching for English language empirical publications from 1996 to 2014 on stakeholder opinions of consent models (prospective informed, third-party, deferred, or waived) used in acute illness research. We excluded research on consent to treatment, screening, or other such procedures, non-emergency research and secondary studies. Papers were categorised, and data summarised using narrative synthesis. We screened 689 citations, reviewed 104 full-text articles and included 52. Just one paper related specifically to pandemic research. In other emergency research contexts potential research participants, clinicians and research staff found third-party, deferred, and waived consent to be acceptable as a means to feasibly conduct such research. Acceptability to potential participants was motivated by altruism, trust in the medical community, and perceived value in medical research and decreased as the perceived risks associated with participation increased. Discrepancies were observed in the acceptability of the concept and application or experience of alternative consent models. Patients accepted clinicians acting as proxy-decision makers, with preference for two decision makers as invasiveness of interventions increased

  14. Applying wind turbines and battery storage to defer Orcas Power and Light Company distribution circuit upgrades

    International Nuclear Information System (INIS)

    Zaininger, H.W.; Barnes, P.R.

    1997-03-01

    The purpose of this study is to conduct a detailed assessment of the Orcas Power and Light Company (OPALCO) system to determine the potential for deferring the costly upgrade of the 25-kV Lopez-Eastsound circuit by the application of a MW-scale wind farm and battery storage facilities as appropriate. Local wind resource data have been collected over the past year and used to determine MW-scale wind farm performance. These hourly wind farm performance data are used with measured hourly Eastsound load data and recent OPALCO distribution system expansion plans and cost projections in performing this detailed benefit-cost assessment. The OPALCO distribution circuit expansion project and assumptions are described. MW-scale wind farm performance results are given. The economic benefit-cost results for the wind farm and battery storage applications on the OPALCO system, using OPALCO system design criteria and cost assumptions, are reported. A recalculation is presented of the benefit-cost results for similar potential wind farm and battery storage applications on other utility systems with higher marginal energy and demand costs. Conclusions and recommendations are presented.

  15. Chinese Nurses' Acceptance of PDA: A Cross-Sectional Survey Using a Technology Acceptance Model.

    Science.gov (United States)

    Wang, Yanling; Xiao, Qian; Sun, Liu; Wu, Ying

    2016-01-01

    This study explores Chinese nurses' acceptance of PDAs, using a questionnaire based on the framework of the Technology Acceptance Model (TAM). 357 nurses were involved in the study. The results reveal that nurses' acceptance scores for the PDA averaged 3.18 to 3.36 across the four dimensions. Younger age, higher professional title, longer previous usage time, and more experience using PDAs were all associated with greater acceptance. Hospital administrators may therefore adjust their strategies to enhance nurses' acceptance of PDAs and promote their wide application.

  16. Efficient algorithms for flow simulation related to nuclear reactor safety

    International Nuclear Information System (INIS)

    Gornak, Tatiana

    2013-01-01

    Safety analysis is of ultimate importance for operating Nuclear Power Plants (NPPs). The overall modeling and simulation of the physical and chemical processes occurring in the course of an accident is an interdisciplinary problem with origins in fluid dynamics, numerical analysis, reactor technology and computer programming. The aim of this study is therefore to create the foundations of a multi-dimensional non-isothermal fluid model for an NPP containment and a software tool based on it. The numerical simulations allow one to analyze and predict the behavior of NPP systems under different working and accident conditions, and to develop proper action plans for minimizing the risks of accidents and/or minimizing the consequences of possible accidents. A very large number of scenarios have to be simulated, and at the same time acceptable accuracy for the critical parameters, such as radioactive pollution, temperature, etc., has to be achieved. The existing software tools are either too slow or not accurate enough. This thesis deals with developing customized algorithms and software tools for the simulation of isothermal and non-isothermal flows in a containment pool of an NPP. Requirements for such software are formulated, and proper algorithms are presented. The goal of the work is to achieve a balance between accuracy and speed of calculation, and to develop a customized algorithm for this special case. Different discretization and solution approaches are studied, and those which correspond best to the formulated goal are selected, adjusted, and, when possible, analysed. A fast directional-splitting algorithm for the Navier-Stokes equations in complicated geometries, in the presence of solid and porous obstacles, is at the core of the algorithm. Developing a suitable pre-processor and customized domain decomposition algorithms is an essential part of the overall algorithm and software. Results from numerical simulations in test geometries and in real geometries are presented and discussed.

  17. Main Results of Updated Decommission Conception of NPPs Operating in Ukraine

    International Nuclear Information System (INIS)

    Purtov, Oleg; Masko, Alexander; Vasilchenko, Victor

    2014-01-01

    The updated decommissioning conception for NPPs operating in Ukraine presents the results of a long-term planning analysis based on six possible scenarios for nuclear energy development, with 15-year and 20-year extensions of unit operation beyond the original 30-year design lifetime. The characteristics of the two main decommissioning options, deferred and immediate dismantling, are presented; the two are close in acceptability, with the deferred dismantling variant relatively superior. According to the results of the comparative analysis, the best option for decommissioning an NPP unit is deferred dismantling with a 30-year dormancy period, and it can be taken as the basis for developing optimal decommissioning strategies in NPP unit design. The updated conception presents cost estimates for decommissioning WWER-440 and WWER-1000 reactor units. The updated assessment of the required annual payments to the Decommissioning Fund, assuming uniform cost accumulation and the deferred dismantling variant with a 20-year lifetime extension, is US$98.2 million per year. This value is 3.61% of the electricity generated by NPPs in Ukraine and supplied to the wholesale electricity market of Ukraine in the 2012 base year. (authors)

  18. STAR Algorithm Integration Team - Facilitating operational algorithm development

    Science.gov (United States)

    Mikles, V. J.

    2015-12-01

    The NOAA/NESDIS Center for Satellite Research and Applications (STAR) provides technical support of the Joint Polar Satellite System (JPSS) algorithm development and integration tasks. Utilizing data from the S-NPP satellite, JPSS generates over thirty Environmental Data Records (EDRs) and Intermediate Products (IPs) spanning atmospheric, ocean, cryosphere, and land weather disciplines. The Algorithm Integration Team (AIT) brings technical expertise and support to product algorithms, specifically in testing and validating science algorithms in a pre-operational environment. The AIT verifies that new and updated algorithms function in the development environment, enforces established software development standards, and ensures that delivered packages are functional and complete. AIT facilitates the development of new JPSS-1 algorithms by implementing a review approach based on the Enterprise Product Lifecycle (EPL) process. Building on relationships established during the S-NPP algorithm development process and coordinating directly with science algorithm developers, the AIT has implemented structured reviews with self-contained document suites. The process has supported algorithm improvements for products such as ozone, active fire, vegetation index, and temperature and moisture profiles.

  19. Signal quality indices and data fusion for determining clinical acceptability of electrocardiograms

    International Nuclear Information System (INIS)

    Clifford, G D; Behar, J; Li, Q; Rezek, I

    2012-01-01

    A completely automated algorithm to detect poor-quality electrocardiograms (ECGs) is described. The algorithm is based on both novel and previously published signal quality metrics, originally designed for intensive care monitoring. The algorithms have been adapted for use on short (5–10 s) single- and multi-lead ECGs. The metrics quantify spectral energy distribution, higher order moments and inter-channel and inter-algorithm agreement. Seven metrics were calculated for each channel (84 features in all) and presented to either a multi-layer perceptron artificial neural network or a support vector machine (SVM) for training on a multiple-annotator labelled and adjudicated training dataset. A single-lead version of the algorithm was also developed in a similar manner. Data were drawn from the PhysioNet Challenge 2011 dataset where binary labels were available, on 1500 12-lead ECGs indicating whether the entire recording was acceptable or unacceptable for clinical interpretation. We re-annotated all the leads in both the training set (1000 labelled ECGs) and test dataset (500 12-lead ECGs where labels were not publicly available) using two independent annotators, and a third for adjudication of differences. We found that low-quality data accounted for only 16% of the ECG leads. To balance the classes (between high and low quality), we created extra noisy data samples by adding noise from PhysioNet’s noise stress test database to some of the clean 12-lead ECGs. No data were shared between training and test sets. Classification accuracies of 98% on the training data and 97% on the test data were achieved. Upon inspection, incorrectly classified data were found to be borderline cases which could be classified either way. If these cases were more consistently labelled, we expect our approach to achieve an accuracy closer to 100%. (paper)
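
The seven per-channel metrics are not spelled out in the abstract; as a hedged illustration, one widely used spectral quality index (a band-power ratio, here with assumed 5–15 Hz and 5–40 Hz bands) can be sketched as:

```python
import numpy as np

def spectral_sqi(ecg, fs):
    """Spectral quality index: fraction of power in the QRS band (5-15 Hz)
    relative to the 5-40 Hz band. High values mean the energy sits where
    QRS complexes live; broadband noise scores low."""
    freqs = np.fft.rfftfreq(len(ecg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(ecg - np.mean(ecg))) ** 2
    qrs = power[(freqs >= 5) & (freqs <= 15)].sum()
    broad = power[(freqs >= 5) & (freqs <= 40)].sum()
    return qrs / broad if broad > 0 else 0.0

fs = 250                                  # Hz, a typical ECG sampling rate
t = np.arange(0, 10, 1.0 / fs)            # a 10 s record
narrowband = np.sin(2 * np.pi * 10 * t)   # energy concentrated at 10 Hz
noise = np.random.default_rng(0).normal(size=t.size)

sqi_clean = spectral_sqi(narrowband, fs)
sqi_noise = spectral_sqi(noise, fs)
```

A narrowband, QRS-like signal concentrates its power in the lower band and scores near 1, while white noise scores near the ratio of the two bandwidths.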

  20. Dynamic phasing of multichannel cw laser radiation by means of a stochastic gradient algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Volkov, V A; Volkov, M V; Garanin, S G; Dolgopolov, Yu V; Kopalkin, A V; Kulikov, S M; Starikov, F A; Sukharev, S A; Tyutin, S V; Khokhlov, S V; Chaparin, D A [Russian Federal Nuclear Center ' All-Russian Research Institute of Experimental Physics' , Sarov, Nizhnii Novgorod region (Russian Federation)

    2013-09-30

    The phasing of a multichannel laser beam by means of an iterative stochastic parallel gradient (SPG) algorithm has been numerically and experimentally investigated. The operation of the SPG algorithm is simulated, the acceptable range of amplitudes of probe phase shifts is found, and the algorithm parameters at which the desired Strehl number can be obtained with a minimum number of iterations are determined. An experimental bench with phase modulators based on lithium niobate, which are controlled by a multichannel electronic unit with a real-time microcontroller, has been designed. Phasing of 16 cw laser beams at a system response bandwidth of 3.7 kHz and phase thermal distortions in a frequency band of about 10 Hz is experimentally demonstrated. The experimental data are in complete agreement with the calculation results. (control of laser radiation parameters)
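
The SPG iteration itself is compact; a numerical sketch for 16 channels, using the normalized on-axis intensity as a Strehl-like metric (the probe amplitude and gain below are assumed values, not those of the paper):

```python
import numpy as np

def strehl(phases):
    """Normalized on-axis intensity of N coherently combined unit-amplitude
    beams; 1.0 means perfectly phased."""
    return np.abs(np.exp(1j * phases).sum()) ** 2 / len(phases) ** 2

rng = np.random.default_rng(1)
N = 16
phases = rng.uniform(-np.pi, np.pi, N)  # random piston errors on 16 channels
beta, gamma = 0.1, 5.0                  # probe amplitude and gain (assumed)

j_start = strehl(phases)
for _ in range(5000):
    delta = beta * rng.choice([-1.0, 1.0], N)  # parallel probe perturbations
    dJ = strehl(phases + delta) - strehl(phases - delta)
    phases += gamma * dJ * delta               # stochastic parallel gradient step
j_final = strehl(phases)
```

Each iteration applies the same random ±β probe to all channels in parallel, measures the metric change, and nudges every phase in proportion, which is what makes the method scale to many channels.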

  1. 12 CFR 612.2150 - Employees-prohibited conduct.

    Science.gov (United States)

    2010-01-01

    ... institution, including borrowers and loan applicants. (f) Accept, directly or indirectly, any gift, fee, or... position to solicit or obtain any gift, fee, or other present or deferred compensation or for any other... to transactions involving the purchase or sale of real estate intended for the use of the employee, a...

  2. Validation of near infrared satellite based algorithms to relative atmospheric water vapour content over land

    International Nuclear Information System (INIS)

    Serpolla, A.; Bonafoni, S.; Basili, P.; Biondi, R.; Arino, O.

    2009-01-01

    This paper presents the validation results of the ENVISAT MERIS and TERRA MODIS retrieval algorithms for atmospheric Water Vapour Content (WVC) estimation in clear-sky conditions over land. The MERIS algorithm exploits the ratio of the radiance in the absorbing channel at 900 nm to that in the almost absorption-free reference channel at 890 nm, while the MODIS algorithm is based on the ratio of measurements centred near 0.905, 0.936, and 0.94 μm to the atmospheric-window reflectances at 0.865 and 1.24 μm. The first test was performed in the Mediterranean area using WVC provided by both ECMWF and AERONET. As a second step, the performance of the algorithms was tested using WVC computed from radio soundings (RAOBs) in north-east Australia. The comparisons against reference WVC values showed an overestimation of WVC by MODIS (root-mean-square error percentage greater than 20%) and an acceptable performance of the MERIS algorithms (root-mean-square error percentage around 10%)
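
The root-mean-square error percentages used for the comparison can be computed as follows (the WVC values are hypothetical, not the paper's data):

```python
import numpy as np

def rmse_percent(retrieved, reference):
    """Root-mean-square error as a percentage of the mean reference value."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    rmse = np.sqrt(np.mean((retrieved - reference) ** 2))
    return 100.0 * rmse / reference.mean()

# Hypothetical WVC values in g/cm^2: a near-unbiased retrieval versus one
# with a systematic 25% overestimate.
ref = np.array([1.0, 1.5, 2.0, 2.5, 3.0])
unbiased = ref * np.array([1.05, 0.95, 1.08, 0.92, 1.02])
biased = ref * 1.25

err_unbiased = rmse_percent(unbiased, ref)
err_biased = rmse_percent(biased, ref)
```

A systematic overestimate of the kind reported for MODIS drives the RMSE percentage well past 20%, while scatter of a few percent stays near the 10% level reported for MERIS.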

  3. Age and Acceptance of Euthanasia.

    Science.gov (United States)

    Ward, Russell A.

    1980-01-01

    Study explores relationship between age (and sex and race) and acceptance of euthanasia. Women and non-Whites were less accepting because of religiosity. Among older people less acceptance was attributable to their lesser education and greater religiosity. Results suggest that quality of life in old age affects acceptability of euthanasia. (Author)

  4. Selfish Gene Algorithm Vs Genetic Algorithm: A Review

    Science.gov (United States)

    Ariff, Norharyati Md; Khalid, Noor Elaiza Abdul; Hashim, Rathiah; Noor, Noorhayati Mohamed

    2016-11-01

    Evolutionary algorithms are among the algorithms inspired by nature. Within little more than a decade, hundreds of papers have reported successful applications of EAs. This paper reviews the Selfish Gene Algorithm (SFGA), one of the more recent evolutionary algorithms (EAs), inspired by the Selfish Gene Theory, an interpretation of Darwinian ideas by the biologist Richard Dawkins (1989). Following a brief introduction to the SFGA, the chronology of its evolution is presented. The purpose of this paper is to present an overview of the concepts of the SFGA as well as its opportunities and challenges. Accordingly, the history of the algorithm and the steps involved in it are discussed, and its different applications are evaluated together with an analysis of these applications.
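
The SFGA replaces the explicit population of a classical GA with a "virtual population" of allele frequencies that are reinforced when their carriers win tournaments. A minimal sketch on the OneMax problem (a simplified reading of the algorithm; all parameters are assumed):

```python
import random

def sfga_onemax(length=20, steps=3000, lr=0.02, seed=42):
    """Selfish Gene Algorithm sketch on OneMax (maximize the number of
    1-bits). The 'virtual population' is a vector of marginal probabilities
    for allele 1 at each locus; two individuals are sampled from it, and
    the winner's alleles are reinforced."""
    random.seed(seed)
    p = [0.5] * length  # allele-1 frequencies in the virtual population
    for _ in range(steps):
        a = [1 if random.random() < q else 0 for q in p]
        b = [1 if random.random() < q else 0 for q in p]
        winner = a if sum(a) >= sum(b) else b
        for i, allele in enumerate(winner):  # reward the winner's alleles
            target = 1.0 if allele == 1 else 0.0
            p[i] = min(0.99, max(0.01, p[i] + lr * (target - p[i])))
    best = [1 if q > 0.5 else 0 for q in p]
    return sum(best)

best_bits = sfga_onemax()
```

Note there is no crossover or explicit population storage: selection pressure acts directly on the gene frequencies.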

  5. Robust perception algorithms for road and track autonomous following

    Science.gov (United States)

    Marion, Vincent; Lecointe, Olivier; Lewandowski, Cecile; Morillon, Joel G.; Aufrere, Romuald; Marcotegui, Beatrix; Chapuis, Roland; Beucher, Serge

    2004-09-01

    The French Military Robotic Study Program (introduced at Aerosense 2003), sponsored by the French Defense Procurement Agency and managed by Thales Airborne Systems as the prime contractor, focuses on about 15 robotic themes that can provide an immediate "operational add-on value." The paper details the "road and track following" theme (named AUT2), whose main purpose was to develop a vision-based subsystem to automatically detect the roadsides of an extended range of roads and tracks suitable for military missions. To achieve this goal, efforts focused on three main areas: (1) improvement of image quality at the algorithm inputs, thanks to the selection of adapted video cameras and the development of a Thales-patented algorithm that removes in real time most of the disturbing shadows in images taken in natural environments, enhances contrast, and reduces reflection effects due to films of water; (2) selection and improvement of two complementary algorithms (one segment-oriented, the other region-based); (3) development of a fusion process between both algorithms, which feeds a road model in real time with the best available data. Each step was developed so that the global perception process is reliable and safe: as an example, the process continuously evaluates itself and outputs confidence criteria qualifying the roadside detection. The paper presents the processes in detail, along with the results obtained from the military acceptance tests that were passed, which trigger the next step: autonomous track following (named AUT3).

  6. Texas Medication Algorithm Project: development and feasibility testing of a treatment algorithm for patients with bipolar disorder.

    Science.gov (United States)

    Suppes, T; Swann, A C; Dennehy, E B; Habermacher, E D; Mason, M; Crismon, M L; Toprac, M G; Rush, A J; Shon, S P; Altshuler, K Z

    2001-06-01

    Use of treatment guidelines for treatment of major psychiatric illnesses has increased in recent years. The Texas Medication Algorithm Project (TMAP) was developed to study the feasibility and process of developing and implementing guidelines for bipolar disorder, major depressive disorder, and schizophrenia in the public mental health system of Texas. This article describes the consensus process used to develop the first set of TMAP algorithms for the Bipolar Disorder Module (Phase 1) and the trial testing the feasibility of their implementation in inpatient and outpatient psychiatric settings across Texas (Phase 2). The feasibility trial answered core questions regarding implementation of treatment guidelines for bipolar disorder. A total of 69 patients were treated with the original algorithms for bipolar disorder developed in Phase 1 of TMAP. Results indicated that physicians accepted the guidelines, followed recommendations to see patients at certain intervals, and utilized sequenced treatment steps differentially over the course of treatment. While improvements in clinical symptoms (24-item Brief Psychiatric Rating Scale) were observed over the course of enrollment in the trial, these conclusions are limited by the fact that physician volunteers were utilized for both treatment and ratings, and there was no control group. Results from Phases 1 and 2 indicate that it is possible to develop and implement a treatment guideline for patients with a history of mania in public mental health clinics in Texas. TMAP Phase 3, a recently completed larger and controlled trial assessing the clinical and economic impact of treatment guidelines and patient and family education in the public mental health system of Texas, improves upon this methodology.

  7. Implementation of several mathematical algorithms to breast tissue density classification

    International Nuclear Information System (INIS)

    Quintana, C.; Redondo, M.; Tirao, G.

    2014-01-01

    The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, since dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on intrinsic property calculations and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross correlation and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with expert medical diagnoses, showing good performance. The implemented algorithms revealed high potential for classifying breasts into tissue density categories. - Highlights: • Breast density classification can be obtained by suitable mathematical algorithms. • Mathematical processing helps radiologists obtain the BI-RADS classification. • The entropy and joint entropy show high performance for density classification
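
Two of the named categorization parameters, joint entropy and mutual information, can be estimated from image histograms; a minimal sketch (the paper's exact formulation and binning may differ):

```python
import numpy as np

def joint_entropy(x, y, bins=32):
    """Joint entropy H(X,Y) of two equal-size grayscale images, in bits,
    estimated from their 2D intensity histogram."""
    h, _, _ = np.histogram2d(x.ravel(), y.ravel(), bins=bins)
    p = h / h.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=32):
    """I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    def entropy(v):
        h, _ = np.histogram(v.ravel(), bins=bins)
        p = h / h.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))
    return entropy(x) + entropy(y) - joint_entropy(x, y, bins)

rng = np.random.default_rng(0)
dense = rng.uniform(size=(64, 64))       # stand-in for a textured tissue image
copy = dense.copy()                      # perfectly dependent image
unrelated = rng.uniform(size=(64, 64))   # independent image
```

Mutual information peaks when the two images are identical and drops toward zero for independent ones, which is why it can grade how far a mammogram departs from an ideal reference.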

  8. Algorithmic mathematics

    CERN Document Server

    Hougardy, Stefan

    2016-01-01

    Algorithms play an increasingly important role in nearly all fields of mathematics. This book allows readers to develop basic mathematical abilities, in particular those concerning the design and analysis of algorithms as well as their implementation. It presents not only fundamental algorithms like the sieve of Eratosthenes, the Euclidean algorithm, sorting algorithms, algorithms on graphs, and Gaussian elimination, but also discusses elementary data structures, basic graph theory, and numerical questions. In addition, it provides an introduction to programming and demonstrates in detail how to implement algorithms in C++. This textbook is suitable for students who are new to the subject and covers a basic mathematical lecture course, complementing traditional courses on analysis and linear algebra. Both authors have given this "Algorithmic Mathematics" course at the University of Bonn several times in recent years.

  9. Application of Data Mining Algorithm to Recipient of Motorcycle Installment

    Directory of Open Access Journals (Sweden)

    Harry Dhika

    2015-12-01

    Full Text Available The study was conducted in subsidiaries that provide financing services related to the purchase of motorcycles on credit. At the time of application, consumers enter their personal data, on the basis of which the consumer's credit application is approved or rejected. From the 224 consumer records obtained, the applications of 87%, or about 217 consumers, were approved, and those of 16%, or 6 consumers, were rejected. Acceptance of motorcycle financing on credit is modelled by applying an algorithm within CRISP-DM, the industry-standard process for data mining. The algorithm used in the decision making is C4.5. The accuracy of the resulting model is then measured with the confusion matrix and the Receiver Operating Characteristic (ROC). Evaluation with the confusion matrix is intended to obtain the accuracy, precision, and recall values, while the Receiver Operating Characteristic (ROC) is used to obtain comparison tables and the Area Under the Curve (AUC).
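
The confusion-matrix evaluation named here reduces to three ratios; a minimal sketch with hypothetical approve/reject counts (not the study's data):

```python
def confusion_metrics(tp, fp, fn, tn):
    """Accuracy, precision and recall from a binary confusion matrix:
    tp/fp/fn/tn = true/false positives and negatives."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return accuracy, precision, recall

# Hypothetical counts for an approve/reject credit classifier on 224 cases.
acc, prec, rec = confusion_metrics(tp=200, fp=10, fn=4, tn=10)
```

Accuracy mixes both classes, precision penalizes wrongly approved applications, and recall penalizes wrongly rejected ones, which is why all three are reported together.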

  10. Implementation of several mathematical algorithms to breast tissue density classification

    Science.gov (United States)

    Quintana, C.; Redondo, M.; Tirao, G.

    2014-02-01

    The accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, since dense breast tissue can hide lesions, causing cancer to be detected at later stages. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. This paper presents the implementation and performance of different mathematical algorithms designed to standardize the categorization of mammographic images according to the American College of Radiology classifications. These mathematical techniques are based on intrinsic property calculations and on comparison with an ideal homogeneous image (joint entropy, mutual information, normalized cross correlation and index Q) as categorization parameters. The algorithms were evaluated on 100 cases from the mammographic data sets provided by the Ministerio de Salud de la Provincia de Córdoba, Argentina—Programa de Prevención del Cáncer de Mama (Department of Public Health, Córdoba, Argentina, Breast Cancer Prevention Program). The obtained breast classifications were compared with expert medical diagnoses, showing good performance. The implemented algorithms revealed high potential for classifying breasts into tissue density categories.

  11. Essential algorithms a practical approach to computer algorithms

    CERN Document Server

    Stephens, Rod

    2013-01-01

    A friendly and accessible introduction to the most useful algorithms. Computer algorithms are the basic recipes for programming. Professional programmers need to know how to use algorithms to solve difficult programming problems. Written in simple, intuitive English, this book describes how and when to use the most practical classic algorithms, and even how to create new algorithms to meet future needs. The book also includes a collection of questions that can help readers prepare for a programming job interview. Reveals methods for manipulating common data structures.

  12. FUZZY ACCEPTANCE SAMPLING AND CHARACTERISTIC CURVES

    Directory of Open Access Journals (Sweden)

    Ebru Turanoğlu

    2012-02-01

    Full Text Available Acceptance sampling is primarily used for the inspection of incoming or outgoing lots. Acceptance sampling refers to the application of specific sampling plans to a designated lot or sequence of lots. The parameters of acceptance sampling plans are sample sizes and acceptance numbers. In some cases, it may not be possible to define acceptance sampling parameters as crisp values. These parameters can be expressed by linguistic variables. The fuzzy set theory can be successfully used to cope with the vagueness in these linguistic expressions for acceptance sampling. In this paper, the main distributions of acceptance sampling plans are handled with fuzzy parameters and their acceptance probability functions are derived. Then the characteristic curves of acceptance sampling are examined under fuzziness. Illustrative examples are given.
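
The crisp building block of these plans is the binomial acceptance probability Pa = P(d ≤ c) for sample size n and acceptance number c; a simple interval version (an assumption standing in for the paper's fuzzy parameters) shows how vagueness in the fraction defective turns the OC curve into a band:

```python
from math import comb

def accept_prob(p, n, c):
    """Binomial acceptance probability Pa = P(d <= c) for a single
    sampling plan with sample size n and acceptance number c, where p
    is the lot fraction defective."""
    return sum(comb(n, d) * p**d * (1 - p) ** (n - d) for d in range(c + 1))

def fuzzy_accept_prob(p_low, p_high, n, c):
    """Interval (alpha-cut style) version: when the fraction defective is
    only known linguistically as a range [p_low, p_high], Pa becomes the
    band [Pa(p_high), Pa(p_low)] because Pa decreases in p."""
    return accept_prob(p_high, n, c), accept_prob(p_low, n, c)

pa = accept_prob(0.02, n=50, c=2)          # crisp point on the OC curve
band = fuzzy_accept_prob(0.01, 0.03, n=50, c=2)
```

Sweeping p (or the interval) over its range traces the characteristic (OC) curve, or the OC band under fuzziness.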

  13. Algorithmic cryptanalysis

    CERN Document Server

    Joux, Antoine

    2009-01-01

    Illustrating the power of algorithms, Algorithmic Cryptanalysis describes algorithmic methods with cryptographically relevant examples. Focusing on both private- and public-key cryptographic algorithms, it presents each algorithm either as a textual description, in pseudo-code, or in a C code program. Divided into three parts, the book begins with a short introduction to cryptography and a background chapter on elementary number theory and algebra. It then moves on to algorithms, with each chapter in this section dedicated to a single topic and often illustrated with simple cryptographic applications.

  14. Iterative concurrent reconstruction algorithms for emission computed tomography

    International Nuclear Information System (INIS)

    Brown, J.K.; Hasegawa, B.H.; Lang, T.F.

    1994-01-01

    Direct reconstruction techniques, such as those based on filtered backprojection, are typically used for emission computed tomography (ECT), even though it has been argued that iterative reconstruction methods may produce better clinical images. The major disadvantage of iterative reconstruction algorithms, and a significant reason for their lack of clinical acceptance, is their computational burden. We outline a new class of "concurrent" iterative reconstruction techniques for ECT in which the reconstruction process is reorganized such that a significant fraction of the computational processing occurs concurrently with the acquisition of ECT projection data. These new algorithms use the 10-30 min required for acquisition of a typical SPECT scan to iteratively process the available projection data, significantly reducing the requirements for post-acquisition processing. The algorithms are tested on SPECT projection data from a Hoffman brain phantom acquired with 2 × 10^5 counts in 64 views, each having 64 projections. The SPECT images are reconstructed as 64 × 64 tomograms, starting with six angular views. Other angular views are added to the reconstruction process sequentially, in a manner that reflects their availability for a typical acquisition protocol. The results suggest that if T s of concurrent processing is used, the reconstruction processing time required after completion of the data acquisition can be reduced by at least T/3 s. (Author)

  15. Clinical Implications of TiGRT Algorithm for External Audit in Radiation Oncology.

    Science.gov (United States)

    Shahbazi-Gahrouei, Daryoush; Saeb, Mohsen; Monadi, Shahram; Jabbari, Iraj

    2017-01-01

    Performing audits plays an important role in a quality assurance program in radiation oncology. Among different algorithms, TiGRT is one of the common software applications for dose calculation. This study assessed the clinical implications of the TiGRT algorithm in an external audit by measuring doses and comparing them with the calculated doses delivered to patients for a variety of cases, with and without the presence of inhomogeneities and beam modifiers. A nonhomogeneous quality-dose-verification phantom, Farmer ionization chambers, and a PC-electrometer (Sun Nuclear, USA) as a reference-class electrometer were employed throughout the audit on linear accelerators at 6 and 18 MV energies (Siemens ONCOR Impression Plus, Germany). Seven test cases were performed using a semi-CIRS phantom. In homogeneous regions and simple plans for both energies, there was good agreement between the measured dose and that calculated by the treatment planning system; the relative error was found to be between 0.8% and 3%, which is acceptable for an audit, but in nonhomogeneous organs, such as the lung, some errors were observed. In complex treatment plans, when a wedge or shield was placed in the beam path, the error was within the accepted criteria; in these complex beam plans, the difference between measured and calculated dose was found to be 2%-3%. All other differences were between 0.4% and 1%. Good consistency was observed for the same type of energy in the homogeneous and nonhomogeneous phantom for three-dimensional conformal fields with wedge, shield, and asymmetric settings using the TiGRT treatment planning software in the studied center. The results revealed that the national status of TPS calculations and dose delivery for 3D conformal radiotherapy was globally within acceptable standards, with no major causes for concern.

  16. Denni Algorithm An Enhanced Of SMS (Scan, Move and Sort) Algorithm

    Science.gov (United States)

    Aprilsyah Lubis, Denni; Salim Sitompul, Opim; Marwan; Tulus; Andri Budiman, M.

    2017-12-01

    Sorting has been a profound area for algorithmic researchers, and many resources are invested in the search for better sorting algorithms. For this purpose, many existing sorting algorithms have been examined in terms of the efficiency of their algorithmic complexity. Efficient sorting is important to optimize the use of other algorithms that require sorted lists to work correctly. Sorting is considered a fundamental problem in the study of algorithms for many reasons: the need to sort information is inherent in many applications; algorithms often use sorting as a key subroutine; many essential design techniques are represented in the body of sorting algorithms; and many engineering issues come to the fore when implementing sorting algorithms. Many algorithms are well known for sorting unordered lists, and one well-known algorithm that makes the process of sorting more economical and efficient is the SMS (Scan, Move and Sort) algorithm, an enhancement of Quicksort invented by Rami Mansi in 2010. This paper presents a new sorting algorithm called the Denni algorithm. The Denni algorithm is considered an enhancement of the SMS algorithm in the average and worst cases. The Denni algorithm is compared with the SMS algorithm, and the results were promising.
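
The SMS and Denni algorithms themselves are not specified in the abstract; the quicksort baseline that SMS enhances can be sketched as follows (Lomuto partition):

```python
def quicksort(a, lo=0, hi=None):
    """In-place quicksort with Lomuto partitioning: pick the last element
    as pivot, move smaller elements left of it, recurse on both halves."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] <= pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]   # place the pivot in its final position
        quicksort(a, lo, i - 1)
        quicksort(a, i + 1, hi)
    return a

data = [5, 2, 9, 1, 5, 6, -3, 0]
sorted_data = quicksort(data)
```

Enhancements such as SMS target exactly the weak spots of this baseline: its worst-case O(n²) behavior and the cost of the partitioning passes.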

  17. An Algorithm-Based Approach for Behavior and Disease Management in Children.

    Science.gov (United States)

    Meyer, Beau D; Lee, Jessica Y; Thikkurissy, S; Casamassimo, Paul S; Vann, William F

    2018-03-15

    Pharmacologic behavior management for dental treatment is an approach to provide invasive yet compassionate care for young children; it can facilitate the treatment of children who otherwise may not cooperate for traditional in-office care. Some recent highly publicized procedural sedation-related tragedies have drawn attention to risks associated with pharmacologic management. However, it remains widely accepted that, by adhering to proper guidelines, procedural sedation can assist in the provision of high-quality dental care while minimizing morbidity and mortality from the procedure. The purpose of this paper was to propose an algorithm for clinicians to consider when selecting a behavior and disease management strategy for early childhood caries. This algorithm will not ensure a positive outcome but can assist clinicians when counseling caregivers about risks, benefits, and alternatives. It also emphasizes and underscores best-safety practices.

  18. Python algorithms mastering basic algorithms in the Python language

    CERN Document Server

    Hetland, Magnus Lie

    2014-01-01

    Python Algorithms, Second Edition explains the Python approach to algorithm analysis and design. Written by Magnus Lie Hetland, author of Beginning Python, this book is sharply focused on classical algorithms, but it also gives a solid understanding of fundamental algorithmic problem-solving techniques. The book deals with some of the most important and challenging areas of programming and computer science in a highly readable manner. It covers both algorithmic theory and programming practice, demonstrating how theory is reflected in real Python programs. Well-known algorithms and data struc

  19. UAS Air Traffic Controller Acceptability Study-2: Effects of Communications Delays and Winds in Simulation

    Science.gov (United States)

    Comstock, James R., Jr.; Ghatas, Rania W.; Consiglio, Maria C.; Chamberlain, James P.; Hoffler, Keith D.

    2016-01-01

    This study evaluated the effects of communications delays and winds on air traffic controller ratings of the acceptability of horizontal miss distances (HMDs) for encounters between UAS and manned aircraft in a simulation of the Dallas-Ft. Worth East-side airspace. Fourteen encounters per hour were staged in the presence of moderate background traffic. Seven recently retired controllers with experience at DFW served as subjects. Guidance to the UAS pilots for maintaining a given HMD was provided by self-separation algorithms displayed on the Multi-Aircraft Simulation System. The winds tested did not affect the acceptability ratings. Communications delays tested included 0, 400, 1200, and 1800 msec. For longer communications delays, changes in strategy and communications flow were observed and reported by the controllers. The aim of this work is to provide useful information for guiding future rules and regulations applicable to flying UAS in the NAS.

  20. Hybrid Cryptosystem Using Tiny Encryption Algorithm and LUC Algorithm

    Science.gov (United States)

    Rachmawati, Dian; Sharif, Amer; Jaysilen; Andri Budiman, Mohammad

    2018-01-01

    Security is a very important issue in data transmission, and there are many methods to make files more secure. One of those methods is cryptography. Cryptography is a method of securing a file by writing it in a hidden code that conceals the original file. Therefore, people who do not hold the key cannot decrypt the hidden code to read the original file. Many methods are used in cryptography; one of them is the hybrid cryptosystem. A hybrid cryptosystem is a method that uses a symmetric algorithm to secure the file and an asymmetric algorithm to secure the symmetric algorithm's key. In this research, the TEA algorithm is used as the symmetric algorithm and the LUC algorithm as the asymmetric algorithm. The system is tested by encrypting and decrypting the file with the TEA algorithm and using the LUC algorithm to encrypt and decrypt the TEA key. The result of this research is that when the TEA algorithm encrypts the file, the ciphertext consists of characters from the ASCII (American Standard Code for Information Interchange) table in the form of hexadecimal numbers, and the ciphertext size increases by sixteen bytes as the plaintext length is increased by eight characters.
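
TEA itself is a standard published cipher; a minimal sketch of one 64-bit block encryption and decryption with the usual 32-cycle schedule (the LUC asymmetric half of the hybrid scheme is omitted):

```python
def tea_encrypt(block, key):
    """Encrypt one 64-bit block (two 32-bit words) with a 128-bit key
    (four 32-bit words) using the standard 32-cycle TEA schedule."""
    v0, v1 = block
    k0, k1, k2, k3 = key
    delta, s, mask = 0x9E3779B9, 0, 0xFFFFFFFF
    for _ in range(32):
        s = (s + delta) & mask
        v0 = (v0 + (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & mask
        v1 = (v1 + (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & mask
    return v0, v1

def tea_decrypt(block, key):
    """Run the 32 cycles in reverse to recover the plaintext block."""
    v0, v1 = block
    k0, k1, k2, k3 = key
    delta, mask = 0x9E3779B9, 0xFFFFFFFF
    s = (delta * 32) & mask
    for _ in range(32):
        v1 = (v1 - (((v0 << 4) + k2) ^ (v0 + s) ^ ((v0 >> 5) + k3))) & mask
        v0 = (v0 - (((v1 << 4) + k0) ^ (v1 + s) ^ ((v1 >> 5) + k1))) & mask
        s = (s - delta) & mask
    return v0, v1

key = (0x01234567, 0x89ABCDEF, 0xFEDCBA98, 0x76543210)
cipher = tea_encrypt((0xDEADBEEF, 0xCAFEBABE), key)
plain = tea_decrypt(cipher, key)
```

In the hybrid scheme described, this 128-bit key would itself be encrypted with the asymmetric (LUC) algorithm before transmission.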

  1. Public acceptance of nuclear power

    International Nuclear Information System (INIS)

    Wildgruber, O.H.

    1990-01-01

    The lecture addresses the question of why public acceptance work is needed and provides some answers to it. It explains various human behaviour patterns that determine the basis for public acceptance. To some extent, the opposition to nuclear energy and the role the media play are described. The public acceptance efforts of industry are critically reviewed, and some hints on the difficulties with polling are provided. The lecture concludes with recommendations for further public acceptance work. (author)

  2. Parallel genetic algorithms with migration for the hybrid flow shop scheduling problem

    Directory of Open Access Journals (Sweden)

    K. Belkadi

    2006-01-01

    Full Text Available This paper addresses scheduling problems in hybrid flow-shop-like systems with a migration parallel genetic algorithm (PGA_MIG). This parallel genetic algorithm model allows genetic diversity by applying selection and reproduction mechanisms closer to nature. The spatial structure of the population is modified by dividing it into disjoint subpopulations. From time to time, individuals are exchanged between the different subpopulations (migration). The influence of parameters and dedicated strategies is studied. These parameters are the number of independent subpopulations, the interconnection topology between subpopulations, the choice/replacement strategy for the migrant individuals, and the migration frequency. A comparison between the sequential and parallel versions of the genetic algorithm (GA) is provided, relating to both the quality of the solution and the execution time of the two versions. The efficiency of the parallel model depends highly on the parameters, especially the migration frequency. Likewise, the parallel model gives a significant improvement in computational time if it is implemented on a parallel architecture offering a sufficient number of processors (as many processors as subpopulations).
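
The island mechanics described, disjoint subpopulations with periodic migration, can be sketched on a toy OneMax fitness (the ring topology, migration frequency, and all parameters below are assumed, not the paper's):

```python
import random

def evolve(pop, generations, mut=0.05):
    """Tournament selection, uniform crossover and bit-flip mutation on one
    subpopulation; fitness is OneMax (number of 1-bits)."""
    for _ in range(generations):
        new = []
        for _ in range(len(pop)):
            a = max(random.sample(pop, 2), key=sum)
            b = max(random.sample(pop, 2), key=sum)
            child = [x if random.random() < 0.5 else y for x, y in zip(a, b)]
            new.append([1 - g if random.random() < mut else g for g in child])
        pop[:] = new

def island_ga(n_islands=4, pop_size=20, length=30, epochs=10, seed=3):
    random.seed(seed)
    islands = [[[random.randint(0, 1) for _ in range(length)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    best_seen = 0
    for _ in range(epochs):
        for isl in islands:
            evolve(isl, generations=5)
        # Ring migration: each island's best replaces the next island's worst.
        bests = [max(isl, key=sum) for isl in islands]
        for i, isl in enumerate(islands):
            worst = min(range(len(isl)), key=lambda j: sum(isl[j]))
            isl[worst] = list(bests[(i - 1) % n_islands])
        best_seen = max(best_seen, max(sum(ind) for isl in islands for ind in isl))
    return best_seen

best = island_ga()
```

Because the islands evolve independently between migrations, each `evolve` call could run on its own processor, which is the source of the speedup discussed in the paper.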

  3. Multicycle Optimization of Advanced Gas-Cooled Reactor Loading Patterns Using Genetic Algorithms

    International Nuclear Information System (INIS)

    Ziver, A. Kemal; Carter, Jonathan N.; Pain, Christopher C.; Oliveira, Cassiano R.E. de; Goddard, Antony J. H.; Overton, Richard S.

    2003-01-01

    A genetic algorithm (GA)-based optimizer (GAOPT) has been developed for in-core fuel management of advanced gas-cooled reactors (AGRs) at HINKLEY B and HARTLEPOOL, which employ on-load and off-load refueling, respectively. The optimizer has been linked to the reactor analysis code PANTHER for the automated evaluation of loading patterns in a two-dimensional geometry, which is collapsed from the three-dimensional reactor model. GAOPT uses a directed stochastic (Monte Carlo) algorithm to generate initial population members, within predetermined constraints, for use in GAs, which apply the standard genetic operators: selection by tournament, crossover, and mutation. The GAOPT is able to generate and optimize loading patterns for successive reactor cycles (multicycle) within acceptable CPU times even on single-processor systems. The algorithm allows radial shuffling of fuel assemblies in a multicycle refueling optimization, which is constructed to aid long-term core management planning decisions. This paper presents the application of the GA-based optimization to two AGR stations, which apply different in-core management operational rules. Results obtained from the testing of GAOPT are discussed

  4. Sound algorithms

    OpenAIRE

    De Götzen , Amalia; Mion , Luca; Tache , Olivier

    2007-01-01

    International audience; We use the term sound algorithms for the categories of algorithms that deal with digital sound signals. Sound algorithms appeared in the very infancy of computing. They present strong specificities that are the consequence of two dual considerations: the properties of the digital sound signal itself and its uses, and the properties of auditory perception.

  5. Blind and Deaf to Acceptance: The Role of Self-Esteem in Capitalizing on Social Acceptance

    OpenAIRE

    Luerssen, Anna Maud

    2013-01-01

    Across two studies, we evaluated whether people with low self-esteem are less likely to capitalize on, or take full advantage of, their romantic partners' accepting behaviors. We conceptualized capitalization as the tendency to perceive acceptance when it occurs, and to experience positive changes in affect and relationship satisfaction when acceptance is perceived. We found that participants with low self-esteem under-perceived their partners' acceptance, both in daily life and in the labora...

  6. Optimization Solution of Troesch’s and Bratu’s Problems of Ordinary Type Using Novel Continuous Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Zaer Abo-Hammour

    2014-01-01

    Full Text Available A new kind of optimization technique, namely, a continuous genetic algorithm, is presented in this paper for numerically approximating the solutions of Troesch’s and Bratu’s problems. The underlying idea of the method is to convert the two differential problems into discrete versions by replacing each second derivative with an appropriate difference quotient approximation. The new method has the following characteristics. First, it does not resort to advanced mathematical tools; the algorithm is simple to understand and implement and is thus easily accepted in mathematical and physical application fields. Second, the algorithm is global in nature, both in the solutions obtained and in its ability to solve other mathematical and physical problems. Third, the proposed methodology has an implicitly parallel nature, which suits implementation on parallel machines. The algorithm is tested on different versions of Troesch’s and Bratu’s problems. Experimental results show that the proposed algorithm is effective, straightforward, and simple.
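
    The discretization step the abstract describes can be illustrated on Bratu's problem, u'' + λe^u = 0 with u(0) = u(1) = 0: the second derivative is replaced by a central difference quotient, and a candidate solution is scored by its residual. The grid size and λ below are illustrative, and the GA that would minimize this residual is omitted:

```python
import math

# Central-difference discretization of Bratu's problem u'' + lam*exp(u) = 0,
# u(0) = u(1) = 0. A continuous GA would evolve the interior values of u to
# minimize residual(u); lam and the grid size are illustrative assumptions.
N, lam = 9, 1.0            # interior grid points, Bratu parameter
h = 1.0 / (N + 1)

def residual(u):
    """Sum of squared residuals of the central-difference equations."""
    full = [0.0] + list(u) + [0.0]          # boundary conditions
    r = 0.0
    for i in range(1, N + 1):
        d2 = (full[i - 1] - 2 * full[i] + full[i + 1]) / h ** 2
        r += (d2 + lam * math.exp(full[i])) ** 2
    return r

# The flat guess u = 0 leaves residual N*lam**2; an exact solution gives ~0.
print(residual([0.0] * N))
```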

  7. Imaging for dismantlement verification: Information management and analysis algorithms

    International Nuclear Information System (INIS)

    Robinson, S.M.; Jarman, K.D.; Pitts, W.K.; Seifert, A.; Misner, A.C.; Woodring, M.L.; Myjak, M.J.

    2012-01-01

    The level of detail discernible in imaging techniques has generally excluded them from consideration as verification tools in inspection regimes. An image will almost certainly contain highly sensitive information, and storing a comparison image will almost certainly violate a cardinal principle of information barriers: that no sensitive information be stored in the system. To overcome this problem, some features of the image might be reduced to a few parameters suitable for definition as an attribute, which must be non-sensitive to be acceptable in an Information Barrier regime. However, this process must be performed with care. Features like the perimeter, area, and intensity of an object, for example, might reveal sensitive information. Any data-reduction technique must provide sufficient information to discriminate a real object from a spoofed or incorrect one, while avoiding disclosure (or storage) of any sensitive object qualities. Ultimately, algorithms are intended to provide only a yes/no response verifying the presence of features in the image. We discuss the utility of imaging for arms control applications and present three image-based verification algorithms in this context. The algorithms reduce full image information to non-sensitive feature information, in a process that is intended to enable verification while eliminating the possibility of image reconstruction. The underlying images can be highly detailed, since they are dynamically generated behind an information barrier. We consider the use of active (conventional) radiography alone and in tandem with passive (auto) radiography. We study these algorithms in terms of technical performance in image analysis and application to an information barrier scheme.

  8. Delayed Slater determinant update algorithms for high efficiency quantum Monte Carlo

    Science.gov (United States)

    McDaniel, T.; D'Azevedo, E. F.; Li, Y. W.; Wong, K.; Kent, P. R. C.

    2017-11-01

    Within ab initio Quantum Monte Carlo simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunction. Each Monte Carlo step requires finding the determinant of a dense matrix. This is most commonly iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. The overall computational cost is, therefore, formally cubic in the number of electrons or matrix size. To improve the numerical efficiency of this procedure, we propose a novel multiple rank delayed update scheme. This strategy enables probability evaluation with an application of accepted moves to the matrices delayed until after a predetermined number of moves, K. The accepted events are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency via matrix-matrix operations instead of matrix-vector operations. This procedure does not change the underlying Monte Carlo sampling or its statistical efficiency. For calculations on large systems and algorithms such as diffusion Monte Carlo, where the acceptance ratio is high, order of magnitude improvements in the update time can be obtained on both multi-core central processing units and graphical processing units.
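
    A minimal sketch of the conventional rank-1 bookkeeping the abstract builds on: when a proposed move replaces one row of the Slater matrix, the acceptance ratio det(A')/det(A) is read off the current inverse rather than recomputed from scratch. The 2×2 matrices are illustrative; the delayed scheme in the paper would accumulate K accepted updates and apply them en bloc via matrix-matrix products.

```python
# Row-replacement determinant ratio via the stored inverse, the quantity
# evaluated at each Monte Carlo step before a Sherman-Morrison update.
# Hand-coded 2x2 inverses keep the example self-contained.

def det2(A):
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def inv2(A):
    d = det2(A)
    return [[A[1][1] / d, -A[0][1] / d], [-A[1][0] / d, A[0][0] / d]]

A = [[2.0, 1.0], [1.0, 3.0]]
Ainv = inv2(A)
k, new_row = 0, [4.0, 2.0]          # proposed move: replace row k

# Determinant ratio for replacing row k: R = sum_j new_row[j] * Ainv[j][k]
R = sum(new_row[j] * Ainv[j][k] for j in range(2))

A2 = [new_row, A[1]]                # explicit check against determinants
assert abs(R - det2(A2) / det2(A)) < 1e-12
print(R)
```

    The delayed variant keeps the old inverse for K moves, evaluating ratios against it with small corrections, then refreshes the inverse in one high-intensity matrix-matrix operation.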

  9. Adaptive Incremental Genetic Algorithm for Task Scheduling in Cloud Environments

    Directory of Open Access Journals (Sweden)

    Kairong Duan

    2018-05-01

    Full Text Available Cloud computing is a new commercial model that enables customers to acquire large amounts of virtual resources on demand. Resources including hardware and software can be delivered as services and measured by specific usage of storage, processing, bandwidth, etc. In Cloud computing, task scheduling is the process of mapping cloud tasks to Virtual Machines (VMs). When binding tasks to VMs, the scheduling strategy has an important influence on datacenter efficiency and the related energy consumption. Although many traditional scheduling algorithms have been applied on various platforms, they may not work efficiently given the large number of user requests, the variety of computing resources, and the complexity of the Cloud environment. In this paper, we tackle the task scheduling problem, which aims to minimize makespan, with a Genetic Algorithm (GA). We propose an incremental GA with adaptive probabilities of crossover and mutation. The mutation and crossover rates change across generations and also vary between individuals. Large numbers of tasks are randomly generated to simulate various scales of the task scheduling problem in a Cloud environment. Based on the instance types of Amazon EC2, we implemented virtual machines with different computing capacities on CloudSim. We compared the performance of the adaptive incremental GA with that of Standard GA, Min-Min, Max-Min, Simulated Annealing, and the Artificial Bee Colony Algorithm in finding the optimal scheme. Experimental results show that the proposed algorithm achieves feasible solutions with acceptable makespan in less computation time.
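
    The adaptive-rate idea can be sketched as crossover/mutation probabilities that decay with generation and vary per individual (lower mutation for above-average individuals, higher for poor ones). The functional forms and constants below are our assumptions, not the paper's formulas:

```python
# Hedged sketch of adaptive GA rates: generation-dependent and
# fitness-dependent. Constants and functional forms are illustrative.
def adaptive_rates(gen, max_gen, fit, avg_fit, best_fit):
    progress = gen / max_gen
    pc = 0.9 - 0.4 * progress                 # crossover: explore -> exploit
    if fit >= avg_fit and best_fit > avg_fit: # good individuals: protect
        pm = 0.1 * (best_fit - fit) / (best_fit - avg_fit)
    else:                                     # poor individuals: perturb more
        pm = 0.1
    return pc, pm * (1 - 0.5 * progress)      # mutation also decays over time

pc, pm = adaptive_rates(gen=10, max_gen=100, fit=5, avg_fit=4, best_fit=8)
print(pc, pm)
```

    This per-individual scaling (in the spirit of Srinivas and Patnaik's adaptive GA) keeps good schedules intact while still perturbing weak ones.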

  10. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.

  11. How Do Severe Constraints Affect the Search Ability of Multiobjective Evolutionary Algorithms in Water Resources?

    Science.gov (United States)

    Clarkin, T. J.; Kasprzyk, J. R.; Raseman, W. J.; Herman, J. D.

    2015-12-01

    This study contributes a diagnostic assessment of multiobjective evolutionary algorithm (MOEA) search on a set of water resources problem formulations with different configurations of constraints. Unlike constraints in classical optimization modeling, constraints within MOEA simulation-optimization represent limits on acceptable performance that delineate whether solutions within the search problem are feasible. Constraints are relevant because of the emergent pressures on water resources systems: increasing public awareness of their sustainability, coupled with regulatory pressures on water management agencies. In this study, we test several state-of-the-art MOEAs that utilize restricted tournament selection for constraint handling on varying configurations of water resources planning problems. For example, a problem that has no constraints on performance levels will be compared with a problem with several severe constraints, and a problem with constraints that have less severe values on the constraint thresholds. One such problem, Lower Rio Grande Valley (LRGV) portfolio planning, has been solved with a suite of constraints that ensure high reliability, low cost variability, and acceptable performance in a single year severe drought. But to date, it is unclear whether or not the constraints are negatively affecting MOEAs' ability to solve the problem effectively. Two categories of results are explored. The first category uses control maps of algorithm performance to determine if the algorithm's performance is sensitive to user-defined parameters. The second category uses run-time performance metrics to determine the time required for the algorithm to reach sufficient levels of convergence and diversity on the solution sets. 
Our work exploring the effect of constraints will better enable practitioners to define MOEA problem formulations for real-world systems, especially when stakeholders are concerned with achieving fixed levels of performance according to one or
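
    The constraint handling discussed above can be illustrated with the feasibility rule commonly paired with tournament selection: feasible beats infeasible, two infeasible solutions compare on total violation, and two feasible solutions compare on objective value. The (objective, violation) encoding is an assumption for illustration, not the study's formulation:

```python
# Feasibility-rule binary tournament sketch for constrained optimization.
# Each solution is (objective_to_minimize, total_violation >= 0).

def feasibility_tournament(s1, s2):
    f1, v1 = s1
    f2, v2 = s2
    if (v1 > 0) != (v2 > 0):          # exactly one is infeasible
        return s1 if v1 == 0 else s2
    if v1 > 0:                        # both infeasible: least violating wins
        return s1 if v1 <= v2 else s2
    return s1 if f1 <= f2 else s2     # both feasible: better objective wins

print(feasibility_tournament((5.0, 0.0), (1.0, 2.5)))  # feasible solution wins
```

    Under severe constraints most of the population is infeasible, so early search is driven almost entirely by the violation comparison, which is one way constraint severity can reshape MOEA search behavior.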

  12. Algorithm aversion: people erroneously avoid algorithms after seeing them err.

    Science.gov (United States)

    Dietvorst, Berkeley J; Simmons, Joseph P; Massey, Cade

    2015-02-01

    Research shows that evidence-based algorithms more accurately predict the future than do human forecasters. Yet when forecasters are deciding whether to use a human forecaster or a statistical algorithm, they often choose the human forecaster. This phenomenon, which we call algorithm aversion, is costly, and it is important to understand its causes. We show that people are especially averse to algorithmic forecasters after seeing them perform, even when they see them outperform a human forecaster. This is because people more quickly lose confidence in algorithmic than human forecasters after seeing them make the same mistake. In 5 studies, participants either saw an algorithm make forecasts, a human make forecasts, both, or neither. They then decided whether to tie their incentives to the future predictions of the algorithm or the human. Participants who saw the algorithm perform were less confident in it, and less likely to choose it over an inferior human forecaster. This was true even among those who saw the algorithm outperform the human.

  13. 7 CFR 457.132 - Cranberry crop insurance provisions.

    Science.gov (United States)

    2010-01-01

    ... if you are unable to market due to quarantine, boycott, or refusal of any person to accept production... abandon or no longer care for, if you and we agree on the appraised amount of production. Upon such... use the appraised amount of production or defer the claim if you agree to continue to care for the...

  14. The product composition control system at Savannah River: Statistical process control algorithm

    International Nuclear Information System (INIS)

    Brown, K.G.

    1994-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) will be used to immobilize the approximately 130 million liters of high-level nuclear waste currently stored at the site in 51 carbon steel tanks. Waste handling operations separate this waste into highly radioactive insoluble sludge and precipitate and less radioactive water soluble salts. In DWPF, precipitate (PHA) is blended with insoluble sludge and ground glass frit to produce melter feed slurry which is continuously fed to the DWPF melter. The melter produces a molten borosilicate glass which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic repository. Described here is the Product Composition Control System (PCCS) process control algorithm. The PCCS is the amalgam of computer hardware and software intended to ensure that the melt will be processable and that the glass wasteform produced will be acceptable. Within PCCS, the Statistical Process Control (SPC) Algorithm is the means that guides control of the DWPF process. The SPC Algorithm is necessary to control the multivariate DWPF process in the face of uncertainties arising from the process, its feeds, sampling, modeling, and measurement systems. This article describes the functions performed by the SPC Algorithm, characterization of DWPF prior to making product, accounting for prediction uncertainty, accounting for measurement uncertainty, monitoring a SME batch, incorporating process information, and advantages of the algorithm. 9 refs., 6 figs
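
    An illustrative sketch (not the actual PCCS equations) of the kind of check such an algorithm performs: a predicted glass property is declared acceptable only if it clears the property limit with a margin covering both model and measurement uncertainty. The margin formula and all numbers are assumptions:

```python
# Hypothetical acceptability check with an uncertainty margin: the
# prediction must stay inside the limit by k combined standard deviations.
def acceptable(predicted, limit, sigma_model, sigma_meas, k=2.0):
    margin = k * (sigma_model ** 2 + sigma_meas ** 2) ** 0.5
    return predicted + margin <= limit

print(acceptable(10.0, 11.0, 0.2, 0.3))   # well inside the limit
print(acceptable(10.9, 11.0, 0.2, 0.3))   # too close once margin is added
```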

  15. Clinical Implications of TiGRT Algorithm for External Audit in Radiation Oncology

    Directory of Open Access Journals (Sweden)

    Daryoush Shahbazi-Gahrouei

    2017-01-01

    Full Text Available Background: Audits play an important role in a quality assurance program in radiation oncology. Among different algorithms, TiGRT is one of the common application software packages for dose calculation. This study assessed the clinical implications of the TiGRT algorithm by measuring doses and comparing them with the calculated doses delivered to patients for a variety of cases, with and without the presence of inhomogeneities and beam modifiers. Materials and Methods: A nonhomogeneous phantom as quality dose verification phantom, Farmer ionization chambers, and a PC-electrometer (Sun Nuclear, USA) as a reference class electrometer were employed throughout the audit of linear accelerator beams at 6 and 18 MV (Siemens ONCOR Impression Plus, Germany). Seven test cases were performed using a semi CIRS phantom. Results: In homogeneous regions and simple plans for both energies, there was good agreement between the measured and treatment planning system calculated doses. The relative error was found to be between 0.8% and 3%, which is acceptable for an audit, but in nonhomogeneous organs, such as lung, a few errors were observed. In complex treatment plans, when a wedge or shield was placed in the beam path, the error remained within the accepted criteria. In complex beam plans, the difference between measured and calculated doses was found to be 2%–3%. All other differences were between 0.4% and 1%. Conclusions: Good consistency was observed for the same type of energy in the homogeneous and nonhomogeneous phantoms for three-dimensional conformal fields with a wedge, shield, or asymmetry using the TiGRT treatment planning software in the studied center. The results revealed that the national status of TPS calculations and dose delivery for 3D conformal radiotherapy was globally within acceptable standards with no major causes for concern.

  16. Electricity supply between acceptance, acceptability and social compatibility; Energieversorgung zwischen Akzeptanz, Akzeptabilitaet und Sozialvertraeglichkeit

    Energy Technology Data Exchange (ETDEWEB)

    Schubert, Katharina; Koch, Marco K. [Bochum Univ. (Germany). Lehrstuhl Energiesysteme und Energiewirtschaft (LEE)

    2012-11-01

    Acceptance promotion is considered an indispensable premise for the successful realization of an energy concept. The contribution identifies deficiencies of energy policy, including a lack of transparency and the complexity of decision procedures, for instance in the case of the so-called transmission line extension acceleration law, which has caused irritation and anger among the public. The justification of acceptance promotion is questioned in connection with the German nuclear policy reversal following the Fukushima accident. A research program ''public acceptance of large-scale power plants for electricity generation'' is presented. The issues of criteria and limits of acceptability are of central importance for this discussion.

  17. The Texas Medication Algorithm Project (TMAP) schizophrenia algorithms.

    Science.gov (United States)

    Miller, A L; Chiles, J A; Chiles, J K; Crismon, M L; Rush, A J; Shon, S P

    1999-10-01

    In the Texas Medication Algorithm Project (TMAP), detailed guidelines for medication management of schizophrenia and related disorders, bipolar disorders, and major depressive disorders have been developed and implemented. This article describes the algorithms developed for medication treatment of schizophrenia and related disorders. The guidelines recommend a sequence of medications and discuss dosing, duration, and switch-over tactics. They also specify response criteria at each stage of the algorithm for both positive and negative symptoms. The rationale and evidence for each aspect of the algorithms are presented.

  18. Zone of Acceptance Under Performance Measurement: Does Performance Information Affect Employee Acceptance of Management Authority?

    DEFF Research Database (Denmark)

    Nielsen, Poul Aaes; Jacobsen, Christian Bøtcher

    2018-01-01

    Public sector employees have traditionally enjoyed substantial influence and bargaining power in organizational decision making, but few studies have investigated the formation of employee acceptance of management authority. Drawing on the ‘romance of leadership’ perspective, we argue that performance information shapes employee attributions of leader quality and perceptions of a need for change in ways that affect their acceptance of management authority, conceptualized using Simon’s notion of a ‘zone of acceptance.’ We conducted a survey experiment among 1,740 teachers, randomly assigning true performance information about each respondent’s own school. When employees were exposed to signals showing low or high performance, their acceptance of management authority increased, whereas average performance signals reduced employee acceptance of management authority. The findings suggest...

  19. Waste Acceptance System Requirements document (WASRD)

    International Nuclear Information System (INIS)

    1993-01-01

    This Waste Acceptance System Requirements document (WA-SRD) describes the functions to be performed and the technical requirements for a Waste Acceptance System for accepting spent nuclear fuel (SNF) and high-level radioactive waste (HLW) into the Civilian Radioactive Waste Management System (CRWMS). This revision of the WA-SRD addresses the requirements for the acceptance of HLW. This revision has been developed as a top priority document to permit DOE's Office of Environmental Restoration and Waste Management (EM) to commence waste qualification runs at the Savannah River Site's (SRS) Defense Waste Processing Facility (DWPF) in a timely manner. Additionally, this revision of the WA-SRD includes the requirements from the Physical System Requirements -- Accept Waste document for the acceptance of SNF. A subsequent revision will fully address requirements relative to the acceptance of SNF

  20. Comparison of the accuracy of three algorithms in predicting accessory pathways among adult Wolff-Parkinson-White syndrome patients.

    Science.gov (United States)

    Maden, Orhan; Balci, Kevser Gülcihan; Selcuk, Mehmet Timur; Balci, Mustafa Mücahit; Açar, Burak; Unal, Sefa; Kara, Meryem; Selcuk, Hatice

    2015-12-01

    The aim of this study was to investigate the accuracy of three algorithms in predicting accessory pathway locations in adult patients with Wolff-Parkinson-White syndrome in a Turkish population. A total of 207 adult patients with Wolff-Parkinson-White syndrome were retrospectively analyzed. The most preexcited 12-lead electrocardiogram in sinus rhythm was used for analysis. Two investigators blinded to the patient data used three algorithms for prediction of accessory pathway location. Among all locations, 48.5% were left-sided, 44% were right-sided, and 7.5% were located in the midseptum or anteroseptum. When only exact locations were accepted as a match, predictive accuracy was 71.5% for Chiang, 72.4% for d'Avila, and 71.5% for Arruda. The predictive accuracy did not differ between the algorithms (p = 1.000; p = 0.875; p = 0.885, respectively). The best algorithm for prediction of right-sided, left-sided, and anteroseptal or midseptal accessory pathways was that of Arruda. In conclusion, the algorithms were similar in predicting accessory pathway location, and the predictive accuracy was lower than previously reported by their authors. However, according to the accessory pathway site, the algorithm designed by Arruda et al. showed better predictions than the other algorithms, and using this algorithm may provide advantages before a planned ablation.

  1. A multiresolution approach to iterative reconstruction algorithms in X-ray computed tomography.

    Science.gov (United States)

    De Witte, Yoni; Vlassenbroeck, Jelle; Van Hoorebeke, Luc

    2010-09-01

    In computed tomography, the application of iterative reconstruction methods in practical situations is impeded by their high computational demands. Especially in high resolution X-ray computed tomography, where reconstruction volumes contain a high number of volume elements (several gigavoxels), this computational burden has prevented their breakthrough in practice. Besides the large amount of calculations, iterative algorithms require the entire volume to be kept in memory during reconstruction, which quickly becomes cumbersome for large data sets. To overcome this obstacle, we present a novel multiresolution reconstruction, which greatly reduces the required amount of memory without significantly affecting the reconstructed image quality. It is shown that, combined with an efficient implementation on a graphical processing unit, the multiresolution approach enables the application of iterative algorithms in the reconstruction of large volumes at an acceptable speed using only limited resources.

  2. Sparse spectral deconvolution algorithm for noncartesian MR spectroscopic imaging.

    Science.gov (United States)

    Bhave, Sampada; Eslami, Ramin; Jacob, Mathews

    2014-02-01

    To minimize line shape distortions and spectral leakage artifacts in MR spectroscopic imaging (MRSI). A spatially and spectrally regularized non-Cartesian MRSI algorithm that uses the line shape distortion priors, estimated from water reference data, to deconvolve the spectra is introduced. Sparse spectral regularization is used to minimize noise amplification associated with deconvolution. A spiral MRSI sequence that heavily oversamples the central k-space regions is used to acquire the MRSI data. The spatial regularization term uses the spatial supports of brain and extracranial fat regions to recover the metabolite spectra and nuisance signals at two different resolutions. Specifically, the nuisance signals are recovered at the maximum resolution to minimize spectral leakage, while the point spread functions of metabolites are controlled to obtain acceptable signal-to-noise ratio. Comparisons of the algorithm against Tikhonov regularized reconstructions demonstrate considerably reduced line-shape distortions and improved metabolite maps. The proposed sparsity constrained spectral deconvolution scheme is effective in minimizing the line-shape distortions. The dual resolution reconstruction scheme is capable of minimizing spectral leakage artifacts. Copyright © 2013 Wiley Periodicals, Inc.

  3. Multi-objective optimization using genetic algorithms: A tutorial

    International Nuclear Information System (INIS)

    Konak, Abdullah; Coit, David W.; Smith, Alice E.

    2006-01-01

    Multi-objective formulations are realistic models for many complex engineering optimization problems. In many real-life problems, objectives under consideration conflict with each other, and optimizing a particular solution with respect to a single objective can result in unacceptable results with respect to the other objectives. A reasonable solution to a multi-objective problem is to investigate a set of solutions, each of which satisfies the objectives at an acceptable level without being dominated by any other solution. In this paper, an overview and tutorial is presented describing genetic algorithms (GA) developed specifically for problems with multiple objectives. They differ primarily from traditional GA by using specialized fitness functions and introducing methods to promote solution diversity
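
    The tutorial's central concept, a set of solutions not dominated by any other, can be sketched as follows (two minimized objectives; the population values are arbitrary toy data):

```python
# Extract the non-dominated (Pareto) set of a population for two
# objectives, both minimized. MOEA fitness assignment builds on this test.

def dominated(p, q):
    """True if q dominates p: q is no worse in all objectives, better in one."""
    return all(b <= a for a, b in zip(p, q)) and any(b < a for a, b in zip(p, q))

def pareto_front(points):
    return [p for p in points if not any(dominated(p, q) for q in points)]

pop = [(1, 5), (2, 3), (4, 4), (3, 2), (5, 1), (2, 6)]
print(pareto_front(pop))   # (4, 4) and (2, 6) are dominated and drop out
```

    Specialized multi-objective GAs rank and select individuals using this dominance relation instead of a single scalar fitness, then add diversity-preserving mechanisms so the front stays well spread.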

  4. Acceptable noise level

    DEFF Research Database (Denmark)

    Olsen, Steen Østergaard; Nielsen, Lars Holme; Lantz, Johannes

    2012-01-01

    The acceptable noise level (ANL) is used to quantify the amount of background noise that subjects can accept while listening to speech, and is suggested for prediction of individual hearing-aid use. The aim of this study was to assess the repeatability of the ANL measured in normal-hearing subjects using running Danish and non-semantic speech materials as stimuli and modulated speech-spectrum and multi-talker babble noises as competing stimuli.

  5. Acceptable noise level

    DEFF Research Database (Denmark)

    Olsen, Steen Østergaard; Nielsen, Lars Holme; Lantz, Johannes

    2012-01-01

    The acceptable noise level (ANL) is used to quantify the amount of background noise that subjects can accept while listening to speech, and is suggested for prediction of individual hearing-aid use. The aim of this study was to assess the repeatability of the ANL measured in normal-hearing subjects using running Danish and non-semantic speech materials as stimuli and modulated speech-spectrum and multi-talker babble noises as competing stimuli.

  6. Algorithming the Algorithm

    DEFF Research Database (Denmark)

    Mahnke, Martina; Uprichard, Emma

    2014-01-01

    Imagine sailing across the ocean. The sun is shining, vastness all around you. And suddenly [BOOM] you’ve hit an invisible wall. Welcome to the Truman Show! Ever since Eli Pariser published his thoughts on a potential filter bubble, this movie scenario seems to have become reality, just with slight changes: it’s not the ocean, it’s the internet we’re talking about, and it’s not a TV show producer, but algorithms that constitute a sort of invisible wall. Building on this assumption, most research is trying to ‘tame the algorithmic tiger’. While this is a valuable and often inspiring approach, we...

  7. On risks and acceptability

    International Nuclear Information System (INIS)

    Watson, S.R.

    1981-01-01

    A very attractive notion is that it should be possible not only to determine how much risk is associated with any particular activity, but also to determine if that risk is acceptable. Stated boldly this seems an entirely unobjectionable and indeed a very acceptable notion. There is, however, underlying this idea, a mistaken view of risk which we might refer to as the ''phlogiston'' theory of risk. In this paper, presented at the SRP meeting on Ethical and Legal Aspects of Radiological Protection, the phlogiston theory of risk is described; secondly, it will be argued that it is too simple a theory to be realistic or useful; and thirdly, the management of risk will be placed in a wider decision framework. Acceptability, it will be argued is highly dependent on context, and it is not possible, therefore, to lay down generally applicable notions of acceptability. (author)

  8. Pseudo-deterministic Algorithms

    OpenAIRE

    Goldwasser , Shafi

    2012-01-01

    International audience; In this talk we describe a new type of probabilistic algorithm which we call Bellagio Algorithms: a randomized algorithm which is guaranteed to run in expected polynomial time, and to produce a correct and unique solution with high probability. These algorithms are pseudo-deterministic: they can not be distinguished from deterministic algorithms in polynomial time by a probabilistic polynomial time observer with black box access to the algorithm. We show a necessary an...

  9. The Algorithmic Imaginary

    DEFF Research Database (Denmark)

    Bucher, Taina

    2017-01-01

    How do perceptions of algorithms affect people's use of these platforms, if at all? To help answer these questions, this article examines people's personal stories about the Facebook algorithm through tweets and interviews with 25 ordinary users. To understand the spaces where people and algorithms meet, this article develops the notion of the algorithmic imaginary. It is argued that the algorithmic imaginary – ways of thinking about what algorithms are, what they should be and how they function – is not just productive of different moods and sensations but plays a generative role in moulding the Facebook algorithm itself...

  10. A semi-learning algorithm for noise rejection: an fNIRS study on ADHD children

    Science.gov (United States)

    Sutoko, Stephanie; Funane, Tsukasa; Katura, Takusige; Sato, Hiroki; Kiguchi, Masashi; Maki, Atsushi; Monden, Yukifumi; Nagashima, Masako; Yamagata, Takanori; Dan, Ippeita

    2017-02-01

    In pediatric studies, the quality of functional near infrared spectroscopy (fNIRS) signals is often reduced by motion artifacts. These artifacts can mislead brain functionality analysis, causing false discoveries. While noise correction methods and their performance have been investigated, these methods require several parameter assumptions that apparently result in noise overfitting. In contrast, the rejection of noisy signals is a preferable method because it maintains the originality of the signal waveform. Here, we describe a semi-learning algorithm to detect and eliminate noisy signals. The algorithm dynamically adjusts noise detection according to predetermined noise criteria: spikes, unusual activation values (averaged signal amplitudes within the brain activation period), and high activation variances (among trials). The criteria are organized sequentially in the algorithm, which assesses signals against each criterion in order. By initially setting an acceptable rejection rate, particular criteria causing excessive data rejection are neglected, whereas the others practically eliminate noise. fNIRS data measured during an attention response paradigm (oddball task) in children with attention deficit/hyperactivity disorder (ADHD) were utilized to evaluate and optimize the algorithm's performance. This algorithm successfully substituted for the visual noise identification performed in previous studies and consistently found significantly lower activation of the right prefrontal and parietal cortices in ADHD patients than in typically developing children. Thus, we conclude that the semi-learning algorithm confers more objective and standardized judgment for noise rejection and presents a promising alternative to visual noise rejection.
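
    The sequential, rate-limited rejection described above can be sketched as follows. The criteria thresholds, data layout, and the 30% acceptable rejection rate are illustrative assumptions, not the study's values:

```python
# Semi-learning rejection sketch: criteria are applied in a fixed order,
# and any criterion that would by itself reject more than an acceptable
# fraction of channels is neglected. All thresholds are illustrative.

def reject_noisy(signals, criteria, max_reject_rate=0.3):
    kept = list(signals)
    for name, is_noisy in criteria:
        flagged = [s for s in kept if is_noisy(s)]
        if len(flagged) / max(len(kept), 1) > max_reject_rate:
            continue                   # criterion too aggressive: neglect it
        kept = [s for s in kept if s not in flagged]
    return kept

# Toy "channels": (peak_jump, mean_activation, trial_variance) tuples
chans = [(0.1, 1.0, 0.2), (5.0, 1.1, 0.3), (0.2, 9.0, 0.1), (0.1, 1.2, 0.2)]
criteria = [
    ("spike", lambda s: s[0] > 2.0),
    ("activation", lambda s: abs(s[1]) > 5.0),
    ("variance", lambda s: s[2] > 1.0),
]
print(reject_noisy(chans, criteria))
```

    In this toy run the spike criterion removes one channel, while the activation criterion would reject a third of the remaining data, exceeds the acceptable rate, and is therefore skipped, which is exactly the trade-off the algorithm is designed to make.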

  11. Specification and acceptance testing of radiotherapy treatment planning systems

    International Nuclear Information System (INIS)

    2007-04-01

    Quality assurance (QA) in the radiation therapy treatment planning process is essential to ensure accurate dose delivery to the patient and to minimize the possibility of accidental exposure. Computerized radiotherapy treatment planning systems (RTPSs) are now widely available in industrialized and developing countries, and it is especially important to support hospitals in Member States in developing procedures for acceptance testing, commissioning and QA of their RTPSs. Responding to these needs, a group of experts developed an IAEA publication with such recommendations, which was published in 2004 as IAEA Technical Reports Series No. 430. This report provides a general framework and describes a large number of tests and procedures that should be considered by the users of new RTPSs. However, small hospitals with limited resources, or large hospitals with high patient load and limited staff, are not always able to perform complete characterization, validation and software testing of the algorithms used in RTPSs. Therefore, the IAEA proposed more specific guidelines that provide step-by-step recommendations for users at hospitals and cancer centres on how to implement acceptance and commissioning procedures for newly purchased RTPSs. The current publication was developed in the framework of the Coordinated Research Project on Development of Procedures for Quality Assurance for Dosimetry Calculations in Radiotherapy and uses the International Electrotechnical Commission (IEC) standard IEC 62083, Requirements for the Safety of Radiotherapy Treatment Planning Systems, as its basis. The report addresses the procedures for specification and acceptance testing of RTPSs to be used by both manufacturers and users at the hospitals. Recommendations are provided for specific tests to be performed at the manufacturing facility, known as type tests, and for acceptance tests to be performed at the hospital, known as site tests.
The purpose of acceptance testing is to demonstrate to the user that the delivered RTPS meets its specifications.

  12. An Adaptive Cultural Algorithm with Improved Quantum-behaved Particle Swarm Optimization for Sonar Image Detection.

    Science.gov (United States)

    Wang, Xingmei; Hao, Wenqian; Li, Qiming

    2017-12-18

    This paper proposes an adaptive cultural algorithm with improved quantum-behaved particle swarm optimization (ACA-IQPSO) to detect underwater sonar images. In the population space, to improve the searching ability of particles, the iteration count and the fitness values of particles are used as factors to adaptively adjust the contraction-expansion coefficient of the quantum-behaved particle swarm optimization (QPSO) algorithm. The improved quantum-behaved particle swarm optimization algorithm (IQPSO) lets particles adjust their behaviour according to their quality. In the belief space, a new update strategy, based on the update strategy of the shuffled frog leaping algorithm (SFLA), is adopted to update cultural individuals. Moreover, to enhance the utilization of information in the population space and belief space, the accept and influence functions are redesigned in the new communication protocol. The experimental results show that ACA-IQPSO obtains good clustering centres from the grey-level distribution of underwater sonar images and accurately completes underwater object detection. Compared with other algorithms, the proposed ACA-IQPSO has good effectiveness, excellent adaptability, a powerful searching ability and high convergence efficiency. The experimental results on benchmark functions further demonstrate that the proposed ACA-IQPSO has better searching ability, convergence efficiency and stability.
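    A minimal QPSO with an adaptively adjusted contraction-expansion coefficient can be sketched as below. The paper's exact adaptation rule is not reproduced; the iteration- and fitness-rank-based form of `beta` here is an assumption standing in for it:

```python
import numpy as np

def qpso(f, dim=2, n=20, iters=200, seed=1):
    """Minimal quantum-behaved PSO minimizing f. The contraction-expansion
    coefficient beta shrinks with the iteration count and, as a stand-in for
    fitness-based adaptation, grows with a particle's (worse) fitness rank."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n, dim))
    pbest = x.copy()
    pval = np.array([f(p) for p in pbest])
    g = pbest[pval.argmin()].copy()
    for t in range(iters):
        mbest = pbest.mean(axis=0)                       # mean of personal bests
        ranks = pval.argsort().argsort() / (n - 1)       # 0 = best particle
        for i in range(n):
            beta = (1.0 - 0.5 * t / iters) * (0.5 + 0.5 * ranks[i])
            phi = rng.uniform(size=dim)
            p = phi * pbest[i] + (1 - phi) * g           # local attractor
            u = rng.uniform(size=dim)
            sign = rng.choice([-1.0, 1.0], size=dim)
            x[i] = p + sign * beta * np.abs(mbest - x[i]) * np.log(1.0 / np.maximum(u, 1e-12))
            v = f(x[i])
            if v < pval[i]:
                pval[i], pbest[i] = v, x[i].copy()
        g = pbest[pval.argmin()].copy()
    return g, pval.min()

sphere = lambda z: float((z ** 2).sum())
best, val = qpso(sphere)
```

On the sphere function the swarm converges close to the origin, illustrating the update dynamics; clustering of sonar grey levels would replace `f` with a clustering objective.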

  13. Operations Acceptance Management

    OpenAIRE

    Suchá, Ivana

    2010-01-01

    This paper examines the process of Operations Acceptance Management, whose main task is to control Operations Acceptance Tests (OAT). In the first part the author focuses on the theoretical ground for the problem in the context of ITSM best practices framework ITIL. Benefits, process pitfalls and possibilities for automation are discussed in this part. The second part contains a case study of DHL IT Services (Prague), where a solution optimizing the overall workflow was implemented using simp...

  14. Gerontechnology acceptance by elderly Hong Kong Chinese: a senior technology acceptance model (STAM).

    Science.gov (United States)

    Chen, Ke; Chan, Alan Hoi Shou

    2014-01-01

    The purpose of this study was to develop and test a senior technology acceptance model (STAM) aimed at understanding the acceptance of gerontechnology by older Hong Kong Chinese people. The proposed STAM extended previous technology acceptance models and theories by adding age-related health and ability characteristics of older people. The proposed STAM was empirically tested using a cross-sectional questionnaire survey with a sample of 1012 seniors aged 55 and over in Hong Kong. The result showed that STAM was strongly supported and could explain 68% of the variance in the use of gerontechnology. For older Hong Kong Chinese, individual attributes, which include age, gender, education, gerontechnology self-efficacy and anxiety, and health and ability characteristics, as well as facilitating conditions explicitly and directly affected technology acceptance. These were better predictors of gerontechnology usage behaviour (UB) than the conventionally used attitudinal factors (usefulness and ease of use).

  15. An Adaptive Filtering Algorithm Based on Genetic Algorithm-Backpropagation Network

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2013-01-01

    Full Text Available A new image filtering algorithm is proposed. The GA-BPN algorithm uses a genetic algorithm (GA) to determine the weights of a backpropagation neural network (BPN), giving it better global search characteristics than traditional optimization algorithms. In this paper, we apply GA-BPN to image noise filtering. First, training samples are used to train the GA-BPN as a noise detector. Then, the trained GA-BPN recognizes noise pixels in the target image. Finally, an adaptive weighted average algorithm recovers the noise pixels identified by the GA-BPN. Experimental data show that this algorithm outperforms other filters.
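    The final recovery step can be illustrated with a small sketch. The inverse-distance weighting is an assumption, since the abstract does not specify the exact weights of the adaptive weighted average:

```python
import numpy as np

def recover_noise_pixels(img, noise_mask):
    """Replace each flagged pixel with a weighted average of its
    non-noisy 8-neighbours (closer neighbours weighted more)."""
    out = img.astype(float).copy()
    h, w = img.shape
    offsets = [(-1,-1), (-1,0), (-1,1), (0,-1), (0,1), (1,-1), (1,0), (1,1)]
    for y, x in zip(*np.nonzero(noise_mask)):
        vals, wts = [], []
        for dy, dx in offsets:
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and not noise_mask[ny, nx]:
                vals.append(out[ny, nx])
                wts.append(1.0 / np.hypot(dy, dx))   # inverse-distance weight
        if wts:
            out[y, x] = np.average(vals, weights=wts)
    return out

img = np.full((5, 5), 10.0)
img[2, 2] = 255.0                         # one impulse-noise pixel
mask = np.zeros((5, 5), bool)
mask[2, 2] = True                          # as if flagged by the trained detector
clean = recover_noise_pixels(img, mask)
```

Only pixels flagged by the detector are altered, which is what lets this scheme preserve uncorrupted image content.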

  16. Nature-inspired optimization algorithms

    CERN Document Server

    Yang, Xin-She

    2014-01-01

    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  17. Public acceptance of small reactors

    International Nuclear Information System (INIS)

    McDougall, D.S.

    1997-01-01

    The success of any nuclear program requires acceptance by the local public and by all levels of government involved in the decision to initiate a reactor program. Public acceptance of a nuclear energy source is a major challenge in successfully initiating a small reactor program. In AECL's experience, public acceptance will not be obtained until the public is convinced that the specific nuclear program is needed, safe, economic, and of environmental benefit to the community. The term 'public acceptance' is misleading: the objective of the program is a fully informed public. The program proponent cannot force public acceptance, which is beyond his control. He can, however, ensure that the public is informed. Once information has begun to flow to the public by various means, as explained later, the proponent is responsible for ensuring that the information provided by him and by others is accurate. Most importantly, and perhaps most difficult to accomplish, the proponent must develop a consultative process that allows the proponent and the public to agree on actions that are acceptable to both the proponent and the community.

  18. Condition monitoring of face milling tool using K-star algorithm and histogram features of vibration signal

    Directory of Open Access Journals (Sweden)

    C.K. Madhusudana

    2016-09-01

    Full Text Available This paper deals with fault diagnosis of a face milling tool using a machine learning approach based on histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and various fault conditions are acquired during machining of steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features from all extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is used to classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm with histogram features provided a classification accuracy in the range of 94% to 96%, which is acceptable for fault diagnosis.
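    Histogram feature extraction of the kind described can be sketched as follows. The bin count and the synthetic "healthy"/"faulty" signals are illustrative assumptions (K-star itself is an entropy-based instance learner available in Weka and is not reimplemented here):

```python
import numpy as np

def histogram_features(signal, n_bins=10):
    """Normalized bin counts of the signal amplitudes: one common
    definition of histogram features for vibration signals."""
    counts, _ = np.histogram(signal, bins=n_bins)
    return counts / counts.sum()

rng = np.random.default_rng(0)
healthy = rng.normal(0, 1.0, 2048)   # hypothetical healthy-tool vibration
faulty = rng.normal(0, 3.0, 2048)    # hypothetical fault: larger amplitude spread
f_healthy = histogram_features(healthy)
f_faulty = histogram_features(faulty)
```

Each signal is reduced to a fixed-length feature vector, which a feature selector (the decision tree) and a classifier (K-star) can then consume.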

  19. Convex hull ranking algorithm for multi-objective evolutionary algorithms

    NARCIS (Netherlands)

    Davoodi Monfrared, M.; Mohades, A.; Rezaei, J.

    2012-01-01

    Due to many applications of multi-objective evolutionary algorithms in real world optimization problems, several studies have been done to improve these algorithms in recent years. Since most multi-objective evolutionary algorithms are based on the non-dominated principle, and their complexity

  20. Total algorithms

    NARCIS (Netherlands)

    Tel, G.

    We define the notion of total algorithms for networks of processes. A total algorithm enforces that a "decision" is taken by a subset of the processes, and that participation of all processes is required to reach this decision. Total algorithms are an important building block in the design of

  1. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; de Berg, M.T.; Bouts, Q.W.; ten Brink, Alex P.; Buchin, K.A.; Westenberg, M.A.

    2015-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  2. Progressive geometric algorithms

    NARCIS (Netherlands)

    Alewijnse, S.P.A.; Bagautdinov, T.M.; Berg, de M.T.; Bouts, Q.W.; Brink, ten A.P.; Buchin, K.; Westenberg, M.A.

    2014-01-01

    Progressive algorithms are algorithms that, on the way to computing a complete solution to the problem at hand, output intermediate solutions that approximate the complete solution increasingly well. We present a framework for analyzing such algorithms, and develop efficient progressive algorithms

  3. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

    The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and searching precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses the information to help artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.

  4. Two-dimensional pencil beam scaling: an improved proton dose algorithm for heterogeneous media

    International Nuclear Information System (INIS)

    Szymanowski, Hanitra; Oelfke, Uwe

    2002-01-01

    New dose delivery techniques with proton beams, such as beam spot scanning or raster scanning, require fast and accurate dose algorithms which can be applied for treatment plan optimization in clinically acceptable timescales. The clinically required accuracy is particularly difficult to achieve for the irradiation of complex, heterogeneous regions of the patient's anatomy. Currently applied fast pencil beam dose calculations based on the standard inhomogeneity correction of pathlength scaling often cannot provide the accuracy required for clinically acceptable dose distributions. This could be achieved with sophisticated Monte Carlo simulations which are still unacceptably time consuming for use as dose engines in optimization calculations. We therefore present a new algorithm for proton dose calculations which aims to resolve the inherent problem between calculation speed and required clinical accuracy. First, a detailed derivation of the new concept, which is based on an additional scaling of the lateral proton fluence is provided. Then, the newly devised two-dimensional (2D) scaling method is tested for various geometries of different phantom materials. These include standard biological tissues such as bone, muscle and fat as well as air. A detailed comparison of the new 2D pencil beam scaling with the current standard pencil beam approach and Monte Carlo simulations, performed with GEANT, is presented. It was found that the new concept proposed allows calculation of absorbed dose with an accuracy almost equal to that achievable with Monte Carlo simulations while requiring only modestly increased calculation times in comparison to the standard pencil beam approach. It is believed that this new proton dose algorithm has the potential to significantly improve the treatment planning outcome for many clinical cases encountered in highly conformal proton therapy. (author)
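    The standard inhomogeneity correction that the paper improves upon, one-dimensional pathlength scaling to water-equivalent depth, can be sketched as follows; the slab geometry and the relative stopping power of bone are illustrative values:

```python
import numpy as np

def water_equivalent_depth(depths_cm, rel_stopping_power):
    """Standard 1D pathlength scaling: cumulative water-equivalent depth
    along the beam axis, given relative (to water) stopping powers per slab.
    The paper's 2D method additionally scales the lateral proton fluence."""
    dz = np.diff(depths_cm, prepend=0.0)         # slab thicknesses
    return np.cumsum(dz * rel_stopping_power)

# hypothetical slab phantom: 2 cm water, 2 cm bone (rel. SP ~1.6), 2 cm water
z = np.array([2.0, 4.0, 6.0])                    # distal edge of each slab, cm
rho = np.array([1.0, 1.6, 1.0])
wed = water_equivalent_depth(z, rho)             # water-equivalent depths, cm
```

The depth-dose curve is then evaluated at `wed` instead of geometric depth; the cited inaccuracy of this approach in complex heterogeneities is what motivates the additional lateral (2D) scaling.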

  5. A filtered backprojection algorithm with characteristics of the iterative landweber algorithm

    OpenAIRE

    L. Zeng, Gengsheng

    2012-01-01

    Purpose: In order to eventually develop an analytical algorithm with noise characteristics of an iterative algorithm, this technical note develops a window function for the filtered backprojection (FBP) algorithm in tomography that behaves as an iterative Landweber algorithm.
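    The kind of equivalence exploited here can be checked numerically on a small linear system: starting from zero, k Landweber iterations damp each singular component of the least-squares solution by the filter factor 1 − (1 − ασ²)^k, which is the sort of spectral window the note builds into the FBP filter. This is the standard Landweber filter-factor result, not the paper's specific window function:

```python
import numpy as np

# Landweber iteration x_{k+1} = x_k + alpha * A^T (b - A x_k), x_0 = 0
A = np.array([[3.0, 0.0],
              [0.0, 1.0]])
b = np.array([6.0, 2.0])
alpha, k = 0.05, 40                      # step size must satisfy alpha < 2/sigma_max^2
x = np.zeros(2)
for _ in range(k):
    x = x + alpha * A.T @ (b - A @ x)

# predicted damping of each singular component: f_i = 1 - (1 - alpha*s_i^2)^k
U, s, Vt = np.linalg.svd(A)
f = 1.0 - (1.0 - alpha * s**2) ** k
x_pred = Vt.T @ (f * (U.T @ b) / s)
```

The iterate matches the filtered SVD solution exactly, so designing an FBP window with these factors reproduces the noise behaviour of the iteration in a single analytical pass.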

  6. 7 CFR 457.122 - Walnut crop insurance provisions.

    Science.gov (United States)

    2010-01-01

    ... quarantine, boycott, or refusal of any person to accept production. 10. Duties in the Event of Damage or Loss... production on insured acreage that you intend to abandon or no longer care for, if you and we agree on the... do not agree with our appraisal, we may defer the claim only if you agree to continue to care for the...

  7. 7 CFR 457.121 - Arizona-California citrus crop insurance provisions.

    Science.gov (United States)

    2010-01-01

    ... quarantine, boycott, or refusal of any person to accept production. 10. Duties in the Event of Damage or Loss... insured acreage that you intend to abandon or no longer care for, if you and we agree on the appraised... agree with our appraisal, we may defer the claim only if you agree to continue to care for the crop. We...

  8. 42 CFR 137.138 - Once the Indian Tribe's final offer has been accepted or deemed accepted by operation of law...

    Science.gov (United States)

    2010-10-01

    ... accepted or deemed accepted by operation of law, what is the next step? 137.138 Section 137.138 Public... final offer has been accepted or deemed accepted by operation of law, what is the next step? After the... the acceptance or the deemed acceptance. Rejection of Final Offers ...

  9. A Flexible VHDL Floating Point Module for Control Algorithm Implementation in Space Applications

    Science.gov (United States)

    Padierna, A.; Nicoleau, C.; Sanchez, J.; Hidalgo, I.; Elvira, S.

    2012-08-01

    The implementation of control loops for space applications is an area with great potential. However, the characteristics of these systems, such as their wide dynamic range of numeric values, make fixed-point algorithms inadequate. Since the generic chips available for processing floating point data are, in general, not qualified to operate in space environments, and using a floating point IP module in a space-qualified FPGA/ASIC is not viable due to the low number of logic cells available in such devices, a viable alternative is needed. For these reasons, this paper presents a VHDL Floating Point Module. The proposal allows the design and execution of floating point algorithms with acceptable occupancy for implementation in FPGAs/ASICs qualified for space environments.

  10. Probabilistic relationships in acceptable risk studies

    International Nuclear Information System (INIS)

    Benjamin, J.R.

    1977-01-01

    Acceptable risk studies involve uncertainties in future events; consequences and associated values, the acceptability levels, and the future decision environment. Probabilistic procedures afford the basic analytical tool to study the influence of each of these parameters on the acceptable risk decision, including their interrelationships, and combinations. A series of examples are presented in the paper in increasing complexity to illustrate the principles involved and to quantify the relationships to the acceptable risk decision. The basic objective of such studies is to broaden the scientific basis of acceptable risk decision making. It is shown that rationality and consistency in decision making is facilitated by such studies and that rather simple relationships exist in many situations of interest. The variation in criteria associated with an increase in the state of knowledge or change in the level of acceptability is also discussed. (Auth.)

  11. Probabilistic relationships in acceptable risk studies

    International Nuclear Information System (INIS)

    Benjamin, J.R.

    1977-01-01

    Acceptable risk studies involve uncertainties in future events: consequences and associated values, the acceptability levels, and the future decision environment. Probabilistic procedures afford the basic analytical tool to study the influence of each of these parameters on the acceptable risk decision, including their interrelationships, and combinations. A series of examples are presented in the paper in increasing complexity to illustrate the principles involved and to quantify the relationships to the acceptable risk decision. The basic objective of such studies is to broaden the scientific basis of acceptable risk decision making. It is shown that rationality and consistency in decision making is facilitated by such studies and that rather simple relationships exist in many situations of interest. The variation in criteria associated with an increase in the state of knowledge or change in the level of acceptability is also discussed

  12. Technology, Demographic Characteristics and E-Learning Acceptance: A Conceptual Model Based on Extended Technology Acceptance Model

    Science.gov (United States)

    Tarhini, Ali; Elyas, Tariq; Akour, Mohammad Ali; Al-Salti, Zahran

    2016-01-01

    The main aim of this paper is to develop an amalgamated conceptual model of technology acceptance that explains how individual, social, cultural and organizational factors affect the students' acceptance and usage behaviour of the Web-based learning systems. More specifically, the proposed model extends the Technology Acceptance Model (TAM) to…

  13. Investigating Students' Acceptance of a Statistics Learning Platform Using Technology Acceptance Model

    Science.gov (United States)

    Song, Yanjie; Kong, Siu-Cheung

    2017-01-01

    The study aims at investigating university students' acceptance of a statistics learning platform to support the learning of statistics in a blended learning context. Three kinds of digital resources, which are simulations, online videos, and online quizzes, were provided on the platform. Premised on the technology acceptance model, we adopted a…

  14. American acceptance of nuclear power

    International Nuclear Information System (INIS)

    Barrett, W.

    1980-01-01

    The characteristic adventurous spirit that built American technology will eventually lead to American acceptance of nuclear power unless an overpowering loss of nerve causes us to reject both nuclear technology and world leadership. The acceptance of new technology by society has always been accompanied by activist opposition to industrialization. To resolve the debate between environmental and exploitive extremists, we must accept with humility the basic premise that human accomplishment is a finite part of nature.

  15. Tacit acceptance of the succession

    Directory of Open Access Journals (Sweden)

    Ioana NICOLAE

    2012-01-01

    Full Text Available This paper examines some essential and contradictory aspects of the tacit acceptance of succession, in terms of the distinction between acts that imply tacit acceptance of a succession and acts that would not justify such a conclusion. The documents expressly indicated by the legislator as implying tacit acceptance, as well as those which do not, are presented, and their most important legal effects are examined and discussed.

  16. Approaches to acceptable risk

    International Nuclear Information System (INIS)

    Whipple, C.

    1997-01-01

    Several alternative approaches to address the question "How safe is safe enough?" are reviewed and an attempt is made to apply the reasoning behind these approaches to the issue of acceptability of radiation exposures received in space. The approaches to the issue of the acceptability of technological risk described here are primarily analytical, and are drawn from examples in the management of environmental health risks. These include risk-based approaches, in which specific quantitative risk targets determine the acceptability of an activity, and cost-benefit and decision analysis, which generally focus on the estimation and evaluation of risks, benefits and costs, in a framework that balances these factors against each other. These analytical methods tend by their quantitative nature to emphasize the magnitude of risks, costs and alternatives, and to downplay other factors, especially those that are not easily expressed in quantitative terms, that affect acceptance or rejection of risk. Such other factors include the issues of risk perceptions and how and by whom risk decisions are made.

  17. Reproducible cancer biomarker discovery in SELDI-TOF MS using different pre-processing algorithms.

    Directory of Open Access Journals (Sweden)

    Jinfeng Zou

    Full Text Available BACKGROUND: There has been much interest in differentiating diseased and normal samples using biomarkers derived from mass spectrometry (MS studies. However, biomarker identification for specific diseases has been hindered by irreproducibility. Specifically, a peak profile extracted from a dataset for biomarker identification depends on a data pre-processing algorithm. Until now, no widely accepted agreement has been reached. RESULTS: In this paper, we investigated the consistency of biomarker identification using differentially expressed (DE peaks from peak profiles produced by three widely used average spectrum-dependent pre-processing algorithms based on SELDI-TOF MS data for prostate and breast cancers. Our results revealed two important factors that affect the consistency of DE peak identification using different algorithms. One factor is that some DE peaks selected from one peak profile were not detected as peaks in other profiles, and the second factor is that the statistical power of identifying DE peaks in large peak profiles with many peaks may be low due to the large scale of the tests and small number of samples. Furthermore, we demonstrated that the DE peak detection power in large profiles could be improved by the stratified false discovery rate (FDR control approach and that the reproducibility of DE peak detection could thereby be increased. CONCLUSIONS: Comparing and evaluating pre-processing algorithms in terms of reproducibility can elucidate the relationship among different algorithms and also help in selecting a pre-processing algorithm. The DE peaks selected from small peak profiles with few peaks for a dataset tend to be reproducibly detected in large peak profiles, which suggests that a suitable pre-processing algorithm should be able to produce peaks sufficient for identifying useful and reproducible biomarkers.
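    The stratified FDR control mentioned in the results can be sketched as the Benjamini-Hochberg step-up procedure applied separately within each stratum; the strata assignment and q-level below are illustrative:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    """Standard BH step-up procedure; returns a boolean discovery mask."""
    p = np.asarray(pvals)
    order = np.argsort(p)
    m = len(p)
    thresh = q * np.arange(1, m + 1) / m
    passed = p[order] <= thresh
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    mask = np.zeros(m, bool)
    mask[order[:k]] = True
    return mask

def stratified_fdr(pvals, strata, q=0.05):
    """Apply BH within each stratum (e.g. bins of peak intensity), so that
    discoveries in a sparse stratum are not swamped by a large profile."""
    pvals, strata = np.asarray(pvals), np.asarray(strata)
    mask = np.zeros(len(pvals), bool)
    for s in np.unique(strata):
        idx = np.nonzero(strata == s)[0]
        mask[idx] = benjamini_hochberg(pvals[idx], q)
    return mask

# toy example: stratum 0 holds two strong signals, stratum 1 only nulls
p = np.array([0.001, 0.002, 0.8, 0.9, 0.7, 0.6, 0.5, 0.4])
strata = np.array([0, 0, 1, 1, 1, 1, 1, 1])
hits = stratified_fdr(p, strata, q=0.05)
```

Because each stratum is tested at its own scale, the effective number of tests facing a true DE peak is smaller, which is the power improvement the authors report for large peak profiles.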

  18. Image processing algorithm design and implementation for real-time autonomous inspection of mixed waste

    International Nuclear Information System (INIS)

    Schalkoff, R.J.; Shaaban, K.M.; Carver, A.E.

    1996-01-01

    The ARIES #1 (Autonomous Robotic Inspection Experimental System) vision system is used to acquire drum surface images under controlled conditions and subsequently perform autonomous visual inspection leading to a classification as 'acceptable' or 'suspect'. Specific topics described include vision system design methodology, algorithmic structure, hardware processing structure, and image acquisition hardware. Most of these capabilities were demonstrated at the ARIES Phase II Demo held on Nov. 30, 1995. Finally, Phase III efforts are briefly addressed

  19. UAS Air Traffic Controller Acceptability Study. 2; Evaluating Detect and Avoid Technology and Communication Delays in Simulation

    Science.gov (United States)

    Comstock, James R., Jr.; Ghatas, Rania W.; Consiglio, Maria C.; Chamberlain, James P.; Hoffler, Keith D.

    2015-01-01

    This study evaluated the effects of communications delays and winds on air traffic controller ratings of acceptability of horizontal miss distances (HMDs) for encounters between Unmanned Aircraft Systems (UAS) and manned aircraft in a simulation of the Dallas-Ft. Worth (DFW) airspace. Fourteen encounters per hour were staged in the presence of moderate background traffic. Seven recently retired controllers with experience at DFW served as subjects. Guidance provided to the UAS pilots for maintaining a given HMD was provided by information from Detect and Avoid (DAA) self-separation algorithms (Stratway+) displayed on the Multi-Aircraft Control System. This guidance consisted of amber "bands" on the heading scale of the UAS navigation display indicating headings that would result in a loss of well clear between the UAS and nearby traffic. Winds tested were successfully handled by the DAA algorithms and did not affect the controller acceptability ratings of the HMDs. Voice communications delays for the UAS were also tested and included one-way delay times of 0, 400, 1200, and 1800 msec. For longer communications delays, there were changes in strategy and communications flow that were observed and reported by the controllers. The aim of this work is to provide useful information for guiding future rules and regulations applicable to flying UAS in the NAS. Information from this study will also be of value to the Radio Technical Commission for Aeronautics (RTCA) Special Committee 228 - Minimum Performance Standards for UAS.

  20. Effectiveness and cost-effectiveness of a cardiovascular risk prediction algorithm for people with severe mental illness (PRIMROSE).

    Science.gov (United States)

    Zomer, Ella; Osborn, David; Nazareth, Irwin; Blackburn, Ruth; Burton, Alexandra; Hardoon, Sarah; Holt, Richard Ian Gregory; King, Michael; Marston, Louise; Morris, Stephen; Omar, Rumana; Petersen, Irene; Walters, Kate; Hunter, Rachael Maree

    2017-09-05

    To determine the cost-effectiveness of two bespoke severe mental illness (SMI)-specific risk algorithms compared with standard risk algorithms for primary cardiovascular disease (CVD) prevention in those with SMI. Primary care setting in the UK. The analysis was from the National Health Service perspective. 1000 individuals with SMI from The Health Improvement Network Database, aged 30-74 years and without existing CVD, populated the model. Four cardiovascular risk algorithms were assessed: (1) general population lipid, (2) general population body mass index (BMI), (3) SMI-specific lipid and (4) SMI-specific BMI, compared against no algorithm. At baseline, each cardiovascular risk algorithm was applied and those considered high risk (>10%) were assumed to be prescribed statin therapy while others received usual care. Quality-adjusted life years (QALYs) and costs were accrued for each algorithm, including no algorithm, and cost-effectiveness was calculated using the net monetary benefit (NMB) approach. Deterministic and probabilistic sensitivity analyses were performed to test assumptions made and uncertainty around parameter estimates. The SMI-specific BMI algorithm had the highest NMB resulting in 15 additional QALYs and a cost saving of approximately £53 000 per 1000 patients with SMI over 10 years, followed by the general population lipid algorithm (13 additional QALYs and a cost saving of £46 000). The general population lipid and SMI-specific BMI algorithms performed equally well. The ease and acceptability of use of an SMI-specific BMI algorithm (blood tests not required) makes it an attractive algorithm to implement in clinical settings. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
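    The net monetary benefit approach combines QALY gains and cost changes through a willingness-to-pay threshold: NMB = λ·ΔQALYs − Δcost. A sketch with numbers patterned on the abstract; the £20,000-per-QALY threshold is an assumption (not stated in the abstract), so the absolute figures are illustrative:

```python
# Net monetary benefit: NMB = WTP * QALYs_gained - cost_change,
# where WTP is the willingness-to-pay threshold per QALY (assumed GBP 20,000).
WTP = 20_000.0

def net_monetary_benefit(qalys_gained, cost_change):
    return WTP * qalys_gained - cost_change

# per 1000 patients over 10 years, versus no algorithm; a cost SAVING
# enters as a negative cost change
nmb_smi_bmi = net_monetary_benefit(15, -53_000)    # SMI-specific BMI algorithm
nmb_gen_lipid = net_monetary_benefit(13, -46_000)  # general population lipid algorithm
```

Under this (assumed) threshold the SMI-specific BMI algorithm dominates, consistent with the abstract's ranking: both more QALYs and a larger saving.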

  1. Super-Encryption Implementation Using Monoalphabetic Algorithm and XOR Algorithm for Data Security

    Science.gov (United States)

    Rachmawati, Dian; Andri Budiman, Mohammad; Aulia, Indra

    2018-03-01

    The exchange of data that occurs offline and online is very vulnerable to the threat of data theft. In general, cryptography is a science and art for maintaining data secrecy. Encryption is a cryptographic operation in which data is transformed into ciphertext, which is unreadable and meaningless so it cannot be read or understood by other parties. In super-encryption, two or more encryption algorithms are combined to make the result more secure. In this work, the Monoalphabetic algorithm and the XOR algorithm are combined to form a super-encryption. The Monoalphabetic algorithm works by changing a particular letter into a new letter based on existing keywords, while the XOR algorithm works by using the XOR logic operation. Since the Monoalphabetic algorithm is a classical cryptographic algorithm and the XOR algorithm is a modern cryptographic algorithm, this scheme is expected to be both easy to implement and more secure. The combination of the two algorithms is capable of securing the data and restoring it to its original form (plaintext), so data integrity is still ensured.
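    A minimal sketch of such a super-encryption, a monoalphabetic substitution followed by a repeating-key XOR, is shown below; the keys are illustrative and the paper's exact scheme may differ in detail:

```python
import string

def mono_encrypt(plain, key_alphabet):
    """Monoalphabetic substitution over lowercase letters; other characters pass through."""
    return plain.translate(str.maketrans(string.ascii_lowercase, key_alphabet))

def mono_decrypt(cipher, key_alphabet):
    return cipher.translate(str.maketrans(key_alphabet, string.ascii_lowercase))

def xor_bytes(data, key):
    """Repeating-key XOR; applying it twice with the same key restores the input."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

KEY_ALPHABET = "qwertyuiopasdfghjklzxcvbnm"   # example substitution key (a permutation)
XOR_KEY = b"k3y"                              # example XOR key

def super_encrypt(plain):
    stage1 = mono_encrypt(plain, KEY_ALPHABET)   # classical layer
    return xor_bytes(stage1.encode(), XOR_KEY)   # modern layer

def super_decrypt(cipher):
    stage1 = xor_bytes(cipher, XOR_KEY).decode()
    return mono_decrypt(stage1, KEY_ALPHABET)

ct = super_encrypt("deferred acceptance")
pt = super_decrypt(ct)
```

The round trip restores the plaintext exactly, demonstrating the integrity property claimed in the abstract; security, of course, rests on the combination rather than on either weak cipher alone.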

  2. An Alternative Consent Process for Minimal Risk Research in the ICU.

    Science.gov (United States)

    Terry, Melissa A; Freedberg, Daniel E; Morris, Marilyn C

    2017-09-01

    Seeking consent for minimal risk research in the ICU poses challenges, especially when the research is time-sensitive. Our aim was to determine the extent to which ICU patients or surrogates support a deferred consent process for a minimal risk study without the potential for direct benefit. Prospective cohort study. Five ICUs within a tertiary care hospital. Newly admitted ICU patients 18 years old or older. We administered an eight-item verbal survey to patients or surrogates approached for consent to participate in a minimal risk, ICU-based study. The parent study involved noninvasive collection of biosamples and clinical data at the time of ICU admission and again 3 days later. If patients had capacity at the time of ICU admission, or if a surrogate was readily available, consent was sought prior to initial sample collection; otherwise, a waiver of consent was granted, and deferred consent was sought 3 days later. Quantitative and qualitative data were analyzed. One hundred fifty-seven individuals were approached for consent to participate in the parent study; none objected to the consent process. One hundred thirty-five of 157 (86%) completed the survey, including 94 who consented to the parent study and 41 who declined. Forty-four of 60 individuals (73%) approached for deferred consent responded positively to the question "Did we make the right choice in waiting until now to ask your consent?"; three of 60 (5%) responded negatively, and 13 of 60 (22%) made a neutral or unrelated response. The most common reason given for endorsing the deferred consent process was the stress of the early ICU experience (25 of 44; 61%). Most patients and surrogates accept a deferred consent process for minimal risk research in the ICU. For appropriate ICU-based research, investigators and Institutional Review Boards should consider a deferred consent process if the subject lacks capacity and an appropriate surrogate is not readily available.

  3. Self-acceptance of stuttering: A preliminary study.

    Science.gov (United States)

    De Nardo, Thales; Gabel, Rodney M; Tetnowski, John A; Swartz, Eric R

    2016-01-01

    This study explored the relationship between self-acceptance of stuttering and (1) psychosocial factors (self-esteem, hostility towards others, emotional support, and perceived discrimination); (2) treatment history (support group participation, therapy duration, and perceived therapy success); and (3) previously reported variables in self-acceptance of stuttering, which include age and stuttering severity. Participants were 80 adults who stutter who were recruited with assistance from the National Stuttering Association and Board Certified Specialists in Fluency Disorders. Participants completed an electronic survey composed of an acceptance of stuttering scale, psychosocial scales, and a participant information questionnaire. Statistical analysis identified significant correlations between participants' reports of self-acceptance of stuttering and self-esteem, perceived discrimination, hostility towards others, and perceived therapy outcome. Self-esteem was positively correlated with self-acceptance, while hostility towards others and perceived discrimination were negatively correlated with self-acceptance. Participants who perceived their therapy outcome to be successful were significantly more likely to report higher levels of self-acceptance. No significant relationships were found between self-acceptance of stuttering and support group participation, emotional support, stuttering severity, and participant age. This exploratory investigation has provided a foundation for future studies on the self-acceptance of stuttering. The findings indicate common psychosocial variables in self-acceptance of stuttering and of other disabilities. The significant relationships found between self-acceptance of stuttering and psychosocial and therapeutic variables need to be further explored to identify their causes and clinical implications.
The reader will be able to (1) discuss the importance of assessing self-acceptance of stuttering, (2) summarize the literature on self-acceptance

  4. Revisiting the NEH algorithm- the power of job insertion technique for optimizing the makespan in permutation flow shop scheduling

    Directory of Open Access Journals (Sweden)

    A. Baskar

    2016-04-01

    Full Text Available Permutation flow shop scheduling problems have been an interesting area of research for over six decades. Of the several parameters, minimization of makespan has been studied extensively over the years. The problems are widely regarded as NP-Complete if the number of machines is more than three. As the computation time grows exponentially with respect to the problem size, heuristics and meta-heuristics have been proposed by many authors that give reasonably accurate and acceptable results. The NEH algorithm proposed in 1983 is still considered one of the best simple, constructive heuristics for the minimization of makespan. This paper analyses the powerful job insertion technique used by the NEH algorithm and proposes seven new variants; the complexity level remains the same. The 120 problem instances proposed by Taillard have been used for validating the algorithms. Of the seven, three produce better results than the original NEH algorithm.
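The job insertion technique the paper builds on can be sketched as a plain NEH baseline (not the seven variants). In this sketch, `p[j][k]` is the processing time of job `j` on machine `k`, an illustrative data layout:

```python
def makespan(seq, p):
    # Completion-time recursion for a permutation flow shop:
    # C[j][k] = max(C[j-1][k], C[j][k-1]) + p[j][k].
    comp = [0] * len(p[0])
    for j in seq:
        prev = 0
        for k in range(len(comp)):
            prev = max(prev, comp[k]) + p[j][k]
            comp[k] = prev
    return comp[-1]

def neh(p):
    # NEH: order jobs by decreasing total processing time, then insert
    # each job at the position of the partial sequence that minimizes
    # the partial makespan.
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = []
    for j in order:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)
```

Each of the paper's variants changes how ties are broken or how the insertion positions are explored, while this insertion loop stays the core of the heuristic.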

  5. Differentiated influences of risk perceptions on nuclear power acceptance according to acceptance targets: Evidence from Korea

    International Nuclear Information System (INIS)

    Roh, Seung Kook; Lee, Jin Won

    2017-01-01

    The determinants of the public's nuclear power acceptance have received considerable attention as decisive factors regarding nuclear power policy. However, the contingency of the relative importance of different determinants has been less explored. Building on the literature of psychological distance between the individual and the object, the present study demonstrates that the relative effects of different types of perceived risks regarding nuclear power generation differ across acceptance targets. Using a sample of Korea, our results show that, regarding national acceptance of nuclear power generation, perceived risk from nuclear power plants exerts a stronger negative effect than that from radioactive waste management; however, the latter exerts a stronger negative effect than the former on local acceptance of a nuclear power plant. This finding provides implications for efficient public communication strategy to raise nuclear power acceptance.

  6. Older Adults' Acceptance of Information Technology

    Science.gov (United States)

    Wang, Lin; Rau, Pei-Luen Patrick; Salvendy, Gavriel

    2011-01-01

    This study investigated variables contributing to older adults' information technology acceptance through a survey, which was used to find factors explaining and predicting older adults' information technology acceptance behaviors. Four factors, including needs satisfaction, perceived usability, support availability, and public acceptance, were…

  7. Monitoring device acceptance in implantable cardioverter defibrillator patients using the Florida Patient Acceptance Survey

    DEFF Research Database (Denmark)

    Versteeg, Henneke; Starrenburg, Annemieke; Denollet, Johan

    2012-01-01

    Patient device acceptance might be essential in identifying patients at risk for adverse patient-reported outcomes following implantation of an implantable cardioverter defibrillator (ICD). We examined the validity and reliability of the Florida Patient Acceptance Scale (FPAS) and identified corr...

  8. Medical research in emergency research in the European Union member states: tensions between theory and practice.

    Science.gov (United States)

    Kompanje, Erwin J O; Maas, Andrew I R; Menon, David K; Kesecioglu, Jozef

    2014-04-01

    In almost all of the European Union member states, prior consent by a legal representative is used as a substitute for informed patient consent for non-urgent medical research. Deferred (patient and/or proxy) consent is accepted as a substitute in acute emergency research in approximately half of the member states. In 12 European Union member states emergency research is not mentioned in national law. Medical research in the European Union is covered by the Clinical Trial Directive 2001/20/EC. A proposal for a regulation by the European Commission is currently being examined by the European Parliament and the Council and will replace Directive 2001/20/EC. Deferred patient and/or proxy consent is allowed in the proposed regulation, but does not fit completely with the practice of emergency research. For example, deferred consent is only possible when legal representatives are not available. This criterion will delay inclusion of patients with acute life-threatening conditions within short time frames. As the regulation shall be binding in its entirety in all member states, emergency research in acute situations will still not be possible as it should be.

  9. Wind power: basic challenge concerning social acceptance

    NARCIS (Netherlands)

    Wolsink, M.; Meyers, R.A.

    2012-01-01

    This reference article gives an overview of social acceptance (acceptance by all relevant actors in society) of all relevant aspects of implementation and diffusion of wind power. In social acceptance three dimensions of acceptance are distinguished (socio-political -; community -; market

  10. Algorithms for synthesizing management solutions based on OLAP-technologies

    Science.gov (United States)

    Pishchukhin, A. M.; Akhmedyanova, G. F.

    2018-05-01

    OLAP technologies are a convenient means of analyzing large amounts of information. In this work, an attempt was made to improve the synthesis of optimal management decisions. The developed algorithms allow forecasting the needs and the accepted management decisions for the main types of enterprise resources. Their advantage is efficiency, based on the simplicity of quadratic functions and differential equations of only the first order. At the same time, resources are optimally redistributed between the different types of products in the enterprise's assortment, and the allocated resources are optimally distributed over time. The proposed solutions can be placed on additional, specially introduced coordinates of the hypercube representing the data warehouse.

  11. Evaluation of segmentation algorithms for generation of patient models in radiofrequency hyperthermia

    International Nuclear Information System (INIS)

    Wust, P.; Gellermann, J.; Beier, J.; Tilly, W.; Troeger, J.; Felix, R.; Wegner, S.; Oswald, H.; Stalling, D.; Hege, H.C.; Deuflhard, P.

    1998-01-01

    Time-efficient and easy-to-use segmentation algorithms (contour generation) are a precondition for various applications in radiation oncology, especially for planning purposes in hyperthermia. We have developed the three following algorithms for contour generation and implemented them in an editor of the HyperPlan hyperthermia planning system. Firstly, a manual contour input with numerous correction and editing options. Secondly, a volume growing algorithm with adjustable threshold range and minimal region size. Thirdly, a watershed transformation in two and three dimensions. In addition, the region input function of the Helax commercial radiation therapy planning system was available for comparison. All four approaches were applied under routine conditions to two-dimensional computed tomographic slices of the superior thoracic aperture, mid-chest, upper abdomen, mid-abdomen, pelvis and thigh; they were also applied to a 3D CT sequence of 72 slices using the three-dimensional extension of the algorithms. Time to generate the contours and their quality with respect to a reference model were determined. Manual input for a complete patient model required approximately 5 to 6 h for 72 CT slices (4.5 min/slice). If slight irregularities at object boundaries are accepted, this time can be reduced to 3.5 min/slice using the volume growing algorithm. However, generating a tetrahedron mesh from such a contour sequence for hyperthermia planning (the basis for finite-element algorithms) requires a significant amount of postediting. With the watershed algorithm extended to three dimensions, processing time can be further reduced to 3 min/slice while achieving satisfactory contour quality. Therefore, this method is currently regarded as offering some potential for efficient automated model generation in hyperthermia. In summary, the 3D volume growing algorithm and watershed transformation are both suitable for segmentation of even low-contrast objects. However, they are not

  12. Linear feature detection algorithm for astronomical surveys - I. Algorithm description

    Science.gov (United States)

    Bektešević, Dino; Vinković, Dejan

    2017-11-01

    Computer vision algorithms are powerful tools in astronomical image analyses, especially when automation of object detection and extraction is required. Modern object detection algorithms in astronomy are oriented towards detection of stars and galaxies, ignoring completely the detection of existing linear features. With the emergence of wide-field sky surveys, linear features attract scientific interest as possible trails of fast flybys of near-Earth asteroids and meteors. In this work, we describe a new linear feature detection algorithm designed specifically for implementation in big data astronomy. The algorithm combines a series of algorithmic steps that first remove other objects (stars and galaxies) from the image and then enhance the line to enable more efficient line detection with the Hough algorithm. The rate of false positives is greatly reduced thanks to a step that replaces possible line segments with rectangles and then compares lines fitted to the rectangles with the lines obtained directly from the image. The speed of the algorithm and its applicability in astronomical surveys are also discussed.
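The Hough voting stage at the heart of the pipeline above can be illustrated with a toy accumulator. This is a generic sketch, not the authors' algorithm: it omits the star/galaxy removal and the rectangle-comparison step that suppresses false positives, and the bin sizes are arbitrary:

```python
import math

def hough_strongest_line(points, angle_step_deg=5):
    # Each point (x, y) votes for every line rho = x*cos(theta) + y*sin(theta)
    # passing through it; the most-voted (rho, theta) bin is the detected line.
    thetas = [math.radians(t) for t in range(0, 180, angle_step_deg)]
    acc = {}
    for x, y in points:
        for th in thetas:
            rho = round(x * math.cos(th) + y * math.sin(th))
            acc[(rho, th)] = acc.get((rho, th), 0) + 1
    (rho, theta), votes = max(acc.items(), key=lambda kv: kv[1])
    return rho, theta, votes
```

On a clean set of collinear points, all votes concentrate in a single (rho, theta) bin; in real images the pre-processing steps described above are what make this concentration detectable.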

  13. Access Point Backhaul Resource Aggregation as a Many-to-One Matching Game in Wireless Local Area Networks

    Directory of Open Access Journals (Sweden)

    Kawther Hassine

    2017-01-01

    Full Text Available This paper studies backhaul bandwidth aggregation in the context of a wireless local area network composed of two different types of access points: those with spare backhaul capacity (which we term providers and those in shortage of it (beneficiaries; the aim is to transfer excess capacity from providers to beneficiaries. We model the system as a matching game with a many-to-one setting wherein several providers can be matched to one beneficiary, and adopt the so-called deferred acceptance algorithm to reach an optimal and stable solution. We consider two flavors, when the beneficiaries are limited in their resource demands and when they are not, and two scenarios, when resources are abundant and when they are scarce. Our results show that the many-to-one setting outperforms the one-to-one case in terms of overall throughput gain, resource usage, and individual beneficiary satisfaction by up to 50%, whether resources are scarce or abundant. As for the limited versus nonlimited case, the former ensures a fairer sharing of spectral resources and a higher satisfaction percentage among beneficiaries.
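The deferred acceptance procedure the paper adopts can be sketched in its textbook many-to-one (Gale-Shapley) form. The orientation below, where providers propose and beneficiaries hold up to a quota, and all names and the data layout are illustrative choices, not details from the paper:

```python
def deferred_acceptance(proposer_prefs, receiver_prefs, quotas):
    # Many-to-one Gale-Shapley deferred acceptance.  Proposers work down
    # their preference lists; each receiver tentatively holds its best
    # proposals up to its quota and rejects the rest (acceptance is
    # deferred until no rejected proposer has anywhere left to propose).
    rank = {r: {p: i for i, p in enumerate(prefs)}
            for r, prefs in receiver_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}
    held = {r: [] for r in receiver_prefs}
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        prefs = proposer_prefs[p]
        if next_choice[p] >= len(prefs):
            continue  # exhausted its list; stays unmatched
        r = prefs[next_choice[p]]
        next_choice[p] += 1
        if p not in rank[r]:
            free.append(p)  # receiver finds this proposer unacceptable
            continue
        held[r].append(p)
        held[r].sort(key=lambda q: rank[r][q])
        if len(held[r]) > quotas[r]:
            free.append(held[r].pop())  # bump the worst held proposal
    return held
```

The resulting matching is stable: no provider-beneficiary pair would both prefer to deviate from it, which is the property the paper relies on.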

  14. Golden Sine Algorithm: A Novel Math-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    TANYILDIZI, E.

    2017-05-01

    Full Text Available In this study, Golden Sine Algorithm (Gold-SA is presented as a new metaheuristic method for solving optimization problems. Gold-SA has been developed as a new search algorithm based on population. This math-based algorithm is inspired by sine that is a trigonometric function. In the algorithm, random individuals are created as many as the number of search agents with uniform distribution for each dimension. The Gold-SA operator searches to achieve a better solution in each iteration by trying to bring the current situation closer to the target value. The solution space is narrowed by the golden section so that the areas that are supposed to give only good results are scanned instead of the whole solution space scan. In the tests performed, it is seen that Gold-SA has better results than other population based methods. In addition, Gold-SA has fewer algorithm-dependent parameters and operators than other metaheuristic methods, increasing the importance of this method by providing faster convergence of this new method.

  15. Differentiated influences of risk perceptions on nuclear power acceptance according to acceptance targets: Evidence from Korea

    Directory of Open Access Journals (Sweden)

    Seungkook Roh

    2017-08-01

    Full Text Available The determinants of the public's nuclear power acceptance have received considerable attention as decisive factors regarding nuclear power policy. However, the contingency of the relative importance of different determinants has been less explored. Building on the literature of psychological distance between the individual and the object, the present study demonstrates that the relative effects of different types of perceived risks regarding nuclear power generation differ across acceptance targets. Using a sample of Korea, our results show that, regarding national acceptance of nuclear power generation, perceived risk from nuclear power plants exerts a stronger negative effect than that from radioactive waste management; however, the latter exerts a stronger negative effect than the former on local acceptance of a nuclear power plant. This finding provides implications for efficient public communication strategy to raise nuclear power acceptance.

  16. Differentiated influences of risk perceptions on nuclear power acceptance according to acceptance targets: Evidence from Korea

    Energy Technology Data Exchange (ETDEWEB)

    Roh, Seung Kook [Policy Research Center, Korea Atomic Energy Research Institute (KAERI), Daejeon (Korea, Republic of); Lee, Jin Won [School of Management, Xiamen University, Xiamen (China)

    2017-08-15

    The determinants of the public's nuclear power acceptance have received considerable attention as decisive factors regarding nuclear power policy. However, the contingency of the relative importance of different determinants has been less explored. Building on the literature of psychological distance between the individual and the object, the present study demonstrates that the relative effects of different types of perceived risks regarding nuclear power generation differ across acceptance targets. Using a sample of Korea, our results show that, regarding national acceptance of nuclear power generation, perceived risk from nuclear power plants exerts a stronger negative effect than that from radioactive waste management; however, the latter exerts a stronger negative effect than the former on local acceptance of a nuclear power plant. This finding provides implications for efficient public communication strategy to raise nuclear power acceptance.

  17. The Orthogonally Partitioned EM Algorithm: Extending the EM Algorithm for Algorithmic Stability and Bias Correction Due to Imperfect Data.

    Science.gov (United States)

    Regier, Michael D; Moodie, Erica E M

    2016-05-01

    We propose an extension of the EM algorithm that exploits the common assumption of unique parameterization, corrects for biases due to missing data and measurement error, converges for the specified model when standard implementation of the EM algorithm has a low probability of convergence, and reduces a potentially complex algorithm into a sequence of smaller, simpler, self-contained EM algorithms. We use the theory surrounding the EM algorithm to derive the theoretical results of our proposal, showing that an optimal solution over the parameter space is obtained. A simulation study is used to explore the finite sample properties of the proposed extension when there is missing data and measurement error. We observe that partitioning the EM algorithm into simpler steps may provide better bias reduction in the estimation of model parameters. The ability to break down a complicated problem into a series of simpler, more accessible problems will permit a broader implementation of the EM algorithm, permit the use of software packages that now implement and/or automate the EM algorithm, and make the EM algorithm more accessible to a wider and more general audience.
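For context, the baseline being extended is the standard EM iteration. The sketch below is plain, unpartitioned EM for a two-component 1-D Gaussian mixture, purely illustrative and not the orthogonally partitioned variant proposed in the paper:

```python
import math

def em_gmm_1d(data, mu_init, iters=50):
    # Plain EM for a two-component 1-D Gaussian mixture.
    mu1, mu2 = mu_init
    v1 = v2 = 1.0
    pi = 0.5  # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        resp = []
        for x in data:
            a = pi * math.exp(-((x - mu1) ** 2) / (2 * v1)) / math.sqrt(v1)
            b = (1 - pi) * math.exp(-((x - mu2) ** 2) / (2 * v2)) / math.sqrt(v2)
            resp.append(a / (a + b) if a + b > 0 else 0.5)
        # M-step: re-estimate means, variances, and mixing weight.
        n1 = sum(resp)
        n2 = len(data) - n1
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1 - r) * x for r, x in zip(resp, data)) / n2
        v1 = max(sum(r * (x - mu1) ** 2 for r, x in zip(resp, data)) / n1, 1e-6)
        v2 = max(sum((1 - r) * (x - mu2) ** 2 for r, x in zip(resp, data)) / n2, 1e-6)
        pi = n1 / len(data)
    return (mu1, mu2), (v1, v2), pi
```

The partitioned extension replaces this single E/M loop with a sequence of smaller, self-contained EM problems over orthogonal parameter blocks; the per-block updates are of the same shape as above.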

  18. Statistical analysis of the vibration loading of the reactor internals and fuel assemblies of reactor units type WWER-440 from different projects

    International Nuclear Information System (INIS)

    Ovcharov, O.; Pavelko, V.; Usanov, A.; Arkadov, G.; Dolgov, A.; Molchanov, V.; Anikeev, J.; Pljush, A.

    2006-01-01

    In this paper the following items have been presented: 1) Vibration noise instrument channels; 2) Vibration loading characteristics of control assemblies, internals and design peculiarities of internals of different WWER-440 projects; 3) Coolant flow rate through the reactor, reactor core, fuel assemblies and control assemblies for different WWER-440 projects; and 4) Noise measurements of coolant speed per channel. The change of the auto power spectrum density of the absolute displacement detector signal over the last 12 years of SUS monitoring of the Kola NPP unit 2; the coherence function groups between two SPND of the same level for the Kola NPP unit 1; the measured coolant flow rate at Paks NPP; and the auto power spectrum density group of SPND signals from 11 neutron measuring channels of the Kola NPP unit 1 are given. The main factors of vibration loading of internals and fuel assemblies for Kola NPP units 1-4, Bohunice NPP units 1 and 2 and Novovoronezh NPP units 3 and 4 are also discussed.

  19. 46 CFR 28.73 - Accepted organizations.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Accepted organizations. 28.73 Section 28.73 Shipping... INDUSTRY VESSELS General Provisions § 28.73 Accepted organizations. An organization desiring to be designated by the Commandant as an accepted organization must request such designation in writing. As a...

  20. Coming of Age on the Margins: Mental Health and Wellbeing Among Latino Immigrant Young Adults Eligible for Deferred Action for Childhood Arrivals (DACA).

    Science.gov (United States)

    Siemons, Rachel; Raymond-Flesch, Marissa; Auerswald, Colette L; Brindis, Claire D

    2017-06-01

    Undocumented immigrant young adults growing up in the United States face significant challenges. For those qualified, the Deferred Action for Childhood Arrivals (DACA) program's protections may alleviate stressors, with implications for their mental health and wellbeing (MHWB). We conducted nine focus groups with 61 DACA-eligible Latinos (ages 18-31) in California to investigate their health needs. Participants reported MHWB as their greatest health concern and viewed DACA as beneficial through increasing access to opportunities and promoting belonging and peer support. Participants found that DACA also introduced unanticipated challenges, including greater adult responsibilities and a new precarious identity. Thus, immigration policies such as DACA may influence undocumented young adults' MHWB in expected and unexpected ways. Research into the impacts of policy changes on young immigrants' MHWB can guide stakeholders to better address this population's health needs. MHWB implications include the need to reduce fear of deportation and increase access to services.

  1. Early declarative memory predicts productive language: A longitudinal study of deferred imitation and communication at 9 and 16 months.

    Science.gov (United States)

    Sundqvist, Annette; Nordqvist, Emelie; Koch, Felix-Sebastian; Heimann, Mikael

    2016-11-01

    Deferred imitation (DI) may be regarded as an early declarative-like memory ability shaping the infant's ability to learn about novelties and regularities of the surrounding world. In the current longitudinal study, infants were assessed at 9 and 16 months. DI was assessed using five novel objects. Each infant's communicative development was measured by parental questionnaires. The results indicate stability in DI performance and early communicative development between 9 and 16 months. The early achievers at 9 months were still advanced at 16 months. Results also identified a predictive relationship between the infant's gestural development at 9 months and the infant's productive and receptive language at 16 months. Moreover, the results show that declarative memory, measured with DI, and gestural communication at 9 months independently predict productive language at 16 months. These findings suggest a connection between the ability to form non-linguistic and linguistic mental representations. These results indicate that the child's DI ability when predominantly preverbal might be regarded as an early domain-general declarative memory ability underlying early productive language development. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  2. Parallel sorting algorithms

    CERN Document Server

    Akl, Selim G

    1985-01-01

    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on the parallel sorting problems. The text also presents twenty different algorithms, such as linear arrays, mesh-connected computers, cube-connected computers. Another example where algorithm can be applied is on the shared-memory SIMD (single instruction stream multiple data stream) computers in which the whole sequence to be sorted can fit in the

  3. Evaluation of the Eclipse eMC algorithm for bolus electron conformal therapy using a standard verification dataset.

    Science.gov (United States)

    Carver, Robert L; Sprunger, Conrad P; Hogstrom, Kenneth R; Popple, Richard A; Antolak, John A

    2016-05-08

    The purpose of this study was to evaluate the accuracy and calculation speed of electron dose distributions calculated by the Eclipse electron Monte Carlo (eMC) algorithm for use with bolus electron conformal therapy (ECT). The recent commercial availability of bolus ECT technology requires further validation of the eMC dose calculation algorithm. eMC-calculated electron dose distributions for bolus ECT have been compared to previously measured TLD-dose points throughout patient-based cylindrical phantoms (retromolar trigone and nose), whose axial cross sections were based on the mid-PTV (planning treatment volume) CT anatomy. The phantoms consisted of SR4 muscle substitute, SR4 bone substitute, and air. The treatment plans were imported into the Eclipse treatment planning system, and electron dose distributions calculated using 1% and processors (Intel Xeon E5-2690, 2.9 GHz) on a framework agent server (FAS). In comparison, the eMC was significantly more accurate than the pencil beam algorithm (PBA). The eMC has comparable accuracy to the pencil beam redefinition algorithm (PBRA) used for bolus ECT planning and has acceptably low dose calculation times. The eMC accuracy decreased when smoothing was used in high-gradient dose regions. The eMC accuracy was consistent with that previously reported for the eMC electron dose algorithm and shows that the algorithm is suitable for clinical implementation of bolus ECT.

  4. Application of Shuffled Frog Leaping Algorithm and Genetic Algorithm for the Optimization of Urban Stormwater Drainage

    Science.gov (United States)

    Kumar, S.; Kaushal, D. R.; Gosain, A. K.

    2017-12-01

    Urban hydrology will have an increasing role to play in the sustainability of human settlements. Expansion of urban areas brings significant changes in the physical characteristics of land use. Problems with the administration of urban flooding have their roots in the concentration of population within a relatively small area. As watersheds are urbanized, infiltration decreases and the pattern of surface runoff is changed, generating high peak flows and large runoff volumes from urban areas. Conceptual rainfall-runoff models have become a foremost tool for predicting surface runoff and flood forecasting. Manual calibration is often time consuming and tedious because of the subjectivity involved, which makes an automatic approach preferable. The calibration of parameters usually includes numerous criteria for evaluating performance with respect to the observed data. Moreover, derivation of the objective function associated with the calibration of model parameters is quite challenging. Various studies dealing with optimization methods have steered the adoption of evolution-based optimization algorithms. In this paper, a systematic comparison of two evolutionary approaches to multi-objective optimization, namely the shuffled frog leaping algorithm (SFLA) and genetic algorithms (GA), is done. SFLA is a population-based cooperative search metaphor inspired by natural memetics, while GA is based on the principle of survival of the fittest and natural evolution. SFLA and GA have been employed for optimizing the major parameters, i.e., width, imperviousness, Manning's coefficient and depression storage, for the highly urbanized catchment of Delhi, India. The study summarizes the auto-tuning of a widely used storm water management model (SWMM), by internal coupling of SWMM with SFLA and GA separately. The values of statistical parameters such as Nash-Sutcliffe efficiency (NSE) and Percent Bias (PBIAS) were found to lie within acceptable limits, indicating reasonably good model performance.

  5. Credit in Acceptance Sampling on Attributes

    NARCIS (Netherlands)

    Klaassen, Chris A.J.

    2000-01-01

    Credit is introduced in acceptance sampling on attributes and a Credit Based Acceptance sampling system is developed that is very easy to apply in practice. The credit of a producer is defined as the total number of items accepted since the last rejection. In our sampling system the sample size for a

  6. Performance of fusion algorithms for computer-aided detection and classification of mines in very shallow water obtained from testing in navy Fleet Battle Exercise-Hotel 2000

    Science.gov (United States)

    Ciany, Charles M.; Zurawski, William; Kerfoot, Ian

    2001-10-01

    The performance of Computer Aided Detection/Computer Aided Classification (CAD/CAC) fusion algorithms on side-scan sonar images was evaluated using data taken at the Navy's Fleet Battle Exercise-Hotel held in Panama City, Florida, in August 2000. A 2-of-3 binary fusion algorithm is shown to provide robust performance. The algorithm accepts the classification decisions and associated contact locations from three different CAD/CAC algorithms, clusters the contacts based on Euclidean distance, and then declares a valid target when a clustered contact is declared by at least 2 of the 3 individual algorithms. This simple binary fusion provided a 96 percent probability of correct classification at a false alarm rate of 0.14 false alarms per image per side. The performance represented a 3.8:1 reduction in false alarms over the best performing single CAD/CAC algorithm, with no loss in probability of correct classification.
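The 2-of-3 voting rule is simple to state in code. The sketch below assumes a greedy distance-based clustering and a centroid output, details the abstract does not specify:

```python
import math

def fuse_contacts(detections, radius):
    # detections: one list of (x, y) contact locations per CAD/CAC
    # algorithm.  Contacts are greedily clustered by Euclidean distance,
    # and a cluster becomes a declared target when it contains contacts
    # from at least 2 of the 3 algorithms.
    tagged = [(x, y, algo) for algo, dets in enumerate(detections)
              for (x, y) in dets]
    clusters = []
    for c in tagged:
        for cl in clusters:
            if any(math.dist(c[:2], d[:2]) <= radius for d in cl):
                cl.append(c)
                break
        else:
            clusters.append([c])
    targets = []
    for cl in clusters:
        if len({algo for _, _, algo in cl}) >= 2:  # the 2-of-3 vote
            targets.append((sum(x for x, _, _ in cl) / len(cl),
                            sum(y for _, y, _ in cl) / len(cl)))
    return targets
```

A contact seen by only one algorithm is suppressed, which is exactly how the fusion trades a small loss in single-algorithm sensitivity for the large false-alarm reduction reported above.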

  7. 12 CFR 412.7 - Conditions for acceptance.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Conditions for acceptance. 412.7 Section 412.7 Banks and Banking EXPORT-IMPORT BANK OF THE UNITED STATES ACCEPTANCE OF PAYMENT FROM A NON-FEDERAL SOURCE FOR TRAVEL EXPENSES § 412.7 Conditions for acceptance. (a) Eximbank may accept payment for...

  8. Vehicle Trajectory Estimation Using Spatio-Temporal MCMC

    Directory of Open Access Journals (Sweden)

    Francois Bardet

    2010-01-01

Full Text Available This paper presents an algorithm for modeling and tracking vehicles in video sequences within one integrated framework. Most existing solutions are based on sequential methods that make inference from current information only. In contrast, we propose a deferred logical inference method that makes a decision from a sequence of observations, thus performing a spatio-temporal search over the whole trajectory. One drawback of deferred logical inference methods is that the solution space of hypotheses grows exponentially with the depth of observation. Our approach takes into account both the kinematic model of the vehicle and a driver behavior model in order to reduce the space of solutions. The resulting state model explains the trajectory with only 11 parameters. The solution space is then sampled with a Markov Chain Monte Carlo (MCMC) method that uses a model-driven proposal distribution to control random walk behavior. We demonstrate our method on real video sequences for which ground truth is provided by an RTK (Real-Time Kinematic) GPS. Experimental results show that the proposed algorithm outperforms a sequential inference solution (particle filter).
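
    The core accept/reject step of such an MCMC trajectory search can be illustrated with a minimal random-walk Metropolis sampler. The constant-velocity model and all parameter values below are stand-ins for illustration; the paper's actual state model has 11 parameters and a model-driven proposal.

```python
import random, math

def log_likelihood(params, observations, dt=1.0, sigma=1.0):
    """Gaussian likelihood of noisy positions under a constant-velocity
    model x(t) = x0 + v*t -- a stand-in for the paper's kinematic and
    driver-behavior model (assumption for illustration)."""
    x0, v = params
    ll = 0.0
    for k, z in enumerate(observations):
        pred = x0 + v * k * dt
        ll += -0.5 * ((z - pred) / sigma) ** 2
    return ll

def metropolis(observations, n_iter=5000, step=0.1, seed=1):
    """Random-walk Metropolis over the trajectory parameters,
    tracking the best hypothesis seen so far."""
    random.seed(seed)
    current = [0.0, 0.0]
    ll = log_likelihood(current, observations)
    best, best_ll = list(current), ll
    for _ in range(n_iter):
        proposal = [current[0] + random.gauss(0, step),
                    current[1] + random.gauss(0, step)]
        p_ll = log_likelihood(proposal, observations)
        # Metropolis acceptance: always accept uphill, sometimes downhill
        if p_ll >= ll or random.random() < math.exp(p_ll - ll):
            current, ll = proposal, p_ll
            if ll > best_ll:
                best, best_ll = list(current), ll
    return best

# Synthetic track with x0 = 2 and v = 1.5
obs = [2.0 + 1.5 * k for k in range(20)]
x0, v = metropolis(obs)
print(round(x0, 1), round(v, 1))
```

    A model-driven proposal, as in the paper, would replace the isotropic Gaussian step with moves informed by the kinematic model.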

  9. Consumer acceptance of functional foods

    DEFF Research Database (Denmark)

    Frewer, Lynn J.; Scholderer, Joachim; Lambert, Nigel

    2003-01-01

In the past, it has been assumed that consumers would accept novel foods if there is a concrete and tangible consumer benefit associated with them, which implies that functional foods would quickly be accepted. However, there is evidence that individuals are likely to differ in the extent...... to which they are likely to buy products with particular functional properties. Various cross-cultural and demographic differences in acceptance found in the literature are reviewed, as well as barriers to dietary change. In conclusion, it is argued that understanding consumers' risk perceptions...

  10. Influenza detection and prediction algorithms: comparative accuracy trial in Östergötland county, Sweden, 2008-2012.

    Science.gov (United States)

    Spreco, A; Eriksson, O; Dahlström, Ö; Timpka, T

    2017-07-01

Methods for the detection of influenza epidemics and prediction of their progress have seldom been comparatively evaluated using prospective designs. This study aimed to perform a prospective comparative trial of algorithms for the detection and prediction of increased local influenza activity. Data on clinical influenza diagnoses recorded by physicians and syndromic data from a telenursing service were used. Five detection and three prediction algorithms previously evaluated in public health settings were calibrated and then evaluated over 3 years. When applied to diagnostic data, only detection using the Serfling regression method and prediction using the non-adaptive log-linear regression method showed acceptable performance during winter influenza seasons. For the syndromic data, none of the detection algorithms displayed a satisfactory performance, while non-adaptive log-linear regression was the best performing prediction method. We conclude that the available algorithms for influenza detection and prediction display satisfactory performance when applied to local diagnostic data during winter influenza seasons, but not consistent performance when applied to local syndromic data. Further evaluations and research on combining methods of these types in public health information infrastructures for 'nowcasting' (integrated detection and prediction) of influenza activity are warranted.
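
    A minimal Serfling-style detector can be sketched as follows: fit a cyclic regression baseline (level, trend, one annual harmonic) and flag weeks that exceed the baseline by a threshold. The single-harmonic form and the 1.96-SD threshold are common choices, not necessarily those calibrated in the study.

```python
import numpy as np

def serfling_baseline(counts, period=52):
    """Fit a Serfling-style cyclic regression baseline to weekly counts:
    intercept + linear trend + one annual harmonic (a common minimal form;
    the study's exact specification may differ)."""
    t = np.arange(len(counts), dtype=float)
    X = np.column_stack([
        np.ones_like(t),                 # intercept
        t,                               # secular trend
        np.cos(2 * np.pi * t / period),  # annual harmonic
        np.sin(2 * np.pi * t / period),
    ])
    beta, *_ = np.linalg.lstsq(X, np.asarray(counts, float), rcond=None)
    return X @ beta

def detect_epidemic(counts, threshold_sd=1.96, period=52):
    """Flag weeks whose counts exceed baseline + threshold_sd * residual SD."""
    baseline = serfling_baseline(counts, period)
    resid_sd = np.std(np.asarray(counts, float) - baseline)
    return np.asarray(counts, float) > baseline + threshold_sd * resid_sd

# Three seasons of synthetic weekly counts with one injected outbreak week
rng = np.random.default_rng(0)
t = np.arange(156)
counts = 100 + 30 * np.cos(2 * np.pi * t / 52) + rng.normal(0, 3, 156)
counts[80] += 60  # injected outbreak week
flags = detect_epidemic(counts)
print(int(flags[80]))
```

    In practice the baseline is fitted on non-epidemic periods only; fitting on all weeks, as here, slightly inflates the threshold.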

  11. Fermion cluster algorithms

    International Nuclear Information System (INIS)

    Chandrasekharan, Shailesh

    2000-01-01

Cluster algorithms have recently been used to eliminate sign problems that plague Monte Carlo methods in a variety of systems. In particular, such algorithms can also be used to solve sign problems associated with the permutation of fermion world lines. This solution leads to the possibility of designing fermion cluster algorithms in certain cases. Using the example of free non-relativistic fermions, we discuss the ideas underlying the algorithm.

  12. PTM Along Track Algorithm to Maintain Spacing During Same Direction Pair-Wise Trajectory Management Operations

    Science.gov (United States)

    Carreno, Victor A.

    2015-01-01

Pair-wise Trajectory Management (PTM) is a cockpit-based delegated-responsibility separation standard. When an air traffic service provider gives a PTM clearance to an aircraft and the flight crew accepts the clearance, the flight crew will maintain spacing and separation from a designated aircraft. A PTM along-track algorithm receives state information from the designated aircraft and from the own ship to produce speed guidance for the flight crew to maintain spacing and separation.

  13. Performance of a Real-time Multipurpose 2-Dimensional Clustering Algorithm Developed for the ATLAS Experiment

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00372074; The ATLAS collaboration; Sotiropoulou, Calliope Louisa; Annovi, Alberto; Kordas, Kostantinos

    2016-01-01

In this paper the performance of the 2D pixel clustering algorithm developed for the Input Mezzanine card of the ATLAS Fast TracKer system is presented. Fast TracKer is an approved ATLAS upgrade whose goal is to provide a complete list of tracks to the ATLAS High Level Trigger for each level-1 accepted event, at up to 100 kHz event rate and with a very small latency, on the order of 100 µs. The Input Mezzanine card is the input stage of the Fast TracKer system. Its role is to receive data from the silicon detector and perform real-time clustering, thus reducing the amount of data propagated to the subsequent processing levels with minimal information loss. We focus on the most challenging component on the Input Mezzanine card, the 2D clustering algorithm executed on the pixel data. We compare two different implementations of the algorithm. The first, called the ideal implementation, searches clusters of pixels in the whole silicon module at once and calculates the cluster centroids exploiting the whole avail...

  14. Performance of a Real-time Multipurpose 2-Dimensional Clustering Algorithm Developed for the ATLAS Experiment

    CERN Document Server

    Gkaitatzis, Stamatios; The ATLAS collaboration

    2016-01-01

In this paper the performance of the 2D pixel clustering algorithm developed for the Input Mezzanine card of the ATLAS Fast TracKer system is presented. Fast TracKer is an approved ATLAS upgrade whose goal is to provide a complete list of tracks to the ATLAS High Level Trigger for each level-1 accepted event, at up to 100 kHz event rate and with a very small latency, on the order of 100 µs. The Input Mezzanine card is the input stage of the Fast TracKer system. Its role is to receive data from the silicon detector and perform real-time clustering, thus reducing the amount of data propagated to the subsequent processing levels with minimal information loss. We focus on the most challenging component on the Input Mezzanine card, the 2D clustering algorithm executed on the pixel data. We compare two different implementations of the algorithm. The first, called the ideal implementation, searches clusters of pixels in the whole silicon module at once and calculates the cluster centroids exploiting the whole avai...
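
    The clustering-and-centroid idea behind the Input Mezzanine algorithm can be illustrated in software, ignoring all hardware constraints. The 8-connectivity rule below is an assumption for the sketch.

```python
def cluster_pixels(pixels):
    """Minimal 2D clustering sketch: group hit pixels into 8-connected
    clusters via flood fill and return each cluster's centroid and size.
    (The FTK implementation is hardware-oriented; this only illustrates
    the clustering/centroid idea.)"""
    pixels = set(pixels)
    clusters = []
    while pixels:
        seed = pixels.pop()
        stack, cluster = [seed], [seed]
        while stack:  # flood fill over the 8 neighbours of each hit
            x, y = stack.pop()
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    n = (x + dx, y + dy)
                    if n in pixels:
                        pixels.remove(n)
                        stack.append(n)
                        cluster.append(n)
        cx = sum(p[0] for p in cluster) / len(cluster)
        cy = sum(p[1] for p in cluster) / len(cluster)
        clusters.append(((cx, cy), len(cluster)))
    return clusters

hits = [(0, 0), (0, 1), (1, 1), (10, 10)]
print(sorted(cluster_pixels(hits)))  # two clusters: one of 3 pixels, one lone hit
```

    Propagating only centroids instead of raw hits is what reduces the data volume with minimal information loss.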

  15. An effective trust-based recommendation method using a novel graph clustering algorithm

    Science.gov (United States)

    Moradi, Parham; Ahmadian, Sajad; Akhlaghian, Fardin

    2015-10-01

Recommender systems are programs that aim to provide personalized recommendations to users for specific items (e.g. music, books) in online sharing communities or on e-commerce sites. Collaborative filtering methods are important and widely accepted types of recommender systems that generate recommendations based on the ratings of like-minded users. On the other hand, these systems confront several inherent issues such as data sparsity and cold-start problems, caused by too few ratings for the unknowns that need to be predicted. Incorporating trust information into collaborative filtering systems is an attractive approach to resolving these problems. In this paper, we present a model-based collaborative filtering method that applies a novel graph clustering algorithm and also considers trust statements. In the proposed method, the problem space is first represented as a graph, and a sparsest-subgraph finding algorithm is applied to the graph to find the initial cluster centers. Then, the proposed graph clustering algorithm is performed to obtain the appropriate user/item clusters. Finally, the identified clusters are used as a set of neighbors to recommend unseen items to the current active user. Experimental results based on three real-world datasets demonstrate that the proposed method outperforms several state-of-the-art recommender system methods.
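
    The final recommendation step, using a cluster as the active user's neighborhood, can be sketched as follows. The clusters are given directly here; deriving them via the trust-aware graph clustering is the paper's contribution and is not reproduced.

```python
def predict_rating(user, item, ratings, clusters):
    """Cluster-as-neighborhood prediction sketch: once users are grouped
    (here the clusters are simply given), predict an unseen rating as the
    mean rating of the active user's cluster members for that item."""
    cluster = next(c for c in clusters if user in c)
    votes = [ratings[u][item] for u in cluster
             if u != user and item in ratings.get(u, {})]
    if not votes:
        return None  # cold start: no cluster neighbour rated the item
    return sum(votes) / len(votes)

ratings = {
    'alice': {'book': 5},
    'bob':   {'book': 3, 'film': 4},
    'carol': {'film': 2},
}
clusters = [{'alice', 'bob'}, {'carol'}]
print(predict_rating('alice', 'film', ratings, clusters))  # only bob's 4 counts
```

    Weighting the votes by trust values, as the paper does, would replace the plain mean with a trust-weighted average.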

  16. Trilateration-based localization algorithm for ADS-B radar systems

    Science.gov (United States)

    Huang, Ming-Shih

Rapidly increasing growth and demand in various unmanned aerial vehicles (UAVs) have pushed governmental regulation development and numerous technology research advances toward integrating unmanned and manned aircraft into the same civil airspace. Safety of other airspace users is the primary concern; thus, with the introduction of UAVs into the National Airspace System (NAS), a key issue to overcome is the risk of a collision with manned aircraft. The challenge of UAV integration is global. As the automatic dependent surveillance-broadcast (ADS-B) system has gained wide acceptance, additional exploitations of the radioed satellite-based information are topics of current interest. One such opportunity is the augmentation of the communication ADS-B signal with a random bi-phase modulation for concurrent use as a radar signal for detecting other aircraft in the vicinity. This dissertation provides a detailed discussion of the ADS-B radar system, as well as the formulation and analysis of a suitable non-cooperative multi-target tracking method for the ADS-B radar system using radar ranging techniques and particle filter algorithms. In order to deal with specific challenges faced by the ADS-B radar system, several estimation algorithms are studied. Trilateration-based localization algorithms are proposed due to their easy implementation and their ability to work with coherent signal sources. The centroid of the three most closely spaced intersections of constant-range loci is conventionally used as the trilateration estimate without rigorous justification. In this dissertation, we address the quality of trilateration intersections through range scaling factors. A number of well-known triangle centers, including the centroid, incenter, Lemoine point (LP), and Fermat point (FP), are discussed in detail. To the author's best knowledge, LP was never associated with trilateration techniques. According to our study, LP is proposed as the best trilateration estimator thanks to the
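
    The conventional estimator the dissertation starts from, the centroid of the three most closely spaced intersections of the constant-range loci, can be sketched as follows (plain centroid only; the range scaling factors and the Lemoine-point refinement are not reproduced).

```python
import math
from itertools import product

def circle_intersections(c1, r1, c2, r2):
    """Intersection points of two circles (assumes they intersect)."""
    (x1, y1), (x2, y2) = c1, c2
    d = math.hypot(x2 - x1, y2 - y1)
    a = (d * d + r1 * r1 - r2 * r2) / (2 * d)
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))
    mx, my = x1 + a * (x2 - x1) / d, y1 + a * (y2 - y1) / d
    ox, oy = h * (y2 - y1) / d, h * (x2 - x1) / d
    return [(mx + ox, my - oy), (mx - ox, my + oy)]

def trilaterate_centroid(anchors, ranges):
    """Centroid-of-closest-intersections sketch: intersect the three
    constant-range loci pairwise, pick the triple of points (one per
    pair) with minimal spread, and return its centroid."""
    pairs = [(0, 1), (0, 2), (1, 2)]
    candidates = [circle_intersections(anchors[i], ranges[i],
                                       anchors[j], ranges[j])
                  for i, j in pairs]
    best, best_spread = None, float('inf')
    for triple in product(*candidates):
        cx = sum(p[0] for p in triple) / 3
        cy = sum(p[1] for p in triple) / 3
        spread = sum(math.hypot(p[0] - cx, p[1] - cy) for p in triple)
        if spread < best_spread:
            best, best_spread = (cx, cy), spread
    return best

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
target = (4.0, 3.0)
ranges = [math.hypot(target[0] - ax, target[1] - ay) for ax, ay in anchors]
x, y = trilaterate_centroid(anchors, ranges)
print(round(x, 6), round(y, 6))  # recovers (4.0, 3.0) with exact ranges
```

    With noisy ranges the three loci no longer meet in a point, which is exactly where the choice of triangle center (centroid vs. Lemoine point) starts to matter.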

  17. Algorithm improvement program nuclide identification algorithm scoring criteria and scoring application.

    Energy Technology Data Exchange (ETDEWEB)

    Enghauser, Michael [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-02-01

    The goal of the Domestic Nuclear Detection Office (DNDO) Algorithm Improvement Program (AIP) is to facilitate gamma-radiation detector nuclide identification algorithm development, improvement, and validation. Accordingly, scoring criteria have been developed to objectively assess the performance of nuclide identification algorithms. In addition, a Microsoft Excel spreadsheet application for automated nuclide identification scoring has been developed. This report provides an overview of the equations, nuclide weighting factors, nuclide equivalencies, and configuration weighting factors used by the application for scoring nuclide identification algorithm performance. Furthermore, this report presents a general overview of the nuclide identification algorithm scoring application including illustrative examples.

  18. Sustainability and acceptance - new challenges for nuclear energy

    International Nuclear Information System (INIS)

    Lensa, W. von

    2001-01-01

    This paper discusses the concept of sustainability in relation to acceptance of nuclear energy. Acceptance is viewed in terms of public acceptance, industrial acceptance, and internal acceptance/consensus within the nuclear community. It addresses sustainability criteria, the need for innovation, and the different levels of acceptability. The mechanisms of risk perception are discussed along with the technological consequences from risk perception mechanisms leading to specific objections against nuclear energy. (author)

  19. Nuclear Energy and Public Acceptance

    International Nuclear Information System (INIS)

    Daifuku, K.

    2002-01-01

The continued use of nuclear power in the European Union and elsewhere requires an adequate level of public and political acceptance. A lack of acceptance is often mistakenly cited as a reason for the slowdown in nuclear power plant construction in Western Europe and as a justification for abandoning nuclear power. In fact, the reasons for the slowdown have more to do with the following two factors: plentiful supplies of low-priced natural gas, making gas-fired power plants a more attractive investment choice; and more than adequate supplies of electricity, which have curbed the need for the construction of new plants of any kind. In general, moves towards a withdrawal from nuclear in certain Community countries have been due to party political pressures and have not been a response to public opposition to nuclear. In addition, opinion polls do not show widespread public opposition to the use of nuclear power. Figures consistently indicate that the use of nuclear power does not come high on the list of most people's main worries. Their main concerns focus on other issues such as crime and financial problems. In the main, electricity is taken for granted in the industrialised world. Electric power only becomes an issue when there is a threat of shortages. So if public acceptance is not the main obstacle, what is? Political acceptance is an integral part of the process in which nuclear becomes acceptable or not. The relationship between public and political acceptance, the role of the industry in this context, and how to foster a better trialogue will be examined. (author)

  20. Algorithm of Particle Data Association for SLAM Based on Improved Ant Algorithm

    Directory of Open Access Journals (Sweden)

    KeKe Gen

    2015-01-01

Full Text Available The article considers the problem of data association for simultaneous localization and mapping (SLAM) in determining the route of unmanned aerial vehicles (UAVs). Currently, such vehicles are already widely used, but they are mainly controlled by a remote operator. An urgent task is to develop a control system that allows for autonomous flight. The SLAM algorithm, which allows prediction of the location, speed, flight parameters, and the coordinates of landmarks and obstacles in an unknown environment, is one of the key technologies for achieving truly autonomous UAV flight. The aim of this work is to study the possibility of solving this problem by using an improved ant algorithm. The data association for the SLAM algorithm is meant to establish a matching between the set of observed landmarks and the landmarks in the state vector. The ant algorithm is a widely used optimization algorithm with positive feedback and the ability to search in parallel, so it is suitable for solving the data association problem for SLAM. But the traditional ant algorithm easily falls into a local optimum while searching for routes. Random perturbations are therefore added when updating the global pheromone to avoid local optima. Setting pheromone limits on the route can increase the search space with a reasonable amount of computation for finding the optimal route. The paper proposes an algorithm for local data association for SLAM based on an improved ant algorithm. To increase the speed of calculation, local data association is used instead of global data association. The first stage of the algorithm defines targets in the matching space and the observed landmarks that can be associated, by the criterion of individual compatibility (IC). The second stage determines the matched landmarks and their coordinates using the improved ant algorithm. Simulation results confirm the efficiency and

  1. Acceptability of GM foods among Pakistani consumers

    Science.gov (United States)

    Ali, Akhter; Rahut, Dil Bahadur; Imtiaz, Muhammad

    2016-01-01

In Pakistan, the majority of consumers do not have information about genetically modified (GM) foods. In developing countries, and particularly in Pakistan, few studies have focused on consumers' acceptance of GM foods. Using a comprehensive primary dataset collected from 320 consumers in 2013 in Pakistan, this study analyzes the determinants of consumers' acceptance of GM foods. The data were analyzed by employing the bivariate probit model and censored least absolute deviation (CLAD) models. The empirical results indicated that urban consumers are more aware of GM foods than rural consumers. Acceptance of GM foods was higher among female consumers than among male consumers. In addition, older consumers were more willing to accept GM foods than young consumers. The acceptability of GM foods was also higher among wealthier households. Low price is the key factor leading to the acceptability of GM foods. The acceptability of GM foods also reduces the risks among Pakistani consumers. PMID:27494790

  2. Acceptability of GM foods among Pakistani consumers.

    Science.gov (United States)

    Ali, Akhter; Rahut, Dil Bahadur; Imtiaz, Muhammad

    2016-04-02

In Pakistan, the majority of consumers do not have information about genetically modified (GM) foods. In developing countries, and particularly in Pakistan, few studies have focused on consumers' acceptance of GM foods. Using a comprehensive primary dataset collected from 320 consumers in 2013 in Pakistan, this study analyzes the determinants of consumers' acceptance of GM foods. The data were analyzed by employing the bivariate probit model and censored least absolute deviation (CLAD) models. The empirical results indicated that urban consumers are more aware of GM foods than rural consumers. Acceptance of GM foods was higher among female consumers than among male consumers. In addition, older consumers were more willing to accept GM foods than young consumers. The acceptability of GM foods was also higher among wealthier households. Low price is the key factor leading to the acceptability of GM foods. The acceptability of GM foods also reduces the risks among Pakistani consumers.

  3. Consumer Acceptance of Novel Foods

    NARCIS (Netherlands)

    Fischer, A.R.H.; Reinders, M.J.

    2016-01-01

    The success of novel foods depends to a considerable extent on whether consumers accept those innovations. This chapter provides an overview of current knowledge relevant to consumer acceptance of innovations in food. A broad range of theories and approaches to assess consumer response to

  4. Impact of an Acceptance Facilitating Intervention on Patients' Acceptance of Internet-based Pain Interventions: A Randomized Controlled Trial.

    Science.gov (United States)

    Baumeister, Harald; Seifferth, Holger; Lin, Jiaxi; Nowoczin, Lisa; Lüking, Marianne; Ebert, David

    2015-06-01

Results from clinical trials indicate that Internet-based psychological pain interventions are effective in treating chronic pain. However, little is known about patients' acceptance of these programs and how to positively influence patients' intention to engage in them. Therefore, the present study aimed (1) to assess patients' acceptance of Internet-based interventions, and (2) to examine whether patients' acceptance can be increased by an acceptance facilitating intervention. A total of 104 patients with chronic pain from 2 pain units were randomly allocated to an intervention group (IG) and a no-intervention control group (CG). The IG was shown a short informational video about Internet-based psychological pain interventions before receiving a questionnaire on patients' acceptance of Internet-based psychological pain interventions and predictors of acceptance (performance expectancy, effort expectancy, social influence, facilitating conditions, Internet usage, and Internet anxiety). The CG filled out the questionnaire immediately. Patients' acceptance was measured with a 4-item scale (sum score ranging from 4 to 20). Baseline acceptance of Internet-based interventions was reported as low (sum score 4 to 9) by 53.8%, moderate (10 to 15) by 42.3%, and high (16 to 20) by 3.9% of the patients with chronic pain in the CG. The IG showed a significantly higher acceptance (M = 12.17, SD = 4.22) than the CG (M = 8.94, SD = 3.71), with a standardized mean difference of d = 0.81 (95% CI, 0.41-1.21). All predictor variables were significantly improved in the IG compared with the CG, except for Internet usage. Patients with chronic pain display a relatively low acceptance of Internet-based psychological pain interventions, which can be substantially increased by a short informational video.
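
    The scoring of the 4-item acceptance scale reduces to a simple categorization, sketched here with the cut-offs reported in the abstract.

```python
def acceptance_category(item_scores):
    """Categorize the 4-item acceptance scale used in the study:
    sum score 4-9 = low, 10-15 = moderate, 16-20 = high.
    Assumes each of the 4 items is scored 1-5."""
    total = sum(item_scores)
    if not 4 <= total <= 20:
        raise ValueError("4 items scored 1-5 give a sum between 4 and 20")
    if total <= 9:
        return 'low'
    if total <= 15:
        return 'moderate'
    return 'high'

print(acceptance_category([2, 2, 3, 2]))  # sum 9 -> 'low'
print(acceptance_category([3, 3, 3, 4]))  # sum 13 -> 'moderate'
```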

  5. The BR eigenvalue algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Geist, G.A. [Oak Ridge National Lab., TN (United States). Computer Science and Mathematics Div.; Howell, G.W. [Florida Inst. of Tech., Melbourne, FL (United States). Dept. of Applied Mathematics; Watkins, D.S. [Washington State Univ., Pullman, WA (United States). Dept. of Pure and Applied Mathematics

    1997-11-01

The BR algorithm, a new method for calculating the eigenvalues of an upper Hessenberg matrix, is introduced. It is a bulge-chasing algorithm like the QR algorithm, but, unlike the QR algorithm, it is well adapted to computing the eigenvalues of the narrowband, nearly tridiagonal matrices generated by the look-ahead Lanczos process. This paper describes the BR algorithm and gives numerical evidence that it works well in conjunction with the Lanczos process. On the biggest problems run so far, the BR algorithm beats the QR algorithm by a factor of 30 to 60 in computing time and a factor of over 100 in matrix storage space.

  6. Classical Methods and Calculation Algorithms for Determining Lime Requirements

    Directory of Open Access Journals (Sweden)

    André Guarçoni

Full Text Available The methods developed for determination of lime requirements (LR) are based on widely accepted principles. However, the formulas used for calculation have evolved little over recent decades, and in some cases there are indications of their inadequacy. The aim of this study was to compare the lime requirements calculated by three classic formulas and three algorithms, identifying those most appropriate for supplying Ca and Mg to coffee plants with the smallest possibility of causing overliming. The database used contained 600 soil samples collected in coffee plantings. The LR was estimated by the methods of base saturation, neutralization of Al3+, and elevation of Ca2+ and Mg2+ contents (two formulas), and by the three calculation algorithms. Averages of the lime requirements were compared, determining the frequency distribution of the 600 lime requirements (LR) estimated by each calculation method. In soils with low cation exchange capacity at pH 7, the base saturation method may fail to adequately supply the plants with Ca and Mg in many situations, while the methods of Al3+ neutralization and elevation of Ca2+ and Mg2+ contents can result in the calculation of application rates that will increase the pH above the suitable range. Among the methods studied for calculating lime requirements, the algorithm that predicts reaching a defined base saturation, with adequate Ca and Mg supply and the maximum application rate limited to the H+Al value, proved to be the most efficient calculation method, and it can be recommended for use under numerous crop conditions.
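
    A base-saturation lime-requirement calculation with the H+Al cap of the best-performing algorithm can be sketched as follows. The units and the PRNT reactivity correction follow a common Brazilian convention and are assumptions here; the study's exact algorithm may differ.

```python
def lime_requirement(cec, base_sat_current, base_sat_target, h_al,
                     prnt=100.0):
    """Base-saturation lime requirement sketch (units follow a common
    Brazilian convention: CEC at pH 7 (T) and H+Al in cmolc/dm3, base
    saturations V1/V2 in %, LR in t/ha of limestone; check against local
    recommendations before use):
        LR = (V2 - V1) * T / 100 * (100 / PRNT)
    capped so the applied rate never exceeds the H+Al value, as in the
    best-performing algorithm described above."""
    lr = (base_sat_target - base_sat_current) * cec / 100.0
    lr = max(lr, 0.0)            # no liming if saturation already adequate
    lr = min(lr, h_al)           # cap at H+Al to avoid overliming
    return lr * 100.0 / prnt     # correct for limestone reactivity

# T = 8 cmolc/dm3, raise V from 40 % to 60 %, H+Al = 4 cmolc/dm3
print(lime_requirement(cec=8.0, base_sat_current=40.0,
                       base_sat_target=60.0, h_al=4.0))  # 1.6 t/ha
```

    The cap is the safeguard the study highlights: without it, low-CEC or high-target scenarios can push the computed rate past what the soil's acidity can buffer.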

  7. Finite sample performance of the E-M algorithm for ranks data modelling

    Directory of Open Access Journals (Sweden)

    Angela D'Elia

    2007-10-01

Full Text Available We check the finite sample performance of the maximum likelihood estimators of the parameters of a mixture distribution recently introduced for modelling ranks/preference data. The estimates are derived by the E-M algorithm and the performance is evaluated from both univariate and bivariate points of view. While the results are generally acceptable as far as bias is concerned, the Monte Carlo experiment shows different behaviour of the estimators' efficiency for the two parameters of the mixture, mainly depending upon their location in the admissible parameter space. Some operative suggestions conclude the paper.
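
    An E-M iteration for a mixture of this kind can be sketched for a CUB-type model: a two-component mixture of a shifted binomial and a discrete uniform over ranks 1..m. Whether this is exactly the mixture evaluated in the paper is an assumption.

```python
import math

def em_cub(ranks, m, n_iter=200):
    """EM sketch for a two-component mixture of a shifted binomial and a
    discrete uniform over ranks 1..m (a CUB-type model; an assumption for
    illustration). Returns the mixture weight pi and the shifted-binomial
    parameter xi."""
    def shifted_binom(r, xi):
        return (math.comb(m - 1, r - 1)
                * xi ** (m - r) * (1 - xi) ** (r - 1))

    pi, xi = 0.5, 0.5
    for _ in range(n_iter):
        # E-step: posterior probability each rank came from the binomial
        post = []
        for r in ranks:
            b = pi * shifted_binom(r, xi)
            u = (1 - pi) / m
            post.append(b / (b + u))
        # M-step: closed-form updates of the two parameters
        pi = sum(post) / len(ranks)
        num = sum(p * (m - r) for p, r in zip(post, ranks))
        den = (m - 1) * sum(post)
        xi = min(max(num / den, 1e-6), 1 - 1e-6)
    return pi, xi

# Ranks concentrated near 1 plus a roughly uniform background
ranks = [1] * 40 + [2] * 30 + [3] * 10 + [4] * 5 + [5] * 5 + [6] * 5 + [7] * 5
pi, xi = em_cub(ranks, m=7)
print(round(pi, 2), round(xi, 2))
```

    The two estimators behave differently, as the abstract notes: xi is pinned down by the bulk of the ranks, while pi depends on separating the uniform background from the binomial tail.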

  8. A Nonmonotone Line Search Filter Algorithm for the System of Nonlinear Equations

    Directory of Open Access Journals (Sweden)

    Zhong Jin

    2012-01-01

Full Text Available We present a new iterative method based on the line search filter method with a nonmonotone strategy to solve systems of nonlinear equations. The equations are divided into two groups: some equations are treated as constraints and the others act as the objective function, and each group is updated only at the iterations where it is actually needed. We apply the nonmonotone idea to the sufficient reduction conditions and the filter technique, which leads to flexibility and acceptance behavior comparable to monotone methods. The new algorithm is shown to be globally convergent, and numerical experiments demonstrate its effectiveness.
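
    The nonmonotone sufficient-reduction idea can be illustrated with a Grippo-Lampariello-Lucidi-style backtracking line search, in which the Armijo test compares against the maximum of the last few objective values rather than the current one. This is a generic sketch, not the paper's filter algorithm.

```python
def nonmonotone_armijo(f, grad, x, d, history, c=1e-4, shrink=0.5,
                       max_backtracks=30):
    """Nonmonotone Armijo backtracking sketch: accept a step when
    f(x + alpha*d) <= max(recent f-values) + c*alpha*grad'd, which gives
    the acceptance flexibility the abstract describes. `history` is the
    list of recent f-values; `d` must be a descent direction."""
    f_ref = max(history)                               # nonmonotone reference
    slope = sum(g * di for g, di in zip(grad, d))      # directional derivative
    alpha = 1.0
    for _ in range(max_backtracks):
        trial = [xi + alpha * di for xi, di in zip(x, d)]
        if f(trial) <= f_ref + c * alpha * slope:
            return alpha, trial
        alpha *= shrink
    return alpha, trial

# Minimize f(x, y) = x^2 + y^2 by steepest descent + nonmonotone search
f = lambda v: v[0] ** 2 + v[1] ** 2
x = [3.0, 4.0]
history = [f(x)]
for _ in range(20):
    g = [2 * x[0], 2 * x[1]]
    d = [-gi for gi in g]
    _, x = nonmonotone_armijo(f, g, x, d, history[-5:])
    history.append(f(x))
print(f(x) < 1e-6)  # converged near the origin
```

    A monotone search would use `history[-1]` as the reference; widening the window lets occasional increases through without losing global convergence.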

[Management of acute low back pain without trauma - an algorithm].

    Science.gov (United States)

    Melcher, Carolin; Wegener, Bernd; Jansson, Volkmar; Mutschler, Wolf; Kanz, Karl-Georg; Birkenmaier, Christof

    2018-05-14

Low back pain is a common problem for primary care providers, outpatient clinics and A&E departments. The predominant symptoms are those of so-called "unspecific back pain", but serious pathologies can be concealed by the clinical signs. Less experienced colleagues in particular have problems in treating these patients because, despite the multitude of recommendations and guidelines, there is no generally accepted algorithm. After a literature search (Medline/Cochrane), 158 articles were selected from 15,000 papers and classified according to their level of evidence. These were reconciled with the clinical guidelines of the orthopaedic and pain-physician associations in Europe, North America and overseas, and with the experience of specialists at LMU Munich, in order to achieve consistency with literature recommendations as well as feasibility in everyday clinical work, and were optimised for practical relevance. An algorithm was formed to provide the crucial differential diagnoses of lumbar back pain according to their clinical relevance and to provide a plan of action offering reasonable diagnostic and therapeutic steps. As a consequence of distinct binary decisions, low back pain patients should be treated at any given time according to the guidelines, with emergencies detected, unnecessary diagnostic testing and interventions averted, and reasonable treatment initiated according to the underlying pathology. In the context of the available evidence, a clinical algorithm has been developed that translates the complex diagnostic testing of acute low back pain into a transparent, structured and systematic guideline.

  10. Computer vision algorithm for diabetic foot injury identification and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Castaneda M, C. L.; Solis S, L. O.; Martinez B, M. R.; Ortiz R, J. M.; Garza V, I.; Martinez F, M.; Castaneda M, R.; Vega C, H. R., E-mail: lsolis@uaz.edu.mx [Universidad Autonoma de Zacatecas, 98000 Zacatecas, Zac. (Mexico)

    2016-10-15

Diabetic foot is one of the most devastating consequences of diabetes. It is relevant because of its incidence and the elevated percentage of amputations and deaths that the disease implies. Given that the existing tests and laboratory procedures designed to diagnose it are limited and expensive, the most common evaluation is still based on signs and symptoms. This means that the specialist completes a questionnaire based solely on observation and an invasive wound measurement. Using the questionnaire, the physician issues a diagnosis. In this sense, the diagnosis relies only on the criteria and experience of the specialist. For some variables, such as the lesion area or location, this dependency is not acceptable. Bio-engineering currently plays a key role in the diagnosis of different chronic degenerative diseases, and a timely diagnosis has proven to be the best tool against diabetic foot. The clinical evaluation of the diabetic foot increases the possibility of identifying risks and further complications. The main goal of this paper is to present the development of an algorithm based on digital image processing techniques which makes it possible to optimize the results of diabetic foot lesion evaluation. Using advanced techniques for object segmentation and adjusting the sensitivity parameter allows correlation between the wounds identified by the algorithm and those observed by the physician. Using the developed algorithm it is possible to identify and assess the wounds, their size, and their location in a non-invasive way. (Author)

  11. Computer vision algorithm for diabetic foot injury identification and evaluation

    International Nuclear Information System (INIS)

    Castaneda M, C. L.; Solis S, L. O.; Martinez B, M. R.; Ortiz R, J. M.; Garza V, I.; Martinez F, M.; Castaneda M, R.; Vega C, H. R.

    2016-10-01

Diabetic foot is one of the most devastating consequences of diabetes. It is relevant because of its incidence and the elevated percentage of amputations and deaths that the disease implies. Given that the existing tests and laboratory procedures designed to diagnose it are limited and expensive, the most common evaluation is still based on signs and symptoms. This means that the specialist completes a questionnaire based solely on observation and an invasive wound measurement. Using the questionnaire, the physician issues a diagnosis. In this sense, the diagnosis relies only on the criteria and experience of the specialist. For some variables, such as the lesion area or location, this dependency is not acceptable. Bio-engineering currently plays a key role in the diagnosis of different chronic degenerative diseases, and a timely diagnosis has proven to be the best tool against diabetic foot. The clinical evaluation of the diabetic foot increases the possibility of identifying risks and further complications. The main goal of this paper is to present the development of an algorithm based on digital image processing techniques which makes it possible to optimize the results of diabetic foot lesion evaluation. Using advanced techniques for object segmentation and adjusting the sensitivity parameter allows correlation between the wounds identified by the algorithm and those observed by the physician. Using the developed algorithm it is possible to identify and assess the wounds, their size, and their location in a non-invasive way. (Author)

  12. Evaluation of the Acceptance of Audience Response System by Corporations Using the Technology Acceptance Model

    Science.gov (United States)

    Chu, Hsing-Hui; Lu, Ta-Jung; Wann, Jong-Wen

    The purpose of this research is to explore enterprises' acceptance of the Audience Response System (ARS) using the Technology Acceptance Model (TAM). The findings show that (1) IT characteristics and facilitating conditions can serve as external variables of TAM; (2) the degree of E-business has a significant positive correlation with employees' behavioral intention; (3) TAM is a good model to predict and explain IT acceptance; and (4) demographic variables and industry and firm characteristics have no significant correlation with ARS acceptance. The results provide useful information to managers and ARS providers: (1) ARS providers should focus more on creating different usages to enhance interactivity and employees' intention to use; (2) managers should pay attention to building sound internal facilitating conditions for introducing IT; (3) managers should set up strategic stages of introducing IT according to the degree of E-business; and (4) providers should increase product promotion and also leverage academia and government to promote ARS.

  13. Geometric approximation algorithms

    CERN Document Server

    Har-Peled, Sariel

    2011-01-01

    Exact algorithms for dealing with geometric objects are complicated, hard to implement in practice, and slow. Over the last 20 years a theory of geometric approximation algorithms has emerged. These algorithms tend to be simple, fast, and more robust than their exact counterparts. This book is the first to cover geometric approximation algorithms in detail. In addition, more traditional computational geometry techniques that are widely used in developing such algorithms, like sampling, linear programming, etc., are also surveyed. Other topics covered include approximate nearest-neighbor search, shape approximation, coresets, dimension reduction, and embeddings. The topics covered are relatively independent and are supplemented by exercises. Close to 200 color figures are included in the text to illustrate proofs and ideas.

  14. Joint-2D-SL0 Algorithm for Joint Sparse Matrix Reconstruction

    Directory of Open Access Journals (Sweden)

    Dong Zhang

    2017-01-01

    Full Text Available Sparse matrix reconstruction has wide applications, such as DOA estimation and STAP. However, its performance is usually restricted by the grid mismatch problem. In this paper, we revise the sparse matrix reconstruction model and propose a joint sparse matrix reconstruction model based on a first-order Taylor expansion, which overcomes the grid mismatch problem. We then put forward the Joint-2D-SL0 algorithm, which solves the joint sparse matrix reconstruction problem efficiently. Compared with the Kronecker compressive sensing method, the proposed method has higher computational efficiency and acceptable reconstruction accuracy. Finally, simulation results validate the superiority of the proposed method.

  15. Secondary Structure Prediction of Protein using Resilient Back Propagation Learning Algorithm

    Directory of Open Access Journals (Sweden)

    Jyotshna Dongardive

    2015-12-01

    Full Text Available The paper proposes a neural network based approach to predict the secondary structure of proteins. It uses a Multilayer Feed Forward Network (MLFN) with resilient back propagation as the learning algorithm. Point Accepted Mutation (PAM) is adopted as the encoding scheme, and the CB396 data set is used for training and testing the network. The overall accuracy of the network has been experimentally calculated with different window sizes for the sliding window scheme and by varying the number of units in the hidden layer. The best results were obtained with eleven as the window size and seven as the number of units in the hidden layer.
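    The sliding-window scheme the abstract tunes (best window size: eleven) can be sketched as follows: each residue is predicted from a fixed-length window of its neighbors, with the sequence ends padded. The padding character and the example sequence are illustrative assumptions, not taken from the paper.

```python
def windows(seq, size=11, pad='X'):
    """Slide a centered window of odd length `size` over `seq`,
    padding both ends with `pad` so every residue gets a full window."""
    assert size % 2 == 1, "window must be centered on a residue"
    half = size // 2
    padded = pad * half + seq + pad * half
    return [padded[i:i + size] for i in range(len(seq))]

# One window per residue; each window would be encoded (e.g. via PAM
# scores) and fed to the network to predict that residue's structure.
for w in windows("MKTAYIAK", size=5):
    print(w)
```

    The first window for the toy sequence is "XXMKT": the leading residues see mostly padding, which is why the window size is a tunable trade-off between context and noise.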

  16. Algorithms in Singular

    Directory of Open Access Journals (Sweden)

    Hans Schonemann

    1996-12-01

    Full Text Available Some algorithms for singularity theory and algebraic geometry. The use of Grobner basis computations for treating systems of polynomial equations has become an important tool in many areas. This paper introduces the concept of standard bases (a generalization of Grobner bases) and its application to some problems from algebraic geometry. The examples are presented as SINGULAR commands. A general introduction to Grobner bases can be found in the textbook [CLO], an introduction to syzygies in [E] and [St1]. SINGULAR is a computer algebra system for computing information about singularities, for use in algebraic geometry. The basic algorithms in SINGULAR are several variants of a general standard basis algorithm for general monomial orderings (see [GG]). This includes well-orderings (Buchberger algorithm [B1], [B2]) and tangent cone orderings (Mora algorithm [M1], [MPT]) as special cases: it is able to work with non-homogeneous and homogeneous input and also to compute in the localization of the polynomial ring at 0. Recent versions include algorithms to factorize polynomials and a factorizing Grobner basis algorithm. For a complete description of SINGULAR see [Si].

  17. Multimodal optimization by using hybrid of artificial bee colony algorithm and BFGS algorithm

    Science.gov (United States)

    Anam, S.

    2017-10-01

    Optimization has become one of the important fields in mathematics. Many problems in engineering and science can be formulated as optimization problems, and they may have many local optima. The multimodal optimization problem, i.e., an optimization problem with many local optima, is the problem of finding the global solution. Several metaheuristic methods have been proposed to solve multimodal optimization problems, such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA), and the Artificial Bee Colony (ABC) algorithm. The performance of the ABC algorithm is better than or similar to that of other population-based algorithms, with the advantage of employing fewer control parameters. The ABC algorithm also has the advantages of strong robustness, fast convergence and high flexibility. However, it has the disadvantage of premature convergence in the later search period, and the accuracy of the optimal value sometimes cannot meet the requirements. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is a good iterative method for finding a local optimum; compared with other local optimization methods, the BFGS algorithm is better. Based on the advantages of the ABC algorithm and the BFGS algorithm, this paper proposes a hybrid of the two to solve the multimodal optimization problem. In the first step, the ABC algorithm is run to find a point. In the second step, the point obtained in the first step is used as the initial point of the BFGS algorithm. The results show that the hybrid method can overcome the problems of the basic ABC algorithm for almost all test functions. However, if the shape of the function is flat, the proposed method cannot work well.
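    The two-step hybrid can be sketched as follows. This is a minimal illustration, not the paper's implementation: the global phase keeps only the employed-bee step of ABC, and plain finite-difference gradient descent stands in for BFGS (a full implementation would substitute a proper BFGS routine). The test function and all parameters are assumptions.

```python
import math
import random

random.seed(1)

def f(x):
    # 2-D Rastrigin: many local minima, global minimum 0 at the origin.
    return sum(xi * xi - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def abc_search(f, dim=2, bees=20, iters=200, lo=-5.0, hi=5.0):
    """Heavily simplified employed-bee phase of ABC: perturb each food
    source toward/away from a random partner and keep the better point."""
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(bees)]
    for _ in range(iters):
        for i, x in enumerate(pop):
            partner = random.choice(pop)
            d = random.randrange(dim)
            cand = x[:]
            cand[d] += random.uniform(-1, 1) * (x[d] - partner[d])
            cand[d] = min(hi, max(lo, cand[d]))
            if f(cand) < f(x):
                pop[i] = cand
    return min(pop, key=f)

def local_refine(f, x, lr=0.002, steps=400, h=1e-6):
    """Finite-difference gradient descent, standing in for BFGS."""
    x = x[:]
    for _ in range(steps):
        g = []
        for d in range(len(x)):
            xp = x[:]
            xp[d] += h
            g.append((f(xp) - f(x)) / h)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

best = abc_search(f)          # step 1: global search
refined = local_refine(f, best)  # step 2: local polish from that point
print(f(best), "->", f(refined))
```

    The division of labor matches the abstract: the population phase locates a promising basin, and the local method then sharpens the answer inside it.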

  18. Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Surafel Luleseged Tilahun

    2012-01-01

    Full Text Available The firefly algorithm is one of the new metaheuristic algorithms for optimization problems. The algorithm is inspired by the flashing behavior of fireflies. In the algorithm, randomly generated solutions are considered fireflies, and brightness is assigned depending on their performance on the objective function. One of the rules used to construct the algorithm is that a firefly will be attracted to a brighter firefly, and if there is no brighter firefly, it will move randomly. In this paper we modify this random movement of the brightest firefly by generating random directions in order to determine the direction in which the brightness increases; if no such direction is generated, it remains in its current position. Furthermore, the assignment of attractiveness is modified in such a way that the effect of the objective function is magnified. The simulation results show that the modified firefly algorithm performs better than the standard one in finding the best solution with smaller CPU time.
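    The standard attraction rule the paper modifies can be sketched minimally: every firefly moves toward each brighter one with strength beta0 * exp(-gamma * r^2) plus a small random kick. This is the textbook baseline, not the paper's modified variant; the objective and parameters are illustrative assumptions.

```python
import math
import random

random.seed(0)

def sphere(x):
    return sum(xi * xi for xi in x)

def firefly(f, dim=2, n=15, iters=100, alpha=0.2, beta0=1.0, gamma=1.0):
    """Minimal standard firefly algorithm (minimization: lower f = brighter).

    Note: in this baseline the brightest firefly never moves; the paper's
    modification replaces its random walk with sampled directions.
    """
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f(pop[j]) < f(pop[i]):  # j is brighter than i
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * random.uniform(-0.5, 0.5)
                              for a, b in zip(pop[i], pop[j])]
    return min(pop, key=f)

best = firefly(sphere)
print(best, sphere(best))
```

    Because the brightest firefly is frozen here, progress near the optimum stalls at the noise level set by alpha; sampling candidate directions for it, as the paper proposes, addresses exactly that weakness.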

  19. A fuzzy logic algorithm to assign confidence levels to heart and respiratory rate time series

    International Nuclear Information System (INIS)

    Liu, J; McKenna, T M; Gribok, A; Reifman, J; Beidleman, B A; Tharion, W J

    2008-01-01

    We have developed a fuzzy logic-based algorithm to qualify the reliability of heart rate (HR) and respiratory rate (RR) vital-sign time-series data by assigning a confidence level to the data points while they are measured as a continuous data stream. The algorithm's membership functions are derived from physiology-based performance limits and mass-assignment-based data-driven characteristics of the signals. The assigned confidence levels are based on the reliability of each HR and RR measurement as well as the relationship between them. The algorithm was tested on HR and RR data collected from subjects undertaking a range of physical activities, and it showed acceptable performance in detecting four types of faults that result in low-confidence data points (receiver operating characteristic areas under the curve ranged from 0.67 (SD 0.04) to 0.83 (SD 0.03), mean and standard deviation (SD) over all faults). The algorithm is sensitive to noise in the raw HR and RR data and will flag many data points as low confidence if the data are noisy; prior processing of the data to reduce noise allows identification of only the most substantial faults. Depending on how HR and RR data are processed, the algorithm can be applied as a tool to evaluate sensor performance or to qualify HR and RR time-series data in terms of their reliability before use in automated decision-assist systems.
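    The idea of physiology-based membership functions can be sketched with triangular memberships: each vital sign gets a plausibility in [0, 1], and a toy confidence score combines them with a fuzzy AND (minimum). The breakpoints below are invented for illustration and are not the paper's limits.

```python
def tri(x, a, b, c):
    """Triangular membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def confidence(hr, rr):
    """Toy confidence score: min of the two plausibility memberships.

    The breakpoints (40/70/180 bpm, 6/16/40 breaths/min) are hypothetical,
    not the paper's physiology-based performance limits.
    """
    m_hr = tri(hr, 40, 70, 180)
    m_rr = tri(rr, 6, 16, 40)
    return min(m_hr, m_rr)

print(confidence(70, 16))   # fully plausible pair -> 1.0
print(confidence(70, 50))   # implausible RR -> 0.0
```

    A streaming implementation would evaluate such memberships per sample and also include the HR/RR relationship terms the abstract mentions.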

  20. Acceptability, acceptance and decision making

    International Nuclear Information System (INIS)

    Ackerschott, H.

    2002-01-01

    There is a fundamental difference between the acceptability of a civilizatory or societal risk and the acceptability of the decision-making process that leads to a civilizatory or societal risk. The analysis of individual risk decisions - regarding who executes which indisputably hazardous, unhealthy or dangerous behaviour, when, and under which circumstances - is not helpful in finding solutions for the political decisions at hand in Germany concerning nuclear energy in particular or energy in general. The debt of implementation for any technology, in the sense of making the technology a success in terms of broad acceptance and general utilisation, lies with the particular industry involved. Regardless of the technology, innovation research identifies the implementation phase as the most critical to the success of any innovation. In this sense, nuclear technology is at best still an innovation, because its implementation has not yet been completed. Fear of and opposition to innovation are ubiquitous. Even the economy - which is often described as 'rational' - is full of this resistance. Innovation has an impact on the pivotal point between stability, the presupposition for the successful execution of decisions already taken, and instability, which includes insecurity but is also necessary for the success of further development. By definition, innovations are beyond our sphere of experience, not yet at the level of reliability and trust; yet they are evaluated via the simplifying decision-making heuristics that have proven not only necessary and useful, but also accurate in the familiar. The 'settlement of the debt of implementation', the accompanying communication, the decision-making procedures concerning the regulation of adverse effects of the technology, and also the tailoring of the new technology or service itself must be directed to appropriate target groups. But the group often aimed at in the nuclear debate, the group, which largely determines political

  1. An automated land-use mapping comparison of the Bayesian maximum likelihood and linear discriminant analysis algorithms

    Science.gov (United States)

    Tom, C. H.; Miller, L. D.

    1984-01-01

    The Bayesian maximum likelihood parametric classifier has been tested against the data-based formulation designated 'linear discriminant analysis', using the 'GLIKE' decision and 'CLASSIFY' classification algorithms in the Landsat Mapping System. Identical supervised training sets, USGS land use/land cover classes, and various combinations of Landsat image and ancillary geodata variables were used to compare the algorithms' thematic mapping accuracy on a single-date summer subscene, with a cellularized USGS land use map of the same time frame furnishing the ground truth reference. CLASSIFY, which accepts a priori class probabilities, is found to be more accurate than GLIKE, which assumes equal class occurrences, for all three mapping variable sets and both levels of detail. These results may be generalized to direct accuracy, time, cost, and flexibility advantages of linear discriminant analysis over Bayesian methods.
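    The difference between the two decision rules (equal class occurrences vs. a priori class probabilities) can be shown with a one-dimensional Gaussian sketch: the same pixel value can be assigned to different classes depending on the priors. Class names, means, and priors below are invented for illustration, not taken from the study.

```python
import math

def gauss(x, mu, sigma):
    """Gaussian likelihood of observation x under N(mu, sigma^2)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def classify(x, classes, priors=None):
    """Pick the class maximizing prior * likelihood.

    With priors=None, all classes are weighted equally (a GLIKE-style
    equal-occurrence assumption); supplying priors mimics CLASSIFY.
    """
    if priors is None:
        priors = {k: 1.0 for k in classes}
    return max(classes, key=lambda k: priors[k] * gauss(x, *classes[k]))

classes = {"water": (10.0, 2.0), "urban": (13.0, 2.0)}  # (mean, std) per class
x = 11.6  # slightly closer to the "urban" mean
print(classify(x, classes))                                       # urban
print(classify(x, classes, priors={"water": 0.9, "urban": 0.1}))  # water
```

    With equal weighting the nearer mean wins, but a strong "water" prior flips the decision: exactly the mechanism that lets a prior-aware classifier track the true class frequencies in a scene.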

  2. A Proposed Algorithm for Improved Recognition and Treatment of the Depression/Anxiety Spectrum in Primary Care

    Science.gov (United States)

    Ballenger, James C.; Davidson, Jonathan R. T.; Lecrubier, Yves; Nutt, David J.

    2001-01-01

    The International Consensus Group on Depression and Anxiety has held 7 meetings over the last 3 years that focused on depression and specific anxiety disorders. During the course of the meeting series, a number of common themes have developed. At the last meeting of the Consensus Group, we reviewed these areas of commonality across the spectrum of depression and anxiety disorders. With the aim of improving the recognition and management of depression and anxiety in the primary care setting, we developed an algorithm that is presented in this article. We attempted to balance currently available scientific knowledge about the treatment of these disorders and to reformat it to provide an acceptable algorithm that meets the practical aspects of recognizing and treating these disorders in primary care. PMID:15014615

  3. OTM Machine Acceptance: In the Arab Culture

    Science.gov (United States)

    Rashed, Abdullah; Santos, Henrique

    Basically, neglecting the human factor is one of the main reasons for system failures or for technology rejection, even when important technologies are considered. Biometrics mostly have the characteristics needed for effortless acceptance, such as ease of use and usefulness, which are essential pillars of acceptance models such as TAM (technology acceptance model). However, this should be investigated. Many studies have been carried out to research the issues of technology acceptance in different cultures, especially western culture; Arabic culture lacks these types of studies, with few publications in this field. This paper introduces a new biometric interface for ATM machines. This interface depends on a promising biometric: odour. To discover the acceptance of this biometric, we distributed a questionnaire via a web site, called for participation in the Arab area, and found that most respondents would accept to use odour.

  4. High-speed scanning: an improved algorithm

    Science.gov (United States)

    Nachimuthu, A.; Hoang, Khoi

    1995-10-01

    In using machine vision for assessing an object's surface quality, many images are required to be processed in order to separate the good areas from the defective ones. Examples can be found in the leather hide grading process; in the inspection of garments/canvas on the production line; in the nesting of irregular shapes into a given surface... . The most common method of subtracting the sum of defective areas from the total area does not give an acceptable indication of how much of the 'good' area can be used, particularly if the findings are to be used for the nesting of irregular shapes. This paper presents an image scanning technique which enables the estimation of useable areas within an inspected surface in terms of the user's definition, not the supplier's claims. That is, how much area the user can actually use, not the total good area as the supplier estimated. An important application of the developed technique is in the leather industry, where the tanner (the supplier) and the footwear manufacturer (the user) are constantly locked in argument over disputed quality standards of finished leather hide, which disrupts production schedules and wastes costs in re-grading, re-sorting... . The basic algorithm developed for area scanning of a digital image will be presented. The implementation of an improved scanning algorithm will be discussed in detail. The improved features include Boolean OR operations and many other innovative functions which aim at optimizing the scanning process in terms of computing time and the accurate estimation of useable areas.

  5. Network-Oblivious Algorithms

    DEFF Research Database (Denmark)

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino

    2016-01-01

    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network......-oblivious algorithm be specified on a parallel model of computation where the only parameter is the problem’s input size, and then evaluated on a model with two parameters, capturing parallelism granularity and communication latency. It is shown that for a wide class of network-oblivious algorithms, optimality...... of cache hierarchies, to the realm of parallel computation. Its effectiveness is illustrated by providing optimal network-oblivious algorithms for a number of key problems. Some limitations of the oblivious approach are also discussed....

  6. A novel hybrid algorithm of GSA with Kepler algorithm for numerical optimization

    Directory of Open Access Journals (Sweden)

    Soroor Sarafrazi

    2015-07-01

    Full Text Available It is now well recognized that pure algorithms can be promisingly improved by hybridization with other techniques. One of the relatively new metaheuristic algorithms is the Gravitational Search Algorithm (GSA), which is based on Newton's laws. In this paper, to enhance the performance of GSA, a novel algorithm called "Kepler", inspired by astrophysics, is introduced. The Kepler algorithm is based on the principle of Kepler's first law. The hybridization of GSA and the Kepler algorithm is an efficient approach to provide much stronger specialization in intensification and/or diversification. The performance of GSA–Kepler is evaluated by applying it to 14 benchmark functions with 20–1000 dimensions and to the optimal approximation of a linear system as a practical optimization problem. The results obtained reveal that the proposed hybrid algorithm is robust enough to optimize the benchmark functions and practical optimization problems.

  7. Acceptance conditions in automated negotiation

    NARCIS (Netherlands)

    Baarslag, T.; Hindriks, K.V.; Jonker, C.M.

    2011-01-01

    In every negotiation with a deadline, one of the negotiating parties has to accept an offer to avoid a break off. A break off is usually an undesirable outcome for both parties, therefore it is important that a negotiator employs a proficient mechanism to decide under which conditions to accept.

  8. 12 CFR 7.1007 - Acceptances.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Acceptances. 7.1007 Section 7.1007 Banks and Banking COMPTROLLER OF THE CURRENCY, DEPARTMENT OF THE TREASURY BANK ACTIVITIES AND OPERATIONS Bank Powers... financing credit transactions. Bankers' acceptances may be used for such purpose, since the making of...

  9. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    Science.gov (United States)

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

    Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from electronic medical records (EMRs), which provide an opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by the Mayo Clinic Study of Aging. Second, they were hospitalized in the intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in the intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. The EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.
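    The validation metrics quoted above (94.3% sensitive, 93.0% specific) come from a standard confusion-matrix calculation, which can be sketched directly. The counts below are invented for illustration; they are not the study's data.

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts chosen only to reproduce similar percentages:
sensitivity, specificity = sens_spec(tp=33, fn=2, tn=93, fp=7)
print(round(sensitivity, 3), round(specificity, 3))  # 0.943 0.93
```

    Comparing algorithms then reduces to comparing these two numbers on the same validation cohort, which is how the EMR search was benchmarked against ICD codes.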

  10. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer. This book discusses algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  11. Relevant cost information for order acceptance decisions

    NARCIS (Netherlands)

    Wouters, M.J.F.

    1997-01-01

    Some economic considerations for order acceptance decisions are discussed. The relevant economic considerations for order acceptance are widely discussed in the literature: only those costs are relevant which would be avoidable by not accepting the order, i.e., incremental costs plus opportunity costs.

  12. DEVELOPMENT OF A NEW ALGORITHM FOR KEY AND S-BOX GENERATION IN BLOWFISH ALGORITHM

    Directory of Open Access Journals (Sweden)

    TAYSEER S. ATIA

    2014-08-01

    Full Text Available The Blowfish algorithm is a block cipher: a strong, simple algorithm used to encrypt data in blocks of 64 bits. The key and S-box generation process in this algorithm requires time and memory space, which makes the algorithm inconvenient for use in smart cards or in applications that require changing the secret key frequently. In this paper a new key and S-box generation process was developed based on the Self-Synchronizing Stream Cipher (SSS) algorithm, whose key generation process was modified to be used with the Blowfish algorithm. Test results show that the generation process requires relatively little time and a reasonably low amount of memory, which enhances the algorithm and gives it the possibility of different usages.
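    The general idea of deriving a key-dependent S-box from a keystream can be sketched as follows. This is only an illustration of the concept: Python's seeded PRNG stands in for the paper's SSS keystream, and the shuffle-based construction is an assumption, not the paper's procedure.

```python
import random

def keyed_sbox(key: bytes):
    """Derive a bijective 8-bit S-box from `key` by shuffling 0..255
    with a key-seeded stream (Fisher-Yates). random.Random is only a
    stand-in for a real stream cipher keystream and is NOT secure."""
    rng = random.Random(key)            # keystream stand-in, seeded by the key
    sbox = list(range(256))
    for i in range(255, 0, -1):         # Fisher-Yates shuffle
        j = rng.randrange(i + 1)
        sbox[i], sbox[j] = sbox[j], sbox[i]
    return sbox

s = keyed_sbox(b"secret key")
assert sorted(s) == list(range(256))    # still a permutation of 0..255
print(s[:8])
```

    Because the whole table is derived from the key on demand, rekeying costs one shuffle instead of the many encryption passes Blowfish's standard key schedule requires, which is the trade-off the abstract targets.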

  13. Development of algorithm for depreciation costs allocation in dynamic input-output industrial enterprise model

    Directory of Open Access Journals (Sweden)

    Keller Alevtina

    2017-01-01

    Full Text Available The article considers the issue of allocating depreciation costs in the dynamic input-output model of an industrial enterprise. Accounting for depreciation costs in such a model improves the policy of fixed assets management. It is particularly relevant to develop an algorithm for the allocation of depreciation costs in the construction of a dynamic input-output model of an industrial enterprise, since such enterprises have a significant amount of fixed assets. Meeting the adequacy requirements of such an algorithm makes it possible to evaluate the appropriateness of investments in fixed assets and to study the final financial results of an industrial enterprise depending on management decisions in the depreciation policy. It is necessary to note that the model in question is always degenerate for the enterprise. This is caused by the presence of zero rows in the matrix of capital expenditures in the lines of structural elements unable to generate fixed assets (part of the service units, households, corporate consumers). The paper presents the algorithm for the allocation of depreciation costs for the model. This algorithm was developed by the authors and served as the basis for further development of the flowchart for subsequent implementation in software. The construction of such an algorithm and its use for dynamic input-output models of industrial enterprises is motivated by the international acceptance of the effectiveness of input-output models for national and regional economic systems. This is what allows us to consider that the solutions discussed in the article are of interest to economists of various industrial enterprises.

  14. Monte Carlo algorithms with absorbing Markov chains: Fast local algorithms for slow dynamics

    International Nuclear Information System (INIS)

    Novotny, M.A.

    1995-01-01

    A class of Monte Carlo algorithms which incorporate absorbing Markov chains is presented. In a particular limit, the lowest order of these algorithms reduces to the n-fold way algorithm. These algorithms are applied to study the escape from the metastable state in the two-dimensional square-lattice nearest-neighbor Ising ferromagnet in an unfavorable applied field, and the agreement with theoretical predictions is very good. It is demonstrated that the higher-order algorithms can be many orders of magnitude faster than either the traditional Monte Carlo or n-fold way algorithms.
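    The rejection-free step at the heart of the n-fold way can be sketched generically: pick an event with probability proportional to its rate and advance the clock by an exponential waiting time with the total rate. This is the textbook kinetic Monte Carlo step, not the paper's higher-order absorbing-Markov-chain algorithms; the rate values are illustrative.

```python
import math
import random

def kmc_step(rates, rng):
    """One rejection-free step: choose event i with probability
    rates[i] / R, and advance time by Exp(R) where R = sum(rates)."""
    R = sum(rates)
    u = rng.random() * R
    acc = 0.0
    for i, r in enumerate(rates):
        acc += r
        if u < acc:
            break
    dt = -math.log(1.0 - rng.random()) / R  # exponential waiting time
    return i, dt

rng = random.Random(42)
rates = [0.01, 0.01, 5.0]   # two slow events, one fast event
picks = [kmc_step(rates, rng)[0] for _ in range(1000)]
print(picks.count(2) / 1000)  # close to 5.0 / 5.02
```

    Unlike naive Metropolis sampling, no proposal is ever rejected, which is exactly why such methods stay efficient in the slow-dynamics regime the abstract studies.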

  15. VDLLA: A virtual daddy-long legs optimization

    Science.gov (United States)

    Yaakub, Abdul Razak; Ghathwan, Khalil I.

    2016-08-01

    Swarm intelligence is a strong optimization paradigm based on the biological behavior of insects or animals. The success of any optimization algorithm depends on the balance between exploration and exploitation. In this paper, we present a new swarm intelligence algorithm based on the daddy long-legs spider (VDLLA), a new optimization algorithm with virtual behavior. In VDLLA, each agent (spider) has nine positions, which represent the legs of the spider, and each position represents one solution. The proposed VDLLA is tested on four standard functions using average fitness, median fitness and standard deviation. The results of the proposed VDLLA have been compared against Particle Swarm Optimization (PSO), Differential Evolution (DE) and the Bat Inspired Algorithm (BA). Additionally, a t-test has been conducted to show the significant difference between our proposed algorithm and the other algorithms. VDLLA showed very promising results on benchmark test functions for unconstrained optimization problems and also significantly improved on the original swarm algorithms.

  16. Dynamic route guidance algorithm based on artificial immune system

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the performance of K-shortest-paths search in intelligent traffic guidance systems, this paper proposes an optimal search algorithm based on intelligent optimization search theory and the memory mechanism of vertebrate immune systems. This algorithm, applied to the urban traffic network model established by the node-expanding method, can conveniently realize K-shortest-paths search in urban traffic guidance systems. Because of the immune memory and global parallel search ability of artificial immune systems, the K shortest paths can be found without any repeats, which clearly indicates the superiority of the algorithm over conventional ones. Not only does it provide better parallelism, the algorithm also prevents the premature convergence that often occurs in genetic algorithms. Thus, it is especially suitable for the real-time requirements of traffic guidance systems and other engineering optimization applications. A case study verifies the efficiency and practicability of the algorithm.
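    The K-shortest-paths problem the abstract targets can be sketched with a simple best-first enumeration of loopless paths (this is a generic baseline, not the paper's immune algorithm; the toy road network is an assumption):

```python
import heapq

def k_shortest_paths(graph, src, dst, k):
    """Enumerate the k cheapest loopless paths from src to dst.

    graph: {node: [(neighbor, weight), ...]} with positive weights.
    Best-first search over partial paths; simple but exponential in the
    worst case (Yen's algorithm is the scalable alternative).
    """
    heap = [(0, [src])]          # (cost so far, path so far)
    found = []
    while heap and len(found) < k:
        cost, path = heapq.heappop(heap)
        node = path[-1]
        if node == dst:
            found.append((cost, path))   # popped in nondecreasing cost order
            continue
        for nbr, w in graph.get(node, []):
            if nbr not in path:          # keep paths loopless
                heapq.heappush(heap, (cost + w, path + [nbr]))
    return found

g = {"A": [("B", 1), ("C", 4)],
     "B": [("C", 1), ("D", 5)],
     "C": [("D", 1)]}
for cost, path in k_shortest_paths(g, "A", "D", 3):
    print(cost, path)
```

    On the toy network this returns the three alternatives A-B-C-D (cost 3), A-C-D (cost 5) and A-B-D (cost 6), the kind of ranked route set a guidance system offers drivers.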

  17. Consumers' acceptance of medicinal herbs: An application of the technology acceptance model (TAM).

    Science.gov (United States)

    Jokar, Nargesh Khatun; Noorhosseini, Seyyed Ali; Allahyari, Mohammad Sadegh; Damalas, Christos A

    2017-07-31

    The shift in consumers' preferences from synthetic to 'natural' products has led to a resurgence of interest in medicinal plants, particularly in developing countries. However, research data about consumers' preferences for particular products is hard to find. The main objective of this study was to contribute to the general understanding of consumers' intention for selecting medicinal herbs for consumption. Factors underpinning consumers' acceptance of medicinal herbs were studied with the technology acceptance model (TAM) in Rasht City of Iran using a structured questionnaire. Most respondents had low to moderate familiarity with consumption of medicinal herbs. However, about half of the respondents (47.5%) showed a high level of acceptance of medicinal herbs. Herbs like spearmint (Mentha spicata L.), spinach (Spinacia oleracea L.), basil (Ocimum basilicum L.), Damask rose (Rosa × damascena Herrm.), saffron (Crocus sativus L.), cinnamon (Cinnamomum verum J.Presl), flixweed [Descurainia sophia (L.) Webb ex Prantl], red feathers (Echium amoenum Fisch. & C.A.Mey.), and green tea [Camellia sinensis (L.) Kuntze] had the highest consumption rate among the majority (over 75%) of citizens of Rasht. The highest rate of perceived usefulness of medicinal herbs was related to their perceived role in healing diseases. The variable of importance of use of medicinal herbs had the strongest direct effect and the variables of perceived usefulness and attitude towards use had the second and third strongest direct effect on the acceptance of medicinal herbs' use at p acceptance of medicinal herbs and may serve as a benchmark for future research and evaluation concerning the use of medicinal herbs over time. For plant producers, more effective and targeted crop development should be encouraged, whereas for retailers better marketing and delivery strategies should be sought. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  18. Evaluating and Improving Automatic Sleep Spindle Detection by Using Multi-Objective Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Min-Yin Liu

    2017-05-01

    Full Text Available Sleep spindles are brief bursts of brain activity in the sigma frequency range (11–16 Hz) measured by electroencephalography (EEG), mostly during non-rapid eye movement (NREM) stage 2 sleep. These oscillations are of great biological and clinical interest because they potentially play an important role in identifying and characterizing the processes of various neurological disorders. Conventionally, sleep spindles are identified by expert sleep clinicians via visual inspection of EEG signals. The process is laborious and the results are inconsistent among different experts. To resolve the problem, numerous computerized methods have been developed to automate the process of sleep spindle identification. Still, the performance of these automated sleep spindle detection methods varies inconsistently from study to study. There are two reasons: (1) the lack of common benchmark databases, and (2) the lack of commonly accepted evaluation metrics. In this study, we focus on tackling the second problem by proposing to evaluate the performance of a spindle detector in a multi-objective optimization context, and we hypothesize that using the resultant Pareto fronts for deriving evaluation metrics will improve automatic sleep spindle detection. We use a popular multi-objective evolutionary algorithm (MOEA), the Strength Pareto Evolutionary Algorithm (SPEA2), to optimize six existing frequency-based sleep spindle detection algorithms. They include three Fourier-based, one continuous wavelet transform (CWT)-based, and two Hilbert-Huang transform (HHT)-based algorithms. We also explore three hybrid approaches. Trained and tested on the open-access DREAMS and MASS databases, two new hybrid methods combining Fourier with HHT algorithms show significant performance improvement, with F1-scores of 0.726–0.737.
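    Evaluating a detector "in a multi-objective optimization context" means keeping only its non-dominated operating points. The core Pareto-front filter can be sketched as follows; the (sensitivity, precision) pairs are hypothetical scores, not results from the paper.

```python
def pareto_front(points):
    """Return the non-dominated points when every objective is maximized.

    A point p is dominated if some other point q is >= p in all
    objectives (and q differs from p, hence strictly better somewhere).
    """
    front = []
    for p in points:
        dominated = any(all(o >= s for o, s in zip(q, p)) and q != p
                        for q in points)
        if not dominated:
            front.append(p)
    return front

# Hypothetical (sensitivity, precision) scores for candidate detector settings:
scores = [(0.90, 0.60), (0.80, 0.80), (0.70, 0.75), (0.60, 0.90)]
print(pareto_front(scores))  # (0.70, 0.75) is dominated by (0.80, 0.80)
```

    An MOEA such as SPEA2 searches detector parameters to push this front outward; summary metrics like F1 can then be derived from points on the front rather than from a single arbitrary operating point.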

  19. Implementation of Human Trafficking Education and Treatment Algorithm in the Emergency Department.

    Science.gov (United States)

    Egyud, Amber; Stephens, Kimberly; Swanson-Bierman, Brenda; DiCuccio, Marge; Whiteman, Kimberly

    2017-11-01

    Health care professionals have not been successful in recognizing or rescuing victims of human trafficking. The purpose of this project was to implement a screening system and treatment algorithm in the emergency department to improve the identification and rescue of victims of human trafficking. The lack of recognition by health care professionals is related to inadequate education and training tools and confusion with other forms of violence such as trauma and sexual assault. A multidisciplinary team was formed to assess the evidence related to human trafficking and make recommendations for practice. After receiving education, staff completed a survey about knowledge gained from the training. An algorithm for identification and treatment of sex trafficking victims was implemented and included a 2-pronged identification approach: (1) medical red flags created by a risk-assessment tool embedded in the electronic health record and (2) a silent notification process. Outcome measures were the number of victims who were identified either by the medical red flags or by silent notification and were offered and accepted intervention. Survey results indicated that 75% of participants reported that the education improved their competence level. The results demonstrated that an education and treatment algorithm may be an effective strategy to improve recognition. One patient was identified as an actual victim of human trafficking; the remaining patients reported other forms of abuse. Education and a treatment algorithm were effective strategies to improve recognition and rescue of human trafficking victims and increase identification of other forms of abuse. Copyright © 2017 Emergency Nurses Association. Published by Elsevier Inc. All rights reserved.

  20. Cone penetrometer acceptance test report

    Energy Technology Data Exchange (ETDEWEB)

    Boechler, G.N.

    1996-09-19

    This Acceptance Test Report (ATR) documents the results of acceptance test procedure WHC-SD-WM-ATR-151. Included in this report are a summary of the tests, the results and issues, the signature and sign-off ATP pages, and a table mapping each specification requirement to the ATP section that satisfied it.

  1. Understanding Retailers’ Acceptance of Virtual Stores

    OpenAIRE

    Irene Y.L. Chen

    2010-01-01

    The acceptance of e-commerce among consumers has stimulated the rise of virtual stores. A growing number of traditional retailers, and people who lack sufficient capital to maintain a brick-and-mortar store, have considered using virtual stores to reach the global market. In the e-commerce literature, there is rich research evidence concerning consumers' acceptance of virtual stores. However, rigorous academic research on retailers' acceptance of virtual stores is relatively scarce today. Th...

  2. A parallel row-based algorithm with error control for standard-cell replacement on a hypercube multiprocessor

    Science.gov (United States)

    Sargent, Jeff Scott

    1988-01-01

    A new row-based parallel algorithm for standard-cell placement targeted for execution on a hypercube multiprocessor is presented. Key features of this implementation include a dynamic simulated-annealing schedule, row-partitioning of the VLSI chip image, and two novel approaches to controlling error in parallel cell-placement algorithms: Heuristic Cell-Coloring and Adaptive (Parallel Move) Sequence Control. Heuristic Cell-Coloring identifies sets of noninteracting cells that can be moved repeatedly, and in parallel, with no buildup of error in the placement cost. Adaptive Sequence Control allows multiple parallel cell moves to take place between global cell-position updates. This feedback mechanism is based on an error bound derived analytically from the traditional annealing move-acceptance profile. Placement results are presented for real industry circuits, and the performance of an implementation on the Intel iPSC/2 Hypercube is summarized. The runtime of this algorithm is 5 to 16 times faster than a previous program developed for the Hypercube, while producing placement of equivalent quality. An integrated place-and-route program for the Intel iPSC/2 Hypercube is currently being developed.
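The "traditional annealing move-acceptance profile" the error bound is derived from is the standard Metropolis criterion. A minimal sketch follows; the cost deltas and temperature are illustrative, not the paper's placement cost model:

```python
import math
import random

def accept_move(delta_cost, temperature, rng=random.random):
    """Metropolis acceptance rule used in simulated annealing:
    always accept improvements, and accept cost increases with
    probability exp(-delta_cost / temperature)."""
    if delta_cost <= 0:
        return True
    return rng() < math.exp(-delta_cost / temperature)

# Improvements are always taken; uphill moves depend on the draw.
print(accept_move(-2.0, 1.0))                    # True
print(accept_move(0.5, 1.0, rng=lambda: 0.99))   # False: 0.99 >= exp(-0.5)
```

As the temperature is lowered over the schedule, uphill moves become rarer, which is what drives the placement toward a low-cost configuration.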

  3. Numeric algorithms for parallel processors computer architectures with applications to the few-groups neutron diffusion equations

    International Nuclear Information System (INIS)

    Zee, S.K.

    1987-01-01

    A numeric algorithm and an associated computer code were developed for the rapid solution of the finite-difference representation of the few-group neutron-diffusion equations on parallel computers. Applications of the numeric algorithm on both SIMD (vector pipeline) and MIMD/SIMD (multi-CPU/vector pipeline) architectures were explored. The algorithm was successfully implemented in the two-group, 3-D neutron diffusion computer code named DIFPAR3D (DIFfusion PARallel 3-Dimension). Numerical-solution techniques used in the code include the Chebyshev polynomial acceleration technique in conjunction with the power method of outer iteration. For inner iterations, a parallel form of red-black (cyclic) line SOR is incorporated, with automated determination of group-dependent relaxation factors and the iteration numbers required to achieve a specified inner-iteration error tolerance. The code employs a macroscopic depletion model with trace capability for selected fission products' transients and critical boron. In addition, moderator and fuel temperature feedback models are incorporated into the DIFPAR3D code for realistic simulation of power reactor cores. The physics models used were proven acceptable in separate benchmarking studies.
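Red-black SOR of the kind named here alternates updates over two interleaved sets of points, so all points of one color can be updated in parallel. A minimal 1-D point-wise sketch (the grid size, relaxation factor, and model problem are illustrative assumptions; this is not DIFPAR3D's line solver):

```python
def red_black_sor(u, f, h, omega, sweeps):
    """In-place SOR for -u'' = f on a uniform grid with fixed endpoint
    values, sweeping red (odd-indexed) then black (even-indexed)
    interior points each iteration."""
    n = len(u)
    for _ in range(sweeps):
        for start in (1, 2):                 # red pass, then black pass
            for i in range(start, n - 1, 2):
                gs = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i])  # Gauss-Seidel value
                u[i] = (1.0 - omega) * u[i] + omega * gs          # over-relax
    return u

# Laplace problem (f = 0) with boundary values 0 and 1 relaxes to a linear ramp.
u = [0.0, 0.0, 0.0, 0.0, 1.0]
red_black_sor(u, [0.0] * 5, h=0.25, omega=1.5, sweeps=200)
print(u)  # approximately [0.0, 0.25, 0.5, 0.75, 1.0]
```

Because the red points depend only on black neighbors and vice versa, each color pass parallelizes cleanly, which is the property the record exploits on vector and multi-CPU hardware.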

  4. Hamiltonian Algorithm Sound Synthesis

    OpenAIRE

    大矢, 健一

    2013-01-01

    The Hamiltonian Algorithm (HA) is an algorithm for searching for solutions to optimization problems. This paper introduces a sound synthesis technique using the Hamiltonian Algorithm and shows a simple example. "Hamiltonian Algorithm Sound Synthesis" uses the phase transition effect in HA. Because of this transition effect, totally new waveforms are produced.

  5. Perceptions of acceptable conducts by university students.

    Science.gov (United States)

    Marques, Dora Nazaré; Macedo, António Filipe

    2016-01-01

    To determine perceptions of acceptable conducts amongst under and postgraduate optometry students and to compare them with students from other disciplines. Students (under/postgraduate) of optometry (n=156) and other courses (n=54) from the University of Minho participated in a voluntary online questionnaire about perception of conducts, classifying 15 academic or professional scenarios as acceptable or unacceptable. 210 questionnaires were analyzed. Differences in perceptions were found between optometry under and postgraduates in scenario 5, chi-square(2,156)=4.3, p=0.038, and scenario 7, chi-square(2,156)=7.0, p=0.008 (both with cheating more acceptable for postgrads). Differences between under and postgraduates from other courses were found in scenario 9 (taking supplies from the classroom more acceptable for undergrads), chi-square(1,54)=5.0, p=0.025, and scenario 14 (forging a signature more acceptable for postgrads), chi-square(1,54)=3.9, p=0.046. Differences between optometry and other courses' undergraduates were observed in scenario 2 (plagiarism more acceptable for optometry undergrads), chi-square(1,154)=8.3, p=0.004, and scenario 9 (taking supplies from the classroom more acceptable for other undergrads), chi-square(1,54)=7.8, p=0.005. Differences between optometry and other courses' postgraduates were observed in scenario 7, chi-square(1,56)=5.8, p=0.016, scenario 10 (both with cheating more acceptable for optometry postgrads), chi-square(1,54)=8.1, p=0.004, and scenario 14 (forging a signature more acceptable for other postgrads), chi-square(1,54)=6.1, p=0.026. Academic misconducts were mainly considered more acceptable than professional misconducts. Our results show that perceptions of acceptable conducts amongst optometry students are not very different from those of other students and, against our initial prediction, do not show a general change in misconduct perception as students become more mature. Universities should pay more attention to this problem and take

  6. 1991 Acceptance priority ranking

    International Nuclear Information System (INIS)

    1991-12-01

    The Standard Contract for Disposal of Spent Nuclear Fuel and/or High-Level Radioactive Waste (10 CFR Part 961) that the Department of Energy (DOE) has executed with the owners and generators of civilian spent nuclear fuel requires annual publication of the Acceptance Priority Ranking (APR). The 1991 APR details the order in which DOE will allocate Federal waste acceptance capacity. As required by the Standard Contract, the ranking is based on the age of permanently discharged spent nuclear fuel (SNF), with the owners of the oldest SNF, on an industry-wide basis, given the highest priority. The 1991 APR will be the basis for the annual allocation of waste acceptance capacity to the Purchasers in the 1991 Annual Capacity Report (ACR), to be issued later this year. This document is based on SNF discharges as of December 31, 1990, and reflects Purchaser comments and corrections, as appropriate, to the draft APR issued on May 15, 1991.

  7. Measuring Technology Acceptance Level of Turkish Pre-Service English Teachers by Using Technology Acceptance Model

    Science.gov (United States)

    Kirmizi, Özkan

    2014-01-01

    The aim of this study is to investigate the technology acceptance of prospective English teachers by using the Technology Acceptance Model (TAM) in the Turkish context. The study is based on the Structural Equation Model (SEM). The participants of the study were from the English Language Teaching Departments of Hacettepe, Gazi and Baskent Universities. The participants…

  8. Identification of permit and waste acceptance criteria provisions requiring modification for acceptance of commercial mixed waste

    International Nuclear Information System (INIS)

    1994-03-01

    In October 1990, representatives of States and compact regions requested that the US Department of Energy (DOE) explore an agreement with host States and compact regions under which DOE would accept commercial mixed low-level radioactive waste (LLW) at DOE's own treatment and disposal facilities. A program for DOE management of commercial mixed waste is made potentially more attractive in light of the low commercial mixed waste volumes, high regulatory burdens, public opposition to new disposal sites, and relatively high cost of constructing commercial disposal facilities. Several studies were identified as essential in determining the feasibility of DOE accepting commercial mixed waste for disposal. The purpose of this report is to identify any current or proposed waste acceptance criteria (WAC) or Resource Conservation and Recovery Act (RCRA) provisions that would have to be modified for commercial mixed waste acceptance at specified DOE facilities. Following the introduction, Section 2 of this report (a) provides a background summary of existing and proposed mixed waste disposal facilities at each DOE site, and (b) summarizes the status of any RCRA Part B permit and WAC provisions relating to the disposal of mixed waste, including provisions relating to acceptance of offsite waste. Section 3 provides overall conclusions regarding the current status and permit modifications that must be implemented in order to grant DOE sites authority under their permits to accept commercial mixed waste for disposal. Section 4 contains a list of references

  9. Modified Clipped LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Lotfizad Mojtaba

    2005-01-01

    Full Text Available Abstract A new algorithm is proposed for updating the weights of an adaptive filter. The proposed algorithm is a modification of an existing method, namely, the clipped LMS, and uses a three-level quantization scheme that involves the threshold clipping of the input signals in the filter weight update formula. Mathematical analysis shows the convergence of the filter weights to the optimum Wiener filter weights. Also, it can be proved that the proposed modified clipped LMS (MCLMS) algorithm has better tracking than the LMS algorithm. In addition, this algorithm has reduced computational complexity relative to the unmodified one. By using a suitable threshold, it is possible to increase the tracking capability of the MCLMS algorithm compared to the LMS algorithm, but this causes slower convergence. Computer simulations confirm the mathematical analysis presented.
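The general form of a clipped-LMS update, in which the gradient term uses a three-level quantized input while the error is still computed from the true signal, can be sketched as follows. The threshold and filter length are illustrative assumptions, not the paper's parameters:

```python
def clip3(x, threshold):
    """Quantize x to -1, 0, or +1 using a clipping threshold (dead zone)."""
    if x > threshold:
        return 1.0
    if x < -threshold:
        return -1.0
    return 0.0

def mclms_update(w, x, d, mu, threshold):
    """One weight update of a clipped-LMS filter: the error e uses the
    true input, but the update term multiplies e by the quantized input,
    replacing per-tap multiplications with sign operations."""
    y = sum(wi * xi for wi, xi in zip(w, x))   # filter output
    e = d - y                                   # estimation error
    w_new = [wi + mu * e * clip3(xi, threshold) for wi, xi in zip(w, x)]
    return w_new, e

w, e = mclms_update([0.0, 0.0], [1.0, -2.0], d=1.0, mu=0.1, threshold=0.5)
print(w, e)  # [0.1, -0.1] 1.0
```

Taps whose input magnitude falls inside the dead zone are left untouched, which is where the reduced computational complexity comes from.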

  10. Patient acceptance of awake craniotomy.

    Science.gov (United States)

    Wrede, Karsten H; Stieglitz, Lennart H; Fiferna, Antje; Karst, Matthias; Gerganov, Venelin M; Samii, Madjid; von Gösseln, Hans-Henning; Lüdemann, Wolf O

    2011-12-01

    The aim of this study was to objectively assess patients' acceptance of awake craniotomy in a group of neurosurgical patients who underwent this procedure for removal of lesions in or close to eloquent brain areas. Patients' acceptance of awake craniotomy under local anesthesia and conscious sedation was assessed by a formal questionnaire (PPP33), initially developed for general surgery patients. The results are compared to a group of patients who had brain surgery under general anesthesia and to previously published data. The awake craniotomy (AC) group consisted of 37 male and 9 female patients (48 craniotomies) with ages ranging from 18 to 71 years. The general anesthesia (GA) group consisted of 26 male and 15 female patients (43 craniotomies) with ages ranging from 26 to 83 years. All patients in the study were included in the questionnaire analysis. In comparison to GA, the overall PPP33 score for AC was higher (p=0.07), suggesting better overall acceptance of AC. The subscale scores for AC were also significantly better than those for GA on the two subscales "postoperative pain" (p=0.02) and "physical disorders" (p=0.01), and equal on the other 6 subscales. The results of the overall mean score and the subscale scores of the PPP33 questionnaire verify good patients' acceptance of AC. Previous studies have shown good patients' acceptance of awake craniotomy, but only rarely using formal approaches. By utilizing a formal questionnaire we could verify good patient acceptance of awake craniotomy for the treatment of brain tumors in or close to eloquent areas. This is a novel approach that substantiates previously published experiences. Copyright © 2011 Elsevier B.V. All rights reserved.

  11. Algorithms as fetish: Faith and possibility in algorithmic work

    Directory of Open Access Journals (Sweden)

    Suzanne L Thomas

    2018-01-01

    Full Text Available Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the ordinary, say snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home. Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims betrays not the “magic” of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too slippery or too disconcerting to directly acknowledge. Drawing primarily on 2016 ethnographic research with computer vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their production. By analyzing algorithms through the lens of fetishism, we can see the social and economic investment in some people’s labor over others. We also see everyday opportunities for social creativity and change. We conclude that what is problematic about algorithms is not their fetishization but instead their stabilization into full-fledged gods and demons – the more deserving objects of critique.

  12. Pregnancy planning and acceptance among Danish pregnant women

    DEFF Research Database (Denmark)

    Rasch, V; Knudsen, L B; Wielandt, H

    2001-01-01

    OBJECTIVE: To study how living conditions influence pregnancy planning and acceptance among Danish women. METHOD: A cross-sectional questionnaire study performed among 3516 pregnant women attending Odense University Hospital, Denmark. The study population consisted of women with spontaneous...... abortion, women with ectopic pregnancies, women attending antenatal care and women with induced abortion. They were divided into four groups: women with planned and accepted pregnancies (accepting planners, n=2137), women who accepted an initially unplanned pregnancy (accepting non-planners, n=1006), women...... who rejected an initially planned pregnancy (rejecting planners, n=31), and women with unplanned and rejected pregnancies (rejecting non-planners, n=342). The association between socio-economic characteristics and pregnancy planning and acceptance was evaluated by comparing accepting non...

  13. Worldwide nuclear revival and acceptance

    International Nuclear Information System (INIS)

    Geraets, Luc H.; Crommelynck, Yves A.

    2009-01-01

    The current status and trends of the nuclear revival in Europe and abroad are outlined. The development of public opinion in the last decade is playing an important part. This has turned from clear rejection to careful acceptance. Transparency and open communication will be important aspects in the further development of nuclear acceptance. (orig.)

  14. A kidney offer acceptance decision tool to inform the decision to accept an offer or wait for a better kidney.

    Science.gov (United States)

    Wey, Andrew; Salkowski, Nicholas; Kremers, Walter K; Schaffhausen, Cory R; Kasiske, Bertram L; Israni, Ajay K; Snyder, Jon J

    2018-04-01

    We developed a kidney offer acceptance decision tool to predict the probability of graft survival and patient survival for first-time kidney-alone candidates after an offer is accepted or declined, and we characterized the effect of restricting the donor pool with a maximum acceptable kidney donor profile index (KDPI). For accepted offers, Cox proportional hazards models estimated these probabilities using transplanted kidneys. For declined offers, these probabilities were estimated by considering the experience of similar candidates who declined offers and the probability that declining would lead to these outcomes. We randomly selected 5000 declined offers and estimated these probabilities 3 years post-offer had the offers been accepted or declined. Predicted outcomes for declined offers were well calibrated. Had the offers been accepted, the probabilities of graft survival and patient survival were typically higher. However, these advantages attenuated or disappeared with higher KDPI, candidate priority, and local donor supply. Donor pool restrictions were associated with worse 3-year outcomes, especially for candidates with high allocation priority. The kidney offer acceptance decision tool could inform offer acceptance by characterizing the potential risk-benefit trade-off associated with accepting or declining an offer. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.

  15. Quick fuzzy backpropagation algorithm.

    Science.gov (United States)

    Nikov, A; Stoeva, S

    2001-03-01

    A modification of the fuzzy backpropagation (FBP) algorithm called the QuickFBP algorithm is proposed, in which the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP and FBP algorithms are defined and proved for: (1) single-output neural networks in the case of training patterns with different targets; and (2) multiple-output neural networks in the case of training patterns with an equivalued target vector. They support the automation of the weights training process (quasi-unsupervised learning), establishing the target value(s) depending on the network's input values. In these cases the simulation results confirm the convergence of both algorithms. An example with a large-sized neural network illustrates the significantly greater training speed of the QuickFBP compared to the FBP algorithm. The adaptation of an interactive web system to users on the basis of the QuickFBP algorithm is presented. Since the QuickFBP algorithm ensures quasi-unsupervised learning, this implies its broad applicability in adaptive and adaptable interactive systems, data mining, and similar applications.

  16. Parallel Implementation and Scaling of an Adaptive Mesh Discrete Ordinates Algorithm for Transport

    International Nuclear Information System (INIS)

    Howell, L H

    2004-01-01

    Block-structured adaptive mesh refinement (AMR) uses a mesh structure built up out of locally-uniform rectangular grids. In the BoxLib parallel framework used by the Raptor code, each processor operates on one or more of these grids at each refinement level. The decomposition of the mesh into grids and the distribution of these grids among processors may change every few timesteps as a calculation proceeds. Finer grids use smaller timesteps than coarser grids, requiring additional work to keep the system synchronized and ensure conservation between different refinement levels. In a paper for NECDC 2002 I presented preliminary results on implementation of parallel transport sweeps on the AMR mesh, conjugate gradient acceleration, accuracy of the AMR solution, and scalar speedup of the AMR algorithm compared to a uniform fully-refined mesh. This paper continues with a more in-depth examination of the parallel scaling properties of the scheme, both in single-level and multi-level calculations. Both sweeping and setup costs are considered. The algorithm scales with acceptable performance to several hundred processors. Trends suggest, however, that this is the limit for efficient calculations with traditional transport sweeps, and that modifications to the sweep algorithm will be increasingly needed as job sizes in the thousands of processors become common

  17. Influence Processes for Information Technology Acceptance

    DEFF Research Database (Denmark)

    Bhattacherjee, Anol; Sanford, Clive Carlton

    2006-01-01

    This study examines how processes of external influence shape information technology acceptance among potential users, how such influence effects vary across a user population, and whether these effects are persistent over time. Drawing on the elaboration-likelihood model (ELM), we compared two...... alternative influence processes, the central and peripheral routes, in motivating IT acceptance. These processes were respectively operationalized using the argument quality and source credibility constructs, and linked to perceived usefulness and attitude, the core perceptual drivers of IT acceptance. We...... further examined how these influence processes were moderated by users' IT expertise and perceived job relevance and the temporal stability of such influence effects. Nine hypotheses thus developed were empirically validated using a field survey of document management system acceptance at an eastern...

  18. A New Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Medha Gupta

    2016-07-01

    Full Text Available Nature-inspired meta-heuristic algorithms study the emergent collective intelligence of groups of simple agents. The Firefly Algorithm is one such recent swarm-based metaheuristic, inspired by the flashing behavior of fireflies. The algorithm was first proposed in 2008 and has since been successfully used for solving various optimization problems. In this work, we propose a new modified version of the Firefly Algorithm (MoFA) and compare its performance with the standard firefly algorithm and various other meta-heuristic algorithms. Numerical studies and results demonstrate that the proposed algorithm is superior to existing algorithms.
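The baseline firefly movement rule that variants such as MoFA build on can be sketched in a few lines. The abstract does not specify the modification itself, so only the standard update is shown, with illustrative parameter values:

```python
import math
import random

def move_firefly(xi, xj, beta0, gamma, alpha, rng=random.random):
    """Move firefly i toward a brighter firefly j:
    x_i <- x_i + beta0 * exp(-gamma * r^2) * (x_j - x_i) + alpha * (rand - 0.5),
    where r is the distance between the two fireflies."""
    r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
    beta = beta0 * math.exp(-gamma * r2)   # attractiveness decays with distance
    return [a + beta * (b - a) + alpha * (rng() - 0.5) for a, b in zip(xi, xj)]

# With no randomness (alpha=0) and no decay (gamma=0), firefly i lands on j.
print(move_firefly([0.0, 0.0], [1.0, 1.0], beta0=1.0, gamma=0.0, alpha=0.0))
```

In a full run, every firefly compares its brightness (objective value) against all others each generation and applies this move toward each brighter neighbor.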

  19. Quantum Computation and Algorithms

    International Nuclear Information System (INIS)

    Biham, O.; Biron, D.; Biham, E.; Grassi, M.; Lidar, D.A.

    1999-01-01

    It is now firmly established that quantum algorithms provide a substantial speedup over classical algorithms for a variety of problems, including the factorization of large numbers and the search for a marked element in an unsorted database. In this talk I will review the principles of quantum algorithms, the basic quantum gates and their operation. The combination of superposition and interference, that makes these algorithms efficient, will be discussed. In particular, Grover's search algorithm will be presented as an example. I will show that the time evolution of the amplitudes in Grover's algorithm can be found exactly using recursion equations, for any initial amplitude distribution
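The amplitude recursion mentioned for Grover's algorithm can be followed exactly in a few lines. For N items with a single marked element and a uniform initial superposition, the standard recursion for the marked amplitude k and unmarked amplitude l is sketched below (the function name is illustrative):

```python
def grover_amplitudes(n_items, iterations):
    """Track the marked amplitude k and the common unmarked amplitude l
    exactly through Grover iterations, via the recursion
    k' = ((N-2)/N) k + (2(N-1)/N) l,  l' = ((N-2)/N) l - (2/N) k."""
    N = n_items
    k = l = 1.0 / N ** 0.5           # uniform superposition over N items
    for _ in range(iterations):
        k, l = (N - 2) / N * k + 2 * (N - 1) / N * l, (N - 2) / N * l - 2 / N * k
    return k, l

# For N = 4, a single Grover iteration drives the marked amplitude to 1.
print(grover_amplitudes(4, 1))  # (1.0, 0.0)
```

Normalization k² + (N−1)·l² = 1 is preserved at every step, which makes this a convenient exact check against full state-vector simulation.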

  20. Antecedents to Consumers' Acceptance of Mobile Advertisements

    DEFF Research Database (Denmark)

    Rajala, Risto; Westerlund, Mika

    2010-01-01

    The paper presents a hierarchical construct PLS structural equation model to analyze mobile advertisement acceptance. Hypotheses are established and tested about the hierarchical structure and the effects of the factors that precede consumers' behavioral intention to accept mobile advertisements. The results suggest that valuable content and trust in advertisers are key predictors of mobile device users' acceptance of mobile advertising. In addition, the subjective value of the ads and subjective norms mediate these antecedent-acceptance relationships. The results are invaluable to both scholars

  1. The assessment of the impact of socio-economic factors in accepting cancer using the Acceptance of Illness Scale (AIS).

    Science.gov (United States)

    Czerw, Aleksandra I; Bilińska, Magdalena; Deptała, Andrzej

    2016-01-01

    The paper presents the results of examining the level of acceptance of the illness in cancer patients using the Acceptance of Illness Scale (AIS). The study involved cancer patients treated at the Central Clinical Hospital of the Ministry of the Interior in Warsaw in 2014. The questionnaire comprised basic demographic questions (socio-economic factors) and the AIS test estimating the level of illness acceptance in patients. For the group of patients in the research group, the arithmetic mean amounted to 27.56 points. The period of time that elapsed between the first cancer diagnosis and the start of the study did not influence the score of accepting illness. The acceptance of illness in patients without diagnosed metastases differed from the acceptance of illness by patients diagnosed with metastatic cancer. Females obtained an average of 29.59 in the AIS test, whereas the average in male patients was 26.17. The patients' age did not impact the AIS test. There were no differences in the AIS test results between a group of people with secondary education and a group of people with higher education. There were no differences in the AIS test results between employed individuals and pensioners. The inhabitants of cities were characterized by the highest degree of acceptance of their health condition. The lowest degree of acceptance of illness was observed in the group with the lowest average incomes. In the group of married individuals the average degree of acceptance of illness amounted to 27.37 points. The average degree of acceptance of illness in patients who declared themselves as single amounted to 25.75. The average degree of acceptance of illness in the study group was 27.56 points, which is a relatively high level of acceptance of cancer. The main socio-economic factor which influenced the AIS test results was whether metastases were diagnosed or not. There were no differences between patients in groups where the time that elapsed from the first diagnosis of

  2. The assessment of the impact of socio-economic factors in accepting cancer using the Acceptance of Illness Scale (AIS)

    Directory of Open Access Journals (Sweden)

    Aleksandra I. Czerw

    2015-11-01

    Full Text Available Aim of the study: The paper presents the results of examining the level of acceptance of the illness in cancer patients using the Acceptance of Illness Scale (AIS). Materials and methods: The study involved cancer patients treated at the Central Clinical Hospital of the Ministry of the Interior in Warsaw in 2014. The questionnaire comprised basic demographic questions (socio-economic factors) and the AIS test estimating the level of illness acceptance in patients. Results: For the group of patients in the research group, the arithmetic mean amounted to 27.56 points. The period of time that elapsed between the first cancer diagnosis and the start of the study did not influence the score of accepting illness. The acceptance of illness in patients without diagnosed metastases differed from the acceptance of illness by patients diagnosed with metastatic cancer. Females obtained an average of 29.59 in the AIS test, whereas the average in male patients was 26.17. The patients' age did not impact the AIS test. There were no differences in the AIS test results between a group of people with secondary education and a group of people with higher education. There were no differences in the AIS test results between employed individuals and pensioners. The inhabitants of cities were characterized by the highest degree of acceptance of their health condition. The lowest degree of acceptance of illness was observed in the group with the lowest average incomes. In the group of married individuals the average degree of acceptance of illness amounted to 27.37 points. The average degree of acceptance of illness in patients who declared themselves as single amounted to 25.75. Conclusions: The average degree of acceptance of illness in the study group was 27.56 points, which is a relatively high level of acceptance of cancer. The main socio-economic factor which influenced the AIS test results was whether metastases were diagnosed or not. There were no

  3. Semioptimal practicable algorithmic cooling

    International Nuclear Information System (INIS)

    Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2011-01-01

    Algorithmic cooling (AC) of spins applies entropy manipulation algorithms in open spin systems in order to cool spins far beyond Shannon's entropy bound. Algorithmic cooling of nuclear spins was demonstrated experimentally and may contribute to nuclear magnetic resonance spectroscopy. Several cooling algorithms were suggested in recent years, including practicable algorithmic cooling (PAC) and exhaustive AC. Practicable algorithms have simple implementations, yet their level of cooling is far from optimal; exhaustive algorithms, on the other hand, cool much better, and some even reach (asymptotically) an optimal level of cooling, but they are not practicable. We introduce here semioptimal practicable AC (SOPAC), wherein a few cycles (typically two to six) are performed at each recursive level. Two classes of SOPAC algorithms are proposed and analyzed. Both attain cooling levels significantly better than PAC and are much more efficient than the exhaustive algorithms. These algorithms are shown to bridge the gap between PAC and exhaustive AC. In addition, we calculated the number of spins required by SOPAC in order to purify qubits for quantum computation. As few as 12 and 7 spins are required (in an ideal scenario) to yield a mildly pure spin (60% polarized) from initial polarizations of 1% and 10%, respectively. In the latter case, about five more spins are sufficient to produce a highly pure spin (99.99% polarized), which could be relevant for fault-tolerant quantum computing.

  4. Physiologic correlates to background noise acceptance

    Science.gov (United States)

    Tampas, Joanna; Harkrider, Ashley; Nabelek, Anna

    2004-05-01

    Acceptance of background noise can be evaluated by having listeners indicate the highest background noise level (BNL) they are willing to accept while following the words of a story presented at their most comfortable listening level (MCL). The difference between the selected MCL and BNL is termed the acceptable noise level (ANL). One of the consistent findings in previous studies of ANL is large intersubject variability in acceptance of background noise. This variability is not related to age, gender, hearing sensitivity, personality, type of background noise, or speech perception in noise performance. The purpose of the current experiment was to determine if individual differences in physiological activity measured from the peripheral and central auditory systems of young female adults with normal hearing can account for the variability observed in ANL. Correlations between ANL and various physiological responses, including spontaneous, click-evoked, and distortion-product otoacoustic emissions, auditory brainstem and middle latency evoked potentials, and electroencephalography will be presented. Results may increase understanding of the regions of the auditory system that contribute to individual noise acceptance.
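The ANL measure described above is simple arithmetic over the two selected levels; a sketch (the dB values below are illustrative, not data from the study):

```python
def acceptable_noise_level(mcl_db: float, bnl_db: float) -> float:
    """ANL = most comfortable listening level (MCL) minus the highest
    background noise level (BNL) the listener accepts, both in dB."""
    return mcl_db - bnl_db

# A listener with an MCL of 55 dB who accepts noise up to 45 dB
# has an ANL of 10 dB; smaller ANLs indicate more noise acceptance.
print(acceptable_noise_level(55.0, 45.0))
```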

  5. Residential proximity, perceived and acceptable risk

    International Nuclear Information System (INIS)

    Rogers, G.O.

    1984-01-01

    This paper focuses on the relationship between the life experiences associated with residential proximity, and the perception and acceptability of the risks associated with generating electricity in nuclear power plants. Perceived risk is operationally defined in terms of estimated likelihood of occurrence, while acceptability of nuclear power is defined in terms of people's favorable or unfavorable opinions regarding nuclear power plants. In the context of a simple social-structural model of perceived and acceptable risk, four potential explanations for enhanced acceptability among those residentially proximate to nuclear facilities are examined: residents, through the experience of living with hazard, are reinforced toward assigning lower probabilities to the potential risks associated with nuclear facilities; the cognitive dissonance created by the acceptance of the risks associated with nuclear power is decreased by reducing perceived risk; nuclear neighbors are predisposed toward, educated about, and/or economically dependent upon nuclear power, hence the more favorable attitudes toward it; nearby residents are systematically more altruistic--other oriented--than the general population and thus more willing to bear the risks associated with nuclear power.

  6. An Ordering Linear Unification Algorithm

    Institute of Scientific and Technical Information of China (English)

    胡运发

    1989-01-01

    In this paper, we present an ordering linear unification algorithm (OLU). A new idea on substitution of the binding terms is introduced to the algorithm, which is able to overcome some drawbacks of other algorithms, e.g., the MM algorithm [1] and the RG1 and RG2 algorithms [2]. In particular, if we use directed cyclic graphs, the algorithm need not check the binding order, and the OLU algorithm can then also be applied to infinite tree data structures; a higher efficiency can be expected. The paper focuses upon the discussion of the OLU algorithm and a partial order structure with respect to the unification algorithm. This algorithm has been implemented in the GKD-PROLOG/VAX 780 interpreting system. Experimental results have shown that the algorithm is very simple and efficient.

  7. Financial treatment of demand management expenditures at Ontario Hydro

    International Nuclear Information System (INIS)

    Ariss, D.G.

    1990-01-01

    Ontario Hydro's demand side management (DSM) plan comprises reduction of load, load shifting, and peak shaving. It includes an accounting policy applied only to measures which reduce demand by increasing the efficiency of electricity utilization or by shifting load from peak periods to off-peak periods. In order to choose the pertinent periods over which the DSM expenditures should be recovered, the utility considered three accounting options: expensing all DSM expenditures as incurred; deferring all DSM expenditures; or deferring only those DSM expenditures that meet specified criteria. Ontario Hydro has chosen the last option, since it is in conformity with generally accepted accounting principles. This option is based on the matching principle, under which costs and revenues that are linked to each other in a cause-and-effect relationship should be recognized in the same accounting period. It has also been judged advantageous to amortize the deferred expenses corresponding to each measure over appropriate periods, and it has been established that the amortization period should begin immediately after each measure has been put into operation. This accounting policy ensures that expenses relating to DSM are accounted for in a pertinent and uniform manner. 6 refs
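The deferral-and-amortization treatment described above can be sketched as straight-line recovery of a capitalized DSM expenditure beginning once the measure is in operation. A hedged illustration (the dollar figure and period length are invented, not from the article):

```python
def amortization_schedule(deferred_cost: float, years: int) -> list[float]:
    """Straight-line amortization of a deferred DSM expenditure,
    spread evenly over the recovery period that starts after the
    measure enters operation (matching principle)."""
    annual = deferred_cost / years
    return [round(annual, 2)] * years

# A hypothetical $1.2M efficiency program recovered over 4 years:
print(amortization_schedule(1_200_000.0, 4))
```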

  8. Changes in Cardiovascular Disease Risk Factors With Immediate Versus Deferred Antiretroviral Therapy Initiation Among HIV-Positive Participants in the START (Strategic Timing of Antiretroviral Treatment) Trial

    DEFF Research Database (Denmark)

    Baker, Jason V; Sharma, Shweta; Achhra, Amit C

    2017-01-01

    INTRODUCTION: HIV infection and certain antiretroviral therapy (ART) medications increase atherosclerotic cardiovascular disease risk, mediated, in part, through traditional cardiovascular disease risk factors. METHODS AND RESULTS: We studied cardiovascular disease risk factor changes in the START...... (Strategic Timing of Antiretroviral Treatment) trial, a randomized study of immediate versus deferred ART initiation among HIV-positive persons with CD4+ cell counts >500 cells/mm3. Mean change from baseline in risk factors and the incidence of comorbid conditions were compared between groups....... The characteristics among 4685 HIV-positive START trial participants include a median age of 36 years, a CD4 cell count of 651 cells/mm3, an HIV viral load of 12 759 copies/mL, a current smoking status of 32%, a median systolic/diastolic blood pressure of 120/76 mm Hg, and median levels of total cholesterol of 168 mg...

  9. PERFORMANCE ANALYSIS OF SET PARTITIONING IN HIERARCHICAL TREES (SPIHT ALGORITHM FOR A FAMILY OF WAVELETS USED IN COLOR IMAGE COMPRESSION

    Directory of Open Access Journals (Sweden)

    A. Sreenivasa Murthy

    2014-11-01

    Full Text Available With the spurt in the amount of data (image, video, audio, speech, and text) available on the net, there is a huge demand for memory and bandwidth savings. One has to achieve this while maintaining the quality and fidelity of the data acceptable to the end user. The wavelet transform is an important and practical tool for data compression. Set partitioning in hierarchical trees (SPIHT) is a widely used compression algorithm for wavelet-transformed images. Among all wavelet-transform and zero-tree-quantization based image compression algorithms, SPIHT has become the benchmark state-of-the-art algorithm because it is simple to implement and yields good results. In this paper we present a comparative study of various wavelet families for image compression with the SPIHT algorithm. We have conducted experiments with Daubechies, Coiflet, Symlet, Bi-orthogonal, Reverse Bi-orthogonal, and Demeyer wavelet types. The resulting image quality is measured objectively, using peak signal-to-noise ratio (PSNR), and subjectively, using perceived image quality (human visual perception, HVP for short). The resulting reduction in image size is quantified by the compression ratio (CR).
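The two objective metrics named above, PSNR and CR, are computed directly from pixel errors and file sizes; a minimal sketch for 8-bit image data (pure Python, no wavelet or SPIHT code):

```python
import math

def psnr(original: list[int], reconstructed: list[int], max_val: int = 255) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit pixel data:
    10 * log10(MAX^2 / MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / len(original)
    if mse == 0:
        return float("inf")  # identical images
    return 10 * math.log10(max_val ** 2 / mse)

def compression_ratio(original_bytes: int, compressed_bytes: int) -> float:
    """CR = original size / compressed size."""
    return original_bytes / compressed_bytes

# Three slightly perturbed pixels give an MSE of 1.0.
print(round(psnr([100, 120, 140], [101, 119, 141]), 2))
print(compression_ratio(1000, 250))
```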

  10. 48 CFR 552.270-29 - Acceptance of Space.

    Science.gov (United States)

    2010-10-01

    ... 48 Federal Acquisition Regulations System 4 2010-10-01 2010-10-01 false Acceptance of Space. 552... Acceptance of Space. As prescribed in 570.603, insert the following clause: Acceptance of Space (SEP 1999) (a... designated representative shall promptly inspect the space. (b) The Government will accept the space and the...

  11. Management Of Large Scale Osmotic Dehydration Solution Using The Pearsons Square Algorithm

    Directory of Open Access Journals (Sweden)

    Oladejo Duduyemi

    2015-01-01

    Full Text Available ABSTRACT Osmotic dehydration is a widely researched and advantageous pre-treatment process in food preservation, but it has not enjoyed industrial acceptance because of its highly concentrated and voluminous effluent. The Pearson's square algorithm was employed to give a focused attack on the problem by developing a user-friendly template, written as a JavaScript program, for reconstituting effluents for recycling purposes. Outflow from a pilot-scale plant was reactivated and introduced into a scheme of operation for continuous OD of fruits and vegetables. Screened and re-concentrated effluents were subjected to statistical analysis in comparison to solutions at the initial concentrations, at a confidence limit of p 0.05. The template proved to be an adequate representation of the Pearson's square algorithm; it is sufficiently good at reconstituting used osmotic solutions for repetitive usage. This protocol, if adopted in the industry, is not only environmentally friendly but also promises significant economic improvement of the OD process. Application: Recycling of non-reacting media, and as a template for automation in continuous OD processing.
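The Pearson's square itself is a simple blending rule: to hit a target concentration, mix two stocks in parts proportional to the cross-differences between the target and each stock. A sketch of how spent and fresh osmotic solutions might be reblended (our own function and variable names, not the authors' template):

```python
def pearson_square(high_conc: float, low_conc: float, target: float) -> tuple[float, float]:
    """Return the fractions of high- and low-concentration solution
    needed to reach the target concentration (Pearson square rule:
    parts of each stock = |target - other stock's concentration|)."""
    if not low_conc <= target <= high_conc:
        raise ValueError("target must lie between the two concentrations")
    parts_high = target - low_conc
    parts_low = high_conc - target
    total = parts_high + parts_low
    return parts_high / total, parts_low / total

# Blend 70 Brix fresh syrup with 40 Brix spent effluent to get 60 Brix:
frac_fresh, frac_spent = pearson_square(70.0, 40.0, 60.0)
print(frac_fresh, frac_spent)  # 2/3 fresh, 1/3 spent
```

Sanity check: 2/3 × 70 + 1/3 × 40 = 60, the target concentration.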

  12. An Online Scheduling Algorithm with Advance Reservation for Large-Scale Data Transfers

    Energy Technology Data Exchange (ETDEWEB)

    Balman, Mehmet; Kosar, Tevfik

    2010-05-20

    Scientific applications and experimental facilities generate massive data sets that need to be transferred to remote collaborating sites for sharing, processing, and long term storage. In order to support increasingly data-intensive science, next generation research networks have been deployed to provide high-speed on-demand data access between collaborating institutions. In this paper, we present a practical model for online data scheduling in which data movement operations are scheduled in advance for end-to-end high performance transfers. In our model, the data scheduler interacts with reservation managers and data transfer nodes in order to reserve available bandwidth to guarantee completion of jobs that are accepted and confirmed to satisfy the preferred time constraint given by the user. Our methodology improves current systems by allowing researchers and higher-level meta-schedulers to use data placement as a service, where they can plan ahead and reserve scheduler time in advance for their data movement operations. We have implemented our algorithm and examined possible techniques for incorporation into current reservation frameworks. Performance measurements confirm that the proposed algorithm is efficient and scalable.
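The core feasibility question in such advance-reservation scheduling is whether a requested bandwidth fits under the link capacity throughout the requested window, given already-confirmed reservations. A simplified sketch of that check (a real system like the one described would query reservation managers; this is only an illustration with toy units):

```python
def can_reserve(existing, capacity, start, end, bw):
    """Check whether a new reservation of bandwidth `bw` over [start, end)
    fits under `capacity`, given existing (start, end, bw) reservations.
    Usage is piecewise-constant, so it suffices to check the window start
    and every reservation start inside the window."""
    points = {start} | {s for s, e, b in existing if start <= s < end}
    for t in points:
        used = sum(b for s, e, b in existing if s <= t < e)
        if used + bw > capacity:
            return False
    return True

existing = [(0, 10, 4), (5, 15, 3)]  # two confirmed transfers
print(can_reserve(existing, capacity=10, start=5, end=12, bw=3))  # fits: peak usage 7 + 3 = 10
print(can_reserve(existing, capacity=10, start=5, end=12, bw=4))  # rejected: 7 + 4 > 10
```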

  13. Differentiated influences of benefit and risk perceptions on nuclear power acceptance according to acceptance levels. Evidence from Korea

    International Nuclear Information System (INIS)

    Roh, Seungkook; Lee, Jinwon

    2017-01-01

    The perceived benefit and risk of nuclear power generation have received considerable attention as determinants of the public's acceptance of nuclear power. However, the contingency of the relative importance of this benefit and risk has been less explored. Using Korea as an example, this study explores the possibility that the relative importance of perceived benefit and risk for nuclear power acceptance depends on acceptance levels. Our results from latent class analysis and multinomial probit show that, in determining whether an individual shows a moderate level of nuclear power acceptance rather than a low level, perceived risk plays a dominant role compared to perceived benefit; however, regarding whether he/she shows a high level of nuclear power acceptance rather than a moderate level, this relative importance is reversed. These results carry practical implications for risk governance of nuclear power, particularly with regard to communication with the public. (author)

  14. VISUALIZATION OF PAGERANK ALGORITHM

    OpenAIRE

    Perhaj, Ervin

    2013-01-01

    The goal of the thesis is to develop a web application that helps users understand the functioning of the PageRank algorithm. The thesis consists of two parts. First, we develop an algorithm to calculate the PageRank values of web pages. The input of the algorithm is a list of web pages and the links between them. The user enters the list through the web interface. From these data the algorithm calculates a PageRank value for each page. The algorithm repeats the process, until the difference of PageRank va...
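The iteration described above can be sketched compactly: starting from uniform values, each page repeatedly receives the damped share of its in-neighbors' ranks until the values stop changing. A minimal sketch (the damping factor 0.85 is the conventional choice; the three-page graph is a toy example, not from the thesis):

```python
def pagerank(links: dict[str, list[str]], damping: float = 0.85,
             tol: float = 1e-8) -> dict[str, float]:
    """Power-iteration PageRank over an adjacency list.
    Assumes every page has at least one out-link (no dangling nodes)."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1 / n for p in pages}
    while True:
        new = {}
        for p in pages:
            # Sum of rank mass flowing into p from each linking page q.
            incoming = sum(rank[q] / len(links[q])
                           for q in pages if p in links[q])
            new[p] = (1 - damping) / n + damping * incoming
        if max(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new

ranks = pagerank({"a": ["b"], "b": ["a", "c"], "c": ["a"]})
print({p: round(r, 3) for p, r in ranks.items()})
```

Page "a" ends up ranked highest here because it receives links from both "b" and "c".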

  15. Consumer Acceptance of Dry Dog Food Variations

    Science.gov (United States)

    Donfrancesco, Brizio Di; Koppel, Kadri; Swaney-Stueve, Marianne; Chambers, Edgar

    2014-01-01

    Simple Summary The objectives of this study were to compare the acceptance of different dry dog food products by consumers, determine consumer clusters for acceptance, and identify the characteristics of dog food that drive consumer acceptance. Pet owners evaluated dry dog food samples available in the US market. The results indicated that appearance of the sample, especially the color, influenced pet owner’s overall liking more than the aroma of the product. Abstract The objectives of this study were to compare the acceptance of different dry dog food products by consumers, determine consumer clusters for acceptance, and identify the characteristics of dog food that drive consumer acceptance. Eight dry dog food samples available in the US market were evaluated by pet owners. In this study, consumers evaluated overall liking, aroma, and appearance liking of the products. Consumers were also asked to predict their purchase intent, their dog’s liking, and cost of the samples. The results indicated that appearance of the sample, especially the color, influenced pet owner’s overall liking more than the aroma of the product. Overall liking clusters were not related to income, age, gender, or education, indicating that general consumer demographics do not appear to play a main role in individual consumer acceptance of dog food products. PMID:26480043

  16. A novel validation algorithm allows for automated cell tracking and the extraction of biologically meaningful parameters.

    Directory of Open Access Journals (Sweden)

    Daniel H Rapoport

    Full Text Available Automated microscopy is currently the only method to non-invasively and label-free observe complex multi-cellular processes, such as cell migration, cell cycle, and cell differentiation. Extracting biological information from a time-series of micrographs requires each cell to be recognized and followed through sequential microscopic snapshots. Although recent attempts to automate this process resulted in ever-improving cell detection rates, manual identification of identical cells is still the most reliable technique. However, its tedious and subjective nature prevented tracking from becoming a standardized tool for the investigation of cell cultures. Here, we present a novel method to accomplish automated cell tracking with a reliability comparable to manual tracking. Previously, automated cell tracking could not rival the reliability of manual tracking because, in contrast to the human way of solving this task, none of the algorithms had an independent quality control mechanism; they missed validation. Thus, instead of trying to improve the cell detection or tracking rates, we proceeded from the idea to automatically inspect the tracking results and accept only those of high trustworthiness, while rejecting all other results. This validation algorithm works independently of the quality of cell detection and tracking through a systematic search for tracking errors. It is based only on very general assumptions about the spatiotemporal contiguity of cell paths. While traditional tracking often aims to yield genealogic information about single cells, the natural outcome of a validated cell tracking algorithm turns out to be a set of complete, but often unconnected cell paths, i.e. records of cells from mitosis to mitosis. This is a consequence of the fact that the validation algorithm takes complete paths as the unit of rejection/acceptance. The resulting set of complete paths can be used to automatically extract important biological parameters.

  17. RFID Location Algorithm

    Directory of Open Access Journals (Sweden)

    Wang Zi Min

    2016-01-01

    Full Text Available With the development of social services and the further improvement of living standards, there is an urgent need for positioning technology that can adapt to complex situations. In recent years, RFID technology has found a wide range of applications in many aspects of life and production, such as logistics tracking, car alarms, and security. Using RFID technology for localization is a new research direction for various research institutions and scholars. RFID positioning offers system stability, small error, and low cost, and its location algorithms are the focus of this study. This article analyzes RFID positioning methods and algorithms at several levels. First, several basic RFID methods are introduced; secondly, more accurate network-based location methods are discussed; finally, the LANDMARC algorithm is described. From this it can be seen that advanced and efficient algorithms play an important role in increasing RFID positioning accuracy. Finally, RFID location algorithms are summarized, their deficiencies are pointed out, and requirements for follow-up study are put forward, with a vision of better future RFID positioning technology.
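LANDMARC, mentioned last, locates a tag by comparing its signal-strength readings against fixed reference tags and averaging the positions of the k nearest references, weighted inversely by squared signal-space distance. A hedged sketch of that weighting scheme (the RSSI values and positions below are toy data; a real deployment reads them from RFID readers):

```python
import math

def landmarc_locate(target_rssi, references, k=2):
    """LANDMARC-style k-nearest-neighbor positioning.
    references: list of (rssi_vector, (x, y)) for fixed reference tags.
    Returns the weighted centroid of the k references closest to the
    target in signal space, with weights 1 / distance^2."""
    scored = []
    for rssi, pos in references:
        e = math.dist(target_rssi, rssi)  # Euclidean distance in signal space
        scored.append((e, pos))
    scored.sort(key=lambda t: t[0])
    nearest = scored[:k]
    weights = [1 / (e ** 2 + 1e-9) for e, _ in nearest]  # epsilon avoids div-by-zero
    total = sum(weights)
    x = sum(w * p[0] for w, (_, p) in zip(weights, nearest)) / total
    y = sum(w * p[1] for w, (_, p) in zip(weights, nearest)) / total
    return x, y

refs = [([-40, -55], (0.0, 0.0)), ([-50, -45], (1.0, 0.0)), ([-60, -40], (2.0, 1.0))]
print(landmarc_locate([-49, -46], refs, k=2))  # close to the (1.0, 0.0) reference
```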

  18. Improved multivariate polynomial factoring algorithm

    International Nuclear Information System (INIS)

    Wang, P.S.

    1978-01-01

    A new algorithm for factoring multivariate polynomials over the integers, based on an algorithm by Wang and Rothschild, is described. The new algorithm has improved strategies for dealing with the known problems of the original algorithm, namely, the leading-coefficient problem, the bad-zero problem, and the occurrence of extraneous factors. It has an algorithm for correctly predetermining the leading coefficients of the factors. A new and efficient p-adic algorithm named EEZ is described. Basically, it is a linearly convergent variable-by-variable parallel construction. The improved algorithm is generally faster and requires less storage than the original algorithm. Machine examples with comparative timing are included

  19. Governance by algorithms

    Directory of Open Access Journals (Sweden)

    Francesca Musiani

    2013-08-01

    Full Text Available Algorithms are increasingly often cited as one of the fundamental shaping devices of our daily, immersed-in-information existence. Their importance is acknowledged, their performance scrutinised in numerous contexts. Yet, a lot of what constitutes 'algorithms' beyond their broad definition as “encoded procedures for transforming input data into a desired output, based on specified calculations” (Gillespie, 2013 is often taken for granted. This article seeks to contribute to the discussion about 'what algorithms do' and in which ways they are artefacts of governance, providing two examples drawing from the internet and ICT realm: search engine queries and e-commerce websites’ recommendations to customers. The question of the relationship between algorithms and rules is likely to occupy an increasingly central role in the study and the practice of internet governance, in terms of both institutions’ regulation of algorithms, and algorithms’ regulation of our society.

  20. The effects of noise reduction technologies on the acceptance of background noise.

    Science.gov (United States)

    Lowery, Kristy Jones; Plyler, Patrick N

    2013-09-01

    Directional microphones (D-Mics) and digital noise reduction (DNR) algorithms are used in hearing aids to reduce the negative effects of background noise on performance. Directional microphones attenuate sounds arriving from anywhere other than the front of the listener while DNR attenuates sounds with physical characteristics of noise. Although both noise reduction technologies are currently available in hearing aids, it is unclear if the use of these technologies in isolation or together affects acceptance of noise and/or preference for the end user when used in various types of background noise. The purpose of the research was to determine the effects of D-Mic, DNR, or the combination of D-Mic and DNR on acceptance of noise and preference when listening in various types of background noise. An experimental study in which subjects were exposed to a repeated measures design was utilized. Thirty adult listeners with mild sloping to moderately severe sensorineural hearing loss participated (mean age 67 yr). Acceptable noise levels (ANLs) were obtained using no noise reduction technologies, D-Mic only, DNR only, and the combination of the two technologies (Combo) for three different background noises (single-talker speech, speech-shaped noise, and multitalker babble) for each listener. In addition, preference rankings of the noise reduction technologies were obtained within each background noise (1 = best, 3 = worst). ANL values were significantly better for each noise reduction technology than baseline; and benefit increased significantly from DNR to D-Mic to Combo. Listeners with higher (worse) baseline ANLs received more benefit from noise reduction technologies than listeners with lower (better) baseline ANLs. 
Neither ANL values nor ANL benefit values were significantly affected by background noise type; however, ANL benefit with D-Mic and Combo was similar when speech-like noise was present while ANL benefit was greatest for Combo when speech spectrum noise was