WorldWideScience

Sample records for methods approach utilizing

  1. Methods for cellobiosan utilization

    Energy Technology Data Exchange (ETDEWEB)

    Linger, Jeffrey; Beckham, Gregg T.

    2017-07-11

    Disclosed herein are enzymes useful for the degradation of cellobiosan in materials such as pyrolysis oils. Methods of degrading cellobiosan using these enzymes, or organisms expressing them, are also disclosed.

  2. Risk evaluation method for faults by engineering approach. (2) Application concept of margin analysis utilizing accident sequences

    International Nuclear Information System (INIS)

    Kamiya, Masanobu; Kanaida, Syuuji; Kamiya, Kouichi; Sato, Kunihiko; Kuroiwa, Katsuya

    2016-01-01

    The influence of fault displacement on the facility should be evaluated not only from the activity of the fault but also from risk information obtained by considering scenarios that account for the frequency and the severity of the hazard; this is an appropriate approach for nuclear safety. An applicable concept of margin analysis utilizing accident sequences is proposed for evaluating the influence of fault displacement. With this analysis, we can evaluate the safety functions and the margin to core damage, verify the effectiveness of portable equipment, and decide whether additional measures are needed to reduce the risk on the basis of the risk information obtained. (author)

  3. A preliminary approach to creating an overview of lactoferrin multi-functionality utilizing a text mining method.

    Science.gov (United States)

    Shimazaki, Kei-ichi; Kushida, Tatsuya

    2010-06-01

    Lactoferrin is a multi-functional metal-binding glycoprotein that exhibits many biological functions of interest to many researchers from the fields of clinical medicine, dentistry, pharmacology, veterinary medicine, nutrition and milk science. To date, a number of academic reports concerning the biological activities of lactoferrin have been published and are easily accessible through public data repositories. However, as the literature is expanding daily, this presents challenges in understanding the larger picture of lactoferrin function and mechanisms. In order to overcome the "analysis paralysis" associated with lactoferrin information, we attempted to apply a text mining method to the accumulated lactoferrin literature. To this end, we used the information extraction system GENPAC (provided by Nalapro Technologies Inc., Tokyo). This information extraction system uses natural language processing and text mining technology. This system analyzes the sentences and titles from abstracts stored in the PubMed database, and can automatically extract binary relations that consist of interactions between genes/proteins, chemicals and diseases/functions. We expect that such information visualization analysis will be useful in determining novel relationships among a multitude of lactoferrin functions and mechanisms. We have demonstrated the utilization of this method to find pathways of lactoferrin participation in neovascularization, Helicobacter pylori attack on gastric mucosa, atopic dermatitis and lipid metabolism.

  4. A bayesian approach to laboratory utilization management

    Directory of Open Access Journals (Sweden)

    Ronald G Hauser

    2015-01-01

    Background: Laboratory utilization management describes a process designed to increase healthcare value by altering requests for laboratory services. A typical approach to monitor and prioritize interventions involves audits of laboratory orders against specific criteria, defined as rule-based laboratory utilization management. This approach has inherent limitations. First, rules are inflexible. They adapt poorly to the ambiguity of medical decision-making. Second, rules judge the context of a decision instead of the patient outcome, allowing an order to simultaneously save a life and break a rule. Third, rules can threaten physician autonomy when used in a performance evaluation. Methods: We developed an alternative to rule-based laboratory utilization. The core idea comes from a formula used in epidemiology to estimate disease prevalence. The equation relates four terms: the prevalence of disease, the proportion of positive tests, test sensitivity, and test specificity. When applied to a laboratory utilization audit, the formula estimates the prevalence of disease (pretest probability [PTP]) in the patients tested. The comparison of PTPs among different providers, provider groups, or patient cohorts produces an objective evaluation of laboratory requests. We demonstrate the model in a review of tests for enterovirus (EV) meningitis. Results: The model identified subpopulations within the cohort with a low prevalence of disease. These low-prevalence groups shared demographic and seasonal factors known to protect against EV meningitis. This suggests that too many orders were placed for patients at low risk of EV. Conclusion: We introduce a new method for laboratory utilization management programs to audit laboratory services.
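
    A minimal sketch of the prevalence formula described above, rearranged to estimate the pretest probability from an audit's observed positive rate; the sensitivity, specificity and positive rates below are illustrative placeholders, not values from the study.

```python
def pretest_probability(positive_rate, sensitivity, specificity):
    """Estimate disease prevalence (pretest probability, PTP) in a tested cohort
    from the observed fraction of positive results, using the standard relation
        positive_rate = PTP*Se + (1 - PTP)*(1 - Sp)
    solved for PTP (a Rogan-Gladen-style estimate)."""
    ptp = (positive_rate + specificity - 1.0) / (sensitivity + specificity - 1.0)
    return min(max(ptp, 0.0), 1.0)  # clamp to [0, 1]

# Illustrative placeholder values (not from the study): compare two provider
# groups ordering an enterovirus test with assumed Se = 0.95 and Sp = 0.97.
for group, pos_rate in [("group A", 0.12), ("group B", 0.02)]:
    print(group, round(pretest_probability(pos_rate, 0.95, 0.97), 3))
```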

  5. UTILIZATION OF 5Es' CONSTRUCTIVIST APPROACH FOR ...

    African Journals Online (AJOL)

    Global Journal

    learning, the effective utilization of which can enhance biology teaching and learning. This paper focused on how ... KEYWORDS: Utilization, 5Es Constructivist approach, difficult concepts, Biology. ... gives hope to the development of the deep.

  6. In situ dating on Mars: A new approach to the K-Ar method utilizing cosmogenic argon

    Science.gov (United States)

    Cassata, William S.

    2014-01-01

    Cosmogenic argon isotopes are produced in feldspars via nuclear reactions between cosmic rays and Ca and K atoms within the lattice. These cosmogenic isotopes can be used as proxies for K and Ca, much like nuclear reactor-derived 39Ar and 37Ar are used as proxies for K and Ca, respectively, in 40Ar/39Ar geochronology. If Ca and K are uniformly distributed, then the ratio of radiogenic 40Ar (40Ar*) to cosmogenic 38Ar or 36Ar (38Arcos or 36Arcos) is proportional to the difference between the radioisotopic and exposure ages, as well as the K/Ca ratio of the degassing phase. Thus cosmogenic, radiogenic, and trapped Ar isotopes, all of which can be measured remotely and are stable over geologic time, are sufficient to generate an isochron-like diagram from which the isotopic composition of the trapped component may be inferred. Such data also provide a means to assess the extent to which the system has remained closed with respect to 40Ar*, thereby mitigating otherwise unquantifiable uncertainties that complicate the conventional K-Ar dating method.
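
    For context, the standard accumulation relations that the proxy argument rests on can be sketched as below; this is a hedged sketch with assumed notation (P38 for the chemistry-dependent cosmogenic production rate, lambda_ec and lambda for the partial and total 40K decay constants), not equations taken from the paper.

```latex
% Schematic sketch (notation assumed): cosmogenic Ar accumulates roughly linearly
% with exposure time at a rate set by the target chemistry (Ca, K), while
% radiogenic 40Ar follows the usual K-Ar age equation.
\[
  {}^{38}\mathrm{Ar}_{\mathrm{cos}} \approx P_{38}\, t_{\mathrm{exp}},
  \qquad
  {}^{40}\mathrm{Ar}^{*} = \frac{\lambda_{\mathrm{ec}}}{\lambda}\,{}^{40}\mathrm{K}\,
  \big(e^{\lambda t_{\mathrm{rad}}} - 1\big),
\]
% so the measured ratio of radiogenic to cosmogenic Ar ties together the
% radioisotopic age t_rad, the exposure age t_exp and the K/Ca ratio, as
% described in the abstract.
```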

  7. Method for aluminium dross utilization

    International Nuclear Information System (INIS)

    Lucheva, B.; Petkov, R.; Tzonev, Tz.

    2003-01-01

    A new hydrometallurgical method has been developed for utilizing metallic aluminium from secondary aluminium dross. Secondary aluminium dross is a powder product (below 1 mm) with an average aluminium content of 35%. It is a waste from the fluxless pyrometallurgical treatment of primary aluminium dross in a rotary DC electric arc furnace. The method is based on aluminium leaching in a copper chloride water solution. As a result, an aluminium oxychloride solution and solids consisting of copper and oxides are obtained. In order to regenerate the copper chloride solution, hydrochloric acid is added to the solids. The process is simple, quick, economic and safe. The aluminium oxychloride solution contains 56 g/l Al2O3. The molar ratios are Al:Cl = 0.5 and OH:Al = 1. The solution has 32% basicity and a density of 1.1 g/cm3. To increase the molar ratio of aluminium to chlorine, aluminium hydroxide is added to this solution at 80 °C. The aluminium hydroxide is the final product from the alkaline leaching of secondary aluminium dross. As a result, an aluminium oxychloride solution of the following composition is prepared: Al2O3 - 180 g/l; Al:Cl = 1.88; OH:Al = 4.64; basicity 82%; density 1.22 g/cm3; pH = 4-4.5. Aluminium oxychloride solution produced by this method can be used in potable and wastewater treatment, in papermaking, as a binder in refractory mixtures, etc. (Original)

  8. Systemic-Functional Approach to Utilities Supply

    Directory of Open Access Journals (Sweden)

    Nikolay I. Komkov

    2017-01-01

    Purpose: the purpose of the article is to present a management approach to the development of utilities supply processes based on the search for resolutions of conflict situations, which accumulated during the transition from planned, directive management to market-based development. Methods: the research methodology is based on system analysis of the functioning of full-life-cycle processes, forecasting of the development of complex systems, and mathematical modeling of service supply processes, of innovative and investment projects, and of the development of service supply processes. Results: the results of the work are concentrated in the presentation of a systemic-functional approach to managing the development of municipal service processes that is able to resolve conflict situations in this sphere. Conclusions and Relevance: the traditional management approach, based on the elimination of "bottlenecks" and emergencies, which prevailed within the planned and directive system, has led, under its transformation to market conditions, to the accumulation of conflict situations and unsolvable problems. The proposed systemic-functional approach, based on forecasting the full life cycle of the modernized processes and service-providing systems, makes it possible to take into account the costs of modernization as well as the prime cost and quality of the services rendered.

  9. Supplier Selection Using Weighted Utility Additive Method

    Science.gov (United States)

    Karande, Prasad; Chakraborty, Shankar

    2015-10-01

    Supplier selection is a multi-criteria decision-making (MCDM) problem which mainly involves evaluating a number of available suppliers according to a set of common criteria in order to choose the best one to meet the organizational needs. For any manufacturing or service organization, selecting the right upstream suppliers is a key success factor that will significantly reduce purchasing cost, increase downstream customer satisfaction and improve competitive ability. Past researchers have attempted to solve the supplier selection problem employing different MCDM techniques which involve active participation of the decision makers in the decision-making process. This paper deals with the application of the weighted utility additive (WUTA) method for solving supplier selection problems. The WUTA method, an extension of the utility additive approach, is based on ordinal regression and consists of building a piece-wise linear additive decision model from a preference structure using linear programming (LP). It adopts the preference disaggregation principle and addresses the decision-making activities through operational models which need implicit preferences in the form of a preorder of reference alternatives or a subset of these alternatives present in the process. The preferential preorder provided by the decision maker is used as a restriction of an LP problem, whose objective function is the minimization of the sum of the errors associated with the ranking of each alternative. Based on a given reference ranking of alternatives, one or more additive utility functions are derived. Using these utility functions, the weighted utilities for individual criterion values are combined into an overall weighted utility for a given alternative. It is observed that the WUTA method, having a sound mathematical background, can provide an accurate ranking of the candidate suppliers and choose the best one to fulfill the organizational requirements. Two real-time examples are illustrated to prove
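
    For readers unfamiliar with the utility additive family, a minimal sketch of the UTA-style ordinal-regression LP that WUTA extends is given below; the notation is assumed (not taken from the paper) and the criterion-weighting step specific to WUTA is omitted.

```latex
% UTA-style LP sketch: the decision maker ranks reference alternatives
% a_1 >= a_2 >= ... >= a_m on criteria g_1,...,g_n; piecewise-linear marginal
% utilities u_i and error terms sigma are fitted by linear programming.
\begin{align*}
\min \quad & \sum_{k=1}^{m} \sigma(a_k) \\
\text{s.t.}\quad
& U(a_k) - U(a_{k+1}) + \sigma(a_k) - \sigma(a_{k+1}) \ge \delta && \text{if } a_k \succ a_{k+1},\\
& U(a_k) - U(a_{k+1}) + \sigma(a_k) - \sigma(a_{k+1}) = 0   && \text{if } a_k \sim a_{k+1},\\
& U(a) = \sum_{i=1}^{n} u_i\big(g_i(a)\big), \quad u_i \ \text{piecewise linear, non-decreasing},\\
& \sum_{i=1}^{n} u_i(g_i^{*}) = 1, \qquad u_i(g_{i*}) = 0, \qquad \sigma(a_k) \ge 0 .
\end{align*}
```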

  10. Decentralized method for utility regulation

    Energy Technology Data Exchange (ETDEWEB)

    Loeb, M. (North Carolina State Univ., Raleigh); Magat, W.A.

    1979-10-01

    A new institutional arrangement for regulating utilities is suggested that minimizes the costs of natural monopolies. A mixture of regulation and franchising, the plan draws on the advantages of each and eliminates many of the problems. The proposal allows utilities to set their own price on the basis of demand and marginal-cost projections. Subsidies are provided by the regulatory agency if there is a consumer surplus. The system encourages the utility to select a competitive price and to produce only the amount of service needed. Operating efficiency is encouraged by rewarding cost reductions and discouraging cost overstatement at the rate review. The regulatory agency would not need to take action to bring price and marginal costs into equality. The franchise sale can be made by competitive bidding, in which the bidders would capitalize part or all of the subsidy or the regulatory agency could recover the subsidy in a lump-sum tax on the utility.

  11. A utility theory approach for insurance pricing

    Directory of Open Access Journals (Sweden)

    Mohsen Gharakhani

    2015-11-01

    Providing an insurance contract with a "deductible" is beneficial for both insurer and insured. In this paper, we provide a utility modeling approach to handle insurance pricing and to evaluate the tradeoff between the discount benefit and the deductible level. We analyze four different pricing problems, namely no insurance, full insurance coverage, insurance with a β% deductible and insurance with a D-dollar deductible, based on a given utility function. A numerical example is also used to illustrate some interesting results.
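
    A minimal simulation sketch of the kind of comparison described above, under assumptions not taken from the paper: CARA utility, exponentially distributed losses, a 10% premium loading, and a reading of the β% deductible as proportional retention of each loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative placeholder parameters (not from the paper) ---
w0 = 100_000.0          # initial wealth
risk_aversion = 1e-4    # CARA coefficient
losses = rng.exponential(scale=5_000.0, size=200_000)  # simulated loss distribution

def cara_utility(w):
    """Constant absolute risk aversion (exponential) utility."""
    return -np.exp(-risk_aversion * w)

def expected_utility(retained_loss, premium):
    """Expected utility of final wealth given what the insured retains."""
    return cara_utility(w0 - premium - retained_loss).mean()

D, beta = 2_000.0, 0.20   # dollar deductible and percentage retention (assumed)
loading = 1.10            # premium = loading * expected indemnity (assumed)

options = {
    "no insurance":         expected_utility(losses, 0.0),
    "full coverage":        expected_utility(0.0 * losses, loading * losses.mean()),
    f"${D:.0f} deductible": expected_utility(np.minimum(losses, D),
                                             loading * np.maximum(losses - D, 0).mean()),
    f"{beta:.0%} retention": expected_utility(beta * losses,
                                              loading * ((1 - beta) * losses).mean()),
}
for name, eu in options.items():
    print(f"{name:18s} expected utility = {eu:.6f}")
```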

  12. Diagnosis method utilizing neural networks

    International Nuclear Information System (INIS)

    Watanabe, K.; Tamayama, K.

    1990-01-01

    Studies have been made on the technique of neural networks, which will be used to identify the cause of a small anomalous state in the reactor coolant system of the ATR (Advanced Thermal Reactor). Three phases of analysis were carried out in this study. First, simulations of 100 seconds were made to determine how the plant parameters respond after the occurrence of a transient decrease in reactivity, feed water flow rate or feed water temperature, or an increase in steam flow rate or steam pressure, each of which would produce a decrease of the water level in a steam drum of the ATR. Next, the simulation data were analysed using an autoregressive model. From this analysis, a total of 36 coherency functions up to 0.5 Hz in each transient were computed among nine important and detectable plant parameters: neutron flux, flow rate of coolant, steam or feed water, water level in the steam drum, pressure and opening area of the control valve in a steam pipe, feed water temperature and electrical power. Last, learning of neural networks composed of 96 input, 4-9 hidden and 5 output layer units was done by use of the generalized delta rule, namely a back-propagation algorithm. These convergent computations were continued until the difference between the desired outputs (1 for the direct cause, 0 for the four other ones) and the actual outputs was less than 10%. (1) The coherency functions were not governed by the decreasing rate of reactivity in the range of 0.41x10^-2 dollar/s to 1.62x10^-2 dollar/s, by the decreasing depth of the feed water temperature in the range of 3 deg C to 10 deg C, or by a change of 10% or less in the three other causes. Change in the coherency functions depended only on the type of cause. (2) The direct cause could be discriminated from the other four with an output level of 0.94±0.01. A maximum output height of 0.06 was found among the other four causes. (3) The calculation load, which is represented as the product of learning times and the number of hidden units, did not depend on the
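
    A minimal sketch of the training loop described above (generalized delta rule with a 96-input, small-hidden-layer, 5-output network and the 10% stopping criterion); the hidden-layer size, learning rate and synthetic data are assumptions, not the study's configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out, lr = 96, 8, 5, 0.5   # sizes/rate assumed for illustration

W1 = rng.normal(0, 0.1, (n_in, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_out))
b2 = np.zeros(n_out)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Synthetic stand-in training data: one feature vector per transient,
# labelled with a one-hot vector marking the direct cause.
X = rng.normal(size=(50, n_in))
Y = np.eye(n_out)[rng.integers(0, n_out, size=50)]

for epoch in range(10_000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # stop once every output is within 10% of its target, as in the abstract
    if np.max(np.abs(Y - y)) < 0.10:
        break
    # backward pass (generalized delta rule for sigmoid units, squared error)
    delta_out = (y - Y) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ delta_out / len(X)
    b2 -= lr * delta_out.mean(axis=0)
    W1 -= lr * X.T @ delta_hid / len(X)
    b1 -= lr * delta_hid.mean(axis=0)

print("stopped after", epoch + 1, "epochs; max output error", np.max(np.abs(Y - y)))
```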

  13. Market oriented approach by public utilities

    Energy Technology Data Exchange (ETDEWEB)

    Mantel, J J; Verkuil, J M

    1989-08-01

    Public utilities, especially the larger ones, have an image of being inefficient, technocratic and bureaucratic institutions, unresponsive to modern lifestyles, growing consumerism, differentiated customer needs and changing social values. Improving this image and increasing customer satisfaction requires the adoption of a systematic market oriented approach, based on an appropriate segmentation of the client and general public. This article gives the broad outline of such an approach followed by some generally applicable practical recommendations. Finally it stresses the importance of human aspects of organizational behaviour and, consequently, the crucial part of corporate culture. 2 figs., 1 tab.

  14. Qualitative Methods in Drug Utilization Research

    DEFF Research Database (Denmark)

    Almarsdóttir, Anna Birna; Bastholm Rahmner, Pia

    2016-01-01

    Qualitative research methods derive from the social sciences. Their use in drug utilization research is increasingly widespread, especially in understanding patient and prescriber perspectives. The main focus in qualitative research is exploration of a given phenomenon in order to get a wider understanding of why and how it appears. Qualitative research methods build on various theoretical underpinnings/schools of thought. The same validity and quality criteria cannot be used for both qualitative and quantitative methods.

  15. Utilities' nuclear fuel economic evaluation methods

    International Nuclear Information System (INIS)

    Sonz, L.A.

    1987-01-01

    This paper presents the typical perceptions, methods, considerations, and procedures used by an operating electric utility in the economic evaluation of nuclear fuel preparation and utilization scenarios. The means given are probably not an exclusive review of those available, but are the author's recollection of systems employed to select and recommend preferable courses of action. Economic evaluation of proposed nuclear fuel scenarios is an important, but not exclusive, means of deciding on corporate action. If the economic evaluation is performed and coordinated with the other corporate considerations, such as technical and operational ability, electrical system operations management, tax effects, capital management, rates impact, etc., then the resultant recommendation may be employed to the benefit of the customers and, consequently, to the corporation

  16. A multivariate-utility approach for selection of energy sources

    International Nuclear Information System (INIS)

    Ahmed, S; Husseiny, A.A.

    1978-01-01

    A deterministic approach is devised to compare the safety features of various energy sources. The approach is based on multiattribute utility theory. The method is used in evaluating the safety aspects of alternative energy sources used for the production of electrical energy. Four alternative energy sources are chosen which could be considered for the production of electricity to meet the national energy demand. These are nuclear, coal, solar, and geothermal energy. For simplicity, a total electrical system is considered in each case. A computer code is developed to evaluate the overall utility function for each alternative from the utility patterns corresponding to 23 energy attributes, mostly related to safety. The model can accommodate other attributes assuming that these are independent. The technique is kept flexible so that virtually any decision problem with various attributes can be attacked and optimal decisions can be reached. The selected data resulted in preference of geothermal and nuclear energy over other sources, and the method is found viable in making decisions on energy uses based on quantified and subjective attributes. (author)
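
    A minimal sketch of how an overall utility could be aggregated from single-attribute utilities under the independence assumption mentioned above; the additive form, the three attributes, the weights and the scores are illustrative placeholders (the study used 23 attributes), not the paper's data.

```python
# Minimal additive multiattribute utility sketch (aggregation form and all
# numbers are illustrative assumptions, not the paper's data).
attributes = ["public risk", "occupational risk", "resource availability"]
weights = {"public risk": 0.5, "occupational risk": 0.3, "resource availability": 0.2}

# Single-attribute utilities in [0, 1] for each energy source (placeholders).
utility = {
    "nuclear":    {"public risk": 0.8, "occupational risk": 0.7, "resource availability": 0.6},
    "coal":       {"public risk": 0.4, "occupational risk": 0.5, "resource availability": 0.9},
    "solar":      {"public risk": 0.9, "occupational risk": 0.6, "resource availability": 0.3},
    "geothermal": {"public risk": 0.9, "occupational risk": 0.7, "resource availability": 0.7},
}

def overall_utility(source):
    """Weighted sum of single-attribute utilities (additive model assumed)."""
    return sum(weights[a] * utility[source][a] for a in attributes)

for s in sorted(utility, key=overall_utility, reverse=True):
    print(f"{s:10s} U = {overall_utility(s):.2f}")
```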

  17. Solar energy utilization by physical methods.

    Science.gov (United States)

    Wolf, M

    1974-04-19

    On the basis of the estimated contributions of these differing methods of the utilization of solar energy, their total energy delivery impact on the projected U.S. energy economy (9) can be evaluated (Fig. 5). Despite this late energy impact, the actual sales of solar energy utilization equipment will be significant at an early date. Potential sales in photovoltaic arrays alone could exceed $400 million by 1980, in order to meet the projected capacity buildup (10). Ultimately, the total energy utilization equipment industry should attain an annual sales volume of several tens of billion dollars in the United States, comparable to that of several other energy related industries. Varying amounts of technology development are required to assure the technical and economic feasibility of the different solar energy utilization methods. Several of these developments are far enough along that the paths can be analyzed from the present time to the time of demonstration of technical and economic feasibility, and from there to production and marketing readiness. After that point, a period of market introduction will follow, which will differ in duration according to the type of market addressed. It may be noted that the present rush to find relief from the current energy problem, or to be an early leader in entering a new market, can entail shortcuts in sound engineering practice, particularly in the areas of design for durability and easy maintenance, or of proper application engineering. The result can be loss of customer acceptance, as has been experienced in the past with various products, including solar water heaters. Since this could cause considerable delay in achieving the expected total energy impact, it will be important to spend adequate time at this stage for thorough development. Two other aspects are worth mentioning. The first is concerned with the economic impacts. Upon reflection on this point, one will observe that large-scale solar energy utilization will

  18. Sustainable development through biomass utilization: A practical approach

    Science.gov (United States)

    Ravi Malhotra

    2008-01-01

    (Please note, this is an abstract only) This paper is for folks involved in community development efforts targeted towards biomass utilization. Our approach to evaluate the potential for establishing enterprises that utilize locally available forest resources is tailored specifically to the needs of the local community. We evaluate the: 1. Technical feasibility and...

  19. Methods of fabricating cermet materials and methods of utilizing same

    Science.gov (United States)

    Kong, Peter C.

    2006-04-04

    Methods of fabricating cermet materials and methods of utilizing the same such as in filtering particulate and gaseous pollutants from internal combustion engines having intermetallic and ceramic phases. The cermet material may be made from a transition metal aluminide phase and an alumina phase. The mixture may be pressed to form a green compact body and then heated in a nitrogen-containing atmosphere so as to melt aluminum particles and form the cermet. Filler materials may be added to increase the porosity or tailor the catalytic properties of the cermet material. Additionally, the cermet material may be reinforced with fibers or screens. The cermet material may also be formed so as to pass an electrical current therethrough to heat the material during use.

  20. Methods of producing cermet materials and methods of utilizing same

    Science.gov (United States)

    Kong, Peter C [Idaho Falls, ID]

    2008-12-30

    Methods of fabricating cermet materials and methods of utilizing the same such as in filtering particulate and gaseous pollutants from internal combustion engines having intermetallic and ceramic phases. The cermet material may be made from a transition metal aluminide phase and an alumina phase. The mixture may be pressed to form a green compact body and then heated in a nitrogen-containing atmosphere so as to melt aluminum particles and form the cermet. Filler materials may be added to increase the porosity or tailor the catalytic properties of the cermet material. Additionally, the cermet material may be reinforced with fibers or screens. The cermet material may also be formed so as to pass an electrical current therethrough to heat the material during use.

  1. Barriers to utilization of modern methods of family planning amongst ...

    African Journals Online (AJOL)

    Barriers to utilization of modern methods of family planning amongst women in a ... is recognized by the world health organization (WHO) as a universal human right. ... Conclusion: The study finds numerous barriers to utilization of family ...

  2. Method of utilization of alum shales

    Energy Technology Data Exchange (ETDEWEB)

    Dahlerus, C G

    1908-07-04

    A procedure is given for utilizing the hydrocarbons and tar oils formed, as well as the alkali, nitrogen, and sulfur compounds, by means of reducing smelting of bituminous alum shales in a closed furnace process, with or without the use of additional fuel and without adding lime or other slag-forming material. This is accomplished by making these products follow the furnace gases and later separating them from the gases by cooling for condensation. The patent contains one more claim.

  3. Integrated Transport Planning Framework Involving Combined Utility Regret Approach

    DEFF Research Database (Denmark)

    Wang, Yang; Monzon, Andres; Di Ciommo, Floridea

    2014-01-01

    Sustainable transport planning requires an integrated approach involving strategic planning, impact analysis, and multicriteria evaluation. This study aimed at relaxing the utility-based decision-making assumption by newly embedding anticipated-regret and combined utility regret decision mechanisms in a framework for integrated transport planning. The framework consisted of a two-round Delphi survey, integrated land use and transport model for Madrid, and multicriteria analysis. Results show that (a) the regret-based ranking has a similar mean but larger variance than the utility-based ranking does, (b) the least-regret scenario forms a compromise between the desired and the expected scenarios, (c) the least-regret scenario can lead to higher user benefits in the short term and lower user benefits in the long term, (d) the utility-based, the regret-based, and the combined utility- and regret
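
    A minimal sketch contrasting the three ranking rules compared above; the scenarios, criteria, scores, and the 50/50 combination rule are illustrative assumptions, not the study's formulation.

```python
# Minimal sketch of utility-based, regret-based and combined rankings of
# transport scenarios (all names and numbers are illustrative placeholders).
scores = {                      # per-criterion performance in [0, 1]
    "scenario A": {"accessibility": 0.9, "emissions": 0.3, "cost": 0.6},
    "scenario B": {"accessibility": 0.6, "emissions": 0.7, "cost": 0.7},
    "scenario C": {"accessibility": 0.5, "emissions": 0.8, "cost": 0.5},
}
criteria = ["accessibility", "emissions", "cost"]

def utility(s):                 # simple additive utility (equal weights assumed)
    return sum(scores[s][c] for c in criteria) / len(criteria)

def max_regret(s):              # anticipated (Savage-style) regret across criteria
    return max(max(scores[t][c] for t in scores) - scores[s][c] for c in criteria)

def combined(s):                # assumed blend: reward utility, penalize regret
    return 0.5 * utility(s) - 0.5 * max_regret(s)

for name, key, reverse in [("utility ", utility, True),
                           ("regret  ", max_regret, False),   # lower regret is better
                           ("combined", combined, True)]:
    print(name, sorted(scores, key=key, reverse=reverse))
```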

  4. Evaluating the comparative effectiveness of different demand side interventions to increase maternal health service utilization and practice of birth spacing in South Kivu, Democratic Republic of Congo: an innovative, mixed methods approach.

    Science.gov (United States)

    Dumbaugh, Mari; Bapolisi, Wyvine; van de Weerd, Jennie; Zabiti, Michel; Mommers, Paula; Balaluka, Ghislain Bisimwa; Merten, Sonja

    2017-07-03

    In this protocol we describe a mixed methods study in the province of South Kivu, Democratic Republic of Congo evaluating the effectiveness of different demand side strategies to increase maternal health service utilization and the practice of birth spacing. Conditional service subsidization, conditional cash transfers and non-monetary incentives aim to encourage women to use maternal health services and practice birth spacing in two different health districts. Our methodology will comparatively evaluate the effectiveness of different approaches against each other and no intervention. This study comprises four main research activities: 1) Formative qualitative research to determine feasibility of planned activities and inform development of the quantitative survey; 2) A community-based, longitudinal survey; 3) A retrospective review of health facility records; 4) Qualitative exploration of intervention acceptability and emergent themes through in-depth interviews with program participants, non-participants, their partners and health providers. Female community health workers are engaged as core members of the research team, working in tandem with female survey teams to identify women in the community who meet eligibility criteria. Female community health workers also act as key informants and community entry points during methods design and qualitative exploration. Main study outcomes are completion of antenatal care, institutional delivery, practice of birth spacing, family planning uptake and intervention acceptability in the communities. Qualitative methods also explore decision making around maternal health service use, fertility preference and perceptions of family planning. The innovative mixed methods design allows quantitative data to inform the relationships and phenomena to be explored in qualitative collection. In turn, qualitative findings will be triangulated with quantitative findings. Inspired by the principles of grounded theory, qualitative

  5. NNP-LANL Utilities - Condition Assessment and Project Approach

    Energy Technology Data Exchange (ETDEWEB)

    Stewart, Grant Lorenz [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2017-11-21

    This report is a presentation on LANL Utilities & Transportation Asset Management; Utility Assets Overview; Condition Assessment; Utilities Project Nominations & Ranking; and Utilities Project Execution.

  6. A modified method using a two-port approach for accessing the hilar vasculature without transferring an endostapler from camera port to utility port during thoracoscopic right upper lobectomy.

    Science.gov (United States)

    Jiao, W; Zhao, Y; Xuan, Y; Wang, M

    2015-02-01

    For thoracoscopic upper lobectomies, most cutting endostaplers must be inserted through the camera port when using a two-port approach. Access to the hilar vasculature through only the utility port remains a challenge. In this study, we describe a procedure to access the hilar vasculature without transferring the endostapler site during a thoracoscopic right upper lobectomy. A 2.5-cm anterior utility incision was made in the fourth intercostal space. The posterior mediastinal visceral pleura were dissected to expose the posterior portion of the right upper bronchus and the anterior trunk of the right pulmonary artery. The pleura over the right hilar vasculature were then peeled with an electrocoagulation hook. The anterior trunk of the right pulmonary artery was then transected first with a cutting endostapler introduced through the utility port. This crucial maneuver allowed the endostapler access to the right upper lobe pulmonary vein. The hilar structures were then easily handled in turn. This novel technique was performed successfully in 32 patients, with no perioperative deaths. The average operation time was 120.6 min (range 75-180 min). This novel technique permits effective control of the hilar vessels through the utility port, enabling simple, safe, quick and effective resection.

  7. Towards Multi-Method Research Approach in Empirical Software Engineering

    Science.gov (United States)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

    This paper presents the results of a literature analysis on Empirical Research Approaches in Software Engineering (SE). The analysis explores reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting the research approach. In view of the reasons for the weak utilization of traditional methods, we propose stronger use of a Multi-Method approach with Pragmatism as the philosophical standpoint.

  8. A Novel Approach to Improving Utilization of Laboratory Testing.

    Science.gov (United States)

    Zhou, Yaolin; Procop, Gary W; Riley, Jacquelyn D

    2018-02-01

    The incorporation of best practice guidelines into one's institution is a challenging goal of utilization management, and the successful adoption of such guidelines depends on institutional context. Laboratorians who have access to key clinical data are well positioned to understand existing local practices and promote more appropriate laboratory testing. To apply a novel approach to utilization management, we reviewed international clinical guidelines and current institutional practices to create a reliable mechanism to improve detection and reduce unnecessary tests in our patient population. We targeted a frequently ordered genetic test for HFE-related hereditary hemochromatosis, a disorder of low penetrance. After reviewing international practice guidelines, we evaluated 918 HFE tests and found that all patients with new diagnoses had transferrin saturation levels that were significantly higher than those of patients with nonrisk genotypes (72% versus 42%; P < .001). Our "one-button" order that restricts HFE genetic tests to patients with transferrin saturation greater than 45% is consistent with published practice guidelines and detected 100% of new patients with HFE-related hereditary hemochromatosis. Our proposed algorithm differs from previously published approaches in that it incorporates both clinical practice guidelines and local physician practices, yet requires no additional hands-on effort from pathologists or clinicians. This novel approach to utilization management embraces the role of pathologists as leaders in promoting high-quality patient care in local health care systems.
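
    A minimal sketch of the reflex logic behind the "one-button" order described above, using the 45% transferrin saturation threshold from the abstract; the function name and messages are illustrative, not the laboratory information system's actual interface.

```python
def accept_hfe_order(transferrin_saturation_pct):
    """Reflex rule sketched from the abstract: the HFE genotype test is released
    only when transferrin saturation exceeds 45%. (Illustrative only; not the
    institution's LIS API.)"""
    if transferrin_saturation_pct > 45.0:
        return "perform HFE genotyping"
    return "cancel HFE genotyping; saturation below reflex threshold"

print(accept_hfe_order(72))   # consistent with the new-diagnosis group (mean 72%)
print(accept_hfe_order(42))   # consistent with the nonrisk-genotype group (mean 42%)
```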

  9. IRP methods for Environmental Impact Statements of utility expansion plans

    International Nuclear Information System (INIS)

    Cavallo, J.D.; Hemphill, R.C.; Veselka, T.D.

    1992-01-01

    Most large electric utilities and a growing number of gas utilities in the United States are using a planning method -- Integrated Resource Planning (IRP) -- which incorporates demand-side management (DSM) programs whenever the marginal cost of the DSM programs is lower than the marginal cost of supply-side expansion options. Argonne National Laboratory has applied the IRP method in its socio-economic analysis of an Environmental Impact Statement (EIS) of power marketing for a system of electric utilities in the mountain and western regions of the United States. Applying the IRP method provides valuable information to the participants in an EIS process involving capacity expansion of an electric or gas utility. The major challenges of applying the IRP method within an EIS are the time-consuming and costly task of developing a least-cost expansion path for each alternative, the detailed quantification of environmental damages associated with capacity expansion, and the explicit inclusion of societal impacts on the region.

  10. Utility of three-dimensional method for diagnosing meniscal lesions

    International Nuclear Information System (INIS)

    Ohshima, Suguru; Nomura, Kazutoshi; Hirano, Mako; Hashimoto, Noburo; Fukumoto, Tetsuya; Katahira, Kazuhiro

    1998-01-01

    MRI of the knee is a useful method for diagnosing meniscal tears. Although the spin echo method is usually used for diagnosing meniscal tears, we examined the utility of thin-slice scanning with the three-dimensional method. We reviewed 70 menisci for which arthroscopic findings were confirmed. In this series, sensitivity was 90.9% for medial meniscal injuries and 68.8% for lateral meniscal injuries. There were 3 meniscal tears that we could not detect on preoperative MRI. We could find tears in two of these cases when re-evaluated using the same MRI. In conclusion, we can obtain the same diagnostic rate with the three-dimensional method as with the spin echo method. The scan time of the three-dimensional method is 3 minutes, whereas that of the spin echo method is 17 minutes. Thin-slice scanning with the three-dimensional method is useful for screening meniscal injuries before arthroscopy. (author)

  11. Statistical benchmarking in utility regulation: Role, standards and methods

    International Nuclear Information System (INIS)

    Newton Lowry, Mark; Getachew, Lullit

    2009-01-01

    Statistical benchmarking is being used with increasing frequency around the world in utility rate regulation. We discuss how and where benchmarking is in use for this purpose and the pros and cons of regulatory benchmarking. We then discuss alternative performance standards and benchmarking methods in regulatory applications. We use these to propose guidelines for the appropriate use of benchmarking in the rate setting process. The standards, which we term the competitive market and frontier paradigms, have a bearing on method selection. These along with regulatory experience suggest that benchmarking can either be used for prudence review in regulation or to establish rates or rate setting mechanisms directly

  12. A GA-Based Approach to Hide Sensitive High Utility Itemsets

    Directory of Open Access Journals (Sweden)

    Chun-Wei Lin

    2014-01-01

    A GA-based privacy preserving utility mining method is proposed to find appropriate transactions to be inserted into the database for hiding sensitive high utility itemsets. It maintains the low information loss while providing information to the data demanders and protects the high-risk information in the database. A flexible evaluation function with three factors is designed in the proposed approach to evaluate whether the processed transactions are required to be inserted. Three different weights are, respectively, assigned to the three factors according to users. Moreover, the downward closure property and the prelarge concept are adopted in the proposed approach to reduce the cost of rescanning database, thus speeding up the evaluation process of chromosomes.

  13. Exergy and exergoeconomic analysis of a petroleum refinery utilities plant using the condensing to power method

    Energy Technology Data Exchange (ETDEWEB)

    Mendes da Silva, Julio Augusto; Pellegrini, Luiz Felipe; Oliveira Junior, Silvio [Polytechnic School of the University of Sao Paulo, SP (Brazil)], e-mails: jams@usp.br, luiz.pellegrini@usp.br, soj@usp.br; Plaza, Claudio; Rucker, Claudio [Petrobras - Petroleo Brasileiro S.A., Rio de Janeiro, RJ (Brazil)], e-mails: claudioplaza@petrobras.com.br, rucker@petrobras.com.br

    2010-07-01

    In this paper a brief description of the main processes present in a modern high-capacity refinery is given. The methodology used to evaluate, through exergy analysis, the performance of the refinery's utilities plant is presented, since this plant is responsible for a very considerable amount of the total exergy destruction in a refinery. The exergy unit costs of the utilities plant products (steam, electricity, shaft power and high-pressure water) were determined using an exergoeconomic approach. A simple and effective method called condensing to power was used to define the product of the condensers on an exergy basis. Using this method it is possible to define the product of the condenser without the use of the negentropy concept or the aggregation of condensers to the steam turbines. By using this new approach, the costs obtained for the plant's products are exactly the same as those obtained when the condenser is aggregated to the steam turbine, but with the advantage that the information about the stream between the condenser and the steam turbine is not lost and the condenser can be evaluated separately. The analysis shows that the equipment on which attention and resources should be focused are the boilers followed by the gas turbine, which together are responsible for 80% of the total exergy destruction in the utilities plant. The total exergy efficiency found for the utilities plant studied is 35%, while more than 280 MW of exergy is destroyed in the utilities processes. (author)
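
    For reference, the textbook exergy-balance definitions that the reported 35% efficiency and 280 MW destruction figures rest on are sketched below; the notation is assumed, not taken from the paper.

```latex
% Textbook exergy-balance definitions (notation assumed): for the utilities plant,
\begin{align*}
\dot{E}x_{\mathrm{dest}} &= \sum \dot{E}x_{\mathrm{in}} - \sum \dot{E}x_{\mathrm{out}}, &
\varepsilon &= \frac{\dot{E}x_{\mathrm{products}}}{\dot{E}x_{\mathrm{fuel}}},
\end{align*}
% so a total exergy efficiency of 35% means roughly two thirds of the exergy
% supplied to the utilities plant is destroyed or lost with waste streams.
```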

  14. Utility enterprise solutions - the benefits of an open, integrated approach

    Energy Technology Data Exchange (ETDEWEB)

    Manos, P. [Mincom Inc., Denver, CO (United States)]

    2000-12-31

    An integrated data handling system is discussed, specifically designed to give utilities the flexibility and integration capabilities needed to support their front and back office functions, optimize operations, and deepen collaborative relationships with suppliers as well as customers during the transition from public utilities to a free-market environment. The proposed system provides asset management, materials management, human resources management and finance management modules, integrated with key utility functions such as customer information, geographic information, outage management, switching management, mobile computing, safety/lockout-tag and SCADA. Through the linkage of these systems, all data are available to utility decision-makers on a real-time basis, in the office, in the field or telecommuting from the 'virtual' office. The integrated solution described here will provide higher system reliability, increased responsiveness to customer service requests, optimized engineering analysis work by designers and technical experts, more streamlined job planning, optimization of personnel-related processes and reduction of inventory expenses. By shifting the focus away from chasing paper or worrying about interface performance, and by making asset management the core element of the management information system, utility professionals can concentrate on bottom-line performance and on managing, rather than performing, critical activities.

  15. Control Methods Utilizing Energy Optimizing Schemes in Refrigeration Systems

    DEFF Research Database (Denmark)

    Larsen, L.S; Thybo, C.; Stoustrup, Jakob

    2003-01-01

    The potential energy savings in refrigeration systems using energy optimal control have been proved to be substantial. This, however, requires an intelligent control that drives the refrigeration system towards the energy optimal state. This paper proposes an approach for a control that drives the condenser pressure towards an optimal state. The objective is to present a feasible method that can be used for energy optimizing control. A simulation model of a simple refrigeration system will be used as the basis for testing the control method.

  16. 3D Approach for Representing Uncertainties of Underground Utility Data

    NARCIS (Netherlands)

    olde Scholtenhuis, Léon Luc; Zlatanova, S.; den Duijn, Xander; Lin, Ken-Yu; El-Gohary, Nora; Tang, Pingbo

    Availability of 3D underground information models is key to designing and managing urban infrastructure construction projects. Buried utilities information is often registered by using different types of location data with different uncertainties. These data variances are, however, not considered in

  17. Utilizing Distributed Resources in Smart Grids - A Coordination Approach

    DEFF Research Database (Denmark)

    Juelsgaard, Morten

    2014-01-01

    as well as its limitations. Enforcing coordination through temporal shifts of consumption and production requires the problems we consider to be solved across some predefined time-horizon. Utilizing flexibility of consumers through coordination, is known as demand management, and considers how consumers...

  18. eHealth System for Collecting and Utilizing Patient Reported Outcome Measures for Personalized Treatment and Care (PROMPT-Care) Among Cancer Patients: Mixed Methods Approach to Evaluate Feasibility and Acceptability.

    Science.gov (United States)

    Girgis, Afaf; Durcinoska, Ivana; Levesque, Janelle V; Gerges, Martha; Sandell, Tiffany; Arnold, Anthony; Delaney, Geoff P

    2017-10-02

    Despite accumulating evidence indicating that collecting patient-reported outcomes (PROs) and transferring results to the treating health professional in real time has the potential to improve patient well-being and cancer outcomes, this practice is not widespread. The aim of this study was to test the feasibility and acceptability of PROMPT-Care (Patient Reported Outcome Measures for Personalized Treatment and Care), a newly developed electronic health (eHealth) system that facilitates PRO data capture from cancer patients, data linkage and retrieval to support clinical decisions and patient self-management, and data retrieval to support ongoing evaluation and innovative research. We developed an eHealth system in consultation with content-specific expert advisory groups and tested it with patients receiving treatment or follow-up care in two hospitals in New South Wales, Australia, over a 3-month period. Participants were recruited in clinic and completed self-report Web-based assessments either just before their upcoming clinical consultation or every 4 weeks if in follow-up care. A mixed methods approach was used to evaluate feasibility and acceptability of PROMPT-Care; data collected throughout the study informed the accuracy and completeness of data transfer procedures, and extent of missing data was determined from participants' assessments. Patients participated in cognitive interviews while completing their first assessment and completed evaluation surveys and interviews at study-end to assess system acceptability and usefulness of patient self-management resources, and oncology staff were interviewed at study-end to determine the acceptability and perceived usefulness of real-time PRO reporting. A total of 42 patients consented to the study; 7 patients were withdrawn before starting the intervention primarily because of changes in eligibility. Overall, 35 patients (13 on treatment and 22 in follow-up) completed 67 assessments during the study period. Mean

  19. Total System Performance Assessment-License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2002-09-13

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issue (KTI) agreements, the "Yucca Mountain Review Plan" (CNWRA 2002 [158449]), and 10 CFR Part 63. This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are utilized in this document.

  20. Platelet function testing: methods of assessment and clinical utility.

    LENUS (Irish Health Repository)

    Mylotte, Darren

    2012-02-01

    Platelets play a central role in the regulation of both thrombosis and haemostasis yet tests of platelet function have, until recently, been exclusively used in the diagnosis and management of bleeding disorders. Recent advances have demonstrated the clinical utility of platelet function testing in patients with cardiovascular disease. The ex vivo measurement of response to antiplatelet therapies (aspirin and clopidogrel), by an ever-increasing array of platelet function tests, is with some assays, predictive of adverse clinical events and thus, represents an emerging area of interest for both the clinician and basic scientist. This review article will describe the advantages and disadvantages of the currently available methods of measuring platelet function and discuss both the limitations and emerging data supporting the role of platelet function studies in clinical practice.

  1. Platelet function testing: methods of assessment and clinical utility.

    LENUS (Irish Health Repository)

    Mylotte, Darren

    2011-01-01

    Platelets play a central role in the regulation of both thrombosis and haemostasis yet tests of platelet function have, until recently, been exclusively used in the diagnosis and management of bleeding disorders. Recent advances have demonstrated the clinical utility of platelet function testing in patients with cardiovascular disease. The ex vivo measurement of response to antiplatelet therapies (aspirin and clopidogrel), by an ever-increasing array of platelet function tests, is with some assays, predictive of adverse clinical events and thus, represents an emerging area of interest for both the clinician and basic scientist. This review article will describe the advantages and disadvantages of the currently available methods of measuring platelet function and discuss both the limitations and emerging data supporting the role of platelet function studies in clinical practice.

  2. Cost utility analysis of diagnostic method of syphilis

    Directory of Open Access Journals (Sweden)

    Viroj Wiwanitkit

    2008-04-01

    Presently, the diagnosis of syphilis depends mainly on serological tests. The most widely used screening tests for syphilis are the VDRL and the rapid plasma reagin (RPR) tests, and for confirmation, the fluorescent treponemal antibody (FTA) and the Treponema pallidum hemagglutination (TPHA) tests. The four alternative modes for diagnosis of syphilis can be (a) VDRL + FTA, (b) VDRL + TPHA, (c) RPR + FTA and (d) RPR + TPHA. Here the author reports an evaluation of the cost utility of these tests in medical practice. It is shown that the combination VDRL + TPHA has the lowest cost per accurate diagnosis. Therefore, this alternative is the best method for serological diagnosis of syphilis, based on medical laboratory economics principles.
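
    A minimal sketch of the cost-per-accurate-diagnosis comparison described above; all unit costs and accuracy figures are illustrative placeholders, not the values used in the paper.

```python
# Cost-utility sketch: cost per accurate diagnosis = total test cost / accuracy.
# All numbers below are illustrative placeholders, not the paper's data.
strategies = {
    "VDRL + FTA":  {"cost": 1.0 + 4.0, "accuracy": 0.95},
    "VDRL + TPHA": {"cost": 1.0 + 2.5, "accuracy": 0.96},
    "RPR + FTA":   {"cost": 1.5 + 4.0, "accuracy": 0.95},
    "RPR + TPHA":  {"cost": 1.5 + 2.5, "accuracy": 0.96},
}

def cost_per_accurate_diagnosis(name):
    d = strategies[name]
    return d["cost"] / d["accuracy"]

for name in sorted(strategies, key=cost_per_accurate_diagnosis):
    print(f"{name:12s} {cost_per_accurate_diagnosis(name):.2f} per accurate diagnosis")
```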

  3. Optimally eating a stochastic cake. A recursive utility approach

    International Nuclear Information System (INIS)

    Epaulard, Anne; Pommeret, Aude

    2003-01-01

    In this short paper, uncertainties on the resource stock and on technical progress are introduced into an intertemporal equilibrium model of optimal extraction of a non-renewable resource. The representative consumer maximizes a recursive utility function which disentangles the intertemporal elasticity of substitution from risk aversion. A closed-form solution is derived for both the optimal extraction and price paths. The value of the intertemporal elasticity of substitution relative to unity is then crucial in understanding extraction. Moreover, this model leads to a non-renewable resource price following a geometric Brownian motion.
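
    For reference, the price process the abstract refers to, a geometric Brownian motion, can be written in the standard form below; the drift and volatility symbols are generic notation, not the paper's calibrated parameters.

```latex
% Geometric Brownian motion (drift \mu, volatility \sigma, Wiener process W_t;
% symbols assumed, not the paper's notation):
\[
  dP_t = \mu\, P_t\, dt + \sigma\, P_t\, dW_t
  \quad\Longrightarrow\quad
  P_t = P_0 \exp\!\Big[\big(\mu - \tfrac{1}{2}\sigma^2\big)t + \sigma W_t\Big].
\]
```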

  4. Global hospital bed utilization crisis. A different approach.

    Science.gov (United States)

    Waness, Abdelkarim; Akbar, Jalal U; Kharal, Mubashar; BinSalih, Salih; Harakati, Mohammed

    2010-04-01

    To test the effect of improved physician availability on hospital bed utilization, a prospective cohort study was conducted from 1st January 2009 to 31st March 2009 in the Division of Internal Medicine (DIM), King Abdul-Aziz Medical City (KAMC), Riyadh, Kingdom of Saudi Arabia. Two clinical teaching units (CTUs) were compared head-to-head. Each CTU has 3 consultants. The CTU-control provides standard care, while the CTU-intervention was designed to provide better physician-consultant availability. Three outcomes were evaluated: patient outsourcing to another hospital, patient discharge during weekends, and overall admissions. Statistical analysis was carried out with an electronic statistics calculator from the Center for Evidence-Based Medicine. Three hundred and thirty-four patients were evaluated for admission at the Emergency Room by both CTUs. One hundred and eighty-three patients were seen by the CTU-control; 6 patients were outsourced and 177 were admitted. One hundred and fifty-one patients were seen by the CTU-intervention; 39 of them were outsourced and 112 were admitted. Forty-eight weekend patient discharges occurred during this period: 21 by the CTU-control and 27 by the CTU-intervention. Odds ratio analysis of both the rate of outsourcing and the rate of weekend discharges showed statistical significance in favor of the intervention group. The continuous availability of a physician-consultant for patient admission evaluation, outsourcing, or discharge during regular weekdays and weekends at the DIM, KAMC proved to have a positive impact on bed utilization.

  5. Utilizing 4-H in Afterschool Settings: Two Approaches for Integration

    Directory of Open Access Journals (Sweden)

    Rachel Rudd

    2013-03-01

    As our communities grow and change, afterschool programs represent an avenue to bring resources to populations that would otherwise not have access to them. Combining 4-H with the afterschool environment can be beneficial in supporting and raising the quality of the afterschool programs being offered. This article explores the benefits and challenges of two approaches to implementing 4-H programming in afterschool settings: the 4-H managed program, which is created and run solely by 4-H faculty and staff, and the 4-H afterschool partnership, which is facilitated in partnership with existing afterschool programs. Regardless of the approach, combining 4-H with afterschool programs can strengthen well-established programs and can enhance the quality of all afterschool programs.

  6. Severe accident prevention and mitigation: A utility perspective - EDF approach

    International Nuclear Information System (INIS)

    Vidard, M.

    1998-01-01

    Current plants have excellent safety records and are cost competitive. For future plants, excellence in safety will remain a prerequisite, as will increased cost competitiveness. When contemplating solutions to Severe Accident challenges, cost effectiveness is essential in the decision-making process. This cost effectiveness must be understood not only in terms of capital cost, but also of Operation and Maintenance costs, as well as the absence of additional risks to plant operators. Examples are given to illustrate the recommended approach.

  7. A Practical Approach to Improve Optical Channel Utilization Period for Hybrid FSO/RF Systems

    Directory of Open Access Journals (Sweden)

    Ahmet Akbulut

    2014-01-01

    In hybrid FSO/RF systems, a hard switching mechanism is mostly preferred in case the FSO signal level falls below the predefined threshold. In this work, a computationally simple approach is proposed to increase the utilization of the FSO channel's bandwidth advantage. For the channel, clear-air conditions with atmospheric turbulence have been assumed. In this approach, the FSO bit rate is adaptively changed to achieve the desired BER performance. An IM/DD modulation, OOK (NRZ) format, has been used to show the benefit of the proposed method. Furthermore, to be more realistic with respect to the atmospheric turbulence variations within a day, some experimental observations have been followed up.
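
    A minimal sketch of the adaptation idea described above: estimate the OOK (NRZ) BER from a Q-factor that degrades as the received power drops, and pick the highest bit rate that still meets a target BER. The responsivity, noise density, candidate rates and target are illustrative assumptions, not the paper's link budget.

```python
import math

# Illustrative link-budget assumptions (not the paper's system parameters).
RESPONSIVITY = 0.8            # A/W
NOISE_PSD = 1e-22             # A^2/Hz one-sided thermal-noise current density
RATES_BPS = [2.5e9, 1.25e9, 622e6, 155e6]   # candidate bit rates, high to low
TARGET_BER = 1e-9

def ber_ook(received_power_w, bit_rate):
    """IM/DD OOK BER via the Q-factor: Q = i1/(2*sigma) with i0 = 0 and equal
    Gaussian noise on both rails; BER = 0.5*erfc(Q/sqrt(2)). Noise variance is
    taken to grow with the signal bandwidth, i.e. with the bit rate."""
    i1 = 2 * RESPONSIVITY * received_power_w   # 'on' photocurrent (average power given)
    sigma = math.sqrt(NOISE_PSD * bit_rate)
    q = i1 / (2 * sigma)
    return 0.5 * math.erfc(q / math.sqrt(2))

def select_rate(received_power_w):
    """Pick the highest candidate rate that still meets the BER target."""
    for rate in RATES_BPS:
        if ber_ook(received_power_w, rate) <= TARGET_BER:
            return rate
    return None                                # fall back to the RF link

for p_dbm in [-20, -30, -36, -42]:
    p_w = 10 ** (p_dbm / 10) / 1000
    rate = select_rate(p_w)
    print(f"{p_dbm:4d} dBm ->", f"{rate/1e6:.0f} Mb/s" if rate else "switch to RF")
```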

  8. A proposed method for determining long term optimum utilization ...

    African Journals Online (AJOL)

    Application of utilization intensity as an experimental variable by the use of the pasture disc meter and the put-and-take technique is recommended as an alternative to stocking rate experiments. Utilization intensity allows stocking rate to be used as a measure of pasture productivity and consequently the relationship ...

  9. Maintenance Approaches for Different Production Methods

    Directory of Open Access Journals (Sweden)

    Mungani, Dzivhuluwani Simon

    2013-11-01

    Various production methods are used in industry to manufacture or produce a variety of products needed by industry and consumers. The nature of a product determines which production method is most suitable or cost-effective. A continuous process is typically used to produce large volumes of liquids or gases. Batch processing is often used for small volumes, such as pharmaceutical products. This paper discusses a research project to determine the relationship between maintenance approaches and production methods. A survey was done to determine to what extent three maintenance approaches - reliability-centred maintenance (RCM), total productive maintenance (TPM), and business-centred maintenance (BCM) - are used for three different processing methods (continuous process, batch process, and a production line method).

  10. Socratic Method as an Approach to Teaching

    Directory of Open Access Journals (Sweden)

    Haris Delić

    2016-10-01

    In this article we present a theoretical view of Socrates' life and his method of teaching. After the biographical facts about Socrates, we explain the method he used in teaching and the two main types of this method, the Classic and the Modern Socratic Method. Since the core of Socrates' approach is the dialogue as a form of teaching, we explain how exactly the Socratic dialogue proceeds. Besides that, we present two examples of dialogues that Socrates led, Meno and Gorgias. The Socratic circle is also one of the aspects presented in this paper; it is a seminar format used for group discussions of a given theme. At the end, some disadvantages of the Method are explained. With this paper, the reader can get a conception of this approach to teaching and can use Socrates as an example of how a successful teacher leads his students towards the goal.

  11. Approach to 3D dose verification by utilizing autoactivation

    Energy Technology Data Exchange (ETDEWEB)

    Nakajima, Yasunori, E-mail: yasunori.nkjm@gmail.com [Tokyo Institute of Technology, Yokohama-shi (Japan); Kohno, Toshiyuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Inaniwa, Taku; Sato, Shinji; Yoshida, Eiji; Yamaya, Taiga [National Institute of Radiological Sciences, Chiba-shi (Japan); Tsuruta, Yuki [Tokyo Institute of Technology, Yokohama-shi (Japan); Sihver, Lembit [Chalmers University of Technology, Gothenburg (Sweden)

    2011-08-21

    To evaluate the deposited dose distribution in a target, we have proposed to utilize the annihilation gamma-rays emitted from the positron emitters distributed in the target irradiated with stable heavy-ion beams. Verification of the one-dimensional (1-D) dose distributions along and perpendicular to a beam axis was achieved in our previous works. The purpose of this work is to verify 3-D dose distributions. As a first attempt, uniform PMMA targets in simple rectangular parallelepiped shapes were irradiated, and the annihilation gamma-rays were detected with a PET scanner. By comparing the detected annihilation gamma-ray distributions with the calculated ones, the dose distributions were estimated. As a result, the estimated positions of the distal edges of the dose distributions were in agreement with the measured ones within 1 mm. However, the estimated positions of the proximal edges differed from the measured ones by 5-9 mm, depending on the thickness of the irradiation field.

  12. Introductory geology for elementary education majors utilizing a constructivist approach

    Science.gov (United States)

    Brown, L.M.; Kelso, P.R.; Rexroad, C.B.

    2001-01-01

    "Field Excursions in Earth Science" is designed as a non-prerequisite field-based course for elementary education majors. Classic Canadian Shield and Michigan Basin outcrops and Quaternary features are used to teach those Earth science objectives considered most important for K-8 teachers by the Michigan State Board of Education and by others. We integrated these objectives into five conceptual pathways rather than presenting them as discrete pieces of information. A variety of teaching techniques based on constructivist educational theory are employed, so that pre-service teachers experience active-learning strategies in the context of how science is practiced. Our learning strategies address the cognitive and affective domains and utilize personal experiences in conjunction with pre- and post-experience organizers to allow students to develop individual meanings. We place emphasis on observations and concepts and we encourage students to explain their understanding of concepts verbally and in a variety of written formats. Activities address spatial concepts and map reading; mineral, rock, and fossil identification; formation of rocks; surficial processes and landform development; structural deformation and plate tectonics; and environmental issues. Students keep field notes and have daily projects. They address the pedagogical structure of the course in a daily diary.

  13. Utilizing a disease management approach to improve ESRD patient outcomes.

    Science.gov (United States)

    Anand, Shaan; Nissenson, Allen R

    2002-01-01

    In this era of processes and systems to improve quality, disease management is one methodology to improve care delivery and outcomes for patients with chronic kidney disease (CKD). In most disease management systems a senior renal nurse coordinates all aspects of the patient's care and ensures that the prescribed and necessary care is delivered for both CKD-related and comorbid conditions. The nurse also continually monitors outcomes on quality indicators and key performance measures. These outcome data are then aggregated and analyzed, are compared with local and national benchmarks, and drive the continuous quality improvement (CQI) process. Such a system attempts to centralize the currently fragmented care delivery system, continually improve patient outcomes, and conserve scarce economic resources. Early data suggest a disease management approach may improve both the morbidity and mortality of CKD patients.

  14. An Utilization Method Cooperating ISO drawings and Bookmarks for NDE

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Hyun-Ju; Cho, Chan-Hee; Lee, Tae-Hun [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

    The inspection of weld parts is performed with various non-destructive test methods such as AUT, VT, PT, ECT, etc., and the results have been published in report booklets. Moreover, a huge amount of non-destructive inspection data and reports, produced from the 1978 Kori Unit 1 to the newly constructed power plants, is stored in the management department and DDCC of each power plant. Because the data are not classified, it takes much time to find the corresponding non-destructive test results report of a specific unit of a power plant for a particular year. In addition, human error is possible because each report is written and submitted after manually locating the non-destructive test results for the weld of concern. When the results of a non-destructive inspection of a particular weld are disputed, staff in several locations, such as the control center, the corresponding power plant and CRI, must review the ISO drawing and the associated inspection results together in order to derive a final conclusion. For such a discussion, the person in charge at each site has to locate the ISO drawings, search the past results history reports of non-destructive testing for that weld, and exchange findings by telephone and e-mail, which is a complicated procedure. In this paper, we introduce an approach that links ISO drawings with a bookmark function in order to address these drawbacks. By applying this approach, the time spent manually searching for the relevant ISO drawing when querying past non-destructive inspection results, and the time needed to grasp the content of the non-destructive test results report, are reduced.

  15. Awareness and Utilization of Family Planning Methods among ...

    African Journals Online (AJOL)

    Family planning is an important preventive measure against maternal and child morbidity and mortality. This study was aimed at determining the awareness and utilization of family ...

  16. A method of short range system analysis for nuclear utilities

    International Nuclear Information System (INIS)

    Eng, R.; Mason, E.A.; Benedict, M.

    1976-01-01

    An optimization procedure has been formulated and tested that is capable of solving for the optimal generation schedule of several nuclear power reactors in an electric power utility system, under short-range, resource-limited, conditions. The optimization procedure utilizes a new concept called the Opportunity Cost of Nuclear Power (OCNP) to optimally assign the resource-limited nuclear energy to the different weeks and hours in the short-range planning horizon. OCNP is defined as the cost of displaced energy when optimally distributed nuclear energy is marginally increased. Under resource-limited conditions, the short-range 'value' of nuclear power to a utility system is not its actual generation cost, but the cost of the next best alternative supply of energy, the OCNP. OCNP is a function of a week's system reserve capacity, the system's economic loading order, the customer demand function, and the nature of the available utility system generating units. The optimized OCNP value of the short-range planning period represents the utility's short-range energy replacement cost incurred when selling nuclear energy to a neighbouring utility. (author)

  17. Single Cell Genomics: Approaches and Utility in Immunology

    Science.gov (United States)

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-01-01

    Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population-level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102

  18. Retention of Content Utilizing a Flipped Classroom Approach.

    Science.gov (United States)

    Shatto, Bobbi; LʼEcuyer, Kristine; Quinn, Jerod

    The flipped classroom experience promotes retention and accountability for learning. The authors report their evaluation of a flipped classroom for accelerated second-degree nursing students during their primary medical-surgical nursing course. Standardized HESI® scores were compared between a group of students who experienced the flipped classroom and a previous group who had traditional teaching methods. Short- and long-term retention was measured using standardized exams 3 months and 12 months following the course. Results indicated that short-term retention was greater and long-term retention was significantly greater in the students who were taught using the flipped classroom methodology.

  19. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
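    As an illustration of the permutation idea the monograph is built around, the following sketch runs a two-sample permutation test on a difference in means, relying only on relabelling the data rather than on a theoretical reference distribution. The data and the number of resamples are made up for illustration.

```python
import numpy as np

def permutation_test_mean_diff(x, y, n_resamples=10_000, seed=0):
    """Two-sample permutation test: p-value for the observed absolute difference
    in means, obtained by repeatedly shuffling group labels instead of relying
    on a theoretical (e.g. t) distribution."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_resamples):
        rng.shuffle(pooled)
        diff = abs(pooled[:len(x)].mean() - pooled[len(x):].mean())
        if diff >= observed:
            count += 1
    return (count + 1) / (n_resamples + 1)  # add-one correction

# Made-up example data
group_a = [12.1, 11.4, 13.0, 12.8, 11.9]
group_b = [10.2, 10.9, 11.1, 10.5, 11.3]
print(permutation_test_mean_diff(group_a, group_b))
```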

  20. Single-Cell Genomics: Approaches and Utility in Immunology.

    Science.gov (United States)

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-02-01

    Single-cell genomics offers powerful tools for studying immune cells, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single-cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single-cell RNA-sequencing data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Near-Port Air Quality Assessment Utilizing a Mobile Monitoring Approach

    Data.gov (United States)

    U.S. Environmental Protection Agency — Near-Port Air Quality Assessment Utilizing a Mobile Monitoring Approach. This dataset is associated with the following publication: Steffens, J., S. Kimbrough, R....

  2. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presented a computational intelligence approach for minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work included NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function aiming at estimating NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air velocities (PA) and six levels of secondary air velocities (SA) were regulated by using particle swarm optimization (PSO) so as to achieve low NOx emissions combustion. To reduce the computing time, a more flexible stopping condition was used to improve the computational efficiency without loss of quality in the optimization results. The results showed that the proposed approach provided an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO was that the computational cost, typically less than 25 s on a PC, is much less than that required for ACO. This means the proposed approach is more applicable to online and real-time applications for NOx emissions minimization in actual power plant boilers.
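    A highly simplified sketch of the two-step scheme described in the abstract: fit an SVR surrogate of NOx emissions from operating parameters, then use a basic particle swarm to search the adjustable air-velocity settings that minimize the surrogate's prediction. The synthetic data, the split into fixed and adjustable parameters, and all PSO constants are illustrative assumptions, not values from the paper.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Synthetic stand-in for boiler records: 6 adjustable air velocities + 3 fixed loads.
X = rng.uniform(0.0, 1.0, size=(400, 9))
nox = 300 + 80 * np.sin(3 * X[:, :6]).sum(axis=1) + 20 * X[:, 6:].sum(axis=1)
nox += rng.normal(0, 5, size=400)

model = SVR(C=10.0, epsilon=1.0).fit(X, nox)  # surrogate of NOx characteristics

def predicted_nox(air_settings, fixed_part):
    x = np.concatenate([air_settings, fixed_part])[None, :]
    return model.predict(x)[0]

def pso_minimize(objective, dim, bounds=(0.0, 1.0), n_particles=30, iters=100):
    """Bare-bones particle swarm optimization with inertia and the usual
    cognitive/social pulls; returns the best position found and its value."""
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, objective(gbest)

fixed_load = np.array([0.5, 0.5, 0.5])  # current, non-adjustable operating point
best_air, best_nox = pso_minimize(lambda a: predicted_nox(a, fixed_load), dim=6)
print(best_air.round(2), round(best_nox, 1))
```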

  3. Reactor technology assessment and selection utilizing systems engineering approach

    Science.gov (United States)

    Zolkaffly, Muhammed Zulfakar; Han, Ki-In

    2014-02-01

    The first nuclear power plant (NPP) deployment in a country is a complex process that needs to consider technical, economic and financial aspects along with other aspects such as public acceptance. Increased interest in the deployment of new NPPs, both among newcomer countries and those with expanding programs, necessitates the selection of a reactor technology from among commercially available technologies. This paper reviews the Systems Decision Process (SDP) of Systems Engineering and applies it to selecting the most appropriate reactor technology for deployment in Malaysia. The integrated qualitative and quantitative analyses employed in the SDP are explored to perform reactor technology assessment and to select the most feasible technology, whose design must also comply with the IAEA standard requirements and other relevant requirements established in this study. A quick Malaysian case study suggests that the country should proceed with PWR (pressurized water reactor) technologies, with a more detailed study to be performed in the future for the selection of the most appropriate reactor technology for Malaysia. The demonstrated technology assessment also proposes an alternative method to systematically and quantitatively select the most appropriate reactor technology.
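    A very small sketch of the quantitative side of a systems-engineering trade study like the one described: candidate reactor technologies are scored on weighted criteria and ranked by total value. The criteria, weights and scores below are invented placeholders, not the study's actual assessment, and the SDP's qualitative screening against IAEA requirements is not represented.

```python
# Illustrative weighted-sum value model for a reactor technology trade study.
criteria_weights = {"safety": 0.35, "economics": 0.25, "licensability": 0.2,
                    "localization": 0.1, "operating_experience": 0.1}

# Scores on a 1-10 scale (entirely made up for illustration).
candidates = {
    "PWR-A": {"safety": 8, "economics": 7, "licensability": 9, "localization": 6, "operating_experience": 9},
    "PWR-B": {"safety": 8, "economics": 8, "licensability": 7, "localization": 7, "operating_experience": 8},
    "BWR-C": {"safety": 7, "economics": 7, "licensability": 7, "localization": 5, "operating_experience": 8},
}

def total_value(scores):
    """Weighted-sum value of one candidate's criterion scores."""
    return sum(criteria_weights[c] * s for c, s in scores.items())

ranking = sorted(candidates, key=lambda name: total_value(candidates[name]), reverse=True)
for name in ranking:
    print(f"{name}: {total_value(candidates[name]):.2f}")
```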

  4. Etiology of cutaneous vasculitis: utility of a systemic approach

    Science.gov (United States)

    Chanussot-Deprez, Caroline; Vega-Memije, María Elisa; Flores-Suárez, Luis; Ríos-Romero, Celia; Cabiedes-Contreras, Javier; Reyes, Edgardo; Rangel-Gamboa, Lucia

    2018-01-01

    Cutaneous vasculitis (CV) represents a diagnostic challenge; it occurs as a primary cutaneous disorder or as a manifestation of other entities. Objective: to search for the cause of CV. Methods: patients with CV were prospectively evaluated. In all patients, skin biopsies were taken, and direct immunofluorescence was done in most of the patients. American College of Rheumatology (ACR) and Chapel Hill Consensus Conference (CHCC) criteria were used for classification. Results: 32 patients were studied, with a female predominance (71.8%). Children presented drug-associated CV or Schönlein-Henoch purpura (SHP). Adults presented more frequently with SHP, systemic lupus erythematosus or paraneoplastic vasculitis; other diagnoses, such as polyarteritis nodosa, microscopic polyangiitis, thrombotic vasculitis (post-puerperal), antiphospholipid syndrome, Churg-Strauss syndrome, and drug-associated CV, were also present. Using the ACR and CHCC criteria, 50% of cases were classified. In our institution, during this work, the rate of etiologic diagnosis of CV more than doubled. However, in the case of HSV or LA and SHP, none of the proposed criteria had high specificity; other parameters were used to discern between them. Six patients remained unclassified. In our view, cryoglobulins and hepatitis serology do not seem useful unless the patient's history supports the need for them. Unclassified patients were followed up closely for 2 years. Copyright: © 2018 Secretaría de Salud

  5. New pricing approaches for bundled payments: Leveraging clinical standards and regional variations to target avoidable utilization.

    Science.gov (United States)

    Hellsten, Erik; Chu, Scally; Crump, R Trafford; Yu, Kevin; Sutherland, Jason M

    2016-03-01

    Develop pricing models for bundled payments that draw inputs from clinician-defined best practice standards and benchmarks set from regional variations in utilization. Health care utilization and claims data for a cohort of incident Ontario ischemic and hemorrhagic stroke episodes. Episodes of care are created by linking incident stroke hospitalizations with subsequent health service utilization across multiple datasets. Costs are estimated for episodes of care and constituent service components using setting-specific case mix methodologies and provincial fee schedules. Costs are estimated for five areas of potentially avoidable utilization, derived from best practice standards set by an expert panel of stroke clinicians. Alternative approaches for setting normative prices for stroke episodes are developed using measures of potentially avoidable utilization and benchmarks established by the best performing regions. There are wide regional variations in the utilization of different health services within episodes of stroke care. Reconciling the best practice standards with regional utilization identifies significant amounts of potentially avoidable utilization. Normative pricing models for stroke episodes result in increasingly aggressive redistributions of funding. Bundled payment pilots to date have been based on the costs of historical service patterns, which effectively 'bake in' unwarranted and inefficient variations in utilization. This study demonstrates the feasibility of novel clinically informed episode pricing approaches that leverage these variations to target reductions in potentially avoidable utilization. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
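    A toy sketch of the pricing logic described above: an episode price is formed by taking the average historical episode cost and removing a share of the utilization, in selected service categories, that exceeds a benchmark set by the best-performing region. The data, category names, and redistribution share are invented for illustration and do not reproduce the study's case-mix costing.

```python
import pandas as pd

# Invented per-region average spending (per episode) in categories an expert
# panel might flag as potentially avoidable, plus the non-targeted base cost.
regions = pd.DataFrame(
    {"region": ["A", "B", "C", "D"],
     "readmission": [1200.0, 900.0, 1500.0, 800.0],
     "prolonged_alc": [2100.0, 1700.0, 2600.0, 1600.0],
     "base_cost": [14000.0, 13500.0, 14800.0, 13200.0]}
).set_index("region")

avoidable_cols = ["readmission", "prolonged_alc"]

# Benchmark each avoidable category at the best-performing (lowest-cost) region.
benchmark = regions[avoidable_cols].min()

# Potentially avoidable utilization = spending above the benchmark.
pau = (regions[avoidable_cols] - benchmark).clip(lower=0)

# Normative price: historical mean cost minus a share of the avoidable spending.
reduction_share = 0.5  # how aggressively the payer redistributes funding
historical_price = (regions["base_cost"] + regions[avoidable_cols].sum(axis=1)).mean()
normative_price = historical_price - reduction_share * pau.sum(axis=1).mean()
print(round(historical_price, 2), round(normative_price, 2))
```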

  6. Bionic Design Methods - A practical approach

    DEFF Research Database (Denmark)

    Kepler, Jørgen Asbøll; Stokholm, Marianne Denise J.

    2004-01-01

    Nature has served as inspiration for product design throughout history. Applications range from poetic translations of form to utilization of primary functional principles. This paper describes a generally applicable design methodology for transforming natural functional principles into feasible product design. From a formulation of design demands, which need not necessarily be very precise, the approach continues with a study of natural objects (animals, plants) which are subject to the same demands. From this study, the working principle(s) are derived. This (these) are then clarified through illustrative models, which should be simplified as much as possible. The simplified principle may now be evaluated and transformed into practical design. The methodology is clarified through examples taken from a series of extended workshops held at Aalborg University.

  7. Thermal Unit Commitment Scheduling Problem in Utility System by Tabu Search Embedded Genetic Algorithm Method

    Directory of Open Access Journals (Sweden)

    C. Christober Asir Rajan

    2008-06-01

    Full Text Available The objective of this paper is to find a generation schedule such that the total operating cost is minimized, when subjected to a variety of constraints. This also means that it is desirable to find the optimal unit commitment in the power system for the next H hours. A 66-bus utility power system in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for different IEEE test systems consisting of 24, 57 and 175 buses. Numerical results are shown comparing the cost solutions and computation time obtained by different intelligence and conventional methods.

  8. Density meters utilizing ionizing radiation: definitions and test methods

    International Nuclear Information System (INIS)

    Anon.

    1981-01-01

    This standard is applicable to density meters utilizing ionizing radiation, designed for the measurement of the density of liquids, slurries or fluidized solids. The standard applies to transmission-type instruments only. Reference to compliance with this standard shall identify any deviations and the reasons for such deviations. Safety aspects are not included but should fulfill the requirements of all relevant internationally accepted standards

  9. Measuring the Capacity Utilization of Public District Hospitals in Tunisia: Using Dual Data Envelopment Analysis Approach

    Directory of Open Access Journals (Sweden)

    Chokri Arfa

    2017-01-01

    Full Text Available Background: Public district hospitals (PDHs) in Tunisia are not operating at full plant capacity and underutilize their operating budget. Methods: Individual PDH capacity utilization (CU) is measured for 2000 and 2010 using a dual data envelopment analysis (DEA) approach with shadow price restrictions on inputs and outputs. The CU is estimated for 101 of 105 PDHs in 2000 and 94 of 105 PDHs in 2010. Results: On average, unused capacity is estimated at 18% in 2010 vs. 13% in 2000. 26% of PDHs underutilized their operating budget in 2010 vs. 21% in 2000. Conclusion: Inadequate supply, health quality and the lack of operating budget should be tackled to reduce unmet users' needs and the bypassing of the PDHs, and thus to increase their CU. Social health insurance should be turned into a direct purchaser of curative and preventive care for the PDHs.

  10. Methods of determining incremental energy costs for economic dispatch and inter-utility interchange in Canadian utilities

    International Nuclear Information System (INIS)

    El-Hawary, M.E.; El-Hawary, F.; Mbamalu, G.A.N.

    1991-01-01

    A questionnaire was mailed to ten Canadian utilities to determine the methods the utilities use in determining the incremental cost of delivering energy at any time. The questionnaire was divided into three parts: generation, transmission and general. The generation section dealt with heat rates, fuel, operation and maintenance, startup and shutdown, and method of prioritizing and economic evaluation of interchange transactions. Transmission dealt with inclusion of transmission system incremental maintenance costs, and transmission losses determination. The general section dealt with incremental costs aspects, and various other economic considerations. A summary is presented of responses to the questionnaire

  11. Method for utilizing decay heat from radioactive nuclear wastes

    International Nuclear Information System (INIS)

    Busey, H.M.

    1974-01-01

    Management of radioactive heat-producing waste material while safely utilizing the heat thereof is accomplished by encapsulating the wastes after a cooling period, transporting the capsules to a facility including a plurality of vertically disposed storage tubes, lowering the capsules as they arrive at the facility into the storage tubes, cooling the storage tubes by circulating a gas thereover, employing the so heated gas to obtain an economically beneficial result, and continually adding waste capsules to the facility as they arrive thereat over a substantial period of time

  12. Data-driven approach for assessing utility of medical tests using electronic medical records.

    Science.gov (United States)

    Skrøvseth, Stein Olav; Augestad, Knut Magne; Ebadollahi, Shahram

    2015-02-01

    To precisely define the utility of tests in a clinical pathway through data-driven analysis of the electronic medical record (EMR). The information content was defined in terms of the entropy of the expected value of the test related to a given outcome. A kernel density classifier was used to estimate the necessary distributions. To validate the method, we used data from the EMR of the gastrointestinal department at a university hospital. Blood tests from patients undergoing gastrointestinal surgery were analyzed with respect to a second surgery within 30 days of the index surgery. The information content is clearly reflected in the patient pathway for certain combinations of tests and outcomes. C-reactive protein tests coupled to anastomosis leakage, a severe complication, show a clear pattern of information gain through the patient trajectory, where the greatest gain from the test is 3-4 days post index surgery. We have defined the information content in a data-driven and information-theoretic way such that the utility of a test can be precisely defined. The results reflect clinical knowledge. In the cases we used, the tests carry little negative impact. The general approach can be expanded to cases that carry a substantial negative impact, such as certain radiological techniques. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
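    A compact sketch of the information-theoretic idea described above: the value of a test is taken as the expected reduction in outcome entropy after observing the test result, with class-conditional result distributions estimated by a kernel density estimator. The simulated CRP-like values and the outcome labels are synthetic, and the estimator details are an assumption rather than the paper's exact implementation.

```python
import numpy as np
from scipy.stats import gaussian_kde

def expected_information_gain(values, outcome, grid_size=200):
    """Expected entropy reduction (in bits) about a binary outcome from one test,
    using kernel density estimates of the class-conditional value distributions."""
    values, outcome = np.asarray(values, float), np.asarray(outcome, bool)
    prior = outcome.mean()
    h_prior = -(prior * np.log2(prior) + (1 - prior) * np.log2(1 - prior))

    kde_pos = gaussian_kde(values[outcome])
    kde_neg = gaussian_kde(values[~outcome])
    grid = np.linspace(values.min(), values.max(), grid_size)

    p_pos = kde_pos(grid) * prior
    p_neg = kde_neg(grid) * (1 - prior)
    p_val = p_pos + p_neg
    post = np.clip(p_pos / p_val, 1e-12, 1 - 1e-12)   # P(outcome | test value)
    h_post = -(post * np.log2(post) + (1 - post) * np.log2(1 - post))

    # Average posterior entropy, weighted by how likely each test value is.
    dx = grid[1] - grid[0]
    weights = p_val / (p_val.sum() * dx)
    h_expected = (weights * h_post).sum() * dx
    return h_prior - h_expected

# Synthetic example: a marker that runs higher in patients who need reoperation.
rng = np.random.default_rng(0)
needs_reop = rng.random(500) < 0.15
marker = np.where(needs_reop, rng.normal(180, 40, 500), rng.normal(90, 30, 500))
print(round(expected_information_gain(marker, needs_reop), 3))
```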

  13. SPRERI experience on holistic approach to utilize all parts of Jatropha curcas fruit for energy

    Energy Technology Data Exchange (ETDEWEB)

    Singh, R.N.; Vyas, D.K.; Srivastava, N.S.L.; Narra, Madhuri [Thermochemical Conversion Division, Sardar Patel Renewable Energy Research Institute, Vallabh Vidyanagar 388 120, Gujarat (India)

    2008-08-15

    Freshly harvested, dried Jatropha fruit contains about 35-40% shell and 60-65% seed (by weight). The fruits are 2.5 cm long, ovoid, black and have 2-3 halves. There are nearly 400-425 fruits per kg and 1580-1600 seeds per kg; the weight of 100 seeds is about 63 g. Jatropha shells are available after de-shelling of the Jatropha fruit, while Jatropha seed husks are available after decortication of Jatropha seed for oil extraction. Seed contains about 40-42% husk/hull and 58-60% kernels. The kernels contain about 50% oil. If the oil is extracted by the solvent method the oil recovery is more than 95%, but in a mechanical expeller the oil recovery is only about 85%. If 100 kg of seed is expelled by an expeller, it will give about 28-30 kg of oil. While a lot of emphasis is being given to the use of bio-diesel, which is only about 17-18% of the dry fruit, not much attention is being given to utilizing the other components of the fruit for energy purposes. At SPRERI a holistic approach has been taken to utilize all components of the Jatropha fruit - shell for combustion, hull/husk for gasification, oil and bio-diesel for running CI engines, cake for production of biogas and spent slurry as manure - and it has been found that all components of the Jatropha curcas fruit can be utilized efficiently for energy purposes. This paper gives detailed information on the use of different components of J. curcas fruit for energy purposes. (author)
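    A small worked mass balance using the composition figures quoted above (35-40% shell in the fruit, 40-42% husk in the seed, about 50% oil in the kernel, about 85% mechanical recovery), simply to make the arithmetic behind the "17-18% of the dry fruit" statement explicit. The midpoint values chosen below are my own simplification of the quoted ranges.

```python
# Rough mass balance for 100 kg of dry Jatropha fruit, using midpoints of the
# ranges quoted in the abstract (assumed, for illustration only).
fruit_kg = 100.0
shell_frac = 0.375        # 35-40% of fruit is shell
seed_frac = 1 - shell_frac
husk_frac_of_seed = 0.41  # 40-42% of seed is husk/hull
kernel_frac_of_seed = 1 - husk_frac_of_seed
oil_frac_of_kernel = 0.50
expeller_recovery = 0.85  # mechanical expeller recovers ~85% of the oil

shell_kg = fruit_kg * shell_frac                      # -> combustion
seed_kg = fruit_kg * seed_frac
husk_kg = seed_kg * husk_frac_of_seed                 # -> gasification
kernel_kg = seed_kg * kernel_frac_of_seed
oil_kg = kernel_kg * oil_frac_of_kernel * expeller_recovery  # -> biodiesel
cake_kg = seed_kg - husk_kg - oil_kg                  # -> biogas, then manure

print(f"shell {shell_kg:.1f} kg, husk {husk_kg:.1f} kg, "
      f"oil {oil_kg:.1f} kg ({oil_kg / fruit_kg:.0%} of fruit), cake {cake_kg:.1f} kg")
```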

  14. Analysis method and utilization mechanism of the overall value of EV charging

    International Nuclear Information System (INIS)

    Guo, Chunlin; Chan, Ching Chuen

    2015-01-01

    Highlights: • Analysis of the overall value of EV charging from a system viewpoint. • An analytical model of the overall value of EV charging is presented. • A model is proposed to calculate the value of emission reduction by EVs. • A model to evaluate the improvement in new energy utilization is given. • A utilization mechanism apt to overall optimization is proposed. - Abstract: Electric Vehicles (EVs) can save energy while reducing emissions and have thus attracted the attention of both academics and industry. The cost and benefit of charging are among the key issues in relation to EV development that have been researched extensively. However, many studies are carried out from the viewpoint of local entities rather than a global system, focus on specific types or aspects of EV charging, or use mixed models that can only be computed by computer simulation and lack physical transparency. This paper shows that it is necessary to consider the value of EV charging on a system scale. To achieve this, it presents an analytical model for analyzing the overall value of EVs, an analysis model to evaluate the reduction of pollution relevant to photovoltaic power, and a model to transfer the intrinsic savings of wind power to the off-peak charging loads. It is estimated that EV charging has a significant positive value, providing the basis for enhanced EV subsidies. Accordingly, a utilization mechanism apt to global optimization is proposed, upon which sustainable business models can be formed by providing adequate support, including the implementation of a peak-valley tariff, charging subsidies and one-time battery subsidies. This utilization mechanism, by taking full advantage of the operation system of power utilities to provide basic support and service, may provide new approaches to the development of EVs. The method proposed here is of important value for the systematic considerations about EV development and maybe can help broaden the

  15. A PRACTICAL APPROACH TO THE GROUND OSCILLATION VELOCITY MEASUREMENT METHOD

    Directory of Open Access Journals (Sweden)

    Siniša Stanković

    2017-01-01

    Full Text Available The use of an explosive's energy during blasting produces undesired effects on the environment. The seismic influence of a blast, as a major undesired effect, is addressed by many national standards, recommendations and calculations in which the main parameter is the ground oscillation velocity at the field measurement location. There are a few approaches and methods for calculating expected ground oscillation velocities according to charge weight per delay and the distance from the blast to the point of interest. Use of these methods and formulas does not provide satisfactory results, so the values measured at various distances from the blast field differ, more or less, from the values given by prior calculations. Since blasting works are executed in diverse geological conditions, the aim of this research is the development of a practical and reliable approach which will give a different model for each construction site where blasting works have been or will be executed. The approach is based on a greater number of measuring points in line from the blast field at predetermined distances. This new approach has been compared with other generally used methods and formulas through the use of measurements taken during the research along with measurements from several previously executed projects. The results confirmed that the suggested model gives more accurate values.
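    For context, here is a brief sketch of the conventional site-law fitting that approaches like this one are compared against: a power-law attenuation PPV = K·SD^(-b) is fitted by least squares in log-log space to peak particle velocities measured at several distances, with SD the square-root scaled distance. The measurement values below are invented, and the paper's own multi-point, per-site model is not reproduced here.

```python
import numpy as np

def fit_attenuation_law(distances_m, charges_kg, ppv_mm_s):
    """Fit PPV = K * (D / sqrt(Q))**(-b) by least squares on log-transformed data.
    Returns (K, b)."""
    sd = np.asarray(distances_m, float) / np.sqrt(np.asarray(charges_kg, float))
    A = np.column_stack([np.ones_like(sd), -np.log(sd)])
    coef, *_ = np.linalg.lstsq(A, np.log(np.asarray(ppv_mm_s, float)), rcond=None)
    return float(np.exp(coef[0])), float(coef[1])

def predict_ppv(K, b, distance_m, charge_kg):
    """Predicted peak particle velocity at a given distance and charge per delay."""
    return K * (distance_m / np.sqrt(charge_kg)) ** (-b)

# Invented field measurements: PPV recorded in line from the blast at set distances.
dist = [25, 50, 100, 200, 400]        # m
charge = [20, 20, 20, 20, 20]         # charge weight per delay, kg
ppv = [48.0, 19.5, 8.1, 3.2, 1.4]     # mm/s

K, b = fit_attenuation_law(dist, charge, ppv)
print(f"K={K:.1f}, b={b:.2f}, predicted PPV at 150 m: {predict_ppv(K, b, 150, 20):.1f} mm/s")
```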

  16. The clinical utility of adding a lateral approach to the conventional vertical approach for prone stereotactic vacuum-assisted breast biopsy

    International Nuclear Information System (INIS)

    Myong, Joo Hwa; Kang, Bong Joo; Yoon, Soo Kyung; Kim, Sung Hun; An, Yeong Yi

    2013-01-01

    The purpose of this study is to evaluate the clinical utility of adding a lateral approach to the conventional vertical approach for prone stereotactic vacuum-assisted breast biopsies. From April 2010 to May 2012, 130 vacuum-assisted stereotactic biopsies were attempted in 127 patients. While a vertical approach was preferred, a lateral approach was used if the vertical approach failed. The success rate of biopsies utilizing only the vertical approach was compared with that using both the vertical and lateral approaches, and the breast thickness was measured and compared between the two approaches. In addition, the pathology results were evaluated and the causes of the failed biopsies were analyzed. Of the 130 cases, 127 biopsies were performed and 3 biopsies failed. The success rate of the vertical approach was 83.8% (109/130); however, when the lateral approach was also used, the success rate increased to 97.7% (127/130) (p = 0.0004). The mean breast thickness was 2.7 ± 1 cm for the lateral approach and 4 ± 1.2 cm for the vertical approach (p < 0.0001). The histopathologic results in 76 (59.8%) of the biopsies were benign, 23 (18.1%) were high-risk lesions, and 28 (22.0%) were malignant. The causes of biopsy failure were thin breasts (n = 2) and an undetected, difficult lesion location (n = 1). The addition of a lateral approach to the conventional vertical approach in prone stereotactic vacuum-assisted breast biopsy improved the success rate of stereotactic biopsy, especially in patients with thin breasts.

  17. Predictive Utility of Personality Disorder in Depression: Comparison of Outcomes and Taxonomic Approach.

    Science.gov (United States)

    Newton-Howes, Giles; Mulder, Roger; Ellis, Pete M; Boden, Joseph M; Joyce, Peter

    2017-09-19

    There is debate around the best model for diagnosing personality disorder, both in terms of its relationship to the empirical data and clinical utility. Four randomized controlled trials examining various treatments for depression were analyzed at an individual patient level. Three different approaches to the diagnosis of personality disorder were analyzed in these patients. A total of 578 depressed patients were included in the analysis. Personality disorder, however measured, was of little predictive utility in the short term but added significantly to predictive modelling of medium-term outcomes, accounting for more than twice as much of the variance in social functioning outcome as depression psychopathology. Personality disorder assessment is of predictive utility with longer timeframes and when considering social outcomes as opposed to symptom counts. This utility is sufficiently great that there appears to be value in assessing personality; however, no particular approach outperforms any other.

  18. Neurobiological studies of risk assessment: A comparison of expected utility and mean-variance approaches

    OpenAIRE

    d'Acremont, M.; Bossaerts, Peter

    2008-01-01

    When modeling valuation under uncertainty, economists generally prefer expected utility because it has an axiomatic foundation, meaning that the resulting choices will satisfy a number of rationality requirements. In expected utility theory, values are computed by multiplying probabilities of each possible state of nature by the payoff in that state and summing the results. The drawback of this approach is that all state probabilities need to be dealt with separately, which becomes extremely ...

  19. The marginal utility of money: A modern Marshallian approach to consumer choice

    OpenAIRE

    Friedman, Daniel; Sákovics, József

    2011-01-01

    We reformulate neoclassical consumer choice by focusing on λ, the marginal utility of money. As the opportunity cost of current expenditure, λ is approximated by the slope of the indirect utility function of the continuation. We argue that λ can largely supplant the role of an arbitrary budget constraint in partial equilibrium analysis. The result is a better grounded, more flexible and more intuitive approach to consumer choice.

  20. Robust steganographic method utilizing properties of MJPEG compression standard

    Directory of Open Access Journals (Sweden)

    Jakub Oravec

    2015-06-01

    Full Text Available This article presents the design of a steganographic method which uses a video container as cover data. The video track was recorded by a webcam and further encoded with the MJPEG compression standard. The proposed method also takes into account the effects of lossy compression. The embedding process is realized by switching the places of transform coefficients, which are computed by the Discrete Cosine Transform. The article discusses the possibilities, the techniques used, and the advantages and drawbacks of the chosen solution. The results are presented at the end of the article.
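    A minimal sketch of the coefficient-swapping idea outlined in the abstract, applied to a single 8x8 block: one payload bit is encoded in the relative order of two mid-frequency DCT coefficients, with a margin so the ordering is likely to survive mild quantization. The coefficient pair, the block data and the robustness margin are illustrative assumptions; the paper's full MJPEG pipeline (per-frame embedding, quantization, entropy coding) is not reproduced.

```python
import numpy as np
from scipy.fftpack import dct, idct

# Two mid-frequency positions in the 8x8 DCT block (an assumed, illustrative pair).
POS_A, POS_B = (2, 3), (3, 2)
MARGIN = 4.0  # enforce a gap so the ordering survives mild re-quantization

def dct2(block):
    return dct(dct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def idct2(block):
    return idct(idct(block, axis=0, norm="ortho"), axis=1, norm="ortho")

def embed_bit(block_8x8, bit):
    """Encode a bit as the ordering of two DCT coefficients: bit 1 -> A > B."""
    c = dct2(block_8x8.astype(float))
    a, b = c[POS_A], c[POS_B]
    lo, hi = min(a, b), max(a, b)
    if hi - lo < MARGIN:
        hi = lo + MARGIN
    c[POS_A], c[POS_B] = (hi, lo) if bit else (lo, hi)
    return idct2(c)

def extract_bit(block_8x8):
    c = dct2(block_8x8.astype(float))
    return int(c[POS_A] > c[POS_B])

rng = np.random.default_rng(0)
block = rng.integers(0, 256, (8, 8)).astype(float)
stego = embed_bit(block, 1)
print(extract_bit(stego))   # -> 1
```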

  1. Neutron diffraction utilizing the T-O-F method

    Energy Technology Data Exchange (ETDEWEB)

    Niimura, N [Tohoku Univ., Sendai (Japan). Lab. of Nuclear Science

    1974-12-01

    Characteristic features of TOF (time of flight) neutron diffraction are summarized. In this method, i) all the reciprocal points on the rod passing through the origin in reciprocal space can be scanned by each burst of white neutrons, ii) it is easy to measure high-index reflections at large scattering angles, iii) each reflection is not affected by higher-order harmonics, and iv) it is easy to measure physical properties that depend on the neutron wavelength. The pulsed neutron generator as well as the data acquisition system in the Laboratory of Nuclear Science of Tohoku University is described. The TOF method seems to be very powerful when applied to accurate structure analysis. The data correction methods are discussed. The TOF method is also promising for the study of transient phenomena: one can apply to the crystalline sample an external field pulsed with the same frequency as the neutrons. By using this method, the transient state of the polarization reversal of the ferroelectric NaNO2 has been observed. The magnetically pulsed neutron TOF spectrometer is briefly introduced after a review of the chopper history.
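    To make the wavelength dependence concrete, here is a short, self-contained conversion from neutron time of flight to de Broglie wavelength and, via Bragg's law, to the lattice spacing probed at a given scattering angle. The flight path, arrival time and angle used are arbitrary example numbers, not parameters of the Tohoku instrument.

```python
import math
from scipy.constants import h, m_n  # Planck constant, neutron mass

def tof_to_wavelength(flight_time_s, flight_path_m):
    """De Broglie wavelength (in angstroms) from time of flight over a known path:
    lambda = h * t / (m_n * L)."""
    return h * flight_time_s / (m_n * flight_path_m) * 1e10

def bragg_d_spacing(wavelength_angstrom, two_theta_deg, order=1):
    """Lattice spacing from Bragg's law: n * lambda = 2 d sin(theta)."""
    theta = math.radians(two_theta_deg / 2.0)
    return order * wavelength_angstrom / (2.0 * math.sin(theta))

# Example: a neutron arriving 5 ms after the burst over a 10 m flight path,
# detected at a 150 degree scattering angle.
lam = tof_to_wavelength(5e-3, 10.0)
print(f"lambda = {lam:.2f} A, d = {bragg_d_spacing(lam, 150.0):.2f} A")
```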

  2. A self-reference PRF-shift MR thermometry method utilizing the phase gradient

    International Nuclear Information System (INIS)

    Langley, Jason; Potter, William; Phipps, Corey; Zhao Qun; Huang Feng

    2011-01-01

    In magnetic resonance (MR) imaging, the most widely used and accurate method for measuring temperature is based on the shift in proton resonance frequency (PRF). However, inter-scan motion and bulk magnetic field shifts can lead to inaccurate temperature measurements with the PRF-shift MR thermometry method. The self-reference PRF-shift MR thermometry method was introduced to overcome such problems by deriving a reference image from the heated or treated image itself, approximating the reference phase map with low-order polynomial functions. In this note, a new approach is presented to calculate the baseline phase map in self-reference PRF-shift MR thermometry. The proposed method utilizes the phase gradient to remove the phase unwrapping step inherent to other self-reference PRF-shift MR thermometry methods. The performance of the proposed method was evaluated using numerical simulations with temperature distributions following a two-dimensional Gaussian function as well as phantom and in vivo experimental data sets. The results from both the numerical simulations and experimental data show that the proposed method is a promising technique for measuring temperature. (note)
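    A small sketch of the basic PRF-shift relation underlying the note, ΔT = Δφ / (2π · α · γ · B0 · TE), applied voxel-wise to a phase-difference map. The constants are commonly used values (α ≈ -0.01 ppm/°C, γ/2π ≈ 42.58 MHz/T) and the phase maps are synthetic; the note's actual contribution, building the reference from the phase gradient without unwrapping, is not reproduced here.

```python
import numpy as np

GAMMA_HZ_PER_T = 42.58e6   # proton gyromagnetic ratio / 2*pi
ALPHA_PPM_PER_C = -0.01    # PRF thermal coefficient (commonly assumed value)

def prf_temperature_change(phase_heated, phase_reference, b0_tesla, te_seconds):
    """Voxel-wise temperature change (deg C) from a phase difference map:
    dT = dphi / (2*pi * alpha * gamma * B0 * TE)."""
    dphi = phase_heated - phase_reference
    denom = 2 * np.pi * (ALPHA_PPM_PER_C * 1e-6) * GAMMA_HZ_PER_T * b0_tesla * te_seconds
    return dphi / denom

# Synthetic example: a Gaussian hot spot at 3 T, TE = 15 ms.
y, x = np.mgrid[-32:32, -32:32]
true_dt = 8.0 * np.exp(-(x**2 + y**2) / (2 * 6.0**2))
dphi = 2 * np.pi * (ALPHA_PPM_PER_C * 1e-6) * GAMMA_HZ_PER_T * 3.0 * 0.015 * true_dt
est_dt = prf_temperature_change(dphi, np.zeros_like(dphi), b0_tesla=3.0, te_seconds=0.015)
print(round(float(est_dt.max()), 2))  # ~8.0 deg C at the hot-spot centre
```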

  3. When homogeneity meets heterogeneity: the geographically weighted regression with spatial lag approach to prenatal care utilization

    Science.gov (United States)

    Shoff, Carla; Chen, Vivian Yi-Ju; Yang, Tse-Chuan

    2014-01-01

    Using geographically weighted regression (GWR), a recent study by Shoff and colleagues (2012) investigated the place-specific risk factors for prenatal care utilization in the US and found that most of the relationships between late or no prenatal care and its determinants are spatially heterogeneous. However, the GWR approach may be subject to the confounding effect of spatial homogeneity. The goal of this study is to address this concern by including both spatial homogeneity and heterogeneity in the analysis. Specifically, we employ an analytic framework in which a spatially lagged (SL) effect of the dependent variable is incorporated into the GWR model, called GWR-SL. Using this innovative framework, we found evidence to argue that spatial homogeneity was neglected in the study by Shoff et al. (2012) and that the results change after considering the spatially lagged effect of prenatal care utilization. The GWR-SL approach allows us to gain a place-specific understanding of prenatal care utilization in US counties. In addition, we compared the GWR-SL results with the results of conventional approaches (i.e., OLS and spatial lag models) and found that GWR-SL is the preferred modeling approach. The new findings help us to better estimate how the predictors are associated with prenatal care utilization across space, and to determine whether and how the level of prenatal care utilization in neighboring counties matters. PMID:24893033

  4. The Utility of Synthetic-based Approach of Writing among Iranian EFL Learners

    Directory of Open Access Journals (Sweden)

    Nasrin Derakhshandeh

    2014-05-01

    Full Text Available The present study examines the utility of a synthetic-based approach versus traditional approaches to writing among Iranian EFL learners. To this end, ninety students at the upper-intermediate level were randomly chosen from the English-learner population of the Kish and Gooyesh English Institutes. The students were divided into three groups. Group 1 was asked to do a writing task based on the product-based approach. A writing task based on the process-oriented approach was administered to Group 2; Group 3 was then invited to write a composition to assess their performance based on the synthetic-based approach. The results of the t test and two-way ANOVA revealed that the students performed better in writing using the synthetic approach rather than the traditional approaches to writing.

  5. Method for enhancing microbial utilization rates of gases using perfluorocarbons

    Science.gov (United States)

    Turick, C.E.

    1997-06-10

    A method of enhancing the bacterial reduction of industrial gases using perfluorocarbons (PFCs) is disclosed. Because perfluorocarbons (PFCs) allow for a much greater solubility of gases than water does, PFCs have the potential to deliver gases in higher concentrations to microorganisms when used as an additive to microbial growth media thereby increasing the rate of the industrial gas conversion to economically viable chemicals and gases. 3 figs.

  6. Drug utilization according to reason for prescribing: a pharmacoepidemiologic method based on an indication hierarchy

    DEFF Research Database (Denmark)

    Kildemoes, Helle Wallach; Hendriksen, Carsten; Morten, Andersen

    2011-01-01

    ABSTRACT Purpose To develop a pharmacoepidemiologic method for drug utilization analysis according to indication, gender, and age by means of register-based information. Statin utilization in 2005 was applied as an example. Methods Following the recommendations for statin therapy, we constructed ...

  7. An effective utilization management strategy by dual approach of influencing physician ordering and gate keeping.

    Science.gov (United States)

    Elnenaei, Manal O; Campbell, Samuel G; Thoni, Andrea J; Lou, Amy; Crocker, Bryan D; Nassar, Bassam A

    2016-02-01

    There is increasing recognition of the importance of appropriate laboratory test utilization. We investigate the effect of a multifaceted educational approach that includes physician feedback on individual test ordering, in conjunction with targeted restriction, on the utilization of selected laboratory tests. Scientific evidence was compiled on the usefulness and limitations of tests suspected of being over utilized in our laboratories. A variety of approaches were used to deliver education on each of the targeted tests, with greater focus on primary care physicians (PCPs). Feedback on requesting behavior of these tests was also communicated to the latter group which included an educational component. Laboratory based restriction of testing was also exercised, including the unbundling of our electrolyte panel. PCP requesting patterns for the selected tests were found to be markedly skewed. The interventions implemented over the study period resulted in a substantial 51% reduction in overall ordering of five of the targeted tests equating to an annual marginal cost saving of $60,124. Unbundling of the electrolyte panel resulted in marginal cost savings that equated annually to $42,500 on chloride and $48,000 on total CO2. A multifaceted educational approach combined with feedback on utilization and laboratory driven gate-keeping significantly reduced the number of laboratory tests suspected of being redundant or unjustifiably requested. Laboratory professionals are well positioned to manage demand on laboratory tests by utilizing evidence base in developing specific test ordering directives and gate-keeping rules. Crown Copyright © 2015. Published by Elsevier Inc. All rights reserved.

  8. Methods and systems for utilizing carbide lime or slag

    Science.gov (United States)

    Devenney, Martin; Fernandez, Miguel; Chen, Irvin; Calas, Guillaume; Weiss, Michael Joseph; Tester, Chantel Cabrera

    2018-02-27

    Provided herein are methods comprising a) treating a slag solid or carbide lime suspension with an ammonium salt in water to produce an aqueous solution comprising calcium salt, ammonium salt, and solids; b) contacting the aqueous solution with carbon dioxide from an industrial process under one or more precipitation conditions to produce a precipitation material comprising calcium carbonate and a supernatant aqueous solution wherein the precipitation material and the supernatant aqueous solution comprise residual ammonium salt; and c) removing and optionally recovering ammonia and/or ammonium salt using one or more steps of (i) recovering a gas exhaust stream comprising ammonia during the treating and/or the contacting step; (ii) recovering the residual ammonium salt from the supernatant aqueous solution; and (iii) removing and optionally recovering the residual ammonium salt from the precipitation material.

  9. Utilizing Mass Customization Methods for Modular Manufacturing System Design

    DEFF Research Database (Denmark)

    Jørgensen, Steffen; Jacobsen, Alexia; Nielsen, Kjeld

    2011-01-01

    Markets today have become dynamic and demand rapid product changes, product variety, and customized products. Operating under, and taking advantage of, such conditions requires, amongst other aspects, manufacturing processes robust to product changes - a contradiction to traditional manufacturing systems developed as dedicated engineer-to-order solutions, tailored to production of a specific product or a limited product assortment. In response, modular manufacturing concepts are evolving, which are aimed at possessing the needed responsiveness and at being the manufacturing paradigm of Mass Customization (MC). Research focus has been on the basic principles and enabling technologies, while modular architectures and system design have received less attention. A potential to fill these gaps by applying selected design theories and methods of MC has been seen. Based on a communality

  10. Economic Valuation on Change of Tourism Quality in Rawapening, Indonesia: An Application of Random Utility Method

    Science.gov (United States)

    Subanti, S.; Irawan, B. R. M. B.; Sasongko, G.; Hakim, A. R.

    2017-04-01

    This study aims to determine the profit (or loss) earned by economic actors in tourism activities if the condition or quality of tourism in Rawapening is improved (or deteriorates). Changes in condition or quality are described in terms of travel expenses, the natural environment, Japanese cultural performances, and traditional markets. The method used to measure the change in economic benefit or loss is a random utility approach. The study found that travel cost, the natural environment, Japanese cultural performances, and traditional markets are significant factors in respondents' preferences for choosing a change in tourism conditions. The value of compensation received by visitors as a result of improved conditions is 2,932 billion, while for worsened conditions it is 2,628 billion. The recommendation of this study is that the local government should consider environmental factors in the formulation of tourism development in Rawapening.
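    A compact illustration of the random utility (conditional logit) logic behind welfare estimates of this kind: given assumed coefficients on travel cost and site-quality attributes, the expected welfare change from a quality change is the difference in "log-sum" terms divided by the marginal utility of money (the negative of the cost coefficient). All coefficients and attribute levels below are invented for illustration and are not the study's estimates.

```python
import numpy as np

def logsum(V):
    """Expected maximum utility (up to a constant) of a choice set with
    representative utilities V under a conditional logit model."""
    return np.log(np.exp(np.asarray(V, float)).sum())

def welfare_change(V_before, V_after, cost_coef):
    """Per-trip compensating variation: (logsum_after - logsum_before) / (-beta_cost)."""
    return (logsum(V_after) - logsum(V_before)) / (-cost_coef)

# Invented utilities for two alternatives: visit Rawapening vs. a substitute site.
beta_cost, beta_quality = -0.04, 0.9       # assumed preference parameters
cost = np.array([50.0, 80.0])              # travel cost (arbitrary currency units)
quality_now = np.array([0.4, 0.7])         # index of environment/culture/market quality
quality_improved = np.array([0.8, 0.7])    # Rawapening's quality improves

V_now = beta_cost * cost + beta_quality * quality_now
V_new = beta_cost * cost + beta_quality * quality_improved
print(round(welfare_change(V_now, V_new, beta_cost), 2))  # welfare gain per trip
```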

  11. Utilization of the 40Ar/39Ar method in polycyclic rocks

    International Nuclear Information System (INIS)

    Kinoshita, H.

    1976-01-01

    This work presents the results of twelve radiometric analyses by the 40Ar/39Ar method on biotites, amphiboles and plagioclases of rocks from the Penhinha region. Nine determinations were made by the stepwise heating technique and three by total melting of the sample. The different apparent ages of the minerals (515-1200 My), obtained by the stepwise heating technique, seem to imply a complex geologic history for the region, and confirm the existence of older, pre-Brazilian emplacement nuclei affected by at least two orogenic cycles, the first pre-Brazilian and the second Brazilian. On the other hand, the age determinations obtained by melting of plagioclases (∼ 730 My) agreed with the conventional K-Ar apparent ages, a fact with no geological meaning. The preliminary results show that the stepwise heating technique may become a valuable tool in the study of polycyclic samples, because it permits distinguishing between disturbed and undisturbed rocks. The same cannot be said of the melting technique; in this latter case the results can be used only comparatively. (author) [pt

  12. Utilization of Selected Data Mining Methods for Communication Network Analysis

    Directory of Open Access Journals (Sweden)

    V. Ondryhal

    2011-06-01

    Full Text Available The aim of the project was to analyze the behavior of military communication networks based on work with real data collected continuously since 2005. With regard to the nature and amount of the data, data mining methods were selected for the purpose of analyses and experiments. The quality of real data is often insufficient for an immediate analysis. The article presents the data cleaning operations which have been carried out with the aim to improve the input data sample to obtain reliable models. Gradually, by means of properly chosen SW, network models were developed to verify generally valid patterns of network behavior as a bulk service. Furthermore, unlike the commercially available communication networks simulators, the models designed allowed us to capture nonstandard models of network behavior under an increased load, verify the correct sizing of the network to the increased load, and thus test its reliability. Finally, based on previous experience, the models enabled us to predict emergency situations with a reasonable accuracy.

  13. Analysis of the methods utilized in OXIDE-3

    International Nuclear Information System (INIS)

    Skalyo, J. Jr.; Epel, L.G.; Sastre, C.

    1978-03-01

    OXIDE-3 is an evolving code developed to analyze the transient response of certain state variables of a High Temperature Gas Cooled Reactor (HTGR) during an accident involving the inleakage of steam and/or air into the helium primary coolant system. Primary tasks of the code are to calculate the primary coolant constituents as a function of time, their resultant chemical interaction with the graphite fuel elements, and their possible egress into the containment building. The report takes a critical look at certain aspects of the problem solving methods implemented in OXIDE-3 and gives estimates of the expected accuracy. Attendant to the latter finding, some of the calculated output may require careful interpretation since programmatical warnings are not given when an accuracy limitation is exceeded. The code has been used at BNL in an investigation to calculate the full power steady state impurity concentrations in the primary coolant system as a function of steam leak rate, steam graphite reaction rate, and the effective diffusion constant of steam in graphite. The results are in reasonable agreement with those obtained from the steady state oxidation code GOPTWO

  14. The impact of ageing and changing utilization patterns on future cardiovascular drug expenditure: a pharmacoepidemiological projection approach

    DEFF Research Database (Denmark)

    Kildemoes, Helle Wallach; Andersen, Morten; Støvring, Henrik

    2010-01-01

    To develop a method for projecting the impact of ageing and changing drug utilization patterns on future drug expenditure.

  15. Modified Dempster-Shafer approach using an expected utility interval decision rule

    Science.gov (United States)

    Cheaito, Ali; Lecours, Michael; Bosse, Eloi

    1999-03-01

    The combination operation of the conventional Dempster-Shafer algorithm has a tendency to increase exponentially the number of propositions involved in bodies of evidence by creating new ones. The aim of this paper is to explore a 'modified Dempster-Shafer' approach to fusing identity declarations emanating from different sources, which include a number of radars, IFF and ESM systems, in order to limit the explosion of the number of propositions. We use a non-ad hoc decision rule based on the expected utility interval to select the most probable object in a comprehensive Platform Data Base containing all the possible identity values that a potential target may take. We study the effect of the redistribution of the confidence levels of the eliminated propositions, which otherwise overload the real-time data fusion system; these eliminated confidence levels can in particular be assigned to ignorance, or uniformly added to the remaining propositions and to ignorance. A scenario has been selected to demonstrate the performance of our modified Dempster-Shafer method of evidential reasoning.
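    A minimal sketch of the evidential machinery referred to above: Dempster's rule of combination for two mass functions over a small frame of identities, followed by the belief-plausibility interval for each singleton, which is the kind of interval an expected-utility-interval decision rule would operate on. The frame, the masses and the sensor labels are illustrative; the paper's specific utility definitions and proposition-pruning scheme are not reproduced.

```python
def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements are frozensets."""
    combined, conflict = {}, 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

def belief_plausibility(mass, hypothesis):
    """Belief sums mass fully inside the hypothesis; plausibility sums mass compatible with it."""
    bel = sum(v for k, v in mass.items() if k <= hypothesis)
    pl = sum(v for k, v in mass.items() if k & hypothesis)
    return bel, pl

FRAME = frozenset({"F-16", "MiG-29", "B-747"})  # illustrative platform identities
radar = {frozenset({"F-16", "MiG-29"}): 0.7, FRAME: 0.3}           # "fighter-sized"
esm   = {frozenset({"F-16"}): 0.6, frozenset({"MiG-29"}): 0.1, FRAME: 0.3}

fused = dempster_combine(radar, esm)
for ident in FRAME:
    bel, pl = belief_plausibility(fused, frozenset({ident}))
    print(f"{ident}: [{bel:.2f}, {pl:.2f}]")
```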

  16. Neurobiological studies of risk assessment: a comparison of expected utility and mean-variance approaches.

    Science.gov (United States)

    D'Acremont, Mathieu; Bossaerts, Peter

    2008-12-01

    When modeling valuation under uncertainty, economists generally prefer expected utility because it has an axiomatic foundation, meaning that the resulting choices will satisfy a number of rationality requirements. In expected utility theory, values are computed by multiplying probabilities of each possible state of nature by the payoff in that state and summing the results. The drawback of this approach is that all state probabilities need to be dealt with separately, which becomes extremely cumbersome when it comes to learning. Finance academics and professionals, however, prefer to value risky prospects in terms of a trade-off between expected reward and risk, where the latter is usually measured in terms of reward variance. This mean-variance approach is fast and simple and greatly facilitates learning, but it impedes assigning values to new gambles on the basis of those of known ones. To date, it is unclear whether the human brain computes values in accordance with expected utility theory or with mean-variance analysis. In this article, we discuss the theoretical and empirical arguments that favor one or the other theory. We also propose a new experimental paradigm that could determine whether the human brain follows the expected utility or the mean-variance approach. Behavioral results of implementation of the paradigm are discussed.
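    A tiny numerical contrast of the two valuation schemes discussed above, for a single two-outcome gamble: expected utility sums probability-weighted utilities of the payoffs (here with an assumed log utility), whereas the mean-variance value trades expected payoff against payoff variance with an assumed risk-aversion weight. Payoffs, probabilities, the utility function and the risk weight are all illustrative.

```python
import math

def expected_utility(payoffs, probs, utility=math.log):
    """EU: probability-weighted sum of utilities of each state's payoff."""
    return sum(p * utility(x) for x, p in zip(payoffs, probs))

def mean_variance_value(payoffs, probs, risk_weight=0.01):
    """Mean-variance valuation: expected payoff minus a penalty on its variance."""
    mean = sum(p * x for x, p in zip(payoffs, probs))
    var = sum(p * (x - mean) ** 2 for x, p in zip(payoffs, probs))
    return mean - risk_weight * var

gamble = ([150.0, 60.0], [0.5, 0.5])   # win 150 or 60 with equal probability
print(round(expected_utility(*gamble), 3))      # utility units (log payoff)
print(round(mean_variance_value(*gamble), 2))   # payoff units, risk-penalized
```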

  17. The Effects of Rising Interest Rates on Electric Utility Stock Prices: Regulatory Considerations and Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Kihm, Steve [Seventhwave, Madison, WI (United States); Satchwell, Andrew [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cappers, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-26

    This technical brief identifies conditions under which utility regulators should consider implementing policy approaches that seek to mitigate negative outcomes due to an increase in interest rates. Interest rates are a key factor in determining a utility’s cost of equity and investors find value when returns exceed the cost of equity. Through historical observations of periods of rising and falling interest rates and application of a pro forma financial tool, we identify the key drivers of utility stock valuations and estimate the degree to which those valuations might be affected by increasing interest rates. We also analyze the efficacy of responses by utility regulators to mitigate potential negative financial impacts. We find that regulators have several possible approaches to mitigate a decline in value in an environment of increasing interest rates, though regulators must weigh the tradeoffs of improving investor value with potential increases in customer costs. Furthermore, the range of approaches reflects today’s many different electric utility regulatory models and regulatory responses to a decline in investor value will fit within state-specific models.

  18. Cluster approach to the development of housing services and public utilities in the region

    Directory of Open Access Journals (Sweden)

    Sergey Ivanovich Bazhenov

    2012-03-01

    Full Text Available The ongoing crisis in the housing services and public utilities sphere determines the need to accelerate its integration into the market space. The author proposes to apply cluster initiatives, which, in his opinion, allow housing services and public utilities problems to be solved in terms of a broader vision. This paper attempts to highlight the benefits of the cluster approach to the development of housing services and public utilities and identifies the main provisions of its implementation. The essence of the concept of «housing services and social cluster of the region» is revealed in terms of housing services and public utilities development; the members of the cluster unification are designated; its components are identified; and the purpose of introducing the cluster model of housing services and public utilities reform management is determined, the essence of which is to change the mentality of producers and consumers of housing services and public utilities towards greater responsibility and respect for mutual interests in the market of housing services and public utilities. The main provisions and principles of the formation of a housing services and social cluster of the region are reviewed, as well as the characteristics and trends of its development. An authorial approach to developing a strategy for forming a housing services and social cluster in the region in accordance with modern trends is presented. The leading role of several factors in establishing the basic prerequisites for sustainable operation of the housing services and social cluster of the region is justified. These factors include governmental regulation of pricing and forms of support for small entrepreneurship development, creation of a financial security system, development of public-private partnerships and implementation of innovative technologies. The role of non-governmental organizations and public associations in the formation of

  19. MOLECULAR APPROACHES FOR IN SITU IDENTIFICATION OF NITRATE UTILIZATION BY MARINE BACTERIA AND PHYTOPLANKTON

    Energy Technology Data Exchange (ETDEWEB)

    Frischer, Marc E. [Skidaway Institute of Oceanography; Verity, Peter G.; Gilligan, Mathew R.; Bronk, Deborah A.; Zehr, Jonathan P.; Booth, Melissa G.

    2013-09-12

    Traditionally, the importance of inorganic nitrogen (N) for the nutrition and growth of marine phytoplankton has been recognized, while inorganic N utilization by bacteria has received less attention. Likewise, organic N has been thought to be important for heterotrophic organisms but not for phytoplankton. However, accumulating evidence suggests that bacteria compete with phytoplankton for nitrate (NO3-) and other N species. The consequences of this competition may have a profound effect on the flux of N, and therefore carbon (C), in ocean margins. Because it has been difficult to differentiate between N uptake by heterotrophic bacterioplankton versus autotrophic phytoplankton, the processes that control N utilization, and the consequences of these competitive interactions, have traditionally been difficult to study. Significant bacterial utilization of DIN may have a profound effect on the flux of N and C in the water column because sinks for dissolved N that do not incorporate inorganic C represent mechanisms that reduce the atmospheric CO2 drawdown via the 'biological pump' and limit the flux of POC from the euphotic zone. This project was active over the period of 1998-2007 with support from the DOE Biotechnology Investigations - Ocean Margins Program (BI-OMP). Over this period we developed a tool kit of molecular methods (PCR, RT-PCR, Q-PCR, QRT-PCR, and TRFLP) and combined isotope mass spectrometry and flow-cytometric approaches that allow selective isolation, characterization, and study of the diversity and genetic expression (mRNA) of the structural gene responsible for the assimilation of NO3- by heterotrophic bacteria (nasA). As a result of these studies we discovered that bacteria capable of assimilating NO3- are ubiquitous in marine waters, that the nasA gene is expressed in these environments, that heterotrophic bacteria can account for a significant fraction of total DIN uptake in different ocean margin systems, that the expression of nasA is

  20. Comparison of candidate solar array maximum power utilization approaches. [for spacecraft propulsion

    Science.gov (United States)

    Costogue, E. N.; Lindena, S.

    1976-01-01

    A study was made of five potential approaches that can be utilized to detect the maximum power point of a solar array while sustaining operations at or near maximum power and without endangering stability or causing array voltage collapse. The approaches studied included: (1) dynamic impedance comparator, (2) reference array measurement, (3) onset of solar array voltage collapse detection, (4) parallel tracker, and (5) direct measurement. The study analyzed the feasibility and adaptability of these approaches to a future solar electric propulsion (SEP) mission, and, specifically, to a comet rendezvous mission. Such missions presented the most challenging requirements to a spacecraft power subsystem in terms of power management over large solar intensity ranges of 1.0 to 3.5 AU. The dynamic impedance approach was found to have the highest figure of merit, and the reference array approach followed closely behind. The results are applicable to terrestrial solar power systems as well as to other than SEP space missions.
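
As a rough illustration of the first approach listed (the dynamic impedance comparator), the sketch below locates the maximum power point by comparing the array's dynamic impedance |dV/dI| with its static impedance V/I, which are equal at the maximum power point; the I-V curve and its shape are invented placeholders, and this is only one possible reading of that approach.

```python
import numpy as np

def mpp_residual(v, i):
    """Residual |dV/dI| - V/I: positive below the MPP voltage, negative above,
    and zero at the maximum power point of a typical I-V curve."""
    dv_di = np.gradient(v, i)      # dynamic impedance along the I-V curve
    return np.abs(dv_di) - v / i   # compare with the static impedance

# Invented I-V curve of a small array (current falls off as voltage rises).
v = np.linspace(1.0, 20.0, 40)
i = 5.0 * (1.0 - (v / 21.0) ** 8)  # crude single-diode-like shape

residual = mpp_residual(v, i)
v_mpp = v[np.argmin(np.abs(residual))]
print(f"Estimated maximum power point voltage: {v_mpp:.1f} V")
```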

  1. Assessment of management approaches in a public water utility: A case study of the Namibia water corporation (NAMWATER)

    Science.gov (United States)

    Ndokosho, Johnson; Hoko, Zvikomborero; Makurira, Hodson

    More than 90% of urban water supply and sanitation services in developing countries are provided by public organizations. However, public provision of services has been inherently inefficient. As a result, a number of initiatives have emerged in recent years with a common goal of improving service delivery. In Namibia, the water sector reform resulted in the creation of a public utility called the Namibia Water Corporation (NAMWATER), which is responsible for bulk water supply countrywide. Since its inception in 1998, NAMWATER has been experiencing poor financial performance. This paper presents the findings of a case study that compared the management approaches of NAMWATER to the New Public Management (NPM) paradigm. The focus of the NPM approach is for the public water sector to mirror private sector methods of management so that public utilities can accrue the benefits of effectiveness, efficiency and flexibility often associated with the private sector. The study tools used were a combination of literature review, interviews and questionnaires. It was found that NAMWATER has a high degree of autonomy in its operations, albeit with government-approved tariffs and externally sourced financing. The utility reports to government annually to account for results. The utility embraces a notion of good corporate culture and adheres to sound management practices. NAMWATER demonstrated a strong market orientation, indicated by the outsourcing of non-core functions, but benchmarking was poorly done. NAMWATER’s customer orientation is poor, as evidenced by the lack of customer care facilities. NAMWATER’s senior management delegated operational authority to lower management to facilitate flexibility and eliminate bottlenecks. The lower management is in turn held accountable for performance by the senior management. There are no robust methods of ensuring sufficient accountability, indicated by the absence of performance contracts or service level agreements. It was concluded that

  2. Tank Operations Contract Construction Management Methodology. Utilizing The Agency Method Of Construction Management

    International Nuclear Information System (INIS)

    Lesko, K.F.; Berriochoa, M.V.

    2010-01-01

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the WRPS contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) Contracts typically emphasize small business awards. As an integral part of Nuclear Project Management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method (John E Schaufelberger, Len Holm, 'Management of Construction Projects, A Constructor's Perspective', University of Washington, Prentice Hall 2002). This method was implemented in the first quarter of Fiscal Year 2009 (FY2009), where Construction Management is performed by substantially home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven Construction Managers and Field Leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are subcontracted

  3. TANK OPERATIONS CONTRACT CONSTRUCTION MANAGEMENT METHODOLOGY UTILIZING THE AGENCY METHOD OF CONSTRUCTION MANAGEMENT

    Energy Technology Data Exchange (ETDEWEB)

    LESKO KF; BERRIOCHOA MV

    2010-02-26

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high hazard work in a safe and productive manner. Prior to the WRPS contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) Contracts typically emphasize small business awards. As an integral part of Nuclear Project Management at Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method (John E Schaufelberger, Len Holm, "Management of Construction Projects, A Constructor's Perspective", University of Washington, Prentice Hall 2002). This method was implemented in the first quarter of Fiscal Year 2009 (FY2009), where Construction Management is performed by substantially home office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven Construction Managers and Field Leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are

  4. An approach to multi-attribute utility analysis under parametric uncertainty

    International Nuclear Information System (INIS)

    Kelly, M.; Thorne, M.C.

    2001-01-01

    The techniques of cost-benefit analysis and multi-attribute analysis provide a useful basis for informing decisions in situations where a number of potentially conflicting opinions or interests need to be considered, and where there are a number of possible decisions that could be adopted. When the input data to such decision-making processes are uniquely specified, cost-benefit analysis and multi-attribute utility analysis provide unambiguous guidance on the preferred decision option. However, when the data are not uniquely specified, application and interpretation of these techniques is more complex. Herein, an approach to multi-attribute utility analysis (and hence, as a special case, cost-benefit analysis) when input data are subject to parametric uncertainty is presented. The approach is based on the use of a Monte Carlo technique, and has recently been applied to options for the remediation of former uranium mining liabilities in a number of Central and Eastern European States
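
A minimal sketch of how such a Monte Carlo treatment might look: attribute scores for each remediation option are sampled from assumed distributions, aggregated with an additive multi-attribute utility, and the frequency with which each option is preferred is reported. The options, weights and triangular distributions are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative options and attribute weights (cost, dose reduction, disruption).
options = ["cap in place", "partial removal", "full removal"]
weights = np.array([0.4, 0.4, 0.2])

# Assumed triangular distributions (low, mode, high) of each normalized
# attribute score, per option: shape (options, attributes, 3).
params = np.array([
    [[0.7, 0.8, 0.9], [0.2, 0.3, 0.5], [0.8, 0.9, 1.0]],
    [[0.4, 0.5, 0.7], [0.5, 0.6, 0.8], [0.4, 0.5, 0.7]],
    [[0.1, 0.2, 0.4], [0.8, 0.9, 1.0], [0.1, 0.2, 0.3]],
])

n_trials = 10_000
wins = np.zeros(len(options))
for _ in range(n_trials):
    # Sample every attribute score and aggregate with the additive model.
    scores = rng.triangular(params[..., 0], params[..., 1], params[..., 2])
    utilities = scores @ weights
    wins[np.argmax(utilities)] += 1

for name, w in zip(options, wins):
    print(f"{name}: preferred in {100 * w / n_trials:.1f}% of trials")
```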

  5. A quantitative approach to choose among multiple mutually exclusive decisions: comparative expected utility theory

    OpenAIRE

    Zhu, Pengyu

    2018-01-01

    Mutually exclusive decisions have been studied for decades. Many well-known decision theories have been defined to help people either to make rational decisions or to interpret people's behaviors, such as expected utility theory, regret theory, prospect theory, and so on. The paper argues that none of these decision theories are designed to provide practical, normative and quantitative approaches for multiple mutually exclusive decisions. Different decision-makers should naturally make differ...

  6. Cloud computing methods and practical approaches

    CERN Document Server

    Mahmood, Zaigham

    2013-01-01

    This book presents both state-of-the-art research developments and practical guidance on approaches, technologies and frameworks for the emerging cloud paradigm. Topics and features: presents the state of the art in cloud technologies, infrastructures, and service delivery and deployment models; discusses relevant theoretical frameworks, practical approaches and suggested methodologies; offers guidance and best practices for the development of cloud-based services and infrastructures, and examines management aspects of cloud computing; reviews consumer perspectives on mobile cloud computing an

  7. Qualitative Approaches to Mixed Methods Practice

    Science.gov (United States)

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced…

  8. Nuclear energy policy analysis under uncertainties : applications of new utility theoretic approaches

    International Nuclear Information System (INIS)

    Ra, Ki Yong

    1992-02-01

    For the purpose of analyzing nuclear energy policy under uncertainties, new utility theoretic approaches were applied. The main discoveries of the new utility theories are that, firstly, the consequences can affect the perceived probabilities, secondly, the utilities are not fixed but can change, and finally, utilities and probabilities should therefore be combined dependently to determine the overall worth of a risky option. These conclusions were applied to develop the modified expected utility model and to establish the probabilistic nuclear safety criterion. The modified expected utility model was developed in order to resolve the inconsistencies between the expected utility model and actual decision behaviors. Based on information theory and Bayesian inference, the modified probabilities were obtained as the stated probabilities times substitutional factors. The model theoretically predicts that extreme-value outcomes are perceived as more likely to occur than medium-value outcomes. This prediction is consistent with the first finding of the new utility theories, that the consequences can alter the perceived probabilities. With this theoretical prediction, the decision behaviors of buying lottery tickets, paying for insurance and nuclear catastrophic risk aversion can be well explained. Through numerical application, it is shown that the developed model can well explain the common consequence effect, the common ratio effect and the reflection effect. The probabilistic nuclear safety criterion for core melt frequency was established: Firstly, the distribution of the public's safety goal (DPSG) was proposed for representing the public's group preference under risk. Secondly, a new probabilistic safety criterion (PSC) was established, in which the DPSG was used as a benchmark for evaluating the results of probabilistic safety assessment. Thirdly, a log-normal distribution was proposed as the appropriate DPSG for core melt frequency using the

  9. A multidisciplinary three-phase approach to improve the clinical utility of patient safety indicators.

    Science.gov (United States)

    Najjar, Peter; Kachalia, Allen; Sutherland, Tori; Beloff, Jennifer; David-Kasdan, Jo Ann; Bates, David W; Urman, Richard D

    2015-01-01

    The AHRQ Patient Safety Indicators (PSIs) are used for calculation of risk-adjusted postoperative rates for adverse events. The payers and quality consortiums are increasingly requiring public reporting of hospital performance on these metrics. We discuss processes designed to improve the accuracy and clinical utility of PSI reporting in practice. The study was conducted at a 793-bed tertiary care academic medical center where PSI processes have been aggressively implemented to track patient safety events at discharge. A three-phased approach to improving administrative data quality was implemented. The initiative consisted of clinical review of all PSIs, documentation improvement, and provider outreach including active querying for patient safety events. This multidisciplinary effort to develop a streamlined process for PSI calculation reduced the reporting of miscoded PSIs and increased the clinical utility of PSI monitoring. Over 4 quarters, 4 of 41 (10%) PSI-11 and 9 of 138 (7%) PSI-15 errors were identified on review of clinical documentation and appropriate adjustments were made. A multidisciplinary, phased approach leveraging existing billing infrastructure for robust metric coding, ongoing clinical review, and frontline provider outreach is a novel and effective way to reduce the reporting of false-positive outcomes and improve the clinical utility of PSIs.

  10. Numerical Methods for Stochastic Computations A Spectral Method Approach

    CERN Document Server

    Xiu, Dongbin

    2010-01-01

    The first graduate-level textbook to focus on fundamental aspects of numerical methods for stochastic computations, this book describes the class of numerical methods based on generalized polynomial chaos (gPC). These fast, efficient, and accurate methods are an extension of the classical spectral methods to high-dimensional random spaces. Designed to simulate complex systems subject to random inputs, these methods are widely used in many areas of computer science and engineering. The book introduces polynomial approximation theory and probability theory; describes the basic theory of gPC meth
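
As a toy illustration of the gPC idea (not an example from the book), the sketch below projects a nonlinear function of a standard normal input onto probabilists' Hermite polynomials by Gauss-Hermite quadrature and checks the truncated expansion against the exact function; the target function and expansion order are arbitrary choices.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

# Model output as a function of a standard normal input xi (assumed target).
f = lambda xi: np.exp(0.5 * xi)

order = 6                      # truncation order of the gPC expansion
nodes, wts = hermegauss(40)    # quadrature for the weight exp(-x^2/2)
wts = wts / sqrt(2.0 * pi)     # normalize so the weights sum to 1

# Galerkin projection: c_n = E[f(xi) He_n(xi)] / E[He_n(xi)^2], with E[He_n^2] = n!
coeffs = np.array([
    np.sum(wts * f(nodes) * hermeval(nodes, np.eye(order + 1)[n])) / factorial(n)
    for n in range(order + 1)
])

# Evaluate the truncated expansion and compare with the exact function.
x = np.linspace(-3, 3, 7)
print("max error:", np.max(np.abs(hermeval(x, coeffs) - f(x))))
print("gPC mean:", coeffs[0], "exact mean:", np.exp(0.125))
```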

  11. Mixed method approaches to evaluate conservation impact

    DEFF Research Database (Denmark)

    Lund, Jens Friis; Burgess, Neil D.; Chamshama, Shabani A.O.

    2015-01-01

    Nearly 10% of the world's total forest area is formally owned by communities and indigenous groups, yet knowledge of the effects of decentralized forest management approaches on conservation (and livelihood) impacts remains elusive. In this paper, the conservation impact of decentralized forest m...

  12. Nutrition and culture in professional football. A mixed method approach.

    Science.gov (United States)

    Ono, Mutsumi; Kennedy, Eileen; Reeves, Sue; Cronin, Linda

    2012-02-01

    An adequate diet is essential for the optimal performance of professional football (soccer) players. Existing studies have shown that players fail to consume such a diet, without interrogating the reasons for this. The aim of this study was to explore the difficulties professional football players experience in consuming a diet for optimal performance. It utilized a mixed method approach, combining nutritional intake assessment with qualitative interviews, to ascertain both what was consumed and the wider cultural factors that affect consumption. The study found a high variability in individual intake which ranged widely from 2648 to 4606 kcal/day. In addition, the intake of carbohydrate was significantly lower than that recommended. The study revealed that the main food choices for carbohydrate and protein intake were pasta and chicken respectively. Interview results showed the importance of tradition within the world of professional football in structuring the players' approach to nutrition. In addition, the players' personal eating habits that derived from their class and national habitus restricted their food choice by conflicting with the dietary choices promoted within the professional football clubs. Copyright © 2011 Elsevier Ltd. All rights reserved.

  13. Utilization pattern of extension tools and methods by Agricultural Extension Agents

    Directory of Open Access Journals (Sweden)

    M Surudhi

    2018-05-01

    Full Text Available A study was conducted in Krishnagiri district of Tamil Nadu state to understand the utilization pattern of extension tools and methods by agricultural extension agents. As the ICT revolution is slowly conquering the rural sector, it becomes imperative that agricultural extension agents adapt to the changing times and develop competencies in utilizing these ICTs. The study explored the usage of various extension tools and methods by the change agents and the constraints faced in utilizing them. The findings revealed that the extension functionaries frequently used individual contact methods, viz., telephone, office calls and farm and home visits, in the process of transfer of technology. The least effort was devoted to SMS-based communication. Meetings were the common and frequently adopted group contact method. Demonstrations, farmer field schools, farmers' interest groups, field trips and farmer training programmes were moderately adopted. Posters, leaflets and pre-season campaigns were the widely adopted mass contact methods. The functionaries possessed the least skill in utilizing farm magazines and in presenting television and radio programmes, which are among the most popular and most efficient mass contact methods. The extension functionaries need to be trained adequately in the wider use of electronic communication methods such as emails and SMS in the local language. Efforts should be made to sensitize the extension agents to the importance of different group and mass contact methods and to train them in their usage.

  14. Approaches and methods of risk assessment

    International Nuclear Information System (INIS)

    Rowe, W.D.

    1983-01-01

    The classification system of risk assessment includes the categories: 1) risk comparisons, 2) cost-effectiveness of risk reduction, 3) balancing of costs, risks and benefits against one another, 4) metasystems. An overview of methods and systems reveals that no single method can be applied to all cases and situations. The visibility of the process and the complete consideration of all aspects of judgement are, however, of foremost importance. (DG) [de

  15. Site Assessment of Multiple-Sensor Approaches for Buried Utility Detection

    Directory of Open Access Journals (Sweden)

    Alexander C. D. Royal

    2011-01-01

    Full Text Available The successful operation of buried infrastructure within urban environments is fundamental to the conservation of modern living standards. Open-cut methods are predominantly used, in preference to trenchless technology, to effect a repair, replace or install a new section of the network. This is, in part, due to the inability to determine the position of all utilities below the carriageway, making open-cut methods desirable in terms of dealing with uncertainty since the buried infrastructure is progressively exposed during excavation. However, open-cut methods damage the carriageway and disrupt society's functions. This paper describes the progress of a research project that aims to develop a multi-sensor geophysical platform that can improve the probability of complete detection of the infrastructure buried beneath the carriageway. The multi-sensor platform is being developed in conjunction with a knowledge-based system that aims to provide information on how the properties of the ground might affect the sensing technologies being deployed. The fusion of data sources (sensor data and utilities record data) is also being researched to maximize the probability of location. This paper describes the outcome of the initial phase of testing along with the development of the knowledge-based system and the fusing of data to produce utility maps.

  16. A Trace Data-Based Approach for an Accurate Estimation of Precise Utilization Maps in LTE

    Directory of Open Access Journals (Sweden)

    Almudena Sánchez

    2017-01-01

    Full Text Available For network planning and optimization purposes, mobile operators make use of Key Performance Indicators (KPIs), computed from Performance Measurements (PMs), to determine whether network performance needs to be improved. In current networks, PMs, and therefore KPIs, suffer from a lack of precision due to insufficient temporal and/or spatial granularity. In this work, an automatic method, based on data traces, is proposed to improve the accuracy of radio network utilization measurements collected in a Long-Term Evolution (LTE) network. The method’s output is an accurate estimate of the spatial and temporal distribution of the cell utilization ratio that can be extended to other indicators. The method can be used to improve automatic network planning and optimization algorithms in a centralized Self-Organizing Network (SON) entity, since potential issues can be more precisely detected and located inside a cell thanks to the temporal and spatial precision. The proposed method is tested with real connection traces gathered in a large geographical area of a live LTE network and considers overload problems due to trace file size limitations, which is a key consideration when analysing a large network. Results show how these distributions provide very detailed information on network utilization, compared to cell-based statistics.
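
The sketch below shows one way a trace-based utilization estimate could be assembled: per-connection records carrying a timestamp, an estimated position and a resource-block count are binned in time and distance, and the utilization ratio is the share of scheduled resource blocks per bin. The schema, bin sizes and capacity figure are assumptions for illustration and do not reproduce the paper's actual trace format.

```python
import numpy as np
import pandas as pd

# Illustrative per-connection trace records (assumed schema).
rng = np.random.default_rng(1)
n = 5_000
traces = pd.DataFrame({
    "timestamp": pd.Timestamp("2017-01-01")
                 + pd.to_timedelta(rng.uniform(0, 3600, n), unit="s"),
    "distance_m": rng.uniform(0, 1500, n),   # estimated UE distance from the site
    "prb_used": rng.integers(1, 20, n),      # resource blocks scheduled per record
})

PRB_PER_SECOND = 100 * 1000   # assumed capacity: 100 PRBs per 1 ms subframe
TIME_BIN = "5min"
DIST_EDGES = [0, 300, 600, 900, 1200, 1500]

scheduled = (
    traces
    .assign(dist_bin=pd.cut(traces["distance_m"], DIST_EDGES))
    .groupby([pd.Grouper(key="timestamp", freq=TIME_BIN), "dist_bin"], observed=True)["prb_used"]
    .sum()
)

# Utilization ratio: scheduled PRBs over the PRBs available in each time bin.
available = PRB_PER_SECOND * pd.Timedelta(TIME_BIN).total_seconds()
utilization = scheduled / available
print(utilization.unstack("dist_bin").round(4))
```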

  17. Development of slim-maud: a multi-attribute utility approach to human reliability evaluation

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1984-01-01

    This paper describes further work on the Success Likelihood Index Methodology (SLIM), a procedure for quantitatively evaluating human reliability in nuclear power plants and other systems. SLIM was originally developed by Human Reliability Associates during an earlier contract with Brookhaven National Laboratory (BNL). A further development of SLIM, SLIM-MAUD (Multi-Attribute Utility Decomposition) is also described. This is an extension of the original approach using an interactive, computer-based system. All of the work described in this report was supported by the Human Factors and Safeguards Branch of the US Nuclear Regulatory Commission
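
A minimal sketch of the SLIM quantification step (the MAUD-style interactive elicitation is not reproduced here): a Success Likelihood Index is formed as a weighted sum of performance-shaping-factor ratings and converted to a human error probability through the usual log-linear calibration against two anchor tasks with known error probabilities. All weights, ratings and anchor values below are invented.

```python
import numpy as np

def success_likelihood_index(weights, ratings):
    """Weighted sum of performance-shaping-factor ratings (weights sum to 1)."""
    return float(np.dot(weights, ratings))

def calibrate(sli1, hep1, sli2, hep2):
    """Fit log10(HEP) = a * SLI + b from two anchor tasks with known HEPs."""
    a = (np.log10(hep2) - np.log10(hep1)) / (sli2 - sli1)
    b = np.log10(hep1) - a * sli1
    return a, b

# Invented PSF weights and ratings (0-9 scale) for the task being assessed,
# e.g. time pressure, training, interface quality, stress.
weights = np.array([0.4, 0.3, 0.2, 0.1])
ratings = np.array([6.0, 7.0, 4.0, 5.0])
task_sli = success_likelihood_index(weights, ratings)

# Invented anchors: a well-performed task and an error-prone one.
a, b = calibrate(sli1=8.0, hep1=1e-4, sli2=2.0, hep2=1e-1)
hep = 10 ** (a * task_sli + b)
print(f"SLI = {task_sli:.2f}, estimated HEP = {hep:.2e}")
```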

  18. Knowledge management of eco-industrial park for efficient energy utilization through ontology-based approach

    International Nuclear Information System (INIS)

    Zhang, Chuan; Romagnoli, Alessandro; Zhou, Li; Kraft, Markus

    2017-01-01

    Highlights: •An intelligent energy management system for Eco-Industrial Parks (EIPs) is proposed. •An explicit domain ontology for EIP energy management is designed. •The ontology-based approach can increase knowledge interoperability within the EIP. •The ontology-based approach can allow self-optimization without human intervention in the EIP. •The proposed system harbours huge potential in the future scenario of the Internet of Things. -- Abstract: An ontology-based approach for Eco-Industrial Park (EIP) knowledge management is proposed in this paper. The ontology designed in this study is a formalized conceptualization of the EIP. Based on such an ontological representation, a Knowledge-Based System (KBS) for EIP energy management named the J-Park Simulator (JPS) is developed. By applying JPS to the solution of an EIP waste heat utilization problem, the results of this study show that ontology is a powerful tool for the knowledge management of complex systems such as EIPs. The ontology-based approach can increase knowledge interoperability between different companies in the EIP. The ontology-based approach can also allow intelligent decision making using disparate data from remote databases, which implies the possibility of self-optimization without human intervention in the scenario of the Internet of Things (IoT). It is shown through this study that the KBS can bridge the communication gaps between different companies in the EIP, and consequently more potential Industrial Symbiosis (IS) links can be established to improve the overall energy efficiency of the whole EIP.

  19. A novel method of utilizing permeable reactive kiddle (PRK) for the remediation of acid mine drainage.

    Science.gov (United States)

    Lee, Woo-Chun; Lee, Sang-Woo; Yun, Seong-Taek; Lee, Pyeong-Koo; Hwang, Yu Sik; Kim, Soon-Oh

    2016-01-15

    Numerous technologies have been developed and applied to remediate AMD, but each has specific drawbacks. To overcome the limitations of existing methods and improve their effectiveness, we propose a novel method utilizing a permeable reactive kiddle (PRK). This manuscript explores the performance of the PRK method. In line with the concept of green technology, the PRK method recycles industrial waste, such as steel slag and waste cast iron. Our results demonstrate that the PRK method can be applied to remediate AMD under optimal operational conditions. In particular, this method allows simple installation at low cost compared with established technologies. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. [Series: Utilization of Differential Equations and Methods for Solving Them in Medical Physics (1)].

    Science.gov (United States)

    Murase, Kenya

    2014-01-01

    Utilization of differential equations and methods for solving them in medical physics are presented. First, the basic concept and the kinds of differential equations were overviewed. Second, separable differential equations and well-known first-order and second-order differential equations were introduced, and the methods for solving them were described together with several examples. In the next issue, the symbolic and series expansion methods for solving differential equations will be mainly introduced.
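
As a worked example of the separable case mentioned above, added here for illustration (not taken from the series), consider a quantity eliminated at a rate proportional to its current amount, as in tracer washout:

```latex
\frac{dC}{dt} = -kC
\;\Longrightarrow\;
\int \frac{dC}{C} = -\int k\,dt
\;\Longrightarrow\;
\ln C = -kt + \text{const}
\;\Longrightarrow\;
C(t) = C_0\,e^{-kt},
```

where $C_0$ is the initial amount and $k$ the elimination rate constant.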

  1. Geochronology and geochemistry by the nuclear tracks method: some utilization examples in applied geology

    International Nuclear Information System (INIS)

    Poupeau, G.; Soliani Junior, E.

    1988-01-01

    This article discusses some applications of the 'nuclear tracks method' in geochronology, geochemistry and geophysics. In geochronology, after a brief presentation of the principles of fission-track dating and of the kinds of geological events measurable by this method, some applications in metallogeny and petroleum geology are shown. In geochemistry, the uses of the fission-track method relate to mining and uranium prospecting. In geophysics, an important application is earthquake prediction through the continuous monitoring of Ra-222 emanations. (author) [pt

  2. BEOS-A new approach to promote and organize industrial ISS utilization

    Science.gov (United States)

    Luttmann, Helmut; Buchholz, Henning; Bratke, Burkhard; Hueser, Detlev; Dittus, Hansjörg

    2000-01-01

    In order to develop and to market innovative services and products for the operation of the ISS and its utilization, three players have teamed up and established an entity called BEOS (Bremen Engineering Operations Science). The team is made up of DaimlerChrysler Aerospace, OHB-System and ZARM, the Center of Applied Space Technology and Microgravity at the University of Bremen. It is the aim of BEOS to represent a competent industrial interface to potential ISS users from the space and non-space industries. In this effort BEOS is supporting and supplementing the activities of the space agencies, especially in the field of industrial and/or commercial ISS utilization. With this approach BEOS is creating new business opportunities not only for its team members but also for its customers from industry. Besides the fostering of industrial research in space, nontechnical fields of space utilization like entertainment, advertisement, education and space travel represent further key sectors for the marketing efforts of BEOS.

  3. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal design parameter combination based on probabilistic theory. Reliability-based design optimization (RBDO) is the most commonly used approach to minimize the structural cost or other performance measures under uncertain variables; it combines reliability theory and optimization. However, it cannot handle various kinds of incomplete information. The Bayesian approach is utilized to incorporate this kind of incomplete information in its uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  4. Enterprise Engineering Method supporting Six Sigma Approach

    OpenAIRE

    Jochem, Roland

    2007-01-01

    Enterprise Modeling (EM) is currently in operation either as a technique to represent and understand the structure and behavior of the enterprise, or as a technique to analyze business processes, and in many cases as a support technique for business process reengineering. However, EM architectures and methods for Enterprise Engineering can also be used to support new management techniques like SIX SIGMA, because these new techniques need a clear, transparent and integrated definition and descript...

  5. Carbon footprint of forest and tree utilization technologies in life cycle approach

    Science.gov (United States)

    Polgár, András; Pécsinger, Judit

    2017-04-01

    In our research project, a method has been developed for the technological aspect of the environmental assessment of land-use changes caused by climate change. We have prepared an eco-balance (environmental inventory) for the classification of environmental effects in a life-cycle approach in connection with typical agricultural, forest and tree utilization technologies. The use of balances and environmental classification makes it possible to compare land-use technologies and their environmental effects per common functional unit. In order to test our environmental analysis model, we carried out surveys in a sample of forest stands. We set up an eco-balance of the working systems of intermediate cutting and final harvest in stands of beech, oak, spruce, acacia, poplar and short-rotation energy plantations (willow, poplar). We set up the life-cycle plan of the surveyed working systems using the GaBi 6.0 Professional software and carried out midpoint and endpoint impact assessment. From the results, we applied the values of CML 2001 - Global Warming Potential (GWP 100 years) [kg CO2-Equiv.] and Eco-Indicator 99 - Human Health, Climate Change [DALY]. On the basis of these values we set up a ranking of the technologies. In this way, we obtained an environmental impact classification of the technologies based on carbon footprint. The working systems had the greatest impact on global warming (GWP 100 years) throughout their whole life cycle. This is explained by the amount of carbon dioxide released to the atmosphere from the fuel used by the technologies. Abiotic depletion (ADP foss) and marine aquatic ecotoxicity (MAETP) also emerged as significant impact categories; these can be explained by the share of fuel and lubricant inputs. On the basis of the most significant environmental impact category (carbon footprint), we present the relative life-cycle contribution and ranking of each technology. The technological life cycle stages examined

  6. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Science.gov (United States)

    2018-01-01

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In the recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction. PMID:29495599

  7. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility

    Directory of Open Access Journals (Sweden)

    Agustín Zaballos

    2018-02-01

    Full Text Available Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In the recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid’s data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  8. A Custom Approach for a Flexible, Real-Time and Reliable Software Defined Utility.

    Science.gov (United States)

    Zaballos, Agustín; Navarro, Joan; Martín De Pozuelo, Ramon

    2018-02-28

    Information and communication technologies (ICTs) have enabled the evolution of traditional electric power distribution networks towards a new paradigm referred to as the smart grid. However, the different elements that compose the ICT plane of a smart grid are usually conceived as isolated systems that typically result in rigid hardware architectures, which are hard to interoperate, manage and adapt to new situations. In the recent years, software-defined systems that take advantage of software and high-speed data network infrastructures have emerged as a promising alternative to classic ad hoc approaches in terms of integration, automation, real-time reconfiguration and resource reusability. The purpose of this paper is to propose the usage of software-defined utilities (SDUs) to address the latent deployment and management limitations of smart grids. More specifically, the implementation of a smart grid's data storage and management system prototype by means of SDUs is introduced, which exhibits the feasibility of this alternative approach. This system features a hybrid cloud architecture able to meet the data storage requirements of electric utilities and adapt itself to their ever-evolving needs. Conducted experimentations endorse the feasibility of this solution and encourage practitioners to point their efforts in this direction.

  9. Demonstrating a small utility approach to demand-side program implementation

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-01

    The US DOE awarded a grant to the Burlington Electric Department (B.E.D.) to test a demand-side management (DSM) demonstration program designed to quickly save a significant amount of power with little disruption to the utility's customers or its normal operations. B.E.D. is a small municipal utility located in northern Vermont, with a lengthy history of successful DSM involvement. In our grant application, we proposed to develop a replicable program and approach to DSM that might be useful to other small utilities and to write a report to enable such replication. We believe that this DSM program and/or individual program components are replicable. This report is designed to allow other utilities interested in DSM to replicate this program or specific program design features to meet their DSM goals. We also wanted to use the opportunity of this grant to test the waters of residential heating fuel-switching. We hoped to test the application of one fuel-switching technology, and to benefit from the lessons learned in developing a full-scale DSM program for this end-use. To this end the pilot effort has been very successful. In the pilot phase we installed direct-vent gas fired space heaters sized as supplemental heating units in 44 residences heated solely by electric resistance heat. We installed the gas space heating units at no cost to the owners or residents. We surveyed participating customers. The results of those surveys are included in this report and preliminary estimates of winter peak capacity load reductions are also noted in this report.

  10. Demonstrating a small utility approach to demand-side program implementation

    International Nuclear Information System (INIS)

    1991-01-01

    The US DOE awarded a grant to the Burlington Electric Department (B.E.D.) to test a demand-side management (DSM) demonstration program designed to quickly save a significant amount of power with little disruption to the utility's customers or its normal operations. B.E.D. is a small municipal utility located in northern Vermont, with a lengthy history of successful DSM involvement. In our grant application, we proposed to develop a replicable program and approach to DSM that might be useful to other small utilities and to write a report to enable such replication. We believe that this DSM program and/or individual program components are replicable. This report is designed to allow other utilities interested in DSM to replicate this program or specific program design features to meet their DSM goals. We also wanted to use the opportunity of this grant to test the waters of residential heating fuel-switching. We hoped to test the application of one fuel-switching technology, and to benefit from the lessons learned in developing a full-scale DSM program for this end-use. To this end the pilot effort has been very successful. In the pilot phase we installed direct-vent gas fired space heaters sized as supplemental heating units in 44 residences heated solely by electric resistance heat. We installed the gas space heating units at no cost to the owners or residents. We surveyed participating customers. The results of those surveys are included in this report and preliminary estimates of winter peak capacity load reductions are also noted in this report

  11. Optimising risk reduction: An expected utility approach for marginal risk reduction during regulatory decision making

    International Nuclear Information System (INIS)

    Li Jiawei; Pollard, Simon; Kendall, Graham; Soane, Emma; Davies, Gareth

    2009-01-01

    In practice, risk and uncertainty are essentially unavoidable in many regulation processes. Regulators frequently face a risk-benefit trade-off since zero risk is neither practicable nor affordable. Although it is accepted that cost-benefit analysis is important in many scenarios of risk management, what role it should play in a decision process is still controversial. One criticism of cost-benefit analysis is that decision makers should consider marginal benefits and costs, not present ones, in their decision making. In this paper, we investigate the problem of regulatory decision making under risk by applying expected utility theory and present a new approach of cost-benefit analysis. Directly taking into consideration the reduction of the risks, this approach achieves marginal cost-benefit analysis. By applying this approach, the optimal regulatory decision that maximizes the marginal benefit of risk reduction can be considered. This provides a transparent and reasonable criterion for stakeholders involved in the regulatory activity. An example of evaluating seismic retrofitting alternatives is provided to demonstrate the potential of the proposed approach.
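
The sketch below illustrates the marginal logic advocated above, using a risk-neutral expected loss for simplicity: candidate retrofit levels are ordered by cost, and each increment is accepted only while the marginal expected-loss reduction it buys exceeds its marginal cost. The failure probabilities, consequence cost and retrofit costs are invented and do not come from the paper.

```python
# Invented retrofit levels: (name, total cost, annual failure probability).
LOSS_IF_FAILURE = 50_000_000.0   # assumed consequence cost of a failure
options = [
    ("no retrofit", 0.0, 0.020),
    ("partial retrofit", 200_000.0, 0.008),
    ("full retrofit", 600_000.0, 0.005),
]

def expected_loss(p_failure):
    return p_failure * LOSS_IF_FAILURE

# Walk up the cost-ordered options, accepting each increment only while the
# marginal risk-reduction benefit exceeds the marginal cost of obtaining it.
chosen = options[0]
for prev, nxt in zip(options, options[1:]):
    marginal_cost = nxt[1] - prev[1]
    marginal_benefit = expected_loss(prev[2]) - expected_loss(nxt[2])
    if marginal_benefit > marginal_cost:
        chosen = nxt
    else:
        break

print("Selected option:", chosen[0])   # 'partial retrofit' with these numbers
```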

  12. TRANSFER PRICES: MECHANISMS, METHODS AND INTERNATIONAL APPROACHES

    Directory of Open Access Journals (Sweden)

    Pop Cosmina

    2008-05-01

    Full Text Available Transfer prices are the prices paid for goods or services in a cross-border transaction between affiliated companies, often significantly reduced or increased in order to avoid the higher tax rates of one jurisdiction. Presently, over 60% of cross-border transfers are represented by intra-group transfers. The paper presents the variety of methods and mechanisms used by companies to transfer funds from one tax jurisdiction to another in order to avoid overtaxation.

  13. Microscopic approach to the generator coordinate method

    International Nuclear Information System (INIS)

    Haider, Q.; Gogny, D.; Weiss, M.S.

    1989-01-01

    In this paper, we solve different theoretical problems associated with the calculation of the kernel occurring in the Hill-Wheeler integral equations within the framework of the generator coordinate method. In particular, we extend Wick's theorem to nonorthogonal Bogoliubov states. Expressions for the overlap between Bogoliubov states and for the generalized density matrix are also derived. These expressions are valid even when using an incomplete basis, as in the case of actual calculations. Finally, the Hill-Wheeler formalism is developed for a finite range interaction and the Skyrme force, and evaluated for the latter. 20 refs., 1 fig., 4 tabs
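
For reference, the Hill-Wheeler integral equation whose kernels are discussed above has the standard generator-coordinate form (standard notation restated here, with $|\Phi(q)\rangle$ the generating states along the collective coordinate $q$):

```latex
\int dq'\,\bigl[\mathcal{H}(q,q') - E\,\mathcal{N}(q,q')\bigr]\,f(q') = 0,
\qquad
\mathcal{H}(q,q') = \langle\Phi(q)|\hat{H}|\Phi(q')\rangle,
\quad
\mathcal{N}(q,q') = \langle\Phi(q)|\Phi(q')\rangle,
```

so that the GCM wave function is $|\Psi\rangle = \int dq\, f(q)\,|\Phi(q)\rangle$.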

  14. Analytical investigation of different mathematical approaches utilizing manipulation of ratio spectra

    Science.gov (United States)

    Osman, Essam Eldin A.

    2018-01-01

    This work represents a comparative study of different approaches of manipulating ratio spectra, applied on a binary mixture of ciprofloxacin HCl and dexamethasone sodium phosphate co-formulated as ear drops. The proposed new spectrophotometric methods are: ratio difference spectrophotometric method (RDSM), amplitude center method (ACM), first derivative of the ratio spectra (1DD) and mean centering of ratio spectra (MCR). The proposed methods were checked using laboratory-prepared mixtures and were successfully applied for the analysis of pharmaceutical formulation containing the cited drugs. The proposed methods were validated according to the ICH guidelines. A comparative study was conducted between those methods regarding simplicity, limitations and sensitivity. The obtained results were statistically compared with those obtained from the reported HPLC method, showing no significant difference with respect to accuracy and precision.

  15. Approaches to Mixed Methods Dissemination and Implementation Research: Methods, Strengths, Caveats, and Opportunities.

    Science.gov (United States)

    Green, Carla A; Duan, Naihua; Gibbons, Robert D; Hoagwood, Kimberly E; Palinkas, Lawrence A; Wisdom, Jennifer P

    2015-09-01

    Limited translation of research into practice has prompted study of diffusion and implementation, and development of effective methods of encouraging adoption, dissemination and implementation. Mixed methods techniques offer approaches for assessing and addressing processes affecting implementation of evidence-based interventions. We describe common mixed methods approaches used in dissemination and implementation research, discuss strengths and limitations of mixed methods approaches to data collection, and suggest promising methods not yet widely used in implementation research. We review qualitative, quantitative, and hybrid approaches to mixed methods dissemination and implementation studies, and describe methods for integrating multiple methods to increase depth of understanding while improving reliability and validity of findings.

  16. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    2016-09-03

    Sep 3, 2016 ... Sequential mixed methods research is an effective approach for ... show the effectiveness of the research method. ... qualitative data before quantitative datasets ..... whereby both types of data are collected simultaneously.

  17. Elastic Model Transitions: a Hybrid Approach Utilizing Quadratic Inequality Constrained Least Squares (LSQI) and Direct Shape Mapping (DSM)

    Science.gov (United States)

    Jurenko, Robert J.; Bush, T. Jason; Ottander, John A.

    2014-01-01

    A method for transitioning linear time invariant (LTI) models in time varying simulation is proposed that utilizes both quadratically constrained least squares (LSQI) and Direct Shape Mapping (DSM) algorithms to determine physical displacements. This approach is applicable to the simulation of the elastic behavior of launch vehicles and other structures that utilize multiple LTI finite element model (FEM) derived mode sets that are propagated throughout time. The time invariant nature of the elastic data for discrete segments of the launch vehicle trajectory presents a problem of how to properly transition between models while preserving motion across the transition. In addition, energy may vary between flex models when using a truncated mode set. The LSQI-DSM algorithm can accommodate significant changes in energy between FEM models and carries elastic motion across FEM model transitions. Compared with previous approaches, the LSQI-DSM algorithm shows improvements ranging from a significant reduction to a complete removal of transients across FEM model transitions as well as maintaining elastic motion from the prior state.
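
The least-squares-with-quadratic-inequality piece can be posed generically as minimizing ||Ax - b||^2 subject to ||x||^2 <= alpha^2; the sketch below solves that generic problem with SciPy's SLSQP solver. It is only a stand-in for the LSQI step named above (the paper's mapping of modal displacements across FEM transitions is not reproduced), and the matrices and bound are random placeholders.

```python
import numpy as np
from scipy.optimize import minimize

def lsqi(A, b, alpha):
    """Minimize ||A x - b||^2 subject to ||x||^2 <= alpha^2 (generic LSQI)."""
    objective = lambda x: np.sum((A @ x - b) ** 2)
    constraint = {"type": "ineq", "fun": lambda x: alpha**2 - np.sum(x**2)}
    x0 = np.zeros(A.shape[1])
    result = minimize(objective, x0, method="SLSQP", constraints=[constraint])
    return result.x

# Placeholder problem: fit an over-determined system with a bounded solution norm.
rng = np.random.default_rng(2)
A = rng.normal(size=(20, 5))
b = rng.normal(size=20)

x_free = np.linalg.lstsq(A, b, rcond=None)[0]
x_bounded = lsqi(A, b, alpha=0.5 * np.linalg.norm(x_free))
print(np.linalg.norm(x_free), np.linalg.norm(x_bounded))
```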

  18. Integrated approach to natural gas utilization in the Asia Pacific region

    International Nuclear Information System (INIS)

    Hovdestad, W.R.; Egbogah, E.O.

    1995-01-01

    The rapidly expanding economies in the Pacific Rim have placed increasing demands upon indigenous natural gas supplies in South East Asia and Australia. Competing demands include exports of liquefied natural gas (LNG), domestic consumption, and potential use for enhanced oil recovery (EOR) to extend the useful life of maturing oil fields. An additional competing demand for gas exports may emerge as the interstate pipeline grid is expanded. An integrated approach incorporating the evolving nature of gas demands and discrete physical supplies would provide a means to mitigate against potential mismatching of supply and demand. The consideration of the evolving nature of gas demands could promote economically beneficial changes to gas field development. The development of high carbon dioxide (CO2) content gas fields has been slowed by the lack of a market for CO2. Utilization of by-product CO2 for EOR could improve development economics, thus facilitating earlier development of gas supplies to satisfy gas demands including domestic use and LNG exports. End users would also benefit from the assurance that gas supplies would become available as needed. The increasing maturity and complexity of the natural gas industry in the Asia Pacific Region has led to a qualitative change. The model of single projects to satisfy single markets is no longer valid. The current environment is more dynamic, creating the need to anticipate changes to market demands and to find value-added markets for by-products. The integrated approach to gas utilization discussed in this paper presents a new model more appropriate to the gas industry existing today in the Asia Pacific Region. This approach is particularly significant to widely discussed proposals for an Asia Pacific energy grid extending to Australia

  19. Utilize target motion to cover clinical target volume (ctv) - a novel and practical treatment planning approach to manage respiratory motion

    International Nuclear Information System (INIS)

    Jin Jianyue; Ajlouni, Munther; Kong Fengming; Ryu, Samuel; Chetty, Indrin J.; Movsas, Benjamin

    2008-01-01

    Purpose: To use probability density function (PDF) to model motion effects and incorporate this information into treatment planning for lung cancers. Material and methods: PDFs were calculated from the respiratory motion traces of 10 patients. Motion effects were evaluated by convolving static dose distributions with various PDFs. Based on a differential dose prescription with relatively lower dose to the clinical target volume (CTV) than to the gross tumor volume (GTV), two approaches were proposed to incorporate PDFs into treatment planning. The first approach uses the GTV-based internal target volume (ITV) as the planning target volume (PTV) to ensure full dose to the GTV, and utilizes the motion-induced dose gradient to cover the CTV. The second approach employs an inhomogeneous static dose distribution within a minimized PTV to best match the prescription dose gradient. Results: Motion effects on dose distributions were minimal in the anterior-posterior (AP) and lateral directions: a 10-mm motion only induced about 3% of dose reduction in the peripheral target region. The motion effect was remarkable in the cranial-caudal direction. It varied with the motion amplitude, but tended to be similar for various respiratory patterns. For the first approach, a 10-15 mm motion would adequately cover the CTV (presumed to be 60-70% of the GTV dose) without employing the CTV in planning. For motions 15-mm. An example of inhomogeneous static dose distribution in a reduced PTV was given, and it showed significant dose reduction in the normal tissue without compromising target coverage. Conclusions: Respiratory motion-induced dose gradient can be utilized to cover the CTV and minimize the lung dose without the need for more sophisticated technologies
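
A minimal sketch of the convolution step described above: a static 1-D dose profile along the cranial-caudal axis is blurred by a motion probability density function to estimate the delivered dose. The profile shape, motion PDF and grid are invented for illustration.

```python
import numpy as np

# 1-D static dose profile along the cranial-caudal axis (1 mm grid, invented):
# a flat 100% dose over a 50 mm target, zero elsewhere.
z = np.arange(-60, 61)                                  # mm
static_dose = np.where(np.abs(z) <= 25, 100.0, 0.0)

# Motion PDF, here a Gaussian with a 5 mm standard deviation, normalized to 1.
sigma = 5.0
pdf = np.exp(-0.5 * (z / sigma) ** 2)
pdf /= pdf.sum()

# Motion-blurred (delivered) dose = convolution of the static dose with the PDF.
blurred = np.convolve(static_dose, pdf, mode="same")

# The dose falls off near the target edge, illustrating the motion-induced
# gradient that the planning approach exploits to cover the CTV.
print(f"centre: {blurred[60]:.1f}%, edge (z = 25 mm): {blurred[85]:.1f}%")
```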

  20. 4D CAD Based Method for Supporting Coordination of Urban Subsurface Utility Projects

    NARCIS (Netherlands)

    olde Scholtenhuis, Léon Luc; Hartmann, T.; Doree, Andries G.

    Coordinators of inner city utility construction works face increasing difficulty in managing their projects due to tight physical restrictions, strict deadlines and growing stakeholder fragmentation. This paper therefore presents a 4D CAD based coordination method that supports project plan scoping,

  1. Creating Critical Conversations: Investigating the Utility of Socratic Dialogues in Elementary Social Studies Methods

    Science.gov (United States)

    Buchanan, Lisa Brown

    2012-01-01

    This article explores the utility of Socratic dialogues in the elementary social studies methods course. Findings include preservice teachers' behaviors during dialogues, perceived strengths and challenges of using Socratic dialogues in teacher education, and the impact on student learning. Challenges and apprehensions encountered by the teacher…

  2. Utilization of OR method toward realization of better fast breeder reactor cycle

    International Nuclear Information System (INIS)

    Shiotani, Hiroki

    2008-01-01

    The Fast Reactor Cycle Technology Development (FaCT) Project has now started, aiming at the commercialization of a new nuclear power plant system. In parallel with the development of component technologies and their demonstration by testing, a comprehensive evaluation method for the FBR cycle system is under development, employing scenario studies, the discounted cash flow (DCF) method, the analytic hierarchy process (AHP), real options, supply chain management (SCM) and other techniques. Since a commercialized FBR cycle would require long-term, large-scale development involving many participants, modeling of the nuclear system and knowledge management are beneficial even for the development of the evaluation method, and further utilization of OR techniques is highly expected. The comprehensive evaluation methods now in use or under development are reviewed from the standpoint of OR, the 'Science of Better'. (T. Tanaka)

  3. Outline of a multiattribute utility approach to development of a waste management strategy at Sillamaee

    International Nuclear Information System (INIS)

    Anselmo, P.C.

    2000-01-01

    The article briefly discusses a framework for analysis of the waste disposition and management problem at Sillamaee. It is a response to the need to develop a strategic waste management plan for the Sillamaee site. A hypothetical objectives hierarchy is presented, along with two possible methods for aggregating scores for designated alternatives. Waste management and disposal problems, particularly nuclear waste disposal problems, have been addressed by many decision analysts. Such analyses are examples of Multiattribute Utility (MAU) Analysis, a decision analysis technique that is most appropriate for the evaluation of waste management strategies at Sillamaee.
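
    As an illustration of the score-aggregation step mentioned above, the sketch below applies the standard additive multiattribute-utility rule; the attributes, weights and single-attribute scores are hypothetical placeholders and do not come from the Sillamaee analysis.

```python
# Minimal additive MAU aggregation sketch; attributes, weights and scores
# are hypothetical placeholders, not values from the Sillamaee study.
def additive_mau(scores, weights):
    """Aggregate single-attribute utilities (0-1) with weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * scores[a] for a in weights)

weights = {"cost": 0.3, "worker_dose": 0.3, "public_dose": 0.25, "schedule": 0.15}
alternatives = {
    "in_situ_stabilization": {"cost": 0.8, "worker_dose": 0.6, "public_dose": 0.7, "schedule": 0.9},
    "retrieval_and_disposal": {"cost": 0.3, "worker_dose": 0.4, "public_dose": 0.9, "schedule": 0.4},
}
for name, scores in alternatives.items():
    print(name, round(additive_mau(scores, weights), 3))
```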

  4. Risk aversion and uncertainty in cost-effectiveness analysis: the expected-utility, moment-generating function approach.

    Science.gov (United States)

    Elbasha, Elamin H

    2005-05-01

    The availability of patient-level data from clinical trials has spurred a lot of interest in developing methods for quantifying and presenting uncertainty in cost-effectiveness analysis (CEA). Although the majority has focused on developing methods for using sample data to estimate a confidence interval for an incremental cost-effectiveness ratio (ICER), a small strand of the literature has emphasized the importance of incorporating risk preferences and the trade-off between the mean and the variance of returns to investment in health and medicine (mean-variance analysis). This paper shows how the exponential utility-moment-generating function approach is a natural extension to this branch of the literature for modelling choices from healthcare interventions with uncertain costs and effects. The paper assumes an exponential utility function, which implies constant absolute risk aversion, and is based on the fact that the expected value of this function results in a convenient expression that depends only on the moment-generating function of the random variables. The mean-variance approach is shown to be a special case of this more general framework. The paper characterizes the solution to the resource allocation problem using standard optimization techniques and derives the summary measure researchers need to estimate for each programme, when the assumption of risk neutrality does not hold, and compares it to the standard incremental cost-effectiveness ratio. The importance of choosing the correct distribution of costs and effects and the issues related to estimation of the parameters of the distribution are also discussed. An empirical example to illustrate the methods and concepts is provided. Copyright 2004 John Wiley & Sons, Ltd
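
    For reference, the key relationship the abstract builds on can be written out as follows; the notation is generic, and the normal case is simply the special case that recovers the mean-variance form.

```latex
% Exponential utility with constant absolute risk aversion r > 0:
%   u(x) = -e^{-r x}.
% Its expectation depends on the uncertain net benefit X only through the
% moment-generating function M_X(t) = E[e^{tX}]:
\[
  E\!\left[u(X)\right] \;=\; -\,E\!\left[e^{-rX}\right] \;=\; -\,M_X(-r).
\]
% Normal special case X ~ N(\mu, \sigma^2): M_X(-r) = \exp(-r\mu + r^2\sigma^2/2),
% so the certainty equivalent reduces to the familiar mean-variance form
\[
  CE(X) \;=\; \mu \;-\; \tfrac{r}{2}\,\sigma^{2}.
\]
```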

  5. Method, apparatus, and system for utilizing augmented reality to improve surgery

    KAUST Repository

    Cali, Corrado

    2016-10-13

    A method, apparatus, and computer readable medium are provided for utilizing augmented reality visualization to assist surgery. An example method includes generating a three dimensional reconstruction of an image stack representing a target area of a patient, and superimposing, by a head-mounted display, a projection of the three dimensional reconstruction onto a field of view of a user. The method further includes maintaining alignment between the projection and the user's actual view of the target area using a plurality of fiducial markers associated with the target area. In some embodiments, the method further includes scanning the target area to generate the image stack.

  6. A Multi-Approach Evaluation System (MA-ES) of Organic Rankine Cycles (ORC) used in waste heat utilization

    International Nuclear Information System (INIS)

    Shu, Gequn; Yu, Guopeng; Tian, Hua; Wei, Haiqiao; Liang, Xingyu

    2014-01-01

    Highlights: • The MA-ES provides comprehensive evaluations of ORCs used for waste heat utilization. • The MA-ES covers energetic, exergetic and economic evaluations of typical ORCs. • The MA-ES is a general assessment method without restriction to specific ORC conditions. • Two ORC cases of ICE waste-heat-recovery are exemplified applying the MA-ES. - Abstract: A Multi-Approach Evaluation System (MA-ES) is established in this paper to provide comprehensive evaluations of Organic Rankine Cycles (ORC) used for waste heat utilization. The MA-ES covers three main aspects of typical ORC performance: basic evaluations of energy distribution and system efficiency based on the 1st law of thermodynamics; evaluations of exergy distribution and exergy efficiency based on the 2nd law of thermodynamics; and economic evaluations based on calculations of equipment capacity, investment and cost recovery. The MA-ES is organized to provide a general method of ORC performance assessment, without restrictions on system configurations, operation modes, applications, working fluid types, equipment conditions, process parameters and so on. Two ORC cases of internal combustion engine (ICE) waste-heat-recovery are presented to illustrate the application of the evaluation system. The results clearly reveal the performance comparisons among the ORC configurations and working fluids considered, providing credible guidance for ORC design, equipment selection and system construction.

  7. Different methods to define utility functions yield similar results but engage different neural processes

    Directory of Open Access Journals (Sweden)

    Marcus Heldmann

    2009-10-01

    Full Text Available Although the concept of utility is fundamental to many economic theories, a generally accepted method for determining a subject's utility function is not yet available. We investigated two methods used in the economic sciences for describing utility functions, using response-locked event-related potentials to assess their neural underpinnings. For defining the certainty equivalent (CE), we used a lottery game with a probability of winning of p = 0.5; for identifying the subjects' utility functions directly, a standard bisection task was applied. Although the lottery task's payoffs were only hypothetical, a pronounced negativity was observed resembling the error-related negativity (ERN) previously described in action-monitoring research, but this occurred only for choices far away from the indifference point between money and lottery. By contrast, the bisection task failed to evoke an ERN irrespective of the responses' correctness. Based on these findings, we reason that only decisions made in the lottery task achieved a level of subjective relevance that activates cognitive-emotional monitoring. In terms of the economic sciences, our findings support the view that the bisection method is unaffected by any kind of probability valuation or other parameters related to risk and can therefore, in combination with the lottery task, be used to differentiate between payoff and probability valuation.
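
    The bisection task referred to above can be sketched as follows; the "respondent" is simulated with an assumed square-root utility purely to make the example runnable, so the amounts and utilities printed are illustrative rather than experimental data.

```python
# Sketch of the bisection elicitation logic with a simulated respondent.
# simulated_midpoint() stands in for the subject's judgments (true utility
# assumed to be sqrt); it is not data from the study.
def simulated_midpoint(low, high):
    """Amount the simulated respondent judges halfway in utility between
    low and high, given an assumed true utility u(x) = sqrt(x)."""
    target = 0.5 * (low ** 0.5 + high ** 0.5)
    return target ** 2

def bisection_utilities(low, high, levels=2):
    """Assign utilities to elicited midpoints: u(low)=0, u(high)=1, then
    recursively split each interval at its utility midpoint."""
    points = {low: 0.0, high: 1.0}
    intervals = [(low, high)]
    for _ in range(levels):
        next_intervals = []
        for a, b in intervals:
            m = simulated_midpoint(a, b)
            points[m] = 0.5 * (points[a] + points[b])
            next_intervals += [(a, m), (m, b)]
        intervals = next_intervals
    return dict(sorted(points.items()))

for amount, u in bisection_utilities(0.0, 100.0).items():
    print(f"{amount:7.2f}  ->  u = {u:.3f}")
```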

  8. Clinical Utility of Noninvasive Method to Measure Specific Gravity in the Pediatric Population.

    Science.gov (United States)

    Hall, Jeanine E; Huynh, Pauline P; Mody, Ameer P; Wang, Vincent J

    2018-04-01

    Clinicians rely on any combination of signs and symptoms, clinical scores, or invasive procedures to assess the hydration status in children. Noninvasive tests to evaluate for dehydration in the pediatric population are appealing. The objective of our study is to assess the utility of measuring specific gravity of tears compared to specific gravity of urine and the clinical assessment of dehydration. We conducted a prospective cohort convenience sample study, in a pediatric emergency department at a tertiary care children's hospital. We approached parents/guardians of children aged 6 months to 4 years undergoing transurethral catheterization for evaluation of urinary tract infection for enrollment. We collected tears and urine for measurement of tear specific gravity (TSG) and urine specific gravity (USG), respectively. Treating physicians completed dehydration assessment forms to assess for hydration status. Among the 60 participants included, the mean TSG was 1.0183 (SD = 0.007); the mean USG was 1.0186 (SD = 0.0083). TSG and USG were positively correlated with each other (Pearson Correlation = 0.423, p = 0.001). Clinical dehydration scores ranged from 0 to 3, with 87% assigned a score of 0, by physician assessment. Mean number of episodes of vomiting and diarrhea in a 24-hour period were 2.2 (SD = 3.9) and 1.5 (SD = 3.2), respectively. Sixty-two percent of parents reported decreased oral intake. TSG measurements yielded similar results compared with USG. Further studies are needed to determine if TSG can be used as a noninvasive method of dehydration assessment in children. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. A Generalized Approach to Forensic Dye Identification: Development and Utility of Reference Libraries.

    Science.gov (United States)

    Groves, Ethan; Palenik, Skip; Palenik, Christopher S

    2018-04-18

    While color is arguably the most important optical property of evidential fibers, the actual dyestuffs responsible for its expression in them are, in forensic trace evidence examinations, rarely analyzed and still less often identified. This is due, primarily, to the exceedingly small quantities of dye present in a single fiber as well as to the fact that dye identification is a challenging analytical problem, even when large quantities are available for analysis. Among the practical reasons for this are the wide range of dyestuffs available (and the even larger number of trade names), the low total concentration of dyes in the finished product, the limited amount of sample typically available for analysis in forensic cases, and the complexity of the dye mixtures that may exist within a single fiber. Literature on the topic of dye analysis is often limited to a specific method, subset of dyestuffs, or an approach that is not applicable given the constraints of a forensic analysis. Here, we present a generalized approach to dye identification that (1) combines several robust analytical methods, (2) is broadly applicable to a wide range of dye chemistries, application classes, and fiber types, and (3) can be scaled down to forensic casework-sized samples. The approach is based on the development of a reference collection of 300 commercially relevant textile dyes that have been characterized by a variety of microanalytical methods (HPTLC, Raman microspectroscopy, infrared microspectroscopy, UV-Vis spectroscopy, and visible microspectrophotometry). Although there is no single approach that is applicable to all dyes on every type of fiber, a combination of these analytical methods has been applied using a reproducible approach that permits the use of reference libraries to constrain the identity of and, in many cases, identify the dye (or dyes) present in a textile fiber sample.

  10. Prioritizing Roads Safety Based on the Quasi-Induced Exposure Method and Utilization of the Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Sajad rezaei

    2014-06-01

    Full Text Available Safety analysis of roads through accident rates, one of the most widely used tools, relies on the direct exposure method, which is based on vehicle-kilometers traveled and vehicle travel time. However, due to some fundamental flaws in its theory, difficulties in gaining access to the required data (such as traffic volume and the distance and duration of trips), and various problems in determining exposure for specific time, place, and individual categories, an algorithm is needed for prioritizing road safety so that, with a new exposure method, the problems of the previous approaches can be resolved. An efficient application may then lead to more realistic comparisons, and the new method would be applicable to a wider range of time, place, and individual categories. Therefore, an algorithm was introduced to prioritize the safety of roads using the quasi-induced exposure method and the analytical hierarchy process. For this research, 11 provinces of Iran were chosen as case study locations. A rural accident database was created for these provinces, the validity of the quasi-induced exposure method for Iran's accident database was explored, and the involvement ratio for different characteristics of the drivers and the vehicles was measured. Results showed that the quasi-induced exposure method was valid in determining the real exposure in the provinces under study. Results also showed a significant difference in the prioritization based on the new and traditional approaches. This difference mostly stems from the perspective of the quasi-induced exposure method in determining exposure, the opinions of experts, and the quantity of accident data. Overall, the results of this research showed that prioritization based on the new approach is more comprehensive and reliable compared to prioritization based on the traditional approach, which is dependent on various
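
    The core quasi-induced exposure calculation can be sketched in a few lines: not-at-fault drivers in two-vehicle crashes serve as the exposure proxy, and each group's share of at-fault involvements is divided by its share of not-at-fault involvements. The groups and counts below are hypothetical, not values from the Iranian database used in the study.

```python
# Sketch of the quasi-induced exposure calculation; the counts below are
# hypothetical, not taken from the study's crash database.
def relative_involvement_ratio(at_fault, not_at_fault):
    """Relative accident involvement ratio per group:
    share of at-fault involvements / share of not-at-fault involvements,
    where not-at-fault drivers in two-vehicle crashes proxy exposure."""
    total_af = sum(at_fault.values())
    total_naf = sum(not_at_fault.values())
    return {g: (at_fault[g] / total_af) / (not_at_fault[g] / total_naf)
            for g in at_fault}

at_fault = {"age<25": 420, "age25-50": 610, "age>50": 170}
not_at_fault = {"age<25": 250, "age25-50": 720, "age>50": 230}
for group, ratio in relative_involvement_ratio(at_fault, not_at_fault).items():
    print(f"{group:10s} involvement ratio = {ratio:.2f}")
```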

  11. Clinical utility of an endoscopic ultrasound-guided rendezvous technique via various approach routes.

    Science.gov (United States)

    Kawakubo, Kazumichi; Isayama, Hiroyuki; Sasahira, Naoki; Nakai, Yousuke; Kogure, Hirofumi; Hamada, Tsuyoshi; Miyabayashi, Koji; Mizuno, Suguru; Sasaki, Takashi; Ito, Yukiko; Yamamoto, Natsuyo; Hirano, Kenji; Tada, Minoru; Koike, Kazuhiko

    2013-09-01

    The endoscopic ultrasound-guided rendezvous techniques (EUS-rendezvous) provide reliable biliary access after failed endoscopic retrograde cholangiopancreatography (ERCP) cannulation. We evaluated the clinical utility of an EUS-rendezvous technique using various approach routes. Patients undergoing EUS-rendezvous for biliary access after failed bile duct cannulation in ERCP were included. EUS-rendezvous was performed via three approach routes depending on the patient's condition: transgastric, transduodenal in a short endoscopic position, or transduodenal in a long endoscopic position. The main outcomes were the technical success rates. Secondary outcomes were procedure time and complications. Fourteen patients (median age, 77 years) underwent EUS-rendezvous for biliary access resulting from failed biliary cannulation. The reasons for biliary drainage were malignant biliary obstruction in five patients and choledocholithiasis in nine. Transgastric, transduodenal in a short position, and transduodenal in a long position EUS-rendezvous was performed in five, five, and four patients, respectively. Bile duct puncture occurred in the left intrahepatic duct in four patients, right hepatic duct in one, middle common bile duct in four, and lower common bile duct in five. The technical success rate was 100 %. In four patients, the approach route was modified from transduodenal in a short position to transduodenal in a long position or transgastric route. The median procedure time was 81 min. One case each of biliary peritonitis and pancreatitis occurred and were managed conservatively. EUS-rendezvous provided safe and reliable transpapillary bile duct access after failed ERCP cannulation. The selection of the appropriate approach routes, depending on patient condition, is critical.

  12. Regulatory and ratemaking approaches to mitigate financial impacts of net-metered PV on utilities and ratepayers

    International Nuclear Information System (INIS)

    Satchwell, Andrew; Mills, Andrew; Barbose, Galen

    2015-01-01

    The financial interests of U.S. utilities are poorly aligned with customer-sited solar photovoltaics (PV) under traditional regulation. Customer-sited PV, especially under a net-metering arrangement, may result in revenue erosion and lost earnings opportunities for utility shareholders as well as increases in average retail rates for utility ratepayers. Regulators are considering alternative regulatory and ratemaking approaches to mitigate these financial impacts. We performed a scoping analysis using a financial model to quantify the efficacy of mitigation approaches in reducing financial impacts of customer-sited PV on utility shareholders and ratepayers. We find that impacts can be mitigated through various incremental changes to utility regulatory and business models, though the efficacy varies considerably depending on design and particular utility circumstances. Based on this analysis, we discuss tradeoffs policymakers should consider, which ultimately might need to be resolved within broader policy contexts. -- Highlights: •Customer-sited PV negatively impacts utilities and ratepayers. •Regulatory and ratemaking approaches exist to mitigate profitability and rate impacts. •Mitigation approaches entail tradeoffs among stakeholders

  13. Comparison of the difference and delta 15nitrogen approaches for evaluating liquid urea ammonium nitrate utilization by maize

    International Nuclear Information System (INIS)

    Clay, D.E.

    1997-01-01

    Isotopic nitrogen (N) research techniques may be required in watershed studies to determine the impact of landscape position on fertilizer efficiency and the soil N-supplying power. However, traditional approaches using 15N-labeled fertilizer may not be suitable when farmer equipment is used. The delta 15N natural abundance isotopic approach has been used to evaluate N cycling in watersheds. The objectives of this study were to measure the precision of the delta 15N measurement by the Europa 20-20 ratio mass spectrometer (Europa Scientific Ltd, UK), and to compare the difference and delta 15N approaches for measuring fertilizer use by maize (Zea mays). A replicated field study with two N rates (0 and 15.7 g N m-2) was used. Maize samples were collected at the 8th-leaf stage, silking, and plant maturity in 1992 and 1993. Samples were dried (80 degrees C), ground (1 mm), weighed (stover 12 mg and grain 3 mg), and analyzed for total N and delta 15N. Fertilizer utilization at the three growth stages was determined using the natural abundance delta 15N and the non-isotopic difference (fertilizer minus control) techniques. During the study, the mass spectrometer analyzed over 100 samples a day and had consumable costs of less than $2.00 per sample. The standard deviations of the mean were less than 0.11 and 0.21 per thousand in 51 and 77% of the stover samples, respectively. In 1992, grain yields were not influenced by N fertilizer additions, while in 1993 grain yields were increased by N fertilizer. The difference method estimated that in 1992, 16% of the N fertilizer was utilized by the crop, while the natural abundance delta 15N approach estimated that 36% of the fertilizer N was used by the crop. The differences between the values calculated by the two techniques result from the difference method calculating net fertilizer use, while the delta 15N approach calculates the fertilizer N contained in the plant.
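
    For reference, the two estimates compared in this abstract are commonly written in the following generic forms (the exact formulation used by the author may differ in detail).

```latex
% Difference (non-isotopic) method: apparent fertilizer N recovery,
% where N_fertilized and N_control are crop N uptakes with and without fertilizer.
\[
  \%\,\mathrm{recovery}_{\mathrm{diff}}
   \;=\; \frac{N_{\mathrm{fertilized}} - N_{\mathrm{control}}}{N_{\mathrm{applied}}} \times 100
\]
% Natural-abundance delta-15N method: fraction of plant N derived from fertilizer.
\[
  \%\,N_{\mathrm{dff}}
   \;=\; \frac{\delta^{15}N_{\mathrm{control\;plant}} - \delta^{15}N_{\mathrm{fertilized\;plant}}}
              {\delta^{15}N_{\mathrm{control\;plant}} - \delta^{15}N_{\mathrm{fertilizer}}} \times 100
\]
```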

  14. An integrated multicriteria decision-making approach for evaluating nuclear fuel cycle systems for long-term sustainability on the basis of an equilibrium model: Technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory combined with analytic hierarchy process

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Sae Rom [Dept of Quantum Energy Chemical Engineering, Korea University of Science and Technology (KUST), Daejeon (Korea, Republic of); Choi, Sung Yeol [Ulsan National Institute of Science and Technology, Ulju (Korea, Republic of); Ko, Wonil [Nonproliferation System Development Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2017-02-15

    The focus on the issues surrounding spent nuclear fuel and the lifetime extension of old nuclear power plants continues to grow. A transparent decision-making process to identify the most suitable nuclear fuel cycle (NFC) is considered to be the key task in the current situation. In this study, an attempt is made to develop an equilibrium model for the NFC to calculate the material flows based on 1 TWh of electricity production, and to perform integrated multicriteria decision-making analyses via the analytic hierarchy process, technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory methods. This comparative study is aimed at screening and ranking the three selected NFC options against five aspects: sustainability, environmental friendliness, economics, proliferation resistance, and technical feasibility. The selected fuel cycle options are the pressurized water reactor (PWR) once-through cycle, the PWR mixed oxide cycle, and the pyroprocessing sodium-cooled fast reactor cycle. A sensitivity analysis was performed to prove the robustness of the results and explore the influence of the criteria on the obtained ranking. As a result of the comparative analysis, the pyroprocessing sodium-cooled fast reactor cycle is determined to be the most competitive option among the NFC scenarios.

  15. An integrated multicriteria decision-making approach for evaluating nuclear fuel cycle systems for long-term sustainability on the basis of an equilibrium model: Technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory combined with analytic hierarchy process

    International Nuclear Information System (INIS)

    Yoon, Sae Rom; Choi, Sung Yeol; Ko, Wonil

    2017-01-01

    The focus on the issues surrounding spent nuclear fuel and the lifetime extension of old nuclear power plants continues to grow. A transparent decision-making process to identify the most suitable nuclear fuel cycle (NFC) is considered to be the key task in the current situation. In this study, an attempt is made to develop an equilibrium model for the NFC to calculate the material flows based on 1 TWh of electricity production, and to perform integrated multicriteria decision-making analyses via the analytic hierarchy process, technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory methods. This comparative study is aimed at screening and ranking the three selected NFC options against five aspects: sustainability, environmental friendliness, economics, proliferation resistance, and technical feasibility. The selected fuel cycle options are the pressurized water reactor (PWR) once-through cycle, the PWR mixed oxide cycle, and the pyroprocessing sodium-cooled fast reactor cycle. A sensitivity analysis was performed to prove the robustness of the results and explore the influence of the criteria on the obtained ranking. As a result of the comparative analysis, the pyroprocessing sodium-cooled fast reactor cycle is determined to be the most competitive option among the NFC scenarios.

  16. An Integrated Multicriteria Decision-Making Approach for Evaluating Nuclear Fuel Cycle Systems for Long-term Sustainability on the Basis of an Equilibrium Model: Technique for Order of Preference by Similarity to Ideal Solution, Preference Ranking Organization Method for Enrichment Evaluation, and Multiattribute Utility Theory Combined with Analytic Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Saerom Yoon

    2017-02-01

    Full Text Available The focus on the issues surrounding spent nuclear fuel and the lifetime extension of old nuclear power plants continues to grow. A transparent decision-making process to identify the most suitable nuclear fuel cycle (NFC) is considered to be the key task in the current situation. In this study, an attempt is made to develop an equilibrium model for the NFC to calculate the material flows based on 1 TWh of electricity production, and to perform integrated multicriteria decision-making analyses via the analytic hierarchy process, technique for order of preference by similarity to ideal solution, preference ranking organization method for enrichment evaluation, and multiattribute utility theory methods. This comparative study is aimed at screening and ranking the three selected NFC options against five aspects: sustainability, environmental friendliness, economics, proliferation resistance, and technical feasibility. The selected fuel cycle options are the pressurized water reactor (PWR) once-through cycle, the PWR mixed oxide cycle, and the pyroprocessing sodium-cooled fast reactor cycle. A sensitivity analysis was performed to prove the robustness of the results and explore the influence of the criteria on the obtained ranking. As a result of the comparative analysis, the pyroprocessing sodium-cooled fast reactor cycle is determined to be the most competitive option among the NFC scenarios.
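
    One of the methods named in these records, TOPSIS, can be sketched as follows; the decision matrix, criterion weights and direction-of-benefit flags are hypothetical placeholders for the three NFC options and five aspects, not the study's data.

```python
# Minimal TOPSIS sketch; the 3x5 decision matrix and weights are hypothetical
# placeholders for the three NFC options and five criteria, not study data.
import numpy as np

def topsis(matrix, weights, benefit):
    """matrix: alternatives x criteria; benefit[j] True if larger is better."""
    m = matrix / np.linalg.norm(matrix, axis=0)          # vector-normalize columns
    v = m * weights                                      # weighted matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                       # closeness to the ideal

options = ["PWR once-through", "PWR MOX", "Pyro-SFR"]
matrix = np.array([[0.5, 0.4, 0.9, 0.8, 0.9],
                   [0.6, 0.6, 0.6, 0.5, 0.7],
                   [0.9, 0.8, 0.5, 0.7, 0.4]])
weights = np.array([0.3, 0.2, 0.2, 0.2, 0.1])            # e.g., derived with AHP
benefit = np.array([True, True, True, True, True])
for name, score in zip(options, topsis(matrix, weights, benefit)):
    print(f"{name:18s} closeness = {score:.3f}")
```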

  17. An IR-Based Approach Utilizing Query Expansion for Plagiarism Detection in MEDLINE.

    Science.gov (United States)

    Nawab, Rao Muhammad Adeel; Stevenson, Mark; Clough, Paul

    2017-01-01

    The identification of duplicated and plagiarized passages of text has become an increasingly active area of research. In this paper, we investigate methods for plagiarism detection that aim to identify potential sources of plagiarism from MEDLINE, particularly when the original text has been modified through the replacement of words or phrases. A scalable approach based on Information Retrieval is used to perform candidate document selection (the identification of a subset of potential source documents given a suspicious text) from MEDLINE. Query expansion is performed using the UMLS Metathesaurus to deal with situations in which original documents are obfuscated. Various approaches to Word Sense Disambiguation are investigated to deal with cases where there are multiple Concept Unique Identifiers (CUIs) for a given term. Results using the proposed IR-based approach outperform a state-of-the-art baseline based on the Kullback-Leibler distance.
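
    A generic sketch of the candidate document selection step is shown below, using TF-IDF retrieval with scikit-learn; it is not the authors' system, and expand_query is only a placeholder for the UMLS Metathesaurus-based query expansion described in the abstract.

```python
# Generic candidate-selection sketch with TF-IDF retrieval (scikit-learn);
# expand_query() is a placeholder for UMLS-based query expansion.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "myocardial infarction is associated with elevated troponin levels",
    "metformin improves glycemic control in type 2 diabetes mellitus",
    "statins reduce low density lipoprotein cholesterol concentrations",
]

def expand_query(text):
    # Placeholder: a real system would add UMLS Metathesaurus synonyms here.
    return text

def candidate_documents(suspicious_text, docs, top_k=2):
    vectorizer = TfidfVectorizer(stop_words="english")
    doc_matrix = vectorizer.fit_transform(docs)
    query_vec = vectorizer.transform([expand_query(suspicious_text)])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = sorted(zip(scores, range(len(docs))), reverse=True)
    return [(doc_id, score) for score, doc_id in ranked[:top_k]]

print(candidate_documents("raised troponin after a heart attack", documents))
```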

  18. Advancing biomarker research: utilizing 'Big Data' approaches for the characterization and prevention of bipolar disorder.

    Science.gov (United States)

    McIntyre, Roger S; Cha, Danielle S; Jerrell, Jeanette M; Swardfager, Walter; Kim, Rachael D; Costa, Leonardo G; Baskaran, Anusha; Soczynska, Joanna K; Woldeyohannes, Hanna O; Mansur, Rodrigo B; Brietzke, Elisa; Powell, Alissa M; Gallaugher, Ashley; Kudlow, Paul; Kaidanovich-Beilin, Oksana; Alsuwaidan, Mohammad

    2014-08-01

    To provide a strategic framework for the prevention of bipolar disorder (BD) that incorporates a 'Big Data' approach to risk assessment for BD. Computerized databases (e.g., Pubmed, PsychInfo, and MedlinePlus) were used to access English-language articles published between 1966 and 2012 with the search terms bipolar disorder, prodrome, 'Big Data', and biomarkers cross-referenced with genomics/genetics, transcriptomics, proteomics, metabolomics, inflammation, oxidative stress, neurotrophic factors, cytokines, cognition, neurocognition, and neuroimaging. Papers were selected from the initial search if the primary outcome(s) of interest was (were) categorized in any of the following domains: (i) 'omics' (e.g., genomics), (ii) molecular, (iii) neuroimaging, and (iv) neurocognitive. The current strategic approach to identifying individuals at risk for BD, with an emphasis on phenotypic information and family history, has insufficient predictive validity and is clinically inadequate. The heterogeneous clinical presentation of BD, as well as its pathoetiological complexity, suggests that it is unlikely that a single biomarker (or an exclusive biomarker approach) will sufficiently augment currently inadequate phenotypic-centric prediction models. We propose a 'Big Data'- bioinformatics approach that integrates vast and complex phenotypic, anamnestic, behavioral, family, and personal 'omics' profiling. Bioinformatic processing approaches, utilizing cloud- and grid-enabled computing, are now capable of analyzing data on the order of tera-, peta-, and exabytes, providing hitherto unheard of opportunities to fundamentally revolutionize how psychiatric disorders are predicted, prevented, and treated. High-throughput networks dedicated to research on, and the treatment of, BD, integrating both adult and younger populations, will be essential to sufficiently enroll adequate samples of individuals across the neurodevelopmental trajectory in studies to enable the characterization

  19. Two-stage discrete-continuous multi-objective load optimization: An industrial consumer utility approach to demand response

    International Nuclear Information System (INIS)

    Abdulaal, Ahmed; Moghaddass, Ramin; Asfour, Shihab

    2017-01-01

    Highlights: •Two-stage model links discrete optimization to real-time system dynamics operation. •The solutions obtained are non-dominated Pareto optimal solutions. •Computationally efficient GA solver through customized chromosome coding. •Modest to considerable savings are achieved depending on the consumer's preference. -- Abstract: In the wake of today's highly dynamic and competitive energy markets, optimal dispatching of energy sources requires effective demand responsiveness. Suppliers have adopted dynamic pricing strategies in efforts to control downstream demand. This method, however, requires consumer awareness, flexibility, and timely responsiveness. While residential activities are more flexible and schedulable, larger commercial consumers remain an obstacle due to the impacts on industrial performance. This paper combines methods from quadratic, stochastic, and evolutionary programming with multi-objective optimization and continuous simulation to propose a two-stage discrete-continuous multi-objective load optimization (DiCoMoLoOp) autonomous approach for industrial consumer demand response (DR). Stage 1 defines discrete-event load shifting targets. Accordingly, controllable loads are continuously optimized in stage 2 while considering the consumer's utility. Utility functions, which measure the loads' time value to the consumer, are derived and weights are assigned through an analytical hierarchy process (AHP). The method is demonstrated for an industrial building model using real data. The proposed method integrates with the building energy management system and solves in real time with autonomous and instantaneous load shifting in the hour-ahead energy price (HAP) market. The simulation shows the occasional existence of multiple load management options on the Pareto frontier. Finally, the computed savings, based on the simulation analysis with real consumption, climate, and price data, ranged from modest to considerable amounts.

  20. Strategic Approach in Enhancing the Utilization of GGH Facilities Towards High Impact of Agrobiotechnology

    International Nuclear Information System (INIS)

    Azhar Mohamad; Ahsanulkhaliqin Abdul Wahab

    2013-01-01

    The gamma greenhouse (GGH) is associated with chronic irradiation of living organisms. The facility is equipped with a 137Cs source of relatively high energy (t1/2 = 30.1 years). The energy associated with gamma radiation is high enough to break molecular bonds and ionize atoms without affecting the structure of the atomic nucleus (avoiding induction of radioactivity). Nuclear Malaysia is the only institute in Malaysia that provides such a facility for research and development on chronic mutagenesis. Chronic gamma irradiation is an exposure to ionizing radiation over an extended period (hours, weeks or months) depending on the nature, sensitivity and requirements of the research. The alterations produced by chronic irradiation are substantial, resulting in changes to physical appearance, molecular structures and metabolism. These changes are random events, heritable, and their stability depends on the cell damage incurred at the molecular level after irradiation. In agrobiotechnology, chronic gamma irradiation produces a wider mutation spectrum and is useful for minimizing radiation damage while obtaining new improved traits of commercial value. Continuous exposure to low doses of gamma irradiation results in a considerably elevated somaclonal variation frequency without negative effects on the natural response. However, the facility is still under-utilized, especially by researchers in Malaysia. Strategic approaches such as seminars, public talks, direct connections and engagement through collaboration, research activities and road shows are expected to bring more users and to convey the high-impact activities possible at the GGH. (author)

  1. Simple noninvasive quantification method for measuring myocardial glucose utilization in humans employing positron emission tomography and fluorine-18 deoxyglucose

    International Nuclear Information System (INIS)

    Gambhir, S.S.; Schwaiger, M.; Huang, S.C.; Krivokapich, J.; Schelbert, H.R.; Nienaber, C.A.; Phelps, M.E.

    1989-01-01

    To estimate regional myocardial glucose utilization (rMGU) with positron emission tomography (PET) and 2-[18F]fluoro-2-deoxy-D-glucose (FDG) in humans, we studied a method which simplifies the experimental procedure and is computationally efficient. This imaging approach uses a blood time-activity curve derived from a region of interest (ROI) drawn over dynamic PET images of the left ventricle (LV), and a Patlak graphic analysis. The spillover of radioactivity from the cardiac chambers to the myocardium is automatically removed by this analysis. Estimates of rMGU were obtained from FDG PET cardiac studies of six normal human subjects. Results from this study indicate that the FDG time-activity curve obtained from the LV ROI matched well with the arterial plasma curve. The rMGU obtained by Patlak graphic analysis was in good agreement with direct curve-fitting results (r = 0.90). The average standard error of the estimate of the Patlak rMGU was low (3%). These results demonstrate the practical usefulness of a simplified method for the estimation of rMGU in humans by PET. This approach is noninvasive, computationally fast, and highly suited for developing parametric images of the myocardial glucose utilization rate.
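
    The Patlak graphic analysis referred to above amounts to a linear regression of normalized tissue activity against normalized integrated plasma activity; the sketch below illustrates that step with hypothetical time-activity values, plasma glucose and lumped constant, not patient data.

```python
# Generic Patlak graphic analysis sketch; the time-activity values, plasma
# glucose and lumped constant below are hypothetical, not patient data.
import numpy as np

def patlak_slope(t, c_plasma, c_tissue):
    """Fit C_t(t)/C_p(t) = Ki * (integral of C_p)/C_p(t) + V0 over the late
    frames and return the influx constant Ki and intercept V0."""
    integral = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 *
                                                (c_plasma[1:] + c_plasma[:-1]))))
    x = integral / c_plasma          # "stretched time"
    y = c_tissue / c_plasma
    ki, v0 = np.polyfit(x[len(x) // 2:], y[len(y) // 2:], 1)  # late linear part
    return ki, v0

t = np.array([1, 5, 10, 20, 30, 40, 50, 60], dtype=float)     # minutes
c_p = np.array([90, 40, 25, 15, 11, 9, 8, 7], dtype=float)    # plasma FDG activity
c_t = np.array([10, 20, 28, 38, 46, 53, 60, 66], dtype=float) # myocardial activity
ki, _ = patlak_slope(t, c_p, c_t)
rmgu = ki * 95.0 / 0.67   # assumed plasma glucose / assumed lumped constant
print(f"Ki = {ki:.4f} per min, rMGU ~ {rmgu:.1f} (illustrative units)")
```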

  2. Parallelised Krylov subspace method for reactor kinetics by IQS approach

    International Nuclear Information System (INIS)

    Gupta, Anurag; Modak, R.S.; Gupta, H.P.; Kumar, Vinod; Bhatt, K.

    2005-01-01

    Nuclear reactor kinetics involves numerical solution of space-time-dependent multi-group neutron diffusion equation. Two distinct approaches exist for this purpose: the direct (implicit time differencing) approach and the improved quasi-static (IQS) approach. Both the approaches need solution of static space-energy-dependent diffusion equations at successive time-steps; the step being relatively smaller for the direct approach. These solutions are usually obtained by Gauss-Seidel type iterative methods. For a faster solution, the Krylov sub-space methods have been tried and also parallelised by many investigators. However, these studies seem to have been done only for the direct approach. In the present paper, parallelised Krylov methods are applied to the IQS approach in addition to the direct approach. It is shown that the speed-up obtained for IQS is higher than that for the direct approach. The reasons for this are also discussed. Thus, the use of IQS approach along with parallelised Krylov solvers seems to be a promising scheme
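
    As a toy illustration of the kind of Krylov solve required at each step, the sketch below applies GMRES (SciPy) to a 1-D, one-group, fixed-source diffusion problem; the mesh, cross sections and source are assumed values, and the example ignores the multigroup and parallelization aspects discussed in the paper.

```python
# Toy sketch: one fixed-source step of a 1-D, one-group diffusion problem
# solved with a Krylov subspace method (GMRES); a stand-in for the static
# solves required at each time step of the direct or IQS schemes.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

n, h = 200, 0.5                      # mesh cells, cell width (cm)
D, sigma_a = 1.3, 0.05               # diffusion coefficient, absorption
source = np.ones(n)                  # fixed source term

# -D * d2(phi)/dx2 + sigma_a * phi = source, zero-flux boundaries
main = 2.0 * D / h**2 + sigma_a
off = -D / h**2
A = diags([off, main, off], offsets=[-1, 0, 1], shape=(n, n), format="csr")

phi, info = gmres(A, source)
status = "converged" if info == 0 else f"gmres info={info}"
print(status, "| max flux:", float(phi.max()))
```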

  3. An innovative computationally efficient hydromechanical coupling approach for fault reactivation in geological subsurface utilization

    Science.gov (United States)

    Adams, M.; Kempka, T.; Chabab, E.; Ziegler, M.

    2018-02-01

    Estimating the efficiency and sustainability of geological subsurface utilization, i.e., Carbon Capture and Storage (CCS), requires an integrated risk assessment approach that considers the coupled processes occurring, among others the potential reactivation of existing faults. In this context, hydraulic and mechanical parameter uncertainties as well as different injection rates have to be considered and quantified to elaborate reliable environmental impact assessments. Consequently, the required sensitivity analyses consume significant computational time due to the high number of realizations that have to be carried out. Because of the high computational costs of two-way coupled simulations in large-scale 3D multiphase fluid flow systems, these are not applicable for the purpose of uncertainty and risk assessments. Hence, an innovative semi-analytical hydromechanical coupling approach for hydraulic fault reactivation is introduced. This approach determines the void ratio evolution in representative fault elements using one preliminary base simulation, considering one model geometry and one set of hydromechanical parameters. The void ratio development is then approximated and related to one reference pressure at the base of the fault. The parametrization of the resulting functions is then directly implemented into a multiphase fluid flow simulator to carry out the semi-analytical coupling for the simulation of hydromechanical processes. Hereby, the iterative parameter exchange between the multiphase and mechanical simulators is omitted, since the update of porosity and permeability is controlled by one reference pore pressure at the fault base. The suggested procedure is capable of reducing the computational time required by coupled hydromechanical simulations of a multitude of injection rates by a factor of up to 15.

  4. Trace and low concentration CO2 removal methods and apparatus utilizing metal organic frameworks

    KAUST Repository

    Eddaoudi, Mohamed

    2016-03-10

    In general, this disclosure describes techniques for removing trace and low concentration CO2 from fluids using SIFSIX-n-M MOFs, wherein n is at least two and M is a metal. In some embodiments, the metal is zinc or copper. Embodiments include devices comprising SIFSIX-n-M MOFs for removing CO2 from fluids. In particular, embodiments relate to devices and methods utilizing SIFSIX-n-M MOFs for removing CO2 from fluids, wherein the CO2 concentration is trace. Methods utilizing SIFSIX-n-M MOFs for removing CO2 from fluids can occur in confined spaces. SIFSIX-n-M MOFs can comprise bidentate organic ligands. In a specific embodiment, SIFSIX-n-M MOFs comprise pyrazine or dipyridylacetylene ligands.

  5. Quantitative determination of the crystalline phases of the ceramic materials utilizing the Rietveld method

    International Nuclear Information System (INIS)

    Kniess, C.T.; Prates, P.B.; Lima, J.C. de; Kuhnen, N.C.; Riella, H.G.; Maliska, A.M.

    2009-01-01

    Ceramic materials have properties defined by their chemical and microstructural composition. The quantification of the crystalline phases is a fundamental stage in the determination of the structure, properties and applications of a ceramic material. Within this context, this study aims at the quantitative determination of the crystalline phases of ceramic materials developed with the addition of mineral coal bottom ash, utilizing the X-ray diffraction technique through the method proposed by Rietveld. For the formulation of the ceramic mixtures, a {3,3} simplex-lattice design was used, giving ten formulations of three components (two different types of clay and coal bottom ash). The crystalline phases identified in the ceramic materials after sintering at 1150 deg C for two hours are quartz, tridymite, mullite and hematite. The proposed methodology utilizing the Rietveld method for the quantification of the crystalline phases of the materials was shown to be adequate and efficient. (author)
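
    For reference, Rietveld quantitative phase analysis commonly derives weight fractions from the refined scale factors via the standard relation below, applicable when all crystalline phases are included in the refinement; the notation is generic.

```latex
% Weight fraction W_p of phase p from the refined Rietveld scale factors S,
% with Z the number of formula units per cell, M the formula mass and V the
% unit-cell volume, summed over all N refined crystalline phases.
\[
  W_p \;=\; \frac{S_p \,(Z M V)_p}{\displaystyle\sum_{i=1}^{N} S_i \,(Z M V)_i}
\]
```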

  6. A Utility Maximizing and Privacy Preserving Approach for Protecting Kinship in Genomic Databases.

    Science.gov (United States)

    Kale, Gulce; Ayday, Erman; Tastan, Oznur

    2017-09-12

    Rapid and low-cost sequencing of genomes has enabled the widespread use of genomic data in research studies and personalized customer applications, where genomic data are shared in public databases. Although the identities of participants are anonymized in these databases, sensitive information about individuals can still be inferred. One such piece of information is kinship. We define two routes through which kinship privacy can leak and propose a technique to protect kinship privacy against these risks while maximizing the utility of the shared data. The method involves systematic identification of minimal portions of genomic data to mask as new participants are added to the database. Choosing the proper positions to hide is cast as an optimization problem in which the number of positions to mask is minimized subject to privacy constraints that ensure the familial relationships are not revealed. We evaluate the proposed technique on real genomic data. Results indicate that concurrent sharing of data pertaining to a parent and an offspring results in high risks to kinship privacy, whereas sharing data from more distant relatives together is often safer. We also show that the arrival order of family members has a high impact on the level of privacy risk and on the utility of the shared data. Available at: https://github.com/tastanlab/Kinship-Privacy. erman@cs.bilkent.edu.tr or oznur.tastan@cs.bilkent.edu.tr. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com

  7. Agile Service Development: A Rule-Based Method Engineering Approach

    NARCIS (Netherlands)

    dr. Martijn Zoet; Stijn Hoppenbrouwers; Inge van de Weerd; Johan Versendaal

    2011-01-01

    Agile software development has evolved into an increasingly mature software development approach and has been applied successfully in many software vendors’ development departments. In this position paper, we address the broader agile service development. Based on method engineering principles we

  8. Methods in pharmacoepidemiology: a review of statistical analyses and data reporting in pediatric drug utilization studies.

    Science.gov (United States)

    Sequi, Marco; Campi, Rita; Clavenna, Antonio; Bonati, Maurizio

    2013-03-01

    To evaluate the quality of data reporting and the statistical methods used in drug utilization studies in the pediatric population. Drug utilization studies evaluating all drug prescriptions to children and adolescents published between January 1994 and December 2011 were retrieved and analyzed. For each study, information on measures of exposure/consumption, the covariates considered, descriptive and inferential analyses, statistical tests, and methods of data reporting was extracted. An overall quality score was created for each study using a 12-item checklist that took into account the presence of outcome measures, covariates of measures, descriptive measures, statistical tests, and graphical representation. A total of 22 studies were reviewed and analyzed. Of these, 20 studies reported at least one descriptive measure. The mean was the most commonly used measure (18 studies), but only five of these also reported the standard deviation. Statistical analyses were performed in 12 studies, with the chi-square test being the most commonly performed test. Graphs were presented in 14 papers. Sixteen papers reported the number of drug prescriptions and/or packages, and ten reported the prevalence of drug prescription. The mean quality score was 8 (median 9). Only seven of the 22 studies received a score of ≥10, while four studies received a low score; few studies performed statistical analyses and reported data in a satisfactory manner. We therefore conclude that the methodology of drug utilization studies needs to be improved.

  9. An intelligent approach to optimize the EDM process parameters using utility concept and QPSO algorithm

    Directory of Open Access Journals (Sweden)

    Chinmaya P. Mohanty

    2017-04-01

    Full Text Available Although significant research has gone into the field of electrical discharge machining (EDM), analysis related to the machining efficiency of the process with different electrodes has not been adequately made. Copper and brass are frequently used as electrode materials, but graphite can be used as a potential electrode material due to its high melting point and good electrical conductivity. In view of this, the present work attempts to compare the machinability of copper, graphite and brass electrodes while machining Inconel 718 super alloy. Taguchi's L27 orthogonal array has been employed to collect data for the study and analyze the effect of machining parameters on performance measures. The important performance measures selected for this study are material removal rate, tool wear rate, surface roughness and radial overcut. The machining parameters considered for analysis are open circuit voltage, discharge current, pulse-on-time, duty factor, flushing pressure and electrode material. From the experimental analysis, it is observed that electrode material, discharge current and pulse-on-time are the important parameters for all the performance measures. The utility concept has been implemented to transform the multiple performance characteristics into an equivalent single performance characteristic. Non-linear regression analysis is carried out to develop a model relating the process parameters and the overall utility index. Finally, the quantum-behaved particle swarm optimization (QPSO) and particle swarm optimization (PSO) algorithms have been used to compare the optimal levels of the cutting parameters. Results demonstrate the elegance of QPSO in terms of convergence and computational effort. The optimal parametric setting obtained through both approaches is validated by conducting confirmation experiments.

  10. Mean-variance portfolio optimization by using time series approaches based on logarithmic utility function

    Science.gov (United States)

    Soeryana, E.; Fadhlina, N.; Sukono; Rusyaman, E.; Supian, S.

    2017-01-01

    Investments in stocks investors are also faced with the issue of risk, due to daily price of stock also fluctuate. For minimize the level of risk, investors usually forming an investment portfolio. Establishment of a portfolio consisting of several stocks are intended to get the optimal composition of the investment portfolio. This paper discussed about optimizing investment portfolio of Mean-Variance to stocks by using mean and volatility is not constant based on logarithmic utility function. Non constant mean analysed using models Autoregressive Moving Average (ARMA), while non constant volatility models are analysed using the Generalized Autoregressive Conditional heteroscedastic (GARCH). Optimization process is performed by using the Lagrangian multiplier technique. As a numerical illustration, the method is used to analyse some Islamic stocks in Indonesia. The expected result is to get the proportion of investment in each Islamic stock analysed.

  11. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  12. Possibilities of Utilizing the Method of Analytical Hierarchy Process Within the Strategy of Corporate Social Business

    Science.gov (United States)

    Drieniková, Katarína; Hrdinová, Gabriela; Naňo, Tomáš; Sakál, Peter

    2010-01-01

    The paper deals with the analysis of the theory of corporate social responsibility, risk management and the exact method of the analytic hierarchy process that is used in decision-making processes. Chapters 2 and 3 focus on presenting experience with the application of the method in formulating the stakeholders' strategic goals within Corporate Social Responsibility (CSR) and, simultaneously, its utilization in minimizing environmental risks. The major benefit of this paper is the application of the Analytical Hierarchy Process (AHP).
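
    The core AHP step, deriving priority weights from a pairwise comparison matrix and checking consistency, can be sketched as follows; the 3x3 comparison matrix is a hypothetical illustration, not data from the paper.

```python
# Minimal AHP sketch: priority weights from a pairwise-comparison matrix via
# the principal eigenvector, plus Saaty's consistency ratio. The 3x3 matrix
# below is a hypothetical illustration, not data from the paper.
import numpy as np

def ahp_priorities(pairwise):
    """Return (weights, consistency_ratio) for a reciprocal comparison matrix."""
    n = pairwise.shape[0]
    eigvals, eigvecs = np.linalg.eig(pairwise)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                          # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty random index
    return w, (ci / ri if ri else 0.0)

# e.g., environmental risk vs. economic benefit vs. social acceptance
A = np.array([[1.0,   3.0, 5.0],
              [1/3.0, 1.0, 2.0],
              [1/5.0, 1/2.0, 1.0]])
weights, cr = ahp_priorities(A)
print("weights:", np.round(weights, 3), "| consistency ratio:", round(cr, 3))
```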

  13. Approaches to greenhouse gas accounting methods for biomass carbon

    International Nuclear Information System (INIS)

    Downie, Adriana; Lau, David; Cowie, Annette; Munroe, Paul

    2014-01-01

    This investigation examines different approaches to the GHG flux accounting of activities within a tight boundary of biomass C cycling, with the scope limited to exclude all other aspects of the lifecycle. Alternative approaches are examined that a) account for all emissions including biogenic CO2 cycling – the biogenic method; b) account for the quantity of C that is moved to and maintained in the non-atmospheric pool – the stock method; and c) assume that the net balance of C taken up by biomass is neutral over the short term and hence there is no requirement to include this C in the calculation – the simplified method. This investigation demonstrates the inaccuracies in both emissions forecasting and abatement calculations that result from the use of the simplified method, which is commonly accepted for use. It has been found that the stock method is the most accurate and appropriate approach for use in calculating GHG inventories; however, short-comings of this approach emerge when applied to abatement projects, as it does not account for the increase in biogenic CO2 emissions that are generated when non-CO2 GHG emissions in the business-as-usual case are offset. Therefore the biogenic method or a modified version of the stock method should be used to accurately estimate the GHG emissions abatement achieved by a project. This investigation uses both the derivation of methodology equations from first principles and worked examples to explore the fundamental differences between the alternative approaches. Examples are developed for three project scenarios including landfill, combustion and slow-pyrolysis (biochar) of biomass. -- Highlights: • Different approaches can be taken to account for the GHG emissions from biomass. • Simplification of GHG accounting methods is useful; however, it can lead to inaccuracies. • Approaches used currently are often inadequate for practices that store carbon. • Accounting methods for emissions forecasting can be inadequate for
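
    A small worked example in the spirit of the paper's comparison is sketched below; all quantities are assumed (1000 kg of biomass C, half retained as stable char), and it simply shows how the three approaches book the same C-storing scenario.

```python
# Hypothetical worked example (all quantities assumed): how the three
# accounting approaches book a practice that stores half of the biomass C
# (e.g., slow-pyrolysis biochar). Negative values = net removal of CO2.
C_TO_CO2 = 44.0 / 12.0          # mass conversion factor, C -> CO2

biomass_c = 1000.0              # kg C taken up by the biomass
stored_c = 500.0                # kg C retained in the stable char pool
released_c = biomass_c - stored_c

biogenic = (-biomass_c + released_c) * C_TO_CO2   # uptake and re-release both counted
stock = -stored_c * C_TO_CO2                      # only the change in stored C counted
simplified = 0.0                                  # biogenic C assumed neutral, ignored

print(f"biogenic  : {biogenic:8.1f} kg CO2")
print(f"stock     : {stock:8.1f} kg CO2")
print(f"simplified: {simplified:8.1f} kg CO2  (misses the stored carbon)")
```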

  14. Min-max optimization and the radial approach to the public service system design with generalized utility

    Directory of Open Access Journals (Sweden)

    Jaroslav Janáček

    2016-04-01

    Full Text Available The paper deals with the min-max public service system design in which a generalized utility is considered. In contrast to the formulations presented in the literature, the generalized utility defined for a public service system assumes that a user's utility generally comes from more than one located service center, and the individual contributions from the relevant centers are weighted by reduction coefficients depending on the center order. Given that commercial IP-solvers often fail due to enormous computational times or extreme memory demands when solving this problem, we suggested and compared several approaches based on a bisection process with the purpose of developing an effective min-max approach to the public service system design with a generalized utility.
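
    The bisection idea can be sketched for the plain min-max covering case as follows; this is a simplification of the paper's approach (it ignores the generalized utility and uses a greedy heuristic in place of the exact covering subproblem), and all distances are randomly generated placeholders.

```python
# Simplified sketch of the bisection (radial) idea for a plain min-max design:
# binary-search the covering radius R and test whether p centers can cover all
# users within R. The greedy cover below is only a heuristic stand-in for the
# exact covering subproblem; all data are hypothetical.
import random

def can_cover(dist, candidates, users, p, radius):
    """Greedy heuristic: try to cover all users with at most p centers."""
    uncovered, chosen = set(users), 0
    while uncovered and chosen < p:
        best = max(candidates,
                   key=lambda c: sum(dist[c][u] <= radius for u in uncovered))
        uncovered -= {u for u in uncovered if dist[best][u] <= radius}
        chosen += 1
    return not uncovered

random.seed(1)
users = range(30)
candidates = range(10)
dist = {c: {u: random.uniform(1, 50) for u in users} for c in candidates}
p = 3

lo, hi = 0.0, 50.0
while hi - lo > 0.5:                      # bisection on the covering radius
    mid = (lo + hi) / 2
    lo, hi = (lo, mid) if can_cover(dist, candidates, users, p, mid) else (mid, hi)
print(f"approximate min-max radius with p={p}: {hi:.1f}")
```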

  15. Diagnostic utility of the cell block method versus the conventional smear study in pleural fluid cytology

    Directory of Open Access Journals (Sweden)

    Udasimath Shivakumarswamy

    2012-01-01

    Full Text Available Background: The cytological examination of serous effusions is well accepted, and a positive diagnosis is often considered a definitive diagnosis. It helps in the staging, prognosis and management of patients with malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells by the routine conventional smear (CS) method. Aims: To compare the morphological features seen with the CS method with those of the cell block (CB) method, and also to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. Materials and Methods: The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared using 10% alcohol-formalin as a fixative agent. Statistical analysis with the 'z test' was performed to compare the cellularity obtained with the CS and CB methods. McNemar's chi-square test was used to identify the additional yield for malignancy by the CB method. Results: Cellularity and the additional yield for malignancy were 15% higher with the CB method. Conclusions: The CB method provides higher cellularity, better architectural patterns and morphological features, and an additional yield of malignant cells, and thereby increases the sensitivity of cytodiagnosis when compared with the CS method.

  16. Comparison of direct and indirect methods of estimating health state utilities for resource allocation: review and empirical analysis.

    Science.gov (United States)

    Arnold, David; Girling, Alan; Stevens, Andrew; Lilford, Richard

    2009-07-22

    Utilities (values representing preferences) for healthcare priority setting are typically obtained indirectly by asking patients to fill in a quality of life questionnaire and then converting the results to a utility using population values. We compared such utilities with those obtained directly from patients or the public. Review of studies providing both a direct and indirect utility estimate. Papers reporting comparisons of utilities obtained directly (standard gamble or time tradeoff) or indirectly (European quality of life 5D [EQ-5D], short form 6D [SF-6D], or health utilities index [HUI]) from the same patient. PubMed and Tufts database of utilities. Sign test for paired comparisons between direct and indirect utilities; least squares regression to describe average relations between the different methods. Mean utility scores (or median if means unavailable) for each method, and differences in mean (median) scores between direct and indirect methods. We found 32 studies yielding 83 instances where direct and indirect methods could be compared for health states experienced by adults. The direct methods used were standard gamble in 57 cases and time tradeoff in 60 (34 used both); the indirect methods were EQ-5D (67 cases), SF-6D (13), HUI-2 (5), and HUI-3 (37). Mean utility values were 0.81 (standard gamble) and 0.77 (time tradeoff) for the direct methods; for the indirect methods: 0.59 (EQ-5D), 0.63 (SF-6D), 0.75 (HUI-2) and 0.68 (HUI-3). Direct methods of estimating utilities tend to result in higher health ratings than the more widely used indirect methods, and the difference can be substantial. Use of indirect methods could have important implications for decisions about resource allocation: for example, non-lifesaving treatments are relatively more favoured in comparison with lifesaving interventions than when using direct methods.

  17. Using qualitative methods to inform the trade-off between content validity and consistency in utility assessment: the example of type 2 diabetes and Alzheimer's Disease

    Directory of Open Access Journals (Sweden)

    Gargon Elizabeth

    2010-02-01

    Full Text Available Abstract Background Key stakeholders regard generic utility instruments as suitable tools to inform health technology assessment decision-making regarding allocation of resources across competing interventions. These instruments require a 'descriptor', a 'valuation' and a 'perspective' of the economic evaluation. There are various approaches that can be taken for each of these, offering a potential lack of consistency between instruments (a basic requirement for comparisons across diseases). The 'reference method' has been proposed as a way to address the limitations of the Quality-Adjusted Life Year (QALY). However, the degree to which generic measures can assess patients' specific experiences with their disease would remain unresolved. This issue has been neglected in discussions on methods development, and its impact on the QALY values obtained and the resulting cost per QALY estimates has been underestimated. This study explored the content of utility instruments relevant to type 2 diabetes and Alzheimer's disease (AD) as examples, and the role of qualitative research in informing the trade-off between content coverage and consistency. Method A literature review was performed to identify qualitative and quantitative studies regarding patients' experiences with type 2 diabetes or AD, and associated treatments. Conceptual models for each indication were developed. Generic and disease-specific instruments were mapped to the conceptual models. Results Findings showed that published descriptions of relevant concepts important to patients with type 2 diabetes or AD are available for consideration in deciding on the most comprehensive approach to utility assessment. While the 15-dimensional health-related quality of life measure (15D) seemed the most comprehensive measure for both diseases, the Health Utilities Index 3 (HUI 3) seemed to have the least coverage for type 2 diabetes and the EuroQol-5 Dimensions (EQ-5D) for AD. Furthermore, some of the utility instruments

  18. Reconciling ocean mass content change based on direct and inverse approaches by utilizing data from GRACE, altimetry and Swarm

    Science.gov (United States)

    Rietbroek, R.; Uebbing, B.; Lück, C.; Kusche, J.

    2017-12-01

    Ocean mass content (OMC) change due to the melting of the ice sheets in Greenland and Antarctica, melting of glaciers and changes in terrestrial hydrology is a major contributor to present-day sea level rise. Since 2002, the GRACE satellite mission has served as a valuable tool for directly measuring the variations in OMC. As GRACE has almost reached the end of its lifetime, efforts are being made to utilize the Swarm mission for the recovery of low-degree time-variable gravity fields, to bridge a possible gap until the GRACE-FO mission and to fill periods where GRACE data are missing. To this end we compute Swarm monthly normal equations and spherical harmonics that are found to be competitive with other solutions. In addition to directly measuring the OMC, combination of GRACE gravity data with altimetry data in a global inversion approach allows the total sea level change to be separated into individual mass-driven and steric contributions. However, published estimates of OMC from the direct and inverse methods differ not only with the time window considered but are also influenced by numerous post-processing choices. Here, we look into sources of such differences between the direct and inverse approaches and evaluate the capability of Swarm to derive OMC. Deriving time series of OMC requires several processing steps: choosing a GRACE (and altimetry) product, data coverage, masks and filters to be applied in either the spatial or the spectral domain, and corrections related to spatial leakage, GIA and geocenter motion. In this study, we compare and quantify the effects of the different processing choices of the direct and inverse methods. Our preliminary results point to the GIA correction as the major source of difference between the two approaches.

  19. A new approach for modeling the peak utility impacts from a proposed CUAC standard

    Energy Technology Data Exchange (ETDEWEB)

    LaCommare, Kristina Hamachi; Gumerman, Etan; Marnay, Chris; Chan, Peter; Coughlin, Katie

    2004-08-01

    This report describes a new Berkeley Lab approach for modeling the likely peak electricity load reductions from proposed energy efficiency programs in the National Energy Modeling System (NEMS). This method is presented in the context of the commercial unitary air conditioning (CUAC) energy efficiency standards. A previous report investigating the residential central air conditioning (RCAC) load shapes in NEMS revealed that the peak reduction results were lower than expected. This effect was believed to be due in part to the presence of the squelch, a program algorithm designed to ensure changes in the system load over time are consistent with the input historic trend. The squelch applies a system load-scaling factor that scales any differences between the end-use bottom-up and system loads to maintain consistency with historic trends. To obtain more accurate peak reduction estimates, a new approach for modeling the impact of peaky end uses in NEMS-BT has been developed. The new approach decrements the system load directly, reducing the impact of the squelch on the final results. This report also discusses a number of additional factors, in particular non-coincidence between end-use loads and system loads as represented within NEMS, and their impacts on the peak reductions calculated by NEMS. Using Berkeley Lab's new double-decrement approach reduces the conservation load factor (CLF) on an input load decrement from 25% down to 19% for a SEER 13 CUAC trial standard level, as seen in NEMS-BT output. About 4 GW more in peak capacity reduction results from this new approach as compared to Berkeley Lab's traditional end-use decrement approach, which relied solely on lowering end-use energy consumption. The new method has been fully implemented and tested in the Annual Energy Outlook 2003 (AEO2003) version of NEMS and will routinely be applied to future versions. This capability is now available for use in future end-use efficiency or other policy analysis.

  20. Total System Performance Assessment - License Application Methods and Approach

    International Nuclear Information System (INIS)

    McNeish, J.

    2003-01-01

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the "Yucca Mountain Review Plan" (YMRP), "Final Report" (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management of the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  1. Utilization integrated Fuzzy-QFD and TOPSIS approach in supplier selection

    Directory of Open Access Journals (Sweden)

    2016-02-01

    Full Text Available Supplier selection is a typical multi-attribute problem that involves both qualitative and quantitative factors. To deal with this problem, different techniques have been suggested. Because they are based on purely quantitative data, these techniques have significant drawbacks, especially when qualitative factors, which are very important in supplier selection and not easy to measure, must be considered. Some innovative approaches based on artificial intelligence techniques such as fuzzy logic match decision-making situations very well, especially when decision makers express heterogeneous judgments. In this research, by combining fuzzy logic and the House of Quality (HOQ), qualitative criteria are considered in the forward part of the car supplier selection process at Sazehgostar SAIPA Company. Then, the TOPSIS technique is adopted to consider quantitative metrics. Finally, by combining the fuzzy QFD and TOPSIS techniques, these suppliers are ranked and selected in this company. Attention to both qualitative and quantitative criteria is the important point of this research, and the methodology utilized constitutes its innovative aspect. The limited number of experts associated with each part and the unavailability of some quantitative criteria were limitations of this study.
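
    As an illustration of the quantitative step described above, the sketch below ranks candidate suppliers with a plain (non-fuzzy) TOPSIS procedure: normalise the decision matrix, weight it, measure each supplier's distance to the ideal and anti-ideal solutions, and score by relative closeness. The criteria, weights and scores are invented for illustration and are not taken from the SAIPA case.

```python
import numpy as np

# rows: candidate suppliers, columns: criteria (e.g. price, quality, delivery)
scores = np.array([[0.7, 0.9, 0.6],
                   [0.8, 0.6, 0.9],
                   [0.5, 0.8, 0.8]], dtype=float)
weights = np.array([0.5, 0.3, 0.2])      # criterion weights (sum to 1)
benefit = np.array([False, True, True])  # price is a cost criterion, the rest are benefits

# vector-normalise and weight the decision matrix
norm = scores / np.linalg.norm(scores, axis=0)
weighted = norm * weights

# ideal and anti-ideal solutions per criterion
ideal = np.where(benefit, weighted.max(axis=0), weighted.min(axis=0))
anti_ideal = np.where(benefit, weighted.min(axis=0), weighted.max(axis=0))

# relative closeness to the ideal solution (higher is better)
d_plus = np.linalg.norm(weighted - ideal, axis=1)
d_minus = np.linalg.norm(weighted - anti_ideal, axis=1)
closeness = d_minus / (d_plus + d_minus)
print("ranking (best first):", np.argsort(-closeness))
```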

  2. A unique in vivo approach for investigating antimicrobial materials utilizing fistulated animals

    Science.gov (United States)

    Berean, Kyle J.; Adetutu, Eric M.; Zhen Ou, Jian; Nour, Majid; Nguyen, Emily P.; Paull, David; McLeod, Jess; Ramanathan, Rajesh; Bansal, Vipul; Latham, Kay; Bishop-Hurley, Greg J.; McSweeney, Chris; Ball, Andrew S.; Kalantar-Zadeh, Kourosh

    2015-06-01

    Unique in vivo tests were conducted through the use of a fistulated ruminant, providing an ideal environment with a diverse and vibrant microbial community. Such a procedure can be especially valuable for investigating the performance of antimicrobial materials related to human and animal infections. In this pilot study, it is shown that the rumen of a fistulated animal provides an excellent live laboratory for assessing the properties of antimicrobial materials. We investigate microbial colonization of model nanocomposites based on silver (Ag) nanoparticles incorporated at different concentrations into polydimethylsiloxane (PDMS). With implantable devices posing a major risk for hospital-acquired infections, the present study provides a viable approach to understanding microbial colonization, with the potential to reduce the incidence of infection through the introduction of Ag nanoparticles at optimum concentrations. In vitro measurements were also conducted to show the validity of the approach. An optimal loading of 0.25 wt% Ag is found to show the greatest antimicrobial activity and is observed through the in vivo tests to reduce the microbial diversity colonizing the surface.

  3. Diagnostic utility of the cell block method versus the conventional smear study in pleural fluid cytology.

    Science.gov (United States)

    Shivakumarswamy, Udasimath; Arakeri, Surekha U; Karigowdar, Mahesh H; Yelikar, Br

    2012-01-01

    The cytological examination of serous effusions is well accepted, and a positive diagnosis is often considered a definitive diagnosis. It helps in staging, prognosis and management of patients with malignancies and also gives information about various inflammatory and non-inflammatory lesions. Diagnostic problems arise in everyday practice in differentiating reactive atypical mesothelial cells from malignant cells by the routine conventional smear (CS) method. To compare the morphological features of the CS method with those of the cell block (CB) method and also to assess the utility and sensitivity of the CB method in the cytodiagnosis of pleural effusions. The study was conducted in the cytology section of the Department of Pathology. Sixty pleural fluid samples were subjected to diagnostic evaluation over a period of 20 months. Along with the conventional smears, cell blocks were prepared by using 10% alcohol-formalin as a fixative agent. Statistical analysis with the 'z test' was performed to assess cellularity with the CS and CB methods. McNemar's χ² test was used to identify the additional yield for malignancy by the CB method. Cellularity and the additional yield for malignancy were 15% higher with the CB method. The CB method provides high cellularity, better architectural patterns and morphological features, and an additional yield of malignant cells, and thereby increases the sensitivity of the cytodiagnosis when compared with the CS method.

  4. Apparatus and method for materials processing utilizing a rotating magnetic field

    Science.gov (United States)

    Muralidharan, Govindarajan; Angelini, Joseph A.; Murphy, Bart L.; Wilgen, John B.

    2017-04-11

    An apparatus for materials processing utilizing a rotating magnetic field comprises a platform for supporting a specimen, and a plurality of magnets underlying the platform. The plurality of magnets are configured for rotation about an axis of rotation intersecting the platform. A heat source is disposed above the platform for heating the specimen during the rotation of the plurality of magnets. A method for materials processing utilizing a rotating magnetic field comprises providing a specimen on a platform overlying a plurality of magnets; rotating the plurality of magnets about an axis of rotation intersecting the platform, thereby applying a rotating magnetic field to the specimen; and, while rotating the plurality of magnets, heating the specimen to a desired temperature.

  5. Double-label autoradiographic deoxyglucose method for sequential measurement of regional cerebral glucose utilization

    Energy Technology Data Exchange (ETDEWEB)

    Redies, C; Diksic, M; Evans, A C; Gjedde, A; Yamamoto, Y L

    1987-08-01

    A new double-label autoradiographic glucose analog method for the sequential measurement of altered regional cerebral metabolic rates for glucose in the same animal is presented. This method is based on the sequential injection of two boluses of glucose tracer labeled with two different isotopes (short-lived ¹⁸F and long-lived ³H, respectively). An operational equation is derived which allows the determination of glucose utilization for the time period before the injection of the second tracer; this equation corrects for accumulation and loss of the first tracer from the metabolic pool occurring after the injection of the second tracer. An error analysis of this operational equation is performed. The double-label deoxyglucose method is validated in the primary somatosensory ("barrel") cortex of the anesthetized rat. Two different rows of whiskers were stimulated sequentially in each rat; the two periods of stimulation were each preceded by an injection of glucose tracer. After decapitation, dried brain slices were first exposed, in direct contact, to standard X-ray film and then to uncoated, "tritium-sensitive" film. Results show that the double-label deoxyglucose method proposed in this paper allows the quantification and complete separation of glucose utilization patterns elicited by two different stimulations sequentially applied in the same animal.

  6. Algebraic Verification Method for SEREs Properties via Groebner Bases Approaches

    Directory of Open Access Journals (Sweden)

    Ning Zhou

    2013-01-01

    Full Text Available This work presents an efficient solution using a computer algebra system to perform linear temporal property verification for synchronous digital systems. The method is essentially based on both Groebner bases approaches and symbolic simulation. A mechanism for constructing canonical polynomial-set-based symbolic representations for both circuit descriptions and assertions is studied. We then present a complete checking algorithm framework based on these algebraic representations by using Groebner bases. The computational experience reported in this work shows that the algebraic approach is a quite competitive checking method and a useful supplement to existing verification methods based on simulation.
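
    To make the core idea concrete, the sketch below checks a property of a tiny circuit by polynomial reduction: the gate equations and Boolean constraints generate an ideal, and an assertion holds exactly when its polynomial reduces to zero modulo a Groebner basis of that ideal. This is a minimal illustration using SymPy, not the verification framework of the paper.

```python
# A minimal sketch of Groebner-basis-based property checking: encode gate
# equations and an assertion as polynomials and test ideal membership.
from sympy import symbols, groebner, reduced

x, y, s, c = symbols('x y s c')

# Gate polynomials for a half adder over {0, 1}: s = x XOR y, c = x AND y,
# plus field polynomials forcing Boolean values.
gates = [s - (x + y - 2*x*y),
         c - x*y,
         x**2 - x,
         y**2 - y]

# Assertion to verify: s + 2*c == x + y for all Boolean inputs, i.e. the
# assertion polynomial lies in the ideal generated by the gate polynomials.
assertion = s + 2*c - (x + y)

G = groebner(gates, x, y, s, c, order='lex')
_, remainder = reduced(assertion, list(G), x, y, s, c, order='lex')
print("assertion holds:", remainder == 0)  # expected: True
```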

  7. Peak Detection Method Evaluation for Ion Mobility Spectrometry by Using Machine Learning Approaches

    DEFF Research Database (Denmark)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna

    2013-01-01

    machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compare the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods
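
    Of the four approaches compared, local maxima search is the simplest to illustrate. The sketch below applies a generic local-maxima peak picker to a synthetic one-dimensional trace; it stands in for the idea only and is not the IPHEx, VisualNow or PME implementation evaluated in the study.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
# synthetic trace: two Gaussian peaks plus noise (illustrative only)
trace = (np.exp(-((t - 3.0) / 0.15) ** 2)
         + 0.6 * np.exp(-((t - 7.0) / 0.2) ** 2)
         + 0.05 * rng.normal(size=t.size))

# keep maxima that stand out from the baseline and are reasonably separated
peaks, _ = find_peaks(trace, height=0.2, prominence=0.15, distance=50)
for i in peaks:
    print(f"peak at t={t[i]:.2f}, intensity={trace[i]:.2f}")
```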

  8. Utility of a Systematic Approach to Teaching Photographic Nasal Analysis to Otolaryngology Residents.

    Science.gov (United States)

    Robitschek, Jon; Dresner, Harley; Hilger, Peter

    2017-12-01

    Photographic nasal analysis constitutes a critical step along the path toward accurate diagnosis and precise surgical planning in rhinoplasty. The learned process by which one assesses photographs, analyzes relevant anatomical landmarks, and generates a global view of the nasal aesthetic is less widely described. To discern the common pitfalls in performing photographic nasal analysis and to quantify the utility of a systematic approach model in teaching photographic nasal analysis to otolaryngology residents. This prospective observational study included 20 participants from a university-based otolaryngology residency program. The control and intervention groups underwent baseline graded assessment of 3 patients. The intervention group received instruction on a systematic approach model for nasal analysis, and both groups underwent postintervention testing at 10 weeks. Data were collected from October 1, 2015, through June 1, 2016. A 10-minute, 11-slide presentation provided instruction on a systematic approach to nasal analysis to the intervention group. Graded photographic nasal analysis using a binary 18-point system. The 20 otolaryngology residents (15 men and 5 women; age range, 24-34 years) were adept at mentioning dorsal deviation and dorsal profile with focused descriptions of tip angle and contour. Areas commonly omitted by residents included verification of the Frankfort plane, position of the lower lateral crura, radix position, and ratio of the ala to tip lobule. The intervention group demonstrated immediate improvement after instruction on the teaching model, with the mean (SD) postintervention test score doubling compared with their baseline performance (7.5 [2.7] vs 10.3 [2.5]; P Otolaryngology residents demonstrated proficiency at incorporating nasal deviation, tip angle, and dorsal profile contour into their nasal analysis. They often omitted verification of the Frankfort plane, position of lower lateral crura, radix depth, and ala-to-tip lobule

  9. A Method to Predict Compressor Stall in the TF34-100 Turbofan Engine Utilizing Real-Time Performance Data

    Science.gov (United States)

    2015-06-01

    Thesis presented to the Faculty, Department of Systems Engineering, by Shuxiang ‘Albert’ Li, BS.

  10. Magnetic exchange couplings from constrained density functional theory: an efficient approach utilizing analytic derivatives.

    Science.gov (United States)

    Phillips, Jordan J; Peralta, Juan E

    2011-11-14

    We introduce a method for evaluating magnetic exchange couplings based on the constrained density functional theory (C-DFT) approach of Rudra, Wu, and Van Voorhis [J. Chem. Phys. 124, 024103 (2006)]. Our method shares the same physical principles as C-DFT but makes use of the fact that the electronic energy changes quadratically and bilinearly with respect to the constraints in the range of interest. This allows us to use coupled perturbed Kohn-Sham spin density functional theory to determine approximately the corrections to the energy of the different spin configurations and construct a priori the relevant energy-landscapes obtained by constrained spin density functional theory. We assess this methodology in a set of binuclear transition-metal complexes and show that it reproduces very closely the results of C-DFT. This demonstrates a proof-of-concept for this method as a potential tool for studying a number of other molecular phenomena. Additionally, routes to improving upon the limitations of this method are discussed. © 2011 American Institute of Physics

  11. The Temporal Effect of Training Utility Perceptions on Adopting a Trained Method: The Role of Perceived Organizational Support

    Science.gov (United States)

    Madera, Juan M.; Steele, Stacey T.; Beier, Margaret

    2011-01-01

    The current study examined the temporal effect of perceived training utility on adoption of a trained method and how perceived organizational support influences the relationship between perceived training utility perceptions and adoption of a trained method. With the use of a correlational-survey-based design, this longitudinal study required…

  12. Impact of Anterior vs Posterior Approach for Total Hip Arthroplasty on Post-Acute Care Service Utilization.

    Science.gov (United States)

    L'Hommedieu, Coles E; Gera, James J; Rupp, Gerald; Salin, Jeffery W; Cox, John S; Duwelius, Paul J

    2016-09-01

    Controversy exists as to which surgical approach is best for total hip arthroplasty (THA). Previous studies suggested that the tissue-sparing anterior approach should result in a more rapid recovery requiring fewer postacute services, ultimately decreasing overall episodic cost. The purpose of this cross-sectional study was to determine if any significant differences exist between the anterior vs posterior approaches on postacute care service utilization, readmissions, or episodic cost. Claims data from 26,773 Medicare fee-for-service beneficiaries receiving elective THAs (Medical Severity-Diagnosis Related Groups (MS-DRGs) 469/470) were analyzed. Claims data were collected from the 2-year period, January 2013 through December 2014. The posterior surgical approach was performed on 23,653 patients while 3120 patients received the anterior approach. Data analysis showed negligible effect sizes in postacute care service utilization, readmission rate, and cost between the surgical approaches for elective THA (MS-DRG 469 and 470). Average THA total episode cost was negligibly higher for procedures using the anterior approach compared to the posterior approach ($22,517 and $22,068, respectively). Statistically significant differences were observed in inpatient rehab and home health cost and service utilization. However, the effect sizes of these comparisons are negligible when accounting for the large sample size. All other comparisons showed minimal and statistically insignificant variation. The results indicate that surgical approach alone is not the primary driver of postacute care service utilization, quality outcomes, or cost. Other factors such as physician-led patient-focused care pathways, care coordination, rapid rehabilitation protocols, perioperative pain management protocols, and patient education are integral for effective patient care. Copyright © 2016 Elsevier Inc. All rights reserved.

  13. An informatics approach to assess pediatric pharmacotherapy: design and implementation of a hospital drug utilization system.

    Science.gov (United States)

    Zuppa, Athena; Vijayakumar, Sundararajan; Jayaraman, Bhuvana; Patel, Dimple; Narayan, Mahesh; Vijayakumar, Kalpana; Mondick, John T; Barrett, Jeffrey S

    2007-09-01

    Drug utilization in the inpatient setting can provide a mechanism to assess drug prescribing trends, efficiency, and cost-effectiveness of hospital formularies and examine subpopulations for which prescribing habits may be different. Such data can be used to correlate trends with time-dependent or seasonal changes in clinical event rates or the introduction of new pharmaceuticals. It is now possible to provide a robust, dynamic analysis of drug utilization in a large pediatric inpatient setting through the creation of a Web-based hospital drug utilization system that retrieves source data from our accounting database. The production implementation provides a dynamic and historical account of drug utilization at the authors' institution. The existing application can easily be extended to accommodate a multi-institution environment. The creation of a national or even global drug utilization network would facilitate the examination of geographical and/or socioeconomic influences in drug utilization and prescribing practices in general.
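
    A toy sketch of the kind of aggregation such a system automates is shown below: monthly utilization per drug derived from dispensing records. The column names and values are hypothetical and do not reflect the institution's actual schema or data.

```python
import pandas as pd

# hypothetical dispensing records (drug_name, dispense_date, quantity)
records = pd.DataFrame({
    "drug_name": ["morphine", "morphine", "cefazolin", "cefazolin"],
    "dispense_date": pd.to_datetime(
        ["2007-01-03", "2007-02-11", "2007-01-20", "2007-02-25"]),
    "quantity": [10, 12, 30, 28],
})

# monthly utilization per drug: rows are drugs, columns are months
monthly = (records
           .assign(month=records["dispense_date"].dt.to_period("M"))
           .groupby(["drug_name", "month"])["quantity"]
           .sum()
           .unstack(fill_value=0))
print(monthly)
```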

  14. A Pattern-Oriented Approach to a Methodical Evaluation of Modeling Methods

    Directory of Open Access Journals (Sweden)

    Michael Amberg

    1996-11-01

    Full Text Available The paper describes a pattern-oriented approach to evaluate modeling methods and to compare various methods with each other from a methodical viewpoint. A specific set of principles (the patterns) is defined by investigating the notations and the documentation of comparable modeling methods. Each principle helps to examine some parts of the methods from a specific point of view. All principles together lead to an overall picture of the method under examination. First, the core ("method-neutral") meaning of each principle is described. Then the methods are examined regarding the principle. Afterwards the method-specific interpretations are compared with each other and with the core meaning of the principle. By this procedure, the strengths and weaknesses of modeling methods regarding methodical aspects are identified. The principles are described uniformly using a principle description template according to descriptions of object-oriented design patterns. The approach is demonstrated by evaluating a business process modeling method.

  15. Preliminary assessment of a method utilizing carbon dioxide and steelmaking slags to produce precipitated calcium carbonate

    International Nuclear Information System (INIS)

    Eloneva, Sanni; Said, Arshe; Fogelholm, Carl-Johan; Zevenhoven, Ron

    2012-01-01

    Highlights: ► An NH4-salt-based method utilizes CO2 and steelmaking slags to produce pure CaCO3. ► It was determined if its economic potential warrants moving forward. ► Despite small solvent losses, the method was found to have economical potential. ► The method has significant CO2 emissions reduction potential. ► Scaling up the reactor will allow for a more detailed design for the process. -- Abstract: One of the options that can contribute to the reduction of carbon dioxide emissions for climate change mitigation is the so-called CO2 sequestration by mineral carbonation, or CO2 mineral sequestration. Steel manufacturing could benefit from this option by utilizing its own by-products, i.e. steelmaking slags to combine with CO2. We have recently studied a method, where aqueous solution of ammonium salt (e.g. ammonium acetate, ammonium nitrate and ammonium chloride) is used to extract calcium selectively from the steel converter slag, followed by precipitation of pure calcium carbonate by bubbling CO2 through the produced solution. The ammonium salt solution is recovered and re-used. The purpose of this research was to determine if the economic potential of the method warrants moving forward to large-scale application. Despite the small solvent losses, the method was found to have economical potential. In addition, it has significant CO2 emission reduction potential as well. Scaling up the reactor from the small laboratory scale will allow more detailed design for the process to be made followed by a full economical evaluation including all of the important operational and capital investment costs.

  16. Approaches and methods for econometric analysis of market power

    DEFF Research Database (Denmark)

    Perekhozhuk, Oleksandr; Glauben, Thomas; Grings, Michael

    2017-01-01

    This study discusses two widely used approaches in the New Empirical Industrial Organization (NEIO) literature and examines the strengths and weaknesses of the Production-Theoretic Approach (PTA) and the General Identification Method (GIM) for the econometric analysis of market power in agricultural and food markets. We provide a framework that may help researchers to evaluate and improve structural models of market power. Starting with the specification of the approaches in question, we compare published empirical studies of market power with respect to the choice of the applied approach, functional forms, estimation methods and derived estimates of the degree of market power. Thereafter, we use our framework to evaluate several structural models based on PTA and GIM to measure oligopsony power in the Ukrainian dairy industry. The PTA-based results suggest that the estimated parameters

  17. A maximum information utilization approach in X-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Papp, T.; Maxwell, J.A.; Papp, A.T.

    2009-01-01

    X-ray fluorescence data bases have significant contradictions and inconsistencies. We have identified that the main source of the contradictions, after the human factors, is rooted in the signal processing approaches. We have developed signal processors that overcome many of the problems by maximizing the information available to the analyst. These non-paralyzable, fully digital signal processors have yielded improved resolution, line shape, tailing and pile-up recognition. The signal processors account for and register all events, sorting them into two spectra: one spectrum for the desirable or accepted events, and one spectrum for the rejected events. The information contained in the rejected spectrum is mandatory to have control over the measurement and to make a proper accounting and allocation of the events. It has established the basis for the application of the fundamental parameter method approach. A fundamental parameter program was also developed. The primary X-ray line shape (Lorentzian) is convoluted with a system line shape (Gaussian) and corrected for the sample material absorption, X-ray absorbers and detector efficiency. The peaks can also have lower- and upper-energy-side tailing, including the physical-interaction-based long-range functions. The program also models peak and continuum pile-up and can handle layered samples of up to five layers. The application of a fundamental parameter method demands proper equipment characterization. We have also developed an inverse fundamental parameter method software package for equipment characterisation. The program calculates the excitation function at the sample position and the detector efficiency, supplying an internally consistent system.
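
    The line-shape model described above, a Lorentzian primary line convolved with a Gaussian system response, is the familiar Voigt profile. The sketch below evaluates it with SciPy; the widths are illustrative values, not characterization data from the instrument discussed.

```python
import numpy as np
from scipy.special import voigt_profile

energy = np.linspace(-200.0, 200.0, 2001)  # eV, relative to the line centre
sigma = 60.0   # Gaussian standard deviation (system/detector response), eV
gamma = 5.0    # Lorentzian half-width at half-maximum (natural line width), eV

line = voigt_profile(energy, sigma, gamma)  # normalised to unit area
step = energy[1] - energy[0]
print("peak height:", line.max())
print("area (should be close to 1):", line.sum() * step)
```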

  18. Systematic methods and tools for design of sustainable chemical processes for CO2 utilization

    DEFF Research Database (Denmark)

    Kongpanna, Pichayapan; Babi, Deenesh K.; Pavarajarn, Varong

    2016-01-01

    A systematic computer-aided framework for sustainable process design is presented together with its application to the synthesis and generation of processing networks for dimethyl carbonate (DMC) production with CO2 utilization. The framework integrated with various methods, tools, algorithms......-stage involves selection and analysis of the identified networks as a base case design in terms of operational feasibility, economics, life cycle assessment factors and sustainability measures, which are employed to establish targets for improvement in the next-stage. The innovation-stage involves generation...

  19. New approach to equipment quality evaluation method with distinct functions

    Directory of Open Access Journals (Sweden)

    Milisavljević Vladimir M.

    2016-01-01

    Full Text Available The paper presents a new approach for improving a method for quality evaluation and selection of equipment (devices and machinery) by applying distinct functions. Quality evaluation and selection of devices and machinery is a multi-criteria problem which involves the consideration of numerous parameters of various origins. The original selection method with distinct functions is based on technical parameters with arbitrary evaluation of the importance of each parameter (weighting). The improvement of this method, presented in this paper, addresses the issue of parameter weighting by using the Delphi method. Finally, two case studies are provided, which include quality evaluation of standard boilers for heating and evaluation of load-haul-dump (LHD) machines, to demonstrate the applicability of this approach. The Analytical Hierarchical Process (AHP) is used as a control method.

  20. Computational model of precision grip in Parkinson’s disease: A Utility based approach

    Directory of Open Access Journals (Sweden)

    Ankur eGupta

    2013-12-01

    Full Text Available We propose a computational model of Precision Grip (PG) performance in normal subjects and Parkinson's Disease (PD) patients. Prior studies on grip force generation in PD patients show an increase in grip force during ON medication and an increase in the variability of the grip force during OFF medication (Fellows et al. 1998; Ingvarsson et al. 1997). Changes in grip force generation in dopamine-deficient PD conditions strongly suggest contribution of the Basal Ganglia, a deep brain system having a crucial role in translating dopamine signals to decision making. The present approach is to treat the problem of modeling grip force generation as a problem of action selection, which is one of the key functions of the Basal Ganglia. The model consists of two components: (1) the sensory-motor loop component, and (2) the Basal Ganglia component. The sensory-motor loop component converts a reference position and a reference grip force into lift force and grip force profiles, respectively. These two forces cooperate in grip-lifting a load. The sensory-motor loop component also includes a plant model that represents the interaction between the two fingers involved in PG and the object to be lifted. The Basal Ganglia component is modeled using Reinforcement Learning, with the significant difference that action selection is performed using a utility distribution instead of a purely value-based distribution, thereby incorporating risk-based decision making. The proposed model is able to account for the precision grip results from normal subjects and PD patients accurately (Fellows et al. 1998; Ingvarsson et al. 1997). To our knowledge the model is the first model of precision grip in PD conditions.
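
    A generic sketch of risk-sensitive, utility-based action selection of the kind alluded to above is given below: each action's utility trades expected reward against its variance, and the action is drawn from a softmax over utilities. The penalty form, the parameter values and the candidate actions are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def select_action(mean_reward, reward_variance, risk_aversion=0.5, beta=5.0,
                  rng=np.random.default_rng()):
    """Pick an action index from risk-sensitive utilities."""
    utility = mean_reward - risk_aversion * reward_variance  # penalise risky actions
    p = np.exp(beta * (utility - utility.max()))             # numerically stable softmax
    p /= p.sum()
    return rng.choice(len(utility), p=p)

# three candidate grip-force levels with estimated reward statistics (illustrative)
means = np.array([0.8, 1.0, 0.9])
variances = np.array([0.05, 0.40, 0.10])
print(select_action(means, variances))
```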

  1. A multiparameter chaos control method based on OGY approach

    International Nuclear Information System (INIS)

    Souza de Paula, Aline; Amorim Savi, Marcelo

    2009-01-01

    Chaos control is based on the richness of responses of chaotic behavior and may be understood as the use of tiny perturbations for the stabilization of an unstable periodic orbit (UPO) embedded in a chaotic attractor. Since one of these UPOs can provide better performance than others in a particular situation, the use of chaos control can make this kind of behavior desirable in a variety of applications. The OGY method is a discrete technique that considers small perturbations promoted in the neighborhood of the desired orbit when the trajectory crosses a specific surface, such as a Poincare section. This contribution proposes a multiparameter semi-continuous method based on the OGY approach in order to control chaotic behavior. Two different approaches are possible with this method: a coupled approach, where all control parameters influence the system dynamics even when they are not active; and an uncoupled approach, a particular case where control parameters return to the reference value when they become passive parameters. As an application of the general formulation, a two-parameter actuation for the control of a nonlinear pendulum is investigated, employing the coupled and uncoupled approaches. Analyses are carried out considering signals that are generated by numerical integration of the mathematical model using experimentally identified parameters. Results show that the procedure can be a good alternative for chaos control since it provides a more effective UPO stabilization than the classical single-parameter approach.
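
    The sketch below illustrates the flavour of OGY-type control with more than one parameter: near the target orbit the Poincare map is linearised, and at each crossing a small vector of parameter perturbations is chosen to push the next crossing back toward the orbit. The linearisation matrices, fixed point and perturbation limit are invented example values, and the least-squares choice of perturbations is a simplification rather than the paper's semi-continuous scheme.

```python
import numpy as np

A = np.array([[1.8, 0.4],      # Jacobian of the Poincare map at the fixed point
              [0.3, 0.5]])
B = np.array([[0.9, 0.1],      # sensitivity of the map to the two control parameters
              [0.2, 0.7]])
x_star = np.array([0.0, 0.0])  # fixed point (the UPO's crossing of the section)
dp_max = 0.05                  # only tiny parameter perturbations are allowed

def control_step(x_n):
    """Return the parameter perturbations applied at this Poincare crossing."""
    dx = x_n - x_star
    dp, *_ = np.linalg.lstsq(B, -A @ dx, rcond=None)  # cancel the predicted drift
    return np.clip(dp, -dp_max, dp_max)               # keep perturbations small

print(control_step(np.array([0.02, -0.01])))
```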

  2. Evaluation of the efficiency and utility of recombinant enzyme-free seamless DNA cloning methods

    Directory of Open Access Journals (Sweden)

    Ken Motohashi

    2017-03-01

    Full Text Available Simple and low-cost recombinant enzyme-free seamless DNA cloning methods have recently become available. In vivo Escherichia coli cloning (iVEC) can directly transform a mixture of insert and vector DNA fragments into E. coli, which are ligated by endogenous homologous recombination activity in the cells. Seamless ligation cloning extract (SLiCE) cloning uses the endogenous recombination activity of E. coli cellular extracts in vitro to ligate insert and vector DNA fragments. An evaluation of the efficiency and utility of these methods is important in deciding whether to adopt a seamless cloning method as a useful tool. In this study, both seamless cloning methods incorporated insert DNA fragments into linearized DNA vectors through short (15–39 bp) end homology regions. However, colony formation was 30–60-fold higher with SLiCE cloning for end homology regions between 15 and 29 bp than with the iVEC method using DH5α competent cells. E. coli AQ3625 strains, which harbor an sbcA gene mutation that activates the RecE homologous recombination pathway, can be used to efficiently ligate insert and vector DNA fragments with short end homology regions in vivo. Using AQ3625 competent cells in the iVEC method improved the rate of colony formation, but the efficiency and accuracy of SLiCE cloning were still higher. In addition, the efficiency of seamless cloning methods depends on the intrinsic competency of E. coli cells. The competency of chemically competent AQ3625 cells was lower than that of competent DH5α cells in all cases of chemically competent cell preparations using the three different methods. Moreover, SLiCE cloning permits the use of both homemade and commercially available competent cells because it can use general E. coli recA− strains such as DH5α as host cells for transformation. Therefore, between the two methods, SLiCE cloning provides both higher efficiency and better utility than the iVEC method for seamless DNA plasmid

  3. Evaluation methodology based on physical security assessment results: a utility theory approach

    International Nuclear Information System (INIS)

    Bennett, H.A.; Olascoaga, M.T.

    1978-03-01

    This report describes an evaluation methodology which aggregates physical security assessment results for nuclear facilities into an overall measure of adequacy. This methodology utilizes utility theory and conforms to a hierarchical structure developed by the NRC. Implementation of the methodology is illustrated by several examples. Recommendations for improvements in the evaluation process are given.
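
    A minimal sketch of the aggregation idea follows: leaf-level assessment results are expressed as utilities and rolled up through a weighted hierarchy into a single adequacy measure. The hierarchy, weights and scores shown are hypothetical and are not the NRC structure referenced in the report.

```python
def aggregate(node):
    """Return the utility of a node: a leaf score, or the weighted sum of its children."""
    if "utility" in node:
        return node["utility"]
    return sum(w * aggregate(child) for w, child in node["children"])

# illustrative hierarchy: two branches, with the first split into two sub-areas
assessment = {
    "children": [
        (0.5, {"children": [(0.6, {"utility": 0.9}),    # detection subsystem
                            (0.4, {"utility": 0.7})]}), # delay subsystem
        (0.5, {"utility": 0.8}),                        # response capability
    ]
}
print("overall adequacy:", aggregate(assessment))  # 0.5*0.82 + 0.5*0.8 = 0.81
```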

  4. Vector and parallel computing on the IBM ES/3090, a powerful approach to solving problems in the utility industry

    International Nuclear Information System (INIS)

    Bellucci, V.J.

    1990-01-01

    This paper describes IBM's approach to parallel computing using the IBM ES/3090 computer. Parallel processing concepts were discussed, including their advantages, potential performance improvements and limitations. Particular applications and capabilities of the IBM ES/3090 were presented, along with preliminary results from some utilities in the application of parallel processing to the simulation of system reliability, air pollution models, and power network dynamics.

  5. Instructional design in mathematics for undergraduate students based on learning by mistakes approach utilizing scilab assistance

    Science.gov (United States)

    Kartika, H.

    2018-03-01

    Issues related to making mistakes while learning, such as negative emotion, arise when students learn mathematics with the aid of a computer. When the computer output showed a mistake message, the students considered it a malfunction of the software. Based on this issue, the writer designs an instructional model based on the learning-by-mistakes approach, assisted by Scilab. The method used in this research is design research, involving undergraduate students in matrix algebra courses. The data were collected through a questionnaire survey to gain feedback about the approach implemented, and were analyzed using descriptive quantitative methods. In the proposed instructional design, the student acts as a mistake corrector while the teacher acts as a mistake maker. The teacher deliberately makes mistakes with the help of the Scilab software. On the other hand, students correct, analyze and explain the errors produced by the Scilab software. The result of this research is an ICT-based instructional design which is expected to be applicable as an alternative way of learning that directs students to think positively about mistakes in learning. Furthermore, students are also expected to improve their ability to understand and think critically while solving problems, and to improve themselves in learning mathematics.

  6. Utilizing a Photo-Analysis Software for Content Identifying Method (CIM

    Directory of Open Access Journals (Sweden)

    Nejad Nasim Sahraei

    2015-01-01

    Full Text Available Content Identifying Methodology (CIM) was developed to measure public preferences in order to reveal the common characteristics of landscapes and aspects of underlying perceptions, including the individual's reactions to content and spatial configuration; therefore, it can assist with the identification of factors that influence preference. Regarding the analysis of landscape photographs through CIM, there are several studies utilizing image analysis software, such as Adobe Photoshop, in order to identify the physical contents of the scenes. This study attempts to evaluate the public's preferences for aesthetic qualities of pedestrian bridges in urban areas through a photo-questionnaire survey, in which respondents evaluated images of pedestrian bridges in urban areas. Two groups of images were evaluated: the most and least preferred scenes, i.e. those with the highest and lowest mean scores respectively. These two groups were analyzed by CIM and also evaluated based on the respondents' descriptions of each group to reveal the pattern of preferences and the factors that may affect them. Digimizer software was employed to triangulate the two approaches and to determine the role of these factors in people's preferences. This study attempts to introduce useful software for image analysis which can measure the physical contents and also their spatial organization in the scenes. According to the findings, Digimizer could be a useful tool in CIM approaches for preference studies that utilize photographs in place of the actual landscape in order to determine the most important factors in public preferences for pedestrian bridges in urban areas.

  7. Utilization of net energy analysis as a method of evaluating energy systems

    International Nuclear Information System (INIS)

    Lee, Gi Won; Cho, Joo Hyun; Hah, Yung Joon

    1994-01-01

    The upturn of the Korean nuclear power program started in the early 1970s, while plans for the construction of new nuclear power plants have virtually come to a halt in the United States since the late 1970s. It is projected that the U.S. power plant mix might shift from a combination of nuclear and coal-fired types to all coal-fired plants, considering the current U.S. trend in the construction of new plants. However, with the depletion of natural resources, it may be desirable to understand the utilization of the two competing utility technologies in terms of invested energy. Presented in this paper is a method of comparing two energy systems in terms of energy investment, together with a brief result from an energy economic analysis of a nuclear power plant and a coal-fired steam power plant to illustrate the methodology. The method of comparison is Net Energy Analysis (NEA). In doing so, Input-Output Analysis (IOA) among industries and commodities is performed. Using this information, net energy ratios are calculated and compared. Although NEA does not offer a conclusive solution, it can be used as a screening process in decision making.
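
    The input-output step lends itself to a compact illustration: total (direct plus indirect) sector outputs follow from the Leontief inverse, the embodied energy is obtained from sector energy intensities, and the net energy ratio compares lifetime energy delivered with energy invested. The two-sector technology matrix and all figures below are invented for illustration.

```python
import numpy as np

A = np.array([[0.10, 0.30],   # energy-sector inputs per unit output of each sector
              [0.05, 0.20]])  # construction/materials inputs per unit output
leontief_inverse = np.linalg.inv(np.eye(2) - A)

final_demand = np.array([0.0, 1.0])          # one unit of plant construction
total_output = leontief_inverse @ final_demand
energy_intensity = np.array([1.0, 0.2])      # energy content per unit output of each sector
energy_invested = energy_intensity @ total_output

lifetime_energy_output = 5.0                 # delivered energy over plant life (same units)
print("net energy ratio:", lifetime_energy_output / energy_invested)
```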

  8. The development of uranium foil farication technology utilizing twin roll method for Mo-99 irradiation target

    CERN Document Server

    Kim, C K; Park, H D

    2002-01-01

    MDS Nordion in Canada, accounting for about 75% of the global supply of the Mo-99 isotope, has provided the irradiation targets for Mo-99 using rod-type UAlx alloys with HEU (High Enrichment Uranium). ANL (Argonne National Laboratory), through co-operation with BATAN in Indonesia, leading the RERTR (Reduced Enrichment for Research and Test Reactors) program for nuclear non-proliferation, has designed and fabricated annular cylindrical uranium targets and successfully performed irradiation tests, in order to develop the fabrication technology of fission Mo-99 using LEU (Low Enrichment Uranium). As uranium foils could be fabricated by the hot-rolling method only at laboratory scale, not at commercial scale, due to significant problems in foil quality, productivity and economic efficiency, attention has shifted to the development of a new technology. Under these circumstances, the invention of a uranium foil fabrication technology utilizing the twin-roll casting method at KAERI is found to be able to fabricate LEU or...

  9. Method and apparatus for generating and utilizing a compound plasma configuration

    International Nuclear Information System (INIS)

    Koloc, P.M.

    1977-01-01

    A method and apparatus for generating and utilizing a compound plasma configuration is disclosed. The plasma configuration includes a central toroidal plasma with electrical currents surrounded by a generally ellipsoidal mantle of ionized particles or electrically conducting matter. The preferred methods of forming this compound plasma configuration include the steps of forming a helical ionized path in a gaseous medium and simultaneously discharging a high potential through the ionized path to produce a helical or heliform current which collapses on itself to produce a toroidal current, or generating a toroidal plasmoid, supplying magnetic energy to the plasmoid, and applying fluid pressure external to the plasmoid. The apparatus of the present invention includes a pressure chamber wherein the compound plasma configuration can be isolated or compressed by fluid or other forms of mechanical or magnetic pressure. 47 claims, 10 figures

  10. Comparing the Medicaid Retrospective Drug Utilization Review Program Cost-Savings Methods Used by State Agencies.

    Science.gov (United States)

    Prada, Sergio I

    2017-12-01

    The Medicaid Drug Utilization Review (DUR) program is a 2-phase process conducted by Medicaid state agencies. The first phase is a prospective DUR and involves electronically monitoring prescription drug claims to identify prescription-related problems, such as therapeutic duplication, contraindications, incorrect dosage, or duration of treatment. The second phase is a retrospective DUR and involves ongoing and periodic examinations of claims data to identify patterns of fraud, abuse, underutilization, drug-drug interaction, or medically unnecessary care, implementing corrective actions when needed. The Centers for Medicare & Medicaid Services requires each state to measure prescription drug cost-savings generated from its DUR programs on an annual basis, but it provides no guidance or unified methodology for doing so. To describe and synthesize the methodologies used by states to measure cost-savings using their Medicaid retrospective DUR program in federal fiscal years 2014 and 2015. For each state, the cost-savings methodologies included in the Medicaid DUR 2014 and 2015 reports were downloaded from Medicaid's website. The reports were then reviewed and synthesized. Methods described by the states were classified according to research designs often described in evaluation textbooks. In 2014, the most often used prescription drug cost-savings estimation methodology for the Medicaid retrospective DUR program was a simple pre-post intervention method, without a comparison group (ie, 12 states). In 2015, the most common methodology used was a pre-post intervention method, with a comparison group (ie, 14 states). Comparisons of savings attributed to the program among states are still unreliable, because of the lack of a common methodology for measuring cost-savings. There is great variation among states in the methods used to measure prescription drug utilization cost-savings. This analysis suggests that there is still room for improvement in terms of
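
    The "pre-post with a comparison group" design that most states reported in 2015 is essentially a difference-in-differences calculation, sketched below with invented spending figures.

```python
# mean drug spending per member, before and after a retrospective DUR intervention
intervention_pre, intervention_post = 120.0, 105.0   # group receiving DUR interventions
comparison_pre, comparison_post = 118.0, 116.0       # similar group, no intervention

change_intervention = intervention_post - intervention_pre   # -15.0
change_comparison = comparison_post - comparison_pre         # -2.0

# savings attributable to the program, net of the secular trend
did_estimate = change_intervention - change_comparison
print("estimated savings per member:", -did_estimate)  # 13.0
```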

  11. The utility of QSARs in predicting acute fish toxicity of pesticide metabolites: A retrospective validation approach.

    Science.gov (United States)

    Burden, Natalie; Maynard, Samuel K; Weltje, Lennart; Wheeler, James R

    2016-10-01

    The European Plant Protection Products Regulation 1107/2009 requires that registrants establish whether pesticide metabolites pose a risk to the environment. Fish acute toxicity assessments may be carried out to this end. Considering the total number of pesticide (re-) registrations, the number of metabolites can be considerable, and therefore this testing could use many vertebrates. EFSA's recent "Guidance on tiered risk assessment for plant protection products for aquatic organisms in edge-of-field surface waters" outlines opportunities to apply non-testing methods, such as Quantitative Structure Activity Relationship (QSAR) models. However, a scientific evidence base is necessary to support the use of QSARs in predicting acute fish toxicity of pesticide metabolites. Widespread application and subsequent regulatory acceptance of such an approach would reduce the numbers of animals used. The work presented here intends to provide this evidence base by means of retrospective data analysis. Experimental fish LC50 values for 150 metabolites were extracted from the Pesticide Properties Database (http://sitem.herts.ac.uk/aeru/ppdb/en/atoz.htm). QSAR calculations were performed to predict fish acute toxicity values for these metabolites using the US EPA's ECOSAR software. The most conservative predicted LC50 values generated by ECOSAR were compared with experimental LC50 values. There was a significant correlation between predicted and experimental fish LC50 values (Spearman rs = 0.6304, p < 0.0001). For 62% of metabolites assessed, the QSAR predicted values are equal to or lower than their respective experimental values. Refined analysis, taking into account data quality and experimental variation considerations, increases the proportion of sufficiently predictive estimates to 91%. For eight of the nine outliers, there are plausible explanation(s) for the disparity between measured and predicted LC50 values. Following detailed consideration of the robustness of
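
    The central comparison in the study, rank correlation between predicted and experimental LC50 values together with the fraction of conservative (equal or lower) predictions, can be sketched as below. The six value pairs are invented; the study itself used 150 metabolites.

```python
import numpy as np
from scipy.stats import spearmanr

experimental_lc50 = np.array([0.8, 3.2, 12.0, 45.0, 110.0, 520.0])  # mg/L
predicted_lc50 = np.array([1.1, 2.5, 20.0, 30.0, 150.0, 400.0])     # mg/L (QSAR-style)

rho, p_value = spearmanr(predicted_lc50, experimental_lc50)
conservative = np.mean(predicted_lc50 <= experimental_lc50)  # fraction predicted equal or more toxic
print(f"Spearman rho={rho:.2f}, p={p_value:.3f}, conservative fraction={conservative:.2f}")
```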

  12. A creative approach to the development of an agenda for knowledge utilization: outputs from the 11th international knowledge utilization colloquium (KU 11).

    Science.gov (United States)

    Wilkinson, Joyce E; Rycroft-Malone, Jo; Davies, Huw T O; McCormack, Brendan

    2012-12-01

    A group of researchers and practitioners interested in advancing knowledge utilization met as a colloquium in Belfast (KU 11) and used a "world café" approach to exploit the social capital and shared understanding built up over previous events to consider the research and practice agenda. We considered three key areas of relevance to knowledge use: (1) understanding the nature of research use, influence and impact; (2) blended and collaborative approaches to knowledge production and use; and (3) supporting sustainability and spread of evidence-informed innovations. The approach enabled the development of artifacts that reflected the three areas and these were analyzed using a creative hermeneutic approach. The themes that emerged and which are outlined in this commentary are not mutually exclusive. There was much overlap in the discussions and therefore of the themes, reflecting the complex nature of knowledge translation work. The agenda that has emerged from KU 11 also reflects the participatory and creative approach in which the meeting was structured and focused, and therefore emphasizes the processual, relational and contingent nature of some of the challenges we face. The past 20 years has seen an explosion in activity around understanding KU, and we have learned much about the difficulties. Whilst the agenda for the next decade may be becoming clearer, colloquia such as KU 11, using creative and engaging approaches, have a key role to play in dissecting, articulating and sharing that agenda. In this way, we also build an ever-expanding international community that is dedicated to working towards increasing the chances of success for better patient care. © 2012 Sigma Theta Tau International.

  13. Three methods to monitor utilization of healthcare services by the poor

    Directory of Open Access Journals (Sweden)

    Urni Farhana

    2009-08-01

    Full Text Available Abstract Background Achieving equity by way of improving the condition of the economically poor or otherwise disadvantaged is among the core goals of the contemporary development paradigm. This places importance on monitoring outcome indicators among the poor. National surveys allow disaggregation of outcomes by socioeconomic status at the national level but do not have statistical adequacy to provide estimates for lower-level administrative units. This limits the utility of these data for programme managers to know how well particular services are reaching the poor at the lowest level. Managers are thus left without a tool for monitoring results for the poor at lower levels. This paper demonstrates that with some extra effort, community- and facility-based data at the lower level can be used to monitor utilization of healthcare services by the poor. Methods Data used in this paper came from two sources: the Chakaria Health and Demographic Surveillance System (HDSS) of ICDDR,B, and a special study conducted during 2006 among patients attending the public and private health facilities in Chakaria, Bangladesh. The outcome variables included use of skilled attendants for delivery and use of facilities. Rate-ratio, rate-difference, concentration index, benefit incidence ratio, sequential sampling, and Lot Quality Assurance Sampling were used to assess how pro-poor the use of skilled attendants for delivery and healthcare facilities is. Findings The poor use skilled attendants for delivery far less than the better-off. Government health service facilities are used more than private facilities by the poor. Benefit incidence analysis and sequential sampling techniques could assess the situation realistically and can be used for monitoring utilization of services by the poor. The visual display of the findings makes both these methods attractive. LQAS, on the other hand, requires a small fixed sample and always enables decision making. Conclusion With some

  14. Three methods to monitor utilization of healthcare services by the poor

    Science.gov (United States)

    Bhuiya, Abbas; Hanifi, SMA; Urni, Farhana; Mahmood, Shehrin Shaila

    2009-01-01

    Background Achieving equity by way of improving the condition of the economically poor or otherwise disadvantaged is among the core goals of the contemporary development paradigm. This places importance on monitoring outcome indicators among the poor. National surveys allow disaggregation of outcomes by socioeconomic status at the national level but do not have statistical adequacy to provide estimates for lower-level administrative units. This limits the utility of these data for programme managers to know how well particular services are reaching the poor at the lowest level. Managers are thus left without a tool for monitoring results for the poor at lower levels. This paper demonstrates that with some extra effort, community- and facility-based data at the lower level can be used to monitor utilization of healthcare services by the poor. Methods Data used in this paper came from two sources: the Chakaria Health and Demographic Surveillance System (HDSS) of ICDDR,B, and a special study conducted during 2006 among patients attending the public and private health facilities in Chakaria, Bangladesh. The outcome variables included use of skilled attendants for delivery and use of facilities. Rate-ratio, rate-difference, concentration index, benefit incidence ratio, sequential sampling, and Lot Quality Assurance Sampling were used to assess how pro-poor the use of skilled attendants for delivery and healthcare facilities is. Findings The poor use skilled attendants for delivery far less than the better-off. Government health service facilities are used more than private facilities by the poor. Benefit incidence analysis and sequential sampling techniques could assess the situation realistically and can be used for monitoring utilization of services by the poor. The visual display of the findings makes both these methods attractive. LQAS, on the other hand, requires a small fixed sample and always enables decision making. Conclusion With some extra effort, monitoring of the
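
    Among the measures listed, the concentration index is the most compact to illustrate: individuals are ranked from poorest to richest and the index is twice the covariance between service use and fractional rank, divided by mean use, with negative values indicating pro-poor utilization. The sketch below uses invented data.

```python
import numpy as np

def concentration_index(use, ses):
    """use: service utilisation per person; ses: socioeconomic score (low = poor)."""
    order = np.argsort(ses)                     # rank individuals from poorest to richest
    use = np.asarray(use, dtype=float)[order]
    n = use.size
    fractional_rank = (np.arange(1, n + 1) - 0.5) / n
    return 2.0 * np.cov(use, fractional_rank, bias=True)[0, 1] / use.mean()

use = [0, 1, 0, 1, 1, 1, 2, 2, 3, 3]            # e.g. facility visits (illustrative)
ses = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]           # asset-score proxy (illustrative)
print("concentration index:", round(concentration_index(use, ses), 3))
```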

  15. A hybrid approach for efficient anomaly detection using metaheuristic methods

    Directory of Open Access Journals (Sweden)

    Tamer F. Ghanem

    2015-07-01

    Full Text Available Network intrusion detection based on anomaly detection techniques has a significant role in protecting networks and systems against harmful activities. Different metaheuristic techniques have been used for anomaly detector generation. Yet, the reported literature has not studied the use of the multi-start metaheuristic method for detector generation. This paper proposes a hybrid approach for anomaly detection in large-scale datasets using detectors generated by the multi-start metaheuristic method and genetic algorithms. The proposed approach takes some inspiration from negative selection-based detector generation. The evaluation of this approach is performed using the NSL-KDD dataset, which is a modified version of the widely used KDD CUP 99 dataset. The results show its effectiveness in generating a suitable number of detectors with an accuracy of 96.1% compared to other competing machine learning algorithms.
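
    The record above names negative selection as the inspiration for detector generation. The toy sketch below illustrates that underlying idea only: random candidate detectors in a unit feature square are kept if they do not match any normal ("self") sample, and a test point matched by a detector is flagged as anomalous. It is not a reproduction of the paper's multi-start metaheuristic or genetic-algorithm operators.

        # Toy negative-selection sketch (illustrative only).
        import numpy as np

        rng = np.random.default_rng(0)
        self_samples = rng.normal(0.5, 0.1, size=(200, 2))   # normal traffic features in [0, 1]^2
        self_radius = 0.15
        detectors = []
        while len(detectors) < 50:
            cand = rng.random(2)                              # random candidate detector
            if np.min(np.linalg.norm(self_samples - cand, axis=1)) > self_radius:
                detectors.append(cand)                        # keep only non-self-matching candidates
        detectors = np.array(detectors)

        def is_anomalous(x, detector_radius=0.1):
            return np.min(np.linalg.norm(detectors - x, axis=1)) < detector_radius

        print(is_anomalous(np.array([0.50, 0.50])))   # typically False: inside the self region
        print(is_anomalous(np.array([0.95, 0.05])))   # typically True: far from all self samples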

  16. Systematic approaches to data analysis from the Critical Decision Method

    Directory of Open Access Journals (Sweden)

    Martin Sedlár

    2015-01-01

    Full Text Available The aim of the present paper is to introduce how to analyse the qualitative data from the Critical Decision Method. First, a characterization of the method provides a meaningful introduction to the issue. This method, used in naturalistic decision making research, is one of the cognitive task analysis methods; it is based on a retrospective semi-structured interview about a critical incident from work and it may be applied in various domains such as emergency services, military, transport, sport or industry. Researchers can make two types of methodological adaptation. Within-method adaptations modify the way of conducting the interviews and cross-method adaptations combine this method with other related methods. There are many descriptions of conducting the interview, but descriptions of how the data should be analysed are rare. Some researchers use conventional approaches like content analysis, grounded theory or individual procedures with reference to the objectives of the research project. Wong (2004) describes two approaches to data analysis proposed for this method of data collection, which are described and reviewed in detail. They enable systematic work with a large amount of data. The structured approach organizes the data according to an a priori analysis framework and it is suitable for a clearly defined object of research. Each incident is studied separately. First, the decision chart showing the main decision points and then the incident summary are made. These decision points are used to identify the relevant statements from the transcript, which are analysed in terms of the Recognition-Primed Decision Model. Finally, the results from all the analysed incidents are integrated. The limitation of the structured approach is that it may not reveal some interesting concepts. The emergent themes approach helps to identify these concepts while maintaining a systematic framework for analysis and it is used for exploratory research design. It

  17. Experiential Approach to Teaching Statistics and Research Methods ...

    African Journals Online (AJOL)

    Statistics and research methods are among the more demanding topics for students of education to master at both the undergraduate and postgraduate levels. It is our conviction that teaching these topics should be combined with real practical experiences. We discuss an experiential teaching/ learning approach that ...

  18. Book Review: Comparative Education Research: Approaches and Methods

    Directory of Open Access Journals (Sweden)

    Noel Mcginn

    2014-10-01

    Full Text Available Book Review: Comparative Education Research: Approaches and Methods (2nd edition), by Mark Bray, Bob Adamson and Mark Mason (Eds.) (2014, 453p), ISBN: 978-988-17852-8-2, Hong Kong: Comparative Education Research Centre and Springer

  19. A sequential mixed methods research approach to investigating HIV ...

    African Journals Online (AJOL)

    Sequential mixed methods research is an effective approach for investigating complex problems, but it has not been extensively used in construction management research. In South Africa, the HIV/AIDS pandemic has seen construction management taking on a vital responsibility since the government called upon the ...

  20. Teaching Psychological Research Methods through a Pragmatic and Programmatic Approach

    Science.gov (United States)

    Rosenkranz, Patrick; Fielden, Amy; Tzemou, Effy

    2014-01-01

    Research methods teaching in psychology is pivotal in preparing students for the transition from student as learner to independent practitioner. We took an action research approach to re-design, implement and evaluate a module guiding students through a programmatic and pragmatic research cycle. These revisions allow students to experience how…

  1. The Feldenkrais Method: A Dynamic Approach to Changing Motor Behavior.

    Science.gov (United States)

    Buchanan, Patricia A.; Ulrich, Beverly D.

    2001-01-01

    Describes the Feldenkrais Method of somatic education, noting parallels with a dynamic systems theory (DST) approach to motor behavior. Feldenkrais uses movement and perception to foster individualized improvement in function. DST explains that a human-environment system continually adapts to changing conditions and assembles behaviors…

  2. Computer-Enhanced Visual Learning Method to Teach Endoscopic Correction of Vesicoureteral Reflux: An Invitation to Residency Training Programs to Utilize the CEVL Method

    Directory of Open Access Journals (Sweden)

    Michael Bauschard

    2012-01-01

    Full Text Available Herein we describe a standardized approach to teaching endoscopic injection therapy to repair vesicoureteral reflux utilizing the CEVL method, an internet-accessed platform. The content was developed through a collaboration drawing on the authors' clinical and computing expertise. This application provides personnel training, examination, and procedure skill documentation through the use of online text with narration, pictures, and video. Also included are feedback and remediation of skill performance and teaching "games." We propose that such standardized teaching and procedure performance will culminate in improved surgical results. The electronic nature of communication in this journal is ideal to rapidly disseminate this information and to develop a structure for collaborative research.

  3. Conditions for energy generation as an alternative approach to compost utilization.

    Science.gov (United States)

    Raclavska, H; Juchelkova, D; Skrobankova, H; Wiltowski, T; Campen, A

    2011-01-01

    Very strict limits constrain the current possibilities for compost utilization in agriculture and for land reclamation, thus creating a need for other compost utilization practices. A favourable alternative can be compost utilization as a renewable heat source - alternative fuel. The changes of the basic physical-chemical parameters during the composting process are evaluated. During the composting process, energy losses of 920 kJ/kg occur, caused by carbohydrate decomposition (loss of 12.64% TOC). The net calorific value for mature compost was 11.169 kJ/kg dry matter. The grain size of compost below 0.045 mm has the highest ash content. The energetic utilization of compost depended on moisture, which can be influenced by paper addition or by prolonging the time of maturation to six months.

  4. An Efficient Approach for Identifying Stable Lobes with Discretization Method

    Directory of Open Access Journals (Sweden)

    Baohai Wu

    2013-01-01

    Full Text Available This paper presents a new approach for quick identification of chatter stability lobes with the discretization method. Firstly, three different kinds of stability regions are defined: absolute stable region, valid region, and invalid region. Secondly, while identifying the chatter stability lobes, the three different regions within the chatter stability lobes are identified with relatively large time intervals. Thirdly, the stability boundary within the valid regions is finely calculated to get exact chatter stability lobes. The proposed method only needs to test a small portion of the spindle speed and cutting depth set; about 89% of the computation time is saved compared with the full discretization method. It takes only about 10 minutes to get exact chatter stability lobes. Since it is based on the discretization method, the proposed approach can be used for different immersion conditions, including low-immersion cutting, and can be directly implemented in the workshop to improve the efficiency of machining parameter selection.
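
    The coarse-then-fine idea described above can be illustrated generically: classify a coarse set of cutting depths at each spindle speed, then refine the limiting depth only inside the interval where stability flips. The stability criterion below is a placeholder, not the paper's discretization-based chatter model.

        # Generic coarse-to-fine boundary search over depth of cut (illustrative only).
        import numpy as np

        def stable(speed_rpm, depth_mm):
            # placeholder stability limit: a wavy function of spindle speed
            return depth_mm <= 2.0 + 1.5 * abs(np.sin(speed_rpm / 800.0))

        def limiting_depth(speed, d_min=0.5, d_max=5.0, coarse_step=0.5, tol=0.01):
            d = d_min
            # coarse sweep: find the interval [d, d + coarse_step] where stability flips
            while d + coarse_step <= d_max and stable(speed, d + coarse_step):
                d += coarse_step
            lo, hi = d, d + coarse_step
            # fine refinement by bisection, only inside that interval
            while hi - lo > tol:
                mid = 0.5 * (lo + hi)
                lo, hi = (mid, hi) if stable(speed, mid) else (lo, mid)
            return lo

        for speed in np.linspace(2000, 12000, 6):
            print(f"{speed:7.0f} rpm -> limiting depth ~ {limiting_depth(speed):.2f} mm")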

  5. A Systems Approach to Information Technology (IT) Infrastructure Design for Utility Management Automation Systems

    OpenAIRE

    A. Fereidunian; H. Lesani; C. Lucas; M. Lehtonen; M. M. Nordman

    2006-01-01

    Almost all of electric utility companies are planning to improve their management automation system, in order to meet the changing requirements of new liberalized energy market and to benefit from the innovations in information and communication technology (ICT or IT). Architectural design of the utility management automation (UMA) systems for their IT-enabling requires proper selection of IT choices for UMA system, which leads to multi-criteria decision-makings (MCDM). In resp...

  6. Schizophrenia: multi-attribute utility theory approach to selection of atypical antipsychotics.

    Science.gov (United States)

    Bettinger, Tawny L; Shuler, Garyn; Jones, Donnamaria R; Wilson, James P

    2007-02-01

    Current guidelines/algorithms recommend atypical antipsychotics as first-line agents for the treatment of schizophrenia. Because there are extensive healthcare costs associated with the treatment of schizophrenia, many institutions and health systems are faced with making restrictive formulary decisions regarding the use of atypical antipsychotics. Often, medication acquisition costs are the driving force behind formulary decisions, while other treatment factors are not considered. To apply a multi-attribute utility theory (MAUT) analysis to aid in the selection of a preferred agent among the atypical antipsychotics for the treatment of schizophrenia. Five atypical antipsychotics (risperidone, olanzapine, quetiapine, ziprasidone, aripiprazole) were selected as the alternative agents to be included in the MAUT analysis. The attributes identified for inclusion in the analysis were efficacy, adverse effects, cost, and adherence, with relative weights of 35%, 35%, 20%, and 10%, respectively. For each agent, attribute scores were calculated, weighted, and then summed to generate a total utility score. The agent with the highest total utility score was considered the preferred agent. Aripiprazole, with a total utility score of 75.8, was the alternative agent with the highest total utility score in this model. This was followed by ziprasidone, risperidone, and quetiapine, with total utility scores of 71.8, 69.0, and 65.9, respectively. Olanzapine received the lowest total utility score. A sensitivity analysis was performed and failed to displace aripiprazole as the agent with the highest total utility score. This model suggests that aripiprazole should be considered a preferred agent for the treatment of schizophrenia unless found to be otherwise inappropriate.
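
    The weighted-sum calculation behind such a MAUT analysis is straightforward to reproduce. The sketch below uses the attribute weights stated in the abstract (efficacy 35%, adverse effects 35%, cost 20%, adherence 10%) but hypothetical attribute scores for two generic alternatives, not the study's data.

        # Weighted-sum MAUT sketch; scores are placeholders on a 0-100 scale (higher = better).
        weights = {"efficacy": 0.35, "adverse_effects": 0.35, "cost": 0.20, "adherence": 0.10}
        scores = {
            "drug_A": {"efficacy": 80, "adverse_effects": 70, "cost": 60, "adherence": 75},
            "drug_B": {"efficacy": 85, "adverse_effects": 60, "cost": 50, "adherence": 70},
        }

        def total_utility(attribute_scores):
            return sum(weights[a] * s for a, s in attribute_scores.items())

        for drug in sorted(scores, key=lambda d: -total_utility(scores[d])):
            print(f"{drug}: total utility = {total_utility(scores[drug]):.1f}")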

  7. The Value of Sustainable Knowledge Transfer Methods for SMEs, Utilizing Socio-Technical Networks and Complex Systems

    Directory of Open Access Journals (Sweden)

    Susu Nousala

    2010-12-01

    Full Text Available This paper will examine the development of sustainable SME methods for tracking tacit (informal) knowledge transfer as a series of networks within a larger complex system. Understanding sustainable systems begins with valuing tacit knowledge networks and their ability to produce connections on multiple levels. The behaviour of the social (or socio) aspects of a system in relation to the explicit formal/physical structures needs to be understood and actively considered when utilizing methodologies for interacting within complex system structures. This paper utilizes theory from several previous studies to underpin the key case study discussed. This approach involved examining the behavioural phenomena of an SME knowledge network. The knowledge network elements were highlighted to identify their value within an SME structure. To understand the value of these emergent elements arising between tacit and explicit knowledge networks is to actively, simultaneously and continuously support sustainable development for SME organizations. The simultaneous links within and between groups of organizations are crucial for understanding the sustainable networking structures of complex systems.

  8. APPLICATION OF MODIFIED POWER FLOW TRACING METHOD FOR REACTIVE POWER PRICING IN PRACTICAL UTILITY SYSTEM

    Directory of Open Access Journals (Sweden)

    M. SUSITHRA

    2017-01-01

    Full Text Available The competitive trend towards restructuring and unbundling of transmission services has resulted in the need to determine the impact of a particular generator on a particular load. This paper initially presents an analysis of three different reactive power valuation methods, namely the modified Ybus, the virtual flow approach and modified power flow tracing, to compute the reactive power delivered from a particular generator to a particular load. Among these methods, the modified power flow electricity tracing method is identified as the best method to trace the reactive power contribution from various reactive power sources to loads, transmission lines, etc. The proposed method also breaks down the total reactive power loss in a transmission line into components to be allocated to individual loads. Secondly, based on this method, a novel allocation method for reactive power service in a practical system is proposed. Hence, this method can be useful in providing additional insight into power system operation and can be used to modify existing tariffs for reactive power transmission losses and reactive power transmission services. Simulation and comparison results are shown by taking the WSCC 9-bus and IEEE 30-bus systems as test systems.
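
    Tracing methods of this kind generally rest on a proportional-sharing assumption at each bus. The sketch below illustrates only that generic principle for a single bus with hypothetical reactive power flows; it is not the paper's modified power flow tracing algorithm.

        # Proportional sharing at one bus: each outflow is assumed to contain the
        # inflows in proportion to their share of the total inflow (hypothetical MVAr).
        inflows = {"Gen1": 60.0, "Gen2": 40.0}
        outflows = {"LoadA": 30.0, "LoadB": 70.0}

        total_in = sum(inflows.values())
        allocation = {
            load: {gen: q_out * q_in / total_in for gen, q_in in inflows.items()}
            for load, q_out in outflows.items()
        }
        for load, shares in allocation.items():
            parts = ", ".join(f"{gen}: {mvar:.1f} MVAr" for gen, mvar in shares.items())
            print(f"{load} <- {parts}")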

  9. Automatic segmentation of MRI head images by 3-D region growing method which utilizes edge information

    International Nuclear Information System (INIS)

    Jiang, Hao; Suzuki, Hidetomo; Toriwaki, Jun-ichiro

    1991-01-01

    This paper presents a 3-D segmentation method that automatically extracts soft tissue from multi-sliced MRI head images. MRI produces a sequence of two-dimensional (2-D) images which contains three-dimensional (3-D) information of organs. To utilize such information we need effective algorithms to treat 3-D digital images and to extract organs and tissues of interest. We developed a method to extract the brain from MRI images which uses a region growing procedure and integrates information on the uniformity of gray levels with information on the presence of edge segments in the local area around the pixel of interest. First we generate a kernel region which is a part of brain tissue by simple thresholding. Then we grow the region by means of a region growing algorithm under the control of 3-D edge existence to obtain the region of the brain. Our method is rather simple because it uses basic 3-D image processing techniques like spatial differencing. It is robust to variation of gray levels inside a tissue since it also refers to the edge information in the process of region growing. Therefore, the method is flexible enough to be applicable to the segmentation of other images including soft tissues which have complicated shapes and fluctuation in gray levels. (author)
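
    A simplified sketch of the core idea, seeded region growing in 3-D controlled jointly by gray-level similarity and local edge strength, is given below; the toy volume, thresholds and 6-connectivity are illustrative choices, not the authors' implementation.

        # Seeded 3-D region growing with an edge-strength constraint (illustrative).
        from collections import deque
        import numpy as np

        def region_grow_3d(volume, seed, intensity_tol=30.0, edge_thresh=40.0):
            gz, gy, gx = np.gradient(volume.astype(float))
            edge = np.sqrt(gz**2 + gy**2 + gx**2)          # 3-D edge strength
            grown = np.zeros(volume.shape, dtype=bool)
            grown[seed] = True
            seed_val = float(volume[seed])
            queue = deque([seed])
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in offsets:
                    nz, ny, nx = z + dz, y + dy, x + dx
                    if not (0 <= nz < volume.shape[0] and 0 <= ny < volume.shape[1]
                            and 0 <= nx < volume.shape[2]) or grown[nz, ny, nx]:
                        continue
                    # grow only into voxels with a similar gray level and weak edges
                    if (abs(volume[nz, ny, nx] - seed_val) <= intensity_tol
                            and edge[nz, ny, nx] <= edge_thresh):
                        grown[nz, ny, nx] = True
                        queue.append((nz, ny, nx))
            return grown

        vol = np.full((32, 32, 32), 20.0)     # toy volume: bright blob in a dark background
        vol[8:24, 8:24, 8:24] = 120.0
        mask = region_grow_3d(vol, seed=(16, 16, 16))
        print(mask.sum(), "voxels grown")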

  10. Can we delay the replacement of this component?-an asset management approach to the question [for electric utilities

    DEFF Research Database (Denmark)

    Østergaard, Jacob; Jensen, A. Norsk

    2001-01-01

    Asset management is emerging as a new approach to exploiting an electric utility's physical assets in the most profitable way. One of the major questions the asset management staff must answer is when to carry out replacements. This is a difficult question, which requires weighting of several parameter...... on Windows CE handheld computers which are presented in this paper. The calculations shown in the paper are based on this tool, and the software system is today available and used by Danish electric utilities.

  11. Total System Performance Assessment - License Application Methods and Approach

    Energy Technology Data Exchange (ETDEWEB)

    J. McNeish

    2003-12-08

    "Total System Performance Assessment-License Application (TSPA-LA) Methods and Approach" provides the top-level method and approach for conducting the TSPA-LA model development and analyses. The method and approach are responsive to the criteria set forth in the Total System Performance Assessment Integration (TSPAI) Key Technical Issues (KTIs) identified in agreements with the U.S. Nuclear Regulatory Commission, the "Yucca Mountain Review Plan" (YMRP), "Final Report" (NRC 2003 [163274]), and the NRC final rule 10 CFR Part 63 (NRC 2002 [156605]). This introductory section provides an overview of the TSPA-LA, the projected TSPA-LA documentation structure, and the goals of the document. It also provides a brief discussion of the regulatory framework, the approach to risk management in the development and analysis of the model, and the overall organization of the document. The section closes with some important conventions that are used in this document.

  12. Budgetary Approach to Project Management by Percentage of Completion Method

    Directory of Open Access Journals (Sweden)

    Leszek Borowiec

    2011-07-01

    Full Text Available An efficient and effective project management process is made possible by the use of methods and techniques of project management. The aim of this paper is to present the problems of project management using the Percentage of Completion (POC) method. The research material was gathered based on the experience of the Johnson Controls International Company in implementing this method. The article attempts to demonstrate the validity of the thesis that the POC project management method allows for effective implementation and monitoring of a project and is thus an effective tool in managing companies that use a budgetary approach. The study presents the planning process of basic parameters affecting the effectiveness of the project (such as costs, revenue and margin) and characterizes the primary measurements used to evaluate it. The theme is illustrated by numerous examples showing the essence of the problems raised, and the results are presented using descriptive, graphical and tabular methods.
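
    For readers unfamiliar with the calculation itself, the following minimal sketch shows the standard cost-to-cost form of the Percentage of Completion method with hypothetical figures; it is not drawn from the Johnson Controls case.

        # Cost-to-cost POC: recognize revenue and margin in proportion to cost incurred.
        contract_price = 1_000_000.0
        estimated_total_cost = 800_000.0
        cost_to_date = 200_000.0

        poc = cost_to_date / estimated_total_cost              # 25% complete
        revenue_recognized = poc * contract_price              # 250,000
        margin_recognized = revenue_recognized - cost_to_date  # 50,000
        print(f"POC = {poc:.0%}, revenue = {revenue_recognized:,.0f}, margin = {margin_recognized:,.0f}")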

  13. Assessing and evaluating multidisciplinary translational teams: a mixed methods approach.

    Science.gov (United States)

    Wooten, Kevin C; Rose, Robert M; Ostir, Glenn V; Calhoun, William J; Ameredes, Bill T; Brasier, Allan R

    2014-03-01

    A case report illustrates how multidisciplinary translational teams can be assessed using outcome, process, and developmental types of evaluation using a mixed-methods approach. Types of evaluation appropriate for teams are considered in relation to relevant research questions and assessment methods. Logic models are applied to scientific projects and team development to inform choices between methods within a mixed-methods design. Use of an expert panel is reviewed, culminating in consensus ratings of 11 multidisciplinary teams and a final evaluation within a team-type taxonomy. Based on team maturation and scientific progress, teams were designated as (a) early in development, (b) traditional, (c) process focused, or (d) exemplary. Lessons learned from data reduction, use of mixed methods, and use of expert panels are explored.

  14. Daylight Utilization with Light Pipe in Farm Animal Production: A Simulation Approach

    Directory of Open Access Journals (Sweden)

    Alejandro Pacheco Diéguez

    2016-06-01

    Full Text Available Light pipes offer a passive way to bring daylight inside deep buildings, such as agricultural buildings. However, the lack of reliable performance prediction methods for light pipes represents a major obstacle preventing their widespread use. This paper evaluates a simulation approach for performance prediction and identifies key light pipe design parameters affecting their daylight transmission performance. The study was carried out through continuous monitoring of daylight in two full-scale, identical pig stables fitted with two light pipe systems, Solatube® and Velux®. The experiment included three continuously measuring sensors in each stable and an outdoor sensor during 2013 and 2014. A forward raytracing tool, TracePro®, was used for illuminance prediction and parametric simulations. The simulation results for overcast skies indicated discrepancies between the simulated and average measured results below 30% in all cases. The discrepancies for clear skies were somewhat higher, i.e., below 30% for 67% of the cases. The higher discrepancies with clear skies were due to the overestimation of absolute sunlight levels and the absence of an advanced and detailed optical characterization of the dome collector's surface. The parametric results show that light pipe performance is better in summer, in sunny climates, and at low to mid-latitudes, where solar altitudes are higher than in winter and in cloudy climates at high latitudes. Methods to improve the luminous transmittance at the low solar altitudes occurring in Scandinavia include: bending or tilting the pipe, increasing the aspect ratio, improving the pipe's specular reflectance, tilting the collector to the south, and using an optical redirecting system in the collector.
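
    A rough sense of why aspect ratio, specular reflectance and solar altitude dominate light pipe transmission can be had from a simple analytical approximation for a straight mirrored tube; this is a generic textbook-style estimate, not the TracePro raytracing used in the study. Direct light entering at solar altitude h undergoes roughly N = aspect_ratio / tan(h) wall reflections, so transmittance scales as reflectance raised to the power N.

        # Rough straight-pipe transmittance estimate (illustrative approximation only).
        import math

        def pipe_transmittance(aspect_ratio, reflectance, solar_altitude_deg):
            n_reflections = aspect_ratio / math.tan(math.radians(solar_altitude_deg))
            return reflectance ** n_reflections

        for altitude in (10, 30, 55):   # low winter sun vs. high summer sun
            t = pipe_transmittance(aspect_ratio=4.0, reflectance=0.98, solar_altitude_deg=altitude)
            print(f"solar altitude {altitude:2d} deg -> transmittance ~ {t:.2f}")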

  15. A restraint molecular dynamics and simulated annealing approach for protein homology modeling utilizing mean angles

    Directory of Open Access Journals (Sweden)

    Maurer Till

    2005-04-01

    Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, which are calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator activated receptor γ (Ppar γ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of Ppar γ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.

  16. Utilizing Computational Probabilistic Methods to Derive Shock Specifications in a Nondeterministic Environment

    Energy Technology Data Exchange (ETDEWEB)

    FIELD JR.,RICHARD V.; RED-HORSE,JOHN R.; PAEZ,THOMAS L.

    2000-10-25

    One of the key elements of the Stochastic Finite Element Method, namely the polynomial chaos expansion, has been utilized in a nonlinear shock and vibration application. As a result, the computed response was expressed as a random process, which is an approximation to the true solution process and can be thought of as a generalization of solutions given as statistics only. This approximation to the response process was then used to derive an analytically-based design specification for component shock response that guarantees a balanced level of marginal reliability. Hence, this analytically-based reference SRS might lead to an improvement over the somewhat ad hoc test-based reference in the sense that it will not exhibit regions of conservativeness, nor lead to overtesting of the design.
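
    The polynomial chaos idea can be illustrated with a minimal one-dimensional sketch: a nonlinear response of a standard normal input is expanded in probabilists' Hermite polynomials fitted by least squares, and the resulting surrogate is used to estimate response statistics. The response function and truncation order below are illustrative, not the report's shock and vibration model.

        # One-dimensional Hermite polynomial chaos surrogate fitted by least squares.
        import numpy as np
        from numpy.polynomial import hermite_e as He

        rng = np.random.default_rng(1)
        g = lambda z: np.exp(0.3 * z) + 0.1 * z**3          # stand-in nonlinear response

        z_train = rng.standard_normal(2000)
        Psi = He.hermevander(z_train, 5)                     # Hermite basis up to degree 5
        coeffs, *_ = np.linalg.lstsq(Psi, g(z_train), rcond=None)

        z_test = rng.standard_normal(100_000)
        y_pce = He.hermevander(z_test, 5) @ coeffs           # cheap surrogate evaluations
        print("mean  (PCE vs direct):", y_pce.mean(), g(z_test).mean())
        print("stdev (PCE vs direct):", y_pce.std(), g(z_test).std())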

  17. Few-layer graphene growth from polystyrene as solid carbon source utilizing simple APCVD method

    Science.gov (United States)

    Ahmadi, Shahrokh; Afzalzadeh, Reza

    2016-07-01

    This research article presents the development of an economical, simple, immune and environmentally friendly process to grow few-layer graphene by controlling the evaporation rate of polystyrene on copper foil as catalyst and substrate, utilizing the atmospheric pressure chemical vapor deposition (APCVD) method. The evaporation rate of polystyrene depends on molecular structure, amount of used material and temperature. We have found that controlling the rate of evaporation of polystyrene via the source temperature is easier than controlling the material weight. Atomic force microscopy (AFM) as well as Raman spectroscopy has been used for characterization of the layers. The intensity ratio of the G′ to G band in some samples varied between 0.8 and 1.6, corresponding to few-layer graphene. Topography characterization by atomic force microscopy confirmed the Raman results.

  18. Some approaches to the formation of the financialmechanism of efficient housing and utility services functioning

    Directory of Open Access Journals (Sweden)

    Chernyshev Aleksey Valentinovich

    2014-02-01

    Full Text Available In modern market conditions, the purpose of forming the financial mechanism of housing and utility services has to be ensuring the efficient functioning of the services rendered by this complex. While creating the financial mechanism of housing and utility services development, only such criteria as the purpose and operating principles of organizations are considered. Thus, the main goal of this research is to establish a transparent mechanism for reflecting pricing policy in the housing services industry, as well as control of payment amounts for the maintenance and repair of housing and utility facilities. The formation of the financial mechanism has to be carried out within the principles of financial management. Considering the various points of view of scholars on the number and essence of these principles, the authors discuss those that are most specific to the sphere of housing and utility services. Many economists take as the basis of the housing and utility services financial mechanism the goal of creating favorable conditions for social development, which means compliance with the interests and requirements of the population.

  19. An approach for evaluating utility-financed energy conservation programs. The economic welfare model

    Energy Technology Data Exchange (ETDEWEB)

    Costello, K W; Galen, P S

    1985-09-01

    The main objective of this paper is to illustrate how the economic welfare model may be used to measure the economic efficiency effects of utility-financed energy conservation programs. The economic welfare model is the theoretical structure that was used in this paper to develop a cost/benefit test. This test defines the net benefit of a conservation program as the change in the sum of consumer and producer surplus. The authors advocate the operation of the proposed cost/benefit model as a screening tool to eliminate from more detailed review those programs where the expected net benefits are less than zero. The paper presents estimates of the net benefit derived from different specified cost/benefit models for four illustrative pilot programs. These models are representative of those which have been applied or are under review by utilities and public utility commissions. From the numerical results, it is shown that net benefit is greatly affected by the assumptions made about the nature of welfare gains to program participants. The main conclusion that emerges from the numerical results is that the selection of a cost/benefit model is a crucial element in evaluating utility-financed energy conservation programs. The paper also briefly addresses some of the major unresolved issues in utility-financed energy conservation programs. 2 figs., 3 tabs., 10 refs. (A.V.)

  20. Multiattribute Utility Theory without Expected Utility Foundations

    NARCIS (Netherlands)

    Stiggelbout, A.M.; Wakker, P.P.

    1995-01-01

    Methods for determining the form of utilities are needed for the implementation of utility theory in specific decisions. An important step forward was achieved when utility theorists characterized useful parametric families of utilities, and simplifying decompositions of multiattribute utilities.

  1. Multiattribute utility theory without expected utility foundations

    NARCIS (Netherlands)

    Wakker, P.P.; Miyamoto, J.

    1996-01-01

    Methods for determining the form of utilities are needed for the implementation of utility theory in specific decisions. An important step forward was achieved when utility theorists characterized useful parametric families of utilities, and simplifying decompositions of multiattribute utilities.

  2. Astronomical Orientation Method Based on Lunar Observations Utilizing Super Wide Field of View

    Directory of Open Access Journals (Sweden)

    PU Junyu

    2018-04-01

    Full Text Available In this paper, astronomical orientation is achieved by observing the moon with a camera with a super wide field of view, and the formulae are deduced in detail. An experiment based on real observations verified the stability of the method. In this experiment, after 15 minutes of tracking shots, the internal precision could be better than ±7.5" and the external precision could approximately reach ±20". This camera-based method for astronomical orientation can change the traditional mode (aiming by human eye through a theodolite), thus lowering the requirements for operator skill to some extent. Furthermore, a camera with a super wide field of view can continuously track and image the moon without complicated servo control devices. Considering that gravity similarly exists on the moon and that the earth shows phase changes when observed from the moon, once self-leveling technology is developed, this method can be extended to orientation for a lunar rover by imaging the earth.

  3. A mixed-methods approach to systematic reviews.

    Science.gov (United States)

    Pearson, Alan; White, Heath; Bath-Hextall, Fiona; Salmond, Susan; Apostolo, Joao; Kirkpatrick, Pamela

    2015-09-01

    There are an increasing number of published single-method systematic reviews that focus on different types of evidence related to a particular topic. As policy makers and practitioners seek clear directions for decision-making from systematic reviews, it is likely that it will be increasingly difficult for them to identify 'what to do' if they are required to find and understand a plethora of syntheses related to a particular topic. Mixed-methods systematic reviews are designed to address this issue and have the potential to produce systematic reviews of direct relevance to policy makers and practitioners. On the basis of the recommendations of the Joanna Briggs Institute International Mixed Methods Reviews Methodology Group in 2012, the Institute adopted a segregated approach to mixed-methods synthesis as described by Sandelowski et al., which consists of separate syntheses of each component method of the review. The Joanna Briggs Institute's mixed-methods synthesis of the findings of the separate syntheses uses a Bayesian approach to translate the findings of the initial quantitative synthesis into qualitative themes and pool these with the findings of the initial qualitative synthesis.

  4. A new approach for peat inventory methods; Turvetutkimusten menetelmaekehitystarkastelu

    Energy Technology Data Exchange (ETDEWEB)

    Laatikainen, M.; Leino, J.; Lerssi, J.; Torppa, J.; Turunen, J. Email: jukka.turunen@gtk.fi

    2011-07-01

    Development of the new peatland inventory method started in 2009. There was a need to investigate whether new methods and tools could be developed cost-effectively so that field inventory work would more completely cover the whole peatland area while the quality and reliability of the final results remained at a high level. The old inventory method in place at the Geological Survey of Finland (GTK) is based on the main transect and cross transect approach across a peatland area. The goal of this study was to find a practical grid-based method, linked to the geographic information system, suitable for field conditions. The triangle-grid method with even distance between the study points was found to be the most suitable approach. A new Ramac ground penetrating radar was obtained by the GTK in 2009 and was included in the study of new peatland inventory methods. This radar model is relatively light and very suitable, for example, for forestry-drained peatlands, which are often difficult to cross because of the intensive ditch network. The goal was to investigate the best working methods for the ground penetrating radar to optimize its use in the large-scale peatland inventory. Together with the new field inventory methods, a novel interpolation-based method (MITTI) for modelling peat depths was developed. MITTI makes it possible to take advantage of all the available peat-depth data including, at the moment, aerogeophysical and ground penetrating radar measurements, drilling data and the mire outline. The characteristic uncertainties of each data type are taken into account and, in addition to the depth model itself, an uncertainty map of the model is computed. Combined with the grid-based field inventory method, this multi-data approach provides better tools to more accurately estimate peat depths, peat amounts and peat type distributions. The development of the new peatland inventory method was divided into four separate sections: (1) Development of new field

  5. Approaches to Electric Utility Energy Efficiency for Low Income Customers in a Changing Regulatory Environment

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, N.

    2001-05-21

    As the electric industry goes through a transformation to a more market-driven model, traditional grounds for utility energy efficiency have come under fire, undermining the existing mechanisms to fund and deliver such services. The challenge, then, is to understand why the electric industry should sustain investments in helping low-income Americans use electricity efficiently, how such investments should be made, and how these policies can become part of the new electric industry structure. This report analyzes the opportunities and barriers to leveraging electric utility energy efficiency assistance to low-income customers during the transition of the electric industry to greater competition.

  6. Adaptive interaction a utility maximization approach to understanding human interaction with technology

    CERN Document Server

    Payne, Stephen J

    2013-01-01

    This lecture describes a theoretical framework for the behavioural sciences that holds high promise for theory-driven research and design in Human-Computer Interaction. The framework is designed to tackle the adaptive, ecological, and bounded nature of human behaviour. It is designed to help scientists and practitioners reason about why people choose to behave as they do and to explain which strategies people choose in response to utility, ecology, and cognitive information processing mechanisms. A key idea is that people choose strategies so as to maximise utility given constraints. The frame

  7. Cost of energy from utility-owned solar electric systems. A required revenue method for ERDA/EPRI evaluations

    Energy Technology Data Exchange (ETDEWEB)

    1976-06-01

    This methodology calculates the electric energy busbar cost from a utility-owned solar electric system. This approach is applicable to both publicly- and privately-owned utilities. Busbar cost represents the minimum price per unit of energy consistent with producing system-resultant revenues equal to the sum of system-resultant costs. This equality is expressed in present value terms, where the discount rate used reflects the rate of return required on invested capital. Major input variables describe the output capabilities and capital cost of the energy system, the cash flows required for system operation and maintenance, and the financial structure and tax environment of the utility.
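
    In a simplified levelized form (ignoring taxes and the detailed utility financing that the full methodology treats), the busbar cost is the energy price at which the present value of revenues equals the present value of costs. The figures below are hypothetical.

        # Levelized busbar cost: PV(costs) / PV(energy delivered).
        capital_cost = 50_000_000.0      # $ spent in year 0
        annual_om = 1_200_000.0          # $ per year for operation and maintenance
        annual_energy = 90_000_000.0     # kWh delivered per year
        discount_rate = 0.08             # required return on invested capital
        lifetime_years = 30

        pv_costs = capital_cost + sum(annual_om / (1 + discount_rate) ** t
                                      for t in range(1, lifetime_years + 1))
        pv_energy = sum(annual_energy / (1 + discount_rate) ** t
                        for t in range(1, lifetime_years + 1))
        busbar_cost = pv_costs / pv_energy
        print(f"Busbar cost ~ {busbar_cost * 100:.2f} cents/kWh")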

  8. E-Learning as a Knowledge Management Approach for Intellectual Capital Utilization

    Science.gov (United States)

    Shehabat, Issa; Mahdi, Saad A.; Khoualdi, Kamel

    2008-01-01

    This paper addresses human resources utilization in the university environment. We address the design issues of e-learning courses that can capture the teacher's knowledge. The underlying objective is that e-learning is a key knowledge asset and a major resource for many universities. Therefore, the design of e-learning should be an important part of the…

  9. Utility of Policy Capturing as an Approach to Graduate Admissions Decision Making.

    Science.gov (United States)

    Schmidt, Frank L.; And Others

    1978-01-01

    The present study examined and evaluated the application of linear policy-capturing models to the real-world decision task of graduate admissions. Utility of the policy-capturing models was great enough to be of practical significance, and least-squares weights showed no predictive advantage over equal weights. (Author/CTM)

  10. A Single-Stage Approach to Anscombe and Aumann's Expected Utility

    NARCIS (Netherlands)

    R.K. Sarin (Rakesh); P.P. Wakker (Peter)

    1997-01-01

    textabstractAnscombe and Aumann showed that if one accepts the existence of a physical randomizing device such as a roulette wheel then Savage's derivation of subjective expected utility can be considerably simplified. They, however, invoked compound gambles to define their axioms. We demonstrate

  11. A single-stage approach to Ancombe and Aumann's expected utility

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.K.

    1997-01-01

    Anscombe and Aumann showed that if one accepts the existence of a physical randomizing device such as a roulette wheel then Savage's derivation of subjective expected utility can be considerably simplified. They, however, invoked compound gambles to define their axioms. We demonstrate that the

  12. A Unifying Approach to Axiomatic Non-Expected Utility Theories: Correction and Comment

    NARCIS (Netherlands)

    S.H. Chew; L.G. Epstein (Larry); P.P. Wakker (Peter)

    1993-01-01

    textabstractChew and Epstein attempted to provide a unifying axiomatic framework for a number of generalizations of expected utility theory. Wakker pointed out that Theorem A, on which the central unifying proposition is based, is false. In this note, we apply Segal′s result to prove that Theorem 2

  13. A unifying approach to axiomatic non-expected utility theories: correction and comment

    NARCIS (Netherlands)

    Hong, C.S.; Epstein, L.G.; Wakker, P.

    1993-01-01

    Chew and Epstein attempted to provide a unifying axiomatic framework for a number of generalizations of expected utility theory. Wakker pointed out that Theorem A, on which the central unifying proposition is based, is false. In this note, we apply Segal's result to prove that Theorem 2 is

  14. Methodical approaches to development of classification state methods of regulation business activity in fishery

    OpenAIRE

    She Son Gun

    2014-01-01

    Approaches to developing a classification of state methods of economic regulation are considered. On the basis of the review provided, a complex method of state regulation of business activity is substantiated. The proposed principles allow improvement of public administration and can be used in industry concepts and state programmes supporting small business in fishery.

  15. Creative Approaches to Teaching Graduate Research Methods Workshops

    OpenAIRE

    Peter Reilly

    2017-01-01

    Engagement and deeper learning were enhanced by developing several innovative teaching strategies delivered in Research Methods workshops to Graduate Business Students, focusing primarily on helping students adopt a creative approach to formulating a valid research question for successfully undertaking a dissertation. These techniques are applicable to most subject domains to ensure student engagement, addressing the various multiple intelligences and learning styles existing within groups while...

  16. Methodical approach to financial stimulation of logistics managers

    OpenAIRE

    Melnykova Kateryna V.

    2014-01-01

    The article offers a methodical approach to the financial stimulation of logistics managers, which allows calculation of the incentive amount taking into account the profit obtained from introducing optimisation logistics solutions. The author generalises measures that would allow enterprise top managers to increase the stimulation of logistics managers' work. The article identifies motivation factors that influence the attitude of logistics managers to the execution of optimisatio...

  17. Internal Medicine Resident Engagement with a Laboratory Utilization Dashboard: Mixed Methods Study.

    Science.gov (United States)

    Kurtzman, Gregory; Dine, Jessica; Epstein, Andrew; Gitelman, Yevgenly; Leri, Damien; Patel, Miltesh S; Ryskina, Kyra

    2017-09-01

    The objective of this study was to measure internal medicine resident engagement with an electronic medical record-based dashboard providing feedback on their use of routine laboratory tests relative to service averages. From January 2016 to June 2016, residents were e-mailed a snapshot of their personalized dashboard, a link to the online dashboard, and text summarizing the resident and service utilization averages. We measured resident engagement using e-mail read-receipts and web-based tracking. We also conducted 3 hour-long focus groups with residents. Using a grounded theory approach, the transcripts were analyzed for common themes focusing on barriers and facilitators of dashboard use. Among 80 residents, 74% opened the e-mail containing a link to the dashboard and 21% accessed the dashboard itself. We did not observe a statistically significant difference in routine laboratory ordering by dashboard use, although residents who opened the link to the dashboard ordered 0.26 fewer labs per doctor-patient-day than those who did not (95% confidence interval, -0.77 to 0.25; P = 0.31). While they raised several concerns, focus group participants had positive attitudes toward receiving individualized feedback delivered in real time. © 2017 Society of Hospital Medicine.

  18. Excision and Patch Grafting of a Lateral Peyronie's Plaque—Utilizing a Longitudinal “Window” Approach

    Directory of Open Access Journals (Sweden)

    Kathy Lue, BS

    2015-06-01

    Conclusion: A lateral longitudinal incision for PIEG is a feasible technique and may reduce the postoperative morbidity and dissection required with traditional circumcising incision with penile degloving. Larger comparative studies are necessary for further evaluation. Lue K, Emtage JB, Martinez DR, Yang C, and Carrion R. Excision and patch grafting of a lateral Peyronie's plaque—Utilizing a longitudinal “window” approach. Sex Med 2015;3:86–89.

  19. A Methodological Alternative to Media Comparison Studies: Linking Information Utilization Strategies and Instructional Approach in Hypermedia Learning

    OpenAIRE

    Catrambone, Richard; Gerjets, Peter; Scheiter, Katharina; Vollmann, Brigitte

    2006-01-01

    Literature reviews on hypermedia learning have so far failed to show consistent positive effects of learner-controlled nonlinear information access. We argue that a possible reason for this lack of evidence in favor of hypermedia learning is that insufficient attention is paid to the information utilization strategies learners deploy. The few studies that do analyze these strategies fail to link them to an instructional approach, which hampers a deeper interpretation of s...

  20. Utility of the physical examination in detecting pulmonary hypertension. A mixed methods study.

    Directory of Open Access Journals (Sweden)

    Rebecca Colman

    Full Text Available INTRODUCTION: Patients with pulmonary hypertension (PH) often present with a variety of physical findings reflecting a volume or pressure overloaded right ventricle (RV). However, there is no consensus regarding the diagnostic utility of the physical examination in PH. METHODS: We conducted a systematic review of publications that evaluated the clinical examination and diagnosis of PH using MEDLINE (1946-2013) and EMBASE (1947-2013). We also prospectively evaluated the diagnostic utility of the physical examination findings. Patients who underwent right cardiac catheterization for any reason were recruited. After informed consent, participants were examined by 6 physicians (3 "specialists" and 3 "generalists") who were unaware of the results of the patient's hemodynamics. Each examiner independently assessed patients for the presence of a RV lift, loud P2, jugular venous distension (JVD), tricuspid insufficiency murmur and right-sided 4th heart sound at rest and during a slow inspiration. A global rating (scale of 1-5) of the likelihood that the patient had pulmonary hypertension was provided by each examiner. RESULTS: 31 articles that assessed the physical examination in PH were included in the final analysis. There was heterogeneity amongst the studies and many did not include control data. The sign most associated with PH in the literature was a loud pulmonic component of the second heart sound (P2). In our prospective study physical examination was performed on 52 subjects (25 met criteria for PH; mPAP ≥ 25 mmHg). The physical sign with the highest likelihood ratio (LR) was a loud P2 on inspiration with a LR +ve 1.9, 95% CrI [1.2, 3.1] when data from all examiners was analyzed together. Results from the specialist examiners had higher diagnostic utility; a loud P2 on inspiration was associated with a positive LR of 3.2, 95% CrI [1.5, 6.2] and a right sided S4 on inspiration had a LR +ve 4.7, 95% CI [1.0, 15.6]. No aspect of the physical exam, could

  1. Utility of the Physical Examination in Detecting Pulmonary Hypertension. A Mixed Methods Study

    Science.gov (United States)

    Colman, Rebecca; Whittingham, Heather; Tomlinson, George; Granton, John

    2014-01-01

    Introduction Patients with pulmonary hypertension (PH) often present with a variety of physical findings reflecting a volume or pressure overloaded right ventricle (RV). However, there is no consensus regarding the diagnostic utility of the physical examination in PH. Methods We conducted a systematic review of publications that evaluated the clinical examination and diagnosis of PH using MEDLINE (1946–2013) and EMBASE (1947–2013). We also prospectively evaluated the diagnostic utility of the physical examination findings. Patients who underwent right cardiac catheterization for any reason were recruited. After informed consent, participants were examined by 6 physicians (3 “specialists” and 3 “generalists”) who were unaware of the results of the patient's hemodynamics. Each examiner independently assessed patients for the presence of a RV lift, loud P2, jugular venous distension (JVD), tricuspid insufficiency murmur and right-sided 4th heart sound at rest and during a slow inspiration. A global rating (scale of 1–5) of the likelihood that the patient had pulmonary hypertension was provided by each examiner. Results 31 articles that assessed the physical examination in PH were included in the final analysis. There was heterogeneity amongst the studies and many did not include control data. The sign most associated with PH in the literature was a loud pulmonic component of the second heart sound (P2). In our prospective study physical examination was performed on 52 subjects (25 met criteria for PH; mPAP ≥25 mmHg). The physical sign with the highest likelihood ratio (LR) was a loud P2 on inspiration with a LR +ve 1.9, 95% CrI [1.2, 3.1] when data from all examiners was analyzed together. Results from the specialist examiners had higher diagnostic utility; a loud P2 on inspiration was associated with a positive LR of 3.2, 95% CrI [1.5, 6.2] and a right sided S4 on inspiration had a LR +ve 4.7, 95% CI [1.0, 15.6]. No aspect of the physical exam, could
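
    For reference, a positive likelihood ratio of the kind quoted above is obtained from sensitivity and specificity as LR+ = sensitivity / (1 - specificity); the values below are illustrative choices that reproduce a ratio of about 3.2, not the study's raw counts.

        # Positive likelihood ratio from sensitivity and specificity (illustrative values).
        sensitivity = 0.40      # proportion of PH patients with a loud P2 on inspiration
        specificity = 0.875     # proportion of non-PH patients without the sign
        lr_positive = sensitivity / (1 - specificity)
        print(f"LR+ = {lr_positive:.1f}")   # 3.2 with these illustrative values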

  2. Cross-border shipment route selection utilizing analytic hierarchy process (AHP method

    Directory of Open Access Journals (Sweden)

    Veeris Ammarapala

    2018-02-01

    Full Text Available Becoming a member of the ASEAN Economic Community (AEC), Thailand expects a growth of cross-border trade with neighboring countries, especially in the shipment of agricultural products. To facilitate this, a number of strategies are set, such as the utilization of a single check point, the Asian Highway (AH) route development, and the initiation of truck lanes. However, the majority of agricultural products traded through the borders are transported using the rural roads, from growing area to the factory, before continuing to the borders using different highways. It is, therefore, necessary for the Department of Rural Roads (DRR) to plan for rural road improvement to accommodate the growth of cross-border trade in the near future. This research, thus, aims to select potential rural roads to support cross-border shipment utilizing the analytic hierarchy process (AHP) method. Seven key factors affecting rural road selection, with references from transport and other related literature, are extracted. They include: 1) cross-border trade value, 2) distance from border to rural road, 3) agricultural and processed agricultural goods transported across the border, 4) compatibility with national strategies, 5) area characteristics around the rural road, 6) truck volume, and 7) number of rural roads within a radius of 50 kilometers from the border. Interviews are conducted with the experts based on the seven key factors to collect data for the AHP analysis. The results identify the weight of each factor with an acceptable consistency ratio. It shows that the cross-border trade value is the most important factor as it achieves the highest weight. The distance from border to rural road and the compatibility with national strategies are also found crucial when making rural road selection decisions. The Department of Rural Roads could use the results to select suitable roads and plan for road improvement to support cross-border shipment when the AEC is fully implemented.
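
    The core AHP step, deriving factor weights from a pairwise comparison matrix and checking its consistency, can be sketched as follows; the three factors shown and the Saaty-scale judgments are hypothetical, not the weights elicited from the experts in the study.

        # AHP weights from the principal eigenvector of a pairwise comparison matrix.
        import numpy as np

        factors = ["cross-border trade value", "distance to rural road", "truck volume"]
        A = np.array([[1.0, 3.0, 5.0],     # illustrative Saaty-scale judgments
                      [1/3, 1.0, 2.0],
                      [1/5, 1/2, 1.0]])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        weights = np.abs(eigvecs[:, k].real)
        weights /= weights.sum()

        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)   # consistency index
        cr = ci / 0.58                         # random index RI = 0.58 for n = 3
        for factor, w in zip(factors, weights):
            print(f"{factor}: weight = {w:.3f}")
        print(f"Consistency ratio = {cr:.3f} (acceptable if < 0.1)")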

  3. A simple method for measuring glucose utilization of insulin-sensitive tissues by using the brain as a reference

    International Nuclear Information System (INIS)

    Namba, Hiroki; Nakagawa, Keiichi; Iyo, Masaomi; Fukushi, Kiyoshi; Irie, Toshiaki

    1994-01-01

    A simple method, without measurement of the plasma input function, to obtain semiquantitative values of glucose utilization in tissues other than the brain with radioactive deoxyglucose is reported. The brain, in which glucose utilization is essentially insensitive to plasma glucose and insulin concentrations, was used as an internal reference. The effects of graded doses of oral glucose loading (0.5, 1 and 2 mg/g body weight) on insulin-sensitive tissues (heart, muscle and fat tissue) were studied in the rat. By using the brain-reference method, dose-dependent increases in glucose utilization were clearly shown in all the insulin-sensitive tissues examined. The method seems to be of value for measurement of glucose utilization using radioactive deoxyglucose and positron emission tomography in the heart or other insulin-sensitive tissues, especially during glucose loading. (orig.)

  4. On Innovation: A Theoretical Approach on the Challenges of Utilities Marketing

    Directory of Open Access Journals (Sweden)

    Florina PÎNZARU

    2012-06-01

    Full Text Available One of the markets that was until recently closed and completely regulated is now in a growing process of liberalization and deregulation: the utilities market (water, sewage, gas, electricity, waste collection). The deregulation of a market is usually followed by the emergence of conditions for competition and, inevitably, the appearance of specific marketing strategies. This paper investigates the specifics of utilities marketing as it develops now, a burgeoning domain, although one with a rather discreet presence in theoretical analysis studies of the field. Exploratory research analysing the products, promotional offers and communication of this market's players shows an effervescent practice, but also the continuous innovation necessary in a market where consumers are unfamiliar with being persuaded by commercial means

  5. The waiting time distribution as a graphical approach to epidemiologic measures of drug utilization

    DEFF Research Database (Denmark)

    Hallas, J; Gaist, D; Bjerrum, L

    1997-01-01

    that effectively conveys some essential utilization parameters for a drug. The waiting time distribution for a group of drug users is a charting of their first prescription presentations within a specified time window. For a drug used for chronic treatment, most current users will be captured at the beginning...... of the window. After a few months, the graph will be dominated by new, incident users. As examples, we present waiting time distributions for insulin, ulcer drugs, systemic corticosteroids, antidepressants, and disulfiram. Appropriately analyzed and interpreted, the waiting time distributions can provide...... information about the period prevalence, point prevalence, incidence, duration of use, seasonality, and rate of prescription renewal or relapse for specific drugs. Each of these parameters has a visual correlate. The waiting time distributions may be an informative supplement to conventional drug utilization...

  6. Numerical approach to optimal portfolio in a power utility regime-switching model

    Science.gov (United States)

    Gyulov, Tihomir B.; Koleva, Miglena N.; Vulkov, Lubin G.

    2017-12-01

    We consider a system of weakly coupled degenerate semi-linear parabolic equations of optimal portfolio in a regime-switching with power utility function, derived by A.R. Valdez and T. Vargiolu [14]. First, we discuss some basic properties of the solution of this system. Then, we develop and analyze implicit-explicit, flux limited finite difference schemes for the differential problem. Numerical experiments are discussed.

  7. Increasing the utility of regional water table maps: a new method for estimating groundwater recharge

    Science.gov (United States)

    Gilmore, T. E.; Zlotnik, V. A.; Johnson, M.

    2017-12-01

    Groundwater table elevations are one of the most fundamental measurements used to characterize unconfined aquifers, groundwater flow patterns, and aquifer sustainability over time. In this study, we developed an analytical model that relies on analysis of groundwater elevation contour (equipotential) shape, aquifer transmissivity, and streambed gradient between two parallel, perennial streams. Using two existing regional water table maps, created at different times using different methods, our analysis of groundwater elevation contours, transmissivity and streambed gradient produced groundwater recharge rates (42-218 mm yr-1) that were consistent with previous independent recharge estimates from different methods. The three regions we investigated overly the High Plains Aquifer in Nebraska and included some areas where groundwater is used for irrigation. The three regions ranged from 1,500 to 3,300 km2, with either Sand Hills surficial geology, or Sand Hills transitioning to loess. Based on our results, the approach may be used to increase the value of existing water table maps, and may be useful as a diagnostic tool to evaluate the quality of groundwater table maps, identify areas in need of detailed aquifer characterization and expansion of groundwater monitoring networks, and/or as a first approximation before investing in more complex approaches to groundwater recharge estimation.

  8. Utility of Combining a Simulation-Based Method With a Lecture-Based Method for Fundoscopy Training in Neurology Residency.

    Science.gov (United States)

    Gupta, Deepak K; Khandker, Namir; Stacy, Kristin; Tatsuoka, Curtis M; Preston, David C

    2017-10-01

    Fundoscopic examination is an essential component of the neurologic examination. Competence in its performance is mandated as a required clinical skill for neurology residents by the American Council of Graduate Medical Education. Government and private insurance agencies require its performance and documentation for moderate- and high-level neurologic evaluations. Traditionally, assessment and teaching of this key clinical examination technique have been difficult in neurology residency training. To evaluate the utility of a simulation-based method and the traditional lecture-based method for assessment and teaching of fundoscopy to neurology residents. This study was a prospective, single-blinded, education research study of 48 neurology residents recruited from July 1, 2015, through June 30, 2016, at a large neurology residency training program. Participants were equally divided into control and intervention groups after stratification by training year. Baseline and postintervention assessments were performed using questionnaire, survey, and fundoscopy simulators. After baseline assessment, both groups initially received lecture-based training, which covered fundamental knowledge on the components of fundoscopy and key neurologic findings observed on fundoscopic examination. The intervention group additionally received simulation-based training, which consisted of an instructor-led, hands-on workshop that covered practical skills of performing fundoscopic examination and identifying neurologically relevant findings on another fundoscopy simulator. The primary outcome measures were the postintervention changes in fundoscopy knowledge, skills, and total scores. A total of 30 men and 18 women were equally distributed between the 2 groups. The intervention group had significantly higher mean (SD) increases in skills (2.5 [2.3] vs 0.8 [1.8], P = .01) and total (9.3 [4.3] vs 5.3 [5.8], P = .02) scores compared with the control group. Knowledge scores (6.8 [3

  9. E-LEARNING AS A KNOWLEDGE MANAGEMENT APPROACH FOR INTELLECTUAL CAPITAL UTILIZATION

    Directory of Open Access Journals (Sweden)

    Issa SHEHABAT

    2009-01-01

    Full Text Available This paper addresses human resources utilization in the university environment. We address the design issues of e-learning courses that can capture the teacher's knowledge. The underlying premise is that e-learning is a key knowledge asset and a major resource for many universities. Therefore, the design of e-learning should be an important part of the university knowledge management process. Teachers' knowledge of any important topic or field should be managed in a way that allows the university to benefit from it if the teacher leaves or retires. Hence, personal intellectual knowledge management will be explored through the development of e-learning systems, and some concepts from the field of Artificial Intelligence can be used in developing such systems. The potential of utilizing human knowledge in the university environment will optimize resources, contribute to cost effectiveness and quality assurance, and provide the university with a sustainable competitive advantage. Assuring proper knowledge management within the university environment is a more complex issue. This is due to the diversity of topics on the one hand and the behaviour of students and lecturers on the other. Effective implementation and success require considerable effort to guarantee the utilization of the intellectual capital within the university environment.

  10. E-LEARNING AS A KNOWLEDGE MANAGEMENT APPROACH FOR INTELLECTUAL CAPITAL UTILIZATION

    Directory of Open Access Journals (Sweden)

    Issa SHEHABAT

    2008-01-01

    Full Text Available This paper addresses human resources utilization in the university environment. We address the design issues of e-learning courses that can capture the teacher's knowledge. The underlying premise is that e-learning is a key knowledge asset and a major resource for many universities. Therefore, the design of e-learning should be an important part of the university knowledge management process. Teachers' knowledge of any important topic or field should be managed in a way that allows the university to benefit from it if the teacher leaves or retires. Hence, personal intellectual knowledge management will be explored through the development of e-learning systems, and some concepts from the field of Artificial Intelligence can be used in developing such systems. The potential of utilizing human knowledge in the university environment will optimize resources, contribute to cost effectiveness and quality assurance, and provide the university with a sustainable competitive advantage. Assuring proper knowledge management within the university environment is a more complex issue. This is due to the diversity of topics on the one hand and the behaviour of students and lecturers on the other. Effective implementation and success require considerable effort to guarantee the utilization of the intellectual capital within the university environment.

  11. Investigation of Effect of Machine Layout on Productivity and Utilization Level: What If Simulation Approach

    Directory of Open Access Journals (Sweden)

    Islam Faisal Bourini

    2018-03-01

    Full Text Available Designing and selecting the material handling system is a vital factor for any production line and, as a result, for the whole manufacturing system. Poor design and unsuitable handling equipment may increase the risk of bottlenecks, longer production times and, consequently, higher total production cost. One useful and effective tool is the "what if" simulation technique; however, this technique requires effective simulation software. The main objective of this research is to simulate different types of handling system using what-if scenarios. To achieve this objective, Delmia Quest software was used to simulate two different systems, a manual system and a conveyor system, for the same production line and to analyse the differences in terms of utilization and production rate. The results obtained were analysed and appraised to identify the bottleneck locations, the productivity and the utilization of the machines and material handling systems used in the designed system. Finally, the best model was developed to increase productivity and the utilization of the machines and material handling systems, and to minimize bottlenecks.
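    A minimal what-if sketch of the kind of comparison described above, written with the SimPy discrete-event library rather than the Delmia Quest software used in the study; all processing, transfer and arrival times below are hypothetical:

# What-if sketch (not the paper's Delmia Quest model): compare a manual
# transfer layout against a conveyor layout for one production line.
import random
import simpy

def run_layout(name, handler_capacity, transfer_time,
               n_parts=200, proc_mean=4.0, arrival_mean=5.0, seed=1):
    random.seed(seed)
    env = simpy.Environment()
    handler = simpy.Resource(env, capacity=handler_capacity)  # material handling
    machine = simpy.Resource(env, capacity=1)                 # processing machine
    stats = {"busy": 0.0, "finished": []}

    def part(env):
        with handler.request() as req:                        # move the part
            yield req
            yield env.timeout(transfer_time)
        with machine.request() as req:                        # process the part
            yield req
            start = env.now
            yield env.timeout(random.expovariate(1.0 / proc_mean))
            stats["busy"] += env.now - start
        stats["finished"].append(env.now)

    def source(env):
        for _ in range(n_parts):
            env.process(part(env))
            yield env.timeout(random.expovariate(1.0 / arrival_mean))

    env.process(source(env))
    env.run()
    makespan = max(stats["finished"])
    print(f"{name:8s} throughput={n_parts / makespan:.2f} parts/min  "
          f"machine utilization={stats['busy'] / makespan:.2f}")

# "what if": one worker moving parts by hand vs. a conveyor with ample capacity
run_layout("manual", handler_capacity=1, transfer_time=6.0)
run_layout("conveyor", handler_capacity=50, transfer_time=1.0)

    With these hypothetical numbers the slow single handler becomes the bottleneck in the manual layout, while in the conveyor layout the machine does, which is the kind of shift the what-if comparison is meant to expose.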

  12. Reducing operating costs: A collaborative approach between industry and electric utilities

    International Nuclear Information System (INIS)

    Tyers, B.; Sibbald, L.

    1993-01-01

    The unit cost of electricity to industrial consumers is expected to increase at a rate of 5% annually in the 1990s. The partnership that has been created between Amoco Canada Petroleum Company and TransAlta Utilities to control the cost of electricity is described. To allow the company to receive lower rates for interruptible power, a number of measures have been taken. The Amoco Whitecourt plant has standby generators in reserve that can be used when utility power is not available. A Pembina compressor can be turned off for up to 12 hours, at 30 minutes' notice, without affecting field pressure. At the East Crossfield plant sales gas can be compressed using electricity or a gas-driven engine. Spot market energy is used in a number of plants, allowing electric drive alternatives to plant operators and offering short term energy markets. TransAlta invests in electrical equipment such as switchgear as well as transmission lines and transformers. New rate alternatives offered by TransAlta Utilities include review of the need for a demand ratchet, additional time of use rates, unbundling of rates allowing power purchase from alternative sources, rates that follow product costs, reduced rates for conversion from gas to electric drives in certain circumstances, energy audits, and power factor credits. 5 figs

  13. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Naser Gholi Sarli

    2013-10-01

    Full Text Available Abstract One of the most fundamental acts of historiography is to classify historical information in diachronic axis. The method of this classification or periodization shows the theoretical approach of the historian and determines the structure and the form of his history. Because of multiple criteria of analysis and various literary genres, periodization in literary history is more complicated than that of general history. We can distinguish two approaches in periodization of literary history, although these can be used together: extrinsic or social-cultural approach (based on criteria extrinsic to literature) and intrinsic or formalist approach (based on criteria intrinsic to literature). Then periodization in literary history can be formulated in different methods and may be based upon various criteria: chronological such as century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and evaluations of every period; events, concepts and periods of general or political history; analogy of literary history and history of ideas or history of arts; approaches and styles of language; dominant literary norms. These methods actually are used together and everyone has adequacy in special kind of literary history. In periodization of Persian contemporary literature, some methods and models current in periodization of poetry have been applied identically to periodization of prose. Periodization based upon century, decade and year is the simplest and most mechanical method but sometimes certain centuries in some countries have symbolic and stylistic meaning, and decades were used often for subdivisions of literary history, especially nowadays with fast rhythm of literary change. Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of birth, mature and death (and sometimes re-birth) of literary genres, but this method have

  14. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Dr. N. Gh. Sarli

    Full Text Available One of the most fundamental acts of historiography is to classify historical information in diachronic axis. The method of this classification or periodization shows the theoretical approach of the historian and determines the structure and the form of his history. Because of multiple criteria of analysis and various literary genres, periodization in literary history is more complicated than that of general history. We can distinguish two approaches in periodization of literary history, although these can be used together: extrinsic or social-cultural approach (based on criteria extrinsic to literature) and intrinsic or formalist approach (based on criteria intrinsic to literature). Then periodization in literary history can be formulated in different methods and may be based upon various criteria: chronological such as century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and evaluations of every period; events, concepts and periods of general or political history; analogy of literary history and history of ideas or history of arts; approaches and styles of language; dominant literary norms. These methods actually are used together and everyone has adequacy in special kind of literary history. In periodization of Persian contemporary literature, some methods and models current in periodization of poetry have been applied identically to periodization of prose. Periodization based upon century, decade and year is the simplest and most mechanical method but sometimes certain centuries in some countries have symbolic and stylistic meaning, and decades were used often for subdivisions of literary history, especially nowadays with fast rhythm of literary change. Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of birth, mature and death (and sometimes re-birth) of literary genres, but this method have

  15. Approaches and Methods of Periodization in Literary History

    Directory of Open Access Journals (Sweden)

    Naser Gholi Sarli

    2013-11-01

    Full Text Available Abstract One of the most fundamental acts of historiography is to classify historical information in diachronic axis. The method of this classification or periodization shows the theoretical approach of the historian and determines the structure and the form of his history. Because of multiple criteria of analysis and various literary genres, periodization in literary history is more complicated than that of general history. We can distinguish two approaches in periodization of literary history, although these can be used together: extrinsic or social-cultural approach (based on criteria extrinsic to literature) and intrinsic or formalist approach (based on criteria intrinsic to literature). Then periodization in literary history can be formulated in different methods and may be based upon various criteria: chronological such as century, decade and year; organic patterns of evolution; great poets and writers; literary emblems and evaluations of every period; events, concepts and periods of general or political history; analogy of literary history and history of ideas or history of arts; approaches and styles of language; dominant literary norms. These methods actually are used together and everyone has adequacy in special kind of literary history. In periodization of Persian contemporary literature, some methods and models current in periodization of poetry have been applied identically to periodization of prose. Periodization based upon century, decade and year is the simplest and most mechanical method but sometimes certain centuries in some countries have symbolic and stylistic meaning, and decades were used often for subdivisions of literary history, especially nowadays with fast rhythm of literary change. Periodization according to organic patterns of evolution equates the changes of literary history with the life phases of an organism, and offers an account of birth, mature and death (and sometimes re-birth) of literary genres, but this method have

  16. Basis set approach in the constrained interpolation profile method

    International Nuclear Information System (INIS)

    Utsumi, T.; Koga, J.; Yabe, T.; Ogata, Y.; Matsunaga, E.; Aoki, T.; Sekine, M.

    2003-07-01

    We propose a simple polynomial basis-set that is easily extendable to any desired higher-order accuracy. This method is based on the Constrained Interpolation Profile (CIP) method and the profile is chosen so that the subgrid scale solution approaches the real solution by the constraints from the spatial derivative of the original equation. Thus the solution even on the subgrid scale becomes consistent with the master equation. By increasing the order of the polynomial, this solution quickly converges. 3rd and 5th order polynomials are tested on the one-dimensional Schroedinger equation and are proved to give solutions a few orders of magnitude higher in accuracy than conventional methods for lower-lying eigenstates. (author)
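    The cubic profile constrained by the spatial derivative that the abstract refers to is the classic CIP construction. A minimal sketch of that construction for the 1-D advection equation (not the paper's higher-order basis set or its Schroedinger-equation application), assuming constant positive velocity and periodic boundaries:

# Minimal cubic-CIP sketch for f_t + u f_x = 0 with constant u > 0 and
# periodic boundaries. Both f and its spatial derivative g = df/dx are
# advanced, which is the core CIP idea; this is not the paper's basis-set
# extension or its Schroedinger-equation implementation.
import numpy as np

def cip_step(f, g, u, dx, dt):
    up = np.roll(f, 1)      # upwind neighbour values f_{i-1}
    gup = np.roll(g, 1)     # upwind neighbour slopes g_{i-1}
    D = -dx                 # signed cell size for u > 0
    xi = -u * dt            # departure-point offset
    a = (g + gup) / D**2 + 2.0 * (f - up) / D**3
    b = 3.0 * (up - f) / D**2 - (2.0 * g + gup) / D
    f_new = a * xi**3 + b * xi**2 + g * xi + f
    g_new = 3.0 * a * xi**2 + 2.0 * b * xi + g
    return f_new, g_new

# advect a Gaussian once around a periodic domain
n, u, L = 200, 1.0, 1.0
dx = L / n
x = np.arange(n) * dx
f = np.exp(-200 * (x - 0.5) ** 2)
g = np.gradient(f, dx)
dt = 0.4 * dx / u
for _ in range(int(round(L / (u * dt)))):
    f, g = cip_step(f, g, u, dx, dt)
print("max error after one revolution:",
      np.abs(f - np.exp(-200 * (x - 0.5) ** 2)).max())

    Because both f and its derivative g are carried, the subgrid profile satisfies the derivative constraint described in the abstract, which is what keeps the scheme accurate on coarse grids.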

  17. Electrical deicing utilizing carbon fiber tape for asphalt approach and crosswalk phase I - literature review.

    Science.gov (United States)

    2016-06-30

    The purpose of this study is to provide a comprehensive literature review of electrical deicing technology for possible application in asphalt approach and crosswalks. A thorough review of existing and emerging deicing technology for snow/ice melti...

  18. Econutrition and utilization of food-based approaches for nutritional health.

    Science.gov (United States)

    Blasbalg, Tanya L; Wispelwey, Bram; Deckelbaum, Richard J

    2011-03-01

    Macronutrient and micronutrient deficiencies continue to have a detrimental impact in lower-income countries, with significant costs in morbidity, mortality, and productivity. Food is the primary source of the nutrients needed to sustain life, and it is the essential component that links nutrition, agriculture, and ecology in the econutrition framework. To present evidence and analysis of food-based approaches for improving nutritional and health outcomes in lower-income countries. Review of existing literature. The benefits of food-based approaches may include nutritional improvement, food security, cost-effectiveness, sustainability, and human productivity. Food-based approaches require additional inputs, including nutrition education, gender considerations, and agricultural planning. Although some forms of malnutrition can be addressed via supplements, food-based approaches are optimal to achieve sustainable solutions to multiple nutrient deficiencies.

  19. Creative Approaches to Teaching Graduate Research Methods Workshops

    Directory of Open Access Journals (Sweden)

    Peter Reilly

    2017-06-01

    Full Text Available Engagement and deeper learning were enhanced by developing several innovative teaching strategies delivered in Research Methods workshops to graduate business students. The strategies focus primarily on helping students adopt a creative approach to formulating a valid research question for successfully undertaking a dissertation. These techniques are applicable to most subject domains and are designed to ensure student engagement, addressing the multiple intelligences and learning styles present within groups while keeping the sessions student-centred and conducive to a collaborative learning environment. Blogs, interactive tutorials, online videos, games and posters are used to develop students' cognitive and metacognitive abilities. Using novelty images appeals to a group's intellectual curiosity, acting as an interpretive device to explain the value of adopting a holistic rather than analytic approach towards a topic.

  20. Methods for measuring denitrification: Diverse approaches to a difficult problem

    DEFF Research Database (Denmark)

    Groffman, Peter M.; Altabet, Mark A.; Böhlke, J. K.

    2006-01-01

    , and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments...... based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows...... for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass...

  1. Utility of qualitative methods in a clinical setting: perinatal care in the Western Province.

    Science.gov (United States)

    Jayasuriya, V

    2012-03-01

    A peculiar paradox that has been observed in previous studies of antenatal care is that patients are satisfied with the services despite an obvious lack of basic facilities. Qualitative methods were used to describe the experience of perinatal care in the Western province with the objective of demonstrating application of this method in a clinical setting. This paper used a 'naturalistic' approach of qualitative methods. In-depth interviews conducted with 20 postnatal mothers delivering in tertiary care institutions in the Western province were tape recorded, transcribed and content analysed. To ensure objectivity and validity of results, the principal investigator received only the anonymised data to prevent any prejudices or pre-conceptions affecting the results. The main themes emerging from the text demonstrated 'naïve trust' in the carer and a state of 'hero worship' where patients were distanced and therefore unable and unwilling to query the decisions made by the carers. This is similar to a state of the patient-carer relationship described in a published model known as guarded alliance, where the relationship develops through four phases based on the level of trust and confidence in the relationship. This state explains not only why patients fail to recognise and report any deficiencies in the services but also the need for them to justify the behaviour of caregivers even when it amounts to incompetence and negligence. Qualitative methods allow the researcher to capture experiences in their 'natural' form rather than based on pre-determined protocols or plans, which may be limited to our own understanding and expectations and therefore unable to explain many idiosyncrasies of the programmes. This paper argues favourably for the use of qualitative methods in other clinical settings.

  2. Flash memory management system and method utilizing multiple block list windows

    Science.gov (United States)

    Chow, James (Inventor); Gender, Thomas K. (Inventor)

    2005-01-01

    The present invention provides a flash memory management system and method with increased performance. The flash memory management system provides the ability to efficiently manage and allocate flash memory use in a way that improves reliability and longevity, while maintaining good performance levels. The flash memory management system includes a free block mechanism, a disk maintenance mechanism, and a bad block detection mechanism. The free block mechanism provides efficient sorting of free blocks to facilitate selecting low use blocks for writing. The disk maintenance mechanism provides for the ability to efficiently clean flash memory blocks during processor idle times. The bad block detection mechanism provides the ability to better detect when a block of flash memory is likely to go bad. The flash status mechanism stores information in fast access memory that describes the content and status of the data in the flash disk. The new bank detection mechanism provides the ability to automatically detect when new banks of flash memory are added to the system. Together, these mechanisms provide a flash memory management system that can improve the operational efficiency of systems that utilize flash memory.
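    As a rough illustration of the "select low-use free blocks" idea described above (not the patented mechanism itself), free blocks can be kept in a min-heap keyed on erase count so that writes always land on the least-worn block:

# Illustrative sketch of the general wear-leveling idea in the abstract:
# free blocks sit in a min-heap keyed on erase count, allocation returns
# the least-worn block, and erased blocks are returned to the pool.
import heapq

class FreeBlockPool:
    def __init__(self, n_blocks):
        # (erase_count, block_id) tuples; heapq keeps the least-worn first
        self.heap = [(0, blk) for blk in range(n_blocks)]
        heapq.heapify(self.heap)

    def allocate(self):
        """Return the id and wear of the least-erased free block."""
        erase_count, blk = heapq.heappop(self.heap)
        return blk, erase_count

    def release(self, blk, erase_count):
        """Put a block back after maintenance has erased it."""
        heapq.heappush(self.heap, (erase_count + 1, blk))

pool = FreeBlockPool(8)
for _ in range(20):                      # simulate 20 block writes
    blk, worn = pool.allocate()
    pool.release(blk, worn)              # pretend it was cleaned and erased
print(sorted(pool.heap))                 # erase counts stay nearly even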

  3. Design for a sustainable society utilizing information and communication technologies (ICT) - proposal: a new EcoDesign method and its application

    Energy Technology Data Exchange (ETDEWEB)

    Fujimoto, J. [Research Center for Advanced Science and Technology, the Univ. of Tokyo (Japan); Matsumoto, M. [Fundamental and Environment Labs., NEC Corp. (Japan)

    2004-07-01

    We discuss a new EcoDesign method, with a goal of improving the economy, and its application to a sustainable networked society. The new EcoDesign method differs from conventional EcoDesign in three respects: it utilizes a top-down approach, project-style research, and fact-finding and benchmarking. We systematize EcoDesign for a networked society from the viewpoint of technology and divide it into four components. Feasibility studies of the individual components reveal that if we promote ICT diffusion with EcoDesign in mind, we can suppress CO2 emissions and move towards a more sustainable society. (orig.)

  4. A System Dynamics Approach to Modeling the Sensitivity of Inappropriate Emergency Department Utilization

    Science.gov (United States)

    Behr, Joshua G.; Diaz, Rafael

    Non-urgent Emergency Department utilization has been blamed for increasing congestion in the flow and treatment of patients and, by extension, affects the quality of care and profitability of the Emergency Department. Interventions designed to divert these populations to more appropriate care may be cautiously received by operations managers due to uncertainty about the impact an adopted intervention may have on the two values of congestion and profitability. System Dynamics (SD) modeling and simulation may be used to measure the sensitivity of these two, often-competing, values and thus provide an additional layer of information designed to inform strategic decision making.
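    A toy stock-and-flow sketch of the trade-off described above, integrated with a simple Euler scheme; all rates and fees are hypothetical, and this is not the authors' System Dynamics model:

# Toy stock-and-flow sketch: diverting a share of non-urgent arrivals lowers
# ED congestion (the census stock) but also lowers fee-for-service revenue.
import numpy as np

def simulate(divert_fraction, hours=24 * 30, dt=0.25):
    urgent_rate, nonurgent_rate = 4.0, 6.0        # arrivals per hour (hypothetical)
    service_rate_per_bed, beds = 0.5, 20          # discharges per bed-hour, bed count
    fee = 400.0                                   # revenue per treated patient
    census, revenue = 10.0, 0.0                   # initial stock levels
    for _ in np.arange(0, hours, dt):
        arrivals = urgent_rate + (1 - divert_fraction) * nonurgent_rate
        discharges = service_rate_per_bed * min(census, beds)
        census += (arrivals - discharges) * dt    # Euler integration of the stock
        revenue += discharges * fee * dt
    return census, revenue

for frac in (0.0, 0.25, 0.5):
    census, revenue = simulate(frac)
    print(f"divert {frac:.0%}: end-of-month census={census:5.1f}, revenue=${revenue:,.0f}")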

  5. A risk analysis approach applied to field surveillance in utility meters in legal metrology

    Science.gov (United States)

    Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.

    2018-03-01

    Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies due to self-verification in Brazil. They play a major role in the economy, since electricity, gas and water are the main inputs to industries in their production processes. Therefore, to optimize the resources allocated to controlling these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.

  6. Pressurization Risk Assessment of CO2 Reservoirs Utilizing Design of Experiments and Response Surface Methods

    Science.gov (United States)

    Guyant, E.; Han, W. S.; Kim, K. Y.; Park, E.; Han, K.

    2015-12-01

    Monitoring of pressure buildup can provide explicit information on reservoir integrity and is an appealing tool; however, pressure variation depends on a variety of factors, causing high uncertainty in pressure predictions. This work evaluated pressurization of a reservoir system in the presence of leakage pathways, as well as exploring the effects of compartmentalization of the reservoir, utilizing design of experiments (Definitive Screening, Box-Behnken, Central Composite, and Latin Hypercube designs) and response surface methods. Two models were developed: 1) an idealized injection scenario in order to evaluate the performance of multiple designs, and 2) a complex injection scenario implementing the best performing design to investigate pressurization of the reservoir system. A holistic evaluation of scenario 1 determined that the Central Composite design would be used for the complex injection scenario. The complex scenario evaluated 5 risk factors: reservoir, seal, leakage pathway and fault permeabilities, and the horizontal position of the pathway. A total of 60 response surface models (RSMs) were developed for the complex scenario, with an average R2 of 0.95 and an NRMSE of 0.067. Sensitivity to the input factors was dynamic through space and time; at the earliest time (0.05 years) the reservoir permeability was dominant, and for later times (>0.5 years) the fault permeability became dominant for all locations. The RSMs were then used to conduct a Monte Carlo analysis to further analyze pressurization risks, identifying the P10, P50, and P90 values. This identified the in-zone (lower) P90 values as 2.16, 1.77, and 1.53 MPa and above-zone values of 1.35, 1.23, and 1.09 MPa for monitoring locations 1, 2, and 3, respectively. In summary, the design of experiments and response surface methods allowed an efficient sensitivity and uncertainty analysis to be conducted, permitting a complete evaluation of the pressurization across the entire parameter space.
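    The generic workflow above can be sketched as follows: fit a quadratic response surface to a small set of "simulator" runs, then run Monte Carlo on the cheap surrogate to extract P10/P50/P90 values. The toy function standing in for the reservoir simulator, the factor ranges and the space-filling design are all hypothetical, not the study's Central Composite design or its reservoir model:

# Response-surface + Monte Carlo sketch: quadratic surrogate over two
# hypothetical factors, then percentile estimates from cheap sampling.
import numpy as np

rng = np.random.default_rng(0)

def simulator(x):                       # placeholder for the expensive model
    k_res, k_fault = x[:, 0], x[:, 1]   # e.g. scaled log-permeabilities
    return 2.0 - 0.3 * k_res - 0.6 * k_fault + 0.1 * k_res * k_fault

def quad_features(x):                   # full quadratic model in two factors
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

# "design of experiments": a small space-filling sample of the two factors
design = rng.uniform(-1.0, 1.0, size=(20, 2))
y = simulator(design)
beta, *_ = np.linalg.lstsq(quad_features(design), y, rcond=None)

# Monte Carlo on the surrogate instead of the simulator
samples = rng.uniform(-1.0, 1.0, size=(100_000, 2))
pred = quad_features(samples) @ beta
p10, p50, p90 = np.percentile(pred, [10, 50, 90])
print(f"pressure buildup (arbitrary units): P10={p10:.2f} P50={p50:.2f} P90={p90:.2f}")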

  7. A Component Approach to Collaborative Scientific Software Development: Tools and Techniques Utilized by the Quantum Chemistry Science Application Partnership

    Directory of Open Access Journals (Sweden)

    Joseph P. Kenny

    2008-01-01

    Full Text Available Cutting-edge scientific computing software is complex, increasingly involving the coupling of multiple packages to combine advanced algorithms or simulations at multiple physical scales. Component-based software engineering (CBSE) has been advanced as a technique for managing this complexity, and complex component applications have been created in the quantum chemistry domain, as well as several other simulation areas, using the component model advocated by the Common Component Architecture (CCA) Forum. While programming models do indeed enable sound software engineering practices, the selection of programming model is just one building block in a comprehensive approach to large-scale collaborative development which must also address interface and data standardization, and language and package interoperability. We provide an overview of the development approach utilized within the Quantum Chemistry Science Application Partnership, identifying design challenges, describing the techniques which we have adopted to address these challenges and highlighting the advantages which the CCA approach offers for collaborative development.

  8. Methodical Approaches to Communicative Providing of Retailer Branding

    Directory of Open Access Journals (Sweden)

    Andrey Kataev

    2017-07-01

    Full Text Available The paper is devoted to the rationalization of methodical approaches to the communicative support of branding for retail trade enterprises. The article considers the features of brand perception by retail consumers and clarifies the specifics of customers' evaluations of stores for the procedures accompanying brand management. It is proved that, besides the traditional communication mix, the most important tool of communicative influence on buyers is the store itself as a place for comfortable shopping. The shop should have a stimulating effect on all five human senses, including sight, smell, hearing, touch, and taste, which helps maximize consumer involvement in the buying process.

  9. Reasoning methods in medical consultation systems: artificial intelligence approaches.

    Science.gov (United States)

    Shortliffe, E H

    1984-01-01

    It has been argued that the problem of medical diagnosis is fundamentally ill-structured, particularly during the early stages when the number of possible explanations for presenting complaints can be immense. This paper discusses the process of clinical hypothesis evocation, contrasts it with the structured decision making approaches used in traditional computer-based diagnostic systems, and briefly surveys the more open-ended reasoning methods that have been used in medical artificial intelligence (AI) programs. The additional complexity introduced when an advice system is designed to suggest management instead of (or in addition to) diagnosis is also emphasized. Example systems are discussed to illustrate the key concepts.

  10. Flight management research utilizing an oculometer. [pilot scanning behavior during simulated approach and landing

    Science.gov (United States)

    Spady, A. A., Jr.; Kurbjun, M. C.

    1978-01-01

    This paper presents an overview of the flight management work being conducted using NASA Langley's oculometer system. Tests have been conducted in a Boeing 737 simulator to investigate pilot scan behavior during approach and landing for simulated IFR, VFR, motion versus no motion, standard versus advanced displays, and as a function of various runway patterns and symbology. Results of each of these studies are discussed. For example, results indicate that for the IFR approaches a difference in pilot scan strategy was noted for the manual versus coupled (autopilot) conditions. Also, during the final part of the approach when the pilot looks out-of-the-window he fixates on his aim or impact point on the runway and holds this point until flare initiation.

  11. Cost utility analysis of endoscopic biliary stent in unresectable hilar cholangiocarcinoma: decision analytic modeling approach.

    Science.gov (United States)

    Sangchan, Apichat; Chaiyakunapruk, Nathorn; Supakankunti, Siripen; Pugkhem, Ake; Mairiang, Pisaln

    2014-01-01

    Endoscopic biliary drainage using metal and plastic stents in unresectable hilar cholangiocarcinoma (HCA) is widely used, but little is known about their cost-effectiveness. This study evaluated the cost-utility of endoscopic metal and plastic stent drainage in patients with unresectable complex (Bismuth type II-IV) HCA. A decision analytic (Markov) model was used to evaluate the cost and quality-adjusted life years (QALYs) of endoscopic biliary drainage in unresectable HCA. Costs of treatment and utilities of each Markov state were retrieved from hospital charges and from unresectable HCA patients at a tertiary care hospital in Thailand, respectively. Transition probabilities were derived from the international literature. Base-case analyses and sensitivity analyses were performed. Under the base-case analysis, the metal stent is more effective but more expensive than the plastic stent, with an incremental cost per additional QALY gained of 192,650 baht (US$ 6,318). In the probabilistic sensitivity analysis, at willingness-to-pay thresholds of one and three times GDP per capita, or 158,000 baht (US$ 5,182) and 474,000 baht (US$ 15,546), the probability of the metal stent being cost-effective is 26.4% and 99.8%, respectively. Based on the WHO recommendation regarding cost-effectiveness threshold criteria, endoscopic metal stent drainage is cost-effective compared to plastic stent drainage in unresectable complex HCA.
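    A minimal cohort sketch of the kind of Markov cost-utility comparison described above, with three health states and monthly cycles; every transition probability, cost and utility below is hypothetical rather than taken from the paper:

# Minimal Markov cohort sketch: two stent strategies, three states
# (stent patent, stent occluded, dead), monthly cycles, ICER at the end.
import numpy as np

def run_strategy(p_occlude, p_die, cost_stent, cost_cycle, utilities, cycles=24):
    # states: 0 = patent, 1 = occluded, 2 = dead
    P = np.array([[1 - p_occlude - p_die, p_occlude,      p_die],
                  [0.0,                   1 - 2 * p_die,  2 * p_die],
                  [0.0,                   0.0,            1.0]])
    cohort = np.array([1.0, 0.0, 0.0])           # everyone starts with a patent stent
    cost, qaly = cost_stent, 0.0
    for _ in range(cycles):
        cohort = cohort @ P
        cost += cohort[:2].sum() * cost_cycle    # care costs accrue while alive
        qaly += (cohort * utilities).sum() / 12  # monthly cycle -> years
    return cost, qaly

utilities = np.array([0.80, 0.55, 0.0])          # hypothetical state utilities
cost_m, qaly_m = run_strategy(0.02, 0.04, 60000, 8000, utilities)   # metal stent
cost_p, qaly_p = run_strategy(0.08, 0.04, 15000, 8000, utilities)   # plastic stent
icer = (cost_m - cost_p) / (qaly_m - qaly_p)
print(f"metal vs plastic ICER = {icer:,.0f} baht per QALY gained")

    Discounting and the probabilistic sensitivity analysis reported in the study are omitted from this sketch.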

  12. Urban water infrastructure asset management - a structured approach in four water utilities.

    Science.gov (United States)

    Cardoso, M A; Silva, M Santos; Coelho, S T; Almeida, M C; Covas, D I C

    2012-01-01

    Water services are a strategic sector of large social and economic relevance. It is therefore essential that they are managed rationally and efficiently. Advanced water supply and wastewater infrastructure asset management (IAM) is key in achieving adequate levels of service in the future, particularly with regard to reliable and high quality drinking water supply, prevention of urban flooding, efficient use of natural resources and prevention of pollution. This paper presents a methodology for supporting the development of urban water IAM, developed during the AWARE-P project as well as an appraisal of its implementation in four water utilities. Both water supply and wastewater systems were considered. Due to the different contexts and features of the utilities, the main concerns vary from case to case; some problems essentially are related to performance, others to risk. Cost is a common deciding factor. The paper describes the procedure applied, focusing on the diversity of drivers, constraints, benefits and outcomes. It also points out the main challenges and the results obtained through the implementation of a structured procedure for supporting urban water IAM.

  13. Utility approach to decision-making in extended T1 and limited T2 glottic carcinoma.

    Science.gov (United States)

    van Loon, Yda; Stiggelbout, Anne M; Hakkesteegt, Marieke M; Langeveld, Ton P M; de Jong, Rob J Baatenburg; Sjögren, Elisabeth V

    2017-04-01

    It is still undecided whether endoscopic laser surgery or radiotherapy is the preferable treatment in extended T1 and limited T2 glottic tumors. Health utilities assessed from patients can aid in decision-making. Patients treated for extended T1 or limited T2 glottic carcinoma by laser surgery (n = 12) or radiotherapy (n = 14) assigned health utilities using a visual analog scale (VAS) and the time tradeoff (TTO) technique, and scored their voice handicap using the Voice Handicap Index (VHI). VAS and TTO scores were slightly lower for the laser group compared to the radiotherapy group, though not significantly so. The VHI showed a correlation with the VAS score, which was very low in both groups and can be considered (near) normal. Patients show no clear preference for the outcomes of laser surgery or radiotherapy from a quality of life (QOL) or voice handicap point of view. These data can now be incorporated into decision-making models. © 2017 Wiley Periodicals, Inc. Head Neck 39: 779-785, 2017.

  14. A Meta-heuristic Approach for Variants of VRP in Terms of Generalized Saving Method

    Science.gov (United States)

    Shimizu, Yoshiaki

    Global logistics design is becoming of keen interest as a way of providing the essential infrastructure associated with modern societal provision. As examples, we can designate green and/or robust logistics in transportation systems, smart grids in electricity utilization systems, qualified service in delivery systems, and so on. As a key technology for such deployments, we have engaged with a practical vehicle routing problem on the basis of the conventional saving method. This paper extends that idea and gives a general framework available for various real-world applications. It can cover not only delivery problems but also two kinds of pick-up problem, i.e., straight and drop-by routings. Moreover, the multi-depot problem is considered by a hybrid approach with a graph algorithm, and its solution method is realized in a hierarchical manner. Numerical experiments have been carried out to validate the effectiveness of the proposed method.
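    For reference, a sketch of the conventional saving (Clarke-Wright) method that the paper generalizes, restricted here to the plain capacitated delivery case; the straight and drop-by pick-up variants and the multi-depot extension are not reproduced:

# Clarke-Wright savings sketch for the basic capacitated delivery case.
# Routes are merged end-to-end in decreasing order of the saving
# s(i,j) = d(0,i) + d(0,j) - d(i,j), subject to the vehicle capacity.
import math

def savings_routes(depot, customers, demand, capacity):
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # start with one out-and-back route per customer
    routes = {i: [i] for i in customers}
    load = {i: demand[i] for i in customers}
    head = {i: i for i in customers}          # route each customer belongs to

    savings = sorted(((dist(depot, customers[i]) + dist(depot, customers[j])
                       - dist(customers[i], customers[j]), i, j)
                      for i in customers for j in customers if i < j),
                     reverse=True)

    for s, i, j in savings:
        if s <= 0:
            break
        ri, rj = head[i], head[j]
        if ri == rj or load[ri] + load[rj] > capacity:
            continue
        # merge only if the two customers sit at joinable route ends
        if routes[ri][-1] == i and routes[rj][0] == j:
            merged = routes[ri] + routes[rj]
        elif routes[rj][-1] == j and routes[ri][0] == i:
            merged = routes[rj] + routes[ri]
        else:
            continue
        routes[ri], load[ri] = merged, load[ri] + load[rj]
        for c in routes[rj]:
            head[c] = ri
        del routes[rj], load[rj]
    return list(routes.values())

customers = {1: (2, 6), 2: (5, 5), 3: (6, 1), 4: (-3, 4), 5: (-4, -2)}   # hypothetical data
demand = {1: 4, 2: 3, 3: 5, 4: 4, 5: 2}
print(savings_routes((0, 0), customers, demand, capacity=10))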

  15. Edward’s syndrome: A rare cause of difficult intubation-utility of left molar approach

    Directory of Open Access Journals (Sweden)

    Teena Bansal

    2016-04-01

    Full Text Available Edward’s syndrome (trisomy 18 is an autosomal abnormality with dysmorphic face, visceral deformities and delayed mental and motor development including congenital heart disease. Challenges may arise during mask ventilation, laryngoscopy and/or intubation of the trachea due to dysmorphic face. Difficult airway cart should be kept ready. Left molar approach using a standard Macintosh blade improves the laryngoscopic view in patients with difficult midline laryngoscopy. We hereby present a case report of a 2 year old male child with Edward’s syndrome posted for evacuation and drainage of brain abscess, intubated successfully using left molar approach.

  16. A Mixed Methods Approach to Equity and Justice Research: Insights from Research on Children's Reasoning About Economic Inequality.

    Science.gov (United States)

    Mistry, Rashmita S; White, Elizabeth S; Chow, Kirby A; Griffin, Katherine M; Nenadal, Lindsey

    2016-01-01

    Mixed methods research approaches are gaining traction across various social science disciplines, including among developmental scientists. In this chapter, we discuss the utility of a mixed methods research approach in examining issues related to equity and justice. We incorporate a brief overview of quantitative and qualitative monomethod research approaches in our larger discussion of the advantages, procedures, and considerations of employing a mixed methods design to advance developmental science from an equity and justice perspective. To better illustrate the theoretical and practical significance of a mixed methods research approach, we include examples of research conducted on children and adolescents' conceptions of economic inequality as one example of developmental science research with an equity and justice frame. © 2016 Elsevier Inc. All rights reserved.

  17. A methodological approach to studying resilience mechanisms: demonstration of utility in age and Alzheimer's disease-related brain pathology.

    Science.gov (United States)

    Wolf, Dominik; Fischer, Florian Udo; Fellgiebel, Andreas

    2018-05-01

    The present work aims at providing a methodological approach for the investigation of resilience factors and mechanisms in normal aging, Alzheimer's disease (AD) and other neurodegenerative disorders. By expanding and re-conceptualizing traditional regression approaches, we propose an approach that not only aims at identifying potential resilience factors but also allows for a differentiation between general and dynamic resilience factors in terms of their association with pathology. Dynamic resilience factors are characterized by an increasing relevance with increasing levels of pathology, while the relevance of general resilience factors is independent of the amount of pathology. Utility of the approach is demonstrated in age and AD-related brain pathology by investigating widely accepted resilience factors, including education and brain volume. Moreover, the approach is used to test hippocampal volume as a potential resilience factor. Education and brain volume could be identified as general resilience factors against age and AD-related pathology. Beyond that, analyses highlighted that hippocampal volume may not only be a disease target but also serve as a potential resilience factor in age and AD-related pathology, particularly at higher levels of tau-pathology (i.e. a dynamic resilience factor). Given its unspecific and superordinate nature, the approach is suitable for the investigation of a wide range of potential resilience factors in normal aging, AD and other neurodegenerative disorders. Consequently, it may find wide application and thereby promote comparability between studies.
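    A minimal sketch of the general-versus-dynamic distinction in regression terms, using simulated data and an ordinary interaction model (statsmodels assumed; this is not the authors' exact specification or their imaging measures): a factor whose protective effect grows with pathology shows up in the factor-by-pathology interaction term, whereas a general factor shows only a main effect:

# Interaction-regression sketch on simulated data: "education" acts as a
# general factor (main effect only), "hippocampus" as a dynamic factor
# (its benefit grows with pathology, i.e. a pathology-by-factor interaction).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300
pathology = rng.uniform(0, 1, n)                  # e.g. a tau burden proxy
education = rng.normal(0, 1, n)                   # candidate "general" factor
hippocampus = rng.normal(0, 1, n)                 # candidate "dynamic" factor

# simulated cognition: education helps everywhere, hippocampal volume
# helps increasingly as pathology rises
cognition = (-2.0 * pathology + 0.5 * education
             + 1.5 * pathology * hippocampus + rng.normal(0, 0.5, n))

df = pd.DataFrame(dict(cognition=cognition, pathology=pathology,
                       education=education, hippocampus=hippocampus))
model = smf.ols("cognition ~ pathology * (education + hippocampus)", df).fit()
print(model.params[["education", "pathology:education",
                    "hippocampus", "pathology:hippocampus"]])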

  18. A geologic approach to field methods in fluvial geomorphology

    Science.gov (United States)

    Fitzpatrick, Faith A.; Thornbush, Mary J; Allen, Casey D; Fitzpatrick, Faith A.

    2014-01-01

    A geologic approach to field methods in fluvial geomorphology is useful for understanding causes and consequences of past, present, and possible future perturbations in river behavior and floodplain dynamics. Field methods include characterizing river planform and morphology changes and floodplain sedimentary sequences over long periods of time along a longitudinal river continuum. Techniques include topographic and bathymetric surveying of fluvial landforms in valley bottoms and describing floodplain sedimentary sequences through coring, trenching, and examining pits and exposures. Historical sediment budgets that include floodplain sedimentary records can characterize past and present sources and sinks of sediment along a longitudinal river continuum. Describing paleochannels and floodplain vertical accretion deposits, estimating long-term sedimentation rates, and constructing historical sediment budgets can assist in management of aquatic resources, habitat, sedimentation, and flooding issues.

  19. Compression-RSA: New approach of encryption and decryption method

    Science.gov (United States)

    Hung, Chang Ee; Mandangan, Arif

    2013-04-01

    The Rivest-Shamir-Adleman (RSA) cryptosystem is a well-known asymmetric cryptosystem that has been applied in a very wide range of areas. Many studies with different approaches have been carried out in order to improve the security and performance of the RSA cryptosystem. The enhancement of the performance of RSA is our main interest. In this paper, we propose a new method to increase the efficiency of RSA by shortening the plaintext before it undergoes the encryption process, without affecting the original content of the plaintext. The concept of the simple continued fraction, and a new special relationship between it and the Euclidean algorithm, are applied in this newly proposed method. By reducing the number of plaintext-ciphertext blocks, the encryption-decryption processing of a secret message can be accelerated.
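    The abstract does not spell out the compression step itself, so the sketch below only illustrates the two ingredients it names: the simple continued fraction of a rational number obtained through the Euclidean algorithm, next to textbook RSA with toy (insecure) parameters:

# Continued fraction via the Euclidean algorithm, plus textbook RSA with
# deliberately tiny parameters; this is not the paper's compression scheme.

def continued_fraction(a, b):
    """Partial quotients of a/b computed with the Euclidean algorithm."""
    quotients = []
    while b:
        q, r = divmod(a, b)
        quotients.append(q)
        a, b = b, r
    return quotients

def rsa_demo(m):
    p, q, e = 61, 53, 17                      # toy primes, NOT secure
    n, phi = p * q, (p - 1) * (q - 1)
    d = pow(e, -1, phi)                       # modular inverse (Python 3.8+)
    c = pow(m, e, n)                          # encryption
    return c, pow(c, d, n)                    # ciphertext, recovered plaintext

print(continued_fraction(649, 200))           # [3, 4, 12, 4] represents 649/200
print(rsa_demo(42))                           # (ciphertext, 42)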

  20. Implementation of a VLSI Level Zero Processing system utilizing the functional component approach

    Science.gov (United States)

    Shi, Jianfei; Horner, Ward P.; Grebowsky, Gerald J.; Chesney, James R.

    1991-01-01

    A high rate Level Zero Processing system is currently being prototyped at NASA/Goddard Space Flight Center (GSFC). Based on state-of-the-art VLSI technology and the functional component approach, the new system promises capabilities of handling multiple Virtual Channels and Applications with a combined data rate of up to 20 Megabits per second (Mbps) at low cost.

  1. Utilizing a Rapid Prototyping Approach in the Building of a Hypermedia-Based Reference Station.

    Science.gov (United States)

    Sell, Dan

    This paper discusses the building of a hypermedia-based reference station at the Wright Laboratory Technical Library, Wright-Patterson Air Force Base, Ohio. Following this, the paper focuses on an electronic user survey from which data is collected and analysis is made. The survey data is used in a rapid prototyping approach, which is defined as…

  2. Alberta's systems approach to chronic disease management and prevention utilizing the expanded chronic care model.

    Science.gov (United States)

    Delon, Sandra; Mackinnon, Blair

    2009-01-01

    Alberta's integrated approach to chronic disease management programming embraces client-centred care, supports self-management and facilitates care across the continuum. This paper presents strategies implemented through collaboration with primary care to improve care of individuals with chronic conditions, evaluation evidence supporting success and lessons learned from the Alberta perspective.

  3. An Investigation on Individual Students' Perceptions of Interest Utilizing a Blended Learning Approach

    Science.gov (United States)

    Shroff, Ronnie H.; Vogel, Douglas R.

    2010-01-01

    Research has established that individual student interest has a positive effect on learning and academic achievement. However, little is known about the impact of a blended learning approach on individual student interest and whether combinations of online and face-to-face learning activities significantly enhance student interest. This paper…

  4. Utilization of renewable energy potential in Pakistan - a goal oriented approach through industry-cum-academia linkage

    International Nuclear Information System (INIS)

    Khalil, M.S.

    2011-01-01

    Due to the recent power crisis in Pakistan, it is essential to utilize the country's God-given renewable energy potential in the form of hydro-power, solar, wind, etc. With recent global developments in emerging technologies, indigenous development of renewable energy resources is a basic need. It can be achieved through research and development by bridging the gaps between industry and technical institutions. Around the world, this approach is being pursued not only for the development but also for the sustainability of R and D in the field of renewable energies. Local industries can play a vital role in applying the latest computational techniques in research and development, drawing on technical and engineering institutions. For sustainable development of renewable energy resources in the country, local industry has to come forward and contribute toward the betterment of the country. (author)

  5. METHODICAL APPROACH TO AN ESTIMATION OF PROFESSIONALISM OF AN EMPLOYEE

    Directory of Open Access Journals (Sweden)

    Татьяна Александровна Коркина

    2013-08-01

    Full Text Available An analysis of definitions of «professionalism», reflecting the different viewpoints of scientists and practitioners, has shown that it is interpreted as a specific property of people who carry out labour activity effectively and reliably in a variety of conditions. The article presents a methodical approach to the estimation of an employee's professionalism, both from the standpoint of the external manifestations of the reliability and effectiveness of the work and from the standpoint of the personal characteristics of the employee that determine the results of his work. This approach includes the assessment of the level of qualification and motivation of the employee for each key job function, as well as of the final results of its implementation against the criteria of efficiency and reliability. The proposed methodological approach to the estimation of an employee's professionalism makes it possible to identify «bottlenecks» in the structure of his labour functions and to define directions for developing the professional qualities of the worker, in order to ensure the required level of reliability and efficiency of the obtained results. DOI: http://dx.doi.org/10.12731/2218-7405-2013-6-11

  6. Estimating the demand for drop-off recycling sites: a random utility travel cost approach.

    Science.gov (United States)

    Sidique, Shaufique F; Lupi, Frank; Joshi, Satish V

    2013-09-30

    Drop-off recycling is one of the most widely adopted recycling programs in the United States. Despite its wide implementation, relatively little literature addresses the demand for drop-off recycling. This study examines the demand for drop-off recycling sites as a function of travel costs and various site characteristics using the random utility model (RUM). The findings of this study indicate that increased travel costs significantly reduce the frequency of visits to drop-off sites implying that the usage pattern of a site is influenced by its location relative to where people live. This study also demonstrates that site specific characteristics such as hours of operation, the number of recyclables accepted, acceptance of commingled recyclables, and acceptance of yard-waste affect the frequency of visits to drop-off sites. Copyright © 2013 Elsevier Ltd. All rights reserved.
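    For reference, a random utility travel cost model of site choice is typically estimated in the conditional-logit form below; the notation and the covariate vector are generic assumptions rather than the paper's exact specification:

% Generic conditional-logit form of a random utility travel cost model of
% site choice (notation assumed, not taken from the paper).
U_{ij} = \beta_{tc}\, \mathrm{TravelCost}_{ij} + \gamma' x_{j} + \varepsilon_{ij},
\qquad
\Pr(i \text{ chooses } j) =
  \frac{\exp\left(\beta_{tc}\, \mathrm{TravelCost}_{ij} + \gamma' x_{j}\right)}
       {\sum_{k \in C} \exp\left(\beta_{tc}\, \mathrm{TravelCost}_{ik} + \gamma' x_{k}\right)}

    where x_j collects site attributes such as hours of operation and the number of materials accepted, and the error terms are i.i.d. type-I extreme value, which yields the logit choice probabilities; a negative estimate of beta_tc corresponds to the paper's finding that higher travel costs reduce visit frequency.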

  7. Novel approach in quantitative analysis of shearography method

    International Nuclear Information System (INIS)

    Wan Saffiey Wan Abdullah

    2002-01-01

    The application of laser interferometry in industrial non-destructive testing and material characterization is becoming more prevalent, since this method provides non-contact, full-field inspection of the test object. However, its application has so far been limited to qualitative analysis; the current trend is to develop the method further through the introduction of quantitative analysis, which attempts to characterize the defect examined in detail and becomes a design feature for the range of object sizes to be examined. The growing commercial demand for quantitative analysis for NDT and material characterization is determining the quality of optical and analysis instruments. However, very little attention is currently being paid to understanding, quantifying and compensating for the numerous error sources which are a function of the interferometer. This paper presents a comparison of measurement analysis using the established theoretical approach and a new approach that takes into account the factor of divergent illumination and other geometrical factors. The differences in the measurement system can be associated with these error factors. (Author)

  8. On dynamical systems approaches and methods in f ( R ) cosmology

    Energy Technology Data Exchange (ETDEWEB)

    Alho, Artur [Center for Mathematical Analysis, Geometry and Dynamical Systems, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Carloni, Sante [Centro Multidisciplinar de Astrofisica – CENTRA, Instituto Superior Técnico, Universidade de Lisboa, Av. Rovisco Pais, 1049-001 Lisboa (Portugal); Uggla, Claes, E-mail: aalho@math.ist.utl.pt, E-mail: sante.carloni@tecnico.ulisboa.pt, E-mail: claes.uggla@kau.se [Department of Physics, Karlstad University, S-65188 Karlstad (Sweden)

    2016-08-01

    We discuss dynamical systems approaches and methods applied to flat Robertson-Walker models in f(R)-gravity. We argue that a complete description of the solution space of a model requires a global state space analysis that motivates globally covering state space adapted variables. This is shown explicitly by an illustrative example, f(R) = R + αR^2, α > 0, for which we introduce new regular dynamical systems on global compactly extended state spaces for the Jordan and Einstein frames. This example also allows us to illustrate several local and global dynamical systems techniques involving, e.g., blow ups of nilpotent fixed points, center manifold analysis, averaging, and use of monotone functions. As a result of applying dynamical systems methods to globally state space adapted dynamical systems formulations, we obtain pictures of the entire solution spaces in both the Jordan and the Einstein frames. This shows, e.g., that due to the domain of the conformal transformation between the Jordan and Einstein frames, not all the solutions in the Jordan frame are completely contained in the Einstein frame. We also make comparisons with previous dynamical systems approaches to f(R) cosmology and discuss their advantages and disadvantages.

  9. Early Cleft Lip Repair Revisited: A Safe and Effective Approach Utilizing a Multidisciplinary Protocol.

    Science.gov (United States)

    Hammoudeh, Jeff A; Imahiyerobo, Thomas A; Liang, Fan; Fahradyan, Artur; Urbinelli, Leo; Lau, Jennifer; Matar, Marla; Magee, William; Urata, Mark

    2017-06-01

    The optimal timing for cleft lip repair has yet to be established. Advances in neonatal anesthesia, along with a growing body of literature, suggesting benefits of earlier cleft lip and nasal repair, have set the stage for a reexamination of current practices. In this prospective study, cleft lip and nasal repair occurred on average at 34.8 days (13-69 days). Nasal correction was achieved primarily through molding the nasal cartilage without the placement of nasal sutures at the time of repair. A standardized anesthetic protocol aimed at limiting neurotoxicity was utilized in all cases. Anesthetic and postoperative complications were assessed. A 3-dimensional nasal analysis compared pre- and postoperative nasal symmetry for unilateral clefts. Surveys assessed familial response to repair. Thirty-two patients were included (27 unilateral and 5 bilateral clefts). In this study, the overall complication rate was 3.1%. Anthropometric measurements taken from 3-dimensional-image models showed statistically significant improvement in ratios of nostril height (preoperative mean, 0.59; postoperative mean, 0.80), nasal base width (preoperative mean, 1.96; postoperative mean, 1.12), columella length (preoperative mean, 0.62; postoperative mean, 0.89; and columella angle (preoperative mean, 30.73; postoperative mean, 9.1). Survey data indicated that families uniformly preferred earlier repair. We present evidence that early cleft lip and nasal repair can be performed safely and is effective at improving nasal symmetry without the placement of any nasal sutures. Utilization of this protocol has the potential to be a paradigm shift in the treatment of cleft lip and nasal deformity.

  10. An Innovative Approach to Pharmacy Law Education Utilizing a Mock Board of Pharmacy Meeting

    Directory of Open Access Journals (Sweden)

    D. Todd Bess

    2016-03-01

    Full Text Available A thorough understanding of pharmacy law by students is important in the molding of future pharmacy practitioners, but a standardized template for the best way to educate students in this area has not been created. A mock Board of Pharmacy meeting was designed and incorporated into the Pharmacy Law course at the University of Tennessee College of Pharmacy. Students acted as Board of Pharmacy members and utilized technology to decide outcomes of cases and requests addressed in a typical 2 day Tennessee Board of Pharmacy meeting. The actual responses to those cases, as well as similar cases and requests addressed over a 5 year period, were revealed to students after they made motions on mock scenarios. The mock Board of Pharmacy meeting engages the students in a way that lectures alone often fail to achieve with some initial evidence of successful student learning. Utilizing this teaching format as a law education tool challenges the status quo of pharmacy education and may serve as an impetus and catalyst for future innovations.

  11. Method and Excel VBA Algorithm for Modeling Master Recession Curve Using Trigonometry Approach.

    Science.gov (United States)

    Posavec, Kristijan; Giacopetti, Marco; Materazzi, Marco; Birk, Steffen

    2017-11-01

    A new method was developed and implemented into an Excel Visual Basic for Applications (VBA) algorithm utilizing trigonometry laws in an innovative way to overlap recession segments of time series and create master recession curves (MRCs). Based on a trigonometry approach, the algorithm horizontally translates succeeding recession segments of time series, placing their vertex, that is, the highest recorded value of each recession segment, directly onto the appropriate connection line defined by measurement points of a preceding recession segment. The new method and algorithm continue the development of methods and algorithms for the generation of MRCs, where the first published method was based on a multiple linear/nonlinear regression model approach (Posavec et al. 2006). The newly developed trigonometry-based method was tested on real case study examples and compared with the previously published multiple linear/nonlinear regression model-based method. The results show that in some cases, that is, for some time series, the trigonometry-based method creates narrower overlaps of the recession segments, resulting in higher coefficients of determination R², while in other cases the multiple linear/nonlinear regression model-based method remains superior. The Excel VBA algorithm for modeling MRCs using the trigonometry approach is implemented into a spreadsheet tool (MRCTools v3.0, written by and available from Kristijan Posavec, Zagreb, Croatia) containing the previously published VBA algorithms for MRC generation and separation. All algorithms within MRCTools v3.0 are open access and available free of charge, supporting the idea of running science on available, open, and free of charge software. © 2017, National Ground Water Association.
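
    As an illustration of the core idea described above (not the authors' Excel VBA implementation), the following Python sketch horizontally translates each succeeding recession segment so that its vertex lands on the curve defined by the segments already merged; it uses simple linear interpolation in place of the trigonometric construction, and the segment input format is an assumption.

```python
import numpy as np

def build_mrc(segments):
    """Assemble a master recession curve (MRC) by horizontally translating each
    recession segment so that its vertex (first, highest value) falls on the
    curve defined by the segments already merged.

    segments: list of (t, h) pairs, t in days from the segment start and h the
    recorded, declining values. Returns merged, time-shifted (t, h) arrays."""
    segments = [(np.asarray(t, float), np.asarray(h, float)) for t, h in segments]
    # Start with the segment whose vertex is highest.
    segments.sort(key=lambda seg: seg[1][0], reverse=True)
    t_mrc, h_mrc = list(segments[0][0]), list(segments[0][1])

    for t, h in segments[1:]:
        # Find the time at which the current MRC reaches this segment's vertex
        # value (np.interp needs an increasing x-axis, hence the sort by value).
        order = np.argsort(h_mrc)
        t_shift = np.interp(h[0], np.asarray(h_mrc)[order], np.asarray(t_mrc)[order])
        # Translate the whole segment horizontally onto that point.
        t_mrc.extend(t - t[0] + t_shift)
        h_mrc.extend(h)

    order = np.argsort(t_mrc)
    return np.asarray(t_mrc)[order], np.asarray(h_mrc)[order]
```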

  12. Utilization of probabilistic methods for evaluating the safety of PWRs built in France

    International Nuclear Information System (INIS)

    Queniart, D.; Brisbois, J.; Lanore, J.M.

    1985-01-01

    Firstly, it is recalled that, in France, PWRs are designed on a deterministic basis by studying the consequences of a limited number of conventional incidents whose estimated frequency is specified in order-of-magnitude terms and for which it is shown that the consequences, for each category of frequency, predominate over those of the other situations in the same category. These situations are called dimensioning situations. The paper then describes the use made of probabilistic methods. External attacks and loss of redundant systems are examined in particular. A probabilistic approach is in fact well suited to the evaluation of risks due, among other things, to aircraft crashes and the industrial environment. Analysis of the reliability of redundant systems has shown that, in the light of the overall risk assessment objective, their loss should be examined with a view to instituting counteraction to reduce the risks associated with such loss (particularly the introduction of special control procedures). Probabilistic methods are used to evaluate the effectiveness of the counteraction proposed and such a study has been carried out for total loss of electric power supply. Finally, the probabilistic study of hazard initiated post factum by the French safety authorities for the standardized 900 MW(e) power units is described. The study, which is not yet complete, will serve as the basis for a permanent safety analysis tool taking into account control procedures and the total operating experience acquired using these power units. (author)

  13. Community Phylogenetics: Assessing Tree Reconstruction Methods and the Utility of DNA Barcodes

    Science.gov (United States)

    Boyle, Elizabeth E.; Adamowicz, Sarah J.

    2015-01-01

    Studies examining phylogenetic community structure have become increasingly prevalent, yet little attention has been given to the influence of the input phylogeny on metrics that describe phylogenetic patterns of co-occurrence. Here, we examine the influence of branch length, tree reconstruction method, and amount of sequence data on measures of phylogenetic community structure, as well as the phylogenetic signal (Pagel’s λ) in morphological traits, using Trichoptera larval communities from Churchill, Manitoba, Canada. We find that model-based tree reconstruction methods and the use of a backbone family-level phylogeny improve estimations of phylogenetic community structure. In addition, trees built using the barcode region of cytochrome c oxidase subunit I (COI) alone accurately predict metrics of phylogenetic community structure obtained from a multi-gene phylogeny. Input tree did not alter overall conclusions drawn for phylogenetic signal, as significant phylogenetic structure was detected in two body size traits across input trees. As the discipline of community phylogenetics continues to expand, it is important to investigate the best approaches to accurately estimate patterns. Our results suggest that emerging large datasets of DNA barcode sequences provide a vast resource for studying the structure of biological communities. PMID:26110886

  14. Development of an in vitro photosafety evaluation method utilizing intracellular ROS production in THP-1 cells.

    Science.gov (United States)

    Toyoda, Akemi; Itagaki, Hiroshi

    2018-01-01

    Exposure of photoreactive compounds to ultraviolet (UV) radiation can lead to the intracellular production of reactive oxygen species (ROS), which may cause phototoxic and photoallergenic responses. Here, we developed a novel in vitro photosafety assay and investigated whether it could be used to predict phototoxicity and photosensitivity by measuring changes in intracellular ROS production. THP-1 cells that had previously taken up 5-(and-6)-carboxy-2',7'-difluorodihydrofluorescein diacetate (carboxy-H2DFFDA), a ROS-sensitive fluorescent reagent, were exposed to photoreactive substances such as phototoxic and photoallergenic materials and then subjected to UV-A irradiation (5 J/cm²). The fluorescence intensity was subsequently measured using a flow cytometer, and the intracellular ROS production was calculated. A statistically significant increase in ROS following treatment with photoreactive substances was observed in cells irradiated with UV-A. In contrast, no significant increase was observed for non-photoreactive substances in comparison to the control solution. Next, to confirm the impact of intracellular ROS on the photosensitive response, changes in CD86 and CD54 expression were measured following quencher addition during the photo human cell line activation test (photo h-CLAT). The results confirmed the reduction of CD86 and CD54 expression in response to photoallergenic substances following quencher addition. Together, these findings suggest that intracellular ROS production is involved in photosensitizing reactions. Therefore, we suggest that the developed method, utilizing intracellular ROS production as an index, may be useful as a novel in vitro evaluation tool for photoreactive substances.

  15. Evaluation of breast implants with breast magnetic resonance: Practical utility in comparison with other methods

    International Nuclear Information System (INIS)

    Melendez, Florencia; Blejman, Oscar; Lamattina, Mariano; Villamea, Victoria; Sarquis, Flavio; Torrillo, Fabiana

    2010-01-01

    The purpose of this study was to demonstrate the utility of Breast Magnetic Resonance (BMR) in the evaluation of breast implants compared with mammography and breast ultrasound in an ambulatory private institute setting. Method: Between May 2008 and April 2010, 729 BMR were performed in patients between 22 and 77 years of age; 474 were evaluated for implant integrity. The studies were performed on high-field (1.5 T) equipment with T1, T2, fat-saturation, silicone-only, and silicone-suppression sequences. Results: Of the 474 patients that were evaluated for implant integrity: 291 patients (61.39%) presented with intact implants, 252 (86.6%) showed concordant findings with mammography and/or ultrasound and 39 (13.4%) showed discordant findings. Then, 116 patients (24.47%) presented signs of intracapsular rupture, 82 (70.7%) were concordant with ultrasound findings and 34 (29.3%) had normal ultrasound evaluation. 44 patients (9.28%) presented extracapsular rupture, 40 (90.9%) were concordant with mammography and ultrasound and 4 (9.10%) were discordant. Finally, 23 patients (4.85%) presented residual silicone granulomas, with a history of previous implant rupture and explant surgery, 13 (56.52%) concordant with mammography and ultrasound and 10 (43.48%) discordant. Conclusion: Non-contrast BMR, using multiple planes and sequences, is very useful in the evaluation of the internal structure of the implants and extracapsular silicone. It is the most sensitive study for demonstrating intra- and extracapsular rupture. BMR exceeds mammography and ultrasound in the detection of implant rupture and residual granulomas following explant surgery.

  16. Commentary: Utilizing Community-Engaged Approaches to Investigate and Address Hmong Women’s Cancer Disparities

    Directory of Open Access Journals (Sweden)

    Shannon M.A. Sparks

    2014-12-01

    Full Text Available Cancer is a growing concern for women in the Hmong community. Hmong women experience poor health outcomes for both cervical and breast cancer, largely due to low rates of screening and resultant late-stage at diagnosis. Both breast and cervical cancer screening are complicated by a multitude of social, cultural and environmental factors which influence health care decision-making and can otherwise serve to restrict access. We argue that community-engaged research, an orientation which prioritizes collaborative, equitable partnerships and community voice in identifying both problems and solutions, can be a valuable approach to helping address cancer health disparities for Hmong women. Using the Milwaukee-based “Healthy Hmong Women” project as a case example, we detail how the community-engaged approach implemented by the project partners was critical in identifying factors contributing to Hmong cancer disparities and appropriate interventions, as well as the overall acceptance and success of the project. Specifically, we discuss how this approach: (1) promoted community investment and ownership in the project; (2) facilitated the integration of local perspectives and experiences; (3) built capacity to address cancer screening disparities; (4) facilitated the creation of interventions targeting multiple ecological levels; and (5) framed the community as the foundation and driver of positive change.

  17. Do sampling methods differ in their utility for ecological monitoring? Comparison of line-point intercept, grid-point intercept, and ocular estimate methods

    Science.gov (United States)

    This study compared the utility of three sampling methods for ecological monitoring based on: interchangeability of data (rank correlations), precision (coefficient of variation), cost (minutes/transect), and potential of each method to generate multiple indicators. Species richness and foliar cover...

  18. Process improvement methods increase the efficiency, accuracy, and utility of a neurocritical care research repository.

    Science.gov (United States)

    O'Connor, Sydney; Ayres, Alison; Cortellini, Lynelle; Rosand, Jonathan; Rosenthal, Eric; Kimberly, W Taylor

    2012-08-01

    Reliable and efficient data repositories are essential for the advancement of research in Neurocritical care. Various factors, such as the large volume of patients treated within the neuro ICU, their differing length and complexity of hospital stay, and the substantial amount of desired information can complicate the process of data collection. We adapted the tools of process improvement to the data collection and database design of a research repository for a Neuroscience intensive care unit. By the Shewhart-Deming method, we implemented an iterative approach to improve the process of data collection for each element. After an initial design phase, we re-evaluated all data fields that were challenging or time-consuming to collect. We then applied root-cause analysis to optimize the accuracy and ease of collection, and to determine the most efficient manner of collecting the maximal amount of data. During a 6-month period, we iteratively analyzed the process of data collection for various data elements. For example, the pre-admission medications were found to contain numerous inaccuracies after comparison with a gold standard (sensitivity 71% and specificity 94%). Also, our first method of tracking patient admissions and discharges contained higher than expected errors (sensitivity 94% and specificity 93%). In addition to increasing accuracy, we focused on improving efficiency. Through repeated incremental improvements, we reduced the number of subject records that required daily monitoring from 40 to 6 per day, and decreased daily effort from 4.5 to 1.5 h/day. By applying process improvement methods to the design of a Neuroscience ICU data repository, we achieved a threefold improvement in efficiency and increased accuracy. Although individual barriers to data collection will vary from institution to institution, a focus on process improvement is critical to overcoming these barriers.
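
    The accuracy figures quoted above are ordinary sensitivity/specificity calculations against a manually verified gold standard. A minimal Python sketch of such an audit for one repository field follows; the field and the example data are hypothetical.

```python
def audit_field(recorded, gold_standard):
    """Compare a repository field (e.g., a pre-admission medication flag)
    against a manually verified gold standard; both are boolean lists with
    one entry per subject. Returns (sensitivity, specificity)."""
    tp = sum(r and g for r, g in zip(recorded, gold_standard))
    fn = sum((not r) and g for r, g in zip(recorded, gold_standard))
    tn = sum((not r) and (not g) for r, g in zip(recorded, gold_standard))
    fp = sum(r and (not g) for r, g in zip(recorded, gold_standard))
    sensitivity = tp / (tp + fn) if tp + fn else float("nan")
    specificity = tn / (tn + fp) if tn + fp else float("nan")
    return sensitivity, specificity

# Illustrative use: a field that misses some true entries shows up as a
# sensitivity gap of the kind reported above.
sens, spec = audit_field([1, 0, 1, 0, 0, 1, 0], [1, 1, 1, 0, 0, 1, 1])
print(sens, spec)
```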

  19. LightForce Photon-Pressure Collision Avoidance: Updated Efficiency Analysis Utilizing a Highly Parallel Simulation Approach

    Science.gov (United States)

    Stupl, Jan; Faber, Nicolas; Foster, Cyrus; Yang, Fan Yang; Nelson, Bron; Aziz, Jonathan; Nuttall, Andrew; Henze, Chris; Levit, Creon

    2014-01-01

    This paper provides an updated efficiency analysis of the LightForce space debris collision avoidance scheme. LightForce aims to prevent collisions on warning by utilizing photon pressure from ground based, commercial off the shelf lasers. Past research has shown that a few ground-based systems consisting of 10 kilowatt class lasers directed by 1.5 meter telescopes with adaptive optics could lower the expected number of collisions in Low Earth Orbit (LEO) by an order of magnitude. Our simulation approach utilizes the entire Two Line Element (TLE) catalogue in LEO for a given day as initial input. Least-squares fitting of a TLE time series is used for an improved orbit estimate. We then calculate the probability of collision for all LEO objects in the catalogue for a time step of the simulation. The conjunctions that exceed a threshold probability of collision are then engaged by a simulated network of laser ground stations. After those engagements, the perturbed orbits are used to re-assess the probability of collision and evaluate the efficiency of the system. This paper describes new simulations with three updated aspects: 1) By utilizing a highly parallel simulation approach employing hundreds of processors, we have extended our analysis to a much broader dataset. The simulation time is extended to one year. 2) We analyze not only the efficiency of LightForce on conjunctions that naturally occur, but also take into account conjunctions caused by orbit perturbations due to LightForce engagements. 3) We use a new simulation approach that is regularly updating the LightForce engagement strategy, as it would be during actual operations. In this paper we present our simulation approach to parallelize the efficiency analysis, its computational performance and the resulting expected efficiency of the LightForce collision avoidance system. Results indicate that utilizing a network of four LightForce stations with 20 kilowatt lasers, 85% of all conjunctions with a
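
    A schematic Python sketch of the screen-and-engage step described above, parallelized across worker processes, is shown below; the probability-of-collision routine, the threshold, and the object identifiers are placeholders and are not components of the actual LightForce simulation.

```python
from multiprocessing import Pool
import random

P_THRESHOLD = 1e-5   # placeholder engagement threshold

def estimate_pc(obj_a, obj_b):
    """Placeholder for a real conjunction-probability calculation (e.g., from
    propagated TLE-derived states and covariances); returns a dummy value."""
    return random.Random(obj_a + obj_b).random() * 1e-4

def collision_probability(pair):
    obj_a, obj_b = pair
    return pair, estimate_pc(obj_a, obj_b)

def screen_conjunctions(candidate_pairs):
    """Screen all candidate pairs for one time step in parallel and return
    those whose probability of collision exceeds the threshold (these would
    then be 'engaged' and their perturbed orbits re-assessed)."""
    with Pool() as pool:
        results = pool.map(collision_probability, candidate_pairs)
    return [pair for pair, pc in results if pc > P_THRESHOLD]

if __name__ == "__main__":
    pairs = [(f"OBJ{i}", f"OBJ{i + 1}") for i in range(0, 100, 2)]
    print(screen_conjunctions(pairs))
```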

  20. Multiattribute Utility Theory without Expected Utility Foundations

    NARCIS (Netherlands)

    J. Miyamoto (John); P.P. Wakker (Peter)

    1996-01-01

    textabstractMethods for determining the form of utilities are needed for the implementation of utility theory in specific decisions. An important step forward was achieved when utility theorists characterized useful parametric families of utilities and simplifying decompositions of multiattribute

  1. Carbohydrate-active enzymes from pigmented Bacilli: a genomic approach to assess carbohydrate utilization and degradation

    Directory of Open Access Journals (Sweden)

    Henrissat Bernard

    2011-09-01

    Full Text Available Abstract Background Spore-forming Bacilli are Gram-positive bacteria commonly found in a variety of natural habitats, including soil, water and the gastro-intestinal (GI) tract of animals. Isolates of various Bacillus species produce pigments, mostly carotenoids, with a putative protective role against UV irradiation and oxygen-reactive forms. Results We report the annotation of carbohydrate active enzymes (CAZymes) of two pigmented Bacilli isolated from the human GI-tract and belonging to the Bacillus indicus and B. firmus species. A high number of glycoside hydrolases (GHs) and carbohydrate binding modules (CBMs) were found in both isolates. A detailed analysis of CAZyme families was performed and supported by growth data. Carbohydrates able to support growth as the sole carbon source negatively affected carotenoid formation in rich medium, suggesting that a catabolite repression-like mechanism controls carotenoid biosynthesis in both Bacilli. Experimental results on biofilm formation confirmed genomic data on the potential of B. indicus HU36 to produce a levan-based biofilm, while mucin-binding and -degradation experiments supported genomic data suggesting the ability of both Bacilli to degrade mammalian glycans. Conclusions CAZy analyses of the genomes of the two pigmented Bacilli, compared to other Bacillus species and validated by experimental data on carbohydrate utilization, biofilm formation and mucin degradation, suggest that the two pigmented Bacilli are adapted to the intestinal environment and are suited to grow in and colonize the human gut.

  2. Postmortem interval estimation: a novel approach utilizing gas chromatography/mass spectrometry-based biochemical profiling.

    Science.gov (United States)

    Kaszynski, Richard H; Nishiumi, Shin; Azuma, Takeshi; Yoshida, Masaru; Kondo, Takeshi; Takahashi, Motonori; Asano, Migiwa; Ueno, Yasuhiro

    2016-05-01

    While the molecular mechanisms underlying postmortem change have been exhaustively investigated, the establishment of an objective and reliable means for estimating postmortem interval (PMI) remains an elusive feat. In the present study, we exploit low molecular weight metabolites to estimate postmortem interval in mice. After sacrifice, serum and muscle samples were procured from C57BL/6J mice (n = 52) at seven predetermined postmortem intervals (0, 1, 3, 6, 12, 24, and 48 h). After extraction and isolation, low molecular weight metabolites were measured via gas chromatography/mass spectrometry (GC/MS) and examined via semi-quantification studies. Then, PMI prediction models were generated for each of the 175 and 163 metabolites identified in muscle and serum, respectively, using a non-linear least squares curve fitting program. PMI estimation panels for muscle and serum were then constructed, consisting of 17 (9.7%) and 14 (8.5%) of the best PMI biomarkers identified in the muscle and serum profiles, respectively, each demonstrating a statistically significant correlation between metabolite quantity and PMI. Using a single-blinded assessment, we carried out validation studies on the PMI estimation panels. Mean ± standard deviation for accuracy of the muscle and serum PMI prediction panels was -0.27 ± 2.88 and -0.89 ± 2.31 h, respectively. Ultimately, these studies elucidate the utility of metabolomic profiling in PMI estimation and pave the way toward biochemical profiling studies involving human samples.
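
    The per-metabolite prediction models are non-linear least-squares fits of metabolite level against PMI. A generic SciPy sketch is shown below; the saturating model form, the data points, and the inversion step are illustrative assumptions only, not the study's fitted models.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative data: relative abundance of one metabolite at the sampled PMIs.
pmi_h = np.array([0, 1, 3, 6, 12, 24, 48], dtype=float)
abundance = np.array([1.0, 1.6, 2.7, 4.0, 6.1, 8.6, 10.4])

def model(t, a, b, c):
    """A saturating-growth form; the study fitted a separate non-linear model
    per metabolite, not necessarily this one."""
    return a * (1.0 - np.exp(-b * t)) + c

params, _ = curve_fit(model, pmi_h, abundance, p0=(8.0, 0.05, 0.5))

def predict_pmi(y, a, b, c):
    """Invert the fitted model to estimate PMI from an observed abundance."""
    return -np.log(1.0 - (y - c) / a) / b

print(predict_pmi(3.0, *params))   # estimated PMI (h) for an abundance of 3.0
```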

  3. A novel approach in organic waste utilization through biochar addition in wood/polypropylene composites

    Energy Technology Data Exchange (ETDEWEB)

    Das, Oisik [Department of Civil and Environmental Engineering, University of Auckland, Auckland 1142 (New Zealand); Sarmah, Ajit K., E-mail: a.sarmah@auckland.ac.nz [Department of Civil and Environmental Engineering, University of Auckland, Auckland 1142 (New Zealand); Bhattacharyya, Debes [Department of Mechanical Engineering, Center for Advanced Composite Materials, University of Auckland, Auckland 1142 (New Zealand)

    2015-04-15

    Highlights: • Biochar made from waste wood was added to wood polypropylene composites. • 24% biochar gave the best mechanical properties. • 6% biochar had no effect on physico-mechanical properties of composites. • Coupling agent remained unreacted in composites having higher amounts of biochar. - Abstract: In an attempt to concurrently address the issues related to landfill gas emission and utilization of organic wastes, a relatively novel idea is introduced to develop biocomposites in which biochar made from pyrolysis of waste wood (Pinus radiata) is combined with the same wood, a plastic/polymer (polypropylene), and maleic anhydride grafted polypropylene (MAPP). Experiments were conducted by manufacturing wood and polypropylene composites (WPCs) mixed with 6 wt%, 12 wt%, 18 wt%, 24 wt%, and 30 wt% biochar. Though the 6 wt% addition had properties similar to those of the control (composite without biochar), increasing the biochar content to 24 wt% improved the composite’s tensile/flexural strengths and moduli. The biochar, having a high surface area due to fine particles and being highly carbonised, acted as a reinforcing filler in the biocomposite. Composites having 12 wt% and 18 wt% of biochar were found to be the most ductile and thermally stable, respectively. This study demonstrates that WPCs containing biochar have good potential to mitigate waste while simultaneously producing biocomposites with properties that might be suited to various end applications.

  4. A novel approach in organic waste utilization through biochar addition in wood/polypropylene composites

    International Nuclear Information System (INIS)

    Das, Oisik; Sarmah, Ajit K.; Bhattacharyya, Debes

    2015-01-01

    Highlights: • Biochar made from waste wood was added to wood polypropylene composites. • 24% biochar gave the best mechanical properties. • 6% biochar had no effect on physico-mechanical properties of composites. • Coupling agent remained unreacted in composites having higher amounts of biochar. - Abstract: In an attempt to concurrently address the issues related to landfill gas emission and utilization of organic wastes, a relatively novel idea is introduced to develop biocomposites in which biochar made from pyrolysis of waste wood (Pinus radiata) is combined with the same wood, a plastic/polymer (polypropylene), and maleic anhydride grafted polypropylene (MAPP). Experiments were conducted by manufacturing wood and polypropylene composites (WPCs) mixed with 6 wt%, 12 wt%, 18 wt%, 24 wt%, and 30 wt% biochar. Though the 6 wt% addition had properties similar to those of the control (composite without biochar), increasing the biochar content to 24 wt% improved the composite’s tensile/flexural strengths and moduli. The biochar, having a high surface area due to fine particles and being highly carbonised, acted as a reinforcing filler in the biocomposite. Composites having 12 wt% and 18 wt% of biochar were found to be the most ductile and thermally stable, respectively. This study demonstrates that WPCs containing biochar have good potential to mitigate waste while simultaneously producing biocomposites with properties that might be suited to various end applications.

  5. Novel Emergency Medicine Curriculum Utilizing Self-Directed Learning and the Flipped Classroom Method: Genitourinary Emergencies Small Group Module

    Directory of Open Access Journals (Sweden)

    Andrew King

    2017-07-01

    Full Text Available Audience: This curriculum, created and implemented at The Ohio State University Wexner Medical Center, was designed to educate our emergency medicine (EM) residents, PGY-1 to PGY-3, as well as medical students. Introduction: In 2013, there were over 6 million Emergency Department visits in the United States which resulted in a primary diagnosis of the genitourinary system. This represents 5.2% of all Emergency Department visits.1 Residents must be proficient in the differential diagnosis and management of the wide variety of genitourinary emergencies. This flipped classroom curricular model emphasizes self-directed learning activities completed by learners, followed by small group discussions pertaining to the topic reviewed. The active learning fostered by this curriculum increases faculty and learner engagement and interaction time typically absent in traditional lecture-based formats.2-4 Studies have revealed that the application of knowledge through case studies, personal interaction with content experts, and integrated questions are effective learning strategies for emergency medicine residents.4-6 The Ohio State University Wexner Medical Center EM Residency didactic curriculum recently transitioned to a “flipped classroom” approach.7-10 We created this innovative curriculum aimed to improve our residency education program and to share educational resources with other EM residency programs. Our curriculum utilizes an 18-month curricular cycle to cover the defined emergency medicine content. The flipped classroom curriculum maximizes didactic time and resident engagement, fosters intellectual curiosity and active learning, and meets the needs of today’s learners. 3,6,11 Objectives: We aim to teach the presentation and management of genitourinary emergencies through the creation of a flipped classroom design. This unique, innovative curriculum utilizes resources chosen by education faculty and resident learners, study questions, real

  6. Novel Emergency Medicine Curriculum Utilizing Self-Directed Learning and the Flipped Classroom Method: Gastrointestinal Emergencies Small Group Module

    Directory of Open Access Journals (Sweden)

    Andrew King

    2017-01-01

    Full Text Available Audience and type of curriculum: This curriculum, created and implemented at The Ohio State University Wexner Medical Center, was designed to educate our emergency medicine (EM) residents, PGY-1 to PGY-3, as well as medical students and attending physicians. Introduction/Background: Gastrointestinal (GI) emergencies comprise approximately 12% of emergency department (ED) visits.1 Residents must be proficient in the differential diagnosis and management of the wide variety of GI emergencies. The flipped classroom curricular model emphasizes self-directed learning activities completed by learners, followed by small group discussions pertaining to the topic reviewed. The active learning fostered by this curriculum increases faculty and learner engagement and interaction time typically absent in traditional lecture-based formats.2-4 Studies have revealed that the application of knowledge through case studies, personal interaction with content experts, and integrated questions are effective learning strategies for emergency medicine residents.4-6 The Ohio State University EM Residency didactic curriculum recently transitioned to a “flipped classroom” approach.7-10 We created this innovative curriculum aimed to improve our residency education program and to share educational resources with other EM residency programs. This proposed curriculum utilizes an 18-month curricular cycle. The flipped classroom curriculum maximizes didactic time and resident engagement, fosters intellectual curiosity and active learning, and meets the needs of today’s learners. 3,6,11 Objectives: We aim to teach the presentation and management of GI emergencies through the creation of a flipped classroom design. This unique, innovative curriculum utilizes resources chosen by education faculty and resident learners, study questions, real-life experiences, and small group discussions in place of traditional lectures. In doing so, a goal of the curriculum is to encourage self

  7. Novel Emergency Medicine Curriculum Utilizing Self-Directed Learning and the Flipped Classroom Method: Psychiatric Emergencies Small Group Module

    Directory of Open Access Journals (Sweden)

    Andrew King

    2017-07-01

    Full Text Available Audience: This curriculum, created and implemented at The Ohio State University Wexner Medical Center, was designed to educate our emergency medicine (EM) residents, PGY-1 to PGY-3, as well as medical students and attending physicians. Introduction: In 2007, there were 12 million adult Emergency Department visits for mental health and substance abuse complaints. This represents 12.5% of all adult emergency department visits.1 Residents must be proficient in the differential diagnosis and management of the wide variety of psychiatric emergencies. The flipped classroom curricular model emphasizes self-directed learning activities completed by learners, followed by small group discussions pertaining to the topic reviewed. The active learning fostered by this curriculum increases faculty and learner engagement and interaction time typically absent in traditional lecture-based formats.2-4 Studies have revealed that the application of knowledge through case studies, personal interaction with content experts, and integrated questions are effective learning strategies for emergency medicine residents.4-6 The Ohio State University EM Residency didactic curriculum recently transitioned to a “flipped classroom” approach.7-10 We created this innovative curriculum aimed to improve our residency education program and to share educational resources with other EM residency programs. Our curriculum utilizes an 18-month curricular cycle to cover the defined emergency medicine content. The flipped classroom curriculum maximizes didactic time and resident engagement, fosters intellectual curiosity and active learning, and meets the needs of today’s learners. 3,6,11 Objectives: We aim to teach the presentation and management of psychiatric emergencies through the creation of a flipped classroom design. This unique, innovative curriculum utilizes resources chosen by education faculty and resident learners, study questions, real-life experiences, and small group

  8. Electrostatic Discharge Current Linear Approach and Circuit Design Method

    Directory of Open Access Journals (Sweden)

    Pavlos K. Katsivelis

    2010-11-01

    Full Text Available The Electrostatic Discharge phenomenon is a great threat to all electronic devices and ICs. An electric charge passing rapidly from a charged body to another can seriously harm the latter. However, there is a lack of a linear mathematical approach that would make it possible to design a circuit capable of producing such a sophisticated current waveform. The commonly accepted Electrostatic Discharge current waveform is the one set by the IEC 61000-4-2. However, the over-simplified circuit included in the same standard is incapable of producing such a waveform. Treating the Electrostatic Discharge current waveform of the IEC 61000-4-2 as a reference, an approximation method, based on Prony’s method, is developed and applied in order to obtain a linear system’s response. Considering a known input, a method to design a circuit able to generate this ESD current waveform is presented. The circuit synthesis assumes ideal active elements. A simulation is carried out using the PSpice software.
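
    For readers unfamiliar with Prony's method, the following Python sketch shows the textbook procedure for approximating a uniformly sampled waveform (for example, digitized samples of a reference current) by a sum of complex exponentials; it is a generic illustration, not the authors' specific approximation procedure or circuit synthesis, and the demo pulse is illustrative rather than the IEC 61000-4-2 waveform.

```python
import numpy as np

def prony(x, p):
    """Classic Prony analysis: approximate a uniformly sampled signal x as a
    sum of p complex exponentials, x[n] ~ sum_k A_k * z_k**n.
    Returns (amplitudes A, poles z)."""
    x = np.asarray(x, dtype=float)
    N = len(x)
    # 1) Linear-prediction coefficients from a least-squares Hankel system.
    H = np.column_stack([x[i:N - p + i] for i in range(p)])
    a, *_ = np.linalg.lstsq(H, -x[p:N], rcond=None)
    # 2) Poles are the roots of the characteristic polynomial.
    z = np.roots(np.concatenate(([1.0], a[::-1])))
    # 3) Amplitudes from a Vandermonde least-squares fit.
    V = np.vander(z, N, increasing=True).T          # shape (N, p)
    A, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    return A, z

# Demo: recover the two exponential terms of an illustrative double-exponential pulse.
t = np.arange(0, 100e-9, 1e-9)
i_ref = 10 * (np.exp(-t / 30e-9) - np.exp(-t / 2e-9))
A, z = prony(i_ref, 2)
i_fit = (np.vander(z, len(i_ref), increasing=True).T @ A).real
print(np.max(np.abs(i_fit - i_ref)))   # reconstruction error
```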

  9. Methodical approach to financial stimulation of logistics managers

    Directory of Open Access Journals (Sweden)

    Melnykova Kateryna V.

    2014-01-01

    Full Text Available The article offers a methodical approach to the financial stimulation of logistics managers, which allows the incentive amount to be calculated with consideration of the profit obtained from introducing optimisation logistics solutions. The author generalises measures that would allow enterprise top managers to increase the stimulation of logistics managers' work. The article identifies motivation factors that influence the attitude of logistics managers towards executing optimisation logistical solutions that minimise logistical costs. The author builds a scale of financial encouragement for the introduction of optimisation logistical solutions proposed by logistics managers. This scale is the basis for the functioning of the encouragement system and contributes to increasing the efficiency of logistics managers' operations as well as the optimisation of enterprise logistical solutions.

  10. Social Media Impact: Utility of Reflective Approach in the Practice of Surgery.

    Science.gov (United States)

    Mohiuddin, Zia; Shahid, Hassan; Shuaib, Waqas

    2015-12-01

    Social media is rapidly being incorporated into medical education. We created small-group, reflective practice sessions by integrating specific medical cases to improve awareness of professionalism on social media. Medical scenarios were generated for reflective practice sessions on social media professionalism. Anonymous pre-/post-session surveys evaluated residents' use of social media and gathered their opinions on the session. Thirty-eight of 48 (79%) residents replied to the pre-session survey, with 50% (19/38) reporting daily digital media use, 76% (29/38) having witnessed unprofessional postings on social media, and 21% (8/38) having posted unprofessional content themselves. Of the 79% (30/38) residents who attended the session, 74% (28/38) completed the post-session survey. Residents reported that the session added to the longevity of their professional career (4.11; 95% CI, 3.89-4.36). As a result of the session, they were more conscious of using social media professionally (3.47; 95% CI, 2.88-3.96) and would be proactive in protecting patient privacy and confidentiality on social media sites (3.96; 95% CI, 3.50-4.37). In summary, reflective practice-based sessions regarding the impact of social media on professionalism in surgery were well favored by the residents. The majority agreed that the sessions had important implications for the longevity of their professional career. Participants reported an increased awareness of the need to protect patient privacy and to utilize social media more professionally.

  11. An integrated approach for facilities planning by ELECTRE method

    Science.gov (United States)

    Elbishari, E. M. Y.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul

    2018-01-01

    Facility planning is concerned with the design, layout, and accommodation of people, machines, and activities of a system. Most researchers investigate the production area layout and the related facilities; however, few investigate the relationship between the production space and the service departments. The aim of this research is to integrate different approaches in order to evaluate, analyse, and select the best facilities planning method able to explain the relationship between the production area and other supporting departments and its effect on human effort. To achieve this objective, two different approaches have been integrated: Apple’s layout procedure, as one of the effective tools in planning factories, and the ELECTRE method, as one of the Multi-Criteria Decision Making (MCDM) methods, to minimize the risk of poor facilities planning. Dalia Industries was selected as a case study to implement our integration; the factory was divided into two main areas: the whole facility (layout A) and the manufacturing area (layout B). This article is concerned with the manufacturing area layout (layout B). After analysing the data gathered, the manufacturing area was divided into 10 activities. The alternatives were compared on five factors: inter-department satisfaction level, total distance travelled by workers, total distance travelled by the product, total travel time for workers, and total travel time for the product. Three different layout alternatives have been developed in addition to the original layouts. Apple’s layout procedure was used to study and evaluate the different alternative layouts; the evaluation was done by calculating scores for each of the factors. After obtaining the scores from evaluating the layouts, the ELECTRE method was used to compare the proposed alternatives with each other and with
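
    A minimal Python sketch of the ELECTRE I outranking step used in this kind of multi-criteria comparison is given below; the concordance/discordance thresholds and the example scores are placeholders, and all criteria are assumed to be oriented so that higher is better (distance and time factors would be negated first).

```python
import numpy as np

def electre_outranking(perf, weights, c_hat=0.7, d_hat=0.3):
    """Minimal ELECTRE I sketch. `perf` is an (alternatives x criteria) matrix
    in which higher is assumed better for every criterion; `weights` are the
    criteria weights. Returns the boolean outranking matrix S[a, b]."""
    perf = np.asarray(perf, dtype=float)
    w = np.asarray(weights, dtype=float) / np.sum(weights)
    m = perf.shape[0]
    ranges = perf.max(axis=0) - perf.min(axis=0)
    ranges[ranges == 0] = 1.0                        # avoid division by zero
    C = np.zeros((m, m))
    D = np.zeros((m, m))
    for a in range(m):
        for b in range(m):
            if a == b:
                continue
            C[a, b] = w[perf[a] >= perf[b]].sum()                      # concordance
            D[a, b] = max(float(np.max((perf[b] - perf[a]) / ranges)), 0.0)  # discordance
    return (C >= c_hat) & (D <= d_hat)

# Hypothetical scores for three alternative layouts on five criteria.
perf = [[0.8, 0.6, 0.7, 0.5, 0.9],
        [0.6, 0.9, 0.8, 0.7, 0.4],
        [0.7, 0.7, 0.6, 0.9, 0.6]]
weights = [0.3, 0.2, 0.2, 0.15, 0.15]
print(electre_outranking(perf, weights))
```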

  12. Review of remediation techniques for arsenic (As) contamination: a novel approach utilizing bio-organisms.

    Science.gov (United States)

    Rahman, Shahedur; Kim, Ki-Hyun; Saha, Subbroto Kumar; Swaraz, A M; Paul, Dipak Kumar

    2014-02-15

    Arsenic (As) contamination has recently become a worldwide problem, as it is found to be widespread not only in drinking water but also in various foodstuffs. Because of its high toxicity, As contamination poses a serious risk to human health and ecological systems. To cope with this problem, a great deal of effort has been made to account for the mechanisms of As mineral formation and accumulation by some plants and aquatic organisms exposed to high levels of As. Hence, bio-remediation is now considered an effective and potent approach to breaking down As contamination. In this review, we provide up-to-date knowledge on how biological tools (such as plants for phytoremediation and, to some extent, microorganisms) can be used to help resolve the effects of As problems on the Earth's environment. Copyright © 2014 Elsevier Ltd. All rights reserved.

  13. Population health-based approaches to utilizing digital technology: a strategy for equity.

    Science.gov (United States)

    Graham, Garth N; Ostrowski, MaryLynn; Sabina, Alyse B

    2016-11-01

    Health care disparities and high chronic disease rates burden many communities and disproportionally impact racial/ethnic populations in the United States. These disparities vary geographically, increase health care expenses, and result in shortened lifespans. Digital technologies may be one tool for addressing health disparities and improving population health by increasing individuals' access to health information-especially as most low-income U.S. residents gain access to smartphones. The Aetna Foundation partners with organizations to use digital technologies, including mobile applications, data collection, and related platforms, for learning and sharing. Projects range from the broad-childhood education, lifestyle modification, health IT training, and nutrition education, to the specific-local healthy foods, stroke rehabilitation, and collection of city-level data. We describe our approaches to grantmaking and discuss lessons learned and their implications. When combined with sound policy strategies, emerging, scalable, digital technologies will likely become powerful allies for improving health and reducing health disparities.

  14. Two comments to utilization of structure function approach in deep inelastic scattering experiments

    International Nuclear Information System (INIS)

    Kuraev, E.; Galynskij, M.; Il'ichev, A.

    2002-01-01

    The 'returning to resonance' mechanism can be used to obtain the simple procedure of taking radiative corrections (RC) to deep inelastic scattering (DIS) cross sections into account in the framework of the Drell-Yan picture. Iteration procedure is proposed. Kinematical region y→1 can be described in the framework of the Drell-Yan picture using the structure function approach. The large RC in the lowest order reflect the Sudakov form factor suppression, which can be taken into account in all orders of the perturbation theory. Based on explicit calculation in two lowest orders of the perturbation theory, we construct the cross section in the y→1 region obeying renormalization group equations and including the Sudakov-like form factor suppression

  15. Power to the people - A regional utility's approach in rural Alaska

    Energy Technology Data Exchange (ETDEWEB)

    Thomson, William [AVEC Canada (Canada)

    2011-07-01

    AVEC is a non-profit utility working in various fields, such as building interties and capturing recovered heat. The aim of this paper is to give an overview of the company's approach in rural Alaska. The state and federal governments funded over 100 wind projects in Alaska in the 1980s, and nearly all failed due to lack of maintenance and poor locations. Geographical and technical challenges included, among others, complex logistics, poor soils, and low temperatures. Moreover, the availability of heavy construction equipment was a key issue: the challenge was to gain access to specialty equipment, and wind assessment was critical. The geotechnical conditions also presented unique challenges. But there were benefits, such as the reduction in carbon footprint and reduced exposure to oil spills. Future plans include the evaluation of sites for future funding in several rural areas of western Alaska. Clearly, a great number of challenges are involved, but the work and results are rewarding.

  16. Identifying the Critical Links in Road Transportation Networks: Centrality-based approach utilizing structural properties

    Energy Technology Data Exchange (ETDEWEB)

    Chinthavali, Supriya [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Surface transportation road networks share structural properties with other complex networks (e.g., social networks, information networks, and biological networks). This research investigates the structural properties of road networks for any possible correlation with traffic characteristics, such as link flows, that are determined independently. Additionally, we define a criticality index for the links of the road network that identifies their relative importance in the network. We tested our hypotheses with two sample road networks. Results show that a correlation exists between the link flows and the centrality measures of a road link (a dual-graph approach is followed) and that the criticality index is found to be effective for one test network in identifying the vulnerable nodes.
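
    A small sketch, using networkx, of the kind of link-centrality computation on a dual (line) graph described above; the toy network, the flow values, and the flow-weighted criticality index are illustrative assumptions, not the report's exact definition.

```python
import networkx as nx

# Toy road network: nodes are intersections, edges are road links with
# independently determined traffic flows (placeholder values).
G = nx.Graph()
G.add_weighted_edges_from(
    [("A", "B", 1200), ("B", "C", 800), ("A", "C", 300),
     ("C", "D", 950), ("B", "D", 400)],
    weight="flow",
)

# Dual-graph view: each road link becomes a node, so link centralities can be
# computed directly.
L = nx.line_graph(G)
centrality = nx.betweenness_centrality(L)

# Placeholder criticality index: normalized centrality scaled by link flow.
max_flow = max(d["flow"] for _, _, d in G.edges(data=True))
criticality = {
    edge: centrality[edge] * G.edges[edge]["flow"] / max_flow
    for edge in L.nodes
}
print(sorted(criticality.items(), key=lambda kv: -kv[1]))
```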

  17. Excision and Patch Grafting of a Lateral Peyronie's Plaque-Utilizing a Longitudinal "Window" Approach.

    Science.gov (United States)

    Lue, Kathy; Emtage, Justin B; Martinez, Daniel R; Yang, Christopher; Carrion, Rafael

    2015-06-01

    Peyronie's disease (PD) is a debilitating disorder in which collagen deposition, fibrosis, and plaques in the tunica albuginea result in penile curvature, shortening, and pain. For severe curvatures requiring plaque incision or excision with grafting (PIEG), a subcoronal circumcising incision with penile degloving has historically been used. The aim of this study was to report our unique approach to PIEG via a longitudinal "window" incision for the correction of PD, minimizing the surgical manipulation and dissection accompanying the traditional circumcising incisional approach that may lead to increased postoperative edema, pain, and prolonged healing time. A patient presented with a stable, painless, 90-degree midshaft leftward curvature causing penetration difficulties and painful intercourse for his partner. His Sexual Health Inventory for Men (SHIM) score was 23. The patient opted for surgical correction with plaque excision and grafting via a 4-cm longitudinal incision overlying the point of maximal curvature along the left lateral penile shaft. This direct access to the left corpus cavernosum and plaque, along with dissecting skin, dartos, and Buck's fascia, created a window with sufficient exposure for excision and patch grafting. The main outcome measures were objective data and subjective data in men undergoing PIEG via lateral longitudinal "window" incision for PD repair. The plaque was excised and a porcine small intestinal submucosa graft was sewn in. Intraoperative artificial tumescence at the end of surgery revealed complete correction of the curvature. The patient experienced painless rigid erections by postoperative day three with minimal penile edema. By postoperative week four, he could successfully partake in coitus. His SHIM score remained unchanged. At maximum follow-up 6 months postoperatively, he still endorsed excellent cosmetic and functional outcomes with spontaneous unassisted erections and no recurrence of his curvature. A lateral

  18. An adaptive mesh refinement approach for average current nodal expansion method in 2-D rectangular geometry

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► A new adaptive h-refinement approach has been developed for a class of nodal methods. ► The resulting system of nodal equations is more amenable to efficient numerical solution. ► The benefit of the approach is reducing computational efforts relative to the uniform fine mesh modeling. ► The spatially adaptive approach greatly enhances the accuracy of the solution. - Abstract: The aim of this work is to develop a spatially adaptive coarse mesh strategy that progressively refines the nodes in appropriate regions of the domain to solve the neutron balance equation by the zeroth-order nodal expansion method. A flux-gradient-based a posteriori estimation scheme has been utilized for checking the approximate solutions for various nodes. The relative surface net leakage of nodes has been considered as an assessment criterion. In this approach, the core module is called by the adaptive mesh generator to determine the gradients of node surface fluxes and explore the possibility of node refinements in appropriate regions and directions of the problem. The benefit of the approach is reducing computational efforts relative to the uniform fine mesh modeling. For this purpose, a computer program ANRNE-2D, Adaptive Node Refinement Nodal Expansion, has been developed to solve the neutron diffusion equation using the average current nodal expansion method for 2D rectangular geometries. Implementing the adaptive algorithm confirms its superiority in enhancing the accuracy of the solution without using fine nodes throughout the domain and thus without increasing the number of unknowns. Some well-known benchmarks have been investigated and improvements are reported.
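
    A schematic Python sketch of the refinement loop described above follows; the solver and leakage callables stand in for the actual nodal expansion solver in ANRNE-2D, and the node-splitting rule and tolerance are assumptions for illustration only.

```python
def adapt_mesh(nodes, solve_flux, leakage, tol=0.05, max_passes=5):
    """Schematic h-refinement loop: solve on the current coarse nodes, flag
    nodes whose relative surface net leakage exceeds `tol`, split them, and
    repeat until no node is flagged or `max_passes` is reached.

    `nodes` is a list of (x, y, width, height) rectangles; `solve_flux` runs
    the nodal solver and returns per-node fluxes; `leakage(node, flux)`
    returns a node's relative surface net leakage. All are placeholders for
    the real solver interfaces."""
    for _ in range(max_passes):
        flux = solve_flux(nodes)
        flagged = [n for n in nodes if leakage(n, flux) > tol]
        if not flagged:
            break
        for n in flagged:
            nodes.remove(n)
            nodes.extend(split_node(n))   # replace with finer sub-nodes
    return nodes

def split_node(node):
    """Placeholder splitting rule: divide a rectangular node into quadrants."""
    x, y, w, h = node
    return [(x, y, w / 2, h / 2), (x + w / 2, y, w / 2, h / 2),
            (x, y + h / 2, w / 2, h / 2), (x + w / 2, y + h / 2, w / 2, h / 2)]
```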

  19. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal, to build new and/or leverage existing algorithms that learn from training data and give accurate predictions, or to find patterns, particularly with new and unseen similar data, fits Non-Destructive Evaluation perfectly. The advantages of ML in NDE are obvious in such tasks as pattern recognition in acoustic signals or automated processing of images from X-ray, ultrasonic, or optical methods. Fraunhofer IKTS is using machine learning algorithms in acoustic signal analysis, and the approach has been applied to a variety of tasks in quality assessment. The principal approach is based on acoustic signal processing with a primary and a secondary analysis step, followed by a cognitive system to create model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can then be recognized and classified automatically by the previously trained algorithms. Recently the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). Algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of components are required. The automated testing can then be done by the machine. By integrating the test data of many components along the value chain, further optimization including lifetime and durability
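
    A minimal scikit-learn sketch of the two-stage scheme described above, with unsupervised dimensionality reduction (PCA) feeding a supervised classifier; the feature matrix is random stand-in data, not real acoustic features.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Illustrative stand-in for acoustic features extracted in the primary and
# secondary analysis steps (e.g., spectral descriptors per test signal).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 40))                 # 200 signals, 40 features
y = rng.integers(0, 2, size=200)               # 0 = good part, 1 = defect

# Unsupervised dimensionality reduction followed by a supervised classifier,
# mirroring the PCA-then-classification scheme described in the abstract.
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC())
model.fit(X[:150], y[:150])
print("held-out accuracy:", model.score(X[150:], y[150:]))
```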

  20. Content-oriented Approach to Organization of Theories and Its Utilization

    Science.gov (United States)

    Hayashi, Yusuke; Bourdeau, Jacqueline; Mizoguch, Riichiro

    In spite of the fact that the relation between theory and practice is a foundation of scientific and technological development, the gap between theory and practice has widened in recent years. This gap carries a risk of distrust of science and technology. Ontological engineering, as content-oriented research, is expected to contribute to resolving the gap. This paper presents the feasibility of organizing theoretical knowledge through ontological engineering, and of new-generation intelligent systems based on it, through an application of ontological engineering in the area of learning/instruction support. This area also has the problem of the gap between theory and practice, and its resolution is strongly required. We have previously proposed the OMNIBUS ontology, a comprehensive ontology that covers different learning/instructional theories and paradigms, and SMARTIES, a theory-aware and standard-compliant authoring system for making learning/instructional scenarios based on the OMNIBUS ontology. We believe that theory-awareness and standard-compliance bridge the gap between theory and practice because they link theories to the practical use of standard technologies and enable practitioners to easily enjoy theoretical support while using standard technologies in practice. The following goals are set in order to achieve this: computers (1) understand a variety of learning/instructional theories based on their organization, (2) utilize this understanding to help authors make learning/instructional scenarios, and (3) make such theoretically sound scenarios interoperable within the framework of standard technologies. This paper suggests an ontological engineering solution to the achievement of these three goals. Although the evaluation is far from complete in terms of practical use, we believe that the results of this study address high-level technical challenges from the viewpoint of the current state of the art in the research area

  1. Mixing Methods in Organizational Ethics and Organizational Innovativeness Research : Three Approaches to Mixed Methods Analysis

    OpenAIRE

    Riivari, Elina

    2015-01-01

    This chapter discusses three categories of mixed methods analysis techniques: variable-oriented, case-oriented, and process/experience-oriented. All three categories combine qualitative and quantitative approaches to research methodology. The major differences among the categories are the focus of the study, the available analysis techniques, and the temporal aspect of the study. In variable-oriented analysis, the study focus is relationships between the research phenomena. In case-oriente...

  2. Methodical approaches in the Norwegian Master Plan for Water Resources

    International Nuclear Information System (INIS)

    Bowitz, Einar

    1997-01-01

    The Norwegian Master Plan for Water Resources instructs the management not to consider applications for concession to develop hydroelectric projects in the so called category II of the plan. These are the environmentally most controversial projects or the most expensive projects. This report discusses the methods used in this Master Plan to classify the projects. The question whether the assessments of the environmental disadvantages of hydropower development are reasonable is approached in two ways: (1) Compare the environmental costs imbedded in the Plan with direct assessments, and (2) Discuss the appropriateness of the methodology used for environmental evaluations in the Plan. The report concludes that (1) the environmental costs that can be derived from the ranking in the Plan are significantly greater than those following from direct evaluations, (2) the differences are generally so great that one may ask whether the methods used in the Plan overestimate the real environmental costs, (3) it seems to have been difficult to make a unified assessment of the environmental disadvantages, (4) the Plan has considered the economic impact on agriculture and forestry very roughly and indirectly, which may have contributed to overestimated environmental costs of hydropower development. 20 refs., 6 figs., 7 tabs

  3. An Experiential-Based Learning Method Aiming to Improve Spatial Awareness Utilizing GPS, Geocaching, and Geo-Selfies

    Science.gov (United States)

    Flynn, K. Colton; Popp, Jennie

    2016-01-01

    Many educators have suggested that spatial awareness is vital in the foundation of geography curricula, as well as the ability to utilize geospatial technologies (National Research Council 2006; Kerski 2008; Lee and Bednarz 2009; Favier and Van der Schee 2014). The purpose of this research was to identify a low-cost and effective method to improve…

  4. A bottom-up method to develop pollution abatement cost curves for coal-fired utility boilers

    Science.gov (United States)

    This paper illustrates a new method to create supply curves for pollution abatement using boiler-level data that explicitly accounts for technology costs and performance. The Coal Utility Environmental Cost (CUECost) model is used to estimate retrofit costs for five different NO...
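
    A generic Python sketch of how boiler-level retrofit estimates can be turned into a pollution abatement supply curve by sorting on cost-effectiveness; the boilers, technologies, and numbers are hypothetical and are not CUECost outputs.

```python
# Hypothetical boiler-level retrofit estimates: annualized cost (USD) and
# tons of pollutant removed per year for each (boiler, technology) option.
options = [
    {"boiler": "B1", "tech": "SCR",  "cost": 4.2e6, "tons_removed": 3500},
    {"boiler": "B2", "tech": "SNCR", "cost": 1.1e6, "tons_removed": 900},
    {"boiler": "B3", "tech": "SCR",  "cost": 3.0e6, "tons_removed": 2100},
    {"boiler": "B4", "tech": "LNB",  "cost": 0.6e6, "tons_removed": 700},
]

# Sort by cost-effectiveness ($/ton) and accumulate removals to obtain the
# abatement supply curve: cumulative tons abated vs. marginal cost.
for opt in options:
    opt["usd_per_ton"] = opt["cost"] / opt["tons_removed"]

curve = []
cumulative = 0.0
for opt in sorted(options, key=lambda o: o["usd_per_ton"]):
    cumulative += opt["tons_removed"]
    curve.append((cumulative, opt["usd_per_ton"]))

print(curve)   # each point: (cumulative tons abated, marginal $/ton)
```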

  5. Utilization of a liquid crystal spatial light modulator in a gray scale detour phase method for Fourier holograms.

    Science.gov (United States)

    Makey, Ghaith; El-Daher, Moustafa Sayem; Al-Shufi, Kanj

    2012-11-10

    This paper introduces a new modification for the well-known binary detour phase method, which is largely used to represent Fourier holograms; the modification utilizes gray scale level control provided by a liquid crystal spatial light modulator to improve the traditional binary detour phase. Results are shown by both simulation and experiment.

  6. The Utility of a Three-Dimensional Approach with T-Shaped Osteotomy in Osseous Genioplasty

    Directory of Open Access Journals (Sweden)

    Jung Jae Jegal

    2013-07-01

    Full Text Available Background Facial beauty depends on the form, proportion, and position of various units of the face. In terms of the frontal view and facial profile, the chin is the most prominent aesthetic element of the lower third of the face. Many methods have been implemented to obtain good proportions of the lower face. In this study, we applied the T-shaped genioplasty method to correcting chin deformities. Methods All of the procedures in 9 cases were performed under general anesthesia. For genioplasty, a horizontal cutting line and 1 or 2 vertical cutting lines were drawn 5 mm below the mental foramen. Osteotomed bone segments of the chin were used for horizontal widening using bone grafts or for horizontal shortening. Likewise, they were used as bone grafts for vertical lengthening or vertical shortening. The bone segments were approximated in the midline and held in place using miniplates. Results The postoperative appearance of the 9 cases showed that the lower third of the face had been naturally changed. At the same time, vertical lengthening or shortening, and horizontal widening or shortening could be implemented during the operation. Satisfactory results were obtained based on reviews of the patients' preoperative and postoperative photographs. The patients were also satisfied with the outcomes. Conclusions Using T-shaped genioplasty, we efficiently adjusted the shape and position of the chin to obtain good proportions of the lower face and change its contour to obtain an aesthetically appealing oval face in accordance with East Asians' aesthetic preferences.

  7. Combining psychological and engineering approaches to utilizing social robots with children with autism.

    Science.gov (United States)

    Dickstein-Fischer, Laurie; Fischer, Gregory S

    2014-01-01

    It is estimated that Autism Spectrum Disorder (ASD) affects 1 in 68 children. Early identification of an ASD is exceedingly important to the introduction of an intervention. We are developing a robot-assisted approach that will serve as an improved diagnostic and early intervention tool for children with autism. The robot, named PABI® (Penguin for Autism Behavioral Interventions), is a compact humanoid robot taking on an expressive cartoon-like embodiment. The robot is affordable, durable, and portable so that it can be used in various settings including schools, clinics, and the home, thus enabling significantly enhanced and more readily available diagnosis and continuation of care. Through facial expressions, body motion, verbal cues, stereo vision-based tracking, and a tablet computer, the robot is capable of interacting meaningfully with an autistic child. Initial implementations of the robot, as part of a comprehensive treatment model (CTM), include Applied Behavioral Analysis (ABA) therapy where the child interacts with a tablet computer wirelessly interfaced with the robot. At the same time, the robot makes meaningful expressions and utterances and uses the stereo cameras in its eyes to track the child, maintain eye contact, and collect data such as affect and gaze direction for charting of progress. In this paper we present the clinical justification, anticipated usage with corresponding requirements, prototype development of the robotic system, and demonstration of a sample application for robot-assisted ABA therapy.

  8. LUDIC APPROACH OF THE SCIENTIFIC METHOD IN BIOCHEMICAL EDUCATION

    Directory of Open Access Journals (Sweden)

    W.B. Maia

    2008-05-01

    Full Text Available Education today concerns intellectual autonomy, the capacity for decision-making, and the student's possibilities for making a difference. This work arose from the multi- and transdisciplinary perception of a team of educators from UFPE and UFRJ, for whom scientific research means intentional intellectual activity to attend to human needs. In response to teachers' questions about students' difficulties in learning scientific methodology, a ludic activity was introduced into classrooms at the high school, undergraduate, and postgraduate levels. First, a CD-ROM entitled "O Método Científico: Uma Leitura Virtual" was made with the Adobe Flash program and used as a neurodidactic strategy (auditory and visual stimulus). After thirty minutes of ludic activities, discussions about the importance and utility of the scientific method took place, and Science and Technology arose as subject matters. In this way, a democratic and interfacial environment was produced. As a result of using this didactic activity, curiosity was stimulated, attention was awakened, specific abilities of observation were developed, association of ideas was eased, and the capacity for analysis and synthesis improved in all groups. Ludic activities in education create possibilities for sociability (such as consolidation of the student's individuality) and also stimulate interest in the sciences. Thus, for the development of educational projects focused on paradigm changes that lead to the rise of a new didactic culture, the use of diverse interfacial resources becomes necessary.

  9. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function

    Directory of Open Access Journals (Sweden)

    Michael Pearce

    2018-02-01

    Full Text Available Abstract Background: Most confirmatory randomised controlled clinical trials (RCTs) are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test, together with other information, to balance the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. Methods: We considered the use of a decision-theoretic value of information (VOI) method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application in an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. Results: The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. Conclusions: Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently trial sample size) depending on the size of the future population.
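
    To make the decision-theoretic reasoning above concrete, the following is a minimal Monte Carlo sketch (not the authors' model) under assumed values for the prior on the treatment effect, trial costs, and the size of the future population; it searches a grid of per-arm sample sizes and one-sided significance levels for the pair that maximizes expected societal net benefit.

    ```python
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(0)

    # Assumed (hypothetical) inputs -- not taken from the paper.
    mu0, tau0 = 0.1, 0.3            # normal prior on the treatment effect (outcome units)
    sigma = 1.0                     # known SD of the primary outcome
    N_pop = 2000                    # size of the future patient population (rare disease)
    cost_per_patient = 0.05         # trial cost per recruited patient, in utility units
    harm_per_future_patient = 0.02  # fixed cost/harm of treating each future patient

    def expected_net_benefit(n_per_arm, alpha, n_sim=20000):
        """Monte Carlo expected utility of running a trial with n per arm
        and approving the treatment if the one-sided test is significant."""
        theta = rng.normal(mu0, tau0, n_sim)             # true effects drawn from the prior
        se = sigma * np.sqrt(2.0 / n_per_arm)            # SE of the estimated effect
        theta_hat = rng.normal(theta, se)                # trial estimates
        approve = theta_hat / se > norm.ppf(1 - alpha)   # one-sided significance test
        gain = np.where(approve, N_pop * (theta - harm_per_future_patient), 0.0)
        return gain.mean() - cost_per_patient * 2 * n_per_arm

    grid_n = [10, 20, 40, 80, 160, 320]
    grid_alpha = [0.025, 0.05, 0.1, 0.2]
    best = max(((n, a, expected_net_benefit(n, a)) for n in grid_n for a in grid_alpha),
               key=lambda t: t[2])
    print("optimal n per arm = %d, alpha = %.3f, expected net benefit = %.2f" % best)
    ```

    Shrinking N_pop in this sketch shrinks the optimal sample size, mirroring the qualitative result reported above.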

  10. Method of configuring a cell of a wireless communication system for improved resource utilization

    NARCIS (Netherlands)

    2013-01-01

    At least one base station of a wireless network adjusts its access area so as to drive at least one measure of utilization of a resource or resources of that cell toward, but not to exceed, a specified maximum level. The adjustment is dynamic in that it responds in real time to traffic fluctuations.

  11. Systematic Field Study of NO(x) Emission Control Methods for Utility Boilers.

    Science.gov (United States)

    Bartok, William; And Others

    A utility boiler field test program was conducted. The objectives were to determine new or improved NO(x) emission factors by fossil fuel type and boiler design, and to assess the scope of applicability of combustion modification techniques for controlling NO(x) emissions from such installations. A statistically designed test program was…

  12. Methods and approaches to prediction in the meat industry

    Directory of Open Access Journals (Sweden)

    A. B. Lisitsyn

    2016-01-01

    The article presents the main methods and approaches to prediction in the meat industry.

  13. A LINEAR PROGRAMMING METHOD TO ENHANCE RESOURCE UTILIZATION CASE OF ETHIOPIAN APPAREL SECTOR

    Directory of Open Access Journals (Sweden)

    Gezahegn Tesfaye

    2016-06-01

    Full Text Available The Ethiopian industrial development strategy is characterized by export-led and labor-intensive industrialization. The country is emerging as the most important investment destination in its apparel sector. Though this sector is expected to generate more income from the export market, its export earnings remain trivial, mainly due to inefficient utilization of organizational resources. One well-established technique that helps companies use their resources more efficiently and increase their profit is linear programming. In apparel manufacturing firms, efficient use of materials such as fabrics and sewing threads and of processing time at different stages of production, as well as minimization of labor and materials cost, is necessary to enhance profitability. Cutting, sewing, and finishing operations deserve particular attention for apparel process optimization. However, the issue of proper resource allocation remains an unsolved problem within the Ethiopian apparel industry. The aim of this research is to devise an efficient resource utilization mechanism for the Ethiopian apparel sector to improve resource utilization and profitability, taking one of the garment factories engaged in the export market as a case study. The five types of products the company is currently producing, the amount of resources employed to produce each unit of the products, and the profit per unit from the sale of each product were collected from the case company. The monthly availability of the resources utilized and the monthly production volume of the five products were also collected from the company. The data gathered were mathematically modeled using a linear programming technique and solved using the MS-Excel solver. The findings of the study show that all of the organizational resources are severely underutilized. This research proved that the resource utilization of the case company can be improved from 46.41% of the current resource
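
    The underlying product-mix model is a standard linear program. The sketch below uses invented profits, resource coefficients, and availabilities for five hypothetical products (not the case company's data) and solves for the profit-maximizing production plan with scipy.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Hypothetical data: 5 products, 4 resources (fabric m, thread m, machine h, labour h).
    profit = np.array([12.0, 9.0, 15.0, 7.5, 11.0])      # profit per unit of each product
    usage = np.array([[1.8, 1.5, 2.2, 1.2, 1.6],         # fabric per unit
                      [30., 25., 40., 20., 28.],         # thread per unit
                      [0.5, 0.4, 0.7, 0.3, 0.5],         # machine hours per unit
                      [0.8, 0.6, 1.0, 0.5, 0.7]])        # labour hours per unit
    available = np.array([40000., 700000., 12000., 18000.])  # monthly availability

    # linprog minimizes, so negate the profit vector; x >= 0 by the given bounds.
    res = linprog(c=-profit, A_ub=usage, b_ub=available, bounds=[(0, None)] * 5,
                  method="highs")

    plan = res.x
    print("production plan:", np.round(plan, 1))
    print("total profit   :", round(profit @ plan, 2))
    print("resource utilization (%):",
          np.round(100 * (usage @ plan) / available, 1))
    ```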

  14. Approaching complexity by stochastic methods: From biological systems to turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Friedrich, Rudolf [Institute for Theoretical Physics, University of Muenster, D-48149 Muenster (Germany); Peinke, Joachim [Institute of Physics, Carl von Ossietzky University, D-26111 Oldenburg (Germany); Sahimi, Muhammad [Mork Family Department of Chemical Engineering and Materials Science, University of Southern California, Los Angeles, CA 90089-1211 (United States); Reza Rahimi Tabar, M., E-mail: mohammed.r.rahimi.tabar@uni-oldenburg.de [Department of Physics, Sharif University of Technology, Tehran 11155-9161 (Iran, Islamic Republic of); Institute of Physics, Carl von Ossietzky University, D-26111 Oldenburg (Germany); Fachbereich Physik, Universitaet Osnabrueck, Barbarastrasse 7, 49076 Osnabrueck (Germany)

    2011-09-15

    This review addresses a central question in the field of complex systems: given a fluctuating (in time or space), sequentially measured set of experimental data, how should one analyze the data, assess their underlying trends, and discover the characteristics of the fluctuations that generate the experimental traces? In recent years, significant progress has been made in addressing this question for a class of stochastic processes that can be modeled by Langevin equations, including additive as well as multiplicative fluctuations or noise. Important results have emerged from the analysis of temporal data for such diverse fields as neuroscience, cardiology, finance, economy, surface science, turbulence, seismic time series and epileptic brain dynamics, to name but a few. Furthermore, it has been recognized that a similar approach can be applied to the data that depend on a length scale, such as velocity increments in fully developed turbulent flow, or height increments that characterize rough surfaces. A basic ingredient of the approach to the analysis of fluctuating data is the presence of a Markovian property, which can be detected in real systems above a certain time or length scale. This scale is referred to as the Markov-Einstein (ME) scale, and has turned out to be a useful characteristic of complex systems. We provide a review of the operational methods that have been developed for analyzing stochastic data in time and scale. We address in detail the following issues: (i) reconstruction of stochastic evolution equations from data in terms of the Langevin equations or the corresponding Fokker-Planck equations and (ii) intermittency, cascades, and multiscale correlation functions.
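
    As a concrete illustration of the reconstruction step described in point (i), the sketch below estimates the drift and diffusion coefficients of a Langevin equation from a simulated time series by conditional averaging of increments (the first two Kramers-Moyal coefficients). It is a generic textbook-style example, not code from the review.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Simulate an Ornstein-Uhlenbeck process dx = -x dt + sqrt(2 D) dW as test data.
    dt, n, D = 1e-3, 500_000, 0.5
    x = np.empty(n)
    x[0] = 0.0
    noise = rng.normal(scale=np.sqrt(2 * D * dt), size=n - 1)
    for i in range(n - 1):
        x[i + 1] = x[i] - x[i] * dt + noise[i]

    # Estimate the first two Kramers-Moyal coefficients by binning the state.
    bins = np.linspace(-2, 2, 41)
    centers = 0.5 * (bins[:-1] + bins[1:])
    idx = np.digitize(x[:-1], bins) - 1
    dx = np.diff(x)

    drift = np.full(len(centers), np.nan)
    diffusion = np.full(len(centers), np.nan)
    for b in range(len(centers)):
        m = idx == b
        if m.sum() > 100:
            drift[b] = dx[m].mean() / dt                   # D1(x) ~ <dx | x> / dt
            diffusion[b] = (dx[m] ** 2).mean() / (2 * dt)  # D2(x) ~ <dx^2 | x> / (2 dt)

    # For the OU test case we expect drift ~ -x and diffusion ~ D = 0.5.
    print(np.nanmean(np.abs(drift + centers)), np.nanmean(diffusion))
    ```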

  15. Approaching complexity by stochastic methods: From biological systems to turbulence

    International Nuclear Information System (INIS)

    Friedrich, Rudolf; Peinke, Joachim; Sahimi, Muhammad; Reza Rahimi Tabar, M.

    2011-01-01

    This review addresses a central question in the field of complex systems: given a fluctuating (in time or space), sequentially measured set of experimental data, how should one analyze the data, assess their underlying trends, and discover the characteristics of the fluctuations that generate the experimental traces? In recent years, significant progress has been made in addressing this question for a class of stochastic processes that can be modeled by Langevin equations, including additive as well as multiplicative fluctuations or noise. Important results have emerged from the analysis of temporal data for such diverse fields as neuroscience, cardiology, finance, economy, surface science, turbulence, seismic time series and epileptic brain dynamics, to name but a few. Furthermore, it has been recognized that a similar approach can be applied to the data that depend on a length scale, such as velocity increments in fully developed turbulent flow, or height increments that characterize rough surfaces. A basic ingredient of the approach to the analysis of fluctuating data is the presence of a Markovian property, which can be detected in real systems above a certain time or length scale. This scale is referred to as the Markov-Einstein (ME) scale, and has turned out to be a useful characteristic of complex systems. We provide a review of the operational methods that have been developed for analyzing stochastic data in time and scale. We address in detail the following issues: (i) reconstruction of stochastic evolution equations from data in terms of the Langevin equations or the corresponding Fokker-Planck equations and (ii) intermittency, cascades, and multiscale correlation functions.

  16. Assessment of statistical methods used in library-based approaches to microbial source tracking.

    Science.gov (United States)

    Ritter, Kerry J; Carruthers, Ethan; Carson, C Andrew; Ellender, R D; Harwood, Valerie J; Kingsley, Kyle; Nakatsu, Cindy; Sadowsky, Michael; Shear, Brian; West, Brian; Whitlock, John E; Wiggins, Bruce A; Wilbur, Jayson D

    2003-12-01

    Several commonly used statistical methods for fingerprint identification in microbial source tracking (MST) were examined to assess the effectiveness of pattern-matching algorithms to correctly identify sources. Although numerous statistical methods have been employed for source identification, no widespread consensus exists as to which is most appropriate. A large-scale comparison of several MST methods, using identical fecal sources, presented a unique opportunity to assess the utility of several popular statistical methods. These included discriminant analysis, nearest neighbour analysis, maximum similarity and average similarity, along with several measures of distance or similarity. Threshold criteria for excluding uncertain or poorly matched isolates from final analysis were also examined for their ability to reduce false positives and increase prediction success. Six independent libraries used in the study were constructed from indicator bacteria isolated from fecal materials of humans, seagulls, cows and dogs. Three of these libraries were constructed using the rep-PCR technique and three relied on antibiotic resistance analysis (ARA). Five of the libraries were constructed using Escherichia coli and one using Enterococcus spp. (ARA). Overall, the outcome of this study suggests a high degree of variability across statistical methods. Despite large differences in correct classification rates among the statistical methods, no single statistical approach emerged as superior. Thresholds failed to consistently increase rates of correct classification and improvement was often associated with substantial effective sample size reduction. Recommendations are provided to aid in selecting appropriate analyses for these types of data.

  17. An Examination of Physical Education Teachers' Perceptions of Utilizing Contemporary Music in the Classroom Environment: A Qualitative Approach

    Science.gov (United States)

    Barney, David C.; Pleban, Francis T.

    2018-01-01

    Objectives: To provide further information regarding physical education (PE) teachers' perceptions of incorporating music in PE lessons and to evaluate the influence of music on the classroom environment using a qualitative approach. Method: Electronic survey interviews were conducted with 26 veteran PE instructors (10 male, 16 female), from 7…

  18. Clustering Methods with Qualitative Data: a Mixed-Methods Approach for Prevention Research with Small Samples.

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B; Mohatt, Nathaniel; Allen, James; Kelly, James G

    2015-10-01

    Qualitative methods potentially add depth to prevention research but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed-methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed-methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-means clustering, and latent class analysis produced similar levels of accuracy with binary data and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a "real-world" example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities.
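
    The simulation setup of the first study can be illustrated with a small sketch: binary code profiles for 50 simulated participants are clustered with hierarchical (Ward) clustering and K-means, and the assignments are compared against the known profiles. The data and code below are illustrative assumptions, not the authors' materials.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist
    from sklearn.cluster import KMeans
    from sklearn.metrics import adjusted_rand_score

    rng = np.random.default_rng(2)

    # Simulate 50 participants x 12 binary codes drawn from two latent profiles.
    profile = np.array([[.8, .7, .8, .2, .1, .2, .7, .8, .1, .2, .8, .1],
                        [.2, .1, .2, .8, .8, .7, .2, .1, .8, .7, .2, .8]])
    truth = rng.integers(0, 2, size=50)
    codes = (rng.random((50, 12)) < profile[truth]).astype(int)

    # Hierarchical clustering (Ward linkage on Euclidean distances of the binary rows).
    hier = fcluster(linkage(pdist(codes), method="ward"), t=2, criterion="maxclust")

    # K-means with two clusters.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(codes)

    print("hierarchical vs truth ARI:", round(adjusted_rand_score(truth, hier), 2))
    print("k-means      vs truth ARI:", round(adjusted_rand_score(truth, km), 2))
    print("agreement between methods:", round(adjusted_rand_score(hier, km), 2))
    ```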

  19. Clustering Methods with Qualitative Data: A Mixed Methods Approach for Prevention Research with Small Samples

    Science.gov (United States)

    Henry, David; Dymnicki, Allison B.; Mohatt, Nathaniel; Allen, James; Kelly, James G.

    2016-01-01

    Qualitative methods potentially add depth to prevention research, but can produce large amounts of complex data even with small samples. Studies conducted with culturally distinct samples often produce voluminous qualitative data, but may lack sufficient sample sizes for sophisticated quantitative analysis. Currently lacking in mixed methods research are methods allowing for more fully integrating qualitative and quantitative analysis techniques. Cluster analysis can be applied to coded qualitative data to clarify the findings of prevention studies by aiding efforts to reveal such things as the motives of participants for their actions and the reasons behind counterintuitive findings. By clustering groups of participants with similar profiles of codes in a quantitative analysis, cluster analysis can serve as a key component in mixed methods research. This article reports two studies. In the first study, we conduct simulations to test the accuracy of cluster assignment using three different clustering methods with binary data as produced when coding qualitative interviews. Results indicated that hierarchical clustering, K-Means clustering, and latent class analysis produced similar levels of accuracy with binary data, and that the accuracy of these methods did not decrease with samples as small as 50. Whereas the first study explores the feasibility of using common clustering methods with binary data, the second study provides a “real-world” example using data from a qualitative study of community leadership connected with a drug abuse prevention project. We discuss the implications of this approach for conducting prevention research, especially with small samples and culturally distinct communities. PMID:25946969

  20. Expedient Method for Samarium(II) Iodide Preparation Utilizing a Flow Approach

    Czech Academy of Sciences Publication Activity Database

    Voltrová, Svatava; Šrogl, Jiří

    2013-01-01

    Roč. 24, č. 3 (2013), s. 394-396 ISSN 0936-5214 R&D Projects: GA MŠk LH12013 Institutional support: RVO:61388963 Keywords : flow * samarium * iodide * reduction Subject RIV: CC - Organic Chemistry Impact factor: 2.463, year: 2013

  1. Approaches for optimizing the utilization of nuclear medical resources in developing countries

    International Nuclear Information System (INIS)

    Adelstein, S.

    1978-10-01

    Contemplation of the use of nuclear medicine technologies in developing countries raises the issue of what purposes they would serve in relation to the major health needs of the region. This investigation was an attempt to carry out such an assessment and to test a methodology for conducting it. Several potential nuclear medicine applications were selected on the basis of their intrinsic promise and in order to evaluate a variety of techniques. These applications were then subjected to a cost-effectiveness analysis, either on their own or in comparison with some competing procedure. To give reality to the study, it was performed in the context of the Indian health scene. The situations examined were the following: screening blood units for hepatitis B surface antigen prior to transfusion; costs and benefits of radiorespirometry and conventional culture methods as means to improve case-finding in tuberculosis surveys; improved tests for diagnosing tuberculous meningitis; and population screening for thyroid diseases. Although studies of this type are typically impeded by an insufficiency of key data, they can nevertheless shed some light on the value of potential radionuclide applications.

  2. Biomaterials Approaches for Utilizing the Regenerative Potential of the Peripheral Nerve Injury Microenvironment

    Science.gov (United States)

    Wrobel, Melissa Renee

    following classical activation (M1/pro-inflammatory) with lipopolysaccharide (LPS; 1 μg/mL) and would accelerate the transformation of Schwann cells from an immature state following injury to a mature/pro-myelinating one. Cell phenotypes were functionally assessed using quantitative reverse transcription polymerase chain reaction (qRT-PCR), immunofluorescence, and sandwich-ELISA based antibody arrays to measure changes in mRNA expression, morphology, and cytokine release, respectively. Macrophages cultured with the SCM and HA fibers had significantly reduced M1 gene expression, released lower levels of M1 cytokines (IL-1a, RANTES and TNF-a) and assumed an elongated morphology indicative of M2. These cues also induced changes in the Schwann cells including significantly reduced area, increased elongation, decreased expression of immature genes (GFAP) and increased expression of mature genes (Krox20 and Oct6). These results suggest that the SCM and HA nanofibers could trigger non-neuronal cells towards regenerative programs more quickly than traditional PNI interventions. Changes induced by biomaterials have distinct benefits over the use of immunomodulatory cytokines and would be a novel approach to direct repair. Our collective studies offer improved insight into the endogenous potential of the injured peripheral nerve and offer ways to incorporate intrinsic repair cues into a biomaterial system for treating large gaps.

  3. [Series: Utilization of Differential Equations and Methods for Solving Them in Medical Physics (2)].

    Science.gov (United States)

    Murase, Kenya

    2015-01-01

    In this issue, symbolic methods for solving differential equations were firstly introduced. Of the symbolic methods, Laplace transform method was also introduced together with some examples, in which this method was applied to solving the differential equations derived from a two-compartment kinetic model and an equivalent circuit model for membrane potential. Second, series expansion methods for solving differential equations were introduced together with some examples, in which these methods were used to solve Bessel's and Legendre's differential equations. In the next issue, simultaneous differential equations and various methods for solving these differential equations will be introduced together with some examples in medical physics.
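
    A minimal illustration of the Laplace transform technique described here, applied to the one-compartment special case dC/dt = -kC with C(0) = C0: transform the equation to the s-domain, solve the resulting algebraic equation, and invert. This sympy sketch is illustrative and not taken from the series.

    ```python
    import sympy as sp

    t, s = sp.symbols("t s", positive=True)
    k, C0 = sp.symbols("k C0", positive=True)
    Cs = sp.symbols("Cs")  # Laplace transform of C(t)

    # Transforming dC/dt = -k*C with C(0) = C0 gives  s*Cs - C0 = -k*Cs.
    sol_s = sp.solve(sp.Eq(s * Cs - C0, -k * Cs), Cs)[0]   # Cs = C0 / (s + k)

    # Invert back to the time domain.
    C_t = sp.inverse_laplace_transform(sol_s, s, t)
    print(sp.simplify(C_t))   # C0*exp(-k*t) (sympy may keep a Heaviside(t) factor)
    ```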

  4. Value of information methods to design a clinical trial in a small population to optimise a health economic utility function.

    Science.gov (United States)

    Pearce, Michael; Hee, Siew Wan; Madan, Jason; Posch, Martin; Day, Simon; Miller, Frank; Zohar, Sarah; Stallard, Nigel

    2018-02-08

    Most confirmatory randomised controlled clinical trials (RCTs) are designed with specified power, usually 80% or 90%, for a hypothesis test conducted at a given significance level, usually 2.5% for a one-sided test. Approval of the experimental treatment by regulatory agencies is then based on the result of such a significance test with other information to balance the risk of adverse events against the benefit of the treatment to future patients. In the setting of a rare disease, recruiting sufficient patients to achieve conventional error rates for clinically reasonable effect sizes may be infeasible, suggesting that the decision-making process should reflect the size of the target population. We considered the use of a decision-theoretic value of information (VOI) method to obtain the optimal sample size and significance level for confirmatory RCTs in a range of settings. We assume the decision maker represents society. For simplicity we assume the primary endpoint to be normally distributed with unknown mean following some normal prior distribution representing information on the anticipated effectiveness of the therapy available before the trial. The method is illustrated by an application in an RCT in haemophilia A. We explicitly specify the utility in terms of improvement in primary outcome and compare this with the costs of treating patients, both financial and in terms of potential harm, during the trial and in the future. The optimal sample size for the clinical trial decreases as the size of the population decreases. For non-zero cost of treating future patients, either monetary or in terms of potential harmful effects, stronger evidence is required for approval as the population size increases, though this is not the case if the costs of treating future patients are ignored. Decision-theoretic VOI methods offer a flexible approach with both type I error rate and power (or equivalently trial sample size) depending on the size of the future population for

  5. Recommended methods for evaluating the benefits of ECUT Program outputs. [Energy Conversion and Utilization

    Energy Technology Data Exchange (ETDEWEB)

    Levine, L.O.; Winter, C.

    1986-03-01

    This study was conducted to define and develop techniques that could be used to assess the complete spectrum of positive effects resulting from the Energy Conversion and Utilization Technologies (ECUT) Program activities. These techniques could then be applied to measure the benefits from past ECUT outputs. In addition, the impact of future ECUT outputs could be assessed as part of an ongoing monitoring process, after sufficient time has elapsed to allow their impacts to develop.

  6. Evaluating a physician leadership development program - a mixed methods approach.

    Science.gov (United States)

    Throgmorton, Cheryl; Mitchell, Trey; Morley, Tom; Snyder, Marijo

    2016-05-16

    Purpose - With the extent of change in healthcare today, organizations need strong physician leaders. To compensate for the lack of physician leadership education, many organizations are sending physicians to external leadership programs or developing in-house leadership programs targeted specifically to physicians. The purpose of this paper is to outline the evaluation strategy and outcomes of the inaugural year of a Physician Leadership Academy (PLA) developed and implemented at a Michigan-based regional healthcare system. Design/methodology/approach - The authors applied the theoretical framework of Kirkpatrick's four levels of evaluation and used surveys, observations, activity tracking, and interviews to evaluate the program outcomes. The authors applied grounded theory techniques to the interview data. Findings - The program met targeted outcomes across all four levels of evaluation. Interview themes focused on the significance of increasing self-awareness, building relationships, applying new skills, and building confidence. Research limitations/implications - While only one example, this study illustrates the importance of developing the evaluation strategy as part of the program design. Qualitative research methods, often lacking from learning evaluation design, uncover rich themes of impact. The study shows how a PLA program can enhance physician learning, engagement, and relationship building throughout and after the program. Physician leaders' partnership with organization development and learning professionals yields results with impact on individuals, groups, and the organization. Originality/value - Few studies provide an in-depth review of evaluation methods and outcomes of physician leadership development programs. Healthcare organizations seeking to develop similar in-house programs may benefit from applying the evaluation strategy outlined in this study.

  7. Sequestration and utilization of carbon dioxide by chemical and biological methods for biofuels and biomaterials by chemoautotrophs: Opportunities and challenges.

    Science.gov (United States)

    Thakur, Indu Shekhar; Kumar, Manish; Varjani, Sunita J; Wu, Yonghong; Gnansounou, Edgard; Ravindran, Sindhu

    2018-05-01

    To meet CO2 emission reduction targets, carbon dioxide capture and utilization (CCU) has emerged as an evolving technology. Under the CCU concept, CO2 is turned into a feedstock, and technologies have been developed for the transformation of CO2 into useful organic products. At industrial scale, the utilization of CO2 as a raw material is not very significant compared with its abundance. Mechanisms have evolved in nature for carbon concentration, fixation and utilization. Assimilation and subsequent conversion of CO2 into complex molecules are performed by photosynthetic and chemolithotrophic organisms. In the last three decades, substantial research has been carried out to discover chemical and biological conversion of CO2 into various synthetic and biological materials, such as carboxylic acids, esters, lactones, polymers, biodiesel, bio-plastics, bio-alcohols and exopolysaccharides. This review presents an overview of the catalytic transformation of CO2 into biofuels and biomaterials by chemical and biological methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. Numerical Methods Application for Reinforced Concrete Elements-Theoretical Approach for Direct Stiffness Matrix Method

    Directory of Open Access Journals (Sweden)

    Sergiu Ciprian Catinas

    2015-07-01

    Full Text Available A detailed theoretical and practical investigation of reinforced concrete elements is warranted by the recent techniques and methods being implemented in the construction market. Moreover, a theoretical study responds to the demand for a better and faster approach, given the rapid development of computational techniques. This paper presents a study on implementing the direct stiffness matrix method in static analysis, so as to address phenomena related to different stages of loading, rapid changes of cross-section area, and changes of physical properties. The method is in demand because, at present, the FEM (Finite Element Method) is the only alternative for such an analysis, and FEM is considered expensive in terms of time and computational resources. The main goal of the method is to create the moment-curvature diagram for the cross section being analyzed. The paper presents some of the most important techniques, as well as new ideas, for creating the moment-curvature graph in the cross sections considered.
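
    For orientation, the sketch below applies the direct stiffness method to the simplest possible case: a two-element axial bar with an abrupt change of cross-section, a fixed end and an end load. The section properties are invented; the example only shows the assemble-partition-solve pattern, not the paper's reinforced-concrete formulation.

    ```python
    import numpy as np

    # Two axial bar elements in series: node 0 -- element 1 -- node 1 -- element 2 -- node 2.
    E = 30e9            # Pa, assumed concrete modulus
    A = [0.09, 0.04]    # m^2, cross-section areas of the two elements (abrupt change)
    L = [2.0, 2.0]      # m, element lengths

    K = np.zeros((3, 3))
    for e, (a, l) in enumerate(zip(A, L)):
        k = E * a / l * np.array([[1, -1], [-1, 1]])   # element stiffness matrix
        dofs = [e, e + 1]
        K[np.ix_(dofs, dofs)] += k                     # assemble into the global matrix

    F = np.array([0.0, 0.0, 100e3])   # 100 kN axial load at the free end (node 2)

    # Node 0 is fixed: partition out its row/column and solve for the free DOFs.
    free = [1, 2]
    u = np.zeros(3)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], F[free])
    print("nodal displacements [m]:", u)
    print("element forces [N]:",
          [E * a / l * (u[e + 1] - u[e]) for e, (a, l) in enumerate(zip(A, L))])
    ```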

  9. Cumulative Risk Assessment Toolbox: Methods and Approaches for the Practitioner

    Directory of Open Access Journals (Sweden)

    Margaret M. MacDonell

    2013-01-01

    Full Text Available The historical approach to assessing health risks of environmental chemicals has been to evaluate them one at a time. In fact, we are exposed every day to a wide variety of chemicals and are increasingly aware of potential health implications. Although considerable progress has been made in the science underlying risk assessments for real-world exposures, implementation has lagged because many practitioners are unaware of methods and tools available to support these analyses. To address this issue, the US Environmental Protection Agency developed a toolbox of cumulative risk resources for contaminated sites, as part of a resource document that was published in 2007. This paper highlights information for nearly 80 resources from the toolbox and provides selected updates, with practical notes for cumulative risk applications. Resources are organized according to the main elements of the assessment process: (1) planning, scoping, and problem formulation; (2) environmental fate and transport; (3) exposure analysis extending to human factors; (4) toxicity analysis; and (5) risk and uncertainty characterization, including presentation of results. In addition to providing online access, plans for the toolbox include addressing nonchemical stressors and applications beyond contaminated sites and further strengthening resource accessibility to support evolving analyses for cumulative risk and sustainable communities.

  10. A full digital approach to the TDCR method

    International Nuclear Information System (INIS)

    Mini, Giuliano; Pepe, Francesco; Tintori, Carlo; Capogni, Marco

    2014-01-01

    Current state-of-the-art solutions based on the Triple-to-Double Coincidence Ratio method are generally large, heavy, non-transportable systems. This is due, on one side, to large detectors and scintillation chambers and, on the other, to bulky analog electronics for data acquisition. CAEN developed a new, fully digital approach to the TDCR technique based on a portable, stand-alone, high-speed multichannel digitizer, on-board Digital Pulse Processing and dedicated DAQ software that emulates the well-known MAC3 analog board. - Highlights: • CAEN desktop digitizers used to emulate the MAC3 analog board in TDCR acquisition. • Spectroscopic application of the CAEN digitizers to the TDCR for charge spectra. • Development of two different software packages by CAEN and ENEA-INMRI for TDCR analysis. • Single-electron peak obtained by the CAEN digitizer and the ENEA-INMRI portable TDCR. • Measurements of 90Sr/90Y by the new TDCR device equipped with CAEN digitizers

  11. A Cubesat enabled Spatio-Temporal Enhancement Method (CESTEM) utilizing Planet, Landsat and MODIS data

    KAUST Repository

    Houborg, Rasmus

    2018-03-19

    Satellite sensing in the visible to near-infrared (VNIR) domain has been the backbone of land surface monitoring and characterization for more than four decades. However, a limitation of conventional single-sensor satellite missions is their limited capacity to observe land surface dynamics at the very high spatial and temporal resolutions demanded by a wide range of applications. One solution to this spatio-temporal divide is an observation strategy based on the CubeSat standard, which facilitates constellations of small, inexpensive satellites. Repeatable near-daily image capture in RGB and near-infrared (NIR) bands at 3–4 m resolution has recently become available via a constellation of >130 CubeSats operated commercially by Planet. While the observing capacity afforded by this system is unprecedented, the relatively low radiometric quality and cross-sensor inconsistencies represent key challenges in the realization of their full potential as a game changer in Earth observation. To address this issue, we developed a Cubesat Enabled Spatio-Temporal Enhancement Method (CESTEM) that uses a multi-scale machine-learning technique to correct for radiometric inconsistencies between CubeSat acquisitions. The CESTEM produces Landsat 8 consistent atmospherically corrected surface reflectances in blue, green, red, and NIR bands, but at the spatial scale and temporal frequency of the CubeSat observations. An application of CESTEM over an agricultural dryland system in Saudi Arabia demonstrated CubeSat-based reproduction of Landsat 8 consistent VNIR data with an overall relative mean absolute deviation of 1.6% or better, even when the Landsat 8 and CubeSat acquisitions were temporally displaced by >32 days. The consistently high retrieval accuracies were achieved using a multi-scale target sampling scheme that draws Landsat 8 reference data from a series of scenes by using MODIS-consistent surface reflectance time series to quantify relative changes in Landsat
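
    The correction step can be pictured with a simplified sketch (not the actual CESTEM algorithm): a machine-learning regressor is trained on coincident samples to map CubeSat band values onto Landsat-consistent surface reflectance, and its accuracy is summarized with a relative mean absolute deviation. All data below are synthetic assumptions.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)

    # Synthetic stand-in data: CubeSat blue/green/red/NIR values with sensor-specific
    # gain, offset and noise, paired with "reference" Landsat surface reflectances.
    n = 5000
    landsat = rng.uniform(0.02, 0.45, size=(n, 4))           # target surface reflectance
    gain = np.array([1.15, 0.92, 1.08, 0.85])
    offset = np.array([0.01, -0.02, 0.015, 0.03])
    cubesat = landsat * gain + offset + rng.normal(0, 0.01, (n, 4))

    X_tr, X_te, y_tr, y_te = train_test_split(cubesat, landsat, test_size=0.3,
                                              random_state=0)

    # One regressor per band, mapping all CubeSat bands to that Landsat band.
    models = [RandomForestRegressor(n_estimators=100, random_state=0).fit(X_tr, y_tr[:, b])
              for b in range(4)]
    pred = np.column_stack([m.predict(X_te) for m in models])

    mad = np.mean(np.abs(pred - y_te) / y_te) * 100
    print("relative mean absolute deviation: %.1f%%" % mad)
    ```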

  12. Vibration reduction methods and techniques for rotorcraft utilizing on-blade active control, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Rotor blades adapted for vibration control have the added benefit of extended blade and rotor life, as well as improved passenger comfort. Approaches that have been...

  13. Search for Heavy Stable Charged Particles at $\\sqrt{s}$ = 13 TeV Utilizing a Multivariate Approach

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00375809

    Heavy stable charged particles (HSCPs) have been searched for at the Large Hadron Collider since its initial data taking in 2010. The search for heavy stable charged particles provides a means of directly probing the new physics realm, as they produce a detector signature unlike any particle discovered to date. The goal of this research is to investigate an idea that was introduced in the later stages of the 2010-2012 data-taking period. Rather than utilizing the current tight selection on the calculated particle mass, the hypothesis is that by incorporating a multivariate approach, specifically an artificial neural network, the remaining selection criteria could be loosened, allowing for a greater signal acceptance while maintaining acceptable background rejection via the multivariate discriminator from the artificial neural network. The increase in signal acceptance and retention or increase in background rejection increases the discovery potential for HSCPs and, as a secondary objective, calculates improved limit...

  14. Robust Optimization Approach for Design for a Dynamic Cell Formation Considering Labor Utilization: Bi-objective Mathematical Mode

    Directory of Open Access Journals (Sweden)

    Hiwa Farughi

    2016-05-01

    Full Text Available In this paper, robust optimization of a bi-objective mathematical model of a dynamic cell formation problem considering labor utilization with uncertain data is carried out. The robust approach is used to reduce the effects of fluctuations of the uncertain parameters with regard to all possible future scenarios. In this research, the cost parameters of cell formation and demand fluctuations are subject to uncertainty, and a mixed-integer programming (MIP) model is developed to formulate the related robust dynamic cell formation problem. The problem is then transformed into a bi-objective linear one. The first objective function seeks to minimize the relevant costs of the problem, including machine procurement and relocation costs, machine variable cost, inter-cell and intra-cell movement costs, overtime cost, labor shifting cost between cells, machine maintenance cost, and inventory holding cost of parts. The second objective function seeks to minimize total man-hour deviations between cells, that is, the labor utilization of the model.

  15. Embedding Systems Thinking into EWB Project Planning and Development: Assessing the Utility of a Group Model Building Approach

    Directory of Open Access Journals (Sweden)

    Kimberly Pugel

    2017-11-01

    Full Text Available Amongst growing sociotechnical efforts, engineering students and professionals both in the international development sector and industry are challenged to approach projects more holistically to achieve project goals.  Engineering service learning organizations must similarly adapt their technological projects to consider varying cultural and economic structures, ensuring more resilient social progress within development efforts.  In practice, systems thinking approaches can be utilized to model the social, economic, political, and technological implications that influence the sustainability of an engineering project. This research assesses the utility of integrating systems thinking into Engineers Without Borders (EWB project planning and development, thereby improving project impact and more effectively engaging members.  At a workshop held at an EWB-USA 2016 Regional Conference, the authors presented a planning and evaluation framework that applies group model building with system dynamics to foster systems thinking through factor diagramming and analysis. To assess the added value of the framework for EWB project planning and development, extensive participant feedback was gathered and evaluated during the workshop and through an optional post-workshop survey.  Supported by thoughtful observations and feedback provided by the EWB members, the model building workshop appeared to help participants reveal and consider project complexities by both visually and quantitatively identifying key non-technical and technical factors that influence project sustainability.  Therefore, system dynamics applied in a group model building workshop offers a powerful supplement to traditional EWB project planning and assessment activities, providing a systems-based tool for EWB teams and partner communities to build capacity and create lasting change.

  16. Enrichment, development, and assessment of Indian basil oil based antiseptic cream formulation utilizing hydrophilic-lipophilic balance approach.

    Science.gov (United States)

    Yadav, Narayan Prasad; Meher, Jaya Gopal; Pandey, Neelam; Luqman, Suaib; Yadav, Kuldeep Singh; Chanda, Debabrata

    2013-01-01

    The present work aimed to develop an antiseptic cream formulation of Indian basil oil utilizing the hydrophilic-lipophilic balance approach. In order to determine the required hydrophilic-lipophilic balance (rHLB) of basil oil, emulsions of basil oil were prepared by the phase inversion temperature technique using water, Tween 80, and Span 80. Formulated emulsions were assessed for creaming (BE9; 9.8, BE10; 10.2), droplet size (BE18; 3.22 ± 0.09 μm), and turbidity (BE18; 86.12 ± 2.1%). To ensure the correctness of the applied methodology, the rHLB of light liquid paraffin was also determined. After rHLB determination, basil oil creams were prepared with two different combinations of surfactants, namely GMS : Tween 80 (1 : 3.45) and SLS : GMS (1 : 3.68), and evaluated for in vitro antimicrobial activity, skin irritation, viscosity, and consistency. The rHLB values of basil oil and light liquid paraffin were found to be 13.36 ± 0.36 and 11.5 ± 0.35, respectively. Viscosity and consistency parameters of the creams were found to be consistent over 90 days. Cream formulations showed a net zone of growth inhibition in the range of 5.0-11.3 mm against bacteria and 4.3-7.6 mm against fungi. The primary irritation index was found to be between 0.38 and 1.05. In conclusion, stable, consistent, non-irritant, enriched antiseptic basil oil cream formulations were developed utilizing the HLB approach.
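
    The blend arithmetic behind an rHLB determination is straightforward: the HLB of a surfactant mixture is the weight-fraction-weighted average of the component HLBs, so the fraction of the high-HLB surfactant needed to hit a target value follows directly. The sketch below uses commonly quoted HLB values for Tween 80 and Span 80 as assumptions rather than values from the paper.

    ```python
    def blend_fraction(target_hlb, hlb_high, hlb_low):
        """Weight fraction f of the high-HLB surfactant so that
        f*hlb_high + (1-f)*hlb_low == target_hlb."""
        return (target_hlb - hlb_low) / (hlb_high - hlb_low)

    # Commonly quoted HLB values (assumed here, not taken from the study).
    TWEEN_80, SPAN_80 = 15.0, 4.3

    rhlb_basil_oil = 13.36      # required HLB reported above for Indian basil oil
    f = blend_fraction(rhlb_basil_oil, TWEEN_80, SPAN_80)
    print("Tween 80 fraction: %.2f, Span 80 fraction: %.2f" % (f, 1 - f))
    print("check: blend HLB = %.2f" % (f * TWEEN_80 + (1 - f) * SPAN_80))
    ```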

  17. Estimating Utility

    DEFF Research Database (Denmark)

    Arndt, Channing; Simler, Kenneth R.

    2010-01-01

    A fundamental premise of absolute poverty lines is that they represent the same level of utility through time and space. Disturbingly, a series of recent studies in middle- and low-income economies show that even carefully derived poverty lines rarely satisfy this premise. This article proposes an information-theoretic approach to estimating cost-of-basic-needs (CBN) poverty lines that are utility consistent. Applications to date illustrate that utility-consistent poverty measurements derived from the proposed approach and those derived from current CBN best practices often differ substantially, with the current approach tending to systematically overestimate (underestimate) poverty in urban (rural) zones.

  18. Methods and energy storage devices utilizing electrolytes having surface-smoothing additives

    Science.gov (United States)

    Xu, Wu; Zhang, Jiguang; Graff, Gordon L; Chen, Xilin; Ding, Fei

    2015-11-12

    Electrodeposition and energy storage devices utilizing an electrolyte having a surface-smoothing additive can result in self-healing, instead of self-amplification, of initial protuberant tips that give rise to roughness and/or dendrite formation on the substrate and anode surface. For electrodeposition of a first metal (M1) on a substrate or anode from one or more cations of M1 in an electrolyte solution, the electrolyte solution is characterized by a surface-smoothing additive containing cations of a second metal (M2), wherein cations of M2 have an effective electrochemical reduction potential in the solution lower than that of the cations of M1.

  19. A Tools-Based Approach to Teaching Data Mining Methods

    Science.gov (United States)

    Jafar, Musa J.

    2010-01-01

    Data mining is an emerging field of study in Information Systems programs. Although the course content has been streamlined, the underlying technology is still in a state of flux. The purpose of this paper is to describe how we utilized Microsoft Excel's data mining add-ins as a front-end to Microsoft's Cloud Computing and SQL Server 2008 Business…

  20. Optimization of PID Parameters Utilizing Variable Weight Grey-Taguchi Method and Particle Swarm Optimization

    Science.gov (United States)

    Azmi, Nur Iffah Mohamed; Arifin Mat Piah, Kamal; Yusoff, Wan Azhar Wan; Romlay, Fadhlur Rahman Mohd

    2018-03-01

    A controller that uses PID parameters requires a good tuning method in order to improve control system performance. PID tuning methods are divided into two groups, namely classical methods and artificial intelligence methods. The particle swarm optimization (PSO) algorithm is one of the artificial intelligence methods. Previously, researchers had integrated PSO algorithms into the PID parameter tuning process. This research aims to improve PSO-PID tuning algorithms by integrating the tuning process with the Variable Weight Grey-Taguchi Design of Experiment (DOE) method. This is done by conducting the DOE on the two PSO optimization parameters: the particle velocity limit and the weight distribution factor. Computer simulations and physical experiments were conducted using the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE and the classical Ziegler-Nichols methods, implemented on a hydraulic positioning system. Simulation results show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE reduced the rise time by 48.13% and the settling time by 48.57% compared to the Ziegler-Nichols method. Furthermore, the physical experiment results also show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE tuning method responds better than Ziegler-Nichols tuning. In conclusion, this research has improved the PSO-PID parameters by applying the PSO-PID algorithm together with the Variable Weight Grey-Taguchi DOE method as a tuning method in the hydraulic positioning system.
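
    To show the core of the PSO-PID idea, the sketch below tunes PID gains for a simulated first-order plant by minimizing the ITAE criterion with a plain particle swarm. The plant, bounds and PSO constants are assumptions for illustration, and the Variable Weight Grey-Taguchi DOE layer described above is not included.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def itae_cost(gains, dt=0.01, t_end=10.0, setpoint=1.0, tau=1.5, gain=2.0):
        """Simulate a first-order plant tau*dy/dt = -y + gain*u under PID control
        and return the ITAE (integral of time-weighted absolute error)."""
        kp, ki, kd = gains
        y = integ = prev_err = 0.0
        cost = 0.0
        for step in range(int(t_end / dt)):
            err = setpoint - y
            integ += err * dt
            deriv = (err - prev_err) / dt
            u = kp * err + ki * integ + kd * deriv
            prev_err = err
            y += dt * (-y + gain * u) / tau
            if not np.isfinite(y) or abs(y) > 1e6:
                return 1e9                      # penalize unstable gain combinations
            cost += (step * dt) * abs(err) * dt
        return cost

    # Plain particle swarm optimization over (Kp, Ki, Kd).
    n_particles, n_iter, dims = 20, 60, 3
    lo, hi = np.zeros(dims), np.array([10.0, 10.0, 2.0])
    x = rng.uniform(lo, hi, (n_particles, dims))
    v = np.zeros_like(x)
    pbest, pbest_cost = x.copy(), np.array([itae_cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()

    w, c1, c2, v_max = 0.7, 1.5, 1.5, 2.0   # inertia, cognitive/social weights, velocity limit
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dims))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        v = np.clip(v, -v_max, v_max)
        x = np.clip(x + v, lo, hi)
        cost = np.array([itae_cost(p) for p in x])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = x[improved], cost[improved]
        gbest = pbest[pbest_cost.argmin()].copy()

    print("tuned gains Kp, Ki, Kd:", np.round(gbest, 3), "ITAE:", round(pbest_cost.min(), 4))
    ```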

  1. Research of environmentally-friendly utilization methods of the crushed stone waste on granite quarries

    Directory of Open Access Journals (Sweden)

    Levytskyi V.G.

    2017-12-01

    Full Text Available The analysis of the activity of stone-mining enterprises shows the low competitiveness of crushed stone products. Upgrading the quality of crushed stone and producing fractions to the European standard requires the use of new technologies and equipment. The main waste of crushed stone pits is crushed granite waste, whose high yield is caused by outdated equipment and incorrectly selected technological parameters of the crushing process. Crushed granite waste is stored in dumps, which occupy large areas and negatively affect the ecology of the production area. In November 2017, the Government of Ukraine adopted the National Waste Management Strategy until 2030, the main aim of which is to develop a strategy for the balanced use of mineral raw materials and the introduction of international standards at the national level. Therefore, the problem of complex utilization and recycling of waste from stone-mining enterprises, with a qualitative secondary product obtained, is relevant. The publication presents the crushed granite waste volumes by crushed stone pit, its properties and the main directions of its utilization. The ecological influence of waste dumps, in particular granite dust, on the environment and humans, the strategy of using non-waste technologies, and the ecological features of the secondary processing of crushed granite waste are considered.

  2. Impact of phosphate-solubilizing bacteria inoculation methods on phosphorus transformation and long-term utilization in composting.

    Science.gov (United States)

    Wei, Yuquan; Zhao, Yue; Fan, Yuying; Lu, Qian; Li, Mingxiao; Wei, Qingbin; Zhao, Yi; Cao, Zhenyu; Wei, Zimin

    2017-10-01

    This study aimed to assess the effect of phosphate-solubilizing bacteria (PSB) application and inoculation methods on rock phosphate (RP) solubilization and bacterial community during composting. The results showed that PSB inoculation in different stages of composting, especially both in the beginning and cooling stages, not only improved the diversity and abundance of PSB and bacterial community, but also distinctly increased the content of potential available phosphorus. Redundancy analysis indicated that the combined inoculation of PSB in the initial stage with higher inoculation amount and in the cooling stage with lower inoculation amount was the best way to improve the inoculation effect and increase the solubilization and utilization of RP during composting. Besides, we suggested three methods to improve phosphorus transformation and long-term utilization efficiency in composts based on biological fixation of phosphates by humic substance and phosphate-accumulating organisms. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. How Iranian Instructors Teach L2 Pragmatics in Their Classroom Practices? A Mixed-Methods Approach

    Science.gov (United States)

    Muthasamy, Paramasivam; Farashaiyan, Atieh

    2016-01-01

    This study examined the teaching approaches and techniques that Iranian instructors utilize for teaching L2 pragmatics in their classroom practices. 238 Iranian instructors participated in this study. The data for this study were accumulated through questionnaire and semi-structured interviews. In terms of the instructional approaches, both the…

  4. Train Stop Scheduling in a High-Speed Rail Network by Utilizing a Two-Stage Approach

    Directory of Open Access Journals (Sweden)

    Huiling Fu

    2012-01-01

    Full Text Available Among the most commonly used methods of scheduling train stops are practical experience and various “one-step” optimal models. These methods face problems of direct transferability and computational complexity when considering a large-scale high-speed rail (HSR network such as the one in China. This paper introduces a two-stage approach for train stop scheduling with a goal of efficiently organizing passenger traffic into a rational train stop pattern combination while retaining features of regularity, connectivity, and rapidity (RCR. Based on a three-level station classification definition, a mixed integer programming model and a train operating tactics descriptive model along with the computing algorithm are developed and presented for the two stages. A real-world numerical example is presented using the Chinese HSR network as the setting. The performance of the train stop schedule and the applicability of the proposed approach are evaluated from the perspective of maintaining RCR.

  5. Utilization of the perturbation method for determination of the buckling heterogenous reactors

    International Nuclear Information System (INIS)

    Gheorghe, R.

    1975-01-01

    Evaluation of material buckling for heterogeneous nuclear reactors is a key problem for reactor physicists. Several methods have been elaborated in this direction: the two-group method, the heterogeneous method and perturbation methods. Of these, the most commonly employed is the perturbation method, which is also presented in this paper and is applied to some parameter calculations for a new cell type in which the fuel is positioned in the marginal area and the moderator is in the centre. It is based on the technique of progressive substitution. Advantages of the method: the buckling comes out clearly; higher-order defects due to differences between the perturbed fluxes and the unperturbed flux can be corrected by an iterative procedure; and, using a modified two-group theory, one can clearly describe the effects of other parameters.

  6. A Selection Approach for Optimized Problem-Solving Process by Grey Relational Utility Model and Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chih-Kun Ke

    2012-01-01

    Full Text Available In business enterprises, especially in the manufacturing industry, various problem situations may occur during the production process. A situation denotes an evaluation point used to determine the status of a production process. A problem may occur if there is a discrepancy between the actual situation and the desired one. Thus, a problem-solving process is often initiated to achieve the desired situation. In this process, determining which action should be taken to resolve the situation becomes an important issue. Therefore, this work uses a selection approach for an optimized problem-solving process to assist workers in taking a reasonable action. A grey relational utility model and a multicriteria decision analysis are used to determine the optimal selection order of candidate actions. The selection order is presented to the worker as an adaptive recommended solution, and the worker chooses a reasonable problem-solving action based on it. This work uses a high-tech company's knowledge base log as the analysis data. Experimental results demonstrate that the proposed selection approach is effective.
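
    The grey relational part of the ranking can be sketched as follows: each criterion is normalized so that 1 is the preferred end, deviations from the ideal reference series are converted to grey relational coefficients, and candidate actions are ranked by their weighted grey relational grade. The data, criteria and weights below are invented for illustration.

    ```python
    import numpy as np

    # Rows: candidate actions; columns: criteria (e.g. expected fix time, cost, success rate).
    # The first two criteria are "smaller is better", the last is "larger is better".
    data = np.array([[4.0, 120.0, 0.85],
                     [6.0,  80.0, 0.90],
                     [3.0, 150.0, 0.70],
                     [5.0, 100.0, 0.95]])
    larger_is_better = np.array([False, False, True])
    weights = np.array([0.4, 0.3, 0.3])
    zeta = 0.5   # distinguishing coefficient, conventionally 0.5

    # Normalize every criterion to [0, 1] so that 1 is always the preferred end.
    lo, hi = data.min(axis=0), data.max(axis=0)
    norm = np.where(larger_is_better, (data - lo) / (hi - lo), (hi - data) / (hi - lo))

    # Deviation from the ideal reference series (all ones) and grey relational coefficients.
    delta = np.abs(1.0 - norm)
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    # Weighted grey relational grade -> selection order (higher grade ranks first).
    grade = grc @ weights
    order = np.argsort(-grade)
    for rank, a in enumerate(order, 1):
        print(f"rank {rank}: action {a}  grade = {grade[a]:.3f}")
    ```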

  7. Identification of evolutionarily conserved Momordica charantia microRNAs using computational approach and its utility in phylogeny analysis.

    Science.gov (United States)

    Thirugnanasambantham, Krishnaraj; Saravanan, Subramanian; Karikalan, Kulandaivelu; Bharanidharan, Rajaraman; Lalitha, Perumal; Ilango, S; HairulIslam, Villianur Ibrahim

    2015-10-01

    Momordica charantia (bitter gourd, bitter melon) is a monoecious Cucurbitaceae with anti-oxidant, anti-microbial, anti-viral and anti-diabetic potential. Molecular studies on this economically valuable plant are essential to understand its phylogeny and evolution. MicroRNAs (miRNAs) are conserved, small, non-coding RNAs with the ability to regulate gene expression by binding the 3' UTR region of target mRNAs; they evolve at different rates in different plant species. In this study we have utilized a homology-based computational approach and identified 27 mature miRNAs for the first time from this biomedically important plant. The phylogenetic tree developed from binary data on the presence/absence of the identified miRNAs was found to be uncertain and biased. Most of the identified miRNAs were highly conserved among plant species, and sequence-based phylogeny analysis of the miRNAs resolved the above difficulties in the miRNA-based phylogeny approach. Predicted gene targets of the identified miRNAs revealed their importance in the regulation of plant developmental processes. The reported miRNAs showed sequence conservation in their mature forms, and detailed phylogeny analysis of the pre-miRNA sequences revealed genus-specific segregation of clusters. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Series: Utilization of Differential Equations and Methods for Solving Them in Medical Physics (4).

    Science.gov (United States)

    Murase, Kenya

    2016-01-01

    Partial differential equations are often used in the field of medical physics. In this (final) issue, the methods for solving the partial differential equations were introduced, which include separation of variables, integral transform (Fourier and Fourier-sine transforms), Green's function, and series expansion methods. Some examples were also introduced, in which the integral transform and Green's function methods were applied to solving Pennes' bioheat transfer equation and the Fourier series expansion method was applied to Navier-Stokes equation for analyzing the wall shear stress in blood vessels. Finally, the author hopes that this series will be helpful for people who engage in medical physics.

  9. Series: Utilization of Differential Equations and Methods for Solving Them in Medical Physics (3).

    Science.gov (United States)

    Murase, Kenya

    2016-01-01

    In this issue, simultaneous differential equations were introduced. These differential equations are often used in the field of medical physics. The methods for solving them were also introduced, which include Laplace transform and matrix methods. Some examples were also introduced, in which Laplace transform and matrix methods were applied to solving simultaneous differential equations derived from a three-compartment kinetic model for analyzing the glucose metabolism in tissues and Bloch equations for describing the behavior of the macroscopic magnetization in magnetic resonance imaging. In the next (final) issue, partial differential equations and various methods for solving them will be introduced together with some examples in medical physics.
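
    As a concrete instance of the matrix method for simultaneous differential equations, the sketch below writes a three-compartment kinetic model as dx/dt = Ax and evaluates the closed-form solution x(t) = exp(At)x(0) with the matrix exponential; the rate constants are invented for illustration.

    ```python
    import numpy as np
    from scipy.linalg import expm

    # Three-compartment model dx/dt = A x with assumed first-order rate constants (1/min).
    k12, k21, k13, k31, k10 = 0.3, 0.1, 0.2, 0.05, 0.15
    A = np.array([[-(k12 + k13 + k10), k21,  k31],
                  [k12,               -k21,  0.0],
                  [k13,                0.0, -k31]])

    x0 = np.array([1.0, 0.0, 0.0])   # unit dose placed in compartment 1 at t = 0

    for t in (0.0, 1.0, 5.0, 20.0):
        x_t = expm(A * t) @ x0       # closed-form solution of the linear ODE system
        print(f"t = {t:5.1f} min  amounts = {np.round(x_t, 4)}")
    ```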

  10. Research design: qualitative, quantitative and mixed methods approaches. Creswell John W. Sage, 320pp, £29, ISBN 0761924426.

    Science.gov (United States)

    2004-09-01

    The second edition of Creswell's book has been significantly revised and updated. The author clearly sets out three approaches to research: quantitative, qualitative and mixed methods. As someone who has used mixed methods in my research, it is refreshing to read a textbook that addresses this. The differences between the approaches are clearly identified and a rationale for using each methodological stance provided.

  11. Utility of the physical examination in detecting pulmonary hypertension. A mixed methods study.

    Science.gov (United States)

    Colman, Rebecca; Whittingham, Heather; Tomlinson, George; Granton, John

    2014-01-01

    Patients with pulmonary hypertension (PH) often present with a variety of physical findings reflecting a volume or pressure overloaded right ventricle (RV). However, there is no consensus regarding the diagnostic utility of the physical examination in PH. We conducted a systematic review of publications that evaluated the clinical examination and diagnosis of PH using MEDLINE (1946-2013) and EMBASE (1947-2013). We also prospectively evaluated the diagnostic utility of the physical examination findings. Patients who underwent right cardiac catheterization for any reason were recruited. After informed consent, participants were examined by 6 physicians (3 "specialists" and 3 "generalists") who were unaware of the results of the patient's hemodynamics. Each examiner independently assessed patients for the presence of an RV lift, loud P2, jugular venous distension (JVD), tricuspid insufficiency murmur and right-sided 4th heart sound at rest and during a slow inspiration. A global rating (scale of 1-5) of the likelihood that the patient had pulmonary hypertension was provided by each examiner. 31 articles that assessed the physical examination in PH were included in the final analysis. There was heterogeneity amongst the studies and many did not include control data. The sign most associated with PH in the literature was a loud pulmonic component of the second heart sound (P2). In our prospective study, physical examination was performed on 52 subjects (25 met criteria for PH; mPAP ≥ 25 mmHg). The physical sign with the highest likelihood ratio (LR) was a loud P2 on inspiration, with a LR +ve of 1.9, 95% CrI [1.2, 3.1], when data from all examiners were analyzed together. Results from the specialist examiners had higher diagnostic utility; a loud P2 on inspiration was associated with a positive LR of 3.2, 95% CrI [1.5, 6.2], and a right-sided S4 on inspiration had a LR +ve of 4.7, 95% CI [1.0, 15.6]. No aspect of the physical exam could consistently rule out PH (negative LRs 0
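
    For orientation, a likelihood ratio is applied by converting the pretest probability to odds, multiplying by the LR, and converting back. Using the cohort's pretest prevalence of 25/52 together with the specialists' LR of 3.2 for a loud P2 on inspiration gives (a generic Bayesian calculation added for illustration, not a result reported by the study)

      \text{post-test odds} = \frac{p}{1-p} \times LR = \frac{25/52}{27/52} \times 3.2 \approx 2.96, \qquad \text{post-test probability} = \frac{2.96}{1 + 2.96} \approx 0.75.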

  12. New method of determining the thermal utilization factor of a cell; Nouvelle methode de determination du facteur d'utilisation thermique d'une cellule

    Energy Technology Data Exchange (ETDEWEB)

    Amouyal, A; Benoist, P [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1956-07-01

    A new formula for the thermal utilization factor is derived, which, while comparable in simplicity to the formula given by elementary diffusion theory, furnishes much more precise results. This is clearly brought out by comparison with the results given by the S{sub n} and spherical harmonics methods. (author) [French] Une nouvelle expression du facteur d'utilisation thermique, d'une simplicite comparable a celle de la theorie elementaire, est etablie. La comparaison avec les resultats fournis par la methode S{sub n} et les methodes d'harmoniques spheriques montre que la precision obtenue par cette formule est tres superieure a celle que donne la theorie elementaire. (auteur)

  13. Methods for Analyzing the Benefits and Costs of Distributed Photovoltaic Generation to the U.S. Electric Utility System

    Energy Technology Data Exchange (ETDEWEB)

    Denholm, P.; Margolis, R.; Palmintier, B.; Barrows, C.; Ibanez, E.; Bird, L.; Zuboy, J.

    2014-09-01

    This report outlines the methods, data, and tools that could be used at different levels of sophistication and effort to estimate the benefits and costs of DGPV. In so doing, we identify the gaps in current benefit-cost-analysis methods, which we hope will inform the ongoing research agenda in this area. The focus of this report is primarily on benefits and costs from the utility or electricity generation system perspective. It is intended to provide useful background information to utility and regulatory decision makers and their staff, who are often being asked to use or evaluate estimates of the benefits and cost of DGPV in regulatory proceedings. Understanding the technical rigor of the range of methods and how they might need to evolve as DGPV becomes a more significant contributor of energy to the electricity system will help them be better consumers of this type of information. This report is also intended to provide information to utilities, policy makers, PV technology developers, and other stakeholders, which might help them maximize the benefits and minimize the costs of integrating DGPV into a changing electricity system.

  14. Effect of soil type and application method on nutrients absorption and utilization by grape plants 1-Absorption and utilization of manganese using Mn-54

    International Nuclear Information System (INIS)

    Mohamed, F.A.; Sharaf, A.N.M.; Khamis, M.A.; Sharaf, M.M.

    2000-01-01

    This work was conducted to study the effect of soil type and application method on the absorption, translocation and utilization of Mn by grape plants. One-year-old rooted cuttings of grape (cv. Ruby Seedless) were transplanted into plastic containers filled with 15 kg of three different soils, i.e. clay loam soil, sandy soil and calcareous soil. Fertilization treatments were as follows: tap water (control) (T1); soil application of N, P, K and Mg (T2); T2 plus soil application of Fe, Mn and Zn (T3); and T2 plus foliar application of Fe, Mn and Zn (T4). Also, a pot experiment was carried out using MnSO4 at 5 ppm for soil application and at 0.5% for foliar application, and Mn-54 was used for labelling both solutions. Manganese contents in different organs of the grape plant were significantly increased by the three fertilization treatments as compared with the control. Moreover, the highest Mn level was obtained with foliar application of micro-elements and soil application of macro-elements (T4), followed by soil application of both macro- and micro-elements (T3), soil application of macro-elements only (T2) and the control (T1).

  15. Paired comparisons analysis: an axiomatic approach to ranking methods

    NARCIS (Netherlands)

    Gonzalez-Diaz, J.; Hendrickx, Ruud; Lohmann, E.R.M.A.

    2014-01-01

    In this paper we present an axiomatic analysis of several ranking methods for general tournaments. We find that the ranking method obtained by applying maximum likelihood to the (Zermelo-)Bradley-Terry model, the most common method in statistics and psychology, is one of the ranking methods that

  16. Method for the detection of a magnetic field utilizing a magnetic vortex

    Science.gov (United States)

    Novosad, Valentyn [Chicago, IL; Buchanan, Kristen [Batavia, IL

    2010-04-13

    The determination of the strength of an in-plane magnetic field utilizing one or more magnetically soft, ferromagnetic members having a shape, size and material such that a single magnetic vortex is formed at remanence in each ferromagnetic member. The preferred shape is a thin circle, or dot. Multiple ferromagnetic members can also be stacked on top of each other and separated by a non-magnetic spacer. The resulting sensor is hysteresis free. The sensor's sensitivity and magnetic saturation characteristics may be easily tuned by simply altering the material, size, shape, or a combination thereof to match the desired sensitivity and saturation characteristics. The sensor is self-resetting at remanence and therefore does not require any pinning techniques.

  17. An examination of generalized anxiety disorder and dysthymia utilizing the Rorschach inkblot method.

    Science.gov (United States)

    Slavin-Mulford, Jenelle; Clements, Alyssa; Hilsenroth, Mark; Charnas, Jocelyn; Zodan, Jennifer

    2016-06-30

    This study examined transdiagnostic features of generalized anxiety disorder (GAD) and dysthymia in an outpatient clinical sample. Fifteen patients who met DSM-IV criteria for GAD and twenty-one patients who met DSM-IV criteria for dysthymia but who did not have comorbid anxiety disorder were evaluated utilizing the Rorschach. Salient clinical variables were then compared. Results showed that patients with GAD scored significantly higher on variables related to cognitive agitation and a desire/need for external soothing. In addition, there was a trend for patients with GAD to produce higher scores on a measure of ruminative focus on negative aspects of the self. Thus, not surprisingly, GAD patients experienced more distress than the dysthymic patients. The implications of these findings are discussed with regard to better understanding the shared and distinct features of GAD and dysthymia. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  18. Thermodynamic approach and comparison of two-step and single step DME (dimethyl ether) syntheses with carbon dioxide utilization

    International Nuclear Information System (INIS)

    Chen, Wei-Hsin; Hsu, Chih-Liang; Wang, Xiao-Dong

    2016-01-01

    DME (Dimethyl ether) synthesis from syngas with CO_2 utilization through two-step and single step processes is analyzed thermodynamically. The influences of reaction temperature, H_2/CO molar ratio, and CO_2/CO molar ratio on CO and CO_2 conversions, DME selectivity and yield, and thermal behavior are evaluated. Particular attention is paid to the comparison of the performance of DME synthesis between the two different methods. In the two-step method, the addition of CO_2 suppresses the CO conversion during methanol synthesis. An increase in CO_2/CO ratio decreases the CO_2 conversion (negative effect), but increases the total consumption amount of CO_2 (positive effect). At a given reaction temperature with H_2/CO = 4, the maximum DME yield develops at CO_2/CO = 1. In the single step method, over 98% of CO can be converted and the DME yield can be as high as 0.52 mol (mol CO)^-1 at CO_2/CO = 2. The comparison of the single step and two-step processes indicates that the maximum CO conversion, DME selectivity, and DME yield in the former are higher than those in the latter, whereas an opposite result in the maximum CO_2 conversion is observed. These results reveal that the single step process has a lower thermodynamic limitation and is a better option for DME synthesis. From the CO_2 utilization point of view, operation at low temperature, high H_2/CO ratio, and low CO_2/CO ratio results in higher CO_2 conversion, irrespective of two-step or single step DME synthesis. - Highlights: • DME (Dimethyl ether) synthesis with CO_2 utilization is analyzed thermodynamically. • Single step and two-step DME syntheses are studied and compared with each other. • CO_2 addition suppresses CO conversion in MeOH synthesis but increases MeOH yield. • The performance of the single step DME synthesis is better than that of the two-step one. • Increasing the CO_2/CO ratio decreases CO_2 conversion but increases the CO_2 consumption amount.
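
    For context, the reaction network underlying both routes is conventionally written as methanol synthesis, methanol dehydration and the water-gas shift, which sum to the single step overall reaction (standard stoichiometry, stated here for orientation rather than reproduced from the paper):

      \mathrm{CO} + 2\,\mathrm{H_2} \rightleftharpoons \mathrm{CH_3OH}, \qquad 2\,\mathrm{CH_3OH} \rightleftharpoons \mathrm{CH_3OCH_3} + \mathrm{H_2O}, \qquad \mathrm{CO} + \mathrm{H_2O} \rightleftharpoons \mathrm{CO_2} + \mathrm{H_2},

      3\,\mathrm{CO} + 3\,\mathrm{H_2} \rightleftharpoons \mathrm{CH_3OCH_3} + \mathrm{CO_2}.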

  19. Prediction method of unburnt carbon for coal fired utility boiler using image processing technique of combustion flame

    International Nuclear Information System (INIS)

    Shimoda, M.; Sugano, A.; Kimura, T.; Watanabe, Y.; Ishiyama, K.

    1990-01-01

    This paper reports on a method for predicting unburnt carbon in a coal-fired utility boiler, developed using an image processing technique. The method consists of an image processing unit and a furnace model unit. The temperature distribution of combustion flames can be obtained through the former unit. The latter calculates the dynamics of the carbon reduction from the burner stages to the furnace outlet using the coal feed rate, air flow rate, and chemical and ash content of the coal. An experimental study shows that the prediction error of the unburnt carbon can be reduced to 10%.

  20. Utilization and discontinuation of contraceptive methods: the University of Calabar Teaching Hospital (UCTH experience

    Directory of Open Access Journals (Sweden)

    Njoku CO

    2014-09-01

    Full Text Available Background: Contraception has an important role to play in reducing the high rate of maternal morbidity and mortality in developing countries. Objective: The objective is to determine the prevalence rate, methods and reasons for discontinuation of contraceptive methods at UCTH, Calabar. Method: This was a retrospective study of all clients that utilised different forms of contraceptives at UCTH, Calabar from 1st January, 2009 to 31st December, 2013. Results: A total of 5,381 clients used various methods of contraception while 13,492 live births were recorded, giving a prevalence rate of 39.9% of total live births. Common methods were the intrauterine contraceptive device (IUCD), 1,745 (32.8%), and injectable contraceptives, 1,268 (23.8%). Most clients, 1,876 (35.2%), were graduates while 81 (1.5%) had no formal education. A total of 535 (10.1%) clients discontinued different family planning methods, commonly due to desire for pregnancy and side effects. IUCD had the highest discontinuation rate. Conclusion: The study revealed a low prevalence rate of contraceptive use, which was even lower among teenagers and illiterate women. The main reasons for discontinuation of the different methods were desire for pregnancy, side effects and menopause. Creating more contraceptive awareness and improving contraceptive counselling and female education will help to improve the contraceptive utilisation rate and reduce the discontinuation rate.

  1. 02A. Design, Methods, and Outcomes for Recent Clinical Trials Utilizing Ayurvedic Medicine, Yoga, and Meditation

    Science.gov (United States)

    Saper, Robert; Vinjamury, Sivarama; Elder, Charles

    2013-01-01

    Focus Area: Integrative Approaches to Care. The panel discussants will present on the outcomes of four recent pragmatic trials covering the spectrum of Ayurvedic medicine, yoga, and meditation as therapeutic approaches for both acute and chronic conditions. The presenters will discuss: (1) a pilot study of a whole-systems Ayurveda and Yoga Therapy intervention for obesity; (2) a comparative effectiveness randomized controlled trial of hatha yoga, physical therapy, and education for non-specific chronic low back pain in low-income minority populations; (3) an investigation of the therapeutic usefulness of Shirodhara (Ayurvedic oil dripping therapy) as a treatment for insomnia; and (4) a discussion of the evidence base supporting implementation of meditation interventions in schools and workplace settings. Discussants will present information on study designs, research methodology, and outcome measure selection to highlight special considerations in conducting research on whole medical systems that use multi-target therapies and focus on patient-centered outcomes. Ayurvedic medicine and yoga are characterized by low-cost, noninvasive interventions that can be usefully offered as part of an integrative medicine therapeutic approach.

  2. A study of water in glass by an autoradiographic method that utilizes tritiated water

    International Nuclear Information System (INIS)

    Knickerbocker, S.H.; Brown, S.D.; Joshi, S.B.

    1983-01-01

    This chapter determines water concentration and spatial distribution in glass by an autoradiographic method that makes use of tritiated water as the tagged species. Describes the method and presents some typical results. Lists advantages and disadvantages associated with the method and examines other methods that might be used for the study of water in glass. Discusses dry glass preparation, the addition of tritiated water to glass, glass preparation, film selection, and film analysis. Shows tritium autoradiography to be a valuable technique for measuring the content and spatial distribution of water in inorganic glasses. Finds that the technique yields unique information, particularly in regard to spatial distribution, when compared with techniques of IR spectroscopy, SIMS, SIPS, NRRA, ESR and NMR. Points out that large areas (e.g., several square inches) of sample can be mapped in a single exposure. Notes that the spatial resolution of water in the glass network can be 10^-7 m, so very accurate diffusion profiles are obtainable.

  3. Method, apparatus, and system for utilizing augmented reality to improve surgery

    KAUST Repository

    Cali, Corrado; Besuchet, Jonathan

    2016-01-01

    of a patient, and superimposing, by a head-mounted display, a projection of the three dimensional reconstruction onto a field of view of a user. The method further includes maintaining alignment between the projection and the user's actual view

  4. Utilization of niching methods of genetic algorithms in nuclear reactor problems optimization

    International Nuclear Information System (INIS)

    Sacco, Wagner Figueiredo; Schirru, Roberto

    2000-01-01

    Genetic Algorithms (GAs) are biologically motivated adaptive systems which have been used, with good results, in function optimization. However, traditional GAs rapidly push an artificial population toward convergence. That is, all individuals in the population soon become nearly identical. Niching methods allow genetic algorithms to maintain a population of diverse individuals. GAs that incorporate these methods are capable of locating multiple, optimal solutions within a single population. The purpose of this study is to test existing niching techniques and two methods introduced herein, bearing in mind their eventual application in nuclear reactor related problems, especially the nuclear reactor core reload problem, which has multiple solutions. Tests are performed using widely known test functions and their results show that the new methods are quite promising, especially in real-world problems like the nuclear reactor core reload. (author)
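
    One classical niching technique of the kind evaluated in such studies is fitness sharing, in which an individual's raw fitness is divided by a niche count so that crowded regions of the search space are penalized and several optima can coexist in the population. The Python sketch below is illustrative only; the paper's own niching methods and test functions are not specified here.

      import numpy as np

      def shared_fitness(population, raw_fitness, sigma_share=0.1, alpha=1.0):
          """Fitness sharing: divide raw fitness by a niche count so that
          individuals within sigma_share of each other share fitness."""
          pop = np.asarray(population, dtype=float)
          fit = np.asarray(raw_fitness, dtype=float)
          # Pairwise Euclidean distances between individuals.
          dist = np.linalg.norm(pop[:, None, :] - pop[None, :, :], axis=-1)
          # Triangular sharing function: 1 at distance 0, 0 beyond sigma_share.
          share = np.where(dist < sigma_share, 1.0 - (dist / sigma_share) ** alpha, 0.0)
          niche_count = share.sum(axis=1)    # includes each individual's own contribution
          return fit / niche_count

      # Hypothetical 1-D population on a bimodal fitness landscape.
      x = np.random.uniform(-1.0, 1.0, size=(50, 1))
      f = np.exp(-(x[:, 0] - 0.5) ** 2 / 0.01) + np.exp(-(x[:, 0] + 0.5) ** 2 / 0.01)
      print(shared_fitness(x, f, sigma_share=0.2)[:5])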

  5. Methods of natural gas liquefaction and natural gas liquefaction plants utilizing multiple and varying gas streams

    Science.gov (United States)

    Wilding, Bruce M; Turner, Terry D

    2014-12-02

    A method of natural gas liquefaction may include cooling a gaseous NG process stream to form a liquid NG process stream. The method may further include directing the first tail gas stream out of a plant at a first pressure and directing a second tail gas stream out of the plant at a second pressure. An additional method of natural gas liquefaction may include separating CO.sub.2 from a liquid NG process stream and processing the CO.sub.2 to provide a CO.sub.2 product stream. Another method of natural gas liquefaction may include combining a marginal gaseous NG process stream with a secondary substantially pure NG stream to provide an improved gaseous NG process stream. Additionally, a NG liquefaction plant may include a first tail gas outlet, and at least a second tail gas outlet, the at least a second tail gas outlet separate from the first tail gas outlet.

  6. Method and apparatus for surface characterization and process control utilizing radiation from desorbed particles

    International Nuclear Information System (INIS)

    Feldman, L.C.; Kraus, J.S.; Tolk, N.H.; Traum, M.M.; Tully, J.C.

    1983-01-01

    Emission of characteristic electromagnetic radiation in the infrared, visible, or UV from excited particles, typically ions, molecules, or neutral atoms, desorbed from solid surfaces by an incident beam of low-momentum probe radiation has been observed. Disclosed is a method for characterizing solid surfaces based on the observed effect, with low-momentum probe radiation consisting of electrons or photons. Further disclosed is a method for controlling manufacturing processes that is also based on the observed effect. The latter method can, for instance, be advantageously applied in integrated circuit-, integrated optics-, and magnetic bubble device manufacture. Specific examples of applications of the method are registering of masks, control of a direct-writing processing beam, end-point detection in etching, and control of a processing beam for laser- or electron-beam annealing or ion implantation

  7. Utility of PCR, Culture, and Antigen Detection Methods for Diagnosis of Legionellosis.

    Science.gov (United States)

    Chen, Derrick J; Procop, Gary W; Vogel, Sherilynn; Yen-Lieberman, Belinda; Richter, Sandra S

    2015-11-01

    The goal of this retrospective study was to evaluate the performance of different diagnostic tests for Legionnaires' disease in a clinical setting where Legionella pneumophila PCR had been introduced. Electronic medical records at the Cleveland Clinic were searched for Legionella urinary antigen (UAG), culture, and PCR tests ordered from March 2010 through December 2013. For cases where two or more test methods were performed and at least one was positive, the medical record was reviewed for relevant clinical and epidemiologic factors. Excluding repeat testing on a given patient, 19,912 tests were ordered (12,569 UAG, 3,747 cultures, and 3,596 PCR) with 378 positive results. The positivity rate for each method was 0.4% for culture, 0.8% for PCR, and 2.7% for UAG. For 37 patients, at least two test methods were performed with at least one positive result: 10 (27%) cases were positive by all three methods, 16 (43%) were positive by two methods, and 11 (30%) were positive by one method only. For the 32 patients with medical records available, clinical presentation was consistent with proven or probable Legionella infection in 84% of the cases. For those cases, the sensitivities of culture, PCR, and UAG were 50%, 92%, and 96%, respectively. The specificities were 100% for culture and 99.9% for PCR and UAG. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  8. Educational Accountability: A Qualitatively Driven Mixed-Methods Approach

    Science.gov (United States)

    Hall, Jori N.; Ryan, Katherine E.

    2011-01-01

    This article discusses the importance of mixed-methods research, in particular the value of qualitatively driven mixed-methods research for quantitatively driven domains like educational accountability. The article demonstrates the merits of qualitative thinking by describing a mixed-methods study that focuses on a middle school's system of…

  9. Electric melting furnace of solidifying radioactive waste by utilizing magnetic field and melting method

    International Nuclear Information System (INIS)

    Igarashi, Hiroshi.

    1990-01-01

    An electric melting furnace for the solidification of radioactive wastes utilizing magnetic fields in accordance with the present invention comprises a plurality of electrodes supplying AC current to molten glass in a glass melting furnace and a plurality of magnetic poles for generating AC magnetic fields. The interaction between the current and the magnetic field generates forces in the molten glass whose time-averaged direction is constant. That is, forces promoting the flow of molten glass in the melting furnace result from Fleming's left-hand rule. As a result, the following effects can be obtained. (1) The amount of heat transferred from the molten glass to the starting material layer on the molten surface is increased, improving the melting performance. (2) For an identical melting performance, the size and weight of the melting furnace can be reduced, decreasing the amount of secondary waste when the apparatus reaches the end of its life. (3) Bottom deposits can be suppressed and prevented from settling and depositing on the reactor bottom by the promoted flow in the layer. (4) Further, the size of the auxiliary electrodes for directly supplying electric current to heat the molten glass near the reactor bottom can be decreased. (I.S.)
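
    The stirring force described above is the electromagnetic (Lorentz) body force per unit volume of the melt, conventionally written as

      \mathbf{f} = \mathbf{J} \times \mathbf{B},

    where \mathbf{J} is the current density supplied by the electrodes and \mathbf{B} the magnetic flux density from the poles; when the AC current and the AC field oscillate in phase, the time average of this product has a non-zero, fixed-direction component that drives the flow (a standard relation added for orientation, not taken from the patent text).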

  10. A new characterization method of the microstructure by utilizing the macroscopic composition gradient in alloys

    International Nuclear Information System (INIS)

    Miyazaki, T.; Koyama, T.; Kobayashi, S.

    1996-01-01

    A new experimental method to determine phase boundaries and phase equilibria is accomplished by means of analytical transmission electron microscopy for alloys with a macroscopic composition gradient. The various phase boundaries, i.e. the coherent binodal and spinodal lines, the incoherent binodal line and the order/disorder transformation line, are distinctly determined for the Cu-Ti alloy and other alloy systems. Furthermore, the equilibrium compositions at the precipitate/matrix interface can be obtained experimentally for various particle sizes, and thus the Gibbs-Thomson relation is verified. It is expected that the composition gradient method proposed in the present work will become an important experimental method for microstructural characterization.

  11. An Effective Numerical Method and Its Utilization to Solution of Fractional Models Used in Bioengineering Applications

    Directory of Open Access Journals (Sweden)

    Petráš Ivo

    2011-01-01

    Full Text Available This paper deals with the fractional-order linear and nonlinear models used in bioengineering applications and an effective method for their numerical solution. The proposed method is based on the power series expansion of a generating function. The numerical solution is in the form of a difference equation, which can be simply applied in Matlab/Simulink to simulate the dynamics of a system. Several illustrative examples are presented, which can be widely used in bioengineering as well as in other disciplines where fractional calculus is often used.
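
    One common realization of the power-series-of-a-generating-function idea is the Grünwald-Letnikov difference approximation of a fractional derivative, whose binomial coefficients can be generated recursively. The Python sketch below is a generic illustration of that formulation, not the paper's Matlab/Simulink code.

      import numpy as np

      def gl_coefficients(alpha, n):
          """Coefficients c_j = (-1)^j * C(alpha, j), built with the recursion
          c_0 = 1, c_j = (1 - (alpha + 1)/j) * c_{j-1}."""
          c = np.empty(n + 1)
          c[0] = 1.0
          for j in range(1, n + 1):
              c[j] = (1.0 - (alpha + 1.0) / j) * c[j - 1]
          return c

      def gl_derivative(y, alpha, h):
          """Grunwald-Letnikov approximation of the alpha-order derivative of a
          uniformly sampled signal y with step h (difference-equation form)."""
          y = np.asarray(y, dtype=float)
          n = len(y) - 1
          c = gl_coefficients(alpha, n)
          d = np.zeros(n + 1)
          for k in range(n + 1):
              d[k] = (c[:k + 1] * y[k::-1]).sum() / h ** alpha
          return d

      # Example: half-order derivative of y(t) = t on a uniform grid.
      h = 0.01
      t = np.arange(0.0, 1.0 + h, h)
      print(gl_derivative(t, 0.5, h)[-1])  # analytic value 2*sqrt(t/pi) ~ 1.128 at t = 1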

  12. METHODICAL APPROACHES TO THE COST MANAGEMENT OF INDUSTRIAL ENTERPRISES

    Directory of Open Access Journals (Sweden)

    Trunina Iryna

    2018-03-01

    Full Text Available Introduction. The paper deals with the actual issues of managing the costs of industrial enterprises, because in the conditions of an unstable market environment financial performance depends on the efficiency of the cost management system, competitiveness, financial sustainability and investment attractiveness of any subject of economic activity. The purpose of the article is the analysis of approaches to cost management and the theoretical substantiation and development of recommendations regarding the formation of strategic cost management. Results. The economic content of cost management is considered in the treatment of different authors and under different approaches: functional, process-oriented and system approaches. Their essence and features, their orientation toward operational or strategic management of enterprise expenses, and the ways of cost management under each approach are determined. It is stated that all considered approaches to the cost management of enterprises are aimed at the optimal use of resources and at ensuring the growth of enterprise efficiency. Conclusions. Based on the review of methodological approaches to cost management, recommendations are developed for expanding the implementation of cost management at various levels of enterprise management and for the formation of strategic cost management within the framework of the strategic management of an enterprise. Strategic cost management is a complex category aimed at achieving a rational level of costs in the long run, which allows for the consideration of competitive cost advantages and increases the competitiveness of an industrial enterprise. The implementation of cost reduction strategies should be a constant and important part of the company’s work, while the strategy of cost reduction should be integrated into the overall business strategy of the enterprise.

  13. Interdisciplinary Approaches and Methods for Sustainable Transformation and Innovation

    Directory of Open Access Journals (Sweden)

    Sangkyun Kim

    2015-04-01

    Full Text Available To increase the likelihood of success and sustainability, organizations must fundamentally reposition themselves and try to change current processes or create new products and services. One of the most effective approaches to finding a solution for transformation and innovation is to learn from other domains where a solution to similar problems is already available. This paper briefly introduces the definition of and approaches to the convergence of academic disciplines and industries, and overviews several representative convergence cases focusing on gamification for sustainable education, environments, and business management.

  14. Natural colorants: Pigment stability and extraction yield enhancement via utilization of appropriate pretreatment and extraction methods.

    Science.gov (United States)

    Ngamwonglumlert, Luxsika; Devahastin, Sakamon; Chiewchan, Naphaporn

    2017-10-13

    Natural colorants from plant-based materials have gained increasing popularity due to health consciousness of consumers. Among the many steps involved in the production of natural colorants, pigment extraction is one of the most important. Soxhlet extraction, maceration, and hydrodistillation are conventional methods that have been widely used in industry and laboratory for such a purpose. Recently, various non-conventional methods, such as supercritical fluid extraction, pressurized liquid extraction, microwave-assisted extraction, ultrasound-assisted extraction, pulsed-electric field extraction, and enzyme-assisted extraction have emerged as alternatives to conventional methods due to the advantages of the former in terms of smaller solvent consumption, shorter extraction time, and more environment-friendliness. Prior to the extraction step, pretreatment of plant materials to enhance the stability of natural pigments is another important step that must be carefully taken care of. In this paper, a comprehensive review of appropriate pretreatment and extraction methods for chlorophylls, carotenoids, betalains, and anthocyanins, which are major classes of plant pigments, is provided by using pigment stability and extraction yield as assessment criteria.

  15. An Evaluation of Research Replication with Q Method and Its Utility in Market Segmentation.

    Science.gov (United States)

    Adams, R. C.

    Precipitated by questions of using Q methodology in television market segmentation and of the replicability of such research, this paper reports on both a reexamination of 1968 research by Joseph M. Foley and an attempt to replicate Foley's study. By undertaking a reanalysis of the Foley data, the question of replication in Q method is addressed.…

  16. UTILIZING THE PAKS METHOD FOR MEASURING ACROLEIN AND OTHER ALDEHYDES IN DEARS

    Science.gov (United States)

    Acrolein is a hazardous air pollutant of high priority due to its high irritation potency and other potential adverse health effects. However, a reliable method is currently unavailable for measuring airborne acrolein at typical environmental levels. In the Detroit Exposure and A...

  17. Utilizing the Lead User Method for Promoting Innovation in E-Recuiting

    NARCIS (Netherlands)

    Furtmueller-Ettinger, Elfriede; Wilderom, Celeste P.M.; van Dick, Rolf; Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Guiderdoni-Jourdain, Karine; Oiry, Ewan

    2009-01-01

    In order to maintain their customer base, many e-recruiting firms are in need of developing innovations. The Lead User (LU) Method has been heralded in the new product innovation literature but not yet applied often in e-service settings. Based on an e-recruiting portal, the authors compare new

  18. Peak detection method evaluation for ion mobility spectrometry by using machine learning approaches.

    Science.gov (United States)

    Hauschild, Anne-Christin; Kopczynski, Dominik; D'Addario, Marianna; Baumbach, Jörg Ingo; Rahmann, Sven; Baumbach, Jan

    2013-04-16

    Ion mobility spectrometry with pre-separation by multi-capillary columns (MCC/IMS) has become an established inexpensive, non-invasive bioanalytics technology for detecting volatile organic compounds (VOCs) with various metabolomics applications in medical research. To pave the way for this technology towards daily usage in medical practice, different steps still have to be taken. With respect to modern biomarker research, one of the most important tasks is the automatic classification of patient-specific data sets into different groups, healthy or not, for instance. Although sophisticated machine learning methods exist, an inevitable preprocessing step is reliable and robust peak detection without manual intervention. In this work we evaluate four state-of-the-art approaches for automated IMS-based peak detection: local maxima search, watershed transformation with IPHEx, region-merging with VisualNow, and peak model estimation (PME). We manually generated a gold standard with the aid of a domain expert (manual) and compared the performance of the four peak calling methods with respect to two distinct criteria. We first utilize established machine learning methods and systematically study their classification performance based on the four peak detectors' results. Second, we investigate the classification variance and robustness regarding perturbation and overfitting. Our main finding is that the power of the classification accuracy is almost equally good for all methods, the manually created gold standard as well as the four automatic peak finding methods. In addition, we note that all tools, manual and automatic, are similarly robust against perturbations. However, the classification performance is more robust against overfitting when using the PME as peak calling preprocessor. In summary, we conclude that all methods, though small differences exist, are largely reliable and enable a wide spectrum of real-world biomedical applications.
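
    Of the four peak callers compared, the local maxima search is conceptually the simplest; a minimal one-dimensional sketch of that idea is given below (illustrative only, not one of the evaluated implementations, which operate on full two-dimensional MCC/IMS measurements).

      import numpy as np

      def local_maxima(signal, min_intensity=0.0):
          """Indices of samples larger than both neighbours and above a minimal
          intensity threshold - the core of a local-maxima peak search."""
          s = np.asarray(signal, dtype=float)
          interior = s[1:-1]
          is_peak = (interior > s[:-2]) & (interior > s[2:]) & (interior >= min_intensity)
          return np.flatnonzero(is_peak) + 1

      # Hypothetical smoothed intensity trace with two peaks.
      x = np.linspace(0.0, 1.0, 500)
      trace = np.exp(-(x - 0.3) ** 2 / 2e-4) + 0.6 * np.exp(-(x - 0.7) ** 2 / 2e-4)
      print(local_maxima(trace, min_intensity=0.1))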

  19. Detection capability of a pulsed Ground Penetrating Radar utilizing an oscilloscope and Radargram Fusion Approach for optimal signal quality

    Science.gov (United States)

    Seyfried, Daniel; Schoebel, Joerg

    2015-07-01

    In scientific research pulsed radars often employ a digital oscilloscope as sampling unit. The sensitivity of an oscilloscope is determined in general by means of the number of digits of its analog-to-digital converter and the selected full scale vertical setting, i.e., the maximal voltage range displayed. Furthermore oversampling or averaging of the input signal may increase the effective number of digits, hence the sensitivity. Especially for Ground Penetrating Radar applications high sensitivity of the radar system is demanded since reflection amplitudes of buried objects are strongly attenuated in ground. Hence, in order to achieve high detection capability this parameter is one of the most crucial ones. In this paper we analyze the detection capability of our pulsed radar system utilizing a Rohde & Schwarz RTO 1024 oscilloscope as sampling unit for Ground Penetrating Radar applications, such as detection of pipes and cables in the ground. Also effects of averaging and low-noise amplification of the received signal prior to sampling are investigated by means of an appropriate laboratory setup. To underline our findings we then present real-world radar measurements performed on our GPR test site, where we have buried pipes and cables of different types and materials in different depths. The results illustrate the requirement for proper choice of the settings of the oscilloscope for optimal data recording. However, as we show, displaying both strong signal contributions due to e.g., antenna cross-talk and direct ground bounce reflection as well as weak reflections from objects buried deeper in ground requires opposing trends for the oscilloscope's settings. We therefore present our Radargram Fusion Approach. By means of this approach multiple radargrams recorded in parallel, each with an individual optimized setting for a certain type of contribution, can be fused in an appropriate way in order to finally achieve a single radargram which displays all

  20. The time to trade-off method versus the EQ-5D index to determine health utility in patients with rheumatoid arthritis

    Directory of Open Access Journals (Sweden)

    Dmitriy Vladimirovich Goryachev

    2010-01-01

    The values of the utility index calculated by TTO failed to correlate with those of HAQ and EQ-5D. The disease duration was unrelated to the values of the utility index calculated by any of the methods used. An association with the TTO index was found only for the VAS-GH "thermometer". Conclusion. The EQ-5D index and VAS-GH are the methods of choice for determining health utility in a clinicoeconomic analysis in patients with RA.

  1. The Li–CO2 battery: a novel method for CO2 capture and utilization

    KAUST Repository

    Xu, Shaomao

    2013-01-01

    We report a novel primary Li-CO2 battery that consumes pure CO2 gas as its cathode. The battery exhibits a high discharge capacity of around 2500 mA h g^-1 at moderate temperatures. At 100 °C the discharge capacity is close to 1000% higher than that at 40 °C, and the temperature dependence is significantly weaker for higher surface area carbon cathodes. Ex-situ FTIR and XRD analyses convincingly show that lithium carbonate (Li2CO3) is the main component of the discharge product. The feasibility of similar primary metal-CO2 batteries based on earth abundant metal anodes, such as Al and Mg, is demonstrated. The metal-CO2 battery platform provides a novel approach for simultaneous capturing of CO2 emissions and producing electrical energy. © 2013 The Royal Society of Chemistry.
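
    Consistent with the Li2CO3 discharge product identified by FTIR and XRD, the overall discharge reaction most commonly proposed for Li-CO2 cells is

      4\,\mathrm{Li} + 3\,\mathrm{CO_2} \rightarrow 2\,\mathrm{Li_2CO_3} + \mathrm{C},

    which is stated here as the widely cited formulation in the Li-CO2 literature rather than as a mechanism established by this paper.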

  2. Analytical methods for prefiltering of close approaches between ...

    African Journals Online (AJOL)

    2010-02-10

    Feb 10, 2010 ... find out the close approach for all objects with simulations. ... the operational satellite and other orbiting objects. ... Recently, space scientists all over the Globe are giving much ... avoidances (Alarcon-Rodriguez et al., 2004, Gronchi, 2005 and Choi et al., 2009) for the stability of future Low Earth Orbit (LEO).

  3. [Unwanted adolescent pregnancy and post-partum utilization of contraceptive methods].

    Science.gov (United States)

    Núñez-Urquiza, Rosa María; Hernández-Prado, Bernardo; García-Barrios, Cecilia; González, Dolores; Walker, Dylis

    2003-01-01

    To describe the proportion of unwanted pregnancies among all pregnant adolescents, its association with sociodemographic characteristics, and the use of post-partum contraceptive methods. A cross-sectional study was conducted among 220 women between 13 and 19 years of age, in two semi-urban municipalities of the State of Morelos, Mexico, interviewed between 1992 and 1994. Women were interviewed at home, six to twelve weeks after their delivery date. Women were asked whether they had wanted their last pregnancy, and about knowledge and use of contraceptive methods after delivery. Adolescent pregnancies accounted for 17% of all births registered in these two municipalities. Among all adolescent mothers, 22.73% reported that their pregnancy had not been wanted. A positive association was found between the lack of access to health services provided by public medical insurance systems (Instituto Mexicano del Seguro Social IMSS and Instituto de Seguridad y Servicios Sociales para los Trabajadores del Estado ISSSTE) and unwanted pregnancy (adjusted OR = 3.03, 95% CI (1.31, 7.) An association was also found between living in an urban community (adjusted OR = 2.16, 95% CI (1.08, 4.33)) and an unwanted pregnancy. Among all adolescent mothers, 91.3% were familiar with "the pill" as a contraceptive method; 84.72% knew about the IUD, and 63.68% knew about the condom. However, only 35% of them were actually using an effective contraceptive method six weeks after delivery. No difference in frequency of contraceptive use was found among the adolescent mothers, according to whether they wanted their last pregnancy. Only 43.39% of mothers who delivered at hospitals or health centers were using an effective contraceptive method. These findings suggest that there is a great potential for family planning programs to target adolescents, and that the use of contraceptive methods after delivery should be promoted among adolescent mothers, especially those lacking access to public medical

  4. Addressing water quality issues on a watershed basis: a comprehensive approach for utilizing chapter 20 of the Michigan drain code

    International Nuclear Information System (INIS)

    McCulloch, J.P.

    2002-01-01

    There are five major watersheds in Oakland County. They are the Clinton, Flint, Huron, Rouge and Shiawassee. Included in these watersheds are 61 individual cities, villages and townships. Actions taken by one community within the watershed have a significant impact on other communities in the watershed. Consequently, a multi-community approach needs to be identified and utilized to comprehensively address public health and water quality issues. Some of the issues faced by these communities individually include stormwater management, flooding, drainage, and river and stream management. Failing septic systems, illicit connections causing groundwater contamination, and habitat and wetland degradation are also primary concerns. Finally, wastewater treatment capacity and sanitary sewer service also are regularly dealt with by these communities. Traditionally, short-term solutions to these often urgent problems required the construction of relief sewers or temporary retention structures. Unfortunately, solving the problem in one area often meant the creation of new problems downstream. Coordinating efforts among these 61 individual communities is difficult. These difficult challenges are best met with a coordinated, comprehensive plan. (author)

  5. Theoretical and Methodological Approaches to Understanding Human Migration Patterns and their Utility in Forensic Human Identification Cases

    Directory of Open Access Journals (Sweden)

    Anastasia Holobinko

    2012-06-01

    Full Text Available Human migration patterns are of interest to scientists representing many fields. Theories have been posited to explain modern human evolutionary expansion, the diversity of human culture, and the motivational factors underlying an individual or group decision to migrate. Although the research question and subsequent approach may vary between disciplines, one thread is ubiquitous throughout most migration studies: why do humans migrate and what is the result of such an event? While the determination of individual attributes such as age, sex, and ancestry is often integral to migration studies, the positive identification of human remains is usually irrelevant. However, the positive identification of a deceased is paramount to a forensic investigation in which human remains have been recovered and must be identified. What role, if any, might the study of human movement patterns play in the interpretation of evidence associated with unidentified human remains? Due to increasing global mobility in the world's populations, it is not inconceivable that an individual might die far away from his or her home. If positive identification cannot immediately be made, investigators may consider various theories as to how or why a deceased ended up in a particular geographic location. While scientific evidence influences the direction of forensic investigations, qualitative evaluation can be an important component of evidence interpretation. This review explores several modern human migration theories and the methodologies utilized to identify evidence of human migratory movement before addressing the practical application of migration theory to forensic cases requiring the identification of human remains.

  6. A manual for inexpensive methods of analyzing and utilizing remote sensor data

    Science.gov (United States)

    Elifrits, C. D.; Barr, D. J.

    1978-01-01

    Instructions are provided for inexpensive methods of using remote sensor data to assist in meeting the need to observe the earth's surface. When possible, relative costs were included. Equipment needed for analysis of remote sensor data is described, and methods of using these equipment items are included, as well as the advantages and disadvantages of individual items. Interpretation and analysis of stereo photos and the interpretation of typical patterns such as tone and texture, land cover, drainage, and erosional form are described. Similar treatment is given to monoscopic image interpretation, including LANDSAT MSS data. Enhancement techniques are detailed with respect to their application, along with simple techniques for creating an enhanced data item. Techniques described include additive and subtractive (Diazo processes) color techniques and enlargement of photos or images. Applications of these processes, including mapping of land resources, engineering soils, geology, water resources, environmental conditions, and crops and/or vegetation, are outlined.

  7. Methods of direct (non-chromatographic) quantification of body metabolites utilizing chemical ionization mass spectrometry

    International Nuclear Information System (INIS)

    Mee, J.M.L.

    1978-01-01

    For the quantitative determination of known metabolites from a biological sample by direct chemical ionization mass spectrometry (CI-MS), the method of internal standards using stable isotopically labelled analogs appears to be the method of choice. In cases where stable isotope ratio determinations cannot be applied, an alternative quantification can be achieved using non-labelled external or internal standards and a calibration curve (sum of peak heights per a given number of scans versus concentration). The technique of computer monitoring permits display and plotting of ion current profiles (TIC and SIC) or spectra per a given number of scans or a given range of mass per charge. Examples are given in areas of clinical application, and the quantitative data show very good agreement with the conventional chromatographic measurements. (Auth.)

  8. Separation of Flame and Nonflame-retardant Plastics Utilizing Magneto-Archimedes Method

    International Nuclear Information System (INIS)

    Misawa, Kohei; Kobayashi, Takayuki; Mori, Tatsuya; Akiyama, Yoko; Nishijima, Shigehiro; Mishima, Fumihito

    2017-01-01

    In the physical recycling process, the quality of recycled plastics usually becomes poor when various kinds of plastic materials are mixed. In order to solve this problem, we tried to separate flame- and nonflame-retardant plastics used for toner cartridges, as one example of mixed plastics, by using the magneto-Archimedes method. With this method, we can control the levitation and settlement of the particles in the medium by controlling the density and magnetic susceptibility of the medium and the magnetic field. In this study, we introduced a separation system for plastics combining wet-type specific gravity separation and magneto-Archimedes separation. In addition, we examined continuous and large-volume separation by introducing a system which can separate the plastics continuously in a flowing fluid. (paper)

  9. Utilization of atomic emission spectroscopy methods for determination of rare earth elements

    International Nuclear Information System (INIS)

    Kubova, J.; Polakovicova, J.; Medved, J.; Stresko, V.

    1997-01-01

    The authors elaborated and applied procedures for rare earth element (REE) determination using an optical emission spectrograph with D.C. arc excitation and ICP atomic emission spectrometry. Some of these analytical methods are described. The proposed procedure was applied to the analysis of different types of geological materials from several Slovak localities. The results of the REE determination were used, e.g., for investigation of the REE distribution in volcanic rocks, rhyolite tuffs with uranium-molybdenum mineralization, sandstones with heavy mineral accumulations, phosphatic sandstones, granites, quartz-carbonate veins and in the meteorite found in the locality Rumanova. The REE contents were determined in 19 mineral water sources and the results obtained by the two mentioned methods were compared. The total REE contents in the analysed mineral water samples were between 2 · 10^-7 and 3 · 10^-5 g dm^-3.

  10. Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support.

    Science.gov (United States)

    van den Berg, Yvonne H M; Gommans, Rob

    2017-09-01

    New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.

  11. Perturbation method utilization in the analysis of the Convertible Spectral Shift Reactor (RCVS)

    International Nuclear Information System (INIS)

    Bruna, G.B; Legendre, J.F.; Porta, J.; Doriath, J.Y.

    1988-01-01

    In the framework of the preliminary feasibility studies on a new core concept, techniques derived from perturbation theory prove very useful in the calculation and physical analysis of project parameters. We show, in the present work, some applications of these methods to the studies of the RCVS (Reacteur Convertible a Variation de Spectre - Convertible Spectral Shift Reactor) concept. Actually, we present here the search for a few-group, project-type energy structure and the splitting of reactivity effects into individual components [fr

  12. A Facile Method for the Preparation of Unsymmetrical Ureas Utilizing Zirconium(IV) Chloride

    International Nuclear Information System (INIS)

    Lee, Anna; Kim, Hee-Kwon; Thompson, David H.

    2016-01-01

    A facile synthetic method for the preparation of unsymmetrical ureas from amines is described. Carbamoyl imidazole compounds were prepared by the reaction of 1,1'-carbonyldiimidazole with primary or secondary amines and were further activated by treatment with zirconium(IV) chloride to generate the desired ureas. This reaction protocol was applied to the synthesis of tri- and tetrasubstituted ureas in high yields. This study provides an alternative guideline for the practical preparation of various unsymmetrical ureas.

  13. Utilizing of inner porous structure in injection moulds for application of special cooling method

    International Nuclear Information System (INIS)

    Seidl, M; Bobek, J; Habr, J; Běhálek, L; Šafka, J; Nováková, I

    2016-01-01

    The article is focused on evaluating the impact of a controlled inner structure of production tools and a new cooling method on the regulation of thermal processes in injection moulding technology. The mould inserts with a porous structure were cooled by means of liquid CO2, which is a very progressive cooling method and enables very fast and intensive heat transfer among the plastic product, the production tool and the cooling medium. The inserts were created using rapid prototyping technology (DLSM) and had a bi-component structure consisting of a thin compact surface layer and a defined porous inner structure of open-cell character through which the liquid CO2 flowed. The analysis includes the evaluation of cooling efficiency for different inner structures and different time profiles for dosing liquid CO2 into the porous structure. The thermal processes were monitored using thermocouples and IR thermal analysis of the product surface and the experimental device. Intensive heat removal also influenced the final structure and the shape and dimensional accuracy of the moulded parts, which were made of a semi-crystalline polymer. The range of final impacts of using the intensive cooling method on the plastic parts was defined by DSC and dimensional analyses. (paper)

  14. Utility of the 3D GRE method in the female pelvic area with 3T MRI

    International Nuclear Information System (INIS)

    Matsushita, Hiroki; Terada, Masaki; Oosugi, Masanori; Inoue, Kazuyasu; Anma, Takeshi

    2008-01-01

    A high signal-to-noise ratio (SNR) can be obtained with three-Tesla (3T) MRI, and it is possible to use it to shorten imaging time and improve spatial resolution. However, reports of its disadvantages have been increasing. We attempted to obtain a high-resolution evaluation image that made the best use of the decreased specific absorption rate (SAR) and high SNR by using the LAVA (liver acquisition with volume acceleration) method, a kind of three-dimensional gradient echo (3D GRE) method that does not show the above-mentioned disadvantages, in contrast-enhanced examination of the female pelvic area with 3T MRI. As a result of the examination, a 0.8 mm isovoxel image of excellent SNR could be obtained within about one and one-half minutes by using the LAVA method. Moreover, the SAR, which is problematic with the 3T MR device, could be decreased, which was useful. (author)

  15. Utilizing Strategic and Operational Methods for Whole-Community Disaster Planning.

    Science.gov (United States)

    Franks, Stevee; Seaton, Ellen

    2017-12-01

    Analysis of response and recovery efforts to disasters over the past 2 decades has identified a consistent gap that plagues the nation in regard to persons with access and functional needs. This gap can be highlighted by Hurricane Katrina, where the majority of those killed were a part of the access and functional needs population. After a disaster, many individuals with access and functional needs require assistance recovering but often have difficulty accessing services and resources. These difficulties are due to a combination of issues, such as health problems and the disruption of community support services. We sought to help bridge this gap by focusing on strategic and operational methods used while planning for the whole community. This article highlights the many partnerships that must be fostered for successful whole-community planning. These partnerships include, but are not limited to, local government departments, health agencies, nonprofit and nongovernmental organizations, and other volunteer organizations. We showcase these methods by using a developmental Post-Disaster Canvassing Plan to highlight planning methods that may aid jurisdictions across the United States in disaster planning for the whole community. (Disaster Med Public Health Preparedness. 2017;11:741-746).

  16. Methods of the Development Strategy of Service Companies: Logistical Approach

    Science.gov (United States)

    Toymentseva, Irina A.; Karpova, Natalya P.; Toymentseva, Angelina A.; Chichkina, Vera D.; Efanov, Andrey V.

    2016-01-01

    The urgency of the analyzed issue is due to lack of attention of heads of service companies to the theory and methodology of strategic management, methods and models of management decision-making in times of economic instability. The purpose of the article is to develop theoretical positions and methodical recommendations on the formation of the…

  17. Mixed-methods approaches in health research in Nepal

    OpenAIRE

    Simkhada, Padam; Van Teijlingen, Edwin; Wasti, Sharada Prasad; Sathian, B.

    2014-01-01

    Combining and integrating a mixture of qualitative and quantitative methods in one single study is widely used in health and social care research in high-income countries. This editorial adds a few words of advice to the novice mixed-methods researcher in Nepal.

  18. Automated Extraction of Cranial Landmarks from Computed Tomography Data using a Combined Method of Knowledge and Pattern Based Approaches

    Directory of Open Access Journals (Sweden)

    Roshan N. RAJAPAKSE

    2016-03-01

    Full Text Available Accurate identification of anatomical structures from medical imaging data is a significant and critical function in the medical domain. Past studies in this context have mainly utilized two approaches: knowledge-based and learning-based methods. Further, most previously reported studies have focused on identification of landmarks from lateral X-ray Computed Tomography (CT) data, particularly in the field of orthodontics. However, this study focused on extracting cranial landmarks from large sets of cross-sectional CT slices using a combination of the two aforementioned approaches. The proposed method is centered mainly on template data sets, which were created using actual contour patterns extracted from CT cases for each of the landmarks under consideration. Firstly, these templates were used to devise rules, which is characteristic of the knowledge-based method. Secondly, the same template sets were employed to perform template matching, which relates to the learning-based approach. The proposed method was tested on two landmarks, the Dorsum sellae and the Pterygoid plate, using CT cases of 5 subjects. The results indicate that, out of the 10 tests, the output images were within the expected range (desired accuracy) in 7 instances and within the acceptable range (near accuracy) in 2 instances, thus verifying the effectiveness of the combined, template-set-centric approach proposed in this study.
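
    The template-matching step described above can be illustrated, in general terms, with a normalized cross-correlation (NCC) search: a landmark template is slid over a CT slice and the position with the highest correlation score is taken as the candidate landmark location. The sketch below is only a minimal, generic illustration of that idea; it is not the authors' implementation, and the image, template and function names are hypothetical.

        # Minimal sketch of a template-matching step (not the authors' code):
        # slide a landmark template over a CT slice and score each position
        # with normalized cross-correlation (NCC).
        import numpy as np

        def ncc_match(image, template):
            """Return (row, col) of the best NCC match and the score map."""
            ih, iw = image.shape
            th, tw = template.shape
            t = template - template.mean()
            t_norm = np.sqrt((t ** 2).sum()) + 1e-12
            scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
            for r in range(scores.shape[0]):
                for c in range(scores.shape[1]):
                    w = image[r:r + th, c:c + tw]
                    wz = w - w.mean()
                    denom = np.sqrt((wz ** 2).sum()) * t_norm + 1e-12
                    scores[r, c] = float((wz * t).sum() / denom)
            best = np.unravel_index(np.argmax(scores), scores.shape)
            return best, scores

        # Toy usage: a cross-shaped "landmark" hidden in a noisy slice.
        rng = np.random.default_rng(0)
        slice_ = rng.normal(0.0, 0.1, (64, 64))
        template = np.array([[0., 1., 0.],
                             [1., 1., 1.],
                             [0., 1., 0.]])
        slice_[40:43, 20:23] += 2.0 * template
        best_rc, _ = ncc_match(slice_, template)
        print("best match at", best_rc)   # expected near (40, 20)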

  19. Simplified approaches to some nonoverlapping domain decomposition methods

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Jinchao

    1996-12-31

    An attempt will be made in this talk to present various domain decomposition methods in a way that is intuitively clear and technically coherent and concise. The basic framework used for analysis is the "parallel subspace correction" or "additive Schwarz" method, and other simple technical tools include "local-global" and "global-local" techniques; the former is for constructing a subspace preconditioner based on a preconditioner on the whole space, whereas the latter is for constructing a preconditioner on the whole space based on a subspace preconditioner. The domain decomposition methods discussed in this talk fall into two major categories: one, based on local Dirichlet problems, is related to the "substructuring method", and the other, based on local Neumann problems, is related to the "Neumann-Neumann method" and the "balancing method". All these methods will be presented in a systematic and coherent manner, and the analysis for both the two- and three-dimensional cases is carried out simultaneously. In particular, some intimate relationships between these algorithms are observed and some new variants of the algorithms are obtained.
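
    As a point of reference for the "parallel subspace correction"/additive Schwarz framework mentioned above, the following is a minimal sketch (not taken from the talk) of a one-level overlapping additive Schwarz preconditioner for the 1-D Poisson problem, used inside SciPy's conjugate gradient solver. The grid size, the two overlapping subdomains and the absence of a coarse space are illustrative assumptions.

        # Minimal sketch (not from the talk): a one-level overlapping additive
        # Schwarz ("parallel subspace correction") preconditioner for the 1-D
        # Poisson problem, used inside SciPy's conjugate gradient solver.
        import numpy as np
        from scipy.sparse.linalg import LinearOperator, cg

        n = 50                                         # interior grid points
        A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
        b = np.ones(n)

        # Two overlapping subdomains (local Dirichlet problems), no coarse space.
        subdomains = [np.arange(0, 30), np.arange(20, n)]
        local_inv = [np.linalg.inv(A[np.ix_(idx, idx)]) for idx in subdomains]

        def apply_as(r):
            """M^{-1} r = sum_i R_i^T A_i^{-1} R_i r."""
            z = np.zeros_like(r)
            for idx, a_inv in zip(subdomains, local_inv):
                z[idx] += a_inv @ r[idx]
            return z

        M = LinearOperator((n, n), matvec=apply_as)
        x, info = cg(A, b, M=M)
        print("converged:", info == 0, " residual:", np.linalg.norm(b - A @ x))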

  20. A Quality Assurance Method that Utilizes 3D Dosimetry and Facilitates Clinical Interpretation

    Energy Technology Data Exchange (ETDEWEB)

    Oldham, Mark, E-mail: mark.oldham@duke.edu [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Thomas, Andrew; O' Daniel, Jennifer; Juang, Titania [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States); Ibbott, Geoffrey [University of Texas MD Anderson Cancer Center, Houston, Texas (United States); Adamovics, John [Rider University, Lawrenceville, New Jersey (United States); Kirkpatrick, John P. [Radiation Oncology, Duke University Medical Center, Durham, North Carolina (United States)

    2012-10-01

    Purpose: To demonstrate a new three-dimensional (3D) quality assurance (QA) method that provides comprehensive dosimetry verification and facilitates evaluation of the clinical significance of QA data acquired in a phantom. Also to apply the method to investigate the dosimetric efficacy of base-of-skull (BOS) intensity-modulated radiotherapy (IMRT) treatment. Methods and Materials: Two types of IMRT QA verification plans were created for 6 patients who received BOS IMRT. The first plan enabled conventional 2D planar IMRT QA using the Varian portal dosimetry system. The second plan enabled 3D verification using an anthropomorphic head phantom. In the latter, the 3D dose distribution was measured using the DLOS/Presage dosimetry system (DLOS = Duke Large-field-of-view Optical-CT System, Presage Heuris Pharma, Skillman, NJ), which yielded isotropic 2-mm data throughout the treated volume. In a novel step, measured 3D dose distributions were transformed back to the patient's CT to enable calculation of dose-volume histograms (DVH) and dose overlays. Measured and planned patient DVHs were compared to investigate clinical significance. Results: Close agreement between measured and calculated dose distributions was observed for all 6 cases. For gamma criteria of 3%, 2 mm, the mean passing rate for portal dosimetry was 96.8% (range, 92.0%-98.9%), compared to 94.9% (range, 90.1%-98.9%) for 3D. There was no clear correlation between 2D and 3D passing rates. Planned and measured dose distributions were evaluated on the patient's anatomy, using DVH and dose overlays. Minor deviations were detected, and the clinical significance of these are presented and discussed. Conclusions: Two advantages accrue to the methods presented here. First, treatment accuracy is evaluated throughout the whole treated volume, yielding comprehensive verification. Second, the clinical significance of any deviations can be assessed through the generation of DVH curves and dose overlays on
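
    The "3%, 2 mm" gamma criteria reported above can be made concrete with a brute-force gamma-index sketch on small two-dimensional dose arrays. This is an illustrative, globally normalized implementation with an assumed low-dose cutoff; it is not the portal-dosimetry or DLOS/Presage analysis software, and real gamma tools use faster search strategies and full 3-D data.

        # Hedged, brute-force sketch of a globally normalized gamma-index pass
        # rate at "3%, 2 mm"; illustrative only, not the analysis software used
        # in the study (real tools use faster searches and 3-D data).
        import numpy as np

        def gamma_pass_rate(ref, meas, spacing_mm=2.0, dose_crit=0.03,
                            dta_mm=2.0, low_dose_cut=0.10):
            """Fraction of measured points with gamma <= 1 (2-D, global norm)."""
            ny, nx = ref.shape
            yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
            coords = np.stack([yy.ravel(), xx.ravel()], axis=1) * spacing_mm
            ref_flat = ref.ravel()
            dose_norm = dose_crit * ref.max()          # global dose criterion
            gammas = []
            for pt, d_meas in zip(coords, meas.ravel()):
                if d_meas < low_dose_cut * ref.max():  # skip very low doses
                    continue
                dist2 = ((coords - pt) ** 2).sum(axis=1) / dta_mm ** 2
                dd2 = (ref_flat - d_meas) ** 2 / dose_norm ** 2
                gammas.append(np.sqrt(np.min(dist2 + dd2)))
            return float((np.array(gammas) <= 1.0).mean())

        # Toy check: identical distributions pass at 100%.
        ref = np.outer(np.hanning(21), np.hanning(21)) * 10.0   # toy dose plane
        print(gamma_pass_rate(ref, ref))           # -> 1.0
        print(gamma_pass_rate(ref, ref * 1.05))    # 5% scaling lowers the rate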

  1. A Method for Sustainable Carbon Dioxide Utilization Process Synthesis and Design

    DEFF Research Database (Denmark)

    Frauzem, Rebecca; Fjellerup, Kasper; Roh, Kosan

    As a result of increasing regulations and concern about the impact of greenhouse gases on the environment, carbon dioxide (CO2) emissions are a primary focus for reducing emissions and improving global sustainability. One method to achieve reduced emissions is the conversion of CO2 to useful...... compounds via chemical reactions. However, conversion is still in its infancy and requires work for implementation at an industrial level. One aspect of this is the development of a methodology for the formulation and optimization of sustainable conversion processes. This methodology follows three stages…

  2. Utilizing of 2-D resistivity with geotechnical method for sediment mapping in Sungai Batu, Kedah

    Science.gov (United States)

    Taqiuddin, Z. M.; Rosli, S.; Nordiana, M. M.; Azwin, I. N.; Mokhtar, S.

    2017-07-01

    Sungai Batu, in the Lembah Bujang subdistrict in the northern region of Peninsular Malaysia, was an international cultural and commercial crossroads 2000 years ago and is recorded as the oldest archaeological site in Southeast Asia. The discovery of an iron smelting area (1st-4th century) provides evidence of an iron industry in the Malay Peninsula that was important to other civilizations. Nowadays, a lot of interdisciplinary research is conducted in this area, including geophysical prospecting to understand the subsurface profile of the locality. A geophysical approach, 2-D resistivity, was applied with the main objective of identifying the sediment deposits of the area. Three 2-D resistivity survey lines were designed across boreholes, and data were acquired using an ABEM SAS4000 system with a pole-dipole array and 2.5 m minimum electrode spacing. The data obtained were processed using Res2Dinv software to produce inversion models, and Surfer10 software was used for interpretation and correlation with the respective borehole records. The 2-D resistivity inversion models show that the area is dominated by clay soil, with resistivity values of >500 Ωm interpreted as a hard layer. The saturated zone was found at depths of >25 m, which suggests a large volume of soil deposited during the sedimentation process. The correlation with the borehole records shows that the clay profile is distributed at depths of >20 m. The presence of shale in certain borehole records indicates a calm/stagnant water environment during the formation process, suspected to have been controlled by deposition from the land.

  3. Applying a life cycle approach to project management methods

    OpenAIRE

    Biggins, David; Trollsund, F.; Høiby, A.L.

    2016-01-01

    Project management is increasingly important to organisations because projects are the method by which organisations respond to their environment. A key element within project management is the standards and methods that are used to control and conduct projects, collectively known as project management methods (PMMs) and exemplified by PRINCE2, the Project Management Institute's and the Association for Project Management's Bodies of Knowledge (PMBOK and APMBOK). The purpose of t...

  4. A taxonomy of behaviour change methods: an Intervention Mapping approach.

    Science.gov (United States)

    Kok, Gerjo; Gottlieb, Nell H; Peters, Gjalt-Jorn Y; Mullen, Patricia Dolan; Parcel, Guy S; Ruiter, Robert A C; Fernández, María E; Markham, Christine; Bartholomew, L Kay

    2016-09-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters for effectiveness of methods, and explicate the related distinction between theory-based methods and practical applications and the probability that poor translation of methods may lead to erroneous conclusions as to method-effectiveness. Third, we recommend a minimal set of intervention characteristics that may be reported when intervention descriptions and evaluations are published. Specifying these characteristics can greatly enhance the quality of our meta-analyses and other literature syntheses. In conclusion, the dynamics of behaviour change are such that any taxonomy of methods of behaviour change needs to acknowledge the importance of, and provide instruments for dealing with, three conditions for effectiveness for behaviour change methods. For a behaviour change method to be effective: (1) it must target a determinant that predicts behaviour; (2) it must be able to change that determinant; (3) it must be translated into a practical application in a way that preserves the parameters for effectiveness and fits with the target population, culture, and context. Thus, taxonomies of methods of behaviour change must distinguish the specific determinants that are targeted, practical, specific applications, and the theory-based methods they embody. In addition, taxonomies should acknowledge that the lists of behaviour change methods will be used by, and should be used by, intervention developers. Ideally, the taxonomy should be readily usable for this goal; but alternatively, it should be

  5. A multi-method approach to evaluate health information systems.

    Science.gov (United States)

    Yu, Ping

    2010-01-01

    Systematic evaluation of the introduction and impact of health information systems (HIS) is a challenging task. As the implementation is a dynamic process, with diverse issues emerging at various stages of system introduction, it is a challenge to weigh the contribution of various factors and differentiate the critical ones. A conceptual framework is helpful in guiding the evaluation effort; otherwise data collection may not be comprehensive and accurate, which may in turn lead to inadequate interpretation of the phenomena under study. Based on comprehensive literature research and the author's own practice of evaluating health information systems, the author proposes a multi-method approach that incorporates both quantitative and qualitative measurement and is centered on the DeLone and McLean Information System Success Model. This approach aims to quantify the performance of HIS and its impact, and to provide comprehensive and accurate explanations of the causal relationships among the different factors. This approach will provide decision makers with accurate and actionable information for improving the performance of the introduced HIS.

  6. Utilization of the MAT method to analyze the nucleate boiling boundary in rod bundles subchannels

    International Nuclear Information System (INIS)

    Pedron, M.Q.

    1983-01-01

    The digital program PANTERA-1P, a new version of the COBRA-IIIC code, developed at CDTN, is directed to the thermal-hydraulic analysis of water-cooled rod bundles and reactor cores, in steady state and transient conditions. Both the new and the old code versions have identical capabilities concerning the evaluation of fluid variables; nevertheless, PANTERA-1P has better and faster performance. Improvements introduced in the scheme for solution of the conservation equations have contributed significantly to reducing the computer time, without affecting the accuracy of results. While the momentum equations are solved in COBRA-IIIC for the crossflow distribution, the PANTERA-1P code solves these equations for the pressure distribution by using the MAT method (Modified and Advanced Theta). The calculation of the pressure coefficient matrix has been optimized, and the simultaneous linear equations are solved optionally by means of transpose elimination with storage requirements or by the successive over-relaxation method. The program presents other features, especially concerning the thermal conduction model for fuel rods and the critical heat flux calculation options. A new input/output scheme is provided for optional use of the British or International System of Units. The results of the program are compared to critical heat flux experimental data and to the results of COBRA-IIIC. Excellent agreement is observed in both cases. (Author) [pt]
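
    The successive over-relaxation option mentioned above is a standard stationary iterative scheme for systems of linear equations. The following is a generic SOR sketch for a small system A x = b, given only to illustrate the idea; it is not code from PANTERA-1P, and the relaxation factor and test matrix are arbitrary.

        # Generic successive over-relaxation (SOR) iteration for A x = b, shown
        # only to illustrate the solver option named in the abstract; this is
        # not code from PANTERA-1P, and the test system is arbitrary.
        import numpy as np

        def sor(A, b, omega=1.5, tol=1e-10, max_iter=10000):
            n = len(b)
            x = np.zeros(n)
            for _ in range(max_iter):
                x_old = x.copy()
                for i in range(n):
                    sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                    x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
                if np.linalg.norm(x - x_old, np.inf) < tol:
                    break
            return x

        A = np.array([[4., -1., 0.], [-1., 4., -1.], [0., -1., 4.]])
        b = np.array([1., 2., 3.])
        print(sor(A, b), np.linalg.solve(A, b))   # the two should agree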

  7. Utilizing the virus-induced blocking of apoptosis in an easy baculovirus titration method.

    Science.gov (United States)

    Niarchos, Athanasios; Lagoumintzis, George; Poulas, Konstantinos

    2015-10-22

    Baculovirus-mediated protein expression is a robust experimental technique for producing recombinant higher-eukaryotic proteins because it combines high yields with considerable post-translational modification capabilities. In this expression system, the determination of the titer of recombinant baculovirus stocks is important to achieve the correct multiplicity of infection for effective amplification of the virus and high expression of the target protein. To overcome the drawbacks of existing titration methods (e.g., plaque assay, real-time PCR), we present a simple and reliable assay that uses the ability of baculoviruses to block apoptosis in their host cells to accurately titrate virus samples. Briefly, after incubation with serial dilutions of baculovirus samples, Sf9 cells were UV irradiated and, after apoptosis induction, they were viewed via microscopy; the presence of cluster(s) of infected cells as islets indicated blocked apoptosis. Subsequently, baculovirus titers were calculated through the determination of the 50% endpoint dilution. The method is simple, inexpensive, and does not require unique laboratory equipment, consumables or expertise; moreover, it is versatile enough to be adapted for the titration of every virus species that can block apoptosis in any culturable host cells which undergo apoptosis under specific conditions.
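
    The 50% endpoint dilution mentioned above is commonly computed with the Reed-Muench formula; the abstract does not name a specific formula, so treat that choice as an assumption. The sketch below shows the calculation for a hypothetical eight-well, ten-fold dilution series whose well counts are made up for illustration.

        # Sketch of a 50% endpoint titre via the Reed-Muench calculation; the
        # choice of formula and the well counts are illustrative assumptions.
        import numpy as np

        def reed_muench_log10_endpoint(log10_dilutions, infected, total):
            """log10 of the 50% endpoint dilution (e.g. -3.5 means 10^-3.5)."""
            infected = np.asarray(infected, float)
            uninfected = np.asarray(total, float) - infected
            cum_inf = np.cumsum(infected[::-1])[::-1]   # toward higher dilutions
            cum_uninf = np.cumsum(uninfected)           # toward lower dilutions
            pct = 100.0 * cum_inf / (cum_inf + cum_uninf)
            above = int(np.where(pct >= 50.0)[0][-1])   # last dilution >= 50%
            below = above + 1
            pd = (pct[above] - 50.0) / (pct[above] - pct[below])
            step = log10_dilutions[below] - log10_dilutions[above]
            return log10_dilutions[above] + pd * step

        # Hypothetical 10-fold series, 8 wells per dilution.
        log10_dils = [-1, -2, -3, -4, -5]
        infected = [8, 8, 6, 2, 0]
        print(reed_muench_log10_endpoint(log10_dils, infected, [8] * 5))  # -3.5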

  8. Method of preparing and utilizing a catalyst system for an oxidation process on a gaseous hydrocarbon stream

    Science.gov (United States)

    Berry, David A; Shekhawat, Dushyant; Smith, Mark; Haynes, Daniel

    2013-07-16

    The disclosure relates to a method of utilizing a catalyst system for an oxidation process on a gaseous hydrocarbon stream with a mitigation of carbon accumulation. The system is comprised of a catalytically active phase deposited onto an oxygen conducting phase, with or without supplemental support. The catalytically active phase has a specified crystal structure where at least one catalytically active metal is a cation within the crystal structure and coordinated with oxygen atoms within the crystal structure. The catalyst system employs an optimum coverage ratio for a given set of oxidation conditions, based on a specified hydrocarbon conversion and a carbon deposition limit. Specific embodiments of the catalyst system are disclosed.

  9. A taxonomy of behaviour change methods: an Intervention Mapping approach

    OpenAIRE

    Kok, Gerjo; Gottlieb, Nell H.; Peters, Gjalt-Jorn Y.; Mullen, Patricia Dolan; Parcel, Guy S.; Ruiter, Robert A.C.; Fernández, María E.; Markham, Christine; Bartholomew, L. Kay

    2015-01-01

    In this paper, we introduce the Intervention Mapping (IM) taxonomy of behaviour change methods and its potential to be developed into a coding taxonomy. That is, although IM and its taxonomy of behaviour change methods are not in fact new, because IM was originally developed as a tool for intervention development, this potential was not immediately apparent. Second, in explaining the IM taxonomy and defining the relevant constructs, we call attention to the existence of parameters fo...

  10. Direct endoscopic necrosectomy versus step-up approach for walled-off pancreatic necrosis: comparison of clinical outcome and health care utilization.

    Science.gov (United States)

    Kumar, Nitin; Conwell, Darwin L; Thompson, Christopher C

    2014-11-01

    Infected walled-off pancreatic necrosis (WOPN) is a complication of acute pancreatitis requiring intervention. Surgery is associated with considerable morbidity. Percutaneous catheter drainage (PCD), the initial therapy in the step-up approach, minimizes complications. Direct endoscopic necrosectomy (DEN) has demonstrated safety and efficacy. We compared outcome and health care utilization of DEN versus the step-up approach. This was a matched cohort study using a prospective registry. Twelve consecutive DEN patients were matched with 12 step-up approach patients. Outcomes were clinical resolution after the primary therapeutic modality, new organ failure, mortality, endocrine or exocrine insufficiency, length of stay, and health care utilization. Clinical resolution was achieved in 11 of 12 patients after DEN versus 3 of 12 step-up approach patients after PCD; DEN was also associated with less endocrine insufficiency and shorter length of stay (P < 0.05). Health care utilization was lower after DEN by 5.2:1 (P < 0.01). Direct endoscopic necrosectomy may be superior to the step-up approach for WOPN with suspected or established infection. Primary PCD generally delayed definitive therapy. Given the higher efficacy, shorter length of stay, and lower health care utilization, DEN could be the first-line therapy for WOPN, with primary PCD reserved for inaccessible or immature collections.

  11. Utilization of some physical methods for detection of some irradiated foods

    International Nuclear Information System (INIS)

    Mohammed, I.A.S

    2007-01-01

    The present investigation was carried out to establish a detection method for irradiated black pepper and marjoram using thermoluminescence (TL), and for wheat, cinnamon and ginger using viscosity measurement. All samples were packed in polyethylene bags and then irradiated at 5, 10 and 15 kGy for black pepper, marjoram, cinnamon and ginger; wheat was irradiated at 1, 2 and 3 kGy. All samples were stored for eight months at room temperature. Results indicated that the irradiation treatment caused a marked increase in TL intensity for irradiated black pepper and marjoram, while it decreased the apparent viscosity of wheat flour, cinnamon and ginger powder, both post irradiation and during storage. Therefore, it could be concluded that TL analysis can be used to detect irradiated black pepper and marjoram, and viscosity measurement can be used to distinguish irradiated wheat flour, cinnamon and ginger powder from non-irradiated ones after the irradiation process and also during 8 months of storage at ambient temperature.

  12. Radon and radon daughter measurements and methods utilized by EPA's Eastern Environmental Radiation Facility

    International Nuclear Information System (INIS)

    Phillips, C.R.

    1977-01-01

    The Eastern Environmental Radiation Facility (EERF), Office of Radiation Programs, has the responsibility for conducting the Environmental Protection Agency's study of the radiological impact of the phosphate industry. Numerous measurements in structures constructed on land reclaimed from phosphate mining showed that working levels in these structures range from 0.001 to 0.9 WL. Sampling is performed by drawing air through a 0.8 micrometer pore size, 25 mm diameter filter at a flow rate of 10 to 15 liters/minute for 5 to 20 minutes, depending on the daughter levels anticipated. The detection system consists of a ruggedized silicon surface barrier detector (450 mm², 100 micrometer depletion) connected through an appropriate preamplifier-amplifier to a 1024-channel multichannel analyzer. Other measurement methods are also discussed.

  13. Method of phase space beam dilution utilizing bounded chaos generated by rf phase modulation

    Directory of Open Access Journals (Sweden)

    Alfonse N. Pham

    2015-12-01

    Full Text Available This paper explores the physics of chaos in a localized phase-space region produced by rf phase modulation applied to a double rf system. The study can be exploited to produce rapid particle bunch broadening exhibiting longitudinal particle distribution uniformity. Hamiltonian models and particle-tracking simulations are introduced to understand the mechanism and applicability of controlled particle diffusion. When phase modulation is applied to the double rf system, regions of localized chaos are produced through the disruption and overlapping of parametric resonant islands and configured to be bounded by well-behaved invariant tori to prevent particle loss. The condition of chaoticity and the degree of particle dilution can be controlled by the rf parameters. The method has applications in alleviating adverse space-charge effects in high-intensity beams, particle bunch distribution uniformization, and industrial radiation-effects experiments.

  14. Advancement in shampoo (a dermal care product): preparation methods, patents and commercial utility.

    Science.gov (United States)

    Deeksha; Malviya, Rishabha; Sharma, Pramod K

    2014-01-01

    Shampoo is a cleaning aid for hair and is one of the most rapidly evolving beauty products in the present scenario. Today's shampoo products are of great importance as they provide cleaning of hair together with the benefits of conditioning, smoothing and good hair health, i.e. hair free of dandruff, dirt, grease and lice. Various types of shampoos, classified by function, nature of ingredients and their special effects, are elaborated in this study. Generally, shampoos are evaluated in terms of physical appearance, detergency, surface tension, foam quality, pH, viscosity, percent solid content, flow properties, dirt dispersion, cleaning action, stability and wetting time. Attention should also be paid to the patent literature, which provides a wide body of knowledge related to shampoo. This article reviews the various aspects of shampoo in terms of preparation methods, various patents and commercial value.

  15. Hydrogen generation systems and methods utilizing sodium silicide and sodium silica gel materials

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, Andrew P.; Melack, John M.; Lefenfeld, Michael

    2017-12-19

    Systems, devices, and methods combine thermally stable reactant materials and aqueous solutions to generate hydrogen and a non-toxic liquid by-product. The reactant materials can be sodium silicide or sodium silica gel. The hydrogen generation devices are used in fuel cells and other industrial applications. One system combines cooling, pumping, water storage, and other devices to sense and control reactions between reactant materials and aqueous solutions to generate hydrogen. Springs and other pressurization mechanisms pressurize and deliver an aqueous solution to the reaction. A check valve and other pressure regulation mechanisms regulate the pressure of the aqueous solution delivered to the reactant fuel material in the reactor based upon characteristics of the pressurization mechanisms, and can regulate the pressure of the delivered aqueous solution as a steady decay associated with the pressurization force. The pressure regulation mechanism can also prevent hydrogen gas from deflecting the pressure regulation mechanism.

  16. A new approach for categorizing pig lying behaviour based on a Delaunay triangulation method.

    Science.gov (United States)

    Nasirahmadi, A; Hensel, O; Edwards, S A; Sturm, B

    2017-01-01

    Machine vision-based monitoring of pig lying behaviour is a fast and non-intrusive approach that could be used to improve animal health and welfare. Four pens with 22 pigs in each were selected at a commercial pig farm and monitored for 15 days using top view cameras. Three thermal categories were selected relative to room setpoint temperature. An image processing technique based on Delaunay triangulation (DT) was utilized. Different lying patterns (close, normal and far) were defined regarding the perimeter of each DT triangle and the percentages of each lying pattern were obtained in each thermal category. A method using a multilayer perceptron (MLP) neural network, to automatically classify group lying behaviour of pigs into three thermal categories, was developed and tested for its feasibility. The DT features (mean value of perimeters, maximum and minimum length of sides of triangles) were calculated as inputs for the MLP classifier. The network was trained, validated and tested and the results revealed that MLP could classify lying features into the three thermal categories with high overall accuracy (95.6%). The technique indicates that a combination of image processing, MLP classification and mathematical modelling can be used as a precise method for quantifying pig lying behaviour in welfare investigations.
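
    To make the pipeline above concrete, the sketch below extracts Delaunay-triangulation features (mean triangle perimeter, maximum and minimum side length) from synthetic pig-centroid positions and trains an MLP to separate three thermal categories. The data, network size and training settings are assumptions for illustration; this is not the authors' dataset, feature scaling or trained classifier.

        # Sketch of Delaunay-triangulation features fed to an MLP, using
        # synthetic pig centroids; the data, network size and settings are
        # assumptions, not the authors' pipeline or trained model.
        import numpy as np
        from scipy.spatial import Delaunay
        from sklearn.neural_network import MLPClassifier

        def dt_features(centroids):
            """Mean triangle perimeter plus max/min side length of the DT."""
            tri = Delaunay(centroids)
            perimeters, sides = [], []
            for simplex in tri.simplices:
                pts = centroids[simplex]
                d = [np.linalg.norm(pts[i] - pts[(i + 1) % 3]) for i in range(3)]
                perimeters.append(sum(d))
                sides.extend(d)
            return np.array([np.mean(perimeters), np.max(sides), np.min(sides)])

        rng = np.random.default_rng(1)
        X, y = [], []
        for label, spread in enumerate([0.5, 1.0, 2.0]):   # close / normal / far
            for _ in range(60):                            # 60 synthetic frames
                X.append(dt_features(rng.normal(0.0, spread, size=(22, 2))))
                y.append(label)

        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
        clf.fit(np.array(X), np.array(y))
        print("training accuracy:", clf.score(np.array(X), np.array(y)))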

  17. Preferential and Non-Preferential Approaches to Trade Liberalization in East Asia: What Differences Do Utilization Rates and Reciprocity Make?

    OpenAIRE

    Menon, Jayant

    2013-01-01

    Previous studies on the impacts of free trade agreements (FTAs) in East Asia have assumed full utilization of preferences. The evidence suggests that this assumption is seriously in error, with the estimated uptake particularly low in East Asia. In this paper, we assume a more realistic utilization rate in estimating impacts. We find that actual utilization rates significantly diminish the benefits from preferential liberalization, but in a non-linear way. Reciprocity is an important motivati...

  18. Fabrication of graphene-based flexible devices utilizing a soft lithographic patterning method

    Science.gov (United States)

    Jung, Min Wook; Myung, Sung; Kim, Ki Woong; Song, Wooseok; Jo, You-Young; Lee, Sun Suk; Lim, Jongsun; Park, Chong-Yun; An, Ki-Seok

    2014-07-01

    There has been considerable interest in soft lithographic patterning processing of large scale graphene sheets due to the low cost and simplicity of the patterning process along with the exceptional electrical or physical properties of graphene. These properties include an extremely high carrier mobility and excellent mechanical strength. Recently, a study has reported that single layer graphene grown via chemical vapor deposition (CVD) was patterned and transferred to a target surface by controlling the surface energy of the polydimethylsiloxane (PDMS) stamp. However, applications are limited because of the challenge of CVD-graphene functionalization for devices such as chemical or bio-sensors. In addition, graphene-based layers patterned with a micron scale width on the surface of biocompatible silk fibroin thin films, which are not suitable for conventional CMOS processes such as the patterning or etching of substrates, have yet to be reported. Herein, we developed a soft lithographic patterning process via surface energy modification for advanced graphene-based flexible devices such as transistors or chemical sensors. Using this approach, the surface of a relief-patterned elastomeric stamp was functionalized with hydrophilic dimethylsulfoxide molecules to enhance the surface energy of the stamp and to remove the graphene-based layer from the initial substrate and transfer it to a target surface. As a proof of concept using this soft lithographic patterning technique, we demonstrated a simple and efficient chemical sensor consisting of reduced graphene oxide and a metallic nanoparticle composite. A flexible graphene-based device on a biocompatible silk fibroin substrate, which is attachable to an arbitrary target surface, was also successfully fabricated. Briefly, a soft lithographic patterning process via surface energy modification was developed for advanced graphene-based flexible devices such as transistors or chemical sensors and attachable devices on a

  19. Fabrication of graphene-based flexible devices utilizing a soft lithographic patterning method

    International Nuclear Information System (INIS)

    Wook Jung, Min; Myung, Sung; Woong Kim, Ki; Song, Wooseok; Suk Lee, Sun; Lim, Jongsun; An, Ki-Seok; Jo, You-Young; Park, Chong-Yun

    2014-01-01

    There has been considerable interest in soft lithographic patterning processing of large scale graphene sheets due to the low cost and simplicity of the patterning process along with the exceptional electrical or physical properties of graphene. These properties include an extremely high carrier mobility and excellent mechanical strength. Recently, a study has reported that single layer graphene grown via chemical vapor deposition (CVD) was patterned and transferred to a target surface by controlling the surface energy of the polydimethylsiloxane (PDMS) stamp. However, applications are limited because of the challenge of CVD-graphene functionalization for devices such as chemical or bio-sensors. In addition, graphene-based layers patterned with a micron scale width on the surface of biocompatible silk fibroin thin films, which are not suitable for conventional CMOS processes such as the patterning or etching of substrates, have yet to be reported. Herein, we developed a soft lithographic patterning process via surface energy modification for advanced graphene-based flexible devices such as transistors or chemical sensors. Using this approach, the surface of a relief-patterned elastomeric stamp was functionalized with hydrophilic dimethylsulfoxide molecules to enhance the surface energy of the stamp and to remove the graphene-based layer from the initial substrate and transfer it to a target surface. As a proof of concept using this soft lithographic patterning technique, we demonstrated a simple and efficient chemical sensor consisting of reduced graphene oxide and a metallic nanoparticle composite. A flexible graphene-based device on a biocompatible silk fibroin substrate, which is attachable to an arbitrary target surface, was also successfully fabricated. Briefly, a soft lithographic patterning process via surface energy modification was developed for advanced graphene-based flexible devices such as transistors or chemical sensors and attachable devices on a

  20. The equivalent energy method: an engineering approach to fracture

    International Nuclear Information System (INIS)

    Witt, F.J.

    1981-01-01

    The equivalent energy method for elastic-plastic fracture evaluations was developed around 1970 for determining realistic engineering estimates of the maximum load-displacement or stress-strain conditions for fracture of flawed structures. The basic principles were summarized, but the supporting experimental data, most of which were obtained after the method was proposed, have never been collated. This paper restates the original bases more explicitly and presents the validating data in graphical form. Extensive references are given. The volumetric energy ratio, a modelling parameter encompassing both size and temperature, is the fundamental parameter of the equivalent energy method. It is demonstrated that, in an engineering sense, the volumetric energy ratio is a unique material characteristic for a steel, much like a material property, except that size must be taken into account. With this as a proposition, the basic formula of the equivalent energy method is derived. Sufficient information is presented so that investigators and analysts may judge the viability and applicability of the method to their areas of interest. (author)

  1. THE EFFECTIVENESS OF PRUDENTIAL BANKING SUPERVISION: PECULIARITIES OF METHODICAL APPROACHES

    Directory of Open Access Journals (Sweden)

    S. Naumenkova

    2015-10-01

    Full Text Available In the article, the theoretical fundamentals of prudential banking supervision effectiveness were investigated and approaches to the calculation of an integral indicator of supervisory system compliance with the Basel Committee Core Principles were substantiated. The concepts of “functional effectiveness” and “institutional effectiveness” of supervisory activity were suggested. The authors defined the influence of the supervisory organizational structure on GDP growth by groups of countries of the world. A list of priority measures focused on increasing the effectiveness of prudential supervisory activity was systematized in order to restore the sustainability of the national banking sector.

  2. Comparative analysis of methods for measurements of food intake and utilization using the soybean looper, Pseudoplusia includens and artificial media

    International Nuclear Information System (INIS)

    Parra, J.R.P.; Kogan, M.; Illinois Agricultural Experiment Station, Urbana; Illinois Univ., Urbana

    1981-01-01

    An analysis of intake and utilization of an artificial medium by larvae of the soybean looper, Pseudoplusia includens Walker, Lepidoptera: Noctuidae, was performed using 4 methods: standard gravimetric, chromic oxide, Calco Oil Red, and ¹⁴C-glucose. Each method was used in conjunction with standard gravimetry. The relative merits of the indirect methods were analyzed in terms of precision and accuracy for ECI and ECD estimation, cost, and overall versatility. Only the gravimetric method combined ca. 80% precision in ECI and ECD estimation with low cost and maximum versatility. Calco Oil Red at 0.1% w/v was detrimental to the larvae. Cr₂O₃ caused reduced intake, but conversion was increased, resulting in normal development and growth of larvae. The radioisotopic method had the advantage of providing a direct means of measuring expired CO₂. The need to operate under a totally enclosed system, however, poses some serious difficulties in the use of radioisotopes. There seems to be little advantage in any of the proposed indirect methods, except if there are unusual difficulties in separating the excreta from the medium. (orig.)

  3. Introduction to quantitative research methods an investigative approach

    CERN Document Server

    Balnaves, Mark

    2001-01-01

    Introduction to Quantitative Research Methods is a student-friendly introduction to quantitative research methods and basic statistics. It uses a detective theme throughout the text and in multimedia courseware to show how quantitative methods have been used to solve real-life problems. The book focuses on principles and techniques that are appropriate to introductory level courses in media, psychology and sociology. Examples and illustrations are drawn from historical and contemporary research in the social sciences. The multimedia courseware provides tutorial work on sampling, basic statistics, and techniques for seeking information from databases and other sources. The statistics modules can be used either as part of the detective game or directly in teaching and learning. Brief video lessons in SPSS, using real datasets, are also a feature of the CD-ROM.

  4. Assessing Critical Thinking Outcomes of Dental Hygiene Students Utilizing Virtual Patient Simulation: A Mixed Methods Study.

    Science.gov (United States)

    Allaire, Joanna L

    2015-09-01

    Dental hygiene educators must determine which educational practices best promote critical thinking, a quality necessary to translate knowledge into sound clinical decision making. The aim of this small pilot study was to determine whether virtual patient simulation had an effect on the critical thinking of dental hygiene students. A pretest-posttest design using the Health Science Reasoning Test was used to evaluate the critical thinking skills of senior dental hygiene students at The University of Texas School of Dentistry at Houston Dental Hygiene Program before and after their experience with computer-based patient simulation cases. Additional survey questions sought to identify the students' perceptions of whether the experience had helped develop their critical thinking skills and improved their ability to provide competent patient care. A convenience sample of 31 senior dental hygiene students completed both the pretest and posttest (81.5% of total students in that class); 30 senior dental hygiene students completed the survey on perceptions of the simulation (78.9% response rate). Although the results did not show a significant increase in mean scores, the students reported feeling that the use of virtual patients was an effective teaching method to promote critical thinking, problem-solving, and confidence in the clinical realm. The results of this pilot study may have implications to support the use of virtual patient simulations in dental hygiene education. Future research could include a larger controlled study to validate findings from this study.

  5. An approximate methods approach to probabilistic structural analysis

    Science.gov (United States)

    Mcclung, R. C.; Millwater, H. R.; Wu, Y.-T.; Thacker, B. H.; Burnside, O. H.

    1989-01-01

    A probabilistic structural analysis method (PSAM) is described which makes an approximate calculation of the structural response of a system, including the associated probabilistic distributions, with minimal computation time and cost, based on a simplified representation of the geometry, loads, and material. The method employs the fast probability integration (FPI) algorithm of Wu and Wirsching. Typical solution strategies are illustrated by formulations for a representative critical component chosen from the Space Shuttle Main Engine (SSME) as part of a major NASA-sponsored program on PSAM. Typical results are presented to demonstrate the role of the methodology in engineering design and analysis.
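
    The abstract above relies on the fast probability integration (FPI) algorithm of Wu and Wirsching, which is not reproduced here. Purely as a baseline for comparison, the snippet below estimates a failure probability for a toy capacity-versus-load limit state by crude Monte Carlo sampling; the distributions and sample size are arbitrary assumptions and this is not the FPI algorithm.

        # Crude Monte Carlo estimate of a failure probability for a toy limit
        # state g(R, S) = R - S (capacity minus load); a baseline only, NOT the
        # fast probability integration (FPI) algorithm used by PSAM.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 200_000
        R = rng.normal(10.0, 1.0, n)    # assumed capacity distribution
        S = rng.normal(7.0, 1.5, n)     # assumed load distribution
        p_fail = np.mean(R - S < 0.0)
        # Analytical check: R - S ~ N(3, sqrt(3.25)), so P(g < 0) is about 0.048.
        print("estimated failure probability:", p_fail)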

  6. Improvement Methods in NPP's Radiation Emergency Plan: An Administrative Approach

    International Nuclear Information System (INIS)

    Lee, Yoon Wook; Yang, He Sun

    2009-01-01

    The Radiation Emergency Plan (REP) can be divided into a technical and an administrative response. The domestic NPPs' REPs are reviewed from the viewpoint of the administrative response, and improvement methods are also suggested in this treatise. The fields of review are the composition of the emergency response organizations, the activation criteria of the organizations, the selection of staffing and the reasonableness of the REP's volume. In addition, the limitations of the current radiation exercises are reviewed and an improvement method for the exercises is presented. It is expected that the suggested recommendations will be helpful in establishing useful REPs and conducting practical radiation exercises in Korea.

  7. A novel time series link prediction method: Learning automata approach

    Science.gov (United States)

    Moradabadi, Behnaz; Meybodi, Mohammad Reza

    2017-09-01

    Link prediction is a main social network challenge that uses the network structure to predict future links. Common link prediction approaches use a static graph representation in which a snapshot of the network is analyzed to find hidden or future links. For example, similarity-metric-based link prediction is a common traditional approach that calculates a similarity metric for each non-connected pair of nodes, sorts the candidate links by their similarity scores, and labels the links with higher scores as future links. Because people's activities in social networks are dynamic and uncertain, and the structure of the networks changes over time, using deterministic graphs for modeling and analysis of a social network may not be appropriate. In the time-series link prediction problem, time series of link occurrences are used to predict future links. In this paper, we propose a new time-series link prediction method based on learning automata. In the proposed algorithm, for each link that must be predicted there is one learning automaton, and each learning automaton tries to predict the existence or non-existence of the corresponding link. To predict the link occurrence at time T, there is a chain consisting of stages 1 through T - 1, and the learning automaton passes through these stages to learn the existence or non-existence of the corresponding link. Our preliminary link prediction experiments with co-authorship and email networks have provided satisfactory results when time series of link occurrences are considered.
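
    The learning-automaton idea above can be illustrated with a simple two-action linear reward-inaction (L_R-I) automaton that learns, from one link's occurrence time series, whether to predict the link as present or absent. The reinforcement scheme, learning rate and toy series below are assumptions for illustration and are not the paper's exact algorithm.

        # Two-action linear reward-inaction (L_R-I) automaton predicting one
        # link's occurrence from its past time series; the scheme, learning
        # rate and toy series are assumptions, not the paper's exact algorithm.
        import random

        def lri_predict(history, a=0.1, seed=0):
            """history: list of 0/1 occurrences; returns (prediction, probs)."""
            random.seed(seed)
            p = [0.5, 0.5]                  # action probabilities: 0=absent, 1=present
            for observed in history:
                action = 0 if random.random() < p[0] else 1
                if action == observed:      # reward: shift toward chosen action
                    p[action] += a * (1 - p[action])
                    p[1 - action] = 1 - p[action]
                # on penalty, L_R-I leaves the probabilities unchanged
            return (1 if p[1] > p[0] else 0), p

        history = [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]   # toy series for one link
        print(lri_predict(history))                # likely predicts 1 (present)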

  8. The Utility of the Health Action Process Approach Model for Predicting Physical Activity Intentions and Behavior in Schizophrenia

    Directory of Open Access Journals (Sweden)

    Kelly P. Arbour-Nicitopoulos

    2017-08-01

    Full Text Available Research is needed to develop evidence-based behavioral interventions for preventing and treating obesity that are specific to the schizophrenia population. This study is the precursor to such intervention research where we examined the utility of the social cognitions outlined within the Health Action Process Approach (HAPA) model for predicting moderate to vigorous physical activity (MVPA) intentions and behavior among individuals with schizophrenia or schizoaffective disorder. A prospective cohort design [baseline (T1), week 2 (T2), and week 4 (T3)] was used to examine the HAPA constructs and MVPA across a sample of 101 adults (Mage = 41.5 ± 11.7 years; MBMI = 31.2 ± 7.8 kg/m²; 59% male). Two hierarchical regression analyses were conducted controlling for age, gender, BMI, and previous self-reported MVPA. In the first regression, intentions at T1 were regressed onto the T1 motivational HAPA constructs (risk perception, affective attitudes, task self-efficacy and social support); MVPA status (meeting vs. not meeting the MVPA guidelines) assessed via accelerometry at T3 was regressed onto T1 social support and intentions followed by T2 action and coping planning, and maintenance self-efficacy in the second analysis. Overall, the motivational and social support variables accounted for 28% of the variance in intentions, with affective attitudes (β = 0.33, p < 0.01) and task self-efficacy (β = 0.25, p < 0.05) exhibiting significant, positive relationships. For MVPA status, the model as a whole explained 39% of the variance, with the volitional HAPA constructs explaining a non-significant 3% of this total variance. These findings suggest a need for interventions targeting self-efficacy and affective attitudes within this clinical population.

  9. A novel pre-oxidation method for elemental mercury removal utilizing a complex vaporized absorbent

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Yi, E-mail: zhaoyi9515@163.com; Hao, Runlong; Guo, Qing

    2014-09-15

    Graphical abstract: - Highlights: • An innovative liquid-phase complex absorbent (LCA) for Hg⁰ removal was prepared. • A novel integrative process for Hg⁰ removal was proposed. • The simultaneous removal efficiencies of SO₂, NO and Hg⁰ were 100%, 79.5% and 80.4%, respectively. • The reaction mechanism of simultaneous removal of SO₂, NO and Hg⁰ was proposed. - Abstract: A novel semi-dry integrative method for elemental mercury (Hg⁰) removal has been proposed in this paper, in which Hg⁰ was initially pre-oxidized by a vaporized liquid-phase complex absorbent (LCA) composed of a Fenton reagent, peracetic acid (CH₃COOOH) and sodium chloride (NaCl), after which Hg²⁺ was absorbed by the resultant Ca(OH)₂. The experimental results indicated that CH₃COOOH and NaCl were the best additives for Hg⁰ oxidation. Among the influencing factors, the pH of the LCA and the adding rate of the LCA significantly affected the Hg⁰ removal. The coexisting gases, SO₂ and NO, were characterized as either increasing or inhibiting in the removal process, depending on their concentrations. Under optimal reaction conditions, the efficiency for the single removal of Hg⁰ was 91%. Under identical conditions, the efficiencies of the simultaneous removal of SO₂, NO and Hg⁰ were 100%, 79.5% and 80.4%, respectively. Finally, the reaction mechanism for the simultaneous removal of SO₂, NO and Hg⁰ was proposed based on the characteristics of the removal products as determined by X-ray diffraction (XRD), atomic fluorescence spectrometry (AFS), the analysis of the electrode potentials, and through data from related research references.

  10. A novel pre-oxidation method for elemental mercury removal utilizing a complex vaporized absorbent

    International Nuclear Information System (INIS)

    Zhao, Yi; Hao, Runlong; Guo, Qing

    2014-01-01

    Graphical abstract: - Highlights: • An innovative liquid-phase complex absorbent (LCA) for Hg⁰ removal was prepared. • A novel integrative process for Hg⁰ removal was proposed. • The simultaneous removal efficiencies of SO₂, NO and Hg⁰ were 100%, 79.5% and 80.4%, respectively. • The reaction mechanism of simultaneous removal of SO₂, NO and Hg⁰ was proposed. - Abstract: A novel semi-dry integrative method for elemental mercury (Hg⁰) removal has been proposed in this paper, in which Hg⁰ was initially pre-oxidized by a vaporized liquid-phase complex absorbent (LCA) composed of a Fenton reagent, peracetic acid (CH₃COOOH) and sodium chloride (NaCl), after which Hg²⁺ was absorbed by the resultant Ca(OH)₂. The experimental results indicated that CH₃COOOH and NaCl were the best additives for Hg⁰ oxidation. Among the influencing factors, the pH of the LCA and the adding rate of the LCA significantly affected the Hg⁰ removal. The coexisting gases, SO₂ and NO, were characterized as either increasing or inhibiting in the removal process, depending on their concentrations. Under optimal reaction conditions, the efficiency for the single removal of Hg⁰ was 91%. Under identical conditions, the efficiencies of the simultaneous removal of SO₂, NO and Hg⁰ were 100%, 79.5% and 80.4%, respectively. Finally, the reaction mechanism for the simultaneous removal of SO₂, NO and Hg⁰ was proposed based on the characteristics of the removal products as determined by X-ray diffraction (XRD), atomic fluorescence spectrometry (AFS), the analysis of the electrode potentials, and through data from related research references.

  11. Child protective services utilization of child abuse pediatricians: A mixed methods study.

    Science.gov (United States)

    Girardet, Rebecca; Bolton, Kelly; Hashmi, Syed; Sedlock, Emily; Khatri, Rachna; Lahoti, Nina; Lukefahr, James

    2018-02-01

    Several children's hospitals and medical schools across Texas have child abuse pediatricians (CAPs) who work closely with child protection workers to help ensure accurate assessments of the likelihood of maltreatment in cases of suspected abuse and neglect. Since the state does not mandate which cases should be referred to a CAP center, we were interested in studying factors that may influence workers' decisions to consult a CAP. We used a mixed methods study design consisting of a focus group followed by a survey. The focus group identified multiple factors that impact workers' decision-making, including several that involve medical providers. Responses from 436 completed surveys were compared to employees' number of years of employment and to the state region in which they worked. Focus group findings and survey responses revealed frustration among many workers when dealing with medical providers, and moderate levels of confidence in workers' abilities to make accurate determinations in cases involving medical information. Workers were more likely to refer cases involving serious physical injury than other types of cases. Among workers who reported prior interactions with a CAP, experiences and attitudes regarding CAPs were typically positive. The survey also revealed significant variability in referral practices by state region. Our results suggest that standard guidelines regarding CAP referrals may help workers who deal with cases involving medical information. Future research and quality improvement efforts to improve transfers of information and to better understand the qualities that CPS workers appreciate in CAP teams should improve CAP-CPS coordination. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Methodical approaches to solving special problems of testing. Seminar papers

    International Nuclear Information System (INIS)

    1996-01-01

    This Seminar volume introduces concepts and applications from different areas of application of ultrasonic testing and other non-destructive test methods in 18 lectures, in order to give an idea of new trends in development and stimuli for special solutions to problems. 3 articles were recorded separately for the ENERGY data bank. (orig./MM) [de]

  13. An Integrated Approach to Research Methods and Capstone

    Science.gov (United States)

    Postic, Robert; McCandless, Ray; Stewart, Beth

    2014-01-01

    In 1991, the AACU issued a report on improving undergraduate education suggesting, in part, that a curriculum should be both comprehensive and cohesive. Since 2008, we have systematically integrated our research methods course with our capstone course in an attempt to accomplish the twin goals of comprehensiveness and cohesion. By taking this…

  14. A bottom-up method to develop pollution abatement cost curves for coal-fired utility boilers

    International Nuclear Information System (INIS)

    Vijay, Samudra; DeCarolis, Joseph F.; Srivastava, Ravi K.

    2010-01-01

    This paper illustrates a new method to create supply curves for pollution abatement using boiler-level data that explicitly accounts for technology cost and performance. The Coal Utility Environmental Cost (CUECost) model is used to estimate retrofit costs for five different NOx control configurations on a large subset of the existing coal-fired, utility-owned boilers in the US. The resultant data are used to create technology-specific marginal abatement cost curves (MACCs) and also serve as input to an integer linear program, which minimizes system-wide control costs by finding the optimal distribution of NOx controls across the modeled boilers under an emission constraint. The result is a single optimized MACC that accounts for detailed, boiler-specific information related to NOx retrofits. Because the resultant MACCs do not take into account regional differences in air-quality standards or pre-existing NOx controls, the results should not be interpreted as a policy prescription. The general method as well as the NOx-specific results presented here should be of significant value to modelers and policy analysts who must estimate the costs of pollution reduction.
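
    The boiler-level optimization described above (choose one control configuration per boiler so that total cost is minimized subject to an emission cap) can be illustrated with a toy brute-force version. The paper solves it as an integer linear program over CUECost retrofit data; the boilers, costs, removal fractions and cap below are invented purely for illustration.

        # Toy brute-force version of the control-selection problem (the paper
        # uses an integer linear program over CUECost data; the boilers, costs,
        # removal fractions and cap below are invented for illustration).
        from itertools import product

        # (annualized cost in M$, NOx removal fraction) per option, per boiler
        options = {
            "boiler_A": [(0.0, 0.00), (2.0, 0.40), (5.0, 0.85)],
            "boiler_B": [(0.0, 0.00), (1.5, 0.35), (6.0, 0.90)],
            "boiler_C": [(0.0, 0.00), (3.0, 0.50), (7.5, 0.88)],
        }
        uncontrolled = {"boiler_A": 10.0, "boiler_B": 8.0, "boiler_C": 12.0}  # kt/yr
        cap = 12.0                                                            # kt/yr

        best = None
        for choice in product(*(range(len(v)) for v in options.values())):
            cost = emitted = 0.0
            for (name, opts), k in zip(options.items(), choice):
                c, removal = opts[k]
                cost += c
                emitted += uncontrolled[name] * (1.0 - removal)
            if emitted <= cap and (best is None or cost < best[0]):
                best = (cost, emitted, choice)

        print("minimum cost (M$):", best[0], " emissions (kt):", round(best[1], 2))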

  15. A blended learning approach to teaching sociolinguistic research methods

    Directory of Open Access Journals (Sweden)

    Olivier, Jako

    2014-12-01

    Full Text Available This article reports on the use of Wiktionary, an open source online dictionary, as well as generic wiki pages within a university’s e-learning environment as teaching and learning resources in an Afrikaans sociolinguistics module. In a communal constructivist manner students learnt, but also constructed learning content. From the qualitative research conducted with students it is clear that wikis provide for effective facilitation of a blended learning approach to sociolinguistic research. The use of this medium was positively received, however, some students did prefer handing in assignments in hard copy. The issues of computer literacy and access to the internet were also raised by the respondents. The use of wikis and Wiktionary prompted useful unplanned discussions around reliability and quality of public wikis. The use of a public wiki such as Wiktionary served as encouragement for students as they were able to contribute to the promotion of Afrikaans in this way.

  16. Economic Sustainability in International Business: Peculiarities, Methods and Approaches

    Directory of Open Access Journals (Sweden)

    Otenko Iryna Pavlivna

    2016-05-01

    Full Text Available This article is intended as a contribution to the ongoing analysis of economic sustainability in international business. The study is presented with a view toward furthering understanding and agreement on the key concepts of sustainability. Approaches to sustainability are considered, and important benchmarks and essential characteristics of sustainable development in international business are included. The article demonstrates how the concept of economic sustainability can be applied at the business level. The main ideas of the most widespread concepts of resource management are presented. The incorporation of ESG and financial factors in the concept of sustainable investing is considered. Emissions responsible for climate change are discussed, including top emitters, key issues and figures.

  17. The pyrolytic-plasma method and the device for the utilization of hazardous waste containing organic compounds

    Energy Technology Data Exchange (ETDEWEB)

    Opalińska, Teresa [Tele and Radio Research Institute, Ratuszowa 11, 03-450 Warsaw (Poland); Wnęk, Bartłomiej, E-mail: bartlomiej.wnek@itr.org.pl [Tele and Radio Research Institute, Ratuszowa 11, 03-450 Warsaw (Poland); Witowski, Artur; Juszczuk, Rafał; Majdak, Małgorzata [Tele and Radio Research Institute, Ratuszowa 11, 03-450 Warsaw (Poland); Bartusek, Stanilav [VŠB—Technical University of Ostrava, 17. listopadu 15/2172, 708 33 Ostrava − Poruba Czech Republic (Czech Republic)

    2016-11-15

    Highlights: • The first stage of the waste utilization process consisted of pyrolysis of the waste. • The pyrolytic gas was then oxidized with the use of non-equilibrium plasma. • A device for implementing the process was built and characterized. • Correct operation of the device was proven using the decomposition of PE. • Usefulness of the method was proven in the process of utilization of EW. - Abstract: This paper is focused on a new method of waste processing. The waste, including hazardous waste, contains organic compounds. The method consists of two main processes: the pyrolysis of the waste and the oxidation of the pyrolytic gas with the use of non-equilibrium plasma. The practical implementation of the method required the design, construction and testing of a new device at large laboratory scale. The experiments were carried out for two kinds of waste: polyethylene as a model waste and electronic waste as a real waste. The process of polyethylene decomposition showed that the operation of the device is correct, because 99.74% of the carbon moles contained in the PE samples were detected in the gas after the process. Thus, the PE samples were practically completely pyrolyzed to hydrocarbons, which were completely oxidized in the plasma reactor. The device also turned out to be useful for decomposition of the electronic waste. The conditions in the plasma reactor during the oxidation of the pyrolysis products did not promote the formation of PCDD/Fs, despite the presence of oxidizing conditions. An important parameter determining the efficiency of the oxidation of the pyrolysis products is the gas temperature in the plasma reactor.

  18. SU-E-T-492: Implementing a Method for Brain Irradiation in Rats Utilizing a Commercially Available Radiosurgery Irradiator

    International Nuclear Information System (INIS)

    Cates, J; Drzymala, R

    2014-01-01

    Purpose: The purpose of the study was to implement a method for accurate rat brain irradiation using the Gamma Knife Perfexion unit. The system needed to be repeatable, efficient, and dosimetrically and spatially accurate. Methods: A platform (“rat holder”) was made such that it is attachable to the Leksell Gamma Knife G Frame. The rat holder utilizes two ear bars contacting bony anatomy and a front tooth bar to secure the rat. The rat holder fits inside the Leksell localizer box, which utilizes fiducial markers to register with the GammaPlan planning system. This method allows for accurate, repeatable setup. A cylindrical phantom was made so that film can be placed axially in the phantom. We then acquired CT image sets of the rat holder and localizer box with both a rat and the phantom. Three treatment plans were created: a plan on the rat CT dataset, a phantom plan with the same prescription dose as the rat plan, and a phantom plan with the same delivery time as the rat plan. Results: Film analysis from the phantom showed that our setup is spatially accurate and repeatable. It is also dosimetrically accurate, with a difference between predicted and measured dose of 2.9%. Film analysis with the prescription dose equal between rat and phantom plans showed a difference of 3.8%, showing that our phantom is a good representation of the rat for dosimetry purposes, allowing for +/- 3 mm diameter variation. Film analysis with treatment time equal showed an error of 2.6%, which means we can deliver a prescription dose within 3% accuracy. Conclusion: Our method for irradiation of the rat brain has been shown to be repeatable, efficient, and accurate, both dosimetrically and spatially. We can treat a large number of rats efficiently while delivering prescription doses within 3% at millimeter-level accuracy.

  19. About One Approach to Determine the Weights of the State Space Method

    Directory of Open Access Journals (Sweden)

    I. K. Romanova

    2015-01-01

    Full Text Available The article studies methods of determining weight coefficients, also called coefficients of criteria importance, in multiobjective optimization (MOO). It is assumed that these coefficients indicate the degree of influence of individual criteria on the final selection (final or summary assessment): the larger the coefficient, the greater the contribution of its corresponding criterion. Today, in the framework of modern information systems to support decision making for various purposes, a number of methods for determining the relative importance of criteria have been developed. Among those methods we can distinguish the utility method, the method of weighted power average, the weighted median, the method of matching clustered rankings, the method of paired comparison of importance, etc. However, it should be noted that the different techniques available for calculating weights do not eliminate the main problem of multicriteria optimization, namely the inconsistency of individual criteria. The basis for solving multicriteria problems is the fundamental principle of multi-criteria selection, i.e. the Edgeworth-Pareto principle. Despite the large number of methods to determine the weights, the task remains relevant not only because evaluations are subjective, but also because of mathematical aspects. It is now recognized that, for example, such a popular method as the linear convolution of individual criteria is essentially a heuristic approach and, applying it, one may not obtain the best final choice. The Carlin lemma reflects the limits of the method's application. The aim of this work is to offer a method to calculate the weights applied to the problem of dynamic system optimization, the quality of which is determined by a criterion of a special type, namely an integral quadratic quality criterion. The main challenge relates to the state space method, which in the literature is also called the method of analytical design of optimal controllers. Despite the
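
    As an aside grounded only in general practice (not the author's proposed method), the linear convolution of criteria that the abstract singles out as a popular heuristic can be sketched as a weighted sum of normalised criteria; all numbers below are invented for illustration.

        # Minimal sketch of linear convolution (weighted-sum scalarisation) of criteria.
        import numpy as np

        criteria = np.array([            # rows: alternatives, columns: criteria (larger = better)
            [0.8, 0.4, 0.9],
            [0.6, 0.9, 0.5],
            [0.9, 0.5, 0.4],
        ])
        weights = np.array([0.5, 0.3, 0.2])   # assumed importance coefficients, summing to 1

        # Normalise each criterion to [0, 1] so the weights act on comparable scales.
        norm = (criteria - criteria.min(axis=0)) / np.ptp(criteria, axis=0)
        scores = norm @ weights               # summary assessment of each alternative

        print("scores:", scores)
        print("ranking (best first):", np.argsort(-scores))

    Because a different weight vector can reverse this ranking, and a weighted sum cannot reach non-convex parts of the Pareto front, the choice of the weights is exactly the difficulty the article addresses.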

  20. Deviation-based spam-filtering method via stochastic approach

    Science.gov (United States)

    Lee, Daekyung; Lee, Mi Jin; Kim, Beom Jun

    2018-03-01

    In the presence of a huge number of possible purchase choices, ranks or ratings of items by others often play a very important role for a buyer making a final purchase decision. Perfectly objective rating is an impossible task to achieve, and we often use an average rating built on how previous buyers estimated the quality of the product. The problem with using a simple average rating is that it can easily be polluted by careless users whose evaluation of products cannot be trusted, and by malicious spammers who try to bias the rating result on purpose. In this letter we suggest how the trustworthiness of individual users can be systematically and quantitatively reflected to build a more reliable rating system. We compute a suitably defined reliability of each user based on the user's rating pattern for all products she evaluated. We call our proposed method the deviation-based ranking, since the statistical significance of each user's rating pattern with respect to the average rating pattern is the key ingredient. We find that our deviation-based ranking method outperforms existing methods in filtering out careless random evaluators as well as malicious spammers.
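
    To make the idea concrete, the following is a hedged sketch of a deviation-based weighting; the reliability function used here is an assumption for illustration, whereas the paper derives the weight from the statistical significance of each user's rating pattern.

        # Illustrative deviation-based reliability weighting for raters (toy data).
        import numpy as np

        # ratings[u, i] = rating of item i by user u; NaN = not rated (made-up values)
        ratings = np.array([
            [5.0, 4.0, np.nan, 2.0],
            [4.0, 4.0, 3.0,    2.0],
            [1.0, 5.0, 5.0,    np.nan],   # a careless or spam-like profile
        ])

        item_mean = np.nanmean(ratings, axis=0)

        # Per-user root-mean-square deviation from the item averages.
        dev = np.sqrt(np.nanmean((ratings - item_mean) ** 2, axis=1))

        # Assumed reliability weight: the larger the deviation, the smaller the weight.
        w = 1.0 / (1.0 + dev)

        # Item scores recomputed as reliability-weighted averages over rated items.
        mask = ~np.isnan(ratings)
        filled = np.where(mask, ratings, 0.0)
        weighted = (w[:, None] * filled).sum(axis=0) / (w[:, None] * mask).sum(axis=0)

        print("plain averages:   ", item_mean)
        print("weighted averages:", weighted)

    The effect is that ratings from profiles deviating strongly from the consensus, whether careless or malicious, are discounted in the final item scores.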

  1. Breast cancer tumor classification using LASSO method selection approach

    International Nuclear Information System (INIS)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M.

    2016-10-01

    Breast cancer is one of the leading causes of death worldwide among women. Early tumor detection is key in reducing breast cancer deaths, and screening mammography is the most widely available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed at automatically classifying tumor lesions as malignant or benign as a means of providing a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having a malignant tumor (malignant) and healthy subject (benign). In this study, 143 abnormal segmentations (57 malignant and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with a high probability of malignant findings, aiding in the detection of potentially malignant cases as a second opinion to the radiologist. (Author)

  2. Breast cancer tumor classification using LASSO method selection approach

    Energy Technology Data Exchange (ETDEWEB)

    Celaya P, J. M.; Ortiz M, J. A.; Martinez B, M. R.; Solis S, L. O.; Castaneda M, R.; Garza V, I.; Martinez F, M.; Ortiz R, J. M., E-mail: morvymm@yahoo.com.mx [Universidad Autonoma de Zacatecas, Av. Ramon Lopez Velarde 801, Col. Centro, 98000 Zacatecas, Zac. (Mexico)

    2016-10-15

    Breast cancer is one of the leading causes of death worldwide among women. Early tumor detection is key in reducing breast cancer deaths, and screening mammography is the most widely available method for early detection. Mammography is the most common and effective breast cancer screening test. However, the rate of positive findings is very low, making the radiologic interpretation monotonous and biased toward errors. In an attempt to alleviate radiological workload, this work presents a computer-aided diagnosis (CADx) method aimed at automatically classifying tumor lesions as malignant or benign as a means of providing a second opinion. The CADx method extracts image features and classifies the screening mammogram abnormality into one of two categories: subject at risk of having a malignant tumor (malignant) and healthy subject (benign). In this study, 143 abnormal segmentations (57 malignant and 86 benign) from the Breast Cancer Digital Repository (BCDR) public database were used to train and evaluate the CADx system. Percentile rank (p-rank) was used to standardize the data. Using the LASSO feature selection methodology, the model achieved a leave-one-out cross-validation area under the receiver operating characteristic curve (AUC) of 0.950. The proposed method has the potential to rank abnormal lesions with a high probability of malignant findings, aiding in the detection of potentially malignant cases as a second opinion to the radiologist. (Author)
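
    As a hedged illustration of the general workflow described in this record (not the authors' code), an L1-penalised logistic regression can act as a LASSO-style feature selector and be scored with a leave-one-out cross-validated AUC; the feature matrix below is a random placeholder for the percentile-rank-standardised BCDR features.

        # LASSO-style feature selection and LOOCV AUC, sketched with scikit-learn.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(0)
        X = rng.random((143, 20))            # placeholder feature matrix (143 lesions)
        y = rng.integers(0, 2, size=143)     # placeholder labels: 1 = malignant, 0 = benign

        clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.5, max_iter=1000)

        # Out-of-sample malignancy probabilities, one left-out lesion at a time.
        proba = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="predict_proba")[:, 1]
        print("LOOCV AUC:", roc_auc_score(y, proba))

        # Features that survive the L1 penalty when the model is fit on all data.
        clf.fit(X, y)
        print("selected feature indices:", np.flatnonzero(clf.coef_[0]))

    In practice the regularisation strength C would be tuned, for example by nested cross-validation, rather than fixed as above.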

  3. Bio-energies usages in electricity generation utility means through a modeling approach: an application to the French case

    International Nuclear Information System (INIS)

    Le Cadre, Elodie; Lantz, Frederic; Farnoosh, Arash

    2011-12-01

    The introduction of renewable energy targets and CO2 emissions trading systems makes the investment decision more difficult and adds an additional flow of expenditure for electricity producers. This challenge represents a double advantage for biomass, a renewable alternative to fossil fuels whose use in coal-fired thermal plants can reduce emissions of greenhouse gases. Whereas renewable energy sources (RES-E) are now supported by various economic tools such as feed-in tariffs and calls for tender, the use of biomass is also enhanced by the CO2 emission price. However, the evolution of the emission allowance price is uncertain. Taking into account these incentives, the scope of this work is to study the penetration of green electricity production from biomass and its impact on the future electricity generation mix of France, incorporating different scenarios of emission allowance prices. While in the short-term approach the use of power plants is organized according to their increasing running costs, in the long-term approach capacity expansion planning should be determined by using several optimization methods. We therefore develop a model based on a linear dynamic programming approach for supporting electricity generation management, in which renewable energies and climate and energy policies are modeled. We apply the model to the French power market while taking the neighboring countries into consideration. We present the results of the initialization and of the electricity price and production tests. The expected fuel demands over 2020 are then presented for different CO2 prices as an example to illustrate how the model can be put to use in different contexts. (authors)
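
    As a purely illustrative sketch of the kind of optimization behind such a model (not the authors' model itself), a single-period dispatch problem with a CO2 price folded into each plant's running cost can be written as a small linear program; all capacities, costs and emission factors below are invented.

        # Toy merit-order dispatch with a CO2 price added to running costs (scipy LP).
        import numpy as np
        from scipy.optimize import linprog

        demand = 60000.0                                 # MWh to serve in the period
        capacity = np.array([20000.0, 25000.0, 30000.0]) # nuclear, coal + biomass co-firing, gas
        fuel_cost = np.array([10.0, 25.0, 45.0])         # EUR/MWh (assumed)
        co2_factor = np.array([0.0, 0.75, 0.35])         # tCO2/MWh (assumed)
        co2_price = 30.0                                 # EUR/tCO2 scenario

        cost = fuel_cost + co2_price * co2_factor        # effective running cost per MWh

        # Minimise total cost subject to meeting demand within capacity limits.
        res = linprog(c=cost,
                      A_eq=[np.ones(3)], b_eq=[demand],
                      bounds=[(0.0, cap) for cap in capacity],
                      method="highs")
        print("generation (MWh):", res.x, "  total cost (EUR):", res.fun)

    Raising co2_price shifts generation away from the high-emission plant, which is the mechanism the different allowance-price scenarios in the study explore.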

  4. A Formal Methods Approach to the Analysis of Mode Confusion

    Science.gov (United States)

    Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.

    2004-01-01

    The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data, with a focus on the most recent history. Pilot error is the most commonly cited cause of fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash (2/14/90), the Strasbourg A320 crash (1/20/92), the Mulhouse-Habsheim A320 crash (6/26/88), and the Toulouse A330 crash (6/30/94). These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal

  5. Utility of the pooling approach as applied to whole genome association scans with high-density Affymetrix microarrays

    Directory of Open Access Journals (Sweden)

    Gray Joanna

    2010-11-01

    Full Text Available Background: We report an attempt to extend the previously successful approach of combining SNP (single nucleotide polymorphism) microarrays and DNA pooling (SNP-MaP) employing high-density microarrays. Whereas earlier studies employed a range of Affymetrix SNP microarrays comprising from 10 K to 500 K SNPs, this most recent investigation used the 6.0 chip, which displays 906,600 SNP probes and 946,000 probes for the interrogation of CNVs (copy number variations). The genotyping assay using the Affymetrix SNP 6.0 array is highly demanding on sample quality due to the small feature size, low redundancy, and lack of mismatch probes. Findings: In the first study published so far using this microarray on pooled DNA, we found that pooled cheek swab DNA could not accurately predict the real allele frequencies of the samples that comprised the pools. In contrast, the allele frequency estimates using blood DNA pools were reasonable, although inferior compared to those obtained with previously employed Affymetrix microarrays. However, it might be possible to improve performance by developing improved analysis methods. Conclusions: Despite the decreasing costs of genome-wide individual genotyping, the pooling approach may have applications in very large-scale case-control association studies. In such cases, our study suggests that high-quality DNA preparations and lower density platforms should be preferred.
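
    For context, and hedged as the generic SNP-MaP idea rather than this study's pipeline, a pooled allele frequency is typically estimated from the relative allele signal of the A and B probes, optionally corrected with a k-factor derived from heterozygous individual samples; the intensities below are invented.

        # Relative allele signal for pooled DNA (generic SNP-MaP-style estimate).
        import numpy as np

        def pooled_allele_freq(intensity_a, intensity_b, k=1.0):
            """Estimate the frequency of allele A; k corrects unequal A/B probe response."""
            a = np.asarray(intensity_a, dtype=float)
            b = np.asarray(intensity_b, dtype=float)
            return a / (a + k * b)

        # Made-up probe intensities for three SNPs measured on one DNA pool.
        a = np.array([1200.0, 800.0, 950.0])
        b = np.array([ 400.0, 900.0, 300.0])
        print(pooled_allele_freq(a, b, k=1.1))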

  6. Effect of regulation on the rate of adoption of cost saving scale technology in the electric utility industry: a portfolio approach

    International Nuclear Information System (INIS)

    Scheraga, C.A.

    1984-01-01

    This study presents a new analytical framework for examining the relationship between regulation and the investment behavior of electric utilities. The particular kind of investment behavior considered is the adoption of new as well as innovative electrical generation technology. The technologies of interest are large-scale coal and nuclear generation plants. The theoretical model used in this study differs from traditional approaches in its utilization of a behavioral framework of analysis. The responsiveness of utilities to the required rate of return demanded by stockholders is demonstrated using an augmented form of the standard Capital Asset Pricing Model. Regulation is viewed as affecting utility investment behavior through the effect of the actions of regulatory commissions on the required rate of return that utilities must earn on equity. The particular regulatory policies considered are modification of the existing rate structure, automatic adjustment clauses, and required efficiency standards. These policies are of particular interest both because their effects have not been previously examined in detail, and because they are recommended for adoption in the Public Utility Regulatory Policies Act of 1978. It is demonstrated empirically that the above policies do affect the required rate of return for utilities and hence their innovative investment behavior.
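
    For orientation only, and as general background rather than the study's own specification, the standard (unaugmented) Capital Asset Pricing Model on which the augmented form builds gives the required return on a utility's equity as

        E[R_i] = R_f + \beta_i \left( E[R_m] - R_f \right)

    where R_f is the risk-free rate, E[R_m] the expected market return, and \beta_i the utility's systematic risk. In this framework, regulatory actions that change the systematic risk borne by shareholders change \beta_i and hence the return the utility must earn on equity, which is the channel through which the study links regulation to innovative investment.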

  7. Development of Nuclear Safety Culture evaluation method for an operation team based on the probabilistic approach

    International Nuclear Information System (INIS)

    Han, Sang Min; Lee, Seung Min; Yim, Ho Bin; Seong, Poong Hyun

    2018-01-01

    Highlights: •We proposed a Probabilistic Safety Culture Healthiness Evaluation Method. •A positive relationship between the ‘success’ states of NSC and performance was shown. •The state probability profile showed a unique ratio regardless of the scenarios. •Cutset analysis provided not only the root causes but also the latent causes of failures. •Pro-SCHEMe was found to be applicable to Korean NPPs. -- Abstract: The aim of this study is to propose a new quantitative evaluation method for Nuclear Safety Culture (NSC) in Nuclear Power Plant (NPP) operation teams based on the probabilistic approach. Various NSC evaluation methods have been developed, and the Korean NPP utility company has conducted NSC assessments according to international practice. However, most of these methods rely on interviews, observations, and self-assessment. Consequently, the results are often qualitative, subjective, and mainly dependent on the evaluator’s judgement, so the assessment results can be interpreted from different perspectives. To resolve the limitations of present evaluation methods, the concept of Safety Culture Healthiness was suggested to produce quantitative results and provide a faster evaluation process. This paper presents the Probabilistic Safety Culture Healthiness Evaluation Method (Pro-SCHEMe), which generates quantitative inputs for Human Reliability Assessment (HRA) in Probabilistic Safety Assessment (PSA). Evaluation items, each of which corresponds to a basic event in PSA, are derived in the first part of the paper through a literature survey, mostly of nuclear-related organizations such as the International Atomic Energy Agency (IAEA), the United States Nuclear Regulatory Commission (U.S.NRC), and the Institute of Nuclear Power Operations (INPO). Event trees (ETs) and fault trees (FTs) are devised to apply the evaluation items to PSA based on the relationships among such items. Modeling guidelines are also suggested to classify and calculate the NSC characteristics of
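
    As a toy illustration of the PSA-style quantification the method feeds into (not Pro-SCHEMe itself), a small fault tree with independent basic events can be evaluated with standard AND/OR gate formulas; the events and probabilities below are hypothetical.

        # Quantifying a tiny fault tree with independent basic events (toy example).
        def or_gate(*p):
            """Probability that at least one independent event occurs."""
            q = 1.0
            for pi in p:
                q *= (1.0 - pi)
            return 1.0 - q

        def and_gate(*p):
            """Probability that all independent events occur."""
            q = 1.0
            for pi in p:
                q *= pi
            return q

        # Hypothetical basic-event probabilities for degraded safety-culture traits.
        p_comm = 0.05    # inadequate communication
        p_sup  = 0.02    # weak supervisory oversight
        p_proc = 0.01    # procedure non-compliance

        # Hypothetical top event: procedures are not followed AND either
        # communication or supervision is degraded.
        p_top = and_gate(p_proc, or_gate(p_comm, p_sup))
        print(f"top-event probability = {p_top:.2e}")

    In the method described above, the basic events correspond to the evaluation items derived from the literature survey, and cutset analysis of such trees is what yields the root and latent causes mentioned in the highlights.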

  8. The Integral Method, a new approach to quantify bactericidal activity.

    Science.gov (United States)

    Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus

    2015-08-01

    The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization, and particularly a comparison of these substances, is however impossible with this information alone. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with a single number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by incorporating the agent's concentration C, the average specific bactericidal activity SBA = BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137 log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its applications comprise the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, the ranking of isosteric agents, the comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
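
    A rough numerical sketch of the quantities involved is given below; the paper's exact normalisation of the reciprocal area is not reproduced in the abstract, so the BA value here is only a stand-in mean log-reduction rate, while SBA = BA/C follows the abstract's definition. All data are invented.

        # Area below a killing curve (trapezoid rule) and an SBA = BA/C calculation.
        import numpy as np

        t       = np.array([0.0, 5.0, 10.0, 20.0, 30.0])   # time, min (made-up)
        log_cfu = np.array([6.0, 5.2,  4.1,  2.5,  1.0])    # log10 CFU/mL killing curve

        # Trapezoid-rule area below the killing curve; its reciprocal is the quantity
        # the Integral Method builds on (the paper applies its own normalisation).
        area = np.sum(0.5 * (log_cfu[1:] + log_cfu[:-1]) * np.diff(t))
        print("reciprocal area:", 1.0 / area, "1/(log10 CFU*min)")

        # Stand-in average bactericidal activity and the specific activity per mM.
        BA = (log_cfu[0] - log_cfu[-1]) / (t[-1] - t[0])    # log10 CFU/min
        C = 0.5                                             # assumed concentration, mM
        SBA = BA / C                                        # log10 CFU/min/mM
        print(f"BA = {BA:.3f} log10 CFU/min, SBA = {SBA:.3f} log10 CFU/min/mM")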

  9. Methods and models in mathematical biology deterministic and stochastic approaches

    CERN Document Server

    Müller, Johannes

    2015-01-01

    This book developed from classes in mathematical biology taught by the authors over several years at the Technische Universität München. The main themes are modeling principles, mathematical principles for the analysis of these models, and model-based analysis of data. The key topics of modern biomathematics are covered: ecology, epidemiology, biochemistry, regulatory networks, neuronal networks, and population genetics. A variety of mathematical methods are introduced, ranging from ordinary and partial differential equations to stochastic graph theory and branching processes. A special emphasis is placed on the interplay between stochastic and deterministic models.

  10. New Approaches to Aluminum Integral Foam Production with Casting Methods

    Directory of Open Access Journals (Sweden)

    Ahmet Güner

    2015-08-01

    Full Text Available Integral foam has been used in the production of polymer materials for a long time. Metal integral foam casting systems are obtained by transferring and adapting polymer injection technology. Metal integral foam produced by casting has a solid skin at the surface and a foam core. Producing near-net shape reduces production expenses. Insurance companies nowadays want the automotive industry to use metallic foam parts because of their higher impact energy absorption properties. In this paper, manufacturing processes of aluminum integral foam with casting methods will be discussed.

  11. Time interval approach to the pulsed neutron logging method

    International Nuclear Information System (INIS)

    Zhao Jingwu; Su Weining

    1994-01-01

    The time interval between neighbouring neutrons emitted from a steady-state neutron source can be treated as that from a time-dependent neutron source. In the rock space, the neutron flux is given by the neutron diffusion equation and is composed of an infinite number of terms. Each term is composed of two die-away curves. The delay action is discussed and used to measure the time interval with only one detector in the experiment. Nuclear reactions with the time distributions due to the different types of radiation observed in neutron well-logging methods are presented with a view to obtaining the rock nuclear parameters from the time-interval technique.
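
    As a generic illustration of how die-away constants are usually extracted from such time distributions (not the paper's derivation), a sum of two exponentials can be fitted to synthetic count data:

        # Fitting a two-component die-away curve to synthetic pulsed-neutron-style data.
        import numpy as np
        from scipy.optimize import curve_fit

        def two_exp(t, a1, lam1, a2, lam2):
            return a1 * np.exp(-lam1 * t) + a2 * np.exp(-lam2 * t)

        t = np.linspace(0.0, 1000.0, 200)                  # time, microseconds
        true = two_exp(t, 800.0, 0.02, 150.0, 0.004)
        counts = np.random.default_rng(1).poisson(true)    # add counting statistics

        p0 = (500.0, 0.01, 100.0, 0.001)                   # rough initial guesses
        popt, _ = curve_fit(two_exp, t, counts, p0=p0)
        print("fitted amplitudes and decay constants:", popt)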

  12. Review of Methods and Approaches for Deriving Numeric ...

    Science.gov (United States)

    EPA will propose numeric criteria for nitrogen/phosphorus pollution to protect estuaries, coastal areas and South Florida inland flowing waters that have been designated Class I, II and III, as well as downstream protective values (DPVs) to protect estuarine and marine waters. In accordance with the formal determination and pursuant to a subsequent consent decree, these numeric criteria are being developed to translate and implement Florida’s existing narrative nutrient criterion, to protect the designated use that Florida has previously set for these waters, at Rule 62-302.530(47)(b), F.A.C., which provides that “In no case shall nutrient concentrations of a body of water be altered so as to cause an imbalance in natural populations of aquatic flora or fauna.” Under the Clean Water Act and EPA’s implementing regulations, these numeric criteria must be based on sound scientific rationale and reflect the best available scientific knowledge. EPA has previously published a series of peer-reviewed technical guidance documents to develop numeric criteria to address nitrogen/phosphorus pollution in different water body types. EPA recognizes that available and reliable data sources for use in numeric criteria development vary across estuarine and coastal waters in Florida and flowing waters in South Florida. In addition, scientifically defensible approaches for numeric criteria development have different requirements that must be taken into consider

  13. METHODICAL APPROACHES TO THE CREATION MOOC (EXPERIENCE LAC

    Directory of Open Access Journals (Sweden)

    Larysa Nozdrina

    2016-03-01

    Full Text Available The article presents a number of problems determining the current state of development of the domestic market of massive open online courses (MOOCs). In line with the aim of the research, national experience in this area is described, and a number of criteria and methodological approaches to the implementation of MOOCs in higher education are proposed and substantiated. The development of MOOCs at the Lviv Academy of Commerce (LAC) is reviewed as an example. The main factors examined in the research, which determine the success of this educational innovation in the domestic market, are the software platform, the multimedia software for creating video lectures, the course structure and the support of the learning process. The results of the study can have a positive impact on the functioning of the MOOC market in Ukrainian universities. The article focuses on finding ways of improving the process of developing and implementing MOOCs in higher education, using the example of LAC, where a Web-center on the MOODLE platform is used for e-learning. Further research should focus on the development of institutional mechanisms to ensure the effective design, implementation and operation of MOOCs in the universities of Ukraine. Particular attention during learning in a massive open online course should be paid to improving the educational process and strengthening students' motivation. The experience of MOOC development at LAC can be useful when creating similar courses at other institutions of higher education, both on the MOODLE platform and on others.

  14. Methodical Approach to Managing Resources at an Industrial Enterprise

    Directory of Open Access Journals (Sweden)

    Antonets Olga O.

    2013-11-01

    Full Text Available The goal of the article is to identify optimal ways of managing the material resources of an industrial enterprise on the basis of economic and mathematical modelling. Through the analysis and systematisation of works by foreign and domestic scientists, the article concludes that complex solutions for building logistic resource-management systems that are both simple and adaptive are insufficiently developed. The article provides results of a study of the specific features of resource management at enterprises, among which are surplus (or deficit) of resources and the presence of non-liquid reserves. In order to eliminate these shortcomings, the article offers a situational management procedure that takes the possible state of reserves into consideration. The article improves the model for selecting the volume of supply of material resources and identifies optimal solutions under interval uncertainty. A further direction of the study is the integration of the proposed resource-management approach with the system of financial planning at an industrial enterprise.

  15. Standardless quantification approach of TXRF analysis using fundamental parameter method

    International Nuclear Information System (INIS)

    Szaloki, I.; Taniguchi, K.

    2000-01-01

    New standardless evaluation procedure based on the fundamental parameter method (FPM) has been developed for TXRF analysis. The theoretical calculation describes the relationship between characteristic intensities and the geometrical parameters of the excitation, detection system and the specimen parameters: size, thickness, angle of the excitation beam to the surface and the optical properties of the specimen holder. Most of the TXRF methods apply empirical calibration, which requires the application of special preparation technique. However, the characteristic lines of the specimen holder (Si Kα,β) present information from the local excitation and geometrical conditions on the substrate surface. On the basis of the theoretically calculation of the substrate characteristic intensity the excitation beam flux can be approximated. Taking into consideration the elements are in the specimen material a system of non-linear equation can be given involving the unknown concentration values and the geometrical and detection parameters. In order to solve this mathematical problem PASCAL software was written, which calculates the sample composition and the average sample thickness by gradient algorithm. Therefore, this quantitative estimation of the specimen composition requires neither external nor internal standard sample. For verification of the theoretical calculation and the numerical procedure, several experiments were carried out using mixed standard solution containing elements of K, Sc, V, Mn, Co and Cu in 0.1 - 10 ppm concentration range. (author)

  16. Methods of counting ribs on chest CT: the modified sternomanubrial approach

    Energy Technology Data Exchange (ETDEWEB)

    Yi, Kyung Sik; Kim, Sung Jin; Jeon, Min Hee; Lee, Seung Young; Bae, Il Hun [Chungbuk National University, Cheongju (Korea, Republic of)

    2007-08-15

    The purpose of this study was to evaluate the accuracy of each method of counting ribs on chest CT and to propose a new method: the anterior approach using the sternocostal joints. CT scans of 38 rib lesions in 27 patients were analyzed (fracture: 25, metastasis: 11, benign bone disease: 2). Each lesion was independently counted by three radiologists using three different methods for counting ribs: the sternoclavicular approach, the xiphisternal approach and the modified sternomanubrial approach. For the evaluation of each method, the rib lesions were divided into three parts according to the location of the lesion, as follows: the upper part (between the first and fourth thoracic vertebrae), the middle part (between the fifth and eighth) and the lower part (between the ninth and twelfth). The most accurate method was the modified sternomanubrial approach (99.1%). The accuracies of the xiphisternal approach and the sternoclavicular approach were 95.6% and 88.6%, respectively. The modified sternomanubrial approach showed the highest accuracy in all three parts (100%, 100% and 97.9%, respectively). We propose a new method for counting ribs, the modified sternomanubrial approach, which was more accurate than the known methods in all parts of the bony thorax, and it may be an easier and quicker method than the others in clinical practice.

  17. Generalized perturbation theory (GPT) methods. A heuristic approach

    International Nuclear Information System (INIS)

    Gandini, A.

    1987-01-01

    Wigner first proposed a perturbation theory as early as 1945 to study fundamental quantities such as the reactivity worths of different materials. The first formulation, CPT, for conventional perturbation theory, is based on universal quantum mechanics concepts. Since that early conception, significant contributions have been made to CPT, in particular by Soodak, who rendered a heuristic interpretation of the adjoint function (an approach referred to as the GPT method, for generalized perturbation theory). The author illustrates the GPT methodology in a variety of linear and nonlinear domains encountered in nuclear reactor analysis. The author begins with the familiar linear neutron field and then generalizes the methodology to other linear and nonlinear fields, using heuristic arguments. The author believes that the inherent simplicity and elegance of the heuristic derivation, although intended here for reactor physics problems, might be usefully adopted in collateral fields, and includes such examples.

  18. Utility Assessment Methods.

    Science.gov (United States)

    1982-05-01

    Raiffa [83], LaValle [89], and other books on decision analysis. 4.2 Risk Attitudes: Much recent research has focused on the investigation of various risk... Issacs, G.L., Hamer, R., Chen, J., Chuang, D., Woodworth, G., Molenaar, I., Lewis, C., and Libby, D., Manual for the Computer-Assisted Data Analysis (CADA

  19. A Formative Evaluation with Extension Educators: Exploring Implementation Approaches Using Web-based Methods

    Directory of Open Access Journals (Sweden)

    Adrienne M. Duke

    2017-10-01

    Full Text Available The article describes the formative evaluation of a bullying prevention program called Be SAFE from the perspective of Extension educators. Twelve regional and county educators from Family and Child Development and 4-H Youth Development participated in our study. We used a web-based, mixed-methods approach, utilizing both Qualtrics, an online survey software platform, and Scopia, a video conferencing application, to collect survey data and conduct a focus group. The results of the survey show that three activities (Clear Mind, Mud Mind; Take a Stand; and The Relationship Continuum) were perceived as garnering the most participation from students. However, focus group data indicated that, while there was often a high level of participation, the subject matter of the curriculum was too advanced for students in the fifth grade and that classroom size affected how well educators could teach lessons. Furthermore, school access was not an implementation challenge, but the number of days available to implement the full curriculum was sometimes limited. The data collected through this formative evaluation were used to improve implementation efforts. The process outlined in this article can be used as a model to help program leaders who are interested in using web-based tools to evaluate implementation processes.

  20. Tank Operations Contract Construction Management Methodology. Utilizing The Agency Method Of Construction Management To Safely And Effectively Complete Nuclear Construction Work

    International Nuclear Information System (INIS)

    Leso, K.F.; Hamilton, H.M.; Farner, M.; Heath, T.

    2010-01-01

    Washington River Protection Solutions, LLC (WRPS) has faced significant project management challenges in managing Davis-Bacon construction work that meets contractually required small business goals. The unique challenge is to provide contracting opportunities to multiple small business construction subcontractors while performing high-hazard work in a safe and productive manner. Prior to the Washington River Protection Solutions, LLC contract, construction work at the Hanford Tank Farms was contracted to large companies, while current Department of Energy (DOE) contracts typically emphasize small business awards. As an integral part of nuclear project management at the Hanford Tank Farms, construction involves removal of old equipment and structures and installation of new infrastructure to support waste retrieval and waste feed delivery to the Waste Treatment Plant. Utilizing the optimum construction approach ensures that the contractors responsible for this work are successful in meeting safety, quality, cost and schedule objectives while working in a very hazardous environment. This paper describes the successful transition from a traditional project delivery method that utilized a large business general contractor and subcontractors to a new project construction management model that is more oriented to small businesses. Construction has selected the Agency Construction Management Method. This method was implemented in the first quarter of Fiscal Year (FY) 2009, with construction management performed substantially by home-office resources from the URS Northwest Office in Richland, Washington. The Agency Method has allowed WRPS to provide proven construction managers and field leads to mentor and direct small business contractors, thus providing expertise and assurance of a successful project. Construction execution contracts are subcontracted directly by WRPS to small or disadvantaged contractors that are mentored and supported by URS personnel. Each small