WorldWideScience

Sample records for method requires minimal

  1. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and the volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, the authors have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  2. Fast nonconvex nonsmooth minimization methods for image restoration and reconstruction.

    Science.gov (United States)

    Nikolova, Mila; Ng, Michael K; Tam, Chi-Pan

    2010-12-01

Nonconvex nonsmooth regularization has advantages over convex regularization for restoring images with neat edges. However, its practical interest used to be limited by the difficulty of the computational stage, which requires a nonconvex nonsmooth minimization. In this paper, we deal with nonconvex nonsmooth minimization methods for image restoration and reconstruction. Our theoretical results show that the solution of the nonconvex nonsmooth minimization problem is composed of constant regions surrounded by closed contours and neat edges. The main goal of this paper is to develop fast minimization algorithms to solve the nonconvex nonsmooth minimization problem. Our experimental results show the effectiveness and efficiency of the proposed algorithms.

  3. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
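The patent's approximate-error idea can be sketched in plain Python: run a conjugate-gradient-style minimization of a least-squares objective, but compute the error and gradient from a random subset of rows standing in for the selected rays. Everything here (the function name, the `subset_frac` parameter, the Fletcher-Reeves update) is an illustrative assumption, not the patented implementation.

```python
import random

def approx_error_cg(A, b, x0, subset_frac=0.5, iters=30, seed=0):
    """CG-style minimization of ||Ax - b||^2 where the error is
    evaluated on a random subset of rows ("rays")."""
    rng = random.Random(seed)
    n, m = len(x0), len(A)
    x = list(x0)
    d = None          # conjugate search direction
    g_prev = None
    for _ in range(iters):
        rows = rng.sample(range(m), max(1, int(subset_frac * m)))
        # Gradient of 0.5 * sum over the subset of (A_i . x - b_i)^2.
        g = [0.0] * n
        for i in rows:
            r = sum(A[i][j] * x[j] for j in range(n)) - b[i]
            for j in range(n):
                g[j] += r * A[i][j]
        if d is None:
            d = [-gj for gj in g]
        else:
            # Fletcher-Reeves conjugacy coefficient.
            beta = sum(gj * gj for gj in g) / max(sum(pj * pj for pj in g_prev), 1e-30)
            d = [-g[j] + beta * d[j] for j in range(n)]
        # Exact line minimum along d for the subset quadratic.
        Ad = [sum(A[i][j] * d[j] for j in range(n)) for i in rows]
        num = -sum(g[j] * d[j] for j in range(n))
        den = sum(v * v for v in Ad)
        alpha = num / max(den, 1e-30)
        x = [x[j] + alpha * d[j] for j in range(n)]
        g_prev = g
    return x
```

With `subset_frac=1.0` this reduces to exact nonlinear CG on the least-squares problem; smaller fractions trade accuracy of the error estimate for cheaper iterations, which is the tradeoff the patent targets.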

  4. Minimizing convex functions by continuous descent methods

    Directory of Open Access Journals (Sweden)

    Sergiu Aizicovici

    2010-01-01

We study continuous descent methods for minimizing convex functions, defined on general Banach spaces, which are associated with an appropriate complete metric space of vector fields. We show that there exists an everywhere dense open set in this space of vector fields such that each of its elements generates strongly convergent trajectories.
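A continuous descent trajectory x'(t) = -∇f(x(t)) can be followed numerically with an explicit Euler discretization. This is a minimal sketch on R^2 using the steepest-descent vector field, not the general Banach-space setting of the paper; the step size and iteration count are illustrative.

```python
def gradient_flow(grad, x0, dt=0.01, steps=2000):
    """Explicit-Euler discretization of the descent trajectory x'(t) = -grad f(x(t))."""
    x = list(x0)
    for _ in range(steps):
        g = grad(x)
        x = [xi - dt * gi for xi, gi in zip(x, g)]
    return x

# Convex example f(x, y) = (x - 1)^2 + 2*(y + 2)^2, minimized at (1, -2).
grad_f = lambda x: [2.0 * (x[0] - 1.0), 4.0 * (x[1] + 2.0)]
x_min = gradient_flow(grad_f, [5.0, 5.0])
```

For convex f with Lipschitz gradient, a sufficiently small dt keeps the discrete trajectory convergent to a minimizer, mirroring the strong-convergence property the abstract establishes for a dense set of vector fields.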

  5. Minimal residual method stronger than polynomial preconditioning

    Energy Technology Data Exchange (ETDEWEB)

Faber, V.; Joubert, W.; Knill, E. [Los Alamos National Lab., NM (United States)]; and others

    1994-12-31

    Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.

  6. The minimal energetic requirement of sustained awareness after brain injury

    DEFF Research Database (Denmark)

    Stender, Johan; Mortensen, Kristian Nygaard; Thibaut, Aurore

    2016-01-01

... of glucose has been proposed as an indicator of consciousness [2 and 3]. Likewise, FDG-PET may contribute to the clinical diagnosis of disorders of consciousness (DOCs) [4 and 5]. However, current methods are non-quantitative and have important drawbacks deriving from visually guided assessment of relative changes in brain metabolism [4]. We here used FDG-PET to measure resting state brain glucose metabolism in 131 DOC patients to identify objective quantitative metabolic indicators and predictors of awareness. Quantitation of images was performed by normalizing to extracerebral tissue. We show that 42% of normal cortical activity represents the minimal energetic requirement for the presence of conscious awareness. Overall, the cerebral metabolic rate accounted for the current level, or imminent return, of awareness in 94% of the patient population, suggesting a global energetic threshold effect ...

  7. Minimal requirements for quality controls in radiotherapy with external beams

    International Nuclear Information System (INIS)

    1999-01-01

Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not determine strict rules or firm recommendations, it suggests minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy [it]

  8. Methods evaluated to minimize emissions from preplant soil fumigation

    Directory of Open Access Journals (Sweden)

    Suduan Gao

    2008-05-01

Many commodities depend on preplant soil fumigation for pest control to achieve healthy crops and profitable yields. Under California regulations, minimizing emissions is essential to maintain the practical use of soil fumigants, and more stringent regulations are likely in the future. The phase-out of methyl bromide as a broad-spectrum soil fumigant has created formidable challenges. Most alternatives registered today are regulated as volatile organic compounds because of their toxicity and mobile nature. We review research on methods for minimizing emissions from soil fumigation, including the effectiveness of their emission reductions, impacts on pest control and cost. Low-permeability plastic mulches are highly effective but are generally affordable only in high-value cash crops such as strawberry. Crops with low profit margins such as stone-fruit orchards may require lower-cost methods such as water treatment or target-area fumigation.

  9. Secondary waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

The characterization phase of site remediation is an important and costly part of the process. Because toxic solvents and other hazardous materials are used in common analytical methods, characterization is also a source of new waste, including mixed waste. Alternative analytical methods can reduce the volume or form of hazardous waste produced either in the sample preparation step or in the measurement step. The authors are examining alternative methods in the areas of inorganic, radiological, and organic analysis. For determining inorganic constituents, alternative methods were studied for sample introduction into inductively coupled plasma spectrometers. Figures of merit for the alternative methods, as well as their associated waste volumes, were compared with the conventional approaches. In the radiological area, the authors are comparing conventional methods for gross α/β measurements of soil samples to an alternative method that uses high-pressure microwave dissolution. For determination of organic constituents, microwave-assisted extraction was studied for RCRA regulated semivolatile organics in a variety of solid matrices, including spiked samples in blank soil; polynuclear aromatic hydrocarbons in soils, sludges, and sediments; and semivolatile organics in soil. Extraction efficiencies were determined under varying conditions of time, temperature, microwave power, moisture content, and extraction solvent. Solvent usage was cut from the 300 mL used in conventional extraction methods to about 30 mL. Extraction results varied from one matrix to another. In most cases, the microwave-assisted extraction technique was as efficient as the more common Soxhlet or sonication extraction techniques.

  10. Minimalism

    CERN Document Server

    Obendorf, Hartmut

    2009-01-01

    The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.

  11. On the convergence of nonconvex minimization methods for image recovery.

    Science.gov (United States)

    Xiao, Jin; Ng, Michael Kwok-Po; Yang, Yu-Fei

    2015-05-01

Nonconvex nonsmooth regularization methods have been shown to be effective for restoring images with neat edges. Fast alternating minimization schemes have also been proposed and developed to solve the nonconvex nonsmooth minimization problem. The main contribution of this paper is to show the convergence of these alternating minimization schemes, based on the Kurdyka-Łojasiewicz property. In particular, we show that the iterates generated by the alternating minimization scheme converge to a critical point of this nonconvex nonsmooth objective function. We also extend the analysis to a nonconvex nonsmooth regularization model with box constraints, and obtain similar convergence results for the related minimization algorithm. Numerical examples are given to illustrate our convergence analysis.

  12. Balancing related methods for minimal realization of periodic systems

    OpenAIRE

    Varga, A.

    1999-01-01

    We propose balancing related numerically reliable methods to compute minimal realizations of linear periodic systems with time-varying dimensions. The first method belongs to the family of square-root methods with guaranteed enhanced computational accuracy and can be used to compute balanced minimal order realizations. An alternative balancing-free square-root method has the advantage of a potentially better numerical accuracy in case of poorly scaled original systems. The key numerical co...

  13. A convergent overlapping domain decomposition method for total variation minimization

    KAUST Repository

Fornasier, Massimo; Langer, Andreas; Schönlieb, Carola-Bibiane

    2010-01-01

    In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation

  14. Linearly convergent stochastic heavy ball method for minimizing generalization error

    KAUST Repository

    Loizou, Nicolas

    2017-10-30

    In this work we establish the first linear convergence result for the stochastic heavy ball method. The method performs SGD steps with a fixed stepsize, amended by a heavy ball momentum term. In the analysis, we focus on minimizing the expected loss and not on finite-sum minimization, which is typically a much harder problem. While in the analysis we constrain ourselves to quadratic loss, the overall objective is not necessarily strongly convex.
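The update the abstract describes, an SGD step with fixed stepsize amended by a heavy ball momentum term, can be sketched as follows. The quadratic single-sample loss and all parameter values below are illustrative assumptions:

```python
import random

def stochastic_heavy_ball(grad_sample, x0, stepsize, momentum, iters, seed=0):
    """Iterate x_{k+1} = x_k - stepsize * g_k + momentum * (x_k - x_{k-1}),
    where g_k is a stochastic gradient returned by grad_sample."""
    rng = random.Random(seed)
    x, x_prev = list(x0), list(x0)
    for _ in range(iters):
        g = grad_sample(x, rng)
        x_next = [x[j] - stepsize * g[j] + momentum * (x[j] - x_prev[j])
                  for j in range(len(x))]
        x_prev, x = x, x_next
    return x

# One-sample quadratic loss f(x) = (2x - 6)^2, minimized at x = 3; with a single
# data point the "stochastic" gradient is deterministic, which keeps the sketch checkable.
grad_sample = lambda x, rng: [2.0 * 2.0 * (2.0 * x[0] - 6.0)]
x_hb = stochastic_heavy_ball(grad_sample, [0.0], stepsize=0.1, momentum=0.3, iters=200)
```

For this quadratic the error obeys a two-term linear recurrence whose roots lie strictly inside the unit circle, so the iterates converge linearly, the flavor of result the paper proves for the genuinely stochastic setting.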

  15. Subspace Correction Methods for Total Variation and $\\ell_1$-Minimization

    KAUST Repository

    Fornasier, Massimo

    2009-01-01

This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a sequence of orthogonal subspaces. On each subspace an iterative proximity-map algorithm is implemented via oblique thresholding, which is the main new tool introduced in this work. We provide convergence conditions for the algorithm in order to compute minimizers of the target energy. Analogous results are derived for a parallel variant of the algorithm. Applications are presented in domain decomposition methods for degenerate elliptic PDEs arising in total variation minimization and in accelerated sparse recovery algorithms based on ℓ1-minimization. We include numerical examples which show efficient solutions to classical problems in signal and image processing. © 2009 Society for Industrial and Applied Mathematics.
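The proximity map for an ℓ1 term is the classical soft-thresholding operator, and alternating it with a gradient step gives the basic thresholding iteration that the paper's oblique thresholding refines per subspace. A minimal sketch (standard ISTA, not the paper's subspace-correction algorithm; the step size and λ below are illustrative):

```python
def soft_threshold(v, t):
    """Proximity map of t*||x||_1 (componentwise soft thresholding)."""
    return [max(abs(x) - t, 0.0) * (1.0 if x >= 0 else -1.0) for x in v]

def ista(A, y, lam, step, iters):
    """Iterative soft-thresholding for min 0.5*||Ax - y||^2 + lam*||x||_1."""
    n = len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Residual r = Ax - y and gradient g = A^T r of the smooth term.
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(len(A))]
        g = [sum(A[i][j] * r[i] for i in range(len(A))) for j in range(n)]
        # Gradient step followed by the proximity map.
        x = soft_threshold([x[j] - step * g[j] for j in range(n)], step * lam)
    return x
```

With `A` the identity, a single step already lands on the shrinkage of the data, which is the closed-form minimizer of 0.5*||x - y||^2 + λ*||x||_1.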

  16. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao

    2016-12-07

Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.
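The Gauss-Seidel and SOR iterations that MSM generalizes are easy to state concretely: split A = M - N with M the (relaxed) lower triangle and sweep coordinate by coordinate. A minimal sketch of the classical splitting for linear systems, not of the paper's MSM with its Gaussian elimination procedure:

```python
def sor_solve(A, b, omega=1.0, iters=100):
    """Successive Over-Relaxation for Ax = b; omega = 1 gives Gauss-Seidel."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):
            # Use already-updated components of x within the sweep.
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (1.0 - omega) * x[i] + omega * (b[i] - s) / A[i][i]
    return x
```

ω = 1 recovers Gauss-Seidel, while 1 < ω < 2 can accelerate convergence for suitable matrices; both converge for strictly diagonally dominant or symmetric positive definite A.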

  17. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao; Zheng, Wei-Shi; Ghanem, Bernard

    2016-01-01

Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.

  18. Minimal processing - preservation methods of the future: an overview

    International Nuclear Information System (INIS)

    Ohlsson, T.

    1994-01-01

    Minimal-processing technologies are modern techniques that provide sufficient shelf life to foods to allow their distribution, while also meeting the demands of the consumers for convenience and fresh-like quality. Minimal-processing technologies can be applied at various stages of the food distribution chain, in storage, in processing and/or in packaging. Examples of methods will be reviewed, including modified-atmosphere packaging, high-pressure treatment, sous-vide cooking and active packaging

  19. Linearly convergent stochastic heavy ball method for minimizing generalization error

    KAUST Repository

    Loizou, Nicolas; Richtarik, Peter

    2017-01-01

    In this work we establish the first linear convergence result for the stochastic heavy ball method. The method performs SGD steps with a fixed stepsize, amended by a heavy ball momentum term. In the analysis, we focus on minimizing the expected loss

  20. A convergent overlapping domain decomposition method for total variation minimization

    KAUST Repository

    Fornasier, Massimo

    2010-06-22

    In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation constraint. To our knowledge, this is the first successful attempt of addressing such a strategy for the nonlinear, nonadditive, and nonsmooth problem of total variation minimization. We provide several numerical experiments, showing the successful application of the algorithm for the restoration of 1D signals and 2D images in interpolation/inpainting problems, respectively, and in a compressed sensing problem, for recovering piecewise constant medical-type images from partial Fourier ensembles. © 2010 Springer-Verlag.

  1. Minimizers with discontinuous velocities for the electromagnetic variational method

    International Nuclear Information System (INIS)

    De Luca, Jayme

    2010-01-01

The electromagnetic two-body problem has neutral differential delay equations of motion that, for generic boundary data, can have solutions with discontinuous derivatives. If one wants to use these neutral differential delay equations with arbitrary boundary data, solutions with discontinuous derivatives must be expected and allowed. Surprisingly, Wheeler-Feynman electrodynamics has a boundary value variational method for which minimizer trajectories with discontinuous derivatives are also expected, as we show here. The variational method defines continuous trajectories with piecewise defined velocities and accelerations, and electromagnetic fields defined by the Euler-Lagrange equations on trajectory points. Here we use the piecewise defined minimizers with the Liénard-Wiechert formulas to define generalized electromagnetic fields almost everywhere (except on sets of points of zero measure where the advanced/retarded velocities and/or accelerations are discontinuous). Along with this generalization we formulate the generalized absorber hypothesis that the far fields vanish asymptotically almost everywhere and show that localized orbits with far fields vanishing almost everywhere must have discontinuous velocities on sewing chains of breaking points. We give the general solution for localized orbits with vanishing far fields by solving a (linear) neutral differential delay equation for these far fields. We discuss the physics of orbits with discontinuous derivatives, stressing the differences from the variational methods of classical mechanics and the existence of a spinorial four-current associated with the generalized variational electrodynamics.

  2. Subspace Correction Methods for Total Variation and $\\ell_1$-Minimization

    KAUST Repository

Fornasier, Massimo; Schönlieb, Carola-Bibiane

    2009-01-01

    This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a

  3. Minimal Residual Disease Assessment in Lymphoma: Methods and Applications.

    Science.gov (United States)

    Herrera, Alex F; Armand, Philippe

    2017-12-01

Standard methods for disease response assessment in patients with lymphoma, including positron emission tomography and computed tomography scans, are imperfect. In other hematologic malignancies, particularly leukemias, the ability to detect minimal residual disease (MRD) is increasingly influencing treatment paradigms. However, in many subtypes of lymphoma, the application of MRD assessment techniques, like flow cytometry or polymerase chain reaction-based methods, has been challenging because of the absence of readily detected circulating disease or canonical chromosomal translocations. Newer MRD detection methods that use next-generation sequencing have yielded promising results in a number of lymphoma subtypes, fueling the hope that MRD detection may soon be applicable in clinical practice for most patients with lymphoma. MRD assessment can provide real-time information about tumor burden and response to therapy, noninvasive genomic profiling, and monitoring of clonal dynamics, allowing for many possible applications that could significantly affect the care of patients with lymphoma. Further validation of MRD assessment methods, including the incorporation of MRD assessment into clinical trials in patients with lymphoma, will be critical to determine how best to deploy MRD testing in routine practice and whether MRD assessment can ultimately bring us closer to the goal of personalized lymphoma care. In this review article, we describe the methods available for detecting MRD in patients with lymphoma and their relative advantages and disadvantages. We discuss preliminary results supporting the potential applications for MRD testing in the care of patients with lymphoma and strategies for including MRD assessment in lymphoma clinical trials.

  4. Determination method of the minimal inactivating dose of gamma radiation for Salmonella typhimurium

    International Nuclear Information System (INIS)

    Araujo, E.S.; Campos, H. de; Silva, D.M.

    1979-01-01

A method for the determination of the minimal inactivating dose (MID) of gamma radiation for Salmonella typhimurium is presented, providing a more efficient way to improve irradiated vaccines. The MID found for S. typhimurium 6.616 by the binomial test was 0.55 MR. The method allows a definite value of the MID to be obtained and requires less material, labor, and time than the usual procedure [pt]

  5. Optimized Runge-Kutta methods with minimal dispersion and dissipation for problems arising from computational acoustics

    International Nuclear Information System (INIS)

    Tselios, Kostas; Simos, T.E.

    2007-01-01

In this Letter a new explicit fourth-order seven-stage Runge-Kutta method is developed, combining minimal dispersion and dissipation error with maximal accuracy and stability limit along the imaginary axis. This method was produced from a general function constructed to satisfy all the above requirements, from which all the existing fourth-order six-stage RK methods can also be produced. The new method is more efficient than the other optimized methods for acoustic computations.

  6. Optimal design method to minimize users' thinking mapping load in human-machine interactions.

    Science.gov (United States)

    Huang, Yanqun; Li, Xu; Zhang, Jie

    2015-01-01

The discrepancy between human cognition and machine requirements/behaviors usually results in serious mental thinking-mapping loads, or even disasters, in product operation. It is important to help people avoid human-machine interaction confusions and difficulties in today's society, where mental work dominates. The goal is to improve the usability of a product and minimize the user's thinking-mapping and interpreting load in human-machine interactions. An optimal human-machine interface design method is introduced, based on minimizing the mental load in the thinking-mapping process between users' intentions and the affordances of product interface states. By analyzing the users' thinking-mapping problem, an operating action model is constructed. According to human natural instincts and acquired knowledge, an expected ideal design with minimized thinking loads is first uniquely determined. Then, creative alternatives, in terms of the way humans obtain operational information, are provided as digital interface-state datasets. Finally, using the cluster analysis method, an optimum solution is picked out from the alternatives by calculating the distances between two datasets. Considering multiple factors to minimize users' thinking-mapping loads, a solution nearest to the ideal value is found in a human-car interaction design case. The clustering results show the method's effectiveness in finding an optimum solution to mental-load minimization problems in human-machine interaction design.

  7. OCOPTR, Minimization of Nonlinear Function, Variable Metric Method, Derivative Calculation. DRVOCR, Minimization of Nonlinear Function, Variable Metric Method, Derivative Calculation

    International Nuclear Information System (INIS)

    Nazareth, J. L.

    1979-01-01

1 - Description of problem or function: OCOPTR and DRVOCR are computer programs designed to find minima of nonlinear differentiable functions f: R^n → R with n-dimensional domains. OCOPTR requires that the user only provide function values (i.e., it is a derivative-free routine). DRVOCR requires the user to supply both function and gradient information. 2 - Method of solution: OCOPTR and DRVOCR use the variable metric (or quasi-Newton) method of Davidon (1975). For OCOPTR, the derivatives are estimated by finite differences along a suitable set of linearly independent directions. For DRVOCR, the derivatives are user-supplied. Some features of the codes are the storage of the approximation to the inverse Hessian matrix in lower trapezoidal factored form and the use of an optimally-conditioned updating method. Linear equality constraints are permitted subject to the initial Hessian factor being chosen correctly. 3 - Restrictions on the complexity of the problem: The functions to which the routine is applied are assumed to be differentiable. The routine also requires n^2/2 + O(n) storage locations, where n is the problem dimension.
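The ingredients this record describes, finite-difference derivative estimates feeding a variable-metric (quasi-Newton) update, can be sketched as follows. This uses a plain BFGS inverse-Hessian update with a backtracking line search rather than Davidon's optimally conditioned update and factored storage, so it illustrates the method family, not OCOPTR itself:

```python
def fd_grad(f, x, h=1e-6):
    """Forward-difference gradient estimate (derivative-free use, as in OCOPTR)."""
    fx = f(x)
    g = []
    for j in range(len(x)):
        xp = list(x)
        xp[j] += h
        g.append((f(xp) - fx) / h)
    return g

def bfgs_minimize(f, x0, iters=50):
    """Variable-metric sketch: BFGS update of an inverse-Hessian approximation H,
    finite-difference gradients, and an Armijo backtracking line search."""
    n = len(x0)
    x = list(x0)
    H = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # H0 = I
    g = fd_grad(f, x)
    for _ in range(iters):
        d = [-sum(H[i][j] * g[j] for j in range(n)) for i in range(n)]
        gd = sum(g[i] * d[i] for i in range(n))
        t, fx = 1.0, f(x)
        while f([x[i] + t * d[i] for i in range(n)]) > fx + 1e-4 * t * gd:
            t *= 0.5
            if t < 1e-12:
                break
        x_new = [x[i] + t * d[i] for i in range(n)]
        g_new = fd_grad(f, x_new)
        s = [x_new[i] - x[i] for i in range(n)]
        yv = [g_new[i] - g[i] for i in range(n)]
        ys = sum(yv[i] * s[i] for i in range(n))
        if ys > 1e-12:  # curvature condition; skip the update otherwise
            rho = 1.0 / ys
            # H <- (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T
            E = [[(1.0 if i == j else 0.0) - rho * s[i] * yv[j] for j in range(n)]
                 for i in range(n)]
            HE = [[sum(E[i][k] * H[k][j] for k in range(n)) for j in range(n)]
                  for i in range(n)]
            H = [[sum(HE[i][k] * E[j][k] for k in range(n)) + rho * s[i] * s[j]
                  for j in range(n)] for i in range(n)]
        x, g = x_new, g_new
    return x
```

The finite-difference step h limits the attainable accuracy (gradient error of order h), which is why derivative-supplied variants such as DRVOCR exist alongside the derivative-free routine.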

  8. An alternating minimization method for blind deconvolution from Poisson data

    International Nuclear Information System (INIS)

    Prato, Marco; La Camera, Andrea; Bonettini, Silvia

    2014-01-01

Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system have to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters.

  9. An iterative method for determination of a minimal eigenvalue

    DEFF Research Database (Denmark)

    Kristiansen, G.K.

    1968-01-01

    Kristiansen (1963) has discussed the convergence of a group of iterative methods (denoted the Equipoise methods) for the solution of reactor criticality problems. The main result was that even though the methods are said to work satisfactorily in all practical cases, examples of divergence can be...

  10. Thermodynamic optimization of ground heat exchangers with single U-tube by entropy generation minimization method

    International Nuclear Information System (INIS)

    Li Min; Lai, Alvin C.K.

    2013-01-01

Highlights: ► A second-law-based analysis is performed for single U-tube ground heat exchangers. ► Two expressions for the optimal length and flow velocity are developed for GHEs. ► Empirical velocities of GHEs are large compared to thermodynamic optimum values. - Abstract: This paper investigates the thermodynamic performance of borehole ground heat exchangers with a single U-tube by the entropy generation minimization method, which requires information on heat transfer and fluid mechanics in addition to thermodynamic analysis. This study first derives an expression for the dimensionless entropy generation number, a function of five dimensionless variables: the Reynolds number, the dimensionless borehole length, a scale factor of pressures, and two duty parameters of ground heat exchangers. The derivation combines a heat transfer model and a hydraulics model for borehole ground heat exchangers with the first and second laws of thermodynamics. Next, the entropy generation number is minimized to produce two analytical expressions for the optimal length and the optimal flow velocity of ground heat exchangers. Then, this paper discusses and analyzes the implications and applications of these optimization formulas with two case studies. An important finding from the case studies is that widely used empirical velocities of the circulating fluid are too large for ground-coupled heat pump systems to operate in a thermodynamically optimal way. This paper demonstrates that thermodynamically optimal parameters of ground heat exchangers can probably be determined using the entropy generation minimization method.
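The structure of entropy generation minimization, a heat-transfer irreversibility that falls with flow velocity competing with a friction irreversibility that rises with it, can be illustrated with a toy model. The simple form a/v + b·v² and the coefficients below are assumptions for illustration only; the paper's actual entropy generation number depends on five dimensionless groups:

```python
def entropy_gen(v, a=1.0, b=0.05):
    """Toy entropy-generation rate: heat-transfer term a/v (falls with flow
    velocity) plus fluid-friction term b*v**2 (rises with it)."""
    return a / v + b * v ** 2

def golden_section_min(f, lo, hi, tol=1e-9):
    """Golden-section search for the minimizer of a unimodal function on [lo, hi]."""
    phi = (5 ** 0.5 - 1) / 2
    c, d = hi - phi * (hi - lo), lo + phi * (hi - lo)
    while hi - lo > tol:
        if f(c) < f(d):
            hi, d = d, c
            c = hi - phi * (hi - lo)
        else:
            lo, c = c, d
            d = lo + phi * (hi - lo)
    return 0.5 * (lo + hi)

v_opt = golden_section_min(entropy_gen, 0.1, 10.0)  # analytic optimum: (a/(2b))**(1/3)
```

Setting dS/dv = 0 gives v* = (a/(2b))^(1/3), here 10^(1/3) ≈ 2.154, which the numerical search reproduces; operating above this velocity, as the paper finds common empirical velocities do, increases entropy generation through the friction term.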

  11. Primal Interior Point Method for Minimization of Generalized Minimax Functions

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2010-01-01

Vol. 46, No. 4 (2010), pp. 697-721, ISSN 0023-5954 R&D Projects: GA ČR GA201/09/1957 Institutional research plan: CEZ:AV0Z10300504 Keywords: unconstrained optimization * large-scale optimization * nonsmooth optimization * generalized minimax optimization * interior-point methods * modified Newton methods * variable metric methods * global convergence * computational experiments Subject RIV: BA - General Mathematics Impact factor: 0.461, year: 2010 http://dml.cz/handle/10338.dmlcz/140779

  12. Comparison of 3 Minimally Invasive Methods for Distal Tibia Fractures.

    Science.gov (United States)

    Fang, Jun-Hao; Wu, Yao-Sen; Guo, Xiao-Shan; Sun, Liao-Jun

    2016-07-01

    This study compared the results of external fixation combined with limited open reduction and internal fixation (EF + LORIF), minimally invasive percutaneous plate osteosynthesis (MIPPO), and intramedullary nailing (IMN) for distal tibia fractures. A total of 84 patients with distal tibia shaft fractures were randomized to operative stabilization using EF + LORIF (28 cases), MIPPO (28 cases), or IMN (28 cases). The 3 groups were comparable with respect to patient demographics. Data were collected on operative time and radiation time, union time, complications, time of recovery to work, secondary operations, and joint function as measured by the American Orthopaedic Foot and Ankle Society (AOFAS) score. There was no significant difference in time to union, incidence of union status, time of recovery to work, or AOFAS scores among the 3 groups (P>.05). Mean operative time and radiation time in the MIPPO group were longer than those in the IMN or EF + LORIF groups (P<.05). Knee pain occurred frequently after IMN (32.1%), and irritation symptoms were encountered more frequently after MIPPO (46.4%). Although EF + LORIF was associated with fewer secondary procedures vs MIPPO or IMN, it was associated with more pin-tract infections (14.3%). Findings indicated that EF + LORIF, MIPPO, and IMN all achieved similar good functional results. However, EF + LORIF had some advantages over MIPPO and IMN in reducing operative and radiation times, postoperative complications, and reoperation rate. [Orthopedics. 2016; 39(4):e627-e633.]. Copyright 2016, SLACK Incorporated.

  13. Entropy resistance minimization: An alternative method for heat exchanger analyses

    International Nuclear Information System (INIS)

    Cheng, XueTao

    2013-01-01

    In this paper, the concept of entropy resistance is proposed based on entropy generation analyses of heat transfer processes. It is shown that smaller entropy resistance leads to a larger heat transfer rate at a fixed thermodynamic force difference, and to a smaller thermodynamic force difference at a fixed heat transfer rate. For the discussed two-stream heat exchangers in which the heat transfer rates are not given, and for the three-stream heat exchanger with prescribed heat capacity flow rates and inlet temperatures of the streams, smaller entropy resistance leads to a larger heat transfer rate. For the two-stream heat exchangers with a fixed heat transfer rate, smaller entropy resistance leads to larger effectiveness. Furthermore, it is shown that smaller values of the entropy generation numbers and of the modified entropy generation number do not always correspond to better performance of the discussed heat exchangers. - Highlights: • The concept of entropy resistance is defined for heat exchangers. • Concepts based on entropy generation are used to analyze heat exchangers. • Smaller entropy resistance leads to better performance of heat exchangers. • The applicability of entropy generation minimization is conditional

  14. Guidelines on the facilities required for minor surgical procedures and minimal access interventions.

    LENUS (Irish Health Repository)

    Humphreys, H

    2012-02-01

    There have been many changes in healthcare provision in recent years, including the delivery of some surgical services in primary care or in day surgery centres, which were previously provided by acute hospitals. Developments in the fields of interventional radiology and cardiology have further expanded the range and complexity of procedures undertaken in these settings. In the face of these changes there is a need to define from an infection prevention and control perspective the basic physical requirements for facilities in which such surgical procedures may be carried out. Under the auspices of the Healthcare Infection Society, we have developed the following recommendations for those designing new facilities or upgrading existing facilities. These draw upon best practice, available evidence, other guidelines where appropriate, and expert consensus to provide sensible and feasible advice. An attempt is also made to define minimal access interventions and minor surgical procedures. For minimal access interventions, including interventional radiology, new facilities should be mechanically ventilated to achieve 15 air changes per hour but natural ventilation is satisfactory for minor procedures. All procedures should involve a checklist and operators should be appropriately trained. There is also a need for prospective surveillance to accurately determine the post-procedure infection rate. Finally, there is a requirement for appropriate applied research to develop the evidence base required to support subsequent iterations of this guidance.

  15. Minimizing Dispersion in FDTD Methods with CFL Limit Extension

    Science.gov (United States)

    Sun, Chen

    The CFL extension in FDTD methods is receiving considerable attention as a way to reduce computational effort and simulation time. One of the major issues with CFL extension methods is increased dispersion. We formulate a decomposition of the FDTD equations to study the behaviour of the dispersion, and construct a compensation scheme to reduce the dispersion under CFL extension. We further study CFL extension in an FDTD subgridding case, where we improve the accuracy by acting only on the FDTD equations of the fine grid. Numerical results confirm the efficiency of the proposed method for minimising dispersion.
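For intuition on why changing the time step changes dispersion, the textbook numerical dispersion relation of the 1-D Yee scheme, sin(ωΔt/2) = S·sin(kΔx/2) with Courant number S = cΔt/Δx, can be evaluated directly. This is a standard illustration, not the compensation scheme proposed in the thesis.

```python
import math

def phase_velocity_error(points_per_wavelength, courant):
    """Relative phase-velocity error of the 1-D Yee FDTD scheme.

    Uses the textbook dispersion relation sin(w*dt/2) = S*sin(k*dx/2),
    with Courant number S = c*dt/dx.
    """
    c = 1.0
    dx = 1.0
    dt = courant * dx / c
    k = 2.0 * math.pi / (points_per_wavelength * dx)   # wavenumber
    w = (2.0 / dt) * math.asin(courant * math.sin(k * dx / 2.0))
    v_numerical = w / k                                # numerical phase velocity
    return abs(v_numerical - c) / c

# At the 1-D CFL limit (S = 1, the "magic time step") the scheme is
# dispersion-free; moving S away from it introduces dispersion, which is
# why schemes that alter the time step must compensate for it.
for s in (1.0, 0.9, 0.5):
    print(f"S = {s:.1f}: phase-velocity error = {phase_velocity_error(20, s):.2e}")
```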

  16. THE MINIMALLY INVASIVE METHODS OF TREATMENT OF ANTERIOR URETHRA STRICTURES

    Directory of Open Access Journals (Sweden)

    V. L. Medvedev

    2017-01-01

    Full Text Available This review evaluates the literature on different methods of treating anterior urethral strictures: internal optical urethrotomy (OIU), laser urethrotomy, urethral stenting, urethral dilatation, OIU combined with self-dilatation, and OIU combined with injection of chemical agents. The expedience and reasonableness of the chosen methods and techniques are assessed, together with a statistical evaluation of long-term postoperative data and an analysis of low-efficiency studies. The review was compiled using the Medline, PubMed and Embase databases.

  17. German Risk Study - influences of data base, minimal requirements and system modifications

    International Nuclear Information System (INIS)

    Hoertner, H.; Linden, J. von

    1987-01-01

    The reliability analyses for Phase B of the German Risk Study take into account an improved reliability data base, best-estimate minimal requirements for the relevant system functions, and the design modifications carried out after completion of Phase A. These points and their influence on the frequency of core melt accidents are discussed, with emphasis on the reliability data. Although the detailed evaluation of operating experience for the estimation of the reliability data does result in an increase of contributions, the best-estimate minimal requirements and the system modifications carried out for the reference plant reduce the core melt frequency due to those initiating events which were dominant in Phase A of the German Risk Study. The detailed investigation of additional initiating events which had already been recognized as important during Phase A, such as the main steam line break and the steam generator tube rupture, leads to additional contributions to the frequency of core melt accidents. Altogether, the evaluated contributions to the frequency of core melt are lower than the values assessed in Phase A. (orig./HP)

  18. Reduction of very large reaction mechanisms using methods based on simulation error minimization

    Energy Technology Data Exchange (ETDEWEB)

    Nagy, Tibor; Turanyi, Tamas [Institute of Chemistry, Eoetvoes University (ELTE), P.O. Box 32, H-1518 Budapest (Hungary)

    2009-02-15

    A new species reduction method called the Simulation Error Minimization Connectivity Method (SEM-CM) was developed. According to the SEM-CM algorithm, a mechanism building procedure is started from the important species. Strongly connected sets of species, identified on the basis of the normalized Jacobian, are added and several consistent mechanisms are produced. The combustion model is simulated with each of these mechanisms and the mechanism causing the smallest error (i.e. deviation from the model that uses the full mechanism), considering the important species only, is selected. Then, in several steps other strongly connected sets of species are added, the size of the mechanism is gradually increased and the procedure is terminated when the error becomes smaller than the required threshold. A new method for the elimination of redundant reactions is also presented, which is called the Principal Component Analysis of Matrix F with Simulation Error Minimization (SEM-PCAF). According to this method, several reduced mechanisms are produced by using various PCAF thresholds. The reduced mechanism having the least CPU time requirement among the ones having almost the smallest error is selected. Application of SEM-CM and SEM-PCAF together provides a very efficient way to eliminate redundant species and reactions from large mechanisms. The suggested approach was tested on a mechanism containing 6874 irreversible reactions of 345 species that describes methane partial oxidation to high conversion. The aim is to accurately reproduce the concentration-time profiles of 12 major species with less than 5% error at the conditions of an industrial application. The reduced mechanism consists of 246 reactions of 47 species and its simulation is 116 times faster than using the full mechanism. The SEM-CM was found to be more effective than the classic Connectivity Method, and also than the DRG, two-stage DRG, DRGASA, basic DRGEP and extended DRGEP methods. (author)

  19. A method of posterior fossa dural incision to minimize hemorrhage from the occipital sinus: the "mosquito" method.

    Science.gov (United States)

    Lee, Hee Chang; Lee, Ji Yeoun; Ryu, Seul Ki; Lim, Jang Mi; Chong, Sangjoon; Phi, Ji Hoon; Kim, Seung-Ki; Wang, Kyu-Chang

    2016-12-01

    The posterior fossa dural opening requires ligation of the occipital sinus to gain successful exposure. However, a prominent occipital sinus may function as the main drainage route and harbor the risk of unpredictable massive hemorrhage during the dural opening. We introduce a safe method of posterior fossa dural incision that minimizes hemorrhage from the occipital sinus using four curved hemostat clamps. For the dural incision at the midline part of the posterior cranial fossa, we used four curved hemostat clamps to occlude the prominent occipital sinus: one pair of clamps at the proximal part and the other pair at the distal part. The dural incision was made between the two pairs of curved hemostat clamps. Clamping the sinus allows observation of possible brain swelling after occlusion of the occipital sinus and minimizes hemorrhage during incision of the midline dura of the posterior fossa. This method provides an easy and safe incision of the midline dura, minimizing hemorrhage in selected cases with a prominent occipital sinus.

  20. Systems and methods for mirror mounting with minimized distortion

    Science.gov (United States)

    Antonille, Scott R. (Inventor); Wallace, Thomas E. (Inventor); Content, David A. (Inventor); Wake, Shane W. (Inventor)

    2012-01-01

    A method for mounting a mirror for use in a telescope includes attaching the mirror to a plurality of adjustable mounts; determining a distortion in the mirror caused by the plurality of adjustable mounts and, if the distortion is determined to be above a predetermined level, adjusting one or more of the adjustable mounts and determining the distortion in the mirror caused by the adjustable mounts; and, in the event the determined distortion is at or below the predetermined level, rigidizing the adjustable mounts.

  1. MINIMIZE ENERGY AND COSTS REQUIREMENT OF WEEDING AND FERTILIZING PROCESS FOR FIBER CROPS IN SMALL FARMS

    Directory of Open Access Journals (Sweden)

    Tarek FOUDA

    2015-06-01

    Full Text Available The experimental work was carried out during the 2014 agricultural summer season at the experimental farm of Gemmiza Research Station, Gharbiya governorate, to minimize energy and costs in the weeding and fertilizing processes for fiber crops (kenaf and roselle) in small farms. The performance of the manufactured multipurpose unit was studied as a function of machine forward speed (2.2, 2.8, 3.4 and 4 km/h) and fertilizing rate (30, 45 and 60 kg N fed-1), at a constant average soil moisture content of 20% (d.b.). Performance of the manufactured machine was evaluated in terms of fuel consumption, power and energy requirements, effective field capacity, theoretical field capacity, field efficiency, and operational costs. The experimental results revealed that the manufactured machine decreased energy and increased effective field capacity and efficiency under the following conditions: machine forward speed of 2.2 km/h and average moisture content of 20%.

  2. 40 CFR 125.94 - How will requirements reflecting best technology available for minimizing adverse environmental...

    Science.gov (United States)

    2010-07-01

    ... technology available for minimizing adverse environmental impact be established for my Phase II existing... technology available to minimize adverse environmental impact for your facility in accordance with paragraphs... technology available for minimizing adverse environmental impact. This determination must be based on...

  3. New method for minimizing regular functions with constraints on parameter region

    International Nuclear Information System (INIS)

    Kurbatov, V.S.; Silin, I.N.

    1993-01-01

    A new method of function minimization is developed and its main features are considered. The method can minimize regular functions of arbitrary structure. For χ²-like functions, simplified second derivatives can be used, with a check of their correctness. Constraints of arbitrary structure can be imposed, and means for fast movement along multidimensional valleys are provided. The method was tested on real data for the Kπ2 decay from the experiment on rare K⁻ decays. 6 refs
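The "simplified second derivatives" idea for χ²-like functions is commonly realized as the Gauss-Newton approximation: for F(p) = Σᵢ rᵢ(p)², the Hessian is approximated by 2·JᵀJ, dropping the terms containing second derivatives of the residuals. A minimal one-parameter sketch follows; the exponential model and data points are hypothetical illustrations, not the Kπ2 data.

```python
import math

# Toy chi^2 fit of y ~ exp(-t/tau) with the Gauss-Newton (J^T J) Hessian.
data = [(0.0, 1.00), (1.0, 0.60), (2.0, 0.37), (3.0, 0.22)]

def residuals(tau):
    return [y - math.exp(-t / tau) for t, y in data]

def jacobian(tau):
    # d r_i / d tau = -(t / tau^2) * exp(-t / tau)
    return [-(t / tau ** 2) * math.exp(-t / tau) for t, _ in data]

tau = 3.0                                    # starting guess
for _ in range(50):
    r = residuals(tau)
    j = jacobian(tau)
    grad = 2.0 * sum(ji * ri for ji, ri in zip(j, r))
    hess = 2.0 * sum(ji * ji for ji in j)    # simplified second derivative
    tau -= grad / hess                       # Newton step with the J^T J Hessian
print(f"fitted tau = {tau:.3f}")
```

The approximation is exact at a zero-residual minimum, which is why a correctness check (as the abstract mentions) is needed when residuals stay large.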

  4. Canonical Primal-Dual Method for Solving Non-convex Minimization Problems

    OpenAIRE

    Wu, Changzhi; Li, Chaojie; Gao, David Yang

    2012-01-01

    A new primal-dual algorithm is presented for solving a class of non-convex minimization problems. This algorithm is based on canonical duality theory, such that the original non-convex minimization problem is first reformulated as a convex-concave saddle point optimization problem, which is then solved by a quadratically perturbed primal-dual method. It is proved that the popular SDP method is indeed a special case of the canonical duality theory. Numerical examples are illustrated. Comparing...

  5. A detailed survey of numerical methods for unconstrained minimization. Pt. 1

    International Nuclear Information System (INIS)

    Mika, K.; Chaves, T.

    1980-01-01

    A detailed description of numerical methods for unconstrained minimization is presented. This first part surveys in particular conjugate direction and gradient methods, whereas variable metric methods will be the subject of the second part. Among the results of special interest we quote the following. The conjugate direction methods of Powell, Zangwill and Sutti can be best interpreted if the Smith approach is adopted. The conditions for quadratic termination of Powell's first procedure are analyzed. Numerical results based on nonlinear least squares problems are presented for the following conjugate direction codes: VA04AD from the Harwell Subroutine Library and ZXPOW from IMSL, both implementations of Powell's second procedure; DFMND from IBM-SILMATH (Zangwill's method); and Brent's algorithm PRAXIS. VA04AD turns out to be superior in all cases; PRAXIS improves for high-dimensional problems. All codes clearly exhibit superlinear convergence. Akaike's result for the method of steepest descent is derived directly from a set of nonlinear recurrence relations. Numerical results obtained with the highly ill-conditioned Hilbert function confirm the theoretical predictions. Several properties of the conjugate gradient method are presented, and a new derivation of the equivalence of steepest descent PARTAN and the CG method is given. A comparison of numerical results from the CG codes VA08AD (Fletcher-Reeves), DFMCG (the SSP version of the Fletcher-Reeves algorithm) and VA14AD (Powell's implementation of the Polak-Ribiere formula) reveals that VA14AD is clearly superior in all cases, but the convergence rate of these codes is only weakly superlinear, such that high-accuracy solutions require extremely large numbers of function calls. (orig.)
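The quadratic-termination property discussed in the survey can be sketched directly: with exact line searches on a quadratic f(x) = ½xᵀAx − bᵀx, conjugate gradients reach the minimizer in at most n steps. An illustrative implementation with the Fletcher-Reeves update (the matrix and right-hand side are arbitrary example values):

```python
def cg_minimize(A, b, x):
    """Conjugate gradients for the quadratic f(x) = 0.5 x^T A x - b^T x."""
    n = len(b)
    # residual r = b - A x is the negative gradient of f
    r = [b[i] - sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
    d = r[:]
    for _ in range(n):                       # at most n steps on a quadratic
        rr = sum(ri * ri for ri in r)
        if rr < 1e-14:
            break
        Ad = [sum(A[i][j] * d[j] for j in range(n)) for i in range(n)]
        alpha = rr / sum(d[i] * Ad[i] for i in range(n))   # exact line search
        x = [x[i] + alpha * d[i] for i in range(n)]
        r = [r[i] - alpha * Ad[i] for i in range(n)]
        beta = sum(ri * ri for ri in r) / rr               # Fletcher-Reeves
        d = [r[i] + beta * d[i] for i in range(n)]
    return x

A = [[4.0, 1.0], [1.0, 3.0]]   # symmetric positive definite
b = [1.0, 2.0]
x = cg_minimize(A, b, [0.0, 0.0])
print(x)                       # minimizer solves A x = b
```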

  6. Minimizing the Free Energy: A Computer Method for Teaching Chemical Equilibrium Concepts.

    Science.gov (United States)

    Heald, Emerson F.

    1978-01-01

    Presents a computer method for teaching chemical equilibrium concepts using material balance conditions and the minimization of the free energy. Method for the calculation of chemical equilibrium, the computer program used to solve equilibrium problems and applications of the method are also included. (HM)
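The approach can be sketched on the simplest possible equilibrium, the isomerization A ⇌ B in an ideal solution, where a single mole fraction x of B enforces the material balance and the Gibbs free energy is minimized directly. The standard free energies below are hypothetical illustrations, not values from the article's program.

```python
import math

R = 8.314                 # J/(mol K)
T = 298.15                # K
g_A, g_B = 0.0, -2000.0   # standard molar free energies (hypothetical)

def gibbs(x):
    """G(x) for A <=> B in an ideal solution; x = mole fraction of B."""
    mix = R * T * (x * math.log(x) + (1 - x) * math.log(1 - x))
    return (1 - x) * g_A + x * g_B + mix

# G is convex in x, so a ternary search on [0, 1] finds the minimum.
lo, hi = 1e-9, 1.0 - 1e-9
for _ in range(200):
    m1 = lo + (hi - lo) / 3
    m2 = hi - (hi - lo) / 3
    if gibbs(m1) < gibbs(m2):
        hi = m2
    else:
        lo = m1
x_eq = (lo + hi) / 2

# Cross-check against the analytic result x/(1-x) = K = exp(-dG0/(R*T)).
K = math.exp(-(g_B - g_A) / (R * T))
print(f"x_eq = {x_eq:.4f}, analytic = {K / (1 + K):.4f}")
```

The pedagogical point survives even in this one-variable case: the equilibrium constant never appears in the minimization itself; it falls out of the free-energy minimum.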

  7. Option Pricing under Risk-Minimization Criterion in an Incomplete Market with the Finite Difference Method

    Directory of Open Access Journals (Sweden)

    Xinfeng Ruan

    2013-01-01

    Full Text Available We study option pricing under a risk-minimization criterion in an incomplete market where the dynamics of the risky underlying asset are governed by a jump diffusion equation with stochastic volatility. We obtain the Radon-Nikodym derivative for the minimal martingale measure and a partial integro-differential equation (PIDE) for the European option. The finite difference method is employed to compute the European option value from the PIDE.
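The finite-difference machinery can be illustrated on a drastically simplified stand-in: an explicit scheme for a European put under plain Black-Scholes dynamics (no jumps, no stochastic volatility), marching the pricing PDE backwards in time on a price grid. All parameters are illustrative, and this is not the paper's PIDE solver.

```python
import math

def european_put_fd(s0, strike, r, sigma, t, n_s=200, n_t=2000):
    """Explicit finite-difference price of a European put (Black-Scholes)."""
    s_max = 4 * strike
    ds = s_max / n_s
    dt = t / n_t                              # small enough for stability here
    v = [max(strike - i * ds, 0.0) for i in range(n_s + 1)]   # terminal payoff
    for step in range(n_t):                   # march backwards in time
        tau = (step + 1) * dt                 # time to maturity after this step
        new = v[:]
        for i in range(1, n_s):
            s = i * ds
            delta = (v[i + 1] - v[i - 1]) / (2 * ds)
            gamma = (v[i + 1] - 2 * v[i] + v[i - 1]) / ds ** 2
            new[i] = v[i] + dt * (0.5 * sigma ** 2 * s ** 2 * gamma
                                  + r * s * delta - r * v[i])
        new[0] = strike * math.exp(-r * tau)  # put value at S = 0
        new[n_s] = 0.0                        # put worthless for very large S
        v = new
    i = int(s0 / ds)
    w = s0 / ds - i
    return (1 - w) * v[i] + w * v[i + 1]      # linear interpolation at s0

price = european_put_fd(s0=100.0, strike=100.0, r=0.05, sigma=0.2, t=1.0)
print(f"put price ~ {price:.2f}")
```

The paper's PIDE adds an integral (jump) term to the right-hand side of each time step; the backward-marching structure stays the same.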

  8. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

    Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or hydraulic concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm. This
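The circle-sampling idea can be sketched with tiny stand-ins: two conditional fields are mixed with weights (cos t, sin t), points on the unit circle, the expensive forward model is run at only n equally spaced angles, and the simulated values (not the objective) are interpolated around the circle to approximate the response at any other angle almost for free. The fields, observations, and identity "forward model" below are all hypothetical.

```python
import math

field_a = [0.8, -0.3, 1.2, 0.5]
field_b = [-0.2, 0.9, 0.4, -1.1]
observed = [0.5, 0.2, 1.0, -0.1]       # target values at observation points

def forward(field):
    """Stand-in forward model: identity at the observation points."""
    return list(field)

def mixture(t):
    return [math.cos(t) * a + math.sin(t) * b
            for a, b in zip(field_a, field_b)]

n = 8                                   # expensive model runs: only these n
angles = [2 * math.pi * i / n for i in range(n)]
solutions = [forward(mixture(t)) for t in angles]

def interpolated_solution(t):
    """Linear interpolation of the simulated values around the circle."""
    pos = (t % (2 * math.pi)) / (2 * math.pi) * n
    i = int(pos) % n
    w = pos - i
    nxt = solutions[(i + 1) % n]
    return [(1 - w) * u + w * v for u, v in zip(solutions[i], nxt)]

def misfit(sim):
    return sum((s - o) ** 2 for s, o in zip(sim, observed))

# Search a fine grid using interpolated solutions, then pay for one real
# model run only at the winning angle.
fine = [2 * math.pi * i / 1000 for i in range(1000)]
t_best = min(fine, key=lambda t: misfit(interpolated_solution(t)))
print(f"best angle ~ {t_best:.3f} rad, "
      f"misfit {misfit(forward(mixture(t_best))):.4f}")
```

Interpolating the solutions rather than the objective matters: the misfit of an interpolated solution can dip below every sampled misfit, which evaluating and interpolating the objective values alone cannot capture.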

  9. A Modified Limited-Memory BNS Method for Unconstrained Minimization Based on the Conjugate Directions Idea

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    2015-01-01

    Roč. 30, č. 3 (2015), s. 616-633 ISSN 1055-6788 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : unconstrained minimization * variable metric methods * limited-memory methods * the BFGS update * conjugate directions * numerical results Subject RIV: BA - General Mathematics Impact factor: 0.841, year: 2015

  10. An optimization based method for line planning to minimize travel time

    DEFF Research Database (Denmark)

    Bull, Simon Henry; Lusby, Richard Martin; Larsen, Jesper

    2015-01-01

    The line planning problem is to select a number of lines from a potential pool which provides sufficient passenger capacity and meets operational requirements, with some objective measure of solution line quality. We model the problem of minimizing the average passenger system time, including...

  11. An applied optimization based method for line planning to minimize travel time

    DEFF Research Database (Denmark)

    Bull, Simon Henry; Rezanova, Natalia Jurjevna; Lusby, Richard Martin

    The line planning problem in rail is to select a number of lines from a potential pool which provides sufficient passenger capacity and meets operational requirements, with some objective measure of solution line quality. We model the problem of minimizing the average passenger system time, including...

  12. Projected Gauss-Seidel subspace minimization method for interactive rigid body dynamics

    DEFF Research Database (Denmark)

    Silcowitz-Hansen, Morten; Abel, Sarah Maria Niebe; Erleben, Kenny

    2010-01-01

    artifacts such as viscous or damped contact response. In this paper, we present a new approach to contact force determination. We formulate the contact force problem as a nonlinear complementarity problem, and discretize the problem to derive the Projected Gauss–Seidel method. We combine the Projected Gauss–Seidel method with a subspace minimization method. Our new method shows improved qualities and superior convergence properties for specific configurations.
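The Projected Gauss-Seidel iteration is commonly stated for a linear complementarity problem w = Az + b, z ≥ 0, w ≥ 0, zᵀw = 0, the standard formulation of frictionless contact forces: each sweep performs a Gauss-Seidel update per component and then projects it onto the non-negative orthant. The matrix and vector below are small illustrative values, not a rigid-body setup.

```python
def projected_gauss_seidel(A, b, iterations=100):
    """PGS for the LCP: find z >= 0 with w = A z + b >= 0 and z^T w = 0."""
    n = len(b)
    z = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            # Gauss-Seidel residual for row i (excluding the diagonal term),
            # then projection onto the constraint z_i >= 0.
            r = b[i] + sum(A[i][j] * z[j] for j in range(n)) - A[i][i] * z[i]
            z[i] = max(0.0, -r / A[i][i])
    return z

A = [[2.0, 1.0], [1.0, 2.0]]     # symmetric positive definite
b = [-1.0, 1.0]
z = projected_gauss_seidel(A, b)
w = [b[i] + sum(A[i][j] * z[j] for j in range(2)) for i in range(2)]
print(f"z = {z}, w = {w}")       # complementarity: z_i * w_i = 0
```

The subspace minimization phase of the paper then refines the active set that PGS identifies; the sketch above covers only the PGS half.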

  13. Anatomic double-bundle anterior cruciate ligament reconstruction using hamstring tendons with minimally required initial tension.

    Science.gov (United States)

    Mae, Tatsuo; Shino, Konsei; Matsumoto, Norinao; Natsu-Ume, Takashi; Yoneda, Kenji; Yoshikawa, Hideki; Yoneda, Minoru

    2010-10-01

    Our purpose was to clarify the clinical outcomes at 2 years after anatomic double-bundle anterior cruciate ligament (ACL) reconstruction with 20 N of the initial tension, which was the minimally required initial tension to perform the reconstruction successfully according to our previous report about the pre-tension necessary to restore the laxity found in the opposite knee (7.3 N; range, 2.2 to 14 N). Of 64 patients who underwent anatomic double-bundle ACL reconstruction with autogenous semitendinosus tendon, 45 were periodically examined for 2 years. Two double-looped grafts were fixed with EndoButton CL devices (Smith & Nephew Endoscopy, Andover, MA) on the femoral side and Double Spike Plates (Smith & Nephew Endoscopy) on the tibial side, while a total of 20 N of initial tension (10 N to each graft) was applied at 20° of knee flexion. The International Knee Documentation Committee Knee Examination Form and Lysholm score were used for the subjective assessment, whereas range of motion and knee stability were evaluated as the objective assessment. Grafts were evaluated in 25 patients with second-look arthroscopy. According to the International Knee Documentation Committee subjective assessment, 62% of knees were graded as normal and 38% as nearly normal. The Lysholm score was 72 points in the preoperative period and improved to 99 points at 2 years' follow-up. A loss of knee extension of less than 3° was found in 2 patients. The pivot-shift test was evaluated as negative in all patients except for 5 as a glide. KT-2000 knee arthrometer side-to-side difference (MEDmetric, San Diego, CA) was 0.1 ± 0.9 mm at 2 years' follow-up. Of the subset of grafts evaluated by second-look arthroscopy, most were considered to have good synovial coverage and to be taut. The anatomic double-bundle ACL reconstruction with 20 N of low initial tension yielded good clinical outcomes at 2 years postoperatively, and second-look arthroscopic findings were excellent. Level IV

  14. Detection of Cavities by Inverse Heat Conduction Boundary Element Method Using Minimal Energy Technique

    International Nuclear Information System (INIS)

    Choi, C. Y.

    1997-01-01

    A geometrical inverse heat conduction problem is solved for infrared scanning cavity detection by the boundary element method using a minimal energy technique. By minimizing the kinetic energy of the temperature field, the boundary element equations are converted to a quadratic programming problem. A hypothetical inner boundary is defined such that the actual cavity is located interior to the domain. Temperatures at the hypothetical inner boundary are determined to meet the constraints of the measurement error of the surface temperature obtained by infrared scanning, and boundary element analysis is then performed for the position of the unknown boundary (cavity). A cavity detection algorithm is provided, and the effects of the minimal energy technique on the inverse solution method are investigated by means of numerical analysis

  15. Improving groundwater management in rural India using simple modeling tools with minimal data requirements

    Science.gov (United States)

    Moysey, S. M.; Oblinger, J. A.; Ravindranath, R.; Guha, C.

    2008-12-01

    shortly after the start of the monsoon and villager water use is small compared to the other fluxes. Groundwater fluxes were accounted for by conceptualizing the contributing areas upstream and downstream of the reservoir as one-dimensional flow tubes. This description of the flow system allows for the definition of physically-based parameters, making the model useful for investigating WHS infiltration under a variety of management scenarios. To address concerns regarding the uniqueness of the model parameters, 10,000 independent model calibrations were performed using randomly selected starting parameters. Based on this Monte Carlo analysis, it was found that the mean volume of water contributed by the WHS to infiltration over the study period (Sept.-Dec., 2007) was 48.1×10³ m³, with a 95% confidence interval of 43.7-53.7×10³ m³. This volume represents 17-21% of the total natural groundwater recharge contributed by the entire watershed, which was determined independently using a surface water balance. Despite the fact that the model is easy to use and requires minimal data, the results obtained provide a powerful quantitative starting point for managing groundwater withdrawals in the dry season.

  16. Defining the Minimal Factors Required for Erythropoiesis through Direct Lineage Conversion

    Directory of Open Access Journals (Sweden)

    Sandra Capellera-Garcia

    2016-06-01

    Full Text Available Erythroid cell commitment and differentiation proceed through activation of a lineage-restricted transcriptional network orchestrated by a group of well-characterized genes. However, the minimal set of factors necessary for instructing red blood cell (RBC) development remains undefined. We employed a screen for transcription factors allowing direct lineage reprogramming from fibroblasts to induced erythroid progenitors/precursors (iEPs). We show that Gata1, Tal1, Lmo2, and c-Myc (GTLM) can rapidly convert murine and human fibroblasts directly to iEPs. The transcriptional signature of murine iEPs resembled mainly that of primitive erythroid progenitors in the yolk sac, whereas addition of Klf1 or Myb to the GTLM cocktail resulted in iEPs with a more adult-type globin expression pattern. Our results demonstrate that direct lineage conversion is a suitable platform for defining and studying the core factors inducing the different waves of erythroid development.

  17. Performance Analysis of Video Transmission Using Sequential Distortion Minimization Method for Digital Video Broadcasting Terrestrial

    Directory of Open Access Journals (Sweden)

    Novita Astin

    2016-12-01

    Full Text Available This paper presents the transmission of a Digital Video Broadcasting system with streaming video at a resolution of 640x480 over different IQ rates and modulations. Distortion often occurs during video transmission, so the received video has poor quality. Key-frame selection algorithms are flexible with respect to changes in the video, but they omit the temporal information of the video sequence. To minimize the distortion between the original and the received video, we add a sequential distortion minimization algorithm. Its aim is to produce, by sequential repair, a new video that is close to the original without significant loss of content between the original and received video. The reliability of the video transmission was observed with a constellation diagram, with the best result at an IQ rate of 2 MHz and 8 QAM modulation. Video transmission was also compared with and without SEDIM (Sequential Distortion Minimization). The experimental results showed that the average PSNR (Peak Signal to Noise Ratio) of the video transmission using SEDIM increased from 19.855 dB to 48.386 dB, and the average SSIM (Structural Similarity) increased by 10.49%. The experimental results and comparison of the proposed method showed good performance. A USRP board was used as the RF front-end at 2.2 GHz.
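PSNR, the quality metric reported above, is computed from the mean squared error between the original and received frames. A minimal sketch for 8-bit frames represented as flat pixel lists; the two toy "frames" are illustrative.

```python
import math

def psnr(original, received, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-length frames."""
    mse = sum((a - b) ** 2 for a, b in zip(original, received)) / len(original)
    if mse == 0:
        return float("inf")      # identical frames
    return 10.0 * math.log10(peak ** 2 / mse)

original = [52, 55, 61, 66, 70, 61, 64, 73]
received = [54, 55, 60, 68, 69, 62, 64, 70]
print(f"PSNR = {psnr(original, received):.2f} dB")
```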

  18. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This makes it difficult to identify the sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways; doing so would help us identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which addresses the problems of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable network traversal algorithm and we assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for simultaneous network minimization and information flow analysis across various "omics" levels.
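A simple baseline for the sub-network extraction goal, not the flooding algorithm itself, is reachability intersection: keep only nodes lying on some directed path from a source (input) to a sink (output). A sketch on a toy regulatory graph with made-up gene names:

```python
from collections import deque

def reachable(graph, starts):
    """Breadth-first set of nodes reachable from `starts` in `graph`."""
    seen = set(starts)
    queue = deque(starts)
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def minimal_subnetwork(edges, sources, sinks):
    fwd, rev = {}, {}
    for u, v in edges:
        fwd.setdefault(u, []).append(v)
        rev.setdefault(v, []).append(u)
    # keep nodes reachable from a source AND able to reach a sink
    return reachable(fwd, sources) & reachable(rev, sinks)

# Toy gene regulatory graph (hypothetical names).
edges = [("inA", "g1"), ("g1", "g2"), ("g2", "outB"),
         ("g1", "g3"),                 # g3 is a regulatory dead end
         ("g4", "outB")]               # g4 is unreachable from the input
kept = minimal_subnetwork(edges, {"inA"}, {"outB"})
print(sorted(kept))                    # drops g3 and g4
```

The flooding method of the paper goes beyond this by weighting paths to quantify information flow; the intersection above only bounds which nodes can matter at all.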

  19. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite.

    Science.gov (United States)

    Bachelli, Mara Lígia Biazotto; Amaral, Rívia Darla Álvares; Benedetti, Benedito Carlos

    2013-01-01

    Lettuce is a leafy vegetable widely used in industry for minimally processed products, in which the sanitization step is crucial for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Hence, the search for alternatives to sodium hypochlorite has emerged as a matter of great interest. The suitability of chlorine dioxide (60 mg L(-1)/10 min), peracetic acid (100 mg L(-1)/15 min) and ozonated water (1.2 mg L(-1)/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L(-1) free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 μm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The chlorine dioxide, peracetic acid and ozonated water treatments promoted reductions of 2.5, 1.1 and 0.7 log cycles, respectively, in the microbial load of the minimally processed product and can be used as substitutes for sodium hypochlorite. These alternative compounds gave minimally processed lettuce a shelf-life of six days, while the shelf-life with sodium hypochlorite was 12 days.

  20. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite

    Directory of Open Access Journals (Sweden)

    Mara Lígia Biazotto Bachelli

    2013-09-01

    Full Text Available Lettuce is a leafy vegetable widely used in industry for minimally processed products, in which the sanitization step is crucial for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Hence, the search for alternatives to sodium hypochlorite has emerged as a matter of great interest. The suitability of chlorine dioxide (60 mg L-1/10 min), peracetic acid (100 mg L-1/15 min) and ozonated water (1.2 mg L-1/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L-1 free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 µm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The chlorine dioxide, peracetic acid and ozonated water treatments promoted reductions of 2.5, 1.1 and 0.7 log cycles, respectively, in the microbial load of the minimally processed product and can be used as substitutes for sodium hypochlorite. These alternative compounds gave minimally processed lettuce a shelf-life of six days, while the shelf-life with sodium hypochlorite was 12 days.

  1. Systematic process synthesis and design methods for cost effective waste minimization

    International Nuclear Information System (INIS)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    1995-01-01

    We present progress on our work to develop synthesis methods to aid in the design of cost-effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach in which bounding information allows it to fit within a mixed-integer programming framework. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  2. Systematic process synthesis and design methods for cost effective waste minimization

    Energy Technology Data Exchange (ETDEWEB)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-12-31

    We present progress on our work to develop synthesis methods to aid in the design of cost-effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach in which bounding information allows it to fit within a mixed-integer programming framework. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  3. An Improved Variational Method for Hyperspectral Image Pansharpening with the Constraint of Spectral Difference Minimization

    Science.gov (United States)

    Huang, Z.; Chen, Q.; Shen, Y.; Chen, Q.; Liu, X.

    2017-09-01

    Variational pansharpening can enhance the spatial resolution of a hyperspectral (HS) image using a high-resolution panchromatic (PAN) image. However, this technology may lead to spectral distortion that noticeably affects the accuracy of data analysis. In this article, we propose an improved variational method for HS image pansharpening with the constraint of spectral difference minimization. We extend the energy function of the classic variational pansharpening method by adding a new spectral fidelity term. This fidelity term is designed following the definition of the spectral angle mapper, which means that, for every pixel, the spectral difference value of any two bands in the HS image is in equal proportion to that of the two corresponding bands in the pansharpened image. A gradient descent method is adopted to find the optimal solution of the modified energy function, and the pansharpened image can be reconstructed. Experimental results demonstrate that the constraint of spectral difference minimization is able to preserve the original spectral information well in HS images and reduce the spectral distortion effectively. Compared to the original variational method, our method performs better in both visual and quantitative evaluation, and achieves a good trade-off between spatial and spectral information.
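The spectral fidelity term above is built on the spectral angle mapper (SAM); as an illustrative sketch of that building block (the full variational energy is not reproduced in this record), the spectral angle between two pixel spectra can be computed as:

```python
import math

def spectral_angle(s1, s2):
    """Spectral angle mapper (SAM) between two pixel spectra, in radians.

    A smaller angle means the pansharpened spectrum is closer in shape
    to the original hyperspectral spectrum (overall magnitude is ignored).
    """
    dot = sum(a * b for a, b in zip(s1, s2))
    n1 = math.sqrt(sum(a * a for a in s1))
    n2 = math.sqrt(sum(b * b for b in s2))
    # Clamp against floating-point drift before taking the arccosine.
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

# A scaled copy of a spectrum has zero angle; orthogonal spectra give pi/2.
print(spectral_angle([1, 2, 3], [2, 4, 6]))
```

Because SAM ignores magnitude, penalizing it preserves spectral shape while the PAN image is free to sharpen spatial detail.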

  4. [Manufacture method and clinical application of minimally invasive dental implant guide template based on registration technology].

    Science.gov (United States)

    Lin, Zeming; He, Bingwei; Chen, Jiang; Du, Zhibin; Zheng, Jingyi; Li, Yanqin

    2012-08-01

    To guide doctors in precisely positioning implants during surgery, a new production method for a minimally invasive implant guide template is presented. The patient's mandible was scanned by a CT scanner, and a three-dimensional jaw bone model was constructed from the CT image data. The professional dental implant software Simplant was used to simulate implant placement on the three-dimensional CT model to determine the location and depth of the implants. At the same time, the dental plaster models were scanned by a stereo vision system to build the oral mucosa model. Next, curvature registration technology was used to fuse the oral mucosa model and the CT model, so that the designed implant position in the oral mucosa could be determined. The minimally invasive implant guide template was designed in 3-Matic software according to the designed implant position and the oral mucosa model. Finally, the template was produced by rapid prototyping. The three-dimensional registration technology proved useful for fusing the CT data and the dental plaster data, and the template was accurate enough to guide doctors during actual implantation without cutting the mucosa. Guide templates fabricated by the combined use of three-dimensional registration, Simplant simulation and rapid prototyping are accurate, enable minimally invasive and precise implant surgery, and are worthy of clinical use.

  5. An Approximate Redistributed Proximal Bundle Method with Inexact Data for Minimizing Nonsmooth Nonconvex Functions

    Directory of Open Access Journals (Sweden)

    Jie Shen

    2015-01-01

    Full Text Available We describe an extension of the redistributed technique from the classical proximal bundle method to the inexact situation for minimizing nonsmooth nonconvex functions. The cutting-plane model we construct approximates not the whole nonconvex function but a local convexification of the approximate objective function, and this local convexification is modified dynamically in order to always yield nonnegative linearization errors. Since we only employ approximate function values and approximate subgradients, theoretical convergence analysis shows that an approximate stationary point or some double approximate stationary point can be obtained under mild conditions.

  6. Note: A method for minimizing oxide formation during elevated temperature nanoindentation

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, I. C.; Hodge, A. M., E-mail: ahodge@usc.edu [Department of Aerospace and Mechanical Engineering, University of Southern California, 3650 McClintock Avenue OHE430, Los Angeles, California 90089 (United States); Garcia-Sanchez, E. [Department of Aerospace and Mechanical Engineering, University of Southern California, 3650 McClintock Avenue OHE430, Los Angeles, California 90089 (United States); Facultad de Ingeniería Mecánica y Eléctrica, Universidad Autónoma de Nuevo León, Av. Universidad S/N, San Nicolás de los Garza, NL 66450 (Mexico)

    2014-09-15

    A standardized method to protect metallic samples and minimize oxide formation during elevated-temperature nanoindentation was adapted to a commercial instrument. Nanoindentation was performed on Al (100), Cu (100), and W (100) single crystals submerged in vacuum oil at 200 °C, while the surface morphology and oxidation were carefully monitored using atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). The results were compared to room temperature and 200 °C nanoindentation tests performed without oil, in order to evaluate the feasibility of using the oil as a protective medium. Extensive surface characterization demonstrated that this methodology is effective for nanoscale testing.

  7. Minimal invasive stabilization of osteoporotic vertebral compression fractures. Methods and preinterventional diagnostics

    International Nuclear Information System (INIS)

    Grohs, J.G.; Krepler, P.

    2004-01-01

    Minimally invasive stabilization represents a new alternative for the treatment of osteoporotic compression fractures. Vertebroplasty and balloon kyphoplasty are two methods to enhance the strength of osteoporotic vertebral bodies by means of cement application. Vertebroplasty is the older and technically easier method. Balloon kyphoplasty is the newer and more expensive method, which not only improves pain but also restores the sagittal profile of the spine. By balloon kyphoplasty, the height of 101 fractured vertebral bodies could be increased by up to 90%, and the wedge decreased from 12 to 7 degrees. Pain was reduced from 7.2 to 2.5 points. The Oswestry disability index decreased from 60 to 26 points. These effects persisted over a period of two years. Cement leakage occurred in only 2% of vertebral bodies. Fractures of adjacent vertebral bodies were found in 11%. Good preinterventional diagnostics and intraoperative imaging are necessary for balloon kyphoplasty to be applied successfully. (orig.) [de

  8. Variational method for the minimization of entropy generation in solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Smit, Sjoerd; Kessels, W. M. M., E-mail: w.m.m.kessels@tue.nl [Department of Applied Physics, Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven (Netherlands)

    2015-04-07

    In this work, a method is presented to extend traditional solar cell simulation tools to make it possible to calculate the most efficient design of practical solar cells. The method is based on the theory of non-equilibrium thermodynamics, which is used to derive an expression for the local entropy generation rate in the solar cell, making it possible to quantify all free energy losses on the same scale. The framework of non-equilibrium thermodynamics can therefore be combined with the calculus of variations and existing solar cell models to minimize the total entropy generation rate in the cell to find the most optimal design. The variational method is illustrated by applying it to a homojunction solar cell. The optimization results in a set of differential algebraic equations, which determine the optimal shape of the doping profile for given recombination and transport models.

  9. Minimizing transfusion requirements for children undergoing craniosynostosis repair: the CHoR protocol.

    Science.gov (United States)

    Vega, Rafael A; Lyon, Camila; Kierce, Jeannette F; Tye, Gary W; Ritter, Ann M; Rhodes, Jennifer L

    2014-08-01

    Children with craniosynostosis may require cranial vault remodeling to prevent or relieve elevated intracranial pressure and to correct the underlying craniofacial abnormalities. The procedure is typically associated with significant blood loss and high transfusion rates. The risks associated with transfusions are well documented and include transmission of infectious agents, bacterial contamination, acute hemolytic reactions, transfusion-related lung injury, and transfusion-related immune modulation. This study presents the Children's Hospital of Richmond (CHoR) protocol, which was developed to reduce the rate of blood transfusion in infants undergoing primary craniosynostosis repair. A retrospective chart review of pediatric patients treated between January 2003 and February 2012 was performed. The CHoR protocol was instituted in November 2008, with the following three components: 1) the use of preoperative erythropoietin and iron therapy, 2) the use of an intraoperative blood recycling device, and 3) acceptance of a lower hemoglobin level as a trigger for transfusion. Patients treated before protocol implementation served as controls. A total of 60 children were included in the study, 32 of whom were treated with the CHoR protocol. The control (C) and protocol (P) groups were comparable with respect to patient age (7 vs 8.4 months, p = 0.145). Recombinant erythropoietin effectively raised the mean preoperative hemoglobin level in the P group (12 vs 9.7 g/dl). A protocol that includes preoperative administration of recombinant erythropoietin, intraoperative autologous blood recycling, and acceptance of a lower transfusion trigger significantly decreased transfusion utilization (p < 0.001). A decreased length of stay (p < 0.001) was also seen, although the authors did not investigate whether composite reductions in transfusion complications led to better outcomes.

  10. An Approximate Proximal Bundle Method to Minimize a Class of Maximum Eigenvalue Functions

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2014-01-01

    Full Text Available We present an approximate nonsmooth algorithm to solve a minimization problem in which the objective function is the sum of a maximum eigenvalue function of matrices and a convex function. The essential idea for solving the optimization problem in this paper is similar to that of the proximal bundle method, but the difference is that we choose approximate subgradients and function values to construct an approximate cutting-plane model for the above-mentioned problem. An important advantage of the approximate cutting-plane model for the objective function is that it is more stable than the cutting-plane model. In addition, an approximate proximal bundle algorithm is given. Furthermore, the sequences generated by the algorithm converge to the optimal solution of the original problem.

  11. Approximate k-NN delta test minimization method using genetic algorithms: Application to time series

    CERN Document Server

    Mateo, F; Gadea, Rafael; Sovilj, Dusan

    2010-01-01

    In many real-world problems, the existence of irrelevant input variables (features) hinders the predictive quality of the models used to estimate the output variables. In particular, time series prediction often involves building large regressors of artificial variables that can contain irrelevant or misleading information. Many techniques have arisen to confront the problem of accurate variable selection, including both local and global search strategies. This paper presents a method based on genetic algorithms that aims to find a globally optimal set of input variables that minimizes the Delta Test criterion. The execution speed has been enhanced by substituting the exact nearest neighbor computation with its approximate version. The problems of scaling and projection of variables have been addressed. The developed method works in conjunction with MATLAB's Genetic Algorithm and Direct Search Toolbox. The goodness of the proposed methodology has been evaluated on several popular time series examples, and also ...
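The Delta Test criterion that the genetic algorithm minimizes can be sketched with a brute-force nearest-neighbour implementation (the GA wrapper and the paper's approximate k-NN speedup are omitted; the toy data below is an assumption for illustration):

```python
def delta_test(X, y, features):
    """Delta Test noise-variance estimate for a candidate feature subset.

    For each sample, find its nearest neighbour using only the selected
    features, and average half the squared output difference. Lower
    values suggest the subset better explains the output.
    """
    n = len(X)
    total = 0.0
    for i in range(n):
        nn, best = None, float("inf")
        for j in range(n):
            if j == i:
                continue
            d = sum((X[i][f] - X[j][f]) ** 2 for f in features)
            if d < best:
                best, nn = d, j
        total += (y[i] - y[nn]) ** 2
    return total / (2.0 * n)

# Feature 0 determines y while feature 1 is noise, so the subset (0,)
# scores lower (better) than (1,).
X = [[0.0, 5.0], [0.1, -3.0], [1.0, 4.0], [1.1, -2.0]]
y = [0.0, 0.0, 1.0, 1.0]
print(delta_test(X, y, (0,)), delta_test(X, y, (1,)))
```

A genetic algorithm then searches over binary feature-inclusion masks, using this score as the fitness to minimize.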

  12. Chronic Morel-Lavallée Lesion: A Novel Minimally Invasive Method of Treatment.

    Science.gov (United States)

    Mettu, Ramireddy; Surath, Harsha Vardhan; Chayam, Hanumantha Rao; Surath, Amaranth

    2016-11-01

    A Morel-Lavallée lesion is a closed internal degloving injury resulting from a shearing force applied to the skin. The etiology of this condition may include motor vehicle accidents, falls, contact sports (ie, football, wrestling),1 and iatrogenic causes after mammoplasty or abdominal liposuction.2 Common sites of the lesions include the pelvis and/or thigh.3 Isolated Morel-Lavallée lesions without an underlying fracture are likely to be missed, which results in chronicity. Management of this condition often requires extensive surgical procedures such as debridement, sclerotherapy, serial percutaneous drainage, negative pressure wound therapy (NPWT), and skin grafting.4,5 The authors wish to highlight a minimally invasive technique for the treatment of chronic Morel-Lavallée lesions.

  13. MINIMAL INVASIVE PLATE OSTEOSYNTHESIS- AN EFFECTIVE TREATMENT METHOD FOR DISTAL TIBIA INTRAARTICULAR (PILON) FRACTURES- AN 18 MONTHS FOLLOW UP

    Directory of Open Access Journals (Sweden)

    Saket Jati

    2016-12-01

    Full Text Available BACKGROUND Tibial pilon fractures, though requiring operative treatment, are difficult to manage. Conventional osteosynthesis is not suitable, because the distal tibia is a subcutaneous bone with poor vascularity. Closed reduction and Minimally Invasive Plate Osteosynthesis (MIPO) for the distal tibia has emerged as an alternative treatment option because it respects fracture biology and the fracture haematoma, and also provides a biomechanically stable construct. The aim of the study is to evaluate the results of minimally invasive plate osteosynthesis using locking plates in treating tibial pilon fractures in terms of fracture union, restoration of ankle function and complications. MATERIALS AND METHODS 30 patients with closed tibial pilon fractures (Ruedi and Allgower type I (14), type II (13), type III (3)) treated with MIPO with Locking Compression Plates (LCP) were prospectively followed for an average duration of 18 months. RESULTS The average injury-to-hospital and injury-to-surgery intervals were 12.05 hours and 3.50 days, respectively. All fractures united, with an average duration of 20.8 weeks (range 14-28 weeks). The Olerud and Molander score was used for evaluation at 3 months, 6 months and 18 months. One patient had union with a valgus angulation of 15 degrees, but no nonunion was found. CONCLUSION The present study shows that MIPO with LCP is an effective treatment method for tibial pilon fractures in terms of union time and complication rate, promoting early union and early weight bearing.

  14. New hybrid frequency reuse method for packet loss minimization in LTE network.

    Science.gov (United States)

    Ali, Nora A; El-Dakroury, Mohamed A; El-Soudani, Magdi; ElSayed, Hany M; Daoud, Ramez M; Amer, Hassanein H

    2015-11-01

    This paper investigates the problem of inter-cell interference (ICI) in Long Term Evolution (LTE) mobile systems, one of the main causes of packet loss between the base station and the mobile station. Recently, different frequency reuse methods, such as soft and fractional frequency reuse, have been introduced to mitigate this type of interference. In this paper, minimizing the packet loss between the base station and the mobile station is the main concern. Soft Frequency Reuse (SFR), the most popular frequency reuse method, is examined and the amount of packet loss is measured. To reduce packet loss, a new hybrid frequency reuse method is implemented. In this method, each cell occupies the same bandwidth as in SFR, but the total system bandwidth is greater than in SFR. This provides the new method with many new sub-carriers from the neighboring cells to reduce the ICI, which causes significant packet loss in many applications. The new hybrid frequency reuse method shows a noticeable improvement in packet loss compared to the SFR method across the different frequency bands. Traffic congestion management in Intelligent Transportation Systems (ITS) is one important application affected by packet loss, owing to the large amount of traffic exchanged between the base station and the mobile node. It is therefore used as the case-study application for the proposed frequency reuse method, and the improvement in packet loss reached 49.4% in some frequency bands.

  15. A Sparsity-Promoted Method Based on Majorization-Minimization for Weak Fault Feature Enhancement.

    Science.gov (United States)

    Ren, Bangyue; Hao, Yansong; Wang, Huaqing; Song, Liuyang; Tang, Gang; Yuan, Hongfang

    2018-03-28

    Fault transient impulses induced by faulty components in rotating machinery usually contain substantial interference. Fault features are comparatively weak in the initial fault stage, which renders fault diagnosis more difficult. In this case, a sparse representation method based on the Majorization-Minimization (MM) algorithm is proposed to enhance weak fault features and extract them from strong background noise. However, the traditional MM algorithm suffers from two issues: the choice of the sparse basis and complicated calculations. To address these challenges, a modified MM algorithm is proposed in which a sparse optimization objective function is designed first. Inspired by the Basis Pursuit (BP) model, the optimization function integrates an impulsive feature-preserving factor and a penalty function factor. Second, a modified majorization iterative method is applied to the convex optimization problem of the designed function. A series of sparse coefficients, containing only the transient components, is obtained by iterating. Notably, there is no need to select the sparse basis in the proposed iterative method because it is fixed as a unit matrix, and the reconstruction step is therefore omitted, which significantly increases detection efficiency. Finally, envelope analysis of the sparse coefficients is performed to extract the weak fault features. Simulated and experimental signals, including bearings and gearboxes, are employed to validate the effectiveness of the proposed method. In addition, comparisons are made to prove that the proposed method outperforms the traditional MM algorithm in terms of detection results and efficiency.
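With the sparse basis fixed to a unit (identity) matrix, the core of such BP-style penalized objectives, min_x ½‖y − x‖² + λ‖x‖₁, is solved elementwise by soft-thresholding; the sketch below illustrates only that building block, not the authors' modified MM iteration or feature-preserving factor:

```python
def soft_threshold(y, lam):
    """Elementwise soft-thresholding: the proximal operator of lam*||x||_1.

    Small entries (mostly background noise) are zeroed, while large
    entries (transient fault impulses) are kept but shrunk by lam.
    """
    out = []
    for v in y:
        if v > lam:
            out.append(v - lam)
        elif v < -lam:
            out.append(v + lam)
        else:
            out.append(0.0)
    return out

# Noise-like small values vanish; the large "impulse" at 5.0 survives.
print(soft_threshold([0.2, -0.3, 5.0, -0.1], lam=0.5))
```

Envelope analysis is then applied to the surviving coefficients to read off the fault characteristic frequency.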

  16. Charge transfer interaction using quasiatomic minimal-basis orbitals in the effective fragment potential method

    International Nuclear Information System (INIS)

    Xu, Peng; Gordon, Mark S.

    2013-01-01

    The charge transfer (CT) interaction, the most time-consuming term in the general effective fragment potential method, is made much more computationally efficient. This is accomplished by the projection of the quasiatomic minimal-basis-set orbitals (QUAMBOs) as the atomic basis onto the self-consistent field virtual molecular orbital (MO) space to select a subspace of the full virtual space called the valence virtual space. The diagonalization of the Fock matrix in terms of QUAMBOs recovers the canonical occupied orbitals and, more importantly, gives rise to the valence virtual orbitals (VVOs). The CT energies obtained using VVOs are generally as accurate as those obtained with the full virtual space canonical MOs because the QUAMBOs span the valence part of the virtual space, which can generally be regarded as "chemically important." The number of QUAMBOs is the same as the number of minimal-basis MOs of a molecule. Therefore, the number of VVOs is significantly smaller than the number of canonical virtual MOs, especially for large atomic basis sets. This leads to a dramatic decrease in the computational cost.

  17. Minimization of municipal solid waste transportation route in West Jakarta using Tabu Search method

    Science.gov (United States)

    Chaerul, M.; Mulananda, A. M.

    2018-04-01

    Indonesia still adopts the collect-haul-dispose concept for municipal solid waste handling, which leads to queues of waste trucks at the final disposal site (TPA). The study aims to minimize the total distance of the waste transportation system by applying a transshipment model. In this case, the analog of a transshipment point is a compaction facility (SPA). Small-capacity trucks collect the waste from temporary collection points (TPS) and bring it to the compaction facility, which is located near the waste generators. After compaction, the waste is transported in large-capacity trucks to the final disposal site, which is located far from the city. Problems of this kind can be solved as a Vehicle Routing Problem (VRP). In this study, the shortest routes from truck pool to TPS, TPS to SPA, and SPA to TPA were determined using a meta-heuristic method, namely a two-phase Tabu Search. The TPS studied are of the container type, with 43 units throughout West Jakarta City served by 38 Armroll trucks with a capacity of 10 m3 each. The result assigns each truck from the pool to the selected TPS, SPA and TPA with a total minimum distance of 2,675.3 km; minimizing the distance also minimizes the total transportation cost to be borne by the government.
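The record does not detail the two-phase Tabu Search; as a generic illustration of the metaheuristic on a toy routing instance (the swap neighbourhood, tenure and the four-node instance are all assumptions, not the study's data), a minimal version looks like:

```python
import math
from itertools import combinations

def tour_length(order, dist):
    """Total length of a closed route visiting every node once."""
    return sum(dist[order[i]][order[(i + 1) % len(order)]]
               for i in range(len(order)))

def tabu_search(dist, start, iters=50, tenure=5):
    """Minimal tabu search over 2-swap neighbours of a route.

    Recently swapped node pairs stay tabu for `tenure` iterations,
    unless the move improves the best route found so far (aspiration).
    """
    cur = list(start)
    best, best_len = cur[:], tour_length(cur, dist)
    tabu = {}  # move -> iteration until which it remains forbidden
    for it in range(iters):
        cand, cand_len, cand_move = None, float("inf"), None
        for i, j in combinations(range(len(cur)), 2):
            nb = cur[:]
            nb[i], nb[j] = nb[j], nb[i]
            length = tour_length(nb, dist)
            move = (min(cur[i], cur[j]), max(cur[i], cur[j]))
            if tabu.get(move, -1) >= it and length >= best_len:
                continue  # forbidden move, and no aspiration
            if length < cand_len:
                cand, cand_len, cand_move = nb, length, move
        if cand is None:
            break
        cur = cand  # accept the best admissible neighbour, even if worse
        tabu[cand_move] = it + tenure
        if cand_len < best_len:
            best, best_len = cand[:], cand_len
    return best, best_len

# Four depots on a unit square: the optimal round trip has length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(p, q) for q in pts] for p in pts]
route, length = tabu_search(dist, [0, 2, 1, 3])
print(route, length)
```

Accepting non-improving moves while forbidding recent ones is what lets tabu search escape the local optima that a plain greedy swap heuristic gets stuck in.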

  18. A Parsimonious Model of the Rabbit Action Potential Elucidates the Minimal Physiological Requirements for Alternans and Spiral Wave Breakup.

    Science.gov (United States)

    Gray, Richard A; Pathmanathan, Pras

    2016-10-01

    Elucidating the underlying mechanisms of fatal cardiac arrhythmias requires a tight integration of electrophysiological experiments, models, and theory. Existing models of the transmembrane action potential (AP) are complex (resulting in overparameterization) and varied (leading to dissimilar predictions). Thus, simpler models are needed to elucidate the "minimal physiological requirements" to reproduce significant observable phenomena using as few parameters as possible. Moreover, models have been derived from experimental studies of a variety of species under a range of environmental conditions (for example, all existing rabbit AP models incorporate a formulation of the rapid sodium current, INa, based on 30-year-old data from chick embryo cell aggregates). Here we develop a simple "parsimonious" rabbit AP model that is mathematically identifiable (i.e., not overparameterized) by combining a novel Hodgkin-Huxley formulation of INa with a phenomenological model of repolarization similar to the voltage-dependent, time-independent rectifying outward potassium current (IK). The model was calibrated using the following experimental data sets measured from the same species (rabbit) under physiological conditions: dynamic current-voltage (I-V) relationships during the AP upstroke; rapid recovery of AP excitability during the relative refractory period; and steady-state INa inactivation via voltage clamp. Simulations reproduced several important "emergent" phenomena, including cellular alternans at rates > 250 bpm as observed in rabbit myocytes, reentrant spiral waves as observed on the surface of the rabbit heart, and spiral wave breakup. Model variants were studied which elucidated the minimal requirements for alternans and spiral wave breakup, namely the kinetics of INa inactivation and the non-linear rectification of IK. The simplicity of the model, and the fact that its parameters have physiological meaning, make it ideal for engendering generalizable mechanistic ...
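The authors' calibrated INa/IK model is not reproduced in this record; as a generic illustration of how few equations a minimal excitable-cell model needs, the classic two-variable FitzHugh-Nagumo caricature (not the paper's rabbit model) already produces a stimulus-triggered action potential:

```python
def fitzhugh_nagumo(i_stim_amp=1.0, stim_end=5.0, dt=0.01, t_end=100.0):
    """Forward-Euler integration of the FitzHugh-Nagumo model.

    v is a dimensionless membrane potential and w a slow recovery
    variable; a brief suprathreshold current pulse triggers one spike,
    after which the cell returns to rest. Parameters are the textbook
    defaults (a=0.7, b=0.8, eps=0.08).
    """
    a, b, eps = 0.7, 0.8, 0.08
    v, w = -1.1994, -0.6243  # approximate resting state
    trace = []
    t = 0.0
    while t < t_end:
        i_stim = i_stim_amp if t < stim_end else 0.0
        dv = v - v ** 3 / 3.0 - w + i_stim  # fast "sodium-like" dynamics
        dw = eps * (v + a - b * w)          # slow "potassium-like" recovery
        v += dt * dv
        w += dt * dw
        trace.append(v)
        t += dt
    return trace

trace = fitzhugh_nagumo()
print(max(trace), trace[-1])  # spike peak, then back near rest
```

The paper's point is that a comparably small, identifiable model, once calibrated to rabbit data, suffices to reproduce alternans and spiral wave breakup.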

  19. Critical requirements of the SSTR method

    International Nuclear Information System (INIS)

    Gold, R.

    1975-08-01

    Discrepancies well outside experimental error have been reported between absolute fission rate measurements observed with Solid State Track Recorders (SSTR) and fission chambers. As a result of these comparisons, the reliability of the SSTR method has been seriously questioned, and the fission chamber method has been advanced for sole use in absolute fission rate determinations. In view of the absolute accuracy already reported and well documented for the SSTR method, this conclusion is both surprising and unfortunate. Two independent methods are highly desirable. Moreover, these two methods complement one another, since certain in-core experiments may be amenable to either but not both techniques. Consequently, one cannot abandon the SSTR method without sacrificing crucial advantages. A critical reappraisal of certain aspects of the SSTR method is offered in the hope that the source of the current controversy can be uncovered and a long-term beneficial agreement between these two methods can therefore be established. (WHK)

  20. Inverse atmospheric radiative transfer problems - A nonlinear minimization search method of solution. [aerosol pollution monitoring

    Science.gov (United States)

    Fymat, A. L.

    1976-01-01

    The paper studies the inversion of the radiative transfer equation describing the interaction of electromagnetic radiation with atmospheric aerosols. The interaction can be considered as the propagation in the aerosol medium of two light beams: the direct beam in the line-of-sight, attenuated by absorption and scattering, and the diffuse beam arising from scattering into the viewing direction, which propagates more or less in random fashion. The latter beam has single-scattering and multiple-scattering contributions. For single scattering, the problem is reducible to Fredholm equations of the first kind, while for multiple scattering it is necessary to invert partial integro-differential equations. A nonlinear minimization search method, applicable to the solution of both types of problems, has been developed and is applied here to the problem of monitoring aerosol pollution, namely the complex refractive index and size distribution of aerosol particles.

  1. Evaluation of the accuracy of the free-energy-minimization method

    International Nuclear Information System (INIS)

    Najafabadi, R.; Srolovitz, D.J.

    1995-01-01

    We have made a detailed comparison between three competing methods for determining the free energies of solids and their defects: thermodynamic integration of Monte Carlo (TIMC) data, the quasiharmonic (QH) model, and the free-energy-minimization (FEM) method. The accuracy of these methods decreases from TIMC to QH to FEM, while the computational efficiency improves in that order. All three methods yield perfect-crystal lattice parameters and free energies at finite temperatures which are in good agreement for three different Cu interatomic potentials [embedded atom method (EAM), Morse and Lennard-Jones]. The FEM errors (relative to the TIMC) in the (001) surface free energy and in the vacancy formation energy were found to be much larger for the EAM potential than for the other two potentials. Part of the error in the FEM determination of the free energies is associated with anharmonicities in the interatomic potentials, with the remainder attributed to decoupling of the atomic vibrations. The anharmonicity of the EAM potential was found to be unphysically large compared with experimental vacancy formation entropy determinations. Based upon these results, we show that the FEM method provides a reasonable compromise between accuracy and computational demands. However, the accuracy of this approach is sensitive to the choice of interatomic potential and the nature of the defect to which it is applied. The accuracy of the FEM is best in high-symmetry environments (perfect crystal, high-symmetry defects, etc.) and when used to describe materials where the anharmonicity is not too large.
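The quasiharmonic model referred to above evaluates the free energy as the static lattice energy plus a vibrational sum over mode frequencies. A minimal sketch of that standard QH expression, F(T) = E0 + kT Σᵢ ln(2 sinh(ħωᵢ/2kT)), follows; the mode frequencies here are hypothetical, and this is not the TIMC or FEM machinery of the paper.

```python
import math

# Standard quasiharmonic (QH) free energy, a cheap alternative to
# thermodynamic integration. e0 is the static lattice energy;
# `frequencies` are vibrational mode angular frequencies (hypothetical).

HBAR = 1.0545718e-34   # reduced Planck constant, J*s
KB = 1.380649e-23      # Boltzmann constant, J/K

def quasiharmonic_free_energy(e0, frequencies, temperature):
    """F(T) = E0 + kB*T * sum_i ln(2*sinh(hbar*w_i / (2*kB*T)))."""
    kt = KB * temperature
    return e0 + kt * sum(math.log(2.0 * math.sinh(HBAR * w / (2.0 * kt)))
                         for w in frequencies)

# Three hypothetical modes of a small defect cell, in rad/s
freqs = [1.0e13, 1.2e13, 1.5e13]
f_300 = quasiharmonic_free_energy(0.0, freqs, 300.0)
```

As T approaches 0 the expression reduces to the zero-point energy E0 + Σᵢ ħωᵢ/2, which is a quick sanity check on any implementation.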

  2. Delay generation methods with reduced memory requirements

    DEFF Research Database (Denmark)

    Tomov, Borislav Gueorguiev; Jensen, Jørgen Arendt

    2003-01-01

    Modern diagnostic ultrasound beamformers require delay information for each sample along the image lines. In order to avoid storing large amounts of focusing data, delay generation techniques have to be used. In connection with developing a compact beamformer architecture, recursive algorithms were ... 1) For the best parametric approach, the gate count was 2095, the maximum operation speed was 131.9 MHz, the power consumption at 40 MHz was 10.6 mW, and it requires four 12-bit words for each image line and channel. 2) For the piecewise-linear approximation, the corresponding numbers are 1125 gates, 184.9 MHz, 7...
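The piecewise-linear approximation quantified above trades one stored delay per sample for a few stored (start value, slope) pairs per line and channel, with the delay profile regenerated recursively by one addition per output sample. The sketch below illustrates that idea under invented geometry (element offset, sample rate, depth range); it is not the paper's hardware algorithm.

```python
import math

# Hedged illustration: regenerate a beamformer focusing-delay profile from
# a few piecewise-linear segments, using one addition per output sample.
# All geometry and rates below are assumed for the example.

C, FS, X = 1540.0, 40e6, 5e-3   # sound speed (m/s), sample rate (Hz), element offset (m)

def exact_delay(depth):
    """Focusing delay, in samples, for an element offset X from the image line."""
    return (math.sqrt(depth * depth + X * X) - depth) / C * FS

def pwl_delays(d_start, d_stop, n_samples, n_segments):
    """Delays generated from n_segments stored (start, slope) pairs."""
    per_seg = n_samples // n_segments
    step = (d_stop - d_start) / n_samples
    out = []
    for s in range(n_segments):
        d0 = d_start + s * per_seg * step
        d1 = d_start + (s + 1) * per_seg * step
        v = exact_delay(d0)
        slope = (exact_delay(d1) - v) / per_seg
        for _ in range(per_seg):
            out.append(v)
            v += slope          # recursive update: one addition per sample
    return out

depths = [5e-3 + i * (45e-3 / 512) for i in range(512)]
exact = [exact_delay(d) for d in depths]
err4 = max(abs(a - b) for a, b in zip(pwl_delays(5e-3, 50e-3, 512, 4), exact))
err16 = max(abs(a - b) for a, b in zip(pwl_delays(5e-3, 50e-3, 512, 16), exact))
```

For this smooth profile, increasing the segment count rapidly shrinks the worst-case delay error, which is exactly the memory/accuracy trade-off the abstract quantifies in gates and stored words.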

  3. Minimal requirements for quality controls in radiotherapy with external beams; Controlli di qualita' essenziali in radioterapia con fasci esterni

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-07-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not set strict rules or firm recommendations, it suggests the minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy. [Italian] The study group Quality Assurance in Radiotherapy of the Istituto Superiore di Sanita' presents guidelines for drafting the essential quality control protocols needed to guarantee an adequate level of accuracy of the radiation treatment; they therefore represent an essential part of the overall physical-dosimetric contribution to quality assurance in radiotherapy with external beams.

  4. RE-EDUCATIVE METHOD IN THE PROCESS OF MINIMIZING AUTOAGGRESSIVE WAYS OF BEHAVIOR

    Directory of Open Access Journals (Sweden)

    Nenad GLUMBIC

    1999-05-01

    Full Text Available Autoaggressive behavior is a relatively frequent symptom of the mental and behavioral disturbances that are the subject of professional engagement of clinically oriented defectologists. In the process of rehabilitation numerous methods are used, from behavioral to psychopharmacological ones, by which the above-mentioned problems are eliminated or softened. The paper deals with four children with different diagnoses (autism, disintegrative psychosis, Patau syndrome and amaurosis) that share a common denominator: mental retardation and autoaggression. By describing a case study, as well as the ways of working with these children, we have tried to point out a possible application of particular methods of general and special re-education of psychomotorics in the process of minimizing autoaggressive ways of behavior. The paper gives the author's notion of the indications for applying the re-educative method within the multihandicapped children population. Defectological treatment discovers new forms of existence in the existential field, not only for the retarded child but also for the therapist. Epistemological consequences of the mentioned transfer are given in detail in the paper.

  5. Cauliflower ear – a minimally invasive treatment method in a wrestling athlete: a case report

    Directory of Open Access Journals (Sweden)

    Haik J

    2018-01-01

    Full Text Available Josef Haik,1–4 Or Givol,2 Rachel Kornhaber,1,5 Michelle Cleary,6 Hagit Ofir,1,2 Moti Harats1–3 1Department of Plastic and Reconstructive Surgery, Sheba Medical Center, Tel Hashomer, Ramat Gan, 2Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel; 3Burn Injury Research Node, Institute for Health Research University of Notre Dame Fremantle, Fremantle WA, Australia; 4Talpiot Leadership Program, Sheba Medical Center, Tel Hashomer, Ramat Gan, Israel; 5Faculty of Health, 6School of Health Sciences, College of Health and Medicine, University of Tasmania, Sydney, NSW, Australia Abstract: Acute auricular hematoma can be caused by direct blunt trauma or other injury to the external ear. It is typically seen in those who practice full contact sports such as boxing, wrestling, and rugby. “Cauliflower ear” deformity, fibrocartilage formation during scarring, is a common complication of auricular hematomas. Therefore, acute drainage of the hematoma and postprocedural techniques for preventing recurrence are necessary for preventing the deformity. Many techniques exist, although no superior method of treatment has been established. In this case report, we describe a novel method using needle aspiration followed by the application of a magnet and an adapted disc to the affected area of the auricle. This minimally invasive, simple, and accessible method could potentially facilitate the treatment of cauliflower ear among full contact sports athletes. Keywords: cauliflower ear, hematoma, ear deformity, athletic injuries, wrestling, case report

  6. Octasaccharide is the minimal length unit required for efficient binding of cyclophilin B to heparin and cell surface heparan sulphate.

    Science.gov (United States)

    Vanpouille, Christophe; Denys, Agnès; Carpentier, Mathieu; Pakula, Rachel; Mazurier, Joël; Allain, Fabrice

    2004-09-01

    Cyclophilin B (CyPB) is a heparin-binding protein first identified as a receptor for cyclosporin A. In previous studies, we reported that CyPB triggers chemotaxis and integrin-mediated adhesion of T-lymphocytes by way of interaction with two types of binding sites. The first site corresponds to a signalling receptor; the second site has been identified as heparan sulphate (HS) and appears crucial to induce cell adhesion. Characterization of the HS-binding unit is critical to understand the requirement of HS in pro-adhesive activity of CyPB. By using a strategy based on gel mobility shift assays with fluorophore-labelled oligosaccharides, we demonstrated that the minimal heparin unit required for efficient binding of CyPB is an octasaccharide. The mutants CyPB(KKK-) [where KKK- refers to the substitutions K3A(Lys3-->Ala)/K4A/K5A] and CyPB(DeltaYFD) (where Tyr14-Phe-Asp16 has been deleted) failed to interact with octasaccharides, confirming that the Y14FD16 and K3KK5 clusters are required for CyPB binding. Molecular modelling revealed that both clusters are spatially arranged so that they may act synergistically to form a binding site for the octasaccharide. We then demonstrated that heparin-derived octasaccharides and higher degree of polymerization oligosaccharides inhibited the interaction between CyPB and fluorophore-labelled HS chains purified from T-lymphocytes, and strongly reduced the HS-dependent pro-adhesive activity of CyPB. However, oligosaccharides or heparin were unable to restore adhesion of heparinase-treated T-lymphocytes, indicating that HS has to be present on the cell membrane to support the pro-adhesive activity of CyPB. Altogether, these results demonstrate that the octasaccharide is likely to be the minimal length unit required for efficient binding of CyPB to cell surface HS and consequent HS-dependent cell responses.

  7. Blast casting requires fresh assessment of methods

    Energy Technology Data Exchange (ETDEWEB)

    Pilshaw, S.R.

    1987-08-01

    The article discusses why conventional blasting operations are inefficient, chiefly in their choice of explosive products, drilling and initiation methods, and suggests new methods and materials to overcome these problems. The author suggests that the use of bulk ANFO for casting, instead of high-energy, high-density explosives with high-velocity detonation, is more effective in producing heave action. Similarly, drilling smaller blast holes than is conventional allows better distribution of the explosive load in the rock mass. The author also suggests that casting would be more efficient if the shot rows were loaded differently to produce a variable-burden blasting pattern.

  8. Assessment of methods to determine minimal cellulase concentrations for efficient hydrolysis of cellulose

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, C.M.; Mes-Hartree, M.; Saddler, J.N. (Forintek Canada Corp., Ottawa, ON (Canada). Biotechnology and Chemistry Dept.); Kushner, D.J. (Toronto Univ., Ontario (Canada). Dept. of Microbiology)

    1990-02-01

    The enzyme loading needed to achieve substrate saturation appeared to be the most economical enzyme concentration to use for hydrolysis, based on percentage hydrolysis. Saturation was reached at 25 filter paper units per gram substrate on Solka Floc BW300, as determined by studying (a) initial adsorption of the cellulase preparation onto the substrate, (b) an actual hydrolysis or (c) a combined hydrolysis and fermentation (CHF) process. Initial adsorption of the cellulases onto the substrate can be used to determine the minimal cellulase requirements for efficient hydrolysis since enzymes initially adsorbed to the substrate have a strong role in governing the overall reaction. Trichoderma harzianum E58 produces high levels of {beta}-glucosidase and is able to cause high conversion of Solka Floc BW300 to glucose without the need for exogenous {beta}-glucosidase. End-product inhibition of the cellulase and {beta}-glucosidase can be more effectively reduced by employing a CHF process than by supplemental {beta}-glucosidase. (orig.).

  9. Using the critical incident technique to define a minimal data set for requirements elicitation in public health.

    Science.gov (United States)

    Olvingson, Christina; Hallberg, Niklas; Timpka, Toomas; Greenes, Robert A

    2002-12-18

    The introduction of computer-based information systems (ISs) in public health provides enhanced possibilities for service improvements and hence also for improvement of the population's health. Not least, new communication systems can help in the socialization and integration process needed between the different professions and geographical regions. Therefore, development of ISs that truly support public health practices requires that technical, cognitive, and social issues be taken into consideration. A notable problem is to capture the 'voices' of all potential users, i.e., the viewpoints of different public health practitioners. Failing to capture these voices will result in inefficient or even useless systems. The aim of this study is to develop a minimal data set for capturing users' voices on problems experienced by public health professionals in their daily work and their opinions about how these problems can be solved. The issues of concern thus captured can be used both as the basis for formulating the requirements of ISs for public health professionals and as a means of understanding the use context. Further, the data can help in directing the design to the features most important for the users.

  10. Methods for the minimization of radioactive waste from decontamination and decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    2001-01-01

    The objective of this report is to provide Member States and their decision makers (ranging from regulators, strategists, planners and designers, to operators) with relevant information on opportunities for minimizing radioactive wastes arising from the D and D of nuclear facilities. This will allow waste minimization options to be properly planned and assessed as part of national, site and plant waste management policies. This objective will be achieved by: reviewing the sources and characteristics of radioactive materials arising from D and D activities; reviewing waste minimization principles and current practical applications, together with regulatory, technical, financial and political factors influencing waste minimization practices; and reviewing current trends in improving waste minimization practices during D and D

  11. The Adaptation of Ways and Methods of Risk Minimization in Local Payment Systems in Public Transport

    Directory of Open Access Journals (Sweden)

    Avdaev Mausar Yushaevich

    2014-12-01

    Full Text Available The problems of risk management gain special relevance amid the development of payment systems in public passenger transport in Russia. The risk carriers, as well as the sources of risk, are identified, and the specific risks of individual participants in the public passenger transport system are characterized. The directions of risk management in relation to the payment system in public transport are reasoned and structured. It is shown that the choice of specific ways to minimize risks in local payment systems in public transport depends on the following factors: the nature of the payment system's integration in public transport areas, the temporary nature of some risk components as organizational, economic and technological factors improve, the stage of the payment system's development, and the evaluation of risk effects. The article reasons the possibility of using and adapting traditional ways of risk management (risk avoidance, risk compensation, risk reduction, risk transfer, and distribution of risk between participants) in payment systems in public transport according to the stages of their development and functioning, for the processing center, passenger motor transport organizations, the financial center and passengers (payers). The authors justify the directions of integrating local payment systems of public transport into the national payment system, taking into account the risks involved in the activity of its members.

  12. Minimizing the Discrepancy between Simulated and Historical Failures in Turbine Engines: A Simulation-Based Optimization Method

    Directory of Open Access Journals (Sweden)

    Ahmed Kibria

    2015-01-01

    Full Text Available The reliability modeling of a module in a turbine engine requires knowledge of its failure rate, which can be estimated by identifying statistical distributions describing the percentage of failures per component within the turbine module. The correct definition of the failure statistical behavior per component is highly dependent on the engineer's skill and may present significant discrepancies with respect to the historical data. There is no formal methodology to approach this problem, and a large number of labor hours are spent trying to reduce the discrepancy by manually adjusting the distributions' parameters. This paper addresses this problem and provides a simulation-based optimization method for minimizing the discrepancy between the simulated and the historical percentage of failures for turbine engine components. The proposed methodology optimizes the parameter values of the components' failure statistical distributions within the components' likelihood confidence bounds. A complete test of the proposed method is performed on a turbine engine case study. The method can be considered a decision-making tool for maintenance, repair, and overhaul companies and will potentially reduce the cost of labor associated with finding the appropriate values of the distribution parameters for each component/failure mode in the model and increase the accuracy of the prediction of the mean time to failure (MTTF).
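As a hedged sketch of the discrepancy-minimization idea (not the paper's simulation-based optimizer), the example below tunes one failure-rate parameter, inside assumed confidence bounds, until the simulated share of failures per failure mode matches a historical share. For clarity the "simulation" is the closed-form competing-risks result for exponential modes; every name and number is illustrative.

```python
# Toy version of the discrepancy-minimization loop: for two competing
# exponential failure modes with rates lam1 and lam2, the long-run share of
# failures attributed to mode 1 is lam1 / (lam1 + lam2). We search for the
# lam1 (within assumed confidence bounds) that reproduces the historical share.

HISTORICAL_SHARE_MODE1 = 0.30   # 30% of failures historically from mode 1 (assumed)
LAM2 = 7.0e-4                   # fixed rate of the competing mode, per hour (assumed)
BOUNDS = (1.0e-4, 1.0e-3)       # assumed likelihood confidence bounds for lam1

def simulated_share(lam1, lam2=LAM2):
    return lam1 / (lam1 + lam2)

def discrepancy(lam1):
    return abs(simulated_share(lam1) - HISTORICAL_SHARE_MODE1)

def fit_within_bounds(lo, hi, iters=60):
    """Bisection on the monotone share curve, kept inside the bounds."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if simulated_share(mid) < HISTORICAL_SHARE_MODE1:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

lam1_fit = fit_within_bounds(*BOUNDS)
```

With a stochastic discrete-event simulation in place of the closed form, the same outer loop applies, but a noise-tolerant search (and more simulation replications per candidate) is needed.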

  13. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    Full Text Available The work highlights the most important principles of software reliability management (SRM). The SRM concept constitutes a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity and a double-sorting technique that ranks each requirement by priority and complexity. The method improves requirements correctness by enabling the identification of a greater number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded an appreciable technical and economic effect.
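The double-sorting technique described above can be sketched as follows, assuming a stand-in complexity metric (word count) and invented requirements; the paper's actual metric is not reproduced here.

```python
# Hedged sketch of "double sorting": rank requirements first by priority,
# then by a complexity metric, so that review effort goes first to the
# complicated, high-priority requirements assumed to hide the most defects.

def complexity(requirement_text):
    return len(requirement_text.split())   # stand-in metric, not the paper's

def review_order(requirements):
    """requirements: list of (text, priority) with 1 = highest priority.
    Returns texts sorted by priority ascending, then complexity descending."""
    return [text for text, prio in
            sorted(requirements, key=lambda r: (r[1], -complexity(r[0])))]

# Invented example requirements
reqs = [
    ("The system shall log out idle users after 15 minutes", 2),
    ("The system shall encrypt stored credentials using the approved cipher "
     "suite and rotate keys on the schedule defined by the security policy", 1),
    ("The system shall display a login form", 1),
]
order = review_order(reqs)
```

Because Python's sort is stable and tuple keys compare lexicographically, one `sorted` call expresses both sort passes.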

  14. Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    Online: 02 April (2018) ISSN 1017-1398 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : Unconstrained minimization * Block variable metric methods * Limited-memory methods * BFGS update * Global convergence * Numerical results Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics Impact factor: 1.241, year: 2016

  15. The Sources and Methods of Engineering Design Requirement

    DEFF Research Database (Denmark)

    Li, Xuemeng; Zhang, Zhinan; Ahmed-Kristensen, Saeema

    2014-01-01

    to be defined in a new context. This paper focuses on understanding the design requirement sources at the requirement elicitation phase. It aims at proposing an improved design requirement source classification considering emerging markets and presenting current methods for eliciting requirement for each source...

  16. Minimal invasive stabilization of osteoporotic vertebral compression fractures. Methods and preinterventional diagnostics; Minimal-invasive Stabilisierung osteoporotischer Wirbelkoerpereinbrueche. Methodik und praeinterventionelle Diagnostik

    Energy Technology Data Exchange (ETDEWEB)

    Grohs, J.G.; Krepler, P. [Orthopaedische Klinik, Universitaet Wien (Austria)

    2004-03-01

    Minimally invasive stabilizations represent a new alternative for the treatment of osteoporotic compression fractures. Vertebroplasty and balloon kyphoplasty are two methods to enhance the strength of osteoporotic vertebral bodies by means of cement application. Vertebroplasty is the older and technically easier method. Balloon kyphoplasty is the newer and more expensive method, which not only reduces pain but also restores the sagittal profile of the spine. By balloon kyphoplasty the height of 101 fractured vertebral bodies could be increased up to 90% and the wedge decreased from 12 to 7 degrees. Pain was reduced from 7.2 to 2.5 points. The Oswestry disability index decreased from 60 to 26 points. These effects persisted over a period of two years. Cement leakage occurred in only 2% of vertebral bodies. Fractures of adjacent vertebral bodies were found in 11%. Good preinterventional diagnostics and intraoperative imaging are necessary for balloon kyphoplasty to be applied successfully. (orig.) [German] Minimally invasive stabilizations are an alternative to the previous treatment of osteoporotic vertebral fractures. Vertebroplasty and balloon kyphoplasty are two procedures for restoring the strength of vertebral bodies after osteoporotic compression fractures by introducing bone cement. Vertebroplasty is the older, technically simpler and cheaper technique, but is regularly accompanied by cement leakage. Balloon kyphoplasty is the newer, more cost-intensive technology, which, beyond pain reduction, also aims at restoring the sagittal profile of the spine. With balloon kyphoplasty, the height of 101 fractured vertebral bodies could be raised to almost 90% of the target value and the local kyphosis reduced from 12 to 7 degrees. Pain, measured on a 10-point scale, was reduced from 7.2 to 2.5. The Oswestry disability

  17. A Survey of Various Object Oriented Requirement Engineering Methods

    OpenAIRE

    Anandi Mahajan; Dr. Anurag Dixit

    2013-01-01

    In recent years many industries have been moving to the use of object-oriented methods for the development of large-scale information systems. The requirement for an object-oriented approach in the development of software systems is increasing day by day. This paper is a survey of various object-oriented requirement engineering methods. It contains a summary of the available object-oriented requirement engineering methods with their relative advantages and disadvantages...

  18. Minimally disruptive medicine is needed for patients with multimorbidity: time to develop computerised medical record systems to meet this requirement

    Directory of Open Access Journals (Sweden)

    Peter Schattner

    2015-02-01

    Full Text Available Background Minimally disruptive medicine (MDM) is proposed as a method for more appropriately managing people with multiple chronic diseases. Much clinical management is currently single-disease focussed, with people with multimorbidity being managed according to multiple single-disease guidelines. Current initiatives to improve care include education about individual conditions and creating an environment where multiple guidelines might be simultaneously supported. The patient-centred medical home (PCMH) is an example of the latter. However, educational programmes and the PCMH may increase the burden on patients. Problem The cumulative workload for patients in managing the impact of multiple disease-specific guidelines has only relatively recently been recognised. There is an intellectual vacuum as to how best to manage multimorbidity and how informatics might support implementing MDM. There is currently no alternative to multiple single-condition-specific guidelines, and no certainty, should the treatment burden need to be reduced, as to which guideline might be ‘dropped’. Action The best information about multimorbidity is recorded in primary care computerised medical record (CMR) systems and in an increasing number of integrated care organisations. CMR systems have the potential to flag individuals who might be in greatest need. However, CMR systems may also provide insights into whether there are ameliorating factors that might make it easier for patients to be resilient to the burden of care. Data from such CMR systems might be used to develop the evidence base about how better to manage multimorbidity. Conclusions There is potential for these information systems to help reduce the management burden on patients and clinicians. However, substantial investment in research-driven CMR development is needed if we are to achieve this.

  19. Taxonomic minimalism.

    Science.gov (United States)

    Beattle, A J; Oliver, I

    1994-12-01

    Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion in these activities of important but neglected groups, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.

  20. Basic requirements to the methods of personnel monitoring

    International Nuclear Information System (INIS)

    Keirim-Markus, I.B.

    1981-01-01

    Requirements for personnel monitoring methods (PMM), which depend on the irradiation conditions, are given. The irradiation conditions determine the types of radiation subject to monitoring, the measurement ranges, the frequency of monitoring, the turnaround time for results, and the required accuracy. PMM based on the photographic effect of ionizing radiation is the main method of mass monitoring [ru]

  1. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study.

    Science.gov (United States)

    Boob, Ankush Ramnarayan; Manjula, M; Reddy, E Rajendra; Srilaxmi, N; Rani, Tabitha

    2014-01-01

    Many chemomechanical caries removal (CMCR) agents have been introduced and marketed since the 1970s, each claimed to be more effective than its predecessors. Papacarie and Carisolv are new systems in the field of CMCR techniques. These are reportedly minimally invasive methods of removing carious dentin while preserving sound dentin. The aim was to compare the efficiency (time taken for caries removal) and effectiveness (Knoop hardness number of the remaining dentin) of caries removal by three minimally invasive methods, i.e. hand excavation and chemomechanical caries removal using Carisolv and Papacarie. Thirty recently extracted human permanent molars with occlusal carious lesions were divided randomly into three equal groups, bisected through the middle of the lesion mesiodistally, and excavated by two methods on each tooth. A statistically significant difference was present among the three methods with respect to time and Knoop hardness number (KHN) of the remaining dentin. The hand method was more efficient than the CMCR techniques, while the CMCR techniques were more effective than the hand method in preserving dentin, enhancing the chances of maintaining pulp vitality. How to cite this article: Boob AR, Manjula M, Reddy ER, Srilaxmi N, Rani T. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study. Int J Clin Pediatr Dent 2014;7(1):11-18.

  2. Silver nanoparticles: an alternative method for sanitization of minimally processed cabbage

    Directory of Open Access Journals (Sweden)

    Emiliane Andrade Araújo

    2015-06-01

    Full Text Available The minimal processing of vegetables basically aims to extend food shelf life, which depends on a number of factors, such as sanitization, which is considered a critical step for food microbiological quality. However, the usual antimicrobial agents reduce the microbial population by a maximum of two logarithmic cycles. Therefore, it is necessary to develop alternative sanitizers. This study aimed to increase the safety of minimally processed cabbage through sanitization with silver nanoparticles. It was observed that the nanoparticles promoted three logarithmic reductions, i.e. a 99.9% reduction, in the Escherichia coli population inoculated on the cabbage surface. When compared to other antimicrobial agents (sodium dichloroisocyanurate and sodium hypochlorite), the nanoparticles were more efficient in sanitizing minimally processed cabbage, showing a lower count of aerobic mesophiles. It was also observed that the cabbage surface presents hydrophobic characteristics, resulting in a higher propensity for bacterial adhesion, which was confirmed by the thermodynamic evaluation showing favorable adhesion for Staphylococcus aureus, Escherichia coli and Listeria innocua.

  3. Collective motion in prolate γ-rigid nuclei within minimal length concept via a quantum perturbation method

    Science.gov (United States)

    Chabab, M.; El Batoul, A.; Lahbas, A.; Oulne, M.

    2018-05-01

    Based on the minimal length concept inspired by the Heisenberg algebra, a closed analytical formula is derived for the energy spectrum of the prolate γ-rigid Bohr-Mottelson Hamiltonian of nuclei within a quantum perturbation method (QPM), by considering a scaled Davidson potential in the β shape variable. In the resulting solution, called X(3)-D-ML, the ground state and the first β-band are studied as functions of the free parameters. Introducing the minimal length concept with a QPM makes the model very flexible and a powerful approach for describing the nuclear collective excitations of a variety of vibrational-like nuclei. The introduction of scaling parameters in the Davidson potential enables us to obtain a physical minimum of the potential, in comparison with previous works. The analysis of the corrected wave function, as well as the probability density distribution, shows that the minimal length parameter has a physical upper bound limit.

  4. Operating cost minimization of a radial distribution system in a deregulated electricity market through reconfiguration using NSGA method

    International Nuclear Information System (INIS)

    Chandramohan, S.; Atturulu, Naresh; Devi, R.P. Kumudini; Venkatesh, B.

    2010-01-01

    In the future, mechanisms for trade in ancillary services such as reactive power will be implemented in many deregulated power systems. In such an operating framework, a Distribution Corporation (DisCo) would have to purchase reactive power along with real power from the connected transmission corporation. A DisCo would want to minimize its operating costs by minimizing the total amount of real and reactive power drawn from the connected transmission system. Optimally reconfiguring the network will achieve this goal. In this work, we use a non-dominated sorting genetic algorithm (NSGA) to reconfigure a radial DisCo network so as to minimize its operating costs, considering real and reactive power costs, while maximizing its operating reliability and satisfying the regular operating constraints. The method is tested on sample test systems and the results are reported. (author)
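The core of NSGA-type methods such as the one above is non-dominated sorting of candidate solutions by competing objectives, here operating cost and unreliability, both to be minimized. A minimal sketch of that sorting step follows; the candidate configurations and objective values are invented for illustration.

```python
# Hedged sketch of non-dominated (Pareto) sorting, the heart of NSGA-style
# multi-objective selection. Each point is a tuple of objective values to
# be minimized, e.g. (operating cost, expected interruptions per year).

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(points):
    """Split points into successive Pareto fronts (front 0 = best trade-offs)."""
    remaining = list(points)
    fronts = []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q is not p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

# Invented (cost $/h, interruptions/yr) for four candidate configurations
configs = [(120.0, 0.9), (100.0, 1.4), (150.0, 0.5), (130.0, 1.6)]
fronts = non_dominated_fronts(configs)
```

The first front contains the cost/reliability trade-offs no other configuration beats in both objectives; a full NSGA additionally applies crowding or sharing within fronts, which is omitted here.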

  5. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that “requirement is the origin, design is the basis”. So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail, in the hope of providing experience for other civil jet product designs.

  6. How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods

    National Research Council Canada - National Science Library

    Mead, Nancy R

    2007-01-01

    The Security Quality Requirements Engineering (SQUARE) method, developed at the Carnegie Mellon Software Engineering Institute, provides a systematic way to identify security requirements in a software development project...

  7. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    Science.gov (United States)

    Li, Y P; Huang, G H

    2010-09-15

Considerable public concerns have been raised in the past decades, since the large amount of pollutant emissions from municipal solid waste (MSW) disposal processes poses risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors, and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of the constraints and objective function. An interactive algorithm is provided for solving the IBPP problem; it does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning an MSW management system, where the mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis of timing, sizing, and siting decisions for capacity expansion of waste-management facilities. Three cases based on different waste-management policies are examined. The results indicate that including environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on an economic-oriented planning approach. The results can help identify desired alternatives for managing MSW, with the advantage of providing compromise schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.
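The interval part of the IBPP idea can be sketched independently of the full model: when per-tonne costs are known only as intervals [lo, hi], the total cost of a waste allocation is itself an interval. The facilities, cost ranges, and tonnages below are illustrative assumptions, not figures from the case study.

```python
def interval_scale(iv, k):
    """Scale an interval (lo, hi) by a nonnegative allocation k (tonnes)."""
    lo, hi = iv
    return (lo * k, hi * k)

def interval_add(a, b):
    """Sum of two intervals: endpoints add."""
    return (a[0] + b[0], a[1] + b[1])

# Hypothetical per-tonne cost intervals for two facilities
landfill_cost = (30.0, 45.0)
incinerator_cost = (55.0, 70.0)
allocation = {"landfill": 200.0, "incinerator": 100.0}  # tonnes

total = interval_add(
    interval_scale(landfill_cost, allocation["landfill"]),
    interval_scale(incinerator_cost, allocation["incinerator"]),
)
# total cost is guaranteed to lie within the resulting [lo, hi] interval
```

The actual IBPP method additionally handles fuzzy membership grades and solves for the allocation itself; this fragment only shows how interval uncertainty propagates through a linear cost objective.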

  8. MINIMAL REQUIREMENTS FOR THE DIAGNOSIS, CLASSIFICATION, AND EVALUATION OF THE TREATMENT OF CHILDHOOD ACUTE LYMPHOBLASTIC-LEUKEMIA (ALL) IN THE BFM FAMILY COOPERATIVE GROUP

    NARCIS (Netherlands)

    VANDERDOESVANDENBERG, A; BARTRAM, CR; BASSO, G; BENOIT, YCM; BIONDI, A; DEBATIN, KM; HAAS, OA; HARBOTT, J; KAMPS, WA; KOLLER, U; LAMPERT, F; LUDWIG, WD; NIEMEYER, CM; VANWERING, ER

    1992-01-01

    Minimal requirements and their rationale for the diagnosis and the response to treatment in childhood acute lymphoblastic leukemia (ALL) were defined in the recently instituted "BFM-Family"-Group, in which the German, Austrian, Dutch, Italian, Belgian, French and Hungarian childhood leukemia study

  9. The continuous reaction times method for diagnosing, grading, and monitoring minimal/covert hepatic encephalopathy

    DEFF Research Database (Denmark)

    Lauridsen, Mette Enok Munk; Thiele, Maja; Kimer, N

    2013-01-01

Abstract Existing tests for minimal/covert hepatic encephalopathy (m/cHE) are time- and expertise-consuming and primarily usable for research purposes. An easy-to-use, fast, and reliable diagnostic and grading tool is needed. We here report on the background, experience, and ongoing research......-10) percentile) as a parameter of reaction time variability. The index is a measure of alertness stability and is used to assess attention and cognition deficits. The CRTindex identifies half of the patients in a Danish cohort with chronic liver disease as having m/cHE; a normal value safely precludes HE; it has...

  10. Percutaneous Needle Aspiration Is A Minimally Invasive Method For A Breast Abscess

    Directory of Open Access Journals (Sweden)

    Hamid H. Sarhan

    2012-04-01

Results: Twenty-three (53.4%) of the patients obtained complete resolution (no focal collection) after one aspiration; 9 (21%) required two aspirations and 8 (18.6%) required more than two aspirations for cure (residual collection). In 3 (7%) of the patients the treatment failed: symptoms had not resolved after 3 days, with further pus collection despite aspiration and antibiotics, and surgical drainage was required. Conclusions: Percutaneous needle drainage of breast abscesses after preliminary breast US is feasible as a primary and definitive treatment for breast abscesses, if complete or near-complete drainage is achieved. [Arch Clin Exp Surg 2012; 1(2): 105-109]

  11. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study

    OpenAIRE

    Boob, Ankush Ramnarayan; Manjula, M; Reddy, E Rajendra; Srilaxmi, N; Rani, Tabitha

    2014-01-01

ABSTRACT Background: Many chemomechanical caries removal (CMCR) agents have been introduced and marketed since the 1970s, with each new one claimed to be better and more effective than those previously introduced. Papacarie and Carisolv are new systems in the field of CMCR techniques. These are reportedly minimally invasive methods of removing carious dentin while preserving sound dentin. Aim: To compare the efficiency (time taken for caries removal) and effectiveness (Knoop hardness number of the remaining den...

  12. By how much can Residual Minimization Accelerate the Convergence of Orthogonal Residual Methods?

    Czech Academy of Sciences Publication Activity Database

    Gutknecht, M. H.; Rozložník, Miroslav

    2001-01-01

    Roč. 27, - (2001), s. 189-213 ISSN 1017-1398 R&D Projects: GA ČR GA201/98/P108 Institutional research plan: AV0Z1030915 Keywords : system of linear algebraic equations * iterative method * Krylov space method * conjugate gradient method * biconjugate gradient method * CG * CGNE * CGNR * CGS * FOM * GMRes * QMR * TFQMR * residual smoothing * MR smoothing * QMR smoothing Subject RIV: BA - General Mathematics Impact factor: 0.438, year: 2001

  13. Minimal Subdermal Shaving by Means of Sclerotherapy Using Absolute Ethanol: A New Method for the Treatment of Axillary Osmidrosis

    Directory of Open Access Journals (Sweden)

Hyung-Sup Shim

    2013-07-01

Full Text Available Background Axillary osmidrosis is characterized by unpleasant odors originating from the axillary apocrine glands, resulting in psychosocial stress. The main treatment modality is apocrine gland removal. Until now, the various surgical techniques have sometimes caused serious complications. We describe herein the favorable outcomes of a new method for ablating apocrine glands by minimal subdermal shaving using sclerotherapy with absolute ethanol. Methods A total of 12 patients underwent the procedure. The severity of osmidrosis was evaluated before surgery. Conventional subdermal shaving was performed on one side (control group) and ablation by means of minimal subdermal shaving and absolute ethanol on the other side (study group). Postoperative outcomes were compared between the study and control groups. Results The length of time to removal of the drain was 1 day shorter in the study group than in the control group. There were no serious complications, such as hematoma or seroma, in either group, but flap margin necrosis and flap desquamation occurred in the control group and were successfully managed with conservative treatment. Six months after surgery, we and our patients were satisfied with the outcomes. Conclusions Sclerotherapy using absolute ethanol combined with minimal subdermal shaving may be useful for the treatment of axillary osmidrosis. It can reduce the incidence of seroma and hematoma and allow the skin flap to adhere to its recipient site. It can degrade and ablate the remaining apocrine glands and eliminate causative organisms. Furthermore, since this technique is relatively simple, it takes less time than the conventional method.

  14. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Science.gov (United States)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness, and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique for detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually; the difference between the two counts thus increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
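The screening rule described above (flag a sample when DEFT exceeds APC by more than 2.0 log10 units) can be written down directly. The counts below are illustrative values, not data from the study.

```python
import math

def is_irradiated(deft_count, apc_count, threshold_log=2.0):
    """Screening rule from the abstract: a sample is judged irradiated when
    the DEFT count exceeds the APC count by more than `threshold_log`
    log10 units. Counts are in CFU/g; the 2.0-log threshold is the
    criterion suggested by the authors."""
    return math.log10(deft_count) - math.log10(apc_count) > threshold_log

# Unirradiated sample: DEFT and APC counts stay close together
fresh = is_irradiated(1e6, 8e5)       # difference ~0.1 log -> False
# Irradiated sample: APC drops sharply while DEFT stays roughly constant
treated = is_irradiated(1e6, 5e3)     # difference ~2.3 log -> True
```

DEFT counts both viable and dead cells while APC counts only viable ones, which is why irradiation widens the gap between the two.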

  15. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    International Nuclear Information System (INIS)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V.; Marchioni, E.; Villavicencio, A.L.C.H.

    2009-01-01

Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness, and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique for detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60 Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually; the difference between the two counts thus increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  16. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil); Marchioni, E. [Laboratoire de Chimie Analytique et Sciences de l' Aliment (UMR 7512), Faculte de Pharmacie, Universite Louis Pasteur, 74, route du Rhin, F-67400 Illkirch (France); Villavicencio, A.L.C.H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil)], E-mail: villavic@ipen.br

    2009-07-15

Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness, and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing, it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique for detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a {sup 60}Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually; the difference between the two counts thus increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  17. A new smoothing modified three-term conjugate gradient method for [Formula: see text]-norm minimization problem.

    Science.gov (United States)

    Du, Shouqiang; Chen, Miao

    2018-01-01

We consider a kind of nonsmooth optimization problem with [Formula: see text]-norm minimization, which has many applications in compressed sensing, signal reconstruction, and related engineering problems. Using smoothing approximation techniques, this kind of nonsmooth optimization problem can be transformed into a general unconstrained optimization problem, which can be solved by the proposed smoothing modified three-term conjugate gradient method. The method is based on the Polak-Ribière-Polyak conjugate gradient method, which has good numerical properties; the proposed method possesses the sufficient descent property without any line search and is also proved to be globally convergent. Finally, numerical experiments show the efficiency of the proposed method.
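The smoothing step described above can be illustrated on a one-dimensional toy problem: replace the nonsmooth |x| by the smooth surrogate sqrt(x^2 + mu^2) and minimize with a gradient method. Plain gradient descent is used here in place of the paper's three-term conjugate gradient scheme, and the quadratic data term and parameters are illustrative assumptions.

```python
import math

def smoothed_objective(x, mu, lam):
    """f(x) = (x-3)^2 + lam*|x|, with |x| smoothed to sqrt(x^2 + mu^2)."""
    return (x - 3.0) ** 2 + lam * math.sqrt(x * x + mu * mu)

def smoothed_gradient(x, mu, lam):
    """Gradient of the smoothed objective (well-defined even at x = 0)."""
    return 2.0 * (x - 3.0) + lam * x / math.sqrt(x * x + mu * mu)

def minimize(lam=2.0, mu=1e-8, step=0.1, iters=1000):
    """Simple gradient descent on the smoothed problem."""
    x = 0.0
    for _ in range(iters):
        x -= step * smoothed_gradient(x, mu, lam)
    return x

# For f(x) = (x-3)^2 + 2|x| the exact minimizer is the soft-threshold
# solution x = 3 - 2/2 = 2; the smoothed iteration converges to ~2.0.
x_star = minimize()
```

In the compressed-sensing setting x is a vector and the data term is ||Ax - b||^2, but the smoothing trick and the resulting smooth unconstrained problem are exactly analogous.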

  18. Designing for Change: Minimizing the Impact of Changing Requirements in the Later Stages of a Spaceflight Software Project

    Science.gov (United States)

    Allen, B. Danette

    1998-01-01

    In the traditional 'waterfall' model of the software project life cycle, the Requirements Phase ends and flows into the Design Phase, which ends and flows into the Development Phase. Unfortunately, the process rarely, if ever, works so smoothly in practice. Instead, software developers often receive new requirements, or modifications to the original requirements, well after the earlier project phases have been completed. In particular, projects with shorter than ideal schedules are highly susceptible to frequent requirements changes, as the software requirements analysis phase is often forced to begin before the overall system requirements and top-level design are complete. This results in later modifications to the software requirements, even though the software design and development phases may be complete. Requirements changes received in the later stages of a software project inevitably lead to modification of existing developed software. Presented here is a series of software design techniques that can greatly reduce the impact of last-minute requirements changes. These techniques were successfully used to add built-in flexibility to two complex software systems in which the requirements were expected to (and did) change frequently. These large, real-time systems were developed at NASA Langley Research Center (LaRC) to test and control the Lidar In-Space Technology Experiment (LITE) instrument which flew aboard the space shuttle Discovery as the primary payload on the STS-64 mission.

  19. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

Full Text Available Changes to software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for changes of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
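One plausible reading of the deployment step above is a QFD-style relationship matrix linking requirements to design elements, with a requirement's degree of volatility taken as the relationship-weighted average of each element's change potential. The paper's exact formula may differ; the weights (the usual 0/1/3/9 QFD scale), names, and change-potential scores below are illustrative assumptions.

```python
# rows: customer requirements; columns: architectural design elements
relationship = {
    "fast login":  {"auth module": 9, "ui layer": 3},
    "audit trail": {"auth module": 3, "db schema": 9},
}
# estimated likelihood that each design element will change (0..1)
change_potential = {"auth module": 0.2, "ui layer": 0.7, "db schema": 0.5}

def degree_of_volatility(req):
    """Relationship-weighted average change potential of the design
    elements a requirement is deployed to (normalized to 0..1)."""
    links = relationship[req]
    total = sum(links.values())
    return sum(w * change_potential[e] for e, w in links.items()) / total

scores = {r: degree_of_volatility(r) for r in relationship}
# "audit trail" scores higher because it leans on the volatile db schema
```

A requirement ranked high by such a score is one whose realization rests on design elements likely to change, which is where anticipation effort pays off.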

  20. GeLC-MS: A Sample Preparation Method for Proteomics Analysis of Minimal Amount of Tissue.

    Science.gov (United States)

    Makridakis, Manousos; Vlahou, Antonia

    2017-10-10

Various proteomics methodologies have been applied for the global and targeted proteome analysis of many different types of biological samples, such as tissue, urine, plasma, serum, blood, and cell lines. Among these, tissue plays an exceptional role in clinical research and practice. Disease initiation and progression are usually located at the tissue level of different organs, making the analysis of this material very important for understanding disease pathophysiology. Despite significant advances in mass spectrometry instrumentation, tissue proteomics still faces several challenges, mainly due to increased sample complexity and heterogeneity. The most prominent challenge, however, is the invasive procedure of tissue sampling, which restricts the availability of fresh frozen tissue to minimal amounts and a limited number of samples. Application of the GeLC-MS sample preparation protocol for tissue proteomics analysis can greatly help to make up for these difficulties. In this chapter, a step-by-step guide to the proteomics analysis of minute amounts of tissue samples using the GeLC-MS sample preparation protocol, as applied by our group in the analysis of multiple different types of tissue (vessels, kidney, bladder, prostate, heart), is provided.

  1. A METHOD OF THE MINIMIZING OF THE TOTAL ACQUISITIONS COST WITH THE INCREASING VARIABLE DEMAND

    Directory of Open Access Journals (Sweden)

    ELEONORA IONELA FOCȘAN

    2015-12-01

Full Text Available Over time, mankind has tried to find different ways to reduce costs. This subject, which we face ever more often nowadays, has been studied in detail without reaching a general and efficient model for cost reduction. Cost reduction brings a number of benefits to an entity, the most important being increased revenue and hence profit, higher productivity, a higher level of services/products offered to clients, and, last but not least, mitigation of the risk of economic deficit. Therefore, each entity searches for different ways to obtain the most benefits, so that the company can succeed in a competitive market. This article supports companies by presenting a new way of minimizing the total cost of acquisitions: several hypotheses about increasing variable demand are stated and proved, and formulas for reducing costs are developed. The hypotheses presented in the model described below can be fully exploited to obtain new models for reducing the total cost, according to the purchasing practices of the entities that adopt it.

  2. Prognostic value of deep sequencing method for minimal residual disease detection in multiple myeloma

    Science.gov (United States)

    Lahuerta, Juan J.; Pepin, François; González, Marcos; Barrio, Santiago; Ayala, Rosa; Puig, Noemí; Montalban, María A.; Paiva, Bruno; Weng, Li; Jiménez, Cristina; Sopena, María; Moorhead, Martin; Cedena, Teresa; Rapado, Immaculada; Mateos, María Victoria; Rosiñol, Laura; Oriol, Albert; Blanchard, María J.; Martínez, Rafael; Bladé, Joan; San Miguel, Jesús; Faham, Malek; García-Sanz, Ramón

    2014-01-01

    We assessed the prognostic value of minimal residual disease (MRD) detection in multiple myeloma (MM) patients using a sequencing-based platform in bone marrow samples from 133 MM patients in at least very good partial response (VGPR) after front-line therapy. Deep sequencing was carried out in patients in whom a high-frequency myeloma clone was identified and MRD was assessed using the IGH-VDJH, IGH-DJH, and IGK assays. The results were contrasted with those of multiparametric flow cytometry (MFC) and allele-specific oligonucleotide polymerase chain reaction (ASO-PCR). The applicability of deep sequencing was 91%. Concordance between sequencing and MFC and ASO-PCR was 83% and 85%, respectively. Patients who were MRD– by sequencing had a significantly longer time to tumor progression (TTP) (median 80 vs 31 months; P < .0001) and overall survival (median not reached vs 81 months; P = .02), compared with patients who were MRD+. When stratifying patients by different levels of MRD, the respective TTP medians were: MRD ≥10−3 27 months, MRD 10−3 to 10−5 48 months, and MRD <10−5 80 months (P = .003 to .0001). Ninety-two percent of VGPR patients were MRD+. In complete response patients, the TTP remained significantly longer for MRD– compared with MRD+ patients (131 vs 35 months; P = .0009). PMID:24646471

  3. Operant Conditioning: A Minimal Components Requirement in Artificial Spiking Neurons Designed for Bio-Inspired Robot’s Controller

    Directory of Open Access Journals (Sweden)

André Cyr

    2014-07-01

Full Text Available We demonstrate the operant conditioning (OC) learning process within a basic bio-inspired robot controller paradigm, using an artificial spiking neural network (ASNN) with minimal component count as an artificial brain. In biological agents, OC results in behavioral changes that are learned from the consequences of previous actions, using progressive prediction adjustment triggered by reinforcers. In a robotics context, virtual and physical robots may benefit from a similar learning skill when facing unknown environments with no supervision. In this work, we demonstrate that a simple ASNN can efficiently realise many OC scenarios. The elementary learning kernel that we describe relies on a few critical neurons and synaptic links and on the integration of habituation and spike-timing dependent plasticity (STDP) as learning rules. Using four tasks of incremental complexity, our experimental results show that such a minimal neural component set may be sufficient to implement many OC procedures. Hence, with the described bio-inspired module, OC can be implemented in a wide range of robot controllers, including those with limited computational resources.

  4. Parameter-free method for the shape optimization of stiffeners on thin-walled structures to minimize stress concentration

    Energy Technology Data Exchange (ETDEWEB)

Liu, Yang; Shibutani, Yoji [Osaka University, Osaka (Japan); Shimoda, Masatoshi [Toyota Technological Institute, Nagoya (Japan)

    2015-04-15

This paper presents a parameter-free shape optimization method for the strength design of stiffeners on thin-walled structures. The maximum von Mises stress is minimized subject to a volume constraint. The optimum design problem is formulated as a distributed-parameter shape optimization problem under the assumptions that a stiffener varies in the in-plane direction and that the thickness is constant. The issue of nondifferentiability, which is inherent in this min-max problem, is avoided by transforming the local measure into a smooth, differentiable integral functional by using the Kreisselmeier-Steinhauser function. The shape gradient functions are derived by using the material derivative method and the adjoint variable method, and are applied to the H{sup 1} gradient method for shells to determine the optimal free-boundary shapes. With this method, a smooth optimal stiffener shape can be obtained without any shape design parameterization while minimizing the maximum stress. The validity of the method is verified through two practical design examples.
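The Kreisselmeier-Steinhauser aggregation mentioned above replaces the nondifferentiable maximum of local stresses by a smooth upper bound, KS(g; rho) = (1/rho) * ln(sum_i exp(rho * g_i)) >= max_i g_i, which approaches the true maximum as rho grows. The stress values below are illustrative.

```python
import math

def ks(values, rho=50.0):
    """Kreisselmeier-Steinhauser smooth approximation of max(values).
    The maximum is factored out for numerical stability, so the exp
    arguments are all <= 0 and cannot overflow."""
    m = max(values)
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

stresses = [0.8, 0.95, 0.6, 0.9]   # normalized von Mises stresses
smooth_max = ks(stresses)
# smooth_max is a differentiable upper bound slightly above max(stresses)
```

Because KS is smooth in each g_i, its gradient is well-defined everywhere, which is exactly what the material derivative and adjoint variable machinery in the paper requires of the objective.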

  5. Quality functions for requirements engineering in system development methods.

    Science.gov (United States)

    Johansson, M; Timpka, T

    1996-01-01

Based on a grounded theory framework, this paper analyses the quality characteristics of methods to be used for requirements engineering in the development of medical decision support systems (MDSS). The results of a Quality Function Deployment (QFD) used to rank functions connected to user value, together with a focus group study, were presented to a validation focus group. The focus group studies take advantage of a group process to collect data for further analysis. The results describe factors the participants considered important in the development of methods for requirements engineering in health care. Based on the findings, the content that, according to the users, an MDSS method should support is established.

  6. An Alternative Method to Gauss-Jordan Elimination: Minimizing Fraction Arithmetic

    Science.gov (United States)

    Smith, Luke; Powell, Joan

    2011-01-01

    When solving systems of equations by using matrices, many teachers present a Gauss-Jordan elimination approach to row reducing matrices that can involve painfully tedious operations with fractions (which I will call the traditional method). In this essay, I present an alternative method to row reduce matrices that does not introduce additional…
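The abstract is truncated, but a well-known way to row reduce without introducing fractions (possibly different in detail from the authors' classroom method) is integer cross-multiplication: eliminate a leading entry by combining whole-row multiples, so every intermediate entry stays an integer. The sketch below solves a small integer system this way; the system itself is an illustrative example.

```python
def eliminate(r1, r2):
    """Zero out r2's leading entry without fractions:
    new_r2 = r1[0]*r2 - r2[0]*r1 (all entries remain integers)."""
    return [r1[0] * b - r2[0] * a for a, b in zip(r1, r2)]

# Augmented rows for:  2x + 3y = 8  ;  5x + 4y = 13
r1 = [2, 3, 8]
r2 = eliminate(r1, [5, 4, 13])   # -> [0, -7, -14], i.e. -7y = -14

# Back-substitution (integer division is exact for this example)
y = r2[2] // r2[1]               # y = 2
x = (r1[2] - r1[1] * y) // r1[0] # x = 1
```

Compared with dividing the pivot row first (the traditional method), no step ever produces a fraction; the trade-off is that entries can grow large, which fraction-free schemes such as Bareiss elimination control by exact divisions.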

  7. Variational method for the minimization of entropy generation in solar cells

    NARCIS (Netherlands)

    Smit, S.; Kessels, W.M.M.

    2015-01-01

    In this work, a method is presented to extend traditional solar cell simulation tools to make it possible to calculate the most efficient design of practical solar cells. The method is based on the theory of nonequilibrium thermodynamics, which is used to derive an expression for the local entropy

  8. Reducing uncertainty at minimal cost: a method to identify important input parameters and prioritize data collection

    NARCIS (Netherlands)

    Uwizeye, U.A.; Groen, E.A.; Gerber, P.J.; Schulte, Rogier P.O.; Boer, de I.J.M.

    2016-01-01

The study aims to illustrate a method to identify important input parameters that explain most of the output variance of environmental assessment models. The method is tested for the computation of life-cycle nitrogen (N) use efficiency indicators among mixed dairy production systems in Rwanda. We

  9. A Minimally Invasive, Translational Method to Deliver Hydrogels to the Heart Through the Pericardial Space

    Directory of Open Access Journals (Sweden)

    Jose R. Garcia, MS

    2017-10-01

    Full Text Available Biomaterials are a new treatment strategy for cardiovascular diseases but are difficult to deliver to the heart in a safe, precise, and translatable way. We developed a method to deliver hydrogels to the epicardium through the pericardial space. Our device creates a temporary compartment for hydrogel delivery and gelation using anatomic structures. The method minimizes risk to patients from embolization, thrombotic occlusion, and arrhythmia. In pigs there were no clinically relevant acute or subacute adverse effects from pericardial hydrogel delivery, making this a translatable strategy to deliver biomaterials to the heart.

  10. New methods to minimize the preventive maintenance cost of series-parallel systems using ant colony optimization

    International Nuclear Information System (INIS)

Samrout, M.; Yalaoui, F.; Chatelet, E.; Chebbo, N.

    2005-01-01

This article is based on a previous study by Bris, Chatelet and Yalaoui [Bris R, Chatelet E, Yalaoui F. New method to minimise the preventive maintenance cost of series-parallel systems. Reliab Eng Syst Saf 2003;82:247-55], who used a genetic algorithm to minimize the preventive maintenance cost of series-parallel systems. We propose to improve their results by developing a new method based on another technique, Ant Colony Optimization (ACO). The resolution consists in determining the solution vector of system component inspection periods, TP. The calculations were carried out in the programming tool Matlab. Highly interesting results and improvements over previous studies were obtained.
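The ACO machinery referenced above can be sketched for a single component: ants pick an inspection period from a discrete candidate set with probability proportional to pheromone, the best choice found so far is reinforced, and pheromone evaporates. The cost model is a stand-in, not the paper's reliability/cost formulation, and all parameters are illustrative.

```python
import random

def aco_period(candidates, cost, n_ants=20, iters=50, rho=0.1, seed=1):
    """Pick an inspection period by ant colony optimization.
    candidates: discrete periods to choose from; cost: period -> cost."""
    rng = random.Random(seed)
    tau = [1.0] * len(candidates)        # pheromone per candidate period
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            # roulette-wheel selection proportional to pheromone
            r, acc, idx = rng.uniform(0.0, sum(tau)), 0.0, 0
            for i, t in enumerate(tau):
                acc += t
                if r <= acc:
                    idx = i
                    break
            c = cost(candidates[idx])
            if c < best_cost:
                best, best_cost = candidates[idx], c
        tau = [(1 - rho) * t for t in tau]              # evaporation
        tau[candidates.index(best)] += 1.0 / best_cost  # reinforce best

    return best

# Toy cost: frequent inspection is expensive, rare inspection risks failure
best_period = aco_period([1, 2, 4, 8, 16], lambda T: 10.0 / T + 0.5 * T)
```

The paper's version searches a vector of periods (one per component) against the series-parallel reliability constraint, but the select/evaluate/evaporate/reinforce loop is the same skeleton.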

  11. Determining the Minimal Required Radioactivity of 18F-FDG for Reliable Semiquantification in PET/CT Imaging: A Phantom Study.

    Science.gov (United States)

    Chen, Ming-Kai; Menard, David H; Cheng, David W

    2016-03-01

    In pursuit of as-low-as-reasonably-achievable (ALARA) doses, this study investigated the minimal required radioactivity and corresponding imaging time for reliable semiquantification in PET/CT imaging. Using a phantom containing spheres of various diameters (3.4, 2.1, 1.5, 1.2, and 1.0 cm) filled with a fixed (18)F-FDG concentration of 165 kBq/mL and a background concentration of 23.3 kBq/mL, we performed PET/CT at multiple time points over 20 h of radioactive decay. The images were acquired for 10 min at a single bed position for each of 10 half-lives of decay using 3-dimensional list mode and were reconstructed into 1-, 2-, 3-, 4-, 5-, and 10-min acquisitions per bed position using an ordered-subsets expectation maximum algorithm with 24 subsets and 2 iterations and a gaussian 2-mm filter. SUVmax and SUVavg were measured for each sphere. The minimal required activity (±10%) for precise SUVmax semiquantification in the spheres was 1.8 kBq/mL for an acquisition of 10 min, 3.7 kBq/mL for 3-5 min, 7.9 kBq/mL for 2 min, and 17.4 kBq/mL for 1 min. The minimal required activity concentration-acquisition time product per bed position was 10-15 kBq/mL⋅min for reproducible SUV measurements within the spheres without overestimation. Using the total radioactivity and counting rate from the entire phantom, we found that the minimal required total activity-time product was 17 MBq⋅min and the minimal required counting rate-time product was 100 kcps⋅min. Our phantom study determined a threshold for minimal radioactivity and acquisition time for precise semiquantification in (18)F-FDG PET imaging that can serve as a guide in pursuit of achieving ALARA doses. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
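The thresholds reported above imply a simple planning rule: with a required activity-concentration times acquisition-time product of roughly 10-15 kBq/mL-min per bed position, the minimal acquisition time follows by division from the expected tissue concentration. The helper below just applies that product, using the conservative (15 kBq/mL-min) end of the reported range as an assumed default.

```python
def min_acquisition_minutes(activity_kbq_per_ml, product=15.0):
    """Minimal per-bed acquisition time (minutes) for reliable SUV
    semiquantification, from the activity-time product rule of thumb
    reported in the phantom study (product in kBq/mL-min)."""
    return product / activity_kbq_per_ml

# e.g. at 3.7 kBq/mL the rule gives about 4 minutes per bed position,
# consistent with the 3-5 min figure reported for that concentration
t_example = min_acquisition_minutes(3.7)
```

This is a back-of-the-envelope check, not a dosimetry protocol; actual acquisition planning would also account for scanner sensitivity and reconstruction settings.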

  12. Review Article: Fabricated Microparticles: An Innovative Method to Minimize the Side Effects of NSAIDs in Arthritis.

    Science.gov (United States)

    Abadi, Shaivad Shabee Hulhasan; Moin, Afrasim; Veerabhadrappa, Gangadharappa Hosahalli

    2016-01-01

inflammatory diseases, which provide constant and prolonged therapeutic effects that reduce dosing frequency and thereby minimize potential adverse effects of NSAIDs, such as GI irritation, and improve insufficient patient compliance. The present review describes the latest developments in microparticulate drug delivery systems and the best alternatives for safe and effective microcapsular systems for the controlled delivery of NSAIDs.

  13. A novel minimally-invasive method to sample human endothelial cells for molecular profiling.

    Directory of Open Access Journals (Sweden)

    Stephen W Waldo

Full Text Available The endothelium is a key mediator of vascular homeostasis and cardiovascular health. Molecular research on the human endothelium may provide insight into the mechanisms underlying cardiovascular disease. Prior methodology used to isolate human endothelial cells has suffered from poor yields and contamination with other cell types. We thus sought to develop a minimally invasive technique to obtain endothelial cells derived from human subjects with higher yields and purity. Nine healthy volunteers underwent endothelial cell harvesting from antecubital veins using guidewires. Fluorescence-activated cell sorting (FACS) was subsequently used to purify endothelial cells from contaminating cells using endothelial surface markers (CD34/CD105/CD146) with the concomitant absence of leukocyte- and platelet-specific markers (CD11b/CD45). Endothelial lineage in the purified cell population was confirmed by expression of endothelial-specific genes and microRNA using quantitative polymerase chain reaction (PCR). A median of 4,212 (IQR: 2161-6583) endothelial cells were isolated from each subject. Quantitative PCR demonstrated higher expression of von Willebrand factor (vWF, P<0.001), nitric oxide synthase 3 (NOS3, P<0.001), and vascular cell adhesion molecule 1 (VCAM-1, P<0.003) in the endothelial population compared to similarly isolated leukocytes. Similarly, the level of endothelial-specific microRNA-126 was higher in the purified endothelial cells (P<0.001). This state-of-the-art technique isolates human endothelial cells for molecular analysis in higher purity and greater numbers than previously possible. This approach will expedite research on the molecular mechanisms of human cardiovascular disease, elucidating its pathophysiology and potential therapeutic targets.

  14. Method for developing cost estimates for generic regulatory requirements

    International Nuclear Information System (INIS)

    1985-01-01

    The NRC has established a practice of performing regulatory analyses, reflecting costs as well as benefits, of proposed new or revised generic requirements. A method has been developed to assist the NRC in preparing the types of cost estimates required for this purpose and for assigning priorities in the resolution of generic safety issues. The cost of a generic requirement is defined as the net present value of the total lifetime cost incurred by the public, industry, and government in implementing the requirement for all affected plants. The method described here is for commercial light-water-reactor power plants. Estimating the cost of a generic requirement involves several steps: (1) identifying the activities that must be carried out to fully implement the requirement, (2) defining the work packages associated with the major activities, (3) identifying the individual elements of cost for each work package, (4) estimating the magnitude of each cost element, (5) aggregating individual plant costs over the plant lifetime, and (6) aggregating all plant costs and generic costs to produce a total, national, present value of lifetime cost for the requirement. The method developed addresses all six steps. In this paper, we discuss the first three.
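The aggregation in steps (5) and (6) is, at bottom, a net-present-value calculation. A minimal sketch of that arithmetic follows; the discount rate, cash-flow layout, and function names are illustrative assumptions, not values or conventions taken from the NRC method:

```python
def present_value(cash_flows, rate):
    """Discount a stream of yearly costs back to year 0.
    cash_flows[t] is the cost incurred in year t (t = 0, 1, 2, ...)."""
    return sum(c / (1.0 + rate) ** t for t, c in enumerate(cash_flows))

def total_requirement_cost(per_plant_flows, generic_flows, rate):
    """Step 5: aggregate each plant's lifetime costs as a present value;
    step 6: sum the plant PVs with generic (industry/government) costs."""
    plant_pv = sum(present_value(flows, rate) for flows in per_plant_flows)
    return plant_pv + present_value(generic_flows, rate)

# Hypothetical example: two affected plants and one generic cost stream,
# discounted at an assumed 10% per year.
total = total_requirement_cost(
    per_plant_flows=[[100.0, 110.0], [50.0, 0.0, 121.0]],
    generic_flows=[30.0],
    rate=0.10,
)
```

With these made-up numbers the plant PVs are 200 and 150, plus 30 of generic cost, for a total of 380.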

  15. Improved method for minimizing sulfur loss in analysis of particulate organic sulfur.

    Science.gov (United States)

    Park, Ki-Tae; Lee, Kitack; Shin, Kyoungsoon; Jeong, Hae Jin; Kim, Kwang Young

    2014-02-04

    The global sulfur cycle depends primarily on the metabolism of marine microorganisms, which release sulfur gas into the atmosphere and thus affect the redistribution of sulfur globally as well as the earth's climate system. To better quantify sulfur release from the ocean, analysis of the production and distribution of organic sulfur in the ocean is necessary. This report describes a wet-based method for accurate analysis of particulate organic sulfur (POS) in the marine environment. The proposed method overcomes the considerable loss of sulfur (up to 80%) that occurs during analysis using conventional methods involving drying. Use of the wet-based POS extraction procedure in conjunction with a sensitive sulfur analyzer enabled accurate measurements of cellular POS. Data obtained using this method will enable accurate assessment of how rapidly sulfur can transfer among pools. Such information will improve understanding of the role of POS in the oceanic sulfur cycle.

  16. 1994 annual report on waste generation and waste minimization progress as required by DOE Order 5400.1

    International Nuclear Information System (INIS)

    Irwin, E.F.; Poligone, S.E.

    1995-01-01

    The Y-12 Plant serves as a key manufacturing technology center for the development and demonstration of unique materials, components, and services of importance to the Department of Energy (DOE) and the nation. This is accomplished through the reclamation and storage of nuclear materials, manufacture of components for the nation's defense capabilities, support to national security programs, and services provided to other customers as approved by DOE. We are recognized by our people, the community, and our customers as innovative, responsive, and responsible. We are a leader in worker health and safety, environmental protection, and stewardship of our national resources. As a DOE facility, Y-12 also supports DOE's waste minimization mission. Data contained in this report represent waste generation in Tennessee.

  17. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B.Bavishna*1, Mrs.M.Agalya2 & Dr.G.Kavitha3

    2018-01-01

    A lot of research has been done in the field of cloud computing. For effective performance, a variety of algorithms has been proposed. The role of virtualization is significant, and its performance depends on VM migration and allocation. Cloud infrastructure consumes a great deal of energy; therefore, numerous algorithms are required for saving energy and enhancing efficiency. In the proposed work, a green algorithm has been considered with ...

  18. Methods for Minimization and Management of Variability in Long-Term Groundwater Monitoring Results

    Science.gov (United States)

    2015-12-01

    DECEMBER 2015. Poonam Kulkarni, Charles Newell, Claire Krebs, Thomas McHugh (GSI Environmental, Inc.); Britt Sanford (ProHydro). Distribution ... based on an understanding of the short-term variability and long-term attenuation rate at a particular site (McHugh et al., 2015a). The ... time is independent of these parameters (McHugh et al., 2015c). The relative trade-off between monitoring frequency and time required to ...

  19. 1994 Annual report on waste generation and waste minimization progress as required by DOE Order 5400.1, Hanford Site

    International Nuclear Information System (INIS)

    1995-09-01

    Many Waste Minimization/Pollution Prevention successes at the Hanford Site occur every day without formal recognition. A few of the successful projects are: T-Plant helps facilities reuse equipment by offering decontamination services for items such as gas cylinders, trucks, and railcars, thus saving disposal and equipment replacement costs. Custodial Services reviewed its use of 168 hazardous cleaning products, and, through a variety of measures, replaced them with 38 safer substitutes, one for each task. Scrap steel contaminated with low level radioactivity from the interim stabilization of 107-K and 107-C was decontaminated and sold to a vendor for recycling. Site-wide programs include the following: the Pollution Prevention Opportunity Assessment (P2OA) program at the Hanford site was launched during 1994, including a training class, a guidance document, technical assistance, and goals; control over hazardous materials purchased was achieved by reviewing all purchase requisitions of a chemical nature; the Office Supply Reuse Program was established to redeploy unused or unwanted office supply items. In 1994, pollution prevention activities reduced approximately 274,000 kilograms of hazardous waste, 2,100 cubic meters of radioactive and mixed waste, 14,500,000 kilograms of sanitary waste, and 215,000 cubic meters of liquid waste and waste water. Pollution Prevention activities also saved almost $4.2 million in disposal, product, and labor costs. Overall waste generation increased in 1994 due to increased work and activity typical for a site with an environmental restoration mission. However, without any Waste Minimization/Pollution Prevention activities, solid radioactive waste generation at Hanford would have been 25% higher, solid hazardous waste generation would have been 30% higher, and solid sanitary waste generation would have been 60% higher.

  20. Fast online generalized multiscale finite element method using constraint energy minimization

    Science.gov (United States)

    Chung, Eric T.; Efendiev, Yalchin; Leung, Wing Tat

    2018-02-01

    Local multiscale methods often construct multiscale basis functions in the offline stage without taking into account input parameters, such as source terms, boundary conditions, and so on. These basis functions are then used in the online stage with a specific input parameter to solve the global problem at a reduced computational cost. Recently, online approaches have been introduced, where multiscale basis functions are adaptively constructed in some regions to reduce the error significantly. In multiscale methods, it is desired to have only 1-2 iterations to reduce the error to a desired threshold. Using the Generalized Multiscale Finite Element Framework [10], it was shown that by choosing a sufficient number of offline basis functions, the error reduction can be made independent of physical parameters, such as scales and contrast. In this paper, our goal is to improve on this. Using our recently proposed approach [4] and special online basis construction in oversampled regions, we show that the error reduction can be made sufficiently large by appropriately selecting oversampling regions. Our numerical results show that one can achieve a three-orders-of-magnitude error reduction, which is better than our previous methods. We also develop an adaptive algorithm and enrich in selected regions with large residuals. In our adaptive method, we show that the convergence rate can be determined by a user-defined parameter and we confirm this by numerical simulations. The analysis of the method is presented.
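The step "enrich in selected regions with large residuals" is, in spirit, a bulk (Dörfler-type) marking rule. A minimal sketch follows; the residual indicators and the fraction theta are invented for illustration, and this generic rule is not claimed to be the paper's exact algorithm:

```python
def mark_regions(residuals, theta):
    """Select the smallest set of regions whose residual indicators
    account for at least a fraction `theta` of the total residual."""
    order = sorted(range(len(residuals)), key=lambda i: residuals[i],
                   reverse=True)              # largest indicators first
    total = sum(residuals)
    marked, acc = [], 0.0
    for i in order:
        marked.append(i)
        acc += residuals[i]
        if acc >= theta * total:              # bulk criterion satisfied
            break
    return sorted(marked)

# Hypothetical residual indicators on five coarse regions:
selected = mark_regions([0.5, 0.1, 0.3, 0.05, 0.05], theta=0.6)
```

Here regions 0 and 2 carry 80% of the residual, so only they would receive new online basis functions in the next iteration.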

  1. Evaluation of Restoration Methods to Minimize Canada Thistle (Cirsium arvense) Infestation

    Science.gov (United States)

    Larson, Diane L.

    2009-01-01

    The National Wildlife Refuge System has an active habitat restoration program and annually seeds thousands of hectares with native plant species. The noxious weed, Canada thistle (Cirsium arvense), plagues these restorations. This study evaluates planting methodology and seed mixes with the goal of recommending optimal methods to reduce infestation of noxious weeds, especially Canada thistle, in new restorations. Three planting methods (dormant season broadcast, growing season [summer] broadcast, and growing season [summer] drill) were fully crossed with three levels of seed diversity (10, 20, and 34 species [plus a fourth level, 58 species, on the three sites in Iowa]) in a completely randomized design replicated on nine sites in Minnesota and Iowa. The propagule bank of Canada thistle was evaluated at each site. Planting occurred in winter 2004 and spring-summer 2005. Here I report on results through summer 2007. None of the planting methods or seed mix diversities consistently resulted in reduced abundance of Canada thistle. Soil texture had the strongest influence; sites with greater proportions of clay had greater frequency and cover of Canada thistle than did sandy sites. At the Minnesota study sites, the dormant broadcast planting method combined with the highest seed diversity resulted in both the greatest cover of planted species as well as the greatest richness of planted species. At the Iowa sites, planted species richness was slightly greater in the summer drill plots, but cover of planted species was greatest in the dormant broadcast plots. Richness of planted species at the Iowa sites was maximized in the high diversity plots, with the extra-high diversity seed mix resulting in significantly lower species richness. Individual species responded to planting methods idiosyncratically, which suggests that particular species could be favored by tailoring planting methods to that species.

  2. Method to minimize the organic waste in liquid-liquid extraction processes

    International Nuclear Information System (INIS)

    Schoen, J.; Ochsenfeld, W.

    1978-01-01

    In order to free the aqueous phases arising in the Purex process for the reprocessing of irradiated nuclear and breeder materials from tri-n-butyl phosphate (TBP), which is present only in small amounts but is the most interfering contaminant, and from its decomposition products, it is suggested to add a macroporous sorption resin based on polystyrene cross-linked with divinylbenzene. A method is also described for reprocessing these resins so that almost all components can be recycled. Seven detailed examples illustrate the method. (UWI) [de]

  3. A Hybrid Optimization Method for Reactive Power and Voltage Control Considering Power Loss Minimization

    DEFF Research Database (Denmark)

    Liu, Chengxi; Qin, Nan; Bak, Claus Leth

    2015-01-01

    This paper proposes a hybrid optimization method to optimally control the voltage and reactive power with minimum power loss in the transmission grid. This approach is used for the Danish automatic voltage control (AVC) system, which typically poses a non-linear, non-convex problem mixed with both...

  4. The improved oval forceps suture-guiding method for minimally invasive Achilles tendon repair.

    Science.gov (United States)

    Liu, Yang; Lin, Lixiang; Lin, Chuanlu; Weng, Qihao; Hong, Jianjun

    2018-06-01

    To evaluate the effect and advantages of the improved oval forceps suture-guiding method combined with an anchor nail in the treatment of acute Achilles tendon rupture. A retrospective study was performed on 35 cases of acute Achilles tendon rupture treated with the improved oval forceps suture-guiding method from January 2013 to October 2016. Instead of the Achillon device, we performed the Achillon technique with simple oval forceps, combined with an absorbable anchor nail, to repair the acute Achilles tendon rupture percutaneously. All patients were followed up for at least 12 months (range, 12-19 months), and all underwent successful repair of their acute Achilles tendon rupture using the improved oval forceps suture-guiding method without any major intra- or postoperative complications. All the patients returned to work with pre-injury levels of activity at a mean of 12.51 ± 0.76 weeks. Mean AOFAS ankle-hindfoot scores improved from 63.95 (range, 51-78) preoperatively to 98.59 (range, 91-100) at last follow-up, a statistically significant difference (P ...). Combined with the anchor nail, the improved technique has better repair capacity and expands the operative indications of the oval forceps method. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. A time-minimizing hybrid method for fitting complex Moessbauer spectra

    International Nuclear Information System (INIS)

    Steiner, K.J.

    2000-07-01

    The process of fitting complex Moessbauer spectra is known to be time-consuming. The fitting process involves a mathematical model for the combined hyperfine interaction which can be solved only by an iteration method. The iteration method is very sensitive to its input parameters. In other words, with arbitrary input parameters it is most unlikely that the iteration method will converge. Up to now a scientist has had to spend her/his time guessing appropriate input parameters for the iteration process. The idea is to replace the guessing phase by a genetic algorithm. The genetic algorithm starts with an initial population of arbitrary input parameters. Each parameter set is called an individual. The first step is to evaluate the fitness of all individuals. Afterwards the current population is recombined to form a new population. The process of recombination involves the successive application of the genetic operators selection, crossover, and mutation. These operators mimic the process of natural evolution, i.e. the concept of the survival of the fittest. Even though there is no formal proof that the genetic algorithm will eventually converge, there is an excellent chance that there will be a population with very good individuals after some generations. The hybrid method presented in the following combines a very modern version of a genetic algorithm with a conventional least-squares routine solving the combined interaction Hamiltonian, i.e. providing a physical solution with the original Moessbauer parameters from a minimum of input. (author)
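The guess-then-iterate structure described above can be sketched as follows: a small real-coded genetic algorithm searches a multimodal surface, and its best individual seeds a conventional local refinement. Everything concrete here is an assumption for illustration: a Rastrigin function stands in for the hyperfine chi-squared surface, and plain finite-difference gradient descent stands in for the least-squares routine:

```python
import math
import random

def objective(x, y):
    # Multimodal stand-in for a Moessbauer chi-squared surface (Rastrigin, min 0).
    return (x*x - 10*math.cos(2*math.pi*x)) + (y*y - 10*math.cos(2*math.pi*y)) + 20

def genetic_search(pop_size=40, generations=80, seed=1):
    """Find good starting parameters instead of guessing them by hand."""
    rng = random.Random(seed)
    pop = [(rng.uniform(-5, 5), rng.uniform(-5, 5)) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda p: objective(*p))
        elite = scored[: pop_size // 4]          # truncation selection
        children = list(elite)                   # elitism keeps the best
        while len(children) < pop_size:
            a, b = rng.choice(elite), rng.choice(elite)
            w = rng.random()                     # blend crossover
            cx = w * a[0] + (1 - w) * b[0] + rng.gauss(0, 0.3)  # + mutation
            cy = w * a[1] + (1 - w) * b[1] + rng.gauss(0, 0.3)
            children.append((cx, cy))
        pop = children
    return min(pop, key=lambda p: objective(*p))

def polish(x, y, lr=1e-3, steps=2000, h=1e-6):
    """Conventional local refinement via finite-difference gradient descent."""
    for _ in range(steps):
        gx = (objective(x + h, y) - objective(x - h, y)) / (2*h)
        gy = (objective(x, y + h) - objective(x, y - h)) / (2*h)
        x, y = x - lr*gx, y - lr*gy
    return x, y

start = genetic_search()
xf, yf = polish(*start)
```

The local routine alone would be trapped by the nearest valley of a surface like this; seeded by the genetic search, it starts in a good basin, which mirrors the paper's division of labor.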

  6. Optimization of design and erection methods to minimize the construction time-schedule of EPR plants

    International Nuclear Information System (INIS)

    Pierrat, Michel; L'Huby, Yvan; Decelle, Alain

    1999-01-01

    This paper presents the results of the investigations made during the Basic Design of the EPR (European Pressurized water Reactor) project to shorten the construction schedule. A 57-month construction schedule can be reached for the first unit. The investigations concern both design and construction methods. (author)

  7. Stability of bioactive compounds in minimally processed beet according to the cooking methods

    Directory of Open Access Journals (Sweden)

    Juliana Arruda RAMOS

    2017-10-01

    Full Text Available Abstract The current study aimed to determine the functional properties of fresh beets under different cooking methods through the quantification of bioactive compounds. Beets were chosen for uniformity of size, color, and absence of defects. They were thoroughly washed in running water to remove dirt, manually peeled with a knife, sliced with a stainless-steel food processor (5 mm slicing disc), and submitted to four different cooking methods: steaming, pressure cooking, oven baking, and hot-water immersion. Analyses were performed on both uncooked and cooked beets to evaluate antioxidant activity, total phenolic content, carotenoids, flavonoids, and betalains. The experiment was a completely randomized design (CRD). Data were subjected to analysis of variance (F test) and means were compared by the Tukey test (p < 0.05). Oven-baked beets preserved most of the bioactive compounds, maintaining better levels of carotenoids, flavonoids, betacyanin, and betaxanthin than the other cooking methods. The antioxidant activity was similar between the treatments, except for pressure cooking. Moreover, the different cooking methods did not affect the phenolic compound concentration in beets.

  8. Determination of material irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Cerles, J.M.; Mas, P.

    1978-01-01

    In this paper, the author reports some of the main methods used to determine the nuclear parameters of material irradiation in testing reactors (nuclear power, burn-up, fluxes, fluences, ...). The different methods (theoretical or experimental) are reviewed: neutronics measurements and calculations, gamma scanning, thermal balance, ... The required accuracies are reviewed: they are 3-5% for flux, fluences, nuclear power, burn-up, and conversion factor. These required accuracies are compared with the accuracies actually available, which at present are of the order of 5-20% for these parameters.

  9. Rapid methods for jugular bleeding of dogs requiring one technician.

    Science.gov (United States)

    Frisk, C S; Richardson, M R

    1979-06-01

    Two methods were used to collect blood from the jugular vein of dogs. In both techniques, only one technician was required. A rope with a slip knot was placed around the base of the neck to assist in restraint and act as a tourniquet for the vein. The technician used one hand to restrain the dog by the muzzle and position the head. The other hand was used for collecting the sample. One of the methods could be accomplished with the dog in its cage. The bleeding techniques were rapid, requiring approximately 1 minute per dog.

  10. Determination of fuel irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Mas, P.

    1977-01-01

    This paper reports on the present status of the main methods used to determine the nuclear parameters of fuel irradiation in testing reactors (nuclear power, burn-up, ...). The different methods (theoretical or experimental) are reviewed: neutron measurements and calculations, gamma scanning, heat balance, ... The required accuracies are reviewed: they are 3-5% for flux, fluences, nuclear power, burn-up, and conversion factor. These required accuracies are compared with the accuracies actually available, which at present are of the order of 5-20% for these parameters.

  11. Classical Methods and Calculation Algorithms for Determining Lime Requirements

    Directory of Open Access Journals (Sweden)

    André Guarçoni

    Full Text Available ABSTRACT The methods developed for determination of lime requirements (LR) are based on widely accepted principles. However, the formulas used for calculation have evolved little over recent decades, and in some cases there are indications of their inadequacy. The aim of this study was to compare the lime requirements calculated by three classic formulas and three algorithms, defining those most appropriate for supplying Ca and Mg to coffee plants with the smallest possibility of causing overliming. The database used contained 600 soil samples, which were collected in coffee plantings. The LR was estimated by the methods of base saturation, neutralization of Al3+, and elevation of Ca2+ and Mg2+ contents (two formulas), and by the three calculation algorithms. Averages of the lime requirements were compared, determining the frequency distribution of the 600 lime requirements (LR) estimated through each calculation method. In soils with low cation exchange capacity at pH 7, the base saturation method may fail to adequately supply the plants with Ca and Mg in many situations, while the methods of Al3+ neutralization and elevation of Ca2+ and Mg2+ contents can result in the calculation of application rates that will increase the pH above the suitable range. Among the methods studied for calculating lime requirements, the algorithm that predicts reaching a defined base saturation, with adequate Ca and Mg supply and the maximum application rate limited to the H+Al value, proved to be the most efficient calculation method, and it can be recommended for use under numerous crop conditions.
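For context, the base-saturation method discussed above is commonly written (in the Brazilian convention, which is an assumption here rather than the paper's exact formula) as LR = (V2 - V1) x T / 100 x 100/PRNT, in t/ha, with T the CEC at pH 7 in cmolc/dm3 and PRNT the limestone's relative neutralizing power. A minimal sketch:

```python
def lime_requirement_base_saturation(v_target, v_current, cec_ph7, prnt=100.0):
    """Lime requirement (t/ha) by the base-saturation method.

    v_target, v_current: desired and measured base saturation (%)
    cec_ph7: cation exchange capacity at pH 7 (cmolc/dm3)
    prnt: relative neutralizing power of the limestone (%)
    """
    need = (v_target - v_current) * cec_ph7 / 100.0   # cmolc/dm3 of bases needed
    return max(0.0, need) * (100.0 / prnt)            # no negative (over-limed) rates

# Hypothetical low-CEC soil: raising V from 35% to 60% on a soil with T = 4
# calls for only 1 t/ha, so little Ca/Mg is supplied; this is the kind of
# situation in which the study finds the base-saturation method inadequate.
lr = lime_requirement_base_saturation(v_target=60, v_current=35, cec_ph7=4.0)
```

The Al3+-neutralization and Ca/Mg-elevation formulas the study compares follow the same pattern but are not reproduced here.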

  12. Minimal Subdermal Shaving by Means of Sclerotherapy Using Absolute Ethanol: A New Method for the Treatment of Axillary Osmidrosis

    Directory of Open Access Journals (Sweden)

    Hyung-Sup Shim

    2013-07-01

    Full Text Available Background: Axillary osmidrosis is characterized by unpleasant odors originating from the axillary apocrine glands, resulting in psychosocial stress. The main treatment modality is apocrine gland removal. To date, the various surgical techniques have sometimes caused serious complications. We describe herein the favorable outcomes of a new method for ablating apocrine glands by minimal subdermal shaving using sclerotherapy with absolute ethanol. Methods: A total of 12 patients underwent the procedure. The severity of osmidrosis was evaluated before surgery. Conventional subdermal shaving was performed on one side (control group) and ablation by means of minimal subdermal shaving and absolute ethanol on the other side (study group). Postoperative outcomes were compared between the study and control groups. Results: The length of time to removal of the drain was 1 day shorter in the study group than in the control group. There were no serious complications, such as hematoma or seroma, in either group, but flap margin necrosis and flap desquamation occurred in the control group and were successfully managed with conservative treatment. Six months after surgery, we and our patients were satisfied with the outcomes. Conclusions: Sclerotherapy using absolute ethanol combined with minimal subdermal shaving may be useful for the treatment of axillary osmidrosis. It can reduce the incidence of seroma and hematoma and allow the skin flap to adhere to its recipient site. It can degrade and ablate the remaining apocrine glands and eliminate causative organisms. Furthermore, since this technique is relatively simple, it takes less time than the conventional method.

  13. A new approach to the inverse kinematics of a multi-joint robot manipulator using a minimization method

    International Nuclear Information System (INIS)

    Sasaki, Shinobu

    1987-01-01

    This paper proposes a new approach to solving the inverse kinematics of a type of six-link manipulator. Directing our attention to features of the joint structures of the manipulator, the original problem is first formulated as a system of equations in four variables and solved by means of a minimization technique. The remaining two variables are determined from the constraint conditions involved. This is the basic idea of the present approach. Computer simulation of the present algorithm showed that the accuracy of the solutions and the convergence speed are much higher and quite satisfactory for practical purposes, compared with the linearization-iteration method based on the conventional inverse Jacobian matrix. (author)
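The core idea, posing the kinematic equations as a least-squares objective and minimizing it rather than inverting the Jacobian, can be sketched on a toy two-link planar arm. The link lengths, target, and plain finite-difference gradient descent below are illustrative assumptions, not the paper's six-link formulation or its minimization routine:

```python
import math

L1, L2 = 1.0, 1.0          # assumed link lengths

def forward(theta1, theta2):
    """End-effector position of a two-link planar arm."""
    x = L1*math.cos(theta1) + L2*math.cos(theta1 + theta2)
    y = L1*math.sin(theta1) + L2*math.sin(theta1 + theta2)
    return x, y

def solve_ik(target, theta=(0.5, 0.5), lr=0.05, steps=20000, h=1e-6):
    """Minimize the squared position error by finite-difference gradient descent."""
    t1, t2 = theta
    def err(a, b):
        x, y = forward(a, b)
        return (x - target[0])**2 + (y - target[1])**2
    for _ in range(steps):
        g1 = (err(t1 + h, t2) - err(t1 - h, t2)) / (2*h)
        g2 = (err(t1, t2 + h) - err(t1, t2 - h)) / (2*h)
        t1, t2 = t1 - lr*g1, t2 - lr*g2
    return t1, t2

t1, t2 = solve_ik(target=(1.2, 0.5))
```

As in the paper, no Jacobian inverse is formed; the joint angles simply descend the error surface until the end effector reaches the target.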

  14. Human brain receptor autoradiography using whole hemisphere sections: a general method that minimizes tissue artefacts

    International Nuclear Information System (INIS)

    Quirion, R.; Robitaille, Y.; Martial, J.; Chabot, J.G.; Lemoine, P.; Pilapil, C.; Dalpe, M.

    1987-01-01

    A general method for the preparation of high-quality, mostly ice-crystal-artefact-free whole human brain hemisphere sections is described. Upon receipt, hemispheres are divided; one is fixed in buffered 10% formalin for neuropathological analysis while the other is cut into 8-10-mm-thick coronal slices that are then rapidly frozen in 2-methylbutane at -40 degrees C (10-15 sec) before being placed in the brain bank at -80 degrees C. Such rapid freezing markedly decreases the formation of ice-crystal artefacts. Whole-hemisphere 20-micron-thick sections are then cut and mounted onto lantern-type gelatin-coated slides. These sections are subsequently used for both qualitative and quantitative in vitro receptor autoradiography. Examples of data obtained are given using various radioligands labelling classical neurotransmitter, neuropeptide, enzyme, and ion channel receptor binding sites. This method should be useful for the acquisition of various receptor maps in the human brain. Such information could be most useful for in vivo receptor visualization studies using positron emission tomography (PET) scanning. It could also indicate whether a given receptor population is specifically and selectively altered in certain brain diseases, eventually leading to the development of new therapeutic approaches.

  15. Optimization of Extraction Method of the Natural Coagulant from Descurainia Sophia Seed: Minimization of Color Generation

    Directory of Open Access Journals (Sweden)

    Mazyar Peyda

    2016-06-01

    Full Text Available Background: Water treatment sometimes needs a coagulation and flocculation process to remove suspended and colloidal materials. The inorganic coagulants used create concerns about pollution of the environment and harmful effects on human health. Studies carried out previously indicated the capability of an active coagulant agent extracted from Descurainia Sophia seed to remove turbidity from water. Methods: The purpose of this study was to investigate the effect of NaCl (0.05-1 g L-1), NaOH (0.01-0.1 g L-1), extraction duration (1-25 min), and ultrasound frequency (0, 45, and 75 kHz), used in the extraction of Descurainia Sophia seed, on the generation of color in purified water, and to provide a model to predict the effects of the studied variables on color generation. Extraction was performed using water as solvent, supplemented with NaCl and NaOH and irradiated by ultrasound. Design of experiments and analysis of results were conducted by the D-optimal method based on response surface methodology (RSM). Results: The results demonstrated that only the effect of the concentration of NaOH is significant in color generation (p<0.05). Conclusion: The effect of NaOH on color generation in purified water is predictable by the use of a statistically valid linear model at a confidence level of 95%.

  16. The use of noninvasive and minimally invasive methods in endocrinology for threatened mammalian species conservation.

    Science.gov (United States)

    Kersey, David C; Dehnhard, Martin

    2014-07-01

    Endocrinology is an indispensable tool in threatened species research. The study of endocrinology in threatened species not only advances knowledge of endocrine mechanisms but also contributes to conservation efforts for the studied species. To this end, endocrinology has traditionally been used to understand the reproductive and adrenocortical endocrine axes by quantifying excreted steroid metabolites. From these studies a large body of knowledge was created that contributed to the field of endocrinology, aided conservation efforts, and created a template by which to validate and conduct this research for other species. In this regard noninvasive hormone monitoring has become a favored approach to studying the basic endocrinology of wildlife species. Due to the increased understanding of the endocrine physiology of threatened species, breeding rates of captive populations have improved to levels allowing for reintroduction of species to restored natural ecosystems. Although these approaches are still employed, advances in biochemical, molecular, and genomic technologies are providing inroads to describe lesser-known endocrine activity in threatened species. These new avenues of research will allow for growth of the field with greater depth and breadth. However, for all approaches to endocrinology, limitations on resources and access to animals will require innovation of current methodologies to permit broad application in threatened species research. Crown Copyright © 2014. Published by Elsevier Inc. All rights reserved.

  17. Application of improved Vogel’s approximation method in minimization of rice distribution costs of Perum BULOG

    Science.gov (United States)

    Nahar, J.; Rusyaman, E.; Putri, S. D. V. E.

    2018-03-01

    This research was conducted at Perum BULOG Sub-Divre Medan, which is the implementing institution of the Raskin program for several regencies and cities in North Sumatera. Raskin is a program for distributing rice to the poor. In order to minimize rice distribution costs, rice should be allocated optimally. The method used in this study consists of the Improved Vogel Approximation Method (IVAM) to obtain the initial feasible solution, and Modified Distribution (MODI) to test the optimality of the solution. This study aims to determine whether the IVAM method can provide savings or cost efficiency in rice distribution. The calculation with IVAM yields an optimum cost of Rp945.241.715,5, which is lower than the company's own calculation of Rp958.073.750,40. Thus, the use of IVAM can save Rp12.832.034,9 in rice distribution costs.
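For context, the baseline Vogel approximation, of which IVAM is a refinement (the improved variant's penalty tweaks are not reproduced here), can be sketched on a made-up balanced 2x2 instance; the costs, supplies, and demands are invented for illustration:

```python
def vogel(costs, supply, demand):
    """Initial feasible solution of a balanced transportation problem by
    Vogel's approximation: penalty = gap between the two cheapest cells
    in each remaining row/column; allocate at the cheapest cell of the
    line with the largest penalty."""
    supply, demand = supply[:], demand[:]
    m, n = len(costs), len(costs[0])
    rows, cols = set(range(m)), set(range(n))
    alloc = [[0] * n for _ in range(m)]

    def penalty(vals):
        s = sorted(vals)
        return s[1] - s[0] if len(s) > 1 else s[0]

    while rows and cols:
        best_pen, line = -1, None
        for i in rows:
            p = penalty([costs[i][j] for j in cols])
            if p > best_pen:
                best_pen, line = p, ('row', i)
        for j in cols:
            p = penalty([costs[i][j] for i in rows])
            if p > best_pen:
                best_pen, line = p, ('col', j)
        if line[0] == 'row':
            i = line[1]
            j = min(cols, key=lambda c: costs[i][c])
        else:
            j = line[1]
            i = min(rows, key=lambda r: costs[r][j])
        q = min(supply[i], demand[j])      # ship as much as possible here
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            rows.discard(i)
        if demand[j] == 0:
            cols.discard(j)
    return alloc

# Hypothetical 2 depots x 2 destinations, balanced at 100 units per side.
costs = [[2, 5], [3, 1]]
alloc = vogel(costs, [30, 70], [40, 60])
total = sum(costs[i][j] * alloc[i][j] for i in range(2) for j in range(2))
```

On this tiny instance the Vogel solution (cost 150) is already optimal; in general, MODI would then be applied, as in the study, to verify or improve it.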

  18. A Framework for the Development of Automatic DFA Method to Minimize the Number of Components and Assembly Reorientations

    Science.gov (United States)

    Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa

    2018-03-01

    Assembly is a part of the manufacturing process that must be considered at the product design stage. Design for Assembly (DFA) is a method to evaluate a product design in order to make it simpler, easier, and quicker to assemble, so that assembly cost is reduced. This article discusses a framework for developing a computer-based DFA method. The method is expected to aid the product designer to extract data, evaluate the assembly process, and provide recommendations for product design improvement. These three things should preferably be performed without an interactive process or user intervention, so that product design evaluation can be done automatically. Input for the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by: minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.
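The reorientation-count criterion lends itself to a brute-force sketch: enumerate precedence-feasible sequences and keep the one with the fewest changes of assembly direction. The parts, their directions, and the single precedence constraint below are invented for illustration and are not taken from the framework itself:

```python
from itertools import permutations

# Hypothetical parts with their assembly directions.
parts = {'A': '+z', 'B': '+z', 'C': '+x', 'D': '+z'}
precedence = [('A', 'B')]          # A must be assembled before B

def reorientations(seq):
    """Count direction changes between consecutive assembly operations."""
    dirs = [parts[p] for p in seq]
    return sum(1 for a, b in zip(dirs, dirs[1:]) if a != b)

def feasible(seq):
    """Check every precedence pair is respected in the sequence."""
    return all(seq.index(a) < seq.index(b) for a, b in precedence)

best = min((s for s in permutations(parts) if feasible(s)), key=reorientations)
```

Grouping the three +z parts together leaves a single reorientation for C, which is the minimum here; a real DFA tool would replace the exhaustive enumeration with a smarter search for larger assemblies.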

  19. 1993 Annual report on waste generation and waste minimization progress as required by DOE Order 5400.1, Hanford Site

    International Nuclear Information System (INIS)

    Kirkendall, J.R.; Engel, J.A.

    1994-01-01

    More important than waste generation numbers, the pollution prevention and waste minimization successes achieved at Hanford in 1993 have reduced waste and improved operations at the Site. Just a few of these projects are: A small research nuclear reactor, unused and destined for disposal as low level radioactive waste, was provided to a Texas University for their nuclear research program, avoiding 25 cubic meters of waste and saving $116,000. By changing the slope on an asphalt lot in front of a waste storage pad, run-off rainwater was prevented from becoming mixed low level waste water, preventing 40 cubic meters of waste and saving $750,000. Through more efficient electrostatic paint spraying equipment and a solvent recovery system, a paint shop reduced hazardous waste by 3,500 kilograms, saving $90,800. During the demolition of a large decommissioned building, more than 90% of the building's material was recycled by crushing the concrete for use on-Site and selling the steel to an off-Site recycler, avoiding a total of 12,600 metric tons of waste and saving $450,000. Additionally, several site-wide programs have avoided large quantities of waste, including the following: Through expansion of the paper and office waste recycling program which includes paper, cardboard, newspaper, and phone books, 516 metric tons of sanitary waste was reduced, saving $68,000. With the continued success of the excess chemicals program, which finds on-Site and off-Site customers for excess chemical materials, hazardous waste was reduced by 765,000 liters of liquid chemicals and 50 metric tons of solid chemicals, saving over $700,000 in disposal costs.

  20. Methods for ensuring compliance with regulatory requirements: regulators and operators

    International Nuclear Information System (INIS)

    Fleischmann, A.W.

    1989-01-01

    Some of the methods of ensuring compliance with regulatory requirements contained in various radiation protection documents such as Regulations, ICRP Recommendations etc. are considered. These include radiation safety officers and radiation safety committees, personnel monitoring services, dissemination of information, inspection services and legislative power of enforcement. Difficulties in ensuring compliance include outmoded legislation, financial and personnel constraints

  1. Proposed New Method of Interpretation of Infrared Ship Signature Requirements

    NARCIS (Netherlands)

    Neele, F.P.; Wilson, M.T.; Youern, K.

    2005-01-01

    A new method of deriving and defining requirements for the infrared signature of new ships is presented. The current approach is to specify the maximum allowed temperature or radiance contrast of the ship with respect to its background. At present, in most NATO countries, it is the contractor’s

  2. KEELE, Minimization of Nonlinear Function with Linear Constraints, Variable Metric Method

    International Nuclear Information System (INIS)

    Westley, G.W.

    1975-01-01

    1 - Description of problem or function: KEELE is a linearly constrained nonlinear programming algorithm for locating a local minimum of a function of n variables with the variables subject to linear equality and/or inequality constraints. 2 - Method of solution: A variable metric procedure is used where the direction of search at each iteration is obtained by multiplying the negative of the gradient vector by a positive definite matrix which approximates the inverse of the matrix of second partial derivatives associated with the function. 3 - Restrictions on the complexity of the problem: Array dimensions limit the number of variables to 20 and the number of constraints to 50. These can be changed by the user
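The variable metric idea KEELE uses (search direction equal to the negative gradient multiplied by an approximate inverse Hessian) can be sketched for the unconstrained case as a BFGS-style iteration. This is an illustrative sketch only: KEELE additionally handles linear equality and inequality constraints, which are omitted here.

```python
import numpy as np

def variable_metric_minimize(f, grad, x0, tol=1e-8, max_iter=100):
    """Unconstrained sketch of the variable metric procedure: the search
    direction is d = -H g, where H approximates the inverse Hessian and
    is improved each iteration with a BFGS rank-two update."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                 # initial inverse-Hessian estimate
    g = grad(x)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -H @ g                     # descent direction
        t = 1.0                        # backtracking line search (Armijo)
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d):
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        if y @ s > 1e-12:              # curvature condition keeps H positive definite
            rho = 1.0 / (y @ s)
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

# Minimize f(x, y) = (x - 3)^2 + 2 (y + 1)^2, whose minimum is at (3, -1).
f = lambda v: (v[0] - 3) ** 2 + 2 * (v[1] + 1) ** 2
grad = lambda v: np.array([2 * (v[0] - 3), 4 * (v[1] + 1)])
x_min = variable_metric_minimize(f, grad, [0.0, 0.0])
print(x_min)
```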

  3. Waste minimization methods for treating analytical instrumentation effluents at the source

    International Nuclear Information System (INIS)

    Ritter, J.A.; Barnhart, C.

    1995-01-01

    The primary goal of this project was to reduce the amount of hazardous waste being generated by the Savannah River Site Defense Waste Processing Technology-Analytical Laboratory (DWPT-AL). A detailed characterization study was performed on 12 of the liquid effluent streams generated within the DWPT-AL. Two of the streams were not hazardous, and are now being collected separately from the 10 hazardous streams. A secondary goal of the project was to develop in-line methods, using primarily adsorption/ion exchange columns, to treat liquid effluent as it emerges from the analytical instrument as a slow, dripping flow. Samples from the 10 hazardous streams were treated by adsorption in an experimental apparatus that resembled an in-line or at-source column apparatus. The layered adsorbent bed contained activated carbon and ion exchange resin. The column technique did not work on the first three samples of the spectroscopy waste stream, but worked well on the next three samples, which were treated in a different column. It was determined that an unusual form of mercury was present in the first three samples. Similarly, two samples of a combined waste stream were rendered nonhazardous, but the last two samples contained acetonitrile that prevented analysis. The characteristics of these streams changed from the initial characterization study; therefore, continual, in-depth stream characterization is the key to making this project successful

  4. New methods of minimally invasive brain modulation as therapies in psychiatry: TMS, MST, VNS and DBS.

    Science.gov (United States)

    George, Mark S

    2002-08-01

    Over the past 20 years, new methods have been developed that have allowed scientists to visualize the human brain in action. Initially positron emission tomography (PET) and now functional magnetic resonance imaging (fMRI) are causing a paradigm shift in psychiatry and the neurosciences. Psychiatry is abandoning the pharmacological model of 'brain as soup', used for much of the past 20 years. Instead, there is new realization that both normal and abnormal behavior arise from chemical processes that occur within parallel distributed networks in specific brain regions. Many of these pathological circuits are becoming well characterized, in disorders ranging from Parkinson's disease, to obsessive-compulsive disorder, to depression. Most recently, there has been an explosion of new techniques that allow for direct stimulation of these brain circuits, without the need for open craniotomy and neurosurgical ablation. The techniques include transcranial magnetic stimulation (TMS), magnetic seizure therapy (MST), vagus nerve stimulation (VNS), and deep brain stimulation (DBS). This review will describe these new tools, and overview their current and future potential for research and clinical neuropsychiatric use. The psychiatry of the future will be better grounded in a firm understanding of neuroanatomy and neurophysiology (as well as pharmacology). These brain stimulation tools, or their next iterations, will play an ever-larger role in clinical neuropsychiatric practice.

  5. Minimization of energy consumption in HVAC systems with data-driven models and an interior-point method

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Xu, Guanglin; Zhang, Zijun

    2014-01-01

    Highlights: • We study the energy saving of HVAC systems with a data-driven approach. • We conduct an in-depth analysis of the topology of the developed neural-network-based HVAC model. • We apply an interior-point method to solve a neural-network-based HVAC optimization model. • The uncertain building occupancy is incorporated in the minimization of HVAC energy consumption. • A significant potential for saving HVAC energy is discovered. - Abstract: In this paper, a data-driven approach is applied to minimize the energy consumption of a heating, ventilating, and air conditioning (HVAC) system while maintaining the thermal comfort of a building with uncertain occupancy level. The uncertainty of the arrival and departure rates of occupants is modeled by the Poisson and uniform distributions, respectively. The internal heating gain is calculated from the stochastic process of the building occupancy. Based on the observed and simulated data, a multilayer perceptron algorithm is employed to model and simulate the HVAC system. The data-driven models accurately predict future performance of the HVAC system based on the control settings and the observed historical information. An optimization model is formulated and solved with the interior-point method. The optimization results are compared with the results produced by the simulation models
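The overall scheme (minimizing a data-driven energy surrogate subject to comfort constraints with an interior-point-type solver) can be sketched with SciPy's `trust-constr` solver. The surrogate functions, control variables, and limits below are invented stand-ins, not the paper's trained multilayer-perceptron models.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

# Hypothetical surrogate: predicted energy as a smooth function of two
# control settings (supply-air temperature u0 in C, fan speed u1 as a
# fraction), standing in for a trained data-driven model.
def energy(u):
    return 0.5 * u[1] ** 2 + 0.1 * (u[0] - 12.0) ** 2 + u[1]

# Comfort surrogate: predicted zone temperature must stay in [21, 24] C.
def zone_temp(u):
    return 15.0 + 0.4 * u[0] + 2.0 * u[1]

comfort = NonlinearConstraint(zone_temp, 21.0, 24.0)
bounds = [(10.0, 18.0), (0.2, 1.0)]            # control setting limits

res = minimize(energy, x0=[14.0, 0.6], method="trust-constr",
               bounds=bounds, constraints=[comfort])
print(res.x, res.fun)
```

The solver trades off supply-air temperature against fan speed along the active comfort constraint, which is exactly the kind of setpoint optimization the abstract describes.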

  6. On Birnbaum importance assessment for aging multi-state system under minimal repair by using the Lz-transform method

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly; Frenkel, Ilia; Khvatskin, Lev

    2015-01-01

    This paper considers a reliability importance evaluation for components in an aging multi-state system. In practical reliability engineering, a “curse of dimensionality” (the large number of states that should be analyzed for a multi-state system model) is a main obstacle for importance assessment. In order to address the problem, this paper proposes a new method that is based on the Lz-transform of the discrete-state continuous-time Markov process and on Ushakov's Universal Generating Operator. The paper shows that the proposed method can drastically reduce the computational burden. In order to illustrate the method, a solution of a real-world problem is presented as a numerical example. - Highlights: • An aging multi-state system under minimal repair is studied. • A new method for Birnbaum importance assessment is developed. • The method is based on the Lz-transform. • The proposed method provides a drastic reduction of the computational burden. • A numerical example is presented in order to illustrate the method
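For intuition, the Birnbaum importance of a component in the classical binary case is I_B(i) = R(1_i, p) - R(0_i, p): the gain in system reliability from making component i perfect versus failed. The sketch below computes this for a toy binary series-parallel system; the paper's contribution is evaluating such measures efficiently for aging multi-state systems via the Lz-transform, which this toy example does not attempt.

```python
def reliability(p):
    """Binary series-parallel system: component 0 in series with the
    parallel pair (1, 2).  p[i] is component i's working probability."""
    return p[0] * (1 - (1 - p[1]) * (1 - p[2]))

def birnbaum(rel, p, i):
    """I_B(i) = R(1_i, p) - R(0_i, p)."""
    hi = list(p); hi[i] = 1.0
    lo = list(p); lo[i] = 0.0
    return rel(hi) - rel(lo)

p = [0.9, 0.8, 0.7]
imps = [birnbaum(reliability, p, i) for i in range(3)]
print([round(v, 4) for v in imps])   # → [0.94, 0.27, 0.18]
```

The series component dominates, as expected: the whole system fails with it, so its importance equals the reliability of the rest of the structure.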

  7. [Determination of minimal concentrations of biocorrosion inhibitors by a bioluminescence method in relation to bacteria, participating in biocorrosion].

    Science.gov (United States)

    Efremenko, E N; Azizov, R E; Makhlis, T A; Abbasov, V M; Varfolomeev, S D

    2005-01-01

    By using a bioluminescence ATP assay, we have determined the minimal concentrations of some biocorrosion inhibitors (Katon, Khazar, VFIKS-82, Nitro-1, Kaspii-2, and Kaspii-4) suppressing the most common microbial corrosion agents: Desulfovibrio desulfuricans, Desulfovibrio vulgaris, Pseudomonas putida, Pseudomonas fluorescens, and Acidithiobacillus ferrooxidans. The cell titers determined by the bioluminescence method, which include not only dividing cells but also their dormant living counterparts, are two- to sixfold greater than the values determined microbiologically. It is shown that the bioluminescence method can be applied to the determination of cell titers in samples of oil-field waters in the presence of iron ions (up to 260 mM) and iron sulfide (up to 186 mg/l) and in the absence or presence of biocidal corrosion inhibitors.

  8. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics.

    Science.gov (United States)

    Szostak, Katarzyna M; Grand, Laszlo; Constandinou, Timothy G

    2017-01-01

    Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to the foreign body response to the implant, which is initiated by the surgical procedure and related to the probe structure and the material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording, describe the three different types of probes (microwire, micromachined, and polymer-based) together with their materials and fabrication methods, and discuss their characteristics and related challenges.

  9. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics

    Directory of Open Access Journals (Sweden)

    Katarzyna M. Szostak

    2017-12-01

    Full Text Available Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to foreign body response to the implant that is initiated by the surgical procedure, and related to the probe structure, and material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording, describe the three different types of probes—microwire, micromachined, and polymer-based probes; their materials, fabrication methods, and discuss their characteristics and related challenges.

  10. Estimation methods of eco-environmental water requirements: Case study

    Institute of Scientific and Technical Information of China (English)

    YANG Zhifeng; CUI Baoshan; LIU Jingling

    2005-01-01

    Supplying water to the ecological environment with certain quantity and quality is significant for the protection of diversity and the realization of sustainable development. The conception and connotation of eco-environmental water requirements, including the definition of the conception, the composition and characteristics of eco-environmental water requirements, are evaluated in this paper. The classification and estimation methods of eco-environmental water requirements are then proposed. On the basis of the study on the Huang-Huai-Hai Area, the present water use, the minimum and suitable water requirement are estimated and the corresponding water shortage is also calculated. According to the interrelated programs, the eco-environmental water requirements in the coming years (2010, 2030, 2050) are estimated. The result indicates that the minimum and suitable eco-environmental water requirements fluctuate with the differences of function setting and the referential standard of water resources, and so as the water shortage. Moreover, the study indicates that the minimum eco-environmental water requirement of the study area ranges from 2.84×1010m3 to 1.02×1011m3, the suitable water requirement ranges from 6.45×1010m3 to 1.78×1011m3, the water shortage ranges from 9.1×109m3 to 2.16×1010m3 under the minimum water requirement, and it is from 3.07×1010m3 to 7.53×1010m3 under the suitable water requirement. According to the different values of the water shortage, the water priority can be allocated. The ranges of the eco-environmental water requirements in the three coming years (2010, 2030, 2050) are 4.49×1010m3-1.73×1011m3, 5.99×10m3?2.09×1011m3, and 7.44×1010m3-2.52×1011m3, respectively.

  11. Structural and Functional Characterization of an Ancient Bacterial Transglutaminase Sheds Light on the Minimal Requirements for Protein Cross-Linking.

    Science.gov (United States)

    Fernandes, Catarina G; Plácido, Diana; Lousa, Diana; Brito, José A; Isidro, Anabela; Soares, Cláudio M; Pohl, Jan; Carrondo, Maria A; Archer, Margarida; Henriques, Adriano O

    2015-09-22

    Transglutaminases are best known for their ability to catalyze protein cross-linking reactions that impart chemical and physical resilience to cellular structures. Here, we report the crystal structure and characterization of Tgl, a transglutaminase from the bacterium Bacillus subtilis. Tgl is produced during sporulation and cross-links the surface of the highly resilient spore. Tgl-like proteins are found only in spore-forming bacteria of the Bacillus and Clostridia classes, indicating an ancient origin. Tgl is a single-domain protein, produced in active form, and the smallest transglutaminase characterized to date. We show that Tgl is structurally similar to bacterial cell wall endopeptidases and has an NlpC/P60 catalytic core, thought to represent the ancestral unit of the cysteine protease fold. We show that Tgl functions through a unique partially redundant catalytic dyad formed by Cys116 and Glu187 or Glu115. Strikingly, the catalytic Cys is insulated within a hydrophobic tunnel that traverses the molecule from side to side. The lack of similarity of Tgl to other transglutaminases together with its small size suggests that an NlpC/P60 catalytic core and insulation of the active site during catalysis may be essential requirements for protein cross-linking.

  12. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.

  13. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods are used in a task analysis and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefits estimates.
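The quantitative scoring idea can be illustrated with a simple weighted-sum sketch: rate each candidate task against a few criteria, weight the ratings, and rank the tasks by total score. The criteria, weights, and tasks below are hypothetical, not those of the AES analysis.

```python
# Weights over scoring criteria (sum to 1); ratings are on a 0-10 scale.
criteria = {"frequency": 0.3, "time_saved": 0.4, "error_reduction": 0.3}

tasks = {
    "terrain analysis":  {"frequency": 9, "time_saved": 8, "error_reduction": 6},
    "obstacle planning": {"frequency": 6, "time_saved": 7, "error_reduction": 8},
    "report formatting": {"frequency": 8, "time_saved": 4, "error_reduction": 3},
}

def score(ratings):
    """Weighted-sum feasibility score for automating one task."""
    return sum(criteria[c] * ratings[c] for c in criteria)

ranked = sorted(tasks, key=lambda t: score(tasks[t]), reverse=True)
for t in ranked:
    print(f"{t}: {score(tasks[t]):.2f}")
```

The ranking then feeds the "where is the highest payoff?" question: the top-scoring tasks are the candidates for automation and for the subsequent cost-benefit estimates.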

  14. Activities of native and tyrosine-69 mutant phospholipases A2 on phospholipid analogues. A reevaluation of the minimal substrate requirements.

    Science.gov (United States)

    Kuipers, O P; Dekker, N; Verheij, H M; de Haas, G H

    1990-06-26

    The role of Tyr-69 of porcine pancreatic phospholipase A2 in substrate binding was studied with the help of proteins modified by site-directed mutagenesis and phospholipid analogues with a changed head-group geometry. Two mutants were used containing Phe and Lys, respectively, at position 69. Modifications in the phospholipids included introduction of a sulfur at the phosphorus (thionophospholipids), removal of the negative charge at phosphorus (phosphatidic acid dimethyl ester), and reduction (phosphonolipids) or extension (diacylbutanetriol choline phosphate) of the distance between the phosphorus and the acyl ester bond. Replacement of Tyr-69 by Lys reduces enzymatic activity, but the mutant enzyme retains both the stereospecificity and positional specificity of native phospholipase A2. The Phe-69 mutant not only hydrolyzes the Rp isomer of thionophospholipids more efficiently than the wild-type enzyme, but the Sp thiono isomer is hydrolyzed too, although at a low (approximately 4%) rate. Phosphonolipids are hydrolyzed by native phospholipase A2 about 7 times more slowly than natural phospholipids, with retention of positional specificity and a (partial) loss of stereospecificity. The dimethyl ester of phosphatidic acid is degraded efficiently in a calcium-dependent and positional-specific way by native phospholipase A2 and by the mutants, indicating that a negative charge at phosphorus is not an absolute substrate requirement. The activities on the phosphatidic acid dimethyl ester of native enzyme and the Lys-69 mutant are lower than those on the corresponding lecithin, in contrast to the Phe-69 mutant, which has equal activities on both substrates.(ABSTRACT TRUNCATED AT 250 WORDS)

  15. Minimal surfaces

    CERN Document Server

    Dierkes, Ulrich; Sauvigny, Friedrich; Jakob, Ruben; Kuster, Albrecht

    2010-01-01

    Minimal Surfaces is the first volume of a three volume treatise on minimal surfaces (Grundlehren Nr. 339-341). Each volume can be read and studied independently of the others. The central theme is boundary value problems for minimal surfaces. The treatise is a substantially revised and extended version of the monograph Minimal Surfaces I, II (Grundlehren Nr. 295 & 296). The first volume begins with an exposition of basic ideas of the theory of surfaces in three-dimensional Euclidean space, followed by an introduction of minimal surfaces as stationary points of area, or equivalently

  16. Tracking-by-detection of surgical instruments in minimally invasive surgery via the convolutional neural network deep learning-based method.

    Science.gov (United States)

    Zhao, Zijian; Voros, Sandrine; Weng, Ying; Chang, Faliang; Li, Ruijian

    2017-12-01

    Worldwide propagation of minimally invasive surgeries (MIS) is hindered by their drawback of indirect observation and manipulation, while monitoring of surgical instruments moving in the operated body, as required by surgeons, is a challenging problem. Tracking of surgical instruments by vision-based methods is quite attractive, due to its flexible implementation via software-based control with no need to modify instruments or the surgical workflow. An MIS instrument is conventionally split into shaft and end-effector portions, and a 2D/3D tracking-by-detection framework is proposed which performs shaft tracking followed by end-effector tracking. The former portion is described by line features via the RANSAC scheme, while the latter is depicted by special image features based on deep learning through a well-trained convolutional neural network. The method is verified in 2D and 3D formulations through experiments on ex-vivo video sequences, while qualitative validation on in-vivo video sequences is obtained. The proposed method provides robust and accurate tracking, which is confirmed by the experimental results: its 3D performance in ex-vivo video sequences exceeds that of the available state-of-the-art methods. Moreover, the experiments on in-vivo sequences demonstrate that the proposed method can tackle the difficult condition of tracking with unknown camera parameters. Further refinements of the method will address occlusion and multi-instrument MIS applications.
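The shaft-tracking step relies on line features fitted via the RANSAC scheme. A minimal 2D RANSAC line fit might look as follows; the points are synthetic edge samples, not surgical video data, and the thresholds are illustrative.

```python
import numpy as np

def ransac_line(points, n_iter=200, inlier_tol=2.0, rng=None):
    """Fit a 2D line to noisy shaft-edge points with RANSAC: repeatedly
    sample two points, form the line through them, and keep the model
    with the most inliers (perpendicular distance < inlier_tol pixels)."""
    rng = rng or np.random.default_rng(0)
    best_inliers = None
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.linalg.norm(d)
        if norm < 1e-9:
            continue
        n = np.array([-d[1], d[0]]) / norm          # unit normal to the line
        dist = np.abs((points - p) @ n)             # point-to-line distances
        inliers = dist < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refine on the inlier set with total least squares via SVD.
    pts = points[best_inliers]
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    return centroid, vt[0]                          # point on line, direction

# Synthetic shaft edge: points along y = 0.5 x + 10, plus a few outliers.
rng = np.random.default_rng(1)
x = np.linspace(0, 100, 60)
pts = np.column_stack([x, 0.5 * x + 10 + rng.normal(0, 0.5, 60)])
pts = np.vstack([pts, rng.uniform(0, 100, (8, 2))])  # clutter/outliers
c, d = ransac_line(pts, rng=rng)
print(c, d)
```

The two-point sampling makes the fit robust to the clutter points that a least-squares fit over all points would be dragged toward.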

  17. Minimal methylation classifier (MIMIC): A novel method for derivation and rapid diagnostic detection of disease-associated DNA methylation signatures.

    Science.gov (United States)

    Schwalbe, E C; Hicks, D; Rafiee, G; Bashton, M; Gohlke, H; Enshaei, A; Potluri, S; Matthiesen, J; Mather, M; Taleongpong, P; Chaston, R; Silmon, A; Curtis, A; Lindsey, J C; Crosier, S; Smith, A J; Goschzik, T; Doz, F; Rutkowski, S; Lannering, B; Pietsch, T; Bailey, S; Williamson, D; Clifford, S C

    2017-10-18

    Rapid and reliable detection of disease-associated DNA methylation patterns has major potential to advance molecular diagnostics and underpin research investigations. We describe the development and validation of minimal methylation classifier (MIMIC), combining CpG signature design from genome-wide datasets, multiplex-PCR and detection by single-base extension and MALDI-TOF mass spectrometry, in a novel method to assess multi-locus DNA methylation profiles within routine clinically-applicable assays. We illustrate the application of MIMIC to successfully identify the methylation-dependent diagnostic molecular subgroups of medulloblastoma (the most common malignant childhood brain tumour), using scant/low-quality samples remaining from the most recently completed pan-European medulloblastoma clinical trial, refractory to analysis by conventional genome-wide DNA methylation analysis. Using this approach, we identify critical DNA methylation patterns from previously inaccessible cohorts, and reveal novel survival differences between the medulloblastoma disease subgroups with significant potential for clinical exploitation.
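Once a small CpG signature has been measured, subgroup assignment can be as simple as a nearest-centroid call on methylation beta-values (0 = unmethylated, 1 = fully methylated). The signature CpGs and centroids below are invented for illustration and are not MIMIC's actual signature, which is derived from genome-wide training data.

```python
import numpy as np

# Invented subgroup centroids over a 4-CpG toy signature.
centroids = {
    "WNT":    np.array([0.9, 0.1, 0.8, 0.2]),
    "SHH":    np.array([0.2, 0.8, 0.7, 0.6]),
    "Group3": np.array([0.1, 0.3, 0.2, 0.9]),
}

def call_subgroup(beta):
    """Assign a sample to the subgroup with the nearest centroid."""
    return min(centroids, key=lambda g: np.linalg.norm(beta - centroids[g]))

sample = np.array([0.85, 0.15, 0.75, 0.25])   # resembles the WNT centroid
label = call_subgroup(sample)
print(label)   # → WNT
```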

  18. Development of Bi-phase sodium-oxygen-hydrogen chemical equilibrium calculation program (BISHOP) using Gibbs free energy minimization method

    International Nuclear Information System (INIS)

    Okano, Yasushi

    1999-08-01

    In order to analyze the reaction heat and compounds produced by sodium combustion, a multiphase chemical equilibrium calculation program for chemical reactions among sodium, oxygen, and hydrogen was developed in this study. The program is named BISHOP, which denotes 'Bi-phase Sodium-Oxygen-Hydrogen chemical equilibrium calculation Program'. The Gibbs free energy minimization method is used because of its particular merits: chemical species can easily be added and changed, and many thermochemical reaction systems can be treated generally, beyond the constant-temperature, constant-pressure case. Three new methods are developed for solving the multiphase sodium reaction system in this study: constructing the equation system by simplifying the phases; extending the Gibbs free energy minimization method to a multiphase system; and establishing an effective search method for the minimum value. Chemical compounds formed by the combustion of sodium in air are calculated using BISHOP. The calculated temperature and moisture conditions where sodium oxide and hydroxide are formed qualitatively agree with experiments. Decomposition of sodium hydride is also calculated by the program. The estimated relationship between the decomposition temperature and pressure agrees closely with the well-known experimental equation of Roy and Rodgers. It is concluded that BISHOP can be used to evaluate the combustion and decomposition behaviors of sodium and its compounds. The hydrogen formation condition of the dump-tank room during a sodium leak event in an FBR is quantitatively evaluated by BISHOP. It can be concluded that keeping the temperature of the dump-tank room lower is an effective way to suppress the formation of hydrogen. In case the lower flammability limit of 4.1 mol% is chosen as the hydrogen concentration criterion, the formation reaction of sodium hydride from sodium and hydrogen is facilitated below a room temperature of 800 K, and the concentration of hydrogen
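The core computation, minimizing total Gibbs free energy subject to element-balance constraints, can be sketched on a toy ideal-gas H2/O2/H2O system. This is a hedged illustration: it uses SciPy's SLSQP solver rather than BISHOP's own search method, and the dimensionless standard potentials are invented values, not data from BISHOP's thermochemical library.

```python
import numpy as np
from scipy.optimize import minimize

# Minimize G/RT = sum_i n_i (mu0_i + ln(n_i / n_total)) at fixed T, P,
# subject to conservation of H and O atoms.
species = ["H2", "O2", "H2O"]
mu0 = np.array([0.0, 0.0, -35.0])      # illustrative: strongly favours H2O
A = np.array([[2, 0, 2],               # H atoms per molecule
              [0, 2, 1]])              # O atoms per molecule
b = A @ np.array([1.0, 0.6, 0.0])      # start from 1 mol H2 + 0.6 mol O2

def gibbs(n):
    n = np.clip(n, 1e-12, None)        # keep the logarithms defined
    return float(n @ (mu0 + np.log(n / n.sum())))

cons = {"type": "eq", "fun": lambda n: A @ n - b}
res = minimize(gibbs, x0=[0.4, 0.3, 0.4], method="SLSQP",
               constraints=cons, bounds=[(1e-12, None)] * 3)
print(dict(zip(species, res.x.round(4))))
```

With the chosen potentials the hydrogen is almost entirely converted to water, leaving the excess oxygen; the same machinery, with real multiphase thermochemical data, underlies the sodium-oxygen-hydrogen equilibria BISHOP computes.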

  19. Effective teaching methods in higher education: requirements and barriers

    Directory of Open Access Journals (Sweden)

    NAHID SHIRANI BIDABADI

    2016-10-01

    Full Text Available Introduction: Teaching is one of the main components in educational planning, which is a key factor in conducting educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. Methods: This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 of them from the best professors in the country and 7 from the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that began from semantic units or direct quotations and moved to general themes. Results: According to the results of this study, the best teaching approach is the mixed method (student-centered together with teacher-centered) plus educational planning and previous readiness. But teachers who want to teach using this method confront some barriers and requirements; some of these requirements concern professors' behavior and some concern professors' outlook. Also, there are some major barriers, some of which are associated with the professors' practice and others with laws and regulations. Implications of these findings for teachers' preparation in education are discussed. Conclusion: In the present study, it was illustrated that a good teaching method helps the students to question their preconceptions, and motivates them to learn, by putting them in a situation in which they come to see themselves as the authors of answers, as the agents of responsibility for change. But training through this method has some barriers and requirements. To have effective teaching, the faculty members of the universities

  20. MADM Technique Integrated with Grey-based Taguchi Method for Selection of Aluminium Alloys to Minimize Deburring Cost during Drilling

    Directory of Open Access Journals (Sweden)

    Reddy Sreenivasulu

    2015-06-01

    Full Text Available Traditionally, burr problems had been considered unavoidable, so most efforts had been made on removal of the burr as a post process. Nowadays, the trend in manufacturing is integration of the whole production flow from design to end product, and manufacturing problems are handled at various stages, even from the design stage. Therefore, methods of describing the burr have been getting much attention in recent years as part of a systematic approach to resolving the burr problem at various manufacturing stages. The main objective of this paper is to explore the basic concepts of MADM methods. In this study, five parameters, namely speed, feed, drill size, and drill geometry (point angle and clearance angle), were identified as influencing burr formation most during drilling. An L18 orthogonal array was selected and experiments were conducted as per the Taguchi experimental plan for aluminium alloys of the 2014, 6061, 5035 and 7075 series. The experiments were performed on a CNC machining center with HSS twist drills. The burr size (height and thickness) was measured at the exit of each hole. An optimal combination of process parameters was obtained to minimize the burr size via grey relational analysis. The output from the grey-based Taguchi method was fed as input to the MADM. Apart from burr size, strength and temperature are also considered as attributes. Finally, the results generated in MADM suggest the suitable aluminium alloy alternative, which results in less deburring cost, high strength and high resistance at elevated temperatures.
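The grey relational analysis step can be sketched in three moves: normalize the smaller-the-better responses, compute grey relational coefficients against the ideal sequence, and average them into a grade per run. The burr measurements below are illustrative numbers, not the paper's experimental data.

```python
import numpy as np

# Hypothetical burr height and burr thickness (mm) for four experimental
# runs; both responses are smaller-the-better.
responses = np.array([[0.42, 0.11],
                      [0.35, 0.09],
                      [0.51, 0.14],
                      [0.28, 0.08]])

# Step 1: normalise each column for smaller-the-better characteristics.
lo, hi = responses.min(axis=0), responses.max(axis=0)
norm = (hi - responses) / (hi - lo)

# Step 2: grey relational coefficients against the ideal sequence (all 1s),
# with the customary distinguishing coefficient zeta = 0.5.
delta = 1.0 - norm
zeta = 0.5
coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# Step 3: grey relational grade = mean coefficient per run; rank the runs.
grade = coeff.mean(axis=1)
print(grade.round(3), "best run:", int(grade.argmax()) + 1)
```

The run with the smallest burrs on both responses attains the maximum grade of 1.0, and the grades provide the single ranking that the Taguchi plan is then optimized against.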

  1. Effective Teaching Methods in Higher Education: Requirements and Barriers.

    Science.gov (United States)

    Shirani Bidabadi, Nahid; Nasr Isfahani, Ahmmadreza; Rouhollahi, Amir; Khalili, Roya

    2016-10-01

    Teaching is one of the main components of educational planning and a key factor in carrying out educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 of them from the best professors in the country and 7 from the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that moved from semantic units, or direct quotations, to general themes. According to the results of this study, the best teaching approach is a mixed method (student-centered together with teacher-centered) plus educational planning and prior preparation. However, teachers who want to teach with this method confront certain barriers and requirements; some of these requirements are prerequisites of professors' behavior and some of professors' outlook. There are also major barriers, some of which are associated with professors' performance and others with laws and regulations. Implications of these findings for teacher preparation in education are discussed. In the present study, it was illustrated that a good teaching method helps students question their preconceptions and motivates them to learn by putting them in a situation in which they come to see themselves as the authors of answers, as the agents of responsibility for change. But teaching through this method has some barriers and requirements. To achieve effective teaching, the faculty members of the universities should be aware of these barriers and requirements as a way to

  2. Stochastic LMP (Locational marginal price) calculation method in distribution systems to minimize loss and emission based on Shapley value and two-point estimate method

    International Nuclear Information System (INIS)

    Azad-Farsani, Ehsan; Agah, S.M.M.; Askarian-Abyaneh, Hossein; Abedi, Mehrdad; Hosseinian, S.H.

    2016-01-01

    LMP (Locational marginal price) calculation is a serious impediment in distribution operation when private DG (distributed generation) units are connected to the network. A novel policy is developed in this study to guide distribution company (DISCO) to exert its control over the private units when power loss and green-house gases emissions are minimized. LMP at each DG bus is calculated according to the contribution of the DG to the reduced amount of loss and emission. An iterative algorithm which is based on the Shapley value method is proposed to allocate loss and emission reduction. The proposed algorithm will provide a robust state estimation tool for DISCOs in the next step of operation. The state estimation tool provides the decision maker with the ability to exert its control over private DG units when loss and emission are minimized. Also, a stochastic approach based on the PEM (point estimate method) is employed to capture uncertainty in the market price and load demand. The proposed methodology is applied to a realistic distribution network, and efficiency and accuracy of the method are verified. - Highlights: • Reduction of the loss and emission at the same time. • Fair allocation of loss and emission reduction. • Estimation of the system state using an iterative algorithm. • Ability of DISCOs to control DG units via the proposed policy. • Modeling the uncertainties to calculate the stochastic LMP.
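The Shapley-value allocation underlying the proposed LMP policy can be sketched as follows. The characteristic function here is a hypothetical loss-reduction table; in the paper these values would come from power-flow calculations for each coalition of DG units:

```python
# Minimal sketch of Shapley-value allocation of a jointly achieved
# loss reduction among DG units. The coalition values are hypothetical.

from itertools import permutations

def shapley(players, v):
    """Average marginal contribution of each player over all orderings."""
    shares = {p: 0.0 for p in players}
    perms = list(permutations(players))
    for order in perms:
        coalition = set()
        for p in order:
            before = v(frozenset(coalition))
            coalition.add(p)
            shares[p] += v(frozenset(coalition)) - before
    n = len(perms)
    return {p: s / n for p, s in shares.items()}

# Hypothetical loss reduction (kW) achieved by each coalition of DGs:
reduction = {frozenset(): 0.0,
             frozenset({'DG1'}): 10.0,
             frozenset({'DG2'}): 6.0,
             frozenset({'DG1', 'DG2'}): 20.0}
shares = shapley(['DG1', 'DG2'], reduction.__getitem__)
# Shares are efficient: they sum to the grand-coalition reduction (20 kW).
```

Because the allocation averages marginal contributions over every arrival order, each DG is credited fairly for the synergy it brings, which is the basis for the per-bus LMP in the abstract.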

  3. Development of a minimal growth medium for Lactobacillus plantarum

    NARCIS (Netherlands)

    Wegkamp, H.B.A.; Teusink, B.; Vos, de W.M.; Smid, E.J.

    2010-01-01

    Aim: A medium with minimal requirements for the growth of Lactobacillus plantarum WCFS was developed. The composition of the minimal medium was compared to a genome-scale metabolic model of L. plantarum. Methods and Results: By repetitive single omission experiments, two minimal media were

  4. Latency in Visionic Systems: Test Methods and Requirements

    Science.gov (United States)

    Bailey, Randall E.; Arthur, J. J., III; Williams, Steven P.; Kramer, Lynda J.

    2005-01-01

    A visionics device creates a pictorial representation of the external scene for the pilot. The ultimate objective of these systems may be to electronically generate a form of Visual Meteorological Conditions (VMC) to eliminate weather or time-of-day as an operational constraint, and to provide enhancement over actual visual conditions where eye-limiting resolution may be a limiting factor. Empirical evidence has shown that the total system delays, or latencies, including those of the imaging sensors and display systems, can critically degrade their utility, usability, and acceptability. Definitions and measurement techniques are offered herein as common test and evaluation methods for latency testing in visionics device applications. Based upon available data, very different latency requirements are indicated depending upon the piloting task, the role in which the visionics device is used in this task, and the characteristics of the visionics cockpit display device, including its resolution, field-of-regard, and field-of-view. The least stringent latency requirements will involve Head-Up Display (HUD) applications, where the visionics imagery provides situational information as a supplement to symbology guidance and command information. Conversely, the visionics system latency requirement for a large field-of-view Head-Worn Display application, providing a Virtual-VMC capability from which the pilot will derive visual guidance, will be the most stringent, having a value as low as 20 msec.

  5. Visual Occlusion During Minimally Invasive Surgery: A Contemporary Review of Methods to Reduce Laparoscopic and Robotic Lens Fogging and Other Sources of Optical Loss.

    Science.gov (United States)

    Manning, Todd G; Perera, Marlon; Christidis, Daniel; Kinnear, Ned; McGrath, Shannon; O'Beirne, Richard; Zotov, Paul; Bolton, Damien; Lawrentschuk, Nathan

    2017-04-01

    Maintenance of optimal vision during minimally invasive surgery is crucial to maintaining operative awareness, efficiency, and safety. Hampered vision is commonly caused by laparoscopic lens fogging (LLF), which has prompted the development of various antifogging fluids and warming devices. However, limited comparative evidence exists in the contemporary literature. Despite technologic advancements, there remains no consensus as to superior methods to prevent LLF or restore visual acuity once LLF has occurred. We performed a review of the literature to present the current body of evidence supporting the use of numerous techniques. A standardized Preferred Reporting Items for Systematic Reviews and Meta-Analyses review was performed, and PubMed, Embase, Web of Science, and Google Scholar were searched. Articles pertaining to the mechanisms and prevention of LLF were reviewed. We applied no limit to year of publication or publication type, and all articles encountered were included in the final review. Limited original research and heterogeneous outcome measures precluded meta-analytical assessment. Vision loss has a multitude of causes, and although scientific theory can be applied to in vivo environments, no authors have completely characterized this complex problem. No method to prevent or correct LLF was identified as superior to others, and comparative evidence is minimal. Robotic LLF was poorly investigated and, aside from a single analysis, has not been directly compared to standard laparoscopic fogging in any capacity. Obscured vision during surgery is hazardous and typically caused by LLF. The etiology of LLF, despite application of scientific theory, is yet to be definitively proven in the in vivo environment. Common methods of preventing LLF or restoring vision lost to LLF have little evidence-based data to support their use. A multiarm comparative in vivo analysis is required to formally assess these commonly used techniques in both standard and robotic laparoscopes.

  6. Reduction in requirements for allogeneic blood products: nonpharmacologic methods.

    Science.gov (United States)

    Hardy, J F; Bélisle, S; Janvier, G; Samama, M

    1996-12-01

    Various strategies have been proposed to decrease bleeding and allogeneic transfusion requirements during and after cardiac operations. This article attempts to document the usefulness, or lack thereof, of the nonpharmacologic methods available in clinical practice. Blood conservation methods were reviewed in chronologic order, as they become available to patients during the perisurgical period. The literature in support of or against each strategy was reexamined critically. Avoidance of preoperative anemia and adherence to published guidelines for the practice of transfusion are of paramount importance. Intraoperatively, tolerance of low hemoglobin concentrations and use of autologous blood (predonated or harvested before bypass) will reduce allogeneic transfusions. The usefulness of plateletpheresis and retransfusion of shed mediastinal fluid remains controversial. Intraoperatively and postoperatively, maintenance of normothermia contributes to improved hemostasis. Several approaches have been shown to be effective. An efficient combination of methods can reduce, and sometimes abolish, the need for allogeneic blood products after cardiac operations, inasmuch as all those involved in the care of cardiac surgical patients adhere thoughtfully to existing transfusion guidelines.

  7. Guidelines, minimal requirements and standard of cancer care around the Mediterranean Area: report from the Collaborative AROME (Association of Radiotherapy and Oncology of the Mediterranean Area) working parties.

    Science.gov (United States)

    2011-04-01

    Guidelines are produced in oncology to facilitate clinical decision making and improve clinical practice. However, existing guidelines are mainly developed for countries with a certain availability of means, and cultural aspects are rarely taken into account. Around the Mediterranean Area, countries share common cultural backgrounds but also great disparities with respect to availability of means; current guidelines by most societies are not applicable to all of those countries. The Association of Radiotherapy and Oncology of the Mediterranean Area (AROME) is a scientific organization for the promotion of oncology and the overcoming of inequalities in clinical practice around the Mediterranean Area. In an effort to accomplish this goal, members of the AROME society have developed clinical recommendations for the most common cancer sites in countries around the Mediterranean Area. The structure of these recommendations rests on the concept of minimal requirements vs. standard of care; they are presented and discussed in the main text. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  8. Minimal requirements for quality controls in radiotherapy with external beams; Controlli di qualita' essenziali in radioterapia con fasci esterni

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-07-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not lay down strict rules or firm recommendations, it suggests the minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy. [Italian original:] The 'Quality Assurance in Radiotherapy' study group of the Istituto Superiore di Sanita presents guidelines for drafting the essential quality control protocols needed to guarantee an adequate level of accuracy of the radiation treatment; the document thus constitutes an essential part of the overall physical-dosimetric contribution to quality assurance in external beam radiotherapy.

  9. Minimizing Mutual Coupling

    DEFF Research Database (Denmark)

    2010-01-01

    Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.

  10. Organisational reviews - requirements, methods and experience. Progress report 2006

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2007-04-01

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  11. Organisational reviews - requirements, methods and experience. Progress report 2006

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [VTT, Technical Research Centre of Finland (Finland); Rollenhagen, C.; Kahlbom, U. [Maelardalen University (FI)

    2007-04-15

    Organisational reviews are important instruments in the continuous quest for improved performance. In the nuclear field there has been an increasing regulatory interest in organisational performance, because incidents and accidents often point to organisational deficiencies as one of the major precursors. Many methods for organisational reviews have been proposed, but they are mostly based on ad hoc approaches to specific problems. The absence of well-established techniques for organisational reviews has already shown to cause discussions and controversies on different levels. The aim of the OrRe project is to collect the experiences from organisational reviews carried out so far and to reflect them in a theoretical model of organisational performance. Furthermore, the project aims to reflect on the criteria for the definition of the scope and content of organisational reviews. Finally, recommendations will be made for guidance for people participating in organisational reviews. This progress report describes regulatory practices in Finland and Sweden together with some case examples of organizational reviews and assessment in both countries. Some issues of concern are raised and an outline for the next year's work is proposed. Issues of concern include the sufficient depth of the assessment, the required competence in assessments, data and criteria problems, definition of the boundaries of the system to be assessed, and the necessary internal support and organisational maturity required for successful assessments. Finally, plans for next year's work are outlined. (au)

  12. Similarity measure and topology evolution of foreign exchange markets using dynamic time warping method: Evidence from minimal spanning tree

    Science.gov (United States)

    Wang, Gang-Jin; Xie, Chi; Han, Feng; Sun, Bo

    2012-08-01

    In this study, we employ a dynamic time warping method to study the topology of similarity networks among 35 major currencies in international foreign exchange (FX) markets, measured by the minimal spanning tree (MST) approach, which is expected to overcome the synchronous restriction of the Pearson correlation coefficient. In the empirical process, firstly, we subdivide the analysis period from June 2005 to May 2011 into three sub-periods: before, during, and after the US sub-prime crisis. Secondly, we choose NZD (New Zealand dollar) as the numeraire and then analyze the topology evolution of FX markets in terms of the structural changes of the MSTs during the above periods. We also present the hierarchical tree associated with the MST to study the currency clusters in each sub-period. Our results confirm that USD and EUR are the predominant world currencies, but USD gradually loses the most central position while EUR acts as a stable center of the MST through the crisis. Furthermore, an interesting finding is that, after the crisis, SGD (Singapore dollar) becomes a new center currency for the network.
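The two building blocks of the analysis, a dynamic time warping distance and a minimal spanning tree, can be sketched as follows. The series are synthetic toy sequences, not FX returns, and Prim's algorithm stands in for whichever MST construction the authors used:

```python
# Sketch: DTW distance between series, then a minimal spanning tree
# via Prim's algorithm on the resulting distance matrix.

def dtw(a, b):
    """Classic O(len(a) * len(b)) dynamic time warping distance."""
    INF = float('inf')
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def prim_mst(dist):
    """Edges of a minimal spanning tree for a dense distance matrix."""
    n = len(dist)
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        i, j = min(((i, j) for i in in_tree for j in range(n)
                    if j not in in_tree), key=lambda e: dist[e[0]][e[1]])
        in_tree.add(j)
        edges.append((i, j))
    return edges

# Toy "currencies": A and B are the same shape at different speeds,
# C is unrelated, so the tree joins A-B first and attaches C to A.
series = {'A': [0, 1, 2, 1, 0], 'B': [0, 1, 2, 2, 1, 0], 'C': [5, 5, 5, 5, 5]}
names = list(series)
dist = [[dtw(series[x], series[y]) for y in names] for x in names]
tree = prim_mst(dist)
```

Because DTW warps the time axis, A and B get distance zero despite their different lengths, which is exactly the asynchronicity the Pearson coefficient cannot handle.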

  13. Quantitative Nuclear Medicine Imaging: Concepts, Requirements and Methods

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2014-01-15

    The absolute quantification of radionuclide distribution has been a goal since the early days of nuclear medicine. Nevertheless, the apparent complexity and sometimes limited accuracy of these methods have prevented them from being widely used in important applications such as targeted radionuclide therapy or kinetic analysis. The intricacy of the effects degrading nuclear medicine images and the lack of availability of adequate methods to compensate for these effects have frequently been seen as insurmountable obstacles in the use of quantitative nuclear medicine in clinical institutions. In the last few decades, several research groups have consistently devoted their efforts to the filling of these gaps. As a result, many efficient methods are now available that make quantification a clinical reality, provided appropriate compensation tools are used. Despite these efforts, many clinical institutions still lack the knowledge and tools to adequately measure and estimate the accumulated activities in the human body, thereby using potentially outdated protocols and procedures. The purpose of the present publication is to review the current state of the art of image quantification and to provide medical physicists and other related professionals facing quantification tasks with a solid background of tools and methods. It describes and analyses the physical effects that degrade image quality and affect the accuracy of quantification, and describes methods to compensate for them in planar, single photon emission computed tomography (SPECT) and positron emission tomography (PET) images. The fast-paced development of the computational infrastructure, both hardware and software, has made drastic changes in the ways image quantification is now performed. The measuring equipment has evolved from simple blind probes to planar and three-dimensional imaging, supported by SPECT, PET and hybrid equipment. Methods of iterative reconstruction have been developed to allow for

  14. Legal incentives for minimizing waste

    International Nuclear Information System (INIS)

    Clearwater, S.W.; Scanlon, J.M.

    1991-01-01

    Waste minimization, or pollution prevention, has become an integral component of federal and state environmental regulation. Minimizing waste offers many economic and public relations benefits. In addition, waste minimization efforts can also dramatically reduce potential criminal liability. This paper addresses the legal incentives for minimizing waste under current and proposed environmental laws and regulations

  15. Cost-Effective Method for Free-Energy Minimization in Complex Systems with Elaborated Ab Initio Potentials.

    Science.gov (United States)

    Bistafa, Carlos; Kitamura, Yukichi; Martins-Costa, Marilia T C; Nagaoka, Masataka; Ruiz-López, Manuel F

    2018-05-22

    We describe a method to locate stationary points in the free-energy hypersurface of complex molecular systems using high-level correlated ab initio potentials. In this work, we assume a combined QM/MM description of the system although generalization to full ab initio potentials or other theoretical schemes is straightforward. The free-energy gradient (FEG) is obtained as the mean force acting on relevant nuclei using a dual level strategy. First, a statistical simulation is carried out using an appropriate, low-level quantum mechanical force-field. Free-energy perturbation (FEP) theory is then used to obtain the free-energy derivatives for the target, high-level quantum mechanical force-field. We show that this composite FEG-FEP approach is able to reproduce the results of a standard free-energy minimization procedure with high accuracy, while simultaneously allowing for a drastic reduction of both computational and wall-clock time. The method has been applied to study the structure of the water molecule in liquid water at the QCISD/aug-cc-pVTZ level of theory, using the sampling from QM/MM molecular dynamics simulations at the B3LYP/6-311+G(d,p) level. The obtained values for the geometrical parameters and for the dipole moment of the water molecule are within the experimental error, and they also display an excellent agreement when compared to other theoretical estimations. The developed methodology represents therefore an important step toward the accurate determination of the mechanism, kinetics, and thermodynamic properties of processes in solution, in enzymes, and in other disordered chemical systems using state-of-the-art ab initio potentials.
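The free-energy-perturbation reweighting at the heart of the FEG-FEP scheme can be sketched as follows. The toy model (identical low- and high-level potentials, a harmonic force) is an assumption chosen so the answer is known; it only illustrates the Zwanzig-style reweighted mean force, not the QM/MM machinery:

```python
# Sketch of the free-energy-perturbation reweighting step: the mean
# force for a target (high-level) potential is estimated from samples
# generated with a cheaper low-level potential. Numbers are synthetic.

import math, random

def reweighted_mean_force(samples, f_high, du, beta):
    """<F_high e^{-beta dU}>_low / <e^{-beta dU}>_low (Zwanzig reweighting)."""
    weights = [math.exp(-beta * du(x)) for x in samples]
    num = sum(w * f_high(x) for w, x in zip(weights, samples))
    return num / sum(weights)

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(200000)]
# Toy model: identical potentials (dU = 0), harmonic force F = -k x.
mf = reweighted_mean_force(samples, lambda x: -2.0 * x, lambda x: 0.0, 1.0)
# With dU = 0 the estimate reduces to the plain sample mean of the force,
# which vanishes at the free-energy minimum x = 0.
```

In the actual method, dU would be the energy difference between the QCISD and B3LYP force fields for each sampled configuration, and the reweighted mean force drives the geometry update of the free-energy minimization.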

  16. A chord error conforming tool path B-spline fitting method for NC machining based on energy minimization and LSPIA

    OpenAIRE

    He, Shanshan; Ou, Daojiang; Yan, Changya; Lee, Chen-Han

    2015-01-01

    Piecewise linear (G01-based) tool paths generated by CAM systems lack G1 and G2 continuity. The discontinuity causes vibration and unnecessary hesitation during machining. To ensure efficient high-speed machining, a method to improve the continuity of the tool paths is required, such as B-spline fitting that approximates G01 paths with B-spline curves. Conventional B-spline fitting approaches cannot be directly used for tool path B-spline fitting, because they have shortages such as numerical...

  17. A chord error conforming tool path B-spline fitting method for NC machining based on energy minimization and LSPIA

    Directory of Open Access Journals (Sweden)

    Shanshan He

    2015-10-01

    Full Text Available Piecewise linear (G01-based) tool paths generated by CAM systems lack G1 and G2 continuity. The discontinuity causes vibration and unnecessary hesitation during machining. To ensure efficient high-speed machining, a method to improve the continuity of the tool paths is required, such as B-spline fitting that approximates G01 paths with B-spline curves. Conventional B-spline fitting approaches cannot be directly used for tool path B-spline fitting, because they have shortcomings such as numerical instability, lack of a chord error constraint, and lack of assurance of a usable result. Progressive and Iterative Approximation for Least Squares (LSPIA) is an efficient method for data fitting that solves the numerical instability problem. However, it does not consider chord errors and needs more work to ensure ironclad results for commercial applications. In this paper, we use the LSPIA method incorporating an energy term (ELSPIA) to avoid numerical instability, and lower chord errors by using the stretching energy term. We implement several algorithm improvements, including (1) an improved technique for initial control point determination over the Dominant Point Method, (2) an algorithm that updates foot point parameters as needed, (3) analysis of the degrees of freedom of control points to insert new control points only when needed, and (4) chord error refinement using a similar ELSPIA method with the above enhancements. The proposed approach can generate a shape-preserving B-spline curve. Experiments with data analysis and machining tests are presented for verification of quality and efficiency. Comparisons with other known solutions are included to evaluate the worthiness of the proposed solution.
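The core LSPIA update can be sketched on a fixed collocation matrix as follows. The 3x2 system is an illustrative assumption (real tool-path fitting would use B-spline basis values in A), and the energy term of ELSPIA is omitted for brevity:

```python
# Sketch of the LSPIA iteration on a fixed collocation matrix A: control
# points move along A^T times the residual, converging to the
# least-squares fit when the step size mu is below 2 / lambda_max(A^T A).

def lspia(A, Q, mu, iters=2000):
    """Progressive-iterative least-squares fit: P <- P + mu * A^T (Q - A P)."""
    m, n = len(A), len(A[0])
    P = [0.0] * n
    for _ in range(iters):
        resid = [Q[i] - sum(A[i][j] * P[j] for j in range(n)) for i in range(m)]
        for j in range(n):
            P[j] += mu * sum(A[i][j] * resid[i] for i in range(m))
    return P

# Tiny illustrative system: three data points, two basis functions.
A = [[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]]
Q = [0.0, 1.0, 2.0]
P = lspia(A, Q, mu=0.5)
# P converges to the normal-equation solution of min ||A P - Q||^2,
# here P = [0, 2].
```

The appeal of the iteration is that it never forms or inverts A^T A, only repeated matrix-vector products, which is what makes it numerically stable for large point sets.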

  18. Required doses for projection methods in X-ray diagnosis

    International Nuclear Information System (INIS)

    Hagemann, G.

    1992-01-01

    The ideal dose requirement was stated by Cohen et al. (1981) in a formula based on a parallel beam, maximum quantum yield and the Bucky grid effect, depending on the signal-to-noise ratio and object contrast. This was checked by means of contrast-detail diagrams measured on the hole phantom, and was additionally compared with measurement results obtained with acrylic glass phantoms. The optimal dose requirement is obtained by the maximum technically possible approach to the ideal requirement level. Examples are given, among others, for X-ray equipment with Gd2O2S screen-film systems for grid-screen mammography and for new thoracic examination systems for mass screenings. Finally, a few values are stated concerning the dose requirement, or the analogous time required, for fluorescent screening in angiography and interventional radiology, as well as for dentistry and paediatric X-ray diagnostics. (orig./HP) [de]

  19. Thermodynamic analysis of ethanol/water system in a fuel cell reformer with the Gibbs energy minimization method

    International Nuclear Information System (INIS)

    Lima da Silva, Aline; De Fraga Malfatti, Celia; Heck, Nestor Cesar

    2003-01-01

    The use of fuel cells is a promising technology in the conversion of chemical to electrical energy. Due to environmental concerns related to the reduction of atmospheric pollution and greenhouse gas emissions such as CO2, NOx and hydrocarbons, there has been much research on fuel cells using hydrogen as fuel. Hydrogen gas can be produced by several routes; a promising one is the steam reforming of ethanol. This route may become an important industrial process, especially for sugarcane-producing countries. Ethanol is a renewable energy source and presents several advantages over other sources related to natural availability, storage and handling safety. In order to contribute to the understanding of the steam reforming of ethanol inside the reformer, this work presents a detailed thermodynamic analysis of the ethanol/water system, in the temperature range of 500-1200 K, considering different H2O/ethanol reforming ratios. The equilibrium determinations were carried out with the help of the Gibbs energy minimization method using the Generalized Reduced Gradient (GRG) algorithm. Based on literature data, the species considered in the calculations were: H2, H2O, CO, CO2, CH4, C2H4, CH3CHO, C2H5OH (gas phase) and C(gr) (graphite phase). The thermodynamic conditions for carbon deposition (probably soot) on the catalyst during gas reforming were analyzed, in order to establish the temperature ranges and H2O/ethanol ratios where carbon precipitation is not thermodynamically feasible. Experimental results from the literature show that carbon deposition causes catalyst deactivation during reforming. This deactivation is due to encapsulating carbon that covers the active phases on a catalyst substrate, e.g. Ni over Al2O3. In the present study, a mathematical relationship between the Lagrange multipliers and the carbon activity (with reference to the graphite phase) was deduced, unveiling the carbon activity in the reformer atmosphere. From this, it is possible to foresee if soot
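The Gibbs energy minimization idea can be sketched on a one-reaction toy problem, the water-gas shift, rather than the full ethanol/water species set. The Delta-G0 value is an assumed round number, and golden-section search stands in for the GRG algorithm used in the paper:

```python
# Toy Gibbs-energy minimisation for a single reaction (water-gas shift,
# CO + H2O <-> CO2 + H2) at fixed T and P = 1 bar with an equimolar
# feed. The standard reaction Gibbs energy dG0 is an assumed value.

import math

R, T = 8.314, 900.0           # J/(mol K), K
dG0 = -5000.0                 # assumed standard reaction Gibbs energy, J/mol

def gibbs(xi):
    """Mixture Gibbs energy (ideal gas) as a function of reaction extent xi."""
    n = {'CO': 1 - xi, 'H2O': 1 - xi, 'CO2': xi, 'H2': xi}
    ntot = sum(n.values())
    g = xi * dG0
    for ni in n.values():
        if ni > 0:
            g += R * T * ni * math.log(ni / ntot)
    return g

# Golden-section search for the minimising extent on (0, 1);
# G(xi) is convex here, so the minimum is unique.
lo, hi = 1e-9, 1 - 1e-9
phi = (math.sqrt(5) - 1) / 2
while hi - lo > 1e-10:
    a, b = hi - phi * (hi - lo), lo + phi * (hi - lo)
    lo, hi = (lo, b) if gibbs(a) < gibbs(b) else (a, hi)
xi_eq = (lo + hi) / 2
# For this stoichiometry the minimum matches the analytic equilibrium
# extent sqrt(K) / (1 + sqrt(K)) with K = exp(-dG0 / (R T)).
```

Setting dG/dxi = 0 recovers the familiar equilibrium-constant condition, which is why minimizing G and solving K-based equations give the same composition; the advantage of direct minimization is that it scales to the many-species, possibly multi-phase system of the abstract.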

  20. Optical Alignment of the Chromospheric Lyman-Alpha SpectroPolarimeter using Sophisticated Methods to Minimize Activities under Vacuum

    Science.gov (United States)

    Giono, G.; Katsukawa, Y.; Ishikawa, R.; Narukage, N.; Kano, R.; Kubo, M.; Ishikawa, S.; Bando, T.; Hara, H.; Suematsu, Y.; hide

    2016-01-01

    The Chromospheric Lyman-Alpha Spectro-Polarimeter (CLASP) is a sounding-rocket instrument developed at the National Astronomical Observatory of Japan (NAOJ) as part of an international collaboration. The instrument's main scientific goal is to achieve polarization measurement of the Lyman-alpha line at 121.56 nm emitted from the solar upper chromosphere and transition region with an unprecedented 0.1% accuracy. For this purpose, the optics are composed of a Cassegrain telescope coated with a "cold mirror" coating optimized for UV reflection and a dual-channel spectrograph allowing for simultaneous observation of the two orthogonal states of polarization. Although the polarization sensitivity is the most important aspect of the instrument, the spatial and spectral resolutions of the instrument are also crucial to observe the chromospheric features and resolve the Lyman-alpha profiles. A precise alignment of the optics is required to ensure the resolutions, but experiments under vacuum conditions are needed since Lyman-alpha is absorbed by air, making the alignment experiments difficult. To bypass this issue, we developed methods to align the telescope and the spectrograph separately in visible light. We will explain these methods and present the results for the optical alignment of the CLASP telescope and spectrograph. We will then discuss the combined performance of both parts to derive the expected resolutions of the instrument, and compare them with the flight observations performed on September 3rd, 2015.

  1. Evaluation of Information Requirements of Reliability Methods in Engineering Design

    DEFF Research Database (Denmark)

    Marini, Vinicius Kaster; Restrepo-Giraldo, John Dairo; Ahmed-Kristensen, Saeema

    2010-01-01

    This paper aims to characterize the information needed to perform methods for robustness and reliability, and to verify their applicability to early design stages. Several methods were evaluated regarding their support for synthesis in engineering design. Of those methods, FMEA, FTA and HAZOP were selected...

  2. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    Energy Technology Data Exchange (ETDEWEB)

    Malinowski, Jacek

    2004-05-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability)
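The quantity the tree algorithm computes, system reliability from known minimal path sets with independent components, can be sketched by plain inclusion-exclusion over path unions (not the paper's decomposition-tree method, which avoids this exponential expansion):

```python
# Sketch: reliability of a coherent system with independent components
# from its minimal path sets, by inclusion-exclusion over path unions.

from itertools import combinations

def reliability(min_paths, p):
    """P(at least one minimal path has all of its components working)."""
    total = 0.0
    for k in range(1, len(min_paths) + 1):
        for subset in combinations(min_paths, k):
            union = set().union(*subset)
            term = 1.0
            for c in union:
                term *= p[c]
            total += (-1) ** (k + 1) * term
    return total

# Series-parallel example: paths {1,2} and {1,3} describe component 1
# in series with a parallel pair {2, 3}.
p = {1: 0.9, 2: 0.8, 3: 0.7}
r = reliability([{1, 2}, {1, 3}], p)
# r = p1 * (1 - (1 - p2)(1 - p3)) = 0.9 * 0.94 = 0.846
```

Inclusion-exclusion over all subsets of paths grows as 2^(number of paths), which is precisely the cost the non-binary decomposition tree of the abstract is designed to reduce.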

  3. A new waste minimization method for the determination of total nonhalogenated volatile organic compounds in TRU wastes

    International Nuclear Information System (INIS)

    Sandoval, W.; Quintana, B.D.; Ortega, L.

    1997-01-01

    As part of the technical support CST-12 provides for a wide variety of defense and nondefense programs within Los Alamos National Laboratory (LANL) and the Department of Energy (DOE) complex, a new waste minimization technique is under development for radiological volatile organic analysis (Hot VOA). Currently, all Hot VOA samples must be run in a glovebox. Several types of samples contain TRU radiological waste in the form of particulates. By prefiltering the samples through a 1.2 micron syringe filter and counting the radioactivity, it has been found that many of the samples can be analyzed outside a glovebox. In the present investigation, the types of Hot VOA samples that can take advantage of this new technique, the volume and types of waste reduced, and the experimental parameters will be discussed. Overall, the radioactive waste generated is minimized

  4. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2004-01-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability)

  5. Defining Requirements and Related Methods for Designing Sensorized Garments

    Directory of Open Access Journals (Sweden)

    Giuseppe Andreoni

    2016-05-01

    Full Text Available Designing smart garments has strong interdisciplinary implications, specifically related to user and technical requirements, but also because of the very different applications they have: medicine, sport and fitness, lifestyle monitoring, workplace and job conditions analysis, etc. This paper aims to discuss some user, textile, and technical issues to be faced in sensorized clothes development. In relation to the user, the main requirements are anthropometric, gender-related, and aesthetic. In terms of these requirements, the user’s age, the target application, and fashion trends cannot be ignored, because they determine the compliance with the wearable system. Regarding textile requirements, functional factors—also influencing user comfort—are elasticity and washability, while more technical properties are the stability of the chemical agents’ effects for preserving the sensors’ efficacy and reliability, and assuring the proper duration of the product for the complete life cycle. From the technical side, the physiological issues are the most important: skin conductance, tolerance, irritation, and the effect of sweat and perspiration are key factors for reliable sensing. Other technical features such as battery size and duration, and the form factor of the sensor collector, should be considered, as they affect aesthetic requirements, which have proven to be crucial, as well as comfort and wearability.

  6. Robust design requirements specification: a quantitative method for requirements development using quality loss functions

    DEFF Research Database (Denmark)

    Pedersen, Søren Nygaard; Christensen, Martin Ebro; Howard, Thomas J.

    2016-01-01

    Product requirements serve many purposes in the product development process. Most importantly, they are meant to capture and facilitate product goals and acceptance criteria, as defined by stakeholders. Accurately communicating stakeholder goals and acceptance criteria can be challenging and more...

  7. 40 CFR 136.6 - Method modifications and analytical requirements.

    Science.gov (United States)

    2010-07-01

    ... modifications and analytical requirements. (a) Definitions of terms used in this section. (1) Analyst means the..., oil and grease, total suspended solids, total phenolics, turbidity, chemical oxygen demand, and.... Except as set forth in paragraph (b)(3) of this section, an analyst may modify an approved test procedure...

  8. A Survey of Requirements Engineering Methods for Pervasive Services

    NARCIS (Netherlands)

    Kolos, L.; van Eck, Pascal; Wieringa, Roelf J.

    Designing and deploying ubiquitous computing systems, such as those delivering large-scale mobile services, still requires large-scale investments in both development effort and infrastructure costs. Therefore, in order to develop the right system, the design process merits a thorough

  9. Sensitivity Analysis of Hydraulic Methods Regarding Hydromorphologic Data Derivation Methods to Determine Environmental Water Requirements

    Directory of Open Access Journals (Sweden)

    Alireza Shokoohi

    2015-07-01

    Full Text Available This paper studies the accuracy of hydraulic methods in determining environmental flow requirements. Despite the vital importance of deriving river cross sectional data for hydraulic methods, few studies have focused on the criteria for deriving this data. The present study shows that the depth of cross section has a meaningful effect on the results obtained from hydraulic methods and that, considering fish as the index species for river habitat analysis, an optimum depth of 1 m should be assumed for deriving information from cross sections. The second important parameter required for extracting the geometric and hydraulic properties of rivers is the selection of an appropriate depth increment; ∆y. In the present research, this parameter was found to be equal to 1 cm. The uncertainty of the environmental discharge evaluation, when allocating water in areas with water scarcity, should be kept as low as possible. The Manning friction coefficient (n is an important factor in river discharge calculation. Using a range of "n" equal to 3 times the standard deviation for the study area, it is shown that the influence of friction coefficient on the estimation of environmental flow is much less than that on the calculation of river discharge.
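The role of the Manning coefficient and the depth increment ∆y can be illustrated with the standard Manning equation, Q = (1/n)·A·R^(2/3)·S^(1/2). The sketch below uses a hypothetical rectangular cross-section; the width, roughness and slope values are illustrative assumptions, not data from the study:

```python
def manning_discharge(depth, b=10.0, n=0.035, S=0.001):
    """Discharge from Manning's equation Q = (1/n) * A * R^(2/3) * S^(1/2)
    for a rectangular channel of width b (m), roughness n, bed slope S."""
    A = b * depth          # flow area, m^2
    P = b + 2.0 * depth    # wetted perimeter, m
    R = A / P              # hydraulic radius, m
    return (1.0 / n) * A * R ** (2.0 / 3.0) * S ** 0.5

# Build a stage-discharge table with the 1 cm depth increment the
# study recommends, up to the 1 m optimum extraction depth:
dy = 0.01
table = [(round(k * dy, 2), manning_discharge(k * dy)) for k in range(1, 101)]
```

Sweeping n over a plausible range (e.g. ±3 standard deviations, as in the study) and recomputing the table shows how roughness uncertainty propagates into the discharge estimate.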

  10. Algorithm for finding minimal cut sets in a fault tree

    International Nuclear Information System (INIS)

    Rosenberg, Ladislav

    1996-01-01

    This paper presents several algorithms that have been used in a computer code for fault-tree analysis by the minimal cut sets method. The main algorithm is a more efficient version of the new CARA algorithm, which finds minimal cut sets using an auxiliary dynamical structure. The presented algorithm enables the search to be limited by defined requirements: by the order of the minimal cut sets, by their number, or both. This algorithm is three to six times faster than the primary version of the CARA algorithm
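The CARA algorithm itself is not described in enough detail here to reproduce, but the underlying task, extracting minimal cut sets from a fault tree, can be sketched with a classic top-down (MOCUS-style) expansion:

```python
def minimal_cut_sets(tree, top):
    """Top-down (MOCUS-style) expansion of a fault tree into minimal cut
    sets. Not the CARA algorithm, just the textbook baseline.

    tree: dict gate -> ("AND"|"OR", [children]); names absent from the
    dict are basic events. Returns a list of frozensets."""
    cuts = [frozenset([top])]
    changed = True
    while changed:
        changed = False
        new_cuts = []
        for cs in cuts:
            gate = next((g for g in cs if g in tree), None)
            if gate is None:          # only basic events left
                new_cuts.append(cs)
                continue
            changed = True
            op, kids = tree[gate]
            rest = cs - {gate}
            if op == "AND":           # AND: all children join the cut set
                new_cuts.append(rest | frozenset(kids))
            else:                     # OR: one new cut set per child
                new_cuts.extend(rest | frozenset([k]) for k in kids)
        cuts = new_cuts
    # Discard non-minimal (superset) cut sets.
    minimal = []
    for cs in sorted(set(cuts), key=len):
        if not any(m <= cs for m in minimal):
            minimal.append(cs)
    return minimal

# TOP fails if (A OR B) AND C:
tree = {"TOP": ("AND", ["G1", "C"]), "G1": ("OR", ["A", "B"])}
mcs = minimal_cut_sets(tree, "TOP")
# minimal cut sets: {A, C} and {B, C}
```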

  11. An efficient method for minimizing a convex separable logarithmic function subject to a convex inequality constraint or linear equality constraint

    Directory of Open Access Journals (Sweden)

    2006-01-01

    Full Text Available We consider the problem of minimizing a convex separable logarithmic function over a region defined by a convex inequality constraint or linear equality constraint, and two-sided bounds on the variables (box constraints). Such problems are interesting from both theoretical and practical points of view because they arise in some mathematical programming problems as well as in various practical problems such as production planning and scheduling, allocation of resources, decision making, and facility location problems. Polynomial algorithms are proposed for solving problems of this form and their convergence is proved. Some examples and results of numerical experiments are also presented.
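For the linear equality constrained case, the separable structure admits a simple multiplier-search solution. The sketch below minimizes -Σ c_i ln(x_i) subject to Σ x_i = b and box constraints by bisecting on the Lagrange multiplier; it illustrates the idea only and is not the polynomial algorithm proposed in the paper:

```python
def separable_log_min(c, b, lo, hi, iters=200):
    """Minimize f(x) = -sum_i c_i*ln(x_i) s.t. sum_i x_i = b and
    lo_i <= x_i <= hi_i, with c_i > 0.

    KKT conditions give x_i(lam) = clip(c_i/lam, lo_i, hi_i), and
    sum_i x_i(lam) is nonincreasing in lam, so we bisect on lam."""
    def x_of(lam):
        return [min(h, max(l, ci / lam)) for ci, l, h in zip(c, lo, hi)]

    lam_lo, lam_hi = 1e-9, 1e9       # bracket assumed to contain the root
    for _ in range(iters):
        lam = 0.5 * (lam_lo + lam_hi)
        if sum(x_of(lam)) > b:
            lam_lo = lam             # total too large -> increase lam
        else:
            lam_hi = lam
    return x_of(0.5 * (lam_lo + lam_hi))

# Toy problem: c = (1, 2, 3), budget b = 6, boxes [0.5, 4].
x = separable_log_min([1.0, 2.0, 3.0], 6.0, [0.5] * 3, [4.0] * 3)
# The unconstrained KKT solution x_i = c_i/lam with lam = 1 gives
# x = (1, 2, 3), which already satisfies the boxes.
```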

  12. Thermal test requirements and their verification by different test methods

    International Nuclear Information System (INIS)

    Droste, B.; Wieser, G.; Probst, U.

    1993-01-01

    The paper discusses the parameters influencing the thermal test conditions for type B-packages. Criteria for different test methods (by analytical as well as by experimental means) will be developed. A comparison of experimental results from fuel oil pool and LPG fire tests will be given. (J.P.N.)

  13. Quantification of Listeria monocytogenes in minimally processed leafy vegetables using a combined method based on enrichment and 16S rRNA real-time PCR.

    Science.gov (United States)

    Aparecida de Oliveira, Maria; Abeid Ribeiro, Eliana Guimarães; Morato Bergamini, Alzira Maria; Pereira De Martinis, Elaine Cristina

    2010-02-01

    Modern lifestyle markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods, such as minimally processed fruits and leafy greens. Packaging and storage conditions of those products may favor the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirão Preto, São Paulo, Brazil, were tested for the presence or absence of Listeria spp. by the immunoassay Listeria Rapid Test, Oxoid. Two L. monocytogenes positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique with detection by the classical culture method and also by the culture method combined with real-time PCR (RTi-PCR) for 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes using the commercial preparation ABSOLUTE QPCR SYBR Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method (p < 0.05). Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h for RTi-PCR in comparison to 7 days for the conventional method.
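For a single dilution level, the MPN point estimate used in such enrichment schemes has a closed form: MPN = -ln(1 - p/n)/v for p positive tubes out of n, each inoculated with volume v. A small sketch with hypothetical tube counts (multi-dilution series require a maximum-likelihood solve or standard MPN tables instead):

```python
from math import log

def mpn_single_dilution(positive, total, volume_ml):
    """Most Probable Number per mL from one dilution level:
    MPN = -ln(1 - p/n) / v, the maximum-likelihood estimate under the
    usual Poisson assumption. If every tube is positive, the estimate
    is unbounded at this dilution."""
    if positive >= total:
        raise ValueError("all tubes positive: MPN unbounded at this dilution")
    return -log(1.0 - positive / total) / volume_ml

# Hypothetical example: 3 of 5 tubes positive, 1 mL inoculum each.
est = mpn_single_dilution(3, 5, 1.0)
# -ln(0.4) ≈ 0.916 organisms per mL
```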

  14. Extension of Modified Polak-Ribière-Polyak Conjugate Gradient Method to Linear Equality Constraints Minimization Problems

    Directory of Open Access Journals (Sweden)

    Zhifeng Dai

    2014-01-01

    Full Text Available Combining the Rosen gradient projection method with the two-term Polak-Ribière-Polyak (PRP) conjugate gradient method, we propose a two-term PRP conjugate gradient projection method for solving linear equality constrained optimization problems. The proposed method possesses some attractive properties: (1) the search direction generated by the method is a feasible descent direction, so the generated iterates are feasible points; (2) the sequence of function values is decreasing. Under some mild conditions, we show that it is globally convergent with an Armijo-type line search. Preliminary numerical results show that the proposed method is promising.
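The feasible-descent idea can be illustrated with plain gradient projection: project the gradient onto the null space of the constraint so every step stays feasible, then backtrack with an Armijo rule. This sketch is a simplified stand-in for a single linear constraint, not the paper's two-term PRP method:

```python
def project(v, a):
    """Project v onto the null space of the single constraint a . x = b."""
    dot = sum(ai * vi for ai, vi in zip(a, v))
    nrm = sum(ai * ai for ai in a)
    return [vi - (dot / nrm) * ai for ai, vi in zip(a, v)]

def feasible_descent(f, grad, x, a, iters=100, beta=0.5, sigma=1e-4):
    """Gradient projection with Armijo backtracking. x must start
    feasible (a . x = b); projected steps keep it feasible."""
    for _ in range(iters):
        g = grad(x)
        d = [-di for di in project(g, a)]          # feasible descent direction
        gTd = sum(gi * di for gi, di in zip(g, d))
        if abs(gTd) < 1e-12:                       # stationary on the manifold
            break
        t = 1.0
        while f([xi + t * di for xi, di in zip(x, d)]) > f(x) + sigma * t * gTd:
            t *= beta                              # Armijo backtracking
        x = [xi + t * di for xi, di in zip(x, d)]
    return x

# min x1^2 + x2^2  s.t.  x1 + x2 = 2  ->  optimum at (1, 1)
f = lambda x: x[0] ** 2 + x[1] ** 2
grad = lambda x: [2 * x[0], 2 * x[1]]
x_star = feasible_descent(f, grad, [2.0, 0.0], [1.0, 1.0])
```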

  15. Conducting organizational safety reviews - requirements, methods and experience

    International Nuclear Information System (INIS)

    Reiman, T.; Oedewald, P.; Wahlstroem, B.; Rollenhagen, C.; Kahlbom, U.

    2008-03-01

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. 
The research concludes with recommendations on

  16. Conducting organizational safety reviews - requirements, methods and experience

    Energy Technology Data Exchange (ETDEWEB)

    Reiman, T.; Oedewald, P.; Wahlstroem, B. [Technical Research Centre of Finland, VTT (Finland); Rollenhagen, C. [Royal Institute of Technology, KTH, (Sweden); Kahlbom, U. [RiskPilot (Sweden)

    2008-03-15

    Organizational safety reviews are part of the safety management process of power plants. They are typically performed after major reorganizations, significant incidents or according to specified review programs. Organizational reviews can also be a part of a benchmarking between organizations that aims to improve work practices. Thus, they are important instruments in proactive safety management and safety culture. Most methods that have been used for organizational reviews are based more on practical considerations than a sound scientific theory of how various organizational or technical issues influence safety. Review practices and methods also vary considerably. The objective of this research is to promote understanding on approaches used in organizational safety reviews as well as to initiate discussion on criteria and methods of organizational assessment. The research identified a set of issues that need to be taken into account when planning and conducting organizational safety reviews. Examples of the issues are definition of appropriate criteria for evaluation, the expertise needed in the assessment and the organizational motivation for conducting the assessment. The study indicates that organizational safety assessments involve plenty of issues and situations where choices have to be made regarding what is considered valid information and a balance has to be struck between focus on various organizational phenomena. It is very important that these choices are based on a sound theoretical framework and that these choices can later be evaluated together with the assessment findings. The research concludes that at its best, the organizational safety reviews can be utilised as a source of information concerning the changing vulnerabilities and the actual safety performance of the organization. In order to do this, certain basic organizational phenomena and assessment issues have to be acknowledged and considered. 
The research concludes with recommendations on

  17. Validation of five minimally obstructive methods to estimate physical activity energy expenditure in young adults in semi-standardized settings

    DEFF Research Database (Denmark)

    Schneller, Mikkel Bo; Pedersen, Mogens Theisen; Gupta, Nidhi

    2015-01-01

    We compared the accuracy of five objective methods, including two newly developed methods combining accelerometry and activity type recognition (Acti4), against indirect calorimetry, to estimate total energy expenditure (EE) of different activities in semi-standardized settings. Fourteen particip...

  18. Two-step calibration method for multi-algorithm score-based face recognition systems by minimizing discrimination loss

    NARCIS (Netherlands)

    Susyanto, N.; Veldhuis, R.N.J.; Spreeuwers, L.J.; Klaassen, C.A.J.; Fierrez, J.; Li, S.Z.; Ross, A.; Veldhuis, R.; Alonso-Fernandez, F.; Bigun, J.

    2016-01-01

    We propose a new method for combining multi-algorithm score-based face recognition systems, which we call the two-step calibration method. Typically, algorithms for face recognition systems produce dependent scores. The two-step method is based on parametric copulas to handle this dependence. Its

  19. Ergonomic requirements to control room design - evaluation method

    International Nuclear Information System (INIS)

    Hinz, W.

    1985-01-01

    The evaluation method introduced here is the result of work carried out by the sub-committee 'Control Room Design' of the Engineering Standards Committee in DIN Standards, Ergonomy. This committee compiles standards for the design of control rooms (instrumentation and control) for the monitoring and operation of process engineering cycles. With the agreement of the committee - whom we take this opportunity to thank for their constructive collaboration - a planned partial standard is introduced thematically in the following, so that knowledge gained from the discussion can be included in further work on the subject. The subject is a procedure for the qualitative evaluation of the duties performed under the control of operators, so that existing control concepts, or concepts still in the draft phase, can be assessed. (orig./GL) [de

  20. Rational choice of a minimally invasive method of treatment in uncomplicated nephrolithiasis with kidney calculi from 1.0 to 2.5 cm

    Directory of Open Access Journals (Sweden)

    А. І. Sagalevich

    2018-02-01

    Full Text Available Study purpose – to improve the effectiveness of solitary nephrolithiasis treatment by determining the optimal conditions for applying ESWL or mini-PNL in the treatment of kidney calculi 1.0 to 2.5 cm in size. Patients and methods. A comparative analysis of the results of minimally invasive treatment methods for nephrolithiasis was performed in 210 patients treated with mini-PNL (group I) and 190 patients treated with ESWL (group II). Patients with calculi larger than 1.5 cm predominated in the mini-PNL group, and those with calculi smaller than 1.5 cm in the ESWL group. The number of patients with calculi of 1.5–2.0 cm was the same in both groups: 24.3 % and 24.2 % (P > 0.05). Results. Calculi destruction after 1–4 or more sessions of ESWL was effective in 182 patients (95.8 %). At the same time, an increase in the mean calculus density above 600 HU reduced (P < 0.001) the efficiency of the primary ESWL session almost twofold. Of the 221 mPNL procedures performed, 97.1 % of the patients required only one surgical treatment. The number of complications (bleeding, attack of pyelonephritis) was insignificantly higher in group II than in group I – 26 (12.3 %) and 45 (14.1 %), respectively (P < 0.05). Stone-free status (up to one month) was noted in 62.6 % of patients after completion of the ESWL sessions, leaving a 37.4 % risk of nephrolithiasis recurrence. With mPNL treatment, stone-free status reached 97.1 % (P < 0.001), and with repeated mPNL, applied in 2.8 % of cases, 100 %. The mean in-clinic postoperative treatment period was shorter in group I than in group II – 3.0 ± 1.5 and 12.5 ± 3.6 days, respectively (P < 0.001). Conclusions. This comparative analysis of the features and results of treating uncomplicated nephrolithiasis with mPNL and ESWL indicates that mPNL is the preferred method for kidney calculi 1.0 to 2.5 cm and larger.

  1. Minimal DBM Substraction

    DEFF Research Database (Denmark)

    David, Alexandre; Håkansson, John; G. Larsen, Kim

    In this paper we present an algorithm to compute DBM subtractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundancy. The subtraction is one of the few operations that result in a non-convex zone, and thus requires splitting. It is of prime importance to reduce...

  2. A multi-step dealloying method to produce nanoporous gold with no volume change and minimal cracking

    Energy Technology Data Exchange (ETDEWEB)

    Sun Ye [Department of Chemical and Materials Engineering, University of Kentucky, 177 F. Paul Anderson Tower, Lexington, KY 40506 (United States); Balk, T. John [Department of Chemical and Materials Engineering, University of Kentucky, 177 F. Paul Anderson Tower, Lexington, KY 40506 (United States)], E-mail: balk@engr.uky.edu

    2008-05-15

    We report a simple two-step dealloying method for producing bulk nanoporous gold with no volume change and no significant cracking. The galvanostatic dealloying method used here appears superior to potentiostatic methods for fabricating millimeter-scale samples. Care must be taken when imaging the nanoscale, interconnected sponge-like structure with a focused ion beam, as even brief exposure caused immediate and extensive cracking of nanoporous gold, as well as ligament coarsening at the surface.

  3. Utility of cement injection to stabilize split-depression tibial plateau fracture by minimally invasive methods: A finite element analysis.

    Science.gov (United States)

    Belaid, D; Vendeuvre, T; Bouchoucha, A; Brémand, F; Brèque, C; Rigoard, P; Germaneau, A

    2018-05-08

    Treatment for fractures of the tibial plateau is in most cases carried out by stable fixation in order to allow early mobilization. Minimally invasive technologies such as tibioplasty or stabilization by locking plate, bone augmentation and cement filling (CF) have recently been used to treat this type of fracture. The aim of this paper was to determine the mechanical behavior of the tibial plateau by numerical modeling and to quantify the mechanical effects on the tibia's mechanical properties from injury to healing. A personalized Finite Element (FE) model of the tibial plateau from a clinical case was developed to analyze stress distribution in the tibial plateau stabilized by balloon osteoplasty and to determine the influence of the injected cement. Stress analysis was performed for different stages after surgery. Just after surgery, the maximum von Mises stresses on the plate for the fractured tibia treated with and without CF were 134.9 MPa and 289.9 MPa, respectively. Stress distribution showed an increase of values in the trabecular bone in the model treated with locking plate and CF, and a stress reduction in the cortical bone in the model treated with locking plate only. The computed stresses and displacements of the fractured models show that cement filling of the tibial depression fracture may increase implant stability and decrease the loss of depression reduction, while the presence of the cement in the healed model renders the load distribution uniform. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Pathological Methods Applied to the Investigation of Causes of Death in Developing Countries: Minimally Invasive Autopsy Approach.

    Directory of Open Access Journals (Sweden)

    Paola Castillo

    Full Text Available Complete diagnostic autopsies (CDAs) remain the gold standard in the determination of cause of death (CoD). However, performing CDAs in developing countries is challenging due to limited facilities and human resources, and poor acceptability. We aimed to develop and test a simplified minimally invasive autopsy (MIA) procedure involving organ-directed sampling with microbiology and pathology analyses implementable by trained technicians in low-income settings. A standardized scheme for the MIA has been developed and tested in a series of 30 autopsies performed at the Maputo Central Hospital, Mozambique. The procedure involves the collection of 20 mL of blood and cerebrospinal fluid (CSF) and puncture of liver, lungs, heart, spleen, kidneys, bone marrow and brain in all cases, plus uterus in women of childbearing age, using biopsy needles. The sampling success ranged from 67% for the kidney to 100% for blood, CSF, lung, liver and brain. The amount of tissue obtained in the procedure varied from less than 10 mm2 for the lung, spleen and kidney, to over 35 mm2 for the liver and brain. A CoD was identified in the histological and/or the microbiological analysis in 83% of the MIAs. A simplified MIA technique allows obtaining adequate material from body fluids and major organs, leading to accurate diagnoses. This procedure could improve the determination of CoD in developing countries.

  5. Drilling and coring methods that minimize the disturbance of cuttings, core, and rock formation in the unsaturated zone, Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Hammermeister, D.P.; Blout, D.O.; McDaniel, J.C.

    1985-01-01

    A drilling-and-casing method (Odex 115 system) utilizing air as a drilling fluid was used successfully to drill through various rock types within the unsaturated zone at Yucca Mountain, Nevada. This paper describes this method and the equipment used to rapidly penetrate bouldery alluvial-colluvial deposits, poorly consolidated bedded and nonwelded tuff, and fractured, densely welded tuff to depths of about 130 meters. A comparison of water-content and water-potential data from drill cuttings with similar measurements on rock cores indicates that drill cuttings were only slightly disturbed for several of the rock types penetrated. Coring, sampling, and handling methods were devised to obtain minimally disturbed drive core from bouldery alluvial-colluvial deposits. Bulk-density values obtained from bulk samples dug from nearby trenches were compared to bulk-density values obtained from drive core to determine the effects of drive coring on the porosity of the core. Rotary coring methods utilizing a triple-tube core barrel and air as the drilling fluid were used to obtain core from welded and nonwelded tuff. Results indicate that the disturbance of the water content of the core was minimal. Water-content distributions in alluvium-colluvium were determined before drilling occurred by drive-core methods. After drilling, water-content distributions were determined by nuclear-logging methods. A comparison of the water-content distributions made before and after drilling indicates that Odex 115 drilling minimally disturbs the water content of the formation rock. 10 refs., 12 figs., 4 tabs

  6. Defining the minimal structural requirements for partial agonism at the type I myo-inositol 1,4,5-trisphosphate receptor.

    Science.gov (United States)

    Wilcox, R A; Fauq, A; Kozikowski, A P; Nahorski, S R

    1997-02-03

    The novel synthetic analogues D-3-fluoro-myo-inositol 1,5-bisphosphate-4-phosphorothioate [3F-Ins(1,5)P2-4PS], D-3-fluoro-myo-inositol 1,4-bisphosphate-5-phosphorothioate [3F-Ins(1,4)P2-5PS], and D-3-fluoro-myo-inositol 1-phosphate-4,5-bisphosphorothioate [3F-Ins(1)P-(4,5)PS2] were utilised to define the structure-activity relationships which could produce partial agonism at the Ca2+ mobilising myo-inositol 1,4,5-trisphosphate [Ins(1,4,5)P3] receptor. Based on prior structure-activity data we hypothesised that the minimal structural requirements for Ins(1,4,5)P3 receptor partial agonism were phosphorothioate substitution of the crucial vicinal 4,5-bisphosphate pair accompanied by another structural perturbation, such as fluorination of the 3-position of the myo-inositol ring. All the analogues fully displaced [3H]Ins(1,4,5)P3 from a single Ins(1,4,5)P3 binding site in pig cerebellar membranes [3F-Ins(1,5)P2-4PS (IC50 = 26 nM), 3F-Ins(1,4)P2-5PS (IC50 = 80 nM) and 3F-Ins(1)P-(4,5)PS2 (IC50 = 109 nM) cf. Ins(1,4,5)P3 (IC50 = 11 nM)]. In contrast, 3F-Ins(1,5)P2-4PS (EC50 = 424 nM) and 3F-Ins(1,4)P2-5PS (EC50 = 3579 nM) were weak full agonists at the Ca2+ mobilising Ins(1,4,5)P3 receptor of permeabilised SH-SY5Y neuroblastoma cells, being respectively 4- and 36-fold less potent than Ins(1,4,5)P3 (EC50 = 99 nM), while 3F-Ins(1)P-(4,5)PS2 (EC50 = 11345 nM) was a partial agonist, releasing only 64.3 +/- 1.9% of the Ins(1,4,5)P3-sensitive intracellular Ca2+ pools. 3F-Ins(1)P-(4,5)PS2 was unique among the Ins(1,4,5)P3 receptor partial agonists so far identified in having a relatively high affinity for the Ins(1,4,5)P3 binding site, accompanied by a significant loss of intrinsic activity for Ca2+ mobilisation. This improved affinity was probably due to the retention of the 1-position phosphate, which enhances interaction with the Ins(1,4,5)P3 receptor. 3F-Ins(1)P-(4,5)PS2 may be an important lead compound for the development of efficient Ins(1,4,5)P3 receptor antagonists.

  7. Detection of radioactively labeled proteins is quenched by silver staining methods: quenching is minimal for 14C and partially reversible for 3H with a photochemical stain

    International Nuclear Information System (INIS)

    Van Keuren, M.L.; Goldman, D.; Merril, C.R.

    1981-01-01

    Silver staining methods for protein detection in polyacrylamide gels have a quenching effect on autoradiography and fluorography. This effect was quantitated for proteins in two-dimensional gels by microdensitometry, using a computer equipped with an image processor, and by scintillation counting of proteins solubilized from the gels. The original histologically derived silver stain had a quenching effect that was severe and irreversible for 3H detection and moderate for 14C detection. A silver stain based on photochemical methods had minimal quenching of 14C detection and less of a quenching effect than the histological stain for 3H detection. The 3H quenching effect was partially reversible for the photochemical stain

  8. Experimentation of several mitigation methods in Tasiujaq Airport to minimize the effects caused by the melting of permafrost

    DEFF Research Database (Denmark)

    Jørgensen, Anders Stuhr; Doré, Guy

    2009-01-01

    Since the beginning of the 1990s an important increase in the mean annual air temperatures has been recorded in Nunavik, Québec, Canada. This has led to the degradation of permafrost, which is threatening the stability of airport and road embankments in the region. In the summer of 2007 a test-site was established at Tasiujaq Airport to study the effect of three different mitigation methods: heat drain, air convection embankment, and gentle slope (8:1). The methods were constructed in the shoulder of the runway embankment, each method over a distance of 50 m. In each section thermistors were installed...

  9. Optimal Control Method for Wind Farm to Support Temporary Primary Frequency Control with Minimized Wind Energy Cost

    DEFF Research Database (Denmark)

    Wang, Haijiao; Chen, Zhe; Jiang, Quanyuan

    2015-01-01

    This study proposes an optimal control method for variable speed wind turbine (VSWT) based wind farms (WF) to support temporary primary frequency control. This control method consists of two layers: temporary frequency support control (TFSC) of the VSWT, and temporary support power optimal dispatch (TSPOD) of the WF. With TFSC, the VSWT can temporarily provide extra power to support system frequency under varying and wide-range wind speed. In the WF control centre, TSPOD optimally dispatches the frequency support power orders to the VSWTs operating at different wind speeds, minimises the wind energy cost of frequency support, and satisfies the support capabilities of the VSWTs. The effectiveness of the whole control method is verified in the IEEE-RTS built in MATLAB/Simulink, and compared with a published de-loading method.
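The dispatch layer can be caricatured as a cost-minimizing allocation under capability limits. Assuming linear cost rates per turbine (an assumption; the abstract does not give the TSPOD formulation), a greedy merit-order allocation is optimal:

```python
def dispatch_support(required, capability, cost):
    """Allocate temporary frequency-support power across turbines,
    cheapest cost rate first, respecting each turbine's capability.
    A greedy merit-order sketch (optimal for linear costs); not the
    paper's TSPOD formulation."""
    order = sorted(range(len(cost)), key=lambda i: cost[i])
    alloc = [0.0] * len(cost)
    remaining = required
    for i in order:
        alloc[i] = min(capability[i], remaining)
        remaining -= alloc[i]
        if remaining <= 0:
            break
    if remaining > 1e-9:
        raise ValueError("support demand exceeds total WF capability")
    return alloc

# Hypothetical numbers: 3 turbines, capabilities in MW, cost rates in
# $/MWh; the system needs 1.5 MW of temporary support.
alloc = dispatch_support(1.5, [1.0, 0.8, 0.5], [12.0, 8.0, 10.0])
# cheapest first: turbine 1 gives 0.8, turbine 2 gives 0.5, turbine 0
# covers the remaining 0.2
```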

  10. Minimal quantization and confinement

    International Nuclear Information System (INIS)

    Ilieva, N.P.; Kalinowskij, Yu.L.; Nguyen Suan Han; Pervushin, V.N.

    1987-01-01

    A ''minimal'' version of the Hamiltonian quantization, based on the explicit solution of the Gauss equation and on the gauge-invariance principle, is considered. By the example of the one-particle Green function we show that the requirement of gauge invariance leads to relativistic covariance of the theory and to a more proper definition of the Faddeev-Popov integral that does not depend on the gauge choice. The ''minimal'' quantization is applied to the gauge-ambiguity problem and to a new topological mechanism of confinement.

  11. Minimizing the Discrepancy between Simulated and Historical Failures in Turbine Engines: A Simulation-Based Optimization Method

    OpenAIRE

    Ahmed Kibria; Krystel K. Castillo-Villar; Harry Millwater

    2015-01-01

    The reliability modeling of a module in a turbine engine requires knowledge of its failure rate, which can be estimated by identifying statistical distributions describing the percentage of failure per component within the turbine module. The correct definition of the failure statistical behavior per component is highly dependent on the engineer skills and may present significant discrepancies with respect to the historical data. There is no formal methodology to approach this problem and a l...

  12. Application of the microbiological method DEFT/APC and DNA comet assay to detect ionizing radiation processing of minimally processed vegetables

    International Nuclear Information System (INIS)

    Araujo, Michel Mozeika

    2008-01-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent healthiness. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and inactivate food-borne pathogens; its combination with minimal processing could improve the safety and quality of MPV. Two different food irradiation detection methods, one biological, the DEFT/APC, and another biochemical, the DNA Comet Assay, were applied to MPV in order to test their applicability to detect irradiation treatment. DEFT/APC is a microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC). The DNA Comet Assay detects DNA damage due to ionizing radiation. Samples of lettuce, chard, watercress, dandelion, kale, chicory, spinach and cabbage from the retail market were irradiated with 0.5 kGy and 1.0 kGy using a ⁶⁰Co facility. The irradiation treatment guaranteed at least a 2 log cycle reduction of aerobic and psychrotrophic microorganisms. In general, with increasing radiation doses, DEFT counts remained similar independent of irradiation processing, while APC counts decreased gradually. The difference between the two counts gradually increased with dose increment in all samples. It could be suggested that a DEFT/APC difference over 2.0 log would be a criterion to judge whether an MPV was treated by irradiation. The DNA Comet Assay allowed distinguishing non-irradiated samples from irradiated ones, which showed different types of comets owing to DNA fragmentation. Both the DEFT/APC method and the DNA Comet Assay could be satisfactorily used as screening methods for indicating irradiation processing. (author)
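The screening criterion in the abstract is a simple log-difference rule: DEFT counts (which include cells killed by irradiation) stay roughly constant while APC counts (viable cells) fall, so a gap above 2.0 log cycles suggests irradiation. A minimal sketch of that rule; the function name, the CFU/g interface and the example counts are illustrative, not from the study:

```python
import math

def deft_apc_flag(deft_count, apc_count, threshold_log=2.0):
    """Flag a sample as likely irradiated when the DEFT count exceeds
    the APC count by more than `threshold_log` log10 cycles.
    Counts are microbial counts (e.g. CFU/g); the 2.0-log cut-off
    follows the criterion suggested in the abstract."""
    diff = math.log10(deft_count) - math.log10(apc_count)
    return diff > threshold_log, diff

# Hypothetical sample: DEFT stays near 10^6 while APC has dropped to 10^3,
# a 3-log gap, so the sample is flagged as likely irradiated.
flagged, diff = deft_apc_flag(1e6, 1e3)
```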

  13. Minimal experimental requirements for definition of extracellular vesicles and their functions: a position statement from the International Society for Extracellular Vesicles.

    Science.gov (United States)

    Lötvall, Jan; Hill, Andrew F; Hochberg, Fred; Buzás, Edit I; Di Vizio, Dolores; Gardiner, Christopher; Gho, Yong Song; Kurochkin, Igor V; Mathivanan, Suresh; Quesenberry, Peter; Sahoo, Susmita; Tahara, Hidetoshi; Wauben, Marca H; Witwer, Kenneth W; Théry, Clotilde

    2014-01-01

    Secreted membrane-enclosed vesicles, collectively called extracellular vesicles (EVs), which include exosomes, ectosomes, microvesicles, microparticles, apoptotic bodies and other EV subsets, encompass a very rapidly growing scientific field in biology and medicine. Importantly, it is currently technically challenging to obtain a totally pure EV fraction free from non-vesicular components for functional studies, and therefore there is a need to establish guidelines for analyses of these vesicles and reporting of scientific studies on EV biology. Here, the International Society for Extracellular Vesicles (ISEV) provides researchers with a minimal set of biochemical, biophysical and functional standards that should be used to attribute any specific biological cargo or functions to EVs.

  14. The Videographic Requirements Gathering Method for Adolescent-Focused Interaction Design

    Directory of Open Access Journals (Sweden)

    Tamara Peyton

    2014-08-01

    We present a novel method for conducting requirements gathering with adolescent populations. Called videographic requirements gathering, this technique makes use of mobile phone data capture and participant creation of media images. The videographic requirements gathering method can help researchers and designers gain intimate insight into adolescent lives while simultaneously reducing power imbalances. We provide rationale for this approach, pragmatics of using the method, and advice on overcoming common challenges facing researchers and designers relying on this technique.

  15. Cell Adhesion Minimization by a Novel Mesh Culture Method Mechanically Directs Trophoblast Differentiation and Self-Assembly Organization of Human Pluripotent Stem Cells.

    Science.gov (United States)

    Okeyo, Kennedy Omondi; Kurosawa, Osamu; Yamazaki, Satoshi; Oana, Hidehiro; Kotera, Hidetoshi; Nakauchi, Hiromitsu; Washizu, Masao

    2015-10-01

    Mechanical methods for inducing differentiation and directing lineage specification will be instrumental in the application of pluripotent stem cells. Here, we demonstrate that minimization of cell-substrate adhesion can initiate and direct the differentiation of human pluripotent stem cells (hiPSCs) into cyst-forming trophoblast lineage cells (TLCs) without stimulation with cytokines or small molecules. To precisely control cell-substrate adhesion area, we developed a novel culture method where cells are cultured on microstructured mesh sheets suspended in a culture medium such that cells on mesh are completely out of contact with the culture dish. We used microfabricated mesh sheets that consisted of open meshes (100∼200 μm in pitch) with narrow mesh strands (3-5 μm in width) to provide support for initial cell attachment and growth. We demonstrate that minimization of cell adhesion area achieved by this culture method can trigger a sequence of morphogenetic transformations that begin with individual hiPSCs attached on the mesh strands proliferating to form cell sheets by self-assembly organization and ultimately differentiating after 10-15 days of mesh culture to generate spherical cysts that secreted human chorionic gonadotropin (hCG) hormone and expressed caudal-related homeobox 2 factor (CDX2), a specific marker of trophoblast lineage. Thus, this study demonstrates a simple and direct mechanical approach to induce trophoblast differentiation and generate cysts for application in the study of early human embryogenesis and drug development and screening.

  16. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    Science.gov (United States)

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  17. Minimal experimental requirements for definition of extracellular vesicles and their functions: a position statement from the International Society for Extracellular Vesicles

    Directory of Open Access Journals (Sweden)

    Jan Lötvall

    2014-12-01

    Secreted membrane-enclosed vesicles, collectively called extracellular vesicles (EVs), which include exosomes, ectosomes, microvesicles, microparticles, apoptotic bodies and other EV subsets, encompass a very rapidly growing scientific field in biology and medicine. Importantly, it is currently technically challenging to obtain a totally pure EV fraction free from non-vesicular components for functional studies, and therefore there is a need to establish guidelines for analyses of these vesicles and reporting of scientific studies on EV biology. Here, the International Society for Extracellular Vesicles (ISEV) provides researchers with a minimal set of biochemical, biophysical and functional standards that should be used to attribute any specific biological cargo or functions to EVs.

  18. Application of the entropy generation minimization method to a solar heat exchanger: A pseudo-optimization design process based on the analysis of the local entropy generation maps

    International Nuclear Information System (INIS)

    Giangaspero, Giorgio; Sciubba, Enrico

    2013-01-01

    This paper presents an application of the entropy generation minimization method to the pseudo-optimization of the configuration of the heat exchange surfaces in a Solar Rooftile. An initial “standard” commercial configuration is gradually improved by introducing design changes aimed at the reduction of the thermodynamic losses due to heat transfer and fluid friction. Different geometries (pins, fins and others) are analysed with a commercial CFD (Computational Fluid Dynamics) code that also computes the local entropy generation rate. The design improvement process is carried out on the basis of a careful analysis of the local entropy generation maps and the rationale behind each step of the process is discussed in this perspective. The results are compared with other entropy generation minimization techniques available in the recent technical literature. It is found that the geometry with pin-fins has the best performance among the tested ones, and that the optimal pin array shape parameters (pitch and span) can be determined by a critical analysis of the integrated and local entropy maps and of the temperature contours. - Highlights: ► An entropy generation minimization method is applied to a solar heat exchanger. ► The approach is heuristic and leads to a pseudo-optimization process with CFD as main tool. ► The process is based on the evaluation of the local entropy generation maps. ► The geometry with pin-fins in general outperforms all other configurations. ► The entropy maps and temperature contours can be used to determine the optimal pin array design parameters
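For reference, the local (volumetric) entropy generation rate that such CFD post-processing typically evaluates can be written, in the standard form for a Newtonian fluid (an assumption here; the paper's exact formulation may differ), as:

```latex
\dot{S}'''_{\mathrm{gen}}
  = \underbrace{\frac{k}{T^{2}}\,\bigl(\nabla T\bigr)^{2}}_{\text{heat-transfer loss}}
  + \underbrace{\frac{\mu}{T}\,\Phi}_{\text{fluid-friction loss}}
```

where k is the thermal conductivity, μ the dynamic viscosity, and Φ the viscous dissipation function. The two terms correspond to the two loss mechanisms the design changes target, which is why the local maps separate heat-transfer and friction contributions.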

  19. A mixed-methods systematic review protocol to examine the use of physical restraint with critically ill adults and strategies for minimizing their use

    Directory of Open Access Journals (Sweden)

    Louise Rose

    2016-11-01

    Background: Critically ill patients frequently experience severe agitation, placing them at risk of harm. Physical restraint is common in intensive care units (ICUs) because of clinician concerns about safety. However, physical restraint may not prevent medical device removal and has been associated with negative physical and psychological consequences. While professional society guidelines, legislation, and accreditation standards recommend physical restraint minimization, guidelines for critically ill patients are over a decade old, with recommendations that are non-specific. Our systematic review will synthesize evidence on physical restraint in critically ill adults with the primary objective of identifying effective minimization strategies. Methods: Two authors will independently search from inception to July 2016 the following: Ovid MEDLINE, CINAHL, Embase, Web of Science, Cochrane Library, PROSPERO, Joanna Briggs Institute, grey literature, professional society websites, and the International Clinical Trials Registry Platform. We will include quantitative and qualitative study designs, clinical practice guidelines, policy documents, and professional society recommendations relevant to physical restraint of critically ill adults. Authors will independently perform data extraction in duplicate and complete risk of bias and quality assessment using recommended tools. We will assess evidence quality for quantitative studies using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach and for qualitative studies using the Confidence in the Evidence from Reviews of Qualitative Research (CERQual) guidelines. Outcomes of interest include (1) efficacy/effectiveness of physical restraint minimization strategies; (2) adverse events (unintentional device removal, psychological impact, physical injury) and associated benefits including harm prevention; (3) ICU outcomes (ventilation duration, length of stay, and mortality); (4

  20. Video-Assisted Anal Fistula Treatment: Pros and Cons of This Minimally Invasive Method for Treatment of Perianal Fistulas

    Directory of Open Access Journals (Sweden)

    Michal Romaniszyn

    2017-01-01

    Purpose. The purpose of this paper is to present the results of a single-center, nonrandomized, prospective study of video-assisted anal fistula treatment (VAAFT). Methods. 68 consecutive patients with perianal fistulas were operated on using the VAAFT technique. 30 of the patients had simple fistulas, and 38 had complex fistulas. The mean follow-up time was 31 months. Results. The overall healing rate was 54.41% (37 of the 68 patients healed with no recurrence during the follow-up period). The results varied depending on the type of fistula. The success rate for the group with simple fistulas was 73.3%, whereas it was only 39.47% for the group with complex fistulas. Female patients achieved higher healing rates for both simple (81.82% versus 68.42%) and complex fistulas (77.78% versus 27.59%). There were no major complications. Conclusions. The results of VAAFT vary greatly depending on the type of fistula. The procedure has some drawbacks due to the rigid construction of the fistuloscope and the diameter of the shaft. The electrocautery of the fistula tract from the inside can be insufficient to close wide tracts. However, the low risk of complications permits repetition of the treatment until success is achieved. Careful selection of patients is advised.

  1. Handbook of methods for risk-based analysis of technical specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1994-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant specific Probabilistic Safety Assessments (PSAs). The use of risk and reliability-based methods to improve TS requirements has gained wide interest because these methods can: quantitatively evaluate the risk and justify changes based on objective risk arguments; and provide a defensible basis for these requirements for regulatory applications. The US NRC Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.
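The allowed-outage-time evaluations mentioned above rest on a simple risk-integration idea: the yearly risk contribution of an LCO entry is the rise in conditional risk while the component is down, weighted by the fraction of the year spent in that state. The handbook's exact notation is not reproduced here; this is a generic PSA-style sketch with illustrative names and units:

```python
def aot_risk_contribution(r_down, r_base, downtime_h, occurrences_per_yr):
    """Generic PSA-style sketch of an allowed outage time (AOT)
    evaluation: yearly incremental risk = (conditional risk with the
    component down minus baseline risk) x (expected downtime per
    occurrence as a fraction of the year) x (LCO entries per year).
    Risk levels are per-year frequencies (e.g. core damage frequency);
    names and units are illustrative, not the handbook's notation."""
    hours_per_year = 8760.0
    return (r_down - r_base) * (downtime_h / hours_per_year) * occurrences_per_yr

# Hypothetical numbers: conditional risk 1e-3/yr vs baseline 1e-5/yr,
# two 87.6-hour outages per year.
delta_r = aot_risk_contribution(1e-3, 1e-5, 87.6, 2)
```

A comparison of `delta_r` against an acceptance criterion is then what justifies, or rejects, a proposed AOT extension.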

  2. Handbook of methods for risk-based analysis of Technical Specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1993-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experiences with plant operation indicate that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant specific Probabilistic Safety Assessments (PSAs). The use of risk and reliability-based methods to improve TS requirements has gained wide interest because these methods can: quantitatively evaluate the risk impact and justify changes based on objective risk arguments; and provide a defensible basis for these requirements for regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations.

  3. Distal tibial pilon fractures (AO/OTA type B, and C treated with the external skeletal and minimal internal fixation method

    Directory of Open Access Journals (Sweden)

    Milenković Saša

    2013-01-01

    Background/Aim. Distal tibial pilon fractures include extra-articular fractures of the tibial metaphysis and the more severe intra-articular tibial pilon fractures. There is no universal method for treating distal tibial pilon fractures. These fractures are treated by means of open reduction and internal fixation (ORIF) or external skeletal fixation. The high rate of soft-tissue complications associated with primary ORIF of pilon fractures led to the use of external skeletal fixation, with limited internal fixation, as an alternative technique for definitive management. The aim of this study was to estimate the efficacy of treating distal tibial pilon fractures using the external skeletal and minimal internal fixation method. Methods. We present a series of 31 operated patients with tibial pilon fractures. The patients were operated on using the method of external skeletal fixation with minimal internal fixation. According to the AO/OTA classification, 17 patients had type B fractures and 14 patients type C fractures. The rigid external skeletal fixation was transformed into a dynamic external skeletal fixation 6 weeks post-surgery. Results. This retrospective study involved 31 patients with tibial pilon fractures, average age 41.81 (from 21 to 60) years. The average follow-up was 21.86 (from 12 to 48) months. The percentage of union was 90.32%, nonunion 3.22% and malunion 6.45%. The mean time to fracture union was 14 (range 12-20) weeks. There were 4 (12.19%) infections around the pins of the external skeletal fixator and one (3.22%) deep infection. Ankle joint arthrosis as a late complication appeared in 4 (12.90%) patients. All arthroses appeared in patients who had type C fractures. The final functional results based on the AOFAS score were excellent in 51.61%, good in 32.25%, average in 12.90% and bad in 3.22% of the patients. Conclusion. External skeletal fixation and minimal internal fixation of distal tibial pilon fractures is a good method for

  4. 42 CFR 84.146 - Method of measuring the power and torque required to operate blowers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Method of measuring the power and torque required... RESPIRATORY PROTECTIVE DEVICES Supplied-Air Respirators § 84.146 Method of measuring the power and torque.... These are used to facilitate timing. To determine the torque or horsepower required to operate the...

  5. KidReporter : a method for engaging children in making a newspaper to gather user requirements

    NARCIS (Netherlands)

    Bekker, M.M.; Beusmans, J.; Keyson, D.V.; Lloyd, P.A.; Bekker, M.M.; Markopoulos, P.; Tsikalkina, M.

    2002-01-01

    We describe a design method, called the KidReporter method, for gathering user requirements from children. Two school classes participated in making a newspaper about a zoo, to gather requirements for the design process of an interactive educational game. The educational game was developed to

  6. Minimal changes in health status questionnaires: distinction between minimally detectable change and minimally important change

    Directory of Open Access Journals (Sweden)

    Knol Dirk L

    2006-08-01

    Changes in scores on health status questionnaires are difficult to interpret. Several methods to determine minimally important changes (MICs) have been proposed, which can broadly be divided into distribution-based and anchor-based methods. Comparisons of these methods have led to insight into essential differences between these approaches. Some authors have tried to come to a uniform measure for the MIC, such as 0.5 standard deviation or the value of one standard error of measurement (SEM). Others have emphasized the diversity of MIC values, depending on the type of anchor, the definition of minimal importance on the anchor, and characteristics of the disease under study. A closer look makes clear that some distribution-based methods have been merely focused on minimally detectable changes. For assessing minimally important changes, anchor-based methods are preferred, as they include a definition of what is minimally important. Acknowledging the distinction between minimally detectable and minimally important changes is useful, not only to avoid confusion among MIC methods, but also to gain information on two important benchmarks on the scale of a health status measurement instrument. Appreciating the distinction, it becomes possible to judge whether the minimally detectable change of a measurement instrument is sufficiently small to detect minimally important changes.
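The distribution-based quantities discussed above have standard closed forms: SEM = SD·√(1 − reliability), and the minimally detectable change at 95% confidence is MDC95 = 1.96·√2·SEM. A minimal sketch under those usual formulas (function names and the example numbers are illustrative, not from the paper):

```python
import math

def sem(sd, reliability):
    """Standard error of measurement from a baseline standard deviation
    and a reliability coefficient (e.g. test-retest ICC)."""
    return sd * math.sqrt(1.0 - reliability)

def mdc95(sd, reliability):
    """Minimally detectable change at 95% confidence: the smallest
    individual change exceeding measurement error, 1.96 * sqrt(2) * SEM."""
    return 1.96 * math.sqrt(2.0) * sem(sd, reliability)

# Hypothetical questionnaire: SD = 10 points, reliability = 0.91
# -> SEM = 3.0 points, MDC95 about 8.3 points. An anchor-derived MIC
# smaller than this MDC95 could not be reliably detected in individuals.
```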

  7. Sequential unconstrained minimization algorithms for constrained optimization

    International Nuclear Information System (INIS)

    Byrne, Charles

    2008-01-01

    The problem of minimizing a function f(x): R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊂ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C = D̄, the closure of D. We assume that such minimizers exist, and denote one by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) − G_{k-1}(x^{k-1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results.
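As a concrete instance of this framework, the classical log-barrier method (one of the particular cases the abstract lists) minimizes at step k the function G_k(x) = f(x) + mu_k * B(x) for a decreasing barrier weight mu_k, with B a barrier for the interior D of the constraint set. The sketch below is illustrative, not from the paper: the toy problem, the bracket, and the 1-D ternary-search subproblem solver are all assumptions.

```python
import math

def ternary_min(g, lo, hi, iters=200):
    """Minimize a unimodal function g on [lo, hi] by ternary search."""
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if g(m1) < g(m2):
            hi = m2
        else:
            lo = m1
    return 0.5 * (lo + hi)

def summa_barrier(f, barrier, bracket, mus):
    """Sequential unconstrained minimization with a log-barrier:
    at step k minimize G_k(x) = f(x) + mu_k * B(x). As mu_k -> 0 the
    unconstrained minimizers approach the constrained minimizer."""
    lo, hi = bracket
    xs = []
    for mu in mus:
        xk = ternary_min(lambda x: f(x) + mu * barrier(x), lo, hi)
        xs.append(xk)
    return xs

# Toy problem: minimize f(x) = x^2 subject to x >= 1 (minimizer x-hat = 1),
# using the barrier B(x) = -ln(x - 1) on the interior D = (1, inf).
f = lambda x: x * x
B = lambda x: -math.log(x - 1.0)
xs = summa_barrier(f, B, (1.0 + 1e-9, 5.0), [1.0, 0.1, 0.01, 0.001])
```

For this toy problem the subproblem has the closed-form minimizer x_k = (1 + sqrt(1 + 2*mu_k))/2, so the iterates can be checked against the analytic values as mu_k shrinks toward 0 and x_k approaches 1.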

  8. Guidelines, "minimal requirements" and standard of care in glioblastoma around the Mediterranean Area: A report from the AROME (Association of Radiotherapy and Oncology of the Mediterranean arEa) Neuro-Oncology working party.

    Science.gov (United States)

    2016-02-01

    Glioblastoma is the most common and the most lethal primary brain tumor in adults. Although studies are ongoing, the epidemiology of glioblastoma in North Africa (i.e. Morocco, Algeria and Tunisia) remains imperfectly settled and needs to be specified for a better optimization of neuro-oncology healthcare across the Mediterranean area and in North African countries. Over recent years, significant therapeutic advances have been accomplished, improving the survival and quality of life of glioblastoma patients. Indeed, concurrent temozolomide-radiotherapy (temoradiation) followed by adjuvant temozolomide has been established as the standard of care, associated with a survival benefit and a better outcome. Therefore, considering this validated strategy, and taking into account the resources and other specificities of North African countries, we decided, under the auspices of AROME (Association of Radiotherapy and Oncology of the Mediterranean arEa; www.aromecancer.org), a non-profit organization, to organize a dedicated meeting to discuss the standards and elaborate a consensus on the "minimal requirements" adapted to local resources. Thus, panels of physicians involved in the daily multidisciplinary management of brain tumors on both shores of the Mediterranean were invited to the AROME neuro-oncology working party. We report here the consensus established for the minimal human and material resources for glioblastoma diagnosis and treatment, set against the standard of care, which should be reached. If the minimal requirements are not met, patients should be referred to the closest specialized medical center where at least the minimal requirements or, ideally, the standard of care can be guaranteed. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  9. Distal tibial pilon fractures (AO/OTA type B, and C) treated with the external skeletal and minimal internal fixation method.

    Science.gov (United States)

    Milenković, Sasa; Mitković, Milorad; Micić, Ivan; Mladenović, Desimir; Najman, Stevo; Trajanović, Miroslav; Manić, Miodrag; Mitković, Milan

    2013-09-01

    Distal tibial pilon fractures include extra-articular fractures of the tibial metaphysis and the more severe intra-articular tibial pilon fractures. There is no universal method for treating distal tibial pilon fractures. These fractures are treated by means of open reduction, internal fixation (ORIF) and external skeletal fixation. The high rate of soft-tissue complications associated with primary ORIF of pilon fractures led to the use of external skeletal fixation, with limited internal fixation as an alternative technique for definitive management. The aim of this study was to estimate the efficacy of treating distal tibial pilon fractures using the external skeletal and minimal internal fixation method. We presented a series of 31 operated patients with tibial pilon fractures. The patients were operated on using the method of external skeletal fixation with a minimal internal fixation. According to the AO/OTA classification, 17 patients had type B fracture and 14 patients type C fractures. The rigid external skeletal fixation was transformed into a dynamic external skeletal fixation 6 weeks post-surgery. This retrospective study involved 31 patients with tibial pilon fractures, average age 41.81 (from 21 to 60) years. The average follow-up was 21.86 (from 12 to 48) months. The percentage of union was 90.32%, nonunion 3.22% and malunion 6.45%. The mean time to fracture union was 14 (range 12-20) weeks. There were 4 (12.19%) infections around the pins of the external skeletal fixator and one (3.22%) deep infection. The ankle joint arthrosis as a late complication appeared in 4 (12.90%) patients. All arthroses appeared in patients who had type C fractures. The final functional results based on the AOFAS score were excellent in 51.61%, good in 32.25%, average in 12.90% and bad in 3.22% of the patients. External skeletal fixation and minimal internal fixation of distal tibial pilon fractures is a good method for treating all types of intra-articular pilon fractures. In

  10. The cyclic AMP receptor protein, CRP, is required for both virulence and expression of the minimal CRP regulon in Yersinia pestis biovar microtus.

    Science.gov (United States)

    Zhan, Lingjun; Han, Yanping; Yang, Lei; Geng, Jing; Li, Yingli; Gao, He; Guo, Zhaobiao; Fan, Wei; Li, Gang; Zhang, Lianfeng; Qin, Chuan; Zhou, Dongsheng; Yang, Ruifu

    2008-11-01

    The cyclic AMP receptor protein (CRP) is a bacterial regulator that controls more than 100 promoters, including those involved in catabolite repression. In the present study, a null deletion of the crp gene was constructed for Yersinia pestis bv. microtus strain 201. Microarray expression analysis disclosed that at least 6% of Y. pestis genes were affected by this mutation. Further reverse transcription-PCR and electrophoretic mobility shift assay analyses disclosed a set of 37 genes or putative operons to be the direct targets of CRP, and thus they constitute the minimal CRP regulon in Y. pestis. Subsequent primer extension and DNase I footprinting assays mapped transcriptional start sites, core promoter elements, and CRP binding sites within the DNA regions upstream of pla and pst, revealing positive and direct control of these two laterally acquired plasmid genes by CRP. The crp disruption affected both in vitro and in vivo growth of the mutant and led to a >15,000-fold loss of virulence after subcutaneous infection but a pestis and, particularly, is more important for infection by subcutaneous inoculation. It can further be concluded that the reduced in vivo growth phenotype of the crp mutant should contribute, at least partially, to its attenuation of virulence by both routes of infection. Consistent with a previous study of Y. pestis bv. medievalis, lacZ reporter fusion analysis indicated that the crp deletion resulted in the almost absolute loss of pla promoter activity. The plasminogen activator encoded by pla was previously shown to specifically promote Y. pestis dissemination from peripheral infection routes (subcutaneous infection [flea bite] or inhalation). The above evidence supports the notion that in addition to the reduced in vivo growth phenotype, the defect of pla expression in the crp mutant will greatly contribute to the huge loss of virulence of this mutant strain in subcutaneous infection.

  11. Quasi-minimal active disturbance rejection control of MIMO perturbed linear systems based on differential neural networks and the attractive ellipsoid method.

    Science.gov (United States)

    Salgado, Iván; Mera-Hernández, Manuel; Chairez, Isaac

    2017-11-01

    This study addresses the problem of designing an output-based controller to stabilize multi-input multi-output (MIMO) systems in the presence of parametric disturbances as well as uncertainties in the state model and noise in the output measurements. The controller design includes a linear state transformation which separates uncertainties matched to the control input from the unmatched ones. A differential neural network (DNN) observer produces a nonlinear approximation of the matched perturbation and the unknown states simultaneously in the transformed coordinates. This study proposes the use of the Attractive Ellipsoid Method (AEM) to optimize the controller gains and the observer gains in the DNN structure. As a consequence, the obtained control input minimizes the convergence zone for the estimation error. Moreover, the control design uses the estimated disturbance provided by the DNN to obtain better performance in the stabilization task in comparison with a quasi-minimal output feedback controller based on a Luenberger observer and a sliding mode controller. Numerical results point out the advantages obtained by the nonlinear control based on the DNN observer. The first example deals with the stabilization of an academic linear MIMO perturbed system, and the second stabilizes the trajectories of a DC motor at a predefined operating point. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  12. Waste level analysis in a care section with the lean maintenance method to minimize waste in PT. Varia Usaha Beton Gresik

    Directory of Open Access Journals (Sweden)

    Rochmoeljati Rr.

    2017-06-01

    Full Text Available The major problem faced by PT. Varia Usaha Beton Gresik is not that maintenance activity departs from the Standard Operation Procedure (SOP, but that the activities are conducted without taking waste into account. The purpose of this study is to examine the waste in maintenance activities and to minimize it using the lean maintenance method. The study covers a belt conveyor weigher machine, a bucket conveyor, a concrete mixer, and a host machine. The variable is the flow of treatment, reducing waste and giving recommendations for improvement. Data collection includes damage data, production data, product defect data, and maintenance activity data. The conclusion is that, for the QT-10 machine, waste motion amounts to 532 minutes of repair waste, 45 minutes of process waste, and 90 minutes of waiting waste. Our recommendations include improvements to the information provided, to training and rewards, and to supervision of the production line.

  13. The use of phase modulation optimization for power lasers. Minimizing the FM-AM conversion while preserving spectral broadening functionalities required for fusion

    International Nuclear Information System (INIS)

    Hocquet, St.

    2009-11-01

    This research thesis deals with the problem of phase modulations in power lasers (such as the MegaJoule laser which is developed in France) and their impact on different physical phenomena, such as the suppression of stimulated Brillouin scattering (necessary to avoid damage to the optics) and optical smoothing, which allows a spatial homogenisation of the focal spots. The author discusses in depth the drawbacks of phase modulation, and more particularly the FM-AM conversion, which is a source of unwanted intensity modulation and of energy loss. He reports the development of a comprehensive model of the phenomena generating FM-AM conversion in a power laser chain. He theoretically and experimentally studies two methods for reducing the FM-AM conversion caused by a given spectral distortion: the compensation of transfer functions, and the modification of the phase modulation signal to make it less sensitive to spectral distortion effects. For this last method, he determines the ideal spectrum shape for the phase modulation and proposes a method to approach it. He shows the feasibility of such a method and reports experiments showing to what extent these solutions may improve the performance of power lasers. Finally, he proposes optimised solutions for the MegaJoule laser.

  14. The minimally tuned minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Essig, Rouven; Fortin, Jean-Francois

    2008-01-01

    The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV
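    Fine-tuning of electroweak symmetry breaking is conventionally quantified by a Barbieri-Giudice-style logarithmic sensitivity of an observable to each input parameter; the generic numerical sketch below illustrates that measure (the paper's precise definition may differ in detail, and the example functions are illustrative, not MSSM formulas).

```python
def log_sensitivity(f, params, i, eps=1e-6):
    """Estimate |d ln f / d ln p_i| by a central finite difference:
    how strongly an observable f (e.g. m_Z^2) responds to a fractional
    change in the i-th input parameter."""
    p = list(params)
    h = p[i] * eps
    up = list(p); up[i] += h
    dn = list(p); dn[i] -= h
    df = (f(up) - f(dn)) / (2 * h)
    return abs(p[i] / f(p) * df)

def fine_tuning(f, params):
    """Overall tuning: the largest sensitivity over all parameters."""
    return max(log_sensitivity(f, params, i) for i in range(len(params)))
```

For f(p) = p0**2 the sensitivity is exactly 2, and minimizing `fine_tuning` over a parameter space singles out the least-tuned regions, which is the kind of scan the record describes.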

  15. A look-ahead variant of the Lanczos algorithm and its application to the quasi-minimal residual method for non-Hermitian linear systems. Ph.D. Thesis - Massachusetts Inst. of Technology, Aug. 1991

    Science.gov (United States)

    Nachtigal, Noel M.

    1991-01-01

    The Lanczos algorithm can be used both for eigenvalue problems and to solve linear systems. However, when applied to non-Hermitian matrices, the classical Lanczos algorithm is susceptible to breakdowns and potential instabilities. In addition, the biconjugate gradient (BCG) algorithm, which is the natural generalization of the conjugate gradient algorithm to non-Hermitian linear systems, has a second source of breakdowns, independent of the Lanczos breakdowns. Here, we present two new results. We propose an implementation of a look-ahead variant of the Lanczos algorithm which overcomes the breakdowns by skipping over those steps where a breakdown or a near-breakdown would occur. The new algorithm can handle look-ahead steps of any length and requires the same number of matrix-vector products and inner products per step as the classical Lanczos algorithm without look-ahead. Based on the proposed look-ahead Lanczos algorithm, we then present a novel BCG-like approach, the quasi-minimal residual (QMR) method, which avoids the second source of breakdowns in the BCG algorithm. We present details of the new method and discuss some of its properties. In particular, we discuss the relationship between QMR and BCG, showing how one can recover the BCG iterates, when they exist, from the QMR iterates. We also present convergence results for QMR, showing the connection between QMR and the generalized minimal residual (GMRES) algorithm, the optimal method in this class of methods. Finally, we give some numerical examples, both for eigenvalue computations and for non-Hermitian linear systems.
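    The QMR method described in this thesis is available in SciPy's sparse solvers; a quick illustration on a small non-Hermitian system (the tridiagonal test matrix is an arbitrary choice for demonstration, not one from the thesis):

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import qmr

# A nonsymmetric tridiagonal system: the off-diagonal entries differ,
# so A != A^H and the plain conjugate gradient method does not apply.
n = 50
A = diags([-1.0, 2.5, -0.5], offsets=[-1, 0, 1], shape=(n, n)).tocsc()
b = np.ones(n)

x, info = qmr(A, b)  # info == 0 signals convergence
rel_res = np.linalg.norm(b - A @ x) / np.linalg.norm(b)
```

The matrix here is strongly diagonally dominant, so QMR converges quickly; on harder non-Hermitian problems the look-ahead machinery described above is what protects the underlying Lanczos process from breakdowns.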

  16. 5 CFR 610.404 - Requirement for time-accounting method.

    Science.gov (United States)

    2010-01-01

    ... REGULATIONS HOURS OF DUTY Flexible and Compressed Work Schedules § 610.404 Requirement for time-accounting method. An agency that authorizes a flexible work schedule or a compressed work schedule under this...

  17. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The main purpose of this article is to discuss standardization and normalization of product standards for medical devices. It analyzes problems related to physical performance requirements and test methods encountered during the drafting of product standards and makes corresponding suggestions.

  18. DactyLoc : A minimally geo-referenced WiFi+GSM-fingerprint-based localization method for positioning in urban spaces

    DEFF Research Database (Denmark)

    Cujia, Kristian; Wirz, Martin; Kjærgaard, Mikkel Baun

    2012-01-01

    Fingerprinting-based localization methods relying on WiFi and GSM information provide sufficient localization accuracy for many mobile phone applications. Most of the existing approaches require a training set consisting of geo-referenced fingerprints to build a reference database. We propose...... a collaborative, semi-supervised WiFi+GSM fingerprinting method where only a small fraction of all fingerprints needs to be geo-referenced. Our approach enables indexing of areas in the absence of GPS reception as often found in urban spaces and indoors without manual labeling of fingerprints. The method takes...
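    The core of any fingerprinting approach is matching an observed signal-strength vector against a reference database; a generic nearest-neighbour sketch (the database entries, AP names, and distance metric are illustrative assumptions, not the DactyLoc algorithm itself):

```python
import math

# Hypothetical reference database: (fingerprint, position) pairs,
# where a fingerprint maps access-point id -> RSSI in dBm.
database = [
    ({"ap1": -40, "ap2": -70}, (0.0, 0.0)),
    ({"ap1": -70, "ap2": -45}, (10.0, 0.0)),
]

def locate(fingerprint, db):
    """Return the position of the reference fingerprint with the
    smallest Euclidean RSSI distance; missing APs are treated as a
    weak -100 dBm reading."""
    def dist(a, b):
        keys = set(a) | set(b)
        return math.sqrt(sum((a.get(k, -100) - b.get(k, -100)) ** 2
                             for k in keys))
    return min(db, key=lambda rec: dist(fingerprint, rec[0]))[1]
```

The record's contribution is orthogonal to this matching step: it shows how to build the reference database collaboratively when only a small fraction of fingerprints carry geo-references.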

  19. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-01-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal

  20. Minimally Invasive Parathyroidectomy

    Directory of Open Access Journals (Sweden)

    Lee F. Starker

    2011-01-01

    Full Text Available Minimally invasive parathyroidectomy (MIP is an operative approach for the treatment of primary hyperparathyroidism (pHPT. Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT.

  1. Functional Mobility Testing: A Novel Method to Establish Human System Interface Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar

    2008-01-01

    Across all fields of human-system interface design it is vital to possess a sound methodology dictating the constraints on the system based on the capabilities of the human user. These limitations may be based on strength, mobility, dexterity, cognitive ability, etc., and combinations thereof. Data collected in an isolated environment to determine, for example, maximal strength or maximal range of motion would indeed be adequate for establishing not-to-exceed type design limitations; however, these restraints on the system may be in excess of what is minimally needed. Resources may potentially be saved by having a technique to determine the minimum measurements a system must accommodate. This paper specifically deals with the creation of a novel methodology for establishing mobility requirements for a new generation of space suit design concepts. Historically, the Space Shuttle and the International Space Station vehicle and space hardware design requirements documents such as the Man-Systems Integration Standards and International Space Station Flight Crew Integration Standard explicitly stated that the designers should strive to provide the maximum joint range of motion capabilities exhibited by a minimally clothed human subject. In the course of developing the Human-Systems Integration Requirements (HSIR) for the new space exploration initiative (Constellation), an effort was made to redefine the mobility requirements in the interest of safety and cost. Systems designed for manned space exploration can receive compounded gains from simplified designs that are both initially less expensive to produce and lighter, and thereby cheaper to launch.

  2. Minimal abdominal incisions

    Directory of Open Access Journals (Sweden)

    João Carlos Magi

    2017-04-01

    Full Text Available Minimally invasive procedures aim to resolve the disease with minimal trauma to the body, resulting in a rapid return to activities and in reductions of infection, complications, costs and pain. Minimally incised laparotomy, sometimes referred to as minilaparotomy, is an example of such minimally invasive procedures. The aim of this study is to demonstrate the feasibility and utility of laparotomy with minimal incision based on the literature, exemplified with a case. The case in question describes reconstruction of intestinal transit using this incision: a young, HIV-positive male patient in the late postoperative period of ileotyphlectomy, terminal ileostomy and closure of the ascending colon for an acute perforated abdomen due to ileocolonic tuberculosis. The barium enema showed a proximal stump of the right colon near the ileostomy. Access to the cavity was made through the orifice resulting from the release of the stoma, with a side-to-side ileocolonic anastomosis with a 25 mm circular stapler and manual closure of the ileal stump. These surgeries require their own tactics, such as rigor in the lysis of adhesions, tissue traction, and hemostasis, in addition to requiring surgeon dexterity – but without the need for investments in technology; moreover, the learning curve is reported as being lower than that for videolaparoscopy. Laparotomy with minimal incision should be considered a valid and viable option in the treatment of surgical conditions.

  3. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  4. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  5. A minimally invasive approach to spleen histopathology in dogs: A new method for follow-up studies of spleen changes in the course of Leishmania infantum infection.

    Science.gov (United States)

    Santos, Silvana Ornelas; Fontes, Jonathan L M; Laranjeira, Daniela F; Vassallo, José; Barrouin-Melo, Stella Maria; Dos-Santos, Washington L C

    2016-10-01

    Severe forms of zoonotic visceral leishmaniosis (ZVL) are associated with disruption of the spleen structure. However, the study of spleen histology requires splenectomy or necropsy. In this work, we present a minimally invasive cell-block technique for studying spleen tissue histology in dogs with ZVL. We examined 13 dogs with and seven dogs without Leishmania infantum infection. The dogs with Leishmania infection had a lower frequency of lymphoid follicles (2/13; Fisher's test) than the uninfected dogs (5/7 exhibiting lymphoid follicles and a plasma cell score of 1). The dogs with Leishmania infection also presented with granulomas (8/13) and infected macrophages (5/13). These differences in the histological presentations of spleen tissue from infected and uninfected dogs corresponded to changes observed in conventional histology. Hence, the cell-block technique described here may be used in the follow-up care and study of dogs with ZVL and other diseases in both clinical practice and research. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Sludge minimization technologies - an overview

    Energy Technology Data Exchange (ETDEWEB)

    Oedegaard, Hallvard

    2003-07-01

    The management of wastewater sludge from wastewater treatment plants represents one of the major challenges in wastewater treatment today. The cost of sludge treatment amounts to more than the cost of the liquid treatment in many cases. Therefore the focus on and interest in sludge minimization is steadily increasing. In the paper an overview is given of sludge minimization (sludge mass reduction) options. It is demonstrated that sludge minimization may be a result of reduced production of sludge and/or disintegration processes that may take place both in the wastewater treatment stage and in the sludge stage. Various sludge disintegration technologies for sludge minimization are discussed, including mechanical methods (focusing on stirred ball-mill, high-pressure homogenizer, ultrasonic disintegrator), chemical methods (focusing on the use of ozone), physical methods (focusing on thermal and thermal/chemical hydrolysis) and biological methods (focusing on enzymatic processes). (author)

  7. Waste Minimization and Pollution Prevention Awareness Plan

    International Nuclear Information System (INIS)

    1992-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) and other legal requirements that are discussed in Section C, below. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as suggested by DOE Order 5400.1. The intent of this plan is to respond to and comply with the Department's policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Directorate-, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Directorates, Programs and Departments. Several Directorates have been reorganized, necessitating changes in the Directorate plans that were published in 1991

  8. Comparison of two methods for minimizing the effect of delayed charge on the dose delivered with a synchrotron based discrete spot scanning proton beam

    International Nuclear Information System (INIS)

    Whitaker, Thomas J.; Beltran, Chris; Tryggestad, Erik; Kruse, Jon J.; Remmes, Nicholas B.; Tasson, Alexandria; Herman, Michael G.; Bues, Martin

    2014-01-01

    Purpose: Delayed charge is a small amount of charge that is delivered to the patient after the planned irradiation is halted, which may degrade the quality of the treatment by delivering unwarranted dose to the patient. This study compares two methods for minimizing the effect of delayed charge on the dose delivered with a synchrotron based discrete spot scanning proton beam. Methods: The delivery of several treatment plans was simulated by applying a normally distributed value of delayed charge, with a mean of 0.001(SD 0.00025) MU, to each spot. Two correction methods were used to account for the delayed charge. Method one (CM1), which is in active clinical use, accounts for the delayed charge by adjusting the MU of the current spot based on the cumulative MU. Method two (CM2) in addition reduces the planned MU by a predicted value. Every fraction of a treatment was simulated using each method and then recomputed in the treatment planning system. The dose difference between the original plan and the sum of the simulated fractions was evaluated. Both methods were tested in a water phantom with a single beam and simple target geometry. Two separate phantom tests were performed. In one test the dose per fraction was varied from 0.5 to 2 Gy using 25 fractions per plan. In the other test the number of fractions was varied from 1 to 25, using 2 Gy per fraction. Three patient plans were used to determine the effect of delayed charge on the delivered dose under realistic clinical conditions. The order of spot delivery using CM1 was investigated by randomly selecting the starting spot for each layer, and by alternating per layer the starting spot from first to last. Only discrete spot scanning was considered in this study. Results: Using the phantom setup and varying the dose per fraction, the maximum dose difference for each plan of 25 fractions was 0.37–0.39 Gy and 0.03–0.05 Gy for CM1 and CM2, respectively. While varying the total number of fractions, the maximum dose

  9. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  10. Adjust the method of the FMEA to the requirements of the aviation industry

    Directory of Open Access Journals (Sweden)

    Andrzej FELLNER

    2015-12-01

    Full Text Available The article presents a summary of current methods used in aviation and rail transport. It also contains a proposal to adapt the FMEA method to the latest requirements of the airline industry. The authors suggest tables of the indicators Zn, Pr and Dt necessary to implement the FMEA method of risk analysis, taking into account current achievements in aerospace and rail safety. They also propose acceptable limits for the RPN number, which allow threats to be classified.
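    The risk measure behind the proposed indicator tables can be sketched as follows; the indicator names Zn, Pr and Dt come from the abstract, but the 1-10 rating scales and the acceptability limit of 100 are illustrative assumptions, not the thresholds the article proposes.

```python
def rpn(zn, pr, dt):
    """Risk Priority Number: the product of the severity (Zn),
    probability of occurrence (Pr) and detectability (Dt) indicators,
    each assumed here to be rated on a 1-10 scale."""
    for v in (zn, pr, dt):
        if not 1 <= v <= 10:
            raise ValueError("indicator ratings assumed to lie in 1..10")
    return zn * pr * dt

def classify(value, limit=100):
    """Compare an RPN against an acceptability limit (hypothetical value)."""
    return "acceptable" if value <= limit else "requires corrective action"
```

Classifying every failure mode's RPN against such a limit is what allows the threats in an FMEA worksheet to be ranked and triaged.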

  11. Comparison of two methods for minimizing the effect of delayed charge on the dose delivered with a synchrotron based discrete spot scanning proton beam.

    Science.gov (United States)

    Whitaker, Thomas J; Beltran, Chris; Tryggestad, Erik; Bues, Martin; Kruse, Jon J; Remmes, Nicholas B; Tasson, Alexandria; Herman, Michael G

    2014-08-01

    Delayed charge is a small amount of charge that is delivered to the patient after the planned irradiation is halted, which may degrade the quality of the treatment by delivering unwarranted dose to the patient. This study compares two methods for minimizing the effect of delayed charge on the dose delivered with a synchrotron based discrete spot scanning proton beam. The delivery of several treatment plans was simulated by applying a normally distributed value of delayed charge, with a mean of 0.001(SD 0.00025) MU, to each spot. Two correction methods were used to account for the delayed charge. Method one (CM1), which is in active clinical use, accounts for the delayed charge by adjusting the MU of the current spot based on the cumulative MU. Method two (CM2) in addition reduces the planned MU by a predicted value. Every fraction of a treatment was simulated using each method and then recomputed in the treatment planning system. The dose difference between the original plan and the sum of the simulated fractions was evaluated. Both methods were tested in a water phantom with a single beam and simple target geometry. Two separate phantom tests were performed. In one test the dose per fraction was varied from 0.5 to 2 Gy using 25 fractions per plan. In the other test the number of fractions was varied from 1 to 25, using 2 Gy per fraction. Three patient plans were used to determine the effect of delayed charge on the delivered dose under realistic clinical conditions. The order of spot delivery using CM1 was investigated by randomly selecting the starting spot for each layer, and by alternating per layer the starting spot from first to last. Only discrete spot scanning was considered in this study. Using the phantom setup and varying the dose per fraction, the maximum dose difference for each plan of 25 fractions was 0.37-0.39 Gy and 0.03-0.05 Gy for CM1 and CM2, respectively. 
While varying the total number of fractions, the maximum dose difference increased at a rate
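    The two correction strategies can be illustrated with a toy simulation reconstructed from the abstract (the function, loop structure and variable names are illustrative, not the clinical code; the MU value and the delayed-charge distribution follow the figures quoted above).

```python
import random

def simulate_delivery(planned_mu, method="CM1", predicted=0.001):
    """Toy model of discrete spot delivery with delayed charge.

    CM1 adjusts each spot's commanded MU so the cumulative delivered MU
    tracks the cumulative planned MU, absorbing earlier overshoot.
    CM2 additionally subtracts a predicted delayed charge from each
    command before delivery.
    """
    delivered = []
    cum_planned = cum_delivered = 0.0
    for mu in planned_mu:
        cum_planned += mu
        command = cum_planned - cum_delivered      # absorb earlier overshoot
        if method == "CM2":
            command -= predicted                   # pre-compensate the overshoot
        delayed = random.gauss(0.001, 0.00025)     # charge after the cutoff
        actual = command + delayed
        delivered.append(actual)
        cum_delivered += actual
    return delivered
```

In this model CM1 leaves a residual overshoot of roughly one delayed charge per spot at delivery time (and one at the end of the field), whereas CM2's residual is only the prediction error, mirroring the order-of-magnitude gap between the two methods reported above.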

  12. Minimized Capillary End Effect During CO2 Displacement in 2-D Micromodel by Manipulating Capillary Pressure at the Outlet Boundary in Lattice Boltzmann Method

    Science.gov (United States)

    Kang, Dong Hun; Yun, Tae Sup

    2018-02-01

    We propose a new outflow boundary condition to minimize the capillary end effect for a pore-scale CO2 displacement simulation. The Rothman-Keller lattice Boltzmann method with multi-relaxation time is implemented to manipulate a nonflat wall and inflow-outflow boundaries with physically acceptable fluid properties in a 2-D microfluidic chip domain. Introducing a mean capillary pressure, acting at the CO2-water interface, to the nonwetting fluid at the outlet effectively prevents the CO2 injection pressure from suddenly dropping upon CO2 breakthrough, such that continuous CO2 invasion and an increase of CO2 saturation are allowed. This phenomenon becomes most pronounced at a capillary number of logCa = -5.5, while capillary fingering and massive displacement of CO2 prevail at low and high capillary numbers, respectively. Simulations with different domain lengths in homogeneous and heterogeneous domains reveal that capillary pressure and CO2 saturation near the inlet are reproducible with the proposed boundary condition. The residual CO2 saturation consistently increases with capillary number, corroborated by experimental evidence. The determination of the mean capillary pressure and its sensitivity are also discussed. The proposed boundary condition is commonly applicable to other pore-scale simulations to accurately capture the spatial distribution of nonwetting fluid and the corresponding displacement ratio.
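    For reference, the capillary number that organizes these displacement regimes is the ratio of viscous to capillary forces; a minimal helper follows (the fluid property values are illustrative choices that land near the logCa = -5.5 regime quoted above, not parameters taken from the simulations).

```python
import math

def capillary_number(mu, v, gamma):
    """Ca = mu * v / gamma: dynamic viscosity (Pa*s) times displacement
    velocity (m/s) divided by interfacial tension (N/m)."""
    return mu * v / gamma

# Illustrative values: water-like viscosity (1e-3 Pa*s) and an assumed
# CO2-water interfacial tension of 0.025 N/m give logCa near -5.5.
ca = capillary_number(1e-3, 7.9e-5, 0.025)
log_ca = math.log10(ca)
```

Low logCa favors capillary fingering and high logCa favors viscous-dominated massive displacement, which is the transition the record maps out.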

  13. Waste minimization handbook, Volume 1

    International Nuclear Information System (INIS)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996

  14. Waste minimization handbook, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  15. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember, through an innovative methodology utilizing functional mobility. A novel method was employed involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected based on relevance and criticality from a larger list of tasks that may be carried out by the crew. Kinematic data were processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  16. Comparison of different methods to extract the required coefficient of friction for level walking.

    Science.gov (United States)

    Chang, Wen-Ruey; Chang, Chien-Chi; Matz, Simon

    2012-01-01

    The required coefficient of friction (RCOF) is an important predictor for slip incidents. Despite the wide use of the RCOF, there is no standardised method for identifying the RCOF from ground reaction forces. This article presents a comparison of the outcomes from seven different methods, derived from those reported in the literature, for identifying the RCOF from the same data. While commonly used methods are based on a normal force threshold, percentage of stance phase or time from heel contact, a newly introduced hybrid method is based on a combination of normal force, time and direction of increase in coefficient of friction. Although no major differences were found with these methods in more than half the strikes, significant differences were found in a substantial portion of strikes. Potential problems with some of these methods were identified and discussed, and they appear to be overcome by the hybrid method. No standard method exists for determining the required coefficient of friction (RCOF), an important predictor for slipping. In this study, RCOF values from a single data set, obtained using various methods from the literature, differed considerably for a substantial portion of strikes. A hybrid method may yield improved results.
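    As a concrete illustration of the normal-force-threshold family of methods mentioned above, extracting an RCOF from ground reaction force traces might look like the sketch below (the threshold value and the use of the trace's peak are simplifying assumptions; the seven published methods differ precisely in such details).

```python
import numpy as np

def required_cof(fx, fy, fz, fz_threshold):
    """Ratio of resultant shear to normal ground reaction force,
    evaluated only where the normal force exceeds fz_threshold;
    the peak of the resulting trace is reported as the RCOF."""
    fx, fy, fz = (np.asarray(a, dtype=float) for a in (fx, fy, fz))
    mask = fz > fz_threshold
    cof = np.hypot(fx[mask], fy[mask]) / fz[mask]
    return float(cof.max())
```

Thresholding matters because the shear-to-normal ratio blows up near heel contact and toe-off, where the normal force is small; the hybrid method adds time and COF-slope criteria on top of this basic filter.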

  17. Mission from Mars - a method for exploring user requirements for children in a narrative space

    DEFF Research Database (Denmark)

    Dindler, Christian; Ludvigsen, Martin; Lykke-Olesen, Andreas

    2005-01-01

    In this paper a particular design method is put forward as a supplement to existing descriptive approaches to current practice studies, especially suitable for gathering requirements for the design of children's technology. The Mission from Mars method was applied during the design of an electronic school bag (eBag). The three-hour collaborative session provides first-hand insight into children's practice in a fun and intriguing way. The method is proposed as a supplement to existing descriptive design methods for interaction design and children.

  18. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there were a certain amount of defects regarding missing or defective stimuli and responses, which can be detected using our proposed method if this method is used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.

  19. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-11-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.

  20. A step by step selection method for the location and the size of a waste-to-energy facility targeting the maximum output energy and minimization of gate fee.

    Science.gov (United States)

    Kyriakis, Efstathios; Psomopoulos, Constantinos; Kokkotis, Panagiotis; Bourtsalas, Athanasios; Themelis, Nikolaos

    2017-06-23

    This study develops an algorithm that presents a step-by-step method for selecting the location and size of a waste-to-energy facility, targeting maximum energy output while also considering a basic obstacle, which in many cases is the gate fee. Various parameters were identified and evaluated in order to formulate the proposed decision-making method in the form of an algorithm. The principal simulation input is the amount of municipal solid waste (MSW) available for incineration, which, together with its net calorific value, is the most important factor for the feasibility of the plant. Moreover, the research focuses both on the parameters that could increase energy production and on those that affect the R1 energy efficiency factor. The final gate fee is estimated through an economic analysis of the entire project, investigating both the expenses and the revenues expected according to the selected site and the outputs of the facility. A number of common revenue methods were included in the algorithm. The developed algorithm has been validated using three case studies in Greece: Athens, Thessaloniki, and Central Greece, where the cities of Larisa and Volos were selected for the application of the proposed decision-making tool. These case studies were selected based on a previous publication by two of the authors in which these areas were examined. Results reveal that a «solid» methodological approach to selecting the site and size of a waste-to-energy (WtE) facility is feasible. However, maximizing the energy efficiency factor R1 requires high utilization factors, while minimizing the final gate fee requires a high R1 and high metals recovery from the bottom ash, as well as economic exploitation of recovered raw materials, if any.
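
    The R1 energy efficiency factor referred to above has a standard definition (Annex II of the EU Waste Framework Directive, 2008/98/EC): R1 = (Ep − (Ef + Ei)) / (0.97 × (Ew + Ef)), where Ep is the annual energy produced, Ef the energy input from fuels contributing to steam production, Ei other imported energy, and Ew the energy contained in the treated waste. A minimal sketch with our own function name and illustrative figures (not values from the study):

```python
def r1_efficiency(ep, ef, ei, ew):
    """R1 energy efficiency factor per Annex II of the EU Waste Framework
    Directive. All arguments in consistent energy units (e.g. GJ/yr):
    ep: energy produced (heat x 1.1, electricity x 2.6, per the directive)
    ef: energy input from fuels contributing to steam production
    ei: other imported energy (excluding ew and ef)
    ew: energy contained in the treated waste (net calorific value basis)
    """
    return (ep - (ef + ei)) / (0.97 * (ew + ef))

# Illustrative numbers only: a plant recovering 10 GJ with modest imports.
r1 = r1_efficiency(ep=10.0, ef=1.0, ei=2.0, ew=9.0)
```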

  1. Evaluation of Irrigation Methods for Highbush Blueberry. I. Growth and Water Requirements of Young Plants

    Science.gov (United States)

    A study was conducted in a new field of northern highbush blueberry (Vaccinium corymbosum L. 'Elliott') to determine the effects of different irrigation methods on growth and water requirements of uncropped plants during the first 2 years after planting. The plants were grown on mulched, raised beds...

  2. 21 CFR 111.320 - What requirements apply to laboratory methods for testing and examination?

    Science.gov (United States)

    2010-04-01

    21 CFR 111.320 (Food and Drugs, revised as of 2010-04-01): What requirements apply to laboratory methods for testing and examination? Section 111.320, Food and Drugs, FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED), FOOD FOR HUMAN CONSUMPTION, CURRENT GOOD MANUFACTURING...

  3. Design requirements, criteria and methods for seismic qualification of CANDU power plants

    International Nuclear Information System (INIS)

    Singh, N.; Duff, C.G.

    1979-10-01

    This report describes the requirements and criteria for the seismic design and qualification of systems and equipment in CANDU nuclear power plants. Acceptable methods and techniques for seismic qualification of CANDU nuclear power plants to mitigate the effects or the consequences of earthquakes are also described. (auth)

  4. New concepts, requirements and methods concerning the periodic inspection of the CANDU fuel channels

    International Nuclear Information System (INIS)

    Denis, J.R.

    1995-01-01

    Periodic inspection of fuel channels is essential for a proper assessment of the structural integrity of these vital components of the reactor. The development of wet channel technologies for non-destructive examination (NDE) of pressure tubes and the high technical performance and reliability of the CIGAR equipment have led, in less than 10 years, to the accumulation of a very significant volume of data concerning the flaw mechanisms and structural behaviour of the CANDU fuel channels. On this basis, a new form of the CAN/CSA-N285.4 Standard for Periodic Inspection of CANDU Nuclear Power Plant components was elaborated, introducing new concepts and requirements, in accord with the powerful NDE methods now available. This paper presents these concepts and requirements, and discusses the NDE methods, presently used or under development, to satisfy these requirements. Specific features regarding the fuel channel inspections of Cernavoda NGS Unit 1 are also discussed. (author)

  5. Fault tree construction of hybrid system requirements using qualitative formal method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Cha, Sung-Deok

    2005-01-01

    When specifying requirements for software controlling hybrid systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. In this paper, we propose Causal Requirements Safety Analysis (CRSA) as a technique to qualitatively evaluate the causal relationship between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationships among them. Using a simplified electrical power system as an example, we describe step-by-step procedures for conducting CRSA. Our experience of applying CRSA to perform fault tree analysis on requirements for the Wolsong nuclear power plant shutdown system indicates that CRSA is an effective technique in assisting safety engineers.

  6. Annual Waste Minimization Summary Report

    International Nuclear Information System (INIS)

    Haworth, D.M.

    2011-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2010. The NNSA/NSO Pollution Prevention Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment.

  7. Droplet Digital PCR for BCR/ABL(P210) Detecting of CML: A High Sensitive Method of the Minimal Residual Disease& Disease Progression.

    Science.gov (United States)

    Wang, Wen-Jun; Zheng, Chao-Feng; Liu, Zhuang; Tan, Yan-Hong; Chen, Xiu-Hua; Zhao, Bin-Liang; Li, Guo-Xia; Xu, Zhi-Fang; Ren, Fang-Gang; Zhang, Yao-Fang; Chang, Jian-Mei; Wang, Hong-Wei

    2018-04-25

    The present study aimed to establish a droplet digital PCR (dd-PCR) assay for monitoring minimal residual disease (MRD) in patients with BCR/ABL (P210)-positive CML, thereby achieving deep-level monitoring of tumor load and determining efficacy to guide clinically individualized treatment. Using dd-PCR and RT-qPCR, two cell suspensions obtained from K562 cells and normal peripheral blood mononuclear cells by gradient dilution were measured at the cellular level. At the peripheral blood (PB) level, 61 cases of CML-chronic phase (CML-CP) were obtained after tyrosine kinase inhibitor (TKI) treatment and regular follow-ups. By RT-qPCR, the BCR/ABL (P210) fusion gene was undetectable in PB in three successive analyses performed once every three months; dd-PCR was performed simultaneously with the last, equal amount of cDNA. Ten CML patients with MR4.5 were followed up by the two methods. At the cellular level, the consistency of dd-PCR and RT-qPCR results reached R² ≥ 0.99, with a conversion equation of Y = 33.148X^1.222 (Y: dd-PCR results; X: RT-qPCR results). In the dd-PCR test, 11 of the 61 CML patients (18.03%) tested positive, a statistically significant difference, with positivity detected by dd-PCR 3 months earlier than by RT-qPCR. Compared with RT-qPCR, dd-PCR is more sensitive, enabling accurate conversion of dd-PCR results into internationally standardized RT-qPCR results via the conversion equation, to achieve a deeper molecular biology-based stratification of BCR/ABL (P210) MRD. It has reference value for monitoring disease progression in the clinic. This article is protected by copyright. All rights reserved.
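
    A small sketch of how the fitted conversion equation Y = 33.148X^1.222 reported above could be applied in both directions. The function names are our own, and the equation is specific to this study's data set; it is shown only to make the conversion concrete.

```python
def ddpcr_from_rtqpcr(x):
    """Forward conversion Y = 33.148 * X**1.222 (Y: dd-PCR result,
    X: RT-qPCR result), using the equation fitted in the study."""
    return 33.148 * x ** 1.222

def rtqpcr_from_ddpcr(y):
    """Inverse of the fitted equation: converts a dd-PCR reading back to
    the internationally standardized RT-qPCR scale."""
    return (y / 33.148) ** (1.0 / 1.222)

# Round-tripping a value through both directions recovers the input.
x = 0.5
y = ddpcr_from_rtqpcr(x)
x_back = rtqpcr_from_ddpcr(y)
```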

  8. Quality assurance requirements and methods for high level waste package acceptability

    International Nuclear Information System (INIS)

    1992-12-01

    This document should serve as guidance for assigning the necessary items to control the conditioning process in such a way that waste packages are produced in compliance with the waste acceptance requirements. It is also provided to promote the exchange of information on quality assurance requirements and on the application of quality assurance methods associated with the production of high level waste packages, to ensure that these waste packages comply with the requirements for transportation, interim storage and waste disposal in deep geological formations. The document is intended to assist both the operators of conditioning facilities and repositories as well as national authorities and regulatory bodies, involved in the licensing of the conditioning of high level radioactive wastes or in the development of deep underground disposal systems. The document recommends the quality assurance requirements and methods which are necessary to generate data for these parameters identified in IAEA-TECDOC-560 on qualitative acceptance criteria, and indicates where and when the control methods can be applied, e.g. in the operation or commissioning of a process or in the development of a waste package design. Emphasis is on the control of the process and little reliance is placed on non-destructive or destructive testing. Qualitative criteria, relevant to disposal of high level waste, are repository dependent and are not addressed here. 37 refs, 3 figs, 2 tabs

  9. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983), which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  10. New trends in minimally invasive urological surgery

    Directory of Open Access Journals (Sweden)

    Prabhakar Rajan

    2009-10-01

    Purpose: The perceived benefits of minimally-invasive surgery include less postoperative pain, shorter hospitalization, reduced morbidity and better cosmesis while maintaining diagnostic accuracy and therapeutic outcome. We review the new trends in minimally-invasive urological surgery. Materials and methods: We reviewed the English language literature using the National Library of Medicine database to identify the latest technological advances in minimally-invasive surgery with particular reference to urology. Results: Amongst other advances, studies incorporating needlescopic surgery, laparoendoscopic single-site surgery, magnetic anchoring and guidance systems, natural orifice transluminal endoscopic surgery and flexible robots were considered of interest. The results from initial animal and human studies are also outlined. Conclusion: Minimally-invasive surgery continues to evolve to meet the demands of operators and patients. Many novel technologies are still in the testing phase, whilst others have entered clinical practice. Further evaluation is required to confirm the safety and efficacy of these techniques and validate the published reports.

  11. Regularity of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht

    2010-01-01

    "Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained. One of the main features of free boundary value problems for minimal surfaces is t

  12. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    Science.gov (United States)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.
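
    The acceptance sampling by variables referred to above can be illustrated with the standard one-sided k-method, in which a lot is accepted when (x̄ − LSL)/s ≥ k. This is a generic sketch, not one of the NESC calculators themselves; the acceptability constant k would normally be read from a sampling plan table chosen for specified producer and consumer risks, and here it is a plain parameter.

```python
from statistics import mean, stdev

def accept_lot_variables(measurements, lower_spec, k):
    """One-sided variables acceptance sampling (k-method sketch): accept
    the lot when (sample mean - LSL) / sample std dev >= k.

    measurements : sample of the measured quality characteristic
    lower_spec   : lower specification limit (LSL)
    k            : acceptability constant from the chosen sampling plan
    """
    xbar, s = mean(measurements), stdev(measurements)
    return (xbar - lower_spec) / s >= k

# Sample with mean 12 and s ~ 1.58; distance to LSL=9 is ~1.9 std devs,
# so the lot passes at k = 1.5 but fails at k = 2.0.
ok_loose = accept_lot_variables([10, 11, 12, 13, 14], 9.0, 1.5)
ok_tight = accept_lot_variables([10, 11, 12, 13, 14], 9.0, 2.0)
```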

  13. A Method for and Issues Associated with the Determination of Space Suit Joint Requirements

    Science.gov (United States)

    Matty, Jennifer E.; Aitchison, Lindsay

    2009-01-01

    In the design of a new space suit it is necessary to have requirements that define what mobility space suit joints should be capable of achieving, both as a system and at the component level. NASA elected to divide mobility into its constituent parts, range of motion (ROM) and torque, in an effort to develop clean design requirements that limit subject performance bias and are easily verified. Unfortunately, measurements of mobility can be difficult to obtain. Current technologies, such as the Vicon motion capture system, allow for relatively easy benchmarking of range of motion (ROM) for a wide array of space suit systems. The ROM evaluations require subjects in the suit in order to accurately evaluate the ranges humans can achieve in the suit. However, when it comes to torque, there are significant challenges both for benchmarking current performance and for writing requirements for future suits. This is reflected in the fact that torque definitions have been applied to very few types of space suits, and with limited success in defining all the joints accurately. This paper discusses the advantages and disadvantages of historical joint torque evaluation methods, describes more recent efforts directed at benchmarking joint torques of prototype space suits, and provides an outline for how NASA intends to address joint torque in design requirements for the Constellation Space Suit System (CSSS).

  14. The molecular variability analysis of the RNA 3 of fifteen isolates of Prunus necrotic ringspot virus sheds light on the minimal requirements for the synthesis of its subgenomic RNA.

    Science.gov (United States)

    Aparicio, Frederic; Pallás, Vicente

    2002-01-01

    The nucleotide sequences of the RNA 3 of fifteen isolates of Prunus necrotic ringspot virus (PNRSV), varying in the symptomatology they cause in six different Prunus spp., were determined. Analysis of the molecular variability has allowed us not only to study the phylogenetic relationships among the isolates but also to evaluate the minimal requirements for the synthesis of the subgenomic RNA in the Ilarvirus genus and to compare them with other members of the Bromoviridae family. Computer-assisted comparisons recently led Jaspars (Virus Genes 17, 233-242, 1998) to propose that a hairpin structure in the viral minus-strand RNA is required for subgenomic promoter activity of viruses from at least two, and possibly all five, genera in the Bromoviridae family. For PNRSV and Apple mosaic virus two stable hairpins were proposed, whereas for the rest of the Ilarviruses and the other four genera of the Bromoviridae family only one stable hairpin was predicted. Comparative analysis of this region among the fifteen PNRSV isolates characterized in this study revealed that two of them showed a 12-nt deletion that led to the disappearance of the hairpin most proximal to the initiation site. Interestingly, the only hairpin found in these two isolates is very similar in primary and secondary structure to the one previously shown in Brome mosaic virus to be required for the synthesis of the subgenomic RNA. In this hairpin, the molecular diversity was concentrated mostly at the loop, whereas compensatory mutations were observed at the base of the stem, strongly suggesting its functional relevance. The evolutionary implications of these observations are discussed.

  15. Identifying and prioritizing customer requirements from tractor production by QFD method

    Directory of Open Access Journals (Sweden)

    H Taghizadeh

    2017-05-01

    Introduction: Discovering and understanding customer needs and expectations are important factors in customer satisfaction and play a vital role in an organization's ability to maintain its position among competitors and achieve customer satisfaction, which is critical to designing a successful product; successful organizations must therefore meet customer needs, including the quality of the products or services they offer. Quality Function Deployment (QFD) is a technique for studying the demands and needs of customers that places greater emphasis on the customer's interests. The QFD method in general implements various tools and methods for reaching qualitative goals, but its most important tool is the house of quality diagram. The Analytic Hierarchy Process (AHP) is a well-known MADM method, based on pairwise comparisons, used to determine the priority of the factors under study. Given the effectiveness of the QFD method in explicating customers' demands and obtaining customer satisfaction, the research pursued the following question: how can QFD explicate the real demands and requirements of customers for the final tractor product, and how are these demands and requirements prioritized from the customers' viewpoint? Accordingly, the aim of this study was to identify and prioritize the customer requirements for the Massey Ferguson (MF) 285 tractor produced by the Iran Tractor Manufacturing Company, using the t-student statistical test and the AHP and QFD methods. Materials and Methods: The research method was descriptive, and the statistical population included all tractor customers of the Tractor Manufacturing Company in Iran from March 2011 to March 2015. The statistical sample size of 171 was determined with the Cochran index. Moreover, the opinions of 20 experts were considered for determining the product's technical requirements.
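
    The AHP step mentioned above derives priority weights from a pairwise-comparison matrix. Below is a minimal sketch using the common geometric-mean (row) approximation of the principal eigenvector; the matrix values and function name are invented for illustration and are not from the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a reciprocal AHP pairwise-comparison matrix,
    using the geometric-mean-of-rows approximation of the principal
    eigenvector. Weights are normalized to sum to 1."""
    m = np.asarray(pairwise, dtype=float)
    gm = m.prod(axis=1) ** (1.0 / m.shape[0])  # geometric mean of each row
    return gm / gm.sum()

# Two criteria where the first is judged 3x as important as the second.
w = ahp_weights([[1.0, 3.0],
                 [1.0 / 3.0, 1.0]])
```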

  16. A method for determining customer requirement weights based on TFMF and TLR

    Science.gov (United States)

    Ai, Qingsong; Shu, Ting; Liu, Quan; Zhou, Zude; Xiao, Zheng

    2013-11-01

    'Customer requirements' (CRs) management plays an important role in enterprise systems (ESs) by processing customer-focused information. Quality function deployment (QFD) is one of the main CR analysis methods. Because CR weights are a crucial input for QFD, we developed a method for determining CR weights based on the trapezoidal fuzzy membership function (TFMF) and 2-tuple linguistic representation (TLR). To improve the accuracy of CR weights, we propose applying TFMF to describe CR weights so that they can be appropriately represented. Because fuzzy logic is not capable of aggregating information without loss, the TLR model is adopted as well. We first describe the basic concepts of TFMF and TLR and then introduce an approach to compute CR weights. Finally, an example is provided to explain and verify the proposed method.
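
    A trapezoidal fuzzy membership function of the kind named above is defined by four abscissae (a, b, c, d): membership rises linearly on [a, b], equals 1 on [b, c], and falls linearly on [c, d]. A minimal sketch (the function name and the example parameters are illustrative, not from the paper):

```python
def trapezoidal_membership(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy number (a, b, c, d),
    with a <= b <= c <= d: 0 outside [a, d], 1 on the plateau [b, c],
    linear on the two sloping sides."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)   # rising edge
    return (d - x) / (d - c)       # falling edge

# A weight judged "around 2-3" on some scale: full membership on [2, 3].
mu = trapezoidal_membership(2.5, 1.0, 2.0, 3.0, 4.0)
```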

  17. Assessment of LANL waste minimization plan

    International Nuclear Information System (INIS)

    Davis, K.D.; McNair, D.A.; Jennrich, E.A.; Lund, D.M.

    1991-04-01

    The objective of this report is to evaluate the Los Alamos National Laboratory (LANL) Waste Minimization Plan to determine whether it meets applicable internal (DOE) and regulatory requirements. The intent of the effort is to assess the higher-level elements of the documentation to determine whether they have been addressed, rather than the detailed mechanics of the program's implementation. The requirement for a Waste Minimization Plan is based on several DOE Orders as well as environmental laws and regulations. Table 2-1 provides a list of the major documents or regulations that require waste minimization efforts. The table also summarizes the applicable requirements.

  18. Performance of methods for estimation of table beet water requirement in Alagoas

    Directory of Open Access Journals (Sweden)

    Daniella P. dos Santos

    ABSTRACT: Optimization of water use in agriculture is fundamental, particularly in regions where water scarcity is intense, requiring the adoption of technologies that promote increased irrigation efficiency. The objective of this study was to evaluate evapotranspiration models and to estimate the crop coefficients of beet grown in a drainage lysimeter in the Agreste region of Alagoas. The experiment was conducted at the Campus of the Federal University of Alagoas - UFAL, in the municipality of Arapiraca, AL, between March and April 2014. Crop evapotranspiration (ETc) was estimated in drainage lysimeters, and reference evapotranspiration (ETo) by the Penman-Monteith-FAO 56 and Hargreaves-Samani methods. The Hargreaves-Samani method presented a good performance index for ETo estimation compared with the Penman-Monteith-FAO method, indicating that it is adequate for the study area. Beet ETc showed a cumulative demand of 202.11 mm against a cumulative reference evapotranspiration of 152.00 mm. Kc values determined using the Penman-Monteith-FAO 56 and Hargreaves-Samani methods were overestimated in comparison with the Kc values of the FAO-56 standard method. With the obtained results, it is possible to correct the equations of the methods for the region, allowing for adequate irrigation management.
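
    The Hargreaves-Samani method evaluated above estimates ETo from air temperature and extraterrestrial radiation alone: ETo = 0.0023 · Ra · (Tmean + 17.8) · (Tmax − Tmin)^0.5, with Ra expressed in evaporation-equivalent mm/day. A minimal sketch of the standard equation (the example values are illustrative, not data from the study):

```python
from math import sqrt

def eto_hargreaves_samani(tmax, tmin, ra_mm_day):
    """Hargreaves-Samani reference evapotranspiration (mm/day).

    tmax, tmin : daily maximum/minimum air temperature (deg C)
    ra_mm_day  : extraterrestrial radiation in evaporation-equivalent mm/day
    """
    tmean = (tmax + tmin) / 2.0
    return 0.0023 * ra_mm_day * (tmean + 17.8) * sqrt(tmax - tmin)

# Illustrative warm-climate day: ~4.7 mm/day of reference evapotranspiration.
eto = eto_hargreaves_samani(tmax=30.0, tmin=20.0, ra_mm_day=15.0)
```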

  19. Minimally invasive orthognathic surgery.

    Science.gov (United States)

    Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J

    2009-02-01

    Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.

  20. Correlates of minimal dating.

    Science.gov (United States)

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  1. Hexavalent Chromium Minimization Strategy

    Science.gov (United States)

    2011-05-01

    DoD Logistics Initiative - Hexavalent Chromium Minimization (non-chrome primer development). Management Office of the Secretary of Defense, Hexavalent Chromium Minimization Strategy report, 2011.

  2. Minimal Super Technicolor

    DEFF Research Database (Denmark)

    Antola, M.; Di Chiara, S.; Sannino, F.

    2011-01-01

    We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills, which breaks to N=1 via the electroweak interactions. This is a well defined, economical......, between unparticle physics and Minimal Walking Technicolor. We also consider other N=1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass.

  3. A mixed-methods systematic review protocol to examine the use of physical restraint with critically ill adults and strategies for minimizing their use.

    Science.gov (United States)

    Rose, Louise; Dale, Craig; Smith, Orla M; Burry, Lisa; Enright, Glenn; Fergusson, Dean; Sinha, Samir; Wiesenfeld, Lesley; Sinuff, Tasnim; Mehta, Sangeeta

    2016-11-21

    Critically ill patients frequently experience severe agitation, placing them at risk of harm. Physical restraint is common in intensive care units (ICUs) owing to clinician concerns about safety. However, physical restraint may not prevent medical device removal and has been associated with negative physical and psychological consequences. While professional society guidelines, legislation, and accreditation standards recommend physical restraint minimization, guidelines for critically ill patients are over a decade old, with recommendations that are non-specific. Our systematic review will synthesize evidence on physical restraint in critically ill adults, with the primary objective of identifying effective minimization strategies. Two authors will independently search the following sources from inception to July 2016: Ovid MEDLINE, CINAHL, Embase, Web of Science, Cochrane Library, PROSPERO, Joanna Briggs Institute, grey literature, professional society websites, and the International Clinical Trials Registry Platform. We will include quantitative and qualitative study designs, clinical practice guidelines, policy documents, and professional society recommendations relevant to physical restraint of critically ill adults. Authors will independently perform data extraction in duplicate and complete risk of bias and quality assessment using recommended tools. We will assess evidence quality for quantitative studies using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach and for qualitative studies using the Confidence in the Evidence from Reviews of Qualitative Research (CERQual) guidelines. Outcomes of interest include (1) efficacy/effectiveness of physical restraint minimization strategies; (2) adverse events (unintentional device removal, psychological impact, physical injury) and associated benefits including harm prevention; (3) ICU outcomes (ventilation duration, length of stay, and mortality); (4) prevalence, incidence, patterns of use

  4. Method for calculating required shielding in medical x-ray rooms

    International Nuclear Information System (INIS)

    Karppinen, J.

    1997-10-01

    The new annual radiation dose limits - 20 mSv (previously 50 mSv) for radiation workers and 1 mSv (previously 5 mSv) for other persons - imply that the adequacy of existing radiation shielding must be re-evaluated. In principle, one could assume that the thicknesses of old radiation shields should be increased by about one or two half-value layers in order to comply with the new dose limits. However, the assumptions made in the earlier shielding calculations are highly conservative; the required shielding was often determined by applying the maximum high voltage of the x-ray tube to the whole workload. A more realistic calculation shows that increased shielding is typically not necessary if more practical x-ray tube voltages are used in the evaluation. We have developed a PC-based method for calculating x-ray shielding that is more realistic than the highly conservative method formerly used. The method may be used to evaluate an existing shield for compliance with the new regulations. As examples of these calculations, typical x-ray rooms are considered. The lead and concrete thickness requirements as a function of x-ray tube voltage and workload are also given in tables. (author)
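    The abstract does not reproduce the calculation itself; the sketch below only illustrates the general workload-based barrier calculation that such tools automate, using a single-HVL broad-beam approximation and entirely hypothetical numbers (real shielding methods use transmission curves that depend on the actual tube voltage, as the author notes).

```python
import math

def required_lead_mm(workload_mA_min_per_wk, dose_limit_mSv_per_wk,
                     distance_m, use_factor, occupancy_factor,
                     output_mSv_per_mA_min_at_1m, hvl_mm_lead):
    """Estimate the lead thickness needed so the weekly dose behind a
    barrier stays below the design limit. Single-HVL broad-beam
    approximation; all parameter values here are illustrative."""
    # Unshielded weekly dose at the occupied position (inverse square law)
    unshielded = (workload_mA_min_per_wk * use_factor * occupancy_factor *
                  output_mSv_per_mA_min_at_1m / distance_m ** 2)
    if unshielded <= dose_limit_mSv_per_wk:
        return 0.0  # the bare barrier already meets the limit
    # Required transmission factor and equivalent number of half-value layers
    transmission = dose_limit_mSv_per_wk / unshielded
    n_hvl = math.log2(1.0 / transmission)
    return n_hvl * hvl_mm_lead

# Hypothetical room: 1000 mA-min/week workload, 0.02 mSv/week design limit,
# 2 m to the occupied area, full use and occupancy factors
thickness = required_lead_mm(1000, 0.02, 2.0, 1.0, 1.0, 0.01, 0.25)
```

    Lowering the assumed tube voltage reduces both the output per mA-min and the half-value layer, which is why the more realistic re-evaluation described above often finds existing shields adequate.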

  5. Evaluation of methods to estimate the essential amino acids requirements of fish from the muscle amino acid profile

    Directory of Open Access Journals (Sweden)

    Álvaro José de Almeida Bicudo

    2014-03-01

    Full Text Available Many methods to estimate amino acid requirements based on the amino acid profile of fish have been proposed. This study evaluates the methodologies proposed by Meyer & Fracalossi (2005) and by Tacon (1989) to estimate the amino acid requirements of fish, which do not require prior knowledge of the nutritional requirement of a reference amino acid. Data on the amino acid requirements of pacu, Piaractus mesopotamicus, were used to validate the accuracy of those methods. The Meyer & Fracalossi and Tacon methodologies estimated the lysine requirement of pacu at 13 and 23% above the requirement determined using the dose-response method, respectively. The values estimated by both methods lie within the range of requirements determined for other omnivorous fish species, with the Meyer & Fracalossi (2005) method showing better accuracy.
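    Profile-based estimation generally scales a single known requirement to the remaining essential amino acids in proportion to their levels in muscle tissue. A minimal sketch of that idea follows (not the specific regressions of either cited method; the muscle profile values and the 5.0% lysine requirement are hypothetical):

```python
def estimate_requirements(reference_aa, reference_requirement, muscle_profile):
    """Scale a known requirement for one reference amino acid to the other
    essential amino acids in proportion to their levels in muscle tissue
    (an A/E-ratio style estimate; profile in g/100 g of muscle protein)."""
    ref_level = muscle_profile[reference_aa]
    return {aa: reference_requirement * level / ref_level
            for aa, level in muscle_profile.items()}

# Hypothetical muscle profile and a lysine requirement of 5.0% of dietary protein
muscle = {"lysine": 8.6, "methionine": 2.9, "threonine": 4.4}
requirements = estimate_requirements("lysine", 5.0, muscle)
```

    The dose-response comparison in the abstract is essentially a check of how far such scaled estimates drift from empirically determined requirements.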

  6. Minimal mirror twin Higgs

    Energy Technology Data Exchange (ETDEWEB)

    Barbieri, Riccardo [Institute of Theoretical Studies, ETH Zurich,CH-8092 Zurich (Switzerland); Scuola Normale Superiore,Piazza dei Cavalieri 7, 56126 Pisa (Italy); Hall, Lawrence J.; Harigaya, Keisuke [Department of Physics, University of California,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States)

    2016-11-29

    In a Mirror Twin World with a maximally symmetric Higgs sector the little hierarchy of the Standard Model can be significantly mitigated, perhaps displacing the cutoff scale above the LHC reach. We show that consistency with observations requires that the Z{sub 2} parity exchanging the Standard Model with its mirror be broken in the Yukawa couplings. A minimal such effective field theory, with this sole Z{sub 2} breaking, can generate the Z{sub 2} breaking in the Higgs sector necessary for the Twin Higgs mechanism. The theory has constrained and correlated signals in Higgs decays, direct Dark Matter Detection and Dark Radiation, all within reach of foreseen experiments, over a region of parameter space where the fine-tuning for the electroweak scale is 10-50%. For dark matter, both mirror neutrons and a variety of self-interacting mirror atoms are considered. Neutrino mass signals and the effects of a possible additional Z{sub 2} breaking from the vacuum expectation values of B−L breaking fields are also discussed.

  7. Process qualification and control in electron beams--requirements, methods, new concepts and challenges

    International Nuclear Information System (INIS)

    Mittendorfer, J.; Gratzl, F.; Hanis, D.

    2004-01-01

    In this paper the status of process qualification and control in electron beam irradiation is analyzed in terms of requirements, concepts, methods and challenges for a state-of-the-art process control concept for medical device sterilization. Aspects from process qualification to routine process control are described together with the associated process variables. As a case study, the 10 MeV beams at Mediscan GmbH are considered. Process control concepts like statistical process control (SPC) and a new concept to determine process capability are briefly discussed

  8. Flood control design requirements and flood evaluation methods of inland nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Ailing; Wang Ping; Zhu Jingxing

    2011-01-01

    The effect of flooding is one of the key safety and environmental factors in inland nuclear power plant siting. To date, laws and standard systems have been established in China for the selection of nuclear power plant sites and for flood control requirements. In this paper, the flood control standards of China and other countries are introduced. Several inland nuclear power plants are taken as examples to discuss the related flood evaluation methods in detail. Suggestions are also put forward in the paper. (authors)

  9. Westinghouse Hanford Company waste minimization actions

    International Nuclear Information System (INIS)

    Greenhalgh, W.O.

    1988-09-01

    Companies that generate hazardous waste materials are now required by national regulations to establish a waste minimization program. Accordingly, in FY88 the Westinghouse Hanford Company formed a waste minimization team organization. The purpose of the team is to assist the company in its efforts to minimize the generation of waste, train personnel on waste minimization techniques, document successful waste minimization efforts, track the dollar savings realized, and publicize and administer an employee incentive program. A number of significant actions have been successful, resulting in savings of materials and dollars. The team itself has been successful in establishing some worthwhile minimization projects. This document briefly describes the waste minimization actions that have been successful to date. 2 refs., 26 figs., 3 tabs

  10. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail; Pottmann, Helmut; Grohs, Philipp

    2011-01-01

    A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K - 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces r(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed.

  11. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    Science.gov (United States)

    2012-10-01

    A handout with tables presenting the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base, with details on the aggregates and test methods employed, along with agency and co...

  12. Minimal quantization of two-dimensional models with chiral anomalies

    International Nuclear Information System (INIS)

    Ilieva, N.

    1987-01-01

    Two-dimensional gauge models with chiral anomalies - "left-handed" QED and the chiral Schwinger model - are quantized consistently within the framework of the minimal quantization method. The choice of the cone time as the physical time for the quantization is motivated. The well-known mass spectrum is found, but with a fixed value of the regularization parameter, a = 2. Such a unique solution is obtained due to the strong consistency requirement of minimal quantization, which is reflected in the physically motivated choice of the time axis

  13. Hazardous waste minimization tracking system

    International Nuclear Information System (INIS)

    Railan, R.

    1994-01-01

    Under RCRA sections 3002(b) and 3005(h), hazardous waste generators and owners/operators of treatment, storage, and disposal facilities (TSDFs) are required to certify that they have a program in place to reduce the volume or quantity and toxicity of hazardous waste to the degree determined to be economically practicable. In many cases there are environmental as well as economic benefits for agencies that pursue pollution prevention options. Several state governments have already enacted waste minimization legislation (e.g., the Massachusetts Toxic Use Reduction Act of 1989, and the Oregon Toxic Use Reduction Act and Hazardous Waste Reduction Act of July 2, 1989). About twenty-six other states have established legislation that will mandate some type of waste minimization program and/or facility planning. The need to address the HAZMIN (Hazardous Waste Minimization) Program at government agencies and private industries has prompted us to recognize the importance of managing the HAZMIN Program and of tracking various aspects of the program, as well as the progress made in this area. "WASTE" is a tracking system that can be used and modified to maintain information related to a Hazardous Waste Minimization Program in a manageable fashion. The program maintains, modifies, and retrieves information related to hazardous waste minimization and recycling, and provides automated report-generating capabilities. It has a built-in menu, which can be printed either in part or in full, and includes instructions on preparing the Annual Waste Report and the Annual Recycling Report. The program is very user friendly and is available on 3.5-inch or 5 1/4-inch floppy disks. A computer with 640K of memory is required

  14. Minimizing Exposure at Work

    Science.gov (United States)

    Pennsylvania State University Cooperative Extension guidance on minimizing pesticide exposure at work, covering safe use practices and personal protective equipment.

  15. Minimalism. Clip and Save.

    Science.gov (United States)

    Hubbard, Guy

    2002-01-01

    Provides background information on the art movement called "Minimalism," discussing why it started and its characteristics. Includes learning activities and information on the artist Donald Judd, along with a reproduction of one of his artworks and a discussion of its content. (CMK)

  16. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ that is an extremal of the functional ∫(H²/K - 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces r(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to the graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  17. A new spinning reserve requirement forecast method for deregulated electricity markets

    International Nuclear Information System (INIS)

    Amjady, Nima; Keynia, Farshid

    2010-01-01

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to the contingencies and disturbances that continuously occur in power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) conduct reliable and economic operation of the power system. However, the SR signal has complex, non-stationary, and volatile behavior over time and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real-Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data from the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)
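    The paper's exact LM/RCGA formulation is not reproduced in the abstract; the toy sketch below only illustrates the general two-stage idea of an evolutionary coarse search followed by gradient refinement, on a deliberately simplified linear reserve model with synthetic data (all numbers are hypothetical, and plain gradient descent stands in for Levenberg-Marquardt).

```python
import random
random.seed(0)

# Synthetic day-ahead data: reserve requirement roughly tracks system load
data = [(load, 0.07 * load + 25 + random.gauss(0, 2))
        for load in range(100, 200, 5)]

def mse(w, b):
    """Mean squared error of the linear reserve model r = w*load + b."""
    return sum((w * x + b - y) ** 2 for x, y in data) / len(data)

# Stage 1: evolutionary coarse search (a real-coded GA stand-in:
# truncation selection plus Gaussian mutation)
pop = [(random.uniform(-1, 1), random.uniform(0, 50)) for _ in range(30)]
for _ in range(40):
    pop.sort(key=lambda p: mse(*p))
    parents = pop[:10]
    pop = parents + [(w + random.gauss(0, 0.05), b + random.gauss(0, 1.0))
                     for w, b in parents for _ in range(2)]
w, b = min(pop, key=lambda p: mse(*p))

# Stage 2: local gradient refinement of the best individual
for _ in range(500):
    gw = sum(2 * (w * x + b - y) * x for x, y in data) / len(data)
    gb = sum(2 * (w * x + b - y) for x, y in data) / len(data)
    w -= 1e-5 * gw
    b -= 1e-2 * gb
```

    The division of labor mirrors the hybrid engine described above: the population search escapes poor local regions, while the gradient stage polishes the surviving solution.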

  18. A new spinning reserve requirement forecast method for deregulated electricity markets

    Energy Technology Data Exchange (ETDEWEB)

    Amjady, Nima; Keynia, Farshid [Department of Electrical Engineering, Semnan University, Semnan (Iran)

    2010-06-15

    Ancillary services are necessary for maintaining the security and reliability of power systems and constitute an important part of trade in competitive electricity markets. Spinning Reserve (SR) is one of the most important ancillary services for preserving power system stability and integrity in response to the contingencies and disturbances that continuously occur in power systems. Hence, an accurate day-ahead forecast of the SR requirement helps the Independent System Operator (ISO) conduct reliable and economic operation of the power system. However, the SR signal has complex, non-stationary, and volatile behavior over time and depends greatly on system load. In this paper, a new hybrid forecast engine is proposed for SR requirement prediction. The proposed forecast engine has an iterative training mechanism composed of the Levenberg-Marquardt (LM) learning algorithm and a Real-Coded Genetic Algorithm (RCGA), implemented on a Multi-Layer Perceptron (MLP) neural network. The proposed forecast methodology is examined by means of real data from the Pennsylvania-New Jersey-Maryland (PJM) electricity market and the California ISO (CAISO) controlled grid. The obtained forecast results are presented and compared with those of other SR forecast methods. (author)

  19. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-01-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985, and a series of informal revisions were made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a waste minimization program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines and will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs

  20. Minimal and careful processing

    OpenAIRE

    Nielsen, Thorkild

    2004-01-01

    In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

  1. Application of spatial methods to identify areas with lime requirement in eastern Croatia

    Science.gov (United States)

    Bogunović, Igor; Kisic, Ivica; Mesic, Milan; Zgorelec, Zeljka; Percin, Aleksandra; Pereira, Paulo

    2016-04-01

    With more than 50% of all agricultural land in Croatia on acid soils, soil acidity is recognized as a major problem. Low soil pH leads to a series of negative phenomena in plant production, and liming, recommended on the basis of soil analysis, is therefore a compulsory measure for the reclamation of acid soils. The need for liming is often erroneously determined only on the basis of soil pH, because determining cation exchange capacity, hydrolytic acidity, and base saturation is a major cost to producers. Therefore, in Croatia, as in some other countries, the amount of liming material needed to ameliorate acid soils is calculated from their hydrolytic acidity. The purpose of this study was to test several interpolation methods to identify the best spatial predictor of hydrolytic acidity, and to determine the possibility of using multivariate geostatistics to reduce the number of samples needed to determine hydrolytic acidity, all without significantly reducing the accuracy of the spatial distribution of the liming requirement. Soil pH (in KCl) and hydrolytic acidity (Y1) were determined in 1004 samples (0-30 cm) collected randomly from agricultural fields near Orahovica in eastern Croatia. The study tested 14 univariate interpolation models (part of the ArcGIS software package) to provide the most accurate spatial map of hydrolytic acidity on the basis of all samples (Y1 100%) and of datasets with 15% (Y1 85%), 30% (Y1 70%), and 50% (Y1 50%) fewer samples. In parallel with the univariate interpolation methods, the precision of the spatial distribution of Y1 was tested by co-kriging with exchangeable acidity (pH in KCl) as a covariate. The soils of the studied area had an average pH (KCl) of 4.81 and an average Y1 of 10.52 cmol+ kg-1. These data suggest that liming is necessary
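    A common way to rank interpolators by predictive accuracy, as this study does, is leave-one-out cross-validation. The sketch below applies that idea to inverse-distance weighting with several power parameters on synthetic data standing in for the hydrolytic-acidity samples (the field, noise level, and candidate powers are all hypothetical; the study's ArcGIS models and co-kriging are not reproduced here).

```python
import math
import random
random.seed(1)

# Synthetic smooth field standing in for hydrolytic acidity (cmol+ kg-1)
def true_field(x, y):
    return 10.5 + 2.0 * math.sin(x / 3.0) + 1.5 * math.cos(y / 4.0)

pts = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(120)]
samples = [(x, y, true_field(x, y) + random.gauss(0, 0.2)) for x, y in pts]

def idw(x, y, data, power):
    """Inverse-distance-weighted estimate at (x, y)."""
    num = den = 0.0
    for px, py, v in data:
        d = math.hypot(x - px, y - py)
        if d < 1e-9:
            return v  # exact hit on a sample point
        weight = 1.0 / d ** power
        num += weight * v
        den += weight
    return num / den

def loo_rmse(power):
    """Leave-one-out RMSE: predict each sample from the remaining ones."""
    sq_errs = [(idw(x, y, samples[:i] + samples[i + 1:], power) - v) ** 2
               for i, (x, y, v) in enumerate(samples)]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

best_power = min([1, 2, 3, 4], key=loo_rmse)
```

    The same cross-validation loop also quantifies how much accuracy is lost when the sample set is thinned, which is the trade-off the study evaluates with its reduced datasets.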

  2. Minimally invasive brow suspension for facial paralysis.

    Science.gov (United States)

    Costantino, Peter D; Hiltzik, David H; Moche, Jason; Preminger, Aviva

    2003-01-01

    To report a new technique for unilateral brow suspension for facial paralysis that is minimally invasive, limits supraciliary scar formation, does not require specialized endoscopic equipment or expertise, and has proved to be equal to direct brow suspension in durability and symmetry. Retrospective survey of a case series of 23 patients between January 1997 and December 2000. Metropolitan tertiary care center. Patients with head and neck tumors and brow ptosis caused by facial nerve paralysis. The results of the procedure were determined using the following 3-tier rating system: outstanding (excellent elevation and symmetry); acceptable (good elevation and fair symmetry); and unacceptable (loss of elevation). The results were considered outstanding in 12 patients, acceptable in 9 patients, and unacceptable in only 1 patient. One patient developed a hematoma, and 1 patient required a secondary adjustment. The technique has proved to be superior to standard brow suspension procedures with regard to scar formation and equal with respect to facial symmetry and suspension. These results have caused us to abandon direct brow suspension and to use this minimally invasive method in all cases of brow ptosis due to facial paralysis.

  3. A Minimally Invasive Method for Sampling Nest and Roost Cavities for Fungi: a Novel Approach to Identify the Fungi Associated with Cavity-Nesting Birds

    Science.gov (United States)

    Michelle A. Jusino; Daniel Lindner; John K. Cianchetti; Adam T. Grisé; Nicholas J. Brazee; Jeffrey R. Walters

    2014-01-01

    Relationships among cavity-nesting birds, trees, and wood decay fungi pose interesting management challenges and research questions in many systems. Ornithologists need to understand the relationships between cavity-nesting birds and fungi in order to understand the habitat requirements of these birds. Typically, researchers rely on fruiting body surveys to identify...

  4. Waste minimization assessment procedure

    International Nuclear Information System (INIS)

    Kellythorne, L.L.

    1993-01-01

    Perry Nuclear Power Plant began developing a waste minimization plan early in 1991. In March of 1991 the plan was documented, following a format similar to that described in the EPA Waste Minimization Opportunity Assessment Manual. Initial implementation involved obtaining management's commitment to support a waste minimization effort. The primary assessment goal was to identify all hazardous waste streams and to evaluate those streams for minimization opportunities. As implementation of the plan proceeded, non-hazardous waste streams routinely generated in large volumes were also evaluated for minimization opportunities. The next step was the collection of process and facility data that would help the facility accomplish its assessment goals. This paper describes the resources that were used, and were most valuable, in identifying both the hazardous and non-hazardous waste streams that existed on site. For each material identified as a waste stream, additional information regarding the material's use, manufacturer, EPA hazardous waste number, and DOT hazard class was also gathered. Waste streams were then evaluated for potential source reduction, recycling, reuse, resale, or burning for heat recovery, with disposal considered the last viable alternative

  5. Construction schedules slack time minimizing

    Science.gov (United States)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed using the uniform work method. The application of flow-shop models is possible and useful for the realization of large objects that can be divided into plots. The article also presents a condition describing which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the validity of the work on the newly developed models.
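    The abstract does not include the models themselves; the sketch below only illustrates the underlying flow-shop idea of choosing a plot order that minimizes total brigade idle time, with hypothetical plot names and durations and a brute-force search (the article's own models are not reproduced).

```python
from itertools import permutations

# Hypothetical durations (days): one row per plot, one column per brigade
durations = {
    "plot A": [4, 2, 3],
    "plot B": [2, 5, 1],
    "plot C": [3, 3, 4],
    "plot D": [5, 1, 2],
}

def brigade_idle(order, work):
    """Total idle (slack) time over all brigades for a given plot order.
    Every brigade visits the plots in the same sequence (flow shop);
    the wait before a brigade's first job is not counted as idle."""
    n_brigades = len(next(iter(work.values())))
    free_at = [0.0] * n_brigades   # when each brigade finishes its last job
    idle = 0.0
    for plot in order:
        released = 0.0             # when the plot leaves the previous brigade
        for c in range(n_brigades):
            start = max(free_at[c], released)
            if free_at[c] > 0:
                idle += start - free_at[c]
            released = start + work[plot][c]
            free_at[c] = released
    return idle

# Brute-force search over plot orders (fine for a handful of plots)
best_order = min(permutations(durations), key=lambda o: brigade_idle(o, durations))
```

    For realistic numbers of plots the exhaustive search would be replaced by a heuristic or an exact scheduling model, which is the kind of refinement the article's models address.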

  6. Method to Minimize the Low-Frequency Neutral-Point Voltage Oscillations With Time-Offset Injection for Neutral-Point-Clamped Inverters

    DEFF Research Database (Denmark)

    Choi, Ui-Min; Blaabjerg, Frede; Lee, Kyo-Beum

    2015-01-01

    This paper proposes a method to reduce the low-frequency neutral-point voltage oscillations. The neutral-point voltage oscillations are considerably reduced by adding a time offset to the three-phase turn-on times. The proper time offset is simply calculated considering the phase currents and the dwell time of small- and medium-voltage vectors. However, if the power factor is lower, there is a limitation to eliminating the neutral-point oscillations. In this case, the proposed method can be improved by changing the switching sequence properly. Additionally, a method for neutral-point voltage balancing...

  7. Minimal Composite Inflation

    DEFF Research Database (Denmark)

    Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco

    2011-01-01

    We investigate models in which the inflaton emerges as a composite field of a four-dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity, and that the u...

  8. Minimalism and Speakers’ Intuitions

    Directory of Open Access Journals (Sweden)

    Matías Gariazzo

    2011-08-01

    Full Text Available Minimalism proposes a semantics that does not account for speakers’ intuitions about the truth conditions of a range of sentences or utterances. Thus, a challenge for this view is to offer an explanation of how its assignment of semantic contents to these sentences is grounded in their use. Such an account was mainly offered by Soames, but also suggested by Cappelen and Lepore. The article criticizes this explanation by presenting four kinds of counterexamples to it, and arrives at the conclusion that minimalism has not successfully answered the above-mentioned challenge.

  9. Minimal open strings

    International Nuclear Information System (INIS)

    Hosomichi, Kazuo

    2008-01-01

    We study FZZT-branes and open string amplitudes in (p, q) minimal string theory. We focus on the simplest boundary changing operators in two-matrix models, and identify the corresponding operators in worldsheet theory through the comparison of amplitudes. Along the way, we find a novel linear relation among FZZT boundary states in minimal string theory. We also show that the boundary ground ring is realized on physical open string operators in a very simple manner, and discuss its use for perturbative computation of higher open string amplitudes.

  10. Safety control and minimization of radioactive wastes

    International Nuclear Information System (INIS)

    Wang Jinming; Rong Feng; Li Jinyan; Wang Xin

    2010-01-01

    Compared with the developed countries, the safety control and minimization of radwastes in China are under-developed. Research on measures for the safety control and minimization of radwastes is very important for the safe management of radwastes and for reducing treatment and disposal costs and environmental radiation hazards. This paper systematically discusses the safety control and minimization of the radwastes produced in the nuclear fuel cycle, nuclear technology applications, and the decommissioning of nuclear facilities, and provides some measures and methods for the safety control and minimization of radwastes. (authors)

  11. A novel analytical method for D-glucosamine quantification and its application in the analysis of chitosan degradation by a minimal enzyme cocktail

    DEFF Research Database (Denmark)

    Mekasha, Sophanit; Toupalová, Hana; Linggadjaja, Eka

    2016-01-01

    Enzymatic depolymerization of chitosan, a β-(1,4)-linked polycationic polysaccharide composed of D-glucosamine (GlcN) and N-acetyl-D-glucosamine (GlcNAc), provides a possible route to the exploitation of chitin-rich biomass. Complete conversion of chitosan to mono-sugars requires the synergistic action of endo- and exo-chitosanases. In the present study we have developed an efficient and cost-effective chitosan-degrading enzyme cocktail containing only two enzymes: an endo-attacking bacterial chitosanase, ScCsn46A, from Streptomyces coelicolor, and an exo-attacking glucosamine-specific β...

  12. Minimal model holography

    International Nuclear Information System (INIS)

    Gaberdiel, Matthias R; Gopakumar, Rajesh

    2013-01-01

    We review the duality relating 2D W_N minimal model conformal field theories, in a large-N 't Hooft-like limit, to higher-spin gravitational theories on AdS_3. This article is part of a special issue of Journal of Physics A: Mathematical and Theoretical devoted to ‘Higher spin theories and holography’. (review)

  13. Minimal constrained supergravity

    Energy Technology Data Exchange (ETDEWEB)

    Cribiori, N. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Dall' Agata, G., E-mail: dallagat@pd.infn.it [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Farakos, F. [Dipartimento di Fisica e Astronomia “Galileo Galilei”, Università di Padova, Via Marzolo 8, 35131 Padova (Italy); INFN, Sezione di Padova, Via Marzolo 8, 35131 Padova (Italy); Porrati, M. [Center for Cosmology and Particle Physics, Department of Physics, New York University, 4 Washington Place, New York, NY 10003 (United States)

    2017-01-10

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  14. Hazardous waste minimization

    International Nuclear Information System (INIS)

    Freeman, H.

    1990-01-01

    This book presents an overview of waste minimization. Covers applications of technology to waste reduction, techniques for implementing programs, incorporation of programs into R and D, strategies for private industry and the public sector, and case studies of programs already in effect

  15. Minimally invasive distal pancreatectomy

    NARCIS (Netherlands)

    Røsok, Bård I.; de Rooij, Thijs; van Hilst, Jony; Diener, Markus K.; Allen, Peter J.; Vollmer, Charles M.; Kooby, David A.; Shrikhande, Shailesh V.; Asbun, Horacio J.; Barkun, Jeffrey; Besselink, Marc G.; Boggi, Ugo; Conlon, Kevin; Han, Ho Seong; Hansen, Paul; Kendrick, Michael L.; Kooby, David; Montagnini, Andre L.; Palanivelu, Chinnasamy; Wakabayashi, Go; Zeh, Herbert J.

    2017-01-01

    The first International conference on Minimally Invasive Pancreas Resection was arranged in conjunction with the annual meeting of the International Hepato-Pancreato-Biliary Association (IHPBA), in Sao Paulo, Brazil on April 19th 2016. The presented evidence and outcomes resulting from the session

  16. Minimal constrained supergravity

    Directory of Open Access Journals (Sweden)

    N. Cribiori

    2017-01-01

    Full Text Available We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  17. Minimal constrained supergravity

    International Nuclear Information System (INIS)

    Cribiori, N.; Dall'Agata, G.; Farakos, F.; Porrati, M.

    2017-01-01

    We describe minimal supergravity models where supersymmetry is non-linearly realized via constrained superfields. We show that the resulting actions differ from the so called “de Sitter” supergravities because we consider constraints eliminating directly the auxiliary fields of the gravity multiplet.

  18. Method to minimize the low-frequency neutral-point voltage oscillations with time-offset injection for neutral-point-clamped inverters

    DEFF Research Database (Denmark)

    Choi, Uimin; Lee, Kyo-Beum; Blaabjerg, Frede

    2013-01-01

    This paper proposes a method to reduce the low-frequency neutral-point voltage oscillations. The neutral-point voltage oscillations are considerably reduced by adding a time-offset to the three phase turn-on times. The proper time-offset is simply calculated considering the phase currents and dwell...

  19. On eco-efficient technologies to minimize industrial water consumption

    Science.gov (United States)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will put further stress on available water systems and decrease the security of water in many areas. Therefore, innovative methods to minimize industrial water usage and waste production are of paramount importance in extending fresh water resources, which are the main life-support systems in many arid regions of the world. This paper demonstrates that there are good opportunities for many industries to save water and decrease waste water in the softening process by substituting traditional methods with eco-friendly ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of LSP based on chemical analysis has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that there is a good opportunity for many industries to save water and decrease waste water in the softening process by substituting the traditional method with the puffing method, a patented eco-efficient technology. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  20. [Minimally invasive coronary artery surgery].

    Science.gov (United States)

    Zalaquett, R; Howard, M; Irarrázaval, M J; Morán, S; Maturana, G; Becker, P; Medel, J; Sacco, C; Lema, G; Canessa, R; Cruz, F

    1999-01-01

    There is a growing interest in performing a left internal mammary artery (LIMA) graft to the left anterior descending coronary artery (LAD) on a beating heart through a minimally invasive access to the chest cavity. To report the experience with minimally invasive coronary artery surgery. Analysis of 11 patients aged 48 to 79 years old with single vessel disease that, between 1996 and 1997, had a LIMA graft to the LAD performed through a minimally invasive left anterior mediastinotomy, without cardiopulmonary bypass. A 6 to 10 cm left parasternal incision was done. The LIMA to the LAD anastomosis was done after pharmacological heart rate and blood pressure control and a period of ischemic preconditioning. Graft patency was confirmed intraoperatively by standard Doppler techniques. Patients were followed for a mean of 11.6 months (7-15 months). All patients were extubated in the operating room and transferred out of the intensive care unit on the next morning. Seven patients were discharged on the third postoperative day. Duplex scanning confirmed graft patency in all patients before discharge; in two patients, it was confirmed additionally by arteriography. There was no hospital mortality, no perioperative myocardial infarction and no bleeding problems. After follow-up, ten patients were free of angina, in functional class I and pleased with the surgical and cosmetic results. One patient developed atypical angina on the seventh postoperative month and a selective arteriography confirmed stenosis of the anastomosis. A successful angioplasty of the original LAD lesion was carried out. A minimally invasive left anterior mediastinotomy is a good surgical access to perform a successful LIMA to LAD graft without cardiopulmonary bypass, allowing a shorter hospital stay and earlier postoperative recovery. However, a larger experience and a longer follow-up are required to define its role in the treatment of coronary artery disease.

  1. Waste Minimization and Pollution Prevention Awareness Plan

    International Nuclear Information System (INIS)

    1994-04-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, to estimate budget, and to review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities

  2. Minimal Walking Technicolor

    DEFF Research Database (Denmark)

    Foadi, Roshan; Frandsen, Mads Toudal; A. Ryttov, T.

    2007-01-01

    Different theoretical and phenomenological aspects of the Minimal and Nonminimal Walking Technicolor theories have recently been studied. The goal here is to make the models ready for collider phenomenology. We do this by constructing the low energy effective theory containing scalars......, pseudoscalars, vector mesons and other fields predicted by the minimal walking theory. We construct their self-interactions and interactions with standard model fields. Using the Weinberg sum rules, suitably modified to take into account the walking behavior of the underlying gauge theory, we find...... interesting relations for the spin-one spectrum. We derive the electroweak parameters using the newly constructed effective theory and compare the results with the underlying gauge theory. Our analysis is sufficiently general such that the resulting model can be used to represent a generic walking technicolor...

  3. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in population, instead of one point as is usually performed in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
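
    The mutation operator summarized in this abstract (sample the cost function at three points along a randomly oriented line, fit a quadratic through them, and move to the quadratic's extremum) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation; the function names, step size, and degeneracy tolerance are assumptions.

```python
import math
import random

def line_quadratic_mutation(f, x, step=1.0, rng=None):
    """Mutate point x (a list of floats) along a randomly oriented line.

    Samples f at three points on the line, fits q(t) = a*t^2 + b*t + c
    through them, and moves to the extremum of q. Returns
    (new_point, degenerate_flag); when the quadratic is almost
    degenerate (a ~ 0), x is returned unchanged.
    """
    if rng is None:
        rng = random.Random()
    d = [rng.gauss(0.0, 1.0) for _ in x]
    norm = math.sqrt(sum(c * c for c in d))
    d = [c / norm for c in d]                          # random unit direction
    y_m = f([xi - step * di for xi, di in zip(x, d)])  # q(-step)
    y_0 = f(x)                                         # q(0)
    y_p = f([xi + step * di for xi, di in zip(x, d)])  # q(+step)
    # Recover the quadratic's coefficients from the three samples.
    a = (y_p - 2.0 * y_0 + y_m) / (2.0 * step * step)
    b = (y_p - y_m) / (2.0 * step)
    if abs(a) < 1e-12:                                 # almost degenerate quadratic
        return x, True
    t_star = -b / (2.0 * a)                            # extremum of q
    return [xi + t_star * di for xi, di in zip(x, d)], False

# Usage: repeatedly mutate a point on a convex quadratic bowl.
rng = random.Random(42)
f = lambda v: sum(c * c for c in v)
x = [3.0, -2.0]
for _ in range(20):
    x, degenerate = line_quadratic_mutation(f, x, rng=rng)
print(f(x))  # typically very small for this convex cost
```

    In the full algorithm a crossover step would then mix each mutated point with two population members, and the run would stop once enough fitted quadratics come out degenerate.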

  4. Minimal Dark Matter in the sky

    International Nuclear Information System (INIS)

    Panci, P.

    2016-01-01

    We discuss some theoretical and phenomenological aspects of the Minimal Dark Matter (MDM) model proposed in 2006, which is a theoretical framework highly appreciated for its minimality and yet its predictivity. We first critically review the theoretical requirements of MDM pointing out generalizations of this framework. Then we review the phenomenology of the originally proposed fermionic hyperchargeless electroweak quintuplet showing its main γ-ray tests.

  5. Appropriate statistical methods are required to assess diagnostic tests for replacement, add-on, and triage

    NARCIS (Netherlands)

    Hayen, Andrew; Macaskill, Petra; Irwig, Les; Bossuyt, Patrick

    2010-01-01

    To explain which measures of accuracy and which statistical methods should be used in studies to assess the value of a new binary test as a replacement test, an add-on test, or a triage test. Selection and explanation of statistical methods, illustrated with examples. Statistical methods for

  6. HIV in hiding: methods and data requirements for the estimation of the number of people living with undiagnosed HIV

    DEFF Research Database (Denmark)

    Lundgren, Jens

    2011-01-01

    Many people who are HIV positive are unaware of their infection status. Estimation of the number of people with undiagnosed HIV within a country or region is vital for understanding future need for treatment and for motivating testing programs. We review the available estimation approaches which...... are in current use. They can be broadly classified into those based on prevalence surveys and those based on reported HIV and AIDS cases. Estimation based on prevalence data requires data from regular prevalence surveys in different population groups together with estimates of the size of these groups....... The recommended minimal case reporting data needed to estimate the number of patients with undiagnosed HIV are HIV diagnoses, including CD4 count at diagnosis and whether there has been an AIDS diagnosis in the 3 months before or after HIV diagnosis, and data on deaths in people with HIV. We would encourage all...

  7. Minimization of radioactive solid wastes from uranium mining and metallurgy

    International Nuclear Information System (INIS)

    Zhang Xueli; Xu Lechang; Wei Guangzhi; Gao Jie; Wang Erqi

    2010-01-01

    The concept and contents of radioactive waste minimization are introduced. The principles of radioactive waste minimization, involving administration optimization, source reduction, recycling and reuse, as well as volume reduction, are discussed. The strategies and methods to minimize radioactive solid wastes from uranium mining and metallurgy are summarized. In addition, the benefit from the application of radioactive waste minimization is analyzed. Prospects for research on radioactive solid waste minimization are discussed at the end. (authors)

  8. Review of data requirements for groundwater flow and solute transport modelling and the ability of site investigation methods to meet these requirements

    International Nuclear Information System (INIS)

    McEwen, T.J.; Chapman, N.A.; Robinson, P.C.

    1990-08-01

    This report describes the data requirements for the codes that may be used in the modelling of groundwater flow and radionuclide transport during the assessment of a Nirex site for the deep disposal of low and intermediate level radioactive waste and also the site investigation methods that exist to supply the data for these codes. The data requirements for eight codes are reviewed, with most emphasis on three of the more significant codes, VANDAL, NAMMU and CHEMTARD. The largest part of the report describes and discusses the site investigation techniques and each technique is considered in terms of its ability to provide the data necessary to characterise the geological and hydrogeological environment around a potential repository. (author)

  9. Production of Superoxide in Bacteria Is Stress- and Cell State-Dependent: A Gating-Optimized Flow Cytometry Method that Minimizes ROS Measurement Artifacts with Fluorescent Dyes.

    Science.gov (United States)

    McBee, Megan E; Chionh, Yok H; Sharaf, Mariam L; Ho, Peiying; Cai, Maggie W L; Dedon, Peter C

    2017-01-01

    The role of reactive oxygen species (ROS) in microbial metabolism and stress response has emerged as a major theme in microbiology and infectious disease. Reactive fluorescent dyes have the potential to advance the study of ROS in the complex intracellular environment, especially for high-content and high-throughput analyses. However, current dye-based approaches to measuring intracellular ROS have the potential for significant artifacts. Here, we describe a robust platform for flow cytometric quantification of ROS in bacteria using fluorescent dyes, with ROS measurements in 10s-of-1000s of individual cells under a variety of conditions. False positives and variability among sample types (e.g., bacterial species, stress conditions) are reduced with a flexible four-step gating scheme that accounts for side- and forward-scattered light (morphological changes), background fluorescence, DNA content, and dye uptake to identify cells producing ROS. Using CellROX Green dye with Escherichia coli, Mycobacterium smegmatis, and Mycobacterium bovis BCG as diverse model bacteria, we show that (1) the generation of a quantifiable CellROX Green signal for superoxide, but not hydrogen peroxide-induced hydroxyl radicals, validates this dye as a superoxide detector; (2) the level of dye-detectable superoxide does not correlate with cytotoxicity or antibiotic sensitivity; (3) the non-replicating, antibiotic tolerant state of nutrient-deprived mycobacteria is associated with high levels of superoxide; and (4) antibiotic-induced production of superoxide is idiosyncratic with regard to both the species and the physiological state of the bacteria. We also show that the gating method is applicable to other fluorescent indicator dyes, such as the 5-carboxyfluorescein diacetate acetoxymethyl ester and 5-cyano-2,3-ditolyl tetrazolium chloride for cellular esterase and reductive respiratory activities, respectively. These results demonstrate that properly controlled flow cytometry coupled
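
    The four-step gating scheme described above amounts to applying sequential boolean filters to per-event channel data. The sketch below is purely illustrative: the channel names and every threshold are hypothetical placeholders, not gate definitions from the study.

```python
import random

def gate_events(events,
                scatter_box=((0.2, 0.9), (0.1, 0.8)),
                bg_max=0.15, dna_min=0.3, dye_min=0.25):
    """Apply four sequential gates to per-cell flow-cytometry events.

    Each event is a dict with hypothetical channels: 'fsc' (forward
    scatter), 'ssc' (side scatter), 'bg' (background fluorescence),
    'dna' (DNA stain signal), 'dye' (ROS dye signal). All thresholds
    here are illustrative placeholders.
    """
    (f_lo, f_hi), (s_lo, s_hi) = scatter_box
    gated = []
    for e in events:
        if not (f_lo < e["fsc"] < f_hi and s_lo < e["ssc"] < s_hi):
            continue  # gate 1: scatter gate (morphological outliers out)
        if e["bg"] >= bg_max:
            continue  # gate 2: background-fluorescence gate
        if e["dna"] <= dna_min:
            continue  # gate 3: DNA-content gate (intact cells only)
        if e["dye"] <= dye_min:
            continue  # gate 4: dye-uptake gate
        gated.append(e)
    return gated

# Usage on synthetic event data:
rng = random.Random(0)
events = [{"fsc": rng.random(), "ssc": rng.random(),
           "bg": rng.random() * 0.3, "dna": rng.random(),
           "dye": rng.random()} for _ in range(10000)]
passed = gate_events(events)
print(len(passed), "of", len(events), "events pass all four gates")
```

    Ordering the gates this way means later, dye-specific criteria are only evaluated on events that already look like intact single cells, which is what suppresses the false positives the abstract mentions.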

  10. A mixed methods approach to developing and evaluating oncology trainee education around minimization of adverse events and improved patient quality and safety.

    Science.gov (United States)

    Janssen, Anna; Shaw, Tim; Bradbury, Lauren; Moujaber, Tania; Nørrelykke, Anne Mette; Zerillo, Jessica A; LaCasce, Ann; Co, John Patrick T; Robinson, Tracy; Starr, Alison; Harnett, Paul

    2016-03-12

    Adverse events are a significant quality and safety issue in the hospital setting due to their direct impact on patients. Additionally, such events are often handled by junior doctors due to their direct involvement with patients. As such, it is important for health care organizations to prioritize education and training for junior doctors on identifying adverse events and handling them when they occur. The Cancer Cup Challenge is an educational program focusing on quality improvement and adverse event awareness, targeting junior oncology doctors across three international sites. A mixed methodology was used to develop and evaluate the program. The Qstream spaced learning platform was used to disseminate information to participants, as it has been demonstrated to impact on both knowledge and behavior. Eight short case-based scenarios with expert feedback were developed by a multidisciplinary advisory committee containing representatives from the international sites. At the conclusion of the course, the impact on participant knowledge was evaluated using analysis of the metrics collected by the Qstream platform. Additionally, an online survey and semi-structured interviews were used to evaluate engagement and perceived value by participants. A total of 35 junior doctors registered to undertake the Qstream program, with 31 (88.57%) successfully completing it. Analysis of the Qstream metrics revealed that 76.57% of cases were answered correctly on first attempt. The post-program survey received 17 responses, with 76.47% indicating the cases for the course were interesting and 82.35% feeling the cases were relevant. Finally, 14 participants consented to participate in semi-structured interviews about the program, with feedback towards the course being generally very positive. Our study demonstrates that an online game is well accepted by junior doctors as a method to increase their quality improvement awareness. Developing effective and sustainable training for doctors is

  11. Guidance and methods for satisfying low specific activity material and surface contaminated object regulatory requirements

    International Nuclear Information System (INIS)

    Pope, R.B.; Shappert, L.B.; Michelhaugh, R.D.; Boyle, R.W.; Easton, E.P.; Coodk, J.R.

    1998-01-01

    The U.S. Department of Transportation (DOT) and the U.S. Nuclear Regulatory Commission (NRC) have prepared a comprehensive set of draft guidance for shippers and inspectors to use when applying the newly imposed regulatory requirements for low specific activity (LSA) material and surface contaminated objects (SCOs). These requirements represent significant departures in some areas from the manner in which these materials and objects were regulated by the earlier versions of the regulations. The proper interpretation and application of the regulatory criteria can require that a fairly complex set of decisions be made. To assist those applying these regulatory requirements, a detailed set of logic-flow diagrams representing decisions related to multiple factors was prepared and included in the draft report for comment, Categorizing and Transporting Low Specific Activity Materials and Surface Contaminated Objects (DOT/NRC, 1997). These logic-flow diagrams, as developed, are specific to the U.S. regulations, but were readily adaptable to the IAEA regulations. The diagrams have been modified accordingly and tied directly to specific paragraphs in IAEA Safety Series No. 6. This paper provides the logic-flow diagrams adapted to the IAEA regulations, and demonstrates how these diagrams can be used to assist consignors and inspectors in assessing compliance of shipments with the LSA material and SCO regulatory requirements. (authors)

  12. The ZOOM minimization package

    International Nuclear Information System (INIS)

    Fischler, Mark S.; Sachs, D.

    2004-01-01

    A new object-oriented Minimization package is available for distribution in the same manner as CLHEP. This package, designed for use in HEP applications, has all the capabilities of Minuit, but is a re-write from scratch, adhering to modern C++ design principles. A primary goal of this package is extensibility in several directions, so that its capabilities can be kept fresh with as little maintenance effort as possible. This package is distinguished by the priority that was assigned to C++ design issues, and the focus on producing an extensible system that will resist becoming obsolete

  13. Minimizing the Pacman effect

    International Nuclear Information System (INIS)

    Ritson, D.; Chou, W.

    1997-10-01

    The Pacman bunches will experience two deleterious effects: tune shift and orbit displacement. It is known that the tune shift can be compensated by arranging crossing planes 90° relative to each other at successive interaction points (IPs). This paper gives an analytical estimate of the Pacman orbit displacement for a single crossing as well as for two crossings. For the latter, it can be minimized by using equal phase advances from one IP to another. In the LHC, this displacement is in any event small and can be neglected

  14. Understanding your users a practical guide to user requirements methods, tools, and techniques

    CERN Document Server

    Baxter, Kathy

    2005-01-01

    Today many companies are employing a user-centered design (UCD) process, but for most companies, usability begins and ends with the usability test. Although usability testing is a critical part of an effective user-centered life cycle, it is only one component of the UCD process. This book is focused on the requirements gathering stage, which often receives less attention than usability testing, but is equally as important. Understanding user requirements is critical to the development of a successful product. Understanding Your Users is an easy to read, easy to implement, how-to guide on

  15. Requirements for Participative Management as a Source of Sustainable Competitive Advantage and Typical Management Method

    Directory of Open Access Journals (Sweden)

    Muscalu Emanoil

    2015-12-01

    Full Text Available The economic context of recent years has undergone major changes, reflected in the modern methods and techniques used in management. The current competitive environment is characterized by permanent turbulence influencing, first of all, the managerial act itself. Of the many methods and techniques applied so far, some turn out to be less adaptable to the current economic and social context.

  16. THE PSTD ALGORITHM: A TIME-DOMAIN METHOD REQUIRING ONLY TWO CELLS PER WAVELENGTH. (R825225)

    Science.gov (United States)

    A pseudospectral time-domain (PSTD) method is developed for solutions of Maxwell's equations. It uses the fast Fourier transform (FFT), instead of finite differences on conventional finite-difference-time-domain (FDTD) methods, to represent spatial derivatives. Because the Fourie...

  17. Adding Timing Requirements to the CODARTS Real-Time Software Design Method

    DEFF Research Database (Denmark)

    Bach, K.R.

    The CODARTS software design method considers how concurrent, distributed and real-time applications can be designed. Although accounting for the important issues of task and communication, the method does not provide means for expressing the timeliness of the tasks and communication directly...

  18. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System that automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar differs only from the traditional method in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System that automates the incubation and reading of microbial counts on membranes on solid agar differs only from the traditional method in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  19. 7 CFR Appendix to Subpart C of... - Accounting Methods and Procedures Required of All Borrowers

    Science.gov (United States)

    2010-01-01

    ... Borrowers must present consolidated financial statements in accordance with the requirements of Statement No. 94, including a subsidiary's financial information even though this financial information is presented in the parent's consolidated statements. ...

  20. A comprehensive program to minimize platelet outdating.

    Science.gov (United States)

    Fuller, Alice K; Uglik, Kristin M; Braine, Hayden G; King, Karen E

    2011-07-01

    Platelet (PLT) transfusions are essential for patients who are bleeding or have an increased risk of bleeding due to a decreased number or abnormal function of circulating PLTs. A shelf life of 5 days for PLT products presents an inventory management challenge. In 2006, greater than 10% of apheresis PLTs made in the United States outdated. It is imperative to have a sufficient number of products for patients requiring transfusion, but outdating PLTs is a financial burden and a waste of a resource. We present the approach used in our institution to anticipate inventory needs based on current patient census and usage. Strategies to predict usage and to identify changes in anticipated usage are examined. Annual outdating is reviewed for a 10-year period from 2000 through 2009. From January 1, 2000, through December 2009, there were 128,207 PLT transfusions given to 15,265 patients. The methods used to anticipate usage and adjust inventory resulted in an annual outdate rate of approximately 1% for the 10-year period reviewed. In addition we have not faced situations where inventory was inadequate to meet the needs of the patients requiring transfusions. We have identified three elements of our transfusion service that can minimize outdating: a knowledgeable proactive staff dedicated to PLT management, a comprehensive computer-based transfusion history for each patient, and a strong two-way relationship with the primary product supplier. Through our comprehensive program, based on the principles of providing optimal patient care, we have minimized PLT outdating for more than 10 years. © 2011 American Association of Blood Banks.

  1. A method for determining the spent-fuel contribution to transport cask containment requirements

    International Nuclear Information System (INIS)

    Sanders, T.L.; Seager, K.D.; Rashid, Y.R.; Barrett, P.R.; Malinauskas, A.P.; Einziger, R.E.; Jordan, H.; Reardon, P.C.

    1992-11-01

    This report examines containment requirements for spent-fuel transport containers that are transported under normal and hypothetical accident conditions. A methodology is described that estimates the probability of rod failure and the quantity of radioactive material released from breached rods. This methodology characterizes the dynamic environment of the cask and its contents and deterministically models the peak stresses that are induced in spent-fuel cladding by the mechanical and thermal dynamic environments. The peak stresses are evaluated in relation to probabilistic failure criteria for generated or preexisting ductile tearing and material fractures at cracks partially through the wall in fuel rods. Activity concentrations in the cask cavity are predicted from estimates of the fraction of gases, volatiles, and fuel fines that are released when the rod cladding is breached. Containment requirements based on the source term are calculated in terms of maximum permissible volumetric leak rates from the cask. Calculations are included for representative cask designs

  2. A Study of Storage Ring Requirements for an Explosive Detection System Using NRA Method

    CERN Document Server

    Wang, Tai-Sen

    2005-01-01

    The technical feasibility of an explosives detection system based on the nuclear resonance absorption (NRA) of gamma rays in nitrogen-rich materials was demonstrated at Los Alamos National Laboratory (LANL) in 1993 by using an RFQ proton accelerator and a tomographic imaging prototype.* The study has recently been continued to examine deployment of such an active interrogation system in realistic scenarios. The approach is to use a cyclotron and electron-cooling-equipped storage ring(s) to provide the high-quality, high-current proton beam needed in a practical application. In this work, we investigate the storage ring requirements for a variant of the airport luggage inspection system considered in the earlier LANL experiments. Estimations are carried out based on the required inspection throughput, the gamma ray yield, the proton beam emittance growth due to scattering with the photon-production target, beam current limit in the storage ring, and the electron cooling rate. Studies using scaling and reas...

  3. Methods to Minimize Zero-Missing Phenomenon

    DEFF Research Database (Denmark)

    da Silva, Filipe Miguel Faria; Bak, Claus Leth; Gudmundsdottir, Unnur Stella

    2010-01-01

    With the increasing use of high-voltage AC cables at transmission levels, phenomena such as current zero-missing start to appear more often in transmission systems. Zero-missing phenomenon can occur when energizing cable lines with shunt reactors. This may considerably delay the opening of the ci...

  4. Action-minimizing methods in Hamiltonian dynamics

    CERN Document Server

    Sorrentino, Alfonso

    2015-01-01

    John Mather's seminal works in Hamiltonian dynamics represent some of the most important contributions to our understanding of the complex balance between stable and unstable motions in classical mechanics. His novel approach, known as Aubry-Mather theory, singles out the existence of special orbits and invariant measures of the system, which possess a very rich dynamical and geometric structure. In particular, the associated invariant sets play a leading role in determining the global dynamics of the system. This book provides a comprehensive introduction to Mather's theory, and can serve as a

  5. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-05-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). New legislation at the federal level is being introduced. Passage will result in new EPA regulations and also DOE orders. At the state level, the Hazardous Waste Reduction and Management Review Act of 1989 was signed by the Governor. DHS is currently promulgating regulations to implement the new law. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally-harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements

  6. Graphical approach for multiple values logic minimization

    Science.gov (United States)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  7. METHODS FOR DETERMINING AGITATOR MIXING REQUIREMENTS FOR A MIXING and SAMPLING FACILITY TO FEED WTP (WASTE TREATMENT PLANT)

    International Nuclear Information System (INIS)

    Griffin, P.W.

    2009-01-01

    The following report is a summary of work conducted to evaluate the ability of existing correlative techniques and alternative methods to accurately estimate impeller speed and power requirements for mechanical mixers proposed for use in a mixing and sampling facility (MSF). The proposed facility would accept high level waste sludges from Hanford double-shell tanks and feed uniformly mixed high level waste to the Waste Treatment Plant. Numerous methods are evaluated and discussed, and resulting recommendations provided.

  8. METHODS FOR DETERMINING AGITATOR MIXING REQUIREMENTS FOR A MIXING & SAMPLING FACILITY TO FEED WTP (WASTE TREATMENT PLANT)

    Energy Technology Data Exchange (ETDEWEB)

    GRIFFIN PW

    2009-08-27

    The following report is a summary of work conducted to evaluate the ability of existing correlative techniques and alternative methods to accurately estimate impeller speed and power requirements for mechanical mixers proposed for use in a mixing and sampling facility (MSF). The proposed facility would accept high level waste sludges from Hanford double-shell tanks and feed uniformly mixed high level waste to the Waste Treatment Plant. Numerous methods are evaluated and discussed, and resulting recommendations provided.

  9. A minimal architecture for joint action

    DEFF Research Database (Denmark)

    Vesper, Cordula; Butterfill, Stephen; Knoblich, Günther

    2010-01-01

    What kinds of processes and representations make joint action possible? In this paper we suggest a minimal architecture for joint action that focuses on representations, action monitoring and action prediction processes, as well as ways of simplifying coordination. The architecture spells out minimal requirements for an individual agent to engage in a joint action. We discuss existing evidence in support of the architecture as well as open questions that remain to be empirically addressed. In addition, we suggest possible interfaces between the minimal architecture and other approaches to joint action. The minimal architecture has implications for theorizing about the emergence of joint action, for human-machine interaction, and for understanding how coordination can be facilitated by exploiting relations between multiple agents' actions and between actions and the environment.

  10. Minimal conformal model

    Energy Technology Data Exchange (ETDEWEB)

    Helmboldt, Alexander; Humbert, Pascal; Lindner, Manfred; Smirnov, Juri [Max-Planck-Institut fuer Kernphysik, Heidelberg (Germany)

    2016-07-01

    The gauge hierarchy problem is one of the crucial drawbacks of the standard model of particle physics (SM) and thus has triggered model building over the last decades. Its most famous solution is the introduction of low-scale supersymmetry. However, without any significant signs of supersymmetric particles at the LHC to date, it makes sense to devise alternative mechanisms to remedy the hierarchy problem. One such mechanism is based on classically scale-invariant extensions of the SM, in which both the electroweak symmetry and the (anomalous) scale symmetry are broken radiatively via the Coleman-Weinberg mechanism. Apart from giving an introduction to classically scale-invariant models, the talk presents our results on obtaining a theoretically consistent minimal extension of the SM, which reproduces the correct low-scale phenomenology.

  11. Minimal Reducts with Grasp

    Directory of Open Access Journals (Sweden)

    Iris Iddaly Mendez Gurrola

    2011-03-01

    Full Text Available The proper detection of a patient's level of dementia is important in order to offer suitable treatment. The diagnosis is based on certain criteria reflected in clinical examinations, from which emerge the limitations and the degree each patient is in. In order to reduce the total number of limitations to be evaluated, we used rough set theory, which has been applied in areas of artificial intelligence such as decision analysis, expert systems, knowledge discovery, and classification with multiple attributes. In our case the theory is applied to find the minimal limitation set, or reduct, that generates the same classification as considering all the limitations; to fulfill this purpose we developed a GRASP (Greedy Randomized Adaptive Search Procedure) algorithm.
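
    The abstract names the ingredients (rough-set reducts, GRASP) without implementation detail. A toy sketch of the idea, not the authors' algorithm, might look like the following; the consistency test, the gain score counting discerned object pairs, and all parameters are our assumptions:

```python
import random

def consistent(rows, labels, attrs):
    """True if attrs classify as well as the full attribute set: no two
    rows with different labels agree on every attribute in attrs."""
    seen = {}
    for row, label in zip(rows, labels):
        key = tuple(row[a] for a in attrs)
        if seen.setdefault(key, label) != label:
            return False
    return True

def discerned_pairs(rows, labels, attrs):
    """Number of differently-labelled object pairs told apart by attrs."""
    n, count = len(rows), 0
    for i in range(n):
        for j in range(i + 1, n):
            if labels[i] != labels[j] and any(
                    rows[i][a] != rows[j][a] for a in attrs):
                count += 1
    return count

def grasp_reduct(rows, labels, iters=20, alpha=0.5, seed=0):
    rng = random.Random(seed)
    n_attr = len(rows[0])
    best = list(range(n_attr))
    for _ in range(iters):
        attrs = []
        # Greedy randomized construction: pick from a restricted candidate
        # list (RCL) of high-gain attributes until the subset is consistent.
        while not consistent(rows, labels, attrs):
            cands = [a for a in range(n_attr) if a not in attrs]
            if not cands:
                break
            gains = {a: discerned_pairs(rows, labels, attrs + [a]) for a in cands}
            g_max, g_min = max(gains.values()), min(gains.values())
            cutoff = g_min + alpha * (g_max - g_min)
            rcl = [a for a in cands if gains[a] >= cutoff]
            attrs.append(rng.choice(rcl))
        # Local search: drop attributes that are redundant for consistency.
        for a in list(attrs):
            if consistent(rows, labels, [x for x in attrs if x != a]):
                attrs.remove(a)
        if len(attrs) < len(best):
            best = attrs
    return sorted(best)
```

    On a toy table where one attribute alone separates the classes, the search returns a single-attribute reduct.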

  12. Minimally extended SILH

    International Nuclear Information System (INIS)

    Chala, Mikael; Grojean, Christophe; Humboldt-Univ. Berlin; Lima, Leonardo de; Univ. Estadual Paulista, Sao Paulo

    2017-03-01

    Higgs boson compositeness is a phenomenologically viable scenario addressing the hierarchy problem. In minimal models, the Higgs boson is the only degree of freedom of the strong sector below the strong interaction scale. We present here the simplest extension of such a framework with an additional composite spin-zero singlet. To this end, we adopt an effective field theory approach and develop a set of rules to estimate the size of the various operator coefficients, relating them to the parameters of the strong sector and its structural features. As a result, we obtain the patterns of new interactions affecting both the new singlet and the Higgs boson's physics. We identify the characteristics of the singlet field which cause its effects on Higgs physics to dominate over the ones inherited from the composite nature of the Higgs boson. Our effective field theory construction is supported by comparisons with explicit UV models.

  13. Minimally extended SILH

    Energy Technology Data Exchange (ETDEWEB)

    Chala, Mikael [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Valencia Univ. (Spain). Dept. de Fisica Teorica y IFIC; Durieux, Gauthier; Matsedonskyi, Oleksii [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Grojean, Christophe [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Humboldt-Univ. Berlin (Germany). Inst. fuer Physik; Lima, Leonardo de [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); Univ. Estadual Paulista, Sao Paulo (Brazil). Inst. de Fisica Teorica

    2017-03-15

    Higgs boson compositeness is a phenomenologically viable scenario addressing the hierarchy problem. In minimal models, the Higgs boson is the only degree of freedom of the strong sector below the strong interaction scale. We present here the simplest extension of such a framework with an additional composite spin-zero singlet. To this end, we adopt an effective field theory approach and develop a set of rules to estimate the size of the various operator coefficients, relating them to the parameters of the strong sector and its structural features. As a result, we obtain the patterns of new interactions affecting both the new singlet and the Higgs boson's physics. We identify the characteristics of the singlet field which cause its effects on Higgs physics to dominate over the ones inherited from the composite nature of the Higgs boson. Our effective field theory construction is supported by comparisons with explicit UV models.

  14. Sensor Selection method for IoT systems – focusing on embedded system requirements

    Directory of Open Access Journals (Sweden)

    Hirayama Masayuki

    2016-01-01

    Full Text Available Recently, various types of sensors have been developed. Using these sensors, IoT systems have become a hot topic in the embedded system domain. However, sensor selection for embedded systems has not been well discussed to date. This paper focuses on embedded systems' features and architecture, and proposes a sensor selection method composed of seven steps. In addition, we applied the proposed method to a simple example: sensor selection for a computer-scored answer sheet reader unit. From this case study, an idea for using FTA in sensor selection is also discussed.

  15. Methodical investigations on the determination of metabolic lysine requirements in broiler chickens. 1

    International Nuclear Information System (INIS)

    Bergner, H.; Nguyen Thi Nhan; Wilke, A.

    1987-01-01

    For the estimation of the lysine requirement, 128 male broiler chickens were used at an age of 7 to 21 days posthatching. They received a lysine-deficient diet composed of wheat and wheat gluten. To this basal diet, L-lysine-HCl was supplemented successively, resulting in 8 lysine levels ranging from 5.8 to 23.3 g lysine per kg dry matter (DM) (2.2 to 8.7 g lysine per 16 g N). At the end of the two-week feeding period on the experimental diets, 14C-lysine was injected intravenously 1.5 and 5.5 hours after feed withdrawal. During the following 4 hours the excretion of CO2 and 14CO2 was measured. The highest daily gain of 21.5 g was observed in animals fed 13.3 g lysine/kg DM. Lysine concentrations exceeding 18.3 g/kg DM depressed body weight gain. The CO2 excretion was not influenced by lysine intake. 14CO2 excretion was low with diets low in lysine content and increased 3 to 4 times with diets meeting the lysine requirement. Based on measurements 1.5 to 5.5 hours after feed withdrawal, the saturation value for lysine was reached at 13.3 g/kg DM. This value was lowered (10.8 g/kg DM), however, if the estimation was carried out 5.5 to 9.5 hours after feed withdrawal. These results suggest a higher metabolic lysine requirement during the earlier period after feed intake. Both reduced weight gain and nonlinearity in 14CO2 excretion with diets exceeding a lysine content of 18.3 g/kg DM indicate a limited capacity of the organism to degrade excessive lysine. According to the results, a lysine requirement between 10.8 and 13.3 g/kg DM (27% CP and 660 EFU_hen/kg DM) was estimated for broiler chickens 3 weeks posthatching. (author)

  16. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Yong Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yoo, Jun Soo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  17. 30 CFR 48.3 - Training plans; time of submission; where filed; information required; time for approval; method...

    Science.gov (United States)

    2010-07-01

    ... 30 Mineral Resources 1 2010-07-01 2010-07-01 false Training plans; time of submission; where filed....3 Training plans; time of submission; where filed; information required; time for approval; method... training plan shall be filed with the District Manager for the area in which the mine is located. (c) Each...

  18. Potential Use of Agile Methods in Selected DoD Acquisitions: Requirements Development and Management

    Science.gov (United States)

    2014-04-01

    guidelines. Kanban is a technique for managing workflow originating from the lean engineering methods pioneered by Toyota. [Reinertsen 2009] ... Cockburn, Alistair, & Pols, Andy. Patterns for Effective Use Cases. Addison-Wesley, 2002. Anderson, David. Kanban. Blue Hole Press, 2010. CMU/SEI-2013

  19. 40 CFR 53.3 - General requirements for an equivalent method determination.

    Science.gov (United States)

    2010-07-01

    ... part. (6) ISO 9001. All designated FEMs for PM2.5 or PM10−2.5 must be manufactured in an ISO 9001... candidate method. (4) All designated FEM for PM2.5 or PM10−2.5 must be manufactured in an ISO 9001...

  20. Methods for calculating energy and current requirements for industrial electron beam processing

    International Nuclear Information System (INIS)

    Cleland, M.R.; Farrell, J.P.

    1976-01-01

    The practical problems of determining electron beam parameters for industrial irradiation processes are discussed. To assist the radiation engineer in this task, the physical aspects of electron beam absorption are briefly described. Formulas are derived for calculating the surface dose in the treated material using the electron energy, beam current, and the area throughput rate of the conveyor. For thick absorbers, electron transport results are used to obtain the depth-dose distributions. From these, the average dose in the material, D-bar, and the beam power utilization efficiency, F_p, can be found by integration over the distributions. These concepts can be used to relate the electron beam power to the mass throughput rate. Qualitatively, the thickness of the material determines the beam energy; the area throughput rate and surface dose determine the beam current; and the mass throughput rate and average depth-dose determine the beam power requirements. Graphs are presented showing these relationships as a function of electron energy from 0.2 to 4.0 MeV for polystyrene. With this information, the determination of electron energy and current requirements is a relatively simple procedure.
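
    The qualitative relationships summarized above can be sketched numerically. The sketch below is our own simplification, not the paper's derived formulas: it estimates surface dose as mass stopping power times electron fluence, and beam power from average dose times mass throughput divided by the power utilization efficiency; the polystyrene stopping power is an approximate literature figure:

```python
E_CHARGE = 1.602e-19  # electron charge, C

def surface_dose_kGy(stopping_MeV_cm2_g, beam_mA, area_rate_m2_s):
    """Surface dose in kGy (= kJ/kg): mass stopping power x surface fluence."""
    s_si = stopping_MeV_cm2_g * 1.602e-13 * 0.1             # -> J*m^2/kg
    fluence = (beam_mA * 1e-3 / E_CHARGE) / area_rate_m2_s  # electrons/m^2
    return s_si * fluence / 1e3                             # Gy -> kGy

def beam_power_kW(avg_dose_kGy, mass_rate_kg_s, utilization):
    """Beam power needed to deliver avg_dose_kGy at a given mass throughput
    (1 kGy * 1 kg/s = 1 kW of absorbed power)."""
    return avg_dose_kGy * mass_rate_kg_s / utilization

# Illustrative numbers: ~1 MeV electrons in polystyrene have a mass collision
# stopping power of roughly 1.9 MeV*cm^2/g (approximate literature value).
dose = surface_dose_kGy(1.9, beam_mA=10.0, area_rate_m2_s=1.0)              # ~1.9 kGy
power = beam_power_kW(avg_dose_kGy=2.0, mass_rate_kg_s=1.0, utilization=0.5)  # 4 kW
```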

  1. Dietary energy requirements of young adult men, determined by using the doubly labeled water method

    International Nuclear Information System (INIS)

    Roberts, S.B.; Heyman, M.B.; Evans, W.J.; Fuss, P.; Tsay, R.; Young, V.R.

    1991-01-01

    The authors examined the hypothesis that current recommendations on dietary energy requirements may underestimate the total energy needs of young adult men, by measuring total energy expenditure (TEE) and resting energy expenditure (REE) in 14 weight-maintaining healthy subjects leading unrestricted lives. TEE and body composition were measured by using doubly labeled water (²H₂¹⁸O), and REE was measured by using indirect calorimetry. All subjects had sedentary full-time occupations and participated in strenuous leisure activities for 34 ± 6 (SE) min/d. TEE and REE were 14.61 ± 0.76 and 7.39 ± 0.26 MJ/d, respectively, and 202 ± 2 and 122 ± 2 kJ·kg⁻¹·d⁻¹. There were significant relationships between TEE and both body fat-free mass (r = 0.732, P < 0.005) and measured REE (r = 0.568, P < 0.05). Measured TEE:REE values were significantly higher than the recommended energy requirement (1.98 ± 0.09, compared with 1.55 or 1.67, P < 0.005). These results are consistent with the suggestion that the current recommended energy intake for young adult men may underestimate total energy needs.
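
    The reported TEE:REE ratio can be checked from the group means (note that the abstract's 1.98 ± 0.09 is the mean of individual ratios, which is close to, but not identical with, the ratio of group means computed here):

```python
tee_mj_d, ree_mj_d = 14.61, 7.39    # group means from the abstract, MJ/d
ratio_of_means = tee_mj_d / ree_mj_d
recommended = (1.55, 1.67)          # reference TEE:REE values cited in the abstract
```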

  2. A Study of Storage Ring Requirements for an Explosive Detection System Using NRA Method.

    Energy Technology Data Exchange (ETDEWEB)

    Wang, T. F. (Tai-Sen F.); Kwan, T. J. T. (Thomas J. T.)

    2005-01-01

    The technical feasibility of an explosives detection system based on the nuclear resonance absorption (NRA) of gamma rays in nitrogen-rich materials was demonstrated at Los Alamos National Laboratory (LANL) in 1993 by using an RFQ proton accelerator and a tomographic imaging prototype. The study has recently been continued to examine deployment of such an active interrogation system in realistic scenarios. The approach is to use an accelerator and electron-cooling-equipped storage ring(s) to provide the high-quality, high-current proton beam needed in a practical application. In this work, we investigate the requirements on the storage ring(s) with an external gamma-ray-production target for a variant of the airport luggage inspection system considered in the earlier LANL experiments. Estimates are carried out based on the required inspection throughput, the gamma-ray yield, the proton beam emittance growth due to scattering from the photon-production target, the beam-current limit in the storage ring, and the electron-cooling rate. Studies using scaling and reasonable parameter values indicate that it is possible to use no more than a few storage rings per inspection station in a practical NRA luggage inspection complex having more than ten inspection stations.

  3. Comparison of Land, Water, and Energy Requirements of Lettuce Grown Using Hydroponic vs. Conventional Agricultural Methods.

    Science.gov (United States)

    Barbosa, Guilherme Lages; Gadelha, Francisca Daiane Almeida; Kublik, Natalya; Proctor, Alan; Reichelm, Lucas; Weissinger, Emily; Wohlleb, Gregory M; Halden, Rolf U

    2015-06-16

    The land, water, and energy requirements of hydroponics were compared to those of conventional agriculture by example of lettuce production in Yuma, Arizona, USA. Data were obtained from crop budgets and governmental agricultural statistics, and contrasted with theoretical data for hydroponic lettuce production derived by using engineering equations populated with literature values. Yields of lettuce per greenhouse unit (815 m2) of 41 ± 6.1 kg/m2/y had water and energy demands of 20 ± 3.8 L/kg/y and 90,000 ± 11,000 kJ/kg/y (±standard deviation), respectively. In comparison, conventional production yielded 3.9 ± 0.21 kg/m2/y of produce, with water and energy demands of 250 ± 25 L/kg/y and 1100 ± 75 kJ/kg/y, respectively. Hydroponics offered 11 ± 1.7 times higher yields but required 82 ± 11 times more energy compared to conventionally produced lettuce. To the authors' knowledge, this is the first quantitative comparison of conventional and hydroponic produce production by example of lettuce grown in the southwestern United States. It identified energy availability as a major factor in assessing the sustainability of hydroponics, and it points to water-scarce settings offering an abundance of renewable energy (e.g., from solar, geothermal, or wind power) as particularly attractive regions for hydroponic agriculture.
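
    The headline ratios in the abstract follow directly from the central values reported (uncertainties are omitted in this quick check):

```python
# Central values from the abstract for lettuce production in Yuma, Arizona.
hydro = {"yield_kg_m2_y": 41.0, "water_L_kg_y": 20.0, "energy_kJ_kg_y": 90_000.0}
conv = {"yield_kg_m2_y": 3.9, "water_L_kg_y": 250.0, "energy_kJ_kg_y": 1_100.0}

yield_ratio = hydro["yield_kg_m2_y"] / conv["yield_kg_m2_y"]      # ~11x the yield
energy_ratio = hydro["energy_kJ_kg_y"] / conv["energy_kJ_kg_y"]   # ~82x the energy
water_ratio = conv["water_L_kg_y"] / hydro["water_L_kg_y"]        # 12.5x less water per kg
```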

  4. The Paradox of "Structured" Methods for Software Requirements Management: A Case Study of an e-Government Development Project

    Science.gov (United States)

    Conboy, Kieran; Lang, Michael

    This chapter outlines the alternative perspectives of "rationalism" and "improvisation" within information systems development and describes the major shortcomings of each. It then discusses how these shortcomings manifested themselves within an e-government case study where a "structured" requirements management method was employed. Although this method was very prescriptive and firmly rooted in the "rational" paradigm, it was observed that users often resorted to improvised behaviour, such as privately making decisions on how certain aspects of the method should or should not be implemented.

  5. A Study on a Control Method with a Ventilation Requirement of a VAV System in Multi-Zone

    Directory of Open Access Journals (Sweden)

    Hyo-Jun Kim

    2017-11-01

    Full Text Available The objective of this study was to propose a control method with a ventilation requirement for a variable air volume (VAV) system in a multi-zone building. In order to control the VAV system in multi-zone operation, it is essential to control the terminal unit installed in each zone. A VAV terminal unit with a conventional control method using a fixed minimum air flow can cause indoor air quality (IAQ) issues depending on the variation in the number of occupants. This research proposes a control method with a ventilation requirement for the VAV terminal unit and AHU in multi-zone operation. The integrated control method, with an air flow increase model in the VAV terminal unit and AHU and an outdoor air intake rate increase model in the AHU, was based on the indoor CO2 concentration. The conventional and proposed control algorithms were compared through a TRNSYS simulation program. The proposed VAV terminal unit control method satisfies all the conditions of indoor temperature, IAQ, and stratification. An energy comparison with the conventional control method showed that the proposed method not only satisfies indoor thermal comfort, IAQ, and stratification requirements, but also reduces energy consumption.
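
    The paper's airflow increase model is not reproduced in the abstract; the core idea of CO2-based demand-controlled ventilation at a terminal unit can be sketched as follows. The proportional logic, setpoints, and gains are hypothetical, not the authors' model:

```python
def terminal_airflow(co2_ppm, t_zone_c, t_set_c, v_min, v_max,
                     co2_set_ppm=1000.0, kp_t=0.15, kp_co2=0.002):
    """Supply-airflow fraction for one VAV terminal unit (hypothetical logic).

    Takes the larger of a proportional cooling demand and a proportional
    CO2 (IAQ) demand, so the unit rises above its minimum airflow when zone
    CO2 exceeds the setpoint instead of holding a fixed minimum.
    """
    cooling = min(max(kp_t * (t_zone_c - t_set_c), 0.0), 1.0)
    iaq = min(max(kp_co2 * (co2_ppm - co2_set_ppm), 0.0), 1.0)
    demand = max(cooling, iaq)
    return v_min + (v_max - v_min) * demand
```

    With v_min = 0.3 and v_max = 1.0, a fully occupied zone at setpoint temperature but 1400 ppm CO2 gets 0.86 of full airflow, rather than staying at the fixed minimum 0.3 as in the conventional scheme.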

  6. COMPARISON OF SPATIAL INTERPOLATION METHODS FOR WHEAT WATER REQUIREMENT AND ITS TEMPORAL DISTRIBUTION IN HAMEDAN PROVINCE (IRAN

    Directory of Open Access Journals (Sweden)

    M. H. Nazarifar

    2014-01-01

    Full Text Available Water is the main constraint for production of agricultural crops. The temporal and spatial variations in water requirement for agricultural products are limiting factors in the study of optimum use of water resources in regional planning and management. However, due to the unfavorable distribution and density of meteorological stations, it is not possible to monitor the regional variations precisely. Therefore, there is a need to estimate the evapotranspiration of crops at places where meteorological data are not available and then extend the findings from points of measurement to the regional scale. Geostatistical methods are among those that can be used for estimation of evapotranspiration at the regional scale. The present study investigates different geostatistical methods for temporal and spatial estimation of water requirements for the wheat crop in different periods. The study employs data provided by 16 synoptic and climatology meteorological stations in Hamedan province in Iran. Evapotranspiration for each month and for the growth period was determined using the Penman-Monteith and Thornthwaite methods for different water periods based on the Standardized Precipitation Index (SPI). Among the available geostatistical methods, three were selected and analyzed using GS+ software: kriging, cokriging, and inverse distance weighting. Analysis and selection of the suitable geostatistical method were performed based on two measures, namely Mean Absolute Error (MAE) and Mean Bias Error (MBE). The findings suggest that, in general, during the drought period, kriging is the proper method for estimating water requirements for the six months January, February, April, May, August, and December, whereas weighted moving average is a better estimation method for March, June, September, and October; in addition, kriging is the best method for July. In normal conditions, kriging is suitable for April, August, December
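
    The study's kriging computations are not reproduced in the abstract, but the simplest of the compared interpolators, inverse distance weighting, together with the two error measures and a leave-one-out cross-validation over the stations, can be sketched as follows (station coordinates and values below are illustrative, not the Hamedan data):

```python
def idw(known, x, y, power=2.0):
    """Inverse-distance-weighted estimate at (x, y).

    known: list of (xi, yi, value) observation points.
    """
    num = den = 0.0
    for xi, yi, v in known:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                      # exact hit on a station
        w = d2 ** (-power / 2)
        num += w * v
        den += w
    return num / den

def mae(obs, est):
    """Mean Absolute Error."""
    return sum(abs(o - e) for o, e in zip(obs, est)) / len(obs)

def mbe(obs, est):
    """Mean Bias Error (positive = overestimation)."""
    return sum(e - o for o, e in zip(obs, est)) / len(obs)

def loocv(known):
    """Leave-one-out cross-validation of IDW over the station set."""
    obs, est = [], []
    for i, (x, y, v) in enumerate(known):
        others = known[:i] + known[i + 1:]
        obs.append(v)
        est.append(idw(others, x, y))
    return mae(obs, est), mbe(obs, est)
```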

  7. Cardiovascular and Energy Requirements of Parents Watching Their Child Compete: A Pilot Mixed-Methods Investigation

    Directory of Open Access Journals (Sweden)

    Marc Lochbaum

    2017-11-01

    Full Text Available Purpose: Researchers have extensively documented the cardiovascular and metabolic demands of sports participation. To date, researchers have ignored the same requirements of competitors' parents. Hence, our purpose was to document parents' cardiovascular and metabolic responses to watching their child compete, while also paying particular attention to their thoughts before and after the competition. Achievement Goal Theory (AGT) drove interpretation of the parents' thoughts. Materials: Parents wore a device, made by Firstbeat Technologies, which continuously monitored heart rate. The parents wore the device the night before the competition, to become acclimated to the technology, and during the event until later in the day. Parents also completed two open-ended questions, one before the tournament and one after the contest. Results: Before the contest, the father expected his son to win the event (the Croatian National Championships for juniors). Conversely, the mother's expectations centered more on her son's enjoyment and competing to the best of his abilities. The parents had differing cardiovascular and energy-requirement responses to watching their son compete. In addition, post-competition reflections differed: the father expressed disappointment whereas the mother expressed sadness. Conclusions: The data presented are unique and a first in the sports literature. The parents varied in the intensity of their cardiovascular responses and calories burned while watching their son compete. The father's cardiovascular response over the course of watching was that of an aerobic workout. Whether this pattern is unique or universal is a critical research question. Last, AGT appears relevant when assessing parents' expectations.

  8. Image denoising by a direct variational minimization

    Directory of Open Access Journals (Sweden)

    Pilipović Stevan

    2011-01-01

    Full Text Available Abstract In this article we introduce a novel method for image de-noising which combines the mathematical well-posedness of variational modeling with the efficiency of a patch-based approach in the field of image processing. It is based on direct minimization of an energy functional containing a minimal surface regularizer that uses the fractional gradient. The minimization is performed on every predefined patch of the image, independently. By doing so, we avoid the use of an artificial time-dependent PDE model with its inherent problems of finding the optimal stopping time, as well as the optimal time step. Moreover, we control the level of image smoothing on each patch (and thus on the whole image) by adapting the Lagrange multiplier using information on the level of discontinuities on a particular patch, which we obtain by pre-processing. In order to reduce the average number of vectors in the approximation generator and still obtain minimal degradation, we combine a Ritz variational method for the actual minimization on a patch with a complementary fractional variational principle. Thus, the proposed method becomes computationally feasible and applicable for practical purposes. We confirm our claims with experimental results, comparing the proposed method with a couple of PDE-based methods, where we obtain significantly better denoising results, especially on oscillatory regions.
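
    A heavily simplified sketch of per-patch variational denoising follows. It uses an ordinary (not fractional) gradient, a fixed Lagrange multiplier per patch, and plain gradient descent instead of the paper's Ritz method; all of these are our substitutions, kept only to illustrate the idea of minimizing a regularizer-plus-fidelity energy independently on each patch:

```python
import numpy as np

def denoise_patch(f, lam=0.05, steps=300, tau=0.01, eps=1e-2):
    """Gradient descent on E(u) = sum sqrt(|grad u|^2 + eps) + lam * sum (u - f)^2."""
    u = f.astype(float)
    for _ in range(steps):
        ux = np.diff(u, axis=1, append=u[:, -1:])   # forward differences,
        uy = np.diff(u, axis=0, append=u[-1:, :])   # zero at the border
        mag = np.sqrt(ux ** 2 + uy ** 2 + eps)
        px, py = ux / mag, uy / mag
        # Backward-difference divergence of the normalized gradient field.
        div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
        u -= tau * (-div + 2.0 * lam * (u - f))     # descend the energy gradient
    return u
```

    In a full pipeline, each patch of the image would be passed through denoise_patch independently, with lam raised on patches flagged by pre-processing as containing discontinuities (here lam is simply fixed).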

  9. Minimal Marking: A Success Story

    Science.gov (United States)

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  10. Analysis and minimization of Torque Ripple for variable Flux reluctance machines

    NARCIS (Netherlands)

    Bao, J.; Gysen, B.L.J.; Boynov, K.; Paulides, J.J.H.; Lomonova, E.A.

    2017-01-01

    Variable flux reluctance machines (VFRMs) are permanent-magnet-free three-phase machines and are promising candidates for applications requiring low cost and robustness. This paper studies the torque ripple and its minimization methods for 12-stator VFRMs. Starting with the analysis of harmonics in the

  11. Swarm robotics and minimalism

    Science.gov (United States)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  12. Minimal dilaton model

    Directory of Open Access Journals (Sweden)

    Oda Kin-ya

    2013-05-01

    Full Text Available Both the ATLAS and CMS experiments at the LHC have reported the observation of a particle of mass around 125 GeV which is consistent with the Standard Model (SM) Higgs boson, but with an excess of events beyond the SM expectation in the diphoton decay channel at each of them. There still remains room for the logical possibility that we are not seeing the SM Higgs but something else. Here we introduce the minimal dilaton model, in which the LHC signals are explained by an extra singlet scalar of mass around 125 GeV that slightly mixes with an SM Higgs heavier than 600 GeV. When this scalar has a vacuum expectation value well beyond the electroweak scale, it can be identified as a linearly realized version of a dilaton field. Though the current experimental constraints from the Higgs search disfavor such a region, the singlet scalar model itself still provides a viable alternative to the SM Higgs in interpreting the search results.

  13. Towards the assembly of a minimal oscillator

    NARCIS (Netherlands)

    Nourian, Z.

    2015-01-01

    Life must have started with a lower degree of complexity and connectivity. This statement readily triggers the question: how simple is the simplest representation of life? In different words, and considering a constructive approach, what are the requirements for creating a minimal cell? This thesis sets

  14. Systems for tracking minimally invasive surgical instruments

    NARCIS (Netherlands)

    Chmarra, M. K.; Grimbergen, C. A.; Dankelman, J.

    2007-01-01

    Minimally invasive surgery (e.g. laparoscopy) requires special surgical skills, which should be objectively assessed. Several studies have shown that motion analysis is a valuable assessment tool of basic surgical skills in laparoscopy. However, to use motion analysis as the assessment tool, it is

  15. Waste minimization and pollution prevention awareness plan. Revision 1

    International Nuclear Information System (INIS)

    1994-07-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Waste Minimization and Pollution Prevention Awareness Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, estimate budgets, and review the plan. In addition, this plan records LLNL's goals for pollution prevention, the regulatory drivers for those activities, the assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities.

  16. Medicinal Chemistry Projects Requiring Imaginative Structure-Based Drug Design Methods.

    Science.gov (United States)

    Moitessier, Nicolas; Pottel, Joshua; Therrien, Eric; Englebienne, Pablo; Liu, Zhaomin; Tomberg, Anna; Corbeil, Christopher R

    2016-09-20

    Computational methods for docking small molecules to proteins are prominent in drug discovery. There are hundreds, if not thousands, of documented examples, and several pertinent cases within our research program. Fifteen years ago, our first docking-guided drug design project yielded nanomolar metalloproteinase inhibitors and illustrated the potential of structure-based drug design. Subsequent applications of docking programs to the design of integrin antagonists, BACE-1 inhibitors, and aminoglycosides binding to bacterial RNA demonstrated that available docking programs needed significant improvement. At that time, docking programs primarily considered flexible ligands and rigid proteins. We demonstrated that accounting for protein flexibility, employing displaceable water molecules, and using ligand-based pharmacophores improved the docking accuracy of existing methods, enabling the design of bioactive molecules. The success prompted the development of our own program, Fitted, implementing all of these aspects. The primary motivation has always been to respond to the needs of drug design studies; the majority of the concepts behind the evolution of Fitted are rooted in medicinal chemistry projects and collaborations. Several examples follow: (1) Searching for HDAC inhibitors led us to develop methods considering drug-zinc coordination and its effect on the pKa of surrounding residues. (2) Targeting covalent prolyl oligopeptidase (POP) inhibitors prompted an update to Fitted to identify reactive groups and form bonds with a given residue (e.g., a catalytic residue) when the geometry allows it. Fitted, the first fully automated covalent docking program, was successfully applied to the discovery of four new classes of covalent POP inhibitors. As a result, efficient stereoselective syntheses of a few screening hits were prioritized rather than synthesizing large chemical libraries, yielding nanomolar inhibitors. 
(3) In order to study the metabolism of POP inhibitors by

  17. Minimally Invasive Dentistry

    Science.gov (United States)

    ... all contributors to decay. Your dentist will then use strategies to prevent or reduce your risk for tooth decay. For instance, if ... require anesthesia. It resembles microscopic sand blasting and uses a stream of air combined with a super-fine ... Usually made of plastic resin, dental sealants protect teeth from bacteria that ...

  18. Waste minimization and pollution prevention awareness plan

    Energy Technology Data Exchange (ETDEWEB)

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with US Department of Energy (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  19. Waste minimization and pollution prevention awareness plan

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with US Department of Energy (DOE) policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs

  20. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
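
    The two sampling schemes being compared can be sketched in a few lines; this toy implementation (the function names and window convention are illustrative assumptions, not the authors' code) makes it concrete why, at the same k-mer size and window width, minimizer sampling keeps more positions than fixed sampling.

```python
def fixed_sample(seq, k, w):
    """Fixed sampling: keep the k-mer starting at every w-th position."""
    return {i: seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sample(seq, k, w):
    """Minimizer sampling: in every window of w consecutive k-mers,
    keep the lexicographically smallest one (ties go to the leftmost),
    so every length-(w + k - 1) substring has a sampled k-mer."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = {}
    for start in range(len(kmers) - w + 1):
        window = kmers[start:start + w]
        j = min(range(w), key=lambda t: window[t])  # leftmost minimum
        picked[start + j] = window[j]
    return picked

seq = "ACGTACGTGACCTGA"
print(sorted(fixed_sample(seq, 4, 3)))      # [0, 3, 6, 9]
print(sorted(minimizer_sample(seq, 4, 3)))  # [0, 1, 4, 5, 8, 9]
```

    On this short sequence the minimizer index retains six positions versus four for fixed sampling, consistent with the abstract's observation that fixed sampling produces the smaller index; the minimizer scheme's advantage is that query k-mers can be sampled by the same rule.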

  1. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Directory of Open Access Journals (Sweden)

    Meznah Almutairy

    Full Text Available Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.

  2. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    Science.gov (United States)

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  3. Waste minimization at Chalk River Laboratories

    International Nuclear Information System (INIS)

    Kranz, P.; Wong, P.C.F.

    2011-01-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. 
The overall refuse volume generated at

  4. Waste minimization at Chalk River Laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Kranz, P.; Wong, P.C.F. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2011-07-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. 
The overall refuse volume generated at

  5. Assessing thermochromatography as a separation method for nuclear forensics. Current capability vis-a-vis forensic requirements

    International Nuclear Information System (INIS)

    Hanson, D.E.; Garrison, J.R.; Hall, H.L.

    2011-01-01

    Nuclear forensic science has become increasingly important for global nuclear security. However, many current laboratory analysis techniques are based on methods developed without the imperative for timely analysis that underlies the post-detonation forensics mission requirements. Current analysis of actinides, fission products, and fuel-specific materials requires time-consuming chemical separation coupled with nuclear counting or mass spectrometry. High-temperature gas-phase separations have been used in the past for the rapid separation of newly created elements/isotopes and as a basis for chemical classification of those elements. We are assessing the utility of this method of rapid gas-phase separation to accelerate the separation of radioisotopes germane to post-detonation nuclear forensic investigations. The existing state of the art for thermochromatographic separations, and its applicability to nuclear forensics, will be reviewed. (author)

  6. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Derivations and Verification of Plans. Volume 1

    Science.gov (United States)

    Johnson, Kenneth L.; White, K, Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques. This recommended procedure would be used as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. This document contains the outcome of the assessment.

  7. Evolved Minimal Frustration in Multifunctional Biomolecules.

    Science.gov (United States)

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  8. Hadamard and minimal renormalizations

    International Nuclear Information System (INIS)

    Castagnino, M.A.; Gunzig, E.; Nardone, P.; Paz, J.P.

    1986-01-01

    A common language is introduced to study two, well-known, different methods for the renormalization of the energy-momentum tensor of a scalar neutral quantum field in curved space-time. Different features of the two renormalizations are established and compared

  9. Simulation of temporal and spatial distribution of required irrigation water by crop models and the pan evaporation coefficient method

    Science.gov (United States)

    Yang, Yan-min; Yang, Yonghui; Han, Shu-min; Hu, Yu-kun

    2009-07-01

    Hebei Plain is the most important agricultural belt in North China. Intensive irrigation and low, uneven precipitation have led to severe water shortage on the plain. This study is an attempt to resolve this crucial issue of water shortage for sustainable agricultural production and water resources management. The paper models distributed regional irrigation requirements for a range of cultivated crops on the plain. Classic crop models like DSSAT-wheat/maize and COTTON2K are used in combination with the pan-evaporation coefficient method to estimate water requirements for wheat, corn, cotton, fruit trees and vegetables. The approach is more accurate than the static approach adopted in previous studies, because the combined use of crop models and the pan-evaporation coefficient method dynamically accounts for irrigation requirements at different growth stages of crops, agronomic practices, and field and climatic conditions. The simulation results show increasing Required Irrigation Amount (RIA) with time. RIA ranges from 5.08×10⁹ m³ to 14.42×10⁹ m³ for the period 1986–2006, with an annual average of 10.6×10⁹ m³. Percent average water use by wheat, fruit trees, vegetables, corn and cotton is 41%, 12%, 12%, 11%, 7% and 17% respectively. RIA for April and May (the period with the highest irrigation water use) is 1.78×10⁹ m³ and 2.41×10⁹ m³ respectively. The counties in the piedmont regions of Mount Taihang have high RIA, while the central and eastern regions/counties have low irrigation requirements.
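
    The pan-evaporation coefficient step described above reduces to a short calculation; the sketch below uses the standard chain (pan evaporation → reference ET via a pan coefficient Kp → crop ET via a crop coefficient Kc → deficit after rainfall). The coefficient values and the function name are illustrative assumptions, not the paper's calibrated parameters.

```python
def irrigation_requirement_m3(epan_mm, precip_mm, kp=0.7, kc=1.0, area_m2=10_000):
    """Pan-evaporation coefficient method, simplified:
    ET0 = Kp * Epan (reference evapotranspiration, mm),
    ETc = Kc * ET0  (crop evapotranspiration, mm),
    requirement = max(0, ETc - precipitation), converted to m^3
    over the given area.  Depths are in mm for the same period."""
    et0 = kp * epan_mm
    etc = kc * et0
    deficit_mm = max(0.0, etc - precip_mm)
    return deficit_mm / 1000.0 * area_m2  # mm of depth over area -> m^3

# one hectare in a dry month (illustrative numbers, not Hebei data)
print(irrigation_requirement_m3(epan_mm=200, precip_mm=30, kp=0.7, kc=1.1))
```

    With these inputs, 200 mm of pan evaporation against 30 mm of rain over one hectare gives a requirement of 1240 m³; a wet month where rainfall exceeds ETc correctly returns zero.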

  10. Minimal entropy approximation for cellular automata

    International Nuclear Information System (INIS)

    Fukś, Henryk

    2014-01-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)
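
    The contrast between the maximum-entropy (Bayesian) extension and an entropy-minimizing one can be illustrated on the simplest possible case: extending a single-site density p to length-2 block probabilities with the same marginals. This is a toy illustration of the idea, not the paper's minimal entropy maps.

```python
from math import log2

def entropy(dist):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

def max_entropy_blocks(p):
    """Maximum-entropy extension of site density p to length-2 blocks:
    the product (independent) measure."""
    q = 1 - p
    return {"00": q * q, "01": q * p, "10": p * q, "11": p * p}

def min_entropy_blocks(p):
    """One entropy-minimizing extension with the same site marginals:
    concentrate all mass on the two constant blocks."""
    return {"00": 1 - p, "01": 0.0, "10": 0.0, "11": p}

p = 0.25
h_max = entropy(max_entropy_blocks(p))  # 2*H(p) ~ 1.62 bits
h_min = entropy(min_entropy_blocks(p))  # H(p)   ~ 0.81 bits
```

    Both extensions reproduce the density p at each site, but the minimizing one carries half the entropy; iterating such minimal-entropy choices is the spirit of the approximation described above.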

  11. Advanced pyrochemical technologies for minimizing nuclear waste

    International Nuclear Information System (INIS)

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-01-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts

  12. Achieving Accuracy Requirements for Forest Biomass Mapping: A Data Fusion Method for Estimating Forest Biomass and LiDAR Sampling Error with Spaceborne Data

    Science.gov (United States)

    Montesano, P. M.; Cook, B. D.; Sun, G.; Simard, M.; Zhang, Z.; Nelson, R. F.; Ranson, K. J.; Lutchke, S.; Blair, J. B.

    2012-01-01

    The synergistic use of active and passive remote sensing (i.e., data fusion) demonstrates the ability of spaceborne light detection and ranging (LiDAR), synthetic aperture radar (SAR) and multispectral imagery to achieve the accuracy requirements of a global forest biomass mapping mission. This data fusion approach also provides a means to extend 3D information from discrete spaceborne LiDAR measurements of forest structure across scales much larger than that of the LiDAR footprint. For estimating biomass, these measurements mix a number of errors, including those associated with LiDAR footprint sampling over regional to global extents. A general framework for mapping above-ground live forest biomass (AGB) with a data fusion approach is presented and verified using data from NASA field campaigns near Howland, ME, USA, to assess AGB and LiDAR sampling errors across a regionally representative landscape. We combined SAR and Landsat-derived optical (passive optical) image data to identify forest patches, and used image and simulated spaceborne LiDAR data to compute AGB and estimate LiDAR sampling error for forest patches and 100m, 250m, 500m, and 1km grid cells. Forest patches were delineated with Landsat-derived data and airborne SAR imagery, and simulated spaceborne LiDAR (SSL) data were derived from orbit and cloud cover simulations and airborne data from NASA's Laser Vegetation Imaging Sensor (LVIS). At both the patch and grid scales, we evaluated differences in AGB estimation and sampling error from the combined use of LiDAR with both SAR and passive optical and with either SAR or passive optical alone. This data fusion approach demonstrates that incorporating forest patches into the AGB mapping framework can provide sub-grid forest information for coarser grid-level AGB reporting, and that combining simulated spaceborne LiDAR with SAR and passive optical data are most useful for estimating AGB when measurements from LiDAR are limited because they minimized

  13. Improving allowed outage time and surveillance test interval requirements: a study of their interactions using probabilistic methods

    International Nuclear Information System (INIS)

    Martorell, S.A.; Serradell, V.G.; Samanta, P.K.

    1995-01-01

    Technical Specifications (TS) define the limits and conditions for operating nuclear plants safely. We selected the Limiting Conditions for Operations (LCO) and Surveillance Requirements (SR), both within TS, as the main items to be evaluated using probabilistic methods. In particular, we focused on the Allowed Outage Time (AOT) and Surveillance Test Interval (STI) requirements in LCO and SR, respectively. Already, significant operating and design experience has accumulated, revealing several problems that require modifications in some TS rules. Developments in Probabilistic Safety Assessment (PSA) allow the evaluation of effects due to such modifications in AOT and STI from a risk point of view. Thus, some changes have already been adopted in some plants. However, the combined effect of several changes in AOT and STI, i.e. through their interactions, is not addressed. This paper presents a methodology which encompasses, along with the definition of AOT and STI interactions, the quantification of interactions in terms of risk using PSA methods, an approach for evaluating simultaneous AOT and STI modifications, and an assessment of strategies for giving flexibility to plant operation through simultaneous changes in AOT and STI using trade-off-based risk criteria
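
    A first-order PSA approximation makes the AOT/STI trade-off concrete. The sketch below is the textbook average-unavailability model for a periodically tested standby component, not the specific risk model of this paper; the failure rate, interval, and outage numbers are purely illustrative.

```python
def average_unavailability(lam_per_h, sti_h, aot_h, outages_per_year=1.0):
    """First-order unavailability of a periodically tested standby
    component: undetected failures accumulate between tests
    (~ lam * STI / 2), and each maintenance outage of length up to
    the AOT contributes its fraction of the year as downtime."""
    hours_per_year = 8760.0
    test_term = lam_per_h * sti_h / 2.0
    outage_term = outages_per_year * aot_h / hours_per_year
    return test_term + outage_term

# shortening the test interval cuts the test term; with the same AOT
# the outage term is unchanged, so the two requirements interact
u_long = average_unavailability(1e-5, sti_h=2190.0, aot_h=72.0)
u_short = average_unavailability(1e-5, sti_h=720.0, aot_h=72.0)
```

    Even this crude model shows why AOT and STI must be evaluated together: once the test term is small, further STI reductions buy little, and the AOT-driven downtime dominates the risk contribution.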

  14. Global Analysis of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J

    2010-01-01

    Many properties of minimal surfaces are of a global nature, and this is already true for the results treated in the first two volumes of the treatise. Part I of the present book can be viewed as an extension of these results. For instance, the first two chapters deal with existence, regularity and uniqueness theorems for minimal surfaces with partially free boundaries. Here one of the main features is the possibility of 'edge-crawling' along free parts of the boundary. The third chapter deals with a priori estimates for minimal surfaces in higher dimensions and for minimizers of singular integ

  15. Minimal Surfaces for Hitchin Representations

    DEFF Research Database (Denmark)

    Li, Qiongling; Dai, Song

    2018-01-01

    In this paper, we investigate the properties of immersed minimal surfaces inside the symmetric space associated to a sublocus of the Hitchin component: the $q_n$ and $q_{n-1}$ case. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...

  16. Inelastic scattering with Chebyshev polynomials and preconditioned conjugate gradient minimization.

    Science.gov (United States)

    Temel, Burcin; Mills, Greg; Metiu, Horia

    2008-03-27

    We describe and test an implementation, using a basis set of Chebyshev polynomials, of a variational method for solving scattering problems in quantum mechanics. This minimum error method (MEM) determines the wave function Psi by minimizing the least-squares error in the function (H Psi - E Psi), where E is the desired scattering energy. We compare the MEM to an alternative, the Kohn variational principle (KVP), by solving the Secrest-Johnson model of two-dimensional inelastic scattering, which has been studied previously using the KVP and for which other numerical solutions are available. We use a conjugate gradient (CG) method to minimize the error, and by preconditioning the CG search, we are able to greatly reduce the number of iterations necessary; the method is thus faster and more stable than a matrix inversion, as is required in the KVP. Also, we avoid errors due to scattering off the boundaries, which present substantial problems for other methods, by matching the wave function in the interaction region to the correct asymptotic states at the specified energy; the use of Chebyshev polynomials allows this boundary condition to be implemented accurately. Chebyshev polynomials also allow a rapid and accurate evaluation of the kinetic energy. This basis set is as efficient as plane waves but does not impose an artificial periodicity on the system. There are problems in surface science and molecular electronics which cannot be solved if periodicity is imposed, and the Chebyshev basis set is a good alternative in such situations.
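
    The payoff of preconditioning the CG search can be demonstrated on a generic ill-conditioned symmetric positive-definite system. This is a plain Jacobi-preconditioned CG sketch on a synthetic matrix, not the authors' Chebyshev-basis scattering solver; the matrix construction and tolerances are illustrative assumptions.

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradient for a symmetric
    positive-definite system A x = b.  M_inv applies the inverse of
    the preconditioner to a vector (identity gives plain CG).
    Returns the solution and the iteration count."""
    x = np.zeros_like(b)
    r = b.copy()
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# synthetic SPD matrix with diagonal scales spread over 4 orders of
# magnitude, where a cheap Jacobi (diagonal) preconditioner pays off
rng = np.random.default_rng(0)
n = 200
d = np.logspace(0, 4, n)                  # condition number ~ 1e4
E = 0.001 * rng.standard_normal((n, n))
A = np.diag(d) + E + E.T                  # small symmetric perturbation
b = rng.standard_normal(n)

x_plain, it_plain = pcg(A, b, lambda r: r)              # no preconditioner
x_jacobi, it_jacobi = pcg(A, b, lambda r: r / np.diag(A))  # Jacobi
```

    The Jacobi-preconditioned run converges in a handful of iterations because the scaled system is close to the identity, while plain CG must work through the full spread of eigenvalues, mirroring the iteration savings the abstract reports.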

  17. Westinghouse Hanford Company waste minimization and pollution prevention awareness program plan

    International Nuclear Information System (INIS)

    Craig, P.A.; Nichols, D.H.; Lindsey, D.W.

    1991-08-01

    The purpose of this plan is to establish the Westinghouse Hanford Company's Waste Minimization Program. The plan specifies activities and methods that will be employed to reduce the quantity and toxicity of waste generated at Westinghouse Hanford Company (Westinghouse Hanford). It is designed to satisfy the US Department of Energy (DOE) and other legal requirements that are discussed in Subsection C of this section. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as permitted by DOE Order 5400.1 (DOE 1988a). This plan is based on the Hanford Site Waste Minimization and Pollution Prevention Awareness Program Plan, which directs DOE Field Office, Richland contractors to develop and maintain a waste minimization program. This waste minimization program is an organized, comprehensive, and continual effort to systematically reduce waste generation. The Westinghouse Hanford Waste Minimization Program is designed to prevent or minimize pollutant releases to all environmental media from all aspects of Westinghouse Hanford operations and offers increased protection of public health and the environment. 14 refs., 2 figs., 1 tab

  18. How to minimize wastes

    International Nuclear Information System (INIS)

    Ambolet, M.

    1988-10-01

    Actions undertaken by the CEA to decrease the stock of natural and depleted uranium are presented in this paper. Various wastes and residues are produced in uranium fabrication. While processing methods were found previously for some wastes and residues, for others storage was the rule. Facing growing problems of safety, bulk, and cost, new treatments now make it possible to reduce a large amount of waste. The uranium fabrication cycle, wastes, and residues are described. Processing of the different residues of operations and optimization of manufacture are indicated [fr

  19. Minimalism through intraoperative functional mapping.

    Science.gov (United States)

    Berger, M S

    1996-01-01

    Intraoperative stimulation mapping may be used to avoid unnecessary risk to functional regions subserving language and sensori-motor pathways. Based on the data presented here, language localization is variable across the population, with certainty existing only for the inferior frontal region responsible for motor speech. Anatomical landmarks such as the anterior temporal tip for temporal lobe language sites and the posterior aspect of the lateral sphenoid wing for the frontal lobe language zones are unreliable in avoiding postoperative aphasias. Thus, individual mapping to identify essential language sites has the greatest likelihood of avoiding permanent deficits in naming, reading, and motor speech. In a similar approach, motor and sensory pathways from the cortex and underlying white matter may be reliably stimulated and mapped in both awake and asleep patients. Although these techniques require additional operative time and nominally priced equipment, the result is often gratifying, as postoperative morbidity has been greatly reduced in the process of incorporating these surgical strategies. The patient's quality of life is improved in terms of seizure control, with or without antiepileptic drugs. This avoids having to perform a second costly operative procedure, which is routinely done when extraoperative stimulation and recording is done via subdural grids. In addition, an aggressive tumor resection at the initial operation lengthens the time to tumor recurrence and often obviates the need for a subsequent reoperation. Thus, intraoperative functional mapping may be best alluded to as a surgical technique that results in "minimalism in the long term".

  20. A Fluorine-18 Radiolabeling Method Enabled by Rhenium(I) Complexation Circumvents the Requirement of Anhydrous Conditions.

    Science.gov (United States)

    Klenner, Mitchell A; Pascali, Giancarlo; Zhang, Bo; Sia, Tiffany R; Spare, Lawson K; Krause-Heuer, Anwen M; Aldrich-Wright, Janice R; Greguric, Ivan; Guastella, Adam J; Massi, Massimiliano; Fraser, Benjamin H

    2017-05-11

    Azeotropic distillation is typically required to achieve fluorine-18 radiolabeling during the production of positron emission tomography (PET) imaging agents. However, this time-consuming process also limits fluorine-18 incorporation, due to radioactive decay of the isotope and its adsorption to the drying vessel. In addressing these limitations, the fluorine-18 radiolabeling of one model rhenium(I) complex is reported here, which is significantly improved under conditions that do not require azeotropic drying. This work could open a route towards the investigation of a simplified metal-mediated late-stage radiofluorination method, which would expand upon the accessibility of new PET and PET-optical probes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Guidelines for mixed waste minimization

    International Nuclear Information System (INIS)

    Owens, C.

    1992-02-01

    Currently, no commercial mixed waste disposal is available in the United States. Storage and treatment capacity for commercial mixed waste is limited. Host state and compact region officials are encouraging their mixed waste generators to minimize their mixed wastes because of these management limitations. This document provides a guide to mixed waste minimization.

  2. Some basic requirements for the application of electrokinetic methods for the reconstruction of masonry with rising humidity

    Energy Technology Data Exchange (ETDEWEB)

    Friese, P; Jacobasch, H J; Boerner, M

    1987-12-01

    Based on theoretical considerations concerning electro-osmosis, the most important requirements for the application of electrokinetic methods for drying masonry with rising humidity are described. Samples of brick masonry (brick and mortar) were examined by means of an electrokinetic measuring system (EKM) with different electrolytes (CaSO4 and KCl) at different concentrations. For all samples it was found that the zeta potential has a negative sign and that its absolute value approaches zero with increasing electrolyte concentration. Based on these measurements, an upper limit of 0.1 mol/liter on the electrolyte concentration is established for the application of electrokinetic methods for drying masonry.
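    As a rough illustration of why the measured drop in zeta potential matters, the Helmholtz-Smoluchowski relation links the electro-osmotic slip velocity to the zeta potential; in the sketch below all values (zeta potentials, field strength, water properties) are assumptions for illustration, not numbers from the paper:

    ```python
    # Helmholtz-Smoluchowski estimate of electro-osmotic velocity in a pore.
    # All numeric values are illustrative assumptions, not the paper's data.

    def electroosmotic_velocity(zeta_v, field_v_per_m,
                                permittivity=7.08e-10,  # F/m, water at ~25 C
                                viscosity=1.0e-3):      # Pa*s, water
        """Slip velocity u = -(epsilon * zeta / eta) * E, in m/s."""
        return -permittivity * zeta_v * field_v_per_m / viscosity

    # As electrolyte concentration rises, |zeta| drops toward zero and the
    # achievable electro-osmotic flow vanishes -- the rationale behind the
    # paper's 0.1 mol/liter upper limit.
    for zeta_mv in (-40.0, -20.0, -5.0, -0.5):
        u = electroosmotic_velocity(zeta_mv * 1e-3, 100.0)  # E = 100 V/m assumed
        print(f"zeta = {zeta_mv:6.1f} mV  ->  u = {u:.2e} m/s")
    ```

    With these assumed values a -20 mV zeta potential gives a slip velocity on the order of a micrometre per second, which shrinks proportionally as the zeta potential is screened away.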

  3. Minimal Webs in Riemannian Manifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen

    2008-01-01

    For a given combinatorial graph $G$ a {\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1-$dimensional manifold with an associated metric $g$. In this paper we are concerned with {\it minimal isometric immersions} of geometrized graphs $(G, g)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\em minimal webs}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...

  4. No actual measurement … was required: Maxwell and Cavendish's null method for the inverse square law of electrostatics.

    Science.gov (United States)

    Falconer, Isobel

    In 1877 James Clerk Maxwell and his student Donald MacAlister refined Henry Cavendish's 1773 null experiment demonstrating the absence of electricity inside a charged conductor. This null result was a mathematical prediction of the inverse square law of electrostatics, and both Cavendish and Maxwell took the experiment as verifying the law. However, Maxwell had already expressed absolute conviction in the law, based on results of Michael Faraday's. So, what was the value to him of repeating Cavendish's experiment? After assessing whether the law was as secure as he claimed, this paper explores its central importance to the electrical programme that Maxwell was pursuing. It traces the historical and conceptual re-orderings through which Maxwell established the law by constructing a tradition of null tests and asserting the superior accuracy of the method. Maxwell drew on his developing 'doctrine of method' to identify Cavendish's experiment as a member of a wider class of null methods. By doing so, he appealed to the null practices of telegraph engineers, diverted attention from the flawed logic of the method, and sought to localise issues around the mapping of numbers onto instrumental indications, on the grounds that 'no actual measurement … was required'. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Responsiveness and minimal clinically important change

    DEFF Research Database (Denmark)

    Christiansen, David Høyrup; Frost, Poul; Falla, Deborah

    2015-01-01

    Study Design A prospective cohort study nested in a randomized controlled trial. Objectives To determine and compare responsiveness and minimal clinically important change of the modified Constant score (CS) and the Oxford Shoulder Score (OSS). Background The OSS and the CS are commonly used to assess shoulder outcomes. However, few studies have evaluated the measurement properties of the OSS and CS in terms of responsiveness and minimal clinically important change. Methods The study included 126 patients who reported having difficulty returning to usual activities 8 to 12 weeks after ... were observed for the CS and the OSS. Minimal clinically important change ROC values were 6 points for the OSS and 11 points for the CS, with upper 95% cutoff limits of 12 and 22 points, respectively. Conclusion The CS and the OSS were both suitable for assessing improvement after decompression surgery.
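    The anchor-based ROC approach behind such minimal clinically important change estimates can be sketched as follows; the change scores and the Youden-index criterion are illustrative assumptions here, not the study's data or exact procedure:

    ```python
    import numpy as np

    def mcic_youden(change_improved, change_unchanged):
        """Anchor-based minimal clinically important change: the change-score
        cutoff maximizing sensitivity + specificity - 1 (Youden's J).
        Candidate cutoffs are midpoints between adjacent observed scores."""
        scores = np.unique(np.concatenate([change_improved, change_unchanged]))
        cutoffs = (scores[:-1] + scores[1:]) / 2.0
        best_c, best_j = None, -np.inf
        for c in cutoffs:
            sens = np.mean(np.asarray(change_improved) >= c)   # improved, flagged
            spec = np.mean(np.asarray(change_unchanged) < c)   # unchanged, not flagged
            j = sens + spec - 1.0
            if j > best_j:
                best_c, best_j = c, j
        return best_c, best_j

    # Hypothetical change scores (invented, not the study's data):
    improved = np.array([8.0, 10.0, 12.0, 14.0])
    unchanged = np.array([0.0, 2.0, 4.0, 6.0])
    cutoff, j = mcic_youden(improved, unchanged)
    print(cutoff, j)  # 7.0 separates these toy groups perfectly (J = 1.0)
    ```

    The published 6- and 11-point values would come from applying this kind of cutoff search to the real anchor question and score distributions.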

  6. Numerical methods

    CERN Document Server

    Dahlquist, Germund

    1974-01-01

    ""Substantial, detailed and rigorous . . . readers for whom the book is intended are admirably served."" - MathSciNet (Mathematical Reviews on the Web), American Mathematical Society.Practical text strikes fine balance between students' requirements for theoretical treatment and needs of practitioners, with best methods for large- and small-scale computing. Prerequisites are minimal (calculus, linear algebra, and preferably some acquaintance with computer programming). Text includes many worked examples, problems, and an extensive bibliography.

  7. Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I have seen a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  8. Retrograde Renal Cooling to Minimize Ischemia

    Directory of Open Access Journals (Sweden)

    Janet L. Colli

    2013-01-01

    Objective: During partial nephrectomy, renal hypothermia has been shown to decrease the ischemia-induced renal damage which occurs from renal hilar clamping. In this study we investigate the infusion rate required to safely cool the entire renal unit in a porcine model using retrograde irrigation of iced saline via a dual-lumen ureteral catheter. Materials and Methods: Renal cortical, renal medullary, bowel and rectal temperatures during retrograde cooling in a laparoscopic porcine model were monitored in six renal units. Iced normal saline was infused at 300 cc/hour, 600 cc/hour, 1000 cc/hour and gravity (800 cc/hour) for 600 seconds with and without hilar clamping. Results: Retrograde cooling with hilar clamping provided rapid medullary renal cooling and significant hypothermia of the medulla and cortex at infusion rates ≥ 600 cc/hour. With hilar clamping, cortical temperatures decreased at -0.9 °C/min, reaching a threshold temperature of 26.9 °C, and medullary temperatures decreased at -0.9 °C/min, reaching a temperature of 26.1 °C over 600 seconds on average for combined data at infusion rates ≥ 600 cc/hour. The lowest renal temperatures were achieved with gravity infusion. Without renal hilum clamping, retrograde cooling was minimal at all infusion rates. Conclusions: Significant renal cooling by gravity infusion of iced cold saline via a dual-lumen catheter with a clamped renal hilum was achieved in a porcine model. Continuous retrograde irrigation with iced saline via a two-way ureteral catheter may be an effective method to induce renal hypothermia in patients undergoing robotic-assisted and/or laparoscopic partial nephrectomy.

  9. Predicting blood transfusion in patients undergoing minimally invasive oesophagectomy.

    Science.gov (United States)

    Schneider, Crispin; Boddy, Alex P; Fukuta, Junaid; Groom, William D; Streets, Christopher G

    2014-12-01

    To evaluate predictors of allogenic blood transfusion requirements in patients undergoing minimally invasive oesophagectomy at a tertiary high-volume centre for oesophago-gastric surgery. Retrospective analysis of all patients undergoing minimal access oesophagectomy in our department between January 2010 and December 2011. Patients were divided into two groups depending on whether they required a blood transfusion at any time during their index admission. Factors that have been shown to influence perioperative blood transfusion requirements in major surgery were included in the analysis. Binary logistic regression analysis was performed to determine the impact of patient and perioperative characteristics on transfusion requirements during the index admission. A total of 80 patients underwent minimal access oesophagectomy, of which 61 patients had a laparoscopic-assisted oesophagectomy and 19 patients had a minimally invasive oesophagectomy. Perioperative blood transfusion was required in 28 patients at some time during hospital admission. On binary logistic regression analysis, a lower preoperative haemoglobin concentration predicted blood transfusion requirements. It has been reported that the requirement for blood transfusion can affect long-term outcomes in oesophageal cancer resection. Two factors which could be addressed preoperatively, haemoglobin concentration and type of oesophageal resection, may be valuable in predicting blood transfusions in patients undergoing minimally invasive oesophagectomy. Our analysis revealed that preoperative haemoglobin concentration, occurrence of significant complications and type of minimal access oesophagectomy predicted blood transfusion requirements in the patient population examined. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
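    A binary logistic regression of the kind used in this analysis can be sketched as below; the data, the true coefficients, and the plain gradient-descent fit are all invented for illustration (the study's actual covariates and estimates are not reproduced here):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical predictors: standardized preoperative haemoglobin and
    # resection type (0/1).  Effect sizes below are assumptions.
    n = 500
    hb = rng.normal(0.0, 1.0, n)
    res_type = rng.integers(0, 2, n).astype(float)
    logits = -1.5 * hb + 1.0 * res_type - 0.5          # assumed true model
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logits))).astype(float)

    # Fit binary logistic regression by gradient descent on the log-loss.
    X = np.column_stack([np.ones(n), hb, res_type])    # intercept, hb, type
    w = np.zeros(3)
    for _ in range(5000):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= 0.1 * (X.T @ (p - y)) / n                 # mean-gradient step

    print("coefficients (intercept, hb, type):", np.round(w, 2))
    # A negative haemoglobin coefficient means lower haemoglobin -> higher
    # predicted transfusion probability, mirroring the paper's finding.
    ```

    In practice one would use a standard statistics package rather than hand-rolled gradient descent, but the fitted signs are what carry the clinical interpretation.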

  10. Minimal Flavour Violation and Beyond

    CERN Document Server

    Isidori, Gino

    2012-01-01

    We review the formulation of the Minimal Flavour Violation (MFV) hypothesis in the quark sector, as well as some "variations on a theme" based on smaller flavour symmetry groups and/or less minimal breaking terms. We also review how these hypotheses can be tested in B decays and by means of other flavour-physics observables. The phenomenological consequences of MFV are discussed both in general terms, employing a general effective theory approach, and in the specific context of the Minimal Supersymmetric extension of the SM.

  11. Minimizing waste in environmental restoration

    International Nuclear Information System (INIS)

    Thuot, J.R.; Moos, L.

    1996-01-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized; however, there are significant areas where waste and cost can be reduced by careful planning and execution. Waste reduction can occur in three ways: beneficial reuse or recycling, segregation of waste types, and reducing generation of secondary waste

  12. Minimizing waste in environmental restoration

    International Nuclear Information System (INIS)

    Moos, L.; Thuot, J.R.

    1996-01-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized. In fact, however, there are three significant areas where waste and cost can be reduced. Waste reduction can occur in three ways: beneficial reuse or recycling; segregation of waste types; and reducing generation of secondary waste. This paper will discuss several examples of reuse, recycling, segregation, and secondary waste reduction at ANL restoration programs.

  13. Non-technical skills in minimally invasive surgery teams

    DEFF Research Database (Denmark)

    Gjeraa, Kirsten; Spanager, Lene; Konge, Lars

    2016-01-01

    BACKGROUND: Root cause analyses show that up to 70 % of adverse events are caused by human error. Strong non-technical skills (NTS) can prevent or reduce these errors, considerable numbers of which occur in the operating theatre. Minimally invasive surgery (MIS) requires manipulation of more complex equipment than open procedures, likely requiring a different set of NTS for each kind of team. The aims of this study were to identify the MIS teams' key NTS and investigate the effect of training and assessment of NTS on MIS teams. METHODS: The databases of PubMed, Cochrane Library, Embase, Psyc... ... were included. All were observational studies without blinding, and they differed in aims, types of evaluation, and outcomes. Only two studies evaluated patient outcomes other than operative time, and overall, the studies' quality of evidence was low. Different communication types were encountered...

  14. Energy-efficient ECG compression on wireless biosensors via minimal coherence sensing and weighted ℓ₁ minimization reconstruction.

    Science.gov (United States)

    Zhang, Jun; Gu, Zhenghui; Yu, Zhu Liang; Li, Yuanqing

    2015-03-01

    Low energy consumption is crucial for body area networks (BANs). In BAN-enabled ECG monitoring, continuous monitoring requires the sensor nodes to transmit a huge amount of data to the sink node, which leads to excessive energy consumption. To reduce airtime over energy-hungry wireless links, this paper presents an energy-efficient compressed sensing (CS)-based approach for on-node ECG compression. First, an algorithm called minimal mutual coherence pursuit is proposed to construct sparse binary measurement matrices, which can be used to encode the ECG signals with superior performance and extremely low complexity. Second, in order to minimize the data rate required for faithful reconstruction, a weighted ℓ1 minimization model is derived by exploiting multisource prior knowledge in the wavelet domain. Experimental results on the MIT-BIH arrhythmia database reveal that the proposed approach can obtain a higher compression ratio than state-of-the-art CS-based methods. Together with its low encoding complexity, our approach can achieve significant energy savings in both the encoding process and wireless transmission.
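    The encode/decode idea can be illustrated with a toy sketch: a sparse binary sensing matrix gives cheap on-node encoding, and a greedy solver (orthogonal matching pursuit, standing in for the paper's weighted ℓ1 model) recovers the sparse signal at the sink. The sizes, the random matrix construction, and the decoder are assumptions for illustration, not the paper's algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    n, m, k = 128, 64, 5      # signal length, measurements, sparsity (toy sizes)

    # Sparse binary sensing matrix: d ones per column.  Binary entries make
    # on-node encoding additions-only (in the spirit of, but not identical
    # to, the paper's minimal-coherence construction).
    d = 8
    Phi = np.zeros((m, n))
    for j in range(n):
        Phi[rng.choice(m, size=d, replace=False), j] = 1.0
    Phi /= np.sqrt(d)         # column-normalize

    # k-sparse test vector (stand-in for a sparse wavelet representation).
    x = np.zeros(n)
    x[rng.choice(n, size=k, replace=False)] = rng.normal(0.0, 1.0, k)
    y = Phi @ x               # compressed measurements sent over the link

    # Decode with orthogonal matching pursuit: greedily pick the column most
    # correlated with the residual, then least-squares refit on the support.
    support, r = [], y.copy()
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.T @ r))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        r = y - Phi[:, support] @ coef
    x_hat = np.zeros(n)
    x_hat[support] = coef
    print("final residual norm:", np.linalg.norm(r))
    ```

    The energy argument in the paper rests on the left half of this picture: the node only computes `Phi @ x` with a sparse binary matrix, while the expensive reconstruction runs at the sink.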

  15. Minimally Invasive Spine Surgery in Small Animals.

    Science.gov (United States)

    Hettlich, Bianca F

    2018-01-01

    Minimally invasive spine surgery (MISS) seems to have many benefits for human patients and is currently used for various minor and major spine procedures. For MISS, a change in access strategy to the target location is necessary and it requires intraoperative imaging, special instrumentation, and magnification. Few veterinary studies have evaluated MISS for canine patients for spinal decompression procedures. This article discusses the general requirements for MISS and how these can be applied to veterinary spinal surgery. The current veterinary MISS literature is reviewed and suggestions are made on how to apply MISS to different spinal locations. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. Wilson loops in minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS_5 × S^5. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. They show how this is realized for the minimal surface

  17. Classical strings and minimal surfaces

    International Nuclear Information System (INIS)

    Urbantke, H.

    1986-01-01

    Real Lorentzian forms of some complex or complexified Euclidean minimal surfaces are obtained as an application of H.A. Schwarz' solution to the initial value problem or a search for surfaces admitting a group of Poincare transformations. (Author)

  18. Minimal Gromov-Witten rings

    International Nuclear Information System (INIS)

    Przyjalkowski, V V

    2008-01-01

    We construct an abstract theory of Gromov-Witten invariants of genus 0 for quantum minimal Fano varieties (a minimal class of varieties which is natural from the quantum cohomological viewpoint). Namely, we consider the minimal Gromov-Witten ring: a commutative algebra whose generators and relations are of the form used in the Gromov-Witten theory of Fano varieties (of unspecified dimension). The Gromov-Witten theory of any quantum minimal variety is a homomorphism from this ring to C. We prove an abstract reconstruction theorem which says that this ring is isomorphic to the free commutative ring generated by 'prime two-pointed invariants'. We also find solutions of the differential equation of type DN for a Fano variety of dimension N in terms of the generating series of one-pointed Gromov-Witten invariants

  19. Wilson loops and minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS-CFT correspondence suggests that the Wilson loop of the large N gauge theory with N=4 supersymmetry in four dimensions is described by a minimal surface in AdS_5 × S^5. We examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which we call BPS loops, whose expectation values are free from ultraviolet divergence. We formulate the loop equation for such loops. To the extent that we have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. We also discuss the zigzag symmetry of the loop operator. In the N=4 gauge theory, we expect the zigzag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. We will show how this is realized for the minimal surface. (c) 1999 The American Physical Society

  20. Rigid Body Energy Minimization on Manifolds for Molecular Docking.

    Science.gov (United States)

    Mirzaei, Hanieh; Beglov, Dmitri; Paschalidis, Ioannis Ch; Vajda, Sandor; Vakili, Pirooz; Kozakov, Dima

    2012-11-13

    Virtually all docking methods include some local continuous minimization of an energy/scoring function in order to remove steric clashes and obtain more reliable energy values. In this paper, we describe an efficient rigid-body optimization algorithm that, compared to the most widely used algorithms, converges approximately an order of magnitude faster to conformations with equal or slightly lower energy. The space of rigid body transformations is a nonlinear manifold, namely, a space which locally resembles a Euclidean space. We use a canonical parametrization of the manifold, called the exponential parametrization, to map the Euclidean tangent space of the manifold onto the manifold itself. Thus, we locally transform the rigid body optimization to an optimization over a Euclidean space where basic optimization algorithms are applicable. Compared to commonly used methods, this formulation substantially reduces the dimension of the search space. As a result, it requires far fewer costly function and gradient evaluations and leads to a more efficient algorithm. We have selected the LBFGS quasi-Newton method for local optimization since it uses only gradient information to obtain second order information about the energy function and avoids the far more costly direct Hessian evaluations. Two applications, one in protein-protein docking, and the other in protein-small molecular interactions, as part of macromolecular docking protocols are presented. The code is available to the community under open source license, and with minimal effort can be incorporated into any molecular modeling package.
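    A minimal sketch of optimizing over rigid-body transformations via the exponential parametrization is given below. The energy function, the point cloud, and the plain gradient descent with backtracking (used here in place of the paper's L-BFGS) are illustrative assumptions; only the exp-map idea itself comes from the abstract:

    ```python
    import numpy as np

    def hat(w):
        """Skew-symmetric matrix of w in so(3)."""
        return np.array([[0.0, -w[2], w[1]],
                         [w[2], 0.0, -w[0]],
                         [-w[1], w[0], 0.0]])

    def exp_so3(w):
        """Exponential map so(3) -> SO(3) via Rodrigues' formula."""
        t = np.linalg.norm(w)
        if t < 1e-12:
            return np.eye(3)
        K = hat(w / t)
        return np.eye(3) + np.sin(t) * K + (1.0 - np.cos(t)) * (K @ K)

    # Toy "energy": squared misfit between a transformed point cloud and a
    # target pose (a stand-in for a docking energy, not the paper's function).
    P = np.array([[1., 0., 0.], [0., 1., 0.], [0., 0., 1.], [1., 1., 1.]])
    R_true = exp_so3(np.array([0.2, -0.1, 0.3]))
    t_true = np.array([0.5, -0.2, 0.1])
    Q = P @ R_true.T + t_true

    def energy(R, t):
        return np.sum((P @ R.T + t - Q) ** 2)

    # Descend in the 6-D tangent space: each step perturbs the pose as
    # R <- exp(dw) R, t <- t + dt, with a numerical gradient + backtracking.
    R, t = np.eye(3), np.zeros(3)
    e0 = energy(R, t)
    h = 1e-6
    for _ in range(200):
        base = energy(R, t)
        g = np.zeros(6)
        for i in range(6):
            xi = np.zeros(6)
            xi[i] = h
            g[i] = (energy(exp_so3(xi[:3]) @ R, t + xi[3:]) - base) / h
        step = 0.1
        while step > 1e-12 and energy(exp_so3(-step * g[:3]) @ R, t - step * g[3:]) >= base:
            step *= 0.5
        R, t = exp_so3(-step * g[:3]) @ R, t - step * g[3:]
    print("final energy:", energy(R, t))
    ```

    Because every update passes through the exponential map, the iterate stays exactly on SO(3): there is no need to re-orthogonalize or to optimize over a redundant 9-parameter rotation matrix, which is the dimension-reduction point the paper makes.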

  1. Minimal string theory is logarithmic

    International Nuclear Information System (INIS)

    Ishimoto, Yukitaka; Yamaguchi, Shun-ichi

    2005-01-01

    We study the simplest examples of minimal string theory whose worldsheet description is the unitary (p,q) minimal model coupled to two-dimensional gravity (Liouville field theory). In the Liouville sector, we show that four-point correlation functions of 'tachyons' exhibit logarithmic singularities, and that the theory turns out to be logarithmic. The relation with Zamolodchikov's logarithmic degenerate fields is also discussed. Our result holds for generic values of (p,q)

  2. 13 CFR 115.17 - Minimization of Surety's Loss.

    Science.gov (United States)

    2010-01-01

    ... and collateral—(1) Requirements. The Surety must take all reasonable action to minimize risk of Loss... indemnity agreement must be secured by such collateral as the Surety or SBA finds appropriate. Indemnity...

  3. Requirements and testing methods for surfaces of metallic bipolar plates for low-temperature PEM fuel cells

    Science.gov (United States)

    Jendras, P.; Lötsch, K.; von Unwerth, T.

    2017-03-01

    To reduce emissions and substitute combustion engines, automotive manufacturers, legislators and early adopters are pushing for hydrogen fuel cell vehicles. Up to now the focus of research has been on ensuring the functionality and increasing the durability of fuel cell components, and expensive materials were therefore used. Contemporary research and development tries to substitute these substances with more cost-effective material combinations. The bipolar plate is a key component with the greatest influence on the volume and mass of a fuel cell stack, and it has to meet complex requirements. Bipolar plates support the bending-sensitive components of the stack, spread reactants over the active cell area and form the electrical contact to the next cell. Furthermore, bipolar plates dissipate the heat of reaction and separate one cell gas-tight from the other. Consequently, they need a low interfacial contact resistance (ICR) to the gas diffusion layer, high flexural strength, good thermal conductivity and high durability. To reduce costs, stainless steel is a favoured material for bipolar plates in automotive applications. Steel is characterized by good electrical and thermal conductivity, but the acidic environment requires high chemical durability against corrosion as well. On the one hand, the formation of a passivating oxide layer that increases ICR should be inhibited. On the other hand, pitting corrosion leading to an increased permeation rate may not occur. A suitable substrate-coating combination is therefore sought. In this study, material testing methods for bipolar plates are considered.

  4. Quantization of the minimal and non-minimal vector field in curved space

    OpenAIRE

    Toms, David J.

    2015-01-01

    The local momentum space method is used to study the quantized massive vector field (the Proca field) with the possible addition of non-minimal terms. Heat kernel coefficients are calculated and used to evaluate the divergent part of the one-loop effective action. It is shown that the naive expression for the effective action that one would write down based on the minimal coupling case needs modification. We adopt a Faddeev-Jackiw method of quantization and consider the case of an ultrastatic...

  5. Minimal but non-minimal inflation and electroweak symmetry breaking

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)

    2016-10-07

    We consider the most minimal scale-invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, which plays the role of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10^(-3), typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  6. Willingness to Know the Cause of Death and Hypothetical Acceptability of the Minimally Invasive Autopsy in Six Diverse African and Asian Settings: A Mixed Methods Socio-Behavioural Study

    Science.gov (United States)

    Maixenchs, Maria; Anselmo, Rui; Zielinski-Gutiérrez, Emily; Odhiambo, Frank O.; Akello, Clarah; Zaidi, S. Shujaat H.; Soofi, Sajid Bashir; Bhutta, Zulfiqar A.; Diarra, Kounandji; Djitèye, Mahamane; Dembélé, Roukiatou; Sow, Samba; Minsoko, Pamela Cathérine Angoissa; Agnandji, Selidji Todagbe; Ismail, Mamudo R.; Carrilho, Carla; Ordi, Jaume; Menéndez, Clara; Bassat, Quique

    2016-01-01

    Background The minimally invasive autopsy (MIA) is being investigated as an alternative to complete diagnostic autopsies for cause of death (CoD) investigation. Before potential implementation of the MIA in settings where post-mortem procedures are unusual, a thorough assessment of its feasibility and acceptability is essential. Methods and Findings We conducted a socio-behavioural study at the community level to understand local attitudes and perceptions related to death and the hypothetical feasibility and acceptability of conducting MIAs in six distinct settings in Gabon, Kenya, Mali, Mozambique, and Pakistan. A total of 504 interviews (135 key informants, 175 health providers [including formal health professionals and traditional or informal health providers], and 194 relatives of deceased people) were conducted. The constructs “willingness to know the CoD” and “hypothetical acceptability of MIAs” were quantified and analysed using the framework analysis approach to compare the occurrence of themes related to acceptability across participants. Overall, 75% (379/504) of the participants would be willing to know the CoD of a relative. The overall hypothetical acceptability of MIA on a relative was 73% (366/504). The idea of the MIA was acceptable because of its perceived simplicity and rapidity and particularly for not “mutilating” the body. Further, MIAs were believed to help prevent infectious diseases, address hereditary diseases, clarify the CoD, and avoid witchcraft accusations and conflicts within families. The main concerns regarding the procedure included the potential breach of confidentiality on the CoD, the misperception of organ removal, and the incompatibility with some religious beliefs. Formal health professionals were concerned about possible contradictions between the MIA findings and the clinical pre-mortem diagnoses. Acceptability of the MIA was equally high among Christian and Islamic communities. However, in the two predominantly

  7. Waste minimization applications at a remediation site

    International Nuclear Information System (INIS)

    Allmon, L.A.

    1995-01-01

    The Fernald Environmental Management Project (FEMP), owned by the Department of Energy, was used for the processing of uranium. In 1989 Fernald suspended production of uranium metals and was placed on the National Priorities List (NPL). The site's mission has changed from production to environmental restoration. Many groups necessary for producing a product were deemed irrelevant for remediation work, including Waste Minimization. Waste minimization does not readily appear to be applicable to remediation work. Environmental remediation is designed to correct adverse impacts to the environment from past operations and generates significant amounts of waste requiring management. The premise of pollution prevention is to avoid waste generation, so remediation is in direct conflict with this premise. Although greater amounts of waste will be generated during environmental remediation, treatment capacities are not always available and disposal is becoming more difficult and costly. This creates the need for pollution prevention and waste minimization. Applying waste minimization principles at a remediation site is an enormous challenge; if the site is also radiologically contaminated, the challenge is even bigger. Innovative techniques and ideas must be utilized to achieve reductions in the amount of waste that must be managed or dispositioned. At Fernald the waste minimization paradigm was shifted from focusing efforts on source reduction to focusing efforts on recycle/reuse by inverting the EPA waste management hierarchy. A fundamental difference at remediation sites is that source reduction has limited applicability to legacy wastes but can be applied successfully to secondary waste generation. The bulk of measurable waste reduction will be achieved by the recycle/reuse of primary wastes and by segregation and decontamination of secondary wastestreams. Each effort must be measured in terms of being economically and ecologically beneficial.

  8. method

    Directory of Open Access Journals (Sweden)

    L. M. Kimball

    2002-01-01

    This paper presents an interior point algorithm to solve the multiperiod hydrothermal economic dispatch (HTED). The multiperiod HTED is a large-scale nonlinear programming problem. Various optimization methods have been applied to the multiperiod HTED, but most neglect important network characteristics or require decomposition into thermal and hydro subproblems. The algorithm described here exploits the special bordered block diagonal structure and sparsity of the Newton system for the first-order necessary conditions, resulting in a fast, efficient algorithm that can account for all network aspects. Applying this new algorithm challenges a conventional method for the use of available hydro resources known as the peak-shaving heuristic.
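    The eliminate-then-Schur-complement pattern that a bordered block diagonal Newton system permits can be shown on a toy single-constraint dispatch problem; the quadratic costs and the demand figure below are invented, and the real HTED blocks are of course far richer than these scalars:

    ```python
    import numpy as np

    # Toy multiperiod dispatch: minimize sum_t 0.5*a_t*p_t^2 + b_t*p_t
    # subject to sum_t p_t = D.  The KKT matrix is block-diagonal in the
    # p_t's, bordered by one coupling row/column -- a miniature of the
    # structure the interior point method exploits.  Values are invented.
    a = np.array([2.0, 1.0, 4.0])     # quadratic cost coefficients
    b = np.array([1.0, 3.0, 0.5])     # linear cost coefficients
    D = 10.0                          # total demand over the horizon

    # KKT conditions: a_t*p_t + b_t + lam = 0  and  sum_t p_t = D.
    # Eliminate p_t = -(b_t + lam)/a_t (cheap per-block solves), then solve
    # the 1x1 Schur complement for the coupling multiplier lam.
    lam = -(D + np.sum(b / a)) / np.sum(1.0 / a)
    p = -(b + lam) / a
    print("dispatch:", p, " multiplier:", lam)
    # At the optimum every period has equal marginal cost a_t*p_t + b_t = -lam.
    ```

    In the full HTED each diagonal "block" is a per-period Newton system and the border comes from inter-period hydro coupling, but the same eliminate-blocks-then-solve-the-border pattern applies, which is why the structure pays off at scale.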

  9. On Time with Minimal Expected Cost!

    DEFF Research Database (Denmark)

    David, Alexandre; Jensen, Peter Gjøl; Larsen, Kim Guldstrand

    2014-01-01

    (Priced) timed games are two-player quantitative games involving an environment assumed to be completely antagonistic. Classical analysis consists in the synthesis of strategies ensuring safety, time-bounded or cost-bounded reachability objectives. Assuming a randomized environment, the (priced) timed game essentially defines an infinite-state Markov (reward) decision process. In this setting the objective is classically to find a strategy that will minimize the expected reachability cost, but with no guarantees on worst-case behaviour. In this paper, we provide efficient methods for computing reachability strategies that will both ensure worst-case time bounds as well as provide (near-)minimal expected cost. Our method extends the synthesis algorithms of the synthesis tool Uppaal-Tiga with suitably adapted reinforcement learning techniques, which exhibit several orders of magnitude improvements w...
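    The notion of minimal expected reachability cost can be illustrated on a tiny Markov decision process with a fixed strategy; the chain, transition probabilities, and unit costs below are invented for illustration and have nothing to do with Uppaal-Tiga's actual algorithm:

    ```python
    import numpy as np

    # Expected cost to reach a goal state under a fixed strategy, computed by
    # value iteration.  From state i the strategy's action advances to i+1
    # with probability 0.9, stays put with probability 0.1, and costs 1.
    n_states, goal = 3, 2
    V = np.zeros(n_states)                # V[s] = expected cost-to-goal
    for _ in range(200):
        V_new = V.copy()
        for s in range(n_states):
            if s == goal:
                V_new[s] = 0.0
            else:
                V_new[s] = 1.0 + 0.9 * V[s + 1] + 0.1 * V[s]
        V = V_new
    print("expected costs:", V)           # fixed point: V[1] = 10/9, V[0] = 20/9
    ```

    The paper's setting replaces this finite chain with the infinite-state process induced by a priced timed game, which is why learning-based approximation is needed rather than direct value iteration.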

  10. Matrix factorizations, minimal models and Massey products

    International Nuclear Information System (INIS)

    Knapp, Johanna; Omer, Harun

    2006-01-01

    We present a method to compute the full non-linear deformations of matrix factorizations for ADE minimal models. This method is based on the calculation of higher products in the cohomology, called Massey products. The algorithm yields a polynomial ring whose vanishing relations encode the obstructions of the deformations of the D-branes characterized by these matrix factorizations. This coincides with the critical locus of the effective superpotential which can be computed by integrating these relations. Our results for the effective superpotential are in agreement with those obtained from solving the A-infinity relations. We point out a relation to the superpotentials of Kazama-Suzuki models. We will illustrate our findings by various examples, putting emphasis on the E6 minimal model.

  11. Biostatistical analysis of treatment results of bacterial liver abscesses using minimally invasive techniques and open surgery

    Directory of Open Access Journals (Sweden)

    Кipshidze A.A.

    2013-12-01

    Full Text Available Today bacterial abscesses remain one of the most difficult complications in surgical hepatology, and both traditional and minimally invasive methods of treatment are used. Biostatistical analysis is used because strong evidence is required for the effectiveness of one or another method of surgical intervention. The estimation of statistical significance of differences between the control and the main group of patients with liver abscesses is given in this paper. Depending on the treatment method, patients were divided into two groups: 1) minimally invasive surgery (89 cases); 2) laparotomy surgery (74 patients). Data comparison was performed by means of Student's criterion (t-test). The effectiveness of the method of abscess drainage using interventional sonography, and of external nasobiliary drainage with reorganization of the ductal liver system and abscess cavity with the help of modern antiseptics, was considered. The percentage of cured patients was also estimated.
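    The group comparison step (Student's criterion) can be sketched as a two-sample t statistic. The outcome scores below are hypothetical illustrative numbers, not the study's data; 2.228 is the two-sided 5% critical value for 10 degrees of freedom.

```python
import math
from statistics import mean, variance

def students_t(sample_a, sample_b):
    """Two-sample Student's t statistic, assuming equal variances."""
    na, nb = len(sample_a), len(sample_b)
    # Pooled sample variance across the two groups
    sp2 = ((na - 1) * variance(sample_a) + (nb - 1) * variance(sample_b)) / (na + nb - 2)
    return (mean(sample_a) - mean(sample_b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical outcome scores (e.g. days of hospital stay) for the two groups.
minimally_invasive = [12.1, 11.8, 12.5, 12.0, 11.9, 12.3]
laparotomy = [14.9, 15.2, 14.7, 15.1, 15.0, 14.8]

t = students_t(minimally_invasive, laparotomy)
# With n1 + n2 - 2 = 10 degrees of freedom, |t| > 2.228 is significant at the 5% level.
significant = abs(t) > 2.228
```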

  12. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Directory of Open Access Journals (Sweden)

    Christopher A O'Callaghan

    Full Text Available Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
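    The real-time allocation step described above can be sketched as a Pocock-Simon style minimization. This is a simplified illustration, not OxMaR's actual code; the factor names and the biased-coin parameter p are hypothetical.

```python
import random

def minimization_assign(new_factors, allocated, arms=("control", "experimental"), p=1.0):
    """Assign a participant to the arm that minimizes imbalance in the marginal
    totals of their prognostic factor levels (Pocock-Simon style sketch).
    new_factors: e.g. {"sex": "F", "age": "<40"}; allocated: list of (arm, factors)."""
    totals = {
        arm: sum(
            1
            for assigned_arm, factors in allocated
            for key, level in new_factors.items()
            if assigned_arm == arm and factors.get(key) == level
        )
        for arm in arms
    }
    least = min(totals.values())
    best_arms = [arm for arm in arms if totals[arm] == least]
    # Biased coin: with probability p choose among the least-imbalanced arms;
    # p = 1.0 makes allocation deterministic apart from tie-breaking.
    pool = best_arms if random.random() < p else list(arms)
    return random.choice(pool)

# Ten participants with identical factors: minimization keeps the arms balanced.
allocated = []
for _ in range(10):
    patient = {"sex": "F", "age": "<40"}
    allocated.append((minimization_assign(patient, allocated), patient))
counts = {arm: sum(1 for a, _ in allocated if a == arm)
          for arm in ("control", "experimental")}
```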

  13. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Science.gov (United States)

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.

  14. A simplified density matrix minimization for linear scaling self-consistent field theory

    International Nuclear Information System (INIS)

    Challacombe, M.

    1999-01-01

    A simplified version of the Li, Nunes and Vanderbilt [Phys. Rev. B 47, 10891 (1993)] and Daw [Phys. Rev. B 47, 10895 (1993)] density matrix minimization is introduced that requires four fewer matrix multiplies per minimization step relative to previous formulations. The simplified method also exhibits superior convergence properties, such that the bulk of the work may be shifted to the quadratically convergent McWeeny purification, which brings the density matrix to idempotency. Both orthogonal and nonorthogonal versions are derived. The AINV algorithm of Benzi, Meyer, and Tuma [SIAM J. Sci. Comp. 17, 1135 (1996)] is introduced to linear scaling electronic structure theory, and found to be essential in transformations between orthogonal and nonorthogonal representations. These methods have been developed with an atom-blocked sparse matrix algebra that achieves sustained megaflop rates as high as 50% of theoretical peak, and implemented in the MondoSCF suite of linear scaling SCF programs. For the first time, linear scaling Hartree-Fock theory is demonstrated with three-dimensional systems, including water clusters and estane polymers. The nonorthogonal minimization is shown to be uncompetitive with minimization in an orthonormal representation. An early onset of linear scaling is found for both minimal and double zeta basis sets, and crossovers with a highly optimized eigensolver are achieved. Calculations with up to 6000 basis functions are reported. The scaling of errors with system size is investigated for various levels of approximation. copyright 1999 American Institute of Physics
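    The McWeeny purification mentioned above iterates P → 3P² − 2P³, driving a near-idempotent density matrix to idempotency (P² = P). A dense NumPy sketch follows; the paper's linear-scaling version uses sparse atom-blocked matrix algebra rather than dense arrays.

```python
import numpy as np

def mcweeny_purify(P, tol=1e-12, max_iter=50):
    """Iterate P -> 3P^2 - 2P^3 until P is idempotent to within tol."""
    for _ in range(max_iter):
        P2 = P @ P
        if np.linalg.norm(P2 - P) < tol:
            break
        P = 3 * P2 - 2 * P2 @ P
    return P

# Start from a symmetric matrix whose eigenvalues are near (but not at) 0 and 1;
# purification pushes eigenvalues above 1/2 to 1 and those below 1/2 to 0.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((4, 4)))
P0 = Q @ np.diag([0.95, 0.9, 0.1, 0.05]) @ Q.T

P = mcweeny_purify(P0)  # idempotent; trace -> 2 (two occupied states)
```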

  15. On balanced minimal repeated measurements designs

    Directory of Open Access Journals (Sweden)

    Shakeel Ahmad Mir

    2014-10-01

    Full Text Available Repeated measurements designs are concerned with scientific experiments in which each experimental unit is assigned more than once to a treatment, either different or identical. This class of designs has the property that unbiased estimators for elementary contrasts among direct and residual effects are obtainable. Afsarinejad (1983) provided a method of constructing balanced minimal repeated measurements designs for p < t, when t is an odd or prime power; one or more treatments may occur more than once in some sequences, and the designs so constructed no longer remain uniform in periods. In this paper an attempt has been made to provide a new method to overcome this drawback. Specifically, two cases have been considered: RM[t, n = t(t-1)/(p-1), p], λ2 = 1 for balanced minimal repeated measurements designs, and RM[t, n = 2t(t-1)/(p-1), p], λ2 = 2 for balanced repeated measurements designs. In addition, a method has been provided for constructing extra-balanced minimal designs for the special case RM[t, n = t²/(p-1), p], λ2 = 1.

  16. Minimal modification to tribimaximal mixing

    International Nuclear Information System (INIS)

    He Xiaogang; Zee, A.

    2011-01-01

    We explore some ways of minimally modifying the neutrino mixing matrix from tribimaximal, characterized by introducing at most one mixing angle and a CP violating phase, thus extending our earlier work. One minimal modification, motivated to some extent by group theoretic considerations, is a simple case with the elements Vα2 of the second column in the mixing matrix equal to 1/√3. Modifications keeping one of the columns or one of the rows unchanged from tribimaximal mixing all belong to this class of minimal modification. Some of the cases have interesting experimentally testable consequences. In particular, the T2K and MINOS collaborations have recently reported indications of a nonzero θ13. For the cases we consider, the new data sharply constrain the CP violating phase angle δ, with δ close to 0 (in some cases) and π disfavored.
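    For reference, the tribimaximal pattern discussed here can be written down and checked numerically. The matrix below is the standard TBM form (sign conventions vary between papers); its second column is entirely 1/√3 and its (1,3) element vanishes, which is the θ13 = 0 prediction that the new data modify.

```python
import numpy as np

s = np.sqrt
# Standard tribimaximal (TBM) mixing matrix.
U_tbm = np.array([
    [ s(2/3), 1/s(3),       0],
    [-1/s(6), 1/s(3), -1/s(2)],
    [-1/s(6), 1/s(3),  1/s(2)],
])

unitary = np.allclose(U_tbm @ U_tbm.T, np.eye(3))  # real unitary (orthogonal)
theta13 = np.arcsin(abs(U_tbm[0, 2]))              # zero for exact TBM
```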

  17. Topological gravity with minimal matter

    International Nuclear Information System (INIS)

    Li Keke

    1991-01-01

    Topological minimal matter, obtained by twisting the minimal N = 2 superconformal field theory, is coupled to two-dimensional topological gravity. The free field formulation of the coupled system allows explicit representations of the BRST charge, physical operators and their correlation functions. The contact terms of the physical operators may be evaluated by extending the argument used in a recent solution of topological gravity without matter. The consistency of the contact terms in correlation functions implies recursion relations which coincide with the Virasoro constraints derived from the multi-matrix models. Topological gravity with minimal matter thus provides the field theoretic description for the multi-matrix models of two-dimensional quantum gravity. (orig.)

  18. Estimation and Minimization of Embodied Carbon of Buildings: A Review

    Directory of Open Access Journals (Sweden)

    Ali Akbarnezhad

    2017-01-01

    Full Text Available Building and construction is responsible for up to 30% of annual global greenhouse gas (GHG) emissions, commonly reported in carbon equivalent units. Carbon emissions are incurred in all stages of a building's life cycle and are generally categorised into operating carbon and embodied carbon, each making varying contributions to the life cycle carbon depending on the building's characteristics. With recent advances in reducing the operating carbon of buildings, the available literature indicates a clear shift in attention towards investigating strategies to minimize embodied carbon. However, minimizing the embodied carbon of buildings is challenging and requires evaluating the effects of embodied carbon reduction strategies on the emissions incurred in different life cycle phases, as well as the operating carbon of the building. In this paper, the available literature on strategies for reducing the embodied carbon of buildings, as well as methods for estimating the embodied carbon of buildings, is reviewed and the strengths and weaknesses of each method are highlighted.
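    At its core, a process-based embodied-carbon estimate of the kind these methods formalize is a quantity-weighted sum of emission factors over the bill of materials. The material quantities and emission factors below are hypothetical placeholders for illustration, not values from the review.

```python
# Process-based embodied carbon: sum(material quantity x emission factor).
# All numbers are hypothetical placeholders, not data from the review.
materials = {
    # material: (quantity in kg, emission factor in kgCO2e per kg)
    "concrete": (200_000, 0.15),
    "steel":    (15_000,  1.95),
    "timber":   (8_000,   0.45),
}

embodied_carbon_kg = sum(qty * factor for qty, factor in materials.values())
embodied_carbon_t = embodied_carbon_kg / 1000  # tonnes CO2e
```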

  19. Protein Requirements Are Elevated in Endurance Athletes after Exercise as Determined by the Indicator Amino Acid Oxidation Method.

    Directory of Open Access Journals (Sweden)

    Hiroyuki Kato

    Full Text Available A higher protein intake has been recommended for endurance athletes compared with healthy non-exercising individuals, based primarily on nitrogen balance methodology. The aim of this study was to determine the estimated average protein requirement and recommended protein intake in endurance athletes during an acute 3-d controlled training period using the indicator amino acid oxidation method. After 2 d of controlled diet (1.4 g protein/kg/d) and training (10 and 5 km/d, respectively), six male endurance-trained adults (28±4 y of age; body weight, 64.5±10.0 kg; VO2peak, 60.3±6.7 ml·kg-1·min-1; means±SD) performed an acute bout of endurance exercise (20 km treadmill run) prior to consuming test diets providing variable amounts of protein (0.2-2.8 g·kg-1·d-1) and sufficient energy. Protein was provided as a crystalline amino acid mixture based on the composition of egg protein, with [1-13C]phenylalanine provided to determine whole body phenylalanine flux, 13CO2 excretion, and phenylalanine oxidation. The estimated average protein requirement was determined as the breakpoint after biphasic linear regression analysis, with the recommended protein intake defined as the upper 95% confidence interval. Phenylalanine flux (68.8±8.5 μmol·kg-1·h-1) was not affected by protein intake. 13CO2 excretion displayed a robust biphasic linear relationship (R2 = 0.86) that resulted in an estimated average requirement and a recommended protein intake of 1.65 and 1.83 g protein·kg-1·d-1, respectively, which was similar to values based on phenylalanine oxidation (1.53 and 1.70 g·kg-1·d-1, respectively). We report a recommended protein intake that is greater than the RDA (0.8 g·kg-1·d-1) and current recommendations for endurance athletes (1.2-1.4 g·kg-1·d-1). Our results suggest that the metabolic demand for protein in endurance-trained adults on a higher volume training day is greater than their sedentary peers and current recommendations for athletes based...
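    The breakpoint step of a biphasic linear regression can be sketched as a grid search over candidate breakpoints, fitting a separate line on each side and minimizing the total squared error. The intake/oxidation numbers below are synthetic illustrative data, not the study's measurements, and the study's actual statistical procedure may differ in detail.

```python
import numpy as np

def sse_line(x, y):
    """Sum of squared residuals of a least-squares straight-line fit."""
    coeffs = np.polyfit(x, y, 1)
    resid = y - np.polyval(coeffs, x)
    return float(resid @ resid)

def biphasic_breakpoint(x, y):
    """Grid-search two-segment linear regression (simplified sketch):
    return the x where the second segment starts that minimizes total SSE."""
    best_bp, best_sse = None, np.inf
    for i in range(2, len(x) - 2):  # keep at least 2 points per segment
        sse = sse_line(x[:i], y[:i]) + sse_line(x[i:], y[i:])
        if sse < best_sse:
            best_bp, best_sse = float(x[i]), sse
    return best_bp

# Synthetic data: oxidation falls with protein intake, then plateaus.
intake = np.array([0.2, 0.6, 1.0, 1.4, 1.8, 2.2, 2.6, 2.8])       # g/kg/d
oxidation = np.array([27.0, 21.0, 15.0, 9.0, 6.0, 6.0, 6.0, 6.0])

requirement_estimate = biphasic_breakpoint(intake, oxidation)
```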

  20. Non-minimal inflation revisited

    International Nuclear Information System (INIS)

    Nozari, Kourosh; Shafizadeh, Somayeh

    2010-01-01

    We reconsider an inflationary model that inflaton field is non-minimally coupled to gravity. We study the parameter space of the model up to the second (and in some cases third) order of the slow-roll parameters. We calculate inflation parameters in both Jordan and Einstein frames, and the results are compared in these two frames and also with observations. Using the recent observational data from combined WMAP5+SDSS+SNIa datasets, we study constraints imposed on our model parameters, especially the non-minimal coupling ξ.

  1. Minimal Flavor Constraints for Technicolor

    DEFF Research Database (Denmark)

    Sakuma, Hidenori; Sannino, Francesco

    2010-01-01

    We analyze the constraints on the vacuum polarization of the standard model gauge bosons from a minimal set of flavor observables valid for a general class of models of dynamical electroweak symmetry breaking. We will show that the constraints have a strong impact on the self-coupling and mas...

  2. Harm minimization among teenage drinkers

    DEFF Research Database (Denmark)

    Jørgensen, Morten Hulvej; Curtis, Tine; Christensen, Pia Haudrup

    2007-01-01

    AIM: To examine strategies of harm minimization employed by teenage drinkers. DESIGN, SETTING AND PARTICIPANTS: Two periods of ethnographic fieldwork were conducted in a rural Danish community of approximately 2000 inhabitants. The fieldwork included 50 days of participant observation among 13... In regulating the social context of drinking they relied on their personal experiences more than on formalized knowledge about alcohol and harm, which they had learned from prevention campaigns and educational programmes. CONCLUSIONS: In this study we found that teenagers may help each other to minimize alcohol...

  3. Minimizing surgical skin incision scars with a latex surgical glove.

    Science.gov (United States)

    Han, So-Eun; Ryoo, Suk-Tae; Lim, So Young; Pyon, Jai-Kyung; Bang, Sa-Ik; Oh, Kap-Sung; Mun, Goo-Hyun

    2013-04-01

    The current trend in minimally invasive surgery is to make a small surgical incision. However, the excessive tensile stress applied by the retractors to the skin surrounding the incision often results in a long wound healing time and extensive scarring. To minimize these types of wound problems, the authors evaluated a simple and cost-effective method to minimize surgical incision scars based on the use of a latex surgical glove. The tunnel-shaped part of a powder-free latex surgical glove was applied to the incision and the dissection plane. It was fixed to the full layer of the dissection plane with sutures. The glove on the skin surface then was sealed with Ioban (3M Health Care, St. Paul, MN, USA) to prevent movement. The operation proceeded as usual, with the retractor running through the tunnel of the latex glove. It was possible to complete the operation without any disturbance of the visual field by the surgical glove, and the glove was neither torn nor separated by the retractors. The retractors caused traction and friction during the operation, but the extent of damage to the postoperative skin incision margin was remarkably less than when the operation was performed without a glove. This simple and cost-effective method is based on the use of a latex surgical glove to protect the surgical skin incision site and improve the appearance of the postoperative scar. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  4. Critical flicker frequency and continuous reaction times for the diagnosis of minimal hepatic encephalopathy

    DEFF Research Database (Denmark)

    Lauridsen, Mette Enok Munk; Jepsen, Peter; Vilstrup, Hendrik

    2011-01-01

    Abstract Minimal hepatic encephalopathy (MHE) is intermittently present in up to 2/3 of patients with chronic liver disease. It impairs their daily living and can be treated. However, there is no consensus on diagnostic criteria except that psychometric methods are required. We compared two easy... appropriately to a sensory stimulus. The choice of test depends on the information needed in the clinical and scientific care and study of the patients.

  5. KCUT, code to generate minimal cut sets for fault trees

    International Nuclear Information System (INIS)

    Han, Sang Hoon

    2008-01-01

    1 - Description of program or function: KCUT is a software to generate minimal cut sets for fault trees. 2 - Methods: Expand a fault tree into cut sets and delete non minimal cut sets. 3 - Restrictions on the complexity of the problem: Size and complexity of the fault tree
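    The two steps the record names (expand the fault tree into cut sets, then delete non-minimal cut sets) can be sketched for a small AND/OR tree. The tuple encoding of gates below is a hypothetical illustration, not KCUT's actual input format.

```python
from itertools import product

def cut_sets(node):
    """Expand a fault tree into (possibly non-minimal) cut sets.
    Basic events are strings; gates are ("AND"|"OR", [children])."""
    if isinstance(node, str):
        return [frozenset([node])]
    gate, children = node
    child_sets = [cut_sets(child) for child in children]
    if gate == "OR":
        # OR gate: any one child's cut set fails the gate
        return [cs for sets in child_sets for cs in sets]
    # AND gate: one cut set from every child must occur together
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal_cut_sets(tree):
    """Delete non-minimal cut sets (those strictly containing another)."""
    sets_ = cut_sets(tree)
    return {cs for cs in sets_ if not any(other < cs for other in sets_)}

# TOP = (A OR B) AND (A OR C): expansion gives {A}, {A,C}, {A,B}, {B,C};
# only {A} and {B,C} are minimal.
tree = ("AND", [("OR", ["A", "B"]), ("OR", ["A", "C"])])
mcs = minimal_cut_sets(tree)
```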

  6. The methodical approach to determining the heterogeneity of cognitive function in preschool children requiring correction of speech impediments

    Directory of Open Access Journals (Sweden)

    N.B. Petrenko

    2016-04-01

    Full Text Available Introduction. It has been confirmed that children who suffer from speech impediments may experience difficulties in their cognitive activity, limitations in communication, asociality and a sense of detachment. It is also clear that these children require not only logopedic treatment, but also assistance in developing their cognitive and mental functions. Aims. To identify the lack of uniformity of cognitive and somatomental functions in a group of 5-6 year old children with speech impediments, and to evaluate the method used for this research. Methods. Major mental and cognitive activities were estimated by means of tests of increasing difficulty, with scores from 1 to 10. Factors such as movement coordination, musicality and body plasticity were also taken into consideration. The StatSoft STATISTICA 10.0 package was used to run the statistical analysis. Results. Changes in the group's uniformity of physical, cognitive, somatomental and dance abilities were analysed and estimated at the beginning and at the end of the academic year. The results of the cluster analysis showed that the children managed to develop their cognitive and somatomental abilities, and that the level of uniformity in the group increased. Conclusions. The research shows that, with the help of cluster analysis, children with speech impediments can be grouped according to their physical, cognitive, somatomental and dance abilities. The cluster-analysis results, which indicate that the children developed their cognitive and somatomental abilities, demonstrate the positive effect of the suggested dance-cognitive teaching elements in an educational programme.

  7. Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.

    Science.gov (United States)

    Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei

    2017-10-25

    The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears in reach. However, several challenges still remain. In particular, the integration of individual sub-systems that is required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments of new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, for their potential to resolve challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. METHOD FOR SECURITY SPECIFICATION SOFTWARE REQUIREMENTS AS A MEANS FOR IMPLEMENTING A SOFTWARE DEVELOPMENT PROCESS SECURE - MERSEC

    Directory of Open Access Journals (Sweden)

    Castro Mecías, L.T.

    2015-06-01

    Full Text Available Security incidents often have software as their object, or use software as a means, causing serious damage with legal and economic consequences. Results of a survey by Kaspersky Lab reflect that vulnerabilities in software are the main cause of security incidents in enterprises: the report shows that 85% of them have reported security incidents, with software vulnerabilities as the main reason, and it is further estimated that incidents can cause significant losses, from $50,000 to $649,000. (1) In this regard, academic and industry research focuses on proposals for reducing vulnerabilities and failures of technology, with a positive influence on how software is developed. A development process with improved security practices should include activities from the initial phases of the software, so that security needs are identified, risk is managed and appropriate measures are implemented. This article discusses a method for the analysis, acquisition and specification of software security requirements, built on the basis of various proposals and of deficiencies identified through participant observation in software development teams. Experiments performed using the proposed method yield positive results regarding the reduction of security vulnerabilities and compliance with the security objectives of the software.

  9. Method

    Directory of Open Access Journals (Sweden)

    Ling Fiona W.M.

    2017-01-01

    Full Text Available Rapid prototyping of microchannels has gained a lot of attention from researchers along with the rapid development of microfluidic technology. Conventional methods carry several disadvantages, such as high cost, long fabrication times, high operating pressures and temperatures, and the need for expertise in operating the equipment. In this work, a new method adapting xurography is introduced to replace the conventional methods of microchannel fabrication. The novelty in this study is replacing the adhesive film with a clear plastic film, used to cut the design of the microchannel, as this material is more suitable for fabricating complex microchannel designs. The microchannel was then molded using polydimethylsiloxane (PDMS) and bonded with a clean glass slide to produce a closed microchannel. The microchannel produced had a clean edge, indicating that a good master mold was produced with the cutting plotter, and the bonding between the PDMS and glass was good, with no leakage observed. The materials used in this method are cheap and the total time consumed is less than 5 hours, making this method suitable for rapid prototyping of microchannels.

  10. Isoperimetric inequalities for minimal graphs

    International Nuclear Information System (INIS)

    Pacelli Bessa, G.; Montenegro, J.F.

    2007-09-01

    Based on Markvorsen and Palmer's work on mean exit time and isoperimetric inequalities, we establish slightly better isoperimetric inequalities and mean exit time estimates for minimal graphs in N x R. We also prove isoperimetric inequalities for submanifolds of Hadamard spaces with tamed second fundamental form. (author)

  11. A Defense of Semantic Minimalism

    Science.gov (United States)

    Kim, Su

    2012-01-01

    Semantic Minimalism is a position about the semantic content of declarative sentences, i.e., the content that is determined entirely by syntax. It is defined by the following two points: "Point 1": The semantic content is a complete/truth-conditional proposition. "Point 2": The semantic content is useful to a theory of…

  12. Torsional Rigidity of Minimal Submanifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen; Palmer, Vicente

    2006-01-01

    We prove explicit upper bounds for the torsional rigidity of extrinsic domains of minimal submanifolds $P^m$ in ambient Riemannian manifolds $N^n$ with a pole $p$. The upper bounds are given in terms of the torsional rigidities of corresponding Schwarz symmetrizations of the domains in warped...

  13. The debate on minimal deterrence

    International Nuclear Information System (INIS)

    Arbatov, A.; Karp, R.C.; Toth, T.

    1993-01-01

    Revitalization of debates on minimal nuclear deterrence at the present time is induced by the end of the Cold War and a number of unilateral and bilateral actions by the great powers to curtail nuclear arms race and reduce nuclear weapons arsenals

  14. Minimizing TLD-DRD differences

    International Nuclear Information System (INIS)

    Riley, D.L.; McCoy, R.A.; Connell, W.D.

    1987-01-01

    When substantial differences exist in exposures recorded by TLD's and DRD's, it is often necessary to perform an exposure investigation to reconcile the difference. In working with several operating plants, the authors have observed a number of causes for these differences. This paper outlines these observations and discusses procedures that can be used to minimize them

  15. Acquiring minimally invasive surgical skills

    NARCIS (Netherlands)

    Hiemstra, Ellen

    2012-01-01

    Many topics in surgical skills education have been implemented without a solid scientific basis. For that reason we have tried to find this scientific basis. We have focused on training and evaluation of minimally invasive surgical skills in a training setting and in practice in the operating room.

  16. Effect of a Bionanocomposite Coating on the Quality of Minimally Processed Mango (Pengaruh Pelapis Bionanokomposit terhadap Mutu Mangga Terolah Minimal)

    Directory of Open Access Journals (Sweden)

    Ata Aditya Wardana

    2017-04-01

    Full Text Available Abstract Minimally processed mango is a perishable product due to high respiration and transpiration and microbial decay. Edible coating is one of the alternative methods to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis, stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, 2% by weight of tapioca) were used to coat minimally processed mango. The results showed that application of the bionanocomposite edible coatings was able to maintain the quality of minimally processed mango during the storage period. The bionanocomposite from tapioca + NP-ZnO (2% by weight of tapioca) was the most effective in reducing weight loss, loss of firmness, browning index, total acidity, total soluble solids, respiration, and microbial counts. Thus, the use of a bionanocomposite edible coating might provide an alternative method to maintain the storage quality of minimally processed mango. Abstrak: Minimally processed mango is a product that deteriorates quickly owing to rapid respiration, transpiration and microbial spoilage. Edible coating is an alternative method for maintaining the quality of minimally processed mango. The aim of this study was to evaluate the effect of a bionanocomposite coating of tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, 2% w/w tapioca) were used to coat the minimally processed mango. The results showed that the bionanocomposite coating was able to maintain the quality of minimally processed mango during storage. The bionanocomposite of tapioca + NP-ZnO (2% w/w tapioca) was the most effective in inhibiting the decrease in weight, firmness, browning index, total acidity, total soluble solids, respiration and total...

  17. Stabilization of a locally minimal forest

    Science.gov (United States)

    Ivanov, A. O.; Mel'nikova, A. E.; Tuzhilin, A. A.

    2014-03-01

    The method of partial stabilization of locally minimal networks, which was invented by Ivanov and Tuzhilin to construct examples of shortest trees with given topology, is developed. According to this method, boundary vertices of degree 2 are not added to all edges of the original locally minimal tree, but only to some of them. The problem of partial stabilization of locally minimal trees in a finite-dimensional Euclidean space is solved completely in the paper, that is, without any restrictions imposed on the number of edges remaining free of subdivision. A criterion for the realizability of such stabilization is established. In addition, the general problem of searching for the shortest forest connecting a finite family of boundary compact sets in an arbitrary metric space is formalized; it is shown that such forests exist for any family of compact sets if and only if for any finite subset of the ambient space there exists a shortest tree connecting it. The theory developed here allows us to establish further generalizations of the stabilization theorem both for arbitrary metric spaces and for metric spaces with some special properties. Bibliography: 10 titles.

  18. Minimizing hydride cracking in zirconium alloys

    International Nuclear Information System (INIS)

    Coleman, C.E.; Cheadle, B.A.; Ambler, J.F.R.; Eadie, R.L.

    1985-01-01

    Zirconium alloy components can fail by hydride cracking if they contain large flaws and are highly stressed. If cracking in such components is suspected, crack growth can be minimized by following two simple operating rules: components should be heated up from at least 30 K below any operating temperature above 450 K, and when a component requires cooling from a high temperature to room temperature, any tensile stress should be reduced as much, and as quickly, as is practical during cooling. This paper describes the physical basis for these rules.

  19. What is Quantum Mechanics? A Minimal Formulation

    Science.gov (United States)

    Friedberg, R.; Hohenberg, P. C.

    2018-03-01

    This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.

  20. Phylogenetic rooting using minimal ancestor deviation.

    Science.gov (United States)

    Tria, Fernando Domingues Kümmel; Landan, Giddy; Dagan, Tal

    2017-06-19

    Ancestor-descendant relations play a cardinal role in evolutionary theory. Those relations are determined by rooting phylogenetic trees. Existing rooting methods are hampered by evolutionary rate heterogeneity or the unavailability of auxiliary phylogenetic information. Here we present a rooting approach, the minimal ancestor deviation (MAD) method, which accommodates heterotachy by using all pairwise topological and metric information in unrooted trees. We demonstrate the performance of the method, in comparison to existing rooting methods, by the analysis of phylogenies from eukaryotes and prokaryotes. MAD correctly recovers the known root of eukaryotes and uncovers evidence for the origin of cyanobacteria in the ocean. MAD is more robust and consistent than existing methods, provides measures of the root inference quality, and is applicable to any tree with branch lengths.

  1. Discrete Curvatures and Discrete Minimal Surfaces

    KAUST Repository

    Sun, Xiang

    2012-06-01

    This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads to great interest in studying discrete surfaces. With the rich smooth surface theory in hand, one would hope that this elegant theory can still be applied to the discrete counterpart. Such a generalization, however, is not always successful. While discrete surfaces have the advantage of being finite dimensional, and thus easier to treat, their geometric properties such as curvatures are not well defined in the classical sense. Furthermore, the powerful tools of calculus can hardly be applied. The methods in this thesis, including the angular defect formula, the cotangent formula, parallel meshes, relative geometry, etc., are approaches based on offset meshes or generalized offset meshes. As an important application, we discuss discrete minimal surfaces and discrete Koenigs meshes.
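    The angular defect formula mentioned above is simple enough to sketch directly: the discrete Gaussian curvature at a mesh vertex is 2π minus the sum of the incident triangle angles. The mesh data below is a made-up toy configuration, not one from the thesis.

```python
import math

def angle(a, b, c):
    """Angle at vertex a in the triangle (a, b, c), all 3D points."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(x * x for x in v))
    return math.acos(dot / (nu * nv))

def angular_defect(vertex, ring):
    """Discrete Gaussian curvature via the angular defect: 2*pi minus the
    sum of the angles at `vertex` over the incident triangle fan `ring`."""
    total = sum(angle(vertex, ring[i], ring[(i + 1) % len(ring)])
                for i in range(len(ring)))
    return 2 * math.pi - total

ring = [(1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0)]
# Coplanar fan: the surface is flat, so the defect vanishes.
flat = angular_defect((0.0, 0.0, 0.0), ring)
# Lifting the center vertex into a pyramid apex gives sphere-like
# (positive) discrete curvature.
apex = angular_defect((0.0, 0.0, 0.5), ring)
print(round(flat, 6), apex > 0)  # → 0.0 True
```

As the example shows, the defect is a purely intrinsic quantity: it only needs the angles of the incident triangles, no normal vectors or parameterization.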

  2. Fault Sample Generation for Virtual Testability Demonstration Test Subject to Minimal Maintenance and Scheduled Replacement

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2015-01-01

    Full Text Available Virtual testability demonstration testing brings new requirements to fault sample generation. First, the fault occurrence process is described using stochastic process theory, and it is shown that the fault occurrence process under minimal repair is a nonhomogeneous Poisson process (NHPP). Second, the interarrival-time distribution function of the next fault event is proposed, and three typical kinds of parameterized NHPP are discussed. Third, a procedure for fault sample generation is put forward under the assumptions of minimal maintenance and scheduled replacement; the fault modes and their occurrence times for specified conditions and time periods can then be obtained. Finally, an antenna-driving subsystem in an automatic pointing and tracking platform is taken as a case study to illustrate the proposed method. The results indicate that both the size and the structure of the fault samples generated by the proposed method are reasonable and effective, and that the method can be applied well to virtual testability demonstration testing.
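    An NHPP of the kind described above can be sampled with the standard Lewis-Shedler thinning algorithm. Below is a minimal sketch, assuming a power-law intensity λ(t) = (β/η)(t/η)^(β−1) as one plausible parameterized NHPP; the parameter values are illustrative, not taken from the paper.

```python
import random

def nhpp_thinning(rate, rate_max, t_end, rng):
    """Sample event times of an NHPP on (0, t_end] by thinning a
    homogeneous Poisson process of rate rate_max (Lewis-Shedler)."""
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)          # candidate arrival from the HPP
        if t > t_end:
            return events
        if rng.random() <= rate(t) / rate_max:  # keep with prob. rate(t)/rate_max
            events.append(t)

# Illustrative power-law intensity, increasing for beta > 1 (a system that
# degrades under minimal repair); beta, eta, t_end are made-up values.
beta, eta, t_end = 2.0, 100.0, 200.0
lam = lambda t: (beta / eta) * (t / eta) ** (beta - 1)
faults = nhpp_thinning(lam, lam(t_end), t_end, random.Random(42))
print(len(faults))  # the mean event count is (t_end/eta)**beta = 4; draws vary
```

Thinning requires only that rate_max bound the intensity on the interval; here the intensity is increasing, so its value at t_end suffices.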

  3. Minimizing the Fluid Used to Induce Fracturing

    Science.gov (United States)

    Boyle, E. J.

    2015-12-01

    The less fluid injected to induce fracturing, the less fluid must be produced back before gas can be produced. One method is to inject as fast as possible until the desired fracture length is obtained. Presented here is an alternative injection strategy derived by applying optimal-control theory to the macroscopic mass balance. The picture is that the fracture is constant in aperture, fluid is injected at a controlled rate at the near end, and the fracture unzips at the far end until the desired length is obtained. The velocity of the fluid is governed by Darcy's law, with larger permeability for flow along the fracture length. Fracture growth is monitored through micro-seismicity. Since the fluid is assumed to be incompressible, the rate at which fluid is injected is balanced by the rate of fracture growth and the rate of loss to the bounding rock. Minimizing injected-fluid loss to the bounding rock is therefore the same as minimizing total injected fluid. How to change the injection rate so as to minimize the total injected fluid is a problem in optimal control. For a given total length, the variation of the injection rate is determined by variations in the overall time needed to obtain the desired fracture length, the length at any time, and the rate at which the fracture is growing at that time. Optimal-control theory leads to a boundary condition and an ordinary differential equation in time whose solution is an injection protocol that minimizes the fluid used under the stated assumptions. That protocol is to monitor the rate at which the square of the fracture length is growing and to adjust the injection rate proportionately.
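    The closing rule, injection rate proportional to the growth rate of the squared fracture length, can be sketched as follows. The proportionality constant and the length series are hypothetical placeholders; a real implementation would calibrate them against the micro-seismic observations.

```python
def injection_rates(lengths, dt, c=1.0):
    """Injection rate proportional to d(L^2)/dt, as the protocol above
    prescribes. The constant c is a hypothetical calibration lumping
    aperture and leak-off effects; lengths come from micro-seismicity."""
    return [c * (cur ** 2 - prev ** 2) / dt
            for prev, cur in zip(lengths, lengths[1:])]

# If the fracture unzips with L ~ sqrt(t), then d(L^2)/dt is constant and
# the protocol prescribes a constant injection rate.
lengths = [(10.0 * k) ** 0.5 for k in range(6)]  # L(t_k) = sqrt(10 k), made-up data
print(injection_rates(lengths, dt=1.0))          # every entry is ~10.0
```

The square-root growth case makes the point of the protocol concrete: a fracture that slows its advance needs a correspondingly reduced injection rate, not a constant maximum one.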

  4. MOCUS, Minimal Cut Sets and Minimal Path Sets from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Fussell, J.B.; Henry, E.B.; Marshall, N.H.

    1976-01-01

    1 - Description of problem or function: From a description of the Boolean failure logic of a system, called a fault tree, and control parameters specifying the minimal cut set length to be obtained, MOCUS determines the system failure modes, or minimal cut sets, and the system success modes, or minimal path sets. 2 - Method of solution: MOCUS uses direct resolution of the fault tree into the cut and path sets. The algorithm starts with the main failure of interest, the top event, and proceeds to basic independent component failures, called primary events, to resolve the fault tree and obtain the minimal sets. A key point of the algorithm is that an AND gate alone always increases the size of cut sets and the number of path sets, while an OR gate alone always increases the number of cut sets and the size of path sets. Other types of logic gates must be described in terms of AND and OR gates. 3 - Restrictions on the complexity of the problem: Output from MOCUS can include minimal cut and path sets for up to 20 gates.
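    The top-down resolution that MOCUS performs can be sketched in a few lines. The fault tree below is a made-up example, not one from the code's documentation; the algorithm follows the description above (expand AND gates in place, split rows at OR gates, then discard non-minimal sets).

```python
# Hypothetical fault tree: gate name -> (gate type, list of inputs).
# Symbols not appearing as keys are primary events.
TREE = {
    "TOP": ("OR",  ["G1", "E3"]),
    "G1":  ("AND", ["E1", "G2"]),
    "G2":  ("OR",  ["E2", "E3"]),
}

def mocus(tree, top):
    """Resolve the tree into minimal cut sets: an AND gate enlarges a cut
    set in place, an OR gate multiplies the number of candidate sets."""
    rows = [[top]]
    while True:
        gates = [(i, j) for i, row in enumerate(rows)
                 for j, sym in enumerate(row) if sym in tree]
        if not gates:
            break
        i, j = gates[0]
        op, kids = tree[rows[i][j]]
        if op == "AND":                       # replace the gate by all inputs
            rows[i] = rows[i][:j] + kids + rows[i][j + 1:]
        else:                                 # OR: one new row per input
            rows[i:i + 1] = [rows[i][:j] + [k] + rows[i][j + 1:] for k in kids]
    cuts = [frozenset(r) for r in rows]
    # Keep only minimal sets: drop any cut set containing another as a subset.
    return sorted(set(c for c in cuts if not any(o < c for o in cuts)),
                  key=sorted)

print([sorted(c) for c in mocus(TREE, "TOP")])  # → [['E1', 'E2'], ['E3']]
```

Note how {E1, E3} is generated during resolution but discarded at the end, because the single-event cut set {E3} already covers it.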

  5. A method of formal requirements analysis for NPP I and C systems based on object-oriented visual modeling with SCR

    International Nuclear Information System (INIS)

    Koo, S. R.; Seong, P. H.

    1999-01-01

    In this work, a formal requirements analysis method for nuclear power plant (NPP) I and C systems is suggested. The method uses the Unified Modeling Language (UML) to model systems visually and the Software Cost Reduction (SCR) formalism to check the system models. Since object-oriented methods analyze a problem in terms of the objects of the real system, UML models are useful for understanding problems and for communicating with everyone involved in the project. To analyze the requirements more formally, the UML models are converted into SCR tabular notation. To support this flow from UML models to SCR specifications, additional syntactic extensions to the UML notation and a conversion procedure are defined. The combined method has been applied to the Dynamic Safety System (DSS); this application detected three kinds of errors in the existing DSS requirements.

  6. Current status of pediatric minimal access surgery at Sultan Qaboos ...

    African Journals Online (AJOL)

    Keywords: current status, laparoscopy, minimal access surgery, thoracoscopy. Departments of ... Materials and methods ... procedures, the open technique was used for the creation ... operated for bilateral inguinal herniotomy had recurrence.

  7. Surface Reconstruction and Image Enhancement via $L^1$-Minimization

    KAUST Repository

    Dobrev, Veselin; Guermond, Jean-Luc; Popov, Bojan

    2010-01-01

    A surface reconstruction technique based on minimization of the total variation of the gradient is introduced. Convergence of the method is established, and an interior-point algorithm for solving the associated linear programming problem is presented.

  8. Minimalism and the Pragmatic Frame

    Directory of Open Access Journals (Sweden)

    Ana Falcato

    2016-02-01

    Full Text Available In the debate between literalism and contextualism in semantics, Kent Bach’s project is often taken to stand on the latter side of the divide. In this paper I argue this is a misleading assumption and justify it by contrasting Bach’s assessment of the theoretical eliminability of minimal propositions arguably expressed by well-formed sentences with standard minimalist views, and by further contrasting his account of the division of interpretative processes ascribable to the semantics and pragmatics of a language with a parallel analysis carried out by the most radical opponent to semantic minimalism, i.e., by occasionalism. If my analysis proves right, the sum of its conclusions amounts to a refusal of Bach’s main dichotomies.

  9. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  10. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  11. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
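    The conjugacy property underlying these papers is easy to state and check: a Dirichlet(α) prior combined with multinomial counts n yields a Dirichlet(α + n) posterior, so a prior carrying a small total pseudo-count is quickly dominated even by sparse data. A sketch with hypothetical alpha-factor numbers (not values from the papers):

```python
def dirichlet_update(alpha_prior, counts):
    """Conjugate update: Dirichlet(alpha) prior + multinomial counts
    -> Dirichlet(alpha + counts) posterior."""
    return [a + n for a, n in zip(alpha_prior, counts)]

def dirichlet_mean(alpha):
    s = sum(alpha)
    return [a / s for a in alpha]

# Hypothetical 3-component alpha-factor example: a diffuse prior with total
# pseudo-count 1.0 is swamped even by ten observed failure events.
prior = [0.5, 0.3, 0.2]
data  = [8, 1, 1]                  # sparse common-cause failure counts
post  = dirichlet_update(prior, data)
print([round(m, 3) for m in dirichlet_mean(post)])  # → [0.773, 0.118, 0.109]
```

The posterior mean sits close to the empirical fractions 8/10, 1/10, 1/10, which is exactly the "responsive to sparse data" behavior a minimally informative prior is designed to deliver.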

  12. Principle of minimal work fluctuations.

    Science.gov (United States)

    Xiao, Gaoyang; Gong, Jiangbin

    2015-08-01

    Understanding and manipulating work fluctuations in microscale and nanoscale systems are of both fundamental and practical interest. For example, in considering the Jarzynski equality ⟨e^{-βW}⟩ = e^{-βΔF}, a change in the fluctuations of e^{-βW} may impact how rapidly the statistical average of e^{-βW} converges towards the theoretical value e^{-βΔF}, where W is the work, β is the inverse temperature, and ΔF is the free energy difference between two equilibrium states. Motivated by our previous study aiming at the suppression of work fluctuations, here we obtain a principle of minimal work fluctuations. In brief, adiabatic processes as treated in quantum and classical adiabatic theorems yield the minimal fluctuations in e^{-βW}. In the quantum domain, if a system initially prepared at thermal equilibrium is subjected to a work protocol but isolated from a bath during the time evolution, then a quantum adiabatic process without energy-level crossing (or an assisted adiabatic process reaching the same final states as in a conventional adiabatic process) yields the minimal fluctuations in e^{-βW}, where W is the quantum work defined by two energy measurements, one at the beginning and one at the end of the process. In the classical domain, where the classical work protocol is realizable by an adiabatic process, the classical adiabatic process also yields the minimal fluctuations in e^{-βW}. Numerical experiments based on a Landau-Zener process confirm our theory in the quantum domain, and our theory in the classical domain explains our previous numerical findings regarding the suppression of classical work fluctuations [G. Y. Xiao and J. B. Gong, Phys. Rev. E 90, 052132 (2014)].
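    The convergence issue the abstract raises can be illustrated with a toy Monte Carlo check of the Jarzynski average, assuming (purely for illustration) a Gaussian work distribution, for which ⟨e^{-βW}⟩ is known in closed form:

```python
import math
import random

# For Gaussian work W ~ N(mu, sigma^2) the Jarzynski average is exact:
# <exp(-beta*W)> = exp(-beta*mu + (beta*sigma)**2 / 2),
# i.e. the free energy difference is DF = mu - beta*sigma**2 / 2.
rng = random.Random(0)
beta, mu, sigma, n = 1.0, 1.0, 0.5, 100_000

estimate = sum(math.exp(-beta * rng.gauss(mu, sigma)) for _ in range(n)) / n
exact = math.exp(-beta * mu + (beta * sigma) ** 2 / 2)
print(abs(estimate - exact) < 0.01)
```

Increasing sigma inflates the variance of e^{-βW} and slows this convergence dramatically, which is exactly why protocols that minimize the fluctuations of e^{-βW}, such as the adiabatic processes singled out above, are valuable.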

  13. Optimizing Processes to Minimize Risk

    Science.gov (United States)

    Loyd, David

    2017-01-01

    NASA, like other hazardous industries, has suffered catastrophic losses. Human error will likely never be completely eliminated as a factor in our failures. When risk cannot be eliminated, the focus must be on mitigating the worst consequences and on recovering operations. Bolstering processes to emphasize the role of integration and problem solving is key to success. Building an effective Safety Culture bolsters skill-based performance that minimizes risk and encourages successful engagement.

  14. Minimal Length, Measurability and Gravity

    Directory of Open Access Journals (Sweden)

    Alexander Shalyt-Margolin

    2016-03-01

    Full Text Available The present work is a continuation of the author's previous papers on the subject. In terms of the measurability (or measurable quantities) notion introduced in a minimal length theory, consideration is first given to a quantum theory in the momentum representation. The same terms are then used to consider the Markov gravity model, which here illustrates the general approach to the study of gravity in terms of measurable quantities.

  15. Minimal massive 3D gravity

    International Nuclear Information System (INIS)

    Bergshoeff, Eric; Merbis, Wout; Hohm, Olaf; Routh, Alasdair J; Townsend, Paul K

    2014-01-01

    We present an alternative to topologically massive gravity (TMG) with the same ‘minimal’ bulk properties; i.e. a single local degree of freedom that is realized as a massive graviton in linearization about an anti-de Sitter (AdS) vacuum. However, in contrast to TMG, the new ‘minimal massive gravity’ has both a positive energy graviton and positive central charges for the asymptotic AdS-boundary conformal algebra. (paper)

  16. Acquiring minimally invasive surgical skills

    OpenAIRE

    Hiemstra, Ellen

    2012-01-01

    Many topics in surgical skills education have been implemented without a solid scientific basis. For that reason, we have tried to establish such a basis, focusing on the training and evaluation of minimally invasive surgical skills both in a training setting and in practice in the operating room. This thesis has provided greater insight into the organization of surgical skills training during the residency training of surgical specialists.

  17. Annual Waste Minimization Summary Report Calendar Year 2007

    International Nuclear Information System (INIS)

    NSTec Environmental Management

    2008-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year (CY) 2007. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  18. Annual Waste Minimization Summary Report, Calendar Year 2008

    International Nuclear Information System (INIS)

    2009-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2008. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  19. Annual Waste Minimization Summary Report, Calendar Year 2009

    International Nuclear Information System (INIS)

    2010-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2009. This report was developed in accordance with the requirements of the Nevada Test Site Resource Conservation and Recovery Act Permit (No. NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by NNSA/NSO.

  20. Statistically Efficient Construction of α-Risk-Minimizing Portfolio

    Directory of Open Access Journals (Sweden)

    Hiroyuki Taniai

    2012-01-01

    Full Text Available We propose a semiparametrically efficient estimator for α-risk-minimizing portfolio weights. Based on the work of Bassett et al. (2004), an α-risk-minimizing portfolio optimization is formulated as a linear quantile regression problem. The quantile regression method uses a pseudolikelihood based on an asymmetric Laplace reference density, and asymptotic properties such as consistency and asymptotic normality are obtained. We apply the results of Hallin et al. (2008) to the problem of constructing α-risk-minimizing portfolios using residual signs and ranks and a general reference density. Monte Carlo simulations assess the performance of the proposed method. Empirical applications are also investigated.
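    The quantile-regression formulation rests on the pinball (asymmetric Laplace) loss, whose minimizer over a sample is the empirical α-quantile; in the Bassett et al. setup the portfolio weights enter through the returns being quantile-regressed. A minimal sketch of the loss and this quantile-minimization property, on made-up return data:

```python
def pinball(u, alpha):
    """Asymmetric (pinball) loss used in quantile regression."""
    return alpha * u if u >= 0 else (alpha - 1) * u

def risk(returns, xi, alpha):
    """Average pinball loss of the deviations from a candidate quantile xi."""
    return sum(pinball(r - xi, alpha) for r in returns) / len(returns)

# The average pinball loss is piecewise linear in xi, so it is minimized at
# a sample point: the empirical alpha-quantile. Returns here are illustrative.
returns = [-3.0, -1.0, 0.5, 1.0, 2.0, 4.0]
alpha = 0.25
best = min(returns, key=lambda xi: risk(returns, xi, alpha))
print(best)  # → -1.0, the empirical 25% quantile of the sample
```

Replacing the scalar sample by portfolio returns w·x and minimizing jointly over the weights w and the quantile ξ is, in essence, how the α-risk-minimizing portfolio problem becomes a linear quantile regression.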