WorldWideScience

Sample records for method requires minimal

  1. Waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The US Department of Energy (DOE) will require a large number of waste characterizations over a multi-year period to accomplish the Department's goals in environmental restoration and waste management. Estimates vary, but two million analyses annually are expected. The waste generated by the analytical procedures used for characterizations is a significant source of new DOE waste. Success in reducing the volume of secondary waste and the costs of handling this waste would significantly decrease the overall cost of this DOE program. Selection of appropriate analytical methods depends on the intended use of the resultant data. It is not always necessary to use a high-powered analytical method, typically at higher cost, to obtain data needed to make decisions about waste management. Indeed, for samples taken from some heterogeneous systems, the meaning of high accuracy becomes clouded if the data generated are intended to measure a property of this system. Among the factors to be considered in selecting the analytical method are the lower limit of detection, accuracy, turnaround time, cost, reproducibility (precision), interferences, and simplicity. Occasionally, there must be tradeoffs among these factors to achieve the multiple goals of a characterization program. The purpose of the work described here is to add waste minimization to the list of characteristics to be considered. In this paper the authors present results of modifying analytical methods for waste characterization to reduce both the cost of analysis and the volume of secondary wastes. Although tradeoffs may be required to minimize waste while still generating data of acceptable quality for the decision-making process, they have data demonstrating that wastes can be reduced in some cases without sacrificing accuracy or precision.

  2. Fast nonconvex nonsmooth minimization methods for image restoration and reconstruction.

    Science.gov (United States)

    Nikolova, Mila; Ng, Michael K; Tam, Chi-Pan

    2010-12-01

    Nonconvex nonsmooth regularization has advantages over convex regularization for restoring images with neat edges. However, its practical use has been limited by the difficulty of the computational stage, which requires a nonconvex nonsmooth minimization. In this paper, we deal with nonconvex nonsmooth minimization methods for image restoration and reconstruction. Our theoretical results show that the solution of the nonconvex nonsmooth minimization problem is composed of constant regions surrounded by closed contours and neat edges. The main goal of this paper is to develop fast minimization algorithms to solve the nonconvex nonsmooth minimization problem. Our experimental results show the effectiveness and efficiency of the proposed algorithms.

  3. Methods evaluated to minimize emissions from preplant soil fumigation

    Directory of Open Access Journals (Sweden)

    Suduan Gao

    2008-05-01

    Many commodities depend on preplant soil fumigation for pest control to achieve healthy crops and profitable yields. Under California regulations, minimizing emissions is essential to maintain the practical use of soil fumigants, and more stringent regulations are likely in the future. The phase-out of methyl bromide as a broad-spectrum soil fumigant has created formidable challenges. Most alternatives registered today are regulated as volatile organic compounds because of their toxicity and mobile nature. We review research on methods for minimizing emissions from soil fumigation, including the effectiveness of their emission reductions, impacts on pest control and cost. Low-permeability plastic mulches are highly effective but are generally affordable only in high-value cash crops such as strawberry. Crops with low profit margins such as stone-fruit orchards may require lower-cost methods such as water treatment or target-area fumigation.

  4. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described capable of using approximate error in constrained conjugate gradient minimization problems.
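    The patent abstract above describes estimating the error for a constrained conjugate gradient minimization from a subset of rays rather than all of them. As a hedged sketch (not the patented embodiment itself), the idea can be illustrated with standard CG on the normal equations of a toy ray system, where only the stopping test uses a scaled subset-of-rays error estimate; all names and sizes here are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy reconstruction problem: each row of A is one "ray" measurement.
    n_rays, n_vox = 200, 50
    A = rng.normal(size=(n_rays, n_vox))
    b = A @ rng.normal(size=n_vox)

    def cg_subset_error(A, b, n_sub=20, tol=1e-8, max_iter=200):
        """CG on the normal equations A^T A x = A^T b, with the stopping
        test based on an error estimate from a random subset of rays
        rather than the full residual (hypothetical sketch)."""
        ATA, ATb = A.T @ A, A.T @ b
        x = np.zeros(A.shape[1])
        r = ATb - ATA @ x
        d = r.copy()
        for _ in range(max_iter):
            # Approximate error: squared residual on a ray subset,
            # scaled up to the full number of rays.
            rows = rng.choice(A.shape[0], size=n_sub, replace=False)
            approx_err = np.sum((A[rows] @ x - b[rows]) ** 2) * A.shape[0] / n_sub
            if approx_err < tol:
                break
            Ad = ATA @ d
            alpha = (r @ r) / (d @ Ad)   # minimum along the conjugate direction
            x += alpha * d
            r_new = r - alpha * Ad
            beta = (r_new @ r_new) / (r @ r)
            d = r_new + beta * d
            r = r_new
        return x

    x_hat = cg_subset_error(A, b)
    print(np.linalg.norm(A @ x_hat - b))
    ```

    The subset estimate is cheap when the full set of rays is large; the scaling factor keeps its magnitude comparable to the full squared residual.
    
    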

  5. Method for determining the minimal inactivating dose of gamma radiation for Salmonella typhimurium

    International Nuclear Information System (INIS)

    Araujo, E.S.; Campos, H. de; Silva, D.M.

    1979-01-01

    A method for determining the minimal inactivating dose (MID) for Salmonella typhimurium is presented, offering a more efficient way to improve irradiated vaccines. The MID found for S. typhimurium 6.616 by the binomial test was 0.55 MR. The method yields a definite value for the MID and requires less material, labor, and time than the usual procedure [pt

  6. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao

    2016-12-07

    Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.

  7. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao; Zheng, Wei-Shi; Ghanem, Bernard

    2016-01-01

    Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.
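    The abstract presents MSM as a generalization of the classical Gauss-Seidel method for linear systems. As background only (a sketch of the classical special case, not of MSM itself), Gauss-Seidel splits A into its lower-triangular part L and the strictly upper part U and iterates L x_{k+1} = b - U x_k; for a symmetric positive definite A this also minimizes the quadratic 0.5 x^T A x - b^T x coordinate-by-coordinate.

    ```python
    import numpy as np

    def gauss_seidel(A, b, iters=100):
        """Classical Gauss-Seidel: split A = L + U with L the lower
        triangle (including the diagonal) and solve L x_{k+1} = b - U x_k."""
        L = np.tril(A)
        U = A - L
        x = np.zeros_like(b, dtype=float)
        for _ in range(iters):
            x = np.linalg.solve(L, b - U @ x)  # triangular solve each sweep
        return x

    # Symmetric positive definite system, so the iteration converges.
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    b = np.array([1.0, 2.0, 3.0])
    x = gauss_seidel(A, b)
    print(np.allclose(A @ x, b))
    ```

    MSM, per the abstract, extends this splitting idea from linear systems to composite (smooth plus nonsmooth) objectives.
    
    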

  8. Balancing related methods for minimal realization of periodic systems

    OpenAIRE

    Varga, A.

    1999-01-01

    We propose balancing related numerically reliable methods to compute minimal realizations of linear periodic systems with time-varying dimensions. The first method belongs to the family of square-root methods with guaranteed enhanced computational accuracy and can be used to compute balanced minimal order realizations. An alternative balancing-free square-root method has the advantage of a potentially better numerical accuracy in case of poorly scaled original systems. The key numerical co...

  9. The minimal energetic requirement of sustained awareness after brain injury

    DEFF Research Database (Denmark)

    Stender, Johan; Mortensen, Kristian Nygaard; Thibaut, Aurore

    2016-01-01

    The cerebral metabolic rate of glucose has been proposed as an indicator of consciousness [2 and 3]. Likewise, FDG-PET may contribute to the clinical diagnosis of disorders of consciousness (DOCs) [4 and 5]. However, current methods are non-quantitative and have important drawbacks deriving from visually guided assessment of relative changes in brain metabolism [4]. We here used FDG-PET to measure resting state brain glucose metabolism in 131 DOC patients to identify objective quantitative metabolic indicators and predictors of awareness. Quantitation of images was performed by normalizing to extracerebral tissue. We show that 42% of normal cortical activity represents the minimal energetic requirement for the presence of conscious awareness. Overall, the cerebral metabolic rate accounted for the current level, or imminent return, of awareness in 94% of the patient population, suggesting a global energetic threshold effect.

  10. Optimal design method to minimize users' thinking mapping load in human-machine interactions.

    Science.gov (United States)

    Huang, Yanqun; Li, Xu; Zhang, Jie

    2015-01-01

    The discrepancy between human cognition and machine requirements/behaviors often results in a heavy thinking mapping load, or even in disasters during product operation. It is therefore important to help people avoid confusion and difficulty in human-machine interaction, by improving the usability of a product and minimizing the user's thinking mapping and interpreting load. An optimal human-machine interface design method is introduced, based on minimizing the mental load of the thinking mapping process between users' intentions and the affordances of product interface states. By analyzing the users' thinking mapping problem, an operating action model is constructed. Based on human natural instincts and acquired knowledge, an ideal design with minimized thinking load is first uniquely determined. Then creative alternatives, in terms of the way humans obtain operational information, are generated as datasets of digital interface states. Finally, using cluster analysis, an optimal solution is selected from the alternatives by calculating the distances between the two datasets. Considering multiple factors to minimize users' thinking mapping loads, the solution nearest to the ideal value is found in a human-car interaction design case. The clustering results show the method's effectiveness in finding an optimal solution to the mental-load minimization problem in human-machine interaction design.

  11. A method of posterior fossa dural incision to minimize hemorrhage from the occipital sinus: the "mosquito" method.

    Science.gov (United States)

    Lee, Hee Chang; Lee, Ji Yeoun; Ryu, Seul Ki; Lim, Jang Mi; Chong, Sangjoon; Phi, Ji Hoon; Kim, Seung-Ki; Wang, Kyu-Chang

    2016-12-01

    The posterior fossa dural opening requires ligation of the occipital sinus to gain adequate exposure. However, a prominent occipital sinus may function as the main drainage route and carries a risk of unpredictable massive hemorrhage during the dural opening. We introduce a safe method of posterior fossa dural incision that minimizes hemorrhage from the occipital sinus using four curved hemostat clamps. For the dural incision at the midline of the posterior cranial fossa, we occluded the prominent occipital sinus with four curved hemostat clamps: one pair at the proximal part and the other pair at the distal part. The dural incision was made between the two pairs of clamps. Clamping the sinus allows observation of possible brain swelling after occlusion of the occipital sinus and minimizes hemorrhage during incision of the midline dura of the posterior fossa. This is an easy and safe technique for incision of the midline dura in selected cases with a prominent occipital sinus.

  12. Optimized Runge-Kutta methods with minimal dispersion and dissipation for problems arising from computational acoustics

    International Nuclear Information System (INIS)

    Tselios, Kostas; Simos, T.E.

    2007-01-01

    In this Letter a new explicit fourth-order seven-stage Runge-Kutta method is developed, combining minimal dispersion and dissipation error with maximal accuracy and stability limit along the imaginary axis. The method was produced from a general function constructed to satisfy all of the above requirements, from which all existing fourth-order six-stage RK methods can also be recovered. The new method is more efficient than the other optimized methods for acoustic computations.
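    For context only, here is the classical four-stage fourth-order Runge-Kutta step that the optimized six- and seven-stage schemes above generalize (this is the textbook baseline, not the Letter's method). On the test oscillator y'' = -y, dispersion error appears as phase drift and dissipation error as amplitude decay over long integrations.

    ```python
    import numpy as np

    def rk4_step(f, t, y, h):
        """One step of the classical four-stage fourth-order Runge-Kutta
        scheme (the baseline the optimized seven-stage schemes extend)."""
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    # Harmonic oscillator y'' = -y as a first-order system; phase and
    # amplitude drift over many periods reveal dispersion/dissipation.
    f = lambda t, y: np.array([y[1], -y[0]])
    y = np.array([1.0, 0.0])        # y(0) = 1, y'(0) = 0, exact y = cos(t)
    h, steps = 0.1, 628             # integrate to t = 62.8 (about 10 periods)
    for i in range(steps):
        y = rk4_step(f, i * h, y, h)
    print(y[0])                     # should stay close to cos(62.8)
    ```

    The acoustics-optimized schemes tune the extra stages specifically to shrink this phase and amplitude error for oscillatory problems.
    
    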

  13. Minimal requirements for quality controls in radiotherapy with external beams

    International Nuclear Information System (INIS)

    1999-01-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not determine strict rules or firm recommendations, it suggests minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy [it

  14. On the convergence of nonconvex minimization methods for image recovery.

    Science.gov (United States)

    Xiao, Jin; Ng, Michael Kwok-Po; Yang, Yu-Fei

    2015-05-01

    Nonconvex nonsmooth regularization methods have been shown to be effective for restoring images with neat edges. Fast alternating minimization schemes have also been proposed and developed to solve the nonconvex nonsmooth minimization problem. The main contribution of this paper is to show the convergence of these alternating minimization schemes, based on the Kurdyka-Łojasiewicz property. In particular, we show that the iterates generated by the alternating minimization scheme converge to a critical point of this nonconvex nonsmooth objective function. We also extend the analysis to a nonconvex nonsmooth regularization model with box constraints, and obtain similar convergence results for the related minimization algorithm. Numerical examples are given to illustrate our convergence analysis.

  15. Linearly convergent stochastic heavy ball method for minimizing generalization error

    KAUST Repository

    Loizou, Nicolas

    2017-10-30

    In this work we establish the first linear convergence result for the stochastic heavy ball method. The method performs SGD steps with a fixed stepsize, amended by a heavy ball momentum term. In the analysis, we focus on minimizing the expected loss and not on finite-sum minimization, which is typically a much harder problem. While in the analysis we constrain ourselves to quadratic loss, the overall objective is not necessarily strongly convex.
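    The method described above, SGD with a fixed stepsize plus a heavy ball momentum term, can be sketched on a quadratic loss as follows. This is an illustrative toy (stepsize, momentum, and problem sizes are arbitrary choices, not values from the paper).

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Quadratic loss f(x) = E_i [0.5 (a_i^T x - b_i)^2] over sampled data.
    n, d = 500, 10
    Adata = rng.normal(size=(n, d))
    x_star = rng.normal(size=d)
    bdata = Adata @ x_star          # consistent data, so f(x_star) = 0

    def shb(step=0.005, momentum=0.5, epochs=50):
        """Stochastic heavy ball: an SGD step with fixed stepsize plus a
        momentum term pulling along the previous displacement."""
        x = np.zeros(d)
        x_prev = x.copy()
        for _ in range(epochs):
            for i in rng.permutation(n):
                g = (Adata[i] @ x - bdata[i]) * Adata[i]  # stochastic gradient
                x_new = x - step * g + momentum * (x - x_prev)
                x_prev, x = x, x_new
        return x

    x_hat = shb()
    print(np.linalg.norm(x_hat - x_star))
    ```

    On this consistent quadratic problem the iterates approach the minimizer, which is the setting (quadratic loss, expected risk) the paper's linear convergence result addresses.
    
    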

  16. Reduction of very large reaction mechanisms using methods based on simulation error minimization

    Energy Technology Data Exchange (ETDEWEB)

    Nagy, Tibor; Turanyi, Tamas [Institute of Chemistry, Eoetvoes University (ELTE), P.O. Box 32, H-1518 Budapest (Hungary)

    2009-02-15

    A new species reduction method called the Simulation Error Minimization Connectivity Method (SEM-CM) was developed. According to the SEM-CM algorithm, a mechanism building procedure is started from the important species. Strongly connected sets of species, identified on the basis of the normalized Jacobian, are added and several consistent mechanisms are produced. The combustion model is simulated with each of these mechanisms and the mechanism causing the smallest error (i.e. deviation from the model that uses the full mechanism), considering the important species only, is selected. Then, in several steps other strongly connected sets of species are added, the size of the mechanism is gradually increased and the procedure is terminated when the error becomes smaller than the required threshold. A new method for the elimination of redundant reactions is also presented, which is called the Principal Component Analysis of Matrix F with Simulation Error Minimization (SEM-PCAF). According to this method, several reduced mechanisms are produced by using various PCAF thresholds. The reduced mechanism having the least CPU time requirement among the ones having almost the smallest error is selected. Application of SEM-CM and SEM-PCAF together provides a very efficient way to eliminate redundant species and reactions from large mechanisms. The suggested approach was tested on a mechanism containing 6874 irreversible reactions of 345 species that describes methane partial oxidation to high conversion. The aim is to accurately reproduce the concentration-time profiles of 12 major species with less than 5% error at the conditions of an industrial application. The reduced mechanism consists of 246 reactions of 47 species and its simulation is 116 times faster than using the full mechanism. The SEM-CM was found to be more effective than the classic Connectivity Method, and also than the DRG, two-stage DRG, DRGASA, basic DRGEP and extended DRGEP methods. (author)

  17. Canonical Primal-Dual Method for Solving Non-convex Minimization Problems

    OpenAIRE

    Wu, Changzhi; Li, Chaojie; Gao, David Yang

    2012-01-01

    A new primal-dual algorithm is presented for solving a class of non-convex minimization problems. This algorithm is based on canonical duality theory such that the original non-convex minimization problem is first reformulated as a convex-concave saddle point optimization problem, which is then solved by a quadratically perturbed primal-dual method. It is proved that the popular SDP method is indeed a special case of the canonical duality theory. Numerical examples are illustrated. Comparing...

  18. Minimizing convex functions by continuous descent methods

    Directory of Open Access Journals (Sweden)

    Sergiu Aizicovici

    2010-01-01

    We study continuous descent methods for minimizing convex functions, defined on general Banach spaces, which are associated with an appropriate complete metric space of vector fields. We show that there exists an everywhere dense open set in this space of vector fields such that each of its elements generates strongly convergent trajectories.

  19. Minimal processing - preservation methods of the future: an overview

    International Nuclear Information System (INIS)

    Ohlsson, T.

    1994-01-01

    Minimal-processing technologies are modern techniques that provide sufficient shelf life to foods to allow their distribution, while also meeting the demands of the consumers for convenience and fresh-like quality. Minimal-processing technologies can be applied at various stages of the food distribution chain, in storage, in processing and/or in packaging. Examples of methods will be reviewed, including modified-atmosphere packaging, high-pressure treatment, sous-vide cooking and active packaging

  20. Subspace Correction Methods for Total Variation and $\\ell_1$-Minimization

    KAUST Repository

    Fornasier, Massimo

    2009-01-01

    This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a sequence of orthogonal subspaces. On each subspace an iterative proximity-map algorithm is implemented via oblique thresholding, which is the main new tool introduced in this work. We provide convergence conditions for the algorithm in order to compute minimizers of the target energy. Analogous results are derived for a parallel variant of the algorithm. Applications are presented in domain decomposition methods for degenerate elliptic PDEs arising in total variation minimization and in accelerated sparse recovery algorithms based on ℓ1-minimization. We include numerical examples which show efficient solutions to classical problems in signal and image processing. © 2009 Society for Industrial and Applied Mathematics.

  1. Minimal residual method stronger than polynomial preconditioning

    Energy Technology Data Exchange (ETDEWEB)

    Faber, V.; Joubert, W.; Knill, E. [Los Alamos National Lab., NM (United States)] [and others]

    1994-12-31

    Two popular methods for solving symmetric and nonsymmetric systems of equations are the minimal residual method, implemented by algorithms such as GMRES, and polynomial preconditioning methods. In this study results are given on the convergence rates of these methods for various classes of matrices. It is shown that for some matrices, such as normal matrices, the convergence rates for GMRES and for the optimal polynomial preconditioning are the same, and for other matrices such as the upper triangular Toeplitz matrices, it is at least assured that if one method converges then the other must converge. On the other hand, it is shown that matrices exist for which restarted GMRES always converges but any polynomial preconditioning of corresponding degree makes no progress toward the solution for some initial error. The implications of these results for these and other iterative methods are discussed.
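    The simplest instance of the minimal residual idea discussed above is the one-step minimal residual iteration (often called Orthomin(1)): at each step, move along the residual direction with the steplength that minimizes the new residual norm. The sketch below is this textbook iteration on a symmetric positive definite system, not the GMRES implementation or the matrices analyzed in the study.

    ```python
    import numpy as np

    def minres_iteration(A, b, iters=200):
        """Minimal residual iteration (Orthomin(1)): each step picks the
        steplength minimizing ||b - A x|| along the residual direction."""
        x = np.zeros_like(b)
        for _ in range(iters):
            r = b - A @ x
            Ar = A @ r
            alpha = (r @ Ar) / (Ar @ Ar)   # minimizes the new residual norm
            x = x + alpha * r
        return x

    # SPD (hence normal) matrix with eigenvalues in [1, 10], so the
    # iteration converges; normal matrices are the case where GMRES and
    # optimal polynomial preconditioning match, per the abstract.
    rng = np.random.default_rng(2)
    n = 50
    Q = np.linalg.qr(rng.normal(size=(n, n)))[0]
    A = Q @ np.diag(np.linspace(1.0, 10.0, n)) @ Q.T
    b = rng.normal(size=n)
    x = minres_iteration(A, b)
    print(np.linalg.norm(A @ x - b) / np.linalg.norm(b))
    ```

    GMRES generalizes this by minimizing the residual over a whole Krylov subspace rather than a single direction.
    
    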

  2. Development of a minimal growth medium for Lactobacillus plantarum

    NARCIS (Netherlands)

    Wegkamp, H.B.A.; Teusink, B.; Vos, de W.M.; Smid, E.J.

    2010-01-01

    Aim: A medium with minimal requirements for the growth of Lactobacillus plantarum WCFS was developed. The composition of the minimal medium was compared to a genome-scale metabolic model of L. plantarum. Methods and Results: By repetitive single omission experiments, two minimal media were

  3. Minimal changes in health status questionnaires: distinction between minimally detectable change and minimally important change

    Directory of Open Access Journals (Sweden)

    Knol, Dirk L.

    2006-08-01

    Changes in scores on health status questionnaires are difficult to interpret. Several methods to determine minimally important changes (MICs) have been proposed which can broadly be divided in distribution-based and anchor-based methods. Comparisons of these methods have led to insight into essential differences between these approaches. Some authors have tried to come to a uniform measure for the MIC, such as 0.5 standard deviation and the value of one standard error of measurement (SEM). Others have emphasized the diversity of MIC values, depending on the type of anchor, the definition of minimal importance on the anchor, and characteristics of the disease under study. A closer look makes clear that some distribution-based methods have been merely focused on minimally detectable changes. For assessing minimally important changes, anchor-based methods are preferred, as they include a definition of what is minimally important. Acknowledging the distinction between minimally detectable and minimally important changes is useful, not only to avoid confusion among MIC methods, but also to gain information on two important benchmarks on the scale of a health status measurement instrument. Appreciating the distinction, it becomes possible to judge whether the minimally detectable change of a measurement instrument is sufficiently small to detect minimally important changes.
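    The distribution-based benchmarks mentioned above (SEM and the minimally detectable change built from it) follow standard formulas: SEM = SD × √(1 − reliability), and the 95%-confidence minimally detectable change for a change score is 1.96 × √2 × SEM. A small sketch with hypothetical questionnaire numbers:

    ```python
    import math

    def sem(sd_baseline, reliability):
        """Standard error of measurement: SD * sqrt(1 - reliability)."""
        return sd_baseline * math.sqrt(1.0 - reliability)

    def mdc95(sem_value):
        """Minimally detectable change at 95% confidence for a change
        score; sqrt(2) because a change involves two measurements."""
        return 1.96 * math.sqrt(2.0) * sem_value

    # Hypothetical questionnaire: baseline SD of 10 points and
    # test-retest reliability (ICC) of 0.91 (illustrative values only).
    s = sem(10.0, 0.91)
    print(round(s, 2), round(mdc95(s), 2))  # 3.0 8.32
    ```

    The abstract's point is that such a detectability threshold says nothing by itself about importance; an anchor-based MIC is needed to judge whether an 8-point change matters to patients.
    
    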

  4. German Risk Study - influences of data base, minimal requirements and system modifications

    International Nuclear Information System (INIS)

    Hoertner, H.; Linden, J. von

    1987-01-01

    The reliability analyses for Phase B of the German Risk Study take into account an improved reliability data base, best-estimate minimal requirements for the relevant system functions, and the design modifications which have been carried out since completion of Phase A. These points and their influence on the frequency of core melt accidents are discussed, with emphasis on the reliability data. Although the detailed evaluation of operating experience for the estimation of the reliability data results in an increase of some contributions, the best-estimate minimal requirements and the system modifications carried out for the reference plant reduce the core melt frequency due to those initiating events which were dominant in Phase A of the German Risk Study. The detailed investigation of additional initiating events which had already been recognized as important during Phase A, such as the main steam line break and the steam generator tube rupture, leads to additional contributions to the frequency of core melt accidents. Altogether, the evaluated contributions to the frequency of core melt are lower than the values assessed in Phase A. (orig./HP)

  5. New method for minimizing regular functions with constraints on parameter region

    International Nuclear Information System (INIS)

    Kurbatov, V.S.; Silin, I.N.

    1993-01-01

    A new method of function minimization has been developed and its main features are considered. It permits minimization of regular functions of arbitrary structure. For χ²-like functions, simplified second derivatives can be used, with control of their correctness. Constraints of arbitrary structure can be imposed, and means for fast movement along multidimensional valleys are provided. The method was tested on real data from the Kπ2 decay in an experiment on rare K⁻ decays. 6 refs

  6. Linearly convergent stochastic heavy ball method for minimizing generalization error

    KAUST Repository

    Loizou, Nicolas; Richtarik, Peter

    2017-01-01

    In this work we establish the first linear convergence result for the stochastic heavy ball method. The method performs SGD steps with a fixed stepsize, amended by a heavy ball momentum term. In the analysis, we focus on minimizing the expected loss

  7. An optimization based method for line planning to minimize travel time

    DEFF Research Database (Denmark)

    Bull, Simon Henry; Lusby, Richard Martin; Larsen, Jesper

    2015-01-01

    The line planning problem is to select a number of lines from a potential pool which provides sufficient passenger capacity and meets operational requirements, with some objective measure of solution line quality. We model the problem of minimizing the average passenger system time, including...

  8. Guidelines on the facilities required for minor surgical procedures and minimal access interventions.

    LENUS (Irish Health Repository)

    Humphreys, H

    2012-02-01

    There have been many changes in healthcare provision in recent years, including the delivery of some surgical services in primary care or in day surgery centres, which were previously provided by acute hospitals. Developments in the fields of interventional radiology and cardiology have further expanded the range and complexity of procedures undertaken in these settings. In the face of these changes there is a need to define from an infection prevention and control perspective the basic physical requirements for facilities in which such surgical procedures may be carried out. Under the auspices of the Healthcare Infection Society, we have developed the following recommendations for those designing new facilities or upgrading existing facilities. These draw upon best practice, available evidence, other guidelines where appropriate, and expert consensus to provide sensible and feasible advice. An attempt is also made to define minimal access interventions and minor surgical procedures. For minimal access interventions, including interventional radiology, new facilities should be mechanically ventilated to achieve 15 air changes per hour but natural ventilation is satisfactory for minor procedures. All procedures should involve a checklist and operators should be appropriately trained. There is also a need for prospective surveillance to accurately determine the post-procedure infection rate. Finally, there is a requirement for appropriate applied research to develop the evidence base required to support subsequent iterations of this guidance.

  9. A convergent overlapping domain decomposition method for total variation minimization

    KAUST Repository

    Fornasier, Massimo; Langer, Andreas; Schö nlieb, Carola-Bibiane

    2010-01-01

    In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation

  10. Thermodynamic optimization of ground heat exchangers with single U-tube by entropy generation minimization method

    International Nuclear Information System (INIS)

    Li Min; Lai, Alvin C.K.

    2013-01-01

    Highlights: ► A second-law-based analysis is performed for single U-tube ground heat exchangers. ► Two expressions for the optimal length and flow velocity are developed for GHEs. ► Empirical velocities of GHEs are large compared to thermodynamic optimum values. - Abstract: This paper investigates the thermodynamic performance of borehole ground heat exchangers with a single U-tube using the entropy generation minimization method, which requires information from heat transfer and fluid mechanics in addition to thermodynamic analysis. This study first derives an expression for the dimensionless entropy generation number, a function of five dimensionless variables: the Reynolds number, the dimensionless borehole length, a scale factor of pressures, and two duty parameters of ground heat exchangers. The derivation combines a heat transfer model and a hydraulics model for borehole ground heat exchangers with the first and second laws of thermodynamics. Next, the entropy generation number is minimized to produce two analytical expressions, one for the optimal length and one for the optimal flow velocity of ground heat exchangers. The paper then discusses and analyzes the implications and applications of these optimization formulas in two case studies. An important finding from the case studies is that widely used empirical velocities of the circulating fluid are too large for ground-coupled heat pump systems to operate in a thermodynamically optimal way. This paper demonstrates that thermodynamically optimal parameters of ground heat exchangers can be determined by the entropy generation minimization method.

  11. Determining the Minimal Required Radioactivity of 18F-FDG for Reliable Semiquantification in PET/CT Imaging: A Phantom Study.

    Science.gov (United States)

    Chen, Ming-Kai; Menard, David H; Cheng, David W

    2016-03-01

In pursuit of as-low-as-reasonably-achievable (ALARA) doses, this study investigated the minimal required radioactivity and corresponding imaging time for reliable semiquantification in PET/CT imaging. Using a phantom containing spheres of various diameters (3.4, 2.1, 1.5, 1.2, and 1.0 cm) filled with a fixed (18)F-FDG concentration of 165 kBq/mL and a background concentration of 23.3 kBq/mL, we performed PET/CT at multiple time points over 20 h of radioactive decay. The images were acquired for 10 min at a single bed position for each of 10 half-lives of decay using 3-dimensional list mode and were reconstructed into 1-, 2-, 3-, 4-, 5-, and 10-min acquisitions per bed position using an ordered-subsets expectation maximization algorithm with 24 subsets and 2 iterations and a 2-mm gaussian filter. SUVmax and SUVavg were measured for each sphere. The minimal required activity (±10%) for precise SUVmax semiquantification in the spheres was 1.8 kBq/mL for an acquisition of 10 min, 3.7 kBq/mL for 3-5 min, 7.9 kBq/mL for 2 min, and 17.4 kBq/mL for 1 min. The minimal required activity concentration-acquisition time product per bed position was 10-15 kBq/mL⋅min for reproducible SUV measurements within the spheres without overestimation. Using the total radioactivity and counting rate from the entire phantom, we found that the minimal required total activity-time product was 17 MBq⋅min and the minimal required counting rate-time product was 100 kcps⋅min. Our phantom study determined a threshold for minimal radioactivity and acquisition time for precise semiquantification in (18)F-FDG PET imaging that can serve as a guide in pursuit of achieving ALARA doses. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.

  12. An applied optimization based method for line planning to minimize travel time

    DEFF Research Database (Denmark)

    Bull, Simon Henry; Rezanova, Natalia Jurjevna; Lusby, Richard Martin

The line planning problem in rail is to select a number of lines from a potential pool which provides sufficient passenger capacity and meets operational requirements, with some objective measure of solution line quality. We model the problem of minimizing the average passenger system time, including...

  13. Minimizing the Free Energy: A Computer Method for Teaching Chemical Equilibrium Concepts.

    Science.gov (United States)

    Heald, Emerson F.

    1978-01-01

    Presents a computer method for teaching chemical equilibrium concepts using material balance conditions and the minimization of the free energy. Method for the calculation of chemical equilibrium, the computer program used to solve equilibrium problems and applications of the method are also included. (HM)
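The idea — impose the material balance and minimize the free energy — can be illustrated for a hypothetical isomerization A ⇌ B in an ideal mixture (a sketch of the principle, not the program from the article; the Δμ value is invented):

```python
import math

R = 8.314  # gas constant, J/(mol K)

def gibbs(x, mu_A, mu_B, T):
    # Molar Gibbs energy of an ideal A/B mixture, x = mole fraction of B.
    # The material balance is built in: n_A + n_B = 1 mol, so n_A = 1 - x.
    return (1 - x) * mu_A + x * mu_B + R * T * ((1 - x) * math.log(1 - x) + x * math.log(x))

def minimize_gibbs(mu_A, mu_B, T, tol=1e-12):
    # Golden-section search on (0, 1); G is convex in x for an ideal mixture.
    lo, hi = 1e-9, 1 - 1e-9
    invphi = (math.sqrt(5) - 1) / 2
    while hi - lo > tol:
        m1 = hi - invphi * (hi - lo)
        m2 = lo + invphi * (hi - lo)
        if gibbs(m1, mu_A, mu_B, T) < gibbs(m2, mu_A, mu_B, T):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

T = 298.15
dG = -2000.0  # hypothetical standard Gibbs energy change for A -> B, J/mol
x_eq = minimize_gibbs(0.0, dG, T)
K = math.exp(-dG / (R * T))
# At the minimum of G, the ratio x/(1-x) equals the equilibrium constant K,
# which is exactly the pedagogical point of the free-energy approach.
```

Minimizing G reproduces the familiar equilibrium condition without ever writing the law of mass action explicitly.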

  14. Projected Gauss-Seidel subspace minimization method for interactive rigid body dynamics

    DEFF Research Database (Denmark)

    Silcowitz-Hansen, Morten; Abel, Sarah Maria Niebe; Erleben, Kenny

    2010-01-01

artifacts such as viscous or damped contact response. In this paper, we present a new approach to contact force determination. We formulate the contact force problem as a nonlinear complementarity problem, and discretize the problem to derive the Projected Gauss–Seidel method. We combine the Projected Gauss–Seidel method with a subspace minimization method. Our new method shows improved qualities and superior convergence properties for specific configurations.
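The core sweep is easy to state for a generic linear complementarity problem (LCP). The sketch below is the plain projected Gauss–Seidel iteration on a made-up 2×2 system, without the paper's subspace minimization phase:

```python
import numpy as np

def projected_gauss_seidel(A, b, iters=200):
    # Solves the LCP: find z >= 0 with w = A z + b >= 0 and z . w = 0,
    # by Gauss-Seidel sweeps with projection onto the nonnegative cone.
    n = len(b)
    z = np.zeros(n)
    for _ in range(iters):
        for i in range(n):
            # residual excluding the diagonal contribution of z[i]
            r = b[i] + A[i] @ z - A[i, i] * z[i]
            z[i] = max(0.0, -r / A[i, i])
    return z

A = np.array([[4.0, 1.0], [1.0, 3.0]])   # symmetric positive definite
b = np.array([-1.0, 2.0])
z = projected_gauss_seidel(A, b)
w = A @ z + b
# complementarity holds: z >= 0, w >= 0, and z . w is (numerically) zero
```

In rigid body dynamics, z plays the role of contact impulses and w of relative contact velocities; the projection enforces non-penetration and non-adhesive contact.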

  15. Minimizers with discontinuous velocities for the electromagnetic variational method

    International Nuclear Information System (INIS)

    De Luca, Jayme

    2010-01-01

The electromagnetic two-body problem has neutral differential delay equations of motion that, for generic boundary data, can have solutions with discontinuous derivatives. If one wants to use these neutral differential delay equations with arbitrary boundary data, solutions with discontinuous derivatives must be expected and allowed. Surprisingly, Wheeler-Feynman electrodynamics has a boundary value variational method for which minimizer trajectories with discontinuous derivatives are also expected, as we show here. The variational method defines continuous trajectories with piecewise-defined velocities and accelerations, and electromagnetic fields defined by the Euler-Lagrange equations on trajectory points. Here we use the piecewise-defined minimizers with the Liénard-Wiechert formulas to define generalized electromagnetic fields almost everywhere (except on sets of points of zero measure where the advanced/retarded velocities and/or accelerations are discontinuous). Along with this generalization we formulate the generalized absorber hypothesis that the far fields vanish asymptotically almost everywhere, and show that localized orbits with far fields vanishing almost everywhere must have discontinuous velocities on sewing chains of breaking points. We give the general solution for localized orbits with vanishing far fields by solving a (linear) neutral differential delay equation for these far fields. We discuss the physics of orbits with discontinuous derivatives, stressing the differences from the variational methods of classical mechanics and the existence of a spinorial four-current associated with the generalized variational electrodynamics.

  16. A detailed survey of numerical methods for unconstrained minimization. Pt. 1

    International Nuclear Information System (INIS)

    Mika, K.; Chaves, T.

    1980-01-01

A detailed description of numerical methods for unconstrained minimization is presented. This first part surveys in particular conjugate direction and gradient methods, whereas variable metric methods will be the subject of the second part. Among the results of special interest we quote the following. The conjugate direction methods of Powell, Zangwill and Sutti are best interpreted if the Smith approach is adopted. The conditions for quadratic termination of Powell's first procedure are analyzed. Numerical results based on nonlinear least squares problems are presented for the following conjugate direction codes: VA04AD from the Harwell Subroutine Library and ZXPOW from IMSL, both implementations of Powell's second procedure, DFMND from IBM SL-MATH (Zangwill's method) and Brent's algorithm PRAXIS. VA04AD turns out to be superior in all cases; PRAXIS improves for high-dimensional problems. All codes clearly exhibit superlinear convergence. Akaike's result for the method of steepest descent is derived directly from a set of nonlinear recurrence relations. Numerical results obtained with the highly ill-conditioned Hilbert function confirm the theoretical predictions. Several properties of the conjugate gradient method are presented, and a new derivation of the equivalence of steepest-descent partan and the CG method is given. A comparison of numerical results from the CG codes VA08AD (Fletcher-Reeves), DFMCG (the SSP version of the Fletcher-Reeves algorithm) and VA14AD (Powell's implementation of the Polak-Ribiere formula) reveals that VA14AD is clearly superior in all cases, but that the convergence rate of these codes is only weakly superlinear, such that high-accuracy solutions require extremely large numbers of function calls. (orig.)
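The survey's observation about the ill-conditioned Hilbert function is easy to reproduce: on an ill-conditioned quadratic, conjugate gradients needs far fewer iterations than steepest descent. The sketch below is a generic reimplementation of both methods, not the VA08AD/VA14AD codes:

```python
import numpy as np

def cg(A, b, tol=1e-8, maxit=1000):
    # conjugate gradient method for the SPD quadratic 0.5 x'Ax - b'x
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    for k in range(maxit):
        if np.linalg.norm(r) < tol:
            return x, k
        Ap = A @ p
        alpha = (r @ r) / (p @ Ap)
        x = x + alpha * p
        r_new = r - alpha * Ap
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves update
        p = r_new + beta * p
        r = r_new
    return x, maxit

def steepest_descent(A, b, tol=1e-8, maxit=20000):
    # exact line search along the negative gradient at every step
    x = np.zeros_like(b)
    for k in range(maxit):
        r = b - A @ x
        if np.linalg.norm(r) < tol:
            return x, k
        x = x + ((r @ r) / (r @ (A @ r))) * r
    return x, maxit

n = 5
A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])  # Hilbert matrix
b = A @ np.ones(n)            # chosen so the exact minimizer is the all-ones vector
x_cg, it_cg = cg(A, b)
x_sd, it_sd = steepest_descent(A, b)
# CG reaches a small residual in a handful of iterations; steepest descent
# crawls at a rate governed by the huge condition number of the Hilbert matrix.
```

This is the behavior the survey quotes: gradient-type methods stagnate on ill-conditioned problems while conjugate-direction methods retain their fast convergence.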

  17. A convergent overlapping domain decomposition method for total variation minimization

    KAUST Repository

    Fornasier, Massimo

    2010-06-22

    In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation constraint. To our knowledge, this is the first successful attempt of addressing such a strategy for the nonlinear, nonadditive, and nonsmooth problem of total variation minimization. We provide several numerical experiments, showing the successful application of the algorithm for the restoration of 1D signals and 2D images in interpolation/inpainting problems, respectively, and in a compressed sensing problem, for recovering piecewise constant medical-type images from partial Fourier ensembles. © 2010 Springer-Verlag.
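What the total variation functional accomplishes can be shown on a toy 1D signal. The sketch below is a single-domain gradient descent on a smoothed TV objective — a stand-in for the idea only, not the paper's overlapping domain decomposition scheme:

```python
import math

def tv_denoise(data, lam=1.0, eps=1e-2, step=0.02, iters=3000):
    # Gradient descent on the smoothed objective
    #   0.5 * sum((u_i - data_i)^2) + lam * sum(sqrt((u_{i+1} - u_i)^2 + eps)),
    # where eps > 0 smooths the nondifferentiable absolute value.
    u = list(data)
    n = len(u)
    for _ in range(iters):
        g = [u[i] - data[i] for i in range(n)]      # gradient of the discrepancy term
        for i in range(n - 1):
            d = u[i + 1] - u[i]
            w = lam * d / math.sqrt(d * d + eps)    # derivative of the smoothed TV term
            g[i] -= w
            g[i + 1] += w
        u = [u[i] - step * g[i] for i in range(n)]
    return u

clean = [0.0] * 20 + [5.0] * 20                     # piecewise-constant signal
noisy = [x + (0.4 if i % 2 else -0.4) for i, x in enumerate(clean)]  # deterministic "noise"
u = tv_denoise(noisy)
# the jump survives while the oscillating perturbation is flattened
```

The characteristic TV behavior appears even in this crude solver: oscillations are removed while the discontinuity is preserved, which is exactly why the functional suits inpainting and piecewise-constant medical-type images.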

  18. Detection of Cavities by Inverse Heat Conduction Boundary Element Method Using Minimal Energy Technique

    International Nuclear Information System (INIS)

    Choi, C. Y.

    1997-01-01

A geometrical inverse heat conduction problem is solved for infrared-scanning cavity detection by the boundary element method using a minimal energy technique. By minimizing the kinetic energy of the temperature field, the boundary element equations are converted to a quadratic programming problem. A hypothetical inner boundary is defined such that the actual cavity is located interior to the domain. Temperatures at the hypothetical inner boundary are determined to meet the constraints of the measurement error of the surface temperature obtained by infrared scanning, and then boundary element analysis is performed for the position of the unknown boundary (cavity). A cavity detection algorithm is provided, and the effects of the minimal energy technique on the inverse solution method are investigated by means of numerical analysis.

  19. Algorithm for finding minimal cut sets in a fault tree

    International Nuclear Information System (INIS)

    Rosenberg, Ladislav

    1996-01-01

This paper presents several algorithms that have been used in a computer code for fault-tree analysis by the minimal cut sets method. The main algorithm is a more efficient version of the new CARA algorithm, which finds minimal cut sets with an auxiliary dynamical structure. The presented algorithm enables one to search for minimal cut sets under defined requirements - according to the order of the minimal cut sets, to the number of minimal cut sets, or both. This algorithm is three to six times faster than the primary version of the CARA algorithm.
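The underlying task can be sketched by brute force for a small tree: expand the gate structure into cut sets, then discard any set that contains another as a proper subset (a naive illustration of the concept, not the CARA algorithm):

```python
from itertools import product

# A fault tree as nested tuples: ('AND'|'OR', child, child, ...);
# leaves are basic-event names (strings).
def cut_sets(node):
    if isinstance(node, str):
        return [frozenset([node])]
    op, *children = node
    child_sets = [cut_sets(c) for c in children]
    if op == 'OR':
        # any child's cut set triggers the gate
        return [cs for sets in child_sets for cs in sets]
    # AND: union of one cut set chosen from each child
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal_cut_sets(node):
    sets = cut_sets(node)
    # keep only sets with no proper subset also present
    return sorted({s for s in sets if not any(t < s for t in sets)}, key=sorted)

top = ('OR', 'A', ('AND', 'B', ('OR', 'A', 'C')))
mcs = minimal_cut_sets(top)
# -> [{'A'}, {'B', 'C'}]; the non-minimal set {'A', 'B'} is discarded
```

This exponential expansion is exactly what dedicated algorithms such as CARA avoid with smarter data structures, and why restricting the search by cut-set order or count matters in practice.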

  20. Sequential unconstrained minimization algorithms for constrained optimization

    International Nuclear Information System (INIS)

    Byrne, Charles

    2008-01-01

The problem of minimizing a function f(x): R^J → R, subject to constraints on the vector variable x, occurs frequently in inverse problems. Even without constraints, finding a minimizer of f(x) may require iterative methods. We consider here a general class of iterative algorithms that find a solution to the constrained minimization problem as the limit of a sequence of vectors, each solving an unconstrained minimization problem. Our sequential unconstrained minimization algorithm (SUMMA) is an iterative procedure for constrained minimization. At the kth step we minimize the function G_k(x) = f(x) + g_k(x) to obtain x^k. The auxiliary functions g_k(x): D ⊂ R^J → R_+ are nonnegative on the set D, each x^k is assumed to lie within D, and the objective is to minimize the continuous function f: R^J → R over x in the set C, the closure of D. We assume that such minimizers exist, and denote one such by x̂. We assume that the functions g_k(x) satisfy the inequalities 0 ≤ g_k(x) ≤ G_{k-1}(x) - G_{k-1}(x^{k-1}), for k = 2, 3, .... Using this assumption, we show that the sequence {f(x^k)} is decreasing and converges to f(x̂). If the restriction of f(x) to D has bounded level sets, which happens if x̂ is unique and f(x) is closed, proper and convex, then the sequence {x^k} is bounded, and f(x*) = f(x̂) for any cluster point x*. Therefore, if x̂ is unique, x* = x̂ and {x^k} → x̂. When x̂ is not unique, convergence can still be obtained in particular cases. The SUMMA includes, as particular cases, the well-known barrier- and penalty-function methods, the simultaneous multiplicative algebraic reconstruction technique (SMART), the proximal minimization algorithm of Censor and Zenios, the entropic proximal methods of Teboulle, as well as certain cases of gradient descent and the Newton-Raphson method. The proof techniques used for SUMMA can be extended to obtain related results.
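The barrier-function method, one of the particular cases mentioned in the abstract, is easy to sketch in one dimension: minimize G_k(x) = f(x) + g_k(x) with a logarithmic barrier g_k(x) = -mu_k * ln(x) whose weight shrinks toward zero, driving the iterates to the constrained minimizer. The objective below is a hypothetical example, not taken from the paper:

```python
import math

def f(t):
    # unconstrained minimum at t = -1, which lies outside the feasible set t > 0,
    # so the constrained minimizer is x-hat = 0
    return (t + 1.0) ** 2

def minimize_1d(g, lo, hi, tol=1e-12):
    # golden-section search for a unimodal function on (lo, hi)
    invphi = (math.sqrt(5) - 1) / 2
    while hi - lo > tol:
        m1 = hi - invphi * (hi - lo)
        m2 = lo + invphi * (hi - lo)
        if g(m1) < g(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2

for k in range(1, 30):
    mu = 1.0 / 2 ** k
    # the barrier keeps every iterate inside D = (0, inf)
    Gk = lambda t, mu=mu: f(t) - mu * math.log(t)
    x = minimize_1d(Gk, 1e-12, 10.0)
# x approaches the constrained minimizer x-hat = 0 as mu shrinks,
# and f(x) approaches the constrained minimum f(x-hat) = 1
```

Each outer step solves an unconstrained problem, exactly the SUMMA pattern; the shrinking barrier weight plays the role of the auxiliary functions g_k.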

  1. Minimizing Mutual Coupling

    DEFF Research Database (Denmark)

    2010-01-01

Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.

  2. Legal incentives for minimizing waste

    International Nuclear Information System (INIS)

    Clearwater, S.W.; Scanlon, J.M.

    1991-01-01

Waste minimization, or pollution prevention, has become an integral component of federal and state environmental regulation. Minimizing waste offers many economic and public relations benefits. In addition, waste minimization efforts can also dramatically reduce potential criminal liability. This paper addresses the legal incentives for minimizing waste under current and proposed environmental laws and regulations.

  3. Assessment of LANL waste minimization plan

    International Nuclear Information System (INIS)

    Davis, K.D.; McNair, D.A.; Jennrich, E.A.; Lund, D.M.

    1991-04-01

The objective of this report is to evaluate the Los Alamos National Laboratory (LANL) Waste Minimization Plan to determine whether it meets applicable internal (DOE) and regulatory requirements. The intent of the effort is to assess the higher-level elements of the documentation to determine whether they have been addressed, rather than the detailed mechanics of the program's implementation. The requirement for a Waste Minimization Plan is based on several DOE Orders as well as environmental laws and regulations. Table 2-1 provides a list of the major documents or regulations that require waste minimization efforts. The table also summarizes the applicable requirements.

  4. OCOPTR, Minimization of Nonlinear Function, Variable Metric Method, Derivative Calculation. DRVOCR, Minimization of Nonlinear Function, Variable Metric Method, Derivative Calculation

    International Nuclear Information System (INIS)

    Nazareth, J. L.

    1979-01-01

1 - Description of problem or function: OCOPTR and DRVOCR are computer programs designed to find minima of non-linear differentiable functions f: R^n → R with n-dimensional domains. OCOPTR requires that the user only provide function values (i.e. it is a derivative-free routine). DRVOCR requires the user to supply both function and gradient information. 2 - Method of solution: OCOPTR and DRVOCR use the variable metric (or quasi-Newton) method of Davidon (1975). For OCOPTR, the derivatives are estimated by finite differences along a suitable set of linearly independent directions. For DRVOCR, the derivatives are user-supplied. Some features of the codes are the storage of the approximation to the inverse Hessian matrix in lower trapezoidal factored form and the use of an optimally-conditioned updating method. Linear equality constraints are permitted, subject to the initial Hessian factor being chosen correctly. 3 - Restrictions on the complexity of the problem: The functions to which the routine is applied are assumed to be differentiable. The routine also requires (n^2/2) + O(n) storage locations, where n is the problem dimension.
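The general scheme — a quasi-Newton update of an inverse Hessian approximation, with gradients either user-supplied or estimated by finite differences — can be sketched generically. The code below uses a BFGS update and central differences; it is a simplified relative of Davidon's variable metric method, not the OCOPTR/DRVOCR implementation (no factored storage, no constraint handling):

```python
import numpy as np

def fd_grad(f, x, h=1e-5):
    # central-difference gradient, standing in for user-supplied derivatives
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def quasi_newton(f, x0, iters=500, gtol=1e-7):
    # variable metric iteration: H approximates the inverse Hessian and is
    # refreshed with the BFGS update; a simple Armijo backtracking line search
    x = np.asarray(x0, dtype=float)
    n = len(x)
    H = np.eye(n)
    g = fd_grad(f, x)
    for _ in range(iters):
        if np.linalg.norm(g) < gtol:
            break
        p = -H @ g                       # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p) and t > 1e-12:
            t *= 0.5
        s = t * p
        x_new = x + s
        g_new = fd_grad(f, x_new)
        y = g_new - g
        if s @ y > 1e-12:                # curvature condition keeps H positive definite
            rho = 1.0 / (s @ y)
            I = np.eye(n)
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) + rho * np.outer(s, s)
        x, g = x_new, g_new
    return x

rosen = lambda v: (1 - v[0]) ** 2 + 100.0 * (v[1] - v[0] ** 2) ** 2
x_min = quasi_newton(rosen, [-1.2, 1.0])
# converges to the Rosenbrock minimizer near (1, 1)
```

The finite-difference variant mirrors OCOPTR's derivative-free mode; supplying analytic gradients, as DRVOCR requires, simply replaces fd_grad.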

  5. Waste Minimization and Pollution Prevention Awareness Plan

    International Nuclear Information System (INIS)

    1992-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) and other legal requirements that are discussed in Section C, below. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as suggested by DOE Order 5400.1. The intent of this plan is to respond to and comply with the Department's policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Directorate-, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Directorates, Programs and Departments. Several Directorates have been reorganized, necessitating changes in the Directorate plans that were published in 1991

  6. New trends in minimally invasive urological surgery

    Directory of Open Access Journals (Sweden)

    Prabhakar Rajan

    2009-10-01

Purpose: The perceived benefits of minimally-invasive surgery include less postoperative pain, shorter hospitalization, reduced morbidity and better cosmesis while maintaining diagnostic accuracy and therapeutic outcome. We review the new trends in minimally-invasive urological surgery. Materials and methods: We reviewed the English-language literature using the National Library of Medicine database to identify the latest technological advances in minimally-invasive surgery, with particular reference to urology. Results: Amongst other advances, studies incorporating needlescopic surgery, laparoendoscopic single-site surgery, magnetic anchoring and guidance systems, natural orifice transluminal endoscopic surgery and flexible robots were considered of interest. The results from initial animal and human studies are also outlined. Conclusion: Minimally-invasive surgery continues to evolve to meet the demands of operators and patients. Many novel technologies are still in the testing phase, whilst others have entered clinical practice. Further evaluation is required to confirm the safety and efficacy of these techniques and validate the published reports.

  7. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite.

    Science.gov (United States)

    Bachelli, Mara Lígia Biazotto; Amaral, Rívia Darla Álvares; Benedetti, Benedito Carlos

    2013-01-01

Lettuce is a leafy vegetable widely used in industry for minimally processed products, in which the sanitization step is the crucial moment for ensuring a food that is safe for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Thus, the search for alternative methods to sodium hypochlorite has been emerging as a matter of great interest. The suitability of chlorine dioxide (60 mg L(-1)/10 min), peracetic acid (100 mg L(-1)/15 min) and ozonated water (1.2 mg L(-1)/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L(-1) free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 °C, with the product packaged in LDPE bags of 60 μm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The treatments with chlorine dioxide, peracetic acid and ozonated water promoted reductions of 2.5, 1.1 and 0.7 log cycles, respectively, in the microbial load of the minimally processed product and can be used as substitutes for sodium hypochlorite. These alternative compounds gave minimally processed lettuce a shelf-life of six days, while the shelf-life with sodium hypochlorite was 12 days.

  8. Systematic process synthesis and design methods for cost effective waste minimization

    International Nuclear Information System (INIS)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W.

    1995-01-01

    We present progress on our work to develop synthesis methods to aid in the design of cost effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach where bounding information allows it to fit within a mixed integer programming approach. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents

  9. Systematic process synthesis and design methods for cost effective waste minimization

    Energy Technology Data Exchange (ETDEWEB)

    Biegler, L.T.; Grossman, I.E.; Westerberg, A.W. [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-12-31

    We present progress on our work to develop synthesis methods to aid in the design of cost effective approaches to waste minimization. Work continues to combine the approaches of Douglas and coworkers and of Grossmann and coworkers on a hierarchical approach where bounding information allows it to fit within a mixed integer programming approach. We continue work on the synthesis of reactors and of flexible separation processes. In the first instance, we strive for methods we can use to reduce the production of potential pollutants, while in the second we look for ways to recover and recycle solvents.

  10. Cell-free protein synthesis in micro compartments: building a minimal cell from biobricks.

    Science.gov (United States)

    Jia, Haiyang; Heymann, Michael; Bernhard, Frank; Schwille, Petra; Kai, Lei

    2017-10-25

The construction of a minimal cell that exhibits the essential characteristics of life is a great challenge in the field of synthetic biology. Assembling a minimal cell requires multidisciplinary expertise from physics, chemistry and biology. Scientists from different backgrounds tend to define the essence of 'life' differently and have thus proposed different artificial cell models possessing one or several essential features of living cells. Using the tools and methods of molecular biology, the bottom-up engineering of a minimal cell appears within reach. However, several challenges still remain. In particular, the integration of individual sub-systems that is required to achieve a self-reproducing cell model presents a complex optimization challenge. For example, multiple self-organisation and self-assembly processes have to be carefully tuned. We review advances and developments in new methods and techniques, for cell-free protein synthesis as well as micro-fabrication, with regard to their potential to resolve challenges and to accelerate the development of minimal cells. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Alternative sanitization methods for minimally processed lettuce in comparison to sodium hypochlorite

    Directory of Open Access Journals (Sweden)

    Mara Lígia Biazotto Bachelli

    2013-09-01

Lettuce is a leafy vegetable widely used in industry for minimally processed products, in which the sanitization step is the crucial moment for ensuring a safe food for consumption. Chlorinated compounds, mainly sodium hypochlorite, are the most used in Brazil, but the formation of trihalomethanes from this sanitizer is a drawback. Thus, the search for alternative methods to sodium hypochlorite has been emerging as a matter of great interest. The suitability of chlorine dioxide (60 mg L-1/10 min), peracetic acid (100 mg L-1/15 min) and ozonated water (1.2 mg L-1/1 min) as alternative sanitizers to sodium hypochlorite (150 mg L-1 free chlorine/15 min) was evaluated. Minimally processed lettuce washed with tap water for 1 min was used as a control. Microbiological analyses were performed in triplicate, before and after sanitization, and at 3, 6, 9 and 12 days of storage at 2 ± 1 ºC, with the product packaged in LDPE bags of 60 µm. Total coliforms, Escherichia coli, Salmonella spp., psychrotrophic and mesophilic bacteria, and yeasts and molds were evaluated. All samples of minimally processed lettuce showed absence of E. coli and Salmonella spp. The treatments with chlorine dioxide, peracetic acid and ozonated water promoted reductions of 2.5, 1.1 and 0.7 log cycles, respectively, in the microbial load of the minimally processed product and can be used as substitutes for sodium hypochlorite. These alternative compounds promoted a shelf-life of six days for minimally processed lettuce, while the shelf-life with sodium hypochlorite was 12 days.

  12. Minimal quantization of two-dimensional models with chiral anomalies

    International Nuclear Information System (INIS)

    Ilieva, N.

    1987-01-01

Two-dimensional gauge models with chiral anomalies - ''left-handed'' QED and the chiral Schwinger model - are quantized consistently within the framework of the minimal quantization method. The choice of cone time as the physical time for quantization is motivated. The well-known mass spectrum is found, but with a fixed value of the regularization parameter a=2. Such a unique solution is obtained due to the strong consistency requirement of minimal quantization, which is reflected in the physically motivated choice of the time axis.

  13. Option Pricing under Risk-Minimization Criterion in an Incomplete Market with the Finite Difference Method

    Directory of Open Access Journals (Sweden)

    Xinfeng Ruan

    2013-01-01

We study option pricing under a risk-minimization criterion in an incomplete market where the dynamics of the risky underlying asset are governed by a jump diffusion equation with stochastic volatility. We obtain the Radon-Nikodym derivative for the minimal martingale measure and a partial integro-differential equation (PIDE) for the European option. The finite difference method is employed to solve the PIDE for the European option value.
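The finite-difference machinery can be shown on a much simpler contract: an explicit scheme for the plain Black-Scholes PDE (diffusion only — no jumps, no stochastic volatility, no risk-minimization measure), checked against the closed form. All parameter values are illustrative:

```python
import math
import numpy as np

def bs_call(S0, K, T, r, sigma):
    # closed-form Black-Scholes call, used only to check the grid answer
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def fd_call(S0, K, T, r, sigma, S_max=300.0, M=300, N_t=8000):
    # explicit finite differences for the Black-Scholes PDE, stepping
    # backwards in time from the payoff; the small dt keeps the scheme stable
    dS = S_max / M
    dt = T / N_t
    S = np.linspace(0.0, S_max, M + 1)
    V = np.maximum(S - K, 0.0)                     # terminal payoff
    for _ in range(N_t):
        delta = (V[2:] - V[:-2]) / (2 * dS)        # first derivative in S
        gamma = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2
        V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * gamma
                         + r * S[1:-1] * delta - r * V[1:-1])
        V[0] = 0.0
        V[-1] = S_max - K * math.exp(-r * T)       # crude far-field boundary
    return float(np.interp(S0, S, V))

price_fd = fd_call(100.0, 100.0, 1.0, 0.05, 0.2)
price_bs = bs_call(100.0, 100.0, 1.0, 0.05, 0.2)
```

The paper's PIDE adds a nonlocal integral term for the jumps to this differential part; the grid-and-step structure of the solver is the same.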

  14. Taxonomic minimalism.

    Science.gov (United States)

    Beattle, A J; Oliver, I

    1994-12-01

Biological surveys are in increasing demand while taxonomic resources continue to decline. How much formal taxonomy is required to get the job done? The answer depends on the kind of job, but it is possible that taxonomic minimalism, especially (1) the use of higher taxonomic ranks, (2) the use of morphospecies rather than species (as identified by Latin binomials), and (3) the involvement of taxonomic specialists only for training and verification, may offer advantages for biodiversity assessment, environmental monitoring and ecological research. As such, formal taxonomy remains central to the process of biological inventory and survey, but resources may be allocated more efficiently. For example, if formal identification is not required, resources may be concentrated on replication and increasing sample sizes. Taxonomic minimalism may also facilitate the inclusion of important but neglected groups in these activities, especially among the invertebrates, and perhaps even microorganisms. Copyright © 1994. Published by Elsevier Ltd.

  15. Minimization of required model runs in the Random Mixing approach to inverse groundwater flow and transport modeling

    Science.gov (United States)

    Hoerning, Sebastian; Bardossy, Andras; du Plessis, Jaco

    2017-04-01

Most geostatistical inverse groundwater flow and transport modelling approaches utilize a numerical solver to minimize the discrepancy between observed and simulated hydraulic heads and/or concentration values. The optimization procedure often requires many model runs, which for complex models lead to long run times. Random Mixing is a promising new geostatistical technique for inverse modelling. The method is an extension of the gradual deformation approach. It works by finding a field which preserves the covariance structure and maintains observed hydraulic conductivities. This field is perturbed by mixing it with new fields that fulfill the homogeneous conditions. This mixing is expressed as an optimization problem which aims to minimize the difference between the observed and simulated hydraulic heads and/or concentration values. To preserve the spatial structure, the mixing weights must lie on the unit hyper-sphere. We present a modification to the Random Mixing algorithm which significantly reduces the number of model runs required. The approach involves taking n equally spaced points on the unit circle as weights for mixing conditional random fields. Each of these mixtures provides a solution to the forward model at the conditioning locations. For each of the locations, the solutions are then interpolated around the circle to provide solutions for additional mixing weights at very low computational cost. The interpolated solutions are used to search for a mixture which maximally reduces the objective function. This is in contrast to other approaches, which evaluate the objective function for the n mixtures and then interpolate the obtained values. Keeping the mixture on the unit circle makes it easy to generate equidistant sampling points in the space; however, this means that only two fields are mixed at a time. Once the optimal mixture for two fields has been found, they are combined to form the input to the next iteration of the algorithm.
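The structure-preserving mixture at the heart of the method is simple to demonstrate: weights (cos θ, sin θ) on the unit circle combine two independent standard-normal fields into a field with the same unit variance, for any θ. This toy unconditional version omits the paper's conditioning and optimization entirely:

```python
import numpy as np

rng = np.random.default_rng(42)

def mix(X, Y, theta):
    # weights (cos t, sin t) lie on the unit circle, so cos^2 + sin^2 = 1
    # and the mixture of two independent unit-variance fields keeps unit variance
    return np.cos(theta) * X + np.sin(theta) * Y

n = 200_000
X = rng.standard_normal(n)
Y = rng.standard_normal(n)
variances = [mix(X, Y, t).var()
             for t in np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)]
# every mixture is again (approximately) standard normal
```

Because the forward response varies smoothly with θ, sampling a few such weights and interpolating the responses around the circle — as the abstract describes — is far cheaper than running the forward model for every candidate mixture.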

  16. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Science.gov (United States)

    Almutairy, Meznah; Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.
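The two sampling schemes are easy to contrast on a toy sequence (a schematic sketch of the definitions, not the paper's index implementation):

```python
def fixed_sampling(seq, k, w):
    # keep the k-mer at every w-th position
    return {seq[i:i + k] for i in range(0, len(seq) - k + 1, w)}

def minimizer_sampling(seq, k, w):
    # in every window of w consecutive k-mers, keep the lexicographically smallest
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    picked = set()
    for i in range(len(kmers) - w + 1):
        picked.add(min(kmers[i:i + w]))
    return picked

seq = "ACGTACGTGGTACCGTAACGTT"
fixed = fixed_sampling(seq, k=5, w=4)
mins = minimizer_sampling(seq, k=5, w=4)
# fixed sampling keeps one k-mer per stride; minimizer sampling typically
# keeps more, but guarantees that overlapping windows share a sampled k-mer
```

The window-overlap guarantee is what lets minimizer-based indexes sample the query as well; the paper's point is that, measured end to end, this advantage is often outweighed by fixed sampling's smaller index.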

  17. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches.

    Directory of Open Access Journals (Sweden)

    Meznah Almutairy

Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require lots of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling will produce a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling will produce faster query times since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets, the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling at a cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling requires typically half as much space whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method.

  18. Comparing fixed sampling with minimizer sampling when using k-mer indexes to find maximal exact matches

    Science.gov (United States)

    Torng, Eric

    2018-01-01

    Bioinformatics applications and pipelines increasingly use k-mer indexes to search for similar sequences. The major problem with k-mer indexes is that they require large amounts of memory. Sampling is often used to reduce index size and query time. Most applications use one of two major types of sampling: fixed sampling and minimizer sampling. It is well known that fixed sampling produces a smaller index, typically by roughly a factor of two, whereas it is generally assumed that minimizer sampling produces faster query times, since query k-mers can also be sampled. However, no direct comparison of fixed and minimizer sampling has been performed to verify these assumptions. We systematically compare fixed and minimizer sampling using the human genome as our database. We use the resulting k-mer indexes for fixed sampling and minimizer sampling to find all maximal exact matches between our database, the human genome, and three separate query sets: the mouse genome, the chimp genome, and an NGS data set. We reach the following conclusions. First, using larger k-mers reduces query time for both fixed sampling and minimizer sampling, at the cost of requiring more space. If we use the same k-mer size for both methods, fixed sampling typically requires half as much space, whereas minimizer sampling processes queries only slightly faster. If we are allowed to use any k-mer size for each method, then we can choose a k-mer size such that fixed sampling both uses less space and processes queries faster than minimizer sampling. The reason is that although minimizer sampling is able to sample query k-mers, the number of shared k-mer occurrences that must be processed is much larger for minimizer sampling than for fixed sampling. In conclusion, we argue that for any application where each shared k-mer occurrence must be processed, fixed sampling is the right sampling method. PMID:29389989

  19. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two minimally marked first-year university classes than in two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983), which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional “teacher-corrects-all” approach. Results suggest the technique would be effective not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  20. Performance Analysis of Video Transmission Using Sequential Distortion Minimization Method for Digital Video Broadcasting Terrestrial

    Directory of Open Access Journals (Sweden)

    Novita Astin

    2016-12-01

    Full Text Available This paper presents the transmission of a Digital Video Broadcasting system with streaming video at 640x480 resolution over different IQ rates and modulations. Distortion often occurs during video transmission, so the received video has poor quality. Key-frame selection algorithms are flexible with respect to changes in the video, but they omit the temporal information of the video sequence. To minimize the distortion between the original and received video, we added a methodology based on a sequential distortion minimization algorithm. Its aim was to reconstruct, sequentially, a received video without significant loss of content relative to the original. The reliability of the video transmission was observed using a constellation diagram, with the best result at an IQ rate of 2 MHz and 8-QAM modulation. Video transmission was also compared with and without SEDIM (the Sequential Distortion Minimization method). The experimental results showed that the average PSNR (Peak Signal-to-Noise Ratio) of video transmission using SEDIM increased from 19.855 dB to 48.386 dB, and the average SSIM (Structural Similarity) increased by 10.49%. The proposed method thus achieved good performance. A USRP board was used as the RF front-end at 2.2 GHz.
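
The PSNR figures quoted above follow the standard definition, sketched below for 8-bit frames (the sample pixel values are illustrative, not data from the paper):

```python
import math

def psnr(original, received, max_val=255):
    """Peak Signal-to-Noise Ratio between two equally sized 8-bit
    frames, given as flat lists of pixel values: 10*log10(MAX^2/MSE)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, received)) / len(original)
    if mse == 0:
        return float("inf")  # identical frames
    return 10 * math.log10(max_val ** 2 / mse)

orig = [52, 55, 61, 66, 70, 61, 64, 73]
recv = [54, 55, 60, 66, 71, 60, 64, 72]
print(round(psnr(orig, recv), 2))
```

Higher PSNR means the received frame is closer to the original, which is why the SEDIM comparison above is reported in dB.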

  1. A flood-based information flow analysis and network minimization method for gene regulatory networks.

    Science.gov (United States)

    Pavlogiannis, Andreas; Mozhayskiy, Vadim; Tagkopoulos, Ilias

    2013-04-24

    Biological networks tend to have high interconnectivity, complex topologies and multiple types of interactions. This makes it difficult to identify the sub-networks that are involved in condition-specific responses. In addition, we generally lack scalable methods that can reveal the information flow in gene regulatory and biochemical pathways; such methods would help us identify key participants and paths under specific environmental and cellular contexts. This paper introduces the theory of network flooding, which aims to address the problem of network minimization and regulatory information flow in gene regulatory networks. Given a regulatory biological network, a set of source (input) nodes and optionally a set of sink (output) nodes, our task is to find (a) the minimal sub-network that encodes the regulatory program involving all input and output nodes and (b) the information flow from the source to the sink nodes of the network. Here, we describe a novel, scalable network traversal algorithm and assess its potential to achieve significant network size reduction in both synthetic and E. coli networks. Scalability and sensitivity analysis show that the proposed method scales well with the size of the network and is robust to noise and missing data. The method of network flooding proves to be a useful, practical approach towards information flow analysis in gene regulatory networks. Further extension of the proposed theory has the potential to lead to a unifying framework for simultaneous network minimization and information flow analysis across various "omics" levels.
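
A minimal sketch of the reachability idea behind such source-to-sink network minimization (not the authors' exact flooding algorithm; the toy network and node names are hypothetical): a node belongs to the minimal sub-network only if it is reachable from a source and can itself reach a sink.

```python
from collections import deque

def reachable(adj, starts):
    """BFS 'flood' from a set of start nodes; returns all reached nodes."""
    seen, queue = set(starts), deque(starts)
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def minimal_subnetwork(adj, sources, sinks):
    """Keep only nodes on some source-to-sink path: nodes reachable
    from a source AND able to reach a sink (via the reversed graph)."""
    rev = {}
    for u, vs in adj.items():
        for v in vs:
            rev.setdefault(v, set()).add(u)
    return reachable(adj, sources) & reachable(rev, sinks)

# Hypothetical regulatory network: edges point from regulator to target.
net = {"inA": {"g1"}, "g1": {"g2", "g3"}, "g2": {"outB"}, "g3": {"g4"}}
print(sorted(minimal_subnetwork(net, {"inA"}, {"outB"})))
```

Here the dead-end branch g3 → g4 is pruned because it cannot reach the sink, which is the kind of size reduction the abstract describes.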

  2. Energy-efficient ECG compression on wireless biosensors via minimal coherence sensing and weighted ℓ₁ minimization reconstruction.

    Science.gov (United States)

    Zhang, Jun; Gu, Zhenghui; Yu, Zhu Liang; Li, Yuanqing

    2015-03-01

    Low energy consumption is crucial for body area networks (BANs). In BAN-enabled ECG monitoring, continuous monitoring requires the sensor nodes to transmit a huge amount of data to the sink node, which leads to excessive energy consumption. To reduce airtime over energy-hungry wireless links, this paper presents an energy-efficient compressed sensing (CS)-based approach for on-node ECG compression. First, an algorithm called minimal mutual coherence pursuit is proposed to construct sparse binary measurement matrices, which can be used to encode the ECG signals with superior performance and extremely low complexity. Second, in order to minimize the data rate required for faithful reconstruction, a weighted ℓ1 minimization model is derived by exploiting multisource prior knowledge in the wavelet domain. Experimental results on the MIT-BIH arrhythmia database reveal that the proposed approach obtains a higher compression ratio than state-of-the-art CS-based methods. Together with its low encoding complexity, our approach can achieve significant energy savings in both the encoding process and wireless transmission.
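
The mutual coherence that the proposed pursuit algorithm minimizes is, in its standard definition, the largest normalized inner product between distinct columns of the measurement matrix. A small sketch (the toy matrix is illustrative, not one constructed by the paper's algorithm):

```python
import math

def mutual_coherence(phi):
    """Mutual coherence of a measurement matrix, given as a list of
    rows: the largest absolute normalized inner product between any
    two distinct columns. Lower coherence favors CS reconstruction."""
    m, n = len(phi), len(phi[0])
    cols = [[phi[i][j] for i in range(m)] for j in range(n)]
    norms = [math.sqrt(sum(x * x for x in c)) for c in cols]
    best = 0.0
    for a in range(n):
        for b in range(a + 1, n):
            dot = sum(x * y for x, y in zip(cols[a], cols[b]))
            best = max(best, abs(dot) / (norms[a] * norms[b]))
    return best

# Toy sparse binary matrix (two ones per column), purely illustrative.
phi = [[1, 0, 1, 0],
       [1, 1, 0, 0],
       [0, 1, 0, 1],
       [0, 0, 1, 1]]
print(round(mutual_coherence(phi), 3))
```

Sparse binary columns keep the encoder cheap (additions only), while low coherence between them preserves reconstruction quality, which is the trade-off the abstract exploits.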

  3. Sludge minimization technologies - an overview

    Energy Technology Data Exchange (ETDEWEB)

    Oedegaard, Hallvard

    2003-07-01

    The management of wastewater sludge from wastewater treatment plants represents one of the major challenges in wastewater treatment today. In many cases, the cost of sludge treatment amounts to more than the cost of treating the liquid. The focus on and interest in sludge minimization are therefore steadily increasing. This paper gives an overview of sludge minimization (sludge mass reduction) options. It is demonstrated that sludge minimization may be the result of reduced sludge production and/or of disintegration processes that may take place in both the wastewater treatment stage and the sludge stage. Various sludge disintegration technologies for sludge minimization are discussed, including mechanical methods (focusing on the stirred ball mill, high-pressure homogenizer and ultrasonic disintegrator), chemical methods (focusing on the use of ozone), physical methods (focusing on thermal and thermal/chemical hydrolysis) and biological methods (focusing on enzymatic processes). (author)

  4. Predicting blood transfusion in patients undergoing minimally invasive oesophagectomy.

    Science.gov (United States)

    Schneider, Crispin; Boddy, Alex P; Fukuta, Junaid; Groom, William D; Streets, Christopher G

    2014-12-01

    To evaluate predictors of allogenic blood transfusion requirements in patients undergoing minimally invasive oesophagectomy at a tertiary high-volume centre for oesophago-gastric surgery. Retrospective analysis of all patients undergoing minimal access oesophagectomy in our department between January 2010 and December 2011. Patients were divided into two groups depending on whether they required a blood transfusion at any time during their index admission. Factors that have been shown to influence perioperative blood transfusion requirements in major surgery were included in the analysis. Binary logistic regression analysis was performed to determine the impact of patient and perioperative characteristics on transfusion requirements during the index admission. A total of 80 patients underwent minimal access oesophagectomy, of whom 61 patients had a laparoscopic-assisted oesophagectomy and 19 patients had a minimally invasive oesophagectomy. Perioperative blood transfusion was required in 28 patients at some time during hospital admission. On binary logistic regression analysis, a lower preoperative haemoglobin concentration was significantly associated with blood transfusion requirements. It has been reported that the requirement for blood transfusion can affect long-term outcomes in oesophageal cancer resection. Two factors that could be addressed preoperatively, haemoglobin concentration and type of oesophageal resection, may be valuable in predicting blood transfusions in patients undergoing minimally invasive oesophagectomy. Our analysis revealed that preoperative haemoglobin concentration, occurrence of significant complications and type of minimal access oesophagectomy predicted blood transfusion requirements in the patient population examined. Copyright © 2014 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
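
Binary logistic regression of the kind used in the analysis can be sketched with a single predictor (the haemoglobin values and outcomes below are synthetic, and this plain gradient-descent fit stands in for the standard statistical procedure):

```python
import math

def fit_logistic(xs, ys, lr=0.01, steps=5000):
    """Plain gradient-descent fit of a one-predictor logistic model
    p(y=1|x) = 1/(1+exp(-(w*x+b))). Illustrative stand-in for the
    standard binary logistic regression used in the study."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (p - y) * x
            gb += (p - y)
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Synthetic data: lower preoperative haemoglobin (g/dL) -> transfusion (1).
hb = [8.1, 9.0, 9.5, 10.2, 11.5, 12.3, 13.0, 14.1]
tx = [1, 1, 1, 1, 0, 0, 0, 0]
w, b = fit_logistic(hb, tx)
print(w < 0)  # a negative coefficient: lower haemoglobin, higher risk
```

The sign of the fitted coefficient mirrors the study's finding that a lower preoperative haemoglobin concentration was associated with transfusion.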

  5. Variational method for the minimization of entropy generation in solar cells

    Energy Technology Data Exchange (ETDEWEB)

    Smit, Sjoerd; Kessels, W. M. M., E-mail: w.m.m.kessels@tue.nl [Department of Applied Physics, Eindhoven University of Technology, P.O. Box 513, 5600 MB Eindhoven (Netherlands)

    2015-04-07

    In this work, a method is presented to extend traditional solar cell simulation tools so that the most efficient design of practical solar cells can be calculated. The method is based on the theory of non-equilibrium thermodynamics, which is used to derive an expression for the local entropy generation rate in the solar cell, making it possible to quantify all free energy losses on the same scale. The framework of non-equilibrium thermodynamics can therefore be combined with the calculus of variations and existing solar cell models to minimize the total entropy generation rate in the cell and thereby find the optimal design. The variational method is illustrated by applying it to a homojunction solar cell. The optimization results in a set of differential algebraic equations, which determine the optimal shape of the doping profile for given recombination and transport models.
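
The variational setup can be sketched in standard calculus-of-variations notation (an illustrative one-dimensional form with assumed symbols, not the paper's exact equations):

```latex
% Illustrative sketch: minimize the total entropy generation over the
% cell thickness L for a design profile u(x) (e.g. the doping density),
% with \sigma the local entropy generation rate.
S_{\mathrm{gen}}[u] = \int_0^L \sigma\big(x, u(x), u'(x)\big)\,\mathrm{d}x
\;\longrightarrow\; \min_u
% Stationarity of the functional gives the Euler--Lagrange condition:
\frac{\partial \sigma}{\partial u}
  - \frac{\mathrm{d}}{\mathrm{d}x}\,\frac{\partial \sigma}{\partial u'} = 0
```

Coupling this stationarity condition to the transport and recombination models is what produces the set of differential algebraic equations mentioned in the abstract.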

  6. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil); Marchioni, E. [Laboratoire de Chimie Analytique et Sciences de l' Aliment (UMR 7512), Faculte de Pharmacie, Universite Louis Pasteur, 74, route du Rhin, F-67400 Illkirch (France); Villavicencio, A.L.C.H. [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Centro de Tecnologia das Radiacoes, Laboratorio de Deteccao de Alimentos Irradiados, Cidade Universitaria, Av. Prof. Lineu Prestes 2242, Butanta Zip Code 05508-000 Sao Paulo (Brazil)], E-mail: villavic@ipen.br

    2009-07-15

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a {sup 60}Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually. The difference between the two counts increased gradually with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
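
The screening criterion described above can be sketched as a simple decision rule (the counts are hypothetical; only the 2.0 log threshold comes from the abstract):

```python
import math

def deft_apc_flag(deft_cfu, apc_cfu, threshold=2.0):
    """Screening rule from the abstract: flag a sample as likely
    irradiated when log10(DEFT) - log10(APC) exceeds the threshold.
    Counts are in CFU/g; the function returns (log difference, flag)."""
    diff = math.log10(deft_cfu) - math.log10(apc_cfu)
    return diff, diff > threshold

# Hypothetical counts: DEFT stays high after irradiation, APC drops.
diff, irradiated = deft_apc_flag(deft_cfu=1e7, apc_cfu=1e4)
print(round(diff, 1), irradiated)
```

DEFT counts both viable and non-viable cells while APC counts only viable ones, so irradiation widens the gap between the two, which is what the rule detects.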

  7. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    Science.gov (United States)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually. The difference between the two counts increased gradually with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  8. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    International Nuclear Information System (INIS)

    Araujo, M.M.; Duarte, R.C.; Silva, P.V.; Marchioni, E.; Villavicencio, A.L.C.H.

    2009-01-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually. The difference between the two counts increased gradually with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.

  9. A Modified Limited-Memory BNS Method for Unconstrained Minimization Based on the Conjugate Directions Idea

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    2015-01-01

    Vol. 30, No. 3 (2015), pp. 616-633 ISSN 1055-6788 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords: unconstrained minimization * variable metric methods * limited-memory methods * the BFGS update * conjugate directions * numerical results Subject RIV: BA - General Mathematics Impact factor: 0.841, year: 2015

  10. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-01-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal

  11. Subspace Correction Methods for Total Variation and $\\ell_1$-Minimization

    KAUST Repository

    Fornasier, Massimo; Schö nlieb, Carola-Bibiane

    2009-01-01

    This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a

  12. Westinghouse Hanford Company waste minimization actions

    International Nuclear Information System (INIS)

    Greenhalgh, W.O.

    1988-09-01

    Companies that generate hazardous waste materials are now required by national regulations to establish a waste minimization program. Accordingly, in FY88 the Westinghouse Hanford Company formed a waste minimization team organization. The purpose of the team is to assist the company in its efforts to minimize the generation of waste, train personnel on waste minimization techniques, document successful waste minimization efforts, track the dollar savings realized, and publicize and administer an employee incentive program. A number of significant actions have been successful, resulting in savings of materials and dollars. The team itself has been successful in establishing some worthwhile minimization projects. This document briefly describes the waste minimization actions that have been successful to date. 2 refs., 26 figs., 3 tabs

  13. Analysis and minimization of Torque Ripple for variable Flux reluctance machines

    NARCIS (Netherlands)

    Bao, J.; Gysen, B.L.J.; Boynov, K.; Paulides, J.J.H.; Lomonova, E.A.

    2017-01-01

    Variable flux reluctance machines (VFRMs) are permanent-magnet-free three-phase machines and are promising candidates for applications requiring low cost and robustness. This paper studies the torque ripple and minimization methods for 12-stator VFRMs. Starting with the analysis of harmonics in the

  14. Westinghouse Hanford Company waste minimization and pollution prevention awareness program plan

    International Nuclear Information System (INIS)

    Craig, P.A.; Nichols, D.H.; Lindsey, D.W.

    1991-08-01

    The purpose of this plan is to establish the Westinghouse Hanford Company's Waste Minimization Program. The plan specifies activities and methods that will be employed to reduce the quantity and toxicity of waste generated at the Westinghouse Hanford Company (Westinghouse Hanford). It is designed to satisfy the US Department of Energy (DOE) and other legal requirements that are discussed in Subsection C of this section. The Pollution Prevention Awareness Program is included with the Waste Minimization Program as permitted by DOE Order 5400.1 (DOE 1988a). This plan is based on the Hanford Site Waste Minimization and Pollution Prevention Awareness Program Plan, which directs DOE Field Office, Richland contractors to develop and maintain a waste minimization program. This waste minimization program is an organized, comprehensive, and continual effort to systematically reduce waste generation. The Westinghouse Hanford Waste Minimization Program is designed to prevent or minimize pollutant releases to all environmental media from all aspects of Westinghouse Hanford operations, and offers increased protection of public health and the environment. 14 refs., 2 figs., 1 tab

  15. Minimal Residual Disease Assessment in Lymphoma: Methods and Applications.

    Science.gov (United States)

    Herrera, Alex F; Armand, Philippe

    2017-12-01

    Standard methods for disease response assessment in patients with lymphoma, including positron emission tomography and computed tomography scans, are imperfect. In other hematologic malignancies, particularly leukemias, the ability to detect minimal residual disease (MRD) is increasingly influencing treatment paradigms. However, in many subtypes of lymphoma, the application of MRD assessment techniques, like flow cytometry or polymerase chain reaction-based methods, has been challenging because of the absence of readily detected circulating disease or canonical chromosomal translocations. Newer MRD detection methods that use next-generation sequencing have yielded promising results in a number of lymphoma subtypes, fueling the hope that MRD detection may soon be applicable in clinical practice for most patients with lymphoma. MRD assessment can provide real-time information about tumor burden and response to therapy, noninvasive genomic profiling, and monitoring of clonal dynamics, allowing for many possible applications that could significantly affect the care of patients with lymphoma. Further validation of MRD assessment methods, including the incorporation of MRD assessment into clinical trials in patients with lymphoma, will be critical to determine how best to deploy MRD testing in routine practice and whether MRD assessment can ultimately bring us closer to the goal of personalized lymphoma care. In this review article, we describe the methods available for detecting MRD in patients with lymphoma and their relative advantages and disadvantages. We discuss preliminary results supporting the potential applications for MRD testing in the care of patients with lymphoma and strategies for including MRD assessment in lymphoma clinical trials.

  16. An Improved Variational Method for Hyperspectral Image Pansharpening with the Constraint of Spectral Difference Minimization

    Science.gov (United States)

    Huang, Z.; Chen, Q.; Shen, Y.; Chen, Q.; Liu, X.

    2017-09-01

    Variational pansharpening can enhance the spatial resolution of a hyperspectral (HS) image using a high-resolution panchromatic (PAN) image. However, this technology may introduce spectral distortion that noticeably affects the accuracy of data analysis. In this article, we propose an improved variational method for HS image pansharpening with the constraint of spectral difference minimization. We extend the energy function of the classic variational pansharpening method by adding a new spectral fidelity term. This fidelity term is designed following the definition of the spectral angle mapper, which means that, for every pixel, the spectral difference value of any two bands in the HS image is in equal proportion to that of the two corresponding bands in the pansharpened image. The gradient descent method is adopted to find the optimal solution of the modified energy function, and the pansharpened image can be reconstructed. Experimental results demonstrate that the constraint of spectral difference minimization preserves the original spectral information of HS images well and reduces spectral distortion effectively. Compared to the original variational method, our method performs better in both visual and quantitative evaluation and achieves a good trade-off between spatial and spectral information.
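
The spectral angle mapper underlying the proposed fidelity term has a standard definition: the angle between two spectra viewed as vectors, which is insensitive to overall brightness. A small sketch (the sample spectra are hypothetical):

```python
import math

def spectral_angle(a, b):
    """Spectral angle mapper (SAM): the angle in radians between two
    pixel spectra treated as vectors; 0 means identical shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    # Clamp against floating-point drift before taking acos.
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

ref = [0.12, 0.30, 0.45, 0.40]   # hypothetical HS pixel spectrum
same = [0.24, 0.60, 0.90, 0.80]  # same shape, twice as bright
print(round(spectral_angle(ref, same), 6))
```

Because a uniformly brighter version of a spectrum has angle zero to the original, a SAM-style term penalizes changes in spectral shape rather than in intensity, which suits pansharpening.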

  17. An Engineering Method of Civil Jet Requirements Validation Based on Requirements Project Principle

    Science.gov (United States)

    Wang, Yue; Gao, Dan; Mao, Xuming

    2018-03-01

    A method of requirements validation is developed and defined to meet the needs of civil jet requirements validation in product development. Based on the requirements project principle, this method does not affect the conventional design elements and can effectively connect the requirements with the design. It realizes the modern civil jet development concept that "requirement is the origin, design is the basis". So far, the method has been successfully applied in civil jet aircraft development in China. Taking takeoff field length as an example, the validation process and the validation method for the requirements are introduced in detail, with the hope of providing experience for other civil jet product designs.

  18. Quantization of the minimal and non-minimal vector field in curved space

    OpenAIRE

    Toms, David J.

    2015-01-01

    The local momentum space method is used to study the quantized massive vector field (the Proca field) with the possible addition of non-minimal terms. Heat kernel coefficients are calculated and used to evaluate the divergent part of the one-loop effective action. It is shown that the naive expression for the effective action that one would write down based on the minimal coupling case needs modification. We adopt a Faddeev-Jackiw method of quantization and consider the case of an ultrastatic...

  19. Parameter-free method for the shape optimization of stiffeners on thin-walled structures to minimize stress concentration

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yang; Shibutan, Yoji [Osaka University, Osaka (Japan); Shimoda, Masatoshi [Toyota Technological Institute, Nagoya (Japan)

    2015-04-15

    This paper presents a parameter-free shape optimization method for the strength design of stiffeners on thin-walled structures. The maximum von Mises stress is minimized subject to a volume constraint. The optimum design problem is formulated as a distributed-parameter shape optimization problem under the assumptions that a stiffener varies in the in-plane direction and that the thickness is constant. The issue of non-differentiability, which is inherent in this min-max problem, is avoided by transforming the local measure into a smooth, differentiable integral functional using the Kreisselmeier-Steinhauser function. The shape gradient functions are derived using the material derivative method and the adjoint variable method and are applied to the H{sup 1} gradient method for shells to determine the optimal free-boundary shapes. With this method, a smooth optimal stiffener shape can be obtained, without any shape design parameterization, while minimizing the maximum stress. The validity of the method is verified through two practical design examples.
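
The Kreisselmeier-Steinhauser (KS) aggregation mentioned above replaces the non-differentiable maximum with a smooth upper bound. A minimal sketch in its standard form (the sample stress values and the aggregation parameter rho are illustrative):

```python
import math

def ks_max(values, rho=50.0):
    """Kreisselmeier-Steinhauser aggregate: a smooth, differentiable
    upper bound on max(values) that tightens as rho grows.
    Computed as m + (1/rho)*ln(sum(exp(rho*(v - m)))), shifted by the
    maximum m for numerical stability."""
    m = max(values)
    return m + math.log(sum(math.exp(rho * (v - m)) for v in values)) / rho

stresses = [120.0, 250.0, 248.0, 90.0]  # hypothetical von Mises samples
print(round(ks_max(stresses), 3), max(stresses))
```

Because the KS aggregate is differentiable everywhere, it can be driven by gradient-based shape optimization even when the location of the peak stress jumps between design iterations.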

  20. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study.

    Science.gov (United States)

    Boob, Ankush Ramnarayan; Manjula, M; Reddy, E Rajendra; Srilaxmi, N; Rani, Tabitha

    2014-01-01

    Many chemomechanical caries removal (CMCR) agents have been introduced and marketed since the 1970s, each claimed to be better and more effective than those previously introduced. Papacarie and Carisolv are new systems in the field of CMCR techniques. These are reportedly minimally invasive methods of removing carious dentin while preserving sound dentin. To compare the efficiency (time taken for caries removal) and effectiveness (Knoop hardness number of the remaining dentin) of caries removal by three minimally invasive methods, i.e. hand excavation and chemomechanical caries removal using Carisolv and Papacarie. Thirty recently extracted human permanent molars with occlusal carious lesions were divided randomly into three equal groups, bisected through the middle of the lesion mesiodistally, and excavated by two methods on each tooth. A statistically significant difference was present among the three methods with respect to time and the Knoop hardness number (KHN) of the remaining dentin. The efficiency of the hand method is better than that of the CMCR techniques, whereas the effectiveness of the CMCR techniques is better than that of the hand method in terms of dentin preservation, so the chances of maintaining the vitality of the pulp are enhanced. How to cite this article: Boob AR, Manjula M, Reddy ER, Srilaxmi N, Rani T. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study. Int J Clin Pediatr Dent 2014;7(1):11-18.

  1. On eco-efficient technologies to minimize industrial water consumption

    Science.gov (United States)

    Amiri, Mohammad C.; Mohammadifard, Hossein; Ghaffari, Ghasem

    2016-07-01

    Purpose - Water scarcity will place further stress on available water systems and decrease water security in many areas. Innovative methods to minimize industrial water usage and waste production are therefore of paramount importance in extending fresh water resources, the main life-support systems in many arid regions of the world. This paper demonstrates that many industries have good opportunities to save water and decrease wastewater in the softening process by substituting eco-friendly methods for traditional ones. The patented puffing method is an eco-efficient and viable technology for water saving and waste reduction in the lime softening process. Design/methodology/approach - The lime softening process (LSP) is very sensitive to chemical reactions. In addition, optimal monitoring not only minimizes the sludge that must be disposed of but also reduces the operating costs of water conditioning. The weakness of the current (regular) control of the LSP, based on chemical analysis, has been demonstrated experimentally and compared with the eco-efficient puffing method. Findings - This paper demonstrates that many industries have a good opportunity to save water and decrease wastewater in the softening process by substituting the puffing method, a patented eco-efficient technology, for the traditional method. Originality/value - Details of the innovative work required to minimize industrial water usage and waste production are outlined in this paper. Employing the novel puffing method for monitoring of the lime softening process saves a considerable amount of water while reducing chemical sludge.

  2. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

The work highlights the most important principles of software reliability management (SRM). The SRM concept constitutes a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. It applies a new metric to evaluate requirements complexity and a double-sorting technique that ranks each requirement by priority and complexity. The method improves requirements correctness by identifying a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.
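The double-sorting step can be sketched as follows. This is only an illustration of the idea of ranking requirements by priority and then by a complexity metric so that review effort concentrates on the requirements most likely to hide defects; the field layout and the word-count-based complexity metric are invented here, not the metric proposed in the paper.

```python
def complexity(req_text):
    # Toy metric: longer, conjunction-heavy requirements count as more complex.
    words = req_text.split()
    return len(words) + 2 * sum(
        w.lower() in ("and", "or", "unless", "except") for w in words
    )

def review_order(requirements):
    # requirements: list of (priority, text). Sort highest priority first,
    # then highest complexity first within the same priority (double sorting).
    return sorted(requirements, key=lambda r: (-r[0], -complexity(r[1])))

reqs = [
    (1, "The system shall log errors."),
    (2, "The system shall encrypt data and rotate keys unless offline."),
    (2, "The system shall display a status page."),
]
ordered = review_order(reqs)
```

Reviewing requirements in this order front-loads the complex, high-priority items, which is where the method expects most defects to be found.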

  3. Evaluation of the accuracy of the free-energy-minimization method

    International Nuclear Information System (INIS)

    Najafabadi, R.; Srolovitz, D.J.

    1995-01-01

We have made a detailed comparison between three competing methods for determining the free energies of solids and their defects: the thermodynamic integration of Monte Carlo (TIMC) data, the quasiharmonic (QH) model, and the free-energy-minimization (FEM) method. The accuracy of these methods decreases from the TIMC to QH to FEM method, while the computational efficiency improves in that order. All three methods yield perfect crystal lattice parameters and free energies at finite temperatures which are in good agreement for three different Cu interatomic potentials [embedded atom method (EAM), Morse and Lennard-Jones]. The FEM error (relative to the TIMC) in the (001) surface free energy and in the vacancy formation energy were found to be much larger for the EAM potential than for the other two potentials. Part of the errors in the FEM determination of the free energies are associated with anharmonicities in the interatomic potentials, with the remainder attributed to decoupling of the atomic vibrations. The anharmonicity of the EAM potential was found to be unphysically large compared with experimental vacancy formation entropy determinations. Based upon these results, we show that the FEM method provides a reasonable compromise between accuracy and computational demands. However, the accuracy of this approach is sensitive to the choice of interatomic potential and the nature of the defect to which it is being applied. The accuracy of the FEM is best in high-symmetry environments (perfect crystal, high-symmetry defects, etc.) and when used to describe materials where the anharmonicity is not too large.

  4. Minimal quantization and confinement

    International Nuclear Information System (INIS)

    Ilieva, N.P.; Kalinowskij, Yu.L.; Nguyen Suan Han; Pervushin, V.N.

    1987-01-01

A ''minimal'' version of the Hamiltonian quantization based on the explicit solution of the Gauss equation and on the gauge-invariance principle is considered. Using the one-particle Green function as an example, we show that the requirement of gauge invariance leads to relativistic covariance of the theory and to a more proper definition of the Faddeev-Popov integral that does not depend on the gauge choice. The ''minimal'' quantization is applied to the gauge-ambiguity problem and to a new topological mechanism of confinement.

  5. Minimally invasive brow suspension for facial paralysis.

    Science.gov (United States)

    Costantino, Peter D; Hiltzik, David H; Moche, Jason; Preminger, Aviva

    2003-01-01

    To report a new technique for unilateral brow suspension for facial paralysis that is minimally invasive, limits supraciliary scar formation, does not require specialized endoscopic equipment or expertise, and has proved to be equal to direct brow suspension in durability and symmetry. Retrospective survey of a case series of 23 patients between January 1997 and December 2000. Metropolitan tertiary care center. Patients with head and neck tumors and brow ptosis caused by facial nerve paralysis. The results of the procedure were determined using the following 3-tier rating system: outstanding (excellent elevation and symmetry); acceptable (good elevation and fair symmetry); and unacceptable (loss of elevation). The results were considered outstanding in 12 patients, acceptable in 9 patients, and unacceptable in only 1 patient. One patient developed a hematoma, and 1 patient required a secondary adjustment. The technique has proved to be superior to standard brow suspension procedures with regard to scar formation and equal with respect to facial symmetry and suspension. These results have caused us to abandon direct brow suspension and to use this minimally invasive method in all cases of brow ptosis due to facial paralysis.

  6. A new methodology for minimizing investment in the development of offshore fields

    International Nuclear Information System (INIS)

    Garcia-Diaz, J.C.; Startzman, R.; Hogg, G.L.

    1996-01-01

The development of an offshore field is often a long, complex, and extremely expensive undertaking. The enormous amount of capital required for investments of this type motivates efforts to optimize the development of a field. This paper provides an efficient computational method to minimize the initial investment in the development of a field. The problem of minimizing the investment in an offshore field is defined here as the problem of locating a number of offshore facilities and wells and allocating these wells to the facilities at minimum cost. Side constraints include restrictions on the total number of facilities of every type and design and various technology constraints. This problem is modeled as a 0/1 integer program. The solution method is based on an implicit enumeration scheme using efficient mathematical tools, such as Lagrangian relaxation and heuristics, to calculate good bounds and, consequently, to reduce the computation time. The solution method was implemented and tested on some typical field-development cases. Execution times were remarkably small for the size and complexity of the examples. Computational results indicate that the new methodology outperforms existing methods in both execution time and memory required.
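The structure of the 0/1 model can be illustrated on a toy instance: choose which platform sites to open and assign each well to an open platform so that opening costs plus connection costs are minimized. The paper's method uses implicit enumeration with Lagrangian bounds; the exhaustive enumeration below is only a sketch of the model itself, and all costs are made-up numbers.

```python
from itertools import product

open_cost = [50, 60]            # cost of opening each candidate platform site
connect = [[10, 30],            # connect[w][s]: cost of tying well w
           [25, 12],            #   back to platform site s
           [20, 18]]

def best_plan(open_cost, connect):
    n_sites = len(open_cost)
    best = (float("inf"), None)
    # enumerate every 0/1 vector of opened sites
    for opened in product([0, 1], repeat=n_sites):
        if not any(opened):
            continue            # at least one platform is required
        cost = sum(c for c, o in zip(open_cost, opened) if o)
        # each well is allocated to its cheapest open platform
        cost += sum(min(row[s] for s in range(n_sites) if opened[s])
                    for row in connect)
        if cost < best[0]:
            best = (cost, opened)
    return best

cost, opened = best_plan(open_cost, connect)
```

On this instance opening only the first site is optimal; a real field-development case would add the type/design and technology side constraints and replace the enumeration with the bounded implicit search described in the abstract.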

7. MINIMAL INVASIVE PLATE OSTEOSYNTHESIS - AN EFFECTIVE TREATMENT METHOD FOR DISTAL TIBIA INTRAARTICULAR (PILON) FRACTURES - AN 18-MONTH FOLLOW-UP

    Directory of Open Access Journals (Sweden)

    Saket Jati

    2016-12-01

BACKGROUND Tibial pilon fractures, though requiring operative treatment, are difficult to manage. Conventional osteosynthesis is not suitable because the distal tibia is a subcutaneous bone with poor vascularity. Closed reduction and Minimally Invasive Plate Osteosynthesis (MIPO) for the distal tibia has emerged as an alternative treatment option because it respects fracture biology, preserves the fracture haematoma, and provides a biomechanically stable construct. The aim of the study is to evaluate the results of minimally invasive plate osteosynthesis using locking plates in treating tibial pilon fractures in terms of fracture union, restoration of ankle function and complications. MATERIALS AND METHODS 30 patients with closed tibial pilon fractures (Ruedi and Allgower type I (14), type II (13), type III (3)) treated with MIPO with Locking Compression Plates (LCP) were prospectively followed for an average duration of 18 months. RESULTS Average injury-to-hospital and injury-to-surgery intervals were 12.05 hours and 3.50 days, respectively. All fractures united, with an average union time of 20.8 weeks (range 14-28 weeks). The Olerud and Molander score was used for evaluation at 3 months, 6 months and 18 months. One patient had union with valgus angulation of 15 degrees, but no nonunion was found. CONCLUSION The present study shows that MIPO with LCP is an effective treatment method for tibial pilon fractures in terms of union time and complication rate, promoting early union and early weight bearing.

  8. 40 CFR 125.94 - How will requirements reflecting best technology available for minimizing adverse environmental...

    Science.gov (United States)

    2010-07-01

    ... technology available for minimizing adverse environmental impact be established for my Phase II existing... technology available to minimize adverse environmental impact for your facility in accordance with paragraphs... technology available for minimizing adverse environmental impact. This determination must be based on...

  9. Minimal DBM Subtraction

    DEFF Research Database (Denmark)

    David, Alexandre; Håkansson, John; G. Larsen, Kim

In this paper we present an algorithm to compute DBM subtractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundancy. The subtraction is one of the few operations that result in a non-convex zone and thus requires splitting. It is of prime importance to reduce...

  10. Note: A method for minimizing oxide formation during elevated temperature nanoindentation

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, I. C.; Hodge, A. M., E-mail: ahodge@usc.edu [Department of Aerospace and Mechanical Engineering, University of Southern California, 3650 McClintock Avenue OHE430, Los Angeles, California 90089 (United States); Garcia-Sanchez, E. [Department of Aerospace and Mechanical Engineering, University of Southern California, 3650 McClintock Avenue OHE430, Los Angeles, California 90089 (United States); Facultad de Ingeniería Mecánica y Eléctrica, Universidad Autónoma de Nuevo León, Av. Universidad S/N, San Nicolás de los Garza, NL 66450 (Mexico)

    2014-09-15

A standardized method to protect metallic samples and minimize oxide formation during elevated-temperature nanoindentation was adapted to a commercial instrument. Nanoindentation was performed on Al (100), Cu (100), and W (100) single crystals submerged in vacuum oil at 200 °C, while the surface morphology and oxidation were carefully monitored using atomic force microscopy (AFM) and X-ray photoelectron spectroscopy (XPS). The results were compared to room-temperature and 200 °C nanoindentation tests performed without oil, in order to evaluate the feasibility of using the oil as a protective medium. Extensive surface characterization demonstrated that this methodology is effective for nanoscale testing.

  11. Minimizing the Discrepancy between Simulated and Historical Failures in Turbine Engines: A Simulation-Based Optimization Method

    Directory of Open Access Journals (Sweden)

    Ahmed Kibria

    2015-01-01

The reliability modeling of a module in a turbine engine requires knowledge of its failure rate, which can be estimated by identifying statistical distributions describing the percentage of failures per component within the turbine module. The correct definition of the failure statistical behavior per component is highly dependent on the engineer's skills and may present significant discrepancies with respect to the historical data. There is no formal methodology to approach this problem, and a large number of labor hours are spent trying to reduce the discrepancy by manually adjusting the distributions' parameters. This paper addresses this problem and provides a simulation-based optimization method for minimizing the discrepancy between the simulated and the historical percentage of failures for turbine engine components. The proposed methodology optimizes the parameter values of the components' failure statistical distributions within the components' likelihood confidence bounds. A complete test of the proposed method is performed on a turbine engine case study. The method can serve as a decision-making tool for maintenance, repair, and overhaul companies and will potentially reduce the cost of labor associated with finding appropriate values of the distribution parameters for each component/failure mode in the model, while increasing the accuracy of the prediction of the mean time to failure (MTTF).
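The core idea of searching for distribution parameters, within given bounds, that best reproduce the historical failure percentage can be sketched in a few lines. An exponential failure model and a plain grid search stand in for the paper's simulation model and optimizer; the horizon, bounds, and historical fraction below are illustrative numbers, not data from the study.

```python
import math

def fit_rate(hist_fail_frac, horizon, lo, hi, steps=1000):
    """Pick the failure rate in [lo, hi] whose predicted fraction of
    failures by `horizon` is closest to the historical fraction."""
    best_rate, best_err = lo, float("inf")
    for i in range(steps + 1):
        rate = lo + (hi - lo) * i / steps
        predicted = 1.0 - math.exp(-rate * horizon)  # P(failure by horizon)
        err = abs(predicted - hist_fail_frac)        # discrepancy to minimize
        if err < best_err:
            best_rate, best_err = rate, err
    return best_rate, best_err

# 30% of components failed within 1000 operating hours (illustrative)
rate, err = fit_rate(hist_fail_frac=0.30, horizon=1000.0, lo=1e-5, hi=1e-3)
```

The paper's method replaces the closed-form exponential CDF with a simulation of the turbine module and constrains the search to the likelihood confidence bounds of each component's fitted distribution.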

  12. Minimal invasive stabilization of osteoporotic vertebral compression fractures. Methods and preinterventional diagnostics

    International Nuclear Information System (INIS)

    Grohs, J.G.; Krepler, P.

    2004-01-01

Minimal invasive stabilizations represent a new alternative for the treatment of osteoporotic compression fractures. Vertebroplasty and balloon kyphoplasty are two methods to enhance the strength of osteoporotic vertebral bodies by means of cement application. Vertebroplasty is the older and technically easier method. Balloon kyphoplasty is the newer and more expensive method, which not only improves pain but also restores the sagittal profile of the spine. By balloon kyphoplasty the height of 101 fractured vertebral bodies could be increased to almost 90% of the target value and the wedge angle decreased from 12 to 7 degrees. Pain was reduced from 7.2 to 2.5 points. The Oswestry disability index decreased from 60 to 26 points. These effects persisted over a period of two years. Cement leakage occurred in only 2% of vertebral bodies. Fractures of adjacent vertebral bodies were found in 11%. Good preinterventional diagnostics and intraoperative imaging are necessary to make balloon kyphoplasty a successful application. (orig.) [de]

  13. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-11-01

A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.

  14. Minimal Dark Matter in the sky

    International Nuclear Information System (INIS)

    Panci, P.

    2016-01-01

We discuss some theoretical and phenomenological aspects of the Minimal Dark Matter (MDM) model proposed in 2006, a theoretical framework highly appreciated for its minimality and predictivity. We first critically review the theoretical requirements of MDM, pointing out generalizations of this framework. Then we review the phenomenology of the originally proposed fermionic hyperchargeless electroweak quintuplet, showing its main γ-ray tests.

  15. MOCUS, Minimal Cut Sets and Minimal Path Sets from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Fussell, J.B.; Henry, E.B.; Marshall, N.H.

    1976-01-01

1 - Description of problem or function: From a description of the Boolean failure logic of a system, called a fault tree, and control parameters specifying the minimal cut set length to be obtained, MOCUS determines the system failure modes, or minimal cut sets, and the system success modes, or minimal path sets. 2 - Method of solution: MOCUS uses direct resolution of the fault tree into the cut and path sets. The algorithm starts with the main failure of interest, the top event, and proceeds to basic independent component failures, called primary events, to resolve the fault tree and obtain the minimal sets. A key point of the algorithm is that an AND gate alone always increases the size of cut sets and the number of path sets, while an OR gate alone always increases the number of cut sets and the size of path sets. Other types of logic gates must be described in terms of AND and OR logic gates. 3 - Restrictions on the complexity of the problem: Output from MOCUS can include minimal cut and path sets for up to 20 gates.
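The top-down resolution described above can be sketched compactly for cut sets: starting from the top event, OR gates add rows (more cut sets) and AND gates widen rows, and a final pass removes non-minimal supersets. The fault tree below is a made-up example, and this sketch omits MOCUS's control parameters and path-set output.

```python
def mocus(tree, top):
    """tree maps each gate name to ("AND"|"OR", [children]);
    names not in the tree are primary events."""
    rows = [[top]]
    changed = True
    while changed:
        changed = False
        for row in rows:
            gate = next((g for g in row if g in tree), None)
            if gate is None:
                continue                    # fully resolved row
            kind, kids = tree[gate]
            rest = [e for e in row if e != gate]
            rows.remove(row)
            if kind == "AND":               # AND widens the row
                rows.append(rest + kids)
            else:                           # OR adds one row per child
                rows.extend(rest + [k] for k in kids)
            changed = True
            break
    cut_sets = [frozenset(r) for r in rows]
    # keep only minimal cut sets (no proper subset among the others)
    return {c for c in cut_sets if not any(other < c for other in cut_sets)}

tree = {
    "TOP": ("OR", ["G1", "A"]),
    "G1":  ("AND", ["B", "G2"]),
    "G2":  ("OR", ["A", "C"]),
}
cuts = mocus(tree, "TOP")
```

Here the row {A, B} is discarded because {A} is already a cut set, leaving the minimal cut sets {A} and {B, C}.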

  16. Collective motion in prolate γ-rigid nuclei within minimal length concept via a quantum perturbation method

    Science.gov (United States)

    Chabab, M.; El Batoul, A.; Lahbas, A.; Oulne, M.

    2018-05-01

Based on the minimal length concept, inspired by Heisenberg algebra, a closed analytical formula is derived for the energy spectrum of the prolate γ-rigid Bohr-Mottelson Hamiltonian of nuclei, within a quantum perturbation method (QPM), by considering a scaled Davidson potential in the β shape variable. In the resulting solution, called X(3)-D-ML, the ground state and the first β-band are studied as functions of the free parameters. Introducing the minimal length concept with a QPM makes the model very flexible and a powerful approach for describing nuclear collective excitations of a variety of vibrational-like nuclei. The introduction of scaling parameters in the Davidson potential enables us to obtain a physical minimum of the latter in comparison with previous works. The analysis of the corrected wave function, as well as the probability density distribution, shows that the minimal length parameter has a physical upper bound.

  17. Hazardous waste minimization tracking system

    International Nuclear Information System (INIS)

    Railan, R.

    1994-01-01

Under RCRA sections 3002(b) and 3005(h), hazardous waste generators and owners/operators of treatment, storage, and disposal facilities (TSDFs) are required to certify that they have a program in place to reduce the volume or quantity and toxicity of hazardous waste to the degree determined to be economically practicable. In many cases there are environmental as well as economic benefits for agencies that pursue pollution prevention options. Several state governments have already enacted waste minimization legislation (e.g., the Massachusetts Toxics Use Reduction Act of 1989 and the Oregon Toxics Use Reduction and Hazardous Waste Reduction Act of July 2, 1989). About twenty-six other states have established legislation that mandates some type of waste minimization program and/or facility planning. The need to address the HAZMIN (Hazardous Waste Minimization) Program at government agencies and private industries has prompted us to identify the importance of managing the HAZMIN Program and tracking various aspects of the program, as well as the progress made in this area. "WASTE" is a tracking system which can be used and modified to maintain the information related to a Hazardous Waste Minimization Program in a manageable fashion. The program maintains, modifies, and retrieves information related to hazardous waste minimization and recycling, and provides automated report-generating capabilities. It has a built-in menu, which can be printed either in part or in full. There are instructions on preparing the Annual Waste Report and the Annual Recycling Report. The program is very user friendly. It is available on 3.5-inch or 5 1/4-inch floppy disks. A computer with 640K memory is required.
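The kind of record keeping and annual-report summation such a tracker performs can be sketched as follows. The stream names, fields, and figures are invented for illustration; the actual WASTE program's data model is not described in the record.

```python
from collections import defaultdict

# Hypothetical waste-minimization records (stream names and masses invented)
records = [
    {"year": 1993, "stream": "solvents", "generated_kg": 500, "recycled_kg": 120},
    {"year": 1993, "stream": "acids",    "generated_kg": 300, "recycled_kg": 90},
    {"year": 1994, "stream": "solvents", "generated_kg": 420, "recycled_kg": 200},
]

def annual_report(records, year):
    """Sum generated and recycled mass per waste stream for one year."""
    totals = defaultdict(lambda: {"generated_kg": 0, "recycled_kg": 0})
    for r in records:
        if r["year"] == year:
            totals[r["stream"]]["generated_kg"] += r["generated_kg"]
            totals[r["stream"]]["recycled_kg"] += r["recycled_kg"]
    return dict(totals)

report = annual_report(records, 1993)
```

A report like this, printed per year, is the essence of the Annual Waste Report and Annual Recycling Report the abstract mentions.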

  18. MINIMAL REQUIREMENTS FOR THE DIAGNOSIS, CLASSIFICATION, AND EVALUATION OF THE TREATMENT OF CHILDHOOD ACUTE LYMPHOBLASTIC-LEUKEMIA (ALL) IN THE BFM FAMILY COOPERATIVE GROUP

    NARCIS (Netherlands)

    VANDERDOESVANDENBERG, A; BARTRAM, CR; BASSO, G; BENOIT, YCM; BIONDI, A; DEBATIN, KM; HAAS, OA; HARBOTT, J; KAMPS, WA; KOLLER, U; LAMPERT, F; LUDWIG, WD; NIEMEYER, CM; VANWERING, ER

    1992-01-01

    Minimal requirements and their rationale for the diagnosis and the response to treatment in childhood acute lymphoblastic leukemia (ALL) were defined in the recently instituted "BFM-Family"-Group, in which the German, Austrian, Dutch, Italian, Belgian, French and Hungarian childhood leukemia study

  19. Facile microwave synthesis of uniform magnetic nanoparticles with minimal sample processing

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Thomas, E-mail: tom.schneider@ubc.ca [Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, BC, V6T 1Z3 Canada (Canada); Löwa, Anna; Karagiozov, Stoyan [Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, BC, V6T 1Z3 Canada (Canada); Sprenger, Lisa [Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, BC, V6T 1Z3 Canada (Canada); TU Dresden, Chair of Magnetofluiddynamics, Measuring and Automation Technology, Dresden, 01062 Germany (Germany); Gutiérrez, Lucía [Instituto Universitario de Nanociencia de Aragón (INA), University of Zaragoza, Zaragoza, 50018 Spain (Spain); Esposito, Tullio; Marten, Gernot; Saatchi, Katayoun [Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, BC, V6T 1Z3 Canada (Canada); Häfeli, Urs O., E-mail: urs.hafeli@ubc.ca [Faculty of Pharmaceutical Sciences, University of British Columbia, Vancouver, BC, V6T 1Z3 Canada (Canada)

    2017-01-01

We present a simple and rapid method for the synthesis of small magnetic nanoparticles (diameters on the order of 5–20 nm) with narrow size distributions (CVs of 20–40%). The magnetite nanoparticles were synthesized in green solvents within minutes, and the saturation magnetization of the particles was tunable by changes in the reaction conditions. We show that this particle synthesis method requires minimal processing steps, and we present the successful coating of the particles with reactive bisphosphonates after synthesis, without washing or centrifugation. We found minimal batch-to-batch variability and show the scalability of the particle synthesis method. We present a full characterization of the particle properties and believe that this synthesis method holds great promise for the facile and rapid generation of magnetic nanoparticles with defined surface coatings for magnetic targeting applications. - Highlights: ●Rapid and facile synthesis of magnetic nanoparticles. ●Microwave synthesis in green solvent. ●Magnetite MNPs with small sizes and high saturation magnetization. ●Tunable particle properties depending on heating duration. ●Scalable MNP synthesis.

  20. Minimal invasive stabilization of osteoporotic vertebral compression fractures. Methods and preinterventional diagnostics; Minimal-invasive Stabilisierung osteoporotischer Wirbelkoerpereinbrueche. Methodik und praeinterventionelle Diagnostik

    Energy Technology Data Exchange (ETDEWEB)

    Grohs, J.G.; Krepler, P. [Orthopaedische Klinik, Universitaet Wien (Austria)

    2004-03-01

Minimal invasive stabilizations represent a new alternative for the treatment of osteoporotic compression fractures. Vertebroplasty and balloon kyphoplasty are two methods to enhance the strength of osteoporotic vertebral bodies by means of cement application. Vertebroplasty is the older and technically easier method. Balloon kyphoplasty is the newer and more expensive method, which not only improves pain but also restores the sagittal profile of the spine. By balloon kyphoplasty the height of 101 fractured vertebral bodies could be increased to almost 90% of the target value and the wedge angle decreased from 12 to 7 degrees. Pain was reduced from 7.2 to 2.5 points. The Oswestry disability index decreased from 60 to 26 points. These effects persisted over a period of two years. Cement leakage occurred in only 2% of vertebral bodies. Fractures of adjacent vertebral bodies were found in 11%. Good preinterventional diagnostics and intraoperative imaging are necessary to make balloon kyphoplasty a successful application. (orig.) [German original, translated] Minimally invasive stabilizations represent an alternative to the previous treatment of osteoporotic vertebral fractures. Vertebroplasty and balloon kyphoplasty are two procedures for restoring the strength of vertebral bodies after osteoporotic compression fractures by introducing bone cement. Vertebroplasty is the older, technically simpler and less expensive technique, but is regularly accompanied by cement leakage. Balloon kyphoplasty is the newer, more cost-intensive technology, which, beyond pain reduction, also aims to restore the sagittal profile of the spine. With balloon kyphoplasty, the height of 101 fractured vertebral bodies could be raised to almost 90% of the target value and the local kyphosis reduced from 12 to 7 degrees. Pain, measured on a 10-point scale, was reduced from 7.2 to 2.5.
The Oswestry disability

  1. A minimal architecture for joint action

    DEFF Research Database (Denmark)

    Vesper, Cordula; Butterfill, Stephen; Knoblich, Günther

    2010-01-01

What kinds of processes and representations make joint action possible? In this paper we suggest a minimal architecture for joint action that focuses on representations, action monitoring and action prediction processes, as well as ways of simplifying coordination. The architecture spells out minimal requirements for an individual agent to engage in a joint action. We discuss existing evidence in support of the architecture as well as open questions that remain to be empirically addressed. In addition, we suggest possible interfaces between the minimal architecture and other approaches to joint action. The minimal architecture has implications for theorizing about the emergence of joint action, for human-machine interaction, and for understanding how coordination can be facilitated by exploiting relations between multiple agents' actions and between actions and the environment.

  2. An Approximate Proximal Bundle Method to Minimize a Class of Maximum Eigenvalue Functions

    Directory of Open Access Journals (Sweden)

    Wei Wang

    2014-01-01

We present an approximate nonsmooth algorithm to solve a minimization problem in which the objective function is the sum of a maximum eigenvalue function of matrices and a convex function. The essential idea is similar to that of the proximal bundle method, but the difference is that we choose an approximate subgradient and function value to construct an approximate cutting-plane model for the problem mentioned above. An important advantage of the approximate cutting-plane model of the objective function is that it is more stable than the exact cutting-plane model. In addition, an approximate proximal bundle algorithm is given, and the sequences generated by the algorithm converge to the optimal solution of the original problem.

  3. Estimation and Minimization of Embodied Carbon of Buildings: A Review

    Directory of Open Access Journals (Sweden)

    Ali Akbarnezhad

    2017-01-01

Building and construction is responsible for up to 30% of annual global greenhouse gas (GHG) emissions, commonly reported in carbon-equivalent units. Carbon emissions are incurred in all stages of a building's life cycle and are generally categorised into operating carbon and embodied carbon, each making a varying contribution to life cycle carbon depending on the building's characteristics. With recent advances in reducing the operating carbon of buildings, the available literature indicates a clear shift in attention towards investigating strategies to minimize embodied carbon. However, minimizing the embodied carbon of buildings is challenging and requires evaluating the effects of embodied carbon reduction strategies on the emissions incurred in different life cycle phases, as well as on the operating carbon of the building. In this paper, the available literature on strategies for reducing the embodied carbon of buildings, as well as methods for estimating the embodied carbon of buildings, is reviewed and the strengths and weaknesses of each method are highlighted.
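A minimal process-based estimate of embodied carbon multiplies each material quantity by a cradle-to-gate emission factor and sums the results; this is the simplest of the estimation methods such reviews cover. The quantities and factors below are round illustrative numbers, not data from the review.

```python
materials = {            # bill of quantities, in tonnes (illustrative)
    "concrete": 250.0,
    "steel": 20.0,
    "timber": 15.0,
}
factors = {              # cradle-to-gate emission factors, tCO2e/t (illustrative)
    "concrete": 0.15,
    "steel": 1.9,
    "timber": 0.45,
}

def embodied_carbon(materials, factors):
    # sum of quantity x emission factor over all materials
    return sum(qty * factors[m] for m, qty in materials.items())

total = embodied_carbon(materials, factors)   # tCO2e
```

Real assessments extend this with transport, construction, replacement, and end-of-life stages, which is where the trade-offs against operating carbon discussed in the paper arise.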

  4. Optimal reload and depletion method for pressurized water reactors

    International Nuclear Information System (INIS)

    Ahn, D.H.

    1984-01-01

A new method has been developed to automatically reload and deplete a PWR so that both the enriched inventory requirements during the reactor cycle and the cost of reloading the core are minimized. This is achieved through four stepwise optimization calculations: 1) determination of the minimum fuel requirement for an equivalent three-region core model, 2) optimal selection and allocation of fuel assemblies for each of the three regions to minimize the cost of the fresh reload fuel, 3) optimal placement of fuel assemblies to conserve region-wise optimal conditions, and 4) optimal control through poison management to deplete individual fuel assemblies to maximize EOC k/sub eff/. Optimizing the fuel cost of reloading and depleting a PWR reactor cycle requires solutions to two separate optimization calculations. One of these minimizes the enriched fuel inventory in the core by optimizing the EOC k/sub eff/. The other minimizes the cost of the fresh reload fuel. Both of these optimization calculations have now been combined to provide a new method for performing an automatic optimal reload of PWRs. The new method differs from previous methods in that the optimization process performs all tasks required to reload and deplete a PWR.

  5. Image denoising by a direct variational minimization

    Directory of Open Access Journals (Sweden)

    Pilipović Stevan

    2011-01-01

Abstract In this article we introduce a novel method for image denoising which combines the mathematical well-posedness of variational modeling with the efficiency of a patch-based approach in the field of image processing. It is based on a direct minimization of an energy functional containing a minimal surface regularizer that uses the fractional gradient. The minimization is performed on every predefined patch of the image independently. By doing so, we avoid the use of an artificial time PDE model, with its inherent problems of finding the optimal stopping time as well as the optimal time step. Moreover, we control the level of image smoothing on each patch (and thus on the whole image) by adapting the Lagrange multiplier using information on the level of discontinuities on a particular patch, which we obtain by pre-processing. In order to reduce the average number of vectors in the approximation generator and still obtain minimal degradation, we combine a Ritz variational method for the actual minimization on a patch with a complementary fractional variational principle. Thus, the proposed method becomes computationally feasible and applicable for practical purposes. We confirm our claims with experimental results, comparing the proposed method with several PDE-based methods, where we obtain significantly better denoising results, especially in oscillatory regions.
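The patch-wise strategy of minimizing an energy independently on each patch, with a smoothing weight adapted to the patch's discontinuity level, can be illustrated on a 1D signal. A quadratic (Tikhonov) regularizer replaces the paper's fractional-gradient minimal-surface term, and the crude max-min jump detector and all constants are simplifications invented for this sketch.

```python
def denoise_patch(f, lam, steps=400, lr=0.05):
    # gradient descent on E(u) = sum (u-f)^2 + lam * sum (u[i+1]-u[i])^2
    u = list(f)
    for _ in range(steps):
        g = [2 * (u[i] - f[i]) for i in range(len(u))]      # data term
        for i in range(len(u) - 1):                          # smoothness term
            d = 2 * lam * (u[i + 1] - u[i])
            g[i] -= d
            g[i + 1] += d
        u = [ui - lr * gi for ui, gi in zip(u, g)]
    return u

def denoise(signal, patch=8):
    out = []
    for s in range(0, len(signal), patch):
        p = signal[s:s + patch]
        jump = max(p) - min(p)             # crude discontinuity measure
        lam = 0.2 if jump > 1.0 else 2.0   # smooth flat patches harder
        out.extend(denoise_patch(p, lam))
    return out

# flat noisy segment around 0, then a second flat noisy segment around 5
noisy = [0.1, -0.1, 0.05, 0.0, -0.05, 0.1, 0.0, -0.1,
         5.1, 4.9, 5.05, 5.0, 4.95, 5.1, 5.0, 4.9]
clean = denoise(noisy)
```

Because each patch is processed independently, there is no global stopping-time question, and the per-patch weight plays the role of the adapted Lagrange multiplier in the paper.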

  6. Waste minimization handbook, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  7. Waste minimization handbook, Volume 1

    International Nuclear Information System (INIS)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996

  8. Chronic Morel-Lavallée Lesion: A Novel Minimally Invasive Method of Treatment.

    Science.gov (United States)

    Mettu, Ramireddy; Surath, Harsha Vardhan; Chayam, Hanumantha Rao; Surath, Amaranth

    2016-11-01

    A Morel-Lavallée lesion is a closed internal degloving injury resulting from a shearing force applied to the skin. The etiology of this condition may include motor vehicle accidents, falls, contact sports (ie, football, wrestling),1 and iatrogenic causes after mammoplasty or abdominal liposuction.2 Common sites of the lesions include the pelvis and/or thigh.3 Isolated Morel-Lavallée lesions without underlying fracture are likely to be missed, which results in chronicity. Management of this condition often requires extensive surgical procedures such as debridement, sclerotherapy, serial percutaneous drainage, negative pressure wound therapy (NPWT), and skin grafting.4,5 The authors wish to highlight a minimally invasive technique for the treatment of chronic Morel-Lavallée lesions.

  9. Waste minimization at Chalk River Laboratories

    Energy Technology Data Exchange (ETDEWEB)

    Kranz, P.; Wong, P.C.F. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2011-07-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. 
The overall refuse volume generated at…

  10. Waste minimization at Chalk River Laboratories

    International Nuclear Information System (INIS)

    Kranz, P.; Wong, P.C.F.

    2011-01-01

    Waste minimization supports Atomic Energy of Canada Limited (AECL) Environment Policy with regard to pollution prevention and has positive impacts on the environment, human health and safety, and economy. In accordance with the principle of pollution prevention, the quantities and degree of hazard of wastes requiring storage or disposition at facilities within or external to AECL sites shall be minimized, following the principles of Prevent, Reduce, Reuse, and Recycle, to the extent practical. Waste minimization is an important element in the Waste Management Program. The Waste Management Program has implemented various initiatives for waste minimization since 2007. The key initiatives have focused on waste reduction, segregation and recycling, and included: 1) developed waste minimization requirements and recycling procedure to establish the framework for applying the Waste Minimization Hierarchy; 2) performed waste minimization assessments for the facilities, which generate significant amounts of waste, to identify the opportunities for waste reduction and assist the waste generators to develop waste reduction targets and action plans to achieve the targets; 3) implemented the colour-coded, standardized waste and recycling containers to enhance waste segregation; 4) established partnership with external agents for recycling; 5) extended the likely clean waste and recyclables collection to selected active areas; 6) provided on-going communications to promote waste reduction and increase awareness for recycling; and 7) continually monitored performance, with respect to waste minimization, to identify opportunities for improvement and to communicate these improvements. After implementation of waste minimization initiatives at CRL, the solid waste volume generated from routine operations at CRL has significantly decreased, while the amount of recyclables diverted from the onsite landfill has significantly increased since 2007. 
The overall refuse volume generated at…

  11. Operating cost minimization of a radial distribution system in a deregulated electricity market through reconfiguration using NSGA method

    International Nuclear Information System (INIS)

    Chandramohan, S.; Atturulu, Naresh; Devi, R.P. Kumudini; Venkatesh, B.

    2010-01-01

    In the future, mechanisms for trade in ancillary services such as reactive power will be implemented in many deregulated power systems. In such an operating framework, a Distribution Corporation (DisCo) would have to purchase reactive power along with real power from the connected transmission corporation. A DisCo would want to minimize its operating costs by minimizing the total amount of real and reactive power drawn from the connected transmission system. Optimally reconfiguring the network will achieve such a goal. In this work, we use a non-dominated sorting genetic algorithm (NSGA) for reconfiguring a radial DisCo to minimize its operating costs, considering real and reactive power costs, while maximizing its operating reliability and satisfying the regular operating constraints. The method is tested on sample test systems and the results are reported. (author)
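The core step that distinguishes NSGA from a plain genetic algorithm is non-dominated sorting: candidate network configurations are ranked into Pareto fronts over the competing objectives (here, operating cost versus a reliability measure, both to be minimized). A minimal sketch of that sorting step, not the authors' implementation:

```python
def non_dominated_fronts(points):
    """Sort objective vectors (all to be minimized) into Pareto
    fronts, as in NSGA: front 0 is the non-dominated set, front 1
    is non-dominated once front 0 is removed, and so on.
    Returns lists of indices into `points`."""
    dominates = lambda a, b: all(x <= y for x, y in zip(a, b)) and a != b
    remaining = list(range(len(points)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(points[j], points[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts
```

Selection then prefers individuals in earlier fronts, so the population converges toward the cost/reliability trade-off surface rather than a single weighted-sum optimum.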

  12. Quality Assurance of Multiport Image-Guided Minimally Invasive Surgery at the Lateral Skull Base

    Directory of Open Access Journals (Sweden)

    Maria Nau-Hermes

    2014-01-01

    For multiport image-guided minimally invasive surgery at the lateral skull base a quality management is necessary to avoid the damage of closely spaced critical neurovascular structures. So far there is no standardized method applicable independently from the surgery. Therefore, we adapt a quality management method, the quality gates (QG), which is well established in, for example, the automotive industry and apply it to multiport image-guided minimally invasive surgery. QG divide a process into different sections. Passing between sections can only be achieved if previously defined requirements are fulfilled which secures the process chain. An interdisciplinary team of otosurgeons, computer scientists, and engineers has worked together to define the quality gates and the corresponding criteria that need to be fulfilled before passing each quality gate. In order to evaluate the defined QG and their criteria, the new surgery method was applied with a first prototype at a human skull cadaver model. We show that the QG method can ensure a safe multiport minimally invasive surgical process at the lateral skull base. Therewith, we present an approach towards the standardization of quality assurance of surgical processes.

  13. Quality assurance of multiport image-guided minimally invasive surgery at the lateral skull base.

    Science.gov (United States)

    Nau-Hermes, Maria; Schmitt, Robert; Becker, Meike; El-Hakimi, Wissam; Hansen, Stefan; Klenzner, Thomas; Schipper, Jörg

    2014-01-01

    For multiport image-guided minimally invasive surgery at the lateral skull base a quality management is necessary to avoid the damage of closely spaced critical neurovascular structures. So far there is no standardized method applicable independently from the surgery. Therefore, we adapt a quality management method, the quality gates (QG), which is well established in, for example, the automotive industry and apply it to multiport image-guided minimally invasive surgery. QG divide a process into different sections. Passing between sections can only be achieved if previously defined requirements are fulfilled which secures the process chain. An interdisciplinary team of otosurgeons, computer scientists, and engineers has worked together to define the quality gates and the corresponding criteria that need to be fulfilled before passing each quality gate. In order to evaluate the defined QG and their criteria, the new surgery method was applied with a first prototype at a human skull cadaver model. We show that the QG method can ensure a safe multiport minimally invasive surgical process at the lateral skull base. Therewith, we present an approach towards the standardization of quality assurance of surgical processes.

  14. Octasaccharide is the minimal length unit required for efficient binding of cyclophilin B to heparin and cell surface heparan sulphate.

    Science.gov (United States)

    Vanpouille, Christophe; Denys, Agnès; Carpentier, Mathieu; Pakula, Rachel; Mazurier, Joël; Allain, Fabrice

    2004-09-01

    Cyclophilin B (CyPB) is a heparin-binding protein first identified as a receptor for cyclosporin A. In previous studies, we reported that CyPB triggers chemotaxis and integrin-mediated adhesion of T-lymphocytes by way of interaction with two types of binding sites. The first site corresponds to a signalling receptor; the second site has been identified as heparan sulphate (HS) and appears crucial to induce cell adhesion. Characterization of the HS-binding unit is critical to understand the requirement of HS in pro-adhesive activity of CyPB. By using a strategy based on gel mobility shift assays with fluorophore-labelled oligosaccharides, we demonstrated that the minimal heparin unit required for efficient binding of CyPB is an octasaccharide. The mutants CyPB(KKK-) [where KKK- refers to the substitutions K3A(Lys3-->Ala)/K4A/K5A] and CyPB(DeltaYFD) (where Tyr14-Phe-Asp16 has been deleted) failed to interact with octasaccharides, confirming that the Y14FD16 and K3KK5 clusters are required for CyPB binding. Molecular modelling revealed that both clusters are spatially arranged so that they may act synergistically to form a binding site for the octasaccharide. We then demonstrated that heparin-derived octasaccharides and higher degree of polymerization oligosaccharides inhibited the interaction between CyPB and fluorophore-labelled HS chains purified from T-lymphocytes, and strongly reduced the HS-dependent pro-adhesive activity of CyPB. However, oligosaccharides or heparin were unable to restore adhesion of heparinase-treated T-lymphocytes, indicating that HS has to be present on the cell membrane to support the pro-adhesive activity of CyPB. Altogether, these results demonstrate that the octasaccharide is likely to be the minimal length unit required for efficient binding of CyPB to cell surface HS and consequent HS-dependent cell responses.

  15. Tracking-by-detection of surgical instruments in minimally invasive surgery via the convolutional neural network deep learning-based method.

    Science.gov (United States)

    Zhao, Zijian; Voros, Sandrine; Weng, Ying; Chang, Faliang; Li, Ruijian

    2017-12-01

    Worldwide propagation of minimally invasive surgeries (MIS) is hindered by their drawback of indirect observation and manipulation, and the monitoring of surgical instruments moving in the operated body, as required by surgeons, is a challenging problem. Tracking of surgical instruments by vision-based methods is quite attractive, due to its flexible implementation via software-based control with no need to modify instruments or the surgical workflow. A MIS instrument is conventionally split into a shaft and an end-effector portion, and a 2D/3D tracking-by-detection framework is proposed which performs shaft tracking followed by end-effector tracking. The former portion is described by line features via the RANSAC scheme, while the latter is depicted by special image features based on deep learning through a well-trained convolutional neural network. The method is verified in 2D and 3D formulations through experiments on ex-vivo video sequences, and qualitative validation on in-vivo video sequences is also obtained. The proposed method provides robust and accurate tracking, which is confirmed by the experimental results: its 3D performance on ex-vivo video sequences exceeds that of the available state-of-the-art methods. Moreover, the experiments on in-vivo sequences demonstrate that the proposed method can tackle the difficult condition of tracking with unknown camera parameters. Further refinements of the method will address occlusion and multi-instrument MIS applications.
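The RANSAC line fitting used for the shaft portion can be illustrated generically: repeatedly sample two points, hypothesize the line through them, and keep the hypothesis with the most inliers. This is a textbook 2D sketch, not the authors' code; point sets, tolerances, and parameters are illustrative.

```python
import random
import math

def ransac_line(points, iters=200, tol=2.0, seed=0):
    """Fit a 2D line to `points` with RANSAC. Returns ((a, b, c), inliers)
    where a*x + b*y + c = 0 with (a, b) unit-normalized, and `inliers`
    are the points within perpendicular distance `tol` of the line."""
    rng = random.Random(seed)
    best = (None, [])
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2          # normal to the segment direction
        n = math.hypot(a, b)
        if n == 0:
            continue                      # degenerate sample
        a, b = a / n, b / n
        c = -(a * x1 + b * y1)
        inliers = [p for p in points if abs(a * p[0] + b * p[1] + c) <= tol]
        if len(inliers) > len(best[1]):
            best = ((a, b, c), inliers)
    return best
```

In the instrument-tracking setting the input points would come from edge detection along the shaft, and the consensus step rejects clutter from tissue and specular highlights in the same way it rejects the synthetic outliers below.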

  16. A simplified density matrix minimization for linear scaling self-consistent field theory

    International Nuclear Information System (INIS)

    Challacombe, M.

    1999-01-01

    A simplified version of the Li, Nunes and Vanderbilt [Phys. Rev. B 47, 10891 (1993)] and Daw [Phys. Rev. B 47, 10895 (1993)] density matrix minimization is introduced that requires four fewer matrix multiplies per minimization step relative to previous formulations. The simplified method also exhibits superior convergence properties, such that the bulk of the work may be shifted to the quadratically convergent McWeeny purification, which brings the density matrix to idempotency. Both orthogonal and nonorthogonal versions are derived. The AINV algorithm of Benzi, Meyer, and Tuma [SIAM J. Sci. Comp. 17, 1135 (1996)] is introduced to linear scaling electronic structure theory, and found to be essential in transformations between orthogonal and nonorthogonal representations. These methods have been developed with an atom-blocked sparse matrix algebra that achieves sustained floating-point operation rates as high as 50% of theoretical peak, and implemented in the MondoSCF suite of linear scaling SCF programs. For the first time, linear scaling Hartree-Fock theory is demonstrated with three-dimensional systems, including water clusters and estane polymers. The nonorthogonal minimization is shown to be uncompetitive with minimization in an orthonormal representation. An early onset of linear scaling is found for both minimal and double zeta basis sets, and crossovers with a highly optimized eigensolver are achieved. Calculations with up to 6000 basis functions are reported. The scaling of errors with system size is investigated for various levels of approximation. copyright 1999 American Institute of Physics
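The McWeeny purification mentioned above iterates P ← 3P² − 2P³, which drives a nearly idempotent density matrix to idempotency (P² = P): eigenvalues near 0 flow to 0 and eigenvalues near 1 flow to 1, quadratically. A minimal dense-matrix sketch (the paper's implementation works in atom-blocked sparse algebra instead):

```python
import numpy as np

def mcweeny_purify(P, tol=1e-12, max_iter=100):
    """Iterate P <- 3P^2 - 2P^3 until P is idempotent (P^2 = P).
    The polynomial 3x^2 - 2x^3 has attracting fixed points at 0 and 1,
    so eigenvalues in roughly (-1/2, 3/2) are purified to {0, 1}."""
    for _ in range(max_iter):
        P2 = P @ P
        if np.linalg.norm(P2 - P) < tol:
            break
        P = 3 * P2 - 2 * (P2 @ P)
    return P
```

Because the iteration is just matrix multiplication, each step costs the same sparse multiplies as the minimization itself, which is why shifting work into the purification preserves linear scaling.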

  17. A new smoothing modified three-term conjugate gradient method for the l1-norm minimization problem.

    Science.gov (United States)

    Du, Shouqiang; Chen, Miao

    2018-01-01

    We consider a kind of nonsmooth optimization problem with l1-norm minimization, which has many applications in compressed sensing, signal reconstruction, and related engineering problems. Using smoothing approximation techniques, this kind of nonsmooth optimization problem can be transformed into a general unconstrained optimization problem, which can be solved by the proposed smoothing modified three-term conjugate gradient method. The smoothing modified three-term conjugate gradient method is based on the Polak-Ribière-Polyak conjugate gradient method. Because the Polak-Ribière-Polyak conjugate gradient method has good numerical properties, the proposed method possesses the sufficient descent property without any line search, and it is also proved to be globally convergent. Finally, numerical experiments show the efficiency of the proposed method.
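The smoothing idea is to replace each nonsmooth term |x_i| with a smooth surrogate such as √(x_i² + μ²), after which any gradient-based method applies. The sketch below uses plain gradient descent on the smoothed l1-regularized least-squares objective purely to illustrate the smoothing step; the paper's solver is a smoothing modified three-term conjugate gradient method, not this one.

```python
import numpy as np

def smoothed_l1_descent(A, b, tau=0.1, mu=1e-3, iters=500):
    """Minimize 0.5*||Ax - b||^2 + tau * sum_i sqrt(x_i^2 + mu^2),
    a smooth surrogate of l1-regularized least squares, by plain
    gradient descent (illustrative stand-in for the paper's
    three-term conjugate gradient scheme)."""
    # step below the inverse Lipschitz constant of the gradient:
    # ||A||^2 from the data term, tau/mu from the smoothed l1 term
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + tau / mu)
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + tau * x / np.sqrt(x * x + mu * mu)
        x -= step * grad
    return x
```

Note the trade-off the smoothing parameter μ introduces: smaller μ approximates |x| better but makes the gradient stiffer (Lipschitz constant τ/μ), which is exactly the regime where a conjugate gradient method pays off over plain descent.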

  18. Minimal Coleman-Weinberg theory explains the diphoton excess

    DEFF Research Database (Denmark)

    Antipin, Oleg; Mojaza, Matin; Sannino, Francesco

    2016-01-01

    It is possible to delay the hierarchy problem, by replacing the standard Higgs-sector by the Coleman-Weinberg mechanism, and at the same time ensure perturbative naturalness through the so-called Veltman conditions. As we showed in a previous study, minimal models of this type require the introdu…

  19. Minimal abdominal incisions

    Directory of Open Access Journals (Sweden)

    João Carlos Magi

    2017-04-01

    Minimally invasive procedures aim to resolve the disease with minimal trauma to the body, resulting in a rapid return to activities and in reductions of infection, complications, costs, and pain. Minimally incised laparotomy, sometimes referred to as minilaparotomy, is an example of such minimally invasive procedures. The aim of this study is to demonstrate the feasibility and utility of laparotomy with minimal incision based on the literature, exemplified with a case. The case in question describes reconstruction of the intestinal transit with the use of this incision: a young, HIV-positive male patient in the late postoperative period after ileotiflectomy, terminal ileostomy, and closure of the ascending colon for an acute perforated abdomen due to ileocolonic tuberculosis. The barium enema showed a proximal stump of the right colon near the ileostomy. Access to the cavity was gained through the orifice resulting from the release of the stoma, with a side-to-side ileocolonic anastomosis using a 25 mm circular stapler and manual closure of the ileal stump. These surgeries require their own tactics, such as rigor in the lysis of adhesions, tissue traction, and hemostasis, in addition to requiring surgeon dexterity, but without the need for investments in technology; moreover, the learning curve is reported as being shorter than that for videolaparoscopy. Laparotomy with minimal incision should be considered a valid and viable option in the treatment of surgical conditions.

  20. The Sources and Methods of Engineering Design Requirement

    DEFF Research Database (Denmark)

    Li, Xuemeng; Zhang, Zhinan; Ahmed-Kristensen, Saeema

    2014-01-01

    …to be defined in a new context. This paper focuses on understanding design requirement sources at the requirement elicitation phase. It aims at proposing an improved classification of design requirement sources that considers emerging markets, and at presenting current methods for eliciting requirements for each source…

  1. Using SETS to find minimal cut sets in large fault trees

    International Nuclear Information System (INIS)

    Worrell, R.B.; Stack, D.W.

    1978-01-01

    An efficient algebraic algorithm for finding the minimal cut sets for a large fault tree was defined and a new procedure which implements the algorithm was added to the Set Equation Transformation System (SETS). The algorithm includes the identification and separate processing of independent subtrees, the coalescing of consecutive gates of the same kind, the creation of additional independent subtrees, and the derivation of the fault tree stem equation in stages. The computer time required to determine the minimal cut sets using these techniques is shown to be substantially less than the computer time required to determine the minimal cut sets when these techniques are not employed. It is shown for a given example that the execution time required to determine the minimal cut sets can be reduced from 7,686 seconds to 7 seconds when all of these techniques are employed
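The cut-set expansion underlying this kind of algorithm can be illustrated on a toy AND/OR tree: OR gates union their children's cut sets, AND gates take the cross-product, and a final filter discards non-minimal supersets. This naive sketch omits the optimizations the abstract describes (independent-subtree processing, gate coalescing, staged derivation), which are what make SETS practical on large trees.

```python
def minimal_cut_sets(tree):
    """Expand an AND/OR fault tree into its minimal cut sets.
    `tree` is either a basic-event name (str) or a tuple
    ('AND'|'OR', [children]). Returns a set of frozensets of
    basic events."""
    if isinstance(tree, str):
        return {frozenset([tree])}
    op, children = tree
    if op == 'OR':
        cuts = set()
        for c in children:
            cuts |= minimal_cut_sets(c)
    else:  # 'AND': cross-product of the children's cut sets
        cuts = {frozenset()}
        for c in children:
            cuts = {a | b for a in cuts for b in minimal_cut_sets(c)}
    # keep only minimal sets: drop any cut set that contains another
    return {c for c in cuts if not any(other < c for other in cuts)}
```

The cross-product step is where the combinatorial explosion arises; identifying independent subtrees lets an implementation expand each subtree once and substitute the result, rather than re-expanding it inside every product.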

  2. A Parsimonious Model of the Rabbit Action Potential Elucidates the Minimal Physiological Requirements for Alternans and Spiral Wave Breakup.

    Science.gov (United States)

    Gray, Richard A; Pathmanathan, Pras

    2016-10-01

    Elucidating the underlying mechanisms of fatal cardiac arrhythmias requires a tight integration of electrophysiological experiments, models, and theory. Existing models of the transmembrane action potential (AP) are complex (resulting in overparameterization) and varied (leading to dissimilar predictions). Thus, simpler models are needed to elucidate the "minimal physiological requirements" to reproduce significant observable phenomena using as few parameters as possible. Moreover, models have been derived from experimental studies in a variety of species under a range of environmental conditions (for example, all existing rabbit AP models incorporate a formulation of the rapid sodium current, INa, based on 30-year-old data from chick embryo cell aggregates). Here we develop a simple "parsimonious" rabbit AP model that is mathematically identifiable (i.e., not overparameterized) by combining a novel Hodgkin-Huxley formulation of INa with a phenomenological model of repolarization similar to the voltage-dependent, time-independent rectifying outward potassium current (IK). The model was calibrated using the following experimental data sets measured from the same species (rabbit) under physiological conditions: dynamic current-voltage (I-V) relationships during the AP upstroke; rapid recovery of AP excitability during the relative refractory period; and steady-state INa inactivation via voltage clamp. Simulations reproduced several important "emergent" phenomena, including cellular alternans at rates > 250 bpm as observed in rabbit myocytes, reentrant spiral waves as observed on the surface of the rabbit heart, and spiral wave breakup. Model variants were studied which elucidated the minimal requirements for alternans and spiral wave breakup, namely the kinetics of INa inactivation and the non-linear rectification of IK. The simplicity of the model, and the fact that its parameters have physiological meaning, make it ideal for engendering generalizable mechanistic…

  3. The minimally tuned minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Essig, Rouven; Fortin, Jean-Francois

    2008-01-01

    The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV

  4. Minimal entropy approximation for cellular automata

    International Nuclear Information System (INIS)

    Fukś, Henryk

    2014-01-01

    We present a method for the construction of approximate orbits of measures under the action of cellular automata which is complementary to the local structure theory. The local structure theory is based on the idea of Bayesian extension, that is, construction of a probability measure consistent with given block probabilities and maximizing entropy. If instead of maximizing entropy one minimizes it, one can develop another method for the construction of approximate orbits, at the heart of which is the iteration of finite-dimensional maps, called minimal entropy maps. We present numerical evidence that the minimal entropy approximation sometimes outperforms the local structure theory in characterizing the properties of cellular automata. The density response curve for elementary CA rule 26 is used to illustrate this claim. (paper)
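The density response curve for rule 26 mentioned above can be reproduced by direct simulation of the elementary CA. A minimal stepper using the standard Wolfram rule-number encoding (the neighborhood's 3-bit value indexes into the bits of the rule number):

```python
def eca_step(cells, rule=26):
    """One synchronous update of an elementary cellular automaton
    with periodic boundaries; `rule` is the Wolfram rule number,
    whose bit v gives the output for neighborhood value v."""
    n = len(cells)
    return [(rule >> (4 * cells[(i - 1) % n]
                      + 2 * cells[i]
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

def density(cells):
    """Fraction of cells in state 1."""
    return sum(cells) / len(cells)
```

Iterating `eca_step` from random initial configurations of varying density and recording `density` after each step gives exactly the kind of orbit-of-measures data that the local structure theory and the minimal entropy approximation both try to predict.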

  5. A Survey of Various Object Oriented Requirement Engineering Methods

    OpenAIRE

    Anandi Mahajan; Dr. Anurag Dixit

    2013-01-01

    In recent years many industries have been moving to the use of object-oriented methods for the development of large-scale information systems. The requirement for an object-oriented approach in the development of software systems is increasing day by day. This paper is essentially a survey of various object-oriented requirement engineering methods. It contains a summary of the available object-oriented requirement engineering methods with their relative advantages and disadvantages...

  6. Silver nanoparticles: an alternative method for sanitization of minimally processed cabbage

    Directory of Open Access Journals (Sweden)

    Emiliane Andrade Araújo

    2015-06-01

    The minimal processing of vegetables basically aims to extend food shelf life, which depends on a number of factors, such as sanitization, which is considered a critical step for food microbiological quality. However, the usual antimicrobial agents reduce the microbial population by at most two logarithmic cycles. Therefore, it is necessary to develop alternative sanitizers. This study aimed to improve the safety of minimally processed cabbage through sanitization with silver nanoparticles. It was observed that the nanoparticles promoted three logarithmic reductions, i.e. a 99.9% reduction, in the Escherichia coli population inoculated on the cabbage surface. When compared to other antimicrobial agents (sodium dichloroisocyanurate and sodium hypochlorite), the nanoparticles were more efficient in sanitizing minimally processed cabbage, showing a lower count of aerobic mesophiles. It was also observed that the cabbage surface presents hydrophobic characteristics, resulting in a higher propensity for bacterial adhesion, which was confirmed in the thermodynamic evaluation of favorable adhesion for Staphylococcus aureus, Escherichia coli, and Listeria innocua.

  7. New hybrid frequency reuse method for packet loss minimization in LTE network.

    Science.gov (United States)

    Ali, Nora A; El-Dakroury, Mohamed A; El-Soudani, Magdi; ElSayed, Hany M; Daoud, Ramez M; Amer, Hassanein H

    2015-11-01

    This paper investigates the problem of inter-cell interference (ICI) in Long Term Evolution (LTE) mobile systems, which is one of the main causes of packet loss between the base station and the mobile station. Recently, different frequency reuse methods, such as soft and fractional frequency reuse, have been introduced in order to mitigate this type of interference. In this paper, minimizing the packet loss between the base station and the mobile station is the main concern. Soft Frequency Reuse (SFR), the most popular frequency reuse method, is examined and the amount of packet loss is measured. In order to reduce packet loss, a new hybrid frequency reuse method is implemented. In this method, each cell occupies the same bandwidth as in SFR, but the total system bandwidth is greater than in SFR. This provides the new method with many additional sub-carriers from the neighboring cells to reduce the ICI, which is a serious problem in many applications and a major source of packet loss. It is found that the new hybrid frequency reuse method noticeably reduces packet loss compared to the SFR method in the different frequency bands. Traffic congestion management in Intelligent Transportation Systems (ITS) is one of the important applications affected by packet loss, due to the large amount of traffic exchanged between the base station and the mobile node. Therefore, it is used as the studied application for the proposed frequency reuse method, and the improvement in packet loss reached 49.4% in some frequency bands using the new hybrid frequency reuse method.

  8. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  9. Basic requirements to the methods of personnel monitoring

    International Nuclear Information System (INIS)

    Keirim-Markus, I.B.

    1981-01-01

    Requirements for personnel monitoring methods (PMM) are given as a function of irradiation conditions. The irradiation conditions determine the types of irradiation subject to monitoring, the measurement ranges, the periodicity of monitoring, how quickly results must be obtained, and the required accuracy. PMM based on the photographic effect of ionizing radiation is the main method for mass monitoring.

  10. Waste minimization and pollution prevention awareness plan. Revision 1

    International Nuclear Information System (INIS)

    1994-07-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Waste Minimization and Pollution Prevention Awareness Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, estimate budgets, and review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities

  11. Charge transfer interaction using quasiatomic minimal-basis orbitals in the effective fragment potential method

    International Nuclear Information System (INIS)

    Xu, Peng; Gordon, Mark S.

    2013-01-01

    The charge transfer (CT) interaction, the most time-consuming term in the general effective fragment potential method, is made much more computationally efficient. This is accomplished by the projection of the quasiatomic minimal-basis-set orbitals (QUAMBOs) as the atomic basis onto the self-consistent field virtual molecular orbital (MO) space to select a subspace of the full virtual space called the valence virtual space. The diagonalization of the Fock matrix in terms of QUAMBOs recovers the canonical occupied orbitals and, more importantly, gives rise to the valence virtual orbitals (VVOs). The CT energies obtained using VVOs are generally as accurate as those obtained with the full virtual space canonical MOs because the QUAMBOs span the valence part of the virtual space, which can generally be regarded as “chemically important.” The number of QUAMBOs is the same as the number of minimal-basis MOs of a molecule. Therefore, the number of VVOs is significantly smaller than the number of canonical virtual MOs, especially for large atomic basis sets. This leads to a dramatic decrease in the computational cost.
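
    The valence-virtual-space construction can be pictured with a small linear-algebra sketch. This is not the authors' implementation; the orbital matrices below are random stand-ins, and only the dimension counting and the projection step mirror the abstract (a minimal sketch, assuming orthonormal canonical MOs):

```python
import numpy as np

rng = np.random.default_rng(0)

n_basis, n_occ, n_min = 30, 5, 10   # large AO basis, occupied MOs, minimal-basis size
n_virt = n_basis - n_occ

# Hypothetical orthonormal canonical MOs (columns): occupied, then virtual
mo = np.linalg.qr(rng.standard_normal((n_basis, n_basis)))[0]
c_occ, c_virt = mo[:, :n_occ], mo[:, n_occ:]

# Hypothetical minimal-basis (QUAMBO-like) orbitals in the same AO basis
a_min = rng.standard_normal((n_basis, n_min))

# Project the minimal-basis orbitals onto the virtual MO space ...
proj = c_virt.T @ a_min             # (n_virt, n_min) overlaps with virtuals

# ... and keep the strongly overlapping directions: the valence virtual
# space, of dimension n_min - n_occ, far smaller than n_virt.
u, s, _ = np.linalg.svd(proj, full_matrices=False)
n_vvo = n_min - n_occ
vvo = c_virt @ u[:, :n_vvo]         # valence virtual orbitals in the AO basis

print(vvo.shape)                    # (30, 5): 5 VVOs instead of 25 virtuals
```

    The point of the sketch is the bookkeeping: the number of VVOs equals the minimal-basis count minus the occupied count, which is why the CT cost drops sharply for large basis sets.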

  12. Minimally verbal school-aged children with autism spectrum disorder: the neglected end of the spectrum.

    Science.gov (United States)

    Tager-Flusberg, Helen; Kasari, Connie

    2013-12-01

    It is currently estimated that about 30% of children with autism spectrum disorder remain minimally verbal, even after receiving years of interventions and a range of educational opportunities. Very little is known about the individuals at this end of the autism spectrum, in part because this is a highly variable population with no single set of defining characteristics or patterns of skills or deficits, and in part because it is extremely challenging to provide reliable or valid assessments of their developmental functioning. In this paper, we summarize current knowledge based on research including minimally verbal children. We review promising novel methods for assessing the verbal and nonverbal abilities of minimally verbal school-aged children, including eye-tracking and brain-imaging methods that do not require overt responses. We then review what is known about interventions that may be effective in improving language and communication skills, including discussion of both nonaugmentative and augmentative methods. In the final section of the paper, we discuss the gaps in the literature and needs for future research. © 2013 International Society for Autism Research, Wiley Periodicals, Inc.

  13. Methods for the minimization of radioactive waste from decontamination and decommissioning of nuclear facilities

    International Nuclear Information System (INIS)

    2001-01-01

    The objective of this report is to provide Member States and their decision makers (ranging from regulators, strategists, planners and designers, to operators) with relevant information on opportunities for minimizing radioactive wastes arising from the D and D of nuclear facilities. This will allow waste minimization options to be properly planned and assessed as part of national, site and plant waste management policies. This objective will be achieved by: reviewing the sources and characteristics of radioactive materials arising from D and D activities; reviewing waste minimization principles and current practical applications, together with regulatory, technical, financial and political factors influencing waste minimization practices; and reviewing current trends in improving waste minimization practices during D and D.

  14. Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    Online: 02 April (2018) ISSN 1017-1398 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : Unconstrained minimization * Block variable metric methods * Limited-memory methods * BFGS update * Global convergence * Numerical results Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics Impact factor: 1.241, year: 2016

  15. A Minimally Invasive, Translational Method to Deliver Hydrogels to the Heart Through the Pericardial Space

    Directory of Open Access Journals (Sweden)

    Jose R. Garcia, MS

    2017-10-01

    Biomaterials are a new treatment strategy for cardiovascular diseases but are difficult to deliver to the heart in a safe, precise, and translatable way. We developed a method to deliver hydrogels to the epicardium through the pericardial space. Our device creates a temporary compartment for hydrogel delivery and gelation using anatomic structures. The method minimizes risk to patients from embolization, thrombotic occlusion, and arrhythmia. In pigs there were no clinically relevant acute or subacute adverse effects from pericardial hydrogel delivery, making this a translatable strategy to deliver biomaterials to the heart.

  16. Minimalism

    CERN Document Server

    Obendorf, Hartmut

    2009-01-01

    The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.

  17. H∞ memory feedback control with input limitation minimization for offshore jacket platform stabilization

    Science.gov (United States)

    Yang, Jia Sheng

    2018-06-01

    In this paper, we investigate an H∞ memory controller with input limitation minimization (HMCIM) for offshore jacket platform stabilization. The main objective of this study is to reduce the control consumption as well as protect the actuator while satisfying the requirements on system performance. First, we introduce a dynamic model of the offshore platform with low-order main modes based on a mode reduction method in numerical analysis. Then, based on H∞ control theory and matrix inequality techniques, we develop a novel H∞ memory controller with input limitation. Furthermore, a non-convex optimization model to minimize input energy consumption is proposed. Since it is difficult to solve this non-convex optimization model directly, we use a relaxation method with matrix operations to transform it into a convex optimization model. Thus, it can be solved by a standard convex optimization solver in MATLAB or CPLEX. Finally, several numerical examples are given to validate the proposed models and methods.

  18. RE-EDUCATIVE METHOD IN THE PROCESS OF MINIMIZING AUTOAGGRESSIVE WAYS OF BEHAVIOR

    Directory of Open Access Journals (Sweden)

    Nenad GLUMBIC

    1999-05-01

    Autoaggressive behavior is a relatively frequent symptom of the mental and behavioral disturbances that are the subject of professional engagement of clinically oriented defectologists. In the rehabilitation process numerous methods are used, from behavioral to psychopharmacological ones, by which the above-mentioned problems are eliminated or softened. The paper deals with four children with different diagnoses (autism, disintegrative psychosis, Patau syndrome, and amaurosis) that share a common denominator: mental retardation and autoaggression. Through the description of a case study, as well as the ways of working with these children, we have tried to point out the possibility of applying particular methods of general and special re-education of psychomotorics in the process of minimizing autoaggressive ways of behavior. The paper gives the author's notion of the indications for applying the re-educative method within the multihandicapped child population. Defectological treatment discovers new forms of existence in the existential field, not only for the retarded child but also for the therapist. The epistemological consequences of the mentioned transfer are detailed in the paper.

  19. An alternating minimization method for blind deconvolution from Poisson data

    International Nuclear Information System (INIS)

    Prato, Marco; La Camera, Andrea; Bonettini, Silvia

    2014-01-01

    Blind deconvolution is a particularly challenging inverse problem since information on both the desired target and the acquisition system has to be inferred from the measured data. When the collected data are affected by Poisson noise, this problem is typically addressed by the minimization of the Kullback-Leibler divergence, in which the unknowns are sought in particular feasible sets depending on the a priori information provided by the specific application. If these sets are separated, then the resulting constrained minimization problem can be addressed with an inexact alternating strategy. In this paper we apply this optimization tool to the problem of reconstructing astronomical images from adaptive optics systems, and we show that the proposed approach succeeds in providing very good results in the blind deconvolution of nondense stellar clusters.
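
    The alternating strategy described above can be illustrated with the classical Richardson-Lucy multiplicative updates, which decrease the Kullback-Leibler divergence for Poisson data one block (object or point-spread function) at a time. This is a hedged 1-D toy sketch with an invented scene, not the authors' algorithm or astronomical pipeline:

```python
import numpy as np

def conv(x, k):
    """Circular 1-D convolution via FFT."""
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(k)))

def flip(k):
    """Adjoint (circularly reversed) kernel for circular convolution."""
    return np.roll(k[::-1], 1)

def blind_rl(g, n_outer=15, n_inner=5):
    """Alternating Richardson-Lucy updates: each inner loop decreases the
    Kullback-Leibler divergence in one block (object or PSF) with the other
    held fixed; the multiplicative form keeps both estimates nonnegative."""
    n = g.size
    f = np.full(n, max(g.mean(), 1e-6))   # object estimate
    h = np.full(n, 1.0 / n)               # PSF estimate, normalized
    for _ in range(n_outer):
        for _ in range(n_inner):          # object step, PSF fixed
            f = np.maximum(f * conv(g / np.maximum(conv(f, h), 1e-12), flip(h)), 0.0)
        for _ in range(n_inner):          # PSF step, object fixed
            h = np.maximum(h * conv(g / np.maximum(conv(f, h), 1e-12), flip(f)), 0.0)
            h /= h.sum()                  # keep the PSF normalized
    return f, h

# Hypothetical scene: two point sources blurred by a 3-sample box PSF
rng = np.random.default_rng(1)
truth = np.zeros(64); truth[20] = 50.0; truth[40] = 30.0
psf = np.zeros(64); psf[:3] = 1.0 / 3.0
g = rng.poisson(np.maximum(conv(truth, psf), 0.0)).astype(float)
f_est, h_est = blind_rl(g)
```

    The feasible-set separation from the abstract appears here in miniature: the object block is constrained only to nonnegativity, while the PSF block is additionally renormalized after each update.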

  20. emMAW: computing minimal absent words in external memory.

    Science.gov (United States)

    Héliou, Alice; Pissis, Solon P; Puglisi, Simon J

    2017-09-01

    The biological significance of minimal absent words has been investigated in genomes of organisms from all domains of life. For instance, three minimal absent words of the human genome were found in Ebola virus genomes. There exists an O(n)-time and O(n)-space algorithm for computing all minimal absent words of a sequence of length n on a fixed-sized alphabet based on suffix arrays. A standard implementation of this algorithm, when applied to a large sequence of length n, requires more than 20n bytes of RAM. Such memory requirements are a significant hurdle to the computation of minimal absent words in large datasets. We present emMAW, the first external-memory algorithm for computing minimal absent words. A free open-source implementation of our algorithm is made available. This allows for computation of minimal absent words on far bigger data sets than was previously possible. Our implementation requires less than 3 h on a standard workstation to process the full human genome when as little as 1 GB of RAM is made available. We stress that our implementation, despite making use of external memory, is fast; indeed, even on relatively smaller datasets when enough RAM is available to hold all necessary data structures, it is less than two times slower than state-of-the-art internal-memory implementations. https://github.com/solonas13/maw (free software under the terms of the GNU GPL). alice.heliou@lix.polytechnique.fr or solon.pissis@kcl.ac.uk. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
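
    For intuition about what emMAW computes, here is a brute-force in-memory sketch of minimal absent words. It uses the defining property directly (w is absent while w[:-1] and w[1:] occur) and needs quadratic space, unlike the O(n) suffix-array algorithms the paper builds on; the example string is invented:

```python
def minimal_absent_words(s, alphabet):
    """Brute-force minimal absent words (illustration only; real tools such
    as MAW/emMAW use suffix arrays and run in O(n) time and space).

    A word w (|w| >= 2) is a minimal absent word of s if w does not occur
    in s but both w[:-1] and w[1:] do; length-1 MAWs are unused letters."""
    n = len(s)
    factors = {s[i:j] for i in range(n) for j in range(i + 1, n + 1)}
    maws = {c for c in alphabet if c not in factors}        # length 1
    for u in factors:                                       # length >= 2
        for c in alphabet:
            w = u + c
            if w not in factors and w[1:] in factors:
                maws.add(w)
    return maws

print(sorted(minimal_absent_words("abaab", "ab")))
# -> ['aaa', 'aaba', 'bab', 'bb']
```

    Note how each result is "minimal": for example, 'bb' never occurs in "abaab", yet its proper factor 'b' does, so no shorter witness of its absence exists.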

  1. Non-technical skills in minimally invasive surgery teams

    DEFF Research Database (Denmark)

    Gjeraa, Kirsten; Spanager, Lene; Konge, Lars

    2016-01-01

    BACKGROUND: Root cause analyses show that up to 70 % of adverse events are caused by human error. Strong non-technical skills (NTS) can prevent or reduce these errors, considerable numbers of which occur in the operating theatre. Minimally invasive surgery (MIS) requires manipulation of more... complex equipment than open procedures, likely requiring a different set of NTS for each kind of team. The aims of this study were to identify the MIS teams' key NTS and investigate the effect of training and assessment of NTS on MIS teams. METHODS: The databases of PubMed, Cochrane Library, Embase, Psyc... were included. All were observational studies without blinding, and they differed in aims, types of evaluation, and outcomes. Only two studies evaluated patient outcomes other than operative time, and overall, the studies' quality of evidence was low. Different communication types were encountered

  2. Drilling and coring methods that minimize the disturbance of cuttings, core, and rock formation in the unsaturated zone, Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    Hammermeister, D.P.; Blout, D.O.; McDaniel, J.C.

    1985-01-01

    A drilling-and-casing method (Odex 115 system) utilizing air as a drilling fluid was used successfully to drill through various rock types within the unsaturated zone at Yucca Mountain, Nevada. This paper describes this method and the equipment used to rapidly penetrate bouldery alluvial-colluvial deposits, poorly consolidated bedded and nonwelded tuff, and fractured, densely welded tuff to depths of about 130 meters. A comparison of water-content and water-potential data from drill cuttings with similar measurements on rock cores indicates that drill cuttings were only slightly disturbed for several of the rock types penetrated. Coring, sampling, and handling methods were devised to obtain minimally disturbed drive core from bouldery alluvial-colluvial deposits. Bulk-density values obtained from bulk samples dug from nearby trenches were compared to bulk-density values obtained from drive core to determine the effects of drive coring on the porosity of the core. Rotary coring methods utilizing a triple-tube core barrel and air as the drilling fluid were used to obtain core from welded and nonwelded tuff. Results indicate that the disturbance of the water content of the core was minimal. Water-content distributions in alluvium-colluvium were determined before drilling occurred by drive-core methods. After drilling, water-content distributions were determined by nuclear-logging methods. A comparison of the water-content distributions made before and after drilling indicates that Odex 115 drilling minimally disturbs the water content of the formation rock. 10 refs., 12 figs., 4 tabs

  3. A Method for Software Requirement Volatility Analysis Using QFD

    Directory of Open Access Journals (Sweden)

    Yunarso Anang

    2016-10-01

    Changes to software requirements are inevitable during the development life cycle. Rather than avoiding the circumstance, it is easier to accept it and find a way to anticipate those changes. This paper proposes a method to analyze the volatility of requirements by using the Quality Function Deployment (QFD) method and an introduced degree of volatility. Customer requirements are deployed to software functions and subsequently to architectural design elements. Then, after determining the potential for changes of the design elements, the degree of volatility of the software requirements is calculated. In this paper the method is described using a flow diagram, illustrated using a simple example, and evaluated using a case study.
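
    The deployment-and-propagation idea can be sketched numerically. The matrices and change potentials below are invented, and the normalization is one plausible reading of a "degree of volatility", not the paper's exact formula:

```python
import numpy as np

# Hypothetical QFD deployment matrices (relationship strengths 0/1/3/9, as
# commonly used in QFD; rows = upstream items, columns = downstream items).
req_to_func = np.array([[9, 3, 0],      # 2 customer requirements x 3 functions
                        [0, 1, 9]], float)
func_to_elem = np.array([[9, 0],        # 3 functions x 2 design elements
                         [3, 9],
                         [0, 3]], float)

# Hypothetical change potential of each architectural design element, in [0, 1].
change_potential = np.array([0.2, 0.8])

# Propagate backwards: a requirement is volatile in proportion to how strongly
# it is deployed onto design elements that are likely to change.
weights = req_to_func @ func_to_elem    # requirements x elements
dov = weights @ change_potential
dov /= weights.sum(axis=1)              # normalize each requirement to [0, 1]
print(np.round(dov, 3))                 # -> [0.338 0.754]
```

    Here the second requirement comes out more volatile because its deployment chain ends mostly at the element with the higher change potential.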

  4. Environmental Restoration Contractor Waste Minimization and Pollution Prevention Plan

    International Nuclear Information System (INIS)

    Lewis, R.A.

    1994-11-01

    The purpose of this plan is to establish the Environmental Restoration Contractor (ERC) Waste Minimization and Pollution Prevention (WMin/P2) Program and outline the activities and schedules that will be employed to reduce the quantity and toxicity of wastes generated as a result of restoration and remediation activities. It is intended to satisfy the US Department of Energy (DOE) and other legal requirements. As such, the Pollution Prevention Awareness program required by DOE Order 5400.1 is included with the Pollution Prevention Program. This plan is also intended to aid projects in meeting and documenting compliance with the various requirements for WMin/P2, and contains the policy, objectives, strategy, and support activities of the WMin/P2 program. The basic elements of the plan are pollution prevention goals, waste assessments of major waste streams, implementation of feasible waste minimization opportunities, and a process for reporting achievements. Various pollution prevention techniques will be implemented with the support of employee training and awareness programs to reduce waste and still meet applicable requirements. Information about the Hanford Site is in the Hanford Site Waste Minimization and Pollution Prevention Awareness Program Plan.

  5. Annual Waste Minimization Summary Report

    International Nuclear Information System (INIS)

    Haworth, D.M.

    2011-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2010. The NNSA/NSO Pollution Prevention Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment.

  6. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC

  7. Safety control and minimization of radioactive wastes

    International Nuclear Information System (INIS)

    Wang Jinming; Rong Feng; Li Jinyan; Wang Xin

    2010-01-01

    Compared with developed countries, the safety control and minimization of radwastes in China are under-developed. Research on measures for the safety control and minimization of radwastes is very important for the safe management of radwastes and for reducing treatment and disposal costs and environmental radiation hazards. This paper systematically discusses the safety control and minimization of the radwastes produced in the nuclear fuel cycle, nuclear technology applications, and the decommissioning of nuclear facilities, and provides some measures and methods for the safety control and minimization of radwastes. (authors)

  8. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous methods and discrete methods. The former include the Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based methods, while the latter cover the graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview of those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
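
    As a concrete instance of the continuous-method family, the sketch below applies plain gradient descent to a toy 1-D restoration energy (quadratic data fidelity plus a smoothness penalty). The energy, step size, and data are illustrative choices, not drawn from the survey:

```python
import numpy as np

def denoise_energy(u, f, lam):
    """Toy restoration energy: data fidelity plus a smoothness penalty."""
    return np.sum((u - f) ** 2) + lam * np.sum(np.diff(u) ** 2)

def gradient_descent(f, lam=2.0, step=0.05, iters=500):
    """Plain gradient descent, the simplest of the continuous methods."""
    u = f.copy()
    for _ in range(iters):
        grad = 2 * (u - f)            # gradient of the fidelity term
        d = np.diff(u)
        grad[:-1] -= 2 * lam * d      # dE_smooth/du_i picks up -2*lam*d_i ...
        grad[1:] += 2 * lam * d       # ... and +2*lam*d_{i-1}
        u -= step * grad
    return u

# Hypothetical noisy 1-D "image"
rng = np.random.default_rng(2)
f = np.sin(np.linspace(0, np.pi, 50)) + 0.3 * rng.standard_normal(50)
u = gradient_descent(f)
```

    With a small enough step the energy decreases monotonically; the discrete methods listed in the abstract (graph cuts, belief propagation, and so on) instead optimize over label assignments where no such gradient exists.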

  9. Using the critical incident technique to define a minimal data set for requirements elicitation in public health.

    Science.gov (United States)

    Olvingson, Christina; Hallberg, Niklas; Timpka, Toomas; Greenes, Robert A

    2002-12-18

    The introduction of computer-based information systems (ISs) in public health provides enhanced possibilities for service improvements and hence also for improvement of the population's health. Not least, new communication systems can help in the socialization and integration process needed between the different professions and geographical regions. Therefore, development of ISs that truly support public health practices requires that technical, cognitive, and social issues be taken into consideration. A notable problem is to capture the 'voices' of all potential users, i.e., the viewpoints of different public health practitioners. Failing to capture these voices will result in inefficient or even useless systems. The aim of this study is to develop a minimal data set for capturing users' voices on problems experienced by public health professionals in their daily work and opinions about how these problems can be solved. The issues of concern thus captured can be used both as the basis for formulating the requirements of ISs for public health professionals and to create an understanding of the use context. Further, the data can help in directing the design to the features most important for the users.

  10. How to Compare the Security Quality Requirements Engineering (SQUARE) Method with Other Methods

    National Research Council Canada - National Science Library

    Mead, Nancy R

    2007-01-01

    The Security Quality Requirements Engineering (SQUARE) method, developed at the Carnegie Mellon Software Engineering Institute, provides a systematic way to identify security requirements in a software development project...

  11. An Approximate Redistributed Proximal Bundle Method with Inexact Data for Minimizing Nonsmooth Nonconvex Functions

    Directory of Open Access Journals (Sweden)

    Jie Shen

    2015-01-01

    We describe an extension of the redistributed technique from the classical proximal bundle method to the inexact situation for minimizing nonsmooth nonconvex functions. The cutting-planes model we construct is not an approximation to the whole nonconvex function, but to a local convexification of the approximate objective function, and this local convexification is modified dynamically in order to always yield nonnegative linearization errors. Since we only employ approximate function values and approximate subgradients, theoretical convergence analysis shows that an approximate stationary point, or some double approximate stationary point, can be obtained under mild conditions.

  12. Method for developing cost estimates for generic regulatory requirements

    International Nuclear Information System (INIS)

    1985-01-01

    The NRC has established a practice of performing regulatory analyses, reflecting costs as well as benefits, of proposed new or revised generic requirements. A method has been developed to assist the NRC in preparing the types of cost estimates required for this purpose and for assigning priorities in the resolution of generic safety issues. The cost of a generic requirement is defined as the net present value of total lifetime cost incurred by the public, industry, and government in implementing the requirement for all affected plants. The method described here is for commercial light-water-reactor power plants. Estimating the cost for a generic requirement involves several steps: (1) identifying the activities that must be carried out to fully implement the requirement, (2) defining the work packages associated with the major activities, (3) identifying the individual elements of cost for each work package, (4) estimating the magnitude of each cost element, (5) aggregating individual plant costs over the plant lifetime, and (6) aggregating all plant costs and generic costs to produce a total, national, present value of lifetime cost for the requirement. The method developed addresses all six steps. In this paper, we discuss the first three.
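
    Steps (4) through (6) amount to discounting and summing cost streams. The sketch below shows a minimal net-present-value aggregation; the plant cost figures, discount rate, and function names are hypothetical:

```python
def present_value(annual_costs, discount_rate):
    """Net present value of a stream of annual costs (years 1, 2, ...)."""
    return sum(c / (1 + discount_rate) ** t
               for t, c in enumerate(annual_costs, start=1))

def total_requirement_cost(plants, generic_costs, discount_rate=0.05):
    """Steps (5)-(6): aggregate per-plant lifetime costs over all affected
    plants, then add generic (one-time, industry-wide) costs to obtain the
    national present value of lifetime cost for the requirement."""
    per_plant = sum(present_value(stream, discount_rate) for stream in plants)
    return per_plant + generic_costs

# Hypothetical example: two plants, each with a $1.0M backfit in year 1 and
# $0.1M/yr of added maintenance for two more years, plus $0.5M generic costs.
plants = [[1.0, 0.1, 0.1], [1.0, 0.1, 0.1]]
total = total_requirement_cost(plants, generic_costs=0.5)
print(round(total, 3))   # -> 2.759  ($M, present value)
```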

  13. Evolved Minimal Frustration in Multifunctional Biomolecules.

    Science.gov (United States)

    Röder, Konstantin; Wales, David J

    2018-05-25

    Protein folding is often viewed in terms of a funnelled potential or free energy landscape. A variety of experiments now indicate the existence of multifunnel landscapes, associated with multifunctional biomolecules. Here, we present evidence that these systems have evolved to exhibit the minimal number of funnels required to fulfil their cellular functions, suggesting an extension to the principle of minimum frustration. We find that minimal disruptive mutations result in additional funnels, and the associated structural ensembles become more diverse. The same trends are observed in an atomic cluster. These observations suggest guidelines for rational design of engineered multifunctional biomolecules.

  14. Multidimensional Normalization to Minimize Plate Effects of Suspension Bead Array Data.

    Science.gov (United States)

    Hong, Mun-Gwan; Lee, Woojoo; Nilsson, Peter; Pawitan, Yudi; Schwenk, Jochen M

    2016-10-07

    Enhanced by the growing number of biobanks, biomarker studies can now be performed with reasonable statistical power by using large sets of samples. Antibody-based proteomics by means of suspension bead arrays offers one attractive approach to analyze serum, plasma, or CSF samples for such studies in microtiter plates. To expand measurements beyond single batches, with either 96 or 384 samples per plate, suitable normalization methods are required to minimize the variation between plates. Here we propose two normalization approaches utilizing MA coordinates. The multidimensional MA (multi-MA) and MA-loess both consider all samples of a microtiter plate per suspension bead array assay and thus do not require any external reference samples. We demonstrate the performance of the two MA normalization methods with data obtained from the analysis of 384 samples including both serum and plasma. Samples were randomized across 96-well sample plates, processed, and analyzed in assay plates, respectively. Using principal component analysis (PCA), we could show that plate-wise clusters found in the first two components were eliminated by multi-MA normalization as compared with other normalization methods. Furthermore, we studied the correlation profiles between random pairs of antibodies and found that both MA normalization methods substantially reduced the inflated correlation introduced by plate effects. Normalization approaches using multi-MA and MA-loess minimized batch effects arising from the analysis of several assay plates with antibody suspension bead arrays. In a simulated biomarker study, multi-MA restored associations lost due to plate effects. Our normalization approaches, which are available as R package MDimNormn, could also be useful in studies using other types of high-throughput assay data.
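
    The MA idea can be sketched in a simplified form: express each plate's intensities as M (log-ratio to a cross-plate reference profile) and remove a plate-wise offset. This constant-shift version is far cruder than the published multi-MA and MA-loess methods (R package MDimNormn), and the data are simulated:

```python
import numpy as np

def ma_normalize(plates):
    """Simplified MA-style plate normalization (illustration only; the
    published methods fit the M-vs-A trend rather than a constant).

    M = log-ratio of each plate to the across-plate reference profile;
    subtracting each plate's median M removes a multiplicative plate
    effect without any external reference samples."""
    logs = np.log2(plates)                       # plates x analytes
    ref = logs.mean(axis=0)                      # reference profile
    m = logs - ref                               # M coordinate per plate
    shift = np.median(m, axis=1, keepdims=True)  # plate-wise offset
    return 2 ** (logs - shift)

# Hypothetical data: plate 2 reads uniformly 30% high (a pure plate effect)
rng = np.random.default_rng(3)
truth = rng.lognormal(5, 0.5, size=200)
plates = np.vstack([truth, truth * 1.3])
norm = ma_normalize(plates)
```

    After normalization the two plates agree, which is the behavior the abstract describes as eliminating plate-wise clusters in the first principal components.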

  15. Classical Methods and Calculation Algorithms for Determining Lime Requirements

    Directory of Open Access Journals (Sweden)

    André Guarçoni

    The methods developed for determination of lime requirements (LR) are based on widely accepted principles. However, the formulas used for calculation have evolved little over recent decades, and in some cases there are indications of their inadequacy. The aim of this study was to compare the lime requirements calculated by three classic formulas and three algorithms, defining those most appropriate for supplying Ca and Mg to coffee plants with the smallest possibility of causing overliming. The database used contained 600 soil samples, which were collected in coffee plantings. The LR was estimated by the methods of base saturation, neutralization of Al3+, and elevation of Ca2+ and Mg2+ contents (two formulas and by the three calculation algorithms. Averages of the lime requirements were compared, determining the frequency distribution of the 600 lime requirements (LR estimated through each calculation method. In soils with low cation exchange capacity at pH 7, the base saturation method may fail to adequately supply the plants with Ca and Mg in many situations, while the method of Al3+ neutralization and elevation of Ca2+ and Mg2+ contents can result in the calculation of application rates that will increase the pH above the suitable range. Among the methods studied for calculating lime requirements, the algorithm that predicts reaching a defined base saturation, with adequate Ca and Mg supply and the maximum application rate limited to the H+Al value, proved to be the most efficient calculation method, and it can be recommended for use in numerous crop conditions.
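
    Of the classical formulas compared, the base-saturation method has the simplest closed form. The sketch below implements the commonly used version LR = (Ve − Va) × T / 100, corrected for limestone quality; treat it as an illustration of that formula, not of the paper's algorithms:

```python
def lime_requirement_base_saturation(v_target, v_current, cec_ph7, prnt=100.0):
    """Classical base-saturation formula (a sketch of one of the classic
    formulas compared in the paper; symbols follow common usage):

        LR (t/ha) = (Ve - Va) * T / 100 * (100 / PRNT)

    Ve, Va : target and current base saturation (%)
    T      : cation exchange capacity at pH 7 (cmolc/dm3)
    PRNT   : relative neutralizing power of the limestone (%)
    """
    return (v_target - v_current) * cec_ph7 / 100.0 * (100.0 / prnt)

# Example: raise base saturation from 40% to 60% on a soil with T = 8 cmolc/dm3
print(lime_requirement_base_saturation(60, 40, 8))   # -> 1.6 (t/ha)
```

    The abstract's caveat is visible in the formula itself: when T (the CEC at pH 7) is low, the computed rate is small regardless of the crop's Ca and Mg demand.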

  16. Minimization of energy consumption in HVAC systems with data-driven models and an interior-point method

    International Nuclear Information System (INIS)

    Kusiak, Andrew; Xu, Guanglin; Zhang, Zijun

    2014-01-01

    Highlights: • We study the energy saving of HVAC systems with a data-driven approach. • We conduct an in-depth analysis of the topology of the developed Neural Network based HVAC model. • We apply an interior-point method to solving a Neural Network based HVAC optimization model. • The uncertain building occupancy is incorporated in the minimization of HVAC energy consumption. • A significant potential of saving HVAC energy is discovered. - Abstract: In this paper, a data-driven approach is applied to minimize energy consumption of a heating, ventilating, and air conditioning (HVAC) system while maintaining the thermal comfort of a building with uncertain occupancy level. The uncertainty of arrival and departure rate of occupants is modeled by the Poisson and uniform distributions, respectively. The internal heating gain is calculated from the stochastic process of the building occupancy. Based on the observed and simulated data, a multilayer perceptron algorithm is employed to model and simulate the HVAC system. The data-driven models accurately predict future performance of the HVAC system based on the control settings and the observed historical information. An optimization model is formulated and solved with the interior-point method. The optimization results are compared with the results produced by the simulation models.
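
    The final stage of the pipeline, minimizing a fitted model with an interior-point solver, can be sketched with SciPy's trust-constr method (which switches to an interior-point algorithm when inequality constraints are present). The energy and comfort functions below are simple stand-ins for the paper's neural-network models; all names and numbers are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def energy_model(x):
    """Stand-in for the data-driven energy predictor:
    x = [supply air temperature (C), fan speed (fraction)]."""
    t_sup, fan = x
    return 0.5 * (22.0 - t_sup) ** 2 + 30.0 * fan ** 2

def comfort(x):
    """Stand-in thermal-comfort index; must stay >= 1 (hypothetical)."""
    t_sup, fan = x
    return fan * (28.0 - t_sup) / 6.0

res = minimize(
    energy_model,
    x0=np.array([16.0, 0.8]),                    # feasible starting settings
    method="trust-constr",                       # interior-point-type solver
    bounds=[(12.0, 20.0), (0.1, 1.0)],           # actuator limits
    constraints=[NonlinearConstraint(comfort, 1.0, np.inf)],
)
```

    In the paper the objective is a trained multilayer perceptron rather than a closed-form function, but the structure of the constrained minimization is the same.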

  17. Minimal surfaces

    CERN Document Server

    Dierkes, Ulrich; Sauvigny, Friedrich; Jakob, Ruben; Kuster, Albrecht

    2010-01-01

    Minimal Surfaces is the first volume of a three volume treatise on minimal surfaces (Grundlehren Nr. 339-341). Each volume can be read and studied independently of the others. The central theme is boundary value problems for minimal surfaces. The treatise is a substantially revised and extended version of the monograph Minimal Surfaces I, II (Grundlehren Nr. 295 & 296). The first volume begins with an exposition of basic ideas of the theory of surfaces in three-dimensional Euclidean space, followed by an introduction of minimal surfaces as stationary points of area, or equivalently

  18. A comprehensive program to minimize platelet outdating.

    Science.gov (United States)

    Fuller, Alice K; Uglik, Kristin M; Braine, Hayden G; King, Karen E

    2011-07-01

    Platelet (PLT) transfusions are essential for patients who are bleeding or have an increased risk of bleeding due to a decreased number or abnormal function of circulating PLTs. A shelf life of 5 days for PLT products presents an inventory management challenge. In 2006, greater than 10% of apheresis PLTs made in the United States outdated. It is imperative to have a sufficient number of products for patients requiring transfusion, but outdating PLTs is a financial burden and a waste of a resource. We present the approach used in our institution to anticipate inventory needs based on current patient census and usage. Strategies to predict usage and to identify changes in anticipated usage are examined. Annual outdating is reviewed for a 10-year period from 2000 through 2009. From January 1, 2000, through December 2009, there were 128,207 PLT transfusions given to 15,265 patients. The methods used to anticipate usage and adjust inventory resulted in an annual outdate rate of approximately 1% for the 10-year period reviewed. In addition we have not faced situations where inventory was inadequate to meet the needs of the patients requiring transfusions. We have identified three elements of our transfusion service that can minimize outdate: a knowledgeable proactive staff dedicated to PLT management, a comprehensive computer-based transfusion history for each patient, and a strong two-way relationship with the primary product supplier. Through our comprehensive program, based on the principles of providing optimal patient care, we have minimized PLT outdating for more than 10 years. © 2011 American Association of Blood Banks.

  19. Minimally Invasive Parathyroidectomy

    Directory of Open Access Journals (Sweden)

    Lee F. Starker

    2011-01-01

    Full Text Available Minimally invasive parathyroidectomy (MIP) is an operative approach for the treatment of primary hyperparathyroidism (pHPT). Currently, routine use of improved preoperative localization studies, cervical block anesthesia in the conscious patient, and intraoperative parathyroid hormone analyses aid in guiding surgical therapy. MIP requires less surgical dissection, causing decreased trauma to tissues, can be performed safely in the ambulatory setting, and is at least as effective as standard cervical exploration. This paper reviews advances in preoperative localization, anesthetic techniques, and intraoperative management of patients undergoing MIP for the treatment of pHPT.

  20. Handbook of methods for risk-based analysis of technical specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1994-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can: (1) quantitatively evaluate the risk impact and justify changes based on objective risk arguments; and (2) provide a defensible basis for these requirements in regulatory applications. The US NRC Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability- and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations
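
    The AOT evaluations described above typically reduce to a simple product: the increase in conditional core-damage frequency while the component is down, times the downtime. A minimal sketch of that bookkeeping follows; all numerical values are hypothetical illustrations, not figures from the handbook.

```python
# Sketch of the core AOT (allowed outage time) risk measures used in
# risk-based Technical Specification evaluations.  Numbers are hypothetical.

def single_downtime_risk(r1, r0, d_hours):
    """Incremental core-damage probability for one outage of duration d:
    (conditional CDF with component down minus baseline CDF) x downtime,
    with downtime converted from hours to years."""
    return (r1 - r0) * d_hours / 8760.0

def yearly_aot_risk(freq_per_year, r1, r0, mean_downtime_hours):
    """Expected yearly risk contribution from repeated outages."""
    return freq_per_year * single_downtime_risk(r1, r0, mean_downtime_hours)

# Hypothetical example: baseline CDF 1e-5/yr, conditional CDF 4e-5/yr with the
# component out of service, 3 outages per year lasting 12 hours on average.
risk_one = single_downtime_risk(4e-5, 1e-5, 12.0)
risk_year = yearly_aot_risk(3.0, 4e-5, 1e-5, 12.0)
```

    Comparing `risk_year` against a plant-specific acceptance threshold is the kind of objective argument the handbook uses to justify AOT changes.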

  1. Handbook of methods for risk-based analysis of Technical Specification requirements

    International Nuclear Information System (INIS)

    Samanta, P.K.; Vesely, W.E.

    1993-01-01

    Technical Specifications (TS) requirements for nuclear power plants define the Limiting Conditions for Operation (LCOs) and Surveillance Requirements (SRs) to assure safety during operation. In general, these requirements were based on deterministic analysis and engineering judgments. Experience with plant operation indicates that some elements of the requirements are unnecessarily restrictive, while others may not be conducive to safety. Improvements in these requirements are facilitated by the availability of plant-specific Probabilistic Safety Assessments (PSAs). The use of risk- and reliability-based methods to improve TS requirements has gained wide interest because these methods can: (1) quantitatively evaluate the risk impact and justify changes based on objective risk arguments; and (2) provide a defensible basis for these requirements in regulatory applications. The United States Nuclear Regulatory Commission (USNRC) Office of Research is sponsoring research to develop systematic risk-based methods to improve various aspects of TS requirements. The handbook of methods, which is being prepared, summarizes such risk-based methods. The scope of the handbook includes reliability- and risk-based methods for evaluating allowed outage times (AOTs), action statements requiring shutdown where shutdown risk may be substantial, surveillance test intervals (STIs), defenses against common-cause failures, managing plant configurations, and scheduling maintenance. For each topic, the handbook summarizes methods of analysis and data needs, outlines the insights to be gained, lists additional references, and presents examples of evaluations

  2. Statistically Efficient Construction of α-Risk-Minimizing Portfolio

    Directory of Open Access Journals (Sweden)

    Hiroyuki Taniai

    2012-01-01

    Full Text Available We propose a semiparametrically efficient estimator for α-risk-minimizing portfolio weights. Based on the work of Bassett et al. (2004), an α-risk-minimizing portfolio optimization is formulated as a linear quantile regression problem. The quantile regression method uses a pseudolikelihood based on an asymmetric Laplace reference density, and asymptotic properties such as consistency and asymptotic normality are obtained. We apply the results of Hallin et al. (2008) to the problem of constructing α-risk-minimizing portfolios using residual signs and ranks and a general reference density. Monte Carlo simulations assess the performance of the proposed method. Empirical applications are also investigated.
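
    The quantile-regression formulation above can be sketched numerically: the sample α-risk of a portfolio is the mean pinball (check) loss of its returns about an intercept (the α-quantile), minimized jointly over the intercept and the weights. The projected-subgradient routine below is an illustrative stand-in for the paper's semiparametric estimator; the long-only simplex constraint on the weights is our assumption.

```python
import numpy as np

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex (long-only weights)."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u - css / (np.arange(len(v)) + 1) > 0)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)

def pinball_loss(z, alpha):
    """Check loss rho_alpha(z); its sample mean about the alpha-quantile is the alpha-risk."""
    return np.mean(np.where(z >= 0, alpha * z, (alpha - 1.0) * z))

def alpha_risk_weights(X, alpha=0.05, steps=500, lr=0.05):
    """Minimize mean pinball loss of portfolio returns X @ w about an intercept xi
    (the Bassett-et-al. alpha-risk objective) by projected subgradient descent."""
    T, n = X.shape
    w = np.full(n, 1.0 / n)
    xi = 0.0
    for _ in range(steps):
        z = X @ w - xi
        g = np.where(z >= 0, alpha, alpha - 1.0)   # subgradient d rho_alpha / dz
        w = project_simplex(w - lr * (X.T @ g) / T)
        xi += lr * np.mean(g)                      # d loss / d xi = -mean(g)
    return w, xi
```

    In the paper this linear-programming-equivalent problem is solved exactly and refined with rank-based semiparametrics; the sketch only conveys the objective being minimized.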

  3. MINIMIZE ENERGY AND COSTS REQUIREMENT OF WEEDING AND FERTILIZING PROCESS FOR FIBER CROPS IN SMALL FARMS

    Directory of Open Access Journals (Sweden)

    Tarek FOUDA

    2015-06-01

    Full Text Available The experimental work was carried out during the 2014 agricultural summer season at the experimental farm of Gemmiza Research Station, Gharbiya governorate, to minimize the energy and costs of the weeding and fertilizing processes for fiber crops (kenaf and roselle) in small farms. The performance of the manufactured multipurpose unit was studied as a function of machine forward speed (2.2, 2.8, 3.4 and 4 km/h) and fertilizing rate (30, 45 and 60 kg N fed-1), at a constant average soil moisture content of 20% (d.b.). Performance of the manufactured machine was evaluated in terms of fuel consumption, power and energy requirements, effective field capacity, theoretical field capacity, field efficiency, and operational costs. The experimental results revealed that the manufactured machine decreased energy use and increased effective field capacity and efficiency at a machine forward speed of 2.2 km/h and an average moisture content of 20%.

  4. New methods to minimize the preventive maintenance cost of series-parallel systems using ant colony optimization

    International Nuclear Information System (INIS)

    Samrout, M.; Yalaoui, F.; Chatelet, E.; Chebbo, N.

    2005-01-01

    This article is based on a previous study by Bris, Chatelet and Yalaoui [Bris R, Chatelet E, Yalaoui F. New method to minimise the preventive maintenance cost of series-parallel systems. Reliab Eng Syst Saf 2003;82:247-55], who used a genetic algorithm to minimize the preventive maintenance cost of series-parallel systems. We propose to improve on their results by developing a new method based on another technique, Ant Colony Optimization (ACO). The resolution consists of determining the solution vector of system component inspection periods, T_P. The calculations were implemented in the programming tool Matlab. Highly interesting results and improvements over previous studies were obtained
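
    The ACO search described above can be sketched as follows. This toy routine picks one inspection period per component from a discrete candidate set, reinforcing pheromone on choices that appear in low-cost solutions; the candidate periods, pheromone update rule, and cost function are all illustrative assumptions, not the authors' implementation.

```python
import random

def aco_minimize(cost_fn, candidates, n_comp, ants=20, iters=60, rho=0.1, seed=1):
    """Tiny ant colony optimization sketch: each 'ant' assigns every component an
    inspection period drawn with probability proportional to pheromone; pheromone
    evaporates and is reinforced in inverse proportion to the solution cost."""
    rng = random.Random(seed)
    m = len(candidates)
    tau = [[1.0] * m for _ in range(n_comp)]   # pheromone per (component, period)
    best, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(ants):
            idx = [rng.choices(range(m), weights=tau[c])[0] for c in range(n_comp)]
            sol = [candidates[i] for i in idx]
            c = cost_fn(sol)
            if c < best_cost:
                best, best_cost = sol, c
            # evaporate on visited choices and deposit quality-weighted pheromone
            for comp, i in enumerate(idx):
                tau[comp][i] = (1 - rho) * tau[comp][i] + 1.0 / (1.0 + c)
    return best, best_cost

# Hypothetical cost: penalize deviation of each period from an ideal of 30 days.
best, best_cost = aco_minimize(lambda s: sum((t - 30) ** 2 for t in s),
                               [10, 20, 30, 40], n_comp=3)
```

    A realistic cost function would instead evaluate the maintenance cost and availability of the series-parallel system for the chosen vector T_P.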

  5. Assessment of methods to determine minimal cellulase concentrations for efficient hydrolysis of cellulose

    Energy Technology Data Exchange (ETDEWEB)

    Hogan, C.M.; Mes-Hartree, M.; Saddler, J.N. (Forintek Canada Corp., Ottawa, ON (Canada). Biotechnology and Chemistry Dept.); Kushner, D.J. (Toronto Univ., Ontario (Canada). Dept. of Microbiology)

    1990-02-01

    The enzyme loading needed to achieve substrate saturation appeared to be the most economical enzyme concentration to use for hydrolysis, based on percentage hydrolysis. Saturation was reached at 25 filter paper units per gram substrate on Solka Floc BW300, as determined by studying (a) initial adsorption of the cellulase preparation onto the substrate, (b) an actual hydrolysis or (c) a combined hydrolysis and fermentation (CHF) process. Initial adsorption of the cellulases onto the substrate can be used to determine the minimal cellulase requirements for efficient hydrolysis since enzymes initially adsorbed to the substrate have a strong role in governing the overall reaction. Trichoderma harzianum E58 produces high levels of {beta}-glucosidase and is able to cause high conversion of Solka Floc BW300 to glucose without the need for exogenous {beta}-glucosidase. End-product inhibition of the cellulase and {beta}-glucosidase can be more effectively reduced by employing a CHF process than by supplemental {beta}-glucosidase. (orig.).

  6. [Manufacture method and clinical application of minimally invasive dental implant guide template based on registration technology].

    Science.gov (United States)

    Lin, Zeming; He, Bingwei; Chen, Jiang; Du, Zhibin; Zheng, Jingyi; Li, Yanqin

    2012-08-01

    To guide doctors in precisely positioning surgical operations, a new production method for a minimally invasive implant guide template was presented. The mandible of the patient was scanned by a CT scanner, and a three-dimensional jaw bone model was constructed from the CT image data. The professional dental implant software Simplant was used to simulate the implant placement based on the three-dimensional CT model to determine the location and depth of the implants. At the same time, the dental plaster models were scanned by a stereo vision system to build the oral mucosa model. Next, curvature registration technology was used to fuse the oral mucosa model and the CT model, so that the designed position of the implant relative to the oral mucosa could be determined. The minimally invasive implant guide template was designed in 3-Matic software according to the designed implant position and the oral mucosa model. Finally, the template was produced by rapid prototyping. The three-dimensional registration technology was useful for fusing the CT data and the dental plaster data, and the resulting template was accurate enough to guide doctors during the actual implantation without cutting the mucosa. The guide template, fabricated through the combined use of three-dimensional registration, Simplant simulation, and rapid prototyping, is accurate and enables minimally invasive and precise implant surgery; this technique is worthy of clinical use.

  7. Minimal Subdermal Shaving by Means of Sclerotherapy Using Absolute Ethanol: A New Method for the Treatment of Axillary Osmidrosis

    Directory of Open Access Journals (Sweden)

    Hyung-Sup Shim

    2013-07-01

    Full Text Available Background: Axillary osmidrosis is characterized by unpleasant odors originating from the axillary apocrine glands, resulting in psychosocial stress. The main treatment modality is apocrine gland removal. The various surgical techniques used until now have sometimes caused serious complications. We describe herein the favorable outcomes of a new method for ablating apocrine glands by minimal subdermal shaving using sclerotherapy with absolute ethanol. Methods: A total of 12 patients underwent the procedure. The severity of osmidrosis was evaluated before surgery. Conventional subdermal shaving was performed on one side (control group) and ablation by means of minimal subdermal shaving and absolute ethanol on the other side (study group). Postoperative outcomes were compared between the study and control groups. Results: The length of time to removal of the drain was 1 day shorter in the study group than in the control group. There were no serious complications, such as hematoma or seroma, in either group, but flap margin necrosis and flap desquamation occurred in the control group and were successfully managed with conservative treatment. Six months after surgery, we and our patients were satisfied with the outcomes. Conclusions: Sclerotherapy using absolute ethanol combined with minimal subdermal shaving may be useful for the treatment of axillary osmidrosis. It can reduce the incidence of seroma and hematoma and allow the skin flap to adhere to its recipient site. It can degrade and ablate the remaining apocrine glands and eliminate causative organisms. Furthermore, since this technique is relatively simple, it takes less time than the conventional method.

  9. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Science.gov (United States)

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
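
    The minimization procedure the abstract describes can be sketched with a simplified Pocock-Simon-style rule: score each arm by how many prior participants share the new participant's factor levels, then allocate to the least-loaded arm with high probability (a biased coin preserving some randomness). This is a hedged illustration of the general technique, not the OxMaR source code.

```python
import random

_rng = random.Random(42)  # fixed seed for a reproducible sketch

def minimize_allocate(new_factors, arm_counts, arms=("control", "treatment"),
                      p_best=0.8):
    """Simplified marginal-balance minimization.
    arm_counts: {arm: {factor: {level: count}}} of prior allocations.
    Each arm is scored by the number of prior participants sharing the new
    participant's factor levels; the lowest-scoring arm is chosen with
    probability p_best, otherwise a random other arm (biased coin)."""
    scores = {arm: sum(arm_counts[arm][f].get(lvl, 0)
                       for f, lvl in new_factors.items())
              for arm in arms}
    ranked = sorted(arms, key=lambda a: scores[a])
    choice = ranked[0] if _rng.random() < p_best else _rng.choice(ranked[1:])
    for f, lvl in new_factors.items():
        arm_counts[choice][f][lvl] = arm_counts[choice][f].get(lvl, 0) + 1
    return choice
```

    Unlike simple randomization, each allocation here depends on the full history of prior allocations, which is why multi-site trials need the centralized, online computation that OxMaR provides.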

  11. Parameter-free Network Sparsification and Data Reduction by Minimal Algorithmic Information Loss

    KAUST Repository

    Zenil, Hector

    2018-02-16

    The study of large and complex datasets, or big data, organized as networks has emerged as one of the central challenges in most areas of science and technology. Cellular and molecular networks in biology are among the prime examples. Hence, a number of techniques for data dimensionality reduction, especially in the context of networks, have been developed. Yet, current techniques require a predefined metric upon which to minimize the data size. Here we introduce a family of parameter-free algorithms based on (algorithmic) information theory that are designed to minimize the loss of any (enumerable computable) property contributing to the object's algorithmic content and thus important to preserve in a process of data dimension reduction when forcing the algorithm to delete first the least important features. Being independent of any particular criterion, they are universal in a fundamental mathematical sense. Using suboptimal approximations of efficient (polynomial) estimations, we demonstrate how to preserve network properties, outperforming other (leading) algorithms for network dimension reduction. Our method preserves all graph-theoretic indices measured, ranging from degree distribution and clustering coefficient to edge betweenness and degree and eigenvector centralities. We conclude and demonstrate numerically that our parameter-free Minimal Information Loss Sparsification (MILS) method is robust, has the potential to maximize the preservation of all recursively enumerable features in data and networks, and achieves equal or significantly better results than other data reduction and network sparsification methods.
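
    A rough sketch of the MILS idea: greedily delete the edge whose removal least perturbs an approximate complexity estimate of the graph. The paper uses algorithmic-information estimates; here the zlib-compressed length of the adjacency matrix serves as a crude, computable stand-in, so this is an illustration of the greedy scheme only.

```python
import zlib

def complexity(edges, nodes):
    """Crude complexity proxy: compressed length of the adjacency matrix.
    (The paper uses algorithmic-information estimates; zlib is a rough,
    computable stand-in for this sketch.)"""
    es = set(edges)
    bits = bytearray(1 if (i, j) in es or (j, i) in es else 0
                     for i in nodes for j in nodes)
    return len(zlib.compress(bytes(bits), 9))

def mils_sparsify(nodes, edges, n_remove):
    """Greedy sketch of Minimal Information Loss Sparsification: repeatedly
    delete the edge whose removal changes the complexity estimate the least."""
    edges = list(edges)
    for _ in range(n_remove):
        base = complexity(edges, nodes)
        def loss(e):
            return abs(complexity([x for x in edges if x != e], nodes) - base)
        edges.remove(min(edges, key=loss))
    return edges
```

    Because the criterion is an information measure over the whole object rather than any single graph index, no target metric has to be chosen in advance, which is the sense in which the method is parameter-free.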

  12. The IFR pyroprocessing for high-level waste minimization

    International Nuclear Information System (INIS)

    Laidler, J.J.

    1993-01-01

    The process developed for the recycle of integral fast reactor (IFR) spent fuel utilizes a combination of pyrometallurgical and electrochemical methods and has been termed pyroprocessing. The process has been operated at full scale with simulated spent fuel using nonradioactive fission product elements. A comprehensive demonstration of the pyroprocessing of irradiated IFR fuel will begin later this year. Pyroprocessing involves the anodic dissolution of all the constituent elements of the IFR spent fuel and controlled electrotransport (electrorefining) to separate the actinide elements from the fission products present in the spent fuel. The process can be applied to the processing of spent light water reactor (LWR) fuel as well, requiring only the addition of a reduction step to convert the LWR oxide fuel to metallic form and a separation step to separate uranium from the transuranic (TRU) elements. The TRU elements are then recovered by electrorefining in the same manner as the actinides from the IFR fuel. The high-level wastes arising from pyroprocessing are virtually free of actinides, and the volume of the wastes is minimized by the intrinsic characteristics of the processing method

  13. Graphical approach for multiple values logic minimization

    Science.gov (United States)

    Awwal, Abdul Ahad S.; Iftekharuddin, Khan M.

    1999-03-01

    Multiple valued logic (MVL) is sought for designing high complexity, highly compact, parallel digital circuits. However, the practical realization of an MVL-based system is dependent on optimization of cost, which directly affects the optical setup. We propose a minimization technique for MVL logic optimization based on graphical visualization, such as a Karnaugh map. The proposed method is utilized to solve signed-digit binary and trinary logic minimization problems. The usefulness of the minimization technique is demonstrated for the optical implementation of MVL circuits.

  14. Smartphone-assisted minimally invasive neurosurgery.

    Science.gov (United States)

    Mandel, Mauricio; Petito, Carlo Emanuel; Tutihashi, Rafael; Paiva, Wellingson; Abramovicz Mandel, Suzana; Gomes Pinto, Fernando Campos; Ferreira de Andrade, Almir; Teixeira, Manoel Jacobsen; Figueiredo, Eberval Gadelha

    2018-03-13

    OBJECTIVE Advances in video and fiber optics since the 1990s have led to the development of several commercially available high-definition neuroendoscopes. This technological improvement, however, has been surpassed by the smartphone revolution. With the increasing integration of smartphone technology into medical care, the introduction of these high-quality computerized communication devices with built-in digital cameras offers new possibilities in neuroendoscopy. The aim of this study was to investigate the usefulness of smartphone-endoscope integration in performing different types of minimally invasive neurosurgery. METHODS The authors present a new surgical tool that integrates a smartphone with an endoscope by use of a specially designed adapter, thus eliminating the need for the video system customarily used for endoscopy. The authors used this novel combined system to perform minimally invasive surgery on patients with various neuropathological disorders, including cavernomas, cerebral aneurysms, hydrocephalus, subdural hematomas, contusional hematomas, and spontaneous intracerebral hematomas. RESULTS The new endoscopic system featuring smartphone-endoscope integration was used by the authors in the minimally invasive surgical treatment of 42 patients. All procedures were successfully performed, and no complications related to the use of the new method were observed. The quality of the images obtained with the smartphone was high enough to provide adequate information to the neurosurgeons, as smartphone cameras can record images in high definition or 4K resolution. Moreover, because the smartphone screen moves along with the endoscope, surgical mobility was enhanced with the use of this method, facilitating more intuitive use. In fact, this increased mobility was identified as the greatest benefit of the use of the smartphone-endoscope system compared with the use of the neuroendoscope with the standard video set. 
CONCLUSIONS Minimally invasive approaches

  15. The Videographic Requirements Gathering Method for Adolescent-Focused Interaction Design

    Directory of Open Access Journals (Sweden)

    Tamara Peyton

    2014-08-01

    Full Text Available We present a novel method for conducting requirements gathering with adolescent populations. Called videographic requirements gathering, this technique makes use of mobile phone data capture and participant creation of media images. The videographic requirements gathering method can help researchers and designers gain intimate insight into adolescent lives while simultaneously reducing power imbalances. We provide rationale for this approach, pragmatics of using the method, and advice on overcoming common challenges facing researchers and designers relying on this technique.

  16. A Sparsity-Promoted Method Based on Majorization-Minimization for Weak Fault Feature Enhancement.

    Science.gov (United States)

    Ren, Bangyue; Hao, Yansong; Wang, Huaqing; Song, Liuyang; Tang, Gang; Yuan, Hongfang

    2018-03-28

    Fault transient impulses induced by faulty components in rotating machinery usually contain substantial interference. Fault features are comparatively weak in the initial fault stage, which renders fault diagnosis more difficult. In this case, a sparse representation method based on the Majorization-Minimization (MM) algorithm is proposed to enhance weak fault features and extract them from strong background noise. However, the traditional MM algorithm suffers from two issues: the choice of the sparse basis and complicated calculations. To address these challenges, a modified MM algorithm is proposed in which a sparse optimization objective function is designed first. Inspired by the Basis Pursuit (BP) model, the optimization function integrates an impulsive feature-preserving factor and a penalty function factor. Second, a modified majorization iterative method is applied to address the convex optimization problem of the designed function. A series of sparse coefficients can be achieved through iterating, which contain only transient components. It is noteworthy that there is no need to select the sparse basis in the proposed iterative method because it is fixed as a unit matrix. The reconstruction step is then omitted, which can significantly increase detection efficiency. Eventually, envelope analysis of the sparse coefficients is performed to extract weak fault features. Simulated and experimental signals from bearings and gearboxes are employed to validate the effectiveness of the proposed method. In addition, comparisons are made to prove that the proposed method outperforms the traditional MM algorithm in terms of detection results and efficiency.
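
    The MM iteration for a Basis-Pursuit-style objective can be illustrated with the classic iterative soft-thresholding scheme; when the dictionary is fixed to the unit matrix, as in the modified method, a single iteration already yields the sparse coefficients. This is a generic illustration of the MM idea, not the authors' exact algorithm.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def mm_sparse(y, A, lam, iters=200):
    """Majorization-minimization (ISTA) iteration for the basis-pursuit-denoising
    objective 0.5*||y - A x||^2 + lam*||x||_1: each step majorizes the quadratic
    term and minimizes the surrogate in closed form via soft thresholding."""
    x = np.zeros(A.shape[1])
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    for _ in range(iters):
        x = soft(x + A.T @ (y - A @ x) / L, lam / L)
    return x

# With A fixed to the identity, one iteration gives the soft-thresholded
# coefficients directly, so no reconstruction step is needed (toy signal):
y = np.array([0.1, 3.0, -0.05, -2.5, 0.2])  # two impulses in small noise
x = mm_sparse(y, np.eye(5), lam=0.5, iters=1)
```

    In the fault-diagnosis setting, the surviving nonzero coefficients correspond to the transient impulses, and envelope analysis is then applied to them.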

  17. Approximate k-NN delta test minimization method using genetic algorithms: Application to time series

    CERN Document Server

    Mateo, F; Gadea, Rafael; Sovilj, Dusan

    2010-01-01

    In many real world problems, the existence of irrelevant input variables (features) hinders the predictive quality of the models used to estimate the output variables. In particular, time series prediction often involves building large regressors of artificial variables that can contain irrelevant or misleading information. Many techniques have arisen to confront the problem of accurate variable selection, including both local and global search strategies. This paper presents a method based on genetic algorithms that intends to find a global optimum set of input variables that minimize the Delta Test criterion. The execution speed has been enhanced by substituting the exact nearest neighbor computation by its approximate version. The problems of scaling and projection of variables have been addressed. The developed method works in conjunction with MATLAB's Genetic Algorithm and Direct Search Toolbox. The goodness of the proposed methodology has been evaluated on several popular time series examples, and also ...
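
    The delta test that the genetic algorithm minimizes is simple to state: half the mean squared difference between each output and the output of its nearest neighbor in the selected-input space. The sketch below shows the criterion on synthetic data; the GA search over variable subsets (and the approximate nearest-neighbor speedup) is omitted.

```python
import numpy as np

def delta_test(X, y):
    """Nearest-neighbor (k=1) delta test: (1/2N) * sum_i (y_NN(i) - y_i)^2.
    Lower values indicate that the selected inputs explain y better."""
    N = len(y)
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    np.fill_diagonal(d, np.inf)                         # exclude self-matches
    nn = d.argmin(1)
    return ((y[nn] - y) ** 2).sum() / (2 * N)

# Synthetic check: x1 drives the output, x2 is pure noise; a GA would search
# subsets of many such variables using this same score as its fitness.
rng = np.random.default_rng(0)
x1 = rng.uniform(0, 2 * np.pi, 400)
x2 = rng.uniform(0, 2 * np.pi, 400)          # irrelevant input
y = np.sin(x1) + 0.01 * rng.normal(size=400)
```

    Selecting the subset with the lowest delta test value tends to keep relevant inputs and discard irrelevant ones, which is exactly the fitness the paper's genetic algorithm optimizes.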

  18. Biostatistical analysis of treatment results of bacterial liver abscesses using minimally invasive techniques and open surgery

    Directory of Open Access Journals (Sweden)

    Kipshidze A.A.

    2013-12-01

    Full Text Available Today bacterial abscesses remain one of the most difficult complications in surgical hepatology; both traditional and minimally invasive methods are used in their treatment. Biostatistical analysis is used because strong evidence is required for the effectiveness of one or another method of surgical intervention. The estimation of the statistical significance of differences between the control and the main group of patients with liver abscesses is given in this paper. Depending on the treatment method, patients were divided into two groups: (1) minimally invasive surgery (89 cases); (2) laparotomy surgery (74 patients). Data comparison was performed by means of Student's criterion. The effectiveness of the method of abscess drainage using interventional sonography, with external nasobiliary drainage and sanitation of the ductal liver system and abscess cavity with the help of modern antiseptics, was considered. The percentage of cured patients was also estimated.
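
    The comparison by Student's criterion mentioned above amounts to the classic pooled-variance two-sample t statistic, sketched below. This is an illustration of the standard test; the paper does not specify which variant (pooled or Welch) was used.

```python
import math

def students_t(a, b):
    """Two-sample Student's t statistic with pooled variance: the difference of
    group means divided by its estimated standard error."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variances
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (ma - mb) / math.sqrt(sp2 * (1 / na + 1 / nb))
```

    The resulting statistic is compared against the t distribution with na + nb - 2 degrees of freedom to decide whether the two treatment groups differ significantly.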

  19. Pengaruh Pelapis Bionanokomposit terhadap Mutu Mangga Terolah Minimal

    Directory of Open Access Journals (Sweden)

    Ata Aditya Wardana

    2017-04-01

    Full Text Available Minimally processed mango is a perishable product due to high respiration and transpiration rates and microbial decay. Edible coating is one alternative method to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating made from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. Combinations of tapioca and NP-ZnO (0, 1, and 2% by weight of tapioca) were used to coat the minimally processed mango. The results showed that the bionanocomposite edible coatings were able to maintain the quality of minimally processed mango during the storage period. The bionanocomposite from tapioca + NP-ZnO (2% by weight of tapioca) was the most effective in reducing weight loss, loss of firmness, browning index, total acidity, total soluble solids, respiration, and microbial counts. Thus, the use of a bionanocomposite edible coating might provide an alternative method to maintain the storage quality of minimally processed mango.

  20. Stochastic LMP (Locational marginal price) calculation method in distribution systems to minimize loss and emission based on Shapley value and two-point estimate method

    International Nuclear Information System (INIS)

    Azad-Farsani, Ehsan; Agah, S.M.M.; Askarian-Abyaneh, Hossein; Abedi, Mehrdad; Hosseinian, S.H.

    2016-01-01

    LMP (Locational marginal price) calculation is a serious impediment in distribution operation when private DG (distributed generation) units are connected to the network. A novel policy is developed in this study to guide the distribution company (DISCO) in exerting control over private units while power loss and greenhouse-gas emissions are minimized. The LMP at each DG bus is calculated according to the contribution of the DG to the reduction in loss and emissions. An iterative algorithm based on the Shapley value method is proposed to allocate the loss and emission reductions. The proposed algorithm also provides a robust state estimation tool for DISCOs in the next step of operation, giving the decision maker the ability to exert control over private DG units when loss and emissions are minimized. In addition, a stochastic approach based on the PEM (point estimate method) is employed to capture uncertainty in the market price and load demand. The proposed methodology is applied to a realistic distribution network, and the efficiency and accuracy of the method are verified. - Highlights: • Reduction of loss and emissions at the same time. • Fair allocation of the loss and emission reductions. • Estimation of the system state using an iterative algorithm. • Ability of DISCOs to control DG units via the proposed policy. • Modeling of the uncertainties to calculate the stochastic LMP.
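The Shapley value at the heart of the allocation step can be illustrated with a small sketch; the coalition values below are hypothetical loss reductions for two DG units, not data from the paper:

```python
from itertools import combinations
from math import factorial

def shapley_allocation(players, coalition_value):
    """Shapley value of each player for a characteristic function v(S)."""
    n = len(players)
    shares = {}
    for p in players:
        others = [q for q in players if q != p]
        total = 0.0
        for r in range(n):
            for S in combinations(others, r):
                # Weight of this coalition ordering: |S|! (n - |S| - 1)! / n!
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                total += w * (coalition_value(set(S) | {p}) - coalition_value(set(S)))
        shares[p] = total
    return shares

# Hypothetical loss-reduction values (kW) achieved by coalitions of two DG units
v = {frozenset(): 0.0,
     frozenset({"DG1"}): 8.0,
     frozenset({"DG2"}): 4.0,
     frozenset({"DG1", "DG2"}): 14.0}
shares = shapley_allocation(["DG1", "DG2"], lambda S: v[frozenset(S)])
```

Because the joint coalition achieves more than the sum of the individual reductions, the Shapley value splits the synergy fairly between the two units, and the shares always add up to the grand-coalition value.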

  1. Varietal improvement of irrigated rice under minimal water conditions

    International Nuclear Information System (INIS)

    Abdul Rahim Harun; Marziah Mahmood; Sobri Hussein

    2010-01-01

    Varietal improvement of irrigated rice under minimal water conditions is a research project under the program Sustainable Production of High Yielding Irrigated Rice under Minimal Water Input (IRPA-01-01-03-0000/PR0068/0504). Several agencies were involved in this project, including the Malaysian Nuclear Agency (MNA), the Malaysian Agricultural Research and Development Institute (MARDI), Universiti Putra Malaysia (UPM), and the Ministry of Agriculture (MOA). The project started in early 2004 with an approved IRPA fund of RM 275,000.00 for 3 years. The main objective of the project is to generate superior genotypes for minimal water requirements through induced mutation techniques. Cultivated rice Oryza sativa cv. MR219 treated with gamma radiation at 300 and 400 Gray was used in the experiment. Two hundred grams of M2 seeds from each dose were screened under minimal water stress in a greenhouse at MARDI Seberang Perai. Five hundred panicles with well-filled grains were selected for paddy field screening under a simulated, precisely controlled water-stress regime. Thirty-eight potential lines with the required adaptive traits were selected in M3. After several rounds of selection, 12 promising mutant lines were observed to be tolerant of minimal water stress, and two promising mutant lines, designated MR219-4 and MR219-9, were selected for further testing under several stress environments. (author)

  2. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-01-01

    This document is the February 14, 1990, version of the LLNL Waste Minimization Program Plan (WMPP). The waste minimization policy field has undergone continuous change since its formal inception in the 1984 HSWA legislation. The first LLNL WMPP, Revision A, is dated March 1985; a series of informal revisions were then made on approximately a semi-annual basis. This Revision 2 is the third formal issuance of the WMPP document. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventive strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this new policy, new guidance to hazardous waste generators on the elements of a waste minimization program was issued. In response to these policies, DOE has revised and issued implementation guidance for DOE Order 5400.1, Waste Minimization Plan and Waste Reduction Reporting of DOE Hazardous, Radioactive, and Radioactive Mixed Wastes, final draft January 1990. This WMPP is formatted to meet the current DOE guidance outlines and will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes, and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements. 3 figs., 4 tabs

  3. Fault Sample Generation for Virtual Testability Demonstration Test Subject to Minimal Maintenance and Scheduled Replacement

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2015-01-01

    Full Text Available Virtual testability demonstration testing brings new requirements to fault sample generation. First, the fault occurrence process is described using stochastic process theory, and it is shown that a fault occurrence process subject to minimal repair is a nonhomogeneous Poisson process (NHPP). Second, the interarrival time distribution function of the next fault event is derived, and three typical kinds of parameterized NHPP are discussed. Third, a procedure for fault sample generation is put forward under the assumptions of minimal maintenance and scheduled replacement; the fault modes and their occurrence times subject to specified conditions and time periods can then be obtained. Finally, an antenna driving subsystem in an automatic pointing and tracking platform is taken as a case study to illustrate the proposed method. Results indicate that both the size and structure of the fault samples generated by the proposed method are reasonable and effective. The proposed method can be applied well to virtual testability demonstration tests.
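Fault occurrence times for an NHPP can be generated with the standard Lewis-Shedler thinning algorithm; the power-law (Weibull) intensity and its parameters below are illustrative assumptions, not the paper's case-study values:

```python
import random

def nhpp_times(rate, rate_max, horizon, rng=random.Random(42)):
    """Generate fault occurrence times of a nonhomogeneous Poisson process
    on [0, horizon] by Lewis-Shedler thinning."""
    t, times = 0.0, []
    while True:
        # Candidate event from a homogeneous process with rate rate_max
        t += rng.expovariate(rate_max)
        if t > horizon:
            return times
        # Accept the candidate with probability lambda(t) / rate_max
        if rng.random() < rate(t) / rate_max:
            times.append(t)

# Power-law intensity, a common parameterized NHPP for repairable systems
# under minimal repair (hypothetical shape and scale parameters)
beta, eta = 2.0, 100.0
intensity = lambda t: (beta / eta) * (t / eta) ** (beta - 1)
faults = nhpp_times(intensity, rate_max=intensity(500.0), horizon=500.0)
```

With an increasing intensity, the generated fault times cluster toward the end of the horizon, matching the intuition that a minimally repaired system fails more often as it ages.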

  4. Rigid Body Energy Minimization on Manifolds for Molecular Docking.

    Science.gov (United States)

    Mirzaei, Hanieh; Beglov, Dmitri; Paschalidis, Ioannis Ch; Vajda, Sandor; Vakili, Pirooz; Kozakov, Dima

    2012-11-13

    Virtually all docking methods include some local continuous minimization of an energy/scoring function in order to remove steric clashes and obtain more reliable energy values. In this paper, we describe an efficient rigid-body optimization algorithm that, compared to the most widely used algorithms, converges approximately an order of magnitude faster to conformations with equal or slightly lower energy. The space of rigid body transformations is a nonlinear manifold, namely, a space which locally resembles a Euclidean space. We use a canonical parametrization of the manifold, called the exponential parametrization, to map the Euclidean tangent space of the manifold onto the manifold itself. Thus, we locally transform the rigid body optimization to an optimization over a Euclidean space where basic optimization algorithms are applicable. Compared to commonly used methods, this formulation substantially reduces the dimension of the search space. As a result, it requires far fewer costly function and gradient evaluations and leads to a more efficient algorithm. We have selected the L-BFGS quasi-Newton method for local optimization since it uses only gradient information to obtain second-order information about the energy function and avoids the far more costly direct Hessian evaluations. Two applications, one in protein-protein docking and the other in protein-small molecule interactions, are presented as part of macromolecular docking protocols. The code is available to the community under an open source license, and with minimal effort can be incorporated into any molecular modeling package.
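The exponential parametrization can be sketched for the rotational part alone: Rodrigues' formula maps a tangent vector in R^3 onto SO(3), so a local optimizer can take steps in a flat three-dimensional space and retract onto the manifold. The toy energy, numerical gradient, and fixed step size below are illustrative assumptions (the paper optimizes full rigid-body transforms with L-BFGS and analytic gradients):

```python
import numpy as np

def hat(w):
    """Skew-symmetric matrix of w (an so(3) element)."""
    return np.array([[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]])

def expmap(w):
    """Exponential map R^3 -> SO(3) via Rodrigues' formula."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = hat(w / th)
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

# Toy "energy": squared distance between rotated body points and targets.
P = np.array([[1.0, 0, 0], [0, 1.0, 0]])        # body points
Q = (expmap(np.array([0, 0, 0.5])) @ P.T).T     # targets: P rotated 0.5 rad about z

def energy(R):
    return np.sum(((R @ P.T).T - Q) ** 2)

R = np.eye(3)
for _ in range(200):
    g = np.zeros(3)
    for i in range(3):                          # numerical gradient in tangent coords
        e = np.zeros(3); e[i] = 1e-6
        g[i] = (energy(expmap(e) @ R) - energy(expmap(-e) @ R)) / 2e-6
    R = expmap(-0.1 * g) @ R                    # retract the step back onto SO(3)
```

Each iterate stays exactly on the rotation manifold, so no re-orthogonalization or constraint handling is needed; this is the practical payoff of the exponential parametrization.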

  5. Stable 1-Norm Error Minimization Based Linear Predictors for Speech Modeling

    DEFF Research Database (Denmark)

    Giacobello, Daniele; Christensen, Mads Græsbøll; Jensen, Tobias Lindstrøm

    2014-01-01

    In linear prediction of speech, the 1-norm error minimization criterion has been shown to provide a valid alternative to the 2-norm minimization criterion. However, unlike 2-norm minimization, 1-norm minimization does not guarantee the stability of the corresponding all-pole filter and can generate...... saturations when this is used to synthesize speech. In this paper, we introduce two new methods to obtain intrinsically stable predictors with the 1-norm minimization. The first method is based on constraining the roots of the predictor to lie within the unit circle by reducing the numerical range...... based linear prediction for modeling and coding of speech....

  6. Annual Waste Minimization Summary Report, Calendar Year 2008

    International Nuclear Information System (INIS)

    2009-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2008. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (No. NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO

  7. Annual Waste Minimization Summary Report, Calendar Year 2009

    International Nuclear Information System (INIS)

    2010-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC, for the U. S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year 2009. This report was developed in accordance with the requirements of the Nevada Test Site Resource Conservation and Recovery Act Permit (No. NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by NNSA/NSO activities and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by NNSA/NSO.

  8. Annual Waste Minimization Summary Report Calendar Year 2007

    International Nuclear Information System (INIS)

    NSTec Environmental Management

    2008-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during calendar year (CY) 2007. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (number NEV HW0021), and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the U.S. Department of Energy, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO

  9. Application of the microbiological method DEFT/APC and DNA comet assay to detect ionizing radiation processing of minimally processed vegetables

    International Nuclear Information System (INIS)

    Araujo, Michel Mozeika

    2008-01-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness, and apparent healthiness. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and inactivate food-borne pathogens; its combination with minimal processing could improve the safety and quality of MPV. Two different food irradiation detection methods, one biological, the DEFT/APC, and one biochemical, the DNA Comet Assay, were applied to MPV in order to test their applicability for detecting irradiation treatment. DEFT/APC is a microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC). The DNA Comet Assay detects DNA damage due to ionizing radiation. Samples of lettuce, chard, watercress, dandelion, kale, chicory, spinach, and cabbage from the retail market were irradiated at 0.5 kGy and 1.0 kGy using a 60 Co facility. The irradiation treatment guaranteed at least a 2 log cycle reduction for aerobic and psychrotrophic microorganisms. In general, with increasing radiation dose, DEFT counts remained similar regardless of irradiation processing, while APC counts decreased gradually. The difference between the two counts increased gradually with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion to judge whether an MPV was treated by irradiation. The DNA Comet Assay allowed non-irradiated samples to be distinguished from irradiated ones, which showed different types of comets owing to DNA fragmentation. Both the DEFT/APC method and the DNA Comet Assay could satisfactorily be used as screening methods for indicating irradiation processing. (author)
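The screening rule suggested above is easy to state in code: DEFT counts both viable and radiation-inactivated cells, APC only viable ones, so a large log10 gap flags irradiation. The counts below are hypothetical:

```python
import math

def deft_apc_difference(deft_count, apc_count):
    """Log10 difference between DEFT and APC counts (CFU/g)."""
    return math.log10(deft_count) - math.log10(apc_count)

def likely_irradiated(deft_count, apc_count, threshold=2.0):
    """Screening rule from the study: a DEFT/APC gap above ~2 log units
    suggests the sample was irradiated."""
    return deft_apc_difference(deft_count, apc_count) > threshold

# Hypothetical counts: DEFT stays near 10^6 CFU/g while APC drops to 10^3.5
print(likely_irradiated(1e6, 10 ** 3.5))  # -> True
```

As a screening method this only indicates probable treatment; the abstract pairs it with the DNA Comet Assay for confirmation.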

  10. Minimization of the emittance growth of multi-charge particle beams in the charge stripping section of RAON

    Energy Technology Data Exchange (ETDEWEB)

    Hwang, Ji-Gwang [Department of Physics, Kyungpook National University, Daegu 702-701 (Korea, Republic of); Kim, Eun-San, E-mail: eskim1@knu.ac.kr [Department of Physics, Kyungpook National University, Daegu 702-701 (Korea, Republic of); Kim, Hye-Jin, E-mail: hjkim87@ibs.re.kr [Rare Isotope Science Project, Institute for Basic Science, Jeonmin-dong, Yuseong-gu, Daejeon (Korea, Republic of); Jeon, Dong-O [Rare Isotope Science Project, Institute for Basic Science, Jeonmin-dong, Yuseong-gu, Daejeon (Korea, Republic of)

    2014-12-11

    The charge stripping section of the Rare isotope Accelerator Of Newness (RAON), which is one of the critical components for achieving a high beam power of 400 kW with a short linac, is a source of transverse emittance growth. The dominant effects are angular straggling in the charge stripper, which is required to increase the charge state of the beam, and chromatic aberrations in the dispersive section, which is required to separate the selected ion beam from the various ion beams produced in the stripper. Since the main source of transverse emittance growth in the stripper is angular straggling, it can be compensated for by changing the angle of the phase ellipse; the emittance growth is therefore minimized by optimizing the Twiss parameters at the stripper. The emittance growth in the charge selection section is also minimized by correcting high-order aberrations using six sextupole magnets. In this paper, we present a method to minimize the transverse emittance growth in the stripper by changing the Twiss parameters, and in the charge selection section by using sextupole magnets.

  11. Critical flicker frequency and continuous reaction times for the diagnosis of minimal hepatic encephalopathy

    DEFF Research Database (Denmark)

    Lauridsen, Mette Enok Munk; Jepsen, Peter; Vilstrup, Hendrik

    2011-01-01

    Abstract Minimal hepatic encephalopathy (MHE) is intermittently present in up to 2/3 of patients with chronic liver disease. It impairs their daily living and can be treated. However, there is no consensus on diagnostic criteria except that psychometric methods are required. We compared two easy...... appropriately to a sensory stimulus. The choice of test depends on the information needed in the clinical and scientific care and study of the patients....

  12. Minimization of radioactive solid wastes from uranium mining and metallurgy

    International Nuclear Information System (INIS)

    Zhang Xueli; Xu Lechang; Wei Guangzhi; Gao Jie; Wang Erqi

    2010-01-01

    The concept and contents of radioactive waste minimization are introduced. The principles of radioactive waste minimization, involving administrative optimization, source reduction, recycling and reuse, as well as volume reduction, are discussed. The strategies and methods to minimize radioactive solid wastes from uranium mining and metallurgy are summarized. In addition, the benefits from the application of radioactive waste minimization are analyzed. Prospects for research on radioactive solid waste minimization are given at the end. (authors)

  13. 13 CFR 115.17 - Minimization of Surety's Loss.

    Science.gov (United States)

    2010-01-01

    ... and collateral—(1) Requirements. The Surety must take all reasonable action to minimize risk of Loss... indemnity agreement must be secured by such collateral as the Surety or SBA finds appropriate. Indemnity...

  14. Waste minimization and pollution prevention awareness plan

    International Nuclear Information System (INIS)

    1991-01-01

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with DOE's policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs

  15. Waste minimization and pollution prevention awareness plan

    Energy Technology Data Exchange (ETDEWEB)

    1991-05-31

    The purpose of this plan is to document the Lawrence Livermore National Laboratory (LLNL) Waste Minimization and Pollution Prevention Awareness Program. The plan specifies those activities and methods that are or will be employed to reduce the quantity and toxicity of wastes generated at the site. The intent of this plan is to respond to and comply with DOE's policy and guidelines concerning the need for pollution prevention. The Plan is composed of a LLNL Waste Minimization and Pollution Prevention Awareness Program Plan and, as attachments, Program- and Department-specific waste minimization plans. This format reflects the fact that waste minimization is considered a line management responsibility and is to be addressed by each of the Programs and Departments. 14 refs.

  16. Inelastic scattering with Chebyshev polynomials and preconditioned conjugate gradient minimization.

    Science.gov (United States)

    Temel, Burcin; Mills, Greg; Metiu, Horia

    2008-03-27

    We describe and test an implementation, using a basis set of Chebyshev polynomials, of a variational method for solving scattering problems in quantum mechanics. This minimum error method (MEM) determines the wave function Psi by minimizing the least-squares error in the function (H Psi - E Psi), where E is the desired scattering energy. We compare the MEM to an alternative, the Kohn variational principle (KVP), by solving the Secrest-Johnson model of two-dimensional inelastic scattering, which has been studied previously using the KVP and for which other numerical solutions are available. We use a conjugate gradient (CG) method to minimize the error, and by preconditioning the CG search, we are able to greatly reduce the number of iterations necessary; the method is thus faster and more stable than a matrix inversion, as is required in the KVP. Also, we avoid errors due to scattering off of the boundaries, which presents substantial problems for other methods, by matching the wave function in the interaction region to the correct asymptotic states at the specified energy; the use of Chebyshev polynomials allows this boundary condition to be implemented accurately. The use of Chebyshev polynomials allows for a rapid and accurate evaluation of the kinetic energy. This basis set is as efficient as plane waves but does not impose an artificial periodicity on the system. There are problems in surface science and molecular electronics which cannot be solved if periodicity is imposed, and the Chebyshev basis set is a good alternative in such situations.
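The preconditioned-CG idea can be sketched on a toy problem of the same shape, minimizing |(H - E)psi|^2 through the normal equations. This is a generic Jacobi-preconditioned CG, not the paper's Chebyshev implementation, and the shift E is placed outside the toy spectrum purely to keep the system well conditioned:

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=500):
    """Preconditioned conjugate gradient for an SPD matrix A;
    M_inv approximates A^{-1} and accelerates convergence."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv @ r
    p = z.copy()
    for _ in range(max_iter):
        Ap = A @ p
        alpha = (r @ z) / (p @ Ap)
        x += alpha * p
        r_new = r - alpha * Ap
        if np.linalg.norm(r_new) < tol:
            break
        z_new = M_inv @ r_new
        beta = (r_new @ z_new) / (r @ z)
        p = z_new + beta * p
        r, z = r_new, z_new
    return x

# Toy Hermitian "Hamiltonian" and a shift E outside its spectrum
rng = np.random.default_rng(0)
H = rng.standard_normal((40, 40)); H = (H + H.T) / 2
E = -20.0
A = (H - E * np.eye(40)).T @ (H - E * np.eye(40))   # SPD normal-equation matrix
b = A @ np.ones(40)                                 # right-hand side with known solution
M_inv = np.diag(1.0 / np.diag(A))                   # Jacobi (diagonal) preconditioner
psi = pcg(A, b, M_inv)
```

The preconditioner costs only a diagonal multiply per iteration but can cut the iteration count substantially, which mirrors the paper's point that preconditioning the CG search avoids the matrix inversion the KVP requires.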

  17. On balanced minimal repeated measurements designs

    Directory of Open Access Journals (Sweden)

    Shakeel Ahmad Mir

    2014-10-01

    Full Text Available Repeated measurements designs are concerned with scientific experiments in which each experimental unit is assigned more than once to a treatment, either different or identical. This class of designs has the property that unbiased estimators for elementary contrasts among direct and residual effects are obtainable. Afsarinejad (1983) provided a method of constructing balanced minimal repeated measurements designs for p < t, when t is an odd or prime power; one or more treatments may occur more than once in some sequences, and the designs so constructed no longer remain uniform in periods. In this paper an attempt has been made to provide a new method to overcome this drawback. Specifically, two cases have been considered: RM[t, n = t(t-1)/(p-1), p], λ2 = 1 for balanced minimal repeated measurements designs, and RM[t, n = 2t(t-1)/(p-1), p], λ2 = 2 for balanced repeated measurements designs. In addition, a method has been provided for constructing extra-balanced minimal designs for the special case RM[t, n = t²/(p-1), p], λ2 = 1.

  18. TOKMINA, Toroidal Magnetic Field Minimization for Tokamak Fusion Reactor. TOKMINA-2, Total Power for Tokamak Fusion Reactor

    International Nuclear Information System (INIS)

    Hatch, A.J.

    1975-01-01

    1 - Description of problem or function: TOKMINA finds the minimum magnetic field, Bm, required at the toroidal coil of a Tokamak-type fusion reactor when the input is beta (ratio of plasma pressure to magnetic pressure), q (Kruskal-Shafranov plasma stability factor), and y (ratio of plasma radius to vacuum wall radius, rp/rw), together with arrays of PT (total thermal power from both d-t and tritium breeding reactions), Pw (wall loading, or power flux), and TB (thickness of blanket), following the method of Golovin, et al. TOKMINA2 finds the total power, PT, of such a fusion reactor, given a specified magnetic field, Bm, at the toroidal coil. 2 - Method of solution: TOKMINA: the aspect ratio (a) is minimized, giving a minimum value for Bm. TOKMINA2: a search is made for PT; the value of PT which minimizes Bm to the required value within 50 Gauss is chosen. 3 - Restrictions on the complexity of the problem: Input arrays are presently dimensioned at 20. This restriction can be overcome by changing a dimension card

  19. Minimization of municipal solid waste transportation route in West Jakarta using Tabu Search method

    Science.gov (United States)

    Chaerul, M.; Mulananda, A. M.

    2018-04-01

    Indonesia still adopts the collect-haul-dispose concept for municipal solid waste handling, which leads to queues of waste trucks at the final disposal site (TPA). This study aims to minimize the total distance of the waste transportation system by applying a transshipment model. In this case, the transshipment point is a compaction facility (SPA). Small-capacity trucks collect the waste from temporary collection points (TPS) and deliver it to the compaction facility, which is located near the waste generators. After compaction, the waste is transported in large-capacity trucks to the final disposal site, which is located far from the city. Problems of this kind can be formulated as a Vehicle Routing Problem (VRP). In this study, the shortest routes from the truck pool to the TPS, from TPS to SPA, and from SPA to TPA were determined using a meta-heuristic method, namely two-phase Tabu Search. The TPS studied are container-type units, 43 in total throughout West Jakarta, served by 38 Armroll trucks with a capacity of 10 m3 each. The result assigns each truck from the pool to selected TPS, SPA, and TPA with a total minimum distance of 2,675.3 km. Minimizing the distance also minimizes the total transportation cost to be borne by the government.
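A minimal Tabu Search sketch for a routing subproblem of this kind (swap neighborhood over the visit order, tabu tenure, aspiration criterion) might look as follows; the distance data are hypothetical and this is not the paper's two-phase implementation:

```python
import random

def route_length(route, dist):
    """Total length of a route given a distance matrix."""
    return sum(dist[a][b] for a, b in zip(route, route[1:]))

def tabu_search(dist, start, stops, iters=200, tenure=5, seed=1):
    """Tabu Search over the visit order: swap two positions, forbid
    recently used swaps, but allow a tabu move that beats the best (aspiration)."""
    rng = random.Random(seed)
    current = stops[:]
    rng.shuffle(current)
    best = current[:]
    tabu = {}                                   # (i, j) -> iteration until which it is tabu
    def cost(order):
        return route_length([start] + order, dist)
    for it in range(iters):
        best_move, best_cost = None, float("inf")
        for i in range(len(current)):
            for j in range(i + 1, len(current)):
                cand = current[:]
                cand[i], cand[j] = cand[j], cand[i]
                c = cost(cand)
                if tabu.get((i, j), 0) > it and c >= cost(best):
                    continue                     # tabu and not aspirating: skip
                if c < best_cost:
                    best_move, best_cost = (i, j), c
        if best_move is None:
            break
        i, j = best_move
        current[i], current[j] = current[j], current[i]
        tabu[(i, j)] = it + tenure
        if best_cost < cost(best):
            best = current[:]
    return [start] + best, cost(best)

# Hypothetical nodes 0..4 on a line; node 0 is the truck pool
dist = [[abs(i - j) for j in range(5)] for i in range(5)]
route, total = tabu_search(dist, start=0, stops=[1, 2, 3, 4])
```

The tabu list forces the search out of local minima by temporarily forbidding recent swaps, while the aspiration rule still accepts a forbidden move if it improves on the best route found so far.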

  20. Scheduling stochastic two-machine flow shop problems to minimize expected makespan

    Directory of Open Access Journals (Sweden)

    Mehdi Heydari

    2013-07-01

    Full Text Available During the past few years, despite tremendous contributions on the deterministic flow shop problem, only a limited number of works have been dedicated to stochastic cases. This paper examines stochastic scheduling problems in a two-machine flow shop environment for expected makespan minimization, where the processing times of jobs are normally distributed. Since jobs have stochastic processing times, to minimize the expected makespan, the expected sum of the second machine's free times is minimized. In other words, by minimizing the waiting times of the second machine, it is possible to reach the minimum of the objective function. A mathematical method is proposed which utilizes the properties of normal distributions. Furthermore, this method can be used as a heuristic for other distributions, as long as the means and variances are available. The performance of the proposed method is explored using numerical examples.
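The objective can be checked numerically: the sketch below estimates the expected makespan of a job sequence by Monte Carlo with normal processing times (truncated at zero), rather than by the paper's analytical method; the job data are hypothetical:

```python
import random

def makespan_two_machine(seq, times):
    """Makespan of a permutation schedule on a two-machine flow shop."""
    t1 = t2 = 0.0
    for j in seq:
        a, b = times[j]
        t1 += a                  # machine 1 finishes job j at t1
        t2 = max(t2, t1) + b     # machine 2 starts once both job and machine are free
    return t2

def expected_makespan(seq, jobs, n_sim=20000, seed=7):
    """Monte Carlo estimate with Normal(mu, sigma) processing times,
    truncated at zero (an assumption of this sketch)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sim):
        sampled = {j: (max(0.0, rng.gauss(*m1)), max(0.0, rng.gauss(*m2)))
                   for j, (m1, m2) in jobs.items()}
        total += makespan_two_machine(seq, sampled)
    return total / n_sim

# Hypothetical jobs: name -> ((mu1, sigma1), (mu2, sigma2))
jobs = {"A": ((2.0, 0.5), (6.0, 0.5)),
        "B": ((4.0, 0.5), (3.0, 0.5)),
        "C": ((5.0, 0.5), (2.0, 0.5))}
e_abc = expected_makespan(["A", "B", "C"], jobs)   # Johnson's rule order on the means
e_cba = expected_makespan(["C", "B", "A"], jobs)   # reversed order for comparison
```

Using the same random seed for both sequences (common random numbers) sharply reduces the variance of the comparison, so even a modest number of simulations separates the two orderings clearly.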

  1. Systems for tracking minimally invasive surgical instruments

    NARCIS (Netherlands)

    Chmarra, M. K.; Grimbergen, C. A.; Dankelman, J.

    2007-01-01

    Minimally invasive surgery (e.g. laparoscopy) requires special surgical skills, which should be objectively assessed. Several studies have shown that motion analysis is a valuable assessment tool of basic surgical skills in laparoscopy. However, to use motion analysis as the assessment tool, it is

  2. Towards the assembly of a minimal oscillator

    NARCIS (Netherlands)

    Nourian, Z.

    2015-01-01

    Life must have started with lower degree of complexity and connectivity. This statement readily triggers the question how simple is the simplest representation of life? In different words and considering a constructive approach, what are the requirements for creating a minimal cell? This thesis sets

  3. 76 FR 30550 - Federal Management Regulation; Change in Consumer Price Index Minimal Value

    Science.gov (United States)

    2011-05-26

    ... Minimal Value AGENCY: Office of Governmentwide Policy, GSA. ACTION: Final rule. SUMMARY: Pursuant to 5 U.S.C. 7342, at three-year intervals following January 1, 1981, the minimal value for foreign gifts must... required consultation has been completed and the minimal value has been increased to $350 or less as of...

  4. Irradiation of watercress (Nasturtium officinale) minimally processed: microbiological and sensory aspects

    International Nuclear Information System (INIS)

    Martins, Cecilia Geraldes

    2004-01-01

    Consumer attitudes towards foods have changed in the last two decades, increasing the demand for fresh-like products; consequently, less extreme treatments and fewer additives are being required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7, and Listeria monocytogenes, and minimal processing does not reduce pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and shelf life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2, and 1.5 kGy. D10 values for Salmonella spp. inoculated in watercress varied from 0.29 to 0.43 kGy. Samples of watercress exposed to 1, 3, and 4 kGy, together with a non-irradiated sample, stored at 7 deg C, were submitted to sensory evaluation and their shelf life was determined. All samples were accepted by members of the sensory panel. The shelf life of the sample irradiated with 1 kGy was 16 days (one and a half days more than the shelf life of the non-irradiated sample), and samples exposed to 3 and 4 kGy presented shelf lives of 9 and 0 days, respectively. (author)

  5. Minimal requirements for quality controls in radiotherapy with external beams; Controlli di qualita' essenziali in radioterapia con fasci esterni

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-07-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not lay down strict rules or firm recommendations, it suggests the minimal quality controls required to guarantee an adequate degree of accuracy in external beam radiotherapy; these controls form an essential part of the overall physical-dosimetric contribution to quality assurance with external beams.

  6. Environmental Restoration Progam Waste Minimization and Pollution Prevention Awareness Program Plan

    Energy Technology Data Exchange (ETDEWEB)

    Grumski, J. T.; Swindle, D. W.; Bates, L. D.; DeLozier, M. F.P.; Frye, C. E.; Mitchell, M. E.

    1991-09-30

    In response to DOE Order 5400.1, this plan outlines the requirements for a Waste Minimization and Pollution Prevention Awareness Program for the Environmental Restoration (ER) Program at Martin Marietta Energy Systems, Inc. Statements of the national, Department of Energy, Energy Systems, and Energy Systems ER Program policies on waste minimization are included and reflect the attitudes of these organizations and their commitment to the waste minimization effort. Organizational responsibilities for the waste minimization effort are clearly defined and discussed, and the program objectives and goals are set forth. Waste assessment is addressed as a key element in developing the waste generation baseline. There are discussions on the scope of ER-specific waste minimization techniques and approaches to employee awareness and training. There is also a discussion on the process for continual evaluation of the Waste Minimization Program. Appendixes present an implementation schedule for the Waste Minimization and Pollution Prevention Program, the program budget, an organization chart, and the ER waste minimization policy.

  7. Charge and energy minimization in electrical/magnetic stimulation of nervous tissue.

    Science.gov (United States)

    Jezernik, Saso; Sinkjaer, Thomas; Morari, Manfred

    2010-08-01

    In this work we address the problem of stimulating nervous tissue with the minimal necessary energy at reduced/minimal charge. Charge minimization is related to a valid safety concern (avoidance and reduction of stimulation-induced tissue and electrode damage). Energy minimization plays a role in battery-driven electrical or magnetic stimulation systems (increased lifetime, repetition rates, reduction of power requirements, thermal management). Extensive new theoretical results are derived by employing an optimal control theory framework. These results include derivation of the optimal electrical stimulation waveform for a mixed energy/charge minimization problem, derivation of the charge-balanced energy-minimal electrical stimulation waveform, solutions of a pure charge minimization problem with and without a constraint on the stimulation amplitude, and derivation of the energy-minimal magnetic stimulation waveform. Depending on the set stimulus pulse duration, energy and charge reductions of up to 80% are deemed possible. Results are verified in simulations with an active, mammalian-like nerve fiber model.
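
    The energy/charge trade-off the authors formalize can be illustrated with the much simpler classical Lapicque strength-duration model (a rough stand-in for their optimal-control treatment; the rheobase and chronaxie values below are hypothetical). For a rectangular pulse of width t, the threshold current is I_th = I_rh(1 + t_c/t), so the injected charge I_th·t = I_rh(t + t_c) falls monotonically as pulses shorten, while the delivered energy I_th²·t is minimized when t equals the chronaxie t_c:

```python
import numpy as np

# Hypothetical rectangular-pulse parameters (Lapicque model, illustrative values)
I_rh = 1.0e-3          # rheobase current [A]
t_c = 250.0e-6         # chronaxie [s]

t = np.linspace(20e-6, 2e-3, 2000)    # candidate pulse widths [s]
I_th = I_rh * (1.0 + t_c / t)         # threshold current for each width
Q = I_th * t                          # injected charge = I_rh * (t + t_c)
E = I_th ** 2 * t                     # energy delivered into a unit load

t_E_min = t[np.argmin(E)]             # energy-minimal width (analytically t_c)
```

Under this toy model the numerical argmin of E lands at the chronaxie, matching the analytic minimum of (t + t_c)²/t; the paper itself derives optimal non-rectangular waveforms instead.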

  8. Environmental Restoration Program Waste Minimization and Pollution Prevention Awareness Program Plan

    International Nuclear Information System (INIS)

    1991-01-01

    In response to DOE Order 5400.1, this plan outlines the requirements for a Waste Minimization and Pollution Prevention Awareness Program for the Environmental Restoration (ER) Program at Martin Marietta Energy Systems, Inc. Statements of the national, Department of Energy, Energy Systems, and Energy Systems ER Program policies on waste minimization are included and reflect the attitudes of these organizations and their commitment to the waste minimization effort. Organizational responsibilities for the waste minimization effort are clearly defined and discussed, and the program objectives and goals are set forth. Waste assessment is addressed as a key element in developing the waste generation baseline. There are discussions on the scope of ER-specific waste minimization techniques and approaches to employee awareness and training. There is also a discussion on the process for continual evaluation of the Waste Minimization Program. Appendixes present an implementation schedule for the Waste Minimization and Pollution Prevention Program, the program budget, an organization chart, and the ER waste minimization policy

  9. Waste minimization in analytical chemistry through innovative sample preparation techniques

    International Nuclear Information System (INIS)

    Smith, L. L.

    1998-01-01

    Because toxic solvents and other hazardous materials are commonly used in analytical methods, characterization procedures result in a significant and costly amount of waste. We are developing alternative analytical methods in the radiological and organic areas to reduce the volume or form of the hazardous waste produced during sample analysis. For the radiological area, we have examined high-pressure, closed-vessel microwave digestion as a way to minimize waste from sample preparation operations. Heated solutions of strong mineral acids can be avoided for sample digestion by using the microwave approach. Because reactivity increases with pressure, we examined the use of less hazardous solvents to leach selected contaminants from soil for subsequent analysis. We demonstrated the feasibility of this approach by extracting plutonium from a NET reference material using citric and tartaric acids with microwave digestion. Analytical results were comparable to traditional digestion methods, while hazardous waste was reduced by a factor of ten. We also evaluated the suitability of other natural acids, determined the extraction performance on a wider variety of soil types, and examined the extraction efficiency of other contaminants. For the organic area, we examined ways to minimize the wastes associated with the determination of polychlorinated biphenyls (PCBs) in environmental samples. Conventional methods for analyzing semivolatile organic compounds are labor intensive and require copious amounts of hazardous solvents. For soil and sediment samples, we have a method to analyze PCBs that is based on microscale extraction using benign solvents (e.g., water or hexane). The extraction is performed at elevated temperatures in stainless steel cells containing the sample and solvent. Gas chromatography-mass spectrometry (GC/MS) was used to quantitate the analytes in the isolated extract. More recently, we developed a method utilizing solid-phase microextraction (SPME) for natural

  10. Visual Occlusion During Minimally Invasive Surgery: A Contemporary Review of Methods to Reduce Laparoscopic and Robotic Lens Fogging and Other Sources of Optical Loss.

    Science.gov (United States)

    Manning, Todd G; Perera, Marlon; Christidis, Daniel; Kinnear, Ned; McGrath, Shannon; O'Beirne, Richard; Zotov, Paul; Bolton, Damien; Lawrentschuk, Nathan

    2017-04-01

    Maintenance of optimal vision during minimally invasive surgery is crucial to maintaining operative awareness, efficiency, and safety. Hampered vision is commonly caused by laparoscopic lens fogging (LLF), which has prompted the development of various antifogging fluids and warming devices. However, limited comparative evidence exists in the contemporary literature. Despite technologic advancements, there remains no consensus as to superior methods to prevent LLF or to restore visual acuity once LLF has occurred. We performed a review of the literature to present the current body of evidence supporting the use of numerous techniques. A standardized Preferred Reporting Items for Systematic Reviews and Meta-Analyses review was performed, and PubMed, Embase, Web of Science, and Google Scholar were searched. Articles pertaining to mechanisms and prevention of LLF were reviewed. We applied no limit to year of publication or publication type, and all articles encountered were included in the final review. Limited original research and heterogeneous outcome measures precluded meta-analytical assessment. Vision loss has a multitude of causes, and although scientific theory can be applied to in vivo environments, no authors have completely characterized this complex problem. No method to prevent or correct LLF was identified as superior to others, and comparative evidence is minimal. Robotic LLF was poorly investigated and, aside from a single analysis, has not been directly compared to standard laparoscopic fogging in any capacity. Obscured vision during surgery is hazardous and typically caused by LLF. The etiology of LLF, despite application of scientific theory, is yet to be definitively proven in the in vivo environment. Common methods of preventing LLF or restoring vision lost to LLF have little evidence-based data to support their use. A multiarm comparative in vivo analysis is required to formally assess these commonly used techniques in both standard and robotic laparoscopes.

  11. Minimizing Adverse Environmental Impact: How Murky the Waters

    Directory of Open Access Journals (Sweden)

    Reed W. Super

    2002-01-01

    The withdrawal of water from the nation’s waterways to cool industrial facilities kills billions of adult, juvenile, and larval fish each year. U.S. Environmental Protection Agency (EPA) promulgation of categorical rules defining the best technology available to minimize adverse environmental impact (AEI) could standardize and improve the control of such mortality. However, in an attempt to avoid compliance costs, industry has seized on the statutory phrase “adverse environmental impact” to propose significant procedural and substantive hurdles and layers of uncertainty in the permitting of cooling-water intakes under the Clean Water Act. These include, among other things, a requirement to prove that a particular facility threatens the sustainability of an aquatic population as a prerequisite to regulation. Such claims have no foundation in science, law, or the English language. Any nontrivial aquatic mortality constitutes AEI, as the EPA and several state and federal regulatory agencies have properly acknowledged. The focus of scientists, lawyers, regulators, permit applicants, and other interested parties should not be on defining AEI, but rather on minimizing AEI, which requires minimization of impingement and entrainment.

  12. Application of improved Vogel’s approximation method in minimization of rice distribution costs of Perum BULOG

    Science.gov (United States)

    Nahar, J.; Rusyaman, E.; Putri, S. D. V. E.

    2018-03-01

    This research was conducted at Perum BULOG Sub-Divre Medan, the institution implementing the Raskin program (distribution of rice to the poor) for several regencies and cities in North Sumatera. To minimize rice distribution costs, the rice must be allocated optimally. The method used in this study consists of the Improved Vogel's Approximation Method (IVAM) to construct the initial feasible solution and the Modified Distribution (MODI) method to test for optimality. This study aims to determine whether IVAM can provide savings or cost efficiency in rice distribution. The optimum cost calculated with IVAM, Rp945.241.715,5, is lower than the company's own calculation of Rp958.073.750,40. Thus, the use of IVAM can save Rp12.832.034,9 in rice distribution costs.
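
    The abstract does not reproduce the IVAM algorithm itself. As a hedged sketch, here is the classical Vogel's Approximation Method (the baseline that IVAM improves upon) applied to a tiny made-up balanced transportation problem; the supplies, demands, and costs are illustrative only:

```python
def vam(supply, demand, costs):
    """Vogel's approximation: initial basic feasible solution
    for a balanced transportation problem (classical variant)."""
    supply, demand = supply[:], demand[:]
    m, n = len(supply), len(demand)
    alloc = [[0] * n for _ in range(m)]
    rows, cols = set(range(m)), set(range(n))

    def penalty(vals):
        # difference between the two smallest remaining costs
        s = sorted(vals)
        return s[1] - s[0] if len(s) > 1 else s[0]

    while rows and cols:
        row_pen = {i: penalty([costs[i][j] for j in cols]) for i in rows}
        col_pen = {j: penalty([costs[i][j] for i in rows]) for j in cols}
        i_best = max(row_pen, key=row_pen.get)
        j_best = max(col_pen, key=col_pen.get)
        if row_pen[i_best] >= col_pen[j_best]:
            i = i_best
            j = min(cols, key=lambda j: costs[i][j])   # cheapest cell in row
        else:
            j = j_best
            i = min(rows, key=lambda i: costs[i][j])   # cheapest cell in column
        q = min(supply[i], demand[j])                  # ship as much as possible
        alloc[i][j] = q
        supply[i] -= q
        demand[j] -= q
        if supply[i] == 0:
            rows.discard(i)
        if demand[j] == 0:
            cols.discard(j)
    return alloc

# tiny made-up balanced instance: 2 sources, 2 destinations
supply, demand = [10, 5], [8, 7]
costs = [[2, 4],
         [3, 1]]
alloc = vam(supply, demand, costs)
total = sum(costs[i][j] * alloc[i][j] for i in range(2) for j in range(2))
```

For this instance the VAM allocation happens to be optimal (total cost 29); in general VAM only supplies the starting solution, which MODI then tests and improves.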

  13. Secondary waste minimization in analytical methods

    International Nuclear Information System (INIS)

    Green, D.W.; Smith, L.L.; Crain, J.S.; Boparai, A.S.; Kiely, J.T.; Yaeger, J.S.; Schilling, J.B.

    1995-01-01

    The characterization phase of site remediation is an important and costly part of the process. Because toxic solvents and other hazardous materials are used in common analytical methods, characterization is also a source of new waste, including mixed waste. Alternative analytical methods can reduce the volume or form of hazardous waste produced either in the sample preparation step or in the measurement step. The authors are examining alternative methods in the areas of inorganic, radiological, and organic analysis. For determining inorganic constituents, alternative methods were studied for sample introduction into inductively coupled plasma spectrometers. Figures of merit for the alternative methods, as well as their associated waste volumes, were compared with the conventional approaches. In the radiological area, the authors are comparing conventional methods for gross α/β measurements of soil samples to an alternative method that uses high-pressure microwave dissolution. For determination of organic constituents, microwave-assisted extraction was studied for RCRA regulated semivolatile organics in a variety of solid matrices, including spiked samples in blank soil; polynuclear aromatic hydrocarbons in soils, sludges, and sediments; and semivolatile organics in soil. Extraction efficiencies were determined under varying conditions of time, temperature, microwave power, moisture content, and extraction solvent. Solvent usage was cut from the 300 mL used in conventional extraction methods to about 30 mL. Extraction results varied from one matrix to another. In most cases, the microwave-assisted extraction technique was as efficient as the more common Soxhlet or sonication extraction techniques

  14. Pengaruh Pelapis Bionanokomposit terhadap Mutu Mangga Terolah Minimal [Effect of Bionanocomposite Coatings on the Quality of Minimally Processed Mango]

    OpenAIRE

    Wardana, Ata Aditya; Suyatma, Nugraha Edhi; Muchtadi, Tien Ruspriatin; Yuliani, Sri

    2017-01-01

    Minimally processed mango is a perishable product due to high respiration, transpiration, and microbial decay. Edible coating is one of the alternative methods to maintain the quality of minimally processed mango. The objective of this study was to evaluate the effects of a bionanocomposite edible coating made from tapioca and ZnO nanoparticles (NP-ZnO) on the quality of minimally processed mango cv. Arumanis stored for 12 days at 8°C. The combination of tapioca and NP-ZnO (0, 1, 2% b...

  15. Waste Minimization and Pollution Prevention Awareness Plan

    International Nuclear Information System (INIS)

    1994-04-01

    The purpose of this plan is to document Lawrence Livermore National Laboratory (LLNL) projections for present and future waste minimization and pollution prevention. The plan specifies those activities and methods that are or will be used to reduce the quantity and toxicity of wastes generated at the site. It is intended to satisfy Department of Energy (DOE) requirements. This Plan provides an overview of projected activities from FY 1994 through FY 1999. The plans are broken into site-wide and problem-specific activities. All directorates at LLNL have had an opportunity to contribute input, to estimate budget, and to review the plan. In addition to the above, this plan records LLNL's goals for pollution prevention, regulatory drivers for those activities, assumptions on which the cost estimates are based, analyses of the strengths of the projects, and the barriers to increasing pollution prevention activities

  16. Environmental Restoration Program waste minimization and pollution prevention self-assessment

    International Nuclear Information System (INIS)

    1994-10-01

    The Environmental Restoration (ER) Program within Martin Marietta Energy Systems, Inc. is currently developing a more active waste minimization and pollution prevention program. To determine areas of programmatic improvements within the ER Waste Minimization and Pollution Prevention Awareness Program, the ER Program required an evaluation of the program across the Oak Ridge K-25 Site, the Oak Ridge National Laboratory, the Oak Ridge Y-12 Plant, the Paducah Environmental Restoration and Waste Minimization Site, and the Portsmouth Environmental Restoration and Waste Minimization Site. This document presents the status of the overall program as of fourth quarter FY 1994, presents pollution prevention cost avoidance data associated with FY 1994 activities, and identifies areas for improvement. Results of this assessment indicate that the ER Waste Minimization and Pollution Prevention Awareness Program is firmly established and is developing rapidly. Several procedural goals were met in FY 1994 and many of the sites implemented ER waste minimization options. Additional growth is needed, however, for the ER Waste Minimization and Pollution Prevention Awareness Program

  17. Advanced pyrochemical technologies for minimizing nuclear waste

    International Nuclear Information System (INIS)

    Bronson, M.C.; Dodson, K.E.; Riley, D.C.

    1994-01-01

    The Department of Energy (DOE) is seeking to reduce the size of the current nuclear weapons complex and consequently minimize operating costs. To meet this DOE objective, the national laboratories have been asked to develop advanced technologies that take uranium and plutonium from retired weapons and prepare them for new weapons, long-term storage, and/or final disposition. Current pyrochemical processes generate residue salts and ceramic wastes that require aqueous processing to remove and recover the actinides. However, the aqueous treatment of these residues generates an estimated 100 liters of acidic transuranic (TRU) waste per kilogram of plutonium in the residue. Lawrence Livermore National Laboratory (LLNL) is developing pyrochemical techniques to eliminate, minimize, or more efficiently treat these residue streams. This paper will present technologies being developed at LLNL on advanced materials for actinide containment, reactors that minimize residues, and pyrochemical processes that remove actinides from waste salts

  18. Systems biology perspectives on minimal and simpler cells.

    Science.gov (United States)

    Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel

    2014-09-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  19. Systems Biology Perspectives on Minimal and Simpler Cells

    Science.gov (United States)

    Xavier, Joana C.; Patil, Kiran Raosaheb

    2014-01-01

    SUMMARY The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. PMID:25184563

  20. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where materiel and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.
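
    A quantitative scoring approach of this general kind can be illustrated with a weighted-criteria sketch; the task names, criteria, ratings, and weights below are entirely hypothetical and are not taken from the AES analysis:

```python
# Hypothetical criteria weights; cost counts against a task's score.
weights = {"mission_benefit": 0.4, "feasibility": 0.3,
           "cost": -0.2, "payoff_speed": 0.1}

# Hypothetical candidate tasks rated 1-5 on each criterion.
tasks = {
    "terrain_analysis":  {"mission_benefit": 5, "feasibility": 4, "cost": 3, "payoff_speed": 4},
    "report_formatting": {"mission_benefit": 2, "feasibility": 5, "cost": 1, "payoff_speed": 5},
    "obstacle_planning": {"mission_benefit": 4, "feasibility": 2, "cost": 4, "payoff_speed": 2},
}

def score(ratings):
    """Weighted sum of criterion ratings."""
    return sum(weights[c] * r for c, r in ratings.items())

# Rank automation candidates from highest to lowest weighted score.
ranked = sorted(tasks, key=lambda t: score(tasks[t]), reverse=True)
```

Ranking by weighted score singles out the highest-payoff automation candidate, mirroring the "where is the highest payoff?" step of the analysis.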

  1. A study on the theoretical and practical accuracy of conoscopic holography-based surface measurements: toward image registration in minimally invasive surgery

    Science.gov (United States)

    Burgner, J.; Simpson, A. L.; Fitzpatrick, J. M.; Lathrop, R. A.; Herrell, S. D.; Miga, M. I.; Webster, R. J.

    2013-01-01

    Background Registered medical images can assist with surgical navigation and enable image-guided therapy delivery. In soft tissues, surface-based registration is often used and can be facilitated by laser surface scanning. Tracked conoscopic holography (which provides distance measurements) has recently been proposed as a minimally invasive way to obtain surface scans. Moving this technique from concept to clinical use requires a rigorous accuracy evaluation, which is the purpose of our paper. Methods We adapt recent non-homogeneous and anisotropic point-based registration results to provide a theoretical framework for predicting the accuracy of tracked distance measurement systems. Experiments were conducted on complex objects of defined geometry, an anthropomorphic kidney phantom, and a human cadaver kidney. Results Experiments agree with model predictions, producing consistently low point RMS errors. Conclusions Tracked conoscopic holography is clinically viable; it enables minimally invasive surface scan accuracy comparable to current clinical methods that require open surgery. PMID:22761086
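
    Surface scans like these are typically aligned to preoperative images by point-based rigid registration. As background, here is a minimal sketch of the standard isotropic least-squares solution (the Kabsch/Arun SVD method), not the non-homogeneous anisotropic formulation the authors analyze:

```python
import numpy as np

def rigid_register(A, B):
    """Least-squares rigid registration: find R, t with B ~ R @ A + t
    for corresponding 3-D point sets A, B of shape (3, N)."""
    ca = A.mean(axis=1, keepdims=True)          # centroids
    cb = B.mean(axis=1, keepdims=True)
    H = (A - ca) @ (B - cb).T                   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # guard against a reflection in the least-squares solution
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cb - R @ ca
    return R, t

# demo: recover a known 30-degree rotation about z plus a translation
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 8))
th = np.deg2rad(30.0)
R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                   [np.sin(th),  np.cos(th), 0.0],
                   [0.0,         0.0,        1.0]])
t_true = np.array([[1.0], [-2.0], [0.5]])
R_est, t_est = rigid_register(A, R_true @ A + t_true)
```

With noise-free correspondences the estimated transform matches the ground truth to machine precision; the paper's contribution is predicting accuracy when the measurements are noisy and anisotropic.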

  2. Quantification of Listeria monocytogenes in minimally processed leafy vegetables using a combined method based on enrichment and 16S rRNA real-time PCR.

    Science.gov (United States)

    Aparecida de Oliveira, Maria; Abeid Ribeiro, Eliana Guimarães; Morato Bergamini, Alzira Maria; Pereira De Martinis, Elaine Cristina

    2010-02-01

    Modern lifestyle has markedly changed eating habits worldwide, with an increasing demand for ready-to-eat foods such as minimally processed fruits and leafy greens. Packaging and storage conditions of those products may favor the growth of psychrotrophic bacteria, including the pathogen Listeria monocytogenes. In this work, minimally processed leafy vegetable samples (n = 162) from the retail market of Ribeirão Preto, São Paulo, Brazil, were tested for the presence or absence of Listeria spp. by the immunoassay Listeria Rapid Test (Oxoid). Two L. monocytogenes-positive and six artificially contaminated samples of minimally processed leafy vegetables were evaluated by the Most Probable Number (MPN) technique, with detection by the classical culture method and also by the culture method combined with real-time PCR (RTi-PCR) targeting 16S rRNA genes of L. monocytogenes. Positive MPN enrichment tubes were analyzed by RTi-PCR with primers specific for L. monocytogenes using the commercial preparation ABSOLUTE QPCR SYBR Green Mix (ABgene, UK). The real-time PCR assay presented good exclusivity and inclusivity results, and no statistically significant difference was found in comparison with the conventional culture method (p < 0.05). Moreover, RTi-PCR was fast and easy to perform, with MPN results obtained in ca. 48 h, compared to 7 days for the conventional method.
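
    The MPN estimate underlying such counts can be obtained from the positive-tube pattern by maximum likelihood instead of table lookup. A sketch under the usual Poisson assumption; the 3-1-0 pattern with 10/1/0.1 mL inocula below is a textbook illustration, not data from this study:

```python
import math

def mpn_mle(positives, tubes, volumes, lo=1e-6, hi=1e3, iters=200):
    """Maximum-likelihood MPN (organisms per unit volume): solve
    sum_i g_i*v_i / (1 - exp(-lam*v_i)) = sum_i t_i*v_i  by bisection,
    where g_i positive tubes out of t_i, each inoculated with volume v_i."""
    rhs = sum(t * v for t, v in zip(tubes, volumes))

    def lhs(lam):
        return sum(g * v / (1.0 - math.exp(-lam * v))
                   for g, v in zip(positives, volumes) if g > 0)

    for _ in range(iters):
        mid = math.sqrt(lo * hi)      # bisect in log space
        if lhs(mid) > rhs:            # lhs decreases monotonically in lam
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# classic 3-tube series, 10 / 1 / 0.1 mL inocula, positive pattern 3-1-0
lam = mpn_mle(positives=(3, 1, 0), tubes=(3, 3, 3), volumes=(10.0, 1.0, 0.1))
```

For the 3-1-0 pattern this gives roughly 0.43 organisms/mL, i.e. about 43 MPN/100 mL, in line with standard MPN tables.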

  3. Minimizing surgical skin incision scars with a latex surgical glove.

    Science.gov (United States)

    Han, So-Eun; Ryoo, Suk-Tae; Lim, So Young; Pyon, Jai-Kyung; Bang, Sa-Ik; Oh, Kap-Sung; Mun, Goo-Hyun

    2013-04-01

    The current trend in minimally invasive surgery is to make a small surgical incision. However, the excessive tensile stress applied by the retractors to the skin surrounding the incision often results in a long wound healing time and extensive scarring. To minimize these types of wound problems, the authors evaluated a simple and cost-effective method to minimize surgical incision scars based on the use of a latex surgical glove. The tunnel-shaped part of a powder-free latex surgical glove was applied to the incision and the dissection plane. It was fixed to the full layer of the dissection plane with sutures. The glove on the skin surface then was sealed with Ioban (3M Health Care, St. Paul, MN, USA) to prevent movement. The operation proceeded as usual, with the retractor running through the tunnel of the latex glove. It was possible to complete the operation without any disturbance of the visual field by the surgical glove, and the glove was neither torn nor separated by the retractors. The retractors caused traction and friction during the operation, but the extent of damage to the postoperative skin incision margin was remarkably less than when the operation was performed without a glove. This simple and cost-effective method is based on the use of a latex surgical glove to protect the surgical skin incision site and improve the appearance of the postoperative scar. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266.

  4. Particle production after inflation with non-minimal derivative coupling to gravity

    International Nuclear Information System (INIS)

    Ema, Yohei; Jinno, Ryusuke; Nakayama, Kazunori; Mukaida, Kyohei

    2015-01-01

    We study cosmological evolution after inflation in models with non-minimal derivative coupling to gravity. The background dynamics is solved, and particle production associated with the rapidly oscillating Hubble parameter is studied in detail. In addition, production of gravitons through the non-minimal derivative coupling with the inflaton is studied. We also find that the sound speed squared of the scalar perturbation oscillates between positive and negative values when the non-minimal derivative coupling dominates over the minimal kinetic term. This may lead to an instability of this model. We point out that the particle production rates are the same as those in Einstein gravity with the minimal kinetic term if we require that the sound speed squared be positive definite

  5. Stabilization of a locally minimal forest

    Science.gov (United States)

    Ivanov, A. O.; Mel'nikova, A. E.; Tuzhilin, A. A.

    2014-03-01

    The method of partial stabilization of locally minimal networks, which was invented by Ivanov and Tuzhilin to construct examples of shortest trees with given topology, is developed. According to this method, boundary vertices of degree 2 are not added to all edges of the original locally minimal tree, but only to some of them. The problem of partial stabilization of locally minimal trees in a finite-dimensional Euclidean space is solved completely in the paper, that is, without any restrictions imposed on the number of edges remaining free of subdivision. A criterion for the realizability of such stabilization is established. In addition, the general problem of searching for the shortest forest connecting a finite family of boundary compact sets in an arbitrary metric space is formalized; it is shown that such forests exist for any family of compact sets if and only if for any finite subset of the ambient space there exists a shortest tree connecting it. The theory developed here allows us to establish further generalizations of the stabilization theorem both for arbitrary metric spaces and for metric spaces with some special properties. Bibliography: 10 titles.
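
    As background for the constructions above: the simplest locally minimal network, a tree on three boundary points whose triangle angles are all below 120°, meets them through a single Steiner (Fermat) point. A generic way to locate it numerically is Weiszfeld iteration (a standard sketch, not the authors' stabilization procedure):

```python
import math

def fermat_point(pts, iters=500):
    """Weiszfeld iteration: the point minimizing total distance to the
    terminals (the Steiner/Fermat point when all angles are < 120 degrees)."""
    # start near (but not exactly at) the centroid
    x = sum(p[0] for p in pts) / len(pts) + 0.1
    y = sum(p[1] for p in pts) / len(pts)
    for _ in range(iters):
        nx = ny = den = 0.0
        for px, py in pts:
            d = math.hypot(x - px, y - py)
            if d < 1e-12:              # landed on a terminal: stop
                return x, y
            nx += px / d
            ny += py / d
            den += 1.0 / d
        x, y = nx / den, ny / den      # distance-weighted average
    return x, y

# equilateral triangle: the Fermat point coincides with the centroid
tri = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3.0) / 2.0)]
fx, fy = fermat_point(tri)
```

For an equilateral triangle the Fermat point coincides with the centroid, which makes a convenient correctness check.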

  6. 2013 Los Alamos National Laboratory Hazardous Waste Minimization Report

    Energy Technology Data Exchange (ETDEWEB)

    Salzman, Sonja L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); English, Charles J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2015-08-24

    Waste minimization and pollution prevention are inherent goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE) and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program (a component of the overall Waste Minimization/Pollution Prevention [WMin/PP] Program) administered by the Environmental Stewardship Group (ENV-ES). This report also supports the waste minimization and pollution prevention goals of the Environmental Programs Directorate (EP) organizations that are responsible for implementing remediation activities and describes its programs to incorporate waste reduction practices into remediation activities and procedures. LANS was very successful in fiscal year (FY) 2013 (October 1-September 30) in WMin/PP efforts. Staff funded four projects specifically related to reduction of waste with hazardous constituents, and LANS won four national awards for pollution prevention efforts from the National Nuclear Security Administration (NNSA). In FY13, there was no hazardous, mixed transuranic (MTRU), or mixed low-level (MLLW) remediation waste generated at the Laboratory. More hazardous waste, MTRU waste, and MLLW were generated in FY13 than in FY12, and the majority of the increase was related to MTRU processing or lab cleanouts. These accomplishments and analysis of the waste streams are discussed in much more detail within this report.

  7. Cognitive radio adaptation for power consumption minimization using biogeography-based optimization

    International Nuclear Information System (INIS)

    Qi Pei-Han; Zheng Shi-Lian; Yang Xiao-Niu; Zhao Zhi-Jin

    2016-01-01

    Adaptation is one of the key capabilities of cognitive radio, which focuses on how to adjust the radio parameters to optimize the system performance based on the knowledge of the radio environment and its capability and characteristics. In this paper, we consider the cognitive radio adaptation problem for power consumption minimization. The problem is formulated as a constrained power consumption minimization problem, and the biogeography-based optimization (BBO) is introduced to solve this optimization problem. A novel habitat suitability index (HSI) evaluation mechanism is proposed, in which both the power consumption minimization objective and the quality of services (QoS) constraints are taken into account. The results show that under different QoS requirement settings corresponding to different types of services, the algorithm can minimize power consumption while still maintaining the QoS requirements. Comparison with particle swarm optimization (PSO) and cat swarm optimization (CSO) reveals that BBO works better, especially at the early stage of the search, which means that the BBO is a better choice for real-time applications. (paper)
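
    A generic continuous BBO loop of the kind the paper applies (rank-based immigration/emigration rates, roulette-wheel migration, mutation, and elitism) can be sketched as follows. The sphere function stands in for the paper's power/QoS habitat suitability index, and every parameter value is illustrative:

```python
import random

def bbo_minimize(f, bounds, n_hab=20, n_gen=150, p_mut=0.05, seed=1):
    """Biogeography-based optimization (minimization), generic sketch."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_hab)]
    for _ in range(n_gen):
        pop.sort(key=f)                                 # best habitat first
        lam = [(k + 1) / n_hab for k in range(n_hab)]   # immigration by rank
        mu = [1.0 - l for l in lam]                     # emigration by rank
        new_pop = [pop[0][:]]                           # elitism: keep the best
        for k in range(1, n_hab):
            h = pop[k][:]
            for d in range(dim):
                if rng.random() < lam[k]:
                    # roulette-wheel choice of the emigrating habitat by mu
                    r = rng.random() * sum(mu)
                    acc = 0.0
                    for j in range(n_hab):
                        acc += mu[j]
                        if acc >= r:
                            h[d] = pop[j][d]            # immigrate this feature
                            break
                if rng.random() < p_mut:                # random mutation
                    lo, hi = bounds[d]
                    h[d] = rng.uniform(lo, hi)
            new_pop.append(h)
        pop = new_pop
    return min(pop, key=f)

# stand-in objective: sphere function (the paper's HSI combines power and QoS)
best = bbo_minimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 3)
```

In the paper the habitat encodes the radio parameters and the HSI penalizes both power consumption and QoS violations; swapping that objective into this loop changes only the function `f`.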

  8. On Time with Minimal Expected Cost!

    DEFF Research Database (Denmark)

    David, Alexandre; Jensen, Peter Gjøl; Larsen, Kim Guldstrand

    2014-01-01

    (Priced) timed games are two-player quantitative games involving an environment assumed to be completely antagonistic. Classical analysis consists in the synthesis of strategies ensuring safety, time-bounded or cost-bounded reachability objectives. Assuming a randomized environment, the (priced) timed game essentially defines an infinite-state Markov (reward) decision process. In this setting the objective is classically to find a strategy that will minimize the expected reachability cost, but with no guarantees on worst-case behaviour. In this paper, we provide efficient methods for computing reachability strategies that will both ensure worst-case time-bounds as well as provide (near-) minimal expected cost. Our method extends the synthesis algorithms of the synthesis tool Uppaal-Tiga with suitably adapted reinforcement learning techniques, which exhibit several orders of magnitude improvements w...

  9. Guidelines, "minimal requirements" and standard of care in glioblastoma around the Mediterranean Area: A report from the AROME (Association of Radiotherapy and Oncology of the Mediterranean arEa) Neuro-Oncology working party.

    Science.gov (United States)

    2016-02-01

    Glioblastoma is the most common and the most lethal primary brain tumor in adults. Although studies are ongoing, the epidemiology of glioblastoma in North Africa (i.e. Morocco, Algeria and Tunisia) remains imperfectly settled and needs to be specified for a better optimization of the neuro-oncology healthcare across the Mediterranean area and in North Africa countries. Over the last years significant therapeutic advances have been accomplished improving survival and quality of life of glioblastoma patients. Indeed, concurrent temozolomide-radiotherapy (temoradiation) and adjuvant temozolomide has been established as the standard of care associated with a survival benefit and a better outcome. Therefore, considering this validated strategy and regarding the means and some other North Africa countries specificities, we decided, under the auspices of AROME (association of radiotherapy and oncology of the Mediterranean area; www.aromecancer.org), a non-profit organization, to organize a dedicated meeting to discuss the standards and elaborate a consensus on the "minimal requirements" adapted to the local resources. Thus, panels of physicians involved in daily multidisciplinary brain tumors management in the two borders of the Mediterranean area have been invited to the AROME neuro-oncology working party. We report here the consensus, established for minimal human and material resources for glioblastoma diagnosis and treatment faced to the standard of care, which should be reached. If the minimal requirements are not reached, the patients should be referred to the closest specialized medical center where at least minimal requirements, or, ideally, the standard of care should be guaranteed to the patients. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  10. Quality functions for requirements engineering in system development methods.

    Science.gov (United States)

    Johansson, M; Timpka, T

    1996-01-01

    Based on a grounded theory framework, this paper analyses the quality characteristics of methods to be used for requirements engineering in the development of medical decision support systems (MDSS). The results from a Quality Function Deployment (QFD) used to rank functions connected to user value and a focus group study were presented to a validation focus group. The focus group studies take advantage of a group process to collect data for further analyses. The results describe factors considered by the participants as important in the development of methods for requirements engineering in health care. Based on the findings, the content that, according to the users, an MDSS development method should support is established.

  11. An interval-based possibilistic programming method for waste management with cost minimization and environmental-impact abatement under uncertainty.

    Science.gov (United States)

    Li, Y P; Huang, G H

    2010-09-15

    Considerable public concerns have been raised in the past decades since a large amount of pollutant emissions from municipal solid waste (MSW) disposal processes pose risks to the surrounding environment and human health. Moreover, in MSW management, various uncertainties exist in the related costs, impact factors and objectives, which can affect the optimization processes and the decision schemes generated. In this study, an interval-based possibilistic programming (IBPP) method is developed for planning MSW management with minimized system cost and environmental impact under uncertainty. The developed method can deal with uncertainties expressed as interval values and fuzzy sets in the left- and right-hand sides of constraints and objective function. An interactive algorithm is provided for solving the IBPP problem, which does not lead to more complicated intermediate submodels and has a relatively low computational requirement. The developed model is applied to a case study of planning an MSW management system, where the mixed integer linear programming (MILP) technique is introduced into the IBPP framework to facilitate dynamic analysis for decisions of timing, sizing and siting in terms of capacity expansion for waste-management facilities. Three cases based on different waste-management policies are examined. The results obtained indicate that inclusion of environmental impacts in the optimization model can change the traditional waste-allocation pattern based merely on the economic-oriented planning approach. The results can help identify desired alternatives for managing MSW, with the advantage of providing compromise schemes under an integrated consideration of economic efficiency and environmental impact under uncertainty. Copyright 2010 Elsevier B.V. All rights reserved.
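The interval part of such an approach can be illustrated with a toy allocation problem: solving one deterministic submodel with all lower-bound cost coefficients and one with all upper bounds yields an interval-valued minimum cost. The facilities, unit costs and capacity below are hypothetical, and the sketch deliberately omits the fuzzy (possibilistic) sets and the MILP capacity-expansion decisions of the actual model.

```python
from itertools import product  # not strictly needed for this 1-D search

# Hypothetical example: allocate 100 t/day of waste between a landfill and an
# incinerator, where unit costs are only known as intervals [lo, hi] in $/t.
cost_landfill = (30.0, 45.0)
cost_incinerator = (55.0, 70.0)
capacity_incinerator = 60.0  # t/day

def total_cost(x_inc, use_upper):
    """Daily cost of sending x_inc tonnes to the incinerator, rest to landfill.
    use_upper = 0 picks lower-bound coefficients, 1 picks upper bounds."""
    x_lf = 100.0 - x_inc
    return cost_landfill[use_upper] * x_lf + cost_incinerator[use_upper] * x_inc

# Interval solution from two deterministic submodels (brute force over a grid,
# standing in for the interactive algorithm of the paper):
candidates = [x for x in range(0, 101, 10) if x <= capacity_incinerator]
best_lower = min(candidates, key=lambda x: total_cost(x, 0))
best_upper = min(candidates, key=lambda x: total_cost(x, 1))
cost_interval = (total_cost(best_lower, 0), total_cost(best_upper, 1))
# cost_interval == (3000.0, 4500.0): the system cost lies in this interval.
```

With cost alone, everything goes to the cheaper landfill; adding an environmental-impact term to the objective, as the paper does, is what shifts the allocation pattern.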

  12. On Birnbaum importance assessment for aging multi-state system under minimal repair by using the Lz-transform method

    International Nuclear Information System (INIS)

    Lisnianski, Anatoly; Frenkel, Ilia; Khvatskin, Lev

    2015-01-01

    This paper considers a reliability importance evaluation for components in an aging multi-state system. In practical reliability engineering a “curse of dimensionality” (the large number of states that should be analyzed for a multi-state system model) is a main obstacle for importance assessment. In order to challenge the problem, this paper proposes a new method that is based on an L_Z-transform of the discrete-state continuous-time Markov process and on Ushakov's Universal Generating Operator. The paper shows that the proposed method can drastically reduce a computational burden. In order to illustrate the method, a solution of a real world problem is presented as a numerical example. - Highlights: • Aging multi-state system under minimal repair is studied. • A new method for Birnbaum importance assessment is developed. • The method is based on the L_Z-transform. • The proposed method provides a drastic reduction of computation burden. • Numerical example is presented in order to illustrate the method
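The composition step underlying Ushakov's Universal Generating Operator — to which the L_Z-transform approach reduces once the Markov state probabilities at a given time are known — can be sketched as follows. The components, performance levels and probabilities are invented for illustration; this is the static composition step only, not the paper's importance-assessment method.

```python
from collections import defaultdict

def ugf_compose(u1, u2, op):
    """Ushakov's universal generating operator over two components.

    u1, u2 -- dicts {performance_level: probability} for each component
    op     -- structure function combining levels (min for a series connection,
              sum for parallel flow-sharing)
    """
    out = defaultdict(float)
    for g1, p1 in u1.items():
        for g2, p2 in u2.items():
            out[op(g1, g2)] += p1 * p2
    return dict(out)

# Hypothetical components with performance in m^3/h at some time t:
pump = {0: 0.05, 50: 0.15, 100: 0.80}
valve = {0: 0.02, 100: 0.98}

system = ugf_compose(pump, valve, min)  # series connection
# Probability the system delivers at least the demand of 50 m^3/h:
availability = sum(p for g, p in system.items() if g >= 50)
```

Here `system` collapses the 3 × 2 joint states into 3 distinct performance levels, which is the state-space reduction the paper exploits against the curse of dimensionality.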

  13. A criterion for flatness in minimal area metrics that define string diagrams

    International Nuclear Information System (INIS)

    Ranganathan, K.; Massachusetts Inst. of Tech., Cambridge, MA

    1992-01-01

    It has been proposed that the string diagrams of closed string field theory be defined by a minimal area problem that requires that all nontrivial homotopy curves have length greater than or equal to 2π. Consistency requires that the minimal area metric be flat in a neighbourhood of the punctures. The theorem proven in this paper yields a criterion which, if satisfied, ensures this requirement. The theorem states roughly that the metric is flat in an open set U if there is a unique closed curve of length 2π through every point in U and all of these closed curves are in the same free homotopy class. (orig.)

  14. Operational tank leak detection and minimization during retrieval

    International Nuclear Information System (INIS)

    Hertzel, J.S.

    1996-03-01

    This report evaluates the activities associated with the retrieval of wastes from the single-shell tanks proposed under the initial Single-Shell Tank Retrieval System. This report focuses on minimizing leakage during retrieval by using effective leak detection and mitigating actions. After reviewing the historical data available on single-shell tank leakage and evaluating current leak detection technology, this report concludes that the only currently available leak detection method which can function within the most probable leakage range is the mass balance system. If utilized after each sluicing campaign, this method should allow detection at a leakage value well below the value, calculated for each tank, at which significant health effects occur. Furthermore, this report concludes that the planned sequence of sluicing activities will serve to further minimize the probability and volume of leaks by keeping liquid away from areas with the greatest potential for leaking. Finally, this report identifies a series of operational responses which, when used in conjunction with the recommended sluicing sequence and leak detection methods, will minimize worker exposure and environment, safety, and health risks.
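A mass balance check of the kind described reduces to comparing the volume added and removed during a sluicing campaign with the change in tank inventory. The sketch below uses hypothetical volumes and tank geometry, not values from the report, and ignores instrument error bands that a real detection decision would have to include.

```python
def mass_balance_leak(added, removed, level_change_m, tank_area_m2):
    """Volume-balance leak estimate for one sluicing campaign (a sketch, not the
    report's procedure). `added` and `removed` are in m^3; the tank inventory
    change is inferred from the liquid level change and the tank cross-section."""
    inventory_change = level_change_m * tank_area_m2  # m * m^2 = m^3
    # Apparent leak = what went in, minus what came out, minus what remains.
    return added - removed - inventory_change

# Hypothetical campaign: 120 m^3 of sluicing liquid in, 95 m^3 pumped out,
# liquid level rose 5 cm in a tank of 410 m^2 cross-section.
leak = mass_balance_leak(added=120.0, removed=95.0,
                         level_change_m=0.05, tank_area_m2=410.0)
# 120 - 95 - 20.5 = 4.5 m^3 apparent loss
```

In practice such an estimate is only meaningful when compared against the measurement uncertainty of the level and flow instruments, which bounds the minimum detectable leak.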

  15. Effectiveness and efficacy of minimally invasive lung volume reduction surgery for emphysema.

    Science.gov (United States)

    Pertl, Daniela; Eisenmann, Alexander; Holzer, Ulrike; Renner, Anna-Theresa; Valipour, A

    2014-01-01

    Lung emphysema is a chronic, progressive and irreversible destruction of the lung tissue. Besides non-medical therapies and the well established medical treatment there are surgical and minimally invasive methods for lung volume reduction (LVR) to treat severe emphysema. This report deals with the effectiveness and cost-effectiveness of minimally invasive methods compared to other treatments for LVR in patients with lung emphysema. Furthermore, legal and ethical aspects are discussed. No clear benefit of minimally invasive methods compared to surgical methods can be demonstrated based on the identified and included evidence. In order to assess the different methods for LVR regarding their relative effectiveness and safety in patients with lung emphysema direct comparative studies are necessary.

  16. Responsiveness and minimal clinically important change

    DEFF Research Database (Denmark)

    Christiansen, David Høyrup; Frost, Poul; Falla, Deborah

    2015-01-01

    Study Design A prospective cohort study nested in a randomized controlled trial. Objectives To determine and compare responsiveness and minimal clinically important change of the modified Constant score (CS) and the Oxford Shoulder Score (OSS). Background The OSS and the CS are commonly used to assess shoulder outcomes. However, few studies have evaluated the measurement properties of the OSS and CS in terms of responsiveness and minimal clinically important change. Methods The study included 126 patients who reported having difficulty returning to usual activities 8 to 12 weeks after … were observed for the CS and the OSS. Minimal clinically important change ROC values were 6 points for the OSS and 11 points for the CS, with upper 95% cutoff limits of 12 and 22 points, respectively. Conclusion The CS and the OSS were both suitable for assessing improvement after decompression surgery.

  17. Evaluation of the Efficiency and Effectiveness of Three Minimally Invasive Methods of Caries Removal: An in vitro Study

    OpenAIRE

    Boob, Ankush Ramnarayan; Manjula, M; Reddy, E Rajendra; Srilaxmi, N; Rani, Tabitha

    2014-01-01

    Background: Many chemomechanical caries removal (CMCR) agents have been introduced and marketed since the 1970s, each new one reportedly more effective than the one before. Papacarie and Carisolv are new systems in the field of CMCR techniques. These are reportedly minimally invasive methods of removing carious dentin while preserving sound dentin. Aim: To compare the efficiency (time taken for caries removal) and effectiveness (Knoop hardness number of the remaining den...

  18. Minimal requirements for quality controls in radiotherapy with external beams; Controlli di qualita' essenziali in radioterapia con fasci esterni

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-07-01

    Physical dosimetric guidelines have been developed by the Italian National Institute of Health study group on quality assurance in radiotherapy to define protocols for quality controls in external beam radiotherapy. While the document does not determine strict rules or firm recommendations, it suggests minimal requirements for quality controls necessary to guarantee an adequate degree of accuracy in external beam radiotherapy. [Italian] The "Quality assurance in radiotherapy" study group of the Istituto Superiore di Sanita' presents guidelines for drafting the essential quality control protocols needed to guarantee an adequate level of accuracy of the radiation treatment; these therefore represent an essential part of the overall physical-dosimetric contribution to quality assurance in radiotherapy with external beams.

  19. Automatic optimized reload and depletion method for a pressurized water reactor

    International Nuclear Information System (INIS)

    Ahn, D.H.; Levene, S.H.

    1985-01-01

    A new method has been developed to automatically reload and deplete a pressurized water reactor (PWR) so that both the enriched inventory requirements during the reactor cycle and the cost of reloading the core are minimized. This is achieved through four stepwise optimization calculations: (a) determination of the minimum fuel requirement for an equivalent three-region core model, (b) optimal selection and allocation of fuel assemblies for each of the three regions to minimize the reload cost, (c) optimal placement of fuel assemblies to conserve regionwise optimal conditions, and (d) optimal control through poison management to deplete individual fuel assemblies to maximize end-of-cycle k_eff. The new method differs from previous methods in that the optimization process automatically performs all tasks required to reload and deplete a PWR. In addition, the previous work that developed optimization methods principally for the initial reactor cycle was modified to handle subsequent cycles with fuel assemblies having burnup at beginning of cycle. Application of the method to the fourth reactor cycle at Three Mile Island Unit 1 has shown that both the enrichment and the number of fresh reload fuel assemblies can be decreased and fully amortized fuel assemblies can be reused to minimize the fuel cost of the reactor

  20. OPEN SURGICAL VS. MINIMALLY INVASIVE TREATMENT OF THORACOLUMBAR AO FRACTURES TYPE A AND B1 IN A REFERENCE HOSPITAL

    Directory of Open Access Journals (Sweden)

    José Enrique Salcedo Oviedo

    Full Text Available Objective: The thoracolumbar spine trauma represents 30% of spinal diseases. To compare the minimally invasive technique with the open technique in lumbar fractures. Method: A prospective, cross-sectional, comparative observational study, which evaluated the following variables: surgery time, length of hospital stay, transoperative bleeding, and postoperative pain, analyzed with SPSS software using Student's t test with statistical significance set at p < 0.05, in 24 patients with single-level thoracolumbar fractures, randomly treated with percutaneous pedicle screws or by open technique with a transpedicular system. Results: The surgery time was 90 minutes for the minimally invasive technique and 60 minutes for the open technique; the bleeding was on average 50 cm3 vs. 400 cm3. The mean visual analogue scale for pain at 24 hours after surgery was 5 for the minimally invasive group vs. 8 for the open group. The number of fluoroscopic projections of pedicle screws was 220 in the minimally invasive technique vs. 100 in the traditional technique. Quantified bleeding was minimal for percutaneous access vs. 340 cm3 for the traditional system. Hospital discharge for the minimally invasive group was at 24 hours and at 72 hours for those treated with open surgery. Conclusions: The minimally invasive technique requires longer surgical time and a learning curve; it showed less bleeding, less postoperative pain and earlier hospital discharge, with statistical significance for bleeding and the visual analogue pain scale and no significant difference in surgical time.

  1. Distal tibial pilon fractures (AO/OTA type B, and C) treated with the external skeletal and minimal internal fixation method.

    Science.gov (United States)

    Milenković, Sasa; Mitković, Milorad; Micić, Ivan; Mladenović, Desimir; Najman, Stevo; Trajanović, Miroslav; Manić, Miodrag; Mitković, Milan

    2013-09-01

    Distal tibial pilon fractures include extra-articular fractures of the tibial metaphysis and the more severe intra-articular tibial pilon fractures. There is no universal method for treating distal tibial pilon fractures. These fractures are treated by means of open reduction, internal fixation (ORIF) and external skeletal fixation. The high rate of soft-tissue complications associated with primary ORIF of pilon fractures led to the use of external skeletal fixation, with limited internal fixation as an alternative technique for definitive management. The aim of this study was to estimate efficacy of distal tibial pilon fractures treatment using the external skeletal and minimal internal fixation method. We presented a series of 31 operated patients with tibial pilon fractures. The patients were operated on using the method of external skeletal fixation with a minimal internal fixation. According to the AO/OTA classification, 17 patients had type B fractures and 14 patients type C fractures. The rigid external skeletal fixation was transformed into a dynamic external skeletal fixation 6 weeks post-surgery. This retrospective study involved 31 patients with tibial pilon fractures, average age 41.81 (from 21 to 60) years. The average follow-up was 21.86 (from 12 to 48) months. The percentage of union was 90.32%, nonunion 3.22% and malunion 6.45%. The mean to fracture union was 14 (range 12-20) weeks. There were 4 (12.19%) infections around the pins of the external skeletal fixator and one (3.22%) deep infection. The ankle joint arthrosis as a late complication appeared in 4 (12.90%) patients. All arthroses appeared in patients who had type C fractures. The final functional results based on the AOFAS score were excellent in 51.61%, good in 32.25%, average in 12.90% and bad in 3.22% of the patients. External skeletal fixation and minimal internal fixation of distal tibial pilon fractures is a good method for treating all types of intra-articular pilon fractures.

  2. Quantitative methods for developing C2 system requirement

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, K.K.

    1992-06-01

    The US Army established the Army Tactical Command and Control System (ATCCS) Experimentation Site (AES) to provide a place where material and combat developers could experiment with command and control systems. The AES conducts fundamental and applied research involving command and control issues using a number of research methods, ranging from large force-level experiments, to controlled laboratory experiments, to studies and analyses. The work summarized in this paper was done by Pacific Northwest Laboratory under task order from the Army Tactical Command and Control System Experimentation Site. The purpose of the task was to develop the functional requirements for army engineer automation and support software, including MCS-ENG. A client, such as an army engineer, has certain needs and requirements of his or her software; these needs must be presented in ways that are readily understandable to the software developer. A requirements analysis, then, such as the one described in this paper, is simply the means of communication between those who would use a piece of software and those who would develop it. The analysis from which this paper was derived attempted to bridge the "communications gap" between army combat engineers and software engineers. It sought to derive and state the software needs of army engineers in ways that are meaningful to software engineers. In doing this, it followed a natural sequence of investigation: (1) what does an army engineer do, (2) with which tasks can software help, (3) how much will it cost, and (4) where is the highest payoff? This paper demonstrates how each of these questions was addressed during an analysis of the functional requirements of engineer support software. Systems engineering methods were used in a task analysis, and a quantitative scoring method was developed to score responses regarding the feasibility of task automation. The paper discusses the methods used to perform utility and cost-benefit estimates.

  3. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there were a certain amount of defects regarding missing or defective stimuli and responses, which can be detected using our proposed method if this method is used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.
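The proposed criterion can be illustrated with two toy state-transition tables: pairing the observable events one actor sends with those the other actor expects to receive surfaces missing stimuli or responses. The actors, states and event names below are invented for illustration, not taken from the paper or its case study.

```python
# Hypothetical state-transition tables for two interacting actors: a user and a
# resource managed by the information system. "!" marks an observable stimulus
# or response the actor sends; "?" marks one it expects to receive.
user = {
    "idle":    {"!request": "waiting"},
    "waiting": {"?grant": "using", "?deny": "idle"},
    "using":   {"!release": "idle"},
}
resource = {
    "free":      {"?request": "allocated"},
    "allocated": {"!grant": "allocated", "?release": "free"},
}

def unmatched(sender, receiver):
    """Interaction defects in the sense of the validation criteria: events one
    actor sends that the other never receives, and vice versa."""
    sent = {e[1:] for trans in sender.values() for e in trans if e.startswith("!")}
    expected = {e[1:] for trans in receiver.values() for e in trans if e.startswith("?")}
    return {"sent_but_never_received": sorted(sent - expected),
            "expected_but_never_sent": sorted(expected - sent)}

user_to_resource = unmatched(user, resource)
resource_to_user = unmatched(resource, user)
# The resource never sends "deny", although the user's table expects it — a
# missing observable response that this check flags as a requirement defect.
```

A real application would also walk the cyclical transitions state by state, as the paper does; this set-level check only shows the flavor of the examination.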

  4. 2016 Los Alamos National Laboratory Hazardous Waste Minimization Report

    Energy Technology Data Exchange (ETDEWEB)

    Salzman, Sonja L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); English, Charles Joe [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-12-02

    Waste minimization and pollution prevention are goals within the operating procedures of Los Alamos National Security, LLC (LANS). The US Department of Energy (DOE), inclusive of the National Nuclear Security Administration (NNSA) and the Office of Environmental Management, and LANS are required to submit an annual hazardous waste minimization report to the New Mexico Environment Department (NMED) in accordance with the Los Alamos National Laboratory (LANL or the Laboratory) Hazardous Waste Facility Permit. The report was prepared pursuant to the requirements of Section 2.9 of the LANL Hazardous Waste Facility Permit. This report describes the hazardous waste minimization program, which is a component of the overall Pollution Prevention (P2) Program, administered by the Environmental Stewardship Group (EPC-ES). This report also supports the waste minimization and P2 goals of the Associate Directorate of Environmental Management (ADEM) organizations that are responsible for implementing remediation activities and describes its programs to incorporate waste reduction practices into remediation activities and procedures. This report includes data for all waste shipped offsite from LANL during fiscal year (FY) 2016 (October 1, 2015 – September 30, 2016). LANS was active during FY2016 in waste minimization and P2 efforts. Multiple projects were funded that specifically related to reduction of hazardous waste. In FY2016, there was no hazardous, mixed-transuranic (MTRU), or mixed low-level (MLLW) remediation waste shipped offsite from the Laboratory. More non-remediation hazardous waste and MLLW was shipped offsite from the Laboratory in FY2016 compared to FY2015. Non-remediation MTRU waste was not shipped offsite during FY2016. These accomplishments and analysis of the waste streams are discussed in much more detail within this report.

  5. Effectiveness and efficacy of minimally invasive lung volume reduction surgery for emphysema

    Directory of Open Access Journals (Sweden)

    Pertl, Daniela

    2014-10-01

    Full Text Available [english] Lung emphysema is a chronic, progressive and irreversible destruction of the lung tissue. Besides non-medical therapies and the well established medical treatment there are surgical and minimally invasive methods for lung volume reduction (LVR) to treat severe emphysema. This report deals with the effectiveness and cost-effectiveness of minimally invasive methods compared to other treatments for LVR in patients with lung emphysema. Furthermore, legal and ethical aspects are discussed. No clear benefit of minimally invasive methods compared to surgical methods can be demonstrated based on the identified and included evidence. In order to assess the different methods for LVR regarding their relative effectiveness and safety in patients with lung emphysema direct comparative studies are necessary.

  6. Distal tibial pilon fractures (AO/OTA type B, and C) treated with the external skeletal and minimal internal fixation method

    Directory of Open Access Journals (Sweden)

    Milenković Saša

    2013-01-01

    Full Text Available Background/Aim. Distal tibial pilon fractures include extra-articular fractures of the tibial metaphysis and the more severe intra-articular tibial pilon fractures. There is no universal method for treating distal tibial pilon fractures. These fractures are treated by means of open reduction, internal fixation (ORIF) and external skeletal fixation. The high rate of soft-tissue complications associated with primary ORIF of pilon fractures led to the use of external skeletal fixation, with limited internal fixation as an alternative technique for definitive management. The aim of this study was to estimate efficacy of distal tibial pilon fractures treatment using the external skeletal and minimal internal fixation method. Methods. We presented a series of 31 operated patients with tibial pilon fractures. The patients were operated on using the method of external skeletal fixation with a minimal internal fixation. According to the AO/OTA classification, 17 patients had type B fractures and 14 patients type C fractures. The rigid external skeletal fixation was transformed into a dynamic external skeletal fixation 6 weeks post-surgery. Results. This retrospective study involved 31 patients with tibial pilon fractures, average age 41.81 (from 21 to 60) years. The average follow-up was 21.86 (from 12 to 48) months. The percentage of union was 90.32%, nonunion 3.22% and malunion 6.45%. The mean to fracture union was 14 (range 12-20) weeks. There were 4 (12.19%) infections around the pins of the external skeletal fixator and one (3.22%) deep infection. The ankle joint arthrosis as a late complication appeared in 4 (12.90%) patients. All arthroses appeared in patients who had type C fractures. The final functional results based on the AOFAS score were excellent in 51.61%, good in 32.25%, average in 12.90% and bad in 3.22% of the patients. Conclusion. External skeletal fixation and minimal internal fixation of distal tibial pilon fractures is a good method for

  7. Matrix factorizations, minimal models and Massey products

    International Nuclear Information System (INIS)

    Knapp, Johanna; Omer, Harun

    2006-01-01

    We present a method to compute the full non-linear deformations of matrix factorizations for ADE minimal models. This method is based on the calculation of higher products in the cohomology, called Massey products. The algorithm yields a polynomial ring whose vanishing relations encode the obstructions of the deformations of the D-branes characterized by these matrix factorizations. This coincides with the critical locus of the effective superpotential which can be computed by integrating these relations. Our results for the effective superpotential are in agreement with those obtained from solving the A-infinity relations. We point out a relation to the superpotentials of Kazama-Suzuki models. We will illustrate our findings by various examples, putting emphasis on the E_6 minimal model

  8. Robust imaging of localized scatterers using the singular value decomposition and ℓ1 minimization

    International Nuclear Information System (INIS)

    Chai, A; Moscoso, M; Papanicolaou, G

    2013-01-01

    We consider narrow band, active array imaging of localized scatterers in a homogeneous medium with and without additive noise. We consider both single and multiple illuminations and study ℓ1 minimization-based imaging methods. We show that for large arrays, with array diameter comparable to range, and when scatterers are sparse and well separated, ℓ1 minimization using a single illumination and without additive noise can recover the location and reflectivity of the scatterers exactly. For multiple illuminations, we introduce a hybrid method which combines the singular value decomposition and ℓ1 minimization. This method can be used when the essential singular vectors of the array response matrix are available. We show that with this hybrid method we can recover the location and reflectivity of the scatterers exactly when there is no noise in the data. Numerical simulations indicate that the hybrid method is, in addition, robust to noise in the data. We also compare the ℓ1 minimization-based methods with others including Kirchhoff migration, ℓ2 minimization and multiple signal classification. (paper)
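A standard way to see why ℓ1 minimization picks out sparse scatterers is the iterative soft-thresholding algorithm (ISTA). The sketch below solves the regularized variant of the problem on a toy sensing matrix; it is not the paper's exact formulation (which also treats the noiseless equality-constrained case and the SVD-hybrid method), and the matrix and data are invented.

```python
def soft(v, t):
    """Soft-thresholding: the proximal operator of the l1 norm."""
    return v - t if v > t else v + t if v < -t else 0.0

def ista(A, y, lam=0.01, step=0.1, iters=5000):
    """Iterative soft-thresholding (ISTA) for
    min_x 0.5*||Ax - y||^2 + lam*||x||_1, a standard l1-solver sketch."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # Gradient of the quadratic term: A^T (A x - y).
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # Gradient step followed by the l1 proximal (shrinkage) step.
        x = [soft(x[j] - step * g[j], step * lam) for j in range(n)]
    return x

# Toy setup: two measurements, three candidate scatterer locations, true
# reflectivity x = [0, 0, 2]. The dense alternative [2, 2, 0] fits the data
# equally well, but the l1 solution prefers the sparse answer.
A = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 1.0]]
y = [2.0, 2.0]
x_hat = ista(A, y)  # converges to approximately [0, 0, 1.995]
```

The small bias on the recovered amplitude (1.995 instead of 2) is the usual shrinkage of the regularized problem; it vanishes as the regularization weight goes to zero.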

  9. Critical management practices influencing on-site waste minimization in construction projects.

    Science.gov (United States)

    Ajayi, Saheed O; Oyedele, Lukumon O; Bilal, Muhammad; Akinade, Olugbenga O; Alaka, Hafiz A; Owolabi, Hakeem A

    2017-01-01

    As a result of increasing recognition of effective site management as the strategic approach for achieving the required performance in construction projects, this study seeks to identify the key site management practices that are requisite for construction waste minimization. A mixed methods approach, involving field study and survey research, was used as the means of data collection. After confirmation of construct validity and reliability of scale, data analysis was carried out through a combination of Kruskal-Wallis test, descriptive statistics and exploratory factor analysis. The study suggests that site management functions could significantly reduce waste generation through strict adherence to project drawings, and by ensuring fewer or no design changes during the construction process. Provision of waste skips for specific materials and maximisation of on-site reuse of materials are also found to be among the key factors for engendering waste minimization. The result of factor analysis suggests four factors underlying on-site waste management practices, together accounting for 96.093% of total variance. These measures include contractual provisions for waste minimization, waste segregation, maximisation of materials reuse and effective logistic management. Strategies through which each of the underlying measures could be achieved are further discussed in the paper. Findings of this study would assist construction site managers and other site operatives in reducing waste generated by construction activities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  10. Efficient Levenberg-Marquardt minimization of the maximum likelihood estimator for Poisson deviates

    International Nuclear Information System (INIS)

    Laurence, T.; Chromy, B.

    2010-01-01

    Histograms of counted events are Poisson distributed, but are typically fitted without justification using nonlinear least squares fitting. The more appropriate maximum likelihood estimator (MLE) for Poisson distributed data is seldom used. We extend the Levenberg-Marquardt algorithm, commonly used for nonlinear least squares minimization, for use with the MLE for Poisson distributed data. In so doing, we remove any excuse for not using this more appropriate MLE. We demonstrate the use of the algorithm and the superior performance of the MLE using simulations and experiments in the context of fluorescence lifetime imaging. Scientists commonly form histograms of counted events from their data, and extract parameters by fitting to a specified model. Assuming that the probability of occurrence for each bin is small, event counts in the histogram bins will be distributed according to the Poisson distribution. We develop here an efficient algorithm for fitting event counting histograms using the maximum likelihood estimator for Poisson distributed data, rather than the nonlinear least squares measure. This algorithm is a simple extension of the common Levenberg-Marquardt (L-M) algorithm; it is simple to implement, quick and robust. Fitting using a least squares measure is most common, but it is the maximum likelihood estimator only for Gaussian-distributed data. Nonlinear least squares methods may be applied to event counting histograms in cases where the number of events is very large, so that the Poisson distribution is well approximated by a Gaussian. However, this criterion is not easy to satisfy in practice, since it requires a large number of events. It has been well known for years that least squares procedures lead to biased results when applied to Poisson-distributed data; a recent paper provides an extensive characterization of these biases in exponential fitting.
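    The modification described, replacing the least-squares objective inside a Levenberg-Marquardt loop with the Poisson deviance, can be sketched as follows. This is an illustrative, simplified implementation (the model, step control and stopping rule are our own choices, not the authors' code), fitting a single-exponential decay to simulated counts.

```python
import numpy as np

def poisson_dev(n, f):
    """Poisson deviance 2*sum(f - n + n*ln(n/f)); the MLE minimizes this."""
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(n > 0, n * np.log(n / f), 0.0)
    return 2.0 * np.sum(f - n + term)

def fit_poisson_lm(t, n, p0, n_iter=50):
    """Levenberg-Marquardt-style minimization of the Poisson deviance
    for the hypothetical model f(t) = A * exp(-k * t)."""
    p = np.array(p0, float)
    lam = 1e-3
    for _ in range(n_iter):
        A, k = p
        f = A * np.exp(-k * t)
        J = np.column_stack([f / A, -t * f])     # df/dA, df/dk
        g = J.T @ (1.0 - n / f)                  # gradient of deviance / 2
        H = (J * (n / f ** 2)[:, None]).T @ J    # Fisher-information-style approx
        dev0 = poisson_dev(n, f)
        while True:                              # damped L-M step with retries
            step = np.linalg.solve(H + lam * np.diag(np.diag(H)), -g)
            p_try = p + step
            f_try = p_try[0] * np.exp(-p_try[1] * t)
            if p_try[0] > 0 and np.all(f_try > 0) and poisson_dev(n, f_try) < dev0:
                p, lam = p_try, max(lam / 10, 1e-12)
                break
            lam *= 10
            if lam > 1e12:
                return p
    return p

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 50)
n = rng.poisson(100.0 * np.exp(-0.8 * t))        # simulated decay histogram
A_fit, k_fit = fit_poisson_lm(t, n, p0=(50.0, 0.5))
print(A_fit, k_fit)                              # should be close to the true 100 and 0.8
```

    The only structural change from least-squares L-M is the gradient and curvature of the objective; the damping/retry logic is untouched, which is why the authors call it a simple extension.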

  11. Rapid, minimally invasive adult voluntary male circumcision: A ...

    African Journals Online (AJOL)

    To compare conventional open surgical circumcision with suturing to a minimally invasive technique using a single-use-only disposable instrument (Unicirc) plus tissue adhesive. This technique completes the circumcision at the time of surgery, and requires no further visits for device removal. We hypothesised that the new ...

  12. Defining the Minimal Factors Required for Erythropoiesis through Direct Lineage Conversion

    Directory of Open Access Journals (Sweden)

    Sandra Capellera-Garcia

    2016-06-01

    Erythroid cell commitment and differentiation proceed through activation of a lineage-restricted transcriptional network orchestrated by a group of well-characterized genes. However, the minimal set of factors necessary for instructing red blood cell (RBC) development remains undefined. We employed a screen for transcription factors allowing direct lineage reprogramming from fibroblasts to induced erythroid progenitors/precursors (iEPs). We show that Gata1, Tal1, Lmo2, and c-Myc (GTLM) can rapidly convert murine and human fibroblasts directly to iEPs. The transcriptional signature of murine iEPs resembled mainly that of primitive erythroid progenitors in the yolk sac, whereas addition of Klf1 or Myb to the GTLM cocktail resulted in iEPs with a more adult-type globin expression pattern. Our results demonstrate that direct lineage conversion is a suitable platform for defining and studying the core factors inducing the different waves of erythroid development.

  13. Learn with SAT to Minimize Büchi Automata

    Directory of Open Access Journals (Sweden)

    Stephan Barth

    2012-10-01

    We describe a minimization procedure for nondeterministic Büchi automata (NBA). For an automaton A, another automaton A_min with the minimal number of states is learned with the help of a SAT solver. This is done by successively computing automata A' that approximate A in the sense that they accept a given finite set of positive examples and reject a given finite set of negative examples. In the course of the procedure these example sets are successively increased. Thus, our method can be seen as an instance of a generic learning algorithm based on a "minimally adequate teacher" in the sense of Angluin. We use a SAT solver to find an NBA for given sets of positive and negative examples. We use complementation via construction of deterministic parity automata to check candidates computed in this manner for equivalence with A. Failure of equivalence yields new positive or negative examples. Our method proved successful on complete samplings of small automata and on quite a few larger examples. We successfully ran the minimization on over ten thousand automata, mostly with up to ten states, including the complements of all possible automata with two states and alphabet size three, and discuss results and runtimes; single examples had over 100 states.

  14. Improving the performance of minimizers and winnowing schemes.

    Science.gov (United States)

    Marçais, Guillaume; Pellow, David; Bork, Daniel; Orenstein, Yaron; Shamir, Ron; Kingsford, Carl

    2017-07-15

    The minimizers scheme is a method for selecting k-mers from sequences. It is used in many bioinformatics software tools to bin comparable sequences or to sample a sequence in a deterministic fashion at approximately regular intervals, in order to reduce memory consumption and processing time. Although very useful, the minimizers selection procedure has undesirable behaviors (e.g. too many k-mers are selected when processing certain sequences). Some of these problems were already known to the authors of the minimizers technique, and the natural lexicographic ordering of k-mers used by minimizers was recognized as their origin. Many software tools using minimizers employ ad hoc variations of the lexicographic order to alleviate those issues. We provide an in-depth analysis of the effect of k-mer ordering on the performance of the minimizers technique. By using small universal hitting sets (a recently defined concept), we show how to significantly improve the performance of minimizers and avoid some of its worst behaviors. Based on these results, we encourage bioinformatics software developers to use an ordering based on a universal hitting set or, if not possible, a randomized ordering, rather than the lexicographic order. This analysis also settles negatively a conjecture (by Schleimer et al.) on the expected density of minimizers in a random sequence. The software used for this analysis is available on GitHub: https://github.com/gmarcais/minimizers.git
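    The minimizers selection procedure itself is short to state: slide a window of w consecutive k-mers and keep the smallest one under the chosen ordering, breaking ties to the left. A small Python sketch (illustrative, not the authors' software) comparing a lexicographic and a randomized ordering on a random sequence:

```python
import random

def minimizers(seq, k, w, order):
    """Positions of the (w,k)-minimizers of seq: in every window of w
    consecutive k-mers, the smallest under `order` (leftmost on ties)."""
    kmers = [seq[i:i + k] for i in range(len(seq) - k + 1)]
    positions = set()
    for start in range(len(kmers) - w + 1):
        window = range(start, start + w)
        positions.add(min(window, key=lambda i: (order(kmers[i]), i)))
    return positions

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(10000))
k, w = 7, 10

lex_pos = minimizers(seq, k, w, order=lambda s: s)          # lexicographic order
rnd = {}  # a fixed random ordering of k-mers, standing in for a hash function
rand_pos = minimizers(seq, k, w,
                      order=lambda s: rnd.setdefault(s, random.random()))

lex_density = len(lex_pos) / len(seq)
rand_density = len(rand_pos) / len(seq)
print(lex_density, rand_density)   # densities of selected positions
```

    Two properties are easy to check on the output: every window contributes a selected position, so consecutive minimizer positions are at most w apart, and for a random ordering the density lands near the often-quoted 2/(w+1).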

  15. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor. Furthermore, we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods and greatly increase the convergence speed in early iterations. Moreover, applying the TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate.
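    The total-variation ingredient can be illustrated in one dimension, independent of the CBCT setting. Below is a hedged sketch: plain gradient descent on a smoothed TV objective 0.5‖x−y‖² + λ Σ √((Δx)² + ε), with step size, λ and ε chosen for the toy signal, not taken from the paper.

```python
import numpy as np

def tv_denoise_1d(y, lam=0.5, eps=1e-2, n_iter=3000, step=0.05):
    """Gradient descent on 0.5*||x-y||^2 + lam*sum(sqrt(dx^2 + eps)),
    a smoothed total-variation objective (illustrative sketch)."""
    x = y.copy()
    for _ in range(n_iter):
        dx = np.diff(x)
        w = dx / np.sqrt(dx * dx + eps)  # derivative of the smoothed |dx|
        g = x - y                        # gradient of the data-fidelity term
        g[:-1] -= lam * w                # TV gradient at the left end of each pair
        g[1:] += lam * w                 # TV gradient at the right end
        x -= step * g
    return x

rng = np.random.default_rng(2)
clean = np.concatenate([np.zeros(50), np.ones(50)])   # piecewise-constant "image"
noisy = clean + 0.2 * rng.standard_normal(100)
den = tv_denoise_1d(noisy)
print(np.abs(noisy - clean).mean(), np.abs(den - clean).mean())
```

    TV minimization suppresses the oscillations inside the flat regions while keeping the jump, which is exactly why it pairs well with noisy low-dose reconstructions.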

  16. 75 FR 18041 - Defense Federal Acquisition Regulation Supplement; Minimizing Use of Hexavalent Chromium (DFARS...

    Science.gov (United States)

    2010-04-08

    ...-AG35 Defense Federal Acquisition Regulation Supplement; Minimizing Use of Hexavalent Chromium (DFARS... Regulation Supplement (DFARS) to address requirements for minimizing the use of hexavalent chromium in... of items containing hexavalent chromium under DoD contracts unless an exception applies. DATES...

  17. Stabilization of a locally minimal forest

    International Nuclear Information System (INIS)

    Ivanov, A O; Mel'nikova, A E; Tuzhilin, A A

    2014-01-01

    The method of partial stabilization of locally minimal networks, which was invented by Ivanov and Tuzhilin to construct examples of shortest trees with given topology, is developed. According to this method, boundary vertices of degree 2 are not added to all edges of the original locally minimal tree, but only to some of them. The problem of partial stabilization of locally minimal trees in a finite-dimensional Euclidean space is solved completely in the paper, that is, without any restrictions imposed on the number of edges remaining free of subdivision. A criterion for the realizability of such stabilization is established. In addition, the general problem of searching for the shortest forest connecting a finite family of boundary compact sets in an arbitrary metric space is formalized; it is shown that such forests exist for any family of compact sets if and only if for any finite subset of the ambient space there exists a shortest tree connecting it. The theory developed here allows us to establish further generalizations of the stabilization theorem both for arbitrary metric spaces and for metric spaces with some special properties. Bibliography: 10 titles

  18. Hanford Site waste minimization and pollution prevention awareness program plan

    International Nuclear Information System (INIS)

    Place, B.G.

    1998-01-01

    This plan, which is required by US Department of Energy (DOE) Order 5400.1, provides waste minimization and pollution prevention guidance for all Hanford Site contractors. The plan is the primary document in a hierarchical series that includes the Hanford Site Waste Minimization and Pollution Prevention Awareness Program Plan, prime contractor implementation plans, and the Hanford Site Guide for Preparing and Maintaining Generator Group Pollution Prevention Program Documentation (DOE-RL, 1997a) describing programs required by Resource Conservation and Recovery Act of 1976 (RCRA) 3002(b) and 3005(h) (RCRA and EPA, 1994). Items discussed include the pollution prevention policy and regulatory background, organizational structure, the major objectives and goals of the Hanford Site's pollution prevention program, and an itemized description of the Hanford Site pollution prevention program. The document also includes the US Department of Energy, Richland Operations Office's (RL's) statement of policy on pollution prevention as well as a listing of regulatory drivers that require a pollution prevention program

  19. SAR image regularization with fast approximate discrete minimization.

    Science.gov (United States)

    Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc

    2009-07-01

    Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task, and noise reduction is often a prerequisite for successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modeling provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on the large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the α-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in a few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to joint regularization of the amplitude and interferometric phase in urban area SAR images.

  20. A numerical method for eigenvalue problems in modeling liquid crystals

    Energy Technology Data Exchange (ETDEWEB)

    Baglama, J.; Farrell, P.A.; Reichel, L.; Ruttan, A. [Kent State Univ., OH (United States); Calvetti, D. [Stevens Inst. of Technology, Hoboken, NJ (United States)

    1996-12-31

    Equilibrium configurations of liquid crystals in finite containments are minimizers of the thermodynamic free energy of the system. It is important to be able to track the equilibrium configurations as the temperature of the liquid crystals decreases. The path of the minimal energy configuration at bifurcation points can be computed from the null space of a large sparse symmetric matrix. We describe a new variant of the implicitly restarted Lanczos method that is well suited for the computation of extreme eigenvalues of a large sparse symmetric matrix, and we use this method to determine the desired null space. Our implicitly restarted Lanczos method adaptively determines a polynomial filter using Leja shifts, and does not require factorization of the matrix. The storage requirement of the method is small, and this makes it attractive for the present application.

  1. [Minimally invasive coronary artery surgery].

    Science.gov (United States)

    Zalaquett, R; Howard, M; Irarrázaval, M J; Morán, S; Maturana, G; Becker, P; Medel, J; Sacco, C; Lema, G; Canessa, R; Cruz, F

    1999-01-01

    There is growing interest in performing a left internal mammary artery (LIMA) graft to the left anterior descending coronary artery (LAD) on a beating heart through a minimally invasive access to the chest cavity. To report the experience with minimally invasive coronary artery surgery. Analysis of 11 patients aged 48 to 79 years with single vessel disease who, between 1996 and 1997, had a LIMA graft to the LAD performed through a minimally invasive left anterior mediastinotomy, without cardiopulmonary bypass. A 6 to 10 cm left parasternal incision was made. The LIMA to LAD anastomosis was done after pharmacological heart rate and blood pressure control and a period of ischemic preconditioning. Graft patency was confirmed intraoperatively by standard Doppler techniques. Patients were followed for a mean of 11.6 months (7-15 months). All patients were extubated in the operating room and transferred out of the intensive care unit on the next morning. Seven patients were discharged on the third postoperative day. Duplex scanning confirmed graft patency in all patients before discharge; in two patients, it was additionally confirmed by arteriography. There was no hospital mortality, no perioperative myocardial infarction and no bleeding problems. At follow-up, ten patients were free of angina, in functional class I and pleased with the surgical and cosmetic results. One patient developed atypical angina in the seventh postoperative month, and selective arteriography confirmed stenosis of the anastomosis. A successful angioplasty of the original LAD lesion was carried out. A minimally invasive left anterior mediastinotomy is a good surgical access for performing a successful LIMA to LAD graft without cardiopulmonary bypass, allowing a shorter hospital stay and earlier postoperative recovery. However, a larger experience and a longer follow-up are required to define its role in the treatment of coronary artery disease.

  2. Factors Influencing the Adoption of Minimally Invasive Surgery ...

    African Journals Online (AJOL)

    Background: Cost is a major concern for delivery of minimally invasive surgical technologies due to the nature of resources required. It is unclear whether factors extrinsic to technology availability impact on this uptake. Objectives: To establish the influence of institutional, patient and surgeon-related factors in the adoption of ...

  3. Development of Bi-phase sodium-oxygen-hydrogen chemical equilibrium calculation program (BISHOP) using Gibbs free energy minimization method

    International Nuclear Information System (INIS)

    Okano, Yasushi

    1999-08-01

    In order to analyze the reaction heat and compounds due to sodium combustion, a multiphase chemical equilibrium calculation program for chemical reactions among sodium, oxygen and hydrogen was developed in this study. The program is named BISHOP, which denotes 'Bi-phase Sodium-Oxygen-Hydrogen chemical equilibrium calculation Program'. The Gibbs free energy minimization method is used because of its particular merits: chemical species can easily be added and changed, and many thermochemical reaction systems besides the constant-temperature, constant-pressure one can be treated in a general way. Three new methods are developed for solving the multiphase sodium reaction system in this study: the first is to construct the equation system by simplifying the phases, the second is to extend the Gibbs free energy minimization method to a multiphase system, and the last is to establish an effective search method for the minimum value. Chemical compounds formed by the combustion of sodium in air are calculated using BISHOP. The calculated temperature and moisture conditions under which sodium oxide and hydroxide form qualitatively agree with experiments. Decomposition of sodium hydride is also calculated by the program; the estimated relationship between the decomposition temperature and pressure closely agrees with the well-known experimental equation of Roy and Rodgers. It is concluded that BISHOP can be used to evaluate the combustion and decomposition behaviors of sodium and its compounds. The hydrogen formation conditions in the dump-tank room during a sodium leak event in an FBR are quantitatively evaluated with BISHOP. It can be concluded that keeping the temperature of the dump-tank room low is an effective way to suppress the formation of hydrogen. When the lower flammability limit of 4.1 mol% is chosen as the hydrogen concentration criterion, the formation of sodium hydride from sodium and hydrogen is facilitated below a room temperature of 800 K, keeping the hydrogen concentration below this criterion.
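    The core idea, that the equilibrium composition minimizes the total Gibbs free energy at fixed temperature and pressure, can be shown on a toy single-reaction system. The sketch below (illustrative thermodynamic data, and a one-dimensional search rather than BISHOP's multiphase method) finds the equilibrium extent of H2 + ½O2 → H2O:

```python
import math

# Standard Gibbs energies of formation (kJ/mol) at 298 K -- illustrative values.
g0 = {"H2": 0.0, "O2": 0.0, "H2O": -228.6}
R, T = 8.314e-3, 298.15        # kJ/(mol*K), K

def gibbs(xi):
    """Total Gibbs energy (kJ) as a function of reaction extent xi,
    starting from 1 mol H2 and 0.5 mol O2 (ideal-gas mixing term included)."""
    n = {"H2": 1.0 - xi, "O2": 0.5 - 0.5 * xi, "H2O": xi}
    ntot = sum(n.values())
    G = 0.0
    for sp, ni in n.items():
        if ni > 0:
            G += ni * (g0[sp] + R * T * math.log(ni / ntot))
    return G

# Golden-section search for the extent that minimizes G (bounded line search).
a, b = 1e-9, 1.0 - 1e-9
phi = (math.sqrt(5) - 1) / 2
for _ in range(200):
    c, d = b - phi * (b - a), a + phi * (b - a)
    if gibbs(c) < gibbs(d):
        b = d
    else:
        a = c
xi_eq = (a + b) / 2
print(xi_eq)   # essentially complete conversion at 298 K
```

    A multi-species, multiphase system replaces the single extent variable with mole numbers constrained by element balances, which is where the program's dedicated minimum-search method comes in.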

  4. Minimal Liouville gravity correlation numbers from Douglas string equation

    International Nuclear Information System (INIS)

    Belavin, Alexander; Dubrovin, Boris; Mukhametzhanov, Baur

    2014-01-01

    We continue the study of (q,p) Minimal Liouville Gravity with the help of the Douglas string equation. We generalize the results of http://dx.doi.org/10.1016/0550-3213(91)90548-C and http://dx.doi.org/10.1088/1751-8113/42/30/304004, where the Lee-Yang series (2,2s+1) was studied, to (3,3s+p_0) Minimal Liouville Gravity, where p_0=1,2. We demonstrate that there exist coordinates τ_{m,n} on the space of the perturbed Minimal Liouville Gravity theories in which the partition function of the theory is determined by the Douglas string equation. The coordinates τ_{m,n} are related in a non-linear fashion to the natural coupling constants λ_{m,n} of the perturbations of Minimal Liouville Gravity by the physical operators O_{m,n}. We find this relation from the requirement that the correlation numbers in Minimal Liouville Gravity must satisfy the conformal and fusion selection rules. After fixing this relation we compute three- and four-point correlation numbers when they are non-zero. The results are in agreement with the direct calculations in Minimal Liouville Gravity available in the literature: http://dx.doi.org/10.1103/PhysRevLett.66.2051, http://dx.doi.org/10.1007/s11232-005-0003-3, http://dx.doi.org/10.1007/s11232-006-0075-8

  5. KCUT, code to generate minimal cut sets for fault trees

    International Nuclear Information System (INIS)

    Han, Sang Hoon

    2008-01-01

    1 - Description of program or function: KCUT is a software tool to generate minimal cut sets for fault trees. 2 - Methods: Expand the fault tree into cut sets and delete the non-minimal cut sets. 3 - Restrictions on the complexity of the problem: size and complexity of the fault tree
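    The two-step method named in the record, expand the tree into cut sets and then delete the non-minimal ones, can be sketched directly (an illustrative toy implementation, not the KCUT code):

```python
from itertools import product

# A fault tree node is ("basic", name), ("and", children) or ("or", children).
def cut_sets(node):
    """Expand a fault tree into (not necessarily minimal) cut sets."""
    kind = node[0]
    if kind == "basic":
        return [frozenset([node[1]])]
    child_sets = [cut_sets(c) for c in node[1]]
    if kind == "or":
        return [cs for sets in child_sets for cs in sets]   # union of alternatives
    # AND gate: cross-product union of the children's cut sets
    return [frozenset().union(*combo) for combo in product(*child_sets)]

def minimal_cut_sets(node):
    """Delete non-minimal cut sets: keep a set only if no proper subset exists."""
    sets = set(cut_sets(node))
    return {s for s in sets if not any(t < s for t in sets)}

# TOP = (A AND B) OR (A AND B AND C) OR C
tree = ("or", [("and", [("basic", "A"), ("basic", "B")]),
               ("and", [("basic", "A"), ("basic", "B"), ("basic", "C")]),
               ("basic", "C")])
print(minimal_cut_sets(tree))   # {A,B} survives, {A,B,C} is deleted, {C} survives
```

    The restriction noted in the record is visible here: the AND-gate cross product can blow up combinatorially with the size of the tree, which is why production tools prune non-minimal sets aggressively during expansion rather than at the end.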

  6. A look-ahead variant of the Lanczos algorithm and its application to the quasi-minimal residual method for non-Hermitian linear systems. Ph.D. Thesis - Massachusetts Inst. of Technology, Aug. 1991

    Science.gov (United States)

    Nachtigal, Noel M.

    1991-01-01

    The Lanczos algorithm can be used both for eigenvalue problems and to solve linear systems. However, when applied to non-Hermitian matrices, the classical Lanczos algorithm is susceptible to breakdowns and potential instabilities. In addition, the biconjugate gradient (BCG) algorithm, which is the natural generalization of the conjugate gradient algorithm to non-Hermitian linear systems, has a second source of breakdowns, independent of the Lanczos breakdowns. Here, we present two new results. We propose an implementation of a look-ahead variant of the Lanczos algorithm which overcomes the breakdowns by skipping over those steps where a breakdown or a near-breakdown would occur. The new algorithm can handle look-ahead steps of any length and requires the same number of matrix-vector products and inner products per step as the classical Lanczos algorithm without look-ahead. Based on the proposed look-ahead Lanczos algorithm, we then present a novel BCG-like approach, the quasi-minimal residual (QMR) method, which avoids the second source of breakdowns in the BCG algorithm. We present details of the new method and discuss some of its properties. In particular, we discuss the relationship between QMR and BCG, showing how one can recover the BCG iterates, when they exist, from the QMR iterates. We also present convergence results for QMR, showing the connection between QMR and the generalized minimal residual (GMRES) algorithm, the optimal method in this class of methods. Finally, we give some numerical examples, both for eigenvalue computations and for non-Hermitian linear systems.

  7. The electroweak phase transition in minimal supergravity models

    CERN Document Server

    Nanopoulos, Dimitri V

    1994-01-01

    We have explored the electroweak phase transition in minimal supergravity models by extending previous analysis of the one-loop Higgs potential to include finite temperature effects. Minimal supergravity is characterized by two Higgs doublets at the electroweak scale, gauge coupling unification, and universal soft-SUSY breaking at the unification scale. We have searched for the allowed parameter space that avoids washout of baryon number via unsuppressed anomalous electroweak sphaleron processes after the phase transition. This requirement imposes strong constraints on the Higgs sector. With respect to weak scale baryogenesis, we find that the generic MSSM is not phenomenologically acceptable, and show that the additional experimental and consistency constraints of minimal supergravity restrict the mass of the lightest CP-even Higgs even further, to m_h ≲ 32 GeV (at one loop), also in conflict with experiment. Thus, if supergravity is to allow for baryogenesis via any other mechanism above the weak...

  8. Determination of fuel irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Mas, P.

    1977-01-01

    This paper reports on the present status of the main methods for determining the nuclear parameters of fuel irradiation in testing reactors (nuclear power, burn-up, ...). The different methods (theoretical or experimental) are reviewed: neutron measurements and calculations, gamma scanning, heat balance, ... The required accuracies are reviewed: they are 3-5% on fluxes, fluences, nuclear power, burn-up and conversion factor. These required accuracies are compared with the accuracies actually available, which at the present time are of the order of 5-20% on these parameters

  9. Determination of material irradiation parameters. Required accuracies and available methods

    International Nuclear Information System (INIS)

    Cerles, J.M.; Mas, P.

    1978-01-01

    In this paper, the author reports on the main methods for determining the nuclear parameters of material irradiation in testing reactors (nuclear power, burn-up, fluxes, fluences, ...). The different methods (theoretical or experimental) are reviewed: neutronics measurements and calculations, gamma scanning, thermal balance, ... The required accuracies are reviewed: they are 3-5% on fluxes, fluences, nuclear power, burn-up and conversion factor. These required accuracies are compared with the accuracies actually available, which at the present time are of the order of 5-20% on these parameters

  10. Minimally invasive orthognathic surgery.

    Science.gov (United States)

    Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J

    2009-02-01

    Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.

  11. [Research on fuzzy proportional-integral-derivative control of master-slave minimally invasive operation robot driver].

    Science.gov (United States)

    Zhao, Ximei; Ren, Chengyi; Liu, Hao; Li, Haogyi

    2014-12-01

    Robotic catheter minimally invasive operation requires that the driver control system have quick response, strong anti-jamming capability and real-time tracking of the target trajectory. Since the parameters of the catheter itself, the movement environment and other factors change continuously, when the driver is controlled using a traditional proportional-integral-derivative (PID) controller, the controller gains are fixed once the PID parameters are set. They cannot adapt to changes in the parameters of the object or to environmental disturbances, which degrades the position tracking accuracy and may produce a large overshoot that endangers the patient's vessel. Therefore, this paper adopts a fuzzy PID control method that adjusts the PID gain parameters during tracking in order to improve the system's anti-interference ability, dynamic performance and tracking accuracy. The simulation results showed that the fuzzy PID control method had fast tracking performance and strong robustness. Compared with traditional PID control, the feasibility and practicability of fuzzy PID control are verified for a robotic catheter minimally invasive operation.
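    The gain-scheduling idea, PID gains adjusted online by fuzzy rules on the error and its derivative, can be sketched on a toy first-order plant. Everything below (membership functions, rule weights, plant model) is an illustrative assumption, not the paper's controller:

```python
def fuzzy_gains(e, de, base=(2.0, 0.5, 0.1)):
    """Tiny fuzzy gain scheduler (illustrative): a "large" membership on |e|
    and |de|; the rules raise Kp and Kd and lower Ki when the error is large."""
    def large(u, scale=1.0):
        return max(0.0, min(1.0, abs(u) / scale))
    mu = max(large(e), large(de))          # firing strength of the "large" rules
    kp0, ki0, kd0 = base
    kp = kp0 * (1.0 + 0.5 * mu)            # defuzzified blend of the two rule sets
    ki = ki0 * (1.0 - 0.5 * mu)
    kd = kd0 * (1.0 + 1.0 * mu)
    return kp, ki, kd

def simulate(fuzzy=True, n=1500, dt=0.01):
    """First-order plant dx/dt = -x + u tracking a unit step (Euler steps)."""
    x, integ, e_prev = 0.0, 0.0, 1.0
    for _ in range(n):
        e = 1.0 - x
        de = (e - e_prev) / dt
        kp, ki, kd = fuzzy_gains(e, de) if fuzzy else (2.0, 0.5, 0.1)
        integ += e * dt
        u = kp * e + ki * integ + kd * de
        x += (-x + u) * dt
        e_prev = e
    return x

print(simulate(True), simulate(False))   # both end near the setpoint 1.0
```

    The point of the scheduler is the transient: gains are aggressive while the error is large and relax as it shrinks, instead of staying at one fixed compromise as in plain PID.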

  12. Regularity of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht

    2010-01-01

    "Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained.

  13. LLNL Waste Minimization Program Plan

    International Nuclear Information System (INIS)

    1990-05-01

    This document is the February 14, 1990 version of the LLNL Waste Minimization Program Plan (WMPP). New legislation at the federal level is being introduced; passage will result in new EPA regulations and also DOE orders. At the state level, the Hazardous Waste Reduction and Management Review Act of 1989 was signed by the Governor. DHS is currently promulgating regulations to implement the new law. EPA has issued a proposed new policy statement on source reduction and recycling. This policy reflects a preventative strategy to reduce or eliminate the generation of environmentally harmful pollutants which may be released to the air, land surface, water, or ground water. In accordance with this policy, new guidance to hazardous waste generators on the elements of a Waste Minimization Program was issued. This WMPP is formatted to meet the current DOE guidance outlines. The current WMPP will be revised to reflect all of these proposed changes when guidelines are established. Updates, changes and revisions to the overall LLNL WMPP will be made as appropriate to reflect ever-changing regulatory requirements

  14. Numerical solution of large nonlinear boundary value problems by quadratic minimization techniques

    International Nuclear Information System (INIS)

    Glowinski, R.; Le Tallec, P.

    1984-01-01

    The objective of this paper is to describe the numerical treatment of large highly nonlinear two or three dimensional boundary value problems by quadratic minimization techniques. In all the different situations where these techniques were applied, the methodology remains the same and is organized as follows: 1) derive a variational formulation of the original boundary value problem, and approximate it by Galerkin methods; 2) transform this variational formulation into a quadratic minimization problem (least squares methods) or into a sequence of quadratic minimization problems (augmented lagrangian decomposition); 3) solve each quadratic minimization problem by a conjugate gradient method with preconditioning, the preconditioning matrix being sparse, positive definite, and fixed once for all in the iterative process. This paper will illustrate the methodology above on two different examples: the description of least squares solution methods and their application to the solution of the unsteady Navier-Stokes equations for incompressible viscous fluids; the description of augmented lagrangian decomposition techniques and their application to the solution of equilibrium problems in finite elasticity
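    Step 3 of the methodology, conjugate gradient with a preconditioning matrix that is "sparse, positive definite, and fixed once for all in the iterative process", can be sketched as follows; the Jacobi (diagonal) preconditioner and the 1-D Laplacian model problem are illustrative stand-ins for the paper's large nonlinear systems:

```python
import numpy as np

def pcg(A, b, M_inv_diag, tol=1e-10, max_iter=200):
    """Conjugate gradient with a fixed diagonal (Jacobi) preconditioner."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv_diag * r                  # apply the fixed preconditioner
    p = z.copy()
    rz = r @ z
    for it in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it + 1
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# 1-D Laplacian (symmetric positive definite, sparse structure) as a model problem
n = 100
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)
x, iters = pcg(A, b, M_inv_diag=1.0 / np.diag(A))
print(iters, np.linalg.norm(A @ x - b))
```

    In the least-squares and augmented-Lagrangian settings of the paper, each quadratic subproblem is solved by exactly this kind of loop, with the preconditioner assembled once and reused across all subproblems.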

  15. Gamma radiation in the reduction of Salmonella spp. inoculated on minimally processed watercress (Nasturtium officinalis)

    International Nuclear Information System (INIS)

    Martins, C.G.; Behrens, J.H.; Destro, M.T.; Franco, B.D.G.M.; Vizeu, D.M.; Hutzler, B.; Landgraf, M.

    2004-01-01

    Consumer attitudes towards foods have changed in the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments and fewer additives are being required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7 and Shigella spp. Minimal processing does not reduce the levels of pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and the shelf-life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37 deg. C for 24 h. D10 values for Salmonella spp. inoculated on watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf-life was increased by 1.5 days when the product was exposed to 1 kGy
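    The dose arithmetic in the record follows directly from the definition of the D10 value (the dose giving one decade of reduction): a 4-log10 reduction needs four D10's, and with the worst-case D10 of 0.43 kGy this gives the quoted ~1.7 kGy:

```python
def surviving_fraction(dose_kgy, d10_kgy):
    """Fraction of the population surviving a dose: one decade per D10."""
    return 10.0 ** (-dose_kgy / d10_kgy)

d10 = 0.43          # kGy, upper end of the reported D10 range for watercress
dose = 4 * d10      # dose for a 4-log10 reduction (about 1.7 kGy)
print(dose, surviving_fraction(dose, d10))
```

    Using the worst-case (largest) D10 of the measured range is the conservative choice: the same dose gives more than 4 decades of reduction for any strain with a smaller D10.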

  16. Functional Mobility Testing: A Novel Method to Create Suit Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar L.

    2008-01-01

    This study was performed to aid in the creation of design requirements for the next generation of space suits that more accurately describe the level of mobility necessary for a suited crewmember, through the use of an innovative methodology utilizing functional mobility. A novel method was employed involving the collection of kinematic data while 20 subjects (10 male, 10 female) performed pertinent functional tasks that will be required of a suited crewmember during various phases of a lunar mission. These tasks were selected, based on relevance and criticality, from a larger list of tasks that may be carried out by the crew. Kinematic data were processed through Vicon BodyBuilder software to calculate joint angles for the ankle, knee, hip, torso, shoulder, elbow, and wrist. Maximum functional mobility was consistently lower than maximum isolated mobility. This study suggests that conventional methods for establishing design requirements for human-systems interfaces based on maximal isolated joint capabilities may overestimate the required mobility. Additionally, this method provides a valuable means of evaluating systems created from these requirements by comparing the mobility available in a new spacesuit, or the mobility required to use a new piece of hardware, to this newly established database of functional mobility.

  17. [Determination of minimal concentrations of biocorrosion inhibitors by a bioluminescence method in relation to bacteria, participating in biocorrosion].

    Science.gov (United States)

    Efremenko, E N; Azizov, R E; Makhlis, T A; Abbasov, V M; Varfolomeev, S D

    2005-01-01

    By using a bioluminescence ATP assay, we have determined the minimal concentrations of some biocorrosion inhibitors (Katon, Khazar, VFIKS-82, Nitro-1, Kaspii-2, and Kaspii-4) suppressing most common microbial corrosion agents: Desulfovibrio desulfuricans, Desulfovibrio vulgaris, Pseudomonas putida, Pseudomonas fluorescens, and Acidithiobacillus ferrooxidans. The cell titers determined by the bioluminescence method, including not only dividing cells but also their dormant living counterparts, are two- to sixfold greater than the values determined microbiologically. It is shown that the bioluminescence method can be applied to determination of cell titers in samples of oil-field waters in the presence of iron ions (up to 260 mM) and iron sulfide (to 186 mg/l) and in the absence or presence of biocidal corrosion inhibitors.

  18. Self-Averaging Property of Minimal Investment Risk of Mean-Variance Model.

    Science.gov (United States)

    Shinzato, Takashi

    2015-01-01

    In portfolio optimization problems, the minimum expected investment risk is not always smaller than the expected minimal investment risk. That is, using a well-known approach from operations research, it is possible to derive a strategy that minimizes the expected investment risk, but this strategy does not always result in the best rate of return on assets. Prior to making investment decisions, it is important to an investor to know the potential minimal investment risk (or the expected minimal investment risk) and to determine the strategy that will maximize the return on assets. We use the self-averaging property to analyze the potential minimal investment risk and the concentrated investment level for the strategy that gives the best rate of return. We compare the results from our method with the results obtained by the operations research approach and with those obtained by a numerical simulation using the optimal portfolio. The results of our method and the numerical simulation are in agreement, but they differ from that of the operations research approach.

  19. Self-Averaging Property of Minimal Investment Risk of Mean-Variance Model.

    Directory of Open Access Journals (Sweden)

    Takashi Shinzato

    Full Text Available In portfolio optimization problems, the minimum expected investment risk is not always smaller than the expected minimal investment risk. That is, using a well-known approach from operations research, it is possible to derive a strategy that minimizes the expected investment risk, but this strategy does not always result in the best rate of return on assets. Prior to making investment decisions, it is important to an investor to know the potential minimal investment risk (or the expected minimal investment risk) and to determine the strategy that will maximize the return on assets. We use the self-averaging property to analyze the potential minimal investment risk and the concentrated investment level for the strategy that gives the best rate of return. We compare the results from our method with the results obtained by the operations research approach and with those obtained by a numerical simulation using the optimal portfolio. The results of our method and the numerical simulation are in agreement, but they differ from that of the operations research approach.

  20. Effects of gamma radiation in cauliflower (Brassica spp) minimally processed

    International Nuclear Information System (INIS)

    Nunes, Thaise C.F.; Rogovschi, Vladimir D.; Thomaz, Fernanda S.; Trindade, Reginaldo A.; Villavicencio, Anna L.C.H.; Alencar, Severino M.

    2007-01-01

    Consumer demand is driven by health interests and the latest diet trends. The consumption of vegetables worldwide has increased every year over the past decade; consequently, less extreme treatments or additives are being required. Minimally processed foods have fresh-like characteristics and satisfy the new consumer demand. Food irradiation is a process that exposes the product to controlled sources of gamma radiation with the intention of destroying pathogens and extending shelf life. Minimally processed cauliflower (Brassica oleracea) exposed to a low dose of gamma radiation does not show any change in sensory attributes. The aim of this study was to analyze the effects of low doses of gamma radiation on sensory attributes such as appearance, texture and flavor of minimally processed cauliflower. (author)

  1. Minimal Poems Written in 1979 Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I saw a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity in this new literary trend known as Minimalism.

  2. Electric dipole moment constraints on minimal electroweak baryogenesis

    CERN Document Server

    Huber, S J; Ritz, A; Huber, Stephan J.; Pospelov, Maxim; Ritz, Adam

    2007-01-01

    We study the simplest generic extension of the Standard Model which allows for conventional electroweak baryogenesis, through the addition of dimension six operators in the Higgs sector. At least one such operator is required to be CP-odd, and we study the constraints on such a minimal setup, and related scenarios with minimal flavor violation, from the null results of searches for electric dipole moments (EDMs), utilizing the full set of two-loop contributions to the EDMs. The results indicate that the current bounds are stringent, particularly that of the recently updated neutron EDM, but fall short of ruling out these scenarios. The next generation of EDM experiments should be sufficiently sensitive to provide a conclusive test.

  3. Minimization over randomly selected lines

    Directory of Open Access Journals (Sweden)

    Ismet Sahin

    2013-07-01

    Full Text Available This paper presents a population-based evolutionary optimization method for minimizing a given cost function. The mutation operator of this method selects randomly oriented lines in the cost function domain, constructs quadratic functions interpolating the cost function at three different points over each line, and uses the extrema of the quadratics as mutated points. The crossover operator modifies each mutated point based on components of two points in the population, instead of one point as is usual in other evolutionary algorithms. The stopping criterion of this method depends on the number of almost degenerate quadratics. We demonstrate that the proposed method with these mutation and crossover operations achieves faster and more robust convergence than the well-known Differential Evolution and Particle Swarm algorithms.
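
    The mutation step described above can be sketched as follows (an illustrative reading of the operator, not the authors' implementation; the sampling step size and the degeneracy threshold are assumptions):

```python
import numpy as np

def quadratic_line_mutation(f, x, rng, step=1.0):
    """One mutation: pick a random direction, evaluate the cost at three
    points along that line, fit the interpolating quadratic, and return
    the quadratic's extremum as the mutated point."""
    d = rng.standard_normal(x.shape)
    d /= np.linalg.norm(d)                  # random unit direction
    t = np.array([-step, 0.0, step])
    y = np.array([f(x + ti * d) for ti in t])
    a, b, _ = np.polyfit(t, y, 2)           # y ~ a t^2 + b t + c
    if abs(a) < 1e-12:                      # almost degenerate quadratic
        return x                            # (the stopping criterion counts these)
    t_star = -b / (2.0 * a)                 # extremum of the fitted quadratic
    return x + t_star * d

# On a convex quadratic cost, repeated mutations drive the cost to zero
f = lambda p: np.sum((p - 3.0) ** 2)
rng = np.random.default_rng(1)
x = np.zeros(4)
for _ in range(200):
    x = quadratic_line_mutation(f, x, rng)
print(f"final cost: {f(x):.2e}")
```

    For a cost that is exactly quadratic along the line, the fitted extremum is the exact line minimizer, so each mutation performs an exact line search in a random direction.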

  4. Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok

    2004-01-01

    The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationship among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example

  5. Correlates of minimal dating.

    Science.gov (United States)

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  6. A Framework for the Development of Automatic DFA Method to Minimize the Number of Components and Assembly Reorientations

    Science.gov (United States)

    Alfadhlani; Samadhi, T. M. A. Ari; Ma’ruf, Anas; Setiasyah Toha, Isa

    2018-03-01

    Assembly is a part of the manufacturing process that must be considered at the product design stage. Design for Assembly (DFA) is a method to evaluate product design in order to make it simpler, easier and quicker to assemble, so that assembly cost is reduced. This article discusses a framework for developing a computer-based DFA method. The method is expected to aid the product designer in extracting data, evaluating the assembly process, and providing recommendations for product design improvement. Ideally, these three tasks are performed without an interactive process or user intervention, so that product design evaluation can be done automatically. Input for the proposed framework is a 3D solid engineering drawing. Product design evaluation is performed by: minimizing the number of components; generating assembly sequence alternatives; selecting the best assembly sequence based on the minimum number of assembly reorientations; and providing suggestions for design improvement.

  7. Numerical methods

    CERN Document Server

    Dahlquist, Germund

    1974-01-01

    ""Substantial, detailed and rigorous . . . readers for whom the book is intended are admirably served."" - MathSciNet (Mathematical Reviews on the Web), American Mathematical Society.Practical text strikes fine balance between students' requirements for theoretical treatment and needs of practitioners, with best methods for large- and small-scale computing. Prerequisites are minimal (calculus, linear algebra, and preferably some acquaintance with computer programming). Text includes many worked examples, problems, and an extensive bibliography.

  8. Navy Shipboard Hazardous Material Minimization Program

    Energy Technology Data Exchange (ETDEWEB)

    Bieberich, M.J. [Naval Surface Warfare Center, Annapolis, MD (United States). Carderock Div.; Robinson, P. [Life Cycle Engineering, Inc., Charleston, SC (United States); Chastain, B.

    1994-12-31

    The use of hazardous (and potentially hazardous) materials in shipboard cleaning applications has proliferated as new systems and equipment have entered the fleet to reside alongside existing equipment. With the growing environmental awareness (and additional, more restrictive regulations) at all levels/echelon commands of the DoD, the Navy has initiated a proactive program to undertake the minimization/elimination of these hazardous materials in order to eliminate HMs at the source. This paper will focus on the current Shipboard Hazardous Materials Minimization Program initiatives, including the identification of authorized HM currently used onboard, identification of potential substitute materials for HM replacement, identification of new cleaning technologies and processes/procedures, and identification of technical documents which will require revision to eliminate the procurement of HMs into the federal supply system. Also discussed will be the anticipated path required to implement the changes into the fleet and the automated decision processes (substitution algorithm) currently employed. The paper will also present the most recent technologies identified for approval or additional testing and analysis, including: supercritical CO{sub 2} cleaning, high-pressure blasting (H{sub 2}O + baking soda), aqueous and semi-aqueous cleaning materials and processes, solvent replacements, dedicated parts-washing systems with internal filtering capabilities, and automated software for solvent/cleaning process substitute selection. Along with these technological advances, data availability (from on-line databases and CD-ROM database libraries) will be identified and discussed.

  9. Minimal Super Technicolor

    DEFF Research Database (Denmark)

    Antola, M.; Di Chiara, S.; Sannino, F.

    2011-01-01

    We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills which breaks to N=1 via the electroweak interactions. This is a well defined, economical......, between unparticle physics and Minimal Walking Technicolor. We consider also other N=1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass.

  10. 5 CFR 610.404 - Requirement for time-accounting method.

    Science.gov (United States)

    2010-01-01

    ... REGULATIONS HOURS OF DUTY Flexible and Compressed Work Schedules § 610.404 Requirement for time-accounting method. An agency that authorizes a flexible work schedule or a compressed work schedule under this...

  11. A cyclotron isotope production facility designed to maximize production and minimize radiation dose

    International Nuclear Information System (INIS)

    Dickie, W.J.; Stevenson, N.R.; Szlavik, F.F.

    1993-01-01

    Continuing increases in the nuclear medicine industry's requirements for cyclotron isotopes are raising the demands placed on an aging stock of machines. In addition, with the 1990 recommendations of the ICRP publication in place, strict dose limits will be required, and this will have an effect on the way these machines are operated. Recent advances in cyclotron design, combined with lessons learned from two decades of commercial production, mean that new facilities can achieve a substantial charge on target, low personnel dose, and minimal residual activation. An optimal facility would utilize a well-engineered variable-energy/high-current H- cyclotron design, multiple beam extraction, and individual target caves. Materials would be selected to minimize activation and absorb neutrons. Equipment would be designed to minimize maintenance activities performed in high radiation fields. (orig.)

  12. Effective teaching methods in higher education: requirements and barriers

    Directory of Open Access Journals (Sweden)

    NAHID SHIRANI BIDABADI

    2016-10-01

    Full Text Available Introduction: Teaching is one of the main components of educational planning, which is a key factor in carrying out educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. Methods: This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 of them from the best professors in the country and 7 from the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that began from semantic units or direct quotations and moved to general themes. Results: According to the results of this study, the best teaching approach is a mixed method (student-centered together with teacher-centered) plus educational planning and prior readiness. But teachers who want to teach using this method confront barriers and requirements; some of these requirements concern professors' behavior and some concern professors' outlook. Also, there are some major barriers, some of which are associated with the professors' practice and others with laws and regulations. Implications of these findings for teacher preparation in education are discussed. Conclusion: The present study illustrated that a good teaching method helps the students to question their preconceptions and motivates them to learn, by putting them in a situation in which they come to see themselves as the authors of answers and as the agents of responsibility for change. But training through this method has some barriers and requirements. To have effective teaching, the faculty members of the universities

  13. What is Quantum Mechanics? A Minimal Formulation

    Science.gov (United States)

    Friedberg, R.; Hohenberg, P. C.

    2018-03-01

    This paper presents a minimal formulation of nonrelativistic quantum mechanics, by which is meant a formulation which describes the theory in a succinct, self-contained, clear, unambiguous and of course correct manner. The bulk of the presentation is the so-called "microscopic theory", applicable to any closed system S of arbitrary size N, using concepts referring to S alone, without resort to external apparatus or external agents. An example of a similar minimal microscopic theory is the standard formulation of classical mechanics, which serves as the template for a minimal quantum theory. The only substantive assumption required is the replacement of the classical Euclidean phase space by Hilbert space in the quantum case, with the attendant all-important phenomenon of quantum incompatibility. Two fundamental theorems of Hilbert space, the Kochen-Specker-Bell theorem and Gleason's theorem, then lead inevitably to the well-known Born probability rule. For both classical and quantum mechanics, questions of physical implementation and experimental verification of the predictions of the theories are the domain of the macroscopic theory, which is argued to be a special case or application of the more general microscopic theory.

  14. Search for Minimal Standard Model and Minimal Supersymmetric Model Higgs Bosons in e+ e- Collisions with the OPAL detector at LEP

    International Nuclear Information System (INIS)

    Ganel, Ofer

    1993-06-01

    When the LEP machine was turned on in August 1989, a new era opened. For the first time, direct, model-independent searches for the Higgs boson could be carried out. The Minimal Standard Model Higgs boson is expected to be produced in e+e- collisions via the process e+e- -> H0Z0. The Minimal Supersymmetric Model Higgs bosons are expected to be produced in the analogous process e+e- -> h0Z0 or in pairs via the process e+e- -> h0A0. In this thesis we describe the search for Higgs bosons within the framework of the Minimal Standard Model and the Minimal Supersymmetric Model, using the data accumulated by the OPAL detector at LEP in the 1989, 1990, 1991 and part of the 1992 running periods, at and around the Z0 pole. A Minimal Supersymmetric Model Higgs boson generator is described, as well as its use in several different searches. As a result of this work, the Minimal Standard Model Higgs boson mass is bounded from below by 54.2 GeV/c2 at 95% C.L. This is, at present, the highest such bound. A novel method of overcoming the m_t and m_s dependence of Minimal Supersymmetric Model Higgs boson production and decay introduced by one-loop radiative corrections is used to obtain model-independent exclusions. The thesis also describes an algorithm for offline identification of calorimeter noise in the OPAL detector. (author)

  15. Proceedings of the Department of Energy Defense Programs hazardous and mixed waste minimization workshop: Hazardous Waste Remedial Actions Program

    International Nuclear Information System (INIS)

    1988-09-01

    The first workshop on hazardous and mixed waste minimization was held in Las Vegas, Nevada, on July 26--28, 1988. The objective of this workshop was to establish an interchange between DOE headquarters (DOE-HQ) DP, Operations Offices, and contractors of waste minimization strategies and successes. The first day of the workshop began with presentations stressing the importance of establishing a waste minimization program at each site as required by RCRA, the land ban restrictions, and the decrease in potential liabilities associated with waste disposal. Discussions were also centered on pending legislation which would create an Office of Waste Reduction in the Environmental Protection Agency (EPA). The Waste Minimization and Avoidance Study was initiated by DOE as an addition to the long-term productivity study to address the issues of evolving requirements facing RCRA waste management activities at the DP sites, to determine how major operations will be affected by these requirements, and to determine the available strategies and options for waste minimization and avoidance. Waste minimization was defined in this study as source reduction and recycling

  16. A reweighted ℓ1-minimization based compressed sensing for the spectral estimation of heart rate variability using the unevenly sampled data.

    Directory of Open Access Journals (Sweden)

    Szi-Wen Chen

    Full Text Available In this paper, a reweighted ℓ1-minimization based Compressed Sensing (CS) algorithm incorporating the Integral Pulse Frequency Modulation (IPFM) model for spectral estimation of HRV is introduced. Known as a novel sensing/sampling paradigm, the theory of CS asserts that certain signals that are considered sparse or compressible can be reconstructed from substantially fewer measurements than those required by traditional methods. Our study aims to employ a novel reweighted ℓ1-minimization CS method for deriving the spectrum of the modulating signal of the IPFM model from incomplete RR measurements for HRV assessments. To evaluate the performance of HRV spectral estimation, a quantitative measure, referred to as the Percent Error Power (PEP), which measures the percentage of difference between the true spectrum and the spectrum derived from the incomplete RR dataset, was used. We studied the performance of spectral reconstruction from incomplete simulated and real HRV signals by experimentally truncating a number of RR data accordingly in the top portion, in the bottom portion, and in a random order from the original RR column vector. As a result, for up to 20% data truncation/loss the proposed reweighted ℓ1-minimization CS method produced, on average, 2.34%, 2.27%, and 4.55% PEP in the top, bottom, and random data-truncation cases, respectively, on Autoregressive (AR) model derived simulated HRV signals. Similarly, for up to 20% data loss the proposed method produced 5.15%, 4.33%, and 0.39% PEP in the top, bottom, and random data-truncation cases, respectively, on a real HRV database drawn from PhysioNet. Moreover, results generated by a number of intensive numerical experiments all indicated that the reweighted ℓ1-minimization CS method always achieved the most accurate and high-fidelity HRV spectral estimates in every aspect, compared with the ℓ1-minimization based method and Lomb's method used for estimating the spectrum of HRV from
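
    The reweighting idea can be sketched as follows (a generic reweighted ℓ1 scheme in the spirit of the abstract, not the paper's exact solver; the weighted-LASSO inner solver, the regularization parameter lam, and the stabilizer eps are all assumptions):

```python
import numpy as np

def soft(u, t):
    """Elementwise soft-thresholding, the proximal map of the ℓ1 norm."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def reweighted_l1(A, y, lam=0.02, n_reweight=4, n_ista=500, eps=0.1):
    """Repeatedly solve a weighted LASSO  min (1/2)||Ax-y||^2 + lam*sum(w_i|x_i|)
    by ISTA, then update the weights w_i = 1/(|x_i| + eps) so that large
    coefficients are penalized less on the next pass."""
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    w = np.ones(n)
    x = np.zeros(n)
    for _ in range(n_reweight):
        for _ in range(n_ista):            # weighted ISTA inner loop
            x = soft(x - (A.T @ (A @ x - y)) / L, lam * w / L)
        w = 1.0 / (np.abs(x) + eps)        # reweighting step
    return x

# Recover a 5-sparse vector from 50 random measurements (n = 128)
rng = np.random.default_rng(2)
n, m, k = 128, 50, 5
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = 2.0 * rng.choice([-1.0, 1.0], k)
y = A @ x_true
x_hat = reweighted_l1(A, y)
rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {rel_err:.3f}")
```

    The reweighting step is what distinguishes this from plain ℓ1 minimization: coefficients that survive a pass are penalized less, sharpening the estimated support over successive passes.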

  17. Rapid methods for jugular bleeding of dogs requiring one technician.

    Science.gov (United States)

    Frisk, C S; Richardson, M R

    1979-06-01

    Two methods were used to collect blood from the jugular vein of dogs. In both techniques, only one technician was required. A rope with a slip knot was placed around the base of the neck to assist in restraint and act as a tourniquet for the vein. The technician used one hand to restrain the dog by the muzzle and position the head. The other hand was used for collecting the sample. One of the methods could be accomplished with the dog in its cage. The bleeding techniques were rapid, requiring approximately 1 minute per dog.

  18. A new approach to the inverse kinematics of a multi-joint robot manipulator using a minimization method

    International Nuclear Information System (INIS)

    Sasaki, Shinobu

    1987-01-01

    This paper proposes a new approach to solving the inverse kinematics of a type of six-link manipulator. Directing attention to features of the joint structure of the manipulator, the original problem is first formulated as a system of equations in four variables and solved by means of a minimization technique. The remaining two variables are determined from the constraint conditions involved. This is the basic idea of the present approach. The results of computer simulation of the present algorithm showed that the accuracy of solutions and the convergence speed are much higher, and quite satisfactory for practical purposes, as compared with the linearization-iteration method based on the conventional inverse Jacobian matrix. (author)
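
    The idea of treating inverse kinematics as a minimization problem can be illustrated on a much simpler mechanism (a planar two-link arm rather than the paper's six-link manipulator; the link lengths, step size, and plain gradient-descent minimizer are all assumptions made for the sketch):

```python
import numpy as np

def fk(theta, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm: joint angles -> end effector."""
    t1, t12 = theta[0], theta[0] + theta[1]
    return np.array([l1 * np.cos(t1) + l2 * np.cos(t12),
                     l1 * np.sin(t1) + l2 * np.sin(t12)])

def ik_minimize(target, theta0, l1=1.0, l2=1.0, lr=0.2, n_iter=2000):
    """Solve inverse kinematics by minimizing the squared end-effector error
    with gradient descent, instead of iterating an inverse Jacobian."""
    theta = np.array(theta0, dtype=float)
    for _ in range(n_iter):
        t1, t12 = theta[0], theta[0] + theta[1]
        # Jacobian of the forward kinematics
        J = np.array([[-l1 * np.sin(t1) - l2 * np.sin(t12), -l2 * np.sin(t12)],
                      [ l1 * np.cos(t1) + l2 * np.cos(t12),  l2 * np.cos(t12)]])
        e = fk(theta, l1, l2) - target
        theta -= lr * (J.T @ e)            # gradient of (1/2)||e||^2
    return theta

target = np.array([1.2, 0.8])              # a reachable point (|target| < l1 + l2)
theta = ik_minimize(target, [0.3, 0.5])
print(f"end-effector error: {np.linalg.norm(fk(theta) - target):.2e}")
```

    No matrix inversion is needed: the Jacobian only enters through the gradient of the error, which is the essential contrast with the conventional inverse-Jacobian iteration mentioned above.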

  19. Minimal Length Effects on Tunnelling from Spherically Symmetric Black Holes

    Directory of Open Access Journals (Sweden)

    Benrong Mu

    2015-01-01

    Full Text Available We investigate effects of the minimal length on quantum tunnelling from spherically symmetric black holes using the Hamilton-Jacobi method incorporating the minimal length. We first derive the deformed Hamilton-Jacobi equations for scalars and fermions, both of which have the same expression. The minimal length correction to the Hawking temperature is found to depend on the black hole's mass and the mass and angular momentum of the emitted particles. Finally, we calculate a Schwarzschild black hole's luminosity and find that the black hole evaporates to zero mass in infinite time.

  20. Minimally Invasive Spine Surgery in Small Animals.

    Science.gov (United States)

    Hettlich, Bianca F

    2018-01-01

    Minimally invasive spine surgery (MISS) seems to have many benefits for human patients and is currently used for various minor and major spine procedures. For MISS, a change in access strategy to the target location is necessary and it requires intraoperative imaging, special instrumentation, and magnification. Few veterinary studies have evaluated MISS for canine patients for spinal decompression procedures. This article discusses the general requirements for MISS and how these can be applied to veterinary spinal surgery. The current veterinary MISS literature is reviewed and suggestions are made on how to apply MISS to different spinal locations. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. A videoscope for use in minimally invasive periodontal surgery.

    Science.gov (United States)

    Harrel, Stephen K; Wilson, Thomas G; Rivera-Hidalgo, Francisco

    2013-09-01

    Minimally invasive periodontal procedures have been reported to produce excellent clinical results. Visualization during minimally invasive procedures has traditionally been obtained by the use of surgical telescopes, surgical microscopes, glass fibre endoscopes or a combination of these devices. All of these methods for visualization are less than fully satisfactory due to problems with access, magnification and blurred imaging. A videoscope for use with minimally invasive periodontal procedures has been developed to overcome some of the difficulties that exist with current visualization approaches. This videoscope incorporates a gas shielding technology that eliminates the problems of fogging and fouling of the optics of the videoscope that has previously prevented the successful application of endoscopic visualization to periodontal surgery. In addition, as part of the gas shielding technology the videoscope also includes a moveable retractor specifically adapted for minimally invasive surgery. The clinical use of the videoscope during minimally invasive periodontal surgery is demonstrated and discussed. The videoscope with gas shielding alleviates many of the difficulties associated with visualization during minimally invasive periodontal surgery. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Gamma radiation in the reduction of Salmonella spp. inoculated on minimally processed watercress (Nasturtium officinalis)

    Energy Technology Data Exchange (ETDEWEB)

    Martins, C.G.; Behrens, J.H.; Destro, M.T.; Franco, B.D.G.M.; Vizeu, D.M.; Hutzler, B.; Landgraf, M. E-mail: landgraf@usp.br

    2004-10-01

    Consumer attitudes towards foods have changed in the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments or additives are being required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7, and Shigella spp. Minimal processing does not reduce pathogenic microorganisms to safe levels. Therefore, this study was carried out in order to improve the microbiological safety and the shelf life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37 deg. C/24 h. D{sub 10} values for Salmonella spp. inoculated in watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log{sub 10}. The shelf life was increased by 1.5 days when the product was exposed to 1 kGy.

  3. Multi-terminal pipe routing by Steiner minimal tree and particle swarm optimisation

    Science.gov (United States)

    Liu, Qiang; Wang, Chengen

    2012-08-01

    Computer-aided design of pipe routing is of fundamental importance for the development of complex equipment. In this article, non-rectilinear branch pipe routing with multiple terminals, which can be formulated as a Euclidean Steiner Minimal Tree with Obstacles (ESMTO) problem, is studied in the context of aeroengine integrated design engineering. Unlike traditional methods that connect pipe terminals sequentially, this article presents a new branch pipe routing algorithm based on Steiner tree theory. The article begins with a new algorithm for solving the ESMTO problem by using particle swarm optimisation (PSO), and then extends the method to surface cases by using geodesics to meet the requirements of routing non-rectilinear pipes on the surfaces of aeroengines. Subsequently, an adaptive region strategy and the basic visibility graph method are adopted to increase computational efficiency. Numerical computations show that the proposed routing algorithm can find satisfactory routing layouts while running in polynomial time.
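
    The core idea, locating Steiner points by particle swarm optimisation so that total edge length is minimal, can be sketched on the smallest ESMT instance: one free Steiner point joining three fixed terminals, with no obstacles. This toy uses a standard PSO, not the paper's algorithm; all parameter values are illustrative assumptions:

```python
import math
import random

def pso_steiner_point(terminals, n_particles=30, iters=300, seed=0):
    """Standard PSO locating one Steiner point that minimizes the total
    Euclidean length of edges to the fixed terminals (toy ESMT instance)."""
    rng = random.Random(seed)

    def cost(p):
        return sum(math.dist(p, t) for t in terminals)

    # initialize particles inside the terminals' bounding box
    xs = [t[0] for t in terminals]
    ys = [t[1] for t in terminals]
    pos = [(rng.uniform(min(xs), max(xs)), rng.uniform(min(ys), max(ys)))
           for _ in range(n_particles)]
    vel = [(0.0, 0.0)] * n_particles
    best = list(pos)                 # per-particle best positions
    gbest = min(pos, key=cost)       # swarm-best position
    w, c1, c2 = 0.7, 1.5, 1.5        # inertia / cognitive / social weights
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vx = (w*vel[i][0] + c1*r1*(best[i][0] - pos[i][0])
                  + c2*r2*(gbest[0] - pos[i][0]))
            vy = (w*vel[i][1] + c1*r1*(best[i][1] - pos[i][1])
                  + c2*r2*(gbest[1] - pos[i][1]))
            vel[i] = (vx, vy)
            pos[i] = (pos[i][0] + vx, pos[i][1] + vy)
            if cost(pos[i]) < cost(best[i]):
                best[i] = pos[i]
                if cost(pos[i]) < cost(gbest):
                    gbest = pos[i]
    return gbest, cost(gbest)

# Equilateral triangle: the optimum is the Fermat point, total length sqrt(3).
tri = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3)/2)]
point, length = pso_steiner_point(tri)
```

    Larger instances require choosing a tree topology as well as the Steiner point coordinates, which is where the combinatorial difficulty of the ESMTO problem enters.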

  4. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    Science.gov (United States)

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel. Copyright© by the American Society for Clinical Pathology.

  5. Good Practice Guide Waste Minimization/Pollution Prevention

    Energy Technology Data Exchange (ETDEWEB)

    J Dorsey

    1999-10-14

    This Good Practice Guide provides tools, information, and examples for promoting the implementation of pollution prevention during the design phases of U.S. Department of Energy (DOE) projects. It is one of several Guides for implementing DOE Order 430.1, Life-cycle Asset Management. DOE Order 430.1 provides requirements for DOE, in partnership with its contractors, to plan, acquire, operate, maintain, and dispose of physical assets. The goals of designing for pollution prevention are to minimize raw material consumption, energy consumption, waste generation, health and safety impacts, and ecological degradation over the entire life of the facility (EPA 1993a). Users of this Guide will learn to translate national policy and regulatory requirements for pollution prevention into action at the project level. The Guide was written to be applicable to all DOE projects, regardless of project size or design phase. Users are expected to interpret the Guide for their individual project's circumstances, applying a graded approach so that the effort is consistent with the anticipated waste generation and resource consumption of the physical asset. This Guide employs a combination of pollution prevention opportunity assessment (PPOA) methods and design for environment (DfE) philosophies. The PPOA process was primarily developed for existing products, processes, and facilities. The PPOA process has been modified in this Guide to address the circumstances of the DOE design process as delineated in DOE Order 430.1 and its associated Good Practice Guides. This modified form of the PPOA is termed the Pollution Prevention Design Assessment (P2DA). Information on current nationwide methods and successes in designing for the environment also have been reviewed and are integrated into this guidance.

  6. Gamma radiation in the reduction of Salmonella spp. inoculated on minimally processed watercress (Nasturtium officinalis)

    Science.gov (United States)

    Martins, C. G.; Behrens, J. H.; Destro, M. T.; Franco, B. D. G. M.; Vizeu, D. M.; Hutzler, B.; Landgraf, M.

    2004-09-01

    Consumer attitudes towards foods have changed over the last two decades, increasing the demand for fresh-like products. Consequently, less extreme treatments and fewer additives are required. Minimally processed foods have fresh-like characteristics and satisfy this new consumer demand. Besides freshness, minimal processing also provides the convenience required by the market. Salad vegetables can be a source of pathogens such as Salmonella, Escherichia coli O157:H7 and Shigella spp. Minimal processing does not reduce pathogenic microorganisms to safe levels. Therefore, this study was carried out to improve the microbiological safety and the shelf-life of minimally processed vegetables using gamma radiation. Minimally processed watercress inoculated with a cocktail of Salmonella spp. was exposed to 0.0, 0.2, 0.5, 0.7, 1.0, 1.2 and 1.5 kGy. Irradiated samples were diluted 1:10 in saline peptone water and plated onto tryptic soy agar, which was incubated at 37 °C for 24 h. D10 values for Salmonella spp. inoculated on watercress varied from 0.29 to 0.43 kGy. Therefore, a dose of 1.7 kGy will reduce the Salmonella population in watercress by 4 log10. The shelf-life was increased by 1.5 days when the product was exposed to 1 kGy.

  7. Probabilistic Properties of Rectilinear Steiner Minimal Trees

    Directory of Open Access Journals (Sweden)

    V. N. Salnikov

    2015-01-01

    Full Text Available This work concerns the properties of Steiner minimal trees for the Manhattan plane in the context of introducing a probability measure. The problem is important because exact algorithms for the Steiner problem are computationally expensive (NP-hard) and the solution, especially in the case of a large number of points to be connected, has a diversity of practical applications. That is why the work considers a possibility to rank the possible topologies of minimal trees with respect to the probability of their usage. To this end, the known facts about the structural properties of minimal trees for selected metrics have been analyzed for their usefulness for the problem in question. For a small number of boundary (fixed) vertices, the paper offers a way to introduce a probability measure as a corollary of a proved theorem about some structural properties of minimal trees. This work continues previous similar work concerning the search for minimal fillings, and it is a door opener to the more general (complicated) task. The stated method demonstrates the possibility of reaching the final result analytically, which gives a chance of its applicability to the case of a larger number of boundary vertices (probably with the use of computer engineering). The introduced definition of an essential Steiner point allowed a considerable restriction of the ambiguity of the initial problem solution and, at the same time, comparison of this approach with more classical works in the field. The paper also lists the main barriers of classical approaches that prevent their use for the task of introducing a probability measure. In prospect, application areas of the described method are expected to widen both in terms of system enlargement (the number of boundary vertices) and in terms of other metric spaces (the Euclidean case is of especial interest). The main interest is to find the classes of topologies with significantly…

  8. Minimizing the Fluid Used to Induce Fracturing

    Science.gov (United States)

    Boyle, E. J.

    2015-12-01

    Injecting less fluid to induce fracturing means less fluid must be produced back before gas is produced. One method is to inject as fast as possible until the desired fracture length is obtained. Presented is an alternative injection strategy derived by applying optimal control theory to the macroscopic mass balance. The picture is that the fracture is constant in aperture, fluid is injected at a controlled rate at the near end, and the fracture unzips at the far end until the desired length is obtained. The velocity of the fluid is governed by Darcy's law, with larger permeability for flow along the fracture length. Fracture growth is monitored through micro-seismicity. Since the fluid is assumed to be incompressible, the rate at which fluid is injected is balanced by the rate of fracture growth and the rate of loss to the bounding rock. Minimizing injected fluid loss to the bounding rock is therefore the same as minimizing total injected fluid. How to change the injection rate so as to minimize the total injected fluid is a problem in optimal control. For a given total length, the variation of the injection rate is determined by variations in the overall time needed to obtain the desired fracture length, the length at any time, and the rate at which the fracture is growing at that time. Optimal control theory leads to a boundary condition and an ordinary differential equation in time whose solution is an injection protocol that minimizes the fluid used under the stated assumptions. That method is to monitor the rate at which the square of the fracture length is growing and adjust the injection rate proportionately.
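
    The closing rule, adjust the injection rate in proportion to the growth rate of the squared fracture length, can be sketched as a monitoring loop over micro-seismic length estimates. This is a toy illustration of the stated rule only, not the paper's derivation; the proportionality constant k is a hypothetical placeholder:

```python
def injection_rates(times, lengths, k=1.0):
    """Toy monitoring rule: set the injection rate q proportional to
    d(L^2)/dt, estimated by finite differences from micro-seismic
    fracture-length estimates L(t)."""
    rates = []
    for i in range(1, len(times)):
        dL2 = lengths[i]**2 - lengths[i-1]**2
        dt = times[i] - times[i-1]
        rates.append(k * dL2 / dt)
    return rates

# Sanity check: if L(t) = sqrt(t), then L^2 = t and the rule gives
# a constant rate equal to k (up to floating-point rounding).
ts = [1.0, 2.0, 3.0, 4.0]
Ls = [t**0.5 for t in ts]
rates = injection_rates(ts, Ls)   # each entry ≈ 1.0
```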

  9. Waste minimization activities in the Materials Fabrication Division at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dini, J.W.

    1991-08-01

    The mission of the Materials Fabrication Division (MFD) is to provide fabrication services and technology in support of all programs at Lawrence Livermore National Laboratory (LLNL). MFD involvement is called for when fabrication activity requires levels of expertise, technology, equipment, process development, hazardous processes, security, or scheduling that are typically not commercially available. Customers are encouraged to utilize private industry for fabrication activity requiring routine processing or for production applications. Our waste minimization (WM) program has been directed at source reduction and recycling, in concert with the working definition of waste minimization used by the EPA. The principal focus of WM activities has been on hazardous wastes as defined by RCRA; however, all pollutant emissions into air, water and land are being considered as part of the program. The incentives include: (1) economics, (2) regulatory conformance, (3) public image and (4) environmental concern. This report discusses the waste minimization program at LLNL.

  10. Text-Based On-Line Conferencing: A Conceptual and Empirical Analysis Using a Minimal Prototype.

    Science.gov (United States)

    McCarthy, John C.; And Others

    1993-01-01

    Analyzes requirements for text-based online conferencing through the use of a minimal prototype. Topics discussed include prototyping with a minimal system; text-based communication; the system as a message passer versus the system as a shared data structure; and three exercises that showed how users worked with the prototype. (Contains 61…

  11. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  12. A majorization-minimization approach to design of power distribution networks

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Jason K [Los Alamos National Laboratory; Chertkov, Michael [Los Alamos National Laboratory

    2010-01-01

    We consider optimization approaches to design cost-effective electrical networks for power distribution. This involves a trade-off between minimizing the power loss due to resistive heating of the lines and minimizing the construction cost (modeled by a linear cost in the number of lines plus a linear cost on the conductance of each line). We begin with a convex optimization method based on the paper 'Minimizing Effective Resistance of a Graph' [Ghosh, Boyd & Saberi]. However, this does not address the Alternating Current (AC) realm and the combinatorial aspect of adding/removing lines of the network. Hence, we consider a non-convex continuation method that imposes a concave cost of the conductance of each line thereby favoring sparser solutions. By varying a parameter of this penalty we extrapolate from the convex problem (with non-sparse solutions) to the combinatorial problem (with sparse solutions). This is used as a heuristic to find good solutions (local minima) of the non-convex problem. To perform the necessary non-convex optimization steps, we use the majorization-minimization algorithm that performs a sequence of convex optimizations obtained by iteratively linearizing the concave part of the objective. A number of examples are presented which suggest that the overall method is a good heuristic for network design. We also consider how to obtain sparse networks that are still robust against failures of lines and/or generators.
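
    The majorization-minimization step, linearize the concave part of the objective at the current iterate and solve the resulting convex subproblem, can be illustrated on a toy sparsity problem (a separable quadratic fit plus a concave square-root penalty; this is not the power-network objective). The MM guarantee is that the true objective never increases:

```python
import numpy as np

def mm_sqrt_penalty(b, lam=0.5, eps=1e-8, iters=30):
    """Majorization-minimization for
        F(x) = 0.5*||x - b||^2 + lam * sum(sqrt(|x_i| + eps)),
    a convex fit term plus a concave, sparsity-favouring penalty.
    Each iteration linearizes the concave penalty at x_k (a valid
    majorizer, since a tangent lies above a concave function) and
    solves the weighted-L1 subproblem in closed form (soft-thresholding)."""
    x = b.copy()
    obj = lambda z: 0.5*np.sum((z - b)**2) + lam*np.sum(np.sqrt(np.abs(z) + eps))
    history = [obj(x)]
    for _ in range(iters):
        w = lam / (2.0*np.sqrt(np.abs(x) + eps))          # linearization weights
        x = np.sign(b) * np.maximum(np.abs(b) - w, 0.0)   # soft-threshold step
        history.append(obj(x))
    return x, history

rng = np.random.default_rng(0)
b = rng.normal(size=20)
x, hist = mm_sqrt_penalty(b)
# MM guarantee: the objective is non-increasing at every step.
```

    Sweeping the penalty weight, as the abstract's continuation method does with its concave conductance cost, extrapolates from non-sparse to sparse solutions.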

  13. Waste minimization applications at a remediation site

    International Nuclear Information System (INIS)

    Allmon, L.A.

    1995-01-01

    The Fernald Environmental Management Project (FEMP), owned by the Department of Energy, was used for the processing of uranium. In 1989 Fernald suspended production of uranium metals and was placed on the National Priorities List (NPL). The site's mission has changed from production to environmental restoration. Many groups necessary for producing a product were deemed irrelevant for remediation work, including waste minimization. Waste minimization does not readily appear to be applicable to remediation work. Environmental remediation is designed to correct adverse impacts to the environment from past operations, and it generates significant amounts of waste requiring management. The premise of pollution prevention is to avoid waste generation, so remediation is in direct conflict with this premise. Although greater amounts of waste will be generated during environmental remediation, treatment capacities are not always available and disposal is becoming more difficult and costly. This creates the need for pollution prevention and waste minimization. Applying waste minimization principles at a remediation site is an enormous challenge. If the remediation site is also radiologically contaminated, it is an even bigger challenge. Innovative techniques and ideas must be utilized to achieve reductions in the amount of waste that must be managed or dispositioned. At Fernald the waste minimization paradigm was shifted from focusing efforts on source reduction to focusing on recycle/reuse by inverting the EPA waste management hierarchy. A fundamental difference at remediation sites is that source reduction has limited applicability to legacy wastes but can be applied successfully to secondary waste generation. The bulk of measurable waste reduction will be achieved by the recycle/reuse of primary wastes and by segregation and decontamination of secondary waste streams. Each effort must be measured in terms of being economically and ecologically beneficial.

  14. Minimizing inner product data dependencies in conjugate gradient iteration

    Science.gov (United States)

    Vanrosendale, J.

    1983-01-01

    The amount of concurrency available in conjugate gradient iteration is limited by the summations required in the inner product computations. The inner product of two vectors of length N requires time c·log(N), if N or more processors are available. This paper describes an algebraic restructuring of the conjugate gradient algorithm which minimizes data dependencies due to inner product calculations. After an initial start-up, the new algorithm can perform a conjugate gradient iteration in time c·log(log(N)).
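
    The inner products that create the synchronization bottleneck are visible in a textbook CG implementation, shown here for reference (the paper's restructured variant reorganizes these dependencies; this sketch is the standard algorithm, not the restructured one):

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=200):
    """Textbook CG for a symmetric positive-definite A. The inner
    products per iteration (for alpha and the next beta) are global
    reductions: on a parallel machine each costs c*log(N) time and
    serializes the iteration, which is the data dependency the
    paper's algebraic restructuring targets."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r                      # inner product: global reduction
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)       # inner product: global reduction
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r              # reduction needed for the next beta
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(1)
M = rng.normal(size=(8, 8))
A = M @ M.T + 8*np.eye(8)           # well-conditioned SPD test matrix
b = rng.normal(size=8)
x = conjugate_gradient(A, b)
```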

  15. Comparison of different methods for shielding design in computed tomography

    International Nuclear Information System (INIS)

    Ciraj-Bjelac, O.; Arandjic, D.; Kosutic, D.

    2011-01-01

    The purpose of this work is to compare different methods for shielding calculation in computed tomography (CT). The BIR-IPEM (British Institute of Radiology and Institute of Physics and Engineering in Medicine) and NCRP (National Council on Radiation Protection) methods were used for shielding thickness calculation. Scattered dose levels and calculated barrier thicknesses were also compared with those obtained by scatter dose measurements in the vicinity of a dedicated CT unit. Minimal requirements for protective barriers based on the BIR-IPEM method ranged between 1.1 and 1.4 mm of lead, demonstrating underestimation of up to 20 % and overestimation of up to 30 % when compared with thicknesses based on measured dose levels. For the NCRP method, calculated thicknesses were 33 % higher (27-42 %). Results based on the BIR-IPEM methodology were comparable with values based on scattered dose measurements, while the NCRP methodology overestimated the minimal required barrier thickness. (authors)

  16. Finding A Minimally Informative Dirichlet Prior Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
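
    One plausible reading of the approach can be sketched as follows: fix the Dirichlet mean at the alpha-factor point estimates and choose the concentration parameter by least squares so that a marginal credible bound matches a target, leaving the prior diffuse enough to update readily with sparse data. The alpha-factor values, the target percentile, and the objective below are all illustrative assumptions, not the paper's exact objective function:

```python
import numpy as np
from scipy.stats import beta
from scipy.optimize import minimize_scalar

# Hypothetical alpha-factor point estimates (the constrained Dirichlet
# mean), and a hypothetical target 95th percentile for the rarest
# component's marginal distribution.
m = np.array([0.95, 0.04, 0.01])
target_p95 = 0.05

def loss(n):
    """Marginal of component i under Dirichlet(n*m) is
    Beta(n*m[i], n*(1 - m[i])). Penalize the squared distance of the
    rarest component's 95th percentile from the target bound."""
    a, b_ = n * m[-1], n * (1.0 - m[-1])
    return (beta.ppf(0.95, a, b_) - target_p95) ** 2

# Least-squares choice of the concentration (effective sample size) n.
res = minimize_scalar(loss, bounds=(1.0, 500.0), method="bounded")
alphas = res.x * m   # sketch of a minimally informative Dirichlet prior
```

    The means are preserved by construction; only the concentration, and hence how responsive the prior is to new data, is fitted.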

  17. Constrained convex minimization via model-based excessive gap

    OpenAIRE

    Tran Dinh, Quoc; Cevher, Volkan

    2014-01-01

    We introduce a model-based excessive gap technique to analyze first-order primal-dual methods for constrained convex minimization. As a result, we construct new primal-dual methods with optimal convergence rates on the objective residual and the primal feasibility gap of their iterates separately. Through a dual smoothing and prox-function selection strategy, our framework subsumes the augmented Lagrangian and alternating methods as special cases, where our rates apply.

  18. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  19. Phylogenetic rooting using minimal ancestor deviation.

    Science.gov (United States)

    Tria, Fernando Domingues Kümmel; Landan, Giddy; Dagan, Tal

    2017-06-19

    Ancestor-descendant relations play a cardinal role in evolutionary theory. Those relations are determined by rooting phylogenetic trees. Existing rooting methods are hampered by evolutionary rate heterogeneity or the unavailability of auxiliary phylogenetic information. Here we present a rooting approach, the minimal ancestor deviation (MAD) method, which accommodates heterotachy by using all pairwise topological and metric information in unrooted trees. We demonstrate the performance of the method, in comparison to existing rooting methods, by the analysis of phylogenies from eukaryotes and prokaryotes. MAD correctly recovers the known root of eukaryotes and uncovers evidence for the origin of cyanobacteria in the ocean. MAD is more robust and consistent than existing methods, provides measures of the root inference quality and is applicable to any tree with branch lengths.

  20. Efficient Deterministic Finite Automata Minimization Based on Backward Depth Information.

    Science.gov (United States)

    Liu, Desheng; Huang, Zhiping; Zhang, Yimeng; Guo, Xiaojun; Su, Shaojing

    2016-01-01

    Obtaining a minimal automaton is a fundamental issue in the theory and practical implementation of deterministic finite automata (DFAs). A minimization algorithm is presented in this paper that consists of two main phases. In the first phase, the backward depth information is built, and the state set of the DFA is partitioned into many blocks. In the second phase, the state set is refined using a hash table. The minimization algorithm has a lower time complexity O(n) than a naive comparison of transitions O(n²). Few states need to be refined by the hash table, because most states have already been partitioned by the backward depth information in the coarse partition. This method achieves greater generality than previous methods because building the backward depth information is independent of the topological complexity of the DFA. The proposed algorithm can be applied not only to the minimization of acyclic automata or simple cyclic automata, but also to automata with high topological complexity. Overall, the proposal has three advantages: lower time complexity, greater generality, and scalability. A comparison to Hopcroft's algorithm demonstrates experimentally that the algorithm runs faster than traditional algorithms.
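
    The coarse-partition-then-refine structure has the same shape as classical partition refinement. For contrast with the backward-depth variant, here is a plain Moore-style refinement on a toy DFA (an illustrative baseline, not the paper's algorithm):

```python
def minimize_dfa(states, alphabet, delta, accepting):
    """Moore-style partition refinement: start from the
    accepting / non-accepting split, then split any block whose states
    disagree on which block each symbol leads to, until stable.
    Returns the blocks of equivalent states (the minimal DFA's states)."""
    partition = [set(accepting), set(states) - set(accepting)]
    partition = [blk for blk in partition if blk]
    changed = True
    while changed:
        changed = False
        block_of = {s: i for i, blk in enumerate(partition) for s in blk}
        new_partition = []
        for blk in partition:
            # group states by the signature of target blocks per symbol
            groups = {}
            for s in blk:
                sig = tuple(block_of[delta[s][a]] for a in alphabet)
                groups.setdefault(sig, set()).add(s)
            new_partition.extend(groups.values())
            if len(groups) > 1:
                changed = True
        partition = new_partition
    return partition

# Toy DFA over {0, 1}: states B and C are equivalent,
# so the minimal DFA has 2 states.
delta = {"A": {"0": "B", "1": "A"},
         "B": {"0": "C", "1": "A"},
         "C": {"0": "C", "1": "A"}}
blocks = minimize_dfa({"A", "B", "C"}, ["0", "1"], delta, {"B", "C"})
```

    This naive refinement is what the paper's O(n²) baseline refers to; the backward-depth pre-partition is what lets most blocks be separated before any pairwise comparison.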

  1. Minimization of heatwave morbidity and mortality.

    Science.gov (United States)

    Kravchenko, Julia; Abernethy, Amy P; Fawzy, Maria; Lyerly, H Kim

    2013-03-01

    Global climate change is projected to increase the frequency and duration of periods of extremely high temperatures. Both the general populace and public health authorities often underestimate the impact of high temperatures on human health. To highlight the vulnerable populations and illustrate approaches to minimization of health impacts of extreme heat, the authors reviewed the studies of heat-related morbidity and mortality for high-risk populations in the U.S. and Europe from 1958 to 2012. Heat exposure not only can cause heat exhaustion and heat stroke but also can exacerbate a wide range of medical conditions. Vulnerable populations, such as older adults; children; outdoor laborers; some racial and ethnic subgroups (particularly those with low SES); people with chronic diseases; and those who are socially or geographically isolated, have increased morbidity and mortality during extreme heat. In addition to ambient temperature, heat-related health hazards are exacerbated by air pollution, high humidity, and lack of air-conditioning. Consequently, a comprehensive approach to minimize the health effects of extreme heat is required and must address educating the public of the risks and optimizing heatwave response plans, which include improving access to environmentally controlled public havens, adaptation of social services to address the challenges required during extreme heat, and consistent monitoring of morbidity and mortality during periods of extreme temperatures. Copyright © 2013 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  2. An ESDIRK Method with Sensitivity Analysis Capabilities

    DEFF Research Database (Denmark)

    Kristensen, Morten Rode; Jørgensen, John Bagterp; Thomsen, Per Grove

    2004-01-01

    of the sensitivity equations. A key feature is the reuse of information already computed for the state integration, hereby minimizing the extra effort required for sensitivity integration. Through case studies the new algorithm is compared to an extrapolation method and to the more established BDF based approaches...

  3. Inverse atmospheric radiative transfer problems - A nonlinear minimization search method of solution. [aerosol pollution monitoring

    Science.gov (United States)

    Fymat, A. L.

    1976-01-01

    The paper studies the inversion of the radiative transfer equation describing the interaction of electromagnetic radiation with atmospheric aerosols. The interaction can be considered as the propagation in the aerosol medium of two light beams: the direct beam in the line-of-sight, attenuated by absorption and scattering, and the diffuse beam arising from scattering into the viewing direction, which propagates more or less in random fashion. The latter beam has single-scattering and multiple-scattering contributions. For single scattering, the problem is reducible to first-kind Fredholm equations, while for multiple scattering it is necessary to invert partial integrodifferential equations. A nonlinear minimization search method, applicable to the solution of both types of problems, has been developed and is applied here to the problem of monitoring aerosol pollution, namely the complex refractive index and size distribution of aerosol particles.
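
    In the single-scattering regime, the first-kind Fredholm inversion amounts to fitting parameters of an assumed size distribution to observed attenuation. A minimal least-squares sketch with a made-up smooth kernel and a lognormal size distribution; the grid, kernel, and parameter values are all hypothetical, and this is a generic nonlinear least-squares fit, not the paper's search method:

```python
import numpy as np
from scipy.optimize import least_squares

r = np.linspace(0.05, 2.0, 100)     # hypothetical particle-radius grid (um)
lams = np.linspace(0.3, 1.0, 40)    # hypothetical wavelengths (um)

def forward(params):
    """Toy single-scattering forward model: attenuation at each wavelength
    is a kernel-weighted integral over a lognormal size distribution
    n(r; mu, sigma). The kernel is invented for illustration only."""
    mu, sigma = params
    n_r = np.exp(-(np.log(r) - mu)**2 / (2.0*sigma**2)) / (r * sigma)
    K = np.exp(-(lams[:, None] - r[None, :])**2 / 0.02)  # smooth toy kernel
    return (K @ n_r) * (r[1] - r[0])

true_params = (-0.5, 0.4)
data = forward(true_params)         # synthetic noiseless "observations"

# Nonlinear minimization search: least-squares fit of (mu, sigma) to data.
res = least_squares(lambda p: forward(p) - data, x0=(0.0, 0.6),
                    bounds=([-2.0, 0.05], [2.0, 2.0]))
```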

  4. Performance potential of mechanical ventilation systems with minimized pressure loss

    DEFF Research Database (Denmark)

    Terkildsen, Søren; Svendsen, Svend

    2013-01-01

    In many locations mechanical ventilation has been the most widely used principle of ventilation over the last 50 years, but the conventional system design must be revised to comply with future energy requirements. This paper examines the options and describes a concept for the design of mechanical ventilation systems with minimal pressure loss and minimal energy use. This can provide comfort ventilation and avoid overheating through increased ventilation and night cooling. Based on this concept, a test system was designed for a fictive office building and its performance was documented using building simulations that quantify fan power consumption, heating demand and indoor environmental conditions. The system was designed with minimal pressure loss in the duct system and heat exchanger. It also uses state-of-the-art components such as electrostatic precipitators, diffuse ceiling inlets and demand-controlled ventilation.

  5. Temporal structure of consciousness and minimal self in schizophrenia

    Directory of Open Access Journals (Sweden)

    Brice eMartin

    2014-10-01

    Full Text Available The concept of the minimal self refers to the consciousness of oneself as an immediate subject of experience. According to recent studies, disturbances of the minimal self may be a core feature of schizophrenia. They are emphasized in classical psychiatry literature and in phenomenological work. Impaired minimal self experience may be defined as a distortion of one’s first-person experiential perspective as, for example, an ‘altered presence’ during which the sense of the experienced self (‘mineness’) is subtly affected, or an ‘altered sense of demarcation’, i.e. a difficulty discriminating the self from the non-self. Little is known, however, about the cognitive basis of these disturbances. In fact, recent work indicates that disorders of the self are not correlated with cognitive impairments commonly found in schizophrenia, such as working-memory and attention disorders. In addition, a major difficulty with exploring the minimal self experimentally lies in its definition as being non-self-reflexive, and distinct from the verbalized, explicit awareness of an ‘I’. In this paper we shall discuss the possibility that disturbances of the minimal self observed in patients with schizophrenia are related to alterations in time processing. We shall review the literature on schizophrenia and time processing that lends support to this possibility. In particular, we shall discuss the involvement of temporal integration windows on different time scales (implicit time processing) as well as duration perception disturbances (explicit time processing) in disorders of the minimal self. We argue that a better understanding of the relationship between time and the minimal self, as well as of issues of embodiment, requires research that looks more specifically at implicit time processing. Some methodological issues will be discussed.

  6. Mixed-Methods Design in Biology Education Research: Approach and Uses

    Science.gov (United States)

    Warfa, Abdi-Rizak M.

    2016-01-01

    Educational research often requires mixing different research methodologies to strengthen findings, better contextualize or explain results, or minimize the weaknesses of a single method. This article provides practical guidelines on how to conduct such research in biology education, with a focus on mixed-methods research (MMR) that uses both…

  7. Effective Teaching Methods in Higher Education: Requirements and Barriers.

    Science.gov (United States)

    Shirani Bidabadi, Nahid; Nasr Isfahani, Ahmmadreza; Rouhollahi, Amir; Khalili, Roya

    2016-10-01

    Teaching is one of the main components of educational planning, which is a key factor in conducting educational plans. Despite the importance of good teaching, the outcomes are far from ideal. The present qualitative study aimed to investigate effective teaching in higher education in Iran based on the experiences of the best professors in the country and the best local professors of Isfahan University of Technology. This qualitative content analysis study was conducted through purposeful sampling. Semi-structured interviews were conducted with ten faculty members (3 from the best professors in the country and 7 from the best local professors). Content analysis was performed with MAXQDA software. The codes, categories and themes were explored through an inductive process that began with semantic units or direct quotations and moved to general themes. According to the results of this study, the best teaching approach is a mixed method (student-centered together with teacher-centered) combined with educational planning and prior preparation. However, teachers who want to teach with this method face certain barriers and requirements: some requirements concern professors' behavior and others professors' outlook. There are also major barriers, some associated with professors' practice and others with laws and regulations. Implications of these findings for teacher preparation in education are discussed. The present study illustrated that a good teaching method helps students to question their preconceptions and motivates them to learn, by putting them in a situation in which they come to see themselves as the authors of answers and agents of responsibility for change. But training through this method has some barriers and requirements. To achieve effective teaching, the faculty members of universities should be aware of these barriers and requirements as a way to…

  8. Minimally invasive pediatric surgery: Increasing implementation in daily practice and resident's training

    NARCIS (Netherlands)

    E.A.T. Velde (Te); N.M.A. Bax (Klaas); S.H.A.J. Tytgat; J.R. de Jong (Justin); D.V. Travassos (Vieira); W.L.M. Kramer; D.C. van der Zee (David)

    2008-01-01

    Background: In 1998, the one-year experience in minimally invasive abdominal surgery in children at a pediatric training center was assessed. Seven years later, we determined the current status of pediatric minimally invasive surgery in daily practice and surgical training. Methods: A

  9. Minimizing hydride cracking in zirconium alloys

    International Nuclear Information System (INIS)

    Coleman, C.E.; Cheadle, B.A.; Ambler, J.F.R.; Eadie, R.L.

    1985-01-01

    Zirconium alloy components can fail by hydride cracking if they contain large flaws and are highly stressed. If cracking in such components is suspected, crack growth can be minimized by following two simple operating rules: components should be heated up from at least 30K below any operating temperature above 450K, and when the component requires cooling to room temperature from a high temperature, any tensile stress should be reduced as much and as quickly as is practical during cooling. This paper describes the physical basis for these rules

  10. Global Sufficient Optimality Conditions for a Special Cubic Minimization Problem

    Directory of Open Access Journals (Sweden)

    Xiaomei Zhang

    2012-01-01

    Full Text Available We present some sufficient global optimality conditions for a special cubic minimization problem with box constraints or binary constraints by extending the global subdifferential approach proposed by V. Jeyakumar et al. (2006). The present conditions generalize the results developed in the work of V. Jeyakumar et al., where a quadratic minimization problem with box constraints or binary constraints was considered. In addition, a special diagonal matrix is constructed, which is used to provide a convenient method for justifying the proposed sufficient conditions. Then, the reformulation of the sufficient conditions follows. It is worth noting that this reformulation is also applicable to the quadratic minimization problem with box or binary constraints considered in the works of V. Jeyakumar et al. (2006) and Y. Wang et al. (2010). Finally, some examples demonstrate that our optimality conditions can effectively be used for identifying global minimizers of certain nonconvex cubic minimization problems.
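
    As an illustrative sanity check (not the paper's optimality conditions), a candidate global minimizer of a separable cubic over a box can be verified against a dense grid; the objective and box below are invented for the sketch:

    ```python
    import numpy as np

    def f(x):
        """Separable cubic objective, evaluated coordinate-wise and summed:
        f(x) = sum_i 0.5*x_i^3 + x_i^2 - 2*x_i, over the box [-1, 1]^2."""
        return np.sum(0.5 * x**3 + x**2 - 2.0 * x, axis=-1)

    grid = np.linspace(-1.0, 1.0, 201)
    X, Y = np.meshgrid(grid, grid)
    vals = f(np.stack([X, Y], axis=-1))
    idx = np.unravel_index(np.argmin(vals), vals.shape)
    x_star = np.array([X[idx], Y[idx]])
    # Calculus check per coordinate: g'(t) = 1.5 t^2 + 2 t - 2 vanishes at
    # t = 2/3 inside [-1, 1], which the grid search recovers.
    print(x_star, vals[idx])
    ```

    For nonconvex cubics a dense grid is only feasible in low dimension, which is precisely why verifiable sufficient conditions such as those in the paper are valuable.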

  11. Cauliflower ear – a minimally invasive treatment method in a wrestling athlete: a case report

    Directory of Open Access Journals (Sweden)

    Haik J

    2018-01-01

    Full Text Available Josef Haik,1–4 Or Givol,2 Rachel Kornhaber,1,5 Michelle Cleary,6 Hagit Ofir,1,2 Moti Harats1–3 1Department of Plastic and Reconstructive Surgery, Sheba Medical Center, Tel Hashomer, Ramat Gan, 2Sackler School of Medicine, Tel Aviv University, Tel Aviv, Israel; 3Burn Injury Research Node, Institute for Health Research University of Notre Dame Fremantle, Fremantle WA, Australia; 4Talpiot Leadership Program, Sheba Medical Center, Tel Hashomer, Ramat Gan, Israel; 5Faculty of Health, 6School of Health Sciences, College of Health and Medicine, University of Tasmania, Sydney, NSW, Australia Abstract: Acute auricular hematoma can be caused by direct blunt trauma or other injury to the external ear. It is typically seen in those who practice full contact sports such as boxing, wrestling, and rugby. “Cauliflower ear” deformity, fibrocartilage formation during scarring, is a common complication of auricular hematomas. Therefore, acute drainage of the hematoma and postprocedural techniques for preventing recurrence are necessary for preventing the deformity. There are many techniques, although no superior method of treatment has been found. In this case report, we describe a novel method using needle aspiration followed by the application of a magnet and an adapted disc to the affected area of the auricle. This minimally invasive, simple, and accessible method could potentially facilitate the treatment of cauliflower ear among full contact sports athletes. Keywords: cauliflower ear, hematoma, ear deformity, athletic injuries, wrestling, case report

  12. Splines and variational methods

    CERN Document Server

    Prenter, P M

    2008-01-01

    One of the clearest available introductions to variational methods, this text requires only a minimal background in calculus and linear algebra. Its self-contained treatment explains the application of theoretic notions to the kinds of physical problems that engineers regularly encounter. The text's first half concerns approximation theoretic notions, exploring the theory and computation of one- and two-dimensional polynomial and other spline functions. Later chapters examine variational methods in the solution of operator equations, focusing on boundary value problems in one and two dimensions.
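
    As a small illustration of the spline approximation theory such a text covers, a cubic spline interpolant built from only a few knots already tracks a smooth function closely (a sketch using SciPy; the knot count and target function are chosen for illustration):

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    # Interpolate sin(x) from 9 knots and measure the worst-case error on a
    # fine grid; the cubic spline error shrinks like h^4 in the knot spacing h.
    x_knots = np.linspace(0.0, 2.0 * np.pi, 9)
    spline = CubicSpline(x_knots, np.sin(x_knots))

    x_fine = np.linspace(0.0, 2.0 * np.pi, 1000)
    err = float(np.max(np.abs(spline(x_fine) - np.sin(x_fine))))
    print(err)
    ```

    Halving the knot spacing should reduce the maximum error by roughly a factor of 16, the h^4 behavior the approximation-theoretic chapters derive.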

  13. OPTIM, Minimization of Band-Width of Finite Elements Problems

    International Nuclear Information System (INIS)

    Huart, M.

    1977-01-01

    1 - Nature of the physical problem solved: To minimize the band-width of finite element problems. 2 - Method of solution: A surface is constructed from the x-y-coordinates of each node using its node number as z-value. This surface consists of triangles. Nodes are renumbered in such a way as to minimize the surface area. 3 - Restrictions on the complexity of the problem: This program is applicable to 2-D problems. It is dimensioned for a maximum of 1000 elements
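
    The same bandwidth-reduction goal is commonly pursued today with the reverse Cuthill-McKee reordering, a different heuristic from OPTIM's surface-area minimization. A minimal sketch using SciPy, with a made-up 6-node connectivity pattern standing in for a finite element mesh:

    ```python
    import numpy as np
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import reverse_cuthill_mckee

    def bandwidth(A):
        """Half-bandwidth of a sparse matrix: max |i - j| over its nonzeros."""
        coo = A.tocoo()
        return int(np.max(np.abs(coo.row - coo.col)))

    # A small symmetric connectivity pattern with a deliberately poor numbering.
    rows = [0, 0, 1, 2, 3, 3, 4, 5]
    cols = [5, 2, 4, 5, 4, 1, 2, 3]
    A = csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(6, 6))
    A = (A + A.T).tocsr()  # symmetrize the adjacency structure

    perm = reverse_cuthill_mckee(A, symmetric_mode=True)
    A_perm = A[perm][:, perm]  # renumber nodes with the computed permutation
    print(bandwidth(A), "->", bandwidth(A_perm))
    ```

    As in OPTIM, only the node numbering changes; the mesh itself is untouched, and the narrower band directly reduces the storage and factorization cost of the assembled system.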

  14. Waste Minimization Policy at the Romanian Nuclear Power Plant

    International Nuclear Information System (INIS)

    Andrei, V.; Daian, I.

    2002-01-01

    The radioactive waste management system at Cernavoda Nuclear Power Plant (NPP) in Romania was designed to maintain acceptable levels of safety for workers and to protect human health and the environment from exposure to unacceptable levels of radiation. In accordance with the terminology of the International Atomic Energy Agency (IAEA), this system consists of the "pretreatment" of solid and organic liquid radioactive waste, which may include part or all of the following activities: collection, handling, volume reduction (by an in-drum compactor, if appropriate), and storage. Gaseous and aqueous liquid wastes are managed according to the "dilute and discharge" strategy. Taking into account the fact that treatment/conditioning and disposal technologies are still not established, waste minimization at the source is a priority environmental management objective, while waste minimization at the disposal stage is presently just a theoretical requirement for technologies to be adopted in the future. The necessary operational and maintenance procedures are in place at Cernavoda to minimize the production and contamination of waste. Administrative and technical measures are established to minimize waste volumes. Thus, an annual environmental target of a maximum 30 m3 of radioactive waste volume arising from operation and maintenance has been established. Within the first five years of operations at Cernavoda NPP, this target has been met. The successful implementation of the waste minimization policy has been accompanied by a cost reduction, while the occupational doses for plant workers have been maintained at as low as reasonably practicable levels. This paper will describe key features of the waste management system along with the actual experience that has been realized with respect to minimizing the waste volumes at the Cernavoda NPP

  15. Free energy minimization to predict RNA secondary structures and computational RNA design.

    Science.gov (United States)

    Churkin, Alexander; Weinbrand, Lina; Barash, Danny

    2015-01-01

    Determining the RNA secondary structure from sequence data by computational predictions is a long-standing problem. Its solution has been approached in two distinctive ways. If a multiple sequence alignment of a collection of homologous sequences is available, the comparative method uses phylogeny to determine conserved base pairs that are more likely to form as a result of billions of years of evolution than by chance. In the case of single sequences, recursive algorithms that compute free energy structures by using empirically derived energy parameters have been developed. This latter approach of RNA folding prediction by energy minimization is widely used to predict RNA secondary structure from sequence. For a significant number of RNA molecules, the secondary structure of the RNA molecule is indicative of its function and its computational prediction by minimizing its free energy is important for its functional analysis. A general method for free energy minimization to predict RNA secondary structures is dynamic programming, although other optimization methods have been developed as well along with empirically derived energy parameters. In this chapter, we introduce and illustrate by examples the approach of free energy minimization to predict RNA secondary structures.
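
    As a hedged illustration of the dynamic-programming idea, the simpler Nussinov recursion maximizes the number of base pairs instead of minimizing an empirically parameterized free energy; production tools use far richer energy models, but the recursive structure is analogous:

    ```python
    def nussinov(seq, min_loop=3):
        """Nussinov base-pair maximization: a simplified stand-in for the
        dynamic-programming recursions used in free-energy minimization.
        Watson-Crick and G-U wobble pairs are allowed; hairpin loops must
        contain at least `min_loop` unpaired bases."""
        pairs = {("A", "U"), ("U", "A"), ("G", "C"),
                 ("C", "G"), ("G", "U"), ("U", "G")}
        n = len(seq)
        dp = [[0] * n for _ in range(n)]
        for span in range(min_loop + 1, n):
            for i in range(n - span):
                j = i + span
                best = dp[i][j - 1]  # base j left unpaired
                for k in range(i, j - min_loop):  # pair (k, j), split at k
                    if (seq[k], seq[j]) in pairs:
                        left = dp[i][k - 1] if k > i else 0
                        best = max(best, left + 1 + dp[k + 1][j - 1])
                dp[i][j] = best
        return dp[0][n - 1]

    print(nussinov("GGGAAAUCCC"))
    ```

    For the hairpin-forming sequence above the recursion finds the three-pair G-C stem; energy-minimizing folders replace the "+1 per pair" score with stacking and loop energies but fill an analogous triangular table.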

  16. Identifying and prioritizing customer requirements from tractor production by QFD method

    Directory of Open Access Journals (Sweden)

    H Taghizadeh

    2017-05-01

    Full Text Available Introduction: Discovering and understanding customer needs and expectations are important factors in customer satisfaction and play a vital role in keeping a company viable among its competitors; meeting customer needs is critical to designing a successful product, so successful organizations must satisfy those needs, including the quality of the products or services delivered to customers. Quality Function Deployment (QFD) is a technique for studying the demands and needs of customers, and in this way it gives greater emphasis to the customer's interests. The QFD method generally implements various tools and methods for reaching qualitative goals, but its most important and central tool is the house of quality diagram. The Analytic Hierarchy Process (AHP) is a well-known MADM method based on pairwise comparisons, used to determine the priority of the factors under study. Considering the effectiveness of QFD in explicating customers' demands and obtaining customer satisfaction, the researchers pursued a scientific answer to the following question: how can QFD explicate the real demands and requirements of customers for the final tractor product, and how are these demands and requirements prioritized from the customers' point of view? Accordingly, the aim of this study was to identify and prioritize the customer requirements of the Massey Ferguson (MF) 285 tractor produced by the Iran Tractor Manufacturing Company, using Student's t-test, AHP, and QFD methods. Materials and Methods: The research method was descriptive, and the statistical population included all tractor customers of the Tractor Manufacturing Company in Iran from March 2011 to March 2015. The statistical sample size was 171, determined with the Cochran index. Moreover, the opinions of 20 experts were considered for determining the product's technical requirements. Literature
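
    The AHP step described above can be sketched numerically. The 3x3 pairwise comparison matrix below is invented for illustration (it is not the study's data); priorities are taken as the principal eigenvector and the judgments are checked with Saaty's consistency ratio:

    ```python
    import numpy as np

    # Hypothetical pairwise comparisons of three customer requirements
    # (e.g., fuel economy vs. ease of maintenance vs. cabin comfort).
    # A[i][j] is how much more important requirement i is than j (1-9 scale).
    A = np.array([
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ])

    # The principal right eigenvector gives the priority weights.
    eigvals, eigvecs = np.linalg.eig(A)
    principal = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, principal].real)
    w /= w.sum()

    # Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI, RI(3) = 0.58.
    n = A.shape[0]
    lambda_max = eigvals.real[principal]
    CR = (lambda_max - n) / (n - 1) / 0.58
    print(np.round(w, 3), round(CR, 3))
    ```

    A CR below 0.1 is conventionally taken to mean the expert judgments are consistent enough to use; the resulting weights would then feed the importance column of the house of quality.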

  17. Replica Approach for Minimal Investment Risk with Cost

    Science.gov (United States)

    Shinzato, Takashi

    2018-06-01

    In the present work, the optimal portfolio minimizing the investment risk with cost is discussed analytically, where an objective function is constructed in terms of two negative aspects of investment, the risk and cost. We note the mathematical similarity between the Hamiltonian in the mean-variance model and the Hamiltonians in the Hopfield model and the Sherrington-Kirkpatrick model, show that we can analyze this portfolio optimization problem by using replica analysis, and derive the minimal investment risk with cost and the investment concentration of the optimal portfolio. Furthermore, we validate our proposed method through numerical simulations.
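
    The replica-analysis result itself does not reduce to a short program, but the underlying mean-variance object can be illustrated numerically. A minimal sketch (synthetic returns, no cost term) computing the closed-form minimum-variance portfolio under a budget constraint:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic return series for N assets over T periods (illustrative only).
    N, T = 5, 500
    returns = rng.normal(0.0, 0.02, size=(T, N))
    Sigma = np.cov(returns, rowvar=False)

    # Minimum-variance weights under the budget constraint sum(w) = 1:
    #   w* = Sigma^{-1} 1 / (1^T Sigma^{-1} 1)
    ones = np.ones(N)
    x = np.linalg.solve(Sigma, ones)
    w = x / (ones @ x)

    risk = w @ Sigma @ w  # minimal investment risk (cost term omitted)
    print(np.round(w, 3), risk)
    ```

    Adding the cost term of the paper tilts the optimum away from this closed form, which is where the replica machinery earns its keep in the large-N limit.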

  18. Is non-minimal inflation eternal?

    International Nuclear Information System (INIS)

    Feng, Chao-Jun; Li, Xin-Zhou

    2010-01-01

    The possibility that non-minimal coupling inflation could be eternal is investigated. We calculate the quantum fluctuation of the inflaton in a Hubble time and find that it has the same value as in the minimal case in the slow-roll limit. Armed with this result, we have studied some concrete non-minimal inflationary models, including chaotic inflation and natural inflation, in which the inflaton is non-minimally coupled to gravity. We find that non-minimal coupling inflation could be eternal in some parameter spaces.

  19. Sportsmen’s Groin—Diagnostic Approach and Treatment With the Minimal Repair Technique

    Science.gov (United States)

    Muschaweck, Ulrike; Berger, Luise Masami

    2010-01-01

    Context: Sportsmen’s groin, also called sports hernia and Gilmore groin, is one of the most frequent sports injuries in athletes and may place an athletic career at risk. It presents with acute or chronic groin pain exacerbated with physical activity. So far, there is little consensus regarding pathogenesis, diagnostic criteria, or treatment. There have been various attempts to explain the cause of the groin pain. The assumption is that a circumscribed weakness in the posterior wall of the inguinal canal, which leads to a localized bulge, induces a compression of the genital branch of the genitofemoral nerve, considered responsible for the symptoms. Methods: The authors developed an innovative open suture repair—the Minimal Repair technique—to fit the needs of professional athletes. With this technique, the circumscribed weakness of the posterior wall of the inguinal canal is repaired by an elastic suture; the compression on the nerve is abolished, and the cause of the pain is removed. In contrast with that of common open suture repairs, the defect of the posterior wall is not enlarged, the suture is nearly tension free, and the patient can return to full training and athletic activity within a shorter time. The outcome of patients undergoing operations with the Minimal Repair technique was compared with that of commonly used surgical procedures. Results: The following advantages of the Minimal Repair technique were found: no insertion of prosthetic mesh, no general anesthesia required, less traumatization, and lower risk of severe complications with equal or even faster convalescence. In 2009, a prospective cohort of 129 patients resumed training in 7 days and experienced complete pain relief in an average of 14 days. Professional athletes (67%) returned to full activity in 14 days (median). Conclusion: The Minimal Repair technique is an effective and safe way to treat sportsmen’s groin. PMID:23015941

  20. Functional Mobility Testing: A Novel Method to Establish Human System Interface Design Requirements

    Science.gov (United States)

    England, Scott A.; Benson, Elizabeth A.; Rajulu, Sudhakar

    2008-01-01

    Across all fields of human-system interface design, it is vital to possess a sound methodology dictating the constraints on the system based on the capabilities of the human user. These limitations may be based on strength, mobility, dexterity, cognitive ability, etc., and combinations thereof. Data collected in an isolated environment to determine, for example, maximal strength or maximal range of motion would indeed be adequate for establishing not-to-exceed design limitations; however, these restraints on the system may exceed what is minimally needed. Resources may potentially be saved by having a technique to determine the minimum measurements a system must accommodate. This paper specifically deals with the creation of a novel methodology for establishing mobility requirements for a new generation of space suit design concepts. Historically, the Space Shuttle and International Space Station vehicle and space hardware design requirements documents, such as the Man-Systems Integration Standards and the International Space Station Flight Crew Integration Standard, explicitly stated that designers should strive to provide the maximum joint range of motion capabilities exhibited by a minimally clothed human subject. In the course of developing the Human-Systems Integration Requirements (HSIR) for the new space exploration initiative (Constellation), an effort was made to redefine the mobility requirements in the interest of safety and cost. Systems designed for manned space exploration can receive compounded gains from simplified designs that are both initially less expensive to produce and lighter and, thereby, cheaper to launch.

  1. Hexavalent Chromium Minimization Strategy

    Science.gov (United States)

    2011-05-01

    Logistics Initiative - DoD Hexavalent Chromium Minimization: Non-Chrome Primer. Management Office of the Secretary of Defense, Hexavalent Chromium Minimization Strategy.

  2. Significant reduction in blood loss in patients undergoing minimal extracorporeal circulation

    NARCIS (Netherlands)

    Gerritsen, W. B.; van Boven, W. J.; Smelt, M.; Morshuis, W. J.; van Dongen, H. P.; Haas, F. J.; Aarts, L. P.

    2006-01-01

    Several recent studies have shown differences in blood loss and allogeneic transfusion requirements between on-pump and off-pump coronary artery bypass grafting (CABG). Recently a new concept, the mini-extracorporeal circulation, was introduced to minimize the side effects of extracorporeal

  3. NUFFT-Based Iterative Image Reconstruction via Alternating Direction Total Variation Minimization for Sparse-View CT

    Directory of Open Access Journals (Sweden)

    Bin Yan

    2015-01-01

    Full Text Available Sparse-view imaging is a promising scanning method which can reduce the radiation dose in X-ray computed tomography (CT). The reconstruction algorithm for a sparse-view imaging system is of significant importance. Spatial-domain iterative algorithms for CT image reconstruction have low operational efficiency and high computational requirements. A novel Fourier-based iterative reconstruction technique that utilizes the nonuniform fast Fourier transform (NUFFT) is presented in this study, along with advanced total variation (TV) regularization, for sparse-view CT. Combined with the alternating direction method, the proposed approach shows excellent efficiency and a rapid convergence property. Numerical simulations and real data experiments are performed on a parallel-beam CT. Experimental results validate that, for the same time duration, the proposed method achieves higher computational efficiency and better reconstruction quality than conventional algorithms, such as the simultaneous algebraic reconstruction technique with TV regularization and the alternating direction total variation minimization approach. The proposed method appears to have extensive applications in X-ray CT imaging.
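
    Full alternating-direction CT reconstruction is beyond a short example, but the TV-regularization ingredient can be sketched in one dimension. The snippet below is an illustrative simplification (smoothed TV, plain gradient descent rather than the paper's alternating direction scheme) that denoises a piecewise-constant signal:

    ```python
    import numpy as np

    def tv_denoise(y, lam=0.5, eps=1e-2, steps=3000, lr=0.1):
        """Gradient descent on the smoothed total-variation objective
        0.5*||x - y||^2 + lam * sum_i sqrt((x[i+1] - x[i])^2 + eps)."""
        x = y.copy()
        for _ in range(steps):
            d = np.diff(x)
            g = d / np.sqrt(d**2 + eps)   # derivative of the smoothed |d|
            grad = x - y                  # data-fidelity gradient
            grad[:-1] -= lam * g          # d/dx_i of the i-th difference term
            grad[1:] += lam * g           # d/dx_{i+1} of the same term
            x -= lr * grad
        return x

    # Noisy piecewise-constant signal: TV regularization restores flat pieces.
    rng = np.random.default_rng(1)
    clean = np.concatenate([np.zeros(50), np.ones(50)])
    noisy = clean + rng.normal(0.0, 0.2, size=clean.size)
    denoised = tv_denoise(noisy)
    print(np.mean((noisy - clean) ** 2), np.mean((denoised - clean) ** 2))
    ```

    TV penalizes the sum of absolute jumps, so noise is smoothed out while the genuine edge survives; in the paper the same prior acts on 2-D image gradients inside an NUFFT-based data-consistency loop.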

  4. A study on the theoretical and practical accuracy of conoscopic holography-based surface measurements: toward image registration in minimally invasive surgery.

    Science.gov (United States)

    Burgner, J; Simpson, A L; Fitzpatrick, J M; Lathrop, R A; Herrell, S D; Miga, M I; Webster, R J

    2013-06-01

    Registered medical images can assist with surgical navigation and enable image-guided therapy delivery. In soft tissues, surface-based registration is often used and can be facilitated by laser surface scanning. Tracked conoscopic holography (which provides distance measurements) has recently been proposed as a minimally invasive way to obtain surface scans. Moving this technique from concept to clinical use requires a rigorous accuracy evaluation, which is the purpose of our paper. We adapt recent non-homogeneous and anisotropic point-based registration results to provide a theoretical framework for predicting the accuracy of tracked distance measurement systems. Experiments are conducted on complex objects of defined geometry, an anthropomorphic kidney phantom, and a human cadaver kidney. Experiments agree with model predictions, producing consistently low point RMS errors. Tracked conoscopic holography is clinically viable; it enables minimally invasive surface scan accuracy comparable to current clinical methods that require open surgery. Copyright © 2012 John Wiley & Sons, Ltd.
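
    The point-based registration underlying such accuracy analyses is commonly solved with the SVD-based least-squares method (Arun/Kabsch). A minimal noise-free sketch with synthetic points (not the paper's data):

    ```python
    import numpy as np

    def rigid_register(src, dst):
        """Least-squares rigid (rotation + translation) alignment of matched
        3-D point sets via SVD (the Arun/Kabsch method)."""
        mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
        H = (src - mu_s).T @ (dst - mu_d)          # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = mu_d - R @ mu_s
        return R, t

    rng = np.random.default_rng(3)
    src = rng.normal(size=(20, 3))
    theta = 0.4
    R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                       [np.sin(theta),  np.cos(theta), 0.0],
                       [0.0, 0.0, 1.0]])
    dst = src @ R_true.T + np.array([1.0, -2.0, 0.5])

    R, t = rigid_register(src, dst)
    # Fiducial registration error: RMS residual after alignment.
    fre = np.sqrt(np.mean(np.sum((src @ R.T + t - dst) ** 2, axis=1)))
    print(fre)
    ```

    With real tracked measurements the residual is nonzero, and the paper's non-homogeneous, anisotropic error theory predicts how that residual distributes over the scanned surface.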

  5. Optimal replacement of residential air conditioning equipment to minimize energy, greenhouse gas emissions, and consumer cost in the US

    International Nuclear Information System (INIS)

    De Kleine, Robert D.; Keoleian, Gregory A.; Kelly, Jarod C.

    2011-01-01

    A life cycle optimization of the replacement of residential central air conditioners (CACs) was conducted in order to identify replacement schedules that minimized three separate objectives: life cycle energy consumption, greenhouse gas (GHG) emissions, and consumer cost. The analysis was conducted for the time period of 1985-2025 for Ann Arbor, MI and San Antonio, TX. Using annual sales-weighted efficiencies of residential CAC equipment, the tradeoff between potential operational savings and the burdens of producing new, more efficient equipment was evaluated. The optimal replacement schedule for each objective was identified for each location and service scenario. In general, minimizing energy consumption required frequent replacement (4-12 replacements), minimizing GHG required fewer replacements (2-5 replacements), and minimizing cost required the fewest replacements (1-3 replacements) over the time horizon. Scenario analysis of different federal efficiency standards, regional standards, and Energy Star purchases were conducted to quantify each policy's impact. For example, a 16 SEER regional standard in Texas was shown to either reduce primary energy consumption 13%, GHGs emissions by 11%, or cost by 6-7% when performing optimal replacement of CACs from 2005 or before. The results also indicate that proper servicing should be a higher priority than optimal replacement to minimize environmental burdens. - Highlights: → Optimal replacement schedules for residential central air conditioners were found. → Minimizing energy required more frequent replacement than minimizing consumer cost. → Significant variation in optimal replacement was observed for Michigan and Texas. → Rebates for altering replacement patterns are not cost effective for GHG abatement. → Maintenance levels were significant in determining the energy and GHG impacts.
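
    The keep-or-replace structure of such life cycle optimizations can be sketched as a small dynamic program. The numbers below are invented for illustration and are not the study's data:

    ```python
    def optimal_replacements(years, embodied, use_energy):
        """use_energy[v]: annual operating energy of a unit bought in year v.
        Each year we either keep the current unit or replace it (paying the
        `embodied` energy of manufacturing again); returns the minimal total
        energy over `years` years, starting with a unit bought in year 0."""
        INF = float("inf")
        best = {0: float(embodied)}   # vintage -> minimal energy so far
        for t in range(years):
            nxt = {}
            for v, e in best.items():
                keep = e + use_energy[v]             # keep the current unit
                if keep < nxt.get(v, INF):
                    nxt[v] = keep
                rep = e + embodied + use_energy[t]   # replace this year
                if rep < nxt.get(t, INF):
                    nxt[t] = rep
            best = nxt
        return min(best.values())

    # Hypothetical trend: operating energy falls 5% per vintage year.
    use = [100 * 0.95 ** v for v in range(30)]
    print(optimal_replacements(30, embodied=150, use_energy=use))
    ```

    The tradeoff the study quantifies appears directly: a low embodied burden relative to efficiency gains favors frequent replacement, while a high one (as with consumer cost) favors keeping equipment longer.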

  6. Restoration ecology: two-sex dynamics and cost minimization.

    Directory of Open Access Journals (Sweden)

    Ferenc Molnár

    Full Text Available We model a spatially detailed, two-sex population dynamics, to study the cost of ecological restoration. We assume that cost is proportional to the number of individuals introduced into a large habitat. We treat dispersal as homogeneous diffusion in a one-dimensional reaction-diffusion system. The local population dynamics depends on sex ratio at birth, and allows mortality rates to differ between sexes. Furthermore, local density dependence induces a strong Allee effect, implying that the initial population must be sufficiently large to avert rapid extinction. We address three different initial spatial distributions for the introduced individuals; for each we minimize the associated cost, constrained by the requirement that the species must be restored throughout the habitat. First, we consider spatially inhomogeneous, unstable stationary solutions of the model's equations as plausible candidates for small restoration cost. Second, we use numerical simulations to find the smallest rectangular cluster, enclosing a spatially homogeneous population density, that minimizes the cost of assured restoration. Finally, by employing simulated annealing, we minimize restoration cost among all possible initial spatial distributions of females and males. For biased sex ratios, or for a significant between-sex difference in mortality, we find that sex-specific spatial distributions minimize the cost. But as long as the sex ratio maximizes the local equilibrium density for given mortality rates, a common homogeneous distribution for both sexes that spans a critical distance yields a similarly low cost.

  7. Restoration ecology: two-sex dynamics and cost minimization.

    Science.gov (United States)

    Molnár, Ferenc; Caragine, Christina; Caraco, Thomas; Korniss, Gyorgy

    2013-01-01

    We model a spatially detailed, two-sex population dynamics, to study the cost of ecological restoration. We assume that cost is proportional to the number of individuals introduced into a large habitat. We treat dispersal as homogeneous diffusion in a one-dimensional reaction-diffusion system. The local population dynamics depends on sex ratio at birth, and allows mortality rates to differ between sexes. Furthermore, local density dependence induces a strong Allee effect, implying that the initial population must be sufficiently large to avert rapid extinction. We address three different initial spatial distributions for the introduced individuals; for each we minimize the associated cost, constrained by the requirement that the species must be restored throughout the habitat. First, we consider spatially inhomogeneous, unstable stationary solutions of the model's equations as plausible candidates for small restoration cost. Second, we use numerical simulations to find the smallest rectangular cluster, enclosing a spatially homogeneous population density, that minimizes the cost of assured restoration. Finally, by employing simulated annealing, we minimize restoration cost among all possible initial spatial distributions of females and males. For biased sex ratios, or for a significant between-sex difference in mortality, we find that sex-specific spatial distributions minimize the cost. But as long as the sex ratio maximizes the local equilibrium density for given mortality rates, a common homogeneous distribution for both sexes that spans a critical distance yields a similarly low cost.
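
    The population model described above can be sketched numerically: a 1-D explicit finite-difference integration of a simplified (single-sex) reaction-diffusion equation with a strong Allee effect, showing that a too-narrow introduction collapses while a wide one persists. Parameters are illustrative, not the paper's:

    ```python
    import numpy as np

    def simulate(u0, D=1.0, r=1.0, a=0.45, dx=0.5, dt=0.05, steps=4000):
        """Explicit finite differences for du/dt = D*u_xx + r*u*(u - a)*(1 - u)
        (strong Allee effect with threshold `a`), zero-flux boundaries."""
        u = u0.copy()
        for _ in range(steps):
            lap = np.empty_like(u)
            lap[1:-1] = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
            lap[0] = (u[1] - u[0]) / dx**2       # Neumann (zero-flux) ends
            lap[-1] = (u[-2] - u[-1]) / dx**2
            u = u + dt * (D * lap + r * u * (u - a) * (1.0 - u))
        return u

    x = np.arange(200) * 0.5
    narrow = np.where(np.abs(x - 50) < 1.0, 1.0, 0.0)   # sub-critical cluster
    wide = np.where(np.abs(x - 50) < 15.0, 1.0, 0.0)    # super-critical cluster
    print(simulate(narrow).max(), simulate(wide).max())
    ```

    This is the mechanism behind the paper's "smallest rectangular cluster" search: below a critical introduction size diffusion dilutes the density under the Allee threshold and the population goes extinct, so minimizing cost means finding the cheapest initial distribution that still clears that threshold.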

  8. Construction schedules slack time minimizing

    Science.gov (United States)

    Krzemiński, Michał

    2017-07-01

    The article presents two original models for minimizing the downtime of work brigades. The models have been developed for construction schedules executed with the uniform work method. Application of flow shop models is possible and useful for the implementation of large projects that can be divided into plots. The article also presents a condition determining which model should be used, as well as a brief example of schedule optimization. The optimization results confirm the legitimacy of the work on the newly developed models.
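
    For the two-machine flow shop case, the classical benchmark for makespan minimization is Johnson's rule (a standard result, not one of the article's models). A minimal sketch:

    ```python
    def johnson_two_machine(jobs):
        """Johnson's rule for a two-machine flow shop: sequence jobs to
        minimize makespan. `jobs` is a list of (time_machine1, time_machine2).
        Jobs whose shorter time is on machine 1 go first (ascending); the
        rest go last (descending by their machine-2 time)."""
        front, back = [], []
        for idx, (t1, t2) in sorted(enumerate(jobs), key=lambda p: min(p[1])):
            (front if t1 <= t2 else back).append(idx)
        order = front + back[::-1]

        # Compute the makespan of the resulting permutation schedule.
        end1 = end2 = 0
        for idx in order:
            t1, t2 = jobs[idx]
            end1 += t1
            end2 = max(end2, end1) + t2
        return order, end2

    jobs = [(3, 2), (1, 4), (5, 1), (2, 3)]  # illustrative processing times
    order, makespan = johnson_two_machine(jobs)
    print(order, makespan)
    ```

    With work brigades in the roles of machines and plots in the roles of jobs, the same permutation-schedule logic underlies flow shop formulations of uniform construction work, though minimizing brigade idle time rather than makespan generally needs a different objective, as in the article.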

  9. Nonsurgical, image-guided, minimally invasive therapy for thyroid nodules

    DEFF Research Database (Denmark)

    Gharib, Hossein; Hegedüs, Laszlo; Pacella, Claudio Maurizio

    2013-01-01

    Context: Nodular thyroid disease is very common. Most nodules are asymptomatic, are benign by fine-needle aspiration, remain stable, and can be followed by observation alone in the majority of the patients. Occasionally, nodules grow or cause symptoms requiring treatment. So far, surgery has been... our main option for treatment. Objective: In this review, we discuss nonsurgical, minimally invasive approaches for small thyroid masses, including indications, efficacy, side effects, and costs. Evidence Acquisition: We selected recent publications related to minimally invasive thyroid techniques... evaluation. These techniques have also been applied to recurrent locoregional cervical thyroid cancer with encouraging initial results, although data are still limited. Conclusions: Surgery and radioiodine remain as conventional and established treatments for nodular goiters. However, the new image...

  10. Optimisation of test and maintenance based on probabilistic methods

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper presents a method which, based on the models and results of probabilistic safety assessment, minimises nuclear power plant risk by optimising the arrangement of safety equipment outages. The test and maintenance activities of the safety equipment are arranged in time, so the classical static fault tree models are extended with time requirements to be capable of modelling real plant states. A house event matrix is used, which enables modelling of the equipment arrangements through discrete points in time. The result of the method is the determination of the configuration of equipment outages that yields the minimal risk, where risk is represented by system unavailability. (authors)
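
    The link between outage configuration and risk can be illustrated with the standard rare-event approximation, where system unavailability is the sum over minimal cut sets of the products of basic-event unavailabilities. The cut sets and numbers below are invented for illustration:

    ```python
    # Hypothetical two-train safety system: it fails only when both trains are
    # down, giving four minimal cut sets.
    basic_q = {"pumpA": 1e-3, "valveA": 5e-4, "pumpB": 1e-3, "valveB": 5e-4}
    cut_sets = [{"pumpA", "pumpB"}, {"pumpA", "valveB"},
                {"valveA", "pumpB"}, {"valveA", "valveB"}]

    def unavailability(q, cuts):
        """Rare-event approximation: Q_sys ~ sum over minimal cut sets of the
        product of the basic-event unavailabilities."""
        total = 0.0
        for cs in cuts:
            p = 1.0
            for event in cs:
                p *= q[event]
            total += p
        return total

    base = unavailability(basic_q, cut_sets)
    # A test outage takes pumpA out of service (q = 1): risk during the test
    # rises sharply, so outages on the redundant train must not overlap.
    during_test = unavailability({**basic_q, "pumpA": 1.0}, cut_sets)
    print(base, during_test)
    ```

    Sweeping such a calculation over discrete time points, with house events switching components in and out of service, is in spirit what the paper's house event matrix does when searching for the outage arrangement with minimal time-averaged unavailability.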

  11. Randomization in clinical trials: stratification or minimization? The HERMES free simulation software.

    Science.gov (United States)

    Fron Chabouis, Hélène; Chabouis, Francis; Gillaizeau, Florence; Durieux, Pierre; Chatellier, Gilles; Ruse, N Dorin; Attal, Jean-Pierre

    2014-01-01

    Operative clinical trials are often small and open-label. Randomization is therefore very important. Stratification and minimization are two randomization options in such trials. The first aim of this study was to compare stratification and minimization in terms of predictability and balance in order to help investigators choose the most appropriate allocation method. Our second aim was to evaluate the influence of various parameters on the performance of these techniques. The created software generated patients according to chosen trial parameters (e.g., number of important prognostic factors, number of operators or centers, etc.) and computed predictability and balance indicators for several stratification and minimization methods over a given number of simulations. Block size and proportion of random allocations could be chosen. A reference trial was chosen (50 patients, 1 prognostic factor, and 2 operators) and eight other trials derived from this reference trial were modeled. Predictability and balance indicators were calculated from 10,000 simulations per trial. Minimization performed better with complex trials (e.g., smaller sample size, increasing number of prognostic factors, and operators); stratification imbalance increased when the number of strata increased. An inverse correlation between imbalance and predictability was observed. A compromise between predictability and imbalance still has to be found by the investigator but our software (HERMES) gives concrete reasons for choosing between stratification and minimization; it can be downloaded free of charge. This software will help investigators choose the appropriate randomization method in future two-arm trials.
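
    A covariate-adaptive minimization scheme in the Pocock-Simon spirit can be sketched as follows. This is an illustrative simplification (invented factor names, two arms, range-based imbalance score), not the HERMES implementation:

    ```python
    import random

    def minimization_assign(patient, history, factors, p_follow=0.8, rng=random):
        """Biased-coin minimization sketch: for each arm, compute the marginal
        imbalance that allocating this patient there would create over the
        patient's factor levels; follow the imbalance-minimizing arm with
        probability `p_follow`. `history` is a list of (arm, patient) pairs."""
        arms = ("A", "B")
        imbalance = {}
        for arm in arms:
            total = 0
            for f in factors:
                counts = {a: 0 for a in arms}
                for prev_arm, prev in history:
                    if prev[f] == patient[f]:
                        counts[prev_arm] += 1
                counts[arm] += 1                 # hypothetical new allocation
                total += max(counts.values()) - min(counts.values())
            imbalance[arm] = total
        if imbalance["A"] == imbalance["B"]:
            return rng.choice(arms)
        best = min(arms, key=imbalance.get)
        other = "B" if best == "A" else "A"
        return best if rng.random() < p_follow else other

    rng = random.Random(42)
    factors = ["sex", "operator"]
    history = []
    for _ in range(50):
        pat = {"sex": rng.choice("FM"), "operator": rng.choice(["op1", "op2"])}
        history.append((minimization_assign(pat, history, factors, rng=rng), pat))

    n_A = sum(1 for arm, _ in history if arm == "A")
    print(n_A, 50 - n_A)
    ```

    The `p_follow` parameter is the predictability/balance dial the study investigates: p_follow = 1 gives deterministic, highly predictable allocation, while lower values trade some balance for unpredictability.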

  12. Minimal and non-minimal standard models: Universality of radiative corrections

    International Nuclear Information System (INIS)

    Passarino, G.

    1991-01-01

    The possibility of describing electroweak processes by means of models with a non-minimal Higgs sector is analyzed. The renormalization procedure which leads to a set of fitting equations for the bare parameters of the lagrangian is first reviewed for the minimal standard model. A solution of the fitting equations is obtained, which correctly includes large higher-order corrections. Predictions for physical observables, notably the W boson mass and the Z0 partial widths, are discussed in detail. Finally the extension to non-minimal models is described under the assumption that new physics will appear only inside the vector boson self-energies, and the concept of universality of radiative corrections is introduced, showing that to a large extent they are insensitive to the details of the enlarged Higgs sector. Consequences for the bounds on the top quark mass are also discussed. (orig.)

  13. Quality assurance requirements and methods for high level waste package acceptability

    International Nuclear Information System (INIS)

    1992-12-01

    This document should serve as guidance for identifying the items necessary to control the conditioning process in such a way that waste packages are produced in compliance with the waste acceptance requirements. It is also provided to promote the exchange of information on quality assurance requirements and on the application of quality assurance methods associated with the production of high level waste packages, to ensure that these waste packages comply with the requirements for transportation, interim storage and waste disposal in deep geological formations. The document is intended to assist the operators of conditioning facilities and repositories, as well as the national authorities and regulatory bodies involved in the licensing of the conditioning of high level radioactive wastes or in the development of deep underground disposal systems. The document recommends the quality assurance requirements and methods which are necessary to generate data for those parameters identified in IAEA-TECDOC-560 on qualitative acceptance criteria, and indicates where and when the control methods can be applied, e.g. in the operation or commissioning of a process or in the development of a waste package design. Emphasis is on the control of the process and little reliance is placed on non-destructive or destructive testing. Qualitative criteria, relevant to disposal of high level waste, are repository dependent and are not addressed here. 37 refs, 3 figs, 2 tabs

  14. Chiral lattice fermions, minimal doubling, and the axial anomaly

    International Nuclear Information System (INIS)

    Tiburzi, B. C.

    2010-01-01

    Exact chiral symmetry at finite lattice spacing would preclude the axial anomaly. In order to describe a continuum quantum field theory of Dirac fermions, lattice actions with purported exact chiral symmetry must break the flavor-singlet axial symmetry. We demonstrate that this is indeed the case by using a minimally doubled fermion action. For simplicity, we consider the Abelian axial anomaly in two dimensions. At finite lattice spacing and with gauge interactions, the axial anomaly arises from nonconservation of the flavor-singlet current. Similar nonconservation also leads to the axial anomaly in the case of the naive lattice action. For minimally doubled actions, however, fine-tuning of the action and axial current is necessary to arrive at the anomaly. Conservation of the flavor nonsinglet vector current additionally requires the current to be fine-tuned. Finally, we determine that the chiral projection of a minimally doubled fermion action can be used to arrive at a lattice theory with an undoubled Dirac fermion possessing the correct anomaly in the continuum limit.

  15. Good Practice Guide Waste Minimization/Pollution Prevention; TOPICAL

    International Nuclear Information System (INIS)

    J Dorsey

    1999-01-01

    This Good Practice Guide provides tools, information, and examples for promoting the implementation of pollution prevention during the design phases of U.S. Department of Energy (DOE) projects. It is one of several Guides for implementing DOE Order 430.1, Life-cycle Asset Management. DOE Order 430.1 provides requirements for DOE, in partnership with its contractors, to plan, acquire, operate, maintain, and dispose of physical assets. The goals of designing for pollution prevention are to minimize raw material consumption, energy consumption, waste generation, health and safety impacts, and ecological degradation over the entire life of the facility (EPA 1993a). Users of this Guide will learn to translate national policy and regulatory requirements for pollution prevention into action at the project level. The Guide was written to be applicable to all DOE projects, regardless of project size or design phase. Users are expected to interpret the Guide for their individual project's circumstances, applying a graded approach so that the effort is consistent with the anticipated waste generation and resource consumption of the physical asset. This Guide employs a combination of pollution prevention opportunity assessment (PPOA) methods and design for environment (DfE) philosophies. The PPOA process was primarily developed for existing products, processes, and facilities. The PPOA process has been modified in this Guide to address the circumstances of the DOE design process as delineated in DOE Order 430.1 and its associated Good Practice Guides. This modified form of the PPOA is termed the Pollution Prevention Design Assessment (P2DA). Information on current nationwide methods and successes in designing for the environment has also been reviewed and is integrated into this guidance.

  16. Asymptotically safe non-minimal inflation

    Energy Technology Data Exchange (ETDEWEB)

    Tronconi, Alessandro, E-mail: Alessandro.Tronconi@bo.infn.it [Dipartimento di Fisica e Astronomia and INFN, Via Irnerio 46,40126 Bologna (Italy)

    2017-07-01

    We study the constraints imposed by the requirement of Asymptotic Safety on a class of inflationary models with an inflaton field non-minimally coupled to the Ricci scalar. The critical surface in the space of theories is determined by the improved renormalization group flow which takes into account quantum corrections beyond the one loop approximation. The combination of constraints deriving from Planck observations and those from theory puts severe bounds on the values of the parameters of the model and predicts a quite large tensor-to-scalar ratio. We finally comment on the dependence of the results on the definition of the infrared energy scale which parametrises the running on the critical surface.

  17. Finding a minimally informative Dirichlet prior distribution using least squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in the form of a standard distribution (e.g., beta, gamma), and so a beta distribution is used as an approximation in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.
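
As the abstract notes, the Dirichlet distribution is conjugate to the multinomial likelihood, so Bayesian updating reduces to adding observed counts to the prior parameters. The sketch below uses made-up alpha-factor numbers, not the paper's least-squares construction, to show why a low total concentration (the sum of the Dirichlet parameters) keeps the prior responsive to sparse data.

```python
# Conjugate Dirichlet-multinomial update: posterior alpha_i = prior alpha_i + n_i.
# Illustrative example with hypothetical numbers (not taken from the paper).

def dirichlet_update(prior_alpha, counts):
    """Posterior Dirichlet parameters after observing multinomial counts."""
    return [a + n for a, n in zip(prior_alpha, counts)]

def dirichlet_mean(alpha):
    """Mean of a Dirichlet distribution: alpha_i / sum(alpha)."""
    s = sum(alpha)
    return [a / s for a in alpha]

# A low total concentration makes the prior "minimally informative":
# even a handful of observations dominates the posterior mean.
prior = [0.5, 0.3, 0.2]   # hypothetical prior; concentration = 1.0
data  = [8, 1, 1]         # sparse common-cause failure counts (made up)
post  = dirichlet_update(prior, data)
print(dirichlet_mean(post))   # posterior mean pulled strongly toward the data
```

Had the prior concentration been, say, 100, these ten observations would barely move the posterior mean, which is exactly the behavior a minimally informative prior is designed to avoid.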

  18. Finding a Minimally Informative Dirichlet Prior Distribution Using Least Squares

    International Nuclear Information System (INIS)

    Kelly, Dana; Atwood, Corwin

    2011-01-01

    In a Bayesian framework, the Dirichlet distribution is the conjugate distribution to the multinomial likelihood function, and so the analyst is required to develop a Dirichlet prior that incorporates available information. However, as it is a multiparameter distribution, choosing the Dirichlet parameters is less straightforward than choosing a prior distribution for a single parameter, such as p in the binomial distribution. In particular, one may wish to incorporate limited information into the prior, resulting in a minimally informative prior distribution that is responsive to updates with sparse data. In the case of binomial p or Poisson λ, the principle of maximum entropy can be employed to obtain a so-called constrained noninformative prior. However, even in the case of p, such a distribution cannot be written down in closed form, and so an approximate beta distribution is used in the case of p. In the case of the multinomial model with parametric constraints, the approach of maximum entropy does not appear tractable. This paper presents an alternative approach, based on constrained minimization of a least-squares objective function, which leads to a minimally informative Dirichlet prior distribution. The alpha-factor model for common-cause failure, which is widely used in the United States, is the motivation for this approach, and is used to illustrate the method. In this approach to modeling common-cause failure, the alpha-factors, which are the parameters in the underlying multinomial aleatory model for common-cause failure, must be estimated from data that are often quite sparse, because common-cause failures tend to be rare, especially failures of more than two or three components, and so a prior distribution that is responsive to updates with sparse data is needed.

  19. A two-dimensional discontinuous heterogeneous finite element method for neutron transport calculations

    International Nuclear Information System (INIS)

    Masiello, E.; Sanchez, R.

    2007-01-01

    A discontinuous heterogeneous finite element method is presented and discussed. The method is intended for realistic numerical pin-by-pin lattice calculations when an exact representation of the geometric shape of the pins is made without need for homogenization. The method keeps the advantages of conventional discrete ordinate methods, such as fast execution together with the possibility to deal with a large number of spatial meshes, while minimizing the need for geometric modeling. It also provides a complete factorization in space, angle, and energy for the discretized matrices and thus minimizes storage requirements. An angular multigrid acceleration technique has also been developed to speed up the rate of convergence of the inner iterations. A particular aspect of this acceleration is the introduction of boundary restriction and prolongation operators that minimize oscillatory behavior and enhance positivity. Numerical tests are presented that show the high precision of the method and the efficiency of the angular multigrid acceleration. (authors)

  20. Fusion algebras of logarithmic minimal models

    International Nuclear Information System (INIS)

    Rasmussen, Joergen; Pearce, Paul A

    2007-01-01

    We present explicit conjectures for the chiral fusion algebras of the logarithmic minimal models LM(p,p') considering Virasoro representations with no enlarged or extended symmetry algebra. The generators of fusion are countably infinite in number but the ensuing fusion rules are quasi-rational in the sense that the fusion of a finite number of representations decomposes into a finite direct sum of representations. The fusion rules are commutative, associative and exhibit an sl(2) structure but require so-called Kac representations which are typically reducible yet indecomposable representations of rank 1. In particular, for p ≠ 1, the identity of the fundamental fusion algebra is a reducible yet indecomposable Kac representation of rank 1. We make detailed comparisons of our fusion rules with the results of Gaberdiel and Kausch for p = 1 and with Eberle and Flohr for (p, p') = (2, 5) corresponding to the logarithmic Yang-Lee model. In the latter case, we confirm the appearance of indecomposable representations of rank 3. We also find that closure of a fundamental fusion algebra is achieved without the introduction of indecomposable representations of rank higher than 3. The conjectured fusion rules are supported, within our lattice approach, by extensive numerical studies of the associated integrable lattice models. Details of our lattice findings and numerical results will be presented elsewhere. The agreement of our fusion rules with the previous fusion rules lends considerable support for the identification of the logarithmic minimal models LM(p,p') with the augmented c_{p,p'} (minimal) models defined algebraically.

  1. Absolutely minimal extensions of functions on metric spaces

    International Nuclear Information System (INIS)

    Milman, V A

    1999-01-01

    Extensions of a real-valued function from the boundary ∂X_0 of an open subset X_0 of a metric space (X,d) to X_0 are discussed. For the broad class of initial data coming under discussion (linearly bounded functions) locally Lipschitz extensions to X_0 that preserve localized moduli of continuity are constructed. In the set of these extensions an absolutely minimal extension is selected, which was considered before by Aronsson for Lipschitz initial functions in the case X_0 ⊂ R^n. An absolutely minimal extension can be regarded as an ∞-harmonic function, that is, a limit of p-harmonic functions as p → +∞. The proof of the existence of absolutely minimal extensions in a metric space with intrinsic metric is carried out by the Perron method. To this end, ∞-subharmonic, ∞-superharmonic, and ∞-harmonic functions on a metric space are defined and their properties are established.
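
In the Euclidean case X_0 ⊂ R^n mentioned in the abstract, the ∞-harmonic condition is conventionally expressed as the formal p → +∞ limit of the p-Laplace equation. The notation below is the standard one, supplied here for orientation rather than taken from the paper:

```latex
\Delta_p u = \operatorname{div}\!\left(|\nabla u|^{p-2}\,\nabla u\right) = 0
\quad\xrightarrow[\;p\to+\infty\;]{}\quad
\Delta_\infty u = \sum_{i,j=1}^{n}
  \frac{\partial u}{\partial x_i}\,\frac{\partial u}{\partial x_j}\,
  \frac{\partial^2 u}{\partial x_i\,\partial x_j} = 0 .
```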

  2. An extension of command shaping methods for controlling residual vibration using frequency sampling

    Science.gov (United States)

    Singer, Neil C.; Seering, Warren P.

    1992-01-01

    The authors present an extension to the impulse shaping technique for commanding machines to move with reduced residual vibration. The extension, called frequency sampling, is a method for generating constraints that are used to obtain shaping sequences which minimize residual vibration in systems such as robots whose resonant frequencies change during motion. The authors present a review of impulse shaping methods, a development of the proposed extension, and a comparison of results of tests conducted on a simple model of the space shuttle robot arm. Frequency sampling provides a method for minimizing the impulse sequence duration required to give the desired insensitivity.
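
For context, the classical two-impulse zero-vibration (ZV) shaper that such methods extend can be sketched as follows. This is the standard textbook construction for a single fixed mode, not the paper's frequency-sampling method:

```python
import math

def zv_shaper(freq_hz, zeta):
    """Two-impulse zero-vibration (ZV) shaper for a mode at freq_hz (Hz)
    with damping ratio zeta. Standard construction: the second impulse,
    placed half a damped period later, cancels the residual vibration
    excited by the first."""
    wd = 2 * math.pi * freq_hz * math.sqrt(1 - zeta**2)   # damped frequency (rad/s)
    K = math.exp(-zeta * math.pi / math.sqrt(1 - zeta**2))
    amps  = [1 / (1 + K), K / (1 + K)]   # impulse amplitudes (sum to 1)
    times = [0.0, math.pi / wd]          # impulse times (s)
    return amps, times
```

A shaper like this is tuned to one resonant frequency; the abstract's point is that when the frequency changes during motion, constraints sampled across a frequency range are needed instead.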

  3. Surface Reconstruction and Image Enhancement via $L^1$-Minimization

    KAUST Repository

    Dobrev, Veselin; Guermond, Jean-Luc; Popov, Bojan

    2010-01-01

    A surface reconstruction technique based on minimization of the total variation of the gradient is introduced. Convergence of the method is established, and an interior-point algorithm solving the associated linear programming problem is introduced.

  4. An individual urinary proteome analysis in normal human beings to define the minimal sample number to represent the normal urinary proteome

    Directory of Open Access Journals (Sweden)

    Liu Xuejiao

    2012-11-01

    Background: The urinary proteome has been widely used for biomarker discovery. A urinary proteome database from normal humans can provide a background for discovery proteomics and candidate proteins/peptides for targeted proteomics. Therefore, it is necessary to define the minimum number of individuals required for sampling to represent the normal urinary proteome. Methods: In this study, inter-individual and inter-gender variations of the urinary proteome were taken into consideration to achieve a representative database. An individual analysis was performed on overnight urine samples from 20 normal volunteers (10 males and 10 females) by 1DLC/MS/MS. To obtain a representative result for each sample, a replicate 1DLC/MS/MS analysis was performed. The minimal sample number was estimated by statistical analysis. Results: For qualitative analysis, less than 5% of new proteins/peptides were identified in a male/female normal group by adding a new sample when the sample number exceeded nine. In addition, in a normal group, the percentage of newly identified proteins/peptides was less than 5% upon adding a new sample when the sample number reached 10. Furthermore, a statistical analysis indicated that urinary proteomes from normal males and females showed different patterns. For quantitative analysis, the variation of protein abundance was assessed by spectrum counting and western blotting, and the minimal sample number for quantitative proteomic analysis was then identified. Conclusions: For qualitative analysis, when considering the inter-individual and inter-gender variations, the minimum sample number is 10, with a balanced number of males and females, in order to obtain a representative normal human urinary proteome. For quantitative analysis, the minimal sample number is much greater than that for qualitative analysis and depends on the experimental methods used for quantification.
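
The saturation criterion the study uses (fewer than 5% new identifications when another sample is added) can be sketched as a simple set-union computation. The protein IDs below are invented toy data, not from the study:

```python
def new_identification_fractions(samples):
    """Fraction of each sample's identifications that are new relative to
    all previously accumulated samples.
    samples: list of sets of protein/peptide IDs, one set per individual."""
    seen, fractions = set(), []
    for s in samples:
        new = s - seen
        fractions.append(len(new) / len(s) if s else 0.0)
        seen |= s
    return fractions

# Toy data: identifications saturate by the third sample.
samples = [{"P1", "P2", "P3", "P4"},
           {"P1", "P2", "P3", "P5"},
           {"P1", "P2", "P4", "P5"}]
print(new_identification_fractions(samples))  # [1.0, 0.25, 0.0]
```

In the study's terms, the minimal sample number is the point at which this fraction stays below 0.05 as further samples are added.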

  5. Discrete Curvatures and Discrete Minimal Surfaces

    KAUST Repository

    Sun, Xiang

    2012-06-01

    This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads to great interest in studying discrete surfaces. With the rich smooth surface theory in hand, one would hope that this elegant theory can still be applied to the discrete counterpart. Such a generalization, however, is not always successful. While discrete surfaces have the advantage of being finite dimensional, thus easier to treat, their geometric properties such as curvatures are not well defined in the classical sense. Furthermore, the powerful tools of calculus can hardly be applied. The methods in this thesis, including the angular defect formula, the cotangent formula, parallel meshes, relative geometry, etc., are approaches based on offset meshes or generalized offset meshes. As an important application, we discuss discrete minimal surfaces and discrete Koenigs meshes.
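
Of the approaches listed, the angular defect formula is the simplest to sketch: at an interior vertex, discrete Gaussian curvature is measured by 2π minus the sum of the incident triangle angles. This is the standard construction; the closed-fan neighbor ordering is an assumption of this sketch:

```python
import math

def angle_at(v, a, b):
    """Angle at vertex v in triangle (v, a, b); points are (x, y, z) tuples."""
    u = [a[i] - v[i] for i in range(3)]
    w = [b[i] - v[i] for i in range(3)]
    dot = sum(ui * wi for ui, wi in zip(u, w))
    nu = math.sqrt(sum(ui * ui for ui in u))
    nw = math.sqrt(sum(wi * wi for wi in w))
    return math.acos(dot / (nu * nw))

def angular_defect(v, ring):
    """Angular defect at an interior vertex v: 2*pi minus the sum of the
    angles of the incident triangles. ring is an ordered, closed fan of
    neighbor vertices around v."""
    total = sum(angle_at(v, ring[i], ring[(i + 1) % len(ring)])
                for i in range(len(ring)))
    return 2 * math.pi - total
```

A flat neighborhood gives defect 0; a cone point gives a positive defect. Dividing the defect by an associated vertex area yields a pointwise Gaussian curvature estimate.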

  6. Combined shape and topology optimization for minimization of maximal von Mises stress

    DEFF Research Database (Denmark)

    Lian, Haojie; Christiansen, Asger Nyman; Tortorelli, Daniel A.

    2017-01-01

    This work shows that a combined shape and topology optimization method can produce optimal 2D designs with minimal stress subject to a volume constraint. The method represents the surface explicitly and discretizes the domain into a simplicial complex which adapts both structural shape and topology. By performing repeated topology and shape optimizations and adaptive mesh updates, we can minimize the maximum von Mises stress using the p-norm stress measure with p-values as high as 30, provided that the stress is calculated with sufficient accuracy.

  7. An Efficiency Improved Active Power Decoupling Circuit with Minimized Implementation Cost

    DEFF Research Database (Denmark)

    Tang, Yi; Blaabjerg, Frede

    2014-01-01

    The topology does not require additional passive components, e.g. inductors or film capacitors, for ripple energy storage, because this task can be accomplished by the dc-link capacitors themselves; its implementation cost can therefore be minimized. Another unique feature of the proposed topology

  8. Minimal Gromov-Witten rings

    International Nuclear Information System (INIS)

    Przyjalkowski, V V

    2008-01-01

    We construct an abstract theory of Gromov-Witten invariants of genus 0 for quantum minimal Fano varieties (a minimal class of varieties which is natural from the quantum cohomological viewpoint). Namely, we consider the minimal Gromov-Witten ring: a commutative algebra whose generators and relations are of the form used in the Gromov-Witten theory of Fano varieties (of unspecified dimension). The Gromov-Witten theory of any quantum minimal variety is a homomorphism from this ring to C. We prove an abstract reconstruction theorem which says that this ring is isomorphic to the free commutative ring generated by 'prime two-pointed invariants'. We also find solutions of the differential equation of type D_N for a Fano variety of dimension N in terms of the generating series of one-pointed Gromov-Witten invariants.

  9. Minimal Marking: A Success Story

    Science.gov (United States)

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  10. Cell Adhesion Minimization by a Novel Mesh Culture Method Mechanically Directs Trophoblast Differentiation and Self-Assembly Organization of Human Pluripotent Stem Cells.

    Science.gov (United States)

    Okeyo, Kennedy Omondi; Kurosawa, Osamu; Yamazaki, Satoshi; Oana, Hidehiro; Kotera, Hidetoshi; Nakauchi, Hiromitsu; Washizu, Masao

    2015-10-01

    Mechanical methods for inducing differentiation and directing lineage specification will be instrumental in the application of pluripotent stem cells. Here, we demonstrate that minimization of cell-substrate adhesion can initiate and direct the differentiation of human pluripotent stem cells (hiPSCs) into cyst-forming trophoblast lineage cells (TLCs) without stimulation with cytokines or small molecules. To precisely control cell-substrate adhesion area, we developed a novel culture method where cells are cultured on microstructured mesh sheets suspended in a culture medium such that cells on mesh are completely out of contact with the culture dish. We used microfabricated mesh sheets that consisted of open meshes (100∼200 μm in pitch) with narrow mesh strands (3-5 μm in width) to provide support for initial cell attachment and growth. We demonstrate that minimization of cell adhesion area achieved by this culture method can trigger a sequence of morphogenetic transformations that begin with individual hiPSCs attached on the mesh strands proliferating to form cell sheets by self-assembly organization and ultimately differentiating after 10-15 days of mesh culture to generate spherical cysts that secreted human chorionic gonadotropin (hCG) hormone and expressed caudal-related homeobox 2 factor (CDX2), a specific marker of trophoblast lineage. Thus, this study demonstrates a simple and direct mechanical approach to induce trophoblast differentiation and generate cysts for application in the study of early human embryogenesis and drug development and screening.

  11. Minimizing student’s faults in determining the design of experiment through inquiry-based learning

    Science.gov (United States)

    Nilakusmawati, D. P. E.; Susilawati, M.

    2017-10-01

    The purposes of this study were to describe the use of the inquiry method in an effort to minimize students' faults in designing an experiment, and to determine the effectiveness of the inquiry method in minimizing students' faults in designing experiments in the experimental design course. This research is participatory action research with an action research design. The data source was fifth-semester students who took the experimental design course at the Mathematics Department, Faculty of Mathematics and Natural Sciences, Udayana University. Data were collected through tests, interviews, and observations. The hypothesis was tested by t-test. The results showed that implementing the inquiry method to minimize students' faults in designing experiments, analyzing experimental data, and interpreting them reduced faults by an average of 10.5% in Cycle 1. In Cycle 2, students reduced faults by an average of 8.78%. Based on the t-test results, it can be concluded that the inquiry method is effective in minimizing students' faults in designing experiments, analyzing experimental data, and interpreting them. The nature of the teaching materials in the Experimental Design course, which demand that students think systematically, logically, and critically when analyzing data and interpreting test cases, makes inquiry an appropriate method. In addition, the learning tools, in this case the teaching materials and the student worksheets, are among the factors that make the inquiry method effective in minimizing students' faults when designing experiments.

  12. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics.

    Science.gov (United States)

    Szostak, Katarzyna M; Grand, Laszlo; Constandinou, Timothy G

    2017-01-01

    Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to foreign body response to the implant, which is initiated by the surgical procedure and related to the probe structure and the material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording, describe the three different types of probes (microwire, micromachined, and polymer-based), including their materials and fabrication methods, and discuss their characteristics and related challenges.

  13. Estimation methods of eco-environmental water requirements: Case study

    Institute of Scientific and Technical Information of China (English)

    YANG Zhifeng; CUI Baoshan; LIU Jingling

    2005-01-01

    Supplying water to the ecological environment in sufficient quantity and of sufficient quality is significant for the protection of diversity and the realization of sustainable development. The conception and connotation of eco-environmental water requirements, including the definition of the conception, the composition, and the characteristics of eco-environmental water requirements, are evaluated in this paper. The classification and estimation methods of eco-environmental water requirements are then proposed. On the basis of the study on the Huang-Huai-Hai Area, the present water use and the minimum and suitable water requirements are estimated, and the corresponding water shortage is also calculated. According to the interrelated programs, the eco-environmental water requirements in the coming years (2010, 2030, 2050) are estimated. The result indicates that the minimum and suitable eco-environmental water requirements fluctuate with the differences of function setting and the referential standard of water resources, and so does the water shortage. Moreover, the study indicates that the minimum eco-environmental water requirement of the study area ranges from 2.84×10^10 m^3 to 1.02×10^11 m^3, the suitable water requirement ranges from 6.45×10^10 m^3 to 1.78×10^11 m^3, the water shortage ranges from 9.1×10^9 m^3 to 2.16×10^10 m^3 under the minimum water requirement, and from 3.07×10^10 m^3 to 7.53×10^10 m^3 under the suitable water requirement. According to the different values of the water shortage, the water priority can be allocated. The ranges of the eco-environmental water requirements in the three coming years (2010, 2030, 2050) are 4.49×10^10 m^3 to 1.73×10^11 m^3, 5.99×10^10 m^3 to 2.09×10^11 m^3, and 7.44×10^10 m^3 to 2.52×10^11 m^3, respectively.

  14. GPU-based RFA simulation for minimally invasive cancer treatment of liver tumours

    NARCIS (Netherlands)

    Mariappan, P.; Weir, P.; Flanagan, R.; Voglreiter, P.; Alhonnoro, T.; Pollari, M.; Moche, M.; Busse, H.; Futterer, J.J.; Portugaller, H.R.; Sequeiros, R.B.; Kolesnik, M.

    2017-01-01

    PURPOSE: Radiofrequency ablation (RFA) is one of the most popular and well-standardized minimally invasive cancer treatments (MICT) for liver tumours, employed where surgical resection has been contraindicated. Less-experienced interventional radiologists (IRs) require an appropriate planning tool

  15. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices impedes improvements in their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditional on the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and explains the emergence of the entropy state function in macroscopic thermodynamics.
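
Schematically, the abstract's central claim can be rendered in the familiar Landauer form (our informal notation; the paper's precise one-shot statement involves more refined, fluctuation-aware entropy measures):

```latex
W_{\min} \;\approx\; k_{\mathrm B} T \ln 2 \;\cdot\; H\big(\text{discarded information} \,\big|\, \text{output}\big) .
```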

  16. DQM: Decentralized Quadratically Approximated Alternating Direction Method of Multipliers

    Science.gov (United States)

    Mokhtari, Aryan; Shi, Wei; Ling, Qing; Ribeiro, Alejandro

    2016-10-01

    This paper considers decentralized consensus optimization problems where nodes of a network have access to different summands of a global objective function. Nodes cooperate to minimize the global objective by exchanging information with neighbors only. A decentralized version of the alternating direction method of multipliers (DADMM) is a common method for solving this category of problems. DADMM exhibits linear convergence rate to the optimal objective but its implementation requires solving a convex optimization problem at each iteration. This can be computationally costly and may result in large overall convergence times. The decentralized quadratically approximated ADMM algorithm (DQM), which minimizes a quadratic approximation of the objective function that DADMM minimizes at each iteration, is proposed here. The consequent reduction in computational time is shown to have minimal effect on convergence properties. Convergence still proceeds at a linear rate with a guaranteed constant that is asymptotically equivalent to the DADMM linear convergence rate constant. Numerical results demonstrate advantages of DQM relative to DADMM and other alternatives in a logistic regression problem.
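
The key idea, replacing an exact inner minimization with a single minimization of a quadratic model, can be sketched for one node's primal update. This is a schematic in our own notation, under our own assumptions about the subproblem form, not the authors' exact algorithm:

```python
import numpy as np

def exact_step(grad_f, x0, z, lam, rho, iters=200, lr=0.1):
    """DADMM-style update: (approximately) fully minimize
    f(x) + lam^T x + (rho/2)||x - z||^2 via an inner gradient-descent loop."""
    x = x0.copy()
    for _ in range(iters):
        x -= lr * (grad_f(x) + lam + rho * (x - z))
    return x

def quadratic_step(grad_f, hess_f, x0, z, lam, rho):
    """DQM-style update: minimize the second-order Taylor model of f at x0
    plus the same augmented-Lagrangian terms. Setting the gradient of the
    model to zero gives one linear solve instead of an inner loop:
        (H(x0) + rho*I) x = H(x0) x0 - grad_f(x0) - lam + rho*z"""
    H = hess_f(x0) + rho * np.eye(len(x0))
    rhs = hess_f(x0) @ x0 - grad_f(x0) - lam + rho * z
    return np.linalg.solve(H, rhs)
```

When f is itself quadratic the two updates coincide; otherwise the quadratic step trades a little per-iteration accuracy for a much cheaper update, which is the trade-off the abstract says has minimal effect on the linear convergence rate.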

  17. Waste minimization assessment procedure

    International Nuclear Information System (INIS)

    Kellythorne, L.L.

    1993-01-01

    Perry Nuclear Power Plant began developing a waste minimization plan early in 1991. In March of 1991 the plan was documented following a format similar to that described in the EPA Waste Minimization Opportunity Assessment Manual. Initial implementation involved obtaining management's commitment to support a waste minimization effort. The primary assessment goal was to identify all hazardous waste streams and to evaluate those streams for minimization opportunities. As implementation of the plan proceeded, non-hazardous waste streams routinely generated in large volumes were also evaluated for minimization opportunities. The next step included collection of process and facility data which would be useful in helping the facility accomplish its assessment goals. This paper describes the resources that were used and which were most valuable in identifying both the hazardous and non-hazardous waste streams that existed on site. For each material identified as a waste stream, additional information regarding the material's use, manufacturer, EPA hazardous waste number, and DOT hazard class was also gathered. Each waste stream was then evaluated for potential source reduction, recycling, reuse, resale, or burning for heat recovery, with disposal treated as the last viable alternative.

  18. Combined shape and topology optimization for minimization of maximal von Mises stress

    International Nuclear Information System (INIS)

    Lian, Haojie; Christiansen, Asger N.; Tortorelli, Daniel A.; Sigmund, Ole; Aage, Niels

    2017-01-01

    Here, this work shows that a combined shape and topology optimization method can produce optimal 2D designs with minimal stress subject to a volume constraint. The method represents the surface explicitly and discretizes the domain into a simplicial complex which adapts both structural shape and topology. By performing repeated topology and shape optimizations and adaptive mesh updates, we can minimize the maximum von Mises stress using the p-norm stress measure with p-values as high as 30, provided that the stress is calculated with sufficient accuracy.
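The p-norm stress aggregate mentioned above can be illustrated numerically: (Σᵢ σᵢᵖ)^(1/p) is a smooth upper bound on the maximum stress that tightens as p grows, which is what makes a max-stress objective differentiable and optimizable. The element stresses below are made-up sample values, not from the paper:

```python
import numpy as np

# Made-up element von Mises stresses; the p-norm is a differentiable
# stand-in for max(sigma) in the optimization objective.
sigma = np.array([120.0, 250.0, 240.0, 90.0, 180.0])

def p_norm_stress(sigma, p):
    return np.sum(sigma ** p) ** (1.0 / p)

for p in (2, 8, 30):
    print(p, p_norm_stress(sigma, p))

# The p-norm always bounds the true maximum from above and approaches
# it as p grows; at p = 30 the gap is already small.
print(sigma.max())
```

This also shows why large p-values demand accurate stress evaluation, as the abstract notes: at p = 30 the aggregate is dominated by the largest entries, so errors in those entries pass almost directly into the objective.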

  19. The Los Alamos National Laboratory Chemistry and Metallurgy Research Facility upgrades project - A model for waste minimization

    International Nuclear Information System (INIS)

    Burns, M.L.; Durrer, R.E.; Kennicott, M.A.

    1996-07-01

    The Los Alamos National Laboratory (LANL) Chemistry and Metallurgy Research (CMR) Facility, constructed in 1952, is currently undergoing a major, multi-year construction project. Many of the operations required under this project (i.e., design, demolition, decontamination, construction, and waste management) mimic the processes required of a large-scale decontamination and decommissioning (D&D) job and are identical to the requirements of any of several upgrades projects anticipated for LANL and other Department of Energy (DOE) sites. For these reasons the CMR Upgrades Project is seen as an ideal model facility in which to test the application, and measure the success, of waste minimization techniques that could be brought to bear on any of the similar projects. The purpose of this paper is to discuss the past, present, and anticipated waste minimization applications at the facility, focusing on the development and execution of the project's "Waste Minimization/Pollution Prevention Strategic Plan."

  20. Current status of pediatric minimal access surgery at Sultan Qaboos ...

    African Journals Online (AJOL)

    Keywords: current status, laparoscopy, minimal access surgery, thoracoscopy. Departments of ... Materials and methods ... procedures, the open technique was used for the creation ... operated for bilateral inguinal herniotomy had recurrence.

  1. Transformation of general binary MRF minimization to the first-order case.

    Science.gov (United States)

    Ishikawa, Hiroshi

    2011-06-01

    We introduce a transformation of a general higher-order Markov random field with binary labels into a first-order one that has the same minima as the original. Moreover, we formalize a framework for approximately minimizing higher-order multi-label MRF energies that combines the new reduction with the fusion-move and QPBO algorithms. While many computer vision problems today are formulated as energy minimization problems, they have mostly been limited to using first-order energies, which consist of unary and pairwise clique potentials, with a few exceptions that consider triples. This is because of the lack of efficient algorithms to optimize energies with higher-order interactions. Our algorithm challenges this restriction, which limits the representational power of the models, so that higher-order energies can be used to capture the rich statistics of natural scenes. We also show that some existing minimization methods can be considered special cases of the present framework, and we compare the new method experimentally with other such techniques.
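A minimal sketch of the reduction idea, using the classic identity for a negative-coefficient cubic term (the Freedman–Drineas construction that this kind of transformation generalizes; the coefficient value is arbitrary). A third-order binary term becomes a minimum, over one auxiliary binary variable, of an energy with only unary and pairwise interactions:

```python
from itertools import product

# For a < 0 and binary x1, x2, x3, the identity
#   a*x1*x2*x3 == min over w in {0,1} of a*w*(x1 + x2 + x3 - 2)
# replaces a third-order clique with unary/pairwise terms in the x_i and
# one auxiliary variable w. (Illustrative special case, not the full
# multi-label framework of the paper.)
a = -5.0

def higher_order(x1, x2, x3):
    return a * x1 * x2 * x3

def reduced(x1, x2, x3):
    return min(a * w * (x1 + x2 + x3 - 2) for w in (0, 1))

ok = all(higher_order(*x) == reduced(*x) for x in product((0, 1), repeat=3))
print(ok)  # True: both energies agree on every binary labeling
```

Because minimizing over the auxiliary variable recovers the original term exactly, the first-order energy has the same minima as the higher-order one, which is the property the abstract claims.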

  2. On a minimization of the eigenvalues of Schroedinger operator relatively domains

    International Nuclear Information System (INIS)

    Gasymov, Yu.S.; Niftiev, A.A.

    2001-01-01

    Minimization of eigenvalues plays an important role in operator spectral theory. This work considers the problem of minimizing the eigenvalues of the Schroedinger operator with respect to the domain. An algorithm analogous to the conditional gradient method is proposed for the numerical solution of this problem in the general case. The result is generalized to the case of a positive definite, completely continuous operator.

  3. Minimal but non-minimal inflation and electroweak symmetry breaking

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)]

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, which plays the role of the inflaton and is compatible with current experimental bounds owing to its non-minimal coupling to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10^-3, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97, which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  4. Cost-effectiveness analysis in minimally invasive spine surgery.

    Science.gov (United States)

    Al-Khouja, Lutfi T; Baron, Eli M; Johnson, J Patrick; Kim, Terrence T; Drazin, Doniel

    2014-06-01

    Medical care has been evolving with the increased influence of a value-based health care system. As a result, more emphasis is being placed on ensuring cost-effectiveness and utility in the services provided to patients. This study looks at this development in respect to minimally invasive spine surgery (MISS) costs. A literature review using PubMed, the Cost-Effectiveness Analysis (CEA) Registry, and the National Health Service Economic Evaluation Database (NHS EED) was performed. Papers were included in the study if they reported costs associated with minimally invasive spine surgery (MISS); articles with no mention of cost, CEA, cost-utility analysis (CUA), quality-adjusted life years (QALY), quality, or outcomes were excluded. Fourteen studies reporting costs associated with MISS in 12,425 patients (3675 undergoing minimally invasive procedures and 8750 undergoing open procedures) were identified through PubMed, the CEA Registry, and NHS EED. The percent cost difference between minimally invasive and open approaches ranged from 2.54% to 33.68%, all indicating cost savings with a minimally invasive surgical approach. Average length of stay (LOS) for minimally invasive surgery ranged from 0.93 days to 5.1 days, compared with 1.53 days to 12 days for an open approach. All studies reporting estimated blood loss (EBL) reported lower volume loss with an MISS approach (range 10-392.5 ml) than with an open approach (range 55-535.5 ml). There are currently an insufficient number of published studies reporting the costs of MISS, and none of the published studies have followed a standardized method of reporting and analyzing cost data. Preliminary findings from the 14 studies showed both cost savings and better outcomes with MISS compared with an open approach. However, more Level I CEA/CUA studies, including cost/QALY evaluations with specifics of the techniques utilized, need to be reported in a standardized manner to allow more accurate conclusions on the cost-effectiveness of MISS.

  5. Offset Risk Minimization for Open-loop Optimal Control of Oil Reservoirs

    DEFF Research Database (Denmark)

    Capolei, Andrea; Christiansen, Lasse Hjuler; Jørgensen, J. B.

    2017-01-01

    Simulation studies of oil field water flooding have demonstrated a significant potential of optimal control technology to improve industrial practices. However, real-life applications are challenged by unknown geological factors that make reservoir models highly uncertain. To minimize the associated financial risks, the oil literature has used ensemble-based methods to manipulate the net present value (NPV) distribution by optimizing sample-estimated risk measures. In general, such methods successfully reduce overall risk. However, as this paper demonstrates, ensemble-based control strategies ... practices. The results suggest that it may be more relevant to consider the NPV offset distribution than the NPV distribution when minimizing risk in production optimization.

  6. New minimally invasive option for the treatment of gluteal muscle contracture.

    Science.gov (United States)

    Ye, Bin; Zhou, Panyu; Xia, Yan; Chen, Youyan; Yu, Jun; Xu, Shuogui

    2012-12-01

    Gluteal muscle contracture is a clinical syndrome that involves contracture and distortion of the gluteal muscles and fascia fibers due to multiple causes. Physical examination demonstrates a characteristic gait due to hip adduction and internal thigh rotation. This study introduces a new minimally invasive method for surgical release of gluteal muscle contracture. Patients with gluteal muscle contracture were assigned to 4 categories: type A, contracture occurred mainly in the iliotibial tract; type B, contracture occurred in the iliotibial tract and gluteus maximus; type C1, movement of the contraction band was palpable and a snapping sound was audible during squatting; and type C2, movement of the contraction band was not palpable or almost absent and a snapping sound was audible during squatting. This classification method allowed prediction of the anatomic location of these pathological contractures and determination of the type of surgery required. Four critical points were used to define the operative field and served as points to mark a surgical incision smaller than 4 mm. The contracture was easily released in this carefully marked operative field without causing significant neurovascular damage. Over a period of 5 years, between March 2003 and June 2008, the authors treated 1059 patients with this method and achieved excellent outcomes. Most patients were fully active within 12 weeks, with the assistance of an early postoperative rehabilitation program. The most significant complication was a postoperative periarticular hematoma, which occurred in 3 patients within 10 days postoperatively and required surgical ligation of the bleeding vessel. Copyright 2012, SLACK Incorporated.

  7. Minimally Invasive Spinal Surgery with Intraoperative Image-Guided Navigation

    Directory of Open Access Journals (Sweden)

    Terrence T. Kim

    2016-01-01

    Full Text Available We present our perioperative minimally invasive spine surgery technique using intraoperative computed tomography image-guided navigation for the treatment of various lumbar spine pathologies. We present an illustrative case of a patient undergoing minimally invasive percutaneous posterior spinal fusion assisted by the O-arm system with navigation. We discuss the literature and the advantages of the technique over fluoroscopic imaging methods: lower occupational radiation exposure for operative room personnel, reduced need for postoperative imaging, and decreased revision rates. Most importantly, we demonstrate that use of intraoperative cone beam CT image-guided navigation has been reported to increase accuracy.

  8. Efficient modified Jacobi relaxation for minimizing the energy functional

    International Nuclear Information System (INIS)

    Park, C.H.; Lee, I.; Chang, K.J.

    1993-01-01

    We present an efficient scheme of diagonalizing large Hamiltonian matrices in a self-consistent manner. In the framework of the preconditioned conjugate gradient minimization of the energy functional, we replace the modified Jacobi relaxation for preconditioning and use for band-by-band minimization the restricted-block Davidson algorithm, in which only the previous wave functions and the relaxation vectors are included additionally for subspace diagonalization. Our scheme is found to be comparable with the preconditioned conjugate gradient method for both large ordered and disordered Si systems, while it is more rapidly converged for systems with transition-metal elements
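As a generic illustration of the preconditioning idea (not the paper's electronic-structure implementation), Jacobi relaxation rescales the residual by the inverse diagonal of the operator, which makes a simple descent iteration on the energy functional converge quickly when the diagonal carries most of the spectrum. The matrix below is a made-up example:

```python
import numpy as np

# Made-up SPD system with widely varying diagonal entries; minimizing the
# energy functional E(x) = 0.5 x^T A x - b^T x is equivalent to solving Ax = b.
A = np.array([[1.0,   0.1,  0.1],
              [0.1,  10.0,  0.1],
              [0.1,   0.1, 100.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.zeros(3)
d_inv = 1.0 / np.diag(A)           # the Jacobi preconditioner
for _ in range(50):
    r = b - A @ x                  # residual = -grad E(x)
    if np.linalg.norm(r) < 1e-14:  # already converged
        break
    z = d_inv * r                  # preconditioned search direction
    alpha = (r @ z) / (z @ A @ z)  # exact line search along z
    x += alpha * z

rel_err = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print(rel_err < 1e-8)
```

Without the diagonal rescaling, plain steepest descent on this system would crawl, since the eigenvalues span two orders of magnitude; the preconditioned iteration converges in a handful of steps.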

  9. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to a graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  10. Neural Interfaces for Intracortical Recording: Requirements, Fabrication Methods, and Characteristics

    Directory of Open Access Journals (Sweden)

    Katarzyna M. Szostak

    2017-12-01

    Full Text Available Implantable neural interfaces for central nervous system research have been designed with wire, polymer, or micromachining technologies over the past 70 years. Research on biocompatible materials, ideal probe shapes, and insertion methods has resulted in building more and more capable neural interfaces. Although the trend is promising, the long-term reliability of such devices has not yet met the required criteria for chronic human application. The performance of neural interfaces in chronic settings often degrades due to foreign body response to the implant that is initiated by the surgical procedure, and related to the probe structure, and material properties used in fabricating the neural interface. In this review, we identify the key requirements for neural interfaces for intracortical recording, describe the three different types of probes—microwire, micromachined, and polymer-based probes; their materials, fabrication methods, and discuss their characteristics and related challenges.

  11. A Projected Conjugate Gradient Method for Sparse Minimax Problems

    DEFF Research Database (Denmark)

    Madsen, Kaj; Jonasson, Kristjan

    1993-01-01

    A new method for nonlinear minimax problems is presented. The method is of the trust region type and based on sequential linear programming. It is a first order method that only uses first derivatives and does not approximate Hessians. The new method is well suited for large sparse problems as it only requires that software for sparse linear programming and a sparse symmetric positive definite equation solver are available. On each iteration a special linear/quadratic model of the function is minimized, but contrary to the usual practice in trust region methods the quadratic model is only ... with the method are presented. In fact, we find that the number of iterations required is comparable to that of state-of-the-art quasi-Newton codes.

  12. Minimal access surgery in children: An initial experience of 28 months

    Directory of Open Access Journals (Sweden)

    Gupta Abhaya

    2009-01-01

    Full Text Available Background: This study reports our 28-month experience with minimal access surgery (MAS) in children. Materials and Methods: This was a review of all children who underwent MAS between December 2004 and March 2007 at the Departments of Paediatric Surgery, Seth Gordhandas Sunderdas Medical College (GSMC) and King Edward the VII Memorial (KEM) Hospital, India. Results and observations were tabulated and analysed, and compared with observations by various other authors regarding a variety of indications, including operative time, hospital stay, conversion rate, complications, safety, and feasibility of MAS in neonates, in the appropriate operative groups. Results: A total of 199 procedures were performed in 193 children aged between 10 days and 12 years (average age: 5.7 years). One case each of adrenal mass, retroperitoneoscopic nephrectomy, laparoscopic congenital diaphragmatic hernia (CDH) repair, and abdominoperineal pull-through for anorectal malformation was converted to open surgery due to technical difficulty. The overall conversion rate was 3%. Morbidity and mortality were minimal and the procedures were well tolerated in the majority of cases. Conclusion: We concluded that MAS procedures appear to be safe for a wide range of indications in neonates and children. Further development and expansion of its indications in neonatal and paediatric surgery requires multi-institutional studies with larger cohorts of patients, to compare with the standards of open surgery.

  13. A negentropy minimization approach to adaptive equalization for digital communication systems.

    Science.gov (United States)

    Choi, Sooyong; Lee, Te-Won

    2004-07-01

    In this paper, we introduce and investigate a new adaptive equalization method based on minimizing an approximate negentropy of the estimation error for a finite-length equalizer. We consider an approximate negentropy using nonpolynomial expansions of the estimation error as a new performance criterion to improve on a linear equalizer based on the minimum mean squared error (MMSE) criterion. Negentropy includes higher order statistical information, and its minimization provides improved convergence, performance, and accuracy compared to traditional methods such as MMSE in terms of bit error rate (BER). The proposed negentropy minimization (NEGMIN) equalizer has two kinds of solutions, the MMSE solution and another one, depending on the ratio of the normalization parameters. The NEGMIN equalizer has the best BER performance when the ratio of the normalization parameters is properly adjusted to maximize the output power (variance) of the NEGMIN equalizer. Simulation experiments show that the BER performance of the NEGMIN equalizer with the solution other than the MMSE one has similar characteristics to the adaptive minimum bit error rate (AMBER) equalizer. The main advantage of the proposed equalizer is that it needs significantly fewer training symbols than the AMBER equalizer. Furthermore, the proposed equalizer is more robust to nonlinear distortions than the MMSE equalizer.
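The MMSE baseline that the NEGMIN equalizer contains as a special case can be sketched as a least-squares fit of FIR equalizer taps on training data. The channel taps, equalizer length, and decision delay below are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 2000
symbols = rng.choice([-1.0, 1.0], size=n)      # BPSK training sequence
channel = np.array([1.0, 0.4, 0.2])            # toy ISI channel taps
received = np.convolve(symbols, channel)[:n]
received += 0.01 * rng.standard_normal(n)      # mild additive noise

taps, delay = 7, 3
# Sliding windows of the received signal form the regression matrix.
X = np.array([received[i:i + taps] for i in range(n - taps)])
d = symbols[delay:delay + (n - taps)]          # desired symbol per window

w, *_ = np.linalg.lstsq(X, d, rcond=None)      # least-squares (MMSE) taps
ber = np.mean(np.sign(X @ w) != d)
print(ber)
```

The NEGMIN criterion replaces the squared-error objective above with an approximate negentropy of the error, exploiting higher-order statistics rather than only second moments.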

  14. Global Analysis of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J

    2010-01-01

    Many properties of minimal surfaces are of a global nature, and this is already true for the results treated in the first two volumes of the treatise. Part I of the present book can be viewed as an extension of these results. For instance, the first two chapters deal with existence, regularity and uniqueness theorems for minimal surfaces with partially free boundaries. Here one of the main features is the possibility of 'edge-crawling' along free parts of the boundary. The third chapter deals with a priori estimates for minimal surfaces in higher dimensions and for minimizers of singular integrals.

  15. Minimal Surfaces for Hitchin Representations

    DEFF Research Database (Denmark)

    Li, Qiongling; Dai, Song

    2018-01-01

    In this paper, we investigate the properties of immersed minimal surfaces inside symmetric space associated to a subloci of Hitchin component: $q_n$ and $q_{n-1}$ case. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...

  16. Methods for computing color anaglyphs

    Science.gov (United States)

    McAllister, David F.; Zhou, Ya; Sullivan, Sophia

    2010-02-01

    A new computation technique is presented for calculating pixel colors in anaglyph images. The method depends upon knowing the RGB spectral distributions of the display device and the transmission functions of the filters in the viewing glasses. It requires the solution of a nonlinear least-squares program for each pixel in a stereo pair and is based on minimizing color distances in the CIE L*a*b* uniform color space. The method is compared with several techniques for computing anaglyphs, including approximation in CIE space using the Euclidean and uniform metrics, the Photoshop method and its variants, and a method proposed by Peter Wimmer. We also discuss the methods of desaturation and gamma correction for reducing retinal rivalry.
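A deliberately simplified sketch of the per-pixel least-squares formulation, using a plain Euclidean RGB metric in place of the paper's nonlinear CIE L*a*b* distance; the filter transmission matrices and target colors below are invented for illustration:

```python
import numpy as np

# Illustrative 3x3 transmission matrices: what each filter of red/cyan
# glasses passes of the display's R, G, B primaries (made-up values).
T_left = np.diag([0.9, 0.05, 0.05])    # red filter (left eye)
T_right = np.diag([0.05, 0.6, 0.8])    # cyan filter (right eye)

left_target = np.array([0.7, 0.3, 0.2])    # color the left eye should see
right_target = np.array([0.6, 0.4, 0.3])   # color the right eye should see

# One anaglyph color a per pixel, minimizing
#   ||T_left a - left_target||^2 + ||T_right a - right_target||^2,
# then clipped to the displayable range. (The paper instead solves a
# nonlinear least-squares program per pixel in CIE L*a*b* space.)
M = np.vstack([T_left, T_right])
t = np.concatenate([left_target, right_target])
a, *_ = np.linalg.lstsq(M, t, rcond=None)
a = np.clip(a, 0.0, 1.0)
print(a)
```

The paper's method differs in metric (perceptually uniform L*a*b* rather than Euclidean RGB) and in solving a nonlinear program, but the stacked two-eye least-squares structure is the same.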

  17. Environmental Restoration Contractor Waste Minimization and Pollution Prevention Plan. Revision 1

    International Nuclear Information System (INIS)

    Lewis, R.A.

    1996-03-01

    This plan contains the Environmental Restoration Contractor (ERC) Waste Minimization and Pollution Prevention (WMin/P2) Program. The plan outlines the activities and schedules developed by the ERC to reduce the quantity and toxicity of waste dispositioned as a result of restoration and remediation activities. This plan satisfies US Department of Energy (DOE) requirements including the Pollution Prevention Awareness program required by DOE Order 5400.1 (DOE 1988). This plan is consistent with Executive Order 12856 and Secretary O'Leary's pollution prevention Policy Statement of December 27, 1994, which set US and DOE pollution prevention policies, respectively. It is also consistent with the DOE Pollution Prevention Crosscut Plan, 1994, which provides guidance in meeting the DOE goals in pollution prevention. The purpose of this plan is to aid ERC projects in meeting and documenting compliance with requirements for WMin/P2. This plan contains the objectives, strategy, and support activities of the ERC Team WMin/P2 program. The basic elements of the plan are pollution prevention goals, waste assessments of major waste streams, implementation of feasible waste minimization opportunities, and a process for reporting achievements. Wherever appropriate, the ERC will integrate the pollution prevention activities in this plan into regular program activities rather than establishing separate WMin/P2 activities. Moreover, wherever possible, existing documents, procedures, and activities will be used to meet WMin/P2 requirements

  18. Laser radiation in tennis elbow treatment: a new minimally invasive alternative

    Science.gov (United States)

    Paganini, Stefan; Thal, Dietmar R.; Werkmann, Klaus

    1998-01-01

    Epicondylitis humeri radialis (EHR), or tennis elbow, is a common condition among elbow joint pain syndromes. We treated patients who had experienced chronic pain for at least one year without improvement from conservative or operative therapies with a new minimally invasive method, EHR laser radiation (EHR-LR). With this method, periepicondylar coagulations were applied to the trigger points of the patients. For this purpose, the previously established technique of facet joint coagulation with the Nd:YAG laser was modified. In a follow-up of between 6 weeks and 2 years, all patients reported either a significant pain reduction or were symptom free. EHR-LR is a new method, situated between conservative and surgical treatments, for minimally invasive therapy of EHR. Several therapeutic rationales are discussed for the resulting pain reduction.

  19. Minimally Invasive Surgical Treatment of Acute Epidural Hematoma: Case Series

    Directory of Open Access Journals (Sweden)

    Weijun Wang

    2016-01-01

    Full Text Available Background and Objective. Although minimally invasive surgical treatment of acute epidural hematoma attracts increasing attention, no generalized indications for the surgery have been adopted. This study aimed to evaluate the effects of minimally invasive surgery in acute epidural hematoma with various hematoma volumes. Methods. Minimally invasive puncture and aspiration surgery was performed in 59 cases of acute epidural hematoma with various hematoma volumes (13–145 mL); postoperative follow-up was 3 months. Clinical data, including surgical trauma, surgery time, complications, and outcome of hematoma drainage, recovery, and Barthel index scores, were assessed, as well as treatment outcome. Results. Surgical trauma was minimal and surgery time was short (10–20 minutes); no anesthesia accidents or surgical complications occurred. Two patients died. Drainage was completed within 7 days in the remaining 57 cases. Barthel index scores of ADL were ≤40 (n=1), 41–60 (n=1), and >60 (n=55); scores of 100 were obtained in 48 cases, with no dysfunctions. Conclusion. Satisfactory results can be achieved with minimally invasive surgery in treating acute epidural hematoma with hematoma volumes ranging from 13 to 145 mL. For patients with hematoma volume >50 mL and even cerebral herniation, flexible application of minimally invasive surgery would help improve treatment efficacy.

  20. Minimal Webs in Riemannian Manifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen

    2008-01-01

    For a given combinatorial graph $G$ a {\\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1-$dimensional manifold with an associated metric $g$. In this paper we are concerned with {\\it minimal isometric immersions} of geometrized graphs $(G, g)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\\em{minimal webs}}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...

  1. Loss Minimization Sliding Mode Control of IPM Synchronous Motor Drives

    Directory of Open Access Journals (Sweden)

    Mehran Zamanifar

    2010-01-01

    Full Text Available In this paper, a nonlinear loss minimization control strategy for an interior permanent magnet synchronous motor (IPMSM), based on a newly developed sliding mode approach, is presented. This control method enforces speed control of the IPMSM drive and simultaneously ensures minimization of the losses despite the uncertainties that exist in the system, such as parameter variations, which have undesirable effects on controller performance except at near-nominal conditions. Simulation results are presented to show the effectiveness of the proposed controller.

  2. Smart Cup: A Minimally-Instrumented, Smartphone-Based Point-of-Care Molecular Diagnostic Device.

    Science.gov (United States)

    Liao, Shih-Chuan; Peng, Jing; Mauk, Michael G; Awasthi, Sita; Song, Jinzhao; Friedman, Harvey; Bau, Haim H; Liu, Changchun

    2016-06-28

    Nucleic acid amplification-based diagnostics offer rapid, sensitive, and specific means for detecting and monitoring the progression of infectious diseases. However, this method typically requires extensive sample preparation, expensive instruments, and trained personnel, all of which hinder its use in resource-limited settings, where many infectious diseases are endemic. Here, we report on a simple, inexpensive, minimally-instrumented, smart cup platform for rapid, quantitative molecular diagnostics of pathogens at the point of care. Our smart cup takes advantage of a water-triggered, exothermic chemical reaction to supply heat for nucleic acid-based isothermal amplification. The amplification temperature is regulated with a phase-change material (PCM). The PCM maintains the amplification reactor at a constant temperature, typically 60-65°C, when ambient temperatures range from 12 to 35°C. To eliminate the need for an optical detector and minimize cost, we use the smartphone's flashlight to excite the fluorescent dye and the phone camera to record real-time fluorescence emission during the amplification process. The smartphone can concurrently monitor multiple amplification reactors and analyze the recorded data. Our smart cup's utility was demonstrated by amplifying and quantifying herpes simplex virus type 2 (HSV-2) with a LAMP assay in our custom-made microfluidic diagnostic chip. We have consistently detected as few as 100 copies of HSV-2 viral DNA per sample. Our system does not require any lab facilities and is suitable for use at home, in the field, and in the clinic, as well as in resource-poor settings, where access to sophisticated laboratories is impractical, unaffordable, or nonexistent.

  3. Implementation of Waste Minimization at a complex R&D site

    International Nuclear Information System (INIS)

    Lang, R.E.; Thuot, J.R.; Devgun, J.S.

    1995-01-01

    Under the 1994 Waste Minimization/Pollution Prevention Crosscut Plan, the Department of Energy (DOE) has set a goal of a 50% reduction in waste at its facilities by the end of 1999. Each DOE site is required to set site-specific goals to reduce generation of all types of waste, including hazardous, radioactive, and mixed. To meet these goals, Argonne National Laboratory (ANL), Argonne, IL, has developed and implemented a comprehensive Pollution Prevention/Waste Minimization (PP/WMin) Program. The facilities and activities at the site vary from research into basic sciences and the nuclear fuel cycle to high energy physics and decontamination and decommissioning projects. As a multidisciplinary R&D facility and a multiactivity site, ANL generates waste streams that are varied in physical form as well as in chemical constituents. This in turn presents a significant challenge to putting a cohesive site-wide PP/WMin Program into action. In this paper, we will describe ANL's key activities and waste streams, the regulatory drivers for waste minimization, and the DOE goals in this area, and we will discuss ANL's strategy for waste minimization and its implementation across the site.

  4. Minimally Invasive Technique for PMMA Augmentation of Fenestrated Screws

    Directory of Open Access Journals (Sweden)

    Jan-Helge Klingler

    2015-01-01

    Full Text Available Purpose. To describe the minimally invasive technique for cement augmentation of cannulated and fenestrated screws using an injection cannula, as well as to report its safety and efficacy. Methods. A total of 157 cannulated and fenestrated pedicle screws had been cement-augmented during minimally invasive posterior screw-rod spondylodesis in 35 patients from January to December 2012. Retrospective evaluation of cement extravasation and screw loosening was carried out in postoperative plain radiographs and thin-sliced triplanar computed tomography scans. Results. Twenty-seven, largely prevertebral, cement extravasations were detected in 157 screws (17.2%). None of the cement extravasations caused a clinical sequela such as a new neurological deficit. One screw loosening was noted (0.6%) after a mean follow-up of 12.8 months. We observed no cementation-associated complication such as pulmonary embolism or hemodynamic insufficiency. Conclusions. The presented minimally invasive cement augmentation technique using an injection cannula facilitates convenient and safe cement delivery through polyaxial cannulated and fenestrated screws during minimally invasive screw-rod spondylodesis. Nevertheless, the optimal injection technique and design of fenestrated screws have yet to be identified. This trial is registered with German Clinical Trials DRKS00006726.

  5. Minimal surfaces, stratified multivarifolds, and the plateau problem

    CERN Document Server

    Thi, Dao Trong; Primrose, E J F; Silver, Ben

    1991-01-01

    Plateau's problem is a scientific trend in modern mathematics that unites several different problems connected with the study of minimal surfaces. In its simplest version, Plateau's problem is concerned with finding a surface of least area that spans a given fixed one-dimensional contour in three-dimensional space--perhaps the best-known example of such surfaces is provided by soap films. From the mathematical point of view, such films are described as solutions of a second-order partial differential equation, so their behavior is quite complicated and has still not been thoroughly studied. Soap films, or, more generally, interfaces between physical media in equilibrium, arise in many applied problems in chemistry, physics, and also in nature. In applications, one finds not only two-dimensional but also multidimensional minimal surfaces that span fixed closed "contours" in some multidimensional Riemannian space. An exact mathematical statement of the problem of finding a surface of least area or volume requir...

  6. Minimization of number of setups for mounting machines

    Energy Technology Data Exchange (ETDEWEB)

    Kolman, Pavel; Nchor, Dennis; Hampel, David [Department of Statistics and Operation Analysis, Faculty of Business and Economics, Mendel University in Brno, Zemědělská 1, 603 00 Brno (Czech Republic); Žák, Jaroslav [Institute of Technology and Business, Okružní 517/10, 370 01 České Budejovice (Czech Republic)

    2015-03-10

    The article deals with the problem of minimizing the number of setups for mounting SMT machines. An SMT (surface-mount technology) machine is a device used to assemble components on printed circuit boards (PCBs) during the manufacturing of electronics. Each type of PCB requires a different, obligatory set of components, which are placed in the SMT tray. The problem is that the total number of components used across all products is greater than the size of the tray. Therefore, every change of manufactured product requires a complete change of components in the tray (i.e., a setup change). Currently, the number of setups corresponds to the number of printed circuit board types, and each product change triggers a setup change that stops production for one shift. Because many components occur in more than one product, the question arose as to how to group the products so as to minimize the number of setups. This would result in a large increase in production efficiency.
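
    The grouping question above is essentially a packing problem: products whose combined component sets fit in the tray can share one setup. As an illustrative sketch (not the authors' method; the function name and data are hypothetical), a greedy heuristic might look like:

```python
def group_products(products, tray_size):
    """Greedily pack products into setup groups whose combined component
    sets fit in the SMT tray.

    products: dict mapping product name -> set of required component types
    tray_size: maximum number of distinct components the tray can hold
    Returns a list of [names, component_union] groups; the number of
    groups is the number of setups needed under this heuristic.
    """
    groups = []
    # Seed groups with the component-richest products first.
    for name, comps in sorted(products.items(), key=lambda kv: -len(kv[1])):
        for group in groups:
            union = group[1] | comps
            if len(union) <= tray_size:        # product fits in this setup
                group[0].append(name)
                group[1] = union
                break
        else:                                  # no existing setup fits
            groups.append([[name], set(comps)])
    return groups
```

    Each returned group corresponds to one tray setup, so minimizing the number of groups minimizes the number of setups; an exact solution would require integer programming, which the greedy pass only approximates.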

  7. Results of completion arteriography after minimally invasive off-pump coronary artery bypass.

    Science.gov (United States)

    Hoff, Steven J; Ball, Stephen K; Leacche, Marzia; Solenkova, Natalia; Umakanthan, Ramanan; Petracek, Michael R; Ahmad, Rashid; Greelish, James P; Walker, Kristie; Byrne, John G

    2011-01-01

    The benefits of a minimally invasive approach to off-pump coronary artery bypass remain controversial. The value of completion arteriography in validating this technique has not been investigated. From April 2007 to October 2009, fifty-six patients underwent isolated minimally invasive coronary artery bypass grafting through a left thoracotomy without cardiopulmonary bypass. Forty-three of these patients underwent completion arteriography. Sixty-five grafts were performed in these 56 patients, (average, 1.2 grafts per patient; range, 1 to 3). Forty-eight grafts were studied in the 43 patients undergoing completion arteriography. There were 4 findings on arteriogram leading to further immediate intervention (8.3%). These included 3 grafts with anastomotic stenoses or spasm requiring stent placement, and 1 patient who had limited dissection in the left internal mammary artery graft and underwent placement of an additional vein graft. These findings were independent of electrocardiographic changes or hemodynamic instability. The remainder of the studies showed no significant abnormalities. There were no deaths. One patient who did not have a completion arteriogram suffered a postoperative myocardial infarction requiring stent placement for anastomotic stenosis. Patients were discharged home an average of 6.8 days postoperatively. There were no instances of renal dysfunction postoperatively attributable to catheterization. Minimally invasive coronary artery bypass is safe and effective. Findings of completion arteriography occasionally reveal previously under-recognized findings that, if corrected in a timely fashion, could potentially impact graft patency and clinical outcomes. Our experience validates this minimally invasive technique. Copyright © 2011 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  8. Minimally invasive surgical treatment of Bertolotti's Syndrome: case report.

    Science.gov (United States)

    Ugokwe, Kene T; Chen, Tsu-Lee; Klineberg, Eric; Steinmetz, Michael P

    2008-05-01

    This article aims to provide more insight into the presentation, diagnosis, and treatment of Bertolotti's syndrome, which is a rare spinal disorder that is very difficult to recognize and diagnose correctly. The syndrome was first described by Bertolotti in 1917 and affects approximately 4 to 8% of the population. It is characterized by an enlarged transverse process at the most caudal lumbar vertebra with a pseudoarticulation of the transverse process and the sacral ala. It tends to present with low back pain and may be confused with facet and sacroiliac joint disease. In this case report, we describe a 40-year-old man who presented with low back pain and was eventually diagnosed with Bertolotti's syndrome. The correct diagnosis was made based on imaging studies which included computed tomographic scans, plain x-rays, and magnetic resonance imaging scans. The patient experienced temporary relief when the abnormal pseudoarticulation was injected with a cocktail consisting of lidocaine and steroids. In order to minimize the trauma associated with surgical treatment, a minimally invasive approach was chosen to resect the anomalous transverse process with the accompanying pseudoarticulation. The patient did well postoperatively and had 97% resolution of his pain at 6 months after surgery. As with conventional surgical approaches, a complete knowledge of anatomy is required for minimally invasive spine surgery. This case is an example of the expanding utility of minimally invasive approaches in treating spinal disorders.

  9. Operant Conditioning: A Minimal Components Requirement in Artificial Spiking Neurons Designed for Bio-Inspired Robot’s Controller

    Directory of Open Access Journals (Sweden)

    André eCyr

    2014-07-01

    We demonstrate the operant conditioning (OC) learning process within a basic bio-inspired robot controller paradigm, using an artificial spiking neural network (ASNN) with a minimal component count as the artificial brain. In biological agents, OC results in behavioral changes that are learned from the consequences of previous actions, using progressive prediction adjustment triggered by reinforcers. In a robotics context, virtual and physical robots may benefit from a similar learning skill when facing unknown environments with no supervision. In this work, we demonstrate that a simple ASNN can efficiently realise many OC scenarios. The elementary learning kernel that we describe relies on a few critical neurons and synaptic links, and on the integration of habituation and spike-timing dependent plasticity (STDP) as learning rules. Using four tasks of incremental complexity, our experimental results show that such a minimal set of neural components may be sufficient to implement many OC procedures. Hence, with the described bio-inspired module, OC can be implemented in a wide range of robot controllers, including those with limited computational resources.

  10. A randomized prospective study of desflurane versus isoflurane in minimal flow anesthesia using “equilibration time” as the change-over point to minimal flow

    Science.gov (United States)

    Mallik, Tanuja; Aneja, S; Tope, R; Muralidhar, V

    2012-01-01

    Background: In the administration of minimal flow anesthesia, traditionally a fixed time period of high flow has been used before changing over to minimal flow. However, newer studies have used the "equilibration time" of a volatile anesthetic agent as the change-over point. Materials and Methods: A randomized prospective study was conducted on 60 patients, who were divided into two groups of 30 patients each. Two volatile inhalational anesthetic agents were compared: group I received desflurane (n = 30) and group II isoflurane (n = 30). Both groups received an initial high flow until equilibration between the inspired (Fi) and expired (Fe) agent concentrations was achieved, defined as Fe/Fi = 0.8. The mean (SD) equilibration time was obtained for both agents. Then, the drift in end-tidal agent concentration during minimal flow anesthesia and the recovery profile were noted. Results: The mean equilibration times obtained for desflurane and isoflurane were 4.96 ± 1.60 and 16.96 ± 9.64 min, respectively (P < 0.001). The drift in end-tidal agent concentration over time was minimal in the desflurane group (P = 0.065). Recovery time was 5.70 ± 2.78 min in the desflurane group and 8.06 ± 31 min in the isoflurane group (P = 0.004). Conclusion: Using the equilibration time of the volatile anesthetic agent as the change-over point from high flow to minimal flow can help us use minimal flow anesthesia more efficiently. PMID:23225926
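
    As a sketch of the change-over criterion described above (the readings are illustrative, not study data), the equilibration point is simply the first time at which the expired-to-inspired agent ratio Fe/Fi reaches 0.8:

```python
def equilibration_time(readings, threshold=0.8):
    """readings: list of (minutes, Fi, Fe) agent-concentration samples.
    Returns the first time at which Fe/Fi >= threshold, or None if the
    threshold is never reached."""
    for t, fi, fe in readings:
        if fi > 0 and fe / fi >= threshold:
            return t
    return None
```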

  11. Selecting Suitable Drainage Pattern to Minimize Flooding in ...

    African Journals Online (AJOL)

    Watershed analysis is a geographic information system (GIS) based technique designed to model the way surface water flows on the earth's surface. This was the method adopted to select a suitable drainage pattern to minimize flooding in some parts of Sangere. The process of watershed computes the local direction of flow ...
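
    Local flow direction in watershed analysis is commonly computed with a D8-style steepest-descent rule; the sketch below is a generic illustration of that rule (grid values hypothetical), not the specific GIS routine used in the study:

```python
def d8_flow_direction(elev, r, c):
    """Return the (row, col) of the steepest-descent neighbor of cell
    (r, c) in the elevation grid, or None for a pit or flat cell."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            rr, cc = r + dr, c + dc
            if 0 <= rr < len(elev) and 0 <= cc < len(elev[0]):
                dist = (dr * dr + dc * dc) ** 0.5   # 1 or sqrt(2)
                drop = (elev[r][c] - elev[rr][cc]) / dist
                if drop > best_drop:
                    best, best_drop = (rr, cc), drop
    return best
```

    Tracing these directions cell by cell yields the drainage pattern over the whole grid.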

  12. IMPORTANCE, Minimal Cut Sets and System Availability from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Lambert, H. W.

    1987-01-01

    1 - Description of problem or function: IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code. 2 - Method of solution: Seven measures of basic event importance and two measures of cut set importance can be computed. Birnbaum's measure of importance (i.e., the partial derivative) and the probability of the top event are computed using the min cut upper bound. If there are no replicated events in the minimal cut sets, then the min cut upper bound is exact. If basic events are replicated in the minimal cut sets, then based on experience the min cut upper bound is accurate if the probability of the top event is less than 0.1. Simpson's rule is used in computing the time-integrated measures of importance. Newton's method for approximating the roots of an equation is employed in the options where the importance measures are computed as a function of the probability of the top event, and a shell sort puts the output in descending order of importance
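
    The min cut upper bound and Birnbaum's measure described above can be sketched as follows (an illustrative re-implementation, not the IMPORTANCE code itself; event names and probabilities are hypothetical):

```python
def min_cut_upper_bound(cut_sets, p):
    """Min cut upper bound: P(top) <= 1 - prod_i (1 - prod_{e in cut_i} p[e]).
    Exact when no basic event is replicated across the minimal cut sets."""
    prod = 1.0
    for cut in cut_sets:
        q = 1.0
        for event in cut:
            q *= p[event]       # probability that every event in the cut fails
        prod *= 1.0 - q
    return 1.0 - prod

def birnbaum_importance(cut_sets, p, event):
    """Birnbaum's measure: the partial derivative of the top-event
    probability with respect to p[event], evaluated as the difference
    between the event surely failed and surely working."""
    hi = dict(p, **{event: 1.0})
    lo = dict(p, **{event: 0.0})
    return min_cut_upper_bound(cut_sets, hi) - min_cut_upper_bound(cut_sets, lo)
```

    Ranking basic events by this derivative reproduces the kind of numerical importance ordering the abstract describes.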

  13. Super-acceleration from massless, minimally coupled φ⁴

    CERN Document Server

    Onemli, V K

    2002-01-01

    We derive a simple form for the propagator of a massless, minimally coupled scalar in a locally de Sitter geometry of arbitrary spacetime dimension. We then employ it to compute the fully renormalized stress tensor at one- and two-loop orders for a massless, minimally coupled φ⁴ theory which is released in Bunch-Davies vacuum at t=0 in co-moving coordinates. In this system, the uncertainty principle elevates the scalar above the minimum of its potential, resulting in a phase of super-acceleration. With the non-derivative self-interaction the scalar's breaking of de Sitter invariance becomes observable. It is also worth noting that the weak-energy condition is violated on cosmological scales. An interesting subsidiary result is that cancelling overlapping divergences in the stress tensor requires a conformal counterterm which has no effect on purely scalar diagrams.

  14. Using in situ bioventing to minimize soil vapor extraction costs

    International Nuclear Information System (INIS)

    Downey, D.C.; Frishmuth, R.A.; Archabal, S.R.; Pluhar, C.J.; Blystone, P.G.; Miller, R.N.

    1995-01-01

    Gasoline-contaminated soils may be difficult to remediate with bioventing because high concentrations of gasoline vapors become mobile when air is injected into the soil. Because outward vapor migration is often unacceptable on small commercial sites, soil vapor extraction (SVE) or innovative bioventing techniques are required to control vapors and to increase soil gas oxygen levels to stimulate hydrocarbon biodegradation. Combinations of SVE, off-gas treatment, and bioventing have been used to reduce the costs normally associated with remediation of gasoline-contaminated sites. At Site 1, low rates of pulsed air injection were used to provide oxygen while minimizing vapor migration. At Site 2, a period of high-rate SVE and off-gas treatment was followed by long-term air injection. Site 3 used an innovative approach that combined regenerative resin for ex situ vapor treatment with in situ bioventing to reduce the overall cost of site remediation. At each of these Air Force sites, bioventing provided cost savings when compared to more traditional SVE methods

  15. Sensitivity Analysis of Hydraulic Methods Regarding Hydromorphologic Data Derivation Methods to Determine Environmental Water Requirements

    Directory of Open Access Journals (Sweden)

    Alireza Shokoohi

    2015-07-01

    This paper studies the accuracy of hydraulic methods in determining environmental flow requirements. Despite the vital importance of deriving river cross sectional data for hydraulic methods, few studies have focused on the criteria for deriving this data. The present study shows that the depth of cross section has a meaningful effect on the results obtained from hydraulic methods and that, considering fish as the index species for river habitat analysis, an optimum depth of 1 m should be assumed for deriving information from cross sections. The second important parameter required for extracting the geometric and hydraulic properties of rivers is the selection of an appropriate depth increment; ∆y. In the present research, this parameter was found to be equal to 1 cm. The uncertainty of the environmental discharge evaluation, when allocating water in areas with water scarcity, should be kept as low as possible. The Manning friction coefficient (n) is an important factor in river discharge calculation. Using a range of "n" equal to 3 times the standard deviation for the study area, it is shown that the influence of friction coefficient on the estimation of environmental flow is much less than that on the calculation of river discharge.
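
    Manning's equation, which underlies the discharge calculation discussed above, is Q = (1/n)·A·R^(2/3)·S^(1/2). The sketch below (with illustrative channel values, not the study's data) shows how discharge responds when n is varied:

```python
def manning_discharge(n, area, hydraulic_radius, slope):
    """Manning's equation in SI units: Q = (1/n) * A * R**(2/3) * S**0.5,
    where A is flow area (m^2), R the hydraulic radius (m), S the slope,
    and n the Manning friction coefficient."""
    return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5
```

    Sweeping n over, say, ±3 standard deviations and recomputing Q is the kind of sensitivity test the abstract refers to; because Q scales as 1/n, a larger friction coefficient always gives a smaller computed discharge.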

  16. Maintaining Low Voiding Solder Die Attach for Power Die While Minimizing Die Tilt

    Energy Technology Data Exchange (ETDEWEB)

    Hamm, Randy; Peterson, Kenneth A.

    2015-10-01

    This paper addresses work to minimize voiding and die tilt in solder attachment of a large power die, measuring 9.0 mm X 6.5 mm X 0.1 mm (0.354” x 0.256” x 0.004”), to a heat spreader. As demands for larger high power die continue, minimizing voiding and die tilt is of interest for improved die functionality, yield, manufacturability, and reliability. High-power die generate considerable heat, which is important to dissipate effectively through control of voiding under high thermal load areas of the die while maintaining a consistent bondline (minimizing die tilt). Voiding was measured using acoustic imaging and die tilt was measured using two different optical measurement systems. 80Au-20Sn solder reflow was achieved using a batch vacuum solder system with optimized fixturing. Minimizing die tilt proved to be the more difficult of the two product requirements to meet. Process development variables included tooling, weight and solder preform thickness.

  17. Identifying Minimal Changes in Nonerosive Reflux Disease: Is the Pay Worth the Labor?

    Science.gov (United States)

    Gabbard, Scott L; Fass, Ronnie; Maradey-Romero, Carla; Gingold Belfer, Rachel; Dickman, Ram

    2016-01-01

    Gastroesophageal reflux disease has a variable presentation on upper endoscopy. Gastroesophageal reflux disease can be divided into 3 endoscopic categories: Barrett's esophagus, erosive esophagitis, and normal mucosa/nonerosive reflux disease (NERD). Each of these phenotypes behave in a distinct manner, in regards to symptom response to treatment, and risk of development of complications such as esophageal adenocarcinoma. Recently, it has been proposed to further differentiate NERD into 2 categories: those with and those without "minimal changes." These minimal changes include endoscopic abnormalities, such as villous mucosal surface, mucosal islands, microerosions, and increased vascularity at the squamocolumnar junction. Although some studies have shown that patients with minimal changes may have higher rates of esophageal acid exposure compared with those without minimal changes, it is currently unclear if these patients behave differently than those currently categorized as having NERD. The clinical utility of identifying these lesions should be weighed against the cost of the requisite equipment and the additional time required for diagnosis, compared with conventional white light endoscopy.

  18. Minimization for conditional simulation: Relationship to optimal transport

    Science.gov (United States)

    Oliver, Dean S.

    2014-05-01

    In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although any deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, perturbed observation ensemble Kalman filter, and ensemble of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
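
    For the linear-Gaussian case mentioned above, the minimization defining each randomized-maximum-likelihood sample has a closed form. The scalar sketch below is illustrative only (the symbols, signature, and values are assumptions, not taken from the paper):

```python
import random

def rml_sample(d_obs, g, prior_mean, prior_var, noise_var, rng):
    """One randomized-maximum-likelihood posterior sample for the scalar
    linear model d = g * m + noise. Each sample is the minimizer of the
    perturbed cost (g*m - d_pert)**2/noise_var + (m - m_prior)**2/prior_var,
    which maps a prior draw to a posterior draw."""
    m_prior = rng.gauss(prior_mean, prior_var ** 0.5)  # draw from the prior
    d_pert = rng.gauss(d_obs, noise_var ** 0.5)        # perturb the observation
    gain = prior_var * g / (g * g * prior_var + noise_var)
    return m_prior + gain * (d_pert - g * m_prior)
```

    In this linear case the mapping from prior draw to posterior draw is exact; as the abstract notes, with nonlinear observation operators the analogous minimization gives only an approximate posterior sample.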

  19. Retrograde Renal Cooling to Minimize Ischemia

    Directory of Open Access Journals (Sweden)

    Janet L. Colli

    2013-01-01

    Objective: During partial nephrectomy, renal hypothermia has been shown to decrease the ischemia-induced renal damage that occurs from renal hilar clamping. In this study we investigate the infusion rate required to safely cool the entire renal unit in a porcine model using retrograde irrigation of iced saline via a dual-lumen ureteral catheter. Materials and Methods: Renal cortical, renal medullary, bowel and rectal temperatures during retrograde cooling in a laparoscopic porcine model were monitored in six renal units. Iced normal saline was infused at 300 cc/hour, 600 cc/hour, 1000 cc/hour and gravity (800 cc/hour) for 600 seconds, with and without hilar clamping. Results: Retrograde cooling with hilar clamping provided rapid medullary renal cooling and significant hypothermia of the medulla and cortex at infusion rates ≥ 600 cc/hour. With hilar clamping, cortical temperatures decreased at -0.9° C/min, reaching a threshold temperature of 26.9° C, and medullary temperatures decreased at -0.9° C/min, reaching a temperature of 26.1° C over 600 seconds, on average, for combined data at infusion rates ≥ 600 cc/hour. The lowest renal temperatures were achieved with gravity infusion. Without renal hilum clamping, retrograde cooling was minimal at all infusion rates. Conclusions: Significant renal cooling by gravity infusion of iced cold saline via a dual-lumen catheter with a clamped renal hilum was achieved in a porcine model. Continuous retrograde irrigation with iced saline via a two-way ureteral catheter may be an effective method to induce renal hypothermia in patients undergoing robotic-assisted and/or laparoscopic partial nephrectomy.

  20. On minimal coupling of the ABC-superparticle to supergravity background

    OpenAIRE

    Galajinsky, A. V.; Gitman, D. M.

    1998-01-01

    By rigorous application of the Hamiltonian methods we show that the ABC-formulation of the Siegel superparticle admits consistent minimal coupling to external supergravity. The consistency check proves to involve all the supergravity constraints.

  1. Proposed New Method of Interpretation of Infrared Ship Signature Requirements

    NARCIS (Netherlands)

    Neele, F.P.; Wilson, M.T.; Youern, K.

    2005-01-01

    A new method of deriving and defining requirements for the infrared signature of new ships is presented. The current approach is to specify the maximum allowed temperature or radiance contrast of the ship with respect to its background. At present, in most NATO countries, it is the contractor’s

  2. Annual Waste Minimization Summary Report for the National Nuclear Security Administration Nevada Site Office

    International Nuclear Information System (INIS)

    Alfred J. Karns

    2007-01-01

    This report summarizes the waste minimization efforts undertaken by National Security Technologies, LLC (NSTec), for the U.S. Department of Energy (DOE) National Nuclear Security Administration Nevada Site Office (NNSA/NSO), during CY06. This report was developed in accordance with the requirements of the Nevada Test Site (NTS) Resource Conservation and Recovery Act (RCRA) Permit (#NEV HW0021) and as clarified in a letter dated April 21, 1995, from Paul Liebendorfer of the Nevada Division of Environmental Protection to Donald Elle of the DOE, Nevada Operations Office. The NNSA/NSO Pollution Prevention (P2) Program establishes a process to reduce the volume and toxicity of waste generated by the NNSA/NSO and ensures that proposed methods of treatment, storage, and/or disposal of waste minimize potential threats to human health and the environment. The following information provides an overview of the P2 Program, major P2 accomplishments during the reporting year, a comparison of the current year waste generation to prior years, and a description of efforts undertaken during the year to reduce the volume and toxicity of waste generated by the NNSA/NSO.

  3. Protect Yourself! How To Minimize Your Risk of an IRS Audit.

    Science.gov (United States)

    Lukaszewski, Thomas; Moser, Barbara

    1998-01-01

    Shows how to minimize risk of an IRS audit and how to avoid traps caused by improper documentation and reporting requirements. Outlines the most likely audit areas for business tax returns. These include cash business; for sole proprietors, Schedule C; S and C corporation salary issues; shareholder loans; contractors versus employees; fringe…

  4. A step by step selection method for the location and the size of a waste-to-energy facility targeting the maximum output energy and minimization of gate fee.

    Science.gov (United States)

    Kyriakis, Efstathios; Psomopoulos, Constantinos; Kokkotis, Panagiotis; Bourtsalas, Athanasios; Themelis, Nikolaos

    2017-06-23

    This study develops an algorithm that presents a step by step method for selecting the location and the size of a waste-to-energy facility, targeting maximum energy output while also considering what is in many cases the basic obstacle: the gate fee. Various parameters were identified and evaluated in order to formulate the proposed decision-making method in the form of an algorithm. The principal simulation input is the amount of municipal solid waste (MSW) available for incineration, which, along with its net calorific value, is the most important factor for the feasibility of the plant. Moreover, the research focuses both on the parameters that could increase energy production and on those that affect the R1 energy efficiency factor. The final gate fee is estimated through an economic analysis of the entire project, investigating both the expenses and the revenues expected given the selected site and the outputs of the facility; a number of commonly used revenue methods were included in the algorithm. The developed algorithm has been validated using three case studies in Greece (Athens, Thessaloniki, and Central Greece), where the cities of Larisa and Volos were selected for the application of the proposed decision-making tool. These case studies were selected based on a previous publication by two of the authors in which these areas were examined. Results reveal that the development of a «solid» methodological approach to selecting the site and the size of a waste-to-energy (WtE) facility is feasible. However, maximizing the energy efficiency factor R1 requires high utilization factors, while minimizing the final gate fee requires a high R1 and high metals recovery from the bottom ash, as well as economic exploitation of any recovered raw materials.
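
    The R1 energy efficiency factor referenced above is defined in Annex II of the EU Waste Framework Directive (2008/98/EC). A direct transcription of that formula (the example inputs below are illustrative, not from the study):

```python
def r1_factor(ep, ef, ei, ew):
    """R1 = (Ep - (Ef + Ei)) / (0.97 * (Ew + Ef)), per Annex II, where:
    ep: annual energy produced as heat or electricity (weighted), GJ/yr
    ef: annual energy input from fuels contributing to steam production, GJ/yr
    ei: annual imported energy excluding ew and ef, GJ/yr
    ew: annual energy contained in the treated waste (throughput x NCV), GJ/yr
    """
    return (ep - (ef + ei)) / (0.97 * (ew + ef))
```

    Because ew sits in the denominator, raising R1 pushes toward recovering more of the waste's energy content, which is why the abstract ties high R1 to high plant utilization.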

  5. On the choice of minimization parameters using 4 momentum conservation law for particle momenta improvement

    International Nuclear Information System (INIS)

    Anykeyev, V.B.; Zhigunov, V.P.; Spiridonov, A.A.

    1981-01-01

    A special choice of parameters for minimization is proposed for the problem of improving estimates of particle momenta at the event vertex using the 4-momentum conservation law. This choice permits the use of any unconditional minimization method instead of the method of Lagrange multipliers. The above method is applied in analysing data on the reaction K⁻ + p → n + K̄⁰ + π⁰.

  6. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail; Pottmann, Helmut; Grohs, Philipp

    2011-01-01

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ

  7. Minimally coupled N-particle scattering integral equations

    International Nuclear Information System (INIS)

    Kowalski, K.L.

    1977-01-01

    A concise formalism is developed which permits the efficient representation and generalization of several known techniques for deriving connected-kernel N-particle scattering integral equations. The methods of Kouri, Levin, and Tobocman and Bencze and Redish which lead to minimally coupled integral equations are of special interest. The introduction of channel coupling arrays is characterized in a general manner and the common base of this technique and that of the so-called channel coupling scheme is clarified. It is found that in the Bencze-Redish formalism a particular coupling array has a crucial function but one different from that of the arrays employed by Kouri, Levin, and Tobocman. The apparent dependence of the proof of the minimality of the Bencze-Redish integral equations upon the form of the inhomogeneous term in these equations is eliminated. This is achieved by an investigation of the full (nonminimal) Bencze-Redish kernel. It is shown that the second power of this operator is connected, a result which is needed for the full applicability of the Bencze-Redish formalism. This is used to establish the relationship between the existence of solutions to the homogeneous form of the minimal equations and eigenvalues of the full Bencze-Redish kernel

  8. Minimizing Characterization - Derived Waste at the Department of Energy Savannah River Site, Aiken, South Carolina

    Energy Technology Data Exchange (ETDEWEB)

    Van Pelt, R. S.; Amidon, M. B.; Reboul, S. H.

    2002-02-25

    Environmental restoration activities at the Department of Energy Savannah River Site (SRS) utilize innovative site characterization approaches and technologies that minimize waste generation. Characterization is typically conducted in phases, first by collecting large quantities of inexpensive data, followed by targeted minimally invasive drilling to collect depth-discrete soil/groundwater data, and concluded with the installation of permanent multi-level groundwater monitoring wells. Waste-reducing characterization methods utilize non-traditional drilling practices (sonic drilling), minimally intrusive (geoprobe, cone penetrometer) and non-intrusive (3-D seismic, ground penetration radar, aerial monitoring) investigative tools. Various types of sensor probes (moisture sensors, gamma spectroscopy, Raman spectroscopy, laser induced and X-ray fluorescence) and hydrophobic membranes (FLUTe) are used in conjunction with depth-discrete sampling techniques to obtain high-resolution 3-D plume profiles. Groundwater monitoring (short/long-term) approaches utilize multi-level sampling technologies (Strata-Sampler, Cone-Sipper, Solinst Waterloo, Westbay) and low-cost diffusion samplers for seepline/surface water sampling. Upon collection of soil and groundwater data, information is portrayed in a Geographic Information Systems (GIS) format for interpretation and planning purposes. At the SRS, the use of non-traditional drilling methods and minimally/non intrusive investigation approaches along with in-situ sampling methods has minimized waste generation and improved the effectiveness and efficiency of characterization activities.

  9. Minimizing Characterization - Derived Waste at the Department of Energy Savannah River Site, Aiken, South Carolina

    International Nuclear Information System (INIS)

    Van Pelt, R. S.; Amidon, M. B.; Reboul, S. H.

    2002-01-01

    Environmental restoration activities at the Department of Energy Savannah River Site (SRS) utilize innovative site characterization approaches and technologies that minimize waste generation. Characterization is typically conducted in phases, first by collecting large quantities of inexpensive data, followed by targeted minimally invasive drilling to collect depth-discrete soil/groundwater data, and concluded with the installation of permanent multi-level groundwater monitoring wells. Waste-reducing characterization methods utilize non-traditional drilling practices (sonic drilling), minimally intrusive (geoprobe, cone penetrometer) and non-intrusive (3-D seismic, ground penetration radar, aerial monitoring) investigative tools. Various types of sensor probes (moisture sensors, gamma spectroscopy, Raman spectroscopy, laser induced and X-ray fluorescence) and hydrophobic membranes (FLUTe) are used in conjunction with depth-discrete sampling techniques to obtain high-resolution 3-D plume profiles. Groundwater monitoring (short/long-term) approaches utilize multi-level sampling technologies (Strata-Sampler, Cone-Sipper, Solinst Waterloo, Westbay) and low-cost diffusion samplers for seepline/surface water sampling. Upon collection of soil and groundwater data, information is portrayed in a Geographic Information Systems (GIS) format for interpretation and planning purposes. At the SRS, the use of non-traditional drilling methods and minimally/non intrusive investigation approaches along with in-situ sampling methods has minimized waste generation and improved the effectiveness and efficiency of characterization activities

  10. Evaluation of the carotid artery stenosis based on minimization of mechanical energy loss of the blood flow.

    Science.gov (United States)

    Sia, Sheau Fung; Zhao, Xihai; Li, Rui; Zhang, Yu; Chong, Winston; He, Le; Chen, Yu

    2016-11-01

    Internal carotid artery stenosis requires an accurate risk assessment for the prevention of stroke. Although the internal carotid artery area stenosis ratio at the common carotid artery bifurcation can be used as one of the diagnostic methods of internal carotid artery stenosis, the accuracy of results would still depend on the measurement techniques. The purpose of this study is to propose a novel method to estimate the effect of internal carotid artery stenosis on the blood flow based on the concept of minimization of energy loss. Eight internal carotid arteries from different medical centers were diagnosed as stenosed internal carotid arteries, as plaques were found at different locations on the vessel. A computational fluid dynamics solver was developed based on an open-source code (OpenFOAM) to test the flow ratio and energy loss of those stenosed internal carotid arteries. For comparison, a healthy internal carotid artery and an idealized internal carotid artery model were also tested and compared with the stenosed internal carotid arteries in terms of flow ratio and energy loss. We found that, at a given common carotid artery bifurcation, there must be a certain flow distribution between the internal carotid artery and external carotid artery for which the total energy loss at the bifurcation is at a minimum; for a given common carotid artery flow rate, an irregularly shaped plaque at the bifurcation consistently resulted in a large minimized energy loss. Thus, minimization of energy loss can be used as an indicator for the estimation of internal carotid artery stenosis.
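The "minimum energy loss at a given flow split" idea in this record can be illustrated with a toy lumped-resistance model. This is an illustrative sketch only, not the authors' OpenFOAM CFD setup: it assumes (hypothetically) that viscous losses in each branch grow quadratically with branch flow, so the total loss has a unique minimizing split of the common carotid flow.

```python
# Toy model (assumed, not from the paper): total CCA flow Q splits into an ICA
# flow q and an ECA flow Q - q, with lumped branch "resistances" k_ica, k_eca.
# Total energy loss is modeled as loss(q) = k_ica*q**2 + k_eca*(Q - q)**2.
def minimizing_split(Q, k_ica, k_eca, steps=100000):
    """Grid-search the ICA flow q in [0, Q] that minimizes total energy loss."""
    best_q, best_loss = 0.0, float("inf")
    for i in range(steps + 1):
        q = Q * i / steps
        loss = k_ica * q**2 + k_eca * (Q - q)**2
        if loss < best_loss:
            best_q, best_loss = q, loss
    return best_q, best_loss

# Analytically, the minimum sits at q = Q * k_eca / (k_ica + k_eca): a stenosed
# (higher-resistance) ICA both shifts flow toward the ECA and raises the
# minimum achievable loss, which is the quantity the paper uses as an indicator.
```

For example, with Q = 10, k_ica = 2 and k_eca = 1 the search returns q close to the closed-form split 10/3, and raising k_ica increases the minimized loss.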

  11. Development of a quantum dot mediated thermometry for minimally invasive thermal therapy

    Science.gov (United States)

    Hanson, Willard L.

    Thermally-related, minimally invasive therapies are designed to treat tumors while minimizing damage to the surrounding tissues. Adjacent tissues become susceptible to thermal injury to ensure the cancer is completely destroyed. Destroying tumor cells, while minimizing collateral damage to the surrounding tissue, requires the capacity to control and monitor tissue temperatures both spatially and temporally. Current devices measure the tumor's tissue temperature at a specific location, leaving the majority unmonitored. A point-wise application cannot substantiate complete tumor destruction. This type of surgery would be more effective if volumetric tissue temperature measurement were available. On this premise, the feasibility of a quantum dot (QD) assembly to measure the tissue temperature volumetrically was tested in the experiments described in this dissertation. QDs are fluorescent semiconductor nanoparticles with various superior optical properties. This new QD-mediated thermometry is capable of monitoring the thermal features of tissues non-invasively by measuring the aggregate fluorescence intensity of the QDs accumulated at the target tissues prior to and during the surgical procedure. Thus, such a modality would allow evaluation of tissue destruction by measuring the fluorescence intensity of the QD as a function of temperature. The present study also quantified the photoluminescence intensity and attenuation of the QD as a function of depth and wavelength using a tissue phantom. A prototype system was developed to measure the illumination through a tissue phantom as a proof of concept of the feasibility of a noninvasive thermal therapy. This prototype includes experimental hardware, software and working methods to perform image acquisition, and data reduction strategies to quantify the intensity and transport characteristics of the QD.
The significance of this work is that real-time volumetric temperature information will prove a more robust tool for use

  12. Method for calculating required shielding in medical x-ray rooms

    International Nuclear Information System (INIS)

    Karppinen, J.

    1997-10-01

    The new annual radiation dose limits - 20 mSv (previously 50 mSv) for radiation workers and 1 mSv (previously 5 mSv) for other persons - imply that the adequacy of existing radiation shielding must be re-evaluated. In principle, one could assume that the thicknesses of old radiation shields should be increased by about one or two half-value layers in order to comply with the new dose limits. However, the assumptions made in the earlier shielding calculations are highly conservative; the required shielding was often determined by applying the maximum high-voltage of the x-ray tube for the whole workload. A more realistic calculation shows that increased shielding is typically not necessary if more practical x-ray tube voltages are used in the evaluation. We have developed a PC-based method for calculating the x-ray shielding which is more realistic than the highly conservative method formerly used. The method may be used to evaluate an existing shield for compliance with the new regulations. As examples of these calculations, typical x-ray rooms are considered. The lead and concrete thickness requirements as a function of x-ray tube voltage and workload are also given in tables. (author)
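The half-value-layer reasoning quoted above (each added HVL halves the transmitted dose) can be sketched as follows. This is a simplified exponential-attenuation sketch, not the authors' PC-based method; real HVLs depend on tube voltage, beam quality and material, so the numbers in the usage note are placeholders.

```python
import math

def required_thickness(attenuation_factor, hvl):
    """Shield thickness needed so the transmitted dose is 1/attenuation_factor
    of the unshielded dose, assuming simple half-value-layer attenuation."""
    return hvl * math.log2(attenuation_factor)

def transmission(thickness, hvl):
    """Fraction of dose transmitted through a shield of the given thickness."""
    return 2.0 ** (-thickness / hvl)
```

Under this model, tightening the public limit from 5 mSv to 1 mSv (a factor of 5) would naively demand log2(5), i.e. about 2.3, extra half-value layers, which matches the "one or two half-value layers" first estimate the abstract mentions before arguing that realistic workloads make the extra shielding unnecessary.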

  13. Latency in Visionic Systems: Test Methods and Requirements

    Science.gov (United States)

    Bailey, Randall E.; Arthur, J. J., III; Williams, Steven P.; Kramer, Lynda J.

    2005-01-01

    A visionics device creates a pictorial representation of the external scene for the pilot. The ultimate objective of these systems may be to electronically generate a form of Visual Meteorological Conditions (VMC) to eliminate weather or time-of-day as an operational constraint and provide enhancement over actual visual conditions where eye-limiting resolution may be a limiting factor. Empirical evidence has shown that the total system delays or latencies, including those of the imaging sensors and display systems, can critically degrade their utility, usability, and acceptability. Definitions and measurement techniques are offered herein as common test and evaluation methods for latency testing in visionics device applications. Based upon available data, very different latency requirements are indicated depending upon the piloting task, the role in which the visionics device is used in this task, and the characteristics of the visionics cockpit display device including its resolution, field-of-regard, and field-of-view. The least stringent latency requirements will involve Head-Up Display (HUD) applications, where the visionics imagery provides situational information as a supplement to symbology guidance and command information. Conversely, the visionics system latency requirement for a large field-of-view Head-Worn Display application, providing a Virtual-VMC capability from which the pilot will derive visual guidance, will be the most stringent, having a value as low as 20 msec.

  14. Adaptive finite element methods for differential equations

    CERN Document Server

    Bangerth, Wolfgang

    2003-01-01

    These Lecture Notes discuss concepts of `self-adaptivity' in the numerical solution of differential equations, with emphasis on Galerkin finite element methods. The key issues are a posteriori error estimation and automatic mesh adaptation. Besides the traditional approach of energy-norm error control, a new duality-based technique, the Dual Weighted Residual method for goal-oriented error estimation, is discussed in detail. This method aims at economical computation of arbitrary quantities of physical interest by properly adapting the computational mesh. This is typically required in the design cycles of technical applications. For example, the drag coefficient of a body immersed in a viscous flow is computed, then it is minimized by varying certain control parameters, and finally the stability of the resulting flow is investigated by solving an eigenvalue problem. `Goal-oriented' adaptivity is designed to achieve these tasks with minimal cost. At the end of each chapter some exercises are posed in order ...
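The Dual Weighted Residual idea mentioned above can be summarized in one formula. This is the standard form of the estimate, stated here from general knowledge of the method rather than quoted from these notes:

```latex
% Goal-oriented (DWR) a posteriori error estimate: the error in a target
% functional J is approximated by a sum of cellwise residuals \rho_K of the
% discrete solution u_h, weighted by local weights \omega_K derived from an
% adjoint (dual) solution z associated with J:
J(u) - J(u_h) \;\approx\; \sum_{K \in \mathcal{T}_h} \rho_K(u_h)\,\omega_K(z).
% Mesh adaptation then refines the cells with the largest products
% \rho_K(u_h) * \omega_K(z), concentrating effort where it most affects J.
```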

  15. An Arrhenius Model for Predicting the Respiration Rate of Minimally Processed Broccoli

    Directory of Open Access Journals (Sweden)

    Nurul Imamah

    2016-04-01

    Minimally processed broccoli is a perishable product because metabolic processes continue during the storage period. One of these processes is respiration, and the respiration rate varies with the commodity and the storage temperature. The purposes of this research were: to examine the respiration pattern of minimally processed broccoli during storage, to study the effect of storage temperature on the respiration rate, and to describe the correlation between respiration rate and temperature with the Arrhenius model. Broccoli from the farming organization "Agro Segar" was minimally processed and its respiration rate was then measured. The closed-system method was used to measure O2 and CO2 concentrations. Minimally processed broccoli was stored at temperatures of 0 °C, 5 °C, 10 °C and 15 °C. The experimental design used was a completely randomized design over these factors to analyze the respiration rate. The results show that broccoli is a climacteric vegetable, as indicated by the increase in O2 consumption and CO2 production during the senescence phase. The respiration rate increased with increasing storage temperature. The Arrhenius model can describe the correlation between respiration rate and temperature with R2 = 0.947-0.953. The activation energy (Eai) and pre-exponential factor (Roi) constants from the Arrhenius model can be used to predict the respiration rate of minimally processed broccoli at any storage temperature.
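The Arrhenius relation used in this study, R = Ro·exp(-Ea/(Rg·T)), becomes linear after taking logarithms (ln R = ln Ro - Ea/(Rg·T)), so Ea and Ro can be recovered by a least-squares line fit of ln R against 1/T. A minimal sketch with synthetic numbers, not the paper's broccoli data:

```python
import math

R_GAS = 8.314  # universal gas constant, J mol^-1 K^-1

def fit_arrhenius(temps_K, rates):
    """Fit R = Ro * exp(-Ea / (R_GAS * T)) by ordinary least squares on the
    linearized form ln R = ln Ro - (Ea / R_GAS) * (1 / T).
    Returns (Ea in J/mol, Ro in the units of `rates`)."""
    xs = [1.0 / T for T in temps_K]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return -slope * R_GAS, math.exp(intercept)

def respiration_rate(T, Ea, Ro):
    """Predict the respiration rate at absolute temperature T (kelvin)."""
    return Ro * math.exp(-Ea / (R_GAS * T))
```

Fitting data generated at 0-15 °C from known (Ea, Ro) recovers the constants, which can then predict the rate at any storage temperature, as the abstract describes.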

  16. Y-12 Plant waste minimization strategy

    International Nuclear Information System (INIS)

    Kane, M.A.

    1987-01-01

    The 1984 Amendments to the Resource Conservation and Recovery Act (RCRA) mandate that waste minimization be a major element of hazardous waste management. In response to this mandate and the increasing costs for waste treatment, storage, and disposal, the Oak Ridge Y-12 Plant developed a waste minimization program to encompass all types of wastes. Thus, waste minimization has become an integral part of the overall waste management program. Unlike traditional approaches, waste minimization focuses on controlling waste at the beginning of production instead of the end. This approach includes: (1) substituting nonhazardous process materials for hazardous ones, (2) recycling or reusing waste effluents, (3) segregating nonhazardous waste from hazardous and radioactive waste, and (4) modifying processes to generate less waste or less toxic waste. An effective waste minimization program must provide the appropriate incentives for generators to reduce their waste and provide the necessary support mechanisms to identify opportunities for waste minimization. This presentation focuses on the Y-12 Plant's strategy to implement a comprehensive waste minimization program. This approach consists of four major program elements: (1) promotional campaign, (2) process evaluation for waste minimization opportunities, (3) waste generation tracking system, and (4) information exchange network. The presentation also examines some of the accomplishments of the program and issues which need to be resolved.

  17. XMSS : a practical forward secure signature scheme based on minimal security assumptions

    NARCIS (Netherlands)

    Buchmann, Johannes; Dahmen, Erik; Hülsing, Andreas; Yang, B.-Y.

    2011-01-01

    We present the hash-based signature scheme XMSS. It is the first provably (forward) secure and practical signature scheme with minimal security requirements: a pseudorandom and a second preimage resistant (hash) function family. Its signature size is reduced to less than 25% compared to the best

  18. Prediction of metabolic flux distribution from gene expression data based on the flux minimization principle.

    Directory of Open Access Journals (Sweden)

    Hyun-Seob Song

    Prediction of possible flux distributions in a metabolic network provides detailed phenotypic information that links metabolism to cellular physiology. To estimate metabolic steady-state fluxes, the most common approach is to solve a set of macroscopic mass balance equations subjected to stoichiometric constraints while attempting to optimize an assumed optimal objective function. This assumption is justifiable in specific cases but may be invalid when tested across different conditions, cell populations, or other organisms. With the aim of providing a more consistent and reliable prediction of flux distributions over a wide range of conditions, in this article we propose a framework that uses the flux minimization principle to predict active metabolic pathways from mRNA expression data. The proposed algorithm minimizes a weighted sum of flux magnitudes, while biomass production can be bounded to fit an ample range from very low to very high values according to the analyzed context. We have formulated the flux weights as a function of the corresponding enzyme reaction's gene expression value, enabling the creation of context-specific fluxes based on a generic metabolic network. In case studies of wild-type Saccharomyces cerevisiae, and wild-type and mutant Escherichia coli strains, our method achieved high prediction accuracy, as gauged by correlation coefficients and sums of squared error, with respect to the experimentally measured values. In contrast to other approaches, our method was able to provide quantitative predictions for both model organisms under a variety of conditions. Our approach requires no prior knowledge or assumption of a context-specific metabolic functionality and does not require trial-and-error parameter adjustments. Thus, our framework is of general applicability for modeling the transcription-dependent metabolism of bacteria and yeasts.
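The flux-minimization principle described above (minimize a weighted sum of flux magnitudes subject to stoichiometric balance, with weights derived from expression data) is in general a linear program. The degenerate sketch below illustrates only the core idea for parallel routes, without an LP solver; the route names, weights and demand value are invented for illustration and are not from the paper.

```python
def minimize_weighted_flux(demand, route_weights):
    """Toy flux minimization: a target metabolite can be produced through
    several parallel routes; minimizing sum(w_i * v_i) subject to v_i >= 0 and
    sum(v_i) == demand routes all flux through the lowest-weight route.
    In the paper's framework the weights w_i are functions of the associated
    enzymes' gene expression, so highly expressed routes carry the flux.
    Returns a {route: flux} dictionary."""
    best = min(route_weights, key=route_weights.get)
    return {r: (demand if r == best else 0.0) for r in route_weights}
```

With a genome-scale stoichiometric matrix the same objective becomes min w·|v| subject to S·v = 0 plus bounds, which is what a general LP solver handles; the parallel-route case above is the simplest instance where the optimum is visible by inspection.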

  19. Minimal open strings

    International Nuclear Information System (INIS)

    Hosomichi, Kazuo

    2008-01-01

    We study FZZT-branes and open string amplitudes in (p, q) minimal string theory. We focus on the simplest boundary changing operators in two-matrix models, and identify the corresponding operators in worldsheet theory through the comparison of amplitudes. Along the way, we find a novel linear relation among FZZT boundary states in minimal string theory. We also show that the boundary ground ring is realized on physical open string operators in a very simple manner, and discuss its use for perturbative computation of higher open string amplitudes.

  20. Minimal Composite Inflation

    DEFF Research Database (Denmark)

    Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco

    2011-01-01

    We investigate models in which the inflaton emerges as a composite field of a four dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity.

  1. Minimal clinically important improvement (MCII) and patient-acceptable symptom state (PASS) in total hip arthroplasty (THA) patients 1 year postoperatively

    DEFF Research Database (Denmark)

    Paulsen, Aksel; Roos, Ewa M.; Pedersen, Alma Becic

    2014-01-01

    Background and purpose - The increased use of patient-reported outcomes (PROs) in orthopedics requires data on estimated minimal clinically important improvements (MCIIs) and patient-acceptable symptom states (PASSs). We wanted to find cut-points corresponding to the minimal clinically important PRO change score and the acceptable postoperative PRO score, by estimating MCII and PASS 1 year after total hip arthroplasty (THA) for the Hip Dysfunction and Osteoarthritis Outcome Score (HOOS) and the EQ-5D. Patients and methods - THA patients from 16 different departments received 2 PROs and additional ... MCIIs corresponded to a ...-55% improvement from mean baseline PRO score, and PASSs corresponded to absolute follow-up scores of 57-91% of the maximum score in THA patients 1 year after surgery. Interpretation - This study improves the interpretability of PRO scores. The different estimation approaches presented may serve as a guide ...

  2. Subject-specific cardiovascular system model-based identification and diagnosis of septic shock with a minimally invasive data set: animal experiments and proof of concept

    International Nuclear Information System (INIS)

    Geoffrey Chase, J; Starfinger, Christina; Hann, Christopher E; Lambermont, Bernard; Ghuysen, Alexandre; Kolh, Philippe; Dauby, Pierre C; Desaive, Thomas; Shaw, Geoffrey M

    2011-01-01

    A cardiovascular system (CVS) model and parameter identification method have previously been validated for identifying different cardiac and circulatory dysfunctions in simulation and using porcine models of pulmonary embolism, hypovolemia with PEEP titrations and induced endotoxic shock. However, these studies required both left and right heart catheters to collect the data required for subject-specific monitoring and diagnosis—a maximally invasive data set in a critical care setting, although it does occur in practice. Hence, use of this model-based diagnostic would require significant additional invasive sensors for some subjects, which is unacceptable in some, if not all, cases. The main goal of this study is to prove the concept of using only measurements from one side of the heart (right) in a 'minimal' data set to identify an effective patient-specific model that can capture key clinical trends in endotoxic shock. This research extends existing methods to a reduced and minimal data set requiring only a single catheter, reducing the risk of infection and other complications—a very common, typical situation in critical care patients, particularly after cardiac surgery. The extended methods and the assumptions on which they are founded are developed and presented in a case study of the identification of pig-specific parameters in an animal model of induced endotoxic shock. This case study is used to define the impact of this minimal data set on the quality and accuracy of the model application for monitoring, detecting and diagnosing septic shock. Six anesthetized healthy pigs weighing 20–30 kg received a 0.5 mg/kg endotoxin infusion over a period of 30 min from T0 to T30. For this research, only right heart measurements were obtained. Errors for the identified model are within 8% when the model is identified from data, re-simulated and then compared to the experimentally measured data, including measurements not used in the

  3. Assessing and minimizing contamination in time of flight based validation data

    Science.gov (United States)

    Lennox, Kristin P.; Rosenfield, Paul; Blair, Brenton; Kaplan, Alan; Ruz, Jaime; Glenn, Andrew; Wurtz, Ronald

    2017-10-01

    Time of flight experiments are the gold standard method for generating labeled training and testing data for the neutron/gamma pulse shape discrimination problem. As the popularity of supervised classification methods increases in this field, there will also be increasing reliance on time of flight data for algorithm development and evaluation. However, time of flight experiments are subject to various sources of contamination that lead to neutron and gamma pulses being mislabeled. Such labeling errors have a detrimental effect on classification algorithm training and testing, and should therefore be minimized. This paper presents a method for identifying minimally contaminated data sets from time of flight experiments and estimating the residual contamination rate. This method leverages statistical models describing neutron and gamma travel time distributions and is easily implemented using existing statistical software. The method produces a set of optimal intervals that balance the trade-off between interval size and nuisance particle contamination, and its use is demonstrated on a time of flight data set for Cf-252. The particular properties of the optimal intervals for the demonstration data are explored in detail.
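The interval-selection idea in this record, trading interval width against nuisance-particle contamination using parametric travel-time models, can be sketched with Gaussian travel-time distributions. The parameters below (arrival-time means, widths, particle fractions) are illustrative assumptions, not the paper's Cf-252 values, and the grid search stands in for the statistical-software fit the authors describe.

```python
import math

def norm_cdf(x, mu, sigma):
    """Standard normal CDF evaluated for a N(mu, sigma) variable."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def best_interval(mu_n, sig_n, mu_g, sig_g, frac_g, max_contamination,
                  lo, hi, steps=200):
    """Grid-search the time interval [a, b] that captures the most neutrons
    while keeping the estimated gamma contamination rate below a threshold.
    Travel times are modeled as Gaussians; frac_g is the overall gamma
    fraction.  Returns (neutron capture prob., a, b, contamination) or None."""
    best = None
    grid = [lo + (hi - lo) * i / steps for i in range(steps + 1)]
    for i, a in enumerate(grid):
        for b in grid[i + 1:]:
            p_n = (1 - frac_g) * (norm_cdf(b, mu_n, sig_n) - norm_cdf(a, mu_n, sig_n))
            p_g = frac_g * (norm_cdf(b, mu_g, sig_g) - norm_cdf(a, mu_g, sig_g))
            if p_n + p_g == 0.0:
                continue
            contamination = p_g / (p_n + p_g)
            if contamination <= max_contamination and (best is None or p_n > best[0]):
                best = (p_n, a, b, contamination)
    return best
```

With slow neutrons (say 60 ns) well separated from prompt gammas (say 5 ns), the optimal interval starts just past the gamma peak: widening it further adds almost no neutrons but admits gamma tail events, which is exactly the size-versus-contamination trade-off the abstract describes.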

  4. Evaluation of methods to estimate the essential amino acids requirements of fish from the muscle amino acid profile

    Directory of Open Access Journals (Sweden)

    Álvaro José de Almeida Bicudo

    2014-03-01

    Many methods to estimate amino acid requirements based on the amino acid profile of fish have been proposed. This study evaluates the methodologies proposed by Meyer & Fracalossi (2005) and by Tacon (1989) to estimate the amino acid requirements of fish, which do not require prior knowledge of the nutritional requirement of a reference amino acid. Data on the amino acid requirements of pacu, Piaractus mesopotamicus, were used to validate the accuracy of those methods. The Meyer & Fracalossi and Tacon methodologies estimated the lysine requirement of pacu, respectively, at 13% and 23% above the requirement determined using the dose-response method. The values estimated by both methods lie within the range of requirements determined for other omnivorous fish species, with the Meyer & Fracalossi (2005) method showing better accuracy.

  5. Minimally Invasive Distal Pancreatectomy: Review of the English Literature.

    Science.gov (United States)

    Wang, Kai; Fan, Ying

    2017-02-01

    The minimally invasive approach results in a better cosmetic result, faster recovery, and shorter length of hospital stay, and its superiority has been progressively recognized as the technique has developed. The minimally invasive approach has been applied to distal pancreatectomy (DP), which is a standard method for the treatment of benign, borderline, and some malignant lesions of the pancreatic body and tail. This article aims to analyze the types, postoperative recovery, and outcomes of laparoscopic distal pancreatectomy (LDP). A systematic search of the scientific literature was performed using PubMed, EMBASE, online journals, and the Internet for all publications on LDP. Articles were selected if the abstract contained patients who underwent LDP for pancreatic diseases. All selected articles were reviewed and analyzed. If there are no contraindications, LDP is suitable for benign, borderline, or malignant tumors of the pancreatic body and tail, and should be performed with preservation of the spleen where possible. LDP is safe and feasible under some conditions for experienced surgeons. Single-incision laparoscopic distal pancreatectomy (S-LDP) and robotic laparoscopic distal pancreatectomy (R-LDP) perioperative outcomes are similar to those of conventional multi-incision laparoscopic distal pancreatectomy (C-LDP), and the advantages of S-LDP and R-LDP require further exploration. With the application of an enhanced recovery program (ERP), length of hospital stay and costs are reduced. Compared with open distal pancreatectomy, LDP has many advantages; a trend was observed for LDP to replace traditional open surgery. LDP combined with ERP is expected to become standard in the treatment of pancreatic body and tail lesions.

  6. KidReporter : a method for engaging children in making a newspaper to gather user requirements

    NARCIS (Netherlands)

    Bekker, M.M.; Beusmans, J.; Keyson, D.V.; Lloyd, P.A.; Bekker, M.M.; Markopoulos, P.; Tsikalkina, M.

    2002-01-01

    We describe a design method, called the KidReporter method, for gathering user requirements from children. Two school classes participated in making a newspaper about a zoo, to gather requirements for the design process of an interactive educational game. The educational game was developed to

  7. A procedure to compute equilibrium concentrations in multicomponent systems by Gibbs energy minimization on spreadsheets

    International Nuclear Information System (INIS)

    Lima da Silva, Aline; Heck, Nestor Cesar

    2003-01-01

    Equilibrium concentrations are traditionally calculated with the help of equilibrium constant equations from selected reactions. This procedure, however, is only useful for simpler problems. Analysis of the equilibrium state in a multicomponent and multiphase system necessarily involves solution of several simultaneous equations, and, as the number of system components grows, the required computation becomes more complex and tedious. A more direct and general method for solving the problem is the direct minimization of the Gibbs energy function. The solution of the nonlinear problem consists in minimizing the objective function (the Gibbs energy of the system) subject to the constraints of the elemental mass balance. To solve it, usually a computer code is developed, which requires considerable testing and debugging efforts. In this work, a simple method to predict equilibrium composition in multicomponent systems is presented, which makes use of an electronic spreadsheet. The ability to carry out these calculations within a spreadsheet environment shows several advantages. First, spreadsheets are available 'universally' on nearly all personal computers. Second, the input and output capabilities of spreadsheets can be effectively used to monitor calculated results. Third, no additional systems or programs need to be learned. In this way, spreadsheets are as suitable for computing equilibrium concentrations as for use as teaching and learning aids. This work describes, therefore, the use of the Solver tool, contained in the Microsoft Excel spreadsheet package, for computing equilibrium concentrations in a multicomponent system by the method of direct Gibbs energy minimization. The four-phase Fe-Cr-O-C-Ni system is used as an example to illustrate the method proposed. The pure stoichiometric phases considered in the equilibrium calculations are: Cr2O3(s) and FeO·Cr2O3(s). The atmosphere consists of O2, CO and CO2 constituents. The liquid iron
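The direct-minimization idea in this record can be shown in one dimension without a spreadsheet: for an ideal isomerization A ⇌ B at fixed T and P, minimizing the total Gibbs energy over the composition reproduces the equilibrium-constant result x/(1-x) = K. This is a deliberately simplified sketch (it sets the reference chemical potential of A to zero and uses RT = 1 in arbitrary units); the paper's four-phase Fe-Cr-O-C-Ni system needs a multi-variable constrained solver such as Excel's Solver.

```python
import math

def gibbs(x, dG0, RT=1.0):
    """Total Gibbs energy (arbitrary reference) of an ideal A/B mixture with
    mole fraction x of B, for the reaction A <-> B with standard Gibbs energy
    change dG0.  The log terms are the ideal-mixing contribution."""
    return x * dG0 + RT * (x * math.log(x) + (1 - x) * math.log(1 - x))

def equilibrium_x(dG0, RT=1.0, steps=100_000):
    """Find the composition minimizing G by a fine grid search on (0, 1);
    the analytic minimum satisfies x / (1 - x) = exp(-dG0 / RT) = K."""
    best_x, best_g = None, float("inf")
    for i in range(1, steps):
        x = i / steps
        g = gibbs(x, dG0, RT)
        if g < best_g:
            best_x, best_g = x, g
    return best_x
```

With dG0 = -RT ln 2 (i.e. K = 2) the minimizer is x = 2/3, the same answer an equilibrium-constant calculation gives; in a spreadsheet, Solver plays the role of the grid search while also enforcing the elemental mass-balance constraints.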

  8. Fault tree construction of hybrid system requirements using qualitative formal method

    International Nuclear Information System (INIS)

    Lee, Jang-Soo; Cha, Sung-Deok

    2005-01-01

    When specifying requirements for software controlling hybrid systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. In this paper, we propose Causal Requirements Safety Analysis (CRSA) as a technique to qualitatively evaluate the causal relationship between software faults and physical hazards. This technique, which extends the qualitative formal method process and utilizes information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationships among them. Using a simplified electrical power system as an example, we describe step-by-step procedures for conducting CRSA. Our experience of applying CRSA to perform fault tree analysis on requirements for the Wolsong nuclear power plant shutdown system indicates that CRSA is an effective technique in assisting safety engineers.

  9. A safe inexpensive method to isolate high quality plant and fungal ...

    African Journals Online (AJOL)

    The most commonly used plant DNA isolation methods use toxic and hazardous chemicals (phenol, chloroform), which require special equipment to minimize exposure and may limit their use in certain environments. Commercial DNA extraction kits are convenient and usually safe, but their availability to certain developing ...

  10. Enumeration of minimal stoichiometric precursor sets in metabolic networks.

    Science.gov (United States)

    Andrade, Ricardo; Wannagat, Martin; Klein, Cecilia C; Acuña, Vicente; Marchetti-Spaccamela, Alberto; Milreu, Paulo V; Stougie, Leen; Sagot, Marie-France

    2016-01-01

    What an organism needs at least from its environment to produce a set of metabolites, e.g. target(s) of interest and/or biomass, has been called a minimal precursor set. Early approaches to enumerate all minimal precursor sets took into account only the topology of the metabolic network (topological precursor sets). Due to cycles and the stoichiometric values of the reactions, it is often not possible to produce the target(s) from a topological precursor set, in the sense that there is no feasible flux. Although considering the stoichiometry makes the problem harder, it makes it possible to obtain biologically reasonable precursor sets, which we call stoichiometric. Recently a method to enumerate all minimal stoichiometric precursor sets was proposed in the literature. The relationship between topological and stoichiometric precursor sets had, however, not yet been studied; we highlight that relationship here. We also present two algorithms that enumerate all minimal stoichiometric precursor sets. The first one is of theoretical interest only and is based on the above mentioned relationship. The second approach solves a series of mixed integer linear programming problems. We compared the computed minimal precursor sets to experimentally obtained growth media of several Escherichia coli strains using genome-scale metabolic networks. The results show that the second approach efficiently enumerates minimal precursor sets taking stoichiometry into account, and allows for broad in silico studies of strains or species interactions that may help to understand e.g. pathotype and niche-specific metabolic capabilities. sasita is written in Java, uses cplex as LP solver and can be downloaded together with all networks and input files used in this paper at http://www.sasita.gforge.inria.fr.
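The topological variant discussed above can be made concrete: treat each reaction as substrates → products, compute the closure of producible metabolites from a candidate source set, and keep the subset-minimal source sets that reach the target. This brute-force sketch covers topological precursor sets only; it ignores stoichiometry and flux feasibility, which is exactly the gap the stoichiometric method in this record closes, and the toy network in the usage note is invented for illustration.

```python
from itertools import combinations

def closure(sources, reactions):
    """All metabolites producible from `sources` by repeatedly firing any
    reaction whose substrates are all available (topological semantics)."""
    have = set(sources)
    changed = True
    while changed:
        changed = False
        for subs, prods in reactions:
            if subs <= have and not prods <= have:
                have |= prods
                changed = True
    return have

def minimal_topological_precursor_sets(candidates, reactions, target):
    """Enumerate subset-minimal source sets whose closure contains `target`
    by brute force over all subsets of the candidate sources."""
    feasible = [set(c) for k in range(1, len(candidates) + 1)
                for c in combinations(candidates, k)
                if target in closure(c, reactions)]
    return [s for s in feasible if not any(o < s for o in feasible)]
```

On the toy network A→B, B+C→T, D→T with candidate sources A, C and D, the minimal topological precursor sets for target T are {A, C} and {D}; a stoichiometric method would additionally reject any such set that admits no feasible flux.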

  11. Methods for ensuring compliance with regulatory requirements: regulators and operators

    International Nuclear Information System (INIS)

    Fleischmann, A.W.

    1989-01-01

    Some of the methods of ensuring compliance with regulatory requirements contained in various radiation protection documents such as Regulations, ICRP Recommendations etc. are considered. These include radiation safety officers and radiation safety committees, personnel monitoring services, dissemination of information, inspection services and legislative power of enforcement. Difficulties in ensuring compliance include outmoded legislation, financial and personnel constraints

  12. Minimally invasive approaches for the treatment of inflammatory bowel disease

    Institute of Scientific and Technical Information of China (English)

    Marco Zoccali; Alessandro Fichera

    2012-01-01

    Despite significant improvements in the medical management of inflammatory bowel disease, many of these patients still require surgery at some point in the course of their disease. Their young age and poor general condition, worsened by the aggressive medical treatments, make minimally invasive approaches particularly enticing for this patient population. However, the typical inflammatory changes that characterize these diseases have hindered wide diffusion of laparoscopy in this setting, which is currently mostly pursued in high-volume referral centers, despite accumulating evidence in the literature supporting the benefits of minimally invasive surgery. The largest body of evidence currently available, for terminal ileal Crohn's disease, shows improved short-term outcomes after laparoscopic surgery, with prolonged operative times. For Crohn's colitis, high-quality evidence supporting laparoscopic surgery is lacking. Encouraging preliminary results have been obtained with the adoption of laparoscopic restorative total proctocolectomy for the treatment of ulcerative colitis. A consensus about patient selection and the need for staging has not been reached yet. Despite the lack of conclusive evidence, a wave of enthusiasm is pushing towards less invasive strategies to further minimize surgical trauma, with single-incision laparoscopic surgery being the most realistic future development.

  13. Proper project planning helps minimize overruns and delays

    International Nuclear Information System (INIS)

    Donnelly, G.; Cooney, D.J.

    1994-01-01

    This paper describes planning methods that help minimize cost overruns during the construction of oil and gas pipelines. The steps include background data collection, field surveys, determination of preliminary pipeline routes, regulatory agency pre-application meetings, and preliminary engineering. Planning methods also include preliminary aerial mapping, biological assessments, cultural resources investigations, wetlands delineation, geotechnical investigations, and environmental audits. Early identification of potential problems allows the pipeline to be rerouted, or remediation to begin, before the issues are raised during the permitting process. Coordinating these activities from the very beginning produces significant cost savings by avoiding the need to rebudget after the permitting process starts.

  14. Products of Snowflaked Euclidean Lines Are Not Minimal for Looking Down

    Directory of Open Access Journals (Sweden)

    Joseph Matthieu

    2017-11-01

    We show that products of snowflaked Euclidean lines are not minimal for looking down. This question was raised as Problem 11.17 in Fractured fractals and broken dreams by David and Semmes. The proof uses arguments developed by Le Donne, Li and Rajala to prove that the Heisenberg group is not minimal for looking down. By a method of shortcuts, we define a new distance d such that the product of snowflaked Euclidean lines looks down on (R^N, d), but not vice versa.
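
    For readers outside metric geometry: a snowflaked Euclidean line is the real line with its usual distance raised to a power α ∈ (0,1). A minimal sketch of the definitions follows; the product metric shown is one standard illustrative choice, and the particular distance d constructed in the paper is different and is not reproduced here.

```latex
% Snowflaked Euclidean line: \mathbb{R} with the snowflaked metric
d_\alpha(x, y) = |x - y|^{\alpha}, \qquad 0 < \alpha < 1.
% One standard metric on a product of such lines (illustrative choice):
d\bigl((x_1,\dots,x_N), (y_1,\dots,y_N)\bigr)
    = \max_{1 \le i \le N} |x_i - y_i|^{\alpha_i}.
```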

  15. Minimal Flavour Violation and Beyond

    CERN Document Server

    Isidori, Gino

    2012-01-01

    We review the formulation of the Minimal Flavour Violation (MFV) hypothesis in the quark sector, as well as some "variations on a theme" based on smaller flavour symmetry groups and/or less minimal breaking terms. We also review how these hypotheses can be tested in B decays and by means of other flavour-physics observables. The phenomenological consequences of MFV are discussed both in general terms, employing a general effective theory approach, and in the specific context of the Minimal Supersymmetric extension of the SM.

  16. Defining patient-based minimal clinically important effect sizes: a study in palliative radiotherapy for painful unresectable pelvic recurrences from rectal cancer

    International Nuclear Information System (INIS)

    Wong, Rebecca K.S.; Gafni, Amiram; Whelan, Tim; Franssen, Edmee; Fung, Karen

    2002-01-01

    Purpose: To measure patient-based minimal clinically important effect sizes (minimal incremental benefit that an individual would require to accept one treatment option over another) for pain relief between two contrasting palliative radiotherapy regimens for painful pelvic recurrences from rectal cancer. Methods and Materials: Forty-three patients with a history of cancer pain without prior pelvic radiotherapy participated in decision aid-facilitated trade-off exercises. The clinical scenario and treatment options of a 5-day vs. a 20-day course of radiotherapy were described. The duration of pain relief for the 20-day regimen was increased until the respondents' preferences switched to the 20-day regimen. The exercises were repeated for different probabilities of benefit and pain intensity at the time of decision making. Results: When the probability of pain relief was unchanged, the median switch point for the duration of pain relief was 6.7 and 7.2 months for severe and mild pain, respectively. The cumulative percentage frequency curve for the switch points approximated a sigmoid distribution. Conclusion: Determining the minimal clinically important effect sizes for symptom relief for palliative therapies is feasible. This type of information can be used to incorporate patient values into clinical trial designs. Modification of this method can be used to improve our understanding of shared (physician and patient) decision making

  17. Effects of Camera Arrangement on Perceptual-Motor Performance in Minimally Invasive Surgery

    Science.gov (United States)

    Delucia, Patricia R.; Griswold, John A.

    2011-01-01

    Minimally invasive surgery (MIS) is performed for a growing number of treatments. Whereas open surgery requires large incisions, MIS relies on small incisions through which instruments are inserted and tissues are visualized with a camera. MIS results in benefits for patients compared with open surgery, but degrades the surgeon's perceptual-motor…

  18. Complications of Minimally Invasive, Tubular Access Surgery for Cervical, Thoracic, and Lumbar Surgery

    Directory of Open Access Journals (Sweden)

    Donald A. Ross

    2014-01-01

    The object of the study was to review the author's large series of minimally invasive spine surgeries for complication rates. The author reviewed a personal operative database for minimal access spine surgeries done through nonexpandable tubular retractors for extradural, nonfusion procedures. Consecutive cases (n=1231) were reviewed for complications. There were no wound infections. Durotomy occurred in 33 cases (2.7% overall, or 3.4% of lumbar cases). There were no external or symptomatic internal cerebrospinal fluid leaks or pseudomeningoceles requiring additional treatment. The only motor injuries were 3 C5 root palsies, 2 of which resolved. Minimally invasive spine surgery performed through tubular retractors can result in a low wound infection rate when compared to open surgery. Durotomy is no more common than in open procedures and does not often result in the need for secondary procedures. New neurologic deficits are uncommon, with most observed at the C5 root. Minimally invasive spine surgery, even without benefits such as less pain or shorter hospital stays, can result in considerably lower complication rates than open surgery.

  19. An iterative method for determination of a minimal eigenvalue

    DEFF Research Database (Denmark)

    Kristiansen, G.K.

    1968-01-01

    Kristiansen (1963) has discussed the convergence of a group of iterative methods (denoted the Equipoise methods) for the solution of reactor criticality problems. The main result was that even though the methods are said to work satisfactorily in all practical cases, examples of divergence can be...
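
    The abstract above (truncated in this record) concerns iterative determination of a minimal eigenvalue. As a generic illustration, not the Equipoise method discussed by Kristiansen, a shifted power iteration recovers the smallest eigenvalue of a symmetric matrix; the example matrix and shift below are invented for the sketch.

```python
# Shifted power iteration for the minimal eigenvalue of a symmetric matrix A.
# With the shift s chosen above the largest eigenvalue, B = s*I - A has
# dominant eigenvalue s - lambda_min, which plain power iteration finds.

def mat_vec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def min_eigenvalue(A, shift, iters=500):
    n = len(A)
    B = [[(shift if i == j else 0.0) - A[i][j] for j in range(n)]
         for i in range(n)]
    v = [1.0] * n
    for _ in range(iters):
        w = mat_vec(B, v)
        norm = max(abs(x) for x in w)
        v = [x / norm for x in w]
    # Rayleigh quotient of B at the converged vector, mapped back to A
    Bv = mat_vec(B, v)
    mu = sum(x * y for x, y in zip(v, Bv)) / sum(x * x for x in v)
    return shift - mu

A = [[4.0, 1.0],
     [1.0, 3.0]]
lam_min = min_eigenvalue(A, shift=10.0)  # exact answer: (7 - sqrt(5)) / 2
```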

  20. Entropy resistance minimization: An alternative method for heat exchanger analyses

    International Nuclear Information System (INIS)

    Cheng, XueTao

    2013-01-01

    In this paper, the concept of entropy resistance is proposed based on the entropy generation analyses of heat transfer processes. It is shown that smaller entropy resistance leads to larger heat transfer rate with fixed thermodynamic force difference and smaller thermodynamic force difference with fixed heat transfer rate, respectively. For the discussed two-stream heat exchangers in which the heat transfer rates are not given and the three-stream heat exchanger with prescribed heat capacity flow rates and inlet temperatures of the streams, smaller entropy resistance leads to larger heat transfer rate. For the two-stream heat exchangers with fixed heat transfer rate, smaller entropy resistance leads to larger effectiveness. Furthermore, it is shown that smaller values of the concepts of entropy generation numbers and modified entropy generation number do not always correspond to better performance of the discussed heat exchangers. - Highlights: • The concept of entropy resistance is defined for heat exchangers. • The concepts based on entropy generation are used to analyze heat exchangers. • Smaller entropy resistance leads to better performance of heat exchangers. • The applicability of entropy generation minimization is conditional
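
    The Ohm's-law analogy stated in the abstract can be made concrete with a small numeric sketch. Treating the thermodynamic force difference (1/T_c - 1/T_h) as the driving potential for the heat transfer rate Q suggests a resistance R_s = (1/T_c - 1/T_h)/Q, so that the entropy generation rate is S_gen = Q(1/T_c - 1/T_h) = Q^2 R_s. The exact definition used in the paper may differ, and the temperatures and heat transfer rate below are invented.

```python
# Entropy-resistance sketch: with a fixed force difference (1/T_c - 1/T_h),
# a smaller R_s corresponds to a larger heat transfer rate Q, matching the
# qualitative claim of the abstract.  (Hypothetical definition and numbers.)

def entropy_resistance(q, t_hot, t_cold):
    """R_s = (1/T_c - 1/T_h) / Q, by analogy with electrical resistance."""
    return (1.0 / t_cold - 1.0 / t_hot) / q

def entropy_generation(q, t_hot, t_cold):
    """S_gen = Q * (1/T_c - 1/T_h); equals Q**2 * R_s by construction."""
    return q * (1.0 / t_cold - 1.0 / t_hot)

q = 1000.0                                   # heat transfer rate, W
r_s = entropy_resistance(q, 600.0, 300.0)    # 1/(W K)
s_gen = entropy_generation(q, 600.0, 300.0)  # W/K
```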

  1. Mixed waste and waste minimization: The effect of regulations and waste minimization on the laboratory

    International Nuclear Information System (INIS)

    Dagan, E.B.; Selby, K.B.

    1993-08-01

    The Hanford Site is located in the State of Washington and is subject to state and federal environmental regulations that hamper waste minimization efforts. This paper addresses the negative effect of these regulations on waste minimization and on mixed waste issues related to the Hanford Site, as well as the prospect of the regulations becoming more lenient. In addition to field operations, the Hanford Site is home to the Pacific Northwest Laboratory, which has many ongoing waste minimization activities of particular interest to laboratories.

  2. 42 CFR 84.146 - Method of measuring the power and torque required to operate blowers.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Method of measuring the power and torque required... RESPIRATORY PROTECTIVE DEVICES Supplied-Air Respirators § 84.146 Method of measuring the power and torque.... These are used to facilitate timing. To determine the torque or horsepower required to operate the...

  3. Minimally-invasive posterior lumbar stabilization for degenerative low back pain and sciatica. A review

    Energy Technology Data Exchange (ETDEWEB)

    Bonaldi, G., E-mail: bbonaldi@yahoo.com [Neuroradiology Department, Ospedale Papa Giovanni XXIII, Bergamo (Italy); Brembilla, C. [Department of neurosurgery, Ospedale Papa Giovanni XXIII, Bergamo (Italy); Cianfoni, A. [Neuroradiology of Neurocenter of Italian Switzerland, Lugano, CH (Switzerland)

    2015-05-15

    The most widespread surgical techniques for stabilization of the painful, degenerated and unstable lumbar spine, represented by transpedicular screws and rods instrumentation with or without interbody cages or disk replacements, require widely open and/or difficult, poorly anatomical accesses. However, such surgical techniques and approaches, although still considered “standard of care”, are burdened by high costs, long recovery times and several potential complications. Hence the effort to open new minimally-invasive surgical approaches to eliminate painful abnormal motion. The surgical and radiological communities have been exploring, for more than a decade, alternative, minimally-invasive or even percutaneous techniques to fuse and lock an unstable lumbar segment. Another promising line of research is represented by the so-called dynamic stabilization (non-fusion or motion preservation back surgery), which aims to provide stabilization to the lumbar spinal units (SUs) while maintaining their mobility and function. Risks of potential complications of traditional fusion methods (infection, CSF leaks, harvest site pain, instrumentation failure) are reduced, particularly transitional disease (i.e., the biomechanical stresses imposed on the adjacent segments, resulting in delayed degenerative changes in adjacent facet joints and discs). Dynamic stabilization modifies the distribution of loads within the SU, moving them away from sensitive (painful) areas of the SU. Basic biomechanics of the SU will be discussed, to clarify the mode of action of the different posterior stabilization devices. Most devices are minimally invasive or percutaneous, thus accessible to radiologists’ interventional practice. Devices will be described, together with indications for patient selection, surgical approaches and possible complications.

  4. Minimally-invasive posterior lumbar stabilization for degenerative low back pain and sciatica. A review

    International Nuclear Information System (INIS)

    Bonaldi, G.; Brembilla, C.; Cianfoni, A.

    2015-01-01

    The most widespread surgical techniques for stabilization of the painful, degenerated and unstable lumbar spine, represented by transpedicular screws and rods instrumentation with or without interbody cages or disk replacements, require widely open and/or difficult, poorly anatomical accesses. However, such surgical techniques and approaches, although still considered “standard of care”, are burdened by high costs, long recovery times and several potential complications. Hence the effort to open new minimally-invasive surgical approaches to eliminate painful abnormal motion. The surgical and radiological communities have been exploring, for more than a decade, alternative, minimally-invasive or even percutaneous techniques to fuse and lock an unstable lumbar segment. Another promising line of research is represented by the so-called dynamic stabilization (non-fusion or motion preservation back surgery), which aims to provide stabilization to the lumbar spinal units (SUs) while maintaining their mobility and function. Risks of potential complications of traditional fusion methods (infection, CSF leaks, harvest site pain, instrumentation failure) are reduced, particularly transitional disease (i.e., the biomechanical stresses imposed on the adjacent segments, resulting in delayed degenerative changes in adjacent facet joints and discs). Dynamic stabilization modifies the distribution of loads within the SU, moving them away from sensitive (painful) areas of the SU. Basic biomechanics of the SU will be discussed, to clarify the mode of action of the different posterior stabilization devices. Most devices are minimally invasive or percutaneous, thus accessible to radiologists’ interventional practice. Devices will be described, together with indications for patient selection, surgical approaches and possible complications

  5. What Does It Take to Change an Editor's Mind? Identifying Minimally Important Difference Thresholds for Peer Reviewer Rating Scores of Scientific Articles.

    Science.gov (United States)

    Callaham, Michael; John, Leslie K

    2018-01-05

    We define a minimally important difference for the Likert-type scores frequently used in scientific peer review (similar to existing minimally important differences for scores in clinical medicine). The magnitude of score change required to change editorial decisions has not been studied, to our knowledge. Experienced editors at a journal in the top 6% by impact factor were asked how large a change of rating in "overall desirability for publication" was required to trigger a change in their initial decision on an article. Minimally important differences were assessed twice for each editor: once assessing the rating change required to shift the editor away from an initial decision to accept, and the other assessing the magnitude required to shift away from an initial rejection decision. Forty-one editors completed the survey (89% response rate). In the acceptance frame, the median minimally important difference was 0.4 points on a scale of 1 to 5. Editors required a greater rating change to shift from an initial rejection decision; in the rejection frame, the median minimally important difference was 1.2 points. Within each frame, there was considerable heterogeneity: in the acceptance frame, 38% of editors did not change their decision within the maximum available range; in the rejection frame, 51% did not. To our knowledge, this is the first study to determine the minimally important difference for Likert-type ratings of research article quality, or in fact any nonclinical scientific assessment variable. Our findings may be useful for future research assessing whether changes to the peer review process produce clinically meaningful differences in editorial decisionmaking. Copyright © 2017 American College of Emergency Physicians. Published by Elsevier Inc. All rights reserved.

  6. Decision Optimization of Machine Sets Taking Into Consideration Logical Tree Minimization of Design Guidelines

    Science.gov (United States)

    Deptuła, A.; Partyka, M. A.

    2014-08-01

    The method of minimization of complex partial multi-valued logical functions determines the degree of importance of construction and exploitation parameters that play the role of logical decision variables. Logical functions are taken into consideration in problems of modelling machine sets. For multi-valued logical functions with weighting products, it is possible to use a modified Quine–McCluskey algorithm for the minimization of multi-valued functions. Taking the weighting coefficients into account in the logical tree minimization reflects the physical model of the object being analysed much better.
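
    The core merging step of Quine–McCluskey minimization can be sketched for the ordinary binary case; the modified multi-valued algorithm with weighting products used in the paper is not reproduced here.

```python
# Binary Quine-McCluskey sketch: repeatedly merge implicants (strings over
# '0', '1', '-') that differ in exactly one defined bit; whatever never
# merges is a prime implicant.

def merge(a, b):
    """Combine two implicants differing in exactly one non-dash position."""
    diff = [i for i, (x, y) in enumerate(zip(a, b)) if x != y]
    if len(diff) == 1 and '-' not in (a[diff[0]], b[diff[0]]):
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

def prime_implicants(minterms, nbits):
    terms = {format(m, '0{}b'.format(nbits)) for m in minterms}
    primes = set()
    while terms:
        merged, used = set(), set()
        for a in terms:
            for b in terms:
                c = merge(a, b)
                if c is not None:
                    merged.add(c)
                    used.update({a, b})
        primes |= terms - used   # implicants that could not be merged
        terms = merged
    return primes
```

For example, the two-variable function with minterms {0, 1, 2} yields the prime implicants '0-' and '-0', i.e. x' + y'.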

  7. Specialized minimal PDFs for optimized LHC calculations

    CERN Document Server

    Carrazza, Stefano; Kassabov, Zahari; Rojo, Juan

    2016-04-15

    We present a methodology for the construction of parton distribution functions (PDFs) designed to provide an accurate representation of PDF uncertainties for specific processes or classes of processes with a minimal number of PDF error sets: specialized minimal PDF sets, or SM-PDFs. We construct these SM-PDFs in such a way that sets corresponding to different input processes can be combined without losing information, specifically on their correlations, and that they are robust upon smooth variations of the kinematic cuts. The proposed strategy never discards information, so that the SM-PDF sets can be enlarged by the addition of new processes, until the prior PDF set is eventually recovered for a large enough set of processes. We illustrate the method by producing SM-PDFs tailored to Higgs, top quark pair, and electroweak gauge boson physics, and determine that, when the PDF4LHC15 combined set is used as the prior, around 11, 4 and 11 Hessian eigenvectors respectively are enough to fully describe the corresp...

  8. USE OF EXCEL WORKSHEETS WITH USER-FRIENDLY INTERFACE IN BATCH PROCESS (PSBP) TO MINIMIZE THE MAKESPAN

    Directory of Open Access Journals (Sweden)

    Rony Peterson da Rocha

    2014-01-01

    In the chemical industry, the need for scheduling is becoming more pronounced, especially in batch production mode. Nowadays, planning industrial activities is a necessity for survival. Intense competition requires diversified products and delivery in accordance with the requirements of consumers. These activities require quick decision making at the lowest possible cost, achieved through efficient production scheduling. This work addresses the permutation flow shop scheduling problem, characterized as Production Scheduling in Batch Process (PSBP), with the objective of minimizing the total time to complete the schedule (makespan). One method of approaching the production scheduling problem is to formulate it as a Mixed Integer Linear Program (MILP) and solve it using commercial mathematical programming packages. In this study, an electronic spreadsheet with a user-friendly interface (ESUFI) was developed in Microsoft Excel. The ease of manipulation of the ESUFI is evident, as the VBA language allows a user-friendly interface to be created between the user and the spreadsheet itself. The results showed that it is possible to use the ESUFI for small problems.
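
    To give the flavor of the scheduling problem, the makespan of a permutation flow shop schedule can be computed directly, and a tiny instance solved by brute force over all job orders. The processing times are invented for the sketch; the MILP formulation and the spreadsheet interface of the paper are not reproduced.

```python
# Permutation flow shop: every machine processes jobs in the same order;
# the makespan is the completion time of the last job on the last machine.
from itertools import permutations

def makespan(order, proc):
    """proc[j][m] = processing time of job j on machine m."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines          # current finish time per machine
    for j in order:
        for m in range(n_machines):
            start = max(finish[m], finish[m - 1] if m > 0 else 0.0)
            finish[m] = start + proc[j][m]
    return finish[-1]

proc = [[3, 2],   # job 0 on machines 0, 1
        [1, 4],   # job 1
        [2, 2]]   # job 2
best = min(permutations(range(len(proc))), key=lambda p: makespan(p, proc))
```

Brute force is only viable for a handful of jobs (n! orders), which is why MILP formulations or heuristics are used in practice.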

  9. Minimizing waste in environmental restoration

    International Nuclear Information System (INIS)

    Thuot, J.R.; Moos, L.

    1996-01-01

    Environmental restoration, decontamination and decommissioning, and facility dismantlement projects are not typically known for their waste minimization and pollution prevention efforts. Typical projects are driven by schedules and milestones with little attention given to cost or waste minimization. Conventional wisdom in these projects is that the waste already exists and cannot be reduced or minimized; however, there are significant areas where waste and cost can be reduced by careful planning and execution. Waste reduction can occur in three ways: beneficial reuse or recycling, segregation of waste types, and reducing generation of secondary waste

  10. Simultaneous minimizing monitor units and number of segments without leaf end abutment for segmental intensity modulated radiation therapy delivery

    International Nuclear Information System (INIS)

    Li Kaile; Dai Jianrong; Ma Lijun

    2004-01-01

    Leaf end abutment is seldom studied in the delivery of segmental intensity modulated radiation therapy (IMRT) fields. We developed an efficient leaf sequencing method to eliminate leaf end abutment for segmental IMRT delivery. Our method uses simple matrix and sorting operations to obtain a solution that simultaneously minimizes total monitor units and the number of segments without leaf end abutment between segments. We implemented and demonstrated our method on multiple clinical cases, and compared its results with those of an exhaustive search method. We found that our solution without leaf end abutment produced results equivalent to the unconstrained solutions in terms of minimum total monitor units and minimum number of leaf segments. We conclude that leaf end abutment can be avoided without affecting the efficiency of segmental IMRT delivery. The major strength of our method is its simplicity and high computing speed. This potentially provides a useful means of generating segmental IMRT fields that require high spatial resolution or complex intensity distributions.
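
    The abstract's matrix-and-sorting construction is not spelled out, but a classical related result conveys why monitor units can be counted analytically: for a single leaf pair delivering a 1D intensity profile, the minimum total monitor units equals the sum of the positive increments of the profile. The sketch below shows only that counting rule, not the authors' segmental sequencing method.

```python
# Minimum monitor units for a 1D profile delivered by one left/right leaf
# pair: each upward step of the profile must be "paid for" in monitor
# units, while downward steps are free (the opposing leaf simply closes in).

def min_monitor_units(profile):
    prev, total = 0, 0
    for level in profile:
        if level > prev:
            total += level - prev
        prev = level
    return total
```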

  11. A strategy to find minimal energy nanocluster structures.

    Science.gov (United States)

    Rogan, José; Varas, Alejandro; Valdivia, Juan Alejandro; Kiwi, Miguel

    2013-11-05

    An unbiased strategy to search for the global and local minimal-energy structures of free-standing nanoclusters is presented. Our objectives are twofold: to find a diverse set of low-lying local minima, as well as the global minimum. To do so, we make massive use of the fast inertial relaxation engine (FIRE) algorithm as an efficient local minimizer. This procedure turns out to be quite efficient in reaching the global minimum, and also most of the local minima. We test the method with the Lennard-Jones (LJ) potential, for which an abundant literature exists, and obtain novel results, which include a new local minimum for LJ13, 10 new local minima for LJ14, and thousands of new local minima for 15≤N≤65. Insights on how to choose the initial configurations, analyzing the effectiveness of the method in reaching low-energy structures, including the global minimum, are developed as a function of the number of atoms in the cluster. Also, a novel characterization of the potential energy surface, analyzing properties of the local minima basins, is provided. The procedure constitutes a promising tool to generate a diverse set of cluster conformations, both two- and three-dimensional, that can be used as input for refinement by means of ab initio methods. Copyright © 2013 Wiley Periodicals, Inc.
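
    A toy version of the local-minimization step can be sketched with plain gradient descent on the simplest Lennard-Jones system, the dimer, whose minimum in reduced units lies at separation r = 2^(1/6) with energy -1. The FIRE algorithm used in the paper is a more sophisticated relaxation scheme and is not reproduced here.

```python
# Gradient-descent relaxation of a Lennard-Jones dimer (reduced units,
# epsilon = sigma = 1).  Illustrative stand-in for the FIRE minimizer.

def lj_energy(r):
    return 4.0 * (r ** -12 - r ** -6)

def lj_force(r):
    """Force along the bond, -dE/dr."""
    return 4.0 * (12.0 * r ** -13 - 6.0 * r ** -7)

def minimize_dimer(r0=1.5, step=0.01, iters=2000):
    r = r0
    for _ in range(iters):
        r += step * lj_force(r)   # move down the energy gradient
    return r

r_min = minimize_dimer()          # converges to 2**(1/6), about 1.1225
```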

  12. Claus sulphur recovery potential approaches 99% while minimizing cost

    Energy Technology Data Exchange (ETDEWEB)

    Berlie, E M

    1974-01-21

    In a summary of a paper presented to the fourth joint engineering conference of the American Institute of Chemical Engineers and the Canadian Society for Chemical Engineering, the Claus process is discussed in a modern setting. Some problems faced in the operation of sulfur recovery plants include (1) strict pollution control regulations; (2) design and operation of existing plants; (3) knowledge of process fundamentals; (4) performance testing; (5) specification of feed gas; (6) catalyst life; (7) instrumentation and process control; and (8) quality of feed gas. Some of the factors which must be considered in order to achieve the ultimate capability of the Claus process are listed. There is strong evidence to support the contention that plant operators are reluctant to accept new fundamental knowledge of the Claus sulfur recovery process and are not taking advantage of its inherent potential to achieve the emission standards required, to minimize cost of tail gas cleanup systems and to minimize operating costs.

  13. Lipofilling With Minimal Access Cranial Suspension Lifting for Enhanced Rejuvenation

    NARCIS (Netherlands)

    Willemsen, Joep C. N.; Mulder, Karlijn M.; Stevens, Hieronymus P. J. D.

    Background: Loss of volume is an important aspect in facial aging, but its relevance is frequently neglected during treatment. Objectives: The authors discuss lipofilling as an ancillary procedure to improve the impact of facelifting procedures. Methods: Fifty patients who underwent minimal access

  14. Reduction of Large Dynamical Systems by Minimization of Evolution Rate

    Science.gov (United States)

    Girimaji, Sharath S.

    1999-01-01

    Reduction of a large system of equations to a lower-dimensional system of similar dynamics is investigated. For dynamical systems with disparate timescales, a criterion for determining redundant dimensions and a general reduction method based on the minimization of evolution rate are proposed.

  15. A mixed-methods systematic review protocol to examine the use of physical restraint with critically ill adults and strategies for minimizing their use

    Directory of Open Access Journals (Sweden)

    Louise Rose

    2016-11-01

    Background: Critically ill patients frequently experience severe agitation, placing them at risk of harm. Physical restraint is common in intensive care units (ICUs) owing to clinician concerns about safety. However, physical restraint may not prevent medical device removal and has been associated with negative physical and psychological consequences. While professional society guidelines, legislation, and accreditation standards recommend physical restraint minimization, guidelines for critically ill patients are over a decade old, with recommendations that are non-specific. Our systematic review will synthesize evidence on physical restraint in critically ill adults with the primary objective of identifying effective minimization strategies. Methods: Two authors will independently search the following from inception to July 2016: Ovid MEDLINE, CINAHL, Embase, Web of Science, Cochrane Library, PROSPERO, Joanna Briggs Institute, grey literature, professional society websites, and the International Clinical Trials Registry Platform. We will include quantitative and qualitative study designs, clinical practice guidelines, policy documents, and professional society recommendations relevant to physical restraint of critically ill adults. Authors will independently perform data extraction in duplicate and complete risk-of-bias and quality assessment using recommended tools. We will assess evidence quality for quantitative studies using the Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach and for qualitative studies using the Confidence in the Evidence from Reviews of Qualitative Research (CERQual) guidelines. Outcomes of interest include (1) efficacy/effectiveness of physical restraint minimization strategies; (2) adverse events (unintentional device removal, psychological impact, physical injury) and associated benefits including harm prevention; (3) ICU outcomes (ventilation duration, length of stay, and mortality); (4

  16. The environmental cost of subsistence: Optimizing diets to minimize footprints

    International Nuclear Information System (INIS)

    Gephart, Jessica A.; Davis, Kyle F.; Emery, Kyle A.; Leach, Allison M.; Galloway, James N.; Pace, Michael L.

    2016-01-01

    The question of how to minimize monetary cost while meeting basic nutrient requirements (a subsistence diet) was posed by George Stigler in 1945. The problem, known as Stigler's diet problem, was famously solved using the simplex algorithm. Today, we are not only concerned with the monetary cost of food, but also the environmental cost. Efforts to quantify environmental impacts led to the development of footprint (FP) indicators. The environmental footprints of food production span multiple dimensions, including greenhouse gas emissions (carbon footprint), nitrogen release (nitrogen footprint), water use (blue and green water footprint) and land use (land footprint), and a diet minimizing one of these impacts could result in higher impacts in another dimension. In this study based on nutritional and population data for the United States, we identify diets that minimize each of these four footprints subject to nutrient constraints. We then calculate tradeoffs by taking the composition of each footprint's minimum diet and calculating the other three footprints. We find that diets for the minimized footprints tend to be similar for the four footprints, suggesting there are generally synergies, rather than tradeoffs, among low footprint diets. Plant-based food and seafood (fish and other aquatic foods) commonly appear in minimized diets and tend to most efficiently supply macronutrients and micronutrients, respectively. Livestock products rarely appear in minimized diets, suggesting these foods tend to be less efficient from an environmental perspective, even when nutrient content is considered. The results' emphasis on seafood is complicated by the environmental impacts of aquaculture versus capture fisheries, increasing in aquaculture, and shifting compositions of aquaculture feeds. While this analysis does not make specific diet recommendations, our approach demonstrates potential environmental synergies of plant- and seafood-based diets. As a result, this study
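
    The structure of the optimization can be illustrated with a toy version of the footprint diet problem, solved here by brute force over a tiny, entirely invented food set rather than by the simplex algorithm or with real nutritional data.

```python
# Minimize a carbon-footprint proxy subject to protein and calorie floors.
# All food data are fabricated for this sketch.
from itertools import product

# name: (footprint per serving, protein g per serving, kcal per serving)
FOODS = {
    "beans": (0.4, 8.0, 120.0),
    "rice":  (0.6, 3.0, 200.0),
    "fish":  (1.5, 20.0, 150.0),
    "beef":  (6.0, 25.0, 250.0),
}

def min_footprint_diet(protein_req=50.0, calorie_req=2000.0, max_servings=12):
    names = list(FOODS)
    best, best_fp = None, float("inf")
    for servings in product(range(max_servings + 1), repeat=len(names)):
        fp = sum(s * FOODS[n][0] for s, n in zip(servings, names))
        protein = sum(s * FOODS[n][1] for s, n in zip(servings, names))
        kcal = sum(s * FOODS[n][2] for s, n in zip(servings, names))
        if protein >= protein_req and kcal >= calorie_req and fp < best_fp:
            best, best_fp = dict(zip(names, servings)), fp
    return best, best_fp

diet, footprint = min_footprint_diet()
```

Consistent with the paper's qualitative finding, this toy optimizer fills the diet with plant foods and uses no beef; a real study would solve the continuous linear program with measured footprint and nutrient data.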

  17. The environmental cost of subsistence: Optimizing diets to minimize footprints

    Energy Technology Data Exchange (ETDEWEB)

    Gephart, Jessica A.; Davis, Kyle F. [University of Virginia, Department of Environmental Sciences, 291 McCormick Road, Charlottesville, VA 22904 (United States); Emery, Kyle A. [University of Virginia, Department of Environmental Sciences, 291 McCormick Road, Charlottesville, VA 22904 (United States); University of California, Santa Barbara. Marine Science Institute, Santa Barbara, CA 93106 (United States); Leach, Allison M. [University of New Hampshire, 107 Nesmith Hall, 131 Main Street, Durham, NH, 03824 (United States); Galloway, James N.; Pace, Michael L. [University of Virginia, Department of Environmental Sciences, 291 McCormick Road, Charlottesville, VA 22904 (United States)

    2016-05-15

    The question of how to minimize monetary cost while meeting basic nutrient requirements (a subsistence diet) was posed by George Stigler in 1945. The problem, known as Stigler's diet problem, was famously solved using the simplex algorithm. Today, we are not only concerned with the monetary cost of food, but also the environmental cost. Efforts to quantify environmental impacts led to the development of footprint (FP) indicators. The environmental footprints of food production span multiple dimensions, including greenhouse gas emissions (carbon footprint), nitrogen release (nitrogen footprint), water use (blue and green water footprint) and land use (land footprint), and a diet minimizing one of these impacts could result in higher impacts in another dimension. In this study based on nutritional and population data for the United States, we identify diets that minimize each of these four footprints subject to nutrient constraints. We then calculate tradeoffs by taking the composition of each footprint's minimum diet and calculating the other three footprints. We find that diets for the minimized footprints tend to be similar for the four footprints, suggesting there are generally synergies, rather than tradeoffs, among low footprint diets. Plant-based food and seafood (fish and other aquatic foods) commonly appear in minimized diets and tend to most efficiently supply macronutrients and micronutrients, respectively. Livestock products rarely appear in minimized diets, suggesting these foods tend to be less efficient from an environmental perspective, even when nutrient content is considered. The results' emphasis on seafood is complicated by the environmental impacts of aquaculture versus capture fisheries, increasing in aquaculture, and shifting compositions of aquaculture feeds. While this analysis does not make specific diet recommendations, our approach demonstrates potential environmental synergies of plant- and seafood-based diets. As a result

  18. Radwaste minimization successes at Duke Power Company

    International Nuclear Information System (INIS)

    Lan, C.D.; Johnson, G.T.; Groves, D.C.; Smith, T.A.

    1996-01-01

    At Duke Power Company, "culture change" is a common term we have used to describe our remarkable transformation: we are becoming a cost-conscious, customer-driven, highly competitive business. Nowhere has this change been more evident than in the way we process and dispose of our solid radioactive waste. With top-down management support, we have used team-based, formalized problem-solving methods and have implemented many successful waste minimization programs. Through these programs, we have dramatically increased employees' awareness of the importance of waste minimization. As a result, we have been able to reduce both our burial volumes and our waste processing and disposal costs. In June 1994, we invited EPRI to conduct assessments of our waste minimization programs at the Oconee and Catawba nuclear stations. The assessments included in-depth looks at contamination control, an inventory of items in the plant, the volume of waste generated in the plant and how it was processed, laundry reject data, site waste-handling operations, and plant "housekeeping" routines and processes. One of the most important aspects of the assessment is the "dumpster dive," an evaluation of site dry active waste composition performed by sorting through approximately fifteen bags of radioactive waste. Finally, there was an evaluation of the consumables used at each site in order to identify items that could be standardized at all stations. Following EPRI's recommendations, we made several changes and standardized the items used. We have made significant progress in waste reduction. We realize, however, that we are aiming at a moving target and still have room for improvement. As the price of processing and disposal (or storage) increases, we will continue to evaluate our waste minimization programs.

  19. Cost-Effective Method for Free-Energy Minimization in Complex Systems with Elaborated Ab Initio Potentials.

    Science.gov (United States)

    Bistafa, Carlos; Kitamura, Yukichi; Martins-Costa, Marilia T C; Nagaoka, Masataka; Ruiz-López, Manuel F

    2018-05-22

    We describe a method to locate stationary points in the free-energy hypersurface of complex molecular systems using high-level correlated ab initio potentials. In this work, we assume a combined QM/MM description of the system although generalization to full ab initio potentials or other theoretical schemes is straightforward. The free-energy gradient (FEG) is obtained as the mean force acting on relevant nuclei using a dual level strategy. First, a statistical simulation is carried out using an appropriate, low-level quantum mechanical force-field. Free-energy perturbation (FEP) theory is then used to obtain the free-energy derivatives for the target, high-level quantum mechanical force-field. We show that this composite FEG-FEP approach is able to reproduce the results of a standard free-energy minimization procedure with high accuracy, while simultaneously allowing for a drastic reduction of both computational and wall-clock time. The method has been applied to study the structure of the water molecule in liquid water at the QCISD/aug-cc-pVTZ level of theory, using the sampling from QM/MM molecular dynamics simulations at the B3LYP/6-311+G(d,p) level. The obtained values for the geometrical parameters and for the dipole moment of the water molecule are within the experimental error, and they also display an excellent agreement when compared to other theoretical estimations. The developed methodology represents therefore an important step toward the accurate determination of the mechanism, kinetics, and thermodynamic properties of processes in solution, in enzymes, and in other disordered chemical systems using state-of-the-art ab initio potentials.
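
    The FEP step of the FEG-FEP strategy rests on Zwanzig's exponential-averaging formula, ΔA = -kT ln⟨exp(-ΔU/kT)⟩_low. The sketch below applies it to a deliberately simple toy, two harmonic "force fields" standing in for the low- and high-level QM potentials, with all parameters invented, where the exact free-energy difference is known analytically:

```python
import math
import random

random.seed(0)
kT = 1.0                    # thermal energy (reduced units)
k_low, k_high = 1.0, 2.0    # toy force constants for the two levels of theory
beta = 1.0 / kT

# Sample configurations from the low-level Boltzmann ensemble (a Gaussian
# for a harmonic well), then reweight to the high level via Zwanzig's formula.
sigma = math.sqrt(kT / k_low)
n = 200_000
acc = 0.0
for _ in range(n):
    x = random.gauss(0.0, sigma)
    dU = 0.5 * (k_high - k_low) * x * x   # energy gap between the two levels
    acc += math.exp(-beta * dU)
dA = -kT * math.log(acc / n)

# Analytic result for harmonic wells: dA = (kT/2) * ln(k_high / k_low)
exact = 0.5 * kT * math.log(k_high / k_low)
print(f"FEP estimate: {dA:.4f}, exact: {exact:.4f}")
```

    The estimator converges well here because the two ensembles overlap strongly; in real dual-level applications the overlap between the low- and high-level potentials is the key practical concern.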

  20. Radical perineal prostatectomy: cost efficient, outcome effective, minimally invasive prostate cancer management.

    Science.gov (United States)

    Harris, Michael J

    2003-09-01

    Localized prostate cancer is a common disease for which minimally invasive treatment methods are being explored. Perineal prostatectomy, a historical open procedure, has been modified to incorporate contemporary surgical ideas. There is relatively little in the literature regarding modern adaptations of perineal prostatectomy. This method of anatomic radical perineal prostatectomy has been developed as a minimally invasive means of achieving disease control and preserving genito-urinary function. Prospective outcome data were accumulated on 508 consecutive radical perineal prostatectomies by a single surgeon. Pathologic stage and PSA detectability are the measures of cancer control; pad use and the ability to complete intercourse measure urinary and sexual function. General complications and other outcome measures are also evaluated. Freedom from PSA detectability by pathologic stage is 96.3%, 79.4%, and 69.4% for organ-confined, specimen-confined, and margin-positive disease in the absence of seminal vesicle invasion, with an average 4 years of follow-up (3-114 months). Margins are positive in 18% of cases. The average cancer size is 9.4 g, and 36% of cases have extracapsular invasion. By the first, third, and sixth months and one year, 38%, 65%, 88%, and 96% of patients, respectively, are free of pad use and report being dry. While over 80% of nerve-spared patients enjoy the return of spontaneous erectile function, men with bilateral nerve preservation note earlier and more complete return of function. There were no cardiopulmonary complications or deaths. Transfusions occurred in 1% of cases, with none in the past 400 cases. Average total hospital charges were USD $4889.00 in 1999 and 2000. Anterior urethral strictures, anastomotic strictures, and fecal urgency/stress flatus occur in 2%, 2%, and 2-4% of cases, respectively. This method of prostatectomy achieves complete cancer resection while preserving urinary and sexual function as well as laparoscopic or retropubic prostatectomy does.

  1. Iterative CT reconstruction via minimizing adaptively reweighted total variation.

    Science.gov (United States)

    Zhu, Lei; Niu, Tianye; Petrongolo, Michael

    2014-01-01

    Iterative reconstruction via total variation (TV) minimization has demonstrated great success in accurate CT imaging from under-sampled projections. When projections are further reduced, over-smoothing artifacts appear in the reconstruction, especially around structure boundaries. We propose a practical algorithm to improve TV-minimization-based CT reconstruction from very few projection data. Based on the theory of compressed sensing, the L0-norm approach is more desirable for further reducing the number of projection views. To overcome the computational difficulty of the non-convex L0-norm optimization, we implement an adaptive weighting scheme that approximates the solution via a series of TV minimizations for practical use in CT reconstruction. The weights on TV are initialized as uniform and are automatically updated based on the gradient of the reconstructed image from the previous iteration. The iteration stops when a small difference between the weighted TV values is observed on two consecutive reconstructed images. We evaluate the proposed algorithm on both a digital phantom and a physical phantom. Using 20 equiangular projections, our method reduces the reconstruction error of conventional TV minimization by a factor of more than 5, with improved spatial resolution. By adaptively reweighting TV in iterative CT reconstruction, we successfully reduce the number of projections required for the same or better image quality.
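
    A minimal 1-D sketch of the adaptive reweighting idea (uniform TV weights first, then weights set inversely to the local image gradient so flat regions are smoothed strongly while true edges are spared) might look as follows; the denoising setting, smoothing parameters, and the smoothed |·| are illustrative choices, not the authors' implementation:

```python
import random

random.seed(1)

# Piecewise-constant ground truth plus noise: a 1-D stand-in for a CT image.
clean = [0.0] * 20 + [1.0] * 20 + [0.2] * 20
noisy = [c + random.gauss(0.0, 0.1) for c in clean]

def reweighted_tv_denoise(f, lam=0.05, eps=0.1, delta=0.05,
                          outer=4, inner=400, step=0.02):
    """Minimize 0.5*||u - f||^2 + lam * sum_i w_i * sqrt((u[i+1]-u[i])^2 + delta^2)
    by gradient descent, then reset w_i = 1/(|u[i+1]-u[i]| + eps) each outer pass."""
    n = len(f)
    u = list(f)
    w = [1.0] * (n - 1)                      # uniform weights = plain smoothed TV
    for _ in range(outer):
        for _ in range(inner):
            g = [u[i] - f[i] for i in range(n)]          # data-fidelity gradient
            for i in range(n - 1):
                d = u[i + 1] - u[i]
                t = lam * w[i] * d / (d * d + delta * delta) ** 0.5
                g[i] -= t
                g[i + 1] += t
            u = [u[i] - step * g[i] for i in range(n)]
        # Reweight: small gradients get large weights (strong smoothing),
        # large gradients (true edges) get small weights and are preserved.
        w = [1.0 / (abs(u[i + 1] - u[i]) + eps) for i in range(n - 1)]
    return u

den = reweighted_tv_denoise(noisy)
err_noisy = sum((a - b) ** 2 for a, b in zip(noisy, clean))
err_den = sum((a - b) ** 2 for a, b in zip(den, clean))
print(f"squared error vs ground truth: noisy {err_noisy:.3f} -> denoised {err_den:.3f}")
```

    In the CT setting the data-fidelity term would involve the projection operator rather than a pointwise difference, but the reweighting loop has the same shape.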

  2. Constrained Total Generalized p-Variation Minimization for Few-View X-Ray Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Zhang, Hanming; Wang, Linyuan; Yan, Bin; Li, Lei; Cai, Ailong; Hu, Guoen

    2016-01-01

    Total generalized variation (TGV)-based computed tomography (CT) image reconstruction, which utilizes high-order image derivatives, is superior to total variation-based methods in terms of preserving edge information and suppressing unfavorable staircase effects. However, conventional TGV regularization employs an l1-based form, which is not the most direct way to promote sparsity. In this study, we propose a total generalized p-variation (TGpV) regularization model to improve the sparsity exploitation of TGV and offer efficient solutions to few-view CT image reconstruction problems. To solve the nonconvex optimization problem of the TGpV minimization model, we present an efficient iterative algorithm based on alternating minimization of the augmented Lagrangian function. All of the resulting subproblems, decoupled by variable splitting, admit explicit solutions obtained by applying the alternating minimization method and a generalized p-shrinkage mapping. In addition, approximate solutions that can be easily implemented and quickly calculated through the fast Fourier transform are derived using the proximal point method to reduce the cost of the inner subproblems. Reconstructions of simulated and real data are evaluated qualitatively and quantitatively to validate the accuracy, efficiency, and feasibility of the proposed method. Overall, the proposed method exhibits reasonable performance and outperforms the original TGV-based method when applied to few-view problems.
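
    The generalized p-shrinkage mapping referred to here is commonly written (following Chartrand) as sign(x)·max(|x| − λ^(2−p)·|x|^(p−1), 0); for p = 1 it reduces to ordinary soft thresholding, while p < 1 shrinks large coefficients less, which is why it preserves strong features better. A sketch, assuming that form:

```python
import math

def p_shrink(x, lam, p):
    """Generalized p-shrinkage (after Chartrand):
    sign(x) * max(|x| - lam**(2-p) * |x|**(p-1), 0).
    For p = 1 this is exactly the soft-thresholding operator."""
    if x == 0.0:
        return 0.0
    mag = abs(x) - lam ** (2.0 - p) * abs(x) ** (p - 1.0)
    return math.copysign(max(mag, 0.0), x)

# p = 1 reproduces soft thresholding; p = 0.5 leaves large coefficients
# closer to their input values while still zeroing small ones.
for p in (1.0, 0.5):
    print(p, [round(p_shrink(x, 0.5, p), 4) for x in (0.3, 1.0, 4.0)])
```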

  3. Minimally invasive lateral trans-psoas approach for tuberculosis of lumbar spine

    Directory of Open Access Journals (Sweden)

    Nitin Garg

    2014-01-01

    Anterior, posterolateral and posterior approaches are used for managing lumbar tuberculosis. Minimally invasive methods are being used increasingly for various disorders of the spine. This report presents the utility of the lateral trans-psoas approach to the lumbar spine (LS) using minimal access techniques, also known as direct lateral lumbar interbody fusion, in 2 cases of tuberculosis of the LS. Two patients with tuberculosis at L2-3 and L4-5 presented with back pain. Both had destruction and deformity of the vertebral body. The whole procedure, comprising debridement and placement of an iliac crest graft, was performed using tubular retractors and was augmented by posterior fixation using percutaneous transpedicular screws. Both patients recovered well with no significant procedure-related morbidity. Post-operative computed tomography scans showed appropriate position of the graft and instrumentation. At follow-up, both patients are ambulant with no progression of the deformity. The minimal access direct lateral trans-psoas approach can be used for debridement and reconstruction of the ventral column in tuberculosis of the lumbar spine. This paper highlights the growing applications of minimal access surgery for the spine.

  4. Homogeneous Field and WKB Approximation in Deformed Quantum Mechanics with Minimal Length

    Directory of Open Access Journals (Sweden)

    Jun Tao

    2015-01-01

    In the framework of deformed quantum mechanics with a minimal length, we consider the motion of a nonrelativistic particle in a homogeneous external field. We find the integral representation for the physically acceptable wave function in the position representation. Using the method of steepest descent, we obtain the asymptotic expansions of the wave function at large positive and negative arguments. We then employ the leading asymptotic expressions to derive the WKB connection formula, which proceeds from the classically forbidden region to the classically allowed one through a turning point. By the WKB connection formula, we prove the Bohr-Sommerfeld quantization rule up to O(β²). We also show that if the slope of the potential at a turning point is too steep, the WKB connection formula is no longer valid around the turning point. The effects of the minimal length on classical motion are investigated using the Hamilton-Jacobi method. We also use Bohr-Sommerfeld quantization to study statistical physics in deformed spaces with the minimal length.
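
    For reference, the minimal-length deformation usually assumed in this literature (the Kempf-Mangano-Mann algebra) modifies the canonical commutator, which through the uncertainty relation implies a nonzero lower bound on position uncertainty:

```latex
[\hat{X},\hat{P}] = i\hbar\left(1+\beta\hat{P}^{2}\right),
\qquad
\Delta X\,\Delta P \ge \frac{\hbar}{2}\left(1+\beta(\Delta P)^{2}\right)
\;\Longrightarrow\;
(\Delta X)_{\min} = \hbar\sqrt{\beta}.
```

    Within this algebra, the abstract's result is that the Bohr-Sommerfeld quantization rule is established up to corrections of order β².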

  5. Anatomic double-bundle anterior cruciate ligament reconstruction using hamstring tendons with minimally required initial tension.

    Science.gov (United States)

    Mae, Tatsuo; Shino, Konsei; Matsumoto, Norinao; Natsu-Ume, Takashi; Yoneda, Kenji; Yoshikawa, Hideki; Yoneda, Minoru

    2010-10-01

    Our purpose was to clarify the clinical outcomes at 2 years after anatomic double-bundle anterior cruciate ligament (ACL) reconstruction with 20 N of initial tension, the minimally required initial tension to perform the reconstruction successfully according to our previous report on the pre-tension necessary to restore the laxity of the opposite knee (7.3 N; range, 2.2 to 14 N). Of 64 patients who underwent anatomic double-bundle ACL reconstruction with autogenous semitendinosus tendon, 45 were examined periodically for 2 years. Two double-looped grafts were fixed with EndoButton CL devices (Smith & Nephew Endoscopy, Andover, MA) on the femoral side and Double Spike Plates (Smith & Nephew Endoscopy) on the tibial side, while a total of 20 N of initial tension (10 N to each graft) was applied at 20° of knee flexion. The International Knee Documentation Committee Knee Examination Form and Lysholm score were used for the subjective assessment, whereas range of motion and knee stability were evaluated as the objective assessment. Grafts were evaluated in 25 patients with second-look arthroscopy. According to the International Knee Documentation Committee subjective assessment, 62% of knees were graded as normal and 38% as nearly normal. The Lysholm score was 72 points preoperatively and improved to 99 points at 2 years' follow-up. A loss of knee extension of less than 3° was found in 2 patients. The pivot-shift test was evaluated as negative in all patients except for 5 graded as a glide. The KT-2000 knee arthrometer (MEDmetric, San Diego, CA) side-to-side difference was 0.1 ± 0.9 mm at 2 years' follow-up. Of the subset of grafts evaluated by second-look arthroscopy, most were considered to have good synovial coverage and to be taut. Anatomic double-bundle ACL reconstruction with 20 N of low initial tension yielded good clinical outcomes at 2 years postoperatively, and second-look arthroscopic findings were excellent. Level of evidence: IV.

  6. Minimal and careful processing

    OpenAIRE

    Nielsen, Thorkild

    2004-01-01

    In several standards, guidelines and publications, organic food processing is strongly associated with "minimal processing" and "careful processing". The term "minimal processing" is nowadays often used in the general food processing industry and described in literature. The term "careful processing" is used more specifically within organic food processing but is not yet clearly defined. The concept of carefulness seems to fit very well with the processing of organic foods, especially if it i...

  7. The minimally invasive endoscopic management of septated chronic subdural hematomas: surgical technique.

    Science.gov (United States)

    Berhouma, M; Jacquesson, T; Jouanneau, E

    2014-12-01

    Fibrin membranes and compartmentalization within the subdural space are a frequent cause of failure in the treatment of chronic subdural hematomas (CSH). This specific subtype of CSH classically requires craniotomy, which carries significant morbidity and mortality, particularly in elderly patients. In this work, we describe a minimally invasive endoscopic alternative. Under local scalp anesthesia, a rigid endoscope is inserted through a parietal burr hole into the subdural space to collapse fibrin septa and cut the internal membrane. The endoscope also allows cauterization of active bleeding and placement of a drain under direct visualization. The endoscopic treatment of septated CSH represents a minimally invasive alternative to craniotomy, especially for the internal membranectomy.

  8. On Perceptual Distortion Minimization and Nonlinear Least-Squares Frequency Estimation

    DEFF Research Database (Denmark)

    Christensen, Mads Græsbøll; Jensen, Søren Holdt

    2006-01-01

    In this paper, we present a framework for perceptual error minimization and sinusoidal frequency estimation based on a new perceptual distortion measure, and we state its optimal solution. Using this framework, we revisit a number of well-known practical methods for perceptual sinusoidal parameter estimation.
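
    For a single sinusoid in white noise, the nonlinear least-squares frequency estimate coincides with the maximizer of the periodogram; a stdlib-only sketch (signal parameters invented, noiseless for clarity):

```python
import cmath
import math

# Synthetic single sinusoid. NLS frequency estimation for one sinusoid in
# white noise is equivalent to maximizing |sum_n x[n] e^{-j w n}|^2 over w.
N = 256
w_true = 0.7            # rad/sample
x = [math.cos(w_true * n + 0.3) for n in range(N)]

def periodogram(w):
    s = sum(x[n] * cmath.exp(-1j * w * n) for n in range(N))
    return abs(s) ** 2 / N

# Coarse grid search over (0, pi), then a simple local refinement pass.
coarse = [k * math.pi / 1000 for k in range(1, 1000)]
w_hat = max(coarse, key=periodogram)
fine = [w_hat + d * 1e-4 for d in range(-40, 41)]
w_hat = max(fine, key=periodogram)
print(f"estimated frequency: {w_hat:.4f} rad/sample (true {w_true})")
```

    The perceptual variants discussed in the paper effectively replace this flat-weighted cost with a perceptually weighted one; the grid-plus-refinement search strategy is an implementation convenience, not part of the theory.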

  9. Minimally Invasive Subcortical Parafascicular Transsulcal Access for Clot Evacuation (Mi SPACE) for Intracerebral Hemorrhage

    Directory of Open Access Journals (Sweden)

    Benjamin Ritsma

    2014-01-01

    Background. Spontaneous intracerebral hemorrhage (ICH) is common and causes significant mortality and morbidity. To date, the optimal medical and surgical intervention remains uncertain. A lack of definitive benefit for operative management may be attributable to an adverse surgical effect: collateral tissue injury. This is particularly relevant for ICH in dominant, eloquent cortex. Minimally invasive surgery (MIS) offers the potential advantage of reduced collateral damage. MIS utilizing a parafascicular approach has demonstrated such benefit for intracranial tumor resection. Methods. We present a case of dominant-hemisphere spontaneous ICH evacuated via the minimally invasive subcortical parafascicular transsulcal access clot evacuation (Mi SPACE) model. We use this report to introduce Mi SPACE and to examine the application of this novel MIS paradigm. Case Presentation. The featured patient presented with a left temporal ICH and severe global aphasia. The hematoma was evacuated via the Mi SPACE approach. Postoperative reassessments showed significant improvement. At two months, bedside language testing was normal. MRI tractography confirmed limited collateral injury. Conclusions. This case illustrates successful application of the Mi SPACE model to ICH in dominant, eloquent cortex and subcortical regions. MRI tractography illustrates collateral tissue preservation. Safety and feasibility studies are required to further assess this promising new therapeutic paradigm.

  10. Non-minimal Wu-Yang monopole

    International Nuclear Information System (INIS)

    Balakin, A.B.; Zayats, A.E.

    2007-01-01

    We discuss new exact spherically symmetric static solutions to the non-minimally extended Einstein-Yang-Mills equations. The obtained solution to the Yang-Mills subsystem is interpreted as a non-minimal Wu-Yang monopole solution. We focus on the analysis of two classes of exact solutions to the gravitational field equations. Solutions of the first class are of the Reissner-Nordström type, i.e., they are characterized by horizons and by a singularity at the origin. Solutions of the second class are regular. Horizons and singularities of a new, non-minimal type are also indicated.

  11. Long-term outcome of biopsy-proven, frequently relapsing minimal-change nephrotic syndrome in children.

    NARCIS (Netherlands)

    Kyrieleis, H.A.; Lowik, M.M.; Pronk, I.; Cruysberg, J.R.M.; Kremer, J.A.M.; Oyen, W.J.G.; Heuvel, L.P.W.J. van den; Wetzels, J.F.M.; Levtchenko, E.N.

    2009-01-01

    BACKGROUND AND OBJECTIVES: Frequently relapsing and steroid-dependent minimal-change nephrotic syndrome (MCNS) that originates in childhood can persist after puberty in >20% of patients. These patients require immunosuppressive treatment during several decades of their life. We examined long-term

  12. Continuous-Time Portfolio Selection and Option Pricing under Risk-Minimization Criterion in an Incomplete Market

    Directory of Open Access Journals (Sweden)

    Xinfeng Ruan

    2013-01-01

    We study option pricing under a risk-minimization criterion in an incomplete market where the dynamics of the risky underlying asset are governed by a jump diffusion equation. We obtain the Radon-Nikodym derivative of the minimal martingale measure and a partial integro-differential equation (PIDE) for the European call option. In a special case, we obtain the exact solution for the European call option by Fourier transform methods. Finally, we employ the pricing kernel to calculate the optimal portfolio selection by martingale methods.
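
    Although the paper works in a jump-diffusion setting, the quadratic risk-minimization idea can be shown in miniature: in a one-period, three-state market (all numbers invented) the risk-minimizing hedge is a least-squares regression of the claim on the stock increment, and the residual risk is strictly positive precisely because the market is incomplete:

```python
# One-period incomplete market: three states, one stock, zero interest rate.
s0 = 100.0
s1 = [90.0, 100.0, 120.0]          # possible stock prices at time 1
prob = [0.3, 0.4, 0.3]             # real-world state probabilities
payoff = [max(s - 100.0, 0.0) for s in s1]   # European call, strike 100
ds = [s - s0 for s in s1]          # stock price increment per state

def mean(values):
    return sum(p * v for p, v in zip(prob, values))

# Choose (v0, phi) to minimize E[(H - v0 - phi * dS)^2]:
# an ordinary least-squares regression of the payoff H on the increment dS.
cov = mean([h * d for h, d in zip(payoff, ds)]) - mean(payoff) * mean(ds)
var = mean([d * d for d in ds]) - mean(ds) ** 2
phi = cov / var                      # risk-minimizing hedge ratio
v0 = mean(payoff) - phi * mean(ds)   # initial capital

# Three states but only two instruments (cash and stock): the hedge cannot
# replicate the claim, so some quadratic risk always remains.
residual = mean([(h - v0 - phi * d) ** 2 for h, d in zip(payoff, ds)])
print(f"hedge ratio {phi:.4f}, initial capital {v0:.4f}, residual risk {residual:.4f}")
```

    In continuous time the same quadratic criterion leads to the minimal martingale measure used in the paper; this toy is only the one-period analogue, not the paper's model.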

  13. Reduction in requirements for allogeneic blood products: nonpharmacologic methods.

    Science.gov (United States)

    Hardy, J F; Bélisle, S; Janvier, G; Samama, M

    1996-12-01

    Various strategies have been proposed to decrease bleeding and allogeneic transfusion requirements during and after cardiac operations. This article attempts to document the usefulness, or lack thereof, of the nonpharmacologic methods available in clinical practice. Blood conservation methods were reviewed in the chronological order in which they become available to patients during the perisurgical period. The literature for and against each strategy was reexamined critically. Avoidance of preoperative anemia and adherence to published transfusion guidelines are of paramount importance. Intraoperatively, tolerance of low hemoglobin concentrations and use of autologous blood (predonated or harvested before bypass) will reduce allogeneic transfusions. The usefulness of plateletpheresis and retransfusion of shed mediastinal fluid remains controversial. Intraoperatively and postoperatively, maintenance of normothermia contributes to improved hemostasis. Several approaches have been shown to be effective. An efficient combination of methods can reduce, and sometimes abolish, the need for allogeneic blood products after cardiac operations, provided that all those involved in the care of cardiac surgical patients adhere thoughtfully to existing transfusion guidelines.

  14. Minimally-invasive treatment of high velocity intra-articular fractures of the distal tibia.

    LENUS (Irish Health Repository)

    Leonard, M

    2012-02-01

    The pilon fracture is a complex injury. The purpose of this study was to evaluate the outcome of minimally invasive techniques in the management of these injuries. This was a prospective study of closed AO type C2 and C3 fractures managed by early (<36 hours) minimally invasive surgical intervention and physiotherapist-led rehabilitation. Thirty patients with 32 intra-articular distal tibial fractures were treated by the senior surgeon (GK). Our aim was to record the outcome and all complications with a minimum two-year follow-up. There were two superficial wound infections. One patient developed a non-union, which required a formal open procedure. Another patient was symptomatic from a palpable plate inferiorly. An excellent AOFAS result was obtained in 83% (20/24) of the patients. Early minimally invasive reduction and fixation of complex high-velocity pilon fractures gave very satisfactory results at a minimum of two years' follow-up.

  15. Wilson loops in minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS_5 × S^5. The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. They show how this is realized for the minimal surface.

  16. Wilson loops and minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS-CFT correspondence suggests that the Wilson loop of the large N gauge theory with N=4 supersymmetry in four dimensions is described by a minimal surface in AdS_5 × S^5. We examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which we call BPS loops, whose expectation values are free from ultraviolet divergence. We formulate the loop equation for such loops. To the extent that we have checked, the minimal surface in AdS_5 × S^5 gives a solution of the equation. We also discuss the zigzag symmetry of the loop operator. In the N=4 gauge theory, we expect the zigzag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. We will show how this is realized for the minimal surface. (c) 1999 The American Physical Society

  17. Minimal mirror twin Higgs

    Energy Technology Data Exchange (ETDEWEB)

    Barbieri, Riccardo [Institute of Theoretical Studies, ETH Zurich,CH-8092 Zurich (Switzerland); Scuola Normale Superiore,Piazza dei Cavalieri 7, 56126 Pisa (Italy); Hall, Lawrence J.; Harigaya, Keisuke [Department of Physics, University of California,Berkeley, California 94720 (United States); Theoretical Physics Group, Lawrence Berkeley National Laboratory,Berkeley, California 94720 (United States)

    2016-11-29

    In a Mirror Twin World with a maximally symmetric Higgs sector the little hierarchy of the Standard Model can be significantly mitigated, perhaps displacing the cutoff scale above the LHC reach. We show that consistency with observations requires that the Z_2 parity exchanging the Standard Model with its mirror be broken in the Yukawa couplings. A minimal such effective field theory, with this sole Z_2 breaking, can generate the Z_2 breaking in the Higgs sector necessary for the Twin Higgs mechanism. The theory has constrained and correlated signals in Higgs decays, direct Dark Matter Detection and Dark Radiation, all within reach of foreseen experiments, over a region of parameter space where the fine-tuning for the electroweak scale is 10-50%. For dark matter, both mirror neutrons and a variety of self-interacting mirror atoms are considered. Neutrino mass signals and the effects of a possible additional Z_2 breaking from the vacuum expectation values of B−L breaking fields are also discussed.

  18. Navicular stress fractures treated with minimally invasive fixation

    Directory of Open Access Journals (Sweden)

    Korula Mani Jacob

    2013-01-01

    Early intervention with minimally invasive surgery has significantly less morbidity and a reliable early return to active sports and is therefore the best option in high-performance athletes. Materials and Methods: Nine athletes with ten stress fractures of the navicular were treated at our institution between April 1991 and October 2000. The mean age of the patients was 22.8 years (range, 18-50 years). All patients were treated by minimally invasive screw fixation and early weight-bearing mobilization without a cast. The average followup was 7 years (range, 2-11 years). Results: Seven of the nine patients returned to their pre-fracture level of sporting activity at an average of 5 months (range, 3-9 months). One patient returned to full sporting activity after a delay of 2 years due to an associated tibial stress fracture, and one patient had an unsatisfactory result. Long-term review at an average of 7 years showed that six of the eight patients who returned to sports remained symptom free, with two patients experiencing minimal intermittent discomfort after prolonged activity. Conclusions: We recommend percutaneous screw fixation as a reliable, low-morbidity procedure allowing early return to full sporting activity without long-term complications or recurrences.

  19. Molecular mechanics calculations of proteins. Comparison of different energy minimization strategies

    DEFF Research Database (Denmark)

    Christensen, I T; Jørgensen, Flemming Steen

    1997-01-01

    A general strategy for performing energy minimization of proteins using the SYBYL molecular modelling program has been developed. The influence of several variables, including energy minimization procedure, solvation, dielectric function and dielectric constant, has been investigated in order to develop a general method capable of producing high-quality protein structures. Avian pancreatic polypeptide (APP) and bovine pancreatic phospholipase A2 (BP PLA2) were selected for the calculations because high-quality X-ray structures exist and because all classes of secondary structure … for this protein. Energy-minimized structures of the trimeric PLA2 from Indian cobra (N.n.n. PLA2) were used for assessing the impact of protein-protein interactions. Based on the above-mentioned criteria, it could be concluded that using the following conditions: dielectric constant epsilon = 4 or 20; a distance …
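
    As a toy illustration of why the choice of minimization procedure matters (this is not the paper's SYBYL setup), compare steepest descent and Fletcher-Reeves conjugate gradient, both with exact line searches, on a stiff 2-D quadratic "potential" standing in for a force field with very different curvatures along different coordinates:

```python
# Minimize f(v) = 0.5 * v^T A v - b^T v, a quadratic stand-in for a force
# field whose curvature is 25x stiffer in one direction than the other.
A = [[1.0, 0.0], [0.0, 25.0]]
b = [1.0, 1.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def grad(v):                       # gradient of f: A v - b
    Av = matvec(A, v)
    return [Av[i] - b[i] for i in range(2)]

def steepest_descent(v, tol=1e-8, max_it=1000):
    for it in range(max_it):
        g = grad(v)
        if dot(g, g) < tol ** 2:
            return v, it
        alpha = dot(g, g) / dot(g, matvec(A, g))     # exact line search
        v = [v[i] - alpha * g[i] for i in range(2)]
    return v, max_it

def conjugate_gradient(v, tol=1e-8, max_it=1000):
    g = grad(v)
    d = [-x for x in g]
    for it in range(max_it):
        if dot(g, g) < tol ** 2:
            return v, it
        alpha = dot(g, g) / dot(d, matvec(A, d))
        v = [v[i] + alpha * d[i] for i in range(2)]
        g_new = grad(v)
        beta = dot(g_new, g_new) / dot(g, g)          # Fletcher-Reeves update
        d = [-g_new[i] + beta * d[i] for i in range(2)]
        g = g_new
    return v, max_it

v_sd, it_sd = steepest_descent([0.0, 0.0])
v_cg, it_cg = conjugate_gradient([0.0, 0.0])
print(f"steepest descent: {it_sd} iterations, conjugate gradient: {it_cg} iterations")
```

    On stiff landscapes like this, steepest descent zigzags for hundreds of iterations while conjugate gradient finishes in a handful, which is one reason molecular mechanics packages typically offer both and switch between them.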

  20. The Effect of Microneedle Thickness on Pain During Minimally Invasive Facial Procedures: A Clinical Study.

    Science.gov (United States)

    Sezgin, Billur; Ozel, Bora; Bulam, Hakan; Guney, Kirdar; Tuncer, Serhan; Cenetoglu, Seyhan

    2014-07-01

    Minimally invasive procedures are becoming increasingly popular because they require minimal downtime and are effective for achieving a more youthful appearance. The choice of needle for minimally invasive procedures can be a major factor in the patient's comfort level, which in turn affects the physician's comfort level. In this comparative study, the authors assessed levels of pain and bruising after participants were injected with 30-gauge or 33-gauge (G) microneedles, which are commonly used for minimally invasive injection procedures. Twenty healthy volunteers were recruited for this prospective study. Eight injection points (4 on each side of the face) were determined for each participant. All participants received injections of saline with both microneedles in a randomized, blinded fashion. Levels of pain and bruising were assessed and analyzed for significance. The highest level of pain was in the malar region, and the lowest level was in the glabella. Although all pain scores were lower for the 33-G microneedle, the difference was significant only for the forehead. Because most minimally invasive procedures require multiple injections during the same sitting, the overall procedure was evaluated as well. Assessment of the multiple-injection process demonstrated a significant difference in pain level, favoring the 33-G needle. Although the difference in bruising was not statistically significant between the 2 needles, the degree of bruising was lower with the 33-G needle. For procedures that involve multiple injections to the face (such as mesotherapy and injection of botulinum toxin A), thinner needles result in less pain, making the overall experience more comfortable for the patient and the physician. Level of Evidence: 3. © 2014 The American Society for Aesthetic Plastic Surgery, Inc.