WorldWideScience

Sample records for regular design reviews

  1. Automating InDesign with Regular Expressions

    CERN Document Server

    Kahrel, Peter

    2006-01-01

    If you need to make automated changes to InDesign documents beyond what basic search and replace can handle, you need regular expressions, and a bit of scripting to make them work. This Short Cut explains both how to write regular expressions, so you can find and replace the right things, and how to use them in InDesign specifically.
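
    As a point of reference for the record above (a generic illustration, not taken from the Short Cut, and using plain Python's re module rather than InDesign's GREP panel or ExtendScript API), the sketch below uses capture groups to rewrite US-style dates, the kind of find-and-change task regular expressions handle well:

    ```python
    import re

    # Capture-group pattern of the kind used in GREP-style find/change:
    # group 1 = month, group 2 = day, group 3 = year.
    date_pattern = re.compile(r"\b(\d{1,2})/(\d{1,2})/(\d{4})\b")

    def iso_dates(text: str) -> str:
        """Rewrite US-style M/D/YYYY dates as ISO YYYY-MM-DD."""
        return date_pattern.sub(
            lambda m: f"{m.group(3)}-{int(m.group(1)):02d}-{int(m.group(2)):02d}",
            text,
        )

    print(iso_dates("Reviewed on 7/4/2006 and again on 12/31/2006."))
    # Reviewed on 2006-07-04 and again on 2006-12-31.
    ```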

  2. 32nm 1-D regular pitch SRAM bitcell design for interference-assisted lithography

    Science.gov (United States)

    Greenway, Robert T.; Jeong, Kwangok; Kahng, Andrew B.; Park, Chul-Hong; Petersen, John S.

    2008-10-01

    As optical lithography advances into the 45nm technology node and beyond, new manufacturing-aware design requirements have emerged. We address layout design for interference-assisted lithography (IAL), a double exposure method that combines maskless interference lithography (IL) and projection lithography (PL); cf. hybrid optical maskless lithography (HOMA) in [2] and [3]. Since IL can generate dense but regular pitch patterns, a key challenge to deployment of IAL is the conversion of existing designs to regular-linewidth, regular-pitch layouts. In this paper, we propose new 1-D regular pitch SRAM bitcell layouts which are amenable to IAL. We evaluate the feasibility of our bitcell designs via lithography simulations and circuit simulations, and confirm that the proposed bitcells can be successfully printed by IAL and that their electrical characteristics are comparable to those of existing bitcells.

  3. A Critical Review of Instructional Design Process of Distance Learning System

    Science.gov (United States)

    Chaudry, Muhammad Ajmal; ur-Rahman, Fazal

    2010-01-01

    Instructional design refers to planning, development, delivery and evaluation of instructional system. It is an applied field of study aiming at the application of descriptive research outcomes in regular instructional settings. The present study was designed to critically review the process of instructional design at Allama Iqbal Open University…

  4. Cultural and Mathematical Meanings of Regular Octagons in Mesopotamia: Examining Islamic Art Designs

    Directory of Open Access Journals (Sweden)

    Jeanam Park

    2018-03-01

    The most common regular polygon in Islamic art design is the octagon. Historical evidence of the use of the 8-point star polygon and the 8-fold rosette dates back to Jemdet Nasr (3100-2900 B.C.) in Mesopotamia. Additionally, in ancient Egypt, octagons can be found in mathematical problems (Ahmose papyrus, Problem 48), household goods (papyrus storage), architecture (granite columns) and decorations (palace decorations). The regular octagon, a fundamentally important element of Islamic art design, is widely used as an arithmetic object in metric algebra, along with other regular polygons, in Mesopotamia. The 8-point star polygon has long been a symbol of the ancient Sumerian goddess Inanna and her East Semitic counterpart Ishtar. During the Neo-Assyrian period, the 8-fold rosette occasionally replaced the star as the symbol of Ishtar. In this paper, we discuss how octagonal design has prevailed in the Islamic region since the late ninth century, and how it existed in Mesopotamia from the Jemdet Nasr period to the end of the third century B.C. We describe reasons why the geometric pattern of regular polygons, including regular octagons, developed in the Islamic world. Furthermore, we also discuss mathematical meanings of regular polygons.

  5. Optimal Design of the Adaptive Normalized Matched Filter Detector using Regularized Tyler Estimators

    KAUST Repository

    Kammoun, Abla; Couillet, Romain; Pascal, Frederic; Alouini, Mohamed-Slim

    2017-01-01

    This article addresses improvements on the design of the adaptive normalized matched filter (ANMF) for radar detection. It is well-acknowledged that the estimation of the noise-clutter covariance matrix is a fundamental step in adaptive radar detection. In this paper, we consider regularized estimation methods which force by construction the eigenvalues of the covariance estimates to be greater than a positive regularization parameter ρ. This makes them more suitable for high dimensional problems with a limited number of secondary data samples than traditional sample covariance estimates. The motivation behind this work is to understand the effect and properly set the value of ρ that improves estimate conditioning while maintaining a low estimation bias. More specifically, we consider the design of the ANMF detector for two kinds of regularized estimators, namely the regularized sample covariance matrix (RSCM) and the regularized Tyler estimator (RTE). The rationale behind this choice is that the RTE is efficient in mitigating the degradation caused by the presence of impulsive noises while inducing little loss when the noise is Gaussian. Based on asymptotic results brought by recent tools from random matrix theory, we propose a design for the regularization parameter that maximizes the asymptotic detection probability under constant asymptotic false alarm rates. Simulations support the efficiency of the proposed method, illustrating its gain over conventional settings of the regularization parameter.
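
    For context on the estimator named above, the following is an illustrative sketch of the standard regularized Tyler fixed-point iteration, not code from the paper: the shrinkage value rho=0.3 and the toy data are arbitrary, and the paper's asymptotically optimal rule for choosing ρ is not reproduced here.

    ```python
    import numpy as np

    def regularized_tyler(X, rho, n_iter=50, tol=1e-6):
        """Regularized Tyler estimator via fixed-point iteration.

        X   : (n, p) array of n secondary data samples of dimension p.
        rho : regularization parameter in (0, 1]; larger values shrink the
              estimate more strongly toward the identity matrix.
        Returns a (p, p) covariance-shape estimate with trace normalized to p.
        """
        n, p = X.shape
        sigma = np.eye(p)
        for _ in range(n_iter):
            inv_sigma = np.linalg.inv(sigma)
            # Quadratic forms x_i^T Sigma^{-1} x_i for every sample.
            q = np.einsum('ij,jk,ik->i', X, inv_sigma, X)
            scm_term = (X.T / q) @ X * (p / n)        # weighted sample average
            sigma_new = (1.0 - rho) * scm_term + rho * np.eye(p)
            sigma_new *= p / np.trace(sigma_new)      # trace normalization
            if np.linalg.norm(sigma_new - sigma, 'fro') < tol:
                sigma = sigma_new
                break
            sigma = sigma_new
        return sigma

    # Toy usage: heavy-tailed (compound-Gaussian) samples with an AR(1)-like covariance.
    rng = np.random.default_rng(0)
    p, n = 10, 30
    true_cov = 0.7 ** np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
    Z = rng.multivariate_normal(np.zeros(p), true_cov, size=n)
    tau = rng.standard_exponential(n)[:, None]        # impulsive texture
    X = np.sqrt(tau) * Z
    print(regularized_tyler(X, rho=0.3).round(2))
    ```

    Pushing rho toward 1 shrinks harder toward the identity at the cost of more bias; the trade-off is exactly what the paper's asymptotic design rule is meant to optimize.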

  6. Optimal Design of the Adaptive Normalized Matched Filter Detector using Regularized Tyler Estimators

    KAUST Repository

    Kammoun, Abla

    2017-10-25

    This article addresses improvements on the design of the adaptive normalized matched filter (ANMF) for radar detection. It is well-acknowledged that the estimation of the noise-clutter covariance matrix is a fundamental step in adaptive radar detection. In this paper, we consider regularized estimation methods which force by construction the eigenvalues of the covariance estimates to be greater than a positive regularization parameter ρ. This makes them more suitable for high dimensional problems with a limited number of secondary data samples than traditional sample covariance estimates. The motivation behind this work is to understand the effect and properly set the value of ρ that improves estimate conditioning while maintaining a low estimation bias. More specifically, we consider the design of the ANMF detector for two kinds of regularized estimators, namely the regularized sample covariance matrix (RSCM) and the regularized Tyler estimator (RTE). The rationale behind this choice is that the RTE is efficient in mitigating the degradation caused by the presence of impulsive noises while inducing little loss when the noise is Gaussian. Based on asymptotic results brought by recent tools from random matrix theory, we propose a design for the regularization parameter that maximizes the asymptotic detection probability under constant asymptotic false alarm rates. Simulations support the efficiency of the proposed method, illustrating its gain over conventional settings of the regularization parameter.

  7. Exploiting Lexical Regularities in Designing Natural Language Systems.

    Science.gov (United States)

    1988-04-01

    This paper presents the lexical component of the START Question Answering system developed at the MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139.

  8. A Preliminary Investigation of User Perception and Behavioral Intention for Different Review Types: Customers and Designers Perspective

    Science.gov (United States)

    Qazi, Atika; Waheed, Mahwish; Abraham, Ajith

    2014-01-01

    Existing opinion mining studies have focused on and explored only two types of reviews, that is, regular and comparative. There is a visible gap in determining the useful review types from the customers' and designers' perspective. Based on the Technology Acceptance Model (TAM) and statistical measures, we examine users' perception of different review types and their effects on behavioral intention towards using an online review system. Using a sample of users (N = 400) and designers (N = 106), the current research studies three review types, A (regular), B (comparative), and C (suggestive), which are related to perceived usefulness, perceived ease of use, and behavioral intention. The study reveals that positive perception of the use of suggestive reviews improves users' decision making in business intelligence. The results also depict that type C (suggestive reviews) could be considered a new useful review type in addition to the other types, A and B. PMID:24711739

  9. From design of activity-based costing systems to their regular use

    Directory of Open Access Journals (Sweden)

    M. Angels Fito

    2011-11-01

    Purpose: To understand why many companies that develop activity-based costing (ABC) systems do not use them on a regular basis. Design/methodology/approach: We review the existing literature on the process of ABC implementation, concentrating specifically on the step from the acceptance of an ABC model to its routine use. We identify key factors for successful uptake of ABC systems as a regular management tool and use these factors to interpret the experience of two companies that illustrate, respectively, a success and a failure. Findings: Sixteen factors are identified that positively or negatively influence the actual use of ABC costing systems. These factors can be grouped into six categories: strategic, individual, organizational, technological, operational and external factors. Originality/value: This paper sheds some light on the paradoxical situation that regular usage of ABC systems is not as common as might be expected given their widespread acceptance on a conceptual level.

  10. Effects of Regular Classes in Outdoor Education Settings: A Systematic Review on Students’ Learning, Social and Health Dimensions

    Science.gov (United States)

    Becker, Christoph; Lauterbach, Gabriele; Spengler, Sarah; Dettweiler, Ulrich; Mess, Filip

    2017-01-01

    Background: Participants in Outdoor Education Programmes (OEPs) presumably benefit from these programmes in terms of their social and personal development, academic achievement and physical activity (PA). The aim of this systematic review was to identify studies about regular compulsory school- and curriculum-based OEPs, to categorise and evaluate reported outcomes, to assess the methodological quality, and to discuss possible benefits for students. Methods: We searched online databases to identify English- and German-language peer-reviewed journal articles that reported any outcomes on a student level. Two independent reviewers screened the identified studies for eligibility and assessed their methodological quality. Results: Thirteen studies were included for analysis. Most studies used a case-study design, the average number of participants was moderate (mean value (M) = 62.17; standard deviation (SD) = 64.12), and the methodological quality was moderate on average for qualitative studies (M = 0.52; SD = 0.11), and low on average for quantitative studies (M = 0.18; SD = 0.42). Eight studies described outcomes in terms of social dimensions, seven studies in learning dimensions and four studies were subsumed under additional outcomes, i.e., PA and health. Eleven studies reported positive, one study positive as well as negative, and one study reported negative effects. PA and mental health as outcomes were underrepresented. Conclusion: Tendencies were detected that regular compulsory school- and curriculum-based OEPs can promote students with respect to social, academic, physical and psychological dimensions. Very little is known concerning students’ PA or mental health. We recommend conducting more quasi-experimental design and longitudinal studies with a greater number of participants, and a high methodological quality to further investigate these tendencies. PMID:28475167

  11. Effects of Regular Classes in Outdoor Education Settings: A Systematic Review on Students' Learning, Social and Health Dimensions.

    Science.gov (United States)

    Becker, Christoph; Lauterbach, Gabriele; Spengler, Sarah; Dettweiler, Ulrich; Mess, Filip

    2017-05-05

    Participants in Outdoor Education Programmes (OEPs) presumably benefit from these programmes in terms of their social and personal development, academic achievement and physical activity (PA). The aim of this systematic review was to identify studies about regular compulsory school- and curriculum-based OEPs, to categorise and evaluate reported outcomes, to assess the methodological quality, and to discuss possible benefits for students. We searched online databases to identify English- and German-language peer-reviewed journal articles that reported any outcomes on a student level. Two independent reviewers screened the identified studies for eligibility and assessed their methodological quality. Thirteen studies were included for analysis. Most studies used a case-study design, the average number of participants was moderate (mean value (M) = 62.17; standard deviation (SD) = 64.12), and the methodological quality was moderate on average for qualitative studies (M = 0.52; SD = 0.11), and low on average for quantitative studies (M = 0.18; SD = 0.42). Eight studies described outcomes in terms of social dimensions, seven studies in learning dimensions and four studies were subsumed under additional outcomes, i.e., PA and health. Eleven studies reported positive, one study positive as well as negative, and one study reported negative effects. PA and mental health as outcomes were underrepresented. Tendencies were detected that regular compulsory school- and curriculum-based OEPs can promote students with respect to social, academic, physical and psychological dimensions. Very little is known concerning students' PA or mental health. We recommend conducting more quasi-experimental design and longitudinal studies with a greater number of participants, and a high methodological quality to further investigate these tendencies.

  12. Inverse problems with Poisson data: statistical regularization theory, applications and algorithms

    International Nuclear Information System (INIS)

    Hohage, Thorsten; Werner, Frank

    2016-01-01

    Inverse problems with Poisson data arise in many photonic imaging modalities in medicine, engineering and astronomy. The design of regularization methods and estimators for such problems has been studied intensively over the last two decades. In this review we give an overview of statistical regularization theory for such problems, the most important applications, and the most widely used algorithms. The focus is on variational regularization methods in the form of penalized maximum likelihood estimators, which can be analyzed in a general setup. Complementing a number of recent convergence rate results we will establish consistency results. Moreover, we discuss estimators based on a wavelet-vaguelette decomposition of the (necessarily linear) forward operator. As most prominent applications we briefly introduce Positron emission tomography, inverse problems in fluorescence microscopy, and phase retrieval problems. The computation of a penalized maximum likelihood estimator involves the solution of a (typically convex) minimization problem. We also review several efficient algorithms which have been proposed for such problems over the last five years. (topical review)
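
    As background for the penalized maximum likelihood estimators surveyed above, the following is a minimal sketch of the classical (unpenalized) MLEM/Richardson-Lucy multiplicative update for Poisson data on a toy linear operator; the toy matrix A and all parameter values are arbitrary, and penalized variants modify this update with the gradient of the chosen penalty.

    ```python
    import numpy as np

    def mlem(A, y, n_iter=200, eps=1e-12):
        """MLEM (Richardson-Lucy) iteration for Poisson data y ~ Poisson(A x).

        Minimizes the Poisson negative log-likelihood
            sum_i [(A x)_i - y_i * log (A x)_i]
        over nonnegative x. Penalized estimators add a regularizer to this objective.
        """
        m, n = A.shape
        x = np.ones(n)                      # positive initialization
        sensitivity = A.sum(axis=0)         # A^T 1
        for _ in range(n_iter):
            forward = A @ x + eps
            x *= (A.T @ (y / forward)) / (sensitivity + eps)
        return x

    # Toy usage: a small random nonnegative forward operator and Poisson counts.
    rng = np.random.default_rng(1)
    A = rng.uniform(0.0, 1.0, size=(40, 20))
    x_true = rng.uniform(0.0, 5.0, size=20)
    y = rng.poisson(A @ x_true)
    x_hat = mlem(A, y)
    print(np.round(x_hat[:5], 2), np.round(x_true[:5], 2))
    ```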

  13. Tessellating the Sphere with Regular Polygons

    Science.gov (United States)

    Soto-Johnson, Hortensia; Bechthold, Dawn

    2004-01-01

    Tessellations in the Euclidean plane and regular polygons that tessellate the sphere are reviewed. The regular polygons that can possibly tessellate the sphere are spherical triangles, squares and pentagons.
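
    For context (a standard derivation, not quoted from the record above), the restriction to triangles, squares and pentagons follows from the vertex-angle condition for a tessellation of the sphere by regular p-gons meeting q at each vertex:

    ```latex
    % Euclidean interior angle of a regular p-gon: \pi - 2\pi/p.
    % For q such polygons to meet at a vertex of a spherical (positively curved)
    % tessellation, the Euclidean angle sum must fall short of a full turn:
    %   q\,(\pi - 2\pi/p) < 2\pi,
    % which rearranges to
    \[
      \frac{1}{p} + \frac{1}{q} \;>\; \frac{1}{2}, \qquad p,\,q \ge 3 .
    \]
    % Integer solutions: (p,q) \in \{(3,3),\,(3,4),\,(3,5),\,(4,3),\,(5,3)\},
    % i.e. only spherical triangles, squares and pentagons can serve as tiles,
    % matching the five Platonic tessellations of the sphere.
    ```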

  14. Traditional Islamic cities unveiled: the quest for urban design regularity

    Directory of Open Access Journals (Sweden)

    Jorge Correia

    2015-08-01

    Traditional Islamic cities have generally gathered orientalized gazes and perspectives, picking up misconceptions and stereotypes that, during the second half of the 19th century, were perpetuated by colonialism. More recent scholarship has shed light on the urban organization and composition of such tissues, most of them confined to old quarters or historical centres of thriving contemporary cities within the Arab-Muslim world. In fact, one of the most striking features has been the unveiling of layered urban assemblages where exterior agents have somehow launched or interrupted an apparent islamicized continuum. Primarily, this paper wishes to search for external political factors that have designed regularly geometrized patterns in medium-sized Arab towns. For that, two case studies from different geographies - the Maghreb and the Near East - will be morphologically analysed through updated urban surveys. Whereas Nablus (Palestine) owes the urban matrix of its old town to its Roman past, in Azemmour's medina (Morocco) it is still possible to track the thin European early-modern colonial stratum. However, both cases show how regularity patterns challenge Western concepts of geometrical design to embrace levels of rationality related to traditional Islamic urban forms, societal configurations and the built environment. Urban morphology becomes a fundamental tool for articulating history with the processes of sedimentation and evolution in order to read current urban prints and dynamics. Thus, the paper will also interpret alternative logics of rational urban display in Azemmour and Nablus, linked to ways of living within the Islamic sphere.

  15. Self-Management for Primary School Students Demonstrating Problem Behavior in Regular Classrooms: Evidence Review of Single-Case Design Research

    Science.gov (United States)

    Busacca, Margherita L.; Anderson, Angelika; Moore, Dennis W.

    2015-01-01

    This review evaluates self-management literature targeting problem behaviors of primary school students in general education settings. Thirty-one single-case design studies met inclusion criteria, of which 16 demonstrated adequate methodological rigor, according to What Works Clearinghouse (WWC) design standards. Visual analysis and WWC…

  16. Does Regular Breakfast Cereal Consumption Help Children and Adolescents Stay Slimmer? A Systematic Review and Meta-Analysis

    Directory of Open Access Journals (Sweden)

    Anne de la Hunty

    2013-03-01

    Objective: To review systematically the evidence on breakfast cereal consumption and obesity in children and adolescents and assess whether the regular consumption of breakfast cereals could help to prevent excessive weight gain. Methods: A systematic review and meta-analysis of studies relating breakfast cereal consumption to BMI, BMI z-scores and prevalence of obesity as the outcomes. Results: 14 papers met the inclusion criteria. The computed effect size for mean BMI between high consumers and low or non-consumers over all 25 study subgroups was -1.13 kg/m2 (95% CI -0.81, -1.46). Conclusion: Overall, the evidence reviewed is suggestive that regular consumption of breakfast cereals results in a lower BMI and a reduced likelihood of being overweight in children and adolescents. However, more evidence from long-term trials and investigations into mechanisms is needed to eliminate possible confounding factors and determine causality.

  17. Regularity effect in prospective memory during aging

    Directory of Open Access Journals (Sweden)

    Geoffrey Blondelle

    2016-10-01

    Background: The regularity effect can affect performance in prospective memory (PM), but little is known about the cognitive processes linked to this effect. Moreover, its impact with regard to aging remains unknown. To our knowledge, this study is the first to examine the regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults. Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 16 intermediate adults (40–55), and 25 older adults (65–80). The task, adapted from the Virtual Week, was designed to manipulate the regularity of the various activities of daily life that were to be recalled (regular repeated activities vs. irregular non-repeated activities). We examined the role of several cognitive functions, including certain dimensions of executive functions (planning, inhibition, shifting, and binding), short-term memory, and retrospective episodic memory, to identify those involved in PM, according to regularity and age. Results: A mixed-design ANOVA showed a main effect of task regularity and an interaction between age and regularity: an age-related difference in PM performance was found for irregular activities (older < young), but not for regular activities. All participants recalled more regular activities than irregular ones, with no age effect. It appeared that recalling regular activities only involved planning for both intermediate and older adults, while recalling irregular ones was linked to planning, inhibition, short-term memory, binding, and retrospective episodic memory. Conclusion: Taken together, our data suggest that planning capacities seem to play a major role in remembering to perform intended actions with advancing age. Furthermore, the age-PM-paradox may be attenuated when the experimental design is adapted by implementing a familiar context through the use of activities of daily living. The clinical

  18. The effect of additional physiotherapy to hospital inpatients outside of regular business hours: a systematic review.

    Science.gov (United States)

    Brusco, Natasha K; Paratz, Jennifer

    2006-12-01

    Provision of out of regular business hours (OBH) physiotherapy to hospital inpatients is widespread in the hospital setting. This systematic review evaluated the effect of additional OBH physiotherapy services on patient length of stay (LOS), pulmonary complications, discharge destination, discharge mobility status, quality of life, cost saving, adverse events, and mortality compared with physiotherapy only within regular business hours. A literature search was completed on databases with citation tracking using key words. Two reviewers completed data extraction and quality assessment independently by using modified scales for historical cohorts and case control studies as well as the PEDro scale for randomized controlled trials and quasi-randomised controlled trials. This search identified nine articles of low to medium quality. Four reported a significant reduction in LOS associated with additional OBH physiotherapy, with two articles reporting overall significance and two reporting only for specific subgroups. Two studies reported significant reduction in pulmonary complications for two different patient groups in an intensive care unit (ICU) with additional OBH physiotherapy. Three studies accounted for discharge destination and/or discharge mobility status with no significant difference reported. Quality of life, adverse events, and mortality were not reported in any studies. Cost savings were considered in three studies, with two reporting a cost saving. This systematic review was unable to conclude that the provision of additional OBH physiotherapy made significant improvement to patient outcomes for all subgroups of inpatients. One study in critical care reported that overnight physiotherapy decreased LOS and reduced pulmonary complications of patients in the ICU. However, the studies in the area of orthopaedics, neurology, postcardiac surgery, and rheumatology, which all considered additional daytime weekend physiotherapy intervention, did not provide

  19. The Impact of Regular Self-weighing on Weight Management: A Systematic Literature Review

    Directory of Open Access Journals (Sweden)

    Welsh Ericka M

    2008-11-01

    Background: Regular self-weighing has been a focus of attention recently in the obesity literature. It has received conflicting endorsement in that some researchers and practitioners recommend it as a key behavioral strategy for weight management, while others caution against its use due to its potential to cause negative psychological consequences associated with weight management failure. The evidence on frequent self-weighing, however, has not yet been synthesized. The purpose of this paper is to evaluate the evidence regarding the use of regular self-weighing for both weight loss and weight maintenance. Methods: A systematic literature review was conducted using the MEDLINE, CINAHL, and PsycINFO online databases. Reviewed studies were broken down by sample characteristics, predictors/conditions, dependent measures, findings, and evidence grade. Results: Twelve studies met the inclusion/exclusion criteria, but nearly half received low evidence grades in terms of methodological quality. Findings from 11 of the 12 reviewed studies indicated that more frequent self-weighing was associated with greater weight loss or weight gain prevention. Specifically, individuals who reported self-weighing weekly or daily, typically over a period of several months, held a 1 to 3 kg/m2 advantage over individuals who did not self-weigh frequently. The effects of self-weighing in experimental studies, especially those where self-weighing behaviors could be isolated, were less clear. Conclusion: Based on the consistency of the evidence reviewed, frequent self-weighing, at the very least, seems to be a good predictor of moderate weight loss, less weight regain, or the avoidance of initial weight gain in adults. More targeted research is needed in this area to determine the causal role of frequent self-weighing in weight loss/weight gain prevention programs. Other open questions to be pursued include the optimal dose of self-weighing, as well as the

  20. Regularization design for high-quality cone-beam CT of intracranial hemorrhage using statistical reconstruction

    Science.gov (United States)

    Dang, H.; Stayman, J. W.; Xu, J.; Sisniega, A.; Zbijewski, W.; Wang, X.; Foos, D. H.; Aygun, N.; Koliatsos, V. E.; Siewerdsen, J. H.

    2016-03-01

    Intracranial hemorrhage (ICH) is associated with pathologies such as hemorrhagic stroke and traumatic brain injury. Multi-detector CT is the current front-line imaging modality for detecting ICH (fresh blood contrast 40-80 HU, down to 1 mm). Flat-panel detector (FPD) cone-beam CT (CBCT) offers a potential alternative with a smaller scanner footprint, greater portability, and lower cost, potentially well suited to deployment at the point of care outside standard diagnostic radiology and emergency room settings. Previous studies have suggested reliable detection of ICH down to 3 mm in CBCT using high-fidelity artifact correction and penalized weighted least-squares (PWLS) image reconstruction with a post-artifact-correction noise model. However, ICH reconstructed by traditional image regularization exhibits nonuniform spatial resolution and noise due to interaction between the statistical weights and regularization, which potentially degrades the detectability of ICH. In this work, we propose three regularization methods designed to overcome these challenges. The first two compute spatially varying certainty for uniform spatial resolution and noise, respectively. The third computes spatially varying regularization strength to achieve uniform "detectability," combining both spatial resolution and noise in a manner analogous to a delta-function detection task. Experiments were conducted on a CBCT test-bench, and image quality was evaluated for simulated ICH in different regions of an anthropomorphic head. The first two methods improved the uniformity in spatial resolution and noise compared to traditional regularization. The third exhibited the highest uniformity in detectability among all methods and the best overall image quality. The proposed regularization provides a valuable means to achieve uniform image quality in CBCT of ICH and is being incorporated in a CBCT prototype for ICH imaging.
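
    To make the idea of a spatially varying regularization strength concrete (a generic 1-D sketch, not the certainty- or detectability-based design rules proposed in the paper; the weights, the β map and the toy signal are arbitrary), the following solves a small penalized weighted least-squares problem in which each pixel carries its own statistical weight and its own regularization strength:

    ```python
    import numpy as np

    def pwls_denoise_1d(y, w, beta):
        """Closed-form minimizer of a 1-D PWLS objective with spatially varying regularization:

            J(x) = 0.5 * sum_j w_j (x_j - y_j)^2 + 0.5 * sum_j beta_j (x_j - x_{j-1})^2

        Solves the normal equations (diag(w) + D^T diag(beta[1:]) D) x = w * y,
        where D is the first-difference operator.
        """
        n = y.size
        D = np.diff(np.eye(n), axis=0)        # (n-1, n) first-difference matrix
        B = np.diag(beta[1:])                 # one strength per difference term
        A = np.diag(w) + D.T @ B @ D
        return np.linalg.solve(A, w * y)

    # Toy usage: a piecewise-constant signal, nonuniform noise, and a beta map that
    # regularizes the noisier half more strongly.
    rng = np.random.default_rng(2)
    truth = np.concatenate([np.zeros(50), np.ones(50)])
    noise_sd = np.concatenate([np.full(50, 0.05), np.full(50, 0.3)])
    y = truth + rng.normal(0.0, noise_sd)
    w = 1.0 / noise_sd**2                     # inverse-variance (statistical) weights
    beta = np.concatenate([np.full(50, 0.5), np.full(50, 5.0)])
    x_hat = pwls_denoise_1d(y, w, beta)
    print(round(float(np.abs(x_hat - truth).mean()), 4))
    ```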

  1. A CRITICAL REVIEW OF INSTRUCTIONAL DESIGN PROCESS OF DISTANCE LEARNING SYSTEM

    Directory of Open Access Journals (Sweden)

    Muhammad Ajmal CHAUDRY

    2010-07-01

    Instructional design refers to the planning, development, delivery and evaluation of an instructional system. It is an applied field of study aiming at the application of descriptive research outcomes in regular instructional settings. The present study was designed to critically review the process of instructional design at Allama Iqbal Open University (AIOU). It was a survey study. The population of the study consisted of 120 academicians from different academic departments of AIOU. The survey was conducted through a questionnaire for academic staff. It was revealed that need assessment is not done before conceiving the outlines of a course. Also, the courses did not contain sufficient activities, pictures and illustrations, and it was found that they did not conform to the course objectives. The study recommended that proper training of the course writers for distance learning be arranged.

  2. ETF interim design review

    International Nuclear Information System (INIS)

    Steiner, D.; Rutherford, P.H.

    1980-01-01

    A three-day ETF Interim Design Review was conducted on July 23-25, 1980, at the Sheraton Potomac Inn in Rockville, Maryland. The intent of the review was to provide a forum for an in-depth assessment and critique of all facets of the ETF design by members of the fusion community. The review began with an opening plenary session at which an overview of the ETF design was presented by D. Steiner, manager of the ETF Design Center, complemented by a physics overview by P.H. Rutherford, chairman of the ETF/INTOR Physics Committee. This was followed by six concurrent review sessions over the next day and a half. The review closed with a plenary session at which the Design Review Board presented its findings. This document consists of the viewgraphs for the opening plenary session and an edited version of the presentations made by Steiner and Rutherford

  3. Linear deflectometry - Regularization and experimental design [Lineare Deflektometrie - Regularisierung und experimentelles Design]

    KAUST Repository

    Balzer, Jonathan

    2011-01-01

    Specular surfaces can be measured with deflectometric methods. The solutions form a one-parameter family whose properties are discussed in this paper. We show in theory and experiment that the shape sensitivity of solutions decreases with growing distance from the optical center of the imaging component of the sensor system and propose a novel regularization strategy. Recommendations for the construction of a measurement setup aim for benefiting this strategy as well as the contrarian standard approach of regularization by specular stereo. © Oldenbourg Wissenschaftsverlag.

  4. Manifold Regularized Experimental Design for Active Learning.

    Science.gov (United States)

    Zhang, Lining; Shum, Hubert P H; Shao, Ling

    2016-12-02

    Various machine learning and data mining tasks in classification require abundant data samples to be labeled for training. Conventional active learning methods aim at labeling the most informative samples for alleviating the labor of the user. Many previous studies in active learning select one sample after another in a greedy manner. However, this is not very effective because the classification model has to be retrained for each newly labeled sample. Moreover, many popular active learning approaches utilize the most uncertain samples by leveraging the classification hyperplane of the classifier, which is not appropriate since the classification hyperplane is inaccurate when the training data are small-sized. The problem of insufficient training data in real-world systems limits the potential applications of these approaches. This paper presents a novel method of active learning called manifold regularized experimental design (MRED), which can label multiple informative samples at one time for training. In addition, MRED gives an explicit geometric explanation for the selected samples to be labeled by the user. Different from existing active learning methods, our method avoids the intrinsic problems caused by insufficiently labeled samples in real-world applications. Various experiments on synthetic datasets, the Yale face database and the Corel image database have been carried out to show how MRED outperforms existing methods.

  5. Design review report for the hydrogen interlock preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-01-01

    This report documents the completion of a preliminary design review for the hydrogen interlock. The hydrogen interlock, a proposed addition to the Rotary Mode Core Sampling (RMCS) system portable exhauster, is intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  6. UNFOLDED REGULAR AND SEMI-REGULAR POLYHEDRA

    Directory of Open Access Journals (Sweden)

    IONIŢĂ Elena

    2015-06-01

    This paper proposes a presentation of unfolding regular and semi-regular polyhedra. Regular polyhedra are convex polyhedra whose faces are regular and equal polygons, with the same number of sides, and whose polyhedral angles are also regular and equal. Semi-regular polyhedra are convex polyhedra with regular polygon faces of several types and equal solid angles of the same type. A net of a polyhedron is a collection of edges in the plane which are the unfolded edges of the solid. The modeling and unfolding of Platonic and Archimedean polyhedra will be done using the 3dsMAX program. This paper is intended as an example of descriptive geometry applications.

  7. International Peer Reviews of Design Basis

    International Nuclear Information System (INIS)

    Hughes, Peter

    2013-01-01

    International peer reviews - Design and Safety Assessment Review Service: review of design requirements; review in support of licensing; review in support of severe accident management; review in support of modifications; review in relation to periodic safety or life extension. Reviews take place at any time in the NPP lifecycle, from concept through design and operations.

  8. FDH radiological design review guidelines

    International Nuclear Information System (INIS)

    Millsap, W.J.

    1998-01-01

    These guidelines discuss in more detail the radiological design review process used by the Project Hanford Management Contractors as described in HNF-PRO-1622, Radiological Design Review Process. They are intended to supplement the procedure by providing background information on the design review process and providing a ready source of information to design reviewers. The guidelines are not intended to contain all the information in the procedure, but at points, in order to maintain continuity, they contain some of the same information

  9. FDH radiological design review guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Millsap, W.J.

    1998-09-29

    These guidelines discuss in more detail the radiological design review process used by the Project Hanford Management Contractors as described in HNF-PRO-1622, Radiological Design Review Process. They are intended to supplement the procedure by providing background information on the design review process and providing a ready source of information to design reviewers. The guidelines are not intended to contain all the information in the procedure, but at points, in order to maintain continuity, they contain some of the same information.

  10. Design reviews from a regulatory perspective

    International Nuclear Information System (INIS)

    Foster, B.D.

    1991-01-01

    This paper presents views on the role of the licensing engineer in the design process with specific emphasis on design reviews and the automated information management tools that support design reviews. The licensing engineer is seen as an important member of a design review team. The initial focus of the licensing engineer during design reviews is shown to be on ensuring that applicable regulatory requirements are addressed by the design. The utility of an automated tool, such as a commitments management system, to support regulatory requirements identification is discussed. The next responsibility of the licensing engineer is seen as verifying that regulatory requirements are transformed into measurable performance requirements. Performance requirements are shown to provide the basis for developing detailed design review criteria. Licensing engineer input during design reviews is discussed. This input is shown to be especially critical in cases where review findings may impact application of regulatory requirements. The use of automated tools in supporting design reviews is discussed. An information structure is proposed to support design reviews in a regulated environment. This information structure is shown to be useful to activities beyond design reviews. Incorporation of the proposed information structure into the Licensing Support System is proposed

  11. Perceiving temporal regularity in music: the role of auditory event-related potentials (ERPs) in probing beat perception.

    Science.gov (United States)

    Honing, Henkjan; Bouwer, Fleur L; Háden, Gábor P

    2014-01-01

    The aim of this chapter is to give an overview of how the perception of a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). Next to a review of the recent literature on the perception of temporal regularity in music, we will discuss to what extent ERPs, and especially the component called mismatch negativity (MMN), can be instrumental in probing beat perception. We conclude with a discussion on the pitfalls and prospects of using ERPs to probe the perception of a regular beat, in which we present possible constraints on stimulus design and discuss future perspectives.

  12. Bypassing the Limits of ℓ1 Regularization: Convex Sparse Signal Processing Using Non-Convex Regularization

    Science.gov (United States)

    Parekh, Ankit

    Sparsity has become the basis of some important signal processing methods over the last ten years. Many signal processing problems (e.g., denoising, deconvolution, non-linear component analysis) can be expressed as inverse problems. Sparsity is invoked through the formulation of an inverse problem with suitably designed regularization terms. The regularization terms alone encode sparsity into the problem formulation. Often, the ℓ1 norm is used to induce sparsity, so much so that ℓ1 regularization is considered to be `modern least-squares'. The use of ℓ1 norm, as a sparsity-inducing regularizer, leads to a convex optimization problem, which has several benefits: the absence of extraneous local minima, well developed theory of globally convergent algorithms, even for large-scale problems. Convex regularization via the ℓ1 norm, however, tends to under-estimate the non-zero values of sparse signals. In order to estimate the non-zero values more accurately, non-convex regularization is often favored over convex regularization. However, non-convex regularization generally leads to non-convex optimization, which suffers from numerous issues: convergence may be guaranteed to only a stationary point, problem specific parameters may be difficult to set, and the solution is sensitive to the initialization of the algorithm. The first part of this thesis is aimed toward combining the benefits of non-convex regularization and convex optimization to estimate sparse signals more effectively. To this end, we propose to use parameterized non-convex regularizers with designated non-convexity and provide a range for the non-convex parameter so as to ensure that the objective function is strictly convex. By ensuring convexity of the objective function (sum of data-fidelity and non-convex regularizer), we can make use of a wide variety of convex optimization algorithms to obtain the unique global minimum reliably. The second part of this thesis proposes a non-linear signal
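
    One concrete instance of the "designated non-convexity" idea described above (an illustrative sketch, not code from the thesis) is scalar denoising with the minimax concave penalty (MCP): the penalty itself is non-convex, yet for γ > 1 the scalar objective 0.5(y − x)² + MCP(x; λ, γ) remains convex, and its minimizer is the firm-threshold function, which shrinks small values like the soft threshold but leaves large values unbiased.

    ```python
    import numpy as np

    def mcp_threshold(y, lam, gamma):
        """Proximal/threshold function for the minimax concave penalty (MCP).

        Solves  argmin_x  0.5*(y - x)**2 + MCP(x; lam, gamma)  elementwise.
        For gamma > 1 the objective is convex even though MCP itself is
        non-convex, so the minimizer is unique (the 'firm' threshold).
        """
        if gamma <= 1.0:
            raise ValueError("gamma must exceed 1 to keep the scalar objective convex")
        y = np.asarray(y, dtype=float)
        out = np.zeros_like(y)
        mid = (np.abs(y) > lam) & (np.abs(y) <= gamma * lam)
        big = np.abs(y) > gamma * lam
        out[mid] = np.sign(y[mid]) * (np.abs(y[mid]) - lam) * gamma / (gamma - 1.0)
        out[big] = y[big]                   # large values pass through unbiased
        return out

    def soft_threshold(y, lam):
        """Standard l1 (soft) threshold, shown for comparison: it shrinks every value."""
        return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

    # Small values are zeroed by both rules; large values are shrunk by the soft
    # threshold but left untouched by the MCP/firm threshold.
    y = np.array([-5.0, -1.2, 0.3, 1.5, 6.0])
    print(soft_threshold(y, lam=1.0))              # [-4.  -0.2  0.   0.5  5. ]
    print(mcp_threshold(y, lam=1.0, gamma=3.0))    # [-5.  -0.3  0.   0.75 6. ]
    ```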

  13. Linear deflectometry - Regularization and experimental design [Lineare Deflektometrie - Regularisierung und experimentelles Design]

    KAUST Repository

    Balzer, Jonathan; Werling, Stefan; Beyerer, Jü rgen

    2011-01-01

    distance from the optical center of the imaging component of the sensor system and propose a novel regularization strategy. Recommendations for the construction of a measurement setup aim for benefiting this strategy as well as the contrarian standard

  14. Formal Design Review Foot Clamp Modification

    International Nuclear Information System (INIS)

    OTEN, T.C.

    2000-01-01

    This report documents the Design Review performed for the foot clamp modification. The report documents the acceptability of the design, identifies the documents that were reviewed, the scope of the review and the members of the review team

  15. Sparsity regularization for parameter identification problems

    International Nuclear Information System (INIS)

    Jin, Bangti; Maass, Peter

    2012-01-01

    The investigation of regularization schemes with sparsity promoting penalty terms has been one of the dominant topics in the field of inverse problems over the last years, and Tikhonov functionals with ℓp-penalty terms for 1 ⩽ p ⩽ 2 have been studied extensively. The first investigations focused on regularization properties of the minimizers of such functionals with linear operators and on iteration schemes for approximating the minimizers. These results were quickly transferred to nonlinear operator equations, including nonsmooth operators and more general function space settings. The latest results on regularization properties additionally assume a sparse representation of the true solution as well as generalized source conditions, which yield some surprising and optimal convergence rates. The regularization theory with ℓp sparsity constraints is relatively complete in this setting; see the first part of this review. In contrast, the development of efficient numerical schemes for approximating minimizers of Tikhonov functionals with sparsity constraints for nonlinear operators is still ongoing. The basic iterated soft shrinkage approach has been extended in several directions and semi-smooth Newton methods are becoming applicable in this field. In particular, the extension to more general non-convex, non-differentiable functionals by variational principles leads to a variety of generalized iteration schemes. We focus on such iteration schemes in the second part of this review. A major part of this survey is devoted to applying sparsity constrained regularization techniques to parameter identification problems for partial differential equations, which we regard as the prototypical setting for nonlinear inverse problems. Parameter identification problems exhibit different levels of complexity and we aim at characterizing a hierarchy of such problems. The operator defining these inverse problems is the parameter-to-state mapping. We first summarize some
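
    The "iterated soft shrinkage approach" mentioned above is, in its basic linear form, the ISTA iteration for a Tikhonov functional with an ℓ1 penalty. The following is a minimal sketch for a generic matrix operator (illustrative only; the step-size rule, toy data and parameter values are arbitrary and it is not one of the generalized schemes surveyed in the review):

    ```python
    import numpy as np

    def ista(A, y, lam, n_iter=500):
        """Iterated soft shrinkage (ISTA) for  min_x 0.5*||A x - y||^2 + lam*||x||_1."""
        step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)
            z = x - step * grad
            x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)   # soft shrinkage
        return x

    # Toy usage: recover a sparse vector from a few noisy linear measurements.
    rng = np.random.default_rng(3)
    A = rng.normal(size=(60, 120))
    x_true = np.zeros(120)
    x_true[rng.choice(120, 6, replace=False)] = rng.normal(0, 3, 6)
    y = A @ x_true + 0.05 * rng.normal(size=60)
    x_hat = ista(A, y, lam=0.5)
    print(int((np.abs(x_hat) > 1e-3).sum()), "nonzeros recovered")
    ```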

  16. Experiment Design Regularization-Based Hardware/Software Codesign for Real-Time Enhanced Imaging in Uncertain Remote Sensing Environment

    Directory of Open Access Journals (Sweden)

    Castillo Atoche A

    2010-01-01

    A new aggregated Hardware/Software (HW/SW) codesign approach to optimization of the digital signal processing techniques for enhanced imaging with real-world uncertain remote sensing (RS) data, based on the concept of descriptive experiment design regularization (DEDR), is addressed. We consider the applications of the developed approach to typical single-look synthetic aperture radar (SAR) imaging systems operating in real-world uncertain RS scenarios. The software design is aimed at the algorithmic-level decrease of the computational load of the large-scale SAR image enhancement tasks. The innovative algorithmic idea is to incorporate into the DEDR-optimized fixed-point iterative reconstruction/enhancement procedure the convex convergence enforcement regularization via constructing the proper multilevel projections onto convex sets (POCS) in the solution domain. The hardware design is performed via systolic array computing based on a Xilinx Field Programmable Gate Array (FPGA) XC4VSX35-10ff668 and is aimed at implementing the unified DEDR-POCS image enhancement/reconstruction procedures in a computationally efficient multi-level parallel fashion that meets the (near) real-time image processing requirements. Finally, we comment on the simulation results indicative of the significantly increased performance efficiency, both in resolution enhancement and in computational complexity reduction metrics, gained with the proposed aggregated HW/SW codesign approach.

  17. Design review report, 241-S-102 cover plate review; TOPICAL

    International Nuclear Information System (INIS)

    ADAMS, M.R.

    1998-01-01

    The design for the cover plate and lead plate for shielding on 241-S-102 was reviewed on 10/21/98. All Review Comment Record comments were resolved to the satisfaction of the reviewers. Additional comments were taken during the meeting and were also resolved. A design calculation for the Radiological Design Review Screening was presented as the criterion for the use of 1 inch lead plate. The review concluded that the use of 2 inch steel plate and 1 inch lead plate provided the safety function required by HNF-SD-WM-810-001, 5.3.2.20, Basis for Interim Operation. The design was approved with the incorporated comments as recorded on RCRs and in the meeting minutes.

  18. Guidelines for control room design reviews

    International Nuclear Information System (INIS)

    1981-09-01

    The control room design review is part of a broad program being undertaken by the nuclear industry and the government to ensure consideration of human factors in nuclear power plant design and operation. The purpose of the control room design review described by these guidelines is to (1) review and evaluate the control room workspace, instrumentation, controls, and other equipment from a human factors engineering point of view that takes into account both system demands and operator capabilities; and (2) to identify, assess, and implement control room design modifications that correct inadequate or unsuitable items. The scope of the control room design review described by these guidelines covers the human engineering review of completed control rooms; i.e., operational control rooms or those at that stage of the licensing process where control room design and equipment selection are committed. These guidelines should also be of use during the design process for new control rooms. However, additional analyses to optimize the allocation of functions to man and machine, and further examination of advanced control system technology, are recommended for new control rooms. Guidelines and references for comprehensive system analyses designed to incorporate human factors considerations into the design and development of new control rooms are presented in Appendix B. Where possible, a generic approach to the control room design review process is encouraged; for example, when control room designs are replicated wholly or in part in two or more units. Even when designs are not replicated exactly, generic reviews which can be modified to account for specific differences in particular control rooms should be considered. Industry organizations and owners groups are encouraged to coordinate joint efforts and share data to develop generic approaches to the design review process. The control room design review should accomplish the following specific objectives. To determine

  19. NRF TRIGA package design review report

    International Nuclear Information System (INIS)

    Clements, M.D.

    1994-01-01

    The purpose of this document is to compile, present and document the formal design review of the NRF TRIGA packaging. The contents of this document include: the briefing meeting presentations, package description, design calculations, package review drawings, meeting minutes, action item lists, review comment records, final resolutions, and released drawings. This design review required more than two meetings to resolve comments. Therefore, there are three sets of meeting minutes and two action item lists.

  20. Consumer perceptions of cigarette pack design in France: a comparison of regular, limited edition and plain packaging.

    Science.gov (United States)

    Gallopel-Morvan, Karine; Moodie, Crawford; Hammond, David; Eker, Figen; Beguinot, Emmanuelle; Martinet, Yves

    2012-09-01

    In the face of comprehensive bans on the marketing of tobacco products, packaging has become an increasingly important promotional tool for the tobacco industry. A ban on the use of branding on tobacco packaging, known as 'plain' packaging, has emerged as a promising regulatory strategy. The current study sought to examine perceptions of cigarette packaging among adults in France. Adult smokers and non-smokers (N=836) were surveyed using computer-assisted personal interviewing to assess perceptions of pack design by comparing 'regular' branded packs and 'limited edition' packs (with novel designs or innovations) with 'plain' versions of these packs with all branding, including colour, removed. Plain packs (PP) were less likely than regular packs, and particularly limited edition packs, to be considered attractive, attention grabbing and likely to motivate youth purchase. PPs were also rated as the most effective in convincing non-smokers not to start and smokers to reduce consumption and quit. Logistic regression showed that smokers motivated to quit, in comparison to smokers not motivated to quit, were significantly more likely to consider the PPs as the packs most likely to motivate cessation. Novel cigarette packaging, in the form of limited edition packs, had the highest ratings of consumer appeal, ahead of regular branded packs and also PPs. Interestingly, PPs were perceived to be the packs most likely to promote cessation among those adults with quitting intentions. Plain packaging, therefore, may be a means of helping existing adult smokers motivated to quit to do so.

  1. Regularity effect in prospective memory during aging

    OpenAIRE

    Blondelle, Geoffrey; Hainselin, Mathieu; Gounden, Yannick; Heurley, Laurent; Voisin, Hélène; Megalakaki, Olga; Bressous, Estelle; Quaglino, Véronique

    2016-01-01

    Background: Regularity effect can affect performance in prospective memory (PM), but little is known on the cognitive processes linked to this effect. Moreover, its impacts with regard to aging remain unknown. To our knowledge, this study is the first to examine regularity effect in PM in a lifespan perspective, with a sample of young, intermediate, and older adults.Objective and design: Our study examined the regularity effect in PM in three groups of participants: 28 young adults (18–30), 1...

  2. A review of Ramadan fasting and regular physical activity on metabolic syndrome indices

    Directory of Open Access Journals (Sweden)

    Seyyed Reza Attarzadeh Hosseini

    2016-03-01

    Introduction: Metabolic syndrome constitutes a cluster of risk factors such as obesity, hyperglycemia, hypertension, and dyslipidemia, which increase the risk of cardiovascular diseases and type II diabetes mellitus. In this review article, we aimed to discuss the possible effects of fasting and regular physical activity on risk factors for cardiovascular diseases. Methods: Online databases including Google Scholar, SID, PubMed, and MagIran were searched, using the following keywords: “training”, “exercise”, “physical activity”, “fasting”, “Ramadan”, “metabolic syndrome”, “fat percentage”, “blood pressure”, “blood sugar”, “cholesterol”, “triglyceride”, and “low-density lipoprotein-cholesterol”. All articles, including research studies, review articles, descriptive and analytical studies, and cross-sectional research, published during 2006-2015, were reviewed. In case of any errors in the methodology of articles, they were removed from our analysis. Results: Based on our literature review, inconsistent findings have been reported on risk factors for metabolic syndrome. However, the majority of conducted studies have suggested the positive effects of fasting on reducing the risk factors for metabolic syndrome. Conclusion: Although fasting in different seasons of the year has no significant impacts on mental health or physical fitness, it can reduce the risk of various diseases such as cardiovascular diseases. Also, based on the conducted studies, if individuals adhere to a proper diet, avoid excessive eating, drink sufficient amounts of fluids, and keep a healthy level of physical activity, fasting can improve their physical health.

  3. Human-system interface design review guideline -- Reviewer's checklist: Final report. Revision 1, Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human-system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 2 is a complete set of the guidelines contained in Volume 1, Part 2, but in a checklist format that can be used by reviewers to assemble sets of individual guidelines for use in specific design reviews. The checklist provides space for reviewers to enter guideline evaluations and comments.

  4. Lattice regularized chiral perturbation theory

    International Nuclear Information System (INIS)

    Borasoy, Bugra; Lewis, Randy; Ouimet, Pierre-Philippe A.

    2004-01-01

    Chiral perturbation theory can be defined and regularized on a spacetime lattice. A few motivations are discussed here, and an explicit lattice Lagrangian is reviewed. A particular aspect of the connection between lattice chiral perturbation theory and lattice QCD is explored through a study of the Wess-Zumino-Witten term

  5. Diverse Regular Employees and Non-regular Employment (Japanese)

    OpenAIRE

    MORISHIMA Motohiro

    2011-01-01

    Currently there are high expectations for the introduction of policies related to diverse regular employees. These policies are a response to the problem of disparities between regular and non-regular employees (part-time, temporary, contract and other non-regular employees) and will make it more likely that workers can balance work and their private lives while companies benefit from the advantages of regular employment. In this paper, I look at two issues that underlie this discussion. The ...

  6. The effectiveness of regular leisure-time physical activities on long-term glycemic control in people with type 2 diabetes: A systematic review and meta-analysis.

    Science.gov (United States)

    Pai, Lee-Wen; Li, Tsai-Chung; Hwu, Yueh-Juen; Chang, Shu-Chuan; Chen, Li-Li; Chang, Pi-Ying

    2016-03-01

    The objective of this study was to systematically review the effectiveness of different types of regular leisure-time physical activities and to pool the effect sizes of those activities on long-term glycemic control in people with type 2 diabetes compared with routine care. This review included randomized controlled trials from 1960 to May 2014. A total of 10 Chinese and English databases were searched; following selection and critical appraisal, 18 randomized controlled trials with 915 participants were included. The standardized mean difference was reported as the summary statistic for the overall effect size in a random effects model. The results indicated that yoga was the most effective in lowering glycated haemoglobin A1c (HbA1c) levels. Meta-analysis also revealed that the decrease in HbA1c levels of the subjects who took part in regular leisure-time physical activities was 0.60% more than that of control group participants. A higher frequency of regular leisure-time physical activities was found to be more effective in reducing HbA1c levels. The results of this review provide evidence of the benefits associated with regular leisure-time physical activities compared with routine care for lowering HbA1c levels in people with type 2 diabetes. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
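    The pooling described above follows the standard random-effects approach; below is a minimal sketch of DerSimonian-Laird pooling of study effect sizes (e.g. standardized mean differences), assuming per-study effects and variances are already available. The numbers are illustrative placeholders, not the review's data.

```python
import numpy as np

def random_effects_pool(effects, variances):
    """DerSimonian-Laird random-effects pooling of study effect sizes."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                # fixed-effect (inverse-variance) weights
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)         # Cochran's heterogeneity statistic
    C = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / C)    # between-study variance estimate
    w_re = 1.0 / (v + tau2)                    # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# illustrative effect sizes (SMDs) and variances from hypothetical trials
print(random_effects_pool([-0.8, -0.4, -0.6, -0.2], [0.04, 0.06, 0.05, 0.08]))
```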

  7. Design methodology for wing trailing edge device mechanisms

    OpenAIRE

    Martins Pires, Rui Miguel

    2007-01-01

    Over the last few decades the design of high lift devices has become a very important part of the total aircraft design process. Reviews of the design process are performed on a regular basis, with the intent to improve and optimize the design process. This thesis describes a new and innovative methodology for the design and evaluation of mechanisms for Trailing Edge High-Lift devices. The initial research reviewed existing High-Lift device design methodologies and current f...

  8. Optimal Signal Design for Mixed Equilibrium Networks with Autonomous and Regular Vehicles

    Directory of Open Access Journals (Sweden)

    Nan Jiang

    2017-01-01

    Full Text Available A signal design problem is studied for efficiently managing autonomous vehicles (AVs) and regular vehicles (RVs) simultaneously in transportation networks. AVs and RVs move on separate lanes, and the two types of vehicles share the green times at the same intersections. The signal design problem is formulated as a bilevel program. The lower-level model describes a mixed equilibrium where autonomous vehicles follow the Cournot-Nash (CN) principle and RVs follow the user equilibrium (UE) principle. In the upper-level model, signal timings are optimized at signalized intersections to allocate appropriate green times to both AVs and RVs to minimize system travel cost. A sensitivity-analysis-based method is used to solve the bilevel optimization model. Various signal control strategies are evaluated through numerical examples and some insightful findings are obtained. It was found that the number of phases at intersections should be reduced for the optimal control of the AVs and RVs in the mixed networks. More importantly, incorporating AVs into the transportation network would improve the system performance due to the value of AV technologies in reducing random delays at intersections. Meanwhile, travelers prefer to choose AVs as the networks become congested.

  9. Coordinate-invariant regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-01-01

    A general phase-space framework for coordinate-invariant regularization is given. The development is geometric, with all regularization contained in regularized DeWitt Superstructures on field deformations. Parallel development of invariant coordinate-space regularization is obtained by regularized functional integration of the momenta. As representative examples of the general formulation, the regularized general non-linear sigma model and regularized quantum gravity are discussed. copyright 1987 Academic Press, Inc

  10. Design of 4D x-ray tomography experiments for reconstruction using regularized iterative algorithms

    Science.gov (United States)

    Mohan, K. Aditya

    2017-10-01

    4D X-ray computed tomography (4D-XCT) is widely used to perform non-destructive characterization of time-varying physical processes in various materials. The conventional approach to improving temporal resolution in 4D-XCT involves the development of expensive and complex instrumentation that acquires data faster with reduced noise. It is customary to acquire data with many tomographic views at a high signal-to-noise ratio. Instead, temporal resolution can be improved using regularized iterative algorithms that are less sensitive to noise and limited views. These algorithms benefit from optimization of other parameters such as the view sampling strategy while improving temporal resolution by reducing the total number of views or the detector exposure time. This paper presents the design principles of 4D-XCT experiments when using regularized iterative algorithms derived using the framework of model-based reconstruction. A strategy for performing 4D-XCT experiments is presented that allows for improving the temporal resolution by progressively reducing the number of views or the detector exposure time. Theoretical analysis of the effect of the data acquisition parameters on the detector signal-to-noise ratio, spatial reconstruction resolution, and temporal reconstruction resolution is also presented in this paper.

  11. Continuum regularized Yang-Mills theory

    International Nuclear Information System (INIS)

    Sadun, L.A.

    1987-01-01

    Using the machinery of stochastic quantization, Z. Bern, M. B. Halpern, C. Taubes and I recently proposed a continuum regularization technique for quantum field theory. This regularization may be implemented by applying a regulator to either the (d + 1)-dimensional Parisi-Wu Langevin equation or, equivalently, to the d-dimensional second order Schwinger-Dyson (SD) equations. This technique is non-perturbative, respects all gauge and Lorentz symmetries, and is consistent with a ghost-free gauge fixing (Zwanziger's). This thesis is a detailed study of this regulator, and of regularized Yang-Mills theory, using both perturbative and non-perturbative techniques. The perturbative analysis comes first. The mechanism of stochastic quantization is reviewed, and a perturbative expansion based on second-order SD equations is developed. A diagrammatic method (SD diagrams) for evaluating terms of this expansion is developed. We apply the continuum regulator to a scalar field theory. Using SD diagrams, we show that all Green functions can be rendered finite to all orders in perturbation theory. Even non-renormalizable theories can be regularized. The continuum regulator is then applied to Yang-Mills theory, in conjunction with Zwanziger's gauge fixing. A perturbative expansion of the regulator is incorporated into the diagrammatic method. It is hoped that the techniques discussed in this thesis will contribute to the construction of a renormalized Yang-Mills theory in 3 and 4 dimensions

  12. Selection of regularization parameter for l1-regularized damage detection

    Science.gov (United States)

    Hou, Rongrong; Xia, Yong; Bao, Yuequan; Zhou, Xiaoqing

    2018-06-01

    The l1 regularization technique has been developed for structural health monitoring and damage detection through employing the sparsity condition of structural damage. The regularization parameter, which controls the trade-off between data fidelity and solution size of the regularization problem, exerts a crucial effect on the solution. However, the l1 regularization problem has no closed-form solution, and the regularization parameter is usually selected by experience. This study proposes two strategies of selecting the regularization parameter for the l1-regularized damage detection problem. The first method utilizes the residual and solution norms of the optimization problem and ensures that they are both small. The other method is based on the discrepancy principle, which requires that the variance of the discrepancy between the calculated and measured responses is close to the variance of the measurement noise. The two methods are applied to a cantilever beam and a three-story frame. A range of the regularization parameter, rather than one single value, can be determined. When the regularization parameter in this range is selected, the damage can be accurately identified even for multiple damage scenarios. This range also indicates the sensitivity degree of the damage identification problem to the regularization parameter.
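    As a rough sketch of the second strategy (the discrepancy principle), the snippet below scans candidate regularization parameters for an l1-regularized least-squares problem and keeps the one whose residual variance is closest to the assumed measurement-noise variance. The sensitivity matrix, responses, and noise level are synthetic placeholders, and scikit-learn's Lasso stands in as a generic l1 solver rather than the authors' formulation.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
A = rng.normal(size=(120, 40))                 # placeholder sensitivity matrix
x_true = np.zeros(40)
x_true[[3, 17]] = -0.4, 0.6                    # sparse "damage" vector
sigma = 0.05                                   # assumed measurement-noise std
b = A @ x_true + sigma * rng.normal(size=120)

best_alpha, best_gap = None, np.inf
for alpha in np.logspace(-4, 0, 40):
    model = Lasso(alpha=alpha, max_iter=50000).fit(A, b)
    residual = b - model.predict(A)
    gap = abs(residual.var() - sigma ** 2)     # discrepancy: residual variance vs noise variance
    if gap < best_gap:
        best_alpha, best_gap = alpha, gap
print("selected regularization parameter:", best_alpha)
```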

  13. Expert systems for assisting in design reviews

    International Nuclear Information System (INIS)

    Brtis, J.S.; Johnson, W.J.; Weber, N.; Naser, J.

    1990-01-01

    This paper discusses Sargent and Lundy's (S and L's) use of expert system technologies to computerize the procedures used for engineering design reviews. This paper discusses expert systems and the advantages that result from using them to computerize the decision-making process. This paper also discusses the design review expert systems that S and L has developed to perform fire protection and ALARA (as low as reasonably achievable) design reviews, and is currently developing for the Electric Power Research Institute (EPRI) to perform 10 CFR 50.59 safety reviews

  14. Effects of Irregular Bridge Columns and Feasibility of Seismic Regularity

    Science.gov (United States)

    Thomas, Abey E.

    2018-05-01

    Unequal column height is one of the main irregularities in bridge design, particularly while negotiating steep valleys, and it makes bridges vulnerable to seismic action. The desirable behaviour of bridge columns under seismic loading is that they should perform in a regular fashion, i.e. the capacity of each column should be utilized evenly. However, this type of behaviour is often missing when the column heights are unequal along the length of the bridge, allowing short columns to bear the maximum lateral load. In the present study, the effects of unequal column height on the global seismic performance of bridges are studied using pushover analysis. Codes such as CalTrans (Engineering service center, earthquake engineering branch, 2013) and EC-8 (EN 1998-2: design of structures for earthquake resistance. Part 2: bridges, European Committee for Standardization, Brussels, 2005) suggest seismic regularity criteria for achieving a regular seismic performance level at all the bridge columns. The feasibility of adopting these seismic regularity criteria, along with those mentioned in the literature, is assessed for bridges designed as per the Indian Standards.

  15. Closedness type regularity conditions in convex optimization and beyond

    Directory of Open Access Journals (Sweden)

    Sorin-Mihai Grad

    2016-09-01

    Full Text Available The closedness type regularity conditions have proven during the last decade to be viable alternatives to their more restrictive interiority type counterparts, in both convex optimization and different areas where it was successfully applied. In this review article we de- and reconstruct some closedness type regularity conditions formulated by means of epigraphs and subdifferentials, respectively, for general optimization problems in order to stress that they arise naturally when dealing with such problems. The results are then specialized for constrained and unconstrained convex optimization problems. We also hint towards other classes of optimization problems where closedness type regularity conditions were successfully employed and discuss other possible applications of them.

  16. A projection-based approach to general-form Tikhonov regularization

    DEFF Research Database (Denmark)

    Kilmer, Misha E.; Hansen, Per Christian; Espanol, Malena I.

    2007-01-01

    We present a projection-based iterative algorithm for computing general-form Tikhonov regularized solutions to the problem min_x ||Ax - b||_2^2 + lambda^2 ||Lx||_2^2, where the regularization matrix L is not the identity. Our algorithm is designed for the common case where lambda is not known a priori...
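    For small dense problems, the general-form Tikhonov problem quoted above can be solved directly by stacking the data-fidelity and regularization terms into a single least-squares system; a minimal sketch follows. Note this direct solve is not the paper's projection-based iterative algorithm, which targets large-scale problems where such an approach is impractical.

```python
import numpy as np

def tikhonov_general(A, b, L, lam):
    """Solve min_x ||A x - b||_2^2 + lam^2 ||L x||_2^2 via one stacked least-squares solve."""
    A_aug = np.vstack([A, lam * L])
    b_aug = np.concatenate([b, np.zeros(L.shape[0])])
    x, *_ = np.linalg.lstsq(A_aug, b_aug, rcond=None)
    return x

# tiny example with a first-difference regularization operator L (not the identity)
n = 50
A = np.tril(np.ones((n, n)))                       # simple ill-conditioned forward model
L = np.eye(n)[:-1] - np.eye(n, k=1)[:-1]           # (n-1) x n finite-difference matrix
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + 0.01 * np.random.default_rng(0).normal(size=n)
x_reg = tikhonov_general(A, b, L, lam=1.0)
```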

  17. A novel approach of ensuring layout regularity correct by construction in advanced technologies

    Science.gov (United States)

    Ahmed, Shafquat Jahan; Vaderiya, Yagnesh; Gupta, Radhika; Parthasarathy, Chittoor; Marin, Jean-Claude; Robert, Frederic

    2017-03-01

    In advanced technology nodes, layout regularity has become a mandatory prerequisite for creating robust designs that are less sensitive to variations in the manufacturing process, in order to improve yield and minimize electrical variability. In this paper we describe a method for designing regular full custom layouts based on design and process co-optimization. The method includes various design rule checks that can be used on-the-fly during leaf-cell layout development. We extract a Layout Regularity Index (LRI) from the layouts based on the jogs, alignments and pitches used in the design for any given metal layer. The Regularity Index of a layout is a direct indicator of manufacturing yield and is used to compare the relative health of different layout blocks in terms of process friendliness. The method has been deployed for 28nm and 40nm technology nodes for Memory IP and is being extended to other IPs (IO, standard-cell). We have quantified the gain of layout regularity with the deployed method on printability and electrical characteristics by process-variation (PV) band simulation analysis and have achieved up to 5 nm reduction in PV band.

  18. DESIGN OF STRUCTURAL ELEMENTS IN THE EVENT OF THE PRE-SET RELIABILITY, REGULAR LOAD AND BEARING CAPACITY DISTRIBUTION

    Directory of Open Access Journals (Sweden)

    Tamrazyan Ashot Georgievich

    2012-10-01

    Full Text Available Accurate and adequate description of external influences and of the bearing capacity of the structural material requires the employment of probability theory methods. In this regard, a characteristic that describes the probability of failure-free operation is required. The characteristic of reliability means that the maximum stress caused by the action of the load will not exceed the bearing capacity. In this paper, the author presents a solution to the problem of calculation of structures, namely, the identification of the reliability of pre-set design parameters, in particular, cross-sectional dimensions. If the load distribution pattern is available, employment of the regularities of distributed functions makes it possible to find the pattern of distribution of maximum stresses over the structure. Similarly, we can proceed to the design of structures of pre-set rigidity, reliability and stability in the case of regular load distribution. We consider a design element (a monolithic concrete slab) whose maximum stress S depends linearly on load q. Within a pre-set period of time, the probability will not exceed the values according to the Poisson law. The analysis demonstrates that the variability of the bearing capacity produces a stronger effect on the relative sizes of cross sections of a slab than the variability of loads. It is therefore particularly important to reduce the coefficient of variation of the load capacity. One of the methods contemplates the truncation of the bearing capacity distribution by pre-culling the construction material.
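    As a generic numerical illustration of the reliability characteristic described above (the maximum stress staying below the bearing capacity), the sketch below assumes a normally distributed load effect and bearing capacity and a Poisson model for load occurrences over a service period. All distributions and numbers are assumptions chosen for illustration, not values from the paper.

```python
from math import exp, sqrt
from scipy.stats import norm

# assumed normal load effect S and bearing capacity R (illustrative values, in kN)
mu_S, sigma_S = 120.0, 18.0
mu_R, sigma_R = 200.0, 30.0

beta = (mu_R - mu_S) / sqrt(sigma_R ** 2 + sigma_S ** 2)   # reliability index
p_fail = norm.cdf(-beta)                                   # failure probability per load event

# Poisson model of load occurrences: probability of failure-free operation over T years
nu, T = 2.0, 50.0                                          # assumed events per year, service life
p_survive = exp(-nu * T * p_fail)
print(f"beta = {beta:.2f}, p_fail = {p_fail:.2e}, P(no failure in {T:.0f} y) = {p_survive:.4f}")
```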

  19. Consistent Partial Least Squares Path Modeling via Regularization.

    Science.gov (United States)

    Jung, Sunho; Park, JaeHong

    2018-01-01

    Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc does not yet have a remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
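    The remedy adopted here is a ridge-type penalty; the toy sketch below only illustrates the underlying idea (how ridge regularization stabilizes coefficient estimates under severe multicollinearity), using ordinary regression rather than the PLSc algorithm itself.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge

rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)        # nearly collinear second predictor
X = np.column_stack([x1, x2])
y = 1.0 * x1 + 1.0 * x2 + rng.normal(scale=0.5, size=n)

ols = LinearRegression().fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)
print("OLS coefficients:  ", np.round(ols.coef_, 2))    # unstable under collinearity
print("Ridge coefficients:", np.round(ridge.coef_, 2))  # shrunk toward stable values near (1, 1)
```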

  20. Consistent Partial Least Squares Path Modeling via Regularization

    Directory of Open Access Journals (Sweden)

    Sunho Jung

    2018-02-01

    Full Text Available Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc does not yet have a remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.

  1. Online Manifold Regularization by Dual Ascending Procedure

    Directory of Open Access Journals (Sweden)

    Boliang Sun

    2013-01-01

    Full Text Available We propose a novel online manifold regularization framework based on the notion of duality in constrained optimization. The Fenchel conjugate of the hinge function is the key to transferring manifold regularization from the offline to the online setting. Our algorithms are derived by gradient ascent on the dual function. For practical purposes, we propose two buffering strategies and two sparse approximations to reduce the computational complexity. Detailed experiments verify the utility of our approaches. An important conclusion is that our online MR algorithms can handle settings where the target hypothesis is not fixed but drifts with the sequence of examples. We also recap and draw connections to earlier works. This paper paves the way for the design and analysis of online manifold regularization algorithms.
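    To make the manifold regularization term concrete, here is a small offline (batch) sketch of linear Laplacian-regularized least squares: a graph Laplacian built over labeled and unlabeled points penalizes predictions that vary sharply across the data manifold. This is only the classical offline setting and does not reproduce the paper's online dual-ascent algorithm, buffering strategies, or sparse approximations.

```python
import numpy as np

def laplacian_reg_ls(X_lab, y_lab, X_unl, k=5, gamma_a=1e-2, gamma_i=1e-1):
    """Offline linear Laplacian-regularized least squares (manifold regularization)."""
    X = np.vstack([X_lab, X_unl])
    n = X.shape[0]
    # symmetric k-nearest-neighbour graph weights over labeled + unlabeled points
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, np.argsort(d2[i])[1:k + 1]] = 1.0
    W = np.maximum(W, W.T)
    L = np.diag(W.sum(axis=1)) - W            # graph Laplacian
    # minimize ||X_lab w - y_lab||^2 + gamma_a ||w||^2 + gamma_i (X w)^T L (X w)
    H = X_lab.T @ X_lab + gamma_a * np.eye(X.shape[1]) + gamma_i * X.T @ L @ X
    return np.linalg.solve(H, X_lab.T @ y_lab)
```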

  2. Robust design optimization using the price of robustness, robust least squares and regularization methods

    Science.gov (United States)

    Bukhari, Hassan J.

    2017-12-01

    In this paper, a framework for robust optimization of mechanical design problems and process systems that have parametric uncertainty is presented using three different approaches. Robust optimization problems are formulated so that the optimal solution is robust, which means it is minimally sensitive to any perturbations in the parameters. The first method uses the price of robustness approach, which assumes the uncertain parameters to be symmetric and bounded. The robustness of the design can be controlled by limiting the number of parameters that are allowed to perturb. The second method uses the robust least squares method to determine the optimal parameters when the data itself is subjected to perturbations instead of the parameters. The last method manages uncertainty by restricting the perturbation of the parameters, in a manner similar to Tikhonov regularization. The methods are implemented on two sets of problems: one linear and the other non-linear. The methodology is compared with a prior approach based on multiple Monte Carlo simulation runs, showing that the approach presented in this paper results in better performance.
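    A minimal sketch of the robust least-squares idea mentioned above: when the data matrix may be perturbed by anything with norm at most rho, the worst-case residual equals ||Ax - b|| + rho*||x||, so minimizing that sum gives the robust solution. The problem data below are synthetic placeholders, and a general-purpose optimizer stands in for whatever solver the paper uses.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
A = rng.normal(size=(30, 5))          # nominal data matrix (placeholder)
b = rng.normal(size=30)
rho = 0.5                             # assumed bound on the norm of the data perturbation

def worst_case_residual(x):
    # worst case of ||(A + dA) x - b|| over all perturbations dA with ||dA|| <= rho
    return np.linalg.norm(A @ x - b) + rho * np.linalg.norm(x)

x_nominal, *_ = np.linalg.lstsq(A, b, rcond=None)
x_robust = minimize(worst_case_residual, x_nominal).x
print("nominal least squares:", np.round(x_nominal, 3))
print("robust least squares: ", np.round(x_robust, 3))
```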

  3. Contour Propagation With Riemannian Elasticity Regularization

    DEFF Research Database (Denmark)

    Bjerre, Troels; Hansen, Mads Fogtmann; Sapru, W.

    2011-01-01

    Purpose/Objective(s): Adaptive techniques allow for correction of spatial changes during the time course of the fractionated radiotherapy. Spatial changes include tumor shrinkage and weight loss, causing tissue deformation and residual positional errors even after translational and rotational image... the planning CT onto the rescans and correcting to reflect actual anatomical changes. For deformable registration, a free-form, multi-level, B-spline deformation model with Riemannian elasticity, penalizing non-rigid local deformations and volumetric changes, was used. Regularization parameters were defined... on the original delineation and tissue deformation in the time course between scans form a better starting point than rigid propagation. There was no significant difference between locally and globally defined regularization. The method used in the present study suggests that deformed contours need to be reviewed...

  4. Three regularities of recognition memory: the role of bias.

    Science.gov (United States)

    Hilford, Andrew; Maloney, Laurence T; Glanzer, Murray; Kim, Kisok

    2015-12-01

    A basic assumption of Signal Detection Theory is that decisions are made on the basis of likelihood ratios. In a preceding paper, Glanzer, Hilford, and Maloney (Psychonomic Bulletin & Review, 16, 431-455, 2009) showed that the likelihood ratio assumption implies that three regularities will occur in recognition memory: (1) the Mirror Effect, (2) the Variance Effect, (3) the normalized Receiver Operating Characteristic (z-ROC) Length Effect. The paper offered formal proofs and computational demonstrations that decisions based on likelihood ratios produce the three regularities. A survey of data based on group ROCs from 36 studies validated the likelihood ratio assumption by showing that its three implied regularities are ubiquitous. The study noted, however, that bias, another basic factor in Signal Detection Theory, can obscure the Mirror Effect. In this paper we examine how bias affects the regularities at the theoretical level. The theoretical analysis shows: (1) how bias obscures the Mirror Effect, not the other two regularities, and (2) four ways to counter that obscuring. We then report the results of five experiments that support the theoretical analysis. The analyses and the experimental results also demonstrate: (1) that the three regularities govern individual, as well as group, performance, (2) alternative explanations of the regularities are ruled out, and (3) that Signal Detection Theory, correctly applied, gives a simple and unified explanation of recognition memory data.

  5. Review of the Tritium Extraction Facility design

    International Nuclear Information System (INIS)

    Barton, R.W.; Bamdad, F.; Blackman, J.

    2000-01-01

    The Defense Nuclear Facilities Safety Board (DNFSB) is an independent executive branch agency responsible for technical safety oversight of the US Department of Energy's (DOE's) defense nuclear facilities. One of DNFSB's responsibilities is the review of design and construction projects for DOE's defense nuclear facilities to ensure that adequate health and safety requirements are identified and implemented. These reviews are performed with the expectation that facility designs are being developed within the framework of a site's Integrated Safety Management (ISM) program. This paper describes the application of ISM principles in DNFSB's ongoing review of the Tritium Extraction Facility (TEF) design/construction project

  6. Review of the Tritium Extraction Facility Design

    International Nuclear Information System (INIS)

    Ronald W. Barton; Farid Bamdad; Joel Blackman

    2000-01-01

    The Defense Nuclear Facilities Safety Board (DNFSB) is an independent executive branch agency responsible for technical safety oversight of the U.S. Department of Energy's (DOE's) defense nuclear facilities. One of DNFSB's responsibilities is the review of design and construction projects for DOE's defense nuclear facilities to ensure that adequate health and safety requirements are identified and implemented. These reviews are performed with the expectation that facility designs are being developed within the framework of a site's Integrated Safety Management (ISM) program. This paper describes the application of ISM principles in DNFSB's ongoing review of the Tritium Extraction Facility (TEF) design/construction project

  7. Quality of systematic reviews in pediatric oncology--a systematic review.

    Science.gov (United States)

    Lundh, Andreas; Knijnenburg, Sebastiaan L; Jørgensen, Anders W; van Dalen, Elvira C; Kremer, Leontien C M

    2009-12-01

    To ensure evidence-based decision making in pediatric oncology, systematic reviews are necessary. The objective of our study was to evaluate the methodological quality of all currently existing systematic reviews in pediatric oncology. We identified eligible systematic reviews through a systematic search of the literature. Data on clinical and methodological characteristics of the included systematic reviews were extracted. The methodological quality of the included systematic reviews was assessed using the overview quality assessment questionnaire, a validated 10-item quality assessment tool. We compared the methodological quality of systematic reviews published in regular journals with that of Cochrane systematic reviews. We included 117 systematic reviews, 99 systematic reviews published in regular journals and 18 Cochrane systematic reviews. The average methodological quality of systematic reviews was low for all ten items, but the quality of Cochrane systematic reviews was significantly higher than that of systematic reviews published in regular journals. On a 1-7 scale, the median overall quality score for all systematic reviews was 2 (range 1-7), with a score of 1 (range 1-7) for systematic reviews in regular journals compared to 6 (range 3-7) for Cochrane systematic reviews. Many of the reviewed systematic reviews had methodological flaws leading to a high risk of bias. While Cochrane systematic reviews were of higher methodological quality than systematic reviews in regular journals, some of them also had methodological problems. Therefore, the methodology of each individual systematic review should be scrutinized before accepting its results.

  8. Design review report for the SY-101 RAPID mitigation system

    International Nuclear Information System (INIS)

    SCHLOSSER, R.L.

    1999-01-01

    This report documents design reviews conducted of the SY-101 Respond And Pump In Days (RAPID) Mitigation System. As part of the SY-101 Surface-Level-Rise Remediation Project, the SY-101 WID Mitigation System will reduce the potential unacceptable consequences of crust growth in Tank 241-SY-101 (SY-101). Projections of the crust growth rate indicate that the waste level in the tank may reach the juncture of the primary and secondary confinement structures of the tank late in 1999. Because of this time constraint, many design activities are being conducted in parallel and design reviews were conducted for system adequacy as well as design implementation throughout the process. Design implementation, as used in this design review report, is the final component selection (e.g., which circuit breaker, valve, or thermocouple) that meets the approved design requirements, system design, and design and procurement specifications. Design implementation includes the necessary analysis, testing, verification, and qualification to demonstrate compliance with the system design and design requirements. Design implementation is outside the scope of this design review. The design activities performed prior to detailed design implementation (i.e., system mission requirements, functional design requirements, technical criteria, system conceptual design, and where design and build contracts were placed, the procurement specification) have been reviewed and are within the scope of this design review report. Detailed design implementation will be controlled, reviewed, and where appropriate, approved in accordance with Tank Waste Remediation System (TWRS) engineering procedures. Review of detailed design implementation will continue until all components necessary to perform the transfer function are installed and tested

  9. Design review report for the SY-101 RAPID mitigation system

    Energy Technology Data Exchange (ETDEWEB)

    SCHLOSSER, R.L.

    1999-05-24

    This report documents design reviews conducted of the SY-101 Respond And Pump In Days (RAPID) Mitigation System. As part of the SY-101 Surface-Level-Rise Remediation Project, the SY-101 WID Mitigation System will reduce the potential unacceptable consequences of crust growth in Tank 241-SY-101 (SY-101). Projections of the crust growth rate indicate that the waste level in the tank may reach the juncture of the primary and secondary confinement structures of the tank late in 1999. Because of this time constraint, many design activities are being conducted in parallel and design reviews were conducted for system adequacy as well as design implementation throughout the process. Design implementation, as used in this design review report, is the final component selection (e.g., which circuit breaker, valve, or thermocouple) that meets the approved design requirements, system design, and design and procurement specifications. Design implementation includes the necessary analysis, testing, verification, and qualification to demonstrate compliance with the system design and design requirements. Design implementation is outside the scope of this design review. The design activities performed prior to detailed design implementation (i.e., system mission requirements, functional design requirements, technical criteria, system conceptual design, and where design and build contracts were placed, the procurement specification) have been reviewed and are within the scope of this design review report. Detailed design implementation will be controlled, reviewed, and where appropriate, approved in accordance with Tank Waste Remediation System (TWRS) engineering procedures. Review of detailed design implementation will continue until all components necessary to perform the transfer function are installed and tested.

  10. A Literature Review: Website Design and User Engagement.

    Science.gov (United States)

    Garett, Renee; Chiu, Jason; Zhang, Ly; Young, Sean D

    2016-07-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, content utility, purpose, simplicity, and readability. We discuss how previous studies define and evaluate these seven elements. This review and the resulting short list of design elements may be used to help designers and researchers to operationalize best practices for facilitating and predicting user engagement.

  11. Fractional Regularization Term for Variational Image Registration

    Directory of Open Access Journals (Sweden)

    Rafael Verdú-Monedero

    2009-01-01

    Full Text Available Image registration is a widely used task of image analysis with applications in many fields. Its classical formulation and current improvements are given in the spatial domain. In this paper, a regularization term based on fractional-order derivatives is formulated. This term is defined and implemented in the frequency domain by translating the energy functional into the frequency domain and obtaining the Euler-Lagrange equations which minimize it. The new regularization term leads to a simple formulation and design, being applicable to higher dimensions by using the corresponding multidimensional Fourier transform. The proposed regularization term allows for a genuinely gradual transition from diffusion registration to curvature registration, which is best suited to some applications and is not possible in the spatial domain. Results with actual 3D images show the validity of this approach.
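    A toy sketch of the frequency-domain mechanism described above: a fractional-order derivative of a 1D signal obtained by multiplying its Fourier transform by (i*omega)^alpha, the building block of such a regularization energy (by Parseval's theorem the penalty ||D^alpha u||^2 can be evaluated directly in the frequency domain). This is a generic illustration, not the paper's full registration functional or its Euler-Lagrange equations.

```python
import numpy as np

def fractional_derivative_fft(u, alpha, dx=1.0):
    """Fractional-order derivative via the frequency domain: F^-1[(i w)^alpha F[u]]."""
    omega = 2.0 * np.pi * np.fft.fftfreq(u.size, d=dx)
    return np.real(np.fft.ifft((1j * omega) ** alpha * np.fft.fft(u)))

# fractional regularization energy ||D^alpha u||^2 for a smooth periodic test signal
u = np.sin(np.linspace(0.0, 4.0 * np.pi, 256, endpoint=False))
for alpha in (1.0, 1.5, 2.0):
    d = fractional_derivative_fft(u, alpha)
    print(f"alpha = {alpha}: energy = {np.sum(d ** 2):.3f}")
```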

  12. Computational protein design: a review

    International Nuclear Information System (INIS)

    Coluzza, Ivan

    2017-01-01

    Proteins are one of the most versatile modular assembling systems in nature. Experimentally, more than 110 000 protein structures have been identified and more are deposited every day in the Protein Data Bank. Such an enormous structural variety is to a first approximation controlled by the sequence of amino acids along the peptide chain of each protein. Understanding how the structural and functional properties of the target can be encoded in this sequence is the main objective of protein design. Unfortunately, rational protein design remains one of the major challenges across the disciplines of biology, physics and chemistry. The implications of solving this problem are enormous and branch into materials science, drug design, evolution and even cryptography. For instance, in the field of drug design an effective computational method to design protein-based ligands for biological targets such as viruses, bacteria or tumour cells, could give a significant boost to the development of new therapies with reduced side effects. In materials science, self-assembly is a highly desired property and soon artificial proteins could represent a new class of designable self-assembling materials. The scope of this review is to describe the state of the art in computational protein design methods and give the reader an outline of what developments could be expected in the near future. (topical review)

  13. Female non-regular workers in Japan: their current status and health.

    Science.gov (United States)

    Inoue, Mariko; Nishikitani, Mariko; Tsurugano, Shinobu

    2016-12-07

    The participation of women in the Japanese labor force is characterized by its M-shaped curve, which reflects decreased employment rates during child-rearing years. Although this M-shaped curve is now improving, the majority of women in employment are likely to fall into the category of non-regular workers. Based on a review of the previous Japanese studies of the health of non-regular workers, we found that non-regular female workers experienced greater psychological distress, poorer self-rated health, a higher smoking rate, and less access to preventive medicine than regular workers did. However, despite the large number of non-regular workers, there is limited research regarding their health. In contrast, several studies in Japan concluded that regular workers also had worse health conditions due to the additional responsibility and longer work hours associated with the job, housekeeping, and child rearing. The health of non-regular workers might be threatened by the effects of precarious employment status, lower income, a lower safety net, outdated social norms regarding non-regular workers, and difficulty in achieving a work-life balance. A sector-wide social approach that considers the life course is needed to protect the health and well-being of female workers; promotion of an occupational health program alone is insufficient.

  14. Female non-regular workers in Japan: their current status and health

    Science.gov (United States)

    INOUE, Mariko; NISHIKITANI, Mariko; TSURUGANO, Shinobu

    2016-01-01

    The participation of women in the Japanese labor force is characterized by its M-shaped curve, which reflects decreased employment rates during child-rearing years. Although this M-shaped curve is now improving, the majority of women in employment are likely to fall into the category of non-regular workers. Based on a review of the previous Japanese studies of the health of non-regular workers, we found that non-regular female workers experienced greater psychological distress, poorer self-rated health, a higher smoking rate, and less access to preventive medicine than regular workers did. However, despite the large number of non-regular workers, there is limited research regarding their health. In contrast, several studies in Japan concluded that regular workers also had worse health conditions due to the additional responsibility and longer work hours associated with the job, housekeeping, and child rearing. The health of non-regular workers might be threatened by the effects of precarious employment status, lower income, a lower safety net, outdated social norms regarding non-regular workers, and difficulty in achieving a work-life balance. A sector-wide social approach that considers the life course is needed to protect the health and well-being of female workers; promotion of an occupational health program alone is insufficient. PMID:27818453

  15. Distance-regular graphs

    NARCIS (Netherlands)

    van Dam, Edwin R.; Koolen, Jack H.; Tanaka, Hajime

    2016-01-01

    This is a survey of distance-regular graphs. We present an introduction to distance-regular graphs for the reader who is unfamiliar with the subject, and then give an overview of some developments in the area of distance-regular graphs since the monograph 'BCN'[Brouwer, A.E., Cohen, A.M., Neumaier,

  16. Regular expressions cookbook

    CERN Document Server

    Goyvaerts, Jan

    2009-01-01

    This cookbook provides more than 100 recipes to help you crunch data and manipulate text with regular expressions. Every programmer can find uses for regular expressions, but their power doesn't come worry-free. Even seasoned users often suffer from poor performance, false positives, false negatives, or perplexing bugs. Regular Expressions Cookbook offers step-by-step instructions for some of the most common tasks involving this tool, with recipes for C#, Java, JavaScript, Perl, PHP, Python, Ruby, and VB.NET. With this book, you will: Understand the basics of regular expressions through a
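    For readers new to the topic, here are two typical recipe-style tasks using Python's re module (one of the several languages the book covers); the log string is just a made-up example:

```python
import re

log = "ERROR 2024-01-15 disk full; WARN 2024-01-16 retrying; ERROR 2024-01-17 timeout"

# find all ISO dates that follow an ERROR tag
print(re.findall(r"ERROR (\d{4}-\d{2}-\d{2})", log))
# -> ['2024-01-15', '2024-01-17']

# rewrite every date from YYYY-MM-DD to DD/MM/YYYY using capture groups and backreferences
print(re.sub(r"(\d{4})-(\d{2})-(\d{2})", r"\3/\2/\1", log))
```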

  17. Breast ultrasound tomography with total-variation regularization

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Lianjie [Los Alamos National Laboratory; Li, Cuiping [KARMANOS CANCER INSTIT.; Duric, Neb [KARMANOS CANCER INSTIT

    2009-01-01

    Breast ultrasound tomography is a rapidly developing imaging modality that has the potential to impact breast cancer screening and diagnosis. A new ultrasound breast imaging device (CURE) with a ring array of transducers has been designed and built at Karmanos Cancer Institute, which acquires both reflection and transmission ultrasound signals. To extract the sound-speed information from the breast data acquired by CURE, we have developed an iterative sound-speed image reconstruction algorithm for breast ultrasound transmission tomography based on total-variation (TV) minimization. We investigate applicability of the TV tomography algorithm using in vivo ultrasound breast data from 61 patients, and compare the results with those obtained using the Tikhonov regularization method. We demonstrate that, compared to the Tikhonov regularization scheme, the TV regularization method significantly improves image quality, resulting in sound-speed tomography images with sharp (preserved) edges of abnormalities and few artifacts.
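    The edge-preserving behaviour noted above can be illustrated with generic 2D denoising: a TV-regularized solution keeps the sharp boundaries of a piecewise-constant object, while a smooth (Tikhonov-like) penalty blurs them. The sketch below uses scikit-image and SciPy on a synthetic image; it is only an illustration of the two regularizers, not the CURE sound-speed reconstruction.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.restoration import denoise_tv_chambolle

rng = np.random.default_rng(0)
img = np.zeros((128, 128))
img[32:96, 32:96] = 1.0                          # piecewise-constant "inclusion"
noisy = img + 0.3 * rng.normal(size=img.shape)

tv_denoised = denoise_tv_chambolle(noisy, weight=0.15)   # TV: edges preserved, flat regions smoothed
smooth_denoised = gaussian_filter(noisy, sigma=2.0)      # quadratic-type smoothing: edges blurred
print(np.abs(tv_denoised - img).mean(), np.abs(smooth_denoised - img).mean())
```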

  18. LL-regular grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    1980-01-01

    Culik II and Cohen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this paper we consider an analogous extension of the LL(k) grammars called the LL-regular grammars. The relation of this class of grammars to other classes of grammars will be shown. Any LL-regular

  19. Peer review in design: Understanding the impact of collaboration on the review process and student perception

    Science.gov (United States)

    Mandala, Mahender Arjun

    A cornerstone of design and design education is frequent situated feedback. With increasing class sizes and shrinking financial and human resources, providing rich feedback to students becomes increasingly difficult. In the field of writing, web-based peer review--the process of utilizing equal-status learners within a class to provide feedback to each other on their work using networked computing systems--has been shown to be a reliable and valid source of feedback in addition to improving student learning. Designers communicate in myriad ways, using the many languages of design and combining visual and descriptive information. This complex discourse of design intent makes peer reviews by design students ambiguous and often not helpful to the receivers of this feedback. Furthermore, engaging students in the review process itself is often difficult. Teams can complement individual diversity and may assist novice designers in collectively resolving complex tasks. However, teams often incur production losses and may be impacted by individual biases. In the current work, we look at utilizing a collaborative team of reviewers, working collectively and synchronously, in generating web-based peer reviews in a sophomore engineering design class. Students participated in a cross-over design, conducting peer reviews as individuals and collaborative teams in parallel sequences. Raters coded the feedback generated on the basis of its appropriateness and accuracy. Self-report surveys and passive observation of teams conducting reviews captured student opinion on the process, its value, and the contrasting experience they had conducting team and individual reviews. We found that team reviews generated better-quality feedback than individual reviews. Furthermore, students preferred conducting reviews in teams, finding the process 'fun' and engaging. We observed several learning benefits of using collaboration in reviewing, including improved understanding of the assessment

  20. A Literature Review: Website Design and User Engagement.

    OpenAIRE

    Garett, R; Chiu, J; Zhang, L; Young, SD

    2016-01-01

    Proper design has become a critical element needed to engage website and mobile application users. However, little research has been conducted to define the specific elements used in effective website and mobile application design. We attempt to review and consolidate research on effective design and to define a short list of elements frequently used in research. The design elements mentioned most frequently in the reviewed literature were navigation, graphical representation, organization, c...

  1. Review of research in feature based design

    NARCIS (Netherlands)

    Salomons, O.W.; van Houten, Frederikus J.A.M.; Kals, H.J.J.

    1993-01-01

    Research in feature-based design is reviewed. Feature-based design is regarded as a key factor towards CAD/CAPP integration from a process planning point of view. From a design point of view, feature-based design offers possibilities for supporting the design process better than current CAD systems

  2. Prospective regularization design in prior-image-based reconstruction

    International Nuclear Information System (INIS)

    Dang, Hao; Siewerdsen, Jeffrey H; Stayman, J Webster

    2015-01-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in

  3. Office design and health: a systematic review.

    Science.gov (United States)

    Richardson, Ann; Potter, John; Paterson, Margaret; Harding, Thomas; Tyler-Merrick, Gaye; Kirk, Ray; Reid, Kate; McChesney, Jane

    2017-12-15

    To carry out a systematic review of recent research into the effects of workplace design, comparing individual with shared workspaces, on the health of employees. The research question was "Does workplace design (specifically individual offices compared with shared workspaces) affect the health of workers?" A literature search limited to articles published between 2000 and 2017 was undertaken. A systematic review was carried out, and the findings of the reviewed studies grouped into themes according to the primary outcomes measured in the studies. The literature search identified 15 relevant studies addressing health effects of shared or open-plan offices compared with individual offices. Our systematic review found that, compared with individual offices, shared or open-plan office space is not beneficial to employees' health, with consistent findings of deleterious effects on staff health, wellbeing and productivity. Our findings are also consistent with those of earlier reviews. These findings have public health implications for the New Zealand workforce. Decisions about workplace design should include weighing the short-term financial benefits of open-plan or shared workspaces against the significant harms, including increased sickness absence, lower job satisfaction and productivity, and possible threats to recruitment and retention of staff.

  4. A systematic review of protocol studies on conceptual design cognition: design as search and exploration

    OpenAIRE

    Hay, Laura; Duffy, Alex H.B.; McTeague, Chris; Pidgeon, Laura M.; Vuletic, Tijana; Grealy, Madeleine

    2017-01-01

    This paper reports findings from the first systematic review of protocol studies focusing specifically on conceptual design cognition, aiming to answer the following research question: What is our current understanding of the cognitive processes involved in conceptual design tasks carried out by individual designers? We reviewed 47 studies on architectural design, engineering design and product design engineering. This paper reports 24 cognitive processes investigated in a subset of 33 studie...

  5. An iterative method for Tikhonov regularization with a general linear regularization operator

    NARCIS (Netherlands)

    Hochstenbach, M.E.; Reichel, L.

    2010-01-01

    Tikhonov regularization is one of the most popular approaches to solve discrete ill-posed problems with error-contaminated data. A regularization operator and a suitable value of a regularization parameter have to be chosen. This paper describes an iterative method, based on Golub-Kahan

  6. Design Review Closure Report for the SY-101 Rapid Transfer System

    International Nuclear Information System (INIS)

    POWELL, W.J.

    1999-01-01

    The purpose of this report is to document closure of design review open items resulting from design reviews conducted for the SY-101 Respond And Pump In Days (RAPID) Transfer System. Results of the various design reviews were documented in the Design Review Report for The SY-101 Rapid Mitigation System, HNF-4519. In that report, twenty-three open items were identified. In this report, the 23 items are reviewed and their closure status is documented

  7. Design Review Closure Report for the SY-101 Rapid Transfer System

    Energy Technology Data Exchange (ETDEWEB)

    POWELL, W.J.

    1999-11-29

    The purpose of this report is to document closure of design review open items resulting from design reviews conducted for the SY-101 Respond And Pump In Days (RAPID) Transfer System. Results of the various design reviews were documented in the Design Review Report for The SY-101 Rapid Mitigation System, HNF-4519. In that report, twenty-three open items were identified. In this report, the 23 items are reviewed and their closure status is documented.

  8. Design review report for rotary mode core sample truck (RMCST) modifications for flammable gas tanks, preliminary design

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1996-02-01

    This report documents the completion of a preliminary design review for the Rotary Mode Core Sample Truck (RMCST) modifications for flammable gas tanks. The RMCST modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to validate basic design assumptions and concepts to support a path forward leading to a final design. The conclusion reached by the review committee was that the design was acceptable and efforts should continue toward a final design review

  9. Regular Expression Pocket Reference

    CERN Document Server

    Stubblebine, Tony

    2007-01-01

    This handy little book offers programmers a complete overview of the syntax and semantics of regular expressions that are at the heart of every text-processing application. Ideal as a quick reference, Regular Expression Pocket Reference covers the regular expression APIs for Perl 5.8, Ruby (including some upcoming 1.9 features), Java, PHP, .NET and C#, Python, vi, JavaScript, and the PCRE regular expression libraries. This concise and easy-to-use reference puts a very powerful tool for manipulating text and data right at your fingertips. Composed of a mixture of symbols and text, regular exp

  10. Application of project design peer review to improve quality assurance

    International Nuclear Information System (INIS)

    McClure, F.E.

    1989-01-01

    DOE ORDER 5481.1B Safety Analysis and Review Systems and DOE ORDER 6430.1A General Design Criteria require that the design of facilities shall incorporate the necessary Quality Assurance review requirements to assure that the established program quality assurance objectives are met in the design criteria and the construction documents. The use of Project Design Peer Review to satisfy these requirements is presented. The University of California manages the Lawrence Berkeley Laboratory, the Lawrence Livermore National Laboratory, and the Los Alamos National Scientific Laboratory. The 1988 University Seismic Safety Policy requires the use of independent Project Design Peer Review in its capital improvement and seismic reconstruction program

  11. Application of design review in the heavy power plant industry

    International Nuclear Information System (INIS)

    Yound, N.

    1977-01-01

    The application of the design review technique in a company engaged in the design and manufacture of turbo-generators for power stations is described. One benefit arising from design review is its use as a means of design verification. (U.K.)

  12. Human-system interface design review guideline -- Review software and user's guide: Final report. Revision 1, Volume 3

    International Nuclear Information System (INIS)

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 3 contains an interactive software application of the NUREG-0700, Revision 1 guidance and a user's guide for this software. The software supports reviewers during review preparation, design evaluation using the human factors engineering guidelines, and report preparation. The user's guide provides system requirements and installation instructions, detailed explanations of the software's functions and features, and a tutorial on using the software

  13. Preliminary design review report for K Basin Dose Reduction Project

    International Nuclear Information System (INIS)

    Blackburn, L.D.

    1996-01-01

    The strategy for reducing radiation dose, originating from radionuclides absorbed in the K East Basin concrete, is to raise the pool water level to provide additional shielding. This report documents a preliminary design review conducted to ensure that design approaches for cleaning/coating basin walls and modifying other basin components were appropriate. The conclusion of this review was that design documents presently completed or in process of modification are an acceptable basis for proceeding to complete the design

  14. Improvements in GRACE Gravity Fields Using Regularization

    Science.gov (United States)

    Save, H.; Bettadpur, S.; Tapley, B. D.

    2008-12-01

    The unconstrained global gravity field models derived from GRACE are susceptible to systematic errors that show up as broad "stripes" aligned in a North-South direction on the global maps of mass flux. These errors are believed to be a consequence of both systematic and random errors in the data that are amplified by the nature of the gravity field inverse problem. These errors impede scientific exploitation of the GRACE data products, and limit the realizable spatial resolution of the GRACE global gravity fields in certain regions. We use regularization techniques to reduce these "stripe" errors in the gravity field products. The regularization criteria are designed such that there is no attenuation of the signal and that the solutions fit the observations as well as an unconstrained solution. We have used a computationally inexpensive method, normally referred to as "L-ribbon", to find the regularization parameter. This paper discusses the characteristics and statistics of a 5-year time-series of regularized gravity field solutions. The solutions show markedly reduced stripes, are of uniformly good quality over time, and leave little or no systematic observation residuals, which is a frequent consequence of signal suppression from regularization. Up to degree 14, the signal in regularized solution shows correlation greater than 0.8 with the un-regularized CSR Release-04 solutions. Signals from large-amplitude and small-spatial extent events - such as the Great Sumatra Andaman Earthquake of 2004 - are visible in the global solutions without using special post-facto error reduction techniques employed previously in the literature. Hydrological signals as small as 5 cm water-layer equivalent in the small river basins, like Indus and Nile for example, are clearly evident, in contrast to noisy estimates from RL04. The residual variability over the oceans relative to a seasonal fit is small except at higher latitudes, and is evident without the need for de-striping or

  15. Design review of SPWR with PSA methodology

    International Nuclear Information System (INIS)

    Oikawa, Tetsukuni; Muramatsu, Ken; Iwamura, Takamichi; Tone, Tatsuzo; Kasahara, Takeo; Mizuno, Yoshio

    1993-01-01

    This paper presents the procedures and results of a PSA (Probabilistic Safety Assessment) of the SPWR (System-Integrated PWR), which is being developed at the Japan Atomic Energy Research Institute (JAERI) as a medium-sized, innovative, passively safe reactor, to assist in the design improvement of the SPWR by reviewing the design and identifying design weaknesses. This PSA was performed in four steps: (1) identification of initiating events by failure mode and effects analysis and other methods, (2) delineation of accident sequences for three selected initiating events using accident progression flow charts and event trees, (3) quantification of event trees based on the review of past PSAs for LWRs, and (4) sensitivity analysis and interpretation of results. Qualitative and quantitative results of the PSA provided very useful information for decision making on design improvements and recommendations for further consideration in the process of detailed design.

  16. Advanced human-system interface design review guidelines

    International Nuclear Information System (INIS)

    O'Hara, J.M.

    1990-01-01

    Advanced, computer-based, human-system interface designs are emerging in nuclear power plant (NPP) control rooms. These developments may have significant implications for plant safety in that they will greatly affect the ways in which operators interact with systems. At present, however, the only guidance available to the US Nuclear Regulatory Commission (NRC) for the review of control room-operator interfaces, NUREG-0700, was written prior to these technological changes and is thus not designed to address them. The objective of the project reported in this paper is to develop an Advanced Control Room Design Review Guideline for use in performing human factors reviews of advanced operator interfaces. This guideline will be implemented, in part, as a portable, computer-based, interactive document for field use. The paper describes the overall guideline development methodology, the present status of the document, and the plans for further guideline testing and development. 21 refs., 3 figs

  17. The geometry of continuum regularization

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1987-03-01

    This lecture is primarily an introduction to coordinate-invariant regularization, a recent advance in the continuum regularization program. In this context, the program is seen as fundamentally geometric, with all regularization contained in regularized DeWitt superstructures on field deformations

  18. Regular expression containment

    DEFF Research Database (Denmark)

    Henglein, Fritz; Nielsen, Lasse

    2011-01-01

    We present a new sound and complete axiomatization of regular expression containment. It consists of the conventional axiomatization of concatenation, alternation, empty set and (the singleton set containing) the empty string as an idempotent semiring, the fixed-point rule E* = 1 + E × E* for Kleene-star, and a general coinduction rule as the only additional rule. Our axiomatization gives rise to a natural computational interpretation of regular expressions as simple types that represent parse trees, and of containment proofs as coercions. This gives the axiomatization a Curry-Howard-style constructive interpretation: containment proofs do not only certify a language-theoretic containment, but, under our computational interpretation, constructively transform a membership proof of a string in one regular expression into a membership proof of the same string in another regular expression.
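
    As a rough companion to the record above (and emphatically not the coinductive proof system it describes), the sketch below uses Python's standard re module to search for short counterexamples to a claimed containment E ≤ F over a small alphabet. Such a bounded search can only refute containment, never prove it; the patterns and alphabet are invented for illustration.

      # Naive bounded refutation of regular-expression containment E <= F:
      # enumerate all words over a small alphabet up to a given length and look
      # for one matched by E but not by F.  This only finds counterexamples; it
      # is not the sound-and-complete axiomatization described above.
      import itertools
      import re

      def find_containment_counterexample(e_pattern, f_pattern, alphabet="ab", max_len=6):
          """Return a word matched by e_pattern but not f_pattern, or None."""
          e, f = re.compile(e_pattern), re.compile(f_pattern)
          for n in range(max_len + 1):
              for chars in itertools.product(alphabet, repeat=n):
                  word = "".join(chars)
                  if e.fullmatch(word) and not f.fullmatch(word):
                      return word
          return None

      # (a|b)*a is not contained in (ab)*: the single letter "a" is a witness.
      print(find_containment_counterexample(r"(a|b)*a", r"(ab)*"))   # -> a
      # a(ba)* and (ab)*a denote the same language, so no witness is found.
      print(find_containment_counterexample(r"a(ba)*", r"(ab)*a"))   # -> None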

  19. Supersymmetric dimensional regularization

    International Nuclear Information System (INIS)

    Siegel, W.; Townsend, P.K.; van Nieuwenhuizen, P.

    1980-01-01

    There is a simple modification of dimensional regularization which preserves supersymmetry: dimensional reduction to real D < 4, followed by analytic continuation to complex D. In terms of component fields, this means fixing the ranges of all indices on the fields (and therefore the numbers of Fermi and Bose components). For superfields, it means continuing in the dimensionality of x-space while fixing the dimensionality of theta-space. This regularization procedure allows the simple manipulation of spinor derivatives in supergraph calculations. The resulting rules are: (1) First do all algebra exactly as in D = 4; (2) Then do the momentum integrals as in ordinary dimensional regularization. This regularization procedure needs extra rules before one can say that it is consistent. Such extra rules needed for superconformal anomalies are discussed. Problems associated with renormalizability and higher order loops are also discussed.
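
    For context on rule (2), the momentum integrals referred to are the ordinary dimensionally regularized ones. A standard master formula, quoted here only as a reminder of that generic step and not as part of the supersymmetric scheme proposed above, is

      \int \frac{d^D \ell}{(2\pi)^D}\, \frac{1}{(\ell^2 - \Delta)^n}
        = \frac{(-1)^n\, i}{(4\pi)^{D/2}}\,
          \frac{\Gamma\!\left(n - \tfrac{D}{2}\right)}{\Gamma(n)}\,
          \Delta^{D/2 - n},

    so that for n = 2 and D = 4 - 2ε the prefactor Γ(ε) = 1/ε - γ_E + O(ε) exhibits the familiar pole removed by renormalization.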

  20. A Variance Minimization Criterion to Feature Selection Using Laplacian Regularization.

    Science.gov (United States)

    He, Xiaofei; Ji, Ming; Zhang, Chiyuan; Bao, Hujun

    2011-10-01

    In many information processing tasks, one is often confronted with very high-dimensional data. Feature selection techniques are designed to find the meaningful feature subset of the original features which can facilitate clustering, classification, and retrieval. In this paper, we consider the feature selection problem in unsupervised learning scenarios, which is particularly difficult due to the absence of class labels that would guide the search for relevant information. Based on Laplacian regularized least squares, which finds a smooth function on the data manifold and minimizes the empirical loss, we propose two novel feature selection algorithms which aim to minimize the expected prediction error of the regularized regression model. Specifically, we select those features such that the size of the parameter covariance matrix of the regularized regression model is minimized. Motivated from experimental design, we use trace and determinant operators to measure the size of the covariance matrix. Efficient computational schemes are also introduced to solve the corresponding optimization problems. Extensive experimental results over various real-life data sets have demonstrated the superiority of the proposed algorithms.
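
    To make the covariance-size criterion concrete, here is a hedged sketch of the general idea: score a candidate feature subset by the trace of a Laplacian-regularized parameter covariance (an A-optimality-style measure) and grow the subset greedily. The kNN graph, the exact form of the regularized covariance, the greedy search and all parameter names are simplifying assumptions for illustration, not the algorithm or notation of the paper above.

      # Hedged sketch: unsupervised feature selection by greedily minimizing the
      # trace of a Laplacian-regularized parameter covariance.  The objective and
      # the greedy strategy are simplifications, not the published algorithm.
      import numpy as np

      def knn_laplacian(X, k=5):
          """Unnormalized graph Laplacian of a symmetrized kNN graph."""
          n = X.shape[0]
          d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
          W = np.zeros((n, n))
          for i in range(n):
              for j in np.argsort(d2[i])[1:k + 1]:   # skip the point itself
                  W[i, j] = W[j, i] = 1.0
          return np.diag(W.sum(1)) - W

      def greedy_select(X, n_features, lam=1.0, mu=1e-3, k=5):
          """Pick features whose subset minimizes the trace of the regularized covariance."""
          L = knn_laplacian(X, k)
          M = np.eye(X.shape[0]) + lam * L            # manifold-regularized weighting
          selected, remaining = [], list(range(X.shape[1]))
          while len(selected) < n_features:
              def score(j):
                  S = X[:, selected + [j]]
                  C = S.T @ M @ S + mu * np.eye(S.shape[1])
                  return np.trace(np.linalg.inv(C))   # smaller trace = smaller parameter variance
              best = min(remaining, key=score)
              selected.append(best)
              remaining.remove(best)
          return selected

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 10))
      print(greedy_select(X, n_features=3))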

  1. Design/Operations review of core sampling trucks and associated equipment

    International Nuclear Information System (INIS)

    Shrivastava, H.P.

    1996-01-01

    A systematic review of the design and operations of the core sampling trucks was commissioned by Characterization Equipment Engineering of the Westinghouse Hanford Company in October 1995. The review team reviewed the design documents, specifications, operating procedure, training manuals and safety analysis reports. The review process, findings and corrective actions are summarized in this supporting document

  2. Point-splitting regularization of composite operators and anomalies

    International Nuclear Information System (INIS)

    Novotny, J.; Schnabl, M.

    2000-01-01

    The point-splitting regularization technique for composite operators is discussed in connection with anomaly calculation. We present a pedagogical and self-contained review of the topic with an emphasis on the technical details. We also develop simple algebraic tools to handle the path ordered exponential insertions used within the covariant and non-covariant version of the point-splitting method. The method is then applied to the calculation of the chiral, vector, trace, translation and Lorentz anomalies within diverse versions of the point-splitting regularization and a connection between the results is described. As an alternative to the standard approach we use the idea of deformed point-split transformation and corresponding Ward-Takahashi identities rather than an application of the equation of motion, which seems to reduce the complexity of the calculations. (orig.)

  3. Operability design review of prototype large breeder reactor (PLBR) designs. Final report, September 1981

    International Nuclear Information System (INIS)

    Beakes, J.H.; Ehman, J.R.; Jones, H.M.; Kinne, B.V.T.; Price, C.M.; Shores, S.P.; Welch, J.K.

    1981-09-01

    Prototype Large Breeder Reactor (PLBR) designs were reviewed by personnel with extensive power plant operations experience. Fourteen normal and off-normal events, such as startup, shutdown, refueling, reactor scram and loss of feedwater, were evaluated using an operational evaluation methodology which is designed to facilitate talk-through sessions on operational events. Human factors engineers participated in the review and assisted in developing and refining the review methodologies. Operating experience at breeder reactor facilities such as Experimental Breeder Reactor-II (EBR-II), Enrico Fermi Atomic Power Plant - Unit 1, and the Fast Flux Test Facility (FFTF) was gathered, analyzed, and used to determine whether lessons learned from operational experience had been incorporated into the PLBR designs. This eighteen month effort resulted in approximately one hundred specific recommendations for improving the operability of PLBR designs

  4. Regularized variable metric method versus the conjugate gradient method in solution of radiative boundary design problem

    International Nuclear Information System (INIS)

    Kowsary, F.; Pooladvand, K.; Pourshaghaghy, A.

    2007-01-01

    In this paper, an appropriate distribution of the heating elements' strengths in a radiation furnace is estimated using inverse methods so that a pre-specified temperature and heat flux distribution is attained on the design surface. Minimization of the sum of the squares of the error function is performed using the variable metric method (VMM), and the results are compared with those obtained by the conjugate gradient method (CGM) established previously in the literature. It is shown via test cases and a well-founded validation procedure that the VMM, when using a 'regularized' estimator, is more accurate and is able to reach a higher-quality final solution than the CGM. The test cases used in this study were two-dimensional furnaces filled with an absorbing, emitting, and scattering gas.

  5. Regularization by External Variables

    DEFF Research Database (Denmark)

    Bossolini, Elena; Edwards, R.; Glendinning, P. A.

    2016-01-01

    Regularization was a big topic at the 2016 CRM Intensive Research Program on Advances in Nonsmooth Dynamics. There are many open questions concerning well-known kinds of regularization (e.g., by smoothing or hysteresis). Here, we propose a framework for an alternative and important kind of regularization.

  6. Regular Single Valued Neutrosophic Hypergraphs

    Directory of Open Access Journals (Sweden)

    Muhammad Aslam Malik

    2016-12-01

    Full Text Available In this paper, we define the regular and totally regular single valued neutrosophic hypergraphs, and discuss the order and size along with properties of regular and totally regular single valued neutrosophic hypergraphs. We also extend work on completeness of single valued neutrosophic hypergraphs.

  7. Psychogeriatric inpatient unit design: a literature review.

    Science.gov (United States)

    Dobrohotoff, John T; Llewellyn-Jones, Robert H

    2011-03-01

    In many parts of the world the provision of psychogeriatric inpatient units (PGUs) remains limited. More units will be required over coming decades given rapid population aging. Medline (1950-2010), psycINFO (1806-2009), EMBASE (1980-2009) and CINAHL (1982-2009) were searched for papers about PGU design. Selected non-peer reviewed literature such as government reports and unpublished academic dissertations were also reviewed. Data were also obtained from the literature related to general adult psychiatry inpatient units where there was limited information from studies of units designed for older people. Over 200 papers were reviewed and 130 were included. There are few good quality studies to guide the design of acute PGUs and much of the existing literature is based on opinion and anecdote or, at best, based on observational studies. Randomized controlled studies comparing different designs and assessing outcomes are virtually non-existent. Several studies have identified violence and trauma resulting from hospitalization as significant problems with current acute PGU care. Despite its limitations the available literature provides useful guidance on how PGU design can optimize patient and staff safety and improve clinical outcomes. There are significant problems with current acute PGUs, and patient mix on existing units is an important issue. Future research should examine patient and staff perceptions of different PGU ward environments, the relationship between ward design and clinical outcomes, the effects of segregating patients with challenging behaviors in dementia and the benefits or otherwise of gender segregation.

  8. Study-design selection criteria in systematic reviews of effectiveness of health systems interventions and reforms: A meta-review.

    Science.gov (United States)

    Rockers, Peter C; Feigl, Andrea B; Røttingen, John-Arne; Fretheim, Atle; de Ferranti, David; Lavis, John N; Melberg, Hans Olav; Bärnighausen, Till

    2012-03-01

    At present, there exists no widely agreed upon set of study-design selection criteria for systematic reviews of health systems research, except for those proposed by the Cochrane Collaboration's Effective Practice and Organisation of Care (EPOC) review group (which comprises randomized controlled trials, controlled clinical trials, controlled before-after studies, and interrupted time series). We conducted a meta-review of the study-design selection criteria used in systematic reviews available in the McMaster University's Health Systems Evidence or the EPOC database. Of 414 systematic reviews, 13% did not indicate any study-design selection criteria. Of the 359 studies that described such criteria, 50% limited their synthesis to controlled trials and 68% to some or all of the designs defined by the EPOC criteria. Seven out of eight reviews identified at least one controlled trial that was relevant for the review topic. Seven percent of the reviews included either no or only one relevant primary study. Our meta-review reveals reviewers' preferences for restricting synthesis to controlled experiments or study designs that comply with the EPOC criteria. We discuss the advantages and disadvantages of the current practices regarding study-design selection in systematic reviews of health systems research as well as alternative approaches. Copyright © 2012. Published by Elsevier Ireland Ltd.

  9. SparseBeads data: benchmarking sparsity-regularized computed tomography

    DEFF Research Database (Denmark)

    Jørgensen, Jakob Sauer; Coban, Sophia B.; Lionheart, William R. B.

    2017-01-01

    A collection of 48 x-ray CT datasets called SparseBeads was designed for benchmarking sparsity-regularized (SR) reconstruction algorithms. Beadpacks comprising glass beads of five different sizes as well as mixtures were scanned in a micro-CT scanner to provide structured datasets with variable image sparsity levels...

  10. Design reviews: A process perspective for improved efficiency and effectiveness

    CSIR Research Space (South Africa)

    Young, M

    2013-06-01

    Full Text Available -Black and Iverson 1994) and systematic (Pahl and Beitz 1996) review of the design. Even though it is advisable to use a generic checklist or compliance matrix as the basis for determining these criteria, it is important to use these generics as a guideline only... review to allow sufficient time to address any outstanding issues prior to review meeting. Evolutionary design reviews. When dealing with complex systems, it is advisable to conduct sub-system reviews, building up to a system review that deals...

  11. Material parameters characterization for arbitrary N-sided regular polygonal invisible cloak

    International Nuclear Information System (INIS)

    Wu Qun; Zhang Kuang; Meng Fanyi; Li Lewei

    2009-01-01

    Arbitrary N-sided regular polygonal cylindrical cloaks are proposed and designed based on coordinate transformation theory. First, the general expressions for the constitutive tensors of the N-sided regular polygonal cylindrical cloaks are derived; then full-wave simulations are carried out for cloaks composed of inhomogeneous and anisotropic metamaterials, which bend incoming electromagnetic waves and guide them to propagate around the inner region; such electromagnetic waves return to their original propagation directions without distortion of the waves outside the polygonal cloak. The results of the full-wave simulations validate the general expressions for the constitutive tensors of the N-sided regular polygonal cylindrical cloaks we derived.

  12. Cold vacuum drying facility 90% design review

    International Nuclear Information System (INIS)

    O'Neill, C.T.

    1997-01-01

    This document contains review comment records for the CVDF 90% design review. Spent fuels retrieved from the K Basins will be dried at the CVDF. It has also been recommended that the Multi-Canister Overpacks be welded, inspected, and repaired at the CVD Facility before transport to dry storage.

  13. Cold vacuum drying facility 90% design review

    Energy Technology Data Exchange (ETDEWEB)

    O`Neill, C.T.

    1997-05-02

    This document contains review comment records for the CVDF 90% design review. Spent fuels retrieved from the K Basins will be dried at the CVDF. It has also been recommended that the Multi-Canister Overpacks be welded, inspected, and repaired at the CVD Facility before transport to dry storage.

  14. On a correspondence between regular and non-regular operator monotone functions

    DEFF Research Database (Denmark)

    Gibilisco, P.; Hansen, Frank; Isola, T.

    2009-01-01

    We prove the existence of a bijection between the regular and the non-regular operator monotone functions satisfying a certain functional equation. As an application we give a new proof of the operator monotonicity of certain functions related to the Wigner-Yanase-Dyson skew information....

  15. Design review of the INTOR mechanical configuration

    International Nuclear Information System (INIS)

    Brown, T.G.

    1981-01-01

    The INTOR conceptual design has been carried out by design teams working in the home countries with periodic workshop sessions in Vienna to review the ongoing work and to make decisions on the evolving design. The decisions taken at each workshop session were then incorporated into each national design activity, so that the four national design contributions would progressively converge toward a single design with increasingly greater detail. This paper defines the final INTOR configuration that has evolved during the conceptual design phase, defining the major system design alternatives that were considered and the rationale for selecting the final system configuration

  16. Review of design optimization methods for turbomachinery aerodynamics

    Science.gov (United States)

    Li, Zhihui; Zheng, Xinqian

    2017-08-01

    In today's competitive environment, new turbomachinery designs need to be not only more efficient, quieter, and 'greener' but also need to be developed on much shorter time scales and at lower costs. A number of advanced optimization strategies have been developed to achieve these requirements. This paper reviews recent progress in turbomachinery design optimization to solve real-world aerodynamic problems, especially for compressors and turbines. This review covers the following topics that are important for optimizing turbomachinery designs: (1) optimization methods, (2) stochastic optimization combined with blade parameterization methods and the design of experiment methods, (3) gradient-based optimization methods for compressors and turbines and (4) data mining techniques for Pareto Fronts. We also present our own insights regarding the current research trends and the future optimization of turbomachinery designs.

  17. Stochastic analytic regularization

    International Nuclear Information System (INIS)

    Alfaro, J.

    1984-07-01

    Stochastic regularization is reexamined, pointing out a restriction on its use due to a new type of divergence which is not present in the unregulated theory. Furthermore, we introduce a new form of stochastic regularization which permits the use of a minimal subtraction scheme to define the renormalized Green functions. (author)

  18. Flexible receiver adapter formal design review

    International Nuclear Information System (INIS)

    Krieg, S.A.

    1995-01-01

    This memo summarizes the results of the Formal (90%) Design Review process and meetings held to evaluate the design of the Flexible Receiver Adapters, support platforms, and associated equipment. The equipment is part of the Flexible Receiver System used to remove, transport, and store long length contaminated equipment and components from both the double and single-shell underground storage tanks at the 200 area tank farms

  19. Design review of the N Reactor

    International Nuclear Information System (INIS)

    1986-09-01

    This review of the design features of the N Reactor was initiated at the request of the Secretary of Energy, John S. Herrington, shortly after, and as a consequence of, reports of the accident at the Soviet reactor complex located at Chernobyl, on April 26, 1986. In the review, special attention was given to those plant systems which are most important in preventing the release of radioactive materials from the plant in the event of combined major equipment failures and human errors. Also, the review studied the potential effects of various severe accident sequences, and addressed the question of whether an event similar in causes or consequences to the Chernobyl accident could occur in the N Reactor. In light of experiences at both Three Mile Island and Chernobyl, the potential for accumulation of hydrogen in excess of flammable limits was given particular attention. The review team was also asked to identify possible improvements to the N Reactor plant, and to evaluate the effects and significance of service-induced degradation. The overall conclusion of the design review is that the N Reactor is safe to operate and that there is no reason to stop or alter its operation in any major respect at this time. Certain additional analyses and testing are recommended to provide a firmer basis for decisions on long-term operation and on measures which may be needed in the future to accommodate long-term operation.

  20. A STUDY ESTABLISHING THE IMPORTANCE OF BODY COMPOSITION ANALYSIS, REGULAR PHYSIOTHERAPY AND DIETARY MODIFICATIONS FOR INDEPENDENT AND HEALTHY LIVING AMONG GERIATRIC POPULATION: A DETAILED SYSTEMATIC REVIEW ARTICLE

    Directory of Open Access Journals (Sweden)

    Rohit Subhedar

    2015-10-01

    Full Text Available Background: This systematic review article aims at a comprehensive and elaborative collection of research articles related to the importance of body composition analysis, physiotherapy and nutrition for an independent geriatric lifestyle. The review includes articles which suggest the importance of body composition analysis, physiotherapy interventions, specific exercises and a combination of fat-free, fiber, fruit and fluid diet. Methods: A comprehensive electronic search was conducted using the electronic databases PubMed, MEDLINE, Google Scholar, Science Direct, ResearchGate, ICMJE, DOAJ, DRJI, IOSR, WAME and many others. In total, 3714 research papers were reviewed which reported age ≥50 years, changes in body composition in the elderly, effects of diet and exercise on body composition, and effects of regular physiotherapy on geriatric health and obesity. The literature search was restricted to studies conducted during 1980-2015. Results: Finally, 55 papers along with the references in the research proposal were included. The review shows that ageing, body composition, physiotherapeutic intervention and nutrition play an interdependent role in providing independent and healthy living among the geriatric population. Conclusion: Combined and comprehensive interventions in the form of periodic body composition analysis, physiotherapy interventions with exercise therapy sessions and nutritional supplementation will be more effective in combating ageing and supporting independent healthy living among the geriatric population. Finally, with this review we conclude that achieving good geriatric health depends upon awareness among the geriatric community to periodically analyze their body composition and regularly comply with exercise therapy sessions, subjective physiotherapy modality sessions and nutritional supplementation. These principles help in achieving a physically fit, healthy, happy and independent geriatric community.

  1. Quality of systematic reviews in pediatric oncology--a systematic review

    DEFF Research Database (Denmark)

    Lundh, Andreas; Knijnenburg, Sebastiaan L; Jørgensen, Anders W

    2009-01-01

    BACKGROUND: To ensure evidence-based decision making in pediatric oncology, systematic reviews are necessary. The objective of our study was to evaluate the methodological quality of all currently existing systematic reviews in pediatric oncology. METHODS: We identified eligible systematic reviews through a systematic search of the literature. Data on clinical and methodological characteristics of the included systematic reviews were extracted. The methodological quality of the included systematic reviews was assessed using the overview quality assessment questionnaire, a validated 10-item quality assessment tool. We compared the methodological quality of systematic reviews published in regular journals with that of Cochrane systematic reviews. RESULTS: We included 117 systematic reviews, 99 systematic reviews published in regular journals and 18 Cochrane systematic reviews. The average methodological...

  2. Spent Nuclear Fuel Cold Vacuum Drying facility comprehensive formal design review report

    International Nuclear Information System (INIS)

    HALLER, C.S.

    1999-01-01

    The majority of the Cold Vacuum Drying Facility (CVDF) design and construction is complete; isolated portions are still in the design and fabrication process. The project commissioned a formal design review to verify the sufficiency and accuracy of current design media to assure that: (1) the design completely and accurately reflects design criteria, (2) design documents are consistent with one another, and (3) the design media accurately reflects the current design. This review is a key element in the design validation and verification activities required by SNF-4396, ''Design Verification and Validation Plan For The Cold Vacuum Drying Facility''. This report documents the results of the formal design review

  3. Review of Designs for Haptic Data Visualization.

    Science.gov (United States)

    Paneels, Sabrina; Roberts, Jonathan C

    2010-01-01

    There are many different uses for haptics, such as training medical practitioners, teleoperation, or navigation of virtual environments. This review focuses on haptic methods that display data. The hypothesis is that haptic devices can be used to present information, and consequently, the user gains quantitative, qualitative, or holistic knowledge about the presented data. Not only is this useful for users who are blind or partially sighted (who can feel line graphs, for instance), but also the haptic modality can be used alongside other modalities, to increase the amount of variables being presented, or to duplicate some variables to reinforce the presentation. Over the last 20 years, a significant amount of research has been done in haptic data presentation; e.g., researchers have developed force feedback line graphs, bar charts, and other forms of haptic representations. However, previous research is published in different conferences and journals, with different application emphases. This paper gathers and collates these various designs to provide a comprehensive review of designs for haptic data visualization. The designs are classified by their representation: Charts, Maps, Signs, Networks, Diagrams, Images, and Tables. This review provides a comprehensive reference for researchers and learners, and highlights areas for further research.

  4. On-line Peer Review in Teaching Design-oriented Courses

    Directory of Open Access Journals (Sweden)

    Hai Ning

    2004-02-01

    Full Text Available Peer review has been one of the very important design-facilitating processes practiced in the education field, particularly in design-oriented courses such as MIT's 2.007 Robot Design. Typically, students exchange ideas sketched on a piece of paper and critique each other's designs within a small team. We designed the PREP web application, backed by a range of web services that handle the peer-review process on-line, and we argue that this is a significant step towards supporting design-oriented courses on-line. We believe that the lessons learned could be applied to other interested institutes that offer design-oriented courses.

  5. Comparative study of casual and regular workers' job satisfaction ...

    African Journals Online (AJOL)

    This study compared regular and casual workers' job satisfaction and commitment in two selected banks in Lagos State. A descriptive survey research design was adopted for the study. A total of 145 respondents were selected for the study using a proportionate stratified sampling technique.

  6. Effective field theory dimensional regularization

    International Nuclear Information System (INIS)

    Lehmann, Dirk; Prezeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Green's functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed.

  7. Effective field theory dimensional regularization

    Science.gov (United States)

    Lehmann, Dirk; Prézeau, Gary

    2002-01-01

    A Lorentz-covariant regularization scheme for effective field theories with an arbitrary number of propagating heavy and light particles is given. This regularization scheme leaves the low-energy analytic structure of Green's functions intact and preserves all the symmetries of the underlying Lagrangian. The power divergences of regularized loop integrals are controlled by the low-energy kinematic variables. Simple diagrammatic rules are derived for the regularization of arbitrary one-loop graphs and the generalization to higher loops is discussed.

  8. Asymptotic performance of regularized quadratic discriminant analysis based classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-12-13

    This paper carries out a large dimensional analysis of the standard regularized quadratic discriminant analysis (QDA) classifier designed on the assumption that data arise from a Gaussian mixture model. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that depends only on the covariances and means associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized QDA and can be used to determine the optimal regularization parameter that minimizes the misclassification error probability. Despite being valid only for Gaussian data, our theoretical findings are shown to yield high accuracy in predicting the performance achieved with real data sets drawn from popular databases, thereby making an interesting connection between theory and practice.
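
    A minimal sketch of the kind of classifier analyzed above is given below: QDA in which each sample covariance is ridge-regularized as Sigma_hat + gamma*I. The regularization form, parameter names and toy data are illustrative assumptions, and the sketch does not reproduce the random-matrix analysis or the optimal choice of gamma discussed in the record.

      # Minimal regularized QDA sketch: Gaussian class models with ridge-regularized
      # sample covariances (Sigma_hat + gamma * I).  Illustrative only.
      import numpy as np

      class RegularizedQDA:
          def __init__(self, gamma=0.1):
              self.gamma = gamma

          def fit(self, X, y):
              self.classes_ = np.unique(y)
              self.params_ = {}
              for c in self.classes_:
                  Xc = X[y == c]
                  mu = Xc.mean(axis=0)
                  Sigma = np.cov(Xc, rowvar=False) + self.gamma * np.eye(X.shape[1])
                  _, logdet = np.linalg.slogdet(Sigma)
                  self.params_[c] = (mu, np.linalg.inv(Sigma), logdet,
                                     np.log(len(Xc) / len(X)))
              return self

          def predict(self, X):
              scores = []
              for c in self.classes_:
                  mu, P, logdet, logprior = self.params_[c]
                  D = X - mu
                  # Gaussian discriminant: -0.5*log|Sigma| - 0.5*(x-mu)' Sigma^-1 (x-mu) + log prior
                  scores.append(-0.5 * logdet
                                - 0.5 * np.einsum("ij,jk,ik->i", D, P, D)
                                + logprior)
              return self.classes_[np.argmax(np.stack(scores, axis=1), axis=1)]

      rng = np.random.default_rng(1)
      X0 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
      X1 = rng.normal([2.0, 2.0], 1.5, size=(100, 2))
      X, y = np.vstack([X0, X1]), np.array([0] * 100 + [1] * 100)
      print((RegularizedQDA(gamma=0.5).fit(X, y).predict(X) == y).mean())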

  9. Hierarchical regular small-world networks

    International Nuclear Information System (INIS)

    Boettcher, Stefan; Goncalves, Bruno; Guclu, Hasan

    2008-01-01

    Two new networks are introduced that resemble small-world properties. These networks are recursively constructed but retain a fixed, regular degree. They possess a unique one-dimensional lattice backbone overlaid by a hierarchical sequence of long-distance links, mixing real-space and small-world features. Both networks, one 3-regular and the other 4-regular, lead to distinct behaviors, as revealed by renormalization group studies. The 3-regular network is planar, has a diameter growing as √N with system size N, and leads to super-diffusion with an exact, anomalous exponent d_w = 1.306..., but possesses only a trivial fixed point T_c = 0 for the Ising ferromagnet. In turn, the 4-regular network is non-planar, has a diameter growing as ∼2^√(log₂ N²), exhibits 'ballistic' diffusion (d_w = 1), and a non-trivial ferromagnetic transition, T_c > 0. It suggests that the 3-regular network is still quite 'geometric', while the 4-regular network qualifies as a true small world with mean-field properties. As an engineering application we discuss synchronization of processors on these networks. (fast track communication)

  10. 75 FR 76006 - Regular Meeting

    Science.gov (United States)

    2010-12-07

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. ACTION: Regular meeting. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). Date and Time: The meeting of the Board will be held...

  11. Perceiving temporal regularity in music: The role of auditory event-related potentials (ERPs) in probing beat perception

    NARCIS (Netherlands)

    Honing, H.; Bouwer, F.L.; Háden, G.P.; Merchant, H.; de Lafuente, V.

    2014-01-01

    The aim of this chapter is to give an overview of how the perception of a regular beat in music can be studied in human adults, human newborns, and nonhuman primates using event-related brain potentials (ERPs). Next to a review of the recent literature on the perception of temporal regularity in

  12. Reducing errors in the GRACE gravity solutions using regularization

    Science.gov (United States)

    Save, Himanshu; Bettadpur, Srinivas; Tapley, Byron D.

    2012-09-01

    The nature of the gravity field inverse problem amplifies the noise in the GRACE data, which creeps into the mid and high degree and order harmonic coefficients of the Earth's monthly gravity fields provided by GRACE. Due to the use of imperfect background models and data noise, these errors are manifested as north-south striping in the monthly global maps of equivalent water heights. In order to reduce these errors, this study investigates the use of the L-curve method with Tikhonov regularization. The L-curve is a popular aid for determining a suitable value of the regularization parameter when solving linear discrete ill-posed problems using Tikhonov regularization. However, the computational effort required to determine the L-curve is prohibitively high for a large-scale problem like GRACE. This study implements a parameter-choice method using Lanczos bidiagonalization, which is a computationally inexpensive approximation to the L-curve. Lanczos bidiagonalization is implemented with orthogonal transformation in a parallel computing environment and projects the large estimation problem onto a problem about two orders of magnitude smaller for computing the regularization parameter. Errors in the GRACE solution time series have certain characteristics that vary depending on the ground track coverage of the solutions. These errors increase with increasing degree and order. In addition, certain resonant and near-resonant harmonic coefficients have higher errors as compared with the other coefficients. Using the knowledge of these characteristics, this study designs a regularization matrix that provides a constraint on the geopotential coefficients as a function of degree and order. This regularization matrix is then used to compute the appropriate regularization parameter for each monthly solution. A 7-year time series of the candidate regularized solutions (Mar 2003-Feb 2010) shows markedly reduced error stripes compared with the unconstrained GRACE release 4
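
    The toy sketch below shows only the basic Tikhonov/L-curve idea (solutions over a grid of regularization parameters, with the corner picked as the point of maximum curvature in log-log space) on a small dense ill-posed system. It is not the Lanczos-bidiagonalization, degree/order-dependent machinery described above; the test problem and parameter grid are invented.

      # Toy Tikhonov regularization with an L-curve corner search on a small,
      # Hilbert-like ill-posed system.  Only the basic idea; the GRACE processing
      # above uses Lanczos bidiagonalization and a tailored regularization matrix.
      import numpy as np

      rng = np.random.default_rng(2)
      n = 50
      A = np.array([[1.0 / (i + j + 1) for j in range(n)] for i in range(n)])
      x_true = np.sin(np.linspace(0.0, np.pi, n))
      b = A @ x_true + 1e-4 * rng.normal(size=n)

      lams = np.logspace(-10, 0, 60)
      res_norm, sol_norm, sols = [], [], []
      for lam in lams:
          x = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)   # Tikhonov solution
          sols.append(x)
          res_norm.append(np.linalg.norm(A @ x - b))
          sol_norm.append(np.linalg.norm(x))

      # L-curve corner = maximum curvature of (log residual norm, log solution norm)
      r, s = np.log(res_norm), np.log(sol_norm)
      dr, ds = np.gradient(r), np.gradient(s)
      d2r, d2s = np.gradient(dr), np.gradient(ds)
      curvature = np.abs(dr * d2s - ds * d2r) / (dr**2 + ds**2) ** 1.5
      best = 2 + int(np.argmax(curvature[2:-2]))      # ignore the grid end points
      print("chosen lambda:", lams[best],
            "relative error vs. truth:",
            np.linalg.norm(sols[best] - x_true) / np.linalg.norm(x_true))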

  13. General inverse problems for regular variation

    DEFF Research Database (Denmark)

    Damek, Ewa; Mikosch, Thomas Valentin; Rosinski, Jan

    2014-01-01

    Regular variation of distributional tails is known to be preserved by various linear transformations of some random structures. An inverse problem for regular variation aims at understanding whether the regular variation of a transformed random object is caused by regular variation of components ...

  14. Design Review Report for Concrete Cover Block Replaced by Steel Plate

    Energy Technology Data Exchange (ETDEWEB)

    JAKA, O.M.

    2000-07-27

    The design for the steel cover plates to replace concrete cover blocks for U-109 was reviewed and approved in a design review meeting. The design for steel plates to replace concrete blocks was reviewed and approved, by comparison and similarity with U-109, for the following additional pits: 241-U-105, 241-I-103, 241-Ax-101, 241-A-101, 241-SX-105, 241-S-A, 241-S-C, 241-SX-A.

  15. Processing SPARQL queries with regular expressions in RDF databases

    Science.gov (United States)

    2011-01-01

    Background As the Resource Description Framework (RDF) data model is widely used for modeling and sharing a lot of online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL - a W3C recommendation query for RDF databases - has become an important query language for querying the bioinformatics knowledge bases. Moreover, due to the diversity of users’ requests for extracting information from the RDF data as well as the lack of users’ knowledge about the exact value of each fact in the RDF databases, it is desirable to use the SPARQL query with regular expression patterns for querying the RDF data. To the best of our knowledge, there is currently no work that efficiently supports regular expression processing in SPARQL over RDF databases. Most of the existing techniques for processing regular expressions are designed for querying a text corpus, or only for supporting the matching over the paths in an RDF graph. Results In this paper, we propose a novel framework for supporting regular expression processing in SPARQL query. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model in order to adapt the proposed framework in the existing query optimizers. 3) We build a prototype for the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Conclusions Experiments with a full-blown RDF engine show that our framework outperforms the existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns. PMID:21489225

  16. Processing SPARQL queries with regular expressions in RDF databases.

    Science.gov (United States)

    Lee, Jinsoo; Pham, Minh-Duc; Lee, Jihwan; Han, Wook-Shin; Cho, Hune; Yu, Hwanjo; Lee, Jeong-Hoon

    2011-03-29

    As the Resource Description Framework (RDF) data model is widely used for modeling and sharing a lot of online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL - a W3C recommendation query for RDF databases - has become an important query language for querying the bioinformatics knowledge bases. Moreover, due to the diversity of users' requests for extracting information from the RDF data as well as the lack of users' knowledge about the exact value of each fact in the RDF databases, it is desirable to use the SPARQL query with regular expression patterns for querying the RDF data. To the best of our knowledge, there is currently no work that efficiently supports regular expression processing in SPARQL over RDF databases. Most of the existing techniques for processing regular expressions are designed for querying a text corpus, or only for supporting the matching over the paths in an RDF graph. In this paper, we propose a novel framework for supporting regular expression processing in SPARQL query. Our contributions can be summarized as follows. 1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. 2) We propose a cost model in order to adapt the proposed framework in the existing query optimizers. 3) We build a prototype for the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Experiments with a full-blown RDF engine show that our framework outperforms the existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns.
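
    For readers unfamiliar with the construct these papers target, the example below runs a SPARQL SELECT with a regex() FILTER through the rdflib Python library on a tiny invented graph (the namespace, predicates and data are made up). rdflib's stock engine answers such queries, but it does not implement the optimization framework proposed by the authors.

      # A SPARQL query with a regular-expression FILTER, evaluated by rdflib on a
      # small in-memory graph.  Data and vocabulary are invented for illustration.
      from rdflib import Graph, Literal, Namespace, RDF

      EX = Namespace("http://example.org/")
      g = Graph()
      for name in ("alpha-keratin", "beta-catenin", "hemoglobin"):
          protein = EX[name]
          g.add((protein, RDF.type, EX.Protein))
          g.add((protein, EX.label, Literal(name)))

      query = """
      PREFIX ex: <http://example.org/>
      SELECT ?p ?label WHERE {
          ?p a ex:Protein ;
             ex:label ?label .
          FILTER regex(?label, "^(alpha|beta)-", "i")
      }
      """
      for row in g.query(query):
          print(row.p, row.label)   # prints the two proteins whose label matches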

  17. Design iteration in construction projects – Review and directions

    Directory of Open Access Journals (Sweden)

    Purva Mujumdar

    2018-03-01

    Full Text Available The design phase of any construction project involves several designers who exchange information with each other, most often in an unstructured manner, throughout the design phase. When these information exchanges happen to occur in cycles/loops, it is termed design iteration. Iteration is an inherent and unavoidable aspect of any design phase and requires proper planning. To date, very few researchers have explored design iteration ("complexity") in the construction sector. Hence, the objective of this paper was to document and review the complexities of iteration during the design phase of construction projects for efficient design planning. To achieve this objective, an exhaustive literature review on design iteration was done for four sectors – construction, manufacturing, aerospace, and software development. In addition, semi-structured interviews and discussions were done with a few design experts to verify the different dimensions of iteration. Finally, a design iteration framework was presented in this study that facilitates successful planning. Keywords: Design iteration, Types of iteration, Causes and impact of iteration, Models of iteration, Execution strategies of iteration

  18. Independent design review report for truck number 1 modifications for flammable gas tanks

    International Nuclear Information System (INIS)

    Wilson, G.W.

    1997-01-01

    The East and West Tank Farm Standing Order 97-01 requires that the PMST be modified to include purging of the enclosed space underneath the shielded receiver weather cover per National Fire Protection Association (NFPA) 496, Purged and Pressurized Enclosures for Electrical Equipment. The Standing Order also requires that the PMST be modified by replacing the existing electrical remote latch unit (RLU) with a mechanical remote latch unit. As the mechanical remote latch unit was exactly like the RLU installed on the Rotary Mode Core Sampler Trucks (RMCST) and the design for the RMCST went through formal design review, replacing the RLU was done utilizing informal design verification and was completed per work package ES-97-0028. As the weather cover purge was similar to the design for the RMCSTs, this design was reviewed using the independent review method with multiple independent reviewers. A functional design criteria document (WHC-SD-WM-FDC-048, Functional Design Criteria for Core Sampling in Flammable Gas Watch List Tanks) provided the criteria for the modifications. The review consisted of distributing the design review package to the reviewers and collecting and dispositioning the RCR comments. The review package included the ECNs for review, the Design Compliance Matrix, copies of all drawings affected, and copies of outstanding ECNs against these drawings. A final meeting was held to ensure that all reviewers were aware of the changes to ECNs from incorporation of RCR comments.

  19. Thermal Analysis of Iodine Satellite (iSAT) from Preliminary Design Review (PDR) to Critical Design Review (CDR)

    Science.gov (United States)

    Mauro, Stephanie

    2016-01-01

    The Iodine Satellite (iSAT) is a 12U cubesat with a primary mission to demonstrate the iodine fueled Hall Effect Thruster (HET) propulsion system. The spacecraft (SC) will operate throughout a one year mission in an effort to mature the propulsion system for use in future applications. The benefit of the HET is that it uses a propellant, iodine, which is easy to store and provides a high thrust-to-mass ratio. This paper will describe the thermal analysis and design of the SC between Preliminary Design Review (PDR) and Critical Design Review (CDR). The design of the satellite has undergone many changes due to a variety of challenges, both before PDR and during the time period discussed in this paper. Thermal challenges associated with the system include a high power density, small amounts of available radiative surface area, localized temperature requirements of the propulsion components, and unknown orbital parameters. The thermal control system is implemented to maintain component temperatures within their respective operational limits throughout the mission, while also maintaining propulsion components at the high temperatures needed to allow gaseous iodine propellant to flow. The design includes heaters, insulation, radiators, coatings, and thermal straps. Currently, the maximum temperatures for several components are near to their maximum operation limit, and the battery is close to its minimum operation limit. Mitigation strategies and planned work to solve these challenges will be discussed.

  20. Breckinridge Project, initial effort. Report XI, Volume V. Critical review of the design basis. [Critical review

    Energy Technology Data Exchange (ETDEWEB)

    None

    1982-01-01

    Report XI, Technical Audit, is a compendium of research material used during the Initial Effort in making engineering comparisons and decisions. Volumes 4 and 5 of Report XI present those studies which provide a Critical Review of the Design Basis. The Critical Review Report, prepared by Intercontinental Econergy Associates, Inc., summarizes findings from an extensive review of the data base for the H-Coal process design. Volume 4 presents this review and assessment, and includes supporting material; specifically, Design Data Tabulation (Appendix A), Process Flow Sheets (Appendix B), and References (Appendix C). Volume 5 is a continuation of the references of Appendix C. Studies of a proprietary nature are noted and referenced, but are not included in these volumes. They are included in the Limited Access versions of these reports and may be reviewed by properly cleared personnel in the offices of Ashland Synthetic Fuels, Inc.

  1. Design for six sigma: A review

    Directory of Open Access Journals (Sweden)

    Kouroush Jenab

    2018-01-01

    Full Text Available Six Sigma is recognized as an essential tool for continuous improvement of quality. A large number of publications by various authors reflect the interest in this technique. Reviews of the literature on Six Sigma have been done in the past by a few authors. However, considering the contributions in recent times, a more comprehensive review is attempted here. The authors have examined various papers and have proposed a different scheme of classification. In addition, certain gaps that would provide hints for further research in Six Sigma have been identified. As a result, the relationship between Six Sigma and Design for Six Sigma (DFSS), and how these two concepts support the quality system for organizational learning and innovation performance, is discussed; this would help researchers, academicians and practitioners to take a closer look at the growth, development and applicability of Six Sigma in design.

  2. Continuum-regularized quantum gravity

    International Nuclear Information System (INIS)

    Chan Huesum; Halpern, M.B.

    1987-01-01

    The recent continuum regularization of d-dimensional Euclidean gravity is generalized to arbitrary power-law measure and studied in some detail as a representative example of coordinate-invariant regularization. The weak-coupling expansion of the theory illustrates a generic geometrization of regularized Schwinger-Dyson rules, generalizing previous rules in flat space and flat superspace. The rules are applied in a non-trivial explicit check of Einstein invariance at one loop: the cosmological counterterm is computed and its contribution is included in a verification that the graviton mass is zero. (orig.)

  3. Communicating Qualitative Research Study Designs to Research Ethics Review Boards

    Science.gov (United States)

    Ells, Carolyn

    2011-01-01

    Researchers using qualitative methodologies appear to be particularly prone to having their study designs called into question by research ethics or funding agency review committees. In this paper, the author considers the issue of communicating qualitative research study designs in the context of institutional research ethics review and offers…

  4. Online co-regularized algorithms

    NARCIS (Netherlands)

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.

    2012-01-01

    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks.

  5. Geometric continuum regularization of quantum field theory

    International Nuclear Information System (INIS)

    Halpern, M.B.

    1989-01-01

    An overview of the continuum regularization program is given. The program is traced from its roots in stochastic quantization, with emphasis on the examples of regularized gauge theory, the regularized general nonlinear sigma model and regularized quantum gravity. In its coordinate-invariant form, the regularization is seen as entirely geometric: only the supermetric on field deformations is regularized, and the prescription provides universal nonperturbative invariant continuum regularization across all quantum field theory. 54 refs

  6. A review of a scientific work A SCHOOL TO MATCH ANY CHILD A MANUAL FOR WORKING WITH PUPILS WITH DEVELOPMENTAL DIFICULTIES IN REGULAR SCHOOLS

    Directory of Open Access Journals (Sweden)

    Sofija АRNAUDOVA

    2005-12-01

    Full Text Available The presented text is a review of the scientific work A school to match any child, a manual for working with children with physical disabilities in regular schools. The author and editor of this work is Sulejman Hrnjica, in cooperation with Vera Rajovikj, Tatjana Cholin, Ksenija Krtikj and Dijana Dopunovikj. The manual is part of the project Inclusion of students with physical disabilities in regular primary education, which was approved and financed by the Ministry of Education and Sport of the Republic of Serbia, and published by the Institute of Psychology at the Faculty of Philosophy in Belgrade, with the assistance of the foundation "Save the Children" from Great Britain, the office in Belgrade, 2004. This work can hardly be found in bookshops throughout the country, but it can be found in the library of the Faculty of Philosophy, and certainly at the Book Fair, held every year in Skopje.

  7. Cognitive and personality factors in the regular practice of martial arts.

    Science.gov (United States)

    Fabio, Rosa A; Towey, Giulia E

    2018-06-01

    The effects of regular practice of martial arts are considered controversial, and studies in this field have limited their attention to individual psychological benefits. The aim of this study is to examine the relationship between the regular practice of martial arts and cognitive and personality factors such as attention, creativity and school performance, together with self-esteem, self-efficacy and aggression. The design consists of a factorial design with two independent variables (groups and age levels) and seven dependent variables (attention, creativity, intelligence, school performance, self-esteem, self-efficacy and aggression). Seventy-six people practicing martial arts were compared with a control group (70 participants) not involved in any martial arts training. Martial artists were divided into groups of three levels of experience: beginners, intermediate and experts. Each participant completed a battery of tests that measured all the cognitive and personality factors. Martial artists presented better performance on the attention and creativity tests. All the personality factors analyzed showed a significant difference between the two groups, resulting in higher levels of self-esteem and self-efficacy and a decrease in aggressiveness. Regular practice of martial arts can influence many functional aspects, leading to positive effects on both personality and cognitive factors, with implications for psychological well-being and for the educational field. The results were discussed with reference to theories claiming that regular activity has a differential positive effect on some aspects of cognition.

  8. Using Tikhonov Regularization for Spatial Projections from CSR Regularized Spherical Harmonic GRACE Solutions

    Science.gov (United States)

    Save, H.; Bettadpur, S. V.

    2013-12-01

    It has been demonstrated previously that using Tikhonov regularization produces spherical harmonic solutions from GRACE that have very little residual striping while capturing all the signal observed by GRACE within the noise level. This paper demonstrates a two-step process that uses Tikhonov regularization to remove the residual stripes in the CSR regularized spherical harmonic coefficients when computing the spatial projections. We discuss methods to produce mass anomaly grids that have no stripe features while satisfying the necessary condition of capturing all observed signal within the GRACE noise level.

  9. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan; Wang, Yunji; Jing, Bing-Yi; Gao, Xin

    2015-01-01

    In this paper we investigate the use of the regularized correntropy framework for learning classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to the noisy and outlying labels of training samples, because the traditional loss functions are applied equally to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with traditional loss functions.

  10. Regularized maximum correntropy machine

    KAUST Repository

    Wang, Jim Jing-Yan

    2015-02-12

    In this paper we investigate the use of the regularized correntropy framework for learning classifiers from noisy labels. The class label predictors learned by minimizing traditional loss functions are sensitive to the noisy and outlying labels of training samples, because the traditional loss functions are applied equally to all the samples. To solve this problem, we propose to learn the class label predictors by maximizing the correntropy between the predicted labels and the true labels of the training samples, under the regularized Maximum Correntropy Criteria (MCC) framework. Moreover, we regularize the predictor parameter to control the complexity of the predictor. The learning problem is formulated by an objective function considering the parameter regularization and MCC simultaneously. By optimizing the objective function alternately, we develop a novel predictor learning algorithm. The experiments on two challenging pattern classification tasks show that it significantly outperforms the machines with traditional loss functions.
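
    A hedged sketch of the general idea follows: maximize the correntropy between predictions and labels, sum_i exp(-(y_i - x_i.w)^2 / 2 sigma^2), minus a ridge penalty lambda*||w||^2, using a half-quadratic (iteratively reweighted least squares) loop in which large-residual samples receive exponentially small weight. The loss form, update scheme and parameter names are illustrative assumptions, not the authors' exact algorithm.

      # Hedged sketch of a regularized maximum-correntropy linear predictor trained
      # by half-quadratic (iteratively reweighted least squares) optimization.
      import numpy as np

      def regularized_mcc(X, y, sigma=1.0, lam=0.1, n_iter=30):
          """Approximately maximize sum_i exp(-(y_i - x_i.w)^2 / (2 sigma^2)) - lam*||w||^2."""
          n, d = X.shape
          w = np.zeros(d)
          for _ in range(n_iter):
              r = y - X @ w
              # Half-quadratic weights: samples with large residuals (likely label
              # noise) get exponentially small influence on the next update.
              a = np.exp(-r**2 / (2.0 * sigma**2))
              Xw = X * a[:, None]
              w = np.linalg.solve(X.T @ Xw + lam * np.eye(d), Xw.T @ y)
          return w

      rng = np.random.default_rng(3)
      X = rng.normal(size=(200, 5))
      w_true = rng.normal(size=5)
      y = np.sign(X @ w_true)
      y[rng.choice(200, 20, replace=False)] *= -1        # flip 10% of the labels
      w_hat = regularized_mcc(X, y)
      print("sign agreement with the clean labels:",
            (np.sign(X @ w_hat) == np.sign(X @ w_true)).mean())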

  11. Review on design and control aspects of ankle rehabilitation robots.

    Science.gov (United States)

    Jamwal, Prashant K; Hussain, Shahid; Xie, Sheng Q

    2015-03-01

    Ankle rehabilitation robots can play an important role in improving outcomes of rehabilitation treatment by assisting therapists and patients in a number of ways. Consequently, a number of robot designs have been proposed by researchers, which fall into one of two categories: wearable robots or platform-based robots. This paper presents a review of both kinds of ankle robots along with a brief analysis of their design, actuation and control approaches. While reviewing these designs, it was observed that most of them are undesirably inspired by industrial robot designs. Taking note of the design concerns of current ankle robots, a few improvements to ankle robot designs are also suggested. The conventional position-control and force-control approaches used in existing ankle robots are reviewed. Opportunities for improvement apparently also exist in the actuation as well as the control of ankle robots. Subsequently, a discussion of the most recent research on the development of novel actuators and advanced controllers based on appropriate physical and cognitive human-robot interaction is also included in this review. Implications for Rehabilitation: Ankle joint functions are restricted or impaired as a consequence of stroke or injury during sports or otherwise. Robots can help in reinstating functions faster and can also work as a tool for recording rehabilitation data useful for further analysis. The evolution of ankle robots with respect to their design and control aspects is discussed in the present paper and a novel design with a futuristic control approach is proposed.

  12. Specialist medication review does not benefit short-term outcomes and net costs in continuing-care patients.

    LENUS (Irish Health Repository)

    Pope, George

    2012-01-31

    OBJECTIVES: to evaluate specialist geriatric input and medication review in patients in high-dependency continuing care. DESIGN: prospective, randomised, controlled trial. SETTING: two residential continuing care hospitals. PARTICIPANTS: two hundred and twenty-five permanent patients. INTERVENTION: patients were randomised to either specialist geriatric input or regular input. The specialist group had a medical assessment by a geriatrician and medication review by a multidisciplinary expert panel. Regular input consisted of review as required by a medical officer attached to each ward. Reassessment occurred after 6 months. RESULTS: one hundred and ten patients were randomised to specialist input and 115 to regular input. These were comparable for age, gender, dependency levels and cognition. After 6 months, the total number of medications per patient per day fell from 11.64 to 11.09 in the specialist group (P = 0.0364) and increased from 11.07 to 11.5 in the regular group (P = 0.094). There was no significant difference in mortality or frequency of acute hospital transfers (11 versus 6 in the specialist versus regular group, P = 0.213). CONCLUSION: specialist geriatric assessment and medication review in hospital continuing care resulted in a reduction in medication use, but at a significant cost. No benefits in hard clinical outcomes were demonstrated. However, qualitative benefits and lower costs may become evident over longer periods.

  13. Design review report for ecn 638521 (241-SX-106 cover plate installation)

    Energy Technology Data Exchange (ETDEWEB)

    MCVEY, C.B.

    1998-10-01

    The design for the cover plates on 241-SX-106 was reviewed on 9/10/98. All comments were resolved to the satisfaction of the reviewers. A design calculation for seismic movement was performed and resulted in a design addition to prevent cover block movement. Calculations were also performed for radiological design and are included. The formal design review has no outstanding action items remaining and supports the use of 2 inch steel cover plates to provide personnel shielding and spray knock-down protection (as required by the BIO).

  14. Processing SPARQL queries with regular expressions in RDF databases

    Directory of Open Access Journals (Sweden)

    Cho Hune

    2011-03-01

    Abstract Background As the Resource Description Framework (RDF) data model is widely used for modeling and sharing many online bioinformatics resources such as Uniprot (dev.isb-sib.ch/projects/uniprot-rdf) or Bio2RDF (bio2rdf.org), SPARQL - a W3C recommendation query language for RDF databases - has become an important query language for querying bioinformatics knowledge bases. Moreover, due to the diversity of users' requests for extracting information from the RDF data, as well as the lack of users' knowledge about the exact value of each fact in the RDF databases, it is desirable to use SPARQL queries with regular expression patterns for querying the RDF data. To the best of our knowledge, there is currently no work that efficiently supports regular expression processing in SPARQL over RDF databases. Most of the existing techniques for processing regular expressions are designed for querying a text corpus, or only for supporting matching over the paths in an RDF graph. Results In this paper, we propose a novel framework for supporting regular expression processing in SPARQL queries. Our contributions can be summarized as follows. (1) We propose an efficient framework for processing SPARQL queries with regular expression patterns in RDF databases. (2) We propose a cost model in order to adapt the proposed framework to existing query optimizers. (3) We build a prototype for the proposed framework in C++ and conduct extensive experiments demonstrating the efficiency and effectiveness of our technique. Conclusions Experiments with a full-blown RDF engine show that our framework outperforms existing ones by up to two orders of magnitude in processing SPARQL queries with regular expression patterns.
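    For illustration only, the snippet below shows what a SPARQL query with a regular-expression filter looks like when evaluated with plain rdflib in Python; the tiny graph, namespace and property names are hypothetical, and this is ordinary in-memory evaluation rather than the optimized framework described above.

```python
from rdflib import Graph, Literal, Namespace

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX["p1"], EX.label, Literal("kinase inhibitor")))
g.add((EX["p2"], EX.label, Literal("membrane protein")))

# REGEX in the FILTER clause matches labels against a pattern,
# here any label containing "kinase" (case-insensitive).
query = """
PREFIX ex: <http://example.org/>
SELECT ?s ?label WHERE {
    ?s ex:label ?label .
    FILTER regex(?label, "kinase", "i")
}
"""
for row in g.query(query):
    print(row.s, row.label)
```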

  15. Decommissioning Lines-of-Inquiry for Design Review of New Nuclear Facilities

    International Nuclear Information System (INIS)

    Negin, C.A.; Urland, C.S.

    2008-01-01

    An independent review of the design of the Salt Waste Processing Facility (SWPF) at Savannah River included a requirement to address the ability to decommission the facility. This paper addresses the lines of inquiry that were developed for the review and their future use in reviews of other projects, referred to herein as 'DDLOI'. Decommissioning activities for almost any type of facility are well within the technological state of the art. The major complications resulting from insufficient consideration during design of a new facility that involves radioactive processes and/or materials are the costs of: a) gaining access to high-radiation areas and b) dealing with high levels of contamination. For this reason, the DDLOI were developed as a way of raising the awareness of designers and design reviewers to design features that can impede or facilitate ultimate decommissioning. The intent is that this report can be used not only for review, but also by engineers in the early stages of design development when requirements are being assembled. The focus of the DDLOI is on types of facilities that contain nuclear and/or radioactive processes and materials. The level of detail is more specific than would be found in decommissioning plans prepared for regulatory purposes. In commencing this review, the authors could find no precedent for a systematic review of design for decommissioning that included results of a review. Therefore, it was decided to create a report that would provide detailed lines of inquiry along with the rationale for each. The resulting DDLOI report included 21 topical areas for design review. The DDLOI combined the authors' experience in developing baselines for facilities to be deactivated or demolished with prior publications by the U.S. Army and the International Atomic Energy Agency. These two references were found via an Internet search and were the only ones judged to be useful at a field application level. Most others

  16. Mixture design: A review of recent applications in the food industry

    OpenAIRE

    Yeliz Buruk Şahin; Ezgi Aktar Demirtaş; Nimetullah Burnak

    2016-01-01

    Design of experiments (DOE) is a systematic approach to applying statistical methods to the experimental process. The main purpose of this study is to provide useful insights into mixture design as a special type of DOE and to present a review of current mixture design applications in the food industry. The theoretical principles of mixture design and its application in the food industry, based on an extensive review of the literature, are described. Mixture design types, such as simplex-latt...

  17. A Large Dimensional Analysis of Regularized Discriminant Analysis Classifiers

    KAUST Repository

    Elkhalil, Khalil

    2017-11-01

    This article carries out a large dimensional analysis of standard regularized discriminant analysis classifiers designed on the assumption that data arise from a Gaussian mixture model with different means and covariances. The analysis relies on fundamental results from random matrix theory (RMT) when both the number of features and the cardinality of the training data within each class grow large at the same pace. Under mild assumptions, we show that the asymptotic classification error approaches a deterministic quantity that depends only on the means and covariances associated with each class as well as the problem dimensions. Such a result permits a better understanding of the performance of regularized discriminant analysis in practical, large but finite dimensions, and can be used to determine and pre-estimate the optimal regularization parameter that minimizes the misclassification error probability. Despite being theoretically valid only for Gaussian data, our findings are shown to yield a high accuracy in predicting the performance achieved with real data sets drawn from the popular USPS database, thereby making an interesting connection between theory and practice.
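    As a minimal numpy sketch of the kind of classifier analyzed (the shrinkage form and the parameter value are illustrative assumptions, not the paper's exact estimator), each class covariance can be regularized toward a scaled identity before forming the Gaussian discriminant scores.

```python
import numpy as np

def fit_rda(X, y, gamma=0.1):
    """Fit class means and regularized covariances:
    Sigma_k_reg = (1 - gamma) * Sigma_k + gamma * (tr(Sigma_k)/p) * I."""
    classes = np.unique(y)
    p = X.shape[1]
    params = {}
    for k in classes:
        Xk = X[y == k]
        mu = Xk.mean(axis=0)
        S = np.cov(Xk, rowvar=False)
        S_reg = (1 - gamma) * S + gamma * (np.trace(S) / p) * np.eye(p)
        params[k] = (mu, np.linalg.inv(S_reg), np.linalg.slogdet(S_reg)[1],
                     len(Xk) / len(X))
    return params

def predict_rda(params, X):
    """Assign each sample to the class with the largest Gaussian log-score."""
    scores = []
    for mu, Sinv, logdet, prior in params.values():
        d = X - mu
        scores.append(-0.5 * np.sum(d @ Sinv * d, axis=1)
                      - 0.5 * logdet + np.log(prior))
    keys = list(params.keys())
    return np.array([keys[i] for i in np.argmax(np.vstack(scores), axis=0)])

# Hypothetical usage on toy Gaussian data.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal(0, 1, (40, 10)), rng.normal(1, 2, (40, 10))])
y = np.repeat([0, 1], 40)
model = fit_rda(X, y, gamma=0.2)
print((predict_rda(model, X) == y).mean())
```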

  18. Supporting design reviews with pre-meeting virtual reality environments

    NARCIS (Netherlands)

    van den Berg, Marc Casper; Hartmann, Timo; de Graaf, Robin S.

    2017-01-01

    The purpose of this paper is to explore how design reviews can be supported with pre-meeting virtual reality environments. Previous research has not systematically investigated how virtual environments can be used to communicate the design intent (to clients) and to communicate feedback (to design

  19. Development of human factors design review guidelines

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul

    1997-10-01

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: '25, Human factors engineering program review model' and '26, Review criteria for human factors aspects of advanced controls and instrumentation', which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents so that the developer of safety regulation guides in KINS can see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994

  20. Development of human factors design review guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Taejon (Korea)

    1997-10-01

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: 25. Human Factors Engineering Program Review Model and 26. Review Criteria for Human Factors Aspects of Advanced Controls and Instrumentation, which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, additional comments, and selected portion of the reference documents for the developer of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994. (author). 12 refs., 5 figs., 2 tabs.

  1. The "Learning in Regular Classrooms" Initiative for Inclusive Education in China

    Science.gov (United States)

    Xu, Su Qiong; Cooper, Paul; Sin, Kenneth

    2018-01-01

    The purpose of this article is to understand the Learning in Regular Classrooms (LRC) initiative for inclusive education in China. First, the paper reviews the policy, legislation, and practice in relation to the LRC. It then goes on to explore the specific social-political context of the LRC, and compares the Chinese LRC with the Western…

  2. Advanced human-system interface design review guidelines

    International Nuclear Information System (INIS)

    O'Hara, J.M.

    1990-01-01

    Advanced, computer-based, human-system interface designs are emerging in nuclear power plant control rooms as a result of several factors. These include: (1) incorporation of new systems such as safety parameter display systems, (2) backfitting of current control rooms with new technologies when existing hardware is no longer supported by equipment vendors, and (3) development of advanced control room concepts. Control rooms of the future will be developed almost exclusively with advanced instrumentation and controls based upon digital technology. In addition, the control room operator will be interfacing with more intelligent systems which will be capable of providing information processing support to the operator. These developments may have significant implications for plant safety in that they will greatly affect the operator's role in the system as well as the ways in which he interacts with it. At present, however, the only guidance available to the Nuclear Regulatory Commission (NRC) for the review of control room-operator interfaces is NUREG-0700. It is a document which was written prior to these technological changes and is, therefore, tailored to the technologies used in traditional control rooms. Thus, the present guidance needs to be updated since it is inadequate to serve as the basis for NRC staff review of such advanced or hybrid control room designs. The objective of the project reported in this paper is to develop an Advanced Control Room Design Review Guideline suitable for use in performing human factors reviews of advanced operator interfaces. This guideline will take the form of a portable, interactive, computer-based document that may be conveniently used by an inspector in the field, as well as a text-based document

  3. Design review plan for Multi-Function Waste Tank Facility (Project W-236A)

    International Nuclear Information System (INIS)

    Renfro, G.G.

    1994-01-01

    This plan describes how the Multi-Function Waste Tank Facility (MWTF) Project conducts reviews of design media; describes actions required by Project participants; and provides the methodology to ensure that the design is complete, meets the technical baseline of the Project, is operable and maintainable, and is constructable. Project W-236A is an integrated project wherein the relationship between the operating contractor and architect-engineer is somewhat different than that of a conventional project. Working together, Westinghouse Hanford Company (WHC) and ICF Kaiser Hanford (ICF KH) have developed a relationship whereby ICF KH performs extensive design reviews and design verification. WHC actively participates in over-the-shoulder reviews during design development, performs a final review of the completed design, and conducts a formal design review of the Safety Class I, ASME Boiler and Pressure Vessel Code items in accordance with WHC-CM-6-1, Standard Engineering Practices

  4. Case-only designs in pharmacoepidemiology: a systematic review.

    Directory of Open Access Journals (Sweden)

    Sandra Nordmann

    BACKGROUND: Case-only designs have been used since the late 1980s. In these, as opposed to case-control or cohort studies for instance, only cases are required and are self-controlled, eliminating selection biases and confounding related to control subjects, and time-invariant characteristics. The objectives of this systematic review were to analyze how the two main case-only designs - case-crossover (CC) and self-controlled case series (SCCS) - have been applied and reported in the pharmacoepidemiology literature, in terms of the applicability assumptions and specificities of these designs. METHODOLOGY/PRINCIPAL FINDINGS: We systematically selected all reports in this field involving case-only designs from MEDLINE and EMBASE up to September 15, 2010. Data were extracted using a standardized form. The analysis included 93 reports: 50 (54%) CC and 45 (48%) SCCS; 2 reports combined both designs. In 12 (24%) CC and 18 (40%) SCCS articles, respectively, all applicable validity assumptions of the designs were fulfilled. Fifty (54%) articles (15 CC (30%) and 35 SCCS (78%)) adequately addressed the specificities of the case-only analyses in the way they reported results. CONCLUSIONS/SIGNIFICANCE: Our systematic review underlines that implementation of CC and SCCS designs needs to be more rigorous with regard to validity assumptions, and that results reporting needs to be improved.

  5. Regularities of Multifractal Measures

    Indian Academy of Sciences (India)

    First, we prove the decomposition theorem for the regularities of multifractal Hausdorff measure and packing measure in R^d. This decomposition theorem enables us to split a set into regular and irregular parts, so that we can analyze each separately, and recombine them without affecting density properties. Next, we ...

  6. Review of design technology of control rod position indicators

    International Nuclear Information System (INIS)

    Yu, Je Yong; Huh, Hyung; Kim, Ji Ho; Kim, Jong In; Chang, Moon Hee

    1999-10-01

    An integral reactor, SMART, is under development at KAERI. The design characteristics of SMART are radically different from those employed in the loop-type water reactors currently operating in Korea. The objective of this report is to review the design technology of position indicators and to study the various sensors that can be used in rod position indicators. Design criteria that a rod position indicator should satisfy are also examined. The following position indicators are reviewed in this report: 1. Digital rod position indicator (DRPI), 2. Reed switch type position indicator (RSPT), 3. Choke sensor type position indicator, 4. Ultrasonic sensor type position indicator, 5. Comparison of each position indicator. (author)

  7. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...

  8. Review of Automated Design and Optimization of MEMS

    DEFF Research Database (Denmark)

    Achiche, Sofiane; Fan, Zhun; Bolognini, Francesca

    2007-01-01

    carried out. This paper presents a review of these techniques. The design task of MEMS is usually divided into four main stages: System Level, Device Level, Physical Level and the Process Level. The state of the art of automated MEMS design in each of these levels is investigated.

  9. Reconnection of SN-216 to U-D Valve Pit Design Review

    International Nuclear Information System (INIS)

    REED, R.W.

    1999-01-01

    The design for the reconnection of SN-216 to U-D valve pit was reviewed on May 24, 1999. All Review Comment Record comments were resolved and closed at this meeting. The review concluded that the reconnection of SN-216 to U-D valve pit was acceptable. The design was approved with the incorporated comments as recorded on the RCR's. No outstanding comments remain

  10. Condition Number Regularized Covariance Estimation.

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2013-06-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called "large p, small n" setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumptions on either the covariance matrix or its inverse are imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required.
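    A crude way to see the goal of such an estimator (this eigenvalue-clipping sketch is a simplification; the paper derives its truncation from a maximum likelihood criterion rather than fixing the bound by hand) is to cap the sample covariance's condition number directly:

```python
import numpy as np

def cap_condition_number(S, kappa_max=100.0):
    """Return a covariance with the same eigenvectors as S whose condition
    number does not exceed kappa_max: eigenvalues are clipped from below
    at lam_max / kappa_max."""
    vals, vecs = np.linalg.eigh(S)
    lo = vals.max() / kappa_max
    clipped = np.clip(vals, lo, None)
    return (vecs * clipped) @ vecs.T

# Hypothetical "large p, small n" toy: p = 50 features, n = 20 samples.
rng = np.random.default_rng(2)
X = rng.normal(size=(20, 50))
S = np.cov(X, rowvar=False)           # singular: unbounded condition number
S_hat = cap_condition_number(S, kappa_max=50.0)
print(np.linalg.cond(S_hat))          # bounded by roughly kappa_max
```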

  11. Meta-Review: Systematic Assessment of Program Review

    Science.gov (United States)

    Harlan, Brian

    2012-01-01

    Over 20 years ago, Robert J. Barak and Barbara E. Breier suggested incorporating a regular assessment of the entire program review system into the review schedule in order to ensure that the system itself is as efficient and effective as the programs under review. Barak and Breier's seminal book on the goals and processes of program review has…

  12. [Clustered regularly interspaced short palindromic repeats: structure, function and application--a review].

    Science.gov (United States)

    Cui, Yujun; Li, Yanjun; Yan, Yanfeng; Yang, Ruifu

    2008-11-01

    CRISPRs (Clustered Regularly Interspaced Short Palindromic Repeats), the basis of spoligotyping technology, can provide prokaryotes with heritable adaptive immunity against phage invasion. Studies on CRISPR loci and their associated elements, including various CAS (CRISPR-associated) proteins and leader sequences, are still in their infancy. We introduce the brief history, structure, function, bioinformatics research and applications of this amazing immune system in prokaryotic organisms, to inspire more scientists to take an interest in this developing topic.

  13. Near-field acoustic holography using sparse regularization and compressive sampling principles.

    Science.gov (United States)

    Chardon, Gilles; Daudet, Laurent; Peillot, Antoine; Ollivier, François; Bertin, Nancy; Gribonval, Rémi

    2012-09-01

    Regularization of the inverse problem is a complex issue when using near-field acoustic holography (NAH) techniques to identify vibrating sources. This paper shows that, for convex homogeneous plates with arbitrary boundary conditions, alternative regularization schemes can be developed based on the sparsity of the normal velocity of the plate in a well-designed basis, i.e., the possibility to approximate it as a weighted sum of a few elementary basis functions. In particular, these techniques can handle discontinuities of the velocity field at the boundaries, which can be problematic with standard techniques. This comes at the cost of a higher computational complexity to solve the associated optimization problem, though it remains easily tractable with out-of-the-box software. Furthermore, this sparsity framework allows us to take advantage of the concept of compressive sampling; under some conditions on the sampling process (here, the design of a random array, which can be numerically and experimentally validated), it is possible to reconstruct the sparse signals with significantly fewer measurements (i.e., microphones) than classically required. After introducing the different concepts, this paper presents numerical and experimental results of NAH with two plate geometries, and compares the advantages and limitations of these sparsity-based techniques over standard Tikhonov regularization.
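    To give a flavor of sparsity-based reconstruction in general (a generic compressed-sensing toy, not the acoustic transfer operator or plate basis used in the paper; the dictionary size, sparsity level and noise are hypothetical):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
n_basis, n_mics = 200, 40              # few "microphones", many basis functions

# Sparse coefficient vector: the field is a weighted sum of few basis functions.
x_true = np.zeros(n_basis)
x_true[rng.choice(n_basis, size=5, replace=False)] = rng.normal(size=5)

# A random measurement matrix stands in for a randomly placed sensor array.
A = rng.normal(size=(n_mics, n_basis)) / np.sqrt(n_mics)
y = A @ x_true + 1e-3 * rng.normal(size=n_mics)

# l1-regularized least squares recovers the sparse coefficients from
# far fewer measurements than unknowns.
x_hat = Lasso(alpha=1e-3, max_iter=50000).fit(A, y).coef_
print(np.flatnonzero(np.abs(x_hat) > 1e-2))
```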

  14. Regularization in Hilbert space under unbounded operators and general source conditions

    International Nuclear Information System (INIS)

    Hofmann, Bernd; Mathé, Peter; Von Weizsäcker, Heinrich

    2009-01-01

    The authors study ill-posed equations with unbounded operators in Hilbert space. This setup has important applications, but only a few theoretical studies are available. First, the question is addressed, and answered, of whether every element satisfies some general source condition with respect to a given self-adjoint unbounded operator. This generalizes a previous result from Mathé and Hofmann (2008 Inverse Problems 24 015009). The analysis then proceeds to error bounds for regularization, emphasizing some points specific to regularization under unbounded operators. The study finally reviews two examples in the light of the present analysis, namely fractional differentiation and some Cauchy problems for the Helmholtz equation, both studied previously and in more detail by U. Tautenhahn and co-authors.

  15. A review of design and modeling of magnetorheological valve

    Science.gov (United States)

    Abd Fatah, Abdul Yasser; Mazlan, Saiful Amri; Koga, Tsuyoshi; Zamzuri, Hairi; Zeinali, Mohammadjavad; Imaduddin, Fitrian

    2015-01-01

    Following the recent rapid development of research on utilizing magnetorheological (MR) fluid, a smart material whose apparent viscosity can be changed instantaneously under magnetic control, many applications have been established to exploit its benefits and advantages. One of the most important applications of MR fluid in devices is the MR valve, which uses the flow (valve) mode, the most popular of the available working modes for MR fluid. As such, MR valves are widely applied in hydraulic actuation and vibration-reduction devices, among them dampers, actuators and shock absorbers. This paper presents a review of the MR valve, discussing several design configurations and the mathematical modeling of the valve. The review classifies MR valves based on the coil configuration and geometrical arrangement of the valve, and focuses on four different mathematical models for the MR valve: the Bingham plastic, Herschel-Bulkley, bi-viscous and Herschel-Bulkley with pre-yield viscosity (HBPV) models for calculating yield stress and pressure drop in the MR valve. Design challenges and opportunities for the application of MR fluid and MR valves are also highlighted in this review. Hopefully, this review paper can provide basic knowledge on the design and modeling of MR valves, complementing other reviews on MR fluid, its applications and technologies.

  16. Final design review report for K basin dose reduction project

    International Nuclear Information System (INIS)

    Blackburn, L.D.

    1996-01-01

    The strategy for reducing radiation dose originating from radionuclides absorbed in the K East Basin concrete is to raise the pool water level to provide additional shielding. This report documents a final design review for cleaning/coating basin walls and modifying other basin components where appropriate. The conclusion of this review was that the documents developed constitute an acceptable design for the Dose Reduction Project

  17. Domain dependent associations between cognitive functioning and regular voluntary exercise behavior

    NARCIS (Netherlands)

    Swagerman, S.C.; de Geus, E.J.C.; Koenis, M.M.G.; Hulshoff Pol, H.E.; Boomsma, D.I.; Kan, K.J.

    2015-01-01

    Regular exercise has often been suggested to have beneficial effects on cognition, but empirical findings are mixed because of heterogeneity in sample composition (age and sex); the cognitive domain being investigated; the definition and reliability of exercise behavior measures; and study design

  18. Domain dependent associations between cognitive functioning and regular voluntary exercise behavior

    NARCIS (Netherlands)

    Swagerman, Suzanne C; de Geus, Eco J C; Koenis, Marinka M G; Hulshoff Pol, Hilleke E; Boomsma, Dorret I; Kan, Kees-Jan

    Regular exercise has often been suggested to have beneficial effects on cognition, but empirical findings are mixed because of heterogeneity in sample composition (age and sex); the cognitive domain being investigated; the definition and reliability of exercise behavior measures; and study design

  19. Condition Number Regularized Covariance Estimation*

    Science.gov (United States)

    Won, Joong-Ho; Lim, Johan; Kim, Seung-Jean; Rajaratnam, Bala

    2012-01-01

    Estimation of high-dimensional covariance matrices is known to be a difficult problem, has many applications, and is of current interest to the larger statistics community. In many applications, including the so-called "large p, small n" setting, the estimate of the covariance matrix is required to be not only invertible, but also well-conditioned. Although many regularization schemes attempt to do this, none of them address the ill-conditioning problem directly. In this paper, we propose a maximum likelihood approach, with the direct goal of obtaining a well-conditioned estimator. No sparsity assumptions on either the covariance matrix or its inverse are imposed, thus making our procedure more widely applicable. We demonstrate that the proposed regularization scheme is computationally efficient, yields a type of Steinian shrinkage estimator, and has a natural Bayesian interpretation. We investigate the theoretical properties of the regularized covariance estimator comprehensively, including its regularization path, and proceed to develop an approach that adaptively determines the level of regularization that is required. Finally, we demonstrate the performance of the regularized estimator in decision-theoretic comparisons and in the financial portfolio optimization setting. The proposed approach has desirable properties, and can serve as a competitive procedure, especially when the sample size is small and when a well-conditioned estimator is required. PMID:23730197

  20. Review of methods for the integration of reliability and design engineering

    International Nuclear Information System (INIS)

    Reilly, J.T.

    1978-03-01

    A review of methods for the integration of reliability and design engineering was carried out to establish a reliability program philosophy, an initial set of methods, and procedures to be used by both the designer and reliability analyst. The report outlines a set of procedures which implements a philosophy that requires increased involvement by the designer in reliability analysis. Discussions of each method reviewed include examples of its application

  1. Applicability of HRA to support advanced MMI design review

    International Nuclear Information System (INIS)

    Kim, Inn Seock

    2000-01-01

    More than half of all incidents in large, complex technological systems, particularly in the nuclear power and aviation industries, were attributable in some way to erroneous human actions. These incidents were largely due to human engineering deficiencies of the man-machine interface (MMI). In the nuclear industry, advanced computer-based MMI designs are emerging as part of new reactor designs. The impact of advanced MMI technology on operator performance, and as a result on plant safety, should be thoroughly evaluated before such technology is actually adopted in nuclear power plants. This paper discusses the applicability of human reliability analysis (HRA) to support the design review process. Both first-generation and second-generation HRA methods are considered, focusing on a couple of promising HRA methods, i.e., ATHEANA and CREAM, with the potential to assist the design review process. (author)

  2. The significance of the structural regularity for the seismic response of buildings

    International Nuclear Information System (INIS)

    Hampe, E.; Goldbach, R.; Schwarz, J.

    1991-01-01

    The paper gives a state-of-the-art report on international design practice and provides fundamentals for a systematic approach to the solution of that problem. Different criteria of regularity are presented and discussed with respect to EUROCODE Nr. 8. Remaining open questions and the main topics of future research activities are also outlined. Frame structures with or without additional stiffening wall elements are investigated to illustrate the qualitative differences in the vibrational properties and the earthquake response of regular and irregular systems. (orig./HP) [de

  3. Do the majority of South Africans regularly consult traditional healers?

    Directory of Open Access Journals (Sweden)

    Gabriel Louw

    2016-12-01

    Background The statutory recognition of traditional healers as healthcare practitioners in South Africa in terms of the Traditional Health Practitioners Act 22 of 2007 is based on various assumptions, opinions and generalizations. One of the prominent views is that the majority of South Africans regularly consult traditional healers. It has even been alleged that this number can be as high as 80 per cent of the South African population. For medical doctors and other health practitioners registered with the Health Professions Council of South Africa (HPCSA), this new statutory status of traditional health practitioners means the required presence of not only a healthcare competitor that can overstock the healthcare market with service lending, medical claims and healthcare costs, but also a competitor prone to malpractice. Aims The study aimed to determine whether the majority of South Africans regularly consult traditional healers. Methods This is an exploratory and descriptive study following the modern historical approach of investigation and literature review. The emphasis is on using current documentation, such as articles, books and newspapers, as primary sources to determine whether the majority of South Africans regularly consult traditional healers. The findings are offered in narrative form. Results It is clear that there are no trustworthy statistics on the percentage of South Africans using traditional healers. A scientific survey is needed to determine the extent to which traditional healers are consulted. This will only be possible after the Traditional Health Practitioners Act No 22 has been fully enacted and traditional health practitioners have become fully active in the healthcare sector. Conclusion In poorer, rural areas no more than 11.2 per cent of the South African population regularly consult traditional healers, while the figure for the total population seems to be no more than 1.4 per cent. The argument that the majority of South

  4. Regular-, irregular-, and pseudo-character processing in Chinese: The regularity effect in normal adult readers

    Directory of Open Access Journals (Sweden)

    Dustin Kai Yan Lau

    2014-03-01

    Background Unlike alphabetic languages, Chinese uses a logographic script. However, for many characters the phonetic radical has the same pronunciation as the character as a whole. These are considered regular characters and can be read through a lexical non-semantic route (Weekes & Chen, 1999). Pseudocharacters are another way to study this non-semantic route. A pseudocharacter is the combination of existing semantic and phonetic radicals in their legal positions, resulting in a non-existing character (Ho, Chan, Chung, Lee, & Tsang, 2007). Pseudocharacters can be pronounced by direct derivation from the sound of the phonetic radical. Conversely, if the pronunciation of a character does not follow that of the phonetic radical, it is considered irregular and can only be correctly read through the lexical-semantic route. The aim of the current investigation was to examine reading aloud in normal adults. We hypothesized that the regularity effect, previously described for alphabetical scripts and acquired dyslexic patients of Chinese (Weekes & Chen, 1999; Wu, Liu, Sun, Chromik, & Zhang, 2014), would also be present in normal adult Chinese readers. Method Participants. Thirty (50% female) native Hong Kong Cantonese speakers with a mean age of 19.6 years and a mean education of 12.9 years. Stimuli. Sixty regular, 60 irregular, and 60 pseudo-characters (with at least 75% name agreement) in Chinese were matched by initial phoneme, number of strokes and family size. Additionally, regular and irregular characters were matched by frequency (low) and consistency. Procedure. Each participant was asked to read aloud the stimuli presented on a laptop using the DMDX software. The order of stimulus presentation was randomized. Data analysis. ANOVAs were carried out by participants and items with RTs and errors as dependent variables and type of stimuli (regular, irregular and pseudo-character) as repeated measures (F1 or between subject

  5. Parameter identification for continuous point emission source based on Tikhonov regularization method coupled with particle swarm optimization algorithm.

    Science.gov (United States)

    Ma, Denglong; Tan, Wei; Zhang, Zaoxiao; Hu, Jun

    2017-03-05

    In order to identify the parameters of a hazardous gas emission source in the atmosphere with less prior information and reliable probability estimation, a hybrid algorithm coupling Tikhonov regularization with particle swarm optimization (PSO) was proposed. When the source location is known, the source strength can be estimated successfully by the common Tikhonov regularization method, but this is invalid when information about both source strength and location is absent. Therefore, a hybrid method combining linear Tikhonov regularization and the PSO algorithm was designed. With this method, the nonlinear inverse dispersion model was transformed to a linear form under some assumptions, and the source parameters, including source strength and location, were identified simultaneously by the linear Tikhonov-PSO regularization method. The regularization parameters were selected by the L-curve method. The estimation results with different regularization matrices showed that the confidence interval with a high-order regularization matrix is narrower than that with a zero-order regularization matrix, but the estimates of the different source parameters are close to each other with different regularization matrices. A nonlinear Tikhonov-PSO hybrid regularization was also designed with the primary nonlinear dispersion model to estimate the source parameters. The comparison of simulation and experiment cases showed that the linear Tikhonov-PSO method with the transformed linear inverse model has higher computational efficiency than the nonlinear Tikhonov-PSO method. The confidence intervals from linear Tikhonov-PSO are more reasonable than those from the nonlinear method. The estimation results from the linear Tikhonov-PSO method are similar to those from the single PSO algorithm, and a reasonable confidence interval with some probability levels can additionally be given by the Tikhonov-PSO method. Therefore, the presented linear Tikhonov-PSO regularization method is a good potential method for hazardous emission
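    A heavily simplified sketch of this kind of coupling is shown below: the Gaussian-style dispersion kernel, sensor layout, bounds and PSO settings are all hypothetical, and the only point illustrated is that the location is searched by PSO while the (linear) source strength is obtained at each candidate location by a closed-form Tikhonov solve.

```python
import numpy as np

rng = np.random.default_rng(4)
sensors = rng.uniform(0, 100, size=(15, 2))      # hypothetical sensor grid (m)

def footprint(loc):
    """Hypothetical unit-strength dispersion kernel evaluated at each sensor."""
    d2 = np.sum((sensors - loc) ** 2, axis=1)
    return np.exp(-d2 / 500.0)

true_loc, true_q = np.array([60.0, 35.0]), 5.0
y = true_q * footprint(true_loc) + 0.01 * rng.normal(size=len(sensors))

def fitness(loc, lam=1e-3):
    """Tikhonov step: strength enters linearly, so it has a closed form."""
    a = footprint(loc)
    q = a @ y / (a @ a + lam)
    return np.sum((y - q * a) ** 2), q

# Minimal particle swarm over the source location.
n_part = 30
pos = rng.uniform(0, 100, size=(n_part, 2))
vel = np.zeros_like(pos)
pbest, pbest_f = pos.copy(), np.array([fitness(p)[0] for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()
for _ in range(100):
    r1, r2 = rng.random((n_part, 1)), rng.random((n_part, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 100)
    f = np.array([fitness(p)[0] for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()
print(gbest, fitness(gbest)[1])                  # estimated location, strength
```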

  6. FINAL DESIGN REVIEW REPORT Subcritical Experiments Gen 2, 3-ft Confinement Vessel Weldment

    Energy Technology Data Exchange (ETDEWEB)

    Romero, Christopher [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-28

    A Final Design Review (FDR) of the Subcritical Experiments (SCE) Gen 2, 3-ft. Confinement Vessel Weldment was held at Los Alamos National Laboratory (LANL) on September 14, 2017. The review was a focused review on changes only to the confinement vessel weldment (versus a system design review). The changes resulted from lessons-learned in fabricating and inspecting the current set of confinement vessels used for the SCE Program. The baseline 3-ft. confinement vessel weldment design has successfully been used (to date) for three (3) high explosive (HE) over-tests, two (2) fragment tests, and five (5) integral HE experiments. The design team applied lessons learned from fabrication and inspection of these vessel weldments to enhance fit-up, weldability, inspection, and fitness for service evaluations. The review team consisted of five (5) independent subject matter experts with engineering design, analysis, testing, fabrication, and inspection experience. The

  7. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    International Nuclear Information System (INIS)

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design

  8. Efficacy of a Respiratory Training System on the Regularity of Breathing

    International Nuclear Information System (INIS)

    Shin, Eun Hyuk; Park, Hee Chul; Han, Young Yih; Ju, Sang Gyu; Shin, Jung Suk; Ahn, Yong Chan

    2008-01-01

    In order to enhance the efficiency of respiratory-gated 4-dimensional radiation therapy by achieving a more regular and stable respiratory period and amplitude, a respiration training system was designed and its efficacy was evaluated. Materials and Methods: The experiment was designed to measure the difference in respiration regularity following the use of the training system. A total of 11 subjects (9 volunteers and 2 patients) were included in the experiments. Three different breathing signals, including free breathing (free-breathing), guided breathing that followed training software (guided-breathing), and free breathing after the guided breathing (post guided-breathing), were consecutively recorded for each subject. The peak-to-peak (PTP) period of the breathing signal and its standard deviation (SD), the peak amplitude and its SD, and the area of one cycle of the breathing waveform and its root mean square (RMS) were measured and computed. Results: The temporal regularity was significantly improved in guided-breathing, since the SD of the breathing period was reduced (free-breathing 0.568 vs. guided-breathing 0.344, p=0.0013). The SD of the breathing period for post guided-breathing was also reduced, but the difference was not statistically significant (free-breathing 0.568 vs. post guided-breathing 0.512, p=ns). The SD of the measured amplitude was also reduced in guided-breathing (free-breathing 1.317 vs. guided-breathing 1.068, p=0.187), although not significantly. This indicated that the tidal volume for each breath was kept more even in guided-breathing compared to free-breathing. There was no change in breathing pattern between free-breathing and guided-breathing. The average area of the breathing waveform and its RMS in post guided-breathing, however, were reduced by 7% and 5.9%, respectively. Conclusion: The guided-breathing was more stable and regular than the other forms of breathing. Therefore, the developed respiratory training system was effective in improving the temporal

  9. Prevalence and Correlates of Having a Regular Physician among Women Presenting for Induced Abortion.

    Science.gov (United States)

    Chor, Julie; Hebert, Luciana E; Hasselbacher, Lee A; Whitaker, Amy K

    2016-01-01

    To determine the prevalence and correlates of having a regular physician among women presenting for induced abortion. We conducted a retrospective review of women presenting to an urban, university-based family planning clinic for abortion between January 2008 and September 2011. We conducted bivariate analyses, comparing women with and without a regular physician, and multivariable regression modeling, to identify factors associated with not having a regular physician. Of 834 women, 521 (62.5%) had a regular physician and 313 (37.5%) did not. Women with a prior pregnancy, live birth, or spontaneous abortion were more likely than women without these experiences to have a regular physician. Women with a prior induced abortion were not more likely than women who had never had a prior induced abortion to have a regular physician. Compared with women younger than 18 years, women aged 18 to 26 years were less likely to have a physician (adjusted odds ratio [aOR], 0.25; 95% confidence interval [CI], 0.10-0.62). Women with a prior live birth had increased odds of having a regular physician compared with women without a prior pregnancy (aOR, 1.89; 95% CI, 1.13-3.16). Women without medical/fetal indications and who had not been victims of sexual assault (self-indicated) were less likely to report having a regular physician compared with women with medical/fetal indications (aOR, 0.55; 95% CI, 0.17-0.82). The abortion visit is a point of contact with a large number of women without a regular physician and therefore provides an opportunity to integrate women into health care. Copyright © 2016 Jacobs Institute of Women's Health. Published by Elsevier Inc. All rights reserved.

  10. Human reliability analysis in the man-machine interface design review

    International Nuclear Information System (INIS)

    Kim, I.S.

    2001-01-01

    Advanced, computer-based man-machine interface (MMI) is emerging as part of the new design of nuclear power plants. The impact of advanced MMI on the operator performance, and as a result, on plant safety should be thoroughly evaluated before such technology is actually adopted in the plants. This paper discusses the applicability of human reliability analysis (HRA) to support the design review process. Both the first-generation and the second-generation HRA methods are considered focusing on a couple of promising HRA methods, i.e. ATHEANA and CREAM, with the potential to assist the design review process

  11. J-regular rings with injectivities

    OpenAIRE

    Shen, Liang

    2010-01-01

    A ring $R$ is called a J-regular ring if R/J(R) is von Neumann regular, where J(R) is the Jacobson radical of R. It is proved that if R is J-regular, then (i) R is right n-injective if and only if every homomorphism from an $n$-generated small right ideal of $R$ to $R_{R}$ can be extended to one from $R_{R}$ to $R_{R}$; (ii) R is right FP-injective if and only if R is right (J, R)-FP-injective. Some known results are improved.

  12. The three-point function in split dimensional regularization in the Coulomb gauge

    International Nuclear Information System (INIS)

    Leibbrandt, G.

    1998-01-01

    We use a gauge-invariant regularization procedure, called split dimensional regularization, to evaluate the quark self-energy Σ(p) and quark-quark-gluon vertex function Λ_μ(p',p) in the Coulomb gauge, ∇·A^a = 0. The technique of split dimensional regularization was designed to regulate Coulomb-gauge Feynman integrals in non-Abelian theories. The technique, which is based on two complex regulating parameters, ω and σ, is shown to generate a well-defined set of Coulomb-gauge integrals. A major component of this project deals with the evaluation of four-propagator and five-propagator Coulomb integrals, some of which are non-local. It is further argued that the standard one-loop BRST identity relating Σ and Λ_μ should by rights be replaced by a more general BRST identity which contains two additional contributions from ghost vertex diagrams. Despite the appearance of non-local Coulomb integrals, both Σ and Λ_μ are local functions which satisfy the appropriate BRST identity. Application of split dimensional regularization to two-loop energy integrals is briefly discussed. (orig.)

  13. Regular black holes from semi-classical down to Planckian size

    Science.gov (United States)

    Spallucci, Euro; Smailagic, Anais

    In this paper, we review various models of curvature-singularity-free black holes (BHs). In the first part of the review, we describe semi-classical solutions of the Einstein equations which, however, contain a "quantum" input through the matter source. We start by reviewing the early model by Bardeen, where the metric is regularized by hand through a short-distance cutoff, which is justified in terms of nonlinear electrodynamical effects. This toy model is useful to point out the common features shared by all regular semi-classical black holes. Then, we solve the Einstein equations with a Gaussian source encoding the quantum spread of an elementary particle. We identify the a priori arbitrary Gaussian width with the Compton wavelength of the quantum particle. This Compton-Gauss model leads to an estimate of the terminal density that a gravitationally collapsed object can achieve. We identify this density with the Planck density, and reformulate the Gaussian model assuming this as its peak density. All these models are physically reliable as long as the BH mass is big enough with respect to the Planck mass. In the truly Planckian regime, the semi-classical approximation breaks down. In this case, a fully quantum BH description is needed. In the last part of this paper, we propose a nongeometrical quantum model of Planckian BHs implementing the Holographic Principle and realizing the "classicalization" scenario recently introduced by Dvali and collaborators. The classical relation between the mass and radius of the BH emerges only in the classical limit, far away from the Planck scale.
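    For concreteness, the Bardeen-type regular metric referred to above replaces the Schwarzschild lapse function with one containing a short-distance regulator g (in units G = c = 1; this is the standard textbook form of the Bardeen solution, not a formula quoted from this particular paper):

```latex
ds^2 = -f(r)\,dt^2 + \frac{dr^2}{f(r)} + r^2 d\Omega^2 ,
\qquad
f(r) = 1 - \frac{2 M r^2}{\left(r^2 + g^2\right)^{3/2}} .
```

    For r much larger than g this reduces to the Schwarzschild form 1 - 2M/r, while as r tends to 0 it behaves like a de Sitter core, so no curvature singularity forms.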

  14. Iterative Regularization with Minimum-Residual Methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2007-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES: their success as regularization methods is highly problem dependent.
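    As a small illustration of iterative regularization by early stopping (a generic symmetric ill-posed toy problem, not the test problems of the paper; the matrix construction and noise level are hypothetical), the iteration cap itself plays the role of the regularization parameter:

```python
import numpy as np
from scipy.sparse.linalg import minres

rng = np.random.default_rng(5)
n = 200
# Symmetric, severely ill-conditioned matrix (rapidly decaying spectrum).
Q, _ = np.linalg.qr(rng.normal(size=(n, n)))
A = Q @ np.diag(np.logspace(0, -10, n)) @ Q.T
x_true = Q[:, :5] @ rng.normal(size=5)        # exact solution in dominant modes
b = A @ x_true + 1e-6 * rng.normal(size=n)    # noisy right-hand side

# Capping the number of MINRES iterations acts as the regularizer: too few
# iterations under-fit, while too many start to amplify the data noise.
for k in (5, 20, 200):
    x_k, _ = minres(A, b, maxiter=k)
    print(k, np.linalg.norm(x_k - x_true))
```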

  15. Iterative regularization with minimum-residual methods

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Hansen, Per Christian

    2006-01-01

    We study the regularization properties of iterative minimum-residual methods applied to discrete ill-posed problems. In these methods, the projection onto the underlying Krylov subspace acts as a regularizer, and the emphasis of this work is on the role played by the basis vectors of these Krylov subspaces. We provide a combination of theory and numerical examples, and our analysis confirms the experience that MINRES and MR-II can work as general regularization methods. We also demonstrate theoretically and experimentally that the same is not true, in general, for GMRES and RRGMRES - their success as regularization methods is highly problem dependent.

  16. Multiple graph regularized protein domain ranking.

    Science.gov (United States)

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-11-19

    Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.
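    A bare-bones single-graph version of graph-regularized ranking (manifold ranking on a symmetrically normalized similarity graph) is sketched below; the feature matrix, kernel width and trade-off parameter are hypothetical, and MultiG-Rank additionally learns weights over several such graphs rather than using a single fixed one.

```python
import numpy as np

def manifold_rank(X, query_idx, sigma=1.0, alpha=0.9):
    """Rank items by f = (I - alpha * S)^(-1) y, where S is the symmetrically
    normalized similarity graph and y marks the query item(s)."""
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma**2))
    np.fill_diagonal(W, 0.0)                     # no self-similarity
    Dinv_sqrt = np.diag(1.0 / np.sqrt(W.sum(axis=1)))
    S = Dinv_sqrt @ W @ Dinv_sqrt
    y = np.zeros(len(X))
    y[query_idx] = 1.0
    f = np.linalg.solve(np.eye(len(X)) - alpha * S, y)
    return np.argsort(-f)                        # most relevant items first

# Hypothetical usage with random feature vectors standing in for domains.
rng = np.random.default_rng(6)
X = rng.normal(size=(50, 16))
print(manifold_rank(X, query_idx=0)[:5])
```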

  17. Teachers' Views about the Education of Gifted Students in Regular Classrooms

    Directory of Open Access Journals (Sweden)

    Neşe Kutlu Abu

    2017-12-01

    The purpose of this study was to investigate classroom teachers' views about the education of gifted students in regular classrooms. The sample of the study is composed of ten primary school teachers working in the city of Amasya who had gifted students in their classes. In the present study, a phenomenological research design was used. Data were collected through semi-structured interviews and analyzed descriptively in the QSR NVivo package program. The findings showed that teachers did not see a need for differentiating the curriculum for gifted students; rather, they expressed that the regular curriculum was enough for gifted students. Based on the findings, it is clear that teachers need training both on the need for differentiated education for gifted students and on strategies and approaches for how to educate gifted students. Teachers' attitudes towards gifted students in regular classrooms should also be investigated, since teachers' unsupportive beliefs about differentiation for gifted students influence their attitudes towards gifted students.

  18. Human-system interface design review guideline -- Process and guidelines: Final report. Revision 1, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    None

    1996-06-01

    NUREG-0700, Revision 1, provides human factors engineering (HFE) guidance to the US Nuclear Regulatory Commission staff for its: (1) review of the human system interface (HSI) design submittals prepared by licensees or applicants for a license or design certification of commercial nuclear power plants, and (2) performance of HSI reviews that could be undertaken as part of an inspection or other type of regulatory review involving HSI design or incidents involving human performance. The guidance consists of a review process and HFE guidelines. The document describes those aspects of the HSI design review process that are important to the identification and resolution of human engineering discrepancies that could adversely affect plant safety. Guidance is provided that could be used by the staff to review an applicant's HSI design review process or to guide the development of an HSI design review plan, e.g., as part of an inspection activity. The document also provides detailed HFE guidelines for the assessment of HSI design implementations. NUREG-0700, Revision 1, consists of three stand-alone volumes. Volume 1 consists of two major parts. Part 1 describes those aspects of the review process of the HSI design that are important to identifying and resolving human engineering discrepancies. Part 2 contains detailed guidelines for a human factors engineering review which identify criteria for assessing the implementation of an applicant's or licensee's HSI design.

  19. Human-centred design in global health: A scoping review of applications and contexts.

    Directory of Open Access Journals (Sweden)

    Alessandra N Bazzano

    Full Text Available Health and wellbeing are determined by a number of complex, interrelated factors. The application of design thinking to questions around health may prove valuable and complement existing approaches. A number of public health projects utilizing human centered design (HCD, or design thinking) have recently emerged, but no synthesis of the literature around these exists. The results of a scoping review of current research on human centered design for health outcomes are presented. The review aimed to understand why and how HCD can be valuable in the contexts of health-related research. Results identified pertinent literature as well as gaps in information on the use of HCD for public health research, design, implementation and evaluation. A variety of contexts were identified in which design has been used for health. Global health and design thinking have different underlying conceptual models and terminology, creating some inherent tensions, which could be overcome through clear communication and documentation in collaborative projects. The review concludes with lessons learned on how future projects can better integrate design thinking with global health research.

  20. Isotope Production Facility Conceptual Thermal-Hydraulic Design Review and Scoping Calculations

    International Nuclear Information System (INIS)

    Pasamehmetoglu, K.O.; Shelton, J.D.

    1998-01-01

    The thermal-hydraulic design of the target for the Isotope Production Facility (IPF) is reviewed. In support of the technical review, scoping calculations are performed. The results of the review and scoping calculations are presented in this report.

  1. Applying risk insights in US NRC reviews of integral pressurized water reactor designs

    International Nuclear Information System (INIS)

    Caruso, M.A.; Hilsmeier, T.; Kevern, T.A.

    2012-01-01

    In its Staff Requirements Memorandum (SRM) on COMGBJ-10-0004/COMGEA-10-0001, 'Use of Risk Insights to Enhance Safety Focus of Small Modular Reactor Reviews,' dated August 31, 2010 (ML102510405), the U.S. Nuclear Regulatory Commission (NRC) directed the NRC staff to more fully integrate the use of risk insights into pre-application activities and the review of small modular reactor (SMR) applications with near-term focus on integral pressurized water reactor (iPWR) designs. The Commission's objective is to align the review focus and resources with the risk-significant systems, structures, and components (SSCs) and other aspects of the design that contribute most to safety, in order to enhance the efficiency of the review process while still enabling a decision of reasonable assurance of the design's safety. The staff was directed to develop a design-specific, risk-informed review plan for each SMR to address pre-application and application review activities. The NRC staff submitted a response to the Commission which describes its approach for (1) using risk insights, consistent with current regulatory requirements, to assign SSCs to one of a limited set of graded categories, and (2) adjusting the scope and depth of current review plans--where possible--consistent with regulatory requirements and consistent with the applicable graded category. Because the staff's review constitutes an independent audit of the application, the staff may emphasize or de-emphasize particular aspects of its review guidance (i.e., Standard Review Plan), as appropriate and consistent with regulatory requirements, for the application being reviewed. The staff may propose justifications for not performing certain sections of the reviews called for by the applicable review plan. Examples of acceptable variations in the scope of a review can include reduced emphasis on SSC attributes such as reliability, availability, or functional performance when the SSC will be in the scope of a program

  2. Human factors design review guidelines for advanced nuclear control room technologies

    International Nuclear Information System (INIS)

    O'Hara, J.; Brown, W.; Granda, T.; Baker, C.

    1991-01-01

    Advanced control rooms (ACRs) for future nuclear power plants are being designed utilizing computer-based technologies. The US Nuclear Regulatory Commission reviews the human engineering aspects of such control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported in order to protect public health and safety. This paper describes the rationale, general approach, and initial development of an NRC Advanced Control Room Design Review Guideline. 20 refs., 1 fig

  3. Multiple Learning Strategies Project. Building Maintenance & Engineering. Regular Vocational. [Vol. 2.

    Science.gov (United States)

    Smith, Dwight; And Others

    This instructional package is one of two designed for regular vocational students in the vocational area of building maintenance and engineering. The fifty-three learning modules are organized into ten units: office cleaning; grounds; sanitation; boiler maintenance and operation; power and hand tools; cabinet construction; repair of damaged…

  4. Ensemble manifold regularization.

    Science.gov (United States)

    Geng, Bo; Tao, Dacheng; Xu, Chao; Yang, Linjun; Hua, Xian-Sheng

    2012-06-01

    We propose an automatic approximation of the intrinsic manifold for general semi-supervised learning (SSL) problems. Unfortunately, it is not trivial to define an optimization function to obtain optimal hyperparameters. Usually, cross validation is applied, but it does not necessarily scale up. Other problems derive from the suboptimality incurred by discrete grid search and the overfitting. Therefore, we develop an ensemble manifold regularization (EMR) framework to approximate the intrinsic manifold by combining several initial guesses. Algorithmically, we designed EMR carefully so it 1) learns both the composite manifold and the semi-supervised learner jointly, 2) is fully automatic for learning the intrinsic manifold hyperparameters implicitly, 3) is conditionally optimal for intrinsic manifold approximation under a mild and reasonable assumption, and 4) is scalable for a large number of candidate manifold hyperparameters, from both time and space perspectives. Furthermore, we prove the convergence property of EMR to the deterministic matrix at rate root-n. Extensive experiments over both synthetic and real data sets demonstrate the effectiveness of the proposed framework.
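
    A rough sketch of the ensemble idea, assuming scikit-learn is available: several candidate graph Laplacians (here, different neighborhood sizes) are combined with learned convex weights inside a manifold-regularized least-squares fit. The weight update and all hyperparameters are simplifications for illustration, not the paper's exact EMR formulation.

        # Ensemble of candidate graph Laplacians for semi-supervised regression.
        import numpy as np
        from sklearn.neighbors import kneighbors_graph

        def candidate_laplacians(X, ks=(5, 10, 20)):
            Ls = []
            for k in ks:
                W = kneighbors_graph(X, k, mode="connectivity").toarray()
                W = np.maximum(W, W.T)                    # symmetrize
                Ls.append(np.diag(W.sum(axis=1)) - W)
            return Ls

        def emr_fit(X, y, labeled, Ls, lam=1e-2, gamma=1e-2, n_iter=10):
            n = len(X)
            J = np.diag(labeled.astype(float))            # selects labeled points
            mu = np.full(len(Ls), 1.0 / len(Ls))
            for _ in range(n_iter):
                L_mix = sum(m * L for m, L in zip(mu, Ls))
                # f minimizes ||J(f - y)||^2 + lam * f'L_mix f + gamma * ||f||^2
                f = np.linalg.solve(J + lam * L_mix + gamma * np.eye(n), J @ y)
                s = np.array([f @ L @ f for L in Ls])     # smoothness per graph
                mu = np.exp(-s / s.mean()); mu /= mu.sum()  # smoother graphs weigh more
            return f, mu

        # usage sketch: 100 points, only the first 10 labels observed
        X = np.random.default_rng(0).standard_normal((100, 2))
        y = np.sign(X[:, 0])                              # hidden ground truth in {-1, +1}
        labeled = np.zeros(100, dtype=bool); labeled[:10] = True
        f, mu = emr_fit(X, np.where(labeled, y, 0.0), labeled, candidate_laplacians(X))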

  5. Advanced human-system interface design review guideline. Evaluation procedures and guidelines for human factors engineering reviews

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Baker, C.C.; Welch, D.L.; Granda, T.M.; Vingelis, P.J. [Carlow International Inc., Falls Church, VA (United States)

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use.

  6. Advanced human-system interface design review guideline. Evaluation procedures and guidelines for human factors engineering reviews

    International Nuclear Information System (INIS)

    O'Hara, J.M.; Brown, W.S.; Baker, C.C.; Welch, D.L.; Granda, T.M.; Vingelis, P.J.

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use

  7. Review on JMTR safety design for LEU core conversion

    International Nuclear Information System (INIS)

    Komori, Yoshihiro; Yokokawa, Makoto; Saruta, Toru; Inada, Seiji; Sakurai, Fumio; Yamamoto, Katsumune; Oyamada, Rokuro; Saito, Minoru

    1993-12-01

    Safety of the JMTR was fully reviewed for the core conversion to low enriched uranium fuel. Fundamental policies for the JMTR safety design were reconsidered based on the examination guide for safety design of test and research reactors, and safety of the JMTR was confirmed. This report describes the safety design of the JMTR from the viewpoint of major functions for reactor safety. (author)

  8. Higher derivative regularization and chiral anomaly

    International Nuclear Information System (INIS)

    Nagahama, Yoshinori.

    1985-02-01

    A higher derivative regularization which automatically leads to the consistent chiral anomaly is analyzed in detail. It explicitly breaks all the local gauge symmetry but preserves global chiral symmetry and leads to the chirally symmetric consistent anomaly. This regularization thus clarifies the physics content contained in the consistent anomaly. We also briefly comment on the application of this higher derivative regularization to massless QED. (author)

  9. Development of human factors design review guidelines(II)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Taejon (Korea)

    1998-06-01

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: 25. Human Factors Engineering Program Review Model and 26. Review Criteria for Human Factors Aspects of Advanced Controls and Instrumentation, which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents for the developers of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994. (author). 11 refs., 2 figs., 2 tabs.

  10. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-11-19

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.

  11. Multiple graph regularized protein domain ranking

    KAUST Repository

    Wang, Jim Jing-Yan; Bensmail, Halima; Gao, Xin

    2012-01-01

    Background: Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods.Results: To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods.Conclusion: The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications. 2012 Wang et al; licensee BioMed Central Ltd.

  12. Multiple graph regularized protein domain ranking

    Directory of Open Access Journals (Sweden)

    Wang Jim

    2012-11-01

    Full Text Available Abstract Background Protein domain ranking is a fundamental task in structural biology. Most protein domain ranking methods rely on the pairwise comparison of protein domains while neglecting the global manifold structure of the protein domain database. Recently, graph regularized ranking that exploits the global structure of the graph defined by the pairwise similarities has been proposed. However, the existing graph regularized ranking methods are very sensitive to the choice of the graph model and parameters, and this remains a difficult problem for most of the protein domain ranking methods. Results To tackle this problem, we have developed the Multiple Graph regularized Ranking algorithm, MultiG-Rank. Instead of using a single graph to regularize the ranking scores, MultiG-Rank approximates the intrinsic manifold of protein domain distribution by combining multiple initial graphs for the regularization. Graph weights are learned with ranking scores jointly and automatically, by alternately minimizing an objective function in an iterative algorithm. Experimental results on a subset of the ASTRAL SCOP protein domain database demonstrate that MultiG-Rank achieves a better ranking performance than single graph regularized ranking methods and pairwise similarity based ranking methods. Conclusion The problem of graph model and parameter selection in graph regularized protein domain ranking can be solved effectively by combining multiple graphs. This aspect of generalization introduces a new frontier in applying multiple graphs to solving protein domain ranking applications.

  13. 75 FR 53966 - Regular Meeting

    Science.gov (United States)

    2010-09-02

    ... FARM CREDIT SYSTEM INSURANCE CORPORATION Regular Meeting AGENCY: Farm Credit System Insurance Corporation Board. SUMMARY: Notice is hereby given of the regular meeting of the Farm Credit System Insurance Corporation Board (Board). DATE AND TIME: The meeting of the Board will be held at the offices of the Farm...

  14. Work and family life of childrearing women workers in Japan: comparison of non-regular employees with short working hours, non-regular employees with long working hours, and regular employees.

    Science.gov (United States)

    Seto, Masako; Morimoto, Kanehisa; Maruyama, Soichiro

    2006-05-01

    This study assessed the working and family life characteristics, and the degree of domestic and work strain of female workers with different employment statuses and weekly working hours who are rearing children. Participants were the mothers of preschoolers in a large Japanese city. We classified the women into three groups according to the hours they worked and their employment conditions. The three groups were: non-regular employees working less than 30 h a week (n=136); non-regular employees working 30 h or more per week (n=141); and regular employees working 30 h or more a week (n=184). We compared among the groups the subjective values of work, financial difficulties, childcare and housework burdens, psychological effects, and strains such as work and family strain, work-family conflict, and work dissatisfaction. Regular employees were more likely to report job pressures and inflexible work schedules and to experience more strain related to work and family than non-regular employees. Non-regular employees were more likely to be facing financial difficulties. In particular, non-regular employees working longer hours tended to encounter socioeconomic difficulties and often lacked support from family and friends. Female workers with children may have different social backgrounds and different stressors according to their working hours and work status.

  15. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    DOE Order 5637.1, "Classified Computer Security," requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system. 1 tab

  16. A methodology for performing computer security reviews

    International Nuclear Information System (INIS)

    Hunteman, W.J.

    1991-01-01

    This paper reports on DOE Order 5637.1, Classified Computer Security, which requires regular reviews of the computer security activities for an ADP system and for a site. Based on experiences gained in the Los Alamos computer security program through interactions with DOE facilities, the authors have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats/vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology will result in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or specific ADP system

  17. Human-factors engineering-control-room design review: Shoreham Nuclear Power Station. Draft audit report

    International Nuclear Information System (INIS)

    Peterson, L.R.; Preston-Smith, J.; Savage, J.W.; Rousseau, W.F.

    1981-01-01

    A human factors engineering preliminary design review of the Shoreham control room was performed at the site on March 30 through April 3, 1981. This design review was carried out by a team from the Human Factors Engineering Branch, Division of Human Factors Safety. This report was prepared on the basis of the HFEB's review of the applicant's Preliminary Design Assessment and the human factors engineering design review/audit performed at the site. The presented sections are numbered to conform to the guidelines of the draft version of NUREG-0700. They summarize the team's observations of the control room design and layout, and of the control room operators' interface with the control room environment

  18. SparseBeads data: benchmarking sparsity-regularized computed tomography

    Science.gov (United States)

    Jørgensen, Jakob S.; Coban, Sophia B.; Lionheart, William R. B.; McDonald, Samuel A.; Withers, Philip J.

    2017-12-01

    Sparsity regularization (SR) such as total variation (TV) minimization allows accurate image reconstruction in x-ray computed tomography (CT) from fewer projections than analytical methods. Exactly how few projections suffice and how this number may depend on the image remain poorly understood. Compressive sensing connects the critical number of projections to the image sparsity, but does not cover CT, however empirical results suggest a similar connection. The present work establishes for real CT data a connection between gradient sparsity and the sufficient number of projections for accurate TV-regularized reconstruction. A collection of 48 x-ray CT datasets called SparseBeads was designed for benchmarking SR reconstruction algorithms. Beadpacks comprising glass beads of five different sizes as well as mixtures were scanned in a micro-CT scanner to provide structured datasets with variable image sparsity levels, number of projections and noise levels to allow the systematic assessment of parameters affecting performance of SR reconstruction algorithms. Using the SparseBeads data, TV-regularized reconstruction quality was assessed as a function of numbers of projections and gradient sparsity. The critical number of projections for satisfactory TV-regularized reconstruction increased almost linearly with the gradient sparsity. This establishes a quantitative guideline from which one may predict how few projections to acquire based on expected sample sparsity level as an aid in planning of dose- or time-critical experiments. The results are expected to hold for samples of similar characteristics, i.e. consisting of few, distinct phases with relatively simple structure. Such cases are plentiful in porous media, composite materials, foams, as well as non-destructive testing and metrology. For samples of other characteristics the proposed methodology may be used to investigate similar relations.
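
    As a generic, hedged sketch of the reconstruction model being benchmarked, the snippet below minimizes a least-squares data term plus a smoothed total-variation penalty by plain gradient descent; the forward operator A, the step size, and the regularization weight are placeholders, and boundary handling is simplified, so this is not tied to the SparseBeads setup or to any particular SR solver.

        # TV-regularized least squares by gradient descent on a smoothed TV penalty.
        import numpy as np

        def tv_grad(x, eps=1e-6):
            """Gradient of a smoothed isotropic TV penalty for a 2-D image x."""
            dx = np.diff(x, axis=0, append=x[-1:, :])
            dy = np.diff(x, axis=1, append=x[:, -1:])
            mag = np.sqrt(dx**2 + dy**2 + eps)
            px, py = dx / mag, dy / mag
            div = (px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1))
            return -div                                   # periodic boundaries for brevity

        def tv_reconstruct(A, b, shape, alpha=0.1, step=1e-3, n_iter=500):
            """Minimize ||A x - b||^2 + alpha * TV(x) for a flattened image x."""
            x = np.zeros(int(np.prod(shape)))
            for _ in range(n_iter):
                grad = 2 * A.T @ (A @ x - b) + alpha * tv_grad(x.reshape(shape)).ravel()
                x -= step * grad
            return x.reshape(shape)

        # tiny usage with A = identity (pure TV denoising of a noisy square)
        img = np.zeros((32, 32)); img[8:24, 8:24] = 1.0
        noisy = img + 0.2 * np.random.default_rng(0).standard_normal(img.shape)
        rec = tv_reconstruct(np.eye(img.size), noisy.ravel(), img.shape,
                             alpha=0.5, step=0.1, n_iter=200)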

  19. Incremental projection approach of regularization for inverse problems

    Energy Technology Data Exchange (ETDEWEB)

    Souopgui, Innocent, E-mail: innocent.souopgui@usm.edu [The University of Southern Mississippi, Department of Marine Science (United States); Ngodock, Hans E., E-mail: hans.ngodock@nrlssc.navy.mil [Naval Research Laboratory (United States); Vidard, Arthur, E-mail: arthur.vidard@imag.fr; Le Dimet, François-Xavier, E-mail: ledimet@imag.fr [Laboratoire Jean Kuntzmann (France)

    2016-10-15

    This paper presents an alternative approach to the regularized least squares solution of ill-posed inverse problems. Instead of solving a minimization problem with an objective function composed of a data term and a regularization term, the regularization information is used to define a projection onto a convex subspace of regularized candidate solutions. The objective function is modified to include the projection of each iterate in the place of the regularization. Numerical experiments based on the problem of motion estimation for geophysical fluid images show the improvement of the proposed method compared with regularization methods. For the presented test case, the incremental projection method uses 7 times less computation time than the regularization method to reach the same error target. Moreover, at convergence, the incremental projection is two orders of magnitude more accurate than the regularization method.
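
    A schematic, assumption-laden sketch of the contrast being drawn: instead of adding a penalty term, each gradient iterate is projected onto a convex set of admissible solutions. Here the set is a simple Euclidean ball, chosen only because its projection has a one-line formula; the paper's motion-estimation setting and its regularization subspace are far richer.

        # Projected gradient for min ||A x - b||^2 subject to x in a convex set C.
        import numpy as np

        def project_ball(x, tau):
            nrm = np.linalg.norm(x)
            return x if nrm <= tau else x * (tau / nrm)

        def projected_gradient(A, b, tau, n_iter=200):
            step = 1.0 / np.linalg.norm(A, 2) ** 2        # step size for 0.5*||Ax-b||^2
            x = np.zeros(A.shape[1])
            for _ in range(n_iter):
                x = x - step * (A.T @ (A @ x - b))        # gradient step on the data term
                x = project_ball(x, tau)                  # projection replaces the penalty
            return x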

  20. Geometric regularizations and dual conifold transitions

    International Nuclear Information System (INIS)

    Landsteiner, Karl; Lazaroiu, Calin I.

    2003-01-01

    We consider a geometric regularization for the class of conifold transitions relating D-brane systems on noncompact Calabi-Yau spaces to certain flux backgrounds. This regularization respects the SL(2,Z) invariance of the flux superpotential, and allows for computation of the relevant periods through the method of Picard-Fuchs equations. The regularized geometry is a noncompact Calabi-Yau which can be viewed as a monodromic fibration, with the nontrivial monodromy being induced by the regulator. It reduces to the original, non-monodromic background when the regulator is removed. Using this regularization, we discuss the simple case of the local conifold, and show how the relevant field-theoretic information can be extracted in this approach. (author)

  1. Development of human factors design review guidelines(III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-02-15

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: '25. Human factors engineering program review model' and '26. Review criteria for human factors aspects of advanced controls and instrumentation', which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents for the developers of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994.

  2. Development of human factors design review guidelines(III)

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul

    1999-02-01

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: '25. Human factors engineering program review model' and '26. Review criteria for human factors aspects of advanced controls and instrumentation', which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents for the developers of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994.

  3. Development of human factors design review guidelines(II)

    International Nuclear Information System (INIS)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul

    1998-06-01

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: '25. Human factors engineering program review model' and '26. Review criteria for human factors aspects of advanced controls and instrumentation', which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents for the developers of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994.

  4. Development of human factors design review guidelines(III)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Oh, In Suk; Suh, Sang Moon; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1999-02-15

    The objective of this study is to develop human factors engineering program review guidelines and alarm system review guidelines in order to resolve the two major technical issues: '25. Human factors engineering program review model' and '26. Review criteria for human factors aspects of advanced controls and instrumentation', which are related to the development of human factors safety regulation guides being performed by KINS. For the development of human factors program review guidelines, we made a Korean version of NUREG-0711 and added our comments by considering the Korean regulatory situation and reviewing the reference documents of NUREG-0711. We also computerized the Korean version of NUREG-0711, the additional comments, and selected portions of the reference documents for the developers of safety regulation guides in KINS to see the contents comparatively at a glance and use them easily. For the development of alarm system review guidelines, we made a Korean version of NUREG/CR-6105, which was published by NRC in 1994 as a guideline document for the human factors review of alarm systems. Then we will update the guidelines by reviewing the literature related to alarm design published after 1994.

  5. Adaptive regularization

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Rasmussen, Carl Edward; Svarer, C.

    1994-01-01

    Regularization, e.g., in the form of weight decay, is important for training and optimization of neural network architectures. In this work the authors provide a tool based on asymptotic sampling theory for iterative estimation of weight decay parameters. The basic idea is to do a gradient descent...
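
    The abstract is truncated in this record; as a hedged stand-in for the idea of adapting the regularization strength during training, the sketch below tunes a ridge (weight-decay) parameter by finite-difference steps on a validation error. This is only an illustration of iterative weight-decay estimation, not the asymptotic-sampling estimator the authors propose.

        # Iteratively adapt a weight-decay (ridge) parameter to a validation set.
        import numpy as np

        def ridge_fit(X, y, lam):
            d = X.shape[1]
            return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

        def val_error(lam, Xtr, ytr, Xva, yva):
            w = ridge_fit(Xtr, ytr, lam)
            return np.mean((Xva @ w - yva) ** 2)

        def adapt_weight_decay(Xtr, ytr, Xva, yva, lam=1.0, lr=0.1, n_iter=50):
            for _ in range(n_iter):
                # finite-difference derivative of validation error w.r.t. log(lam)
                e_plus = val_error(lam * 1.05, Xtr, ytr, Xva, yva)
                e_minus = val_error(lam / 1.05, Xtr, ytr, Xva, yva)
                g = (e_plus - e_minus) / (2 * np.log(1.05))
                lam *= np.exp(-lr * g)                    # multiplicative update keeps lam > 0
            return lam

        # usage: small synthetic regression problem split into train/validation
        rng = np.random.default_rng(0)
        X = rng.standard_normal((80, 15)); w_true = rng.standard_normal(15)
        y = X @ w_true + 0.5 * rng.standard_normal(80)
        print(adapt_weight_decay(X[:50], y[:50], X[50:], y[50:]))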

  6. Regularizing portfolio optimization

    International Nuclear Information System (INIS)

    Still, Susanne; Kondor, Imre

    2010-01-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.
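
    To make the "diversification pressure" interpretation concrete, here is a minimal closed-form illustration in which sample variance stands in for expected shortfall and an L2 penalty with weight eta is added under the budget constraint; the data, the penalty weight, and the use of variance instead of expected shortfall are all simplifying assumptions, not the paper's setup.

        # Effect of an L2 regularizer on minimum-risk portfolio weights.
        import numpy as np

        def regularized_min_risk_weights(returns, eta):
            """min_w  w'Cw + eta*||w||^2   subject to   sum(w) = 1."""
            C = np.cov(returns, rowvar=False)
            M = C + eta * np.eye(C.shape[0])
            w = np.linalg.solve(M, np.ones(C.shape[0]))
            return w / w.sum()

        rng = np.random.default_rng(0)
        returns = 0.02 * rng.standard_normal((60, 20))    # short noisy sample: 60 days, 20 assets
        for eta in (0.0, 1e-4, 1e-2):
            w = regularized_min_risk_weights(returns, eta)
            print(f"eta={eta:g}  max weight={w.max():.3f}  effective N={1.0 / np.sum(w**2):.1f}")
        # larger eta spreads the weights over more assets (larger effective N),
        # which is the stabilizing effect described in the abstract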

  7. Regularizing portfolio optimization

    Science.gov (United States)

    Still, Susanne; Kondor, Imre

    2010-07-01

    The optimization of large portfolios displays an inherent instability due to estimation error. This poses a fundamental problem, because solutions that are not stable under sample fluctuations may look optimal for a given sample, but are, in effect, very far from optimal with respect to the average risk. In this paper, we approach the problem from the point of view of statistical learning theory. The occurrence of the instability is intimately related to over-fitting, which can be avoided using known regularization methods. We show how regularized portfolio optimization with the expected shortfall as a risk measure is related to support vector regression. The budget constraint dictates a modification. We present the resulting optimization problem and discuss the solution. The L2 norm of the weight vector is used as a regularizer, which corresponds to a diversification 'pressure'. This means that diversification, besides counteracting downward fluctuations in some assets by upward fluctuations in others, is also crucial because it improves the stability of the solution. The approach we provide here allows for the simultaneous treatment of optimization and diversification in one framework that enables the investor to trade off between the two, depending on the size of the available dataset.

  8. Accretion onto some well-known regular black holes

    International Nuclear Information System (INIS)

    Jawad, Abdul; Shahzad, M.U.

    2016-01-01

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sfetsos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes. (orig.)
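
    For orientation only: in the simplest (Schwarzschild) limit, with lapse f(r) = 1 - 2M/r and units G = c = 1, the Michel-type critical-point conditions referred to in the abstract read

        u_c^{2} = \frac{M}{2 r_c}, \qquad
        c_{s}^{2}\big|_{r_c} = \frac{u_c^{2}}{1 - 3 u_c^{2}},

    where r_c is the critical radius, u_c the critical radial speed, and c_s the fluid sound speed. These are quoted as a hedged illustration of the quantities involved; the corresponding expressions for the charged and Kehagias-Sfetsos regular black holes, with their modified lapse functions, are the ones actually derived in the paper.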

  9. Accretion onto some well-known regular black holes

    Energy Technology Data Exchange (ETDEWEB)

    Jawad, Abdul; Shahzad, M.U. [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan)

    2016-03-15

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sfetsos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes. (orig.)

  10. Accretion onto some well-known regular black holes

    Science.gov (United States)

    Jawad, Abdul; Shahzad, M. Umair

    2016-03-01

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sfetsos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes.

  11. Review of FFTF and CRBRP control rod systems designs

    International Nuclear Information System (INIS)

    Pitterle, T.A.; Lagally, H.O.

    1977-01-01

    The evolution of the primary control rod system design for FFTF and CRBR, beginning with the initial choice of the basic concepts, is described. The significant component and systems tests are reviewed together with the test results which referenced the development of the CRBR primary control rod system design. Modifications to the concepts and detail designs of the FFTF control rod system were required principally to satisfy the requirements of CRBR, while at the same time incorporating design refinements shown to be desirable by the tests

  12. Influence of combination hemodialysis/hemoperfusion against score of depression in regular hemodialysis patients

    Science.gov (United States)

    Permatasari, T. D.; Thamrin, A.; Hanum, H.

    2018-03-01

    Patients with chronic kidney disease have a higher risk for psychological distress such as anxiety, depression and cognitive decline. A regular combination of hemodialysis (HD) and hemoperfusion (HP) is better able to eliminate uremic toxins of middle-to-large molecular weight. HD/HP can remove metabolites, toxins, and pathogenic factors and regulate the water, electrolyte and acid-base balance, which improves the quality of patients’ sleep and appetite and reduces itching of the skin, and in turn improves quality of life and life expectancy. This was a cross-sectional study with a pre-experimental design conducted from July to September 2015 with 17 regular hemodialysis patients as the sample. Inclusion criteria were being a regular hemodialysis patient and willingness to participate in the research. The assessment was conducted using the BDI to assess depression. Data were analyzed using a t-test, which showed that the average BDI score fell from 18.59±9 before the HD/HP combination to 8.18±2.83 after the combination (p<0.001). In conclusion, combined HD/HP can lower depression scores in patients on regular HD.
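
    The pre/post comparison described above amounts to a paired t-test on the BDI scores. A minimal sketch follows, assuming SciPy is available; the two score arrays are placeholders for seventeen patients and are not the study data.

        # Paired t-test on before/after BDI depression scores (placeholder data).
        import numpy as np
        from scipy import stats

        bdi_before = np.array([25, 14, 20, 31, 12, 18, 22, 17, 16, 28, 9, 24, 15, 19, 21, 13, 12])
        bdi_after  = np.array([10,  7,  9, 12,  6,  8, 11,  7,  6, 10, 5,  9,  7,  8,  9,  6,  9])

        t_stat, p_value = stats.ttest_rel(bdi_before, bdi_after)   # paired (dependent) t-test
        print(f"mean before = {bdi_before.mean():.2f}, "
              f"mean after = {bdi_after.mean():.2f}, p = {p_value:.4f}")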

  13. GCRA review and appraisal of HTGR reactor-core-design program

    International Nuclear Information System (INIS)

    1980-09-01

    The reactor-core-design program has as its principal objective and responsibility the design and resolution of major technical issues for the reactor core and core components on a schedule consistent with the plant licensing and construction program. The task covered in this review includes three major design areas: core physics; core thermal and hydraulic performance; and fuel element design and in-core fuel performance evaluation.

  14. Human factors engineering design review acceptance criteria for the safety parameter display

    International Nuclear Information System (INIS)

    McGevna, V.; Peterson, L.R.

    1981-01-01

    This report contains human factors engineering design review acceptance criteria developed by the Human Factors Engineering Branch (HFEB) of the Nuclear Regulatory Commission (NRC) to use in evaluating designs of the Safety Parameter Display System (SPDS). These criteria were developed in response to the functional design criteria for the SPDS defined in NUREG-0696, Functional Criteria for Emergency Response Facilities. The purpose of this report is to identify design review acceptance criteria for the SPDS installed in the control room of a nuclear power plant. Use of computer driven cathode ray tube (CRT) displays is anticipated. General acceptance criteria for displays of plant safety status information by the SPDS are developed. In addition, specific SPDS review criteria corresponding to the SPDS functional criteria specified in NUREG-0696 are established

  15. Diagrammatic methods in phase-space regularization

    International Nuclear Information System (INIS)

    Bern, Z.; Halpern, M.B.; California Univ., Berkeley

    1987-11-01

    Using the scalar prototype and gauge theory as the simplest possible examples, diagrammatic methods are developed for the recently proposed phase-space form of continuum regularization. A number of one-loop and all-order applications are given, including general diagrammatic discussions of the nogrowth theorem and the uniqueness of the phase-space stochastic calculus. The approach also generates an alternate derivation of the equivalence of the large-β phase-space regularization to the more conventional coordinate-space regularization. (orig.)

  16. A review of electronic engineering design free software tools

    OpenAIRE

    Medrano Sánchez, Carlos; Plaza García, Inmaculada; Castro Gil, Manuel Alonso; García Sevilla, Francisco; Martínez Calero, J.D.; Pou Félix, Josep; Corbalán Fuertes, Montserrat

    2010-01-01

    In this paper, we review electronic design free software tools. We have searched open source programs that help with several tasks of the electronic design flow: analog and digital simulation, schematic capture, printed circuit board design and hardware description language compilation and simulation. Using some rapid criteria for verifying their availability, we have selected some of them which are worth working with. This work intends to perform a deeper analysis of fre...

  17. Metric regularity and subdifferential calculus

    International Nuclear Information System (INIS)

    Ioffe, A D

    2000-01-01

    The theory of metric regularity is an extension of two classical results: the Lyusternik tangent space theorem and the Graves surjection theorem. Developments in non-smooth analysis in the 1980s and 1990s paved the way for a number of far-reaching extensions of these results. It was also well understood that the phenomena behind the results are of metric origin, not connected with any linear structure. At the same time it became clear that some basic hypotheses of the subdifferential calculus are closely connected with the metric regularity of certain set-valued maps. The survey is devoted to the metric theory of metric regularity and its connection with subdifferential calculus in Banach spaces
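
    As a point of reference (the standard textbook formulation, not a quotation from the survey): a set-valued map $F$ between metric spaces $X$ and $Y$ is metrically regular at a point $(\bar{x},\bar{y})$ of its graph with modulus $\kappa > 0$ if there exist neighborhoods $U$ of $\bar{x}$ and $V$ of $\bar{y}$ such that

        d\bigl(x, F^{-1}(y)\bigr) \;\le\; \kappa\, d\bigl(y, F(x)\bigr)
        \qquad \text{for all } x \in U,\ y \in V.

    The Lyusternik and Graves theorems correspond to the single-valued smooth case with surjective derivative, which is the sense in which the metric theory extends the classical results.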

  18. Temporal regularity of the environment drives time perception

    OpenAIRE

    van Rijn, H; Rhodes, D; Di Luca, M

    2016-01-01

    It’s reasonable to assume that a regularly paced sequence should be perceived as regular, but here we show that perceived regularity depends on the context in which the sequence is embedded. We presented one group of participants with perceptually regularly paced sequences, and another group of participants with mostly irregularly paced sequences (75% irregular, 25% regular). The timing of the final stimulus in each sequence could be varied. In one experiment, we asked whether the last stim...

  19. Effects of regular exercise on asthma control in young adults.

    Science.gov (United States)

    Heikkinen, Sirpa A M; Mäkikyrö, Elina M S; Hugg, Timo T; Jaakkola, Maritta S; Jaakkola, Jouni J K

    2017-08-28

    According to our systematic literature review, no previous study has assessed potential effects of regular exercise on asthma control among young adults. We hypothesized that regular exercise improves asthma control among young adults. We studied 162 subjects with current asthma recruited from a population-based cohort study of 1,623 young adults 20-27 years of age. Asthma control was assessed by the occurrence of asthma-related symptoms, including wheezing, shortness of breath, cough, and phlegm production, during the past 12 months. Asthma symptom score was calculated based on reported frequencies of these symptoms (range: 0-12). Exercise was assessed as hours/week. In Poisson regression, adjusting for gender, age, smoking, environmental tobacco smoke exposure, and education, the asthma symptom score reduced by 0.09 points per 1 hour of exercise/week (95% CI: 0.00 to 0.17). Applying the "Low exercise" quartile as the reference, "Medium exercise" reduced the asthma symptom score by 0.66 (-0.39 to 1.72), and "High exercise" reduced it significantly by 1.13 (0.03 to 2.22). The effect was strongest among overweight subjects. Our results provide new evidence that regular exercising among young adults improves their asthma control. Thus, advising about exercise should be included as an important part of asthma self-management in clinical practice.
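
    The adjusted model described above is an ordinary Poisson regression of the symptom score on weekly exercise hours and covariates. A hedged sketch with statsmodels follows; the data frame, variable names, and covariate set are hypothetical placeholders, not the study data or its exact model specification.

        # Adjusted Poisson regression of a symptom score on exercise hours.
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        df = pd.DataFrame({                                # hypothetical participants
            "symptom_score":  [4, 2, 0, 6, 3, 1, 5, 2, 0, 7],
            "exercise_hours": [1, 3, 6, 0, 2, 5, 1, 4, 7, 0],
            "age":            [24, 26, 22, 27, 25, 23, 21, 26, 24, 22],
            "female":         [1, 0, 1, 1, 0, 0, 1, 0, 1, 0],
            "smoker":         [0, 0, 0, 1, 0, 0, 1, 0, 0, 1],
        })

        model = smf.glm("symptom_score ~ exercise_hours + age + female + smoker",
                        data=df, family=sm.families.Poisson()).fit()
        print(model.params["exercise_hours"])   # change in log(expected score) per weekly hour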

  20. Human-factors engineering control-room design review/audit report: Byron Generating Station, Commonwealth Edison Company

    International Nuclear Information System (INIS)

    Savage, J.W.

    1983-01-01

    A human factors engineering design review/audit of the Byron Unit 1 control room was performed at the site on November 17 through November 19, 1981. This review was accomplished using the Unit 2 control room appropriately mocked-up to reflect design changes already committed to be incorporated in Unit 1. The report was prepared on the basis of the HFEB's audit of the applicant's Preliminary Design Assessment report and the human factors engineering design review performed at the site. This design review was carried out by a team from the Human Factors Engineering Branch, Division of Human Factors Safety. The review team was assisted by consultants from BioTechnology, Inc. (Falls Church, Virginia), and from Lawrence Livermore National Laboratory (University of California), Livermore, California

  1. Position paper - peer review and design verification of selected activities

    International Nuclear Information System (INIS)

    Stine, M.D.

    1994-09-01

    Position Paper to develop and document a position on the performance of independent peer reviews on selected design and analysis components of the Title I (preliminary) and Title II (detailed) design phases of the Multi-Function Waste Tank Facility project

  2. DSRS guidelines. Reference document for the IAEA Design Safety Review Services

    International Nuclear Information System (INIS)

    1999-01-01

    The publication covers the general topic of design safety review of a nuclear power plant. It is intended to make Member States aware of the possibility of a service through which they can have a better appreciation of the overall design of a facility or of a plant already in operation. It includes a generic and procedural part followed by a technical part corresponding to different systems of a nuclear power plant. It is intended to be used mainly in preparation and execution of a design review service by the IAEA and to provide information to potential recipients of the service regarding the effort involved and the topics that can be covered. It is expected to be useful if Member States decide to conduct such reviews themselves either through regulatory authorities or as part of self-assessment activities by plant management

  3. The three-point function in split dimensional regularization in the Coulomb gauge

    CERN Document Server

    Leibbrandt, G

    1998-01-01

    We use a gauge-invariant regularization procedure, called "split dimensional regularization", to evaluate the quark self-energy $\Sigma(p)$ and quark-quark-gluon vertex function $\Lambda_\mu(p',p)$ in the Coulomb gauge, $\vec{\nabla}\cdot\vec{A}^a = 0$. The technique of split dimensional regularization was designed to regulate Coulomb-gauge Feynman integrals in non-Abelian theories. The technique, which is based on two complex regulating parameters, $\omega$ and $\sigma$, is shown to generate a well-defined set of Coulomb-gauge integrals. A major component of this project deals with the evaluation of four-propagator and five-propagator Coulomb integrals, some of which are nonlocal. It is further argued that the standard one-loop BRST identity relating $\Sigma$ and $\Lambda_\mu$ should by rights be replaced by a more general BRST identity which contains two additional contributions from ghost vertex diagrams. Despite the appearance of nonlocal Coulomb integrals, both $\Sigma$ and $\Lambda_\mu$...

  4. Designed graphene-peptide nanocomposites for biosensor applications: A review

    International Nuclear Information System (INIS)

    Wang, Li; Zhang, Yujie; Wu, Aiguo; Wei, Gang

    2017-01-01

    The modification of graphene with biomacromolecules such as DNA, proteins, and peptides extends the potential applications of graphene materials in various fields. The bound biomacromolecules can improve the biocompatibility and bio-recognition ability of graphene-based nanocomposites and therefore greatly enhance their biosensing performance in both selectivity and sensitivity. In this review, we present a comprehensive introduction to and discussion of recent advances in the synthesis and biosensor applications of graphene-peptide nanocomposites. The biofunctionalization of graphene with specifically designed peptides and the synthesis strategies for graphene-peptide (monomer, nanofibril, and nanotube) nanocomposites are demonstrated. The fabrication of graphene-peptide nanocomposite based biosensor architectures for electrochemical, fluorescent, electronic, and spectroscopic biosensing is then presented. The review covers nearly all recent studies on the fabrication and application of graphene-peptide based biosensors, which should promote future developments of graphene-based biosensors in biomedical detection and environmental analysis. - Highlights: • A comprehensive review on the fabrication and application of graphene-peptide nanocomposites is presented. • The design of peptide sequences for the biofunctionalization of various graphene materials is described. • Multiple strategies for fabricating biosensors with graphene-peptide nanocomposites are discussed. • Designed graphene-peptide nanocomposites show wide biosensor applicability.

  5. The uniqueness of the regularization procedure

    International Nuclear Information System (INIS)

    Brzezowski, S.

    1981-01-01

    On the grounds of the BPHZ procedure, the criteria of correct regularization in perturbation calculations of QFT are given, together with the prescription for dividing the regularized formulas into the finite and infinite parts. (author)

  6. Coupling regularizes individual units in noisy populations

    International Nuclear Information System (INIS)

    Ly Cheng; Ermentrout, G. Bard

    2010-01-01

    The regularity of a noisy system can modulate in various ways. It is well known that coupling in a population can lower the variability of the entire network; the collective activity is more regular. Here, we show that diffusive (reciprocal) coupling of two simple Ornstein-Uhlenbeck (O-U) processes can regularize the individual, even when it is coupled to a noisier process. In cellular networks, the regularity of individual cells is important when a select few play a significant role. The regularizing effect of coupling surprisingly applies also to general nonlinear noisy oscillators. However, unlike with the O-U process, coupling-induced regularity is robust to different kinds of coupling. With two coupled noisy oscillators, we derive an asymptotic formula assuming weak noise and coupling for the variance of the period (i.e., spike times) that accurately captures this effect. Moreover, we find that reciprocal coupling can regularize the individual period of higher dimensional oscillators such as the Morris-Lecar and Brusselator models, even when coupled to noisier oscillators. Coupling can have a counterintuitive and beneficial effect on noisy systems. These results have implications for the role of connectivity with noisy oscillators and the modulation of variability of individual oscillators.
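
    The O-U part of the claim is easy to reproduce numerically. The toy Euler-Maruyama simulation below couples a quiet process x to a noisier process y and compares the stationary variance of x with and without coupling; all parameter values are arbitrary choices for illustration, not those analyzed in the paper.

        # Diffusive coupling of two Ornstein-Uhlenbeck processes.
        import numpy as np

        def simulate_x(coupling, sigma_x=1.0, sigma_y=1.5, dt=0.01, n_steps=400_000, seed=0):
            rng = np.random.default_rng(seed)
            x = y = 0.0
            xs = np.empty(n_steps)
            for i in range(n_steps):
                dWx, dWy = rng.standard_normal(2) * np.sqrt(dt)
                x_new = x + (-x + coupling * (y - x)) * dt + sigma_x * dWx
                y     = y + (-y + coupling * (x - y)) * dt + sigma_y * dWy
                x = x_new
                xs[i] = x
            return xs[n_steps // 10:]                      # drop the burn-in

        print("Var(x), uncoupled:", round(float(np.var(simulate_x(0.0))), 3))
        print("Var(x), coupled  :", round(float(np.var(simulate_x(1.0))), 3))
        # with these parameters the coupled x is noticeably less variable, even
        # though its partner y carries more noise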

  7. Learning regularization parameters for general-form Tikhonov

    International Nuclear Information System (INIS)

    Chung, Julianne; Español, Malena I

    2017-01-01

    Computing regularization parameters for general-form Tikhonov regularization can be an expensive and difficult task, especially if multiple parameters or many solutions need to be computed in real time. In this work, we assume training data is available and describe an efficient learning approach for computing regularization parameters that can be used for a large set of problems. We consider an empirical Bayes risk minimization framework for finding regularization parameters that minimize average errors for the training data. We first extend methods from Chung et al (2011 SIAM J. Sci. Comput. 33 3132–52) to the general-form Tikhonov problem. Then we develop a learning approach for multi-parameter Tikhonov problems, for the case where all involved matrices are simultaneously diagonalizable. For problems where this is not the case, we describe an approach to compute near-optimal regularization parameters by using operator approximations for the original problem. Finally, we propose a new class of regularizing filters, where solutions correspond to multi-parameter Tikhonov solutions, that requires less data than previously proposed optimal error filters, avoids the generalized SVD, and allows flexibility and novelty in the choice of regularization matrices. Numerical results for 1D and 2D examples using different norms on the errors show the effectiveness of our methods. (paper)
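
    As a toy illustration of learning a regularization parameter from training data (a stand-in for the empirical Bayes risk minimization framework, not the paper's algorithm), the sketch below picks a single general-form Tikhonov parameter by minimizing the average reconstruction error over synthetic training pairs; the forward operator, regularization matrix, and training signals are all invented for the example.

    ```python
    import numpy as np

    # Minimal sketch (not the paper's algorithm): learn one general-form
    # Tikhonov parameter from training pairs (x_true, b) by minimizing the
    # average reconstruction error, then reuse it for new data. The forward
    # operator A, regularizer L, and training signals are invented here.
    rng = np.random.default_rng(1)
    n = 50
    A = np.tril(np.ones((n, n))) / n            # smoothing forward operator
    L = (np.eye(n) - np.eye(n, k=1))[:-1]       # first-difference regularizer

    def tikhonov_solve(A, b, L, lam):
        """Solve min ||Ax - b||^2 + lam^2 ||Lx||^2 as a stacked least-squares problem."""
        K = np.vstack([A, lam * L])
        rhs = np.concatenate([b, np.zeros(L.shape[0])])
        return np.linalg.lstsq(K, rhs, rcond=None)[0]

    # Synthetic training set: smooth signals observed through A with noise.
    t = np.linspace(0, 1, n)
    train = []
    for _ in range(20):
        x_true = np.sin(2 * np.pi * rng.uniform(1, 3) * t)
        b = A @ x_true + 0.01 * rng.standard_normal(n)
        train.append((x_true, b))

    lams = np.logspace(-4, 1, 30)
    avg_err = [np.mean([np.linalg.norm(tikhonov_solve(A, b, L, lam) - x) ** 2
                        for x, b in train]) for lam in lams]
    print("learned regularization parameter:", lams[int(np.argmin(avg_err))])
    ```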

  8. 5 CFR 551.421 - Regular working hours.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Regular working hours. 551.421 Section... Activities § 551.421 Regular working hours. (a) Under the Act there is no requirement that a Federal employee... distinction based on whether the activity is performed by an employee during regular working hours or outside...

  9. Regular extensions of some classes of grammars

    NARCIS (Netherlands)

    Nijholt, Antinus

    Culik and Cohen introduced the class of LR-regular grammars, an extension of the LR(k) grammars. In this report we consider the analogous extension of the LL(k) grammars, called the LL-regular grammars. The relations of this class of grammars to other classes of grammars are shown. Every LL-regular

  10. Benefits of regular aerobic exercise for executive functioning in healthy populations.

    Science.gov (United States)

    Guiney, Hayley; Machado, Liana

    2013-02-01

    Research suggests that regular aerobic exercise has the potential to improve executive functioning, even in healthy populations. The purpose of this review is to elucidate which components of executive functioning benefit from such exercise in healthy populations. In light of the developmental time course of executive functions, we consider separately children, young adults, and older adults. Data to date from studies of aging provide strong evidence of exercise-linked benefits related to task switching, selective attention, inhibition of prepotent responses, and working memory capacity; furthermore, cross-sectional fitness data suggest that working memory updating could potentially benefit as well. In young adults, working memory updating is the main executive function shown to benefit from regular exercise, but cross-sectional data further suggest that task-switching and post error performance may also benefit. In children, working memory capacity has been shown to benefit, and cross-sectional data suggest potential benefits for selective attention and inhibitory control. Although more research investigating exercise-related benefits for specific components of executive functioning is clearly needed in young adults and children, when considered across the age groups, ample evidence indicates that regular engagement in aerobic exercise can provide a simple means for healthy people to optimize a range of executive functions.

  11. Regular non-twisting S-branes

    International Nuclear Information System (INIS)

    Obregon, Octavio; Quevedo, Hernando; Ryan, Michael P.

    2004-01-01

    We construct a family of time and angular dependent, regular S-brane solutions which corresponds to a simple analytical continuation of the Zipoy-Voorhees 4-dimensional vacuum spacetime. The solutions are asymptotically flat and turn out to be free of singularities without requiring a twist in space. They can be considered as the simplest non-singular generalization of the singular S0-brane solution. We analyze the properties of a representative of this family of solutions and show that it resembles to some extent the asymptotic properties of the regular Kerr S-brane. The R-symmetry corresponds, however, to the general Lorentzian symmetry. Several generalizations of this regular solution are derived which include a charged S-brane and an additional dilatonic field. (author)

  12. Near-Regular Structure Discovery Using Linear Programming

    KAUST Repository

    Huang, Qixing

    2014-06-02

    Near-regular structures are common in manmade and natural objects. Algorithmic detection of such regularity greatly facilitates our understanding of shape structures, leads to compact encoding of input geometries, and enables efficient generation and manipulation of complex patterns on both acquired and synthesized objects. Such regularity manifests itself both in the repetition of certain geometric elements, as well as in the structured arrangement of the elements. We cast the regularity detection problem as an optimization and efficiently solve it using linear programming techniques. Our optimization has a discrete aspect, that is, the connectivity relationships among the elements, as well as a continuous aspect, namely the locations of the elements of interest. Both these aspects are captured by our near-regular structure extraction framework, which alternates between discrete and continuous optimizations. We demonstrate the effectiveness of our framework on a variety of problems including near-regular structure extraction, structure-preserving pattern manipulation, and markerless correspondence detection. Robustness results with respect to geometric and topological noise are presented on synthesized, real-world, and also benchmark datasets. © 2014 ACM.

  13. Design and implementation of Metta, a metasearch engine for biomedical literature retrieval intended for systematic reviewers.

    Science.gov (United States)

    Smalheiser, Neil R; Lin, Can; Jia, Lifeng; Jiang, Yu; Cohen, Aaron M; Yu, Clement; Davis, John M; Adams, Clive E; McDonagh, Marian S; Meng, Weiyi

    2014-01-01

    Individuals and groups who write systematic reviews and meta-analyses in evidence-based medicine regularly carry out literature searches across multiple search engines linked to different bibliographic databases, and thus have an urgent need for a suitable metasearch engine to save time spent on repeated searches and to remove duplicate publications from initial consideration. Unlike general users who generally carry out searches to find a few highly relevant (or highly recent) articles, systematic reviewers seek to obtain a comprehensive set of articles on a given topic, satisfying specific criteria. This creates special requirements and challenges for metasearch engine design and implementation. We created a federated search tool that is connected to five databases: PubMed, EMBASE, CINAHL, PsycINFO, and the Cochrane Central Register of Controlled Trials. Retrieved bibliographic records were shown online; optionally, results could be de-duplicated and exported in both BibTex and XML format. The query interface was extensively modified in response to feedback from users within our team. Besides a general search track and one focused on human-related articles, we also added search tracks optimized to identify case reports and systematic reviews. Although users could modify preset search options, they were rarely if ever altered in practice. Up to several thousand retrieved records could be exported within a few minutes. De-duplication of records returned from multiple databases was carried out in a prioritized fashion that favored retaining citations returned from PubMed. Systematic reviewers are used to formulating complex queries using strategies and search tags that are specific for individual databases. Metta offers a different approach that may save substantial time but which requires modification of current search strategies and better indexing of randomized controlled trial articles. We envision Metta as one piece of a multi-tool pipeline that will assist

  14. Regular Expression Matching and Operational Semantics

    Directory of Open Access Journals (Sweden)

    Asiri Rathnayake

    2011-08-01

    Full Text Available Many programming languages and tools, ranging from grep to the Java String library, contain regular expression matchers. Rather than first translating a regular expression into a deterministic finite automaton, such implementations typically match the regular expression on the fly. Thus they can be seen as virtual machines interpreting the regular expression much as if it were a program with some non-deterministic constructs such as the Kleene star. We formalize this implementation technique for regular expression matching using operational semantics. Specifically, we derive a series of abstract machines, moving from the abstract definition of matching to increasingly realistic machines. First a continuation is added to the operational semantics to describe what remains to be matched after the current expression. Next, we represent the expression as a data structure using pointers, which enables redundant searches to be eliminated via testing for pointer equality. From there, we arrive both at Thompson's lockstep construction and a machine that performs some operations in parallel, suitable for implementation on a large number of cores, such as a GPU. We formalize the parallel machine using process algebra and report some preliminary experiments with an implementation on a graphics processor using CUDA.
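
    The flavor of the continuation-based operational semantics can be conveyed with a small interpreter; the sketch below (our own naming, not the paper's formalization) matches a tiny regex AST on the fly in continuation-passing style, where the continuation represents what remains to be matched after the current expression.

    ```python
    # Minimal sketch of on-the-fly matching in continuation-passing style
    # (our own naming, not the paper's formalization). A regex is a small AST;
    # the continuation k says what remains to be matched after the current
    # expression.

    class Eps: pass                        # empty string
    class Chr:                             # a single character
        def __init__(self, c): self.c = c
    class Seq:                             # concatenation r1 r2
        def __init__(self, r1, r2): self.r1, self.r2 = r1, r2
    class Alt:                             # alternation r1 | r2
        def __init__(self, r1, r2): self.r1, self.r2 = r1, r2
    class Star:                            # Kleene star r*
        def __init__(self, r): self.r = r

    def match(r, s, i, k):
        """Can r match a prefix of s[i:] so that continuation k accepts the rest?"""
        if isinstance(r, Eps):
            return k(i)
        if isinstance(r, Chr):
            return i < len(s) and s[i] == r.c and k(i + 1)
        if isinstance(r, Seq):
            return match(r.r1, s, i, lambda j: match(r.r2, s, j, k))
        if isinstance(r, Alt):
            return match(r.r1, s, i, k) or match(r.r2, s, i, k)
        if isinstance(r, Star):
            # either stop, or match the body once (consuming input) and loop
            return k(i) or match(r.r, s, i, lambda j: j > i and match(r, s, j, k))
        raise TypeError(r)

    def full_match(r, s):
        return match(r, s, 0, lambda i: i == len(s))

    # (a|b)*abb
    r = Seq(Star(Alt(Chr('a'), Chr('b'))), Seq(Chr('a'), Seq(Chr('b'), Chr('b'))))
    print(full_match(r, "abaabb"), full_match(r, "abab"))   # True False
    ```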

  15. Design review report for modifications to RMCS safety class equipment

    International Nuclear Information System (INIS)

    Corbett, J.E.

    1997-01-01

    This report documents the completion of the formal design review for modifications to the Rotary Mode Core Sampling (RMCS) safety class equipment. These modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to approve the Engineering Change Notices affecting safety class equipment used in the RMCS system. The conclusion reached by the review committee was that these changes are acceptable

  16. Design review report for modifications to RMCS safety class equipment

    Energy Technology Data Exchange (ETDEWEB)

    Corbett, J.E.

    1997-05-30

    This report documents the completion of the formal design review for modifications to the Rotary Mode Core Sampling (RMCS) safety class equipment. These modifications are intended to support core sampling operations in waste tanks requiring flammable gas controls. The objective of this review was to approve the Engineering Change Notices affecting safety class equipment used in the RMCS system. The conclusion reached by the review committee was that these changes are acceptable.

  17. Tetravalent one-regular graphs of order 4p²

    DEFF Research Database (Denmark)

    Feng, Yan-Quan; Kutnar, Klavdija; Marusic, Dragan

    2014-01-01

    A graph is one-regular if its automorphism group acts regularly on the set of its arcs. In this paper tetravalent one-regular graphs of order 4p², where p is a prime, are classified.

  18. Regularization and error assignment to unfolded distributions

    CERN Document Server

    Zech, Gunter

    2011-01-01

    The commonly used approach to present unfolded data only in graphical form with the diagonal error depending on the regularization strength is unsatisfactory. It does not permit the adjustment of parameters of theories, the exclusion of theories that are admitted by the observed data and does not allow the combination of data from different experiments. We propose fixing the regularization strength by a p-value criterion, indicating the experimental uncertainties independent of the regularization and publishing the unfolded data in addition without regularization. These considerations are illustrated with three different unfolding and smoothing approaches applied to a toy example.

  19. Review of public comments on proposed seismic design criteria

    International Nuclear Information System (INIS)

    Philippacopoulos, A.J.; Shaukat, S.K.; Chokshi, N.C.; Bagchi, G.; Nuclear Regulatory Commission, Washington, DC; Nuclear Regulatory Commission, Washington, DC

    1989-01-01

    During the first quarter of 1988, the Nuclear Regulatory Commission (NRC) prepared a proposed Revision 2 to the NUREG-0800 Standard Review Plan (SRP) Sections 2.5.2 (Vibratory Ground Motion), 3.7.1 (Seismic Design Parameters), 3.7.2 (Seismic Systems Analysis) and 3.7.3 (Seismic Subsystem Analysis). The proposed Revision 2 to the SRP was a result of many years' work carried out by the NRC and the nuclear industry on the Unresolved Safety Issue (USI) A-40: ''Seismic Design Criteria.'' The background material related to NRC's efforts for resolving the A-40 issue is described in NUREG-1233. In June 1988, the proposed Revision 2 of the SRP was issued by NRC for public review and comments. Comments were received from Sargent and Lundy Engineers, Westinghouse Electric Corporation, Stevenson and Associates, Duke Power Company, General Electric Company and Electric Power Research Institute. In September 1988, Brookhaven National Laboratory (BNL) and its consultants (C.J. Costantino, R.P. Kennedy, J. Stevenson, M. Shinozuka and A.S. Veletsos) were requested to carry out a review of the comments received from the above six organizations. The objective of this review was to assist the NRC staff with the evaluation and resolution of the public comments. This review was initiated during October 1988 and it was completed in January 1989. As a result of this review, a set of modifications to the above-mentioned sections of the SRP were recommended by BNL and its consultants. This paper summarizes the recommended modifications. 4 refs

  20. The design of reliability data bases, part I: review of standard design concepts

    International Nuclear Information System (INIS)

    Cooke, Roger M.

    1996-01-01

    Main styles in the design of reliability data banks (RDB's) are reviewed. The conceptual and mathematical tools underlying these designs are summarized. A key point is the method for assessing failure rates for competing failure modes. The theory of independent competing risk and the relation to colored Poisson processes is explained. The notions of observed and naked failure rates are defined, and their equivalence under the assumption of independence is shown. In conclusion, the needs of different users are compared with the information currently offered

  1. Producing design objects from regular polyhedra: A Pratical approach

    OpenAIRE

    Polimeni, Beniamino

    2017-01-01

    In the last few years, digital modeling techniques have played a major role in Architecture and design, influencing, at the same time, the creative process and the way the objects are fabricated. This revolution has produced a new fertile generation of architects and designers focused on the expanding possibilities of material and formal production reinforcing the idea of architecture as an interaction between art and artisanship. This innovative perspective inspires this paper, w...

  2. Regular cell design approach considering lithography-induced process variations

    OpenAIRE

    Gómez Fernández, Sergio

    2014-01-01

    The deployment delays for EUVL, forces IC design to continue using 193nm wavelength lithography with innovative and costly techniques in order to faithfully print sub-wavelength features and combat lithography induced process variations. The effect of the lithography gap in current and upcoming technologies is to cause severe distortions due to optical diffraction in the printed patterns and thus degrading manufacturing yield. Therefore, a paradigm shift in layout design is mandatory towards ...

  3. Higher order total variation regularization for EIT reconstruction.

    Science.gov (United States)

    Gong, Bo; Schullcke, Benjamin; Krueger-Ziolek, Sabine; Zhang, Fan; Mueller-Lisse, Ullrich; Moeller, Knut

    2018-01-08

    Electrical impedance tomography (EIT) attempts to reveal the conductivity distribution of a domain based on the electrical boundary condition. This is an ill-posed inverse problem; its solution is very unstable. Total variation (TV) regularization is one of the techniques commonly employed to stabilize reconstructions. However, it is well known that TV regularization induces staircase effects, which are not realistic in clinical applications. To reduce such artifacts, modified TV regularization terms considering a higher order differential operator were developed in several previous studies. One of them is called total generalized variation (TGV) regularization. TGV regularization has been successfully applied in image processing in a regular grid context. In this study, we adapted TGV regularization to the finite element model (FEM) framework for EIT reconstruction. Reconstructions using simulation and clinical data were performed. First results indicate that, in comparison to TV regularization, TGV regularization promotes more realistic images. Graphical abstract: Reconstructed conductivity changes located on selected vertical lines. For each of the reconstructed images as well as the ground truth image, conductivity changes located along the selected left and right vertical lines are plotted. In these plots, the notation GT in the legend stands for ground truth, TV stands for total variation method, and TGV stands for total generalized variation method. Reconstructed conductivity distributions from the GREIT algorithm are also demonstrated.
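
    For readers unfamiliar with total variation regularization, the following one-dimensional sketch (unrelated to the paper's FEM-based EIT solver) denoises a piecewise-constant signal with a smoothed TV penalty by plain gradient descent; TGV would add a higher-order term to suppress the staircase effect, which is omitted here for brevity.

    ```python
    import numpy as np

    # Minimal 1D sketch: denoise a piecewise-constant signal with a smoothed
    # total-variation penalty,
    #   min_x 0.5*||x - y||^2 + alpha * sum_i sqrt((x_{i+1} - x_i)^2 + eps),
    # by plain gradient descent. TGV would add a higher-order term (omitted here).
    rng = np.random.default_rng(2)
    x_true = np.concatenate([np.zeros(70), np.ones(60), 0.3 * np.ones(70)])
    n = x_true.size
    y = x_true + 0.1 * rng.standard_normal(n)

    alpha, eps, step = 0.2, 1e-2, 0.1
    x = y.copy()
    for _ in range(3000):
        d = np.diff(x)
        w = d / np.sqrt(d ** 2 + eps)      # derivative of the smoothed |d|
        grad_tv = np.zeros(n)
        grad_tv[:-1] -= w                  # contribution to x_i
        grad_tv[1:] += w                   # contribution to x_{i+1}
        x -= step * ((x - y) + alpha * grad_tv)

    print("RMSE noisy   :", round(float(np.sqrt(np.mean((y - x_true) ** 2))), 4))
    print("RMSE denoised:", round(float(np.sqrt(np.mean((x - x_true) ** 2))), 4))
    ```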

  4. Application of Turchin's method of statistical regularization

    Science.gov (United States)

    Zelenyi, Mikhail; Poliakova, Mariia; Nozik, Alexander; Khudyakov, Alexey

    2018-04-01

    During analysis of experimental data, one usually needs to restore a signal after it has been convoluted with some kind of apparatus function. According to Hadamard's definition this problem is ill-posed and requires regularization to provide sensible results. In this article we describe an implementation of the Turchin's method of statistical regularization based on the Bayesian approach to the regularization strategy.

  5. On the regularized fermionic projector of the vacuum

    Science.gov (United States)

    Finster, Felix

    2008-03-01

    We construct families of fermionic projectors with spherically symmetric regularization, which satisfy the condition of a distributional MP-product. The method is to analyze regularization tails with a power law or logarithmic scaling in composite expressions in the fermionic projector. The resulting regularizations break the Lorentz symmetry and give rise to a multilayer structure of the fermionic projector near the light cone. Furthermore, we construct regularizations which go beyond the distributional MP-product in that they yield additional distributional contributions supported at the origin. The remaining freedom for the regularization parameters and the consequences for the normalization of the fermionic states are discussed.

  6. On the regularized fermionic projector of the vacuum

    International Nuclear Information System (INIS)

    Finster, Felix

    2008-01-01

    We construct families of fermionic projectors with spherically symmetric regularization, which satisfy the condition of a distributional MP-product. The method is to analyze regularization tails with a power law or logarithmic scaling in composite expressions in the fermionic projector. The resulting regularizations break the Lorentz symmetry and give rise to a multilayer structure of the fermionic projector near the light cone. Furthermore, we construct regularizations which go beyond the distributional MP-product in that they yield additional distributional contributions supported at the origin. The remaining freedom for the regularization parameters and the consequences for the normalization of the fermionic states are discussed

  7. Review of revised Japanese seismic guidelines for Nuclear Power Plant design

    International Nuclear Information System (INIS)

    Kato, M.

    1987-01-01

    The development of aseismic design for nuclear power plants in Japan has evolved roughly in three stages. The first phase, which continued until 1978, was a period of progressive development, in which the design for each site and plant referred to the designs of predecessor plants and added the latest knowledge and experience of the time. The second phase, from the issuance of the 'Regulatory Guide for Aseismic Design of Nuclear Power Reactor Facilities' (1978, revised in 1981), or roughly 1978 to 1986, was a period in which the customary conservative design method continued to be applied while standardization of aseismic design technology proceeded; it was in this phase that new knowledge was accumulated through aseismic proof studies. The third phase represents a transition toward rational design, in which the conservative aseismic design technology has been reviewed in light of the new knowledge, revision of the above JEAG guideline has progressed for incorporation in design, and, on the other hand, by-laws of the Ministry of International Trade and Industry are being provided. This report reviews aseismic design and its guideline from the second phase onward and gives an overview of the revised JEAG - Recent Aseismic Design Method - and the by-laws, including the rationalization of aseismic design technology

  8. Residential Solar Design Review: A Manual on Community Architectural Controls and Solar Energy Use.

    Science.gov (United States)

    Jaffe, Martin; Erley, Duncan

    Presented are architectural design issues associated with solar energy use, and procedures for design review committees to consider in examining residential solar installation in light of existing aesthetic goals for their communities. Recommended design review criteria include the type of solar system being used and the ways in which the system…

  9. ALARA Design Review for the Resumption of the Plutonium Finishing Plant (PFP) Cementation Process Project Activities

    CERN Document Server

    Dayley, L

    2000-01-01

    The requirements for the performance of radiological design reviews are codified in 10CFR835, Occupational Radiation Protection. The basic requirements for the performance of ALARA design reviews are presented in the Hanford Site Radiological Control Manual (HSRCM). The HSRCM has established trigger levels requiring radiological reviews of non-routine or complex work activities. These requirements are implemented in site procedures HNF-PRO-1622 and 1623. HNF-PRO-1622 Radiological Design Review Process requires that ''radiological design reviews [be performed] of new facilities and equipment and modifications of existing facilities and equipment''. In addition, HNF-PRO-1623 Radiological Work Planning Process requires a formal ALARA Review for planned activities that are estimated to exceed 1 person-rem total Dose Equivalent (DE). The purpose of this review is to validate that the original design for the PFP Cementation Process ensures that the principles of ALARA (As Low As Reasonably Achievable) were included...

  10. Attitude of Regular and Itinerant Teachers Towards the Inclusion of Hearing Impairment Children

    Directory of Open Access Journals (Sweden)

    Kamal Parhoon

    2014-12-01

    Full Text Available Objectives: Inclusive education is a process of enabling all children to learn and participate effectively within mainstream school systems; it does not segregate children who have different abilities or needs. This article explores the attitudes of regular and itinerant teachers towards the inclusion of children with hearing impairment in their general-education schools. Methods: In a descriptive survey research design, the sample included 100 teachers (50 regular and 50 itinerant) selected randomly according to a multistage sampling method. Data were collected using a questionnaire with 32 questions regarding their attitudes. One-way analysis of variance and t-tests were performed to obtain between-group comparisons. Results: The results indicated positive teacher attitudes towards an inclusive educational system for students with hearing impairment. Significant differences in attitudes were observed based on teaching experience, gender, and level of teaching. The results also indicate that most teachers are agreeable to the inclusion of students with hearing impairment in their classrooms. Discussion: Successful inclusion of children with hearing impairment in regular classrooms entails positive attitudes of regular and itinerant teachers, supported by systematic programming within the classroom.

  11. Regularization modeling for large-eddy simulation

    NARCIS (Netherlands)

    Geurts, Bernardus J.; Holm, D.D.

    2003-01-01

    A new modeling approach for large-eddy simulation (LES) is obtained by combining a "regularization principle" with an explicit filter and its inversion. This regularization approach allows a systematic derivation of the implied subgrid model, which resolves the closure problem. The central role of

  12. Multiple Learning Strategies Project. Small Engine Repair Service. Regular Vocational. [Vol. 1.

    Science.gov (United States)

    Pitts, Jim; And Others

    This instructional package is one of two designed for use by regular vocational students in the vocational area of small engine repair service. Contained in this document are forty-four learning modules organized into ten units: engine block; air cleaner; starters; fuel tanks; lines, filters, and pumps; carburetors; electrical; magneto systems;…

  13. Spatially-Variant Tikhonov Regularization for Double-Difference Waveform Inversion

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Youzuo [Los Alamos National Laboratory; Huang, Lianjie [Los Alamos National Laboratory; Zhang, Zhigang [Los Alamos National Laboratory

    2011-01-01

    Double-difference waveform inversion is a potential tool for quantitative monitoring of geologic carbon storage. It jointly inverts time-lapse seismic data for changes in reservoir geophysical properties. Due to the ill-posedness of waveform inversion, it is a great challenge to obtain reservoir changes accurately and efficiently, particularly when using time-lapse seismic reflection data. Regularization techniques can be utilized to address the issue of ill-posedness. The regularization parameter controls the smoothness of inversion results. A constant regularization parameter is normally used in waveform inversion, and an optimal regularization parameter has to be selected. The resulting inversion results are a trade-off among regions with different smoothness or noise levels; therefore the images are over-regularized in some regions while under-regularized in others. In this paper, we employ a spatially-variant parameter in the Tikhonov regularization scheme used in double-difference waveform tomography to improve the inversion accuracy and robustness. We compare the results obtained using a spatially-variant parameter with those obtained using a constant regularization parameter and those produced without any regularization. We observe that, when utilizing a spatially-variant regularization scheme, the target regions are well reconstructed while the noise is reduced in the other regions. We show that the spatially-variant regularization scheme provides the flexibility to regularize local regions based on a priori information without increasing computational costs or computer memory requirements.
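
    The idea of a spatially-variant regularization weight can be illustrated with a small linear inversion; in the sketch below (not the authors' tomography code) a diagonal weight matrix applies strong smoothing everywhere except in a presumed target region, and the result is compared against a constant weight. The forward operator, weights, and model are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal sketch (not the authors' tomography code): Tikhonov-regularized
    # least squares  min_m ||G m - d||^2 + ||W L m||^2,  where W is a diagonal
    # matrix of spatially-variant weights, so different regions are smoothed
    # by different amounts. G, W, and the model are illustrative assumptions.
    rng = np.random.default_rng(3)
    n = 100
    G = rng.standard_normal((80, n)) / np.sqrt(n)          # stand-in forward operator
    m_true = np.concatenate([np.zeros(60), np.ones(40)])   # change confined to one region
    d = G @ m_true + 0.01 * rng.standard_normal(80)

    L = (np.eye(n, k=1) - np.eye(n))[:-1]                  # first-difference operator
    w = np.full(n - 1, 5.0)                                # strong smoothing by default
    w[55:] = 0.5                                           # weaker smoothing in the target region

    def solve(weights):
        K = np.vstack([G, np.diag(weights) @ L])
        rhs = np.concatenate([d, np.zeros(n - 1)])
        return np.linalg.lstsq(K, rhs, rcond=None)[0]

    m_variant = solve(w)
    m_constant = solve(np.full(n - 1, 5.0))
    for name, m in [("constant weight", m_constant), ("spatially-variant", m_variant)]:
        print(f"{name:18s} model error: {np.linalg.norm(m - m_true):.3f}")
    ```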

  14. NRC review of passive reactor design certification testing programs: Overview, progress, and regulatory perspective

    Energy Technology Data Exchange (ETDEWEB)

    Levin, A.E.

    1995-09-01

    New reactor designs, employing passive safety systems, are currently under development by reactor vendors for certification under the U.S. Nuclear Regulatory Commission's (NRC's) design certification rule. The vendors have established testing programs to support the certification of the passive designs, to meet regulatory requirements for demonstration of passive safety system performance. The NRC has, therefore, developed a process for the review of the vendors' testing programs and for incorporation of the results of those reviews into the safety evaluations for the passive plants. This paper discusses progress in the test program reviews, and also addresses unique regulatory aspects of those reviews.

  15. Design review report FFTF interim storage cask

    International Nuclear Information System (INIS)

    Scott, P.L.

    1995-01-01

    Final Design Review Report for the FFTF Interim Storage Cask. The Interim Storage Cask (ISC) will be used for long term above ground dry storage of FFTF irradiated fuel in Core Component Containers (CCC)s. The CCC has been designed and will house assemblies that have been sodium washed in the IEM Cell. The Solid Waste Cask (SWC) will transfer a full CCC from the IEM Cell to the RSB Cask Loading Station where the ISC will be located to receive it. Once the loaded ISC has been sealed at the RSB Cask Loading Station, it will be transferred by facility crane to the DSWC Transporter. After the ISC has been transferred to the Interim Storage Area (ISA), which is yet to be designed, a mobile crane will be used to place the ISC in its final storage location

  16. Formal design review report project W-151 mixer pump procurement

    Energy Technology Data Exchange (ETDEWEB)

    Crass, D.W.

    1997-01-21

    A formal design review for WHC-S-0040 was held on January 21, 1993. The review was completed January 29, 1993. No outstanding action items existed. Comments were recorded on Record Comment Record (RCR) forms and incorporated into the specification. The specification was considered acceptable, approved and issued as WHC-S-0040, Rev. 0 on March 4, 1993.

  17. Manifold Regularized Correlation Object Tracking.

    Science.gov (United States)

    Hu, Hongwei; Ma, Bo; Shen, Jianbing; Shao, Ling

    2018-05-01

    In this paper, we propose a manifold regularized correlation tracking method with augmented samples. To make better use of the unlabeled data and the manifold structure of the sample space, a manifold regularization-based correlation filter is introduced, which aims to assign similar labels to neighbor samples. Meanwhile, the regression model is learned by exploiting the block-circulant structure of matrices resulting from the augmented translated samples over multiple base samples cropped from both target and nontarget regions. Thus, the final classifier in our method is trained with positive, negative, and unlabeled base samples, which is a semisupervised learning framework. A block optimization strategy is further introduced to learn a manifold regularization-based correlation filter for efficient online tracking. Experiments on two public tracking data sets demonstrate the superior performance of our tracker compared with the state-of-the-art tracking approaches.
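
    The block-circulant ingredient of such trackers can be shown in a few lines: ridge regression over all cyclic shifts of a base sample has a closed-form solution in the Fourier domain. The sketch below covers only this basic correlation-filter building block in a 1D analogue; the manifold regularization term and the semisupervised sample handling described in the paper are not included.

    ```python
    import numpy as np

    # Minimal 1D sketch of the correlation-filter building block: ridge
    # regression over all cyclic shifts of a base sample x, solved in the
    # Fourier domain via the circulant structure (the manifold regularization
    # and semisupervised sample handling of the paper are not included).
    def train_filter(x, y, lam=1e-2):
        X, Y = np.fft.fft(x), np.fft.fft(y)
        return np.conj(X) * Y / (np.conj(X) * X + lam)   # (conjugated) filter in Fourier domain

    def respond(h_hat, z):
        return np.real(np.fft.ifft(h_hat * np.fft.fft(z)))

    rng = np.random.default_rng(4)
    n = 64
    x = rng.standard_normal(n)                       # base (target) sample
    d = np.minimum(np.arange(n), n - np.arange(n))   # circular distance from 0
    y = np.exp(-0.5 * (d / 2.0) ** 2)                # desired response: peak at shift 0

    h_hat = train_filter(x, y)
    z = np.roll(x, 7) + 0.05 * rng.standard_normal(n)   # new frame: target shifted by 7
    print("estimated shift:", int(np.argmax(respond(h_hat, z))))   # expect 7
    ```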

  18. Review of Y/OWI/TM-36: repository design performance in salt, granite, shale or basalt

    International Nuclear Information System (INIS)

    Talbot, R.; Nair, O.B.

    1979-09-01

    As part of the ongoing work by the Lawrence Livermore Laboratory to evaluate repository design performance, this memorandum presents a review of the preconceptual repository design described in Y/OWI/TM-36, Technical Support for GEIS: Radioactive Waste Isolation in Geologic Formations, April 1978. The purpose of this review is: to assess the adequacy of the design procedures and assumptions; to identify inappropriate or unsubstantiated design issues; to identify areas where additional numerical analyses may be required; and to develop data for inclusion in a reference repository design. The preconceptual repository design is presented in the form of 23 volumes of data base, analyses, and design layouts for four rock types: bedded salt, shale, granite and basalt. This memorandum reviews all four repository designs

  19. From recreational to regular drug use

    DEFF Research Database (Denmark)

    Järvinen, Margaretha; Ravn, Signe

    2011-01-01

    This article analyses the process of going from recreational use to regular and problematic use of illegal drugs. We present a model containing six career contingencies relevant for young people’s progress from recreational to regular drug use: the closing of social networks, changes in forms...

  20. Review of high fidelity imaging spectrometer design for remote sensing

    Science.gov (United States)

    Mouroulis, Pantazis; Green, Robert O.

    2018-04-01

    We review the design and assessment techniques that underlie a number of successfully deployed space and airborne imaging spectrometers that have been demonstrated to achieve demanding specifications in terms of throughput and response uniformity. The principles are illustrated with telescope designs as well as spectrometer examples from the Offner and Dyson families. We also show how the design space can be extended with the use of freeform surfaces and provide additional design examples with grating as well as prism dispersive elements.

  1. Regular variation on measure chains

    Czech Academy of Sciences Publication Activity Database

    Řehák, Pavel; Vitovec, J.

    2010-01-01

    Vol. 72, No. 1 (2010), pp. 439-448 ISSN 0362-546X R&D Projects: GA AV ČR KJB100190701 Institutional research plan: CEZ:AV0Z10190503 Keywords: regularly varying function * regularly varying sequence * measure chain * time scale * embedding theorem * representation theorem * second order dynamic equation * asymptotic properties Subject RIV: BA - General Mathematics Impact factor: 1.279, year: 2010 http://www.sciencedirect.com/science/article/pii/S0362546X09008475

  2. New regular black hole solutions

    International Nuclear Information System (INIS)

    Lemos, Jose P. S.; Zanchin, Vilson T.

    2011-01-01

    In the present work we consider general relativity coupled to Maxwell's electromagnetism and charged matter. Under the assumption of spherical symmetry, there is a particular class of solutions that correspond to regular charged black holes whose interior region is de Sitter, the exterior region is Reissner-Nordstroem and there is a charged thin-layer in-between the two. The main physical and geometrical properties of such charged regular black holes are analyzed.

  3. Manifold Regularized Correlation Object Tracking

    OpenAIRE

    Hu, Hongwei; Ma, Bo; Shen, Jianbing; Shao, Ling

    2017-01-01

    In this paper, we propose a manifold regularized correlation tracking method with augmented samples. To make better use of the unlabeled data and the manifold structure of the sample space, a manifold regularization-based correlation filter is introduced, which aims to assign similar labels to neighbor samples. Meanwhile, the regression model is learned by exploiting the block-circulant structure of matrices resulting from the augmented translated samples over multiple base samples cropped fr...

  4. On geodesics in low regularity

    Science.gov (United States)

    Sämann, Clemens; Steinbauer, Roland

    2018-02-01

    We consider geodesics in both Riemannian and Lorentzian manifolds with metrics of low regularity. We discuss existence of extremal curves for continuous metrics and present several old and new examples that highlight their subtle interrelation with solutions of the geodesic equations. Then we turn to the initial value problem for geodesics for locally Lipschitz continuous metrics and generalize recent results on existence, regularity and uniqueness of solutions in the sense of Filippov.

  5. An FPGA Implementation of (3,6)-Regular Low-Density Parity-Check Code Decoder

    Directory of Open Access Journals (Sweden)

    Tong Zhang

    2003-05-01

    Full Text Available Because of their excellent error-correcting performance, low-density parity-check (LDPC) codes have recently attracted a lot of attention. In this paper, we are interested in the practical LDPC code decoder hardware implementations. The direct fully parallel decoder implementation usually incurs too high hardware complexity for many real applications, thus partly parallel decoder design approaches that can achieve appropriate trade-offs between hardware complexity and decoding throughput are highly desirable. Applying a joint code and decoder design methodology, we develop a high-speed (3,k)-regular LDPC code partly parallel decoder architecture based on which we implement a 9216-bit, rate-1/2 (3,6)-regular LDPC code decoder on Xilinx FPGA device. This partly parallel decoder supports a maximum symbol throughput of 54 Mbps and achieves BER 10−6 at 2 dB over AWGN channel while performing maximum 18 decoding iterations.
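
    Decoding over a parity-check matrix can be illustrated, at a much smaller scale, with hard-decision bit flipping; the sketch below uses a toy (7,4) Hamming parity-check matrix rather than a (3,6)-regular LDPC code, and says nothing about the partly parallel FPGA architecture itself.

    ```python
    import numpy as np

    # Minimal sketch of hard-decision bit-flipping decoding over a parity-check
    # matrix H. Here H is the toy (7,4) Hamming code, not a (3,6)-regular LDPC
    # code, and this has nothing to do with the partly parallel FPGA datapath.
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def bit_flip_decode(r, H, max_iter=20):
        r = r.copy()
        for _ in range(max_iter):
            syndrome = (H @ r) % 2
            if not syndrome.any():
                break                      # all parity checks satisfied
            votes = syndrome @ H           # unsatisfied checks touching each bit
            r[votes == votes.max()] ^= 1   # flip the most-suspected bit(s)
        return r

    codeword = np.zeros(7, dtype=int)      # the all-zero codeword is always valid
    received = codeword.copy()
    received[2] ^= 1                       # inject a single bit error
    print("decoded:", bit_flip_decode(received, H))   # expect all zeros
    ```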

  6. Manifold Regularized Reinforcement Learning.

    Science.gov (United States)

    Li, Hongliang; Liu, Derong; Wang, Ding

    2018-04-01

    This paper introduces a novel manifold regularized reinforcement learning scheme for continuous Markov decision processes. Smooth feature representations for value function approximation can be automatically learned using the unsupervised manifold regularization method. The learned features are data-driven, and can be adapted to the geometry of the state space. Furthermore, the scheme provides a direct basis representation extension for novel samples during policy learning and control. The performance of the proposed scheme is evaluated on two benchmark control tasks, i.e., the inverted pendulum and the energy storage problem. Simulation results illustrate the concepts of the proposed scheme and show that it can obtain excellent performance.

  7. Interactive augmented reality system for product design review

    Science.gov (United States)

    Caruso, Giandomenico; Re, Guido Maria

    2010-01-01

    The product development process for industrial products includes a phase dedicated to design review, a crucial phase in which various experts cooperate in selecting the optimal product shape. Although computer graphics allows us to create very realistic virtual representations of products, it is not uncommon for designers to build physical mock-ups of their newly conceived products, because they need to physically interact with the prototype and to evaluate the product within a plurality of real contexts. This paper describes the hardware and software development of our Augmented Reality design review system, which allows us to overcome some issues related to 3D visualization and to interaction with virtual objects. Our system is composed of a Video See-Through Head Mounted Display, which improves 3D visualization by controlling the convergence of the video cameras automatically, and a wireless control system, which allows us to create metaphors for interacting with the virtual objects. During the development of the system, in order to define and tune the algorithms, we performed some testing sessions. We then performed further tests to verify the effectiveness of the system and to collect additional data and comments about usability and ergonomic aspects.

  8. Review of ASME-NH Design Materials for Creep-Fatigue

    International Nuclear Information System (INIS)

    Koo, Gyeong Hoi; Kim, Jong Bum

    2010-01-01

    To review and recommend candidate design materials for the Sodium-Cooled Fast Reactor, material sensitivity evaluations comparing design data among the ASME-NH materials were performed using the SIE ASME-NH computer program, which implements the material database of ASME-NH. The design material data provided by the ASME-NH code are the elastic modulus and yield strength, time-independent allowable stress intensity values, time-dependent allowable stress intensity values, expected minimum stress-to-rupture values, stress rupture factors for weldments, isochronous stress-strain curves, and design fatigue curves. Among these, the data related to creep-fatigue evaluation are investigated in this study

  9. Fish-inspired robots: design, sensing, actuation, and autonomy--a review of research.

    Science.gov (United States)

    Raj, Aditi; Thakur, Atul

    2016-04-13

    Underwater robot designs inspired by the behavior, physiology, and anatomy of fishes can provide enhanced maneuverability, stealth, and energy efficiency. Over the last two decades, robotics researchers have developed and reported a large variety of fish-inspired robot designs. The purpose of this review is to report different types of fish-inspired robot designs based upon their intended locomotion patterns. We present a detailed comparison of various design features like sensing, actuation, autonomy, waterproofing, and morphological structure of fish-inspired robots reported in the past decade. We believe that by studying the existing robots, future designers will be able to create new designs by adopting features from the successful robots. The review also summarizes the open research issues that need to be taken up for the further advancement of the field and also for the deployment of fish-inspired robots in practice.

  10. Laplacian manifold regularization method for fluorescence molecular tomography

    Science.gov (United States)

    He, Xuelei; Wang, Xiaodong; Yi, Huangjian; Chen, Yanrong; Zhang, Xu; Yu, Jingjing; He, Xiaowei

    2017-04-01

    Sparse regularization methods have been widely used in fluorescence molecular tomography (FMT) for stable three-dimensional reconstruction. Generally, ℓ1-regularization-based methods allow for utilizing the sparse nature of the target distribution. However, in addition to sparsity, spatial structure information should be exploited as well. A joint ℓ1 and Laplacian manifold regularization model is proposed to improve the reconstruction performance, and two algorithms (with and without the Barzilai-Borwein strategy) are presented to solve the regularization model. Numerical studies and an in vivo experiment demonstrate that the proposed gradient-projection-resolved Laplacian manifold regularization method for the joint model performed better than the comparative ℓ1-minimization algorithm in both spatial aggregation and location accuracy.
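
    A minimal sketch of a joint l1 plus Laplacian-regularized least-squares reconstruction (solved here with plain ISTA rather than the paper's gradient-projection or Barzilai-Borwein algorithms) is given below; the forward matrix, chain-graph Laplacian, and phantom are illustrative assumptions.

    ```python
    import numpy as np

    # Minimal sketch (not the paper's FMT solver): joint l1 + graph-Laplacian
    # regularized least squares,
    #   min_x 0.5*||A x - b||^2 + beta/2 * x^T L x + alpha * ||x||_1,
    # solved with plain ISTA. A, L (a chain Laplacian), and the phantom are
    # illustrative assumptions.
    rng = np.random.default_rng(5)
    n, m = 100, 60
    A = rng.standard_normal((m, n)) / np.sqrt(m)
    x_true = np.zeros(n)
    x_true[45:55] = 1.0                                   # one compact "fluorescent" blob
    b = A @ x_true + 0.01 * rng.standard_normal(m)

    L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # chain-graph Laplacian
    L[0, 0] = L[-1, -1] = 1

    alpha, beta = 0.05, 0.5
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + beta * np.linalg.norm(L, 2))

    x = np.zeros(n)
    for _ in range(800):
        grad = A.T @ (A @ x - b) + beta * (L @ x)          # gradient of the smooth part
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * alpha, 0.0)   # soft threshold
    print("reconstructed support:", np.flatnonzero(x > 0.2))
    ```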

  11. Learning Sparse Visual Representations with Leaky Capped Norm Regularizers

    OpenAIRE

    Wangni, Jianqiao; Lin, Dahua

    2017-01-01

    Sparsity inducing regularization is an important part for learning over-complete visual representations. Despite the popularity of $\ell_1$ regularization, in this paper, we investigate the usage of non-convex regularizations in this problem. Our contribution consists of three parts. First, we propose the leaky capped norm regularization (LCNR), which allows model weights below a certain threshold to be regularized more strongly as opposed to those above, therefore imposes strong sparsity and...

  12. Systematic reviews in context: highlighting systematic reviews relevant to Africa in the Pan African Medical Journal.

    Science.gov (United States)

    Wiysonge, Charles Shey; Kamadjeu, Raoul; Tsague, Landry

    2016-01-01

    Health research serves to answer questions concerning health and to accumulate facts (evidence) required to guide healthcare policy and practice. However, research designs vary and different types of healthcare questions are best answered by different study designs. For example, qualitative studies are best suited for answering questions about experiences and meaning; cross-sectional studies for questions concerning prevalence; cohort studies for questions regarding incidence and prognosis; and randomised controlled trials for questions on prevention and treatment. In each case, one study would rarely yield sufficient evidence on which to reliably base a healthcare decision. An unbiased and transparent summary of all existing studies on a given question (i.e. a systematic review) tells a better story than any one of the included studies taken separately. A systematic review enables producers and users of research to gauge what a new study has contributed to knowledge by setting the study's findings in the context of all previous studies investigating the same question. It is therefore inappropriate to initiate a new study without first conducting a systematic review to find out what can be learnt from existing studies. There is nothing new in taking account of earlier studies in either the design or interpretation of new studies. For example, in the 18th century James Lind conducted a clinical trial followed by a systematic review of contemporary treatments for scurvy; which showed fruits to be an effective treatment for the disease. However, surveys of the peer-reviewed literature continue to provide empirical evidence that systematic reviews are seldom used in the design and interpretation of the findings of new studies. Such indifference to systematic reviews as a research function is unethical, unscientific, and uneconomical. Without systematic reviews, limited resources are very likely to be squandered on ill-conceived research and policies. In order to

  13. Adaptive regularization of noisy linear inverse problems

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Madsen, Kristoffer Hougaard; Lehn-Schiøler, Tue

    2006-01-01

    In the Bayesian modeling framework there is a close relation between regularization and the prior distribution over parameters. For prior distributions in the exponential family, we show that the optimal hyper-parameter, i.e., the optimal strength of regularization, satisfies a simple relation: the expectation of the regularization function takes the same value in the posterior and the prior distribution. We present three examples: two simulations, and an application in fMRI neuroimaging.
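
    One concrete way to read the stated relation, assuming a Gaussian (ridge) prior with regularization function Omega(w) = ||w||^2/2 and a known noise level, is the fixed-point iteration sketched below; it illustrates the criterion only and is not the authors' fMRI application.

    ```python
    import numpy as np

    # Minimal sketch of the stated criterion for a Gaussian (ridge) prior with
    # Omega(w) = 0.5*||w||^2 and known noise precision beta: tune the prior
    # precision alpha so that E_posterior[Omega] = E_prior[Omega] = d/(2*alpha),
    # via a simple fixed-point iteration. Purely illustrative.
    rng = np.random.default_rng(6)
    n, d = 80, 10
    beta = 1.0 / 0.1 ** 2                                  # noise std 0.1
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    alpha = 1.0
    for _ in range(100):
        S = np.linalg.inv(alpha * np.eye(d) + beta * X.T @ X)   # posterior covariance
        m = beta * S @ X.T @ y                                  # posterior mean
        e_post = 0.5 * (m @ m + np.trace(S))                    # E_posterior[Omega(w)]
        alpha = d / (2.0 * e_post)                              # match E_prior[Omega] = d/(2*alpha)
    print("selected regularization strength alpha:", round(alpha, 3))
    ```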

  14. Design principles and developmental mechanisms underlying retinal mosaics.

    Science.gov (United States)

    Reese, Benjamin E; Keeley, Patrick W

    2015-08-01

    Most structures within the central nervous system (CNS) are composed of different types of neuron that vary in both number and morphology, but relatively little is known about the interplay between these two features, i.e. about the population dynamics of a given cell type. How such arrays of neurons are distributed within a structure, and how they differentiate their dendrites relative to each other, are issues that have recently drawn attention in the invertebrate nervous system, where the genetic and molecular underpinnings of these organizing principles are being revealed in exquisite detail. The retina is one of the few locations where these principles have been extensively studied in the vertebrate CNS, indeed, where the design principles of 'mosaic regularity' and 'uniformity of coverage' were first explicitly defined, quantified, and related to each other. Recent studies have revealed a number of genes that influence the formation of these histotypical features in the retina, including homologues of those invertebrate genes, although close inspection reveals that they do not always mediate comparable developmental processes nor elucidate fundamental design principles. The present review considers just how pervasive these features of 'mosaic regularity' and 'uniform dendritic coverage' are within the mammalian retina, discussing the means by which such features can be assessed in the mature and developing nervous system and examining the limitations associated with those assessments. We then address the extent to which these two design principles co-exist within different populations of neurons, and how they are achieved during development. Finally, we consider the neural phenotypes obtained in mutant nervous systems, to address whether a prospective gene of interest underlies those very design principles. © 2014 The Authors. Biological Reviews © 2014 Cambridge Philosophical Society.

  15. Restorative Virtual Environment Design for Augmenting Nursing Home Rehabilitation

    DEFF Research Database (Denmark)

    Bruun-Pedersen, Jon Ram; Serafin, Stefania; Kofoed, Lise

    2016-01-01

    With increasing age, muscle strength decreases excessively rapidly if physical activity is not maintained. However, physical activity is increasingly difficult with age, due to balance, strength or coordination difficulties, arthritis, etc. Moreover, many nursing home residents become unable to experience natural surroundings. Augmenting a conventional biking exercise with a recreational virtual environment (RVE) has been shown to serve as an intrinsic motivation contributor to exercise for nursing home residents. RVEs might be able to provide some of the health benefits that regular nature experiences do, but more studies on the content and design of proper custom RVEs are necessary. This paper reviews the background for RVE design, describes four custom RVE designs for recreational VE exploration, and presents user preferences among nursing home users concerning content and other pivotal...

  16. Exclusion of children with intellectual disabilities from regular ...

    African Journals Online (AJOL)

    The study investigated why teachers exclude children with intellectual disability from regular classrooms in Nigeria. Participants were 169 regular teachers randomly selected from Oyo and Ogun states. A questionnaire was used to collect data. Results revealed that 57.4% of regular teachers could not cope with children with ID ...

  17. A Review Of Design And Control Of Automated Guided Vehicle Systems

    NARCIS (Netherlands)

    T. Le-Anh (Tuan); M.B.M. de Koster (René)

    2004-01-01

    This paper presents a review on design and control of automated guided vehicle systems. We address most key related issues including guide-path design, estimating the number of vehicles, vehicle scheduling, idle-vehicle positioning, battery management, vehicle routing, and conflict

  18. On infinite regular and chiral maps

    OpenAIRE

    Arredondo, John A.; Valdez, Camilo Ramírez y Ferrán

    2015-01-01

    We prove that infinite regular and chiral maps take place on surfaces with at most one end. Moreover, we prove that an infinite regular or chiral map on an orientable surface with genus can only be realized on the Loch Ness monster, that is, the topological surface of infinite genus with one end.

  19. 29 CFR 779.18 - Regular rate.

    Science.gov (United States)

    2010-07-01

    ... employee under subsection (a) or in excess of the employee's normal working hours or regular working hours... Relating to Labor (Continued) WAGE AND HOUR DIVISION, DEPARTMENT OF LABOR STATEMENTS OF GENERAL POLICY OR... not less than one and one-half times their regular rates of pay. Section 7(e) of the Act defines...

  20. The design and application of effective written instructional material: a review of published work.

    Science.gov (United States)

    Mayberry, John F

    2007-09-01

    This review will consider the evidence base for the format of educational material, drawing on academic papers and the practice of the design industry. The core issues identified from the review are drawn together in guidelines for educational posters, text, and web-based material. The review deals with the design of written material for use in leaflets and books, as well as the impact of factors such as font type, font size, and colour on poster design. It sets these aspects of educational material within a research framework which looks at impact on learning and subsequent change in practice. These issues are examined through a practical example of a poster designed for a regional gastroenterology meeting.

  1. Results of readiness review for start of Title II Design of ESF in salt

    International Nuclear Information System (INIS)

    1986-01-01

    The Readiness Review Board recommends that the ESF Title II Design be initiated after approval of revised Functional Design Criteria for Title II design. This review was conducted assuming a Deaf Smith location for ESF. Seventy-four open items and eight technical holds were identified during the Readiness Review that must be addressed and resolved to ensure successful completion of the ESF Title II Design. These items include definition and approval of surface based, EDH, and subsurface testing requirements; development of an approved OCRWM/SRPO licensing position for the ESF; and acquisition and availability of site-specific confirmatory data. A Risk Assessment should be conducted to define corrective action data and technical, cost and schedule impacts and associated program risks of continuation of Title II design activities beyond those dates

  2. Product-oriented design theory for digital information services: A literature review.

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.; Kraaijenbrink, Jeroen

    2008-01-01

    Purpose – The purpose of this paper is to give a structured literature review, design concepts, and research propositions related to a product-oriented design theory for information services. Information services facilitate the exchange of information goods with or without transforming these goods.

  3. On convergence rates for iteratively regularized procedures with linear penalty terms

    International Nuclear Information System (INIS)

    Smirnova, Alexandra

    2012-01-01

    The impact of this paper is twofold. First, we study convergence rates of the iteratively regularized Gauss–Newton (IRGN) algorithm with a linear penalty term under a generalized source assumption and show how the regularizing properties of new iterations depend on the solution smoothness. Secondly, we introduce an adaptive IRGN procedure, which is investigated under a relaxed smoothness condition. The introduction and analysis of a more general penalty term are of great importance since, apart from bringing stability to the numerical scheme designed for solving a large class of applied inverse problems, it allows us to incorporate various types of a priori information available on the model. Both a priori and a posteriori stopping rules are investigated. For the a priori stopping rule, optimal convergence rates are derived. A numerical example illustrating convergence rates is considered. (paper)
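
    The IRGN update with a linear penalty term can be written down compactly; the sketch below applies it to an invented two-parameter toy problem with a geometrically decreasing regularization parameter and an a priori guess x0, purely to illustrate the form of the iteration.

    ```python
    import numpy as np

    # Minimal sketch of an iteratively regularized Gauss-Newton (IRGN) step with
    # a linear penalty operator L and a geometrically decaying alpha_k:
    #   x_{k+1} = x_k + (J'J + a_k L'L)^{-1} [J'(y - F(x_k)) - a_k L'L (x_k - x0)].
    # The toy forward map F is ours, not from the paper.
    def F(x):
        return np.array([x[0] ** 2 + x[1], x[0] + x[1] ** 3, x[0] * x[1]])

    def J(x):                                   # Jacobian of F
        return np.array([[2 * x[0], 1.0],
                         [1.0, 3 * x[1] ** 2],
                         [x[1], x[0]]])

    rng = np.random.default_rng(7)
    x_true = np.array([1.0, 2.0])
    y = F(x_true) + 1e-3 * rng.standard_normal(3)

    L = np.eye(2)                               # simple linear penalty operator
    x0 = np.array([0.8, 1.5])                   # a priori guess (also the start)
    x, alpha = x0.copy(), 1.0
    for k in range(15):
        Jk = J(x)
        lhs = Jk.T @ Jk + alpha * L.T @ L
        rhs = Jk.T @ (y - F(x)) - alpha * L.T @ L @ (x - x0)
        x = x + np.linalg.solve(lhs, rhs)
        alpha *= 0.5                            # decreasing regularization
    print("recovered x:", np.round(x, 3))       # expect roughly [1, 2]
    ```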

  4. Human factors engineering review for CRT screen design

    International Nuclear Information System (INIS)

    Yi, S. M.; Joo, C. Y.; Ra, J. C.

    1999-01-01

    The information interface between man and machine may be more important than hardware and workplace layout considerations. Transmitting and receiving data through this information interface can be characterized as a communication or interface problem. Management of the man-machine interface is essential for enhancing the information processing and decision-making capability of computer users working on real-time, demanding tasks. The design of a human-computer interface is not a rigid and static procedure; the content and context of each interface vary according to the specific application. The purpose of this study is to review the human factors design process for interfaces, to develop human factors guidelines for CRT screens, and to apply these to CRT screen design. (author)

  5. Demystifying Mixed Methods Research Design: A Review of the Literature

    OpenAIRE

    Gail D. Caruth

    2013-01-01

    Mixed methods research evolved in response to the observed limitations of both quantitative and qualitative designs and is a more complex method. The purpose of this paper was to examine mixed methods research in an attempt to demystify the design thereby allowing those less familiar with its design an opportunity to utilize it in future research. A review of the literature revealed that it has been gaining acceptance among researchers, researchers have begun using mixed methods research, it ...

  6. 20 CFR 226.14 - Employee regular annuity rate.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Employee regular annuity rate. 226.14 Section... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing an Employee Annuity § 226.14 Employee regular annuity rate. The regular annuity rate payable to the employee is the total of the employee tier I...

  7. Algebraic design theory

    CERN Document Server

    Launey, Warwick De

    2011-01-01

    Combinatorial design theory is a source of simply stated, concrete, yet difficult discrete problems, with the Hadamard conjecture being a prime example. It has become clear that many of these problems are essentially algebraic in nature. This book provides a unified vision of the algebraic themes which have developed so far in design theory. These include the applications in design theory of matrix algebra, the automorphism group and its regular subgroups, the composition of smaller designs to make larger designs, and the connection between designs with regular group actions and solutions to group ring equations. Everything is explained at an elementary level in terms of orthogonality sets and pairwise combinatorial designs--new and simple combinatorial notions which cover many of the commonly studied designs. Particular attention is paid to how the main themes apply in the important new context of cocyclic development. Indeed, this book contains a comprehensive account of cocyclic Hadamard matrices. The book...

  8. Regular algebra and finite machines

    CERN Document Server

    Conway, John Horton

    2012-01-01

    World-famous mathematician John H. Conway based this classic text on a 1966 course he taught at Cambridge University. Geared toward graduate students of mathematics, it will also prove a valuable guide to researchers and professional mathematicians.His topics cover Moore's theory of experiments, Kleene's theory of regular events and expressions, Kleene algebras, the differential calculus of events, factors and the factor matrix, and the theory of operators. Additional subjects include event classes and operator classes, some regulator algebras, context-free languages, communicative regular alg

  9. The poacher turned gamekeeper, or getting the most out of the design review process

    Science.gov (United States)

    Craig, Simon C.

    2010-07-01

    This paper presents an accumulation of knowledge from both sides of the design review table. Using experience gained over many reviews and post-mortems, some painful, some less painful; examining stakeholders' viewpoints and expectations; challenging aspects of accepted wisdom and posing awkward questions, the author brings out what he considers to be key criteria for a constructive design review. While this is not a guarantee of a successful outcome, it may nudge the balance from the reviews being an obligatory milestone (millstone?) towards them being a beneficial mechanism for project development.

  10. 39 CFR 6.1 - Regular meetings, annual meeting.

    Science.gov (United States)

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Regular meetings, annual meeting. 6.1 Section 6.1 Postal Service UNITED STATES POSTAL SERVICE THE BOARD OF GOVERNORS OF THE U.S. POSTAL SERVICE MEETINGS (ARTICLE VI) § 6.1 Regular meetings, annual meeting. The Board shall meet regularly on a schedule...

  11. Convergent and sequential synthesis designs: implications for conducting and reporting systematic reviews of qualitative and quantitative evidence.

    Science.gov (United States)

    Hong, Quan Nha; Pluye, Pierre; Bujold, Mathieu; Wassef, Maggy

    2017-03-23

    Systematic reviews of qualitative and quantitative evidence can provide a rich understanding of complex phenomena. This type of review is increasingly popular, has been used to provide a landscape of existing knowledge, and addresses the types of questions not usually covered in reviews relying solely on either quantitative or qualitative evidence. Although several typologies of synthesis designs have been developed, none have been tested on a large sample of reviews. The aim of this review of reviews was to identify and develop a typology of synthesis designs and methods that have been used and to propose strategies for synthesizing qualitative and quantitative evidence. A review of systematic reviews combining qualitative and quantitative evidence was performed. Six databases were searched from inception to December 2014. Reviews were included if they were systematic reviews combining qualitative and quantitative evidence. The included reviews were analyzed according to three concepts of synthesis processes: (a) synthesis methods, (b) sequence of data synthesis, and (c) integration of data and synthesis results. A total of 459 reviews were included. The analysis of this literature highlighted a lack of transparency in reporting how evidence was synthesized and a lack of consistency in the terminology used. Two main types of synthesis designs were identified: convergent and sequential synthesis designs. Within the convergent synthesis design, three subtypes were found: (a) data-based convergent synthesis design, where qualitative and quantitative evidence is analyzed together using the same synthesis method, (b) results-based convergent synthesis design, where qualitative and quantitative evidence is analyzed separately using different synthesis methods and results of both syntheses are integrated during a final synthesis, and (c) parallel-results convergent synthesis design consisting of independent syntheses of qualitative and quantitative evidence and an

  12. A regularized stationary mean-field game

    KAUST Repository

    Yang, Xianjin

    2016-01-01

    In the thesis, we discuss the existence and numerical approximations of solutions of a regularized mean-field game with a low-order regularization. In the first part, we prove a priori estimates and use the continuation method to obtain the existence of a solution with a positive density. Finally, we introduce the monotone flow method and solve the system numerically.

  13. A regularized stationary mean-field game

    KAUST Repository

    Yang, Xianjin

    2016-04-19

    In the thesis, we discuss the existence and numerical approximations of solutions of a regularized mean-field game with a low-order regularization. In the first part, we prove a priori estimates and use the continuation method to obtain the existence of a solution with a positive density. Finally, we introduce the monotone flow method and solve the system numerically.

  14. Optimal behaviour can violate the principle of regularity.

    Science.gov (United States)

    Trimmer, Pete C

    2013-07-22

    Understanding decisions is a fundamental aim of behavioural ecology, psychology and economics. The regularity axiom of utility theory holds that a preference between options should be maintained when other options are made available. Empirical studies have shown that animals violate regularity, but this has not been understood from a theoretical perspective; such decisions have therefore been labelled as irrational. Here, I use models of state-dependent behaviour to demonstrate that choices can violate regularity even when behavioural strategies are optimal. I also show that the range of conditions over which regularity should be violated can be larger when options do not always persist into the future. Consequently, utility theory--based on axioms including transitivity, regularity and the independence of irrelevant alternatives--is undermined, because even alternatives that are never chosen by an animal (in its current state) can be relevant to a decision.

  15. Advanced Mixed Waste Treatment Project melter system preliminary design technical review meeting

    Energy Technology Data Exchange (ETDEWEB)

    Eddy, T.L.; Raivo, B.D.; Soelberg, N.R.; Wiersholm, O.

    1995-02-01

    The Idaho National Engineering Laboratory Advanced Mixed Waste Treatment Project sponsored a plasma arc melter technical design review meeting to evaluate high-temperature melter system configurations for processing heterogeneous alpha-contaminated low-level radioactive waste (ALLW). Thermal processing experts representing Department of Energy contractors, the Environmental Protection Agency, and private sector companies participated in the review. The participants discussed issues and evaluated alternative configurations for three areas of the melter system design: plasma torch melters and graphite arc melters, offgas treatment options, and overall system configuration considerations. The Technical Advisory Committee for the review concluded that graphite arc melters are preferred over plasma torch melters for processing ALLW. Initiating involvement of stakeholders was considered essential at this stage of the design. For the offgas treatment system, the advisory committee raised the question whether to use a wet-dry or a dry-wet system. The committee recommended that the waste stream characterization, feed preparation, and the control system are essential design tasks for the high-temperature melter treatment system. The participants strongly recommended that a complete melter treatment system be assembled to conduct tests with nonradioactive surrogate waste material. A nonradioactive test bed would allow for inexpensive design and operational changes prior to assembling a system for radioactive waste treatment operations.

  16. Advanced Mixed Waste Treatment Project melter system preliminary design technical review meeting

    International Nuclear Information System (INIS)

    Eddy, T.L.; Raivo, B.D.; Soelberg, N.R.; Wiersholm, O.

    1995-02-01

    The Idaho National Engineering Laboratory Advanced Mixed Waste Treatment Project sponsored a plasma arc melter technical design review meeting to evaluate high-temperature melter system configurations for processing heterogeneous alpha-contaminated low-level radioactive waste (ALLW). Thermal processing experts representing Department of Energy contractors, the Environmental Protection Agency, and private sector companies participated in the review. The participants discussed issues and evaluated alternative configurations for three areas of the melter system design: plasma torch melters and graphite arc melters, offgas treatment options, and overall system configuration considerations. The Technical Advisory Committee for the review concluded that graphite arc melters are preferred over plasma torch melters for processing ALLW. Initiating involvement of stakeholders was considered essential at this stage of the design. For the offgas treatment system, the advisory committee raised the question whether to use a wet-dry or a dry-wet system. The committee recommended that the waste stream characterization, feed preparation, and the control system are essential design tasks for the high-temperature melter treatment system. The participants strongly recommended that a complete melter treatment system be assembled to conduct tests with nonradioactive surrogate waste material. A nonradioactive test bed would allow for inexpensive design and operational changes prior to assembling a system for radioactive waste treatment operations.

  17. Dimensional regularization in configuration space

    International Nuclear Information System (INIS)

    Bollini, C.G.; Giambiagi, J.J.

    1995-09-01

    Dimensional regularization is introduced in configuration space by Fourier transforming the perturbative momentum-space Green functions in D dimensions. For this transformation, Bochner's theorem is used; no extra parameters, such as those of Feynman or Bogoliubov-Shirkov, are needed for convolutions. The regularized causal functions in x-space have ν-dependent moderated singularities at the origin. They can be multiplied together and Fourier transformed (Bochner) without divergence problems. The usual ultraviolet divergences appear as poles of the resultant functions of ν. Several examples are discussed. (author). 9 refs

  18. Matrix regularization of 4-manifolds

    OpenAIRE

    Trzetrzelewski, M.

    2012-01-01

    We consider products of two 2-manifolds such as S^2 x S^2, embedded in Euclidean space and show that the corresponding 4-volume preserving diffeomorphism algebra can be approximated by a tensor product SU(N)xSU(N) i.e. functions on a manifold are approximated by the Kronecker product of two SU(N) matrices. A regularization of the 4-sphere is also performed by constructing N^2 x N^2 matrix representations of the 4-algebra (and as a byproduct of the 3-algebra which makes the regularization of S...

  19. Regular Breakfast and Blood Lead Levels among Preschool Children

    Directory of Open Access Journals (Sweden)

    Needleman Herbert

    2011-04-01

    Full Text Available Abstract Background Previous studies have shown that fasting increases lead absorption in the gastrointestinal tract of adults. Regular meals/snacks are recommended as a nutritional intervention for lead poisoning in children, but epidemiological evidence of links between fasting and blood lead levels (B-Pb) is rare. The purpose of this study was to examine the association between eating a regular breakfast and B-Pb among children using data from the China Jintan Child Cohort Study. Methods Parents completed a questionnaire regarding children's breakfast-eating habit (regular or not), demographics, and food frequency. Whole blood samples were collected from 1,344 children for the measurements of B-Pb and micronutrients (iron, copper, zinc, calcium, and magnesium). B-Pb and other measures were compared between children with and without regular breakfast. Linear regression modeling was used to evaluate the association between regular breakfast and log-transformed B-Pb. The association between regular breakfast and risk of lead poisoning (B-Pb ≥ 10 μg/dL) was examined using logistic regression modeling. Results Median B-Pb among children who ate breakfast regularly and those who did not eat breakfast regularly were 6.1 μg/dL and 7.2 μg/dL, respectively. Eating breakfast was also associated with greater zinc blood levels. Adjusting for other relevant factors, the linear regression model revealed that eating breakfast regularly was significantly associated with lower B-Pb (beta = -0.10 units of log-transformed B-Pb compared with children who did not eat breakfast regularly, p = 0.02). Conclusion The present study provides some initial human data supporting the notion that eating a regular breakfast might reduce B-Pb in young children. To our knowledge, this is the first human study exploring the association between breakfast frequency and B-Pb in young children.
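
    The two regression steps described above can be sketched in a few lines; the data frame below is simulated placeholder data, not the cohort data, and the covariates are chosen only to show the model structure.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm

    # Placeholder data standing in for the cohort variables (not the study's data).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "regular_breakfast": rng.integers(0, 2, n),
        "age": rng.uniform(3, 6, n),
        "blood_lead": rng.lognormal(mean=1.9, sigma=0.4, size=n),
    })

    X = sm.add_constant(df[["regular_breakfast", "age"]])

    # Linear regression on log-transformed blood lead.
    ols_fit = sm.OLS(np.log(df["blood_lead"]), X).fit()

    # Logistic regression for elevated blood lead (>= 10 ug/dL).
    logit_fit = sm.Logit((df["blood_lead"] >= 10).astype(int), X).fit(disp=0)

    print(ols_fit.params["regular_breakfast"], logit_fit.params["regular_breakfast"])
    ```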

  20. Increasing work-time influence: consequences for flexibility, variability, regularity and predictability.

    Science.gov (United States)

    Nabe-Nielsen, Kirsten; Garde, Anne Helene; Aust, Birgit; Diderichsen, Finn

    2012-01-01

    This quasi-experimental study investigated how an intervention aiming at increasing eldercare workers' influence on their working hours affected the flexibility, variability, regularity and predictability of the working hours. We used baseline (n = 296) and follow-up (n = 274) questionnaire data and interviews with intervention-group participants (n = 32). The work units in the intervention group designed their own intervention comprising either implementation of computerised self-scheduling (subgroup A), collection of information about the employees' work-time preferences by questionnaires (subgroup B), or discussion of working hours (subgroup C). Only computerised self-scheduling changed the working hours and the way they were planned. These changes implied more flexible but less regular working hours and an experience of less predictability and less continuity in the care of clients and in the co-operation with colleagues. In subgroup B and C, the participants ended up discussing the potential consequences of more work-time influence without actually implementing any changes. Employee work-time influence may buffer the adverse effects of shift work. However, our intervention study suggested that while increasing the individual flexibility, increasing work-time influence may also result in decreased regularity of the working hours and less continuity in the care of clients and co-operation with colleagues.

  1. On the equivalence of different regularization methods

    International Nuclear Information System (INIS)

    Brzezowski, S.

    1985-01-01

    The R-circumflex operation preceded by the regularization procedure is discussed. Some arguments are given, according to which the results may depend on the method of regularization, introduced in order to avoid divergences in perturbation calculations. 10 refs. (author)

  2. Accreting fluids onto regular black holes via Hamiltonian approach

    Energy Technology Data Exchange (ETDEWEB)

    Jawad, Abdul [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan); Shahzad, M.U. [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan); University of Central Punjab, CAMS, UCP Business School, Lahore (Pakistan)

    2017-08-15

    We investigate the accretion of test fluids onto regular black holes such as Kehagias-Sfetsos black holes and regular black holes with a Dagum distribution function. We analyze the accretion process when different test fluids are falling onto these regular black holes. The accreting fluid is classified through the equation of state according to the features of the regular black holes. The behavior of the fluid flow and the existence of sonic points are checked for these regular black holes. It is noted that the three-velocity depends on critical points and the equation of state parameter on phase space. (orig.)

  3. Regularity and chaos in Vlasov evolution of nuclear matter

    Energy Technology Data Exchange (ETDEWEB)

    Jacquot, B.; Guarnera, A.; Chomaz, Ph.; Colonna, M.

    1995-12-31

    A careful analysis of the mean-field dynamics inside the spinodal instability region is performed. It is shown that, contrary to some recently published results, the mean-field evolution appears mostly regular over a long time scale, while some disorder is observed only very late, when fragments are already formed. This onset of chaos can be related to the fragment interaction, which induces some coalescence effects. Moreover, it is shown that the time scales over which chaos starts to develop are very sensitive to the range of the considered force. All the presented results support the various analyses of spinodal instabilities obtained using stochastic mean-field approaches. (author). 16 refs. Submitted to Physical Review, C (US).

  4. Parameter choice in Banach space regularization under variational inequalities

    International Nuclear Information System (INIS)

    Hofmann, Bernd; Mathé, Peter

    2012-01-01

    The authors study parameter choice strategies for the Tikhonov regularization of nonlinear ill-posed problems in Banach spaces. The effectiveness of any parameter choice for obtaining convergence rates depends on the interplay of the solution smoothness and the nonlinearity structure, and it can be expressed concisely in terms of variational inequalities. Such inequalities are link conditions between the penalty term, the norm misfit and the corresponding error measure. The parameter choices under consideration include an a priori choice, the discrepancy principle as well as the Lepskii principle. For the convenience of the reader, the authors review in an appendix a few instances where the validity of a variational inequality can be established. (paper)
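
    For the linear, Hilbert-space special case, the a posteriori discrepancy principle mentioned above can be illustrated directly; the snippet is a simplified sketch (linear operator, known noise level, geometric search over the parameter) rather than the Banach-space setting or the Lepskii principle analyzed in the paper.

    ```python
    import numpy as np

    def tikhonov(A, y, alpha):
        """Minimizer of ||Ax - y||^2 + alpha ||x||^2."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ y)

    def discrepancy_principle(A, y, delta, tau=1.1, alpha=1.0, q=0.7):
        """Decrease alpha geometrically until the residual drops below tau * delta."""
        while True:
            x = tikhonov(A, y, alpha)
            if np.linalg.norm(A @ x - y) <= tau * delta or alpha < 1e-12:
                return x, alpha
            alpha *= q

    rng = np.random.default_rng(1)
    A = rng.standard_normal((50, 20))
    x_true = rng.standard_normal(20)
    noise = 0.05 * rng.standard_normal(50)
    y = A @ x_true + noise
    x_hat, alpha_hat = discrepancy_principle(A, y, delta=np.linalg.norm(noise))
    ```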

  5. ITER final design report, cost review and safety analysis (FDR) and relevant documents

    International Nuclear Information System (INIS)

    1999-01-01

    This volume contains the fourth major milestone report and documents associated with its acceptance, review and approval. This ITER Final Design Report, Cost Review and Safety Analysis was presented to the ITER Council at its 13th meeting in February 1998 and was approved at its extraordinary meeting on 25 June 1998. The contents include an outline of the ITER objectives, the ITER parameters and design overview as well as operating scenarios and plasma performance. Furthermore, design features, safety and environmental characteristics and schedule and cost estimates are given

  6. A selective review of knowledge-based approaches to database design

    Directory of Open Access Journals (Sweden)

    Shahrul Azman Noah

    1995-01-01

    Full Text Available The inclusion of real-world knowledge or specialised knowledge has not been addressed by the majority of the systems reviewed. ODA has real-world knowledge provided by using a thesaurus-type structure to represent generic models. Only NITDT includes specialised knowledge in its knowledge base. NITDT classified its knowledge into application-specific, domain-specific and general knowledge. However, the literature does not discuss in detail how this knowledge is applied during the design session. One of the key factors that distinguish computer-based expert systems from human experts is that the latter apply not only their specialised expertise to a problem but also their general knowledge of the world. NITDT is the only system reviewed here that holds any form of internal domain-specific knowledge, which can be easily augmented, enriched and updated as required. This knowledge allows the designer to be an active participant along with the user in the design process and significantly eases the user's task. The inclusion of real-world knowledge and specialised knowledge is an area that must be further addressed before intelligent tools are able to offer a realistic level of assistance to human designers.

  7. SRNL Review And Assessment Of WTP UFP-02 Sparger Design And Testing

    Energy Technology Data Exchange (ETDEWEB)

    Poirier, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Duignan, M. R. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Fink, S. D. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Steimke, J. L. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2014-03-24

    During aerosol testing conducted by Parsons Constructors and Fabricators, Inc. (PCFI), air sparger plugging was observed in small-scale and medium-scale testing. Because of this observation, personnel identified a concern that the steam spargers in Pretreatment Facility vessel UFP-02 could plug during Waste Treatment and Immobilization Plant (WTP) operation. The U. S. Department of Energy (DOE) requested that Savannah River National Laboratory (SRNL) provide consultation on the evaluation of known WTP bubbler and air and steam sparger issues. The authors used the following approach for this task: reviewed previous test reports (including small-scale testing, medium-scale testing, and Pretreatment Engineering Platform [PEP] testing), met with Bechtel National, Inc. (BNI) personnel to discuss sparger design, reviewed BNI documents supporting the sparger design, discussed sparger experience with Savannah River Site Defense Waste Processing Facility (DWPF) and Sellafield personnel, talked to sparger manufacturers about relevant operating experience and design issues, and reviewed UFP-02 vessel and sparger drawings.

  8. Editorial Note: Reevaluating Book Reviews: As Scientific Contributions

    Directory of Open Access Journals (Sweden)

    Günter Mey

    2000-12-01

    Full Text Available In the first part of this text, I would like to describe some advantages book reviews offer. Book reviews—provided that they succeed in offering more than just a short content description to the reader—can contribute to scientific discourses in a similar way as regular contributions do. One of the reasons why book reviews currently often do not fulfil this possible function is the existing restrictions within traditional print media publishing. Also worth mentioning are current standards within the scientific community which tend to underestimate the value of book reviews or review essays. In the second part, I will discuss some developmental potentials of book reviews which up to now have hardly been recognized: especially with the Internet and its characteristics (nearly unlimited space, flexible publishing time and design of the contributions, and the chance for a direct exchange between researchers, for example using discussion boards), a re-evaluation of book reviews and review essays seems possible and reasonable. URN: urn:nbn:de:0114-fqs0003400

  9. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    International Nuclear Information System (INIS)

    JANICEK, G.P.

    2000-01-01

    Report documenting the Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks, with a focus on Safety Class design features and control requirements for operation in a flammable gas environment and for air discharge permitting compliance.

  10. Design Review Report for formal review of safety class features of exhauster system for rotary mode core sampling

    Energy Technology Data Exchange (ETDEWEB)

    JANICEK, G.P.

    2000-06-08

    Report documenting the Formal Design Review conducted on portable exhausters used to support rotary mode core sampling of Hanford underground radioactive waste tanks, with a focus on Safety Class design features and control requirements for operation in a flammable gas environment and for air discharge permitting compliance.

  11. Bounded Perturbation Regularization for Linear Least Squares Estimation

    KAUST Repository

    Ballal, Tarig

    2017-10-18

    This paper addresses the problem of selecting the regularization parameter for linear least-squares estimation. We propose a new technique called bounded perturbation regularization (BPR). In the proposed BPR method, a perturbation with a bounded norm is allowed into the linear transformation matrix to improve the singular-value structure. Following this, the problem is formulated as a min-max optimization problem. Next, the min-max problem is converted to an equivalent minimization problem to estimate the unknown vector quantity. The solution of the minimization problem is shown to converge to that of the ℓ2 -regularized least squares problem, with the unknown regularizer related to the norm bound of the introduced perturbation through a nonlinear constraint. A procedure is proposed that combines the constraint equation with the mean squared error (MSE) criterion to develop an approximately optimal regularization parameter selection algorithm. Both direct and indirect applications of the proposed method are considered. Comparisons with different Tikhonov regularization parameter selection methods, as well as with other relevant methods, are carried out. Numerical results demonstrate that the proposed method provides significant improvement over state-of-the-art methods.
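
    The ℓ2-regularized least-squares problem that the BPR solution converges to can be solved cheaply through the SVD, and a sweep of the regularizer against a held-out residual gives a crude stand-in for a selection rule. The sketch below illustrates only that baseline; the hold-out criterion is an assumption for the example and is not the MSE-based rule proposed in the paper.

    ```python
    import numpy as np

    def ridge_svd(A, y, gamma):
        """Solve min_x ||Ax - y||^2 + gamma ||x||^2 via the SVD of A."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        filt = s / (s ** 2 + gamma)          # filtered singular values
        return Vt.T @ (filt * (U.T @ y))

    rng = np.random.default_rng(2)
    A = rng.standard_normal((80, 40))
    x_true = rng.standard_normal(40)
    y = A @ x_true + 0.1 * rng.standard_normal(80)

    # Crude parameter selection: sweep gamma and score on a held-out block.
    A_fit, A_val, y_fit, y_val = A[:60], A[60:], y[:60], y[60:]
    gammas = np.logspace(-4, 2, 25)
    val_err = [np.linalg.norm(A_val @ ridge_svd(A_fit, y_fit, g) - y_val) for g in gammas]
    best_gamma = gammas[int(np.argmin(val_err))]
    x_hat = ridge_svd(A, y, best_gamma)
    ```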

  12. Book review. Design for Care: Innovating Healthcare Experience

    Directory of Open Access Journals (Sweden)

    Manuela Aguirre Ulloa

    2014-12-01

    Full Text Available Adapted from a review of the same book published by The Design Observer Group on April 4th, 2014. You can access the original publication online at http://designobserver.com/feature/design-for-care/38382/ Peter Jones' recently published book represents a timely and comprehensive view of the value design brings to healthcare innovation. The book uses an empathic user story that conveys emotions and life to a structure that embraces the different meanings of Design for Care: spanning from caring at the personal level to large-scale caring systems. The author has a main objective for each of its three main target audiences: designers, companies, and healthcare teams. Firstly, it allows designers to understand healthcare in a holistic and patient-centered way, breaking down specialized silos. Secondly, it shows how to design better care experiences across care continuums. Consequently, for companies serving the healthcare sector, the book presents how to humanize information technology (IT) and services and meet the needs of health seekers. Finally, the book aims to inform healthcare teams (clinical practitioners and administrators) of the value design brings to research, co-creation and implementation of user and organizational experiences. It also proposes that healthcare teams learn and adopt design and systems thinking techniques so their innovation processes can be more participatory, holistic and user-centered.

  13. The Impact of Environmental Design on Teamwork and Communication in Healthcare Facilities: A Systematic Literature Review.

    Science.gov (United States)

    Gharaveis, Arsalan; Hamilton, D Kirk; Pati, Debajyoti

    2018-01-01

    The purpose of this systematic review is to investigate the current knowledge about the impact of healthcare facility design on teamwork and communication by exploring the relevant literature. Teamwork and communication are behavioral factors that are impacted by physical design. However, the effects of environmental factors on teamwork and communication have not been investigated extensively in the healthcare design literature. There are no published systematic reviews on the current topic. Searches were conducted in PubMed and Google Scholar databases in addition to targeted design journals including Health Environments Research & Design, Environment and Behavior, Environmental Psychology, and Applied Ergonomics. Inclusion criteria were (a) full-text English language articles related to teamwork and communication and (b) involving any healthcare built environment and space design published in peer-reviewed journals between 1984 and 2017. Studies were extracted using defined inclusion and exclusion criteria. In the first phase, 26 of the 195 articles were identified as most relevant to teamwork, and 19 of the 147 studies were identified and reviewed to understand the impact of communication in healthcare facilities. The literature regarding the impact of the built environment on teamwork and communication was reviewed and explored in detail. Eighteen studies were selected and succinctly summarized as the final product of this review. Environmental design, which involves nurses, support staff, and physicians, is one of the critical factors that promotes the efficiency of teamwork and collaborative communication. Layout design, visibility, and accessibility levels are the most cited aspects of design which can affect the level of communication and teamwork in healthcare facilities.

  14. MRI reconstruction with joint global regularization and transform learning.

    Science.gov (United States)

    Tanc, A Korhan; Eksioglu, Ender M

    2016-10-01

    Sparsity-based regularization has been a popular approach to remedy the measurement scarcity in image reconstruction. Recently, sparsifying transforms learned from image patches have been utilized as an effective regularizer for Magnetic Resonance Imaging (MRI) reconstruction. Here, we infuse additional global regularization terms into the patch-based transform learning. We develop an algorithm to solve the resulting novel cost function, which includes both patchwise and global regularization terms. Extensive simulation results indicate that the introduced mixed approach has improved MRI reconstruction performance, when compared to the algorithms which use either of the patchwise transform learning or global regularization terms alone. Copyright © 2016 Elsevier Ltd. All rights reserved.
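
    A generic form of such a mixed objective, written here only to fix ideas (the specific global term, weights, and constraints are assumptions for the sketch, not the paper's exact formulation), is:

    ```latex
    \min_{x,\,W,\,\{z_j\}} \;
      \|F_u x - y\|_2^2
      \;+\; \lambda_G \, \mathrm{TV}(x)
      \;+\; \sum_j \left( \|W P_j x - z_j\|_2^2 + \gamma \|z_j\|_0 \right)
      \quad \text{s.t.} \quad W^{H} W = I ,
    ```

    where F_u is the undersampled Fourier encoding operator, y the acquired k-space data, P_j the operator extracting the j-th image patch, W the learned sparsifying transform with sparse codes z_j, and TV(x) a global total-variation-type regularizer.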

  15. Exposure to regular gasoline and ethanol oxyfuel during refueling in Alaska.

    OpenAIRE

    Backer, L C; Egeland, G M; Ashley, D L; Lawryk, N J; Weisel, C P; White, M C; Bundy, T; Shortt, E; Middaugh, J P

    1997-01-01

    Although most people are thought to receive their highest acute exposures to gasoline while refueling, relatively little is actually known about personal, nonoccupational exposures to gasoline during refueling activities. This study was designed to measure exposures associated with the use of an oxygenated fuel under cold conditions in Fairbanks, Alaska. We compared concentrations of gasoline components in the blood and in the personal breathing zone (PBZ) of people who pumped regular unleade...

  16. From random to regular: Neural constraints on the emergence of isochronous rhythm during cultural transmission

    DEFF Research Database (Denmark)

    Lumaca, Massimo; Haumann, Niels Trusbak; Brattico, Elvira

    2017-01-01

    A core design feature of human communication systems and expressive behaviours is their temporal organization. The cultural evolutionary origins of this feature remain unclear. Here, we test the hypothesis that regularities in the temporal organization of signalling sequences arise in the course...

  17. Investigating Peer Review as a Systemic Pedagogy for Developing the Design Knowledge, Skills, and Dispositions of Novice Instructional Design Students

    Science.gov (United States)

    Brill, Jennifer M.

    2016-01-01

    This research investigated peer review as a contemporary instructional pedagogy for fostering the design knowledge, skills, and dispositions of novice Instructional Design and Technology (IDT) professionals. Participants were graduate students enrolled in an introductory instructional design (ID) course. Survey, artifact, and observation data were…

  18. Strictly-regular number system and data structures

    DEFF Research Database (Denmark)

    Elmasry, Amr Ahmed Abd Elmoneim; Jensen, Claus; Katajainen, Jyrki

    2010-01-01

    We introduce a new number system that we call the strictly-regular system, which efficiently supports the operations: digit-increment, digit-decrement, cut, concatenate, and add. Compared to other number systems, the strictly-regular system has distinguishable properties. It is superior to the re...

  19. Analysis of regularized Navier-Stokes equations, 2

    Science.gov (United States)

    Ou, Yuh-Roung; Sritharan, S. S.

    1989-01-01

    A practically important regularization of the Navier-Stokes equations was analyzed. As a continuation of the previous work, the structure of the attractors characterizing the solutions was studied. Local as well as global invariant manifolds were found. Regularity properties of these manifolds are analyzed.

  20. Regularities, Natural Patterns and Laws of Nature

    Directory of Open Access Journals (Sweden)

    Stathis Psillos

    2014-02-01

    Full Text Available  The goal of this paper is to sketch an empiricist metaphysics of laws of nature. The key idea is that there are regularities without regularity-enforcers. Differently put, there are natural laws without law-makers of a distinct metaphysical kind. This sketch will rely on the concept of a natural pattern and more significantly on the existence of a network of natural patterns in nature. The relation between a regularity and a pattern will be analysed in terms of mereology.  Here is the road map. In section 2, I will briefly discuss the relation between empiricism and metaphysics, aiming to show that an empiricist metaphysics is possible. In section 3, I will offer arguments against stronger metaphysical views of laws. Then, in section 4 I will motivate nomic objectivism. In section 5, I will address the question ‘what is a regularity?’ and will develop a novel answer to it, based on the notion of a natural pattern. In section 6, I will raise the question: ‘what is a law of nature?’, the answer to which will be: a law of nature is a regularity that is characterised by the unity of a natural pattern.

  1. Design and manufacturing challenges of optogenetic neural interfaces: a review

    Science.gov (United States)

    Goncalves, S. B.; Ribeiro, J. F.; Silva, A. F.; Costa, R. M.; Correia, J. H.

    2017-08-01

    Optogenetics is a relatively new technology to achieve cell-type specific neuromodulation with millisecond-scale temporal precision. Optogenetic tools are being developed to address neuroscience challenges, and to improve the knowledge about brain networks, with the ultimate aim of catalyzing new treatments for brain disorders and diseases. To reach this ambitious goal the implementation of mature and reliable engineered tools is required. The success of optogenetics relies on optical tools that can deliver light into the neural tissue. Objective/Approach: Here, the design and manufacturing approaches available to the scientific community are reviewed, and current challenges to accomplish appropriate scalable, multimodal and wireless optical devices are discussed. Significance: Overall, this review aims at presenting a helpful guidance to the engineering and design of optical microsystems for optogenetic applications.

  2. Designing and Integrating Purposeful Learning in Game Play: A Systematic Review

    Science.gov (United States)

    Ke, Fengfeng

    2016-01-01

    Via a systematic review of the literature on learning games, this article presents a systematic discussion on the design of intrinsic integration of domain-specific learning in game mechanics and game world design. A total of 69 articles ultimately met the inclusion criteria and were coded for the literature synthesis. Exemplary learning games…

  3. Experience in the review of utility control room design review and safety parameter display system programs

    International Nuclear Information System (INIS)

    Moore, V.A.

    1985-01-01

    The Detailed Control Room Design Review (DCRDR) and the Safety Parameter Display System (SPDS) had their origins in the studies and investigations conducted as the result of the TMI-2 accident. The President's Commission (Kemeny Commission) criticized NRC for not examining the man-machine interface, over-emphasizing equipment, ignoring human beings, and tolerating outdated technology in control rooms. The Commission's Special Inquiry Group (Rogovin Report) recommended greater application of human factors engineering including better instrumentation displays and improved control room design. The NRC Lessons Learned Task Force concluded that licensees should review and improve control rooms using NRC human engineering guidelines, and install safety parameter display systems (then called the safety staff vector). TMI Action Plan Items I.D.1 and I.D.2 were based on these recommendations.

  4. Regularization of the Boundary-Saddle-Node Bifurcation

    Directory of Open Access Journals (Sweden)

    Xia Liu

    2018-01-01

    Full Text Available In this paper we treat a particular class of planar Filippov systems which consist of two smooth systems that are separated by a discontinuity boundary. In such systems one vector field undergoes a saddle-node bifurcation while the other vector field is transversal to the boundary. The boundary-saddle-node (BSN bifurcation occurs at a critical value when the saddle-node point is located on the discontinuity boundary. We derive a local topological normal form for the BSN bifurcation and study its local dynamics by applying the classical Filippov’s convex method and a novel regularization approach. In fact, by the regularization approach a given Filippov system is approximated by a piecewise-smooth continuous system. Moreover, the regularization process produces a singular perturbation problem where the original discontinuous set becomes a center manifold. Thus, the regularization enables us to make use of the established theories for continuous systems and slow-fast systems to study the local behavior around the BSN bifurcation.
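
    The regularization step referred to above is commonly written in the Sotomayor–Teixeira form; the transition function below is one standard choice and is stated here only as an illustrative sketch, not necessarily the exact construction used in the paper:

    ```latex
    Z_\varepsilon(p) \;=\;
      \frac{1+\varphi\!\left(h(p)/\varepsilon\right)}{2}\, X(p)
      \;+\;
      \frac{1-\varphi\!\left(h(p)/\varepsilon\right)}{2}\, Y(p),
    ```

    where X and Y are the two smooth vector fields, h is the switching function whose zero set is the discontinuity boundary, and φ is a smooth monotone function with φ(s) = sign(s) for |s| ≥ 1. As ε → 0, the piecewise-smooth continuous field Z_ε recovers the original Filippov system, which is what allows slow-fast and center-manifold arguments to be applied near the BSN point.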

  5. Low-Complexity Regularization Algorithms for Image Deblurring

    KAUST Repository

    Alanazi, Abdulrahman

    2016-11-01

    Image restoration problems deal with images in which information has been degraded by blur or noise. In practice, the blur is usually caused by atmospheric turbulence, motion, camera shake, and several other mechanical or physical processes. In this study, we present two regularization algorithms for the image deblurring problem. We first present a new method based on solving a regularized least-squares (RLS) problem. This method is proposed to find a near-optimal value of the regularization parameter in the RLS problems. Experimental results on the non-blind image deblurring problem are presented. In all experiments, comparisons are made with three benchmark methods. The results demonstrate that the proposed method clearly outperforms the other methods in terms of both the output PSNR and structural similarity, as well as the visual quality of the deblurred images. To reduce the complexity of the proposed algorithm, we propose a technique based on the bootstrap method to estimate the regularization parameter in low and high-resolution images. Numerical results show that the proposed technique can effectively reduce the computational complexity of the proposed algorithms. In addition, for some cases where the point spread function (PSF) is separable, we propose using a Kronecker product so as to reduce the computations. Furthermore, in the case where the image is smooth, it is always desirable to replace the regularization term in the RLS problems by a total variation term. Therefore, we propose a novel method for adaptively selecting the regularization parameter in a so-called square root regularized total variation (SRTV). Experimental results demonstrate that our proposed method outperforms the other benchmark methods when applied to smooth images in terms of PSNR, SSIM and the restored image quality. In this thesis, we focus on the non-blind image deblurring problem, where the blur kernel is assumed to be known. However, we developed algorithms that also work
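
    For the non-blind case with a known, spatially invariant PSF and periodic boundary assumptions, the regularized least-squares step reduces to a pointwise filter in the Fourier domain. The sketch below shows that plain Tikhonov-type baseline with a fixed regularization parameter; it does not implement the bootstrap-based or SRTV parameter-selection methods proposed in the thesis.

    ```python
    import numpy as np

    def rls_deblur(blurred, psf, lam):
        """Tikhonov-regularized deconvolution assuming circular convolution."""
        H = np.fft.fft2(psf, s=blurred.shape)        # PSF transfer function
        Y = np.fft.fft2(blurred)
        X = np.conj(H) * Y / (np.abs(H) ** 2 + lam)  # pointwise regularized inverse
        return np.real(np.fft.ifft2(X))

    rng = np.random.default_rng(3)
    image = rng.random((64, 64))
    psf = np.outer(np.hanning(7), np.hanning(7))
    psf /= psf.sum()
    blurred = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(psf, s=image.shape)))
    restored = rls_deblur(blurred + 0.01 * rng.standard_normal(image.shape), psf, lam=1e-2)
    ```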

  6. Draft audit report, human factors engineering control room design review: Saint Lucie Nuclear Power Plant, Unit No. 2

    International Nuclear Information System (INIS)

    Peterson, L.R.; Lappa, D.A.; Moore, J.W.

    1981-01-01

    A human factors engineering preliminary design review of the Saint Lucie Unit 2 control room was performed at the site on August 3 through August 7, 1981. This design review was carried out by a team from the Human Factors Engineering Branch, Division of Human Factors Safety. This report was prepared on the basis of the HFEB's review of the applicant's Preliminary Design Assessment and the human factors engineering design review/audit performed at the site. The review team included human factors consultants from BioTechnology, Inc., Falls Church, Virginia, and from Lawrence Livermore National Laboratory (University of California), Livermore, California

  7. Entropy Applications to Water Monitoring Network Design: A Review

    Directory of Open Access Journals (Sweden)

    Jongho Keum

    2017-11-01

    Full Text Available Having reliable water monitoring networks is an essential component of water resources and environmental management. A standardized process for the design of water monitoring networks does not exist, with the exception of the World Meteorological Organization (WMO) general guidelines about the minimum network density. While one of the major challenges in the design of optimal hydrometric networks has been establishing design objectives, information theory has been successfully adopted to network design problems by providing measures of the information content that can be deliverable from a station or a network. This review firstly summarizes the common entropy terms that have been used in water monitoring network designs. Then, this paper deals with the recent applications of the entropy concept for water monitoring network designs, which are categorized into (1) precipitation; (2) streamflow and water level; (3) water quality; and (4) soil moisture and groundwater networks. The integrated design method for multivariate monitoring networks is also covered. Despite several issues, entropy theory has been well suited to water monitoring network design. However, further work is still required to provide design standards and guidelines for operational use.
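
    The entropy terms referred to above are straightforward to estimate from discretized station records. The sketch below computes marginal entropy and transinformation (mutual information) for two hypothetical station series; the histogram discretization and the simulated records are assumptions for the example.

    ```python
    import numpy as np

    def entropy_from_probs(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def marginal_entropy(x, bins=10):
        """Shannon entropy (bits) of a series after histogram discretization."""
        counts, _ = np.histogram(x, bins=bins)
        return entropy_from_probs(counts / counts.sum())

    def transinformation(x, y, bins=10):
        """Mutual information H(X) + H(Y) - H(X,Y) between two station series."""
        counts, _, _ = np.histogram2d(x, y, bins=bins)
        pxy = counts / counts.sum()
        hx = entropy_from_probs(pxy.sum(axis=1))
        hy = entropy_from_probs(pxy.sum(axis=0))
        hxy = entropy_from_probs(pxy.ravel())
        return hx + hy - hxy

    # Hypothetical correlated records for two candidate gauging stations.
    rng = np.random.default_rng(4)
    station_a = rng.gamma(shape=2.0, scale=3.0, size=1000)
    station_b = 0.7 * station_a + rng.gamma(shape=2.0, scale=1.0, size=1000)
    print(marginal_entropy(station_a), transinformation(station_a, station_b))
    ```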

  8. Deterministic automata for extended regular expressions

    Directory of Open Access Journals (Sweden)

    Syzdykov Mirzakhmet

    2017-12-01

    Full Text Available In this work we present the algorithms to produce a deterministic finite automaton (DFA) for extended operators in regular expressions like intersection, subtraction and complement. A method of “overriding” the source NFA (an NFA not defined with subset construction rules) is used. The past work described only the algorithm for the AND-operator (or intersection of regular languages); in this paper the construction for the MINUS-operator (and complement) is shown.
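
    Once both operands are available as complete DFAs, the extended operators reduce to the classical product construction; the sketch below builds the product automaton for intersection and difference from transition dictionaries. It illustrates the standard construction rather than the NFA-overriding method described in the paper.

    ```python
    from itertools import product

    def product_dfa(d1, d2, op):
        """Product DFA of complete DFAs d1, d2.

        Each DFA is (states, alphabet, delta, start, accepting) with
        delta[(state, symbol)] -> state.  op decides acceptance of a state pair,
        e.g. intersection: a1 and a2, difference: a1 and not a2."""
        s1, alphabet, t1, q1, f1 = d1
        s2, _, t2, q2, f2 = d2
        states = set(product(s1, s2))
        delta = {((p, q), a): (t1[(p, a)], t2[(q, a)])
                 for p, q in states for a in alphabet}
        accepting = {(p, q) for p, q in states if op(p in f1, q in f2)}
        return states, alphabet, delta, (q1, q2), accepting

    # Example: binary strings with an even number of 1s (A) / ending in 0 (B).
    A = ({0, 1}, {"0", "1"},
         {(0, "0"): 0, (0, "1"): 1, (1, "0"): 1, (1, "1"): 0}, 0, {0})
    B = ({"e", "z"}, {"0", "1"},
         {("e", "0"): "z", ("e", "1"): "e", ("z", "0"): "z", ("z", "1"): "e"}, "e", {"z"})

    intersection = product_dfa(A, B, lambda a1, a2: a1 and a2)
    difference = product_dfa(A, B, lambda a1, a2: a1 and not a2)
    ```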

  9. Regularities of intermediate adsorption complex relaxation

    International Nuclear Information System (INIS)

    Manukova, L.A.

    1982-01-01

    The experimental data characterizing the regularities of intermediate adsorption complex relaxation in the polycrystalline Mo-N2 system at 77 K are given. The molecular beam method has been used in the investigation. Analytical expressions are obtained for the regularities of change, during relaxation, of the full and specific rates of transition from the intermediate state into the ''non-reversible'' state, of desorption into the gas phase, and of accumulation of particles in the intermediate state.

  10. Sparse structure regularized ranking

    KAUST Repository

    Wang, Jim Jing-Yan; Sun, Yijun; Gao, Xin

    2014-01-01

    Learning ranking scores is critical for the multimedia database retrieval problem. In this paper, we propose a novel ranking score learning algorithm by exploring the sparse structure and using it to regularize ranking scores. To explore the sparse

  11. Design of experiments for microencapsulation applications: A review.

    Science.gov (United States)

    Paulo, Filipa; Santos, Lúcia

    2017-08-01

    Microencapsulation techniques have been intensively explored by many research sectors such as the pharmaceutical and food industries. Microencapsulation allows the active ingredient to be protected from the external environment, masks undesired flavours, and enables controlled release of compounds, among other benefits. The purpose of this review is to provide a background on design of experiments in the context of microencapsulation research. Optimization procedures are required for accurate research in these fields and, therefore, for the proper implementation of micro-sized techniques at industrial scale. This article critically reviews the use of response surface methodologies in pharmaceutical and food microencapsulation research areas. A survey of optimization procedures in the literature over the last few years is also presented. Copyright © 2017 Elsevier B.V. All rights reserved.
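
    As an illustration of the response-surface step that such optimizations rely on, the snippet below fits a second-order polynomial model to a small two-factor design by ordinary least squares; the coded levels and the response values (for example, an encapsulation efficiency) are invented for the sketch.

    ```python
    import numpy as np

    # Coded factor levels of a face-centred central composite design (two factors).
    X1 = np.array([-1, -1, 1, 1, -1, 1, 0, 0, 0, 0, 0])
    X2 = np.array([-1, 1, -1, 1, 0, 0, -1, 1, 0, 0, 0])
    y = np.array([62, 70, 75, 85, 68, 82, 71, 79, 80, 81, 79])  # hypothetical responses (%)

    # Second-order model: b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
    M = np.column_stack([np.ones_like(X1), X1, X2, X1 * X2, X1 ** 2, X2 ** 2])
    coef, *_ = np.linalg.lstsq(M, y, rcond=None)

    # Predicted response on a grid, to locate the optimum region.
    g1, g2 = np.meshgrid(np.linspace(-1, 1, 21), np.linspace(-1, 1, 21))
    pred = (coef[0] + coef[1] * g1 + coef[2] * g2 + coef[3] * g1 * g2
            + coef[4] * g1 ** 2 + coef[5] * g2 ** 2)
    best = np.unravel_index(np.argmax(pred), pred.shape)
    ```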

  12. Sparse structure regularized ranking

    KAUST Repository

    Wang, Jim Jing-Yan

    2014-04-17

    Learning ranking scores is critical for the multimedia database retrieval problem. In this paper, we propose a novel ranking score learning algorithm by exploring the sparse structure and using it to regularize ranking scores. To explore the sparse structure, we assume that each multimedia object could be represented as a sparse linear combination of all other objects, and combination coefficients are regarded as a similarity measure between objects and used to regularize their ranking scores. Moreover, we propose to learn the sparse combination coefficients and the ranking scores simultaneously. A unified objective function is constructed with regard to both the combination coefficients and the ranking scores, and is optimized by an iterative algorithm. Experiments on two multimedia database retrieval data sets demonstrate the significant improvements of the proposed algorithm over state-of-the-art ranking score learning algorithms.
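
    Once the sparse combination coefficients have been estimated, the score-learning step amounts to a graph-regularized least-squares problem with a closed-form solution. The sketch below illustrates only that step; the alternating update of the coefficients themselves is omitted, and the symmetrization, Laplacian construction and query vector are assumptions made for the example.

    ```python
    import numpy as np

    def ranking_scores(W, y, lam=1.0):
        """Ranking scores regularized by a (sparse-coefficient) similarity graph.

        Solves f = argmin ||f - y||^2 + lam * f^T L f, with L the graph
        Laplacian built from the symmetrized coefficient matrix."""
        S = 0.5 * (W + W.T)                 # symmetrize the coefficients
        L = np.diag(S.sum(axis=1)) - S      # graph Laplacian
        n = W.shape[0]
        return np.linalg.solve(np.eye(n) + lam * L, y)

    rng = np.random.default_rng(5)
    n = 6
    W = np.abs(rng.standard_normal((n, n))) * (rng.random((n, n)) < 0.3)  # sparse coefficients
    y = np.zeros(n)
    y[0] = 1.0                              # query relevance seed
    scores = ranking_scores(W, y)
    ```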

  13. Review: BNL Tokamak graphite blanket design concepts

    International Nuclear Information System (INIS)

    Fillo, J.A.; Powell, J.R.

    1976-01-01

    The BNL minimum activity graphite blanket designs are reviewed, and three are discussed in the context of an experimental power reactor (EPR) and commercial power reactor. Basically, the three designs employ a 30 cm or thicker graphite screen. Bremsstrahlung energy is deposited on the graphite surface and re-radiated away as thermal radiation. Fast neutrons are slowed down in the graphite, depositing most of their energy, which is then radiated to a secondary blanket with coolant tubes, as in types A and B, or removed by intermittent direct gas cooling (type C). In types A and B, radiation damage to the coolant tubes in the secondary blanket is reduced by one or two orders of magnitude, while in type C, the blanket is only cooled when the reactor is shut down, so that coolant cannot quench the plasma. (Auth.)

  14. 20 CFR 226.35 - Deductions from regular annuity rate.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Deductions from regular annuity rate. 226.35... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing a Spouse or Divorced Spouse Annuity § 226.35 Deductions from regular annuity rate. The regular annuity rate of the spouse and divorced...

  15. Regularization theory for ill-posed problems selected topics

    CERN Document Server

    Lu, Shuai

    2013-01-01

    This monograph is a valuable contribution to the highly topical and extremely productive field of regularisation methods for inverse and ill-posed problems. The author is an internationally outstanding and accepted mathematician in this field. In his book he offers a well-balanced mixture of basic and innovative aspects. He demonstrates new, differentiated viewpoints, and important examples for applications. The book demonstrates the current developments in the field of regularization theory, such as multiparameter regularization and regularization in learning theory. The book is written for graduate and PhD students.

  16. Report of the US Nuclear Regulatory Commission Piping Review Committee. Volume 2. Evaluation of seismic designs: a review of seismic design requirements for Nuclear Power Plant Piping

    Energy Technology Data Exchange (ETDEWEB)

    1985-04-01

    This document reports the position and recommendations of the NRC Piping Review Committee, Task Group on Seismic Design. The Task Group considered overlapping conservatisms in the various steps of seismic design, the effects of using two levels of earthquake as a design criterion, and current industry practices. Issues such as damping values, spectra modification, multiple response spectra methods, nozzle and support design, design margins, inelastic piping response, and the use of snubbers are addressed. Effects of current regulatory requirements for piping design are evaluated, and recommendations for immediate licensing action, changes in existing requirements, and research programs are presented. Additional background information and suggestions given by consultants are also presented.

  17. Transferring Knowledge from Building Operation to Design: A literature review

    DEFF Research Database (Denmark)

    Rasmussen, Helle Lohmann; Jensen, Per Anker; Gregg, Jay Sterling

    As a solution to the previously identified gap between expected and actual building performance, this paper investigates how knowledge can be transferred from operation to design. This is assumed to help bridge the gap and increase the performance of newly built facilities. By conducting a systematic literature review, it is found that the theoretical approach in the reviewed articles has a significant impact on how applicable the recommendations are in practice. Furthermore, a list of identified tools to enable knowledge transfer is provided, including POE, PPP and building commissioning. Knowing that the list lacks inputs from cultural and organizational theory, the paper suggests that further research should focus on taking these suggestions to an operational level for the benefit of FM, building clients and design teams. Furthermore, it is found that major concepts that could...

  18. 20 CFR 226.34 - Divorced spouse regular annuity rate.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Divorced spouse regular annuity rate. 226.34... COMPUTING EMPLOYEE, SPOUSE, AND DIVORCED SPOUSE ANNUITIES Computing a Spouse or Divorced Spouse Annuity § 226.34 Divorced spouse regular annuity rate. The regular annuity rate of a divorced spouse is equal to...

  19. Chimeric mitochondrial peptides from contiguous regular and swinger RNA.

    Science.gov (United States)

    Seligmann, Hervé

    2016-01-01

    Previous mass spectrometry analyses described human mitochondrial peptides entirely translated from swinger RNAs, RNAs where polymerization systematically exchanged nucleotides. Exchanges follow one among 23 bijective transformation rules, nine symmetric exchanges (X ↔ Y, e.g. A ↔ C) and fourteen asymmetric exchanges (X → Y → Z → X, e.g. A → C → G → A), multiplying DNA's protein-coding potential by 24. Abrupt switches from regular to swinger polymerization produce chimeric RNAs. Here, human mitochondrial proteomic analyses assuming abrupt switches between regular and swinger transcriptions detect chimeric peptides, encoded by part regular, part swinger RNA. Contiguous regular- and swinger-encoded residues within single peptides are stronger evidence for translation of swinger RNA than previously detected, entirely swinger-encoded peptides: regular parts are positive controls matched with contiguous swinger parts, increasing confidence in results. Chimeric peptides are 200 × rarer than swinger peptides (3/100,000 versus 6/1000). Among 186 peptides with > 8 residues in each of the regular and swinger parts, the regular parts of eleven chimeric peptides correspond to six of the thirteen recognized mitochondrial protein-coding genes. Chimeric peptides matching partly regular proteins are rarer and less expressed than chimeric peptides matching non-coding sequences, suggesting targeted degradation of misfolded proteins. Present results strengthen hypotheses that the short mitogenome encodes far more proteins than hitherto assumed. Entirely swinger-encoded proteins could exist.

  20. The Design of High Efficiency Crossflow Hydro Turbines: A Review and Extension

    Directory of Open Access Journals (Sweden)

    Ram Adhikari

    2018-01-01

    Full Text Available Efficiency is a critical consideration in the design of hydro turbines. The crossflow turbine is the cheapest and easiest hydro turbine to manufacture and so is commonly used in remote power systems for developing countries. A longstanding problem for practical crossflow turbines is their lower maximum efficiency compared to their more advanced counterparts, such as Pelton and Francis turbines. This paper reviews the experimental and computational studies relevant to the design of high efficiency crossflow turbines. We concentrate on the studies that have contributed to designs with efficiencies in the range of 88–90%. Many recent studies have been conducted on turbines of low maximum efficiency, which we believe is due to misunderstanding of design principles for achieving high efficiencies. We synthesize the key results of experimental and computational fluid dynamics studies to highlight the key fundamental design principles for achieving efficiencies of about 90%, as well as future research and development areas to further improve the maximum efficiency. The main finding of this review is that the total conversion of head into kinetic energy in the nozzle and the matching of nozzle and runner designs are the two main design requirements for the design of high efficiency turbines.

  1. Support for designing waste sorting systems: A mini review.

    Science.gov (United States)

    Rousta, Kamran; Ordoñez, Isabel; Bolton, Kim; Dahlén, Lisa

    2017-11-01

    This article presents a mini review of research aimed at understanding material recovery from municipal solid waste. It focuses on two areas, waste sorting behaviour and collection systems, so that research on the link between these areas could be identified and evaluated. The main results presented and the methods used in the articles are categorised and appraised. The mini review reveals that most of the work that offered design guidelines for waste management systems was based on optimising technical aspects only. In contrast, most of the work that focused on user involvement did not consider developing the technical aspects of the system, but was limited to studies of user behaviour. The only clear consensus among the articles that link user involvement with the technical system is that convenient waste collection infrastructure is crucial for supporting source separation. This mini review reveals that even though the connection between sorting behaviour and technical infrastructure has been explored and described in some articles, there is still a gap when using this knowledge to design waste sorting systems. Future research in this field would benefit from being multidisciplinary and from using complementary methods, so that holistic solutions for material recirculation can be identified. It would be beneficial to actively involve users when developing sorting infrastructures, to be sure to provide a waste management system that will be properly used by them.

  2. Fast and compact regular expression matching

    DEFF Research Database (Denmark)

    Bille, Philip; Farach-Colton, Martin

    2008-01-01

    We study 4 problems in string matching, namely, regular expression matching, approximate regular expression matching, string edit distance, and subsequence indexing, on a standard word RAM model of computation that allows logarithmic-sized words to be manipulated in constant time. We show how to improve the space and/or remove a dependency on the alphabet size for each problem, using either an improved tabulation technique of an existing algorithm or by combining known algorithms in a new way.

  3. Dimensional regularization and analytical continuation at finite temperature

    International Nuclear Information System (INIS)

    Chen Xiangjun; Liu Lianshou

    1998-01-01

    The relationship between dimensional regularization and analytical continuation of infrared divergent integrals at finite temperature is discussed and a method of regularization of infrared divergent integrals and infrared divergent sums is given

  4. Are Long-Term Chloroquine or Hydroxychloroquine Users Being Checked Regularly for Toxic Maculopathy?

    Science.gov (United States)

    Nika, Melisa; Blachley, Taylor S.; Edwards, Paul; Lee, Paul P.; Stein, Joshua D.

    2014-01-01

    Importance According to evidence-based, expert recommendations, long-term users of chloroquine (CQ) or hydroxychloroquine (HCQ) should undergo regular visits to eye-care providers and diagnostic testing to check for maculopathy. Objective To determine whether patients with rheumatoid arthritis (RA) or systemic lupus erythematosus (SLE) taking CQ or HCQ are regularly visiting eye-care providers and being screened for maculopathy. Setting, Design and Participants Patients with RA or SLE who were continuously enrolled in a particular managed-care network for ≥5 years during 2001-2011 were studied. Patients' amount of CQ/HCQ use in the 5 years since initial RA/SLE diagnosis was calculated, along with their number of eye-care visits and diagnostic tests for maculopathy. Those at high risk for maculopathy were identified. Visits to eye providers and diagnostic testing for maculopathy were assessed for each enrollee over the study period. Logistic regression was performed to assess potential factors associated with regular eye-care-provider visits (≥3 in 5 years) among CQ/HCQ users, including those at greatest risk for maculopathy. Main Outcome Measures Among CQ/HCQ users and those at high risk for toxic maculopathy, the proportions with regular eye-care visits and diagnostic testing, and the likelihood of regular eye-care visits (odds ratios [ORs] with 95% confidence intervals [CI]). Results Among 18,051 beneficiaries with RA or SLE, 6,339 (35.1%) had ≥1 record of HCQ/CQ use and 1,409 (7.8%) used HCQ/CQ for ≥4 years. Among those at high risk for maculopathy, 27.9% lacked regular eye-provider visits, 6.1% had no visits to eye providers, and 34.5% had no diagnostic testing for maculopathy during the 5-year period. Among high-risk patients, each additional month of HCQ/CQ use was associated with a 2.0%-increased likelihood of regular eye care (adjusted OR=1.02, CI=1.01-1.03). High-risk patients whose SLE/RA were managed by rheumatologists had a 77%-increased

  5. Applications of computational fluid dynamics (CFD) in the modelling and design of ventilation systems in the agricultural industry: a review.

    Science.gov (United States)

    Norton, Tomás; Sun, Da-Wen; Grant, Jim; Fallon, Richard; Dodd, Vincent

    2007-09-01

    The application of computational fluid dynamics (CFD) in the agricultural industry is becoming ever more important. Over the years, the versatility, accuracy and user-friendliness offered by CFD have led to its increased take-up by the agricultural engineering community. CFD is now regularly employed to solve environmental problems of greenhouses and animal production facilities. Moreover, thanks to a combination of increased computing power and advanced numerical techniques, the realism of these simulations has been greatly enhanced in recent years. This study provides a state-of-the-art review of CFD, its current applications in the design of ventilation systems for agricultural production systems, and the outstanding challenges that confront CFD modellers. The current status of greenhouse CFD modelling was found to be at a higher standard than that of animal housing, owing to the incorporation of user-defined routines that simulate crop biological responses as a function of local environmental conditions. Nevertheless, the most recent animal housing simulations have addressed this issue and in turn have become more physically realistic.

  6. Regular and conformal regular cores for static and rotating solutions

    Energy Technology Data Exchange (ETDEWEB)

    Azreg-Aïnou, Mustapha

    2014-03-07

    Using a new metric for generating rotating solutions, we derive in a general fashion the solution of an imperfect fluid and that of its conformal homolog. We discuss the conditions that the stress–energy tensors and invariant scalars be regular. On classical physical grounds, it is stressed that conformal fluids used as cores for static or rotating solutions are exempt from any malicious behavior in that they are finite and defined everywhere.

  7. Regular and conformal regular cores for static and rotating solutions

    International Nuclear Information System (INIS)

    Azreg-Aïnou, Mustapha

    2014-01-01

    Using a new metric for generating rotating solutions, we derive in a general fashion the solution of an imperfect fluid and that of its conformal homolog. We discuss the conditions that the stress–energy tensors and invariant scalars be regular. On classical physical grounds, it is stressed that conformal fluids used as cores for static or rotating solutions are exempt from any malicious behavior in that they are finite and defined everywhere.

  8. How Games are Designed to Increase Students’ Motivation in Learning Physics? A Literature Review

    Science.gov (United States)

    Tinedi, V.; Yohandri, Y.; Djamas, D.

    2018-04-01

    Games are a promising tool for helping students understand physics concepts. They can motivate students and provide opportunities for them to become independent learners. To fulfil these functions, games should be carefully designed. The objective of this paper is therefore to present, on the basis of several literature reviews, how games are designed to increase students' motivation in learning physics. The results show that there are several ways to increase students' motivation in learning physics, and that game dimensions need to be considered when designing a game. This literature review may be useful in assisting teachers and may contribute to improving the design of games.

  9. Low-rank matrix approximation with manifold regularization.

    Science.gov (United States)

    Zhang, Zhenyue; Zhao, Keke

    2013-07-01

    This paper proposes a new model of low-rank matrix factorization that incorporates manifold regularization into the matrix factorization. Superior to graph-regularized nonnegative matrix factorization, this new regularization model has globally optimal and closed-form solutions. A direct algorithm (for data with a small number of points) and an alternating iterative algorithm with inexact inner iteration (for large-scale data) are proposed to solve the new model. A convergence analysis establishes the global convergence of the iterative algorithm. The efficiency and precision of the algorithm are demonstrated numerically through applications to six real-world datasets on clustering and classification. Performance comparison with existing algorithms shows the effectiveness of the proposed method for low-rank factorization in general.
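
    As a point of reference (a generic form, not necessarily the exact model the authors solve in closed form), manifold-regularized low-rank factorization is commonly written as

    \[
    \min_{U,\,V}\; \| X - U V^{\top} \|_F^2 \;+\; \lambda\, \operatorname{tr}\!\left( V^{\top} L V \right),
    \]

    where X is the data matrix, U and V the low-rank factors, L = D − W the Laplacian of a neighborhood graph built on the data points, and λ ≥ 0 a regularization weight that encourages the representation V to vary smoothly along the data manifold.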

  10. Regularity criteria for incompressible magnetohydrodynamics equations in three dimensions

    International Nuclear Information System (INIS)

    Lin, Hongxia; Du, Lili

    2013-01-01

    In this paper, we give some new global regularity criteria for three-dimensional incompressible magnetohydrodynamics (MHD) equations. More precisely, we provide some sufficient conditions in terms of the derivatives of the velocity or pressure, for the global regularity of strong solutions to 3D incompressible MHD equations in the whole space, as well as for periodic boundary conditions. Moreover, the regularity criterion involving three of the nine components of the velocity gradient tensor is also obtained. The main results generalize the recent work by Cao and Wu (2010 Two regularity criteria for the 3D MHD equations J. Diff. Eqns 248 2263–74) and the analysis in part is based on the works by Cao C and Titi E (2008 Regularity criteria for the three-dimensional Navier–Stokes equations Indiana Univ. Math. J. 57 2643–61; 2011 Global regularity criterion for the 3D Navier–Stokes equations involving one entry of the velocity gradient tensor Arch. Rational Mech. Anal. 202 919–32) for 3D incompressible Navier–Stokes equations. (paper)
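
    For readers who want the objects in front of them, the 3D incompressible MHD system and the typical (Serrin-type) shape of such criteria are sketched below; this is the standard textbook form, not the precise conditions established in the paper.

    \[
    \partial_t u + (u\cdot\nabla)u - \nu\,\Delta u + \nabla \pi = (b\cdot\nabla)b, \qquad
    \partial_t b + (u\cdot\nabla)b - \eta\,\Delta b = (b\cdot\nabla)u, \qquad
    \nabla\cdot u = \nabla\cdot b = 0,
    \]

    with u the velocity, b the magnetic field and π the total (kinetic plus magnetic) pressure. A typical sufficient condition for regularity on [0, T] reads

    \[
    u \in L^{q}\!\left(0, T; L^{p}(\mathbb{R}^{3})\right), \qquad \frac{2}{q} + \frac{3}{p} \le 1, \quad 3 < p \le \infty .
    \]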

  11. Final design review report for K Basin Dose Reduction Project Clean and Coat Task

    International Nuclear Information System (INIS)

    Blackburn, L.D.

    1996-02-01

    The strategy for reducing radiation dose originating from radionuclides absorbed in the concrete is to raise the pool water level to provide additional shielding. The concrete walls need to be coated to prevent future radionuclide absorption into the walls. This report documents a final design review of equipment to clean and coat basin walls. The review concluded that the design presented was acceptable for release for fabrication

  12. Regular-fat dairy and human health

    DEFF Research Database (Denmark)

    Astrup, Arne; Bradley, Beth H Rice; Brenna, J Thomas

    2016-01-01

    In recent history, some dietary recommendations have treated dairy fat as an unnecessary source of calories and saturated fat in the human diet. These assumptions, however, have recently been brought into question by current research on regular-fat dairy products and human health. In an effort to…, cheese and yogurt, can be important components of an overall healthy dietary pattern. Systematic examination of the effects of dietary patterns that include regular-fat milk, cheese and yogurt on human health is warranted.

  13. Bounded Perturbation Regularization for Linear Least Squares Estimation

    KAUST Repository

    Ballal, Tarig; Suliman, Mohamed Abdalla Elhag; Al-Naffouri, Tareq Y.

    2017-01-01

    This paper addresses the problem of selecting the regularization parameter for linear least-squares estimation. We propose a new technique called bounded perturbation regularization (BPR). In the proposed BPR method, a perturbation with a bounded

  14. Recognition Memory for Novel Stimuli: The Structural Regularity Hypothesis

    Science.gov (United States)

    Cleary, Anne M.; Morris, Alison L.; Langley, Moses M.

    2007-01-01

    Early studies of human memory suggest that adherence to a known structural regularity (e.g., orthographic regularity) benefits memory for an otherwise novel stimulus (e.g., G. A. Miller, 1958). However, a more recent study suggests that structural regularity can lead to an increase in false-positive responses on recognition memory tests (B. W. A.…

  15. Human-factors engineering control-room design review/audit: Waterford 3 SES Generating Station, Louisiana Power and Light Company

    International Nuclear Information System (INIS)

    Savage, J.W.

    1983-01-01

    A human factors engineering design review/audit of the Waterford-3 control room was performed at the site on May 10 through May 13, 1982. The report was prepared on the basis of the HFEB's review of the applicant's Preliminary Human Engineering Discrepancy (PHED) report and the human factors engineering design review performed at the site. This design review was carried out by a team from the Human Factors Engineering Branch, Division of Human Factors Safety. The review team was assisted by consultants from Lawrence Livermore National Laboratory (University of California), Livermore, California

  16. [Progress of genome engineering technology via clustered regularly interspaced short palindromic repeats--a review].

    Science.gov (United States)

    Li, Hao; Qiu, Shaofu; Song, Hongbin

    2013-10-04

    In the survival competition with phages, bacteria and archaea gradually evolved an acquired immune system--clustered regularly interspaced short palindromic repeats (CRISPR)--which transcribes crRNAs and CRISPR-associated (Cas) proteins to specifically silence or cleave foreign double-stranded DNA. In recent years, strong interest has arisen in this primitive prokaryotic immune system and many in-depth studies are under way. Recently, researchers successfully repurposed CRISPR as an RNA-guided platform for sequence-specific control of gene expression, which provides a simple approach for selectively perturbing gene expression on a genome-wide scale. It will undoubtedly bring genome engineering into a more convenient and accurate new era.

  17. A Domain-Specific Language for Regular Sets of Strings and Trees

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Klarlund, Nils

    1999-01-01

    We propose a new high-level programming notation, called FIDO, that we have designed to concisely express regular sets of strings or trees. In particular, it can be viewed as a domain-specific language for the expression of finite-state automata on large alphabets (of sometimes astronomical size)…, called the Monadic Second-order Logic (M2L) on trees. FIDO is translated first into pure M2L via suitable encodings, and finally into finite-state automata through the MONA tool.

  18. Regularization Techniques for Linear Least-Squares Problems

    KAUST Repository

    Suliman, Mohamed

    2016-04-01

    Linear estimation is a fundamental branch of signal processing that deals with estimating the values of parameters from corrupted measured data. Throughout the years, several optimization criteria have been used to achieve this task. The most prominent among these is linear least-squares. Although this criterion has enjoyed wide popularity in many areas due to its attractive properties, it suffers from some shortcomings. Alternative optimization criteria have therefore been proposed. These new criteria allow, in one way or another, the incorporation of further prior information into the problem at hand. Among these alternative criteria is regularized least-squares (RLS). In this thesis, we propose two new algorithms to find the regularization parameter for linear least-squares problems. In the constrained perturbation regularization algorithm (COPRA) for random matrices and COPRA for linear discrete ill-posed problems, an artificial perturbation matrix with a bounded norm is forced into the model matrix. This perturbation is introduced to enhance the singular-value structure of the matrix, so that the modified model is expected to provide a more stable solution when used to estimate the original signal through minimizing the worst-case residual error function. Unlike many other regularization algorithms that seek to minimize the estimated data error, the two proposed algorithms are developed mainly to select the artificial perturbation bound and the regularization parameter in a way that approximately minimizes the mean-squared error (MSE) between the original signal and its estimate under various conditions. The first proposed COPRA method is developed mainly to estimate the regularization parameter when the measurement matrix is complex Gaussian, with centered unit variance (standard), and independent and identically distributed (i.i.d.) entries. Furthermore, the second proposed COPRA
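
    Since COPRA itself is not specified in detail here, the following is only a minimal Python sketch of the baseline it builds on: Tikhonov/ridge-regularized least squares with the parameter chosen by a naive validation grid search. The problem sizes, the selection rule and all names are illustrative assumptions, not the COPRA algorithm.

    ```python
    import numpy as np

    def ridge_solve(A, b, lam):
        """Tikhonov/ridge-regularized least squares: argmin_x ||Ax - b||^2 + lam * ||x||^2."""
        n = A.shape[1]
        return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

    # Toy ill-conditioned problem (purely illustrative).
    rng = np.random.default_rng(0)
    x_true = rng.standard_normal(20)
    A = rng.standard_normal((50, 20)) @ np.diag(np.logspace(0, -4, 20))  # decaying singular values
    b = A @ x_true + 0.01 * rng.standard_normal(50)                      # noisy measurements

    # Naive grid search for the regularization parameter on a held-out split;
    # COPRA instead selects the parameter analytically, targeting an approximately
    # MSE-optimal choice, which this sketch does not reproduce.
    A_fit, b_fit, A_val, b_val = A[:40], b[:40], A[40:], b[40:]
    lams = np.logspace(-8, 1, 50)
    errs = [np.linalg.norm(A_val @ ridge_solve(A_fit, b_fit, lam) - b_val) for lam in lams]
    lam_best = lams[int(np.argmin(errs))]
    x_hat = ridge_solve(A, b, lam_best)
    rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"chosen lambda = {lam_best:.2e}, relative error = {rel_err:.3f}")
    ```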

  19. Regularized Regression and Density Estimation based on Optimal Transport

    KAUST Repository

    Burger, M.

    2012-03-11

    The aim of this paper is to investigate a novel nonparametric approach for estimating and smoothing density functions as well as probability densities from discrete samples based on a variational regularization method with the Wasserstein metric as a data fidelity. The approach allows a unified treatment of discrete and continuous probability measures and is hence attractive for various tasks. In particular, the variational model for special regularization functionals yields a natural method for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations and provide a detailed analysis. Moreover, we compute special self-similar solutions for standard regularization functionals and we discuss several computational approaches and results. © 2012 The Author(s).
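
    Schematically (the exact functionals and weights are the paper's choices; this is only the generic shape of such a variational model), the estimate solves

    \[
    \hat\rho \;=\; \arg\min_{\rho}\; \frac{1}{2\lambda}\, W_2^{2}\!\left(\rho, \rho_{\mathrm{data}}\right) \;+\; \alpha\, R(\rho),
    \]

    where ρ_data is the empirical measure of the discrete samples, W_2 the quadratic Wasserstein distance acting as data fidelity, R a regularization functional such as total variation, and λ, α > 0 weights; evaluating the fidelity term is what requires solving a (regularized) optimal transport problem.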

  20. Report on the Survey of the Design Review of New Reactor Applications. Volume 3: Reactor

    International Nuclear Information System (INIS)

    Downey, Steven; Monninger, John; Nevalainen, Janne; Lorin, Aurelie; ); Webster, Philip; Joyer, Philippe; Kawamura, Tomonori; Lankin, Mikhail; Kubanyi, Jozef; Haluska, Ladislav; Persic, Andreja; Reierson, Craig; Kang, Kyungmin; Kim, Walter

    2016-01-01

    At the tenth meeting of the CNRA Working Group on the Regulation of New Reactors (WGRNR) in March 2013, the Working Group agreed to present the responses to the Second Phase, or Design Phase, of the Licensing Process Survey as a multi-volume text. As such, each report will focus on one of the eleven general technical categories covered in the survey. The general technical categories were selected to conform to the topics covered in the International Atomic Energy Agency (IAEA) Safety Guide GS-G-4.1. This document, which is the third report on the results of the Design Phase Survey, focuses on the Reactor. The Reactor category includes the following technical topics: fuel system design, reactor internals and core support, nuclear design and core nuclear performance, thermal and hydraulic design, reactor materials, and functional design of reactivity control system. For each technical topic, the member countries described the information provided by the applicant, the scope and level of detail of the technical review, the technical basis for granting regulatory authorisation, the skill sets required and the level of effort needed to perform the review. Based on a comparison of the information provided by the member countries in response to the survey, the following observations were made: - Although the description of the information provided by the applicant differs in scope and level of detail among the member countries that provided responses, there are similarities in the information that is required. - All of the technical topics covered in the survey are reviewed in some manner by all of the regulatory authorities that provided responses. - Design review strategies most commonly used to confirm that the regulatory requirements have been met include document review and independent verification of calculations, computer codes, or models used to describe the design and performance of the core and the fuel. - It is common to consider operating experience and

  1. Oxidative stress and inflammation: liver responses and adaptations to acute and regular exercise.

    Science.gov (United States)

    Pillon Barcelos, Rômulo; Freire Royes, Luiz Fernando; Gonzalez-Gallego, Javier; Bresciani, Guilherme

    2017-02-01

    The liver is remarkably important during exercise outcomes due to its contribution to detoxification, synthesis, and release of biomolecules, and energy supply to the exercising muscles. Recently, liver has been also shown to play an important role in redox status and inflammatory modulation during exercise. However, while several studies have described the adaptations of skeletal muscles to acute and chronic exercise, hepatic changes are still scarcely investigated. Indeed, acute intense exercise challenges the liver with increased reactive oxygen species (ROS) and inflammation onset, whereas regular training induces hepatic antioxidant and anti-inflammatory improvements. Acute and regular exercise protocols in combination with antioxidant and anti-inflammatory supplementation have been also tested to verify hepatic adaptations to exercise. Although positive results have been reported in some acute models, several studies have shown an increased exercise-related stress upon liver. A similar trend has been observed during training: while synergistic effects of training and antioxidant/anti-inflammatory supplementations have been occasionally found, others reported a blunting of relevant adaptations to exercise, following the patterns described in skeletal muscles. This review discusses current data regarding liver responses and adaptation to acute and regular exercise protocols alone or combined with antioxidant and anti-inflammatory supplementation. The understanding of the mechanisms behind these modulations is of interest for both exercise-related health and performance outcomes.

  2. Energy functions for regularization algorithms

    Science.gov (United States)

    Delingette, H.; Hebert, M.; Ikeuchi, K.

    1991-01-01

    Regularization techniques are widely used for solving inverse problems in computer vision such as surface reconstruction, edge detection, or optical flow estimation. The energy functions used in regularization algorithms measure how smooth a curve or surface is, and to yield acceptable solutions these energies must satisfy certain properties, such as invariance under Euclidean transformations and invariance under parameterization. The notion of smoothness energy is extended here to the notion of a differential stabilizer, and it is shown that, to avoid the systematic underestimation of curvature in planar curve fitting, it is necessary that circles be the curves of maximum smoothness. A set of stabilizers is proposed that meets this condition as well as invariance under rotation and parameterization.
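
    As a generic illustration of the kind of energy being discussed (not necessarily the paper's exact stabilizer), a regularized curve-fitting energy typically has the form

    \[
    E(v) \;=\; \int \big\| v(s) - d(s) \big\|^{2}\, ds \;+\; \lambda \int \big\| v''(s) \big\|^{2}\, ds ,
    \]

    where d is the data, v the fitted curve and λ the regularization weight; the paper's argument is that the smoothness term should be chosen as a differential stabilizer that is invariant under Euclidean transformations and reparameterization and for which circles are the smoothest curves.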

  3. ONWI [Office of Nuclear Waste Isolation] 30% design review findings report for Exploratory Shaft Facility, Deaf Smith site

    International Nuclear Information System (INIS)

    1987-01-01

    Based on the performance of a TMP-05 Design Review with integration into the review, via TMP-22, of comments by SRP contractors other than ONWI, the Design Review Board has developed conclusions and recommendations with respect to the 30% design effort submitted by PB/PB-KBB for interim design review. A number of comments were submitted on the design basis criteria and requirements furnished as guidance to PB/PB-KBB. These comments are forwarded to SRPO for disposition and resolution. Additionally, comments whose resolution by PB/PB-KBB require input from other SRP contractors are included. All comments forwarded to SRPO are compiled and subdivided by appropriate category in Section 11.0 of this Findings Report

  4. Review of updated design of SPWR with PSA methodology

    International Nuclear Information System (INIS)

    Oikawa, Tetsukuni; Muramatsu, Ken

    1995-01-01

    This paper presents the procedures and results of a PSA (Probabilistic Safety Assessment) of the SPWR (System-Integrated PWR), which is being developed at the Japan Atomic Energy Research Institute (JAERI) as a medium-sized innovative passive safe reactor, to assist in the design improvement of the SPWR at the basic or conceptual design phase by reviewing the design and identifying design vulnerabilities. The first-phase PSA, carried out in 1991, was a scoping analysis intended to understand overall plant characteristics and to search for general design weaknesses. After discussing the results of the first-phase PSA, the SPWR designer group changed some aspects of the SPWR design. The second-phase PSA of the SPWR was performed for the modified design in order to identify design vulnerabilities as well as to grasp its overall safety level. Special features of these PSAs are as follows: (1) systematic identification of initiating events related to newly designed systems by failure mode and effects analysis (FMEA), (2) delineation of accident sequences for the internal initiating events using accident progression flow charts, which is cost effective at the conceptual design phase, (3) quantification of event trees based largely on engineering judgement, and (4) numerous sensitivity analyses to examine the applicability of data assignments. The qualitative and quantitative results of the PSA provided very useful information for decision making on design improvements and recommendations for further consideration in the process of detailed design. (author)

  5. Method of transferring regular shaped vessel into cell

    International Nuclear Information System (INIS)

    Murai, Tsunehiko.

    1997-01-01

    The present invention concerns a method of transferring regular-shaped vessels from a non-contaminated area into a contaminated cell. A passage hole allowing the regular-shaped vessels to pass in the longitudinal direction is formed in a partitioning wall at the bottom of the contaminated cell. A plurality of regular-shaped vessels are stacked in multiple stages in the vertical direction from the non-contaminated area below the passage hole, are allowed to pass while being urged, and are transferred successively into the contaminated cell. As a result, since the passage hole is kept substantially closed by the regular-shaped vessels during transfer, radiation and contaminated materials are prevented from escaping from the contaminated cell to the non-contaminated area. Since there is no need to open and close an isolation door frequently, the workability of the transfer can be improved remarkably. In addition, since a sealing member that seals the gap between the regular-shaped vessel passing through the passage hole and the partitioning wall at the bottom is disposed at the passage hole, contaminated materials in the contaminated cell can be prevented from escaping through the gap to the non-contaminated area. (N.H.)

  6. Automatic Constraint Detection for 2D Layout Regularization.

    Science.gov (United States)

    Jiang, Haiyong; Nan, Liangliang; Yan, Dong-Ming; Dong, Weiming; Zhang, Xiaopeng; Wonka, Peter

    2016-08-01

    In this paper, we address the problem of constraint detection for layout regularization. The layout we consider is a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important in digitizing plans or images, such as floor plans and facade images, and in the improvement of user-created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm that automatically detects constraints. We evaluate the proposed framework using a variety of input layouts from different applications. Our results demonstrate that our method has superior performance to the state of the art.
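
    The following is a minimal Python sketch of the kind of constraint detection and enforcement described above, limited to aligning nearly equal left edges of bounding boxes; the tolerance, the grouping rule and the least-squares snap are illustrative assumptions, not the authors' detection algorithm or quadratic program.

    ```python
    import numpy as np

    def regularize_left_edges(boxes, tol=4.0):
        """Snap nearly aligned left edges of bounding boxes.

        boxes: array of shape (n, 4) with rows (x, y, width, height).
        Detected constraint: boxes whose left edges differ by less than tol are aligned.
        Enforcement: each detected group is moved to its mean x, which is the
        least-squares solution of min sum_i (x_i - x_i_original)^2 subject to the
        equality constraints within the group.
        """
        boxes = np.asarray(boxes, dtype=float).copy()
        order = np.argsort(boxes[:, 0])
        group, groups = [order[0]], []
        for i in order[1:]:
            if boxes[i, 0] - boxes[group[-1], 0] < tol:
                group.append(i)
            else:
                groups.append(group)
                group = [i]
        groups.append(group)
        for g in groups:
            boxes[g, 0] = boxes[g, 0].mean()   # closed-form least-squares snap
        return boxes

    print(regularize_left_edges([[10, 0, 50, 20], [12, 30, 40, 20], [80, 0, 30, 20]]))
    ```

    The full method in the paper handles alignment, size and distance constraints jointly inside one quadratic program; this sketch only illustrates the detect-then-enforce idea for a single constraint type.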

  7. Automatic Constraint Detection for 2D Layout Regularization

    KAUST Repository

    Jiang, Haiyong

    2015-09-18

    In this paper, we address the problem of constraint detection for layout regularization. As layout we consider a set of two-dimensional elements where each element is represented by its bounding box. Layout regularization is important for digitizing plans or images, such as floor plans and facade images, and for the improvement of user created contents, such as architectural drawings and slide layouts. To regularize a layout, we aim to improve the input by detecting and subsequently enforcing alignment, size, and distance constraints between layout elements. Similar to previous work, we formulate the layout regularization as a quadratic programming problem. In addition, we propose a novel optimization algorithm to automatically detect constraints. In our results, we evaluate the proposed framework on a variety of input layouts from different applications, which demonstrates our method has superior performance to the state of the art.

  8. Lavrentiev regularization method for nonlinear ill-posed problems

    International Nuclear Information System (INIS)

    Kinh, Nguyen Van

    2002-10-01

    In this paper we shall be concerned with the Lavrentiev regularization method for reconstructing solutions x_0 of nonlinear ill-posed problems F(x) = y_0, where instead of y_0 noisy data y_δ in X with ||y_δ - y_0|| ≤ δ are given, and F: X → X is an accretive nonlinear operator from a real reflexive Banach space X into itself. In this regularization method, regularized solutions x_α^δ are obtained by solving the singularly perturbed nonlinear operator equation F(x) + α(x - x*) = y_δ with some initial guess x*. Assuming certain conditions concerning the operator F and the smoothness of the element x* - x_0, we derive stability estimates which show that the accuracy of the regularized solutions is order optimal provided that the regularization parameter α has been chosen properly. (author)

  9. Conceptual design review report for K Basin Dose Reduction Project clean and coat task

    International Nuclear Information System (INIS)

    Blackburn, L.D.

    1996-01-01

    The strategy for reducing radiation dose originating from radionuclides absorbed in the concrete is to raise the pool water level to provide additional shielding. The concrete walls need to be coated to prevent future radionuclide absorption into the walls. This report documents a conceptual design review of equipment to clean and coat basin walls. The review concluded that the proposed concepts were an acceptable basis for proceeding with detailed final design

  10. A Review of Design Optimization Methods for Electrical Machines

    Directory of Open Access Journals (Sweden)

    Gang Lei

    2017-11-01

    Full Text Available Electrical machines are the hearts of many appliances, industrial equipment and systems. In the context of global sustainability, they must fulfill various requirements, not only physically and technologically but also environmentally. Therefore, their design optimization process becomes more and more complex as more engineering disciplines/domains and constraints are involved, such as electromagnetics, structural mechanics and heat transfer. This paper aims to present a review of the design optimization methods for electrical machines, including design analysis methods and models, optimization models, algorithms and methods/strategies. Several efficient optimization methods/strategies are highlighted with comments, including surrogate-model based and multi-level optimization methods. In addition, two promising and challenging topics in both academic and industrial communities are discussed, and two novel optimization methods are introduced for advanced design optimization of electrical machines. First, a system-level design optimization method is introduced for the development of advanced electric drive systems. Second, a robust design optimization method based on the design for six-sigma technique is introduced for high-quality manufacturing of electrical machines in production. Meanwhile, a proposal is presented for the development of a robust design optimization service based on industrial big data and cloud computing services. Finally, five future directions are proposed, including smart design optimization method for future intelligent design and production of electrical machines.

  11. RF power harvesting: a review on designing methodologies and applications

    Science.gov (United States)

    Tran, Le-Giang; Cha, Hyouk-Kyu; Park, Woo-Tae

    2017-12-01

    Wireless power transmission was conceptualized nearly a century ago. Certain achievements made to date have made power harvesting a reality, capable of providing alternative sources of energy. This review provides a summary of radio frequency (RF) power harvesting technologies in order to serve as a guide for the design of RF energy harvesting units. Since energy harvesting circuits are designed to operate with relatively small voltages and currents, they rely on state-of-the-art electrical technology for obtaining high efficiency. Thus, comprehensive analysis and discussions of various designs and their tradeoffs are included. Finally, recent applications of RF power harvesting are outlined.

  12. Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.

    Science.gov (United States)

    Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C

    2016-05-01

    Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.

  13. Yucca Mountain Project: ESF Title I design control process review report

    International Nuclear Information System (INIS)

    1989-01-01

    The Exploratory Shaft Facility (ESF) Title 1 Design Control Process Review was initiated in response to direction from the Office of Civilian Radioactive Waste Management (OCRWM) (letter: Kale to Gertz, NRC Concerns on Title 1 Design Control Process, November 17, 1988). The direction was to identify the existing documentation that described "… the design control process and the quality assurance that governed …" (a) the development of the requirements documents for the ESF design, (b) the various interfaces between activities, (c) analyses and definitions leading to additional requirements in the System Design Requirements Documents and, (d) completion of Title 1 Design. This report provides historical information for general use in determining the extent of the quality assurance program in existence during the ESF Title 1 Design

  14. Regular graph construction for semi-supervised learning

    International Nuclear Information System (INIS)

    Vega-Oliveros, Didier A; Berton, Lilian; Eberle, Andre Mantini; Lopes, Alneu de Andrade; Zhao, Liang

    2014-01-01

    Semi-supervised learning (SSL) stands out for using a small amount of labeled points for data clustering and classification. In this scenario, graph-based methods allow the analysis of local and global characteristics of the available data by identifying classes or groups regardless of the data distribution and by representing submanifolds in Euclidean space. Most methods used in the literature for SSL classification pay little attention to graph construction. However, regular graphs can achieve better classification accuracy than traditional constructions such as the k-nearest-neighbor (kNN) graph, since kNN favours the generation of hubs and is not appropriate for high-dimensional data. Nevertheless, methods commonly used for generating regular graphs have high computational cost. We tackle this problem by introducing an alternative method for the generation of regular graphs with better runtime performance than methods usually found in the area. Our technique is based on the preferential selection of vertices according to topological measures, such as closeness, generating a regular graph at the end of the process. Experiments using the local and global consistency method for label propagation show that our method provides better or equal classification rates in comparison with kNN
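
    For context, the sketch below shows the standard kNN-graph label-propagation baseline that the abstract contrasts with its regular-graph construction, using scikit-learn on a toy dataset; it is not the authors' preferential-selection method, and the dataset, neighbor count and split are illustrative assumptions.

    ```python
    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.semi_supervised import LabelSpreading

    # Two-moons toy data with only a handful of labeled points (-1 marks unlabeled).
    X, y_true = make_moons(n_samples=300, noise=0.1, random_state=0)
    y = np.full(300, -1)
    labeled = np.random.default_rng(0).choice(300, size=10, replace=False)
    y[labeled] = y_true[labeled]

    # Standard kNN-graph semi-supervised baseline; the paper argues that hub-prone
    # kNN graphs can be improved upon by regular graphs built via topological measures.
    model = LabelSpreading(kernel='knn', n_neighbors=7)
    model.fit(X, y)
    acc = (model.transduction_[y == -1] == y_true[y == -1]).mean()
    print("accuracy on unlabeled points:", acc)
    ```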

  15. Personalization in Game Design for Healthcare: a Literature Review on its Definitions and Effects

    Directory of Open Access Journals (Sweden)

    Marierose van Dooren

    2016-12-01

    Full Text Available Personalization, the involvement of stakeholders in the design process, is often applied in serious game design for health. It is expected to improve the alignment of a game with the preferences and capacities of the end-user, thereby increasing the end-user's motivation to interact with the game, which in turn should enhance the intended health effects of the game. However, the nature and effect of personalization have never been systematically studied, leaving assumptions about personalization ungrounded. In this literature review, we first propose our Personalized Design Process model, in which personalization is defined as stakeholder involvement in the Problem Definition, Product Design, and/or Tailoring phase. Second, we conducted a systematic literature review based on this model, focusing on health and its effects. In this review, 62 of the 2579 studies found were included. Analysis showed that a minority of the studies were of higher methodological quality, and some of these tested the health effect by contrasting tailored versus non-tailored games. Most studies involved stakeholders in the Tailoring phase. Therefore, we conclude that involving stakeholders in the Tailoring phase is valuable. However, to know whether personalization is effective in the Product Design and Problem Definition phases, more studies are needed.

  16. Design review report for the Hanford K East and K West Basins MCO loading system

    International Nuclear Information System (INIS)

    Brisbin, S.A.

    1997-01-01

    This design report presents the final design of the MCO Loading System. The report includes final design drawings, a system description, failure modes and recovery plans, a system operational description, and stress analysis. Design comments from the final design review have been incorporated

  17. Regularization based on steering parameterized Gaussian filters and a Bhattacharyya distance functional

    Science.gov (United States)

    Lopes, Emerson P.

    2001-08-01

    Template regularization embeds the problem of class separability. From the machine vision perspective, this problem is critical when a textural classification procedure is applied to non-stationary pattern mosaic images. Such applications often show low accuracy because the classifiers are disturbed by exogenous or endogenous perturbations of signal regularity. Natural scene imaging, where the images present a certain degree of homogeneity in terms of texture element size or shape (primitives), shows a variety of behaviors, especially varying the preferential spatial directionality. The space-time image pattern characterization is only solved if the classification procedure is designed with the most robust tools within a parallel and hardware perspective. The results compared in this paper are obtained using a framework based on a multi-resolution, frame and hypothesis approach. Two strategies for applying the bank of Gabor filters are considered: an adaptive strategy using the KL transform and a fixed-configuration strategy. The regularization under discussion is accomplished in the pyramid-building stage of the system. The filters are steering Gaussians controlled by free parameters, which are adjusted in accordance with a feedback process driven by hints obtained from interaction functionals over the sequence of frames, post-processed in the training process and including the classification of training-set samples as examples. Besides these adjustments, there is continuous adaptation sensitive to the input data. The experimental assessments focus on two basic issues: the Bhattacharyya distance as a pattern characterization feature, and the combination of the KL transform for feature selection and adaptation with regularization of the behavior of the pattern Bhattacharyya distance functional (BDF), using the BDF state separability and symmetry as the main indicators of an optimum framework parameter configuration.

  18. EC6 design features and pre-project licensing review

    Energy Technology Data Exchange (ETDEWEB)

    Yu, S.; Lee, A.G.; Dinh, N.B.; Soulard, M. [CANDU Energy Inc., Mississauga, Ontario, (Canada)

    2013-07-01

    The Enhanced CANDU 6 (EC6) is the new Generation III CANDU reactor design that meets the most up-to-date Canadian regulatory requirements and customer expectations. Candu Energy Inc. is finalizing development of the EC6, which incorporates the CANDU 6's well-proven features and adds enhancements that strengthen reactor safety margins and improve operability. The EC6 builds on the proven high-performance design and the defence-in-depth features of CANDU 6 units, and has incorporated extensive operational feedback, including lessons learned from Fukushima. This paper provides the status of the engineering program, including progress on the pre-licensing review of the EC6 design by the Canadian regulator, the CNSC, and also highlights the design and safety enhancements incorporated in the EC6 product. Safety enhancements to meet safety goals and to improve the robustness of systems in responding to design basis accidents and beyond-design-basis accidents include: a new severe accident recovery and heat removal system; an improved emergency heat removal system; faster shutoff rods with improved safety margins; mechanical guaranteed shutdown rods; daily load cycling capability; robust containment with a containment filtered venting system; and improved backed-up electrical supply and cooling services. (author)

  19. Physical model of dimensional regularization

    Energy Technology Data Exchange (ETDEWEB)

    Schonfeld, Jonathan F.

    2016-12-15

    We explicitly construct fractals of dimension 4-ε on which dimensional regularization approximates scalar-field-only quantum-field theory amplitudes. The construction does not require fractals to be Lorentz-invariant in any sense, and we argue that there probably is no Lorentz-invariant fractal of dimension greater than 2. We derive dimensional regularization's power-law screening first for fractals obtained by removing voids from 3-dimensional Euclidean space. The derivation applies techniques from elementary dielectric theory. Surprisingly, fractal geometry by itself does not guarantee the appropriate power-law behavior; boundary conditions at fractal voids also play an important role. We then extend the derivation to 4-dimensional Minkowski space. We comment on generalization to non-scalar fields, and speculate about implications for quantum gravity. (orig.)

  20. Information-theoretic semi-supervised metric learning via entropy regularization.

    Science.gov (United States)

    Niu, Gang; Dai, Bo; Yamada, Makoto; Sugiyama, Masashi

    2014-08-01

    We propose a general information-theoretic approach to semi-supervised metric learning called SERAPH (SEmi-supervised metRic leArning Paradigm with Hypersparsity) that does not rely on the manifold assumption. Given the probability parameterized by a Mahalanobis distance, we maximize its entropy on labeled data and minimize its entropy on unlabeled data following entropy regularization. For metric learning, entropy regularization improves manifold regularization by considering the dissimilarity information of unlabeled data in the unsupervised part, and hence it allows the supervised and unsupervised parts to be integrated in a natural and meaningful way. Moreover, we regularize SERAPH by trace-norm regularization to encourage low-dimensional projections associated with the distance metric. The nonconvex optimization problem of SERAPH could be solved efficiently and stably by either a gradient projection algorithm or an EM-like iterative algorithm whose M-step is convex. Experiments demonstrate that SERAPH compares favorably with many well-known metric learning methods, and the learned Mahalanobis distance possesses high discriminability even under noisy environments.

  1. CONFOUNDING STRUCTURE OF TWO-LEVEL NONREGULAR FACTORIAL DESIGNS

    Institute of Scientific and Technical Information of China (English)

    Ren Junbai

    2012-01-01

    In design theory, the alias structure of regular fractional factorial designs is elegantly described with group theory. However, this approach cannot be applied directly to nonregular designs. For an arbitrary nonregular design, a natural question is how to describe the confounding relations between its effects: is there any inner structure similar to that of regular designs? The aim of this article is to answer this basic question. Using the coefficients of the indicator function, the confounding structure of nonregular fractional factorial designs is obtained in the form of linear constraints on the values of effects. A method to estimate the sparse significant effects in an arbitrary nonregular design is given through an example.
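
    A small numeric illustration of the regular-design aliasing that the abstract contrasts with nonregular designs: in the 2^(3-1) fraction generated by C = AB (defining relation I = ABC), the main effect of C is completely confounded with the AB interaction. The numpy construction below is the textbook example, not the indicator-function machinery used in the paper.

    ```python
    import numpy as np
    from itertools import product

    # Regular 2^(3-1) fractional factorial with generator C = A*B (defining relation I = ABC).
    AB = np.array(list(product([-1, 1], repeat=2)))   # full factorial in A and B
    A, B = AB[:, 0], AB[:, 1]
    C = A * B                                         # generated column
    design = np.column_stack([A, B, C])
    print(design)

    # The contrast column for C is identical to the AB interaction contrast,
    # so C is fully aliased with AB (and likewise A with BC, B with AC).
    print("C aliased with AB:", np.array_equal(C, A * B))
    ```

    In a nonregular design the corresponding columns are only partially correlated (the correlation lies strictly between 0 and 1), which is why the simple group-theoretic alias description no longer applies and the linear-constraint description of the paper is needed.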

  2. Fluctuations of quantum fields via zeta function regularization

    International Nuclear Information System (INIS)

    Cognola, Guido; Zerbini, Sergio; Elizalde, Emilio

    2002-01-01

    Explicit expressions for the expectation values and the variances of some observables, which are bilinear quantities in the quantum fields on a D-dimensional manifold, are derived making use of zeta function regularization. It is found that the variance, related to the second functional variation of the effective action, requires a further regularization and that the relative regularized variance turns out to be 2/N, where N is the number of the fields, thus being independent of the dimension D. Some illustrating examples are worked through. The issue of the stress tensor is also briefly addressed

  3. X-ray computed tomography using curvelet sparse regularization.

    Science.gov (United States)

    Wieczorek, Matthias; Frikel, Jürgen; Vogel, Jakob; Eggl, Elena; Kopp, Felix; Noël, Peter B; Pfeiffer, Franz; Demaret, Laurent; Lasser, Tobias

    2015-04-01

    Reconstruction of x-ray computed tomography (CT) data remains a mathematically challenging problem in medical imaging. Complementing the standard analytical reconstruction methods, sparse regularization is growing in importance, as it allows inclusion of prior knowledge. The paper presents a method for sparse regularization based on the curvelet frame for the application to iterative reconstruction in x-ray computed tomography. In this work, the authors present an iterative reconstruction approach based on the alternating direction method of multipliers using curvelet sparse regularization. Evaluation of the method is performed on a specifically crafted numerical phantom dataset to highlight the method's strengths. Additional evaluation is performed on two real datasets from commercial scanners with different noise characteristics, a clinical bone sample acquired in a micro-CT and a human abdomen scanned in a diagnostic CT. The results clearly illustrate that curvelet sparse regularization has characteristic strengths. In particular, it improves the restoration and resolution of highly directional, high contrast features with smooth contrast variations. The authors also compare this approach to the popular technique of total variation and to traditional filtered backprojection. The authors conclude that curvelet sparse regularization is able to improve reconstruction quality by reducing noise while preserving highly directional features.
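
    Schematically, and using the generic textbook symbols rather than the paper's exact formulation, the sparse-regularized reconstruction solves

    \[
    \min_{x}\; \tfrac{1}{2}\,\| A x - y \|_2^2 \;+\; \lambda\, \| \Psi x \|_1 ,
    \]

    where A is the discretized X-ray transform, y the measured projections, Ψ the curvelet transform and λ > 0 the regularization weight. ADMM introduces an auxiliary variable z = Ψx and alternates a data-consistency update in x, a soft-thresholding (shrinkage) update in z, and a dual update, which is the alternating structure the abstract refers to.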

  4. A Review Of Design And Control Of Automated Guided Vehicle Systems

    OpenAIRE

    Le-Anh, Tuan; Koster, René

    2004-01-01

    This paper presents a review on the design and control of automated guided vehicle systems. We address most key related issues, including guide-path design, estimating the number of vehicles, vehicle scheduling, idle-vehicle positioning, battery management, vehicle routing, and conflict resolution. We discuss and classify important models and results from key publications in the literature on automated guided vehicle systems, including often-neglected areas, such as idle-vehicle positionin...

  5. Semisupervised Support Vector Machines With Tangent Space Intrinsic Manifold Regularization.

    Science.gov (United States)

    Sun, Shiliang; Xie, Xijiong

    2016-09-01

    Semisupervised learning has been an active research topic in machine learning and data mining. One main reason is that labeling examples is expensive and time-consuming, while large numbers of unlabeled examples are available in many practical problems. So far, Laplacian regularization has been widely used in semisupervised learning. In this paper, we propose a new regularization method called tangent space intrinsic manifold regularization. It is intrinsic to the data manifold and favors linear functions on the manifold. Fundamental elements involved in the formulation of the regularization are local tangent space representations, which are estimated by local principal component analysis, and the connections that relate adjacent tangent spaces. Simultaneously, we explore its application to semisupervised classification and propose two new learning algorithms called tangent space intrinsic manifold regularized support vector machines (TiSVMs) and tangent space intrinsic manifold regularized twin SVMs (TiTSVMs). They effectively integrate the tangent space intrinsic manifold regularization consideration. The optimization of TiSVMs can be solved by a standard quadratic program, while the optimization of TiTSVMs can be solved by a pair of standard quadratic programs. The experimental results on semisupervised classification problems show the effectiveness of the proposed semisupervised learning algorithms.

  6. Regularity and chaos in cavity QED

    International Nuclear Information System (INIS)

    Bastarrachea-Magnani, Miguel Angel; López-del-Carpio, Baldemar; Chávez-Carlos, Jorge; Lerma-Hernández, Sergio; Hirsch, Jorge G

    2017-01-01

    The interaction of a quantized electromagnetic field in a cavity with a set of two-level atoms inside it can be described with algebraic Hamiltonians of increasing complexity, from the Rabi to the Dicke models. Their algebraic character allows, through the use of coherent states, a semiclassical description in phase space, where the non-integrable Dicke model has regions associated with regular and chaotic motion. The appearance of classical chaos can be quantified by calculating the largest Lyapunov exponent over the whole available phase space for a given energy. In the quantum regime, employing efficient diagonalization techniques, we are able to perform a detailed quantitative study of the regular and chaotic regions, where the quantum participation ratio (PR) of coherent states in the eigenenergy basis plays a role equivalent to the Lyapunov exponent. It is noted that, in the thermodynamic limit, dividing the participation ratio by the number of atoms leads to a positive value in chaotic regions, while it tends to zero in the regular ones. (paper)

  7. A Year of Progress: NASA's Space Launch System Approaches Critical Design Review

    Science.gov (United States)

    Askins, Bruce; Robinson, Kimberly

    2015-01-01

    NASA's Space Launch System (SLS) made significant progress on the manufacturing floor and on the test stand in 2014 and positioned itself for a successful Critical Design Review in mid-2015. SLS, the world's only exploration-class heavy lift rocket, has the capability to dramatically increase the mass and volume of human and robotic exploration. Additionally, it will decrease overall mission risk, increase safety, and simplify ground and mission operations - all significant considerations for crewed missions and unique high-value national payloads. Development is now focused on a configuration with 70 metric tons (t) of payload to low Earth orbit (LEO), more than double the payload of the retired Space Shuttle program or current operational vehicles. This "Block 1" design will launch NASA's Orion Multi-Purpose Crew Vehicle (MPCV) on an uncrewed flight beyond the Moon and back and the first crewed flight around the Moon. The current design has a direct evolutionary path to a vehicle with a 130t lift capability that offers even more flexibility to reduce planetary trip times, simplify payload design cycles, and provide new capabilities such as planetary sample returns. Every major element of SLS has successfully completed its Critical Design Review and now has hardware in production or testing. In fact, the SLS MPCV-to-Stage-Adapter (MSA) flew successfully on the Exploration Flight Test (EFT) 1 launch of a Delta IV and Orion spacecraft in December 2014. The SLS Program is currently working toward vehicle Critical Design Review in mid-2015. This paper will discuss these and other technical and programmatic successes and challenges over the past year and provide a preview of work ahead before the first flight of this new capability.

  8. Optimized star sensors laboratory calibration method using a regularization neural network.

    Science.gov (United States)

    Zhang, Chengfen; Niu, Yanxiong; Zhang, Hao; Lu, Jiazhen

    2018-02-10

    High-precision ground calibration is essential to ensure the performance of star sensors. However, complex distortion and multi-error coupling have brought great difficulties to traditional calibration methods, especially for large field-of-view (FOV) star sensors. Although increasing the complexity of the models is an effective way to improve calibration accuracy, it significantly increases the demand for calibration data. In order to achieve high-precision calibration of star sensors with a large FOV, a novel laboratory calibration method based on a regularization neural network is proposed. A multi-layer neural network is designed to directly represent the mapping from the star vector to the corresponding star-point coordinate. To ensure the generalization performance of the network, regularization strategies are incorporated into the network structure and the training algorithm. Simulation and experiment results demonstrate that the proposed method can achieve high precision with less calibration data and without any other prior information. Compared with traditional methods, the calibration error of the star sensor decreased by about 30%. The proposed method can satisfy the precision requirement for large-FOV star sensors.
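
    As a rough sketch of the general idea (an L2-regularized multi-layer network mapping unit star vectors to detector coordinates), the PyTorch snippet below uses weight decay as the regularization strategy; the architecture, loss, data and regularization strength are illustrative assumptions and may differ from the network and regularization used in the paper.

    ```python
    import torch
    from torch import nn

    # Hypothetical calibration data: unit star vectors (n, 3) -> star-point coordinates (n, 2).
    n = 2000
    v = torch.nn.functional.normalize(torch.randn(n, 3), dim=1)
    uv = torch.randn(n, 2)  # placeholder targets; in practice these come from the calibration rig

    model = nn.Sequential(nn.Linear(3, 64), nn.Tanh(),
                          nn.Linear(64, 64), nn.Tanh(),
                          nn.Linear(64, 2))

    # weight_decay adds an L2 penalty on the weights, one common regularization
    # strategy for keeping the learned mapping smooth when calibration data are limited.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
    loss_fn = nn.MSELoss()

    for epoch in range(200):
        opt.zero_grad()
        loss = loss_fn(model(v), uv)
        loss.backward()
        opt.step()
    ```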

  9. Actinic Mask Inspection at the ALS Initial Design Review

    International Nuclear Information System (INIS)

    Barty, A; Chapman, H; Sweeney, D; Levesque, R; Bokor, J; Gullikson, E; Jong, S; Liu, Y; Yi, M; Denbeaux, G; Goldberg, K; Naulleau, P; Denham, P; Rekawa, S; Baston, P; Tackaberry, R; Barale, P

    2003-01-01

    This report is the first milestone report for the actinic mask blank inspection project conducted at the VNL, which forms sub-section 3 of the Q1 2003 mask blank technology transfer program at the VNL. Specifically this report addresses deliverable 3.1.1--design review and preliminary tool design. The goal of this project is to design an actinic mask inspection tool capable of operating in two modes: high-speed scanning for the detection of multilayer defects (inspection mode), and a high-resolution aerial image mode in which the image emulates the imaging illumination conditions of a stepper system (aerial image or AIM mode). The purpose and objective of these two modes is as follows: (1) Defect inspection mode--This imaging mode is designed to scan large areas of the mask for defects EUV multilayer coatings. The goal is to detect the presence of multilayer defects on a mask blank and to store the co-ordinates for subsequent review in AIM mode, thus it is not essential that the illumination and imaging conditions match that of a production stepper. Potential uses for this imaging mode include: (a) Correlating the results obtained using actinic inspection with results obtained using other non-EUV defect inspection systems to verify that the non-EUV scanning systems are detecting all critical defects; (b) Gaining sufficient information to associate defects with particular processes, such as various stages of the multilayer deposition or different modes of operation of the deposition tool; and (c) Assessing the density and EUV impact of surface and multilayer anomalies. Because of the low defect density achieved using current multilayer coating technology it is necessary to be able to efficiently scan large areas of the mask in order to obtain sufficient statistics for use in cross-correlation experiments. Speed of operation as well as sensitivity is therefore key to operation in defect inspection mode. (2) Aerial Image Microscope (AIM) mode--In AIM mode the tool is

  10. Cognitive Aspects of Regularity Exhibit When Neighborhood Disappears

    Science.gov (United States)

    Chen, Sau-Chin; Hu, Jon-Fan

    2015-01-01

    Although regularity refers to the compatibility between the pronunciation of a character and the sound of its phonetic component, it has been suggested to be part of consistency, which is defined by neighborhood characteristics. Two experiments demonstrate how the regularity effect is amplified or reduced by neighborhood characteristics and reveal the…

  11. The platform shapes the message : How website design affects abstraction and valence of online consumer reviews

    NARCIS (Netherlands)

    Aerts, Goele; Smits, Tim; Verlegh, Peeter W.J.

    2017-01-01

    Online consumer reviews provide relevant information about products and services for consumers. In today's networked age, the online consumer review platform market is hyper-competitive. These platforms can easily change different design characteristics to get more reviewers and to nudge reviewers

  12. Prevalence of Dental Erosion among the Young Regular Swimmers in Kaunas, Lithuania.

    Science.gov (United States)

    Zebrauskas, Andrius; Birskute, Ruta; Maciulskiene, Vita

    2014-04-01

    To determine the prevalence of dental erosion among competitive swimmers in Kaunas, the second largest city in Lithuania. The study was designed as a cross-sectional survey, with a questionnaire and clinical examination protocols. The participants were 12 - 25 year-old swimmers regularly practicing in the swimming pools of Kaunas. Of the total of 132 participants there were 76 (12 - 17 year-old) and 56 (18 - 25 year-old) individuals, in Groups 1 and 2, respectively. Participants were examined for dental erosion using a portable dental unit equipped with fibre-optic light, compressed air and suction, and standard dental instruments for oral inspection. The Lussi index was applied for recording dental erosion. The completed questionnaires, focused on the common erosion risk factors, were returned by all participants. Dental erosion was found in 25% of the 12 - 17 year-olds, and in 50% of the 18 - 25 year-olds. The mean number of surfaces with erosion was 6.31 (SD 4.37). All eroded surfaces were evaluated as grade 1. Swimming training duration and the participants' age correlated positively (Kendall correlation, r = 0.65, P < …). No relationship between dental erosion and the analyzed risk factors (gastroesophageal reflux disease, frequent vomiting, dry mouth, regular intake of acidic medicines, carbonated drinks) was found in either study group. Prevalence of dental erosion of very low degree was high among the regular swimmers in Kaunas, and was significantly related to swimmers' age.

  13. An adaptive regularization parameter choice strategy for multispectral bioluminescence tomography

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jinchao; Qin Chenghu; Jia Kebin; Han Dong; Liu Kai; Zhu Shouping; Yang Xin; Tian Jie [Medical Image Processing Group, Institute of Automation, Chinese Academy of Sciences, P. O. Box 2728, Beijing 100190 (China); College of Electronic Information and Control Engineering, Beijing University of Technology, Beijing 100124 (China); Medical Image Processing Group, Institute of Automation, Chinese Academy of Sciences, P. O. Box 2728, Beijing 100190 (China); Medical Image Processing Group, Institute of Automation, Chinese Academy of Sciences, P. O. Box 2728, Beijing 100190 (China) and School of Life Sciences and Technology, Xidian University, Xi' an 710071 (China)

    2011-11-15

    Purpose: Bioluminescence tomography (BLT) provides an effective tool for monitoring physiological and pathological activities in vivo. However, the measured data in bioluminescence imaging are corrupted by noise. Therefore, regularization methods are commonly used to find a regularized solution. Nevertheless, for the quality of the reconstructed bioluminescent source obtained by regularization methods, the choice of the regularization parameters is crucial. To date, the selection of regularization parameters remains challenging. With regard to the above problems, the authors proposed a BLT reconstruction algorithm with an adaptive parameter choice rule. Methods: The proposed reconstruction algorithm uses a diffusion equation for modeling the bioluminescent photon transport. The diffusion equation is solved with a finite element method. Computed tomography (CT) images provide anatomical information regarding the geometry of the small animal and its internal organs. To reduce the ill-posedness of BLT, spectral information and the optimal permissible source region are employed. Then, the relationship between the unknown source distribution and multiview and multispectral boundary measurements is established based on the finite element method and the optimal permissible source region. Since the measured data are noisy, the BLT reconstruction is formulated with an l2 data-fidelity term and a general regularization term. When choosing the regularization parameters for BLT, an efficient model function approach is proposed, which does not require knowledge of the noise level. This approach only requires the computation of the residual and regularized solution norms. With this knowledge, we construct the model function to approximate the objective function, and the regularization parameter is updated iteratively. Results: First, the micro-CT based mouse phantom was used for simulation verification. Simulation experiments were used to illustrate why multispectral data were used
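
    The paper's model-function rule is not reproduced here; as a generic illustration of noise-level-free regularization-parameter selection for an ill-posed linear problem, the sketch below pairs Tikhonov reconstruction with generalized cross-validation (GCV). The forward matrix, noise level and parameter grid are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic ill-posed linear model y = A x + noise (a stand-in for the discretized
# BLT forward model; all dimensions and values here are illustrative).
n, m = 80, 200
A = rng.normal(size=(n, m)) * (1.0 / (1.0 + np.arange(m)))   # rapidly decaying column scales
x_true = np.zeros(m); x_true[[10, 50, 120]] = [1.0, -0.5, 0.8]
y = A @ x_true + rng.normal(scale=1e-3, size=n)

def tikhonov(lam):
    """Solve min ||A x - y||^2 + lam ||x||^2 via the regularized normal equations."""
    return np.linalg.solve(A.T @ A + lam * np.eye(m), A.T @ y)

def gcv(lam):
    """Generalized cross-validation score: a parameter-choice rule that, like the
    paper's model-function approach, needs no estimate of the noise level."""
    H = A @ np.linalg.solve(A.T @ A + lam * np.eye(m), A.T)   # influence (hat) matrix
    r = y - H @ y
    return n * (r @ r) / np.trace(np.eye(n) - H) ** 2

lams = np.logspace(-8, 1, 50)
best = min(lams, key=gcv)
x_hat = tikhonov(best)
print(f"selected lambda ~ {best:.2e}, residual norm {np.linalg.norm(A @ x_hat - y):.2e}")
```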

  14. Automotive HMI design and participatory user involvement: review and perspectives.

    Science.gov (United States)

    François, Mathilde; Osiurak, François; Fort, Alexandra; Crave, Philippe; Navarro, Jordan

    2017-04-01

    Automotive human-machine interface (HMI) design is facing new challenges due to the technological advances of the last decades. The design process has to be adapted in order to address human factors and road safety challenges. It is now widely accepted that user involvement in the HMI design process is valuable. However, the current form of user involvement in industry remains at the stages of concept assessment and usability tests. Moreover, the literature in other fields (e.g. information systems) promotes a broader user involvement with participatory design (i.e. the user is fully involved in the development process). This article reviews the established benefits of participatory design and reveals perspectives for automotive HMI quality improvement in a cognitive ergonomic framework. Practitioner Summary: Automotive HMI quality determines, in part, drivers' ability to perform primary driving tasks while using in-vehicle devices. User involvement in the design process is a key point to contribute to HMI quality. This article reports the potential benefits of a broad involvement from drivers to meet automotive HMI design challenges.

  15. Adaptive design methods in clinical trials – a review

    Directory of Open Access Journals (Sweden)

    Chang Mark

    2008-05-01

    In recent years, the use of adaptive design methods in clinical research and development based on accrued data has become very popular due to its flexibility and efficiency. Based on the adaptations applied, adaptive designs can be classified into three categories: prospective, concurrent (ad hoc), and retrospective adaptive designs. An adaptive design allows modifications to be made to trial and/or statistical procedures of ongoing clinical trials. However, it is a concern that the actual patient population after the adaptations could deviate from the originally targeted patient population, and consequently the overall type I error rate (the probability of erroneously claiming efficacy for an ineffective drug) may not be controlled. In addition, major adaptations of trial and/or statistical procedures of on-going trials may result in a totally different trial that is unable to address the scientific/medical questions the trial intends to answer. In this article, several commonly considered adaptive designs in clinical trials are reviewed. Impacts of ad hoc adaptations (protocol amendments), challenges of by-design (prospective) adaptations, and obstacles of retrospective adaptations are described. Strategies for the use of adaptive design in clinical development of rare diseases are discussed. Some examples concerning the development of Velcade intended for multiple myeloma and non-Hodgkin's lymphoma are given. Practical issues that are commonly encountered when implementing adaptive design methods in clinical trials are also discussed.

  16. Toward best practice in Human Machine Interface design for older drivers: A review of current design guidelines.

    Science.gov (United States)

    Young, K L; Koppel, S; Charlton, J L

    2017-09-01

    Older adults are the fastest growing segment of the driving population. While there is a strong emphasis for older people to maintain their mobility, the safety of older drivers is a serious community concern. Frailty and declines in a range of age-related sensory, cognitive, and physical impairments can place older drivers at an increased risk of crash-related injuries and death. A number of studies have indicated that in-vehicle technologies such as Advanced Driver Assistance Systems (ADAS) and In-Vehicle Information Systems (IVIS) may provide assistance to older drivers. However, these technologies will only benefit older drivers if their design is congruent with the complex needs and diverse abilities of this driving cohort. The design of ADAS and IVIS is largely informed by automotive Human Machine Interface (HMI) guidelines. However, it is unclear to what extent the declining sensory, cognitive and physical capabilities of older drivers are addressed in the current guidelines. This paper provides a review of key current design guidelines for IVIS and ADAS with respect to the extent they address age-related changes in functional capacities. The review revealed that most of the HMI guidelines do not address design issues related to older driver impairments. In fact, in many guidelines, driver age and sensory, cognitive and physical impairments are not mentioned at all, and where reference is made, it is typically very broad. Prescriptive advice on how to actually design a system so that it addresses the needs and limitations of older drivers is not provided. In order for older drivers to reap the full benefits that in-vehicle technology can afford, it is critical that further work establish how older driver limitations and capabilities can be supported by the system design process, including their inclusion into HMI design guidelines. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Matrix regularization of embedded 4-manifolds

    International Nuclear Information System (INIS)

    Trzetrzelewski, Maciej

    2012-01-01

    We consider products of two 2-manifolds such as S²×S², embedded in Euclidean space and show that the corresponding 4-volume preserving diffeomorphism algebra can be approximated by a tensor product SU(N)⊗SU(N) i.e. functions on a manifold are approximated by the Kronecker product of two SU(N) matrices. A regularization of the 4-sphere is also performed by constructing N²×N² matrix representations of the 4-algebra (and as a byproduct of the 3-algebra which makes the regularization of S³ also possible).

  18. Optimal Tikhonov Regularization in Finite-Frequency Tomography

    Science.gov (United States)

    Fang, Y.; Yao, Z.; Zhou, Y.

    2017-12-01

    The last decade has witnessed a progressive transition in seismic tomography from ray theory to finite-frequency theory which overcomes the resolution limit of the high-frequency approximation in ray theory. In addition to approximations in wave propagation physics, a main difference between ray-theoretical tomography and finite-frequency tomography is the sparseness of the associated sensitivity matrix. It is well known that seismic tomographic problems are ill-posed and regularizations such as damping and smoothing are often applied to analyze the tradeoff between data misfit and model uncertainty. The regularizations depend on the structure of the matrix as well as noise level of the data. Cross-validation has been used to constrain data uncertainties in body-wave finite-frequency inversions when measurements at multiple frequencies are available to invert for a common structure. In this study, we explore an optimal Tikhonov regularization in surface-wave phase-velocity tomography based on minimization of an empirical Bayes risk function using theoretical training datasets. We exploit the structure of the sensitivity matrix in the framework of singular value decomposition (SVD) which also allows for the calculation of complete resolution matrix. We compare the optimal Tikhonov regularization in finite-frequency tomography with traditional tradeoff analysis using surface wave dispersion measurements from global as well as regional studies.
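
    A minimal sketch of the SVD view of Tikhonov regularization discussed above (illustrative only, not the authors' tomography code): each singular component is damped by a filter factor sigma_i^2/(sigma_i^2 + lambda), and the complete resolution matrix follows from the same decomposition. The toy matrix and noise level are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy linear tomography problem G m = d (G stands in for a finite-frequency sensitivity matrix).
G = rng.normal(size=(120, 60)) * np.exp(-0.05 * np.arange(60))   # smoothly decaying sensitivities
m_true = np.sin(np.linspace(0, 3 * np.pi, 60))
d = G @ m_true + rng.normal(scale=0.05, size=120)

U, s, Vt = np.linalg.svd(G, full_matrices=False)

def tikhonov_svd(lam):
    """Tikhonov solution and resolution matrix computed from the SVD of G."""
    f = s**2 / (s**2 + lam)                 # filter factors damping small singular values
    m_hat = Vt.T @ (f * (U.T @ d) / s)      # regularized model estimate
    R = Vt.T @ (f[:, None] * Vt)            # resolution matrix R = V F V^T
    return m_hat, R

for lam in [1e-4, 1e-2, 1.0]:
    m_hat, R = tikhonov_svd(lam)
    misfit = np.linalg.norm(G @ m_hat - d)
    print(f"lambda={lam:.0e}  misfit={misfit:.2f}  trace(R)={np.trace(R):.1f}")
```

    The trace of the resolution matrix gives the effective number of resolved model parameters, which is one way to visualize the misfit-versus-resolution tradeoff the abstract refers to.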

  19. Design for Sustainability and Project Management Literature – A Review

    DEFF Research Database (Denmark)

    Ali, Faheem; Boks, Casper; Bey, Niki

    2016-01-01

    The growing pressure on natural resources and increasing global trade have made sustainability issues a prime area of concern for all businesses alike. The increased focus on sustainability has impacted the way projects are conceived, planned, executed and evaluated in industries. Since project management literature has hardly been considered in design for sustainability research, this article attempts to review the points of intersection between these two fields, and explores the potential that knowledge from project management literature has in improving efficiency and effectiveness of development and implementation of design for sustainability tools.

  20. Image segmentation with a novel regularized composite shape prior based on surrogate study

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu [The Department of Radiation Oncology, University of California, Los Angeles, California 90095 (United States)

    2016-05-15

    Purpose: Incorporating training into image segmentation is a good approach to achieve additional robustness. This work aims to develop an effective strategy to utilize shape prior knowledge, so that the segmentation label evolution can be driven toward the desired global optimum. Methods: In the variational image segmentation framework, a regularization for the composite shape prior is designed to incorporate the geometric relevance of individual training data to the target, which is inferred by an image-based surrogate relevance metric. Specifically, this regularization is imposed on the linear weights of composite shapes and serves as a hyperprior. The overall problem is formulated in a unified optimization setting and a variational block-descent algorithm is derived. Results: The performance of the proposed scheme is assessed in both corpus callosum segmentation from an MR image set and clavicle segmentation based on CT images. The resulted shape composition provides a proper preference for the geometrically relevant training data. A paired Wilcoxon signed rank test demonstrates statistically significant improvement of image segmentation accuracy, when compared to multiatlas label fusion method and three other benchmark active contour schemes. Conclusions: This work has developed a novel composite shape prior regularization, which achieves superior segmentation performance than typical benchmark schemes.

  1. Image segmentation with a novel regularized composite shape prior based on surrogate study

    International Nuclear Information System (INIS)

    Zhao, Tingting; Ruan, Dan

    2016-01-01

    Purpose: Incorporating training into image segmentation is a good approach to achieve additional robustness. This work aims to develop an effective strategy to utilize shape prior knowledge, so that the segmentation label evolution can be driven toward the desired global optimum. Methods: In the variational image segmentation framework, a regularization for the composite shape prior is designed to incorporate the geometric relevance of individual training data to the target, which is inferred by an image-based surrogate relevance metric. Specifically, this regularization is imposed on the linear weights of composite shapes and serves as a hyperprior. The overall problem is formulated in a unified optimization setting and a variational block-descent algorithm is derived. Results: The performance of the proposed scheme is assessed in both corpus callosum segmentation from an MR image set and clavicle segmentation based on CT images. The resulted shape composition provides a proper preference for the geometrically relevant training data. A paired Wilcoxon signed rank test demonstrates statistically significant improvement of image segmentation accuracy, when compared to multiatlas label fusion method and three other benchmark active contour schemes. Conclusions: This work has developed a novel composite shape prior regularization, which achieves superior segmentation performance than typical benchmark schemes.

  2. Relationship with the Regularity of Visits Complications of Hypertension in Patients more than 45 years old

    Directory of Open Access Journals (Sweden)

    Wahyu Wijayanto

    2014-01-01

    Hypertension is a risk factor for various degenerative diseases such as coronary heart disease, stroke and other vascular diseases. Factors that may increase the risk of hypertension include a poor lifestyle, such as smoking, excessive salt consumption and lack of exercise. This study was conducted to determine the relationship between knowledge of hypertension complications and the regularity of visits among hypertensive patients more than 45 years old at the Tembok Dukuh health center in Surabaya. The study was observational with a cross-sectional design. The sample comprised 48 hypertensive patients more than 45 years old who were treated at the Tembok Dukuh health center. The independent variable was the patients' knowledge of hypertension complications, and the dependent variable was the regularity of their visits to the health center. Data were analyzed using cross-tabulation. The results of the cross-tabulation show that 30 of the 48 respondents had poor knowledge, which affected the regularity of their visits to the Tembok Dukuh health center. It can be concluded that most hypertensive patients more than 45 years old who came to the Tembok Dukuh health center had limited knowledge of hypertension complications, which was associated with reduced regularity of visits. Keywords: hypertension, knowledge, regularity of visits

  3. Likelihood ratio decisions in memory: three implied regularities.

    Science.gov (United States)

    Glanzer, Murray; Hilford, Andrew; Maloney, Laurence T

    2009-06-01

    We analyze four general signal detection models for recognition memory that differ in their distributional assumptions. Our analyses show that a basic assumption of signal detection theory, the likelihood ratio decision axis, implies three regularities in recognition memory: (1) the mirror effect, (2) the variance effect, and (3) the z-ROC length effect. For each model, we present the equations that produce the three regularities and show, in computed examples, how they do so. We then show that the regularities appear in data from a range of recognition studies. The analyses and data in our study support the following generalization: Individuals make efficient recognition decisions on the basis of likelihood ratios.
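
    To make the likelihood-ratio decision axis concrete, here is a minimal sketch with two Gaussian strength distributions (an unequal-variance signal detection model; the parameters are illustrative, not taken from the paper): each test item's strength is converted to a log likelihood ratio and compared with a criterion.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

# Unequal-variance Gaussian signal detection model (illustrative parameters).
mu_new, sd_new = 0.0, 1.0      # "new" (lure) strength distribution
mu_old, sd_old = 1.0, 1.25     # "old" (target) strength distribution

def log_likelihood_ratio(x):
    """log [ p(x | old) / p(x | new) ] -- the decision axis discussed in the abstract."""
    return norm.logpdf(x, mu_old, sd_old) - norm.logpdf(x, mu_new, sd_new)

# Simulate a recognition test and respond "old" when the likelihood ratio exceeds 1.
targets = rng.normal(mu_old, sd_old, size=10000)
lures = rng.normal(mu_new, sd_new, size=10000)
hit_rate = np.mean(log_likelihood_ratio(targets) > 0)
fa_rate = np.mean(log_likelihood_ratio(lures) > 0)
print(f"hit rate {hit_rate:.3f}, false-alarm rate {fa_rate:.3f}")
```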

  4. Low-Rank Matrix Factorization With Adaptive Graph Regularizer.

    Science.gov (United States)

    Lu, Gui-Fu; Wang, Yong; Zou, Jian

    2016-05-01

    In this paper, we present a novel low-rank matrix factorization algorithm with adaptive graph regularizer (LMFAGR). We extend the recently proposed low-rank matrix factorization with manifold regularization (MMF) method with an adaptive regularizer. Different from MMF, which constructs an affinity graph in advance, LMFAGR can simultaneously seek the graph weight matrix and low-dimensional representations of data. That is, graph construction and low-rank matrix factorization are incorporated into a unified framework, which results in an automatically updated graph rather than a predefined one. The experimental results on some data sets demonstrate that the proposed algorithm outperforms the state-of-the-art low-rank matrix factorization methods.
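
    For orientation, the sketch below implements only the simpler fixed-graph variant of this idea, i.e. matrix factorization with a graph-Laplacian penalty on the low-dimensional representations; the adaptive graph learning that distinguishes LMFAGR is not reproduced, and the data, neighbourhood size and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Data matrix X (features x samples) and a fixed k-NN affinity graph over the samples.
X = rng.normal(size=(50, 200))
D2 = ((X.T[:, None, :] - X.T[None, :, :]) ** 2).sum(-1)       # pairwise squared distances
W = np.zeros((200, 200))
for i in range(200):
    for j in np.argsort(D2[i])[1:6]:                          # 5 nearest neighbours
        W[i, j] = W[j, i] = np.exp(-D2[i, j] / D2.mean())
L = np.diag(W.sum(1)) - W                                     # graph Laplacian

# Factorize X ~ U V^T with a manifold (graph) penalty tr(V^T L V) on the sample factors.
# The paper additionally *adapts* W during optimization; here the graph is kept fixed.
k, lam, lr = 10, 0.1, 1e-3
U = rng.normal(scale=0.1, size=(50, k))
V = rng.normal(scale=0.1, size=(200, k))
for it in range(500):
    R = X - U @ V.T                                           # reconstruction residual
    U += lr * (R @ V)                                         # gradient step on U
    V += lr * (R.T @ U - lam * L @ V)                         # gradient step on V (with penalty)

loss = np.linalg.norm(X - U @ V.T) ** 2 + lam * np.trace(V.T @ L @ V)
print(f"regularized objective: {loss:.2f}")
```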

  5. Online Manifold Regularization by Dual Ascending Procedure

    OpenAIRE

    Sun, Boliang; Li, Guohui; Jia, Li; Zhang, Hui

    2013-01-01

    We propose a novel online manifold regularization framework based on the notion of duality in constrained optimization. The Fenchel conjugate of hinge functions is a key to transfer manifold regularization from offline to online in this paper. Our algorithms are derived by gradient ascent in the dual function. For practical purpose, we propose two buffering strategies and two sparse approximations to reduce the computational complexity. Detailed experiments verify the utility of our approache...

  6. Degree-regular triangulations of torus and Klein bottle

    Indian Academy of Sciences (India)

    A triangulation of a connected closed surface is called degree-regular if each of its vertices has the same degree. … In [5], Datta and Nilakantan have classified all the degree-regular triangulations of closed surfaces on at most 11 vertices.

  7. Computational approaches in the design of synthetic receptors – A review

    Energy Technology Data Exchange (ETDEWEB)

    Cowen, Todd, E-mail: tc203@le.ac.uk; Karim, Kal; Piletsky, Sergey

    2016-09-14

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as “plastic antibodies” – high affinity robust synthetic receptors which can be optimally designed, and produced for a much reduced cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the published relevant literature since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller–Plesset, etc.) or DFT) and the purpose for which they were used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of MIP polymerisation reaction. The further advances in molecular modelling and computational design of synthetic receptors in particular will have serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. - Highlights: • A review of computational modelling in the design of molecularly imprinted polymers. • Target analytes and method of analysis for the vast majority of recent articles. • Explanations are given of all the popular and emerging techniques used in design. • Highlighted examples of sophisticated analysis of imprinted polymer systems.

  8. Computational approaches in the design of synthetic receptors – A review

    International Nuclear Information System (INIS)

    Cowen, Todd; Karim, Kal; Piletsky, Sergey

    2016-01-01

    The rational design of molecularly imprinted polymers (MIPs) has been a major contributor to their reputation as “plastic antibodies” – high affinity robust synthetic receptors which can be optimally designed, and produced for a much reduced cost than their biological equivalents. Computational design has become a routine procedure in the production of MIPs, and has led to major advances in functional monomer screening, selection of cross-linker and solvent, optimisation of monomer(s)-template ratio and selectivity analysis. In this review the various computational methods will be discussed with reference to all the published relevant literature since the end of 2013, with each article described by the target molecule, the computational approach applied (whether molecular mechanics/molecular dynamics, semi-empirical quantum mechanics, ab initio quantum mechanics (Hartree-Fock, Møller–Plesset, etc.) or DFT) and the purpose for which they were used. Detailed analysis is given to novel techniques including analysis of polymer binding sites, the use of novel screening programs and simulations of MIP polymerisation reaction. The further advances in molecular modelling and computational design of synthetic receptors in particular will have serious impact on the future of nanotechnology and biotechnology, permitting the further translation of MIPs into the realms of analytics and medical technology. - Highlights: • A review of computational modelling in the design of molecularly imprinted polymers. • Target analytes and method of analysis for the vast majority of recent articles. • Explanations are given of all the popular and emerging techniques used in design. • Highlighted examples of sophisticated analysis of imprinted polymer systems.

  9. Occlusal designs on masticatory ability and patient satisfaction with complete denture: a systematic review.

    Science.gov (United States)

    Zhao, Ke; Mai, Qing-Qing; Wang, Xiao-Dong; Yang, Wen; Zhao, Li

    2013-11-01

    To systematically review clinical outcomes of different occlusal designs of complete dentures. Using various key words, an electronic search of clinical trials published in the English and Chinese literature was performed from four databases: Medline/PubMed, EMBASE, Cochrane Library, and CBM. Furthermore, a manual search of the relevant journals and the bibliographies of reviews was performed. General satisfaction, masticatory ability, retention, and stability were major criteria for the evaluation of the outcomes. Studies that met these criteria were selected for full-text reading. The whole process was performed by two reviewers independently. This systematic review started with 1030 articles, which were finally narrowed down to seven, according to the inclusion criteria. The following occlusal designs were included and analyzed: anatomic occlusion, balanced occlusion, canine guidance occlusion, lingualized occlusion, monoplane occlusion, and bilateral-balanced and canine-guided design. Three of the seven studies showed that lingualized occlusion received higher patient satisfaction ratings than other occlusion designs. On the other hand, the canine-guided occlusion dentures demonstrated equal or better clinical performances than bilateral-balanced dentures. Because of the heterogeneity and bias of the studies, it was not possible to analyze the data statistically. Lingualized occlusion and canine-guided occlusion can be successfully applied in the fabrication of complete dentures. Canine-guided occlusion has also been shown to be satisfactory. More well-controlled randomized trials are needed regarding canine-guided occlusion and the relationship between alveolar ridge resorption, different occlusal schemes and patient satisfaction. The conventional prosthodontic wisdom that complete dentures require a balanced occlusal design is not supported by the included literature. A suitable occlusal scheme would be a critical factor for a successful complete denture.

  10. The relationship between lifestyle regularity and subjective sleep quality

    Science.gov (United States)

    Monk, Timothy H.; Reynolds, Charles F 3rd; Buysse, Daniel J.; DeGrazia, Jean M.; Kupfer, David J.

    2003-01-01

    In previous work we have developed a diary instrument--the Social Rhythm Metric (SRM), which allows the assessment of lifestyle regularity--and a questionnaire instrument--the Pittsburgh Sleep Quality Index (PSQI), which allows the assessment of subjective sleep quality. The aim of the present study was to explore the relationship between lifestyle regularity and subjective sleep quality. Lifestyle regularity was assessed by both standard (SRM-17) and shortened (SRM-5) metrics; subjective sleep quality was assessed by the PSQI. We hypothesized that high lifestyle regularity would be conducive to better sleep. Both instruments were given to a sample of 100 healthy subjects who were studied as part of a variety of different experiments spanning a 9-yr time frame. Ages ranged from 19 to 49 yr (mean age: 31.2 yr, s.d.: 7.8 yr); there were 48 women and 52 men. SRM scores were derived from a two-week diary. The hypothesis was confirmed. There was a significant correlation (rho = -0.4, p < …) between lifestyle regularity and subjective sleep quality: subjects with higher levels of lifestyle regularity reported fewer sleep problems. This relationship was also supported by a categorical analysis, where the proportion of "poor sleepers" was doubled in the "irregular types" group as compared with the "non-irregular types" group. Thus, there appears to be an association between lifestyle regularity and good sleep, though the direction of causality remains to be tested.

  11. The association between peer, parental influence and tobacco product features and earlier age of onset of regular smoking among adults in 27 European countries.

    Science.gov (United States)

    Filippidis, Filippos T; Agaku, Israel T; Vardavas, Constantine I

    2015-10-01

    Factors that influence smoking initiation and age of smoking onset are important considerations in tobacco control. We evaluated European Union (EU)-wide differences in the age of onset of regular smoking, and the potential role of peer, parental and tobacco product design features on the earlier onset of regular smoking among adults. Respondents reported the factors that influenced their decision to start smoking, including peer influence, parental influence and features of tobacco products. Multi-variable logistic regression, adjusted for age; geographic region; education; difficulty to pay bills; and gender, was used to assess the role of the various pro-tobacco influences on early onset of regular smoking (i.e. <18 years old). Respondents influenced by peers (OR = 1.70; 95%CI 1.30-2.20) or parents (OR = 1.60; 95%CI 1.21-2.12) were more likely to have started smoking regularly <18 years old. No significant association between design and marketing features of tobacco products and an early initiation of regular smoking was observed (OR = 1.04; 95%CI 0.83-1.31). We identified major differences in smoking initiation patterns among EU countries, which may warrant different approaches in the prevention of tobacco use. © The Author 2015. Published by Oxford University Press on behalf of the European Public Health Association. All rights reserved.

  12. Lost in Optimisation of Water Distribution Systems? A Literature Review of System Design

    Directory of Open Access Journals (Sweden)

    Helena Mala-Jetmarova

    2018-03-01

    Optimisation of water distribution system design is a well-established research field, which has been extremely productive since the end of the 1980s. Its primary focus is to minimise the cost of a proposed pipe network infrastructure. This paper reviews in a systematic manner articles published over the past three decades, which are relevant to the design of new water distribution systems, and the strengthening, expansion and rehabilitation of existing water distribution systems, inclusive of design timing, parameter uncertainty, water quality, and operational considerations. It identifies trends and limits in the field, and provides future research directions. Exclusively, this review paper also contains comprehensive information from over one hundred and twenty publications in a tabular form, including optimisation model formulations, solution methodologies used, and other important details.

  13. The impact of therapeutic opioid agonists on driving-related psychomotor skills assessed by a driving simulator or an on-road driving task: A systematic review.

    Science.gov (United States)

    Ferreira, Diana H; Boland, Jason W; Phillips, Jane L; Lam, Lawrence; Currow, David C

    2018-04-01

    Driving cessation is associated with poor health-related outcomes. People with chronic diseases are often prescribed long-term opioid agonists that have the potential to impair driving. Studies evaluating the impact of opioids on driving-related psychomotor skills report contradictory results, likely due to heterogeneous designs, assessment tools and study populations. A better understanding of the effects of regular therapeutic opioid agonists on driving can help to inform the balance between individuals' independence and community safety. To identify the literature assessing the impact of regular therapeutic opioid agonists on driving-related psychomotor skills for people with chronic pain or chronic breathlessness. Systematic review reported in accordance with the Preferred Reporting Items for Systematic Review and Meta-analysis statement; PROSPERO Registration CRD42017055909. Six electronic databases and grey literature were systematically searched up to January 2017. Inclusion criteria were as follows: (1) empirical studies reporting data on driving simulation, on-the-road driving tasks or driving outcomes; (2) people with chronic pain or chronic breathlessness; and (3) taking regular therapeutic opioid agonists. Critical appraisal used the National Institutes of Health's quality assessment tools. From 3809 records screened, three studies matched the inclusion criteria. All reported data on people with chronic non-malignant pain. No significant impact of regular therapeutic opioid agonists on people's driving-related psychomotor skills was reported. One study reported more intense pain significantly worsened driving performance. This systematic review does not identify impaired simulated driving performance when people take regular therapeutic opioid agonists for symptom control, although more prospective studies are needed.

  14. On the relationships between generative encodings, regularity, and learning abilities when evolving plastic artificial neural networks.

    Directory of Open Access Journals (Sweden)

    Paul Tonelli

    A major goal of bio-inspired artificial intelligence is to design artificial neural networks with abilities that resemble those of animal nervous systems. It is commonly believed that two keys for evolving nature-like artificial neural networks are (1) the developmental process that links genes to nervous systems, which enables the evolution of large, regular neural networks, and (2) synaptic plasticity, which allows neural networks to change during their lifetime. So far, these two topics have been mainly studied separately. The present paper shows that they are actually deeply connected. Using a simple operant conditioning task and a classic evolutionary algorithm, we compare three ways to encode plastic neural networks: a direct encoding, a developmental encoding inspired by computational neuroscience models, and a developmental encoding inspired by morphogen gradients (similar to HyperNEAT). Our results suggest that using a developmental encoding could improve the learning abilities of evolved, plastic neural networks. Complementary experiments reveal that this result is likely the consequence of the bias of developmental encodings towards regular structures: (1) in our experimental setup, encodings that tend to produce more regular networks yield networks with better general learning abilities; (2) whatever the encoding is, networks that are more regular are statistically those that have the best learning abilities.

  15. Borderline personality disorder and regularly drinking alcohol before sex.

    Science.gov (United States)

    Thompson, Ronald G; Eaton, Nicholas R; Hu, Mei-Chen; Hasin, Deborah S

    2017-07-01

    Drinking alcohol before sex increases the likelihood of engaging in unprotected intercourse, having multiple sexual partners and becoming infected with sexually transmitted infections. Borderline personality disorder (BPD), a complex psychiatric disorder characterised by pervasive instability in emotional regulation, self-image, interpersonal relationships and impulse control, is associated with substance use disorders and sexual risk behaviours. However, no study has examined the relationship between BPD and drinking alcohol before sex in the USA. This study examined the association between BPD and regularly drinking before sex in a nationally representative adult sample. Participants were 17 491 sexually active drinkers from Wave 2 of the National Epidemiologic Survey on Alcohol and Related Conditions. Logistic regression models estimated effects of BPD diagnosis, specific borderline diagnostic criteria and BPD criterion count on the likelihood of regularly (mostly or always) drinking alcohol before sex, adjusted for controls. Borderline personality disorder diagnosis doubled the odds of regularly drinking before sex [adjusted odds ratio (AOR) = 2.26; confidence interval (CI) = 1.63, 3.14]. Of nine diagnostic criteria, impulsivity in areas that are self-damaging remained a significant predictor of regularly drinking before sex (AOR = 1.82; CI = 1.42, 2.35). The odds of regularly drinking before sex increased by 20% for each endorsed criterion (AOR = 1.20; CI = 1.14, 1.27). DISCUSSION AND CONCLUSIONS: This is the first study to examine the relationship between BPD and regularly drinking alcohol before sex in the USA. Substance misuse treatment should assess regularly drinking before sex, particularly among patients with BPD, and BPD treatment should assess risk at the intersection of impulsivity, sexual behaviour and substance use. [Thompson Jr RG, Eaton NR, Hu M-C, Hasin DS Borderline personality disorder and regularly drinking alcohol

  16. Generalized Bregman distances and convergence rates for non-convex regularization methods

    International Nuclear Information System (INIS)

    Grasmair, Markus

    2010-01-01

    We generalize the notion of Bregman distance using concepts from abstract convexity in order to derive convergence rates for Tikhonov regularization with non-convex regularization terms. In particular, we study the non-convex regularization of linear operator equations on Hilbert spaces, showing that the conditions required for the application of the convergence rates results are strongly related to the standard range conditions from the convex case. Moreover, we consider the setting of sparse regularization, where we show that a rate of order δ^(1/p) holds, if the regularization term has a slightly faster growth at zero than |t|^p.
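
    For reference, the standard objects behind this abstract can be written out as follows (textbook definitions, not quoted from the paper):

```latex
% Tikhonov-type regularization of an operator equation F(u) = y with noisy data y^delta
u_\alpha^\delta \in \operatorname*{arg\,min}_u \; \bigl\| F(u) - y^\delta \bigr\|^2 + \alpha\, R(u),
\qquad \| y^\delta - y \| \le \delta .

% Bregman distance of the regularization term R between u and u^dagger,
% taken with respect to a subgradient xi of R at u^dagger
D_R^{\xi}(u, u^\dagger) = R(u) - R(u^\dagger) - \langle \xi,\, u - u^\dagger \rangle,
\qquad \xi \in \partial R(u^\dagger).

% In the sparse setting, the abstract states a rate of order delta^{1/p}
% when R grows slightly faster than |t|^p near zero.
```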

  17. Manufacture of Regularly Shaped Sol-Gel Pellets

    Science.gov (United States)

    Leventis, Nicholas; Johnston, James C.; Kinder, James D.

    2006-01-01

    An extrusion batch process for manufacturing regularly shaped sol-gel pellets has been devised as an improved alternative to a spray process that yields irregularly shaped pellets. The aspect ratio of regularly shaped pellets can be controlled more easily, while regularly shaped pellets pack more efficiently. In the extrusion process, a wet gel is pushed out of a mold and chopped repetitively into short, cylindrical pieces as it emerges from the mold. The pieces are collected and can be either (1) dried at ambient pressure to xerogel, (2) solvent exchanged and dried under ambient pressure to ambigels, or (3) supercritically dried to aerogel. Advantageously, the extruded pellets can be dropped directly in a cross-linking bath, where they develop a conformal polymer coating around the skeletal framework of the wet gel via reaction with the cross linker. These pellets can be dried to mechanically robust X-Aerogel.

  18. Regularization and Complexity Control in Feed-forward Networks

    OpenAIRE

    Bishop, C. M.

    1995-01-01

    In this paper we consider four alternative approaches to complexity control in feed-forward networks based respectively on architecture selection, regularization, early stopping, and training with noise. We show that there are close similarities between these approaches and we argue that, for most practical applications, the technique of regularization should be the method of choice.

  19. Interface, information, interaction: a narrative review of design and functional requirements for clinical decision support.

    Science.gov (United States)

    Miller, Kristen; Mosby, Danielle; Capan, Muge; Kowalski, Rebecca; Ratwani, Raj; Noaiseh, Yaman; Kraft, Rachel; Schwartz, Sanford; Weintraub, William S; Arnold, Ryan

    2018-05-01

    Provider acceptance and associated patient outcomes are widely discussed in the evaluation of clinical decision support systems (CDSSs), but critical design criteria for tools have generally been overlooked. The objective of this work is to inform electronic health record alert optimization and clinical practice workflow by identifying, compiling, and reporting design recommendations for CDSS to support the efficient, effective, and timely delivery of high-quality care. A narrative review was conducted from 2000 to 2016 in PubMed and The Journal of Human Factors and Ergonomics Society to identify papers that discussed/recommended design features of CDSSs that are associated with the success of these systems. Fourteen papers were included as meeting the criteria and were found to have a total of 42 unique recommendations; 11 were classified as interface features, 10 as information features, and 21 as interaction features. Features are defined and described, providing actionable guidance that can be applied to CDSS development and policy. To our knowledge, no reviews have been completed that discuss/recommend design features of CDSS at this scale, and thus we found that this was important for the body of literature. The recommendations identified in this narrative review will help to optimize design, organization, management, presentation, and utilization of information through presentation, content, and function. The designation of 3 categories (interface, information, and interaction) should be further evaluated to determine the critical importance of the categories. Future work will determine how to prioritize them with limited resources for designers and developers in order to maximize the clinical utility of CDSS. This review will expand the field of knowledge and provide a novel organization structure to identify key recommendations for CDSS.

  20. Manifold regularization for sparse unmixing of hyperspectral images.

    Science.gov (United States)

    Liu, Junmin; Zhang, Chunxia; Zhang, Jiangshe; Li, Huirong; Gao, Yuelin

    2016-01-01

    Recently, sparse unmixing has been successfully applied to spectral mixture analysis of remotely sensed hyperspectral images. Based on the assumption that the observed image signatures can be expressed in the form of linear combinations of a number of pure spectral signatures known in advance, unmixing of each mixed pixel in the scene is to find an optimal subset of signatures in a very large spectral library, which is cast into the framework of sparse regression. However, traditional sparse regression models, such as collaborative sparse regression, ignore the intrinsic geometric structure in the hyperspectral data. In this paper, we propose a novel model, called manifold regularized collaborative sparse regression, by introducing a manifold regularization to the collaborative sparse regression model. The manifold regularization utilizes a graph Laplacian to incorporate the locally geometrical structure of the hyperspectral data. An algorithm based on alternating direction method of multipliers has been developed for the manifold regularized collaborative sparse regression model. Experimental results on both the simulated and real hyperspectral data sets have demonstrated the effectiveness of our proposed model.
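
    In symbols, a generic objective of the kind described above (notation illustrative, not copied from the paper) combines a data-fidelity term, a collaborative row-sparsity penalty and a graph-Laplacian smoothness term over the pixels:

```latex
% Y: observed pixel spectra, A: spectral library, X: abundance matrix, L: graph Laplacian
\min_{X \ge 0} \;
  \tfrac{1}{2} \| A X - Y \|_F^2
  \;+\; \lambda \, \| X \|_{2,1}
  \;+\; \tfrac{\beta}{2} \, \operatorname{tr}\!\left( X L X^{\top} \right),
\qquad
\| X \|_{2,1} = \sum_i \Bigl( \sum_j X_{ij}^2 \Bigr)^{1/2}.
```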

  1. Regularization dependence on phase diagram in Nambu–Jona-Lasinio model

    International Nuclear Information System (INIS)

    Kohyama, H.; Kimura, D.; Inagaki, T.

    2015-01-01

    We study the dependence of meson properties and the phase diagram of quark matter on the regularization scheme by using the two-flavor Nambu–Jona-Lasinio model. The model also has a parameter dependence within each regularization, so we explicitly give the model parameters for several sets of input observables and then investigate their effect on the phase diagram. We find that the location, and even the existence, of the critical end point depends strongly on the regularization method and the model parameters. The regularization scheme and parameters must therefore be chosen carefully when one investigates the QCD critical end point in effective model studies.

  2. Optimal analysis of structures by concepts of symmetry and regularity

    CERN Document Server

    Kaveh, Ali

    2013-01-01

    Optimal analysis is defined as an analysis that creates and uses sparse, well-structured and well-conditioned matrices. The focus is on efficient methods for eigensolution of matrices involved in static, dynamic and stability analyses of symmetric and regular structures, or those general structures containing such components. Powerful tools are also developed for configuration processing, which is an important issue in the analysis and design of space structures and finite element models. Different mathematical concepts are combined to make the optimal analysis of structures feasible. Canonical forms from matrix algebra, product graphs from graph theory and symmetry groups from group theory are some of the concepts involved in the variety of efficient methods and algorithms presented. The algorithms elucidated in this book enable analysts to handle large-scale structural systems by lowering their computational cost, thus fulfilling the requirement for faster analysis and design of future complex systems. The ...

  3. Generalization Performance of Regularized Ranking With Multiscale Kernels.

    Science.gov (United States)

    Zhou, Yicong; Chen, Hong; Lan, Rushi; Pan, Zhibin

    2016-05-01

    The regularized kernel method for the ranking problem has attracted increasing attentions in machine learning. The previous regularized ranking algorithms are usually based on reproducing kernel Hilbert spaces with a single kernel. In this paper, we go beyond this framework by investigating the generalization performance of the regularized ranking with multiscale kernels. A novel ranking algorithm with multiscale kernels is proposed and its representer theorem is proved. We establish the upper bound of the generalization error in terms of the complexity of hypothesis spaces. It shows that the multiscale ranking algorithm can achieve satisfactory learning rates under mild conditions. Experiments demonstrate the effectiveness of the proposed method for drug discovery and recommendation tasks.

  4. Top-down attention affects sequential regularity representation in the human visual system.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-08-01

    Recent neuroscience studies using visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in the visual sensory system, have shown that although sequential regularities embedded in successive visual stimuli can be automatically represented in the visual sensory system, an existence of sequential regularity itself does not guarantee that the sequential regularity will be automatically represented. In the present study, we investigated the effects of top-down attention on sequential regularity representation in the visual sensory system. Our results showed that a sequential regularity (SSSSD) embedded in a modified oddball sequence where infrequent deviant (D) and frequent standard stimuli (S) differing in luminance were regularly presented (SSSSDSSSSDSSSSD...) was represented in the visual sensory system only when participants attended the sequential regularity in luminance, but not when participants ignored the stimuli or simply attended the dimension of luminance per se. This suggests that top-down attention affects sequential regularity representation in the visual sensory system and that top-down attention is a prerequisite for particular sequential regularities to be represented. Copyright 2010 Elsevier B.V. All rights reserved.

  5. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    International Nuclear Information System (INIS)

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  6. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    Energy Technology Data Exchange (ETDEWEB)

    Laganà, Alessandro [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States); Shasha, Dennis [Courant Institute of Mathematical Sciences, New York University, New York, NY (United States); Croce, Carlo Maria [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States)

    2014-12-11

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  7. Regularized Discriminant Analysis: A Large Dimensional Study

    KAUST Repository

    Yang, Xiaoke

    2018-04-28

    In this thesis, we focus on studying the performance of general regularized discriminant analysis (RDA) classifiers. The data used for analysis is assumed to follow a Gaussian mixture model with different means and covariances. RDA offers a rich class of regularization options, covering as special cases the regularized linear discriminant analysis (RLDA) and the regularized quadratic discriminant analysis (RQDA) classifiers. We analyze RDA under the double asymptotic regime where the data dimension and the training size both increase in a proportional way. This double asymptotic regime allows for application of fundamental results from random matrix theory. Under the double asymptotic regime and some mild assumptions, we show that the asymptotic classification error converges to a deterministic quantity that only depends on the data statistical parameters and dimensions. This result not only reveals some mathematical relations between the misclassification error and the class statistics, but also can be leveraged to select the optimal parameters that minimize the classification error, thus yielding the optimal classifier. Validation results on the synthetic data show a good accuracy of our theoretical findings. We also construct a general consistent estimator to approximate the true classification error in consideration of the unknown previous statistics. We benchmark the performance of our proposed consistent estimator against the classical estimator on synthetic data. The observations demonstrate that the general estimator outperforms others in terms of mean squared error (MSE).
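
    The following sketch illustrates the RDA family in its classical (Friedman-style) form, which the thesis takes as its starting point: the class covariances are shrunk first toward the pooled covariance and then toward a scaled identity. The data, parameter values and two-class setup are illustrative assumptions, not the thesis' experiments.

```python
import numpy as np

rng = np.random.default_rng(5)

# Two Gaussian classes with different covariances (illustrative data).
p, n = 20, 100
X0 = rng.normal(size=(n, p))
X1 = 1.5 * rng.normal(size=(n, p)) + 0.7
X = np.vstack([X0, X1]); y = np.r_[np.zeros(n, int), np.ones(n, int)]

def rda_fit(X, y, lam, gam):
    """Friedman-style RDA: shrink class covariances toward the pooled covariance (lam),
    then toward a scaled identity (gam)."""
    classes = np.unique(y)
    S_pool = sum(np.cov(X[y == c].T, bias=True) * np.mean(y == c) for c in classes)
    mus, covs, priors = [], [], []
    for c in classes:
        Xc = X[y == c]
        Sc = np.cov(Xc.T, bias=True)
        Sc = (1 - lam) * Sc + lam * S_pool
        Sc = (1 - gam) * Sc + gam * (np.trace(Sc) / X.shape[1]) * np.eye(X.shape[1])
        mus.append(Xc.mean(0)); covs.append(Sc); priors.append(np.mean(y == c))
    return classes, mus, covs, priors

def rda_predict(model, Xnew):
    classes, mus, covs, priors = model
    scores = []
    for mu, S, pi in zip(mus, covs, priors):
        _, logdet = np.linalg.slogdet(S)
        diff = Xnew - mu
        maha = np.einsum('ij,ij->i', diff @ np.linalg.inv(S), diff)
        scores.append(-0.5 * (maha + logdet) + np.log(pi))     # Gaussian discriminant score
    return classes[np.argmax(scores, axis=0)]

model = rda_fit(X, y, lam=0.5, gam=0.1)
print("training accuracy:", np.mean(rda_predict(model, X) == y))
```

    Setting lam=1 collapses the model toward RLDA-like behaviour (a common covariance), while lam=0 keeps separate class covariances as in RQDA; the thesis studies how to choose such parameters optimally in the large-dimensional regime.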

  8. Adaptive Regularization of Neural Networks Using Conjugate Gradient

    DEFF Research Database (Denmark)

    Goutte, Cyril; Larsen, Jan

    1998-01-01

    Andersen et al. (1997) and Larsen et al. (1996, 1997) suggested a regularization scheme which iteratively adapts regularization parameters by minimizing validation error using simple gradient descent. In this contribution we present an improved algorithm based on the conjugate gradient technique. … Numerical experiments with feedforward neural networks successfully demonstrate improved generalization ability and lower computational cost…

  9. 20 CFR 226.33 - Spouse regular annuity rate.

    Science.gov (United States)

    2010-04-01

    Title 20, Employees' Benefits (as of 2010-04-01), Computing Employee, Spouse, and Divorced Spouse Annuities: Computing a Spouse or Divorced Spouse Annuity. § 226.33 Spouse regular annuity rate. The final tier I and tier II rates, from §§ 226.30 and 226.32, are...

  10. Stark broadening parameter regularities and interpolation and critical evaluation of data for CP star atmospheres research: Stark line shifts

    Science.gov (United States)

    Dimitrijevic, M. S.; Tankosic, D.

    1998-04-01

    In order to find out if regularities and systematic trends found to be apparent among experimental Stark line shifts allow the accurate interpolation of new data and critical evaluation of experimental results, the exceptions to the established regularities are analysed on the basis of critical reviews of experimental data, and reasons for such exceptions are discussed. We found that such exceptions are mostly due to the situations when: (i) the energy gap between atomic energy levels within a supermultiplet is equal or comparable to the energy gap to the nearest perturbing levels; (ii) the most important perturbing level is embedded between the energy levels of the supermultiplet; (iii) the forbidden transitions have influence on Stark line shifts.

  11. The NASA Monographs on Shell Stability Design Recommendations: A Review and Suggested Improvements

    Science.gov (United States)

    Nemeth, Michael P.; Starnes, James H., Jr.

    1998-01-01

    A summary of the existing NASA design criteria monographs for the design of buckling-resistant thin-shell structures is presented. Subsequent improvements in the analysis for nonlinear shell response are reviewed, and current issues in shell stability analysis are discussed. Examples of nonlinear shell responses that are not included in the existing shell design monographs are presented, and an approach for including reliability-based analysis procedures in the shell design process is discussed. Suggestions for conducting future shell experiments are presented, and proposed improvements to the NASA shell design criteria monographs are discussed.

  12. Design management aspects in Brazilian dissertations and theses: a systematic review

    Directory of Open Access Journals (Sweden)

    Claudia de Souza Libanio

    2011-06-01

    Studies on the theme of design management have been developed worldwide, especially in research centers in countries such as France, Portugal, the USA and Italy. In Brazil, although some educational institutions excel in research on design management, the scenario is still unclear. This work presents a systematic review of the academic literature on design management, mapping the intellectual production of the last twenty years in Brazilian post-graduate programs that addresses the design management subject completely or in part. The research was conducted on the world wide web, using databases of theses and dissertations, search engines and the websites of Brazilian university libraries, with the keywords ‘design management’, ‘strategic design’, ‘designer’ and ‘design manager’. The results show the state of the art and the evolution of research on the subject, as well as an overview of the current stage of research centers and studies on design management in Brazil.

  13. PET regularization by envelope guided conjugate gradients

    International Nuclear Information System (INIS)

    Kaufman, L.; Neumaier, A.

    1996-01-01

    The authors propose a new way to iteratively solve large scale ill-posed problems and in particular the image reconstruction problem in positron emission tomography by exploiting the relation between Tikhonov regularization and multiobjective optimization to obtain iteratively approximations to the Tikhonov L-curve and its corner. Monitoring the change of the approximate L-curves allows us to adjust the regularization parameter adaptively during a preconditioned conjugate gradient iteration, so that the desired solution can be reconstructed with a small number of iterations
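
    As a toy illustration of the L-curve idea referred to above (not the authors' preconditioned conjugate-gradient scheme for PET), the sketch below computes Tikhonov solutions of a small ill-posed system over a range of regularization parameters and picks the parameter at the point of maximum curvature of the log-log L-curve. The operator, data, and function names are assumptions made for the example.

```python
import numpy as np

def lcurve_corner(A, b, lams):
    """Return the Tikhonov parameter at the maximum-curvature point of the
    L-curve (log residual norm versus log solution norm)."""
    d = A.shape[1]
    res, sol = [], []
    for lam in lams:
        x = np.linalg.solve(A.T @ A + lam * np.eye(d), A.T @ b)
        res.append(np.linalg.norm(A @ x - b))
        sol.append(np.linalg.norm(x))
    r, s = np.log(res), np.log(sol)
    dr, ds = np.gradient(r), np.gradient(s)
    d2r, d2s = np.gradient(dr), np.gradient(ds)
    kappa = np.abs(dr * d2s - ds * d2r) / (dr**2 + ds**2) ** 1.5
    return lams[int(np.argmax(kappa))]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    U, _ = np.linalg.qr(rng.normal(size=(50, 50)))
    V, _ = np.linalg.qr(rng.normal(size=(20, 20)))
    A = (U[:, :20] * np.logspace(0, -6, 20)) @ V.T   # ill-conditioned operator
    b = A @ rng.normal(size=20) + 1e-4 * rng.normal(size=50)
    lams = np.logspace(-10, 1, 60)
    print("L-curve corner at lambda =", lcurve_corner(A, b, lams))
```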

  14. Regularized Regression and Density Estimation based on Optimal Transport

    KAUST Repository

    Burger, M.; Franek, M.; Schonlieb, C.-B.

    2012-01-01

    for estimating densities and for preserving edges in the case of total variation regularization. In order to compute solutions of the variational problems, a regularized optimal transport problem needs to be solved, for which we discuss several formulations

  15. Regularization of Nonmonotone Variational Inequalities

    International Nuclear Information System (INIS)

    Konnov, Igor V.; Ali, M.S.S.; Mazurkevich, E.O.

    2006-01-01

    In this paper we extend the Tikhonov-Browder regularization scheme from monotone to a rather general class of nonmonotone multivalued variational inequalities. We show that their convergence conditions hold for some classes of perfectly and nonperfectly competitive economic equilibrium problems

  16. Interval matrices: Regularity generates singularity

    Czech Academy of Sciences Publication Activity Database

    Rohn, Jiří; Shary, S.P.

    2018-01-01

    Roč. 540, 1 March (2018), s. 149-159 ISSN 0024-3795 Institutional support: RVO:67985807 Keywords: interval matrix * regularity * singularity * P-matrix * absolute value equation * diagonally singularizable matrix Subject RIV: BA - General Mathematics Impact factor: 0.973, year: 2016

  17. Regular perturbations in a vector space with indefinite metric

    International Nuclear Information System (INIS)

    Chiang, C.C.

    1975-08-01

    The Klein space is discussed in connection with practical applications. Some lemmas are presented which are to be used for the discussion of regular self-adjoint operators. The criteria for the regularity of perturbed operators are given. (U.S.)

  18. Regular Generalized Star Star closed sets in Bitopological Spaces

    OpenAIRE

    K. Kannan; D. Narasimhan; K. Chandrasekhara Rao; R. Ravikumar

    2011-01-01

    The aim of this paper is to introduce the concepts of τ1τ2-regular generalized star star closed sets, τ1τ2-regular generalized star star open sets and study their basic properties in bitopological spaces.

  19. Solution path for manifold regularized semisupervised classification.

    Science.gov (United States)

    Wang, Gang; Wang, Fei; Chen, Tao; Yeung, Dit-Yan; Lochovsky, Frederick H

    2012-04-01

    Traditional learning algorithms use only labeled data for training. However, labeled examples are often difficult or time consuming to obtain since they require substantial human labeling efforts. On the other hand, unlabeled data are often relatively easy to collect. Semisupervised learning addresses this problem by using large quantities of unlabeled data with labeled data to build better learning algorithms. In this paper, we use the manifold regularization approach to formulate the semisupervised learning problem where a regularization framework which balances a tradeoff between loss and penalty is established. We investigate different implementations of the loss function and identify the methods which have the least computational expense. The regularization hyperparameter, which determines the balance between loss and penalty, is crucial to model selection. Accordingly, we derive an algorithm that can fit the entire path of solutions for every value of the hyperparameter. Its computational complexity after preprocessing is quadratic only in the number of labeled examples rather than the total number of labeled and unlabeled examples.
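
    The solution-path algorithm of the record is not reproduced here, but the underlying manifold-regularization objective can be illustrated compactly. The sketch below solves a Laplacian-regularized least-squares problem on a k-nearest-neighbor graph for one fixed value of the regularization hyperparameter; the graph construction, synthetic data, and names are assumptions made for the example.

```python
import numpy as np
from scipy.spatial.distance import cdist

def knn_graph_laplacian(X, k=5):
    """Unnormalized graph Laplacian of a symmetrized k-nearest-neighbor graph."""
    D = cdist(X, X)
    n = len(X)
    W = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(D[i])[1:k + 1]        # skip the point itself
        W[i, nbrs] = np.exp(-D[i, nbrs] ** 2)
    W = np.maximum(W, W.T)                       # symmetrize
    return np.diag(W.sum(axis=1)) - W

def manifold_regularized_scores(X, y, labeled, gamma=1.0):
    """Solve min_f sum_{i in labeled} (f_i - y_i)^2 + gamma * f^T L f.

    The closed form is (J + gamma*L) f = J y, with J a 0/1 diagonal matrix
    indicating the labeled points; a tiny ridge keeps the solve well posed."""
    n = len(X)
    L = knn_graph_laplacian(X)
    J = np.zeros((n, n))
    J[labeled, labeled] = 1.0
    rhs = np.zeros(n)
    rhs[labeled] = y[labeled]
    return np.linalg.solve(J + gamma * L + 1e-8 * np.eye(n), rhs)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
    y = np.r_[-np.ones(50), np.ones(50)]
    labeled = np.array([0, 1, 50, 51])           # only four labeled points
    f = manifold_regularized_scores(X, y, labeled)
    print(f"transductive accuracy with 4 labels: {np.mean(np.sign(f) == y):.2f}")
```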

  20. (2+1)-dimensional regular black holes with nonlinear electrodynamics sources

    Directory of Open Access Journals (Sweden)

    Yun He

    2017-11-01

    Full Text Available On the basis of two requirements: the avoidance of the curvature singularity and the Maxwell theory as the weak field limit of the nonlinear electrodynamics, we find two restricted conditions on the metric function of a (2+1)-dimensional regular black hole in general relativity coupled with nonlinear electrodynamics sources. By the use of the two conditions, we obtain a general approach to construct (2+1)-dimensional regular black holes. In this manner, we construct four (2+1)-dimensional regular black holes as examples. We also study the thermodynamic properties of the regular black holes and verify the first law of black hole thermodynamics.

  1. Suicide prevention e-learning modules designed for gatekeepers: A descriptive review

    NARCIS (Netherlands)

    Ghoncheh, R.; Kerkhof, A.; Koot, H.M.

    2014-01-01

    Background: E-learning modules can be a useful method for educating gatekeepers in suicide prevention and awareness. Aims: To review and provide an overview of e-learning modules on suicide prevention designed for gatekeepers and assess their effectiveness. Method: Two strategies were used. First,

  2. A Review of Organic Photovoltaic Energy Source and Its Technological Designs

    Directory of Open Access Journals (Sweden)

    Egidius Rutatizibwa Rwenyagila

    2017-01-01

    Full Text Available This study reviews and describes some of the existing research on and mechanisms of operation of organic photovoltaic (OPV) cells. Introduced first are the problems with traditional fossil fuels that underlie many of the world's energy challenges, such as environmental pollution. This is followed by a description of baseline organic solar cell (OSC) structures and materials. Then, some of the existing modelling approaches that have implemented either a one- or a two-dimensional drift-diffusion model to examine OSC structures are reviewed, and their reproducibility is examined. Both the experimental and modelling approaches reviewed are particularly important for more and better designed research to probe the practical procedural problems associated with OSCs that hinder the commercialization of OPV technology.

  3. A REVIEW: A MODEL of CULTURAL ASPECTS for SUSTAINABLE PRODUCT DESIGN

    Directory of Open Access Journals (Sweden)

    Ihwan Ghazali,

    2012-04-01

    Full Text Available Product design stages are important to consider critically in production. Generally, a product design created by a designer should consider what the customer wants and needs. Nowadays, product design does not only consider the “wants and needs” of the user, but also how sustainability aspects can be embedded in the product. Culture is also one of the important aspects which need to be considered in product design, as culture affects the way users respond to the product. This paper aims to develop a new model for design development, in which aspects of culture are incorporated into sustainable product design. By reviewing the existing literature, the authors attempt to identify the gaps in the existing papers that illustrate how culture affects sustainable product design. Recent papers have only shown that culture influences product design; they do not explore sustainability together with cultural aspects in product design. Given these gaps, it is important to create a model which will assist designers to elicit sustainable product design based on cultural aspects. In summary, designers need to reflect on the “wants and needs” of users. The framework presented in this paper can be integrated into designers’ and companies’ decision-making during product design development.

  4. Regularity of the Maxwell equations in heterogeneous media and Lipschitz domains

    KAUST Repository

    Bonito, Andrea

    2013-12-01

    This note establishes regularity estimates for the solution of the Maxwell equations in Lipschitz domains with non-smooth coefficients and minimal regularity assumptions. The argumentation relies on elliptic regularity estimates for the Poisson problem with non-smooth coefficients. © 2013 Elsevier Ltd.

  5. Evaluating and comparing imaging techniques: a review and classification of study designs

    International Nuclear Information System (INIS)

    Freedman, L.S.

    1987-01-01

    The design of studies to evaluate and compare imaging techniques is reviewed. Thirteen principles for the design of studies of diagnostic accuracy are given. Because of the 'independence principle', these studies are not able to directly evaluate the contribution of a technique to clinical management. For the latter, the 'clinical value' study design is recommended. A classification of study designs is proposed in parallel with the standard classification of clinical trials. Studies of diagnostic accuracy are analogous to Phase II, whereas studies evaluating the contribution to clinical management correspond to the Phase III category. Currently the majority of published studies employ the Phase II design. More emphasis on Phase III studies is required. (author)

  6. Task-Driven Optimization of Fluence Field and Regularization for Model-Based Iterative Reconstruction in Computed Tomography.

    Science.gov (United States)

    Gang, Grace J; Siewerdsen, Jeffrey H; Stayman, J Webster

    2017-12-01

    This paper presents a joint optimization of dynamic fluence field modulation (FFM) and regularization in quadratic penalized-likelihood reconstruction that maximizes a task-based imaging performance metric. We adopted a task-driven imaging framework for prospective design of the imaging parameters. A maxi-min objective function was adopted to maximize the minimum detectability index throughout the image. The optimization algorithm alternates between FFM (represented by low-dimensional basis functions) and local regularization (including the regularization strength and directional penalty weights). The task-driven approach was compared with three FFM strategies commonly proposed for FBP reconstruction (as well as a task-driven TCM strategy) for a discrimination task in an abdomen phantom. The task-driven FFM assigned more fluence to less attenuating anteroposterior views and yielded approximately constant fluence behind the object. The optimal regularization was almost uniform throughout the image. Furthermore, the task-driven FFM strategy redistributes fluence across detector elements in order to prescribe more fluence to the more attenuating central region of the phantom. Compared with all strategies, the task-driven FFM strategy not only improved the minimum detectability index by at least 17.8%, but also yielded higher detectability over a large area inside the object. The optimal FFM was highly dependent on the amount of regularization, indicating the importance of a joint optimization. Sample reconstructions of simulated data generally support the performance estimates based on the computed detectability index.

  7. Regularized forecasting of chaotic dynamical systems

    International Nuclear Information System (INIS)

    Bollt, Erik M.

    2017-01-01

    While local models of dynamical systems have been highly successful at using extensive data sets, even from chaotic dynamical systems, to produce useful forecasts, a typical problem arises. With the k-nearest neighbors (kNN) method, local observations occur due to recurrences in a chaotic system, which allows local models to be built by regression to low-dimensional polynomial approximations of the underlying system, estimating a Taylor series. This has been a popular approach, particularly in the context of scalar data observations represented by time-delay embedding methods. However, such local models can allow for spatial discontinuities of forecasts when considered globally, meaning jumps in predictions, because the collected near neighbors vary from point to point. The source of these discontinuities is generally that the set of near neighbors varies discontinuously with respect to the position of the sample point, and so, therefore, does the model built from those neighbors. It is possible to utilize local information inferred from near neighbors as usual but at the same time to impose a degree of regularity on a global scale. We present here a new global perspective extending the general local modeling concept. In so doing, we show how this perspective allows us to impose presumed prior regularity on the model by invoking Tikhonov regularization theory, since this classic approach to optimization in ill-posed problems naturally balances fitting an objective against some assumed prior form of the result, such as continuity or derivative regularity. This all reduces to matrix manipulations, which we demonstrate on a simple data set, with the implication that it may find much broader application.
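
    As a minimal illustration of the local-modeling baseline discussed above (only the kNN part, with a Tikhonov/ridge penalty on each local fit, not the paper's global regularization), the sketch below forecasts one step of a chaotic scalar series from a time-delay embedding. Parameter values and names are assumptions made for the example.

```python
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Time-delay embedding of a scalar series into R^dim."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def local_ridge_forecast(x, dim=3, k=15, lam=1e-3):
    """One-step forecast from the last embedded state using a ridge-regularized
    (Tikhonov-penalized) local affine model fitted on the k nearest neighbors."""
    E = delay_embed(x, dim)
    states, targets = E[:-1], x[dim:]              # state at time t -> value at t+1
    query = E[-1]
    idx = np.argsort(np.linalg.norm(states - query, axis=1))[:k]
    A = np.hstack([states[idx], np.ones((k, 1))])  # affine local model
    w = np.linalg.solve(A.T @ A + lam * np.eye(dim + 1), A.T @ targets[idx])
    return np.append(query, 1.0) @ w

if __name__ == "__main__":
    # Logistic map as a simple chaotic test series.
    x = np.empty(2000)
    x[0] = 0.4
    for t in range(1999):
        x[t + 1] = 3.9 * x[t] * (1.0 - x[t])
    pred = local_ridge_forecast(x[:-1])
    print(f"predicted {pred:.4f}, actual {x[-1]:.4f}")
```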

  8. Forcing absoluteness and regularity properties

    NARCIS (Netherlands)

    Ikegami, D.

    2010-01-01

    For a large natural class of forcing notions, we prove general equivalence theorems between forcing absoluteness statements, regularity properties, and transcendence properties over L and the core model K. We use our results to answer open questions from set theory of the reals.

  9. Design and analysis of group-randomized trials in cancer: A review of current practices.

    Science.gov (United States)

    Murray, David M; Pals, Sherri L; George, Stephanie M; Kuzmichev, Andrey; Lai, Gabriel Y; Lee, Jocelyn A; Myles, Ranell L; Nelson, Shakira M

    2018-06-01

    The purpose of this paper is to summarize current practices for the design and analysis of group-randomized trials involving cancer-related risk factors or outcomes and to offer recommendations to improve future trials. We searched for group-randomized trials involving cancer-related risk factors or outcomes that were published or online in peer-reviewed journals in 2011-15. During 2016-17, in Bethesda MD, we reviewed 123 articles from 76 journals to characterize their design and their methods for sample size estimation and data analysis. Only 66 (53.7%) of the articles reported appropriate methods for sample size estimation. Only 63 (51.2%) reported exclusively appropriate methods for analysis. These findings suggest that many investigators do not adequately attend to the methodological challenges inherent in group-randomized trials. These practices can lead to underpowered studies, to an inflated type 1 error rate, and to inferences that mislead readers. Investigators should work with biostatisticians or other methodologists familiar with these issues. Funders and editors should ensure careful methodological review of applications and manuscripts. Reviewers should ensure that studies are properly planned and analyzed. These steps are needed to improve the rigor and reproducibility of group-randomized trials. The Office of Disease Prevention (ODP) at the National Institutes of Health (NIH) has taken several steps to address these issues. ODP offers an online course on the design and analysis of group-randomized trials. ODP is working to increase the number of methodologists who serve on grant review panels. ODP has developed standard language for the Application Guide and the Review Criteria to draw investigators' attention to these issues. Finally, ODP has created a new Research Methods Resources website to help investigators, reviewers, and NIH staff better understand these issues. Published by Elsevier Inc.

  10. Arithmetic properties of $\ell$-regular overpartition pairs

    OpenAIRE

    NAIKA, MEGADAHALLI SIDDA MAHADEVA; SHIVASHANKAR, CHANDRAPPA

    2017-01-01

    In this paper, we investigate the arithmetic properties of $\ell$-regular overpartition pairs. Let $\overline{B}_{\ell}(n)$ denote the number of $\ell$-regular overpartition pairs of $n$. We prove a number of Ramanujan-like congruences and infinite families of congruences modulo 3, 8, 16, 36, 48, 96 for $\overline{B}_3(n)$ and modulo 3, 16, 64, 96 for $\overline{B}_4(n)$. For example, we find that for all nonnegative integers $\alpha$ and $n$, $\overline{B}_{3}(3^{\alpha}(3n+2))\equiv ...

  11. Chaos regularization of quantum tunneling rates

    International Nuclear Information System (INIS)

    Pecora, Louis M.; Wu Dongho; Lee, Hoshik; Antonsen, Thomas; Lee, Ming-Jer; Ott, Edward

    2011-01-01

    Quantum tunneling rates through a barrier separating two-dimensional, symmetric, double-well potentials are shown to depend on the classical dynamics of the billiard trajectories in each well and, hence, on the shape of the wells. For shapes that lead to regular (integrable) classical dynamics the tunneling rates fluctuate greatly with eigenenergies of the states sometimes by over two orders of magnitude. Contrarily, shapes that lead to completely chaotic trajectories lead to tunneling rates whose fluctuations are greatly reduced, a phenomenon we call regularization of tunneling rates. We show that a random-plane-wave theory of tunneling accounts for the mean tunneling rates and the small fluctuation variances for the chaotic systems.

  12. STRUCTURE OPTIMIZATION OF RESERVATION BY PRECISE QUADRATIC REGULARIZATION

    Directory of Open Access Journals (Sweden)

    KOSOLAP A. I.

    2015-11-01

    Full Text Available The paper addresses the problem of optimizing the structure of redundant elements in systems. Such problems arise in the design of complex systems. To improve the reliability of such systems, their elements are duplicated; this increases the system cost and improves its reliability. When optimizing these systems, the probability of failure-free operation of the entire system is maximized subject to a cost constraint, or the cost is minimized for a given probability of failure-free operation. The mathematical model of the redundancy problem is discrete and multiextremal. To search for the global extremum, methods such as Lagrange multipliers, coordinate descent, dynamic programming, and random search are currently used. These methods guarantee only local solutions and are applied to redundancy problems of small dimension. In this work, a new method of precise quadratic regularization is used to solve the redundancy problem. The method transforms the original discrete problem into the maximization of a vector norm on a convex set, so that the various redundancy problems are reduced to norm maximization over a convex set. The transformed problem is solved with primal-dual interior point methods, which are currently the best methods for local optimization of nonlinear problems. The transformed problem includes a new auxiliary variable, which is determined by bisection. Numerous comparative numerical experiments were carried out on problems with up to one hundred redundant subsystems. These experiments confirm the effectiveness of the method of precise quadratic regularization for solving redundancy problems.
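
    The precise quadratic regularization method itself is not reproduced here. To make the underlying redundancy-allocation problem concrete, the sketch below maximizes the reliability of a series system of parallel-redundant subsystems under a cost budget with a simple greedy heuristic, which is only a baseline and not the record's method; all numbers and names are assumptions for the example.

```python
import numpy as np

def system_reliability(p, k):
    """Series system of parallel-redundant subsystems: R = prod_i 1-(1-p_i)^k_i."""
    return float(np.prod(1.0 - (1.0 - p) ** k))

def greedy_redundancy(p, cost, budget):
    """Add, one at a time, the spare unit with the best reliability gain per
    unit cost until the budget is exhausted (baseline heuristic only)."""
    k = np.ones(len(p), dtype=int)          # at least one unit per subsystem
    spent = float(cost.sum())
    while True:
        base = system_reliability(p, k)
        gains = np.full(len(p), -np.inf)
        for i in range(len(p)):
            if spent + cost[i] <= budget:
                trial = k.copy()
                trial[i] += 1
                gains[i] = (system_reliability(p, trial) - base) / cost[i]
        if np.all(np.isneginf(gains)):
            break                            # nothing affordable remains
        best = int(np.argmax(gains))
        k[best] += 1
        spent += cost[best]
    return k, system_reliability(p, k)

if __name__ == "__main__":
    p = np.array([0.90, 0.80, 0.95, 0.85])   # component reliabilities
    cost = np.array([2.0, 3.0, 1.0, 2.5])    # unit costs
    k, R = greedy_redundancy(p, cost, budget=20.0)
    print("units per subsystem:", k, f"-> system reliability {R:.4f}")
```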

  13. Regularization Tools Version 3.0 for Matlab 5.2

    DEFF Research Database (Denmark)

    Hansen, Per Christian

    1999-01-01

    This communication describes Version 3.0 of Regularization Tools, a Matlab package for analysis and solution of discrete ill-posed problems.

  14. Spiking Regularity and Coherence in Complex Hodgkin–Huxley Neuron Networks

    International Nuclear Information System (INIS)

    Zhi-Qiang, Sun; Ping, Xie; Wei, Li; Peng-Ye, Wang

    2010-01-01

    We study the effects of the strength of coupling between neurons on the spiking regularity and coherence in a complex network with randomly connected Hodgkin–Huxley neurons driven by colored noise. It is found that for a given topology realization and colored noise correlation time, there exists an optimal strength of coupling at which the spiking regularity of the network reaches its best level. Moreover, when the temporal regularity reaches its best level, the spatial coherence of the system has already increased to a relatively high level. In addition, for a given number of neurons and noise correlation time, the values of average regularity and spatial coherence at the optimal strength of coupling are nearly independent of the topology realization. Furthermore, there exists an optimal value of the colored noise correlation time at which the spiking regularity can reach its best level. These results may be helpful for understanding real neural systems. (cross-disciplinary physics and related areas of science and technology)
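
    The record does not spell out its regularity measure, but spiking regularity in this literature is commonly quantified by the inverse coefficient of variation of the interspike intervals. The sketch below computes that generic measure for two synthetic spike trains; treat both the measure and the names as assumptions rather than the authors' exact definition.

```python
import numpy as np

def spike_regularity(spike_times):
    """Inverse coefficient of variation of the interspike intervals:
    larger values indicate more regular (clock-like) firing."""
    isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
    return isi.mean() / isi.std()

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    regular_train = np.cumsum(1.0 + 0.05 * rng.normal(size=500))   # nearly periodic
    poisson_train = np.cumsum(rng.exponential(1.0, size=500))      # irregular
    print(f"nearly periodic train: {spike_regularity(regular_train):.2f}")
    print(f"Poisson train:         {spike_regularity(poisson_train):.2f}")
```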

  15. Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications

    Science.gov (United States)

    Chaki, Sagar; Gurfinkel, Arie

    2010-01-01

    We develop a learning-based automated Assume-Guarantee (AG) reasoning framework for verifying omega-regular properties of concurrent systems. We study the applicability of non-circular (AG-NC) and circular (AG-C) AG proof rules in the context of systems with infinite behaviors. In particular, we show that AG-NC is incomplete when assumptions are restricted to strictly infinite behaviors, while AG-C remains complete. We present a general formalization, called LAG, of the learning-based automated AG paradigm. We show how existing approaches for automated AG reasoning are special instances of LAG. We develop two learning algorithms for a class of systems, called infinite regular systems, that combine finite and infinite behaviors. We show that for infinite regular systems, both AG-NC and AG-C are sound and complete. Finally, we show how to instantiate LAG to do automated AG reasoning for infinite regular, and omega-regular, systems using both AG-NC and AG-C as proof rules.

  16. Graph Regularized Meta-path Based Transductive Regression in Heterogeneous Information Network.

    Science.gov (United States)

    Wan, Mengting; Ouyang, Yunbo; Kaplan, Lance; Han, Jiawei

    2015-01-01

    A number of real-world networks are heterogeneous information networks, which are composed of different types of nodes and links. Numerical prediction in heterogeneous information networks is a challenging but significant area because network-based information for unlabeled objects is usually limited, making precise estimation difficult. In this paper, we consider a graph-regularized meta-path based transductive regression model (Grempt), which combines the principal philosophies of typical graph-based transductive classification methods and transductive regression models designed for homogeneous networks. Our method is time- and space-efficient, and the precision of the model is verified by numerical experiments.

  17. The Role of Environmental Design in Cancer Prevention, Diagnosis, Treatment, and Survivorship: A Systematic Literature Review.

    Science.gov (United States)

    Gharaveis, Arsalan; Kazem-Zadeh, Mahshad

    2018-01-01

    The purpose of this literature review is to provide a better understanding of the impact that environmental design can have on the process of cancer prevention, diagnosis, treatment, and survivorship. Cancer is considered a chronic disease in the United States, and more than 1.6 million new cases are diagnosed annually. New strategies of cancer care propose patient-centered services to achieve the best outcome, and researchers have found that environmental design can be an important part of improving this care. Searches were conducted in the PubMed and Google Scholar databases as well as in specific healthcare design journals such as Health Environments Research & Design, Environmental Psychology, and Environment and Behavior. The criteria for articles included in the review were (a) English-language articles related to facility design, which addressed (b) the topics of built environment in relation to cancer diagnosis, treatment, and survivorship, and were (c) published in peer-reviewed journals between 2000 and 2017. Finally, 10 articles were selected, and the contents were analyzed. The selected articles demonstrate that environmental design is one of the critical factors for success throughout the whole continuum of cancer care from diagnosis to end-of-treatment. Some of the specific conclusions from the review are that "neighborhood-oriented" design strategies can be beneficial (by providing accessibility to all facilities along the patient's path), that access to nature for patients, staff, and visitors alike is associated with better outcomes, and that provisions for natural lighting and noise reduction are associated with cancer patients' well-being.

  18. Advanced human-system interface design review guideline. General evaluation model, technical development, and guideline description

    International Nuclear Information System (INIS)

    O'Hara, J.M.

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use

  19. Advanced human-system interface design review guideline. General evaluation model, technical development, and guideline description

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.M.

    1994-07-01

    Advanced control rooms will use advanced human-system interface (HSI) technologies that may have significant implications for plant safety in that they will affect the operator's overall role in the system, the method of information presentation, and the ways in which operators interact with the system. The U.S. Nuclear Regulatory Commission (NRC) reviews the HSI aspects of control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported to protect public health and safety. The principal guidance available to the NRC, however, was developed more than ten years ago, well before these technological changes. Accordingly, the human factors guidance needs to be updated to serve as the basis for NRC review of these advanced designs. The purpose of this project was to develop a general approach to advanced HSI review and the human factors guidelines to support NRC safety reviews of advanced systems. This two-volume report provides the results of the project. Volume I describes the development of the Advanced HSI Design Review Guideline (DRG) including (1) its theoretical and technical foundation, (2) a general model for the review of advanced HSIs, (3) guideline development in both hard-copy and computer-based versions, and (4) the tests and evaluations performed to develop and validate the DRG. Volume I also includes a discussion of the gaps in available guidance and a methodology for addressing them. Volume 2 provides the guidelines to be used for advanced HSI review and the procedures for their use.

  20. The Usefulness of Systematic Reviews of Animal Experiments for the Design of Preclinical and Clinical Studies

    Science.gov (United States)

    de Vries, Rob B. M.; Wever, Kimberley E.; Avey, Marc T.; Stephens, Martin L.; Sena, Emily S.; Leenaars, Marlies

    2014-01-01

    The question of how animal studies should be designed, conducted, and analyzed remains underexposed in societal debates on animal experimentation. This is not only a scientific but also a moral question. After all, if animal experiments are not appropriately designed, conducted, and analyzed, the results produced are unlikely to be reliable and the animals have in effect been wasted. In this article, we focus on one particular method to address this moral question, namely systematic reviews of previously performed animal experiments. We discuss how the design, conduct, and analysis of future (animal and human) experiments may be optimized through such systematic reviews. In particular, we illustrate how these reviews can help improve the methodological quality of animal experiments, make the choice of an animal model and the translation of animal data to the clinic more evidence-based, and implement the 3Rs. Moreover, we discuss which measures are being taken and which need to be taken in the future to ensure that systematic reviews will actually contribute to optimizing experimental design and thereby to meeting a necessary condition for making the use of animals in these experiments justified. PMID:25541545

  1. Human visual system automatically encodes sequential regularities of discrete events.

    Science.gov (United States)

    Kimura, Motohiro; Schröger, Erich; Czigler, István; Ohira, Hideki

    2010-06-01

    For our adaptive behavior in a dynamically changing environment, an essential task of the brain is to automatically encode sequential regularities inherent in the environment into a memory representation. Recent studies in neuroscience have suggested that sequential regularities embedded in discrete sensory events are automatically encoded into a memory representation at the level of the sensory system. This notion is largely supported by evidence from investigations using auditory mismatch negativity (auditory MMN), an event-related brain potential (ERP) correlate of an automatic memory-mismatch process in the auditory sensory system. However, it is still largely unclear whether or not this notion can be generalized to other sensory modalities. The purpose of the present study was to investigate the contribution of the visual sensory system to the automatic encoding of sequential regularities using visual mismatch negativity (visual MMN), an ERP correlate of an automatic memory-mismatch process in the visual sensory system. To this end, we conducted a sequential analysis of visual MMN in an oddball sequence consisting of infrequent deviant and frequent standard stimuli, and tested whether the underlying memory representation of visual MMN generation contains only a sensory memory trace of standard stimuli (trace-mismatch hypothesis) or whether it also contains sequential regularities extracted from the repetitive standard sequence (regularity-violation hypothesis). The results showed that visual MMN was elicited by first deviant (deviant stimuli following at least one standard stimulus), second deviant (deviant stimuli immediately following first deviant), and first standard (standard stimuli immediately following first deviant), but not by second standard (standard stimuli immediately following first standard). These results are consistent with the regularity-violation hypothesis, suggesting that the visual sensory system automatically encodes sequential

  2. Subcortical processing of speech regularities underlies reading and music aptitude in children

    Science.gov (United States)

    2011-01-01

    Background Neural sensitivity to acoustic regularities supports fundamental human behaviors such as hearing in noise and reading. Although the failure to encode acoustic regularities in ongoing speech has been associated with language and literacy deficits, how auditory expertise, such as the expertise that is associated with musical skill, relates to the brainstem processing of speech regularities is unknown. An association between musical skill and neural sensitivity to acoustic regularities would not be surprising given the importance of repetition and regularity in music. Here, we aimed to define relationships between the subcortical processing of speech regularities, music aptitude, and reading abilities in children with and without reading impairment. We hypothesized that, in combination with auditory cognitive abilities, neural sensitivity to regularities in ongoing speech provides a common biological mechanism underlying the development of music and reading abilities. Methods We assessed auditory working memory and attention, music aptitude, reading ability, and neural sensitivity to acoustic regularities in 42 school-aged children with a wide range of reading ability. Neural sensitivity to acoustic regularities was assessed by recording brainstem responses to the same speech sound presented in predictable and variable speech streams. Results Through correlation analyses and structural equation modeling, we reveal that music aptitude and literacy both relate to the extent of subcortical adaptation to regularities in ongoing speech as well as with auditory working memory and attention. Relationships between music and speech processing are specifically driven by performance on a musical rhythm task, underscoring the importance of rhythmic regularity for both language and music. Conclusions These data indicate common brain mechanisms underlying reading and music abilities that relate to how the nervous system responds to regularities in auditory input

  3. Subcortical processing of speech regularities underlies reading and music aptitude in children.

    Science.gov (United States)

    Strait, Dana L; Hornickel, Jane; Kraus, Nina

    2011-10-17

    Neural sensitivity to acoustic regularities supports fundamental human behaviors such as hearing in noise and reading. Although the failure to encode acoustic regularities in ongoing speech has been associated with language and literacy deficits, how auditory expertise, such as the expertise that is associated with musical skill, relates to the brainstem processing of speech regularities is unknown. An association between musical skill and neural sensitivity to acoustic regularities would not be surprising given the importance of repetition and regularity in music. Here, we aimed to define relationships between the subcortical processing of speech regularities, music aptitude, and reading abilities in children with and without reading impairment. We hypothesized that, in combination with auditory cognitive abilities, neural sensitivity to regularities in ongoing speech provides a common biological mechanism underlying the development of music and reading abilities. We assessed auditory working memory and attention, music aptitude, reading ability, and neural sensitivity to acoustic regularities in 42 school-aged children with a wide range of reading ability. Neural sensitivity to acoustic regularities was assessed by recording brainstem responses to the same speech sound presented in predictable and variable speech streams. Through correlation analyses and structural equation modeling, we reveal that music aptitude and literacy both relate to the extent of subcortical adaptation to regularities in ongoing speech as well as with auditory working memory and attention. Relationships between music and speech processing are specifically driven by performance on a musical rhythm task, underscoring the importance of rhythmic regularity for both language and music. These data indicate common brain mechanisms underlying reading and music abilities that relate to how the nervous system responds to regularities in auditory input. Definition of common biological underpinnings

  4. Subcortical processing of speech regularities underlies reading and music aptitude in children

    Directory of Open Access Journals (Sweden)

    Strait Dana L

    2011-10-01

    Full Text Available Abstract Background Neural sensitivity to acoustic regularities supports fundamental human behaviors such as hearing in noise and reading. Although the failure to encode acoustic regularities in ongoing speech has been associated with language and literacy deficits, how auditory expertise, such as the expertise that is associated with musical skill, relates to the brainstem processing of speech regularities is unknown. An association between musical skill and neural sensitivity to acoustic regularities would not be surprising given the importance of repetition and regularity in music. Here, we aimed to define relationships between the subcortical processing of speech regularities, music aptitude, and reading abilities in children with and without reading impairment. We hypothesized that, in combination with auditory cognitive abilities, neural sensitivity to regularities in ongoing speech provides a common biological mechanism underlying the development of music and reading abilities. Methods We assessed auditory working memory and attention, music aptitude, reading ability, and neural sensitivity to acoustic regularities in 42 school-aged children with a wide range of reading ability. Neural sensitivity to acoustic regularities was assessed by recording brainstem responses to the same speech sound presented in predictable and variable speech streams. Results Through correlation analyses and structural equation modeling, we reveal that music aptitude and literacy both relate to the extent of subcortical adaptation to regularities in ongoing speech as well as with auditory working memory and attention. Relationships between music and speech processing are specifically driven by performance on a musical rhythm task, underscoring the importance of rhythmic regularity for both language and music. Conclusions These data indicate common brain mechanisms underlying reading and music abilities that relate to how the nervous system responds to

  5. Surface-based prostate registration with biomechanical regularization

    Science.gov (United States)

    van de Ven, Wendy J. M.; Hu, Yipeng; Barentsz, Jelle O.; Karssemeijer, Nico; Barratt, Dean; Huisman, Henkjan J.

    2013-03-01

    Adding MR-derived information to standard transrectal ultrasound (TRUS) images for guiding prostate biopsy is of substantial clinical interest. A tumor visible on MR images can be projected onto ultrasound by using MR-US registration. A common approach is to use surface-based registration. We hypothesize that biomechanical modeling will better control deformation inside the prostate than a regular surface-based registration method. We developed a novel method by extending a surface-based registration with finite element (FE) simulation to better predict internal deformation of the prostate. For each of six patients, a tetrahedral mesh was constructed from the manual prostate segmentation. Next, the internal prostate deformation was simulated using the derived radial surface displacement as the boundary condition. The deformation field within the gland was calculated using the predicted FE node displacements and thin-plate spline interpolation. We tested our method on MR-guided MR biopsy imaging data, as landmarks can easily be identified on MR images. For evaluation of the registration accuracy we used 45 anatomical landmarks located in all regions of the prostate. Our results show that the median target registration error of a surface-based registration with biomechanical regularization is 1.88 mm, which is significantly different from 2.61 mm without biomechanical regularization. We can conclude that biomechanical FE modeling has the potential to improve the accuracy of multimodal prostate registration when compared to regular surface-based registration.
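
    The final interpolation step described above (thin-plate-spline interpolation of predicted FE node displacements into a dense deformation field) can be sketched with SciPy's radial-basis-function interpolator. The synthetic nodes and displacements below stand in for the authors' FE output and are purely illustrative.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Synthetic stand-ins for FE mesh node coordinates and their predicted
# displacements (the real pipeline would take these from the FE solver).
rng = np.random.default_rng(4)
nodes = rng.uniform(-1.0, 1.0, size=(200, 3))
disp = 0.1 * np.sin(np.pi * nodes) + 0.01 * rng.normal(size=nodes.shape)

# Thin-plate-spline interpolation of the 3-D displacement field.
tps = RBFInterpolator(nodes, disp, kernel="thin_plate_spline", smoothing=0.0)

# Evaluate a dense deformation field on a regular grid inside the volume.
axis = np.linspace(-0.8, 0.8, 20)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"), axis=-1).reshape(-1, 3)
dense_disp = tps(grid)
print(dense_disp.shape)   # (8000, 3): one displacement vector per grid point
```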

  6. Regularization in Matrix Relevance Learning

    NARCIS (Netherlands)

    Schneider, Petra; Bunte, Kerstin; Stiekema, Han; Hammer, Barbara; Villmann, Thomas; Biehl, Michael

    In this paper, we present a regularization technique to extend recently proposed matrix learning schemes in learning vector quantization (LVQ). These learning algorithms extend the concept of adaptive distance measures in LVQ to the use of relevance matrices. In general, metric learning can

  7. Left regular bands of groups of left quotients

    International Nuclear Information System (INIS)

    El-Qallali, A.

    1988-10-01

    A semigroup S which has a left regular band of groups as a semigroup of left quotients is shown to be the semigroup which is a left regular band of right reversible cancellative semigroups. An alternative characterization is provided by using spinned products. These results are applied to the case where S is a superabundant whose set of idempotents forms a left normal band. (author). 13 refs

  8. An Investigation of the Methods of Logicalizing the Code-Checking System for Architectural Design Review in New Taipei City

    Directory of Open Access Journals (Sweden)

    Wei-I Lee

    2016-12-01

    Full Text Available The New Taipei City Government developed a Code-checking System (CCS) using Building Information Modeling (BIM) technology to facilitate an architectural design review in 2014. This system was intended to solve problems caused by cognitive gaps between designer and reviewer in the design review process. Along with considering information technology, the most important issue for the system’s development has been the logicalization of literal building codes. Therefore, to enhance the reliability and performance of the CCS, this study uses the Fuzzy Delphi Method (FDM) on the basis of design thinking and communication theory to investigate the semantic difference and cognitive gaps among participants in the design review process and to propose the direction of system development. Our empirical results lead us to recommend grouping multi-stage screening and weighted assisted logicalization of non-quantitative building codes to improve the operability of CCS. Furthermore, CCS should integrate the Expert Evaluation System (EES) to evaluate the design value under qualitative building codes.

  9. Review of the Safety Design Approaches in Sodium Fast Reactors

    International Nuclear Information System (INIS)

    Suk, Soo Dong; Lee, Yong Bum

    2009-12-01

    The principle of defense in depth is essential in securing the safety of nuclear power plants, that is, to prevent core-damaging severe accidents and to keep the radiological consequences of accidents as low as reasonably achievable (ALARA). One of the major design features of sodium fast reactors (SFRs) is that they have a large amount of sodium in the reactor vessel, providing a large heat capacity, such that it is feasible to contain the consequences of severe core-damaging accidents within the vessel and primary system boundary. Containment of a severe accident within the primary system boundary, which is called in-vessel retention (IVR), is not a licensing requirement but is set as a design goal in most SFR designs in the context of risk minimization. The objective of this report is to broadly review and compare the approaches and efforts made in some of the major SFR designs of the US, Europe and Japan to prevent severe accidents and mitigate their consequences should they occur. Specifically, the subjects described in this report include design criteria or requirements, accident categorization and acceptance criteria, and design features to prevent and contain severe accidents

  10. Coronary artery bypass grafting hemodynamics and anastomosis design: a biomedical engineering review.

    Science.gov (United States)

    Ghista, Dhanjoo N; Kabinejadian, Foad

    2013-12-13

    In this paper, coronary arterial bypass grafting hemodynamics and anastomosis designs are reviewed. The paper specifically addresses the biomechanical factors for enhancement of the patency of coronary artery bypass grafts (CABGs). Stenosis of distal anastomosis, caused by thrombosis and intimal hyperplasia (IH), is the major cause of failure of CABGs. Strong correlations have been established between the hemodynamics and vessel wall biomechanical factors and the initiation and development of IH and thrombus formation. Accordingly, several investigations have been conducted and numerous anastomotic geometries and devices have been designed to better regulate the blood flow fields and distribution of hemodynamic parameters and biomechanical factors at the distal anastomosis, in order to enhance the patency of CABGs. Enhancement of longevity and patency rate of CABGs can eliminate the need for re-operation and can significantly lower morbidity, and thereby reduces medical costs for patients suffering from coronary stenosis. This invited review focuses on various endeavors made thus far to design a patency-enhancing optimized anastomotic configuration for the distal junction of CABGs.

  11. Rotating Hayward’s regular black hole as particle accelerator

    International Nuclear Information System (INIS)

    Amir, Muhammed; Ghosh, Sushant G.

    2015-01-01

    Recently, Bañados, Silk and West (BSW) demonstrated that the extremal Kerr black hole can act as a particle accelerator with arbitrarily high center-of-mass energy (E_CM) when the collision takes place near the horizon. The rotating Hayward’s regular black hole, apart from mass (M) and angular momentum (a), has a new parameter g (g>0 is a constant) that provides a deviation from the Kerr black hole. We demonstrate that for each g, with M=1, there exist critical a_E and r_H^E, which correspond to a regular extremal black hole with degenerate horizons, and a_E decreases whereas r_H^E increases with increasing g, while a < a_E describes a regular non-extremal black hole with outer and inner horizons. We apply the BSW process to the rotating Hayward’s regular black hole for different g, and demonstrate numerically that E_CM diverges in the vicinity of the horizon for the extremal cases, thereby suggesting that a rotating regular black hole can also act as a particle accelerator and thus in turn provide a suitable framework for Planck-scale physics. For the non-extremal case, there always exists a finite upper bound on E_CM, which increases with the deviation parameter g.
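
    For reference, the center-of-mass energy evaluated in BSW-type analyses for two colliding particles of equal mass m with four-velocities u_1 and u_2 is commonly written as

$$
E_{\mathrm{CM}} \;=\; \sqrt{2}\, m \,\sqrt{\,1 - g_{\mu\nu}\, u_1^{\mu} u_2^{\nu}\,},
$$

    and the divergence reported above corresponds to this quantity growing without bound as the collision point approaches the horizon of the extremal configuration. This is the standard BSW expression; the metric functions specific to the rotating Hayward solution are not reproduced here.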

  12. 41 CFR 109-1.5204 - Review and approval of a designated contractor's personal property management system.

    Science.gov (United States)

    2010-07-01

    ... property management system. (a) An initial review of a designated contractor's personal property management... follow-on contract. The purpose of the review is to determine whether the contractor's system provides... by the property administrator that all major system deficiencies identified in the review or...

  13. Regularity of difference equations on Banach spaces

    CERN Document Server

    Agarwal, Ravi P; Lizama, Carlos

    2014-01-01

    This work introduces readers to the topic of maximal regularity for difference equations. The authors systematically present the method of maximal regularity, outlining basic linear difference equations along with relevant results. They address recent advances in the field, as well as basic semigroup and cosine operator theories in the discrete setting. The authors also identify some open problems that readers may wish to take up for further research. This book is intended for graduate students and researchers in the area of difference equations, particularly those with advance knowledge of and interest in functional analysis.

  14. Review of current status for designing severe accident management support system

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwang Sub

    2000-05-01

    The development of operator support systems (OSS) is ongoing in many countries because of the complexity of nuclear power plants, both in design and in operation. Computerized operator support systems include monitoring of critical parameters, early detection of plant transients, monitoring of component status, plant maintenance, and safety parameter display, and operator support systems for these areas have been developed and are being used in some plants. Up to now, most operator support systems cover normal, abnormal, and emergency operation. Recently, however, operator support systems for severe accidents are being developed in some countries. Although severe accident phenomena have not yet been studied sufficiently, an operator support system covering severe accidents is to be developed in this study on the basis of the results available so far. To this end, the current status of operator support systems for normal/abnormal/emergency operation is first reviewed, and the positive and negative aspects of these systems are analyzed in terms of their characteristics; the major items that should be considered in designing a severe accident operator support system are derived from this review. In the survey of domestic and foreign operator support systems, they are reviewed in terms of safety parameter display systems, decision-making support systems, and procedure-tracking systems. For severe accidents, the severe accident management guideline (SAMG) developed by Westinghouse is reviewed; its characteristics, structure, and logical flow are studied. In addition, the critical parameters for severe accidents, which form the basis for operators' decision-making in severe accident management and are supplied to the operators and the technical support center, are also reviewed.

  15. Review of current status for designing severe accident management support system

    International Nuclear Information System (INIS)

    Jeong, Kwang Sub

    2000-05-01

    The development of operator support systems (OSS) is ongoing in many countries because of the complexity of nuclear power plants, both in design and in operation. Computerized operator support systems include monitoring of critical parameters, early detection of plant transients, monitoring of component status, plant maintenance, and safety parameter display, and operator support systems for these areas have been developed and are being used in some plants. Up to now, most operator support systems cover normal, abnormal, and emergency operation. Recently, however, operator support systems for severe accidents are being developed in some countries. Although severe accident phenomena have not yet been studied sufficiently, an operator support system covering severe accidents is to be developed in this study on the basis of the results available so far. To this end, the current status of operator support systems for normal/abnormal/emergency operation is first reviewed, and the positive and negative aspects of these systems are analyzed in terms of their characteristics; the major items that should be considered in designing a severe accident operator support system are derived from this review. In the survey of domestic and foreign operator support systems, they are reviewed in terms of safety parameter display systems, decision-making support systems, and procedure-tracking systems. For severe accidents, the severe accident management guideline (SAMG) developed by Westinghouse is reviewed; its characteristics, structure, and logical flow are studied. In addition, the critical parameters for severe accidents, which form the basis for operators' decision-making in severe accident management and are supplied to the operators and the technical support center, are also reviewed.

  16. Journal of Open Source Software (JOSS: design and first-year review

    Directory of Open Access Journals (Sweden)

    Arfon M. Smith

    2018-02-01

    Full Text Available This article describes the motivation, design, and progress of the Journal of Open Source Software (JOSS). JOSS is a free and open-access journal that publishes articles describing research software. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit. While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license. A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. The article is the entry point of a JOSS submission, which encompasses the full set of software artifacts. Submission and review proceed in the open, on GitHub. Editors, reviewers, and authors work collaboratively and openly. Unlike other journals, JOSS does not reject articles requiring major revision; while not yet accepted, articles remain visible and under review until the authors make adequate changes (or withdraw, if unable to meet requirements). Once an article is accepted, JOSS gives it a digital object identifier (DOI), deposits its metadata in Crossref, and the article can begin collecting citations on indexers like Google Scholar and other services. Authors retain copyright of their JOSS article, releasing it under a Creative Commons Attribution 4.0 International License. In its first year, starting in May 2016, JOSS published 111 articles, with more than 40 additional articles under review. JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative (OSI).

  17. Self-weighing in weight management: a systematic literature review.

    Science.gov (United States)

    Zheng, Yaguang; Klem, Mary Lou; Sereika, Susan M; Danford, Cynthia A; Ewing, Linda J; Burke, Lora E

    2015-02-01

    Regular self-weighing, which in this article is defined as weighing oneself regularly over a period of time (e.g., daily, weekly), is recommended as a weight loss strategy. However, the published literature lacks a review of the recent evidence provided by prospective, longitudinal studies. Moreover, no paper has reviewed the psychological effects of self-weighing. Therefore, the objective is to review the literature related to longitudinal associations between self-weighing and weight change as well as the psychological outcomes. Electronic literature searches in PubMed, Ovid PsycINFO, and Ebscohost CINAHL were conducted. Keywords included overweight, obesity, self-weighing, etc. Inclusion criteria included trials that were published in the past 25 years in English; participants were adults seeking weight loss treatment; results were based on longitudinal data. The results (N=17 studies) revealed that regular self-weighing was associated with more weight loss and not with adverse psychological outcomes (e.g., depression, anxiety). Findings demonstrated that the effect sizes of association between self-weighing and weight change varied across studies and also that the reported frequency of self-weighing varied across studies. The findings from prospective, longitudinal studies provide evidence that regular self-weighing has been associated with weight loss and not with negative psychological outcomes. © 2014 The Obesity Society.

  18. A Review of Methodological Approaches for the Design and Optimization of Wind Farms

    DEFF Research Database (Denmark)

    Herbert-Acero, José F.; Probst, Oliver; Réthoré, Pierre-Elouan

    2014-01-01

    This article presents a review of the state of the art of the Wind Farm Design and Optimization (WFDO) problem. The WFDO problem refers to a set of advanced planning actions needed to extremize the performance of wind farms, which may be composed of a few individual Wind Turbines (WTs) up to thousands. ... and offshore wind farms; and (3) to propose a comprehensive agenda for future research.

  19. Gamma regularization based reconstruction for low dose CT

    International Nuclear Information System (INIS)

    Zhang, Junfeng; Chen, Yang; Hu, Yining; Luo, Limin; Shu, Huazhong; Li, Bicao; Liu, Jin; Coatrieux, Jean-Louis

    2015-01-01

    Reducing the radiation dose in computerized tomography is today a major concern in radiology. Low dose computerized tomography (LDCT) offers a sound way to deal with this problem. However, more severe noise is observed in the reconstructed CT images under low dose scan protocols (e.g. lowered tube current or voltage values). In this paper we propose a Gamma regularization based algorithm for LDCT image reconstruction. This solution is flexible and provides a good balance between regularizations based on the l0-norm and the l1-norm. We evaluate the proposed approach using projection data from simulated phantoms and scanned Catphan phantoms. Qualitative and quantitative results show that the Gamma regularization based reconstruction can perform better in both edge preservation and noise suppression when compared with other norms. (paper)
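
    The Gamma regularizer of the record is not reproduced here. As a generic, heavily simplified stand-in for penalized reconstruction that trades off l1-like edge preservation against l2-like noise suppression, the sketch below runs gradient descent on a least-squares data term plus a Huber penalty on the finite differences of a 1-D signal; the forward operator, penalty, and names are assumptions for the example, not the paper's method.

```python
import numpy as np

def forward_diff(n):
    """Forward-difference operator with a zero last row (Neumann boundary)."""
    D = np.zeros((n, n))
    i = np.arange(n - 1)
    D[i, i], D[i, i + 1] = -1.0, 1.0
    return D

def huber_grad(z, delta):
    """Derivative of the Huber penalty: quadratic near zero, l1-like in the tails."""
    return np.clip(z / delta, -1.0, 1.0)

def penalized_recon(A, b, lam=0.2, delta=0.05, iters=3000):
    """Gradient descent on 0.5*||Ax - b||^2 + lam * sum_i Huber((Dx)_i)."""
    n = A.shape[1]
    D = forward_diff(n)
    step = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam * 4.0 / delta)  # Lipschitz bound
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + lam * (D.T @ huber_grad(D @ x, delta))
        x -= step * grad
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    x_true = np.r_[np.zeros(40), np.ones(30), 0.3 * np.ones(30)]  # piecewise-flat
    A = rng.normal(size=(80, 100)) / np.sqrt(80)
    b = A @ x_true + 0.02 * rng.normal(size=80)
    x_hat = penalized_recon(A, b)
    err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
    print(f"relative reconstruction error: {err:.3f}")
```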

  20. Regularization of plurisubharmonic functions with a net of good points

    OpenAIRE

    Li, Long

    2017-01-01

    The purpose of this article is to present a new regularization technique for quasi-plurisubharmonic functions on a compact Kaehler manifold. The idea is to regularize the function on local coordinate balls first, and then glue each piece together. Therefore, all the higher order terms in the complex Hessian of this regularization vanish at the center of each coordinate ball, and all the centers build a delta-net of the manifold eventually.