WorldWideScience

Sample records for level setting process

  1. Level sets and extrema of random processes and fields

    CERN Document Server

    Azais, Jean-Marc

    2009-01-01

    A timely and comprehensive treatment of random field theory with applications across diverse areas of study. Level Sets and Extrema of Random Processes and Fields discusses how to understand the properties of the level sets of paths as well as how to compute the probability distribution of their extremal values, which are two general classes of problems that arise in the study of random processes and fields and in related applications. This book provides a unified and accessible approach to these two topics and their relationship to classical theory and Gaussian processes and fields, and the most modern research findings are also discussed. The authors begin with an introduction to the basic concepts of stochastic processes, including a modern review of Gaussian fields and their classical inequalities. Subsequent chapters are devoted to Rice formulas, regularity properties, and recent results on the tails of the distribution of the maximum. Finally, applications of random fields to various areas of mathematics a...

  2. Appropriate criteria set for personnel promotion across organizational levels using analytic hierarchy process (AHP)

    Directory of Open Access Journals (Sweden)

    Charles Noven Castillo

    2017-01-01

    Currently, there are few established criteria sets specific to personnel promotion at each level of the organization. This study is conducted in order to develop a personnel promotion strategy by identifying specific sets of criteria for each level of the organization. The complexity of identifying the criteria set, along with the subjectivity of these criteria, requires the use of a multi-criteria decision-making approach, particularly the analytic hierarchy process (AHP). Results show different sets of criteria for each management level which are consistent with several frameworks in literature. These criteria sets would help avoid mismatch of employee skills and competencies and their job, and at the same time eliminate issues in personnel promotion such as favouritism, glass ceiling, and gender and physical attractiveness preference. This work also shows that personality and traits, job satisfaction, and experience and skills are more critical than social capital across different organizational levels. The contribution of this work is in identifying relevant criteria in developing a personnel promotion strategy across organizational levels.
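
    The core computation in AHP is extracting a priority (weight) vector from a pairwise comparison matrix, usually via its principal eigenvector, together with a consistency check. The minimal sketch below illustrates this on a hypothetical 3x3 criteria matrix; the matrix entries, the criteria named in the comment, and the n = 3 random index are illustrative, not values taken from the study.

        import numpy as np

        # Hypothetical pairwise comparisons (Saaty 1-9 scale) for three
        # promotion criteria: traits, job satisfaction, experience/skills.
        A = np.array([[1.0, 3.0, 2.0],
                      [1/3, 1.0, 1/2],
                      [1/2, 2.0, 1.0]])

        # The principal eigenvector of A gives the criterion weights.
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()

        # Consistency ratio; CR < 0.1 is conventionally acceptable.
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        cr = ci / 0.58                    # 0.58 = random index for n = 3
        print(w, cr)

    Scoring the alternatives against each criterion and aggregating with these weights completes the usual AHP pipeline.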

  3. Level Set Approach to Anisotropic Wet Etching of Silicon

    Directory of Open Access Journals (Sweden)

    Branislav Radjenović

    2010-05-01

    In this paper a methodology for the three-dimensional (3D) modeling and simulation of the profile evolution during anisotropic wet etching of silicon, based on the level set method, is presented. Etching rate anisotropy in silicon is modeled, taking into account full silicon symmetry properties, by means of an interpolation technique using experimentally obtained values for the etching rates along thirteen principal and high-index directions in KOH solutions. The resulting level set equations are solved using an open-source implementation of the sparse field method (the ITK library, developed in the medical image processing community), extended to the case of non-convex Hamiltonians. Simulation results for some interesting initial 3D shapes are shown, as well as some more practical examples illustrating anisotropic etching simulation in the presence of masks (simple square aperture mask, convex corner undercutting and convex corner compensation, formation of suspended structures). The obtained results show that the level set method can be used as an effective tool for wet etching process modeling, and that it is a viable alternative to the Cellular Automata method which now prevails in simulations of the wet etching process.
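
    The essential mechanism — advecting the level set with an orientation-dependent etch rate V(n) via phi_t + V(n)|grad phi| = 0 — can be sketched in a few lines. This 2D toy uses a hypothetical anisotropy function in place of the paper's interpolated KOH etch-rate table, a first-order Godunov upwind scheme rather than the ITK sparse field solver, and periodic boundaries.

        import numpy as np

        N, h, dt, steps = 128, 1.0, 0.2, 200
        x = np.arange(N) * h
        X, Y = np.meshgrid(x, x, indexing='ij')
        # Initial cavity: a small disk opened in the mask (phi < 0 is etched).
        phi = np.sqrt((X - 64.0)**2 + (Y - 64.0)**2) - 10.0

        def etch_rate(nx, ny):
            # Hypothetical anisotropy standing in for the interpolated KOH
            # etch-rate table: fast along the axes, slow along the diagonals.
            return 1.0 - 0.7 * 2.0 * np.abs(nx * ny)

        for _ in range(steps):
            gx, gy = np.gradient(phi, h)
            norm = np.sqrt(gx**2 + gy**2) + 1e-12
            V = etch_rate(gx / norm, gy / norm)
            # Godunov upwind |grad phi| for outward motion (V >= 0).
            dxm = (phi - np.roll(phi, 1, axis=0)) / h
            dxp = (np.roll(phi, -1, axis=0) - phi) / h
            dym = (phi - np.roll(phi, 1, axis=1)) / h
            dyp = (np.roll(phi, -1, axis=1) - phi) / h
            grad = np.sqrt(np.maximum(dxm, 0)**2 + np.minimum(dxp, 0)**2
                           + np.maximum(dym, 0)**2 + np.minimum(dyp, 0)**2)
            phi -= dt * V * grad   # phi decreases: the etched cavity grows

    Running this, the initially circular cavity develops flat, slow-etching facets along the diagonals, the qualitative signature of anisotropic wet etching.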

  4. Multi-phase flow monitoring with electrical impedance tomography using level set based method

    International Nuclear Information System (INIS)

    Liu, Dong; Khambampati, Anil Kumar; Kim, Sin; Kim, Kyung Youn

    2015-01-01

    Highlights: • LSM has been used for shape reconstruction to monitor multi-phase flow using EIT. • The multi-phase level set model for conductivity is represented by two level set functions. • LSM handles topological merging and breaking naturally during the evolution process. • To reduce the computational time, a narrowband technique was applied. • Use of the narrowband and optimization approach results in an efficient and fast method. - Abstract: In this paper, a level set-based reconstruction scheme is applied to multi-phase flow monitoring using electrical impedance tomography (EIT). The proposed scheme involves applying a narrowband level set method to solve the inverse problem of finding the interface between the regions having different conductivity values. The multi-phase level set model for the conductivity distribution inside the domain is represented by two level set functions. The key principle of the level set-based method is to implicitly represent the shape of the interface as the zero level set of a higher dimensional function and then solve a set of partial differential equations. The level set-based scheme handles topological merging and breaking naturally during the evolution process. It also offers several advantages compared to the traditional pixel-based approach. The level set-based method for multi-phase flow is tested with numerical and experimental data. It is found that the level set-based method has better reconstruction performance than the pixel-based method.
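
    A compact way to see how two level set functions represent a multi-phase conductivity distribution is to read each pixel's phase off the signs of the two functions. The sketch below uses a sharp Heaviside and hypothetical phase/conductivity conventions; the paper's smoothed Heaviside and exact region coding may differ.

        import numpy as np

        def conductivity_map(phi1, phi2, sigma=(1.0, 0.1, 2.5)):
            """Piecewise-constant conductivity from two level set functions.
            Hypothetical coding: phase 1 where phi1 > 0; phase 2 where
            phi1 <= 0 and phi2 > 0; phase 3 where both are <= 0."""
            H1 = (phi1 > 0).astype(float)
            H2 = (phi2 > 0).astype(float)
            return (sigma[0] * H1
                    + sigma[1] * (1 - H1) * H2
                    + sigma[2] * (1 - H1) * (1 - H2))

    During reconstruction only pixels in a narrow band around the zero sets of phi1 and phi2 need updating, which is where the reported speed-up comes from.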

  5. An efficient, scalable, and adaptable framework for solving generic systems of level-set PDEs

    Directory of Open Access Journals (Sweden)

    Kishore R. Mosaliganti

    2013-12-01

    In the last decade, level-set methods have been actively developed for applications in image registration, segmentation, tracking, and reconstruction. However, the development of a wide variety of level-set PDEs and their numerical discretization schemes, coupled with hybrid combinations of PDE terms, stopping criteria, and reinitialization strategies, has created a software logistics problem. In the absence of an integrative design, current toolkits support only specific types of level-set implementations, which restricts future algorithm development since extensions require significant code duplication and effort. In the new NIH/NLM Insight Toolkit (ITK v4) architecture, we implemented a level-set software design that is flexible to different numerical representations (continuous, discrete, and sparse) and grid representations (point-, mesh-, and image-based). Given that a generic PDE is a summation of different terms, we used a set of linked containers to which level-set terms can be added or deleted at any point in the evolution process. This container-based approach allows the user to explore and customize terms in the level-set equation at compile-time in a flexible manner. The framework is optimized so that repeated computations of common intensity functions (e.g. gradients and Hessians) across multiple terms are eliminated. The framework further enables the evolution of multiple level-sets for multi-object segmentation and processing of large datasets. For doing so, we restrict level-set domains to subsets of the image domain and use multithreading strategies to process groups of subdomains or level-set functions. Users can also select from a variety of reinitialization policies and stopping criteria. Finally, we developed a visualization framework that shows the evolution of a level-set in real-time to help guide algorithm development and parameter optimization. We demonstrate the power of our new framework using confocal microscopy images of cells in a…
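
    The container-based design described above — a generic level-set PDE assembled as a sum of addable/removable terms — can be sketched outside C++/ITK as follows. The class names and the two example terms (propagation and mean-curvature smoothing) are illustrative, not ITK's actual API.

        import numpy as np

        class TermContainer:
            """Linked container of PDE terms; the level-set update is their
            coefficient-weighted sum, mirroring the 'generic PDE = sum of
            terms' design."""
            def __init__(self):
                self.terms = []
            def add(self, term):
                self.terms.append(term)
            def remove(self, term):
                self.terms.remove(term)
            def rhs(self, phi):
                return sum(t.coefficient * t.value(phi) for t in self.terms)

        class PropagationTerm:
            def __init__(self, coefficient=1.0):
                self.coefficient = coefficient
            def value(self, phi):
                gx, gy = np.gradient(phi)
                return np.sqrt(gx**2 + gy**2)      # |grad phi|

        class CurvatureTerm:
            def __init__(self, coefficient=0.2):
                self.coefficient = coefficient
            def value(self, phi):
                gx, gy = np.gradient(phi)
                gxx, gxy = np.gradient(gx)
                _, gyy = np.gradient(gy)
                denom = (gx**2 + gy**2) ** 1.5 + 1e-12
                kappa = (gxx * gy**2 - 2 * gx * gy * gxy + gyy * gx**2) / denom
                return kappa * np.sqrt(gx**2 + gy**2)   # curvature flow

        # phi <- phi + dt * (sum of terms); terms can be swapped at will.
        pde = TermContainer()
        pde.add(PropagationTerm(1.0))
        pde.add(CurvatureTerm(0.2))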

  6. Evaluating healthcare priority setting at the meso level: A thematic review of empirical literature

    Science.gov (United States)

    Waithaka, Dennis; Tsofa, Benjamin; Barasa, Edwine

    2018-01-01

    Background: Decentralization of health systems has made sub-national/regional healthcare systems the backbone of healthcare delivery. These regions are tasked with the difficult responsibility of determining healthcare priorities and resource allocation amidst scarce resources. We aimed to review empirical literature that evaluated priority setting practice at the meso (sub-national) level of health systems. Methods: We systematically searched PubMed, ScienceDirect and Google Scholar databases and supplemented these with manual searching for relevant studies, based on the reference lists of selected papers. We only included empirical studies that described and evaluated, or that only evaluated, priority setting practice at the meso level. A total of 16 papers were identified from LMICs and HICs. We analyzed data from the selected papers by thematic review. Results: Few studies used systematic priority setting processes, and all but one were from HICs. Both formal and informal criteria are used in priority setting; however, informal criteria appear to be more pervasive in LMICs compared to HICs. The priority setting process at the meso level is a top-down approach with minimal involvement of the community. Accountability for reasonableness was the most common evaluative framework, used in 12 of the 16 studies. Efficiency, reallocation of resources and options for service delivery redesign were the most common outcome measures used to evaluate priority setting. Limitations: Our study was limited by the fact that there are very few empirical studies that have evaluated priority setting at the meso level, and there is a likelihood that we did not capture all the studies. Conclusions: Improving priority setting practices at the meso level is crucial to strengthening health systems. This can be achieved through incorporating and adapting systematic priority setting processes and frameworks to the context where used, and making considerations of both process…

  7. Novel gene sets improve set-level classification of prokaryotic gene expression data.

    Science.gov (United States)

    Holec, Matěj; Kuželka, Ondřej; Železný, Filip

    2015-10-28

    Set-level classification of gene expression data has received significant attention recently. In this setting, high-dimensional vectors of features corresponding to genes are converted into lower-dimensional vectors of features corresponding to biologically interpretable gene sets. The dimensionality reduction brings the promise of a decreased risk of overfitting, potentially resulting in improved accuracy of the learned classifiers. However, recent empirical research has not confirmed this expectation. Here we hypothesize that the reported unfavorable classification results in the set-level framework were due to the adoption of unsuitable gene sets defined typically on the basis of the Gene Ontology and the KEGG database of metabolic networks. We explore an alternative approach to defining gene sets, based on regulatory interactions, which we expect to collect genes with more correlated expression. We hypothesize that such more correlated gene sets will enable learning of more accurate classifiers. We define two families of gene sets using information on regulatory interactions, and evaluate them on phenotype-classification tasks using public prokaryotic gene expression data sets. From each of the two gene-set families, we first select the best-performing subtype. The two selected subtypes are then evaluated on independent (testing) data sets against state-of-the-art gene sets and against the conventional gene-level approach. The novel gene sets are indeed more correlated than the conventional ones, and lead to significantly more accurate classifiers. Novel gene sets defined on the basis of regulatory interactions thus improve set-level classification of gene expression data. The experimental scripts and other material needed to reproduce the experiments are available at http://ida.felk.cvut.cz/novelgenesets.tar.gz.
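
    Set-level classification reduces a samples-by-genes matrix to samples-by-gene-sets features before training an ordinary classifier. A minimal sketch of that aggregation step, assuming mean expression as the set statistic (the paper's exact transformation may differ) and hypothetical regulon-style gene sets:

        import numpy as np

        def set_level_features(X, gene_sets, gene_index):
            """Convert a samples-by-genes expression matrix X into
            samples-by-gene-set features by averaging member genes."""
            feats = np.empty((X.shape[0], len(gene_sets)))
            for j, genes in enumerate(gene_sets):
                cols = [gene_index[g] for g in genes if g in gene_index]
                feats[:, j] = X[:, cols].mean(axis=1)
            return feats

        # Hypothetical regulatory gene sets: genes sharing a regulator.
        gene_sets = [("lexA", "recA", "uvrA"), ("crp", "lacZ", "malE")]

    The resulting low-dimensional feature matrix can be fed to any standard classifier, which is where the reduced overfitting risk is expected to pay off.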

  8. Gradient augmented level set method for phase change simulations

    Science.gov (United States)

    Anumolu, Lakshman; Trujillo, Mario F.

    2018-01-01

    A numerical method for the simulation of two-phase flow with phase change based on the Gradient-Augmented-Level-set (GALS) strategy is presented. Sharp capturing of the vaporization process is enabled by: i) identification of the vapor-liquid interface, Γ (t), at the subgrid level, ii) discontinuous treatment of thermal physical properties (except for μ), and iii) enforcement of mass, momentum, and energy jump conditions, where the gradients of the dependent variables are obtained at Γ (t) and are consistent with their analytical expression, i.e. no local averaging is applied. Treatment of the jump in velocity and pressure at Γ (t) is achieved using the Ghost Fluid Method. The solution of the energy equation employs the sub-grid knowledge of Γ (t) to discretize the temperature Laplacian using second-order one-sided differences, i.e. the numerical stencil completely resides within each respective phase. To carefully evaluate the benefits or disadvantages of the GALS approach, the standard level set method is implemented and compared against the GALS predictions. The results show the expected trend that interface identification and transport are predicted noticeably better with GALS over the standard level set. This benefit carries over to the prediction of the Laplacian and temperature gradients in the neighborhood of the interface, which are directly linked to the calculation of the vaporization rate. However, when combining the calculation of interface transport and reinitialization with two-phase momentum and energy, the benefits of GALS are to some extent neutralized, and the causes for this behavior are identified and analyzed. Overall the additional computational costs associated with GALS are almost the same as those using the standard level set technique.
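
    The "subgrid" identification of Γ(t) is possible because GALS transports the gradient of the level set along with the level set itself, so the interface can be located between nodes with a cubic Hermite interpolant rather than linear interpolation. A minimal 1D sketch with hypothetical nodal data (cell width normalized to 1):

        import numpy as np
        from scipy.optimize import brentq

        def hermite(p0, m0, p1, m1, t):
            """Cubic Hermite interpolant on [0, 1] built from nodal values
            (p0, p1) and nodal derivatives (m0, m1) of the level set."""
            h00 = 2*t**3 - 3*t**2 + 1
            h10 = t**3 - 2*t**2 + t
            h01 = -2*t**3 + 3*t**2
            h11 = t**3 - t**2
            return h00*p0 + h10*m0 + h01*p1 + h11*m1

        # Hypothetical phi and dphi/dx at two nodes bracketing the interface.
        p0, m0, p1, m1 = -0.30, 1.05, 0.72, 0.98
        t_star = brentq(lambda t: hermite(p0, m0, p1, m1, t), 0.0, 1.0)
        print("subgrid interface location:", t_star)

    The same subgrid location is what allows the temperature Laplacian stencil to be built one-sided so that it never crosses Γ(t).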

  9. High-Level Waste System Process Interface Description

    International Nuclear Information System (INIS)

    D'Entremont, P.D.

    1999-01-01

    The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will eventually be disposed of in a Federal Repository; Saltstone, to be buried on site; and treated water effluent, which is released to the environment.

  10. Volume Sculpting Using the Level-Set Method

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2002-01-01

    In this paper, we propose the use of the Level-Set Method as the underlying technology of a volume sculpting system. The main motivation is that this leads to a very generic technique for deformation of volumetric solids. In addition, our method preserves a distance field volume representation. … A scaling window is used to adapt the Level-Set Method to local deformations and to allow the user to control the intensity of the tool. Level-Set based tools have been implemented in an interactive sculpting system, and we show sculptures created using the system.

  11. Patient level costing in Ireland: process, challenges and opportunities.

    Science.gov (United States)

    Murphy, A; McElroy, B

    2015-03-01

    In 2013, the Department of Health released their policy paper on hospital financing entitled Money Follows the Patient. A fundamental building block for the proposed financing model is patient level costing. This paper outlines the patient level costing process, identifies the opportunities and considers the challenges associated with the process in the Irish hospital setting. Methods involved a review of the existing literature which was complemented with an interview with health service staff. There are considerable challenges associated with implementing patient level costing including deficits in information and communication technologies and financial expertise as well as timeliness of coding. In addition, greater clinical input into the costing process is needed compared to traditional costing processes. However, there are long-term benefits associated with patient level costing; these include empowerment of clinical staff, improved transparency and price setting and greater fairness, especially in the treatment of outliers. These can help to achieve the Government's Health Strategy. The benefits of patient level costing need to be promoted and a commitment to investment in overcoming the challenges is required.

  12. Fast Sparse Level Sets on Graphics Hardware

    NARCIS (Netherlands)

    Jalba, Andrei C.; Laan, Wladimir J. van der; Roerdink, Jos B.T.M.

    The level-set method is one of the most popular techniques for capturing and tracking deformable interfaces. Although level sets have demonstrated great potential in visualization and computer graphics applications, such as surface editing and physically based modeling, their use for interactive…

  13. A new level set model for multimaterial flows

    Energy Technology Data Exchange (ETDEWEB)

    Starinshak, David P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Karni, Smadar [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of Mathematics; Roe, Philip L. [Univ. of Michigan, Ann Arbor, MI (United States). Dept. of AerospaceEngineering

    2014-01-08

    We present a new level set model for representing multimaterial flows in multiple space dimensions. Instead of associating a level set function with a specific fluid material, the function is associated with a pair of materials and the interface that separates them. A voting algorithm collects sign information from all level sets and determines material designations. M(M−1)/2 level set functions might be needed to represent a general M-material configuration; problems of practical interest use far fewer functions, since not all pairs of materials share an interface. The new model is less prone to producing indeterminate material states, i.e. regions claimed by more than one material (overlaps) or no material at all (vacuums). It outperforms existing material-based level set models without the need for reinitialization schemes, thereby avoiding additional computational costs and preventing excessive numerical diffusion.
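
    The voting step can be sketched directly: each pairwise level set casts a sign-based vote for one of its two materials, and each point takes the majority material. The sign convention and the three analytic level sets below are hypothetical; in the paper's terms, tied votes are where overlaps or vacuums would be flagged.

        import numpy as np

        def assign_materials(pairwise_phi, M):
            """pairwise_phi maps a material pair (a, b), a < b, to a level
            set array; phi > 0 votes for material a, phi <= 0 for b."""
            shape = next(iter(pairwise_phi.values())).shape
            votes = np.zeros((M,) + shape)
            for (a, b), phi in pairwise_phi.items():
                votes[a] += (phi > 0)
                votes[b] += (phi <= 0)
            return np.argmax(votes, axis=0)

        # M = 3 materials need at most M(M - 1)/2 = 3 pairwise functions.
        x = np.linspace(-1.0, 1.0, 64)
        X, Y = np.meshgrid(x, x, indexing="ij")
        labels = assign_materials({(0, 1): X, (0, 2): Y, (1, 2): Y - X}, 3)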

  14. Standard-Setting Methods as Measurement Processes

    Science.gov (United States)

    Nichols, Paul; Twing, Jon; Mueller, Canda D.; O'Malley, Kimberly

    2010-01-01

    Some writers in the measurement literature have been skeptical of the meaningfulness of achievement standards and described the standard-setting process as blatantly arbitrary. We argue that standard setting is more appropriately conceived of as a measurement process similar to student assessment. The construct being measured is the panelists'…

  15. Exploring the level sets of quantum control landscapes

    International Nuclear Information System (INIS)

    Rothman, Adam; Ho, Tak-San; Rabitz, Herschel

    2006-01-01

    A quantum control landscape is defined by the value of a physical observable as a functional of the time-dependent control field E(t) for a given quantum-mechanical system. Level sets through this landscape are prescribed by a particular value of the target observable at the final dynamical time T, regardless of the intervening dynamics. We present a technique for exploring a landscape level set, where a scalar variable s is introduced to characterize trajectories along these level sets. The control fields E(s,t) accomplishing this exploration (i.e., that produce the same value of the target observable for a given system) are determined by solving a differential equation over s in conjunction with the time-dependent Schroedinger equation. There is full freedom to traverse a level set, and a particular trajectory is realized by making an a priori choice for a continuous function f(s,t) that appears in the differential equation for the control field. The continuous function f(s,t) can assume an arbitrary form, and thus a level set generally contains a family of controls, where each control takes the quantum system to the same final target value, but produces a distinct control mechanism. In addition, although the observable value remains invariant over the level set, other dynamical properties (e.g., the degree of robustness to control noise) are not specifically preserved and can vary greatly. Examples are presented to illustrate the continuous nature of level-set controls and their associated induced dynamical features, including continuously morphing mechanisms for population control in model quantum systems.
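
    In a discretized setting the idea reduces to moving the control along directions orthogonal to the gradient of the observable, so the observable is preserved to first order in the step; the free function f(s,t) becomes an arbitrary exploration direction that is projected before use. A toy numpy sketch, where the observable and its gradient are hypothetical stand-ins for the Schroedinger-equation computation:

        import numpy as np

        def level_set_step(eps, grad_J, f, ds):
            """Advance the discretized control eps along a level set of the
            observable J: project the exploration direction f onto the null
            space of grad J, so J changes only at second order in ds."""
            g = grad_J(eps)
            f_perp = f - (f @ g) / (g @ g) * g
            return eps + ds * f_perp

        # Toy observable on a 5-point control discretization (hypothetical).
        J = lambda e: np.sin(e).sum()
        grad_J = lambda e: np.cos(e)
        eps = np.zeros(5)
        rng = np.random.default_rng(0)
        for _ in range(100):
            eps = level_set_step(eps, grad_J, rng.standard_normal(5), 1e-2)
        print(J(np.zeros(5)), J(eps))   # nearly equal along the level set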

  16. Level-Set Topology Optimization with Aeroelastic Constraints

    Science.gov (United States)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2015-01-01

    Level-set topology optimization is used to design a wing considering skin buckling under static aeroelastic trim loading, as well as dynamic aeroelastic stability (flutter). The level-set function is defined over the entire 3D volume of a transport aircraft wing box. Therefore, the approach is not limited by any predefined structure and can explore novel configurations. The Sequential Linear Programming (SLP) level-set method is used to solve the constrained optimization problems. The proposed method is demonstrated using three problems with mass, linear buckling and flutter objective and/or constraints. A constraint aggregation method is used to handle multiple buckling constraints in the wing skins. A continuous flutter constraint formulation is used to handle difficulties arising from discontinuities in the design space caused by a switching of the critical flutter mode.

  17. Correlation test to assess low-level processing of high-density oligonucleotide microarray data

    Directory of Open Access Journals (Sweden)

    Bergh Jonas

    2005-03-01

    Background: There are currently a number of competing techniques for low-level processing of oligonucleotide array data. The choice of technique has a profound effect on subsequent statistical analyses, but there is no method to assess whether a particular technique is appropriate for a specific data set, without reference to external data. Results: We analyzed coregulation between genes in order to detect insufficient normalization between arrays, where coregulation is measured in terms of statistical correlation. In a large collection of genes, a random pair of genes should have on average zero correlation, hence allowing a correlation test. For all data sets that we evaluated, and the three most commonly used low-level processing procedures, MAS5, RMA and MBEI, the housekeeping-gene normalization failed the test. For a real clinical data set, RMA and MBEI showed significant correlation for absent genes. We also found that a second round of normalization on the probe set level improved normalization significantly throughout. Conclusion: Previous evaluation of low-level processing in the literature has been limited to artificial spike-in and mixture data sets. In the absence of a known gold standard, the correlation criterion allows us to assess the appropriateness of low-level processing of a specific data set and the success of normalization for subsets of genes.
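
    The proposed criterion is easy to state operationally: sample many random gene pairs, compute their Pearson correlations across arrays, and test whether the mean is zero. A sketch, assuming X is a genes-by-arrays matrix; the z-statistic here is a simple stand-in for the paper's exact test.

        import numpy as np

        def correlation_test(X, n_pairs=10000, seed=0):
            """Mean Pearson correlation over random gene pairs in the
            genes-by-arrays matrix X. After adequate normalization the mean
            should be near zero; a clearly nonzero mean flags normalization
            problems."""
            rng = np.random.default_rng(seed)
            cors = []
            for _ in range(n_pairs):
                i, j = rng.choice(X.shape[0], size=2, replace=False)
                cors.append(np.corrcoef(X[i], X[j])[0, 1])
            cors = np.asarray(cors)
            z = cors.mean() / (cors.std(ddof=1) / np.sqrt(n_pairs))
            return cors.mean(), z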

  18. A new level set model for cell image segmentation

    Science.gov (United States)

    Ma, Jing-Feng; Hou, Kai; Bao, Shang-Lian; Chen, Chun

    2011-02-01

    In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment nucleolus and cytoplasm from their relatively complicated backgrounds. In the meantime, information about the cell images obtained in preprocessing with the OTSU algorithm is used to initialize the level set function in the model, which can speed up the segmentation and present satisfactory results in cell image processing.
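
    An OTSU-based initialization amounts to thresholding the image and seeding the level set as a binary step function. A minimal sketch, assuming a two-valued initialization ±c; the paper's constant and sign convention may differ.

        import numpy as np
        from skimage.filters import threshold_otsu

        def init_phi_from_otsu(image, c=2.0):
            """Binary level-set initialization from an Otsu threshold:
            phi = +c in the bright class, -c elsewhere. The constant c and
            the sign convention are assumptions, not the paper's values."""
            t = threshold_otsu(image)
            return np.where(image > t, c, -c).astype(float)

    Starting the evolution from a near-final partition like this is what shortens the number of iterations the variational model needs.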

  19. Psychology of Agenda-Setting Effects. Mapping the Paths of Information Processing

    Directory of Open Access Journals (Sweden)

    Maxwell McCombs

    2014-01-01

    The concept of Need for Orientation, introduced in the early years of agenda-setting research, provided a psychological explanation for why agenda-setting effects occur, in terms of what individuals bring to the media experience that determines the strength of these effects. Until recently, there had been no significant additions to our knowledge about the psychology of agenda-setting effects. However, the concept of Need for Orientation is only one part of the answer to the question of why agenda setting occurs. Recent research outlines a second way to answer the why question by describing the psychological process through which these effects occur. In this review, we integrate four contemporary studies that explicate dual psychological paths that lead to agenda-setting effects at the first and second levels. We then examine how information preferences and selective exposure can be profitably included in the agenda-setting framework. Complementing these new models of information processing and varying attention to media content and presentation cues, an expanded concept of psychological relevance, motivated reasoning goals (accuracy versus directional goals), and issue publics are discussed.

  20. The communication process in clinical settings.

    Science.gov (United States)

    Mathews, J J

    1983-01-01

    The communication of information in clinical settings is fraught with problems despite avowed common aims of practitioners and patients. Some reasons for the problematic nature of clinical communication are incongruent frames of reference about what information ought to be shared, sociolinguistic differences and social distance between practitioners and patients. Communication between doctors and nurses is also problematic, largely due to differences in ideology between the professions about what ought to be communicated to patients about their illness and who is ratified to give such information. Recent social changes, such as the Patient Bill of Rights and informed consent which assure access to information, and new conceptualizations of the nurse's role, warrant continued study of the communication process especially in regard to what constitutes appropriate and acceptable information about a patient's illness and who ought to give such information to patients. The purpose of this paper is to outline characteristics of communication in clinical settings and to provide a literature review of patient and practitioner interaction studies in order to reflect on why information exchange is problematic in clinical settings. A framework for presentation of the problems employs principles from interaction and role theory to investigate clinical communication from three viewpoints: (1) the level of shared knowledge between participants; (2) the effect of status, role and ideology on transactions; and (3) the regulation of communication imposed by features of the institution.

  1. A new level set model for cell image segmentation

    International Nuclear Information System (INIS)

    Ma Jing-Feng; Chen Chun; Hou Kai; Bao Shang-Lian

    2011-01-01

    In this paper we first determine three phases of cell images: background, cytoplasm and nucleolus, according to the general physical characteristics of cell images, and then develop a variational model, based on these characteristics, to segment nucleolus and cytoplasm from their relatively complicated backgrounds. In the meantime, information about the cell images obtained in preprocessing with the OTSU algorithm is used to initialize the level set function in the model, which can speed up the segmentation and present satisfactory results in cell image processing. (cross-disciplinary physics and related areas of science and technology)

  2. Predictive Active Set Selection Methods for Gaussian Processes

    DEFF Research Database (Denmark)

    Henao, Ricardo; Winther, Ole

    2012-01-01

    We propose an active set selection framework for Gaussian process classification for cases when the dataset is large enough to render its inference prohibitive. Our scheme consists of a two-step alternating procedure of active set update rules and hyperparameter optimization based upon marginal … high impact on the classifier decision process while removing those that are less relevant. We introduce two active set rules based on different criteria: the first one prefers a model with interpretable active set parameters, whereas the second puts computational complexity first, thus a model with active set parameters that directly control its complexity. We also provide both theoretical and empirical support for our active set selection strategy being a good approximation of a full Gaussian process classifier. Our extensive experiments show that our approach can compete with state…

  3. An investigation of children's levels of inquiry in an informal science setting

    Science.gov (United States)

    Clark-Thomas, Beth Anne

    Elementary school students' understanding of both science content and processes is enhanced by the higher level thinking associated with inquiry-based science investigations. Informal science setting personnel, elementary school teachers, and curriculum specialists charged with designing inquiry-based investigations would be well served by an understanding of the varying influence of certain factors upon students' willingness and ability to delve into such higher level inquiries. This study examined young children's use of inquiry-based materials and factors which may influence the level of inquiry they engaged in during informal science activities. An informal science setting was selected as the context for the examination of student inquiry behaviors because of the rich inquiry-based environment present at the site and the benefits previously noted in the research regarding the impact of informal science settings upon the construction of knowledge in science. The study revealed several patterns of behavior among children when they are engaged in inquiry-based activities at informal science exhibits. These repeated behaviors varied in the children's apparent purposeful use of the materials at the exhibits. These levels of inquiry behavior were taxonomically defined as high/medium/low within this study utilizing a researcher-developed tool. Furthermore, in this study adult interventions, questions, or prompting were found to impact the level of inquiry engaged in by the children. This study revealed that higher levels of inquiry were preceded by task-directed and physical-feature prompts. Moreover, the levels of inquiry behaviors were halted, even lowered, when preceded by a prompt that focused on a science content or concept question. Results of this study have implications for the enhancement of inquiry-based science activities in elementary schools as well as in informal science settings. These findings have significance for all science educators.

  4. Priority setting at the micro-, meso- and macro-levels in Canada, Norway and Uganda.

    Science.gov (United States)

    Kapiriri, Lydia; Norheim, Ole Frithjof; Martin, Douglas K

    2007-06-01

    The objectives of this study were (1) to describe the process of healthcare priority setting in Ontario-Canada, Norway and Uganda at the three levels of decision-making; and (2) to evaluate the description using the framework for fair priority setting, accountability for reasonableness, so as to identify lessons of good practice. We carried out case studies involving key informant interviews with 184 health practitioners and health planners from the macro-level, meso-level and micro-level in Canada-Ontario, Norway and Uganda (selected by virtue of their varying experiences in priority setting). Interviews were audio-recorded, transcribed and analyzed using a modified thematic approach. The descriptions were evaluated against the four conditions of "accountability for reasonableness": relevance, publicity, revisions and enforcement. Areas of adherence to these conditions were identified as lessons of good practice; areas of non-adherence were identified as opportunities for improvement. (i) At the macro-level, in all three countries, cabinet makes most of the macro-level resource allocation decisions, and these are influenced by politics, public pressure, and advocacy. Decisions within the ministries of health are based on objective formulae and evidence. International priorities influenced decisions in Uganda. Some priority-setting reasons are publicized through circulars, printed documents and the Internet in Canada and Norway. At the meso-level, hospital priority-setting decisions were made by the hospital managers and were based on national priorities, guidelines, and evidence. Hospital departments that handle emergencies, such as surgery, were prioritized. Some of the reasons are available on the hospital intranet or presented at meetings. Micro-level practitioners considered medical and social worth criteria. These reasons are not publicized. Many practitioners lacked knowledge of the macro- and meso-level priority-setting processes. (ii) Evaluation…

  5. Processing TES Level-2 Data

    Science.gov (United States)

    Poosti, Sassaneh; Akopyan, Sirvard; Sakurai, Regina; Yun, Hyejung; Saha, Pranjit; Strickland, Irina; Croft, Kevin; Smith, Weldon; Hoffman, Rodney; Koffend, John

    2006-01-01

    The TES Level-2 Subsystem is a set of computer programs that performs functions complementary to those of the program summarized in the immediately preceding article. TES Level-2 data pertain to retrieved species (or temperature) profiles, and errors thereof. Geolocation, quality, and other data (e.g., surface characteristics for nadir observations) are also included. The subsystem processes gridded meteorological information and extracts parameters that can be interpolated to the appropriate latitude, longitude, and pressure level based on the date and time. Radiances are simulated using the aforementioned meteorological information for initial guesses, and spectroscopic-parameter tables are generated. At each step of the retrieval, a nonlinear-least-squares-solving routine is run over multiple iterations, retrieving a subset of atmospheric constituents, and error analysis is performed. Scientific TES Level-2 data products are written in a format known as Hierarchical Data Format Earth Observing System 5 (HDF-EOS 5) for public distribution.

  6. An optimized process flow for rapid segmentation of cortical bones of the craniofacial skeleton using the level-set method.

    Science.gov (United States)

    Szwedowski, T D; Fialkov, J; Pakdel, A; Whyne, C M

    2013-01-01

    Accurate representation of skeletal structures is essential for quantifying structural integrity, for developing accurate models, for improving patient-specific implant design and in image-guided surgery applications. The complex morphology of thin cortical structures of the craniofacial skeleton (CFS) represents a significant challenge with respect to accurate bony segmentation. This technical study presents optimized processing steps to segment the three-dimensional (3D) geometry of thin cortical bone structures from CT images. In this procedure, anisotropic filtering and a connected-components scheme were utilized to isolate and enhance the internal boundaries between craniofacial cortical and trabecular bone. Subsequently, the shell-like nature of cortical bone was exploited using boundary-tracking level-set methods with optimized parameters determined from a large-scale sensitivity analysis. The process was applied to clinical CT images acquired from two cadaveric CFSs. The accuracy of the automated segmentations was determined based on their volumetric concurrencies with visually optimized manual segmentations, without statistical appraisal. The full CFSs demonstrated volumetric concurrencies of 0.904 and 0.719; accuracy increased to concurrencies of 0.936 and 0.846 when considering only the maxillary region. The highly automated approach presented here is able to segment the cortical shell and trabecular boundaries of the CFS in clinical CT images. The results indicate that initial scan resolution and cortical-trabecular bone contrast may impact performance. Future application of these steps to larger data sets will enable the determination of the method's sensitivity to differences in image quality and CFS morphology.

  7. Theorizing and researching levels of processing in self-regulated learning.

    Science.gov (United States)

    Winne, Philip H

    2018-03-01

    Deep versus surface knowledge is widely discussed by educational practitioners. A corresponding construct, levels of processing, has received extensive theoretical and empirical attention in learning science and psychology. In both arenas, lower levels of information and shallower levels of processing are predicted, and generally empirically demonstrated, to limit the knowledge learners gain, curtail what they can do with newly acquired knowledge, and shorten the life span of recently acquired knowledge. I recapitulate major accounts of levels or depth of information and information processing to set a stage for conceptualizing, first, self-regulated learning (SRL) from this perspective and, second, how a "levels-sensitive" approach might be implemented in research about SRL. I merge the levels construct into a model of SRL (Winne, 2011, Handbook of self-regulation of learning and performance (pp. 15-32), New York: Routledge; Winne, 2017b, Handbook of self-regulation of learning and performance (2nd ed.), New York: Routledge; Winne & Hadwin, 1998, Metacognition in educational theory and practice (pp. 277-304), Mahwah, NJ: Lawrence Erlbaum) conceptually and with respect to operationally defining the levels construct in the context of SRL in relation to each of the model's four phases - surveying task conditions, setting goals and planning, engaging the task, and composing major adaptations for future tasks. Select illustrations are provided for each phase of SRL. Regarding phase 3, a software system called nStudy is introduced as state-of-the-art instrumentation for gathering fine-grained, time-stamped trace data about information learners select for processing and operations they use to process that information. Self-regulated learning can be viewed through a lens of the levels construct, and operational definitions can be designed to research SRL with respect to levels. While information can be organized arbitrarily deeply, the levels construct may not be particularly…

  8. Priority Setting for Universal Health Coverage: We Need to Focus Both on Substance and on Process; Comment on “Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, not Just More Evidence on Cost-Effectiveness”

    Directory of Open Access Journals (Sweden)

    Jeremy A. Lauer

    2017-10-01

    In an editorial published in this journal, Baltussen et al argue that information on cost-effectiveness is not sufficient for priority setting for universal health coverage (UHC), a claim which is correct as far as it goes. However, their focus on the procedural legitimacy of ‘micro’ priority setting processes (eg, decisions concerning the reimbursement of specific interventions), and their related assumption that values for priority setting are determined only at this level, leads them to ignore the relevance of higher level, ‘macro’ priority setting processes, for example, consultations held by World Health Organization (WHO) Member States and other global stakeholders that have resulted in widespread consensus on the principles of UHC. Priority setting is not merely about discrete choices, nor should the focus be exclusively (or even mainly) on improving the procedural elements of micro priority setting processes. Systemic activities that shape the health system environment, such as strategic planning, as well as the substantive content of global policy instruments, are critical elements for priority setting for UHC.

  9. Joint level-set and spatio-temporal motion detection for cell segmentation.

    Science.gov (United States)

    Boukari, Fatima; Makrogiannis, Sokratis

    2016-08-10

    Cell segmentation is a critical step for quantification and monitoring of cell cycle progression, cell migration, and growth control to investigate cellular immune response, embryonic development, tumorigenesis, and drug effects on live cells in time-lapse microscopy images. In this study, we propose a joint spatio-temporal diffusion and region-based level-set optimization approach for moving cell segmentation. Moving regions are initially detected in each set of three consecutive sequence images by numerically solving a system of coupled spatio-temporal partial differential equations. In order to standardize intensities of each frame, we apply a histogram transformation approach to match the pixel intensities of each processed frame with an intensity distribution model learned from all frames of the sequence during the training stage. After the spatio-temporal diffusion stage is completed, we compute the edge map by nonparametric density estimation using Parzen kernels. This process is followed by watershed-based segmentation and moving cell detection. We use this result as an initial level-set function to evolve the cell boundaries, refine the delineation, and optimize the final segmentation result. We applied this method to several datasets of fluorescence microscopy images with varying levels of difficulty with respect to cell density, resolution, contrast, and signal-to-noise ratio. We compared the results with those produced by Chan and Vese segmentation, a temporally linked level-set technique, and nonlinear diffusion-based segmentation. We validated all segmentation techniques against reference masks provided by the international Cell Tracking Challenge consortium. The proposed approach delineated cells with an average Dice similarity coefficient of 89% over a variety of simulated and real fluorescent image sequences. It yielded average improvements of 11% in segmentation accuracy compared to both strictly spatial and temporally linked Chan…
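
    The intensity-standardization step can be approximated with off-the-shelf histogram matching: map each frame's intensity distribution onto a reference. A sketch using scikit-image; using a single reference frame is an assumption, since the paper learns its intensity-distribution model from all frames of the sequence during a training stage.

        import numpy as np
        from skimage.exposure import match_histograms

        def standardize_sequence(frames, reference):
            """Match each frame's intensity histogram to a reference frame
            before the spatio-temporal diffusion step."""
            return [match_histograms(np.asarray(f), np.asarray(reference))
                    for f in frames]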

  10. On multiple level-set regularization methods for inverse problems

    International Nuclear Information System (INIS)

    DeCezaro, A; Leitão, A; Tai, X-C

    2009-01-01

    We analyze a multiple level-set method for solving inverse problems with piecewise constant solutions. This method corresponds to an iterated Tikhonov method for a particular Tikhonov functional G_α based on TV–H^1 penalization. We define generalized minimizers for our Tikhonov functional and establish an existence result. Moreover, we prove convergence and stability results for the proposed Tikhonov method. A multiple level-set algorithm is derived from the first-order optimality conditions for the Tikhonov functional G_α, similarly to the iterated Tikhonov method. The proposed multiple level-set method is tested on an inverse potential problem. Numerical experiments show that the method is able to recover multiple objects as well as multiple contrast levels.
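
    For orientation, a TV–H^1 penalized Tikhonov functional for L level set functions φ_ℓ, with forward operator F, data y, and the piecewise constant parameter u assembled from the φ_ℓ, can plausibly be written as below; the paper's exact weighting and Heaviside conventions may differ.

        G_\alpha(\phi_1,\dots,\phi_L)
          = \big\| F\big(u(\phi_1,\dots,\phi_L)\big) - y \big\|^2
          + \alpha \sum_{\ell=1}^{L}
            \Big( \beta\, |\phi_\ell|_{TV} + \|\phi_\ell\|_{H^1}^2 \Big)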

  11. A level set method for multiple sclerosis lesion segmentation.

    Science.gov (United States)

    Zhao, Yue; Guo, Shuxu; Luo, Min; Shi, Xue; Bilello, Michel; Zhang, Shaoxiang; Li, Chunming

    2018-06-01

    In this paper, we present a level set method for multiple sclerosis (MS) lesion segmentation from FLAIR images in the presence of intensity inhomogeneities. We use a three-phase level set formulation of segmentation and bias field estimation to segment MS lesions, normal tissue (including GM and WM), CSF, and the background in FLAIR images. To save computational load, we derive a two-phase formulation from the original multi-phase level set formulation to segment the MS lesions and normal tissue regions. The derived method inherits the desirable ability of the original level set method to precisely locate object boundaries, while simultaneously performing segmentation and estimation of the bias field to deal with intensity inhomogeneity. Experimental results demonstrate the advantages of our method over other state-of-the-art methods in terms of segmentation accuracy. Copyright © 2017 Elsevier Inc. All rights reserved.
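
    A common way to realize a three-phase model with two level set functions is through Heaviside-based membership functions with a multiplicative bias field. The schematic form below is an assumption based on the standard formulation of this model family, not a quotation of the paper; b is the bias field, c_i are the phase means, and H is a smoothed Heaviside.

        I(x) \approx b(x) \sum_{i=1}^{3} c_i\, M_i(x), \qquad
        M_1 = H(\phi_1)\,H(\phi_2), \quad
        M_2 = H(\phi_1)\big(1 - H(\phi_2)\big), \quad
        M_3 = 1 - H(\phi_1)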

  12. Stochastic level-set variational implicit-solvent approach to solute-solvent interfacial fluctuations

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Shenggao, E-mail: sgzhou@suda.edu.cn, E-mail: bli@math.ucsd.edu [Department of Mathematics and Mathematical Center for Interdiscipline Research, Soochow University, 1 Shizi Street, Jiangsu, Suzhou 215006 (China); Sun, Hui; Cheng, Li-Tien [Department of Mathematics, University of California, San Diego, La Jolla, California 92093-0112 (United States); Dzubiella, Joachim [Soft Matter and Functional Materials, Helmholtz-Zentrum Berlin, 14109 Berlin, Germany and Institut für Physik, Humboldt-Universität zu Berlin, 12489 Berlin (Germany); Li, Bo, E-mail: sgzhou@suda.edu.cn, E-mail: bli@math.ucsd.edu [Department of Mathematics and Quantitative Biology Graduate Program, University of California, San Diego, La Jolla, California 92093-0112 (United States); McCammon, J. Andrew [Department of Chemistry and Biochemistry, Department of Pharmacology, Howard Hughes Medical Institute, University of California, San Diego, La Jolla, California 92093-0365 (United States)

    2016-08-07

    Recent years have seen the initial success of a variational implicit-solvent model (VISM), implemented with a robust level-set method, in capturing efficiently different hydration states and providing quantitatively good estimation of solvation free energies of biomolecules. The level-set minimization of the VISM solvation free-energy functional of all possible solute-solvent interfaces or dielectric boundaries predicts an equilibrium biomolecular conformation that is often close to an initial guess. In this work, we develop a theory in the form of Langevin geometrical flow to incorporate solute-solvent interfacial fluctuations into the VISM. Such fluctuations are crucial to biomolecular conformational changes and binding processes. We also develop a stochastic level-set method to numerically implement such a theory. We describe the interfacial fluctuation through the “normal velocity” that is the solute-solvent interfacial force, derive the corresponding stochastic level-set equation in the sense of Stratonovich so that the surface representation is independent of the choice of implicit function, and develop numerical techniques for solving such an equation and processing the numerical data. We apply our computational method to study the dewetting transition in the system of two hydrophobic plates and a hydrophobic cavity of a synthetic host molecule cucurbit[7]uril. Numerical simulations demonstrate that our approach can describe an underlying system jumping out of a local minimum of the free-energy functional and can capture dewetting transitions of hydrophobic systems. In the case of two hydrophobic plates, we find that the wavelength of interfacial fluctuations has a strong influence on the dewetting transition. In addition, we find that the estimated energy barrier of the dewetting transition scales quadratically with the inter-plate distance, agreeing well with existing studies of molecular dynamics simulations. Our work is a first step toward the…
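
    Schematically, such a Langevin geometrical flow adds noise to the normal velocity of the ordinary level-set equation; written in Stratonovich form (so that the dynamics do not depend on the choice of implicit function), it reads roughly as below, with v_n the deterministic normal velocity (the VISM interfacial force), σ the noise strength, and W_t a Wiener process. This is a schematic reconstruction, not the paper's exact equation.

        d\phi = |\nabla\phi| \,\big( v_n\, dt + \sigma \circ dW_t \big)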

  13. A parametric level-set method for partially discrete tomography

    NARCIS (Netherlands)

    A. Kadu (Ajinkya); T. van Leeuwen (Tristan); K.J. Batenburg (Joost)

    2017-01-01

    This paper introduces a parametric level-set method for tomographic reconstruction of partially discrete images. Such images consist of a continuously varying background and an anomaly with a constant (known) grey-value. We express the geometry of the anomaly using a level-set function,…

  14. Three-Dimensional Simulation of DRIE Process Based on the Narrow Band Level Set and Monte Carlo Method

    Directory of Open Access Journals (Sweden)

    Jia-Cheng Yu

    2018-02-01

    A three-dimensional topography simulation of deep reactive ion etching (DRIE) is developed based on the narrow band level set method for surface evolution and the Monte Carlo method for flux distribution. The advanced level set method is implemented to simulate the time-dependent movements of the etched surface. Meanwhile, accelerated by a ray tracing algorithm, the Monte Carlo method incorporates all dominant physical and chemical mechanisms such as ion-enhanced etching, ballistic transport, ion scattering, and sidewall passivation. Modified models of charged particles and neutral particles are used to determine their contributions to the etching rate. Effects such as the scalloping effect and the lag effect are investigated in simulations and experiments. In addition, quantitative analyses are conducted to measure the simulation error. Finally, this simulator will serve as an accurate prediction tool for some MEMS fabrications.
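
    The flux-distribution half of such a simulator starts from Monte Carlo sampling of particle directions. The sketch below estimates the direct ion flux onto a surface element from a hypothetical narrow Gaussian angular distribution; shadowing, re-emission, and the ray-tracing acceleration are omitted.

        import numpy as np

        def ion_flux(normal, sigma_theta=0.05, n_samples=100_000, seed=1):
            """Monte Carlo estimate of direct ion flux onto a surface element
            with unit normal `normal`, for ions drawn from a hypothetical
            narrow Gaussian angular distribution about the vertical."""
            rng = np.random.default_rng(seed)
            theta = rng.normal(0.0, sigma_theta, n_samples)   # polar angle
            psi = rng.uniform(0.0, 2 * np.pi, n_samples)      # azimuth
            d = np.stack([np.sin(theta) * np.cos(psi),
                          np.sin(theta) * np.sin(psi),
                          -np.cos(theta)], axis=1)            # downward ions
            cos_inc = -d @ np.asarray(normal, dtype=float)
            return np.maximum(cos_inc, 0.0).mean()

        print(ion_flux([0.0, 0.0, 1.0]))   # trench bottom: flux near 1
        print(ion_flux([1.0, 0.0, 0.0]))   # sidewall: flux near 0

    The bottom-versus-sidewall flux imbalance this produces is the driver of the lag and scalloping effects studied in the paper.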

  15. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation.

    Science.gov (United States)

    Barasa, Edwine W; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-09-16

    Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these complementary schools of thought.

  16. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    Science.gov (United States)

    Barasa, Edwine W.; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-01-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from these…

  17. Setting Healthcare Priorities at the Macro and Meso Levels: A Framework for Evaluation

    Directory of Open Access Journals (Sweden)

    Edwine W. Barasa

    2015-11-01

    Background: Priority setting in healthcare is a key determinant of health system performance. However, there is no widely accepted priority setting evaluation framework. We reviewed literature with the aim of developing and proposing a framework for the evaluation of macro and meso level healthcare priority setting practices. Methods: We systematically searched Econlit, PubMed, CINAHL, and EBSCOhost databases and supplemented this with searches in Google Scholar, relevant websites and reference lists of relevant papers. A total of 31 papers on evaluation of priority setting were identified. These were supplemented by broader theoretical literature related to evaluation of priority setting. A conceptual review of selected papers was undertaken. Results: Based on a synthesis of the selected literature, we propose an evaluative framework that requires that priority setting practices at the macro and meso levels of the health system meet the following conditions: (1) Priority setting decisions should incorporate both efficiency and equity considerations as well as the following outcomes; (a) Stakeholder satisfaction, (b) Stakeholder understanding, (c) Shifted priorities (reallocation of resources), and (d) Implementation of decisions. (2) Priority setting processes should also meet the procedural conditions of (a) Stakeholder engagement, (b) Stakeholder empowerment, (c) Transparency, (d) Use of evidence, (e) Revisions, (f) Enforcement, and (g) Being grounded on community values. Conclusion: Available frameworks for the evaluation of priority setting are mostly grounded on procedural requirements, while few have included outcome requirements. There is, however, increasing recognition of the need to incorporate both consequential and procedural considerations in priority setting practices. In this review, we adapt an integrative approach to develop and propose a framework for the evaluation of priority setting practices at the macro and meso levels that draws from…

  18. Demons versus level-set motion registration for coronary 18F-sodium fluoride PET

    Science.gov (United States)

    Rubeaux, Mathieu; Joshi, Nikhil; Dweck, Marc R.; Fletcher, Alison; Motwani, Manish; Thomson, Louise E.; Germano, Guido; Dey, Damini; Berman, Daniel S.; Newby, David E.; Slomka, Piotr J.

    2016-03-01

    plausible. Therefore, the level-set technique will likely require additional post-processing steps. On the other hand, the observed TBR increases were the highest for the level-set technique. Further investigation of the optimal registration technique for this novel coronary PET imaging method is warranted.

  19. Structural level set inversion for microwave breast screening

    International Nuclear Information System (INIS)

    Irishina, Natalia; Álvarez, Diego; Dorn, Oliver; Moscoso, Miguel

    2010-01-01

    We present a new inversion strategy for the early detection of breast cancer from microwave data which is based on a new multiphase level set technique. This novel structural inversion method uses a modification of the color level set technique adapted to the specific situation of structural breast imaging, taking into account the high complexity of the breast tissue. We only use data of a few microwave frequencies for detecting the tumors hidden in this complex structure. Three level set functions are employed for describing four different types of breast tissue, where each of these four regions is allowed to have a complicated topology and an interior structure which needs to be estimated from the data simultaneously with the region interfaces. The algorithm consists of several stages of increasing complexity. In each stage, more details about the anatomical structure of the breast interior are incorporated into the inversion model. The synthetic breast models which are used for creating simulated data are based on real MRI images of the breast and are therefore quite realistic. Our results demonstrate the potential and feasibility of the proposed level set technique for detecting, locating and characterizing a small tumor in its early stage of development embedded in such a realistic breast model. Both the data acquisition simulation and the inversion are carried out in 2D.

  20. Level set segmentation of medical images based on local region statistics and maximum a posteriori probability.

    Science.gov (United States)

    Cui, Wenchao; Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In the level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show the desirable performance of our method.
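
    To make the evolution step concrete, here is a minimal two-phase sketch in the spirit of such variational level-set segmentation. It is a simplified stand-in, not the authors' model: it uses global region means rather than local Gaussian statistics, and omits the bias field; all names and parameters are illustrative.

```python
import numpy as np

def evolve_level_set(image, phi, n_iter=200, dt=0.5, mu=0.2):
    """Evolve a level set function phi on a 2-D image by gradient
    descent on a piecewise-constant two-phase energy. The zero level
    set of phi is the segmenting contour."""
    for _ in range(n_iter):
        inside = phi > 0
        # Mean intensity of each region (guard against empty regions).
        c1 = image[inside].mean() if inside.any() else 0.0
        c2 = image[~inside].mean() if (~inside).any() else 0.0
        # Data term: push phi up where the pixel matches region 1 better.
        force = (image - c2) ** 2 - (image - c1) ** 2
        # Crude smoothing term: the Laplacian of phi approximates curvature flow.
        lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
               + np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)
        phi = phi + dt * (force / (np.abs(force).max() + 1e-9) + mu * lap)
    return phi
```

    In the paper's formulation, the data term would instead come from the MAP criterion with locally estimated means, variances and the bias field.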

  1. Novel room-temperature-setting phosphate ceramics for stabilizing combustion products and low-level mixed wastes

    International Nuclear Information System (INIS)

    Wagh, A.S.; Singh, D.

    1994-01-01

    Argonne National Laboratory, with support from the Office of Technology in the US Department of Energy (DOE), has developed a new process employing novel, chemically bonded ceramic materials to stabilize secondary waste streams. Such waste streams result from the thermal processes used to stabilize low-level, mixed wastes. The process will help the electric power industry treat its combustion and low-level mixed wastes. The ceramic materials are strong, dense, leach-resistant, and inexpensive to fabricate. The room-temperature-setting process allows stabilization of volatile components containing lead, mercury, cadmium, chromium, and nickel. The process also provides effective stabilization of fossil fuel combustion products. It is most suitable for treating fly and bottom ashes

  2. Processing EOS MLS Level-2 Data

    Science.gov (United States)

    Snyder, W. Van; Wu, Dong; Read, William; Jiang, Jonathan; Wagner, Paul; Livesey, Nathaniel; Schwartz, Michael; Filipiak, Mark; Pumphrey, Hugh; Shippony, Zvi

    2006-01-01

    A computer program performs level-2 processing of thermal-microwave-radiance data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS). The purpose of the processing is to estimate the composition and temperature of the atmosphere versus altitude from 8 to 90 km. "Level-2" as used here is a specialist's term signifying both vertical profiles of geophysical parameters along the measurement track of the instrument and the processing performed by this or other software to generate such profiles. Designed to be flexible, the program is controlled via a configuration file that defines all aspects of processing, including the contents of state and measurement vectors, configurations of forward models, measurement and calibration data to be read, and the manner of inverting the models to obtain the desired estimates. The program can operate in a parallel form in which one instance of the program acts as a master, coordinating the work of multiple slave instances on a cluster of computers, each slave operating on a portion of the data. Optionally, the configuration file can instruct the software to produce files of simulated radiances based on state vectors formed from sets of geophysical data-product files taken as input.

  3. Discretisation Schemes for Level Sets of Planar Gaussian Fields

    Science.gov (United States)

    Beliaev, D.; Muirhead, S.

    2018-01-01

    Smooth random Gaussian functions play an important role in mathematical physics, a main example being the random plane wave model conjectured by Berry to give a universal description of high-energy eigenfunctions of the Laplacian on generic compact manifolds. Our work is motivated by questions about the geometry of such random functions, in particular relating to the structure of their nodal and level sets. We study four discretisation schemes that extract information about level sets of planar Gaussian fields. Each scheme recovers information up to a different level of precision, and each requires a maximum mesh-size in order to be valid with high probability. The first two schemes are generalisations and enhancements of similar schemes that have appeared in the literature (Beffara and Gayet in Publ Math IHES, 2017. https://doi.org/10.1007/s10240-017-0093-0; Mischaikow and Wanner in Ann Appl Probab 17:980-1018, 2007); these give complete topological information about the level sets on either a local or global scale. As an application, we improve the results in Beffara and Gayet (2017) on Russo-Seymour-Welsh estimates for the nodal set of positively-correlated planar Gaussian fields. The third and fourth schemes are, to the best of our knowledge, completely new. The third scheme is specific to the nodal set of the random plane wave, and provides global topological information about the nodal set up to 'visible ambiguities'. The fourth scheme gives a way to approximate the mean number of excursion domains of planar Gaussian fields.
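
    As a toy analogue of such a scheme, the sketch below discretises a smooth planar field on a grid (Gaussian-smoothed white noise stands in for a genuine Gaussian field) and counts connected components of an excursion set, in the spirit of the fourth scheme. The smoothing length plays the role of the mesh-size scale; all parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

def excursion_components(n=256, level=0.0, corr=8.0, seed=0):
    """Count connected components of the excursion set {f >= level}
    for a smooth planar random field, approximated here by
    Gaussian-smoothed white noise normalised to unit variance."""
    rng = np.random.default_rng(seed)
    f = gaussian_filter(rng.standard_normal((n, n)), sigma=corr)
    f = f / f.std()
    _, num_components = label(f >= level)
    return num_components

print(excursion_components(level=0.5))  # e.g. number of excursion domains
```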

  4. Identifying Heterogeneities in Subsurface Environment using the Level Set Method

    Energy Technology Data Exchange (ETDEWEB)

    Lei, Hongzhuan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lu, Zhiming [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vesselinov, Velimir Valentinov [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-08-25

    These are slides from a presentation on identifying heterogeneities in the subsurface environment using the level set method. The slides start with the motivation, then explain the Level Set Method (LSM) and its algorithms, give some examples, and finally outline future work.

  5. Setting-level influences on implementation of the responsive classroom approach.

    Science.gov (United States)

    Wanless, Shannon B; Patton, Christine L; Rimm-Kaufman, Sara E; Deutsch, Nancy L

    2013-02-01

    We used mixed methods to examine the association between setting-level factors and observed implementation of a social and emotional learning intervention (Responsive Classroom® approach; RC). In study 1 (N = 33 3rd grade teachers after the first year of RC implementation), we identified relevant setting-level factors and uncovered the mechanisms through which they related to implementation. In study 2 (N = 50 4th grade teachers after the second year of RC implementation), we validated our most salient Study 1 finding across multiple informants. Findings suggested that teachers perceived setting-level factors, particularly principal buy-in to the intervention and individualized coaching, as influential to their degree of implementation. Further, we found that intervention coaches' perspectives of principal buy-in were more related to implementation than principals' or teachers' perspectives. Findings extend the application of setting theory to the field of implementation science and suggest that interventionists may want to consider particular accounts of school setting factors before determining the likelihood of schools achieving high levels of implementation.

  6. A level-set approach for the modelling of the hybrid arc/laser welding process: application to the joining of high-thickness steel sheets

    Directory of Open Access Journals (Sweden)

    Desmaison Olivier

    2013-11-01

    Full Text Available The hybrid arc/laser welding process has been developed in order to overcome the difficulties encountered in joining high-thickness steel sheets. This innovative process gathers two heat sources: an arc source developed by a MIG torch and a pre-located laser source. This coupling improves the efficiency of the process, the weld bead quality and the final deformations. The Level-Set approach for the modelling of this process enables the prediction of the weld bead development and of the temperature field evolution. The simulation of the multi-pass welding of an 18MnNiMo5 steel grade is detailed and the results are compared to the experimental observations.

  7. Impairment in local and global processing and set-shifting in body dysmorphic disorder

    Science.gov (United States)

    Kerwin, Lauren; Hovav, Sarit; Helleman, Gerhard; Feusner, Jamie D.

    2014-01-01

    Body dysmorphic disorder (BDD) is characterized by distressing and often debilitating preoccupations with misperceived defects in appearance. Research suggests that aberrant visual processing may contribute to these misperceptions. This study used two tasks to probe global and local visual processing as well as set shifting in individuals with BDD. Eighteen unmedicated individuals with BDD and 17 non-clinical controls completed two global-local tasks. The embedded figures task requires participants to determine which of three complex figures contained a simpler figure embedded within it. The Navon task utilizes incongruent stimuli comprised of a large letter (global level) made up of smaller letters (local level). The outcome measures were response time and accuracy rate. On the embedded figures task, BDD individuals were slower and less accurate than controls. On the Navon task, BDD individuals processed both global and local stimuli slower and less accurately than controls, and there was a further decrement in performance when shifting attention between the different levels of stimuli. Worse insight correlated with poorer performance on both tasks. Taken together, these results suggest abnormal global and local processing for non-appearance related stimuli among BDD individuals, in addition to evidence of poor set-shifting abilities. Moreover, these abnormalities appear to relate to the important clinical variable of poor insight. Further research is needed to explore these abnormalities and elucidate their possible role in the development and/or persistence of BDD symptoms. PMID:24972487

  8. Level Set Structure of an Integrable Cellular Automaton

    Directory of Open Access Journals (Sweden)

    Taichiro Takagi

    2010-03-01

    Full Text Available Based on a group theoretical setting a sort of discrete dynamical system is constructed and applied to a combinatorial dynamical system defined on the set of certain Bethe ansatz related objects known as the rigged configurations. This system is then used to study a one-dimensional periodic cellular automaton related to discrete Toda lattice. It is shown for the first time that the level set of this cellular automaton is decomposed into connected components and every such component is a torus.

  9. (Re)framing school as a setting for promoting health and wellbeing: a double translation process

    DEFF Research Database (Denmark)

    Nordin, Lone Lindegard; Jourdan, Didier; Simovska, Venka

    2018-01-01

    The aim of this article is to discuss the ways in which the setting approach to health promotion in schools, as part of knowledge-based international policies and guidelines, is embedded in the Danish policy landscape and enacted at the local governance level. The study draws on the sociology of translation and traces two separate, but entangled, processes of translation. At the national level, despite resistance by a number of actors with differing priorities, the translation resulted in the integration of selected key principles of the setting approach to health promotion in the national curriculum for health education. At the municipal level, however, the principles seem to be 'lost in translation', as the treatment of schools as settings for promoting health and wellbeing remains largely subordinate to the discourses of disease prevention and individual behaviour regulation, dominated by the agenda of actors in the health sector.

  10. Reevaluation of steam generator level trip set point

    Energy Technology Data Exchange (ETDEWEB)

    Shim, Yoon Sub; Soh, Dong Sub; Kim, Sung Oh; Jung, Se Won; Sung, Kang Sik; Lee, Joon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1994-06-01

    Reactor trips triggered by a low steam generator water level account for a substantial portion of reactor scrams in a nuclear plant, and the feasibility of modifying the steam generator water level trip system of YGN 1/2 was evaluated in this study. The study revealed that removing the reactor trip function from the SG water level trip system is not possible for plant safety reasons, but that relaxing the trip set point by 9% is feasible. The set point relaxation requires drilling new holes for level measurement in the operating steam generators. Characteristics of the negative neutron flux rate trip and the reactor trip were also reviewed as additional work. Since the purpose of the trip system modification, reducing the reactor scram frequency, is not to satisfy legal requirements but to improve plant performance, and since the modification has both positive and negative aspects, the decision on the actual modification needs to be made based on the results of this study and the policy of the plant owner. 37 figs, 6 tabs, 14 refs. (Author).

  11. Quality Control and Peer Review of Data Sets: Mapping Data Archiving Processes to Data Publication Requirements

    Science.gov (United States)

    Mayernik, M. S.; Daniels, M.; Eaker, C.; Strand, G.; Williams, S. F.; Worley, S. J.

    2012-12-01

    Data sets exist within scientific research and knowledge networks as both technical and non-technical entities. Establishing the quality of data sets is a multi-faceted task that encompasses many automated and manual processes. Data sets have always been essential for science research, but now need to be more visible as first-class scholarly objects at national, international, and local levels. Many initiatives are establishing procedures to publish and curate data sets, as well as to promote professional rewards for researchers that collect, create, manage, and preserve data sets. Traditionally, research quality has been assessed by peer review of textual publications, e.g. journal articles, conference proceedings, and books. Citation indices then provide standard measures of productivity used to reward individuals for their peer-reviewed work. Whether a similar peer review process is appropriate for assessing and ensuring the quality of data sets remains as an open question. How does the traditional process of peer review apply to data sets? This presentation will describe current work being done at the National Center for Atmospheric Research (NCAR) in the context of the Peer REview for Publication & Accreditation of Research Data in the Earth sciences (PREPARDE) project. PREPARDE is assessing practices and processes for data peer review, with the goal of developing recommendations. NCAR data management teams perform various kinds of quality assessment and review of data sets prior to making them publicly available. The poster will investigate how notions of peer review relate to the types of data review already in place at NCAR. We highlight the data set characteristics and management/archiving processes that challenge the traditional peer review processes by using a number of questions as probes, including: Who is qualified to review data sets? What formal and informal documentation is necessary to allow someone outside of a research team to review a data set

  12. Mapping topographic structure in white matter pathways with level set trees.

    Directory of Open Access Journals (Sweden)

    Brian P Kent

    Full Text Available Fiber tractography on diffusion imaging data offers rich potential for describing white matter pathways in the human brain, but characterizing the spatial organization in these large and complex data sets remains a challenge. We show that level set trees--which provide a concise representation of the hierarchical mode structure of probability density functions--offer a statistically-principled framework for visualizing and analyzing topography in fiber streamlines. Using diffusion spectrum imaging data collected on neurologically healthy controls (N = 30, we mapped white matter pathways from the cortex into the striatum using a deterministic tractography algorithm that estimates fiber bundles as dimensionless streamlines. Level set trees were used for interactive exploration of patterns in the endpoint distributions of the mapped fiber pathways and an efficient segmentation of the pathways that had empirical accuracy comparable to standard nonparametric clustering techniques. We show that level set trees can also be generalized to model pseudo-density functions in order to analyze a broader array of data types, including entire fiber streamlines. Finally, resampling methods show the reliability of the level set tree as a descriptive measure of topographic structure, illustrating its potential as a statistical descriptor in brain imaging analysis. These results highlight the broad applicability of level set trees for visualizing and analyzing high-dimensional data like fiber tractography output.

  13. A simple mass-conserved level set method for simulation of multiphase flows

    Science.gov (United States)

    Yuan, H.-Z.; Shu, C.; Wang, Y.; Shu, S.

    2018-04-01

    In this paper, a modified level set method is proposed for simulation of multiphase flows with large density ratio and high Reynolds number. The present method simply introduces a source or sink term into the level set equation to compensate the mass loss or offset the mass increase. The source or sink term is derived analytically by applying the mass conservation principle with the level set equation and the continuity equation of flow field. Since only a source term is introduced, the application of the present method is as simple as the original level set method, but it can guarantee the overall mass conservation. To validate the present method, the vortex flow problem is first considered. The simulation results are compared with those from the original level set method, which demonstrates that the modified level set method has the capability of accurately capturing the interface and keeping the mass conservation. Then, the proposed method is further validated by simulating the Laplace law, the merging of two bubbles, a bubble rising with high density ratio, and Rayleigh-Taylor instability with high Reynolds number. Numerical results show that the mass is a well-conserved by the present method.
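
    In symbols, the modification described above amounts to the following (our notation, not necessarily the authors'): the level set function φ marking one phase is advected with an extra source term S, chosen at every step so that the phase volume stays at its initial value,

```latex
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = S,
\qquad \text{with } S \text{ chosen such that } \quad
\frac{d}{dt}\int_{\Omega} H(\phi)\, d\Omega = 0,
```

    where H is the Heaviside function selecting one phase and u is the flow velocity; combining this constraint with the continuity equation yields the analytic expression for S mentioned in the abstract.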

  14. Ready for goal setting? Process evaluation of a patient-specific goal-setting method in physiotherapy.

    Science.gov (United States)

    Stevens, Anita; Köke, Albère; van der Weijden, Trudy; Beurskens, Anna

    2017-08-31

    Patient participation and goal setting appear to be difficult in daily physiotherapy practice, and practical methods are lacking. An existing patient-specific instrument, Patient-Specific Complaints (PSC), was therefore optimized into a new Patient Specific Goal-setting method (PSG). The aims of this study were to examine the feasibility of the PSG in daily physiotherapy practice, and to explore the potential impact of the new method. We conducted a process evaluation within a non-controlled intervention study. Community-based physiotherapists were instructed on how to work with the PSG in three group training sessions. The PSG is a six-step method embedded across the physiotherapy process, in which patients are stimulated to participate in the goal-setting process by: identifying problematic activities, prioritizing them, scoring their abilities, setting goals, planning and evaluating. Quantitative and qualitative data were collected among patients and physiotherapists by recording consultations and assessing patient files, questionnaires and written reflection reports. Data were collected from 51 physiotherapists and 218 patients, and 38 recordings and 219 patient files were analysed. The PSG steps were performed as intended, but the 'setting goals' and 'planning treatment' steps were not performed in detail. The patients and physiotherapists were positive about the method, and the physiotherapists perceived increased patient participation. They became aware of the importance of engaging patients in a dialogue, instead of focusing on gathering information. The lack of integration in the electronic patient system was a major barrier for optimal use in practice. Although the self-reported actual use of the PSG, i.e. informing and involving patients, and client-centred competences had improved, this was not completely confirmed by the objectively observed behaviour. The PSG is a feasible method and tends to have impact on increasing patient participation in the goal-setting

  15. Reconstruction of thin electromagnetic inclusions by a level-set method

    International Nuclear Information System (INIS)

    Park, Won-Kwang; Lesselier, Dominique

    2009-01-01

    In this contribution, we consider a technique of electromagnetic imaging (at a single, non-zero frequency) which uses the level-set evolution method for reconstructing a thin inclusion (possibly made of disconnected parts) with either dielectric or magnetic contrast with respect to the embedding homogeneous medium. Emphasis is on the proof of the concept, the scattering problem at hand being so far based on a two-dimensional scalar model. To do so, two level-set functions are employed; the first one describes location and shape, and the other one describes connectivity and length. Speeds of evolution of the level-set functions are calculated via the introduction of Fréchet derivatives of a least-square cost functional. Several numerical experiments, on noiseless as well as noisy data, illustrate how the proposed method behaves.

  16. Level-Set Methodology on Adaptive Octree Grids

    Science.gov (United States)

    Gibou, Frederic; Guittet, Arthur; Mirzadeh, Mohammad; Theillard, Maxime

    2017-11-01

    Numerical simulations of interfacial problems in fluids require a methodology capable of tracking surfaces that can undergo changes in topology and of imposing jump boundary conditions in a sharp manner. In this talk, we will discuss recent advances in the level-set framework, in particular one that is based on adaptive grids.

  17. An accurate conservative level set/ghost fluid method for simulating turbulent atomization

    International Nuclear Information System (INIS)

    Desjardins, Olivier; Moureau, Vincent; Pitsch, Heinz

    2008-01-01

    This paper presents a novel methodology for simulating incompressible two-phase flows by combining an improved version of the conservative level set technique introduced in [E. Olsson, G. Kreiss, A conservative level set method for two phase flow, J. Comput. Phys. 210 (2005) 225-246] with a ghost fluid approach. By employing a hyperbolic tangent level set function that is transported and re-initialized using fully conservative numerical schemes, mass conservation issues that are known to affect level set methods are greatly reduced. In order to improve the accuracy of the conservative level set method, high order numerical schemes are used. The overall robustness of the numerical approach is increased by computing the interface normals from a signed distance function reconstructed from the hyperbolic tangent level set by a fast marching method. The convergence of the curvature calculation is ensured by using a least squares reconstruction. The ghost fluid technique provides a way of handling the interfacial forces and large density jumps associated with two-phase flows with good accuracy, while avoiding artificial spreading of the interface. Since the proposed approach relies on partial differential equations, its implementation is straightforward in all coordinate systems, and it benefits from high parallel efficiency. The robustness and efficiency of the approach is further improved by using implicit schemes for the interface transport and re-initialization equations, as well as for the momentum solver. The performance of the method is assessed through both classical level set transport tests and simple two-phase flow examples including topology changes. It is then applied to simulate turbulent atomization of a liquid Diesel jet at Re=3000. The conservation errors associated with the accurate conservative level set technique are shown to remain small even for this complex case
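
    For reference, the hyperbolic-tangent profile and the conservative re-initialization step of this class of methods take roughly the following form (conventions vary between the cited papers):

```latex
\psi(\mathbf{x}) = \frac{1}{2}\left(\tanh\!\left(\frac{d(\mathbf{x})}{2\varepsilon}\right) + 1\right),
\qquad
\frac{\partial \psi}{\partial \tau}
+ \nabla\cdot\big(\psi(1-\psi)\,\hat{\mathbf{n}}\big)
= \nabla\cdot\big(\varepsilon\,(\nabla\psi\cdot\hat{\mathbf{n}})\,\hat{\mathbf{n}}\big),
```

    where d is the signed distance to the interface, ε sets the interface thickness, and n̂ is the interface normal (here reconstructed from the fast-marching signed distance); at steady state the compression and diffusion terms balance, preserving the profile without net mass change.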

  18. Transport and diffusion of material quantities on propagating interfaces via level set methods

    CERN Document Server

    Adalsteinsson, D

    2003-01-01

    We develop theory and numerical algorithms to apply level set methods to problems involving the transport and diffusion of material quantities in a level set framework. Level set methods are computational techniques for tracking moving interfaces; they work by embedding the propagating interface as the zero level set of a higher dimensional function, and then approximate the solution of the resulting initial value partial differential equation using upwind finite difference schemes. The traditional level set method works in the trace space of the evolving interface, and hence disregards any parameterization in the interface description. Consequently, material quantities on the interface which themselves are transported under the interface motion are not easily handled in this framework. We develop model equations and algorithmic techniques to extend the level set method to include these problems. We demonstrate the accuracy of our approach through a series of test examples and convergence studies.

  19. Transport and diffusion of material quantities on propagating interfaces via level set methods

    International Nuclear Information System (INIS)

    Adalsteinsson, David; Sethian, J.A.

    2003-01-01

    We develop theory and numerical algorithms to apply level set methods to problems involving the transport and diffusion of material quantities in a level set framework. Level set methods are computational techniques for tracking moving interfaces; they work by embedding the propagating interface as the zero level set of a higher dimensional function, and then approximate the solution of the resulting initial value partial differential equation using upwind finite difference schemes. The traditional level set method works in the trace space of the evolving interface, and hence disregards any parameterization in the interface description. Consequently, material quantities on the interface which themselves are transported under the interface motion are not easily handled in this framework. We develop model equations and algorithmic techniques to extend the level set method to include these problems. We demonstrate the accuracy of our approach through a series of test examples and convergence studies
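
    A minimal transport model in this framework, ignoring the surface diffusion and stretching effects the paper also treats, advects an interfacial quantity G (extended off the interface) with the velocity u and keeps it constant along normals to the zero level set of φ:

```latex
\frac{\partial G}{\partial t} + \mathbf{u}\cdot\nabla G = 0,
\qquad \nabla G \cdot \nabla \phi = 0 \ \text{ near } \{\phi = 0\}.
```

    The extension condition is what lets a quantity defined only on the moving front be represented on the fixed grid.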

  20. A level set approach for shock-induced α-γ phase transition of RDX

    Science.gov (United States)

    Josyula, Kartik; Rahul; De, Suvranu

    2018-02-01

    We present a thermodynamically consistent level sets approach based on regularization energy functional which can be directly incorporated into a Galerkin finite element framework to model interface motion. The regularization energy leads to a diffusive form of flux that is embedded within the level sets evolution equation which maintains the signed distance property of the level set function. The scheme is shown to compare well with the velocity extension method in capturing the interface position. The proposed level sets approach is employed to study the α-γphase transformation in RDX single crystal shocked along the (100) plane. Example problems in one and three dimensions are presented. We observe smooth evolution of the phase interface along the shock direction in both models. There is no diffusion of the interface during the zero level set evolution in the three dimensional model. The level sets approach is shown to capture the characteristics of the shock-induced α-γ phase transformation such as stress relaxation behind the phase interface and the finite time required for the phase transformation to complete. The regularization energy based level sets approach is efficient, robust, and easy to implement.
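
    One common distance-regularization energy consistent with this description (not necessarily the authors' exact functional) penalizes deviation of |∇φ| from one; its gradient descent contributes exactly a diffusive flux term to the level set evolution:

```latex
E_{\mathrm{reg}}(\phi) = \frac{\mu}{2}\int_{\Omega} \big(|\nabla\phi| - 1\big)^2 \, d\Omega,
\qquad
\left.\frac{\partial \phi}{\partial t}\right|_{\mathrm{reg}}
= \mu\, \nabla\cdot\!\left[\left(1 - \frac{1}{|\nabla\phi|}\right)\nabla\phi\right],
```

    which drives φ toward a signed distance function (|∇φ| = 1) without a separate re-initialization step.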

  1. Level set method for optimal shape design of MRAM core. Micromagnetic approach

    International Nuclear Information System (INIS)

    Melicher, Valdemar; Cimrak, Ivan; Keer, Roger van

    2008-01-01

    We aim at optimizing the shape of the magnetic core in MRAM memories. The evolution of the magnetization during the writing process is described by the Landau-Lifshitz equation (LLE). The actual shape of the core in one cell is characterized by the coefficient γ. The cost functional f=f(γ) expresses the quality of the writing process, having in mind the competition between the full-select and the half-select element. We derive an explicit form of the derivative F=∂f/∂γ, which allows for the use of gradient-type methods for the actual computation of the optimized shape (e.g., the steepest descent method). The level set method (LSM) is employed for the representation of the piecewise constant coefficient γ

  2. A combined single-multiphase flow formulation of the premixing phase using the level set method

    International Nuclear Information System (INIS)

    Leskovar, M.; Marn, J.

    1999-01-01

    The premixing phase of a steam explosion covers the interaction of the melt jet or droplets with the water prior to any steam explosion occurring. To gain better insight into the hydrodynamic processes during the premixing phase, cold isothermal premixing experiments are performed in addition to hot premixing experiments, where the water evaporation is significant. The specialty of isothermal premixing experiments is that three phases are involved: the water, the air and the spheres phase, but only the spheres phase mixes with the other two phases, whereas the water and air phases do not mix and remain separated by a free surface. Our idea therefore was to treat the isothermal premixing process with a combined single-multiphase flow model. In this combined model the water and air phases are treated as a single phase with discontinuous phase properties at the water-air interface, whereas the spheres are treated as usual with a multiphase flow model, where the spheres represent the dispersed phase and the common water-air phase represents the continuous phase. The common water-air phase was described with the front capturing method based on the level set formulation. In the level set formulation, the boundary of two-fluid interfaces is modeled as the zero set of a smooth signed normal distance function defined on the entire physical domain. The boundary is then updated by solving a nonlinear equation of the Hamilton-Jacobi type on the whole domain. With this single-multiphase flow model the Queos isothermal premixing experiment Q08 has been simulated. A numerical analysis using different treatments of the water-air interface (level set, high-resolution and upwind) has been performed for the incompressible and compressible cases, and the results were compared to experimental measurements. (author)

  3. Adapting high-level language programs for parallel processing using data flow

    Science.gov (United States)

    Standley, Hilda M.

    1988-01-01

    EASY-FLOW, a very high-level data flow language, is introduced for the purpose of adapting programs written in a conventional high-level language to a parallel environment. The level of parallelism provided is of the large-grained variety in which parallel activities take place between subprograms or processes. A program written in EASY-FLOW is a set of subprogram calls as units, structured by iteration, branching, and distribution constructs. A data flow graph may be deduced from an EASY-FLOW program.

  4. Two Surface-Tension Formulations For The Level Set Interface-Tracking Method

    International Nuclear Information System (INIS)

    Shepel, S.V.; Smith, B.L.

    2005-01-01

    The paper describes a comparative study of two surface-tension models for the Level Set interface tracking method. In both models, the surface tension is represented as a body force, concentrated near the interface, but the technical implementation of the two options is different. The first is based on a traditional Level Set approach, in which the surface tension is distributed over a narrow band around the interface using a smoothed Delta function. In the second model, which is based on the integral form of the fluid-flow equations, the force is imposed only in those computational cells through which the interface passes. Both models have been incorporated into the Finite-Element/Finite-Volume Level Set method, previously implemented into the commercial Computational Fluid Dynamics (CFD) code CFX-4. A critical evaluation of the two models, undertaken in the context of four standard Level Set benchmark problems, shows that the first model, based on the smoothed Delta function approach, is the more general, and more robust, of the two. (author)
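
    To illustrate the first (smoothed Delta function) formulation, here is a minimal sketch of the standard cosine-smoothed delta used to spread an interface force over a narrow band; the cell-integral variant would instead apply the force only in cells cut by the interface. Names and the band half-width are illustrative.

```python
import numpy as np

def smoothed_delta(phi, eps=1.5):
    """Cosine-smoothed Dirac delta supported on the band |phi| < eps,
    a standard choice for distributing interface forces in level-set
    methods (phi is a signed distance function on the grid)."""
    delta = np.zeros_like(phi)
    band = np.abs(phi) < eps
    delta[band] = (1.0 + np.cos(np.pi * phi[band] / eps)) / (2.0 * eps)
    return delta

# Surface-tension body force per unit volume (sigma: coefficient,
# kappa: curvature field, gphi: gradient of phi), concentrated near phi = 0:
#   f = sigma * kappa * smoothed_delta(phi) * gphi
```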

  5. A deep level set method for image segmentation

    OpenAIRE

    Tang, Min; Valipour, Sepehr; Zhang, Zichen Vincent; Cobzas, Dana; Jagersand, Martin

    2017-01-01

    This paper proposes a novel image segmentation approach that integrates fully convolutional networks (FCNs) with a level set model. Compared with a FCN, the integrated method can incorporate smoothing and prior information to achieve an accurate segmentation. Furthermore, different than using the level set model as a post-processing tool, we integrate it into the training phase to fine-tune the FCN. This allows the use of unlabeled data during training in a semi-supervised setting. Using two types o...

  6. A LEVEL SET BASED SHAPE OPTIMIZATION METHOD FOR AN ELLIPTIC OBSTACLE PROBLEM

    KAUST Repository

    Burger, Martin

    2011-04-01

    In this paper, we construct a level set method for an elliptic obstacle problem, which can be reformulated as a shape optimization problem. We provide a detailed shape sensitivity analysis for this reformulation and a stability result for the shape Hessian at the optimal shape. Using the shape sensitivities, we construct a geometric gradient flow, which can be realized in the context of level set methods. We prove the convergence of the gradient flow to an optimal shape and provide a complete analysis of the level set method in terms of viscosity solutions. To our knowledge this is the first complete analysis of a level set method for a nonlocal shape optimization problem. Finally, we discuss the implementation of the methods and illustrate its behavior through several computational experiments. © 2011 World Scientific Publishing Company.

  7. Level-set techniques for facies identification in reservoir modeling

    Science.gov (United States)

    Iglesias, Marco A.; McLaughlin, Dennis

    2011-03-01

    In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil-water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger in (2002 Interfaces Free Bound. 5 301-29 2004 Inverse Problems 20 259-82) for inverse obstacle problems. The optimization is constrained by (the reservoir model) a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg-Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush-Kuhn-Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies.

  8. Level-set techniques for facies identification in reservoir modeling

    International Nuclear Information System (INIS)

    Iglesias, Marco A; McLaughlin, Dennis

    2011-01-01

    In this paper we investigate the application of level-set techniques for facies identification in reservoir models. The identification of facies is a geometrical inverse ill-posed problem that we formulate in terms of shape optimization. The goal is to find a region (a geologic facies) that minimizes the misfit between predicted and measured data from an oil–water reservoir. In order to address the shape optimization problem, we present a novel application of the level-set iterative framework developed by Burger in (2002 Interfaces Free Bound. 5 301–29; 2004 Inverse Problems 20 259–82) for inverse obstacle problems. The optimization is constrained by (the reservoir model) a nonlinear large-scale system of PDEs that describes the reservoir dynamics. We reformulate this reservoir model in a weak (integral) form whose shape derivative can be formally computed from standard results of shape calculus. At each iteration of the scheme, the current estimate of the shape derivative is utilized to define a velocity in the level-set equation. The proper selection of this velocity ensures that the new shape decreases the cost functional. We present results of facies identification where the velocity is computed with the gradient-based (GB) approach of Burger (2002) and the Levenberg–Marquardt (LM) technique of Burger (2004). While an adjoint formulation allows the straightforward application of the GB approach, the LM technique requires the computation of the large-scale Karush–Kuhn–Tucker system that arises at each iteration of the scheme. We efficiently solve this system by means of the representer method. We present some synthetic experiments to show and compare the capabilities and limitations of the proposed implementations of level-set techniques for the identification of geologic facies

  9. Surface Energy and Setting Process of Contacting Surfaces

    Directory of Open Access Journals (Sweden)

    M. V. Musokhranov

    2014-01-01

    Full Text Available The paper addresses the challenge of ensuring the accuracy of the relative position of conjugated surfaces, namely determining the coefficient of friction. To solve it, the surface energy is proposed as a tool that influences the nature of the contacting parts. At present, the energy of the surface layers is, at best, merely acknowledged, but not used in practice. The conditions of interaction between two contacting surfaces, such as seizing and setting, cannot be explained from the roughness parameters alone. These phenomena are instead explained by the appearance of grip (setting) bridges, which result from the energy of interaction between two or more adjacent surfaces. The emerging phenomenon of micro-welding, i.e. the formation of bonds, is caused by the flow of energy, according to the theory of physics, from the surface with a high energy level to the surface with a lower one, balancing the system as a whole. The paper shows that, by controlling the depth of the surface layer and creating a certain structure through processing, the energy level of the material as a whole can be specified. This makes it possible to provide the necessary performance and mechanical properties, that is, to create as many grip bridges as possible to ensure continuous positioning, i.e. a fixed connection of the contacting surfaces. It was determined that, to increase the value of the friction coefficient, the physical and mechanical properties of the surface layer of the part material must be taken into account; namely, energy must be accumulated in the part body to be consumed in forming the surface. The paper gives recommendations for including the surface energy of parts in the qualitative indicators of their characteristics. This will allow a technologist, when planning a process route, to choose operations and modes that provide the designer-specified parameters not only of the accuracy and surface finish, but also of the

  10. Computer simulation of the behaviour of Julia sets using switching processes

    Energy Technology Data Exchange (ETDEWEB)

    Negi, Ashish [Department of Computer Science and Engineering, G.B. Pant Engineering College, Pauri Garhwal 246001 (India)], E-mail: ashish_ne@yahoo.com; Rani, Mamta [Department of Computer Science, Galgotia College of Engineering and Technology, UP Technical University, Knowledge Park-II, Greater Noida, Gautam Buddha Nagar, UP (India)], E-mail: vedicmri@sancharnet.in; Mahanti, P.K. [Department of CSAS, University of New Brunswick, Saint Johhn, New Brunswick, E2L4L5 (Canada)], E-mail: pmahanti@unbsj.ca

    2008-08-15

    Inspired by the study of Julia sets using switched processes by Lakhtakia and generation of new fractals by composite functions by Shirriff, we study the effect of switched processes on superior Julia sets given by Rani and Kumar. Further, symmetry for such processes is also discussed in the paper.

  11. Computer simulation of the behaviour of Julia sets using switching processes

    International Nuclear Information System (INIS)

    Negi, Ashish; Rani, Mamta; Mahanti, P.K.

    2008-01-01

    Inspired by the study of Julia sets using switched processes by Lakhtakia and generation of new fractals by composite functions by Shirriff, we study the effect of switched processes on superior Julia sets given by Rani and Kumar. Further, symmetry for such processes is also discussed in the paper
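
    A toy escape-time sketch of such a switched process: even iterations apply z² + c₁ and odd iterations z² + c₂, optionally blended with the previous iterate as in the 'superior' (Mann) iteration studied by Rani and Kumar. The constants below are illustrative, not taken from the paper.

```python
import numpy as np

def switched_escape_time(c1=-0.7 + 0.27j, c2=0.355 + 0.355j,
                         n=400, max_iter=60, radius=2.0, beta=0.9):
    """Escape-time counts for a switched quadratic iteration on a grid
    of starting points. With beta < 1 the update is the Mann/'superior'
    iteration z <- beta*f(z) + (1 - beta)*z; beta = 1 recovers the
    ordinary switched Julia iteration."""
    axis = np.linspace(-2.0, 2.0, n)
    z = axis[None, :] + 1j * axis[:, None]
    counts = np.zeros(z.shape, dtype=int)
    alive = np.ones(z.shape, dtype=bool)
    for k in range(max_iter):
        c = c1 if k % 2 == 0 else c2   # the switching rule
        z[alive] = beta * (z[alive] ** 2 + c) + (1.0 - beta) * z[alive]
        escaped = alive & (np.abs(z) > radius)
        counts[escaped] = k
        alive &= ~escaped
    return counts  # visualize e.g. with matplotlib's imshow
```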

  12. Effects of delayed laboratory processing on platelet serotonin levels.

    Science.gov (United States)

    Sanner, Jennifer E; Frazier, Lorraine; Udtha, Malini

    2013-01-01

    Despite the availability of established guidelines for measuring platelet serotonin, these guidelines may be difficult to follow in a hospital setting where time to processing may vary from sample to sample. The purpose of this study was to evaluate the effect of the time to processing of human blood samples on the stability of the enzyme-linked immunosorbent assay (ELISA) for the determination of platelet serotonin levels in human plasma. Human blood samples collected from a convenience sample of eight healthy volunteers were analyzed to determine platelet serotonin levels from plasma collected in ethylene diamine tetra acetic acid (EDTA) tubes and stored at 4°C for 3 hr, 5 hr, 8 hr, and 12 hr. Refrigeration storage at 4°C for 3 hr, 5 hr, 8 hr, and 12 hr altered the platelet serotonin measurement when compared to immediate processing. The bias for the samples stored at 4°C for 3 hr was 102.3 (±217.39 ng/10⁹ platelets), for 5 hr was 200.1 (±132.76 ng/10⁹ platelets), for 8 hr was 146.9 (±221.41 ng/10⁹ platelets), and for 12 hr was -67.6 (±349.60 ng/10⁹ platelets). Results from this study show that accurate measurement of platelet serotonin levels is dependent on time to processing. Researchers should therefore follow a standardized laboratory guideline for obtaining immediate platelet serotonin levels after blood sample collection.

  13. A LEVEL SET BASED SHAPE OPTIMIZATION METHOD FOR AN ELLIPTIC OBSTACLE PROBLEM

    KAUST Repository

    Burger, Martin; Matevosyan, Norayr; Wolfram, Marie-Therese

    2011-01-01

    analysis of the level set method in terms of viscosity solutions. To our knowledge this is the first complete analysis of a level set method for a nonlocal shape optimization problem. Finally, we discuss the implementation of the methods and illustrate its

  14. A local level set method based on a finite element method for unstructured meshes

    International Nuclear Information System (INIS)

    Ngo, Long Cu; Choi, Hyoung Gwon

    2016-01-01

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time

  15. A local level set method based on a finite element method for unstructured meshes

    Energy Technology Data Exchange (ETDEWEB)

    Ngo, Long Cu; Choi, Hyoung Gwon [School of Mechanical Engineering, Seoul National University of Science and Technology, Seoul (Korea, Republic of)

    2016-12-15

    A local level set method for unstructured meshes has been implemented by using a finite element method. A least-square weighted residual method was employed for implicit discretization to solve the level set advection equation. By contrast, a direct re-initialization method, which is directly applicable to the local level set method for unstructured meshes, was adopted to re-correct the level set function to become a signed distance function after advection. The proposed algorithm was constructed such that the advection and direct reinitialization steps were conducted only for nodes inside the narrow band around the interface. Therefore, in the advection step, the Gauss–Seidel method was used to update the level set function using a node-by-node solution method. Some benchmark problems were solved by using the present local level set method. Numerical results have shown that the proposed algorithm is accurate and efficient in terms of computational time.

  16. GPU accelerated edge-region based level set evolution constrained by 2D gray-scale histogram.

    Science.gov (United States)

    Balla-Arabé, Souleymane; Gao, Xinbo; Wang, Bin

    2013-07-01

    Due to its intrinsic nature, which allows it to easily handle complex shapes and topological changes, the level set method (LSM) has been widely used in image segmentation. Nevertheless, the LSM is computationally expensive, which limits its applications in real-time systems. For this purpose, we propose a new level set algorithm, which simultaneously uses edge, region, and 2D histogram information in order to efficiently segment objects of interest in a given scene. The computational complexity of the proposed LSM is greatly reduced by using the highly parallelizable lattice Boltzmann method (LBM) with a body force to solve the level set equation (LSE). The body force is the link with the image data and is defined from the proposed LSE. The proposed LSM is then implemented on NVIDIA graphics processing units to fully take advantage of the LBM's local nature. The new algorithm is effective, robust against noise, independent of the initial contour, fast, and highly parallelizable. The edge and region information enable the detection of objects with and without edges, and the 2D histogram information ensures the effectiveness of the method in a noisy environment. Experimental results on synthetic and real images demonstrate subjectively and objectively the performance of the proposed method.

  17. A parametric level-set approach for topology optimization of flow domains

    DEFF Research Database (Denmark)

    Pingen, Georg; Waidmann, Matthias; Evgrafov, Anton

    2010-01-01

    of the design variables in the traditional approaches is seen as a possible cause for the slow convergence. Non-smooth material distributions are suspected to trigger premature onset of instationary flows which cannot be treated by steady-state flow models. In the present work, we study whether the convergence...... and the versatility of topology optimization methods for fluidic systems can be improved by employing a parametric level-set description. In general, level-set methods allow controlling the smoothness of boundaries, yield a non-local influence of design variables, and decouple the material description from the flow...... field discretization. The parametric level-set method used in this study utilizes a material distribution approach to represent flow boundaries, resulting in a non-trivial mapping between design variables and local material properties. Using a hydrodynamic lattice Boltzmann method, we study...

  18. Setting priorities in health care organizations: criteria, processes, and parameters of success.

    Science.gov (United States)

    Gibson, Jennifer L; Martin, Douglas K; Singer, Peter A

    2004-09-08

    Hospitals and regional health authorities must set priorities in the face of resource constraints. Decision-makers seek practical ways to set priorities fairly in strategic planning, but find limited guidance from the literature. Very little has been reported from the perspective of Board members and senior managers about what criteria, processes and parameters of success they would use to set priorities fairly. We facilitated workshops for board members and senior leadership at three health care organizations to assist them in developing a strategy for fair priority setting. Workshop participants identified 8 priority setting criteria, 10 key priority setting process elements, and 6 parameters of success that they would use to set priorities in their organizations. Decision-makers in other organizations can draw lessons from these findings to enhance the fairness of their priority setting decision-making. Lessons learned in three workshops fill an important gap in the literature about what criteria, processes, and parameters of success Board members and senior managers would use to set priorities fairly.

  19. A variational approach to multi-phase motion of gas, liquid and solid based on the level set method

    Science.gov (United States)

    Yokoi, Kensuke

    2009-07-01

    We propose a simple and robust numerical algorithm to deal with multi-phase motion of gas, liquid and solid based on the level set method [S. Osher, J.A. Sethian, Front propagating with curvature-dependent speed: Algorithms based on Hamilton-Jacobi formulation, J. Comput. Phys. 79 (1988) 12; M. Sussman, P. Smereka, S. Osher, A level set approach for capturing solution to incompressible two-phase flow, J. Comput. Phys. 114 (1994) 146; J.A. Sethian, Level Set Methods and Fast Marching Methods, Cambridge University Press, 1999; S. Osher, R. Fedkiw, Level Set Methods and Dynamics Implicit Surface, Applied Mathematical Sciences, vol. 153, Springer, 2003]. In Eulerian framework, to simulate interaction between a moving solid object and an interfacial flow, we need to define at least two functions (level set functions) to distinguish three materials. In such simulations, in general two functions overlap and/or disagree due to numerical errors such as numerical diffusion. In this paper, we resolved the problem using the idea of the active contour model [M. Kass, A. Witkin, D. Terzopoulos, Snakes: active contour models, International Journal of Computer Vision 1 (1988) 321; V. Caselles, R. Kimmel, G. Sapiro, Geodesic active contours, International Journal of Computer Vision 22 (1997) 61; G. Sapiro, Geometric Partial Differential Equations and Image Analysis, Cambridge University Press, 2001; R. Kimmel, Numerical Geometry of Images: Theory, Algorithms, and Applications, Springer-Verlag, 2003] introduced in the field of image processing.
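
    The bookkeeping problem can be seen in a minimal sketch: with two level set functions, one for the solid boundary and one for the liquid-gas interface, each grid point is classified by sign combinations. Sign conventions here are illustrative; the paper's contribution is the active-contour coupling that keeps the two functions consistent when numerical errors make them overlap or disagree.

```python
import numpy as np

GAS, LIQUID, SOLID = 0, 1, 2

def classify_phases(phi_fluid, phi_solid):
    """Assign each grid point to gas, liquid or solid from two level
    set functions: phi_solid > 0 inside the solid object; elsewhere,
    phi_fluid > 0 marks liquid and phi_fluid < 0 marks gas."""
    return np.where(phi_solid > 0, SOLID,
                    np.where(phi_fluid > 0, LIQUID, GAS))
```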

  20. Investigating the Decision-Making Process of Standard Setting Participants

    Science.gov (United States)

    Papageorgiou, Spiros

    2010-01-01

    Despite the growing interest of the language testing community in standard setting, primarily due to the use of the Common European Framework of Reference (CEFR-Council of Europe, 2001), the participants' decision-making process in the CEFR standard setting context remains unexplored. This study attempts to fill in this gap by analyzing these…

  1. Multi person detection and tracking based on hierarchical level-set method

    Science.gov (United States)

    Khraief, Chadia; Benzarti, Faouzi; Amiri, Hamid

    2018-04-01

    In this paper, we propose an efficient unsupervised method for multi-person tracking based on a hierarchical level-set approach. The proposed method uses both edge and region information in order to effectively detect objects. The persons are tracked in each frame of the sequence by minimizing an energy functional that combines color, texture and shape information. These features are encoded in a covariance matrix used as a region descriptor. The present method is fully automated, without the need to manually specify the initial contour of the level set; it is based on combined person detection and background subtraction methods. The edge-based term is employed to maintain a stable evolution, guide the segmentation towards apparent boundaries and inhibit region fusion. The computational cost of the level set is reduced by using a narrow band technique. Experiments are performed on many challenging video sequences and show the effectiveness of the proposed method.
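
    A minimal sketch of a covariance region descriptor of this kind, assuming a color patch and using only pixel coordinates and RGB as features (the paper's descriptor also includes texture features):

```python
import numpy as np

def region_covariance(patch):
    """Summarize an H x W x 3 image patch by the covariance matrix of
    per-pixel feature vectors (x, y, R, G, B), a compact descriptor
    that fuses spatial and appearance information for tracking."""
    h, w, _ = patch.shape
    ys, xs = np.mgrid[0:h, 0:w]
    feats = np.column_stack([xs.ravel(), ys.ravel(),
                             patch.reshape(-1, 3).astype(float)])
    return np.cov(feats, rowvar=False)  # 5 x 5 covariance matrix
```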

  2. Level set methods for detonation shock dynamics using high-order finite elements

    Energy Technology Data Exchange (ETDEWEB)

    Dobrev, V. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Grogan, F. C. [Univ. of California, San Diego, CA (United States); Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kolev, T. V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Rieben, R [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Tomov, V. Z. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-05-26

    Level set methods are a popular approach to modeling evolving interfaces. We present a level set advection solver in two and three dimensions using the discontinuous Galerkin method with high-order finite elements. During evolution, the level set function is reinitialized to a signed distance function to maintain accuracy. Our approach leads to stable front propagation and convergence on high-order, curved, unstructured meshes. The ability of the solver to implicitly track moving fronts lends itself to a number of applications; in particular, we highlight applications to high-explosive (HE) burn and detonation shock dynamics (DSD). We provide results for two- and three-dimensional benchmark problems as well as applications to DSD.
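    The reinitialization step mentioned above keeps |∇φ| ≈ 1 near the front. As a grid-based stand-in for it (our sketch, not the paper's DG scheme), a signed distance function can be rebuilt from the current zero level set with a Euclidean distance transform:

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def reinitialize(phi, dx=1.0):
        """Rebuild phi as a signed distance function of its own zero level set
        (positive inside, matching the sign convention of the input)."""
        inside = phi > 0
        d_in = distance_transform_edt(inside) * dx    # inside points: distance to front
        d_out = distance_transform_edt(~inside) * dx  # outside points: distance to front
        return d_in - d_out

    n = 100
    x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
    phi = (0.25 - x**2 - y**2) * 3.7            # distorted circle: |grad phi| far from 1
    phi = reinitialize(phi, dx=2.0 / (n - 1))   # now approximately a signed distance field
    ```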

  3. Processing of low-level wastes

    International Nuclear Information System (INIS)

    Vance, J.N.

    1986-01-01

    Although low-level wastes have been generated and have required processing for more than two decades now, it is noteworthy that processing methods are continuing to change. The changes are attributable not only to improvements in technology, but also to changing regulations and economics and to uncertainties regarding the future availability of burial space for disposal. Indeed, because of the changes which have taken place, and are still taking place, in the processing of low-level waste, an overview of the current situation is in order. This presentation is a brief overview of the processing methods generally employed to treat the low-level wastes generated from both fuel cycle and non-fuel cycle sources. The presentation is far too brief to deal with the processing technologies in a comprehensive fashion, but it does provide a snapshot of what the current or typical processing methods are and what changes are occurring and why.

  4. Semihard processes with BLM renormalization scale setting

    Energy Technology Data Exchange (ETDEWEB)

    Caporale, Francesco [Instituto de Física Teórica UAM/CSIC, Nicolás Cabrera 15 and U. Autónoma de Madrid, E-28049 Madrid (Spain); Ivanov, Dmitry Yu. [Sobolev Institute of Mathematics and Novosibirsk State University, 630090 Novosibirsk (Russian Federation); Murdaca, Beatrice; Papa, Alessandro [Dipartimento di Fisica, Università della Calabria, and Istituto Nazionale di Fisica Nucleare, Gruppo collegato di Cosenza, Arcavacata di Rende, I-87036 Cosenza (Italy)

    2015-04-10

    We apply the BLM scale setting procedure directly to amplitudes (cross sections) of several semihard processes. It is shown that, due to the presence of β₀-terms in the NLA results for the impact factors, the obtained optimal renormalization scale is not universal, but depends both on the energy and on the process in question. We illustrate this general conclusion considering the following semihard processes: (i) inclusive production of two forward high-p_T jets separated by a large interval in rapidity (Mueller-Navelet jets); (ii) high-energy behavior of the total cross section for highly virtual photons; (iii) forward amplitude of the production of two light vector mesons in the collision of two virtual photons.

  5. Continuous soil maps - a fuzzy set approach to bridge the gap between aggregation levels of process and distribution models

    NARCIS (Netherlands)

    Gruijter, de J.J.; Walvoort, D.J.J.; Gaans, van P.F.M.

    1997-01-01

    Soil maps as multi-purpose models of spatial soil distribution have a much higher level of aggregation (map units) than the models of soil processes and land-use effects that need input from soil maps. This mismatch between aggregation levels is particularly detrimental in the context of precision

  6. Setting priorities in health care organizations: criteria, processes, and parameters of success

    Directory of Open Access Journals (Sweden)

    Martin Douglas K

    2004-09-01

    Background: Hospitals and regional health authorities must set priorities in the face of resource constraints. Decision-makers seek practical ways to set priorities fairly in strategic planning, but find limited guidance from the literature. Very little has been reported from the perspective of Board members and senior managers about what criteria, processes and parameters of success they would use to set priorities fairly. Discussion: We facilitated workshops for board members and senior leadership at three health care organizations to assist them in developing a strategy for fair priority setting. Workshop participants identified 8 priority setting criteria, 10 key priority setting process elements, and 6 parameters of success that they would use to set priorities in their organizations. Decision-makers in other organizations can draw lessons from these findings to enhance the fairness of their priority setting decision-making. Summary: Lessons learned in three workshops fill an important gap in the literature about what criteria, processes, and parameters of success Board members and senior managers would use to set priorities fairly.

  7. Levels of Processing in Mild Disabilities.

    Science.gov (United States)

    Al-Hilawani, Yasser A.; And Others

    This study examined the effects of the second level (intermediate acoustical processing of rhyming words) and the third level (deep-semantic processing of words in sentences) of the "levels of processing" framework on memory performance of four types of intermediate-grade students (52 "normal" students, 50 students with…

  8. Variational Level Set Method for Two-Stage Image Segmentation Based on Morphological Gradients

    Directory of Open Access Journals (Sweden)

    Zemin Ren

    2014-01-01

    We use the variational level set method and transition region extraction techniques to achieve the image segmentation task. The proposed scheme proceeds in two steps. We first develop a novel algorithm to extract the transition region based on the morphological gradient. After this, we integrate the transition region into a variational level set framework and develop a novel geometric active contour model, which includes an external energy based on the transition region and a fractional-order edge indicator function. The external energy is used to drive the zero level set toward the desired image features, such as object boundaries. Due to this external energy, the proposed model allows for more flexible initialization. The fractional-order edge indicator function is incorporated into the length regularization term to diminish the influence of noise. Moreover, an internal energy is added to the proposed model to penalize the deviation of the level set function from a signed distance function. The resulting evolution of the level set function is the gradient flow that minimizes the overall energy functional. The proposed model has been applied to both synthetic and real images with promising results.
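    The last point, evolving the level set function along the gradient flow of an energy, has a compact generic form. The sketch below is our simplification with an assumed two-term energy, not the authors' exact functional: an external image force moves the zero level set while a plain Laplacian term stands in for the regularization energies.

    ```python
    import numpy as np

    def delta_eps(phi, eps=1.5):
        """Smoothed Dirac delta: concentrates updates near the zero level set."""
        return (eps / np.pi) / (eps**2 + phi**2)

    def evolve(phi, force, dt=0.1, mu=0.2, steps=1000):
        """Explicit gradient-flow iteration: external force plus smoothing term."""
        for _ in range(steps):
            lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                   np.roll(phi, 1, 1) + np.roll(phi, -1, 1) - 4.0 * phi)
            phi = phi + dt * (force * delta_eps(phi) + mu * lap)
        return phi

    img = np.zeros((64, 64)); img[16:48, 16:48] = 1.0      # toy object
    ii, jj = np.meshgrid(np.arange(64.0), np.arange(64.0), indexing="ij")
    phi0 = 6.0 - np.sqrt((ii - 32) ** 2 + (jj - 32) ** 2)  # small seed inside it
    phi = evolve(phi0, force=img - 0.5)                    # front expands toward the edges
    ```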

  9. Aerostructural Level Set Topology Optimization for a Common Research Model Wing

    Science.gov (United States)

    Dunning, Peter D.; Stanford, Bret K.; Kim, H. Alicia

    2014-01-01

    The purpose of this work is to use level set topology optimization to improve the design of a representative wing box structure for the NASA common research model. The objective is to minimize the total compliance of the structure under aerodynamic and body force loading, where the aerodynamic loading is coupled to the structural deformation. A taxi bump case was also considered, where only body force loads were applied. The trim condition that aerodynamic lift must balance the total weight of the aircraft is enforced by allowing the root angle of attack to change. The level set optimization method is implemented on an unstructured three-dimensional grid, so that the method can optimize a wing box with arbitrary geometry. Fast marching and upwind schemes are developed for an unstructured grid, which make the level set method robust and efficient. The adjoint method is used to obtain the coupled shape sensitivities required to perform aerostructural optimization of the wing box structure.

  10. A Variational Level Set Model Combined with FCMS for Image Clustering Segmentation

    Directory of Open Access Journals (Sweden)

    Liming Tang

    2014-01-01

    The fuzzy C-means clustering algorithm with spatial constraint (FCMS) is effective for image segmentation. However, it lacks essential smoothing constraints on the cluster boundaries and sufficient robustness to noise. Samson et al. proposed a variational level set model for image clustering segmentation which can obtain smooth cluster boundaries and closed cluster regions thanks to the level set scheme; however, it is very sensitive to noise, since it is in effect a hard C-means clustering model. In this paper, based on Samson's work, we propose a new variational level set model combined with FCMS for image clustering segmentation. Compared with FCMS clustering, the proposed model obtains smooth cluster boundaries and closed cluster regions due to the use of the level set scheme. In addition, a block-based energy is incorporated into the energy functional, which makes the proposed model more robust to noise than FCMS clustering and Samson's model. Experiments on synthetic and real images are performed to assess the performance of the proposed model. Compared with some classical image segmentation models, the proposed model performs better on images contaminated by different levels of noise.

  11. Surface-to-surface registration using level sets

    DEFF Research Database (Denmark)

    Hansen, Mads Fogtmann; Erbou, Søren G.; Vester-Christensen, Martin

    2007-01-01

    This paper presents a general approach for surface-to-surface registration (S2SR) with the Euclidean metric using signed distance maps. In addition, the method is symmetric such that the registration of a shape A to a shape B is identical to the registration of the shape B to the shape A. The S2SR problem can be approximated by the image registration (IR) problem of the signed distance maps (SDMs) of the surfaces confined to some narrow band. By shrinking the narrow bands around the zero level sets the solution to the IR problem converges towards the S2SR problem. It is our hypothesis that this approach is more robust and less prone to fall into local minima than ordinary surface-to-surface registration. The IR problem is solved using the inverse compositional algorithm. In this paper, a set of 40 pelvic bones of Duroc pigs are registered to each other w.r.t. the Euclidean transformation...
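    The data preparation behind this approach is easy to reproduce: build an SDM per shape, mask both down to a narrow band around their zero level sets, and compare them there. A toy 2-D sketch (our illustration; the transformation search and the inverse compositional solver are omitted):

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt

    def signed_distance(mask):
        """SDM of a binary shape: negative inside, positive outside."""
        return distance_transform_edt(~mask) - distance_transform_edt(mask)

    # Two toy shapes; the paper's use case would be 3-D pelvic bone masks.
    a = np.zeros((64, 64), bool); a[20:44, 20:44] = True
    b = np.zeros((64, 64), bool); b[24:48, 18:42] = True

    da, db = signed_distance(a), signed_distance(b)
    band = (np.abs(da) < 5) | (np.abs(db) < 5)   # narrow band around both zero sets
    ssd = np.sum((da[band] - db[band]) ** 2)     # IR-style dissimilarity on the band
    print("banded SSD:", ssd)
    ```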

  12. SET overexpression in HEK293 cells regulates mitochondrial uncoupling proteins levels within a mitochondrial fission/reduced autophagic flux scenario

    Energy Technology Data Exchange (ETDEWEB)

    Almeida, Luciana O.; Goto, Renata N. [Department of Clinical Analyses, Toxicology and Food Sciences, School of Pharmaceutical Sciences of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP (Brazil); Neto, Marinaldo P.C. [Department of Physics and Chemistry, School of Pharmaceutical Sciences of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP (Brazil); Sousa, Lucas O. [Department of Clinical Analyses, Toxicology and Food Sciences, School of Pharmaceutical Sciences of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP (Brazil); Curti, Carlos [Department of Physics and Chemistry, School of Pharmaceutical Sciences of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP (Brazil); Leopoldino, Andréia M., E-mail: andreiaml@usp.br [Department of Clinical Analyses, Toxicology and Food Sciences, School of Pharmaceutical Sciences of Ribeirão Preto, University of São Paulo, Ribeirão Preto, SP (Brazil)

    2015-03-06

    We hypothesized that SET, a protein accumulated in some cancer types and Alzheimer disease, is involved in cell death through mitochondrial mechanisms. We addressed the mRNA and protein levels of the mitochondrial uncoupling proteins UCP1, UCP2 and UCP3 (S and L isoforms) by quantitative real-time PCR and immunofluorescence as well as other mitochondrial involvements, in HEK293 cells overexpressing the SET protein (HEK293/SET), either in the presence or absence of oxidative stress induced by the pro-oxidant t-butyl hydroperoxide (t-BHP). SET overexpression in HEK293 cells decreased UCP1 and increased UCP2 and UCP3 (S/L) mRNA and protein levels, whilst also preventing lipid peroxidation and decreasing the content of cellular ATP. SET overexpression also (i) decreased the area of mitochondria and increased the number of organelles and lysosomes, (ii) increased mitochondrial fission, as demonstrated by increased FIS1 mRNA and FIS-1 protein levels, an apparent accumulation of DRP-1 protein, and an increase in the VDAC protein level, and (iii) reduced autophagic flux, as demonstrated by a decrease in LC3B lipidation (LC3B-II) in the presence of chloroquine. Therefore, SET overexpression in HEK293 cells promotes mitochondrial fission and reduces autophagic flux in apparent association with up-regulation of UCP2 and UCP3; this implies a potential involvement in cellular processes that are deregulated such as in Alzheimer's disease and cancer. - Highlights: • SET, UCPs and autophagy prevention are correlated. • SET action has mitochondrial involvement. • UCP2/3 may reduce ROS and prevent autophagy. • SET protects cell from ROS via UCP2/3.

  13. A second generation 50 Mbps VLSI level zero processing system prototype

    Science.gov (United States)

    Harris, Jonathan C.; Shi, Jeff; Speciale, Nick; Bennett, Toby

    1994-01-01

    Level Zero Processing (LZP) generally refers to telemetry data processing functions performed at ground facilities to remove all communication artifacts from instrument data. These functions typically include frame synchronization, error detection and correction, packet reassembly and sorting, playback reversal, merging, time-ordering, overlap deletion, and production of annotated data sets. The Data Systems Technologies Division (DSTD) at Goddard Space Flight Center (GSFC) has been developing high-performance Very Large Scale Integration Level Zero Processing Systems (VLSI LZPS) since 1989. The first VLSI LZPS prototype demonstrated a 20 Megabits per second (Mbps) capability in 1992. With a new generation of high-density Application-Specific Integrated Circuits (ASIC) and a Mass Storage System (MSS) based on the High-Performance Parallel Peripheral Interface (HiPPI), a second prototype has been built that achieves full 50 Mbps performance. This paper describes the second generation LZPS prototype based upon VLSI technologies.

  14. A Memory and Computation Efficient Sparse Level-Set Method

    NARCIS (Netherlands)

    Laan, Wladimir J. van der; Jalba, Andrei C.; Roerdink, Jos B.T.M.

    Since its introduction, the level set method has become the favorite technique for capturing and tracking moving interfaces, and found applications in a wide variety of scientific fields. In this paper we present efficient data structures and algorithms for tracking dynamic interfaces through the

  15. Measurement of thermally ablated lesions in sonoelastographic images using level set methods

    Science.gov (United States)

    Castaneda, Benjamin; Tamez-Pena, Jose Gerardo; Zhang, Man; Hoyt, Kenneth; Bylund, Kevin; Christensen, Jared; Saad, Wael; Strang, John; Rubens, Deborah J.; Parker, Kevin J.

    2008-03-01

    The capability of sonoelastography to detect lesions based on elasticity contrast can be applied to monitor the creation of thermally ablated lesions. Currently, segmentation of lesions depicted in sonoelastographic images is performed manually, which can be a time-consuming process prone to significant intra- and inter-observer variability. This work presents a semi-automated segmentation algorithm for sonoelastographic data. The user starts by planting a seed in the perceived center of the lesion. Fast marching methods use this information to create an initial estimate of the lesion. Subsequently, level set methods refine its final shape by attaching the segmented contour to edges in the image while maintaining smoothness. The algorithm is applied to in vivo sonoelastographic images from twenty-five thermally ablated lesions created in porcine livers. The estimated area is compared to results from manual segmentation and gross pathology images. Results show that the algorithm outperforms manual segmentation in accuracy and in inter- and intra-observer variability. The processing time per image is significantly reduced.
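    The seed-then-refine workflow can be prototyped very simply. Below, a crude breadth-first region growth stands in for the fast marching initial estimate (our illustration only; the published pipeline then refines this mask with an edge-attached level set):

    ```python
    import numpy as np
    from collections import deque

    def grow_from_seed(image, seed, tol=0.15):
        """Grow a region from the user's seed over pixels whose intensity stays
        within `tol` of the seed intensity; a stand-in for fast marching."""
        h, w = image.shape
        ref = image[seed]
        mask = np.zeros((h, w), bool)
        mask[seed] = True
        q = deque([seed])
        while q:
            i, j = q.popleft()
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if (0 <= ni < h and 0 <= nj < w and not mask[ni, nj]
                        and abs(image[ni, nj] - ref) < tol):
                    mask[ni, nj] = True
                    q.append((ni, nj))
        return mask

    img = np.random.rand(64, 64) * 0.05
    img[20:40, 25:45] += 0.8                  # synthetic "lesion"
    initial = grow_from_seed(img, (30, 35))   # seed planted in the perceived center
    ```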

  16. Skull defect reconstruction based on a new hybrid level set.

    Science.gov (United States)

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

    Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore the prosthesis cannot precisely correct the defect. This study presents a new hybrid level set model, taking into account both global region information for optimization and local edge information for accuracy, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that matched the skull defect well, with excellent individual adaptation.

  17. The influence of power and actor relations on priority setting and resource allocation practices at the hospital level in Kenya: a case study.

    Science.gov (United States)

    Barasa, Edwine W; Cleary, Susan; English, Mike; Molyneux, Sassy

    2016-09-30

    Priority setting and resource allocation in healthcare organizations often involve the balancing of competing interests and values in the context of hierarchical and politically complex settings with multiple interacting actor relationships. Despite this, few studies have examined the influence of actor and power dynamics on priority setting practices in healthcare organizations. This paper examines the influence of power relations among different actors on the implementation of priority setting and resource allocation processes in public hospitals in Kenya. We used a qualitative case study approach to examine priority setting and resource allocation practices in two public hospitals in coastal Kenya. We collected data by a combination of in-depth interviews of national level policy makers, hospital managers, and frontline practitioners in the case study hospitals (n = 72), review of documents such as hospital plans and budgets, minutes of meetings and accounting records, and non-participant observations in the case study hospitals over a period of 7 months. We applied a combination of two frameworks, Norman Long's actor interface analysis and VeneKlasen and Miller's expressions of power framework, to examine and interpret our findings. Results: The interactions of actors in the case study hospitals resulted in socially constructed interfaces between: 1) senior managers and middle-level managers, 2) non-clinical managers and clinicians, and 3) hospital managers and the community. Power imbalances resulted in the exclusion of middle-level managers (in one of the hospitals) and clinicians and the community (in both hospitals) from decision-making processes. This resulted in, amongst others, perceptions of unfairness and reduced motivation in hospital staff. It also calls into question the legitimacy of priority setting processes in these hospitals. Designing hospital decision-making structures to strengthen participation and inclusion of relevant stakeholders could

  18. 76 FR 9004 - Public Comment on Setting Achievement Levels in Writing

    Science.gov (United States)

    2011-02-16

    ... DEPARTMENT OF EDUCATION Public Comment on Setting Achievement Levels in Writing AGENCY: U.S... Achievement Levels in Writing. SUMMARY: The National Assessment Governing Board (Governing Board) is... for NAEP in writing. This notice provides opportunity for public comment and submitting...

  19. Development of very low-level radioactive waste sequestration process criteria

    Energy Technology Data Exchange (ETDEWEB)

    Chan, N.; Wong, P., E-mail: nicholas.chan@cnl.ca [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada)

    2015-12-15

    Segregating radioactive waste at the source and reclassifying radioactive waste to lower waste classes are the key activities to reduce the environmental footprint and long-term liability. In the Canadian Standards Association's radioactive waste classification system, there are 2 sub-classes within low-level radioactive waste: very short-lived radioactive waste and very low-level radioactive waste (VLLW). VLLW has a low hazard potential but is above the Canadian unconditional clearance criteria as set out in Schedule 2 of the Nuclear Substances and Devices Regulations. Long-term waste management facilities for VLLW do not require a high degree of containment and isolation. In general, a relatively low-cost near-surface facility with limited regulatory control is suitable for VLLW. At Canadian Nuclear Laboratories' Chalk River Laboratories site an initiative, VLLW Sequestration, was implemented in 2013 to set aside potential VLLW for temporary storage, to be dispositioned later in the planned VLLW facility. As of May 2015, a total of 236 m³ has been sequestered, resulting in approximately $1.1 million in total savings. One of the main hurdles in implementing VLLW Sequestration is the development of process criteria. Waste Acceptance Criteria (WAC) are used as a guide or as requirements for determining whether waste is accepted by the waste management facility. Establishment of the process criteria ensures that segregated waste materials have a high likelihood of meeting the VLLW WAC and being accepted into the planned VLLW facility. This paper outlines the challenges and various factors which were considered in the development of interim process criteria. (author)

  20. Instruction Set Architectures for Quantum Processing Units

    OpenAIRE

    Britt, Keith A.; Humble, Travis S.

    2017-01-01

    Progress in quantum computing hardware raises questions about how these devices can be controlled, programmed, and integrated with existing computational workflows. We briefly describe several prominent quantum computational models, their associated quantum processing units (QPUs), and the adoption of these devices as accelerators within high-performance computing systems. Emphasizing the interface to the QPU, we analyze instruction set architectures based on reduced and complex instruction s...

  1. Out-of-Core Computations of High-Resolution Level Sets by Means of Code Transformation

    DEFF Research Database (Denmark)

    Christensen, Brian Bunch; Nielsen, Michael Bang; Museth, Ken

    2012-01-01

    We propose a storage-efficient, fast and parallelizable out-of-core framework for streaming computations of high resolution level sets. The fundamental techniques are skewing and tiling transformations of streamed level set computations which allow for the combination of interface propagation, re… computations are now CPU bound and consequently the overall performance is unaffected by disk latency and bandwidth limitations. We demonstrate this with several benchmark tests that show sustained out-of-core throughputs close to those of in-core level set simulations.

  2. Identification of electrical resistance of fresh state concrete for nondestructive setting process monitoring

    International Nuclear Information System (INIS)

    Shin, Sung Woo

    2015-01-01

    Concrete undergoes significant phase changes from a liquid to a solid state as hydration progresses. These phase changes are known as the setting process. Liquid-state concrete is electrically conductive because of the presence of water and ions. However, since the conductive elements in the liquid state of concrete are consumed to produce non-conductive hydration products, the electrical conductivity of hydrating concrete decreases during the setting process. Therefore, the electrical properties of hydrating concrete can be used to monitor the setting process of concrete. In this study, a parameter identification method to estimate electrical parameters, such as the ohmic resistance of concrete, is proposed. The effectiveness of the proposed method for monitoring the setting process of concrete is experimentally validated.

  3. Stabilized Conservative Level Set Method with Adaptive Wavelet-based Mesh Refinement

    Science.gov (United States)

    Shervani-Tabar, Navid; Vasilyev, Oleg V.

    2016-11-01

    This paper addresses one of the main challenges of the conservative level set method, namely the ill-conditioned behavior of the normal vector away from the interface. An alternative formulation for reconstruction of the interface is proposed. Unlike the commonly used methods which rely on the unit normal vector, Stabilized Conservative Level Set (SCLS) uses a modified renormalization vector with diminishing magnitude away from the interface. With the new formulation, in the vicinity of the interface the reinitialization procedure utilizes compressive flux and diffusive terms only in the normal direction to the interface, thus preserving the conservative level set properties, while away from the interfaces the directional diffusion mechanism automatically switches to homogeneous diffusion. The proposed formulation is robust and general. It is especially well suited for use with adaptive mesh refinement (AMR) approaches due to the need for finer resolution in the vicinity of the interface than in the rest of the domain. All of the results were obtained using the Adaptive Wavelet Collocation Method, a general AMR-type method, which utilizes wavelet decomposition to adapt on steep gradients in the solution while retaining a predetermined order of accuracy.
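    For orientation, the standard conservative level set reinitialization that SCLS modifies balances a compressive flux against diffusion so that the phase indicator relaxes to a thin tanh profile. A 1-D sketch of that baseline (our illustration; SCLS itself replaces the unit normal with a vector whose magnitude decays away from the interface):

    ```python
    import numpy as np

    n_pts = 201
    x = np.linspace(0.0, 1.0, n_pts)
    dx = x[1] - x[0]
    eps, dtau = 0.01, 1e-4

    # Rough phase indicator in [0, 1] with an interface at x = 0.5.
    psi = np.clip((x - 0.5) / (4 * eps) + 0.5, 0.0, 1.0)

    for _ in range(500):
        grad = np.gradient(psi, dx)
        # The 1-D "unit normal": ill-conditioned where grad ~ 0, which is
        # precisely the behavior the SCLS reformulation targets.
        normal = np.sign(grad + 1e-12)
        flux = psi * (1.0 - psi) * normal - eps * grad  # compression minus diffusion
        psi -= dtau * np.gradient(flux, dx)             # conservative update

    # psi now approximates 0.5 * (tanh((x - 0.5) / (2 * eps)) + 1).
    ```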

  4. Some numerical studies of interface advection properties of level set ...

    Indian Academy of Sciences (India)

    explicit computational elements moving through an Eulerian grid. ... location. The interface is implicitly defined (captured) as the location of the discontinuity in the ... This level set function is advected with the background flow field and thus ...

  5. Technologies for the Fast Set-Up of Automated Assembly Processes

    DEFF Research Database (Denmark)

    Krüger, Norbert; Ude, Ales; Petersen, Henrik Gordon

    2014-01-01

    of so-called few-of-a-kind production. Therefore, most production of this kind is done manually and thus often performed in low-wage countries. In the IntellAct project, we have developed a set of methods which facilitate the set-up of a complex automatic assembly process, and here we present our work...

  6. A Cartesian Adaptive Level Set Method for Two-Phase Flows

    Science.gov (United States)

    Ham, F.; Young, Y.-N.

    2003-01-01

    In the present contribution we develop a level set method based on local anisotropic Cartesian adaptation as described in Ham et al. (2002). Such an approach should allow for the smallest possible Cartesian grid capable of resolving a given flow. The remainder of the paper is organized as follows. In section 2 the level set formulation for free surface calculations is presented and its strengths and weaknesses relative to the other free surface methods reviewed. In section 3 the collocated numerical method is described. In section 4 the method is validated by solving the 2D and 3D drop oscillation problem. In section 5 we present some results from more complex cases including the 3D drop breakup in an impulsively accelerated free stream, and the 3D immiscible Rayleigh-Taylor instability. Conclusions are given in section 6.

  7. Features, Events, and Processes: System Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-04-19

    The primary purpose of this analysis is to evaluate System Level features, events, and processes (FEPs). The System Level FEPs typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analysis and model reports. The System Level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations, or are addressed in background information used in development of the regulations. This evaluation determines which of the System Level FEPs are excluded from modeling used to support the total system performance assessment for license application (TSPA-LA). The evaluation is based on the information presented in analysis reports, model reports, direct input, or corroborative documents that are cited in the individual FEP discussions in Section 6.2 of this analysis report.

  8. Level Sets and Voronoi based Feature Extraction from any Imagery

    DEFF Research Database (Denmark)

    Sharma, O.; Anton, François; Mioc, Darka

    2012-01-01

    Polygon features are of interest in many GEOProcessing applications like shoreline mapping, boundary delineation, change detection, etc. This paper presents a unique new GPU-based methodology to automate feature extraction combining level sets, or mean shift based segmentation together with Voron...

  9. Online monitoring of oil film using electrical capacitance tomography and level set method

    International Nuclear Information System (INIS)

    Xue, Q.; Ma, M.; Sun, B. Y.; Cui, Z. Q.; Wang, H. X.

    2015-01-01

    In oil-air lubrication systems, electrical capacitance tomography (ECT) provides a promising way of monitoring the oil film in the pipelines by reconstructing cross-sectional oil distributions in real time. In the case of a small-diameter pipe and a thin oil film, however, the thickness of the oil film is hard to observe visually, since the oil-air interface is not obvious in the reconstructed images. Moreover, artifacts in the reconstructions seriously degrade the effectiveness of image segmentation techniques such as the level set method, and the standard level set method is in any case unsuitable for online monitoring because of its low computation speed. To address these problems, a modified level set method is developed: a distance regularized level set evolution formulation is extended to image two-phase flow online using an ECT system, a narrowband image filter is defined to eliminate the influence of artifacts, and, exploiting the continuity of the oil distribution over time, the oil-air interface detected in one frame is used as the initial contour for the subsequent frame; thus, the propagation from the initial contour to the boundary can be greatly accelerated, making real-time tracking possible. To test the feasibility of the proposed method, an oil-air lubrication facility with a 4 mm inner diameter pipe is measured in normal operation using an 8-electrode ECT system. Both simulation and experimental results indicate that the modified level set method is capable of visualizing the oil-air interface accurately online.

  10. A level-set method for two-phase flows with soluble surfactant

    Science.gov (United States)

    Xu, Jian-Jun; Shi, Weidong; Lai, Ming-Chih

    2018-01-01

    A level-set method is presented for solving two-phase flows with soluble surfactant. The Navier-Stokes equations are solved along with the bulk surfactant and the interfacial surfactant equations. In particular, the convection-diffusion equation for the bulk surfactant on the irregular moving domain is solved by using a level-set based diffusive-domain method. A conservation law for the total surfactant mass is derived, and a re-scaling procedure for the surfactant concentrations is proposed to compensate for the surfactant mass loss due to numerical diffusion. The whole numerical algorithm is easy to implement. Several numerical simulations in 2D and 3D show the effects of surfactant solubility on drop dynamics under shear flow.
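    The re-scaling idea is simple enough to show directly. One plausible form (our assumption of uniform scaling; the paper derives the conservation law precisely) is: after each step, compute the current total mass from bulk and interfacial contributions and scale both concentrations back to the exact target mass.

    ```python
    import numpy as np

    def rescale_surfactant(c_bulk, c_surf, m_total, dv, ds):
        """Uniformly rescale bulk (c_bulk) and interfacial (c_surf) surfactant
        concentrations so that the discrete total mass returns to m_total.
        dv: cell volumes; ds: interfacial area weights. Illustrative sketch."""
        m_now = np.sum(c_bulk * dv) + np.sum(c_surf * ds)
        scale = m_total / m_now
        return c_bulk * scale, c_surf * scale

    # Toy usage: numerical diffusion has inflated the mass from 2.0 to 2.7.
    c_bulk, c_surf = rescale_surfactant(
        c_bulk=np.full(10, 0.9), c_surf=np.full(4, 1.8),
        m_total=2.0, dv=np.full(10, 0.1), ds=np.full(4, 0.25))
    ```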

  11. Level set methods for inverse scattering—some recent developments

    International Nuclear Information System (INIS)

    Dorn, Oliver; Lesselier, Dominique

    2009-01-01

    We give an update on recent techniques which use a level set representation of shapes for solving inverse scattering problems, complementing the expositions made in (Dorn and Lesselier 2006 Inverse Problems 22 R67) and (Dorn and Lesselier 2007 Deformable Models (New York: Springer) pp 61–90), and bringing them closer to the current state of the art.

  12. A level set method for cupping artifact correction in cone-beam CT

    International Nuclear Information System (INIS)

    Xie, Shipeng; Li, Haibo; Ge, Qi; Li, Chunming

    2015-01-01

    Purpose: To reduce cupping artifacts and improve the contrast-to-noise ratio in cone-beam computed tomography (CBCT). Methods: A level set method is proposed to reduce cupping artifacts in the reconstructed image of CBCT. The authors derive a local intensity clustering property of the CBCT image and define a local clustering criterion function of the image intensities in a neighborhood of each point. This criterion function defines an energy in terms of the level set functions, which represent a segmentation result and the cupping artifacts. The cupping artifacts are estimated as a result of minimizing this energy. Results: The cupping artifacts in CBCT are reduced by an average of 90%. The results indicate that the level set-based algorithm is practical and effective for reducing the cupping artifacts and preserving the quality of the reconstructed image. Conclusions: The proposed method focuses on the reconstructed image without requiring any additional physical equipment, is easily implemented, and provides cupping correction through a single-scan acquisition. The experimental results demonstrate that the proposed method successfully reduces the cupping artifacts.

  13. Level-set simulations of buoyancy-driven motion of single and multiple bubbles

    International Nuclear Information System (INIS)

    Balcázar, Néstor; Lehmkuhl, Oriol; Jofre, Lluís; Oliva, Assensi

    2015-01-01

    Highlights: • A conservative level-set method is validated and verified. • An extensive study of buoyancy-driven motion of single bubbles is performed. • The interactions of two spherical and ellipsoidal bubbles are studied. • The interaction of multiple bubbles is simulated in a vertical channel. - Abstract: This paper presents a numerical study of buoyancy-driven motion of single and multiple bubbles by means of the conservative level-set method. First, an extensive study of the hydrodynamics of single bubbles rising in a quiescent liquid is performed, including their shape, terminal velocity, drag coefficients and wake patterns. These results are validated against experimental and numerical data well established in the scientific literature. Then, a further study on the interaction of two spherical and ellipsoidal bubbles is performed for different orientation angles. Finally, the interaction of multiple bubbles is explored in a periodic vertical channel. The results show that the conservative level-set approach can be used for accurate modelling of bubble dynamics. Moreover, it is demonstrated that the present method is numerically stable for a wide range of Morton and Reynolds numbers.

  14. Increased fairness in priority setting processes within the health sector: the case of Kapiri-Mposhi District, Zambia.

    Science.gov (United States)

    Zulu, Joseph M; Michelo, Charles; Msoni, Carol; Hurtig, Anna-Karin; Byskov, Jens; Blystad, Astrid

    2014-02-18

    The challenge of priority setting (PS) in health care within contexts of severe resource limitations has continued to receive attention. Accountability for Reasonableness (AFR) has emerged as a useful framework to guide the implementation of PS processes. In 2006, the AFR approach to enhance legitimate and fair PS was introduced by researchers and decision makers within the health sector in the EU-funded research project entitled 'Response to Accountable priority setting for Trust in health systems' (REACT). The project aimed to strengthen fairness and accountability in the PS processes of health systems at district level in Zambia, Tanzania and Kenya. This paper focuses on local perceptions and practices of fair PS (baseline study) as well as on the evolution of such perceptions and practices following an AFR-based intervention (evaluation study), carried out at district level in Kapiri-Mposhi District in Zambia. Data was collected using in-depth interviews (IDIs), focus group discussions (FGDs) and review of documents from national to district level. The study population for this paper consisted of health-related stakeholders employed in the district administration, in non-governmental organizations (NGO) and in health facilities. During the baseline study, concepts of legitimacy and fairness in PS processes were found to be grounded in local values of equity and impartiality. Government and other organizational strategies strongly supported devolution of PS and decision making procedures. However, important gaps were identified in terms of experiences of stakeholder involvement and fairness in PS processes in practice. The evaluation study revealed that a transformation of the views and methods regarding fairness in PS processes was ongoing in the study district, which was partly attributed to the AFR-based intervention. The study findings suggest that increased attention was given to fairness in PS processes at district level. The changes were linked to a

  15. A Level Set Discontinuous Galerkin Method for Free Surface Flows

    DEFF Research Database (Denmark)

    Grooss, Jesper; Hesthaven, Jan

    2006-01-01

    We present a discontinuous Galerkin method on a fully unstructured grid for the modeling of unsteady incompressible fluid flows with free surfaces. The surface is modeled by embedding and is represented by a level set. We discuss the discretization of the flow equations and the level set equation.

  16. SENTINEL-2 LEVEL 1 PRODUCTS AND IMAGE PROCESSING PERFORMANCES

    Directory of Open Access Journals (Sweden)

    S. J. Baillarin

    2012-07-01

    The stringent image quality requirements are also described, in particular the geo-location accuracy for both absolute (better than 12.5 m) and multi-temporal (better than 0.3 pixels) cases. Then, the prototyped image processing techniques (both radiometric and geometric) will be addressed. The radiometric corrections will be first introduced. They consist mainly in dark signal and detector relative sensitivity correction, crosstalk correction and MTF restoration. Then, a special focus will be done on the geometric corrections. In particular the innovative method of automatic enhancement of the geometric physical model will be detailed. This method takes advantage of a Global Reference Image database, perfectly geo-referenced, to correct the physical geometric model of each image taken. The processing is based on an automatic image matching process which provides accurate ground control points between a given band of the image to refine and a reference image, allowing to dynamically calibrate the viewing model. The generation of the Global Reference Image database made of Sentinel-2 pre-calibrated mono-spectral images will be also addressed. In order to perform independent validation of the prototyping activity, an image simulator dedicated to Sentinel-2 has been set up. Thanks to this, a set of images have been simulated from various source images and combining different acquisition conditions and landscapes (mountains, deserts, cities …). Given disturbances have been also simulated so as to estimate the end to end performance of the processing chain. Finally, the radiometric and geometric performances obtained by the prototype will be presented. In particular, the geo-location performance of the level-1C products which widely fulfils the image quality requirements will be provided.

  17. SENTINEL-2 Level 1 Products and Image Processing Performances

    Science.gov (United States)

    Baillarin, S. J.; Meygret, A.; Dechoz, C.; Petrucci, B.; Lacherade, S.; Tremas, T.; Isola, C.; Martimort, P.; Spoto, F.

    2012-07-01

    stringent image quality requirements are also described, in particular the geo-location accuracy for both absolute (better than 12.5 m) and multi-temporal (better than 0.3 pixels) cases. Then, the prototyped image processing techniques (both radiometric and geometric) will be addressed. The radiometric corrections will be first introduced. They consist mainly in dark signal and detector relative sensitivity correction, crosstalk correction and MTF restoration. Then, a special focus will be done on the geometric corrections. In particular the innovative method of automatic enhancement of the geometric physical model will be detailed. This method takes advantage of a Global Reference Image database, perfectly geo-referenced, to correct the physical geometric model of each image taken. The processing is based on an automatic image matching process which provides accurate ground control points between a given band of the image to refine and a reference image, allowing to dynamically calibrate the viewing model. The generation of the Global Reference Image database made of Sentinel-2 pre-calibrated mono-spectral images will be also addressed. In order to perform independent validation of the prototyping activity, an image simulator dedicated to Sentinel-2 has been set up. Thanks to this, a set of images have been simulated from various source images and combining different acquisition conditions and landscapes (mountains, deserts, cities …). Given disturbances have been also simulated so as to estimate the end to end performance of the processing chain. Finally, the radiometric and geometric performances obtained by the prototype will be presented. In particular, the geo-location performance of the level-1C products which widely fulfils the image quality requirements will be provided.

  18. Embedded Real-Time Architecture for Level-Set-Based Active Contours

    Directory of Open Access Journals (Sweden)

    Dejnožková Eva

    2005-01-01

    Methods described by partial differential equations have gained considerable interest because of undoubted advantages such as an easy mathematical description of the underlying physical phenomena, subpixel precision, isotropy, and direct extension to higher dimensions. Though their implementation within the level set framework offers other interesting advantages, their wide industrial deployment on embedded systems is slowed down by their considerable computational effort. This paper exploits the high parallelization potential of the operators from the level set framework and proposes a scalable, asynchronous, multiprocessor platform suitable for system-on-chip solutions. We concentrate on obtaining real-time execution capabilities. The performance is evaluated on a continuous watershed and an object-tracking application based on a simple gradient-based attraction force driving the active contour. The proposed architecture can be realized on commercially available FPGAs. It is built around general-purpose processor cores, and can run code developed with usual tools.

  19. Numerical Modelling of Three-Fluid Flow Using The Level-set Method

    Science.gov (United States)

    Li, Hongying; Lou, Jing; Shang, Zhi

    2014-11-01

    This work presents a numerical model for simulation of three-fluid flow involving two different moving interfaces. These interfaces are captured using the level-set method via two different level-set functions. A combined formulation with only one set of conservation equations for the whole physical domain, consisting of the three different immiscible fluids, is employed. Numerical solution is performed on a fixed mesh using the finite volume method. Surface tension effect is incorporated using the Continuum Surface Force model. Validation of the present model is made against available results for stratified flow and rising bubble in a container with a free surface. Applications of the present model are demonstrated by a variety of three-fluid flow systems including (1) three-fluid stratified flow, (2) two-fluid stratified flow carrying the third fluid in the form of drops and (3) simultaneous rising and settling of two drops in a stationary third fluid. The work is supported by a Thematic and Strategic Research from A*STAR, Singapore (Ref. #: 1021640075).

  20. Processing AIRS Scientific Data Through Level 2

    Science.gov (United States)

    Oliphant, Robert; Lee, Sung-Yung; Chahine, Moustafa; Susskind, Joel; arnet, Christopher; McMillin, Larry; Goldberg, Mitchell; Blaisdell, John; Rosenkranz, Philip; Strow, Larrabee

    2007-01-01

    The Atmospheric Infrared Spectrometer (AIRS) Science Processing System (SPS) is a collection of computer programs, denoted product generation executives (PGEs), for processing the readings of the AIRS suite of infrared and microwave instruments orbiting the Earth aboard NASA's Aqua spacecraft. AIRS SPS at an earlier stage of development was described in "Initial Processing of Infrared Spectral Data" (NPO-35243), NASA Tech Briefs, Vol. 28, No. 11 (November 2004), page 39. To recapitulate: Starting from level 0 (representing raw AIRS data), the PGEs and their data products are denoted by alphanumeric labels (1A, 1B, and 2) that signify the successive stages of processing. The cited prior article described processing through level 1B (the level-2 PGEs were not yet operational). The level-2 PGEs, which are now operational, receive packages of level-1B geolocated radiance data products and produce geolocated geophysical atmospheric data products such as temperature and humidity profiles. The process of computing these geophysical data products is denoted "retrieval" and is quite complex. The main steps of the process are denoted microwave-only retrieval, cloud detection and cloud clearing, regression, full retrieval, and rapid transmittance algorithm.

  1. Processing AIRS Scientific Data Through Level 3

    Science.gov (United States)

    Granger, Stephanie; Oliphant, Robert; Manning, Evan

    2010-01-01

    The Atmospheric Infra-Red Sounder (AIRS) Science Processing System (SPS) is a collection of computer programs, known as product generation executives (PGEs). The AIRS SPS PGEs are used for processing measurements received from the AIRS suite of infrared and microwave instruments orbiting the Earth onboard NASA's Aqua spacecraft. Early stages of the AIRS SPS development were described in a prior NASA Tech Briefs article: Initial Processing of Infrared Spectral Data (NPO-35243), Vol. 28, No. 11 (November 2004), page 39. In summary: Starting from Level 0 (representing raw AIRS data), the AIRS SPS PGEs and the data products they produce are identified by alphanumeric labels (1A, 1B, 2, and 3) representing successive stages or levels of processing. The previous NASA Tech Briefs article described processing through Level 2, the output of which comprises geo-located atmospheric data products such as temperature and humidity profiles, among others. The AIRS Level 3 PGE samples selected information from the Level 2 standard products to produce a single global gridded product. One Level 3 product is generated for each day's collection of Level 2 data. In addition, daily Level 3 products are aggregated into two multiday products: an eight-day (half the orbital repeat cycle) product and a monthly (calendar month) product.

  2. A review of processes important in the floodplain setting

    OpenAIRE

    Stuart, M.E.; Lapworth, D.J.

    2011-01-01

    This report reviews the physical and geochemical processes reported in the literature and likely to be operating in the floodplain setting. The review supports a study of Port Meadow, located within the floodplain of the River Thames to the northwest of the city of Oxford, an area affected by urban pollution. It focuses on floodplains but also includes material on the hyporheic zone and, more generally, on riparian zones. It describes the processes, generically covers case ...

  3. Some applications of fuzzy sets and the analytical hierarchy process to decision making

    OpenAIRE

    Castro, Alberto Rosas

    1984-01-01

    Approved for public release; distribution unlimited. This thesis examines the use of fuzzy set theory and the analytic hierarchy process in decision making. It begins by reviewing the insights of psychologists, social scientists and computer scientists into the decision making process. The Operations Research-Systems Analysis approach is discussed, followed by a presentation of the basis of fuzzy set theory and the analytic hierarchy process. Two applications of these meth...
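    As a concrete illustration of the analytic hierarchy process step, the sketch below (an invented three-criterion example, not taken from the thesis) derives priority weights from a reciprocal pairwise comparison matrix via its principal eigenvector and checks Saaty's consistency ratio:

    ```python
    import numpy as np

    # Pairwise comparisons on Saaty's 1-9 scale (made-up numbers): how much
    # more important is criterion i than criterion j? A[j, i] = 1 / A[i, j].
    A = np.array([[1.0, 3.0, 5.0],
                  [1/3, 1.0, 2.0],
                  [1/5, 1/2, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)           # Perron (principal) eigenvalue
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                       # normalized priority weights

    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)  # consistency index
    cr = ci / 0.58                     # random index RI = 0.58 for n = 3
    print("weights:", w.round(3), "CR:", round(cr, 3))  # CR < 0.1 is acceptable
    ```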

  4. Power battles in ICT standards-setting process : lessons from mobile payments

    NARCIS (Netherlands)

    Lim, A.S.

    2006-01-01

    Standards play an important role in ICT innovation to ensure the interoperability and interconnectivity. However, standardisation is a complex process that involves actors with different interests. Various studies, mainly from economics, have tried to develop the standards-setting process

  5. Level set method for image segmentation based on moment competition

    Science.gov (United States)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

    We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.

  6. Topological Hausdorff dimension and level sets of generic continuous functions on fractals

    International Nuclear Information System (INIS)

    Balka, Richárd; Buczolich, Zoltán; Elekes, Márton

    2012-01-01

    Highlights: ► We examine a new fractal dimension, the so-called topological Hausdorff dimension. ► The generic continuous function has a level set of maximal Hausdorff dimension. ► This maximal dimension is the topological Hausdorff dimension minus one. ► Homogeneity implies that "most" level sets are of this dimension. ► We calculate the various dimensions of the graph of the generic function. - Abstract: In an earlier paper we introduced a new concept of dimension for metric spaces, the so-called topological Hausdorff dimension. For a compact metric space K let dim_H K and dim_tH K denote its Hausdorff and topological Hausdorff dimension, respectively. We proved that this new dimension describes the Hausdorff dimension of the level sets of the generic continuous function on K, namely sup{dim_H f⁻¹(y) : y ∈ ℝ} = dim_tH K − 1 for the generic f ∈ C(K), provided that K is not totally disconnected; otherwise every non-empty level set is a singleton. We also proved that if K is not totally disconnected and sufficiently homogeneous then dim_H f⁻¹(y) = dim_tH K − 1 for the generic f ∈ C(K) and the generic y ∈ f(K). The most important goal of this paper is to make these theorems more precise. As for the first result, we prove that the supremum is actually attained on the left hand side of the first equation above, and also show that there may only be a unique level set of maximal Hausdorff dimension. As for the second result, we characterize those compact metric spaces for which for the generic f ∈ C(K) and the generic y ∈ f(K) we have dim_H f⁻¹(y) = dim_tH K − 1. We also generalize a result of B. Kirchheim by showing that if K is self-similar then for the generic f ∈ C(K), for every y ∈ int f(K), we have dim_H f⁻¹(y) = dim_tH K − 1. Finally, we prove that the graph of the generic f ∈ C(K) has the same Hausdorff and topological Hausdorff dimension as K.
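    For readability, the abstract's two central identities can be typeset as follows (restated, not new results):

    ```latex
    % Level sets of the generic continuous function on a compact metric space K:
    \sup\bigl\{\dim_{H} f^{-1}(y) : y \in \mathbb{R}\bigr\} = \dim_{tH} K - 1
      \quad \text{for the generic } f \in C(K),
    \qquad
    \dim_{H} f^{-1}(y) = \dim_{tH} K - 1
      \quad \text{for the generic } f \in C(K) \text{ and the generic } y \in f(K).
    ```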

  7. Relationships between college settings and student alcohol use before, during and after events: a multi-level study.

    Science.gov (United States)

    Paschall, Mallie J; Saltz, Robert F

    2007-11-01

    We examined how alcohol risk is distributed based on college students' drinking before, during and after they go to certain settings. Students attending 14 California public universities (N=10,152) completed a web-based or mailed survey in the fall 2003 semester, which included questions about how many drinks they consumed before, during and after the last time they went to six settings/events: fraternity or sorority party, residence hall party, campus event (e.g. football game), off-campus party, bar/restaurant and outdoor setting (referent). Multi-level analyses were conducted in hierarchical linear modeling (HLM) to examine relationships between type of setting and level of alcohol use before, during and after going to the setting, and possible age and gender differences in these relationships. Drinking episodes (N=24,207) were level 1 units, students were level 2 units and colleges were level 3 units. The highest drinking levels were observed during all settings/events except campus events, with the highest number of drinks being consumed at off-campus parties, followed by residence hall and fraternity/sorority parties. The number of drinks consumed before a fraternity/sorority party was higher than for other settings/events. Age group and gender differences in relationships between type of setting/event and 'before', 'during' and 'after' drinking levels also were observed. For example, going to a bar/restaurant (relative to an outdoor setting) was positively associated with 'during' drinks among students of legal drinking age, while no relationship was observed for underage students. Findings of this study indicate differences in the extent to which college settings are associated with student drinking levels before, during and after related events, and may have implications for intervention strategies targeting different types of settings.

  8. EOS MLS Level 2 Data Processing Software Version 3

    Science.gov (United States)

    Livesey, Nathaniel J.; VanSnyder, Livesey W.; Read, William G.; Schwartz, Michael J.; Lambert, Alyn; Santee, Michelle L.; Nguyen, Honghanh T.; Froidevaux, Lucien; wang, Shuhui; Manney, Gloria L.; hide

    2011-01-01

    This software accepts the EOS MLS calibrated measurements of microwave radiances and operational meteorological data, and produces a set of estimates of atmospheric temperature and composition. This version has been designed to be as flexible as possible. The software is controlled by a Level 2 Configuration File that governs all aspects of the software: defining the contents of state and measurement vectors, defining the configurations of the various forward models available, reading appropriate a priori spectroscopic and calibration data, performing retrievals, post-processing results, computing diagnostics, and outputting results in appropriate files. In production mode, the software operates in a parallel form, with one instance of the program acting as a master, coordinating the work of multiple slave instances on a cluster of computers, each computing the results for individual chunks of data. In addition to performing conventional retrieval calculations and producing geophysical products, the software can be directed by the Level 2 Configuration File to produce files of simulated radiances based on a state vector formed from a set of geophysical product files taken as input. Combining both the retrieval and simulation tasks in a single piece of software makes it far easier to ensure that identical forward model algorithms and parameters are used in both tasks. This also dramatically reduces the complexity of the code maintenance effort.
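    The master/slave chunking pattern described above is easy to picture with a toy example (hypothetical names; the real PGEs and their Level 2 Configuration File are far richer):

    ```python
    from multiprocessing import Pool

    def process_chunk(chunk_id):
        """Stand-in for one slave instance retrieving products for one chunk
        of radiance data (hypothetical; illustrates the pattern only)."""
        return chunk_id, f"retrieved products for chunk {chunk_id}"

    if __name__ == "__main__":
        # The master farms chunks out to slaves and collects results in order,
        # mirroring the parallel production mode described above.
        with Pool(processes=4) as pool:
            for cid, result in pool.imap(process_chunk, range(8)):
                print(cid, result)
    ```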

  9. Individual- and Setting-Level Correlates of Secondary Traumatic Stress in Rape Crisis Center Staff.

    Science.gov (United States)

    Dworkin, Emily R; Sorell, Nicole R; Allen, Nicole E

    2016-02-01

    Secondary traumatic stress (STS) is an issue of significant concern among providers who work with survivors of sexual assault. Although STS has been studied in relation to individual-level characteristics of a variety of types of trauma responders, less research has focused specifically on rape crisis centers as environments that might convey risk or protection from STS, and no research to our knowledge has modeled setting-level variation in correlates of STS. The current study uses a sample of 164 staff members representing 40 rape crisis centers across a single Midwestern state to investigate the staff member- and agency-level correlates of STS. Results suggest that correlates exist at both levels of analysis. Younger age and greater severity of sexual assault history were statistically significant individual-level predictors of increased STS. Greater frequency of supervision was more strongly related to secondary stress for non-advocates than for advocates. At the setting level, lower levels of supervision and higher client loads agency-wide accounted for unique variance in staff members' STS. These findings suggest that characteristics of both providers and their settings are important to consider when understanding their STS. © The Author(s) 2014.

  10. Healthcare priority setting in Kenya

    DEFF Research Database (Denmark)

    Bukachi, Salome A.; Onyango-Ouma, Washington; Siso, Jared Maaka

    2014-01-01

    In resource-poor settings, the accountability for reasonableness (A4R) has been identified as an important advance in priority setting that helps to operationalize fair priority setting in specific contexts. The four conditions of A4R are backed by theory, not evidence, that conformance with them...... improves the priority setting decisions. This paper describes the healthcare priority setting processes in Malindi district, Kenya, prior to the implementation of A4R in 2008 and evaluates the process for its conformance with the conditions for A4R. In-depth interviews and focus group discussions with key...... players in the Malindi district health system and a review of key policy documents and national guidelines show that the priority setting process in the district relies heavily on guidelines from the national level, making it more of a vertical, top-down orientation. Multilateral and donor agencies...

  11. Hybrid approach for detection of dental caries based on the methods FCM and level sets

    Science.gov (United States)

    Chaabene, Marwa; Ben Ali, Ramzi; Ejbali, Ridha; Zaied, Mourad

    2017-03-01

    This paper presents a new technique for the detection of dental caries, a bacterial disease that destroys the tooth structure. In our approach, we have developed a new segmentation method that combines the advantages of the fuzzy C-means (FCM) algorithm and the level set method. The results obtained by the FCM algorithm are used by the level set algorithm to reduce the influence of noise on each of these algorithms, to facilitate level set manipulation and to lead to more robust segmentation. The sensitivity and specificity confirm the effectiveness of the proposed method for caries detection.
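
    A minimal sketch of such an FCM-then-level-set pipeline is shown below, assuming a 2D grayscale image array img. The two-cluster FCM is written out in NumPy; the membership map of the darker cluster then seeds a morphological Chan-Vese evolution from scikit-image, used here as a stand-in for the paper's level set stage.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

def fcm_two_clusters(img, m=2.0, n_iter=50, seed=0):
    """Minimal two-cluster fuzzy C-means on pixel intensities."""
    x = img.ravel().astype(float)
    rng = np.random.default_rng(seed)
    u = rng.random(x.size)
    u = np.stack([u, 1.0 - u])                 # memberships; columns sum to 1
    for _ in range(n_iter):
        w = u ** m
        c = (w @ x) / w.sum(axis=1)            # weighted cluster centres
        d = np.abs(x[None, :] - c[:, None]) + 1e-12
        u = d ** (-2.0 / (m - 1.0))            # standard FCM membership update
        u /= u.sum(axis=0)
    return u.reshape((2,) + img.shape), c

# img: 2D grayscale radiograph as a float array. The darker cluster's
# membership map seeds the level set so evolution starts near the lesion.
# memberships, centres = fcm_two_clusters(img)
# init = memberships[int(np.argmin(centres))] > 0.5
# seg = morphological_chan_vese(img, 100, init_level_set=init, smoothing=2)
```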

  12. Standard Setting as Psychometric Due Process: Going a Little Further Down an Uncertain Road.

    Science.gov (United States)

    Cizek, Gregory J.

    The concept of due process provides an analogy for the process of standard setting that emphasizes many of the procedural and substantive elements of the process over technical and statistical concerns. Surely such concerns can and should continue to be addressed. However, a sound rationale for standard setting does not rest on this foundation.…

  13. HPC in Basin Modeling: Simulating Mechanical Compaction through Vertical Effective Stress using Level Sets

    Science.gov (United States)

    McGovern, S.; Kollet, S. J.; Buerger, C. M.; Schwede, R. L.; Podlaha, O. G.

    2017-12-01

    In the context of sedimentary basins, we present a model for the simulation of the movement of a geological formation (layers) during the evolution of the basin through sedimentation and compaction processes. Assuming a single-phase saturated porous medium for the sedimentary layers, the model focuses on the tracking of the layer interfaces, through the use of the level set method, as sedimentation drives fluid flow and reduction of pore space by compaction. On the assumption of Terzaghi's effective stress concept, the coupling of the pore fluid pressure to the motion of interfaces in 1-D is presented in McGovern, et al. (2017) [1]. The current work extends the spatial domain to 3-D, though we maintain the assumption of vertical effective stress to drive the compaction. The idealized geological evolution is conceptualized as the motion of interfaces between rock layers, whose paths are determined by the magnitude of a speed function in the direction normal to the evolving layer interface. The speeds normal to the interface are dependent on the change in porosity, determined through an effective stress-based compaction law, such as the exponential Athy's law. Provided with the speeds normal to the interface, the level set method uses an advection equation to evolve a potential function, whose zero level set defines the interface. Thus, the moving layer geometry influences the pore pressure distribution, which couples back to the interface speeds. The flexible construction of the speed function allows extension, in the future, to other terms to represent different physical processes, analogous to how the compaction rule represents material deformation. The 3-D model is implemented using the generic finite element method framework deal.II, which provides tools, building on p4est and interfacing to PETSc, for the massively parallel distributed solution to the model equations [2]. Experiments are being run on the Juelich Supercomputing Center's Jureca cluster. [1] McGovern, et al. (2017
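
    For reference, the interface evolution and compaction law referred to above take the following standard forms (ψ is used for the level set function to avoid clashing with porosity φ; the exact constants and the precise form used in the paper are assumptions):

```latex
% Level set advection: the layer interface is the zero set of \psi,
% moved with normal speed F derived from the compaction law.
\frac{\partial \psi}{\partial t} + F\,\lvert\nabla\psi\rvert = 0,
\qquad \Gamma(t) = \{\mathbf{x} : \psi(\mathbf{x},t) = 0\}

% Athy-type exponential compaction in terms of vertical effective stress
% \sigma' (Terzaghi: \sigma' = \sigma - p), with depositional porosity \phi_0:
\phi(\sigma') = \phi_0\, e^{-\beta \sigma'}
```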

  14. An Examination of the Workflow Processes of the Screening, Brief Intervention, and Referral to Treatment (SBIRT) Program in Health Care Settings.

    Science.gov (United States)

    Kaiser, David J; Karuntzos, Georgia

    2016-01-01

    Screening, Brief Intervention, and Referral to Treatment (SBIRT) is a public health program used to identify, reduce, and prevent problematic use, abuse, and dependence on alcohol and illicit drugs that has been adapted for implementation in emergency departments and ambulatory clinics nationwide. This study used a combination of observational, timing, and descriptive analyses from a multisite evaluation to understand the workflow processes implemented in 21 treatment settings. Direct observations of 59 SBIRT practitioners and semi-structured interviews with 170 stakeholders, program administrators, practitioners, and program evaluators provided information about workflow in different medical care settings. The SBIRT workflow processes are presented at three levels: service delivery, information storage, and information sharing. Analyses suggest limited variation in the overall workflow processes across settings, although performance sites tailored the program to fit with existing clinical processes, health information technology, and patient characteristics. Strategies for successful integration include co-locating SBIRT providers in the medical care setting and integrating SBIRT data into electronic health records. Provisions within the Patient Protection and Affordable Care Act of 2010 call for the integration of behavioral health and medical care services. SBIRT is being adapted in different types of medical care settings, and the workflow processes are being adapted to ensure efficient delivery, illustrating the successful integration of behavioral health and medical care. Copyright © 2015 Elsevier Inc. All rights reserved.

  15. CO2 laser cutting of MDF. 1. Determination of process parameter settings

    Science.gov (United States)

    Lum, K. C. P.; Ng, S. L.; Black, I.

    2000-02-01

    This paper details an investigation into the laser processing of medium-density fibreboard (MDF). Part 1 reports on the determination of process parameter settings for the effective cutting of MDF by CO2 laser, using an established experimental methodology developed to study the interrelationships between, and the effects of, varying laser set-up parameters. Results are presented for both continuous wave (CW) and pulse mode (PM) cutting, and the associated cut quality effects have been commented on.

  16. Stakeholder views on criteria and processes for priority setting in Norway: a qualitative study.

    Science.gov (United States)

    Aidem, Jeremy M

    2017-06-01

    Since 2013, Norway has engaged in political processes to revise criteria for priority setting. These processes have yielded key efficiency and equity criteria, but excluded potentially relevant social values. This study describes the views of 27 stakeholders in Norway's health system regarding a wider set of priority-setting criteria and procedural characteristics. Between January and February 2016, semi-structured interviews and focus groups were conducted with a purposive sample of policymakers, hospital administrators, practitioners, university students and seniors. Improving health among low-socioeconomic-status groups was considered an important policy objective: some favored giving more priority to diseases affecting socioeconomically disadvantaged groups, and some believed inequalities in health could be more effectively addressed outside the health sector. Age was not widely accepted as an independent criterion, but deemed relevant as an indicator of capacity to benefit, cost-effectiveness and health loss. Cost-effectiveness, severity and health-loss measures were judged relevant to policymaking, but cost-effectiveness and health loss were considered less influential to clinical decision-making. Public engagement was seen as essential yet complicated by media and stakeholder pressures. This study highlights how views on the relevance and implementation of criteria can vary significantly according to the health system level being evaluated. Further, the findings suggest that giving priority to socioeconomically disadvantaged groups and reducing inequalities in health may be relevant preferences not captured in recent policy proposals. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Topology optimization of hyperelastic structures using a level set method

    Science.gov (United States)

    Chen, Feifei; Wang, Yiqiang; Wang, Michael Yu; Zhang, Y. F.

    2017-12-01

    Soft rubberlike materials, due to their inherent compliance, are finding widespread implementation in a variety of applications ranging from assistive wearable technologies to soft material robots. Structural design of such soft and rubbery materials necessitates the consideration of large nonlinear deformations and hyperelastic material models to accurately predict their mechanical behaviour. In this paper, we present an effective level set-based topology optimization method for the design of hyperelastic structures that undergo large deformations. The method incorporates both geometric and material nonlinearities where the strain and stress measures are defined within the total Lagrange framework and the hyperelasticity is characterized by the widely-adopted Mooney-Rivlin material model. A shape sensitivity analysis is carried out, in the strict sense of the material derivative, where the high-order terms involving the displacement gradient are retained to ensure the descent direction. As the design velocity enters into the shape derivative in terms of its gradient and divergence terms, we develop a discrete velocity selection strategy. The whole optimization implementation undergoes a two-step process, where the linear optimization is first performed and its optimized solution serves as the initial design for the subsequent nonlinear optimization. It turns out that this operation could efficiently alleviate the numerical instability and facilitate the optimization process. To demonstrate the validity and effectiveness of the proposed method, three compliance minimization problems are studied and their optimized solutions present significant mechanical benefits of incorporating the nonlinearities, in terms of remarkable enhancement in not only the structural stiffness but also the critical buckling load.
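
    For reference, the two-parameter Mooney-Rivlin strain energy density mentioned above is commonly written as follows; whether the paper uses the incompressible form or a compressible variant with a volumetric penalty is not stated, so both are given:

```latex
% Incompressible two-parameter Mooney-Rivlin model
% (\bar{I}_1, \bar{I}_2: deviatoric invariants of the right Cauchy-Green tensor):
W = C_{10}(\bar{I}_1 - 3) + C_{01}(\bar{I}_2 - 3)

% A common compressible extension adds a volumetric term in J = \det\mathbf{F}:
W = C_{10}(\bar{I}_1 - 3) + C_{01}(\bar{I}_2 - 3) + \frac{1}{D_1}(J - 1)^2
```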

  18. Levels of processing and picture memory: the physical superiority effect.

    Science.gov (United States)

    Intraub, H; Nicklos, S

    1985-04-01

    Six experiments studied the effect of physical orienting questions (e.g., "Is this angular?") and semantic orienting questions (e.g., "Is this edible?") on memory for unrelated pictures at stimulus durations ranging from 125-2,000 ms. Results ran contrary to the semantic superiority "rule of thumb," which is based primarily on verbal memory experiments. Physical questions were associated with better free recall and cued recall of a diverse set of visual scenes (Experiments 1, 2, and 4). This occurred both when general and highly specific semantic questions were used (Experiments 1 and 2). Similar results were obtained when more simplistic visual stimuli--photographs of single objects--were used (Experiments 5 and 6). As in the case of the semantic superiority effect with words, the physical superiority effect for pictures was eliminated or reversed when the same physical questions were repeated throughout the session (Experiments 4 and 6). Conflicts with results of previous levels of processing experiments with words and nonverbal stimuli (e.g., faces) are explained in terms of the sensory-semantic model (Nelson, Reed, & McEvoy, 1977). Implications for picture memory research and the levels of processing viewpoint are discussed.

  19. Tree-indexed processes: a high level crossing analysis

    Directory of Open Access Journals (Sweden)

    Mark Kelbert

    2003-01-01

    Full Text Available Consider a branching diffusion process on R^1 starting at the origin. Take a high level u>0 and count the number R(u,n) of branches reaching u by generation n. Let F_{k,n}(u) be the probability P(R(u,n)<k); as n grows, it approaches a limit F_k(u) satisfying a limiting equation whose set of solutions is analysed. We interpret F_k(u) as a potential ruin probability in the situation of a multiple choice of a decision taken at vertices of a 'logical tree'. It is shown that, unlike the standard risk theory, the above equation has a manifold of solutions. Also an analogue of Lundberg's bound for branching diffusion is derived.

  20. Level-set-based reconstruction algorithm for EIT lung images: first clinical results.

    Science.gov (United States)

    Rahmati, Peyman; Soleimani, Manuchehr; Pulletz, Sven; Frerichs, Inéz; Adler, Andy

    2012-05-01

    We show the first clinical results using the level-set-based reconstruction algorithm for electrical impedance tomography (EIT) data. The level-set-based reconstruction method (LSRM) allows the reconstruction of non-smooth interfaces between image regions, which are typically smoothed by traditional voxel-based reconstruction methods (VBRMs). We develop a time difference formulation of the LSRM for 2D images. The proposed reconstruction method is applied to reconstruct clinical EIT data of a slow flow inflation pressure-volume manoeuvre in lung-healthy and adult lung-injury patients. Images from the LSRM and the VBRM are compared. The results show comparable reconstructed images, but with an improved ability to reconstruct sharp conductivity changes in the distribution of lung ventilation using the LSRM.

  1. Level-set-based reconstruction algorithm for EIT lung images: first clinical results

    International Nuclear Information System (INIS)

    Rahmati, Peyman; Adler, Andy; Soleimani, Manuchehr; Pulletz, Sven; Frerichs, Inéz

    2012-01-01

    We show the first clinical results using the level-set-based reconstruction algorithm for electrical impedance tomography (EIT) data. The level-set-based reconstruction method (LSRM) allows the reconstruction of non-smooth interfaces between image regions, which are typically smoothed by traditional voxel-based reconstruction methods (VBRMs). We develop a time difference formulation of the LSRM for 2D images. The proposed reconstruction method is applied to reconstruct clinical EIT data of a slow flow inflation pressure–volume manoeuvre in lung-healthy and adult lung-injury patients. Images from the LSRM and the VBRM are compared. The results show comparable reconstructed images, but with an improved ability to reconstruct sharp conductivity changes in the distribution of lung ventilation using the LSRM. (paper)

  2. Virtual endoscopy post-processing of helical CT data sets

    International Nuclear Information System (INIS)

    Dessl, A.; Giacomuzzi, S.M.; Springer, P.; Stoeger, A.; Pototschnig, C.; Voelklein, C.; Schreder, S.G.; Jaschke, W.

    1997-01-01

    Purpose: The purpose of this work was to test a newly developed post-processing software for virtual CT endoscopic methods. Virtual endoscopic images were generated from helical CT data sets in the region of the shoulder joint (n=2), the tracheobronchial system (n=3), the nasal sinuses (n=2), the colon (n=2), and the common carotid artery (n=1). Software developed specifically for virtual endoscopy ('Navigator') was used which, after prior threshold value selection, makes the reconstruction of internal body surfaces possible by an automatic segmentation process. We evaluated the usage of the software, the reconstruction time for individual images and sequences of images, as well as the quality of the reconstruction. All pathological findings of the virtual endoscopy were confirmed by surgery. Results: The post-processing program is easy to use and provides virtual endoscopic images within 50 seconds. Depending on the extent of the data set, virtual tracheobronchoscopy as a cine loop sequence required about 15 minutes. Through use of the threshold value-dependent surface reconstruction, the demands on the computer configuration are limited; however, this also created quality problems in image calculation as a consequence of the accompanying loss of data. Conclusions: The Navigator software enables the calculation of virtual endoscopic models with only moderate demands on the hardware. (orig.)

  3. Levels of processing: the evolution of a framework

    OpenAIRE

    Ekuni, Roberta; Vaz, Leonardo José; Bueno, Orlando Francisco Amodeo

    2011-01-01

    Although the levels of processing framework have evolved over its nearly 40 years of existence, the essence of the idea has not changed from the original. The original article published in 1972 suggests that in the encoding stage of a stimulus, there is a series of processing hierarchies ranging from the shallowest level (perceptual processing-the subject initially perceives the physical and sensory characteristics of the stimulus) to the deepest level (semantic processing-related to pattern ...

  4. The Processing Speed of Scene Categorization at Multiple Levels of Description: The Superordinate Advantage Revisited.

    Science.gov (United States)

    Banno, Hayaki; Saiki, Jun

    2015-03-01

    Recent studies have sought to determine which levels of categories are processed first in visual scene categorization and have shown that the natural and man-made superordinate-level categories are understood faster than are basic-level categories. The current study examined the robustness of the superordinate-level advantage in a visual scene categorization task. A go/no-go categorization task was evaluated with response time distribution analysis using an ex-Gaussian template. A visual scene was categorized at either the superordinate or the basic level, and two basic-level categories forming a superordinate category were judged as either similar or dissimilar to each other. First, outdoor/indoor and natural/man-made groupings were used as superordinate categories to investigate whether the advantage could be generalized beyond the natural/man-made boundary. Second, the set of images forming a superordinate category was manipulated. We predicted that decreasing image set similarity within the superordinate-level category would work against the speed advantage. We found that basic-level categorization was faster than outdoor/indoor categorization when the outdoor category comprised dissimilar basic-level categories. Our results indicate that the superordinate-level advantage in visual scene categorization is labile across different categories and category structures. © 2015 SAGE Publications.

  5. Level set segmentation of bovine corpora lutea in ex situ ovarian ultrasound images

    Directory of Open Access Journals (Sweden)

    Adams Gregg P

    2008-08-01

    Full Text Available Abstract Background The objective of this study was to investigate the viability of level set image segmentation methods for the detection of corpora lutea (corpus luteum, CL) boundaries in ultrasonographic ovarian images. It was hypothesized that bovine CL boundaries could be located within 1–2 mm by a level set image segmentation methodology. Methods Level set methods embed a 2D contour in a 3D surface and evolve that surface over time according to an image-dependent speed function. A speed function suitable for segmentation of CLs in ovarian ultrasound images was developed. An initial contour was manually placed and contour evolution was allowed to proceed until the rate of change of the area was sufficiently small. The method was tested on ovarian ultrasonographic images (n = 8) obtained ex situ. An expert in ovarian ultrasound interpretation delineated CL boundaries manually to serve as a "ground truth". Accuracy of the level set segmentation algorithm was determined by comparing semi-automatically determined contours with ground truth contours using the mean absolute difference (MAD), root mean squared difference (RMSD), Hausdorff distance (HD), sensitivity, and specificity metrics. Results and discussion The mean MAD was 0.87 mm (sigma = 0.36 mm), RMSD was 1.1 mm (sigma = 0.47 mm), and HD was 3.4 mm (sigma = 2.0 mm), indicating that, on average, boundaries were accurate within 1–2 mm; however, deviations in excess of 3 mm from the ground truth were observed, indicating under- or over-expansion of the contour. Mean sensitivity and specificity were 0.814 (sigma = 0.171) and 0.990 (sigma = 0.00786), respectively, indicating that CLs were consistently undersegmented but rarely did the contour interior include pixels that were judged by the human expert not to be part of the CL. It was observed that in localities where gradient magnitudes within the CL were strong due to high contrast speckle, contour expansion stopped too early. Conclusion The
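
    The contour accuracy metrics quoted above can be computed directly from two sampled boundaries. The sketch below uses one common convention (MAD and RMSD symmetrized over both contours, symmetric Hausdorff distance); the paper's exact definitions may differ.

```python
import numpy as np
from scipy.spatial.distance import cdist

def boundary_metrics(a, b):
    """a, b: (N, 2) and (M, 2) arrays of contour points (e.g. in mm)."""
    d = cdist(a, b)                       # all pairwise point distances
    d_ab = d.min(axis=1)                  # each point of a to nearest of b
    d_ba = d.min(axis=0)                  # each point of b to nearest of a
    both = np.concatenate([d_ab, d_ba])
    mad = both.mean()                     # mean absolute difference
    rmsd = np.sqrt(np.mean(both ** 2))    # root mean squared difference
    hd = max(d_ab.max(), d_ba.max())      # symmetric Hausdorff distance
    return mad, rmsd, hd
```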

  6. Level Set Projection Method for Incompressible Navier-Stokes on Arbitrary Boundaries

    KAUST Repository

    Williams-Rioux, Bertrand

    2012-01-12

    A second order level set projection method for the incompressible Navier-Stokes equations is proposed to solve flow around arbitrary geometries. We used a rectilinear grid with collocated cell-centered velocity and pressure. An explicit Godunov procedure is used to address the nonlinear advection terms, and an implicit Crank-Nicolson method to update viscous effects. An approximate pressure projection is implemented at the end of the time stepping using multigrid as a conventional fast iterative method. The level set method developed by Osher and Sethian [17] is implemented to address real momentum and pressure boundary conditions by the advection of a distance function, as proposed by Aslam [3]. Numerical results for the Strouhal number and drag coefficients validated the model with good accuracy for flow over a cylinder in the parallel shedding regime (47 < Re < 180). Simulations for an array of cylinders and an oscillating cylinder were performed, with the latter demonstrating our method's ability to handle dynamic boundary conditions.
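
    Schematically, the time step described above follows the classical fractional-step (projection) pattern; the simplified first-order splitting below omits the Godunov and Crank-Nicolson details of the actual scheme:

```latex
% Predictor: advance advection and viscosity to an intermediate velocity u*
\frac{\mathbf{u}^{*} - \mathbf{u}^{n}}{\Delta t}
  = -(\mathbf{u}^{n}\cdot\nabla)\mathbf{u}^{n} + \nu\,\nabla^{2}\mathbf{u}

% Projection: solve a pressure Poisson problem (here by multigrid) ...
\nabla^{2} p^{\,n+1} = \frac{1}{\Delta t}\,\nabla\cdot\mathbf{u}^{*}

% ... and correct the velocity to enforce incompressibility
\mathbf{u}^{n+1} = \mathbf{u}^{*} - \Delta t\,\nabla p^{\,n+1}

% The geometry is carried by advecting the signed distance function \phi
\frac{\partial \phi}{\partial t} + \mathbf{u}\cdot\nabla\phi = 0
```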

  7. Envelopes of Sets of Measures, Tightness, and Markov Control Processes

    International Nuclear Information System (INIS)

    Gonzalez-Hernandez, J.; Hernandez-Lerma, O.

    1999-01-01

    We introduce upper and lower envelopes for sets of measures on an arbitrary topological space, which are then used to give a tightness criterion. These concepts are applied to show the existence of optimal policies for a class of Markov control processes

  8. Identification of Arbitrary Zonation in Groundwater Parameters using the Level Set Method and a Parallel Genetic Algorithm

    Science.gov (United States)

    Lei, H.; Lu, Z.; Vesselinov, V. V.; Ye, M.

    2017-12-01

    Simultaneous identification of both the zonation structure of aquifer heterogeneity and the hydrogeological parameters associated with these zones is challenging, especially for complex subsurface heterogeneity fields. In this study, a new approach, based on the combination of the level set method and a parallel genetic algorithm is proposed. Starting with an initial guess for the zonation field (including both zonation structure and the hydraulic properties of each zone), the level set method ensures that material interfaces are evolved through the inverse process such that the total residual between the simulated and observed state variables (hydraulic head) always decreases, which means that the inversion result depends on the initial guess field and the minimization process might fail if it encounters a local minimum. To find the global minimum, the genetic algorithm (GA) is utilized to explore the parameters that define initial guess fields, and the minimal total residual corresponding to each initial guess field is considered as the fitness function value in the GA. Due to the expensive evaluation of the fitness function, a parallel GA is adapted in combination with a simulated annealing algorithm. The new approach has been applied to several synthetic cases in both steady-state and transient flow fields, including a case with real flow conditions at the chromium contaminant site at the Los Alamos National Laboratory. The results show that this approach is capable of identifying the arbitrary zonation structures of aquifer heterogeneity and the hydrogeological parameters associated with these zones effectively.
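
    The outer GA loop over initial-guess fields can be sketched as follows. The fitness function is a placeholder for the expensive level set inversion (which would run the flow simulator and return the minimal total head residual it reaches); the population size, encoding and operators are illustrative only, and the simulated annealing component is omitted.

```python
import random
from multiprocessing import Pool

def fitness(params):
    """Stub: run the level set inversion from the initial-guess field encoded
    by `params` and return the minimal total residual it reaches."""
    return sum((p - 0.5) ** 2 for p in params)  # placeholder objective

def evolve(pop_size=20, n_params=6, n_gen=30, seed=1):
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    with Pool() as pool:  # expensive fitness evaluations run in parallel
        for _ in range(n_gen):
            scores = pool.map(fitness, pop)
            ranked = [p for _, p in sorted(zip(scores, pop))]
            parents = ranked[: pop_size // 2]        # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, n_params)     # one-point crossover
                child = a[:cut] + b[cut:]
                i = rng.randrange(n_params)          # point mutation
                child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.1)))
                children.append(child)
            pop = parents + children
    return min(pop, key=fitness)

if __name__ == "__main__":
    print(evolve())
```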

  9. From face processing to face recognition: Comparing three different processing levels.

    Science.gov (United States)

    Besson, G; Barragan-Jason, G; Thorpe, S J; Fabre-Thorpe, M; Puma, S; Ceccaldi, M; Barbeau, E J

    2017-01-01

    Verifying that a face is from a target person (e.g. finding someone in the crowd) is a critical ability of the human face processing system. Yet how fast this can be performed is unknown. The 'entry-level shift due to expertise' hypothesis suggests that - since humans are face experts - processing faces should be as fast - or even faster - at the individual than at superordinate levels. In contrast, the 'superordinate advantage' hypothesis suggests that faces are processed from coarse to fine, so that the opposite pattern should be observed. To clarify this debate, three different face processing levels were compared: (1) a superordinate face categorization level (i.e. detecting human faces among animal faces), (2) a face familiarity level (i.e. recognizing famous faces among unfamiliar ones) and (3) verifying that a face is from a target person, our condition of interest. The minimal speed at which faces can be categorized (∼260ms) or recognized as familiar (∼360ms) has largely been documented in previous studies, and thus provides boundaries to compare our condition of interest to. Twenty-seven participants were included. The recent Speed and Accuracy Boosting procedure paradigm (SAB) was used since it constrains participants to use their fastest strategy. Stimuli were presented either upright or inverted. Results revealed that verifying that a face is from a target person (minimal RT at ∼260ms) was remarkably fast but longer than the face categorization level (∼240ms) and was more sensitive to face inversion. In contrast, it was much faster than recognizing a face as familiar (∼380ms), a level severely affected by face inversion. Face recognition corresponding to finding a specific person in a crowd thus appears achievable in only a quarter of a second. In favor of the 'superordinate advantage' hypothesis or coarse-to-fine account of the face visual hierarchy, these results suggest a graded engagement of the face processing system across processing

  10. Intermediate Levels of Visual Processing

    National Research Council Canada - National Science Library

    Nakayama, Ken

    1998-01-01

    ...) surface representation, here we have shown that there is an intermediate level of visual processing, between the analysis of the image and higher order representations related to specific objects; (2...

  11. E-learning process maturity level: a conceptual framework

    Science.gov (United States)

    Rahmah, A.; Santoso, H. B.; Hasibuan, Z. A.

    2018-03-01

    ICT advancement is a sure thing, with an impact influencing many domains, including learning in both formal and informal situations. It leads to a new mindset: we should not only utilize the given ICT to support the learning process, but also improve it gradually, involving many factors. This phenomenon is called e-learning process evolution. Accordingly, this study attempts to explore the maturity level concept to provide a direction for gradual improvement and progression monitoring for the individual e-learning process. An extensive literature review, observation, and construct formation were conducted to develop a conceptual framework for e-learning process maturity level. The conceptual framework consists of learner, e-learning process, continuous improvement, evolution of e-learning process, technology, and learning objectives, with the evolution of the e-learning process depicted as current versus expected conditions of e-learning process maturity level. The study concludes that the e-learning process maturity level conceptual framework may guide the evolution roadmap for the e-learning process, accelerate the evolution, and decrease the negative impact of ICT. The conceptual framework will be verified and tested in a future study.

  12. Features, Events, and Processes: system Level

    Energy Technology Data Exchange (ETDEWEB)

    D. McGregor

    2004-10-15

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760).

  13. Features, Events, and Processes: system Level

    International Nuclear Information System (INIS)

    D. McGregor

    2004-01-01

    The purpose of this analysis report is to evaluate and document the inclusion or exclusion of the system-level features, events, and processes (FEPs) with respect to modeling used to support the total system performance assessment for the license application (TSPA-LA). A screening decision, either Included or Excluded, is given for each FEP along with the technical basis for screening decisions. This information is required by the U.S. Nuclear Regulatory Commission (NRC) at 10 CFR 63.113 (d, e, and f) (DIRS 156605). The system-level FEPs addressed in this report typically are overarching in nature, rather than being focused on a particular process or subsystem. As a result, they are best dealt with at the system level rather than addressed within supporting process-level or subsystem-level analyses and models reports. The system-level FEPs also tend to be directly addressed by regulations, guidance documents, or assumptions listed in the regulations; or are addressed in background information used in development of the regulations. For included FEPs, this analysis summarizes the implementation of the FEP in the TSPA-LA (i.e., how the FEP is included). For excluded FEPs, this analysis provides the technical basis for exclusion from the TSPA-LA (i.e., why the FEP is excluded). The initial version of this report (Revision 00) was developed to support the total system performance assessment for site recommendation (TSPA-SR). This revision addresses the license application (LA) FEP List (DIRS 170760)

  14. Contextual control over task-set retrieval.

    Science.gov (United States)

    Crump, Matthew J C; Logan, Gordon D

    2010-11-01

    Contextual cues signaling task likelihood or the likelihood of task repetition are known to modulate the size of switch costs. We follow up on the finding by Leboe, Wong, Crump, and Stobbe (2008) that location cues predictive of the proportion of switch or repeat trials modulate switch costs. Their design employed one cue per task, whereas our experiment employed two cues per task, which allowed separate assessment of modulations to the cue-repetition benefit, a measure of lower level cue-encoding processes, and to the task-alternation cost, a measure of higher level processes representing task-set information. We demonstrate that location information predictive of switch proportion modulates performance at the level of task-set representations. Furthermore, we demonstrate that contextual control occurs even when subjects are unaware of the associations between context and switch likelihood. We discuss the notion that contextual information provides rapid, unconscious control over the extent to which prior task-set representations are retrieved in the service of guiding online performance.

  15. Processing approaches to cognition: the impetus from the levels-of-processing framework.

    Science.gov (United States)

    Roediger, Henry L; Gallo, David A; Geraci, Lisa

    2002-01-01

    Processing approaches to cognition have a long history, from act psychology to the present, but perhaps their greatest boost was given by the success and dominance of the levels-of-processing framework. We review the history of processing approaches, and explore the influence of the levels-of-processing approach, the procedural approach advocated by Paul Kolers, and the transfer-appropriate processing framework. Processing approaches emphasise the procedures of mind and the idea that memory storage can be usefully conceptualised as residing in the same neural units that originally processed information at the time of encoding. Processing approaches emphasise the unity and interrelatedness of cognitive processes and maintain that they can be dissected into separate faculties only by neglecting the richness of mental life. We end by pointing to future directions for processing approaches.

  16. Application of Fuzzy Sets in an Expert System For Technological Process Management

    Directory of Open Access Journals (Sweden)

    Filip Tošenovský

    2011-12-01

    Full Text Available The paper deals with the application of an expert system to the management of a process with one input and one output, using fuzzy set theory. It resolves the problem of formalizing a verbal description of the process management, coupled with the use of the process operator's experience. The procedure that calculates the regulatory intervention in the process is presented and accompanied by graphical illustrations.
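
    A minimal sketch of such a fuzzy controller for a one-input, one-output process is given below: two verbal rules with triangular membership functions and centroid defuzzification. The membership shapes and rule base are invented for illustration, not taken from the paper.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_control(error):
    """Map a process error to a corrective action via two verbal rules:
    IF error is negative THEN increase input; IF positive THEN decrease."""
    u = np.linspace(-1.0, 1.0, 201)              # universe of output actions
    neg = tri(error, -2.0, -1.0, 0.0)            # degree 'error is negative'
    pos = tri(error, 0.0, 1.0, 2.0)              # degree 'error is positive'
    # Mamdani inference: clip each rule's output set by its firing strength.
    agg = np.maximum(np.minimum(neg, tri(u, 0.0, 1.0, 2.0)),
                     np.minimum(pos, tri(u, -2.0, -1.0, 0.0)))
    if agg.sum() == 0.0:
        return 0.0
    return float((u * agg).sum() / agg.sum())    # centroid defuzzification

print(fuzzy_control(-0.5))   # process below setpoint -> positive correction
```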

  17. The process of care in integrative health care settings - a qualitative study of US practices.

    Science.gov (United States)

    Grant, Suzanne J; Bensoussan, Alan

    2014-10-23

    There is a lack of research on the organisational operations of integrative healthcare (IHC) practices. IHC is a therapeutic strategy integrating conventional and complementary medicine in a shared context to administer individualized treatment. To better understand the process of care in IHC - the way in which patients are triaged and treatment plans are constructed - interviews were conducted with integrative health care leaders and practitioners in the US. Semi-structured interviews were conducted with a pragmatic group of fourteen leaders and practitioners from nine different IHC settings. All interviews were conducted face-to-face with the exception of one phone interview. Questions focussed on understanding the "process of care" in an integrative healthcare setting. Deductive categories were formed from the aims of the study, focusing on: organisational structure, processes of care (subcategories: patient intake, treatment and charting, use of guidelines or protocols), prevalent diseases or conditions treated, and the role of research in the organisation. The similarities and differences of the IHC entities emerged from this process. On an organisational level, conventional and CM services and therapies were co-located in all nine settings. For patients, this means there is more opportunity for 'seamless care'. Shared information systems enabled easy communication using internal messaging or email systems, and shared patient intake information. But beyond this infrastructure, alignment for integrative health care was less supported. There was no use of protocols or guidelines within any centre, and no patient monitoring mechanism beyond that which occurred within one-on-one appointments. Joint planning for a patient treatment was typically ad hoc through informal mechanisms. Additional duties typically come at a direct financial cost to fee-for-service practitioners. In contrast, service delivery and the process of care within hospital inpatient services followed

  18. Goal-setting in clinical medicine.

    Science.gov (United States)

    Bradley, E H; Bogardus, S T; Tinetti, M E; Inouye, S K

    1999-07-01

    The process of setting goals for medical care in the context of chronic disease has received little attention in the medical literature, despite the importance of goal-setting in the achievement of desired outcomes. Using qualitative research methods, this paper develops a theory of goal-setting in the care of patients with dementia. The theory posits several propositions. First, goals are generated from embedded values but are distinct from values. Goals vary based on specific circumstances and alternatives whereas values are person-specific and relatively stable in the face of changing circumstances. Second, goals are hierarchical in nature, with complex mappings between general and specific goals. Third, there are a number of factors that modify the goal-setting process, by affecting the generation of goals from values or the translation of general goals to specific goals. Modifying factors related to individuals include their degree of risk-taking, perceived self-efficacy, and acceptance of the disease. Disease factors that modify the goal-setting process include the urgency and irreversibility of the medical condition. Pertinent characteristics of the patient-family-clinician interaction include the level of participation, control, and trust among patients, family members, and clinicians. The research suggests that the goal-setting process in clinical medicine is complex, and the potential for disagreements regarding goals substantial. The nature of the goal-setting process suggests that explicit discussion of goals for care may be necessary to promote effective patient-family-clinician communication and adequate care planning.

  19. Low-level radioactive waste disposal technology development through a public process

    International Nuclear Information System (INIS)

    Murphy, M.P.; Hysong, R.J.; Edwards, C.W.

    1989-01-01

    When Pennsylvania's legislature ratified the Appalachian States Low-Level Radioactive Waste Compact in 1985, the Commonwealth of Pennsylvania became the host state designee for the compact's low-level radioactive waste (LLRW) disposal facility. Programs necessary for the establishment of this facility became the responsibility of the Department of Environmental Resources' (DER) Bureau of Radiation Protection, Division of Nuclear Safety (DNS). It was realized early in the process that the technical aspects of this program, while challenging, probably were not the largest obstacle to completing the facility on schedule. The largest obstacle was likely to be public acceptance. Recognizing this, the DNS set out to develop a program that would maximize public involvement in all aspects of site and facility development. To facilitate public involvement in the process, the DNS established a LLRW advisory committee and a strategy for holding public meetings throughout Pennsylvania. As a result of the significant public involvement generated by these efforts, Pennsylvania passed, in February of 1988, one of the most stringent and technically demanding LLRW disposal laws in the nation. Hopefully, increased public confidence will reduce to a minimum public opposition to the facility

  20. Some free boundary problems in potential flow regime using a level set based method

    Energy Technology Data Exchange (ETDEWEB)

    Garzon, M.; Bobillo-Ares, N.; Sethian, J.A.

    2008-12-09

    Recent advances in the field of fluid mechanics with moving fronts are linked to the use of level set methods, a versatile mathematical technique to follow free boundaries which undergo topological changes. A challenging class of problems in this context are those related to the solution of a partial differential equation posed on a moving domain, in which the boundary condition for the PDE solver has to be obtained from a partial differential equation defined on the front. This is the case for potential flow models with moving boundaries. Moreover, the fluid front will possibly be carrying some material substance which will diffuse in the front and be advected by the front velocity, as for example with the use of surfactants to lower surface tension. We present a level set based methodology to embed these partial differential equations defined on the front in a complete Eulerian framework, fully avoiding the tracking of fluid particles and its known limitations. To show the advantages of this approach in the field of fluid mechanics, we present in this work one particular application: the numerical approximation of a potential flow model to simulate the evolution and breaking of a solitary wave propagating over a sloping bottom, and compare the level set based algorithm with previous front tracking models.

  1. High-level waste tank farm set point document

    International Nuclear Information System (INIS)

    Anthony, J.A. III.

    1995-01-01

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope

  2. High-level waste tank farm set point document

    Energy Technology Data Exchange (ETDEWEB)

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  3. Reconstruction of incomplete cell paths through a 3D-2D level set segmentation

    Science.gov (United States)

    Hariri, Maia; Wan, Justin W. L.

    2012-02-01

    Segmentation of fluorescent cell images has been a popular technique for tracking live cells. One challenge of segmenting cells from fluorescence microscopy is that cells in fluorescent images frequently disappear. When the images are stacked together to form a 3D image volume, the disappearance of the cells leads to broken cell paths. In this paper, we present a segmentation method that can reconstruct incomplete cell paths. The key idea of this model is to perform 2D segmentation in a 3D framework. The 2D segmentation captures the cells that appear in the image slices while the 3D segmentation connects the broken cell paths. The formulation is similar to the Chan-Vese level set segmentation which detects edges by comparing the intensity value at each voxel with the mean intensity values inside and outside of the level set surface. Our model, however, performs the comparison on each 2D slice with the means calculated by the 2D projected contour. The resulting effect is to segment the cells on each image slice. Unlike segmentation on each image frame individually, these 2D contours together form the 3D level set function. By enforcing minimum mean curvature on the level set surface, our segmentation model is able to extend the cell contours right before (and after) the cell disappears (and reappears) into the gaps, eventually connecting the broken paths. We will present segmentation results of C2C12 cells in fluorescent images to illustrate the effectiveness of our model qualitatively and quantitatively by different numerical examples.
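
    For reference, the classical Chan-Vese energy that this comparison implements is, for an image I and contour C:

```latex
E(C, c_1, c_2) = \mu\,\mathrm{Length}(C)
  + \lambda_1 \int_{\mathrm{inside}(C)} \lvert I(\mathbf{x}) - c_1 \rvert^2 \, d\mathbf{x}
  + \lambda_2 \int_{\mathrm{outside}(C)} \lvert I(\mathbf{x}) - c_2 \rvert^2 \, d\mathbf{x}
```

    In the model above, the means c_1 and c_2 are recomputed slice by slice from the projected 2D contour rather than over the full 3D volume.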

  4. Implications of sea-level rise in a modern carbonate ramp setting

    Science.gov (United States)

    Lokier, Stephen W.; Court, Wesley M.; Onuma, Takumi; Paul, Andreas

    2018-03-01

    This study addresses a gap in our understanding of the effects of sea-level rise on the sedimentary systems and morphological development of recent and ancient carbonate ramp settings. Many ancient carbonate sequences are interpreted as having been deposited in carbonate ramp settings. These settings are poorly-represented in the Recent. The study documents the present-day transgressive flooding of the Abu Dhabi coastline at the southern shoreline of the Arabian/Persian Gulf, a carbonate ramp depositional system that is widely employed as a Recent analogue for numerous ancient carbonate systems. Fourteen years of field-based observations are integrated with historical and recent high-resolution satellite imagery in order to document and assess the onset of flooding. Predicted rates of transgression (i.e. landward movement of the shoreline) of 2.5 m yr⁻¹ (± 0.2 m yr⁻¹) based on global sea-level rise alone were far exceeded by the flooding rate calculated from the back-stepping of coastal features (10–29 m yr⁻¹). This discrepancy results from the dynamic nature of the flooding, with increased water depth exposing the coastline to increased erosion and, thereby, enhancing back-stepping. A non-accretionary transgressive shoreline trajectory results from relatively rapid sea-level rise coupled with a low-angle ramp geometry and a paucity of sediments. The flooding is represented by the landward migration of facies belts, a range of erosive features and the onset of bioturbation. Employing Intergovernmental Panel on Climate Change (Church et al., 2013) predictions for 21st century sea-level rise, and allowing for the post-flooding lag time that is typical for the start-up of carbonate factories, it is calculated that the coastline will continue to retrograde for the foreseeable future. Total passive flooding (without considering feedback in the modification of the shoreline) by the year 2100 is calculated to likely be between 340 and 571 m with a flooding rate of 3

  5. Examining the Level of Convergence among Self-Regulated Learning Microanalytic Processes, Achievement, and a Self-Report Questionnaire

    Science.gov (United States)

    Cleary, Timothy J.; Callan, Gregory L.; Malatesta, Jaime; Adams, Tanya

    2015-01-01

    This study examined the convergent and predictive validity of self-regulated learning (SRL) microanalytic measures. Specifically, theoretically based relations among a set of self-reflection processes, self-efficacy, and achievement were examined as was the level of convergence between a microanalytic strategy measure and a SRL self-report…

  6. Influence of different process settings conditions on the accuracy of micro injection molding simulations: an experimental validation

    DEFF Research Database (Denmark)

    Tosello, Guido; Gava, Alberto; Hansen, Hans Nørgaard

    2009-01-01

    Currently available software packages exhibit poor accuracy of results when performing micro injection molding (µIM) simulations. However, with an appropriate set-up of the processing conditions, the quality of results can be improved. The effects on the simulation results of different and alternative...... process conditions are investigated, namely the nominal injection speed, as well as the cavity filling time and the evolution of the cavity injection pressure as experimental data. In addition, the sensitivity of the results to the quality of the rheological data is analyzed. Simulated results...... are compared with experiments in terms of flow front position at part and micro-feature levels, as well as cavity injection filling time measurements.

  7. Robust boundary detection of left ventricles on ultrasound images using ASM-level set method.

    Science.gov (United States)

    Zhang, Yaonan; Gao, Yuan; Li, Hong; Teng, Yueyang; Kang, Yan

    2015-01-01

    The level set method has been widely used in medical image analysis, but it has difficulties when used to segment left ventricular (LV) boundaries on echocardiography images because the boundaries are not very distinct and the signal-to-noise ratio of echocardiography images is not very high. In this paper, we introduce the Active Shape Model (ASM) into the traditional level set method to enforce shape constraints. It improves the accuracy of boundary detection and makes the evolution more efficient. The experiments conducted on real cardiac ultrasound image sequences show a positive and promising result.

  8. A highly efficient 3D level-set grain growth algorithm tailored for ccNUMA architecture

    Science.gov (United States)

    Mießen, C.; Velinov, N.; Gottstein, G.; Barrales-Mora, L. A.

    2017-12-01

    A highly efficient simulation model for 2D and 3D grain growth was developed based on the level-set method. The model introduces modern computational concepts to achieve excellent performance on parallel computer architectures. Strong scalability was measured on cache-coherent non-uniform memory access (ccNUMA) architectures. To achieve this, the proposed approach considers the application of local level-set functions at the grain level. Ideal and non-ideal grain growth was simulated in 3D with the objective to study the evolution of statistical representative volume elements in polycrystals. In addition, microstructure evolution in an anisotropic magnetic material affected by an external magnetic field was simulated.

  9. Goal setting with mothers in child development services.

    Science.gov (United States)

    Forsingdal, S; St John, W; Miller, V; Harvey, A; Wearne, P

    2014-07-01

    The aim of this grounded theory study was to explore mothers' perspectives of the processes of collaborative goal setting in multidisciplinary child development services involving follow-up home therapy. Semi-structured interviews were conducted in South East Queensland, Australia with 14 mothers of children aged 3-6 years who were accessing multidisciplinary child development services. Interviews were focussed around the process of goal setting. A grounded theory of Maternal Roles in Goal Setting (The M-RIGS Model) was developed from analysis of data. Mothers assumed Dependent, Active Participator and Collaborator roles when engaging with the therapist in goal-setting processes. These roles were characterized by the mother's level of dependence on the therapist and insight into their child's needs and therapy processes. Goal Factors, Parent Factors and Therapist Factors influenced and added complexity to the goal-setting process. The M-RIGS Model highlights that mothers take on a range of roles in the goal-setting process. Although family-centred practice encourages negotiation and collaborative goal setting, parents may not always be ready to take on highly collaborative roles. Better understanding of parent roles, goal-setting processes and influencing factors will inform better engagement with families accessing multidisciplinary child development services. © 2013 John Wiley & Sons Ltd.

  10. Topology optimization in acoustics and elasto-acoustics via a level-set method

    Science.gov (United States)

    Desai, J.; Faure, A.; Michailidis, G.; Parry, G.; Estevez, R.

    2018-04-01

    Optimizing the shape and topology (S&T) of structures to improve their acoustic performance is quite challenging. The exact position of the structural boundary is usually of critical importance, which dictates the use of geometric methods for topology optimization instead of standard density approaches. The goal of the present work is to investigate different possibilities for handling topology optimization problems in acoustics and elasto-acoustics via a level-set method. From a theoretical point of view, we detail two equivalent ways to perform the derivation of surface-dependent terms and propose a smoothing technique for treating problems of boundary conditions optimization. In the numerical part, we examine the importance of the surface-dependent term in the shape derivative, neglected in previous studies found in the literature, on the optimal designs. Moreover, we test different mesh adaptation choices, as well as technical details related to the implicit surface definition in the level-set approach. We present results in two and three-space dimensions.

  11. Setting research priorities by applying the combined approach matrix.

    Science.gov (United States)

    Ghaffar, Abdul

    2009-04-01

    Priority setting in health research is a dynamic process. Different organizations and institutes have been working in the field of research priority setting for many years. In 1999 the Global Forum for Health Research presented a research priority setting tool called the Combined Approach Matrix or CAM. Since its development, the CAM has been successfully applied to set research priorities for diseases, conditions and programmes at global, regional and national levels. This paper briefly explains the CAM methodology and how it could be applied in different settings, giving examples, describing challenges encountered in the process of setting research priorities and providing recommendations for further work in this field. The construct and design of the CAM are explained, along with the different steps needed, including the planning and organization of a priority-setting exercise and how it could be applied in different settings. The application of the CAM is described using three examples. The first concerns setting research priorities for a global programme, the second describes application at the country level and the third setting research priorities for diseases. Effective application of the CAM in different and diverse environments proves its utility as a tool for setting research priorities. Potential challenges encountered in the process of research priority setting are discussed and some recommendations for further work in this field are provided.

  12. Setting healthcare priorities: a description and evaluation of the budgeting and planning process in county hospitals in Kenya.

    Science.gov (United States)

    Barasa, Edwine W; Cleary, Susan; Molyneux, Sassy; English, Mike

    2017-04-01

    This paper describes and evaluates the budgeting and planning processes in public hospitals in Kenya. We used a qualitative case study approach to examine these processes in two hospitals in Kenya. We collected data through in-depth interviews of national-level policy makers, hospital managers, and frontline practitioners in the case study hospitals (n = 72), a review of documents, and non-participant observations within the hospitals over a 7-month period. We applied an evaluative framework that considers both consequentialist and proceduralist conditions as important to the quality of priority-setting processes. The budgeting and planning process in the case study hospitals was characterized by lack of alignment, inadequate role clarity and the use of informal priority-setting criteria. With regard to consequentialist conditions, the hospitals incorporated economic criteria by considering the affordability of alternatives, but rarely considered the equity of allocative decisions. In the first hospital, stakeholders were aware of - and somewhat satisfied with - the budgeting and planning process, while in the second hospital they were not. Decision making in both hospitals did not result in reallocation of resources. With regard to proceduralist conditions, the budgeting and planning process in the first hospital was more inclusive and transparent, with the stakeholders more empowered compared to the second hospital. In both hospitals, decisions were not based on evidence, implementation of decisions was poor and the community was not included. There were no mechanisms for appeals or to ensure that the proceduralist conditions were met in both hospitals. Public hospitals in Kenya could improve their budgeting and planning processes by harmonizing these processes, improving role clarity, using explicit priority-setting criteria, and by incorporating both consequentialist (efficiency, equity, stakeholder satisfaction and understanding, shifted priorities) and proceduralist conditions.

  13. [Cardiac Synchronization Function Estimation Based on ASM Level Set Segmentation Method].

    Science.gov (United States)

    Zhang, Yaonan; Gao, Yuan; Tang, Liang; He, Ying; Zhang, Huie

    At present, there are no accurate, quantitative methods for determining cardiac mechanical synchrony, and quantitative determination of the synchronization function of the four cardiac cavities from medical images has great clinical value. This paper uses the whole-heart ultrasound image sequence and segments the left and right atria and left and right ventricles in each frame. After segmentation, the number of pixels in each cavity in each frame is recorded, and the areas of the four cavities across the image sequence are thereby obtained. The area-change curves of the four cavities are then extracted, yielding the synchronization information of the four cavities. Because of the low SNR of ultrasound images, the boundary lines of the cardiac cavities are vague, so the extraction of cardiac contours remains a challenging problem. Therefore, ASM model information is added to the traditional level set method to guide the curve evolution process. According to the experimental results, the improved method increases the accuracy of the segmentation. Furthermore, based on the ventricular segmentation, the right and left ventricular systolic functions are evaluated, mainly according to the area changes. The synchronization of the four cavities of the heart is estimated based on the area changes and the volume changes.
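
    As an illustration of the area-curve step described above, the sketch below computes per-frame cavity areas by counting mask pixels and derives a crude synchrony score from the correlation of the cavities' area changes. It is a minimal sketch under stated assumptions: the function names and the mask layout (one binary mask per cavity per frame) are hypothetical, and the ASM-constrained level set segmentation itself is not reproduced.

    ```python
    import numpy as np

    def chamber_area_curves(masks):
        """Per-frame area curves from segmentation masks.

        masks: boolean array of shape (n_frames, 4, H, W), one mask per
               cardiac cavity (LA, RA, LV, RV) for every frame.
        Returns an (n_frames, 4) array of pixel counts (areas).
        """
        return masks.reshape(masks.shape[0], 4, -1).sum(axis=2)

    def synchrony_index(curves):
        """Crude synchrony estimate: mean pairwise correlation of the
        z-scored frame-to-frame area changes of the four cavities."""
        d = np.diff(curves.astype(float), axis=0)     # area change per frame
        d = (d - d.mean(axis=0)) / (d.std(axis=0) + 1e-12)
        corr = np.corrcoef(d.T)                       # 4 x 4 correlation matrix
        iu = np.triu_indices(4, k=1)                  # the 6 distinct pairs
        return corr[iu].mean()
    ```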

  14. Trusting Politicians and Institutions in a Multi-Level Setting

    DEFF Research Database (Denmark)

    Hansen, Sune Welling; Kjær, Ulrik

    Trust in government and in politicians is a crucial prerequisite for democratic processes. This goes not only for the national level of government but also for the regional and local levels. We make use of a large-scale survey among citizens in Denmark to evaluate trust in politicians at different...... formation processes can negatively influence trust in the mayor and the councilors. Reaching out for local power by being disloyal to one's own party or by breaking deals already made can sometimes secure the mayoralty, but it comes with a price: lower trust among the electorate....

  15. An Accurate Fire-Spread Algorithm in the Weather Research and Forecasting Model Using the Level-Set Method

    Science.gov (United States)

    Muñoz-Esparza, Domingo; Kosović, Branko; Jiménez, Pedro A.; Coen, Janice L.

    2018-04-01

    The level-set method is typically used to track and propagate the fire perimeter in wildland fire models. Herein, a high-order level-set method using a fifth-order WENO scheme for the discretization of spatial derivatives and third-order explicit Runge-Kutta temporal integration is implemented within the Weather Research and Forecasting model wildland fire physics package, WRF-Fire. The algorithm includes solution of an additional partial differential equation for level-set reinitialization. The accuracy of the fire-front shape and rate of spread in uncoupled simulations is systematically analyzed. It is demonstrated that the common implementation used by level-set-based wildfire models yields rate-of-spread errors in the range 10-35% for typical grid sizes (Δ = 12.5-100 m) and considerably underestimates fire area. Moreover, the amplitude of fire-front gradients in the presence of explicitly resolved turbulence features is systematically underestimated. In contrast, the new WRF-Fire algorithm results in rate-of-spread errors that are lower than 1% and that become nearly grid independent. Also, the underestimation of fire area at the sharp transition between the fire front and the lateral flanks is found to be reduced by a factor of ≈7. A hybrid-order level-set method with locally reduced artificial viscosity is proposed, which substantially alleviates the computational cost associated with high-order discretizations while preserving accuracy. Simulations of the Last Chance wildfire demonstrate additional benefits of high-order accurate level-set algorithms when dealing with complex fuel heterogeneities, enabling propagation across narrow fuel gaps and more accurate fire backing over the lee side of no-fuel clusters.
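
    To make the time-integration skeleton concrete, here is a minimal sketch of one third-order TVD Runge-Kutta step for the level-set equation φ_t + s|∇φ| = 0, with a first-order Godunov upwind discretization standing in for the paper's fifth-order WENO scheme (WENO reconstruction and the reinitialization PDE are omitted for brevity; the function names are illustrative).

    ```python
    import numpy as np

    def grad_norm_upwind(phi, dx):
        """|grad(phi)| by first-order Godunov upwinding for an expanding
        front (spread rate s > 0). The paper uses fifth-order WENO here;
        simple upwind differences keep this sketch short."""
        dmx = (phi - np.roll(phi, 1, axis=0)) / dx    # backward difference, x
        dpx = (np.roll(phi, -1, axis=0) - phi) / dx   # forward difference, x
        dmy = (phi - np.roll(phi, 1, axis=1)) / dx
        dpy = (np.roll(phi, -1, axis=1) - phi) / dx
        gx2 = np.maximum(np.maximum(dmx, 0.0)**2, np.minimum(dpx, 0.0)**2)
        gy2 = np.maximum(np.maximum(dmy, 0.0)**2, np.minimum(dpy, 0.0)**2)
        return np.sqrt(gx2 + gy2)

    def rk3_step(phi, speed, dt, dx):
        """One third-order TVD Runge-Kutta step of phi_t + s*|grad(phi)| = 0
        (here phi < 0 is taken as the burnt region, speed as the spread rate)."""
        L = lambda p: -speed * grad_norm_upwind(p, dx)
        phi1 = phi + dt * L(phi)
        phi2 = 0.75 * phi + 0.25 * (phi1 + dt * L(phi1))
        return phi / 3.0 + 2.0 / 3.0 * (phi2 + dt * L(phi2))
    ```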

  16. Dynamic-thresholding level set: a novel computer-aided volumetry method for liver tumors in hepatic CT images

    Science.gov (United States)

    Cai, Wenli; Yoshida, Hiroyuki; Harris, Gordon J.

    2007-03-01

    Measurement of the volume of focal liver tumors, called liver tumor volumetry, is indispensable for assessing the growth of tumors and for monitoring the response of tumors to oncology treatments. Traditional edge models, such as the maximum gradient and zero-crossing methods, often fail to detect the accurate boundary of a fuzzy object such as a liver tumor. As a result, computerized volumetry based on these edge models tends to differ from manual segmentation results performed by physicians. In this study, we developed a novel computerized volumetry method for fuzzy objects, called dynamic-thresholding level set (DT level set). An optimal threshold value computed from a histogram tends to shift, relative to the theoretical threshold value obtained from a normal distribution model, toward a smaller region in the histogram. We thus designed a mobile shell structure, called a propagating shell, which is a thick region encompassing the level set front. The optimal threshold calculated from the histogram of the shell drives the level set front toward the boundary of a liver tumor. When the volume ratio between the object and the background in the shell approaches one, the optimal threshold value best fits the theoretical threshold value and the shell stops propagating. Application of the DT level set to 26 hepatic CT cases with 63 biopsy-confirmed hepatocellular carcinomas (HCCs) and metastases showed that the computer-measured volumes were highly correlated with those of tumors measured manually by physicians. Our preliminary results showed that the DT level set was effective and accurate in estimating the volumes of liver tumors detected in hepatic CT images.
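
    The propagating-shell idea can be sketched in a few lines: restrict the histogram to a band around the current front, take an optimal threshold from it, and monitor the object/background ratio as a stopping criterion. This is a simplified stand-in, assuming a signed-distance level set phi and using Otsu's method from scikit-image as the "optimal threshold" (the paper's exact threshold estimator may differ); the function names are hypothetical.

    ```python
    import numpy as np
    from skimage.filters import threshold_otsu

    def shell_threshold(image, phi, width=3.0):
        """Optimal threshold computed only inside the 'propagating shell':
        a band of +/- width voxels around the zero level set of phi
        (phi assumed to be a signed distance function)."""
        shell = np.abs(phi) <= width
        return threshold_otsu(image[shell])

    def shell_volume_ratio(image, phi, thresh, width=3.0):
        """Object/background voxel ratio inside the shell; propagation
        stops when this ratio approaches one."""
        shell = np.abs(phi) <= width
        obj = np.count_nonzero(image[shell] >= thresh)
        bkg = np.count_nonzero(image[shell] < thresh)
        return obj / max(bkg, 1)
    ```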

  17. Audiovisual speech perception development at varying levels of perceptual processing.

    Science.gov (United States)

    Lalonde, Kaylah; Holt, Rachael Frush

    2016-04-01

    This study used the auditory evaluation framework [Erber (1982). Auditory Training (Alexander Graham Bell Association, Washington, DC)] to characterize the influence of visual speech on audiovisual (AV) speech perception in adults and children at multiple levels of perceptual processing. Six- to eight-year-old children and adults completed auditory and AV speech perception tasks at three levels of perceptual processing (detection, discrimination, and recognition). The tasks differed in the level of perceptual processing required to complete them. Adults and children demonstrated visual speech influence at all levels of perceptual processing. Whereas children demonstrated the same visual speech influence at each level of perceptual processing, adults demonstrated greater visual speech influence on tasks requiring higher levels of perceptual processing. These results support previous research demonstrating multiple mechanisms of AV speech processing (general perceptual and speech-specific mechanisms) with independent maturational time courses. The results suggest that adults rely on both general perceptual mechanisms that apply to all levels of perceptual processing and speech-specific mechanisms that apply when making phonetic decisions and/or accessing the lexicon. Six- to eight-year-old children seem to rely only on general perceptual mechanisms across levels. As expected, developmental differences in AV benefit on this and other recognition tasks likely reflect immature speech-specific mechanisms and phonetic processing in children.

  18. Electrochemical processing of low-level waste solutions

    International Nuclear Information System (INIS)

    Hobbs, D.T.; Ebra, M.A.

    1987-01-01

    The feasibility of treating low-level Savannah River Plant (SRP) waste solutions by an electrolytic process has been demonstrated. Although the economics of the process are marginal at the current densities investigated at the laboratory scale, there are a number of positive environmental benefits. These benefits include: (1) reduction in the levels of nitrate and nitrite in the waste, (2) further decontamination of ⁹⁹Tc and ¹⁰⁶Ru, and (3) reduction in the volume of waste.

  19. Determinants of corporate lobbying intensity in the lease standard-setting process

    Directory of Open Access Journals (Sweden)

    Lucia Mellado

    2017-07-01

    The highly controversial lease standard-setting project that will replace the standards currently in place establishes a new approach that includes the recognition of all assets and liabilities associated with lease contracts on the balance sheet, regardless of their classification. The complex standard-setting process and the heated debate among stakeholders make the analysis of the lobbying phenomenon an important matter for study. The aim of this paper is to find explanatory factors that predict the behaviour of corporate groups with respect to the lease standard-setting process. To achieve this objective, we scrutinize the submission of comment letters by 306 non-financial listed companies in response to the discussion paper (DP 2009) and two exposure drafts (ED 2010 and ED 2013) elaborated jointly by the IASB and the FASB, distinguishing among three degrees of intensity in lobbying activities depending on participation in the different discussion periods. Our empirical study is conducted through a multivariate analysis that captures the intensity of lobbying by considering participation in the three consultation periods. The results show that the intensity of lobbying is associated with size, profitability, age, industry and managerial ownership. The evidence can be used to predict lobbying behaviour. The research has implications for standard setters and contributes to prior lobbying research.

  20. Cascading activation from lexical processing to letter-level processing in written word production.

    Science.gov (United States)

    Buchwald, Adam; Falconer, Carolyn

    2014-01-01

    Descriptions of language production have identified processes involved in producing language and the presence and type of interaction among those processes. In the case of spoken language production, consensus has emerged that there is interaction among lexical selection processes and phoneme-level processing. This issue has received less attention in written language production. In this paper, we present a novel analysis of the writing-to-dictation performance of an individual with acquired dysgraphia revealing cascading activation from lexical processing to letter-level processing. The individual produced frequent lexical-semantic errors (e.g., chipmunk → SQUIRREL) as well as letter errors (e.g., inhibit → INBHITI) and had a profile consistent with impairment affecting both lexical processing and letter-level processing. The presence of cascading activation is suggested by lower letter accuracy on words that are more weakly activated during lexical selection than on those that are more strongly activated. We operationalize weakly activated lexemes as those lexemes that are produced as lexical-semantic errors (e.g., lethal in deadly → LETAHL) compared to strongly activated lexemes where the intended target word (e.g., lethal) is the lexeme selected for production.

  1. GSHR, a Web-Based Platform Provides Gene Set-Level Analyses of Hormone Responses in Arabidopsis

    Directory of Open Access Journals (Sweden)

    Xiaojuan Ran

    2018-01-01

    Phytohormones regulate diverse aspects of plant growth and environmental responses. Recent high-throughput technologies have enabled more comprehensive profiling of genes regulated by different hormones. However, these omics data generally result in large gene lists that make it challenging to interpret the data and extract insights into biological significance. With the rapid accumulation of these large-scale experiments, especially the transcriptomic data available in public databases, a means of using this information to explore transcriptional networks is needed. Different platforms have different architectures and designs, and even similar studies using the same platform may obtain data with large variances because of the highly dynamic and flexible effects of plant hormones; this makes it difficult to make comparisons across different studies and platforms. Here, we present a web server providing gene set-level analyses of Arabidopsis thaliana hormone responses. GSHR collected 333 RNA-seq and 1,205 microarray datasets from the Gene Expression Omnibus, characterizing transcriptomic changes in Arabidopsis in response to phytohormones including abscisic acid, auxin, brassinosteroids, cytokinins, ethylene, gibberellins, jasmonic acid, salicylic acid, and strigolactones. These data were further processed and organized into 1,368 gene sets regulated by different hormones or hormone-related factors. By comparing input gene lists to these gene sets, GSHR helps identify gene sets from the input list regulated by different phytohormones or related factors. Together, GSHR links prior information on transcriptomic changes induced by hormones and related factors to newly generated data and facilitates cross-study and cross-platform comparisons, helping to mine biologically significant information from large-scale datasets. GSHR is freely available at http://bioinfo.sibs.ac.cn/GSHR/.

  2. Improved inhalation technology for setting safe exposure levels for workplace chemicals

    Science.gov (United States)

    Stuart, Bruce O.

    1993-01-01

    Threshold Limit Values recommended as allowable air concentrations of a chemical in the workplace are often based upon a no-observable-effect level (NOEL) determined by experimental inhalation studies using rodents. A 'safe level' for human exposure must then be estimated by the use of generalized safety factors in attempts to extrapolate from experimental rodents to man. The recent development of chemical-specific physiologically based toxicokinetics makes use of measured physiological, biochemical, and metabolic parameters to construct a validated model that is able to 'scale up' rodent response data to predict the behavior of the chemical in man. This procedure is made possible by recent advances in personal computer software and the emergence of appropriate biological data, and provides an analytical tool for much more reliable risk evaluation and for setting airborne chemical exposure levels for humans.

  3. Computerized detection of multiple sclerosis candidate regions based on a level set method using an artificial neural network

    International Nuclear Information System (INIS)

    Kuwazuru, Junpei; Magome, Taiki; Arimura, Hidetaka; Yamashita, Yasuo; Oki, Masafumi; Toyofuku, Fukai; Kakeda, Shingo; Yamamoto, Daisuke

    2010-01-01

    Yamamoto et al. developed a system for computer-aided detection of multiple sclerosis (MS) candidate regions. In the level set method employed in their approach, a constant threshold value was used for the edge indicator function related to the speed function of the level set method. However, it would be appropriate to adjust the threshold value for each MS candidate region, because the edge magnitudes of MS candidates differ from each other. The purpose of this study was to develop a computerized detection method for MS candidate regions in MR images based on a level set method using an artificial neural network (ANN). To adjust the threshold value of the edge indicator function in the level set method to each true positive (TP) and false positive (FP) region, we constructed an ANN. The ANN provides a suitable threshold value for each candidate region in the proposed level set method, so that TP regions can be segmented and FP regions removed. The proposed method detected MS regions at a sensitivity of 82.1% with 0.204 FPs per slice, and the similarity index of MS candidate regions was 0.717 on average. (author)
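
    For concreteness, a common form of the edge indicator function used in level-set speed terms is g = 1/(1 + (|∇(G_σ*I)|/k)²); the sketch below exposes the threshold-like constant k as an argument, which is the quantity the ANN in this work predicts per candidate region instead of holding fixed. The function name and defaults are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import ndimage

    def edge_indicator(image, k, sigma=1.0):
        """Edge indicator g = 1 / (1 + (|grad(G_sigma * I)| / k)^2).

        A small g slows level-set propagation near strong edges; the
        constant k controls what counts as 'strong' and is the value
        adapted per candidate region in the paper."""
        smoothed = ndimage.gaussian_filter(image.astype(float), sigma)
        gy, gx = np.gradient(smoothed)
        grad_mag = np.hypot(gx, gy)
        return 1.0 / (1.0 + (grad_mag / k) ** 2)
    ```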

  4. Kir2.1 channels set two levels of resting membrane potential with inward rectification.

    Science.gov (United States)

    Chen, Kuihao; Zuo, Dongchuan; Liu, Zheng; Chen, Haijun

    2018-04-01

    Strong inward rectifier K⁺ channels (Kir2.1) mediate background K⁺ currents primarily responsible for maintenance of the resting membrane potential. Multiple types of cells exhibit two levels of resting membrane potential. Kir2.1 and K2P1 currents counterbalance, partially accounting for this phenomenon in human cardiomyocytes under subphysiological extracellular K⁺ concentrations or pathological hypokalemic conditions. The mechanism by which Kir2.1 channels contribute to the two levels of resting membrane potential in different types of cells is not well understood. Here we test the hypothesis that Kir2.1 channels set two levels of resting membrane potential with inward rectification. Under hypokalemic conditions, Kir2.1 currents counterbalance HCN2 or HCN4 cation currents in CHO cells that heterologously express both channels, generating N-shaped current-voltage relationships that cross the voltage axis three times and reconstituting two levels of resting membrane potential. Blockade of HCN channels eliminated the phenomenon in K2P1-deficient Kir2.1-expressing human cardiomyocytes derived from induced pluripotent stem cells or in CHO cells expressing both Kir2.1 and HCN2 channels. Weakly inward-rectifying Kir4.1 channels or inward-rectification-deficient Kir2.1•E224G mutant channels do not set such two levels of resting membrane potential when co-expressed with HCN2 channels in CHO cells or when overexpressed in human cardiomyocytes derived from induced pluripotent stem cells. These findings demonstrate a common mechanism whereby Kir2.1 channels set two levels of resting membrane potential with inward rectification by balancing inward currents through different cation channels, such as hyperpolarization-activated HCN channels or hypokalemia-induced K2P1 leak channels.

  5. Inferring individual-level processes from population-level patterns in cultural evolution

    Science.gov (United States)

    Wilder, Bryan

    2017-01-01

    Our species is characterized by a great degree of cultural variation, both within and between populations. Understanding how group-level patterns of culture emerge from individual-level behaviour is a long-standing question in the biological and social sciences. We develop a simulation model capturing demographic and cultural dynamics relevant to human cultural evolution, focusing on the interface between population-level patterns and individual-level processes. The model tracks the distribution of variants of cultural traits across individuals in a population over time, conditioned on different pathways for the transmission of information between individuals. From these data, we obtain theoretical expectations for a range of statistics commonly used to capture population-level characteristics (e.g. the degree of cultural diversity). Consistent with previous theoretical work, our results show that the patterns observed at the level of groups are rooted in the interplay between the transmission pathways and the age structure of the population. We also explore whether, and under what conditions, the different pathways can be distinguished based on their group-level signatures, in an effort to establish theoretical limits to inference. Our results show that the temporal dynamic of cultural change over time retains a stronger signature than the cultural composition of the population at a specific point in time. Overall, the results suggest a shift in focus from identifying the one individual-level process that likely produced the observed data to excluding those that likely did not. We conclude by discussing the implications for empirical studies of human cultural evolution. PMID:28989786

  6. Inferring individual-level processes from population-level patterns in cultural evolution.

    Science.gov (United States)

    Kandler, Anne; Wilder, Bryan; Fortunato, Laura

    2017-09-01

    Our species is characterized by a great degree of cultural variation, both within and between populations. Understanding how group-level patterns of culture emerge from individual-level behaviour is a long-standing question in the biological and social sciences. We develop a simulation model capturing demographic and cultural dynamics relevant to human cultural evolution, focusing on the interface between population-level patterns and individual-level processes. The model tracks the distribution of variants of cultural traits across individuals in a population over time, conditioned on different pathways for the transmission of information between individuals. From these data, we obtain theoretical expectations for a range of statistics commonly used to capture population-level characteristics (e.g. the degree of cultural diversity). Consistent with previous theoretical work, our results show that the patterns observed at the level of groups are rooted in the interplay between the transmission pathways and the age structure of the population. We also explore whether, and under what conditions, the different pathways can be distinguished based on their group-level signatures, in an effort to establish theoretical limits to inference. Our results show that the temporal dynamic of cultural change over time retains a stronger signature than the cultural composition of the population at a specific point in time. Overall, the results suggest a shift in focus from identifying the one individual-level process that likely produced the observed data to excluding those that likely did not. We conclude by discussing the implications for empirical studies of human cultural evolution.

  7. Expression of the histone chaperone SET/TAF-Iβ during the strobilation process of Mesocestoides corti (Platyhelminthes, Cestoda).

    Science.gov (United States)

    Costa, Caroline B; Monteiro, Karina M; Teichmann, Aline; da Silva, Edileuza D; Lorenzatto, Karina R; Cancela, Martín; Paes, Jéssica A; Benitz, André de N D; Castillo, Estela; Margis, Rogério; Zaha, Arnaldo; Ferreira, Henrique B

    2015-08-01

    The histone chaperone SET/TAF-Iβ is implicated in processes of chromatin remodelling and gene expression regulation. It has been associated with the control of developmental processes, but little is known about its function in helminth parasites. In Mesocestoides corti, a partial cDNA sequence related to SET/TAF-Iβ was isolated in a screening for genes differentially expressed in larvae (tetrathyridia) and adult worms. Here, the full-length coding sequence of the M. corti SET/TAF-Iβ gene was analysed and the encoded protein (McSET/TAF) was compared with orthologous sequences, showing that McSET/TAF can be regarded as a SET/TAF-Iβ family member, with a typical nucleosome-assembly protein (NAP) domain and an acidic tail. The expression patterns of the McSET/TAF gene and protein were investigated during the strobilation process by RT-qPCR, using a set of five reference genes, and by immunoblot and immunofluorescence, using monospecific polyclonal antibodies. A gradual increase in McSET/TAF transcripts and McSET/TAF protein was observed upon development induction by trypsin, demonstrating McSET/TAF differential expression during strobilation. These results provided the first evidence for the involvement of a protein from the NAP family of epigenetic effectors in the regulation of cestode development.

  8. Level of health care and services in a tertiary health setting in Nigeria

    African Journals Online (AJOL)

    Level of health care and services in a tertiary health setting in Nigeria. ... Background: There is a growing awareness and demand for quality health care across the world; hence the ... Doctors and nurses formed 64.3% of the study population.

  9. Standard setting in the teaching and learning process in the Kenya ...

    African Journals Online (AJOL)

    Standards are set at different levels to govern different requirements that collectively add up to the ingredients of quality education of a child. This study investigated whether or not there are quantitative standards of achievement for guiding teaching and learning in the school system in Kenya. It also investigated teachers' ...

  10. Numerical simulation of interface movement in gas-liquid two-phase flows with Level Set method

    International Nuclear Information System (INIS)

    Li Huixiong; Chinese Academy of Sciences, Beijing; Deng Sheng; Chen Tingkuan; Zhao Jianfu; Wang Fei

    2005-01-01

    Numerical simulation of gas-liquid two-phase flow and heat transfer has been an attractive research topic for quite a long time, but it remains difficult because of the inherent complexity of gas-liquid two-phase flow, which results from the existence of moving interfaces with topology changes. This paper reports the effort and the latest advances made by the authors, with special emphasis on methods for computing solutions to the advection equation of the level set function, which is used to capture the moving interfaces in gas-liquid two-phase flows. Three different schemes, i.e. a simple finite difference scheme, the Superbee-TVD scheme and the fifth-order WENO scheme, in combination with the Runge-Kutta method, are applied to solve the advection equation of the level set. A numerical procedure based on the well-verified SIMPLER method is employed to solve the momentum equations of the two-phase flow. The above-mentioned three schemes are employed to simulate the movement of four typical interfaces under five typical flow conditions. Analysis of the numerical results shows that the fifth-order WENO scheme and the Superbee-TVD scheme are much better than the simple finite difference scheme, and the fifth-order WENO scheme is the best for computing solutions to the advection equation of the level set. The fifth-order WENO scheme will therefore be employed as the main scheme for solving the advection equation of the level set when gas-liquid two-phase flows are numerically studied in the future. (authors)
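
    As a one-dimensional illustration of the middle scheme in this comparison, the sketch below advances φ_t + uφ_x = 0 by one step with a Superbee-limited second-order upwind flux on a periodic grid (constant u > 0 assumed). It is a minimal sketch with hypothetical function names; the paper applies such schemes in multiple dimensions together with Runge-Kutta time stepping.

    ```python
    import numpy as np

    def superbee(r):
        """Superbee flux limiter."""
        return np.maximum(0.0, np.maximum(np.minimum(2.0 * r, 1.0),
                                          np.minimum(r, 2.0)))

    def advect_superbee_tvd(phi, u, dt, dx):
        """One step of phi_t + u*phi_x = 0 for constant u > 0, using a
        Superbee-limited second-order upwind flux; periodic boundaries."""
        c = u * dt / dx                        # CFL number, must be <= 1
        dphi = np.roll(phi, -1) - phi          # phi_{i+1} - phi_i
        safe = np.where(np.abs(dphi) > 1e-12, dphi, 1e-12)
        r = np.roll(dphi, 1) / safe            # smoothness ratio r_i
        face = phi + 0.5 * superbee(r) * (1.0 - c) * dphi   # phi at i+1/2
        flux = u * face
        return phi - dt / dx * (flux - np.roll(flux, 1))
    ```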

  11. Setting ozone critical levels for protecting horticultural Mediterranean crops: Case study of tomato

    International Nuclear Information System (INIS)

    González-Fernández, I.; Calvo, E.; Gerosa, G.; Bermejo, V.; Marzuoli, R.; Calatayud, V.; Alonso, R.

    2014-01-01

    Seven experiments carried out in Italy and Spain have been used to parameterise a stomatal conductance model and establish exposure- and dose-response relationships for yield and quality of tomato, with the main goal of setting O₃ critical levels (CLe). CLe, with confidence intervals in brackets, were set at an accumulated hourly O₃ exposure over 40 nl l⁻¹ of AOT40 = 8.4 (1.2, 15.6) ppm h and a phytotoxic ozone dose above a threshold of 6 nmol m⁻² s⁻¹ of POD6 = 2.7 (0.8, 4.6) mmol m⁻² for yield, and AOT40 = 18.7 (8.5, 28.8) ppm h and POD6 = 4.1 (2.0, 6.2) mmol m⁻² for quality, both indices performing equally well. CLe confidence intervals provide information on the quality of the dataset and should be included in future calculations of O₃ CLe to improve current methodologies. These CLe, derived for sensitive tomato cultivars, should not be applied for quantifying O₃-induced losses, at the risk of making important overestimations of the economic losses associated with O₃ pollution. Highlights: • Seven independent experiments from Italy and Spain were analysed. • O₃ critical levels are proposed for the protection of summer horticultural crops. • Exposure- and flux-based O₃ indices performed equally well. • Confidence intervals of the new O₃ critical levels are calculated. • A new method to estimate the degree of risk of O₃ damage is proposed. Critical levels for tomato yield were set at AOT40 = 8.4 ppm h and POD6 = 2.7 mmol m⁻², and confidence intervals should be used to improve O₃ risk assessment.

  12. High-Level Waste (HLW) Feed Process Control Strategy

    International Nuclear Information System (INIS)

    STAEHR, T.W.

    2000-01-01

    The primary purpose of this document is to describe the overall process control strategy for monitoring and controlling the functions associated with the Phase 1B high-level waste feed delivery. This document provides the basis for the process monitoring and control functions and requirements needed throughout the double-shell tank system during Phase 1 high-level waste feed delivery. This document is intended to be used by (1) the developers of the future Process Control Plan and (2) the developers of the monitoring and control system.

  13. County-level poverty is equally associated with unmet health care needs in rural and urban settings.

    Science.gov (United States)

    Peterson, Lars E; Litaker, David G

    2010-01-01

    Regional poverty is associated with reduced access to health care. Whether this relationship is equally strong in both rural and urban settings, or is affected by the contextual and individual-level characteristics that distinguish these areas, is unclear. To compare the association of regional poverty with self-reported unmet need, a marker of health care access, by rural/urban setting, we performed a multilevel, cross-sectional analysis of a state-representative sample of 39,953 adults stratified by rural/urban status, linked at the county level to data describing contextual characteristics. Weighted random intercept models examined the independent association of regional poverty with unmet needs, controlling for a range of contextual and individual-level characteristics. The unadjusted association between regional poverty levels and unmet needs was similar in both rural (OR = 1.06 [95% CI, 1.04-1.08]) and urban (OR = 1.03 [1.02-1.05]) settings. Adjusting for other contextual characteristics increased the size of the association in both rural (OR = 1.11 [1.04-1.19]) and urban (OR = 1.11 [1.05-1.18]) settings. Further adjustment for individual characteristics had little additional effect in rural (OR = 1.10 [1.00-1.20]) or urban (OR = 1.11 [1.01-1.22]) settings. To better meet the health care needs of all Americans, health care systems in areas with high regional poverty should acknowledge the relationship between poverty and unmet health care needs. Investments, or other interventions, that reduce regional poverty may be useful strategies for improving health through better access to health care. © 2010 National Rural Health Association.

  14. Strategy for design NIR calibration sets based on process spectrum and model space: An innovative approach for process analytical technology.

    Science.gov (United States)

    Cárdenas, V; Cordobés, M; Blanco, M; Alcalà, M

    2015-10-10

    The pharmaceutical industry is under stringent regulations on quality control of its products because quality is critical for both the productive process and consumer safety. Within the framework of "process analytical technology" (PAT), a complete understanding of the process and stepwise monitoring of manufacturing are required. Near infrared spectroscopy (NIRS) combined with chemometrics has lately proved efficient, useful and robust for pharmaceutical analysis. One crucial step in developing effective NIRS-based methodologies is selecting an appropriate calibration set to construct models affording accurate predictions. In this work, we developed calibration models for a pharmaceutical formulation during its three manufacturing stages: blending, compaction and coating. A novel methodology is proposed for selecting the calibration set - the "process spectrum" - into which physical changes in the samples at each stage are algebraically incorporated. Also, we established a "model space" defined by Hotelling's T² and Q-residual statistics for outlier identification - inside/outside the defined space - in order to select objectively the factors to be used in calibration set construction. The results obtained confirm the efficacy of the proposed methodology for stepwise pharmaceutical quality control, and the relevance of the study as a guideline for the implementation of this easy and fast methodology in the pharma industry. Copyright © 2015 Elsevier B.V. All rights reserved.
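
    The "model space" described here can be sketched with a PCA model and the two statistics the abstract mentions: Hotelling's T² inside the score space and the Q residual outside it. The code below is a generic illustration using scikit-learn, not the authors' implementation; control limits for flagging outliers would be set separately (e.g., from percentiles or the usual F/χ² approximations).

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def model_space_statistics(X, n_components=3):
        """Hotelling's T^2 and Q residuals of each spectrum in X with
        respect to a PCA 'model space'. Samples beyond chosen control
        limits on either statistic are flagged as outliers; the rest
        are candidates for the calibration set."""
        pca = PCA(n_components=n_components).fit(X)
        scores = pca.transform(X)                       # in-model projections
        t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
        residual = X - pca.inverse_transform(scores)    # out-of-model part
        q = np.sum(residual**2, axis=1)
        return t2, q
    ```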

  15. Scope of physician procedures independently billed by mid-level providers in the office setting.

    Science.gov (United States)

    Coldiron, Brett; Ratnarathorn, Mondhipa

    2014-11-01

    Mid-level providers (nurse practitioners and physician assistants) were originally envisioned to provide primary care services in underserved areas. This study details the current scope of independent procedural billing to Medicare of difficult, invasive, and surgical procedures by medical mid-level providers. To understand the scope of independent billing to Medicare for procedures performed by mid-level providers in an outpatient office setting for a calendar year. Analyses of the 2012 Medicare Physician/Supplier Procedure Summary Master File, which reflects fee-for-service claims that were paid by Medicare, for Current Procedural Terminology procedures independently billed by mid-level providers. Outpatient office setting among health care providers. The scope of independent billing to Medicare for procedures performed by mid-level providers. In 2012, nurse practitioners and physician assistants billed independently for more than 4 million procedures at our cutoff of 5000 paid claims per procedure. Most (54.8%) of these procedures were performed in the specialty area of dermatology. The findings of this study are relevant to safety and quality of care. Recently, the shortage of primary care clinicians has prompted discussion of widening the scope of practice for mid-level providers. It would be prudent to temper widening the scope of practice of mid-level providers by recognizing that mid-level providers are not solely limited to primary care, and may involve procedures for which they may not have formal training.

  16. Experimental processing of a model data set using Geobit seismic software

    Energy Technology Data Exchange (ETDEWEB)

    Suh, Sang Yong [Korea Inst. of Geology Mining and Materials, Taejon (Korea, Republic of)

    1995-12-01

    A seismic data processing software package, Geobit, has been developed and is continuously updated to implement newer processing techniques and to support more hardware platforms. Geobit is intended to support all Unix platforms ranging from PC to CRAY. The current version supports two platforms, i.e., PC/Linux and Sun Sparc based SunOS 4.1.x. PC/Linux has attracted geophysicists in some universities, who have tried to install Geobit in their laboratories as a research tool. However, one of the problems is the difficulty of obtaining seismic data. The primary reason is its huge volume: field data are too bulky to fit the relatively small storage media, such as PC disks. To solve the problem, KIGAM released a model seismic data set via ftp.kigam.re.kr. This study has two purposes. The first is testing Geobit software for its suitability in seismic data processing. The test includes reproducing the model through seismic data processing. If it fails to reproduce the original model, the software is considered buggy and incomplete. However, if it can successfully reproduce the input model, I would be proud of what I have accomplished over the last few years in writing Geobit. The second purpose is to give a guide on Geobit usage by providing an example set of job files needed to process the given data. This example will help scientists lacking Geobit experience to concentrate on their study more easily. Once they know the Geobit processing technique, and later on Geobit programming, they can implement their own processing ideas, contributing newer technologies to Geobit. The complete Geobit job files needed to process the model data are written in the following job sequence: (1) data loading, (2) CDP sort, (3) decon analysis, (4) velocity analysis, (5) decon verification, (6) stack, (7) filter analysis, (8) filtered stack, (9) time migration, (10) depth migration. The control variables in the job files are discussed. (author). 10 figs., 1 tab.

  17. Consensus and contention in the priority setting process: examining the health sector in Uganda.

    Science.gov (United States)

    Colenbrander, Sarah; Birungi, Charles; Mbonye, Anthony K

    2015-06-01

    Health priority setting is a critical and contentious issue in low-income countries because of the high burden of disease relative to the limited resource envelope. Many sophisticated quantitative tools and policy frameworks have been developed to promote transparent priority setting processes and allocative efficiency. However, low-income countries frequently lack effective governance systems or implementation capacity, so high-level priorities are not determined through evidence-based decision-making processes. This study uses qualitative research methods to explore how key actors' priorities differ in low-income countries, using Uganda as a case study. Human resources for health, disease prevention and family planning emerge as the common priorities among actors in the health sector (although the last of these is particularly emphasized by international agencies) because of their contribution to the long-term sustainability of health-care provision. Financing health-care services is the most disputed issue. Participants from the Ugandan Ministry of Health preferentially sought to increase net health expenditure and government ownership of the health sector, while non-state actors prioritized improving the efficiency of resource use. Ultimately it is apparent that the power to influence national health outcomes lies with only a handful of decision-makers within key institutions in the health sector, such as the Ministries of Health, the largest bilateral donors and the multilateral development agencies. These power relations reinforce the need for ongoing research into the paradigms and strategic interests of these actors. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  18. Use of simulated data sets to evaluate the fidelity of metagenomic processing methods

    Energy Technology Data Exchange (ETDEWEB)

    Mavromatis, K [U.S. Department of Energy, Joint Genome Institute; Ivanova, N [U.S. Department of Energy, Joint Genome Institute; Barry, Kerrie [U.S. Department of Energy, Joint Genome Institute; Shapiro, Harris [U.S. Department of Energy, Joint Genome Institute; Goltsman, Eugene [U.S. Department of Energy, Joint Genome Institute; McHardy, Alice C. [IBM T. J. Watson Research Center; Rigoutsos, Isidore [IBM T. J. Watson Research Center; Salamov, Asaf [U.S. Department of Energy, Joint Genome Institute; Korzeniewski, Frank [U.S. Department of Energy, Joint Genome Institute; Land, Miriam L [ORNL; Lapidus, Alla L. [U.S. Department of Energy, Joint Genome Institute; Grigoriev, Igor [U.S. Department of Energy, Joint Genome Institute; Hugenholtz, Philip [U.S. Department of Energy, Joint Genome Institute; Kyrpides, Nikos C [U.S. Department of Energy, Joint Genome Institute

    2007-01-01

    Metagenomics is a rapidly emerging field of research for studying microbial communities. To evaluate methods presently used to process metagenomic sequences, we constructed three simulated data sets of varying complexity by combining sequencing reads randomly selected from 113 isolate genomes. These data sets were designed to model real metagenomes in terms of complexity and phylogenetic composition. We assembled the sampled reads using three commonly used genome assemblers (Phrap, Arachne and JAZZ), and predicted genes using two popular gene-finding pipelines (fgenesb and CRITICA/GLIMMER). The phylogenetic origins of the assembled contigs were predicted using one sequence similarity-based (BLAST hit distribution) and two sequence composition-based (PhyloPythia, oligonucleotide frequencies) binning methods. We explored the effects of the simulated community structure and method combinations on the fidelity of each processing step by comparison to the corresponding isolate genomes. The simulated data sets are available online to facilitate standardized benchmarking of tools for metagenomic analysis.
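
    The construction of such simulated data sets can be illustrated in a few lines: sample fixed-length fragments at random positions from a collection of isolate genome sequences. This is a deliberately naive sketch (uniform sampling, no sequencing-error model, hypothetical function name); real benchmarks such as the one described here also control community structure and read-length distributions.

    ```python
    import random

    def simulate_metagenome(genomes, n_reads, read_len=800, seed=0):
        """Sample fixed-length fragments from isolate genome sequences.

        genomes: dict mapping genome name -> sequence string.
        Returns a list of (read_id, sequence) tuples."""
        rng = random.Random(seed)
        names = [n for n, s in genomes.items() if len(s) >= read_len]
        reads = []
        for i in range(n_reads):
            name = rng.choice(names)
            seq = genomes[name]
            start = rng.randrange(len(seq) - read_len + 1)
            reads.append((f"{name}_read{i}", seq[start:start + read_len]))
        return reads
    ```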

  19. Low-level memory processes in vision.

    Science.gov (United States)

    Magnussen, S

    2000-06-01

    Psychophysical studies of the short-term memory for attributes or dimensions of the visual stimulus that are known to be important in early visual processing (spatial frequency, orientation, contrast, motion and color) identify a low-level perceptual memory mechanism. This proposed mechanism is located early in the visual processing stream, prior to the structural description system responsible for shape priming but beyond primary visual cortex (V1); it is composed of a series of parallel, special-purpose perceptual mechanisms with independent but limited processing resources. Each mechanism is devoted to the analysis of a single dimension and is coupled to a memory store.

  20. Managing the construction bidding process : a move to simpler construction plan sets

    Science.gov (United States)

    2001-01-31

    This project was conducted to determine whether construction plan sets could be significantly simplified to speed the process of moving projects to construction. The work steps included a literature review, a telephone survey of highway agencies in s...

  1. Processing vessel for high level radioactive wastes

    International Nuclear Information System (INIS)

    Maekawa, Hiromichi

    1998-01-01

    Upon transferring an overpack, in which canisters containing high-level radioactive wastes are sealed, and burying it in an underground processing hole, an outer shell vessel made of steel plate, sized to fit and be contained in the processing hole, is formed. A back-fill layer made of the earth and sand that was excavated when the processing hole was dug is formed on the inner circumferential wall of the outer shell vessel. A buffer layer of predetermined thickness is formed on the inner side of the back-fill layer, and the overpack is contained in the hollow portion surrounded by this layer. The open upper portion of the hollow portion is covered with the buffer layer and the back-fill layer. Since the processing vessel, which provides shielding, is formed above ground in advance, the state of packing can be observed. In addition, since operators can work directly during transportation and burial of the high-level radioactive wastes, remote control is no longer necessary. (T.M.)

  2. Application of physiologically based pharmacokinetic modeling in setting acute exposure guideline levels for methylene chloride.

    NARCIS (Netherlands)

    Bos, Peter Martinus Jozef; Zeilmaker, Marco Jacob; Eijkeren, Jan Cornelis Henri van

    2006-01-01

    Acute exposure guideline levels (AEGLs) are derived to protect the human population from adverse health effects in case of single exposure due to an accidental release of chemicals into the atmosphere. AEGLs are set at three different levels of increasing toxicity for exposure durations ranging from 10 minutes to 8 hours.

  3. Cognitive load privileges memory-based over data-driven processing, not group-level over person-level processing.

    Science.gov (United States)

    Skorich, Daniel P; Mavor, Kenneth I

    2013-09-01

    In the current paper, we argue that categorization and individuation, as traditionally discussed and as experimentally operationalized, are defined in terms of two confounded underlying dimensions: a person/group dimension and a memory-based/data-driven dimension. In a series of three experiments, we unconfound these dimensions and impose a cognitive load. Across the three experiments, two with laboratory-created targets and one with participants' friends as the target, we demonstrate that cognitive load privileges memory-based over data-driven processing, not group- over person-level processing. We discuss the results in terms of their implications for conceptualizations of the categorization/individuation distinction, for the equivalence of person and group processes, for the ultimate 'purpose' and meaningfulness of group-based perception and, fundamentally, for the process of categorization, broadly defined. © 2012 The British Psychological Society.

  4. Priority setting and health policy and systems research

    Directory of Open Access Journals (Sweden)

    Bennett Sara C

    2009-12-01

    Health policy and systems research (HPSR) has been identified as critical to scaling up interventions to achieve the millennium development goals, but research priority setting exercises often do not address HPSR well. This paper aims to (i) assess current priority setting methods and the extent to which they adequately include HPSR and (ii) draw lessons regarding how HPSR priority setting can be enhanced to promote relevant HPSR and to strengthen developing-country leadership of research agendas. Priority setting processes can be distinguished by the level at which they occur, their degree of comprehensiveness in terms of the topics addressed, the balance between technical and interpretive approaches, and the stakeholders involved. When HPSR is considered through technical, disease-driven priority setting processes it is systematically under-valued. More successful approaches for considering HPSR are typically nationally driven and interpretive, and engage a range of stakeholders. There is still a need, however, for better defined approaches to enable research funders to determine the relative weight to assign to disease-specific research versus HPSR and other forms of cross-cutting health research. While country-level research priority setting is key, there is likely to be a continued need for the identification of global research priorities for HPSR. The paper argues that such global priorities can and should be driven by country-level priorities.

  5. A Compositional Knowledge Level Process Model of Requirements Engineering

    NARCIS (Netherlands)

    Herlea, D.E.; Jonker, C.M.; Treur, J.; Wijngaards, W.C.A.

    2002-01-01

    In current literature few detailed process models for Requirements Engineering are presented: usually high-level activities are distinguished, without a more precise specification of each activity. In this paper the process of Requirements Engineering has been analyzed using knowledge-level

  6. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    Science.gov (United States)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about a 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
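
    For context, the iterative reinitialization that the single-step "forward tracing" algorithm is designed to replace solves the pseudo-time PDE φ_τ = sign(φ₀)(1 − |∇φ|) to steady state. The sketch below shows that classical iteration in its simplest form (central differences, smoothed sign); production codes use upwind or WENO discretizations, and the many iterations per reinitialization are part of why the iterative approach is costly in parallel.

    ```python
    import numpy as np

    def reinitialize_iterative(phi, dx, iters=50, dtau=None):
        """Classical iterative reinitialization:
            phi_tau = sign(phi0) * (1 - |grad(phi)|),
        marched to steady state so phi approaches a signed distance
        function. Central differences are used for brevity."""
        if dtau is None:
            dtau = 0.5 * dx                      # pseudo-time step
        sign0 = phi / np.sqrt(phi**2 + dx**2)    # smoothed sign of phi0
        for _ in range(iters):
            gy, gx = np.gradient(phi, dx)
            grad = np.sqrt(gx**2 + gy**2)
            phi = phi + dtau * sign0 * (1.0 - grad)
        return phi
    ```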

  7. A thick level set interface model for simulating fatigue-driven delamination in composites

    NARCIS (Netherlands)

    Latifi, M.; Van der Meer, F.P.; Sluys, L.J.

    2015-01-01

    This paper presents a new damage model for simulating fatigue-driven delamination in composite laminates. This model is developed based on the Thick Level Set approach (TLS) and provides a favorable link between damage mechanics and fracture mechanics through the non-local evaluation of the energy

  8. An Optimized, Grid Independent, Narrow Band Data Structure for High Resolution Level Sets

    DEFF Research Database (Denmark)

    Nielsen, Michael Bang; Museth, Ken

    2004-01-01

    enforced by the convex boundaries of an underlying Cartesian computational grid. Here we present a novel, very memory-efficient narrow band data structure, dubbed the Sparse Grid, that enables the representation of grid-independent high-resolution level sets. The key features of our new data structure are...

  9. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    Science.gov (United States)

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaptation and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010), was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and the level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  10. A mass conserving level set method for detailed numerical simulation of liquid atomization

    Energy Technology Data Exchange (ETDEWEB)

    Luo, Kun; Shao, Changxiao [State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou 310027 (China); Yang, Yue [State Key Laboratory of Turbulence and Complex Systems, Peking University, Beijing 100871 (China); Fan, Jianren, E-mail: fanjr@zju.edu.cn [State Key Laboratory of Clean Energy Utilization, Zhejiang University, Hangzhou 310027 (China)

    2015-10-01

    An improved mass-conserving level set method for detailed numerical simulations of liquid atomization is developed to address the issue of mass loss in the existing level set method. This method introduces a mass remedy procedure based on the local curvature at the interface and, in principle, can ensure the absolute mass conservation of the liquid phase in the computational domain. Three benchmark cases, including Zalesak's disk, a drop deforming in a vortex field, and the binary drop head-on collision, are simulated to validate the present method, and excellent agreement with exact solutions or experimental results is achieved. It is shown that the present method is able to capture the complex interface with second-order accuracy and negligible additional computational cost. The present method is then applied to study more complex flows, such as a drop impacting on a liquid film and swirling liquid sheet atomization, which again demonstrate the advantages of mass conservation and the capability to represent the interface accurately.
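
    A global relative of the mass-remedy idea can be sketched by shifting the level-set field by a constant, chosen by bisection, so that the smeared-Heaviside liquid volume matches a target. The paper's procedure is local and curvature-weighted, so this uniform-shift sketch is only the simplest stand-in; the 2D cell volume, the convention "φ > 0 is liquid", and the function names are assumptions.

    ```python
    import numpy as np

    def smoothed_heaviside(phi, eps):
        """Smeared Heaviside used to integrate the liquid volume fraction."""
        core = 0.5 * (1.0 + phi / eps + np.sin(np.pi * phi / eps) / np.pi)
        return np.where(phi > eps, 1.0, np.where(phi < -eps, 0.0, core))

    def correct_mass(phi, target_volume, dx, eps=None):
        """Shift phi by a constant (found by bisection) so the liquid
        volume matches target_volume (phi > 0 taken as liquid, 2D cells
        of size dx*dx). Assumes the shift lies within +/- 2*eps."""
        if eps is None:
            eps = 1.5 * dx
        cell = dx * dx
        lo, hi = -2.0 * eps, 2.0 * eps
        for _ in range(60):                 # bisection on the shift c
            c = 0.5 * (lo + hi)
            vol = smoothed_heaviside(phi + c, eps).sum() * cell
            if vol < target_volume:
                lo = c                      # too little liquid: raise phi
            else:
                hi = c
        return phi + 0.5 * (lo + hi)
    ```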

  11. Levels-of-processing effect on word recognition in schizophrenia.

    Science.gov (United States)

    Ragland, J Daniel; Moelter, Stephen T; McGrath, Claire; Hill, S Kristian; Gur, Raquel E; Bilker, Warren B; Siegel, Steven J; Gur, Ruben C

    2003-12-01

    Individuals with schizophrenia have difficulty organizing words semantically to facilitate encoding. This is commonly attributed to organizational rather than semantic processing limitations. By requiring participants to classify and encode words on either a shallow (e.g., uppercase/lowercase) or deep level (e.g., concrete/abstract), the levels-of-processing paradigm eliminates the need to generate organizational strategies. This paradigm was administered to 30 patients with schizophrenia and 30 healthy comparison subjects to test whether providing a strategy would improve patient performance. Word classification during shallow and deep encoding was slower and less accurate in patients. Patients also responded slowly during recognition testing and maintained a more conservative response bias following deep encoding; however, both groups showed a robust levels-of-processing effect on recognition accuracy, with unimpaired patient performance following both shallow and deep encoding. This normal levels-of-processing effect in the patient sample suggests that semantic processing is sufficiently intact for patients to benefit from organizational cues. Memory remediation efforts may therefore be most successful if they focus on teaching patients to form organizational strategies during initial encoding.

  12. Shape Reconstruction of Thin Electromagnetic Inclusions via Boundary Measurements: Level-Set Method Combined with the Topological Derivative

    Directory of Open Access Journals (Sweden)

    Won-Kwang Park

    2013-01-01

    Full Text Available An inverse problem for reconstructing arbitrary-shaped thin penetrable electromagnetic inclusions concealed in a homogeneous material is considered in this paper. For this purpose, the level-set evolution method is adopted. The topological derivative concept is incorporated in order to evaluate the evolution speed of the level-set functions. The results of the corresponding numerical simulations with and without noise are presented in this paper.

  13. Robust space-time extraction of ventricular surface evolution using multiphase level sets

    Science.gov (United States)

    Drapaca, Corina S.; Cardenas, Valerie; Studholme, Colin

    2004-05-01

    This paper focuses on the problem of accurately extracting the CSF-tissue boundary, particularly around the ventricular surface, from serial structural MRI of the brain acquired in imaging studies of aging and dementia. This is a challenging problem because of the common occurrence of peri-ventricular lesions which locally alter the appearance of white matter. We examine a level set approach which evolves a four-dimensional description of the ventricular surface over time. This has the advantage of allowing constraints on the contour in the temporal dimension, improving the consistency of the extracted object over time. We follow the approach proposed by Chan and Vese which is based on the Mumford and Shah model and implemented using the Osher and Sethian level set method. We have extended this to the four-dimensional case to propagate a 4D contour toward the tissue boundaries through the evolution of a 5D implicit function. For convergence we use region-based information provided by the image rather than the gradient of the image. This is adapted to allow intensity contrast changes between time frames in the MRI sequence. Results on time sequences of 3D brain MR images are presented and discussed.
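
    For readers unfamiliar with the Chan-Vese model invoked here, the sketch below shows a minimal two-dimensional, single-contour version of its region-based gradient-descent update (Python/NumPy; assumed convention: the object region is {phi > 0}). The paper's contribution, evolving a 4D contour via a 5D implicit function with temporal constraints, is a higher-dimensional generalization of this same update.

      import numpy as np

      def curvature(phi):
          # Curvature of the level sets of phi via central differences.
          px = (np.roll(phi, -1, 0) - np.roll(phi, 1, 0)) / 2.0
          py = (np.roll(phi, -1, 1) - np.roll(phi, 1, 1)) / 2.0
          pxx = np.roll(phi, -1, 0) - 2.0 * phi + np.roll(phi, 1, 0)
          pyy = np.roll(phi, -1, 1) - 2.0 * phi + np.roll(phi, 1, 1)
          pxy = (np.roll(np.roll(phi, -1, 0), -1, 1) - np.roll(np.roll(phi, -1, 0), 1, 1)
                 - np.roll(np.roll(phi, 1, 0), -1, 1) + np.roll(np.roll(phi, 1, 0), 1, 1)) / 4.0
          return ((pxx * py ** 2 - 2.0 * px * py * pxy + pyy * px ** 2)
                  / ((px ** 2 + py ** 2) ** 1.5 + 1e-8))

      def chan_vese_step(phi, img, mu=0.2, dt=0.5, eps=1.0):
          # One gradient-descent step of the two-phase Chan-Vese functional:
          # region means c1/c2 drive the contour, curvature keeps it smooth.
          inside = phi > 0
          c1 = img[inside].mean() if inside.any() else 0.0      # mean inside
          c2 = img[~inside].mean() if (~inside).any() else 0.0  # mean outside
          delta = eps / (np.pi * (eps ** 2 + phi ** 2))         # smoothed Dirac
          force = mu * curvature(phi) - (img - c1) ** 2 + (img - c2) ** 2
          return phi + dt * delta * force

    Iterating chan_vese_step a few hundred times from a rough initial contour drives phi toward a piecewise-constant two-region fit of the image; only the region means, not image gradients, enter the force term, which is the property the paper exploits for convergence.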

  14. Image-guided regularization level set evolution for MR image segmentation and bias field correction.

    Science.gov (United States)

    Wang, Lingfeng; Pan, Chunhong

    2014-01-01

    Magnetic resonance (MR) image segmentation is a crucial step in surgical and treatment planning. In this paper, we propose a level-set-based segmentation method for MR images with the intensity inhomogeneity problem. To tackle the initialization sensitivity problem, we propose a new image-guided regularization to restrict the level set function. The maximum a posteriori inference is adopted to unify segmentation and bias field correction within a single framework. Under this framework, both the contour prior and the bias field prior are fully used. As a result, the image intensity inhomogeneity can be well handled. Extensive experiments are provided to evaluate the proposed method, showing significant improvements in both segmentation and bias field correction accuracies as compared with other state-of-the-art approaches. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Application of the level set method for multi-phase flow computation in fusion engineering

    International Nuclear Information System (INIS)

    Luo, X-Y.; Ni, M-J.; Ying, A.; Abdou, M.

    2006-01-01

    Numerical simulation of multi-phase flow is essential to evaluate the feasibility of a liquid protection scheme for the power plant chamber. The level set method is one of the best methods for computing and analyzing the motion of interfaces in multi-phase flow. This paper presents a general formulation of the second-order projection method combined with the level set method to simulate unsteady incompressible multi-phase flow with/without phase change, as encountered in fusion science and engineering. A third-order ENO scheme and a second-order semi-implicit Crank-Nicolson scheme are used to update the convective and diffusion terms. The numerical results show this method can handle the complex deformation of the interface; the effect of liquid-vapor phase change will be included in future work.
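
    The projection step mentioned above can be illustrated in a few lines: an intermediate velocity field is made divergence-free by solving a pressure Poisson equation and subtracting the pressure gradient. The sketch below (Python/NumPy, periodic boundaries, FFT Poisson solve) is a first-order illustration of the idea, not the authors' second-order scheme.

      import numpy as np

      def project(u_star, v_star, dx, dt, rho=1.0):
          # Make an intermediate velocity (u*, v*) divergence-free:
          # solve lap(p) = (rho/dt) * div(u*) with periodic boundaries,
          # then correct u = u* - (dt/rho) * grad(p).
          n = u_star.shape[0]
          div = ((np.roll(u_star, -1, 0) - np.roll(u_star, 1, 0))
                 + (np.roll(v_star, -1, 1) - np.roll(v_star, 1, 1))) / (2.0 * dx)
          k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
          kx, ky = np.meshgrid(k, k, indexing="ij")
          k2 = kx ** 2 + ky ** 2
          k2[0, 0] = 1.0                      # mean pressure is arbitrary
          p_hat = np.fft.fft2(rho / dt * div) / (-k2)
          p_hat[0, 0] = 0.0
          p = np.real(np.fft.ifft2(p_hat))
          u = u_star - dt / rho * (np.roll(p, -1, 0) - np.roll(p, 1, 0)) / (2.0 * dx)
          v = v_star - dt / rho * (np.roll(p, -1, 1) - np.roll(p, 1, 1)) / (2.0 * dx)
          return u, v, p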

  16. Method of processing low-level radioactive liquid wastes

    International Nuclear Information System (INIS)

    Matsunaga, Ichiro; Sugai, Hiroshi.

    1984-01-01

    Purpose: To effectively reduce the radioactivity density of low-level radioactive liquid wastes discharged from enriched uranium conversion processing steps or the like. Method: Hydrazine is added to low-level radioactive liquid wastes, which are brought into contact with iron hydroxide-cation exchange resins prepared by treating strongly acidic cation exchange resins with ferric chloride and aqueous ammonia to form hydrolyzates of ferric ions in the resin. The hydrazine added here may be any of hydrazine hydrate, hydrazine hydrochloride and hydrazine sulfate. The preferred addition amount is more than 100 mg per liter of the liquid wastes; if it is less than 100 mg, the reduction in radioactivity density (ratio of processed liquid density to original liquid density) is diminished. This method makes it possible to effectively reduce the radioactivity density of low-level radioactive liquid wastes containing trace amounts of radioactive nuclides. (Yoshihara, H.)

  17. The impact of negative attentional set upon target processing in RSVP : An ERP study

    NARCIS (Netherlands)

    Zhang, Dexuan; Zhou, Xiaolin; Martens, Sander

    2009-01-01

    This study investigates whether the negative attentional set, a form of top-down attentional bias, can be set up on a trial-by-trial basis and impair online target processing in an RSVP (Rapid Serial Visual Presentation) task in which two targets are to be identified. Using the N2pc (N2 posterior

  18. Implementing standard setting into the Conjoint MAFP/FRACGP Part 1 examination - Process and issues.

    Science.gov (United States)

    Chan, S C; Mohd Amin, S; Lee, T W

    2016-01-01

    The College of General Practitioners of Malaysia and the Royal Australian College of General Practitioners held the first Conjoint Member of the College of General Practitioners (MCGP)/Fellow of Royal Australian College of General Practitioners (FRACGP) examination in 1982, later renamed the Conjoint MAFP/FRACGP examinations. The examination assesses competency for safe independent general practice and as family medicine specialists in Malaysia. Therefore, a defensible standard set pass mark is imperative to separate the competent from the incompetent. This paper discusses the process and issues encountered in implementing standard setting to the Conjoint Part 1 examination. Critical to success in standard setting were judges' understanding of the process of the modified Angoff method, defining the borderline candidate's characteristics and the composition of judges. These were overcome by repeated hands-on training, provision of detailed guidelines and careful selection of judges. In December 2013, 16 judges successfully standard set the Part 1 Conjoint examinations, with high inter-rater reliability: Cronbach's alpha coefficient 0.926 (Applied Knowledge Test), 0.921 (Key Feature Problems).
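
    The Cronbach's alpha figures quoted above are computed from the questions-by-judges matrix of Angoff ratings; a minimal illustration with made-up ratings (Python/NumPy, hypothetical values, not data from the examination) follows.

      import numpy as np

      def cronbach_alpha(ratings):
          # Judges as "items" (columns), exam questions as "cases" (rows):
          # alpha = k/(k-1) * (1 - sum of per-judge variances / variance of totals)
          r = np.asarray(ratings, dtype=float)
          k = r.shape[1]                            # number of judges
          judge_var = r.var(axis=0, ddof=1).sum()   # per-judge rating variances
          total_var = r.sum(axis=1).var(ddof=1)     # variance of per-question totals
          return k / (k - 1) * (1.0 - judge_var / total_var)

      # hypothetical Angoff ratings in percent (rows: questions, columns: judges)
      ratings = [[60, 55, 65],
                 [40, 45, 50],
                 [70, 75, 70],
                 [55, 50, 60]]
      print(round(cronbach_alpha(ratings), 3))      # ~0.95 for these made-up data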

  19. Making sense of intercultural interaction processes in international joint venture settings

    DEFF Research Database (Denmark)

    Dao, Li

    …i.e. competence building interaction, decision making interaction, and socializing interaction, which is consistent with the three major processes of learning, power bargaining, and relationship building as suggested by the IJV literature. Second, interaction processes appear to be shaped by individual … approach toward decision making, a mutual learning attitude, the appreciation and strategic utilization of emergent ties between individual members put together in work settings, the proper implementation of consensus-facilitating mechanisms like ISO standards, and a holistic view of knowledge transfer … in terms of core skills as well as non-core yet critically supporting skills like decision making and project/time management…

  20. Process Design Concepts for Stabilization of High Level Waste Calcine

    Energy Technology Data Exchange (ETDEWEB)

    T. R. Thomas; A. K. Herbst

    2005-06-01

    The current baseline assumption is that packaging "as is" and direct disposal of high level waste (HLW) calcine in a Monitored Geologic Repository will be allowed. The fall-back position is to develop a stabilized waste form for the HLW calcine that will meet the repository waste acceptance criteria currently in place, in case regulatory initiatives are unsuccessful. A decision between direct disposal and a stabilization alternative is anticipated by June 2006. The purposes of this Engineering Design File (EDF) are to provide a pre-conceptual design of three low temperature processes under development for stabilization of HLW calcine (i.e., the grout, hydroceramic grout, and iron phosphate ceramic processes) and to support a down-selection among the three candidates. The key assumptions for the pre-conceptual design assessment are that a) a waste treatment plant would operate over eight years for 200 days a year, b) a design processing rate of 3.67 m3/day or 4670 kg/day of HLW calcine would be needed, c) the performance of the waste form would remove the HLW calcine from the hazardous waste category, and d) the waste form loadings would range from about 21-25 wt% calcine. The conclusions of this EDF study are that: (a) To date, the grout formulation appears to be the best candidate stabilizer among the three being tested for HLW calcine and appears to be the easiest to mix, pour, and cure. (b) Only minor differences would exist between the process steps of the grout and hydroceramic grout stabilization processes. If temperature control of the mixer at about 80 °C is required, it would add a major level of complexity to the iron phosphate stabilization process. (c) It is too early in the development program to determine which stabilizer will produce the minimum amount of stabilized waste form for the entire HLW inventory, but the volume is assumed to be within the range of 12,250 to 14,470 m3. (d) The stacked vessel height of the hot process vessels
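
    As a back-of-the-envelope check of the stated design basis (a reader's arithmetic, not a figure from the EDF itself), the design rate and campaign length imply a total calcine throughput of

      \[
        V \approx 3.67\,\mathrm{m^3/day} \times 200\,\mathrm{day/yr} \times 8\,\mathrm{yr}
          \approx 5.9\times 10^{3}\,\mathrm{m^3},
        \qquad
        m \approx 4670\,\mathrm{kg/day} \times 1600\,\mathrm{day}
          \approx 7.5\times 10^{6}\,\mathrm{kg},
      \]

    so the quoted 12,250 to 14,470 m3 of stabilized waste form would correspond to roughly a two- to two-and-a-half-fold volume increase over the raw calcine.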

  1. Patient involvement in a scientific advisory process: setting the research agenda for medical products.

    NARCIS (Netherlands)

    Elberse, J.E.; Pittens, C.A.C.M.; de Cock Buning, J.T.; Broerse, J.E.W.

    2012-01-01

    Patient involvement in scientific advisory processes could lead to more societally relevant advice. This article describes a case study wherein the Health Council of the Netherlands involved patient groups in an advisory process with a predefined focus: setting a research agenda for medical products

  2. Development of a set of process and structure indicators for palliative care: the Europall project

    Directory of Open Access Journals (Sweden)

    Woitha Kathrin

    2012-11-01

    Full Text Available Abstract Background By measuring the quality of the organisation of palliative care with process and structure quality indicators (QIs, patients, caregivers and policy makers are able to monitor to what extent recommendations are met, like those of the council of the WHO on palliative care and guidelines. This will support the implementation of public programmes, and will enable comparisons between organisations or countries. Methods As no European set of indicators for the organisation of palliative care existed, such a set of QIs was developed. An update of a previous systematic review was made and extended with more databases and grey literature. In two project meetings with practitioners and experts in palliative care the development process of a QI set was finalised and the QIs were categorized in a framework, covering the recommendations of the Council of Europe. Results The searches resulted in 151 structure and process indicators, which were discussed in steering group meetings. Of those QIs, 110 were eligible for the final framework. Conclusions We developed the first set of QIs for the organisation of palliative care. This article is the first step in a multi step project to identify, validate and pilot QIs.

  3. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation.

    Science.gov (United States)

    Scobbie, Lesley; McLean, Donald; Dixon, Diane; Duncan, Edward; Wyke, Sally

    2013-05-24

    Goal setting is considered 'best practice' in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. G-AP was mostly implemented according to protocol with deviations noted at the planning and appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage and the appraisal and feedback stage included an explicit decision making component. Only two issues were raised regarding G-APs acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patient's well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process. G-AP has been perceived as both beneficial and broadly acceptable in one community rehabilitation team; however, implementation of novel

  5. Theorizing and Researching Levels of Processing in Self-Regulated Learning

    Science.gov (United States)

    Winne, Philip H.

    2018-01-01

    Background: Deep versus surface knowledge is widely discussed by educational practitioners. A corresponding construct, levels of processing, has received extensive theoretical and empirical attention in learning science and psychology. In both arenas, lower levels of information and shallower levels of processing are predicted and generally…

  6. On piecewise constant level-set (PCLS) methods for the identification of discontinuous parameters in ill-posed problems

    International Nuclear Information System (INIS)

    De Cezaro, A; Leitão, A; Tai, X-C

    2013-01-01

    We investigate level-set-type methods for solving ill-posed problems with discontinuous (piecewise constant) coefficients. The goal is to identify the level sets as well as the level values of an unknown parameter function on a model described by a nonlinear ill-posed operator equation. The PCLS approach is used here to parametrize the solution of a given operator equation in terms of an L² level-set function, i.e. the level-set function itself is assumed to be a piecewise constant function. Two distinct methods are proposed for computing stable solutions of the resulting ill-posed problem: the first is based on Tikhonov regularization, while the second is based on the augmented Lagrangian approach with total variation penalization. Classical regularization results (Engl H W et al 1996 Mathematics and its Applications (Dordrecht: Kluwer)) are derived for the Tikhonov method. On the other hand, for the augmented Lagrangian method, we succeed in proving the existence of (generalized) Lagrangian multipliers in the sense of (Rockafellar R T and Wets R J-B 1998 Grundlehren der Mathematischen Wissenschaften (Berlin: Springer)). Numerical experiments are performed for a 2D inverse potential problem (Hettlich F and Rundell W 1996 Inverse Problems 12 251–66), demonstrating the capabilities of both methods for solving this ill-posed problem in a stable way (complicated inclusions are recovered without any a priori geometrical information on the unknown parameter). (paper)
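
    Schematically, the first of the two methods amounts to minimizing a Tikhonov-type functional over the piecewise constant level-set function (our notation; the paper's precise penalty terms differ):

      \[
        \min_{\phi}\; \bigl\| F\bigl(P(\phi)\bigr) - y^{\delta} \bigr\|^{2} + \alpha\, R(\phi),
      \]

    where F is the nonlinear forward operator, P maps the level-set function to the parameter, y^delta is the noisy data and R is a regularization functional; the second method replaces this quadratic penalty machinery with an augmented Lagrangian and a total variation term.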

  7. Understanding the Relative Contributions of Lower-Level Word Processes, Higher-Level Processes, and Working Memory to Reading Comprehension Performance in Proficient Adult Readers

    Science.gov (United States)

    Hannon, Brenda

    2012-01-01

    Although a considerable amount of evidence has been amassed regarding the contributions of lower-level word processes, higher-level processes, and working memory to reading comprehension, little is known about the relationships among these sources of individual differences or their relative contributions to reading comprehension performance. This…

  8. A Science, Engineering and Technology (SET) Approach Improves Science Process Skills in 4-H Animal Science Participants

    Science.gov (United States)

    Clarke, Katie C.

    2010-01-01

    A new Science, Engineering and Technology (SET) approach was designed for youth who participated in the Minnesota State Fair Livestock interview process. The project and evaluation were designed to determine if the new SET approach increased content knowledge and science process skills in participants. Results revealed that youth participants not…

  9. Level-set segmentation of pulmonary nodules in megavolt electronic portal images using a CT prior

    International Nuclear Information System (INIS)

    Schildkraut, J. S.; Prosser, N.; Savakis, A.; Gomez, J.; Nazareth, D.; Singh, A. K.; Malhotra, H. K.

    2010-01-01

    Purpose: Pulmonary nodules present unique problems during radiation treatment due to nodule position uncertainty that is caused by respiration. The radiation field has to be enlarged to account for nodule motion during treatment. The purpose of this work is to provide a method of locating a pulmonary nodule in a megavolt portal image that can be used to reduce the internal target volume (ITV) during radiation therapy. A reduction in the ITV would result in a decrease in radiation toxicity to healthy tissue. Methods: Eight patients with non-small cell lung cancer were used in this study. CT scans that include the pulmonary nodule were captured with a GE Healthcare LightSpeed RT 16 scanner. Megavolt portal images were acquired with a Varian Trilogy unit equipped with an AS1000 electronic portal imaging device. The nodule localization method uses grayscale morphological filtering and level-set segmentation with a prior. The treatment-time portion of the algorithm is implemented on a graphics processing unit. Results: The method was retrospectively tested on eight cases that include a total of 151 megavolt portal image frames. The method reduced the nodule position uncertainty by an average of 40% for seven out of the eight cases. The treatment phase portion of the method has a subsecond execution time that makes it suitable for near-real-time nodule localization. Conclusions: A method was developed to localize a pulmonary nodule in a megavolt portal image. The method uses the characteristics of the nodule in a prior CT scan to enhance the nodule in the portal image and to identify the nodule region by level-set segmentation. In a retrospective study, the method reduced the nodule position uncertainty by an average of 40% for seven out of the eight cases studied.
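
    The grayscale morphological filtering step can be pictured with a standard white top-hat transform, which keeps compact bright structures of roughly nodule size and suppresses the slowly varying background. The fragment below (Python/SciPy; hypothetical size and threshold) is illustrative only and omits the CT prior and the level-set stage that follow in the actual method.

      import numpy as np
      from scipy import ndimage

      def candidate_nodule_mask(portal_image, nodule_diam_px=15):
          # White top-hat: image minus its grayscale opening. Structures
          # smaller than the structuring element survive, larger background
          # is suppressed. Size and percentile here are assumptions.
          tophat = ndimage.white_tophat(portal_image.astype(float),
                                        size=nodule_diam_px)
          return tophat > np.percentile(tophat, 99.5)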

  10. Level-set dynamics and mixing efficiency of passive and active scalars in DNS and LES of turbulent mixing layers

    NARCIS (Netherlands)

    Geurts, Bernard J.; Vreman, Bert; Kuerten, Hans; Luo, Kai H.

    2001-01-01

    The mixing efficiency in a turbulent mixing layer is quantified by monitoring the surface area of level-sets of scalar fields. The Laplace transform is applied to numerically calculate integrals over arbitrary level-sets. The analysis includes both direct and large-eddy simulation and is used to

  11. Integrating cross-case analyses and process tracing in set-theoretic research: Strategies and parameters of debate

    DEFF Research Database (Denmark)

    Beach, Derek; Rohlfing, Ingo

    2018-01-01

    In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods - Qualitative Comparative Analysis (QCA) and typological theory – and their combination with process tracing. Our goal is to broaden and deepen set-theoretic empirical research and equip scholars with guidance on how to implement it in multi-method research (MMR). At first glance, set-theoretic cross-case methods and process tracing seem to be highly compatible when causal relationships are conceptualized in terms of set theory. However, multiple issues have not so far been thoroughly addressed. Our paper builds on the emerging MMR literature and seeks to enhance it in four ways. First, we offer a comprehensive and coherent elaboration of the two sequences in which case studies…

  12. Factors Influencing the Degree of Intrajudge Consistency during the Standard Setting Process.

    Science.gov (United States)

    Plake, Barbara S.; And Others

    The accuracy of standards obtained from judgmental methods is dependent on the quality of the judgments made by experts throughout the standard setting process. One important dimension of the quality of these judgments is the consistency of the judges' perceptions with item performance of minimally competent candidates. Several interrelated…

  13. An improved level set method for brain MR images segmentation and bias correction.

    Science.gov (United States)

    Chen, Yunjie; Zhang, Jianwei; Macione, Jim

    2009-10-01

    Intensity inhomogeneities cause considerable difficulty in the quantitative analysis of magnetic resonance (MR) images. Thus, bias field estimation is a necessary step before quantitative analysis of MR data can be undertaken. This paper presents a variational level set approach to bias correction and segmentation for images with intensity inhomogeneities. Our method is based on an observation that intensities in a relatively small local region are separable, despite the inseparability of the intensities in the whole image caused by the overall intensity inhomogeneity. We first define a localized K-means-type clustering objective function for image intensities in a neighborhood around each point. The cluster centers in this objective function have a multiplicative factor that estimates the bias within the neighborhood. The objective function is then integrated over the entire domain to define the data term in the level set framework. Our method is able to capture bias fields of quite general profiles. Moreover, it is robust to initialization, and thereby allows fully automated applications. The proposed method has been used for images of various modalities with promising results.
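
    Schematically, the localized clustering energy described above has the following form (our notation; the paper's weights and regularization terms may differ), with a localization kernel K and a multiplicative bias b(x) scaling the cluster centres c_i:

      \[
        E(\phi, b, \{c_i\}) = \int \sum_{i=1}^{N}
          \Bigl( \int K(x-y)\,\bigl| I(y) - b(x)\,c_i \bigr|^{2}\, u_i(y)\, dy \Bigr)\, dx,
      \]

    where the u_i are the level-set-encoded membership functions of the N regions; minimizing alternately over the c_i, b and the level set function yields segmentation and bias estimation simultaneously.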

  14. A Better Insight Into IT Contribution by Process Level Structure

    DEFF Research Database (Denmark)

    Shahim, Nazli; Møller, Charles

    2013-01-01

    Creation of IT business value through its impact on value chain processes made it the objective of this research to compare and differentiate the role of IT at both the process and firm levels. A discussion of IT's impact at both levels is made through previous theoretical and empirical studies… The discussion is supported by an introduction to the case study of Royal Greenland. The contribution of this paper is the result of the discussions and the case study, reaching the point that IT's supporting influences are better understood and observed at the process level rather than at the firm output level.

  15. What makes deeply encoded items memorable? Insights into the levels of processing framework from neuroimaging and neuromodulation

    Directory of Open Access Journals (Sweden)

    Giulia eGalli

    2014-05-01

    Full Text Available When we form new memories, their mnestic fate largely depends upon the cognitive operations set in train during encoding. A typical observation in experimental as well as everyday life settings is that if we learn an item using semantic or deep operations, such as attending to its meaning, memory will be better than if we learn the same item using more shallow operations, such as attending to its structural features. In the psychological literature, this phenomenon has been conceptualised within the levels of processing framework and has been consistently replicated since its original proposal by Craik and Lockhart in 1972. However, the exact mechanisms underlying the memory advantage for deeply encoded items are not yet entirely understood. A cognitive neuroscience perspective can add to this field by clarifying the nature of the processes involved in effective deep and shallow encoding and how they are instantiated in the brain, but so far there has been little work to systematically integrate findings from the literature. This work aims to fill this gap by reviewing, first, some of the key neuroimaging findings on the neural correlates of deep and shallow episodic encoding and second, emerging evidence from studies using neuromodulatory approaches such as psychopharmacology and non-invasive brain stimulation. Taken together, these studies help further our understanding of levels of processing. In addition, by showing that deep encoding can be modulated by acting upon specific brain regions or systems, the reviewed studies pave the way for selective enhancements of episodic encoding processes.

  17. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper lays the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous relevant quality assurance and improvement methods and tools are identified. The main manufacturing process principles are investigated in order to scrutinize a general model of the manufacturing process and to define the manufacturing process preparation level. Development and introduction of the operational quality improvement model are based on research into the application possibilities of these methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented in an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  18. The Defense Waste Processing Facility: an innovative process for high-level waste immobilization

    International Nuclear Information System (INIS)

    Cowan, S.P.

    1985-01-01

    The Defense Waste Processing Facility (DWPF), under construction at the Department of Energy's Savannah River Plant (SRP), will process defense high-level radioactive waste so that it can be disposed of safely. The DWPF will immobilize the high-activity fraction of the waste in borosilicate glass cast in stainless steel canisters which can be handled, stored, transported and disposed of in a geologic repository. The low-activity fraction of the waste, which represents about 90% of the high-level waste (HLW) volume, will be decontaminated and disposed of on the SRP site. After decontamination the canister will be welded shut by an upset resistance welding technique. In this process a slightly oversized plug is pressed into the canister opening. At the same time a large current is passed through the canister and plug. The higher resistance of the canister/plug interface causes the heat which welds the plug in place. This process provides a high quality, reliable weld by a process easily operated remotely.

  19. Efficient processing of containment queries on nested sets

    NARCIS (Netherlands)

    Ibrahim, A.; Fletcher, G.H.L.

    2013-01-01

    We study the problem of computing containment queries on sets which can have both atomic and set-valued objects as elements, i.e., nested sets. Containment is a fundamental query pattern with many basic applications. Our study of nested set containment is motivated by the ubiquity of nested data in
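
    As a toy rendering of the query pattern studied here (an assumed recursive semantics, not the authors' definition or algorithm): a nested set A is contained in B if every atomic element of A occurs in B and every set-valued element of A is contained in some set-valued element of B.

      def contained_in(a, b):
          # Naive recursive containment check for nested sets: atoms of a
          # must occur in b; set-valued elements of a must be contained in
          # some set-valued element of b.
          for elem in a:
              if isinstance(elem, frozenset):
                  if not any(isinstance(e, frozenset) and contained_in(elem, e)
                             for e in b):
                      return False
              elif elem not in b:
                  return False
          return True

      a = frozenset({1, frozenset({2, 3})})
      b = frozenset({1, 4, frozenset({2, 3, 5})})
      print(contained_in(a, b))   # True: 1 occurs in b, {2,3} fits in {2,3,5}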

  20. ON THE ESTIMATION OF DISTANCE DISTRIBUTION FUNCTIONS FOR POINT PROCESSES AND RANDOM SETS

    Directory of Open Access Journals (Sweden)

    Dietrich Stoyan

    2011-05-01

    Full Text Available This paper discusses various estimators for the nearest neighbour distance distribution function D of a stationary point process and for the quadratic contact distribution function Hq of a stationary random closed set. It recommends the use of Hanisch's estimator of D, which is of Horvitz-Thompson type, and the minus-sampling estimator of Hq. This recommendation is based on simulations for Poisson processes and Boolean models.
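
    The minus-sampling idea is easy to state in code: only points lying further than r from the window boundary contribute, which removes edge bias at the price of discarding data. The sketch below (Python/NumPy, unit square window, hypothetical point pattern) applies it to the nearest-neighbour distance distribution D for concreteness; Hanisch's estimator refines this with Horvitz-Thompson-style weights and is not shown.

      import numpy as np

      def D_minus_sampling(points, r):
          # Minus-sampling estimate of D(r) = P(nearest-neighbour dist <= r)
          # in the unit square: average the indicator over points lying more
          # than r from the window boundary.
          pts = np.asarray(points, dtype=float)
          border = np.minimum(pts, 1.0 - pts).min(axis=1)   # dist to boundary
          eligible = border > r
          if not eligible.any():
              return np.nan
          d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
          np.fill_diagonal(d, np.inf)
          nn = d.min(axis=1)                # nearest-neighbour distances
          return float((nn[eligible] <= r).mean())

      rng = np.random.default_rng(0)
      print(D_minus_sampling(rng.random((500, 2)), 0.03))
      # for a Poisson-like pattern, compare with 1 - exp(-lambda * pi * r^2)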

  1. Health technology assessment process of a cardiovascular medical device in four different settings.

    Science.gov (United States)

    Olry de Labry Lima, Antonio; Espín Balbino, Jaime; Lemgruber, Alexandre; Caro Martínez, Araceli; García-Mochón, Leticia; Martín Ruiz, Eva; Lessa, Fernanda

    2017-10-01

    Health technology assessment (HTA) is a tool to support the decision-making process. The aim of this study is to describe the methods and processes used in reimbursement decision making for drug-eluting stents (DES) in four different settings. DES was selected as the technology under study according to several criteria, all agreed upon by a working group. A survey of key informants was designed. DES was evaluated following well-structured HTA processes. Nonetheless, scope for improvement was observed in relation to the data considered for the final decision and the transparency and inclusiveness of the process, as well as in the methods employed. The study represents an attempt to describe the HTA processes applied to a well-known medical device.

  2. DESIRE FOR LEVELS. Background study for the policy document "Setting Environmental Quality Standards for Water and Soil"

    NARCIS (Netherlands)

    van de Meent D; Aldenberg T; Canton JH; van Gestel CAM; Slooff W

    1990-01-01

    The report provides scientific support for setting environmental quality objectives for water, sediment and soil. Quality criteria are not set in this report. Only options for decisions are given. The report is restricted to the derivation of the 'maximally acceptable risk' levels (MAR)

  3. Development of a Goal Setting Process and Instrumentation for Teachers and Principals.

    Science.gov (United States)

    Minix, Nancy; And Others

    A pilot program, the Career Ladder Plan, was developed in Kentucky to evaluate a teacher's performance in terms of professional growth and development and professional leadership/initiative based on that teacher's performance on a setting/goal attainment process. Goals jointly selected by the teacher and his/her principal must contribute to school…

  4. Humidity level in psychrometric processes

    International Nuclear Information System (INIS)

    Mojsovski, Filip

    2008-01-01

    When a thermal engineer needs to control, rather than merely moderate, humidity, he must focus on the moisture level as a separate variable - not simply as an adjunct to temperature control. Controlling humidity generally demands a correct psychrometric approach dedicated to that purpose [1]. Analysis of the humidity level in psychrometric thermal processes leads to relevant data for theory and practice [2]. This paper presents: (1) the summer climatic curve for the Skopje region, (2) selected results of investigations of farm dryers conducted outside the laboratory. The first purpose of this activity was to examine the relations between weather conditions and drying conditions. The estimation of weather conditions for the warmest season of the year was realized by means of a summer climatic curve. In the science of drying, the basic drying conditions are the temperature, relative humidity and velocity of the air, the thickness of the dried product and the dryer construction. The second purpose was to realize correct prediction of drying rates for various psychrometric drying processes and local products. Test runs with the dryer were carried out over a period of 24 h, using fruits and vegetables as experimental material. An air flow rate through the dryer of 150 m3/h, an overall drying rate of 0.04 kg/h and an air temperature of 65 °C were reached. Three types of solar dryers were exploited in the research.
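
    The air states behind such drying tests are fixed by elementary psychrometric relations; as an illustration (Python, Magnus-type saturation formula with standard constants, and an assumed relative humidity - neither taken from the paper):

      import math

      def saturation_vapour_pressure(t_celsius):
          # Magnus-type approximation (Pa); adequate for drying-air estimates.
          return 611.2 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

      def humidity_ratio(t_celsius, rel_humidity, pressure=101325.0):
          # kg of water vapour per kg of dry air at the given state.
          e = rel_humidity * saturation_vapour_pressure(t_celsius)
          return 0.622 * e / (pressure - e)

      # drying air at the 65 degC test temperature and an assumed 10 % RH
      print(round(humidity_ratio(65.0, 0.10), 4))   # ~0.016 kg/kg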

  5. Strengthening fairness, transparency and accountability in health care priority setting at district level in Tanzania

    Directory of Open Access Journals (Sweden)

    Stephen Maluka

    2011-11-01

    Full Text Available Health care systems are faced with the challenge of resource scarcity and have insufficient resources to respond to all health problems and target groups simultaneously. Hence, priority setting is an inevitable aspect of every health system. However, priority setting is complex and difficult because the process is frequently influenced by political, institutional and managerial factors that are not considered by conventional priority-setting tools. In a five-year EU-supported project, which started in 2006, ways of strengthening fairness and accountability in priority setting in district health management were studied. This review is based on a PhD thesis that aimed to analyse health care organisation and management systems, and explore the potential and challenges of implementing Accountability for Reasonableness (A4R approach to priority setting in Tanzania. A qualitative case study in Mbarali district formed the basis of exploring the sociopolitical and institutional contexts within which health care decision making takes place. The study also explores how the A4R intervention was shaped, enabled and constrained by the contexts. Key informant interviews were conducted. Relevant documents were also gathered and group priority-setting processes in the district were observed. The study revealed that, despite the obvious national rhetoric on decentralisation, actual practice in the district involved little community participation. The assumption that devolution to local government promotes transparency, accountability and community participation, is far from reality. The study also found that while the A4R approach was perceived to be helpful in strengthening transparency, accountability and stakeholder engagement, integrating the innovation into the district health system was challenging. This study underscores the idea that greater involvement and accountability among local actors may increase the legitimacy and fairness of priority-setting

  6. Considering Actionability at the Participant's Research Setting Level for Anticipatable Incidental Findings from Clinical Research.

    Science.gov (United States)

    Ortiz-Osorno, Alberto Betto; Ehler, Linda A; Brooks, Judith

    2015-01-01

    Determining what constitutes an anticipatable incidental finding (IF) from clinical research and defining whether, and when, this IF should be returned to the participant have been topics of discussion in the field of human subject protections for the last 10 years. It has been debated that implementing a comprehensive IF-approach that addresses both the responsibility of researchers to return IFs and the expectation of participants to receive them can be logistically challenging. IFs have been debated at different levels, such as the ethical reasoning for considering their disclosure or the need for planning for them during the development of the research study. Some authors have discussed the methods for re-contacting participants for disclosing IFs, as well as the relevance of considering the clinical importance of the IFs. Similarly, other authors have debated about when IFs should be disclosed to participants. However, no author has addressed how the "actionability" of the IFs should be considered, evaluated, or characterized at the participant's research setting level. This paper defines the concept of "Actionability at the Participant's Research Setting Level" (APRSL) for anticipatable IFs from clinical research, discusses some related ethical concepts to justify the APRSL concept, proposes a strategy to incorporate APRSL into the planning and management of IFs, and suggests a strategy for integrating APRSL at each local research setting. © 2015 American Society of Law, Medicine & Ethics, Inc.

  7. Complementary Self-Biased Logics Based on Single-Electron Transistor (SET)/CMOS Hybrid Process

    Science.gov (United States)

    Song, Ki-Whan; Lee, Yong Kyu; Sim, Jae Sung; Kim, Kyung Rok; Lee, Jong Duk; Park, Byung-Gook; You, Young Sub; Park, Joo-On; Jin, You Seung; Kim, Young-Wug

    2005-04-01

    We propose a complementary self-biasing method which enables the single-electron transistor (SET)/complementary metal-oxide semiconductor (CMOS) hybrid multi-valued logics (MVLs) to operate well at high temperatures, where the peak-to-valley current ratio (PVCR) of the Coulomb oscillation markedly decreases. The new architecture is implemented with a few transistors by utilizing the phase control capability of the sidewall depletion gates in dual-gate single-electron transistors (DGSETs). The suggested scheme is evaluated by a SPICE simulation with an analytical DGSET model. Furthermore, we have developed a new process technology for the SET/CMOS hybrid systems. We have confirmed that both of the fabricated devices, namely, SET and CMOS transistors, exhibit the ideal characteristics for the complementary self-biasing scheme: the SET shows clear Coulomb oscillations with a 100 mV period and the CMOS transistors show a high voltage gain.

  8. The levels of processing effect under nitrogen narcosis.

    Science.gov (United States)

    Kneller, Wendy; Hobbs, Malcolm

    2013-01-01

    Previous research has consistently demonstrated that inert gas (nitrogen) narcosis affects free recall but not recognition memory in the depth range of 30 to 50 meters of sea water (msw), possibly as a result of narcosis preventing processing when learned material is encoded. The aim of the current research was to test this hypothesis by applying a levels of processing approach to the measurement of free recall under narcosis. Experiment 1 investigated the effect of depth (0-2 msw vs. 37-39 msw) and level of processing (shallow vs. deep) on free recall memory performance in 67 divers. When age was included as a covariate, recall was significantly worse in deep water (i.e., under narcosis), compared to shallow water, and was significantly higher in the deep processing compared to shallow processing conditions in both depth conditions. Experiment 2 demonstrated that this effect was not simply due to the different underwater environments used for the depth conditions in Experiment 1. It was concluded that memory performance can be altered by processing under narcosis, supporting the contention that narcosis affects the encoding stage of memory as opposed to self-guided search (retrieval).

  9. Process-oriented guided inquiry learning strategy enhances students' higher level thinking skills in a pharmaceutical sciences course.

    Science.gov (United States)

    Soltis, Robert; Verlinden, Nathan; Kruger, Nicholas; Carroll, Ailey; Trumbo, Tiffany

    2015-02-17

    To determine if the process-oriented guided inquiry learning (POGIL) teaching strategy improves student performance and engages higher-level thinking skills of first-year pharmacy students in an Introduction to Pharmaceutical Sciences course. Overall examination scores and scores on questions categorized as requiring either higher-level or lower-level thinking skills were compared in the same course taught over 3 years using traditional lecture methods vs the POGIL strategy. Student perceptions of the latter teaching strategy were also evaluated. Overall mean examination scores increased significantly when POGIL was implemented. Performance on questions requiring higher-level thinking skills was significantly higher, whereas performance on questions requiring lower-level thinking skills was unchanged when the POGIL strategy was used. Student feedback on use of this teaching strategy was positive. The use of the POGIL strategy increased student overall performance on examinations, improved higher-level thinking skills, and provided an interactive class setting.

  10. Effect of culture levels, ultrafiltered retentate addition, total solid levels and heat treatments on quality improvement of buffalo milk plain set yoghurt.

    Science.gov (United States)

    Yadav, Vijesh; Gupta, Vijay Kumar; Meena, Ganga Sahay

    2018-05-01

    The effects of culture level (2, 2.5 and 3%), ultrafiltered (UF) retentate addition (0, 11, 18%), total milk solids (13, 13.50, 14%) and heat treatment (80 and 85 °C/30 min) on the change in pH and titratable acidity (TA), sensory scores and rheological parameters of yoghurt were studied. With a 3% culture level, the required TA (0.90% LA) was achieved in a minimum incubation of 6 h. With an increase in UF retentate addition, there was a highly significant decrease in the overall acceptability, body and texture, and colour and appearance scores, but a highly significant increase in the rheological parameters of the yoghurt samples. Yoghurt made from even 13.75% total solids containing no UF retentate was judged sufficiently firm by the sensory panel. Most of the sensory attributes of yoghurt made with 13.50% total solids were significantly better than those of yoghurt prepared with either 13 or 14% total solids. Standardised milk heated to 85 °C/30 min resulted in significantly better overall acceptability of the yoghurt. The overall acceptability of the optimised yoghurt was significantly better than that of a branded market sample. UF retentate addition adversely affected yoghurt quality, whereas optimization of the culture level, total milk solids and other process parameters noticeably improved the quality of plain set yoghurt, giving a shelf life of 15 days at 4 °C.

  11. Analysis of Forensic Autopsy in 120 Cases of Medical Disputes Among Different Levels of Institutional Settings.

    Science.gov (United States)

    Yu, Lin-Sheng; Ye, Guang-Hua; Fan, Yan-Yan; Li, Xing-Biao; Feng, Xiang-Ping; Han, Jun-Ge; Lin, Ke-Zhi; Deng, Miao-Wu; Li, Feng

    2015-09-01

    Despite advances in medical science, the causes of death can sometimes only be determined by pathologists after a complete autopsy. Few studies have investigated the importance of forensic autopsy in medically disputed cases among different levels of institutional settings. Our study aimed to analyze forensic autopsies in 120 cases of medical disputes among five levels of institutional settings between 2001 and 2012 in Wenzhou, China. The results showed an overall concordance rate of 55%. Of the 39% of cases with clinically missed diagnoses, cardiovascular pathology comprised 55.32%, while respiratory pathology accounted for the remaining 44.68%. Factors that increased the likelihood of missed diagnoses were private clinics, community settings, and county hospitals. These results support the view that autopsy remains an important tool in establishing the cause of death in medically disputed cases, which may directly determine or exclude the fault of medical care and therefore help in resolving these cases. © 2015 American Academy of Forensic Sciences.

  12. INTEGRATED SFM TECHNIQUES USING DATA SET FROM GOOGLE EARTH 3D MODEL AND FROM STREET LEVEL

    Directory of Open Access Journals (Sweden)

    L. Inzerillo

    2017-08-01

    Full Text Available Structure from motion (SfM) is a widespread photogrammetric method that uses photogrammetric rules to derive a 3D model from a photo data set. Some complex ancient buildings, such as cathedrals, theatres or castles, require the data set acquired from street level to be complemented with a UAV data set in order to reconstruct the roof in 3D. Nevertheless, the use of UAVs is strongly limited by government rules. In recent years, Google Earth (GE) has been enriched with 3D models of sites on the earth. For this reason, it seemed worthwhile to test the potential offered by GE in order to extract from it a data set that replaces the UAV function and completes the aerial building data set, using screen images of its high resolution 3D models. Users can take unlimited "aerial photos" of a scene while flying around in GE at any viewing angle and altitude. The challenge is to verify the metric reliability of an SfM model carried out with an integrated data set (the one from street level and the one from GE) aimed at replacing the use of UAVs in urban contexts. This model is called the integrated GE SfM model (i-GESfM). In this paper, a case study is presented: the Cathedral of Palermo.

  13. Reconciling the influence of task-set switching and motor inhibition processes on stop signal after-effects.

    Science.gov (United States)

    Anguera, Joaquin A; Lyman, Kyle; Zanto, Theodore P; Bollinger, Jacob; Gazzaley, Adam

    2013-01-01

    Executive response functions can be affected by preceding events, even if they are no longer associated with the current task at hand. For example, studies utilizing the stop signal task have reported slower response times to "GO" stimuli when the preceding trial involved the presentation of a "STOP" signal. However, the neural mechanisms that underlie this behavioral after-effect are unclear. To address this, behavioral and electroencephalography (EEG) measures were examined in 18 young adults (18-30 years) on "GO" trials following a previously "Successful Inhibition" trial (pSI), a previously "Failed Inhibition" trial (pFI), and a previous "GO" trial (pGO). Consistent with previous research, slower response times were observed during both pSI and pFI trials (i.e., "GO" trials that were preceded by a successful and unsuccessful inhibition trial, respectively) compared to pGO trials (i.e., "GO" trials that were preceded by another "GO" trial). Interestingly, response time slowing was greater during pSI trials compared to pFI trials, suggesting executive control is influenced by both task set switching and persisting motor inhibition processes. Follow-up behavioral analyses indicated that these effects resulted from between-trial control adjustments rather than repetition priming effects. Analyses of inter-electrode coherence (IEC) and inter-trial coherence (ITC) indicated that both pSI and pFI trials showed greater phase synchrony during the inter-trial interval compared to pGO trials. Unlike the IEC findings, differential ITC was present within the beta and alpha frequency bands in line with the observed behavior (pSI > pFI > pGO), suggestive of more consistent phase synchrony involving motor inhibition processes during the ITI at a regional level. These findings suggest that between-trial control adjustments involved with task-set switching and motor inhibition processes influence subsequent performance, providing new insights into the dynamic nature of executive control.
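
    Inter-trial coherence, as used above, is the length of the mean unit phase vector across trials at each time point: 0 for random phases, 1 for perfectly locked phases. A compact version for one band-passed channel (Python/NumPy/SciPy; hypothetical trials-by-time layout, not the study's pipeline):

      import numpy as np
      from scipy.signal import hilbert

      def inter_trial_coherence(trials):
          # trials: (n_trials, n_time) band-passed single-channel data.
          # ITC(t) = |mean over trials of exp(i * phase(trial, t))|.
          phases = np.angle(hilbert(trials, axis=1))
          return np.abs(np.exp(1j * phases).mean(axis=0))

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 1.0, 500)
      signal = np.sin(2 * np.pi * 10 * t)                     # phase-locked 10 Hz
      trials = signal + 0.5 * rng.standard_normal((40, 500))  # 40 noisy trials
      print(inter_trial_coherence(trials).mean())             # close to 1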

  14. Point process analyses of variations in smoking rate by setting, mood, gender, and dependence

    Science.gov (United States)

    Shiffman, Saul; Rathbun, Stephen L.

    2010-01-01

    The immediate emotional and situational antecedents of ad libitum smoking are still not well understood. We re-analyzed data from Ecological Momentary Assessment using novel point-process analyses, to assess how craving, mood, and social setting influence smoking rate, as well as assessing the moderating effects of gender and nicotine dependence. 304 smokers recorded craving, mood, and social setting using electronic diaries when smoking and at random nonsmoking times over 16 days of smoking. Point-process analysis, which makes use of the known random sampling scheme for momentary variables, examined main effects of setting and interactions with gender and dependence. Increased craving was associated with higher rates of smoking, particularly among women. Negative affect was not associated with smoking rate, even in interaction with arousal, but restlessness was associated with substantially higher smoking rates. Women's smoking tended to be less affected by negative affect. Nicotine dependence had little moderating effect on situational influences. Smoking rates were higher when smokers were alone or with others smoking, and smoking restrictions reduced smoking rates. However, the presence of others smoking undermined the effects of restrictions. The more sensitive point-process analyses confirmed earlier findings, including the surprising conclusion that negative affect by itself was not related to smoking rates. Contrary to hypothesis, men's and not women's smoking was influenced by negative affect. Both smoking restrictions and the presence of others who are not smoking suppress smoking, but others’ smoking undermines the effects of restrictions. Point-process analyses of EMA data can bring out even small influences on smoking rate. PMID:21480683

  15. Predictive information speeds up visual awareness in an individuation task by modulating threshold setting, not processing efficiency.

    Science.gov (United States)

    De Loof, Esther; Van Opstal, Filip; Verguts, Tom

    2016-04-01

    Theories on visual awareness claim that predicted stimuli reach awareness faster than unpredicted ones. In the current study, we disentangle whether prior information about the upcoming stimulus affects visual awareness of stimulus location (i.e., individuation) by modulating processing efficiency or threshold setting. Analogous research on stimulus identification revealed that prior information modulates threshold setting. However, as identification and individuation are two functionally and neurally distinct processes, the mechanisms underlying identification cannot simply be extrapolated directly to individuation. The goal of this study was therefore to investigate how individuation is influenced by prior information about the upcoming stimulus. To do so, a drift diffusion model was fitted to estimate the processing efficiency and threshold setting for predicted versus unpredicted stimuli in a cued individuation paradigm. Participants were asked to locate a picture, following a cue that was congruent, incongruent or neutral with respect to the picture's identity. Pictures were individuated faster in the congruent and neutral condition compared to the incongruent condition. In the diffusion model analysis, the processing efficiency was not significantly different across conditions. However, the threshold setting was significantly higher following an incongruent cue compared to both congruent and neutral cues. Our results indicate that predictive information about the upcoming stimulus influences visual awareness by shifting the threshold for individuation rather than by enhancing processing efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
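
    The diffusion-model logic of this result is easy to demonstrate by simulation: raising the decision threshold lengthens decision times even when the drift rate (processing efficiency) is unchanged. A toy Euler-discretized version with arbitrary parameter values (Python, not the fitted model from the study):

      import numpy as np

      def mean_decision_time(drift, threshold, n_trials=2000, dt=0.001, seed=0):
          # Symmetric drift diffusion process starting at 0, absorbing at
          # +/- threshold; Euler steps with unit diffusion noise.
          rng = np.random.default_rng(seed)
          times = np.empty(n_trials)
          for i in range(n_trials):
              x, t = 0.0, 0.0
              while abs(x) < threshold:
                  x += drift * dt + np.sqrt(dt) * rng.standard_normal()
                  t += dt
              times[i] = t
          return times.mean()

      # equal drift (processing efficiency), different thresholds
      print(mean_decision_time(drift=1.0, threshold=1.0))   # faster
      print(mean_decision_time(drift=1.0, threshold=1.5))   # slower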

  16. Spacecraft Data Simulator for the test of level zero processing systems

    Science.gov (United States)

    Shi, Jeff; Gordon, Julie; Mirchandani, Chandru; Nguyen, Diem

    1994-01-01

    The Microelectronic Systems Branch (MSB) at Goddard Space Flight Center (GSFC) has developed a Spacecraft Data Simulator (SDS) to support the development, test, and verification of prototype and production Level Zero Processing (LZP) systems. Based on a disk array system, the SDS is capable of generating large test data sets up to 5 Gigabytes and outputting serial test data at rates up to 80 Mbps. The SDS supports data formats including NASA Communication (Nascom) blocks, Consultative Committee for Space Data Systems (CCSDS) Version 1 & 2 frames and packets, and all the Advanced Orbiting Systems (AOS) services. The capability to simulate both sequential and non-sequential time-ordered downlink data streams with errors and gaps is crucial to test LZP systems. This paper describes the system architecture, hardware and software designs, and test data designs. Examples of test data designs are included to illustrate the application of the SDS.
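
    The essence of such test data can be mimicked in miniature: sequence-numbered frames with deliberately injected gaps and reordering, which is precisely what a level zero processor must detect and repair. The sketch below is a simplified, hypothetical generator (Python), not the SDS's actual CCSDS frame formats:

      import random

      def make_test_stream(n_frames=100, gap_rate=0.05, swap_rate=0.05, seed=42):
          # Sequence-numbered test frames with injected errors: some frames
          # dropped (gaps) and some adjacent frames swapped (reordering).
          rng = random.Random(seed)
          frames = [(i, f"payload-{i}".encode()) for i in range(n_frames)]
          frames = [f for f in frames if rng.random() > gap_rate]
          for k in range(len(frames) - 1):
              if rng.random() < swap_rate:
                  frames[k], frames[k + 1] = frames[k + 1], frames[k]
          return frames

      stream = make_test_stream()
      breaks = sum(1 for a, b in zip(stream, stream[1:]) if b[0] != a[0] + 1)
      print(f"{len(stream)} frames, {breaks} sequence discontinuities")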

  17. Low-level wastewater treatment facility process control operational test report

    International Nuclear Information System (INIS)

    Bergquist, G.G.

    1996-01-01

    This test report documents the results obtained while conducting operational testing of a new TK 102 level controller and total outflow integrator added to the NHCON software that controls the Low-Level Wastewater Treatment Facility (LLWTF). The test was performed with WHC-SD-CP-OTP 154, PFP Low-Level Wastewater Treatment Facility Process Control Operational Test. A complete test copy is included in Appendix A. The new TK 102 level controller provides a signal, hereafter referred to as its cascade mode, to the treatment train flow controller, which enables the water treatment process to run for long periods without continuous operator monitoring. The test successfully demonstrated the functionality of the new controller under standard and abnormal conditions expected in LLWTF operation. In addition, a flow totalizer is now displayed on the LLWTF outlet MICON screen, which tallies the process output in gallons. This feature substantially improves the ability to retrieve daily process volumes for maintaining accurate material balances.
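
    The report does not reproduce the control algorithm, but the cascade arrangement it describes can be sketched as two nested PI loops plus an outflow totalizer: the level controller supplies the setpoint of the flow controller, and the totalizer integrates outflow in gallons. All gains, units and process constants below are illustrative assumptions, not plant values.

        # Minimal cascade sketch: the TK 102 level controller (outer PI loop)
        # supplies the setpoint of the treatment-train flow controller (inner
        # PI loop); a totalizer integrates outflow in gallons, as on the
        # MICON screen. All constants are invented for illustration.
        DT = 1.0                      # control interval, s
        level_sp, level = 60.0, 55.0  # percent of span
        flow = 0.0                    # gpm
        total_gal = 0.0
        i_lvl = i_flw = 0.0           # PI integrator states

        for step in range(3600):                       # one simulated hour
            # Outer loop: level error -> flow setpoint (cascade mode).
            e_lvl = level_sp - level
            i_lvl += e_lvl * DT
            flow_sp = max(0.0, 2.0 * e_lvl + 0.01 * i_lvl + 20.0)

            # Inner loop: flow error -> valve command (crudely, flow tracks it).
            e_flw = flow_sp - flow
            i_flw += e_flw * DT
            flow += (0.5 * e_flw + 0.05 * i_flw) * DT

            # Simplified tank response: inflow disturbance minus outflow.
            level += (22.0 - flow) * DT / 100.0

            total_gal += flow * DT / 60.0              # totalizer, gallons

        print(f"level={level:.1f}%, outflow total={total_gal:.0f} gal")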

  18. SET oncoprotein accumulation regulates transcription through DNA demethylation and histone hypoacetylation.

    Science.gov (United States)

    Almeida, Luciana O; Neto, Marinaldo P C; Sousa, Lucas O; Tannous, Maryna A; Curti, Carlos; Leopoldino, Andreia M

    2017-04-18

    Epigenetic modifications are essential in the control of normal cellular processes and cancer development. DNA methylation and histone acetylation are major epigenetic modifications involved in gene transcription and in the abnormal events driving the oncogenic process. SET protein accumulates in many cancer types, including head and neck squamous cell carcinoma (HNSCC); SET is a member of the INHAT complex that inhibits gene transcription by associating with histones and preventing their acetylation. We explored how SET protein accumulation impacts the regulation of gene expression, focusing on DNA methylation and histone acetylation. DNA methylation profiling of 24 tumour suppressors showed that SET accumulation decreased DNA methylation, in association with loss of 5-methylcytidine, formation of 5-hydroxymethylcytosine and increased TET1 levels, indicating an active DNA demethylation mechanism. However, the expression of some suppressor genes was lowered in cells with high SET levels, suggesting that loss of methylation is not the main mechanism modulating gene expression. SET accumulation also downregulated the expression of 32 genes of a panel of 84 transcription factors, and SET directly interacted with chromatin at the promoters of the downregulated genes, decreasing histone acetylation. Gene expression analysis after cell treatment with 5-aza-2'-deoxycytidine (5-AZA) and Trichostatin A (TSA) revealed that histone acetylation reversed the transcription repression promoted by SET. These results suggest a new function for SET in the regulation of chromatin dynamics. In addition, TSA diminished both SET protein levels and SET's capability to bind to gene promoters, suggesting that administration of epigenetic modifier agents could be efficient in reversing the SET phenotype in cancer.

  19. Individual and setting level predictors of the implementation of a skin cancer prevention program: a multilevel analysis

    Directory of Open Access Journals (Sweden)

    Brownson Ross C

    2010-05-01

    Full Text Available Abstract Background To achieve widespread cancer control, a better understanding is needed of the factors that contribute to successful implementation of effective skin cancer prevention interventions. This study assessed the relative contributions of individual- and setting-level characteristics to implementation of a widely disseminated skin cancer prevention program. Methods A multilevel analysis was conducted using data from the Pool Cool Diffusion Trial from 2004 and replicated with data from 2005. Implementation of Pool Cool by lifeguards was measured using a composite score (implementation variable, range 0 to 10) that assessed whether the lifeguard performed different components of the intervention. Predictors included lifeguard background characteristics, lifeguard sun protection-related attitudes and behaviors, pool characteristics, and enhanced (i.e., more technical assistance, tailored materials, and incentives are provided) versus basic treatment group. Results The mean value of the implementation variable was 4 in both years (2004 and 2005; SD = 2 in 2004 and SD = 3 in 2005), indicating moderate implementation for most lifeguards. Several individual-level (lifeguard characteristics) and setting-level (pool characteristics and treatment group) factors were found to be significantly associated with implementation of Pool Cool by lifeguards. All three lifeguard-level domains (lifeguard background characteristics, lifeguard sun protection-related attitudes and behaviors) and six pool-level predictors (number of weekly pool visitors, intervention intensity, geographic latitude, pool location, sun safety and/or skin cancer prevention programs, and sun safety programs and policies) were included in the final model. The most important predictors of implementation were the number of weekly pool visitors (inverse association) and enhanced treatment group (positive association). That is, pools with fewer weekly visitors and pools in the enhanced ...
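
    A multilevel analysis of this kind, with lifeguards (individual level) nested within pools (setting level), is commonly expressed as a mixed-effects model with a random intercept per pool. The sketch below shows that structure with statsmodels; the file and column names are hypothetical, not the Pool Cool data.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data: one row per lifeguard, 'pool' identifies the setting.
        df = pd.read_csv("pool_cool.csv")  # columns assumed: implementation (0-10),
                                           # weekly_visitors, enhanced (0/1), pool

        # A random intercept for each pool captures setting-level clustering;
        # fixed effects estimate individual- and pool-level predictors.
        model = smf.mixedlm("implementation ~ weekly_visitors + enhanced",
                            data=df, groups=df["pool"])
        result = model.fit()
        print(result.summary())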

  20. Setting up a randomized clinical trial in the UK: approvals and process.

    Science.gov (United States)

    Greene, Louise Eleanor; Bearn, David R

    2013-06-01

    Randomized clinical trials are considered the 'gold standard' in primary research for healthcare interventions. However, they can be expensive and time-consuming to set up and require many approvals to be in place before they can begin. This paper outlines how to determine what approvals are required for a trial, the background of each approval and the process for obtaining them.

  1. Microwave Enhanced Cotunneling in SET Transistors

    DEFF Research Database (Denmark)

    Manscher, Martin; Savolainen, M.; Mygind, Jesper

    2003-01-01

    Cotunneling in single electron tunneling (SET) devices is an error process which may severely limit their electronic and metrologic applications. We present an experimental investigation of the theory for adiabatic enhancement of cotunneling by coherent microwaves. Cotunneling in SET transistors has been measured as a function of temperature, gate voltage, frequency, and applied microwave power. At low temperatures and applied power levels, including also sequential tunneling, the results can be made consistent with theory using the unknown damping in the microwave line as the only free parameter.

  2. Process-Oriented Guided Inquiry Learning Strategy Enhances Students’ Higher Level Thinking Skills in a Pharmaceutical Sciences Course

    Science.gov (United States)

    Verlinden, Nathan; Kruger, Nicholas; Carroll, Ailey; Trumbo, Tiffany

    2015-01-01

    Objective. To determine if the process-oriented guided inquiry learning (POGIL) teaching strategy improves student performance and engages higher-level thinking skills of first-year pharmacy students in an Introduction to Pharmaceutical Sciences course. Design. Overall examination scores and scores on questions categorized as requiring either higher-level or lower-level thinking skills were compared in the same course taught over 3 years using traditional lecture methods vs the POGIL strategy. Student perceptions of the latter teaching strategy were also evaluated. Assessment. Overall mean examination scores increased significantly when POGIL was implemented. Performance on questions requiring higher-level thinking skills was significantly higher, whereas performance on questions requiring lower-level thinking skills was unchanged when the POGIL strategy was used. Student feedback on use of this teaching strategy was positive. Conclusion. The use of the POGIL strategy increased student overall performance on examinations, improved higher-level thinking skills, and provided an interactive class setting. PMID:25741027

  3. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    Science.gov (United States)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korean Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from GOCI (the Geostationary Ocean Color Instrument) aboard COMS (the Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen, includes GOCI processing as an option. Improvements made to that processing are discussed here, along with a discussion of cloud motion effects.

  4. A min cut-set-wise truncation procedure for importance measures computation in probabilistic safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Duflot, Nicolas [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: nicolas.duflot@areva.com; Berenguer, Christophe [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: christophe.berenguer@utt.fr; Dieulle, Laurence [Universite de technologie de Troyes, Institut Charles Delaunay/LM2S, FRE CNRS 2848, 12, rue Marie Curie, BP2060, F-10010 Troyes cedex (France)], E-mail: laurence.dieulle@utt.fr; Vasseur, Dominique [EPSNA Group (Nuclear PSA and Application), EDF Research and Development, 1, avenue du Gal de Gaulle, 92141 Clamart cedex (France)], E-mail: dominique.vasseur@edf.fr

    2009-11-15

    A truncation process aims to determine, among the set of minimal cut-sets (MCS) produced by a probabilistic safety assessment (PSA) model, which of them are significant. Several truncation processes have been proposed for the evaluation of the probability of core damage that ensure a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process in order to ensure that the produced estimates will be accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold in addition to a new relative threshold concerning the potential probability of the MCS of interest. The method has been tested on a complete level 1 PSA model of a 900 MWe NPP developed by 'Electricite de France' (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a set of MCS whose size is significantly reduced.
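
    The core idea of an MCS-wise truncation with an absolute and a relative threshold can be illustrated as a simple filter over the minimal cut-sets, where the "potential" probability of an MCS is read here as its probability with one basic event set to true (as happens when computing importance measures). This is a hedged reading of the criterion, not the authors' exact algorithm, and the numbers are toys.

        from functools import reduce

        def mcs_probability(mcs, basic_event_prob):
            """Rare-event approximation: product of basic-event probabilities."""
            return reduce(lambda p, e: p * basic_event_prob[e], mcs, 1.0)

        def truncate(mcs_list, basic_event_prob, abs_threshold, rel_threshold):
            """Keep an MCS if it passes the absolute cutoff, or if raising any
            one of its basic events to probability 1 (its 'potential'
            probability) passes the relative cutoff against the total."""
            total = sum(mcs_probability(m, basic_event_prob) for m in mcs_list)
            kept = []
            for mcs in mcs_list:
                p = mcs_probability(mcs, basic_event_prob)
                potential = max(p / basic_event_prob[e] for e in mcs)
                if p >= abs_threshold or potential >= rel_threshold * total:
                    kept.append(mcs)
            return kept

        # Toy model: three basic events, four cut-sets.
        probs = {"A": 1e-3, "B": 5e-4, "C": 2e-2}
        cut_sets = [("A",), ("A", "B"), ("B", "C"), ("A", "B", "C")]
        print(truncate(cut_sets, probs, abs_threshold=1e-6, rel_threshold=0.1))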

  5. A min cut-set-wise truncation procedure for importance measures computation in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Duflot, Nicolas; Berenguer, Christophe; Dieulle, Laurence; Vasseur, Dominique

    2009-01-01

    A truncation process aims to determine, among the set of minimal cut-sets (MCS) produced by a probabilistic safety assessment (PSA) model, which of them are significant. Several truncation processes have been proposed for the evaluation of the probability of core damage that ensure a fixed accuracy level. However, the evaluation of new risk indicators such as importance measures requires re-examining the truncation process in order to ensure that the produced estimates will be accurate enough. In this paper a new truncation process is developed that permits estimating, from a single set of MCS, the importance measure of any basic event with the desired accuracy level. The main contribution of this new method is an MCS-wise truncation criterion involving two thresholds: an absolute threshold in addition to a new relative threshold concerning the potential probability of the MCS of interest. The method has been tested on a complete level 1 PSA model of a 900 MWe NPP developed by 'Electricite de France' (EDF), and the results presented in this paper indicate that, to reach the same accuracy level, the proposed method produces a set of MCS whose size is significantly reduced.

  6. Wafer level 3-D ICs process technology

    CERN Document Server

    Tan, Chuan Seng; Reif, L Rafael

    2009-01-01

    This book focuses on foundry-based process technology that enables the fabrication of 3-D ICs. The core of the book discusses the technology platform for pre-packaging wafer level 3-D ICs. However, this book does not include a detailed discussion of 3-D IC design and 3-D packaging. This is an edited book based on chapters contributed by various experts in the field of wafer-level 3-D ICs process technology. They are from academia, research labs and industry.

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders.

    Science.gov (United States)

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-12-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, the health workforce and members of user boards and committees. Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned budget ceilings and guidelines, low levels of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in resource-poor settings.

  8. What Makes Deeply Encoded Items Memorable? Insights into the Levels of Processing Framework from Neuroimaging and Neuromodulation

    Science.gov (United States)

    Galli, Giulia

    2014-01-01

    When we form new memories, their mnestic fate largely depends upon the cognitive operations set in train during encoding. A typical observation in experimental as well as everyday life settings is that if we learn an item using semantic or “deep” operations, such as attending to its meaning, memory will be better than if we learn the same item using more “shallow” operations, such as attending to its structural features. In the psychological literature, this phenomenon has been conceptualized within the “levels of processing” framework and has been consistently replicated since its original proposal by Craik and Lockhart in 1972. However, the exact mechanisms underlying the memory advantage for deeply encoded items are not yet entirely understood. A cognitive neuroscience perspective can add to this field by clarifying the nature of the processes involved in effective deep and shallow encoding and how they are instantiated in the brain, but so far there has been little work to systematically integrate findings from the literature. This work aims to fill this gap by reviewing, first, some of the key neuroimaging findings on the neural correlates of deep and shallow episodic encoding and second, emerging evidence from studies using neuromodulatory approaches such as psychopharmacology and non-invasive brain stimulation. Taken together, these studies help further our understanding of levels of processing. In addition, by showing that deep encoding can be modulated by acting upon specific brain regions or systems, the reviewed studies pave the way for selective enhancements of episodic encoding processes. PMID:24904444

  9. Optimized image processing with modified preprocessing of image data sets of a transparent imaging plate by way of the lateral view of the cervical spine

    International Nuclear Information System (INIS)

    Reissberg, S.; Hoeschen, C.; Redlich, U.; Scherlach, C.; Preuss, H.; Kaestner, A.; Doehring, W.; Woischneck, D.; Schuetze, M.; Reichardt, K.; Firsching, R.

    2002-01-01

    Purpose: To improve the diagnostic quality of lateral radiographs of the cervical spine by pre-processing the image data sets produced by a transparent imaging plate with both-side reading, and to evaluate any possible impact on minimizing the number of additional radiographs and supplementary investigations. Material and Methods: One hundred lateral digital radiographs of the cervical spine were processed with two different methods: processing of each data set using the system-immanent parameters and using the manual mode. The difference between the two types of processing is the level of the latitude value. Hard copies of the processed images were judged by five radiologists and three neurosurgeons. The evaluation applied the image criteria score (ICS) without conventional reference images. Results: In 99% of the lateral radiographs of the cervical spine, all vertebral bodies could be completely delineated using the manual mode, but only 76% of the images processed with the system-immanent parameters showed all vertebral bodies. Thus, the manual mode enabled the evaluation of up to two additional more caudal vertebral bodies. The manual mode processing was significantly better concerning object size and processing artifacts. This optimized image processing and the resultant minimization of supplementary investigations was calculated to correspond to a theoretical dose reduction of about 50%. (orig.)

  10. Japan's Siting Process for the Geological Disposal of High-level Radioactive Waste - An International Peer Review

    International Nuclear Information System (INIS)

    Brassinnes, Stephane; Fabbri, Olivier; Rubenstone, James; Seppaelae, Timo; Siemann, Michael; ); Kwong, Gloria; )

    2016-01-01

    The Nuclear Energy Agency carried out an independent peer review of Japan's siting process and criteria for the geological disposal of high-level radioactive waste in May 2016. The review concluded that Japan's site screening process is generally in accordance with international practices. As the goal of the siting process is to locate a site - that is both appropriate and accepted by the community - to host a geological disposal facility for high-level radioactive waste, the international review team emphasises in this report the importance of maintaining an open dialogue and interaction between the regulator, the implementer and the public. Dialogue should begin in the early phases and continue throughout the siting process. The international review team also underlines the importance of taking into account feasibility aspects when selecting a site for preliminary investigations, but suggests that it would be inappropriate to set detailed scientific criteria for nationwide screening at this stage. The team has provided extensive advisory remarks in the report as opportunities for improvement, including the recommendation to use clear and consistent terminology in defining the site screening criteria as it is a critical factor in a successful siting process. (authors)

  11. Segmentation of teeth in CT volumetric dataset by panoramic projection and variational level set

    Energy Technology Data Exchange (ETDEWEB)

    Hosntalab, Mohammad [Islamic Azad University, Faculty of Engineering, Science and Research Branch, Tehran (Iran); Aghaeizadeh Zoroofi, Reza [University of Tehran, Control and Intelligent Processing Center of Excellence, School of Electrical and Computer Engineering, College of Engineering, Tehran (Iran); Abbaspour Tehrani-Fard, Ali [Islamic Azad University, Faculty of Engineering, Science and Research Branch, Tehran (Iran); Sharif University of Technology, Department of Electrical Engineering, Tehran (Iran); Shirani, Gholamreza [Faculty of Dentistry Medical Science of Tehran University, Oral and Maxillofacial Surgery Department, Tehran (Iran)

    2008-09-15

    Quantification of teeth is of clinical importance for various computer-assisted procedures such as dental implants, orthodontic planning, and face, jaw and cosmetic surgeries. In this regard, segmentation is a major step. In this paper, we propose a method for segmentation of teeth in volumetric computed tomography (CT) data using panoramic re-sampling of the dataset in the coronal view and a variational level set. The proposed method consists of five steps, as follows: first, we extract a mask from the CT images using Otsu thresholding. Second, the teeth are segmented from other bony tissues by utilizing anatomical knowledge of teeth in the jaws. Third, the method estimates the arc of the upper and lower jaws and panoramically re-samples the dataset. Separation of the upper and lower jaws and initial segmentation of the teeth are then performed by employing the horizontal and vertical projections of the panoramic dataset, respectively. Based on the above-mentioned procedures, an initial mask for each tooth is obtained. Finally, we utilize the initial mask of each tooth and apply a variational level set to refine the initial teeth boundaries to final contours. The proposed algorithm was evaluated on 30 multi-slice CT datasets including 3,600 images. Experimental results reveal the effectiveness of the proposed method. In the proposed algorithm, the variational level set technique was utilized to trace the contour of the teeth. Because this technique is based on the characteristics of the overall region of the tooth image, it is possible to extract a very smooth and accurate tooth contour. On the available datasets, the proposed technique was successful in teeth segmentation compared to previous techniques. (orig.)
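
    Steps one and five of the pipeline (Otsu masking and level set refinement) can be approximated in a few lines with scikit-image, using a morphological variant of the variational level set; this is only a sketch of those two steps, the panoramic re-sampling and jaw-separation steps are omitted, and the initial tooth mask is assumed to be given.

        import numpy as np
        from skimage import filters, morphology, segmentation

        def refine_tooth_contour(ct_slice, initial_mask, n_iter=100):
            """Refine a coarse tooth mask with a level set evolution
            (morphological Chan-Vese, a variant of the variational model)."""
            # Step 1: bone mask via Otsu thresholding, cleaned morphologically.
            bone = ct_slice > filters.threshold_otsu(ct_slice)
            bone = morphology.remove_small_objects(bone, min_size=64)

            # Step 5: evolve the initial mask toward the tooth boundary,
            # restricted to the bone region.
            refined = segmentation.morphological_chan_vese(
                ct_slice * bone, n_iter, init_level_set=initial_mask)
            return refined.astype(bool)

        # Usage with a synthetic slice and a rough rectangular seed mask:
        img = np.random.rand(128, 128)
        seed = np.zeros_like(img, dtype=bool)
        seed[40:90, 40:90] = True
        contour_mask = refine_tooth_contour(img, seed)
        print(f"refined mask covers {contour_mask.mean():.1%} of the slice")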

  12. Segmentation of teeth in CT volumetric dataset by panoramic projection and variational level set

    International Nuclear Information System (INIS)

    Hosntalab, Mohammad; Aghaeizadeh Zoroofi, Reza; Abbaspour Tehrani-Fard, Ali; Shirani, Gholamreza

    2008-01-01

    Quantification of teeth is of clinical importance for various computer-assisted procedures such as dental implants, orthodontic planning, and face, jaw and cosmetic surgeries. In this regard, segmentation is a major step. In this paper, we propose a method for segmentation of teeth in volumetric computed tomography (CT) data using panoramic re-sampling of the dataset in the coronal view and a variational level set. The proposed method consists of five steps, as follows: first, we extract a mask from the CT images using Otsu thresholding. Second, the teeth are segmented from other bony tissues by utilizing anatomical knowledge of teeth in the jaws. Third, the method estimates the arc of the upper and lower jaws and panoramically re-samples the dataset. Separation of the upper and lower jaws and initial segmentation of the teeth are then performed by employing the horizontal and vertical projections of the panoramic dataset, respectively. Based on the above-mentioned procedures, an initial mask for each tooth is obtained. Finally, we utilize the initial mask of each tooth and apply a variational level set to refine the initial teeth boundaries to final contours. The proposed algorithm was evaluated on 30 multi-slice CT datasets including 3,600 images. Experimental results reveal the effectiveness of the proposed method. In the proposed algorithm, the variational level set technique was utilized to trace the contour of the teeth. Because this technique is based on the characteristics of the overall region of the tooth image, it is possible to extract a very smooth and accurate tooth contour. On the available datasets, the proposed technique was successful in teeth segmentation compared to previous techniques. (orig.)

  13. Tokunaga and Horton self-similarity for level set trees of Markov chains

    International Nuclear Information System (INIS)

    Zaliapin, Ilia; Kovchegov, Yevgeniy

    2012-01-01

    Highlights: ► Self-similar properties of the level set trees for Markov chains are studied. ► Tokunaga and Horton self-similarity are established for symmetric Markov chains and regular Brownian motion. ► Strong, distributional self-similarity is established for symmetric Markov chains with exponential jumps. ► It is conjectured that fractional Brownian motions are Tokunaga self-similar. - Abstract: The Horton and Tokunaga branching laws provide a convenient framework for studying self-similarity in random trees. The Horton self-similarity is a weaker property that addresses the principal branching in a tree; it is a counterpart of the power-law size distribution for elements of a branching system. The stronger Tokunaga self-similarity addresses so-called side branching. The Horton and Tokunaga self-similarity have been empirically established in numerous observed and modeled systems, and proven for two paradigmatic models: the critical Galton–Watson branching process with finite progeny and the finite-tree representation of a regular Brownian excursion. This study establishes the Tokunaga and Horton self-similarity for a tree representation of a finite symmetric homogeneous Markov chain. We also extend the concept of Horton and Tokunaga self-similarity to infinite trees and establish self-similarity for an infinite-tree representation of a regular Brownian motion. We conjecture that fractional Brownian motions are also Tokunaga and Horton self-similar, with self-similarity parameters depending on the Hurst exponent.

  14. Fuzzy sets on step of planning of experiment for organization and management of construction processes

    Directory of Open Access Journals (Sweden)

    Lapidus Azariy

    2016-01-01

    Full Text Available This article considers problems of mathematical modeling and experiment planning in the organization and management of construction. The authors identify the basic constraints and difficulties in this field, and conclude that planning a research experiment is possible in the information sphere using heuristic, graphical and mathematical models, as well as neural networks and genetic algorithms. They note the need for expert information when quality parameters have to be formalized. The article presents an overview of methods for translating qualitative information into mathematical language: the qualimetry methods of USSR scientists, the analytic hierarchy process and fuzzy set theory are compared, and the benefits of the latter for the interpretation of qualitative parameters are identified. The authors give many examples of applying fuzzy sets to the formalization of organizational factors of construction processes. Finally, they conclude that fuzzy set theory is a progressive and effective way to describe the qualitative parameters of the organization and management of construction.
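
    As a minimal illustration of how fuzzy sets formalize a qualitative parameter (not the authors' model), the sketch below maps an expert's linguistic rating of an assumed "organizational quality" factor onto triangular membership functions and defuzzifies the aggregate by its centroid.

        import numpy as np

        def triangular(x, a, b, c):
            """Triangular membership with support [a, c] and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        # Fuzzy partition of a 0-10 'organizational quality' scale (assumed shapes).
        x = np.linspace(0.0, 10.0, 201)
        terms = {"low": (-0.1, 0.0, 5.0),
                 "medium": (2.0, 5.0, 8.0),
                 "high": (5.0, 10.0, 10.1)}

        # Expert judgment expressed as degrees of belief in each linguistic term.
        judgment = {"low": 0.1, "medium": 0.7, "high": 0.3}

        # Clip each membership at its belief, aggregate by max, defuzzify by centroid.
        agg = np.zeros_like(x)
        for term, (a, b, c) in terms.items():
            agg = np.maximum(agg, np.minimum(triangular(x, a, b, c), judgment[term]))
        score = float((agg * x).sum() / agg.sum())
        print(f"defuzzified quality score: {score:.2f}")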

  15. Impact of Educational Level on Performance on Auditory Processing Tests.

    Science.gov (United States)

    Murphy, Cristina F B; Rabelo, Camila M; Silagi, Marcela L; Mansur, Letícia L; Schochat, Eliane

    2016-01-01

    Research has demonstrated that a higher level of education is associated with better performance on cognitive tests among middle-aged and elderly people. However, the effects of education on auditory processing skills have not yet been evaluated. Previous demonstrations of sensory-cognitive interactions in the aging process indicate the potential importance of this topic. Therefore, the primary purpose of this study was to investigate the performance of middle-aged and elderly people with different levels of formal education on auditory processing tests. A total of 177 adults with no evidence of cognitive, psychological or neurological conditions took part in the research. The participants completed a series of auditory assessments, including dichotic digit, frequency pattern and speech-in-noise tests. A working memory test was also performed to investigate the extent to which auditory processing and cognitive performance were associated. The results demonstrated positive but weak correlations between years of schooling and performance on all of the tests applied. The factor "years of schooling" was also one of the best predictors of frequency pattern and speech-in-noise test performance. Additionally, performance on the working memory, frequency pattern and dichotic digit tests was correlated, suggesting that the influence of educational level on auditory processing performance might be associated with the cognitive demands of the auditory processing tests rather than with auditory sensory aspects themselves. Longitudinal research is required to investigate the causal relationship between educational level and auditory processing skills.

  16. Automatic segmentation of Leishmania parasite in microscopic images using a modified CV level set method

    Science.gov (United States)

    Farahi, Maria; Rabbani, Hossein; Talebi, Ardeshir; Sarrafzadeh, Omid; Ensafi, Shahab

    2015-12-01

    Visceral Leishmaniasis is a parasitic disease that affects the liver, spleen and bone marrow. According to a World Health Organization report, definitive diagnosis is possible only by direct observation of the Leishman body in microscopic images taken from bone marrow samples. We utilize morphological operations and the CV (Chan-Vese) level set method to segment Leishman bodies in digital color microscopic images captured from bone marrow samples. A linear contrast stretching method is used for image enhancement, and a morphological method is applied to determine the parasite regions and remove unwanted objects. Modified global and local CV level set methods are proposed for segmentation, and a shape-based stopping factor is used to speed up the algorithm. Manual segmentation is considered the ground truth to evaluate the proposed method. The method was tested on 28 samples and achieved a mean segmentation error of 10.90% for the global model and 9.76% for the local model.
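
    The enhancement-plus-level-set portion of the pipeline can be approximated with scikit-image; note this uses the library's generic Chan-Vese implementation, not the authors' modified global/local CV models, the shape-based stopping factor is omitted, and the image path is hypothetical.

        import numpy as np
        from skimage import color, exposure, io, segmentation

        # Load a microscopy image (hypothetical path) and convert to grayscale.
        img = color.rgb2gray(io.imread("bone_marrow_sample.png"))

        # Linear contrast stretching between the 2nd and 98th percentiles.
        p2, p98 = np.percentile(img, (2, 98))
        img = exposure.rescale_intensity(img, in_range=(p2, p98))

        # Chan-Vese (CV) level set segmentation; mu penalizes contour length.
        mask = segmentation.chan_vese(img, mu=0.25)
        print(f"parasite-candidate pixels: {int(mask.sum())}")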

  17. Quasi-min-max Fuzzy MPC of UTSG Water Level Based on Off-Line Invariant Set

    Science.gov (United States)

    Liu, Xiangjie; Jiang, Di; Lee, Kwang Y.

    2015-10-01

    In a nuclear power plant, the water level of the U-tube steam generator (UTSG) must be maintained within a safe range. Traditional control methods encounter difficulties due to the complexity, strong nonlinearity and “swell and shrink” effects, especially at low power levels. A properly designed robust model predictive control can well solve this problem. In this paper, a quasi-min-max fuzzy model predictive controller is developed for controlling the constrained UTSG system. Because the online computational burden could be too large for real-time control, a bank of ellipsoid invariant sets together with the corresponding feedback control laws is obtained by off-line solving of linear matrix inequalities (LMIs). Based on the UTSG states, the online optimization is simplified to a constrained optimization problem with a bisection search for the corresponding ellipsoid invariant set. Simulation results are given to show the effectiveness of the proposed controller.
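
    The online part of such a scheme reduces to finding, among ellipsoids computed off-line, the smallest one that still contains the current state, and applying the associated feedback gain. The sketch below shows that bisection lookup over an ordered bank; the LMI synthesis itself is omitted, and the matrices are illustrative toys, not a UTSG design.

        import numpy as np

        # Bank of ellipsoid invariant sets {x : x^T P x <= 1}, ordered from the
        # largest ellipsoid to the smallest, each with its feedback gain K.
        # (Both would come from the off-line LMI synthesis; values are toys.)
        P_bank = [np.diag([0.01, 0.02]), np.diag([0.1, 0.2]), np.diag([1.0, 2.0])]
        K_bank = [np.array([[-0.2, -0.1]]), np.array([[-0.5, -0.3]]),
                  np.array([[-1.0, -0.8]])]

        def in_ellipsoid(P, x):
            return float(x @ P @ x) <= 1.0

        def control(x):
            """Bisection search for the smallest invariant set containing x."""
            if not in_ellipsoid(P_bank[0], x):
                raise ValueError("state outside the largest invariant set")
            lo, hi = 0, len(P_bank) - 1
            while lo < hi:                     # invariant: x is inside ellipsoid lo
                mid = (lo + hi + 1) // 2
                if in_ellipsoid(P_bank[mid], x):
                    lo = mid                   # a smaller set still contains x
                else:
                    hi = mid - 1
            return K_bank[lo] @ x              # apply that set's feedback law

        x = np.array([0.9, 0.5])
        print("u =", control(x))

    The bisection is valid because the bank is nested: if an ellipsoid contains the state, every larger one does too, so containment is monotone along the ordering.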

  18. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    Science.gov (United States)

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) deal with the image segmentation problem as a functional optimization problem, as they work on dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build an active contour with the aim of modeling arbitrarily complex shapes. Moreover, they can handle also topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly in modeling an active contour based on the idea of utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed in general with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology preservation property and overcoming some drawbacks of other ACMs, such as trapping into local minima of the image energy functional to be minimized in such models. In this survey, we illustrate the main concepts of variational level set-based ACMs, SOM-based ACMs, and their relationship and review in a comprehensive fashion the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  19. The effect of signal acquisition and processing choices on ApEn values: towards a "gold standard" for distinguishing effort levels from isometric force records.

    Science.gov (United States)

    Forrest, Sarah M; Challis, John H; Winter, Samantha L

    2014-06-01

    Approximate entropy (ApEn) is frequently used to identify changes in the complexity of isometric force records with ageing and disease. Different signal acquisition and processing parameters have been used, making comparison or confirmation of results difficult. This study determined the effect of sampling and parameter choices by examining changes in ApEn values across a range of submaximal isometric contractions of the first dorsal interosseus. Reducing the sample rate by decimation changed both the value and pattern of ApEn values dramatically. The pattern of ApEn values across the range of effort levels was not sensitive to the filter cut-off frequency, or to the criterion used to extract the section of data for analysis. The complexity increased with increasing effort levels using a fixed 'r' value (which accounts for measurement noise) but decreased with increasing effort level when 'r' was set to 0.1 of the standard deviation of force. It is recommended that isometric force records be sampled at frequencies >200 Hz, that the template length ('m') be set to 2, and that 'r' be set to measurement system noise or 0.1 SD, depending on the physiological process to be distinguished. It is demonstrated that changes in ApEn across effort levels are related to changes in force gradation strategy. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
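
    For reference, a direct NumPy implementation of ApEn with the recommended settings (m = 2, r = 0.1 x SD) is sketched below; it follows the standard Pincus definition rather than any code used in the study.

        import numpy as np

        def approximate_entropy(force, m=2, r_factor=0.1):
            """ApEn of a 1-D force record with template length m and
            tolerance r = r_factor * SD(force) (Pincus's definition)."""
            x = np.asarray(force, dtype=float)
            r = r_factor * x.std()

            def phi(m):
                templates = np.lib.stride_tricks.sliding_window_view(x, m)
                # Chebyshev distance between every pair of templates.
                dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
                c = (dist <= r).mean(axis=1)   # match fractions, self-matches kept
                return np.log(c).mean()

            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(0)
        steady = np.sin(np.linspace(0, 20, 1000)) + 0.05 * rng.standard_normal(1000)
        print(f"ApEn = {approximate_entropy(steady):.3f}")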

  20. Levels of processing and Eye Movements: A Stimulus driven approach

    DEFF Research Database (Denmark)

    Mulvey, Fiona Bríd

    2014-01-01

    The aim of this research is to investigate the explication of levels of attention through eye movement parameters. Previous research from disparate fields has suggested that eye movements are related to cognitive processing; however, the exact nature of the relationship is unclear. Since eye movements can be controlled either by bottom-up stimulus properties or by top-down cognitive control, studies have compared eye movements in real-world tasks and searched for indicators of cognitive load or level of attention when task demands increase. Extracting the effects of cognitive processing on eye ... to investigate individual differences in levels of processing within the normal population using existing constructs and tests of cognitive style. Study 4 investigates these stimuli and the eye movements of a clinical group with known interruption to the dorsal stream of processing, and subsequent isolated ...

  1. Adaptable Value-Set Analysis for Low-Level Code

    OpenAIRE

    Brauer, Jörg; Hansen, René Rydhof; Kowalewski, Stefan; Larsen, Kim G.; Olesen, Mads Chr.

    2012-01-01

    This paper presents a framework for binary code analysis that uses only SAT-based algorithms. Within the framework, incremental SAT solving is used to perform a form of weakly relational value-set analysis in a novel way, connecting the expressiveness of the value sets to computational complexity. Another key feature of our framework is that it translates the semantics of binary code into an intermediate representation. This allows for a straightforward translation of the program semantics in...

  2. The process of setting micronutrient recommendations

    DEFF Research Database (Denmark)

    Timotijevic, Lada; Barnett, Julie; Brown, Kerry

    2011-01-01

    ... in the field of micronutrient recommendations, and a case study that focused on mandatory folic acid (FA) fortification. Setting: Questionnaire-based data were collected across thirty-five European countries. The FA fortification case study was conducted in the UK, Norway, Denmark, Germany, Spain, the Czech Republic and Hungary. Results: Varied bodies are responsible for setting micronutrient recommendations, each with different statutory and legal models of operation. Transparency is highest where there are standing scientific advisory committees (SAC). Where a standing SAC is created, the range of expertise and the terms of reference for the SAC are determined by the government. Where there is no dedicated SAC, the impetus for the development of micronutrient recommendations and the associated policies comes from interested specialists in the area. This is typically linked with an ad hoc selection ...

  3. Outsourcing Set Intersection Computation Based on Bloom Filter for Privacy Preservation in Multimedia Processing

    Directory of Open Access Journals (Sweden)

    Hongliang Zhu

    2018-01-01

    Full Text Available With the development of cloud computing, the advantages of low cost and high computational capacity meet the demands of the complicated computations of multimedia processing. Outsourced computation in the cloud enables users with limited computing resources to store and process distributed multimedia application data without installing multimedia application software on local computer terminals, but the main problem is how to protect the security of user data in untrusted public cloud services. In recent years, privacy-preserving outsourced computation has become one of the most common methods for solving the security problems of cloud computing. However, existing schemes cannot meet the needs of large numbers of nodes and dynamic topologies. In this paper, we introduce a novel privacy-preserving outsourced computation method which combines the GM homomorphic encryption scheme and a Bloom filter to solve this problem, and we propose a new privacy-preserving outsourced set intersection computation protocol. Results show that the new protocol resolves the privacy-preserving outsourced set intersection computation problem without increasing the complexity or the false positive probability. Besides, the number of participants, the size of the input secret sets, and the online time of participants are not limited.
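
    Only the Bloom filter side of the construction is sketched below; the GM homomorphic encryption layer that actually provides the privacy guarantee is omitted, and the hash construction and sizes are arbitrary choices for illustration.

        import hashlib

        class BloomFilter:
            def __init__(self, n_bits=1024, n_hashes=4):
                self.n_bits, self.n_hashes = n_bits, n_hashes
                self.bits = 0

            def _positions(self, item):
                # Derive k bit positions from salted SHA-256 digests.
                for i in range(self.n_hashes):
                    h = hashlib.sha256(f"{i}:{item}".encode()).digest()
                    yield int.from_bytes(h[:8], "big") % self.n_bits

            def add(self, item):
                for pos in self._positions(item):
                    self.bits |= 1 << pos

            def __contains__(self, item):
                return all(self.bits >> pos & 1 for pos in self._positions(item))

        # Server encodes its set; the client queries its own elements against it.
        # (In the protocol the filter would be processed under GM encryption so
        # that neither side learns the other's non-intersecting elements.)
        server_set = {"alice", "bob", "carol"}
        client_set = {"bob", "dave"}

        bf = BloomFilter()
        for s in server_set:
            bf.add(s)

        intersection = {c for c in client_set if c in bf}  # may contain false positives
        print(intersection)    # {'bob'}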

  4. Effect of Bread Making Process on Aflatoxin Level Changes

    Directory of Open Access Journals (Sweden)

    Jafar Milani

    2014-12-01

    Full Text Available Wheat flour is a commodity with a high risk of aflatoxin (AFs) contamination. During bread making there are many processes that can affect AF stability. The effect of the bread making process using different yeast types on AF levels was investigated. For this purpose, standards of AFs, including the B and G groups, were added to flour and then bread loaves were prepared. Three types of commercially available yeast, including active dry yeast, instant dry yeast and compressed yeast, were used for dough preparation. AF levels in flour, dough, and bread were analyzed by high performance liquid chromatography (HPLC) with fluorescence detection. The results showed that the maximum reduction in aflatoxin levels was observed during the first proof, while the smallest decline was seen at the baking stage. The order of AF reduction in the bread making process was AFB1 > AFB2 > AFG1. Furthermore, the results indicated that the most effective yeast for AF reduction was instant dry yeast.

  5. Fast Streaming 3D Level set Segmentation on the GPU for Smooth Multi-phase Segmentation

    DEFF Research Database (Denmark)

    Sharma, Ojaswa; Zhang, Qin; Anton, François

    2011-01-01

    Level set method based segmentation provides an efficient tool for topological and geometrical shape handling, but it is slow due to high computational burden. In this work, we provide a framework for streaming computations on large volumetric images on the GPU. A streaming computational model...

  6. LevelMerge: Collaborative Game Level Editing by Merging Labeled Graphs

    OpenAIRE

    Santoni, Christian; Salvati, Gabriele; Tibaldo, Valentina; Pellacini, Fabio

    2016-01-01

    Game level editing is the process of constructing a full game level starting from 3D asset libraries, e.g. 3d models, textures, shaders, scripts. In level editing, designers define the look and behavior of the whole level by placing objects, assigning materials and lighting parameters, setting animations and physics properties and customizing the objects AI and behavior by editing scripts. The heterogeneity of the task usually translates to a workflow where a team of people, experts on separa...

  7. Integrating Cross-Case Analyses and Process Tracing in Set-Theoretic Research: Strategies and Parameters of Debate

    Science.gov (United States)

    Beach, Derek; Rohlfing, Ingo

    2018-01-01

    In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods--"qualitative comparative analysis" (QCA) and typological theory (TT)--and their combination with process tracing (PT).…

  8. Levels-of-Processing Effects in Infant Memory?

    Science.gov (United States)

    Adler, Scott A.; Gerhardstein, Peter; Rovee-Collier, Carolyn

    1998-01-01

    Three experiments manipulated 3-month-olds' attention to different components of a training display and assessed the effect on retention. Results suggested that increasing or decreasing attention to an item during encoding produces a corresponding increase or decrease in memorability. Findings were consistent with a levels-of-processing account…

  9. Managing the high level waste nuclear regulatory commission licensing process

    International Nuclear Information System (INIS)

    Baskin, K.P.

    1992-01-01

    This paper reports that the process for obtaining Nuclear Regulatory Commission permits for the high level waste storage facility is basically the same process commercial nuclear power plants followed to obtain construction permits and operating licenses for their facilities. Therefore, the experience from licensing commercial reactors can be applied to the high level waste facility. Proper management of the licensing process will be the key to a successful project. The management of the licensing process was categorized into four areas: responsibility, organization, communication and documentation. Drawing on experience from nuclear power plant licensing and basic management principles, the management requirements for successfully accomplishing the project goals are discussed.

  10. Simulation to aid in interpreting biological relevance and setting of population-level protection goals for risk assessment of pesticides.

    Science.gov (United States)

    Topping, Christopher John; Luttik, Robert

    2017-10-01

    Specific protection goals (SPGs) comprise an explicit expression of the environmental components that need protection and the maximum impacts that can be tolerated. SPGs are set by risk managers and are typically based on protecting populations or functions. However, the measurable endpoints available to risk managers, at least for vertebrates, are typically laboratory tests. We demonstrate, using the example of eggshell thinning in skylarks, how simulation can be used to place laboratory endpoints in the context of population-level effects as an aid to setting SPGs. We develop explanatory scenarios investigating the impact of different assumptions about eggshell thinning on skylark population size, density and distribution in 10 Danish landscapes, chosen to represent the range of typical Danish agricultural conditions. Landscape and timing of application of the pesticide were found to be the most critical factors to consider in the impact assessment. Consequently, a regulatory scenario of monoculture spring barley with an early spray treatment was applied, using concentrations that elicited eggshell-thinning effects from zero to 100% in steps of 5%. Setting SPGs requires balancing scientific, social and political realities. However, the provision of clear and detailed options such as those from comprehensive simulation results can inform the decision process by improving transparency and by putting the more abstract testing data into the context of real-world impacts. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. A finite element/level set model of polyurethane foam expansion and polymerization

    Energy Technology Data Exchange (ETDEWEB)

    Rao, Rekha R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Long, Kevin Nicholas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Roberts, Christine Cardinal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Celina, Mathias C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Brunini, Victor [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Soehnel, Melissa Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Noble, David R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Tinsley, James [Honeywell Federal Manufacturing & Technologies, Kansas City, MO (United States); Mondy, Lisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    Polyurethane foams are used widely for encapsulation and structural purposes because they are inexpensive, straightforward to process, amenable to a wide range of density variations (1 lb/ft3 - 50 lb/ft3), and able to fill complex molds quickly and effectively. Computational models of the filling and curing process are needed to reduce defects such as voids, out-of-specification density, density gradients, foam decomposition from high temperatures due to exotherms, and incomplete filling. This paper details the development of a computational fluid dynamics model of a moderate density PMDI structural foam, PMDI-10. PMDI is an isocyanate-based polyurethane foam, which is chemically blown with water. The polyol reacts with isocyanate to produce the polymer. PMDI-10 is catalyzed, giving it a short pot life: it foams and polymerizes to a solid within 5 minutes during normal processing. To achieve a higher density, the foam is over-packed to twice or more of its free rise density of 10 lb/ft3. The goal of the modeling is to represent the expansion, the filling of molds, and the polymerization of the foam. This will be used to reduce defects, optimize the mold design, troubleshoot the process, and predict the final foam properties. A homogenized continuum model of foaming and curing was developed based on reaction kinetics, documented in a recent paper; it uses a simplified mathematical formalism that decouples these two reactions. The chemo-rheology of PMDI is measured experimentally and fit to a generalized-Newtonian viscosity model that is dependent on the extent of cure, gas fraction, and temperature. The conservation equations, including the equations of motion, an energy balance, and three rate equations, are solved via a stabilized finite element method. The equations are combined with a level set method to determine the location of the foam-gas interface as it evolves to fill the mold. Understanding the thermal history and loads on the foam due to exothermicity and oven ...

  12. Priority Setting for Universal Health Coverage: We Need Evidence-Informed Deliberative Processes, Not Just More Evidence on Cost-Effectiveness

    Directory of Open Access Journals (Sweden)

    Rob Baltussen

    2016-11-01

    Full Text Available Priority setting of health interventions is generally considered a valuable approach to support low- and middle-income countries (LMICs) in their efforts to achieve universal health coverage (UHC). However, present initiatives on priority setting are mainly geared towards the development of more cost-effectiveness information, and this evidence does not sufficiently support countries in making optimal choices. The reason is that priority setting is in reality a value-laden political process in which multiple criteria beyond cost-effectiveness are important, and stakeholders often justifiably disagree about the relative importance of these criteria. Here, we propose the use of ‘evidence-informed deliberative processes’ as an approach that explicitly recognises priority setting as a political process and an intrinsically complex task. In these processes, deliberation between stakeholders is crucial to identify, reflect on and learn about the meaning and importance of values, informed by evidence on these values. Such processes then result in the use of a broader range of explicit criteria that can be seen as the product of both international learning (‘core’ criteria, which include e.g. cost-effectiveness, priority to the worse off, and financial protection) and learning among local stakeholders (‘contextual’ criteria). We believe that, with these evidence-informed deliberative processes in place, priority setting can provide a more meaningful contribution to achieving UHC.

  13. Reactor water level control device

    International Nuclear Information System (INIS)

    Utagawa, Kazuyuki.

    1993-01-01

    A device of the present invention effectively controls fluctuation of the reactor water level upon power change caused by reactor core flow rate control operation. That is, (1) a feedback control section calculates a feedwater flow rate control amount based on the deviation between the set value of the reactor water level and the reactor water level signal, and (2) a feedforward control section forecasts the steam flow rate change based on the reactor core flow rate signal, or a signal determining the reactor core flow rate, and calculates a feedwater flow rate control amount which offsets the steam flow rate change. The sum of the output signals from (1) and (2) is then used as the final feedwater flow rate control signal. With such procedures, it is possible to forecast the steam flow rate change accompanying the reactor core flow rate control operation, thereby enabling a preceding (feedforward) feedwater flow rate control operation that offsets the reactor water level fluctuation due to the steam flow rate change. Any reactor water level deviation from the forecast can then be corrected by feedback control. Accordingly, reactor water level fluctuation upon power change due to the reactor core flow rate control operation can be rapidly suppressed. (I.S.)
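
    The two-part scheme reads directly as a control law: a feedback term on the level error plus a feedforward term that pre-compensates the steam flow change forecast from the core flow signal. The sketch below is an illustrative discrete-time version with made-up gains and a toy steam-flow map, not the patented device.

        # Feedwater command = feedback on the level error + feedforward that
        # offsets the steam-flow change forecast from the core-flow signal.
        # All gains and the steam-flow map are illustrative assumptions.
        DT = 0.5            # control interval, s
        KP, KI = 4.0, 0.2   # feedback (PI) gains
        K_FF = 0.9          # fraction of forecast steam flow to pre-compensate

        level_sp = level = 100.0     # water level, cm
        integral = 0.0

        def forecast_steam(core_flow):
            """Toy steady-state map from core flow to steam flow (kg/s)."""
            return 0.8 * core_flow

        for t in range(1200):
            core_flow = 60.0 if t < 600 else 75.0    # power-change maneuver

            # (2) Feedforward section: offset the forecast steam-flow change.
            steam = forecast_steam(core_flow)
            feedforward = K_FF * steam

            # (1) Feedback section: PI on the water-level deviation.
            error = level_sp - level
            integral += error * DT
            feedwater = feedforward + KP * error + KI * integral

            # Crude mass balance: level rises with feedwater, falls with steam.
            level += (feedwater - steam) * DT / 50.0

        print(f"final level deviation: {level - level_sp:+.2f} cm")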

  14. Essential processes for cognitive behavioral clinical supervision: Agenda setting, problem-solving, and formative feedback.

    Science.gov (United States)

    Cummings, Jorden A; Ballantyne, Elena C; Scallion, Laura M

    2015-06-01

    Clinical supervision should be a proactive and considered endeavor, not a reactive one. To that end, supervisors should choose supervision processes that are driven by theory, the best available research, and clinical experience. These processes should be aimed at helping trainees develop as clinicians. We highlight 3 supervision processes we believe should be used at each supervision meeting: agenda setting, encouraging trainee problem-solving, and formative feedback. Although these are primarily cognitive-behavioral skills, they can be helpful in combination with other supervision models. We provide example dialogue from supervision exchanges, and discuss theoretical and research support for these processes. Using these processes not only encourages trainee development but also models for trainees how to use the same processes and approaches with clients. (c) 2015 APA, all rights reserved.

  15. Classification of Normal and Apoptotic Cells from Fluorescence Microscopy Images Using Generalized Polynomial Chaos and Level Set Function.

    Science.gov (United States)

    Du, Yuncheng; Budman, Hector M; Duever, Thomas A

    2016-06-01

    Accurate automated quantitative analysis of living cells based on fluorescence microscopy images can be very useful for fast evaluation of experimental outcomes and cell culture protocols. In this work, an algorithm is developed for fast differentiation of normal and apoptotic viable Chinese hamster ovary (CHO) cells. For effective segmentation of cell images, a stochastic segmentation algorithm is developed by combining a generalized polynomial chaos expansion with a level set function-based segmentation algorithm. This approach provides a probabilistic description of the segmented cellular regions along the boundary, from which it is possible to calculate morphological changes related to apoptosis, i.e., the curvature and length of a cell's boundary. These features are then used as inputs to a support vector machine (SVM) classifier that is trained to distinguish between normal and apoptotic viable states of CHO cell images. The use of morphological features obtained from the stochastic level set segmentation of cell images in combination with the trained SVM classifier is more efficient in terms of differentiation accuracy as compared with the original deterministic level set method.
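
    Once boundary curvature and length statistics have been extracted from the stochastic segmentation, the classification stage is a standard SVM. The sketch below trains scikit-learn's SVC on hypothetical two-feature vectors (mean boundary curvature, boundary length) with normal/apoptotic labels; the feature values are invented, not measured CHO data.

        import numpy as np
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)

        # Hypothetical morphological features per cell: [mean curvature, length].
        normal = rng.normal([0.10, 220.0], [0.02, 25.0], size=(100, 2))
        apoptotic = rng.normal([0.18, 160.0], [0.03, 30.0], size=(100, 2))  # blebbing
        X = np.vstack([normal, apoptotic])
        y = np.array([0] * 100 + [1] * 100)   # 0 = normal, 1 = apoptotic

        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X, y)
        print("predicted state:", clf.predict([[0.17, 150.0]]))   # -> apoptotic (1)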

  16. Home advantage in high-level volleyball varies according to set number.

    Science.gov (United States)

    Marcelino, Rui; Mesquita, Isabel; Palao Andrés, José Manuel; Sampaio, Jaime

    2009-01-01

    The aim of the present study was to identify the probability of winning each volleyball set according to game location (home, away). Archival data were obtained from 275 sets in the 2005 Men's Senior World League, and 65,949 actions were analysed. Set result (win, loss), game location (home, away), set number (first, second, third, fourth and fifth) and performance indicators (serve, reception, set, attack, dig and block) were the variables considered in this study. First, the performance indicators were used in a logistic model of set result, by binary logistic regression analysis. After finding the adjusted logistic model, the log-odds of winning the set were analysed according to game location and set number. The results showed that winning a set is significantly related to performance indicators (Chi-square(18) = 660.97, significant) and that home teams had an advantage at the beginning of the game (first set) and in the two last sets of the game (fourth and fifth sets), probably due to familiarity with the facilities and crowd effects. Different game actions explain these advantages and show that to win the first set it is more important to take risk, through better performance in the attack and block, while to win the final set it is important to manage risk through better performance in reception. These results may suggest intra-game variation in home advantage and can be most useful to better prepare and direct the competition. Key points: Home teams always have a higher probability of winning the game than away teams. Home teams have higher performance in reception, set and attack over the total of the sets. The advantage of home teams is more pronounced at the beginning of the game (first set) and in the two last sets of the game (fourth and fifth sets), suggesting intra-game variation in home advantage. Analysis by sets showed that home teams have better performance in the attack and block in the first set and in reception in the third and fifth sets.
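
    The first modeling step, a binary logistic model of set result on the performance indicators, looks roughly like the sketch below (statsmodels; the file and column names are hypothetical). The fitted linear predictor gives the log-odds of winning, which can then be summarized by game location and set number.

        import pandas as pd
        import statsmodels.formula.api as smf

        # One row per set; columns assumed: win (0/1), serve, reception,
        # set_action, attack, dig, block (indicators), home (0/1), set_number.
        df = pd.read_csv("volleyball_sets.csv")   # hypothetical file

        model = smf.logit(
            "win ~ serve + reception + set_action + attack + dig + block", data=df)
        result = model.fit()
        print(result.summary())

        # Log-odds of winning by location and set number under the fitted model.
        print(df.assign(log_odds=result.fittedvalues)
                .groupby(["home", "set_number"])["log_odds"].mean())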

  17. The importance of information goods abstraction levels for information commerce process models

    NARCIS (Netherlands)

    Wijnhoven, Alphonsus B.J.M.

    2002-01-01

    A process model, in the context of e-commerce, is an organized set of activities for the creation, (re-)production, trade and delivery of goods. Electronic commerce studies have created important process models for the trade of physical goods via Internet. These models are not easily suitable for

  18. Reservoir characterisation by a binary level set method and adaptive multiscale estimation

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Lars Kristian

    2006-01-15

    The main focus of this work is on estimation of the absolute permeability as a solution of an inverse problem. We have considered both a single-phase and a two-phase flow model. Two novel approaches have been introduced and tested numerically for solving the inverse problems. The first approach is a multiscale zonation technique, which is treated in Paper A. The purpose of the work in this paper is to find a coarse-scale solution based on production data from wells. In the suggested approach, the robustness of an already developed method, adaptive multiscale estimation (AME), has been improved by utilising information from several candidate solutions generated by a stochastic optimizer. The new approach also suggests a way of combining a stochastic and a gradient search method, which in general is a problematic issue. The second approach is a piecewise constant level set approach and is applied in Papers B, C, D and E. Paper B considers the stationary single-phase problem, while Papers C, D and E use a two-phase flow model. In the two-phase flow problem we have utilised information both from production data in wells and from spatially distributed data gathered from seismic surveys. Due to the higher information content of the spatially distributed data, we search for solutions on a slightly finer scale than one typically does with only production data included. The applied level set method is suitable for reconstruction of fields whose solution is assumed to be of facies type, that is, close to piecewise constant. This information is utilised through a strong restriction of the number of constant levels in the estimate. On the other hand, the flexibility in the geometries of the zones is much larger for this method than in a typical zonation approach, for example the multiscale approach applied in Paper A. In all these papers, the numerical studies are done on synthetic data sets. An advantage of synthetic data studies is that the true
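
    A minimal numerical sketch of the piecewise constant (binary) level set idea this record relies on: a two-facies permeability field written as K = K1*H(phi) + K2*(1 - H(phi)), with H a smoothed Heaviside of the level set function phi. The geometry and permeability values are illustrative assumptions:

        import numpy as np

        nx = ny = 64
        x, y = np.meshgrid(np.linspace(-1, 1, nx), np.linspace(-1, 1, ny))
        phi = 0.4 - np.sqrt(x**2 + y**2)        # phi > 0 inside a circular inclusion

        def heaviside(phi, eps=0.05):
            # Smoothed Heaviside, as commonly used in level set formulations.
            return 0.5*(1 + np.tanh(phi/eps))

        K1, K2 = 500.0, 50.0                    # facies permeabilities (mD), assumed
        K = K1*heaviside(phi) + K2*(1 - heaviside(phi))
        print(f"K range: {K.min():.1f} to {K.max():.1f} mD")  # near the two levels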

  19. Effects of technological processes on enniatin levels in pasta.

    Science.gov (United States)

    Serrano, Ana B; Font, Guillermina; Mañes, Jordi; Ferrer, Emilia

    2016-03-30

    Potential human health risks posed by enniatins (ENs) require their control, primarily in cereal products, creating a demand for harvesting, food processing and storage techniques capable of preventing, reducing and/or eliminating the contamination. In this study, different pasta-processing methodologies simulating traditional and industrial processes were developed in order to determine the fate of EN mycotoxins. The levels of ENs were studied at different steps of pasta processing. The effect of temperature during processing was evaluated in two types of pasta (white and whole-grain pasta). Mycotoxin analysis was performed by LC-MS/MS. High reductions (up to 50% and 80%) were achieved during pasta drying at 45-55°C and 70-90°C, respectively. Treatment at low temperature (25°C) did not change EN levels. Pasta composition did not have a significant effect on the stability of ENs. Temperature thus allowed a marked mycotoxin reduction during pasta processing. Generally, ENA1 and ENB showed higher thermal stability than ENA and ENB1. The findings from the present study suggest that pasta processing at medium-high temperatures is a potential tool to remove an important fraction of ENs from the initial durum wheat semolina. © 2015 Society of Chemical Industry.

  20. A social-level macro-governance mode for collaborative manufacturing processes

    Science.gov (United States)

    Gao, Ji; Lv, Hexin; Jin, Zhiyong; Xu, Ping

    2017-08-01

    This paper proposes a social-level macro-governance mode that innovates on the prevailing centralized control of CoM (Collaborative Manufacturing) processes. The mode rests on three standalone yet complementary technologies: social-level CoM process norms, a CoM process supervision system, and rational agents acting as brokers for enterprises. It is the close coupling of these technologies that effectively removes the uncontrollability obstacle faced by cross-management-domain CoM processes. As a result, this mode enables CoM applications to be implemented by uniting the centralized control each CoM partner exercises over its own CoM activities, and therefore provides a new distributed CoM process control mode to push forward the convenient development and large-scale deployment of SME-oriented CoM applications.

  1. Multi-domain, higher order level set scheme for 3D image segmentation on the GPU

    DEFF Research Database (Denmark)

    Sharma, Ojaswa; Zhang, Qin; Anton, François

    2010-01-01

    to evaluate level set surfaces that are $C^2$ continuous, but are slow due to high computational burden. In this paper, we provide a higher order GPU based solver for fast and efficient segmentation of large volumetric images. We also extend the higher order method to multi-domain segmentation. Our streaming...

  2. A framework for an institutional high level security policy for the processing of medical data and their transmission through the Internet.

    Science.gov (United States)

    Ilioudis, C; Pangalos, G

    2001-01-01

    The Internet provides many advantages when used for interaction and data sharing among health care providers, patients, and researchers. However, the advantages provided by the Internet come with a significantly greater element of risk to the confidentiality, integrity, and availability of information. It is therefore essential that Health Care Establishments processing and exchanging medical data use an appropriate security policy. To develop a High Level Security Policy for the processing of medical data and their transmission through the Internet, which is a set of high-level statements intended to guide Health Care Establishment personnel who process and manage sensitive health care information. We developed the policy based on a detailed study of the existing framework in the EU countries, USA, and Canada, and on consultations with users in the context of the Intranet Health Clinic project. More specifically, this paper has taken into account the major directives, technical reports, law, and recommendations that are related to the protection of individuals with regard to the processing of personal data, and the protection of privacy and medical data on the Internet. We present a High Level Security Policy for Health Care Establishments, which includes a set of 7 principles and 45 guidelines detailed in this paper. The proposed principles and guidelines have been made as generic and open to specific implementations as possible, to provide for maximum flexibility and adaptability to local environments. The High Level Security Policy establishes the basic security requirements that must be addressed to use the Internet to safely transmit patient and other sensitive health care information. The High Level Security Policy is primarily intended for large Health Care Establishments in Europe, USA, and Canada. It is clear however that the general framework presented here can only serve as reference material for developing an appropriate High Level Security Policy in a

  3. Testing for Level Shifts in Fractionally Integrated Processes: a State Space Approach

    DEFF Research Database (Denmark)

    Monache, Davide Delle; Grassi, Stefano; Santucci de Magistris, Paolo

    Short memory models contaminated by level shifts have similar long-memory features as fractionally integrated processes. This makes it hard to verify whether the true data generating process is a pure fractionally integrated process when employing standard estimation methods based on the autocorrelation function or the periodogram. In this paper, we propose a robust testing procedure, based on an encompassing parametric specification that allows the level shifts to be disentangled from the fractionally integrated component. The estimation is carried out on the basis of a state-space methodology, and it leads to a robust estimate of the fractional integration parameter also in the presence of level shifts. Once the memory parameter is correctly estimated, we use the KPSS test for the presence of level shifts. The Monte Carlo simulations show how this approach produces unbiased estimates of the memory parameter...
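
    A minimal sketch of the phenomenon the record starts from: a short-memory series with a single level shift produces a spuriously large memory-parameter estimate from the log-periodogram (GPH) regression, while a KPSS test flags the nonstationarity. The GPH implementation and simulated data are illustrative, not the paper's state-space procedure:

        import numpy as np
        from statsmodels.tsa.stattools import kpss

        def gph_d(x, m=None):
            # Log-periodogram (GPH) estimate of the memory parameter d:
            # regress log I(freq_j) on -log(4 sin^2(freq_j/2)), j = 1..m.
            n = len(x)
            m = m or int(n**0.5)
            freq = 2*np.pi*np.arange(1, m + 1)/n
            I = np.abs(np.fft.fft(x - x.mean())[1:m + 1])**2/(2*np.pi*n)
            reg = -2*np.log(2*np.sin(freq/2))
            return np.polyfit(reg, np.log(I), 1)[0]   # slope = d

        rng = np.random.default_rng(2)
        x = rng.standard_normal(1000)                 # white noise (d = 0)
        x[500:] += 2.0                                # one level shift
        print("spurious d estimate:", round(gph_d(x), 2))
        print("KPSS statistic, p-value:", kpss(x, regression="c", nlags="auto")[:2])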

  4. Audiovisual speech perception development at varying levels of perceptual processing

    OpenAIRE

    Lalonde, Kaylah; Holt, Rachael Frush

    2016-01-01

    This study used the auditory evaluation framework [Erber (1982). Auditory Training (Alexander Graham Bell Association, Washington, DC)] to characterize the influence of visual speech on audiovisual (AV) speech perception in adults and children at multiple levels of perceptual processing. Six- to eight-year-old children and adults completed auditory and AV speech perception tasks at three levels of perceptual processing (detection, discrimination, and recognition). The tasks differed in the le...

  5. Reconciling the influence of task-set switching and motor inhibition processes on stop signal after-effects

    Directory of Open Access Journals (Sweden)

    Joaquin A. Anguera

    2013-09-01

    Executive response functions can be affected by preceding events, even if they are no longer associated with the current task at hand. For example, studies utilizing the stop signal task have reported slower response times to ‘GO’ stimuli when the preceding trial involved the presentation of a ‘STOP’ signal. However, the neural mechanisms that underlie this behavioral after-effect are unclear. To address this, behavioral and electroencephalography (EEG) measures were examined in 18 young adults (18-30 years) on 'GO' trials following a previously ‘Successful Inhibition’ trial (pSI), a previously ‘Failed Inhibition’ trial (pFI), and a previous ‘GO’ trial (pGO). As in previous research, slower response times were observed during both pSI and pFI trials (i.e., ‘GO’ trials that were preceded by a successful and an unsuccessful inhibition trial, respectively) compared to pGO trials (i.e., ‘GO’ trials that were preceded by another ‘GO’ trial). Interestingly, response time slowing was greater during pSI trials compared to pFI trials, suggesting executive control is influenced by both task-set switching and persisting motor inhibition processes. Follow-up behavioral analyses indicated that these effects resulted from between-trial control adjustments rather than repetition priming effects. Analyses of inter-electrode coherence (IEC) and inter-trial coherence (ITC) indicated that both pSI and pFI trials showed greater phase synchrony during the inter-trial interval compared to pGO trials. Unlike the IEC findings, differential ITC was present within the beta and alpha frequency bands in line with the observed behavior (pSI > pFI > pGO), suggestive of more consistent phase synchrony involving motor inhibition processes during the ITI at a regional level. These findings suggest that between-trial control adjustments involved with task-set switching and motor inhibition processes influence subsequent performance, providing new insights into the

  6. How organizational context affects bioethical decision-making: pharmacists' management of gatekeeping processes in retail and hospital settings.

    Science.gov (United States)

    Chiarello, Elizabeth

    2013-12-01

    Social science studies of bioethics demonstrate that ethics are highly contextual, functioning differently across local settings as actors make daily decisions "on the ground." Sociological studies that demonstrate the key role organizations play in shaping ethical decision-making have disproportionately focused on physicians and nurses working in hospital settings where they contend with life and death issues. This study broadens our understanding of the contexts of ethical decision-making by empirically examining understudied healthcare professionals - pharmacists - working in two organizational settings, retail and hospital, where they act as gatekeepers to regulated goods and services as they contend with ethical issues ranging from the serious to the mundane. This study asks: How do organizations shape pharmacists' identification, negotiation, and resolution of ethical challenges; in other words, how do organizations shape pharmacists' gatekeeping processes? Based on 95 semi-structured interviews with U.S. pharmacists practicing in retail and hospital pharmacies conducted between September 2009 and May 2011, this research finds that organizations influence ethical decision-making by shaping how pharmacists construct four gatekeeping processes: medical, legal, fiscal, and moral. Each gatekeeping process manifests differently across organizations due to how these settings structure inter-professional power dynamics, proximity to patients, and means of accessing information. Findings suggest new directions for theorizing about ethical decision-making in medical contexts by drawing attention to new ethical actors, new organizational settings, an expanded definition of ethical challenges, and a broader conceptualization of gatekeeping. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Systems metabolic engineering in an industrial setting.

    Science.gov (United States)

    Sagt, Cees M J

    2013-03-01

    Systems metabolic engineering is based on systems biology, synthetic biology, and evolutionary engineering and is now also applied in industry. Industrial use of systems metabolic engineering focuses on strain and process optimization. Since ambitious yields, titers, productivities, and low costs are key in an industrial setting, the use of effective and robust methods in systems metabolic engineering is becoming very important. Major improvements in the field of proteomics and metabolomics have been crucial in the development of genome-wide approaches in strain and process development. This is accompanied by a rapid increase in DNA sequencing and synthesis capacity. These developments enable the use of systems metabolic engineering in an industrial setting. Industrial systems metabolic engineering can be defined as the combined use of genome-wide genomics, transcriptomics, proteomics, and metabolomics to modify strains or processes. This approach has become very common since the technology for generating large data sets at all levels of the cellular processes has rapidly developed into robust, reliable, and affordable methods. The main challenge and scope of this mini review is how to translate these large data sets into relevant biological leads that can be tested for strain or process improvements. Experimental setup, heterogeneity of the culture, and sample pretreatment are important issues which are easily underrated. In addition, the process of structuring, filtering, and visualizing data is important, but the availability of a genetic toolbox and equipment for medium/high-throughput fermentation is also a key success factor. For an efficient bioprocess, all the different components in this process have to work together. Therefore, mutual tuning of these components is an important strategy.

  8. [Nursing students' perception of the learning process in a hospital setting].

    Science.gov (United States)

    Alves, Elcilene Andreíne Terra Durgante; Cogo, Ana Luísa Petersen

    2014-03-01

    The aim of this study was to identify how nursing students perceive and experience the learning process during curricular practice in a hospital setting. A qualitative, retrospective, documentary study was developed in an undergraduate nursing course. Data comprised 162 posts made by 34 students in the online discussion forum of the Learning Management System Moodle during the first half of 2011. The following themes emerged from the thematic content analysis: "nursing students' understanding about the professional practice," and "the teaching and learning process in the perspective of nursing students." The study demonstrated that the forum was a place for reporting experiences such as the description of the physical area, performing procedures, perception of nursing care activities, conflicts with peers, coping with death and learning evaluation. The online discussion forum needs to be used by professors as a space of interaction so as to contribute to professional training.

  9. Wafer-Level Membrane-Transfer Process for Fabricating MEMS

    Science.gov (United States)

    Yang, Eui-Hyeok; Wiberg, Dean

    2003-01-01

    A process for transferring an entire wafer-level micromachined silicon structure for mating with and bonding to another such structure has been devised. This process is intended especially for use in wafer-level integration of microelectromechanical systems (MEMS) that have been fabricated on dissimilar substrates. Unlike in some older membrane-transfer processes, no wax or epoxy is used during transfer. In this process, the substrate of the wafer-level structure to be transferred serves as a carrier and is etched away once the transfer has been completed. Another important feature of this process is that the transferred membrane and the electrode wafer together constitute an electrostatic actuator array. An SOI wafer and a silicon wafer (see Figure 1) are used as the carrier and electrode wafers, respectively. After oxidation, both wafers are patterned and etched to define a corrugation profile and an electrode array, respectively. The polysilicon layer is deposited on the SOI wafer. The carrier wafer is bonded to the electrode wafer by using evaporated indium bumps. A piston pressure of 4 kPa is applied at 156 °C in a vacuum chamber to provide hermetic sealing. The substrate of the SOI wafer is etched in a 25 weight percent TMAH bath at 80 °C. The exposed buried oxide is then removed by using 49 percent HF droplets after an oxygen plasma ashing. The SOI top silicon layer is etched away by using an SF6 plasma to define the corrugation profile, followed by HF droplet etching of the remaining oxide. The SF6 plasma with a shadow mask selectively etches the polysilicon membrane if the transferred membrane structure needs to be patterned. Electrostatic actuators with various electrode gaps have been fabricated by this transfer technique. The gap between the transferred membrane and the electrode substrate is very uniform (within 0.1 µm across a wafer diameter of 100 mm, achieved by optimizing the bonding control). Figure 2 depicts the finished product.

  10. A multilevel, level-set method for optimizing eigenvalues in shape design problems

    International Nuclear Information System (INIS)

    Haber, E.

    2004-01-01

    In this paper, we consider optimal design problems that involve shape optimization. The goal is to determine the shape of a certain structure such that it is either as rigid or as soft as possible. To achieve this goal we combine two new ideas for an efficient solution of the problem. First, we replace the eigenvalue problem with an approximation by using inverse iteration. Second, we use a level set method, but rather than propagating the front we use constrained optimization methods combined with multilevel continuation techniques. Combining these two ideas, we obtain a robust and rapid method for the solution of the optimal design problem.
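
    A minimal sketch of the first ingredient named in this record, inverse iteration as an eigenvalue approximation; the shifted solve per iteration is the standard scheme, while the test matrix and shift are illustrative assumptions:

        import numpy as np

        def inverse_iteration(A, shift=0.0, tol=1e-10, max_iter=200):
            # Approximate the eigenpair of symmetric A closest to `shift`
            # by power iteration on (A - shift*I)^(-1).
            n = A.shape[0]
            v = np.ones(n)/np.sqrt(n)
            M = A - shift*np.eye(n)
            for _ in range(max_iter):
                w = np.linalg.solve(M, v)      # one linear solve per iteration
                w /= np.linalg.norm(w)
                if w @ v < 0:                  # fix the arbitrary sign
                    w = -w
                if np.linalg.norm(w - v) < tol:
                    break
                v = w
            return v @ A @ v, v                # Rayleigh quotient, eigenvector

        A = np.diag([4.0, 2.5, 1.2]) + 0.1     # symmetric test matrix (assumed)
        lam, v = inverse_iteration(A, shift=1.0)
        print("eigenvalue nearest 1.0:", round(lam, 4))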

  11. Separation processes for high-level radioactive waste treatment

    International Nuclear Information System (INIS)

    Sutherland, D.G.

    1992-11-01

    During World War II, high-level waste (HLW) was generated as a byproduct of the production of nuclear materials in the United States for national defense. Since that time, further quantities of HLW radionuclides have been generated by continued nuclear materials production, research, and the commercial nuclear power program. In this paper HLW is defined as the highly radioactive material resulting from the processing of spent nuclear fuel: the liquid waste generated during the recovery of uranium and plutonium in a fuel processing plant, which generally contains more than 99% of the nonvolatile fission products produced during reactor operation. Since this paper deals with waste separation processes, spent reactor fuel elements that have not been dissolved and further processed are excluded.

  12. Comparison of different statistical methods for estimation of extreme sea levels with wave set-up contribution

    Science.gov (United States)

    Kergadallan, Xavier; Bernardara, Pietro; Benoit, Michel; Andreewsky, Marc; Weiss, Jérôme

    2013-04-01

    Estimating the probability of occurrence of extreme sea levels is a central issue for the protection of the coast. Return periods of sea level with wave set-up contribution are estimated here for one site: Cherbourg, France, in the English Channel. The methodology follows two steps: the first is computation of the joint probability of simultaneous wave height and still sea level; the second is interpretation of those joint probabilities to assess the sea level for a given return period. Two different approaches were evaluated to compute the joint probability of simultaneous wave height and still sea level: the first is multivariate extreme value distributions of logistic type, in which all components of the variables become large simultaneously; the second is the conditional approach for multivariate extreme values, in which only one component of the variables has to be large. Two different methods were applied to estimate the sea level with wave set-up contribution for a given return period: Monte-Carlo simulation, in which the estimation is more accurate but needs more computation time, and classical ocean engineering design contours of inverse-FORM type, which is simpler and allows more complex estimation of the wave set-up part (wave propagation to the coast, for example). We compare the results from the two approaches with the two methods. To be able to use both the Monte-Carlo simulation and the design contours methods, wave set-up is estimated with a simple empirical formula. We show the advantages of the conditional approach compared to the multivariate extreme value approach when extreme sea levels occur when either surge or wave height is large. We discuss the validity of the ocean engineering design contours method, which is an alternative when the computation of sea levels is too complex to use the Monte-Carlo simulation method.
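
    A minimal sketch of the Monte-Carlo route described here: sample still water level and wave height jointly (a Gaussian copula stands in for the fitted joint model), add an empirical set-up term, and read return levels off quantiles. The 0.2*Hs set-up rule, all parameter values, and the annual-maximum interpretation of the draws are assumptions for illustration:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000                                  # joint draws of (level, wave height)
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
        still_level = 4.0 + 0.3*z[:, 0]              # m, still sea level
        hs = np.exp(0.5 + 0.4*z[:, 1])               # m, significant wave height
        sea_level = still_level + 0.2*hs             # assumed empirical set-up formula

        # Treating each draw as one annual maximum (a strong simplification),
        # the T-year return level is the (1 - 1/T) quantile of the simulated levels.
        for T in (10, 100):
            print(f"{T}-year level ~ {np.quantile(sea_level, 1 - 1/T):.2f} m")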

  13. Study unique artistic lopburi province for design brass tea set of bantahkrayang community

    Science.gov (United States)

    Pliansiri, V.; Seviset, S.

    2017-07-01

    The objectives of this study were as follows: 1) to study the production process of the handcrafted brass tea set; and 2) to design and develop the handcrafted brass tea set. The design process started with mutual analysis of the conceptual framework for product design, Quality Function Deployment, the Theory of Inventive Problem Solving, principles of craft design, and principles of reverse engineering. Experts in the fields of both industrial product design and brass handicraft products evaluated the brass tea set design, and a prototype was created and assessed by a sample of consumers who had previously bought brass tea sets from the Bantahkrayang Community. The statistical methods used were percentage, mean (X̄) and standard deviation (S.D.). The assessment of consumer satisfaction with the handcrafted brass tea set showed a high level of satisfaction.

  14. Atypical biological motion kinematics are represented by complementary lower-level and top-down processes during imitation learning.

    Science.gov (United States)

    Hayes, Spencer J; Dutoy, Chris A; Elliott, Digby; Gowen, Emma; Bennett, Simon J

    2016-01-01

    Learning a novel movement requires a new set of kinematics to be represented by the sensorimotor system. This is often accomplished through imitation learning where lower-level sensorimotor processes are suggested to represent the biological motion kinematics associated with an observed movement. Top-down factors have the potential to influence this process based on the social context, attention and salience, and the goal of the movement. In order to further examine the potential interaction between lower-level and top-down processes in imitation learning, the aim of this study was to systematically control the mediating effects during an imitation of biological motion protocol. In this protocol, we used non-human agent models that displayed different novel atypical biological motion kinematics, as well as a control model that displayed constant velocity. Importantly the three models had the same movement amplitude and movement time. Also, the motion kinematics were displayed in the presence, or absence, of end-state-targets. Kinematic analyses showed atypical biological motion kinematics were imitated, and that this performance was different from the constant velocity control condition. Although the imitation of atypical biological motion kinematics was not modulated by the end-state-targets, movement time was more accurate in the absence, compared to the presence, of an end-state-target. The fact that end-state targets modulated movement time accuracy, but not biological motion kinematics, indicates imitation learning involves top-down attentional, and lower-level sensorimotor systems, which operate as complementary processes mediated by the environmental context. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Shifts in information processing level: the speed theory of intelligence revisited.

    Science.gov (United States)

    Sircar, S S

    2000-06-01

    A hypothesis is proposed here to reconcile the inconsistencies observed in the IQ-P3 latency relation. The hypothesis stems from the observation that task-induced increase in P3 latency correlates positively with IQ scores. It is hypothesised that: (a) there are several parallel information processing pathways of varying complexity which are associated with the generation of P3 waves of varying latencies; (b) with increasing workload, there is a shift in the 'information processing level' through progressive recruitment of more complex polysynaptic pathways with greater processing power and inhibition of the oligosynaptic pathways; (c) high-IQ subjects have a greater reserve of higher level processing pathways; (d) a given 'task-load' imposes a greater 'mental workload' in subjects with lower IQ than in those with higher IQ. According to this hypothesis, a meaningful comparison of the P3 correlates of IQ is possible only when the information processing level is pushed to its limits.

  16. Education level inequalities and transportation injury mortality in the middle aged and elderly in European settings

    NARCIS (Netherlands)

    Borrell, C.; Plasència, A.; Huisman, M.; Costa, G.; Kunst, A.; Andersen, O.; Bopp, M.; Borgan, J.-K.; Deboosere, P.; Glickman, M.; Gadeyne, S.; Minder, C.; Regidor, E.; Spadea, T.; Valkonen, T.; Mackenbach, J. P.

    2005-01-01

    OBJECTIVE: To study the differential distribution of transportation injury mortality by educational level in nine European settings, among people older than 30 years, during the 1990s. METHODS: Deaths of men and women older than 30 years from transportation injuries were studied. Rate differences

  17. The impact of educational level on performance on auditory processing tests

    Directory of Open Access Journals (Sweden)

    Cristina F.B. Murphy

    2016-03-01

    Research has demonstrated that a higher level of education is associated with better performance on cognitive tests among middle-aged and elderly people. However, the effects of education on auditory processing skills have not yet been evaluated. Previous demonstrations of sensory-cognitive interactions in the aging process indicate the potential importance of this topic. Therefore, the primary purpose of this study was to investigate the performance of middle-aged and elderly people with different levels of formal education on auditory processing tests. A total of 177 adults with no evidence of cognitive, psychological or neurological conditions took part in the research. The participants completed a series of auditory assessments, including dichotic digit, frequency pattern and speech-in-noise tests. A working memory test was also performed to investigate the extent to which auditory processing and cognitive performance were associated. The results demonstrated positive but weak correlations between years of schooling and performance on all of the tests applied. Years of schooling was also one of the best predictors of frequency pattern and speech-in-noise test performance. Additionally, performance on the working memory, frequency pattern and dichotic digit tests was correlated, suggesting that the influence of educational level on auditory processing performance might be associated with the cognitive demands of the auditory processing tests rather than with auditory sensory aspects themselves. Longitudinal research is required to investigate the causal relationship between educational level and auditory processing skills.

  18. High level cognitive information processing in neural networks

    Science.gov (United States)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized here, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  19. The levels of perceptual processing and the neural correlates of increasing subjective visibility.

    Science.gov (United States)

    Binder, Marek; Gociewicz, Krzysztof; Windey, Bert; Koculak, Marcin; Finc, Karolina; Nikadon, Jan; Derda, Monika; Cleeremans, Axel

    2017-10-01

    According to the levels-of-processing hypothesis, transitions from unconscious to conscious perception may depend on stimulus processing level, with more gradual changes for low-level stimuli and more dichotomous changes for high-level stimuli. In an event-related fMRI study we explored this hypothesis using a visual backward masking procedure. Task requirements manipulated level of processing. Participants reported the magnitude of the target digit in the high-level task, its color in the low-level task, and rated subjective visibility of stimuli using the Perceptual Awareness Scale. Intermediate stimulus visibility was reported more frequently in the low-level task, confirming prior behavioral results. Visible targets recruited insulo-fronto-parietal regions in both tasks. Task effects were observed in visual areas, with higher activity in the low-level task across all visibility levels. Thus, the influence of level of processing on conscious perception may be mediated by attentional modulation of activity in regions representing features of consciously experienced stimuli. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Microwave imaging of dielectric cylinder using level set method and conjugate gradient algorithm

    International Nuclear Information System (INIS)

    Grayaa, K.; Bouzidi, A.; Aguili, T.

    2011-01-01

    In this paper, we propose a computational method for microwave imaging of a dielectric cylinder, based on combining the level set technique and the conjugate gradient algorithm. By measuring the scattered field, we try to retrieve the shape, localisation and permittivity of the object. The forward problem is solved by the method of moments, while the inverse problem is reformulated as an optimization problem and solved by the proposed scheme. It is found that the proposed method is able to give good reconstruction quality in terms of the reconstructed shape and permittivity.
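
    A minimal sketch of the conjugate gradient iteration that drives the inversion; here it solves a generic symmetric positive definite system rather than the scattering operator the paper builds with the method of moments, and the test matrix is an illustrative assumption:

        import numpy as np

        def conjugate_gradient(A, b, tol=1e-8):
            # Classic CG for A x = b with A symmetric positive definite.
            x = np.zeros_like(b)
            r = b - A @ x
            p = r.copy()
            rs = r @ r
            for _ in range(len(b)):
                Ap = A @ p
                alpha = rs/(p @ Ap)
                x += alpha*p
                r -= alpha*Ap
                rs_new = r @ r
                if np.sqrt(rs_new) < tol:
                    break
                p = r + (rs_new/rs)*p
                rs = rs_new
            return x

        A = np.array([[4.0, 1.0], [1.0, 3.0]])   # SPD test matrix
        b = np.array([1.0, 2.0])
        print(conjugate_gradient(A, b))           # ~ [0.0909, 0.6364]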

  1. A Comparative Study of Applying Active-Set and Interior Point Methods in MPC for Controlling Nonlinear pH Process

    Directory of Open Access Journals (Sweden)

    Syam Syafiie

    2014-06-01

    A comparative study of Model Predictive Control (MPC) using the active-set method and interior point methods is proposed as a control technique for a highly non-linear pH process. The process is a strong acid-strong base system: a strong acid, hydrochloric acid (HCl), and a strong base, sodium hydroxide (NaOH), in the presence of the buffer solution sodium bicarbonate (NaHCO3), flow into a neutralization reactor. The non-linear pH neutralization model governing this process is represented by multi-linear models. The performance of both controllers is studied by evaluating their set-point tracking and disturbance rejection. In addition, the optimization times of the two methods are compared. Both MPC controllers show similar performance, with no overshoot, offset, or oscillation; however, the conventional active-set method gives a shorter control action time for small-scale optimization problems compared to MPC using the IPM method for pH control.
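
    A minimal sketch contrasting an active-set-style solver (SciPy's SLSQP) with an interior-point-style solver (trust-constr) on a tiny MPC-like quadratic program over a control horizon; the Hessian, gradient, and actuator limits are illustrative assumptions, not the paper's pH model:

        import time
        import numpy as np
        from scipy.optimize import minimize, Bounds

        H = np.eye(5) + 0.5                      # QP Hessian over a 5-step horizon
        g = -np.linspace(1.0, 2.0, 5)            # gradient from setpoint tracking
        obj = lambda u: 0.5*u @ H @ u + g @ u
        bounds = Bounds(0.0, 1.0)                # actuator limits (e.g. valve opening)

        for method in ("SLSQP", "trust-constr"):
            t0 = time.perf_counter()
            res = minimize(obj, np.zeros(5), method=method, bounds=bounds)
            print(method, res.x.round(3), f"{time.perf_counter() - t0:.4f} s")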

  2. Process description and plant design for preparing ceramic high-level waste forms

    International Nuclear Information System (INIS)

    Grantham, L.F.; McKisson, R.L.; Guon, J.; Flintoff, J.F.; McKenzie, D.E.

    1983-01-01

    The ceramics process flow diagram has been simplified and upgraded to utilize only two major processing steps - fluid-bed calcination and hot isostatic press consolidation. Full-scale fluid-bed calcination has been used at INEL to calcine high-level waste for 18 y, and a second-generation calciner, a fully remotely operated and maintained calciner that meets ALARA guidelines, started calcining high-level waste in 1982. Full-scale hot isostatic consolidation has been used by DOE and commercial enterprises for several years to consolidate radioactive components and to encapsulate spent fuel elements. With further development aimed at process integration and parametric optimization, the operating knowledge from full-scale demonstration of the key process steps should be rapidly adaptable to scale-up of the ceramic process to full plant size. Process flowsheets used to prepare ceramic and glass waste forms from defense and commercial high-level liquid waste are described. Preliminary layouts of process flow diagrams in a high-level processing canyon were prepared and used to estimate the preliminary cost of the plant to fabricate both waste forms. The estimated total waste management costs of SRP high-level liquid waste were compared for the two options. Using our design for both the ceramic and glass plants, capital and operating costs are essentially the same for both defense and commercial wastes, but total waste management costs are calculated to be significantly less for defense wastes using the ceramic option. It is concluded from this and other studies that the ceramic form may offer important advantages over glass in leach resistance, waste loading, density, and process flexibility. Preliminary economic calculations indicate that ceramics must be considered a leading candidate for the form to immobilize high-level wastes

  3. Process and impact evaluation of the Romp & Chomp obesity prevention intervention in early childhood settings: lessons learned from implementation in preschools and long day care settings.

    Science.gov (United States)

    de Silva-Sanigorski, Andrea M; Bell, Andrew C; Kremer, Peter; Park, Janet; Demajo, Lisa; Smith, Michael; Sharp, Sharon; Nichols, Melanie; Carpenter, Lauren; Boak, Rachel; Swinburn, Boyd

    2012-06-01

    The Romp & Chomp controlled trial, which aimed to prevent obesity in preschool Australian children, was recently found to reduce the prevalence of childhood overweight and obesity and improve children's dietary patterns. The intervention focused on capacity building and policy implementation within various early childhood settings. This paper reports on the process and impact evaluation of this trial and the lessons learned from this complex community intervention. Process data was collected throughout and audits capturing nutrition and physical activity-related environments and practices were completed postintervention by directors of Long Day Care (LDC) centers (n = 10) and preschools (n = 41) in intervention and comparison (n = 161 LDC and n = 347 preschool) groups. The environmental audits demonstrated positive impacts in both settings on policy, nutrition, physical activity opportunities, and staff capacity and practices, although results varied across settings and were more substantial in the preschool settings. Important lessons were learned in relation to implementation of such community-based interventions, including the significant barriers to implementing health-promotion interventions in early childhood settings, lack of engagement of for-profit LDC centers in the evaluation, and an inability to attribute direct intervention impacts when the intervention components were delivered as part of a health-promotion package integrated with other programs. These results provide confidence that obesity prevention interventions in children's settings can be effective; however, significant efforts must be directed toward developing context-specific strategies that invest in policies, capacity building, staff support, and parent engagement. Recognition by funders and reviewers of the difficulties involved in implementing and evaluating such complex interventions is also critical to strengthening the evidence base on the effectiveness of such public health

  4. Sleep Disrupts High-Level Speech Parsing Despite Significant Basic Auditory Processing.

    Science.gov (United States)

    Makov, Shiri; Sharon, Omer; Ding, Nai; Ben-Shachar, Michal; Nir, Yuval; Zion Golumbic, Elana

    2017-08-09

    The extent to which the sleeping brain processes sensory information remains unclear. This is particularly true for continuous and complex stimuli such as speech, in which information is organized into hierarchically embedded structures. Recently, novel metrics for assessing the neural representation of continuous speech have been developed using noninvasive brain recordings that have thus far only been tested during wakefulness. Here we investigated, for the first time, the sleeping brain's capacity to process continuous speech at different hierarchical levels using a newly developed Concurrent Hierarchical Tracking (CHT) approach that allows monitoring the neural representation and processing-depth of continuous speech online. Speech sequences were compiled with syllables, words, phrases, and sentences occurring at fixed time intervals such that different linguistic levels correspond to distinct frequencies. This enabled us to distinguish their neural signatures in brain activity. We compared the neural tracking of intelligible versus unintelligible (scrambled and foreign) speech across states of wakefulness and sleep using high-density EEG in humans. We found that neural tracking of stimulus acoustics was comparable across wakefulness and sleep and similar across all conditions regardless of speech intelligibility. In contrast, neural tracking of higher-order linguistic constructs (words, phrases, and sentences) was only observed for intelligible speech during wakefulness and could not be detected at all during nonrapid eye movement or rapid eye movement sleep. These results suggest that, whereas low-level auditory processing is relatively preserved during sleep, higher-level hierarchical linguistic parsing is severely disrupted, thereby revealing the capacity and limits of language processing during sleep. SIGNIFICANCE STATEMENT Despite the persistence of some sensory processing during sleep, it is unclear whether high-level cognitive processes such as speech

  5. A set of simple cell processes is sufficient to model spiral cleavage.

    Science.gov (United States)

    Brun-Usan, Miguel; Marín-Riera, Miquel; Grande, Cristina; Truchado-Garcia, Marta; Salazar-Ciudad, Isaac

    2017-01-01

    During cleavage, different cellular processes cause the zygote to become partitioned into a set of cells with a specific spatial arrangement. These processes include the orientation of cell division according to: an animal-vegetal gradient; the main axis (Hertwig's rule) of the cell; and the contact areas between cells or the perpendicularity between consecutive cell divisions (Sachs' rule). Cell adhesion and cortical rotation have also been proposed to be involved in spiral cleavage. We use a computational model of cell and tissue biomechanics to account for the different existing hypotheses about how the specific spatial arrangement of cells in spiral cleavage arises during development. Cell polarization by an animal-vegetal gradient, a bias to perpendicularity between consecutive cell divisions (Sachs' rule), cortical rotation and cell adhesion, when combined, reproduce the spiral cleavage, whereas other combinations of processes cannot. Specifically, cortical rotation is necessary at the 8-cell stage to direct all micromeres in the same direction. By varying the relative strength of these processes, we reproduce the spatial arrangement of cells in the blastulae of seven different invertebrate species. © 2017. Published by The Company of Biologists Ltd.

  6. The Minimum Data Set Depression Quality Indicator: Does It Reflect Differences in Care Processes?

    Science.gov (United States)

    Simmons, S.F.; Cadogan, M.P.; Cabrera, G.R.; Al-Samarrai, N.R.; Jorge, J.S.; Levy-Storms, L.; Osterweil, D.; Schnelle, J.F.

    2004-01-01

    Purpose. The objective of this work was to determine if nursing homes that score differently on prevalence of depression, according to the Minimum Data Set (MDS) quality indicator, also provide different processes of care related to depression. Design and Methods. A cross-sectional study with 396 long-term residents in 14 skilled nursing…

  7. Records for radioactive waste management up to repository closure: Managing the primary level information (PLI) set

    International Nuclear Information System (INIS)

    2004-07-01

    The objective of this publication is to highlight the importance of the early establishment of a comprehensive records system to manage primary level information (PLI) as an integrated set of information, not merely as a collection of information, throughout all the phases of radioactive waste management. In addition to the information described in the waste inventory record keeping system (WIRKS), the PLI of a radioactive waste repository consists of the entire universe of information, data and records related to any aspect of the repository's life cycle. It is essential to establish PLI requirements based on an integrated set of needs from the regulators and waste managers involved in the waste management chain, and to update these requirements as needs change over time. Information flow for radioactive waste management should be back-end driven. Identification of an authority to oversee the management of PLI throughout all phases of the radioactive waste management life cycle would guarantee the information flow to future generations. The long-term protection of information essential to future generations can only be assured by the timely establishment of a comprehensive and effective records management system (RMS) capable of capturing, indexing and evaluating all PLI. The loss of intellectual control over the PLI would make it very difficult to subsequently identify the ILI and HLI information sets. At all times prior to the closure of a radioactive waste repository, there should be an identifiable entity with a legally enforceable financial and management responsibility for the continued operation of a PLI records management system. The information presented in this publication will assist Member States in ensuring that waste and repository records, relevant for retention after repository closure

  8. Review process of PSA level 2 of KBR - Concept and Experience

    International Nuclear Information System (INIS)

    Andernacht, M.; Glaser, H.; Sonnenkalb, M.

    2013-01-01

    In Germany, a periodic safety review (PSR) has to be performed every ten years by the utility. In the past, a PSR only included a plant-specific probabilistic safety analysis (PSA) Level 1 study. Since a revised version of the German PSA guideline was released in 2005, these plant-specific PSAs have to include a PSA Level 2 as well. For the NPP Brokdorf (KBR) PSA Level 2 project, an agreement was reached between all parties involved that the study would be performed not as part of the PSR process, but supplementary to it. This paper focuses on conclusions and findings from an ongoing parallel review process of the first full-scope PSA Level 2 performed by the utility for KBR, a typical German PWR-1300. The responsible authority, the 'Ministerium fuer Soziales, Gesundheit, Familie, Jugend und Senioren des Landes Schleswig-Holstein' (MSGF), initiated this parallel review process in agreement with the utility KBR and E.ON Kernkraft in 2006. The project will be completed soon. Such a review process allows essential steps of the PSA to be reviewed and commented on before the PSA Level 2 is finished. The benefit of this parallel review process is thus a significant enhancement of the quality and completeness of the PSA Level 2 study, as the majority of the recommendations given by the review team have been taken over by the utility and by the developer of the PSA, the AREVA NP company. Further, a common understanding and agreement will be reached at the end between all parties involved on the major topics of the PSA Level 2 study. The paper is followed by the slides of the presentation. (authors)

  9. Review process of PSA Level 2 of KBR. Concept and experience

    International Nuclear Information System (INIS)

    Andernacht, Martin; Glaser, Hendrik; Sonnenkalb, Martin

    2009-01-01

    In Germany, a periodic safety review (PSR) has to be performed every 10 years by the utility. In the past, a PSR only included a plant-specific probabilistic safety analysis (PSA) Level 1 study. For the NPP Brokdorf (KBR) PSA Level 2 project, an agreement was reached between all parties involved that the study would be performed not as part of the PSR process, but supplementary to it. Since a revised version of the German PSA guideline was released in 2005, these plant-specific PSAs have to include a PSA Level 2 as well. This paper focuses on conclusions and findings from an ongoing parallel review process of the first full-scope PSA Level 2 performed by the utility for KBR, a typical German PWR-1300. The responsible authority, the 'Ministerium fuer Soziales, Gesundheit, Familie, Jugend und Senioren des Landes Schleswig-Holstein (MSGF)' (Ministry of Social Affairs, Health, Family, Youth and Senior Citizens of Schleswig-Holstein), initiated this parallel review process in agreement with the utility KBR and E.ON Kernkraft in 2006. The project will be completed soon. Such a review process allows essential steps of the PSA to be reviewed and commented on before the PSA Level 2 is finished. The benefit of this parallel review process is thus a significant enhancement of the quality and completeness of the PSA Level 2 study, as the majority of the recommendations given by the review team have been taken over by the utility and by the developer of the PSA, the Areva NP company. Further, a common understanding and agreement will be reached at the end between all parties involved on the major topics of the PSA Level 2 study. (orig.)

  10. Nurses' comfort level with spiritual assessment: a study among nurses working in diverse healthcare settings.

    Science.gov (United States)

    Cone, Pamela H; Giske, Tove

    2017-10-01

    To gain knowledge about nurses' comfort level in assessing spiritual matters and to learn what questions nurses use in practice related to spiritual assessment. Spirituality is important in holistic nursing care; however, nurses report feeling uncomfortable and ill-prepared to address this domain with patients. Education is reported to impact nurses' ability to engage in spiritual care. This cross-sectional exploratory survey reports on a mixed-method study examining how comfortable nurses are with spiritual assessment. In 2014, a 21-item survey with 10 demographic variables and three open-ended questions was distributed to Norwegian nurses working in diverse care settings, yielding 172 responses (72% response rate). SPSS was used to analyse the quantitative data; thematic analysis examined the open-ended questions. Norwegian nurses reported a high level of comfort with most questions even though spirituality is seen as private. Nurses with some preparation or experience in spiritual care were most comfortable assessing spirituality. Statistically significant correlations were found between the nurses' comfort level with spiritual assessment and their preparedness and sense of the importance of spiritual assessment. How well-prepared nurses felt was related to years of experience, degree of spirituality and religiosity, and importance of spiritual assessment. Many nurses are poorly prepared for spiritual assessment and care among patients in diverse care settings; educational preparation increases their comfort level with facilitating such care. Nurses who feel well prepared with spirituality feel more comfortable with the spiritual domain. By fostering a culture where patients' spirituality is discussed and reflected upon in everyday practice and in continued education, nurses' sense of preparedness, and thus their level of comfort, can increase. Clinical supervision and interprofessional collaboration with hospital chaplains and/or other spiritual leaders can

  11. Process for Selecting System Level Assessments for Human System Technologies

    Science.gov (United States)

    Watts, James; Park, John

    2006-01-01

    The integration of the many life support systems necessary to construct a stable habitat is difficult. The correct identification of the appropriate technologies and corresponding interfaces is an exhaustive process. Once technologies are selected, secondary issues such as mechanical and electrical interfaces must be addressed. The required analytical and testing work must be approached in a piecewise fashion to achieve timely results. A repeatable process has been developed to identify and prioritize system-level assessments and testing needs. This Assessment Selection Process has been defined to assess cross-cutting integration issues on topics at the system or component level.

  12. Improving satellite-based PM2.5 estimates in China using Gaussian processes modeling in a Bayesian hierarchical setting.

    Science.gov (United States)

    Yu, Wenxi; Liu, Yang; Ma, Zongwei; Bi, Jun

    2017-08-01

    Using satellite-based aerosol optical depth (AOD) measurements and statistical models to estimate ground-level PM2.5 is a promising way to fill in areas that are not covered by ground PM2.5 monitors. The statistical models used in previous studies are primarily Linear Mixed Effects (LME) and Geographically Weighted Regression (GWR) models. In this study, we developed a new regression model between PM2.5 and AOD using Gaussian processes in a Bayesian hierarchical setting. Gaussian processes model the stochastic nature of the spatial random effects, where the mean surface and the covariance function are specified. The spatial stochastic process is incorporated under the Bayesian hierarchical framework to explain the variation of PM2.5 concentrations together with other factors, such as AOD and spatial and non-spatial random effects. We evaluate the results of our model and compare them with those of other, conventional statistical models (GWR and LME) by within-sample model fitting and out-of-sample validation (cross validation, CV). The results show that our model yields a CV result (R² = 0.81) that reflects higher accuracy than that of GWR and LME (0.74 and 0.48, respectively). Our results indicate that Gaussian process models have the potential to improve the accuracy of satellite-based PM2.5 estimates.
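
    A minimal sketch of a spatial Gaussian process regression of PM2.5 on AOD, using scikit-learn with an anisotropic RBF kernel over coordinates plus AOD rather than the paper's Bayesian hierarchical sampler; all data, kernel settings, and coefficients below are synthetic assumptions:

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(4)
        coords = rng.uniform(0, 10, size=(150, 2))     # monitor locations (arbitrary units)
        aod = rng.uniform(0.1, 1.5, 150)
        pm25 = 30*aod + 5*np.sin(coords[:, 0]) + rng.normal(0, 2, 150)

        X = np.c_[coords, aod]                         # features: (x, y, AOD)
        kernel = RBF(length_scale=[2.0, 2.0, 0.5]) + WhiteKernel()
        gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, pm25)
        print("in-sample R^2:", round(gp.score(X, pm25), 2))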

  13. Preface to the Viewpoint Set: Nanostructured metals - Advances in processing, characterization and application

    DEFF Research Database (Denmark)

    Huang, Xiaoxu

    2009-01-01

    The theme of two viewpoint sets has been nanostructured metals: one in 2003 on “Mechanical properties of fully dense nanocrystalline metals” (Scripta Materialia 2003;49:625–680) and one in 2004 on “Metals and alloys with a structural scale from the micrometer to the atomic dimensions” (Scripta... ...with increasingly finer structures in order to improve properties and sustainability. The structural scale of interest in such materials is therefore reduced to the nanometer range, which means that characterization and modeling of nanostructured metals now address an audience including not only physicists and materials scientists but also technologists and engineers. The present Viewpoint Set therefore covers metallic materials with a structural scale ranging from micrometer to nanometer in dimensions and focuses on processing techniques such as plastic deformation and phase transformations. As a result...

  14. Formal participation in the IASB's due process of standard setting : A multi-issue/multiperiod analysis

    NARCIS (Netherlands)

    Jorissen, A.; Lybaert, N.; Orens, R.; van der Tas, L.G.

    2012-01-01

    This paper sets out to enquire about the nature of constituents' participation in the IASB's due process in terms of representation (constituents' diversity and characteristics) and drivers to participate. We choose to adopt a multi-issue/multi-period approach to investigate constituents' formal

  15. High-level waste processing at the Savannah River Site: An update

    International Nuclear Information System (INIS)

    Marra, J.E.; Bennett, W.M.; Elder, H.H.; Lee, E.D.; Marra, S.L.; Rutland, P.L.

    1997-01-01

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS) in Aiken, SC began immobilizing high-level radioactive waste in borosilicate glass in 1996. Currently, the radioactive glass is being produced as a ''sludge-only'' composition by combining washed high-level waste sludge with glass frit. The glass is poured into stainless steel canisters which will eventually be disposed of in a permanent geological repository. To date, DWPF has produced about 100 canisters of vitrified waste. Future processing operations will be based on a ''coupled'' feed of washed high-level waste sludge, precipitated cesium, and glass frit. This paper provides an update on the processing activities completed to date, the operational/flowsheet problems encountered, and the programs underway to increase production rates.

  16. An instrumentation and control philosophy for high-level nuclear waste processing facilities

    International Nuclear Information System (INIS)

    Weigle, D.H.

    1990-01-01

    The purpose of this paper is to present an instrumentation and control philosophy which may be applied to high-level nuclear waste processing facilities. This philosophy describes the recommended criteria for automatic/manual control, remote/local control, remote/local display, diagnostic instrumentation, interlocks, alarm levels, and redundancy. Due to the hazardous nature of the process constituents of a high-level nuclear waste processing facility, it is imperative that safety and control features required for accident-free operation and maintenance be incorporated. A well-instrumented and controlled process, while initially more expensive in capital and design costs, is generally safer and less expensive to operate. When the long-term cost savings of a well-designed process are coupled with the high savings enjoyed by accident avoidance, the benefits far outweigh the initial capital and design costs.

  17. Higher levels of depression are associated with reduced global bias in visual processing.

    Science.gov (United States)

    de Fockert, Jan W; Cooper, Andrew

    2014-04-01

    Negative moods have been associated with a tendency to prioritise local details in visual processing. The current study investigated the relation between depression and visual processing using the Navon task, a standard task of local and global processing. In the Navon task, global stimuli are presented that are made up of many local parts, and the participants are instructed to report the identity of either a global or a local target shape. Participants with a low self-reported level of depression showed evidence of the expected global processing bias, and were significantly faster at responding to the global, compared with the local level. By contrast, no such difference was observed in participants with high levels of depression. The reduction of the global bias associated with high levels of depression was only observed in the overall speed of responses to global (versus local) targets, and not in the level of interference produced by the global (versus local) distractors. These results are in line with recent findings of a dissociation between local/global processing bias and interference from local/global distractors, and support the claim that depression is associated with a reduction in the tendency to prioritise global-level processing.

  18. Fluoroscopy in paediatric fractures - Setting a local diagnostic reference level

    International Nuclear Information System (INIS)

    Pillai, A.; McAuley, A.; McMurray, K.; Jain, M.

    2006-01-01

    Background: The Ionising Radiation (Medical Exposure) Regulations 2000 have made it mandatory to establish diagnostic reference levels (DRLs) for all typical radiological examinations. Objectives: We attempt to provide dose data for some common fluoroscopic procedures used in orthopaedic trauma that may be used as the basis for setting DRLs for paediatric patients. Materials and methods: The dose-area product (DAP) in 865 paediatric trauma examinations was analysed. Median DAP values and screening times for each procedure type, along with quartile values for each range, are presented. Results: In the upper limb, elbow examinations had the maximum exposure, with a median DAP value of 1.21 cGy·cm². Median DAP values for forearm and wrist examinations were 0.708 and 0.538 cGy·cm², respectively. In the lower limb, tibia and fibula examinations had a median DAP value of 3.23 cGy·cm², followed by ankle examinations with a median DAP of 3.10 cGy·cm². The rounded third-quartile DAP value for each distribution can be used as a provisional DRL for the specific procedure type. (authors)
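
A minimal sketch of the DRL-setting rule this record describes: take the rounded third quartile of the dose-area-product (DAP) distribution for a procedure type. The readings below are illustrative, not the study's data.

```python
import numpy as np

dap_ankle = np.array([1.2, 2.8, 3.1, 3.4, 4.0, 2.9, 3.3, 5.1])  # DAP in cGy*cm^2
print("median DAP:", np.median(dap_ankle))                        # typical exposure
print("provisional DRL:", round(float(np.percentile(dap_ankle, 75)), 1))
```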

  19. Developing a set of consensus indicators to support maternity service quality improvement: using Core Outcome Set methodology including a Delphi process.

    Science.gov (United States)

    Bunch, K J; Allin, B; Jolly, M; Hardie, T; Knight, M

    2018-05-16

    To develop a core metric set to monitor the quality of maternity care. Delphi process followed by a face-to-face consensus meeting. English maternity units. Three representative expert panels: service designers, providers and users. Maternity care metrics judged important by participants. Participants were asked to complete a two-phase Delphi process, scoring metrics from existing local maternity dashboards. A consensus meeting discussed the results and re-scored the metrics. In all, 125 distinct metrics across six domains were identified from existing dashboards. Following the consensus meeting, 14 metrics met the inclusion criteria for the final core set, including: smoking rate at booking; rate of birth without intervention; caesarean section delivery rate in Robson group 1 women; caesarean section delivery rate in Robson group 2 women; caesarean section delivery rate in Robson group 5 women; third- and fourth-degree tear rate among women delivering vaginally; rate of postpartum haemorrhage of ≥1500 ml; rate of successful vaginal birth after a single previous caesarean section; smoking rate at delivery; and the proportion of babies born at term with an Apgar score improvement. The process achieved consensus on core metrics for monitoring the quality of maternity care. © 2018 The Authors. BJOG: An International Journal of Obstetrics and Gynaecology published by John Wiley & Sons Ltd on behalf of the Royal College of Obstetricians and Gynaecologists.

  20. Proceedings of the First International Research Workshop for Process Improvement in Small Settings, 2005

    Science.gov (United States)

    2006-01-01

    [Figure and table-of-contents residue; recoverable information: a continuous improvement cycle diagram showing an organization's project data bank ("Banco de Datos de la Organización"), project data, established improvements, and templates, credited to Alejandro Bedini G., "Calidad y conocimiento" ("Quality and Knowledge"), Cartagena de Indias, Colombia; report designation CMU/SEI-2006-SR…; section 4.3, "A Pattern-Based Approach to Deploy Process Improvements in Small Settings"]

  1. The Levels of Processing Conceptualization of Human Memory: Some Empirical and Theoretical Issues

    Science.gov (United States)

    1984-12-01

    The levels-of-processing (LOP) framework was introduced by Craik and Lockhart in 1972. … [reference-list fragments:] …, G. H. A multicomponent theory of the memory trace. In F. I. M. Craik and R. S. Lockhart, Levels of processing: A framework for memory research. … Lockhart, R. S. Levels of processing: A framework for memory research. Journal of Verbal Learning and Verbal Behavior, 1972, 11, 671-684. 25. Craik, F. …

  2. Level 1 Processing of MODIS Direct Broadcast Data From Terra

    Science.gov (United States)

    Lynnes, Christopher; Smith, Peter; Shotland, Larry; El-Ghazawi, Tarek; Zhu, Ming

    2000-01-01

    In February 2000, an effort was begun to adapt the Moderate Resolution Imaging Spectroradiometer (MODIS) Level 1 production software to process direct broadcast data. Three Level 1 algorithms have been adapted and packaged for release: Level 1A converts raw (Level 0) data into Hierarchical Data Format (HDF), unpacking packets into scans; Geolocation computes geographic information for the data points in the Level 1A; and Level 1B computes geolocated, calibrated radiances from the Level 1A and Geolocation products. One useful aspect of adapting the production software is the ability to incorporate enhancements contributed by the MODIS Science Team. We have therefore tried to limit changes to the software. However, in order to process the data immediately on receipt, we have taken advantage of a branch in the geolocation software that reads orbit and attitude information from the packets themselves, rather than from the external ancillary files used in standard production. We have also verified that the algorithms can be run with smaller time increments (2.5 minutes) than the five-minute increments used in production. To make the code easier to build and run, we have simplified directories and build scripts. Also, dependencies on a commercial numerics library have been replaced by public domain software. A version of the adapted code has been released for Silicon Graphics machines running IRIX. Perhaps owing to its origin in production, the software is rather CPU-intensive. Consequently, a port to Linux is underway, followed by a version to run on PC clusters, with an eventual goal of running in near-real-time (i.e., processing a ten-minute pass in ten minutes).

  3. County-Level Poverty Is Equally Associated with Unmet Health Care Needs in Rural and Urban Settings

    Science.gov (United States)

    Peterson, Lars E.; Litaker, David G.

    2010-01-01

    Context: Regional poverty is associated with reduced access to health care. Whether this relationship is equally strong in both rural and urban settings, or is affected by the contextual and individual-level characteristics that distinguish these areas, is unclear. Purpose: To compare the association of regional poverty with self-reported unmet…

  4. The Effects of Gravitation on the Inter-Media Agenda-Setting Central Process: The Case of the Murder of Hrant Dink

    Directory of Open Access Journals (Sweden)

    Cem YAŞIN

    2014-02-01

    Full Text Available While first-level agenda-setting research focuses on the transfer of issue salience from the media to the public agenda, second-level agenda-setting research examines the attributes emphasized in the news and their effect on the public agenda. Some of this research tends to analyze the media agenda. The influence of news media outlets on each other is studied by inter-media agenda-setting research at both the first and second levels; the same research also examines the effects of different types of media on each other. However, there is a lack of a systematic theoretical model, caused by differences in the aims of researchers and in their research objects. Another problem in inter-media agenda-setting research is that there has been no research on the agenda-setting effects of newspapers with different ideological and political identities. This research aims to scrutinize the inter-media agenda-setting effects among various newspapers that have different points of view. The research is designed to test the central gravitation effects of the mainstream newspapers. The murder of Hrant Dink is selected as a case study.

  5. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial.

    Science.gov (United States)

    Jiryaee, Nasrin; Siadat, Zahra Dana; Zamani, Ahmadreza; Taleban, Roya

    2015-10-01

    Designing an intervention to increase physical activity is important to be based on the health care setting's resources and to be acceptable to the subject group. This study was designed to assess and compare the effect of the goal-setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5. Mothers who had at least one child of 1-5 years were randomized into two groups. The effects of (1) the goal-setting strategy and (2) the group education method on increasing physical activity were assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Physical activity level increased significantly after the intervention in the goal-setting group, and it was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Our study presented the effects of using the goal-setting strategy to boost physical activity, improving the state of well-being and decreasing BMI, waist, and hip circumference.

  6. Decision making using AHP (Analytic Hierarchy Process) and fuzzy set theory in waste management

    International Nuclear Information System (INIS)

    Chung, J.Y.; Lee, K.J.; Kim, C.D.

    1995-01-01

    The major problem is how to consider the differences in opinions when many experts are involved in the decision-making process. This paper provides a simple, general methodology to treat such differences. The authors determined the grade of membership through a process of magnitude estimation derived from pairwise comparisons and the AHP developed by Saaty. They used fuzzy set theory to consider the differences in opinions and obtain the priorities for each alternative. An example, which can be applied to radioactive waste management, is also presented. The result shows good agreement with the results of averaging methods.
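
An illustrative sketch of the AHP step this record mentions (not the authors' fuzzy extension): priorities are taken from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio checks the judgments. The 3x3 matrix below is made up for demonstration.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],      # pairwise comparison matrix (Saaty scale)
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
vals, vecs = np.linalg.eig(A)
k = np.argmax(vals.real)            # principal eigenvalue
w = np.abs(vecs[:, k].real)
w /= w.sum()                        # normalized priority weights

n = A.shape[0]
ci = (vals[k].real - n) / (n - 1)   # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n] # Saaty's random index
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```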

  7. A topology optimization method based on the level set method for the design of negative permeability dielectric metamaterials

    DEFF Research Database (Denmark)

    Otomori, Masaki; Yamada, Takayuki; Izui, Kazuhiro

    2012-01-01

    This paper presents a level set-based topology optimization method for the design of negative permeability dielectric metamaterials. Metamaterials are artificial materials that display extraordinary physical properties that are unavailable with natural materials. The aim of the formulated optimization problem is to find optimized layouts of a dielectric material that achieve negative permeability. The presence of grayscale areas in the optimized configurations critically affects the performance of metamaterials, positively as well as negatively, but configurations that contain grayscale areas are highly impractical from an engineering and manufacturing point of view. Therefore, a topology optimization method that can obtain clear optimized configurations is desirable. Here, a level set-based topology optimization method incorporating a fictitious interface energy is applied to a negative …

  8. Setting Mechanical Properties of High Strength Steels for Rapid Hot Forming Processes

    Science.gov (United States)

    Löbbe, Christian; Hering, Oliver; Hiegemann, Lars; Tekkaya, A. Erman

    2016-01-01

    Hot stamping of sheet metal is an established method for the manufacturing of lightweight products with tailored properties. However, the generally applied continuous roller furnace manifests two crucial disadvantages: the overall process time is long, and a local setting of mechanical properties is only feasible through special cooling techniques. Hot forming with rapid heating directly before shaping is a new approach, which not only reduces the thermal intervention in the zones of critical formability and requested properties, but also allows the processing of an advantageous microstructure characterized by less grain growth, additional fractions (e.g., retained austenite), and undissolved carbides. Since the austenitization and homogenization process is strongly dependent on the microstructure constitution, the general applicability of the process-relevant parameters is unknown. Thus, different austenitization parameters are analyzed for the conventional high-strength steels 22MnB5, Docol 1400M, and DP1000 with respect to the resulting mechanical properties. In order to characterize the resulting microstructure, light optical and scanning electron microscopy, micro- and macro-hardness measurements, and X-ray diffraction are conducted subsequent to tensile tests. The investigation proves not only the feasibility of flexibly adjusting strength and ductility; unique microstructures are also observed and the governing mechanisms clarified. PMID:28773354

  9. EOS MLS Level 1B Data Processing, Version 2.2

    Science.gov (United States)

    Perun, Vincent; Jarnot, Robert; Pickett, Herbert; Cofield, Richard; Schwartz, Michael; Wagner, Paul

    2009-01-01

    A computer program performs Level 1B processing (the term 1B is explained below) of data from observations of the limb of the Earth by the Earth Observing System (EOS) Microwave Limb Sounder (MLS), which is an instrument aboard the Aura spacecraft. This software accepts, as input, the raw EOS MLS scientific and engineering data and the Aura spacecraft ephemeris and attitude data. Its output consists of calibrated instrument radiances and associated engineering and diagnostic data. [This software is one of several computer programs, denoted product generation executives (PGEs), for processing EOS MLS data. Starting from Level 0 (representing the aforementioned raw data), the PGEs and their data products are denoted by alphanumeric labels (e.g., 1B and 2) that signify the successive stages of processing.] At the time of this reporting, this software is at version 2.2 and incorporates improvements over a prior version that make the code more robust, improve calibration, provide more diagnostic outputs, improve the interface with the Level 2 PGE, and effect a 15-percent reduction in file sizes by use of data compression.

  10. Does Grade Level Matter for the Assessment of Business Process Management Maturity?

    Directory of Open Access Journals (Sweden)

    Gabryelczyk Renata

    2016-06-01

    Full Text Available The purpose of this paper is to create and test the practical application of a business process management maturity assessment conducted at two different grade levels (management and professional) in an organization. The conceptual framework for this research includes creating a business process maturity indicator (BPMI) for six process areas: strategy, documentation, optimization, implementation, execution, and controlling. The comparative analysis of business process management maturity is performed using the BPMI in two cases: within a single organization and across the sector.

  11. Signal Processing for Nondifferentiable Data Defined on Cantor Sets: A Local Fractional Fourier Series Approach

    Directory of Open Access Journals (Sweden)

    Zhi-Yong Chen

    2014-01-01

    Full Text Available From the signal processing point of view, nondifferentiable data defined on Cantor sets are investigated in this paper. The local fractional Fourier series is used to process the signals, which are local fractional continuous functions. Our results can be viewed as significant extensions of the previously known results for the Fourier series in the framework of local fractional calculus. Some examples are given to illustrate the efficiency and implementation of the present method.

  12. Binaural processing of modulated interaural level differences

    DEFF Research Database (Denmark)

    Thompson, Eric Robert; Dau, Torsten

    2008-01-01

    Two experiments are presented that measure the acuity of binaural processing of modulated interaural level differences (ILDs) using psychoacoustic methods. In both experiments, dynamic ILDs were created by imposing an interaurally antiphasic sinusoidal amplitude modulation (AM) signal on high … frequency, broadly tuned, bandpass-shaped patterns were obtained. Simulations with an existing binaural model show that a low-pass filter to limit the binaural temporal resolution is not sufficient to predict the results of the experiments.
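
A sketch of the stimulus idea in this abstract: an AM signal applied in antiphase at the two ears creates a sinusoidally varying ILD. The carrier frequency, modulation rate, and depth below are illustrative, not the study's parameters.

```python
import numpy as np

fs, dur = 44100, 0.5
t = np.arange(int(fs * dur)) / fs
carrier = np.sin(2 * np.pi * 4000 * t)   # high-frequency carrier tone
m = 0.8 * np.sin(2 * np.pi * 16 * t)     # 16-Hz modulator, depth 0.8

left = (1 + m) * carrier                 # AM in phase at the left ear
right = (1 - m) * carrier                # antiphasic AM at the right ear

# Instantaneous envelope ratio gives the time-varying ILD; report its peak.
ild_db = 20 * np.log10((1 + m.max()) / (1 - m.max()))
print(f"peak ILD ~ {ild_db:.1f} dB")
```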

  13. Basic set theory

    CERN Document Server

    Levy, Azriel

    2002-01-01

    An advanced-level treatment of the basics of set theory, this text offers students a firm foundation, stopping just short of the areas employing model-theoretic methods. Geared toward upper-level undergraduate and graduate students, it consists of two parts: the first covers pure set theory, including the basic notions, order and well-foundedness, cardinal numbers, the ordinals, and the axiom of choice and some of its consequences; the second deals with applications and advanced topics such as point set topology, real spaces, Boolean algebras, and infinite combinatorics and large cardinals. An …

  14. Sea-level changes on multiple spatial scales: estimates and contributing processes

    NARCIS (Netherlands)

    Frederikse, T.

    2018-01-01

    Being one of the major consequences of anthropogenic climate change, sea-level rise forms a threat to many coastal areas and their inhabitants. Because all processes that cause sea-level changes have a spatially varying fingerprint, local sea-level changes deviate substantially from the global mean.

  15. Study on default setting for risk-informed regulation

    International Nuclear Information System (INIS)

    Jang, S.C.; Ha, J.J.; Jung, W.D.; Jeong, K.S.; Han, S.H.

    1998-12-01

    Both performing and validating a detailed risk analysis of a complex system are costly and time-consuming undertakings. With the increased use of probabilistic safety analysis (PSA) in regulatory decision making, both regulated parties and regulators have generally favored the use of defaults, because they can greatly facilitate the process of performing a PSA in the first place as well as the process of reviewing and verifying the PSA. The use of defaults may also ensure more uniform standards of PSA quality. However, regulatory agencies differ in their approaches to the use of default values, and the implications of these differences are not yet well understood. Moreover, large heterogeneity among licensees makes it difficult to set suitable defaults. This study focuses on the development of a model for setting defaults in order to achieve greater applicability of risk-informed regulation. In particular, the effects of different levels of conservatism in setting defaults are explored, along with their implications for the crafting of regulatory incentives. (author). 17 refs., 1 tab

  16. Computing the dynamics of biomembranes by combining conservative level set and adaptive finite element methods

    OpenAIRE

    Laadhari, Aymen; Saramito, Pierre; Misbah, Chaouqi

    2014-01-01

    The numerical simulation of the deformation of vesicle membranes under simple shear external fluid flow is considered in this paper. A new saddle-point approach is proposed for the imposition of the fluid incompressibility and the membrane inextensibility constraints, through Lagrange multipliers defined in the fluid and on the membrane, respectively. Using a level set formulation, the problem is approximated by mixed finite elements combined with an automatic adaptive …

  17. Bin Set 1 Calcine Retrieval Feasibility Study

    International Nuclear Information System (INIS)

    Adams, R.D.; Berry, S.M.; Galloway, K.J.; Langenwalter, T.A.; Lopez, D.A.; Noakes, C.M.; Peterson, H.K.; Pope, M.I.; Turk, R.J.

    1999-01-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium-bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959 and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase

  18. Bin Set 1 Calcine Retrieval Feasibility Study

    Energy Technology Data Exchange (ETDEWEB)

    R. D. Adams; S. M. Berry; K. J. Galloway; T. A. Langenwalter; D. A. Lopez; C. M. Noakes; H. K. Peterson; M. I. Pope; R. J. Turk

    1999-10-01

    At the Department of Energy's Idaho Nuclear Technology and Engineering Center, as an interim waste management measure, both mixed high-level liquid waste and sodium-bearing waste have been solidified by a calcination process and are stored in the Calcine Solids Storage Facilities. This calcined product will eventually be treated to allow final disposal in a national geologic repository. The Calcine Solids Storage Facilities comprise seven ''bin sets.'' Bin Set 1, the first to be constructed, was completed in 1959 and has been in service since 1963. It is the only bin set that does not meet current safe-shutdown earthquake seismic criteria. In addition, it is the only bin set that lacks built-in features to aid in calcine retrieval. One option to alleviate the seismic compliance issue is to transport the calcine from Bin Set 1 to another bin set which has the required capacity and which is seismically qualified. This report studies the feasibility of retrieving the calcine from Bin Set 1 and transporting it into Bin Set 6, which is located approximately 650 feet away. Because Bin Set 1 was not designed for calcine retrieval, and because of the high radiation levels and potential contamination spread from the calcined material, this is a challenging engineering task. This report presents preconceptual design studies for remotely operated, low-density, pneumatic vacuum retrieval and transport systems and equipment that are based on past work performed by the Raytheon Engineers and Constructors architectural engineering firm. The designs presented are considered feasible; however, future development work will be needed in several areas during the subsequent conceptual design phase.

  19. Natural setting of Japanese islands and geologic disposal of high-level waste

    International Nuclear Information System (INIS)

    Koide, Hitoshi

    1991-01-01

    The Japanese islands are a combination of arcuate islands along the boundaries between four major plates: the Eurasia, North America, Pacific, and Philippine Sea plates. The interaction among the four plates has formed complex geological structures which are basically patchworks of small blocks of land and sea-floor sediments piled up by the subduction of oceanic plates along the margin of the Eurasian continent. Although frequent earthquakes and volcanic eruptions clearly indicate active crustal deformation, the distribution of active faults and volcanoes is regionally localized in the Japanese islands. Crustal displacement faster than 1 mm/year takes place only in restricted regions near plate boundaries or close to major active faults. Volcanic activity is absent in the region between the volcanic front and the subduction zone. Site selection is therefore especially important in Japan. The scenarios for the long-term performance assessment of high-level waste disposal are discussed with special reference to the geological setting of Japan. The long-term prediction of tectonic disturbance, evaluation of faults and fractures in rocks, and estimation of long-term water-rock interaction are key issues in the performance assessment of high-level waste disposal in the Japanese islands. (author)

  20. Methodology for setting the reference levels in the measurements of the dose rate absorbed in air due to the environmental gamma radiation

    International Nuclear Information System (INIS)

    Dominguez Ley, Orlando; Capote Ferrera, Eduardo; Caveda Ramos, Celia; Alonso Abad, Dolores

    2008-01-01

    Full text: The methodology for setting the reference levels for measurements of the gamma dose rate absorbed in air is described. The registration level was obtained using statistical methods. To set the alarm levels, it was necessary to begin from a certain affectation level, which activates the investigation operation mode when it is reached. It was then necessary to transform this affectation level into values of the indicators selected for signalling an alarm in the network, allowing direct comparison and, at the same time, greater operability. The affectation level was assumed to be an effective dose of 1 mSv/y, which is the international dose limit for the public. The conversion factor obtained in a practical way as a consequence of the Chernobyl accident was assumed, converting the value of annual effective dose into values of effective dose rate in air. These factors are the most important in our work, since the main task of the National Network of Environmental Radiological Surveillance of the Republic of Cuba is to detect accidents with regional-scale effects, and the Chernobyl accident is precisely an example of pollution at that scale. The alarm level setting was based on the results obtained in the first year after the Chernobyl accident. For this purpose, some transformations were made. In the final results, a correction factor was introduced depending on the season of the year in which the measurement was made, taking into account the influence of different meteorological events on the measurement of this indicator. (author)
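
A back-of-the-envelope step implied by the methodology above: spreading the 1 mSv/y public dose limit uniformly over a year gives the order of magnitude of the dose-rate increment against which network indicators can be compared. (A complete conversion also needs effective-dose-to-air-kerma factors and background subtraction, which are not modelled here.)

```python
hours_per_year = 365.25 * 24              # about 8766 h
annual_limit_usv = 1000.0                 # 1 mSv/y expressed in microsieverts
print(f"{annual_limit_usv / hours_per_year:.3f} uSv/h")  # about 0.114 uSv/h
```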

  1. Low level image processing techniques using the pipeline image processing engine in the flight telerobotic servicer

    Science.gov (United States)

    Nashman, Marilyn; Chaconas, Karen J.

    1988-01-01

    The sensory processing system for the NASA/NBS Standard Reference Model (NASREM) for telerobotic control is described. This control system architecture was adopted by NASA for the Flight Telerobotic Servicer. The control system is hierarchically designed and consists of three parallel systems: task decomposition, world modeling, and sensory processing. The Sensory Processing System is examined, and in particular the image processing hardware and software used to extract features at low levels of sensory processing are described, for tasks representative of those envisioned for the Space Station such as assembly and maintenance.

  2. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    Energy Technology Data Exchange (ETDEWEB)

    D.L. McGregor

    2000-12-20

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process.

  3. FEATURES, EVENTS, AND PROCESSES: SYSTEM-LEVEL AND CRITICALITY

    International Nuclear Information System (INIS)

    D.L. McGregor

    2000-01-01

    The primary purpose of this Analysis/Model Report (AMR) is to identify and document the screening analyses for the features, events, and processes (FEPs) that do not easily fit into the existing Process Model Report (PMR) structure. These FEPs include the 31 FEPs designated as System-Level Primary FEPs and the 22 FEPs designated as Criticality Primary FEPs. A list of these FEPs is provided in Section 1.1. This AMR (AN-WIS-MD-000019) documents the Screening Decision and Regulatory Basis, Screening Argument, and Total System Performance Assessment (TSPA) Disposition for each of the subject Primary FEPs. This AMR provides screening information and decisions for the TSPA-SR report and provides the same information for incorporation into a project-specific FEPs database. This AMR may also assist reviewers during the licensing-review process

  4. [Validity assessment of a low level phonological processing test material in preschool children].

    Science.gov (United States)

    Ptok, M; Altwein, F

    2012-07-01

    The BISC (Bielefelder Screening) is a German test to evaluate phonological skills believed to be a prerequisite for future reading and writing skills. BISC results may indicate an elevated risk for dyslexia. Our research group has put forward test material in order to specifically examine low-level phonological processing (LLPP). In this study, we analysed whether BISC and low-level phonological processing correlate. A retrospective correlation analysis was carried out on primary school children's test results from the BISC and the newly developed low-level phonological processing test material. Spearman's rho was 0.52 between total LLPP and total BISC. The subscales correlated with a rho below 0.5. The results indicate that low-level phonological processing and higher-level phonological processing can be differentiated. Future studies will have to clarify whether these results can be used to construct specifically targeted therapy strategies and whether the LLPP test material can also be used to assess the risk of subsequent dyslexia. © Georg Thieme Verlag KG Stuttgart · New York.

  5. The Daily Events and Emotions of Master's-Level Family Therapy Trainees in Off-Campus Practicum Settings

    Science.gov (United States)

    Edwards, Todd M.; Patterson, Jo Ellen

    2012-01-01

    The Day Reconstruction Method (DRM) was used to assess the daily events and emotions of one program's master's-level family therapy trainees in off-campus practicum settings. This study examines the DRM reports of 35 family therapy trainees in the second year of their master's program in marriage and family therapy. Four themes emerged from the…

  6. Automated defect spatial signature analysis for semiconductor manufacturing process

    Science.gov (United States)

    Tobin, Jr., Kenneth W.; Gleason, Shaun S.; Karnowski, Thomas P.; Sari-Sarraf, Hamed

    1999-01-01

    An apparatus and method for performing automated defect spatial signature analysis on a data set representing defect coordinates and wafer processing information includes categorizing data from the data set into a plurality of high-level categories, classifying the categorized data contained in each high-level category into user-labeled signature events, and correlating the categorized, classified signature events to a present or incipient anomalous process condition.
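
A hedged stand-in for the grouping step this patent abstract describes (not the patented classifier itself): cluster wafer-defect coordinates with DBSCAN so that spatially coherent signature events, such as a scratch, separate from random background defects. All coordinates below are synthetic.

```python
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(1)
scratch = np.column_stack([np.linspace(10, 60, 40), np.linspace(20, 25, 40)])
noise = rng.uniform(0, 100, size=(60, 2))
defects = np.vstack([scratch, noise])            # (x, y) defect coordinates

labels = DBSCAN(eps=3.0, min_samples=5).fit_predict(defects)
for lab in sorted(set(labels) - {-1}):           # -1 marks unclustered background
    pts = defects[labels == lab]
    print(f"signature event {lab}: {len(pts)} defects near {pts.mean(0).round(1)}")
```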

  7. Subliminally and consciously induced cognitive conflicts interact at several processing levels.

    Science.gov (United States)

    Stock, Ann-Kathrin; Friedrich, Julia; Beste, Christian

    2016-12-01

    Controlled behavior is susceptible to conflicts that can emerge from subliminal or consciously processed information. While research suggests that both sources of conflicting information may interact in their modulation of controlled behavior, it has remained unclear which cognitive sub-processes involved in controlled behavior are affected by this interaction; i.e., at which processing level subliminally and consciously induced response conflicts interact in modulating controlled behavior. Moreover, we investigated whether this interaction of subliminally and consciously induced response conflicts was due to a nexus between the two types of conflict, such as a common cognitive process or factor. For this, n = 38 healthy young subjects completed a paradigm which combines subliminal primes and consciously perceived flankers while an electroencephalogram (EEG) was recorded. We show that the interaction of subliminal and conscious sources of conflict is not restricted to the response selection level (N2) but can already be shown at the earliest stages of perceptual and attentional processing (P1). While the degree of early attentional processing of subliminal information seems to depend on the absence of consciously perceived response conflicts, conflicts during the stage of response selection may be either reduced or enhanced by subliminal priming. Moreover, the results showed that even though the two different sources of conflict interact at the response selection level, they clearly originate from two distinct processes that interact before they detrimentally affect cognitive control. Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Robust nuclei segmentation in cyto-histopathological images using statistical level set approach with topology preserving constraint

    Science.gov (United States)

    Taheri, Shaghayegh; Fevens, Thomas; Bui, Tien D.

    2017-02-01

    Computerized assessments for diagnosis or malignancy grading of cyto-histopathological specimens have drawn increased attention in the field of digital pathology. Automatic segmentation of cell nuclei is a fundamental step in such automated systems. Despite considerable research, nuclei segmentation is still a challenging task due to noise, nonuniform illumination, and, most importantly in 2D projection images, overlapping and touching nuclei. In most published approaches, nuclei refinement is a post-processing step after segmentation, which usually refers to the task of detaching aggregated nuclei or merging over-segmented nuclei. In this work, we present a novel segmentation technique which effectively addresses the problem of individually segmenting touching or overlapping cell nuclei during the segmentation process. The proposed framework is a region-based segmentation method, which consists of three major modules: (i) the image is passed through a color deconvolution step to extract the desired stains; (ii) the generalized fast radial symmetry (GFRS) transform is then applied to the image, followed by non-maxima suppression, to specify the initial seed points for nuclei and their corresponding GFRS ellipses, which are interpreted as the initial nuclei borders for segmentation; (iii) finally, these initial nuclei border curves are evolved using a statistical level set approach with topology-preserving criteria for simultaneous segmentation and separation of nuclei. The proposed method is evaluated using Hematoxylin and Eosin and fluorescence-stained images, performing qualitative and quantitative analysis, showing that the method outperforms thresholding and watershed segmentation approaches.
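
A rough analogue of stages (i) and (iii) from this abstract, built from off-the-shelf scikit-image pieces: colour deconvolution to isolate the nuclear stain, then a region-based level set evolution. Plain Chan-Vese stands in for the authors' statistical level set with topology preservation, so this sketch will not separate touching nuclei.

```python
import numpy as np
from skimage import data, color, segmentation

rgb = data.immunohistochemistry()                 # sample stained tissue image
hed = color.rgb2hed(rgb)                          # (i) colour deconvolution
hematoxylin = hed[..., 0]                         # nuclear stain channel
hematoxylin = (hematoxylin - hematoxylin.min()) / np.ptp(hematoxylin)

# (iii) region-based level set evolution (morphological Chan-Vese variant)
mask = segmentation.morphological_chan_vese(hematoxylin, 50, smoothing=2)
print("foreground fraction:", mask.mean().round(3))
```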

  9. Development of Problem Sets for K-12 and Engineering on Pharmaceutical Particulate Systems

    Science.gov (United States)

    Savelski, Mariano J.; Slater, C. Stewart; Del Vecchio, Christopher A.; Kosteleski, Adrian J.; Wilson, Sarah A.

    2010-01-01

    Educational problem sets have been developed on structured organic particulate systems (SOPS) used in pharmaceutical technology. The sets present topics such as particle properties and powder flow and can be integrated into K-12 and college-level curricula. The materials educate students in specific areas of pharmaceutical particulate processing,…

  10. Specifying the Mechanisms in a Levels-of-Processing Approach to Memory

    Science.gov (United States)

    Klein, Kitty; Saltz, Eli

    1976-01-01

    Craik and Lockhart's (1972) levels-of-processing theory has spurred new interest in semantic processing as a factor in memory, particularly with regard to free recall following incidental learning. However, their formulation lacks a clear description of the operations and structures involved in semantic processing. This research outlines a…

  11. Segmenting the Parotid Gland using Registration and Level Set Methods

    DEFF Research Database (Denmark)

    Hollensen, Christian; Hansen, Mads Fogtmann; Højgaard, Liselotte

    … The method was evaluated on a test set consisting of 8 corresponding data sets. The attained total volume Dice coefficient and mean Hausdorff distance were 0.61 ± 0.20 and 15.6 ± 7.4 mm, respectively. The method has improvement potential which could be exploited to allow clinical introduction. …

  12. Design and Real Time Implementation of CDM-PI Control System in a Conical Tank Liquid Level Process

    Directory of Open Access Journals (Sweden)

    P. K. Bhaba

    2011-10-01

    Full Text Available The work focuses on the design and real-time implementation of a Coefficient Diagram Method (CDM) based PI (CDM-PI) control system for a Conical Tank Liquid Level Process (CTLLP), which exhibits severe static nonlinear characteristics. By taking this static nonlinearity into account, a Wiener Model (WM) based CDM-PI control system is developed and implemented in real-time operation. The performance of this control system for set-point tracking and load disturbance rejection is studied. In addition, the performance is compared with other WM based PI controllers. Real-time results clearly show that the WM based CDM-PI control system outperforms the others.
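
A minimal level-control sketch for a conical tank (not the authors' CDM tuning or Wiener model): because the cross-sectional area grows with the level, the plant gain varies with the operating point, which is the static nonlinearity the paper targets. All plant parameters and PI gains below are assumed for illustration.

```python
import numpy as np

H, R = 1.0, 0.3                      # tank height and top radius (m), assumed
cv = 2e-3                            # outflow coefficient (assumed)
Kp, Ki = 5e-3, 2e-3                  # hand-tuned PI gains for this sketch
dt, h, integ, setpoint = 0.1, 0.2, 0.0, 0.6

for _ in range(20000):               # 2000 s of simulated time
    error = setpoint - h
    integ += error * dt
    q_in = float(np.clip(Kp * error + Ki * integ, 0.0, 5e-3))  # pump limits
    area = np.pi * (R * h / H) ** 2  # level-dependent cross-section of the cone
    h += dt * (q_in - cv * np.sqrt(max(h, 0.0))) / max(area, 1e-6)

print(f"final level: {h:.3f} m (set point {setpoint} m)")
```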

  13. A comparison between the first-fit settings of two multichannel digital signal-processing strategies: music quality ratings and speech-in-noise scores.

    Science.gov (United States)

    Higgins, Paul; Searchfield, Grant; Coad, Gavin

    2012-06-01

    The aim of this study was to determine which level-dependent hearing aid digital signal-processing (DSP) strategy participants preferred when listening to music and/or performing a speech-in-noise task. Two receiver-in-the-ear hearing aids were compared: one using 32-channel adaptive dynamic range optimization (ADRO) and the other wide dynamic range compression (WDRC) incorporating dual fast (4-channel) and slow (15-channel) processing. The manufacturers' first-fit settings based on participants' audiograms were used in both cases. Results were obtained from 18 participants on a quick speech-in-noise (QuickSIN; Killion, Niquette, Gudmundsen, Revit, & Banerjee, 2004) task and for 3 music listening conditions (classical, jazz, and rock). Participants preferred the quality of music and performed better on the QuickSIN task using the hearing aids with ADRO processing. A potential reason for the better performance of the ADRO hearing aids was less fluctuation in output with changes in sound dynamics. ADRO processing has advantages for both music quality and speech recognition in noise over the multichannel WDRC processing used in the study. Further evaluations of which DSP aspects contribute to listener preference are required.
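
For orientation, a toy static input/output rule for WDRC, the strategy compared against ADRO in this study: linear gain below a compression knee, then a compressive slope of 1/ratio above it. The gain, knee, and ratio values are illustrative, not the fitted devices' parameters.

```python
import numpy as np

def wdrc_output_db(level_db, gain_db=30.0, knee_db=45.0, ratio=3.0):
    """Linear gain below the knee; compressive slope 1/ratio above it."""
    level_db = np.asarray(level_db, dtype=float)
    linear = level_db + gain_db
    compressed = knee_db + gain_db + (level_db - knee_db) / ratio
    return np.where(level_db > knee_db, compressed, linear)

for lv in (30, 45, 60, 75, 90):
    print(f"in {lv} dB SPL -> out {float(wdrc_output_db(lv)):.0f} dB SPL")
```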

  14. Transport equations, Level Set and Eulerian mechanics. Application to fluid-structure coupling

    International Nuclear Information System (INIS)

    Maitre, E.

    2008-11-01

    My work was devoted to the numerical analysis of non-linear elliptic-parabolic equations, to the neutron transport equation, and to the simulation of fabric draping. More recently I developed an Eulerian method based on a level set formulation of the immersed boundary method to deal with fluid-structure coupling problems arising in bio-mechanics. Some of the more efficient algorithms for solving the neutron transport equation make use of a splitting of the transport operator that takes its characteristics into account. In the present work we introduce a new algorithm based on this splitting and an adaptation of minimal residual methods to the infinite-dimensional case. We present the cases where the velocity space is of dimension 1 (slab geometry) and 2 (plane geometry), because the splitting is simpler in the former.
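
For readers new to the level set formulation mentioned here, the interface motion is governed by the standard transport equation (general background, not a result specific to this thesis):

```latex
\[
  \frac{\partial \phi}{\partial t} + u \cdot \nabla \phi = 0,
\]
```

where \(\phi\) is a signed distance function, \(u\) is the fluid velocity, and the immersed structure is tracked as the zero level set \(\{\phi = 0\}\).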

  15. Setting the stage for master's level success

    Science.gov (United States)

    Roberts, Donna

    Comprehensive reading, writing, research, and study skills play a critical role in a graduate student's success and ability to contribute to a field of study effectively. The literature indicated a need to support graduate student success in the areas of mentoring and navigation, as well as research and writing. The purpose of this two-phase mixed-methods explanatory study was to examine factors that characterize student success at the Master's level in the fields of education, sociology, and social work. The study was grounded in a transformational learning framework which focused on three levels of learning: technical knowledge, practical or communicative knowledge, and emancipatory knowledge. The study included two data collection points. Phase one consisted of a Master's Level Success questionnaire that was sent via Qualtrics to graduate-level students at three colleges and universities in the Central Valley of California: a California State University campus, a University of California campus, and a private college campus. The results of the chi-square analysis indicated that seven questionnaire items were significant, with p values less than .05. Phase two of the data collection consisted of semi-structured interviews, from which three themes emerged using Dedoose software: (1) the need for more language and writing support at the Master's level, (2) the need for mentoring, especially for second-language learners, and (3) utilizing the strong influence of faculty in student success. It is recommended that institutions continually assess and strengthen their programs to meet the full range of learners and to support students to degree completion.

  16. Directions in low-level radioactive waste management. The siting process: establishing a low-level waste-disposal facility

    International Nuclear Information System (INIS)

    1982-11-01

    The siting of a low-level radioactive waste disposal facility encompasses many interrelated activities and, therefore, is inherently complex. The purpose of this publication is to assist state policymakers in understanding the nature of the siting process. Initial discussion focuses on the primary activities that require coordination during a siting effort. Available options for determining site development, licensing, regulating, and operating responsibilities are then considered. Additionally, the document calls attention to technical services available from federal agencies to assist states in the siting process; responsibilities of such agencies are also explained. The appendices include a conceptual plan for scheduling siting activities and an explanation of the process for acquiring agreement state status. An agreement state takes responsibility for licensing and regulating a low-level waste facility within its borders

  17. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. The just-in-time (JIT) detection method and the k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal behavior. Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data point is normal or not. Note that the control limit changes as the database is updated, yielding an adaptive fault detection technique that can effectively eliminate the impact of data drift and shift on detection performance, which is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
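
A hedged sketch of the detection ingredient the abstract combines (synthetic data and an illustrative threshold, not the paper's algorithm verbatim): a KNN distance statistic with an empirical control limit for online detection. The paper additionally uses Mahalanobis distance to prune and update the reference data set, which is omitted here for brevity.

```python
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(2)
train = rng.normal(0.0, 1.0, size=(300, 4))        # normal operating data

k = 5
d = np.sort(cdist(train, train), axis=1)[:, 1:k + 1].mean(axis=1)  # skip self
limit = np.quantile(d, 0.99)                       # empirical 99% control limit

def is_faulty(x):
    """Flag a sample whose mean distance to its k nearest neighbors exceeds the limit."""
    dq = np.sort(cdist(x[None, :], train)[0])[:k].mean()
    return dq > limit

print(is_faulty(np.zeros(4)))      # expected False (inside the normal region)
print(is_faulty(np.full(4, 4.0)))  # expected True (far from the normal region)
```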

  18. The Impact of Process Instructions on Judges' Use of Examinee Performance Data in Angoff Standard Setting Exercises

    Science.gov (United States)

    Mee, Janet; Clauser, Brian E.; Margolis, Melissa J.

    2013-01-01

    Despite being widely used and frequently studied, the Angoff standard setting procedure has received little attention with respect to an integral part of the process: how judges incorporate examinee performance data in the decision-making process. Without performance data, subject matter experts have considerable difficulty accurately making the…

  19. Process Development And Simulation For Cold Fabrication Of Doubly Curved Metal Plate By Using Line Array Roll Set

    International Nuclear Information System (INIS)

    Shim, D. S.; Jung, C. G.; Seong, D. Y.; Yang, D. Y.; Han, J. M.; Han, M. S.

    2007-01-01

    For effective manufacturing of doubly curved sheet metal, a novel sheet metal forming process is proposed. The suggested process uses a Line Array Roll Set (LARS) composed of a pair of upper and lower roll assemblies in a symmetric manner. The process offers flexibility compared with conventional manufacturing processes, because it does not require any complex-shaped die, and loss of material by blank-holding is minimized. LARS allows the flexibility of an incremental forming process and adopts the principle of bending deformation, resulting in only slight deformation in thickness. The rolls composing the line array roll sets are divided into a driving roll row and two idle roll rows. The arrayed rolls in the central lines of the upper and lower roll assemblies are motor-driven so that they deform and transfer the sheet metal using friction between the rolls and the sheet metal. The remaining rolls are idle rolls, generating bending deformation together with the driving rolls. Furthermore, all the rolls are movable in any direction so that they are adaptable to any size or shape of the desired three-dimensional configuration. In the process, the sheet is deformed incrementally, as deformation proceeds step by step in the rolling and transverse directions simultaneously. Consequently, the process can be applied to the fabrication of doubly curved ship hull plates by carrying out several passes. In this work, FEM simulations are carried out to verify the proposed incremental forming system using the chosen design parameters. Based on the results of the simulations, the relationship between the roll set configuration and the curvature of the sheet metal is determined. Process information such as the forming loads and torques acting on every roll is analyzed as important data for the design and development of the manufacturing system

  20. The Influence of Level of Processing on Advertising Repetition Effects.

    OpenAIRE

    Nordhielm, Christie L

    2002-01-01

    This research examines whether repetition of features of a stimulus is subject to wear-out effects that have until now only been tested for the stimulus as a whole. When consumers process features in either a shallower or deeper manner, the level of processing performed dictates the effect of repeated feature exposure on their judgments. When repeated exposures to features are processed in a shallower fashion, there is an enhancement in evaluations with no subsequent downturn, whereas …

  1. Development of a pyro-partitioning process for long-lived radioactive nuclides. Process test for pretreatment of simulated high-level waste containing uranium

    International Nuclear Information System (INIS)

    Kurata, Masateru; Hijikata, Takatoshi; Kinoshita, Kensuke; Inoue, Tadashi

    2000-01-01

    A pyro-partitioning process developed at CRIEPI requires a pre-treatment process to convert high-level liquid waste to chloride. A combination process of denitration and chlorination has been developed for this purpose. Continuous process tests using simulated high-level waste were performed to certify the applicability of the process. Test results indicated a successful material balance sufficient for satisfying pyro-partitioning process criteria. In the present study, process tests using simulated high-level waste containing uranium were also carried out to prove that the pre-treatment process is feasible for uranium. The results indicated that uranium can be converted to chloride appropriate for the pyro-partitioning process. The material balance obtained from the tests is to be used to revise the process flow diagram. (author)

  2. Axiomatic set theory

    CERN Document Server

    Suppes, Patrick

    1972-01-01

    This clear and well-developed approach to axiomatic set theory is geared toward upper-level undergraduates and graduate students. It examines the basic paradoxes and history of set theory and advanced topics such as relations and functions, equipollence, finite sets and cardinal numbers, rational and real numbers, and other subjects. 1960 edition.

  3. Set-Valued Stochastic Equation with Set-Valued Square Integrable Martingale

    Directory of Open Access Journals (Sweden)

    Li Jun-Gang

    2017-01-01

    Full Text Available In this paper, we shall introduce the stochastic integral of a stochastic process with respect to a set-valued square integrable martingale. Then we shall give the Aumann integral measurability theorem, and give the set-valued stochastic Lebesgue integral and the set-valued square integrable martingale integral equation. The existence and uniqueness of the solution to the set-valued stochastic integral equation are proved. The discussion will be useful in optimal control and in mathematical finance involving psychological factors.

  4. SETS reference manual

    International Nuclear Information System (INIS)

    Worrell, R.B.

    1985-05-01

    The Set Equation Transformation System (SETS) is used to achieve the symbolic manipulation of Boolean equations. Symbolic manipulation involves changing equations from their original forms into more useful forms - particularly by applying Boolean identities. The SETS program is an interpreter which reads, interprets, and executes SETS user programs. The user writes a SETS user program specifying the processing to be achieved and submits it, along with the required data, for execution by SETS. Because of the general nature of SETS, i.e., the capability to manipulate Boolean equations regardless of their origin, the program has been used for many different kinds of analysis
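
SETS itself is a dedicated interpreter with its own user-program language; as a modern stand-in for the kind of symbolic Boolean manipulation the record describes, the sketch below applies Boolean identities with sympy (an illustrative analogue, not the SETS language):

```python
from sympy import symbols
from sympy.logic.boolalg import simplify_logic, to_dnf

a, b, c = symbols("a b c")
top = (a & b) | (a & (b | c)) | b            # raw top-event expression
print(to_dnf(top))                            # expand to disjunctive normal form
print(simplify_logic(top, form="dnf"))        # apply identities -> b | (a & c)
```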

  5. Ceramic process and plant design for high-level nuclear waste immobilization

    International Nuclear Information System (INIS)

    Grantham, L.F.; McKisson, R.L.; De Wames, R.E.; Guon, J.; Flintoff, J.F.; McKenzie, D.E.

    1983-01-01

    In the last 3 years, significant advances in ceramic technology for high-level nuclear waste solidification have been made. Product quality in terms of leach-resistance, compositional uniformity, structural integrity, and thermal stability promises to be superior to borosilicate glass. This paper addresses the process effectiveness and preliminary designs for glass and ceramic immobilization plants. The reference two-step ceramic process utilizes fluid-bed calcination (FBC) and hot isostatic press (HIP) consolidation. Full-scale demonstration of these well-developed processing steps has been established at DOE and/or commercial facilities for processing radioactive materials. Based on Savannah River-type waste, our model predicts that the capital and operating cost for the solidification of high-level nuclear waste is about the same for the ceramic and glass options. However, when repository costs are included, the ceramic option potentially offers significantly better economics due to its high waste loading and volume reduction. Volume reduction impacts several figures of merit in addition to cost such as system logistics, storage, transportation, and risk. The study concludes that the ceramic product/process has many potential advantages, and rapid deployment of the technology could be realized due to full-scale demonstrations of FBC and HIP technology in radioactive environments. Based on our finding and those of others, the ceramic innovation not only offers a viable backup to the glass reference process but promises to be a viable future option for new high-level nuclear waste management opportunities

  6. The triangle formed by framing, agenda-setting and metacoverage

    Directory of Open Access Journals (Sweden)

    Martín Oller Alonso

    2014-06-01

    Full Text Available The study of framing is a process that takes place at different levels: in the culture; in the minds of the elite and media professionals; in the text of the information; and in the minds of citizens as individuals. Therefore, framing is an individual psychological process, but also an organizational process, a product, and a tool of strategy. In this article the author has conducted a review of concepts such as agenda-setting, metacoverage, gatekeeping, and priming that are evolving from the classic study of framing.

  7. Neighborhood-level social processes and substantiated cases of child maltreatment.

    Science.gov (United States)

    Molnar, Beth E; Goerge, Robert M; Gilsanz, Paola; Hill, Andrea; Subramanian, S V; Holton, John K; Duncan, Dustin T; Beatriz, Elizabeth D; Beardslee, William R

    2016-01-01

    Child maltreatment is a preventable public health problem. Research has demonstrated that neighborhood structural factors (e.g., poverty, crime) can influence the proportion of a neighborhood's children who are victims of maltreatment. A newer strategy is the identification of potentially modifiable social processes at the neighborhood level that can also influence maltreatment. Toward this end, this study examines neighborhood-level data (maltreatment cases substantiated by Illinois' child protection agency, 1995-2005; social processes measured by the Project on Human Development in Chicago Neighborhoods; U.S. Census data; proportions of neighborhoods on public assistance; and crime data) that were linked across clusters of contiguous Chicago, IL census tracts that are relatively homogenous with respect to racial/ethnic and socioeconomic composition. Our analysis, an ecological-level, repeated cross-sectional design utilizing random-intercept logit models, with a sensitivity analysis using spatial models to control for spatial autocorrelation, revealed consistent associations between neighborhood social processes and maltreatment. Neighborhoods higher in collective efficacy, intergenerational closure, and social networks, and lower in disorder, had lower proportions of substantiated cases of neglect, physical abuse, and sexual abuse, controlling for differences in structural factors. Higher collective efficacy and social network size also predicted a lower proportion of substance-exposed infants. This research indicates that strategies to mobilize neighborhood-level protective factors may decrease child maltreatment more effectively than individual and family-focused efforts alone. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Effect of Bread Making Process on Aflatoxin Level Changes

    OpenAIRE

    Jafar Milani; Seyed Saman Seyed Nazari; Elmira Bamyar; Gisou Maleki

    2014-01-01

    Wheat flour is a commodity with a high risk of aflatoxin (AF) contamination. During bread making there are many processing steps that can affect AF stability. The effect of the bread-making process using different yeast types on AF levels was investigated. For this purpose, standards of AFs, including the B and G groups, were added to flour and then bread loaves were prepared. Three types of commercially available yeast, including active dry yeast, instant dry yeast and compressed yeast, were used for dough...

  9. Comparing of goal setting strategy with group education method to increase physical activity level: A randomized trial

    Directory of Open Access Journals (Sweden)

    Nasrin Jiryaee

    2015-01-01

    Full Text Available Background: An intervention to increase physical activity should be designed around the health care setting's resources and be acceptable to the subject group. This study was designed to assess and compare the effect of a goal-setting strategy with a group education method on increasing the physical activity of mothers of children aged 1 to 5. Materials and Methods: Mothers who had at least one child of 1-5 years were randomized into two groups. The effects of (1) the goal-setting strategy and (2) the group education method on increasing physical activity were assessed and compared 1 month and 3 months after the intervention. Also, the weight, height, body mass index (BMI), waist and hip circumference, and well-being were compared between the two groups before and after the intervention. Results: Physical activity level increased significantly after the intervention in the goal-setting group and was significantly different between the two groups after the intervention (P < 0.05). BMI, waist circumference, hip circumference, and well-being score were significantly different in the goal-setting group after the intervention. In the group education method, only the well-being score improved significantly (P < 0.05). Conclusion: Our study demonstrated the effects of using the goal-setting strategy to boost physical activity, improving the state of well-being and decreasing BMI, waist, and hip circumference.

  10. The Algorithm Theoretical Basis Document for Level 1A Processing

    Science.gov (United States)

    Jester, Peggy L.; Hancock, David W., III

    2012-01-01

    The first process of the Geoscience Laser Altimeter System (GLAS) Science Algorithm Software converts the Level 0 data into the Level 1A Data Products. The Level 1A Data Products are the time ordered instrument data converted from counts to engineering units. This document defines the equations that convert the raw instrument data into engineering units. Required scale factors, bias values, and coefficients are defined in this document. Additionally, required quality assurance and browse products are defined in this document.
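    The conversion described is, at its core, a linear transformation per telemetry channel. A minimal sketch follows; the function name, scale factor, and bias value are placeholders, not actual GLAS coefficients:

```python
import numpy as np

# Generic Level 1A-style conversion from raw counts to engineering units.
# The linear form, scale factor, and bias are placeholders, not GLAS values.
def counts_to_engineering_units(counts, scale, bias):
    """Convert raw instrument counts to engineering units via a linear model."""
    return scale * np.asarray(counts, dtype=float) + bias

raw_counts = [1023, 1045, 998]  # hypothetical telemetry samples
print(counts_to_engineering_units(raw_counts, scale=0.031, bias=-8.2))
```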

  11. Setting healthcare priorities in hospitals: a review of empirical studies.

    Science.gov (United States)

    Barasa, Edwine W; Molyneux, Sassy; English, Mike; Cleary, Susan

    2015-04-01

    Priority setting research has focused on the macro (national) and micro (bedside) levels, leaving the meso (institutional, hospital) level relatively neglected. This is surprising given the key role that hospitals play in the delivery of healthcare services and the large proportion of health system resources that they absorb. To explore the factors that impact upon priority setting at the hospital level, we conducted a thematic review of empirical studies. A systematic search of PubMed, EBSCOHOST and Econlit databases and Google Scholar was supplemented by a search of key websites and a manual search of relevant papers' reference lists. A total of 24 papers were identified from developed and developing countries. We applied a policy analysis framework to examine and synthesize the findings of the selected papers. Findings suggest that priority setting practice in hospitals was influenced by (1) contextual factors such as decision space, resource availability, financing arrangements, availability and use of information, organizational culture and leadership, (2) priority setting processes that depend on the type of priority setting activity, (3) content factors such as priority setting criteria and (4) actors, their interests and power relations. We observe that there is a need for studies to examine these issues and the interplay between them in greater depth and propose a conceptual framework that might be useful in examining priority setting practices in hospitals. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine © The Author 2014; all rights reserved.

  12. Effects of long-term voluntary exercise on learning and memory processes: dependency of the task and level of exercise.

    Science.gov (United States)

    García-Capdevila, Sílvia; Portell-Cortés, Isabel; Torras-Garcia, Meritxell; Coll-Andreu, Margalida; Costa-Miserachs, David

    2009-09-14

    The effect of long-term voluntary exercise (running wheel) on anxiety-like behaviour (plus maze and open field) and on learning and memory processes (object recognition and two-way active avoidance) was examined in Wistar rats. Because major individual differences in running wheel behaviour were observed, the data were analysed considering the exercising animals both as a whole and grouped according to the time spent in the running wheel (low, high, and very-high running). Although some variables related to anxiety-like behaviour seem to reflect an anxiogenic-compatible effect, the complete set of variables could be interpreted as an enhancement of defensive and risk-assessment behaviours in exercised animals, without major differences depending on the exercise level. Effects on learning and memory processes were dependent on the task and the level of exercise. Two-way avoidance was not affected in either the acquisition or the retention session, while retention of the object recognition task was affected. In this latter task, an enhancement in low-running subjects and an impairment in high- and very-high-running animals were observed.

  13. Detection of pyridaben residue levels in hot pepper fruit and leaves by liquid chromatography-tandem mass spectrometry: effect of household processes.

    Science.gov (United States)

    Kim, Sung-Woo; Abd El-Aty, A M; Rahman, Md Musfiqur; Choi, Jeong-Heui; Choi, Ok-Ja; Rhee, Gyu-Seek; Chang, Moon-Ik; Kim, Heejung; Abid, Morad D N; Shin, Sung Chul; Shim, Jae-Han

    2015-07-01

    Following quick, easy, cheap, effective, rugged and safe (QuEChERS) sample preparation and LC/MS/MS analysis, pyridaben residue levels were determined in unprocessed and processed hot pepper fruit and leaves. The linearities were satisfactory, with determination coefficients (R²) in excess of 0.995 in processed and unprocessed pepper fruit and leaves. Recoveries at various concentrations were 79.9-105.1%, with relative standard deviations ≤15%. The limits of quantitation of 0.003-0.012 mg/kg were very low compared with the maximum residue limits (2-5 mg/kg) set by the Ministry of Food and Drug Safety, Republic of Korea. The effects of various household processes, including washing, blanching, frying and drying under different conditions (water volume, blanching time and temperature), on residual concentrations were evaluated. Both washing and blanching (in combination with high water volume and longer time) significantly reduced residue levels in hot pepper fruit and leaves compared with other processes. In sum, the developed method was satisfactory and could be used to accurately detect residues in unprocessed and processed pepper fruit and leaves. It is recommended that pepper fruit and leaves be blanched after washing before being consumed to protect consumers from the negative health effects of detected pesticide residues. Copyright © 2014 John Wiley & Sons, Ltd.

  14. Delineating Facies Spatial Distribution by Integrating Ensemble Data Assimilation and Indicator Geostatistics with Level Set Transformation.

    Energy Technology Data Exchange (ETDEWEB)

    Hammond, Glenn Edward; Song, Xuehang; Ye, Ming; Dai, Zhenxue; Zachara, John; Chen, Xingyuan

    2017-03-01

    A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distribution obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates for the first time ensemble data assimilation with traditional transition probability-based geostatistics. The concept of level set is introduced to build shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that our new method adequately captures the spatial pattern of facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response in hydraulic head field with better accuracy compared to data assimilation with no constraints on spatial continuity on facies.
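    The level set transformation at the heart of the method can be sketched briefly: a continuous random field is thresholded to yield a discrete facies indicator. The sketch below assumes a smoothed Gaussian field and a zero threshold purely for illustration:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Sketch of the level set shape parameterization: a continuous random field
# (white noise given spatial correlation by Gaussian smoothing) is thresholded
# into a discrete two-facies indicator. Grid size, sigma, and the zero
# threshold are illustrative choices, not values from the study.
rng = np.random.default_rng(seed=0)
phi = gaussian_filter(rng.normal(size=(50, 50)), sigma=5.0)  # level set field

facies = (phi > 0.0).astype(int)   # facies 1 where phi > 0, facies 0 elsewhere
print("facies-1 volume fraction:", facies.mean())
```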

  15. The Effects of Test Trial and Processing Level on Immediate and Delayed Retention

    Science.gov (United States)

    Chang, Sau Hou

    2017-01-01

    The purpose of the present study was to investigate the effects of test trial and processing level on immediate and delayed retention. A 2 × 2 × 2 mixed ANOVA was used with two between-subject factors of test trial (single test, repeated test) and processing level (shallow, deep), and one within-subject factor of final recall (immediate, delayed). Seventy-six college students were randomly assigned first to the single test trial (studied the stimulus words three times and took one free-recall test) or the repeated test trial (studied the stimulus words once and took three consecutive free-recall tests), and then to the shallow processing level (asked whether each stimulus word was presented in capital or small letters) or the deep processing level (asked whether each stimulus word belonged to a particular category), to study forty stimulus words. The immediate test was administered five minutes after the trials, whereas the delayed test was administered one week later. Results showed that the single test trial recalled more words than the repeated test trial in the immediate final free-recall test, and participants in deep processing performed better than those in shallow processing in both immediate and delayed retention. However, the dominance of the single test trial and deep processing did not hold in delayed retention. Additional study trials did not further enhance the delayed retention of words encoded in deep processing, but did enhance the delayed retention of words encoded in shallow processing. PMID:28344679

  18. Process setting models for the minimization of costs of defectives

    African Journals Online (AJOL)

    Dr Obe

    ...determine the mean setting so as to minimise the total loss through under-limit complaints and loss of sales and goodwill, as well as over-limit losses through excess materials and rework costs. Models are developed for the two types of setting of the mean so that losses are minimised. Also, a model is ...
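    The type of loss model described can be sketched in standard notation; the cost symbols below are assumed for illustration rather than taken from the paper:

```latex
% Illustrative expected-loss model for setting the process mean \mu;
% symbols are assumed here, not taken from the paper.
\[
  E[C(\mu)] = c_u \, P(X < L) + c_o \, P(X > U), \qquad X \sim N(\mu, \sigma^2),
\]
% where L and U are the lower and upper specification limits, c_u is the unit
% loss from under-limit complaints and lost sales/goodwill, and c_o is the
% unit loss from excess material and rework. The optimal mean setting
% minimizes E[C(\mu)].
```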

  19. A combined approach of generalized additive model and bootstrap with small sample sets for fault diagnosis in fermentation process of glutamate.

    Science.gov (United States)

    Liu, Chunbo; Pan, Feng; Li, Yun

    2016-07-29

    Glutamate is of great importance in the food and pharmaceutical industries. There is still a lack of effective statistical approaches for fault diagnosis in the fermentation process of glutamate. To date, the statistical approach based on the generalized additive model (GAM) and bootstrap has not been used for fault diagnosis in fermentation processes, much less the fermentation process of glutamate with small sample sets. A combined approach of GAM and bootstrap was developed for online fault diagnosis in the fermentation process of glutamate with small sample sets. GAM was first used to model the relationship between glutamate production and different fermentation parameters using online data from four normal fermentation experiments of glutamate. The fitted GAM with fermentation time, dissolved oxygen, oxygen uptake rate and carbon dioxide evolution rate captured 99.6% of the variance in glutamate production during the fermentation process. Bootstrap was then used to quantify the uncertainty of the estimated production of glutamate from the fitted GAM using a 95% confidence interval. The proposed approach was then used for online fault diagnosis in abnormal fermentation processes of glutamate, with a fault defined as the estimated production of glutamate falling outside the 95% confidence interval. The online fault diagnosis based on the proposed approach identified not only the start of the fault in the fermentation process, but also the end of the fault when the fermentation conditions were back to normal. The proposed approach used only small sample sets from normal fermentation experiments to establish the model, and then required only online recorded data on fermentation parameters for fault diagnosis in the fermentation process of glutamate. The proposed approach based on GAM and bootstrap provides a new and effective way for fault diagnosis in the fermentation process of glutamate with small sample sets.
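    The fault-diagnosis logic can be sketched as follows. A cubic polynomial stands in for the fitted GAM, and all numbers are simulated; this is a conceptual illustration, not the authors' pipeline:

```python
import numpy as np

# Conceptual sketch of the fault-diagnosis logic: fit a reference model on
# normal runs, bootstrap a 95% confidence band, and flag observations outside
# it. A cubic polynomial stands in for the fitted GAM; all data are simulated.
rng = np.random.default_rng(seed=1)
t = np.linspace(0, 30, 120)                                     # time (h)
normal_run = 10 * np.sqrt(t) + rng.normal(0, 1.0, size=t.size)  # glutamate

# 1. Fit the reference model on normal-run data (a GAM in the paper).
coef = np.polyfit(t, normal_run, deg=3)

def predict(x):
    return np.polyval(coef, x)

# 2. Bootstrap the residuals to build a 95% confidence band.
residuals = normal_run - predict(t)
boot = np.array([
    np.polyval(np.polyfit(t, predict(t) + rng.choice(residuals, t.size), 3), t)
    for _ in range(500)
])
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)

# 3. Flag a fault whenever an online observation leaves the band.
online = predict(t) + np.where((t > 15) & (t < 20), -6.0, 0.0)  # injected dip
fault = (online < lower) | (online > upper)
print("fault detected between %.1f h and %.1f h" % (t[fault].min(), t[fault].max()))
```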

  20. Ethics and equity in research priority-setting: stakeholder engagement and the needs of disadvantaged groups.

    Science.gov (United States)

    Bhaumik, Soumyadeep; Rana, Sangeeta; Karimkhani, Chante; Welch, Vivian; Armstrong, Rebecca; Pottie, Kevin; Dellavalle, Robert; Dhakal, Purushottam; Oliver, Sandy; Francis, Damian K; Nasser, Mona; Crowe, Sally; Aksut, Baran; Amico, Roberto D

    2015-01-01

    A transparent and evidence-based priority-setting process promotes the optimal use of resources to improve health outcomes. Decision-makers and funders have begun to increasingly engage representatives of patients and healthcare consumers to ensure that research becomes more relevant. However, disadvantaged groups and their needs may not be integrated into the priority-setting process since they do not have a "political voice" or are unable to organise into interest groups. Equitable priority-setting methods need to balance patient needs, values, experiences with population-level issues and issues related to the health system.

  1. How Does Counselling in a Stationary Health Care Setting Affect the Attendance in a Standardised Sports Club Programme? Process Evaluation of a Quasi-Experimental Study

    Directory of Open Access Journals (Sweden)

    Sylvia Titze

    2018-01-01

    Full Text Available Acting in partnership across sectors is one principle for the promotion of health behaviours. The objective of this study was to describe participation in a sports club-based exercise programme, named JACKPOT, following an intervention in a health care setting. Focus was given to recruitment into JACKPOT, the attendance level, and whether the different programme elements were implemented as intended. The practicability of the project was also rated retrospectively. Participants were 238 inactive people (50% women) between 30 and 65 years of age who attended a health resort. Of these, 77% were assigned to the intervention group (IG). Recruitment into the 12 JACKPOT sessions and the attendance levels were recorded via attendance lists. The implementation of the intervention standards was assessed with structured interviews and participatory observation. The Pragmatic Explanatory Continuum Indicator Summary (PRECIS-2) tool served to rate the practicability of the project. Almost 50% of the IG subjects attended the JACKPOT sessions at least once, and 54% of the attenders visited ≥75% of the 12 sessions. Some of the programme elements were not delivered fully. The process evaluation results showed that the project worked in a real-world setting, and also uncovered potential reasons, such as incomplete information delivery, for the moderate recruitment and attendance levels.

  2. Standard setting in medical education: fundamental concepts and emerging challenges.

    Science.gov (United States)

    Mortaz Hejri, Sara; Jalili, Mohammad

    2014-01-01

    The process of determining the minimum pass level to separate competent students from those who do not perform well enough is called standard setting. A large number of methods are widely used to set cut-scores for both written and clinical examinations. There are challenging issues pertaining to any standard setting procedure; ignoring these concerns would invite serious dispute over the credibility and defensibility of the method. The goal of this review is to provide a basic understanding of the key concepts and challenges in standard setting, and to suggest some recommendations for overcoming the challenging issues, for educators and policymakers who are dealing with decision-making in this field.

  3. Low and intermediate level radioactive waste processing in plasma reactor

    International Nuclear Information System (INIS)

    Sauchyn, V.; Khvedchyn, I.; Van Oost, G.

    2013-01-01

    Methods of low and intermediate level radioactive waste processing comprise cementation, bituminization, curing in polymer matrices, combustion, and pyrolysis. All these methods are limited in their application by the chemical, morphological, and aggregate composition of the material to be processed. The thermal plasma method is one of the universal methods of RAW processing. Electric-arc plasma with mean temperatures of 2000-8000 K can destroy organic compounds into atoms and ions at very high speed and with a high degree of conversion. Destruction of complex substances without oxygen reduces the volume of exhaust gases and the size of the gas cleaning system. This paper presents a plasma reactor for thermal processing of low and intermediate level radioactive waste of mixed morphology. The equipment performs plasma-pyrolytic conversion of the waste and yields a conditioned product in a single stage. As a result, the volume of conditioned waste is significantly reduced (by more than a factor of 10). Waste is converted into an environmentally friendly form suitable for long-term storage. The leaching rate of macro-components from the vitrified compound is less than 1×10⁻⁷ g/(cm²·day). (authors)

  4. Unfolding the assessment process in a whole class mathematics setting

    Directory of Open Access Journals (Sweden)

    Radišić Jelena

    2014-01-01

    Full Text Available Assessment activities in class are an important aspect of classroom practice, and there is much debate with respect to formative vs. summative assessment routines and the outcomes that each provides for students' learning. As classroom assessment does not occur in seclusion from other aspects of classroom life, the process is rather complex. In this study we wished to explore how assessment serves the function of supporting students' learning and whether this evidence is used to adapt the teacher's practices to meet different learning needs in the mathematics classroom. The authors observed the assessment practices of an experienced math teacher in a grammar school in Belgrade over a three-week period. The analysis has shown the teacher to hold a somewhat complex perception of assessment, yet one largely detached from teaching, which is in line with previously reported results. However, elements of formative assessment do emerge, thus contributing to assessment being in the service of learning. In spite of this, a narrow set of practices is visible when observing how the teacher keeps track of students' progress. A mismatch is visible between students' and the teacher's perceptions of the assessment as a whole and some of the practices exercised in the process. The teacher struggled to verbalize some aspects of their own assessment practices, especially those related to more formative aspects.

  5. Achieving More Sustainable Designs through a Process Synthesis-Intensification Framework

    DEFF Research Database (Denmark)

    Babi, Deenesh Kavi; Woodley, John; Gani, Rafiqul

    2014-01-01

    More sustainable process designs refer to design alternatives that correspond to lower values of a set of targeted performance criteria. In this paper, a multi-level framework for process synthesis-intensification that leads to more sustainable process designs is presented. At the highest level of a...

  6. Levels-of-processing effect on internal source monitoring in schizophrenia.

    Science.gov (United States)

    Ragland, J Daniel; McCarthy, Erin; Bilker, Warren B; Brensinger, Colleen M; Valdez, Jeffrey; Kohler, Christian; Gur, Raquel E; Gur, Ruben C

    2006-05-01

    Recognition can be normalized in schizophrenia by providing patients with semantic organizational strategies through a levels-of-processing (LOP) framework. However, patients may rely primarily on familiarity effects, making recognition less sensitive than source monitoring to the strength of the episodic memory trace. The current study investigates whether providing semantic organizational strategies can also normalize patients' internal source-monitoring performance. Sixteen clinically stable medicated patients with schizophrenia and 15 demographically matched healthy controls were asked to identify the source of remembered words following an LOP-encoding paradigm in which they alternated between processing words on a 'shallow' perceptual versus a 'deep' semantic level. A multinomial analysis provided orthogonal measures of item recognition and source discrimination, and bootstrapping generated variance to allow for parametric analyses. LOP and group effects were tested by contrasting recognition and source-monitoring parameters for words that had been encoded during deep versus shallow processing conditions. As in a previous study there were no group differences in LOP effects on recognition performance, with patients and controls benefiting equally from deep versus shallow processing. Although there were no group differences in internal source monitoring, only controls had significantly better performance for words processed during the deep encoding condition. Patient performance did not correlate with clinical symptoms or medication dose. Providing a deep processing semantic encoding strategy significantly improved patients' recognition performance only. The lack of a significant LOP effect on internal source monitoring in patients may reflect subtle problems in the relational binding of semantic information that are independent of strategic memory processes.

  7. Parallels between a Collaborative Research Process and the Middle Level Philosophy

    Science.gov (United States)

    Dever, Robin; Ross, Diane; Miller, Jennifer; White, Paula; Jones, Karen

    2014-01-01

    The characteristics of the middle level philosophy as described in This We Believe closely parallel the collaborative research process. The journey of one research team is described in relationship to these characteristics. The collaborative process includes strengths such as professional relationships, professional development, courageous…

  8. Enhancing Critical Infrastructure and Key Resources (CIKR) Level-0 Physical Process Security Using Field Device Distinct Native Attribute Features

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Juan [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Liefer, Nathan C. [Wright-Patterson AFB, Dayton, OH (United States); Busho, Colin R. [Wright-Patterson AFB, Dayton, OH (United States); Temple, Michael A. [Wright-Patterson AFB, Dayton, OH (United States)

    2017-12-04

    Here, the need for improved Critical Infrastructure and Key Resource (CIKR) security is unquestioned, yet there has been minimal emphasis on Level-0 (PHY Process) improvements. Wired Signal Distinct Native Attribute (WS-DNA) Fingerprinting is investigated here as a non-intrusive PHY-based security augmentation to support an envisioned layered security strategy. Results are based on experimental response collections from Highway Addressable Remote Transducer (HART) Differential Pressure Transmitter (DPT) devices from three manufacturers (Yokogawa, Honeywell, Endress+Hauser) installed in an automated process control system. Device discrimination is assessed using Time Domain (TD) and Slope-Based FSK (SB-FSK) fingerprints input to Multiple Discriminant Analysis, Maximum Likelihood (MDA/ML) and Random Forest (RndF) classifiers. For 12 different classes (two devices per manufacturer at two distinct set points), both classifiers performed reliably and achieved an arbitrary performance benchmark of average cross-class percent correct of %C > 90%. The least challenging cross-manufacturer results included near-perfect %C ≈ 100%, while the more challenging like-model (serial number) discrimination results included 90% < %C < 100%, with TD Fingerprinting marginally outperforming SB-FSK Fingerprinting; SB-FSK benefits from having less stringent response alignment and registration requirements. The RndF classifier was most beneficial and enabled reliable selection of dimensionally reduced fingerprint subsets that minimize data storage and computational requirements. The RndF-selected feature sets contained 15% of the full-dimensional feature sets and only suffered a worst-case %CΔ = 3% to 4% performance degradation.

  9. A reconfigurable hybrid supervisory system for process control

    International Nuclear Information System (INIS)

    Garcia, H.E.; Ray, A.; Edwards, R.M.

    1994-01-01

    This paper presents a reconfigurable approach to decision and control systems for complex dynamic processes. The proposed supervisory control system is a reconfigurable hybrid architecture structured into three functional levels of hierarchy, namely execution, supervision, and coordination. While the bottom execution level is constituted by either reconfigurable continuously varying or discrete event systems, the top two levels are necessarily governed by reconfigurable sets of discrete event decision and control systems. Based on the process status, the active set of control and supervisory algorithms is chosen. The reconfigurable hybrid system is briefly described along with a discussion of its implementation at the Experimental Breeder Reactor II of Argonne National Laboratory. A process control application of this hybrid system is presented and evaluated in an in-plant experiment
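    The coordination idea, selecting the active algorithm set from the process status, can be caricatured in a few lines; the names, modes, and thresholds below are invented for illustration:

```python
# Toy caricature of the three-level reconfigurable idea: a coordinator selects
# the active supervisory/control algorithm set from the current process
# status. All names, modes, and thresholds are invented for illustration.
def conservative_control(measurement):
    return {"actuator_step": -1} if measurement > 0.95 else {"actuator_step": 0}

def performance_control(measurement):
    return {"actuator_step": +1} if measurement < 0.90 else {"actuator_step": 0}

ALGORITHM_SETS = {          # supervision level: candidate algorithm sets
    "nominal": performance_control,
    "anomaly": conservative_control,
}

def coordinate(process_status, measurement):
    """Coordination level: reconfigure by picking the active algorithm set."""
    return ALGORITHM_SETS[process_status](measurement)

print(coordinate("anomaly", measurement=0.97))  # -> {'actuator_step': -1}
```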

  11. On Adiabatic Processes at the Elementary Particle Level

    OpenAIRE

    Michaud, A.

    2016-01-01

    Analysis of adiabatic processes at the elementary particle level and of the manner in which they correlate with the principle of conservation of energy, the principle of least action and entropy. Analysis of the initial and irreversible adiabatic acceleration sequence of newly created elementary particles and its relation to these principles. Exploration of the consequences if this first initial acceleration sequence is not subject to the principle of conservation.

  12. The Semi-opened Infrastructure Model (SopIM): A Frame to Set Up an Organizational Learning Process

    Science.gov (United States)

    Grundstein, Michel

    In this paper, we introduce the "Semi-opened Infrastructure Model (SopIM)" implemented to deploy Artificial Intelligence and Knowledge-based Systems within a large industrial company. This model illustrates what could be two of the operating elements of the Model for General Knowledge Management within the Enterprise (MGKME) that are essential to set up the organizational learning process that leads people to appropriate and use concepts, methods and tools of an innovative technology: the "Ad hoc Infrastructures" element, and the "Organizational Learning Processes" element.

  13. TES Level 1 Algorithms: Interferogram Processing, Geolocation, Radiometric, and Spectral Calibration

    Science.gov (United States)

    Worden, Helen; Beer, Reinhard; Bowman, Kevin W.; Fisher, Brendan; Luo, Mingzhao; Rider, David; Sarkissian, Edwin; Tremblay, Denis; Zong, Jia

    2006-01-01

    The Tropospheric Emission Spectrometer (TES) on the Earth Observing System (EOS) Aura satellite measures the infrared radiance emitted by the Earth's surface and atmosphere using Fourier transform spectrometry. The measured interferograms are converted into geolocated, calibrated radiance spectra by the L1 (Level 1) processing, and are the inputs to L2 (Level 2) retrievals of atmospheric parameters, such as vertical profiles of trace gas abundance. We describe the algorithmic components of TES Level 1 processing, giving examples of the intermediate results and diagnostics that are necessary for creating TES L1 products. An assessment of noise-equivalent spectral radiance levels and current systematic errors is provided. As an initial validation of our spectral radiances, TES data are compared to the Atmospheric Infrared Sounder (AIRS) (on EOS Aqua), after accounting for spectral resolution differences by applying the AIRS spectral response function to the TES spectra. For the TES L1 nadir data products currently available, the agreement with AIRS is 1 K or better.
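    The kind of resolution matching used for the AIRS comparison can be sketched as a convolution of the higher-resolution spectrum with an instrument spectral response function; the Gaussian response and all grid values below are assumptions, not TES or AIRS specifics:

```python
import numpy as np

# Degrading a higher-resolution spectrum by convolving with an instrument
# spectral response function (SRF), the kind of step used for the AIRS
# comparison. The Gaussian SRF, grid, and FWHM are assumptions.
def apply_srf(wavenumber, radiance, fwhm):
    sigma = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    dk = wavenumber[1] - wavenumber[0]          # grid spacing (cm^-1)
    half = int(4 * sigma / dk)
    x = np.arange(-half, half + 1) * dk
    kernel = np.exp(-0.5 * (x / sigma) ** 2)
    kernel /= kernel.sum()                      # preserve radiance scale
    return np.convolve(radiance, kernel, mode="same")

k = np.linspace(650.0, 1200.0, 2000)            # wavenumber grid (cm^-1)
spectrum = 1.0 + 0.1 * np.sin(k / 5.0)          # synthetic radiance
print(apply_srf(k, spectrum, fwhm=1.5)[:3])
```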

  14. Multiatlas segmentation of thoracic and abdominal anatomy with level set-based local search.

    Science.gov (United States)

    Schreibmann, Eduard; Marcus, David M; Fox, Tim

    2014-07-08

    Segmentation of organs at risk (OARs) remains one of the most time-consuming tasks in radiotherapy treatment planning. Atlas-based segmentation methods using single templates have emerged as a practical approach to automate the process for brain or head and neck anatomy, but pose significant challenges in regions where large interpatient variations are present. We show that significant changes are needed to autosegment thoracic and abdominal datasets by combining multi-atlas deformable registration with a level set-based local search. Segmentation is hierarchical, with a first stage detecting bulk organ location, and a second step adapting the segmentation to fine details present in the patient scan. The first stage is based on warping multiple presegmented templates to the new patient anatomy using a multimodality deformable registration algorithm able to cope with changes in scanning conditions and artifacts. These segmentations are compacted into a probabilistic map of organ shape using the STAPLE algorithm. Final segmentation is obtained by adjusting the probability map for each organ type, using customized combinations of delineation filters exploiting prior knowledge of organ characteristics. Validation is performed by comparing automated and manual segmentation using the Dice coefficient, measured at an average of 0.971 for the aorta, 0.869 for the trachea, 0.958 for the lungs, 0.788 for the heart, 0.912 for the liver, 0.884 for the kidneys, 0.888 for the vertebrae, 0.863 for the spleen, and 0.740 for the spinal cord. Accurate atlas segmentation for abdominal and thoracic regions can be achieved with the usage of a multi-atlas and per-structure refinement strategy. To improve clinical workflow and efficiency, the algorithm was embedded in a software service, which applies the algorithm automatically on acquired scans without any user interaction.
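    The Dice coefficient used in the validation above has a standard definition, shown here on made-up binary masks:

```python
import numpy as np

# Standard Dice coefficient, as used in the validation above, demonstrated on
# made-up binary masks.
def dice(mask_a, mask_b):
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

auto = np.zeros((64, 64), dtype=bool)
auto[20:45, 20:45] = True                 # automated contour (synthetic)
manual = np.zeros((64, 64), dtype=bool)
manual[22:47, 21:46] = True               # manual contour (synthetic)
print(f"Dice = {dice(auto, manual):.3f}")
```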

  15. Coupled processes in NRC high-level waste research

    International Nuclear Information System (INIS)

    Costanzi, F.A.

    1987-01-01

    The author discusses the NRC research effort in support of evaluating license applications for disposal of nuclear waste and of promulgating regulations and issuing guidance documents on nuclear waste management. To this end, the NRC funds research activities at a number of laboratories, academic institutions, and commercial organizations. One of these research efforts is the coupled processes study. This paper discusses the interest in coupled processes and describes the target areas of research efforts over the next few years. The specific research activities relate to the performance objectives of NRC's high-level waste (HLW) regulation and the U.S. Environmental Protection Agency (EPA) HLW standard. The general objective of the research program is to ensure that the NRC has a sufficient independent technical base to make sound regulatory decisions

  16. False memory and level of processing effect: an event-related potential study.

    Science.gov (United States)

    Beato, Maria Soledad; Boldini, Angela; Cadavid, Sara

    2012-09-12

    Event-related potentials (ERPs) were used to determine the effects of level of processing on true and false memory, using the Deese-Roediger-McDermott (DRM) paradigm. In the DRM paradigm, lists of words highly associated with a single nonpresented word (the 'critical lure') are studied and, in a subsequent memory test, critical lures are often falsely remembered. Lists with three critical lures per list were presented auditorily to participants, who studied them with either a shallow (saying whether the word contained the letter 'o') or a deep (creating a mental image of the word) processing task. The visual presentation modality was used in the final recognition test. True recognition of studied words was significantly higher after deep encoding, whereas false recognition of nonpresented critical lures was similar in both experimental groups. At the ERP level, true and false recognition showed similar patterns: no FN400 effect was found, whereas comparable left parietal and late right frontal old/new effects were found for true and false recognition in both experimental conditions. Items studied under shallow encoding conditions elicited more positive ERPs than items studied under deep encoding conditions in the 1000-1500 ms interval. These ERP results suggest that true and false recognition share some common underlying processes. Differential effects of level of processing on true and false memory were found only at the behavioral level, not at the ERP level.

  17. Assessment of the Orion-SLS Interface Management Process in Achieving the EIA 731.1 Systems Engineering Capability Model Generic Practices Level 3 Criteria

    Science.gov (United States)

    Jellicorse, John J.; Rahman, Shamin A.

    2016-01-01

    NASA is currently developing the next-generation crewed spacecraft and launch vehicle for exploration beyond Earth orbit, including returning to the Moon and making the transit to Mars. Managing the design integration of major hardware elements of a space transportation system is critical for overcoming both the technical and programmatic challenges in taking a complex system from concept to space operations. An established method of accomplishing this is formal interface management. In this paper we set forth an argument that the interface management process implemented by NASA between the Orion Multi-Purpose Crew Vehicle (MPCV) and the Space Launch System (SLS) achieves the Level 3 tier of the EIA 731.1 Systems Engineering Capability Model (SECM) for Generic Practices. We describe the relevant NASA systems and associated organizations, and define the EIA SECM Level 3 Generic Practices. We then provide evidence for our compliance with those practices. This evidence includes discussions of: the NASA systems engineering (SE) interface management standard process and best practices; the tailoring of that process for implementation on the Orion-to-SLS interface; changes made over time to improve the tailored process; and the opportunities to take the resulting lessons learned and propose improvements to our institutional processes and best practices. We compare this evidence against the practices to form the rationale for the declared SECM maturity level.

  18. Development of a working set of waste package performance criteria for deepsea disposal of low-level radioactive waste. Final report

    International Nuclear Information System (INIS)

    Columbo, P.; Fuhrmann, M.; Neilson, R.M. Jr; Sailor, V.L.

    1982-11-01

    The United States ocean dumping regulations developed pursuant to PL92-532, the Marine Protection, Research, and Sanctuaries Act of 1972, as amended, provide for a general policy of isolation and containment of low-level radioactive waste after disposal into the ocean. In order to determine whether any particular waste packaging system is adequate to meet this general requirement, it is necessary to establish a set of performance criteria against which to evaluate a particular packaging system. These performance criteria must present requirements for the behavior of the waste in combination with its immobilization agent and outer container in a deepsea environment. This report presents a working set of waste package performance criteria, and includes a glossary of terms, characteristics of low-level radioactive waste, radioisotopes of importance in low-level radioactive waste, and a summary of domestic and international regulations which control the ocean disposal of these wastes

  19. Level Set-Based Topology Optimization for the Design of an Electromagnetic Cloak With Ferrite Material

    DEFF Research Database (Denmark)

    Otomori, Masaki; Yamada, Takayuki; Andkjær, Jacob Anders

    2013-01-01

    This paper presents a structural optimization method for the design of an electromagnetic cloak made of ferrite material. Ferrite materials exhibit a frequency-dependent degree of permeability, due to a magnetic resonance phenomenon that can be altered by changing the magnitude of an externally... A level set-based topology optimization method incorporating a fictitious interface energy is used to find optimized configurations of the ferrite material. The numerical results demonstrate that the optimization successfully found an appropriate ferrite configuration that functions as an electromagnetic...

  20. Implementation Process and Acceptance of a Setting Based Prevention Programme to Promote Healthy Lifestyle in Preschool Children

    Science.gov (United States)

    Herbert, Birgit; Strauss, Angelika; Mayer, Andrea; Duvinage, Kristin; Mitschek, Christine; Koletzko, Berthold

    2013-01-01

    Objective: Evaluation of the implementation process of a kindergarten-based intervention ("TigerKids") to promote a healthy lifestyle. Design: Questionnaire survey among kindergarten teachers about programme implementation and acceptance. Setting: Kindergartens in Bavaria, Germany. Methods: Two hundred and fifteen kindergartens were…

  1. Effect of the bread-making process on zearalenone levels.

    Science.gov (United States)

    Heidari, Sara; Milani, Jafar; Nazari, Seyed Saman Seyed Jafar

    2014-01-01

    The effects of the bread-making process, including fermentation with Saccharomyces cerevisiae and lactic acid bacteria (Lactobacillus casei, Lactobacillus rhamnosus, Lactobacillus acidophilus and Lactobacillus fermentum) and baking at 200°C, on zearalenone (ZEA) levels were investigated. Standard solutions of ZEA were added to flour and then loaves of bread were prepared. Sourdough and three types of yeast, including active dry yeast, instant dry yeast and compressed yeast, were used for the fermentation of the dough. ZEA levels in flour, dough and bread were determined by HPLC with fluorescence detection after extraction and clean-up on an immunoaffinity column. The highest reduction in ZEA levels was found in the first fermentation (first proof), while the lowest reduction was observed in the baking stage. In addition, the results showed that compressed yeast had the greatest potential to reduce ZEA levels, even at the baking stage.

  2. Priority setting in practice: participants' opinions on vertical and horizontal priority setting for reallocation.

    Science.gov (United States)

    Waldau, Susanne; Lindholm, Lars; Wiechel, Anna Helena

    2010-08-01

    In the Västerbotten County Council in Sweden, a priority setting process was undertaken to reallocate existing resources for funding of new methods and activities. Resources were created by limiting low priority services. A procedure for priority setting was constructed and fully tested by engaging the entire organisation. The procedure included priority setting within and between departments and political decision making. Participants' views and experiences were collected as a basis for future improvement of the process. Results indicate that participants appreciated the overall approach and methodology and wished to engage in their improvement. Among the improvement proposals is prolongation of the process in order to improve the quality of the knowledge base. The procedure for identification of new items for funding also needs to be revised. The priority setting process was considered an overall success because it fulfilled its political goals. Factors considered crucial for success are a wish among managers for an economic strategy that addresses existing internal resource allocation; process management characterized by goal orientation and clear leadership; an elaborate communications strategy integrated early in the process and its management; political unity in support of the procedure; and a strong political commitment throughout the process. Generalizability has already been demonstrated by several health care organisations that performed processes founded on this working model. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  3. Optimum Replacement Level of the Soybean Meal for Processed ...

    African Journals Online (AJOL)

    user

    KEY WORDS: Processed Horse Eye Bean, Anti-Nutritional Factors, Soaking, Cooking, Broiler Finisher Diet. INTRODUCTION. The protein intake of Nigerians has been in decline as a result of an ever-increasing population. The level of animal protein consumption has a direct influence on the general well-being and health of ...

  4. Priority-setting in health systems

    DEFF Research Database (Denmark)

    Byskov, Jens

    2013-01-01

    DBL - under core funding from Danish International Development Agency (Danida), 2013. WHY HAVE HEALTH SYSTEMS WHEN EFFECTIVE INTERVENTIONS ARE KNOWN? Case: A teenage mother lives in a poor sub-Saharan village next to a big lake. The area is known to have malaria transmission all year around, and surveys in nearby villages have shown a high prevalence of intestinal helminthiasis and schistosomiasis. The HIV prevalence in similar rural settings is about 10% in her age group. She has been losing weight over the last months and now her one-year-old child feels hot and is not eating well. She has... ...improvements work similarly in the vast array of social and other local contextual factors. Local, fair and accountable priority setting processes are necessary to make the best of ever-shifting national level strategies and priorities. An approach is described which can assist in the involvement...

  5. A web-based study of the relationship of duration of insulin pump infusion set use and fasting blood glucose level in adults with type 1 diabetes.

    Science.gov (United States)

    Sampson Perrin, Alysa J; Guzzetta, Russell C; Miller, Kellee M; Foster, Nicole C; Lee, Anna; Lee, Joyce M; Block, Jennifer M; Beck, Roy W

    2015-05-01

    To evaluate the impact of infusion set use duration on glycemic control, we conducted an Internet-based study using the T1D Exchange's online patient community, Glu (myGlu.org). For 14 days, 243 electronically consented adults with type 1 diabetes (T1D) entered online that day's fasting blood glucose (FBG) level, the prior day's total daily insulin (TDI) dose, and whether the infusion set was changed. The mean duration of infusion set use was 3.0 days. Mean FBG level was higher with each successive day of infusion set use, increasing from 126 mg/dL on Day 1 to 133 mg/dL on Day 3 and 147 mg/dL on Day 5 (P < 0.001). TDI dose did not vary with increased duration of infusion set use. Internet-based data collection was used to conduct the study rapidly and at low cost. The results indicate that FBG levels increase with each additional day of insulin pump infusion set use.

  6. Level-set reconstruction algorithm for ultrafast limited-angle X-ray computed tomography of two-phase flows.

    Science.gov (United States)

    Bieberle, M; Hampel, U

    2015-06-13

    Tomographic image reconstruction is based on recovering an object distribution from its projections, which have been acquired from all angular views around the object. If the angular range is limited to less than 180° of parallel projections, typical reconstruction artefacts arise when using standard algorithms. To compensate for this, specialized algorithms using a priori information about the object need to be applied. The application behind this work is ultrafast limited-angle X-ray computed tomography of two-phase flows. Here, only a binary distribution of the two phases needs to be reconstructed, which reduces the complexity of the inverse problem. To solve it, a new reconstruction algorithm (LSR) based on the level-set method is proposed. It includes one force function term accounting for matching the projection data and one incorporating a curvature-dependent smoothing of the phase boundary. The algorithm has been validated using simulated as well as measured projections of known structures, and its performance has been compared to the algebraic reconstruction technique and a binary derivative of it. The validation as well as the application of the level-set reconstruction on a dynamic two-phase flow demonstrated its applicability and its advantages over other reconstruction algorithms. © 2015 The Author(s) Published by the Royal Society. All rights reserved.
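    The two-term force structure of the algorithm can be caricatured in one dimension; the blur operator below stands in for the real limited-angle projection, and all coefficients are illustrative rather than taken from the paper:

```python
import numpy as np

# Deliberately simplified 1D caricature of the two-term force: a data-mismatch
# term (backprojected residual) plus a smoothing term. The blur operator
# stands in for the real limited-angle projection; coefficients are invented.
def blur(x):
    return np.convolve(x, np.ones(5) / 5.0, mode="same")  # toy 'projection'

def smoothing(phi):
    return np.gradient(np.gradient(phi))   # crude curvature surrogate

def lsr_step(phi, measured, alpha=0.8, beta=0.2):
    binary = (phi > 0).astype(float)        # current binary two-phase image
    residual = blur(binary) - measured      # mismatch in measurement space
    return phi - alpha * blur(residual) + beta * smoothing(phi)

truth = (np.abs(np.arange(200) - 100) < 25).astype(float)  # phantom
data = blur(truth)

phi = np.full(200, -0.1)                    # start from an empty estimate
for _ in range(50):
    phi = lsr_step(phi, data)
recon = (phi > 0).astype(float)
print("pixel agreement with phantom:", (recon == truth).mean())
```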

  7. Casemix and process indicators of outcome in stroke. The Royal College of Physicians minimum data set for stroke.

    Science.gov (United States)

    Irwin, P; Rudd, A

    1998-01-01

    The emphasis on outcomes measurement requires that casemix is considered in any comparative studies. In 1996 the Intercollegiate Working Party for Stroke agreed a minimum data set to measure the severity of casemix in stroke. The reasons for its development, the evidence base supporting the items included and the possible uses of the data set are described. It is currently being evaluated in national outcome and process audits to be reported at a later date.

  8. THOREX processing and zeolite transfer for high-level waste stream processing blending

    International Nuclear Information System (INIS)

    Kelly, S. Jr.; Meess, D.C.

    1997-07-01

    The West Valley Demonstration Project (WVDP) completed the pretreatment of the high-level radioactive waste (HLW) prior to the start of waste vitrification. The HLW originated from the two million liters of plutonium/uranium extraction (PUREX) and thorium extraction (THOREX) wastes remaining from Nuclear Fuel Services' (NFS) commercial nuclear fuel reprocessing operations at the Western New York Nuclear Service Center (WNYNSC) from 1966 to 1972. The pretreatment process removed cesium as well as other radionuclides from the liquid wastes and captured these radioactive materials onto silica-based molecular sieves (zeolites). The decontaminated salt solutions were volume-reduced and then mixed with portland cement and other admixtures. Nineteen thousand eight hundred and seventy-seven 270-liter square drums were filled with the cement-wastes produced from the pretreatment process. These drums are being stored in a shielded facility on the site until their final disposition is determined. Over 6.4 million liters of liquid HLW were processed through the pretreatment system. PUREX supernatant was processed first, followed by two PUREX sludge wash solutions. A third wash of PUREX/THOREX sludge was then processed after the neutralized THOREX waste was mixed with the PUREX waste. Approximately 6.6 million curies of radioactive cesium-137 (Cs-137) in the HLW liquid were removed and retained on 65,300 kg of zeolites. With pretreatment complete, the zeolite material has been mobilized, size-reduced (ground), and blended with the PUREX and THOREX sludges in a single feed tank that will supply the HLW slurry to the Vitrification Facility

  9. Decomposability and convex structure of thermal processes

    Science.gov (United States)

    Mazurek, Paweł; Horodecki, Michał

    2018-05-01

    We present an example of a thermal process (TP) for a system of d energy levels which cannot be performed without instant access to the whole energy space. This TP is uniquely connected with a transition between certain states of the system that cannot be performed without access to the whole energy space, even when approximate transitions are allowed. Pursuing the question of the decomposability of TPs into convex combinations of compositions of processes acting non-trivially on smaller subspaces, we investigate transitions within the subspace of states diagonal in the energy basis. For three-level systems, we determine the set of extremal points of these operations, as well as the minimal set of operations needed to perform an arbitrary TP, and connect the set of TPs with the thermomajorization criterion. We show that the structure of the set depends on temperature, which is associated with the fact that TPs cannot deterministically increase the extractable work from a state, a conclusion that holds for an arbitrary d-level system. We also connect the decomposability problem with the detailed balance symmetry of extremal TPs.
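    The detailed balance symmetry referred to above takes the following standard form; the notation is assumed here rather than taken from the paper:

```latex
% The detailed balance symmetry in standard notation (assumed here): a
% transition matrix T acting on level populations, with Gibbs weights g_i,
% satisfies
\[
  T_{ij}\, g_j = T_{ji}\, g_i, \qquad g_i = \frac{e^{-\beta E_i}}{Z},
\]
% so that the thermal (Gibbs) state is a fixed point of the process.
```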

  10. Effectiveness of sensory processing strategies on activity level in inclusive preschool classrooms

    Directory of Open Access Journals (Sweden)

    Lin CL

    2012-10-01

    Chien-Lin Lin,1,2 Yu-Fan Min,3 Li-Wei Chou,1,2,* Chin-Kai Lin,4,* 1Department of Physical Medicine and Rehabilitation, China Medical University Hospital, Taichung, Taiwan; 2School of Chinese Medicine, College of Chinese Medicine, China Medical University, Taichung, Taiwan; 3Faith, Hope and Love, Center for Children and Adults With Disabilities, Taichung, Taiwan; 4Program of Early Intervention, Department of Early Childhood Education, National Taichung University of Education, Taichung, Taiwan. *These authors contributed equally to this work. Background: The purpose of this study was to investigate the effectiveness of sensory processing strategies in improving the activity level of children with sensory integration dysfunction. Methods: The study used a matching-only pretest–posttest control group design, in which children with sensory integration dysfunction were randomly matched to an intervention group (n = 18) or a control group (n = 18). The intervention group comprised 3–6-year-old children who received an 8-week school-day intervention during implementation of the theme curriculum. Results: The 8-week treatment significantly reduced the activity level and foot-swinging episodes in children with sensory integration dysfunction, with a medium effect size. The control group did not show any statistically significant change. Conclusion: Sensory processing strategies can improve activity levels in children with sensory integration dysfunction, although this study was unable to exclude a developmental effect. The social validity results show that sensory processing strategies can be integrated into the theme curriculum and improve activity levels in children. Keywords: activity level, preschool inclusive classroom, sensory integration dysfunction, sensory processing strategy

  11. Studies of the setting behavior of cement suspensions

    International Nuclear Information System (INIS)

    Rudolph, G.; Luo, S.; Vejmelka, P.; Koester, R.

    1983-10-01

    The design of a process for the cementation of radioactive waste solutions is determined not only by the quality of the final product but also by the behavior of the cement grout before and during setting. For these reasons, quantitative investigations were performed on the characteristics of the cement suspensions considered for the solidification of intermediate-level liquid wastes; these suspensions are composed mainly of cement, bentonite, simulated waste solution, and water. Particular attention was paid to the differences in behavior of the various types of cement. The parameters investigated include viscosity, bleeding, volume change during setting, influence of compacting by vibration, setting time, and heat of hydration. At the end of the report the merits and drawbacks of the different cements are tabulated. These data may serve as a decision aid in selecting an appropriate type of cement

  12. Low-Frequency Cortical Entrainment to Speech Reflects Phoneme-Level Processing.

    Science.gov (United States)

    Di Liberto, Giovanni M; O'Sullivan, James A; Lalor, Edmund C

    2015-10-05

    The human ability to understand speech is underpinned by a hierarchical auditory system whose successive stages process increasingly complex attributes of the acoustic input. It has been suggested that to produce categorical speech perception, this system must elicit consistent neural responses to speech tokens (e.g., phonemes) despite variations in their acoustics. Here, using electroencephalography (EEG), we provide evidence for this categorical phoneme-level speech processing by showing that the relationship between continuous speech and neural activity is best described when that speech is represented using both low-level spectrotemporal information and categorical labeling of phonetic features. Furthermore, the mapping between phonemes and EEG becomes more discriminative for phonetic features at longer latencies, in line with what one might expect from a hierarchical system. Importantly, these effects are not seen for time-reversed speech. These findings may form the basis for future research on natural language processing in specific cohorts of interest and for broader insights into how brains transform acoustic input into meaning. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. Evaluation of the Standard Setting on the 2005 Grade 12 National Assessment of Educational Progress Mathematics Test

    Science.gov (United States)

    Sireci, Stephen G.; Hauger, Jeffrey B.; Wells, Craig S.; Shea, Christine; Zenisky, April L.

    2009-01-01

    The National Assessment Governing Board used a new method to set achievement level standards on the 2005 Grade 12 NAEP Math test. In this article, we summarize our independent evaluation of the process used to set these standards. The evaluation data included observations of the standard-setting meeting, observations of advisory committee meetings…

  14. Level density from realistic nuclear potentials

    International Nuclear Information System (INIS)

    Calboreanu, A.

    2006-01-01

    Nuclear level densities of some nuclei are calculated using a realistic set of single particle states (sps). These states are derived from a parameterization of nuclear potentials that describes the observed sps across a large number of nuclei. This approach has the advantage that one can infer the level density of nuclei that are inaccessible to direct study but are very important in astrophysical processes, such as those close to the drip lines. Level densities at high excitation energies are very sensitive to the actual set of sps. The fact that the sps spectrum is finite has extraordinary consequences for nuclear reaction yields, due to a leveling-off of the level density at extremely high excitation energies that has so far been wrongly attributed to other nuclear effects. The single-particle level density parameter a is extracted by fitting the calculated densities to the standard Bethe formula
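
    For reference, the standard Bethe formula invoked above is the textbook Fermi-gas level density, quoted here for context rather than taken from this record:

        \rho(E) \approx \frac{\sqrt{\pi}}{12\, a^{1/4} E^{5/4}} \exp\!\left( 2\sqrt{aE} \right)

    where E is the excitation energy and a is the single-particle level density parameter fitted in the study; the exponential growth in \sqrt{aE} is what makes the density so sensitive to the underlying sps set.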

  15. Optimization of super-resolution processing using incomplete image sets in PET imaging.

    Science.gov (United States)

    Chang, Guoping; Pan, Tinsu; Clark, John W; Mawlawi, Osama R

    2008-12-01

    Super-resolution (SR) techniques are used in PET imaging to generate a high-resolution image by combining multiple low-resolution images that have been acquired from different points of view (POVs). The number of low-resolution images used defines the processing time and memory storage necessary to generate the SR image. In this paper, the authors propose two optimized SR implementations (ISR-1 and ISR-2) that require only a subset of the low-resolution images (two sides and diagonal of the image matrix, respectively), thereby reducing the overall processing time and memory storage. In an N x N matrix of low-resolution images, ISR-1 would be generated using images from the two sides of the N x N matrix, while ISR-2 would be generated from images across the diagonal of the image matrix. The objective of this paper is to investigate whether the two proposed SR methods can achieve contrast and signal-to-noise ratio (SNR) similar to those of the SR image generated from a complete set of low-resolution images (CSR) using simulation and experimental studies. A simulation, a point source, and a NEMA/IEC phantom study were conducted for this investigation. In each study, 4 (2 x 2) or 16 (4 x 4) low-resolution images were reconstructed from the same acquired data set while shifting the reconstruction grid to generate images from different POVs. SR processing was then applied in each study to combine all as well as two different subsets of the low-resolution images to generate the CSR, ISR-1, and ISR-2 images, respectively. For reference purpose, a native reconstruction (NR) image using the same matrix size as the three SR images was also generated. The resultant images (CSR, ISR-1, ISR-2, and NR) were then analyzed using visual inspection, line profiles, SNR plots, and background noise spectra. The simulation study showed that the contrast and the SNR difference between the two ISR images and the CSR image were on average 0.4% and 0.3%, respectively. Line profiles of
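
    The record above combines sub-pixel-shifted low-resolution frames on a high-resolution grid. The Python sketch below illustrates only that general interleaving idea; treating missing POVs (as in the ISR-1/ISR-2 subsets) by filling their grid positions with the mean of the available frames is an assumed simplification, not the authors' algorithm.

        import numpy as np

        def interleave_sr(lr_images, n):
            """Place an n x n family of sub-pixel-shifted low-resolution
            frames onto one high-resolution grid. lr_images maps a POV
            index (row_shift, col_shift), each in range(n), to a 2-D
            array; missing POVs are approximated by the mean frame."""
            h, w = next(iter(lr_images.values())).shape
            fallback = np.mean(list(lr_images.values()), axis=0)
            hr = np.empty((n * h, n * w))
            for i in range(n):
                for j in range(n):
                    frame = lr_images.get((i, j), fallback)
                    hr[i::n, j::n] = frame
            return hr

        # Toy usage: a 2 x 2 set with only the diagonal POVs available (ISR-2-like)
        rng = np.random.default_rng(0)
        subset = {(0, 0): rng.random((64, 64)), (1, 1): rng.random((64, 64))}
        print(interleave_sr(subset, 2).shape)  # (128, 128)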

  16. Modeling Restrained Shrinkage Induced Cracking in Concrete Rings Using the Thick Level Set Approach

    Directory of Open Access Journals (Sweden)

    Rebecca Nakhoul

    2018-03-01

    Modeling restrained shrinkage-induced damage and cracking in concrete is addressed herein. The novel Thick Level Set (TLS) damage growth and crack propagation model is used and adapted by introducing a shrinkage contribution into the formulation. The capacity of the TLS to predict damage evolution, crack initiation, and growth triggered by restrained shrinkage in the absence of external loads is evaluated. A study dealing with shrinkage-induced cracking in elliptical concrete rings is presented herein. Key results, such as the effect of ring oblateness on the stress distribution and on the critical shrinkage strain needed to initiate damage, are highlighted. In addition, crack positions are compared to those observed in experiments and are found to be satisfactory.

  17. Glycated haemoglobin (HbA1c ) and fasting plasma glucose relationships in sea-level and high-altitude settings.

    Science.gov (United States)

    Bazo-Alvarez, J C; Quispe, R; Pillay, T D; Bernabé-Ortiz, A; Smeeth, L; Checkley, W; Gilman, R H; Málaga, G; Miranda, J J

    2017-06-01

    Higher haemoglobin levels and differences in glucose metabolism have been reported among high-altitude residents, which may influence the diagnostic performance of HbA1c. This study explores the relationship between HbA1c and fasting plasma glucose (FPG) in populations living at sea level and at an altitude of > 3000 m. Data from 3613 Peruvian adults without a known diagnosis of diabetes from sea-level and high-altitude settings were evaluated. Linear, quadratic and cubic regression models were performed adjusting for potential confounders. Receiver operating characteristic (ROC) curves were constructed and concordance between HbA1c and FPG was assessed using a Kappa index. At sea level and high altitude, means were 13.5 and 16.7 g/dl (P > 0.05) for haemoglobin level; 41 and 40 mmol/mol (5.9% and 5.8%; P < 0.01) for HbA1c; and 5.8 and 5.1 mmol/l (105 and 91.3 mg/dl; P < 0.001) for FPG, respectively. The adjusted relationship between HbA1c and FPG was quadratic at sea level and linear at high altitude. Adjusted models showed that, to predict an HbA1c value of 48 mmol/mol (6.5%), the corresponding mean FPG values at sea level and high altitude were 6.6 and 14.8 mmol/l (120 and 266 mg/dl), respectively. An HbA1c cut-off of 48 mmol/mol (6.5%) had a sensitivity for high FPG of 87.3% (95% confidence interval (95% CI) 76.5 to 94.4) at sea level and 40.9% (95% CI 20.7 to 63.6) at high altitude. The relationship between HbA1c and FPG is less clear at high altitude than at sea level. Caution is warranted when using HbA1c to diagnose diabetes mellitus in this setting. © 2017 The Authors. Diabetic Medicine published by John Wiley & Sons Ltd on behalf of Diabetes UK.

  18. Leader Development Process in Pakistan Army at the Tactical Level

    National Research Council Canada - National Science Library

    Nawaz, Amer

    2004-01-01

    .... up to a maximum of seven years of service. It analyzes the present leader development process of the Pakistan Army to assess its effectiveness in training leaders at the tactical level to perform effectively in future...

  19. Industrial noise level study in a wheat processing factory in Ilorin, Nigeria

    Science.gov (United States)

    Ibrahim, I.; Ajao, K. R.; Aremu, S. A.

    2016-05-01

    An industrial process such as wheat processing generates significant noise, which can cause adverse effects on workers and the general public. This study assessed the noise level at a wheat processing mill in Ilorin, Nigeria. A portable digital sound level meter (HD600, manufactured by Extech Inc., USA) was used to determine the noise level around various machines, sections, and offices in the factory at pre-determined distances. A subjective assessment was also made using a World Health Organization (WHO) standard questionnaire to obtain information regarding noise ratings, the effect of noise on personnel, and noise preventive measures. The results of the study show that the highest noise level, 99.4 dBA, was recorded at a pressure blower. WHO Class-4 hearing protectors are recommended for workers on the shop floor, and room acoustics should be upgraded to absorb some of the sound transmitted to offices.

  20. [The effect of encoding on false memory: examination on levels of processing and list presentation format].

    Science.gov (United States)

    Hamajima, Hideki

    2004-04-01

    Using the Deese/Roediger-McDermott paradigm, the effects of list presentation format (blocked/random) and level of processing on critical nonpresented lures were examined. A levels-of-processing effect was not observed for lures in the blocked presentation order. Rates of false recognition and remember judgments for lures were significantly lower under shallow processing than under deep processing when items from various themes were intermixed instead of blocked. The results showed an interaction between level of processing and list presentation format. It is thus concluded that the encoding of each word and of the whole list should both be considered in understanding false memory.

  1. Effect of liner design, pulsator setting, and vacuum level on bovine teat tissue changes and milking characteristics as measured by ultrasonography

    Directory of Open Access Journals (Sweden)

    Gleeson David E

    2004-05-01

    Friesian-type dairy cows were milked with different machine settings to determine the effect of these settings on teat tissue reaction and on milking characteristics. Three teat-cup liner designs were used with varying upper barrel dimensions (wide-bore WB = 31.6 mm; narrow-bore NB = 21.0 mm; narrow-bore NB1 = 25.0 mm). These liners were tested with alternate and simultaneous pulsation patterns, pulsator ratios (60:40 and 67:33) and three system vacuum levels (40, 44 and 50 kPa). Teat tissue was measured using ultrasonography, before milking and directly after milking. The measurements recorded were teat canal length (TCL), teat diameter (TD), cistern diameter (CD) and teat wall thickness (TWT). Teat tissue changes were similar with a system vacuum level of either 50 kPa (mid-level) or 40 kPa (low-level). Widening the liner upper barrel bore dimension from 21.0 mm (P

  2. A book of set theory

    CERN Document Server

    Pinter, Charles C

    2014-01-01

    Suitable for upper-level undergraduates, this accessible approach to set theory poses rigorous but simple arguments. Each definition is accompanied by commentary that motivates and explains new concepts. Starting with a repetition of the familiar arguments of elementary set theory, the level of abstract thinking gradually rises for a progressive increase in complexity.A historical introduction presents a brief account of the growth of set theory, with special emphasis on problems that led to the development of the various systems of axiomatic set theory. Subsequent chapters explore classes and

  3. Aumann Type Set-valued Lebesgue Integral and Representation Theorem

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2009-03-01

    In this paper, we shall firstly illustrate why we should discuss the Aumann type set-valued Lebesgue integral of a set-valued stochastic process with respect to time t, under the condition that the set-valued stochastic process takes values in the nonempty compact subsets of d-dimensional Euclidean space. After recalling some basic results about set-valued stochastic processes, we shall secondly prove that the Aumann type set-valued Lebesgue integral of such a set-valued stochastic process is itself a set-valued stochastic process. Finally, we shall give the representation theorem and prove an important inequality for the Aumann type set-valued Lebesgue integrals of set-valued stochastic processes with respect to t, which is useful for studying set-valued stochastic differential inclusions with applications in finance.

  4. The Effects of Test Trial and Processing Level on Immediate and Delayed Retention

    Science.gov (United States)

    Chang, Sau Hou

    2017-01-01

    The purpose of the present study was to investigate the effects of test trial and processing level on immediate and delayed retention. A 2 × 2 × 2 mixed ANOVA was used with two between-subjects factors of test trial (single test, repeated test) and processing level (shallow, deep), and one within-subject factor of final recall (immediate,…

  5. Beyond fun runs and fruit bowls: an evaluation of the meso-level processes that shaped the Australian Healthy Workers Initiative.

    Science.gov (United States)

    Grunseit, Anne C; Rowbotham, Samantha; Pescud, Melanie; Indig, Devon; Wutzke, Sonia

    2016-02-01

    Issue addressed: The Australian National Partnership Agreement on Preventive Health (NPAPH) charged states and territories with the development and implementation of the Healthy Workers Initiative (HWI) to improve workplace health promotion. Most evaluation efforts focus on the setting (micro) level. In the present study, the HWI was examined at the meso-level (state program development) to understand how jurisdictions navigated theoretical, practical, and political priorities to develop their programs, and the programmatic choices that support or hinder perceived success. Methods: Interviews with HWI program coordinators and managers across seven Australian jurisdictions explored decision-making processes related to developing and implementing the HWI and the impact of defunding. Interviews were audio-recorded, transcribed and analysed using thematic analysis. Results: Despite taking a variety of approaches to the HWI, jurisdictions had common goals, namely achieving sustainability and capacity for meaningful change. These goals transcended the performance indicators set out by the NPAPH, which were considered unachievable in the given timeframe. Four ways in which jurisdictions sought to achieve their goals were identified: 1) taking an embedded approach to workplace health promotion; 2) ensuring relevance of the HWI to businesses; 3) engaging in collaborative partnerships with agencies responsible for implementation; and 4) cultivating evolution of the HWI. Conclusions: This meso-level evaluation has provided valuable insights into how health promotion program coordinators translate broad, national-level initiatives into state-specific programs and how they define program success. The study findings also highlight how broader contextual factors, such as jurisdiction size, political imperatives and funding decisions, impact on the implementation and success of a national health promotion initiative. So what? When evaluating the translation of complex initiatives, a

  6. Levels of Information Processing in a Fitts law task (LIPFitts)

    Science.gov (United States)

    Mosier, K. L.; Hart, S. G.

    1986-01-01

    State-of-the-art flight technology has restructured the task of human operators, decreasing the need for physical and sensory resources, and increasing the quantity of cognitive effort required, changing it qualitatively. Recent technological advances have the most potential for impacting a pilot in two areas: performance and mental workload. In an environment in which timing is critical, additional cognitive processing can cause performance decrements, and increase a pilot's perception of the mental workload involved. The effects of stimulus processing demands on motor response performance and subjective mental workload are examined, using different combinations of response selection and target acquisition tasks. The information processing demands of the response selection were varied (e.g., Sternberg memory set tasks, math equations, pattern matching), as was the difficulty of the response execution. Response latency as well as subjective workload ratings varied in accordance with the cognitive complexity of the task. Movement times varied according to the difficulty of the response execution task. Implications in terms of real-world flight situations are discussed.
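
    For context, the target-acquisition component of such a task is conventionally described by Fitts' law, which in its original formulation relates movement time MT to target distance D and target width W (a standard result quoted here, not specific to this record):

        MT = a + b \log_2\!\left( \frac{2D}{W} \right)

    with a and b empirical constants; the response-selection manipulations described above add cognitive latency on top of this movement component.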

  7. Online measurement for geometrical parameters of wheel set based on structure light and CUDA parallel processing

    Science.gov (United States)

    Wu, Kaihua; Shao, Zhencheng; Chen, Nian; Wang, Wenjie

    2018-01-01

    The degree of wear of the wheel set tread is one of the main factors that influence the safety and stability of a running train. The geometrical parameters mainly include flange thickness and flange height. Line-structured laser light was projected onto the wheel tread surface, and the geometrical parameters can be deduced from the profile image. An online image acquisition system was designed based on asynchronous resetting of the CCD and a CUDA parallel processing unit. Image acquisition was performed in hardware interrupt mode. A high-efficiency parallel segmentation algorithm based on CUDA is proposed. The algorithm first divides the image into smaller squares, and then extracts the squares belonging to the target by a fusion of the k-means and STING clustering image segmentation algorithms. The segmentation time is less than 0.97 ms. A considerable acceleration ratio compared with serial CPU calculation was obtained, which greatly improves the real-time image processing capacity. With the system placed alongside the railway line, the geometrical parameters of a wheel set running at limited speed can be measured automatically. The maximum measuring speed is 120 km/h.
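
    The block-division idea can be illustrated with a short serial sketch. The Python code below stands in for the per-block CUDA kernels: it splits the image into squares and runs a tiny two-cluster k-means on per-block mean intensity to flag the blocks containing the bright laser line; the paper's k-means/STING fusion is deliberately simplified here to k-means alone.

        import numpy as np

        def segment_by_blocks(img, block=16, iters=10):
            """Divide img into square blocks, then run a 2-cluster k-means
            on per-block mean intensity; label 1 marks the brighter
            (laser-line) cluster. Each block is independent, so on a GPU
            the feature extraction would map to one thread block each."""
            h, w = img.shape
            bh, bw = h // block, w // block
            feats = img[:bh * block, :bw * block].reshape(bh, block, bw, block).mean(axis=(1, 3))
            x = feats.ravel()
            c = np.array([x.min(), x.max()], dtype=float)  # initial centroids
            for _ in range(iters):
                labels = np.abs(x[:, None] - c[None, :]).argmin(axis=1)
                for k in (0, 1):
                    if np.any(labels == k):
                        c[k] = x[labels == k].mean()
            return labels.reshape(bh, bw)

        mask = segment_by_blocks(np.random.rand(480, 640) ** 3)
        print(mask.shape, int(mask.sum()))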

  8. The Role of Relative Sea Level Changes in Diagenetic Processes and Stacking Pattern of Kangan Formation Sediments in one of the Persian Gulf Fields

    Directory of Open Access Journals (Sweden)

    حسن اشراقی

    2016-01-01

    The Lower to Middle Triassic Kangan Formation is one of the most significant carbonate gas reservoirs in Iranian territory. In this study, thin section data were used to recognize microfacies and sedimentary environments, and to examine the interaction between diagenetic processes and the facies stacking pattern in a sequence stratigraphic framework. Petrographic studies led to the recognition of eight microfacies related to three facies belts: tidal flat, lagoon, and shoal. Moreover, the observed microfacies patterns indicate a carbonate ramp platform as the depositional environment for this carbonate succession. The main diagenetic processes of the Kangan Formation include micritization and isopachous and fibrous cements (primary marine diagenesis), dissolution and moldic porosity (meteoric diagenesis), and compaction and stylolitization (secondary diagenesis). Based on facies changes, two third-order sequences were specified, each of which can be divided into two systems tracts: a transgressive systems tract (TST) and a highstand systems tract (HST). In addition, sequence boundaries were identified by bedded, massive, and nodular anhydrite. These facies, which are indicative of maximum sea level fall, were deposited in hypersaline lagoons. There is a close association between diagenetic processes and relative sea level changes in the Kangan Formation, such that the diagenetic processes of the studied succession have been controlled by the sediment stacking patterns during transgression and regression of sea level. During transgression, the main diagenetic processes are marine cementation in shoal facies and dolomitization in lagoon and tidal flat facies. During sea level fall, these processes include dissolution in shoal facies and dolomitization, anhydrite nodule formation, and cementation in lagoon and tidal flat settings.

  9. Modelling estimation and analysis of dynamic processes from image sequences using temporal random closed sets and point processes with application to the cell exocytosis and endocytosis

    OpenAIRE

    Díaz Fernández, Ester

    2010-01-01

    In this thesis, new models and methodologies are introduced for the analysis of dynamic processes characterized by image sequences with spatial temporal overlapping. The spatial temporal overlapping exists in many natural phenomena and should be addressed properly in several Science disciplines such as Microscopy, Material Sciences, Biology, Geostatistics or Communication Networks. This work is related to the Point Process and Random Closed Set theories, within Stochastic Ge...

  10. Ultrafuzziness Optimization Based on Type II Fuzzy Sets for Image Thresholding

    Directory of Open Access Journals (Sweden)

    Hudan Studiawan

    2010-11-01

    Image thresholding is one of the processing techniques used to provide a high-quality preprocessed image. Image vagueness and bad illumination are common obstacles that lead to poor thresholding output. By treating the image as a fuzzy set, several different fuzzy thresholding techniques have been proposed to remove these obstacles during threshold selection. In this paper, we propose an algorithm for image thresholding that uses ultrafuzziness optimization over type II fuzzy sets to decrease the uncertainty in the fuzzy system. The optimization is conducted by measuring ultrafuzziness for the background and object fuzzy sets separately. Experimental results demonstrate that the proposed image thresholding method performs well for images with high vagueness, low contrast, and grayscale ambiguity.
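
    A minimal sketch of this recipe follows, assuming a Huang–Wang style membership built from the two class means and the common ultrafuzziness measure from the type II thresholding literature; a single combined measure is used here rather than the paper's separate background/object optimization.

        import numpy as np

        def type2_fuzzy_threshold(img, alpha=2.0, bins=256):
            """Pick the gray level maximizing ultrafuzziness: the histogram-
            weighted gap between upper (mu**(1/alpha)) and lower (mu**alpha)
            memberships of an interval type II fuzzy set."""
            g = img.astype(float).ravel()
            hist, edges = np.histogram(g, bins=bins)
            levels = 0.5 * (edges[:-1] + edges[1:])
            p = hist / hist.sum()
            C = levels[-1] - levels[0] + 1e-12        # normalizing constant
            best_t, best_u = levels[0], -1.0
            for k in range(1, bins - 1):
                w0, w1 = p[:k].sum(), p[k:].sum()
                if w0 == 0 or w1 == 0:
                    continue
                m0 = (p[:k] * levels[:k]).sum() / w0  # background mean
                m1 = (p[k:] * levels[k:]).sum() / w1  # object mean
                mu = np.where(levels < levels[k],     # type I membership
                              1.0 / (1.0 + np.abs(levels - m0) / C),
                              1.0 / (1.0 + np.abs(levels - m1) / C))
                ultra = np.sum(p * (mu ** (1.0 / alpha) - mu ** alpha))
                if ultra > best_u:
                    best_t, best_u = levels[k], ultra
            return best_t

        # Bimodal toy image: the selected threshold falls between the two modes
        img = np.concatenate([np.random.normal(0.3, 0.05, 5000),
                              np.random.normal(0.7, 0.05, 5000)]).reshape(100, 100)
        print(type2_fuzzy_threshold(img))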

  11. Key processes from tree to stand level

    International Nuclear Information System (INIS)

    Hinckley, T.; Ford, D.; Segura, G.; Sprugel, D.

    1991-01-01

    Changes in six factors have been identified as having potential major future impacts on the productivity and survival of forest trees and stands. These factors are atmospheric carbon dioxide concentration, tropospheric ozone concentration, mean annual air temperature and precipitation, extremes in temperature and precipitation, and levels of ultraviolet radiation. Except for precipitation, all of these factors are expected to increase with climatic change. However, the likelihood of their increase or change ranges from the given to the unknown. The way in which one or more of these factors might individually or in combination affect the productivity and survival of trees is discussed, and particularly sensitive physiological processes are identified. For example, increases in winter temperature and a doubling of CO2 will result in early budburst in many species and therefore increase the risk of frost damage. In other species or locations, warm winters may mean insufficient chilling hours and the requirements for release from bud dormancy may not be met. The interaction of these processes with current species distribution, genotype selection, and management alternatives is reviewed. 52 refs., 1 fig., 1 tab

  12. High-Level waste process and product data annotated bibliography

    International Nuclear Information System (INIS)

    Stegen, G.E.

    1996-01-01

    The objective of this document is to provide information on previously issued documents that will assist interested parties in finding available data on high-level waste and transuranic waste feed compositions, properties, behavior in candidate processing operations, and behavior of candidate product glasses made from those wastes. This initial compilation is only a partial list of the available references

  13. Feasibility of large volume casting cementation process for intermediate level radioactive waste

    International Nuclear Information System (INIS)

    Chen Zhuying; Chen Baisong; Zeng Jishu; Yu Chengze

    1988-01-01

    Recent trends in radioactive waste treatment and disposal, both in China and abroad, are reviewed. The feasibility of the large volume casting cementation process for treating the intermediate-level radioactive waste from a spent fuel reprocessing plant and disposing of it in shallow land is assessed on the basis of analyses of experimental results (formulation studies, measurements of the properties of the solidified radioactive waste, etc.). It can be concluded that the large volume casting cementation process is a promising, safe, and economical process. It is feasible to dispose of the intermediate-level radioactive waste from the reprocessing plant if the chosen disposal site has reasonable geological and geographical conditions and some additional effective protection measures are taken

  14. The defense waste processing facility: the final processing step for defense high-level waste disposal

    International Nuclear Information System (INIS)

    Cowan, S.P.; Sprecher, W.M.; Walton, R.D.

    1983-01-01

    The policy of the U.S. Department of Energy is to pursue an aggressive and credible waste management program that advocates final disposal of government generated (defense) high-level nuclear wastes in a manner consistent with environmental, health, and safety responsibilities and requirements. The Defense Waste Processing Facility (DWPF) is an essential component of the Department's program. It is the first project undertaken in the United States to immobilize government generated high-level nuclear wastes for geologic disposal. The DWPF will be built at the Department's Savannah River Plant near Aiken, South Carolina. When construction is complete in 1989, the DWPF will begin processing the high-level waste at the Savannah River Plant into a borosilicate glass form, a highly insoluble and non-dispersible product, in easily handled canisters. The immobilized waste will be stored on site followed by transportation to and disposal in a Federal repository. The focus of this paper is on the DWPF. The paper discusses issues which justify the project, summarizes its technical attributes, analyzes relevant environmental and institutional factors, describes the management approach followed in transforming technical and other concepts into concrete and steel, and concludes with observations about the future role of the facility

  15. Accurate prediction of complex free surface flow around a high speed craft using a single-phase level set method

    Science.gov (United States)

    Broglia, Riccardo; Durante, Danilo

    2017-11-01

    This paper focuses on the analysis of a challenging free surface flow problem involving a surface vessel moving at high speeds, or planing. The investigation is performed using a general purpose high Reynolds free surface solver developed at CNR-INSEAN. The methodology is based on a second order finite volume discretization of the unsteady Reynolds-averaged Navier-Stokes equations (Di Mascio et al. in A second order Godunov-type scheme for naval hydrodynamics, Kluwer Academic/Plenum Publishers, Dordrecht, pp 253-261, 2001; Proceedings of 16th international offshore and polar engineering conference, San Francisco, CA, USA, 2006; J Mar Sci Technol 14:19-29, 2009); air/water interface dynamics is accurately modeled by a non standard level set approach (Di Mascio et al. in Comput Fluids 36(5):868-886, 2007a), known as the single-phase level set method. In this algorithm the governing equations are solved only in the water phase, whereas the numerical domain in the air phase is used for a suitable extension of the fluid dynamic variables. The level set function is used to track the free surface evolution; dynamic boundary conditions are enforced directly on the interface. This approach allows accurate prediction of the evolution of the free surface even in the presence of violent breaking wave phenomena, maintaining the interface sharp, without any need to smear out the fluid properties across the two phases. This paper is aimed at the prediction of the complex free-surface flow field generated by a deep-V planing boat at medium and high Froude numbers (from 0.6 up to 1.2). In the present work, the planing hull is treated as a two-degree-of-freedom rigid object. The flow field is characterized by the presence of thin water sheets, several energetic breaking waves and plungings. The computational results include convergence of the trim angle, sinkage and resistance under grid refinement; high-quality experimental data are used for the purposes of validation, allowing to
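
    As a toy illustration of the level set machinery this solver builds on (not the INSEAN discretization), the sketch below advects a signed distance function with a first-order upwind scheme; the free surface is wherever the function crosses zero.

        import numpy as np

        def advect_level_set(phi, u, v, dx, dt, steps):
            """First-order upwind advection of a level set function phi by
            a velocity field (u, v), with periodic boundaries via np.roll."""
            for _ in range(steps):
                dpx_m = (phi - np.roll(phi, 1, axis=1)) / dx   # backward x-difference
                dpx_p = (np.roll(phi, -1, axis=1) - phi) / dx  # forward x-difference
                dpy_m = (phi - np.roll(phi, 1, axis=0)) / dx
                dpy_p = (np.roll(phi, -1, axis=0) - phi) / dx
                phi = phi - dt * (np.maximum(u, 0) * dpx_m + np.minimum(u, 0) * dpx_p
                                  + np.maximum(v, 0) * dpy_m + np.minimum(v, 0) * dpy_p)
            return phi

        # A circular interface transported to the right by a uniform flow
        n = 128
        dx = 1.0 / n
        y, x = np.mgrid[0:n, 0:n] * dx
        phi0 = np.hypot(x - 0.3, y - 0.5) - 0.15  # signed distance to a circle
        phi1 = advect_level_set(phi0, u=1.0, v=0.0, dx=dx, dt=0.5 * dx, steps=50)
        print(float(phi1[n // 2, int(0.3 * n)]))  # > 0: the interface has moved on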

  16. A game on the universe of sets

    International Nuclear Information System (INIS)

    Saveliev, D I

    2008-01-01

    Working in set theory without the axiom of regularity, we consider a two-person game on the universe of sets. In this game, the players choose in turn an element of a given set, an element of this element and so on. A player wins if he leaves his opponent no possibility of making a move, that is, if he has chosen the empty set. Winning sets (those admitting a winning strategy for one of the players) form a natural hierarchy with levels indexed by ordinals (in the finite case, the ordinal indicates the shortest length of a winning strategy). We show that the class of hereditarily winning sets is an inner model containing all well-founded sets and that each of the four possible relations between the universe, the class of hereditarily winning sets, and the class of well-founded sets is consistent. As far as the class of winning sets is concerned, either it is equal to the whole universe, or many of the axioms of set theory cannot hold on this class. Somewhat surprisingly, this does not apply to the axiom of regularity: we show that the failure of this axiom is consistent with its relativization to winning sets. We then establish more subtle properties of winning non-well-founded sets. We describe all classes of ordinals for which the following is consistent: winning sets without minimal elements (in the sense of membership) occur exactly at the levels indexed by the ordinals of this class. In particular, we show that if an even level of the hierarchy of winning sets contains a set without minimal elements, then all higher levels contain such sets. We show that the failure of the axiom of regularity implies that all odd levels contain sets without minimal elements, but it is consistent with the absence of such sets at all even levels as well as with their appearance at an arbitrary even non-limit or countable-cofinal level. To obtain consistency results, we propose a new method for obtaining models with non-well-founded sets. Finally, we study how long this game can

  17. West Valley demonstration project: alternative processes for solidifying the high-level wastes

    International Nuclear Information System (INIS)

    Holton, L.K.; Larson, D.E.; Partain, W.L.; Treat, R.L.

    1981-10-01

    In 1980, the US Department of Energy (DOE) established the West Valley Solidification Project as the result of legislation passed by the US Congress. The purpose of this project was to carry out a high-level nuclear waste management demonstration project at the Western New York Nuclear Service Center in West Valley, New York. The DOE authorized the Pacific Northwest Laboratory (PNL), which is operated by Battelle Memorial Institute, to assess alternative processes for treatment and solidification of the WNYNSC high-level wastes. The Process Alternatives Study is the subject of this report. Two pretreatment approaches and several waste form processes were selected for evaluation in this study. The two waste treatment approaches were the salt/sludge separation process and the combined waste process. Both terminal and interim waste form processes were studied. The terminal waste form processes considered were: borosilicate glass, low-alkali glass, marbles-in-lead matrix, and crystalline

  18. Cooperative Fuzzy Games Approach to Setting Target Levels of ECs in Quality Function Deployment

    Directory of Open Access Journals (Sweden)

    Zhihui Yang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing a new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to developing the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrated example is cited to demonstrate the application and performance of the proposed approach.

  19. Cooperative fuzzy games approach to setting target levels of ECs in quality function deployment.

    Science.gov (United States)

    Yang, Zhihui; Chen, Yizeng; Yin, Yunqiang

    2014-01-01

    Quality function deployment (QFD) can provide a means of translating customer requirements (CRs) into engineering characteristics (ECs) for each stage of product development and production. The main objective of QFD-based product planning is to determine the target levels of ECs for a new product or service. QFD is a breakthrough tool which can effectively reduce the gap between CRs and a new product/service. Even though there are conflicts among some ECs, the objective of developing new product is to maximize the overall customer satisfaction. Therefore, there may be room for cooperation among ECs. A cooperative game framework combined with fuzzy set theory is developed to determine the target levels of the ECs in QFD. The key to develop the model is the formulation of the bargaining function. In the proposed methodology, the players are viewed as the membership functions of ECs to formulate the bargaining function. The solution for the proposed model is Pareto-optimal. An illustrated example is cited to demonstrate the application and performance of the proposed approach.
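
    The bargaining-function idea in these two records can be sketched compactly: treat each EC's fuzzy membership (satisfaction) function as a player and maximize a Nash-style product of memberships under a resource constraint. The membership shapes and the budget below are illustrative assumptions, not values taken from the paper.

        import numpy as np
        from scipy.optimize import minimize

        def memberships(x):
            """Satisfaction of three ECs as their target levels rise from 0 to 1;
            the shapes are assumptions chosen only to create a mild conflict."""
            m1 = x[0]                        # EC1: the higher the target, the better
            m2 = 1.0 - x[1]                  # EC2: conflicts with raising x[1]
            m3 = np.sqrt(max(x[2], 0.0))     # EC3: diminishing returns
            return np.clip([m1, m2, m3], 1e-9, 1.0)

        def neg_log_product(x):
            # Nash bargaining: maximize the product of memberships,
            # i.e. minimize the negative sum of their logarithms
            return -np.sum(np.log(memberships(x)))

        budget = {'type': 'ineq',
                  'fun': lambda x: 1.5 - (x[0] + 0.5 * x[1] + x[2])}
        res = minimize(neg_log_product, x0=[0.5, 0.5, 0.5],
                       bounds=[(0.0, 1.0)] * 3, constraints=[budget])
        print(res.x)  # Pareto-optimal EC target levels under the assumed model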

  20. TASKS OF INNOVATION PROCESSES PROGRAM-TARGET MANAGEMENT AT REGIONAL LEVEL

    Directory of Open Access Journals (Sweden)

    Mikhail Shchepakin

    2015-09-01

    The article analyzes the regional system of innovation management, discusses the existing problems of program-oriented management of innovation processes at the regional level, and suggests possible solutions to improve the efficiency of the regional innovation system.

  1. Spacelab Level 4 Programmatic Implementation Assessment Study. Volume 2: Ground Processing requirements

    Science.gov (United States)

    1978-01-01

    Alternate ground processing options are summarized, including installation and test requirements for payloads, space processing, combined astronomy, and life sciences. The level 4 integration resource requirements are also reviewed for: personnel, temporary relocation, transportation, ground support equipment, and Spacelab flight hardware.

  2. Is the Levels of Processing effect language-limited?

    OpenAIRE

    Baddeley, Alan David; Hitch, Graham James

    2017-01-01

    The concept of Levels of Processing (LOP), proposing that deep coding enhances retention, has played a central role in the study of episodic memory. Evidence has however been based almost entirely on retention of individual words. Across five experiments, we compare LOP effects between visual and verbal stimuli, using judgments of pleasantness as a method of inducing deep encoding and a range of shallow encoding judgments selected so as to be applicable to both verbal and visual stimuli. LOP e...

  3. Proceduralism and its role in economic evaluation and priority setting in health.

    Science.gov (United States)

    Jan, Stephen

    2014-05-01

    This paper provides a critical overview of Gavin Mooney's proceduralist approach to economic evaluation and priority setting in health. Proceduralism is the notion that the social value attached to alternative courses of action should be determined not only by outcomes, but also processes. Mooney's brand of proceduralism was unique and couched within a broader critique of 'neo-liberal' economics. It operated on a number of levels. At the micro level of the individual program, he pioneered the notion that 'process utility' could be valued and measured within economic evaluation. At a macro level, he developed a framework in which the social objective of equity was defined by procedural justice in which communitarian values were used as the basis for judging how resources should be allocated across the health system. Finally, he applied the notion of procedural justice to further our understanding of the political economy of resource allocation; highlighting how fairness in decision making processes can overcome the sometimes intractable zero-sum resource allocation problem. In summary, his contributions to this field have set the stage for innovative programs of research to help in developing health policies and programs that are both in alignment with community values and implementable. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Optimal PID settings for first and second-order processes - Comparison with different controller tuning approaches

    OpenAIRE

    Pappas, Iosif

    2016-01-01

    PID controllers are extensively used in industry. Although many tuning methodologies exist, finding good controller settings is not an easy task and frequently optimization-based design is preferred to satisfy more complex criteria. In this thesis, the focus was to find which tuning approaches, if any, present close to optimal behavior. Pareto-optimal controllers were found for different first and second-order processes with time delay. Performance was quantified in terms of the integrat...

  5. An analysis on the level changing of UET and SET in blood and urine in early stage of kidney disease caused by diabetes

    International Nuclear Information System (INIS)

    Liu Juzhen; Yang Wenying; Cai Tietie

    2001-01-01

    Objective: To study the relationship between UET and SET variation and early changes of diabetic nephropathy. Methods: UET and SET were measured in 24 patients with diabetes, 19 with early stage diabetic nephropathy, 21 with advanced diabetic nephropathy, and 30 normal controls. Results: A marked rise of UET and SET was observed in all patients when compared with the normal controls, and a correlation with β2-macroglobulin was revealed (P<0.05). Conclusion: UET and SET levels rose as diabetic nephropathy progressed. As a result, UET and SET may act as sensitive indices in diagnosing early stage diabetic nephropathy

  6. Task-set inertia and memory-consolidation bottleneck in dual tasks.

    Science.gov (United States)

    Koch, Iring; Rumiati, Raffaella I

    2006-11-01

    Three dual-task experiments examined the influence of processing a briefly presented visual object for deferred verbal report on performance in an unrelated auditory-manual reaction time (RT) task. RT was increased at short stimulus-onset asynchronies (SOAs) relative to long SOAs, showing that memory consolidation processes can produce a functional processing bottleneck in dual-task performance. In addition, the experiments manipulated the spatial compatibility of the orientation of the visual object and the side of the speeded manual response. This cross-task compatibility produced relative RT benefits only when the instruction for the visual task emphasized overlap at the level of response codes across the task sets (Experiment 1). However, once the effective task set was in place, it continued to produce cross-task compatibility effects even in single-task situations ("ignore" trials in Experiment 2) and when instructions for the visual task did not explicitly require spatial coding of object orientation (Experiment 3). Taken together, the data suggest a considerable degree of task-set inertia in dual-task performance, which is also reinforced by finding costs of switching task sequences (e.g., AC --> BC vs. BC --> BC) in Experiment 3.

  7. Priority Setting in Indigenous Health: Why We Need an Explicit Decision Making Approach

    Directory of Open Access Journals (Sweden)

    Michael E. Otim

    2015-06-01

    Indigenous Australians have significantly poorer health outcomes than the non-Indigenous population, as do Indigenous peoples worldwide. The Australian government has increased its investment in Indigenous health through the "Closing the Health Gap" initiative. Deciding where to invest scarce resources so as to maximize health outcomes for Indigenous peoples may require improved priority setting processes. Current government practice involves a mix of implicit and explicit processes to varying degrees at the macro and meso decision making levels. In this article, we argue that explicit priority setting should be emphasized in Indigenous health, as it can ensure that the decision making process is accountable, systematic, and transparent. Following a review of the literature, we outline four key issues that need to be considered for explicit priority setting: developing an Indigenous health "constitution," strengthening the evidence base, selecting mechanisms for priority setting, and establishing appropriate incentives and institutional structure. We then summarize our findings into a checklist that can help decision makers ensure that explicit priority setting is undertaken in Indigenous health. By addressing these key issues, the benefits of an explicit approach, which include increased efficiency, equity, and use of evidence, can be realized, thereby maximizing Indigenous health outcomes.

  8. Innovations in Setting Performance Standards for K-12 Test-Based Accountability

    Science.gov (United States)

    Huff, Kristen; Plake, Barbara S.

    2010-01-01

    Standard setting is a systematic process that uses a combination of judgmental and empirical procedures to make recommendations about where on the score continuum "cut scores" should be placed. Cut scores divide the score scale into categories consistent with the descriptions of student performance associated with multiple levels of achievement.…

  9. ISP: an optimal out-of-core image-set processing streaming architecture for parallel heterogeneous systems.

    Science.gov (United States)

    Ha, Linh Khanh; Krüger, Jens; Dihl Comba, João Luiz; Silva, Cláudio T; Joshi, Sarang

    2012-06-01

    Image population analysis is the class of statistical methods that plays a central role in understanding the development, evolution, and disease of a population. However, these techniques often require excessive computational power and memory, demands that are compounded by a large number of volumetric inputs. Restricted access to supercomputing power limits their influence in general research and practical applications. In this paper we introduce ISP, an Image-Set Processing streaming framework that harnesses the processing power of commodity heterogeneous CPU/GPU systems and attempts to solve this computational problem. In ISP, we introduce specially designed streaming algorithms and data structures that provide an optimal solution for out-of-core multi-image processing problems, both in terms of memory usage and computational efficiency. ISP makes use of the asynchronous execution mechanism supported by parallel heterogeneous systems to efficiently hide the inherent latency of the processing pipeline of out-of-core approaches. Consequently, with computationally intensive problems, the ISP out-of-core solution can achieve the same performance as the in-core solution. We demonstrate the efficiency of the ISP framework on synthetic and real datasets.
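
    The out-of-core pattern at the heart of such a framework can be sketched with a generator-based pipeline: each volume is streamed in fixed-size chunks and folded into a running statistic, so no full image set is ever resident in memory. This is a minimal single-threaded illustration with hypothetical names; ISP's asynchronous CPU/GPU double-buffering is only hinted at in the comments.

        import os
        import tempfile
        import numpy as np

        def stream_chunks(path, shape, dtype=np.float32, chunk_voxels=1 << 20):
            """Yield flat chunks of a volume stored on disk via memory
            mapping, so only one chunk is resident at a time."""
            flat = np.memmap(path, dtype=dtype, mode='r', shape=shape).reshape(-1)
            for start in range(0, flat.size, chunk_voxels):
                # explicit copy: in ISP this would be an async GPU upload slot
                yield np.array(flat[start:start + chunk_voxels])

        def mean_volume(paths, shape):
            """Out-of-core voxelwise mean over an image population; in ISP
            the per-chunk work would be overlapped with transfers."""
            acc = np.zeros(int(np.prod(shape)), dtype=np.float64)
            for p in paths:
                offset = 0
                for chunk in stream_chunks(p, shape):
                    acc[offset:offset + chunk.size] += chunk
                    offset += chunk.size
            return (acc / len(paths)).reshape(shape)

        # Toy usage with two small volumes written to disk first
        shape, paths = (32, 32, 32), []
        for i in range(2):
            f = os.path.join(tempfile.gettempdir(), 'vol%d.raw' % i)
            np.full(shape, i + 1.0, dtype=np.float32).tofile(f)
            paths.append(f)
        print(mean_volume(paths, shape).mean())  # 1.5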

  10. The Effects of Test Anxiety on Learning at Superficial and Deep Levels of Processing.

    Science.gov (United States)

    Weinstein, Claire E.; And Others

    1982-01-01

    Using a deep-level processing strategy, low test-anxious college students performed significantly better than high test-anxious students in learning a paired-associate word list. Using a superficial-level processing strategy resulted in no significant difference in performance. A cognitive-attentional theory and test anxiety mechanisms are…

  11. Relevant Factors for Implementation of Operational-level IS/ICT Processes in Small IT Organizations

    Directory of Open Access Journals (Sweden)

    Jaroslav Kalina

    2010-10-01

    Having IS/ICT processes compliant with well-known standards like COBIT or ITIL is relatively popular, especially among larger organizations (at which these standards are primarily aimed). This paper discusses how the standardization of processes affects, or is affected by, a selected set of process characteristics, and tries to provide general guidelines that should be considered prior to the implementation of these standards. Special attention is paid to the specifics of small IS/ICT organizations, since the implementation of these frameworks (intended for rather larger organizations) represents a more demanding endeavor in this context.

  12. Knowledge Reduction Based on Divide and Conquer Method in Rough Set Theory

    Directory of Open Access Journals (Sweden)

    Feng Hu

    2012-01-01

    The divide and conquer method is a typical granular computing method using multiple levels of abstraction and granulation. Although some results based on the divide and conquer method have been achieved in rough set theory, systematic methods for knowledge reduction based on the divide and conquer method are still absent. In this paper, knowledge reduction approaches based on the divide and conquer method are presented for the equivalence relation and the tolerance relation, respectively. After that, a systematic approach, named the abstract process for knowledge reduction based on the divide and conquer method in rough set theory, is proposed. Based on the presented approach, two algorithms for knowledge reduction, including an algorithm for attribute reduction and an algorithm for attribute value reduction, are presented. Experimental evaluations of the methods were performed on UCI data sets and the KDDCUP99 data sets. The experimental results illustrate that the proposed approaches are efficient at processing large data sets, with good recognition rates compared with KNN, SVM, C4.5, Naive Bayes, and CART.
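
    The core objects here, equivalence classes of the indiscernibility relation and attribute reducts, are easy to sketch. Below, equivalence classes are computed by grouping on attribute values (each attribute further splitting the classes, which is the divide-and-conquer flavor of the computation), and a reduct is grown greedily by dependency degree; the greedy heuristic is a common stand-in, not the paper's exact algorithm.

        from collections import defaultdict

        def partition(rows, attrs):
            """Equivalence classes: object indices grouped by their values
            on attrs (empty attrs puts everything in one class)."""
            classes = defaultdict(list)
            for i, row in enumerate(rows):
                classes[tuple(row[a] for a in attrs)].append(i)
            return list(classes.values())

        def dependency(rows, attrs, decision):
            """Fraction of objects in the positive region: classes whose
            members all share the same decision value."""
            pos = sum(len(c) for c in partition(rows, attrs)
                      if len({rows[i][decision] for i in c}) == 1)
            return pos / len(rows)

        def greedy_reduct(rows, cond_attrs, decision):
            """Add the attribute with the largest dependency gain until the
            full dependency is reached; not guaranteed to be minimal."""
            reduct = []
            target = dependency(rows, cond_attrs, decision)
            while dependency(rows, reduct, decision) < target:
                best = max((a for a in cond_attrs if a not in reduct),
                           key=lambda a: dependency(rows, reduct + [a], decision))
                reduct.append(best)
            return reduct

        rows = [{'a': 0, 'b': 0, 'd': 0}, {'a': 0, 'b': 1, 'd': 1},
                {'a': 1, 'b': 0, 'd': 1}, {'a': 1, 'b': 1, 'd': 1}]
        print(greedy_reduct(rows, ['a', 'b'], 'd'))  # ['a', 'b'] for this table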

  13. Volatility Determination in an Ambit Process Setting

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Graversen, Svend-Erik

    The probability limit behaviour of normalised quadratic variation is studied for a simple tempo-spatial ambit process, with particular regard to the question of volatility memorylessness.

  14. A conditioned level-set method with block-division strategy to flame front extraction based on OH-PLIF measurements

    International Nuclear Information System (INIS)

    Han Yue; Cai Guo-Biao; Xu Xu; Bruno Renou; Abdelkrim Boukhalfa

    2014-01-01

    A novel approach to extracting flame fronts, called the conditioned level-set method with block division (CLSB), has been developed. Within a two-phase level-set formulation, the conditioned initialization and region-lock optimization improve the efficiency and accuracy of the flame contour identification. The original block-division strategy enables the approach to be unsupervised, calculating local self-adaptive threshold values autonomously before binarization. The CLSB approach has been applied to a large set of experimental data involving swirl-stabilized premixed combustion in diluted regimes operating at atmospheric pressure. OH-PLIF measurements were carried out in this framework. The resulting images are thus characterized by lower signal-to-noise ratios (SNRs) than the ideal image; relatively complex flame structures lead to significant non-uniformity in the OH signal intensity; and the magnitude of the maximum OH gradient observed along the flame front can also vary depending on the flow or the local stoichiometry. Compared with conventional edge detection operators, the CLSB method demonstrates a good ability to deal with OH-PLIF images at low SNR and in the presence of multiple scales of both OH intensity and OH gradient. The robustness to noise sensitivity and intensity inhomogeneity has been evaluated across a range of experimental images of diluted flames, as well as against a circle test as Ground Truth (GT). (interdisciplinary physics and related areas of science and technology)
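
    The block-division step, computing a local self-adaptive threshold per square before binarization, can be sketched as below. Plain per-block Otsu is used as an assumed stand-in for the CLSB threshold rule, and the level-set evolution and region-lock steps are not reproduced.

        import numpy as np

        def otsu(vals, bins=64):
            """Classic Otsu threshold on a 1-D sample (maximizes the
            between-class variance of the histogram)."""
            hist, edges = np.histogram(vals, bins=bins)
            p = hist.astype(float) / max(hist.sum(), 1)
            centers = 0.5 * (edges[:-1] + edges[1:])
            w0 = np.cumsum(p)
            m0 = np.cumsum(p * centers)
            mt = m0[-1]
            with np.errstate(divide='ignore', invalid='ignore'):
                var = (mt * w0 - m0) ** 2 / (w0 * (1.0 - w0))
            var[~np.isfinite(var)] = -1.0
            return centers[int(np.argmax(var))]

        def blockwise_binarize(img, block=32):
            """Binarize each square of an OH-PLIF-like image with its own
            local threshold, so dim and bright flame regions are handled
            without a single global cut."""
            out = np.zeros_like(img, dtype=bool)
            h, w = img.shape
            for r in range(0, h, block):
                for c in range(0, w, block):
                    tile = img[r:r + block, c:c + block]
                    out[r:r + block, c:c + block] = tile > otsu(tile.ravel())
            return out

        print(blockwise_binarize(np.random.rand(128, 128)).mean())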

  15. The utility of imputed matched sets. Analyzing probabilistically linked databases in a low information setting.

    Science.gov (United States)

    Thomas, A M; Cook, L J; Dean, J M; Olson, L M

    2014-01-01

    To compare results from high probability matched sets versus imputed matched sets across differing levels of linkage information. A series of linkages with varying amounts of available information were performed on two simulated datasets derived from multiyear motor vehicle crash (MVC) and hospital databases, where true matches were known. Distributions of high probability and imputed matched sets were compared against the true match population for occupant age, MVC county, and MVC hour. Regression models were fit to simulated log hospital charges and hospitalization status. High probability and imputed matched sets were not significantly different from the true match population for occupant age, MVC county, and MVC hour in high information settings (p > 0.999). In low information settings, high probability matched sets were significantly different from the true match population for occupant age and MVC county, whereas imputed matched sets were not (p > 0.493). High information settings saw no significant differences in inference of simulated log hospital charges and hospitalization status between the two methods. High probability and imputed matched sets were significantly different from the outcomes in low information settings; however, imputed matched sets were more robust. The level of information available to a linkage is an important consideration. High probability matched sets are suitable for high to moderate information settings and for situations involving case-specific analysis. Conversely, imputed matched sets are preferable for low information settings when conducting population-based analyses.

  16. How is the process of setting micronutrients recommendations reflected in nutrition policies in Poland? The case study of folate

    Directory of Open Access Journals (Sweden)

    Ewa Sicińska

    2018-03-01

    The current Polish nutrition recommendations for folate are consistent with the levels set by most other countries. The constant improvement of nutritional knowledge on folate among consumers, especially young women, is necessary.

  17. Solidification of low-level radioactive liquid waste using a cement-silicate process

    International Nuclear Information System (INIS)

    Grandlund, R.W.; Hayes, J.F.

    1979-01-01

    Extensive use has been made of silicate and Portland cement for the solidification of industrial waste and recently this method has been successfully used to solidify a variety of low level radioactive wastes. The types of wastes processed to date include fuel fabrication sludges, power reactor waste, decontamination solution, and university laboratory waste. The cement-silicate process produces a stable solid with a minimal increase in volume and the chemicals are relatively inexpensive and readily available. The method is adaptable to either batch or continuous processing and the equipment is simple. The solid has leaching characteristics similar to or better than plain Portland cement mixtures and the leaching can be further reduced by the use of ion-exchange additives. The cement-silicate process has been used to solidify waste containing high levels of boric acid, oils, and organic solvents. The experience of handling the various types of liquid waste with a cement-silicate system is described

  18. Levels of processing and language modality specificity in working memory.

    Science.gov (United States)

    Rudner, Mary; Karlsson, Thomas; Gunnarsson, Johan; Rönnberg, Jerker

    2013-03-01

    Neural networks underpinning working memory demonstrate sign language specific components possibly related to differences in temporary storage mechanisms. A processing approach to memory systems suggests that the organisation of memory storage is related to type of memory processing as well. In the present study, we investigated for the first time semantic, phonological and orthographic processing in working memory for sign- and speech-based language. During fMRI we administered a picture-based 2-back working memory task with Semantic, Phonological, Orthographic and Baseline conditions to 11 deaf signers and 20 hearing non-signers. Behavioural data showed poorer and slower performance for both groups in Phonological and Orthographic conditions than in the Semantic condition, in line with depth-of-processing theory. An exclusive masking procedure revealed distinct sign-specific neural networks supporting working memory components at all three levels of processing. The overall pattern of sign-specific activations may reflect a relative intermodality difference in the relationship between phonology and semantics influencing working memory storage and processing. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Pseudo-set framing.

    Science.gov (United States)

    Barasz, Kate; John, Leslie K; Keenan, Elizabeth A; Norton, Michael I

    2017-10-01

    Pseudo-set framing (arbitrarily grouping items or tasks together as part of an apparent "set") motivates people to reach perceived completion points. Pseudo-set framing changes gambling choices (Study 1), effort (Studies 2 and 3), giving behavior (Field Data and Study 4), and purchase decisions (Study 5). These effects persist in the absence of any reward, when a cost must be incurred, and after participants are explicitly informed of the arbitrariness of the set. Drawing on Gestalt psychology, we develop a conceptual account that predicts what will (and will not) act as a pseudo-set, and defines the psychological process through which these pseudo-sets affect behavior: over and above typical reference points, pseudo-set framing alters perceptions of (in)completeness, making intermediate progress seem less complete. In turn, these feelings of incompleteness motivate people to persist until the pseudo-set has been fulfilled. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Goal oriented Mathematics Survey at Preparatory Level- Revised set ...

    African Journals Online (AJOL)

    This cross-sectional study of the mathematics syllabi at the preparatory level of high schools investigated how efficiently the subject serves as a basis for the several streams found at tertiary level, such as Natural Science, Technology, Computer Science, Health Science and Agriculture.

  1. Increased prolactin levels are associated with impaired processing speed in subjects with early psychosis.

    Science.gov (United States)

    Montalvo, Itziar; Gutiérrez-Zotes, Alfonso; Creus, Marta; Monseny, Rosa; Ortega, Laura; Franch, Joan; Lawrie, Stephen M; Reynolds, Rebecca M; Vilella, Elisabet; Labad, Javier

    2014-01-01

    Hyperprolactinaemia, a common side effect of some antipsychotic drugs, is also present in drug-naïve psychotic patients and subjects at risk for psychosis. Recent studies in non-psychiatric populations suggest that increased prolactin may have negative effects on cognition. The aim of our study was to explore whether high plasma prolactin levels are associated with poorer cognitive functioning in subjects with early psychosis. We studied 107 participants: 29 healthy subjects and 78 subjects with an early psychosis (55 with a psychotic disorder and 23 at high risk for psychosis). Prolactin levels were determined, as well as total cortisol levels in plasma. Psychopathological status was assessed and the use of psychopharmacological treatments (antipsychotics, antidepressants, benzodiazepines) recorded. Prolactin levels were negatively associated with cognitive performance in processing speed in patients with a psychotic disorder and in high-risk subjects. In the latter group, increased prolactin levels were also associated with impaired reasoning and problem solving and poorer general cognition. In a multiple linear regression analysis conducted in both high-risk and psychotic patients, controlling for potential confounders, prolactin and benzodiazepines were independently related to poorer cognitive performance in the speed of processing domain. A mediation analysis showed that both prolactin and benzodiazepine treatment act as mediators of the relationship between risperidone/paliperidone treatment and speed of processing. These results suggest that increased prolactin levels are associated with impaired processing speed in early psychosis. If these results are confirmed in future studies, strategies targeting reduction of prolactin levels may improve cognition in this population.

  2. A Proof of Factorization Theorem of Drell–Yan Process at Operator Level

    International Nuclear Information System (INIS)

    Zhou Gao-Liang

    2016-01-01

    An alternative proof of the factorization theorem for the Drell–Yan process that works at operator level is presented in this paper. Contributions of interactions after the hard collision are proved to cancel at operator level for such inclusive processes, according to the unitarity of the time evolution operator. After this cancellation, there is no longer a leading pinch singular surface in the Glauber region in the time evolution of the electromagnetic currents. Effects of soft gluons are absorbed into Wilson lines of scalar-polarized gluons. The cancellation of soft gluons is attributed to the unitarity of the time evolution operator and to such Wilson lines. (paper)

  3. Advice concerning the advantages of a reference incinerator for low-level and intermediate-level radioactive waste processing

    International Nuclear Information System (INIS)

    Luyten, G.B.

    1985-05-01

    In this report, an inventory is presented of new incinerators and flue gas filters used in the combustion of low- and intermediate-level radioactive waste. It is argued that a 'reference equipment' for the combustion of solid and liquid low- and intermediate-level wastes best meets existing Dutch radiation protection standards. A cost-benefit analysis of such equipment is given, including annual costs of investment, capital and operation. Finally, a separate combustion process for organic liquids and animal carcasses is considered. (G.J.P.)

  4. Set-Valued Stochastic Lebesgue Integral and Representation Theorems

    Directory of Open Access Journals (Sweden)

    Jungang Li

    2008-06-01

    Full Text Available In this paper, we shall firstly illustrate why we should introduce set-valued stochastic integrals, and then we shall discuss some properties of set-valued stochastic processes and the relation between a set-valued stochastic process and its selection set. After recalling the Aumann type definition of the stochastic integral, we shall introduce a new definition of the Lebesgue integral of a set-valued stochastic process with respect to the time t. Finally we shall prove the representation theorem of the set-valued stochastic integral and discuss further properties that will be useful for the study of set-valued stochastic differential equations and their applications.
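
    For orientation, the Aumann-type definition recalled in the abstract is commonly stated as follows (a standard textbook form; the paper's precise measurability and integrability conditions may differ):

    \[
      \int_0^T F_t \,\mathrm{d}t
        \;=\;
      \left\{\, \int_0^T f_t \,\mathrm{d}t \;:\;
              f \text{ is an integrable selection of } F,\; f_t \in F_t \ \text{a.e.} \right\}
    \]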

  5. Priority setting: what constitutes success? A conceptual framework for successful priority setting.

    Science.gov (United States)

    Sibbald, Shannon L; Singer, Peter A; Upshur, Ross; Martin, Douglas K

    2009-03-05

    The sustainability of healthcare systems worldwide is threatened by a growing demand for services and expensive innovative technologies. Decision makers struggle in this environment to set priorities appropriately, particularly because they lack consensus about which values should guide their decisions. One way to approach this problem is to determine what all relevant stakeholders understand successful priority setting to mean. The goal of this research was to develop a conceptual framework for successful priority setting. Three separate empirical studies were completed using qualitative data collection methods (one-on-one interviews with healthcare decision makers from across Canada; focus groups with representation of patients, caregivers and policy makers; and a Delphi study including scholars and decision makers from five countries). This paper synthesizes the findings from the three studies into a framework of ten separate but interconnected elements germane to successful priority setting: stakeholder understanding, shifted priorities/reallocation of resources, decision-making quality, stakeholder acceptance and satisfaction, positive externalities, stakeholder engagement, use of an explicit process, information management, consideration of values and context, and a revision or appeals mechanism. The ten elements specify both quantitative and qualitative dimensions of priority setting and relate to both process and outcome components. To our knowledge, this is the first framework that describes successful priority setting. The ten elements identified in this research provide guidance for decision makers and a common language to discuss priority setting success and work toward improving priority setting efforts.

  6. Standardised assessment of functioning in ADHD: consensus on the ICF Core Sets for ADHD.

    Science.gov (United States)

    Bölte, Sven; Mahdi, Soheil; Coghill, David; Gau, Susan Shur-Fen; Granlund, Mats; Holtmann, Martin; Karande, Sunil; Levy, Florence; Rohde, Luis A; Segerer, Wolfgang; de Vries, Petrus J; Selb, Melissa

    2018-02-12

    Attention-deficit/hyperactivity disorder (ADHD) is associated with significant impairments in social, educational, and occupational functioning, as well as specific strengths. Currently, there is no internationally accepted standard to assess the functioning of individuals with ADHD. WHO's International Classification of Functioning, Disability and Health, child and youth version (ICF), can serve as a conceptual basis for such a standard. The objective of this study is to develop a comprehensive, a common brief, and three age-appropriate brief ICF Core Sets for ADHD. Using a standardised methodology, four international preparatory studies generated 132 second-level ICF candidate categories that served as the basis for developing the ADHD Core Sets. Using these categories and following an iterative consensus process, 20 ADHD experts from nine professional disciplines, representing all six WHO regions, selected the most relevant categories to constitute the ADHD Core Sets. The consensus process resulted in 72 second-level ICF categories forming the comprehensive ICF Core Set; these represented 8 body-function, 35 activities-and-participation, and 29 environmental categories. A Common Brief Core Set that included 38 categories was also defined. Age-specific brief Core Sets included a 47-category preschool version for 0-5 year olds, a 55-category school-age version for 6-16 year olds, and a 52-category version for older adolescents and adults aged 17 years and above. The ICF Core Sets for ADHD mark a milestone toward an internationally standardised functional assessment of ADHD across the lifespan, and across educational, administrative, clinical, and research settings.

  7. DOE's planning process for mixed low-level waste disposal

    International Nuclear Information System (INIS)

    Case, J.T.; Letourneau, M.J.; Chu, M.S.Y.

    1995-01-01

    A disposal planning process was established by the Department of Energy (DOE) Mixed Low-Level Waste (MLLW) Disposal Workgroup. The process, jointly developed with the States, includes three steps: site screening, site evaluation, and a configuration study. As a result of the screening process, 28 sites have been eliminated from further consideration for MLLW disposal and 4 sites have been assigned a lower priority for evaluation. Currently 16 sites are being evaluated by the DOE for their potential strengths and weaknesses as MLLW disposal sites. The results of the evaluation will provide a general idea of the technical capability of the 16 disposal sites; the results can also be used to identify which treated MLLW streams can be disposed of on-site and which should be disposed of off-site. The information will then serve as the basis for a disposal configuration study, including analysis of both technical and non-technical issues, that will lead to the ultimate decision on MLLW disposal site locations

  8. Level of processing modulates the neural correlates of emotional memory formation

    OpenAIRE

    Ritchey, Maureen; LaBar, Kevin S.; Cabeza, Roberto

    2010-01-01

    Emotion is known to influence multiple aspects of memory formation, including the initial encoding of the memory trace and its consolidation over time. However, the neural mechanisms whereby emotion impacts memory encoding remain largely unexplored. The present study employed a levels-of-processing manipulation to characterize the impact of emotion on encoding with and without the influence of elaborative processes. Participants viewed emotionally negative, neutral, and positive scenes under ...

  9. Delta, theta, beta, and gamma brain oscillations index levels of auditory sentence processing.

    Science.gov (United States)

    Mai, Guangting; Minett, James W; Wang, William S-Y

    2016-06-01

    A growing number of studies indicate that multiple ranges of brain oscillations, especially the delta (δ), theta (θ), beta (β), and gamma (γ) bands, are engaged in speech and language processing. It is not clear, however, how these oscillations relate to functional processing at different linguistic hierarchical levels. Using scalp electroencephalography (EEG), the current study tested the hypothesis that phonological and higher-level linguistic (semantic/syntactic) organizations during auditory sentence processing are indexed by distinct EEG signatures derived from the δ, θ, β, and γ oscillations. We analyzed specific EEG signatures while subjects listened to Mandarin speech stimuli in three different conditions designed to dissociate phonological from semantic/syntactic processing: (1) sentences comprising valid disyllabic words assembled in a valid syntactic structure (real-word condition); (2) utterances with morphologically valid syllables, but not constituting valid disyllabic words (pseudo-word condition); and (3) backward versions of the real-word and pseudo-word conditions. We tested four signatures: band power, EEG-acoustic entrainment (EAE), cross-frequency coupling (CFC), and inter-electrode renormalized partial directed coherence (rPDC). The results show significant effects of band power and EAE of δ and θ oscillations for phonological, rather than semantic/syntactic, processing, indicating the importance of tracking δ- and θ-rate phonetic patterns during phonological analysis. We also found significant β-related effects, suggesting entrainment of the EEG to the acoustic stimulus (high-β EAE), memory processing (θ-low-β CFC), and auditory-motor interactions (20-Hz rPDC) during phonological analysis. For semantic/syntactic processing, we obtained a significant effect of γ power, suggesting lexical memory retrieval or the processing of grammatical word categories. Based on these findings, we confirm that scalp EEG signatures relevant to δ, θ, β, and γ oscillations can index phonological and semantic/syntactic organizations
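
    Of the four signatures, band power is the simplest to illustrate. The Python sketch below uses Welch's method from SciPy; the sampling rate, channel count, band edges, and random data are illustrative assumptions rather than the study's actual parameters.

    import numpy as np
    from scipy.signal import welch

    def band_power(eeg, fs, lo, hi):
        """Average PSD within [lo, hi] Hz per channel (simplified band power)."""
        f, psd = welch(eeg, fs=fs, nperseg=fs * 2)
        mask = (f >= lo) & (f <= hi)
        return psd[..., mask].mean(axis=-1)

    fs = 250                               # hypothetical sampling rate
    eeg = np.random.randn(64, fs * 10)     # 64 channels, 10 s of stand-in data
    bands = {"delta": (1, 4), "theta": (4, 8), "beta": (13, 30), "gamma": (30, 45)}
    powers = {name: band_power(eeg, fs, lo, hi) for name, (lo, hi) in bands.items()}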

  10. Setting priorities for environmental restoration at the DOE Nuclear Weapons Complex

    International Nuclear Information System (INIS)

    Ton, My K.; Morgan, Robert P.

    1992-01-01

    This paper provides an evaluation of the computerized methodologies and approaches that the Department of Energy (DOE) has developed to assist in setting cleanup priorities and in allocating Environmental Restoration funds to various activities within the DOE Nuclear Weapons Complex. Issues examined include the appropriateness of the methodologies for priority setting or budget planning, their strengths and weaknesses; the limitations to the use of such systems to aid decision making; public acceptance of these systems; and the level of participation by affected or interested parties and the public in the development and implementation processes. (author)

  11. A hybrid interface tracking - level set technique for multiphase flow with soluble surfactant

    Science.gov (United States)

    Shin, Seungwon; Chergui, Jalel; Juric, Damir; Kahouadji, Lyes; Matar, Omar K.; Craster, Richard V.

    2018-04-01

    A formulation for soluble surfactant transport in multiphase flows recently presented by Muradoglu and Tryggvason (JCP 274 (2014) 737-757) [17] is adapted to the context of the Level Contour Reconstruction Method (LCRM; Shin et al., IJNMF 60 (2009) 753-778) [8], a hybrid method that combines the advantages of the Front-tracking and Level Set methods. Particularly close attention is paid to the formulation and numerical implementation of the surface gradients of surfactant concentration and surface tension. Various benchmark tests are performed to demonstrate the accuracy of different elements of the algorithm. To verify surfactant mass conservation, values for surfactant diffusion along the interface are compared with the exact solution for the problem of uniform expansion of a sphere. The numerical implementation of the discontinuous boundary condition for the source term in the bulk concentration is compared with the approximate solution. Surface tension forces are tested for Marangoni drop translation. Our numerical results for drop deformation in simple shear are compared with experiments and results from previous simulations. All benchmarking tests compare well with existing data, thus providing confidence that the adapted LCRM formulation for surfactant advection and diffusion is accurate and effective in three-dimensional multiphase flows with a structured mesh. We also demonstrate that this approach applies easily to massively parallel simulations.
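
    The level-set half of such a hybrid scheme can be sketched compactly. The first-order upwind step below (Python/NumPy, periodic boundaries via np.roll) is a minimal illustration under stated assumptions; the LCRM itself reconstructs the interface from front elements and production codes use higher-order stencils.

    import numpy as np

    def advect_level_set(phi, u, v, dx, dt):
        """One first-order upwind step of dphi/dt + u . grad(phi) = 0."""
        dpx_m = (phi - np.roll(phi, 1, axis=1)) / dx   # backward difference in x
        dpx_p = (np.roll(phi, -1, axis=1) - phi) / dx  # forward difference in x
        dpy_m = (phi - np.roll(phi, 1, axis=0)) / dx
        dpy_p = (np.roll(phi, -1, axis=0) - phi) / dx
        # Upwinding: pick the one-sided difference against the local velocity.
        dphix = np.where(u > 0, dpx_m, dpx_p)
        dphiy = np.where(v > 0, dpy_m, dpy_p)
        return phi - dt * (u * dphix + v * dphiy)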

  12. Low level processing of diode spectrometry results

    International Nuclear Information System (INIS)

    Philippot, J.C.

    1975-01-01

    Systematic measurements in gamma spectrometry on slightly radioactive samples have led to the study of the low levels present in the spectra and to the development of suitable processing methods. These methods, and the advance that they represent in reading sensitivity, are now applicable to all types of spectra. The principles of this automatic reading are briefly summarized, leading to a description of the modifications which proved necessary to increase sensitivity. Three sample spectra are used to illustrate the arguments employed to achieve this result. The conclusions from the corresponding measurements provide a clearer understanding of the quality of the responses obtained during the initial reading. The application of these methods to systematic measurements is considered in the case of atmospheric aerosols. The overall results obtained since 1969 are presented [fr]

  13. Characterization of mammographic masses based on level set segmentation with new image features and patient information

    International Nuclear Information System (INIS)

    Shi Jiazheng; Sahiner, Berkman; Chan Heangping; Ge Jun; Hadjiiski, Lubomir; Helvie, Mark A.; Nees, Alexis; Wu Yita; Wei Jun; Zhou Chuan; Zhang Yiheng; Cui Jing

    2008-01-01

    Computer-aided diagnosis (CAD) for characterization of mammographic masses as malignant or benign has the potential to assist radiologists in reducing the biopsy rate without increasing false negatives. The purpose of this study was to develop an automated method for mammographic mass segmentation and to explore new image-based features, in combination with patient information, in order to improve the performance of mass characterization. The authors' previous CAD system, which used active contour segmentation and morphological, textural, and spiculation features, achieved promising results in mass characterization. The new CAD system is based on the level set method and includes two new types of image features, related to the presence of microcalcifications within the mass and the abruptness of the mass margin, as well as patient age. A linear discriminant analysis (LDA) classifier with stepwise feature selection was used to merge the extracted features into a classification score. The classification accuracy was evaluated using the area under the receiver operating characteristic curve. The authors' primary data set consisted of 427 biopsy-proven masses (200 malignant and 227 benign) in 909 regions of interest (ROIs) (451 malignant and 458 benign) from multiple mammographic views. Leave-one-case-out resampling was used for training and testing. The new CAD system based on the level set segmentation and the new mammographic feature space achieved a view-based A_z value of 0.83±0.01. The improvement compared to the previous CAD system was statistically significant (p=0.02). When patient age was included in the new CAD system, view-based and case-based A_z values were 0.85±0.01 and 0.87±0.02, respectively. The study also demonstrated the consistency of the newly developed CAD system by evaluating the statistics of the weights of the LDA classifiers in leave-one-case-out classification. Finally, an independent test on the publicly available digital database for screening
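
    The classifier stage described here (LDA with stepwise feature selection, evaluated by leave-one-out resampling) can be approximated with scikit-learn, as in the hedged sketch below: SequentialFeatureSelector stands in for classical stepwise selection, the data are random stand-ins, and plain leave-one-out over ROIs simplifies the paper's leave-one-case-out grouping.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import LeaveOneOut, cross_val_predict
    from sklearn.pipeline import make_pipeline

    X = np.random.randn(120, 30)       # 120 ROIs x 30 features (stand-in data)
    y = np.random.randint(0, 2, 120)   # 1 = malignant, 0 = benign

    lda = LinearDiscriminantAnalysis()
    clf = make_pipeline(SequentialFeatureSelector(lda, n_features_to_select=8), lda)
    scores = cross_val_predict(clf, X, y, cv=LeaveOneOut(), method="decision_function")
    print("AUC (A_z analogue):", roc_auc_score(y, scores))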

  14. Actinide partitioning from high level liquid waste using the Diamex process

    International Nuclear Information System (INIS)

    Madic, C.; Blanc, P.; Condamines, N.; Baron, P.; Berthon, L.; Nicol, C.; Pozo, C.; Lecomte, M.; Philippe, M.; Masson, M.; Hequet, C.

    1994-01-01

    The removal of the long-lived radionuclides belonging to the so-called minor actinide elements (neptunium, americium and curium) from the high-level nuclear wastes separated during the reprocessing of irradiated nuclear fuels, in order to transmute them into short-lived nuclides, can substantially decrease the potential hazards associated with the management of these nuclear wastes. In order to separate minor actinides from high-level liquid wastes (HLLW), a liquid-liquid extraction process was considered, based on the use of diamide molecules, which have the property of being totally burnable and thus do not generate secondary solid wastes. The main extracting properties of dimethyldibutyltetradecylmalonamide (DMDBTDMA), the diamide selected for the development of the DIAMEX process, are briefly described in this paper. Hot tests of the DIAMEX process (using DMDBTDMA) related to the treatment of a mixed-oxide fuel (MOX) type HLLW were successfully performed. The minor actinide decontamination factors obtained for the HLLW were encouraging. The main results of these tests are presented and discussed in this paper. (authors). 9 refs., 2 figs., 7 tabs

  15. Level 1 Processing of MODIS Direct Broadcast Data at the GSFC DAAC

    Science.gov (United States)

    Lynnes, Christopher; Kempler, Steven J. (Technical Monitor)

    2001-01-01

    The GSFC DAAC is working to test and package the MODIS Level 1 processing software for Aqua Direct Broadcast data. This entails the same code base, but different lookup tables, for Aqua and Terra. However, the most significant change is the use of ancillary attitude and ephemeris files instead of the orbit/attitude information within the science data stream (as with Terra). In addition, we are working on Linux ports of the algorithms, which could eventually enable processing on PC clusters. Finally, the GSFC DAAC is also working with the GSFC Direct Readout laboratory to ingest Level 0 data from the GSFC DB antenna into the main DAAC, enabling Level 1 production in near real time in support of applications users, such as the Synergy project. The mechanism developed for this could conceivably be extended to other participating stations.

  16. CT Findings of Disease with Elevated Serum D-Dimer Levels in an Emergency Room Setting

    International Nuclear Information System (INIS)

    Choi, Ji Youn; Kwon, Woo Cheol; Kim, Young Ju

    2012-01-01

    Pulmonary embolism and deep vein thrombosis are the leading causes of elevated serum D-dimer levels in the emergency room. Although D-dimer is a useful screening test because of its high sensitivity and negative predictive value, it has a low specificity. In addition, D-dimer can be elevated in various diseases. Therefore, information on the various diseases with elevated D-dimer levels and their radiologic findings may allow for accurate diagnosis and proper management. Herein, we report the CT findings of various diseases with elevated D-dimer levels in an emergency room setting, including an intravascular contrast filling defect with associated findings in a venous thromboembolism, fracture with soft tissue swelling and hematoma formation in a trauma patient, enlargement with contrast enhancement in the infected organ of a patient, coronary artery stenosis with a perfusion defect of the myocardium in a patient with acute myocardial infarction, high density of acute thrombus in a cerebral vessel with a low density of affected brain parenchyma in an acute cerebral infarction, intimal flap with two separated lumens in a case of aortic dissection, organ involvement of malignancy in a cancer patient, and atrophy of a liver with a dilated portal vein and associated findings.

  17. CT Findings of Disease with Elevated Serum D-Dimer Levels in an Emergency Room Setting

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Ji Youn; Kwon, Woo Cheol; Kim, Young Ju [Dept. of Radiology, Wonju Christian Hospital, Yonsei University Wonju College of Medicine, Wonju (Korea, Republic of)]

    2012-01-15

    Pulmonary embolism and deep vein thrombosis are the leading causes of elevated serum D-dimer levels in the emergency room. Although D-dimer is a useful screening test because of its high sensitivity and negative predictive value, it has a low specificity. In addition, D-dimer can be elevated in various diseases. Therefore, information on the various diseases with elevated D-dimer levels and their radiologic findings may allow for accurate diagnosis and proper management. Herein, we report the CT findings of various diseases with elevated D-dimer levels in an emergency room setting, including an intravascular contrast filling defect with associated findings in a venous thromboembolism, fracture with soft tissue swelling and hematoma formation in a trauma patient, enlargement with contrast enhancement in the infected organ of a patient, coronary artery stenosis with a perfusion defect of the myocardium in a patient with acute myocardial infarction, high density of acute thrombus in a cerebral vessel with a low density of affected brain parenchyma in an acute cerebral infarction, intimal flap with two separated lumens in a case of aortic dissection, organ involvement of malignancy in a cancer patient, and atrophy of a liver with a dilated portal vein and associated findings.

  18. Using SETS to find minimal cut sets in large fault trees

    International Nuclear Information System (INIS)

    Worrell, R.B.; Stack, D.W.

    1978-01-01

    An efficient algebraic algorithm for finding the minimal cut sets for a large fault tree was defined and a new procedure which implements the algorithm was added to the Set Equation Transformation System (SETS). The algorithm includes the identification and separate processing of independent subtrees, the coalescing of consecutive gates of the same kind, the creation of additional independent subtrees, and the derivation of the fault tree stem equation in stages. The computer time required to determine the minimal cut sets using these techniques is shown to be substantially less than the computer time required to determine the minimal cut sets when these techniques are not employed. It is shown for a given example that the execution time required to determine the minimal cut sets can be reduced from 7,686 seconds to 7 seconds when all of these techniques are employed
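
    The flavor of cut-set generation can be shown with a top-down expansion followed by absorption. The Python sketch below is a MOCUS-style illustration on a toy tree, not the algebraic algorithm implemented in SETS (which additionally exploits independent subtrees and the coalescing of consecutive gates of the same kind).

    from itertools import product

    def minimal_cut_sets(gates, top):
        """Top-down expansion of a fault tree into minimal cut sets.

        `gates` maps a gate name to ("AND"|"OR", [children]); names not in
        `gates` are basic events.
        """
        def expand(node):
            if node not in gates:
                return [frozenset([node])]
            kind, children = gates[node]
            child_sets = [expand(c) for c in children]
            if kind == "OR":
                return [cs for sets in child_sets for cs in sets]
            # AND: union over the cross-product of the children's cut sets
            return [frozenset().union(*combo) for combo in product(*child_sets)]

        cuts = expand(top)
        # Absorption: drop any cut set that strictly contains another one.
        return [c for c in cuts if not any(o < c for o in cuts)]

    tree = {"TOP": ("AND", ["G1", "G2"]),
            "G1": ("OR", ["A", "B"]),
            "G2": ("OR", ["B", "C"])}
    print(minimal_cut_sets(tree, "TOP"))  # {B} and {A, C} survive absorption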

  19. The Effects of Test Trial and Processing Level on Immediate and Delayed Retention

    OpenAIRE

    Chang, Sau Hou

    2017-01-01

    The purpose of the present study was to investigate the effects of test trial and processing level on immediate and delayed retention. A 2 × 2 × 2 mixed ANOVA was used with two between-subject factors of test trial (single test, repeated test) and processing level (shallow, deep), and one within-subject factor of final recall (immediate, delayed). Seventy-six college students were randomly assigned first to the single test (studied the stimulus words three times and took one free-recall test...

  20. Bud development, flowering and fruit set of Moringa oleifera Lam. (Horseradish Tree) as affected by various irrigation levels

    Directory of Open Access Journals (Sweden)

    Quintin Ernst Muhl

    2013-12-01

    Full Text Available Moringa oleifera is becoming increasingly popular as an industrial crop due to its multitude of useful attributes as a water purifier, nutritional supplement and biofuel feedstock. Given its tolerance to sub-optimal growing conditions, most of the current and anticipated cultivation areas are in medium to low rainfall areas. This study aimed to assess the effect of various irrigation levels on floral initiation, flowering and fruit set. Three treatments, namely a 900 mm (900IT), a 600 mm (600IT) and a 300 mm (300IT) per annum irrigation treatment, were administered through drip irrigation, simulating three total annual rainfall amounts. Individual inflorescences from each treatment were tagged during floral initiation and monitored through to fruit set. Flower bud initiation was highest at the 300IT and lowest at the 900IT for two consecutive growing seasons. Fruit set, on the other hand, decreased with the decrease in irrigation treatment. Floral abortion, reduced pollen viability and moisture stress in the style contributed to the reduction in fruiting/yield observed at the 300IT. Moderate water stress prior to floral initiation could stimulate flower initiation; however, this should be followed by sufficient irrigation to ensure good pollination, fruit set and yield.